Science.gov

Sample records for hazard mitigation studies

  1. Unacceptable Risk: Earthquake Hazard Mitigation in One California School District. Hazard Mitigation Case Study.

    ERIC Educational Resources Information Center

    California State Office of Emergency Services, Sacramento.

Earthquakes are a perpetual threat to California's school buildings. School administrators must be aware that hazard mitigation means much more than simply having a supply of water bottles in the school; it means getting everyone involved in efforts to prevent tragedies from occurring in school buildings in the event of an earthquake. The PTA in…

  2. Study proposes wholesale change in thinking about natural hazards mitigation

    NASA Astrophysics Data System (ADS)

    Showstack, Randy

The “lollapaloozas,” the major natural catastrophes, are getting bigger and bigger, and it is time to confront this growing problem by dramatically changing the way that society approaches natural hazard mitigation, conducts itself in relation to the natural environment, and accepts responsibility for activities that could lead to or increase disasters, according to Dennis Mileti, principal investigator of a new study on natural hazards and director of the Natural Hazards Research and Applications Information Center at the University of Colorado at Boulder. Since 1989, the United States has been struck by seven of the nation's 10 most costly natural disasters, including the 1994 Northridge earthquake in California that caused $25 billion in damages. Also since 1989, the financial cost of natural hazards in the United States—which includes floods, earthquakes, hurricanes, and wildfires, as well as landslides, heat, and fog—has frequently averaged $1 billion per week, a price that some experts say will continue rising. Internationally, the Kobe, Japan, earthquake cost more than $100 billion and is the most financially costly disaster in world history. None of these figures include indirect losses related to natural disasters, such as lost economic productivity.

  3. Numerical Study on Tsunami Hazard Mitigation Using a Submerged Breakwater

    PubMed Central

    Yoo, Jeseon; Han, Sejong; Cho, Yong-Sik

    2014-01-01

    Most coastal structures have been built in surf zones to protect coastal areas. In general, the transformation of waves in the surf zone is quite complicated and numerous hazards to coastal communities may be associated with such phenomena. Therefore, the behavior of waves in the surf zone should be carefully analyzed and predicted. Furthermore, an accurate analysis of deformed waves around coastal structures is directly related to the construction of economically sound and safe coastal structures because wave height plays an important role in determining the weight and shape of a levee body or armoring material. In this study, a numerical model using a large eddy simulation is employed to predict the runup heights of nonlinear waves that passed a submerged structure in the surf zone. Reduced runup heights are also predicted, and their characteristics in terms of wave reflection, transmission, and dissipation coefficients are investigated. PMID:25215334
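
    The reflection, transmission, and dissipation coefficients mentioned above are conventionally obtained from measured incident, reflected, and transmitted wave heights. A minimal sketch of that bookkeeping (illustrative only; not the study's LES model, and the wave heights below are invented) is:

      # Hedged sketch: given incident, reflected, and transmitted wave heights around a
      # submerged breakwater, the reflection (Kr), transmission (Kt), and dissipation (Kd)
      # coefficients follow from linear wave-energy conservation: Kr^2 + Kt^2 + Kd^2 = 1.
      import math

      def wave_coefficients(h_incident, h_reflected, h_transmitted):
          kr = h_reflected / h_incident
          kt = h_transmitted / h_incident
          kd = math.sqrt(max(0.0, 1.0 - kr**2 - kt**2))  # energy neither reflected nor transmitted
          return kr, kt, kd

      # Example with made-up heights: 2.0 m incident, 0.5 m reflected, 1.2 m transmitted
      print(wave_coefficients(2.0, 0.5, 1.2))  # -> (0.25, 0.6, ~0.76)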

  4. Teaching Hazards Mitigation.

    ERIC Educational Resources Information Center

    Abernethy, James

    1980-01-01

    It is recommended that courses be provided for architectural students in postoccupancy building performance and user experience. A course in disaster mitigation is described. It was introduced to increase student awareness of the importance of design decisions in building safety. (MSE)

  5. Washington Tsunami Hazard Mitigation Program

    NASA Astrophysics Data System (ADS)

    Walsh, T. J.; Schelling, J.

    2012-12-01

Washington State has participated in the National Tsunami Hazard Mitigation Program (NTHMP) since its inception in 1995. We have participated in the tsunami inundation hazard mapping, evacuation planning, education, and outreach efforts that generally characterize the NTHMP efforts. We have also investigated hazards of significant interest to the Pacific Northwest. The hazard from locally generated earthquakes on the Cascadia subduction zone, which threatens tsunami inundation in less than an hour following a magnitude 9 earthquake, creates special problems for low-lying accretionary shoreforms in Washington, such as the spits of Long Beach and Ocean Shores, where high ground is not accessible within the limited time available for evacuation. To ameliorate this problem, we convened a panel of the Applied Technology Council to develop guidelines for construction of facilities for vertical evacuation from tsunamis, published as FEMA 646, now incorporated in the International Building Code as Appendix M. We followed this with a program called Project Safe Haven (http://www.facebook.com/ProjectSafeHaven) to site such facilities along the Washington coast at appropriate locations and with appropriate designs that blend with the local communities, as chosen by the citizens. This has now been completed for the entire outer coast of Washington. In conjunction with this effort, we have evaluated the potential for earthquake-induced ground failures in and near tsunami hazard zones to help develop cost estimates for these structures and to establish appropriate tsunami evacuation routes and evacuation assembly areas that are likely to be available after a major subduction zone earthquake. We intend to continue these geotechnical evaluations for all tsunami hazard zones in Washington.

  6. Contributions of Nimbus 7 TOMS Data to Volcanic Study and Hazard Mitigation

    NASA Technical Reports Server (NTRS)

    Krueger, Arlin J.; Bluth, G. J. S.; Schaefer, S. A.

    1998-01-01

Nimbus TOMS data have led to advancements among many volcano-related scientific disciplines, from the initial ability to quantify SO2 clouds leading to derivations of eruptive S budgets and fluxes, to tracking of individual clouds, assessing global volcanism and atmospheric impacts. Some of the major aspects of TOMS-related research, listed below, will be reviewed and updated: (1) Measurement of volcanic SO2 clouds: Nimbus TOMS observed over 100 individual SO2 clouds during its mission lifetime; large explosive eruptions are now routinely and reliably measured by satellite. (2) Eruption processes: quantification of SO2 emissions has allowed assessments of eruption sulfur budgets, the evaluation of "excess" sulfur, and inferences of H2S emissions. (3) Detection of ash: TOMS data are now used to detect volcanic particulates in the atmosphere, providing complementary analyses to infrared methods of detection. Paired TOMS and AVHRR studies have provided invaluable information on volcanic cloud compositions and processes. (4) Cloud tracking and hazard mitigation: volcanic clouds can be considered gigantic tracers in the atmosphere, and studies of the fates of these clouds have led to new knowledge of their physical and chemical dispersion in the atmosphere for predictive models. (5) Global trends: the long term data set has provided researchers with an unparalleled record of explosive volcanism, and forms a key component in assessing annual to decadal trends in global S emissions. (6) Atmospheric impacts: TOMS data have been linked to independent records of atmospheric change, in order to compare cause and effect processes following a massive injection of SO2 into the atmosphere. (7) Future TOMS instruments and applications: Nimbus TOMS has given way to new satellite platforms, with several wavelength and resolution modifications. New efforts to launch a geostationary TOMS could provide unprecedented observations of volcanic activity.

  7. Predictability and extended-range prognosis in natural hazard risk mitigation process: A case study over west Greece

    NASA Astrophysics Data System (ADS)

    Matsangouras, Ioannis T.; Nastos, Panagiotis T.

    2014-05-01

Natural hazards pose an increasing threat to society, and new techniques and methodologies need to be developed to enhance the risk mitigation process. It is commonly accepted that disaster risk reduction is a vital key to future successful economic and social development. The systematic improvement in the accuracy of extended-range prognosis products, relating to monthly and seasonal predictability, has introduced them as a new essential link in the risk mitigation procedure. Aiming at decreasing risk, this paper presents the use of a seasonal and monthly forecasting process that was tested over west Greece from September to December 2013. During that season, significant severe weather events occurred, causing considerable impact on the local society (severe storms/rainfalls, hail, flash floods, etc.). Seasonal and monthly forecasting products from the European Centre for Medium-Range Weather Forecasts (ECMWF) depicted, with probabilities stratified by terciles, areas of Greece where significant weather may occur. As atmospheric natural hazard early warning systems are able to deliver warnings up to 72 hours in advance, this study illustrates that extended-range prognosis could be introduced as a new technique in risk mitigation. Seasonal and monthly forecast products could highlight extended areas where severe weather events may occur at one month lead time. In addition, a risk mitigation procedure that adopts extended prognosis products is also presented, providing useful lead time for the preparedness process at the regional administration level.
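
    The tercile stratification used in such extended-range products can be illustrated with a small sketch (synthetic climatology and ensemble values, not actual ECMWF data): each ensemble member is compared against the lower and upper terciles of the climatological distribution.

      # Hedged sketch: probability of below-normal / near-normal / above-normal monthly
      # rainfall from an ensemble forecast, stratified by climatological terciles.
      import numpy as np

      def tercile_probabilities(climatology, ensemble):
          lower, upper = np.percentile(climatology, [33.3, 66.7])
          ensemble = np.asarray(ensemble, dtype=float)
          below = np.mean(ensemble < lower)
          above = np.mean(ensemble > upper)
          return below, 1.0 - below - above, above  # (below, near, above normal)

      climatology = np.random.default_rng(0).gamma(2.0, 40.0, size=30)  # 30 synthetic years, mm/month
      ensemble = [55, 120, 95, 140, 160, 80, 130, 150, 110, 170]        # 10 synthetic members
      print(tercile_probabilities(climatology, ensemble))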

  8. An economic and geographic appraisal of a spatial natural hazard risk: a study of landslide mitigation rules

    USGS Publications Warehouse

    Bernknopf, R.L.; Brookshire, D.S.; Campbell, R.H.; Shapiro, C.D.

    1988-01-01

    Efficient mitigation of natural hazards requires a spatial representation of the risk, based upon the geographic distribution of physical parameters and man-related development activities. Through such a representation, the spatial probability of landslides based upon physical science concepts is estimated for Cincinnati, Ohio. Mitigation programs designed to reduce loss from landslide natural hazards are then evaluated. An optimum mitigation rule is suggested that is spatially selective and is determined by objective measurements of hillside slope and properties of the underlying soil. -Authors
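
    A spatially selective rule of this kind can be sketched as follows (illustrative only; the infinite-slope factor-of-safety formula below is a standard textbook model, not necessarily the authors' estimator, and all parameter values are invented):

      # Hedged sketch: flag parcels for mitigation when the infinite-slope factor of safety
      # FS = [c + (gamma - m*gamma_w)*z*cos^2(beta)*tan(phi)] / [gamma*z*sin(beta)*cos(beta)]
      # falls below a threshold, i.e. the rule depends only on hillside slope and soil properties.
      import math

      def factor_of_safety(slope_deg, cohesion_kpa, friction_deg, unit_weight=18.0,
                           depth=2.0, saturation=0.5, gamma_w=9.81):
          b = math.radians(slope_deg)
          phi = math.radians(friction_deg)
          resisting = cohesion_kpa + (unit_weight - saturation * gamma_w) * depth * math.cos(b)**2 * math.tan(phi)
          driving = unit_weight * depth * math.sin(b) * math.cos(b)
          return resisting / driving

      parcels = [(10, 5.0, 30), (25, 2.0, 25), (35, 1.0, 22)]  # (slope deg, cohesion kPa, friction deg)
      for slope, c, phi in parcels:
          fs = factor_of_safety(slope, c, phi)
          print(f"slope {slope} deg: FS = {fs:.2f} ->", "mitigate" if fs < 1.3 else "no action")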

  9. Playing against nature: improving earthquake hazard mitigation

    NASA Astrophysics Data System (ADS)

    Stein, S. A.; Stein, J.

    2012-12-01

The great 2011 Tohoku earthquake dramatically demonstrated the need to improve earthquake and tsunami hazard assessment and mitigation policies. The earthquake was much larger than predicted by hazard models, and the resulting tsunami overtopped coastal defenses, causing more than 15,000 deaths and $210 billion in damage. Whether and how such defenses should be rebuilt is therefore a challenging question, because the defenses fared poorly and building ones to withstand tsunamis as large as March's would be very expensive. A similar issue arises along the Nankai Trough to the south, where new estimates warning of tsunamis 2-5 times higher than in previous models raise the question of what to do, given that the timescale on which such events may occur is unknown. In the words of economist H. Hori, "What should we do in face of uncertainty? Some say we should spend our resources on present problems instead of wasting them on things whose results are uncertain. Others say we should prepare for future unknown disasters precisely because they are uncertain." Society thus needs strategies to mitigate earthquake and tsunami hazards that make economic and societal sense, given that our ability to assess these hazards is poor, as illustrated by highly destructive earthquakes that often occur in areas predicted by hazard maps to be relatively safe. Conceptually, we are playing a game against nature "of which we still don't know all the rules" (Lomnitz, 1989). Nature chooses tsunami heights or ground shaking, and society selects the strategy to minimize the total cost of damage plus mitigation. As in any game of chance, we maximize our expectation value by selecting the best strategy, given our limited ability to estimate the occurrence and effects of future events. We thus outline a framework to find the optimal level of mitigation by balancing its cost against the expected damages, recognizing the uncertainties in the hazard estimates. This framework illustrates the role of the
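
    The cost-balancing idea can be made concrete with a toy calculation (all numbers and the exceedance model are invented; this is not the authors' framework): choose the mitigation level n that minimizes the sum of the mitigation cost C(n) and the expected loss E[L(n)].

      # Hedged sketch: pick the defense height n (meters) minimizing
      # total cost = construction cost + expected loss from overtopping events.
      import math

      def total_cost(n, cost_per_meter=50.0, loss_if_overtopped=2000.0,
                     annual_rate=0.01, scale=3.0, horizon=50):
          # Probability of at least one tsunami exceeding height n over the planning horizon,
          # assuming exponentially distributed exceedance heights (illustrative only).
          p_exceed = 1.0 - math.exp(-annual_rate * horizon * math.exp(-n / scale))
          return cost_per_meter * n + loss_if_overtopped * p_exceed  # arbitrary cost units

      best = min(range(0, 21), key=total_cost)   # search heights 0..20 m
      print(best, round(total_cost(best), 1))    # optimal height and its total cost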

  10. Mitigation of Hazardous Comets and Asteroids

    NASA Astrophysics Data System (ADS)

    Belton, Michael J. S.; Morgan, Thomas H.; Samarasinha, Nalin H.; Yeomans, Donald K.

    2011-03-01

Preface; 1. Recent progress in interpreting the nature of the near-Earth object population W. Bottke, A. Morbidelli and R. Jedicke; 2. Earth impactors: orbital characteristics and warning times S. R. Chesley and T. B. Spahr; 3. The role of radar in predicting and preventing asteroid and comet collisions with Earth S. J. Ostro and J. D. Giorgini; 4. Interior structures for asteroids and cometary nuclei E. Asphaug; 5. What we know and don't know about surfaces of potentially hazardous small bodies C. R. Chapman; 6. About deflecting asteroids and comets K. A. Holsapple; 7. Scientific requirements for understanding the near-Earth asteroid population A. W. Harris; 8. Physical properties of comets and asteroids inferred from fireball observations M. Di Martino and A. Cellino; 9. Mitigation technologies and their requirements C. Gritzner and R. Kahle; 10. Peering inside near-Earth objects with radio tomography W. Kofman and A. Safaeinili; 11. Seismological investigation of asteroid and comet interiors J. D. Walker and W. F. Huebner; 12. Lander and penetrator science for near-Earth object mitigation studies A. J. Ball, P. Lognonne, K. Seiferlin, M. Patzold and T. Spohn; 13. Optimal interception and deflection of Earth-approaching asteroids using low-thrust electric propulsion B. A. Conway; 14. Close proximity operations at small bodies: orbiting, hovering, and hopping D. J. Scheeres; 15. Mission operations in low gravity regolith and dust D. Sears, M. Franzen, S. Moore, S. Nichols, M. Kareev and P. Benoit; 16. Impacts and the public: communicating the nature of the impact hazard D. Morrison, C. R. Chapman, D. Steel and R. P. Binzel; 17. Towards a program to remove the threat of hazardous NEOs M. J. S. Belton.

  11. Volcano hazard mitigation program in Indonesia

    USGS Publications Warehouse

    Sudradjat, A.

    1990-01-01

Volcanological investigations in Indonesia were started in the 18th century, when Valentijn in 1726 prepared a chronological report of the eruption of Banda Api volcano, Maluku. Modern and intensive volcanological studies did not begin until the catastrophic eruption of Kelut volcano, East Java, in 1919. The eruption took 5,011 lives and destroyed thousands of acres of coffee plantations. A lahar, generated when the crater lake water mixed with the volcanic eruption products, caused a large number of the deaths. An effort to mitigate the danger from volcanic eruption was first initiated in 1921 by constructing a tunnel to drain the crater lake water of Kelut volcano. At the same time a Volcanological Survey was established by the government with the responsibility of seeking every means for minimizing the hazard caused by volcanic eruptions.

  12. WHC natural phenomena hazards mitigation implementation plan

    SciTech Connect

    Conrads, T.J.

    1996-09-11

Natural phenomena hazards (NPH) are unexpected acts of nature which pose a threat or danger to workers, the public, or the environment. Earthquakes, extreme winds (hurricane and tornado), snow, flooding, volcanic ashfall, and lightning strikes are examples of NPH at Hanford. It is the policy of the U.S. Department of Energy (DOE) to design, construct, and operate DOE facilities so that workers, the public, and the environment are protected from NPH and other hazards. During 1993 the DOE Richland Operations Office (RL) transmitted DOE Order 5480.28, "Natural Phenomena Hazards Mitigation," to Westinghouse Hanford Company (WHC) for compliance. The Order includes rigorous new NPH criteria for the design of new DOE facilities as well as for the evaluation and upgrade of existing DOE facilities. In 1995 DOE issued Order 420.1, "Facility Safety," which contains the same NPH requirements and invokes the same applicable standards as Order 5480.28. It will supersede Order 5480.28 when an in-force date for Order 420.1 is established through contract revision. Activities will be planned and accomplished in four phases: Mobilization, Prioritization, Evaluation, and Upgrade. The basis for the graded approach is the designation of facilities/structures into one of five performance categories based upon safety function, mission, and cost. This Implementation Plan develops the program for the Prioritization Phase, as well as an overall strategy for the implementation of DOE Order 5480.28.

  13. The National Tsunami Hazard Mitigation Program

    NASA Astrophysics Data System (ADS)

    Bernard, E. N.

    2003-12-01

The National Tsunami Hazard Mitigation Program (NTHMP) is a state/Federal partnership that was created to reduce the impacts of tsunamis to U.S. coastal areas. It is a coordinated effort between the states of Alaska, California, Hawaii, Oregon, and Washington and four Federal agencies: the National Oceanic and Atmospheric Administration (NOAA), the Federal Emergency Management Agency (FEMA), the U.S. Geological Survey (USGS), and the National Science Foundation (NSF). NOAA has led the effort to forge a solid partnership between the states and the Federal agencies because of its responsibility to provide tsunami warning services to the nation. This successful partnership has established a mitigation program in each state that is preparing coastal communities for the next tsunami. Inundation maps are now available for many of the coastal communities of Alaska, California, Hawaii, Oregon, and Washington. These maps are used to develop evacuation plans and, in the case of Oregon, for land use management. The partnership has successfully upgraded the warning capability in NOAA so that earthquakes can be detected within 5 minutes and tsunamis can be detected in the open ocean in real time, paving the way for improved tsunami forecasts. NSF's new Network for Earthquake Engineering Simulation (NEES) program has agreed to work with the NTHMP to focus tsunami research on national needs. An overview of the NTHMP will be given, including a discussion of accomplishments and the new collaboration with NEES.

  14. The National Tsunami Hazard Mitigation Program

    NASA Astrophysics Data System (ADS)

    Bernard, E. N.

    2004-12-01

The National Tsunami Hazard Mitigation Program (NTHMP) is a state/Federal partnership that was created to reduce the impacts of tsunamis to U.S. coastal areas. It is a coordinated effort between the states of Alaska, California, Hawaii, Oregon, and Washington and four Federal agencies: the National Oceanic and Atmospheric Administration (NOAA), the Federal Emergency Management Agency (FEMA), the U.S. Geological Survey (USGS), and the National Science Foundation (NSF). NOAA has led the effort to forge a solid partnership between the states and the Federal agencies because of its responsibility to provide tsunami warning services to the nation. The successful partnership has established a mitigation program in each state that is developing tsunami-resilient coastal communities. Inundation maps are now available for many of the coastal communities of Alaska, California, Hawaii, Oregon, and Washington. These maps are used to develop evacuation plans and, in the case of Oregon, for land use management. The NTHMP mapping technology is now being applied to FEMA's Flood Insurance Rate Maps (FIRMs). The NTHMP has successfully upgraded the warning capability in NOAA so that earthquakes can be detected within 5 minutes and tsunamis can be detected in the open ocean in real time. Deep ocean reporting of tsunamis has already averted one unnecessary evacuation of Hawaii and demonstrated that real-time tsunami forecasting is now possible. NSF's new Network for Earthquake Engineering Simulation (NEES) program has agreed to work with the NTHMP to focus tsunami research on national needs. An overview of the NTHMP will be given, including a discussion of accomplishments and a progress report on NEES and FIRM activities.

  15. Debris flow hazards mitigation--Mechanics, prediction, and assessment

    USGS Publications Warehouse

    Chen, C.-L.; Major, J.J.

    2007-01-01

These proceedings contain papers presented at the Fourth International Conference on Debris-Flow Hazards Mitigation: Mechanics, Prediction, and Assessment held in Chengdu, China, September 10-13, 2007. The papers cover a wide range of topics on debris-flow science and engineering, including the factors triggering debris flows, geomorphic effects, mechanics of debris flows (e.g., rheology, fluvial mechanisms, erosion and deposition processes), numerical modeling, various debris-flow experiments, landslide-induced debris flows, assessment of debris-flow hazards and risk, field observations and measurements, monitoring and alert systems, structural and non-structural countermeasures against debris-flow hazards, and case studies. The papers reflect the latest developments and advances in debris-flow research. Several studies discuss the development and application of Geographic Information System (GIS) and Remote Sensing (RS) technologies in debris-flow hazard/risk assessment. Timely topics presented in a few papers also include the development of new or innovative techniques for debris-flow monitoring and alert systems, especially an infrasound acoustic sensor for detecting debris flows. Many case studies illustrate a wide variety of debris-flow hazards and related phenomena as well as their hazardous effects on human activities and settlements.

  16. Potentially Hazardous Objects (PHO) Mitigation Program

    NASA Astrophysics Data System (ADS)

    Huebner, Walter

Southwest Research Institute (SwRI) and its partner, Los Alamos National Laboratory (LANL), are prepared to develop, implement, and expand procedures to avert collisions of potentially hazardous objects (PHOs) with Earth as recommended by NASA in its White Paper "Near-Earth Object Survey and Deflection Analysis of Alternatives," requested by the US Congress and submitted to it in March 2007. In addition to developing the general mitigation program as outlined in the NASA White Paper, the program will be expanded to include aggressive mitigation procedures for small (e.g., Tunguska-sized) PHOs and other short warning-time PHOs such as some long-period comet nuclei. As a first step the program will concentrate on the most likely and critical cases, namely small objects and long-period comet nuclei with short warning times, but without losing sight of objects with longer warning times. Objects smaller than a few hundred meters are of interest because they are about 1000 times more abundant than kilometer-sized objects and are fainter and more difficult to detect, which may lead to short warning times and hence short reaction times. Yet even these small PHOs can have devastating effects, as the 30 June 1908 Tunguska event has shown. In addition, long-period comets, although relatively rare, can be large (sometimes tens of kilometers in size) and cannot be predicted because of their long orbital periods. Comet C/1983 H1 (IRAS-Araki-Alcock), for example, has an orbital period of 963.22 years, was discovered 27 April 1983, and passed Earth only two weeks later, on 11 May 1983, at a distance of 0.0312 AU. Aggressive methods and continuous alertness will be needed to defend against objects with such short warning times. While intact deflection of a PHO remains a key objective, destruction of a PHO and dispersion of the pieces must also be considered. The effectiveness of several alternative methods including nuclear demolition munitions, conventional explosives, and hyper

  17. Influence of behavioral biases on the assessment of multi-hazard risks and the implementation of multi-hazard risks mitigation measures: case study of multi-hazard cyclone shelters in Tamil Nadu, India

    NASA Astrophysics Data System (ADS)

    Komendantova, Nadejda; Patt, Anthony

    2013-04-01

In December 2004, a multiple hazards event devastated the Tamil Nadu province of India. The Sumatra-Andaman earthquake, with a magnitude of Mw = 9.1-9.3, caused the Indian Ocean tsunami, with wave heights up to 30 m and flooding that reached up to two kilometers inland in some locations. More than 7,790 persons were killed in the province of Tamil Nadu, 206 of them in its capital, Chennai. The time lag between the earthquake and the tsunami's arrival in India was over an hour; had a suitable early warning system, a proper means of communicating the warning, and shelters for people existed, several thousand human lives could have been saved, even though the destruction of infrastructure would not have been prevented. India has over forty years of experience in the construction of cyclone shelters. With additional efforts and investment, these shelters could be adapted to other types of hazards such as tsunamis and flooding, and new multi-hazard cyclone shelters (MPCS) could be constructed. It would therefore be possible to mitigate one hazard such as cyclones by the construction of a network of shelters while at the same time adapting these shelters, with some additional investment, to also deal with, for example, tsunamis. In this historical case, the failure to consider multiple hazards caused significant human losses. The current paper investigates the patterns of the national decision-making process with regard to multiple hazards mitigation measures and how the presence of behavioral and cognitive biases influenced the perceptions of the probabilities of multiple hazards and the choices made for their mitigation by the national decision-makers. Our methodology was based on the analysis of existing reports from national and international organizations as well as available scientific literature on behavioral economics and natural hazards. The results identified several biases in the national decision-making process when the

  18. Volcanic hazards and their mitigation: Progress and problems

    NASA Astrophysics Data System (ADS)

    Tilling, Robert I.

    1989-05-01

At the beginning of the twentieth century, volcanology began to emerge as a modern science as a result of increased interest in eruptive phenomena following some of the worst volcanic disasters in recorded history: Krakatau (Indonesia) in 1883 and Mont Pelée (Martinique), Soufrière (St. Vincent), and Santa María (Guatemala) in 1902. Volcanology is again experiencing a period of heightened public awareness and scientific growth in the 1980s, the worst period since 1902 in terms of volcanic disasters and crises. A review of hazards mitigation approaches and techniques indicates that significant advances have been made in hazards assessment, volcano monitoring, and eruption forecasting. For example, the remarkable accuracy of the predictions of dome-building events at Mount St. Helens since June 1980 is unprecedented. Yet a predictive capability for more voluminous and explosive eruptions still has not been achieved. Studies of magma-induced seismicity and ground deformation continue to provide the most systematic and reliable data for early detection of precursors to eruptions and shallow intrusions. In addition, some other geophysical monitoring techniques and geochemical methods have been refined and are being more widely applied and tested. Comparison of the four major volcanic disasters of the 1980s (Mount St. Helens, U.S.A., 1980; El Chichón, Mexico, 1982; Galunggung, Indonesia, 1982; and Nevado del Ruiz, Colombia, 1985) illustrates the importance of predisaster geoscience studies, volcanic hazards assessments, volcano monitoring, contingency planning, and effective communications between scientists and authorities. The death toll (>22,000) from the Ruiz catastrophe probably could have been greatly reduced; the reasons for the tragically ineffective implementation of evacuation measures are still unclear and puzzling in view of the fact that sufficient warnings were given. The most pressing problem in the mitigation of volcanic and associated hazards on

  19. Space options for tropical cyclone hazard mitigation

    NASA Astrophysics Data System (ADS)

    Dicaire, Isabelle; Nakamura, Ryoko; Arikawa, Yoshihisa; Okada, Kazuyuki; Itahashi, Takamasa; Summerer, Leopold

    2015-02-01

    This paper investigates potential space options for mitigating the impact of tropical cyclones on cities and civilians. Ground-based techniques combined with space-based remote sensing instrumentation are presented together with space-borne concepts employing space solar power technology. Two space-borne mitigation options are considered: atmospheric warming based on microwave irradiation and laser-induced cloud seeding based on laser power transfer. Finally technology roadmaps dedicated to the space-borne options are presented, including a detailed discussion on the technological viability and technology readiness level of our proposed systems. Based on these assessments, the space-borne cyclone mitigation options presented in this paper may be established in a quarter of a century.

  20. Volcanic hazards and their mitigation: progress and problems

    USGS Publications Warehouse

    Tilling, R.I.

    1989-01-01

A review of hazards mitigation approaches and techniques indicates that significant advances have been made in hazards assessment, volcano monitoring, and eruption forecasting. For example, the remarkable accuracy of the predictions of dome-building events at Mount St. Helens since June 1980 is unprecedented. Yet a predictive capability for more voluminous and explosive eruptions still has not been achieved. Studies of magma-induced seismicity and ground deformation continue to provide the most systematic and reliable data for early detection of precursors to eruptions and shallow intrusions. In addition, some other geophysical monitoring techniques and geochemical methods have been refined and are being more widely applied and tested. Comparison of the four major volcanic disasters of the 1980s (Mount St. Helens, U.S.A., 1980; El Chichon, Mexico, 1982; Galunggung, Indonesia, 1982; and Nevado del Ruiz, Colombia, 1985) illustrates the importance of predisaster geoscience studies, volcanic hazards assessments, volcano monitoring, contingency planning, and effective communications between scientists and authorities. -from Author

  1. Rockslide susceptibility and hazard assessment for mitigation works design along vertical rocky cliffs: workflow proposal based on a real case-study conducted in Sacco (Campania), Italy

    NASA Astrophysics Data System (ADS)

    Pignalosa, Antonio; Di Crescenzo, Giuseppe; Marino, Ermanno; Terracciano, Rosario; Santo, Antonio

    2015-04-01

The work presented here concerns a case study in which a complete multidisciplinary workflow has been applied for an extensive assessment of rockslide susceptibility and hazard in a common scenario such as vertical, fractured rocky cliffs. The studied area is located in a high-relief zone in Southern Italy (Sacco, Salerno, Campania), characterized by wide vertical rocky cliffs formed by tectonized thick successions of shallow-water limestones. The study comprised the following phases: a) topographic surveying integrating 3D laser scanning, photogrammetry and GNSS; b) geological surveying, characterization of single instabilities and geomechanical surveying, conducted by rock-climbing geologists; c) processing of 3D data and reconstruction of high-resolution geometrical models; d) structural and geomechanical analyses; e) data filing in a GIS-based spatial database; f) geo-statistical and spatial analyses and mapping of the whole set of data; g) 3D rockfall analysis. The main goals of the study have been a) to set up an investigation method to achieve a complete and thorough characterization of the slope stability conditions and b) to provide a detailed base for an accurate definition of the reinforcement and mitigation systems. For these purposes the most up-to-date methods of field surveying, remote sensing, 3D modelling and geospatial data analysis have been integrated in a systematic workflow, accounting for the economic sustainability of the whole project. A novel integrated approach has been applied, fusing deterministic and statistical surveying methods. This approach made it possible to deal with the wide extent of the studied area (nearly 200,000 m2) without compromising the high accuracy of the results. The deterministic phase, based on a field characterization of single instabilities and their further analysis on 3D models, has been applied to delineate the peculiarity of each single feature. The statistical approach, based on geostructural

  2. 76 FR 61070 - Disaster Assistance; Hazard Mitigation Grant Program

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-10-03

    ... authorities of other Federal agencies. It proposed to include vegetation management programs for wildfire... Resources Conservation Service of the U.S. Department of Agriculture. Wildfire and Erosion Under the NPRM, vegetation management related to wildfire and erosion hazard mitigation measures would be eligible for...

  3. Input space-dependent controller for multi-hazard mitigation

    NASA Astrophysics Data System (ADS)

    Cao, Liang; Laflamme, Simon

    2016-04-01

Semi-active and active structural control systems are advanced mechanical devices and systems capable of high damping performance, ideal for the mitigation of multi-hazards. The implementation of these devices within structural systems is still in its infancy because of the complexity of designing a robust closed-loop control system that can ensure reliable and high mitigation performance. Particular challenges in designing a controller for multi-hazard mitigation include: 1) very large uncertainties on dynamic parameters and unknown excitations; 2) limited measurements with probabilities of sensor failure; 3) immediate performance requirements; and 4) unavailable sets of input-output data during design. To facilitate the implementation of structural control systems, a new type of controller with high adaptive capabilities is proposed. It is based on real-time identification of an embedding that represents the essential dynamics found in the input space, or in the sensor measurements. This type of controller is termed the input-space dependent controller (ISDC). In this paper, the principle of the ISDC is presented, its stability and performance are derived analytically for the case of harmonic inputs, and its performance is demonstrated for different types of hazards. Results show the promise of this new type of controller at mitigating multi-hazards by 1) relying on local and limited sensors only; 2) not requiring prior evaluation or training; and 3) adapting to systems' non-stationarities.
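
    The embedding idea can be sketched at a very general level (illustrative only; this is not the authors' ISDC control law): a delay-coordinate vector is assembled in real time from the most recent sensor samples, and whatever control rule sits on top would act on that vector rather than on a prior structural model.

      # Hedged sketch of the general idea only: build a delay-coordinate embedding from the
      # latest sensor samples so the controller "sees" the essential input dynamics online.
      from collections import deque

      class DelayEmbedding:
          def __init__(self, dimension=5, delay=3):
              self.dimension, self.delay = dimension, delay
              self.buffer = deque(maxlen=dimension * delay)

          def update(self, measurement):
              """Append a new sensor sample and return the current embedded state (or None)."""
              self.buffer.append(measurement)
              if len(self.buffer) < self.buffer.maxlen:
                  return None
              samples = list(self.buffer)
              return [samples[-1 - i * self.delay] for i in range(self.dimension)]

      emb = DelayEmbedding()
      state = None
      for k in range(20):                 # synthetic displacement samples
          state = emb.update(0.01 * k)
      print(state)                        # embedded vector built from the last samples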

  4. Mitigation options for accidental releases of hazardous gases

    SciTech Connect

    Fthenakis, V.M.

    1995-05-01

    The objective of this paper is to review and compare technologies available for mitigation of unconfined releases of toxic and flammable gases. These technologies include: secondary confinement, deinventory, vapor barriers, foam spraying, and water sprays/monitors. Guidelines for the design and/or operation of effective post-release mitigation systems and case studies involving actual industrial mitigation systems are also presented.

  5. Rainfall-triggered landslides, anthropogenic hazards, and mitigation strategies

    USGS Publications Warehouse

    Larsen, M.C.

    2008-01-01

Rainfall-triggered landslides are part of a natural process of hillslope erosion that can result in catastrophic loss of life and extensive property damage in mountainous, densely populated areas. As global population expansion on or near steep hillslopes continues, the human and economic costs associated with landslides will increase. Landslide hazard mitigation strategies generally involve hazard assessment mapping, warning systems, control structures, and regional landslide planning and policy development. To be sustainable, hazard mitigation requires that management of natural resources is closely connected to local economic and social interests. A successful strategy is dependent on a combination of multi-disciplinary scientific and engineering approaches, and on the political will to take action at scales ranging from the local community to the national level.

  6. Hazard Mitigation Potential of Earth-Sheltered Residences

    DTIC Science & Technology

    1983-11-01

[OCR-garbled excerpt; recoverable content: the report is attributed to C. V. Chester, H. B. Shapira, G. A. Cristy, M. Schweitzer, and S. A… (Union Carbide), and it cites Zaltman, G., R. Duncan, and J. Holbeck, Innovations and Organizations, New York: John Wiley, 1973.]

  7. New Approaches to Tsunami Hazard Mitigation Demonstrated in Oregon

    NASA Astrophysics Data System (ADS)

    Priest, G. R.; Rizzo, A.; Madin, I.; Lyles Smith, R.; Stimely, L.

    2012-12-01

Oregon Department of Geology and Mineral Industries and Oregon Emergency Management collaborated over the last four years to increase tsunami preparedness for residents of and visitors to the Oregon coast. Utilizing support from the National Tsunami Hazard Mitigation Program (NTHMP), new approaches to outreach and tsunami hazard assessment were developed and then applied. Hazard assessment was approached by first doing two pilot studies aimed at calibrating theoretical models to direct observations of tsunami inundation gleaned from the historical and prehistoric (paleoseismic/paleotsunami) data. The results of these studies were then submitted to peer-reviewed journals and translated into 1:10,000-12,000-scale inundation maps. The inundation maps utilize a powerful new tsunami model, SELFE, developed by Joseph Zhang at the Oregon Health & Science University. SELFE uses unstructured computational grids and parallel processing techniques to achieve fast, accurate simulation of tsunami interactions with fine-scale coastal morphology. The inundation maps were simplified into tsunami evacuation zones accessed as map brochures and an interactive mapping portal at http://www.oregongeology.org/tsuclearinghouse/. Unique in the world are new evacuation maps that show separate evacuation zones for distant versus locally generated tsunamis. The brochure maps explain that evacuation time is four hours or more for distant tsunamis but 15-20 minutes for local tsunamis that are invariably accompanied by strong ground shaking. Since distant tsunamis occur much more frequently than local tsunamis, the two-zone maps avoid needless over-evacuation (and expense) caused by one-zone maps. Inundation mapping for the entire Oregon coast will be complete by ~2014. Educational outreach was accomplished first by doing a pilot study to measure the effectiveness of various approaches using before-and-after polling and then applying the most effective methods. In descending order, the most effective

  8. Collaborative Monitoring and Hazard Mitigation at Fuego Volcano, Guatemala

    NASA Astrophysics Data System (ADS)

    Lyons, J. J.; Bluth, G. J.; Rose, W. I.; Patrick, M.; Johnson, J. B.; Stix, J.

    2007-05-01

A portable, digital sensor network has been installed to closely monitor changing activity at Fuego volcano, taking advantage of an international collaborative effort among Guatemalan, U.S., and Canadian universities and the Peace Corps. The goal of this effort is to improve the understanding of shallow internal processes and, consequently, to more effectively mitigate volcanic hazards. Fuego volcano has had more than 60 historical eruptions, and its nearly continuous activity makes it an ideal laboratory to study volcanic processes. Close monitoring is needed to identify base-line activity and to rapidly identify and disseminate changes in the activity which might threaten nearby communities. The sensor network is comprised of a miniature DOAS ultraviolet spectrometer fitted with a system for automated plume scans, a digital video camera, and two seismo-acoustic stations with portable dataloggers. These sensors are on loan from scientists who visited Fuego during short field seasons and donated the use of their sensors to a resident Peace Corps Masters International student from Michigan Technological University for extended data collection. The sensor network is based around the local volcano observatory maintained by the Instituto Nacional de Sismología, Vulcanología, Meteorología e Hidrología (INSIVUMEH). INSIVUMEH provides local support and historical knowledge of Fuego activity as well as a secure location for storage of scientific equipment, data processing, and charging of the batteries that power the sensors. The complete sensor network came online in mid-February 2007, and here we present preliminary results from concurrent gas, seismic, and acoustic monitoring of activity at Fuego volcano.

  9. The hidden costs of coastal hazards: Implications for risk assessment and mitigation

    USGS Publications Warehouse

    Kunreuther, H.; Platt, R.; Baruch, S.; Bernknopf, R.L.; Buckley, M.; Burkett, V.; Conrad, D.; Davidson, T.; Deutsch, K.; Geis, D.; Jannereth, M.; Knap, A.; Lane, H.; Ljung, G.; McCauley, M.; Mileti, D.; Miller, T.; Morrow, B.; Meyers, J.; Pielke, R.; Pratt, A.; Tripp, J.

    2000-01-01

Society has limited hazard mitigation dollars to invest. Which actions will be most cost effective, considering the true range of impacts and costs incurred? In 1997, the H. John Heinz III Center for Science, Economics and the Environment began a two-year study with a panel of experts to help develop new strategies to identify and reduce the costs of weather-related hazards associated with rapidly increasing coastal development activities. The Hidden Costs of Coastal Hazards presents the panel's findings, offering the first in-depth study that considers the costs of coastal hazards to natural resources, social institutions, business, and the built environment. Using Hurricane Hugo, which struck South Carolina in 1989, as a case study, it provides for the first time information on the full range of economic costs caused by a major coastal hazard event. The book: describes and examines unreported, undocumented, and hidden costs such as losses due to business interruption, reduction in property values, interruption of social services, psychological trauma, damage to natural systems, and others; examines the concepts of risk and vulnerability, and discusses conventional approaches to risk assessment and the emerging area of vulnerability assessment; recommends a comprehensive framework for developing and implementing mitigation strategies; and documents the human impact of Hurricane Hugo and provides insight from those who lived through it. The Hidden Costs of Coastal Hazards takes a structured approach to the problem of coastal hazards, offering a new framework for community-based hazard mitigation along with specific recommendations for implementation. Decisionmakers -- both policymakers and planners -- who are interested in coastal hazard issues will find the book a unique source of new information and insight, as will private-sector decisionmakers including lenders, investors, developers, and insurers of coastal property.

  10. Composite Materials for Hazard Mitigation of Reactive Metal Hydrides.

    SciTech Connect

    Pratt, Joseph William; Cordaro, Joseph Gabriel; Sartor, George B.; Dedrick, Daniel E.; Reeder, Craig L.

    2012-02-01

    In an attempt to mitigate the hazards associated with storing large quantities of reactive metal hydrides, polymer composite materials were synthesized and tested under simulated usage and accident conditions. The composites were made by polymerizing vinyl monomers using free-radical polymerization chemistry, in the presence of the metal hydride. Composites with vinyl-containing siloxane oligomers were also polymerized with and without added styrene and divinyl benzene. Hydrogen capacity measurements revealed that addition of the polymer to the metal hydride reduced the inherent hydrogen storage capacity of the material. The composites were found to be initially effective at reducing the amount of heat released during oxidation. However, upon cycling the composites, the mitigating behavior was lost. While the polymer composites we investigated have mitigating potential and are physically robust, they undergo a chemical change upon cycling that makes them subsequently ineffective at mitigating heat release upon oxidation of the metal hydride. Acknowledgements The authors would like to thank the following people who participated in this project: Ned Stetson (U.S. Department of Energy) for sponsorship and support of the project. Ken Stewart (Sandia) for building the flow-through calorimeter and cycling test stations. Isidro Ruvalcaba, Jr. (Sandia) for qualitative experiments on the interaction of sodium alanate with water. Terry Johnson (Sandia) for sharing his expertise and knowledge of metal hydrides, and sodium alanate in particular. Marcina Moreno (Sandia) for programmatic assistance. John Khalil (United Technologies Research Corp) for insight into the hazards of reactive metal hydrides and real-world accident scenario experiments. Summary In an attempt to mitigate and/or manage hazards associated with storing bulk quantities of reactive metal hydrides, polymer composite materials (a mixture of a mitigating polymer and a metal hydride) were synthesized and tested

  11. Modeling and mitigating natural hazards: Stationarity is immortal!

    NASA Astrophysics Data System (ADS)

    Montanari, Alberto; Koutsoyiannis, Demetris

    2014-12-01

Environmental change is a matter of serious concern, as it is occurring at an unprecedented pace and might increase natural hazards. Moreover, it is deemed to imply a reduced representativity of past experience and data on extreme hydroclimatic events. The latter concern has been epitomized by the statement that "stationarity is dead." Setting up policies for mitigating natural hazards, including those triggered by floods and droughts, is an urgent priority in many countries, and it implies practical activities of management, engineering design, and construction. These activities necessarily need to be properly informed, and therefore the research question on the value of past data is extremely important. We herein argue that there are mechanisms in hydrological systems that are time invariant, which may need to be interpreted through data inference. In particular, hydrological predictions are based on assumptions which should include stationarity. In fact, any hydrological model, including deterministic and nonstationary approaches, is affected by uncertainty and therefore should include a random component that is stationary. Given that an unnecessary resort to nonstationarity may imply a reduction of predictive capabilities, a pragmatic approach based on the exploitation of past experience and data is a necessary prerequisite for setting up mitigation policies for environmental risk.
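
    The argument can be illustrated with a toy decomposition (synthetic data, not from the paper): even when a deterministic, possibly nonstationary component is fitted, the prediction still rests on a residual that is treated as stationary.

      # Hedged sketch: synthetic annual flood peaks = deterministic trend + random residual;
      # the residual is the part a prediction model treats as stationary.
      import numpy as np

      rng = np.random.default_rng(1)
      years = np.arange(1960, 2020)
      flood_peak = 100 + 0.3 * (years - 1960) + rng.normal(0, 10, years.size)

      trend = np.polyval(np.polyfit(years, flood_peak, 1), years)   # deterministic part
      residual = flood_peak - trend                                  # modeled as stationary

      # crude stationarity check: residual spread in the two halves of the record
      half = years.size // 2
      print(round(residual[:half].std(), 1), round(residual[half:].std(), 1))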

  12. 77 FR 24505 - Hazard Mitigation Assistance for Wind Retrofit Projects for Existing Residential Buildings

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-04-24

    ... SECURITY Federal Emergency Management Agency Hazard Mitigation Assistance for Wind Retrofit Projects for... comments on Hazard Mitigation Assistance for Wind Retrofit Projects for Existing Residential Buildings... property from hazards and their effects. One such activity is the implementation of wind retrofit...

  13. 76 FR 23613 - Draft Programmatic Environmental Assessment for Hazard Mitigation Safe Room Construction

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-04-27

    ... SECURITY Federal Emergency Management Agency Draft Programmatic Environmental Assessment for Hazard Mitigation Safe Room Construction AGENCY: Federal Emergency Management Agency, DHS. ACTION: Notice of... Mitigation Grant Program (HMGP), the Federal Emergency Management Agency (FEMA) may provide funding...

  14. Meteorological Hazard Assessment and Risk Mitigation in Rwanda.

    NASA Astrophysics Data System (ADS)

    Nduwayezu, Emmanuel; Jaboyedoff, Michel; Bugnon, Pierre-Charles; Nsengiyumva, Jean-Baptiste; Horton, Pascal; Derron, Marc-Henri

    2015-04-01

used in identifying the most risky areas. Finally, based on practical experience in this field and on the documents produced, some recommendations for low-cost mitigation measures will be proposed. Reference: MIDIMAR, Impacts of floods and landslides on socio-economic development profile. Case study: Musanze District. Kigali, June 2012.

  15. Transport and Reactivity of Decontaminants to Provide Hazard Mitigation of Chemical Warfare Agents from Materials

    DTIC Science & Technology

    2016-06-01

A combined approach was developed that integrated two types of testing—dilute liquid-phase reactor results to determine chemical reactivity...

  16. Tsunami hazard mitigation in tourism in the tropical and subtropical coastal areas: a case study in the Ryukyu Islands, southwest of Japan

    NASA Astrophysics Data System (ADS)

    Matsumoto, T.

    2006-12-01

Life and the economy (including tourism) in tropical and subtropical coastal areas, such as Okinawa Prefecture (Ryukyu), rely heavily on the sea. The sea has both a "gentle" side that gives people healing and a "dangerous" side that can kill people. If we are going to utilise the sea for marine tourism, such as by constructing resort facilities on the oceanfront, we should know all of the sea, including both of these sides: especially the nature of tsunamis. We islanders should also provide accurate information about the sea to outsiders, especially tourists visiting the islands. We have already learned a lesson about this issue from the Sumatra tsunami in 2004. However, measures against tsunami disaster by the marine tourism industry are still inadequate in these areas. The goals of tsunami hazard mitigation for those engaged in the tourism industry in tropical and subtropical coastal areas should be as follows. (1) Preparedness against tsunamis: "Be aware of the characteristics of tsunamis." "Prepare for tsunamis when you feel an earthquake." "Prepare for tsunamis when an earthquake takes place somewhere in the world." (2) Maintenance of an exact tsunami hazard map based on quantitative analyses of the characteristics of tsunamis: "Flooding areas from tsunami attacks depend not only on altitude but also on amplification and inundation due to the seafloor topography near the coast and the onland topographic relief." "Tsunami damage happens repeatedly." (3) Maintenance of a tsunami disaster prevention manual and training based on the manual: "Who should do what in case of a tsunami?" "How should the resort hotel employees lead the guests to a safe place?" Such a policy for disaster prevention is discussed in the general education class "Ocean Sciences" at the University of the Ryukyus (UR) and in a summer school for high school students. The students (most of them are from Okinawa Prefecture) consider, discuss and make reports about what to do in case of tsunamis as an islander

  17. Standards and Guidelines for Numerical Models for Tsunami Hazard Mitigation

    NASA Astrophysics Data System (ADS)

    Titov, V.; Gonzalez, F.; Kanoglu, U.; Yalciner, A.; Synolakis, C. E.

    2006-12-01

An increasing number of nations around the world need to develop tsunami mitigation plans, which invariably involve inundation maps for warning guidance and evacuation planning. There is the risk that inundation maps may be produced with older or untested methodology, as there are currently no standards for modeling tools. In the aftermath of the 2004 megatsunami, some models were used to model inundation for Cascadia events with results much larger than sediment records and existing state-of-the-art studies suggest, leading to confusion among emergency managers. Incorrectly assessing tsunami impact is hazardous, as recent events in 2006 in Tonga, Kythira, Greece, and Central Java have suggested (Synolakis and Bernard, 2006). To calculate tsunami currents, forces and runup on coastal structures, and inundation of coastlines, one must calculate the evolution of the tsunami wave from the deep ocean to its target site, numerically. No matter what the numerical model, validation (the process of ensuring that the model solves the parent equations of motion accurately) and verification (the process of ensuring that the model used represents geophysical reality appropriately) are both essential. Validation ensures that the model performs well in a wide range of circumstances and is accomplished through comparison with analytical solutions. Verification ensures that the computational code performs well over a range of geophysical problems. A few analytic solutions have been validated themselves with laboratory data. Even fewer existing numerical models have been both validated with the analytical solutions and verified with both laboratory measurements and field measurements, thus establishing a gold standard for numerical codes for inundation mapping. While there is in principle no absolute certainty that a numerical code that has performed well in all the benchmark tests will also produce correct inundation predictions with any given source motions, validated codes
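
    One common validation benchmark of this kind compares computed runup with the analytical solitary-wave runup law of Synolakis (1987); a minimal sketch (the model output value below is hypothetical) is:

      # Hedged sketch of a validation-style check: compare a model's computed runup with the
      # analytical non-breaking solitary-wave runup law R/d = 2.831*sqrt(cot(beta))*(H/d)^(5/4)
      # and report the relative error.
      import math

      def analytical_runup(wave_height_over_depth, beach_slope_deg):
          cot_beta = 1.0 / math.tan(math.radians(beach_slope_deg))
          return 2.831 * math.sqrt(cot_beta) * wave_height_over_depth ** 1.25

      def relative_error(model_runup, wave_height_over_depth, beach_slope_deg):
          reference = analytical_runup(wave_height_over_depth, beach_slope_deg)
          return abs(model_runup - reference) / reference

      # Hypothetical model output: R/d = 0.085 for H/d = 0.02 on a ~2.88 degree beach
      print(relative_error(0.085, 0.02, 2.88))   # ~0.10, i.e. about 10% off the benchmark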

  18. Climate resiliency: A unique multi-hazard mitigation approach.

    PubMed

    Baja, Kristin

    2016-01-01

    Baltimore's unique combination of shocks and stresses cuts across social, economic and environmental factors. Like many other post-industrial cities, over the past several decades, Baltimore has experienced a decline in its population -- resulting in a lower tax base. These trends have had deleterious effects on the city's ability to attend to much needed infrastructure improvements and human and social services. In addition to considerable social and economic issues, the city has begun to experience negative impacts due to climate change. The compounding nature of these trends has put Baltimore, like other post-industrial cities, in the position of having to do more with fewer available resources. Rather than wait for disaster to strike, Baltimore took a proactive approach to planning for shocks and stresses by determining unique ways to pre-emptively plan for and adapt to effects from climate change and incorporating these into the City's All Hazard Mitigation Plan. Since adopting the plan in 2013, Baltimore has been moving forward with various projects aimed at improving systems, enhancing adaptive capacity and building a more resilient and sustainable city. This paper describes the basis for the city's approach and offers a portrait of its efforts in order to broaden foundational knowledge of the emerging ways that cities are recasting the role of planning in light of unprecedented circumstances that demand complex solutions that draw on few resources.

  19. Mitigation of unconfined releases of hazardous gases via liquid spraying

    SciTech Connect

    Fthenakis, V.M.

    1997-02-01

The capability of water sprays to mitigate clouds of hydrofluoric acid (HF) has been demonstrated in the large-scale Goldfish and Hawk field experiments, which took place at the DOE Nevada Test Site. The effectiveness of water sprays and fire water monitors in removing HF from a vapor plume has also been studied theoretically using the model HGSPRAY5, with the near-field and far-field dispersion described by the HGSYSTEM models. This paper presents options to select and evaluate liquid spraying systems, based on industry experience and mathematical modeling.

  20. The seismic project of the National Tsunami Hazard Mitigation Program

    USGS Publications Warehouse

    Oppenheimer, D.H.; Bittenbinder, A.N.; Bogaert, B.M.; Buland, R.P.; Dietz, L.D.; Hansen, R.A.; Malone, S.D.; McCreery, C.S.; Sokolowski, T.J.; Whitmore, P.M.; Weaver, C.S.

    2005-01-01

    In 1997, the Federal Emergency Management Agency (FEMA), National Oceanic and Atmospheric Administration (NOAA), U.S. Geological Survey (USGS), and the five western States of Alaska, California, Hawaii, Oregon, and Washington joined in a partnership called the National Tsunami Hazard Mitigation Program (NTHMP) to enhance the quality and quantity of seismic data provided to the NOAA tsunami warning centers in Alaska and Hawaii. The NTHMP funded a seismic project that now provides the warning centers with real-time seismic data over dedicated communication links and the Internet from regional seismic networks monitoring earthquakes in the five western states, the U.S. National Seismic Network in Colorado, and from domestic and global seismic stations operated by other agencies. The goal of the project is to reduce the time needed to issue a tsunami warning by providing the warning centers with high-dynamic range, broadband waveforms in near real time. An additional goal is to reduce the likelihood of issuing false tsunami warnings by rapidly providing to the warning centers parametric information on earthquakes that could indicate their tsunamigenic potential, such as hypocenters, magnitudes, moment tensors, and shake distribution maps. New or upgraded field instrumentation was installed over a 5-year period at 53 seismic stations in the five western states. Data from these instruments have been integrated into the seismic network utilizing Earthworm software. This network has significantly reduced the time needed to respond to teleseismic and regional earthquakes. Notably, the West Coast/Alaska Tsunami Warning Center responded to the 28 February 2001 Mw 6.8 Nisqually earthquake beneath Olympia, Washington within 2 minutes compared to an average response time of over 10 minutes for the previous 18 years. © Springer 2005.

  1. GNSS Buoy Array in the Ocean for Natural Hazard Mitigation

    NASA Astrophysics Data System (ADS)

    Kato, T.; Terada, Y.; Yamamoto, S. I.; Iwakiri, N.; Toyoshima, M.; Koshikawa, N.; Motohashi, O.; Hashimoto, G.; Wada, A.

    2015-12-01

    The GNSS buoy system for tsunami early warning has been developed in Japan. The system has been implemented as a national wave monitoring system, and its record was used to update the tsunami warning during the 3.11 Tohoku-oki earthquake. The lesson learned from this experience was that the buoys were placed less than 20 km from the coast, which was not far enough for effective evacuation of people. We therefore tried to improve the system so that the buoy could be placed much farther from the coast. First, we implemented a real-time PPP analysis strategy for positioning, in contrast to the current baseline-mode RTK-GPS. In addition, we used two-way satellite data transmission in contrast to the current surface radio system. We made a series of experiments for this purpose in 2013 and 2014. A buoy about 40 km south of Shikoku, southwest Japan, was used for this purpose. GEONET data were used to obtain precise orbits and clocks of satellites. This information was then transferred to the GNSS buoy using the LEX signal of the QZSS satellite system. The received information on the buoy was used for real-time PPP analysis every second. The obtained buoy position was then transmitted to the ground base through an engineering test satellite, ETS-VIII. The received data were then disseminated to the public through the internet. Filtered short-term and long-term waves were shown separately on the webpage. The success of these experiments indicates that the GNSS buoy can be placed at least 1,500 km from the ground-based tracking network. Given this success, we would now be able to deploy a new GNSS buoy array system in the wide ocean. An array in the ocean can be used for ionospheric and atmospheric research in the same region, as well as for tsunami or ocean-bottom crustal deformation monitoring through an application of the GNSS-acoustic system. We are now designing a regional GNSS buoy array in the western Pacific as a synthetic natural hazard mitigation system.
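
    The separation into filtered short-term and long-term waves mentioned above can be illustrated with a simple moving-average low-pass filter on a 1 Hz sea-surface height series. The sketch below uses synthetic data and an assumed 2-minute window; it is not the buoy's actual processing chain.

      # Generic sketch: split a 1 Hz buoy height series into short-period wind
      # waves and a long-period (tsunami-band) component with a moving average.
      # Synthetic data; window length and signal parameters are illustrative.
      import numpy as np

      fs = 1.0                                  # sampling rate, Hz
      t = np.arange(0, 3600.0, 1.0 / fs)        # one hour of data
      wind_waves = 0.5 * np.sin(2 * np.pi * t / 8.0)      # ~8 s swell
      tsunami = 0.3 * np.sin(2 * np.pi * t / 900.0)       # ~15 min tsunami wave
      height = wind_waves + tsunami + 0.02 * np.random.randn(t.size)

      window = 120                              # 2-minute moving average
      kernel = np.ones(window) / window
      long_period = np.convolve(height, kernel, mode="same")   # tsunami-band estimate
      short_period = height - long_period                      # residual wind waves

      print("max long-period amplitude  [m]:", np.abs(long_period).max().round(2))
      print("max short-period amplitude [m]:", np.abs(short_period).max().round(2))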

  2. Next-Generation GPS Station for Hazards Mitigation (Invited)

    NASA Astrophysics Data System (ADS)

    Bock, Y.

    2013-12-01

    Our objective is to better forecast, assess, and mitigate natural hazards, including earthquakes, tsunamis, and extreme storms and flooding through development and implementation of a modular technology for the next-generation in-situ geodetic station to support the flow of information from multiple stations to scientists, mission planners, decision makers, and first responders. The same technology developed under NASA funding can be applied to enhance monitoring of large engineering structures such as bridges, hospitals and other critical infrastructure. Meaningful warnings save lives when issued within 1-2 minutes for destructive earthquakes, several tens of minutes for tsunamis, and up to several hours for extreme storms and flooding, and can be provided by on-site fusion of multiple data types and generation of higher-order data products: GPS/GNSS and accelerometer measurements to estimate point displacements, and GPS/GNSS and meteorological measurements to estimate moisture variability in the free atmosphere. By operating semi-autonomously, each station can then provide low-latency, high-fidelity and compact data products within the constraints of narrow communications bandwidth that often accompanies natural disasters. We have developed a power-efficient, low-cost, plug-in Geodetic Module for fusion of data from in situ sensors including GPS, a strong-motion accelerometer module, and a meteorological sensor package, for deployment at existing continuous GPS stations in southern California; fifteen stations have already been upgraded. The low-cost modular design is scalable to the many existing continuous GPS stations worldwide. New on-the-fly data products are estimated with 1 mm precision and accuracy, including three-dimensional seismogeodetic displacements for earthquake, tsunami and structural monitoring and precipitable water for forecasting extreme weather events such as summer monsoons and atmospheric rivers experienced in California. Unlike more
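
    The fusion of GPS/GNSS displacements with accelerometer data mentioned above can be sketched as a small loosely coupled Kalman filter, in which the accelerometer drives the prediction and GPS epochs correct the drift. This is an illustrative toy with assumed noise levels and synthetic motion, not the station's actual estimator.

      # Toy seismogeodetic fusion: a Kalman filter integrates accelerometer
      # samples (prediction) and corrects drift with GPS displacements (update).
      # All signals and noise levels are assumed for illustration only.
      import numpy as np

      dt = 0.01                                   # 100 Hz accelerometer
      n = 6000                                    # 60 s of data
      t = np.arange(n) * dt
      true_disp = 0.05 * np.sin(2 * np.pi * 0.5 * t)          # 5 cm, 0.5 Hz motion
      true_acc = -0.05 * (2 * np.pi * 0.5) ** 2 * np.sin(2 * np.pi * 0.5 * t)

      acc = true_acc + 0.02 * np.random.randn(n)              # noisy accelerometer
      gps_every = 100                                          # 1 Hz GPS epochs
      gps = true_disp[::gps_every] + 0.005 * np.random.randn(n // gps_every)

      F = np.array([[1.0, dt], [0.0, 1.0]])       # state: [displacement, velocity]
      B = np.array([0.5 * dt**2, dt])             # acceleration input
      H = np.array([[1.0, 0.0]])                  # GPS observes displacement
      Q = np.diag([1e-8, 1e-6])                   # process noise (assumed)
      R = np.array([[0.005**2]])                  # GPS noise variance (assumed)

      x = np.zeros(2)
      P = np.eye(2) * 1e-4
      est = np.zeros(n)
      for k in range(n):
          # Predict with the accelerometer sample.
          x = F @ x + B * acc[k]
          P = F @ P @ F.T + Q
          # Update whenever a GPS epoch is available.
          if k % gps_every == 0:
              y = gps[k // gps_every] - (H @ x)[0]
              S = H @ P @ H.T + R
              K = (P @ H.T) / S[0, 0]
              x = x + K[:, 0] * y
              P = (np.eye(2) - K @ H) @ P
          est[k] = x[0]

      rms = np.sqrt(np.mean((est - true_disp) ** 2))
      print(f"RMS displacement error: {rms * 1000:.1f} mm")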

  3. Numerical and probabilistic analysis of asteroid and comet impact hazard mitigation

    SciTech Connect

    Plesko, Catherine S; Weaver, Robert P; Huebner, Walter F

    2010-09-09

    The possibility of asteroid and comet impacts on Earth has received significant recent media and scientific attention. Still, there are many outstanding questions about the correct response once a potentially hazardous object (PHO) is found. Nuclear munitions are often suggested as a deflection mechanism because they have a high internal energy per unit launch mass. However, major uncertainties remain about the use of nuclear munitions for hazard mitigation. There are large uncertainties in a PHO's physical response to a strong deflection or dispersion impulse like that delivered by nuclear munitions. Objects smaller than 100 m may be solid, and objects of all sizes may be 'rubble piles' with large porosities and little strength. Objects with these different properties would respond very differently, so the effects of object properties must be accounted for. Recent ground-based observations and missions to asteroids and comets have improved the planetary science community's understanding of these objects. Computational power and simulation capabilities have improved such that it is possible to numerically model the hazard mitigation problem from first principles. Before we know that explosive yield Y at height h or depth -h from the target surface will produce a given momentum change in, or dispersion of, a PHO, we must quantify energy deposition into the system of particles that make up the PHO. Here we present the initial results of a parameter study in which we model the efficiency of energy deposition from a stand-off nuclear burst onto targets made of PHO constituent materials.

  4. Aligning Natural Resource Conservation and Flood Hazard Mitigation in California

    PubMed Central

    Calil, Juliano; Beck, Michael W.; Gleason, Mary; Merrifield, Matthew; Klausmeyer, Kirk; Newkirk, Sarah

    2015-01-01

    Flooding is the most common and damaging of all natural disasters in the United States, and was a factor in almost all declared disasters in U.S. history. Direct flood losses in the U.S. in 2011 totaled $8.41 billion, and flood damage has also been on the rise globally over the past century. The National Flood Insurance Program has paid out more than $38 billion in claims since its inception in 1968, more than a third of which has gone to the one percent of policies that experienced multiple losses and are classified as “repetitive loss.” During the same period, the loss of coastal wetlands and other natural habitat has continued, and funds for conservation and restoration of these habitats are very limited. This study demonstrates that flood losses could be mitigated through action that meets both flood risk reduction and conservation objectives. We found that there are at least 11,243 km2 of land in coastal California that are both flood-prone and have natural resource conservation value, and where a property/structure buyout and habitat restoration project could meet multiple objectives. For example, our results show that in Sonoma County, the extent of land that meets these criteria is 564 km2. Further, we explore flood mitigation grant programs that can be a significant source of funds for such projects. We demonstrate that government-funded buyouts followed by restoration of targeted lands can support social, environmental, and economic objectives: reduction of flood exposure, restoration of natural resources, and efficient use of limited governmental funds. PMID:26200353

  5. Earthquake and Volcanic Hazard Mitigation and Capacity Building in Sub-Saharan Africa

    NASA Astrophysics Data System (ADS)

    Ayele, A.

    2012-04-01

    The East African Rift System (EARS) is a classic example of active continental rifting, and a natural laboratory setting to study the initiation and early-stage evolution of continental rifts. The EARS is at different stages of development, varying from a relatively mature rift in the Afar (opening at 16 mm/yr) to the weakly extended Okavango Delta in the south, with a predicted opening velocity of < 3 mm/yr. Recent studies in the region have helped researchers to constrain the length and timescales of magmatism and faulting, the partitioning of strain between faulting and magmatism, and their implications for the development of along-axis segmentation. Although human resources and instrument coverage are sparse on the continent, our understanding of rift processes and deep structure has improved in the last decade after the advent of space geodesy and broadband seismology. The recent major earthquakes, volcanic eruptions, and mega dike intrusions that occurred along the EARS attracted several earth scientist teams from across the globe. However, most African countries traversed by the rift do not have the full capacity to monitor and mitigate earthquake and volcanic hazards. Few monitoring facilities exist in some countries, and the data are rarely available in real time for mitigation purposes. Many sub-Saharan African governments are currently focused on achieving the millennium development goals with massive infrastructure development schemes and urbanization, while impending natural hazards of this nature are severely overlooked. Collaborations with overseas researchers and other joint efforts by the international community are opportunities to be used by African institutions to best utilize limited resources and to mitigate earthquake and volcano hazards.

  6. Hazard Mitigation Assistance Programs Available to Water and Wastewater Utilities

    EPA Pesticide Factsheets

    You can prevent damage to your utility before it occurs. Utilities can implement mitigation projects to better withstand a natural disaster, minimize damage and rapidly recover from disruptions to service.

  7. Evaluating fuel complexes for fire hazard mitigation planning in the southeastern United States.

    SciTech Connect

    Andreu, Anne G.; Shea, Dan; Parresol, Bernard, R.; Ottmar, Roger, D.

    2012-01-01

    Fire hazard mitigation planning requires an accurate accounting of fuel complexes to predict potential fire behavior and effects of treatment alternatives. In the southeastern United States, rapid vegetation growth coupled with complex land use history and forest management options requires a dynamic approach to fuel characterization. In this study we assessed potential surface fire behavior with the Fuel Characteristic Classification System (FCCS), a tool which uses inventoried fuelbed inputs to predict fire behavior. Using inventory data from 629 plots established in the upper Atlantic Coastal Plain, South Carolina, we constructed FCCS fuelbeds representing median fuel characteristics by major forest type and age class. With a dry fuel moisture scenario and a 6.4 km h⁻¹ midflame wind speed, the FCCS predicted moderate to high potential fire hazard for the majority of the fuelbeds under study. To explore fire hazard under potential future fuel conditions, we developed fuelbeds representing the range of quantitative inventory data for fuelbed components that drive surface fire behavior algorithms and adjusted shrub species composition to represent 30% and 60% relative cover of highly flammable shrub species. Results indicate that the primary drivers of surface fire behavior vary by forest type, age, and surface fire behavior rating. Litter tends to be a primary or secondary driver in most forest types. In comparison to other surface fire contributors, reducing shrub loading reduces flame lengths most consistently across forest types. FCCS fuelbeds and the results from this project can be used for fire hazard mitigation planning throughout the southern Atlantic Coastal Plain where similar forest types occur. The approach of building simulated fuelbeds across the range of available surface fuel data produces sets of incrementally different fuel characteristics that can be applied to any dynamic forest types in which surface fuel conditions change rapidly.
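
    As a generic point of reference for the flame-length results discussed here, fireline intensity can be translated into flame length with Byram's (1959) relation; this is a standard rule of thumb, not the FCCS internals, and the intensity values below are assumed.

      # Byram's (1959) flame length relation, metric form: L = 0.0775 * I**0.46,
      # with I the fireline intensity in kW/m and L the flame length in metres.
      # The example intensities are assumed, purely to show the shape of the curve.
      def byram_flame_length(intensity_kw_per_m: float) -> float:
          return 0.0775 * intensity_kw_per_m ** 0.46

      for intensity in (50, 500, 2000, 10000):   # kW/m, low to extreme fire behavior
          print(f"I = {intensity:>6} kW/m  ->  flame length ~ "
                f"{byram_flame_length(intensity):.1f} m")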

  8. Spatio-temporal patterns of hazards and their use in risk assessment and mitigation. Case study of road accidents in Romania

    NASA Astrophysics Data System (ADS)

    Catalin Stanga, Iulian

    2013-04-01

    the spatial or temporal clustering of crash accidents. Since the 1990s, Geographical Information Systems (GIS) have become a very important tool for traffic and road safety management, allowing not only spatial and multifactorial analysis but also graphical and non-graphical outputs. The current paper presents an accessible GIS methodology for studying the spatio-temporal pattern of injury-related road accidents, identifying high-density accident zones, performing cluster analysis, creating multicriterial typologies, and identifying and explaining spatial and temporal similarities. For this purpose, a Geographical Information System was created, allowing a complex analysis that involves not only the events but also a large set of interrelated and spatially linked attributes. The GIS includes the accidents as georeferenced point elements with a spatially linked attribute database: identification information (date, location details); accident type; main, secondary, and aggravating causes; data about the driver; vehicle information; and consequences (damages, injured people, and fatalities). Each attribute has its own numeric code that allows both statistical analysis and spatial interrogation. The database includes those road accidents that led to physical injuries and loss of human lives between 2007 and 2012, and the spatial analysis was carried out using TNTmips 7.3 software facilities. Data aggregation and processing allowed the creation of the spatial pattern of injury-related road accidents through kernel density estimation at three different levels (national - Romania; county level - Iasi County; local level - Iasi town). Spider graphs were used to create the temporal pattern of road accidents at three levels (daily, weekly, and monthly), directly related to their causes. Moreover, the spatial and temporal database relates natural hazards (glazed frost, fog, and blizzard) to human-made ones, giving the opportunity to evaluate the nature of uncertainties in risk
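
    A kernel density estimate over accident locations, of the kind used above to expose high-density accident zones, can be reproduced generically in Python (the study itself used TNTmips); the coordinates and bandwidth below are assumptions for illustration only.

      # Generic kernel density estimation over accident point locations to expose
      # high-density (hotspot) zones. Synthetic coordinates; not the authors' data.
      import numpy as np
      from scipy.stats import gaussian_kde

      rng = np.random.default_rng(0)
      # Fake accident coordinates (projected, metres): two clusters plus noise.
      cluster_a = rng.normal(loc=(1000.0, 2000.0), scale=150.0, size=(200, 2))
      cluster_b = rng.normal(loc=(4000.0, 3500.0), scale=300.0, size=(120, 2))
      background = rng.uniform(low=0.0, high=5000.0, size=(80, 2))
      points = np.vstack([cluster_a, cluster_b, background]).T   # shape (2, N)

      kde = gaussian_kde(points, bw_method=0.2)                  # bandwidth assumed

      # Evaluate the density on a coarse grid and report the strongest cell.
      xs, ys = np.meshgrid(np.linspace(0, 5000, 50), np.linspace(0, 5000, 50))
      density = kde(np.vstack([xs.ravel(), ys.ravel()])).reshape(xs.shape)
      i, j = np.unravel_index(density.argmax(), density.shape)
      print(f"densest cell near x={xs[i, j]:.0f} m, y={ys[i, j]:.0f} m")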

  9. Assessing the costs of hazard mitigation through landscape interventions in the urban structure

    NASA Astrophysics Data System (ADS)

    Bostenaru-Dan, Maria; Aldea Mendes, Diana; Panagopoulos, Thomas

    2014-05-01

    In this paper we look at an issue rarely approached: the economic efficiency of natural hazard risk mitigation. The urban scale at which a natural hazard can have an impact makes urban planning strategy important in risk management. However, it is usually the natural, engineering, and social sciences that deal with it, and the role of architecture and urban planning is neglected. Climate change can lead to risks related to increased floods, desertification, and sea level rise, among others. Reducing sealed surfaces in cities through green spaces in crowded centres can mitigate these risks, and such measures can be foreseen in restructuring plans in the presence or absence of disasters. For this purpose we reviewed the role in games of green spaces and of community centres such as churches, which can form the core of restructuring efforts, as our field and archive studies also show. We look at the way ICT can contribute to organizing information, from the building survey to economic computations, in direct modeling or through games. The roles of game theory, agent-based modeling, networks, and urban public policies in designing decision systems for risk management are discussed. The game rules are at the same time supported by our field and archive studies, as well as by research by design. We also consider a rarely addressed element, the role of landscape planning, through the inclusion of green elements in reconstruction after natural and man-made disasters, or in restructuring efforts to mitigate climate change. Apart from the existing old city tissue, the landscape can also be endangered by speculation, and it is therefore vital to highlight its high economic value, also in this particular case. As ICOMOS highlights for the 2014 congress, heritage and landscape are two sides of the same coin. Landscape can become, or be connected to, a community centre, the first being necessary for building a settlement, the second raising its value, or it can build connections between landmarks in urban routes

  10. Assessment and mitigation of combustible dust hazards in the plastics industry

    NASA Astrophysics Data System (ADS)

    Stern, Michael C.; Ibarreta, Alfonso; Myers, Timothy J.

    2015-05-01

    A number of recent industrial combustible dust fires and explosions, some involving powders used in the plastics industry, have led to heightened awareness of combustible dust hazards, increased regulatory enforcement, and changes to the current standards and regulations. This paper provides a summary of the fundamentals of combustible dust explosion hazards, comparing and contrasting combustible dust with flammable gases and vapors. The types of tests used to quantify and evaluate the potential hazard posed by plastic dusts are explored. Recent changes in NFPA 654, a standard applicable to combustible dust in the plastics industry, are also discussed. Finally, guidance on the primary methods for prevention and mitigation of combustible dust hazards is provided.
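
    The explosibility testing mentioned above is commonly summarized by the deflagration index K_St obtained through the cube-root law; the sketch below applies that relation to an assumed 20-litre sphere result and maps it to the conventional St hazard classes.

      # Cube-root law used in dust explosibility testing (e.g., ASTM E1226):
      #   K_St = (dP/dt)_max * V**(1/3)
      # with (dP/dt)_max in bar/s and V the test vessel volume in m^3.
      # The test value below is assumed for illustration.
      def k_st(dp_dt_max_bar_per_s: float, vessel_volume_m3: float) -> float:
          return dp_dt_max_bar_per_s * vessel_volume_m3 ** (1.0 / 3.0)

      def hazard_class(kst: float) -> str:
          if kst == 0:
              return "St 0 (non-explosible)"
          if kst <= 200:
              return "St 1"
          if kst <= 300:
              return "St 2"
          return "St 3"

      # Hypothetical 20-litre sphere result for a plastic powder.
      kst = k_st(dp_dt_max_bar_per_s=550.0, vessel_volume_m3=0.020)
      print(f"K_St = {kst:.0f} bar*m/s -> {hazard_class(kst)}")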

  11. Safety Design Requirements for Active Hazard Mitigation Device (AHMD) Employed to Address Fast and Slow Cook-off Thermal Threats

    DTIC Science & Technology

    2014-12-18

    Safety Design Requirements for Active Hazard Mitigation Device (AHMD) Employed to Address Fast and Slow Cook-off Thermal Threats. Subject terms: Active Hazard Mitigation Device; insensitive munitions; fast cook-off; slow cook-off. DOD Fuze Engineering.

  12. An establishment on the hazard mitigation system of large scale landslides for Zengwen reservoir watershed management in Taiwan

    NASA Astrophysics Data System (ADS)

    Tsai, Kuang-Jung; Lee, Ming-Hsi; Chen, Yie-Ruey; Huang, Meng-Hsuan; Yu, Chia-Ching

    2016-04-01

    Extremely heavy rainfall, with an accumulated amount of more than 2900 mm within a continuous 3-day event, struck southern Taiwan during Typhoon Morakot in August 2009 and has been recognized as a serious natural hazard. Very destructive large-scale landslides and debris flows were induced by this heavy rainfall event. According to the satellite image processing and monitoring project conducted by the Soil & Water Conservation Bureau after Typhoon Morakot, more than 10904 landslide sites with a total sliding area of 18113 ha were identified. Field investigation of all landslide areas was also carried out in this research on the basis of disaster type, scale, and location in relation to topographic conditions, colluvium soil characteristics, bedrock formation, and geological structure after the Morakot hazard. The mechanism, characteristics, and behavior of these large-scale landslides combined with debris flow disasters are analyzed and investigated to work out the interaction of the factors concerned above and to identify the extent of rainfall-induced landslide disasters during the period of this study. In order to reduce the disaster risk of large-scale landslides and debris flows, an adaptation strategy for the hazard mitigation system should be set up as soon as possible, taking into consideration slope land conservation, landslide control countermeasure planning, disaster database establishment, environmental impact analysis, and disaster risk assessment. As a result, this 3-year research has been focused on field investigation using GPS/GIS/RS integration, study of the mechanism and behavior of rainfall-induced landslide occurrence, and establishment of a disaster database and hazard mitigation system. In fact, this project has become an important issue of serious concern to the government and people living in Taiwan. Hopefully, all results from this research can be used as guidance for the disaster prevention and

  13. Department of Energy Natural Phenomena Hazards Mitigation Program

    SciTech Connect

    Murray, R.C.

    1993-09-01

    This paper presents a summary of past and present accomplishments of the Natural Phenomena Hazards Program that has been ongoing at Lawrence Livermore National Laboratory since 1975. The natural phenomena covered include earthquakes; winds, hurricanes, and tornadoes; flooding and precipitation; lightning; and volcanic events. The work is organized into four major areas: (1) policy, requirements, standards, and guidance; (2) technical support and research and development; (3) technology transfer; and (4) oversight.

  14. Advances(?) in mitigating volcano hazards in Latin America

    USGS Publications Warehouse

    Hall, M.L.

    1991-01-01

    The 1980s were incredible years for volcanology. As a consequence of the Mount St. Helens and other eruptions, major advances in our understanding of volcanic processes and eruption dynamics were made. The decade also witnessed the greatest death toll caused by volcanism since 1902. Following Mount St. Helens, awareness of volcano hazards increased throughout the world; however, in Latin America, subsequent events showed that much was still to be learned.

  15. Fourth DOE Natural Phenomena Hazards Mitigation Conference: Proceedings. Volume 1

    SciTech Connect

    Not Available

    1993-12-31

    This conference allowed an interchange in the natural phenomena area among designers, safety professionals, and managers. The papers presented in Volume I of the proceedings are from sessions I - VIII which cover the general topics of: DOE standards, lessons learned and walkdowns, wind, waste tanks, ground motion, testing and materials, probabilistic seismic hazards, risk assessment, base isolation and energy dissipation, and lifelines and floods. Individual papers are indexed separately. (GH)

  16. Mitigating the Risk of Environmental Hazards in Mexico

    DTIC Science & Technology

    2011-10-28

    Health effects in some personnel (Agent Orange and Atsugi, Japan). Many of the current hazards are due to inadequate infrastructure or improper waste... Health impacts are discussed. Examples from past U.S. military exposures and insights on emerging issues are provided to illustrate why an effective risk

  17. Looking before we leap: an ongoing, quantitative investigation of asteroid and comet impact hazard mitigation

    SciTech Connect

    Plesko, Catherine S; Weaver, Robert P; Bradley, Paul A; Huebner, Walter F

    2010-01-01

    There are many outstanding questions about the correct response to an asteroid or comet impact threat on Earth. Nuclear munitions are currently thought to be the most efficient method of delivering an impact-preventing impulse to a potentially hazardous object (PHO). However, there are major uncertainties about the response of PHOs to a nuclear burst, and the most appropriate ways to use nuclear munitions for hazard mitigation.

  18. Hazards in the Heliosphere: Forecasting and Mitigation Techniques

    NASA Astrophysics Data System (ADS)

    Crosby, N.

    2007-08-01

    Spacecraft have to survive very hostile environments, which can severely limit space missions as well as pose threats to humans. Shielding requirements, including space storm shelters on the spacecraft as well as radiation protection facilities at the target, need to be taken into consideration with respect to travel time, local target space weather conditions, and the phase of the solar cycle. Be it on Mars or a different planet, once we reach our target the local space weather conditions will be a function of the planet's location in the solar system and whether it has a magnetosphere and/or atmosphere around it. This presentation will look at the various opportunities that heliospheric exploration offers while in parallel evaluating the obstacles that must be overcome to realize these scenarios, considering the feasibility of using and integrating existing systems (e.g. forecasting), as well as presenting innovative mitigation techniques.

  19. Earthquake Hazard Mitigation Using a Systems Analysis Approach to Risk Assessment

    NASA Astrophysics Data System (ADS)

    Legg, M.; Eguchi, R. T.

    2015-12-01

    The earthquake hazard mitigation goal is to reduce losses due to severe natural events. The first step is to conduct a Seismic Risk Assessment consisting of 1) hazard estimation, 2) vulnerability analysis, 3) exposure compilation. Seismic hazards include ground deformation, shaking, and inundation. The hazard estimation may be probabilistic or deterministic. Probabilistic Seismic Hazard Assessment (PSHA) is generally applied to site-specific Risk assessments, but may involve large areas as in a National Seismic Hazard Mapping program. Deterministic hazard assessments are needed for geographically distributed exposure such as lifelines (infrastructure), but may be important for large communities. Vulnerability evaluation includes quantification of fragility for construction or components including personnel. Exposure represents the existing or planned construction, facilities, infrastructure, and population in the affected area. Risk (expected loss) is the product of the quantified hazard, vulnerability (damage algorithm), and exposure which may be used to prepare emergency response plans, retrofit existing construction, or use community planning to avoid hazards. The risk estimate provides data needed to acquire earthquake insurance to assist with effective recovery following a severe event. Earthquake Scenarios used in Deterministic Risk Assessments provide detailed information on where hazards may be most severe, what system components are most susceptible to failure, and to evaluate the combined effects of a severe earthquake to the whole system or community. Casualties (injuries and death) have been the primary factor in defining building codes for seismic-resistant construction. Economic losses may be equally significant factors that can influence proactive hazard mitigation. Large urban earthquakes may produce catastrophic losses due to a cascading of effects often missed in PSHA. Economic collapse may ensue if damaged workplaces, disruption of utilities, and
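
    The risk definition used in this abstract (expected loss as the product of hazard, vulnerability, and exposure) can be written out as a short calculation; the scenario probabilities, damage ratios, and exposure value below are assumed for illustration only.

      # Expected annual loss as hazard x vulnerability x exposure, summed over
      # discrete shaking scenarios. All numbers are assumed for illustration.
      scenarios = [
          # (annual probability, mean damage ratio at that intensity)
          (0.10, 0.01),    # frequent, light shaking
          (0.02, 0.10),    # moderate event
          (0.002, 0.45),   # rare, severe event
      ]
      exposure_usd = 250_000_000      # replacement value of the exposed building stock

      expected_annual_loss = sum(p * damage_ratio * exposure_usd
                                 for p, damage_ratio in scenarios)
      print(f"Expected annual loss: ${expected_annual_loss:,.0f}")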

  20. Quantifying the effect of early warning systems for mitigating risks from alpine hazards

    NASA Astrophysics Data System (ADS)

    Straub, Daniel; Sättele, Martina; Bründl, Michael

    2016-04-01

    Early warning systems (EWS) are increasingly applied as flexible and non-intrusive measures for mitigating risks from alpine hazards. They are typically planned and installed in an ad-hoc manner, and their effectiveness is not quantified, in contrast to structural risk mitigation measures. The effect of an EWS on the risk depends on human decision makers: experts interpret the signals from EWS, authorities decide on intervention measures, and the public responds to the warnings. This interaction of the EWS with humans makes the quantification of their effectiveness challenging. Nevertheless, such a quantification is an important step in understanding, improving and justifying the use of EWS. We systematically discuss and demonstrate the factors that influence EWS effectiveness for alpine hazards, and present approaches and tools for analysing them. These include Bayesian network models, which are a powerful tool for an integral probabilistic assessment. The theory is illustrated through applications of warning systems for debris flow and rockfall hazards. References: Sättele M., Bründl M., Straub D. (in print). Quantifying the effectiveness of early warning systems for natural hazards. Natural Hazards and Earth System Sciences. Sättele M., Bründl M., Straub D. (2015). Reliability and Effectiveness of Warning Systems for Natural Hazards: Concepts and Application to Debris Flow Warning. Reliability Engineering & System Safety, 142: 192-202
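
    In its simplest form, the effectiveness quantification described above reduces to a chain of conditional probabilities (detection, timely warning, response). The sketch below computes the resulting risk reduction for assumed values; it is a far cruder stand-in for the Bayesian network models the authors apply.

      # Crude stand-in for an EWS effectiveness model: the fraction of event risk
      # removed is the product of the probabilities that the event is detected,
      # that a warning is issued in time, and that people or operators respond.
      # False-alarm effects are ignored here. All probabilities are assumed.
      p_detect = 0.90          # sensor network detects the precursor
      p_warn_in_time = 0.85    # experts/authorities issue the warning early enough
      p_response = 0.70        # population or operators act on the warning

      baseline_annual_risk = 2.0e6   # expected annual loss without the EWS (assumed)

      effectiveness = p_detect * p_warn_in_time * p_response
      residual_risk = baseline_annual_risk * (1.0 - effectiveness)

      print(f"EWS effectiveness: {effectiveness:.0%}")
      print(f"Residual expected annual loss: {residual_risk:,.0f}")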

  1. Mitigation of EMU Cut Glove Hazard from Micrometeoroid and Orbital Debris Impacts on ISS Handrails

    NASA Technical Reports Server (NTRS)

    Ryan, Shannon; Christiansen, Eric L.; Davis, Bruce A.; Ordonez, Erick

    2009-01-01

    Recent cut damage sustained on crewmember gloves during extravehicular activity (EVA) onboard the International Space Station (ISS) has been caused by contact with sharp edges or a pinch point, according to analysis of the damage. One potential source is protruding sharp-edged crater lips from micrometeoroid and orbital debris (MMOD) impacts on metallic handrails along EVA translation paths. A number of hypervelocity impact tests were performed on ISS handrails, and it was found that mm-sized projectiles were capable of inducing crater lip heights two orders of magnitude above the minimum value for glove abrasion concerns. Two techniques were evaluated for mitigating the cut glove hazard of MMOD impacts on ISS handrails: flexible overwraps, which act to limit contact between crewmember gloves and impact sites; and alternate materials, which form less hazardous impact crater profiles. In parallel with redesign efforts to increase the cut resilience of EMU gloves, the modifications to ISS handrails evaluated in this study provide the means to significantly reduce cut glove risk from MMOD impact craters.

  2. The U.S. National Tsunami Hazard Mitigation Program: Successes in Tsunami Preparedness

    NASA Astrophysics Data System (ADS)

    Whitmore, P.; Wilson, R. I.

    2012-12-01

    Formed in 1995 by Congressional action, the National Tsunami Hazard Mitigation Program (NTHMP) provides the framework for tsunami preparedness activities in the United States. The Program consists of the 28 U.S. coastal states, territories, and commonwealths (STCs), as well as three Federal agencies: the National Oceanic and Atmospheric Administration (NOAA), the Federal Emergency Management Agency (FEMA), and the United States Geological Survey (USGS). Since its inception, the NTHMP has advanced tsunami preparedness in the United States through accomplishments in many areas of tsunami preparedness: - Coordination and funding of tsunami hazard analysis and preparedness activities in STCs; - Development and execution of a coordinated plan to address education and outreach activities (materials, signage, and guides) within its membership; - Leading the effort to assist communities in meeting National Weather Service (NWS) TsunamiReady guidelines through development of evacuation maps and other planning activities; - Determination of tsunami hazard zones in the most highly threatened coastal communities throughout the country by detailed tsunami inundation studies; - Development of a benchmarking procedure for numerical tsunami models to ensure models used in the inundation studies meet consistent NOAA standards; - Creation of a national tsunami exercise framework to test tsunami warning system response; - Funding of community tsunami warning dissemination and reception systems such as sirens and NOAA Weather Radios; and, - Providing guidance to NOAA's Tsunami Warning Centers regarding warning dissemination and content. NTHMP activities have advanced the state of preparedness of United States coastal communities, and have helped save lives and property during recent tsunamis. Program successes as well as future plans, including maritime preparedness, are discussed.

  3. Defending Planet Earth: Near-Earth Object Surveys and Hazard Mitigation Strategies

    NASA Technical Reports Server (NTRS)

    2010-01-01

    The United States spends approximately four million dollars each year searching for near-Earth objects (NEOs). The objective is to detect those that may collide with Earth. The majority of this funding supports the operation of several observatories that scan the sky searching for NEOs. This, however, is insufficient for detecting the majority of NEOs that may present a tangible threat to humanity. A significantly smaller amount of funding supports ways to protect the Earth from such a potential collision, or "mitigation." In 2005, a Congressional mandate called for NASA to detect 90 percent of NEOs with diameters of 140 meters or greater by 2020. Defending Planet Earth: Near-Earth Object Surveys and Hazard Mitigation Strategies identifies the need for detection of objects as small as 30 to 50 meters, as these can be highly destructive. The book explores four main types of mitigation including civil defense, "slow push" or "pull" methods, kinetic impactors, and nuclear explosions. It also asserts that responding effectively to hazards posed by NEOs requires national and international cooperation. Defending Planet Earth: Near-Earth Object Surveys and Hazard Mitigation Strategies is a useful guide for scientists, astronomers, policy makers and engineers.

  4. Developing a scientific procedure for community based hazard mapping and risk mitigation

    NASA Astrophysics Data System (ADS)

    Verrier, M.

    2011-12-01

    As an international exchange student from the Geological Sciences Department at San Diego State University (SDSU), I joined the KKN-PPM program at Universitas Gadjah Mada (UGM), Yogyakarta, Indonesia, in July 2011 for 12 days (July 4th to July 16th) of its two-month duration (July 4th to August 25th). The KKN-PPM group I was attached to was designated 154 and was focused on Plosorejo Village, Karanganyar, Kerjo, Central Java, Indonesia. The mission of KKN-PPM 154 was to survey Plosorejo village for existing landslides, to generate a simple hazard susceptibility map that can be understood by local villagers, and then to begin dissemination of that map into the community. To generate our susceptibility map we first conducted a geological survey of the existing landslides in the field study area, with a focus on determining landslide triggers and gauging areas of susceptibility to future landslides. The methods for gauging susceptibility included lithological observation, the presence of linear cracking, and visible loss of structural integrity in structures such as villager homes, as well as collaboration with local residents and with the local rescue and response team. Three color distinctions were used in representing susceptibility: green, where there is no immediate danger of landslide damage; orange, where transportation routes are at risk of being disrupted by landslides; and red, where imminent landslide potential puts a home in direct danger. The landslide inventory and susceptibility data were compiled into digital media such as CorelDraw, ArcGIS and Google Earth. Once a technical map was generated, we presented it to the village leadership for confirmation and modification based on their experience. Finally, we began to use the technical susceptibility map to draft evacuation routes and meeting points in the event of landslides, as well as simple susceptibility maps that can be understood and utilized by local villagers. Landslide mitigation

  5. The influence of hazard models on GIS-based regional risk assessments and mitigation policies

    USGS Publications Warehouse

    Bernknopf, R.L.; Rabinovici, S.J.M.; Wood, N.J.; Dinitz, L.B.

    2006-01-01

    Geographic information systems (GIS) are important tools for understanding and communicating the spatial distribution of risks associated with natural hazards in regional economies. We present a GIS-based decision support system (DSS) for assessing community vulnerability to natural hazards and evaluating potential mitigation policy outcomes. The Land Use Portfolio Modeler (LUPM) integrates earth science and socioeconomic information to predict the economic impacts of loss-reduction strategies. However, the potential use of such systems in decision making may be limited when multiple but conflicting interpretations of the hazard are available. To explore this problem, we conduct a policy comparison using the LUPM to test the sensitivity of three available assessments of earthquake-induced lateral-spread ground failure susceptibility in a coastal California community. We find that the uncertainty regarding the interpretation of the science inputs can influence the development and implementation of natural hazard management policies. Copyright © 2006 Inderscience Enterprises Ltd.

  6. The NEOShield Project: Understanding the Mitigation-Relevant Physical Properties of Potentially Hazardous Asteroids

    NASA Astrophysics Data System (ADS)

    Harris, Alan W.; Drube, L.; Consortium, NEOShield

    2012-10-01

    NEOShield is a European-Union funded project to address impact hazard mitigation issues, coordinated by the German Aerospace Center, DLR. The NEOShield consortium consists of 13 research institutes, universities, and industrial partners from 6 countries and includes leading US and Russian space organizations. The primary aim of the 5.8 million euro, 3.5 year project, which commenced in January 2012, is to investigate in detail promising mitigation techniques, such as the kinetic impactor, blast deflection, and the gravity tractor, and devise feasible demonstration missions. Options for an international strategy for implementation when an actual impact threat arises will also be investigated. Our current scientific work is focused on examining the mitigation-relevant physical properties of the NEO population via observational data and laboratory experiments on asteroid surface analog materials. We are attempting to narrow the range of the expected properties of objects that are most likely to threaten the Earth and trigger space-borne mitigation attempts, and investigate how such objects would respond to different mitigation techniques. The results of our scientific work will flow into the technical phase of the project, during which detailed designs of feasible mitigation demonstration missions will be developed. We briefly describe the scope of the project and report on results obtained to date. Funded under EU FP7 program agreement no. 282703.

  7. Planning ahead for asteroid and comet hazard mitigation, phase 1: parameter space exploration and scenario modeling

    SciTech Connect

    Plesko, Catherine S; Clement, R Ryan; Weaver, Robert P; Bradley, Paul A; Huebner, Walter F

    2009-01-01

    The mitigation of impact hazards resulting from Earth-approaching asteroids and comets has received much attention in the popular press. However, many questions remain about the near-term and long-term feasibility and appropriate application of all proposed methods. Recent and ongoing ground- and space-based observations of small solar-system body composition and dynamics have revolutionized our understanding of these bodies (e.g., Ryan (2000), Fujiwara et al. (2006), and Jedicke et al. (2006)). Ongoing increases in computing power and algorithm sophistication make it possible to calculate the response of these inhomogeneous objects to proposed mitigation techniques. Here we present the first phase of a comprehensive hazard mitigation planning effort undertaken by Southwest Research Institute and Los Alamos National Laboratory. We begin by reviewing the parameter space of the object's physical and chemical composition and trajectory. We then use the radiation hydrocode RAGE (Gittings et al. 2008), Monte Carlo N-Particle (MCNP) radiation transport (see Clement et al., this conference), and N-body dynamics codes to explore the effects these variations in object properties have on the coupling of energy into the object from a variety of mitigation techniques, including deflection and disruption by nuclear and conventional munitions, and a kinetic impactor.

  8. A perspective multidisciplinary geological approach for mitigation of effects due to the asbestos hazard

    NASA Astrophysics Data System (ADS)

    Vignaroli, Gianluca; Rossetti, Federico; Belardi, Girolamo; Billi, Andrea

    2010-05-01

    Asbestos-bearing rock sequences constitute a remarkable natural hazard that poses an important threat to human health and may be at the origin of diseases such as asbestosis, mesothelioma, and lung cancer. Presently, asbestos is classified as a Category 1 carcinogen by world health authorities. Although regulatory agencies in many countries prohibit or restrict the use of asbestos and regulate environmental asbestos exposure, the impact of asbestos on human life still constitutes a major problem. Naturally occurring asbestos includes serpentine and amphibole minerals characterised by fibrous morphology, and it is a constituent of mineralogical associations typical of mafic and ultramafic rocks within ophiolitic sequences. Release of fibres can occur both through natural processes (erosion) and through human activities requiring fragmentation of ophiolite rocks (quarrying, tunnelling, railway construction, etc.). As a consequence, vulnerability is increasing in sites where workers and residents are exposed to fibres dispersed during mining and milling of ophiolitic rocks. By analysing in the field different exposures of ophiolitic sequences from the Italian peninsula, and after an extensive review of the existing literature, we stress the importance of the geological context (origin, tectonic and deformation history) of ophiolites as a first-order parameter in evaluating the asbestos hazard. Integrated structural, textural, mineralogical, and petrological studies significantly improve our understanding of the mechanisms governing the nucleation/growth of fibrous minerals in deformation structures (both ductile and brittle) within ophiolitic rocks. A primary role is recognised for the structural processes favouring fibrous mineralization, with correlations existing between the fibre parameters (such as mineralogical composition, texture, and mechanical characteristics) and the particles released into the air (such as shape, size, and amount liberated

  9. A portfolio approach to evaluating natural hazard mitigation policies: An Application to lateral-spread ground failure in Coastal California

    USGS Publications Warehouse

    Bernknopf, R.L.; Dinitz, L.B.; Rabinovici, S.J.M.; Evans, A.M.

    2001-01-01

    In the past, efforts to prevent catastrophic losses from natural hazards have largely been undertaken by individual property owners based on site-specific evaluations of risks to particular buildings. Public efforts to assess community vulnerability and encourage mitigation have focused on either aggregating site-specific estimates or adopting standards based upon broad assumptions about regional risks. This paper develops an alternative, intermediate-scale approach to regional risk assessment and the evaluation of community mitigation policies. Properties are grouped into types with similar land uses and levels of hazard, and hypothetical community mitigation strategies for protecting these properties are modeled like investment portfolios. The portfolios consist of investments in mitigation against the risk to a community posed by a specific natural hazard, and are defined by a community's mitigation budget and the proportion of the budget invested in locations of each type. The usefulness of this approach is demonstrated through an integrated assessment of earthquake-induced lateral-spread ground failure risk in the Watsonville, California area. Data from the magnitude 6.9 Loma Prieta earthquake of 1989 are used to model lateral-spread ground failure susceptibility. Earth science and economic data are combined and analyzed in a Geographic Information System (GIS). The portfolio model is then used to evaluate the benefits of mitigating the risk in different locations. Two mitigation policies, one that prioritizes mitigation by land use type and the other by hazard zone, are compared with a status quo policy of doing no further mitigation beyond that which already exists. The portfolio representing the hazard zone rule yields a higher expected return than the land use portfolio does; however, the hazard zone portfolio experiences a higher standard deviation. Therefore, neither portfolio is clearly preferred. The two mitigation policies both reduce expected losses
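
    The comparison between the two mitigation portfolios rests on standard portfolio statistics (expected return and standard deviation of avoided losses). The sketch below reproduces that kind of comparison generically with assumed per-dollar returns; it does not use the study's data.

      # Generic portfolio comparison: expected return and standard deviation of
      # two hypothetical mitigation allocations over simulated earthquake outcomes.
      # "Returns" are avoided losses per dollar invested; all values are assumed.
      import numpy as np

      rng = np.random.default_rng(1)
      n_sims = 10_000
      # Simulated per-dollar returns of mitigating each property type.
      high_hazard = rng.normal(0.35, 0.40, n_sims)   # bigger payoff, more variable
      low_hazard = rng.normal(0.10, 0.08, n_sims)

      portfolios = {
          "hazard-zone rule": 0.8 * high_hazard + 0.2 * low_hazard,
          "land-use rule":    0.4 * high_hazard + 0.6 * low_hazard,
      }
      for name, returns in portfolios.items():
          print(f"{name:>17}: expected return {returns.mean():.2f}, "
                f"std dev {returns.std():.2f}")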

  10. Lidar and Electro-Optics for Atmospheric Hazard Sensing and Mitigation

    NASA Technical Reports Server (NTRS)

    Clark, Ivan O.

    2012-01-01

    This paper provides an overview of the research and development efforts of the Lidar and Electro-Optics element of NASA's Aviation Safety Program. This element is seeking to improve the understanding of the atmospheric environments encountered by aviation and to provide enhanced situation awareness for atmospheric hazards. The improved understanding of atmospheric conditions is specifically to develop sensor signatures for atmospheric hazards. The current emphasis is on kinetic air hazards such as turbulence, aircraft wake vortices, mountain rotors, and windshear. Additional efforts are underway to identify and quantify the hazards arising from multi-phase atmospheric conditions including liquid and solid hydrometeors and volcanic ash. When the multi-phase conditions act as obscurants that result in reduced visual awareness, the element seeks to mitigate the hazards associated with these diminished visual environments. The overall purpose of these efforts is to enable safety improvements for air transport class and business jet class aircraft as the transition to the Next Generation Air Transportation System occurs.

  11. Nationwide Operational Assessment of Hazards and success stories in disaster prevention and mitigation in the Philippines

    NASA Astrophysics Data System (ADS)

    Mahar Francisco Lagmay, Alfredo

    2016-04-01

    The Philippines, being a locus of typhoons, tsunamis, earthquakes, and volcanic eruptions, is a hotbed of disasters. Natural hazards inflict loss of lives and costly damage to property in the country. In 2011, after tropical storm Washi devastated cities in southern Philippines, the Department of Science and Technology put in place a responsive program to warn and give communities hours-in-advance lead time to prepare for imminent hazards, and to use advanced science and technology to enhance geohazard maps for more effective disaster prevention and mitigation. Since its launch, there have been many success stories on the use of Project NOAH, which after Typhoon Haiyan was integrated into the Pre-Disaster Risk Assessment (PDRA) system of the National Disaster Risk Reduction and Management Council (NDRRMC), the government agency tasked to prepare for, and respond to, natural calamities. Learning from past disasters, NDRRMC now issues warnings, with scientific advice from DOST-Project NOAH and PAGASA (Philippine Weather Bureau), that are hazard-specific, area-focused, and time-bound. Severe weather events in 2015 generated dangerous hazard phenomena such as widespread floods and massive debris flows, which, if not for timely, accessible, and understandable warnings, could have turned into disasters. We call these events "disasters that did not happen". The innovative warning system of the Philippine government has so far proven effective in addressing the impacts of hydrometeorological hazards and can be employed elsewhere in the world.

  12. The asteroid and comet impact hazard: risk assessment and mitigation options.

    PubMed

    Gritzner, Christian; Dürfeld, Kai; Kasper, Jan; Fasoulas, Stefanos

    2006-08-01

    The impact of extraterrestrial matter onto Earth is a continuous process. On average, some 50,000 tons of dust are delivered to our planet every year. While objects smaller than about 30 m mainly disintegrate in the Earth's atmosphere, larger ones can penetrate through it and cause damage on the ground. When an object of hundreds of meters in diameter impacts an ocean, a tsunami is created that can devastate coastal cities. Further, if a km-sized object hit the Earth it would cause a global catastrophe due to the transport of enormous amounts of dust and vapour into the atmosphere resulting in a change in the Earth's climate. This article gives an overview of the near-Earth asteroid and comet (near-Earth object-NEO) impact hazard and the NEO search programmes which are gathering important data on these objects. It also points out options for impact hazard mitigation by using deflection systems. It further discusses the critical constraints for NEO deflection strategies and systems as well as mitigation and evacuation costs and benefits. Recommendations are given for future activities to solve the NEO impact hazard problem.

  13. Linear Aerospike SR-71 Experiment (LASRE): Aerospace Propulsion Hazard Mitigation Systems

    NASA Technical Reports Server (NTRS)

    Mizukami, Masashi; Corpening, Griffin P.; Ray, Ronald J.; Hass, Neal; Ennix, Kimberly A.; Lazaroff, Scott M.

    1998-01-01

    A major hazard posed by the propulsion system of hypersonic and space vehicles is the possibility of fire or explosion in the vehicle environment. The hazard is mitigated by minimizing or detecting, in the vehicle environment, the three ingredients essential to producing fire: fuel, oxidizer, and an ignition source. The Linear Aerospike SR-71 Experiment (LASRE) consisted of a linear aerospike rocket engine integrated into one-half of an X-33-like lifting body shape, carried on top of an SR-71 aircraft. Gaseous hydrogen and liquid oxygen were used as propellants. Although LASRE is a one-of-a-kind experimental system, it must be rated for piloted flight, so this test presented a unique challenge. To help meet safety requirements, the following propulsion hazard mitigation systems were incorporated into the experiment: pod inert purge, oxygen sensors, a hydrogen leak detection algorithm, hydrogen sensors, fire detection and pod temperature thermocouples, water misting, and control room displays. These systems are described, and their development discussed. Analyses, ground test, and flight test results are presented, as are findings and lessons learned.

  14. The asteroid and comet impact hazard: risk assessment and mitigation options

    NASA Astrophysics Data System (ADS)

    Gritzner, Christian; Dürfeld, Kai; Kasper, Jan; Fasoulas, Stefanos

    2006-08-01

    The impact of extraterrestrial matter onto Earth is a continuous process. On average, some 50,000 tons of dust are delivered to our planet every year. While objects smaller than about 30 m mainly disintegrate in the Earth’s atmosphere, larger ones can penetrate through it and cause damage on the ground. When an object of hundreds of meters in diameter impacts an ocean, a tsunami is created that can devastate coastal cities. Further, if a km-sized object hit the Earth it would cause a global catastrophe due to the transport of enormous amounts of dust and vapour into the atmosphere resulting in a change in the Earth’s climate. This article gives an overview of the near-Earth asteroid and comet (near-Earth object-NEO) impact hazard and the NEO search programmes which are gathering important data on these objects. It also points out options for impact hazard mitigation by using deflection systems. It further discusses the critical constraints for NEO deflection strategies and systems as well as mitigation and evacuation costs and benefits. Recommendations are given for future activities to solve the NEO impact hazard problem.

  15. L-Reactor Habitat Mitigation Study

    SciTech Connect

    Not Available

    1988-02-01

    The L-Reactor Fish and Wildlife Resource Mitigation Study was conducted to quantify the effects on habitat of the L-Reactor restart and to identify the appropriate mitigation for these impacts. The completed project evaluated in this study includes construction of a 1000-acre reactor cooling reservoir formed by damming Steel Creek. Habitat impacts identified include a loss of approximately 3,700 average annual habitat units. This report presents a mitigation plan, Plan A, to offset these habitat losses. Plan A will offset losses for all species studied, except white-tailed deer. The South Carolina Wildlife and Marine Resources Department strongly recommends creation of a game management area to provide realistic mitigation for loss of deer habitat. 10 refs., 5 figs., 3 tabs. (MHB)

  16. Planning Ahead for Asteroid Hazard Mitigation, Phase 1: Parameter Space Exploration and Scenario Modeling

    NASA Astrophysics Data System (ADS)

    Plesko, C.; Weaver, R.; Clement, R.; Bradley, P.; Huebner, W.

    The mitigation of impact hazards resulting from Earth-approaching asteroids and comets has received much attention in the popular press. However, many questions remain about the near-term and long-term feasibility and appropriate application of all proposed methods. Recent and ongoing ground- and space-based observations of small solar system body composition and dynamics have revolutionized our understanding of these bodies (e.g., Ryan (2000), Fujiwara et al. (2006), and Jedicke et al. (2006)). Ongoing increases in computing power and algorithm sophistication make it possible to calculate the response of these inhomogeneous objects to proposed mitigation techniques. Here we present the first phase of a comprehensive hazard mitigation planning effort undertaken by Southwest Research Institute and Los Alamos National Laboratory. We begin by reviewing the parameter space of the object's physical and chemical composition and trajectory. We then use the radiation hydrocode RAGE (Gittings et al. 2008), Monte Carlo N-Particle (MCNP) radiation transport (see Clement et al., this conference), and N-body dynamics codes to explore the effects these variations in object properties have on the coupling of energy into the object from a variety of mitigation techniques, including deflection and disruption by nuclear and conventional munitions, and a kinetic impactor. Preliminary results for models of the deflection of a 100 m basalt sphere by a 100 kt nuclear burst (Bradley et al., LPSC 2009) are encouraging. A 40 cm/s velocity away from the burst is imparted to the object's center of mass without disruption. Further results will be presented at the meeting.
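
    To put the quoted 40 cm/s impulse in context, a first-order estimate of the deflection it buys is simply the velocity change multiplied by the warning time, with orbital amplification effects ignored. The back-of-the-envelope sketch below is purely illustrative and is not the authors' N-body result.

      # Back-of-the-envelope deflection estimate: displacement ~ delta_v * lead time.
      # Ignores orbital-mechanics amplification; purely illustrative.
      SECONDS_PER_YEAR = 3.156e7
      EARTH_RADIUS_KM = 6371.0

      delta_v_m_per_s = 0.40      # 40 cm/s, as in the preliminary result above

      for lead_time_years in (0.5, 1, 5, 10):
          deflection_km = delta_v_m_per_s * lead_time_years * SECONDS_PER_YEAR / 1000.0
          print(f"lead time {lead_time_years:>4} yr -> deflection ~ "
                f"{deflection_km:,.0f} km ({deflection_km / EARTH_RADIUS_KM:.1f} Earth radii)")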

  17. The Puerto Rico Component of the National Tsunami Hazard and Mitigation Program Pr-Nthmp

    NASA Astrophysics Data System (ADS)

    Huerfano Moreno, V. A.; Hincapie-Cardenas, C. M.

    2014-12-01

    Tsunami hazard assessment, detection, warning, education and outreach efforts are intended to reduce losses to life and property. The Puerto Rico Seismic Network (PRSN) is participating in an effort with local and federal agencies to develop tsunami hazard risk reduction strategies under the National Tsunami Hazards and Mitigation Program (NTHMP). This grant supports the TsunamiReady program, which is the basis of tsunami preparedness and mitigation in PR. The Caribbean region has a documented history of damaging tsunamis that have affected coastal areas. The seismic water waves originating in the prominent fault systems around PR are considered to be a near-field hazard for Puerto Rico and the Virgin Islands (PR/VI) because they can reach coastal areas within a few minutes after the earthquake. Sources for local, regional and tele-tsunamis have been identified and modeled, and tsunami evacuation maps were prepared for PR. These maps were generated in three phases: first, hypothetical tsunami scenarios were developed on the basis of the parameters of potential underwater earthquakes; second, each of these scenarios was simulated; the third step was to determine the worst-case scenario, the maximum of maximums (MOM). The run-ups were drawn on GIS-referenced maps and aerial photographs. These products are being used by emergency managers to educate the public and develop mitigation strategies. Online maps and related evacuation products are available to the public via the PR-TDST (PR Tsunami Decision Support Tool). Currently, all 44 coastal municipalities are recognized as TsunamiReady by the US NWS. The main goal of the program is to declare Puerto Rico as TsunamiReady, including two cities that are not coastal but could be affected by tsunamis. Based on these evacuation maps, tsunami signs were installed, vulnerability profiles were created, communication systems to receive and disseminate tsunami messages were installed in each TWFP, and tsunami response plans were approved.

  18. Use of a Novel Visual Metaphor Measure (PRISM) to Evaluate School Children's Perceptions of Natural Hazards, Sources of Hazard Information, Hazard Mitigation Organizations, and the Effectiveness of Future Hazard Education Programs in Dominica, Eastern Caribbean

    NASA Astrophysics Data System (ADS)

    Parham, Martin; Day, Simon; Teeuw, Richard; Solana, Carmen; Sensky, Tom

    2015-04-01

    This project aims to study the development of understanding of natural hazards (and of hazard mitigation) from the age of 11 to the age of 15 in secondary school children from 5 geographically and socially different schools on Dominica, through repeated interviews with the students and their teachers. These interviews will be coupled with a structured course of hazard education in the Geography syllabus; the students not taking Geography will form a control group. To avoid distortion of our results arising from the developing verbalization and literacy skills of the students over the 5 years of the project, we have adapted the PRISM tool used in clinical practice to assess patient perceptions of illness and treatment (Buchi & Sensky, 1999). This novel measure is essentially non-verbal, and uses spatial positions of moveable markers ("object" markers) on a board, relative to a fixed marker that represents the subject's "self", as a visual metaphor for the importance of the object to the subject. The subjects also explain their reasons for placing the markers as they have, to provide additional qualitative information. The PRISM method thus produces data on the perceptions measured on the board that can be subjected to statistical analysis, and also succinct qualitative data about each subject. Our study will gather data on participants' perceptions of different natural hazards, different sources of information about these, and organizations or individuals to whom they would go for help in a disaster, and investigate how these vary with geographical and social factors. To illustrate the method, which is generalisable, we present results from our initial interviews of the cohort of 11 year olds whom we will follow through their secondary school education. Büchi, S., & Sensky, T. (1999). PRISM: Pictorial Representation of Illness and Self Measure: a brief nonverbal measure of illness impact and therapeutic aid in psychosomatic medicine. Psychosomatics, 40(4), 314-320.
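
    Because the PRISM board reduces each perception to a marker position, the self-to-object distances can be computed and summarized directly for statistical analysis. The short sketch below is a hypothetical illustration (the board coordinates and hazard labels are invented, not data from the study) of how one participant's placements might be turned into a distance score per object.

        # Hypothetical PRISM-style scoring: a smaller self-to-object distance is
        # read as greater perceived importance of that object to the participant.
        from math import hypot

        SELF_POSITION = (0.0, 0.0)  # fixed "self" marker at the board origin (assumed)

        def distance_to_self(marker_xy):
            return hypot(marker_xy[0] - SELF_POSITION[0], marker_xy[1] - SELF_POSITION[1])

        # Invented example placements (cm) for one participant.
        placements = {
            "hurricane": (5.0, 3.0),
            "earthquake": (18.0, 10.0),
            "volcanic eruption": (25.0, 2.0),
        }

        for hazard, xy in sorted(placements.items(), key=lambda kv: distance_to_self(kv[1])):
            print(f"{hazard:<18} distance to self: {distance_to_self(xy):5.1f} cm")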

  19. Use of a Novel Visual Metaphor Measure (PRISM) to Evaluate School Children's Perceptions of Natural Hazards, Sources of Hazard Information, Hazard Mitigation Organizations, and the Effectiveness of Future Hazard Education Programs in Dominica, Eastern Caribbean

    NASA Astrophysics Data System (ADS)

    Parham, M.; Day, S. J.; Teeuw, R. M.; Solana, C.; Sensky, T.

    2014-12-01

    This project aims to study the development of understanding of natural hazards (and of hazard mitigation) from the age of 11 to the age of 15 in secondary school children from 5 geographically and socially different schools on Dominica, through repeated interviews with the students and their teachers. These interviews will be coupled with a structured course of hazard education in the Geography syllabus; the students not taking Geography will form a control group. To avoid distortion of our results arising from the developing verbalization and literacy skills of the students over the 5 years of the project, we have adapted the PRISM tool used in clinical practice to assess patient perceptions of illness and treatment (Buchi & Sensky, 1999). This novel measure is essentially non-verbal, and uses spatial positions of moveable markers ("object" markers) on a board, relative to a fixed marker that represents the subject's "self", as a visual metaphor for the importance of the object to the subject. The subjects also explain their reasons for placing the markers as they have, to provide additional qualitative information. The PRISM method thus produces data on the perceptions measured on the board that can be subjected to statistical analysis, and also succinct qualitative data about each subject. Our study will gather data on participants' perceptions of different natural hazards, different sources of information about these, and organizations or individuals to whom they would go for help in a disaster, and investigate how these vary with geographical and social factors. To illustrate the method, which is generalisable, we present results from our initial interviews of the cohort of 11 year olds whom we will follow through their secondary school education. Büchi, S., & Sensky, T. (1999). PRISM: Pictorial Representation of Illness and Self Measure: a brief nonverbal measure of illness impact and therapeutic aid in psychosomatic medicine. Psychosomatics, 40(4), 314-320.

  20. Impact Hazard Mitigation: Understanding the Effects of Nuclear Explosive Outputs on Comets and Asteroids

    NASA Astrophysics Data System (ADS)

    Clement, R.

    The NASA 2007 white paper "Near-Earth Object Survey and Deflection Analysis of Alternatives" affirms deflection as the safest and most effective means of potentially hazardous object (PHO) impact prevention. It also calls for further studies of object deflection. In principle, deflection of a PHO may be accomplished by using kinetic impactors, chemical explosives, gravity tractors, solar sails, or nuclear munitions. Of the sudden impulse options, nuclear munitions are by far the most efficient in terms of yield-per-unit-mass launched and are technically mature. However, there are still significant questions about the response of a comet or asteroid to a nuclear burst. Recent and ongoing observational and experimental work is revolutionizing our understanding of the physical and chemical properties of these bodies (e.g., Ryan (2000), Fujiwara et al. (2006), and Jedicke et al. (2006)). This improved understanding of small solar-system bodies, combined with current state-of-the-art modeling and simulation capabilities, which have also improved dramatically in recent years, allows for a science-based, comprehensive study of PHO mitigation techniques. Here we present an examination of the effects of radiation from a nuclear explosion on potentially hazardous asteroids and comets through Monte Carlo N-Particle code (MCNP) simulation techniques. MCNP is a general-purpose particle transport code commonly used to model neutron, photon, and electron transport for medical physics, reactor design and safety, accelerator target and detector design, and a variety of other applications including modeling the propagation of epithermal neutrons through the Martian regolith (Prettyman 2002). It is a massively parallel code that can conduct simulations in 1-3 dimensions, complicated geometries, and with extremely powerful variance reduction techniques. It uses current nuclear cross section data, where available, and fills in the gaps with analytical models where data are not available.

  1. Impact hazard mitigation: understanding the effects of nuclear explosive outputs on comets and asteroids

    SciTech Connect

    Clement, Ralph R C; Plesko, Catherine S; Bradley, Paul A; Conlon, Leann M

    2009-01-01

    The NASA 2007 white paper "Near-Earth Object Survey and Deflection Analysis of Alternatives" affirms deflection as the safest and most effective means of potentially hazardous object (PHO) impact prevention. It also calls for further studies of object deflection. In principle, deflection of a PHO may be accomplished by using kinetic impactors, chemical explosives, gravity tractors, solar sails, or nuclear munitions. Of the sudden impulse options, nuclear munitions are by far the most efficient in terms of yield-per-unit-mass launched and are technically mature. However, there are still significant questions about the response of a comet or asteroid to a nuclear burst. Recent and ongoing observational and experimental work is revolutionizing our understanding of the physical and chemical properties of these bodies (e.g., Ryan (2000), Fujiwara et al. (2006), and Jedicke et al. (2006)). This improved understanding of small solar-system bodies, combined with current state-of-the-art modeling and simulation capabilities, which have also improved dramatically in recent years, allows for a science-based, comprehensive study of PHO mitigation techniques. Here we present an examination of the effects of radiation from a nuclear explosion on potentially hazardous asteroids and comets through Monte Carlo N-Particle code (MCNP) simulation techniques. MCNP is a general-purpose particle transport code commonly used to model neutron, photon, and electron transport for medical physics, reactor design and safety, accelerator target and detector design, and a variety of other applications including modeling the propagation of epithermal neutrons through the Martian regolith (Prettyman 2002). It is a massively parallel code that can conduct simulations in 1-3 dimensions, complicated geometries, and with extremely powerful variance reduction techniques. It uses current nuclear cross section data, where available, and fills in the gaps with analytical models where data are not available.

  2. Identification, prediction, and mitigation of sinkhole hazards in evaporite karst areas

    USGS Publications Warehouse

    Gutierrez, F.; Cooper, A.H.; Johnson, K.S.

    2008-01-01

    Sinkholes usually have a higher probability of occurrence and a greater genetic diversity in evaporite terrains than in carbonate karst areas. This is because evaporites have a higher solubility and, commonly, a lower mechanical strength. Subsidence damage resulting from evaporite dissolution generates substantial losses throughout the world, but the causes are only well understood in a few areas. To deal with these hazards, a phased approach is needed for sinkhole identification, investigation, prediction, and mitigation. Identification techniques include field surveys and geomorphological mapping combined with accounts from local people and historical sources. Detailed sinkhole maps can be constructed from sequential historical maps, recent topographical maps, and digital elevation models (DEMs) complemented with building-damage surveying, remote sensing, and high-resolution geodetic surveys. On a more detailed level, information from exposed paleosubsidence features (paleokarst), speleological explorations, geophysical investigations, trenching, dating techniques, and boreholes may help in investigating dissolution and subsidence features. Information on the hydrogeological pathways, including caves, springs, and swallow holes, is particularly important, especially when corroborated by tracer tests. These diverse data sources make a valuable database: the karst inventory. From this dataset, sinkhole susceptibility zonations (relative probability) may be produced based on the spatial distribution of the features and good knowledge of the local geology. Sinkhole distribution can be investigated by spatial distribution analysis techniques including studies of preferential elongation, alignment, and nearest neighbor analysis. More objective susceptibility models may be obtained by analyzing the statistical relationships between the known sinkholes and the conditioning factors. Chronological information on sinkhole formation is required to estimate the probability of occurrence.
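
    One of the spatial techniques named above, nearest neighbor analysis, can be sketched in a few lines. The example below computes the Clark-Evans ratio of observed to expected mean nearest-neighbor distance for a set of sinkhole coordinates; values well below 1 suggest clustering. The coordinates and study-area size are invented for illustration, not taken from the paper.

        # Clark-Evans nearest-neighbour ratio for a sinkhole point pattern
        # (illustrative sketch; coordinates and study area are made up).
        from math import hypot, sqrt

        sinkholes = [(120, 340), (135, 355), (128, 349), (600, 90), (612, 105), (850, 700)]
        area_m2 = 1000.0 * 1000.0  # assumed 1 km x 1 km study area

        def mean_nearest_neighbour(points):
            total = 0.0
            for i, (x1, y1) in enumerate(points):
                nearest = min(hypot(x1 - x2, y1 - y2)
                              for j, (x2, y2) in enumerate(points) if i != j)
                total += nearest
            return total / len(points)

        observed = mean_nearest_neighbour(sinkholes)
        expected = 0.5 / sqrt(len(sinkholes) / area_m2)  # expectation for a random (Poisson) pattern
        print(f"Clark-Evans ratio R = {observed / expected:.2f}  (R < 1 indicates clustering)")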

  3. Hazardous crater lakes studied

    NASA Astrophysics Data System (ADS)

    Kusakabe, Minoru

    Crater lakes usually sit on top of volcanic conduits and act as condensers of magmatic vapor. Studies of crater lakes can therefore provide information on both deep magmatic activity and variations in the degassing state of a shallow magmatic body. The Lake Nyos gas disaster of August 1986 and a similar event in August 1984 at Lake Monoun, both in Cameroon, resulted from the accumulation of magmatic CO2 in the bottom layers of the lakes. Geochemical monitoring of crater lakes is a promising tool for forecasting not only limnic but also volcanic eruptions. Acid-mineralized waters formed by condensation of hot magmatic volatiles in crater lakes are thought to bear some resemblance to hydrothermal fluids acting in the genesis of acid-sulfate alteration and Au-Cu-Ag mineralization of volcanic-hosted precious metal deposits.

  4. 2009 ERUPTION OF REDOUBT VOLCANO: Lahars, Oil, and the Role of Science in Hazards Mitigation (Invited)

    NASA Astrophysics Data System (ADS)

    Swenson, R.; Nye, C. J.

    2009-12-01

    In March 2009, Redoubt Volcano erupted for the third time in 45 years. More than 19 explosions produced ash plumes to 60,000 ft asl, lahar flows of mud and ice down the Drift River ~30 miles to the coast, and tephra fall of up to 1.5 mm onto surrounding communities. The eruption had a severe impact on many operations. Airlines were forced to cancel or divert hundreds of international and domestic passenger and cargo flights, and Anchorage International Airport closed for over 12 hours. Mudflows and floods down the Drift River to the coast impacted operations at the Drift River Oil Terminal (DROT), which was forced to shut down and ultimately be evacuated. Prior mitigation efforts to protect the DROT oil tank farm from potential impacts associated with a major eruptive event were successful, and none of the 148,000 barrels of oil stored at the facility was spilled or released. Nevertheless, the threat of continued eruptive activity at Redoubt, with the possibility of continued lahar flows down the Drift River alluvial fan, required that an incident command post be established so that the US Coast Guard, Alaska Dept. of Environmental Conservation, and the Cook Inlet Pipeline Company could coordinate a response to the potential hazards. Ultimately, the incident command team relied heavily on continuous real-time data updates from the Alaska Volcano Observatory, as well as continuous geologic interpretations and risk analysis by the USGS Volcanic Hazards group, the State Division of Geological and Geophysical Surveys and the University of Alaska Geophysical Institute, all members of the collaborative effort of the Alaska Volcano Observatory. The great success story that unfolded attests to the efforts of the incident command team, and their reliance on real-time scientific analysis from scientific experts. The positive results also highlight how pre-disaster mitigation and monitoring efforts, in concert with hazards response planning, can be used in a cooperative industry

  5. Bringing New Tools and Techniques to Bear on Earthquake Hazard Analysis and Mitigation

    NASA Astrophysics Data System (ADS)

    Willemann, R. J.; Pulliam, J.; Polanco, E.; Louie, J. N.; Huerta-Lopez, C.; Schmitz, M.; Moschetti, M. P.; Huerfano Moreno, V.; Pasyanos, M.

    2013-12-01

    During July 2013, IRIS held an Advanced Studies Institute in Santo Domingo, Dominican Republic, that was designed to enable early-career scientists who already have mastered the fundamentals of seismology to begin collaborating in frontier seismological research. The Institute was conceived at a strategic planning workshop in Heredia, Costa Rica, that was supported and partially funded by USAID, with a goal of building geophysical capacity to mitigate the effects of future earthquakes. To address this broad goal, we drew participants from a dozen different countries of Middle America. Our objectives were to develop understanding of the principles of earthquake hazard analysis, particularly site characterization techniques, and to facilitate future research collaborations. The Institute was divided into three main sections: overviews on the fundamentals of earthquake hazard analysis and lectures on the theory behind methods of site characterization; fieldwork where participants acquired new data of the types typically used in site characterization; and computer-based analysis projects in which participants applied their newly-learned techniques to the data they collected. This was the first IRIS institute to combine an instructional short course with field work for data acquisition. Participants broke into small teams to acquire data, analyze it on their own computers, and then make presentations to the assembled group describing their techniques and results. Using broadband three-component seismometers, the teams acquired data for Spatial Auto-Correlation (SPAC) analysis at seven array locations, and Horizontal to Vertical Spectral Ratio (HVSR) analysis at 60 individual sites along six profiles throughout Santo Domingo. Using a 24-channel geophone string, the teams acquired data for Refraction Microtremor (SeisOptReMi™ from Optim) analysis at 11 sites, with supplementary data for active-source Multi-channel Spectral Analysis of Surface Waves (MASW) analysis at
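
    The HVSR technique mentioned here is, at its core, the ratio of smoothed horizontal to vertical amplitude spectra of ambient noise. The sketch below is a minimal illustration, assuming three equal-length component recordings as NumPy arrays; it omits the detrending, tapering, smoothing, and averaging over many noise windows that practical HVSR processing requires.

        # Minimal HVSR sketch: ratio of horizontal to vertical amplitude spectra
        # for a single noise window (real workflows average many windows).
        import numpy as np

        def hvsr(north, east, vertical, dt):
            """Return frequencies and H/V spectral ratio for one noise window."""
            n = len(vertical)
            freqs = np.fft.rfftfreq(n, d=dt)
            spec_n = np.abs(np.fft.rfft(north))
            spec_e = np.abs(np.fft.rfft(east))
            spec_v = np.abs(np.fft.rfft(vertical))
            horizontal = np.sqrt(0.5 * (spec_n**2 + spec_e**2))  # quadratic mean of horizontals
            return freqs[1:], horizontal[1:] / spec_v[1:]         # drop the zero-frequency bin

        if __name__ == "__main__":
            dt = 0.01                      # 100 Hz sampling (assumed)
            t = np.arange(0, 120, dt)      # two minutes of synthetic noise
            rng = np.random.default_rng(0)
            n_, e_, v_ = (rng.standard_normal(t.size) for _ in range(3))
            f, ratio = hvsr(n_, e_, v_, dt)
            print(f"Peak H/V of {ratio.max():.1f} near {f[np.argmax(ratio)]:.2f} Hz")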

  6. Coupling Radar Rainfall Estimation and Hydrological Modelling For Flash-flood Hazard Mitigation

    NASA Astrophysics Data System (ADS)

    Borga, M.; Creutin, J. D.

    Flood risk mitigation is accomplished through managing either or both the hazard and vulnerability. Flood hazard may be reduced through structural measures which alter the frequency of flood levels in the area. The vulnerability of a community to flood loss can be mitigated through changing or regulating land use and through flood warning and effective emergency response. When dealing with flash-flood hazard, it is generally accepted that the most effective way (and in many instances the only affordable in a sustainable perspective) to mitigate the risk is by reducing the vulnerability of the involved communities, in particular by implementing flood warning systems and community self-help programs. However, both the inherent characteristics of the atmospheric and hydrologic processes involved in flash-flooding and the changing societal needs provide a tremendous challenge to traditional flood forecasting and warning concepts. In fact, the targets of these systems are traditionally localised like urbanised sectors or hydraulic structures. Given the small spatial scale that characterises flash floods and the development of dispersed urbanisation, transportation, green tourism and water sports, human lives and property are exposed to flash flood risk in a scattered manner. This must be taken into consideration in flash flood warning strategies and the investigated region should be considered as a whole and every section of the drainage network as a potential target for hydrological warnings. Radar technology offers the potential to provide information describing rain intensities almost continuously in time and space. Recent research results indicate that coupling radar information to distributed hydrologic modelling can provide hydrologic forecasts at all potentially flooded points of a region. Nevertheless, very few flood warning services use radar data more than on a qualitative basis. After a short review of current understanding in this area, two
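
    As a minimal sketch of the coupling idea (not the authors' model), radar-derived rain rates for one grid cell can be routed through a simple linear-reservoir runoff model; a distributed system repeats this, with routing between cells, over the whole drainage network. The parameters and rainfall series below are invented for illustration.

        # Sketch of driving a per-cell linear-reservoir runoff model with
        # radar-estimated rain rates (illustrative; parameters are invented).
        import numpy as np

        def linear_reservoir_runoff(rain_mm_h, k_hours=3.0, dt_hours=0.25):
            """Route a rain-rate series (mm/h) through a linear reservoir.
            Returns discharge as an equivalent depth rate in mm/h."""
            storage = 0.0
            discharge = []
            for r in rain_mm_h:
                storage += r * dt_hours          # add rainfall depth over the step
                q = storage / k_hours            # linear outflow: Q = S / k
                storage -= q * dt_hours
                discharge.append(q)
            return np.array(discharge)

        if __name__ == "__main__":
            # One synthetic radar pixel: a 3-hour storm sampled every 15 minutes.
            rain = np.concatenate([np.full(12, 20.0), np.zeros(24)])   # mm/h
            q = linear_reservoir_runoff(rain)
            print(f"Peak runoff {q.max():.1f} mm/h, {np.argmax(q) * 0.25:.2f} h after start")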

  7. Radar Studies of Aviation Hazards

    DTIC Science & Technology

    1994-05-31

    Fragments of the report abstract (contract F19628-93-C-0054): the following processing steps have been adopted: (a) acquire single-scan radar data, (b) distinguish individual storms, and (c) eliminate spurious data. ... occurred only with radar reflectivities above 40 dBZ at the -10°C level and cloud tops above the -20°C level. Lightning occurred only when tops extended

  8. Earth sciences, GIS and geomatics for natural hazards assessment and risks mitigation: a civil protection perspective

    NASA Astrophysics Data System (ADS)

    Perotti, Luigi; Conte, Riccardo; Lanfranco, Massimo; Perrone, Gianluigi; Giardino, Marco; Ratto, Sara

    2010-05-01

    Geo-information and remote sensing are appropriate tools for enhancing strategies to increase awareness of natural hazards and risks and for supporting research and operational activities devoted to disaster reduction. Improved Earth Sciences knowledge coupled with advanced Geomatics technologies has been developed by the joint research group and applied by the ITHACA (Information Technology for Humanitarian Assistance, Cooperation and Action) centre, within its partnership with the UN World Food Programme (WFP), with the goal of reducing human, social, economic and environmental losses due to natural hazards and related disasters. By cooperating with local and regional authorities (Municipalities, Centro Funzionale of the Aosta Valley, Civil Protection Agency of Regione Piemonte), data on natural hazards and risks have been collected, compared to national and global data, then interpreted for helping communities and civil protection agencies of sensitive mountain regions to make strategic choices and decisions for better mitigation and adaptation measures. To enhance the application of GIS and remote-sensing technologies for geothematic mapping of geological and geomorphological risks of mountain territories of Europe and developing countries, research activities led to the collection and evaluation of data from scientific literature and historical technical archives, for the definition of predisposing/triggering factors and evolutionary processes of natural instability phenomena (landslides, floods, storms, …) and for the design and implementation of early-warning and early-impact systems. Geodatabases, remote sensing and mobile-GIS applications were developed to perform analysis of: 1) a large climate-related disaster (Hurricane Mitch, Central America), by the application of remote sensing techniques, either for early warning or mitigation measures at the national and international scale; 2) distribution of slope instabilities at the regional scale (Aosta Valley).

  9. The Brave New World of Real-time GPS for Hazards Mitigation

    NASA Astrophysics Data System (ADS)

    Melbourne, T. I.; Szeliga, W. M.; Santillan, V. M.; Scrivner, C. W.

    2015-12-01

    Over 600 continuously-operating, real-time telemetered GPS receivers operate throughout California, Oregon, Washington and Alaska. These receivers straddle active crustal faults, volcanoes and landslides, the magnitude-9 Cascadia and northeastern Alaskan subduction zones and their attendant tsunamigenic regions along the Pacific coast. Around the circum-Pacific, there are hundreds more and the number is growing steadily as real-time networks proliferate. Despite offering the potential for sub-cm positioning accuracy in real time, useful for a broad array of hazards mitigation, these GPS stations are only now being incorporated into routine seismic, tsunami, volcanic, landslide, space-weather, or meteorologic monitoring. We will discuss NASA's READI (Real-time Earthquake Analysis for DIsasters) initiative. This effort is focused on developing all aspects of real-time GPS for hazards mitigation, from establishing international data-sharing agreements to improving basic positioning algorithms. READI's long-term goal is to expand real-time GPS monitoring throughout the circum-Pacific as overseas data become freely available, so that it may be adopted by NOAA, USGS and other operational agencies responsible for natural hazards monitoring. Currently ~100 stations are being jointly processed by CWU and Scripps Inst. of Oceanography for algorithm comparison and downstream merging purposes. The resultant solution streams include point-position estimates in a global reference frame every second with centimeter accuracy, ionospheric total electron content and tropospheric zenith water content. These solutions are freely available to third-party agencies over several streaming protocols to enable their incorporation and use in hazards monitoring. This number will ramp up to ~400 stations over the next year. We will also discuss technical efforts underway to develop a variety of downstream applications of the real-time position streams, including the ability to broadcast

  10. The Puerto Rico Component of the National Tsunami Hazard and Mitigation Program (PR-NTHMP)

    NASA Astrophysics Data System (ADS)

    Vanacore, E. A.; Huerfano Moreno, V. A.; Lopez, A. M.

    2015-12-01

    The Caribbean region has a documented history of damaging tsunamis that have affected coastal areas. Of particular interest is the Puerto Rico - Virgin Islands (PRVI) region, where the proximity of the coast to prominent tectonic faults would result in near-field tsunamis. Tsunami hazard assessment, detection capabilities, warning, education and outreach efforts are common tools intended to reduce loss of life and property. It is for these reasons that the PRSN is participating in an effort with local and federal agencies to develop tsunami hazard risk reduction strategies under the NTHMP. This grant supports the TsunamiReady program, which is the base of the tsunami preparedness and mitigation in PR. In order to recognize threatened communities in PR as TsunamiReady by the US NWS, the PR Component of the NTHMP has identified and modeled sources for local, regional and tele-tsunamis, and the results of simulations have been used to develop tsunami response plans. The main goal of the PR-NTHMP is to strengthen resilient coastal communities that are prepared for tsunami hazards, and recognize PR as TsunamiReady. Evacuation maps were generated in three phases: first, hypothetical tsunami scenarios of potential underwater earthquakes were developed, and these scenarios were then modeled during the second phase. The third phase consisted of determining the worst-case scenario based on the Maximum of Maximums (MOM). Inundation and evacuation zones were drawn on GIS-referenced maps and aerial photographs. These products are being used by emergency managers to educate the public and develop mitigation strategies. Maps and related evacuation products, like evacuation times, can be accessed online via the PR Tsunami Decision Support Tool. Based on these evacuation maps, tsunami signs were installed, vulnerability profiles were created, communication systems to receive and disseminate tsunami messages were installed in each TWFP, and tsunami response plans were approved.

  11. Solutions Network Formulation Report. NASA's Potential Contributions using ASTER Data in Marine Hazard Mitigation

    NASA Technical Reports Server (NTRS)

    Fletcher, Rose

    2010-01-01

    The 28-foot storm surge from Hurricane Katrina pushed inland along bays and rivers for a distance of 12 miles in some areas, contributing to the damage or destruction of about half of the fleet of boats in coastal Mississippi. Most of those boats had sought refuge in back bays and along rivers. Some boats were spared damage because the owners chose their mooring site well. Gulf mariners need a spatial analysis tool that provides guidance on the safest places to anchor their boats during future hurricanes. This product would support NOAA's mission to minimize the effects of coastal hazards through awareness, education, and mitigation strategies and could be incorporated into the Coastal Risk Atlas decision support tool.

  12. Rockfall hazard assessment, risk quantification, and mitigation options for reef cove resort development, False Cape, Queensland, Australia

    NASA Astrophysics Data System (ADS)

    Schlotfeldt, P.

    2009-04-01

    GIS and 2-D rock fall simulations were used as the primary tools during a rock fall hazard assessment and analysis for a major resort and township development near Cairns, Queensland, Australia. The methods used included 1) development of a digital elevation model (DEM); 2) rock fall trajectory analyses to determine the end points of rockfalls and the distribution of kinetic energy for identified rock fall runout zones; and 3) event tree analyses based on a synthesis of all data in order to establish the zones with the highest risk of fatalities. This paper describes the methodology used and the results of this work. Recommendations to mitigate the hazard included exclusion zones with no construction, scaling (including trim blasting), and construction of berms and rockfall catch fences. Keywords: GIS, rockfall simulation, rockfall runout zones, mitigation options. INTRODUCTION: False Cape is located on the east side of the Trinity Inlet near Cairns (Figure 1). Construction is underway for a multi-million dollar development close to the beach front. The development will ultimately cover about 1.5 km of prime coastline. The granite slopes above the development are steep and are covered with a number of large, potentially unstable boulders. Sheet jointing is present in the in-situ bedrock, and this, combined with other tectonic joint sets, has provided a key mechanism for large slides down slope on exposed bedrock. With each rock fall (evidenced by boulders strewn in gullies, over the lower parts of the slope, and on the beach) the failure mechanism migrates upslope. In order for the developer to proceed with construction, the identified rock fall hazard needs to be mitigated. The methods used to study the hazard and the key findings are presented in this paper. Discussion of mitigation options is provided in the conclusion. KEY METHODS USED TO STUDY THE HAZARD: In summary, the methods used to study the hazard for the False Cape project include: 1. The
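
    The kinetic-energy distributions produced by trajectory modelling are what ultimately size barriers such as catch fences. As a hedged illustration only (the boulder size, rock density, fall height, and restitution value are assumptions, not figures from the False Cape study), the sketch below estimates the impact energy of a single boulder at first impact and after one bounce.

        # Rough kinetic-energy estimate for a falling boulder (illustrative only).
        G = 9.81  # m/s^2

        def impact_energy_kj(mass_kg, fall_height_m, restitution=0.35):
            """Kinetic energy at first impact and after one bounce (kJ)."""
            v_impact = (2.0 * G * fall_height_m) ** 0.5
            e_first = 0.5 * mass_kg * v_impact**2 / 1e3
            e_after_bounce = e_first * restitution**2   # velocity scaled by the restitution coefficient
            return e_first, e_after_bounce

        if __name__ == "__main__":
            # Assumed 2 m diameter granite boulder (~2700 kg/m^3) falling 30 m.
            volume = 4.0 / 3.0 * 3.1416 * 1.0**3
            mass = 2700.0 * volume
            first, bounced = impact_energy_kj(mass, 30.0)
            print(f"Mass ~{mass:.0f} kg; first impact ~{first:.0f} kJ; after one bounce ~{bounced:.0f} kJ")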

  13. Making the Handoff from Earthquake Hazard Assessments to Effective Mitigation Measures (Invited)

    NASA Astrophysics Data System (ADS)

    Applegate, D.

    2010-12-01

    This year has witnessed a barrage of large earthquakes worldwide with the resulting damages ranging from inconsequential to truly catastrophic. We cannot predict when earthquakes will strike, but we can build communities that are resilient to strong shaking as well as to secondary hazards such as landslides and liquefaction. The contrasting impacts of the magnitude-7 earthquake that struck Haiti in January and the magnitude-8.8 event that struck Chile in April underscore the difference that mitigation and preparedness can make. In both cases, millions of people were exposed to severe shaking, but deaths in Chile were measured in the hundreds rather than the hundreds of thousands that perished in Haiti. Numerous factors contributed to these disparate outcomes, but the most significant is the presence of strong building codes in Chile and their total absence in Haiti. The financial cost of the Chilean earthquake still represents an unacceptably high percentage of that nation’s gross domestic product, a reminder that life safety is the paramount, but not the only, goal of disaster risk reduction measures. For building codes to be effective, both in terms of lives saved and economic cost, they need to reflect the hazard as accurately as possible. As one of four federal agencies that make up the congressionally mandated National Earthquake Hazards Reduction Program (NEHRP), the U.S. Geological Survey (USGS) develops national seismic hazard maps that form the basis for seismic provisions in model building codes through the Federal Emergency Management Agency and private-sector practitioners. This cooperation is central to NEHRP, which both fosters earthquake research and establishes pathways to translate research results into implementation measures. That translation depends on the ability of hazard-focused scientists to interact and develop mutual trust with risk-focused engineers and planners. Strengthening that interaction is an opportunity for the next generation

  14. Probing Aircraft Flight Test Hazard Mitigation for the Alternative Fuel Effects on Contrails & Cruise Emissions (ACCESS) Research Team

    NASA Technical Reports Server (NTRS)

    Kelly, Michael J.

    2013-01-01

    The Alternative Fuel Effects on Contrails & Cruise Emissions (ACCESS) Project Integration Manager requested in July 2012 that the NASA Engineering and Safety Center (NESC) form a team to independently assess aircraft structural failure hazards associated with the ACCESS experiment and to identify potential flight test hazard mitigations to ensure flight safety. The ACCESS Project Integration Manager subsequently requested that the assessment scope be focused predominantly on structural failure risks to the aircraft empennage.

  15. Oklahoma experiences largest earthquake during ongoing regional wastewater injection hazard mitigation efforts

    NASA Astrophysics Data System (ADS)

    Yeck, W. L.; Hayes, G. P.; McNamara, D. E.; Rubinstein, J. L.; Barnhart, W. D.; Earle, P. S.; Benz, H. M.

    2017-01-01

    The 3 September 2016 Mw 5.8 Pawnee earthquake was the largest recorded earthquake in the state of Oklahoma. Seismic and geodetic observations of the Pawnee sequence, including precise hypocenter locations and moment tensor modeling, show that the Pawnee earthquake occurred on a previously unknown left-lateral strike-slip basement fault that intersects the mapped right-lateral Labette fault zone. The Pawnee earthquake is part of an unprecedented increase in the earthquake rate in Oklahoma that is largely considered the result of the deep injection of waste fluids from oil and gas production. If this is, indeed, the case for the M5.8 Pawnee earthquake, then this would be the largest event to have been induced by fluid injection. Since 2015, Oklahoma has undergone wide-scale mitigation efforts primarily aimed at reducing injection volumes. Thus far in 2016, the rate of M3 and greater earthquakes has decreased as compared to 2015, while the cumulative moment—or energy released from earthquakes—has increased. This highlights the difficulty in earthquake hazard mitigation efforts given the poorly understood long-term diffusive effects of wastewater injection and their connection to seismicity.
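
    The pattern of falling event counts alongside rising cumulative moment is easy to reproduce from a catalogue, because seismic moment grows by a factor of about 32 per magnitude unit. The sketch below converts moment magnitudes to seismic moment with the standard Hanks-Kanamori relation and sums them; the example magnitude lists are invented and are not the Oklahoma catalogue.

        # Cumulative seismic moment vs. event count (illustrative catalogue).
        # Hanks & Kanamori: M0 [N*m] = 10 ** (1.5 * Mw + 9.05)

        def seismic_moment(mw):
            return 10.0 ** (1.5 * mw + 9.05)

        def summarize(label, magnitudes, threshold=3.0):
            count = sum(1 for m in magnitudes if m >= threshold)
            total = sum(seismic_moment(m) for m in magnitudes)
            print(f"{label}: {count} events >= M{threshold}, cumulative moment {total:.2e} N*m")

        # Invented example: fewer events in the second year, but one much larger shock.
        summarize("year 1", [3.1, 3.4, 3.2, 3.8, 4.1, 3.5, 3.3])
        summarize("year 2", [3.2, 3.6, 5.8])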

  16. Evaluation Of Risk And Possible Mitigation Schemes For Previously Unidentified Hazards

    NASA Technical Reports Server (NTRS)

    Linzey, William; McCutchan, Micah; Traskos, Michael; Gilbrech, Richard; Cherney, Robert; Slenski, George; Thomas, Walter, III

    2006-01-01

    protection wire schemes, 145 tests were conducted using various fuel/ox wire alternatives (shielded and unshielded) and/or different combinations of polytetrafluoroethylene (PTFE), Mystik tape and convoluted wraps to prevent unwanted coil activation. Test results were evaluated along with other pertinent data and information to develop a mitigation strategy for an inadvertent RCS firing. The SSP evaluated civilian aircraft wiring failures to search for aging trends in assessing the wire-short hazard. Appendix 2 applies Weibull statistical methods to the same data with a similar purpose.

  17. Seismic Hazard and Risk Assessment in Multi-Hazard Prone Urban Areas: The Case Study of Cologne, Germany

    NASA Astrophysics Data System (ADS)

    Tyagunov, S.; Fleming, K.; Parolai, S.; Pittore, M.; Vorogushyn, S.; Wieland, M.; Zschau, J.

    2012-04-01

    Most hazard and risk assessment studies usually analyze and represent different kinds of hazards and risks separately, although risk assessment and mitigation programs in multi-hazard prone urban areas should take into consideration possible interactions of different hazards. This is particularly true for communities located in seismically active zones, where, on the one hand, earthquakes are capable of triggering other types of hazards, while, on the other hand, one should bear in mind that temporal coincidence or succession of different hazardous events may influence the vulnerability of the existing built environment and, correspondingly, the level of the total risk. Therefore, possible inter-dependencies and inter-influences of different hazards should be reflected properly in the hazard, vulnerability and risk analyses. This work presents some methodological aspects and preliminary results of a study being implemented within the framework of the MATRIX (New Multi-Hazard and Multi-Risk Assessment Methods for Europe) project. One of the test cases of the MATRIX project is the city of Cologne, which is one of the largest cities of Germany. The area of Cologne, being exposed to windstorm, flood and earthquake hazards, has already been considered in comparative risk assessments. However, possible interactions of these different hazards have been neglected. The present study is aimed at the further development of a holistic multi-risk assessment methodology, taking into consideration possible time coincidence and inter-influences of flooding and earthquakes in the area.

  18. Web-Based Geospatial Tools to Address Hazard Mitigation, Natural Resource Management, and Other Societal Issues

    USGS Publications Warehouse

    Hearn,, Paul P.

    2009-01-01

    Federal, State, and local government agencies in the United States face a broad range of issues on a daily basis. Among these are natural hazard mitigation, homeland security, emergency response, economic and community development, water supply, and health and safety services. The U.S. Geological Survey (USGS) helps decision makers address these issues by providing natural hazard assessments, information on energy, mineral, water and biological resources, maps, and other geospatial information. Increasingly, decision makers at all levels are challenged not by the lack of information, but by the absence of effective tools to synthesize the large volume of data available, and to utilize the data to frame policy options in a straightforward and understandable manner. While geographic information system (GIS) technology has been widely applied to this end, systems with the necessary analytical power have been usable only by trained operators. The USGS is addressing the need for more accessible, manageable data tools by developing a suite of Web-based geospatial applications that will incorporate USGS and cooperating partner data into the decision making process for a variety of critical issues. Examples of Web-based geospatial tools being used to address societal issues follow.

  19. The 3D Elevation Program—Landslide recognition, hazard assessment, and mitigation support

    USGS Publications Warehouse

    Lukas, Vicki; Carswell, Jr., William J.

    2017-01-27

    The U.S. Geological Survey (USGS) Landslide Hazards Program conducts landslide hazard assessments, pursues landslide investigations and forecasts, provides technical assistance to respond to landslide emergencies, and engages in outreach. All of these activities benefit from the availability of high-resolution, three-dimensional (3D) elevation information in the form of light detection and ranging (lidar) data and interferometric synthetic aperture radar (IfSAR) data. Research on landslide processes addresses critical questions of where and when landslides are likely to occur as well as their size, speed, and effects. This understanding informs the development of methods and tools for hazard assessment and situational awareness used to guide efforts to avoid or mitigate landslide impacts. Such research is essential for the USGS to provide improved information on landslide potential associated with severe storms, earthquakes, volcanic activity, coastal wave erosion, and wildfire burn areas. Decisionmakers in government and the private sector increasingly depend on information the USGS provides before, during, and following disasters so that communities can live, work, travel, and build safely. The USGS 3D Elevation Program (3DEP) provides the programmatic infrastructure to generate and supply lidar-derived superior terrain data to address landslide applications and a wide range of other urgent needs nationwide. By providing data to users, 3DEP reduces users’ costs and risks and allows them to concentrate on their mission objectives. 3DEP includes (1) data acquisition partnerships that leverage funding, (2) contracts with experienced private mapping firms, (3) technical expertise, lidar data standards, and specifications, and (4) most important, public access to high-quality 3D elevation data.

  20. Volcano Hazard Tracking and Disaster Risk Mitigation: A Detailed Gap Analysis from Data-Collection to User Implementation

    NASA Astrophysics Data System (ADS)

    Faied, D.; Sanchez, A.

    2009-04-01

    While numerous global initiatives exist to address the potential hazards posed by volcanic eruption events and assess impacts from a civil security viewpoint, there does not yet exist a single, unified, international system of early warning and hazard tracking for eruptions. Numerous gaps exist in the risk reduction cycle, from data collection, to data processing, and finally dissemination of salient information to relevant parties. As part of the 2008 International Space University's Space Studies Program, a detailed gap analysis of the state of volcano disaster risk reduction was undertaken, and this paper presents the principal results. This gap analysis considered current sensor technologies, data processing algorithms, and utilization of data products by various international organizations. Recommendations for strategies to minimize or eliminate certain gaps are also provided. In the effort to address the gaps, a framework evolved at system level. This framework, known as VIDA, is a tool to develop user requirements for civil security in hazardous contexts, and a candidate system concept for a detailed design phase. VIDA also offers substantial educational potential: the framework includes a centralized clearinghouse for volcanology data which could support education at a variety of levels. Basic geophysical data, satellite maps, and raw sensor data are combined and accessible in a way that allows the relationships between these data types to be explored and used in a training environment. Such a resource naturally lends itself to research efforts in the subject but also research in operational tools, system architecture, and human/machine interaction in civil protection or emergency scenarios.

  1. Field Guide for Testing Existing Photovoltaic Systems for Ground Faults and Installing Equipment to Mitigate Fire Hazards

    SciTech Connect

    Brooks, William; Basso, Thomas; Coddington, Michael

    2015-10-01

    Ground faults and arc faults are the two most common reasons for fires in photovoltaic (PV) arrays and methods exist that can mitigate the hazards. This report provides field procedures for testing PV arrays for ground faults, and for implementing high resolution ground fault and arc fault detectors in existing and new PV system designs.

  2. The Identification of Filters and Interdependencies for Effective Resource Allocation: Coupling the Mitigation of Natural Hazards to Economic Development.

    NASA Astrophysics Data System (ADS)

    Agar, S. M.; Kunreuther, H.

    2005-12-01

    Policy formulation for the mitigation and management of risks posed by natural hazards requires that governments confront difficult decisions for resource allocation and be able to justify their spending. Governments also need to recognize when spending offers little improvement and the circumstances in which relatively small amounts of spending can make substantial differences. Because natural hazards can have detrimental impacts on local and regional economies, patterns of economic development can also be affected by spending decisions for disaster mitigation. This paper argues that by mapping interdependencies among physical, social and economic factors, governments can improve resource allocation to mitigate the risks of natural hazards while improving economic development on local and regional scales. Case studies of natural hazards in Turkey have been used to explore specific "filters" that act to modify short- and long-term outcomes. Pre-event filters can prevent an event from becoming a natural disaster or change a routine event into a disaster. Post-event filters affect both short and long-term recovery and development. Some filters cannot be easily modified by spending (e.g., rural-urban migration) but others (e.g., land-use practices) provide realistic spending targets. Net social benefits derived from spending, however, will also depend on the ways by which filters are linked, or so-called "interdependencies". A single weak link in an interdependent system, such as a power grid, can trigger a cascade of failures. Similarly, weak links in social and commercial networks can send waves of disruption through communities. Conversely, by understanding the positive impacts of interdependencies, spending can be targeted to maximize net social benefits while mitigating risks and improving economic development. Detailed information on public spending was not available for this study but case studies illustrate how networks of interdependent filters can modify

  3. Challenges in understanding, modelling, and mitigating Lake Outburst Flood Hazard: experiences from Central Asia

    NASA Astrophysics Data System (ADS)

    Mergili, Martin; Schneider, Demian; Andres, Norina; Worni, Raphael; Gruber, Fabian; Schneider, Jean F.

    2010-05-01

    Lake Outburst Floods can evolve from complex process chains like avalanches of rock or ice that produce flood waves in a lake which may overtop and eventually breach glacial, morainic, landslide, or artificial dams. Rising lake levels can lead to progressive incision and destabilization of a dam, to enhanced ground water flow (piping), or even to hydrostatic failure of ice dams which can cause sudden outflow of accumulated water. These events often have a highly destructive potential because a large amount of water is released in a short time, with a high capacity to erode loose debris, leading to a powerful debris flow with a long travel distance. The best-known example of a lake outburst flood is the Vajont event (Northern Italy, 1963), where a landslide rushed into an artificial lake which spilled over and caused a flood leading to almost 2000 fatalities. Hazards from the failure of landslide dams are often (not always) fairly manageable: most breaches occur in the first few days or weeks after the landslide event, and the rapid construction of a spillway, though problematic, has solved some hazardous situations (e.g., the Hattian landslide in 2005 in Pakistan). Older dams, like Usoi dam (Lake Sarez) in Tajikistan, are usually fairly stable, though landslides into the lakes may create flood waves that overtop and eventually weaken the dams. The analysis and the mitigation of glacial lake outburst flood (GLOF) hazard remain a challenge. A number of GLOFs resulting in fatalities and severe damage have occurred during the previous decades, particularly in the Himalayas and in the mountains of Central Asia (Pamir, Tien Shan). The source area is usually far away from the area of impact and events occur at very long intervals or as singularities, so that the population at risk is usually not prepared. Even though potentially hazardous lakes can be identified relatively easily with remote sensing and field work, modeling and predicting of GLOFs (and also

  4. Environmental legislation as the legal framework for mitigating natural hazards in Spain

    NASA Astrophysics Data System (ADS)

    Garrido, Jesús; Arana, Estanislao; Jiménez Soto, Ignacio; Delgado, José

    2015-04-01

    In Spain, the socioeconomic losses due to natural hazards (floods, earthquakes or landslides) are considerable, and the indirect costs associated with them are rarely considered because they are very difficult to evaluate. The prevention of losses due to natural hazards is more economic and efficient through legislation and spatial planning rather than through structural measures, such as walls, anchorages or structural reinforcements. However, there is no Spanish natural hazards law, and national and regional sector legislation makes only sparse mention of natural hazards. After 1978, when the Spanish Constitution was enacted, the Autonomous Communities (Spanish regions) were able to legislate according to the different competences (urban planning, environment or civil protection), which were established in the Constitution. In the 1990s, the Civil Protection legislation (national law and regional civil protection tools) dealt specifically with natural hazards (floods, earthquakes and volcanoes), but this was before any soil, seismic or hydrological studies were recommended in the national sector legislation. On the other hand, some Autonomous Communities referred to natural hazards in the Environmental Impact Assessment legislation (EIA) and also in the spatial and urban planning legislation and tools. The National Land Act, enacted in 1998, established, for the first time, that those lands exposed to natural hazards should be classified as non-developable. The Spanish recast text of the Land Act, enacted by Royal Legislative Decree 2/2008, requires that a natural hazards map be included in the Environmental Sustainability Report (ESR), which is compulsory for all master plans, according to the provisions set out by Act 9/2006, known as Spanish Strategic Environmental Assessment (SEA). Consequently, the environmental legislation, after the aforementioned transposition of the SEA European Directive 2001/42/EC, is the legal framework to prevent losses due to natural hazards

  5. The Effective Organization and Use of Data in Bridging the Hazard Mitigation-Climate Change Adaptation Divide (Invited)

    NASA Astrophysics Data System (ADS)

    Smith, G. P.; Fox, J.; Shuford, S.

    2010-12-01

    The costs associated with managing natural hazards and disasters continue to rise in the US and elsewhere. Many climate change impacts are manifested in stronger or more frequent natural hazards such as floods, wildfire, hurricanes and typhoons, droughts, and heat waves. Despite this common problem, the climate change adaptation and hazards management communities have largely failed to acknowledge each other’s work in reducing hazard impacts. This is even reflected in the language that each community uses; for example, the hazards management community refers to hazard risk reduction as mitigation while the climate change community refers to it as adaptation. In order to help bridge this divide, we suggest each community utilize data in a more formally-organized and effective manner based on four principles: 1. The scale of the data must reflect the needs of the decision maker. In most cases, decision makers’ needs are most effectively met through the development of multiple alternatives that take into account a variety of possible impacts. 2. Investments intended to reduce vulnerability and increase resilience should be driven by the wise use of available data using a “risk-based” strategy. 3. Climate change adaptation and hazard mitigation strategies must be integrated with other value drivers when building resiliency. Development and use of data that underscore the concept of “no regrets” risk reduction can be used to accomplish this aim. 4. The use of common data is critical in building a bridge between the climate change adaptation and hazards management communities. We will explore how the creation of data repositories that collect, analyze, display and archive hazards and disaster data can help address the challenges posed by the current hazards management and climate change adaptation divide.

  6. New Multi-HAzard and MulTi-RIsk Assessment MethodS for Europe (MATRIX): A research program towards mitigating multiple hazards and risks in Europe

    NASA Astrophysics Data System (ADS)

    Fleming, K. M.; Zschau, J.; Gasparini, P.; Modaressi, H.; Matrix Consortium

    2011-12-01

    Scientists, engineers, civil protection and disaster managers typically treat natural hazards and risks individually. This leads to the situation where the frequent causal relationships between the different hazards and risks, e.g., earthquakes and volcanoes, or floods and landslides, are ignored. Such an oversight may potentially lead to inefficient mitigation planning. As part of their efforts to confront this issue, the European Union, under its FP7 program, is supporting the New Multi-HAzard and MulTi-RIsK Assessment MethodS for Europe or MATRIX project. The focus of MATRIX is on natural hazards, in particular earthquakes, landslides, volcanoes, wildfires, storms and fluvial and coastal flooding. MATRIX will endeavour to develop methods and tools to tackle multi-type natural hazards and risks within a common framework, focusing on methodologies that are suited to the European context. The work will involve an assessment of current single-type hazard and risk assessment methodologies, including a comparison and quantification of uncertainties and harmonization of single-type methods, examining the consequence of cascade effects within a multi-hazard environment, time-dependent vulnerability, decision making and support for multi-hazard mitigation and adaptation, and a series of test cases. Three test sites are being used to assess the methods developed within the project (Naples, Cologne, and the French West Indies), as well as a "virtual city" based on a comprehensive IT platform that will allow scenarios not represented by the test cases to be examined. In addition, a comprehensive dissemination program that will involve national platforms for disaster management, as well as various outreach activities, will be undertaken. The MATRIX consortium consists of ten research institutions (nine European and one Canadian), an end-user (i.e., one of the European national platforms for disaster reduction) and a partner from industry.

  7. Determination of Bedrock Variations and S-wave Velocity Structure in the NW part of Turkey for Earthquake Hazard Mitigation

    NASA Astrophysics Data System (ADS)

    Ozel, A. O.; Arslan, M. S.; Aksahin, B. B.; Genc, T.; Isseven, T.; Tuncer, M. K.

    2015-12-01

    The Tekirdag region (NW Turkey) is quite close to the North Anatolian Fault, which is capable of producing a large earthquake. Therefore, earthquake hazard mitigation studies are important for urban areas close to major faults. From this point of view, the integration of different geophysical methods plays an important role in the study of seismic hazard problems, including seismotectonic zoning. Moreover, geological mapping and determination of the subsurface structure, which are key to managing newly developed areas, converting current urban areas, or assessing urban geological hazards, can be performed by integrated geophysical methods. This study has been performed in the frame of a national project, complementary to the cooperative project between Turkey and Japan (JICA&JST) named "Earthquake and Tsunami Disaster Mitigation in the Marmara Region and Disaster Education". With this principal aim, the study is focused on Tekirdag and its surrounding region (NW Turkey), where some uncertainties in subsurface knowledge (maps of bedrock depth, thickness of Quaternary sediments, basin geometry and seismic velocity structure) need to be resolved. Several geophysical methods (microgravity, magnetic, and single-station and array microtremor measurements) are applied, and the results are evaluated to characterize lithological changes in the region. Array microtremor measurements with several radii are taken at 30 locations, 1D S-wave velocity structures are determined by inversion of surface-wave phase velocities, and the resulting 1D structures are verified by theoretical Rayleigh wave modelling. Following the array measurements, single-station microtremor measurements are made at 75 locations to determine the predominant frequency distribution. The predominant frequencies range from 0.5 Hz to 8 Hz in the study area. In addition, microgravity and magnetic measurements are performed on
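
    The predominant frequencies reported here map loosely onto sediment thickness through the quarter-wavelength relation f0 ≈ Vs / (4 h). The sketch below inverts that relation for depth; the shear-wave velocity used is an assumed representative value for soft sediments, not one taken from the survey.

        # Quarter-wavelength estimate of soft-sediment thickness from the
        # site predominant frequency: f0 ~ Vs / (4 h)  =>  h ~ Vs / (4 f0).
        def sediment_thickness_m(f0_hz, vs_m_s=300.0):
            """Assumes a single soft layer with average shear-wave velocity vs_m_s."""
            return vs_m_s / (4.0 * f0_hz)

        for f0 in (0.5, 2.0, 8.0):   # the range reported for the study area
            print(f"f0 = {f0:>3} Hz -> sediment thickness ~ {sediment_thickness_m(f0):6.1f} m")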

  8. A review of accidents, prevention and mitigation options related to hazardous gases

    SciTech Connect

    Fthenakis, V.M.

    1993-05-01

    Statistics on industrial accidents are incomplete due to the lack of specific criteria on what constitutes a release or accident. In the United States, most major industrial accidents were related to explosions and fires of flammable materials, not to releases of chemicals into the environment. An EPA study of 6,928 accidental releases of toxic chemicals revealed that accidents at stationary facilities accounted for 75% of the total number of releases, and transportation accidents for the other 25%. About 7% of all reported accidents (468 cases) resulted in 138 deaths and 4,717 injuries ranging from temporary respiratory problems to critical injuries. In-plant accidents accounted for 65% of the casualties. The most efficient strategy for reducing hazards is to choose technologies that do not require the use of large quantities of hazardous gases. For new technologies this approach can be implemented early in development, before large financial resources and efforts are committed to specific options. Once specific materials and options have been selected, strategies to prevent accident-initiating events need to be evaluated and implemented. The next step is to implement safety options that suppress a hazard when an accident-initiating event occurs. Releases can be prevented or reduced with fail-safe equipment and valves, adequate warning systems, and controls to reduce and interrupt gas leakage. If an accident occurs and safety systems fail to contain a hazardous gas release, engineering control systems will be relied on to reduce or minimize environmental releases. If a hazardous gas is released in spite of the previous strategies, preventing human exposure and its consequences forms the final defensive barrier. Medical facilities close by that can accommodate victims of the worst accident can reduce the consequences of personnel exposure to hazardous gases.

  9. Volcanic hazard in Mexico: a comprehensive on-line database for risk mitigation

    NASA Astrophysics Data System (ADS)

    Manea, Marina; Constantin Manea, Vlad; Capra, Lucia; Bonasia, Rosanna

    2013-04-01

    Researchers are currently working on several key aspects of the Mexican volcanoes, such as remote sensing, field data on old and recent volcaniclastic deposits, structural framework, monitoring (rainfall data and visual observation of lahars), and laboratory experiments (analogue models and numerical simulations - Fall3D, Titan2D). Each investigation is focused on specific processes, but it is fundamental to visualize the global status of a volcano in order to understand its behavior and to mitigate future hazards. The Mexican Volcanoes @nline represents a novel initiative aimed to collect, on a systematic basis, the complete set of data obtained so far on the volcanoes, and to continuously update the database with new data. All the information is compiled from published works and updated frequently. Maps, such as the geological maps of the Mexican volcanoes and the associated hazard zonation, as well as point data, such as stratigraphic sections, sedimentology and diagrams of rainfall intensities, are presented in Google Earth format in order to be easily accessed by the scientific community and the general public. An important section of this online database is the presentation of numerical simulation results for ash dispersion associated with the principal Mexican active volcanoes. Daily prediction of ash dispersion (based on real-time data from CENAPRED and the Mexican Meteorological Service), as well as large-scale high-resolution subduction simulations performed on HORUS (the Computational Geodynamics Laboratory's supercomputer), represent a central part of the Mexican Volcanoes @nline database. The Mexican Volcanoes @nline database is maintained by the Computational Geodynamics Laboratory and is based entirely on Open Source software. The website can be visited at: http://www.geociencias.unam.mx/mexican_volcanoes.

  10. Physical Prototype Development for the Real-Time Detection and Mitigation of Hazardous Releases into a Flow System

    NASA Astrophysics Data System (ADS)

    Rimer, Sara; Katopodes, Nikolaos

    2013-11-01

    The threat of accidental or deliberate releases of toxic chemicals into public spaces is a significant public safety concern. The real-time detection and mitigation of such hazardous contaminants has the potential to minimize harm and save lives. In this study, we demonstrate the feasibility of feedback control of a hazardous contaminant by means of a laboratory-scale physical prototype integrated with a previously developed robust predictive control numerical model. The physical prototype is designed to imitate a public space characterized by a long conduit with an ambient flow (e.g. an airport terminal). Unidirectional air flows through a 24-foot-long duct. A "contaminant" plume of propylene glycol smoke is released into the duct. Camera sensors are used to visually measure the concentration of the plume. A pneumatic system is utilized to localize the contaminant via air curtains and draw it out via vacuum nozzles. The control prescribed to the pneumatic system is based on the numerical model. NSF-CMMI 0856438.
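
    The abstract describes a closed loop in which sensed plume concentration drives the air-curtain and vacuum actuators. The toy Python sketch below illustrates that idea only; the proportional law, gains and decay model are assumptions standing in for the study's model-predictive controller.

    def control_step(measured_concentration, setpoint=0.0, gain=2.5, max_command=1.0):
        """Simple proportional controller: command the vacuum nozzles harder
        as the sensed plume concentration rises above the setpoint."""
        error = measured_concentration - setpoint
        command = gain * error
        return min(max(command, 0.0), max_command)   # saturate the actuator command

    # Toy closed-loop run: duct concentration decays in proportion to the vacuum
    # command (a crude stand-in for the real flow physics).
    conc = 1.0
    for step in range(10):
        u = control_step(conc)
        conc = max(conc - 0.15 * u, 0.0)
        print(f"step {step}: concentration={conc:.3f}, vacuum command={u:.2f}")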

  11. Mitigation of EMU Glove Cut Hazard by MMOD Impact Craters on Exposed ISS Handrails

    NASA Technical Reports Server (NTRS)

    Christiansen, Eric L.; Ryan, Shannon

    2009-01-01

    Recent cut damage to crewmember extravehicular mobility unit (EMU) gloves during extravehicular activity (EVA) onboard the International Space Station (ISS) has been found to result from contact with sharp edges or pinch points rather than from general wear or abrasion. One possible source of cut hazards is protruding sharp-edged crater lips from the impact of micrometeoroid and orbital debris (MMOD) particles on external metallic handrails along EVA translation paths. During hypervelocity impact of MMOD particles, an excavation flow develops behind the shock wave, resulting in the formation of crater lips that can protrude above the target surface. In this study, two methods were evaluated to limit EMU glove cut hazards due to MMOD impact craters. In the first phase, four flexible overwrap configurations were evaluated: a felt-reusable surface insulation (FRSI), polyurethane polyether foam with beta-cloth cover, double-layer polyurethane polyether foam with beta-cloth cover, and multi-layer beta-cloth with intermediate Dacron netting spacers. These overwraps are suitable for retrofitting ground equipment that has yet to be flown, and are not intended to protect the handrail from the impact of MMOD particles, but rather to act as a spacer between hazardous impact profiles and crewmember gloves. At the impact conditions considered, all four overwrap configurations evaluated were effective in limiting contact between EMU gloves and impact crater profiles. The multi-layer beta-cloth configuration was the most effective in reducing the height of potentially hazardous profiles in handrail-representative targets. In the second phase of the study, four material alternatives to current aluminum and stainless steel alloys were evaluated: a metal matrix composite, carbon fiber reinforced plastic (CFRP), fiberglass, and a fiber metal laminate. Alternative material handrails are intended to prevent the formation of hazardous damage profiles during MMOD impact and are suitable for flight

  12. Health hazards and mitigation of chronic poisoning from arsenic in drinking water: Taiwan experiences.

    PubMed

    Chen, Chien-Jen

    2014-01-01

    There are two endemic areas of long-term exposure to arsenic from drinking water in Taiwan. Residents in the southwestern and northeastern endemic areas started using high-arsenic artesian well water in the early 1910s and late 1940s, respectively. Public water supply system using surface water was implemented in southwestern and northeastern endemic areas in the 1970s and 1990s, respectively. Systemic health hazards of long-term exposure to arsenic in drinking water have been intensively investigated since the 1960s, especially after 1985 in Taiwan. Several diseases have been well documented to be associated with chronic arsenic poisoning from drinking water showing a dose-response relation. They include characteristic skin lesions like hyperpigmentation or depigmentation, hyperkeratosis in palms and soles, and Bowen disease, peripheral vascular disease (specifically blackfoot disease), ischemic heart disease, cerebral infarction, microvascular diseases, abnormal peripheral microcirculation, carotid atherosclerosis, QT prolongation and increased dispersion in electrocardiography, hypertension, goiter, diabetes mellitus, cataract (specifically posterior subcapsular lens opacity), pterygium, slow neural conduction, retarded neurobehavioral development, erectile dysfunction, and cancers of the skin, lung, urinary bladder, kidney, and liver. The method of choice to mitigate arsenic poisoning through drinking water is to use safe drinking water from uncontaminated sources.

  13. Catastrophic debris flows transformed from landslides in volcanic terrains : mobility, hazard assessment and mitigation strategies

    USGS Publications Warehouse

    Scott, Kevin M.; Macias, Jose Luis; Naranjo, Jose Antonio; Rodriguez, Sergio; McGeehin, John P.

    2001-01-01

    Communities in lowlands near volcanoes are vulnerable to significant volcanic flow hazards in addition to those associated directly with eruptions. The largest such risk is from debris flows beginning as volcanic landslides, with the potential to travel over 100 kilometers. Stratovolcanic edifices commonly are hydrothermal aquifers composed of unstable, altered rock forming steep slopes at high altitudes, and the terrain surrounding them is commonly mantled by readily mobilized, weathered airfall and ashflow deposits. We propose that volcano hazard assessments integrate the potential for unanticipated debris flows with, at active volcanoes, the greater but more predictable potential of magmatically triggered flows. This proposal reinforces the already powerful arguments for minimizing populations in potential flow pathways below both active and selected inactive volcanoes. It also addresses the potential for volcano flank collapse to occur with instability early in a magmatic episode, as well as the 'false-alarm problem'-the difficulty in evacuating the potential paths of these large mobile flows. Debris flows that transform from volcanic landslides, characterized by cohesive (muddy) deposits, create risk comparable to that of their syneruptive counterparts of snow and ice-melt origin, which yield noncohesive (granular) deposits, because: (1) Volcano collapses and the failures of airfall- and ashflow-mantled slopes commonly yield highly mobile debris flows as well as debris avalanches with limited runout potential. Runout potential of debris flows may increase several fold as their volumes enlarge beyond volcanoes through bulking (entrainment) of sediment. Through this mechanism, the runouts of even relatively small collapses at Cascade Range volcanoes, in the range of 0.1 to 0.2 cubic kilometers, can extend to populated lowlands. (2) Collapse is caused by a variety of triggers: tectonic and volcanic earthquakes, gravitational failure, hydrovolcanism, and

  14. Mitigating hazards to aircraft from drifting volcanic clouds by comparing and combining IR satellite data with forward transport models

    NASA Astrophysics Data System (ADS)

    Matiella Novak, M. Alexandra

    Volcanic ash clouds in the upper atmosphere (>10 km) present a significant hazard to the aviation community and in some cases cause near-disastrous situations for aircraft that inadvertently encounter them. The two most commonly used techniques for mitigating hazards to aircraft from drifting volcanic clouds are (1) using data from satellite observations and (2) forecasting dispersion and trajectories with numerical models. This dissertation aims to aid in the mitigation of this hazard by using Moderate Resolution Imaging Spectroradiometer (MODIS) and Advanced Very High Resolution Radiometer (AVHRR) infrared (IR) satellite data to quantitatively analyze and constrain the uncertainties in the PUFF volcanic ash transport model. Furthermore, this dissertation examines the viability of combining IR data with the PUFF model to increase the model's reliability. Comparing IR satellite data with forward transport models provides valuable information concerning the uncertainty and sensitivity of the transport models. In this analysis, factors controlling the cloud-shape evolution, such as the horizontal dispersion coefficient, the vertical distribution of particles, the height of the cloud, and the location of the cloud, were all updated based on observations from satellite data in an attempt to increase the reliability of the simulations. Comparing center-of-mass locations calculated from satellite data to HYSPLIT trajectory simulations provides insight into the vertical distribution of the cloud. A case study of the May 10, 2003 Anatahan Volcano eruption was undertaken to assess methods of calculating errors in PUFF simulations with respect to the transport and dispersion of the erupted cloud. An analysis of the factors controlling the cloud-shape evolution in the model was also completed and compared to the shape evolution of the cloud observed in the
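
    As a simple illustration of one satellite-derived quantity mentioned above, the sketch below computes a mass-weighted cloud center of mass from a gridded ash retrieval; the grid, units and synthetic "cloud" are assumptions for demonstration, not data from the dissertation.

    import numpy as np

    def cloud_center_of_mass(ash_mass, lats, lons):
        """Mass-weighted centroid of a gridded ash retrieval.
        ash_mass : 2-D array of column ash loading (e.g. g/m^2), shape (nlat, nlon)
        lats, lons : 1-D coordinate arrays for the grid rows and columns
        """
        lat_grid, lon_grid = np.meshgrid(lats, lons, indexing="ij")
        total = ash_mass.sum()
        if total <= 0:
            raise ValueError("no ash detected in the scene")
        lat_cm = (ash_mass * lat_grid).sum() / total
        lon_cm = (ash_mass * lon_grid).sum() / total
        return lat_cm, lon_cm

    # Illustrative use with a synthetic retrieval
    lats = np.linspace(15.0, 18.0, 61)
    lons = np.linspace(144.0, 148.0, 81)
    ash = np.zeros((lats.size, lons.size))
    ash[25:35, 40:60] = 1.0          # a rectangular "cloud" for demonstration
    print(cloud_center_of_mass(ash, lats, lons))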

  15. Geo hazard studies and their policy implications in Nicaragua

    NASA Astrophysics Data System (ADS)

    Strauch, W.

    2007-05-01

    Nicaragua, situated on the Central American subduction zone and in the path of tropical storms and hurricanes, is frequently the scene of natural disasters, which have multiplied the negative effects of a long-term socioeconomic crisis that currently leaves Nicaragua as the second-poorest country of the Americas. In recent years, multiple efforts have been undertaken to prevent or mitigate the effects of these natural phenomena on the country. National and local authorities have become more involved in disaster prevention policy, and international cooperation has boosted funding for disaster prevention and mitigation measures in the country. The National Geosciences Institution (INETER), in cooperation with foreign partners, developed a national monitoring and early warning system for geological and hydro-meteorological phenomena. Geological and risk mapping projects were conducted by INETER and international partners. Universities, NGOs, international technical assistance, and foreign scientific groups cooperated to train Nicaraguan geoscientists and to improve higher education on disaster prevention up to the master's degree level. Funded by a World Bank loan, coordinated by the National System for Disaster Prevention, Mitigation and Attention (SINAPRED), and scientifically supervised by INETER, multidisciplinary hazard and vulnerability studies were carried out between 2003 and 2005 with emphasis on seismic hazard. These GIS-based studies provided proposals for land use policies at the local level in 30 municipalities, and seismic vulnerability and risk information for each single building in Managua, the capital of Nicaragua. Another large multidisciplinary project produced high-resolution air photos, 1:50,000 vectorized topographic maps, and a digital elevation model for western Nicaragua. These data, integrated in GIS, were used to assess: 1) Seismic hazard for metropolitan Managua; 2) Tsunami hazard for the Pacific coast; 3) Volcano hazard for Telica

  16. A Possible Paradigm for the Mitigation of the Adverse Impacts of Natural Hazards in the Developing Countries

    NASA Astrophysics Data System (ADS)

    Aswathanarayana, U.

    2001-05-01

    The proneness of a country or region to a given natural hazard depends upon its geographical location, physiography, geological and structural setting, land use/land cover situation, and biophysical and socioeconomic environments (e.g., cyclones and floods in Bangladesh, earthquakes in Turkey, drought in Sub-Saharan Africa). While the natural hazards themselves cannot be prevented, it is possible to mitigate their adverse effects through a knowledge-based, environmentally sustainable approach involving the stakeholder communities: (i) by being prepared, on the basis of an understanding of the land conditions that are prone to a given hazard and the processes that could culminate in damage to life and property (e.g., planting dense-rooted vegetation belts to protect against landslides in earthquake-prone areas); (ii) by avoiding improper anthropogenic activities that may exacerbate a hazard (e.g., deforestation accentuating floods and droughts); and (iii) by putting a hazard to beneficial use where possible (e.g., using flood waters for groundwater recharge). Mitigation strategies need to be custom-made for each country or region by integrating the biophysical and socioeconomic components. The proposed paradigm is illustrated for Extreme Weather Events (EWEs) and is based on the adoption of three approaches: (i) a typology approach, involving the interpretation of remotely sensed data to predict (say) the temporal and spatial distribution of precipitation; (ii) a "black box" approach, whereby the potential environmental consequences of an EWE are projected on the basis of previously known case histories; and (iii) an information technology approach, to translate advanced technical information into "virtual" do-it-yourself steps understandable to the lay public.

  17. Disruption mitigation studies in DIII-D

    SciTech Connect

    Taylor, P.L.; Kellman, A.G.; Evans, T.E.

    1999-01-01

    Data on the discharge behavior, thermal loads, halo currents, and runaway electrons have been obtained in disruptions on the DIII-D tokamak. These experiments have also evaluated techniques to mitigate the disruptions while minimizing runaway electron production. Experiments injecting cryogenic impurity killer pellets of neon and argon and massive amounts of helium gas have successfully reduced these disruption effects. The halo current generation, scaling, and mitigation are understood and are in good agreement with predictions of a semianalytic model. Results from killer pellet injection have been used to benchmark theoretical models of the pellet ablation and energy loss. Runaway electrons are often generated by the pellets and new runaway generation mechanisms, modifications of the standard Dreicer process, have been found to explain the runaways. Experiments with the massive helium gas puff have also effectively mitigated disruptions without the formation of runaway electrons that can occur with killer pellets.

  18. Bike Helmets and Black Riders: Experiential Approaches to Helping Students Understand Natural Hazard Assessment and Mitigation Issues

    NASA Astrophysics Data System (ADS)

    Stein, S. A.; Kley, J.; Hindle, D.; Friedrich, A. M.

    2014-12-01

    Defending society against natural hazards is a high-stakes game of chance against nature, involving tough decisions. How should a developing nation allocate its budget between building schools for towns without them or making existing schools earthquake-resistant? Does it make more sense to build levees to protect against floods, or to prevent development in the areas at risk? Would more lives be saved by making hospitals earthquake-resistant, or by using the funds for patient care? These topics are challenging because they are far from normal experience, in that they involve rare events and large sums. To help students in natural hazard classes conceptualize them, we pose tough and thought-provoking questions about the complex issues involved and explore them together via lectures, videos, field trips, and in-class and homework questions. We discuss analogous examples from the students' experiences, drawing on a new book, "Playing Against Nature: Integrating Science and Economics to Mitigate Natural Hazards in an Uncertain World". Asking whether they wear bicycle helmets and why or why not shows the cultural perception of risk. Individual students' responses vary, and the overall results vary dramatically between the US, UK, and Germany. Challenges in hazard assessment in an uncertain world are illustrated by asking German students whether they buy a ticket on public transportation - accepting a known cost - or "ride black" - not paying but risking a heavy fine if caught. We explore the challenge of balancing mitigation costs and benefits via the question "If you were a student in Los Angeles, how much more would you pay in rent each month to live in an earthquake-safe building?" Students learn that interdisciplinary thinking is needed, and that due to both uncertainties and sociocultural factors, no unique or right strategies exist for a particular community, much less for all communities. However, we can seek robust policies that give sensible results given

  19. Looking Before We Leap: Recent Results From An Ongoing Quantitative Investigation Of Asteroid And Comet Impact Hazard Mitigation.

    NASA Astrophysics Data System (ADS)

    Plesko, Catherine; Weaver, R. P.; Korycansky, D. G.; Huebner, W. F.

    2010-10-01

    The asteroid and comet impact hazard is now part of public consciousness, as demonstrated by movies, Super Bowl commercials, and popular news stories. However, there is a popular misconception that hazard mitigation is a solved problem. Many people think, "we'll just nuke it." There are, however, significant scientific questions remaining in the hazard mitigation problem. Before we can say with certainty that an explosive yield Y at height of burst h will produce a momentum change in, or dispersion of, a potentially hazardous object (PHO), we need to quantify how and where energy is deposited into the rubble pile or conglomerate that may make up the PHO. We then need to understand how shock waves propagate through the system, what causes them to disrupt, and how long gravitationally bound fragments take to recombine. Here we present numerical models of energy deposition from an energy source into various materials that are known PHO constituents, and rigid-body dynamics models of the recombination of disrupted objects. In the energy deposition models, we explore the effects of porosity and standoff distance as well as that of composition. In the dynamical models, we explore the effects of fragment size and velocity distributions on the time it takes for gravitationally bound fragments to recombine. Initial models indicate that this recombination time is relatively short, as little as 24 hours for a 1-km-sized PHO composed of 1000 meter-scale self-gravitating fragments with an initial velocity field of v/r = 0.001 1/s.
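
    A back-of-the-envelope check (not the authors' simulations) helps make the quoted numbers concrete: for a 1 km rubble pile with an assumed bulk density of 2000 kg/m^3, the surface escape speed is close to the fragment speeds implied by v/r = 0.001 1/s, so the fragments remain gravitationally bound and can plausibly re-accumulate within about a day.

    import math

    # Illustrative assumptions: uniform sphere, bulk density 2000 kg/m^3.
    G = 6.674e-11                 # gravitational constant, m^3 kg^-1 s^-2
    radius = 500.0                # 1 km diameter PHO -> 500 m radius
    density = 2000.0              # assumed rubble-pile bulk density, kg/m^3

    mass = (4.0 / 3.0) * math.pi * radius**3 * density
    v_escape = math.sqrt(2.0 * G * mass / radius)

    # Fragment speed at the surface for the quoted velocity field v/r = 0.001 1/s
    v_fragment = 0.001 * radius

    print(f"escape velocity   ~ {v_escape:.2f} m/s")
    print(f"fragment velocity ~ {v_fragment:.2f} m/s")
    # Fragment speeds slightly below escape velocity are consistent with bound
    # fragments re-accumulating on a timescale of order a day.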

  20. Exploratory Studies Facility Subsurface Fire Hazards Analysis

    SciTech Connect

    Richard C. Logan

    2002-03-28

    The primary objective of this Fire Hazard Analysis (FHA) is to confirm that the requirements for a comprehensive fire and related hazards protection program for the Exploratory Studies Facility (ESF) are sufficient to minimize the potential for: The occurrence of a fire or related event; A fire that causes an unacceptable on-site or off-site release of hazardous or radiological material that will threaten the health and safety of employees, the public or the environment; Vital U.S. Department of Energy (DOE) programs suffering unacceptable interruptions as a result of fire and related hazards; Property losses from a fire and related events exceeding limits established by DOE; and Critical process controls and safety class systems being damaged as a result of a fire and related events.

  1. Exploratory Studies Facility Subsurface Fire Hazards Analysis

    SciTech Connect

    J. L. Kubicek

    2001-09-07

    The primary objective of this Fire Hazard Analysis (FHA) is to confirm that the requirements for a comprehensive fire and related hazards protection program for the Exploratory Studies Facility (ESF) are sufficient to minimize the potential for: (1) The occurrence of a fire or related event. (2) A fire that causes an unacceptable on-site or off-site release of hazardous or radiological material that will threaten the health and safety of employees, the public or the environment. (3) Vital U.S. Department of Energy (DOE) programs suffering unacceptable interruptions as a result of fire and related hazards. (4) Property losses from a fire and related events exceeding limits established by DOE. (5) Critical process controls and safety class systems being damaged as a result of a fire and related events.

  2. Advances in Remote Sensing Approaches for Hazard Mitigation and Natural Resource Protection in Pacific Latin America: A Workshop for Advanced Graduate Students, Post- Doctoral Researchers, and Junior Faculty

    NASA Astrophysics Data System (ADS)

    Gierke, J. S.; Rose, W. I.; Waite, G. P.; Palma, J. L.; Gross, E. L.

    2008-12-01

    Though much of the developing world has the potential to gain significantly from remote sensing techniques in terms of public health and safety, these countries often lack the resources to advance the development and practice of remote sensing. All countries share a mutual interest in furthering remote sensing capabilities for natural hazard mitigation and resource development. With National Science Foundation support from the Partnerships in International Research and Education program, we are developing a new educational system of applied research and engineering for advancing collaborative linkages among agencies and institutions in Pacific Latin American countries (to date: Guatemala, El Salvador, Nicaragua, Costa Rica, Panama, and Ecuador) in the development of remote sensing tools for hazard mitigation and water resources management. The project aims to prepare students for careers in science and engineering through their efforts to solve suites of problems needing creative solutions: collaboration with foreign agencies; living abroad immersed in different cultures; and adapting their academic training to contend with potentially difficult field conditions and limited resources. The ultimate goal of integrating research with education is to encourage cross-disciplinary, creative, and critical thinking in problem solving and to foster the ability to deal with uncertainty in analyzing problems and designing appropriate solutions. In addition to traditional approaches for graduate and undergraduate research, we have built new educational systems of applied research and engineering: (1) the Peace Corps/Master's International program in Natural Hazards, which features a 2-year field assignment during service in the U.S. Peace Corps, (2) the Michigan Tech Enterprise program for undergraduates, which gives teams of students from different disciplines the opportunity to work for three years in a business-like setting to solve real-world problems, and (3) a unique university exchange

  3. Disruption mitigation studies in DIII-D

    SciTech Connect

    Taylor, P.L.; Kellman, A.G.; Evans, T.E.; Gray, D.S.; Humphreys, D.A.; Hyatt, A.W.; Jernigan, T.C.; Lee, R.L.; Leuer, J.A.; Luckhardt, S.C.; Parks, P.B.; Schaffer, M.J.; Whyte, D.G.; Zhang, J.

    1999-05-01

    Data on the discharge behavior, thermal loads, halo currents, and runaway electrons have been obtained in disruptions on the DIII-D tokamak [J. L. Luxon and L. G. Davis, Fusion Technol. 8, 2A 441 (1985)]. These experiments have also evaluated techniques to mitigate the disruptions while minimizing runaway electron production. Experiments injecting cryogenic impurity "killer" pellets of neon and argon and massive amounts of helium gas have successfully reduced these disruption effects. The halo current generation, scaling, and mitigation are understood and are in good agreement with predictions of a semianalytic model. Results from "killer" pellet injection have been used to benchmark theoretical models of the pellet ablation and energy loss. Runaway electrons are often generated by the pellets and new runaway generation mechanisms, modifications of the standard Dreicer process, have been found to explain the runaways. Experiments with the massive helium gas puff have also effectively mitigated disruptions without the formation of runaway electrons that can occur with "killer" pellets. © 1999 American Institute of Physics.

  4. Remote Sensing for Hazard Mitigation and Resource Protection in Pacific Latin America: New NSF sponsored initiative at Michigan Tech.

    NASA Astrophysics Data System (ADS)

    Rose, W. I.; Bluth, G. J.; Gierke, J. S.; Gross, E.

    2005-12-01

    Though much of the developing world has the potential to gain significantly from remote sensing techniques in terms of public health and safety and, eventually, economic development, these countries lack the resources required to advance the development and practice of remote sensing. Both developed and developing countries share a mutual interest in furthering remote sensing capabilities for natural hazard mitigation and resource development, and this common commitment creates a solid foundation upon which to build an integrated education and research project. This will prepare students for careers in science and engineering through their efforts to solve a suite of problems needing creative solutions: collaboration with foreign agencies; living abroad immersed in different cultures; and adapting their academic training to contend with potentially difficult field conditions and limited resources. This project makes two important advances: (1) We intend to develop the first formal linkage among geoscience agencies from four Pacific Latin American countries (Guatemala, El Salvador, Nicaragua and Ecuador), focusing on the collaborative development of remote sensing tools for hazard mitigation and water resource development; (2) We will build a new educational system of applied research and engineering, using two existing educational programs at Michigan Tech: a new Peace Corps/Master's International (PC/MI) program in Natural Hazards, which features a 2-year field assignment, and an "Enterprise" program for undergraduates, which gives teams of geoengineering students the opportunity to work for three years in a business-like setting to solve real-world problems. This project will involve 1-2 post-doctoral researchers, 3 Ph.D. students, 9 PC/MI students, and roughly 20 undergraduate students each year.

  5. The respiratory health hazards of volcanic ash: a review for volcanic risk mitigation

    NASA Astrophysics Data System (ADS)

    Horwell, Claire J.; Baxter, Peter J.

    2006-07-01

    Studies of the respiratory health effects of different types of volcanic ash have been undertaken only in the last 40 years, and mostly since the eruption of Mt. St. Helens in 1980. This review of all published clinical, epidemiological and toxicological studies, and other work known to the authors up to and including 2005, highlights the sparseness of studies on acute health effects after eruptions and the complexity of evaluating the long-term health risk (silicosis, non-specific pneumoconiosis and chronic obstructive pulmonary disease) in populations from prolonged exposure to ash due to persistent eruptive activity. The acute and chronic health effects of volcanic ash depend upon particle size (particularly the proportion of respirable-sized material), mineralogical composition (including the crystalline silica content) and the physico-chemical properties of the surfaces of the ash particles, all of which vary between volcanoes and even eruptions of the same volcano, but adequate information on these key characteristics is not reported for most eruptions. The incidence of acute respiratory symptoms (e.g. asthma, bronchitis) varies greatly after ashfalls, from very few, if any, reported cases to population outbreaks of asthma. The studies are inadequate for excluding increases in acute respiratory mortality after eruptions. Individuals with pre-existing lung disease, including asthma, can be at increased risk of their symptoms being exacerbated after falls of fine ash. A comprehensive risk assessment, including toxicological studies, to determine the long-term risk of silicosis from chronic exposure to volcanic ash, has been undertaken only in the eruptions of Mt. St. Helens (1980), USA, and Soufrière Hills, Montserrat (1995 onwards). In the Soufrière Hills eruption, a long-term silicosis hazard has been identified and sufficient exposure and toxicological information obtained to make a probabilistic risk assessment for the development of silicosis in outdoor

  6. Piloted Simulation to Evaluate the Utility of a Real Time Envelope Protection System for Mitigating In-Flight Icing Hazards

    NASA Technical Reports Server (NTRS)

    Ranaudo, Richard J.; Martos, Borja; Norton, Bill W.; Gingras, David R.; Barnhart, Billy P.; Ratvasky, Thomas P.; Morelli, Eugene

    2011-01-01

    The utility of the Icing Contamination Envelope Protection (ICEPro) system for mitigating a potentially hazardous icing condition was evaluated by 29 pilots using the NASA Ice Contamination Effects Flight Training Device (ICEFTD). ICEPro provides real-time envelope protection cues and alerting messages on pilot displays. The pilots participating in this test were divided into two groups: a control group using baseline displays without ICEPro, and an experimental group using ICEPro-driven display cueing. Each group flew identical precision approach and missed approach procedures with a simulated failure-case icing condition. Pilot performance, workload, and survey questionnaires were collected for both groups of pilots. Results showed that real-time assessment cues were effective in reducing the number of potentially hazardous upset events and in lessening exposure to loss of control following an incipient upset condition. Pilot workload with the added ICEPro displays was not measurably affected, but pilot opinion surveys showed that real-time cueing greatly improved their situation awareness of a hazardous aircraft state.

  7. Probabilistic Tsunami Hazard Assessment: the Seaside, Oregon Pilot Study

    NASA Astrophysics Data System (ADS)

    Gonzalez, F. I.; Geist, E. L.; Synolakis, C.; Titov, V. V.

    2004-12-01

    A pilot study of Seaside, Oregon is underway, to develop methodologies for probabilistic tsunami hazard assessments that can be incorporated into Flood Insurance Rate Maps (FIRMs) developed by FEMA's National Flood Insurance Program (NFIP). Current NFIP guidelines for tsunami hazard assessment rely on the science, technology and methodologies developed in the 1970s; although generally regarded as groundbreaking and state-of-the-art for its time, this approach is now superseded by modern methods that reflect substantial advances in tsunami research achieved in the last two decades. In particular, post-1990 technical advances include: improvements in tsunami source specification; improved tsunami inundation models; better computational grids by virtue of improved bathymetric and topographic databases; a larger database of long-term paleoseismic and paleotsunami records and short-term, historical earthquake and tsunami records that can be exploited to develop improved probabilistic methodologies; better understanding of earthquake recurrence and probability models. The NOAA-led U.S. National Tsunami Hazard Mitigation Program (NTHMP), in partnership with FEMA, USGS, NSF and Emergency Management and Geotechnical agencies of the five Pacific States, incorporates these advances into site-specific tsunami hazard assessments for coastal communities in Alaska, California, Hawaii, Oregon and Washington. NTHMP hazard assessment efforts currently focus on developing deterministic, "credible worst-case" scenarios that provide valuable guidance for hazard mitigation and emergency management. The NFIP focus, on the other hand, is on actuarial needs that require probabilistic hazard assessments such as those that characterize 100- and 500-year flooding events. There are clearly overlaps in NFIP and NTHMP objectives. NTHMP worst-case scenario assessments that include an estimated probability of occurrence could benefit the NFIP; NFIP probabilistic assessments of 100- and 500-yr
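
    For readers unfamiliar with the actuarial framing, the short sketch below (illustrative, not taken from the pilot study) converts the 100- and 500-year return periods mentioned above into the probability of at least one exceedance during a typical exposure window, assuming a stationary Poisson occurrence model.

    import math

    def prob_at_least_one(return_period_yr, exposure_yr):
        """Probability of at least one event during an exposure window,
        assuming a stationary Poisson process with rate 1/return_period."""
        rate = 1.0 / return_period_yr
        return 1.0 - math.exp(-rate * exposure_yr)

    for rp in (100, 500):
        for window in (30, 50):
            p = prob_at_least_one(rp, window)
            print(f"{rp}-yr event, {window}-yr exposure: P(at least one) = {p:.2f}")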

  8. Methodology for the Conduct of a Seismic Risk Mitigation Study.

    DTIC Science & Technology

    1985-06-01

    [Only table-of-contents fragments of this report were captured. Recoverable headings include: The Base; The Study Group; Determination of Mission Essential Facilities; Appendix C: Navy Earthquake Advisory Group Models (A. Seismic Risk Mitigation Model; B. Illustrative Example of the Basic Model). A surviving text fragment reads: "...study group formed to conduct the Seismic Risk Mitigation Study. The first phase of the study is determining the missions of the base and determining..."]

  9. The Relation of Hazard Awareness to Adoption of Approved Mitigation Measures.

    ERIC Educational Resources Information Center

    Saarinen, Thomas F.

    The relationship between an individual's or community's awareness of natural hazards and subsequent behavior change is examined in this review of research. The document is presented in seven sections. Following Section I, the introduction, Section II discusses the role of experience in behavior change. Section III examines the role of education…

  10. Studies Update Vinyl Chloride Hazards.

    ERIC Educational Resources Information Center

    Rawls, Rebecca

    1980-01-01

    Extensive study affirms that vinyl chloride is a potent animal carcinogen. Epidemiological studies show elevated rates of human cancers in association with extended contact with the compound. (Author/RE)

  11. Debris flood hazard documentation and mitigation on the Tilcara alluvial fan (Quebrada de Humahuaca, Jujuy province, North-West Argentina)

    NASA Astrophysics Data System (ADS)

    Marcato, G.; Bossi, G.; Rivelli, F.; Borgatti, L.

    2012-06-01

    For some decades, mass wasting processes such as landslides and debris floods have been threatening villages and transportation routes in the Rio Grande Valley, known as the Quebrada de Humahuaca. One of the most significant examples is the urban area of Tilcara, built on a large alluvial fan. In recent years, debris flood phenomena have been triggered in the tributary valley of the Huasamayo Stream and have reached the alluvial fan on a decadal basis. In view of the proper development of the area, hazard and risk assessment together with risk mitigation strategies are of paramount importance. The need is urgent also because the Quebrada de Humahuaca was recently included in the UNESCO World Heritage List. Therefore, the growing tourism industry may lead to uncontrolled exploitation and urbanization of the valley, with a consequent increase in the vulnerability of the elements exposed to risk. In this context, structural and non-structural mitigation measures not only have to be based on an understanding of the natural processes, but also have to consider environmental and sociological factors that could hinder the effectiveness of the countermeasure works. The hydrogeological processes are described with reference to present-day hazard and risk conditions. Considering the socio-economic context, some possible interventions are outlined, which take into account budget constraints and local practices. One viable solution would be to build a protective dam upstream of the fan apex and an artificial channel in order to divert the floodwaters into a gully that would then convey water and sediments into the Rio Grande, some kilometers downstream of Tilcara. The proposed remedial measures should employ easily available and relatively cheap technologies and local workers, and incorporate low environmental and visual impact, in order to ensure both the future conservation of the site and its safe exploitation by inhabitants and tourists.

  12. Marine and Hydrokinetic Renewable Energy Devices, Potential Navigational Hazards and Mitigation Measures

    SciTech Connect

    Cool, Richard, M.; Hudon, Thomas, J.; Basco, David, R.; Rondorf, Neil, E.

    2009-12-01

    On April 15, 2008, the Department of Energy (DOE) issued a Funding Opportunity Announcement for Advanced Water Power Projects which included a Topic Area for Marine and Hydrokinetic Renewable Energy Market Acceleration Projects. Within this Topic Area, DOE identified potential navigational impacts of marine and hydrokinetic renewable energy technologies and measures to prevent adverse impacts on navigation as a sub-topic area. DOE defines marine and hydrokinetic technologies as those capable of utilizing one or more of the following resource categories for energy generation: ocean waves; tides or ocean currents; free flowing water in rivers or streams; and energy generation from the differentials in ocean temperature. PCCI was awarded Cooperative Agreement DE-FC36-08GO18177 from the DOE to identify the potential navigational impacts and mitigation measures for marine hydrokinetic technologies. A technical report addressing our findings is available on this Science and Technology Information site under the Product Title, "Marine and Hydrokinetic Renewable Energy Technologies: Potential Navigational Impacts and Mitigation Measures". This product is a brochure, primarily for project developers, that summarizes important issues in that more comprehensive report, identifies locations where that report can be downloaded, and identifies points of contact for more information.

  13. Societal transformation and adaptation necessary to manage dynamics in flood hazard and risk mitigation (TRANS-ADAPT)

    NASA Astrophysics Data System (ADS)

    Fuchs, Sven; Thaler, Thomas; Bonnefond, Mathieu; Clarke, Darren; Driessen, Peter; Hegger, Dries; Gatien-Tournat, Amandine; Gralepois, Mathilde; Fournier, Marie; Mees, Heleen; Murphy, Conor; Servain-Courant, Sylvie

    2015-04-01

    Facing the challenges of climate change, this project aims to analyse and evaluate the multiple uses of flood alleviation schemes with respect to social transformation in communities exposed to flood hazards in Europe. The overall goals are: (1) the identification of indicators and parameters necessary for strategies to increase societal resilience, (2) an analysis of the institutional settings needed for societal transformation, and (3) perspectives on the changing divisions of responsibilities between public and private actors necessary to arrive at more resilient societies. The proposal assesses societal transformation from the perspective of these changing divisions of responsibility. Each risk mitigation measure is built on a narrative of exchanges and relations between people and may therefore condition the outcomes: governance is enacted by people interacting and defining risk mitigation measures, so these measures, as well as climate change adaptation, are simultaneously both outcomes of, and productive of, public and private responsibilities. Building on current knowledge, this project will focus on different dimensions of adaptation and mitigation strategies based on social, economic and institutional incentives and settings, centring on the linkages between these different dimensions and complementing existing flood risk governance arrangements. The policy dimension of adaptation, predominantly decisions on the societally admissible level of vulnerability and risk, will be evaluated by a human-environment interaction approach using multiple methods and the assessment of the social capacities of stakeholders across scales. As such, the challenges of adaptation to flood risk will be tackled by converting scientific frameworks into practical assessment and policy advice. In addressing the relationship between these dimensions of adaptation on different temporal and spatial scales, this

  14. Detecting Slow Deformation Signals Preceding Dynamic Failure: A New Strategy For The Mitigation Of Natural Hazards (SAFER)

    NASA Astrophysics Data System (ADS)

    Vinciguerra, Sergio; Colombero, Chiara; Comina, Cesare; Ferrero, Anna Maria; Mandrone, Giuseppe; Umili, Gessica; Fiaschi, Andrea; Saccorotti, Gilberto

    2014-05-01

    simulated conditions of stress and fluid content will also be studied, and theoretical modelling will allow a full hazard assessment to be developed and new methodologies to be tested for a much wider range of applications within the EU.

  15. A fast global tsunami modeling suite as a trans-oceanic tsunami hazard prediction and mitigation tool

    NASA Astrophysics Data System (ADS)

    Mohammed, F.; Li, S.; Jalali Farahani, R.; Williams, C. R.; Astill, S.; Wilson, P. S.; B, S.; Lee, R.

    2014-12-01

    The past decade has witnessed two mega-tsunami events, the 2004 Indian Ocean and 2011 Japan tsunamis, and multiple major tsunami events: 2006 Java and Kuril Islands, 2007 Solomon Islands, 2009 Samoa and 2010 Chile, to name a few. These events generated both local and far-field tsunami inundations, with runup ranging from a few meters to around 40 m in the coastal impact regions. With a majority of the coastal population at risk, there is a need for a sophisticated outlook toward catastrophe risk estimation and a quick mitigation response, together with tools and information to aid advanced tsunami hazard prediction. There is an increased need for insurers, reinsurers and federal hazard management agencies to quantify coastal inundations and the vulnerability of coastal habitat to tsunami inundation. A novel tool is developed to model local and far-field tsunami generation, propagation and inundation to estimate tsunami hazards. The tool is a combination of the NOAA MOST propagation database and an efficient and fast GPU (Graphics Processing Unit)-based non-linear shallow water wave model solver. The tsunamigenic seismic sources are mapped onto the NOAA unit source distribution along subduction zones in the ocean basin. Slip models are defined for tsunamigenic seismic sources through a slip distribution on the unit sources while maintaining limits on fault areas. A GPU-based finite volume solver is used to simulate non-linear shallow water wave propagation, inundation and runup. Deformation on the unit sources provides initial conditions for modeling local impacts, while the wave history from the propagation database provides boundary conditions for far-field impacts. The modeling suite shows good agreement for basin-wide tsunami propagation, validating both local and far-field tsunami inundation results.
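
    To make the finite-volume approach concrete, here is a minimal 1-D shallow-water sketch in Python using Lax-Friedrichs fluxes. It is illustrative only: the actual solver is 2-D, GPU-accelerated and handles inundation and runup, and the domain, initial hump and parameters below are assumptions.

    import numpy as np

    g = 9.81
    nx, L = 400, 100e3                      # cells, domain length (m)
    dx = L / nx
    x = (np.arange(nx) + 0.5) * dx

    h = 1000.0 + 2.0 * np.exp(-((x - L / 2) / 5e3) ** 2)   # depth plus initial hump
    hu = np.zeros(nx)                                       # momentum h*u

    def flux(h, hu):
        u = hu / h
        return np.array([hu, hu * u + 0.5 * g * h * h])     # shallow-water fluxes

    t, t_end, cfl = 0.0, 300.0, 0.45
    while t < t_end:
        c = np.abs(hu / h) + np.sqrt(g * h)                  # wave speeds
        dt = min(cfl * dx / c.max(), t_end - t)              # CFL-limited time step
        U = np.array([h, hu])
        F = flux(h, hu)
        # Lax-Friedrichs numerical flux at interior cell interfaces
        F_half = 0.5 * (F[:, :-1] + F[:, 1:]) - 0.5 * dx / dt * (U[:, 1:] - U[:, :-1])
        U[:, 1:-1] -= dt / dx * (F_half[:, 1:] - F_half[:, :-1])
        h, hu = U                          # end cells held fixed (crude boundary)
        t += dt

    print("max free-surface elevation (m):", (h - 1000.0).max())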

  16. Field Guide for Testing Existing Photovoltaic Systems for Ground Faults and Installing Equipment to Mitigate Fire Hazards: November 2012 - October 2013

    SciTech Connect

    Brooks, William

    2015-02-01

    Ground faults and arc faults are the two most common reasons for fires in photovoltaic (PV) arrays and methods exist that can mitigate the hazards. This report provides field procedures for testing PV arrays for ground faults, and for implementing high resolution ground fault and arc fault detectors in existing and new PV system designs.

  17. 75 FR 29569 - Recovery Policy RP9526.1, Hazard Mitigation Funding Under Section 406 (Stafford Act)

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-05-26

    ... and property, protect the Federal investment in public infrastructure and ultimately help build... mitigation, the Public Assistance Division has adopted the Mitigation Directorate's Benefit Cost...

  18. Novel bio-inspired smart control for hazard mitigation of civil structures

    NASA Astrophysics Data System (ADS)

    Kim, Yeesock; Kim, Changwon; Langari, Reza

    2010-11-01

    In this paper, a new bio-inspired controller is proposed for vibration mitigation of smart structures subjected to ground disturbances (i.e. earthquakes). The control system is developed through the integration of a brain emotional learning (BEL) algorithm with a proportional-integral-derivative (PID) controller and a semiactive inversion (Inv) algorithm. The BEL algorithm is based on the neurologically inspired computational model of the amygdala and the orbitofrontal cortex. To demonstrate the effectiveness of the proposed hybrid BEL-PID-Inv control algorithm, a seismically excited building structure equipped with a magnetorheological (MR) damper is investigated. The performance of the proposed hybrid BEL-PID-Inv control algorithm is compared with that of passive, PID, linear quadratic Gaussian (LQG), and BEL control systems. In the simulation, the robustness of the hybrid BEL-PID-Inv control algorithm in the presence of modeling uncertainties as well as external disturbances is investigated. It is shown that the proposed hybrid BEL-PID-Inv control algorithm is effective in improving the dynamic responses of seismically excited building structure-MR damper systems.
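
    As context for the PID layer of the hybrid controller, a minimal discrete PID sketch in Python is given below; the gains, time step and the toy "structure" it drives are illustrative assumptions and do not represent the paper's BEL or inversion components.

    class PIDController:
        """Discrete PID controller of the kind used as one layer of the
        hybrid BEL-PID-Inv scheme (gains here are illustrative, not tuned)."""

        def __init__(self, kp, ki, kd, dt):
            self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
            self.integral = 0.0
            self.prev_error = 0.0

        def update(self, error):
            self.integral += error * self.dt
            derivative = (error - self.prev_error) / self.dt
            self.prev_error = error
            return self.kp * error + self.ki * self.integral + self.kd * derivative

    # Toy use: drive a measured inter-storey drift toward zero
    pid = PIDController(kp=50.0, ki=5.0, kd=10.0, dt=0.01)
    drift = 0.02                                   # metres, hypothetical response sample
    for _ in range(5):
        force_command = pid.update(0.0 - drift)    # setpoint is zero drift
        drift *= 0.8                               # stand-in for structure plus MR damper
        print(f"command={force_command:.1f}, drift={drift:.4f}")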

  19. Educational Approach to Seismic Risk Mitigation in Indian Himalayas -Hazard Map Making Workshops at High Schools-

    NASA Astrophysics Data System (ADS)

    Koketsu, K.; Oki, S.; Kimura, M.; Chadha, R. K.; Davuluri, S.

    2014-12-01

    How can we encourage people to take preventive measures against damage risks and empower them to take the right actions in emergencies to save their lives? The conventional approach taken by scientists had been disseminating intelligible information on up-to-date seismological knowledge. However, it has been proven that knowledge alone does not have enough impact to modify people's behaviors in emergencies (Oki and Nakayachi, 2012). On the other hand, the conventional approach taken by practitioners had been to conduct emergency drills at schools or workplaces. The loss of many lives from the 2011 Tohoku earthquake has proven that these emergency drills were not enough to save people's lives, unless they were empowered to assess the given situation on their own and react flexibly. Our challenge is to bridge the gap between knowledge and practice. With reference to best practices observed in Tohoku, such as The Miracles of Kamaishi, our endeavor is to design an effective Disaster Preparedness Education Program that is applicable to other disaster-prone regions in the world, even with different geological, socio-economical and cultural backgrounds. The key concepts for this new approach are 1) empowering individuals to take preventive actions to save their lives, 2) granting community-based understanding of disaster risks and 3) building a sense of reality and relevancy to disasters. With these in mind, we held workshops at some high schools in the Lesser Himalayan Region, combining lectures with an activity called "Hazard Map Making" where students proactively identify and assess the hazards around their living areas and learn practical strategies on how to manage risks. We observed the change of awareness of the students by conducting a preliminary questionnaire survey and interviews after each session. Results strongly implied that the significant change of students' attitudes towards disaster preparedness occurred not by the lectures of scientific knowledge, but

  20. Using Darwin's theory of atoll formation to improve tsunami hazard mitigation in the Pacific

    NASA Astrophysics Data System (ADS)

    Goff, J. R.; Terry, J. P.

    2012-12-01

    It is 130 years since Charles Darwin's death and 176 years since he penned his subsidence theory of atoll formation on 12th April 1836 during the voyage of the Beagle through the Pacific. This theory, founded on the premise of a subsiding volcano and the corresponding upward growth of coral reef, was astonishing for its time considering the absence of an underpinning awareness of plate tectonics. Furthermore, with the exception of the occasional permutation and opposing idea, his theory has endured and has an enviable longevity amongst paradigms in geomorphology. In his theory, Darwin emphasised the generally circular morphology of the atoll shape and, surprisingly, the validity of this simple morphological premise has never been questioned. There are, however, few atolls in the Pacific Ocean that attain such a simple morphology, with most manifesting one or more arcuate 'bight-like' structures (ABLSs). These departures from the circular form complicate his simplistic model and are indicative of geomorphological processes in the Pacific Ocean which cannot be ignored. ABLSs represent the surface morphological expression of major submarine failures of atoll volcanic foundations. Such failures can occur during any stage of atoll formation and are a valuable addition to Darwin's theory because they indicate the instability of the volcanic foundations. It is widely recognized in the research community that sector/flank collapses of island edifices are invariably tsunamigenic, and yet we have no clear understanding of how significant such events are in the tsunami hazard arena. The recognition of ABLSs, however, now offers scientists the opportunity to establish a first-order database of potential local and regional tsunamigenic sources associated with the sector/flank collapses of island edifices. We illustrate the talk with examples of arcuate 'bight-like' structures and associated tsunamis in atoll and atoll-like environments. The implications for our understanding of

  1. Hawaiian cultural influences on support for lava flow hazard mitigation measures during the January 1960 eruption of Kīlauea volcano, Kapoho, Hawai‘i

    USGS Publications Warehouse

    Gregg, Chris E.; Houghton, B.F.; Paton, Douglas; Swanson, D.A.; Lachman, R.; Bonk, W.J.

    2008-01-01

    On average, 72% of respondents favored the construction of earthen barriers to hold back or divert lava and protect Kapoho, but far fewer (14%) agreed with the military's use of bombs to protect Kapoho. In contrast, about one-third of respondents conditionally agreed with the use of bombs. It is suggested that local participation in the bombing strategy may explain the increased conditional acceptance of bombs as a mitigation tool, although this cannot be conclusively demonstrated. Belief in Pele and being of Hawaiian ethnicity did not reduce support for the use of barriers, but did reduce support for bombs in both bombing scenarios. The disparity in levels of acceptance of barriers versus bombing, and of one bombing strategy versus another, suggests that public attitudes toward lava flow hazard mitigation strategies were historically complex. A modern comparative study is needed before the next damaging eruption to inform debates and decisions about whether or not to interfere with the flow of lava. Recent changes in the current eruption of Kīlauea make this a timely topic.

  2. Volcanic sulfur dioxide index and volcanic explosivity index inferred from eruptive volume of volcanoes in Jeju Island, Korea: application to volcanic hazard mitigation

    NASA Astrophysics Data System (ADS)

    Ko, Bokyun; Yun, Sung-Hyo

    2016-04-01

    Jeju Island, located off the southwestern part of the Korean Peninsula, is a volcanic island composed of lava flows, pyroclastic deposits, and around 450 monogenetic volcanoes. The volcanic activity of the island commenced with phreatomagmatic eruptions under subaqueous conditions ca. 1.8-2.0 Ma and lasted until ca. 1,000 years BP. To evaluate the volcanic activity of the most recently erupted volcanoes with reported ages, the volcanic explosivity index (VEI) and volcanic sulfur dioxide index (VSI) of three volcanoes (Ilchulbong tuff cone, Songaksan tuff ring, and Biyangdo scoria cone) are inferred from their eruptive volumes. The quantity of eruptive materials, such as tuff, lava flow, scoria, and so on, is calculated using a model developed for the Auckland Volcanic Field, which has a volcanic setting similar to that of the island. The eruptive volumes are 11,911,534 m³, 24,987,557 m³, and 9,652,025 m³, which correspond to VEIs of 3, 3, and 2, respectively. According to the correlation between VEI and VSI, the average quantity of SO2 emitted during an eruption with a VEI of 3 is 2-8 × 10³ kilotons, considering that the island was formed in an intraplate tectonic setting. Jeju Island was regarded as an extinct volcano; however, several studies have recently reported volcanic eruption ages within 10,000 years BP owing to developments in age-dating techniques. Thus, the island is a dormant volcano with a high probability of erupting again in the future. Such volcanoes might have explosive eruptions (vulcanian to plinian), with the possibility that the emitted SO2 reaches the stratosphere and affects the climate by backscattering incoming solar radiation, increasing cloud reflectivity, etc. Consequently, the recommencement of volcanic eruptions on the island could result in serious volcanic hazards, and this study provides fundamental and important data for volcanic hazard mitigation in East Asia as well as on the island. ACKNOWLEDGMENTS: This research was supported by a grant [MPSS
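
    The volume-to-VEI step above follows the conventional order-of-magnitude tephra-volume thresholds, so the quoted assignments are easy to sanity-check. The short sketch below (illustrative, using the standard VEI bins rather than the paper's exact procedure) reproduces the VEI 3, 3 and 2 values.

    import math

    def vei_from_volume(volume_m3):
        """Approximate VEI from bulk eruptive volume using the conventional
        order-of-magnitude thresholds (10^4 m^3 -> VEI 1, 10^6 m^3 -> VEI 2, ...)."""
        if volume_m3 < 1e4:
            return 0
        if volume_m3 < 1e6:
            return 1
        # Each further VEI step spans one order of magnitude above 10^6 m^3
        return min(2 + int(math.log10(volume_m3 / 1e6)), 8)

    eruptions = {
        "Ilchulbong tuff cone": 11_911_534,
        "Songaksan tuff ring": 24_987_557,
        "Biyangdo scoria cone": 9_652_025,
    }
    for name, volume in eruptions.items():
        print(f"{name}: VEI {vei_from_volume(volume)}")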

  3. Scientific Animations for Tsunami Hazard Mitigation: The Pacific Tsunami Warning Center's YouTube Channel

    NASA Astrophysics Data System (ADS)

    Becker, N. C.; Wang, D.; Shiro, B.; Ward, B.

    2013-12-01

    Outreach and education save lives, and the Pacific Tsunami Warning Center (PTWC) has a new tool--a YouTube Channel--to advance its mission to protect lives and property from dangerous tsunamis. Such outreach and education is critical for coastal populations nearest an earthquake since they may not get an official warning before a tsunami reaches them and will need to know what to do when they feel strong shaking. Those who live far enough away to receive useful official warnings and react to them, however, can also benefit from PTWC's education and outreach efforts. They can better understand a tsunami warning message when they receive one, can better understand the danger facing them, and can better anticipate how events will unfold while the warning is in effect. The same holds true for emergency managers, who have the authority to evacuate the public they serve, and for the news media, critical partners in disseminating tsunami hazard information. PTWC's YouTube channel supplements its formal outreach and education efforts by making its computer animations available 24/7 to anyone with an Internet connection. Though the YouTube channel is only a month old (as of August 2013), it should rapidly develop a large global audience since similar videos on PTWC's Facebook page have reached over 70,000 viewers during organized media events, while PTWC's official web page has received tens of millions of hits during damaging tsunamis. These animations are not mere cartoons but use scientific data and calculations to render graphical depictions of real-world phenomena as accurately as possible. This practice holds true whether the animation is a simple comparison of historic earthquake magnitudes or a complex simulation cycling through thousands of high-resolution data grids to render tsunami waves propagating across an entire ocean basin. PTWC's animations fall into two broad categories. The first group illustrates concepts about seismology and how it is critical to
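
    The basin-scale propagation animations described above rest on the long-wave approximation, in which tsunami phase speed depends only on water depth. A small illustrative calculation follows; the depth and distance values are assumptions for illustration, not PTWC data or code.

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def long_wave_speed(depth_m: float) -> float:
    """Shallow-water (long-wave) phase speed, c = sqrt(g * h)."""
    return math.sqrt(G * depth_m)

# Illustrative, assumed values -- not taken from the PTWC animations.
mean_depth_m = 4000.0   # typical open-ocean depth
distance_km = 8000.0    # a rough trans-Pacific path length

c = long_wave_speed(mean_depth_m)                   # ~198 m/s (~713 km/h)
travel_time_h = distance_km * 1000.0 / c / 3600.0   # ~11 hours

print(f"speed ≈ {c:.0f} m/s ({c * 3.6:.0f} km/h), "
      f"basin-crossing time ≈ {travel_time_h:.1f} h")
```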

  4. Public Policy Issues Associated with Tsunami Hazard Mitigation, Response and Recovery: Transferable Lessons from Recent Global Disasters

    NASA Astrophysics Data System (ADS)

    Johnson, L.

    2014-12-01

    Since 2004, a sequence of devastating tsunamis has taken the lives of more than 300,000 people worldwide. The path of destruction left by each is typically measured in hundreds of meters to a few kilometers, and its breadth can extend for hundreds or even thousands of kilometers, crossing towns and countries and even traversing an entire oceanic basin. Tsunami disasters in Indonesia, Chile, Japan, and elsewhere have also shown that the almost binary nature of tsunami impacts can present some unique risk reduction, response, recovery, and rebuilding challenges, with transferable lessons for other tsunami-vulnerable coastal communities around the world. In particular, the trauma can motivate survivors to relocate homes, jobs, and even whole communities to safer ground, sometimes at tremendous social and financial cost. For governments, the level of concentrated devastation usually exceeds the local capacity to respond and thus requires complex inter-governmental arrangements with regional, national, and even international partners to support the recovery of impacted communities, infrastructure, and economies. Two parallel projects underway in California since 2011—the SAFRR (Science Application for Risk Reduction) tsunami scenario project and the California Tsunami Policy Working Group (CTPWG)—have worked to digest key lessons from recent tsunami disasters, with an emphasis on identifying gaps to be addressed in the current state and federal policy framework to enhance tsunami risk awareness, hazard mitigation, and response and recovery planning ahead of a disaster, and also to improve post-disaster implementation practices following a future California or U.S. tsunami event.

  5. Sea otter oil-spill mitigation study

    SciTech Connect

    Davis, R.W.; Thomas, J.; Williams, T.M.; Kastelein, R.; Cornell, L.

    1986-05-01

    The objective of the study was to analyze the effectiveness of existing capture, transport, cleaning, and rehabilitation methods and to develop new methods to reduce the impact of an accidental oil spill on California sea otters, whether resulting from present conditions or from future Outer Continental Shelf (OCS) oil and gas development in State or Federal waters. In addition, the study investigated whether or not a systematic difference in thermal conductivity existed between the pelts of Alaska and California sea otters; this was done to assure that conclusions drawn from the oiling experiments carried out at Hubbs Marine Research Institute would be applicable to California sea otters. Tetra Tech, Inc. contributed to the overall study by preparing a literature review and report on the fate and effects of oil dispersants and chemically dispersed oil.

  6. Explosives Hazard Reduction (EHR) Studies Joint Operations

    DTIC Science & Technology

    2010-07-01

    … Analysts, Inc. (ISA), Explosives Hazard Reduction (EHR) Program. EHR goals and objectives: identify / quantify explosives hazards; minimize risks; resolve long-…; … siting recommendations; produce DDESB-compliant explosives safety site plans. Proposed facilities (joint operations): … of barricades; automated wash rack; use of ISO trailers in the MSA.

  7. Probing Aircraft Flight Test Hazard Mitigation for the Alternative Fuel Effects on Contrails and Cruise Emissions (ACCESS) Research Team . Volume 2; Appendices

    NASA Technical Reports Server (NTRS)

    Kelly, Michael J.

    2013-01-01

    The Alternative Fuel Effects on Contrails and Cruise Emissions (ACCESS) Project Integration Manager requested in July 2012 that the NASA Engineering and Safety Center (NESC) form a team to independently assess aircraft structural failure hazards associated with the ACCESS experiment and to identify potential flight test hazard mitigations to ensure flight safety. The ACCESS Project Integration Manager subsequently requested that the assessment scope be focused predominantly on structural failure risks to the aircraft empennage (horizontal and vertical tail). This report contains the Appendices to Volume I.

  8. Multi-scale earthquake hazard and risk in the Chinese mainland and countermeasures for the preparedness, mitigation, and management: an overview

    NASA Astrophysics Data System (ADS)

    Wu, Z.; Jiang, C.; Ma, T.

    2012-12-01

    Earthquake hazard and risk in the Chinese mainland exhibit multi-scale characteristics. Temporal scales from centuries to months, spatial scales from the whole mainland to specific engineering structures, and energy scales from great disastrous earthquakes to small earthquakes causing social disturbance and economic loss all contribute to the complexity of earthquake disasters. To cope with this complex challenge, several research and application projects have been undertaken in recent years. Lessons and experiences from the 2008 Wenchuan earthquake contributed much to the launching and conduct of these projects. The understanding of the scientific problems and the technical approaches taken in mainstream studies in the Chinese mainland do not differ significantly from those in the international scientific community, although the use of some terminologies shows "cultural differences" - for instance, in the China Earthquake Administration (CEA), the terminology "earthquake forecast/prediction (study)" is generally used in a much broader sense, mainly indicating time-dependent seismic hazard at different spatio-temporal scales. Several scientific products have been produced to serve society in different forms. These scientific products have unique academic merits owing to their long-term persistence and forward-forecast nature, both of which are essential for evaluating technical performance and falsifying scientific ideas. On the other hand, in the language of actor network theory (ANT) in science studies (the sociology of science), the hierarchical "actors' network" that transforms the science into actions of the public and government for the preparedness, mitigation, and management of multi-scale earthquake disasters is still in need of careful construction and improvement.

  9. RAGE Hydrocode Modeling of Asteroid Mitigation: new simulations with parametric studies for uncertainty quantification

    NASA Astrophysics Data System (ADS)

    Weaver, R.; Plesko, C. S.; Gisler, G. R.

    2013-12-01

    We are performing detailed hydrodynamic simulations of the interaction of a strong explosion with sample asteroid objects. The purpose of these simulations is to apply modern hydrodynamic codes that have been well verified and validated (V&V) to the problem of mitigating the hazard from a potentially hazardous object (PHO), an asteroid or comet that is on an Earth-crossing orbit. The code we use for these simulations is the RAGE code from Los Alamos National Laboratory [1-6]. Initial runs were performed using a spherical object. Next we ran simulations using the shape of a known asteroid, 25143 Itokawa. This particular asteroid is not a PHO, but we use its shape to consider the influence of non-spherical objects. The initial work was performed using 2D cylindrically symmetric simulations and simple geometries. We then performed a major fully 3D simulation. For an Itokawa-sized object (~500 m) and explosion energies ranging from 0.5 to 1 megaton, the velocities imparted to all of the PHO "rocks" were in all cases many m/s. The calculated velocities were much larger than the escape velocity and would preclude re-assembly of the fragments. The dispersion of the asteroid remnants from a surface burst is strongly directional, with all fragments moving away from the point of the explosion. This detail can be used to time the intercept for maximum movement off the original orbit. Results from these previous studies will be summarized as background. In the new work presented here we show a variety of parametric studies around these initial simulations. We modified the explosion energy by +/- 20% and varied the internal composition from a few large "rocks" to several hundred smaller rocks. The results of these parametric studies will be presented. We have also extended our work [6],[7] to stand-off nuclear bursts and will present initial results for the energy deposition by a generic source into the non-uniform composition asteroid. The goal of this new work is to
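
    The conclusion that fragment velocities of "many m/s" preclude re-assembly can be sanity-checked against the escape velocity of an Itokawa-sized body. The sketch below uses an assumed size and bulk density for illustration; it is not derived from the RAGE simulations.

```python
import math

G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2

def escape_velocity(radius_m: float, bulk_density_kg_m3: float) -> float:
    """Escape velocity v = sqrt(2*G*M/R) for a homogeneous sphere."""
    mass = bulk_density_kg_m3 * (4.0 / 3.0) * math.pi * radius_m ** 3
    return math.sqrt(2.0 * G * mass / radius_m)

# Assumed, illustrative parameters for a ~500 m object with a rubble-pile
# bulk density; these values are not taken from the RAGE runs.
v_esc = escape_velocity(radius_m=250.0, bulk_density_kg_m3=1900.0)
print(f"escape velocity ≈ {v_esc:.2f} m/s")  # roughly 0.25 m/s

# Imparted fragment speeds of "many m/s" therefore exceed the escape
# velocity by an order of magnitude or more, consistent with the study's
# conclusion that the fragments would not gravitationally re-assemble.
```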

  10. United States studies in orbital debris - Prevention and mitigation

    NASA Technical Reports Server (NTRS)

    Loftus, Joseph P., Jr.; Potter, Andrew E.

    1990-01-01

    Debris in space has become an issue that has commanded considerable interest in recent years as society has become both more dependent upon space-based systems and more aware of its dependence. After many years of study, the United States Space Policy of February 1988 directed that all sectors of the U.S. community minimize space debris. Other space organizations have adopted similar policies. Among the study activities leading to the policy and to subsequent implementing directives were discussions with ESA, NASDA, and other space operating agencies. The policy derived from a technical consensus on the nature of the issues and on the courses of action available to mitigate the problem, but there remains concern about the adequacy of the data to define cost-effective strategies. Mechanisms are now in place to continue technical discussions in more formal terms.

  11. AMENDING SOILS WITH PHOSPHATE AS MEANS TO MITIGATE SOIL LEAD HAZARD: A CRITICAL REVIEW OF THE STATE OF THE SCIENCE

    EPA Science Inventory

    Ingested soil and surface dust may be important contributors to elevated blood lead (Pb) levels in children exposed to Pb contaminated environments. Mitigation strategies have typically focused on excavation and removal of the contaminated soil. However, this is not always feas...

  12. System Safety Hazards Assessment in Conceptual Program Trade Studies

    NASA Technical Reports Server (NTRS)

    Eben, Dennis M.; Saemisch, Michael K.

    2003-01-01

    Providing a program in the concept development phase with a method of determining the system safety benefits of potential concepts has always been a challenge. Lockheed Martin Space and Strategic Missiles has developed a methodology for producing a relative system safety ranking using the potential hazards of each concept. The resulting output supports program decisions with system safety as an evaluation criterion backed by supporting data. This approach begins with a generic hazards list that has been tailored for the program being studied and augmented with an initial hazard analysis. Each proposed concept is assessed against the list of program hazards and ranked in three derived areas. The hazards can be weighted to emphasize those of greatest concern to the program. Sensitivities can also be determined to test the robustness of the conclusions.

  13. Airflow Hazard Visualization for Helicopter Pilots: Flight Simulation Study Results

    NASA Technical Reports Server (NTRS)

    Aragon, Cecilia R.; Long, Kurtis R.

    2005-01-01

    Airflow hazards such as vortices or low level wind shear have been identified as a primary contributing factor in many helicopter accidents. US Navy ships generate airwakes over their decks, creating potentially hazardous conditions for shipboard rotorcraft launch and recovery. Recent sensor developments may enable the delivery of airwake data to the cockpit, where visualizing the hazard data may improve safety and possibly extend ship/helicopter operational envelopes. A prototype flight-deck airflow hazard visualization system was implemented on a high-fidelity rotorcraft flight dynamics simulator. Experienced helicopter pilots, including pilots from all five branches of the military, participated in a usability study of the system. Data was collected both objectively from the simulator and subjectively from post-test questionnaires. Results of the data analysis are presented, demonstrating a reduction in crash rate and other trends that illustrate the potential of airflow hazard visualization to improve flight safety.

  14. National Wetland Mitigation Banking Study: Wetland Mitigation Banking.

    DTIC Science & Technology

    1994-02-01

    … twist on diversity concerns, the Seaworld Eelgrass Mitigation Bank in Southern California has considered taking genetic diversity into account as a … For example, the Seaworld Eelgrass Mitigation Bank is using the density of eelgrass as a measure of quality. 2. Narrowly Tailored …

  15. Experimental Study of Blast Mitigation in a Water Mist

    DTIC Science & Technology

    2006-11-01

    … and Silnikov M.V., The selection of the effective blast reduction method when detonating explosives, J. de Phys. IV France, 2002, v. 12, n. 7, pp. Pr7-… … porosity); iv) the environment (confinement) surrounding the charge; and v) the mitigant arrangement (shell containing the mitigant, and the shell's material …) … by introduction of a 5% vertical velocity disturbance in the inflow. This disturbance means that the vertical velocity (V-component of the flow …

  16. Mitigation of hazards from future lahars from Mount Merapi in the Krasak River channel near Yogyakarta, central Java

    USGS Publications Warehouse

    Ege, John R.; ,

    1983-01-01

    Procedures for reducing hazards from future lahars and debris flows in the Krasak River channel near Yogyakarta, Central Java, Indonesia, include (1) determining the history of the location, size, and effects of previous lahars and debris flows, and (2) decreasing flow velocities. The first may be accomplished by geologic field mapping along with information acquired by interviewing local residents, and the second by increasing the cross-sectional area of the river channel and constructing barriers in the flow path.

  17. Peru mitigation assessment of greenhouse gases: Sector -- Energy. Peru climate change country study; Final report

    SciTech Connect

    1996-08-01

    The aim of this study is to determine the greenhouse gas inventory and to propose greenhouse gas mitigation alternatives that allow the country's future development to proceed in a clean environmental setting without delaying the development process required to improve the Peruvian standard of living. This executive abstract concisely presents the results of the greenhouse gas mitigation analysis for Peru for the period 1990-2015. The mitigation studies for the energy sector are summarized here.

  18. Is research on soil erosion hazard and mitigation in the Global South still needed? (Alexander von Humboldt Medal Lecture)

    NASA Astrophysics Data System (ADS)

    Poesen, Jean

    2016-04-01

    Soil erosion represents a geomorphological and geological hazard that may cause environmental damage (land degradation), property damage, loss of livelihoods and services, as well as social and economic disruption. Erosion not only lowers the quality of our soils on site, resulting in a drastic reduction of the ecosystem functions that play a vital role in daily life, but also causes significant sediment-related problems off site. To curb soil erosion problems, a range of soil conservation techniques and strategies have been designed and are being applied. Worldwide, ca. 62,000 research papers on soil erosion and 116,000 on soil conservation have been published (Web of Science, Dec. 2015). The number of such papers dealing with the Global South represents less than 20% of all papers, despite the fact that many regions in this part of the world face significant soil erosion problems, aggravated by a rapidly growing population and major environmental changes. Given the large number of research papers on this topic, one might therefore conclude that we now know almost everything about the various soil erosion processes and rates, their factors and consequences, as well as their control, so that little new knowledge can still be added to the vast amount of available information. We refute this conclusion by pointing to some major research gaps that still need to be addressed if we want to use our soils in a more sustainable way. More specifically, the following topics need more research attention: 1) improved understanding of both natural and anthropogenic soil erosion processes and their interactions, 2) scaling up soil erosion processes and rates in space and time, and 3) innovative techniques and strategies to prevent or reduce erosion rates. This will be illustrated with case studies from the Global South. If future research focuses on these research gaps, we will 1) better understand processes and their interactions operating at a range of spatial and temporal

  19. Human uses of forested watersheds and riparian corridors: hazard mitigation as an ecosystem service, with examples from Panama, Puerto Rico, and Venezuela

    NASA Astrophysics Data System (ADS)

    Larsen, M. C.

    2015-12-01

    Humans have long favored settlement along rivers for access to water supply for drinking and agriculture, for transport corridors, and for food sources. Additionally, settlement in or near montane forests offers benefits such as food sources, wood supply, aesthetic values, and high-quality water resources derived from watersheds where upstream human disturbance and environmental degradation are generally reduced. However, the advantages afforded by these riparian and montane settings pose episodic risks for communities located there, as floods, landslides, and wildfires cause loss of life, destroy infrastructure, and damage or destroy crops. A basic understanding of flood probability and magnitude, as well as of hillslope stability, by residents of these environments can mitigate these risks. Early humans presumably developed some degree of knowledge about these risks through their long periods of occupation of these environments and their observations of seasonal and storm rainfall patterns and river discharge, knowledge that became more refined as agriculture developed over the past 10,000 years. Modern global urbanization, particularly in regions of rapid economic growth, has resulted in much of this "organic" knowledge being lost, as rural populations move into megacities, many of which encroach on floodplains and mountain fronts. Moreover, the most likely occupants of these hazardous locations are often economically constrained, increasing their vulnerability. Effective stewardship of river floodplains and upstream montane forests yields a key ecosystem service: in addition to the well-known services (i.e., water, hydroelectric energy, etc.), it provides a risk mitigation service by reducing hazard and vulnerability. Puerto Rico, Panama, and Venezuela illustrate a range of practices and results, providing useful examples for planners and land use managers.

  20. 3D modelling of Mt. Talaga Bodas Crater (Indonesia) by using terrestrial laser scanner for volcano hazard mitigation

    NASA Astrophysics Data System (ADS)

    Gumilar, Irwan; Abidin, Hasanuddin Z.; Putra, Andreas D.; Haerani, Nia

    2015-04-01

    Indonesia is a country with many volcanoes, and each volcano typically has its own crater characteristics. One of them is Mt. Talaga Bodas, located in Garut, West Java. Research on crater characteristics is necessary for the volcanic disaster mitigation process; one such effort is modelling the shape of the crater. One method that can be used to model a volcanic crater is the Terrestrial Laser Scanner (TLS). This research aims to create a three-dimensional (3D) model of the crater of Mt. Talaga Bodas that can be utilized for volcanic disaster mitigation. The methodology consists of acquiring scan data with the TLS and GPS measurements to obtain the coordinates of the reference points. The data processing consists of several steps, namely target-to-target registration, filtering, georeferencing, point-cloud meshing, surface creation, drawing, and 3D modelling. These steps were carried out using the Cyclone 7 software, with 3DS MAX used for the 3D modelling. The result of this processing is a 3D model of the crater of Mt. Talaga Bodas that closely resembles the real shape. The calculations show that the height of the crater is 62.522 m, the diameter of the crater is 467.231 m, and the total area is 2,961,054.652 m². The main obstacle in this research was the dense vegetation, which introduced noise and affected the crater model.
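
    As a rough illustration of how crater height and rim diameter can be extracted from a georeferenced point cloud, the sketch below operates on a synthetic bowl-shaped cloud scaled to resemble the reported dimensions; it is not the Cyclone/3DS MAX workflow used in the study, and the rim-selection rule is an assumption.

```python
import numpy as np

def crater_dimensions(points_xyz: np.ndarray):
    """Rough crater metrics from a georeferenced point cloud (N x 3, metres).

    Height   : elevation range between crater floor and rim.
    Diameter : mean x/y extent of the uppermost points, taken as the rim.
    Illustrative only -- not the workflow used in the study.
    """
    z = points_xyz[:, 2]
    height = z.max() - z.min()

    rim = points_xyz[z >= np.quantile(z, 0.95)]  # crude rim proxy
    extent_x = rim[:, 0].max() - rim[:, 0].min()
    extent_y = rim[:, 1].max() - rim[:, 1].min()
    diameter = 0.5 * (extent_x + extent_y)
    return height, diameter

# Synthetic bowl-shaped cloud, scaled to resemble the reported dimensions
# (~62 m deep, ~467 m across); purely for demonstration.
rng = np.random.default_rng(0)
n = 50_000
theta = rng.uniform(0.0, 2.0 * np.pi, n)
r = 233.5 * np.sqrt(rng.uniform(0.0, 1.0, n))
z = 62.5 * (r / 233.5) ** 2 + rng.normal(0.0, 0.3, n)
cloud = np.column_stack([r * np.cos(theta), r * np.sin(theta), z])

h, d = crater_dimensions(cloud)
print(f"height ≈ {h:.1f} m, rim diameter ≈ {d:.1f} m")
```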

  1. Relating the compensational stacking of debris-flow fans to characteristics of their underlying stratigraphy: Implications for geologic hazard assessment and mitigation

    NASA Astrophysics Data System (ADS)

    Pederson, Christopher A.; Santi, Paul M.; Pyles, David R.

    2015-11-01

    Compensational stacking is the tendency for sediment transport systems to fill topographic lows through avulsion. This article quantitatively relates, for the first time, compensational stacking patterns within debris fans to characteristics of their internal stratigraphy and discusses the implications for geologic hazard assessment and mitigation. Three exceptionally well-exposed debris fans were selected in Colorado for quantitative stratigraphic analyses. In each fan, the cross-sectional stratigraphy was subdivided into discrete depositional units (debris-flow and stream-flow deposits). The bounding surfaces between the depositional units were used to analyze the compensation index (κcv) of the fans, which is a measure of their compensational or avulsion tendencies. In the measured datasets, κcv ranged from 0.63 to 1.03. Values close to 0.5 represent intermediate levels of compensation, whereas values approaching 1.0 reflect high levels of compensation. The compensational values (κcv) were statistically compared with physical, observable characteristics of the fans, including: (1) debris-flow size, (2) amount of stream-flow deposits, (3) debris-flow composition, and (4) longitudinal position on the fan. These parameters correlated, either positively or negatively, with κcv, supporting their use as proxies for assessing the degree of compensational stacking in settings where large-scale cross-sections of a fan are unavailable. Such empirical results can be used by geologists and engineers for avoidance and mitigation measures of land use on debris fans.
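
    The statistical comparison described above amounts to correlating κcv with candidate field proxies. A minimal sketch of such a test follows; the numbers are invented placeholders within the reported κcv range, not the measured Colorado fan data, and the proxy variable name is hypothetical.

```python
import numpy as np
from scipy import stats

# Invented placeholder values for illustration only (not the study's data):
# kappa_cv spans the reported 0.63-1.03 range; mean_flow_thickness_m stands
# in for one of the candidate physical proxies.
kappa_cv = np.array([0.63, 0.71, 0.78, 0.86, 0.95, 1.03])
mean_flow_thickness_m = np.array([1.8, 1.6, 1.5, 1.1, 0.9, 0.7])

r, p = stats.pearsonr(kappa_cv, mean_flow_thickness_m)
print(f"Pearson r = {r:.2f}, p = {p:.3f}")
# A significant correlation (positive or negative) would support using the
# proxy where full fan cross-sections are unavailable.
```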

  2. Using fine-scale fuel measurements to assess wildland fuels, potential fire behavior and hazard mitigation treatments in the southeastern USA.

    SciTech Connect

    Ottmar, Roger D.; Blake, John I.; Crolly, William T.

    2012-01-01

    The inherent spatial and temporal heterogeneity of fuelbeds in forests of the southeastern United States may require fine-scale fuel measurements to provide reliable fire hazard and fuel treatment effectiveness estimates. In a series of five papers, an intensive, fine-scale fuel inventory from the Savannah River Site in the southeastern United States is used for building fuelbeds and mapping fire behavior potential, evaluating fuel treatment options for effectiveness, and providing a comparative analysis of landscape-modeled fire behavior using three different data sources: the Fuel Characteristic Classification System, LANDFIRE, and the Southern Wildfire Risk Assessment. The research demonstrates that fine-scale fuel measurements associated with fuel inventories repeated over time can be used to assess broad-scale wildland fire potential and hazard mitigation treatment effectiveness in the southeastern USA and similar fire-prone regions. Additional investigations will be needed to modify and improve these processes and capture the true potential of these fine-scale data sets for fire and fuel management planning.

  3. US country studies program: Results from mitigation studies

    SciTech Connect

    1996-12-31

    This paper describes the U.S. Country Studies Program, which was implemented to support the principles and objectives of the Framework Convention on Climate Change (FCCC). The program had three principal objectives: to enhance capabilities to conduct climate change assessments, prepare action plans, and implement technology projects; to help establish a process for developing and implementing national policies and measures; and to support the principles and objectives of the FCCC. As a result, 55 countries are completing studies, more than 2,000 analysts engaged in the studies have been trained, and there is a much broader understanding of and support for climate change concerns. The article describes the experiences of some countries, along with general observations and conclusions, which are broadly separated into those for developed countries and those with economies in transition.

  4. Experimental study designs to improve the evaluation of road mitigation measures for wildlife.

    PubMed

    Rytwinski, Trina; van der Ree, Rodney; Cunnington, Glenn M; Fahrig, Lenore; Findlay, C Scott; Houlahan, Jeff; Jaeger, Jochen A G; Soanes, Kylie; van der Grift, Edgar A

    2015-05-01

    An experimental approach to road mitigation that maximizes inferential power is essential to ensure that mitigation is both ecologically-effective and cost-effective. Here, we set out the need for and standards of using an experimental approach to road mitigation, in order to improve knowledge of the influence of mitigation measures on wildlife populations. We point out two key areas that need to be considered when conducting mitigation experiments. First, researchers need to get involved at the earliest stage of the road or mitigation project to ensure the necessary planning and funds are available for conducting a high quality experiment. Second, experimentation will generate new knowledge about the parameters that influence mitigation effectiveness, which ultimately allows better prediction for future road mitigation projects. We identify seven key questions about mitigation structures (i.e., wildlife crossing structures and fencing) that remain largely or entirely unanswered at the population-level: (1) Does a given crossing structure work? What type and size of crossing structures should we use? (2) How many crossing structures should we build? (3) Is it more effective to install a small number of large-sized crossing structures or a large number of small-sized crossing structures? (4) How much barrier fencing is needed for a given length of road? (5) Do we need funnel fencing to lead animals to crossing structures, and how long does such fencing have to be? (6) How should we manage/manipulate the environment in the area around the crossing structures and fencing? (7) Where should we place crossing structures and barrier fencing? We provide experimental approaches to answering each of them using example Before-After-Control-Impact (BACI) study designs for two stages in the road/mitigation project where researchers may become involved: (1) at the beginning of a road/mitigation project, and (2) after the mitigation has been constructed; highlighting real case
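
    A Before-After-Control-Impact effect is usually estimated as the period by treatment interaction in a linear model. The sketch below simulates crossing counts at control and mitigated (impact) sites and fits such a model; the data, variable names, and the simple OLS formulation are illustrative assumptions, not the authors' designs (which would typically use repeated surveys and mixed or count models).

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated survey counts of animal crossings, before and after a crossing
# structure is built, at impact (mitigated) and control sites. All values
# are invented for illustration.
rng = np.random.default_rng(1)
rows = []
for site_type, shift in [("control", 0.0), ("impact", 5.0)]:
    for period in ("before", "after"):
        effect = shift if (site_type == "impact" and period == "after") else 0.0
        for _ in range(30):  # 30 replicate surveys per cell
            rows.append({"site_type": site_type,
                         "period": period,
                         "crossings": rng.poisson(10.0 + effect)})
df = pd.DataFrame(rows)

# The BACI effect of the mitigation is the period:site_type interaction term.
model = smf.ols("crossings ~ C(period) * C(site_type)", data=df).fit()
print(model.summary().tables[1])
```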

  5. Prevalence and predictors of residential health hazards: a pilot study.

    PubMed

    Klitzman, Susan; Caravanos, Jack; Deitcher, Deborah; Rothenberg, Laura; Belanoff, Candice; Kramer, Rachel; Cohen, Louise

    2005-06-01

    This article reports the results of a pilot study designed to ascertain the prevalence of lead-based paint (LBP), vermin, mold, and safety conditions and hazards and to validate observations and self-reports against environmental sampling data. Data are based on a convenience sample of 70 dwellings in a low-income, urban neighborhood in Brooklyn, New York. The vast majority of residences (96%) contained multiple conditions and/or hazards: LBP hazards (80%), vermin (79%), elevated levels of airborne mold (39%), and safety hazards (100%). Observations and occupant reports were associated with environmental sampling data. In general, the more proximate an observed condition was to an actual hazard, the more likely it was to be associated with environmental sampling results (e.g., peeling LBP was associated with windowsill dust lead levels, and cockroach sightings by tenants were associated with Blatella germanica [Bla g 1] levels). Conversely, the more distal an observed condition was to an actual hazard, the less likely it was to be associated with environmental sampling results (e.g., water damage, alone, was not statistically associated with elevated levels of dust lead, Bla g 1, or airborne mold). Based on the findings from this pilot study, there is a need for industrial hygienists and others to adopt more comprehensive and integrative approaches to residential hazard assessment and remediation. Further research--using larger, randomly drawn samples, representing a range of housing types and geographical areas--is needed to clarify the relationship between readily observable conditions, occupant reports, and environmental sampling data and to assess the cumulative impact on human health.
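
    Associations such as "peeling LBP observed versus elevated windowsill dust lead" are typically screened with a 2 x 2 contingency test. The counts in the sketch below are invented for illustration and are not the study's data.

```python
import numpy as np
from scipy.stats import chi2_contingency, fisher_exact

# Invented 2x2 counts: rows = peeling lead-based paint observed (yes/no);
# columns = windowsill dust-lead result elevated (yes/no).
table = np.array([[28, 12],
                  [ 8, 22]])

chi2, p, dof, expected = chi2_contingency(table)
odds_ratio, p_exact = fisher_exact(table)

print(f"chi-square = {chi2:.2f}, p = {p:.3f}")
print(f"odds ratio = {odds_ratio:.2f}, Fisher exact p = {p_exact:.3f}")
# With a convenience sample of ~70 dwellings, the exact test is often the
# safer choice when cells are sparse.
```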

  6. Hazard & Operability Study for Removal of Spent Nuclear Fuel from the 324 Building

    SciTech Connect

    VAN KEUREN, J.C.

    2002-05-07

    A hazard and operability (HAZOP) study was conducted to examine the hazards associated with the removal of the spent nuclear fuel from the 324 Building. Fifty-nine potentially hazardous conditions were identified.

  7. Odor mitigation with vegetative buffers: Swine production case study

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Vegetative environmental buffers (VEB) are a potentially low cost sustainable odor mitigation strategy, but there is little to no data supporting their effectiveness. Wind tunnel experiments and field monitoring were used to determine the effect VEB had on wind flow patterns within a swine facility....

  8. Amending soils with phosphate as means to mitigate soil lead hazard: a critical review of the state of the science.

    PubMed

    Scheckel, Kirk G; Diamond, Gary L; Burgess, Michele F; Klotzbach, Julie M; Maddaloni, Mark; Miller, Bradley W; Partridge, Charles R; Serda, Sophia M

    2013-01-01

    Ingested soil and surface dust may be important contributors to elevated blood lead (Pb) levels in children exposed to Pb contaminated environments. Mitigation strategies have typically focused on excavation and removal of the contaminated soil. However, this is not always feasible for addressing widely disseminated contamination in populated areas often encountered in urban environments. The rationale for amending soils with phosphate is that phosphate will promote formation of highly insoluble Pb species (e.g., pyromorphite minerals) in soil, which will remain insoluble after ingestion and, therefore, inaccessible to absorption mechanisms in the gastrointestinal tract (GIT). Amending soil with phosphate might potentially be used in combination with other methods that reduce contact with or migration of contaminated soils, such as covering the soil with a green cap such as sod, clean soil with mulch, raised garden beds, or gravel. These remediation strategies may be less expensive and far less disruptive than excavation and removal of soil. This review evaluates evidence for efficacy of phosphate amendments for decreasing soil Pb bioavailability. Evidence is reviewed for (1) physical and chemical interactions of Pb and phosphate that would be expected to influence bioavailability, (2) effects of phosphate amendments on soil Pb bioaccessibility (i.e., predicted solubility of Pb in the GIT), and (3) results of bioavailability bioassays of amended soils conducted in humans and animal models. Practical implementation issues, such as criteria and methods for evaluating efficacy, and potential effects of phosphate on mobility and bioavailability of co-contaminants in soil are also discussed.

  9. The Value of Linking Mitigation and Adaptation: A Case Study of Bangladesh

    NASA Astrophysics Data System (ADS)

    Ayers, Jessica M.; Huq, Saleemul

    2009-05-01

    There are two principal strategies for managing climate change risks: mitigation and adaptation. Until recently, mitigation and adaptation have been considered separately in both climate change science and policy. Mitigation has been treated as an issue for developed countries, which hold the greatest responsibility for climate change, while adaptation is seen as a priority for the South, where mitigative capacity is low and vulnerability is high. This conceptual divide has hindered progress against the achievement of the fundamental sustainable development challenges of climate change. Recent attention to exploring the synergies between mitigation and adaptation suggests that an integrated approach could go some way to bridging the gap between the development and adaptation priorities of the South and the need to achieve global engagement in mitigation. These issues are explored through a case study analysis of climate change policy and practice in Bangladesh. Using the example of waste-to-compost projects, a mitigation-adaptation-development nexus is demonstrated, as projects contribute to mitigation through reducing methane emissions; adaptation through soil improvement in drought-prone areas; and sustainable development, because poverty is exacerbated when climate change reduces the flows of ecosystem services. Further, linking adaptation to mitigation makes mitigation action more relevant to policymakers in Bangladesh, increasing engagement in the international climate change agenda in preparation for a post-Kyoto global strategy. This case study strengthens the argument that while combining mitigation and adaptation is not a magic bullet for climate policy, synergies, particularly at the project level, can contribute to the sustainable development goals of climate change and are worth exploring.

  10. The value of linking mitigation and adaptation: a case study of Bangladesh.

    PubMed

    Ayers, Jessica M; Huq, Saleemul

    2009-05-01

    There are two principal strategies for managing climate change risks: mitigation and adaptation. Until recently, mitigation and adaptation have been considered separately in both climate change science and policy. Mitigation has been treated as an issue for developed countries, which hold the greatest responsibility for climate change, while adaptation is seen as a priority for the South, where mitigative capacity is low and vulnerability is high. This conceptual divide has hindered progress against the achievement of the fundamental sustainable development challenges of climate change. Recent attention to exploring the synergies between mitigation and adaptation suggests that an integrated approach could go some way to bridging the gap between the development and adaptation priorities of the South and the need to achieve global engagement in mitigation. These issues are explored through a case study analysis of climate change policy and practice in Bangladesh. Using the example of waste-to-compost projects, a mitigation-adaptation-development nexus is demonstrated, as projects contribute to mitigation through reducing methane emissions; adaptation through soil improvement in drought-prone areas; and sustainable development, because poverty is exacerbated when climate change reduces the flows of ecosystem services. Further, linking adaptation to mitigation makes mitigation action more relevant to policymakers in Bangladesh, increasing engagement in the international climate change agenda in preparation for a post-Kyoto global strategy. This case study strengthens the argument that while combining mitigation and adaptation is not a magic bullet for climate policy, synergies, particularly at the project level, can contribute to the sustainable development goals of climate change and are worth exploring.

  11. Alternative Mechanisms for Compensatory Mitigation: Case Studies and Lessons About Fee-Based Compensatory Wetlands Mitigation

    DTIC Science & Technology

    1993-03-01

    … Grove West Vineyard Urban Study Area, Placer County, California. Key features of fee-based compensation as represented by the case study programs … supports a wide variety of wildflower and other plant species. Scientists maintain that these forests are among the most floristically diverse in the …

  12. A study on seismicity and seismic hazard for Karnataka State

    NASA Astrophysics Data System (ADS)

    Sitharam, T. G.; James, Naveen; Vipin, K. S.; Raj, K. Ganesha

    2012-04-01

    This paper presents a detailed study of the seismicity pattern of the state of Karnataka and quantifies the seismic hazard for the entire state. In the present work, historical and instrumental seismicity data for Karnataka (within 300 km of the Karnataka political boundary) were compiled, and the hazard analysis was based on these data. Geographically, Karnataka forms a part of peninsular India, which is tectonically identified as an intraplate region of the Indian plate. Due to the convergent movement of the Indian plate with the Eurasian plate, movements are occurring along major intraplate faults, resulting in seismic activity in the region; hence the hazard assessment of this region is very important. Apart from referring to the seismotectonic atlas for identifying faults and fractures, major lineaments in the study area were also mapped using satellite data. Earthquake events reported by various national and international agencies were collected up to 2009. Declustering of earthquake events was done to remove foreshocks and aftershocks. Seismic hazard analysis was done for the state of Karnataka using both deterministic and probabilistic approaches incorporating a logic tree methodology. The peak ground acceleration (PGA) at rock level was evaluated for the entire state considering a grid size of 0.05° × 0.05°. The attenuation relations proposed for stable continental shield regions were used in evaluating the seismic hazard, with appropriate weighting factors. Response spectra at rock level for important Tier II cities and Bangalore were evaluated. Contour maps showing the spatial variation of PGA values at bedrock are presented in this work.
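
    Probabilistic results of this kind are commonly expressed through the Poisson relation between an annual exceedance rate and the probability of exceedance over a design life. A minimal sketch of that conversion follows; it is the generic formula, not the study's hazard curves.

```python
import math

def prob_of_exceedance(annual_rate: float, years: float) -> float:
    """Poisson probability of at least one exceedance in `years`."""
    return 1.0 - math.exp(-annual_rate * years)

def annual_rate_for_prob(prob: float, years: float) -> float:
    """Annual exceedance rate that yields probability `prob` in `years`."""
    return -math.log(1.0 - prob) / years

# The conventional design target of 10% probability of exceedance in 50
# years corresponds to a return period of roughly 475 years.
rate = annual_rate_for_prob(0.10, 50.0)
print(f"annual rate = {rate:.5f} /yr, return period = {1.0 / rate:.0f} yr")
print(f"check: P(exceedance in 50 yr) = {prob_of_exceedance(rate, 50.0):.2f}")
```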

  13. RADON MITIGATION IN SCHOOLS: CASE STUDIES OF RADON MITIGATION SYSTEMS INSTALLED BY EPA IN FOUR MARYLAND SCHOOLS ARE PRESENTED

    EPA Science Inventory

    The first part of this two-part paper discusses radon entry into schools, radon mitigation approaches for schools, and school characteristics (e.g., heating, ventilation, and air-conditioning -- HVAC-- system design and operation) that influence radon entry and mitigation system ...

  14. Seismic hazard studies at the Department of Energy owned Paducah and Portsmouth Gaseous Diffusion plants

    SciTech Connect

    Beavers, J.E.; Brock, W.R.; Hunt, R.J.

    1991-01-01

    Seismic hazard levels for free-field rock motion are defined and presented in this paper as annual exceedance probabilities versus peak acceleration and as uniform hazard response spectra. The conclusions of an independent review are also summarized. Based on the seismic hazard studies, peak horizontal acceleration values and uniform hazard response spectra for rock conditions are recommended. 15 refs., 6 figs., 1 tab.

  15. Multi-Hazard Sustainability: Towards Infrastructure Resilience

    NASA Astrophysics Data System (ADS)

    Lin, T.

    2015-12-01

    Natural and anthropogenic hazards pose significant challenges to civil infrastructure. This presents opportunities in investigating site-specific hazards in structural engineering to aid mitigation and adaptation efforts. This presentation will highlight: (a) recent advances in hazard-consistent ground motion selection methodology for nonlinear dynamic analyses, (b) ongoing efforts in validation of earthquake simulations and their effects on tall buildings, and (c) a pilot study on probabilistic sea-level rise hazard analysis incorporating aleatory and epistemic uncertainties. High performance computing and visualization further facilitate research and outreach to improve resilience under multiple hazards in the face of climate change.

  16. Hazard identification of pharmaceutical wastewaters using biodegradability studies.

    PubMed

    Zgajnar Gotvajn, A; Zagorc-Koncan, J

    2003-01-01

    A reliable wastewater characterization is an integral part of treatment and management strategies for industrial effluents. This is especially true for the pharmaceutical industry, which exhibits significant differences in its lines of activity, generating effluents of very specific and complex natures. Any hazard or risk assessment of wastewater and/or determination of its treatability must include an evaluation of its degradability. Usually, various non-standardized laboratory or pilot-scale long-term tests are run by measuring summary parameters for several days to determine the biodegradation potential of the effluent. A complex approach, based on stabilization studies, was proposed to determine the hazardous impact of wastewaters in terms of biodegradable and persistent toxicity. The objective of our work was to carry out a complex hazard evaluation of pharmaceutical wastewaters. Whole effluent toxicity was determined using two different toxicity tests. First, we measured the inhibition of oxygen consumption by activated sludge. The test indicated toxicity of the wastewater, and thus we performed an additional acute toxicity test with the luminescent bacteria Vibrio fischeri. The next step was the determination of whole effluent ready biodegradability. It was determined by simultaneous measurement of oxygen consumption (ISO 9804) and carbon dioxide production (ISO 9439) in a closed respirometer, accompanied by DOC/IC measurements. The pharmaceutical wastewater degraded readily (83%; the lag phase was 2 days and the biodegradation rate was 0.33999 day⁻¹) on the basis of O2 measurements. The biodegradation calculated from the CO2 measurements was comparable. We also applied mass balances of DOC/IC at the beginning and at the end of the biodegradation experiments to confirm the extent and rate of biodegradation. The determination of the hazardous impact and treatability of the effluent was concluded with aerobic stabilization studies. Biodegradation of the wastewater during the study
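
    The reported kinetics (2-day lag, rate of about 0.34 day⁻¹, roughly 83% ultimate degradation) are consistent with a simple first-order model with a lag phase. The functional form below is an assumption for illustration; the record does not state which model, if any, was fitted.

```python
import math

def biodegradation_fraction(t_days: float,
                            d_max: float = 0.83,
                            k_per_day: float = 0.34,
                            lag_days: float = 2.0) -> float:
    """Illustrative first-order biodegradation with a lag phase.

    D(t) = d_max * (1 - exp(-k * (t - lag))) for t > lag, else 0.
    Parameter values follow the record; the model form is an assumption.
    """
    if t_days <= lag_days:
        return 0.0
    return d_max * (1.0 - math.exp(-k_per_day * (t_days - lag_days)))

for day in (2, 5, 10, 20, 28):
    frac = biodegradation_fraction(day)
    print(f"day {day:2d}: degradation ≈ {100 * frac:.0f} %")
# By day 28 (the usual ready-biodegradability window) the curve approaches
# the ~83% plateau reported for this effluent.
```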

  17. Interdisciplinary approach to hydrological hazard mitigation and disaster response and effects of climate change on the occurrence of flood severity in central Alaska

    NASA Astrophysics Data System (ADS)

    Kontar, Y. Y.; Bhatt, U. S.; Lindsey, S. D.; Plumb, E. W.; Thoman, R. L.

    2015-06-01

    In May 2013, a massive ice jam on the Yukon River caused flooding that destroyed much of the infrastructure in the Interior Alaska village of Galena and forced the long-term evacuation of nearly 70% of its residents. This case study compares the communication efforts of out-of-state emergency response agents with those of the Alaska River Watch program, a state-operated flood preparedness and community outreach initiative. For over 50 years, the River Watch program has been fostering long-lasting, open, and reciprocal communication with flood-prone communities, as well as with local emergency management and tribal officials. By taking into account the cultural, ethnic, and socioeconomic features of rural Alaskan communities, the River Watch program was able to establish and maintain a sense of partnership and reliable communication patterns with communities at risk. As a result, officials and residents in these communities are open to information and guidance from the River Watch during a flood, and thus are poised to take prompt action. By informing communities of existing ice conditions and flood threats on a regular basis, the River Watch provides effective mitigation by reducing the effects of ice jam floods. Although other ice jam mitigation attempts have been made throughout U.S. and Alaskan history, the majority proved to be futile and/or cost-ineffective. Galena, along with other rural riverine Alaskan communities, has to rely primarily on disaster response and recovery strategies to withstand the shock of disasters. Significant government funds are spent on these challenging efforts, and these expenses might be reduced through an improved understanding of both the physical and climatological principles behind river ice breakup and risk mitigation. This study finds that long-term dialogue is critical for effective disaster response and recovery during extreme hydrological events connected to changing climate, the timing of river ice breakup, and

  18. Decay extent evaluation of wood degraded by a fungal community using NIRS: application for ecological engineering structures used for natural hazard mitigation

    NASA Astrophysics Data System (ADS)

    Baptiste Barré, Jean; Bourrier, Franck; Bertrand, David; Rey, Freddy

    2015-04-01

    … .13). This tool improves the evaluation accuracy of wood decay extent in the context of ecological engineering structures used for natural hazard mitigation.

  19. A European effort towards the development of tools for tsunami hazard and risk assessment and mitigation, and tsunami early warning: the EC-funded TRANSFER project

    NASA Astrophysics Data System (ADS)

    Tinti, S.; Armigliato, A.

    2007-12-01

    TRANSFER (acronym for "Tsunami Risk ANd Strategies For the European Region") is a European Community funded project coordinated by the University of Bologna (Italy) and involving 29 partners in Europe, Turkey, and Israel. The main objectives of the project can be summarised as: 1) improving our understanding of tsunami processes in the Euro-Mediterranean region, 2) contributing to tsunami hazard, vulnerability, and risk assessment, 3) identifying the best strategies for reducing tsunami risk, and 4) focussing on the gaps and needs for the implementation of an efficient tsunami early warning system (TEWS) in the Euro-Mediterranean area, which is a high-priority task considering that no tsunami early warning system is currently in place in the Euro-Mediterranean countries. This paper briefly outlines the results obtained in the first year of the project and the activities that are currently being carried out and planned for the future. In particular, we emphasize the efforts made so far in the following directions. 1) The improvement of existing numerical models for tsunami generation, propagation, and impact, and the possible development of new ones. Existing numerical models have already been applied to selected benchmark problems. At the same time, the project is making an important effort in the development of standards for inundation maps in Europe. 2) The project Consortium has selected seven test areas in different countries facing the Mediterranean Sea and the eastern Atlantic Ocean, where innovative probabilistic and statistical approaches for tsunami hazard assessment, as well as up-to-date and new methods to compute inundation maps, are being and will be applied. For the same test areas, tsunami scenario approaches are being developed, vulnerability and risk assessed, and prevention and mitigation measures defined, also with the advice of end users organised in an End User Group. 3) A final key aspect is represented by the dissemination of

  20. Channelized debris flow hazard mitigation through the use of flexible barriers: a simplified computational approach for a sensitivity analysis.

    NASA Astrophysics Data System (ADS)

    Segalini, Andrea; Ferrero, Anna Maria; Brighenti, Roberto

    2013-04-01

    A channelized debris flow is usually represented by a mixture of solid particles of various sizes and water, flowing along a laterally confined, inclined, channel-shaped region up to an unconfined area where it slows down and spreads out into a flat-shaped mass. The study of these phenomena is very difficult due to their short duration and unpredictability, the lack of historical data for a given basin, and the complexity of the mechanical phenomena involved. Post-event surveys allow for the identification of some depositional features and provide an indication of the maximum flow height; however, they lack information about the development of the phenomena over time. For this purpose, the monitoring of recurrent events has been carried out by several authors. Most of the studies aimed at determining the characteristic features of a debris flow were carried out in artificial channels, where the main variables involved were measured and others were controlled during the tests; however, some uncertainties remained, and scaled models were developed to simulate the deposition mechanics as well as to analyze the transport mechanics and the energy dissipation. The assessment of the mechanical behavior of protection structures upon impact with the flow, as well as of the energy associated with it, is necessary for the proper design of such structures, which in densely populated areas can avoid victims and limit the destructive effects of such phenomena. In this work a simplified structural model, developed by the authors for the safety assessment of retention barriers against channelized debris flows, is presented, and some parametric cases are interpreted through the proposed approach; this model is developed as a simplified and efficient tool to be used for the verification of the supporting cables and foundations of a flexible debris flow barrier. The present analytical and numerical approach has a different aim from that of a FEM model. The computational
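
    A common first estimate of the design load on such a barrier is a hydro-dynamic impact pressure of the form p = α·ρ·v², integrated over the impacted area. The sketch below uses assumed flow parameters and an assumed pressure coefficient; it is a generic literature-style estimate, not the simplified structural model developed by the authors.

```python
def debris_flow_impact_force(density_kg_m3: float,
                             velocity_m_s: float,
                             flow_height_m: float,
                             barrier_width_m: float,
                             alpha: float = 2.0) -> float:
    """Hydro-dynamic estimate of debris-flow impact force on a barrier.

    F = alpha * rho * v^2 * (h * w), where alpha is an empirical pressure
    coefficient (literature values roughly 0.5-5). This is a generic first
    estimate, not the authors' simplified structural model.
    """
    pressure_pa = alpha * density_kg_m3 * velocity_m_s ** 2
    return pressure_pa * flow_height_m * barrier_width_m

# Assumed, illustrative flow and barrier parameters.
force_n = debris_flow_impact_force(density_kg_m3=1900.0,
                                   velocity_m_s=5.0,
                                   flow_height_m=1.5,
                                   barrier_width_m=10.0)
print(f"design impact force ≈ {force_n / 1e3:.0f} kN")  # about 1.4 MN
```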

  1. Mitigating Hazards in School Facilities

    ERIC Educational Resources Information Center

    National Clearinghouse for Educational Facilities, 2008

    2008-01-01

    School safety is a human concern, one that every school and community must take seriously and strive continually to achieve. It is also a legal concern; schools can be held liable if they do not make good-faith efforts to provide a safe and secure school environment. How schools are built and maintained is an integral part of school safety and…

  2. A STUDY ON GREENHOUSE GAS EMISSIONS AND MITIGATION POTENTIALS IN AGRICULTURE

    NASA Astrophysics Data System (ADS)

    Hasegawa, Tomoko; Matsuoka, Yuzuru

    In this study, world food production and consumption are estimated from 2005 to 2030 using a model developed with a general-to-specific modeling methodology. Based on the estimated agricultural production, we estimated GHG emissions and mitigation potentials and evaluated mitigation countermeasures in agriculture. As a result, world crop and meat production are projected to increase by factors of 1.4 and 1.3, respectively, up to 2030. World GHG emissions from agriculture were 5.7 GtCO2eq in 2005. CH4 emissions from enteric fermentation and N2O emissions from nitrogen fertilizer contributed a large part of the total. In 2030, the technical and economic mitigation potentials will be 2.0 GtCO2eq and 1.2 GtCO2eq, respectively, corresponding to 36% and 22% of total emissions in 2000. The countermeasures with the highest effects will be water management in rice paddies, such as "midseason drainage" and "off-season straw".

  3. Insights from EMF Associated Agricultural and Forestry Greenhouse Gas Mitigation Studies

    SciTech Connect

    McCarl, Bruce A.; Murray, Brian; Kim, Man-Keun; Lee, Heng-Chi; Sands, Ronald D.; Schneider, Uwe

    2007-11-19

    Integrated assessment modeling (IAM) as employed by the Energy Modeling Forum (EMF) generally involves a multi-sector appraisal of greenhouse gas emission (GHGE) mitigation alternatives and climate change effects typically at the global level. Such a multi-sector evaluation encompasses potential climate change effects and mitigative actions within the agricultural and forestry (AF) sectors. In comparison with many of the other sectors covered by IAM, the AF sectors may require somewhat different treatment due to their critical dependence upon spatially and temporally varying resource and climatic conditions. In particular, in large countries like the United States, forest production conditions vary dramatically across the landscape. For example, some areas in the southern US present conditions favorable to production of fast growing, heat tolerant pine species, while more northern regions often favor slower-growing hardwood and softwood species. Moreover, some lands are currently not suitable for forest production (e.g., the arid western plains). Similarly, in agriculture, the US has areas where citrus and cotton can be grown and other areas where barley and wheat are more suitable. This diversity across the landscape causes differential GHGE mitigation potential in the face of climatic changes and/or responses to policy or price incentives. It is difficult for a reasonably sized global IAM system to reflect the full range of sub-national geographic AF production possibilities alluded to above. AF response in the face of climate change altered temperature precipitation regimes or mitigation incentives will likely involve region-specific shifts in land use and agricultural/forest production. This chapter addresses AF sectoral responses in climate change mitigation analysis. Specifically, we draw upon US-based studies of AF GHGE mitigation possibilities that incorporate sub-national detail drawing largely on a body of studies done by the authors in association with

  4. Versatile gas gun target assembly for studying blast wave mitigation in materials

    NASA Astrophysics Data System (ADS)

    Bartyczak, S.; Mock, W., Jr.

    2012-03-01

    Traumatic brain injury (TBI) has become a serious problem for military personnel returning from recent conflicts. This has increased interest in investigating blast mitigating materials for use in helmets. In this paper we describe a new versatile target assembly that is used with an existing gas gun for studying these materials.

  5. CASE STUDY OF RADON DIAGNOSTICS AND MITIGATION IN A NEW YORK STATE SCHOOL

    EPA Science Inventory

    The paper discusses a case study of radon diagnostics and mitigation performed by EPA in a New York State school building. esearch focused on active subslab depressurization (ASD) in the basement and, to a lesser degree, the potential for radon reduction in the basement and slab-...

  6. Changing Family Habits: A Case Study into Climate Change Mitigation Behavior in Families

    ERIC Educational Resources Information Center

    Leger, Michel T.; Pruneau, Diane

    2012-01-01

    A case-study methodology was used to explore the process of change as experienced by 3 suburban families in an attempt to incorporate climate change mitigation behavior into their day to day life. Cross-case analysis of the findings revealed the emergence of three major conceptual themes associated with behavior adoption: collectively applied…

  7. Runaway electrons and mitigation studies in MST tokamak plasmas

    NASA Astrophysics Data System (ADS)

    Goetz, J. A.; Chapman, B. E.; Almagri, A. F.; Cornille, B. S.; Dubois, A.; McCollam, K. J.; Munaretto, S.; Sovinec, C. R.

    2016-10-01

    Studies of runaway electrons generated in low-density MST tokamak plasmas are being undertaken. The plasmas have Bt <= 0.14 T, Ip <= 50 kA, q(a) = 2.2, and an electron density and temperature of about 5 × 10^17 m^-3 and 150 eV. Runaway electrons are detected via x-ray bremsstrahlung emission. The density and electric field thresholds for production and suppression have been previously explored with variations in gas puffing for density control. Runaway electrons are now being probed with resonant magnetic perturbations (RMPs). An m = 3 RMP strongly suppresses the runaway electrons, and initial NIMROD modeling shows that this may be due to degradation of flux surfaces. The RMP is produced by a poloidal array of 32 saddle coils at the narrow vertical insulated cut in MST's thick conducting shell, with each RMP having a single m but a broad n spectrum. While a sufficiently strong m = 3 RMP suppresses the runaway electrons, an RMP with m = 1 and comparable amplitude has little effect. The impact of the RMPs on the magnetic topology of these plasmas is being studied with the nonlinear MHD code NIMROD. With an m = 3 RMP, stochasticity is introduced in the outer third of the plasma, but no such flux surface degradation is observed with an m = 1 RMP. NIMROD also predicts regularly occurring MHD activity similar to that observed in the experiment. These studies have also been done in q(a) = 2.7 plasmas, and analysis and modeling are ongoing. This work is supported by the US DoE.

  8. Experimental study of blast mitigating devices based on combined construction

    NASA Astrophysics Data System (ADS)

    Takayama, K.; Silnikov, M. V.; Chernyshov, M. V.

    2016-09-01

    A robust blast inhibiting bin is the most often used device for damage blast effects suppression. In particular, a top open cylindrical bin significantly reduces a fragmentation effect resulted from a detonation of an explosive device placed inside the bin. However, reduction of blast wave overpressure and impulse by such cylindrical bins is not sufficient [1]. A reasonable alternative to endless increase of height and thickness of robust blast inhibiting bins is a development of destructible inhibitors having no solid elements in their structure and, therefore, excluding secondary fragmentation. So, the family of "Fountain" inhibitors [2,3] localizes and suppresses damaging blast effects due to multiphase working system. The present study is analyzing data obtained in testing of prototypes of new combined inhibitors. Their structure combines robust elements (bottoms, side surfaces) with elements responsible for blast loads reduction due to multi-phase working system (top and low transverse embeddings) and fairings impeding wave propagation in undesirable directions.

  9. Feasibility study of tank leakage mitigation using subsurface barriers

    SciTech Connect

    Treat, R.L.; Peters, B.B.; Cameron, R.J.; McCormak, W.D.; Trenkler, T.; Walters, M.F.; Rouse, J.K.; McLaughlin, T.J.; Cruse, J.M.

    1994-09-21

    The US Department of Energy (DOE) has established the Tank Waste Remediation System (TWRS) to safely manage and dispose of the waste currently stored in the underground storage tanks. The retrieval element of TWRS includes a work scope to develop subsurface impermeable barriers beneath SSTs. The barriers could serve as a means to contain leakage that may result from waste retrieval operations and could also support site closure activities by facilitating cleanup. Three types of subsurface barrier systems have emerged for further consideration: (1) chemical grout, (2) freeze walls, and (3) desiccant, represented in this feasibility study as a circulating air barrier. This report contains analyses of the costs and relative risks associated with combinations of retrieval technologies and barrier technologies that form 14 alternatives. Eight of the alternatives include the use of subsurface barriers; the remaining six nonbarrier alternatives are included in order to compare the costs, relative risks, and other values of retrieval with subsurface barriers. Each alternative includes various combinations of technologies that can impact the risks associated with future contamination of the groundwater beneath the Hanford Site to varying degrees. Other potential risks associated with these alternatives, such as those related to accidents and airborne contamination resulting from retrieval and barrier emplacement operations, are not quantitatively evaluated in this report.

  10. Computational Study of Scenarios Regarding Explosion Risk Mitigation

    NASA Astrophysics Data System (ADS)

    Vlasin, Nicolae-Ioan; Mihai Pasculescu, Vlad; Florea, Gheorghe-Daniel; Cornel Suvar, Marius

    2016-10-01

    Exploration to discover new deposits of natural gas, upgraded techniques to exploit these resources, and new ways to convert the heat capacity of these gases into industrially usable energy are research areas of great interest around the globe. But all activities involving the handling of natural gas (exploitation, transport, combustion) are subject to the same type of risk: the risk of explosion. Experiments on physical scenarios to determine ways to reduce this risk can be extremely costly, requiring suitable premises, equipment and apparatus, manpower, time and, not least, presenting the risk of personnel injury. Taking into account the above, the present paper deals with the possibility of studying gas explosion type events in the virtual domain, exemplified by a computer simulation of a stoichiometric air - methane explosion (methane is the main component of natural gas). The advantages of the computer-assisted approach are the possibility of using complex virtual geometries of any form as the area in which the phenomenon unfolds, the use of the same geometry for an infinite number of settings of initial input parameters, total elimination of the risk of personnel injury, decreased execution time, etc. Although computer simulations consume considerable hardware resources and require specialized personnel to use CFD (Computational Fluid Dynamics) techniques, the costs and risks associated with these methods are greatly diminished, while presenting, at the same time, a major benefit in terms of execution time.
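
    As a quick cross-check of the stoichiometric air - methane mixture referred to above (a back-of-the-envelope sketch, not part of the study's CFD workflow), the reaction CH4 + 2 O2 -> CO2 + 2 H2O together with an assumed 20.9 vol% O2 in air gives a stoichiometric methane concentration of roughly 9.5 vol%:

        # Stoichiometric methane-air mixture on a volume basis.
        # The 20.9% O2 content of dry air is a standard assumption.
        O2_FRACTION_IN_AIR = 0.209   # volume fraction of O2 in dry air
        O2_PER_CH4 = 2.0             # CH4 + 2 O2 -> CO2 + 2 H2O

        air_per_ch4 = O2_PER_CH4 / O2_FRACTION_IN_AIR      # moles of air per mole of CH4
        ch4_stoich_fraction = 1.0 / (1.0 + air_per_ch4)    # CH4 share of the whole mixture

        print(f"Stoichiometric CH4 concentration: {ch4_stoich_fraction:.1%}")
        # -> about 9.5 vol%, between the ~5-15 vol% flammability limits of methane in air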

  11. Studies of millimeter-wave phenomenology for helicopter brownout mitigation

    NASA Astrophysics Data System (ADS)

    Schuetz, Christopher A.; Stein, E. Lee, Jr.; Samluk, Jesse; Mackrides, Daniel; Wilson, John P.; Martin, Richard D.; Dillon, Thomas E.; Prather, Dennis W.

    2009-09-01

    The unique ability of the millimeter-wave portion of the spectrum to penetrate typical visual obscurants has resulted in a wide range of possible applications for imagers in this spectrum. Of particular interest to the military community are imagers that can operate effectively in Degraded Visual Environments (DVE's) experienced by helicopter pilots when landing in dry, dusty environments, otherwise known as "brownout." One of the first steps to developing operational requirements for imagers in this spectrum is to develop a quantitative understanding of the phenomenology that governs imaging in these environments. While preliminary studies have been done in this area, quantitative, calibrated measurements of typical targets and degradation of target contrasts due to brownout conditions are not available. To this end, we will present results from calibrated, empirical measurements of typical targets of interest to helicopter pilots made in a representative desert environment. In addition, real-time measurements of target contrast reduction due to brownout conditions generated by helicopter downwash will be shown. These data were acquired using a W-band, dual-polarization radiometric scanner using optical-upconversion detectors.

  12. A study of shock mitigating materials in a split Hopkinson bar configuration. Phase 1

    SciTech Connect

    Bateman, V.I.; Brown, F.A.; Hansen, N.R.

    1998-06-01

    Sandia National Laboratories (SNL) designs mechanical systems with electronics that must survive high shock environments. These mechanical systems include penetrators that must survive soil, rock, and ice penetration, nuclear transportation casks that must survive transportation environments, and laydown weapons that must survive delivery impact of 125 fps. These mechanical systems contain electronics that may operate during and after the high shock environment and that must be protected from the high shock environments. A study has been started to improve the packaging techniques for the advanced electronics utilized in these mechanical systems because current packaging techniques are inadequate for these more sensitive electronics. In many cases, it has been found that the packaging techniques currently used not only do not mitigate the shock environment but actually amplify it. An ambitious goal for this packaging study is to avoid amplification and possibly attenuate the shock environment before it reaches the electronics contained in the various mechanical systems. As part of the investigation of packaging techniques, a two-phase study of shock mitigating materials is being conducted. The purpose of the first phase, reported here, is to examine the performance of a joint that consists of shock mitigating material sandwiched between steel and to compare the performance of the shock mitigating materials. A split Hopkinson bar experimental configuration simulates this joint and has been used to study the shock mitigating characteristics of seventeen unconfined materials. The nominal input for these tests is an incident compressive wave with a 50 fps peak (1,500 με peak) amplitude and a 100 μs duration (measured at 10% amplitude).
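
    For context, the standard one-wave (Kolsky) data reduction for a split Hopkinson pressure bar converts the measured transmitted and reflected bar strains into specimen stress and strain. The sketch below illustrates that reduction with hypothetical bar properties and placeholder strain histories; it is not the SNL data-reduction code.

        # One-wave split Hopkinson bar reduction (Kolsky analysis); all numbers are placeholders.
        import numpy as np

        E_bar = 200e9                        # bar Young's modulus [Pa] (steel, assumed)
        rho_bar = 7800.0                     # bar density [kg/m^3]
        c0 = np.sqrt(E_bar / rho_bar)        # elastic wave speed in the bar [m/s]
        A_bar, A_spec = 2.85e-4, 2.0e-4      # bar / specimen cross-sections [m^2]
        L_spec = 0.01                        # specimen length [m]

        t = np.linspace(0.0, 100e-6, 500)            # 100 us record
        eps_t = 1.5e-3 * np.sin(np.pi * t / t[-1])   # transmitted bar strain (placeholder pulse)
        eps_r = -0.5e-3 * np.sin(np.pi * t / t[-1])  # reflected bar strain (placeholder pulse)

        stress = E_bar * (A_bar / A_spec) * eps_t         # specimen stress history
        strain_rate = -2.0 * c0 / L_spec * eps_r          # specimen strain-rate history
        strain = np.cumsum(strain_rate) * (t[1] - t[0])   # integrate to get specimen strain

        print(f"peak stress ~ {stress.max()/1e6:.0f} MPa, final strain ~ {strain[-1]:.3f}")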

  13. The critical need for moderate to high resolution thermal infrared data for volcanic hazard mitigation and process monitoring from the micron to the kilometer scale

    NASA Astrophysics Data System (ADS)

    Ramsey, M. S.

    2006-12-01

    thermally-elevated pixels (max = 25.9 C) clustered near the summit with a lesser anomaly (max = 15.5 C) approximately 650 m to the southwest and down slope from the summit. Such small-scale and low-grade thermal features confirmed the increased activity state of the volcano and were only made possible with the moderate spatial, spectral, and radiometric resolution of ASTER. ASTER continued to collect data for the next 12 weeks tracking the progress of large scale pyroclastic flows, the growth of the lava dome, and the path of ash-rich plumes. Data from these observations were reported world-wide and used for evacuation and hazard planning purposes. With the pending demise of such TIR data from orbit, research is also focused on the use of handheld TIR instruments such as the forward-looking infrared radiometer (FLIR) camera. These instruments provide the highest spatial resolution in-situ TIR data and have been used to observe numerous volcanic phenomena and quantitatively model others (e.g., the rise of the magma body preceding the eruption of Mt. St. Helens Volcano; the changes on the lava dome at Bezymianny Volcano; the behavior of basalt crusts during pahoehoe flow inflation). Studies such as these confirm the utility and importance of future moderate to high resolution TIR data in order to understand volcanic processes and their accompanying hazards.

  14. Seismic Hazard characterization study using an earthquake source with Probabilistic Seismic Hazard Analysis (PSHA) method in the Northern of Sumatra

    NASA Astrophysics Data System (ADS)

    Yahya, A.; Palupi, M. I. R.; Suharsono

    2016-11-01

    The Sumatra region is one of the earthquake-prone areas in Indonesia because it lies on an active tectonic zone. In 2004 an earthquake with a moment magnitude of 9.2 occurred off the coast, about 160 km west of Nanggroe Aceh Darussalam, and triggered a tsunami. These events caused many casualties and large material losses, especially in the Province of Nanggroe Aceh Darussalam and North Sumatra. To minimize the impact of earthquake disasters, a fundamental assessment of the earthquake hazard in the region is needed. Stages of the research include a literature study, collection and processing of seismic data, seismic source characterization, and analysis of earthquake hazard by the probabilistic method (PSHA) using an earthquake catalog from 1907 through 2014. The earthquake hazard is represented by the value of Peak Ground Acceleration (PGA) and Spectral Acceleration (SA) at periods of 0.2 and 1 second on bedrock, presented in the form of a map with a return period of 2475 years, and by earthquake hazard curves for the cities of Medan and Banda Aceh.
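
    The 2475-year return period quoted above is the conventional "2% probability of exceedance in 50 years" hazard level. Under the usual Poisson assumption the conversion is P = 1 - exp(-t/T); a minimal sketch using only the numbers from the abstract:

        import math

        def prob_of_exceedance(return_period_yr, exposure_yr):
            # P = 1 - exp(-t / T), Poisson assumption
            return 1.0 - math.exp(-exposure_yr / return_period_yr)

        def return_period(prob, exposure_yr):
            # inverse relation: T = -t / ln(1 - P)
            return -exposure_yr / math.log(1.0 - prob)

        print(prob_of_exceedance(2475, 50))   # ~0.02, i.e. 2% in 50 years
        print(return_period(0.02, 50))        # ~2475 years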

  15. Methodological Issues In Forestry Mitigation Projects: A CaseStudy Of Kolar District

    SciTech Connect

    Ravindranath, N.H.; Murthy, I.K.; Sudha, P.; Ramprasad, V.; Nagendra, M.D.V.; Sahana, C.A.; Srivathsa, K.G.; Khan, H.

    2007-06-01

    There is a need to assess climate change mitigation opportunities in the forest sector in India in the context of methodological issues such as additionality, permanence, leakage, measurement and baseline development in formulating forestry mitigation projects. A case study of a forestry mitigation project in semi-arid community grazing lands and farmlands in Kolar district of Karnataka was undertaken with regard to baseline and project scenario development, estimation of carbon stock change in the project, leakage estimation and assessment of the cost-effectiveness of mitigation projects. Further, the transaction costs to develop the project, and the environmental and socio-economic impacts of the mitigation project, were assessed. The study shows the feasibility of establishing baselines and project C-stock changes. Since the area has low or insignificant biomass, leakage is not an issue. The overall mitigation potential in Kolar for a total area of 14,000 ha under various mitigation options is 278,380 tC at a rate of 20 tC/ha for the period 2005-2035, which is approximately 0.67 tC/ha/yr inclusive of harvest regimes under short rotation and long rotation mitigation options. The transaction cost for baseline establishment is less than a rupee/tC and for project scenario development is about Rs. 1.5-3.75/tC. The project enhances biodiversity and the socio-economic impact is also significant.
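
    The per-hectare and per-year figures quoted above are mutually consistent; a minimal arithmetic check using only the numbers stated in the abstract (the small gap to 278,380 tC reflects rounding of the 20 tC/ha rate):

        area_ha = 14_000            # project area [ha]
        rate_tC_per_ha = 20         # stated sequestration rate [tC/ha]
        years = 2035 - 2005         # project period [yr]

        total_tC = area_ha * rate_tC_per_ha   # 280,000 tC vs. the quoted 278,380 tC
        annual_rate = rate_tC_per_ha / years  # ~0.67 tC/ha/yr, as stated
        print(total_tC, round(annual_rate, 2))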

  16. Experimental Study on Electrostatic Hazards in Sprayed Liquid

    NASA Astrophysics Data System (ADS)

    Choi, Kwang Seok; Yamaguma, Mizuki; Ohsawa, Atsushi

    2007-12-01

    In this study, to evaluate ignition hazards in a paint process, electrostatic sparks in the sprayed area and the amount of charge generated while spraying were observed. With the objective of preventing accidents involving fires and/or explosions, we also examined the ignitability of a sprayed liquid by an electrostatic spark relative to the percentage of nitrogen (N2) in the compressed air supplied from the cylinder. For this study, an air-spray-type handheld gun with a 1-mm-internal-diameter orifice and a supply air pressure in the range of 0.1 to 1 MPa were used. With regard to the materials, water containing some sodium chloride was used to investigate the charge amount of the sprayed liquid, and kerosene was selected for ignition tests while spraying. Several electrostatic sparks in the sprayed region were observed while spraying. Some values of the electrostatic charge observed in the course of this study would be unsafe in the painting industry. Thus, if any of the conductive parts of the equipment are not grounded, incendiary electrostatic sparks can result. The ignitability of the sprayed liquid was markedly reduced when nitrogen-enriched air was substituted for pressurized pure air, and this effect increased with air pressure.

  17. Feasibility Study of Radiometry for Airborne Detection of Aviation Hazards

    NASA Technical Reports Server (NTRS)

    Gimmestad, Gary G.; Papanicolopoulos, Chris D.; Richards, Mark A.; Sherman, Donald L.; West, Leanne L.; Johnson, James W. (Technical Monitor)

    2001-01-01

    Radiometric sensors for aviation hazards have the potential for widespread and inexpensive deployment on aircraft. This report contains discussions of three aviation hazards - icing, turbulence, and volcanic ash - as well as candidate radiometric detection techniques for each hazard. Dual-polarization microwave radiometry is the only viable radiometric technique for detection of icing conditions, but more research will be required to assess its usefulness to the aviation community. Passive infrared techniques are being developed for detection of turbulence and volcanic ash by researchers in this country and also in Australia. Further investigation of the infrared airborne radiometric hazard detection approaches will also be required in order to develop reliable detection/discrimination techniques. This report includes a description of a commercial hyperspectral imager for investigating the infrared detection techniques for turbulence and volcanic ash.

  18. The fujairah united arab emirates (uae) (ml = 5.1) earthquake of march 11, 2002 a reminder for the immediate need to develop and implement a national hazard mitigation strategy

    NASA Astrophysics Data System (ADS)

    Al-Homoud, A.

    2003-04-01

    the epicenter of the earthquake. Indeed, the March 11, 2002 earthquake and its "aftershocks" scared the citizens of Masafi and surrounding regions and drew the attention of the public and government to the subject of earthquake hazard, especially as this earthquake came one year after the nearby destructive Indian m = 6.5 earthquake. Indeed, the recent destructive m = 6.2 June 22 earthquake that hit northwest Iran has again reminded the UAE public and government of the need to take quick and concrete steps to mitigate any anticipated earthquake hazard. This study reflects in some detail on the following aspects related to the region and vicinity: geological and tectonic setting, seismicity, the earthquake activity database and seismic hazard assessment. Moreover, it documents the following aspects of the March 11, 2002 earthquake: tectonic and seismological aspects, instrumental seismic data, aftershocks, strong motion recordings, response spectra and local site effect analysis, and geotechnical effects and structural observations in the region affected by the earthquake. The study identifies local site ground amplification effects and liquefaction hazard potential in some parts of the UAE. Moreover, the study reflects on the coverage of the incident in the media, public and government response, the state of earthquake engineering practice in the construction industry in the UAE, and national preparedness and public awareness issues. However, it is concluded for this event that the mild damages that occurred in the Masafi region were due to poor quality of construction and underestimation of the design base shear. Practical recommendations are suggested for the authorities to avoid damages in newly constructed buildings and lifelines as a result of future stronger earthquakes, in addition to recommendations on a national strategy for earthquake hazard mitigation in the UAE, which is still missing. The recommendations include the development and

  19. Volcanic hazard studies for the Yucca Mountain project

    SciTech Connect

    Crowe, B.; Turrin, B.; Wells, S.; Perry, F.; McFadden, L.; Renault, C.E.; Champion, D.; Harrington, C.

    1989-05-01

    Volcanic hazard studies are ongoing to evaluate the risk of future volcanism with respect to siting of a repository for disposal of high-level radioactive waste at the Yucca Mountain site. Seven Quaternary basaltic volcanic centers are located a minimum distance of 12 km and a maximum distance of 47 km from the outer boundary of the exploration block. The conditional probability of disruption of a repository by future basaltic volcanism is bounded by the range of 10^-8 to 10^-10 yr^-1. These values are currently being reexamined based on new developments in the understanding of the evolution of small-volume basaltic volcanic centers, including: (1) many, perhaps most, of the volcanic centers exhibit brief periods of eruptive activity separated by longer periods of inactivity; (2) the centers may be active for time spans exceeding 10^5 yrs; (3) there is a decline in the volume of eruptions of the centers through time; and (4) small-volume eruptions occurred at two of the Quaternary centers during latest Pleistocene or Holocene time. We classify the basalt centers as polycyclic, and distinguish them from polygenetic volcanoes. Polycyclic volcanism is characterized by small-volume, episodic eruptions of magma of uniform composition over time spans of 10^3 to 10^5 yrs. Magma eruption rates are low and the time between eruptions exceeds the cooling time of the magma volumes. 25 refs., 2 figs.
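
    Expressed as a probability of at least one disruption over a long performance period, the bounding annual rates above remain very small. A minimal sketch under a Poisson assumption; the 10,000-year period is an illustrative choice, not a figure from the abstract:

        import math

        # P(at least one disruptive event in t years) = 1 - exp(-rate * t)
        for annual_rate in (1e-8, 1e-10):      # bounding rates from the abstract [1/yr]
            t = 1e4                            # illustrative 10,000-year period
            p = 1.0 - math.exp(-annual_rate * t)
            print(f"rate {annual_rate:.0e}/yr -> P(disruption in {t:.0e} yr) ~ {p:.1e}")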

  20. Subsurface geology of Louisiana hazardous waste landfills: A case study

    NASA Astrophysics Data System (ADS)

    Hanor, J. S.

    1995-09-01

    Many hazardous waste sites in the south Louisiana Gulf Coast have been emplaced in sediments of Plio-Pleistocene to Recent age. Because of the fining-upward nature of these regressive-transgressive fluvial-deltaic sequences and the purported confining capabilities of the shallow clay layers within them, this area would seem to be ideal for the location of surface waste landfills. However, detailed geologic mapping at a site in southeastern Louisiana documents how the three-dimensional distribution of sediment types and early diagenetic features, both of which were ultimately controlled by depositional history, can increase the effective vertical permeability of fine-grained sequences. Many bodies of sand that appear to be isolated in standard geotechnical cross sections can be shown to be part of spatially complex three-dimensional distributary networks, with fine-grained sediments representing overbank and backswamp deposits. Some clay layers are actually a composite of thinner clay beds, each subjected to subaerial exposure and the development of secondary porosity related to soil formation. There has been documented leakage of wastes down through the clays, and a recent study indicates that the effective vertical hydraulic conductivity of the clay layers exceeds 10^-5 cm s^-1, or from one to four orders of magnitude higher than values measured on samples from cores of the same sediment. An understanding of the depositional framework, facies architecture, and diagenetic history of geologic materials underlying waste disposal sites in Louisiana is required for rational development of monitoring and remediation plans.

  1. Occupational Health Hazards among Healthcare Workers in Kampala, Uganda

    PubMed Central

    Yu, Xiaozhong; Buregyeya, Esther; Musoke, David; Wang, Jia-Sheng; Halage, Abdullah Ali; Whalen, Christopher; Bazeyo, William; Williams, Phillip; Ssempebwa, John

    2015-01-01

    Objective. To assess the occupational health hazards faced by healthcare workers and the mitigation measures. Methods. We conducted a cross-sectional study utilizing quantitative data collection methods among 200 respondents who worked in 8 major health facilities in Kampala. Results. Overall, 50.0% of respondents reported experiencing an occupational health hazard. Among these, 39.5% experienced biological hazards while 31.5% experienced nonbiological hazards. Predictors for experiencing hazards included not wearing the necessary personal protective equipment (PPE), working overtime, job related pressures, and working in multiple health facilities. Control measures to mitigate hazards were availing separate areas and containers to store medical waste and provision of safety tools and equipment. Conclusion. Healthcare workers in this setting experience several hazards in their workplaces. Associated factors include not wearing all necessary protective equipment, working overtime, experiencing work related pressures, and working in multiple facilities. Interventions should be instituted to mitigate the hazards. Specifically PPE supply gaps, job related pressures, and complacence in adhering to mitigation measures should be addressed. PMID:25802531

  2. The 5 key questions coping with risks due to natural hazards, answered by a case study

    NASA Astrophysics Data System (ADS)

    Hardegger, P.; Sausgruber, J. T.; Schiegg, H. O.

    2009-04-01

    Based on Maslow's hierarchy of needs, human endeavours concern primarily existential needs and, consequently, the need to be safeguarded against both natural and man-made threats. The subsequent needs are to realize opportunities in a variety of fields, such as economics and many others. Independently, the 5 crucial questions are the same as for coping with risks due to natural hazards specifically. These 5 key questions are: I) What is the impact as a function of space and time? II) What protection measures comply with the general opinion and how much do they mitigate the threat? III) How can the loss be adequately quantified and monetized? IV) What budget for prevention and what reserves for restoration and compensation are to be planned? V) Which mix of measures and allocation of resources is sustainable, and thus optimal? The 5 answers, exemplified by a case study concerning the sustainable management of risk due to debris flows of the Enterbach / Inzing / Tirol / Austria, are as follows: I) The impact, created by both the propagation of flooding and sedimentation, has been forecast by modeling (numerical simulation) the 30, 50, 100, 150, 300 and 1000-year debris flows. The input was specified by detailed studies in meteorology, precipitation and runoff; in geology, hydrogeology, geomorphology and slope stability; in hydraulics, sediment transport and debris flow; and in forestry, agriculture and development of communal settlement and infrastructure. All investigations were performed according to the method of ETAlp (Erosion and Transport in Alpine systems). ETAlp has been developed in order to achieve sustainable development in alpine areas and has been evaluated within the research project "nab", in the context of the EU-Interreg IIIb projects. II) The risk mitigation measures of concern are in hydraulics on the one hand and in forestry on the other. Such risk management is evaluated according to sustainability, which means economic, ecologic and social, in short, "triple

  3. HOUSEHOLD HAZARDOUS WASTE CHARACTERIZATION STUDY FOR PALM BEACH COUNTY, FLORIDA - A MITE PROGRAM EVALUATION

    EPA Science Inventory

    The objectives of the Household Hazardous Waste Characterization Study (the HHW Study) were to: 1) Quantify the annual household hazardous waste (HHW) tonnages disposed in Palm Beach County Florida's (the County) residential solid waste (characterized in this study as municipal s...

  4. Reducing risk from lahar hazards: concepts, case studies, and roles for scientists

    USGS Publications Warehouse

    Pierson, Thomas C.; Wood, Nathan J.; Driedger, Carolyn L.

    2014-01-01

    Lahars are rapid flows of mud-rock slurries that can occur without warning and catastrophically impact areas more than 100 km downstream of source volcanoes. Strategies to mitigate the potential for damage or loss from lahars fall into four basic categories: (1) avoidance of lahar hazards through land-use planning; (2) modification of lahar hazards through engineered protection structures; (3) lahar warning systems to enable evacuations; and (4) effective response to and recovery from lahars when they do occur. Successful application of any of these strategies requires an accurate understanding and assessment of the hazard, an understanding of the applicability and limitations of the strategy, and thorough planning. The human and institutional components leading to successful application can be even more important: engagement of all stakeholders in hazard education and risk-reduction planning; good communication of hazard and risk information among scientists, emergency managers, elected officials, and the at-risk public during crisis and non-crisis periods; sustained response training; and adequate funding for risk-reduction efforts. This paper reviews a number of methods for lahar-hazard risk reduction, examines the limitations and tradeoffs, and provides real-world examples of their application in the U.S. Pacific Northwest and in other volcanic regions of the world. An overriding theme is that lahar-hazard risk reduction cannot be effectively accomplished without the active, impartial involvement of volcano scientists, who are willing to assume educational, interpretive, and advisory roles to work in partnership with elected officials, emergency managers, and vulnerable communities.

  5. Management of agricultural soils for greenhouse gas mitigation: Learning from a case study in NE Spain.

    PubMed

    Sánchez, B; Iglesias, A; McVittie, A; Álvaro-Fuentes, J; Ingram, J; Mills, J; Lesschen, J P; Kuikman, P J

    2016-04-01

    A portfolio of agricultural practices is now available that can contribute to reaching European mitigation targets. Among them, the management of agricultural soils has a large potential for reducing GHG emissions or sequestering carbon. Many of the practices are based on well tested agronomic and technical know-how, with proven benefits for farmers and the environment. A suite of practices has to be used since none of the practices can provide a unique solution. However, there are limitations in the process of policy development: (a) agricultural activities are based on biological processes and thus, these practices are location specific and climate, soils and crops determine their agronomic potential; (b) since agriculture sustains rural communities, the costs and potential for implementation have also to be regionally evaluated and (c) the aggregated regional potential of the combination of practices has to be defined in order to inform abatement targets. We believe that, when implementing mitigation practices, three questions are important: Are they cost-effective for farmers? Do they reduce GHG emissions? What policies favour their implementation? This study addressed these questions in three sequential steps. First, mapping the use of representative soil management practices in the European regions to provide a spatial context to upscale the local results. Second, using a Marginal Abatement Cost Curve (MACC) in a Mediterranean case study (NE Spain) for ranking soil management practices in terms of their cost-effectiveness. Finally, using a wedge approach of the practices as a complementary tool to link science to mitigation policy. A set of soil management practices was found to be financially attractive for Mediterranean farmers, which in turn could achieve significant abatements (e.g., 1.34 MtCO2e in the case study region). The quantitative analysis was completed by a discussion of potential farming and policy choices to shape realistic mitigation policy at
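
    A marginal abatement cost curve is, at its core, a ranking of practices by cost per tonne of CO2e abated followed by a cumulative sum of their abatement potentials. The sketch below illustrates only that ranking step; the practice names, costs and potentials are invented placeholders, not values from the NE Spain case study.

        # Toy MACC: rank practices by marginal cost, then accumulate abatement.
        practices = [
            # (name, cost [EUR/tCO2e], abatement potential [MtCO2e/yr]) - illustrative only
            ("no-till",                   -15.0, 0.40),  # negative cost = net saving to the farmer
            ("optimised N fertilisation",  -8.0, 0.25),
            ("cover crops",                 5.0, 0.30),
            ("residue incorporation",      20.0, 0.15),
        ]

        cumulative = 0.0
        for name, cost, potential in sorted(practices, key=lambda p: p[1]):
            cumulative += potential
            print(f"{name:26s} {cost:7.1f} EUR/tCO2e   cumulative {cumulative:.2f} MtCO2e/yr")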

  6. Development, Implementation, and Pilot Evaluation of a Model-Driven Envelope Protection System to Mitigate the Hazard of In-Flight Ice Contamination on a Twin-Engine Commuter Aircraft

    NASA Technical Reports Server (NTRS)

    Martos, Borja; Ranaudo, Richard; Norton, Billy; Gingras, David; Barnhart, Billy

    2014-01-01

    Fatal loss-of-control accidents have been directly related to in-flight airframe icing. The prototype system presented in this report directly addresses the need for real-time onboard envelope protection in icing conditions. The combination of prior information and real-time aerodynamic parameter estimations are shown to provide sufficient information for determining safe limits of the flight envelope during inflight icing encounters. The Icing Contamination Envelope Protection (ICEPro) system was designed and implemented to identify degradations in airplane performance and flying qualities resulting from ice contamination and provide safe flight-envelope cues to the pilot. The utility of the ICEPro system for mitigating a potentially hazardous icing condition was evaluated by 29 pilots using the NASA Ice Contamination Effects Flight Training Device. Results showed that real time assessment cues were effective in reducing the number of potentially hazardous upset events and in lessening exposure to loss of control following an incipient upset condition. Pilot workload with the added ICEPro displays was not measurably affected, but pilot opinion surveys showed that real time cueing greatly improved their awareness of a hazardous aircraft state. The performance of ICEPro system was further evaluated by various levels of sensor noise and atmospheric turbulence.

  7. 44 CFR 78.5 - Flood Mitigation Plan development.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 44 Emergency Management and Assistance 1 2012-10-01 2011-10-01 true Flood Mitigation Plan..., DEPARTMENT OF HOMELAND SECURITY INSURANCE AND HAZARD MITIGATION National Flood Insurance Program FLOOD MITIGATION ASSISTANCE § 78.5 Flood Mitigation Plan development. A Flood Mitigation Plan will articulate...

  8. 44 CFR 78.5 - Flood Mitigation Plan development.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 44 Emergency Management and Assistance 1 2013-10-01 2013-10-01 false Flood Mitigation Plan..., DEPARTMENT OF HOMELAND SECURITY INSURANCE AND HAZARD MITIGATION National Flood Insurance Program FLOOD MITIGATION ASSISTANCE § 78.5 Flood Mitigation Plan development. A Flood Mitigation Plan will articulate...

  9. 44 CFR 78.5 - Flood Mitigation Plan development.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 44 Emergency Management and Assistance 1 2011-10-01 2011-10-01 false Flood Mitigation Plan..., DEPARTMENT OF HOMELAND SECURITY INSURANCE AND HAZARD MITIGATION National Flood Insurance Program FLOOD MITIGATION ASSISTANCE § 78.5 Flood Mitigation Plan development. A Flood Mitigation Plan will articulate...

  10. 44 CFR 78.5 - Flood Mitigation Plan development.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 44 Emergency Management and Assistance 1 2010-10-01 2010-10-01 false Flood Mitigation Plan..., DEPARTMENT OF HOMELAND SECURITY INSURANCE AND HAZARD MITIGATION National Flood Insurance Program FLOOD MITIGATION ASSISTANCE § 78.5 Flood Mitigation Plan development. A Flood Mitigation Plan will articulate...

  11. 44 CFR 78.5 - Flood Mitigation Plan development.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 44 Emergency Management and Assistance 1 2014-10-01 2014-10-01 false Flood Mitigation Plan..., DEPARTMENT OF HOMELAND SECURITY INSURANCE AND HAZARD MITIGATION National Flood Insurance Program FLOOD MITIGATION ASSISTANCE § 78.5 Flood Mitigation Plan development. A Flood Mitigation Plan will articulate...

  12. Best Practices in Grid Integration of Variable Wind Power: Summary of Recent US Case Study Results and Mitigation Measures

    SciTech Connect

    Smith, J. Charles; Parsons, Brian; Acker, Thomas; Milligan, Michael; Zavidil, Robert; Schuerger, Matthew; DeMeo, Edgar

    2010-01-22

    This paper will summarize results from a number of utility wind integration case studies conducted recently in the US, and outline a number of mitigation measures based on insights from those studies.

  13. Geospatial Approach on Landslide Hazard Zonation Mapping Using Multicriteria Decision Analysis: A Study on Coonoor and Ooty, Part of Kallar Watershed, The Nilgiris, Tamil Nadu

    NASA Astrophysics Data System (ADS)

    Rahamana, S. Abdul; Aruchamy, S.; Jegankumar, R.

    2014-12-01

    zone I. Further, the resulting hazard zone map and land use/land cover map were overlaid to check the hazard status, and the existing inventory of known landslides within the present study area was compared with the resulting vulnerability and hazard zone maps. The landslide hazard zonation map is useful for landslide hazard prevention, mitigation, and improvement to society, and for proper planning of land use and construction in the future.

  14. Factors in Perception of Tornado Hazard: An Exploratory Study.

    ERIC Educational Resources Information Center

    de Man, Anton; Simpson-Housley, Paul

    1987-01-01

    Administered questionnaire on tornado hazard to 142 adults. Results indicated that subject's gender and education level were best predictors of perceived probability of tornado recurrence; that ratings of severity of potential damage were related to education level; and that gender accounted for significant percentage of variance in anxiety…

  15. Sonic boom focusing prediction and delta wing shape optimization for boom mitigation studies

    NASA Astrophysics Data System (ADS)

    Khasdeo, Nitin

    Supersonic travel over land would become a reality if new aircraft were designed such that they produce quieter ground sonic booms, no louder than 0.3 psf according to the FAA requirement. An attempt is made to address the challenging goal of predicting sonic boom focusing effects and mitigating the sonic boom ground overpressure for a delta wing geometry. Sonic boom focusing is fundamentally a nonlinear phenomenon and can be predicted by numerically solving the nonlinear Tricomi equation. A conservative time-domain scheme is developed to carry out the sonic boom focusing, or superboom, studies; the computational scheme is a type-differencing scheme solved in the time domain, referred to as a conservative type-difference solution. The finite volume method is used on a structured grid topology. A number of input signals (Concorde wave, symmetric and axisymmetric ramps, flat-top and typical N-wave types) are simulated for sonic boom focusing prediction. A parametric study is carried out in order to investigate the effects of several key parameters that affect the magnitude of shock wave amplification and the location of the surface of amplification, or "caustic surface." The parametric studies include the effects of the longitudinal and lateral boundaries, the footprint and initial shock strength of the incoming wave, and the type of input signal on sonic boom focusing. Another very important aspect is the set of mitigation strategies for the sonic boom ground signature. Aerodynamic reshaping and geometric optimization are the main approaches for mitigating the ground signal to the acceptance level of the FAA. A biconvex delta wing geometry with a chord length of 60 ft and a maximum thickness ratio of 5% of the chord is used as the baseline model for the fundamental research. The wing flies at an altitude of 40,000 ft at a Mach number of 2.0. Boom mitigation work is focused on investigating the effects of wing thickness ratio, wing camber ratio, wing
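
    For reference (not quoted from the abstract), the classical linear Tricomi equation, of which the focusing equation solved here is a nonlinear extension, reads

        \[
          y\,\frac{\partial^{2} u}{\partial x^{2}} + \frac{\partial^{2} u}{\partial y^{2}} = 0 ,
        \]

    and changes type across y = 0 (elliptic for y > 0, hyperbolic for y < 0), which is what makes the caustic, or focusing, region numerically delicate. The nonlinear variant used for superboom prediction adds a term in the disturbance itself and is not reproduced here.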

  16. STUDY ON AIR INGRESS MITIGATION METHODS IN THE VERY HIGH TEMPERATURE GAS COOLED REACTOR (VHTR)

    SciTech Connect

    Chang H. Oh

    2011-03-01

    An air-ingress accident following a pipe break is considered a critical event for a very high temperature gas-cooled reactor (VHTR). Following helium depressurization, it is anticipated that, unless countermeasures are taken, air will enter the core through the break, leading to oxidation of the in-core graphite structure. Thus, without mitigation features, this accident might lead to severe exothermic chemical reactions of graphite and oxygen. Under extreme circumstances, a loss of core structural integrity may occur along with excessive release of radiological inventory. Idaho National Laboratory, under the auspices of the U.S. Department of Energy, is performing research and development (R&D) that focuses on key phenomena important during challenging scenarios that may occur in the VHTR. Phenomena Identification and Ranking Table (PIRT) studies to date have identified the air ingress event, following on the heels of a VHTR depressurization, as very important (Oh et al. 2006, Schultz et al. 2006). Consequently, the development of advanced air ingress-related models and verification and validation (V&V) requirements are part of the experimental validation plan. This paper discusses various air-ingress mitigation concepts applicable to VHTRs. The study begins by identifying important factors (or phenomena) associated with the air-ingress accident using a root-cause analysis. By preventing the main causes of the important events identified in the root-cause diagram, the basic air-ingress mitigation ideas can be conceptually derived. The main concepts include (1) preventing structural degradation of graphite supporters; (2) preventing local stress concentration in the supporter; (3) preventing graphite oxidation; (4) preventing air ingress; (5) preventing density-gradient-driven flow; (6) preventing fluid density gradients; (7) preventing fluid temperature gradients; and (8) preventing high temperature. Based on the basic concepts listed above, various air

  17. Method Study of Flood Hazard Analysis for Plain River Network Area, Taihu Basin, China

    NASA Astrophysics Data System (ADS)

    HAN, C.; Liu, S.; Zhong, G.; Zhang, X.

    2015-12-01

    Flood is one of the most common and serious natural calamities. Taihu Basin is located in the delta region of the Yangtze River in East China (see Fig. 1). Because of abundant rainfall and low-lying terrain, the area frequently suffers from flood hazards, which have caused serious casualties and economic losses. In order to reduce the severe impacts of flood events, numerous polder areas and hydraulic structures (including pumps, water gates, etc.) were constructed. Flood hazard maps are an effective non-structural flood mitigation measure. Numerical simulation of flood propagation is one of the key technologies of flood hazard mapping. Because of the complexity of the underlying surface characteristics, numerical simulation of flood propagation faces some special problems in the plain river network area of Taihu Basin. In this paper, a coupled one- and two-dimensional hydrodynamic model was established. Densely distributed and interconnected river networks, numerous polder areas, and hydraulic structures with complex scheduling were generalized in the model. The model proved to be credible and stable. Based on the results of the simulation of flood propagation, a flood hazard map was compiled.

  18. The Use of Geospatial Technologies in Flood Hazard Mapping and Assessment: Case Study from River Evros

    NASA Astrophysics Data System (ADS)

    Mentzafou, Angeliki; Markogianni, Vasiliki; Dimitriou, Elias

    2017-02-01

    Many scientists link climate change to the increased frequency of extreme weather phenomena, which, combined with land use changes, often lead to disasters with severe social and economic effects. Floods in particular, as a consequence of heavy rainfall, can put vulnerable human and natural systems such as transboundary wetlands at risk. In order to meet the European Directive 2007/60/EC requirements for the development of flood risk management plans, the flood hazard map of the Evros transboundary watershed was produced using a grid-based GIS modelling method that aggregates the main factors related to the development of floods: topography, land use, geology, slope, flow accumulation and rainfall intensity. The verification of this tool was achieved through comparison between the produced hazard map and the inundation maps derived from supervised classification of Landsat 5 and 7 satellite imagery of four flood events that took place in the proximity of the Evros delta, a wetland of international importance. The comparison of the modelled output (high and very high flood hazard areas) with the extent of the inundated areas as mapped from the satellite data indicated the satisfactory performance of the model. Furthermore, the vulnerability of each land use against flood events was examined. Geographically Weighted Regression has also been applied between the final flood hazard map and the major factors in order to ascertain their contribution to flood events. The results corroborated the existence of a strong relationship between land uses and flood hazard, indicating the flood susceptibility of the lowlands and agricultural land. A dynamic transboundary flood hazard management plan should be developed in order to meet the Flood Directive requirements for adequate and coordinated mitigation practices to reduce flood risk.
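
    The grid-based aggregation described above is essentially a weighted overlay of normalized factor rasters. The sketch below illustrates the idea; the factor names, weights and random rasters are placeholders and do not reproduce the Evros analysis.

        # Toy grid-based flood hazard overlay: normalize each factor raster to 0-1,
        # combine with weights, then classify into five hazard classes.
        import numpy as np

        rng = np.random.default_rng(0)
        shape = (100, 100)                               # raster grid (placeholder)
        factors = {                                      # placeholder factor rasters
            "slope": rng.random(shape),
            "flow_accumulation": rng.random(shape),
            "rainfall_intensity": rng.random(shape),
            "land_use_score": rng.random(shape),
        }
        weights = {"slope": 0.30, "flow_accumulation": 0.30,
                   "rainfall_intensity": 0.25, "land_use_score": 0.15}

        def normalize(r):
            return (r - r.min()) / (r.max() - r.min())

        hazard = sum(weights[k] * normalize(v) for k, v in factors.items())
        classes = np.digitize(hazard, np.quantile(hazard, [0.2, 0.4, 0.6, 0.8]))
        print(classes.shape, classes.min(), classes.max())   # classes 0 (very low) .. 4 (very high)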

  20. Water Induced Hazard Mapping in Nepal: A Case Study of East Rapti River Basin

    NASA Astrophysics Data System (ADS)

    Neupane, N.

    2010-12-01

    This paper presents an illustration of typical water-induced hazard mapping of the East Rapti River Basin under the DWIDP, GON. The basin covers an area of 2398 sq km. The methodology includes preparation of a base map of water-induced disasters in the basin. Landslide hazard maps were prepared by the SINMAP approach. Debris flow hazard maps were prepared by considering geology, slope, and saturation. Flood hazard maps were prepared using two approaches: HEC-RAS and satellite imagery interpretation. The composite water-induced hazard maps were produced by compiling the hazards rendered by landslides, debris flows, and floods. The average monsoon rainfall in the basin is 1907 mm, whereas the maximum 24-hour precipitation is 456.8 mm. The peak discharge of the Rapti River in 1993 at the station was 1220 cu m/sec. This discharge nearly corresponds to the discharge of the 100-year return period. The landslides, floods, and debris flows triggered by the heavy rain of July 1993 claimed 265 lives, affected 148,516 people, and damaged 1500 houses in the basin. The field investigation and integrated GIS interpretation showed that the very high and high landslide hazard zones collectively cover 38.38% of the watershed, the debris flow hazard zone constitutes 6.58%, and the high flood hazard zone occupies 4.28% of the watershed area. Mitigation measures are recommended according to the Integrated Watershed Management Approach, under which non-structural and structural measures are proposed. The non-structural measures include disaster management training, formulation of an evacuation system (arrangement of an information plan about disasters), agricultural management practices, protection of water sources, slope protection, and removal of excessive bed load from the river channel. Similarly, structural measures such as dikes, spurs, rehabilitation of existing preventive measures, and river training at some locations are recommended. The major factors that have contributed to induce high incidences of various types of mass
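
    The SINMAP approach mentioned above is built on the infinite-slope stability model. The sketch below shows the generic infinite-slope factor of safety with seepage parallel to the slope; the soil parameters are placeholders, and SINMAP itself works with a dimensionless stability index rather than this direct form.

        import math

        def infinite_slope_fs(c_eff, phi_deg, slope_deg, depth, wet_depth,
                              gamma=19.0, gamma_w=9.81):
            """Infinite-slope factor of safety, seepage parallel to the slope.

            c_eff: effective cohesion [kPa], phi_deg: friction angle [deg],
            depth: failure-plane depth [m], wet_depth: saturated depth [m],
            gamma / gamma_w: soil / water unit weight [kN/m^3].
            """
            beta = math.radians(slope_deg)
            phi = math.radians(phi_deg)
            u = gamma_w * wet_depth * math.cos(beta) ** 2   # pore pressure at the failure plane
            resisting = c_eff + (gamma * depth * math.cos(beta) ** 2 - u) * math.tan(phi)
            driving = gamma * depth * math.sin(beta) * math.cos(beta)
            return resisting / driving

        # Placeholder values: 35-degree slope, 2 m deep failure plane, half saturated.
        print(round(infinite_slope_fs(5.0, 30.0, 35.0, 2.0, 1.0), 2))  # FS < 1 -> unstable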

  1. Hazard mitigation related to water and sediment fluxes in the Yellow River basin, China, based on comparable basins of the United States

    USGS Publications Warehouse

    Osterkamp, W.R.; Gray, J.R.

    2003-01-01

    The Yellow River, north-central China, and comparative rivers of the western United States, the Rio Grande and the Colorado River, derive much of their flows from melting snow at high elevations, but derive most of their sediment loads from semiarid central parts of the basins. The three rivers are regulated by large reservoirs that store water and sediment, causing downstream channel scour and, farther downstream, flood hazard owing to re-deposition of sediment. Potential approaches to reducing the continuing bed aggradation and increasing flood hazard along the lower Yellow River include flow augmentation, retirement of irrigation that decreases flows and increases erosion, and re-routing of the middle Yellow River to bypass large sediment inputs of the Loess Plateau.

  2. Hazardous waste cleanup: A case study for developing efficient programs

    SciTech Connect

    Elcock, D.; Puder, M.G.

    1995-06-01

    As officials in Pacific Basin Countries develop laws and policies for cleaning up hazardous wastes, experiences of countries with such instruments in place may be instructive. The United States has addressed cleanups of abandoned hazardous waste sites through the Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA). The US Congress enacted CERCLA in 1980. The task of cleaning up waste sites became larger and more costly than originally envisioned and as a result, Congress strengthened and expanded CERCLA in 1986. Today, many industry representatives, environmentalists, and other interested parties say the program is still costly and ineffective, and Congress is responding through a reauthorization process to change the law once again. Because the law and modifications to it can affect company operations and revenues, industries want to know the potential consequences of such changes. Argonne National Laboratory (ANL) recently developed a baseline for one economic sector -- the US energy industry -- against which impacts of proposed changes to CERCLA could be measured. Difficulties encountered in locating and interpreting the data for developing that baseline suggest that legislation should not only provide for meeting its stated goals (e.g., protection of human health and the environment) but also allow for its efficient evaluation over time. This lesson can be applied to any nation contemplating hazardous waste cleanup laws and policies.

  3. 14 CFR 437.55 - Hazard analysis.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... identify and characterize each of the hazards and assess the risk to public health and safety and the... consequence for each hazard before risk elimination or mitigation. (3) Ensure that the likelihood and consequence of each hazard meet the following criteria through risk elimination and mitigation measures:...

  4. Study of Landslide Disaster Prevention System in Malaysia as a Disaster Mitigation Prototype for South East Asia Countries

    NASA Astrophysics Data System (ADS)

    Koay, Swee Peng; Fukuoka, Hiroshi; Tien Tay, Lea; Murakami, Satoshi; Koyama, Tomofumi; Chan, Huah Yong; Sakai, Naoki; Hazarika, Hemanta; Jamaludin, Suhaimi; Lateh, Habibah

    2016-04-01

    Every year, hundreds of landslides occur in Malaysia and other tropical monsoon South East Asia countries. Preventing casualties and economic losses from rain-induced slope failures is therefore among the most important agenda items for those countries' governments. In Malaysia, millions of Malaysian Ringgit are allocated to slope monitoring and mitigation in every annual budget. Besides monitoring the slopes, here we propose an IT system which provides hazard map information, landslide historical information, slope failure prediction, knowledge on natural hazards, and information on evacuation centres via the internet, so that users can understand the risk of landslides as well as floods. Moreover, users can obtain information on rainfall intensity at the monitoring sites to predict the occurrence of slope failure. Furthermore, we are working with PWD, Malaysia to set the threshold value for the landslide prediction system, which will alert officers if there is a risk of slope failure at the monitoring sites, based on calculated rainfall intensity. Although IT plays a significant role in information dissemination, education is also important in disaster prevention: by educating school students to be more alert to natural hazards, a bottom-up approach emerges that alerts parents to natural hazards through conversation among family members, as most parents are busy and may not have time to attend natural hazard workshops. There are many races living in Malaysia, as in most South East Asian countries, and it is not easy to educate them with a single education method, as levels of living and education differ. We started landslide education workshops in primary schools in rural and urban areas in Malaysia. We found that we have to use the students' mother tongue while conducting natural hazard education for better understanding. We took questionnaires from the students before and after the education workshop. Learning from the questionnaire results, the students are

  5. Why so many sperm cells? Not only a possible means of mitigating the hazards inherent to human reproduction but also an indicator of an exaptation

    PubMed Central

    Barlow, Peter W.

    2016-01-01

    ABSTRACT Redundancy—the excess of supply over necessity—has recently been proposed for human sperm cells. However, the apparent superfluity of cell numbers may be necessary in order to circumvent the hazards, many of which can be quantified, that can occur during the transition from gametogenesis within the testes to zygosis within the female reproductive tract. Sperm cell numbers are directly related to testicular volume, and it is owing to a redundancy, and the possible exaptation, of this latter parameter that a putative excess of sperm cells is perceived. PMID:27574542

  6. Observational Studies of Earthquake Preparation and Generation to Mitigate Seismic Risks in Mines

    NASA Astrophysics Data System (ADS)

    Durrheim, R. J.; Ogasawara, H.; Nakatani, M.; Milev, A.; Cichowicz, A.; Kawakata, H.; Yabe, Y.; Murakami, O.; Naoi, M. M.; Moriya, H.; Satoh, T.

    2011-12-01

    We provide a status report on a 5-year project to monitor in-situ fault instability and strong motion in South African gold mines. The project has two main aims: (1) To learn more about earthquake preparation and generation mechanisms by deploying dense arrays of high-sensitivity sensors within rock volumes where mining is likely to induce significant seismic activity. (2) To upgrade the South African national surface seismic network in the mining districts. This knowledge will contribute to efforts to upgrade schemes of seismic hazard assessment and to limit and mitigate the seismic risks in deep mines. As of 31 July 2011, 46 boreholes totalling 1.9 km in length had been drilled at project sites at the Ezulwini, Moab-Khotsong and Driefontein gold mines. Several dozen more holes are still to be drilled. Acoustic emission sensors, strain- and tiltmeters, and controlled seismic sources are being installed to monitor the deformation of the rock mass, the accumulation of damage during the preparation phase, and changes in dynamic stress as the rupture front propagates. These data will be integrated with measurements of stope closure, stope strong motion, seismic data recorded by the mine-wide network, and stress modelling. Preliminary results will be reported at the AGU meeting. The project is endorsed by the Japan Science and Technology Agency (JST), the Japan International Cooperation Agency (JICA) and the South African government. It is funded by the JST-JICA program for Science and Technology Research Partnership for Sustainable Development (SATREPS), the Council for Scientific and Industrial Research (CSIR), the Council for Geoscience, the University of the Witwatersrand and the Department of Science and Technology. The contributions of Seismogen CC, OHMS Ltd, the AngloGold Ashanti Rock Engineering Applied Research Group, First Uranium, the Gold Fields Seismic Department and the Institute of Mine Seismology are gratefully acknowledged.

  7. Hazard interactions and interaction networks (cascades) within multi-hazard methodologies

    NASA Astrophysics Data System (ADS)

    Gill, Joel C.; Malamud, Bruce D.

    2016-08-01

    This paper combines research and commentary to reinforce the importance of integrating hazard interactions and interaction networks (cascades) into multi-hazard methodologies. We present a synthesis of the differences between multi-layer single-hazard approaches and multi-hazard approaches that integrate such interactions. This synthesis suggests that ignoring interactions between important environmental and anthropogenic processes could distort management priorities, increase vulnerability to other spatially relevant hazards or underestimate disaster risk. In this paper we proceed to present an enhanced multi-hazard framework through the following steps: (i) description and definition of three groups (natural hazards, anthropogenic processes and technological hazards/disasters) as relevant components of a multi-hazard environment, (ii) outlining of three types of interaction relationship (triggering, increased probability, and catalysis/impedance), and (iii) assessment of the importance of networks of interactions (cascades) through case study examples (based on the literature, field observations and semi-structured interviews). We further propose two visualisation frameworks to represent these networks of interactions: hazard interaction matrices and hazard/process flow diagrams. Our approach reinforces the importance of integrating interactions between different aspects of the Earth system, together with human activity, into enhanced multi-hazard methodologies. Multi-hazard approaches support the holistic assessment of hazard potential and consequently disaster risk. We conclude by describing three ways by which understanding networks of interactions contributes to the theoretical and practical understanding of hazards, disaster risk reduction and Earth system management. Understanding interactions and interaction networks helps us to better (i) model the observed reality of disaster events, (ii) constrain potential changes in physical and social vulnerability

  8. Hazard Interactions and Interaction Networks (Cascades) within Multi-Hazard Methodologies

    NASA Astrophysics Data System (ADS)

    Gill, Joel; Malamud, Bruce D.

    2016-04-01

    Here we combine research and commentary to reinforce the importance of integrating hazard interactions and interaction networks (cascades) into multi-hazard methodologies. We present a synthesis of the differences between 'multi-layer single hazard' approaches and 'multi-hazard' approaches that integrate such interactions. This synthesis suggests that ignoring interactions could distort management priorities, increase vulnerability to other spatially relevant hazards or underestimate disaster risk. We proceed to present an enhanced multi-hazard framework, through the following steps: (i) describe and define three groups (natural hazards, anthropogenic processes and technological hazards/disasters) as relevant components of a multi-hazard environment; (ii) outline three types of interaction relationship (triggering, increased probability, and catalysis/impedance); and (iii) assess the importance of networks of interactions (cascades) through case-study examples (based on literature, field observations and semi-structured interviews). We further propose visualisation frameworks to represent these networks of interactions. Our approach reinforces the importance of integrating interactions between natural hazards, anthropogenic processes and technological hazards/disasters into enhanced multi-hazard methodologies. Multi-hazard approaches support the holistic assessment of hazard potential, and consequently disaster risk. We conclude by describing three ways by which understanding networks of interactions contributes to the theoretical and practical understanding of hazards, disaster risk reduction and Earth system management. Understanding interactions and interaction networks helps us to better (i) model the observed reality of disaster events, (ii) constrain potential changes in physical and social vulnerability between successive hazards, and (iii) prioritise resource allocation for mitigation and disaster risk reduction.
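
    The interaction networks and cascades described in this and the preceding record lend themselves to a simple directed-graph representation, in which an edge from hazard A to hazard B means that A can trigger or increase the probability of B. The sketch below is illustrative only; the hazards and links are generic examples, not the case studies analysed by the authors.

    # Sketch of a hazard interaction network: a directed graph where an edge A -> B
    # means "A can trigger or increase the probability of B". Links are illustrative.
    from collections import deque

    interactions = {
        "earthquake": ["landslide", "tsunami"],
        "landslide": ["flood"],          # e.g. breach of a landslide dam
        "storm": ["flood", "landslide"],
        "flood": [],
        "tsunami": [],
    }

    def cascades_from(trigger, graph):
        """Breadth-first traversal returning all hazards reachable from one trigger."""
        reached, queue = set(), deque([trigger])
        while queue:
            hazard = queue.popleft()
            for follower in graph.get(hazard, []):
                if follower not in reached:
                    reached.add(follower)
                    queue.append(follower)
        return reached

    print("earthquake may cascade into:", sorted(cascades_from("earthquake", interactions)))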

  9. Landslide hazard evaluation: a review of current techniques and their application in a multi-scale study, Central Italy

    NASA Astrophysics Data System (ADS)

    Guzzetti, Fausto; Carrara, Alberto; Cardinali, Mauro; Reichenbach, Paola

    1999-12-01

    In recent years, growing population and expansion of settlements and life-lines over hazardous areas have largely increased the impact of natural disasters both in industrialized and developing countries. Third world countries have difficulty meeting the high costs of controlling natural hazards through major engineering works and rational land-use planning. Industrialized societies are increasingly reluctant to invest money in structural measures that can reduce natural risks. Hence, the new issue is to implement warning systems and land utilization regulations aimed at minimizing the loss of lives and property without investing in long-term, costly projects of ground stabilization. Government and research institutions worldwide have long attempted to assess landslide hazard and risks and to portray their spatial distribution in maps. Several different methods for assessing landslide hazard have been proposed or implemented, but the reliability of these maps and the criteria behind the hazard evaluations are ill-formalized or poorly documented. Geomorphological information remains largely descriptive and subjective. It is, hence, somewhat unsuitable to engineers, policy-makers or developers when planning land resources and mitigating the effects of geological hazards. In the Umbria and Marche Regions of Central Italy, attempts at testing the proficiency and limitations of multivariate statistical techniques and of different methodologies for dividing the territory into suitable areas for landslide hazard assessment have been completed, or are in progress, at various scales. These experiments showed that, despite the operational and conceptual limitations, landslide hazard assessment may indeed constitute a suitable, cost-effective aid to land-use planning. Within this framework, engineering geomorphology may play a renewed role in assessing areas at high landslide hazard, and helping mitigate the associated risk.

  10. Human Mars EDL Pathfinder Study: Assessment of Technology Development Gaps and Mitigations

    NASA Technical Reports Server (NTRS)

    Lillard, Randolph; Olejniczak, Joe; Polsgrove, Tara; Cianciolo, Alice Dwyer; Munk, Michelle; Whetsel, Charles; Drake, Bret

    2017-01-01

    This paper presents the results of a NASA-initiated Agency-wide assessment to better characterize the risks and potential mitigation approaches associated with landing human-class Entry, Descent, and Landing (EDL) systems on Mars. Due to the criticality and long-lead nature of advancing EDL techniques, it is necessary to determine an appropriate strategy to improve the capability to land large payloads. A key focus of this study was to understand the principal EDL risks and to determine what "must" be tested at Mars. This process identified the various risks and potential risk mitigation strategies, along with the key near-term technology development efforts required and the environments in which those technology demonstrations would best be carried out. The study identified key risks along with the advantages of each entry technology. In addition, it found that, provided the EDL concept of operations (con ops) minimizes large-scale transition events, there is no technology requirement for a Mars precursor demonstration. Instead, NASA should take a direct path to a human-scale lander.

  11. Prospective study of hepatic, renal, and haematological surveillance in hazardous materials firefighters

    PubMed Central

    Kales, S; Polyhronopoulos, G; Aldrich, J; Mendoza, P; Suh, J; Christiani, D

    2001-01-01

    OBJECTIVES—To evaluate possible health effects related to work with hazardous materials as measured by end organ effect markers in a large cohort over about 2 years, and in a subcohort over 5 years.
METHODS—Hepatic, renal, and haematological variables were analysed from 1996-98 in hazardous materials firefighters including 288 hazardous materials technicians (81%) and 68 support workers (19%). The same end organ effect markers in a subcohort of the technicians were also analysed (n=35) from 1993-98. Support workers were considered as controls because they are also firefighters, but had a low potential exposure to hazardous materials.
RESULTS—During the study period, no serious injuries or exposures were reported. For the end organ effect markers studied, no significant differences were found between technicians and support workers at either year 1 or year 3. After adjustment for a change in laboratory, no significant longitudinal changes were found within groups for any of the markers except for creatinine which decreased for both technicians (p<0.001) and controls (p<0.01).
CONCLUSIONS—Health effects related to work are infrequent among hazardous materials technicians. Haematological, hepatic, and renal testing is not required on an annual basis and has limited use in detecting health effects in hazardous materials technicians.


Keywords: hazardous materials; firefighters; medical surveillance PMID:11160986

  12. New debris flow mitigation measures in southern Gansu, China: a case study of the Zhouqu Region

    NASA Astrophysics Data System (ADS)

    Xiong, Muqi; Meng, Xingmin; Li, Yajun

    2014-05-01

    A devastating debris flow occurred in Zhouqu, Gansu Province, China, on 8 August 2010, resulting in a catastrophic disaster in which 1,463 people perished. The debris flow valleys, like numerous other debris flow valleys in this mountainous region, had preventive engineering works such as check dams, properly designed according to common engineering practice to safeguard the town located directly on the debris flow fan. However, failures of such preventive measures often cause even heavier disasters than events in areas without human intervention, because the mitigation works give a false impression of safety. Given this situation, and in order to explore a more effective disaster prevention strategy against debris flows in the mountainous region, this paper makes a comparative study of two cases in the area, one with preventive structures and one without. The results show that the inappropriate mitigation measures commonly applied in disaster reduction practice in the region are questionable. It is concluded that working with nature and following natural processes is the best disaster reduction strategy for the region. Key words: debris flow disasters, disaster reduction strategy, preventive measures

  13. Assessment of bio-physical drought hazards. A case study of Karkheh River basin in Iran

    NASA Astrophysics Data System (ADS)

    Kamali, Bahareh; Abbaspour, Karim; Houshmand Kouchi, Delaram; Yang, Hong

    2016-04-01

    Iran has been affected by frequent droughts, and climate change is expected to intensify the situation in the future. Extreme drought events have had serious impacts on the hydrological and agricultural sectors. Identification of bio-physical drought hazard is therefore critically important for formulating effective adaptive measures to improve water and food security. This study investigates the temporal and spatial patterns of drought hazards in the meteorological, hydrological, and agricultural (collectively bio-physical) sectors in the Karkheh River Basin of Iran in the historical and future climate change context. To do so, drought hazard indices were built from the severity and frequency of the standardized precipitation index (SPI), standardized runoff index (SRI), and standardized soil moisture index (SSMI), which represent the three aspects of drought hazard. The variables required for calculating these indices were obtained from a SWAT (Soil and Water Assessment Tool) model constructed for the basin. The model was calibrated against monthly runoff using the Sequential Uncertainty Fitting (SUFI-2) algorithm in SWAT-CUP. Based on the climate variability and drought analysis, three drought hazard classes, namely low, medium and high, were defined. This helps identify how the agricultural and hydrological sectors are related to meteorological droughts. Additionally, the bio-physical drivers of drought hazard were identified for each class. Comparing the historical and future scenarios revealed that the frequency of high-severity hazards will increase, whereas the same is not predicted for areas of medium hazard intensity. The findings of this study suggest that the combined application of the SWAT model with the bio-physical drought hazard concept supports a better understanding of climate risks to water and food security. The approach is replicable at different scales and provides a robust planning tool for policy makers.
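
    As a rough illustration of how a standardized drought index feeds into a hazard measure, the sketch below standardizes an aggregated precipitation series and counts how often it falls below a drought threshold. A formal SPI fits a gamma distribution before transforming to a standard normal; a plain z-score stands in for that step here, and the synthetic data and thresholds are not taken from the Karkheh study.

    # Simplified sketch of a standardized drought index and a severity/frequency
    # measure. Data are synthetic; a z-score replaces the gamma-fit step of the SPI.
    import numpy as np

    def standardized_index(series, window=3):
        """Standardize a running sum of the series (e.g. monthly precipitation)."""
        aggregated = np.convolve(series, np.ones(window), mode="valid")
        return (aggregated - aggregated.mean()) / aggregated.std()

    def drought_hazard(index, threshold=-1.0):
        """Fraction of time steps below the threshold: a crude frequency measure."""
        return float(np.mean(index < threshold))

    rng = np.random.default_rng(0)
    monthly_precip = rng.gamma(shape=2.0, scale=30.0, size=240)  # synthetic 20-year record
    spi_like = standardized_index(monthly_precip)
    print(f"fraction of months in drought (index < -1): {drought_hazard(spi_like):.2f}")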

  14. Field Study of Exhaust Fans for Mitigating Indoor Air Quality Problems & Indoor Air Quality - Exhaust Fan Mitigation.

    SciTech Connect

    United States. Bonneville Power Administration.

    1987-07-01

    Overall, the findings show that exhaust fans basically provide small amounts of ventilation compensation. By monitoring the common indoor air pollutants (radon, formaldehyde, carbon monoxide, nitrogen dioxide, and water vapor), it was found that the quality of the indoor air was not adversely affected by the use of exhaust fans. Nor did their use provide any measurable or significant benefits since no improvement in air quality was ascertained. While exhaust fans of this small size did not increase radon, which is the contaminant of most concern, the researchers caution that operation of a larger fan or installation in a very tight home could result in higher levels because depressurization is greater. The daily energy consumption for use of these appliances during the heating season was calculated to be 1.5 kilowatt hours or approximately 3% of the energy consumption in the study homes. The information collected in this collaborative field study indicates that the use of these particular ventilation systems has no significant effect on indoor air quality.

  15. Multihazard risk analysis and disaster planning for emergency services as a basis for efficient provision in the case of natural hazards - case study municipality of Au, Austria

    NASA Astrophysics Data System (ADS)

    Maltzkait, Anika; Pfurtscheller, Clemens

    2014-05-01

    The extreme flood events of 2002, 2005 and 2013 in Austria underlined the importance of local emergency services being able to withstand and reduce the adverse impacts of natural hazards. Although municipal emergency and crisis management plans exist in Austria for legal reasons, they mostly do not cover risk analyses of natural hazards - a sound, comparable assessment to identify and evaluate risks. Moreover, total losses and operational emergencies triggered by natural hazards have increased in recent decades. Given sparse public funds, objective budget decisions are needed to ensure the efficient provision of operating resources such as personnel, vehicles and equipment in the case of natural hazards. We present a case study of the municipality of Au, Austria, which was severely affected by the 2005 floods. Our approach is primarily based on a qualitative risk analysis combining existing hazard plans, GIS data, field mapping and data on the operational efforts of the fire departments. The risk analysis includes a map of phenomena discussed in a workshop with local experts, as well as a list of risks and a risk matrix prepared at that workshop. On this basis, the exact requirements for technical and non-technical mitigation measures for each natural hazard risk were analysed in close collaboration with members of the municipal operation control and members of the local emergency services (fire brigade, Red Cross). The measures include warning, evacuation, and technical interventions with heavy equipment and personnel. These results are used, first, to improve the municipal emergency and crisis management plan by providing a risk map, and a

  16. BICAPA case study of natural hazards that trigger technological disasters

    NASA Astrophysics Data System (ADS)

    Boca, Gabriela; Ozunu, Alexandru; Nicolae Vlad, Serban

    2010-05-01

    Industrial facilities are vulnerable to natural disasters. Natural disasters and technological accidents are not always singular or isolated events; the example in this paper shows that they can occur in complex combinations and/or in rapid succession, known as NaTech disasters, thereby triggering multiple impacts. This analysis indicates that NaTech disasters have the potential to trigger hazmat releases and other types of technological accidents. Climate change plays an important role in the prevalence and triggering mechanisms of NaTech events. Projections under the IPCC IS92a scenario (similar to SRES A1B; IPCC, 1992) and two GCMs indicate that the risk of floods increases in central and eastern Europe, and an increase in intense short-duration precipitation is likely to lead to an increased risk of flash floods (Lehner et al., 2006). It is urgent to develop tools for the assessment of risks due to NaTech events in industrial processes, in a framework starting with the characterization of the frequency and severity of natural disasters and continuing with complex analysis of industrial processes, risk assessment and residual functionality analysis. Ponds holding dangerous technological residues are the most vulnerable targets of natural hazards. Technological accidents such as those in Baia Mare (January to March 2000) had significant international repercussions. Extreme weather phenomena, like those in the winter of 2000 in Baia Mare, and other natural disasters such as floods or earthquakes, can cause a similar disaster at Târnăveni in the Transylvania Depression. During 1972-1978, three decanting ponds were built on the Chemical Platform Târnăveni, now SC BICAPA SA, for disposal of the hazardous wastes resulting from the manufacture of sodium dichromate and inorganic salts, sludge from waste water purification and filtration, and wet gas production from carbide. The ponds are located on the right bank of the river Târnava, about 35-50 m from the flood defense dam. The total

  17. Study on mobility-disadvantage group' risk perception and coping behaviors of abrupt geological hazards in coastal rural area of China.

    PubMed

    Pan, Anping

    2016-07-01

    China is a country highly vulnerable to abrupt geological hazards. The present study investigates disaster preparedness and perception of abrupt geological disasters (such as rock avalanches, landslides and mud-rock flows) among mobility-disadvantaged people living in coastal rural areas of China. The research takes into account all relevant disaster factors and designs the questionnaires accordingly. Two townships vulnerable to debris flows were selected as study areas: Hedi Township in Qinyuan County and Xianxi Township in Yueqing City, both located in East China's Zhejiang Province. SPSS was applied for descriptive analysis, which yields an empirical model of the evacuation behavior of disabled groups. The results show that mobility-disadvantaged groups' awareness of disaster prevention and mitigation is poor and their knowledge of basic theory and emergency response is limited. Errors and distortions in public awareness of disaster prevention and mitigation encourage development in areas with frequent disasters, which exposes more lives and property to danger and aggravates the vulnerability of the elements at risk. In conclusion, before drafting emergency plans, the government should give more weight to disadvantaged groups' expectations and actual evacuation behavior, rather than only to the demands of the situation, to ensure that the planning works in practice.

  18. Echo-sounding method aids earthquake hazard studies

    USGS Publications Warehouse

    ,

    1995-01-01

    Dramatic examples of catastrophic damage from an earthquake occurred in 1989, when the M 7.1 Loma Prieta earthquake rocked the San Francisco Bay area, and in 1994, when the M 6.6 Northridge earthquake jolted southern California. The surprising amount and distribution of damage to private property and infrastructure emphasizes the importance of seismic-hazard research in urbanized areas, where the potential for damage and loss of life is greatest. During April 1995, a group of scientists from the U.S. Geological Survey and the University of Tennessee, using an echo-sounding method described below, is collecting data in San Antonio Park, California, to examine the Monte Vista fault which runs through this park. The Monte Vista fault in this vicinity shows evidence of movement within the last 10,000 years or so. The data will give them a "picture" of the subsurface rock deformation near this fault. The data will also be used to help locate a trench that will be dug across the fault by scientists from William Lettis & Associates.

  19. Study of Seismic Hazards in the Center of the State of Veracruz, MÉXICO.

    NASA Astrophysics Data System (ADS)

    Torres Morales, G. F.; Leonardo Suárez, M.; Dávalos Sotelo, R.; Mora González, I.; Castillo Aguilar, S.

    2015-12-01

    Preliminary results are presented from the project "Microzonation of geological and hydrometeorological hazards for conurbations of Orizaba, Veracruz, and major sites located in the lower sub-basins: The Antigua and Jamapa", supported by the Joint Funds of CONACyT and the Veracruz state government. A probabilistic seismic hazard assessment (henceforth PSHA) was developed for the central area of Veracruz State, mainly in the region bounded by the watersheds of the Jamapa and Antigua rivers, with the aim of evaluating the geological and hydrometeorological hazards in this region. The project focuses on extreme weather phenomena, floods and earthquakes, in order to calculate the risk these induce for landslides and rock falls. In addition, as part of the study, the PSHA considered site effects in the urban zones of the cities of Xalapa and Orizaba; the site effects were incorporated using a standard format proposed for microzonation studies and their application in computer systems, which makes it possible to optimize and condense microzonation studies of a city. The results of the PSHA are presented as seismic hazard maps (hazard footprints), exceedance rate curves and uniform hazard spectra for spectral ordinates between 0.01 and 5.0 seconds, associated with selected return periods: 72, 225, 475 and 2475 years.
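
    The return periods quoted above (72, 225, 475 and 2475 years) are the ones conventionally associated with fixed exceedance probabilities over a 50-year exposure time under a Poisson assumption, T = -t / ln(1 - p). The short sketch below reproduces that conversion; it is a generic PSHA relationship, not code from the project.

    # Convert "p percent probability of exceedance in t years" to a return period,
    # assuming Poisson occurrence: T = -t / ln(1 - p).
    import math

    def return_period(p_exceedance, exposure_years=50):
        return -exposure_years / math.log(1.0 - p_exceedance)

    for p in (0.50, 0.20, 0.10, 0.02):
        print(f"{p:>4.0%} in 50 yr  ->  return period of about {return_period(p):.0f} yr")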

  20. 44 CFR 78.6 - Flood Mitigation Plan approval process.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 44 Emergency Management and Assistance 1 2012-10-01 2011-10-01 true Flood Mitigation Plan approval..., DEPARTMENT OF HOMELAND SECURITY INSURANCE AND HAZARD MITIGATION National Flood Insurance Program FLOOD MITIGATION ASSISTANCE § 78.6 Flood Mitigation Plan approval process. The State POC will forward all...

  1. 44 CFR 78.6 - Flood Mitigation Plan approval process.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 44 Emergency Management and Assistance 1 2010-10-01 2010-10-01 false Flood Mitigation Plan..., DEPARTMENT OF HOMELAND SECURITY INSURANCE AND HAZARD MITIGATION National Flood Insurance Program FLOOD MITIGATION ASSISTANCE § 78.6 Flood Mitigation Plan approval process. The State POC will forward all...

  2. 44 CFR 78.6 - Flood Mitigation Plan approval process.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 44 Emergency Management and Assistance 1 2013-10-01 2013-10-01 false Flood Mitigation Plan..., DEPARTMENT OF HOMELAND SECURITY INSURANCE AND HAZARD MITIGATION National Flood Insurance Program FLOOD MITIGATION ASSISTANCE § 78.6 Flood Mitigation Plan approval process. The State POC will forward all...

  3. 44 CFR 78.6 - Flood Mitigation Plan approval process.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 44 Emergency Management and Assistance 1 2014-10-01 2014-10-01 false Flood Mitigation Plan..., DEPARTMENT OF HOMELAND SECURITY INSURANCE AND HAZARD MITIGATION National Flood Insurance Program FLOOD MITIGATION ASSISTANCE § 78.6 Flood Mitigation Plan approval process. The State POC will forward all...

  4. 44 CFR 78.6 - Flood Mitigation Plan approval process.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 44 Emergency Management and Assistance 1 2011-10-01 2011-10-01 false Flood Mitigation Plan..., DEPARTMENT OF HOMELAND SECURITY INSURANCE AND HAZARD MITIGATION National Flood Insurance Program FLOOD MITIGATION ASSISTANCE § 78.6 Flood Mitigation Plan approval process. The State POC will forward all...

  5. Melatonin mitigate cerebral vasospasm after experimental subarachnoid hemorrhage: a study of synchrotron radiation angiography

    NASA Astrophysics Data System (ADS)

    Cai, J.; He, C.; Chen, L.; Han, T.; Huang, S.; Huang, Y.; Bai, Y.; Bao, Y.; Zhang, H.; Ling, F.

    2013-06-01

    Cerebral vasospasm (CV) after subarachnoid hemorrhage (SAH) is a devastating and unsolved clinical issue. In this study, rat models in which SAH had been induced by prechiasmatic cistern injection were treated with melatonin. Synchrotron radiation angiography (SRA) was employed to detect and evaluate CV in the animal models. Neurological scoring and histological examinations were used to assess the neurological deficits and CV as well. Using SRA techniques and histological analyses, the anterior cerebral artery diameters of SAH rats given melatonin were larger than those of rats without melatonin treatment (p < 0.05). The neurological deficits of SAH rats treated with melatonin were less than those of rats without melatonin treatment (p < 0.05). We conclude that SRA is a precise in vivo tool for observing and evaluating CV in SAH rats, and that intraperitoneal administration of melatonin can mitigate CV after experimental SAH.

  6. On the road to HF mitigation

    SciTech Connect

    VanZele, R.L.; Diener, R.

    1990-06-01

    The hazards of hydrogen fluoride (HF) have long been recognized, and industry performance reflects sound operating practices. However, full-scale industry-sponsored HF release tests conducted at the U.S. Department of Energy (DOE) test site in 1986 caused concern in view of HF's toxicity: ambient impacts were greater than anticipated, and diking, a primary mitigation technique, proved ineffective for releases of pressurized superheated HF. In partial response to these new technical data, an ad hoc three-component Industry Cooperative Hydrogen Fluoride Mitigation Assessment Program (ICHMAP) was begun in late 1987 to study and test techniques for mitigating accidental releases of HF and alkylation unit acid (AUA), a mixture of HF and hydrocarbons, and to enhance capabilities to estimate ambient impacts from such releases. The program's mitigation components have recently been completed, while work on the impact assessment component is nearing completion. This article describes the program and summarizes the objectives, scope of work, structure, and conclusions of the program's two mitigation components. In addition, the objectives and scope of work of the impact assessment component are described.

  7. A combined approach to physical vulnerability of large cities exposed to natural hazards - the case study of Arequipa, Peru

    NASA Astrophysics Data System (ADS)

    Thouret, Jean-Claude; Ettinger, Susanne; Zuccaro, Giulio; Guitton, Mathieu; Martelli, Kim; Degregorio, Daniela; Nardone, Stefano; Santoni, Olivier; Magill, Christina; Luque, Juan Alexis; Arguedas, Ana

    2013-04-01

    Arequipa, the second largest city in Peru with almost one million inhabitants, is exposed to various natural hazards, such as earthquakes, landslides, flash floods, and volcanic eruptions. This study focuses on the vulnerability and response of housing, infrastructure and lifelines in Arequipa to flash floods and eruption-induced hazards, notably lahars from El Misti volcano. We propose a combined approach for assessing physical vulnerability in a large city based on: (1) remote sensing utilizing high-resolution imagery (SPOT5, Google Earth Pro, Bing, Pléïades) to map the distribution and type of land use and the properties of city blocks in terms of exposure to the hazard (elevation above river level, distance to channel, impact angle, etc.); (2) in situ surveys of buildings, critical infrastructure (e.g., bridges) and strategic resources (e.g., potable water, irrigation, sewage); (3) information gained from interviews with engineers involved in construction works, previous crises (e.g., the June 2001 earthquake) and risk mitigation in Arequipa. Remote sensing and mapping at the scale of the city has focused on three pilot areas: the perennial Rio Chili valley, which crosses the city and its oasis from north to south, and two of the east-margin tributaries termed Quebrada (ravine), San Lazaro crossing the northern districts and Huarangal crossing the northeastern districts. Sampling of city blocks through these districts provides varying geomorphic, structural, historical, and socio-economic characteristics for each sector. A reconnaissance survey included about 900 edifices located in 40 city blocks across districts of the pilot areas, distinct in age, construction, land use and demographics. A building acts as a structural system, and its strength and resistance to flash floods and lahars therefore depend strongly on the type of construction and the materials used. Each building surveyed was assigned to one of eight building categories based on physical criteria (dominant

  8. Communicating Volcanic Hazards in the North Pacific

    NASA Astrophysics Data System (ADS)

    Dehn, J.; Webley, P.; Cunningham, K. W.

    2014-12-01

    For over 25 years, effective hazard communication has been key to effective mitigation of volcanic hazards in the North Pacific. These hazards are omnipresent, with a large event happening in Alaska every few years to a decade, though in many cases events occur with little or no warning (e.g. Kasatochi and Okmok in 2008). Here, a useful hazard mitigation strategy has been built on (1) a large database of historic activity from many datasets, (2) an operational alert system with graduated levels of concern, (3) scenario planning, and (4) routine checks and communication with emergency managers and the public. These baseline efforts are then enhanced in times of crisis with coordinated talking points, targeted studies and public outreach. Scientists naturally tend to target other scientists as their audience, whereas in effective monitoring of hazards that may only occur on yearly to decadal timescales, details can distract from the essential information. Creating talking points and practicing public communication can help make hazard response a part of the culture. Promoting situational awareness and familiarity can relieve indecision and concerns at the time of a crisis.

  9. Analyzing costs of space debris mitigation methods

    NASA Astrophysics Data System (ADS)

    Wiedemann, C.; Krag, H.; Bendisch, J.; Sdunnus, H.

    The steadily increasing number of space objects poses a considerable hazard to all kinds of spacecraft. To reduce the risks to future space missions, different debris mitigation measures and spacecraft protection techniques have been investigated in recent years. However, economic efficiency has not yet been considered in this context, and this economic background is not always clear to satellite operators and the space industry. Current studies aim to evaluate the mission costs due to space debris in a business-as-usual (no mitigation) scenario compared with mission costs when debris mitigation is considered. The aim is an estimate of the time until the investment in debris mitigation leads to an effective reduction of mission costs. This paper presents the results of investigations on the key problems of cost estimation for spacecraft and the influence of debris mitigation and shielding on cost. Shielding a satellite can be an effective method of protecting the spacecraft against debris impact. Mitigation strategies such as the reduction of orbital lifetime and de- or re-orbiting of non-operational satellites are methods to control the space debris environment; these methods result in an increase in costs. In a first step, the overall costs of different types of unmanned satellites are analyzed. The key problem is that it is not possible to provide a simple cost model that can be applied to all types of satellites: unmanned spacecraft differ greatly in mission, complexity of design, payload and operational lifetime. It is important to classify the relevant cost parameters and investigate their influence on the respective mission. The theory of empirical cost estimation and existing cost models are discussed. A selected cost model is simplified and generalized for application to all operational satellites. In a next step, the influence of space debris on cost is treated when the implementation of mitigation strategies is considered.

  10. Modeling effects of urban heat island mitigation strategies on heat-related morbidity: a case study for Phoenix, Arizona, USA.

    PubMed

    Silva, Humberto R; Phelan, Patrick E; Golden, Jay S

    2010-01-01

    A zero-dimensional energy balance model was previously developed to serve as a user-friendly mitigation tool for practitioners seeking to study the urban heat island (UHI) effect. Accordingly, this established model is applied here to show the relative effects of four common mitigation strategies: increasing the overall (1) emissivity, (2) percentage of vegetated area, (3) thermal conductivity, and (4) albedo of the urban environment in a series of percentage increases by 5, 10, 15, and 20% from baseline values. In addition to modeling mitigation strategies, we present how the model can be utilized to evaluate human health vulnerability from excessive heat-related events, based on heat-related emergency service data from 2002 to 2006. The 24-h average heat index is shown to have the greatest correlation to heat-related emergency calls in the Phoenix (Arizona, USA) metropolitan region. The four modeled UHI mitigation strategies, taken in combination, would lead to a 48% reduction in annual heat-related emergency service calls, where increasing the albedo is the single most effective UHI mitigation strategy.

  11. Modeling effects of urban heat island mitigation strategies on heat-related morbidity: a case study for Phoenix, Arizona, USA

    NASA Astrophysics Data System (ADS)

    Silva, Humberto R.; Phelan, Patrick E.; Golden, Jay S.

    2010-01-01

    A zero-dimensional energy balance model was previously developed to serve as a user-friendly mitigation tool for practitioners seeking to study the urban heat island (UHI) effect. Accordingly, this established model is applied here to show the relative effects of four common mitigation strategies: increasing the overall (1) emissivity, (2) percentage of vegetated area, (3) thermal conductivity, and (4) albedo of the urban environment in a series of percentage increases by 5, 10, 15, and 20% from baseline values. In addition to modeling mitigation strategies, we present how the model can be utilized to evaluate human health vulnerability from excessive heat-related events, based on heat-related emergency service data from 2002 to 2006. The 24-h average heat index is shown to have the greatest correlation to heat-related emergency calls in the Phoenix (Arizona, USA) metropolitan region. The four modeled UHI mitigation strategies, taken in combination, would lead to a 48% reduction in annual heat-related emergency service calls, where increasing the albedo is the single most effective UHI mitigation strategy.
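
    The correlation step mentioned in the abstract, relating the 24-h average heat index to heat-related emergency calls, can be illustrated with a few lines of code. The data below are synthetic stand-ins for the 2002-2006 Phoenix records, so only the structure of the comparison, not the reported result, is shown.

    # Compare a daily 24-h average heat index series with daily emergency call counts.
    # All data are synthetic placeholders, not the Phoenix records.
    import numpy as np

    rng = np.random.default_rng(1)
    days = 150
    heat_index_24h = 30 + 10 * np.sin(np.linspace(0, np.pi, days)) + rng.normal(0, 1.5, days)
    # Synthetic call counts loosely driven by the heat index plus noise.
    calls = np.maximum(0, 2.0 * (heat_index_24h - 32) + rng.normal(0, 3, days)).round()

    r = np.corrcoef(heat_index_24h, calls)[0, 1]
    print(f"Pearson correlation between 24-h heat index and daily calls: r = {r:.2f}")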

  12. Studying and Improving Human Response to Natural Hazards: Lessons from the Virtual Hurricane Lab

    NASA Astrophysics Data System (ADS)

    Meyer, R.; Broad, K.; Orlove, B. S.

    2010-12-01

    One of the most critical challenges facing communities in areas prone to natural hazards is how best to encourage residents to invest in individual and collective actions that would reduce the damaging impact of low-probability, high-consequence environmental events. Unfortunately, what makes this goal difficult to achieve is that the relative rarity of natural hazards implies that many who face the risk have no previous experience to draw on when making preparation decisions, or have prior experience that provides misleading guidance on how best to prepare. For example, individuals who have experienced strings of minor earthquakes or near-misses from tropical cyclones may become overly complacent about the risks that extreme events actually pose. In this presentation we report the preliminary findings of a program of work that explores the use of realistic multi-media hazard simulations designed for two purposes: 1) to serve as a basic research tool for studying how individuals make decisions to prepare for rare natural hazards in laboratory settings; and 2) to serve as an educational tool for giving people in hazard-prone areas virtual experience in hazard preparation. We demonstrate a prototype simulation in which participants experience the approach of a virtual hurricane, during which they have the opportunity to invest in different kinds of action to protect their home from damage. As the hurricane approaches, participants have access to an "information dashboard" in which they can gather information about the storm threat from a variety of natural sources, including mock television weather broadcasts, web sites, and conversations with neighbors. In response to this information they then have the opportunity to invest in different levels of protective action. Some versions of the simulation are designed as games, where participants are rewarded based on their ability to make the optimal trade-off between under and over-preparing for the

  13. Public willingness to pay for CO2 mitigation and the determinants under climate change: a case study of Suzhou, China.

    PubMed

    Yang, Jie; Zou, Liping; Lin, Tiansheng; Wu, Ying; Wang, Haikun

    2014-12-15

    This study explored the factors that influence respondents' willingness to pay (WTP) for CO2 mitigation under climate change. A questionnaire survey combining contingent valuation and psychometric paradigm methods was conducted in the city of Suzhou, Jiangsu Province, China. Respondents' demographic attributes, risk perception of greenhouse gases (GHGs), and attitudes toward the government's risk management practices were entered into a Tobit model to analyze the determinants. The results showed that about 55% of the respondents refused to pay for CO2 mitigation, and that respondents' WTP increased with increasing CO2 mitigation percentage. Important factors influencing WTP include people's dread of GHGs, confidence in policy, the timeliness of governmental information disclosure, age, education and income level.
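
    A Tobit model is typically used in settings like this because many respondents state a willingness to pay of exactly zero, which censors the latent WTP variable from below. The sketch below writes out that censored-likelihood idea on synthetic data; the covariate, sample size and coefficients are illustrative and unrelated to the Suzhou survey.

    # Sketch of a left-censored (at zero) Tobit likelihood, estimated by maximum
    # likelihood on synthetic WTP data. Requires only numpy and scipy.
    import numpy as np
    from scipy import stats, optimize

    rng = np.random.default_rng(2)
    n = 500
    income = rng.normal(0, 1, n)                       # standardized covariate
    latent = 5 + 3 * income + rng.normal(0, 4, n)      # latent willingness to pay
    wtp = np.maximum(latent, 0.0)                      # observed WTP, censored at 0
    X = np.column_stack([np.ones(n), income])

    def negloglik(params):
        beta, log_sigma = params[:-1], params[-1]
        sigma = np.exp(log_sigma)
        mu = X @ beta
        censored = wtp <= 0
        ll = np.where(
            censored,
            stats.norm.logcdf(-mu / sigma),                       # P(latent <= 0)
            stats.norm.logpdf((wtp - mu) / sigma) - np.log(sigma),  # density of observed WTP
        )
        return -ll.sum()

    res = optimize.minimize(negloglik, x0=np.array([0.0, 0.0, 1.0]), method="BFGS")
    print("estimated beta:", res.x[:-1], " sigma:", np.exp(res.x[-1]))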

  14. Software safety hazard analysis

    SciTech Connect

    Lawrence, J.D.

    1996-02-01

    Techniques for analyzing the safety and reliability of analog-based electronic protection systems that serve to mitigate hazards in process control systems have been developed over many years, and are reasonably well understood. An example is the protection system in a nuclear power plant. The extension of these techniques to systems which include digital computers is not well developed, and there is little consensus among software engineering experts and safety experts on how to analyze such systems. One possible technique is to extend hazard analysis to include digital computer-based systems. Software is frequently overlooked during system hazard analyses, but this is unacceptable when the software is in control of a potentially hazardous operation. In such cases, hazard analysis should be extended to fully cover the software. A method for performing software hazard analysis is proposed in this paper.

  15. Evaluation of mitigation measures to reduce hydropeaking impacts on river ecosystems - a case study from the Swiss Alps.

    PubMed

    Tonolla, Diego; Bruder, Andreas; Schweizer, Steffen

    2017-01-01

    New Swiss legislation obligates hydropower plant owners to reduce the detrimental impacts on river ecosystems caused by hydropeaking. We used a case study in the Swiss Alps (hydropower company Kraftwerke Oberhasli AG) to develop an efficient and successful procedure for the ecological evaluation of such impacts, and to predict the effects of possible mitigation measures. We evaluated the following scenarios using 12 biotic and abiotic indicators: the pre-mitigation scenario (i.e. current state), a future scenario with increased turbine capacity but without mitigation measures, and future scenarios with increased turbine capacity and four alternative mitigation measures. The evaluation was based on representative hydrographs and quantitative or qualitative prediction of the indicators. Despite uncertainties in the ecological responses and the future operation mode of the hydropower plant, the procedure allowed the most appropriate mitigation measure to be identified. This measure combines a basin and a cavern with a total retention volume of 80,000 m³, allowing substantial dampening of the flow falling and ramping rates and, in turn, a considerable reduction in the stranding risk for juvenile trout and in macroinvertebrate drift. In general, this retention volume had the greatest predicted ecological benefit and can also, to some extent, compensate for possible modifications of the hydropower operation regime in the future, e.g. due to climate change, changes in the energy market, and changes in river morphology. Furthermore, it also allows more specific seasonal regulation of the retention volume during ecologically sensitive periods (e.g. fish spawning seasons). Overall, the experience gained from our case study is expected to support other hydropeaking mitigation projects.

  16. Mini-Sosie high-resolution seismic method aids hazards studies

    USGS Publications Warehouse

    Stephenson, W.J.; Odum, J.; Shedlock, K.M.; Pratt, T.L.; Williams, R.A.

    1992-01-01

    The Mini-Sosie high-resolution seismic method has been effective in imaging shallow-structure and stratigraphic features that aid in seismic-hazard and neotectonic studies. The method is not an alternative to Vibroseis acquisition for large-scale studies. However, it has two major advantages over Vibroseis as it is being used by the USGS in its seismic-hazards program. First, the sources are extremely portable and can be used in both rural and urban environments. Second, the shifting-and-summation process during acquisition improves the signal-to-noise ratio and cancels out seismic noise sources such as cars and pedestrians. -from Authors
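
    The shifting-and-summation step referred to above improves the signal-to-noise ratio roughly in proportion to the square root of the number of summed records. The toy example below illustrates that stacking effect with synthetic traces; it is not a reconstruction of actual Mini-Sosie field processing.

    # Toy illustration of stacking: summing many aligned noisy records suppresses
    # incoherent noise while the coherent signal is preserved.
    import numpy as np

    rng = np.random.default_rng(3)
    t = np.linspace(0, 1, 500)
    signal = np.exp(-((t - 0.3) ** 2) / 0.001)           # simple wavelet-like pulse

    def snr(trace):
        """Peak amplitude over late-time (signal-free) noise level."""
        return np.abs(trace).max() / trace[400:].std()

    single = signal + rng.normal(0, 0.5, t.size)
    stack = signal + rng.normal(0, 0.5, (100, t.size)).mean(axis=0)  # 100 aligned records
    print(f"SNR single record: {snr(single):.1f},  SNR 100-fold stack: {snr(stack):.1f}")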

  17. Seismic hazard assessment of the cultural heritage sites: A case study in Cappadocia (Turkey)

    NASA Astrophysics Data System (ADS)

    Seyrek, Evren; Orhan, Ahmet; Dinçer, İsmail

    2014-05-01

    Turkey is one of the most seismically active regions in the world, and major earthquakes with the potential to threaten life and property occur frequently here. In the last decade, over 50,000 residents lost their lives, commonly as a result of building failures in seismic events. The Cappadocia region is one of the most important touristic sites in Turkey; at the same time, the region was included in the World Heritage List by UNESCO in 1985 for its natural, historical and cultural values. The region is adversely affected by several environmental conditions, which have been addressed in many previous studies, but there are only limited studies on the seismic evaluation of the region. Some of the important historical and cultural heritage sites are: Goreme Open Air Museum, Uchisar Castle, Ortahisar Castle, Derinkuyu Underground City and Ihlara Valley. According to the seismic hazard zonation map published by the Ministry of Reconstruction and Settlement, these heritage sites fall in Zone III, Zone IV and Zone V. This map shows peak ground acceleration for a 10 percent probability of exceedance in 50 years on bedrock. In this connection, the seismic hazard at these heritage sites has to be evaluated. In this study, seismic hazard calculations were performed with both deterministic and probabilistic approaches, taking local site conditions into account. A catalog of historical and instrumental earthquakes was prepared and used in this study. The seismic sources were identified for the seismic hazard assessment based on geological, seismological and geophysical information. Peak Ground Acceleration (PGA) at bedrock level was calculated for the different seismic sources using available attenuation relationships applicable to Turkey. The results of the present study reveal that the seismic hazard at these sites closely matches the seismic zonation map published by the Ministry of Reconstruction and Settlement. Keywords: Seismic Hazard Assessment, Probabilistic Approach

  18. An Independent Evaluation of the FMEA/CIL Hazard Analysis Alternative Study

    NASA Technical Reports Server (NTRS)

    Ray, Paul S.

    1996-01-01

    The present instruments of safety and reliability risk control for a majority of National Aeronautics and Space Administration (NASA) programs/projects consist of Failure Mode and Effects Analysis (FMEA), Hazard Analysis (HA), the Critical Items List (CIL), and Hazard Reports (HR). This extensive analytical approach was introduced in the early 1970s and was implemented for the Space Shuttle Program by NHB 5300.4 (1D-2). Since the Challenger accident in 1986, the process has been expanded considerably, resulting in the introduction of similar and/or duplicated activities in safety/reliability risk analysis. A study initiated in 1995 to search for an alternative to the current FMEA/CIL Hazard Analysis methodology produced a proposed method on April 30, 1996. The objective of this Summer Faculty Study was to participate in and conduct an independent evaluation of the proposed alternative to simplify the present safety and reliability risk control procedure.

  19. Turning a hazardous waste lagoon into reclaimed land for wildlife management: A case study

    SciTech Connect

    Leong, A.K.

    1996-12-31

    Brownfields are turning back to green. This paper presents a case study of a former dump site for hazardous waste that has been remediated and will be developed into an enhanced wildlife management habitat. This successful remediation case combined various investigations, remedial designs, risk assessments, ecological studies, and engineering practices. 3 refs., 1 fig., 1 tab.

  20. Economic valuation of flood mitigation services: A case study from the Otter Creek, VT.

    NASA Astrophysics Data System (ADS)

    Galford, G. L.; Ricketts, T.; Bryan, K. L.; ONeil-Dunne, J.; Polasky, S.

    2014-12-01

    The ecosystem services provided by wetlands are widely recognized but difficult to quantify. In particular, estimating the effect of land cover and land use on downstream flood outcomes remains challenging, but is increasingly important in light of climate change predictions of increased precipitation in many areas. Economic valuation can help incorporate ecosystem services into decisions and enable communities to plan for climate and flood resiliency. Here we estimate the economic value of the Otter Creek wetlands for Middlebury, VT in mitigating the flood that followed Tropical Storm Irene, as well as for ten historic floods. Observationally, hydrographs above and below the wetlands for each storm indicated that the wetlands functioned as a temporary reservoir, slowing the delivery of water to Middlebury. We compare observed floods, based on Middlebury's hydrograph, with simulated floods for scenarios without wetlands. To simulate these "without wetlands" scenarios, we assume the same volume of water was delivered to Middlebury, but in a shorter time pulse similar to a hydrograph upstream of the wetlands. For scenarios with and without wetlands, we map the spatial extent of flooding using LiDAR digital elevation data. We then estimate flood depth at each affected building, and calculate monetary losses as a function of the flood depth and house value using established depth-damage relationships. For example, we expect damages equal to 20% of the house's value for a flood depth of two feet in a two-story home with a basement. We define the value of flood mitigation services as the difference in damages between the with- and without-wetlands scenarios, and find that the Otter Creek wetlands reduced flood damage in Middlebury by 88% following Hurricane Irene. Using the 10 additional historic floods, we estimate an ongoing mean value of $400,000 in avoided damages per year. Economic impacts of this magnitude stress the importance of wetland conservation and warrant the
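
    The depth-damage step described above can be sketched in a few lines: interpolate a damage fraction from a depth-damage curve and multiply by the building value. Only the 2 ft / 20% point below is taken from the text; the rest of the curve, the depths and the building value are illustrative placeholders for the published tables the authors rely on.

    # Sketch of a depth-damage loss calculation for one building. The damage curve
    # is illustrative except for the 2 ft / 20% point quoted in the abstract.
    import numpy as np

    depth_ft = np.array([0, 1, 2, 4, 6, 8])                          # flood depth at the building
    damage_fraction = np.array([0.0, 0.10, 0.20, 0.33, 0.42, 0.50])  # share of house value lost

    def building_damage(depth, value):
        """Interpolate the damage fraction at this depth and apply it to the house value."""
        return float(np.interp(depth, depth_ft, damage_fraction)) * value

    with_wetlands = building_damage(depth=0.5, value=250_000)
    without_wetlands = building_damage(depth=2.0, value=250_000)
    print(f"avoided damage for this building: ${without_wetlands - with_wetlands:,.0f}")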

  1. Coastal dynamics studies for evaluation of hazard and vulnerability for coastal erosion. case study the town La Bocana, Buenaventura, colombian pacific

    NASA Astrophysics Data System (ADS)

    Coca-Domínguez, Oswaldo; Ricaurte-Villota, Constanza

    2015-04-01

    The analysis of hazard and vulnerability in coastal areas affected by erosion is based on studies of coastal dynamics, since these provide the level of detail needed for decision-making in prevention, mitigation, disaster reduction and integrated risk management. The town of La Bocana, located in Buenaventura (Colombian Pacific), was selected for a coastal erosion hazard assessment based on three components: i) magnitude, ii) occurrence and iii) susceptibility. Vulnerability, meanwhile, is also composed of three main components for its evaluation: i) exposure, ii) fragility and iii) resilience, which in turn are evaluated in six dimensions of vulnerability: physical, social, economic, ecological, institutional and cultural. The hazard analysis used a semi-quantitative approach and an index of variables such as type of geomorphological unit, type of beach, exposure of the coast to surf and occurrence, among others. Quantitative data on coastal retreat were obtained using DSAS (Digital Shoreline Analysis System), an ArcGIS application, together with digital elevation models of the beach and six beach profiles strategically located on the coast obtained with GNSS technology. Sediment samples collected from these beaches and mean wave height and direction were used as complementary data. The information was integrated along the coastline in segments of 250 x 250 meters. Four sectors form the coastal area of La Bocana: Pianguita, Vistahermosa, Downtown and Shangay. The six vulnerability dimensions were assessed for these populations, together with population density for exposure, through a multi-array method that includes variables such as land use, population, type of structure, education and basic services, among others, to measure fragility, along with a corresponding resilience indicator. The hazard analysis results indicate that Vistahermosa is under very high threat, while
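
    Shoreline-change tools such as DSAS summarize retreat along transects with statistics like the end point rate: the net shoreline movement between the oldest and youngest shoreline divided by the elapsed time. The stand-alone sketch below illustrates that calculation with made-up transect positions, not the La Bocana measurements.

    # End point rate (EPR) along shoreline transects: (newest - oldest position) / years.
    # Negative values indicate shoreline retreat (erosion). Positions are illustrative.

    def end_point_rate(dist_old_m, dist_new_m, year_old, year_new):
        return (dist_new_m - dist_old_m) / (year_new - year_old)

    transects = {"T1": (102.0, 78.5), "T2": (95.0, 96.2), "T3": (110.0, 60.0)}
    for name, (d1990, d2014) in transects.items():
        rate = end_point_rate(d1990, d2014, 1990, 2014)
        print(f"{name}: {rate:+.2f} m/yr")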

  2. Reducing aluminum dust explosion hazards: case study of dust inerting in an aluminum buffing operation.

    PubMed

    Myers, Timothy J

    2008-11-15

    Metal powders or dusts can represent significant dust explosion hazards in industry, due to their relatively low ignition energy and high explosivity. The hazard is well known in industries that produce or use aluminum powders, but is sometimes not recognized by facilities that produce aluminum dust as a byproduct of bulk aluminum processing. As demonstrated by the 2003 dust explosion at aluminum wheel manufacturer Hayes Lemmerz, facilities that process bulk metals are at risk due to dust generated during machining and finishing operations [U.S. Chemical Safety and Hazard Investigation Board, Investigation Report, Aluminum Dust Explosion Hayes Lemmerz International, Inc., Huntington, Indiana, Report No. 2004-01-I-IN, September 2005]. Previous studies have shown that aluminum dust explosions are more difficult to suppress with flame retardants or inerting agents than dust explosions fueled by other materials such as coal [A.G. Dastidar, P.R. Amyotte, J. Going, K. Chatrathi, Flammability limits of dust-minimum inerting concentrations, Proc. Saf. Progr., 18-1 (1999) 56-63]. In this paper, an inerting method is discussed to reduce the dust explosion hazard of residue created in an aluminum buffing operation as the residue is generated. This technique reduces the dust explosion hazard throughout the buffing process and within the dust collector systems making the process inherently safer. Dust explosion testing results are presented for process dusts produced during trials with varying amounts of flame retardant additives.

  3. An optimization model for regional air pollutants mitigation based on the economic structure adjustment and multiple measures: A case study in Urumqi city, China.

    PubMed

    Sun, Xiaowei; Li, Wei; Xie, Yulei; Huang, Guohe; Dong, Changjuan; Yin, Jianguang

    2016-11-01

    A model based on economic structure adjustment and pollutant mitigation was proposed and applied in Urumqi. Best-worst case analysis and scenario analysis were performed with the model to guarantee the accuracy of the parameters and to analyze the effect of changing emission reduction approaches. The results indicated that pollutant mitigation in the electric power industry, the iron and steel industry, and traffic relied mainly on technological transformation measures, engineering transformation measures and structural emission reduction measures, respectively; pollutant mitigation in the cement industry relied mainly on structural emission reduction and technological transformation measures; and pollutant mitigation in the thermal industry relied on all four mitigation measures. The results also indicated that structural emission reduction was the better measure for pollutant mitigation in Urumqi. The iron and steel industry contributed greatly to SO2, NOx and PM (particulate matter) emission reduction and should be given special attention in pollutant emission reduction. In addition, with decreasing SO2 mitigation amounts the scale of the iron and steel industry should be reduced; with decreasing NOx mitigation amounts the scales of traffic and the electric power industry should be reduced; and with decreasing PM mitigation amounts the scales of the cement industry and the iron and steel industry should be reduced. The study can provide reference pollutant mitigation schemes for decision-makers for regional economic and environmental development under the 12th Five-Year Plan on National Economic and Social Development of Urumqi.
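
    The kind of least-cost allocation across mitigation measures described above can be illustrated with a tiny linear program: choose activity levels of a few measure types so that SO2, NOx and PM reduction targets are met at minimum cost. All coefficients and targets below are illustrative and are not taken from the Urumqi model.

    # Tiny linear program: meet SO2/NOx/PM reduction targets at minimum cost.
    # All numbers are illustrative placeholders, not the study's data.
    import numpy as np
    from scipy.optimize import linprog

    cost = np.array([5.0, 3.0, 4.0])             # cost per unit of each measure
    #                  structure  technological  engineering
    reduction = np.array([[4.0, 2.0, 1.0],        # SO2 removed per unit of each measure
                          [3.0, 1.0, 3.0],        # NOx
                          [2.0, 1.0, 2.0]])       # PM
    targets = np.array([20.0, 15.0, 10.0])        # required reductions

    # linprog minimizes cost @ x with A_ub @ x <= b_ub, so flip the ">= target" constraints.
    res = linprog(c=cost, A_ub=-reduction, b_ub=-targets, bounds=[(0, None)] * 3)
    print("optimal measure levels:", np.round(res.x, 2), " total cost:", round(res.fun, 2))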

  4. A Case Study in Ethical Decision Making Regarding Remote Mitigation of Botnets

    NASA Astrophysics Data System (ADS)

    Dittrich, David; Leder, Felix; Werner, Tillmann

    It is becoming more common for researchers to find themselves in a position of being able to take over control of a malicious botnet. If this happens, should they use this knowledge to clean up all the infected hosts? How would this affect not only the owners and operators of the zombie computers, but also other researchers, law enforcement agents serving justice, or even the criminals themselves? What dire circumstances would change the calculus about what is or is not appropriate action to take? We review two case studies of long-lived malicious botnets that present serious challenges to researchers and responders and use them to illuminate many ethical issues regarding aggressive mitigation. We make no judgments about the questions raised, instead laying out the pros and cons of possible choices and allowing workshop attendees to consider how and where they would draw lines. By this, we hope to expose where there is clear community consensus as well as where controversy or uncertainty exists.

  5. A Food Effect Study of an Oral Thrombin Inhibitor and Prodrug Approach To Mitigate It.

    PubMed

    Lee, Jihye; Kim, Bongchan; Kim, Tae Hun; Lee, Sun Hwa; Park, Hee Dong; Chung, Kyungha; Lee, Sung-Hack; Paek, Seungyup; Kim, Eunice EunKyeong; Yoon, SukKyoon; Kim, Aeri

    2016-04-04

    LB30870, a new direct thrombin inhibitor, showed an 80% reduction in oral bioavailability in the fed state. The present study aims to propose trypsin binding as a mechanism for such a negative food effect and to demonstrate a prodrug approach to mitigate the food effect. The effect of food composition on the fed-state oral bioavailability of LB30870 was studied in dogs. Various prodrugs were synthesized, and their solubility, permeability, and trypsin binding affinity were measured. LB30870 and the prodrugs were subjected to cocrystallization with trypsin, and the X-ray structures of the cocrystals were determined. The food effect was studied in dogs for selected prodrugs. A protein or lipid meal appeared to affect the oral bioavailability of LB30870 in dogs more than a carbohydrate meal. Blocking both the carboxyl and amidine groups of LB30870 resulted in trypsin Ki values orders of magnitude higher than that of LB30870. The prodrugs belonged to Biopharmaceutical Classification System class I, II, or III. X-ray crystallography revealed that the prodrugs did not bind to trypsin; instead, their hydrolysis product at the amidine blocking group formed a cocrystal with trypsin. A prodrug with a significantly smaller food effect than LB30870 was identified. Binding of the prodrugs to food components such as dietary fiber appeared to counteract the positive effect brought by the prodrug approach. Further formulation research is warranted to enhance the oral bioavailability of the prodrugs. In conclusion, this study is the first to demonstrate that the negative food effect of LB30870 can be attributed to trypsin binding. A trypsin binding study is proposed as a screening tool during lead optimization to minimize food effects.

  6. Landslide hazard mapping with selected dominant factors: A study case of Penang Island, Malaysia

    SciTech Connect

    Tay, Lea Tien; Alkhasawneh, Mutasem Sh.; Ngah, Umi Kalthum; Lateh, Habibah

    2015-05-15

    Landslide is one of the most destructive natural geohazards in Malaysia. In addition to rainfall as a triggering factor for landslides in Malaysia, topographical and geological factors play an important role in landslide susceptibility analysis. Conventional topographic factors such as elevation, slope angle, slope aspect, plan curvature and profile curvature have been considered as landslide causative factors in many research works. However, other topographic factors such as diagonal length, surface area, surface roughness and rugosity have not been considered, especially for research work in landslide hazard analysis in Malaysia. This paper presents landslide hazard mapping using the Frequency Ratio (FR) approach, and the study area is Penang Island of Malaysia. The frequency ratio approach is a probabilistic method based on the observed relationships between the distribution of landslides and each landslide-causative factor. The landslide hazard map of Penang Island is produced by considering twenty-two (22) landslide causative factors. Among these twenty-two (22) factors, fourteen (14) are topographic factors. They are elevation, slope gradient, slope aspect, plan curvature, profile curvature, general curvature, tangential curvature, longitudinal curvature, cross section curvature, total curvature, diagonal length, surface area, surface roughness and rugosity. These topographic factors are extracted from the digital elevation model of Penang Island. The other eight (8) non-topographic factors considered are land cover, vegetation cover, distance from road, distance from stream, distance from fault line, geology, soil texture and rainfall precipitation. After considering all twenty-two factors for landslide hazard mapping, the analysis is repeated with fourteen dominant factors selected from the twenty-two factors. The landslide hazard map was segregated into four categories of risk, i.e. Highly hazardous area, Hazardous area, Moderately hazardous area
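
    The frequency ratio statistic used above can be illustrated with a small sketch. The grids, class values and landslide mask below are synthetic stand-ins, not Penang Island data; the point is only how a per-class FR is computed and then mapped back onto the grid as a hazard index:

```python
# Minimal sketch of the Frequency Ratio (FR) idea; all arrays are synthetic.
import numpy as np

def frequency_ratio(factor_classes, landslide_mask):
    """FR per class = (landslide pixels in class / all landslide pixels)
                      / (pixels in class / all pixels)."""
    fr = {}
    total = factor_classes.size
    total_slides = landslide_mask.sum()
    for cls in np.unique(factor_classes):
        in_class = factor_classes == cls
        pct_area = in_class.sum() / total
        pct_slides = (landslide_mask & in_class).sum() / total_slides
        fr[cls] = pct_slides / pct_area if pct_area > 0 else 0.0
    return fr

# Toy grids: slope classes (0-3) and a landslide inventory mask
rng = np.random.default_rng(0)
slope_cls = rng.integers(0, 4, size=(100, 100))
slides = rng.random((100, 100)) < 0.02

fr_slope = frequency_ratio(slope_cls, slides)
# Hazard index for one factor: map each pixel's class to its FR value;
# with several factors, the per-factor FR maps are summed.
hazard_index = np.vectorize(fr_slope.get)(slope_cls)
```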

  7. Flood hazards studies in the Mississippi River basin using remote sensing

    NASA Technical Reports Server (NTRS)

    Rango, A.; Anderson, A. T.

    1974-01-01

    The Spring 1973 Mississippi River flood was investigated using remotely sensed data from ERTS-1. Both manual and automatic analyses of the data indicated that ERTS-1 is extremely useful as a regional tool for flood management. Quantitative estimates of the area flooded were made in St. Charles County, Missouri, and in Arkansas. Flood hazard mapping was conducted in three study areas along the Mississippi River using pre-flood ERTS-1 imagery enlarged to 1:250,000 and 1:100,000 scale. Initial results indicate that ERTS-1 digital mapping of flood-prone areas can be performed at 1:62,500, which is comparable to some conventional flood hazard map scales.

  8. Natural hazard understanding in the middle schools of the Colorado Front Range

    SciTech Connect

    Grogger, P.K.

    1995-12-01

    The best form of mitigation is not to put one's self in a position where mitigation is required. For the last five years the University of Colorado's Department of Geology has teamed with local school districts to implement an understanding of natural hazards. By working with middle school students, the dangers posed by natural hazards in North America, and their possible mitigation, are learned at an early age. Over the years, the knowledge gained by these communities' citizens will hopefully help lessen the dangers from natural hazards society faces. Education of the general public about natural hazards needs to be addressed by the professional societies studying and developing answers to natural hazards problems. By working with school children this process of educating the general public starts early in the education system and will bear fruit many years in the future. This paper describes the course that is being given to students in Colorado.

  9. Identifying hazard parameter to develop quantitative and dynamic hazard map of an active volcano in Indonesia

    NASA Astrophysics Data System (ADS)

    Suminar, Wulan; Saepuloh, Asep; Meilano, Irwan

    2016-05-01

    Analysis of hazard assessment for active volcanoes is crucial for risk management. The hazard map of a volcano provides information to decision makers and communities before, during, and after a volcanic crisis. Rapid and accurate hazard assessment, especially of an active volcano, needs to be developed for better mitigation at the time of volcanic crises in Indonesia. In this paper, we identified the hazard parameters to develop a quantitative and dynamic hazard map of an active volcano. Guntur volcano in the Garut Region, West Java, Indonesia was selected as the study area because its population resides adjacent to the active volcano. The development of infrastructure, especially related to tourism on the eastern flank of the summit, is growing rapidly. Remote sensing and field investigation approaches were used to obtain hazard parameters spatially. We developed a quantitative and dynamic algorithm to map the spatial hazard potential of the volcano based on an index overlay technique. Five volcano hazard parameters were identified based on Landsat 8 and ASTER imagery: volcanic products including pyroclastic fallout, pyroclastic flows, lava and lahar; slope topography; surface brightness temperature; and vegetation density. Following this proposed technique, the hazard parameters were extracted, indexed, and calculated to produce spatial hazard values at and around Guntur Volcano. Based on this method, the hazard potential of areas with low vegetation density is higher than that of areas with high vegetation density. Furthermore, slope topography, surface brightness temperature, and fragmental volcanic products such as pyroclastics influenced the spatial hazard value significantly. Further study of this proposed approach will aim at effective and efficient analyses for volcano risk assessment.
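
    As a hedged illustration of the index overlay technique mentioned above, the sketch below rescales a few parameter rasters to class indices and combines them with weights. The rasters, the number of classes and the weights are assumptions made up for the example, not the values derived for Guntur volcano:

```python
# Hedged sketch of a weighted index overlay of hazard parameters; all inputs
# below are synthetic stand-ins for the parameters named in the abstract.
import numpy as np

def to_index(raster, n_classes=5):
    """Rescale a parameter raster to integer indices 1..n_classes."""
    lo, hi = np.nanmin(raster), np.nanmax(raster)
    scaled = (raster - lo) / (hi - lo + 1e-12)
    return np.clip((scaled * n_classes).astype(int) + 1, 1, n_classes)

shape = (200, 200)
rng = np.random.default_rng(1)
params = {
    "volcanic_products":  rng.random(shape),
    "slope":              rng.random(shape),
    "brightness_temp":    rng.random(shape),
    # inverted so that low vegetation density maps to a high hazard index
    "vegetation_density": 1.0 - rng.random(shape),
}
weights = {"volcanic_products": 0.4, "slope": 0.2,
           "brightness_temp": 0.2, "vegetation_density": 0.2}

# Weighted index overlay: hazard = sum(w_i * index_i) / sum(w_i)
hazard = sum(weights[k] * to_index(v) for k, v in params.items()) / sum(weights.values())
```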

  10. Mitigating flood exposure

    PubMed Central

    Shultz, James M; McLean, Andrew; Herberman Mash, Holly B; Rosen, Alexa; Kelly, Fiona; Solo-Gabriele, Helena M; Youngs Jr, Georgia A; Jensen, Jessica; Bernal, Oscar; Neria, Yuval

    2013-01-01

    Introduction. In 2011, following heavy winter snowfall, two cities bordering two rivers in North Dakota, USA faced major flood threats. Flooding was foreseeable and predictable although the extent of risk was uncertain. One community, Fargo, situated in a shallow river basin, successfully mitigated and prevented flooding. For the other community, Minot, located in a deep river valley, prevention was not possible and downtown businesses and one-quarter of the homes were inundated, in the city's worst flood on record. We aimed to contrast the respective hazards, vulnerabilities, stressors, psychological risk factors, psychosocial consequences, and disaster risk reduction strategies under conditions where flood prevention was, and was not, possible. Methods. We applied the "trauma signature analysis" (TSIG) approach to compare the hazard profiles, identify salient disaster stressors, document the key components of the disaster risk reduction response, and examine indicators of community resilience. Results. Two demographically comparable communities, Fargo and Minot, faced challenging river flood threats and exhibited effective coordination across community sectors. We examined the implementation of disaster risk reduction strategies in situations where coordinated citizen action was able to prevent disaster impact (hazard avoidance) compared to the more common scenario when unpreventable disaster strikes, causing destruction, harm, and distress. Across a range of indicators, it is clear that successful mitigation diminishes both physical and psychological impact, thereby reducing the trauma signature of the event. Conclusion. In contrast to the experience of historic flooding in Minot, the city of Fargo succeeded in reducing the trauma signature by reducing risk through mitigation. PMID:28228985

  11. Collaborative studies target volcanic hazards in Central America

    NASA Astrophysics Data System (ADS)

    Bluth, Gregg J. S.; Rose, William I.

    Central America is the second-most consistently active volcanic zone on Earth, after Indonesia. Centuries of volcanic activity have produced a spectacular landscape of collapsed calderas, debris flows, and thick blankets of pyroclastic materials. Volcanic activity dominates the history, culture, and daily life of Central American countries.January 2002 marked the third consecutive year in which a diverse group of volcanologists and geophysicists conducted focused field studies in Central America. This type of multi-institutional collaboration reflects the growing involvement of a number of U.S. and non-U.S. universities, and of other organizations, in Guatemala and El Salvador (Table 1).

  12. Natural phenomena hazards site characterization criteria

    SciTech Connect

    Not Available

    1994-03-01

    The criteria and recommendations in this standard shall apply to site characterization for the purpose of mitigating Natural Phenomena Hazards (wind, floods, landslide, earthquake, volcano, etc.) in all DOE facilities covered by DOE Order 5480.28. Criteria for site characterization not related to NPH are not included unless necessary for clarification. General and detailed site characterization requirements are provided in areas of meteorology, hydrology, geology, seismology, and geotechnical studies.

  13. A Study of Airline Passenger Susceptibility to Atmospheric Turbulence Hazard

    NASA Technical Reports Server (NTRS)

    Stewart, Eric C.

    2000-01-01

    A simple, generic simulation math model of a commercial airliner has been developed to study the susceptibility of unrestrained passengers to large, discrete gust encounters. The math model simulates the longitudinal response to vertical gusts and includes (1) the motion of an unrestrained passenger in the rear cabin, (2) fuselage flexibility, (3) the lag in the downwash from the wing to the tail, and (4) unsteady lift effects. Airplane and passenger response contours are calculated for a matrix of gust amplitudes and gust lengths of a simulated mountain rotor. A comparison of the model-predicted responses to data from three accidents indicates that the accelerations in actual accidents are sometimes much larger than in the simulated gust encounters.

  14. Epidemiological study of health hazards among workers handling engineered nanomaterials

    NASA Astrophysics Data System (ADS)

    Liou, Saou-Hsing; Tsou, Tsui-Chun; Wang, Shu-Li; Li, Lih-Ann; Chiang, Hung-Che; Li, Wan-Fen; Lin, Pin-Pin; Lai, Ching-Huang; Lee, Hui-Ling; Lin, Ming-Hsiu; Hsu, Jin-Huei; Chen, Chiou-Rong; Shih, Tung-Sheng; Liao, Hui-Yi; Chung, Yu-Teh

    2012-08-01

    The aim of this study was to establish and identify the health effect markers of workers with potential exposure to nanoparticles (20-100 nm) during manufacturing and/or application of nanomaterials. For this cross-sectional study, we recruited 227 workers who handled nanomaterials and 137 workers for comparison who did not from 14 plants in Taiwan. A questionnaire was used to collect data on exposure status, demographics, and potential confounders. The health effect markers were measured in the medical laboratory. Control banding from the Nanotool Risk Level Matrix was used to categorize the exposure risk levels of the workers. The results showed that the antioxidant enzyme superoxide dismutase (SOD) in risk level 1 (RL1) and risk level 2 (RL2) workers was significantly (p < 0.05) lower than in control workers. A significantly decreasing gradient was found for SOD (control > RL1 > RL2). Another antioxidant, glutathione peroxidase (GPX), was significantly lower only in RL1 workers than in the control workers. The cardiovascular markers fibrinogen and ICAM (intercellular adhesion molecule) were significantly higher in RL2 workers than in controls, and a significant dose-response with an increasing trend was found for these two cardiovascular markers. Another cardiovascular marker, interleukin-6, was significantly increased among RL1 workers, but not among RL2 workers. The accuracy rate for remembering 7 digits and reciting them backwards was significantly lower in RL2 workers (OR = 0.48) than in controls, and a significantly reversed gradient was also found for the correct rate of backward memory (OR = 0.90 for RL1, OR = 0.48 for RL2, p < 0.05 in test for trend). Depression of antioxidant enzymes and increased expression of cardiovascular markers were found among workers handling nanomaterials. Antioxidant enzymes, such as SOD and GPX, and cardiovascular markers, such as fibrinogen, ICAM, and interleukin-6, are possible biomarkers for medical surveillance of

  15. Orbital debris hazard insights from spacecraft anomalies studies

    NASA Astrophysics Data System (ADS)

    McKnight, Darren S.

    2016-09-01

    Since the dawning of the space age, space operators have been tallying spacecraft anomalies and failures and using these insights to improve space systems and operations. As space systems improved and their lifetimes increased, the anomaly and failure modes have multiplied. Primary triggers for space anomalies and failures include design issues, space environmental effects, and satellite operations. Attempts to correlate anomalies to the orbital debris environment started as early as the mid-1990s. Early attempts showed that tens of anomalies correlated well to altitudes where the cataloged debris population was the highest. However, due to the complexity of tracing debris impacts to mission anomalies, these analyses were found to be insufficient to prove causation. After the fragmentation of the Chinese Feng-Yun satellite in 2007, it was hypothesized that the nontrackable fragments causing anomalies in LEO would have increased significantly from this event. As a result, debris-induced anomalies should have gone up measurably in the vicinity of this breakup. Again, the analysis provided some subtle evidence of debris-induced anomalies but it was not convincing. The continued difficulty in linking debris flux to satellite anomalies and failures prompted the creation of a series of spacecraft anomalies and failures workshops to investigate the identified shortfalls. These gatherings have produced insights into why this process is not straightforward. Summaries of these studies and workshops are presented and observations made about how to create solutions for anomaly attribution, especially as it relates to debris-induced spacecraft anomalies and failures.

  16. Use of geotextiles for mitigation of the effects of man-made hazards such as greening of waste deposits in frame of the conversion of industrial areas

    NASA Astrophysics Data System (ADS)

    Bostenaru, Magdalena; Siminea, Ioana; Bostenaru, Maria

    2010-05-01

    The city of Karlsruhe lies in the Rhine valley; however, it is situated at a certain distance from the Rhine river and the waterfront is not integrated in the urban development. Nevertheless, the port on the Rhine developed into the second largest inland port in Germany. With the process of deindustrialisation, industrial use is now shrinking. With the simultaneous process of the ecological restoration of rivers, the conversion of the industrial area to green and residential areas is called for. In the 1990s a project was carried out by the third author of this contribution, together with Andrea Ciobanu, as students of the University of Karlsruhe, for the conversion of the Rhine port area of Karlsruhe into such a nature-residential use. The area also included a waste deposit, proposed to be transformed into a "green hill". Such an integration of a waste deposit into a park in the process of the conversion of an industrial area is not singular in Germany; several such projects were proposed and some of them realised at the IBA Emscher Park in the Ruhr area. Some of them were coupled with artistic projects. The technical details are also a subject of this contribution. Studies were made by the first two authors on the conditions in which plants grow on former waste deposits if supported by intermediate layers of a geotextile. The characteristics of the geotextiles, together with the technological process of obtaining them, and the results of laboratory and field experiments for use on waste deposits in comparable conditions in Romania will be shown. The geotextile is also usable for ash deposits such as those in the Ruhr area.

  17. Geometrical Scaling of the Magnitude Frequency Statistics of Fluid Injection Induced Earthquakes and Implications for Assessment and Mitigation of Seismic Hazard

    NASA Astrophysics Data System (ADS)

    Dinske, C.; Shapiro, S. A.

    2015-12-01

    To study the influence of the size and geometry of hydraulically perturbed rock volumes on the magnitude statistics of induced events, we compare b value and seismogenic index estimates derived from different algorithms. First, we use standard Gutenberg-Richter approaches such as the least squares fit and the maximum likelihood technique. Second, we apply the lower bound probability fit (Shapiro et al., 2013, JGR, doi:10.1002/jgrb.50264), which takes the finiteness of the perturbed volume into account. The different estimates systematically deviate from each other, and the deviations are larger for smaller perturbed rock volumes. This means that the frequency-magnitude distribution is most affected for small injection volumes and short injection times, resulting in a high apparent b value. In contrast, the specific magnitude value, the quotient of the seismogenic index and the b value (Shapiro et al., 2013, JGR, doi:10.1002/jgrb.50264), appears to be a unique seismotectonic parameter of a reservoir location. Our results confirm that it is independent of the size of the perturbed rock volume. The specific magnitude is hence an indicator of the magnitudes that one can expect for a given injection. Several performance tests to forecast the magnitude frequencies of induced events show that the seismogenic index model provides reliable predictions, which confirms its applicability as a forecasting tool, particularly if applied in real-time monitoring. The specific magnitude model can be used to predict an asymptotic upper limit of probable frequency-magnitude distributions of induced events. We also conclude from our analysis that the physical process of pore pressure diffusion for the event triggering and the scaling of the frequency-magnitude distribution by the size of the perturbed rock volume well depict the presented relation between the upper bound of maximum seismic moment and injected fluid volume (McGarr, 2014, JGR, doi:10.1002/2013JB010597), particularly if nonlinear effects in the diffusion process
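
    For readers unfamiliar with the two quantities compared above, the sketch below shows one common way to estimate a maximum-likelihood b value (Aki/Utsu estimator) and a Shapiro-type seismogenic index from a catalogue and the injected volume, and to turn them into an expected event count. The catalogue, completeness magnitude and volumes are synthetic assumptions, and the estimators are standard textbook forms rather than the specific algorithms compared in the study:

```python
# Hedged sketch of b-value and seismogenic-index estimation; synthetic inputs.
import numpy as np

def b_value_ml(magnitudes, m_c, dm=0.1):
    """Aki/Utsu maximum-likelihood b value for events complete above m_c."""
    m = magnitudes[magnitudes >= m_c]
    return np.log10(np.e) / (m.mean() - (m_c - dm / 2.0))

def seismogenic_index(magnitudes, m_c, injected_volume_m3, b):
    """Sigma = log10 N(>= m_c) - log10 Q_c + b * m_c (Shapiro-type form)."""
    n = np.sum(magnitudes >= m_c)
    return np.log10(n) - np.log10(injected_volume_m3) + b * m_c

# Synthetic catalogue with b ~ 1 above a completeness magnitude of 0.5
rng = np.random.default_rng(2)
m_c = 0.5
catalogue = m_c + rng.exponential(scale=1.0 / np.log(10), size=500)

b = b_value_ml(catalogue, m_c)
sigma = seismogenic_index(catalogue, m_c, injected_volume_m3=1.0e4, b=b)

# Expected number of events above magnitude M for a planned injection volume Q
Q_plan, M = 5.0e4, 2.0
n_expected = Q_plan * 10 ** (sigma - b * M)
print(round(b, 2), round(sigma, 2), round(n_expected, 1))
```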

  18. Urban Vulnerability Assessment to Seismic Hazard through Spatial Multi-Criteria Analysis. Case Study: the Bucharest Municipality/Romania

    NASA Astrophysics Data System (ADS)

    Armas, Iuliana; Dumitrascu, Silvia; Bostenaru, Maria

    2010-05-01

    conditions). In effect, the example of Bucharest demonstrates how the results shape the ‘vulnerability to seismic hazard' profile of the city, based on which decision makers could develop proper mitigation strategies. To sum up, the use of an analytical framework such as the standard Spatial Multi-Criteria Analysis (SMCA) - despite all difficulties in creating justifiable weights (Yeh et al., 1999) - results in accurate estimations of the state of the urban system. Although this method has often been mistrusted by decision makers (Janssen, 2001), we consider that the results can represent, based precisely on the level of generalization, a decision support framework for policy makers to critically reflect on possible risk mitigation plans. Further study will lead to the improvement of the analysis by integrating a series of daytime and nighttime scenarios and a better definition of the constructed space variables.

  19. Success in transmitting hazard science

    NASA Astrophysics Data System (ADS)

    Price, J. G.; Garside, T.

    2010-12-01

    Money motivates mitigation. An example of success in communicating scientific information about hazards, coupled with information about available money, is the follow-up action by local governments to actually mitigate. The Nevada Hazard Mitigation Planning Committee helps local governments prepare competitive proposals for federal funds to reduce risks from natural hazards. Composed of volunteers with expertise in emergency management, building standards, and earthquake, flood, and wildfire hazards, the committee advises the Nevada Division of Emergency Management on (1) the content of the State’s hazard mitigation plan and (2) projects that have been proposed by local governments and state agencies for funding from various post- and pre-disaster hazard mitigation programs of the Federal Emergency Management Agency. Local governments must have FEMA-approved hazard mitigation plans in place before they can receive this funding. The committee has been meeting quarterly with elected and appointed county officials, at their offices, to encourage them to update their mitigation plans and apply for this funding. We have settled on a format that includes the county’s giving the committee an overview of its infrastructure, hazards, and preparedness. The committee explains the process for applying for mitigation grants and presents the latest information that we have about earthquake hazards, including locations of nearby active faults, historical seismicity, geodetic strain, loss-estimation modeling, scenarios, and documents about what to do before, during, and after an earthquake. Much of the county-specific information is available on the web. The presentations have been well received, in part because the committee makes the effort to go to their communities, and in part because the committee is helping them attract federal funds for local mitigation of not only earthquake hazards but also floods (including canal breaches) and wildfires, the other major concerns in

  20. Probabilistic seismic hazard study based on active fault and finite element geodynamic models

    NASA Astrophysics Data System (ADS)

    Kastelic, Vanja; Carafa, Michele M. C.; Visini, Francesco

    2016-04-01

    We present a probabilistic seismic hazard analysis (PSHA) that is exclusively based on active faults and geodynamic finite element input models, whereas seismic catalogues were used only in a posterior comparison. We applied the developed model in the External Dinarides, a slow deforming thrust-and-fold belt at the contact between Adria and Eurasia. Our method consists of establishing two earthquake rupture forecast models: (i) a geological active fault input (GEO) model and (ii) a finite element (FEM) model. The GEO model is based on an active fault database that provides information on fault location and its geometric and kinematic parameters together with estimations of its slip rate. By default in this model all deformation is set to be released along the active faults. The FEM model is based on a numerical geodynamic model developed for the region of study. In this model the deformation is released not only along the active faults but also in the volumetric continuum elements. From both models we calculated the corresponding activity rates, earthquake rates and the final expected peak ground accelerations. We investigated both the source model and the earthquake model uncertainties by varying the main active fault and earthquake rate calculation parameters through constructing corresponding branches of the seismic hazard logic tree. Hazard maps and UHS curves have been produced for horizontal ground motion on bedrock conditions (VS30 ≥ 800 m/s), thereby not considering local site amplification effects. The hazard was computed over a 0.2° spaced grid considering 648 branches of the logic tree and the mean value of the 10% probability of exceedance in 50 years hazard level, while the 5th and 95th percentiles were also computed to investigate the model limits. We conducted a sensitivity analysis to determine which of the input parameters influence the final hazard results and to what extent. The results of such comparison evidence the deformation model and
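
    The mean hazard and the 5th/95th percentile curves mentioned above are weighted statistics over the logic-tree branches. The sketch below shows a generic way to compute them; the branch curves and weights are random placeholders, not the actual 648-branch Dinarides results:

```python
# Generic sketch: mean and percentile hazard curves from weighted logic-tree branches.
import numpy as np

n_branches, n_iml = 648, 20                # branches and intensity-measure levels
rng = np.random.default_rng(3)
weights = rng.random(n_branches)
weights /= weights.sum()                   # branch weights must sum to 1
# annual probabilities of exceedance, one decreasing row (curve) per branch
poe = np.sort(rng.random((n_branches, n_iml)), axis=1)[:, ::-1]

mean_curve = weights @ poe                 # weighted mean hazard curve

def weighted_percentile(values, w, q):
    order = np.argsort(values)
    cdf = np.cumsum(w[order])
    return values[order][np.searchsorted(cdf, q)]

p05 = np.array([weighted_percentile(poe[:, i], weights, 0.05) for i in range(n_iml)])
p95 = np.array([weighted_percentile(poe[:, i], weights, 0.95) for i in range(n_iml)])
```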

  1. A Study of Aircraft Fire Hazards Related to Natural Electrical Phenomena

    NASA Technical Reports Server (NTRS)

    Kester, Frank L.; Gerstein, Melvin; Plumer, J. A.

    1960-01-01

    The problems of natural electrical phenomena as a fire hazard to aircraft are evaluated. Assessment of the hazard is made over the range of low level electrical discharges, such as static sparks, to high level discharges, such as lightning strikes to aircraft. In addition, some fundamental work is presented on the problem of flame propagation in aircraft fuel vent systems. This study consists of a laboratory investigation in five parts: (1) a study of the ignition energies and flame propagation rates of kerosene-air and JP-6-air foams, (2) a study of the rate of flame propagation of n-heptane, n-octane, n-nonane, and n-decane in aircraft vent ducts, (3) a study of the damage to aluminum, titanium, and stainless steel aircraft skin materials by lightning strikes, (4) a study of fuel ignition by lightning strikes to aircraft skins, and (5) a study of lightning induced flame propagation in an aircraft vent system.

  2. Space Particle Hazard Specification, Forecasting, and Mitigation

    DTIC Science & Technology

    2007-11-30

    or damage rates ( METEOR IMPACT), (6) A meteor sky map module to calculate the number of visible meteors from active meteor showers ( METEOR SKY...determined. METEOR SKY MAP: The Meteor Sky Map module calculates the number of visible meteors from active meteor showers (and any user...communication link outage regions for active ionospheric conditions; (6) specification of meteor flux rates along orbits plus probabilities of incurring

  3. Predisaster Hazard Mitigation Act of 2010

    THOMAS, 111th Congress

    Sen. Lieberman, Joseph I. [ID-CT

    2010-04-22

    06/29/2010 Held at the desk. (All Actions) Notes: For further action, see H.R. 1746, which became Public Law 111-351 on 1/4/2011. Status: Passed Senate.

  4. Natural hazard risk perception of Italian population: case studies along national territory.

    NASA Astrophysics Data System (ADS)

    Gravina, Teresita; Tupputi Schinosa, Francesca De Luca; Zuddas, Isabella; Preto, Mattia; Marengo, Angelo; Esposito, Alessandro; Figliozzi, Emanuele; Rapinatore, Matteo

    2015-04-01

    Risk perception is the judgment that people make about the characteristics and severity of risks. In the last few years, risk perception studies have focused on providing cognitive elements to the communication experts responsible for designing appropriate public information and awareness strategies. Several authors have used questionnaires to determine the perception of natural hazard risks (seismic, landslide, cyclone, flood, volcanic), as these provide reliable quantitative data and permit comparison of the results with those of similar surveys. In Italy, risk perception studies based on surveys were also carried out on natural risks of national importance, in particular the Somma-Vesuvio and Phlegrean Fields volcanic risks, but risk perception studies of local situations distributed across the whole national territory were lacking. Natural hazards of national importance are frequently reported by the national mass media, and there is debate about civil protection emergency plans, whereas it can be difficult to obtain information on local and regional natural hazards, which are diffuse across the national territory. In fact, the Italian peninsula is a geologically young area subject to endogenous phenomena (volcanoes, earthquakes) and exogenous phenomena that drive landscape evolution and create natural hazards (landslides, coastal erosion, hydrogeological instability, sinkholes) for the population. For this reason, we investigated natural risk perception in different Italian places where natural hazards have occurred but were not reported by the mass media, being only locally relevant or historical events. We carried out surveys in different Italian places affected by different types of natural hazards (landslide, coastal erosion, hydrogeological instability, sinkhole, volcanic phenomena and earthquake) and compared the results, in order to understand the population's level of perception, awareness and preparation through civil protection exercises. Our findings support that risks

  5. Study of metal whiskers growth and mitigation technique using additive manufacturing

    NASA Astrophysics Data System (ADS)

    Gullapalli, Vikranth

    For years, the alloy of choice for electroplating electronic components has been tin-lead (Sn-Pb) alloy. However, legislation established in Europe on July 1, 2006, required significant lead (Pb) content reductions in electronic hardware due to its toxic nature. A popular alternative for coating electronic components is pure tin (Sn). However, pure tin has the tendency to spontaneously grow electrically conductive Sn whiskers during storage. A Sn whisker is usually a pure single-crystal tin filament or hair-like structure grown directly from the electroplated surface. Sn whiskers are highly conductive and can cause short circuits in electronic components, which is a very significant reliability problem. Damage caused by Sn whisker growth has been reported in very critical applications such as aircraft, spacecraft, satellites, and military weapons systems. The whiskers are also naturally very strong and are believed to grow from compressive stresses developed in the Sn coating during deposition or over time. The new directive, even though environmentally friendly, has placed all lead-free electronic devices at risk because of whisker growth in pure tin. Additionally, interest has arisen in studying the nature of other metal whiskers such as zinc (Zn) whiskers and comparing their behavior to that of Sn whiskers. Zn whiskers can be found in the flooring of data centers; they can get inside electronic systems during equipment reorganization and movement and can also cause system failures. Even though the topic of metal whiskers as a reliability failure has been around for several decades, to date there is no successful method that can eliminate their growth. This thesis gives further insights into the nature and behavior of Sn and Zn whisker growth, and recommends a novel manufacturing technique that has the potential to mitigate metal whisker growth and extend the life of many electronic devices.

  6. When does highway construction to mitigate congestion reduce carbon emissions? A Case Study: The Caldecott Tunnel

    NASA Astrophysics Data System (ADS)

    Thurlow, M. E.; Maness, H.; Wiersema, D. J.; Mcdonald, B. C.; Harley, R.; Fung, I. Y.

    2014-12-01

    The construction of the fourth bore of the Caldecott Tunnel, which connects Oakland and Moraga, CA on State Route 24, was the second largest roadway construction project in California last year, with a total cost of $417 million. The objective of the fourth bore was to reduce traffic congestion before the tunnel entrance in the off-peak direction of travel, but the project was a source of conflict between policy makers and environmental and community groups concerned about the air quality and traffic impacts. We analyze the impact of the opening of the fourth bore on CO2 emissions associated with traffic. We made surface observations of CO2 from a mobile platform along State Route 24 for several weeks in November 2013, incorporating the period prior to and after the opening of the fourth bore on November 16, 2013. We directly compare bottom-up and top-down approaches to estimate the change in traffic emissions associated with the fourth bore opening. A bottom-up emissions inventory was derived from the high-resolution Performance Measurement System (PeMS) dataset and the Multi-scale Motor Vehicle and Equipment Emissions System (MOVES). The emissions inventory was used to drive a box model as well as a high-resolution regional transport model (the Weather Research and Forecasting model). The box model was also used to derive emissions from observations in a basic inversion. We also present an analysis of long-term traffic patterns and consider the potential for compensating changes in behavior that offset the observed emissions reductions on longer timescales. Finally, we examine how the results from the Caldecott study demonstrate the general benefit of using mobile measurements for quantifying the environmental impacts of congestion mitigation projects.

  7. Strategies for casualty mitigation programs by using advanced tsunami computation

    NASA Astrophysics Data System (ADS)

    IMAI, K.; Imamura, F.

    2012-12-01

    1. Purpose of the study: In this study, based on the scenario of great earthquakes along the Nankai trough, we aim to estimate with high accuracy the run-up and inundation process of tsunami in coastal areas, including rivers. Using a practical tsunami analytical model, and taking into account the characteristics of detailed topography, land use and climate change in a realistic present and expected future environment, we examined the run-up and tsunami inundation process. Using these results we estimated the damage due to tsunami and obtained information for the mitigation of human casualties. Considering the time series from the occurrence of the earthquake and the risk of tsunami damage, in order to mitigate casualties we provide the contents of disaster risk information displayed in a tsunami hazard and risk map. 2. Creating a tsunami hazard and risk map: From the analytical and practical tsunami model (a long-wave approximation model) and the high resolution topography (5 m), including detailed data on the shoreline, rivers, buildings and houses, we present an advanced analysis of tsunami inundation considering the land use. Based on the results of the tsunami inundation and its analysis, it is possible to draw a tsunami hazard and risk map with information on human casualties, building damage estimation, drift of vehicles, etc. 3. Contents of disaster prevention information: To improve the distribution of hazard, risk and evacuation information, it is necessary to follow three steps. (1) Provide basic information such as tsunami attack information, areas and routes for evacuation and the location of tsunami evacuation facilities. (2) Provide as additional information the time when inundation starts, the actual results of inundation, the location of facilities with hazardous materials, and the presence or absence of public facilities and underground areas that require evacuation. (3) Provide information to support disaster response such as infrastructure and traffic network damage prediction
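
    The practical model referred to above solves long-wave (shallow-water) equations over detailed topography. As a rough, purely illustrative sketch under strong simplifications (linear equations, one dimension, constant depth, reflective ends; none of these choices come from the study), the basic time stepping looks like this:

```python
# Toy 1-D linear long-wave solver on a staggered grid; illustrative only.
import numpy as np

g, h = 9.81, 50.0                  # gravity, constant water depth (m)
dx, nx = 100.0, 400                # grid spacing (m) and number of cells
dt = 0.5 * dx / np.sqrt(g * h)     # time step satisfying the CFL condition
eta = np.exp(-((np.arange(nx) - 200.0) * dx / 2000.0) ** 2)  # initial 1 m hump
u = np.zeros(nx + 1)               # velocities on cell edges (staggered grid)

for _ in range(500):
    # momentum: du/dt = -g * d(eta)/dx  (interior edges; walls stay at u = 0)
    u[1:-1] -= dt * g * (eta[1:] - eta[:-1]) / dx
    # continuity: d(eta)/dt = -h * du/dx
    eta -= dt * h * (u[1:] - u[:-1]) / dx

print("max surface elevation after run:", eta.max())
```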

  8. Treatment of kappa in Recent Western US Seismic Nuclear Plant Probabilistic Seismic Hazard Studies

    NASA Astrophysics Data System (ADS)

    Toro, G. R.; Di Alessandro, C.; Al Atik, L.

    2015-12-01

    The three operating nuclear plants (Diablo Canyon, Palo Verde, and Columbia Generating Station) in the western United States recently performed SSHAC Level 3 seismic hazard studies in response to a Request for Information by the Nuclear Regulatory Commission, following the accident at the Fukushima Dai-ichi nuclear facility. The treatment of zero-distance kappa, referred to as kappa_0 and commonly attributed to material damping and scattering in the shallow crust, was given extensive consideration in these studies. Available ground motion prediction equations (GMPEs) do not typically include kappa_0 as a prediction parameter and are developed for an average kappa_0 of the host region. Kappa scaling is routinely applied to adjust for the differences in average kappa between the GMPEs' host regions and the target regions. The impact of kappa scaling on the results of probabilistic seismic hazard analyses is significant for nuclear and other facilities that are sensitive to high frequency ground motions (frequencies greater than about 10 Hz). There are several available approaches for deriving kappa scaling factors for GMPEs, all of which require estimating kappa_0 at the target site. It is difficult to constrain the target kappa_0 empirically due to the scarcity of ground-motion data from hard-rock sites in ground-motion databases. The hazard studies for the three nuclear power plants had different data, faced different challenges in the estimation of kappa_0, and used different methods for the estimation of the effect of kappa_0 on the site-specific ground motions. This presentation summarizes the approaches used for the evaluation of kappa_0 and for its incorporation in the probabilistic seismic hazard analysis. Emphasis is given to the quantification of the kappa_0 uncertainty and to the evaluation of its impact on the resulting seismic hazard at the different sites.
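
    As background to the kappa scaling discussed above, a widely used high-frequency adjustment multiplies the Fourier amplitude spectrum by exp(-pi * f * delta_kappa), where delta_kappa is the difference between the target and host kappa_0. The sketch below applies that standard form; the host and target kappa_0 values and the stand-in spectrum are assumptions, not those adopted in the plant studies:

```python
# Standard Anderson-Hough-style kappa adjustment of a Fourier spectrum.
import numpy as np

freqs = np.logspace(-1, 2, 200)          # 0.1 to 100 Hz
kappa_host, kappa_target = 0.04, 0.01    # seconds (illustrative values)
delta_kappa = kappa_target - kappa_host

# A smaller target kappa_0 (harder site) damps high frequencies less, so the
# adjustment factor grows above roughly 10 Hz.
adjustment = np.exp(-np.pi * freqs * delta_kappa)

host_spectrum = freqs ** -1.0            # stand-in for a GMPE-implied spectrum
target_spectrum = host_spectrum * adjustment
```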

  9. Comparative risk judgements for oral health hazards among Norwegian adults: a cross sectional study.

    PubMed

    Astrøm, Anne

    2002-08-20

    BACKGROUND: This study identified optimistic biases in health and oral health hazards, and explored whether comparative risk judgements for oral health hazards vary systematically with socio-economic characteristics and self-reported risk experience. METHODS: A simple random sample of 1,190 residents born in 1972 was drawn from the population resident in three counties of Norway. A total of 735 adults (51% women) completed postal questionnaires at home. RESULTS: Mean ratings of comparative risk judgements differed significantly (p < 0.001) from the mid point of the scales. T-values ranged from -13.1 and -12.1 for the perceived risk of being divorced and losing all teeth to -8.2 and -7.8 (p < 0.001) for having gum disease and tooth decay. Multivariate analyses using General Linear Models, GLM, revealed gender differences in comparative risk judgements for gum disease, whereas social position varied systematically with risk judgements for tooth decay, gum disease and air pollution. The odds ratios for being comparatively optimistic with respect to having gum disease were 2.9, 1.9, 1.8 and 1.5 if being satisfied with dentition, having a favourable view of health situation, and having high and low involvement with health enhancing and health detrimental behaviour, respectively. CONCLUSION: Optimism in comparative judgements for health and oral health hazards was evident in young Norwegian adults. When judging their comparative susceptibility for oral health hazards, they consider personal health situation and risk behaviour experience.

  10. Comparative risk judgements for oral health hazards among Norwegian adults: a cross sectional study

    PubMed Central

    Åstrøm, Anne Nordrehaug

    2002-01-01

    Background This study identified optimistic biases in health and oral health hazards, and explored whether comparative risk judgements for oral health hazards vary systematically with socio-economic characteristics and self-reported risk experience. Methods A simple random sample of 1,190 residents born in 1972 was drawn from the population resident in three counties of Norway. A total of 735 adults (51% women) completed postal questionnaires at home. Results Mean ratings of comparative risk judgements differed significantly (p < 0.001) from the mid point of the scales. T-values ranged from -13.1 and -12.1 for the perceived risk of being divorced and losing all teeth to -8.2 and -7.8 (p < 0.001) for having gum disease and tooth decay. Multivariate analyses using General Linear Models, GLM, revealed gender differences in comparative risk judgements for gum disease, whereas social position varied systematically with risk judgements for tooth decay, gum disease and air pollution. The odds ratios for being comparatively optimistic with respect to having gum disease were 2.9, 1.9, 1.8 and 1.5 if being satisfied with dentition, having a favourable view of health situation, and having high and low involvement with health enhancing and health detrimental behaviour, respectively. Conclusion Optimism in comparative judgements for health and oral health hazards was evident in young Norwegian adults. When judging their comparative susceptibility for oral health hazards, they consider personal health situation and risk behaviour experience. PMID:12186656

  11. Robot-assisted home hazard assessment for fall prevention: a feasibility study.

    PubMed

    Sadasivam, Rajani S; Luger, Tana M; Coley, Heather L; Taylor, Benjamin B; Padir, Taskin; Ritchie, Christine S; Houston, Thomas K

    2014-01-01

    We examined the feasibility of using a remotely manoeuvrable robot to make home hazard assessments for fall prevention. We employed use-case simulations to compare robot assessments with in-person assessments. We screened the homes of nine elderly patients (aged 65 years or more) for fall risks using the HEROS screening assessment. We also assessed the participants' perspectives of the remotely-operated robot in a survey. The nine patients had a median Short Blessed Test score of 8 (interquartile range, IQR 2-20) and a median Life-Space Assessment score of 46 (IQR 27-75). Compared to the in-person assessment (mean = 4.2 hazards identified per participant), significantly more home hazards were perceived in the robot video assessment (mean = 7.0). Only two checklist items (adequate bedroom lighting and a clear path from bed to bathroom) had more than 60% agreement between the in-person and robot video assessments. Participants were enthusiastic about the robot and did not think it violated their privacy. The study found little agreement between the in-person and robot video hazard assessments. However, it identified several research questions about how to best use remotely-operated robots.

  12. Integration of Airborne Laser Scanning Altimetry Data in Alpine Geomorphological and Hazard Studies

    NASA Astrophysics Data System (ADS)

    Seijmonsbergen, A. C.

    2007-12-01

    A digital terrain and surface model derived from an airborne laser scanning (ALS) altimetry dataset was used in the Austrian Alps for the preparation, improvement and evaluation of a digital geomorphological hazard map. The geomorphology in the study area consists of a wide variety of landforms, which include glacial landforms such as cirques, hanging valleys and moraine deposits, and pre- and postglacial mass movement landforms and processes such as deep-seated slope failures, rock fall, debris flows and solifluction. The area includes naked and covered gypsum karst, collapse dolines and fluvial landforms and deposits such as river terraces, incisions, alluvial fans and gullies. A detailed symbol-based paper geomorphological map served as a basis for the digitization of basic morphogenetic landform and process units. These units were assigned a 'geomorphological unit type', 'hazard type' and 'activity' code in the attribute table, according to a morphogenetic classification scheme. Selected zonal statistical attributes - mean height, aspect and slope angle - were calculated in a GIS using the vector-based morphogenetic landform and process units and the underlying 1 m resolution laser altimetry raster dataset. This statistical information was added to the attribute table of the 'geomorphological hazard map'. Interpretation of the zonal statistical information shows that indicative topographic signatures exist for the various geomorphological and hazard units in this region of the Alps. Based on this experience, a further step is made towards semi-automated geomorphological hazard classification of segmented laser altimetry data using expert knowledge rules. The first results indicate a classification accuracy of 50-70 percent for most landform associations. Areas affected by slide processes resulted in less accurate classification, probably because of their polygenetic history in this area. It is concluded that the use of lidar data improves visual
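
    The zonal statistics step described above reduces to averaging a DEM derivative within each mapped unit. The sketch below uses synthetic label and slope/aspect grids (stand-ins for the rasterised geomorphological units and the 1 m ALS derivatives), since the actual data are not part of this record:

```python
# Minimal zonal-statistics sketch over synthetic stand-in grids.
import numpy as np

rng = np.random.default_rng(4)
labels = rng.integers(1, 6, size=(500, 500))   # stand-in rasterised unit IDs
slope = rng.uniform(0, 45, size=(500, 500))    # stand-in slope grid (degrees)
aspect = rng.uniform(0, 360, size=(500, 500))  # stand-in aspect grid (degrees)

stats = {}
for unit in np.unique(labels):
    mask = labels == unit
    stats[unit] = {
        "mean_slope": float(slope[mask].mean()),
        # note: aspect is circular, so a vector (circular) mean is preferable
        # in practice; the arithmetic mean is kept here only for brevity
        "mean_aspect": float(aspect[mask].mean()),
    }
```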

  13. Reviewing and visualising relationships between anthropic processes and natural hazards within a multi-hazard framework

    NASA Astrophysics Data System (ADS)

    Gill, Joel C.; Malamud, Bruce D.

    2014-05-01

    Here we present a broad overview of the interaction relationships between 17 anthropic processes and 21 different natural hazard types. Anthropic processes are grouped into seven categories (subsurface extraction, subsurface addition, land use change, explosions, hydrological change, surface construction processes, miscellaneous). Natural hazards are grouped into six categories (geophysical, hydrological, shallow earth processes, atmospheric, biophysical and space). A wide-ranging review based on grey- and peer-reviewed literature from many scientific disciplines identified 54 relationships where anthropic processes have been noted to trigger natural hazards. We record case studies for all but three of these relationships. Based on the results of this review, we find that the anthropic processes of deforestation, explosions (conventional and nuclear) and reservoir construction could trigger the widest range of different natural hazard types. We also note that within the natural hazards, landslides and earthquakes are those that could be triggered by the widest range of anthropic processes. This work also examines the possibility of anthropic processes (i) resulting in an increased occurrence of a particular hazard interaction (e.g., deforestation could result in an increased interaction between storms and landslides); and (ii) inadvertently reducing the likelihood of a natural hazard or natural hazard interaction (e.g., poor drainage or deforestation reducing the likelihood of wildfires triggered by lightning). This study synthesises, using accessible visualisation techniques, the large amounts of anthropic process and natural hazard information from our review. In it we have outlined the importance of considering anthropic processes within any analysis of hazard interactions, and we reinforce the importance of a holistic approach to natural hazard assessment, mitigation and management.

  14. Thermal study of payload module for the next-generation infrared space telescope SPICA in risk mitigation phase

    NASA Astrophysics Data System (ADS)

    Shinozaki, Keisuke; Sato, Yoichi; Sawada, Kenichiro; Ando, Makiko; Sugita, Hiroyuki; Yamawaki, Toshihiro; Mizutani, Tadahiro; Komatsu, Keiji; Nakagawa, Takao; Murakami, Hiroshi; Matsuhara, Hideo; Takada, Makoto; Takai, Shigeki; Okabayashi, Akinobu; Tsunematsu, Shoji; Kanao, Kenichi; Narasaki, Katsuhiro

    2014-11-01

    SPace Infrared telescope for Cosmology and Astrophysics (SPICA) is a pre-project of JAXA in collaboration with ESA to be launched around 2020. SPICA will be transferred into a halo orbit around the second Lagrangian point (L2) in the Sun-Earth system, which enables us to use effective radiant cooling in combination with a mechanical cooling system in order to cool a large 3 m IR telescope below 6 K. At present, a conceptual study of SPICA is underway to assess and mitigate the mission's risks; the thermal study for risk mitigation sets goals of a 25% margin on the cooling power of the 4 K/1 K temperature regions and a 25% margin on the heat load from the Focal Plane Instruments (FPIs) at the intermediate temperature region, and aims to enhance the reliability of the mechanical cooler system and the feasibility of ground tests. Thermal property measurements of FRP materials are also important. This paper introduces details of the thermal design study for risk mitigation, including development of the truss separation mechanism, the cryogenic radiator, the mechanical cooler system, and thermal property measurements of materials.

  15. Social and ethical perspectives of landslide risk mitigation measures

    NASA Astrophysics Data System (ADS)

    Kalsnes, Bjørn; Vangelsten, Bjørn V.

    2015-04-01

    Landslide risk may be mitigated by use of a wide range of measures. Mitigation and prevention options may include (1) structural measures to reduce the frequency, severity or exposure to the hazard, (2) non-structural measures, such as land-use planning and early warning systems, to reduce the hazard frequency and consequences, and (3) measures to pool and transfer the risks. In a given situation the appropriate system of mitigation measures may be a combination of various types of measures, both structural and non-structural. In the process of choosing mitigation measures for a given landslide risk situation, the role of the geoscientist is normally to propose possible mitigation measures on the basis of the risk level and technical feasibility. Social and ethical perspectives are often neglected in this process. However, awareness of the need to consider social as well as ethical issues in the design and management of mitigating landslide risk is rising. There is a growing understanding that technical experts acting alone cannot determine what will be considered the appropriate set of mitigation and prevention measures. Issues such as environment versus development, questions of acceptable risk, who bears the risks and benefits, and who makes the decisions also need to be addressed. Policymakers and stakeholders engaged in solving environmental risk problems are increasingly recognising that traditional expert-based decision-making processes are insufficient. This paper analyses the process of choosing appropriate measures to mitigate landslide risk from a social and ethical perspective, considering technical, cultural, economic, environmental and political elements. The paper focuses on stakeholder involvement in the decision-making process, and shows how developing strategies for risk communication is key to a successful process. The study is supported by case study examples from Norway and Italy. In the Italian case study, three different risk mitigation

  16. A web-based tool for ranking landslide mitigation measures

    NASA Astrophysics Data System (ADS)

    Lacasse, S.; Vaciago, G.; Choi, Y. J.; Kalsnes, B.

    2012-04-01

    brief description, guidance on design, schematic details, practical examples and references for each mitigation measure. Each of the measures was given a score on its ability and applicability for different types of landslides and boundary conditions, and a decision support matrix was established. The web-based toolbox organizes the information in the compendium and provides an algorithm to rank the measures on the basis of the decision support matrix, and on the basis of the risk level estimated at the site. The toolbox includes a description of the case under study and offers a simplified option for estimating the hazard and risk levels of the slide at hand. The user selects the mitigation measures to be included in the assessment. The toolbox then ranks, with built-in assessment factors and weights and/or with user-defined ranking values and criteria, the mitigation measures included in the analysis. The toolbox includes data management, e.g. saving data half-way in an analysis, returning to an earlier case, looking up prepared examples or looking up information on mitigation measures. The toolbox also generates a report and has user-forum and help features. The presentation will give an overview of the mitigation measures considered and examples of the use of the toolbox, and will take the attendees through the application of the toolbox.
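
    The ranking step described above is essentially a weighted scoring over a decision support matrix. The sketch below illustrates the idea with invented measures, criteria, scores and weights; it is not the toolbox's actual matrix or algorithm:

```python
# Illustrative weighted scoring of candidate mitigation measures.
measures = {
    "surface drainage":    {"applicability": 4, "cost": 3, "maturity": 5},
    "retaining structure": {"applicability": 5, "cost": 1, "maturity": 4},
    "early warning":       {"applicability": 3, "cost": 4, "maturity": 3},
}
weights = {"applicability": 0.5, "cost": 0.3, "maturity": 0.2}

def score(measure_scores):
    """Weighted sum of the criterion scores for one measure."""
    return sum(weights[c] * s for c, s in measure_scores.items())

ranking = sorted(measures, key=lambda m: score(measures[m]), reverse=True)
print(ranking)
```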

  17. Seaside, Oregon, Tsunami Pilot Study-Modernization of FEMA Flood Hazard Maps: GIS Data

    USGS Publications Warehouse

    Wong, Florence L.; Venturato, Angie J.; Geist, Eric L.

    2006-01-01

    Introduction: The Federal Emergency Management Agency (FEMA) Flood Insurance Rate Map (FIRM) guidelines do not currently exist for conducting and incorporating tsunami hazard assessments that reflect the substantial advances in tsunami research achieved in the last two decades; this conclusion is the result of two FEMA-sponsored workshops and the associated Tsunami Focused Study (Chowdhury and others, 2005). Therefore, as part of FEMA's Map Modernization Program, a Tsunami Pilot Study was carried out in the Seaside/Gearhart, Oregon, area to develop an improved Probabilistic Tsunami Hazard Analysis (PTHA) methodology and to provide recommendations for improved tsunami hazard assessment guidelines (Tsunami Pilot Study Working Group, 2006). The Seaside area was chosen because it is typical of many coastal communities in the section of the Pacific Coast from Cape Mendocino to the Strait of Juan de Fuca, and because State agencies and local stakeholders expressed considerable interest in mapping the tsunami threat to this area. The study was an interagency effort by FEMA, the U.S. Geological Survey, and the National Oceanic and Atmospheric Administration (NOAA), in collaboration with the University of Southern California, Middle East Technical University, Portland State University, Horning Geoscience, Northwest Hydraulics Consultants, and the Oregon Department of Geological and Mineral Industries. We present the spatial (geographic information system, GIS) data from the pilot study in standard GIS formats and provide files for visualization in Google Earth, a global map viewer.

  18. Selecting land-based mitigation practices to reduce GHG emissions from the rural land use sector: a case study of North East Scotland.

    PubMed

    Feliciano, Diana; Hunter, Colin; Slee, Bill; Smith, Pete

    2013-05-15

    The Climate Change (Scotland) Act 2009 commits Scotland to reduce GHG emissions by at least 42% by 2020 and 80% by 2050, from 1990 levels. According to the Climate Change Delivery Plan, the desired emission reduction for the rural land use sector (agriculture and other land uses) is 21% compared to 1990, or 10% compared to 2006 levels. In 2006, in North East Scotland, gross greenhouse gas (GHG) emissions from rural land uses were about 1599 ktCO2e. Thus, to achieve a 10% reduction in 2020 relative to 2006, emissions would have to decrease to about 1440 ktCO2e. This study developed a methodology to help select land-based practices to mitigate GHG emissions at the regional level. The main criterion used was the "full" mitigation potential of each practice. A mix of methods was used to undertake this study, namely a literature review and quantitative estimates. The mitigation practice that offered the greatest "full" mitigation potential (≈66% reduction by 2020 relative to 2006) was woodland planting with Sitka spruce. Several barriers, such as economic, social, political and institutional barriers, affect the uptake of mitigation practices in the region. Consequently the achieved mitigation potential of a practice may be lower than its "full" mitigation potential. Surveys and focus groups with relevant stakeholders need to be undertaken to assess the real area where mitigation practices can be implemented and the best way to overcome the barriers to their implementation.
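
    The regional target quoted above follows directly from the 2006 baseline; a quick check of the arithmetic, using only the numbers given in the abstract:

```python
# Arithmetic check of the regional target stated in the abstract.
emissions_2006 = 1599.0                    # ktCO2e, rural land use, NE Scotland
target_2020 = emissions_2006 * (1 - 0.10)  # 10% cut relative to 2006
print(round(target_2020))                  # ~1439 ktCO2e, i.e. "about 1440"
```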

  19. Climate engineering of vegetated land for hot extremes mitigation: an ESM sensitivity study

    NASA Astrophysics Data System (ADS)

    Wilhelm, Micah; Davin, Edouard; Seneviratne, Sonia

    2014-05-01

    Mitigation efforts to reduce anthropogenic climate forcing have thus far proven inadequate, as evident from accelerating greenhouse gas emissions. Many subtropical and mid-latitude regions are expected to experience longer and more frequent heat waves and droughts within the next century. This increased occurrence of weather extremes has important implications for human health, mortality and for socio-economic factors including forest fires, water availability and agricultural production. Various solar radiation management (SRM) schemes that attempt to homogeneously counter the anthropogenic forcing have been examined with different Earth System Models (ESM). Land climate engineering schemes that reduce the amount of solar radiation absorbed at the surface have also been investigated. However, most studies have investigated their effects on the mean climate response rather than on extremes. Here we present the results of a series of climate engineering sensitivity experiments performed with the Community Earth System Model (CESM) version 1.0.2 at 2°-resolution. This configuration entails 5 fully coupled model components responsible for simulating the Earth's atmosphere, land, land-ice, ocean and sea-ice that interact through a central coupler. Historical and RCP8.5 scenarios were performed with transient land-cover changes and prognostic terrestrial Carbon/Nitrogen cycles. Four sets of experiments are performed in which surface albedo over snow-free vegetated grid points is increased by 0.05, 0.10, 0.15 and 0.20. The simulations show a strong preferential cooling of hot extremes throughout the Northern mid-latitudes during boreal summer. A strong linear scaling between the cooling of extremes and the additional surface albedo applied to the land model is observed. The strongest preferential cooling is found in southeastern Europe and the central United States, where increases of soil moisture and evaporative fraction are the largest relative to the control

  20. Space Debris & its Mitigation

    NASA Astrophysics Data System (ADS)

    Kaushal, Sourabh; Arora, Nishant

    2012-07-01

    Space debris has become a growing concern in recent years, since collisions at orbital velocities can be highly damaging to functioning satellites and can also produce even more space debris in the process. Some spacecraft, like the International Space Station, are now armored to deal with this hazard, but armor and mitigation measures can be prohibitively costly when trying to protect satellites or human spaceflight vehicles like the shuttle. This paper describes the current orbital debris environment, outlines its main sources, and identifies mitigation measures to reduce orbital debris growth by controlling these sources. We studied the literature on space debris. We have proposed some methods to address the problem of space debris, highlighted the shortcomings of methods already proposed by space experts, and proposed some modifications to those methods. Some of them can be very effective in mitigating space debris, but some of them need modification. Recently proposed methods by space experts are maneuvering, shielding of a space elevator with foil, vaporizing or redirecting space debris back to earth with the help of lasers, use of aerogel as a protective layer, construction of large junkyards around the International Space Station, use of electrodynamic tethers, and, most recently, the use of nano satellites in the clearing of space debris. Limitations of the already proposed methods are as follows: - Maneuvering can't be the final solution to our problem as it is an act of self-defence. - Shielding can't be done on parts like solar panels and optical devices. - Vaporizing or redirecting space debris can affect human life on earth if it is not done in a proper manner. - Aerogel has a threshold limit up to which it can bear (resist) the impact of a collision. - Large junkyards can be effective only for large sized debris. In this paper we propose: A. The Use of Nano Tubes by creating a mesh

  1. New Seismic Hazard study in Spain Aimed at the revision of the Spanish Building Code

    NASA Astrophysics Data System (ADS)

    Rivas-Medina, A.; Benito, B.; Cabañas, L.; Martínez-Solares, J. M.; Ruíz, S.; Gaspar-Escribano, J. M.; Carreño, E.; Crespo, M.; García-Mayordomo, J.

    2013-05-01

    In this paper we present a global overview of the recent study carried out in Spain for the new hazard map, whose final goal is the revision of the Spanish Building Code (NCSE-02). The study was carried out by a working group joining experts from the Instituto Geografico Nacional (IGN) and the Technical University of Madrid (UPM), with the different phases of the work supervised by a committee of national experts from public institutions involved in seismic hazard. The PSHA method (Probabilistic Seismic Hazard Assessment) has been followed, quantifying the epistemic uncertainties through a logic tree and the aleatory ones, linked to the variability of parameters, by means of probability density functions and Monte Carlo simulations. In a first phase, the inputs were prepared, which essentially are: 1) a project catalogue update and homogenization at Mw; 2) proposal of zoning models and source characterization; 3) calibration of Ground Motion Prediction Equations (GMPEs) with actual data and development of a local model with data collected in Spain for Mw < 5.5. In a second phase, a sensitivity analysis of the different input options on hazard results was carried out in order to have criteria for defining the branches of the logic tree and their weights. Finally, the hazard estimation was done with the logic tree shown in figure 1, including nodes for quantifying uncertainties corresponding to: 1) method for estimation of hazard (zoning and zoneless); 2) zoning models; 3) GMPE combinations used; and 4) regression method for estimation of source parameters. In addition, the aleatory uncertainties corresponding to the magnitude of the events, recurrence parameters and maximum magnitude for each zone have also been considered through probability density functions and Monte Carlo simulations. The main conclusions of the study are presented here, together with the obtained results in terms of PGA and other spectral accelerations.
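
    As a rough illustration of how the logic-tree and Monte Carlo machinery fit together in a PSHA, the sketch below samples an epistemic branch (here, a choice between two toy attenuation intercepts) by its weight and samples the aleatory variables (magnitude and ground-motion scatter) for each trial; the branch weights, recurrence parameters and attenuation form are placeholders and are not the models or values used in the Spanish study.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative logic-tree branches: two alternative attenuation intercepts with weights.
branches = [{"name": "GMPE-A", "c0": -1.0, "weight": 0.6},
            {"name": "GMPE-B", "c0": -1.4, "weight": 0.4}]
weights = np.array([b["weight"] for b in branches])

# Placeholder source parameters (not the values of the Spanish hazard model).
annual_rate = 0.05                      # events per year with M >= m_min
b_value, m_min, m_max = 1.0, 4.0, 7.0
site_distance_km = 20.0
sigma_ln = 0.6                          # aleatory variability of ln(PGA)

def sample_magnitude(n):
    """Sample a truncated Gutenberg-Richter magnitude distribution by inversion."""
    u = rng.random(n)
    beta = b_value * np.log(10.0)
    c = 1.0 - np.exp(-beta * (m_max - m_min))
    return m_min - np.log(1.0 - u * c) / beta

n = 200_000
branch_idx = rng.choice(len(branches), size=n, p=weights)    # epistemic sampling
c0 = np.array([branches[i]["c0"] for i in branch_idx])
mags = sample_magnitude(n)                                   # aleatory sampling
ln_pga = c0 + 1.0 * (mags - 6.0) - 1.3 * np.log(site_distance_km + 10.0)
pga = np.exp(ln_pga + rng.normal(0.0, sigma_ln, n))          # toy attenuation

target = 0.1                                                 # g
p_exceed = np.mean(pga > target)
print(f"P(PGA > {target} g | event) = {p_exceed:.3f}")
print(f"Approximate annual exceedance rate = {annual_rate * p_exceed:.4f}")
```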

  2. Using the proportional hazards model to study heart valve replacement data.

    PubMed

    Bunday, B D; Kiri, V A; Stoodley, K D

    1992-01-01

    The proportional hazards model is used to study the effect of various concomitant variables on the time to valve failure, mortality, or other complications, for patients who have had artificial heart valves inserted. The data are from a database at Killingbeck Hospital, which is still being assembled as more information is acquired. A suite of computer programs, not specifically developed with this application in mind, has been used to carry out the exploratory data analysis, the estimation of parameters and the validation of the model. These three elements of the analysis are all illustrated. The present report is seen as a preliminary study to assess the usefulness of the proportional hazards model in this area. Follow-up work as more data are accumulated is intended.
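
    For readers unfamiliar with the method, a minimal proportional hazards fit of the kind described, time to valve failure regressed on concomitant variables, might look like the sketch below; the synthetic data, variable names and the lifelines package are stand-ins for illustration, not the Killingbeck Hospital database or the authors' own program suite.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 300

# Synthetic stand-in for valve-replacement data: age and valve type as
# concomitant variables, exponential event times, random censoring.
age = rng.normal(60, 10, n)
mechanical = rng.integers(0, 2, n)            # 1 = mechanical, 0 = bioprosthetic
rate = 0.02 * np.exp(0.03 * (age - 60) - 0.4 * mechanical)
event_time = rng.exponential(1.0 / rate)
censor_time = rng.uniform(1, 15, n)

df = pd.DataFrame({
    "years": np.minimum(event_time, censor_time),
    "failed": (event_time <= censor_time).astype(int),
    "age": age,
    "mechanical": mechanical,
})

cph = CoxPHFitter()
cph.fit(df, duration_col="years", event_col="failed")
cph.print_summary()          # hazard ratios for age and valve type
cph.check_assumptions(df)    # crude check of the proportional hazards assumption
```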

  3. Assessment and indirect adjustment for confounding by smoking in cohort studies using relative hazards models.

    PubMed

    Richardson, David B; Laurier, Dominique; Schubauer-Berigan, Mary K; Tchetgen Tchetgen, Eric; Cole, Stephen R

    2014-11-01

    Workers' smoking histories are not measured in many occupational cohort studies. Here we discuss the use of negative control outcomes to detect and adjust for confounding in analyses that lack information on smoking. We clarify the assumptions necessary to detect confounding by smoking and the additional assumptions necessary to indirectly adjust for such bias. We illustrate these methods using data from 2 studies of radiation and lung cancer: the Colorado Plateau cohort study (1950-2005) of underground uranium miners (in which smoking was measured) and a French cohort study (1950-2004) of nuclear industry workers (in which smoking was unmeasured). A cause-specific relative hazards model is proposed for estimation of indirectly adjusted associations. Among the miners, the proposed method suggests no confounding by smoking of the association between radon and lung cancer--a conclusion supported by adjustment for measured smoking. Among the nuclear workers, the proposed method suggests substantial confounding by smoking of the association between radiation and lung cancer. Indirect adjustment for confounding by smoking resulted in an 18% decrease in the adjusted estimated hazard ratio, yet this cannot be verified because smoking was unmeasured. Assumptions underlying this method are described, and a cause-specific proportional hazards model that allows easy implementation using standard software is presented.

  4. Analytic study to evaluate associations between hazardous waste sites and birth defects. Final report

    SciTech Connect

    Marshall, E.G.; Gensburg, L.J.; Geary, N.S.; Deres, D.A.; Cayo, M.R.

    1995-06-01

    A study was conducted to evaluate the risk of two types of birth defects (central nervous system and musculoskeletal defects) associated with mothers' exposure to solvents, metals, and pesticides through residence near hazardous waste sites. The only environmental factor showing a statistically significant elevation in risk was living within one mile of industrial or commercial facilities emitting solvents into the air. Residence near these facilities showed elevated risk for central nervous system defects but no elevated risks for musculoskeletal defects.

  5. Incorporating induced seismicity in the 2014 United States National Seismic Hazard Model: results of the 2014 workshop and sensitivity studies

    USGS Publications Warehouse

    Petersen, Mark D.; Mueller, Charles S.; Moschetti, Morgan P.; Hoover, Susan M.; Rubinstein, Justin L.; Llenos, Andrea L.; Michael, Andrew J.; Ellsworth, William L.; McGarr, Arthur F.; Holland, Austin A.; Anderson, John G.

    2015-01-01

    The U.S. Geological Survey National Seismic Hazard Model for the conterminous United States was updated in 2014 to account for new methods, input models, and data necessary for assessing the seismic ground shaking hazard from natural (tectonic) earthquakes. The U.S. Geological Survey National Seismic Hazard Model project uses probabilistic seismic hazard analysis to quantify the rate of exceedance for earthquake ground shaking (ground motion). For the 2014 National Seismic Hazard Model assessment, the seismic hazard from potentially induced earthquakes was intentionally not considered because we had not determined how to properly treat these earthquakes for the seismic hazard analysis. The phrases “potentially induced” and “induced” are used interchangeably in this report; however, it is acknowledged that this classification is based on circumstantial evidence and scientific judgment. For the 2014 National Seismic Hazard Model update, the potentially induced earthquakes were removed from the NSHM’s earthquake catalog, and the documentation states that we would consider alternative models for including induced seismicity in a future version of the National Seismic Hazard Model. As part of the process of incorporating induced seismicity into the seismic hazard model, we evaluate the sensitivity of the seismic hazard from induced seismicity to five parts of the hazard model: (1) the earthquake catalog, (2) earthquake rates, (3) earthquake locations, (4) earthquake Mmax (maximum magnitude), and (5) earthquake ground motions. We describe alternative input models for each of the five parts that represent differences in scientific opinions on induced seismicity characteristics. In this report, however, we do not weight these input models to come up with a preferred final model. Instead, we present a sensitivity study showing uniform seismic hazard maps obtained by applying the alternative input models for induced seismicity. The final model will be released after

  6. FDA-iRISK--a comparative risk assessment system for evaluating and ranking food-hazard pairs: case studies on microbial hazards.

    PubMed

    Chen, Yuhuan; Dennis, Sherri B; Hartnett, Emma; Paoli, Greg; Pouillot, Régis; Ruthman, Todd; Wilson, Margaret

    2013-03-01

    Stakeholders in the system of food safety, in particular federal agencies, need evidence-based, transparent, and rigorous approaches to estimate and compare the risk of foodborne illness from microbial and chemical hazards and the public health impact of interventions. FDA-iRISK (referred to here as iRISK), a Web-based quantitative risk assessment system, was developed to meet this need. The modeling tool enables users to assess, compare, and rank the risks posed by multiple food-hazard pairs at all stages of the food supply system, from primary production, through manufacturing and processing, to retail distribution and, ultimately, to the consumer. Using standard data entry templates, built-in mathematical functions, and Monte Carlo simulation techniques, iRISK integrates data and assumptions from seven components: the food, the hazard, the population of consumers, process models describing the introduction and fate of the hazard up to the point of consumption, consumption patterns, dose-response curves, and health effects. Beyond risk ranking, iRISK enables users to estimate and compare the impact of interventions and control measures on public health risk. iRISK provides estimates of the impact of proposed interventions in various ways, including changes in the mean risk of illness and burden of disease metrics, such as losses in disability-adjusted life years. Case studies for Listeria monocytogenes and Salmonella were developed to demonstrate the application of iRISK for the estimation of risks and the impact of interventions for microbial hazards. iRISK was made available to the public at http://irisk.foodrisk.org in October 2012.
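
    The core of such a system is a Monte Carlo chain from contamination at retail through growth, consumption and a dose-response model to a per-serving risk, which can then be rerun under an intervention. The sketch below shows that chain schematically; all distributions, the exponential dose-response parameter and the serving counts are illustrative assumptions and do not reproduce FDA-iRISK's built-in models.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000                                       # Monte Carlo iterations (servings)

# Illustrative food-hazard pair: concentration at retail, growth during
# storage, serving size, and an exponential dose-response relation.
log10_conc_retail = rng.normal(-1.0, 1.0, n)      # log10 CFU/g at retail
growth_log10 = rng.uniform(0.0, 2.0, n)           # log10 growth before consumption
serving_g = rng.triangular(20, 50, 150, n)        # grams per serving

dose_cfu = 10 ** (log10_conc_retail + growth_log10) * serving_g
r = 1e-9                                          # illustrative dose-response parameter
p_illness = 1.0 - np.exp(-r * dose_cfu)           # exponential dose-response

mean_risk = p_illness.mean()
servings_per_year = 5e8                           # assumed national consumption
print(f"Mean risk per serving: {mean_risk:.2e}")
print(f"Predicted illnesses per year: {mean_risk * servings_per_year:.0f}")

# Impact of an intervention that reduces retail contamination by 1 log10.
dose_after = 10 ** (log10_conc_retail - 1.0 + growth_log10) * serving_g
risk_after = (1.0 - np.exp(-r * dose_after)).mean()
print(f"Relative risk reduction: {1 - risk_after / mean_risk:.1%}")
```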

  7. Seaside, Oregon Tsunami Pilot Study - modernization of FEMA flood hazard maps

    USGS Publications Warehouse

    ,

    2006-01-01

    FEMA Flood Insurance Rate Map (FIRM) guidelines do not currently exist for conducting and incorporating tsunami hazard assessments that reflect the substantial advances in tsunami research achieved in the last two decades; this conclusion is the result of two FEMA-sponsored workshops and the associated Tsunami Focused Study. Therefore, as part of FEMA's Map Modernization Program, a Tsunami Pilot Study was carried out in the Seaside/Gearhart, Oregon, area to develop an improved Probabilistic Tsunami Hazard Assessment (PTHA) methodology and to provide recommendations for improved tsunami hazard assessment guidelines. The Seaside area was chosen because it is typical of many coastal communities in the section of the Pacific Coast from Cape Mendocino to the Strait of Juan de Fuca, and because State agencies and local stakeholders expressed considerable interest in mapping the tsunami threat to this area. The study was an interagency effort by FEMA, U.S. Geological Survey and the National Oceanic and Atmospheric Administration, in collaboration with the University of Southern California, Middle East Technical University, Portland State University, Horning Geosciences, Northwest Hydraulics Consultants, and the Oregon Department of Geological and Mineral Industries. Draft copies and a briefing on the contents, results and recommendations of this document were provided to FEMA officials before final publication.

  8. High-dose selenium for the mitigation of radiation injury: a pilot study in a rat model.

    PubMed

    Sieber, Fritz; Muir, Sarah A; Cohen, Eric P; North, Paula E; Fish, Brian L; Irving, Amy A; Mäder, Marylou; Moulder, John E

    2009-03-01

    The purpose of this study was to evaluate in an animal model the safety and efficacy of dietary supplementation with high doses of selenium for the mitigation of the type of radiation injury that might be sustained during a nuclear accident or an act of radiological terrorism. Age-matched male rats were exposed to 10 Gy (single dose) of total-body irradiation (TBI) followed by a syngeneic bone marrow transplant, then randomized to standard drinking water or drinking water supplemented with sodium selenite or seleno-l-methionine. At 21 weeks after TBI, most rats on standard drinking water had severe renal failure with a mean blood urea nitrogen (BUN) level of 124 +/- 29 mg/dl (geometric mean +/- SE) whereas rats on selenium-supplemented drinking water (100 microg/day) had a mean BUN level of 67 +/- 12 mg/dl. The mitigating effect of selenium was confirmed by histopathological analyses. None of the animals on high-dose selenium showed signs of selenium toxicity. Our results suggest that dietary supplementation with high-dose selenium may provide a safe, effective and practical way to mitigate radiation injury to kidneys.

  9. Power and public participation in a hazardous waste dispute: a community case study.

    PubMed

    Culley, Marci R; Hughey, Joseph

    2008-03-01

    Qualitative case study findings are presented. We examined whether public participation in a hazardous waste dispute manifested in ways consistent with theories of social power; particularly whether participatory processes or participants' experiences of them were consistent with the three-dimensional view of power (Gaventa, Power and powerlessness: quiescence and rebellion in an Appalachian valley, 1980; Lukes, Power: A radical view, 1974; Parenti, Power and the powerless, 1978). Findings from four data sources collected over 3 years revealed that participatory processes manifested in ways consistent with theories of power, and participants' experiences reflected this. Results illustrated how participation was limited and how citizen influence could be manipulated via control of resources, barriers to participation, agenda setting, and shaping conceptions about what participation was possible. Implications for community research and policy related to participation in hazardous waste disputes are discussed.

  10. Examination of Icing Induced Loss of Control and Its Mitigations

    NASA Technical Reports Server (NTRS)

    Reehorst, Andrew L.; Addy, Harold E., Jr.; Colantonio, Renato O.

    2010-01-01

    Factors external to the aircraft are often a significant causal factor in loss of control (LOC) accidents. In today's aviation world, very few accidents stem from a single cause and typically have a number of causal factors that culminate in a LOC accident. Very often the "trigger" that initiates an accident sequence is an external environment factor. In a recent NASA statistical analysis of LOC accidents, aircraft icing was shown to be the most common external environmental LOC causal factor for scheduled operations. When investigating LOC accidents or incidents, aircraft icing causal factors can be categorized into groups of 1) in-flight encounter with super-cooled liquid water clouds, 2) take-off with ice contamination, or 3) in-flight encounter with high concentrations of ice crystals. As with other flight hazards, icing induced LOC accidents can be prevented through avoidance, detection, and recovery mitigations. For icing hazards, avoidance can take the form of avoiding flight into icing conditions or avoiding the hazard of icing by making the aircraft tolerant to icing conditions. Icing detection mitigations can take the form of detecting icing conditions or detecting early performance degradation caused by icing. Recovery from icing induced LOC requires flight crew or automated systems capable of accounting for reduced aircraft performance and degraded control authority during the recovery maneuvers. In this report we review the icing induced LOC accident mitigations defined in a recent LOC study and for each mitigation describe a research topic required to enable or strengthen the mitigation. Many of these research topics are already included in ongoing or planned NASA icing research activities or are being addressed by members of the icing research community. These research activities are described and the status of the ongoing or planned research to address the technology needs is discussed.

  11. Status of volcanic hazard studies for the Nevada Nuclear Waste Storage Investigations

    SciTech Connect

    Crowe, B.M.; Vaniman, D.T.; Carr, W.J.

    1983-03-01

    Volcanism studies of the Nevada Test Site (NTS) region are concerned with hazards of future volcanism with respect to underground disposal of high-level radioactive waste. The hazards of silicic volcanism are judged to be negligible; hazards of basaltic volcanism are judged through research approaches combining hazard appraisal and risk assessment. The NTS region is cut obliquely by a N-NE trending belt of volcanism. This belt developed about 8 Myr ago following cessation of silicic volcanism and contemporaneous with migration of basaltic activity toward the southwest margin of the Great Basin. Two types of fields are present in the belt: (1) large-volume, long-lived basalt and local rhyolite fields with numerous eruptive centers and (2) small-volume fields formed by scattered basaltic scoria cones. Late Cenozoic basalts of the NTS region belong to the second field type. Monogenetic basalt centers of this region were formed mostly by Strombolian eruptions; Surtseyean activity has been recognized at three centers. Geochemically, the basalts of the NTS region are classified as straddle A-type basalts of the alkalic suite. Petrological studies indicate a volumetric dominance of evolved hawaiite magmas. Trace- and rare-earth-element abundances of younger basalt (<4 Myr) of the NTS region and southern Death Valley area, California, indicate an enrichment in incompatible elements, with the exception of rubidium. The conditional probability of recurring basaltic volcanism and disruption of a repository by that event is bounded by the range of 10^-8 to 10^-10 as calculated for a 1-yr period. Potential disruptive and dispersal effects of magmatic penetration of a repository are controlled primarily by the geometry of basalt feeder systems, the mechanism of waste incorporation in magma, and Strombolian eruption processes.

  12. A simulation study of finite-sample properties of marginal structural Cox proportional hazards models.

    PubMed

    Westreich, Daniel; Cole, Stephen R; Schisterman, Enrique F; Platt, Robert W

    2012-08-30

    Motivated by a previously published study of HIV treatment, we simulated data subject to time-varying confounding affected by prior treatment to examine some finite-sample properties of marginal structural Cox proportional hazards models. We compared (a) unadjusted, (b) regression-adjusted, (c) unstabilized, and (d) stabilized marginal structural (inverse probability-of-treatment [IPT] weighted) model estimators of effect in terms of bias, standard error, root mean squared error (MSE), and 95% confidence limit coverage over a range of research scenarios, including relatively small sample sizes and 10 study assessments. In the base-case scenario resembling the motivating example, where the true hazard ratio was 0.5, both IPT-weighted analyses were unbiased, whereas crude and adjusted analyses showed substantial bias towards and across the null. Stabilized IPT-weighted analyses remained unbiased across a range of scenarios, including relatively small sample size; however, the standard error was generally smaller in crude and adjusted models. In many cases, unstabilized weighted analysis showed a substantial increase in standard error compared with other approaches. Root MSE was smallest in the IPT-weighted analyses for the base-case scenario. In situations where time-varying confounding affected by prior treatment was absent, IPT-weighted analyses were less precise and therefore had greater root MSE compared with adjusted analyses. The 95% confidence limit coverage was close to nominal for all stabilized IPT-weighted but poor in crude, adjusted, and unstabilized IPT-weighted analysis. Under realistic scenarios, marginal structural Cox proportional hazards models performed according to expectations based on large-sample theory and provided accurate estimates of the hazard ratio.
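
    As a compact illustration of the stabilized IPT-weighted estimator being compared, the sketch below handles the simpler case of a single time-fixed treatment with one confounder (the study's time-varying setting requires weights accumulated over the repeated assessments); the data are synthetic, the variable names are assumptions, and lifelines and scikit-learn stand in for whatever software the authors used.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(7)
n = 2000

# Synthetic point-treatment data with one confounder L.
L = rng.normal(0, 1, n)
p_treat = 1 / (1 + np.exp(-0.5 * L))                  # treatment depends on L
A = rng.binomial(1, p_treat)
rate = 0.1 * np.exp(0.5 * L + np.log(0.5) * A)        # conditional HR for A = 0.5
T = rng.exponential(1 / rate)
C = rng.uniform(0, 20, n)
df = pd.DataFrame({"time": np.minimum(T, C),
                   "event": (T <= C).astype(int),
                   "A": A, "L": L})

# Stabilized inverse-probability-of-treatment weights:
# numerator P(A = a), denominator P(A = a | L).
denom_model = LogisticRegression().fit(df[["L"]], df["A"])
p_denom = denom_model.predict_proba(df[["L"]])[:, 1]
p_num = df["A"].mean()
df["sw"] = np.where(df["A"] == 1, p_num / p_denom, (1 - p_num) / (1 - p_denom))

# Weighted (marginal structural) Cox model with a robust variance estimate.
msm = CoxPHFitter()
msm.fit(df[["time", "event", "A", "sw"]], duration_col="time",
        event_col="event", weights_col="sw", robust=True)
print(msm.summary[["coef", "exp(coef)"]])
```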

  13. Cyber security with radio frequency interferences mitigation study for satellite systems

    NASA Astrophysics Data System (ADS)

    Wang, Gang; Wei, Sixiao; Chen, Genshe; Tian, Xin; Shen, Dan; Pham, Khanh; Nguyen, Tien M.; Blasch, Erik

    2016-05-01

    Satellite systems including the Global Navigation Satellite System (GNSS) and the satellite communications (SATCOM) system provide great convenience and utility to human life including emergency response, wide area efficient communications, and effective transportation. Elements of satellite systems incorporate technologies such as navigation with the global positioning system (GPS), satellite digital video broadcasting, and information transmission with a very small aperture terminal (VSAT), etc. The satellite systems' importance is growing with end users' requirements for globally high data rate transmissions; the cost reduction of launching satellites; development of smaller sized satellites including cubesat, nanosat, picosat, and femtosat; and integrating internet services with satellite networks. However, with the promising benefits, challenges remain to fully develop secure and robust satellite systems with pervasive computing and communications. In this paper, we investigate both cyber security and radio frequency (RF) interferences mitigation for satellite systems, and demonstrate that they are not isolated. The action space for both cyber security and RF interferences is first summarized for satellite systems, based on which the mitigation schemes for both cyber security and RF interferences are given. A multi-layered satellite systems structure is provided with cross-layer design considering multi-path routing and channel coding, to provide great security and diversity gains for secure and robust satellite systems.

  14. Studies of electron cloud growth and mitigation at CESR-TA

    NASA Astrophysics Data System (ADS)

    Calvey, Joseph Raymond

    The electron cloud effect is a well known phenomenon in particle accelerators, in which a high density of low energy electrons builds up inside the vacuum chamber. These electrons can cause various undesirable effects, including emittance blowup and beam instabilities. Electron cloud has been observed in several currently operating machines, and is expected to be a major limiting factor in the design of the damping rings of future linear colliders. As part of an effort to understand and mitigate this effect, the Cornell Electron Storage Ring (CESR) has been reconfigured into a damping ring-like setting, and instrumented with a large number of electron cloud diagnostic devices. In particular, more than 30 Retarding Field Analyzers (RFAs) have been installed. These devices, which measure the local electron cloud density and energy distribution, have been deployed in drift, dipole, quadrupole, and wiggler field regions, and have been used to evaluate the efficacy of cloud mitigation techniques in each element. Understanding RFA measurements through the use of specially modified cloud buildup simulations results in a great deal of insight into the behavior of the electron cloud, and provides essential information on the properties of the instrumented chamber surfaces.

  15. Hazardous Waste

    MedlinePlus

    ... you throw these substances away, they become hazardous waste. Some hazardous wastes come from products in our homes. Our garbage can include such hazardous wastes as old batteries, bug spray cans and paint ...

  16. Chemical Safety Alert: Fire Hazard from Carbon Adsorption Deodorizing Systems

    EPA Pesticide Factsheets

    Activated carbon systems used to adsorb vapors for odor control may pose a fire hazard when used for certain types of substances, such as crude sulfate turpentine. Facilities should take precautions and follow proper procedures to avoid or mitigate these hazards.

  17. Hazardous-Materials Robot

    NASA Technical Reports Server (NTRS)

    Stone, Henry W.; Edmonds, Gary O.

    1995-01-01

    Remotely controlled mobile robot used to locate, characterize, identify, and eventually mitigate incidents involving hazardous-materials spills/releases. Possesses number of innovative features, allowing it to perform mission-critical functions such as opening and unlocking doors and sensing for hazardous materials. Provides safe means for locating and identifying spills and eliminates risks of injury associated with use of manned entry teams. Current version of vehicle, called HAZBOT III, also features unique mechanical and electrical design enabling vehicle to operate safely within combustible atmosphere.

  18. Seismic hazard in low slip rate crustal faults, estimating the characteristic event and the most hazardous zone: study case San Ramón Fault, in southern Andes

    NASA Astrophysics Data System (ADS)

    Estay, Nicolás P.; Yáñez, Gonzalo; Carretier, Sebastien; Lira, Elias; Maringue, José

    2016-11-01

    Crustal faults located close to cities may induce catastrophic damages. When recurrence times are in the range of 1000-10,000 years or higher, actions to mitigate the effects of the associated earthquake are hampered by the lack of a full seismic record, and in many cases, also of geological evidences. In order to characterize the fault behavior and its effects, we propose three different already-developed time-integration methodologies to define the most likely scenarios of rupture, and then to quantify the hazard with an empirical equation of peak ground acceleration (PGA). We consider the following methodologies: (1) stream gradient and (2) sinuosity indexes to estimate fault-related topographic effects, and (3) gravity profiles across the fault to identify the fault scarp in the basement. We chose the San Ramón Fault on which to apply these methodologies. It is a ~30 km N-S trending fault with a low slip rate (0.1-0.5 mm/yr) and an approximated recurrence of 9000 years. It is located in the foothills of the Andes near the large city of Santiago, the capital of Chile (> 6,000,000 inhabitants). Along the fault trace we define four segments, with a mean length of ~10 km, which probably become active independently. We tested the present-day seismic activity by deploying a local seismological network for 1 year, finding five events that are spatially related to the fault. In addition, fault geometry along the most evident scarp was imaged in terms of its electrical resistivity response by a high resolution TEM (transient electromagnetic) profile. Seismic event distribution and TEM imaging allowed the constraint of the fault dip angle (~65°) and its capacity to break into the surface. Using the empirical equation of Chiou and Youngs (2014) for crustal faults and considering the characteristic seismic event (thrust high-angle fault, ~10 km, Mw = 6.2-6.7), we estimate the acceleration distribution in Santiago and the hazardous zones. City domains that are under
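
    To make the last step concrete, the sketch below evaluates a generic magnitude-distance attenuation for the characteristic event at a few distances; the functional form and coefficients are simple placeholders for illustration and are not the Chiou and Youngs (2014) model applied in the study.

```python
import numpy as np

def toy_pga_g(mw, r_km, c0=0.6, c1=0.9, c2=1.0, h_km=5.0):
    """Generic attenuation of the form ln PGA = c0 + c1*(Mw-6) - c2*ln(R+h).

    Placeholder coefficients for illustration; a real assessment would use a
    published GMPE such as Chiou & Youngs (2014) with site and style-of-faulting terms.
    """
    return np.exp(c0 + c1 * (mw - 6.0) - c2 * np.log(r_km + h_km))

# Characteristic event from the study: thrust, high-angle, ~10 km segment, Mw 6.2-6.7.
# Evaluate the toy median PGA at a few distances spanning the urban area.
distances_km = np.array([1.0, 5.0, 10.0, 20.0, 40.0])
for mw in (6.2, 6.7):
    pga = toy_pga_g(mw, distances_km)
    print(f"Mw {mw}: " + ", ".join(f"{d:g} km -> {p:.2f} g"
                                   for d, p in zip(distances_km, pga)))
```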

  19. Effects of anthropogenic land-subsidence on river flood hazard: a case study in Ravenna, Italy

    NASA Astrophysics Data System (ADS)

    Carisi, Francesca; Domeneghetti, Alessio; Castellarin, Attilio

    2015-04-01

    Can differential land-subsidence significantly alter the river flooding dynamics, and thus flood risk in flood prone areas? Many studies show how the lowering of coastal areas is closely related to an increase in flood hazard due to more severe tidal flooding and sea level rise. On the contrary, the literature on the relationship between differential land-subsidence and possible alterations to the riverine flood hazard of inland areas is still sparse, while several areas characterized by significant land-subsidence rates during the second half of the 20th century experienced an intensification in both inundation magnitude and frequency. This study investigates the possible impact of a significant differential ground lowering on flood hazard in proximity of Ravenna, which is one of the oldest Italian cities, former capital of the Western Roman Empire, located a few kilometers from the Adriatic coast and about 60 km south of the Po River delta. The rate of land-subsidence in the area, naturally in the order of a few mm/year, dramatically increased up to 110 mm/year after World War II, primarily due to groundwater pumping and a number of deep onshore and offshore gas production platforms. The subsidence caused a cumulative drop larger than 1.5 m in the historical center of the city over the last century. Starting from these evidences and taking advantage of a recent digital elevation model of 10 m resolution, we reconstructed the ground elevation in 1897 for an area of about 65 km2 around the city of Ravenna. We referred to these two digital elevation models (i.e. current topography and topographic reconstruction) and a 2D finite-element numerical model for the simulation of the inundation dynamics associated with several levee failure scenarios along the embankment system of the river Montone. For each scenario and digital elevation model, the flood hazard is quantified in terms of water depth, speed and dynamics of the flooding front. The comparison enabled us to
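
    The heart of the comparison is a cell-by-cell difference between the two digital elevation models and the water depths that result for a given flood stage. The sketch below illustrates that computation on small synthetic grids; the array values, the grid size and the flat-water-surface simplification are assumptions (the study itself uses a 2D finite-element inundation model rather than a flat water surface).

```python
import numpy as np

# Synthetic 10 m resolution DEMs (metres a.s.l.): stand-ins for the
# reconstructed 1897 topography and the present-day topography.
rng = np.random.default_rng(3)
dem_1897 = 2.0 + rng.normal(0.0, 0.3, size=(50, 50))
subsidence = rng.uniform(0.5, 1.6, size=(50, 50))     # cumulative drop since 1897
dem_now = dem_1897 - subsidence

cell_area_m2 = 10.0 * 10.0
water_level = 2.2   # metres a.s.l., illustrative levee-breach flood stage

def flood_depth(dem, level):
    """Water depth per cell for a flat water surface at `level` (simplification)."""
    return np.clip(level - dem, 0.0, None)

for label, dem in (("1897 topography", dem_1897), ("present topography", dem_now)):
    depth = flood_depth(dem, water_level)
    flooded_km2 = (depth > 0).sum() * cell_area_m2 / 1e6
    print(f"{label}: flooded area {flooded_km2:.3f} km2, "
          f"mean depth {depth[depth > 0].mean():.2f} m")
```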

  20. Mitigation potential of horizontal ground coupled heat pumps for current and future climatic conditions: UK environmental modelling and monitoring studies

    NASA Astrophysics Data System (ADS)

    García González, Raquel; Verhoef, Anne; Vidale, Pier Luigi; Gan, Guohui; Wu, Yupeng; Hughes, Andrew; Mansour, Majdi; Blyth, Eleanor; Finch, Jon; Main, Bruce

    2010-05-01

    An increased uptake of alternative low or non-CO2 emitting energy sources is one of the key priorities for policy makers to mitigate the effects of environmental change. Relatively little work has been undertaken on the mitigation potential of Ground Coupled Heat Pumps (GCHPs) despite the fact that a GCHP could significantly reduce CO2 emissions from heating systems. It is predicted that under climate change the most probable scenario is for UK temperatures to increase and for winter rainfall to become more abundant; the latter is likely to cause a general rise in groundwater levels. Summer rainfall may reduce considerably, while vegetation type and density may change. Furthermore, recent studies underline the likelihood of an increase in the number of heat waves. Under such a scenario, GCHPs will increasingly be used for cooling as well as heating. These factors will affect long-term performance of horizontal GCHP systems and hence their economic viability and mitigation potential during their life span (~50 years). The seasonal temperature differences encountered in soil are harnessed by GCHPs to provide heating in the winter and cooling in the summer. The performance of a GCHP system will depend on technical factors (heat exchanger (HE) type, length, depth, and spacing of pipes), but also it will be determined to a large extent by interactions between the below-ground parts of the system and the environment (atmospheric conditions, vegetation and soil characteristics). Depending on the balance between extraction and rejection of heat from and to the ground, the soil temperature in the neighbourhood of the HE may fall or rise. The GROMIT project (GROund coupled heat pumps MITigation potential), funded by the Natural Environment Research Council (UK), is a multi-disciplinary research project, in collaboration with EarthEnergy Ltd., which aims to quantify the CO2 mitigation potential of horizontal GCHPs. It considers changing environmental conditions and combines

  1. Study on the Application of Probabilistic Tsunami Hazard Analysis for the Nuclear Power Plant Site in Korean Peninsula

    NASA Astrophysics Data System (ADS)

    Rhee, H. M.; Kim, M.; Sheen, D. H.; Choi, I. K.

    2014-12-01

    The necessity of tsunami hazard assessment for Nuclear Power Plant (NPP) sites has been recognized since the Fukushima event occurred in 2011. It is especially emphasized because all of the NPPs on the Korean Peninsula are located in coastal regions. The tsunami hazard is regarded as the annual exceedance probability for the wave heights. The methodology for analysis of tsunami hazard is based on seismic hazard analysis. Seismic hazard analysis has been performed using both deterministic and probabilistic methods. Recently, the probabilistic method has received more attention than the deterministic method because the uncertainties of the hazard analysis can be considered by using the logic tree approach. In this study, the probabilistic tsunami hazard analysis for the Uljin NPP site was performed by using the information on fault sources published by the Atomic Energy Society of Japan (AESJ). The wave parameter is the main point of difference from seismic hazard analysis. It can be estimated from the results of tsunami propagation analysis. TSUNAMI_ver1.0, which was developed by the Japan Nuclear Energy Safety Organization (JNES), was used for the tsunami simulation. Eighty tsunami simulation cases were performed and the wave parameters were then estimated. To reduce the sensitivity induced by the location of individual sampling points, the wave parameters were estimated from groups of sampling points. The probability density function of the tsunami height was computed by using the recurrence intervals and the wave parameters. The exceedance probability distribution was then calculated from the probability density function. The tsunami hazards for the sampling groups were calculated. The fractile curves, which show the uncertainties of the input parameters, were estimated from the hazards by using the round-robin algorithm. In general, tsunami hazard analysis is focused on the maximum wave heights. But the minimum wave height should be considered
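
    In outline, the hazard curve combines simulated maximum wave heights with the recurrence rates of the corresponding sources. The sketch below shows that combination for a handful of hypothetical sources; the rates, wave heights and Poisson assumption are placeholders, not the AESJ fault parameters or the TSUNAMI_ver1.0 outputs.

```python
import numpy as np

# Hypothetical tsunami sources: annual occurrence rate and the maximum
# wave height (m) each produces at the site (placeholders, not AESJ values).
sources = [
    {"rate": 1 / 500.0,   "height_m": 1.2},
    {"rate": 1 / 1000.0,  "height_m": 2.5},
    {"rate": 1 / 3000.0,  "height_m": 4.8},
    {"rate": 1 / 10000.0, "height_m": 7.5},
]

def annual_exceedance_rate(h, sources):
    """Sum the rates of all sources whose wave height at the site exceeds h."""
    return sum(s["rate"] for s in sources if s["height_m"] > h)

for h in np.arange(0.5, 8.5, 0.5):
    lam = annual_exceedance_rate(h, sources)
    # Poisson assumption: annual exceedance probability from the rate.
    p_annual = 1.0 - np.exp(-lam)
    print(f"h = {h:4.1f} m   rate = {lam:.2e}/yr   P(exceed in 1 yr) = {p_annual:.2e}")
```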

  2. Evaluation of low impact development approach for mitigating flood inundation at a watershed scale in China.

    PubMed

    Hu, Maochuan; Sayama, Takahiro; Zhang, Xingqi; Tanaka, Kenji; Takara, Kaoru; Yang, Hong

    2017-05-15

    Low impact development (LID) has attracted growing attention as an important approach for urban flood mitigation. Most studies evaluating LID performance for mitigating floods focus on the changes of peak flow and runoff volume. This paper assessed the performance of LID practices for mitigating flood inundation hazards as retrofitting technologies in an urbanized watershed in Nanjing, China. The findings indicate that LID practices are effective for flood inundation mitigation at the watershed scale, and especially for reducing inundated areas with a high flood hazard risk. Various scenarios of LID implementation levels can reduce total inundated areas by 2%-17% and areas with a high flood hazard level by 6%-80%. Permeable pavement shows better performance than rainwater harvesting against mitigating urban waterlogging. The most efficient scenario is combined rainwater harvesting on rooftops with a cistern capacity of 78.5 mm and permeable pavement installed on 75% of non-busy roads and other impervious surfaces. Inundation modeling is an effective approach to obtaining the information necessary to guide decision-making for designing LID practices at watershed scales.

  3. Effectiveness of protected areas in mitigating fire within their boundaries: case study of Chiapas, Mexico.

    PubMed

    Román-Cuesta, María Rosa; Martínez-Vilalta, Jordi

    2006-08-01

    Since the severe 1982-1983 El Niño drought, recurrent burning has been reported inside tropical protected areas (TPAs). Despite the key role of fire in habitat degradation, little is known about the effectiveness of TPAs in mitigating fire incidence and burned areas. We used a GPS fire database (1995-2005) (n=3590 forest fires) obtained from the National Forest Commission to compare fire incidence (number of fires) and burned areas inside TPAs and their surrounding adjacent buffer areas in Southern Mexico (Chiapas). Burned areas inside parks ranged from 2% (Palenque) to 45% (Lagunas de Montebello) of a park's area, and the amount burned was influenced by two severe El Niño events (1998 and 2003). These two years together resulted in 67% and 46% of the total area burned in TPAs and buffers, respectively, during the period under analysis. Larger burned areas in TPAs than in their buffers were exclusively related to the extent of natural habitats (flammable area excluding agrarian and pasture lands). Higher fuel loads together with access and extinction difficulties were likely behind this trend. A higher incidence of fire in TPAs than in their buffers was exclusively related to anthropogenic factors such as higher road densities and agrarian extensions. Our results suggest that TPAs are failing to mitigate fire impacts, with both fire incidence and total burned areas being significantly higher in the reserves than in adjacent buffer areas. Management plans should consider those factors that facilitate fires in TPAs: anthropogenic origin of fires, sensitivity of TPAs to El Niño droughts, large fuel loads and fuel continuity inside parks, and limited financial resources. Consideration of these factors favors lines of action such as alternatives to the use of fire (e.g., mucuna-maize system), climatic prediction to follow the evolution of El Niño, fuel management strategies that favor extinction practices, and the strengthening of local communities and ecotourism.

  4. Galactic cosmic rays dose mitigation inside a spacecraft by a superconductor "compact" toroid: A FLUKA Monte Carlo study

    NASA Astrophysics Data System (ADS)

    Dicarolo, P. R.

    2016-12-01

    Galactic cosmic rays and solar cosmic rays are responsible for a significant absorbed dose to astronauts, which increases their risk and the probability of health problems. The mitigation of this dose is crucial for planning future missions, beginning with those to Mars. Different strategies have been designed to protect the crew during the mission. In this work, the shielding of cosmic rays by a superconducting magnet compact enough for launch is considered. A Monte Carlo study is carried out. The simulations are made by modelling the structures with the FLUKA Monte Carlo code, dividing cosmic rays into 5 ionic groups.

  5. Residential Exposure to Estrogen Disrupting Hazardous Air Pollutants and Breast Cancer Risk: the California Teachers Study

    PubMed Central

    Liu, Ruiling; Nelson, David; Hurley, Susan; Hertz, Andrew; Reynolds, Peggy

    2016-01-01

    Background: Some studies show increased breast cancer risk from exposure to xenoestrogens, but few have explored exposures via ambient air, which could impact large populations. Objectives: This study explored the association between breast cancer risk and residential exposures to ambient estrogen disruptors among participants in a large cohort study, the California Teachers Study. Methods: Participants consisted of 112,379 women free of breast cancer and living at a California address in 1995/1996. Eleven hazardous air pollutants (HAPs) from the U.S. EPA 2002 list were identified as estrogen disruptors based on published endocrine disrupting chemical lists and literature review. Census-tract estrogen disruptor concentrations modeled by the U.S. EPA in 2002 were assigned to participants’ baseline addresses. Cox proportional hazards models were used to estimate hazard ratios associated with exposure to each estrogen disruptor and a summary measure of nine estrogenic HAPs among all participants and selected subgroups, adjusting for age, race/birthplace, socioeconomic status, and known breast cancer risk factors. Results: 5,361 invasive breast cancer cases were identified between 1995 and 2010. No associations were found between residential exposure to ambient estrogen disruptors and overall breast cancer risk or hormone-responsive-positive breast cancer risk, nor among targeted subgroups of participants (pre/peri-menopausal women, post-menopausal women, never smokers, non-movers, and never-smoking non-movers). However, elevated risks for hormone-responsive-negative tumors were observed for higher exposure to cadmium compounds and possibly inorganic arsenic among never-smoking non-movers. Conclusion: Long-term low-dose exposure to ambient cadmium compounds or possibly inorganic arsenic may be a risk factor for breast cancer. PMID:25760782

  6. Study on landslide hazard zonation based on factor weighting-rating theory in Slanic Prahova

    NASA Astrophysics Data System (ADS)

    Maftei, R.-M.; Vina, G.; Filipciuc, C.

    2012-04-01

    Studying the risks caused by landslides is important in the context of forecasting their triggering. This study mainly integrates background data related to historical and environmental factors as well as current triggering factors. The theory of zoning hazard caused by landslides, Landslide Hazard Zonation (LHZ), appeared in the 1960s. In this period the U.S. and many European countries began to use other triggering factors, besides the slope factor, in achieving hazard zoning. This theory has progressed due to the development of remote sensing and GIS technology, which were used to develop and analyse methods and techniques consisting of combining data from different sources. The study of an area involves analysing the geographical position data, estimating the surface, the type of terrain and the altitude, identifying the landslides in the area, and some summary geological data. Data sources. The data used in this study are: · Landsat 7 satellite images of 30 m spatial resolution, from which the vegetation index is derived; · topographic maps 1:25 000, from which the numerical altitude model (DEM) can be obtained (used to calculate the slope and relative altitude of the land); · geological maps 1:50 000. Studied factors. The main factors used and studied in achieving landslide hazard zoning are: - the rate of displacement, the angle of slope, lithology; - the index of vegetation or ground coverage of vegetation (NDVI); - river network, structural factor. 1. The calculation of the normalized vegetation index is based on Landsat ETM satellite images. This vegetation factor can be both a principal and a secondary trigger factor in landslides. In areas devoid of vegetation, landslides are triggered more often compared with those in which coverage is greater. 2. Factors derived from the numerical model are the slope and the relative altitude. This operation was made using the topographic map 1:25 000, from which the contour level curves were extracted by digitization, and
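
    For the vegetation factor, NDVI is computed band-wise as (NIR - Red)/(NIR + Red). The sketch below applies that formula to small synthetic reflectance arrays and bins the result into coverage classes; the band arrays and class breaks are illustrative and do not reproduce the Landsat 7/ETM processing chain used in the study.

```python
import numpy as np

# Synthetic 30 m reflectance bands standing in for Landsat red and near-infrared.
rng = np.random.default_rng(5)
red = rng.uniform(0.05, 0.30, size=(100, 100))
nir = rng.uniform(0.10, 0.60, size=(100, 100))

def ndvi(nir_band, red_band, eps=1e-9):
    """Normalized difference vegetation index, guarded against division by zero."""
    return (nir_band - red_band) / (nir_band + red_band + eps)

index = ndvi(nir, red)

# Illustrative coverage classes used as a rating factor in the zonation.
classes = np.digitize(index, bins=[0.2, 0.5])   # 0: sparse, 1: moderate, 2: dense
for value, label in enumerate(("sparse", "moderate", "dense")):
    share = np.mean(classes == value)
    print(f"{label:9s} vegetation: {share:5.1%} of the area")
```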

  7. Climate change and mitigation.

    PubMed

    Nibleus, Kerstin; Lundin, Rickard

    2010-01-01

    Planet Earth has experienced repeated changes of its climate throughout time. Periods warmer than today as well as much colder, during glacial episodes, have alternated. In our time, rapid population growth with increased demand for natural resources and energy has made society increasingly vulnerable to environmental changes, both natural and those caused by man; human activity is clearly affecting the radiation balance of the Earth. In the session "Climate Change and Mitigation" the speakers offered four different views on coal and CO2: the basis for life, but also a major hazard with impact on Earth's climate. A common denominator in the presentations was that more than ever science and technology are required. We need not only to understand the mechanisms for climate change and climate variability; we also need to identify means to remedy the anthropogenic influence on Earth's climate.

  8. Towards the Seismic Hazard Reassessment of Paks NPP (Hungary) Site: Seismicity and Sensitivity Studies

    NASA Astrophysics Data System (ADS)

    Toth, Laszlo; Monus, Peter; Gyori, Erzsebet; Grenerczy, Gyula; Janos Katona, Tamas; Kiszely, Marta

    2015-04-01

    reviews, and hazard characterization of the site has been confirmed. The hazard curves have been extended to lower probability events, as it is required by the probabilistic safety analysis. These earlier projects resulted in 0.22-0.26 g and 0.43-0.54 g mean PGA at 10^4 and 10^5 year return periods. The site effect and liquefaction probability have also been evaluated. As it is expected for the site of soft soil conditions, the amplification is greater at shorter periods for the lower amplitude ground motion of the 10^4 year return period compared to the longer periods for the higher amplitude of the 10^5 year level ground motion. Further studies will be based on the improved regional seismotectonic model, state-of-the-art hazard evaluation software, and better knowledge of the local soil conditions. The presented preliminary results can demonstrate the adequacy of the planned program and highlight the progress in the hazard assessment.

  9. GIS data for the Seaside, Oregon, Tsunami Pilot Study to modernize FEMA flood hazard maps

    USGS Publications Warehouse

    Wong, Florence L.; Venturato, Angie J.; Geist, Eric L.

    2007-01-01

    A Tsunami Pilot Study was conducted for the area surrounding the coastal town of Seaside, Oregon, as part of the Federal Emergency Management Agency's (FEMA) Flood Insurance Rate Map Modernization Program (Tsunami Pilot Study Working Group, 2006). The Cascadia subduction zone extends from Cape Mendocino, California, to Vancouver Island, Canada. The Seaside area was chosen because it is typical of many coastal communities subject to tsunamis generated by far- and near-field (Cascadia) earthquakes. Two goals of the pilot study were to develop probabilistic 100-year and 500-year tsunami inundation maps using Probabilistic Tsunami Hazard Analysis (PTHA) and to provide recommendations for improving tsunami hazard assessment guidelines for FEMA and state and local agencies. The study was an interagency effort by the National Oceanic and Atmospheric Administration, U.S. Geological Survey, and FEMA, in collaboration with the University of Southern California, Middle East Technical University, Portland State University, Horning Geoscience, Northwest Hydraulics Consultants, and the Oregon Department of Geological and Mineral Industries. The pilot study model data and results are published separately as a geographic information systems (GIS) data report (Wong and others, 2006). The flood maps and GIS data are briefly described here.

  10. Hydrogen Hazards Assessment Protocol (HHAP): Approach and Methodology

    NASA Technical Reports Server (NTRS)

    Woods, Stephen

    2009-01-01

    This viewgraph presentation reviews the approach and methodology used to develop an assessment protocol for hydrogen hazards. Included in the presentation are the reasons to perform hazard assessments, the types of hazard assessments that exist, an analysis of hydrogen hazards, and specific information about the Hydrogen Hazards Assessment Protocol (HHAP). The assessment is specifically tailored for hydrogen behavior. The end product of the assessment is a compilation of hazards, mitigations, and associated factors to facilitate decision making and achieve best practice.

  11. Lithium-Sulfur Dioxide (Li/SO2) Battery Safety Hazards - Thermal Studies.

    DTIC Science & Technology

    1982-03-01

    ... Dioxide and Lithium-Thionyl Chloride Cells," J. Electrochem. Soc., 128, 508 (1981). 6. Bro, P., "Heat Generation in Li/SO2 Cells During Low Rate ..." Micro-calorimeter studies gave the heat of reaction for the lithium/acetonitrile reaction as -54.6 ± 1.0 kcal/mole Li. Lithium/aluminum alloy A was ...

  12. Prediction of Ungauged River Basin for Hydro Power Potential and Flood Risk Mitigation; a Case Study at Gin River, Sri Lanka

    NASA Astrophysics Data System (ADS)

    Ratnayake, A. S.

    2011-12-01

    floodplains. Thus, modern GIS technology has been productively employed to prepare hazard maps based on the flood modeling, and it can be further utilized for disaster preparedness and mitigation activities. Five suitable hydraulic heads were identified for mini-hydro power sites, and these would be the most economical and applicable flood-controlling hydraulic engineering structures considering the morphologic, climatic, environmental and socioeconomic proxies of the study area. The mini-hydro power sites would also serve as a clean, eco-friendly and reliable energy source (8630.0 kW). Finally, the Francis turbine can be employed as the most efficient turbine for the selected sites, bearing in mind both technical and economic parameters.

  13. Prevalent cases in observational studies of cancer survival: do they bias hazard ratio estimates?

    PubMed Central

    Azzato, E M; Greenberg, D; Shah, M; Blows, F; Driver, K E; Caporaso, N E; Pharoah, P D P

    2009-01-01

    Observational epidemiological studies often include prevalent cases recruited at various times past diagnosis. This left truncation can be dealt with in non-parametric (Kaplan–Meier) and semi-parametric (Cox) time-to-event analyses, theoretically generating an unbiased hazard ratio (HR) when the proportional hazards (PH) assumption holds. However, concern remains that inclusion of prevalent cases in survival analysis results inevitably in HR bias. We used data on three well-established breast cancer prognosticators – clinical stage, histopathological grade and oestrogen receptor (ER) status – from the SEARCH study, a population-based study including 4470 invasive breast cancer cases (incident and prevalent), to evaluate empirically the effectiveness of allowing for left truncation in limiting HR bias. We found that HRs of prognostic factors changed over time and used extended Cox models incorporating time-dependent covariates. When comparing Cox models restricted to subjects ascertained within six months of diagnosis (incident cases) to models based on the full data set allowing for left truncation, we found no difference in parameter estimates (P=0.90, 0.32 and 0.95, for stage, grade and ER status respectively). Our results show that use of prevalent cases in an observational epidemiological study of breast cancer does not bias the HR in a left truncation Cox survival analysis, provided the PH assumption holds true. PMID:19401693
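
    A small sketch of the kind of analysis described, a Cox fit that allows for left truncation by supplying each prevalent case's entry time, is shown below using the lifelines package; the synthetic data and variable names are assumptions, not the SEARCH study data.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(11)
n = 1000

# Synthetic survival data with one prognostic factor (e.g. grade).
grade = rng.integers(1, 4, n)
time_from_dx = rng.exponential(8.0 / grade)          # years from diagnosis to death

# Prevalent cases are recruited some years after diagnosis; subjects who
# died before their potential recruitment time are never observed.
entry = rng.uniform(0, 5, n)
censor = entry + rng.uniform(1, 12, n)               # administrative censoring after entry
observed = time_from_dx > entry

df = pd.DataFrame({
    "entry": entry[observed],
    "time": np.minimum(time_from_dx, censor)[observed],
    "event": (time_from_dx <= censor)[observed].astype(int),
    "grade": grade[observed],
})

# Cox model with delayed entry (left truncation) handled via entry_col.
cph = CoxPHFitter()
cph.fit(df, duration_col="time", event_col="event", entry_col="entry")
print(cph.summary[["coef", "exp(coef)", "p"]])
```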

  14. The newest achievements of studies on the reutilization, treatment, and disposal technology of hazardous wastes

    SciTech Connect

    Liu Peizhe

    1996-12-31

    From 1991 to 1996, key studies on the reutilization, treatment, and disposal technology of hazardous wastes were incorporated into the national plan for environmental protection science and technology. At present, the research achievements have been accomplished and have passed national approval and acceptance. The author of this paper, as leader of the national group for this research work, describes the newest achievements of the studies, which involve four parts: (1) the reutilization technology of electroplating sludge, including the ion-exchange process for recovering the sludge and waste liquor to produce chromium tanning agent and to extract chromium and colloidal protein from tanning waste residue; the recovery of heavy metals from the electroplating waste liquor with microbial purification; the demonstration project of producing modified plastics from the sludge and waste plastics; and the demonstration of the recovery of heavy metals from waste electroplating sludge using the ammonia-leaching process; (2) the demonstrative research on reutilization technology for chromium waste residues, including production of self-melting ore and smelting of chromium-containing pig iron, and pyrolytic detoxification of the residue in a cyclone furnace; (3) the incineration technology for hazardous wastes, with successful results for the industrial incinerator system for polychlorinated biphenyls; and (4) the safe landfill technology for disposal of hazardous wastes, with a complete set of technology for pretreatment, selection of the site, development of the antipercolating materials, and design and construction of the landfill. Only a part of the achievements is introduced in this paper, most of which have been built and are being operated for demonstration to further spread their application and accumulate experience. 6 refs., 7 figs., 6 tabs.

  15. Hazard Ranking Methodology for Assessing Health Impacts of Unconventional Natural Gas Development and Production: The Maryland Case Study

    PubMed Central

    Sangaramoorthy, Thurka; Wilson, Sacoby; Nachman, Keeve E.; Babik, Kelsey; Jenkins, Christian C.; Trowell, Joshua; Milton, Donald K.; Sapkota, Amir

    2016-01-01

    The recent growth of unconventional natural gas development and production (UNGDP) has outpaced research on the potential health impacts associated with the process. The Maryland Marcellus Shale Public Health Study was conducted to inform the Maryland Marcellus Shale Safe Drilling Initiative Advisory Commission, State legislators and the Governor about potential public health impacts associated with UNGDP so they could make an informed decision that considers the health and well-being of Marylanders. In this paper, we describe an impact assessment and hazard ranking methodology we used to assess the potential public health impacts for eight hazards associated with the UNGDP process. The hazard ranking included seven metrics: 1) presence of vulnerable populations (e.g. children under the age of 5, individuals over the age of 65, surface owners), 2) duration of exposure, 3) frequency of exposure, 4) likelihood of health effects, 5) magnitude/severity of health effects, 6) geographic extent, and 7) effectiveness of setbacks. Overall public health concern was determined by a color-coded ranking system (low, moderately high, and high) that was generated based on the overall sum of the scores for each hazard. We provide three illustrative examples of applying our methodology for air quality and health care infrastructure which were ranked as high concern and for water quality which was ranked moderately high concern. The hazard ranking was a valuable tool that allowed us to systematically evaluate each of the hazards and provide recommendations to minimize the hazards. PMID:26726918
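
    The sum-of-scores ranking described in this and the following record can be sketched in a few lines. The metric names follow the paper's seven metrics, but the per-metric scores and the cut-offs between the low, moderately high and high categories are hypothetical values chosen only to show the mechanics.

        # Sketch of a color-coded, sum-of-scores hazard ranking.
        # Metric names follow the abstract; scores and cut-offs are hypothetical.
        METRICS = ["vulnerable_populations", "duration_of_exposure",
                   "frequency_of_exposure", "likelihood_of_health_effects",
                   "magnitude_severity", "geographic_extent",
                   "effectiveness_of_setbacks"]

        def rank_hazard(scores, moderate_cut=10, high_cut=17):
            """Sum per-metric scores (assumed 1-3 each) and map to a concern level."""
            total = sum(scores[m] for m in METRICS)
            if total >= high_cut:
                level = "high"
            elif total >= moderate_cut:
                level = "moderately high"
            else:
                level = "low"
            return total, level

        air_quality = {m: 3 for m in METRICS}                     # hypothetical scores
        water_quality = dict(air_quality, geographic_extent=2,
                             effectiveness_of_setbacks=1)

        print("air quality:", rank_hazard(air_quality))
        print("water quality:", rank_hazard(water_quality))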

  16. Hazard Ranking Methodology for Assessing Health Impacts of Unconventional Natural Gas Development and Production: The Maryland Case Study.

    PubMed

    Boyle, Meleah D; Payne-Sturges, Devon C; Sangaramoorthy, Thurka; Wilson, Sacoby; Nachman, Keeve E; Babik, Kelsey; Jenkins, Christian C; Trowell, Joshua; Milton, Donald K; Sapkota, Amir

    2016-01-01

    The recent growth of unconventional natural gas development and production (UNGDP) has outpaced research on the potential health impacts associated with the process. The Maryland Marcellus Shale Public Health Study was conducted to inform the Maryland Marcellus Shale Safe Drilling Initiative Advisory Commission, State legislators and the Governor about potential public health impacts associated with UNGDP so they could make an informed decision that considers the health and well-being of Marylanders. In this paper, we describe an impact assessment and hazard ranking methodology we used to assess the potential public health impacts for eight hazards associated with the UNGDP process. The hazard ranking included seven metrics: 1) presence of vulnerable populations (e.g. children under the age of 5, individuals over the age of 65, surface owners), 2) duration of exposure, 3) frequency of exposure, 4) likelihood of health effects, 5) magnitude/severity of health effects, 6) geographic extent, and 7) effectiveness of setbacks. Overall public health concern was determined by a color-coded ranking system (low, moderately high, and high) that was generated based on the overall sum of the scores for each hazard. We provide three illustrative examples of applying our methodology for air quality and health care infrastructure which were ranked as high concern and for water quality which was ranked moderately high concern. The hazard ranking was a valuable tool that allowed us to systematically evaluate each of the hazards and provide recommendations to minimize the hazards.

  17. Economics of Tsunami Mitigation in the Pacific Northwest

    NASA Astrophysics Data System (ADS)

    Goettel, K. A.; Rizzo, A.; Sigrist, D.; Bernard, E. N.

    2011-12-01

    The death total in a major Cascadia Subduction Zone (CSZ) tsunami may be comparable to the Tohoku tsunami - tens of thousands. To date, tsunami risk reduction activities have been almost exclusively hazard mapping and evacuation planning. Reducing deaths in locations where evacuation to high ground is impossible in the short time between ground shaking and arrival of tsunamis requires measures such as vertical evacuation facilities or engineered pathways to safe ground. Yet, very few, if any, such tsunami mitigation projects have been done. In contrast, many tornado safe room and earthquake mitigation projects driven entirely or in large part by life safety have been done with costs in the billions of dollars. The absence of tsunami mitigation measures results from the belief that tsunamis are too infrequent and the costs too high to justify life safety mitigation measures. A simple analysis based on return periods, death rates, and the geographic distribution of high risk areas for these hazards demonstrates that this belief is incorrect: well-engineered tsunami mitigation projects are more cost-effective with higher benefit-cost ratios than almost all tornado or earthquake mitigation projects. Goldfinger's paleoseismic studies of CSZ turbidites indicate return periods for major CSZ tsunamis of about 250-500 years (USGS Prof. Paper 1661-F in press). Tsunami return periods are comparable to those for major earthquakes at a given location in high seismic areas and are much shorter than those for tornadoes at any location, which range from >4,000 to >16,000 years for >EF2 and >EF4 tornadoes, respectively. The average earthquake death rate in the US over the past 100 years is about 1/year, or about 30/year including the 1906 San Francisco earthquake. The average death rate for tornadoes is about 90/year. For CSZ tsunamis, the estimated average death rate ranges from about 20/year (10,000 every 500 years) to 80/year (20,000 every 250 years). Thus, the long-term death rates
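
    The annualized death-rate comparison in this abstract is simple arithmetic: expected deaths per event divided by the event return period. A minimal sketch, using only the figures quoted above:

        # Annualized death rate = deaths per event / return period (years),
        # reproducing the CSZ tsunami figures quoted in the abstract.
        def annual_death_rate(deaths_per_event, return_period_years):
            return deaths_per_event / return_period_years

        print(annual_death_rate(10_000, 500))   # low case  -> 20 deaths/year
        print(annual_death_rate(20_000, 250))   # high case -> 80 deaths/year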

  18. Can isolated and riparian wetlands mitigate the impact of climate change on watershed hydrology? A case study approach.

    PubMed

    Fossey, M; Rousseau, A N

    2016-12-15

    The effects of wetlands on stream flows are well established, namely mitigating flow regimes through water storage and slow water release. However, their effectiveness in reducing flood peaks and sustaining low flows is mainly driven by climate conditions and wetland type with respect to their connectivity to the hydrographic network (i.e. isolated or riparian wetlands). While some studies have demonstrated these hydrological functions/services, few of them have focused on the benefits to the hydrological regimes and their evolution under climate change (CC) and, thus, some gaps persist. The objective of this study was to further advance our knowledge with that respect. The PHYSITEL/HYDROTEL modelling platform was used to assess current and future states of watershed hydrology of the Becancour and Yamaska watersheds, Quebec, Canada. Simulation results showed that CC will induce similar changes on mean seasonal flows, namely larger and earlier spring flows leading to decreases in summer and fall flows. These expected changes will have different effects on 20-year and 100-year peak flows with respect to the considered watershed. Nevertheless, conservation of current wetland states should: (i) for the Becancour watershed, mitigate the potential increase in 2-year, 20-year and 100-year peak flows; and (ii) for the Yamaska watershed, accentuate the potential decrease in the aforementioned indicators. However, any loss of existing wetlands would be detrimental for 7-day 2-year and 10-year as well as 30-day 5-year low flows.

  19. Viscoelastic Materials Study for the Mitigation of Blast-Related Brain Injury

    NASA Astrophysics Data System (ADS)

    Bartyczak, Susan; Mock, Willis, Jr.

    2011-06-01

    Recent preliminary research into the causes of blast-related brain injury indicates that exposure to blast pressures, such as from IED detonation or multiple firings of a weapon, causes damage to brain tissue resulting in Traumatic Brain Injury (TBI) and Post Traumatic Stress Disorder (PTSD). Current combat helmets are not sufficient to protect the warfighter from this danger and the effects are debilitating, costly, and long-lasting. Commercially available viscoelastic materials, designed to dampen vibration caused by shock waves, might be useful as helmet liners to dampen blast waves. The objective of this research is to develop an experimental technique to test these commercially available materials when subject to blast waves and evaluate their blast mitigating behavior. A 40-mm-bore gas gun is being used as a shock tube to generate blast waves (ranging from 1 to 500 psi) in a test fixture at the gun muzzle. A fast opening valve is used to release nitrogen gas from the breech to impact instrumented targets. The targets consist of aluminum/viscoelastic polymer/aluminum materials. Blast attenuation is determined through the measurement of pressure and accelerometer data in front of and behind the target. The experimental technique, calibration and checkout procedures, and results will be presented.

  20. First Production of C60 Nanoparticle Plasma Jet for Study of Disruption Mitigation for ITER

    NASA Astrophysics Data System (ADS)

    Bogatu, I. N.; Thompson, J. R.; Galkin, S. A.; Kim, J. S.; Brockington, S.; Case, A.; Messer, S. J.; Witherspoon, F. D.

    2012-10-01

    Unique fast response and large mass-velocity delivery of nanoparticle plasma jets (NPPJs) provide a novel application for ITER disruption mitigation, runaway electron diagnostics and deep fueling. NPPJs carry a much larger mass than usual gases. An electromagnetic plasma gun provides a very high injection velocity (many km/s). An NPPJ has much higher ram pressure than any standard gas injection method and penetrates the tokamak confining magnetic field. Assimilation is enhanced due to the NP large surface-to-volume ratio. Radially expanding NPPJs help achieve toroidal uniformity of radiation power. FAR-TECH's NPPJ system was successfully tested: a coaxial plasma gun prototype (~35 cm length, 96 kJ energy) using a solid state TiH2/C60 pulsed power cartridge injector produced a hyper-velocity (>4 km/s), high-density (>10^23 m^-3) C60 plasma jet in ~0.5 ms, with ~1-2 ms overall response-delivery time. We present the TiH2/C60 cartridge injector output characterization (~180 mg of sublimated C60 gas) and first production results of a high-momentum C60 plasma jet (~0.6 g km/s).

  1. Study of cover source mismatch in steganalysis and ways to mitigate its impact

    NASA Astrophysics Data System (ADS)

    Kodovský, Jan; Sedighi, Vahid; Fridrich, Jessica

    2014-02-01

    When a steganalysis detector trained on one cover source is applied to images from a different source, generally the detection error increases due to the mismatch between both sources. In steganography, this situation is recognized as the so-called cover source mismatch (CSM). The drop in detection accuracy depends on many factors, including the properties of both sources, the detector construction, the feature space used to represent the covers, and the steganographic algorithm. Although well recognized as the single most important factor negatively affecting the performance of steganalyzers in practice, the CSM received surprisingly little attention from researchers. One of the reasons for this is the diversity with which the CSM can manifest. On a series of experiments in the spatial and JPEG domains, we refute some of the common misconceptions that the severity of the CSM is tied to the feature dimensionality or their "fragility." The CSM impact on detection appears too difficult to predict due to the effect of complex dependencies among the features. We also investigate ways to mitigate the negative effect of the CSM using simple measures, such as by enlarging the diversity of the training set (training on a mixture of sources) and by employing a bank of detectors trained on multiple different sources and testing on a detector trained on the closest source.

  2. Conforth Ranch Wildlife Mitigation Feasibility Study, McNary, Oregon : Annual Report.

    SciTech Connect

    Rasmussen, Larry; Wright, Patrick; Giger, Richard

    1991-03-01

    The 2,860-acre Conforth Ranch near Umatilla, Oregon is being considered for acquisition and management to partially mitigate wildlife losses associated with McNary Hydroelectric Project. The Habitat Evaluation Procedures (HEP) estimated that management for wildlife would result in habitat unit gains of 519 for meadowlark, 420 for quail, 431 for mallard, 466 for Canada goose, 405 for mink, 49 for downy woodpecker, 172 for yellow warbler, and 34 for spotted sandpiper. This amounts to a total combined gain of 2,495 habitat units -- a 110 percent increase over the existing values for these species combined of 2,274 habitat units. Current water delivery costs, estimated at $50,000 per year, are expected to increase to $125,000 per year. A survey of local interest indicated a majority of respondents favored the concept with a minority opposed. No contaminants that would preclude the Fish and Wildlife Service from agreeing to accept the property were identified. 21 refs., 3 figs., 5 tabs.

  3. Implications of Adhesion Studies for Dust Mitigation on Thermal Control Surfaces

    NASA Technical Reports Server (NTRS)

    Gaier, James R.; Berkebile, Stephen P.

    2012-01-01

    Experiments measuring the adhesion forces under ultrahigh vacuum conditions (10^-10 torr) between a synthetic volcanic glass and commonly used space exploration materials have recently been described. The glass has a chemistry and surface structure typical of the lunar regolith. It was found that Van der Waals forces between the glass and common spacecraft materials were negligible. Charge transfer between the materials was induced by mechanically striking the spacecraft material pin against the glass plate. No measurable adhesion occurred when striking the highly conducting materials; however, on striking insulating dielectric materials the adhesion increased dramatically. This indicates that electrostatic forces dominate over Van der Waals forces under these conditions. The presence of small amounts of surface contaminants was found to lower adhesive forces by at least two orders of magnitude, and perhaps more. Both particle and space exploration material surfaces will be cleaned by the interaction with the solar wind and other energetic processes, and they stay clean because of the extremely high vacuum (10^-12 torr), so the atomically clean adhesion values are probably the relevant ones for the lunar surface environment. These results are used to interpret the results of dust mitigation technology experiments utilizing textured surfaces, work function matching surfaces and brushing. They have also been used to reinterpret the results of the Apollo 14 Thermal Degradation Samples experiment.

  4. Standardization of Seismic Microzonification and Probabilistic Seismic Hazard Study Considering Site Effect for Metropolitan Areas in the State of Veracruz

    NASA Astrophysics Data System (ADS)

    Torres Morales, G. F.; Leonardo Suárez, M.; Dávalos Sotelo, R.; Castillo Aguilar, S.; Mora González, I.

    2014-12-01

    Preliminary results obtained from the project "Seismic Hazard in the State of Veracruz and Xalapa Conurbation" and "Microzonation of geological and hydrometeorological hazards for conurbations of Orizaba, Veracruz, and major sites located in the lower sub-basins: The Antigua and Jamapa" are presented. These projects were sponsored respectively by the PROMEP program and the Joint Funds CONACyT-Veracruz state government. The study consists of evaluating the probabilistic seismic hazard considering the site effect (SE) in the urban zones of the cities of Xalapa and Orizaba; the site effects in this preliminary stage were incorporated through a standard format proposed in studies of microzonation and application in computer systems, which makes it possible to optimize and condense microzonation studies of a city. This study stems from the need to know the seismic hazard (SH) in the State of Veracruz and its major cities, defining SH as the probabilistic description of exceedance of a given level of ground motion intensity (generally designated by the peak ground acceleration or the maximum ordinate of the pseudo-acceleration response spectrum, PGA and Sa, respectively) as a result of the action of an earthquake in the area of influence for a specified period of time. The evaluation results are presented through maps of seismic hazard exceedance rate curves and uniform hazard spectra (UHS) for different spectral ordinates and return periods, respectively.
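
    The exceedance rate curves mentioned above can be sketched with a toy probabilistic seismic hazard calculation: the annual rate of exceeding a ground-motion level is the sum over sources of each source's activity rate times the probability that an event there exceeds the level, and a Poisson assumption converts that rate into a probability of exceedance over a design life. The source rates and the lognormal ground-motion model below are invented numbers, not the Veracruz model.

        # Toy seismic hazard curve: annual exceedance rate of a PGA level,
        # summed over seismic sources, then converted to a 50-year probability.
        import math

        sources = [
            # (annual rate of events, median PGA at the site in g, log-std)
            (0.05, 0.10, 0.6),
            (0.01, 0.25, 0.6),
        ]

        def p_exceed_lognormal(pga, median, sigma):
            z = (math.log(pga) - math.log(median)) / sigma
            return 0.5 * math.erfc(z / math.sqrt(2.0))   # P(Z > z), standard normal

        def exceedance_rate(pga):
            return sum(nu * p_exceed_lognormal(pga, m, s) for nu, m, s in sources)

        for a in (0.05, 0.10, 0.20, 0.40):
            lam = exceedance_rate(a)
            p50 = 1.0 - math.exp(-lam * 50)              # Poisson, 50-year window
            print(f"PGA {a:.2f} g: rate {lam:.4f}/yr, 50-yr exceedance {p50:.1%}")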

  5. Progress in NTHMP Hazard Assessment

    USGS Publications Warehouse

    Gonzalez, F.I.; Titov, V.V.; Mofjeld, H.O.; Venturato, A.J.; Simmons, R.S.; Hansen, R.; Combellick, R.; Eisner, R.K.; Hoirup, D.F.; Yanagi, B.S.; Yong, S.; Darienzo, M.; Priest, G.R.; Crawford, G.L.; Walsh, T.J.

    2005-01-01

    The Hazard Assessment component of the U.S. National Tsunami Hazard Mitigation Program has completed 22 modeling efforts covering 113 coastal communities with an estimated population of 1.2 million residents that are at risk. Twenty-three evacuation maps have also been completed. Important improvements in organizational structure have been made with the addition of two State geotechnical agency representatives to Steering Group membership, and progress has been made on other improvements suggested by program reviewers. © Springer 2005.

  6. Volcanic hazards in Central America

    USGS Publications Warehouse

    Rose, William I.; Bluth, Gregg J.S.; Carr, Michael J.; Ewert, John W.; Patino, Lina C.; Vallance, James W.

    2006-01-01

    This volume is a sampling of current scientific work about volcanoes in Central America with specific application to hazards. The papers reflect a variety of international and interdisciplinary collaborations and employ new methods. The book will be of interest to a broad cross section of scientists, especially volcanologists. The volume also will interest students who aspire to work in the field of volcano hazards mitigation or who may want to work in one of Earth’s most volcanically active areas.

  7. Vertical Field of View Reference Point Study for Flight Path Control and Hazard Avoidance

    NASA Technical Reports Server (NTRS)

    Comstock, J. Raymond, Jr.; Rudisill, Marianne; Kramer, Lynda J.; Busquets, Anthony M.

    2002-01-01

    Researchers within the eXternal Visibility System (XVS) element of the High-Speed Research (HSR) program developed and evaluated display concepts that will provide the flight crew of the proposed High-Speed Civil Transport (HSCT) with integrated imagery and symbology to permit path control and hazard avoidance functions while maintaining required situation awareness. The challenge of the XVS program is to develop concepts that would permit a no-nose-droop configuration of an HSCT and expanded low visibility HSCT operational capabilities. This study was one of a series of experiments exploring the 'design space' restrictions for physical placement of an XVS display. The primary experimental issue here was 'conformality' of the forward display vertical position with respect to the side window in simulated flight. 'Conformality' refers to the case in which the horizon and objects appear in the same relative positions when viewed through the forward windows or display and the side windows. This study quantified the effects of visual conformality on pilot flight path control and hazard avoidance performance. Here, conformality related to the positioning and relationship of the artificial horizon line and associated symbology presented on the forward display and the horizon and associated ground, horizon, and sky textures as they would appear in the real view through a window presented in the side window display. No significant performance consequences were found for the non-conformal conditions.

  8. Expert study to select indicators of the occurrence of emerging mycotoxin hazards.

    PubMed

    Kandhai, M C; Booij, C J H; Van der Fels-Klerx, H J

    2011-01-01

    This article describes a Delphi-based expert judgment study aimed at the selection of indicators to identify the occurrence of emerging mycotoxin hazards related to Fusarium spp. in wheat supply chains. A panel of 29 experts from 12 European countries followed a holistic approach to evaluate the most important indicators for different chain stages (growth, transport and storage, and processing) and their relative importance. After three e-mailing rounds, the experts reached consensus on the most important indicators for each of the three stages: wheat growth, transport and storage, and processing. For wheat growth, these indicators include: relative humidity/rainfall, crop rotation, temperature, tillage practice, water activity of the kernels, and crop variety/cultivar. For the transport and storage stage, they include water activity in the kernels, relative humidity, ventilation, temperature, storage capacity, and logistics. For wheat processing, indicators include quality data, fraction of the cereal used, water activity in the kernels, quality management and traceability systems, and carryover of contamination. The indicators selected in this study can be used in an identification system for the occurrence of emerging mycotoxin hazards in wheat supply chains. Such a system can be used by risk managers within governmental (related) organizations and/or the food and feed industry in order to react proactively to the occurrence of these emerging mycotoxins.

  9. RISMUR II: New seismic hazard and risk study in Murcia Region after the Lorca Earthquake, 2011

    NASA Astrophysics Data System (ADS)

    Benito, Belen; Gaspar, Jorge; Rivas, Alicia; Quiros, Ligia; Ruiz, Sandra; Hernandez, Roman; Torres, Yolanda; Staller, Sandra

    2016-04-01

    The Murcia Region, located in the SE Iberian Peninsula, is one of the areas of highest seismic activity in Spain. A system of active faults crosses the region, where the most recent damaging earthquakes in the country took place: 1999, 2002, 2005 and 2011. The last one occurred in Lorca, causing 9 deaths and considerable material losses, including damage to the artistic heritage. The seismic emergency plan of the Murcia Region was developed in 2006, based on the results of the risk project RISMUR I, which among other conclusions identified Lorca as one of the municipalities with the highest risk in the province. After the Lorca earthquake of 2011, a revision of the previous study was carried out through the project RISMUR II, including data from this earthquake as well as updated databases of seismicity, active faults, strong-motion records, cadastre, vulnerability, etc. In addition, the new study includes some methodological innovations: modelling of faults as independent units for hazard assessment, and analytical methods for risk estimation using data from the earthquake to calibrate capacity and fragility curves. In this work the results of RISMUR II are presented and compared with those obtained in RISMUR I. The main conclusions are: an increase of the hazard along the SW-NE central fault system (Alhama de Murcia, Totana and Carrascoy), which implies higher expected damages in the populations nearest to these faults: Lorca, Totana, Alcantarilla and Murcia.

  10. A Randomized, Controlled Trial of Home Injury Hazard Reduction: The HOME Injury Study

    PubMed Central

    Phelan, Kieran J.; Khoury, Jane; Xu, Yingying; Liddy, Stacey; Hornung, Richard; Lanphear, Bruce P.

    2013-01-01

    Objective Test the efficacy of an intervention of safety device installation on medically-attended injury in children from birth to 3 years of age. Design A nested, prospective, randomized, controlled trial. Setting Indoor environment of housing units of mothers and children. Participants Mothers and their children enrolled in a birth cohort examining the effects of prevalent neurotoxicants on child development, the Home Observation and Measures of the Environment (HOME) Study. Intervention Installation of multiple, passive measures (stairgates, window locks, smoke & carbon monoxide detectors) to reduce exposure to injury hazards present in housing units. Outcome measures Self-reported, medically-attended, and modifiable injury. Methods 1263 (14%) prenatal patients were eligible, 413 (33%) agreed to participate and 355 were randomly assigned to the experimental (n=181) or control (n=174) groups. Injury hazards were assessed at home visits by teams of trained research assistants using a validated survey. Safety devices were installed in intervention homes. Intention-to-treat analyses to test efficacy were conducted on: 1) total injury rates and 2) injuries deemed, a priori, modifiable by the installation of safety devices. Rates of medically attended injuries (phone calls, office or emergency visits) were calculated using generalized estimating equations. Results The mean age of the children at intervention was 6 months. Injury hazards were significantly reduced in the intervention but not in the control group homes at one and two years (p<0.004). There was not a significant difference in the rate of all medically-attended injuries in intervention compared with control group children, 14.3 (95%CI 9.7, 21.1) vs. 20.8 (14.4, 29.9) per 100 child-years (p=0.17) respectively; but there was a significant reduction in modifiable medically attended injuries in intervention compared with control group children, 2.3 (1.0, 5.5) vs. 7.7 (4.2, 14.2) per 100 child-years, respectively
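
    The headline comparison above is a rate contrast per 100 child-years. A minimal sketch of the crude rate ratios, using only the point estimates quoted in the abstract and ignoring the confidence intervals and the generalized-estimating-equation adjustment used in the study:

        # Crude incidence-rate ratios from the reported rates (per 100 child-years).
        rates = {
            "all injuries":        {"intervention": 14.3, "control": 20.8},
            "modifiable injuries": {"intervention": 2.3,  "control": 7.7},
        }
        for outcome, r in rates.items():
            ratio = r["intervention"] / r["control"]
            print(f"{outcome}: rate ratio = {ratio:.2f}")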

  11. Application of a Data Mining Model and Its Cross-Application for Landslide Hazard Analysis: a Case Study in Malaysia

    NASA Astrophysics Data System (ADS)

    Pradhan, Biswajeet; Lee, Saro; Shattri, Mansor

    This paper deals with landslide hazard analysis and cross-application using Geographic Information System (GIS) and remote sensing data for Cameron Highland, Penang Island and Selangor in Malaysia. The aim of this study was to cross-apply and verify a spatial probabilistic model for landslide hazard analysis. Landslide locations were identified in the study area from interpretation of aerial photographs and field surveys. Topographical/geological data and satellite images were collected and processed using GIS and image processing tools. Ten landslide-inducing parameters were considered for the landslide hazard analysis: topographic slope, aspect, curvature and distance from drainage, all derived from the topographic database; geology and distance from lineament, derived from the geologic database; land use from Landsat satellite images; soil from the soil database; precipitation amount, derived from the rainfall database; and the vegetation index value from SPOT satellite images. These factors were analyzed using an artificial neural network model to generate the landslide hazard map. Each factor's weight was determined by the back-propagation training method. The landslide hazard indices were then calculated using the trained back-propagation weights, and finally the landslide hazard map was generated using GIS tools. Landslide hazard maps were drawn for these three areas using artificial neural network models derived not only from the data for that area but also using the parameter weights calculated from each of the other two areas (nine maps in all) as a cross-check of the validity of the method. For verification, the results of the analyses were compared, in each study area, with actual landslide locations. The verification results showed sufficient agreement between the predicted hazard maps and the existing data on landslide areas.
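
    The workflow above (train a back-propagation network on conditioning factors, then map the predicted probability as a hazard index) can be sketched with synthetic data. The scikit-learn MLPClassifier stands in for the study's network, and ten random factor columns stand in for the GIS-derived layers; none of the values are from the Malaysian datasets.

        # Sketch: back-propagation network on landslide-conditioning factors,
        # with the predicted probability used as a per-cell hazard index.
        import numpy as np
        from sklearn.neural_network import MLPClassifier

        rng = np.random.default_rng(0)
        n_cells, n_factors = 1000, 10        # e.g. slope, aspect, curvature, ...
        X = rng.random((n_cells, n_factors))
        # synthetic "landslide occurred" labels, loosely tied to two factors
        y = (X[:, 0] + 0.5 * X[:, 4] + 0.2 * rng.standard_normal(n_cells) > 1.0).astype(int)

        model = MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000, random_state=0)
        model.fit(X, y)

        hazard_index = model.predict_proba(X)[:, 1]      # per-cell hazard index
        print("mean hazard index:", round(float(hazard_index.mean()), 3))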

  12. Cost-benefit analysis of alternative LNG vapor-mitigation measures. Topical report, September 14, 1987-January 15, 1991

    SciTech Connect

    Atallah, S.

    1992-06-25

    A generalized methodology is presented for comparing the costs and safety benefits of alternative hazard mitigation measures for a large LNG vapor release. The procedure involves the quantification of the risk to the public before and after the application of LNG vapor mitigation measures. In the study, risk was defined as the product of the annual accident frequency, estimated from a fault tree analysis, and the severity of the accident. Severity was measured in terms of the number of people who may be exposed to 2.5% or higher concentration. The ratios of the annual costs of the various mitigation measures to their safety benefits (as determined by the differences between the risk before and after mitigation measure implementation), were then used to identify the most cost-effective approaches to vapor cloud mitigation.
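
    The bookkeeping described above reduces to a few lines: risk is the annual accident frequency times the severity (people exposed to a 2.5% or higher concentration), and each mitigation measure is scored by its annual cost per unit of risk reduced. All numbers below are hypothetical placeholders, not values from the report.

        # Sketch of risk = frequency x severity and cost-per-benefit ranking
        # of vapor-mitigation measures. All figures are hypothetical.
        baseline = {"frequency_per_yr": 1e-4, "people_exposed": 5000}

        measures = {
            # name: (annual cost in $, frequency after, people exposed after)
            "measure A": (200_000, 1e-4, 2000),
            "measure B": (350_000, 1e-4, 1200),
        }

        risk_before = baseline["frequency_per_yr"] * baseline["people_exposed"]

        for name, (cost, freq, exposed) in measures.items():
            benefit = risk_before - freq * exposed       # expected exposures avoided per year
            print(f"{name}: annual cost per unit of risk reduced = {cost / benefit:,.0f}")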

  13. Application of multi-agent coordination methods to the design of space debris mitigation tours

    NASA Astrophysics Data System (ADS)

    Stuart, Jeffrey; Howell, Kathleen; Wilson, Roby

    2016-04-01

    The growth in the number of defunct and fragmented objects near to the Earth poses a growing hazard to launch operations as well as existing on-orbit assets. Numerous studies have demonstrated the positive impact of active debris mitigation campaigns upon the growth of debris populations, but comparatively fewer investigations incorporate specific mission scenarios. Furthermore, while many active mitigation methods have been proposed, certain classes of debris objects are amenable to mitigation campaigns employing chaser spacecraft with existing chemical and low-thrust propulsive technologies. This investigation incorporates an ant colony optimization routing algorithm and multi-agent coordination via auctions into a debris mitigation tour scheme suitable for preliminary mission design and analysis as well as spacecraft flight operations.

  14. Evaluation of MEDALUS model for desertification hazard zonation using GIS; study area: Iyzad Khast plain, Iran.

    PubMed

    Farajzadeh, Manuchehr; Egbal, Mahbobeh Nik

    2007-08-15

    In this study, the MEDALUS model along with GIS mapping techniques was used to map desertification hazard for the Iyzad Khast plain in Iran. After creating a desertification database including 20 parameters, the first step consisted of preparing maps of the four indices of the MEDALUS model: climate, soil, vegetation and land use. Since these parameters have mostly been presented for the Mediterranean region in the past, the next step included the addition of other indicators such as ground water and wind erosion. All of the layers, weighted according to the environmental conditions present in the area, were then combined (following the same MEDALUS framework) to prepare the desertification map. The comparison of the two maps based on the original and modified MEDALUS models indicates that the addition of more regionally specific parameters into the model allows for a more accurate representation of desertification processes across the Iyzad Khast plain. The major factors affecting desertification in the area are climate, wind erosion, poor land quality management, vegetation degradation and the salinization of soil and water resources.
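
    In the MEDALUS framework the quality indices are combined as a geometric mean, and the modification described above simply adds extra index layers before the mean is taken. A minimal sketch with hypothetical index values (each conventionally scaled between 1.0 and 2.0):

        # Sketch: MEDALUS-style sensitivity score as the geometric mean of
        # quality indices; groundwater and wind erosion are the added layers.
        import math

        def sensitivity(indices):
            return math.prod(indices.values()) ** (1.0 / len(indices))

        original = {"climate": 1.6, "soil": 1.4, "vegetation": 1.5, "land_use": 1.3}
        modified = dict(original, groundwater=1.7, wind_erosion=1.8)

        print("original model score:", round(sensitivity(original), 2))
        print("modified model score:", round(sensitivity(modified), 2))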

  15. Hazard function theory for nonstationary natural hazards

    NASA Astrophysics Data System (ADS)

    Read, L.; Vogel, R. M.

    2015-12-01

    Studies from the natural hazards literature indicate that many natural processes, including wind speeds, landslides, wildfires, precipitation, streamflow and earthquakes, show evidence of nonstationary behavior such as trends in magnitudes through time. Traditional probabilistic analysis of natural hazards based on partial duration series (PDS) generally assumes stationarity in the magnitudes and arrivals of events, i.e. that the probability of exceedance is constant through time. Given evidence of trends and the consequent expected growth in devastating impacts from natural hazards across the world, new methods are needed to characterize their probabilistic behavior. The field of hazard function analysis (HFA) is ideally suited to this problem because its primary goal is to describe changes in the exceedance probability of an event over time. HFA is widely used in medicine, manufacturing, actuarial statistics, reliability engineering, economics, and elsewhere. HFA provides a rich theory to relate the natural hazard event series (x) with its failure time series (t), enabling computation of corresponding average return periods and reliabilities associated with nonstationary event series. This work investigates the suitability of HFA to characterize nonstationary natural hazards whose PDS magnitudes are assumed to follow the widely applied Poisson-GP model. We derive a 2-parameter Generalized Pareto hazard model and demonstrate how metrics such as reliability and average return period are impacted by nonstationarity and discuss the implications for planning and design. Our theoretical analysis linking hazard event series x, with corresponding failure time series t, should have application to a wide class of natural hazards.
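
    A minimal sketch of the Poisson-GP setting discussed above: exceedances over a threshold arrive as a Poisson process and their magnitudes follow a Generalized Pareto distribution, so a trend in the GP scale parameter changes the exceedance probability of a fixed design level, and with it the average return period. The parameter values and the 1 %/yr trend are invented for illustration; scipy's genpareto supplies the survival function.

        # Nonstationary Poisson-GP sketch: return period of a fixed hazard level
        # when the Generalized Pareto scale parameter grows linearly with time.
        from scipy.stats import genpareto

        shape, scale0, events_per_yr = 0.1, 1.0, 2.0   # GP shape, initial scale, PDS rate
        design_level = 5.0                             # magnitude of interest (above threshold)

        for year in (0, 25, 50):
            scale_t = scale0 * (1 + 0.01 * year)                  # hypothetical 1 %/yr trend
            p_exceed = genpareto.sf(design_level, shape, scale=scale_t)
            rate = events_per_yr * p_exceed                       # exceedances of level per year
            print(f"year {year:2d}: average return period ~ {1 / rate:.0f} yr")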

  16. Concerns About Climate Change Mitigation Projects: Summary of Findings from Case Studies in Brazil, India, Mexico, and South Africa

    SciTech Connect

    Sathaye, Jayant A.; Andrasko, Kenneth; Makundi, Willy; La Rovere, Emilio Lebre; Ravinandranath, N.H.; Melli, Anandi; Rangachari, Anita; Amaz, Mireya; Gay, Carlos; Friedmann, Rafael; Goldberg, Beth; van Horen, Clive; Simmonds, Gillina; Parker, Gretchen

    1998-11-01

    The concept of joint implementation as a way to implement climate change mitigation projects in another country has been controversial ever since its inception. Developing countries have raised numerous issues at the project-specific technical level, and broader concerns having to do with equity and burden sharing. This paper summarizes the findings of studies for Brazil, India, Mexico and South Africa, four countries that have large greenhouse gas emissions and are heavily engaged in the debate on climate change projects under the Kyoto Protocol. The studies examine potential or current projects/programs to determine whether eight technical concerns about joint implementation can be adequately addressed. They conclude that about half the concerns were minor or well managed by project developers, but concerns about additionality of funds, host country institutions and guarantees of performance (including the issues of baselines and possible leakage) need much more effort to be adequately addressed. All the papers agree on the need to develop institutional arrangements for approving and monitoring such projects in each of the countries represented. The case studies illustrate that these projects have the potential to bring new technology, investment, employment and ancillary socioeconomic and environmental benefits to developing countries. These benefits are consistent with the goal of sustainable development in the four study countries. At a policy level, the studies' authors note that in their view, the Annex I countries should consider limits on the use of jointly implemented projects as a way to get credits against their own emissions at home, and stress the importance of industrialized countries developing new technologies that will benefit all countries. The authors also observe that if all countries accepted caps on their emissions (with a longer time period allowed for developing countries to do so) project-based GHG mitigation would be significantly

  17. Economic optimization of natural hazard protection - conceptual study of existing approaches

    NASA Astrophysics Data System (ADS)

    Spackova, Olga; Straub, Daniel

    2013-04-01

    Risk-based planning of protection measures against natural hazards has become a common practice in many countries. The selection procedure aims at identifying an economically efficient strategy with regard to the estimated costs and risk (i.e. expected damage). A correct setting of the evaluation methodology and decision criteria should ensure an optimal selection of the portfolio of risk protection measures under a limited state budget. To demonstrate the efficiency of investments, indicators such as Benefit-Cost Ratio (BCR), Marginal Costs (MC) or Net Present Value (NPV) are commonly used. However, the methodologies for efficiency evaluation differ amongst different countries and different hazard types (floods, earthquakes etc.). Additionally, several inconsistencies can be found in the applications of the indicators in practice. This is likely to lead to a suboptimal selection of the protection strategies. This study provides a general formulation for optimization of the natural hazard protection measures from a socio-economic perspective. It assumes that all costs and risks can be expressed in monetary values. The study regards the problem as a discrete hierarchical optimization, where the state level sets the criteria and constraints, while the actual optimization is made on the regional level (towns, catchments) when designing particular protection measures and selecting the optimal protection level. The study shows that in case of an unlimited budget, the task is quite trivial, as it is sufficient to optimize the protection measures in individual regions independently (by minimizing the sum of risk and cost). However, if the budget is limited, the need for an optimal allocation of resources amongst the regions arises. To ensure this, minimum values of BCR or MC can be required by the state, which must be achieved in each region. The study investigates the meaning of these indicators in the optimization task at the conceptual level and compares their
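
    The indicators compared in the study (NPV, BCR, marginal cost) are all derived from the same discounted cash flows of a protection measure: the up-front cost and the stream of avoided annual risk. A minimal sketch for a single measure, with hypothetical numbers:

        # Sketch: NPV and benefit-cost ratio of one protection measure from its
        # up-front cost and the annual risk (expected damage) it avoids.
        def npv(cash_flows, discount_rate):
            return sum(cf / (1 + discount_rate) ** t for t, cf in enumerate(cash_flows))

        investment = 5_000_000            # hypothetical up-front cost
        annual_risk_reduction = 400_000   # hypothetical expected damage avoided per year
        horizon, r = 50, 0.03             # planning horizon (years), discount rate

        benefits = [0] + [annual_risk_reduction] * horizon
        costs = [investment] + [0] * horizon

        print("NPV:", round(npv(benefits, r) - npv(costs, r)))
        print("BCR:", round(npv(benefits, r) / npv(costs, r), 2))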

  18. Effects of vegetation on debris flow mitigation: A case study from Gansu province, China

    NASA Astrophysics Data System (ADS)

    Wang, Siyuan; Meng, Xingmin; Chen, Guan; Guo, Peng; Xiong, Muqi; Zeng, Runqiang

    2017-04-01

    Debris flows are traditionally controlled using civil engineering structures such as check dams. However, the misuse of such strategies may sometimes trigger environmental hazards such as the catastrophic landslide in 2010 in Zouqu county, China, and therefore other methods such as the use of vegetation as an eco-engineering tool are increasingly being adopted. The aim of the present research was to investigate the bioengineering effects of vegetation over time in an area prone to debris flows in Gansu province, China. We collected detailed data from 2012 to 2014 on vegetation type, density, and root system morphology, and measured profiles across the valley. In addition, we assessed the increased soil cohesion provided by the root development of three monospecific stands of Robinia pseudoacacia of different ages growing within the debris valley, and on a larger scale, their effects on channel morphology. These data were incorporated into a modified form of BSTEM (Bank Stability and Toe Erosion Model) and a cellular braided-stream model. The results indicate that with increasing age, the FOS (factor of safety) of the bank would be significantly increased, and that the flooded area in the valley caused by simulated flood events would be decreased by 18-24%, on average. Subsequently, field data were incorporated into a cellular model to simulate sediment movement and the effects of vegetation on the channel dynamics. The results demonstrate that the stability provided by vegetation could result in a less active valley system and that overall the development of debris-controlling vegetation could make a major contribution to ecosystem restoration. However, careful management is essential for making optimum use of the vegetation.

  19. The Study on Ecological Treatment of Saline Lands to Mitigate the Effects of Climate Change

    NASA Astrophysics Data System (ADS)

    Xie, Jiancang; Zhu, Jiwei; Wang, Tao

    2010-05-01

    Soil water and salt movement is strongly influenced by frequent droughts, floods and climate change. In addition, with continued population growth, large-scale reclamation of arable land and long-term unreasonable irrigation, saline land is increasing at a rate of 1,000,000~15,000,000 mu per year worldwide. Traditional management, with "drainage as the main measure", has a series of problems: larger works, more occupation of land, obstacles to water saving, and downstream pollution. In response to global climate change, it has become a common understanding to promote energy saving and environmental protection, to rethink the current model, and to explore ecological management models. In this paper we take a severely salinized area, Lubotan in Shaanxi Province, as an example. Through nearly 10 years of harnessing practice and observation of meteorological, hydrological and soil indicators, we analyze the influence of climate change on soil salinity movement in different seasons and years, and then put forward and apply a new model of saline land management to mitigate the effects of climate change and allow the environment to rehabilitate itself. This model changes "drainage" into "storage": through the establishment of engineering works with "storage as the main measure", and comprehensive "project - biology - agriculture" measures, saline land is being converted into arable land. Adapted to the natural variation of climate, rainfall, irrigation return flow and groundwater level, human intervention is reduced to achieve a dynamic equilibrium of the system. Over the ten years, the salt content of the plough horizon has been reduced from 0.74% to 0.20%, organic matter has increased from 0.7% to 0.92%, and various soil indicators are beginning to improve. At the same time, the water used for irrigation, drainage pollution and investment costs have been reduced. Through this model, 18,900 mu of severely saline land has been reclaimed and 16,500 mu of new cultivated land created; the comprehensive benefits are significant, ensuring the coordinated

  20. Case study: Mapping tsunami hazards associated with debris flow into a reservoir

    USGS Publications Warehouse

    Walder, J.S.; Watts, P.; Waythomas, C.F.

    2006-01-01

    Debris-flow generated impulse waves (tsunamis) pose hazards in lakes, especially those used for hydropower or recreation. We describe a method for assessing tsunami-related hazards for the case in which inundation by coherent water waves, rather than chaotic splashing, is of primary concern. The method involves an experimentally based initial condition (tsunami source) and a Boussinesq model for tsunami propagation and inundation. Model results are used to create hazard maps that offer guidance for emergency planners and responders. An example application explores tsunami hazards associated with potential debris flows entering Baker Lake, a reservoir on the flanks of the Mount Baker volcano in the northwestern United States. © 2006 ASCE.

  1. Property-close source separation of hazardous waste and waste electrical and electronic equipment--a Swedish case study.

    PubMed

    Bernstad, Anna; la Cour Jansen, Jes; Aspegren, Henrik

    2011-03-01

    Through an agreement with EEE producers, Swedish municipalities are responsible for collection of hazardous waste and waste electrical and electronic equipment (WEEE). In most Swedish municipalities, collection of these waste fractions is concentrated to waste recycling centres where households can source-separate and deposit hazardous waste and WEEE free of charge. However, the centres are often located on the outskirts of city centres and cars are needed in order to use the facilities in most cases. A full-scale experiment was performed in a residential area in southern Sweden to evaluate effects of a system for property-close source separation of hazardous waste and WEEE. After the system was introduced, results show a clear reduction in the amount of hazardous waste and WEEE disposed of incorrectly amongst residual waste or dry recyclables. The systems resulted in a source separation ratio of 70 wt% for hazardous waste and 76 wt% in the case of WEEE. Results show that households in the study area were willing to increase source separation of hazardous waste and WEEE when accessibility was improved and that this and similar collection systems can play an important role in building up increasingly sustainable solid waste management systems.

  2. Risk assessment of debris flow hazards in natural slope

    NASA Astrophysics Data System (ADS)

    Choi, Junghae; Chae, Byung-gon; Liu, Kofei; Wu, Yinghsin

    2016-04-01

    The study area is located in the north-east part of South Korea. According to the landslide susceptibility map (KIGAM, 2009) from the Korea Institute of Geoscience and Mineral Resources (KIGAM), there are large areas with a high probability of landslides on the mountain slopes near the study area. In addition, some severe landslide-induced debris flow hazards recently occurred in this area, so the site is considered prone to debris flow hazards. In order to mitigate the influence of such hazards, the assessment of potential debris flows is essential. In this assessment, we use Debris-2D, a debris flow numerical program, to assess the potential debris flow hazards. The worst-case scenario is considered for simulation. The input mass sources are determined using the landslide susceptibility map. The water input is based on the daily accumulated rainfall in a past debris flow event in the study area. The only input material property, i.e., the yield stress, is obtained using a calibration test. The simulation results show that the study area has the potential to be impacted by debris flows. Therefore, based on the simulation results, we can propose countermeasures to mitigate debris flow hazards, including building check dams, constructing a protection wall in the study area, and installing instruments for active monitoring of debris flows. Acknowledgements: This research was supported by the Public Welfare & Safety Research Program through the National Research Foundation of Korea (NRF) funded by the Ministry of Science, ICT & Future Planning (NRF-2012M3A2A1050983)

  3. Monitoring and modelling for landslide risk mitigation and reduction. The case study of San Benedetto Ullano (Northern Calabria - Italy)

    NASA Astrophysics Data System (ADS)

    Terranova, Oreste G.; Greco, Venanzio R.; Gariano, Stefano L.; Pascale, Stefania; Rago, Valeria; Caloiero, Paola; Iovine, Giulio G. R.

    2016-04-01

    movements caused severe damage to roads and infrastructure. The second crisis ended in late June: the hydrological model FLaIR was then successfully tested against the known dates of activation of the slope movement, by using a local rain series [3]. Meanwhile, technical support was provided to the Municipality to optimize geological prospections, monitoring, and the design of remedial works of a master plan. A third activation occurred during the night of 15 March 2013, when the planned remedial works had not yet been completed. By applying the hydrological model SAKe [4, 5], this activation could be predicted, again permitting the prompt adoption of mitigation measures. This activation was triggered by rains smaller and shorter than those that caused previous activations, perhaps indicating an increasing fragility of the slope. Changes in slope stability conditions before and after the construction of the remedial works are being investigated. Critical rain conditions and groundwater levels for landslide activation are in fact expected to change, depending on the combined effects of natural weakening vs. artificial strengthening. Monitoring will allow quantitative verification of the new relationships between rainfall, groundwater and slope stability. References [1] Iovine G., Iaquinta P. & Terranova O. (2009). In Anderssen, Braddock & Newham (Eds.), Proc. 18th World IMACS Congr. and MODSIM09 Int. Congr. on Modelling and Simulation, pp. 2686-2693. [2] Iovine G., Lollino P., Gariano S.L. & Terranova O.G. (2010). NHESS, 10, 2341-2354. [3] Capparelli G., Iaquinta P., Iovine G., Terranova O.G. & Versace P. (2012). Natural Hazards, 61(1), pp.247-256. [4] Terranova O.G., Iaquinta P., Gariano S.L., Greco R. & Iovine G. (2013) In: Landslide Science and Practice, Margottini, Canuti, Sassa (Eds.), Vol. 3, pp.73-79. [5] Terranova O.G., Gariano S.L., Iaquinta P. & Iovine G.G.R. (2015). Geosci. Model Dev., 8, 1955-1978.

  4. Reproductive Hazards

    MedlinePlus

    ... such as lead and mercury; chemicals such as pesticides; cigarettes; some viruses; alcohol. For men, a reproductive hazard can affect the sperm. For a woman, a reproductive hazard can cause different effects during pregnancy, depending on when she is exposed. ...

  5. Status of volcanic hazard studies for the Nevada Nuclear Waste Storage Investigations. Volume II

    SciTech Connect

    Crowe, B.M.; Wohletz, K.H.; Vaniman, D.T.; Gladney, E.; Bower, N.

    1986-01-01

    Volcanic hazard investigations during FY 1984 focused on five topics: the emplacement mechanism of shallow basalt intrusions, geochemical trends through time for volcanic fields of the Death Valley-Pancake Range volcanic zone, the possibility of bimodal basalt-rhyolite volcanism, the age and process of enrichment for incompatible elements in young basalts of the Nevada Test Site (NTS) region, and the possibility of hydrovolcanic activity. The stress regime of Yucca Mountain may favor formation of shallow basalt intrusions. However, combined field and drill-hole studies suggest shallow basalt intrusions are rare in the geologic record of the southern Great Basin. The geochemical patterns of basaltic volcanism through time in the NTS region provide no evidence for evolution toward a large-volume volcanic field or increases in future rates of volcanism. Existing data are consistent with a declining volcanic system comparable to the late stages of the southern Death Valley volcanic field. The hazards of bimodal volcanism in this area are judged to be low. The source of a 6-Myr pumice discovered in alluvial deposits of Crater Flat has not been found. Geochemical studies show that the enrichment of trace elements in the younger rift basalts must be related to an enrichment of their mantle source rocks. This geochemical enrichment event, which may have been metasomatic alteration, predates the basalts of the silicic episode and is, therefore, not a young event. Studies of crater dimensions of hydrovolcanic landforms indicate that the worst case scenario (exhumation of a repository at Yucca Mountain by hydrovolcanic explosions) is unlikely. Theoretical models of melt-water vapor explosions, particularly the thermal detonation model, suggest hydrovolcanic explosions are possible at Yucca Mountain. 80 refs., 21 figs., 5 tabs.

  6. Incorporating Community Knowledge to Lahar Hazard Maps: Canton Buenos Aires Case Study, at Santa Ana (Ilamatepec) Volcano

    NASA Astrophysics Data System (ADS)

    Bajo, J. V.; Martinez-Hackert, B.; Polio, C.; Gutierrez, E.

    2015-12-01

    Santa Ana (Ilamatepec) Volcano is an active composite volcano in the Apaneca Volcanic Field in the western part of El Salvador, Central America. The volcano is surrounded by rural communities in its proximal areas and by the second (Santa Ana, 13 km) and fourth (Sonsosante, 15 km) largest cities of the country. On October 1st, 2005, the volcano erupted after months of increased activity. Following the eruption, volcanic mitigation projects were conducted in the region, but the communities had little or no input on them. This project consisted of the creation of a lahar hazard map for Canton Buenos Aires on the northern part of the volcano by incorporating the community's knowledge of prior events into the model parameters and results. The work with the community consisted of several meetings in which community members recounted past events. They were asked to map the outcomes of those events using either a topographic map of the area, a Google Earth image, or a blank poster-size sheet of paper. These maps have been used to identify hazardous and vulnerable areas and to validate the model. The resulting hazard maps were presented to the communities, which accepted the results.

  7. Classification of residential areas according to physical vulnerability to natural hazards: a case study of Çanakkale, Turkey.

    PubMed

    Başaran-Uysal, Arzu; Sezen, Funda; Ozden, Süha; Karaca, Oznur

    2014-01-01

    The selection of new settlement areas and the construction of safe buildings, as well as rendering built-up areas safe, are of great importance in mitigating the damage caused by natural disasters. Most cities in Turkey are unprepared for natural hazards. In this paper, Çanakkale, located in a first-degree seismic zone and sprawled around the Sartçay Delta, is examined in terms of its physical vulnerability to natural hazards. Residential areas are analysed using GIS (geographic information system) and remote-sensing technologies in relation to selected indicators. Residential areas of the city are divided into zones according to an evaluation of geological characteristics, the built-up area's features, and urban infrastructure, and four risk zones are determined. The results of the analysis show that the areas of the city suitable for housing are very limited. In addition, the historical centre and the housing areas near Sartçay stream are shown to be most problematic in terms of natural disasters and sustainability.

  8. Model-Predictive Cascade Mitigation in Electric Power Systems With Storage and Renewables-Part II: Case-Study

    SciTech Connect

    Almassalkhi, MR; Hiskens, IA

    2015-01-01

    The novel cascade-mitigation scheme developed in Part I of this paper is implemented within a receding-horizon model predictive control (MPC) scheme with a linear controller model. This present paper illustrates the MPC strategy with a case-study that is based on the IEEE RTS-96 network, though with energy storage and renewable generation added. It is shown that the MPC strategy alleviates temperature overloads on transmission lines by rescheduling generation, energy storage, and other network elements, while taking into account ramp-rate limits and network limitations. Resilient performance is achieved despite the use of a simplified linear controller model. The MPC scheme is compared against a base-case that seeks to emulate human operator behavior.

  9. Eco-efficiency for greenhouse gas emissions mitigation of municipal solid waste management: a case study of Tianjin, China.

    PubMed

    Zhao, Wei; Huppes, Gjalt; van der Voet, Ester

    2011-06-01

    The issue of municipal solid waste (MSW) management has been highlighted in China due to the continually increasing MSW volumes being generated and the limited capacity of waste treatment facilities. This article presents a quantitative eco-efficiency (E/E) analysis on MSW management in terms of greenhouse gas (GHG) mitigation. A methodology for E/E analysis has been proposed, with an emphasis on the consistent integration of life cycle assessment (LCA) and life cycle costing (LCC). The environmental and economic impacts derived from LCA and LCC have been normalized and defined as a quantitative E/E indicator. The proposed method was applied in a case study of Tianjin, China. The study assessed the current MSW management system, as well as a set of alternative scenarios, to investigate trade-offs between economy and GHG emissions mitigation. Additionally, contribution analysis was conducted on both LCA and LCC to identify key issues driving environmental and economic impacts. The results show that the current Tianjin's MSW management system emits the highest GHG and costs the least, whereas the situation reverses in the integrated scenario. The key issues identified by the contribution analysis show no linear relationship between the global warming impact and the cost impact in MSW management system. The landfill gas utilization scenario is indicated as a potential optimum scenario by the proposed E/E analysis, given the characteristics of MSW, technology levels, and chosen methodologies. The E/E analysis provides an attractive direction towards sustainable waste management, though some questions with respect to uncertainty need to be discussed further.
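
    One way to read the quantitative E/E indicator described in this and the following record is as a ratio of normalized impacts: each scenario's LCA-derived GHG total and LCC-derived cost are normalized against a reference scenario and divided. The scenario values and the choice of the current system as the reference below are hypothetical, not the Tianjin results.

        # Sketch: eco-efficiency indicator = normalized GHG impact / normalized cost,
        # with the current system as the (hypothetical) reference scenario.
        scenarios = {
            # name: (GHG, kt CO2-eq per year; cost, million CNY per year)
            "current system":      (900, 300),
            "landfill gas use":    (650, 340),
            "integrated scenario": (450, 520),
        }

        ref_ghg, ref_cost = scenarios["current system"]

        for name, (ghg, cost) in scenarios.items():
            indicator = (ghg / ref_ghg) / (cost / ref_cost)   # lower = more mitigation per unit cost
            print(f"{name}: E/E indicator = {indicator:.2f}")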

  10. Eco-efficiency for greenhouse gas emissions mitigation of municipal solid waste management: A case study of Tianjin, China

    SciTech Connect

    Zhao Wei; Huppes, Gjalt; Voet, Ester van der

    2011-06-15

    The issue of municipal solid waste (MSW) management has been highlighted in China due to the continually increasing MSW volumes being generated and the limited capacity of waste treatment facilities. This article presents a quantitative eco-efficiency (E/E) analysis of MSW management in terms of greenhouse gas (GHG) mitigation. A methodology for E/E analysis has been proposed, with an emphasis on the consistent integration of life cycle assessment (LCA) and life cycle costing (LCC). The environmental and economic impacts derived from LCA and LCC have been normalized and defined as a quantitative E/E indicator. The proposed method was applied in a case study of Tianjin, China. The study assessed the current MSW management system, as well as a set of alternative scenarios, to investigate trade-offs between economy and GHG emissions mitigation. Additionally, contribution analysis was conducted on both the LCA and the LCC to identify the key issues driving environmental and economic impacts. The results show that Tianjin's current MSW management system emits the most GHG and costs the least, whereas the situation reverses in the integrated scenario. The key issues identified by the contribution analysis show no linear relationship between the global warming impact and the cost impact in the MSW management system. The landfill gas utilization scenario is indicated as a potential optimum scenario by the proposed E/E analysis, given the characteristics of the MSW, the technology levels, and the chosen methodologies. The E/E analysis provides an attractive direction towards sustainable waste management, though some questions with respect to uncertainty need to be discussed further.

  11. UAV-based Natural Hazard Management in High-Alpine Terrain - Case Studies from Austria

    NASA Astrophysics Data System (ADS)

    Sotier, Bernadette; Adams, Marc; Lechner, Veronika

    2015-04-01

    Unmanned Aerial Vehicles (UAV) have become a standard tool for geodata collection, as they allow conducting on-demand mapping missions in a flexible, cost-effective manner at an unprecedented level of detail. Easy-to-use, high-performance image matching software makes it possible to process the collected aerial images into orthophotos and 3D terrain models. Such up-to-date geodata have proven to be an important asset in natural hazard management: processes like debris flows, avalanches, landslides, fluvial erosion and rock-fall can be detected and quantified; damages can be documented and evaluated. In the Alps, these processes mostly originate in remote areas, which are difficult and hazardous to access, thus presenting a challenging task for RPAS data collection. In particular, the problems include finding suitable landing and piloting places, dealing with poor or no GPS signal, and the installation of ground control points (GCP) for georeferencing. At the BFW, RPAS have been used since 2012 to aid natural hazard management of various processes, of which three case studies are presented below. The first case study deals with the results of an attempt to employ UAV-based multi-spectral remote sensing to monitor the state of natural hazard protection forests. Images in the visible and near-infrared (NIR) bands were collected using modified low-cost cameras, combined with different optical filters. Several UAV flights were performed in 2014 in the 72 ha study site, which lies in the Wattental, Tyrol (Austria) between 1700 and 2050 m a.s.l., where the main tree species are stone pine and mountain pine. The matched aerial images were analysed using different UAV-specific vitality indices, evaluating both single- and dual-camera UAV missions. To calculate the mass balance of a debris flow in the Tyrolean Halltal (Austria), an RPAS flight was conducted in autumn 2012. The extreme alpine environment was challenging for both the mission and the evaluation of the aerial
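    The abstract refers to "UAV-specific vitality indices" computed from visible and near-infrared imagery but does not spell them out; the normalized difference vegetation index (NDVI) is the standard example of such an index and is sketched below with numpy. The band arrays are hypothetical reflectance values, and the BFW indices may well differ from NDVI.

```python
# Hedged sketch: NDVI, a common vegetation-vitality index computed pixel-wise
# from red and near-infrared (NIR) reflectance. It stands in for the
# unspecified "UAV-specific vitality indices" of the study; values are made up.
import numpy as np

red = np.array([[0.12, 0.30], [0.08, 0.25]])   # red-band reflectance (toy 2x2 scene)
nir = np.array([[0.45, 0.35], [0.50, 0.28]])   # NIR-band reflectance

ndvi = (nir - red) / np.clip(nir + red, 1e-6, None)   # clip avoids division by zero
print(ndvi)   # values near +1 suggest vigorous canopy; near 0, stressed or bare ground
```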

  12. CMMAD Usability Case Study in Support of Countermine and Hazard Sensing

    SciTech Connect

    Victor G. Walker; David I. Gertman

    2010-04-01

    During field trials, operator usability data were collected in support of lane clearing missions and hazard sensing for two robot platforms with Robot Intelligence Kernel (RIK) software and sensor scanning payloads onboard. The tests featured autonomous and shared robot autonomy levels, where tasking of the robot used a graphical interface featuring mine location and sensor readings. The goal of this work was to provide insights that could be used to further technology development. The efficacy of countermine systems in terms of mobility, search, path planning, detection, and localization was assessed. Findings from objective and subjective operator interaction measures are reviewed, along with commentary from soldiers who took part in the study and strongly endorse the system.

  13. Distinguishing Realistic Military Blasts from Firecrackers in Mitigation Studies of Blast Induced Traumatic Brain Injury

    SciTech Connect

    Moss, W C; King, M J; Blackman, E G

    2011-01-21

    In their Contributed Article, Nyein et al. (1,2) present numerical simulations of blast waves interacting with a helmeted head and conclude that a face shield may significantly mitigate blast induced traumatic brain injury (TBI). A face shield may indeed be important for future military helmets, but the authors derive their conclusions from a much smaller explosion than typically experienced on the battlefield. The blast from the 3.16 g TNT charge of (1) has the following approximate peak overpressures, positive phase durations, and incident impulses (3): 10 atm, 0.25 ms, and 3.9 psi-ms at the front of the head (14 cm from charge), and 1.4 atm, 0.32 ms, and 1.7 psi-ms at the back of a typical 20 cm head (34 cm from charge). The peak pressure of the wave decreases by a factor of 7 as it traverses the head. The blast conditions are at the threshold for injury at the front of the head, but well below threshold at the back of the head (4). The blast traverses the head in 0.3 ms, roughly equal to the positive phase duration of the blast. Therefore, when the blast reaches the back of the head, near-ambient conditions exist at the front. Because the headform is so close to the charge, it experiences a wave with significant curvature. By contrast, a realistic blast from a 2.2 kg TNT charge (approximately an uncased 105 mm artillery round) is fatal at an overpressure of 10 atm (4). For an injury level (4) similar to (1), a 2.2 kg charge has the following approximate peak overpressures, positive phase durations, and incident impulses (3): 2.1 atm, 2.3 ms, and 18 psi-ms at the front of the head (250 cm from charge), and 1.8 atm, 2.5 ms, and 16.8 psi-ms at the back of the head (270 cm from charge). The peak pressure decreases by only a factor of 1.2 as it traverses the head. Because the 0.36 ms traversal time is much smaller than the positive phase duration, pressures on the head become relatively uniform when the blast reaches the back of the head. The larger standoff implies
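    The core of the argument is a ratio: for the small charge, the time the blast takes to traverse the head is comparable to the positive-phase duration, so the loading is highly non-uniform, whereas for the realistic charge the traversal time is a small fraction of the duration. The sketch below simply reproduces that comparison from the figures quoted in the abstract.

```python
# Ratio of blast traversal time across a ~20 cm head to the positive-phase
# duration, using the values quoted in the abstract.
cases = {
    "3.16 g TNT at 14 cm":  {"traversal_ms": 0.30, "positive_phase_ms": 0.25},
    "2.2 kg TNT at 250 cm": {"traversal_ms": 0.36, "positive_phase_ms": 2.3},
}
for name, c in cases.items():
    ratio = c["traversal_ms"] / c["positive_phase_ms"]
    print(f"{name}: traversal / positive phase = {ratio:.2f}")
# ~1.2 for the laboratory-scale charge (the front of the head is back near
# ambient pressure by the time the wave reaches the back) versus ~0.16 for the
# realistic charge (the whole head is loaded almost uniformly).
```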

  14. Balancing Mitigation Against Impact: A Case Study From the 2005 Chicxulub Seismic Survey

    NASA Astrophysics Data System (ADS)

    Barton, P.; Diebold, J.; Gulick, S.

    2006-05-01

    investigation. Both datasets indicate significantly lower levels than reported by Tolstoy et al. (2004). There was no evidence of environmental damage created by this survey. It can be concluded that the mitigation measures were extremely successful, but there is also a concern that the overhead cost of the environmental protection made this one of the most costly academic surveys ever undertaken, and that not all of this protection was necessary. In particular, the predicted 180 dB safety radius appeared to be overly conservative, even though based on calibrated measurements in very similar physical circumstances, and we suggest that these differences were a result of local seismic velocity structure in the water column and/or shallow seabed, which resulted in different partitioning of the energy. These results suggest that real time monitoring of hydrophone array data may provide a method of determining the safety radius dynamically, in response to local conditions.

  15. Case studies. [hazardous effects of early medical use of X-rays

    NASA Technical Reports Server (NTRS)

    1975-01-01

    The characteristics of the technology assessment process which were manifested in response to the hazardous effects of early medical uses of X-rays are considered. Other topics discussed include: controlling the potential hazards of government-sponsored technology, genetic technology, community level impacts of expanded underground coal mining, and an integrated strategy for aircraft/airport noise abatement.

  16. AMERICAN HEALTHY HOMES SURVEY: A NATIONAL STUDY OF RESIDENTIAL RELATED HAZARDS

    EPA Science Inventory

    The US Environmental Protection Agency's (EPA) National Exposure Research Laboratory (NERL) and the US Department of Housing and Urban Development's (HUD) Office of Healthy Homes and Lead Hazard Control conducted a national survey of housing related hazards in US residences. The...

  17. Natural phenomena hazards, Hanford Site, Washington

    SciTech Connect

    Conrads, T.J.

    1998-09-29

    This document presents the natural phenomena hazard loads for use in implementing DOE Order 5480.28, Natural Phenomena Hazards Mitigation, and supports development of double-shell tank systems specifications at the Hanford Site in south-central Washington State. The natural phenomena covered are seismic, flood, wind, volcanic ash, lightning, snow, temperature, solar radiation, suspended sediment, and relative humidity.

  18. Incorporating natural hazard assessments into municipal master-plans; case-studies from Israel

    NASA Astrophysics Data System (ADS)

    Katz, Oded

    2010-05-01

    The active Dead Sea Rift (DSR) runs along the length of Israel, making the entire state susceptible to earthquake-related hazards. Current building codes generally acknowledge seismic hazards and direct engineers towards earthquake-resistant structures. However, hazard mapping on a scale fit for municipal/governmental planning is subject to local initiative and is not currently mandatory, though it arguably needs to be. In the following, a few cases of seismic-hazard evaluation made by the Geological Survey of Israel are presented, emphasizing the reasons for their initiation and the way the results were incorporated (or not). The first case is a qualitative seismic-hazard micro-zonation commissioned by the municipality of Jerusalem as part of a new master plan. This work resulted in maps (1:50,000; GIS format) identifying (1) areas prone to amplification of seismic shaking due to site characteristics (outcrops of soft rocks or steep topography) and (2) sites with earthquake-induced landslide (EILS) hazard. Results were validated using reports from the 1927 M=6.2 earthquake that originated along the DSR about 30 km east of Jerusalem. Although the hazard maps were accepted by municipal authorities, practical use by geotechnical engineers working within the frame of the new master plan was not significant. The main reason for this is apparently a difference of opinion between the city engineers responsible for implementing the new master plan and the geologists responsible for the hazard evaluation. The second case involves evaluation of EILS hazard for two towns located further north along the DSR, Zefat and Tiberias. Both were heavily damaged more than once by strong earthquakes in past centuries. Work was carried out as part of a governmental seismic-hazard reduction program. The results include maps (1:10,000 scale) of sites with high EILS hazard identified within city limits. Maps (in GIS format) were sent to city engineers with reports explaining the methods and results. As far as

  19. Impact and effectiveness of risk mitigation strategies on the insurability of nanomaterial production: evidences from industrial case studies.

    PubMed

    Bergamaschi, Enrico; Murphy, Finbarr; Poland, Craig A; Mullins, Martin; Costa, Anna L; McAlea, Eamonn; Tran, Lang; Tofail, Syed A M

    2015-01-01

    Workers involved in producing nanomaterials or using nanomaterials in manufacturing plants are likely to have earlier and higher exposure to manufactured/engineered nanomaterials (ENM) than the general population. This is because both the volume handled and the probability of the effluence of 'free' nanoparticles from the handled volume are much higher during a production process than at any other stage in the lifecycle of nanomaterials and nanotechnology-enabled products. Risk assessment (RA) techniques using control banding (CB) as a framework for risk transfer represent a robust theory, but further progress on implementing the model is required so that risk can be transferred to insurance companies. Following a review of RA in general and hazard measurement in particular, we subject a Structural Alert Scheme methodology to three industrial case studies using ZrO2, TiO2, and multi-walled carbon nanotubes (MWCNT). The materials are tested in a pristine state and in a remediated (coated) state, and the respective emission and hazard rates are tested alongside the material performance as originally designed. To our knowledge, this is the first such implementation of a CB RA in conjunction with an ENM performance test, and it offers both manufacturers and underwriters an insight into future applications.

  20. Los Alamos Radiation Hydrocode Models of Asteroid Mitigation by a Subsurface Explosion

    NASA Astrophysics Data System (ADS)

    Weaver, R.; Plesko, C. S.; Dearholt, W.

    2010-12-01

    Mitigation of a potentially hazardous object (PHO) by a nuclear subsurface explosion is considered. In this new work we examine non-central subsurface emplacements and seek an optimal depth-of-burial for various explosion energies. This intervention methodology has been popularized in media presentations and is considered as one possible method of impact-hazard mitigation. We present new RAGE radiation hydrocode models of the shock-generated disruption of PHOs by subsurface nuclear bursts and deflection from shallow buried bursts using scenario-specific models from authentic RADAR shape models. We will show 2D and 3D models for the disruption by a large energy source at the center and near the edge (mitigation) of such PHO models (1-10 Mton TNT equivalent), specifically for asteroid 25143 Itokawa. Parametric studies will be done on: the value of the source energy (from 100 Kton to 10 Mton), the parameters in the Steinberg-Guinan strength model used and the internal composition of the object from uniform composition to a "rubble pile" distribution. Specifically we are interested in assessing the optimum depth of burial and energy required to essentially disrupt and/or move the PHO and therefore mitigate the hazard. Recollection will be considered. (LA-UR-10-05860) [Figure caption: A subsurface 1 Mt explosion near the long-axis surface of an Itokawa shape model with a non-uniform internal composition; the resulting velocity imparted to the bulk remainder of the object is ~50 m/s.]

  1. Los Alamos Radiation Hydrocode Models of Asteroid Mitigation by an Internal Explosion

    NASA Astrophysics Data System (ADS)

    Weaver, Robert; Plesko, C.; Dearholdt, W.

    2010-10-01

    Mitigation of a potentially hazardous object (PHO) by a conventional or nuclear subsurface burst is considered. This intervention methodology has been popularized in media presentations and is considered as one possible method of impact-hazard mitigation. We present RAGE radiation hydrocode models of the shock-generated disruption of PHOs by subsurface nuclear bursts and deflection from shallow buried bursts using scenario-specific models from authentic RADAR shape models. We will show 2D and 3D models for the disruption by a large energy source at the center and near the edge (mitigation) of such PHO models (1-10 Mton TNT equivalent), specifically for asteroid 25143 Itokawa. Parametric studies will be done on: the value of the source energy (from 1 Mton to 10 Mton), the parameters in the Steinberg-Guinan strength model used and the internal composition of the object from uniform composition to a "rubble pile" distribution. Specifically we are interested in assessing the optimum depth of burial and energy required to essentially disrupt and/or move the PHO and therefore mitigate the hazard. Recollection will be considered.

  2. The Relative Severity of Single Hazards within a Multi-Hazard Framework

    NASA Astrophysics Data System (ADS)

    Gill, Joel C.; Malamud, Bruce D.

    2013-04-01

    Here we present a description of the relative severity of single hazards within a multi-hazard framework, compiled through examining, quantifying and ranking the extent to which individual hazards trigger or increase the probability of other hazards. Hazards are broken up into six major groupings (geophysical, hydrological, shallow earth processes, atmospheric, biophysical and space), with the interactions for 21 different hazard types examined. These interactions include both one primary hazard triggering a secondary hazard, and one primary hazard increasing the probability of a secondary hazard occurring. We identify, through a wide-ranging review of grey- and peer-review literature, >90 interactions. The number of hazard-type linkages is then summed for each hazard in terms of its influence (the number of times one hazard type triggers another type of hazard, or itself) and its sensitivity (the number of times one hazard type is triggered by other hazard types, or itself). The 21 different hazards are then ranked based on (i) influence and (ii) sensitivity. We found, by quantification and ranking of these hazards, that: (i) The strongest influencers (those triggering the most secondary hazards) are volcanic eruptions, earthquakes and storms, which when taken together trigger almost a third of the possible hazard interactions identified; (ii) The most sensitive hazards (those being triggered by the most primary hazards) are identified to be landslides, volcanic eruptions and floods; (iii) When sensitivity rankings are adjusted to take into account the differential likelihoods of different secondary hazards being triggered, the most sensitive hazards are found to be landslides, floods, earthquakes and ground heave. We believe that by determining the strongest influencing and the most sensitive hazards for specific spatial areas, the allocation of resources for mitigation measures might be done more effectively.
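    The ranking procedure described here reduces to row and column sums of a hazard-interaction matrix. The sketch below shows that bookkeeping with a small boolean matrix; the five hazards and the links between them are illustrative placeholders, not the >90 interactions compiled in the paper.

```python
# Hedged sketch: influence and sensitivity counts from a hazard-interaction
# matrix. M[i, j] = 1 means hazard i can trigger (or raise the probability of)
# hazard j. Hazards and links are illustrative only.
import numpy as np

hazards = ["earthquake", "volcanic eruption", "storm", "landslide", "flood"]
M = np.array([
    # eq  volc  storm  lslide  flood
    [1,   0,    0,     1,      1],   # earthquake (aftershocks, landslides, floods)
    [1,   0,    0,     1,      1],   # volcanic eruption
    [0,   0,    0,     1,      1],   # storm
    [0,   0,    0,     0,      1],   # landslide
    [0,   0,    0,     1,      0],   # flood
])

influence = M.sum(axis=1)     # how many hazard types each hazard can trigger
sensitivity = M.sum(axis=0)   # how many hazard types can trigger each hazard

for name, inf, sen in sorted(zip(hazards, influence, sensitivity), key=lambda x: -x[1]):
    print(f"{name:18s} influence = {inf}   sensitivity = {sen}")
```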

  3. Integrated Geo Hazard Management System in Cloud Computing Technology

    NASA Astrophysics Data System (ADS)

    Hanifah, M. I. M.; Omar, R. C.; Khalid, N. H. N.; Ismail, A.; Mustapha, I. S.; Baharuddin, I. N. Z.; Roslan, R.; Zalam, W. M. Z.

    2016-11-01

    Geo-hazards can degrade environmental health and cause huge economic losses, especially in mountainous areas. To mitigate geo-hazards effectively, cloud computing technology is introduced for managing a geo-hazard database. Cloud computing and its services can provide stakeholders with geo-hazard information in near real time for effective environmental management and decision-making. The UNITEN Integrated Geo Hazard Management System comprises the network management and operations used to monitor geo-hazard disasters, especially landslides, in our study area at the Kelantan River Basin and the boundary between Hulu Kelantan and Hulu Terengganu. The system provides an easily managed, flexible measuring system whose data management operates autonomously and can be controlled remotely by commands that collect and control data through the "cloud" computing system. This paper aims to document the above relationship by identifying the special features and needs associated with effective geo-hazard database management using a cloud system. The system will later be used as part of development activities, helping to minimize the frequency of geo-hazards and the risk in the research area.

  4. Using Significant Geologic Hazards and Disasters to Focus Geoethics Case Studies

    NASA Astrophysics Data System (ADS)

    Cronin, V. S.

    2015-12-01

    Ethics education since classical times has involved the consideration of stories, parables, myths, fables, allegories and histories. These are the ancient equivalents of case studies. Modern case studies are used in applied-ethics courses in law, engineering, business, and science. When used in a geoscience course, geoethical case studies can enrich a student's understanding of the relationships between issues of geoscience, engineering, sociology, business, public policy and law - all with an ethical dimension. Perhaps more importantly, real cases affected real people. Students develop a strong empathetic connection to the people involved, enhancing students' drive to understand the interconnected layers of the cases. Students might begin to appreciate that geoscientists can help to avoid or alleviate human suffering -- that their careers can have meaning and purpose beyond simply earning a paycheck. Geologic disasters in which losses could have been predicted, avoided or minimized are quite effective as cases. Coupling a "disaster" case with a comparable "hazard" case is particularly effective. For example, there are many places along the San Andreas Fault in California where [1] significant coseismic displacement has occurred during historical times, [2] structures that are still inhabited were built along or across active traces prior to the Alquist-Priolo Earthquake Fault Zoning Act in 1971, and [3] inhabited structures have been built legally since 1971 within a few tens of feet of active traces. The question students confront is whether society ought to allow habitable structures to be built very near to a major active fault. This topic allows students to work with issues of law, history, seismology, seismic site response, crustal deformation adjacent to active faults, building codes and, ultimately, ethics. Similar progressions can be developed for other major geologic hazards, both natural and man-made, such as floods, landslides, erosion along rivers and

  5. Remedial Action Assessment System (RAAS): Evaluation of selected feasibility studies of CERCLA (Comprehensive Environmental Response, Compensation, and Liability Act) hazardous waste sites

    SciTech Connect

    Whelan, G.; Hartz, K.E.; Hilliard, N.D. and Associates, Seattle, WA

    1990-04-01

    Congress and the public have mandated much closer scrutiny of the management of chemically hazardous and radioactive mixed wastes. Legislative language, regulatory intent, and prudent technical judgment call for using scientifically based studies to assess current conditions and to evaluate and select cost-effective strategies for mitigating unacceptable situations. The NCP requires that a Remedial Investigation (RI) and a Feasibility Study (FS) be conducted at each site targeted for remedial response action. The goal of the RI is to obtain the site data needed so that the potential impacts on public health or welfare or on the environment can be evaluated and so that the remedial alternatives can be identified and selected. The goal of the FS is to identify and evaluate alternative remedial actions (including a no-action alternative) in terms of their cost, effectiveness, and engineering feasibility. The NCP also requires the analysis of impacts on public health and welfare and on the environment; this analysis is the endangerment assessment (EA). In summary, the RI, EA, and FS processes require assessment of the contamination at a site, of the potential impacts on public health or the environment from that contamination, and of alternative RAs that could address potential impacts to the environment. 35 refs., 7 figs., 1 tab.

  6. A Competence-Based Science Learning Framework Illustrated through the Study of Natural Hazards and Disaster Risk Reduction

    ERIC Educational Resources Information Center

    Oyao, Sheila G.; Holbrook, Jack; Rannikmäe, Miia; Pagunsan, Marmon M.

    2015-01-01

    This article proposes a competence-based learning framework for science teaching, applied to the study of "big ideas", in this case to the study of natural hazards and disaster risk reduction (NH&DRR). The framework focuses on new visions of competence, placing emphasis on nurturing connectedness and behavioral actions toward…

  7. Safety in earth orbit study. Volume 2: Analysis of hazardous payloads, docking, on-board survivability

    NASA Technical Reports Server (NTRS)

    1972-01-01

    Detailed and supporting analyses are presented of the hazardous payloads, docking, and on-board survivability aspects connected with earth orbital operations of the space shuttle program. The hazards resulting from delivery, deployment, and retrieval of hazardous payloads, and from handling and transport of cargo between the orbiter, sortie modules, and space station, are identified and analyzed. The safety aspects of shuttle orbiter to modular space station docking include docking for assembly of the space station, normal resupply docking, and emergency docking. Personnel traffic patterns, escape routes, and on-board survivability are analyzed for the orbiter with crew and passengers, sortie modules, and the modular space station, under normal, emergency, and EVA and IVA operations.

  8. Case study to remove radioactive hazardous sludge from long horizontal storage tanks

    SciTech Connect

    Hylton, T.D.; Youngblood, E.L.; Cummins, R.L.

    1995-12-31

    The removal of radioactive hazardous sludge from waste tanks is a significant problem at several US Department of Energy (DOE) sites. The use of submerged jets produced by mixing pumps lowered into the supernatant/sludge interface to produce a homogeneous slurry is being studied at several DOE facilities. The homogeneous slurry can be pumped from the tanks to a treatment facility or alternative storage location. Most of the previous and current studies with this method are for flat-bottom tanks with vertical walls. Because of the difference in geometry, the results of these studies are not directly applicable to long horizontal tanks such as those used at the Oak Ridge National Laboratory. Mobilization and mixing studies were conducted with a surrogate sludge (e.g., kaolin clay) using submerged jets in two sizes of horizontal tanks. The nominal capacities of these tanks were 0.87 m³ (230 gal) and 95 m³ (25,000 gal). Mobilization efficiencies and mixing times were determined for single and bidirectional jets in both tanks with the discharge nozzles positioned at two locations in the tanks. Approximately 80% of the surrogate sludge was mobilized in the 95-m³ tank using a fixed bidirectional jet (inside diameter = 0.035 m) and a jet velocity of 6.4 m/s (21 ft/s).

  9. An Optimal Mitigation Strategy Against the Asteroid Impact Threat with Short Warning Time

    NASA Technical Reports Server (NTRS)

    Wie, Bong; Barbee, Brent W.

    2015-01-01

    This paper presents the results of a NASA Innovative Advanced Concept (NIAC) Phase 2 study entitled "An Innovative Solution to NASA's Near-Earth Object (NEO) Impact Threat Mitigation Grand Challenge and Flight Validation Mission Architecture Development." This NIAC Phase 2 study was conducted at the Asteroid Deflection Research Center (ADRC) of Iowa State University in 2012-2014. The study objective was to develop an innovative yet practically implementable mitigation strategy for the most probable impact threat of an asteroid or comet with short warning time (less than 5 years). The mitigation strategy described in this paper is intended to optimally reduce the severity and catastrophic damage of the NEO impact event, especially when we don't have sufficient warning times for non-disruptive deflection of a hazardous NEO. This paper provides an executive summary of the NIAC Phase 2 study results.

  10. Propensity score applied to survival data analysis through proportional hazards models: a Monte Carlo study.

    PubMed

    Gayat, Etienne; Resche-Rigon, Matthieu; Mary, Jean-Yves; Porcher, Raphaël

    2012-01-01

    Propensity score methods are increasingly used in medical literature to estimate treatment effect using data from observational studies. Despite many papers on propensity score analysis, few have focused on the analysis of survival data. Even within the framework of the popular proportional hazard model, the choice among marginal, stratified or adjusted models remains unclear. A Monte Carlo simulation study was used to compare the performance of several survival models to estimate both marginal and conditional treatment effects. The impact of accounting or not for pairing when analysing propensity-score-matched survival data was assessed. In addition, the influence of unmeasured confounders was investigated. After matching on the propensity score, both marginal and conditional treatment effects could be reliably estimated. Ignoring the paired structure of the data led to an increased test size due to an overestimated variance of the treatment effect. Among the various survival models considered, stratified models systematically showed poorer performance. Omitting a covariate in the propensity score model led to a biased estimation of treatment effect, but replacement of the unmeasured confounder by a correlated one allowed a marked decrease in this bias. Our study showed that propensity scores applied to survival data can lead to unbiased estimation of both marginal and conditional treatment effect, when marginal and adjusted Cox models are used. In all cases, it is necessary to account for pairing when analysing propensity-score-matched data, using a robust estimator of the variance.
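    The analysis the abstract recommends, a marginal Cox model fitted to propensity-score-matched data with a variance estimator that respects the pairing, can be expressed compactly with the lifelines library. The sketch below is a minimal illustration: the column names, the tiny hand-made dataset, and the use of lifelines' cluster_col option are assumptions about the data layout and library version, not the authors' code.

```python
# Hedged sketch: marginal Cox model on propensity-score-matched pairs with a
# cluster-robust (sandwich) variance, so the paired structure is not ignored.
# Dataset and column names are hypothetical.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.DataFrame({
    "time":    [5.1, 7.3, 2.4, 9.0, 3.3, 6.7],   # follow-up time
    "event":   [1,   0,   0,   1,   1,   1],     # 1 = event observed
    "treated": [1,   0,   1,   0,   1,   0],     # exposure of interest
    "pair_id": [1,   1,   2,   2,   3,   3],     # links members of a matched pair
})

cph = CoxPHFitter()
# cluster_col requests a robust variance clustered on the matched pairs,
# avoiding the inflated test size reported for analyses that ignore pairing.
cph.fit(df, duration_col="time", event_col="event",
        cluster_col="pair_id", formula="treated")
cph.print_summary()
```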

  11. Vulnerability studies and integrated assessments for hazard risk reduction in Pittsburgh, PA (Invited)

    NASA Astrophysics Data System (ADS)

    Klima, K.

    2013-12-01

    Today's environmental problems stretch beyond the bounds of most academic disciplines, and thus solutions require an interdisciplinary approach. For instance, the scientific consensus is that changes in the frequency and severity of many types of extreme weather events are increasing (IPCC 2012). Yet despite our efforts to reduce greenhouse gases, we continue to experience severe weather events such as Superstorm Sandy, record heat and blizzards, and droughts. These natural hazards, combined with increased vulnerability and exposure, result in longer-lasting disruptions to critical infrastructure and business continuity throughout the world. In order to protect both our lives and the economy, we must think beyond the bounds of any one discipline to include an integrated assessment of relevant work. In the wake of recent events, New York City, Washington, DC, Chicago, and a myriad of other cities have turned to their academic powerhouses for assistance in better understanding their vulnerabilities. This talk will share a case study of the state of integrated assessments and vulnerability studies of energy, transportation, water, real estate, and other main sectors in Pittsburgh, PA. The talk will then use integrated assessment models and other vulnerability studies to create coordinated sets of climate projections for use by the many public agencies and private-sector organizations in the region.

  12. Climate change adaptation & mitigation strategies for Water-Energy-Land Nexus management in Mediterranean region: Case study of Catalunya (Spain).

    NASA Astrophysics Data System (ADS)

    Kumar, Vikas; Schuhmacher, Marta

    2016-04-01

    Water-Energy-Land (WEL) Nexus management is one of those complex decision problems where a holistic approach to supply-demand management considering different criteria is valuable. However, multi-criteria decision making with diverse indicators measured on different scales and with different uncertainty levels is difficult to solve. On the other hand, climate adaptation and mitigation need to be integrated, and resource-sensitive regions like the Mediterranean provide ample opportunities towards that end. While the water sector plays a key role in climate adaptation, mitigation focuses on the energy and agriculture sectors. Recent studies on the so-called WEL nexus confirm the potential synergies to be derived from mainstreaming climate adaptation in the water sector, while simultaneously addressing opportunities for co-management with energy (and also land use). The objective of this paper is to develop scenarios for future imbalances in water and energy supply and demand for a water-stressed Mediterranean area of northern Spain (Catalonia) and to test scenario-based climate adaptation and mitigation strategies for WEL management policies. The resource-sensitive area of Catalonia presents an interesting nexus problem for study: a highly stressed water demand scenario (representing all major demand sectors), very heterogeneous land use ranging from intensive agriculture to diversified urban and industrial uses, and a mixed energy supply including hydro, wind, gas turbine and nuclear energy. Different energy sectors have different water and land requirements. Inter-river-basin water transfer is another factor considered for this area. The water-energy link is multifaceted. Energy production can affect water quality, while energy is used in water treatment and to reduce pollution. Similarly, hydropower - producing energy from water - and desalination - producing freshwater using energy - both play an important role in economic growth by supplying large and secure amounts of 'green' energy or

  13. Are some "safer alternatives" hazardous as PBTs? The case study of new flame retardants.

    PubMed

    Gramatica, Paola; Cassani, Stefano; Sangion, Alessandro

    2016-04-05

    Some brominated flame retardants (BFRs), such as PBDEs, are persistent, bioaccumulative, and toxic (PBT) and are restricted/prohibited under various legislations. They are replaced by "safer" flame retardants (FRs), such as new BFRs or organophosphorous compounds. However, information on the PBT behaviour of these substitutes is often lacking. The PBT assessment is required by the REACH regulation, and PBT chemicals should be subjected to authorization. Several new FRs, proposed and already used as safer alternatives to PBDEs, are here screened by the cumulative PBT Index model, implemented in QSARINS (QSAR-Insubria), new software for the development/validation of QSAR models. The results, obtained directly from the chemical structure for the three studied characteristics altogether, were compared with those from the US-EPA PBT Profiler: the two different approaches are in good agreement, supporting the utility of a consensus approach in these screenings. A priority list of the most harmful FRs, predicted in agreement by the two modelling tools, has been proposed, highlighting that some supposed "safer alternatives" are detected as intrinsically hazardous for their PBT properties. This study also shows that the PBT Index could be a valid tool to evaluate appropriate and safer substitutes, a priori from the chemical design, in a benign-by-design approach, avoiding unnecessary synthesis and tests.

  14. Role of human- and animal-sperm studies in the evaluation of male reproductive hazards

    SciTech Connect

    Wyrobek, A.J.; Gordon, L.; Watchmaker, G.

    1982-04-07

    Human sperm tests provide a direct means of assessing chemically induced spermatogenic dysfunction in man. Available tests include sperm count, motility, morphology (seminal cytology), and Y-body analyses. Over 70 different human exposures have been monitored in various groups of exposed men. The majority of exposures studied showed a significant change from control in one or more sperm tests. When carefully controlled, the sperm morphology test is statistically the most sensitive of these human sperm tests. Several sperm tests have been developed in nonhuman mammals for the study of chemical spermatotoxins. The sperm morphology test in mice has been the most widely used. Results with this test seem to be related to germ-cell mutagenicity. In general, animal sperm tests should play an important role in the identification and assessment of potential human reproductive hazards. Exposure to spermatotoxins may lead to infertility, and more importantly, to heritable genetic damage. While there are considerable animal and human data suggesting that sperm tests may be used to detect agents causing infertility, the extent to which these tests detect heritable genetic damage remains unclear. (ERB)

  15. Methane emission from ruminants and solid waste: A critical analysis of baseline and mitigation projections for climate and policy studies

    NASA Astrophysics Data System (ADS)

    Matthews, E.

    2012-12-01

    Current and projected estimates of methane (CH4) emission from anthropogenic sources are numerous but largely unexamined or compared. Presented here is a critical appraisal of CH4 projections used in climate-chemistry and policy studies. We compare emissions for major CH4 sources from several groups, including our own new data and RCP projections developed for climate-chemistry models for the next IPCC Assessment Report (AR5). We focus on current and projected baseline and mitigation emissions from ruminant animals and solid waste that are both predicted to rise dramatically in coming decades, driven primarily by developing countries. For waste, drivers include increasing urban populations, higher per capita waste generation due to economic growth and increasing landfilling rates. Analysis of a new global data base detailing waste composition, collection and disposal indicates that IPCC-based methodologies and default data overestimate CH4 emission for the current period which cascades into substantial overestimates in future projections. CH4 emission from solid waste is estimated to be ~10-15 Tg CH4/yr currently rather than the ~35 Tg/yr often reported in the literature. Moreover, emissions from developing countries are unlikely to rise rapidly in coming decades because new management approaches, such as sanitary landfills, that would increase emissions are maladapted to infrastructures in these countries and therefore unlikely to be implemented. The low current emission associated with solid waste (~10 Tg), together with future modest growth, implies that mitigation of waste-related CH4 emission is a poor candidate for slowing global warming. In the case of ruminant animals (~90 Tg CH4/yr currently), the dominant assumption driving future trajectories of CH4 emission is a substantial increase in meat and dairy consumption in developing countries to be satisfied by growing animal populations. Unlike solid waste, current ruminant emissions among studies exhibit a

  16. Probabilistic Hazard Curves for Tornadic Winds, Wind Gusts, and Extreme Rainfall Events

    SciTech Connect

    Weber, A.H.

    1999-07-29

    This paper summarizes a study carried out at the Savannah River Site (SRS) to determine probabilistic hazard curves for tornadic winds, wind gusts, and extreme rainfall events. DOE Order 420.1, Facility Safety, outlines the requirements for Natural Phenomena Hazards (NPH) mitigation for new and existing DOE facilities. Specifically, NPH include tornadic winds, maximum wind gusts, and extreme rainfall events. Probabilistic hazard curves for each phenomenon indicate the recurrence frequency, and these hazard curves must be updated at least every 10 years to account for recent data, improved methodologies, or criteria changes. Also, emergency response exercises often use hypothetical weather data to initiate accident scenarios. The hazard curves in these reports provide a means to use extreme weather events based on models and measurements rather than scenarios that are created ad hoc, as is often the case.
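    A hazard curve of the kind described reports, for each load level (wind speed, gust, rainfall depth), the annual frequency with which that level is exceeded. The sketch below builds an empirical exceedance curve from a short series of hypothetical annual maxima using the Weibull plotting position; the data and the choice of plotting position are assumptions, not SRS results.

```python
# Hedged sketch: empirical annual-exceedance (hazard) curve from annual maxima
# using the Weibull plotting position i / (n + 1). Rainfall values are made up.
import numpy as np

annual_max_rain_mm = np.array([61, 74, 55, 90, 68, 120, 82, 73, 64, 105,
                               58, 77, 95, 70, 66, 88, 59, 101, 72, 84])

x = np.sort(annual_max_rain_mm)[::-1]           # largest value first
n = x.size
exceed_prob = np.arange(1, n + 1) / (n + 1)     # annual exceedance probability
return_period = 1.0 / exceed_prob               # years

for value, p, T in zip(x[:5], exceed_prob[:5], return_period[:5]):
    print(f"{value:5.0f} mm  annual exceedance probability {p:.3f}  (~{T:.0f}-year event)")
```

    A production hazard curve would normally also fit an extreme-value distribution to extrapolate beyond the length of record; that step is omitted here.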

  17. Mitigating Infectious Disease Outbreaks

    NASA Astrophysics Data System (ADS)

    Davey, Victoria

    The emergence of new, transmissible infections poses a significant threat to human populations. As the 2009 novel influenza A/H1N1 pandemic and the 2014-2015 Ebola epidemic demonstrate, we have observed the effects of rapid spread of illness in non-immune populations and experienced disturbing uncertainty about future potential for human suffering and societal disruption. Clinical and epidemiologic characteristics of a newly emerged infectious organism are usually gathered in retrospect as the outbreak evolves and affects populations. Knowledge of potential effects of outbreaks and epidemics and, most importantly, mitigation at community, regional, national and global levels is needed to inform policy that will prepare and protect people. Study of possible outcomes of evolving epidemics and application of mitigation strategies is not possible in observational or experimental research designs, but computational modeling allows the conduct of 'virtual' experiments. Results of well-designed computer simulations can aid in the selection and implementation of strategies that limit illness and death, and maintain systems of healthcare and other critical resources that are vital to public protection.

  18. Landslide Hazards

    USGS Publications Warehouse

    ,

    2000-01-01

    Landslide hazards occur in many places around the world and include fast-moving debris flows, slow-moving landslides, and a variety of flows and slides initiating from volcanoes. Each year, these hazards cost billions of dollars and cause numerous fatalities and injuries. Awareness and education about these hazards is a first step toward reducing damaging effects. The U.S. Geological Survey conducts research and distributes information about geologic hazards. This Fact Sheet is published in English and Spanish and can be reproduced in any form for further distribution.

  19. Inactivation of RNA Viruses by Gamma Irradiation: A Study on Mitigating Factors.

    PubMed

    Hume, Adam J; Ames, Joshua; Rennick, Linda J; Duprex, W Paul; Marzi, Andrea; Tonkiss, John; Mühlberger, Elke

    2016-07-22

    Effective inactivation of biosafety level 4 (BSL-4) pathogens is vital in order to study these agents safely. Gamma irradiation is a commonly used method for the inactivation of BSL-4 viruses, which among other advantages, facilitates the study of inactivated yet morphologically intact virions. The reported values for susceptibility of viruses to inactivation by gamma irradiation are sometimes inconsistent, likely due to differences in experimental protocols. We analyzed the effects of common sample attributes on the inactivation of a recombinant vesicular stomatitis virus expressing the Zaire ebolavirus glycoprotein and green fluorescent protein. Using this surrogate virus, we found that sample volume and protein content of the sample modulated viral inactivation by gamma irradiation but that air volume within the sample container and the addition of external disinfectant surrounding the sample did not. These data identify several factors which alter viral susceptibility to inactivation and highlight the usefulness of lower biosafety level surrogate viruses for such studies. Our results underscore the need to validate inactivation protocols of BSL-4 pathogens using "worst-case scenario" procedures to ensure complete sample inactivation.

  20. Inactivation of RNA Viruses by Gamma Irradiation: A Study on Mitigating Factors

    PubMed Central

    Hume, Adam J.; Ames, Joshua; Rennick, Linda J.; Duprex, W. Paul; Marzi, Andrea; Tonkiss, John; Mühlberger, Elke

    2016-01-01

    Effective inactivation of biosafety level 4 (BSL-4) pathogens is vital in order to study these agents safely. Gamma irradiation is a commonly used method for the inactivation of BSL-4 viruses, which among other advantages, facilitates the study of inactivated yet morphologically intact virions. The reported values for susceptibility of viruses to inactivation by gamma irradiation are sometimes inconsistent, likely due to differences in experimental protocols. We analyzed the effects of common sample attributes on the inactivation of a recombinant vesicular stomatitis virus expressing the Zaire ebolavirus glycoprotein and green fluorescent protein. Using this surrogate virus, we found that sample volume and protein content of the sample modulated viral inactivation by gamma irradiation but that air volume within the sample container and the addition of external disinfectant surrounding the sample did not. These data identify several factors which alter viral susceptibility to inactivation and highlight the usefulness of lower biosafety level surrogate viruses for such studies. Our results underscore the need to validate inactivation protocols of BSL-4 pathogens using “worst-case scenario” procedures to ensure complete sample inactivation. PMID:27455307

  1. 49 CFR 195.579 - What must I do to mitigate internal corrosion?

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 49 Transportation 3 2014-10-01 2014-10-01 false What must I do to mitigate internal corrosion? 195... SAFETY TRANSPORTATION OF HAZARDOUS LIQUIDS BY PIPELINE Corrosion Control § 195.579 What must I do to mitigate internal corrosion? (a) General. If you transport any hazardous liquid or carbon dioxide...

  2. 49 CFR 195.579 - What must I do to mitigate internal corrosion?

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 49 Transportation 3 2013-10-01 2013-10-01 false What must I do to mitigate internal corrosion? 195... SAFETY TRANSPORTATION OF HAZARDOUS LIQUIDS BY PIPELINE Corrosion Control § 195.579 What must I do to mitigate internal corrosion? (a) General. If you transport any hazardous liquid or carbon dioxide...

  3. 49 CFR 195.579 - What must I do to mitigate internal corrosion?

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 49 Transportation 3 2012-10-01 2012-10-01 false What must I do to mitigate internal corrosion? 195... SAFETY TRANSPORTATION OF HAZARDOUS LIQUIDS BY PIPELINE Corrosion Control § 195.579 What must I do to mitigate internal corrosion? (a) General. If you transport any hazardous liquid or carbon dioxide...

  4. 49 CFR 195.579 - What must I do to mitigate internal corrosion?

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 49 Transportation 3 2010-10-01 2010-10-01 false What must I do to mitigate internal corrosion? 195... SAFETY TRANSPORTATION OF HAZARDOUS LIQUIDS BY PIPELINE Corrosion Control § 195.579 What must I do to mitigate internal corrosion? (a) General. If you transport any hazardous liquid or carbon dioxide...

  5. 49 CFR 195.579 - What must I do to mitigate internal corrosion?

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 49 Transportation 3 2011-10-01 2011-10-01 false What must I do to mitigate internal corrosion? 195... SAFETY TRANSPORTATION OF HAZARDOUS LIQUIDS BY PIPELINE Corrosion Control § 195.579 What must I do to mitigate internal corrosion? (a) General. If you transport any hazardous liquid or carbon dioxide...

  6. Using No-Stakes Educational Testing to Mitigate Summer Learning Loss: A Pilot Study. Research Report. ETS RR-14-21

    ERIC Educational Resources Information Center

    Zaromb, Franklin; Adler, Rachel M.; Bruce, Kelly; Attali, Yigal; Rock, JoAnn

    2014-01-01

    This study investigates the benefits of no-stakes educational testing during students' summer vacation as a strategy to mitigate summer learning loss. Fifty-one students in Grades 3-8 from the Every Child Valued (ECV) and Lawrence Community Center (LCC) summer programs in Lawrenceville, NJ, took short, online assessments throughout the summer,…

  7. Laboratory Study of Polychlorinated Biphenyl (PCB) Contamination and Mitigation in Buildings -- Part 4. Evaluation of the Activated Metal Treatment System (AMTS) for On-site Destruction of PCBs

    EPA Science Inventory

    This is the fourth and final report of the report series entitled “Laboratory Study of Polychlorinated Biphenyl (PCB) Contamination and Mitigation in Buildings.” This report evaluates the performance of an on-site PCB destruction method, known as the AMTS method, developed ...

  8. Laboratory Study of Polychlorinated Biphenyl Contamination and Mitigation in Buildings -- Part 4. Evaluation of the Activated Metal Treatment System (AMTS) for On-site Destruction of PCBs

    EPA Science Inventory

    This is the fourth and final report of the report series entitled “Laboratory Study of Polychlorinated Biphenyl (PCB) Contamination and Mitigation in Buildings.” This report evaluates the performance of an on-site PCB destruction method, known as the AMTS method...

  9. Field study of exhaust fans for mitigating indoor air quality problems: Final report

    SciTech Connect

    Grimsrud, D.T.; Szydlowski, R.F.; Turk, B.H.

    1986-09-01

    Residential ventilation in the United States housing stock is provided primarily by infiltration, the natural leakage of outdoor air into a building through cracks and holes in the building shell. Since ventilation is the dominant mechanism for control of indoor pollutant concentrations, low infiltration rates caused by fluctuations in weather conditions may lead to high indoor pollutant concentrations. Supplemental mechanical ventilation can be used to eliminate these periods of low infiltration. This study examined the effects of a small, continuously operating exhaust fan on pollutant concentrations and energy use in residences.

  10. Assessment of Nearshore Hazard due to Tsunami-Induced Currents

    NASA Astrophysics Data System (ADS)

    Lynett, P. J.; Ayca, A.; Borrero, J. C.; Eskijian, M.; Miller, K.; Wilson, R. I.

    2014-12-01

    The California Tsunami Program, in cooperation with NOAA and FEMA, has begun implementing a plan to increase tsunami hazard preparedness and mitigation in maritime communities (both ships and harbor infrastructure) through the development of in-harbor hazard maps, offshore safety zones for boater evacuation, and associated guidance for harbors and marinas before, during and following tsunamis. The hope is that the maritime guidance and associated education program will help save lives and reduce damage to boats and harbor infrastructure. Findings will be used to develop maps, guidance documents, and consistent policy recommendations for emergency managers and port authorities and to provide information critical to real-time decisions required when responding to tsunami alert notifications. The initial goals of the study are to (1) evaluate the effectiveness and sensitivity of existing numerical models for assessing maritime tsunami hazards, (2) find a relationship between current speeds and expected damage levels, (3) evaluate California ports and harbors in terms of tsunami-induced hazards by identifying regions that are prone to higher current speeds and damage and regions of relatively lower impact that may be used for evacuation of maritime assets, and (4) determine 'safe depths' for evacuation of vessels from ports and harbors during a tsunami event. We will present details of a new initiative to evaluate the future likelihood of failure for different structural components of a harbor, leading to the identification of high-priority areas for mitigation. This presentation will focus on results from California ports and harbors across the State and will include feedback we have received from discussions with local harbor masters and port authorities. To help promote accurate and consistent products, the authors are also working through the National Tsunami Hazard Mitigation Program to organize a tsunami current model benchmark workshop.

  11. Studies on lithium salts to mitigate ASR-induced expansion in new concrete: a critical review

    SciTech Connect

    Feng, X. (E-mail: k488i@unb.ca); Thomas, M.D.A.; Bremner, T.W.; Balcom, B.J.; Folliard, K.J.

    2005-09-01

    This paper provides a critical review of the research work conducted so far on the suppressive effects of lithium compounds on expansion due to alkali-silica reaction (ASR) in concrete and on the mechanism or mechanisms by which lithium inhibits the expansion. After a thorough examination of the existing literature regarding lithium salts in controlling ASR expansion, a summary of research findings is provided. It shows that all the lithium salts studied, including LiF, LiCl, LiBr, LiOH, LiOH·H2O, LiNO3, LiNO2, Li2CO3, Li2SO4, Li2HPO4, and Li2SiO3, are effective in suppressing ASR expansion in new concrete, provided they are used at the appropriate dosages. Among these compounds, LiNO3 appears to be the most promising one. Although the mechanism(s) for the suppressive effects of lithium are not well understood, several mechanisms have been proposed. A detailed discussion of these existing mechanisms is provided in the paper. Finally, some recommendations for future studies are identified.

  12. Towards Practical, Real-Time Estimation of Spatial Aftershock Probabilities: A Feasibility Study in Earthquake Hazard

    NASA Astrophysics Data System (ADS)

    Morrow, P.; McCloskey, J.; Steacy, S.

    2001-12-01

    It is now widely accepted that the goal of deterministic earthquake prediction is unattainable in the short term and may even be forbidden by nonlinearity in the generating dynamics. This nonlinearity does not, however, preclude the estimation of earthquake probability and, in particular, how this probability might change in space and time; earthquake hazard estimation might be possible in the absence of earthquake prediction. Recently, there has been a major development in the understanding of stress triggering of earthquakes which allows accurate calculation of the spatial variation of aftershock probability following any large earthquake. Over the past few years this Coulomb stress technique (CST) has been the subject of intensive study in the geophysics literature and has been extremely successful in explaining the spatial distribution of aftershocks following several major earthquakes. The power of current micro-computers, the great number of local, telemetered seismic networks, the rapid acquisition of data from satellites coupled with the speed of modern telecommunications and data transfer all mean that it may be possible that these new techniques could be applied in a forward sense. In other words, it is theoretically possible today to make predictions of the likely spatial distribution of aftershocks in near-real-time following a large earthquake. Approximate versions of such predictions could be available within, say, 0.1 days after the mainshock and might be continually refined and updated over the next 100 days. The European Commission has recently provided funding for a project to assess the extent to which it is currently possible to move CST predictions into a practically useful time frame so that low-confidence estimates of aftershock probability might be made within a few hours of an event and improved in near-real-time, as data of better quality become available over the following days to tens of days. Specifically, the project aims to assess the
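    The Coulomb stress technique referenced here evaluates, on receiver faults near the mainshock rupture, the change in Coulomb failure stress; regions where this change is positive are the ones where aftershock probability is expected to rise. A commonly used form of that quantity is shown below; this is the textbook expression (with an effective friction coefficient absorbing pore-pressure effects), not necessarily the exact formulation adopted by the project.

```latex
% Change in Coulomb failure stress on a receiver fault (textbook form)
\Delta \mathrm{CFS} = \Delta\tau + \mu' \, \Delta\sigma_n
% \Delta\tau      : change in shear stress resolved in the fault's slip direction
% \mu'            : effective friction coefficient (values around 0.4 are commonly assumed)
% \Delta\sigma_n  : change in normal stress (positive when the fault is unclamped)
% \Delta CFS > 0 moves the receiver fault closer to failure, i.e. toward higher aftershock probability.
```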

  13. Legal and institutional tools to mitigate plastic pollution affecting marine species: Argentina as a case study.

    PubMed

    González Carman, Victoria; Machain, Natalia; Campagna, Claudio

    2015-03-15

    Plastics are the most common form of debris found along the Argentine coastline. The Río de la Plata estuarine area is a relevant case study to describe a situation where ample policy exists against a backdrop of plastics disposed by populated coastal areas, industries, and vessels; with resultant high impacts of plastic pollution on marine turtles and mammals. Policy and institutions are in place but the impact remains due to ineffective waste management, limited public education and awareness, and weaknesses in enforcement of regulations. This context is frequently repeated all over the world. We list possible interventions to increase the effectiveness of policy that require integrating efforts among governments, the private sector, non-governmental organizations and the inhabitants of coastal cities to reduce the amount of plastics reaching the Río de la Plata and protect threatened marine species. What has been identified for Argentina applies to the region and globally.

  14. Energy technology roll-out for climate change mitigation: A multi-model study for Latin America

    SciTech Connect

    van der Zwaan, Bob; Kober, Tom; Calderon, Silvia; Clarke, Leon; Daenzer, Katie; Kitous, Alban; Labriet, Maryse; Lucena, André F. P.; Octaviano, Claudia; Di Sbroiavacca, Nicolas

    2016-05-01

    In this paper we investigate opportunities for energy technology deployment under climate change mitigation efforts in Latin America. Through several carbon tax and CO2 abatement scenarios until 2050 we analyze what resources and technologies, notably for electricity generation, could be cost-optimal in the energy sector to significantly reduce CO2 emissions in the region. By way of sensitivity test we perform a cross-model comparison study and inspect whether robust conclusions can be drawn across results from different models as well as different types of models (general versus partial equilibrium). Given the abundance of biomass resources in Latin America, they play a large role in energy supply in all scenarios we inspect. This is especially true for stringent climate policy scenarios, for instance because the use of biomass in power plants in combination with CCS can yield negative CO2 emissions. We find that hydropower, which today contributes about 800 TWh to overall power production in Latin America, could be significantly expanded to meet the climate policies we investigate, typically by about 50%, but potentially by as much as 75%. According to all models, electricity generation increases exponentially with a two- to three-fold expansion between 2010 and 2050. We find that in our climate policy scenarios renewable energy overall expands typically at double-digit growth rates annually, but there is substantial spread in model results for specific options such as wind and solar power: the climate policies that we simulate raise wind power in 2050 on average to half the production level that hydropower provides today, while they raise solar power to either a substantially higher or a much lower level than hydropower supplies at present, depending on which model is used. Also for CCS we observe large diversity in model outcomes, which reflects the uncertainties with regard to its future implementation potential as a result of

  15. Early Intervention Following Trauma May Mitigate Genetic Risk for PTSD in Civilians: A Pilot Prospective Emergency Department Study

    PubMed Central

    Rothbaum, Barbara O.; Kearns, Megan C.; Reiser, Emily; Davis, Jennifer S.; Kerley, Kimberly A.; Rothbaum, Alex O.; Mercer, Kristina B.; Price, Matthew; Houry, Debra; Ressler, Kerry J.

    2015-01-01

    intervention group, even after controlling for age, sex, race, education, income, and childhood trauma. Using logistic regression, the number of risk alleles was significantly associated with likelihood of PTSD diagnosis at week 12 (P < .05). Conclusions This pilot prospective study suggests that combined genetic variants may serve to predict those most at risk for developing PTSD following trauma. A psychotherapeutic intervention initiated in the emergency department within hours of the trauma may mitigate this risk. The role of genetic predictors of risk and resilience should be further evaluated in larger, prospective intervention and prevention trials. Trial Registration ClinicalTrials.gov identifier: NCT00895518 PMID:25188543

  16. Mitigation of polar pesticides across a vegetative filter strip. A mesocosm study.

    PubMed

    Franco, Jorge; Matamoros, Víctor

    2016-12-01

    Vegetated filter strips (VFSs) are planted at the edge of agricultural fields to reduce pesticide run-off and its consequent potential toxicological effects on ecosystem biota; however, little attention has been paid to date to the attenuation of highly polar and ionisable pesticides such as phenoxyacid herbicides. This study assesses the effect of soil moisture, run-off flow and vegetation on the attenuation of MCPA, mecoprop, dicamba, dichlorprop, fenitrothion, atrazine and simazine by VFSs. Reactors measuring 5 m long by 0.1 m wide were each filled with 60 kg of soil from a real field VFS. VFSs planted with Phragmites australis and unvegetated control reactors were assessed. After a simulated rainfall event of 50 mm, two hydraulic loading rates (HLRs) were assessed (1 and 2 cm h⁻¹). These results were compared to those from the same systems under water-saturated conditions. The results show that VFSs reduced the peak inlet concentration and pesticide mass by more than 90 % and that the presence of vegetation increased that attenuation (82-90 % without vegetation and 90-93 % with vegetation, on average). The laboratory-scale study showed that such attenuation was due to sorption into the soil. The toxicity units of pesticides fell by more than 90 % in all cases, except under the water-saturated conditions, in which the decrease was lower (16 vs 54 %, for unvegetated and vegetated reactors). Therefore, the presence of vegetation was shown to be effective for reducing mass discharge of ionisable and highly polar pesticides into surface-water bodies.

  17. Dietary Flaxseed Mitigates Impaired Skeletal Muscle Regeneration: in Vivo, in Vitro and in Silico Studies

    PubMed Central

    Carotenuto, Felicia; Costa, Alessandra; Albertini, Maria Cristina; Rocchi, Marco Bruno Luigi; Rudov, Alexander; Coletti, Dario; Minieri, Marilena; Di Nardo, Paolo; Teodori, Laura

    2016-01-01

    Background: Diets enriched with n-3 polyunsaturated fatty acids (n-3 PUFAs) have been shown to exert a positive impact on muscle diseases. Flaxseed is one of the richest sources of the n-3 PUFA α-linolenic acid (ALA). The aim of this study was to assess the effects of flaxseed and ALA in models of skeletal muscle degeneration characterized by high levels of Tumor Necrosis Factor-α (TNF). Methods: The in vivo studies were carried out on dystrophic hamsters affected by muscle damage associated with high TNF plasma levels and fed with a long-term 30% flaxseed-supplemented diet. Differentiating C2C12 myoblasts treated with TNF and challenged with ALA represented the in vitro model. Skeletal muscle morphology was scrutinized by applying the Principal Component Analysis statistical method. Apoptosis, inflammation and myogenesis were analyzed by immunofluorescence. Finally, an in silico analysis was carried out to predict the possible pathways underlying the effects of n-3 PUFAs. Results: The flaxseed-enriched diet protected the dystrophic muscle from apoptosis and preserved muscle myogenesis by increasing myogenin and alpha myosin heavy chain expression. Moreover, it restored the normal expression pattern of caveolin-3, thereby allowing protein retention at the sarcolemma. ALA reduced TNF-induced apoptosis in differentiating myoblasts and prevented the TNF-induced inhibition of myogenesis, as demonstrated by the increased expression of myogenin, myosin heavy chain and caveolin-3, while promoting myotube fusion. The in silico investigation revealed that FAK pathways may play a central role in the protective effects of ALA on myogenesis. Conclusions: These findings indicate that flaxseed may exert potent beneficial effects by preserving skeletal muscle regeneration and homeostasis partly through an ALA-mediated action. Thus, dietary flaxseed and ALA may serve as a useful strategy for treating patients with muscle dystrophies. PMID:26941581

  18. Simultaneous transcutaneous electrical nerve stimulation mitigates simulator sickness symptoms in healthy adults: a crossover study

    PubMed Central

    2013-01-01

    Background Flight simulators have been used to train pilots to experience and recognize spatial disorientation, a condition in which pilots incorrectly perceive the position, location, and movement of their aircraft. However, during or after simulator training, simulator sickness (SS) may develop. Spatial disorientation and SS share common symptoms and signs and may involve a similar mechanism of dys-synchronization of neural inputs from the vestibular, visual, and proprioceptive systems. Transcutaneous electrical nerve stimulation (TENS), a maneuver used for pain control, was found to influence autonomic cardiovascular responses and enhance visuospatial abilities, postural control, and cognitive function. The purpose of the present study was to investigate the protective effects of TENS on SS. Methods Fifteen healthy young men (age: 28.6 ± 0.9 years, height: 172.5 ± 1.4 cm, body weight: 69.3 ± 1.3 kg, body mass index: 23.4 ± 1.8 kg/m2) participated in this within-subject crossover study. SS was induced by a flight simulator. TENS treatment involved 30 minutes of simultaneous electrical stimulation of the posterior neck and the right Zusanli acupoint. Each subject completed 4 sessions (control, SS, TENS, and TENS + SS) in a randomized order. Outcome indicators included SS symptom severity and cognitive function, evaluated with the Simulator Sickness Questionnaire (SSQ) and d2 test of attention, respectively. Sleepiness was rated using the Visual Analogue Scales for Sleepiness Symptoms (VAS-SS). Autonomic and stress responses were evaluated by heart rate, heart rate variability (HRV) and salivary stress biomarkers (salivary alpha-amylase activity and salivary cortisol concentration). Results Simulator exposure increased SS symptoms (SSQ and VAS-SS scores) and decreased the task response speed and concentration. The heart rate, salivary stress biomarker levels, and the sympathetic parameter of HRV increased with simulator exposure, but

  19. Calorimetric studies on the thermal hazard of methyl ethyl ketone peroxide with incompatible substances.

    PubMed

    Chang, Ron-Hsin; Shu, Chi-Min; Duh, Yih-Shing; Jehng, Jih-Mirn

    2007-03-22

    In Taiwan, Japan, and China, methyl ethyl ketone peroxide (MEKPO) has caused many severe thermal explosions owing to its thermal instability and reactivity originating from the complexity of its structure. This study focused on the incompatible features of MEKPO as detected by calorimetry. The thermal decomposition and runaway behaviors of MEKPO with about 10 wt.% of incompatible substances, such as H2SO4, HCl, NaOH, KOH, FeCl3, and FeSO4, were analyzed by a dynamic calorimeter (differential scanning calorimetry, DSC) and an adiabatic calorimeter (vent sizing package 2, VSP2). Thermokinetic data, such as onset temperature, heat of decomposition, adiabatic temperature rise, and self-heat rate, were obtained and assessed. Experimental data were used to determine the incompatibility hazard rating. In the DSC thermal curves of MEKPO mixed with the assumed incompatible substances, the onset temperatures advanced (occurred earlier), especially with alkaline or ferric materials. In some tests, significant incompatible reactions were found. Adiabatic runaway experiments simulating the worst-case scenario were performed using the VSP2. These calorimetric data consistently indicated that alkaline and ferric solutions were the most incompatible with MEKPO.

  20. Flood hazard studies in Central Texas using orbital and suborbital remote sensing machinery

    NASA Technical Reports Server (NTRS)

    Baker, V. R.; Holz, R. K.; Patton, P. C.

    1975-01-01

    Central Texas is subject to infrequent, unusually intense rainstorms which cause extremely rapid runoff from drainage basins developed on the deeply dissected limestone and marl bedrock of the Edwards Plateau. One approach to flood hazard evaluation in this area is a parametric model relating flood hydrograph characteristics to quantitative geomorphic properties of the drainage basins. The preliminary model uses multiple regression techniques to predict potential peak flood discharge from basin magnitude, drainage density, and ruggedness number. After mapping small catchment networks from remote sensing imagery, input data for the model are generated by network digitization and analysis by a computer-assisted routine of watershed analysis. The study evaluated the network resolution capabilities of the following data formats: (1) large-scale (1:24,000) topographic maps, employing Strahler's "method of v's," (2) standard low altitude black and white aerial photography (1:13,000 and 1:20,000 scales), (3) NASA-generated aerial infrared photography at scales ranging from 1:48,000 to 1:123,000, and (4) Skylab Earth Resources Experiment Package S-190A and S-190B sensors (1:750,000 and 1:500,000, respectively).
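
    A hedged sketch of the kind of parametric model the abstract describes, a power-law regression of peak discharge on basin magnitude, drainage density and ruggedness number; the functional form and every number below are invented for illustration, not the study's data or coefficients:

        import numpy as np

        # Hypothetical power-law model  Qp = a * M**b * Dd**c * R**d,
        # fitted as a multiple linear regression after taking logarithms.
        M  = np.array([12,  35,  60,  90, 140])        # basin magnitude (count of first-order streams)
        Dd = np.array([2.1, 2.8, 3.4, 3.9, 4.5])       # drainage density (km / km^2)
        R  = np.array([0.15, 0.22, 0.30, 0.41, 0.55])  # ruggedness number (dimensionless)
        Qp = np.array([40, 150, 380, 800, 1900])       # peak discharge (m^3 / s)

        X = np.column_stack([np.ones_like(M, dtype=float), np.log(M), np.log(Dd), np.log(R)])
        coef, *_ = np.linalg.lstsq(X, np.log(Qp), rcond=None)
        log_a, b, c, d = coef
        print(np.exp(log_a), b, c, d)                  # fitted a, b, c, d

        # Predicted peak discharge for a new (hypothetical) basin:
        print(np.exp(log_a) * 75**b * 3.6**c * 0.35**d)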

  1. Can hazardous waste become a raw material? The case study of an aluminium residue: a review.

    PubMed

    López-Delgado, Aurora; Tayibi, Hanan

    2012-05-01

    The huge number of research studies carried out during recent decades, focused on finding effective solutions for waste treatment, has allowed some of these residues to become new raw materials for many industries. Achieving this ensures a reduction in energy and natural resource consumption, a diminution of negative environmental impacts, and the creation of secondary and tertiary industries. A good example is provided by the metallurgical industry in general, and by the aluminium industry in this particular case. The aluminium recycling industry is a beneficial activity for the environment, since it recovers resources from primary industry, manufacturing and post-consumer waste. Slag and scrap, which were previously considered as waste, are nowadays the raw material for some highly profitable secondary and tertiary industries. The most recent European Directive on waste establishes that if a waste is used as a common product and fulfils the existing legislation for this product, then this waste can be defined as 'end-of-waste'. The review presented here attempts to show several proposals for making added-value materials from an aluminium residue which is still considered a hazardous waste and is accordingly disposed of in secure storage. The present proposal includes the use of this waste to manufacture glass, glass-ceramic, boehmite and calcium aluminate. Thus the waste might effectively be recovered as a secondary source material for various industries.

  2. Ground motion input in seismic evaluation studies: impacts on risk assessment of uniform hazard spectra

    SciTech Connect

    Wu, S.C.; Sewell, R.T.

    1996-07-01

    Conservatism and variability in seismic risk estimates are studied: the effects of using a uniform hazard spectrum (UHS) are examined for deriving probabilistic estimates of risk and in-structure demand levels, as compared with the more-exact use of realistic time-history inputs (of given probability) that depend explicitly on magnitude and distance. This approach differs from the conventional one in its exhaustive treatment of the ground-motion threat and in its more detailed assessment of component responses to that threat. The approximate UH-ISS (in-structure spectrum) obtained from the UHS appears to be very close to the more-exact results computed directly from scenario earthquakes. This conclusion does not depend on site configurations and structural characteristics. Note, however, that the UH-ISS has a composite shape and may not correspond to the characteristics of any single earthquake. In most cases the shape is largely controlled by the structural properties and can be derived approximately from the corresponding UHS. Motions with smooth spectra, however, will not have the same damage potential as more realistic motions with jagged spectral shapes. As a result, UHS-based analysis may underestimate the real demands in nonlinear structural analyses.

  3. Control strategy optimization for attainment and exposure mitigation: case study for ozone in Macon, Georgia.

    PubMed

    Cohan, Daniel S; Tian, Di; Hu, Yongtao; Russell, Armistead G

    2006-09-01

    Implementation of more stringent 8-hour ozone standards has led the U.S. Environmental Protection Agency to designate nonattainment status for 474 counties nationwide, many of which had never previously violated air quality standards. As states select emission control measures to achieve attainment in these regions, their choices carry significant implications for local economies and the health of their citizens. Considering a case study of one such nonattainment region, Macon, Georgia, we develop a menu of potential controls that could be implemented locally or in neighboring parts of the state. The control menu offers the potential to control about 20-35% of ozone precursor emissions in most Georgia regions, but marginal costs increase rapidly beyond 15-20%. We link high-order ozone sensitivities with the control menu to identify cost-optimized strategies for achieving attainment and for alternative goals such as reducing spatially averaged or population-weighted ozone concentrations. Strategies targeted toward attainment of Macon ozone would prioritize local reductions of nitrogen oxides, whereas controls in the more densely populated Atlanta region are shown to be more effective for reducing statewide potential population exposure to ozone. A U.S. EPA-sanctioned approach for demonstrating ozone attainment with photochemical models is shown to be highly dependent on the choice of a baseline period and may not foster optimal strategies for assuring attainment and protecting human health.
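
    As an illustrative sketch only: the paper links high-order ozone sensitivities with the control menu, whereas below a simple first-order, greedy ranking by cost-effectiveness stands in for the actual optimization, and every number is hypothetical:

        # Hypothetical control menu: (name, cost in $M/yr, emission reduction in
        # t/day, first-order ozone sensitivity in ppb per t/day of reduction).
        controls = [
            ("local NOx control A",  2.0,  5.0, 0.10),
            ("local NOx control B",  6.0,  8.0, 0.09),
            ("upwind VOC control C", 4.0, 20.0, 0.01),
        ]

        target_ppb = 0.9   # required ozone reduction at the design monitor (made up)

        # Rank by ppb reduced per $M, then pick controls until the target is met.
        ranked = sorted(controls, key=lambda c: (c[2] * c[3]) / c[1], reverse=True)
        achieved, total_cost, chosen = 0.0, 0.0, []
        for name, cost, d_emis, sens in ranked:
            if achieved >= target_ppb:
                break
            chosen.append(name)
            achieved += d_emis * sens
            total_cost += cost

        print(chosen, round(achieved, 2), "ppb for", total_cost, "$M/yr")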

  4. Spectroscopic study of debris mitigation with minimum-mass Sn laser plasma for extreme ultraviolet lithography

    SciTech Connect

    Namba, S.; Fujioka, S.; Nishimura, H.; Yasuda, Y.; Nagai, K.; Miyanaga, N.; Izawa, Y.; Mima, K.; Takiyama, K.

    2006-04-24

    An experimental study was made of a target consisting of the minimum mass of pure tin (Sn) necessary for the highest conversion to extreme ultraviolet (EUV) light while minimizing the generation of plasma debris. The minimum-mass target comprised a thin Sn layer coated on a plastic shell and was irradiated with a Nd:YAG laser pulse. The expansion behavior of neutral atoms and singly charged ions emanating from the Sn plasma was investigated by spatially resolved visible spectroscopy. A remarkable reduction of debris emission in the backward direction with respect to the incident laser beam was demonstrated with a decrease in the thickness of the Sn layer. The optimal thickness of the Sn layer for a laser pulse of 9 ns at 7×10^10 W/cm^2 was found to be 40 nm, at which low debris emission in the backward direction and a high conversion to 13.5 nm EUV radiation were simultaneously attained.

  5. Feasibility study of tank leakage mitigation using subsurface barriers. Revision 1

    SciTech Connect

    Treat, R.L.; Peters, B.B.; Cameron, R.J.

    1995-01-01

    This document reflects the evaluations and analyses performed in response to Tri-Party Agreement Milestone M-45-07A - "Complete Evaluation of Subsurface Barrier Feasibility" (September 1994). In addition, this feasibility study was revised to reflect ongoing work supporting a pending decision by the DOE Richland Operations Office, the Washington State Department of Ecology, and the US Environmental Protection Agency regarding further development of subsurface barrier options for single-shell tanks (SSTs) and whether to proceed with demonstration plans at the Hanford Site (Tri-Party Agreement Milestone M-45-07B). Analyses of 14 integrated SST tank farm remediation alternatives were conducted in response to the three stated objectives of Tri-Party Agreement Milestone M-45-07A. The alternatives include eight with subsurface barriers and six without. Technologies used in the alternatives include three types of tank waste retrieval, seven types of subsurface barriers, a method of stabilizing the void space of emptied tanks, two types of in situ soil flushing, one type of surface barrier, and a clean-closure method. A no-action alternative and a surface-barrier-only alternative were included as nonviable alternatives for comparison. All other alternatives were designed to result in closure of SST tank farms as landfills or in clean-closure. Revision 1 incorporates additional analyses of worker safety, large leak scenarios, and sensitivity to the leach rates of risk controlling constituents. The additional analyses were conducted to support TPA Milestone M-45-07B.

  6. Mitigation of air pollution and carbon footprint by energy conservation through CFLs: a case study.

    PubMed

    Wath, Sushant B; Majumdar, Deepanjan

    2011-01-01

    Electricity consumption of compact fluorescent lamps (CFLs) is low, making them a useful tool for minimizing the rapidly increasing demand for electrical energy in India. The present study aims to project the likely electricity conservation in a scenario of complete replacement of existing Fluorescent Tubes (FTs) by CFLs at CSIR-NEERI (National Environmental Engineering Research Institute), vis-à-vis the financial repercussions and the indirect reduction in emissions of greenhouse gases, e.g. CO2, N2O, CH4, and other air pollutants, e.g. SO2, NO, suspended particulate matter (SPM), black carbon (BC) and mercury (Hg), from coal-fired thermal power plants. The calculations show that the Institute could save around 122,850 kWh of electricity per annum, thereby saving approximately INR 859,950 (USD 18,453.86) towards electricity cost per annum, and would be able to minimize 44,579.08 kg of CO2-C equivalent (over a 100 year time horizon), 909 kg SO2, 982.8 kg NO, 9.8 kg of BC, 368.5 kg SPM, 18.4 kg PM10 and 0.0024 kg Hg emissions per annum from a coal-fired thermal power plant by conserving electricity at the institute level.
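
    A back-of-the-envelope sketch of the accounting behind these figures: the implied tariff of INR 7 per kWh reproduces the quoted cost saving, while the per-kWh emission factors below are placeholders rather than the values used in the study:

        # Annual electricity saved times a tariff and per-kWh emission factors
        # for a coal-fired plant.  Only the kWh figure and the resulting cost
        # are taken from the abstract; the factors are hypothetical.
        kwh_saved_per_year = 122_850          # from the abstract
        tariff_inr_per_kwh = 7.0              # implied tariff: 122,850 * 7 = INR 859,950
        ef = {                                # hypothetical emission factors, kg per kWh
            "CO2": 0.95,
            "SO2": 0.0074,
            "NOx": 0.0080,
        }

        print("cost saved (INR):", kwh_saved_per_year * tariff_inr_per_kwh)
        for gas, factor in ef.items():
            print(f"{gas} avoided (kg/yr): {kwh_saved_per_year * factor:,.0f}")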

  7. Over-Pressurized Drums: Their Causes and Mitigation

    SciTech Connect

    Simmons, Fred; Kuntamukkula, Murty; Quigley, David; Robertson, Janeen; Freshwater, David

    2009-07-10

    Having to contend with bulging or over-pressurized drums is, unfortunately, a common event for people storing chemicals and chemical wastes. The Department of Energy alone reported over 120 incidents of bulging drums between 1992 and 1999 (1). Bulging drums can be caused by many different mechanisms, represent a number of significant hazards, and can be tricky to mitigate. In this article, we discuss the mechanisms by which drums can become over-pressurized, the recognition of the hazards associated with over-pressurized drums and their mitigation, and methods that can be used to prevent drum over-pressurization from ever occurring. Drum pressurization can represent a significant safety hazard. Unless recognized and properly mitigated, improperly manipulated pressurized drums can result in employee exposure, employee injury, and environmental contamination. Therefore, recognition of when a drum is pressurized and knowledge of pressurized-drum mitigation techniques is essential.

  8. Climate engineering of vegetated land for hot extremes mitigation: An Earth system model sensitivity study

    NASA Astrophysics Data System (ADS)

    Wilhelm, Micah; Davin, Edouard L.; Seneviratne, Sonia I.

    2015-04-01

    Various climate engineering schemes have been proposed as a way to curb anthropogenic climate change. Land climate engineering schemes aiming to reduce the amount of solar radiation absorbed at the surface by changes in land surface albedo have been considered in a limited number of investigations. However, global studies on this topic have generally focused on the impacts on mean climate rather than extremes. Here we present the results of a series of transient global climate engineering sensitivity experiments performed with the Community Earth System Model over the time period 1950-2100 under historical and Representative Concentration Pathway 8.5 scenarios. Four sets of experiments are performed in which the surface albedo over snow-free vegetated grid points is increased respectively by 0.05, 0.10, 0.15, and 0.20. The simulations show a preferential cooling of hot extremes relative to mean temperatures throughout the Northern midlatitudes during boreal summer under the late twentieth century conditions. Two main mechanisms drive this response: On the one hand, a stronger efficacy of the albedo-induced radiative forcing on days with high incoming shortwave radiation and, on the other hand, enhanced soil moisture-induced evaporative cooling during the warmest days relative to the control simulation due to accumulated soil moisture storage and reduced drying. The latter effect is dominant in summer in midlatitude regions and also implies a reduction of summer drought conditions. It thus constitutes another important benefit of surface albedo modifications in reducing climate change impacts. The simulated response for the end of the 21st century conditions is of the same sign as that for the end of the twentieth century conditions but indicates an increasing absolute impact of land surface albedo increases in reducing mean and extreme temperatures under enhanced greenhouse gas forcing.

  9. Seismic hazard map of the western hemisphere

    USGS Publications Warehouse

    Shedlock, K.M.; Tanner, J.G.

    1999-01-01

    Vulnerability to natural disasters increases with urbanization and development of associated support systems (reservoirs, power plants, etc.). Catastrophic earthquakes account for 60% of worldwide casualties associated with natural disasters. Economic damage from earthquakes is increasing, even in technologically advanced countries with some level of seismic zonation, as shown by the 1989 Loma Prieta, CA ($6 billion), 1994 Northridge, CA ($25 billion), and 1995 Kobe, Japan (>$100 billion) earthquakes. The growth of megacities in seismically active regions around the world often includes the construction of seismically unsafe buildings and infrastructures, due to an insufficient knowledge of existing seismic hazard. Minimization of the loss of life, property damage, and social and economic disruption due to earthquakes depends on reliable estimates of seismic hazard. National, state, and local governments, decision makers, engineers, planners, emergency response organizations, builders, universities, and the general public require seismic hazard estimates for land use planning, improved building design and construction (including adoption of building construction codes), emergency response preparedness plans, economic forecasts, housing and employment decisions, and many more types of risk mitigation. The seismic hazard map of the Americas is the concatenation of various national and regional maps, involving a suite of approaches. The combined maps and documentation provide a useful global seismic hazard framework and serve as a resource for any national or regional agency for further detailed studies applicable to their needs. This seismic hazard map depicts Peak Ground Acceleration (PGA) with a 10% chance of exceedance in 50 years for the western hemisphere. PGA, a short-period ground motion parameter that is proportional to force, is the most commonly mapped ground motion parameter because current building codes that include seismic provisions specify the
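
    For reference, the mapped hazard level (a 10% chance of exceedance in 50 years) corresponds to a return period of roughly 475 years under the usual Poisson assumption; a one-line check:

        import math

        # Poisson model: P(at least one exceedance in T years) = 1 - exp(-T / RT)
        P, T = 0.10, 50.0
        RT = -T / math.log(1.0 - P)
        print(round(RT))   # ~475 years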

  10. Geoengineering, climate change scepticism and the 'moral hazard' argument: an experimental study of UK public perceptions.

    PubMed

    Corner, Adam; Pidgeon, Nick

    2014-12-28

    Many commentators have expressed concerns that researching and/or developing geoengineering technologies may undermine support for existing climate policies-the so-called moral hazard argument. This argument plays a central role in policy debates about geoengineering. However, there has not yet been a systematic investigation of how members of the public view the moral hazard argument, or whether it impacts on people's beliefs about geoengineering and climate change. In this paper, we describe an online experiment with a representative sample of the UK public, in which participants read one of two arguments (either endorsing or rejecting the idea that geoengineering poses a moral hazard). The argument endorsing the idea of geoengineering as a moral hazard was perceived as more convincing overall. However, people with more sceptical views and those who endorsed 'self-enhancing' values were more likely to agree that the prospect of geoengineering would reduce their motivation to make changes in their own behaviour in response to climate change. The findings suggest that geoengineering is likely to pose a moral hazard for some people more than others, and the implications for engaging the public are discussed.

  11. The California Hazards Institute

    NASA Astrophysics Data System (ADS)

    Rundle, J. B.; Kellogg, L. H.; Turcotte, D. L.

    2006-12-01

    California's abundant resources are linked with its natural hazards. Earthquakes, landslides, wildfires, floods, tsunamis, volcanic eruptions, severe storms, fires, and droughts afflict the state regularly. These events have the potential to become great disasters, like the San Francisco earthquake and fire of 1906, that overwhelm the capacity of society to respond. At such times, the fabric of civic life is frayed, political leadership is tested, economic losses can dwarf available resources, and full recovery can take decades. A patchwork of Federal, state and local programs is in place to address individual hazards, but California lacks effective coordination to forecast, prevent, prepare for, mitigate, respond to, and recover from, the harmful effects of natural disasters. Moreover, we do not know enough about the frequency, size, timing, or locations where they may strike, nor about how the natural environment and man-made structures would respond. As California's population grows and becomes more interdependent, even moderate events have the potential to trigger catastrophes. Natural hazards need not become natural disasters if they are addressed proactively and effectively, rather than reactively. The University of California, with 10 campuses distributed across the state, has world-class faculty and students engaged in research and education in all fields of direct relevance to hazards. For that reason, the UC can become a world leader in anticipating and managing natural hazards in order to prevent loss of life and property and degradation of environmental quality. The University of California, Office of the President, has therefore established a new system-wide Multicampus Research Project, the California Hazards Institute (CHI), as a mechanism to research innovative, effective solutions for California. The CHI will build on the rich intellectual capital and expertise of the Golden State to provide the best available science, knowledge and tools for

  12. Further RAGE modeling of asteroid mitigation: surface and subsurface explosions in porous objects

    SciTech Connect

    Weaver, Robert P; Plesko, Catherine S; Dearholt, William R

    2011-01-03

    Disruption or mitigation of a potentially hazardous object (PHO) by a high-energy subsurface burst is considered. This is just one possible method of impact-hazard mitigation. We present RAGE hydrocode models of the shock-generated disruption of PHOs by subsurface nuclear bursts using scenario-specific models from realistic RADAR shape models. We will show 2D and 3D models for the disruption by a large energy source at the center of such PHO models (approximately 100 kt to 10 Mt), specifically for the shape of the asteroid 25143 Itokawa. We study the effects of non-uniform composition (rubble pile), of shallow buried bursts to determine the optimal depth of burial, and of porosity.

  13. Land use and management change under climate change adaptation and mitigation strategies: a U.S. case study

    USGS Publications Warehouse

    Mu, Jianhong E.; Wein, Anne; McCarl, Bruce

    2015-01-01

    We examine the effects of crop management adaptation and climate mitigation strategies on land use and land management, as well as on related environmental and economic outcomes. We find that crop management adaptation (e.g. crop mix, new species) increases greenhouse gas (GHG) emissions by 1.7 % under a more severe climate projection, while a carbon price reduces total forest and agriculture GHG annual flux by 15 % and 9 %, respectively. This shows that trade-offs are likely between mitigation and adaptation. Climate change coupled with crop management adaptation has small and mostly negative effects on welfare; mitigation, which is implemented as a carbon price starting at $15 per metric ton of carbon dioxide (CO2) equivalent with a 5 % annual increase rate, bolsters welfare through carbon payments. When both crop management adaptation and a carbon price are implemented, the effects of the latter dominate.

  14. Smart disaster mitigation in Thailand

    NASA Astrophysics Data System (ADS)

    Aimmanee, S.; Ekkawatpanit, C.; Asanuma, H.

    2016-04-01

    Thailand is notoriously exposed to several natural disasters, from heavy thunderstorms to earthquakes and tsunamis, since it is located in the tropics and has tectonic faults underneath the ground. Besides these hazards, flooding, despite being less severe, occurs frequently, lasts longer than the other disasters, and affects a large part of the national territory. Most recently, in 2011, major flooding caused economic damages and losses of around 50 billion dollars. Since Thailand is particularly exposed to such hazards, research institutions are involved in campaigns for monitoring, prevention and mitigation of the effects of such phenomena, with the aim of securing and protecting human lives and, secondly, the remarkable cultural heritage. The present paper first gives a brief excursus on the main Thai projects aimed at the mitigation of natural disasters, referring to projects of national and international relevance currently being implemented, such as ESCAP1999 (flow regime regulation and water conservation). Adaptable devices such as foldable flood barriers and hydrodynamically supported temporary banks have been utilized during flooding. In the second part of the paper, some new ideas are described concerning the use of smart, biomimicking column structures capable of high-velocity water interception and velocity detection in the case of a tsunami. The pole is a composite cylindrical shell structure with an embedded piezoceramic sensor. Vortex shedding of the flow around the pole induces vibration and periodically strains the piezoelectric element, which in turn generates the electrical sensor signal. The internal space of the shell is filled with elastic foam to enhance the load-carrying capability under hydrodynamic loading. This more rigid outer shell with a soft core material resembles a lotus stem in nature, delaying local buckling and ovalization of the column.
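
    A minimal sketch of the velocity-detection principle implied above, assuming the usual Strouhal relation for a circular cylinder; the Strouhal number, pole diameter and shedding frequency used here are illustrative, not values from the paper:

        # Vortex shedding around a cylinder of diameter D occurs at frequency
        #   f = St * U / D,
        # so a measured shedding frequency (e.g. the dominant peak in the
        # piezoelectric signal spectrum) can be inverted to a flow-velocity
        # estimate.  St ~ 0.2 is typical for circular cylinders over a wide
        # Reynolds-number range; the numbers are illustrative only.

        def flow_velocity(shedding_freq_hz, diameter_m, strouhal=0.2):
            return shedding_freq_hz * diameter_m / strouhal

        print(flow_velocity(shedding_freq_hz=4.0, diameter_m=0.3))   # 6.0 m/s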

  15. Trade study of leakage detection, monitoring, and mitigation technologies to support Hanford single-shell waste retrieval

    SciTech Connect

    Hertzel, J.S.

    1996-03-01

    The U.S. Department of Energy has established the Tank Waste Remediation System to safely manage and dispose of low-level, high-level, and transuranic wastes currently stored in underground storage tanks at the Hanford Site in Eastern Washington. This report supports the Hanford Federal Facility Agreement and Consent Order (Tri-Party Agreement) Milestone No. M-45-08-T01, addresses additional issues regarding single-shell tank leakage detection, monitoring, and mitigation technologies, and provides an indication of the scope of leakage detection, monitoring, and mitigation activities necessary to support the Tank Waste Remediation System Initial Single-shell Tank Retrieval System project.

  16. Houston's Novel Strategy to Control Hazardous Air Pollutants: A Case Study in Policy Innovation and Political Stalemate.

    PubMed

    Sexton, Ken; Linder, Stephen H

    2015-01-01

    Although ambient concentrations have declined steadily over the past 30 years, Houston has recorded some of the highest levels of hazardous air pollutants in the United States. Nevertheless, federal and state regulatory efforts historically have emphasized compliance with the National Ambient Air Quality Standard for ozone, treating "air toxics" in Houston as a residual problem to be solved through application of technology-based standards. Between 2004 and 2009, Mayor Bill White and his administration challenged the well-established hierarchy of air quality management spelled out in the Clean Air Act, whereby federal and state authorities are assigned primacy over local municipalities for the purpose of designing and implementing air pollution control strategies. The White Administration believed that existing regulations were not sufficient to protect the health of Houstonians and took a diversity of both collaborative and combative policy actions to mitigate air toxic emissions from stationary sources. Opposition was substantial from a local coalition of entrenched interests satisfied with the status quo, which hindered the city's attempts to take unilateral policy actions. In the short term, the White Administration successfully raised the profile of the air toxics issue, pushed federal and state regulators to pay more attention, and induced a few polluting facilities to reduce emissions. But since White left office in 2010, air quality management in Houston has returned to the way it was before, and today there is scant evidence that his policies have had any lasting impact.

  18. Integrated multi-parameters Probabilistic Seismic Landslide Hazard Analysis (PSLHA): the case study of Ischia island, Italy

    NASA Astrophysics Data System (ADS)

    Caccavale, Mauro; Matano, Fabio; Sacchi, Marco; Mazzola, Salvatore; Somma, Renato; Troise, Claudia; De Natale, Giuseppe

    2014-05-01

    the areas with higher susceptibility to landslide occurrence due to the seismic effect. The PSLHA combines probability of exceedance maps for different GM parameters with geological and geomorphological information, expressed in terms of critical acceleration and dynamic stability factor. Generally, hazard maps are evaluated for peak ground acceleration, velocity or intensity, parameters that correlate well with damage to anthropic infrastructure (e.g. streets, buildings, etc.). Each ground motion parameter represents a different aspect of the hazard and has a different correlation with the generation of possible damage. Many studies have pointed out that other GM parameters, such as Arias and Housner intensity and absolute displacement, could represent a better choice for analysing, for example, cliff stability. The selection of the GM parameter is therefore of crucial importance for obtaining the most useful hazard maps. However, in the last decades, different ground motion prediction equations for a new set of GM parameters have been published. Based on this information, a series of landslide hazard maps can be produced. The new maps will lead to the identification of the areas with the highest probability of earthquake-induced landslides. At a strategic site like Ischia, these new methodologies represent an innovative and advanced tool for landslide hazard mitigation.
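
    As a hedged illustration of the "critical acceleration" ingredient mentioned above, a Newmark-type relation is commonly used in seismic landslide hazard analysis; this is a generic textbook form assumed for illustration, not necessarily the formulation adopted for Ischia:

        import math

        # Critical (yield) acceleration of a slope from the static factor of
        # safety FS and the slope angle alpha:  a_c = (FS - 1) * g * sin(alpha).
        # Values below are hypothetical.
        def critical_acceleration(factor_of_safety, slope_deg, g=9.81):
            return (factor_of_safety - 1.0) * g * math.sin(math.radians(slope_deg))

        print(critical_acceleration(factor_of_safety=1.3, slope_deg=30.0))   # ~1.47 m/s^2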

  19. Mitigating GHG emissions from agriculture under climate change constrains - a case study for the State of Saxony, Germany

    NASA Astrophysics Data System (ADS)

    Haas, E.; Kiese, R.; Klatt, S.; Butterbach-Bahl, K.

    2012-12-01

    Mitigating greenhouse gas (N2O, CO2, CH4) emissions from agricultural soils under conditions of projected climate change (IPCC SRES scenarios) is a prerequisite to limiting global warming. In this study we used the recently developed regional biogeochemical ecosystem model LandscapeDNDC (Haas et al., 2012, Landscape Ecology) and two time slices for present-day (1998-2018) and future (2078-2098) climate (a regional downscale of the IPCC SRES A1B climate simulation) and compared a business-as-usual agricultural management scenario (winter rape seed - winter barley - winter wheat rotation; fertilization: 170 / 150 / 110 kg-N mineral fertilizer; straw harvest for barley/wheat: 90 %) with scenarios in which either one or all of the following options were realized: no-till, 100 % return of crop residues to the field, or a 10 % reduction of N fertilization. The spatial domain is the State of Saxony (1 073 523 hectares of arable land), a typical region for agricultural production in Central Europe. The simulations are based on a high-resolution polygonal dataset (5 517 agricultural grid cells) for which relevant information on soil properties is available. The regionalization of the N2O emissions was validated against the IPCC Tier I methodology, which gives N2O emissions of 1 824 / 1 610 / 1 180 [t N2O-N yr-1] for the baseline years, whereas the simulations result in 6 955 / 6 039 / 2 207 [t N2O-N yr-1] for the first three years of the baseline scenario, ranging between 621 and 6 955 [t N2O-N yr-1] in the following years (mean of 2 923). The influence of climate change (elevated mean temperature of approx. 2°C and minor changes in precipitation) results in an increase of 259 [t N2O-N yr-1] (mean 3 182), or approx. 9 percent on average (with a minimum of 618 and a maximum of 6 553 [t N2O-N yr-1]). Focusing on the mitigation options, the recarbonization did result in an increase of soil carbon stocks of 2 585 [kg C/ha] within the
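
    A quick arithmetic check of the quoted averages (baseline mean of 2 923 t N2O-N per year and a climate-change increase of 259 t N2O-N per year):

        baseline_mean, cc_mean = 2923, 3182
        increase = cc_mean - baseline_mean
        print(increase, f"{100 * increase / baseline_mean:.1f}%")   # 259, ~8.9% (abstract: approx. 9%)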

  20. A study of hazardous air pollutants at the Tidd PFBC Demonstration Plant

    SciTech Connect

    1994-10-01

    The US Department of Energy (DOE) Clean Coal Technology (CCT) Program is a joint effort between government and industry to develop a new generation of coal utilization processes. In 1986, the Ohio Power Company, a subsidiary of American Electric Power (AEP), was awarded cofunding through the CCT program for the Tidd Pressurized Fluidized Bed Combustor (PFBC) Demonstration Plant located in Brilliant, Ohio. The Tidd PFBC unit began operation in 1990 and was later selected as a test site for an advanced particle filtration (APF) system designed for hot gas particulate removal. The APF system was sponsored by the DOE Morgantown Energy Technology Center (METC) through their Hot Gas Cleanup Research and Development Program. A complementary goal of the DOE CCT and METC R&D programs has always been to demonstrate the environmental acceptability of these emerging technologies. The Clean Air Act Amendments of 1990 (CAAA) have focused that commitment toward evaluating the fate of hazardous air pollutants (HAPs) associated with advanced coal-based and hot gas cleanup technologies. Radian Corporation was contracted by AEP to perform this assessment of HAPs at the Tidd PFBC demonstration plant. The objective of this study is to assess the major input, process, and emission streams at Plant Tidd for the HAPs identified in Title III of the CAAA. Four flue gas stream locations were tested: ESP inlet, ESP outlet, APF inlet, and APF outlet. Other process streams sampled were raw coal, coal paste, sorbent, bed ash, cyclone ash, individual ESP hopper ash, APF ash, and service water. Samples were analyzed for trace elements, minor and major elements, anions, volatile organic compounds, dioxin/furan compounds, ammonia, cyanide, formaldehyde, and semivolatile organic compounds. The particle size distribution in the ESP inlet and outlet gas streams and collected ash from individual ESP hoppers was also determined.

  1. Physical and Environmental Hazards in the Prosthetics and Orthotics workshop: A pilot study.

    PubMed

    Anderson, Sarah; Stuckey, Rwth; Poole, Diana; Oakman, Jodi

    2017-02-07

    Background: Prosthetists and Orthotists (P&O) are exposed to physical hazards within the workshop environment. Concern regarding these exposures has been expressed by P&Os; however, little research has been undertaken. Exposures to noise and volatile organic compounds in amounts larger than statutorily allowed can have adverse short- and long-term consequences for people's health.

  2. PEACETIME RADIATION HAZARDS IN THE FIRE SERVICE, BASIC COURSE, STUDY GUIDE.

    ERIC Educational Resources Information Center

    Atomic Energy Commission, Washington, DC.

    THE ASSIGNMENT SHEETS INCLUDED ARE CORRELATED WITH THE INSTRUCTOR'S GUIDE (VT 002 117), THE RESOURCE MANUAL (VT 001 337), AND A SET OF TWENTY-TWO 20- BY 28-INCH CHARTS (OE 84002). THE MATERIAL IS DESIGNED TO BE PRESENTED TO FIREMEN IN A 15-HOUR COURSE AS A PART OF THEIR BASIC FIRE TRAINING AND IS CONCERNED WITH THE HAZARDS RESULTING FROM THE…

  3. One-year follow-up study of performance of radon mitigation systems installed in Tennessee Valley houses

    SciTech Connect

    Dudney, C.S.; Wilson, D.L.; Saultz, R.J.; Matthews, T.G.

    1990-01-01

    Subbarrier depressurization systems were installed for radon mitigation in two basement ranchers in Oak Ridge, TN, and in two ranchers with partial basements in Huntsville, AL. System performance parameters, including pressure field extension, subslab permeability, and indoor radon concentrations were followed in each house for a year or longer. 9 refs., 3 figs., 3 tabs.

  4. Controlled study of ground-based interferometric radar for rockfall hazard monitoring

    NASA Astrophysics Data System (ADS)

    Gilliam, Joseph T.

    The ability to detect small and localized movements of geotechnical facilities is important for performance monitoring and early-warning hazard detection. Current deformation monitoring technology is limited in its ability to both scan massive structures and detect small and localized movements. Ground-based interferometric radar (GBIR) is an emerging remote sensing technology that has the potential to fill this gap in surface deformation monitoring technology. Much of the focus of previous research and application of GBIR has been on monitoring large spatial scale movements, such as mine walls, dams, and earth slopes experiencing deformational movements occurring over hundreds or thousands of square meters. In this study the focus is on evaluating the capabilities of GBIR for detecting and monitoring small movements (mm-scale) occurring over very localized regions (a few m2). Specifically, this study focused on the application of GBIR for detecting and monitoring movements of individual boulders in a massive landscape. The potential long-term application is to use GBIR to detect and monitor precursor movements to rockfall events. Boulders, ranging from 0.5 to 5 meters in approximate facial dimensions, were moved using pry bars and airbag jacks in increments of a few to several mm. Two identical GBIR devices positioned at separate locations scanned a region covering approximately 20,000 m2 after each boulder movement. Ground truth measurements were also performed after each boulder movement. The detectability of the boulder movements was assessed by comparing the distribution of measured phase values from the boulder to (1) phase values measured on the surrounding non-moving portion of the image and (2) phase values measured from the same boulder when it was not moving. The results from the study showed that movements of boulders with dimensions larger than about 2 m were detectable for range offset distances from about 75 to 150 m. Movements as small as 1.7 mm
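
    A sketch of the interferometric relation behind mm-scale detectability; the Ku-band wavelength assumed below is hypothetical, since the abstract does not state the radar's operating frequency:

        import math

        # Line-of-sight displacement d produces an interferometric (two-way)
        # phase change d_phi = 4*pi*d / lambda, so d = d_phi * lambda / (4*pi).
        wavelength_m = 0.0174   # assumed Ku-band wavelength (~17.2 GHz), not from the study

        def los_displacement(delta_phase_rad, wavelength=wavelength_m):
            return delta_phase_rad * wavelength / (4.0 * math.pi)

        # The 1.7 mm movement quoted in the abstract corresponds to roughly:
        delta_phi = 4.0 * math.pi * 1.7e-3 / wavelength_m
        print(delta_phi, los_displacement(delta_phi))   # ~1.2 rad, recovering 0.0017 m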

  5. Development and application of the EPIC model for carbon cycle, greenhouse-gas mitigation, and biofuel studies

    SciTech Connect

    Izaurralde, Roberto C.; Mcgill, William B.; Williams, J.R.

    2012-06-01

    This chapter provides a comprehensive review of the EPIC model in relation to carbon cycle, greenhouse-gas mitigation, and biofuel applications. From its original capabilities and purpose (i.e., quantify the impacts of erosion on soil productivity), the EPIC model has evolved into a comprehensive terrestrial ecosystem model for simulating with more or less process-level detail many ecosystem processes such as weather, hydrology, plant growth and development, carbon cycle (including erosion), nutrient cycling, greenhouse-gas emissions, and the most complete set of manipulations that can be implemented on a parcel of land (e.g. tillage, harvest, fertilization, irrigation, drainage, liming, burning, pesticide application). The chapter also provides details and examples of the latest efforts in model development such as the coupled carbon-nitrogen model, a microbial denitrification model with feedback to the carbon decomposition model, updates on calculation of ecosystem carbon balances, and carbon emissions from fossil fuels. The chapter has included examples of applications of the EPIC model in soil carbon sequestration, net ecosystem carbon balance, and biofuel studies. Finally, the chapter provides the reader with an update on upcoming improvements in EPIC such as the additions of modules for simulating biochar amendments, sorption of soluble C in subsoil horizons, nitrification including the release of N2O, and the formation and consumption of methane in soils. Completion of these model development activities will render an EPIC model with one of the most complete representations of biogeochemical processes and capable of simulating the dynamic feedback of soils to climate and management in terms not only of transient processes (e.g., soil water content, heterotrophic respiration, N2O emissions) but also of fundamental soil properties (e.g. soil depth, soil organic matter, soil bulk density, water limits).

  6. The Impact Hazard

    NASA Astrophysics Data System (ADS)

    Morrison, D.

    2009-12-01

    Throughout its existence, Earth has been pummelled by rocks from space. The cratered face of the Moon testifies to this continuing cosmic bombardment, and the 1908 Tunguska impact in Siberia should have been a wake-up call to the impact hazard. For most scientists, however, it was the discovery 30 years ago that the KT mass extinction was caused by an impact that opened our eyes to this important aspect of Earth history -- that some geological and biological changes have an external origin, and that the biosphere is much more sensitive to impact disturbance than was imagined. While life adapts beautifully to slow changes in the environment, a sudden event, like a large impact, can have catastrophic consequences. While we do not face any known hazard today for an extinction-level event, we are becoming aware that more than a million near-earth asteroids (NEAs) exist with the capacity to take out a city if they hit in the wrong place. The NASA Spaceguard Survey has begun to discover and track the larger NEAs, but we do not yet have the capability to find more than a few percent of the objects as small as the Tunguska impactor (about 40 m diameter). This continuing impact hazard is at roughly the hazard level of volcanic eruptions, including the rare supervolcano eruptions. The difference is that an incoming cosmic projectile can be detected and tracked, and by application of modern space technology, most impactors could be deflected. Impacts are the only natural hazard that can be eliminated. This motivates our NEA search programs such as Spaceguard and argues for extending them to smaller sizes. At the same time we realize that the most likely warning time for the next impact remains a few seconds, and we may therefore need to fall back on the more conventional responses of disaster mitigation and relief.

  7. "Hazardous" terminology

    SciTech Connect

    Powers, J.

    1991-01-01

    A number of terms (e.g., "hazardous chemicals," "hazardous materials," "hazardous waste," and similar nomenclature) refer to substances that are subject to regulation under one or more federal environmental laws. State laws and regulations also provide additional, similar, or identical terminology that may be confused with the federally defined terms. Many of these terms appear synonymous, and it is easy to use them interchangeably. However, in a regulatory context, inappropriate use of narrowly defined terms can lead to confusion about the substances referred to, the statutory provisions that apply, and the regulatory requirements for compliance under the applicable federal statutes. This Information Brief provides regulatory definitions, a brief discussion of compliance requirements, and references for the precise terminology that should be used when referring to "hazardous" substances regulated under federal environmental laws. A companion CERCLA Information Brief (EH-231-004/0191) addresses "toxic" nomenclature.

  8. Identifying Hazards

    EPA Pesticide Factsheets

    The federal government has established a system of labeling hazardous materials to help identify the type of material and threat posed. Summaries of information on over 300 chemicals are maintained in the Envirofacts Master Chemical Integrator.

  9. Coastal Hazards.

    ERIC Educational Resources Information Center

    Vandas, Steve

    1998-01-01

    Focuses on hurricanes and tsunamis and uses these topics to address other parts of the science curriculum. In addition to a discussion on beach erosion, a poster is provided that depicts these natural hazards that threaten coastlines. (DDR)

  10. Flood hazard and risk analysis in the southwest region of Bangladesh

    NASA Astrophysics Data System (ADS)

    Tingsanchali, Tawatchai; Fazlul Karim, Mohammed

    2005-06-01

    Flood hazard and risk assessment was conducted to identify the priority areas in the southwest region of Bangladesh for flood mitigation. Simulation of flood flow through the Gorai and Arial Khan river system and its floodplains was done by using a hydrodynamic model. After model calibration and verification, the model was used to simulate the flood flow of 100-year return period for a duration of four months. The maximum flooding depths at different locations in the rivers and floodplains were determined. The process of determining long flooding durations at every grid point in the hydrodynamic model is laborious and time-consuming. Therefore, the flood durations were determined by using satellite images of the observed flood in 1988, which has a return period close to 100 years. Flood hazard assessment was done considering flooding depth and duration. The study area was divided into smaller land units, and for each land unit the hazard indices for depth and duration of flooding and the resulting hazard factor were determined. From the hazard factors of the land units, a flood hazard map, which indicates the locations of different categories of hazard zones, was developed. It was found that 54% of the study area was in the medium hazard zone, 26% in the higher hazard zone and 20% in the lower hazard zone. Due to lack of sufficient flood damage data, flood damage vulnerability is simply considered proportional to population density. The flood risk factor of each land unit was determined as the product of the flood hazard factor and the vulnerability factor. Knowing the flood risk factors for the land units, a flood risk map was developed based on the risk factors. These maps are very useful for the inhabitants and floodplain management authorities to minimize flood damage and loss of human lives.
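
    A minimal sketch of the hazard and risk bookkeeping the abstract describes; the class break-points and the equal weighting of depth and duration are assumptions made for illustration, not the authors' values:

        # Each land unit gets depth and duration hazard indices, combined into a
        # hazard factor; the risk factor is the hazard factor times a
        # vulnerability factor taken proportional to population density.

        def hazard_index(value, breakpoints):
            """Return class 1, 2 or 3 depending on where the value falls."""
            low, high = breakpoints
            return 1 if value < low else 2 if value < high else 3

        def risk_factor(depth_m, duration_days, pop_density, max_density):
            h_depth    = hazard_index(depth_m,       (0.9, 1.8))  # depth classes (m), assumed
            h_duration = hazard_index(duration_days, (7, 21))     # duration classes (days), assumed
            hazard = 0.5 * h_depth + 0.5 * h_duration              # equal weights, assumed
            vulnerability = pop_density / max_density               # normalised population density
            return hazard * vulnerability

        print(risk_factor(depth_m=2.2, duration_days=25, pop_density=800, max_density=1200))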

  11. Reliability studies of incident coding systems in high hazard industries: A narrative review of study methodology.

    PubMed

    Olsen, Nikki S

    2013-03-01

    This paper reviews the current literature on incident coding system reliability and discusses the methods applied in the conduct and measurement of reliability. The search strategy targeted three electronic databases using a list of search terms, and the results were examined for relevance, with additional relevant articles drawn from the bibliographies. Twenty-five papers met the relevance criteria and their methods are discussed. Disagreements in the selection of methods between reliability researchers are highlighted, as are the effects of method selection on the outcome of the trials. The review provides evidence that the meaningfulness of and confidence in results are directly affected by the methodologies employed by the researcher during the preparation, conduct, and analysis of the reliability study. Furthermore, the review highlights the heterogeneity of methodologies employed by researchers measuring reliability of incident coding techniques, reducing the ability to critically compare and appraise techniques being considered for the adoption of report coding and trend analysis by client organisations. It is recommended that future research focuses on the standardisation of reliability research and measurement within the incident coding domain.
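
    As a concrete illustration of one widely used inter-rater reliability measure in this domain (not a method attributed to any specific reviewed study), the sketch below computes Cohen's kappa for two analysts coding the same set of incident reports; the codes are hypothetical.

```python
# Minimal sketch of Cohen's kappa for two coders applying an incident coding
# taxonomy to the same reports. Illustration only; the reviewed studies use a
# range of reliability methods and metrics.
from collections import Counter

def cohens_kappa(codes_a, codes_b):
    """Cohen's kappa for two raters coding the same incident reports."""
    assert len(codes_a) == len(codes_b)
    n = len(codes_a)
    observed = sum(a == b for a, b in zip(codes_a, codes_b)) / n
    freq_a = Counter(codes_a)
    freq_b = Counter(codes_b)
    expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
    return (observed - expected) / (1 - expected)

# Hypothetical codes assigned to ten incident reports by two analysts.
rater_1 = ["fatigue", "comms", "comms", "procedure", "fatigue",
           "comms", "procedure", "fatigue", "comms", "procedure"]
rater_2 = ["fatigue", "comms", "procedure", "procedure", "fatigue",
           "comms", "comms", "fatigue", "comms", "procedure"]
print(f"kappa = {cohens_kappa(rater_1, rater_2):.2f}")
```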

  12. Volcanic hazards and aviation safety

    USGS Publications Warehouse

    Casadevall, Thomas J.; Thompson, Theodore B.; Ewert, John W.; ,

    1996-01-01

    An aeronautical chart was developed to show the relative proximity of volcanoes or ash clouds to the airports and flight corridors that may be affected by volcanic debris. The map aims to inform and increase awareness about the close spatial relationship between volcanoes and aviation operations. It shows the locations of active volcanoes together with selected aeronautical navigation aids and great-circle routes. In doing so, the map supports efforts to mitigate the threat that volcanic hazards pose to aircraft and to improve aviation safety.
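
    The proximity question such a chart answers can be sketched numerically with the haversine great-circle distance between a volcano and an airport. The example below is illustrative only; the coordinates are approximate and are not taken from the chart.

```python
# Sketch of the kind of proximity calculation such a chart supports:
# great-circle distance between a volcano and an airport via the haversine
# formula. Coordinates below are approximate and for illustration only.
from math import radians, sin, cos, asin, sqrt

EARTH_RADIUS_KM = 6371.0

def great_circle_km(lat1, lon1, lat2, lon2):
    """Haversine great-circle distance in kilometres."""
    phi1, lam1, phi2, lam2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((phi2 - phi1) / 2) ** 2 + cos(phi1) * cos(phi2) * sin((lam2 - lam1) / 2) ** 2
    return 2 * EARTH_RADIUS_KM * asin(sqrt(a))

# Approximate positions: Mount Redoubt volcano and Anchorage airport (ANC).
print(f"{great_circle_km(60.49, -152.74, 61.17, -149.99):.0f} km")
```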

  13. The impact of overlapping processes on rockfall hazard analysis - the Bolonia Bay study (southern Spain)

    NASA Astrophysics Data System (ADS)

    Fernandez-Steeger, T.; Grützner, C.; Reicherter, K.; Braun, A.; Höbig, N.

    2009-04-01

    For rockfall simulations, comparative case studies and data sets are important for developing and evaluating models and software. Especially for empirical or data-driven stochastic modelling, the quality of the reference data sets has a major impact on model skill and knowledge discovery. Therefore, rockfalls in Bolonia Bay close to Tarifa (Spain) were mapped. Here, the siliciclastic Miocene rocks (megaturbidites) are intensively jointed and disaggregated by a perpendicular joint system. Although bedding supports stability, as the dip is not directed towards the rock face, the deposits indicate a continuous process of material loss from the 80 m high cliff of the San Bartolome mountain front by single large rockfalls. For more than 300 blocks, data on size, shape, rock type, and location were collected. The work concentrated on rockfall blocks with volumes of more than 2 m³ and up to 350 m³. Occasionally, very long "runout" distances of up to 2 km have been observed. For all major source areas and deposits, runout analyses using empirical models and a numerical trajectory model have been performed. Most empirical models are principally based on the relation between fall height and travel distance. Besides the "Fahrböschung" of Heim (1932), the "shadow angle" introduced by Evans and Hungr (1993) is most common today. However, studies from different sites show a wide variance in the angle relations (Dorren 2003, Corominas 1996). The reasons for this might be different environments and trigger mechanisms, or varying secondary effects such as post-depositional movement. Today, semi-numerical approaches based on trajectory models are quite common for evaluating rockfall energy and runout distance for protection measures and risk evaluations. The results of these models depend strongly on the quality of the input parameters. One problem here might be that some of the parameters, especially the dynamic ones, are not easy to determine and the quality of the
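
    The empirical reach-angle relations mentioned in this record (the "Fahrböschung" and the "shadow angle") amount to estimating a horizontal travel distance of roughly L = H / tan(angle) for a fall height H. A minimal sketch, with hypothetical angle values rather than values calibrated for Bolonia Bay:

```python
# Sketch of the empirical fall-height/travel-distance relation behind the
# "Fahrböschung" and "shadow angle" approaches: the horizontal reach L of a
# block released from height H is roughly L = H / tan(angle). The angle
# values below are hypothetical placeholders, not values from this study.
from math import tan, radians

def runout_distance(fall_height_m, angle_deg):
    """Horizontal travel distance implied by an empirical reach angle."""
    return fall_height_m / tan(radians(angle_deg))

cliff_height = 80.0  # m, the cliff height quoted in the abstract
for angle in (32.0, 27.0, 22.0):  # hypothetical reach angles
    print(f"angle {angle:.0f} deg -> runout ~ {runout_distance(cliff_height, angle):.0f} m")
```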

  14. 44 CFR 65.16 - Standard Flood Hazard Determination Form and Instructions.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 44 Emergency Management and Assistance 1 2010-10-01 2010-10-01 false Standard Flood Hazard... MANAGEMENT AGENCY, DEPARTMENT OF HOMELAND SECURITY INSURANCE AND HAZARD MITIGATION National Flood Insurance Program IDENTIFICATION AND MAPPING OF SPECIAL HAZARD AREAS § 65.16 Standard Flood Hazard...

  15. 44 CFR 65.16 - Standard Flood Hazard Determination Form and Instructions.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 44 Emergency Management and Assistance 1 2014-10-01 2014-10-01 false Standard Flood Hazard... MANAGEMENT AGENCY, DEPARTMENT OF HOMELAND SECURITY INSURANCE AND HAZARD MITIGATION National Flood Insurance Program IDENTIFICATION AND MAPPING OF SPECIAL HAZARD AREAS § 65.16 Standard Flood Hazard...

  16. 44 CFR 65.16 - Standard Flood Hazard Determination Form and Instructions.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 44 Emergency Management and Assistance 1 2013-10-01 2013-10-01 false Standard Flood Hazard... MANAGEMENT AGENCY, DEPARTMENT OF HOMELAND SECURITY INSURANCE AND HAZARD MITIGATION National Flood Insurance Program IDENTIFICATION AND MAPPING OF SPECIAL HAZARD AREAS § 65.16 Standard Flood Hazard...

  17. 44 CFR 65.16 - Standard Flood Hazard Determination Form and Instructions.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 44 Emergency Management and Assistance 1 2012-10-01 2011-10-01 true Standard Flood Hazard... MANAGEMENT AGENCY, DEPARTMENT OF HOMELAND SECURITY INSURANCE AND HAZARD MITIGATION National Flood Insurance Program IDENTIFICATION AND MAPPING OF SPECIAL HAZARD AREAS § 65.16 Standard Flood Hazard...

  18. 44 CFR 65.16 - Standard Flood Hazard Determination Form and Instructions.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 44 Emergency Management and Assistance 1 2011-10-01 2011-10-01 false Standard Flood Hazard... MANAGEMENT AGENCY, DEPARTMENT OF HOMELAND SECURITY INSURANCE AND HAZARD MITIGATION National Flood Insurance Program IDENTIFICATION AND MAPPING OF SPECIAL HAZARD AREAS § 65.16 Standard Flood Hazard...

  19. Hazard and operability study of the multi-function Waste Tank Facility. Revision 1

    SciTech Connect

    Hughes, M.E.

    1995-05-15

    The Multi-Function Waste Tank Facility (MWTF) East site will be constructed on the west side of the 200E Area, and the MWTF West site will be constructed in the southwest quadrant of the 200W Area at the Hanford Site. This document describes the facility hazards to which site personnel or the general public could potentially be exposed during operation. A list of preliminary Design Basis Accidents was developed.

  20. A New Master of Natural Hazards Program at The Australian National University

    NASA Astrophysics Data System (ADS)

    Pozgay, S.; Zoleta-Nantes, D.

    2009-12-01

    The new Master of Natural Hazards program at The Australian National University provides a multi-disciplinary approach to the study and monitoring of geophysical processes that can lead to the recognition of hazards and a consequent reduction of their impacts through emergency measures, disaster plans, and relief and rehabilitation. The program gives students the most up-to-date scientific understanding of the causes of natural hazards, their effects on human societies, and ways to mitigate their impacts and reduce losses, focusing on case studies from Australia and the Asia-Pacific. The program brings together the expertise of researchers across the university to offer coursework and research projects that provide extensive knowledge of the natural hazards that occur and pose the greatest risks to human communities in the Asia-Pacific, together with an understanding of the human dimensions of natural hazard occurrences. The program consists of two compulsory courses each in the Earth Sciences and the Social Sciences, designed to provide a complementary and comprehensive overview of natural hazards issues. Elective courses can be drawn from a general grouping, or students may choose one of four Focus Streams: Environmental and Geographic Studies; Climate Change; Earth Structure and Imaging; or Socio-economic, Development and Policy Studies. A special case study project involves writing a thesis, on a topic approved by the Program Conveners, comprising a body of work on natural hazards in the Asia-Pacific region. Students in this program will gain broad scientific knowledge and methodological skills to understand the physical causes and frequency of the most important natural hazards in the Asia-Pacific region, as well as the latest scientific methods and best practices for monitoring them for hazard mapping and disaster

  1. SLUDGE TREATMENT PROJECT ENGINEERED CONTAINER RETRIEVAL AND TRANSFER SYSTEM PRELMINARY DESIGN HAZARD AND OPERABILITY STUDY

    SciTech Connect

    CARRO CA

    2011-07-15

    This Hazard and Operability (HAZOP) study addresses the Sludge Treatment Project (STP) Engineered Container Retrieval and Transfer System (ECRTS) preliminary design for retrieving sludge from underwater engineered containers located in the 105-K West (KW) Basin, transferring the sludge as a sludge-water slurry (hereafter referred to as 'slurry') to a Sludge Transport and Storage Container (STSC) located in a Modified KW Basin Annex, and preparing the STSC for transport to T Plant using the Sludge Transport System (STS). There are six underwater engineered containers located in the KW Basin that, at the time of sludge retrieval, will contain an estimated volume of 5.2 m³ of KW Basin floor and pit sludge, 18.4 m³ of 105-K East (KE) Basin floor, pit, and canister sludge, and 3.5 m³ of settler tank sludge. The KE and KW Basin sludge consists of fuel corrosion products (including metallic uranium, and fission and activation products), small fuel fragments, iron and aluminum oxide, sand, dirt, operational debris, and biological debris. The settler tank sludge consists of sludge generated by the washing of KE and KW Basin fuel in the Primary Clean Machine. A detailed description of the origin of sludge and its chemical and physical characteristics can be found in HNF-41051, Preliminary STP Container and Settler Sludge Process System Description and Material Balance. In summary, the ECRTS retrieves sludge from the engineered containers and hydraulically transfers it as a slurry into an STSC positioned within a trailer-mounted STS cask located in a Modified KW Basin Annex. The slurry is allowed to settle within the STSC to concentrate the solids and clarify the supernate. After a prescribed settling period, the supernate is decanted. The decanted supernate is filtered through a sand filter and returned to the basin. Subsequent batches of slurry are added to the STSC, settled, and excess supernate removed until the prescribed quantity of sludge is collected.
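
    The settle-and-decant batching described in this record can be caricatured with a simple accumulation loop. Apart from the quoted sludge volumes, every number below (batch size, settled-solids fraction) is an assumption for illustration, not project data.

```python
# Hypothetical illustration (only the quoted sludge totals come from the
# record): repeated batch transfers in which slurry is added to the STSC,
# solids settle, and supernate is decanted until a prescribed sludge volume
# has been collected.
def batches_to_fill(target_sludge_m3, batch_slurry_m3, solids_fraction):
    """Count settle-and-decant batches needed to accumulate the target sludge."""
    collected = 0.0
    batches = 0
    while collected < target_sludge_m3:
        collected += batch_slurry_m3 * solids_fraction  # solids retained after decant
        batches += 1
    return batches

# Assumed batch size and settled-solids fraction; the target is the quoted
# 5.2 + 18.4 + 3.5 m^3 total of container and settler sludge.
print(batches_to_fill(5.2 + 18.4 + 3.5, batch_slurry_m3=4.0, solids_fraction=0.3))
```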

  2. 2015 USGS Seismic Hazard Model for Induced Seismicity

    NASA Astrophysics Data System (ADS)

    Petersen, M. D.; Mueller, C. S.; Moschetti, M. P.; Hoover, S. M.; Ellsworth, W. L.; Llenos, A. L.; Michael, A. J.

    2015-12-01

    Over the past several years, the seismicity rate has increased markedly in multiple areas of the central U.S. Studies have tied the majority of this increased activity to wastewater injection in deep wells and hydrocarbon production. These earthquakes are induced by human activities that change rapidly based on economic and policy decisions, making them difficult to forecast. Our 2014 USGS National Seismic Hazard Model and previous models are intended to provide the long-term hazard (2% probability of exceedance in 50 years) and are based on seismicity rates and patterns observed mostly from tectonic earthquakes. However, potentially induced earthquakes were identified in 14 regions that were not included in the earthquake catalog used for constructing the 2014 model. We recognized the importance of considering these induced earthquakes in a separate hazard analysis, and as a result in April 2015 we released preliminary models that explored the impact of this induced seismicity on the hazard. Several factors are important in determining the hazard from induced seismicity: period of the catalog that optimally forecasts the next year's activity, earthquake magnitude-rate distribution, earthquake location statistics, maximum magnitude, ground motion models, and industrial drivers such as injection rates. The industrial drivers are not currently available in a form that we can implement in a 1-year model. Hazard model inputs have been evaluated by a broad group of scientists and engineers to assess the range of acceptable models. Results indicate that next year's hazard is significantly higher by more than a factor of three in Oklahoma, Texas, and Colorado compared to the long-term 2014 hazard model. These results have raised concern about the impacts of induced earthquakes on the built environment and have led to many engineering and policy discussions about how to mitigate these effects for the more than 7 million people that live near areas of induced seismicity.
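
    For context, long-term hazard levels such as "2% probability of exceedance in 50 years" map to annual rates under the standard Poisson assumption, P = 1 - exp(-rate * t). The sketch below shows that conversion and, purely for illustration, how a threefold increase in the annual rate changes a one-year exceedance probability; it is not the USGS model's calculation.

```python
# Standard Poisson relation behind statements like "2% probability of
# exceedance in 50 years": P = 1 - exp(-rate * t). The factor-of-three
# scaling below is purely illustrative.
from math import exp, log

def annual_rate_from_poe(poe, years):
    """Annual exceedance rate implied by a probability of exceedance over `years`."""
    return -log(1.0 - poe) / years

def poe_from_rate(rate, years):
    """Probability of exceedance over `years` for a Poisson process."""
    return 1.0 - exp(-rate * years)

baseline_rate = annual_rate_from_poe(0.02, 50)  # long-term model level
print(f"baseline 1-yr PoE: {poe_from_rate(baseline_rate, 1):.4f}")
print(f"3x rate  1-yr PoE: {poe_from_rate(3 * baseline_rate, 1):.4f}")
```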

  3. Comment on Omira, Baptista and Matias (2015), "Probabilistic Tsunami Hazard in the Northeast Atlantic from Near- and Far-Field Tectonic Sources"

    NASA Astrophysics Data System (ADS)

    Fonseca, Joao F. B. D.

    2016-12-01

    Omira et al. (Pure and Applied Geophysics 172:901-920, 2015) assess the probabilistic tsunami hazard in the NE Atlantic. While recognizing the importance of this type of study for the prioritization of risk mitigation measures, I take issue with the way the results are estimated and presented. I argue that the communication of the hazard is flawed, the analysis suffered from compounded conservatism, and no estimate of the associated uncertainties is provided.

  4. Comment on Omira, Baptista and Matias (2015), "Probabilistic Tsunami Hazard in the Northeast Atlantic from Near- and Far-Field Tectonic Sources"

    NASA Astrophysics Data System (ADS)

    Fonseca, Joao F. B. D.

    2017-03-01

    Omira et al. (Pure and Applied Geophysics 172:901-920, 2015) assess the probabilistic tsunami hazard in the NE Atlantic. While recognizing the importance of this type of study for the prioritization of risk mitigation measures, I take issue with the way the results are estimated and presented. I argue that the communication of the hazard is flawed, the analysis suffered from compounded conservatism, and no estimate of the associated uncertainties is provided.

  5. Probabilistic short-term volcanic hazard in phases of unrest: A case study for tephra fallout

    NASA Astrophysics Data System (ADS)

    Selva, Jacopo; Costa, Antonio; Sandri, Laura; Macedonio, Giovanni; Marzocchi, Warner

    2014-12-01

    During volcanic crises, volcanologists usually estimate the impact of possible imminent eruptions through deterministic modeling of the effects of one or a few preestablished scenarios. Although such an approach can bring important information to decision makers, the sole use of deterministic scenarios does not allow scientists to properly take all uncertainties into consideration, and it cannot be used to assess risk quantitatively because the latter unavoidably requires a probabilistic approach. We present a model based on the concept of the Bayesian event tree (hereinafter named BET_VH_ST, standing for Bayesian event tree for short-term volcanic hazard) for short-term, near-real-time probabilistic volcanic hazard analysis, formulated for any potentially hazardous phenomenon accompanying an eruption. The specific goal of BET_VH_ST is to produce a quantitative assessment of the probability of exceedance of any potential level of intensity for a given volcanic hazard due to eruptions within restricted time windows (hours to days) in any area surrounding the volcano, accounting for all natural and epistemic uncertainties. BET_VH_ST assesses the conditional probability at each level of the event tree, accounting for any relevant information derived from the monitoring system, theoretical models, and the past history of the volcano, and propagating any relevant epistemic uncertainty underlying these assessments. As an application example, we apply BET_VH_ST to assess short-term volcanic hazard related to tephra loading during the Major Emergency Simulation Exercise, a major exercise at Mount Vesuvius that took place from 19 to 23 October 2006, consisting of a blind simulation of Vesuvius reactivation, from the early warning phase up to the final eruption, including the evacuation of a sample of about 2000 people from the area at risk. The results show that BET_VH_ST is able to produce short-term forecasts of the impact of tephra fall during a rapidly
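
    The core event-tree arithmetic, stripped of the Bayesian treatment of uncertainty and monitoring data that BET_VH_ST adds, is a product of conditional probabilities along a branch. The node probabilities below are hypothetical, chosen only to show the mechanics.

```python
# Minimal sketch of the event-tree idea: the probability of exceeding a given
# tephra load at a site is the product of conditional probabilities along one
# branch (eruption given unrest, tephra fallout given eruption, the site being
# reached, the intensity threshold being exceeded). All node probabilities are
# hypothetical; BET_VH_ST additionally propagates full probability
# distributions and monitoring information at each node.
def branch_probability(conditional_probs):
    """Multiply conditional probabilities along one event-tree branch."""
    p = 1.0
    for node_p in conditional_probs:
        p *= node_p
    return p

branch = {
    "eruption given unrest (48 h)": 0.10,
    "tephra fallout given eruption": 0.90,
    "plume reaches the site": 0.40,
    "load exceeds 300 kg/m^2": 0.25,
}
print(f"P(exceedance) = {branch_probability(branch.values()):.4f}")
```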

  6. Conveying Flood Hazard Risk Through Spatial Modeling: A Case Study for Hurricane Sandy-Affected Communities in Northern New Jersey.

    PubMed

    Artigas, Francisco; Bosits, Stephanie; Kojak, Saleh; Elefante, Dominador; Pechmann, Ildiko

    2016-10-01

    The accurate forecast of the Hurricane Sandy sea surge was the result of integrating the most sophisticated environmental monitoring technology available. This stands in contrast to the limited information and technology that exist at the community level to translate such forecasts into flood hazard levels on the ground, at scales that are meaningful to property owners. Appropriately scaled maps with high levels of certainty can be used effectively to convey exposure to flood hazard at the community level. This paper explores the most basic analysis and data required to generate a relatively accurate flood hazard map to convey inundation risk due to sea surge. A Boolean overlay analysis of four input layers (elevation and slope derived from LiDAR data, and distances from streams and catch basins derived from aerial photography and field reconnaissance) was used to create a spatial model that explained 55 % of the extent and depth of the flood during Hurricane Sandy. When a ponding layer was added to this model to account for depressions that would fill and spill over to nearby areas, the new model explained almost 70 % of the extent and depth of the flood. The study concludes that fairly accurate maps can be created with readily available information and that a great deal about inundation risk at the property level can be inferred from flood hazard maps. The study goes on to conclude that local communities are encouraged to prepare for disasters but, in reality, because of the existing Federal emergency management framework, there is very little incentive to do so.

  7. Conveying Flood Hazard Risk Through Spatial Modeling: A Case Study for Hurricane Sandy-Affected Communities in Northern New Jersey

    NASA Astrophysics Data System (ADS)

    Artigas, Francisco; Bosits, Stephanie; Kojak, Saleh; Elefante, Dominador; Pechmann, Ildiko

    2016-10-01

    The accurate forecast of the Hurricane Sandy sea surge was the result of integrating the most sophisticated environmental monitoring technology available. This stands in contrast to the limited information and technology that exist at the community level to translate such forecasts into flood hazard levels on the ground, at scales that are meaningful to property owners. Appropriately scaled maps with high levels of certainty can be used effectively to convey exposure to flood hazard at the community level. This paper explores the most basic analysis and data required to generate a relatively accurate flood hazard map to convey inundation risk due to sea surge. A Boolean overlay analysis of four input layers (elevation and slope derived from LiDAR data, and distances from streams and catch basins derived from aerial photography and field reconnaissance) was used to create a spatial model that explained 55 % of the extent and depth of the flood during Hurricane Sandy. When a ponding layer was added to this model to account for depressions that would fill and spill over to nearby areas, the new model explained almost 70 % of the extent and depth of the flood. The study concludes that fairly accurate maps can be created with readily available information and that a great deal about inundation risk at the property level can be inferred from flood hazard maps. The study goes on to conclude that local communities are encouraged to prepare for disasters but, in reality, because of the existing Federal emergency management framework, there is very little incentive to do so.
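
    The Boolean overlay described in the two records above combines per-cell criteria from several raster layers with a logical AND. A toy sketch with hypothetical arrays and thresholds (not the study's data or cutoff values):

```python
# Sketch of a Boolean overlay of four raster criteria (low elevation, gentle
# slope, near a stream, near a catch basin) flagging likely inundation cells.
# The toy arrays and thresholds are hypothetical, not the study's values.
import numpy as np

elevation_m   = np.array([[1.2, 0.8, 0.5], [2.5, 1.0, 0.4], [3.0, 2.2, 0.9]])
slope_deg     = np.array([[1.0, 0.5, 0.3], [4.0, 1.5, 0.2], [6.0, 3.0, 0.8]])
dist_stream_m = np.array([[40, 25, 10], [120, 60, 15], [200, 150, 30]])
dist_basin_m  = np.array([[30, 20, 10], [90, 50, 20], [150, 100, 25]])

flood_prone = (
    (elevation_m < 1.5)
    & (slope_deg < 2.0)
    & (dist_stream_m < 50)
    & (dist_basin_m < 60)
)
print(flood_prone.astype(int))
```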

  8. Hazard Assessment of Chemical Air Contaminants Measured in Residences

    SciTech Connect

    Logue, J.M.; McKone, T.E.; Sherman, M. H.; Singer, B.C.

    2010-05-10

    Identifying air pollutants that pose a potential hazard indoors can facilitate exposure mitigation. In this study, we compiled summary results from 77 published studies reporting measurements of chemical pollutants in residences in the United States and in countries with similar lifestyles. These data were used to calculate representative mid-range and upper-bound concentrations relevant to chronic exposures for 267 pollutants and representative peak concentrations relevant to acute exposures for 5 activity-associated pollutants. Representative concentrations are compared to available chronic and acute health standards for 97 pollutants. Fifteen pollutants appear to exceed chronic health standards in a large fraction of homes. Nine other pollutants are identified as potential chronic health hazards in a substantial minority of homes, and an additional nine are identified as potential hazards in a very small percentage of homes.
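
    The comparison of representative concentrations against health standards is, in effect, a hazard-quotient screen: a pollutant is flagged when its concentration divided by the applicable reference level exceeds 1. The pollutants and values below are placeholders, not results from the study.

```python
# Sketch of the screening comparison described above: flag a pollutant when
# its representative indoor concentration exceeds the applicable health
# standard (hazard quotient above 1). Concentrations and reference values are
# placeholders, not the study's data.
def hazard_quotient(concentration, reference):
    """Ratio of measured concentration to the applicable health standard."""
    return concentration / reference

# Hypothetical (pollutant, representative indoor conc., chronic standard), ug/m^3.
pollutants = [("pollutant A", 110.0, 100.0),
              ("pollutant B", 2.0, 9.0),
              ("pollutant C", 45.0, 10.0)]
for name, conc, ref in pollutants:
    hq = hazard_quotient(conc, ref)
    flag = "exceeds standard" if hq > 1 else "below standard"
    print(f"{name}: HQ = {hq:.1f} ({flag})")
```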