Science.gov

Sample records for hazard mitigation studies

  1. Unacceptable Risk: Earthquake Hazard Mitigation in One California School District. Hazard Mitigation Case Study.

    ERIC Educational Resources Information Center

    California State Office of Emergency Services, Sacramento.

Earthquakes are a perpetual threat to California's school buildings. School administrators must be aware that hazard mitigation means much more than simply keeping a supply of water bottles in the school; it means getting everyone involved in efforts to prevent tragedies from occurring in school buildings in the event of an earthquake. The PTA in…

  2. Mitigating lightning hazards

    SciTech Connect

    Hasbrouck, R.

    1996-05-01

A new draft document provides guidance for assessing and mitigating the effects of lightning hazards on a Department of Energy (or any other) facility. Written by two Lawrence Livermore engineers, the document combines lightning hazard identification and facility categorization with a new concept, the Lightning Safety System, to help dispel the confusion and mystery surrounding lightning and its effects. The guidance is of particular interest to DOE facilities storing and handling nuclear and high-explosive materials. The concepts presented in the document were used to evaluate the lightning protection systems of the Device Assembly Facility at the Nevada Test Site.

  3. Numerical study on tsunami hazard mitigation using a submerged breakwater.

    PubMed

    Ha, Taemin; Yoo, Jeseon; Han, Sejong; Cho, Yong-Sik

    2014-01-01

    Most coastal structures have been built in surf zones to protect coastal areas. In general, the transformation of waves in the surf zone is quite complicated and numerous hazards to coastal communities may be associated with such phenomena. Therefore, the behavior of waves in the surf zone should be carefully analyzed and predicted. Furthermore, an accurate analysis of deformed waves around coastal structures is directly related to the construction of economically sound and safe coastal structures because wave height plays an important role in determining the weight and shape of a levee body or armoring material. In this study, a numerical model using a large eddy simulation is employed to predict the runup heights of nonlinear waves that passed a submerged structure in the surf zone. Reduced runup heights are also predicted, and their characteristics in terms of wave reflection, transmission, and dissipation coefficients are investigated. PMID:25215334
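The reflection, transmission, and dissipation coefficients investigated above are linked by a simple energy balance for linear waves. As an illustrative sketch (the wave heights below are invented, not values from the study):

```python
import math

def dissipation_coefficient(h_incident, h_reflected, h_transmitted):
    """Infer the energy-dissipation coefficient Kd from incident,
    reflected, and transmitted wave heights, using the linear-wave
    energy balance Kr^2 + Kt^2 + Kd^2 = 1."""
    kr = h_reflected / h_incident     # reflection coefficient
    kt = h_transmitted / h_incident   # transmission coefficient
    kd_sq = 1.0 - kr**2 - kt**2
    if kd_sq < 0:
        raise ValueError("wave heights violate the energy balance")
    return kr, kt, math.sqrt(kd_sq)

# Hypothetical wave heights (m) around a submerged breakwater
kr, kt, kd = dissipation_coefficient(2.0, 0.6, 1.2)
```

A breakwater design that lowers the transmission coefficient Kt is what reduces the runup heights behind the structure.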

  4. Numerical Study on Tsunami Hazard Mitigation Using a Submerged Breakwater

    PubMed Central

    Yoo, Jeseon; Han, Sejong; Cho, Yong-Sik

    2014-01-01

    Most coastal structures have been built in surf zones to protect coastal areas. In general, the transformation of waves in the surf zone is quite complicated and numerous hazards to coastal communities may be associated with such phenomena. Therefore, the behavior of waves in the surf zone should be carefully analyzed and predicted. Furthermore, an accurate analysis of deformed waves around coastal structures is directly related to the construction of economically sound and safe coastal structures because wave height plays an important role in determining the weight and shape of a levee body or armoring material. In this study, a numerical model using a large eddy simulation is employed to predict the runup heights of nonlinear waves that passed a submerged structure in the surf zone. Reduced runup heights are also predicted, and their characteristics in terms of wave reflection, transmission, and dissipation coefficients are investigated. PMID:25215334

  5. Washington Tsunami Hazard Mitigation Program

    NASA Astrophysics Data System (ADS)

    Walsh, T. J.; Schelling, J.

    2012-12-01

Washington State has participated in the National Tsunami Hazard Mitigation Program (NTHMP) since its inception in 1995. We have participated in the tsunami inundation hazard mapping, evacuation planning, education, and outreach efforts that generally characterize the NTHMP efforts. We have also investigated hazards of significant interest to the Pacific Northwest. The hazard from locally generated earthquakes on the Cascadia subduction zone, which threatens tsunami inundation in less than an hour following a magnitude 9 earthquake, creates special problems for low-lying accretionary shoreforms in Washington, such as the spits of Long Beach and Ocean Shores, where high ground is not accessible within the limited time available for evacuation. To ameliorate this problem, we convened a panel of the Applied Technology Council to develop guidelines for construction of facilities for vertical evacuation from tsunamis, published as FEMA 646, now incorporated in the International Building Code as Appendix M. We followed this with a program called Project Safe Haven (http://www.facebook.com/ProjectSafeHaven) to site such facilities along the Washington coast in appropriate locations, with designs appropriate to blend with the local communities, as chosen by the citizens. This has now been completed for the entire outer coast of Washington. In conjunction with this effort, we have evaluated the potential for earthquake-induced ground failures in and near tsunami hazard zones to help develop cost estimates for these structures and to establish appropriate tsunami evacuation routes and evacuation assembly areas that are likely to be available after a major subduction zone earthquake. We intend to continue these geotechnical evaluations for all tsunami hazard zones in Washington.

  6. Contributions of Nimbus 7 TOMS Data to Volcanic Study and Hazard Mitigation

    NASA Technical Reports Server (NTRS)

    Krueger, Arlin J.; Bluth, G. J. S.; Schaefer, S. A.

    1998-01-01

Nimbus TOMS data have led to advancements among many volcano-related scientific disciplines, from the initial ability to quantify SO2 clouds leading to derivations of eruptive S budgets and fluxes, to tracking of individual clouds, assessing global volcanism and atmospheric impacts. Some of the major aspects of TOMS-related research, listed below, will be reviewed and updated: (1) Measurement of volcanic SO2 clouds: Nimbus TOMS observed over 100 individual SO2 clouds during its mission lifetime; large explosive eruptions are now routinely and reliably measured by satellite. (2) Eruption processes: quantification of SO2 emissions has allowed assessments of eruption sulfur budgets, the evaluation of "excess" sulfur, and inferences of H2S emissions. (3) Detection of ash: TOMS data are now used to detect volcanic particulates in the atmosphere, providing complementary analyses to infrared methods of detection. Paired TOMS and AVHRR studies have provided invaluable information on volcanic cloud compositions and processes. (4) Cloud tracking and hazard mitigation: volcanic clouds can be considered gigantic tracers in the atmosphere, and studies of the fates of these clouds have led to new knowledge of their physical and chemical dispersion in the atmosphere for predictive models. (5) Global trends: the long-term data set has provided researchers an unparalleled record of explosive volcanism, and forms a key component in assessing annual to decadal trends in global S emissions. (6) Atmospheric impacts: TOMS data have been linked to independent records of atmospheric change, in order to compare cause and effect processes following a massive injection of SO2 into the atmosphere. (7) Future TOMS instruments and applications: Nimbus TOMS has given way to new satellite platforms, with several wavelength and resolution modifications. New efforts to launch a geostationary TOMS could provide unprecedented observations of volcanic activity.

  7. Landslide hazard mitigation in North America

    USGS Publications Warehouse

    Wieczorek, G.F.; Leahy, P.P.

    2008-01-01

    Active landslides throughout the states and territories of the United States result in extensive property loss and 25-50 deaths per year. The U.S. Geological Survey (USGS) has a long history of detailed examination of landslides since the work of Howe (1909) in the San Juan Mountains of Colorado. In the last four decades, landslide inventory maps and landslide hazard maps have depicted landslides of different ages, identified fresh landslide scarps, and indicated the direction of landslide movement for different regions of the states of Colorado, California, and Pennsylvania. Probability-based methods improve landslide hazards assessments. Rainstorms, earthquakes, wildfires, and volcanic eruptions can trigger landslides. Improvements in remote sensing of rainfall make it possible to issue landslide advisories and warnings for vulnerable areas. From 1986 to 1995, the USGS issued hazard warnings based on rainfall in the San Francisco Bay area. USGS workers also identified rainfall thresholds triggering landslides in Puerto Rico, Hawaii, Washington, and the Blue Ridge Mountains of central Virginia. Detailed onsite monitoring of landslides near highways in California and Colorado aided transportation officials. The USGS developed a comprehensive, multi-sector, and multi-agency strategy to mitigate landslide hazards nationwide. This study formed the foundation of the National Landslide Hazards Mitigation Strategy. The USGS, in partnership with the U.S. National Weather Service and the State of California, began to develop a real-time warning system for landslides from wildfires in Southern California as a pilot study in 2005.
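Rainfall thresholds of the kind identified by the USGS are commonly expressed as intensity-duration power laws. A minimal sketch (the constants are the classic Caine-type global values, not the region-specific USGS thresholds):

```python
def exceeds_threshold(intensity_mm_hr, duration_hr, c=14.82, k=0.39):
    """Check a storm against an intensity-duration landslide threshold
    of the form I = c * D**(-k) (a Caine-type power law). The default
    constants are illustrative global values, not operational ones."""
    return intensity_mm_hr >= c * duration_hr ** (-k)

# A 10-hour storm at 8 mm/hr: threshold is 14.82 * 10**-0.39, about 6 mm/hr,
# so this storm would trigger an advisory under the illustrative curve.
triggers = exceeds_threshold(8.0, 10.0)
```

Operational warning systems compare real-time radar or gauge rainfall against such curves fitted to local landslide records.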

  8. Predictability and extended-range prognosis in natural hazard risk mitigation process: A case study over west Greece

    NASA Astrophysics Data System (ADS)

    Matsangouras, Ioannis T.; Nastos, Panagiotis T.

    2014-05-01

Natural hazards pose an increasing threat to society, and innovative techniques and methodologies must be developed to enhance the risk mitigation process. It is commonly accepted that disaster risk reduction is vital for successful future economic and social development. The steadily improving accuracy of extended-range prognosis products, covering monthly and seasonal predictability, has introduced them as a new essential link in the risk mitigation procedure. Aiming at decreasing risk, this paper presents the use of seasonal and monthly forecasting processes tested over west Greece from September to December 2013. During that season, significant severe weather events occurred, causing significant impacts on local society (severe storms/rainfall, hail, flash floods, etc.). Seasonal and monthly forecasting products from the European Centre for Medium-Range Weather Forecasts (ECMWF) depicted, with probabilities stratified by terciles, areas of Greece where significant weather might occur. Whereas atmospheric natural hazard early warning systems can deliver warnings up to 72 hours in advance, this study illustrates that extended-range prognosis could be introduced as a new technique in risk mitigation: seasonal and monthly forecast products can highlight extended areas where severe weather events may occur with one month of lead time. In addition, a risk mitigation procedure that adopts extended prognosis products is presented, providing useful time for the preparedness process at the regional administration level.
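Tercile-stratified probabilities like those in the ECMWF products can be derived from an ensemble forecast by counting members against climatological tercile boundaries. A minimal sketch with invented numbers (not ECMWF data):

```python
def tercile_probabilities(ensemble, lower, upper):
    """Stratify an ensemble forecast into climatological terciles.
    `lower` and `upper` are the climatological tercile boundaries;
    returns P(below), P(near-normal), P(above) as member fractions."""
    n = len(ensemble)
    below = sum(1 for x in ensemble if x < lower) / n
    above = sum(1 for x in ensemble if x > upper) / n
    return below, 1.0 - below - above, above

# Hypothetical monthly-rainfall ensemble (mm) with climatological
# tercile boundaries at 40 and 80 mm
members = [35, 55, 90, 110, 70, 95, 30, 85, 60, 100]
p_below, p_normal, p_above = tercile_probabilities(members, 40, 80)
```

A strong shift of probability mass into the upper tercile over a region is the kind of signal that could flag it for enhanced preparedness one month ahead.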

  9. Satellite imagery for volcanic hazards mitigation

    USGS Publications Warehouse

    Helz, R.T.; Ellrod, G.A.; Wadge, G.

    2002-01-01

    The Committee on Earth Observation Satellites (CEOS) seeks to foster cooperation to increase the usefulness and accessibility of satellite imagery. In 1997, CEOS initiated the Disaster Management Support Project to assess the present and potential use of satellite-derived information for volcanic hazards mitigation. The final report of the CEOS Volcanic Hazards Working Group reviews current use of satellite data for mitigation of volcanic hazards. The report specifies the minimum spectral channels needed for effective remote sensing of volcanic hazards, together with recommendations for threshold and optimum spatial and temporal resolutions.

  10. An economic and geographic appraisal of a spatial natural hazard risk: a study of landslide mitigation rules

    USGS Publications Warehouse

    Bernknopf, R.L.; Brookshire, D.S.; Campbell, R.H.; Shapiro, C.D.

    1988-01-01

    Efficient mitigation of natural hazards requires a spatial representation of the risk, based upon the geographic distribution of physical parameters and man-related development activities. Through such a representation, the spatial probability of landslides based upon physical science concepts is estimated for Cincinnati, Ohio. Mitigation programs designed to reduce loss from landslide natural hazards are then evaluated. An optimum mitigation rule is suggested that is spatially selective and is determined by objective measurements of hillside slope and properties of the underlying soil. -Authors
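A spatially selective rule based on hillside slope and soil properties can be illustrated with the classic infinite-slope factor of safety. This is a sketch of the general approach, with invented soil parameters, not the authors' actual model:

```python
import math

def factor_of_safety(slope_deg, depth_m, cohesion_kpa, friction_deg,
                     unit_weight=19.0, pore_pressure_kpa=0.0):
    """Infinite-slope stability: FS < 1 marks a cell as landslide-prone.
    FS = (c + (gamma*z*cos^2(b) - u)*tan(phi)) / (gamma*z*sin(b)*cos(b))
    with stresses in kPa and unit weight in kN/m^3. Illustrative only."""
    b = math.radians(slope_deg)
    phi = math.radians(friction_deg)
    normal = unit_weight * depth_m * math.cos(b) ** 2 - pore_pressure_kpa
    shear = unit_weight * depth_m * math.sin(b) * math.cos(b)
    return (cohesion_kpa + normal * math.tan(phi)) / shear

# A 35-degree slope with weak, shallow soil; a mitigation rule might
# restrict development wherever FS falls below some margin above 1.
fs = factor_of_safety(slope_deg=35, depth_m=2.0, cohesion_kpa=5.0,
                      friction_deg=30)
```

Applied cell by cell over a digital elevation model and soil map, such a calculation yields exactly the kind of spatial probability surface a mitigation rule can be keyed to.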

  11. Earthquake Hazard Mitigation Strategy in Indonesia

    NASA Astrophysics Data System (ADS)

    Karnawati, D.; Anderson, R.; Pramumijoyo, S.

    2008-05-01

Because of the active tectonic setting of the region, the risks of geological hazards inevitably increase in the Indonesian Archipelago and other Asian countries. Encouraging communities living in vulnerable areas to adapt to the geological conditions will be the most appropriate strategy for earthquake risk reduction. Updating the earthquake hazard maps, enhancing existing land-use management, establishing public education strategies and methods, strengthening linkages among stakeholders of disaster mitigation institutions, and establishing continuous public consultation are the main strategic programs for community resilience in earthquake-vulnerable areas. This paper highlights some important achievements of earthquake hazard mitigation programs in Indonesia, together with the difficulties in implementing such programs. Case examples of the Yogyakarta and Bengkulu earthquake mitigation efforts are also discussed as lessons learned. A new approach to developing earthquake hazard maps, initiated by mapping the psychological characteristics of people living in vulnerable areas, is addressed as well.

  12. Playing against nature: improving earthquake hazard mitigation

    NASA Astrophysics Data System (ADS)

    Stein, S. A.; Stein, J.

    2012-12-01

The great 2011 Tohoku earthquake dramatically demonstrated the need to improve earthquake and tsunami hazard assessment and mitigation policies. The earthquake was much larger than predicted by hazard models, and the resulting tsunami overtopped coastal defenses, causing more than 15,000 deaths and $210 billion in damage. Hence if and how such defenses should be rebuilt is a challenging question, because the defenses fared poorly and building ones to withstand tsunamis as large as March's would be too expensive. A similar issue arises along the Nankai Trough to the south, where new estimates warning of tsunamis 2-5 times higher than in previous models raise the question of what to do, given that the timescale on which such events may occur is unknown. Thus, in the words of economist H. Hori, "What should we do in face of uncertainty? Some say we should spend our resources on present problems instead of wasting them on things whose results are uncertain. Others say we should prepare for future unknown disasters precisely because they are uncertain". Society therefore needs strategies to mitigate earthquake and tsunami hazards that make economic and societal sense, given that our ability to assess these hazards is poor, as illustrated by highly destructive earthquakes that often occur in areas predicted by hazard maps to be relatively safe. Conceptually, we are playing a game against nature "of which we still don't know all the rules" (Lomnitz, 1989). Nature chooses tsunami heights or ground shaking, and society selects the strategy to minimize the total costs of damage plus mitigation costs. As in any game of chance, we maximize our expectation value by selecting the best strategy, given our limited ability to estimate the occurrence and effects of future events. We thus outline a framework to find the optimal level of mitigation by balancing its cost against the expected damages, recognizing the uncertainties in the hazard estimates.
This framework illustrates the role of the uncertainties and the need to candidly assess them. It can be applied to exploring policies under various hazard scenarios and to mitigating other natural hazards. [Figure: Variation in total cost, the sum of expected loss and mitigation cost, as a function of mitigation level. The optimal level of mitigation, n*, minimizes the total cost.] The expected loss depends on the hazard model, so the better the hazard model, the better the mitigation policy (Stein and Stein, 2012).
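The cost-balancing idea can be sketched numerically: choose the mitigation level n* that minimizes total cost, the sum of mitigation cost and expected loss. The cost and loss functions below are invented toy forms, not the authors' models:

```python
import math

def optimal_mitigation(levels, cost, expected_loss):
    """Pick the mitigation level n* minimizing total cost,
    total(n) = cost(n) + expected_loss(n), over candidate levels."""
    return min(levels, key=lambda n: cost(n) + expected_loss(n))

# Toy example: defense cost rises linearly with seawall height n (m),
# while expected tsunami loss decays exponentially as defenses rise.
levels = [n / 2 for n in range(0, 41)]  # candidate heights, 0 to 20 m
n_star = optimal_mitigation(
    levels,
    cost=lambda n: 10.0 * n,                        # $M per metre built
    expected_loss=lambda n: 500.0 * math.exp(-0.3 * n),
)
```

With these toy curves the optimum lands at an intermediate height: below it, an extra metre of wall saves more expected loss than it costs; above it, the reverse. Uncertainty in the hazard model enters through the expected-loss curve, which is why better hazard estimates yield better policy.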

  13. Mitigation of Hazardous Comets and Asteroids

    NASA Astrophysics Data System (ADS)

    Belton, Michael J. S.; Morgan, Thomas H.; Samarasinha, Nalin H.; Yeomans, Donald K.

    2004-11-01

Preface; 1. Recent progress in interpreting the nature of the near-Earth object population W. Bottke, A. Morbidelli and R. Jedicke; 2. Earth impactors: orbital characteristics and warning times S. R. Chesley and T. B. Spahr; 3. The role of radar in predicting and preventing asteroid and comet collisions with Earth S. J. Ostro and J. D. Giorgini; 4. Interior structures for asteroids and cometary nuclei E. Asphaug; 5. What we know and don't know about surfaces of potentially hazardous small bodies C. R. Chapman; 6. About deflecting asteroids and comets K. A. Holsapple; 7. Scientific requirements for understanding the near-Earth asteroid population A. W. Harris; 8. Physical properties of comets and asteroids inferred from fireball observations M. D. Martino and A. Cellino; 9. Mitigation technologies and their requirements C. Gritzner and R. Kahle; 10. Peering inside near-Earth objects with radio tomography W. Kofman and A. Safaeinili; 11. Seismological investigation of asteroid and comet interiors J. D. Walker and W. F. Huebner; 12. Lander and penetrator science for near-Earth object mitigation studies A. J. Ball, P. Lognonne, K. Seiferlin, M. Patzold and T. Spohn; 13. Optimal interception and deflection of Earth-approaching asteroids using low-thrust electric propulsion B. A. Conway; 14. Close proximity operations at small bodies: orbiting, hovering, and hopping D. J. Scheeres; 15. Mission operations in low gravity regolith and dust D. Sears, M. Franzen, S. Moore, S. Nichols, M. Kareev and P. Benoit; 16. Impacts and the public: communicating the nature of the impact hazard D. Morrison, C. R. Chapman, D. Steel and R. P. Binzel; 17. Towards a program to remove the threat of hazardous NEOs M. J. S. Belton.

  14. Mitigation of Hazardous Comets and Asteroids

    NASA Astrophysics Data System (ADS)

    Belton, Michael J. S.; Morgan, Thomas H.; Samarasinha, Nalin H.; Yeomans, Donald K.

    2011-03-01

Preface; 1. Recent progress in interpreting the nature of the near-Earth object population W. Bottke, A. Morbidelli and R. Jedicke; 2. Earth impactors: orbital characteristics and warning times S. R. Chesley and T. B. Spahr; 3. The role of radar in predicting and preventing asteroid and comet collisions with Earth S. J. Ostro and J. D. Giorgini; 4. Interior structures for asteroids and cometary nuclei E. Asphaug; 5. What we know and don't know about surfaces of potentially hazardous small bodies C. R. Chapman; 6. About deflecting asteroids and comets K. A. Holsapple; 7. Scientific requirements for understanding the near-Earth asteroid population A. W. Harris; 8. Physical properties of comets and asteroids inferred from fireball observations M. D. Martino and A. Cellino; 9. Mitigation technologies and their requirements C. Gritzner and R. Kahle; 10. Peering inside near-Earth objects with radio tomography W. Kofman and A. Safaeinili; 11. Seismological investigation of asteroid and comet interiors J. D. Walker and W. F. Huebner; 12. Lander and penetrator science for near-Earth object mitigation studies A. J. Ball, P. Lognonne, K. Seiferlin, M. Patzold and T. Spohn; 13. Optimal interception and deflection of Earth-approaching asteroids using low-thrust electric propulsion B. A. Conway; 14. Close proximity operations at small bodies: orbiting, hovering, and hopping D. J. Scheeres; 15. Mission operations in low gravity regolith and dust D. Sears, M. Franzen, S. Moore, S. Nichols, M. Kareev and P. Benoit; 16. Impacts and the public: communicating the nature of the impact hazard D. Morrison, C. R. Chapman, D. Steel and R. P. Binzel; 17. Towards a program to remove the threat of hazardous NEOs M. J. S. Belton.

  15. Volcano hazard mitigation program in Indonesia

    USGS Publications Warehouse

    Sudradjat, A.

    1990-01-01

Volcanological investigations in Indonesia started in the 18th century, when Valentijn in 1726 prepared a chronological report of the eruption of Banda Api volcano, Maluku. Modern and intensive volcanological studies did not begin until the catastrophic eruption of Kelut volcano, East Java, in 1919. The eruption took 5,011 lives and destroyed thousands of acres of coffee plantations. An eruption lahar, generated when crater lake water mixed with volcanic eruption products, caused the deaths of a large number of the victims. An effort to mitigate the danger from volcanic eruptions was first made in 1921 by constructing a tunnel to drain the crater lake water of Kelut volcano. At the same time, a Volcanological Survey was established by the government with the responsibility of seeking every means of minimizing the hazards caused by volcanic eruptions.

  16. Advances in Tsunami Hazard Mitigation in Chile

    NASA Astrophysics Data System (ADS)

    Lagos, M.; Arenas, F.; Lillo, I.; Tamburini, L.

    2012-12-01

Chile has records of recurring tsunamis, confirmed by geological evidence, long historical records, and instrumental data. However, tsunamis were always an underestimated hazard. In 2010 its coasts were affected by a large near-field tsunami, and in 2011 by a far-field tsunami generated in Japan, confirming the high vulnerability of coastal communities. The two events had different magnitudes and impacts on coastal areas. The near-field tsunami was generated by an earthquake (Mw 8.8) that occurred on 27 February 2010; the waves arrived at the coast within a few minutes and mostly impacted small coastal communities located within the rupture area, leaving 156 victims and 25 missing. The far-field tsunami was generated by a giant earthquake (Mw 9.0) that occurred on 11 March 2011 in Japan; its first waves arrived on the coast of Chile twenty-one hours later, displacing thousands of people to high ground. These two recent events have driven advances in tsunami hazard mitigation, mainly in the localities affected by both events, incorporating tsunami risk and emergency management into territorial planning. Examples include the consideration of risk based on worst-case scenarios, the design and assessment of mitigation scenarios (e.g. tsunami forests, mitigation parks, dikes, and fills) using tsunami modeling, and more rigorous land-use policies. This research is supported by Fondecyt 11090210.

  17. WHC natural phenomena hazards mitigation implementation plan

    SciTech Connect

    Conrads, T.J.

    1996-09-11

Natural phenomena hazards (NPH) are unexpected acts of nature which pose a threat or danger to workers, the public, or the environment. Earthquakes, extreme winds (hurricane and tornado), snow, flooding, volcanic ashfall, and lightning strikes are examples of NPH at Hanford. It is the policy of the U.S. Department of Energy (DOE) to design, construct, and operate DOE facilities so that workers, the public, and the environment are protected from NPH and other hazards. During 1993, the DOE Richland Operations Office (RL) transmitted DOE Order 5480.28, "Natural Phenomena Hazards Mitigation," to Westinghouse Hanford Company (WHC) for compliance. The Order includes rigorous new NPH criteria for the design of new DOE facilities as well as for the evaluation and upgrade of existing DOE facilities. In 1995 DOE issued Order 420.1, "Facility Safety," which contains the same NPH requirements and invokes the same applicable standards as Order 5480.28. It will supersede Order 5480.28 when an in-force date for Order 420.1 is established through contract revision. Activities will be planned and accomplished in four phases: mobilization, prioritization, evaluation, and upgrade. The basis for the graded approach is the designation of facilities/structures into one of five performance categories based upon safety function, mission, and cost. This Implementation Plan develops the program for the Prioritization Phase, as well as an overall strategy for the implementation of DOE Order 5480.28.

  18. The National Tsunami Hazard Mitigation Program

    NASA Astrophysics Data System (ADS)

    Bernard, E. N.

    2003-12-01

The National Tsunami Hazard Mitigation Program (NTHMP) is a state/Federal partnership that was created to reduce the impacts of tsunamis on U.S. coastal areas. It is a coordinated effort between the states of Alaska, California, Hawaii, Oregon, and Washington and four Federal agencies: the National Oceanic and Atmospheric Administration (NOAA), the Federal Emergency Management Agency (FEMA), the U.S. Geological Survey (USGS), and the National Science Foundation (NSF). NOAA has led the effort to forge a solid partnership between the states and the Federal agencies because of its responsibility to provide tsunami warning services to the nation. This successful partnership has established a mitigation program in each state that is preparing coastal communities for the next tsunami. Inundation maps are now available for many of the coastal communities of Alaska, California, Hawaii, Oregon, and Washington. These maps are used to develop evacuation plans and, in the case of Oregon, for land use management. The partnership has successfully upgraded the warning capability in NOAA so that earthquakes can be detected within 5 minutes and tsunamis can be detected in the open ocean in real time, paving the way for improved tsunami forecasts. NSF's new Network for Earthquake Engineering Simulation (NEES) program has agreed to work with the NTHMP to focus tsunami research on national needs. An overview of the NTHMP will be given, including a discussion of accomplishments and the new collaboration with NEES.

  19. EVALUATION OF FOAMS FOR MITIGATING AIR POLLUTION FROM HAZARDOUS SPILLS

    EPA Science Inventory

    This program has been conducted to evaluate commercially available water base foams for mitigating the vapors from hazardous chemical spills. Foam systems were evaluated in the laboratory to define those foam properties which are important in mitigating hazardous vapors. Larger s...

  20. The National Tsunami Hazard Mitigation Program

    NASA Astrophysics Data System (ADS)

    Bernard, E. N.

    2004-12-01

The National Tsunami Hazard Mitigation Program (NTHMP) is a state/Federal partnership that was created to reduce the impacts of tsunamis on U.S. coastal areas. It is a coordinated effort between the states of Alaska, California, Hawaii, Oregon, and Washington and four Federal agencies: the National Oceanic and Atmospheric Administration (NOAA), the Federal Emergency Management Agency (FEMA), the U.S. Geological Survey (USGS), and the National Science Foundation (NSF). NOAA has led the effort to forge a solid partnership between the states and the Federal agencies because of its responsibility to provide tsunami warning services to the nation. The successful partnership has established a mitigation program in each state that is developing tsunami-resilient coastal communities. Inundation maps are now available for many of the coastal communities of Alaska, California, Hawaii, Oregon, and Washington. These maps are used to develop evacuation plans and, in the case of Oregon, for land use management. The NTHMP mapping technology is now being applied to FEMA's Flood Insurance Rate Maps (FIRMs). The NTHMP has successfully upgraded the warning capability in NOAA so that earthquakes can be detected within 5 minutes and tsunamis can be detected in the open ocean in real time. Deep ocean reporting of tsunamis has already averted one unnecessary evacuation of Hawaii and demonstrated that real-time tsunami forecasting is now possible. NSF's new Network for Earthquake Engineering Simulation (NEES) program has agreed to work with the NTHMP to focus tsunami research on national needs. An overview of the NTHMP will be given, including a discussion of accomplishments and a progress report on NEES and FIRM activities.

  1. Potentially Hazardous Objects (PHO) Mitigation Program

    NASA Astrophysics Data System (ADS)

    Huebner, Walter

Southwest Research Institute (SwRI) and its partner, Los Alamos National Laboratory (LANL), are prepared to develop, implement, and expand procedures to avert collisions of potentially hazardous objects (PHOs) with Earth, as recommended by NASA in its White Paper "Near-Earth Object Survey and Deflection Analysis of Alternatives," requested by the US Congress and submitted to it in March 2007. In addition to developing the general mitigation program as outlined in the NASA White Paper, the program will be expanded to include aggressive mitigation procedures for small (e.g., Tunguska-sized) PHOs and other short-warning-time PHOs, such as some long-period comet nuclei. As a first step, the program will concentrate on the most likely and critical cases, namely small objects and long-period comet nuclei with short warning times, without losing sight of objects with longer warning times. Objects smaller than a few hundred meters are of interest because they are about 1000 times more abundant than kilometer-sized objects and are fainter and more difficult to detect, which may lead to short warning times and hence short reaction times. Yet even these small PHOs can have devastating effects, as the 30 June 1908 Tunguska event has shown. In addition, long-period comets, although relatively rare, can be large (sometimes tens of kilometers in size) and cannot be predicted because of their long orbital periods. Comet C/1983 H1 (IRAS-Araki-Alcock), for example, has an orbital period of 963.22 years, was discovered on 27 April 1983, and passed Earth only two weeks later, on 11 May 1983, at a distance of 0.0312 AU. Aggressive methods and continuous alertness will be needed to defend against objects with such short warning times. While intact deflection of a PHO remains a key objective, destruction of a PHO and dispersion of the pieces must also be considered.
The effectiveness of several alternative methods including nuclear demolition munitions, conventional explosives, and hyper-velocity impacts will be investigated and compared. This comparison is important for technical as well as political reasons, both domestic and international. The long-range plan includes evaluation of technical readiness including launch capabilities, tests for effectiveness using materials simulating PHOs, and building and testing several modular systems appropriate for alternative applications depending on the type of PHO.
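For intuition about why warning time drives the choice of method, a crude kinematic estimate helps: a velocity change applied t seconds before impact shifts the arrival point by roughly delta-v times t, so deflecting by one Earth radius needs delta-v of about R_Earth/t. This back-of-envelope rule ignores orbital amplification effects and is not taken from the SwRI/LANL program:

```python
EARTH_RADIUS_M = 6.371e6

def required_delta_v(warning_time_years, miss_distance_m=EARTH_RADIUS_M):
    """Order-of-magnitude deflection estimate: delta-v ~ miss_distance / t.
    Ignores orbital mechanics (which can amplify along-track nudges);
    illustrative only."""
    seconds = warning_time_years * 365.25 * 24 * 3600
    return miss_distance_m / seconds

# A decade of warning needs only ~2 cm/s; a two-week warning
# (an IRAS-Araki-Alcock-like case) needs several m/s, hundreds of
# times more, which is why short-warning PHOs demand aggressive methods.
dv_decade = required_delta_v(10.0)
dv_fortnight = required_delta_v(14 / 365.25)
```

The steep growth of required delta-v as warning time shrinks is the quantitative reason the program pairs gentle deflection for long-warning objects with destructive options for short-warning ones.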

  2. Influence of behavioral biases on the assessment of multi-hazard risks and the implementation of multi-hazard risks mitigation measures: case study of multi-hazard cyclone shelters in Tamil Nadu, India

    NASA Astrophysics Data System (ADS)

    Komendantova, Nadejda; Patt, Anthony

    2013-04-01

In December 2004, a multiple-hazard event devastated the Tamil Nadu province of India. The Sumatra-Andaman earthquake, with a magnitude of Mw = 9.1-9.3, caused the Indian Ocean tsunami, with wave heights up to 30 m and flooding that reached up to two kilometers inland in some locations. More than 7,790 persons were killed in the province of Tamil Nadu, 206 of them in its capital, Chennai. The time lag between the earthquake and the tsunami's arrival in India was over an hour; therefore, had a suitable early warning system, a proper means of communicating the warning, and shelters for people existed, several thousand human lives could have been saved, even though the destruction of infrastructure would not have been prevented. India has over forty years of experience in the construction of cyclone shelters. With additional effort and investment, these shelters could be adapted to other types of hazards, such as tsunamis and flooding, alongside the construction of new multi-hazard cyclone shelters (MPCS). It would therefore be possible to mitigate one hazard, such as cyclones, by constructing a network of shelters while, with some additional investment, adapting those shelters to also deal with, for example, tsunamis. In this historical case, the failure to consider multiple hazards caused significant human losses. The current paper investigates the patterns of the national decision-making process with regard to multiple-hazard mitigation measures, and how behavioral and cognitive biases influenced national decision-makers' perceptions of the probabilities of multiple hazards and the choices made for their mitigation. Our methodology was based on the analysis of existing reports from national and international organizations as well as the available scientific literature on behavioral economics and natural hazards.
The results identified several biases in the national decision-making process when the construction of cyclone shelters was being undertaken. The availability heuristic caused a perception of low probability of a tsunami following an earthquake, as the last large similar event had happened over a hundred years earlier. Another bias led to decisions being taken on the basis of experience rather than statistical evidence: experience showed that the so-called "Ring of Fire" generates undersea earthquakes and tsunamis in the Pacific Ocean, and this knowledge led decision-makers to neglect numerical estimates of the probability of an undersea earthquake in the Indian Ocean, even though seismologists were warning of exactly such an event. The bounded rationality bias led to misperception of signals from the early warning center in the Pacific Ocean. The resulting limited concern produced risk mitigation measures that addressed cyclone risk but paid much less attention to tsunami risk. Under loss aversion, the decision-makers perceived the losses connected with the necessary additional investment as greater than the benefits of mitigating a less probable hazard.

  3. 76 FR 61070 - Disaster Assistance; Hazard Mitigation Grant Program

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-10-03

    ...On May 1, 1998, the Federal Emergency Management Agency (FEMA) published a Notice of Proposed Rulemaking (NPRM) to revise the categories of projects eligible for funding under the Hazard Mitigation Grant Program (HMGP). The NPRM proposed to define eligible mitigation activities under the HMGP to include minor flood control projects that do not duplicate the efforts and authorities of other...

  4. Destructive Interactions Between Mitigation Strategies and the Causes of Unexpected Failures in Natural Hazard Mitigation Systems

    NASA Astrophysics Data System (ADS)

    Day, S. J.; Fearnley, C. J.

    2013-12-01

Large investments in the mitigation of natural hazards, using a variety of technology-based mitigation strategies, have proven surprisingly ineffective in some recent natural disasters. These failures reveal a need for a systematic classification of mitigation strategies; an understanding of the scientific uncertainties that affect the effectiveness of such strategies; and an understanding of how the different types of strategy within an overall mitigation system can interact destructively to reduce the effectiveness of the overall system. We classify mitigation strategies as permanent, responsive, or anticipatory. Permanent mitigation strategies, such as flood and tsunami defenses or land use restrictions, are both costly and 'brittle': when they malfunction they can increase mortality. Such strategies depend critically on the accuracy of the estimates of expected hazard intensity in the hazard assessments that underpin their design. Responsive mitigation strategies, such as tsunami and lahar warning systems, rely on capacities to detect and quantify the hazard source events and to transmit warnings fast enough to enable at-risk populations to decide and act effectively. Self-warning and voluntary evacuation is also usually a responsive mitigation strategy. Uncertainty in the nature and magnitude of the detected hazard source event is often the key scientific obstacle to responsive mitigation; public understanding of both the hazard and the warnings, needed to enable decision making, can also be a critical obstacle. Anticipatory mitigation strategies use interpretation of precursors to hazard source events and are used widely in the mitigation of volcanic hazards. Their critical limitations are due to uncertainties in the time, space and magnitude relationships between precursors and hazard events.
Examples of destructive interaction between different mitigation strategies are provided by the Tohoku 2011 earthquake and tsunami; recent earthquakes that have impacted population centers with poor enforcement of building codes, unrealistic expectations of warning systems or failures to understand local seismic damage mechanisms; and the interaction of land use restriction strategies and responsive warning strategies around lahar-prone volcanoes. A more complete understanding of the interactions between these different types of mitigation strategy, especially the consequences for the expectations and behaviors of the populations at risk, requires models of decision-making under high levels of both uncertainty and danger. The Observation-Orientation-Decision-Action (OODA) loop model (Boyd, 1987) may be a particularly useful model. It emphasizes the importance of 'orientation' (the interpretation of observations and assessment of their significance for the observer and decision-maker), the feedback between decisions and subsequent observations and orientations, and the importance of developing mitigation strategies that are flexible and so able to respond to the occurrence of the unexpected. REFERENCE: Boyd, J.R. A Discourse on Winning and Losing [http://dnipogo.org/john-r-boyd/]

  5. Space options for tropical cyclone hazard mitigation

    NASA Astrophysics Data System (ADS)

    Dicaire, Isabelle; Nakamura, Ryoko; Arikawa, Yoshihisa; Okada, Kazuyuki; Itahashi, Takamasa; Summerer, Leopold

    2015-02-01

This paper investigates potential space options for mitigating the impact of tropical cyclones on cities and civilians. Ground-based techniques combined with space-based remote sensing instrumentation are presented, together with space-borne concepts employing space solar power technology. Two space-borne mitigation options are considered: atmospheric warming based on microwave irradiation, and laser-induced cloud seeding based on laser power transfer. Finally, technology roadmaps dedicated to the space-borne options are presented, including a detailed discussion of the technological viability and technology readiness level of our proposed systems. Based on these assessments, the space-borne cyclone mitigation options presented in this paper could be established within a quarter of a century.

  6. Volcanic hazards and their mitigation: progress and problems

    USGS Publications Warehouse

    Tilling, R.I.

    1989-01-01

A review of hazard mitigation approaches and techniques indicates that significant advances have been made in hazard assessment, volcano monitoring, and eruption forecasting. For example, the remarkable accuracy of the predictions of dome-building events at Mount St. Helens since June 1980 is unprecedented. Yet a predictive capability for more voluminous and explosive eruptions still has not been achieved. Studies of magma-induced seismicity and ground deformation continue to provide the most systematic and reliable data for early detection of precursors to eruptions and shallow intrusions. In addition, some other geophysical monitoring techniques and geochemical methods have been refined and are being more widely applied and tested. Comparison of the four major volcanic disasters of the 1980s (Mount St. Helens, U.S.A. (1980); El Chichon, Mexico (1982); Galunggung, Indonesia (1982); and Nevado del Ruiz, Colombia (1985)) illustrates the importance of predisaster geoscience studies, volcanic hazard assessments, volcano monitoring, contingency planning, and effective communications between scientists and authorities. -from Author

  7. Rockslide susceptibility and hazard assessment for mitigation works design along vertical rocky cliffs: workflow proposal based on a real case-study conducted in Sacco (Campania), Italy

    NASA Astrophysics Data System (ADS)

    Pignalosa, Antonio; Di Crescenzo, Giuseppe; Marino, Ermanno; Terracciano, Rosario; Santo, Antonio

    2015-04-01

The work presented here concerns a case study in which a complete multidisciplinary workflow has been applied for an extensive assessment of rockslide susceptibility and hazard in a common scenario: vertical and fractured rocky cliffs. The studied area is located in a high-relief zone in Southern Italy (Sacco, Salerno, Campania), characterized by wide vertical rocky cliffs formed by tectonized thick successions of shallow-water limestones. The study comprised the following phases: a) topographic surveying integrating 3D laser scanning, photogrammetry and GNSS; b) geological surveying, characterization of single instabilities and geomechanical surveying, conducted by geologist rock climbers; c) processing of 3D data and reconstruction of high-resolution geometrical models; d) structural and geomechanical analyses; e) data filing in a GIS-based spatial database; f) geo-statistical and spatial analyses and mapping of the whole set of data; g) 3D rockfall analysis. The main goals of the study have been a) to set up an investigation method to achieve a complete and thorough characterization of the slope stability conditions and b) to provide a detailed base for an accurate definition of the reinforcement and mitigation systems. For these purposes, the most up-to-date methods of field surveying, remote sensing, 3D modelling and geospatial data analysis have been integrated in a systematic workflow, accounting for the economic sustainability of the whole project. A novel integrated approach has been applied, fusing deterministic and statistical surveying methods. This approach made it possible to deal with the wide extension of the studied area (nearly 200,000 m2) without compromising the accuracy of the results. The deterministic phase, based on field characterization of single instabilities and their further analysis on 3D models, has been applied to delineate the peculiarities of each single feature.
The statistical approach, based on geostructural field mapping and on point geomechanical data from scan-line surveying, allowed partitioning of the rock mass into homogeneous geomechanical sectors and data interpolation through bounded geostatistical analyses on 3D models. All data resulting from both approaches have been referenced and filed in a single spatial database and considered in global geo-statistical analyses to derive a fully modelled and comprehensive evaluation of rockslide susceptibility. The described workflow yielded the following innovative results: a) a detailed census of single potential instabilities, through a spatial database recording their geometrical, geological and mechanical features, along with the expected failure modes; b) a high-resolution characterization of the rockslide susceptibility of the whole slope, based on partitioning of the area according to the stability and mechanical conditions, which can be directly related to specific hazard mitigation systems; c) the exact extension of the area exposed to rockslide hazard, along with the dynamic parameters of the expected phenomena; d) an intervention design for hazard mitigation.

  8. Mitigation of earthquake hazards using seismic base isolation systems

    SciTech Connect

    Wang, C.Y.

    1994-06-01

    This paper deals with mitigation of earthquake hazards using seismic base-isolation systems. A numerical algorithm is described for system response analysis of isolated structures with laminated elastomer bearings. The focus of this paper is on the adaptation of a nonlinear constitutive equation for the isolation bearing, and the treatment of foundation embedment for the soil-structure-interaction analysis. Sample problems are presented to illustrate the mitigating effect of using base-isolation systems.

  9. Mitigation options for accidental releases of hazardous gases

    SciTech Connect

    Fthenakis, V.M.

    1995-05-01

    The objective of this paper is to review and compare technologies available for mitigation of unconfined releases of toxic and flammable gases. These technologies include: secondary confinement, deinventory, vapor barriers, foam spraying, and water sprays/monitors. Guidelines for the design and/or operation of effective post-release mitigation systems and case studies involving actual industrial mitigation systems are also presented.

  10. Rainfall-triggered landslides, anthropogenic hazards, and mitigation strategies

    USGS Publications Warehouse

    Larsen, M.C.

    2008-01-01

Rainfall-triggered landslides are part of a natural process of hillslope erosion that can result in catastrophic loss of life and extensive property damage in mountainous, densely populated areas. As global population expansion on or near steep hillslopes continues, the human and economic costs associated with landslides will increase. Landslide hazard mitigation strategies generally involve hazard assessment mapping, warning systems, control structures, and regional landslide planning and policy development. To be sustainable, hazard mitigation requires that management of natural resources be closely connected to local economic and social interests. A successful strategy depends on a combination of multi-disciplinary scientific and engineering approaches and the political will to take action at scales from the local community to the nation.

  11. New Approaches to Tsunami Hazard Mitigation Demonstrated in Oregon

    NASA Astrophysics Data System (ADS)

    Priest, G. R.; Rizzo, A.; Madin, I.; Lyles Smith, R.; Stimely, L.

    2012-12-01

Oregon Department of Geology and Mineral Industries and Oregon Emergency Management collaborated over the last four years to increase tsunami preparedness for residents of and visitors to the Oregon coast. Utilizing support from the National Tsunami Hazards Mitigation Program (NTHMP), new approaches to outreach and tsunami hazard assessment were developed and then applied. Hazard assessment was approached by first doing two pilot studies aimed at calibrating theoretical models to direct observations of tsunami inundation gleaned from historical and prehistoric (paleoseismic/paleotsunami) data. The results of these studies were then submitted to peer-reviewed journals and translated into 1:10,000-12,000-scale inundation maps. The inundation maps utilize a powerful new tsunami model, SELFE, developed by Joseph Zhang at the Oregon Health & Science University. SELFE uses unstructured computational grids and parallel processing techniques to achieve fast, accurate simulation of tsunami interactions with fine-scale coastal morphology. The inundation maps were simplified into tsunami evacuation zones accessed as map brochures and an interactive mapping portal at http://www.oregongeology.org/tsuclearinghouse/. Unique in the world are new evacuation maps that show separate evacuation zones for distant versus locally generated tsunamis. The brochure maps explain that evacuation time is four hours or more for distant tsunamis but 15-20 minutes for local tsunamis, which are invariably accompanied by strong ground shaking. Since distant tsunamis occur much more frequently than local tsunamis, the two-zone maps avoid the needless over-evacuation (and expense) caused by one-zone maps. Inundation mapping for the entire Oregon coast will be complete by ~2014. Educational outreach was accomplished first by doing a pilot study to measure the effectiveness of various approaches using before-and-after polling, and then applying the most effective methods.
In descending order, the most effective methods were: (1) door-to-door (person-to-person) education, (2) evacuation drills, (3) outreach to K-12 schools, (4) media events, and (5) workshops targeted to key audiences (lodging facilities, teachers, and local officials). Community organizers were hired to apply these five methods to clusters of small communities, measuring performance by before and after polling. Organizers were encouraged to approach the top priority, person-to-person education, by developing Community Emergency Response Teams (CERT) or CERT-like organizations in each community, thereby leaving behind a functioning volunteer-based group that will continue the outreach program and build long term resiliency. One of the most effective person-to-person educational tools was the Map Your Neighborhood program that brings people together so they can sketch the basic layout of their neighborhoods to depict key earthquake and tsunami hazards and mitigation solutions. The various person-to-person volunteer efforts and supporting outreach activities are knitting communities together and creating a permanent culture of tsunami and earthquake preparedness. All major Oregon coastal population centers will have been covered by this intensive outreach program by ~2014.

  12. Collaborative Monitoring and Hazard Mitigation at Fuego Volcano, Guatemala

    NASA Astrophysics Data System (ADS)

    Lyons, J. J.; Bluth, G. J.; Rose, W. I.; Patrick, M.; Johnson, J. B.; Stix, J.

    2007-05-01

A portable, digital sensor network has been installed to closely monitor changing activity at Fuego volcano, taking advantage of an international collaborative effort among Guatemalan, U.S. and Canadian universities and the Peace Corps. The goal of this effort is to improve the understanding of shallow internal processes and, consequently, to mitigate volcanic hazards more effectively. Fuego volcano has had more than 60 historical eruptions, and its nearly continuous activity makes it an ideal laboratory for studying volcanic processes. Close monitoring is needed to identify baseline activity and to rapidly identify and disseminate changes in activity that might threaten nearby communities. The sensor network comprises a miniature DOAS ultraviolet spectrometer fitted with a system for automated plume scans, a digital video camera, and two seismo-acoustic stations with portable dataloggers. These sensors are on loan from scientists who visited Fuego during short field seasons and donated the use of their sensors to a resident Peace Corps Masters International student from Michigan Technological University for extended data collection. The sensor network is based around the local volcano observatory maintained by the Instituto Nacional de Sismologia, Vulcanologia, Meteorologia e Hidrologia (INSIVUMEH). INSIVUMEH provides local support and historical knowledge of Fuego's activity, as well as a secure location for storage of scientific equipment, data processing, and charging of the batteries that power the sensors. The complete sensor network came online in mid-February 2007, and here we present preliminary results from concurrent gas, seismic, and acoustic monitoring of activity at Fuego volcano.

  13. GO/NO-GO - When is medical hazard mitigation acceptable for launch?

    NASA Technical Reports Server (NTRS)

    Hamilton, Douglas R.; Polk, James D.

    2005-01-01

Medical support of spaceflight missions is composed of complex tasks and decisions dedicated to maintaining the health and performance of the crew and the completion of mission objectives. Spacecraft are among the most complex vehicles built by humans and are built to very rigorous design specifications. In the course of a Flight Readiness Review (FRR) or a mission itself, the flight surgeon must be able to understand the impact of hazards and risks that may not be completely mitigated by design alone. Some hazards are not mitigated because they are never actually identified. When a hazard is identified, it must be reduced or waived. Hazards that cannot be designed out of the vehicle or mission are usually mitigated through other means to bring the residual risk to an acceptable level. This is possible in most engineered systems because failure modes are usually predictable and analysis can include taking these systems to failure. Medical support of space missions is complicated by the inability of flight surgeons to provide "exact" hazard and risk numbers to the NASA engineering community. Taking humans to failure is not an option. Furthermore, medical dogma consists mostly of "medical prevention" strategies that mitigate risk by examining the behaviour of a cohort of humans similar to astronauts. Unfortunately, this approach does not lend itself well to predicting the effect of a hazard in the unique environment of space. This presentation will discuss how Medical Operations uses an evidence-based approach to decide whether hazard mitigation strategies are adequate to reduce mission risk to acceptable levels. Case studies to be discussed will include: 1. Risk of electrocution during EVA; 2. Risk of a cardiac event during long- and short-duration missions; 3. Degraded cabin environmental monitoring on the ISS. Learning Objectives: 1.)
The audience will understand the challenges of mitigating medical risk caused by nominal and off-nominal mission events. 2.) The audience will understand the process by which medical hazards are identified and mitigated before launch. 3.) The audience will understand the roles and responsibilities of all the other flight control positions in participating in the process of reducing hazards and reducing medical risk to an acceptable level.

  14. Composite Materials for Hazard Mitigation of Reactive Metal Hydrides.

    SciTech Connect

    Pratt, Joseph William; Cordaro, Joseph Gabriel; Sartor, George B.; Dedrick, Daniel E.; Reeder, Craig L.

    2012-02-01

    In an attempt to mitigate the hazards associated with storing large quantities of reactive metal hydrides, polymer composite materials were synthesized and tested under simulated usage and accident conditions. The composites were made by polymerizing vinyl monomers using free-radical polymerization chemistry, in the presence of the metal hydride. Composites with vinyl-containing siloxane oligomers were also polymerized with and without added styrene and divinyl benzene. Hydrogen capacity measurements revealed that addition of the polymer to the metal hydride reduced the inherent hydrogen storage capacity of the material. The composites were found to be initially effective at reducing the amount of heat released during oxidation. However, upon cycling the composites, the mitigating behavior was lost. While the polymer composites we investigated have mitigating potential and are physically robust, they undergo a chemical change upon cycling that makes them subsequently ineffective at mitigating heat release upon oxidation of the metal hydride. Acknowledgements The authors would like to thank the following people who participated in this project: Ned Stetson (U.S. Department of Energy) for sponsorship and support of the project. Ken Stewart (Sandia) for building the flow-through calorimeter and cycling test stations. Isidro Ruvalcaba, Jr. (Sandia) for qualitative experiments on the interaction of sodium alanate with water. Terry Johnson (Sandia) for sharing his expertise and knowledge of metal hydrides, and sodium alanate in particular. Marcina Moreno (Sandia) for programmatic assistance. John Khalil (United Technologies Research Corp) for insight into the hazards of reactive metal hydrides and real-world accident scenario experiments. 
Summary In an attempt to mitigate and/or manage hazards associated with storing bulk quantities of reactive metal hydrides, polymer composite materials (a mixture of a mitigating polymer and a metal hydride) were synthesized and tested under simulated usage and accident conditions. The focus of this work has been mitigating the hazards associated with reactive metal hydrides during an accident while keeping the original capability of the active material intact during normal use. These composites were made by polymerizing vinyl monomers using free-radical polymerization chemistry in the presence of the metal hydride, in this case a prepared sodium alanate (chosen as a representative reactive metal hydride). It was found that the polymerization of styrene and divinyl benzene could be initiated using AIBN in toluene at 70 °C. The resulting composite materials can be either hard or brittle solids depending on the cross-linking density. The thermal decomposition temperature of these styrene-based composite materials is lower than that of neat polystyrene, indicating that the chemical nature of the polymer is affected by the formation of the composite. The char-forming tendency of cross-linked polystyrene is low, and it is therefore not an ideal polymer for hazard mitigation. To obtain composite materials containing a polymer with higher char-forming potential, siloxane-based monomers were investigated. Four vinyl-containing siloxane oligomers were polymerized with and without added styrene and divinyl benzene. Like the styrene materials, these composite materials exhibited thermal decomposition behavior significantly different from that of the neat polymers. Specifically, the thermal decomposition temperature was shifted approximately 100 °C lower than that of the neat polymer, signifying a major chemical change to the polymer network. Thermal analysis of the cycled samples was performed on the siloxane-based composite materials.
It was found that after 30 cycles the siloxane-containing polymer composite material has TGA/DSC-MS traces similar to those of the virgin composite material, indicating that the polymer remains physically intact upon cycling. Hydrogen capacity measurements revealed that addition of the polymer to the metal hydride in the form of a composite material reduced the inherent hydrogen storage capacity of the material. This reduction in capacity was observed to be independent of the number of charge/discharge cycles, except for the composites containing siloxane, which showed less of an impact on hydrogen storage capacity as they were cycled further. While the reason for this is not clear, it may be due to a chemically stabilizing effect of the siloxane on the metal hydride. Flow-through calorimetry was used to characterize the mitigating effectiveness of the different composites relative to the neat (no polymer) material. The composites were found to be initially effective at reducing the amount of heat released during oxidation, and the best-performing material was the siloxane-containing composite, which reduced the heat release to less than 50% of the value of the neat material. However, upon cycling the composites, all mitigating behavior was lost. The combined results of the flow-through calorimetry, hydrogen capacity, and thermogravimetric analysis tests lead to the proposed conclusion that while the polymer composites have mitigating potential and are physically robust under cycling, they undergo a chemical change upon cycling that makes them ineffective at mitigating heat release upon oxidation of the metal hydride.

  15. An early warning system for marine storm hazard mitigation

    NASA Astrophysics Data System (ADS)

    Vousdoukas, M. I.; Almeida, L. P.; Pacheco, A.; Ferreira, O.

    2012-04-01

The present contribution presents efforts towards the development of an operational Early Warning System for storm hazard prediction and mitigation. The system consists of a nested model train of specially calibrated Wave Watch III, SWAN and XBeach models. The numerical simulations provide daily forecasts of the hydrodynamic conditions, morphological change and overtopping risk at the area of interest. The model predictions are processed by a 'translation' module based on site-specific Storm Impact Indicators (SIIs) (Ciavola et al., 2011, Storm impacts along European coastlines. Part 2: lessons learned from the MICORE project, Environmental Science & Policy, Vol 14), and warnings are issued when pre-defined threshold values are exceeded. For the present site the selected SIIs were (i) the maximum wave run-up height during the simulations and (ii) the dune-foot horizontal retreat at the end of the simulations. Both the SIIs and their pre-defined thresholds were carefully selected on the grounds of existing experience and field data. Four risk levels were considered, each associated with an intervention approach recommended to the responsible coastal protection authority. Regular updating of the topography/bathymetry is critical for the performance of the storm impact forecasting, especially when there are significant morphological changes. The system can be extended to other critical problems, such as the implications of global warming and adaptive management strategies, while the approach presently followed, from model calibration to the early warning system for storm hazard mitigation, can be applied to other sites worldwide with minor adaptations.
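The 'translation' step described in this abstract, comparing forecast SII values against pre-defined thresholds and issuing one of four warning levels, can be sketched as follows. This is a minimal illustration only: the SII names, threshold values, and level labels below are hypothetical, not those used in the actual system.

```python
# Four warning levels, in increasing severity (labels are illustrative).
RISK_LEVELS = ["low", "moderate", "high", "extreme"]

# Hypothetical ascending thresholds for each Storm Impact Indicator;
# crossing the n-th threshold raises the level to RISK_LEVELS[n].
SII_THRESHOLDS = {
    "runup_height_m": [1.0, 2.0, 3.0],   # max wave run-up during simulation
    "dune_retreat_m": [0.5, 2.0, 5.0],   # dune-foot horizontal retreat
}

def risk_level(sii: str, value: float) -> str:
    """Warning level for one SII: the highest level whose threshold
    the forecast value exceeds (lowest level if none are exceeded)."""
    exceeded = sum(value > t for t in SII_THRESHOLDS[sii])
    return RISK_LEVELS[exceeded]

def overall_warning(forecast: dict) -> str:
    """Issue the most severe level reached across all SIIs."""
    levels = [RISK_LEVELS.index(risk_level(k, v)) for k, v in forecast.items()]
    return RISK_LEVELS[max(levels)]

# Example daily forecast: high run-up but little dune retreat.
print(overall_warning({"runup_height_m": 2.4, "dune_retreat_m": 0.3}))
```

Taking the maximum across indicators reflects the conservative choice that any single SII crossing a threshold is enough to escalate the recommended intervention.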

  16. 77 FR 24505 - Hazard Mitigation Assistance for Wind Retrofit Projects for Existing Residential Buildings

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-04-24

    ... SECURITY Federal Emergency Management Agency Hazard Mitigation Assistance for Wind Retrofit Projects for... comments on Hazard Mitigation Assistance for Wind Retrofit Projects for Existing Residential Buildings... property from hazards and their effects. One such activity is the implementation of wind retrofit...

  17. Deterministic and Nondeterministic Behavior of Earthquakes and Hazard Mitigation Strategy

    NASA Astrophysics Data System (ADS)

    Kanamori, H.

    2014-12-01

Earthquakes exhibit both deterministic and nondeterministic behavior. Deterministic behavior is controlled by length and time scales, such as the dimensions of seismogenic zones and plate-motion speed. Nondeterministic behavior is controlled by the interaction of many elements, such as asperities, in the system. Some subduction zones have strong deterministic elements which allow forecasts of future seismicity. For example, the forecasts of the 2010 Mw=8.8 Maule, Chile, earthquake and the 2012 Mw=7.6 Costa Rica earthquake are good examples in which useful forecasts were made within a solid scientific framework using GPS. However, even in these cases, uncertainties are difficult to quantify because of the nondeterministic elements. In some subduction zones, nondeterministic behavior dominates because of complex plate boundary structures and defies useful forecasts. The 2011 Mw=9.0 Tohoku-Oki earthquake may be an example in which the physical framework was reasonably well understood, but complex interactions of asperities and insufficient knowledge about the subduction-zone structures led to the unexpected tragic consequence. Despite these difficulties, broadband seismology, GPS, and rapid data-processing and telemetry technology can contribute to effective hazard mitigation through a scenario-earthquake approach and real-time warning. A scale-independent relation between M0 (seismic moment) and the source duration, t, can be used for the design of average scenario earthquakes. However, outliers caused by variations in stress drop, radiation efficiency, and the aspect ratio of the rupture plane are often the most hazardous and need to be included in scenario earthquakes. Recent developments in real-time technology will help seismologists cope with, and prepare for, devastating tsunamis and earthquakes. Combining a better understanding of earthquake diversity with modern technology is the key to effective and comprehensive hazard mitigation practices.
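The scale-independent moment-duration relation mentioned in this abstract can be illustrated with a short sketch. It assumes the standard moment-magnitude conversion log10(M0) = 1.5*Mw + 9.1 (M0 in N*m); the cube-root duration scaling and its prefactor C are assumed illustrative choices, tuned only to give plausible durations, and are not taken from the abstract.

```python
def moment_from_mw(mw: float) -> float:
    """Seismic moment M0 (N*m) from moment magnitude Mw,
    via the standard relation log10(M0) = 1.5*Mw + 9.1."""
    return 10.0 ** (1.5 * mw + 9.1)

def source_duration_s(m0: float, c: float = 5e-6) -> float:
    """Illustrative scale-independent duration estimate t = C * M0**(1/3).
    The prefactor c is an assumed round value, chosen here only so the
    printed durations look plausible; it is not from the abstract."""
    return c * m0 ** (1.0 / 3.0)

# The three events mentioned in the abstract:
for name, mw in [("Maule 2010", 8.8), ("Costa Rica 2012", 7.6),
                 ("Tohoku-Oki 2011", 9.0)]:
    m0 = moment_from_mw(mw)
    print(f"{name}: Mw {mw}, M0 ~ {m0:.1e} N*m, t ~ {source_duration_s(m0):.0f} s")
```

Because t grows as the cube root of M0, a unit increase in Mw (a factor of ~31.6 in M0) lengthens the expected duration by only a factor of ~3.2, which is why a single scaling law can serve for "average" scenario earthquakes while outliers in stress drop still need separate treatment.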

  18. Meteorological Hazard Assessment and Risk Mitigation in Rwanda.

    NASA Astrophysics Data System (ADS)

    Nduwayezu, Emmanuel; Jaboyedoff, Michel; Bugnon, Pierre-Charles; Nsengiyumva, Jean-Baptiste; Horton, Pascal; Derron, Marc-Henri

    2015-04-01

Between 10 and 13 April 2012, heavy rains hit sectors adjacent to the Volcanoes National Park (Musanze District in the Northern Province and Nyabihu and Rubavu Districts in the Western Province of Rwanda), causing floods that affected about 11,000 persons. Flooding caused deaths and injuries among the affected population, and extensive damage to houses and properties: 348 houses were destroyed and 446 were partially damaged or remained underwater for several days. Families were forced to leave their flooded homes and seek temporary accommodation with their neighbors, often in overcrowded places. Along the northwestern border of Rwanda, the Virunga mountain range consists of six major volcanoes. Mount Karisimbi is the highest volcano at 4,507 m; the oldest is Mount Sabyinyo, which rises to 3,634 m. The hydraulic network in Musanze District is formed by temporary torrents and permanent watercourses. Torrents surge during strong storms, fed by water coming downhill from the volcanoes, some 20 km away. This area is periodically affected by flooding and landslides because of the heavy rain striking the Volcanoes National Park (Rwanda generally has two rainy seasons, February to April and September to November, and two dry seasons). Rain water carves large channels (in already known torrents or new ones) that impact communities, agricultural soils and crop yields. This project aims at identifying hazardous and risky areas by producing susceptibility maps for floods, debris flows and landslides over this sector. Susceptibility maps are being drawn using field observations made during and after the 2012 events, and an empirical model of propagation for regional susceptibility assessment of debris flows (Flow-R). Input data are 10 m and 30 m resolution DEMs, satellite images, the hydrographic network, and some information on the geological substratum and soil occupation.
Combining susceptibility maps with infrastructures, houses and population density maps will be used in identifying the most risky areas. Finally, based on practical experiences in this kind of field and produced documents some recommendations for low-cost mitigation measures will be proposed. Reference: MIDIMAR, Impacts of floods and landslides on socio-economic development profile. Case study: Musanze District. Kigali, June 2012.

  19. Tsunami hazard mitigation in tourism in the tropical and subtropical coastal areas: a case study in the Ryukyu Islands, southwest of Japan

    NASA Astrophysics Data System (ADS)

    Matsumoto, T.

    2006-12-01

    Life and the economy (including tourism) in tropical and subtropical coastal areas such as Okinawa Prefecture (Ryukyu) rely heavily on the sea. The sea has both a "gentle" side that gives people healing and a "dangerous" side that kills people. If we are going to use the sea for marine tourism, for example by constructing resort facilities on the oceanfront, we should understand both sides of it, especially the nature of tsunamis. We islanders should also issue accurate information about the sea to outsiders, especially tourists visiting the islands. We have already learned a lesson about this issue from the Sumatra tsunami in 2004. However, measures against tsunami disasters taken by the marine tourism industry are still inadequate in these areas. The goals of tsunami hazard mitigation for those engaged in the tourism industry in tropical and subtropical coastal areas should be as follows. (1) Preparedness against tsunamis: "Be aware of the characteristics of tsunamis." "Prepare for tsunamis when you feel an earthquake." "Prepare for tsunamis when an earthquake takes place somewhere in the world." (2) Maintenance of an accurate tsunami hazard map based on quantitative analyses of the characteristics of tsunamis: "Areas flooded by tsunami attacks depend not only on altitude but also on amplification and inundation due to the seafloor topography near the coast and the onland topographic relief." "Tsunami damage happens repeatedly." (3) Maintenance of a tsunami disaster prevention manual, and training based on the manual: "Who should do what in case of a tsunami?" "How should resort hotel employees lead guests to a safe place?" Such a policy for disaster prevention is discussed in the general education class "Ocean Sciences" at the University of the Ryukyus (UR) and in a summer school for high school students. 
The students (most of them from Okinawa Prefecture) consider, discuss and write reports about what to do in case of a tsunami as islanders. Students of the Department of Tourism Sciences (DTS) in particular are keen on the discussion and produce excellent reports and proposals. The author introduces some of these in the presentation.

  20. Mitigating mountain hazards in Austria - legislation, risk transfer, and awareness building

    NASA Astrophysics Data System (ADS)

    Holub, M.; Fuchs, S.

    2009-04-01

    Embedded in the overall concept of integral risk management, the mitigation of mountain hazards rests on land use regulations, risk transfer, and information. In this paper, aspects of legislation related to natural hazards in Austria are summarised, with a particular focus on spatial planning activities and hazard mapping, and possible adaptations aimed at enhanced resilience are outlined. Furthermore, the system of risk transfer is discussed, highlighting the importance of creating incentives for risk-aware behaviour, above all with respect to individual precaution and insurance solutions. The issue of creating awareness through information is therefore essential, and is presented subsequently. The study results in recommendations on how administrative units at different federal and local levels could strengthen the enforcement of regulations aimed at minimising natural hazard risk. Moreover, the nexus to risk transfer mechanisms is provided, focusing on the current compensation system in Austria and possible adjustments to provide economic incentives for (private) investments in mitigation measures, i.e. local structural protection. These incentives should be supported by delivering hazard and risk information, target-oriented, to every stakeholder involved. Coping strategies therefore have to be adjusted, and the interaction between prevention and precaution highlighted. The paper closes with recommendations on how these efforts could be achieved, with a particular focus on the situation in the Republic of Austria.

  1. Standards and Guidelines for Numerical Models for Tsunami Hazard Mitigation

    NASA Astrophysics Data System (ADS)

    Titov, V.; Gonzalez, F.; Kanoglu, U.; Yalciner, A.; Synolakis, C. E.

    2006-12-01

    An increasing number of nations around the world need to develop tsunami mitigation plans, which invariably involve inundation maps for warning guidance and evacuation planning. There is a risk that inundation maps may be produced with older or untested methodology, as there are currently no standards for modeling tools. In the aftermath of the 2004 megatsunami, some models were used to model inundation for Cascadia events with results much larger than sediment records and existing state-of-the-art studies suggest, leading to confusion among emergency managers. Incorrectly assessing tsunami impact is hazardous, as the 2006 events in Tonga, Kythira (Greece) and Central Java have suggested (Synolakis and Bernard, 2006). To calculate tsunami currents, forces and runup on coastal structures, and inundation of coastlines, one must calculate numerically the evolution of the tsunami wave from the deep ocean to its target site. No matter what the numerical model, validation (the process of ensuring that the model solves the parent equations of motion accurately) and verification (the process of ensuring that the model represents geophysical reality appropriately) are both essential. Validation ensures that the model performs well in a wide range of circumstances and is accomplished through comparison with analytical solutions. Verification ensures that the computational code performs well over a range of geophysical problems. A few analytic solutions have themselves been validated with laboratory data. Even fewer existing numerical models have been both validated with the analytical solutions and verified with laboratory and field measurements, thus establishing a gold standard for numerical codes for inundation mapping. 
While there is in principle no absolute certainty that a numerical code that has performed well in all the benchmark tests will also produce correct inundation predictions for any given source motions, validated codes reduce the level of uncertainty in their results to the uncertainty in the geophysical initial conditions. Further, when coupled with real-time free-field tsunami measurements from tsunameters, validated codes are the only choice for realistic forecasting of inundation; the consequences of failure are too ghastly to take chances with numerical procedures that have not been validated. We discuss a ten-step process of benchmark tests for models used for inundation mapping. The associated methodology and algorithms have first to be validated with analytical solutions, then verified with laboratory measurements and field data. The models need to be published in peer-reviewed journals indexed by ISI. While this process may appear onerous, it reflects our state of knowledge, and is the only defensible methodology when human lives are at stake. Synolakis, C.E., and Bernard, E.N., Tsunami science before and beyond Boxing Day 2004, Phil. Trans. R. Soc. A 364(1845), 2231-2263, 2005.
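The acceptance idea behind such benchmark tests (compare a code's predictions against an analytical reference and accept only a small misfit) can be illustrated with a toy check. This is a minimal sketch: the tolerance, sample values and the `passes_benchmark` helper are hypothetical, not part of any actual benchmark suite.

```python
import math

# Toy benchmark check: accept a model only if its normalized RMS misfit
# against an analytical reference solution is below a tolerance.
def passes_benchmark(model_vals, reference_vals, tol=0.05):
    """True if normalized RMS misfit between model and reference is <= tol."""
    n = len(reference_vals)
    rms_err = math.sqrt(sum((m - r) ** 2
                            for m, r in zip(model_vals, reference_vals)) / n)
    rms_ref = math.sqrt(sum(r ** 2 for r in reference_vals) / n)
    return rms_err / rms_ref <= tol

reference = [0.10, 0.22, 0.35, 0.41]   # hypothetical analytical runup (m)
model_ok = [0.10, 0.23, 0.34, 0.41]    # close to the reference
model_bad = [0.05, 0.40, 0.20, 0.60]   # far from the reference
print(passes_benchmark(model_ok, reference),
      passes_benchmark(model_bad, reference))  # True False
```

A real benchmark suite would repeat such comparisons across analytical, laboratory and field cases, with error metrics and tolerances specified per problem.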

  2. India's Initiative in Mitigating Tsunami and Storm Surge Hazard

    NASA Astrophysics Data System (ADS)

    Gupta, H.

    2008-12-01

    Soon after the most devastating tsunami, caused by the 26 December 2004 Sumatra earthquake, India took the initiative to set up an end-to-end system to mitigate tsunami and storm surge hazard. The system includes all the necessary elements: networking of seismic stations; deployment of ocean-bottom pressure recorders; real-time sea level monitoring stations; establishment of radar-based monitoring stations for real-time measurement of surface currents and waves; modeling of tsunamis and storm surges; generation of coastal inundation and vulnerability maps; operation of a tsunami and storm surge warning centre on a 24/7 basis; capacity building and training of all the stakeholders; and communication with the global community. This initiative was estimated to have a direct cost of US $30 million and was to be operative by August 2007; this has been achieved. The Indian National Centre for Ocean Information Services (INCOIS), belonging to the Ministry of Earth Sciences (MoES) and located at Hyderabad, is the nodal agency for this program. The system fared well during the tsunamigenic earthquakes of 12-13 September 2007. One remaining problem is the delay in estimating the size of large earthquakes; empirical approaches are being developed to quickly estimate the size of earthquakes occurring in the Sumatra-Andaman zone of tsunamigenic earthquakes.

  3. ANALYSIS AND MITIGATION OF X-RAY HAZARD GENERATED FROM HIGH INTENSITY LASER-TARGET INTERACTIONS

    SciTech Connect

    Qiu, R.; Liu, J.C.; Prinz, A.A.; Rokni, S.H.; Woods, M.; Xia, Z.; ,

    2011-03-21

    Interaction of a high-intensity laser with matter may generate an ionizing radiation hazard. However, very few studies have addressed this laser-induced radiation protection issue. This work reviews the available literature on the physics and characteristics of laser-induced X-ray hazards. Important aspects include the laser-to-electron energy conversion efficiency, the electron angular distribution, the electron energy spectrum and effective temperature, and bremsstrahlung production of X-rays in the target. The possible X-ray dose rates for several femtosecond Ti:sapphire laser systems used at SLAC, including the short-pulse laser system for the Matter in Extreme Conditions Instrument (peak power 4 TW and peak intensity 2.4 × 10^18 W/cm^2), were analysed. A graded approach to mitigating the laser-induced X-ray hazard with a combination of engineered and administrative controls is also proposed.

  4. Mitigation of unconfined releases of hazardous gases via liquid spraying

    SciTech Connect

    Fthenakis, V.M.

    1997-02-01

    The capability of water sprays to mitigate clouds of hydrofluoric acid (HF) has been demonstrated in the large-scale Goldfish and Hawk field experiments, which took place at the DOE Nevada Test Site. The effectiveness of water sprays and fire water monitors in removing HF from a vapor plume has also been studied theoretically using the HGSPRAY5 model, with near-field and far-field dispersion described by the HGSYSTEM models. This paper presents options for selecting and evaluating liquid spraying systems, based on industry experience and mathematical modeling.

  5. The seismic project of the National Tsunami Hazard Mitigation Program

    USGS Publications Warehouse

    Oppenheimer, D.H.; Bittenbinder, A.N.; Bogaert, B.M.; Buland, R.P.; Dietz, L.D.; Hansen, R.A.; Malone, S.D.; McCreery, C.S.; Sokolowski, T.J.; Whitmore, P.M.; Weaver, C.S.

    2005-01-01

    In 1997, the Federal Emergency Management Agency (FEMA), National Oceanic and Atmospheric Administration (NOAA), U.S. Geological Survey (USGS), and the five western States of Alaska, California, Hawaii, Oregon, and Washington joined in a partnership called the National Tsunami Hazard Mitigation Program (NTHMP) to enhance the quality and quantity of seismic data provided to the NOAA tsunami warning centers in Alaska and Hawaii. The NTHMP funded a seismic project that now provides the warning centers with real-time seismic data over dedicated communication links and the Internet from regional seismic networks monitoring earthquakes in the five western states, the U.S. National Seismic Network in Colorado, and from domestic and global seismic stations operated by other agencies. The goal of the project is to reduce the time needed to issue a tsunami warning by providing the warning centers with high-dynamic-range, broadband waveforms in near real time. An additional goal is to reduce the likelihood of issuing false tsunami warnings by rapidly providing to the warning centers parametric information on earthquakes that could indicate their tsunamigenic potential, such as hypocenters, magnitudes, moment tensors, and shake distribution maps. New or upgraded field instrumentation was installed over a 5-year period at 53 seismic stations in the five western states. Data from these instruments have been integrated into the seismic network using Earthworm software. This network has significantly reduced the time needed to respond to teleseismic and regional earthquakes. Notably, the West Coast/Alaska Tsunami Warning Center responded to the 28 February 2001 Mw 6.8 Nisqually earthquake beneath Olympia, Washington within 2 minutes, compared to an average response time of over 10 minutes for the previous 18 years. © Springer 2005.

  6. Local hazard mitigation plans: a preliminary estimation of state-level completion from 2004 to 2009.

    PubMed

    Jackman, Andrea M; Beruvides, Mario G

    2013-01-01

    According to the Disaster Mitigation Act of 2000 and subsequent federal policy, local governments are required to have a Hazard Mitigation Plan (HMP) written and approved by the Federal Emergency Management Agency (FEMA) to be eligible for federal mitigation assistance. This policy took effect on November 1, 2004. Using FEMA's database of approved HMPs and the US Census Bureau's 2002 Survey of Local Governments, it is estimated that 3 years after the original deadline, 67 percent of the country's active local governments were without an approved HMP. A follow-up examination in 2009 of the eight states with the lowest completion percentages did not indicate significant improvement following the initial study, and revealed inconsistencies in plan completion data over time. The completion percentage varied greatly by state and did not appear to follow any expected pattern, such as wealth or hazard vulnerability, that might encourage prompt completion of a plan. Further, the results indicate that 92 percent of the approved plans were completed by a multijurisdictional entity, which suggests single governments seldom complete and gain approval for plans. Based on these results, it is believed that state-level resolution is not adequate for explaining the variation in plan completion, and further study at the local level is warranted. PMID:24180092
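The completion estimate described above is essentially a set comparison between a roster of active local governments and the subset covered by approved plans. A minimal sketch, with entirely hypothetical government identifiers and counts:

```python
# Sketch: estimate the share of local governments without an approved HMP
# from two hypothetical datasets (all active governments vs. covered ones).
def completion_stats(all_governments, covered_governments):
    """Return (covered_count, total, percent_without_plan)."""
    total = len(all_governments)
    covered = len(set(all_governments) & set(covered_governments))
    pct_without = 100.0 * (total - covered) / total
    return covered, total, pct_without

# Hypothetical example: 12 active governments, 4 covered by approved plans.
govs = [f"gov{i}" for i in range(12)]
covered, total, pct = completion_stats(govs, ["gov0", "gov3", "gov7", "gov9"])
print(covered, total, round(pct, 1))  # 4 12 66.7
```

In practice the hard part is the record linkage between the two databases (matching jurisdiction names and multijurisdictional plan rosters), which the study notes was inconsistent over time.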

  7. 76 FR 23613 - Draft Programmatic Environmental Assessment for Hazard Mitigation Safe Room Construction

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-04-27

    ..., Regulations, & Policy Division, Office of Chief Counsel, Federal Emergency Management Agency, 500 C Street, SW... SECURITY Federal Emergency Management Agency Draft Programmatic Environmental Assessment for Hazard Mitigation Safe Room Construction AGENCY: Federal Emergency Management Agency, DHS. ACTION: Notice...

  8. Next-Generation GPS Station for Hazards Mitigation (Invited)

    NASA Astrophysics Data System (ADS)

    Bock, Y.

    2013-12-01

    Our objective is to better forecast, assess, and mitigate natural hazards, including earthquakes, tsunamis, and extreme storms and flooding through development and implementation of a modular technology for the next-generation in-situ geodetic station to support the flow of information from multiple stations to scientists, mission planners, decision makers, and first responders. The same technology developed under NASA funding can be applied to enhance monitoring of large engineering structures such as bridges, hospitals and other critical infrastructure. Meaningful warnings save lives when issued within 1-2 minutes for destructive earthquakes, several tens of minutes for tsunamis, and up to several hours for extreme storms and flooding, and can be provided by on-site fusion of multiple data types and generation of higher-order data products: GPS/GNSS and accelerometer measurements to estimate point displacements, and GPS/GNSS and meteorological measurements to estimate moisture variability in the free atmosphere. By operating semi-autonomously, each station can then provide low-latency, high-fidelity and compact data products within the constraints of narrow communications bandwidth that often accompanies natural disasters. We have developed a power-efficient, low-cost, plug-in Geodetic Module for fusion of data from in situ sensors including GPS, a strong-motion accelerometer module, and a meteorological sensor package, for deployment at existing continuous GPS stations in southern California; fifteen stations have already been upgraded. The low-cost modular design is scalable to the many existing continuous GPS stations worldwide. New on-the-fly data products are estimated with 1 mm precision and accuracy, including three-dimensional seismogeodetic displacements for earthquake, tsunami and structural monitoring and precipitable water for forecasting extreme weather events such as summer monsoons and atmospheric rivers experienced in California. 
Unlike more traditional approaches where data are collected and analyzed from a network of stations at a central processing facility, we are embedding these capabilities in the Geodetic Module's processor for in situ analysis and data delivery through TCP/IP to avoid single points of failure during emergencies. We are infusing our technology to several local and state groups, including the San Diego County Office of Emergency Services for earthquake and tsunami early warnings, UC San Diego Health Services for hospital monitoring and early warning, Caltrans for bridge monitoring, and NOAA's Weather Forecasting Offices in San Diego and Los Angeles Counties for forecasting extreme weather events. We describe our overall system and the ongoing efforts at technology infusion.

  9. Aligning Natural Resource Conservation and Flood Hazard Mitigation in California

    PubMed Central

    Calil, Juliano; Beck, Michael W.; Gleason, Mary; Merrifield, Matthew; Klausmeyer, Kirk; Newkirk, Sarah

    2015-01-01

    Flooding is the most common and damaging of all natural disasters in the United States, and was a factor in almost all declared disasters in U.S. history. Direct flood losses in the U.S. in 2011 totaled $8.41 billion, and flood damage has also been on the rise globally over the past century. The National Flood Insurance Program has paid out more than $38 billion in claims since its inception in 1968, more than a third of which has gone to the one percent of policies that experienced multiple losses and are classified as “repetitive loss.” During the same period, the loss of coastal wetlands and other natural habitat has continued, and funds for conservation and restoration of these habitats are very limited. This study demonstrates that flood losses could be mitigated through action that meets both flood risk reduction and conservation objectives. We found that there are at least 11,243 km2 of land in coastal California that is both flood-prone and has natural resource conservation value, where a property/structure buyout and habitat restoration project could meet multiple objectives. For example, our results show that in Sonoma County, the extent of land that meets these criteria is 564 km2. Further, we explore flood mitigation grant programs that can be a significant source of funds for such projects. We demonstrate that government-funded buyouts followed by restoration of targeted lands can support social, environmental, and economic objectives: reduction of flood exposure, restoration of natural resources, and efficient use of limited governmental funds. PMID:26200353
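Identifying land that satisfies both objectives amounts to intersecting two spatial layers. A minimal sketch on a hypothetical boolean raster; the grids, flags, cell size and the `dual_objective_area` helper are illustrative, not the study's actual data or method:

```python
# Sketch: area of cells that are both flood-prone and of conservation value,
# on a hypothetical raster where each cell covers a fixed area (1 km^2 here).
def dual_objective_area(flood_prone, conservation, cell_area_km2=1.0):
    """Area (km^2) of cells meeting both criteria; inputs are 2D 0/1 grids."""
    assert len(flood_prone) == len(conservation)
    area = 0.0
    for row_f, row_c in zip(flood_prone, conservation):
        for f, c in zip(row_f, row_c):
            if f and c:
                area += cell_area_km2
    return area

flood = [[1, 1, 0],
         [0, 1, 1]]
conserve = [[0, 1, 0],
            [1, 1, 0]]
print(dual_objective_area(flood, conserve))  # 2.0
```

A real analysis would perform this overlay on vector flood zone and habitat layers in a GIS, but the core operation is the same intersection.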

  10. Numerical and probabilistic analysis of asteroid and comet impact hazard mitigation

    SciTech Connect

    Plesko, Catherine S; Weaver, Robert P; Huebner, Walter F

    2010-09-09

    The possibility of asteroid and comet impacts on Earth has received significant recent media and scientific attention. Still, there are many outstanding questions about the correct response once a potentially hazardous object (PHO) is found. Nuclear munitions are often suggested as a deflection mechanism because they have a high internal energy per unit launch mass. However, major uncertainties remain about their use for hazard mitigation. There are large uncertainties in a PHO's physical response to a strong deflection or dispersion impulse like that delivered by nuclear munitions. Objects smaller than 100 m may be solid, and objects of all sizes may be 'rubble piles' with large porosities and little strength. Objects with these different properties would respond very differently, so the effects of object properties must be accounted for. Recent ground-based observations and missions to asteroids and comets have improved the planetary science community's understanding of these objects. Computational power and simulation capabilities have improved such that it is possible to numerically model the hazard mitigation problem from first principles. Before we can know whether explosive yield Y at height h or depth -h from the target surface will produce a momentum change in, or dispersion of, a PHO, we must quantify the energy deposition into the system of particles that makes up the PHO. Here we present the initial results of a parameter study in which we model the efficiency of energy deposition from a stand-off nuclear burst onto targets made of PHO constituent materials.

  11. Earthquake and Volcanic Hazard Mitigation and Capacity Building in Sub-Saharan Africa

    NASA Astrophysics Data System (ADS)

    Ayele, A.

    2012-04-01

    The East African Rift System (EARS) is a classic example of active continental rifting, and a natural laboratory in which to study the initiation and early-stage evolution of continental rifts. The EARS is at different stages of development, varying from a relatively mature rift in the Afar (16 mm/yr) to the weakly extended Okavango Delta in the south, with a predicted opening velocity < 3 mm/yr. Recent studies in the region have helped researchers constrain the length and timescales of magmatism and faulting, the partitioning of strain between faulting and magmatism, and their implications for the development of along-axis segmentation. Although human resources and instrument coverage are sparse on the continent, our understanding of rift processes and deep structure has improved in the last decade with the advent of space geodesy and broadband seismology. The recent major earthquakes, volcanic eruptions and mega dike intrusions that occurred along the EARS attracted several earth scientist teams from across the globe. However, most African countries traversed by the rift do not have the full capacity to monitor and mitigate earthquake and volcanic hazards. Few monitoring facilities exist in some countries, and the data acquisition is rarely available in real time for mitigation purposes. Many sub-Saharan African governments are currently focused on achieving the millennium development goals through massive infrastructure development and urbanization, while impending natural hazards of this nature are severely overlooked. Collaborations with overseas researchers and other joint efforts by the international community are opportunities that African institutions should use to make the best of limited resources and to mitigate earthquake and volcano hazards.

  12. Evaluating fuel complexes for fire hazard mitigation planning in the southeastern United States.

    SciTech Connect

    Andreu, Anne G.; Shea, Dan; Parresol, Bernard, R.; Ottmar, Roger, D.

    2012-01-01

    Fire hazard mitigation planning requires an accurate accounting of fuel complexes to predict potential fire behavior and the effects of treatment alternatives. In the southeastern United States, rapid vegetation growth coupled with complex land use history and forest management options requires a dynamic approach to fuel characterization. In this study we assessed potential surface fire behavior with the Fuel Characteristic Classification System (FCCS), a tool that uses inventoried fuelbed inputs to predict fire behavior. Using inventory data from 629 plots established in the upper Atlantic Coastal Plain, South Carolina, we constructed FCCS fuelbeds representing median fuel characteristics by major forest type and age class. With a dry fuel moisture scenario and a 6.4 km h-1 midflame wind speed, the FCCS predicted moderate to high potential fire hazard for the majority of the fuelbeds under study. To explore fire hazard under potential future fuel conditions, we developed fuelbeds representing the range of quantitative inventory data for fuelbed components that drive surface fire behavior algorithms, and adjusted shrub species composition to represent 30% and 60% relative cover of highly flammable shrub species. Results indicate that the primary drivers of surface fire behavior vary by forest type, age and surface fire behavior rating. Litter tends to be a primary or secondary driver in most forest types. In comparison to other surface fire contributors, reducing shrub loading reduces flame lengths most consistently across forest types. The FCCS fuelbeds and the results from this project can be used for fire hazard mitigation planning throughout the southern Atlantic Coastal Plain where similar forest types occur. 
The approach of building simulated fuelbeds across the range of available surface fuel data produces sets of incrementally different fuel characteristics that can be applied to any dynamic forest types in which surface fuel conditions change rapidly.

  13. The variational effects of jurisdictional attributes on hazard mitigation planning costs.

    PubMed

    Jackman, Andrea M; Beruvides, Mario G

    2015-01-01

    Under the Disaster Mitigation Act of 2000 and the Federal Emergency Management Agency's subsequent Interim Final Rule, the requirement was placed on local governments to author and gain approval for a Hazard Mitigation Plan (HMP) for the areas under their jurisdiction. Low completion percentages for HMPs (less than one-third of eligible governments) were found by an analysis conducted 3 years after the final deadline of the aforementioned legislation. Follow-up studies showed little improvement at 5 and 8 years after the deadline. Based on these results, a previous study hypothesized that the cost of creating a HMP might be an influential factor in explaining why most jurisdictions had failed to write or gain approval for a HMP. The frequency of natural hazards experienced by the planning jurisdiction, the number of jurisdictions participating in the plan, and the population and population density were found to explain more than half of the variation in HMP costs. This study is a continuation of that effort, finding significant differences in cost both across ranges of values for the jurisdictional attributes and between single-jurisdictional and multijurisdictional plans. PMID:25779899

  14. Spatio-temporal patterns of hazards and their use in risk assessment and mitigation. Case study of road accidents in Romania

    NASA Astrophysics Data System (ADS)

    Catalin Stanga, Iulian

    2013-04-01

    Road accidents are among the leading causes of death in many countries, partly as an inherent consequence of the increasing mobility of today's society. The World Health Organization estimates that 1.3 million people died in road accidents in 2011, or 186 deaths per million. The tragic picture is completed by the millions of people suffering physical injuries and by the enormous social and economic costs that these events imply. Romania has one of the most unsafe road networks in the European Union, with annual averages of 9,400 accidents, 8,300 injuries and almost 2,680 fatalities (2007-2012). An average of 141 deaths per million is more than twice the average fatality rate in the European Union (about 60 deaths per million). Other specific indicators (accidents or fatalities relative to road length, vehicle fleet size, number of driving licence holders, adult population, etc.) are even worse in the same European context. Road accidents are caused by a complex series of factors, some of them relatively constant premises, while others act as catalysing factors or triggering agents: road features and quality, vehicle technical state, weather conditions, human-related factors, etc. All these lead to a complex equation with too many unknown variables, making a purely probabilistic approach almost impossible. However, the high concentration of accidents in a region or in certain road sectors is caused by a specific context, created by factors of permanent or repetitive character, and leads to the idea of spatial autocorrelation between the locations of adjoining accidents. In the same way, the increasing frequency of road accidents, and the repeatability of their causes in different periods of the year, makes it possible to identify "black" timeframes with a higher incidence of road accidents. 
Identifying and analyzing road blackspots (hotspots) and black zones would help to improve road safety by acting against the common causes that create the spatial or temporal clustering of crashes. Since the 1990s, Geographical Information Systems (GIS) have become a very important tool for traffic and road safety management, allowing not only spatial and multifactorial analysis but also graphical and non-graphical outputs. The current paper presents an accessible GIS methodology to study the spatio-temporal pattern of injury-related road accidents, to identify high-density accident zones, to perform a cluster analysis, to create multicriterial typologies, and to identify and explain spatial and temporal similarities. For this purpose, a Geographical Information System was created, allowing a complex analysis that involves not only the events but also a large set of interrelated and spatially linked attributes. The GIS includes the accidents as georeferenced point elements with a spatially linked attribute database: identification information (date, location details); accident type; main, secondary and aggravating causes; data about the driver; vehicle information; and consequences (damages, injured people and fatalities). Each attribute has its own numeric code that allows both statistical analysis and spatial interrogation. The database includes those road accidents that led to physical injuries and loss of human lives between 2007 and 2012, and the spatial analysis was carried out using the facilities of TNTmips 7.3 software. Data aggregation and processing allowed creating the spatial pattern of injury-related road accidents through kernel density estimation at three different levels (national - Romania; county level - Iasi County; local level - Iasi town). Spider graphs were used to create the temporal pattern of road accidents at three levels (daily, weekly and monthly), directly related to their causes. 
Moreover, the spatial and temporal database relates the natural hazards (glazed frost, fog and blizzard) to the human-made ones, giving the opportunity to evaluate the nature of uncertainties in risk assessment. Finally, the paper provides a clustering methodology based on several environmental indicators, intended to classify the spatial and temporal hotspots of road traffic insecurity. The results are a useful guide for planners and decision makers in developing effective road safety strategies and measures.
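The kernel density estimation step described above can be sketched in miniature. The coordinates, bandwidth and grid below are hypothetical, and a real analysis would use a GIS package rather than this toy implementation:

```python
import math

# Toy kernel density estimation over accident points (planar coordinates,
# e.g. km): sum an isotropic Gaussian kernel of bandwidth h at each grid
# node, then pick the node with the highest density as the "blackspot".
def kernel_density(points, grid, h=1.0):
    """Return {grid_node: density} using an isotropic Gaussian kernel."""
    dens = {}
    norm = 1.0 / (2.0 * math.pi * h * h * max(len(points), 1))
    for gx, gy in grid:
        s = 0.0
        for px, py in points:
            d2 = (gx - px) ** 2 + (gy - py) ** 2
            s += math.exp(-d2 / (2.0 * h * h))
        dens[(gx, gy)] = norm * s
    return dens

# Hypothetical cluster of accidents near (1, 1), plus one outlier at (5, 5).
accidents = [(1.0, 1.0), (1.2, 0.9), (0.8, 1.1), (5.0, 5.0)]
grid = [(x, y) for x in range(6) for y in range(6)]
dens = kernel_density(accidents, grid)
hotspot = max(dens, key=dens.get)
print(hotspot)  # (1, 1)
```

The bandwidth h controls how strongly nearby accidents are smoothed together; choosing it (and the grid resolution) is the main tuning decision in blackspot mapping.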

  15. How much do hazard mitigation plans cost? An analysis of federal grant data.

    PubMed

    Jackman, Andrea M; Beruvides, Mario G

    2013-01-01

    Under the Disaster Mitigation Act of 2000 and the Federal Emergency Management Agency's subsequent Interim Final Rule, local governments were required to author and gain approval for a Hazard Mitigation Plan (HMP) for the areas under their jurisdiction. An analysis conducted 3 years after the legislation's final deadline found low completion percentages for HMPs: less than one-third of eligible governments. Follow-up studies showed little improvement at 5 and 8 years after the deadline. It was hypothesized that the cost of an HMP is a significant factor in determining whether or not a plan is completed. A study was conducted using Boolean Matrix Analysis methods to determine what characteristics of a community, if any, most influence the cost of an HMP. The frequency of natural hazards experienced by the planning area, the number of jurisdictions participating in the HMP, the population, and the population density were found to significantly affect cost. These variables were used in a regression analysis to determine their predictive power for cost. It was found that, along with two interaction terms, the variables explain approximately half the variation in HMP cost. PMID:24303771
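The regression step this abstract describes can be illustrated with a minimal ordinary least-squares fit. The jurisdiction counts and plan costs below are invented for illustration; the actual study used several predictors plus two interaction terms, not a single regressor.

```python
def fit_simple_ols(x, y):
    """Least-squares fit of y = a + b*x; returns (a, b, r_squared)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    b = sxy / sxx
    a = my - b * mx
    ss_res = sum((yi - (a + b * xi)) ** 2 for xi, yi in zip(x, y))
    ss_tot = sum((yi - my) ** 2 for yi in y)
    return a, b, 1.0 - ss_res / ss_tot

# Hypothetical data: participating jurisdictions vs. HMP cost ($1000s).
jurisdictions = [1, 2, 4, 6, 10, 15]
cost = [25, 32, 45, 70, 98, 160]
a, b, r2 = fit_simple_ols(jurisdictions, cost)
assert b > 0 and 0.0 <= r2 <= 1.0  # cost rises with participation
```

An r-squared near 0.5, as the study reports for its full model, means roughly half the cost variation is left unexplained by the community characteristics.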

  16. EFFECTS OF USING NUMERICAL SIMULATION FOR PLANNING OF TSUNAMI HAZARD MITIGATION

    NASA Astrophysics Data System (ADS)

    Kitamura, Fukutaro; Shigihara, Yoshinori; Fujima, Koji

    A tsunami hazard mitigation plan is discussed for a business establishment in the predicted inundation area of a Nankai earthquake tsunami, first based on a public hazard map and then compared with a plan based on numerical simulation. It is difficult to obtain detailed inundation depths and arrival times from the public hazard map, so the business establishment may fail to take effective countermeasures, resulting in large damage. By contrast, numerical simulation provides this detailed information, making it possible for the business establishment to prepare an effective hazard reduction plan. Thus, numerical simulation is an effective tool for tsunami hazard mitigation planning and business continuity planning.

  17. Threshold effects of hazard mitigation in coastal human-environmental systems

    NASA Astrophysics Data System (ADS)

    Lazarus, E. D.

    2014-01-01

    Despite improved scientific insight into physical and social dynamics related to natural disasters, the financial cost of extreme events continues to rise. This paradox is particularly evident along developed coastlines, where future hazards are projected to intensify with consequences of climate change, and where the presence of valuable infrastructure exacerbates risk. By design, coastal hazard mitigation buffers human activities against the variability of natural phenomena such as storms. But hazard mitigation also sets up feedbacks between human and natural dynamics. This paper explores developed coastlines as exemplary coupled human-environmental systems in which hazard mitigation is the key coupling mechanism. Results from a simplified numerical model of an agent-managed seawall illustrate the nonlinear effects that economic and physical thresholds can impart into coastal human-environmental system dynamics. The scale of mitigation action affects the time frame over which human activities and natural hazards interact. By accelerating environmental changes observable in some settings over human timescales of years to decades, climate change may temporarily strengthen the coupling between human and environmental dynamics. However, climate change could ultimately result in weaker coupling at those human timescales as mitigation actions increasingly engage global-scale systems.
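As a toy illustration (not the paper's actual model), a threshold-triggered seawall raise produces exactly the kind of stepwise, nonlinear human-environmental coupling described above: damage accumulates smoothly, but mitigation responds in discrete jumps. All parameters here are invented.

```python
import random

def simulate_seawall(years, wall0, raise_threshold, seed=1):
    """Toy model: a storm surge arrives each year; the managing agent
    raises the wall only after cumulative overtopping damage crosses
    an economic threshold, so the wall height evolves in steps."""
    random.seed(seed)
    wall, damage_total, history = wall0, 0.0, []
    for _ in range(years):
        surge = random.uniform(0.0, 3.0)          # surge height, metres
        damage_total += max(0.0, surge - wall)    # overtopping damage
        if damage_total > raise_threshold:
            wall += 0.5          # discrete mitigation action
            damage_total = 0.0   # accounting resets after investment
        history.append(wall)
    return history

h = simulate_seawall(100, wall0=1.0, raise_threshold=5.0)
assert h[-1] >= h[0]  # the wall only ratchets upward
```

Lowering the threshold shortens the interval between mitigation actions, which is the scale effect on coupling timescales that the abstract highlights.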

  18. Assessing the costs of hazard mitigation through landscape interventions in the urban structure

    NASA Astrophysics Data System (ADS)

    Bostenaru-Dan, Maria; Aldea Mendes, Diana; Panagopoulos, Thomas

    2014-05-01

    In this paper we look at a rarely approached issue: the economic efficiency of natural hazard risk mitigation. The urban scale at which a natural hazard can impact makes urban planning strategy important in risk management; however, the problem is usually addressed by the natural, engineering, and social sciences, and the role of architecture and urban planning is neglected. Climate change can lead to risks related to increased floods, desertification, and sea level rise, among others. Reducing sealed surfaces in cities through green spaces in crowded centres can mitigate these risks and can be foreseen in restructuring plans, in the presence or absence of disasters. For this purpose we reviewed the role in games of green spaces and community centres such as churches, which can form the core of restructuring efforts, as our field and archive studies also show. We look at how ICT can help organize information, from the building survey to economic computations, in direct modeling or through games. The roles of game theory, agent-based modeling, networks, and urban public policies in designing decision systems for risk management are discussed. The game rules are supported by our field and archive studies as well as by research by design. We also consider a rarely examined element: the role of landscape planning, through the inclusion of green elements in reconstruction after natural and man-made disasters, or in restructuring efforts to mitigate climate change. Apart from the existing old city tissue, landscape too can be endangered by speculation, and it is therefore vital to highlight its high economic value in this particular case as well. As ICOMOS highlights for the 2014 congress, heritage and landscape are two sides of the same coin.
Landscape can become, or be connected to, a community centre: the first is necessary for building a settlement, the second raises its value; landscape can also build connections between landmarks along urban routes. For this reason, location plays a role not only in mitigating the effects of hazards but also in increasing the value of land through its vicinities. Games are simply another way to build a model of the complex system that is the urban organism; a model is easier to analyse than the system itself while still displaying its basic rules. The role of landscape in building roads of memory between landmarks during reconstruction is yet to be investigated in a proposed future COST action.

  19. The Diversity of Large Earthquakes and Its Implications for Hazard Mitigation

    NASA Astrophysics Data System (ADS)

    Kanamori, Hiroo

    2014-05-01

    With the advent of broadband seismology and GPS, significant diversity in the source radiation spectra of large earthquakes has been clearly demonstrated. This diversity requires different approaches to mitigate hazards. In certain tectonic environments, seismologists can forecast the future occurrence of large earthquakes within a solid scientific framework using the results from seismology and GPS. Such forecasts are critically important for long-term hazard mitigation practices, but because stochastic fracture processes are complex, the forecasts are inevitably subject to large uncertainty, and unexpected events will continue to surprise seismologists. Recent developments in real-time seismology will help seismologists to cope with and prepare for tsunamis and earthquakes. Combining a better understanding of earthquake diversity with modern technology is the key to effective and comprehensive hazard mitigation practices.

  20. Experimentally Benchmarked Numerical Approaches to Lightning Hazard Assessment and Mitigation

    NASA Astrophysics Data System (ADS)

    Jones, Malcolm; Newton, David

    2013-04-01

    A natural hazard that has been with us since the beginning of time is the lightning strike. It represents a direct hazard not only to humans but also to the facilities they work within and the products they produce. The latter categories are of particular concern when they relate to potentially hazardous processes and products. For this reason, experimental and numerical modelling techniques are developed to understand the nature of the hazards, to develop appropriate protective approaches, and finally to gain assurance that the overall risks fall within nationally and internationally accepted standards and those appropriate to the special nature of the work. The latter is particularly important when the processes and products within such facilities are potentially susceptible to lightning strike and where failure is deemed unacceptable. This paper covers examples of the modelling approaches applied to such facilities, within which high-consequence operations take place, together with the protection required for high-consequence products. In addition, examples are given of how the numerical techniques are benchmarked by supporting experimental programmes. Not only should such a safety rationale be laid down and agreed early for these facilities and products, but it must also be maintained during the inevitable changes that occur during the design, development, production, and maintenance phases. For example, an 'improvement' as seen by a civil engineer or a facility manager may well turn out to be detrimental to lightning safety. Constant vigilance is key to maintaining safety.

  1. Fourth DOE Natural Phenomena Hazards Mitigation Conference: Proceedings. Volume 1

    SciTech Connect

    Not Available

    1993-12-31

    This conference allowed an interchange in the natural phenomena area among designers, safety professionals, and managers. The papers presented in Volume I of the proceedings are from sessions I - VIII which cover the general topics of: DOE standards, lessons learned and walkdowns, wind, waste tanks, ground motion, testing and materials, probabilistic seismic hazards, risk assessment, base isolation and energy dissipation, and lifelines and floods. Individual papers are indexed separately. (GH)

  2. Advances(?) in mitigating volcano hazards in Latin America

    USGS Publications Warehouse

    Hall, M.L.

    1991-01-01

    The 1980s were incredible years for volcanology. As a consequence of the Mount St. Helens and other eruptions, major advances were made in our understanding of volcanic processes and eruption dynamics. The decade also witnessed the greatest death toll caused by volcanism since 1902. Following Mount St. Helens, awareness of volcano hazards increased throughout the world; however, in Latin America, subsequent events showed that much was still to be learned.

  3. Department of Energy Natural Phenomena Hazards Mitigation Program

    SciTech Connect

    Murray, R.C.

    1993-09-01

    This paper presents a summary of past and present accomplishments of the Natural Phenomena Hazards Program, ongoing at Lawrence Livermore National Laboratory since 1975. The natural phenomena covered include earthquakes; winds, hurricanes, and tornadoes; flooding and precipitation; lightning; and volcanic events. The work is organized into four major areas: (1) policy, requirements, standards, and guidance; (2) technical support, research, and development; (3) technology transfer; and (4) oversight.

  4. Looking before we leap: an ongoing, quantitative investigation of asteroid and comet impact hazard mitigation

    SciTech Connect

    Plesko, Catherine S; Weaver, Robert P; Bradley, Paul A; Huebner, Walter F

    2010-01-01

    There are many outstanding questions about the correct response to an asteroid or comet impact threat on Earth. Nuclear munitions are currently thought to be the most efficient method of delivering an impact-preventing impulse to a potentially hazardous object (PHO). However, there are major uncertainties about the response of PHOs to a nuclear burst, and the most appropriate ways to use nuclear munitions for hazard mitigation.

  5. Interdisciplinary approach for Tsunami Hazard Mitigation in Algeria (West Mediterranean)

    NASA Astrophysics Data System (ADS)

    Amir, L. A.; Cisternas, A.; Vigneresse, J. D.

    2009-12-01

    Numerous tsunamis have occurred in the West Mediterranean, with magnitudes ranging from m = -1 to m = 2 (Imamura-Iida scale). In Algeria, tsunamis are reported from the 14th century to 2003. Northern Algeria lies on the border between the African and Eurasian plates, and destructive earthquakes with magnitude greater than 6.7 occurred three times in the last century. The western region of northern Algeria is characterized by the Murdjadjo anticline. A destructive earthquake hit the city of Oran in October 1790 (intensity X, west of Algeria), triggering a tsunami in the Alboran Sea that flooded the Spanish and North African coasts; run-ups of 2 meters in height are reported in historical documents (Lopez Marinas and Salord, 1990). Here, the 1790 Alboran tsunami is studied through a modelling approach: the tsunami source is determined from the Okada equations, and the tsunami propagation is estimated with the SWAN code (Mader, 2004). Results show that active thrust faulting related to the Murdjadjo structure is responsible for the tsunami. In central Algeria, the city of Algiers (the capital) was the location of destructive earthquakes (intensity X) followed by tsunamis in 1365 and 1773; flooding and run-ups of 2 meters in height are reported in historical documents for the 1365 event. Central Algeria is the site of the Sahel anticline, so tsunami modelling was also performed considering the Sahel fault system as a potential tsunami source. Results show that the tsunami waves take less than 15 minutes to reach the Spanish coast, with run-ups estimated at less than 2 meters in height. Discrepancies are attributed to the resolution of the bathymetry and the limits of the modelling. In the eastern region, historical reports also reveal run-ups of up to 5 meters in height after a tsunami triggered by a destructive earthquake at the city of Jijel in 1856 (intensity VIII).
From tsunami catalogs, seismic and tsunami data are plotted using a tsunami vulnerability parameter. The vulnerability index is estimated from the tsunami intensity and the seismic intensity using the Papadopoulos and EMS scales. Results show that in Algeria, tsunami damage is minor relative to seismic damage. Since the 2004 Sumatra-Andaman tsunami, intergovernmental coordination groups have been working on Indian Ocean and Mediterranean tsunami alert systems. To reduce vulnerability and increase resilience, it is very important to implement an efficient warning system and a communication policy for fast-urbanizing coastal cities. In that context, lessons from the Pacific case study are of major interest. Chile is marked by very high seismic and tsunami hazard: the Iquique area is threatened by a potential earthquake of magnitude greater than 8 and a local tsunami that could generate run-ups of up to 20 meters in height. In addition to the Pacific Tsunami Warning Center based in Hawaii, Chile has established a local tsunami warning centre. The Chilean case study is presented in the discussion to highlight lessons that may serve as an example for fast-urbanizing coastal cities that must face local tsunamis.
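The short arrival times reported in this record follow from the long-wave (shallow-water) speed c = sqrt(g*h). This sketch uses made-up transect numbers, not the paper's Alboran Sea bathymetry, to show the scale of the calculation.

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def tsunami_travel_time(distance_m, depths_m):
    """Rough travel time using the long-wave speed c = sqrt(g*h),
    averaged over depth samples along the propagation path."""
    mean_speed = sum(math.sqrt(G * h) for h in depths_m) / len(depths_m)
    return distance_m / mean_speed

# Hypothetical 200 km transect with roughly 1 km average depth.
t_seconds = tsunami_travel_time(200e3, [800.0, 1000.0, 1200.0])
minutes = t_seconds / 60.0  # tens of minutes for a basin-scale crossing
```

At kilometre-scale depths the wave travels at hundreds of km/h, which is why narrow basins like the western Mediterranean leave so little warning time.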

  6. Assessment of indirect losses and costs of emergency for project planning of alpine hazard mitigation

    NASA Astrophysics Data System (ADS)

    Amenda, Lisa; Pfurtscheller, Clemens

    2013-04-01

    Owing to increased settlement in hazardous areas and growing asset values, natural disasters such as floods, landslides, and rockfalls cause high economic losses in Alpine lateral valleys. Especially in small municipalities, indirect losses, mainly stemming from a breakdown of transport networks, and the costs of emergency response can reach critical levels. Quantifying these losses is necessary to estimate the worthiness of mitigation measures, to determine the appropriate level of disaster assistance, and to improve risk management strategies. Comprehensive approaches are available for assessing direct losses; however, indirect losses and emergency costs are largely not assessed, and the empirical basis for estimating them is weak. To address the resulting uncertainties in project appraisals, a standardized methodology has been developed that deals with local economic effects and the emergency efforts needed. In our approach, the cost-benefit analysis for technical mitigation used by the Austrian Torrent and Avalanche Control (TAC) is optimized and extended, taking as a design event the 2005 debris flow that struck a small town in the upper Inn valley in southwest Tyrol (Austria). In that event, 84 buildings were affected, 430 people were evacuated, and in response the TAC implemented protection measures costing 3.75 million euros. Upgrading the TAC method and analyzing to what extent the cost-benefit ratio changes is one of the main objectives of this study. To estimate short-run indirect effects and emergency costs at the local level, data were collected via questionnaires, field mapping, and guided interviews, as well as intensive literature research. On this basis, up-to-date calculation methods were developed, and the TAC cost-benefit analysis was recalculated with the newly implemented results. The cost-benefit ratio becomes more precise and specific, and hence so does the decision on which mitigation alternative to carry out.
Based on this, the worthiness of the mitigation measures can be determined in more detail, and the proper level of emergency assistance can be calculated more adequately. Through this study, a better data basis is created for evaluating technical and non-technical mitigation measures, which is useful for government agencies, insurance companies, and research.

  7. The role of the media in hazard mitigation and disaster management.

    PubMed

    Rattien, S

    1990-03-01

    The International Decade for Natural Disaster Reduction, which began in January 1990, will embrace efforts to reduce death, injury and property losses stemming from rapid-onset natural disasters. Our expanding science and technology base makes possible this concerted cooperative international effort, and communications is a central part of that effort - for public education, early warning, evacuation and coordination of post-disaster relief. Mass communication is inextricably entwined with disasters and hazard mitigation. Reflecting the public's great interest and concern, the electronic and print media extensively cover natural disasters and significantly affect how and what the public learns about and how it perceives natural hazards. Improving the linkages between the media and disaster-mitigation researchers and practitioners could prepare the public to act promptly on warnings, helping to mitigate disasters. This could also accelerate the shift of the societal emphasis from post-disaster relief toward pre-disaster initiatives. PMID:20958692

  8. Mitigation of EMU Cut Glove Hazard from Micrometeoroid and Orbital Debris Impacts on ISS Handrails

    NASA Technical Reports Server (NTRS)

    Ryan, Shannon; Christiansen, Eric L.; Davis, Bruce A.; Ordonez, Erick

    2009-01-01

    Recent cut damage sustained on crewmember gloves during extravehicular activity (EVA) onboard the International Space Station (ISS) has been caused by contact with sharp edges or a pinch point, according to analysis of the damage. One potential source is protruding sharp-edged crater lips from micrometeoroid and orbital debris (MMOD) impacts on metallic handrails along EVA translation paths. A number of hypervelocity impact tests were performed on ISS handrails, which found that mm-sized projectiles are capable of inducing crater lip heights two orders of magnitude above the minimum value of glove abrasion concern. Two techniques were evaluated for mitigating the cut glove hazard of MMOD impacts on ISS handrails: flexible overwraps, which act to limit contact between crewmember gloves and impact sites; and alternate materials, which form less hazardous impact crater profiles. In parallel with redesign efforts to increase the cut resilience of EMU gloves, the modifications to ISS handrails evaluated in this study provide the means to significantly reduce cut glove risk from MMOD impact craters.

  9. The U.S. National Tsunami Hazard Mitigation Program: Successes in Tsunami Preparedness

    NASA Astrophysics Data System (ADS)

    Whitmore, P.; Wilson, R. I.

    2012-12-01

    Formed in 1995 by Congressional action, the National Tsunami Hazard Mitigation Program (NTHMP) provides the framework for tsunami preparedness activities in the United States. The Program consists of the 28 U.S. coastal states, territories, and commonwealths (STCs), as well as three Federal agencies: the National Oceanic and Atmospheric Administration (NOAA), the Federal Emergency Management Agency (FEMA), and the United States Geological Survey (USGS). Since its inception, the NTHMP has advanced tsunami preparedness in the United States through accomplishments in many areas: coordination and funding of tsunami hazard analysis and preparedness activities in STCs; development and execution of a coordinated plan to address education and outreach activities (materials, signage, and guides) within its membership; leading the effort to assist communities in meeting National Weather Service (NWS) TsunamiReady guidelines through development of evacuation maps and other planning activities; determination of tsunami hazard zones in the most highly threatened coastal communities throughout the country by detailed tsunami inundation studies; development of a benchmarking procedure to ensure that numerical tsunami models used in the inundation studies meet consistent NOAA standards; creation of a national tsunami exercise framework to test tsunami warning system response; funding of community tsunami warning dissemination and reception systems such as sirens and NOAA Weather Radios; and provision of guidance to NOAA's Tsunami Warning Centers regarding warning dissemination and content. NTHMP activities have advanced the state of preparedness of United States coastal communities and have helped save lives and property during recent tsunamis. Program successes as well as future plans, including maritime preparedness, are discussed.

  10. New Activities of the U.S. National Tsunami Hazard Mitigation Program, Mapping and Modeling Subcommittee

    NASA Astrophysics Data System (ADS)

    Wilson, R. I.; Eble, M. C.

    2013-12-01

    The U.S. National Tsunami Hazard Mitigation Program (NTHMP) is comprised of representatives from coastal states and federal agencies who, under the guidance of NOAA, work together to develop protocols and products to help communities prepare for and mitigate tsunami hazards. Within the NTHMP, several subcommittees are responsible for complementary aspects of tsunami assessment, mitigation, education, warning, and response. The Mapping and Modeling Subcommittee (MMS) is comprised of state and federal scientists who specialize in tsunami source characterization, numerical tsunami modeling, inundation map production, and warning forecasting. Until September 2012, much of the work of the MMS was authorized through the Tsunami Warning and Education Act, an Act that has since expired but whose spirit is being adhered to in parallel with reauthorization efforts. Over the past several years, the MMS has developed guidance and best practices for states and territories to produce accurate and consistent tsunami inundation maps for community-level evacuation planning, and has conducted benchmarking of numerical inundation models. Recent tsunami events have highlighted the need for other types of tsunami hazard analyses and products for improving evacuation planning, vertical evacuation, maritime planning, land-use planning, building construction, and warning forecasts. As the program responsible for producing accurate and consistent tsunami products nationally, the NTHMP-MMS is initiating a multi-year plan to: (1) create and build on existing demonstration projects that explore new tsunami hazard analysis techniques and products, such as maps identifying areas of strong currents and potential damage within harbors, as well as probabilistic tsunami hazard analysis for land-use planning; (2) develop benchmarks for validating new numerical modeling techniques related to current velocities and landslide sources; (3) generate guidance and protocols for the production and use of new tsunami hazard analysis products; and (4) identify multistate collaborations and funding partners interested in these new products. Application of these new products will improve the overall safety and resilience of coastal communities exposed to tsunami hazards.

  11. Defending Planet Earth: Near-Earth Object Surveys and Hazard Mitigation Strategies

    NASA Technical Reports Server (NTRS)

    2010-01-01

    The United States spends approximately four million dollars each year searching for near-Earth objects (NEOs), with the objective of detecting those that may collide with Earth. The majority of this funding supports the operation of several observatories that scan the sky for NEOs. This, however, is insufficient to detect the majority of NEOs that may present a tangible threat to humanity. A significantly smaller amount of funding supports "mitigation": ways to protect the Earth from such a potential collision. In 2005, a Congressional mandate called for NASA to detect 90 percent of NEOs with diameters of 140 meters or greater by 2020. Defending Planet Earth: Near-Earth Object Surveys and Hazard Mitigation Strategies identifies the need to detect objects as small as 30 to 50 meters, as these can be highly destructive. The book explores four main types of mitigation: civil defense, "slow push" or "pull" methods, kinetic impactors, and nuclear explosions. It also asserts that responding effectively to the hazards posed by NEOs requires national and international cooperation. The book is a useful guide for scientists, astronomers, policy makers, and engineers.

  12. Determining public policy and resource allocation priorities for mitigating natural hazards: a capabilities-based approach.

    PubMed

    Murphy, Colleen; Gardoni, Paolo

    2007-12-01

    This paper proposes a Capabilities-based Approach to guide hazard mitigation efforts. First, it discusses the criteria that an adequate framework for formulating public policy and allocating resources should meet, and shows why a common decision-aiding tool, Cost-benefit Analysis, fails to fulfill those criteria. A Capabilities-based Approach to hazard mitigation is then presented, drawing on the framework originally developed in the context of development economics and policy. The focus of a Capabilities-based Approach is protecting and promoting the well-being of individuals. Capabilities are dimensions of well-being and are specified in terms of functionings, which capture the various things of value an individual does or becomes in his or her life, including being alive, being healthy, and being sheltered. Capabilities refer to the real achievability of specific functionings. In the context of hazard mitigation, under a Capabilities-based Approach, decision- and policy-makers should consider the acceptability and tolerability of risks along with the affectability of hazards when determining policy formulation and resource allocation. Finally, the paper shows how the proposed approach satisfies the required criteria and overcomes the limitations of Cost-benefit Analysis while maintaining its strengths. PMID:18066680

  13. Almost strict liability: Wind River Petroleum and the Utah Hazardous Substance Mitigation Act

    SciTech Connect

    1996-12-31

    In Wind River, the Utah Supreme Court developed a two-step liability standard. The court ruled that under the act, statutorily responsible parties are strictly liable for any release of hazardous material from their facility. Among responsible parties, liability is to be apportioned on an equitable contribution standard. However, the Utah Legislature has subsequently amended the Mitigation Act to prohibit the application of unapportioned strict liability. Therefore, Wind River can no longer be relied upon as the law regarding liability under the Mitigation Act.

  14. Developing a scientific procedure for community based hazard mapping and risk mitigation

    NASA Astrophysics Data System (ADS)

    Verrier, M.

    2011-12-01

    As an international exchange student from the Geological Sciences Department at San Diego State University (SDSU), I joined the KKN-PPM program at Universitas Gadjah Mada (UGM), Yogyakarta, Indonesia, in July 2011 for 12 days (July 4th to July 16th) of its two-month duration (July 4th to August 25th). The KKN-PPM group I was attached to, designated 154, focused on Plosorejo Village, Karanganyar, Kerjo, Central Java, Indonesia. The mission of KKN-PPM 154 was to survey Plosorejo village for existing landslides, to generate a simple hazard susceptibility map that can be understood by local villagers, and then to begin disseminating that map in the community. To generate our susceptibility map, we first conducted a geological survey of the existing landslides in the field study area, with a focus on determining landslide triggers and gauging areas' susceptibility to future landslides. The methods for gauging susceptibility included lithological observation, the presence of linear cracking, and visible loss of structural integrity in structures such as villagers' homes, as well as collaboration with local residents and the local rescue and response team. Three colors were used to represent susceptibility: green, where there is no immediate danger of landslide damage; orange, where transportation routes are at risk of being disrupted by landslides; and red, where imminent landslide potential puts a home in direct danger. The landslide inventory and susceptibility data were compiled into digital media such as CorelDraw, ArcGIS, and Google Earth. Once a technical map was generated, we presented it to the village leadership for confirmation and modification based on their experience. Finally, we began to use the technical susceptibility map to draft evacuation routes and meeting points in the event of landslides, as well as simple susceptibility maps that can be understood and used by local villagers.
Landslide mitigation projects conducted alongside the community hazard map include marking evacuation routes with painted bamboo signs, creating a meaningful landslide awareness mural, and installing simple early warning systems that detect land movement and alert residents that evacuation routes should be used. KKN-PPM is scheduled to continue until August 25th, 2011. In the future, research will be done on using the model for community-based hazard mapping outlined here in the Geological Sciences Department at SDSU, to increase georisk awareness and improve mitigation of landslides in local areas of need such as Tijuana, Mexico.

  15. The influence of hazard models on GIS-based regional risk assessments and mitigation policies

    USGS Publications Warehouse

    Bernknopf, R.L.; Rabinovici, S.J.M.; Wood, N.J.; Dinitz, L.B.

    2006-01-01

    Geographic information systems (GIS) are important tools for understanding and communicating the spatial distribution of risks associated with natural hazards in regional economies. We present a GIS-based decision support system (DSS) for assessing community vulnerability to natural hazards and evaluating potential mitigation policy outcomes. The Land Use Portfolio Modeler (LUPM) integrates earth science and socioeconomic information to predict the economic impacts of loss-reduction strategies. However, the potential use of such systems in decision making may be limited when multiple but conflicting interpretations of the hazard are available. To explore this problem, we conduct a policy comparison using the LUPM to test the sensitivity of three available assessments of earthquake-induced lateral-spread ground failure susceptibility in a coastal California community. We find that the uncertainty regarding the interpretation of the science inputs can influence the development and implementation of natural hazard management policies. Copyright ?? 2006 Inderscience Enterprises Ltd.
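The sensitivity this record describes can be made concrete with a toy expected-loss comparison across conflicting susceptibility interpretations. The zone values and failure probabilities below are invented, not LUPM outputs.

```python
def expected_loss(prob_by_zone, value_by_zone):
    """Expected loss: sum over zones of P(ground failure) * exposed value."""
    return sum(p * v for p, v in zip(prob_by_zone, value_by_zone))

values = [10.0, 20.0, 5.0]  # hypothetical exposed value per zone (M$)
# Three conflicting susceptibility interpretations of the same zones.
interpretations = {
    "map_A": [0.05, 0.10, 0.20],
    "map_B": [0.10, 0.15, 0.10],
    "map_C": [0.20, 0.05, 0.05],
}
losses = {k: expected_loss(p, values) for k, p in interpretations.items()}
# Different interpretations imply different loss estimates, and hence
# potentially different mitigation policies for the same community.
assert len(set(losses.values())) > 1
```

If a policy threshold falls between these loss estimates, the choice of hazard map alone determines whether mitigation is triggered, which is the decision-making problem the study explores.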

  16. The NEOShield Project: Understanding the Mitigation-Relevant Physical Properties of Potentially Hazardous Asteroids

    NASA Astrophysics Data System (ADS)

    Harris, Alan W.; Drube, L.; Consortium, NEOShield

    2012-10-01

    NEOShield is a European-Union funded project to address impact hazard mitigation issues, coordinated by the German Aerospace Center, DLR. The NEOShield consortium consists of 13 research institutes, universities, and industrial partners from 6 countries and includes leading US and Russian space organizations. The primary aim of the 5.8 million euro, 3.5 year project, which commenced in January 2012, is to investigate in detail promising mitigation techniques, such as the kinetic impactor, blast deflection, and the gravity tractor, and devise feasible demonstration missions. Options for an international strategy for implementation when an actual impact threat arises will also be investigated. Our current scientific work is focused on examining the mitigation-relevant physical properties of the NEO population via observational data and laboratory experiments on asteroid surface analog materials. We are attempting to narrow the range of the expected properties of objects that are most likely to threaten the Earth and trigger space-borne mitigation attempts, and investigate how such objects would respond to different mitigation techniques. The results of our scientific work will flow into the technical phase of the project, during which detailed designs of feasible mitigation demonstration missions will be developed. We briefly describe the scope of the project and report on results obtained to date. Funded under EU FP7 program agreement no. 282703.

  17. Planning ahead for asteroid and comet hazard mitigation, phase 1: parameter space exploration and scenario modeling

    SciTech Connect

    Plesko, Catherine S; Clement, R Ryan; Weaver, Robert P; Bradley, Paul A; Huebner, Walter F

    2009-01-01

    The mitigation of impact hazards resulting from Earth-approaching asteroids and comets has received much attention in the popular press. However, many questions remain about the near-term and long-term feasibility and appropriate application of all proposed methods. Recent and ongoing ground- and space-based observations of small solar-system body composition and dynamics have revolutionized our understanding of these bodies (e.g., Ryan (2000), Fujiwara et al. (2006), and Jedicke et al. (2006)). Ongoing increases in computing power and algorithm sophistication make it possible to calculate the response of these inhomogeneous objects to proposed mitigation techniques. Here we present the first phase of a comprehensive hazard mitigation planning effort undertaken by Southwest Research Institute and Los Alamos National Laboratory. We begin by reviewing the parameter space of the object's physical and chemical composition and trajectory. We then use the radiation hydrocode RAGE (Gittings et al. 2008), Monte Carlo N-Particle (MCNP) radiation transport (see Clement et al., this conference), and N-body dynamics codes to explore the effects these variations in object properties have on the coupling of energy into the object from a variety of mitigation techniques, including deflection and disruption by nuclear and conventional munitions, and a kinetic impactor.

  18. A perspective multidisciplinary geological approach for mitigation of effects due to the asbestos hazard

    NASA Astrophysics Data System (ADS)

    Vignaroli, Gianluca; Rossetti, Federico; Belardi, Girolamo; Billi, Andrea

    2010-05-01

    Asbestos-bearing rock sequences constitute a remarkable natural hazard that poses an important threat to human health and may be at the origin of diseases such as asbestosis, mesothelioma, and lung cancer. Presently, asbestos is classified as a Category 1 carcinogen by world health authorities. Although regulatory agencies in many countries prohibit or restrict the use of asbestos and regulate environmental asbestos exposure, the impact of asbestos on human life still constitutes a major problem. Naturally occurring asbestos includes serpentine and amphibole minerals characterised by fibrous morphology and is a constituent of mineralogical associations typical of mafic and ultramafic rocks within ophiolitic sequences. Release of fibres can occur both through natural processes (erosion) and through human activities requiring fragmentation of ophiolite rocks (quarrying, tunnelling, railway construction, etc.). As a consequence, vulnerability is increasing at sites where workers and residents are exposed to fibres dispersed during the mining and milling of ophiolitic rocks. By analysing in the field different exposures of ophiolitic sequences from the Italian peninsula, and after an extensive review of the existing literature, we stress the importance of the geological context (origin, tectonic and deformation history) of ophiolites as a first-order parameter in evaluating the asbestos hazard. Integrated structural, textural, mineralogical, and petrological studies significantly improve our understanding of the mechanisms governing the nucleation/growth of fibrous minerals in deformation structures (both ductile and brittle) within ophiolitic rocks. 
A primary role is recognised for the structural processes favouring the fibrous mineralization, with correlations existing between the fibrous parameters (such as mineralogical composition, texture, and mechanical characteristics) and the particles released into the air (such as the shape, size, and amount liberated during rock fragmentation). Accordingly, we are confident that the definition of an analytical protocol based on the geological attributes of asbestos-bearing rocks may constitute a propaedeutical tool for evaluating the asbestos hazard in natural environments. This approach may have important implications for mitigating the effects of the asbestos hazard, from the medical field to engineering operations.

  19. Multidisciplinary Approach to Identify and Mitigate the Hazard from Induced Seismicity in Oklahoma

    NASA Astrophysics Data System (ADS)

    Holland, A. A.; Keller, G. R., Jr.; Darold, A. P.; Murray, K. E.; Holloway, S. D.

    2014-12-01

    Oklahoma has experienced a very significant increase in seismicity rates over the last 5 years, with the greatest increase occurring in 2014. The observed rate increase indicates that the seismic hazard for at least some parts of Oklahoma has increased significantly. Many seismologists consider the large number of salt-water disposal wells operating in Oklahoma to be the largest contributing factor to this increase. However, unlike many cases of seismicity induced by injection, the greatest increase is occurring over a very large area, about 15% of the state. There are more than 3,000 disposal wells currently operating within Oklahoma, with injection volumes greater than 2010 rates. These factors add many significant challenges to identifying potential cases of induced seismicity and understanding the contributing factors well enough to mitigate such occurrences. In response to a clear need for a better geotechnical understanding of what is occurring in Oklahoma, a multi-year multidisciplinary study of some of the most active areas has begun at the University of Oklahoma. This study includes additional seismic monitoring, better geological and geophysical characterization of the subsurface, hydrological and reservoir modeling, and geomechanical studies to better understand the rise in seismicity rates. The Oklahoma Corporation Commission has added new rules regarding reporting and monitoring of salt-water disposal wells, and continues to work with the Oklahoma Geological Survey and other researchers.

  20. [Comment on Old and new ideas visited for comet and asteroid hazard mitigation strategies by J. Wakefield] Clarifying a few points about impact mitigation strategies

    NASA Astrophysics Data System (ADS)

    Morrison, David

    The article on impact mitigation studies [Eos, August 30, 1994] raises some interesting points about the hazard of asteroid and comet impacts but betrays a major misunderstanding of the mitigation strategies under study. It refers repeatedly to Strategic Defense Initiative (SDI) or Star Wars systems and tries to discuss their merits without giving the reader any hint of what these mysterious new technologies are or how they may relate to mitigation of impacts. In fact, detecting, intercepting, and deflecting a kilometer-scale asteroid or comet at ranges from the Earth of tens of millions of kilometers has little in common with the missile detection and defense systems developed as part of the SDI program, no more than you would use a rifle to shoot down an Inter-Continental Ballistic Missile. The one example mentioned, the Air Force GEODSS system of wide-field telescopes, was not a part of SDI. In addition, the article seriously overstates the current discovery rate of kilometer-scale near Earth asteroids, which are currently found at a rate of several per year, not several per month (the several per month refers to all near Earth asteroids found, most of which are less than 1 km in diameter).

  1. On mitigating rapid onset natural disasters: Project THRUST (Tsunami Hazards Reduction Utilizing Systems Technology)

    NASA Astrophysics Data System (ADS)

    Bernard, E. N.; Behn, R. R.; Hebenstreit, G. T.; Gonzalez, F. I.; Krumpe, P.; Lander, J. F.; Lorca, E.; McManamon, P. M.; Milburn, H. B.

    Rapid onset natural hazards have claimed more than 2.8 million lives worldwide in the past 20 years. This category includes such events as earthquakes, landslides, hurricanes, tornados, floods, volcanic eruptions, wildfires, and tsunamis. Effective hazard mitigation is particularly difficult in such cases, since the time available to issue warnings can be very short or even nonexistent. This paper presents the concept of a local warning system that exploits and integrates the existing technologies of risk evaluation, environmental measurement, and telecommunications. We describe Project THRUST, a successful implementation of this general, systematic approach to tsunamis. The general approach includes pre-event emergency planning, real-time hazard assessment, and rapid warning via satellite communication links.

  2. Earthquake Scaling and Development of Ground Motion Prediction for Earthquake Hazard Mitigation in Taiwan

    NASA Astrophysics Data System (ADS)

    Ma, K.; Yen, Y.

    2011-12-01

    For earthquake hazard mitigation toward risk management, an integrated study from source-model development to ground motion prediction is crucial. Simulation of the high-frequency component (> 1 Hz) of strong ground motions in the near field has not been well resolved due to insufficient resolution in velocity structure. Using small events as Green's functions (i.e., the empirical Green's function (EGF) method) sidesteps the lack of a precise velocity structure in evaluating the path effect. If an EGF is not available, a stochastic Green's function (SGF) method can be employed. By characterizing the slip models derived from waveform inversion, we directly extract the parameters needed for ground motion prediction with the EGF or SGF method. The slip models have been investigated from Taiwan's dense strong-motion and global teleseismic data. In addition, the low-frequency component (< 1 Hz) can be obtained numerically by the frequency-wavenumber (FK) method. Thus, broadband strong ground motion can be calculated by a hybrid method that combines a deterministic FK calculation for the low-frequency simulation with the EGF or SGF method for the high-frequency simulation. Characterizing the definitive source parameters from the empirical scaling study feeds directly into the ground motion simulation. To predict ground motion for a scenario earthquake, we compiled the earthquake scaling relationship from the inverted finite-fault models of moderate to large earthquakes in Taiwan. The studies show the significant role of the seismogenic depth in the development of rupture width. In addition, several earthquakes on blind faults show distinctly large stress drops, which yield regionally high PGA. Based on the developed scaling relationship and the possible high stress drops for earthquakes on blind faults, we further deploy the hybrid method mentioned above to simulate strong motion in displacement, velocity, and acceleration. We apply this exercise to the high-stress-drop event, and to events that might pose a seismic hazard to a specific site, to provide further estimates for seismic hazard evaluation.
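The hybrid combination described in this record (deterministic below the crossover, stochastic above it) can be sketched as band-limiting two traces at 1 Hz and summing them. Both input traces below are synthetic stand-ins, not output of an actual FK or SGF code, and the brick-wall FFT filter is an illustrative simplification of the matched filtering a production code would use.

```python
# Sketch of a hybrid broadband synthesis: low-pass the deterministic trace,
# high-pass the stochastic trace, and sum at a 1 Hz crossover.
import numpy as np

def band_filter(x, fs, fc, keep="low"):
    """Zero-phase FFT brick-wall filter at crossover frequency fc (Hz)."""
    spec = np.fft.rfft(x)
    freqs = np.fft.rfftfreq(len(x), 1.0 / fs)
    mask = freqs <= fc if keep == "low" else freqs > fc
    return np.fft.irfft(spec * mask, n=len(x))

fs, n = 100.0, 4096                     # 100 Hz sampling, ~41 s record
t = np.arange(n) / fs
rng = np.random.default_rng(0)

lf_trace = np.sin(2 * np.pi * 0.3 * t) * np.exp(-t / 10.0)  # stand-in for the FK (< 1 Hz) result
hf_trace = rng.standard_normal(n) * np.exp(-t / 5.0)        # stand-in for the SGF (> 1 Hz) result

broadband = (band_filter(lf_trace, fs, 1.0, "low")
             + band_filter(hf_trace, fs, 1.0, "high"))
```

The design choice is that each source contributes only in the band where it is trusted: the deterministic trace below 1 Hz, the stochastic trace above.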

  3. A portfolio approach to evaluating natural hazard mitigation policies: An Application to lateral-spread ground failure in Coastal California

    USGS Publications Warehouse

    Bernknopf, R.L.; Dinitz, L.B.; Rabinovici, S.J.M.; Evans, A.M.

    2001-01-01

    In the past, efforts to prevent catastrophic losses from natural hazards have largely been undertaken by individual property owners based on site-specific evaluations of risks to particular buildings. Public efforts to assess community vulnerability and encourage mitigation have focused on either aggregating site-specific estimates or adopting standards based upon broad assumptions about regional risks. This paper develops an alternative, intermediate-scale approach to regional risk assessment and the evaluation of community mitigation policies. Properties are grouped into types with similar land uses and levels of hazard, and hypothetical community mitigation strategies for protecting these properties are modeled like investment portfolios. The portfolios consist of investments in mitigation against the risk to a community posed by a specific natural hazard, and are defined by a community's mitigation budget and the proportion of the budget invested in locations of each type. The usefulness of this approach is demonstrated through an integrated assessment of earthquake-induced lateral-spread ground failure risk in the Watsonville, California area. Data from the magnitude 6.9 Loma Prieta earthquake of 1989 are used to model lateral-spread ground failure susceptibility. Earth science and economic data are combined and analyzed in a Geographic Information System (GIS). The portfolio model is then used to evaluate the benefits of mitigating the risk in different locations. Two mitigation policies, one that prioritizes mitigation by land use type and the other by hazard zone, are compared with a status quo policy of doing no further mitigation beyond that which already exists. The portfolio representing the hazard zone rule yields a higher expected return than the land use portfolio does; however, the hazard zone portfolio has a higher standard deviation. Therefore, neither portfolio is clearly preferred. The two mitigation policies both reduce expected losses and increase overall expected community wealth compared to the status quo policy.
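The portfolio comparison in this record reduces to standard mean-variance arithmetic: expected return is the probability-weighted mean of per-scenario returns, and risk is the standard deviation around it. The scenario probabilities and returns below are invented for illustration, not the Watsonville data.

```python
# Mean-variance sketch of comparing two mitigation "portfolios".
# All numbers are illustrative stand-ins for the paper's GIS-derived values.
import numpy as np

p = np.array([0.90, 0.08, 0.02])               # scenario probabilities: no / moderate / severe event
ret_hazard_zone = np.array([-1.0, 6.0, 30.0])  # per-scenario returns, hazard-zone policy
ret_land_use = np.array([-1.0, 5.0, 20.0])     # per-scenario returns, land-use policy

def expected_return(r):
    return float(p @ r)

def risk(r):
    """Standard deviation of returns around the expected return."""
    mu = expected_return(r)
    return float(np.sqrt(p @ (r - mu) ** 2))

# With these numbers the hazard-zone portfolio has both the higher mean and
# the higher spread, mirroring the trade-off the abstract reports: neither
# portfolio dominates the other.
```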

  4. Lidar and Electro-Optics for Atmospheric Hazard Sensing and Mitigation

    NASA Technical Reports Server (NTRS)

    Clark, Ivan O.

    2012-01-01

    This paper provides an overview of the research and development efforts of the Lidar and Electro-Optics element of NASA's Aviation Safety Program. This element is seeking to improve the understanding of the atmospheric environments encountered by aviation and to provide enhanced situation awareness for atmospheric hazards. The improved understanding of atmospheric conditions is specifically to develop sensor signatures for atmospheric hazards. The current emphasis is on kinetic air hazards such as turbulence, aircraft wake vortices, mountain rotors, and windshear. Additional efforts are underway to identify and quantify the hazards arising from multi-phase atmospheric conditions including liquid and solid hydrometeors and volcanic ash. When the multi-phase conditions act as obscurants that result in reduced visual awareness, the element seeks to mitigate the hazards associated with these diminished visual environments. The overall purpose of these efforts is to enable safety improvements for air transport class and business jet class aircraft as the transition to the Next Generation Air Transportation System occurs.

  5. The asteroid and comet impact hazard: risk assessment and mitigation options

    NASA Astrophysics Data System (ADS)

    Gritzner, Christian; Dürfeld, Kai; Kasper, Jan; Fasoulas, Stefanos

    2006-08-01

    The impact of extraterrestrial matter onto Earth is a continuous process. On average, some 50,000 tons of dust are delivered to our planet every year. While objects smaller than about 30 m mainly disintegrate in the Earth's atmosphere, larger ones can penetrate through it and cause damage on the ground. When an object of hundreds of meters in diameter impacts an ocean, a tsunami is created that can devastate coastal cities. Further, if a km-sized object hit the Earth it would cause a global catastrophe due to the transport of enormous amounts of dust and vapour into the atmosphere resulting in a change in the Earth's climate. This article gives an overview of the near-Earth asteroid and comet (near-Earth object-NEO) impact hazard and the NEO search programmes which are gathering important data on these objects. It also points out options for impact hazard mitigation by using deflection systems. It further discusses the critical constraints for NEO deflection strategies and systems as well as mitigation and evacuation costs and benefits. Recommendations are given for future activities to solve the NEO impact hazard problem.

  6. The asteroid and comet impact hazard: risk assessment and mitigation options.

    PubMed

    Gritzner, Christian; Dürfeld, Kai; Kasper, Jan; Fasoulas, Stefanos

    2006-08-01

    The impact of extraterrestrial matter onto Earth is a continuous process. On average, some 50,000 tons of dust are delivered to our planet every year. While objects smaller than about 30 m mainly disintegrate in the Earth's atmosphere, larger ones can penetrate through it and cause damage on the ground. When an object of hundreds of meters in diameter impacts an ocean, a tsunami is created that can devastate coastal cities. Further, if a km-sized object hit the Earth it would cause a global catastrophe due to the transport of enormous amounts of dust and vapour into the atmosphere resulting in a change in the Earth's climate. This article gives an overview of the near-Earth asteroid and comet (near-Earth object-NEO) impact hazard and the NEO search programmes which are gathering important data on these objects. It also points out options for impact hazard mitigation by using deflection systems. It further discusses the critical constraints for NEO deflection strategies and systems as well as mitigation and evacuation costs and benefits. Recommendations are given for future activities to solve the NEO impact hazard problem. PMID:16670908

  7. Seismic hazard studies in Egypt

    NASA Astrophysics Data System (ADS)

    Mohamed, Abuo El-Ela A.; El-Hadidy, M.; Deif, A.; Abou Elenean, K.

    2012-12-01

    The study of earthquake activity and seismic hazard assessment in Egypt is very important due to the great and rapid spread of large investments in national projects, especially the nuclear power plant to be built in the northern part of Egypt. Although Egypt is characterized by low seismicity, it has experienced damaging earthquakes throughout its history. The seismotectonic setting of Egypt suggests that large earthquakes are possible, particularly along the Gulf of Aqaba-Dead Sea transform, the subduction zone along the Hellenic and Cyprean Arcs, and the northern Red Sea triple junction. In addition, some significant inland sources at Aswan, Dahshour, and the Cairo-Suez District should be considered. The seismic hazard for Egypt is calculated utilizing a probabilistic approach (on a 0.5° × 0.5° grid) within a logic-tree framework. Alternative seismogenic models and ground motion scaling relationships are selected to account for the epistemic uncertainty. Seismic hazard values on rock were calculated to create contour maps for four ground motion spectral periods and for different return periods. In addition, uniform hazard spectra for rock sites at 25 periods, and probabilistic hazard curves for the cities of Cairo and Alexandria, are graphed. The peak ground acceleration (PGA) values were highest close to the Gulf of Aqaba, about 220 gal for a 475-year return period, while the lowest PGA values, less than 25 gal, were found in the western part of the Western Desert.
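The logic-tree framework in this record can be illustrated with a toy hazard calculation: each branch supplies an annual exceedance-rate model for PGA, branches are combined by their weights, and a return-period probability follows from the Poisson assumption. The power-law rate model and all coefficients below are invented for illustration; they are not the study's seismogenic models or ground motion relationships.

```python
# Toy logic-tree hazard curve: weight-average the exceedance rates from two
# hypothetical branches, then convert a rate to a probability of exceedance.
import math

ims = [50, 100, 220, 400]                 # PGA thresholds in gal

def exceedance_rate(pga, a, b):
    """Annual rate of exceeding `pga` under an invented power law a * pga**-b."""
    return a * pga ** (-b)

# Logic-tree branches: (weight, a, b). Weights sum to 1.
branches = [(0.6, 40.0, 1.2), (0.4, 25.0, 1.1)]

def poisson_prob(rate, years=475):
    """Probability of at least one exceedance in `years`, assuming Poisson arrivals."""
    return 1.0 - math.exp(-rate * years)

# Weighted hazard curve: annual exceedance rate at each PGA threshold.
curve = {pga: sum(w * exceedance_rate(pga, a, b) for w, a, b in branches)
         for pga in ims}
```

The epistemic uncertainty shows up as the spread between branch rates at each threshold; the weighted mean is what the contour maps would plot.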

  8. Fluor Daniel Hanford implementation plan for DOE Order 5480.28, Natural phenomena hazards mitigation

    SciTech Connect

    Conrads, T.J.

    1997-09-12

    Natural phenomena hazards (NPH) are unexpected acts of nature that pose a threat or danger to workers, the public, or the environment. Earthquakes, extreme winds (hurricane and tornado), snow, flooding, volcanic ashfall, and lightning strikes are examples of NPH that could occur at the Hanford Site. U.S. Department of Energy (DOE) policy requires facilities to be designed, constructed, and operated in a manner that protects workers, the public, and the environment from hazards caused by natural phenomena. DOE Order 5480.28, Natural Phenomena Hazards Mitigation, includes rigorous new natural phenomena criteria for the design of new DOE facilities, as well as for the evaluation and, if necessary, upgrade of existing DOE facilities. The Order was transmitted to Westinghouse Hanford Company in 1993 for compliance and is also identified in the Project Hanford Management Contract, Section J, Appendix C. Criteria and requirements of DOE Order 5480.28 are included in five standards, the last of which, DOE-STD-1023, was released in fiscal year 1996. Because the Order was released before all of its required standards were released, enforcement of the Order was waived pending release of the last standard and determination of an in-force date by the DOE Richland Operations Office (DOE-RL). Agreement also was reached between the Management and Operations Contractor and DOE-RL that the Order would become enforceable for new structures, systems, and components (SSCs) 60 days following issue of new Order-based design criteria in HNF-PRO-97, Engineering Design and Evaluation. The Order also requires that commitments addressing existing SSCs be included in an implementation plan that is to be issued 1 year following the release of the last standard. Subsequently, WHC-SP-1175, Westinghouse Hanford Company Implementation Plan for DOE Order 5480.28, Natural Phenomena Hazards Mitigation, Rev. 0, was issued in November 1996, and this document, HNF-SP-1175, Fluor Daniel Hanford Implementation Plan for DOE Order 5480.28, Natural Phenomena Hazards Mitigation, is Rev. 1 of that plan.

  9. The Puerto Rico Component of the National Tsunami Hazard and Mitigation Program Pr-Nthmp

    NASA Astrophysics Data System (ADS)

    Huerfano Moreno, V. A.; Hincapie-Cardenas, C. M.

    2014-12-01

    Tsunami hazard assessment, detection, warning, education, and outreach efforts are intended to reduce losses to life and property. The Puerto Rico Seismic Network (PRSN) is participating in an effort with local and federal agencies to develop tsunami hazard risk reduction strategies under the National Tsunami Hazard Mitigation Program (NTHMP). This grant supports the TsunamiReady program, which is the basis of tsunami preparedness and mitigation in PR. The Caribbean region has a documented history of damaging tsunamis that have affected coastal areas. The seismic water waves originating in the prominent fault systems around PR are considered a near-field hazard for Puerto Rico and the Virgin Islands (PR/VI) because they can reach coastal areas within a few minutes after the earthquake. Sources for local, regional, and tele-tsunamis have been identified and modeled, and tsunami evacuation maps were prepared for PR. These maps were generated in three phases: first, hypothetical tsunami scenarios were developed on the basis of the parameters of potential underwater earthquakes; second, each of these scenarios was simulated; third, the worst-case scenario (MOM) was determined. The run-ups were drawn on GIS-referenced maps and aerial photographs. These products are being used by emergency managers to educate the public and develop mitigation strategies. Online maps and related evacuation products are available to the public via the PR-TDST (PR Tsunami Decision Support Tool). Currently, all 44 coastal municipalities are recognized as TsunamiReady by the US NWS. The main goal of the program is to declare Puerto Rico TsunamiReady, including two cities that are not coastal but could be affected by tsunamis. 
Based on these evacuation maps, tsunami signs were installed, vulnerability profiles were created, communication systems to receive and disseminate tsunami messages were installed in each TWFP, and tsunami response plans were approved. Also, the existing tsunami protocol and criteria in the PR/VI were updated. This paper describes the PR-NTHMP project, including the real-time earthquake and tsunami monitoring as well as the specific protocols used to broadcast tsunami messages. The paper highlights tsunami hazard assessment, detection, warning, education, and outreach in Puerto Rico.

  10. Impact hazard mitigation: understanding the effects of nuclear explosive outputs on comets and asteroids

    SciTech Connect

    Clement, Ralph R C; Plesko, Catherine S; Bradley, Paul A; Conlon, Leann M

    2009-01-01

    The NASA 2007 white paper ''Near-Earth Object Survey and Deflection Analysis of Alternatives'' affirms deflection as the safest and most effective means of potentially hazardous object (PHO) impact prevention. It also calls for further studies of object deflection. In principle, deflection of a PHO may be accomplished by using kinetic impactors, chemical explosives, gravity tractors, solar sails, or nuclear munitions. Of the sudden impulse options, nuclear munitions are by far the most efficient in terms of yield-per-unit-mass launched and are technically mature. However, there are still significant questions about the response of a comet or asteroid to a nuclear burst. Recent and ongoing observational and experimental work is revolutionizing our understanding of the physical and chemical properties of these bodies (e.g., Ryan (2000), Fujiwara et al. (2006), and Jedicke et al. (2006)). This improved understanding of small solar-system bodies, combined with current state-of-the-art modeling and simulation capabilities, which have also improved dramatically in recent years, allows for a science-based, comprehensive study of PHO mitigation techniques. Here we present an examination of the effects of radiation from a nuclear explosion on potentially hazardous asteroids and comets through Monte Carlo N-Particle code (MCNP) simulation techniques. MCNP is a general-purpose particle transport code commonly used to model neutron, photon, and electron transport for medical physics, reactor design and safety, accelerator target and detector design, and a variety of other applications including modeling the propagation of epithermal neutrons through the Martian regolith (Prettyman 2002). It is a massively parallel code that can conduct simulations in 1-3 dimensions, complicated geometries, and with extremely powerful variance reduction techniques. 
It uses current nuclear cross section data, where available, and fills in the gaps with analytical models where data are not available. MCNP has undergone extensive verification and validation and is considered the gold-standard for particle transport. (Forrest B. Brown, et al., ''MCNP Version 5,'' Trans. Am. Nucl. Soc., 87, 273, November 2002.) Additionally, a new simulation capability using MCNP has become available to this collaboration. The first results of this new capability will also be presented.
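The MCNP runs described in this record tabulate energy deposition versus depth numerically. As a zeroth-order analytic stand-in, one can sketch Beer-Lambert exponential attenuation; the mean free path below is an assumed illustrative value, not an MCNP result or a measured asteroid-material property.

```python
# Zeroth-order stand-in for depth-dependent energy deposition: the fraction
# of incident flux absorbed above a given depth under exponential attenuation.
# The mean free path is an assumed illustrative value.
import math

def deposited_fraction(depth_cm, mean_free_path_cm=20.0):
    """Fraction of incident flux absorbed between the surface and `depth_cm`."""
    return 1.0 - math.exp(-depth_cm / mean_free_path_cm)

# Deposition saturates with depth: most energy couples into a shallow layer,
# which is why burst standoff and material composition matter so much.
```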

  11. Impact Hazard Mitigation: Understanding the Effects of Nuclear Explosive Outputs on Comets and Asteroids

    NASA Astrophysics Data System (ADS)

    Clement, R.

    The NASA 2007 white paper "Near-Earth Object Survey and Deflection Analysis of Alternatives" affirms deflection as the safest and most effective means of potentially hazardous object (PHO) impact prevention. It also calls for further studies of object deflection. In principle, deflection of a PHO may be accomplished by using kinetic impactors, chemical explosives, gravity tractors, solar sails, or nuclear munitions. Of the sudden impulse options, nuclear munitions are by far the most efficient in terms of yield-per-unit-mass launched and are technically mature. However, there are still significant questions about the response of a comet or asteroid to a nuclear burst. Recent and ongoing observational and experimental work is revolutionizing our understanding of the physical and chemical properties of these bodies (e.g., Ryan (2000), Fujiwara et al. (2006), and Jedicke et al. (2006)). This improved understanding of small solar-system bodies, combined with current state-of-the-art modeling and simulation capabilities, which have also improved dramatically in recent years, allows for a science-based, comprehensive study of PHO mitigation techniques. Here we present an examination of the effects of radiation from a nuclear explosion on potentially hazardous asteroids and comets through Monte Carlo N-Particle code (MCNP) simulation techniques. MCNP is a general-purpose particle transport code commonly used to model neutron, photon, and electron transport for medical physics, reactor design and safety, accelerator target and detector design, and a variety of other applications including modeling the propagation of epithermal neutrons through the Martian regolith (Prettyman 2002). It is a massively parallel code that can conduct simulations in 1-3 dimensions, complicated geometries, and with extremely powerful variance reduction techniques. 
It uses current nuclear cross section data, where available, and fills in the gaps with analytical models where data are not available. MCNP has undergone extensive verification and validation and is considered the gold-standard for particle transport. (Forrest B. Brown, et al., "MCNP Version 5," Trans. Am. Nucl. Soc., 87, 273, November 2002.) Additionally, a new simulation capability using MCNP has become available to this collaboration. The first results of this new capability will also be presented. In particular, we will show results of neutron and gamma-ray energy deposition and flux as a function of material depth, composition, density, geometry, and distance from the source (nuclear burst). We will also discuss the benefits and shortcomings of linear Monte Carlo. Finally, we will set the stage for the correct usage and limitations of these results in coupled radiation-hydrodynamic calculations (see Plesko et al, this conference).

  12. Use of a Novel Visual Metaphor Measure (PRISM) to Evaluate School Children's Perceptions of Natural Hazards, Sources of Hazard Information, Hazard Mitigation Organizations, and the Effectiveness of Future Hazard Education Programs in Dominica, Eastern Caribbean

    NASA Astrophysics Data System (ADS)

    Parham, M.; Day, S. J.; Teeuw, R. M.; Solana, C.; Sensky, T.

    2014-12-01

    This project aims to study the development of understanding of natural hazards (and of hazard mitigation) from the age of 11 to the age of 15 in secondary school children from 5 geographically and socially different schools on Dominica, through repeated interviews with the students and their teachers. These interviews will be coupled with a structured course of hazard education in the Geography syllabus; the students not taking Geography will form a control group. To avoid distortion of our results arising from the developing verbalization and literacy skills of the students over the 5 years of the project, we have adapted the PRISM tool used in clinical practice to assess patient perceptions of illness and treatment (Büchi & Sensky, 1999). This novel measure is essentially non-verbal, and uses spatial positions of moveable markers ("object" markers) on a board, relative to a fixed marker that represents the subject's "self", as a visual metaphor for the importance of the object to the subject. The subjects also explain their reasons for placing the markers as they have, to provide additional qualitative information. The PRISM method thus produces data on the perceptions measured on the board that can be subjected to statistical analysis, and also succinct qualitative data about each subject. Our study will gather data on participants' perceptions of different natural hazards, different sources of information about these, and organizations or individuals to whom they would go for help in a disaster, and investigate how these vary with geographical and social factors. To illustrate the method, which is generalisable, we present results from our initial interviews of the cohort of 11-year-olds whom we will follow through their secondary school education. Büchi, S., & Sensky, T. (1999). PRISM: Pictorial Representation of Illness and Self Measure: a brief nonverbal measure of illness impact and therapeutic aid in psychosomatic medicine. 
Psychosomatics, 40(4), 314-320.

  13. Use of a Novel Visual Metaphor Measure (PRISM) to Evaluate School Children's Perceptions of Natural Hazards, Sources of Hazard Information, Hazard Mitigation Organizations, and the Effectiveness of Future Hazard Education Programs in Dominica, Eastern Car

    NASA Astrophysics Data System (ADS)

    Parham, Martin; Day, Simon; Teeuw, Richard; Solana, Carmen; Sensky, Tom

    2015-04-01

    This project aims to study the development of understanding of natural hazards (and of hazard mitigation) from the age of 11 to the age of 15 in secondary school children from 5 geographically and socially different schools on Dominica, through repeated interviews with the students and their teachers. These interviews will be coupled with a structured course of hazard education in the Geography syllabus; the students not taking Geography will form a control group. To avoid distortion of our results arising from the developing verbalization and literacy skills of the students over the 5 years of the project, we have adapted the PRISM tool used in clinical practice to assess patient perceptions of illness and treatment (Buchi & Sensky, 1999). This novel measure is essentially non-verbal, and uses spatial positions of moveable markers ("object" markers) on a board, relative to a fixed marker that represents the subject's "self", as a visual metaphor for the importance of the object to the subject. The subjects also explain their reasons for placing the markers as they have, to provide additional qualitative information. The PRISM method thus produces data on the perceptions measured on the board that can be subjected to statistical analysis, and also succinct qualitative data about each subject. Our study will gather data on participants' perceptions of different natural hazards, different sources of information about these, and organizations or individuals to whom they would go for help in a disaster, and investigate how these vary with geographical and social factors. To illustrate the method, which is generalisable, we present results from our initial interviews of the cohort of 11 year olds whom we will follow through their secondary school education. Büchi, S., & Sensky, T. (1999). PRISM: Pictorial Representation of Illness and Self Measure: a brief nonverbal measure of illness impact and therapeutic aid in psychosomatic medicine. 
Psychosomatics, 40(4), 314-320.
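The quantitative core of the PRISM measure described in these two records is simply the self-to-object marker distance on the board. As a minimal illustration (the board dimensions and "self" position below are assumptions, not values from the studies), the distance can be normalized into a 0-1 salience score for statistical analysis:

```python
import math

BOARD_W, BOARD_H = 420.0, 297.0   # assumed A3 board, in mm
SELF_POS = (370.0, 148.5)         # assumed fixed position of the "self" marker

def prism_distance(object_pos, self_pos=SELF_POS):
    """Euclidean self-object distance: smaller = greater perceived importance."""
    return math.hypot(object_pos[0] - self_pos[0], object_pos[1] - self_pos[1])

def salience(object_pos):
    """Normalize distance to a 0-1 score (1 = marker placed on the 'self')."""
    return 1.0 - prism_distance(object_pos) / math.hypot(BOARD_W, BOARD_H)
```

Because the score is a pure distance, it can be compared across interview waves without depending on the students' verbal skills, which is the point of the method.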

  14. Identification, prediction, and mitigation of sinkhole hazards in evaporite karst areas

    USGS Publications Warehouse

    Gutierrez, F.; Cooper, A.H.; Johnson, K.S.

    2008-01-01

Sinkholes usually have a higher probability of occurrence and a greater genetic diversity in evaporite terrains than in carbonate karst areas. This is because evaporites have a higher solubility and, commonly, a lower mechanical strength. Subsidence damage resulting from evaporite dissolution generates substantial losses throughout the world, but the causes are only well understood in a few areas. To deal with these hazards, a phased approach is needed for sinkhole identification, investigation, prediction, and mitigation. Identification techniques include field surveys and geomorphological mapping combined with accounts from local people and historical sources. Detailed sinkhole maps can be constructed from sequential historical maps, recent topographical maps, and digital elevation models (DEMs) complemented with building-damage surveying, remote sensing, and high-resolution geodetic surveys. On a more detailed level, information from exposed paleosubsidence features (paleokarst), speleological explorations, geophysical investigations, trenching, dating techniques, and boreholes may help in investigating dissolution and subsidence features. Information on the hydrogeological pathways, including caves, springs, and swallow holes, is particularly important, especially when corroborated by tracer tests. These diverse data sources together make up a valuable database: the karst inventory. From this dataset, sinkhole susceptibility zonations (relative probability) may be produced based on the spatial distribution of the features and good knowledge of the local geology. Sinkhole distribution can be investigated by spatial distribution analysis techniques including studies of preferential elongation, alignment, and nearest neighbor analysis. More objective susceptibility models may be obtained by analyzing the statistical relationships between the known sinkholes and the conditioning factors. 
Chronological information on sinkhole formation is required to estimate the probability of occurrence of sinkholes (number of sinkholes/km² per year). Such spatial and temporal predictions, frequently derived from limited records and based on the assumption that past sinkhole activity may be extrapolated to the future, are non-corroborated hypotheses. Validation methods allow us to assess the predictive capability of the susceptibility maps and to transform them into probability maps. Avoiding the most hazardous areas by preventive planning is the safest strategy for development in sinkhole-prone areas. Corrective measures could be applied to reduce the dissolution activity and subsidence processes. A more practical solution for safe development is to reduce the vulnerability of the structures by using subsidence-proof designs. © 2007 Springer-Verlag.
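The nearest neighbor analysis mentioned in the abstract is commonly summarized with the Clark-Evans ratio, which compares the observed mean nearest-neighbour distance against the expectation for a random (Poisson) pattern. A minimal sketch, not taken from the paper, with edge corrections deliberately omitted:

```python
import math

def clark_evans_index(points, area_km2):
    """Clark-Evans nearest-neighbour ratio R = observed mean / expected mean.

    R ~ 1 suggests a random (Poisson) pattern, R < 1 clustering,
    R > 1 dispersion/regularity. `points` are (x, y) sinkhole centres
    in km; edge-effect corrections are omitted in this sketch.
    """
    n = len(points)
    nn_dists = []
    for i, (xi, yi) in enumerate(points):
        d = min(math.hypot(xi - xj, yi - yj)
                for j, (xj, yj) in enumerate(points) if j != i)
        nn_dists.append(d)
    observed = sum(nn_dists) / n
    density = n / area_km2
    expected = 1.0 / (2.0 * math.sqrt(density))  # Poisson expectation
    return observed / expected
```

Clustering (R well below 1) in a sinkhole inventory is often the statistical signature of a localized dissolution control, which is exactly what susceptibility zonation tries to capture.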

  15. Rio Soliette (haiti): AN International Initiative for Flood-Hazard Assessment and Mitigation

    NASA Astrophysics Data System (ADS)

    Gandolfi, S.; Castellarin, A.; Barbarella, M.; Brath, A.; Domeneghetti, A.; Brandimarte, L.; Di Baldassarre, G.

    2013-01-01

Catastrophic natural events are among the most critical threats to health and economies all around the world. Their impact, however, can be far more dramatic in a poor region than in wealthier countries. Isla Hispaniola (Haiti and the Dominican Republic), one of the poorest regions of the planet, has repeatedly been hit by catastrophic natural disasters that caused incalculable human and economic losses. After the catastrophic flood event that occurred in the basin of River Soliette on May 24th, 2004, the General Direction for Development and Cooperation of the Italian Department of Foreign Affairs funded an international cooperation initiative (ICI), coordinated by the University of Bologna, that involved Haitian and Dominican institutions. The main purpose of the ICI was a hydrological and hydraulic analysis of the May 2004 flood event, aimed at formulating a suitable and affordable flood risk mitigation plan consisting of structural and non-structural measures. In this context, a topographic survey was necessary to build the hydrological model and to improve knowledge of some areas that were candidates to be sites for mitigation measures. To overcome the difficulties arising from the narrowness of funds, few surveyors, and the limited time available for the survey, only GPS techniques were used, both for reference-frame aspects (using the PPP approach) and for the geometrical survey of the river by means of river cross-sections and detailed surveys in two areas (RTK technique). This allowed us to reconstruct both the river geometry and the DTMs of two expansion areas (useful for designing hydraulic solutions to mitigate flood hazard).
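Surveyed cross-sections like those described here typically feed a 1-D hydraulic model. As a minimal illustration (not the project's actual model), Manning's equation gives the discharge capacity of a section, here idealized as a trapezoid with an assumed roughness coefficient:

```python
import math

def trapezoid_geometry(bottom_width_m, depth_m, side_slope):
    """Flow area and wetted perimeter of a trapezoidal section (z:1 side slopes)."""
    area = (bottom_width_m + side_slope * depth_m) * depth_m
    perimeter = bottom_width_m + 2.0 * depth_m * math.sqrt(1.0 + side_slope ** 2)
    return area, perimeter

def manning_discharge(area_m2, wetted_perimeter_m, slope, n=0.035):
    """Manning's equation Q = (1/n) * A * R^(2/3) * sqrt(S), in m^3/s.

    n = 0.035 is an assumed roughness for a natural channel; `slope` is
    the dimensionless energy slope.
    """
    hydraulic_radius = area_m2 / wetted_perimeter_m
    return (1.0 / n) * area_m2 * hydraulic_radius ** (2.0 / 3.0) * math.sqrt(slope)
```

Running this over the range of surveyed depths shows which cross-sections limit conveyance, which is one way candidate sites for mitigation works are screened.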

  16. 2009 ERUPTION OF REDOUBT VOLCANO: Lahars, Oil, and the Role of Science in Hazards Mitigation (Invited)

    NASA Astrophysics Data System (ADS)

    Swenson, R.; Nye, C. J.

    2009-12-01

In March, 2009, Redoubt Volcano erupted for the third time in 45 years. More than 19 explosions produced ash plumes to 60,000 ft asl, lahars of mud and ice down the Drift River ~30 miles to the coast, and tephra fall of up to 1.5 mm onto surrounding communities. The eruption had severe impacts on many operations. Airlines were forced to cancel or divert hundreds of international and domestic passenger and cargo flights, and Anchorage International Airport closed for over 12 hours. Mudflows and floods down the Drift River to the coast impacted operations at the Drift River Oil Terminal (DROT), which was forced to shut down and ultimately be evacuated. Prior mitigation efforts to protect the DROT oil tank farm from potential impacts associated with a major eruptive event were successful, and none of the 148,000 barrels of oil stored at the facility was spilled or released. Nevertheless, the threat of continued eruptive activity at Redoubt, with the possibility of continued lahars down the Drift River alluvial fan, required that an incident command post be established so that the US Coast Guard, Alaska Dept. of Environmental Conservation, and the Cook Inlet Pipeline Company could coordinate a response to the potential hazards. Ultimately, the incident command team relied heavily on continuous real-time data updates from the Alaska Volcano Observatory, as well as continuous geologic interpretations and risk analysis by the USGS Volcanic Hazards group, the State Division of Geological and Geophysical Surveys, and the University of Alaska Geophysical Institute, all members of the collaborative effort of the Alaska Volcano Observatory. The great success story that unfolded attests to the efforts of the incident command team and their reliance on real-time analysis from scientific experts. 
The positive results also highlight how pre-disaster mitigation and monitoring efforts, in concert with hazards response planning, can be used in a cooperative industry / multi-agency effort to positively affect hazards mitigation. The final outcomes from this potentially disastrous event included: 1) no on-site personnel were injured; 2) no detrimental environmental impacts associated with the oil terminal occurred; and 3) incident command personnel, together with numerous industry representatives, were able to make well-informed, although costly, decisions that resulted in the safe removal of the oil from the storage facilities. The command team's efforts also furthered the process of restarting Cook Inlet oil production after a forced five-month shutdown.

  17. Bringing New Tools and Techniques to Bear on Earthquake Hazard Analysis and Mitigation

    NASA Astrophysics Data System (ADS)

    Willemann, R. J.; Pulliam, J.; Polanco, E.; Louie, J. N.; Huerta-Lopez, C.; Schmitz, M.; Moschetti, M. P.; Huerfano Moreno, V.; Pasyanos, M.

    2013-12-01

During July 2013, IRIS held an Advanced Studies Institute in Santo Domingo, Dominican Republic, that was designed to enable early-career scientists who already have mastered the fundamentals of seismology to begin collaborating in frontier seismological research. The Institute was conceived at a strategic planning workshop in Heredia, Costa Rica, that was supported and partially funded by USAID, with a goal of building geophysical capacity to mitigate the effects of future earthquakes. To address this broad goal, we drew participants from a dozen different countries of Middle America. Our objectives were to develop understanding of the principles of earthquake hazard analysis, particularly site characterization techniques, and to facilitate future research collaborations. The Institute was divided into three main sections: overviews on the fundamentals of earthquake hazard analysis and lectures on the theory behind methods of site characterization; fieldwork where participants acquired new data of the types typically used in site characterization; and computer-based analysis projects in which participants applied their newly-learned techniques to the data they collected. This was the first IRIS institute to combine an instructional short course with field work for data acquisition. Participants broke into small teams to acquire data, analyze it on their own computers, and then make presentations to the assembled group describing their techniques and results. Using broadband three-component seismometers, the teams acquired data for Spatial Auto-Correlation (SPAC) analysis at seven array locations, and Horizontal to Vertical Spectral Ratio (HVSR) analysis at 60 individual sites along six profiles throughout Santo Domingo. 
Using a 24-channel geophone string, the teams acquired data for Refraction Microtremor (SeisOptReMi™ from Optim) analysis at 11 sites, with supplementary data for active-source Multi-channel Spectral Analysis of Surface Waves (MASW) analysis at five of them. The results showed that teams quickly learned to collect high-quality data for each method of analysis. SPAC and refraction microtremor analysis each demonstrated that dispersion relations based on ambient noise and from arrays with an aperture of less than 200 meters could be used to determine the depth of a weak, disaggregated layer known to underlie the fast near-surface limestone terraces on which Santo Domingo is situated, and indicated the presence of unexpectedly strong rocks below. All three array methods concurred that most Santo Domingo sites have relatively high VS30 (average shear velocity to a depth of 30 m), generally at the NEHRP B-C hazard class boundary or higher. HVSR analysis revealed a general pattern of short resonance periods close to the coast, increasing with distance from the shoreline. In the east-west direction, significant variations were also evident at the highest elevation terrace and near the Ozama River. In terms of sub-soil conditions, the observed pattern of HVSR values departs from the expected increase in sediment thickness close to the coast.
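A note on the VS30 metric reported above: it is a travel-time (harmonic-thickness) average over the top 30 m, not an arithmetic mean of layer velocities. The sketch below, with hypothetical layer values, computes VS30 from a layered profile and maps it to a simplified NEHRP site class:

```python
def vs30(layers):
    """Time-averaged shear-wave velocity over the top 30 m: VS30 = 30 / sum(h_i / v_i).

    `layers` is a top-down list of (thickness_m, vs_m_per_s); a profile
    shallower than 30 m is extended using its deepest layer.
    """
    depth, travel_time = 0.0, 0.0
    for h, v in layers:
        h_used = min(h, 30.0 - depth)
        travel_time += h_used / v
        depth += h_used
        if depth >= 30.0:
            break
    if depth < 30.0:  # extend the deepest layer down to 30 m
        travel_time += (30.0 - depth) / layers[-1][1]
    return 30.0 / travel_time

def nehrp_class(vs30_m_per_s):
    """Simplified NEHRP site classes by VS30 (m/s); B-C boundary at 760 m/s."""
    if vs30_m_per_s > 1500.0:
        return "A"
    if vs30_m_per_s > 760.0:
        return "B"
    if vs30_m_per_s > 360.0:
        return "C"
    if vs30_m_per_s > 180.0:
        return "D"
    return "E"
```

The harmonic averaging means a thin slow layer drags VS30 down disproportionately, which is why the weak disaggregated layer beneath the limestone terraces matters even where the surface rock is fast.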

  18. Coupling Radar Rainfall Estimation and Hydrological Modelling For Flash-flood Hazard Mitigation

    NASA Astrophysics Data System (ADS)

    Borga, M.; Creutin, J. D.

Flood risk mitigation is accomplished through managing either or both the hazard and vulnerability. Flood hazard may be reduced through structural measures which alter the frequency of flood levels in the area. The vulnerability of a community to flood loss can be mitigated through changing or regulating land use and through flood warning and effective emergency response. When dealing with flash-flood hazard, it is generally accepted that the most effective way (and in many instances the only affordable one in a sustainable perspective) to mitigate the risk is by reducing the vulnerability of the involved communities, in particular by implementing flood warning systems and community self-help programs. However, both the inherent characteristics of the atmospheric and hydrologic processes involved in flash-flooding and the changing societal needs provide a tremendous challenge to traditional flood forecasting and warning concepts. In fact, the targets of these systems are traditionally localised, like urbanised sectors or hydraulic structures. Given the small spatial scale that characterises flash floods and the development of dispersed urbanisation, transportation, green tourism and water sports, human lives and property are exposed to flash flood risk in a scattered manner. This must be taken into consideration in flash flood warning strategies: the investigated region should be considered as a whole and every section of the drainage network as a potential target for hydrological warnings. Radar technology offers the potential to provide information describing rain intensities almost continuously in time and space. Recent research results indicate that coupling radar information to distributed hydrologic modelling can provide hydrologic forecasts at all potentially flooded points of a region. Nevertheless, very few flood warning services use radar data more than on a qualitative basis. 
After a short review of current understanding in this area, two issues are examined: the advantages and caveats of using radar rainfall estimates in operational flash flood forecasting, and the methodological problems associated with the use of hydrological models for distributed flash flood forecasting with rainfall input estimated from radar.
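Quantitative use of radar rainfall in a hydrologic model starts from a reflectivity-to-rain-rate conversion. A minimal sketch, assuming the classic Marshall-Palmer power-law coefficients (a = 200, b = 1.6), which operational services usually recalibrate per event or climate:

```python
def rain_rate_from_dbz(dbz, a=200.0, b=1.6):
    """Invert the Z-R power law Z = a * R**b for rain rate R in mm/h.

    `dbz` is radar reflectivity in dBZ, with Z = 10**(dbz/10) in mm^6/m^3.
    a = 200, b = 1.6 (Marshall-Palmer) are assumed defaults, not universal.
    """
    z = 10.0 ** (dbz / 10.0)
    return (z / a) ** (1.0 / b)
```

Because the relation is a power law, calibration errors in `a` and `b` are amplified at high reflectivities, which is one of the caveats for operational flash flood forecasting that the abstract alludes to.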

  19. Earth sciences, GIS and geomatics for natural hazards assessment and risks mitigation: a civil protection perspective

    NASA Astrophysics Data System (ADS)

    Perotti, Luigi; Conte, Riccardo; Lanfranco, Massimo; Perrone, Gianluigi; Giardino, Marco; Ratto, Sara

    2010-05-01

Geo-information and remote sensing are effective tools to enhance functional strategies for increasing awareness of natural hazards and risks and for supporting research and operational activities devoted to disaster reduction. An improved Earth Sciences knowledge base coupled with advanced Geomatics technologies has been developed by the joint research group and applied by the ITHACA (Information Technology for Humanitarian Assistance, Cooperation and Action) centre, within its partnership with the UN World Food Programme (WFP), with the goal of reducing human, social, economic and environmental losses due to natural hazards and related disasters. By cooperating with local and regional authorities (Municipalities, Centro Funzionale of the Aosta Valley, Civil Protection Agency of Regione Piemonte), data on natural hazards and risks have been collected, compared to national and global data, and then interpreted to help communities and civil protection agencies of sensitive mountain regions make strategic choices and decisions for better mitigation and adaptation measures. To enhance the application of GIS and remote-sensing technologies for geothematic mapping of geological and geomorphological risks in mountain territories of Europe and developing countries, research activities led to the collection and evaluation of data from scientific literature and historical technical archives, for the definition of predisposing/triggering factors and evolutionary processes of natural instability phenomena (landslides, floods, storms, …) and for the design and implementation of early-warning and early-impact systems. 
Geodatabases, Remote Sensing and Mobile-GIS applications were developed to perform analysis of: 1) a large climate-related disaster (Hurricane Mitch, Central America), by the application of remote sensing techniques, either for early warning or for mitigation measures at the national and international scale; 2) the distribution of slope instabilities at the regional scale (Aosta Valley, NW Italy), for prevention and recovery measures; 3) geological and geomorphological controlling factors of seismicity, to provide microzonation maps and scenarios for the co-seismic response of unstable zones (Dronero, NW Italian Alps); 4) earthquake effects on the ground and infrastructures, in order to provide early assessments for situational awareness and to compile damage inventories (Asti-Alessandria seismic events, 2000, 2001, 2003). The research results have substantiated early-warning models by structuring geodatabases on natural disasters, and have supported humanitarian relief and disaster management activities through the creation and testing of SRG2, a mobile-GIS application for field data collection on natural hazards and risks.

  20. Solutions Network Formulation Report. NASA's Potential Contributions using ASTER Data in Marine Hazard Mitigation

    NASA Technical Reports Server (NTRS)

    Fletcher, Rose

    2010-01-01

The 28-foot storm surge from Hurricane Katrina pushed inland along bays and rivers for a distance of 12 miles in some areas, contributing to the damage or destruction of about half of the fleet of boats in coastal Mississippi. Most of those boats had sought refuge in back bays and along rivers. Some boats were spared damage because the owners chose their mooring site well. Gulf mariners need a spatial analysis tool that provides guidance on the safest places to anchor their boats during future hurricanes. This product would support NOAA's mission to minimize the effects of coastal hazards through awareness, education, and mitigation strategies and could be incorporated into the Coastal Risk Atlas decision support tool.

  1. Rockfall hazard assessment, risk quantification, and mitigation options for reef cove resort development, False Cape, Queensland, Australia

    NASA Astrophysics Data System (ADS)

    Schlotfeldt, P.

    2009-04-01

GIS and 2-D rock fall simulations were used as the primary tools during a rock fall hazard assessment and analyses for a major resort and township development near Cairns, Queensland in Australia. The methods used included 1) the development of a digital elevation model (DEM); 2) rock fall trajectory analyses to determine the end points of rockfalls and the distribution of kinetic energy for identified rock fall runout Zones; and 3) event tree analyses based on a synthesis of all data in order to establish Zones with the highest risk of fatalities. This paper describes the methodology used and the results of this work. Recommendations to mitigate the hazard included exclusion zones with no construction, scaling (including trim blasting), construction of berms, and rockfall catch fences. Keywords: GIS, rockfall simulation, rockfall runout Zones, mitigation options INTRODUCTION False Cape is located on the east side of the Trinity Inlet near Cairns (Figure 1). Construction is underway for a multi-million dollar development close to the beach front. The development will ultimately cover about 1.5 km of prime coast line. The granite slopes above the development are steep and are covered with a number of large, potentially unstable boulders. Sheet jointing is present in the in-situ bedrock, and this, combined with other tectonic joint sets, has provided a key mechanism for large slides down slope on exposed bedrock. With each rock fall (evidenced by boulders strewn in gullies, over the lower parts of the slope, and on the beach) the failure mechanism migrates upslope. In order for the developer to proceed with construction he needs to mitigate the identified rock fall hazard. The methods used to study the hazard and key findings are presented in this paper. Discussion is provided in the conclusion on mitigation options. KEY METHODS USED TO STUDY THE HAZARD In summary the methods used to study the hazard for the False Cape project include: 1. 
The development of a digital elevation model (DEM) used to delineate rock fall runout Zones [1], including the spatial location of boulder fields mapped within Zones (Figure 2). A Zone is defined as an area above the development on steep-sided slopes where falling rocks are channeled into gullies and/or are contained between topographic features such as ridges and spurs that extend down the mountainside. These natural barriers generally ensure that falling rocks do not fall or roll into adjacent Zones; 2. The use of the 'Flow Path Tracing Tool' in ArcGIS Spatial Analyst to confirm typical descents of boulders in Zones. These were shown to correlate strongly with the endpoints of boulders observed within the development and with major clusters of boulders on the beach front; 3. The use of 2-D rockfall trajectory analyses [2] on sections cut along typical 3-D trajectory paths mapped out in ArcGIS per Zone. Sections along typical paths in Zones simulated, to some degree, the 3-D effect or path of rocks as they bounce and roll down slope (Figure 3); 4. The calibration of rockfall input parameters (coefficients of normal and tangential restitution, slope roughness, friction angle, etc.) using field-identified endpoints and sizes of fallen rocks and boulders; and 5. Undertaking risk evaluations in order to quantify the potential risk for each independent rockfall Zone. KEY FINDINGS FROM THE STUDIES The key findings from the study include: 1. Multiple potentially unstable in-situ boulders (some in excess of several thousand tonnes) are present above the development. 2. Similar geological structures (dykes, jointing, etc.) are present in the boulders on the beach front and in the in-situ bedrock exposed above the development. Measurement and comparison of the orientation of these geological structures in the boulders with those observed in the in-situ bedrock provided strong evidence that the boulders have migrated down slope. 3. 
Eight discrete Rockfall Runout Zones were identified using the digital elevation model set up in ArcGIS (Figure 4). The boundaries were field-verified as far as possible. The identified Zones formed the basis of all subsequent work. 4. Once calibrated, the rockfall trajectory modeling showed that only between 1% and, in the worst case, 28% of falling rocks (percentage of 1000 seeding events) per Zone would actually reach the development. While this indicated a reduced likelihood of an incident and hence the risk, the kinetic energy in the case of an impact in most Zones was so high (for the given design block size) that the consequence would be untenable without some form of mitigation. 5. An event tree analysis showed that five out of the eight Zones identified had risk profiles that fell above or very close to what was considered to be an acceptable annual probability of occurrence of a fatality or fatalities. CONCLUSIONS Each Zone has unique characteristics that influence the risk profile associated with the rock fall hazard to the development. Mitigation options and recommendations needed to be adjusted accordingly to fit the physical characteristics and assessed risk profile of each Zone. These included: 1. The possible implementation of exclusion zones (no-build areas); 2. Scaling (including controlled blasting) to reduce the potential kinetic energy associated with identified potentially unstable boulders; and 3. The design and construction of berms and rockfall catch fences.
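A 2-D lumped-mass trajectory model of the kind used in method 3 can be sketched in a few lines. The planar slope, restitution coefficients, launch conditions, and stopping criterion below are illustrative assumptions, not the calibrated values from this study:

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def simulate_rockfall(slope_deg=35.0, rn=0.35, rt=0.85, mass_kg=1000.0,
                      v0=(1.0, 0.0), dt=0.002, max_t=60.0):
    """Lumped-mass 2-D rockfall bounce model on the planar slope y = -x*tan(theta).

    rn, rt: assumed normal/tangential restitution coefficients.
    Returns (runout distance x in m, peak kinetic energy in J).
    """
    s = math.tan(math.radians(slope_deg))
    norm = math.hypot(s, 1.0)
    nx, ny = s / norm, 1.0 / norm   # unit normal pointing up away from the slope
    x, y = 0.0, 0.1                 # start just above the crest
    vx, vy = v0
    peak_ke, t = 0.0, 0.0
    while t < max_t:
        vy -= G * dt                # ballistic flight between contacts
        x += vx * dt
        y += vy * dt
        peak_ke = max(peak_ke, 0.5 * mass_kg * (vx * vx + vy * vy))
        if y < -x * s:              # contact with the slope surface
            vn = vx * nx + vy * ny              # normal velocity component
            vtx, vty = vx - vn * nx, vy - vn * ny
            vx = vtx * rt - vn * rn * nx        # damp both components, reverse normal
            vy = vty * rt - vn * rn * ny
            y = -x * s
            if math.hypot(vx, vy) < 0.5:        # rock has effectively stopped
                break
        t += dt
    return x, peak_ke
```

Seeding many such runs per Zone and recording endpoints and peak kinetic energies is, in essence, how the percentage-reaching-development and design-energy figures quoted above are produced.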

  2. Making the Handoff from Earthquake Hazard Assessments to Effective Mitigation Measures (Invited)

    NASA Astrophysics Data System (ADS)

    Applegate, D.

    2010-12-01

This year has witnessed a barrage of large earthquakes worldwide with the resulting damages ranging from inconsequential to truly catastrophic. We cannot predict when earthquakes will strike, but we can build communities that are resilient to strong shaking as well as to secondary hazards such as landslides and liquefaction. The contrasting impacts of the magnitude-7 earthquake that struck Haiti in January and the magnitude-8.8 event that struck Chile in April underscore the difference that mitigation and preparedness can make. In both cases, millions of people were exposed to severe shaking, but deaths in Chile were measured in the hundreds rather than the hundreds of thousands that perished in Haiti. Numerous factors contributed to these disparate outcomes, but the most significant is the presence of strong building codes in Chile and their total absence in Haiti. The financial cost of the Chilean earthquake still represents an unacceptably high percentage of that nation's gross domestic product, a reminder that life safety is the paramount, but not the only, goal of disaster risk reduction measures. For building codes to be effective, both in terms of lives saved and economic cost, they need to reflect the hazard as accurately as possible. As one of four federal agencies that make up the congressionally mandated National Earthquake Hazards Reduction Program (NEHRP), the U.S. Geological Survey (USGS) develops national seismic hazard maps that form the basis for seismic provisions in model building codes through the Federal Emergency Management Agency and private-sector practitioners. This cooperation is central to NEHRP, which both fosters earthquake research and establishes pathways to translate research results into implementation measures. That translation depends on the ability of hazard-focused scientists to interact and develop mutual trust with risk-focused engineers and planners. 
Strengthening that interaction is an opportunity for the next generation of earthquake scientists and engineers. In addition to the national maps, the USGS produces more detailed urban seismic hazard maps that communities have used to prioritize retrofits and design critical infrastructure that can withstand large earthquakes. At a regional scale, the USGS and its partners in California have developed a time-dependent earthquake rupture forecast that is being used by the insurance sector, which can serve to distribute risk and foster mitigation if the right incentives are in place. What the USGS and partners are doing at the urban, regional, and national scales, the Global Earthquake Model project is seeking to do for the world. A significant challenge for engaging the public to prepare for earthquakes is making low-probability, high-consequence events real enough to merit personal action. Scenarios help by starting with the hazard posed by a specific earthquake and then exploring the fragility of the built environment, cascading failures, and the real-life consequences for the public. To generate such a complete picture takes multiple disciplines working together. Earthquake scenarios are being used both for emergency management exercises and much broader public preparedness efforts like the Great California ShakeOut, which engaged nearly 7 million people.

  3. Probing Aircraft Flight Test Hazard Mitigation for the Alternative Fuel Effects on Contrails & Cruise Emissions (ACCESS) Research Team

    NASA Technical Reports Server (NTRS)

    Kelly, Michael J.

    2013-01-01

The Alternative Fuel Effects on Contrails & Cruise Emissions (ACCESS) Project Integration Manager requested in July 2012 that the NASA Engineering and Safety Center (NESC) form a team to independently assess aircraft structural failure hazards associated with the ACCESS experiment and to identify potential flight test hazard mitigations to ensure flight safety. The ACCESS Project Integration Manager subsequently requested that the assessment scope be focused predominantly on structural failure risks to the aircraft empennage.

  4. Evaluation Of Risk And Possible Mitigation Schemes For Previously Unidentified Hazards

    NASA Technical Reports Server (NTRS)

    Linzey, William; McCutchan, Micah; Traskos, Michael; Gilbrech, Richard; Cherney, Robert; Slenski, George; Thomas, Walter, III

    2006-01-01

This report presents the results of arc track testing conducted to determine if such a transfer of power to un-energized wires is possible and/or likely during an arcing event, and to evaluate an array of protection schemes that may significantly reduce the possibility of such a transfer. The results of these experiments may be useful for determining the level of protection necessary to guard against spurious voltage and current being applied to safety critical circuits. It was not the purpose of these experiments to determine the probability of the initiation of an arc track event, but rather to determine whether, if an initiation did occur, it could cause the undesired event: an inadvertent thruster firing. The primary wire insulation used in the Orbiter is aromatic polyimide, or Kapton, a construction known to arc track under certain conditions [3]. Previous Boeing testing has shown that arc tracks can initiate in aromatic polyimide insulated 28 volts direct current (VDC) power circuits using more realistic techniques such as chafing with an aluminum blade (simulating the corner of an avionics box or lip of a wire tray), or vibration of an aluminum plate against a wire bundle [4]. Therefore, an arc initiation technique was chosen that provided a reliable and consistent means of starting the arc rather than a realistic simulation of a scenario on the vehicle. Once an arc is initiated, the current, power and propagation characteristics of the arc depend on the power source, wire gauge and insulation type, circuit protection and series resistance rather than on the type of initiation. The initiation method employed for these tests was applying an oil and graphite mixture to the ends of a powered twisted pair wire. The flight configuration of the heater circuits, the fuel/oxidizer (or ox) wire, and the RCS jet solenoid were modeled in the test configuration so that the behavior of these components during an arcing event could be studied. 
To determine if coil activation would occur with various wire protection schemes, 145 tests were conducted using various fuel/ox wire alternatives (shielded and unshielded) and/or different combinations of polytetrafluoroethylene (PTFE), Mystik tape, and convoluted wraps to prevent unwanted coil activation. Test results were evaluated along with other pertinent data and information to develop a mitigation strategy for an inadvertent RCS firing. The SSP evaluated civilian aircraft wiring failures to search for aging trends in assessing the wire-short hazard. Appendix 2 applies Weibull statistical methods to the same data with a similar purpose.
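Weibull analyses of the kind applied in Appendix 2 are commonly done by median-rank regression on ordered failure times. The sketch below is illustrative only: the failure ages are invented, not the SSP's aircraft wiring data, and the fit recovers a shape parameter β whose value signals wear-out (β > 1) versus essentially random (β ≈ 1) failures.

```python
import math

def weibull_rank_regression(failure_times):
    """Fit a 2-parameter Weibull by median-rank regression:
    ln(-ln(1 - F_i)) = beta*ln(t_i) - beta*ln(eta)."""
    t = sorted(failure_times)
    n = len(t)
    xs, ys = [], []
    for i, ti in enumerate(t, start=1):
        F = (i - 0.3) / (n + 0.4)           # Benard's median-rank approximation
        xs.append(math.log(ti))
        ys.append(math.log(-math.log(1.0 - F)))
    # ordinary least squares for slope (beta) and intercept
    mx = sum(xs) / n
    my = sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    beta = sxy / sxx
    eta = math.exp(-(my - beta * mx) / beta)  # characteristic life
    return beta, eta

# synthetic "time-to-failure" ages (years), for illustration only
beta, eta = weibull_rank_regression([8, 12, 15, 19, 22, 26, 29, 33])
print(f"shape beta = {beta:.2f}, scale eta = {eta:.1f} years")
# beta > 1 would indicate a wear-out (aging) trend rather than random failures
```

In an aging study, the interest is precisely whether β exceeds 1 with any confidence; a full analysis would also handle censored (still-working) items, which rank regression as written does not.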

  5. Using Robust Decision Making to Assess and Mitigate the Risks of Natural Hazards in Developing Countries

    NASA Astrophysics Data System (ADS)

    Kalra, N.; Lempert, R. J.; Peyraud, S.

    2012-12-01

    Ho Chi Minh City (HCMC) ranks fourth globally among coastal cities most vulnerable to climate change and already experiences extensive routine flooding. In the coming decades, increased precipitation, rising sea levels, and land subsidence could permanently inundate a large portion of the city's population, place the poor at particular risk, and threaten new economic development in low-lying areas. HCMC is not alone in facing the impacts of natural hazards exacerbated by uncertain future climate change, development, and other deep uncertainties. Assessing and managing these risks is a tremendous challenge, particularly in developing countries which face pervasive shortages of the data and models generally used to plan for such changes. Using HCMC as a case study, this talk will demonstrate how a scenario-based approach that uses robustness as a decision and planning element can help developing countries assess future climate risk and manage the risk of natural disasters. In contrast to traditional approaches which treat uncertainty with a small number of handcrafted scenarios, this talk will emphasize how robust decision making, which uses modeling to explore over thousands of scenarios, can identify potential vulnerabilities to HCMC's emerging flood risk management strategy and suggest potential responses. The talk will highlight several novel features of the collaboration with the HCMC Steering Committee for Flood Control. First, it examines several types of risk -- risk to the poor, risk to the non-poor, and risk to the economy -- and illustrates how management policies have different implications for these sectors. Second, it demonstrates how diverse and sometimes incomplete climate, hydrologic, socioeconomic, GIS, and other data and models can be integrated into a modeling framework to develop and evaluate many scenarios of flood risk. 
Third, it illustrates the importance of non-structural policies such as land use management and building design to manage flood risk. Finally, it demonstrates how an adaptive management strategy that evolves over time and implements management options in response to new information can more effectively mitigate risks from natural disasters than can static policies. (Figure: a scatter plot of risk to the poor and non-poor in 1,000 different scenarios under eight different risk management options, differentiated by color.)
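The scenario-exploration step of robust decision making can be sketched in a few lines: sample many futures over the deep uncertainties, evaluate a candidate policy in each, and then inspect the scenarios where the policy fails to find the vulnerable region. Everything below is a toy stand-in for the idea, not the HCMC model: the uncertainty ranges, the flood-depth proxy, and the dike policy are all invented.

```python
import random

random.seed(42)

def flood_depth(sea_rise_m, subsidence_m, precip_factor):
    # toy proxy for flood depth over a datum; not a hydrodynamic model
    return sea_rise_m + subsidence_m + 0.3 * (precip_factor - 1.0)

def evaluate_policy(dike_height_m, n_scenarios=5000):
    """Explore many futures and return the fraction in which the policy
    fails (flood depth exceeds the dike), plus the failing scenarios."""
    failures = []
    for _ in range(n_scenarios):
        s = (random.uniform(0.0, 1.0),    # sea-level rise by horizon (m)
             random.uniform(0.0, 0.5),    # cumulative subsidence (m)
             random.uniform(0.9, 1.4))    # precipitation scaling factor
        if flood_depth(*s) > dike_height_m:
            failures.append(s)
    return len(failures) / n_scenarios, failures

frac, failing = evaluate_policy(dike_height_m=1.2)
print(f"policy fails in {frac:.1%} of scenarios")
# inspecting `failing` shows which combinations (e.g. high subsidence plus
# high sea-level rise) drive the vulnerability, suggesting adaptive triggers
```

The contrast with a handful of handcrafted scenarios is that the failing set itself, rather than an analyst's prior, characterizes the conditions under which the strategy needs to adapt.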

  6. Seismic Hazard and Risk Assessment in Multi-Hazard Prone Urban Areas: The Case Study of Cologne, Germany

    NASA Astrophysics Data System (ADS)

    Tyagunov, S.; Fleming, K.; Parolai, S.; Pittore, M.; Vorogushyn, S.; Wieland, M.; Zschau, J.

    2012-04-01

    Most hazard and risk assessment studies analyze and represent different kinds of hazards and risks separately, although risk assessment and mitigation programs in multi-hazard prone urban areas should take into consideration possible interactions of different hazards. This is particularly true for communities located in seismically active zones, where, on the one hand, earthquakes are capable of triggering other types of hazards, while, on the other hand, the temporal coincidence or succession of different hazardous events may influence the vulnerability of the existing built environment and, correspondingly, the level of the total risk. Therefore, possible inter-dependencies and inter-influences of different hazards should be reflected properly in the hazard, vulnerability and risk analyses. This work presents some methodological aspects and preliminary results of a study being implemented within the framework of the MATRIX (New Multi-Hazard and Multi-Risk Assessment Methods for Europe) project. One of the test cases of the MATRIX project is the city of Cologne, which is one of the largest cities of Germany. The area of Cologne, being exposed to windstorm, flood and earthquake hazards, has already been considered in comparative risk assessments. However, possible interactions of these different hazards have been neglected. The present study is aimed at the further development of a holistic multi-risk assessment methodology, taking into consideration possible time coincidence and inter-influences of flooding and earthquakes in the area.

  7. Web-Based Geospatial Tools to Address Hazard Mitigation, Natural Resource Management, and Other Societal Issues

    USGS Publications Warehouse

    Hearn, Paul P., Jr.

    2009-01-01

    Federal, State, and local government agencies in the United States face a broad range of issues on a daily basis. Among these are natural hazard mitigation, homeland security, emergency response, economic and community development, water supply, and health and safety services. The U.S. Geological Survey (USGS) helps decision makers address these issues by providing natural hazard assessments, information on energy, mineral, water and biological resources, maps, and other geospatial information. Increasingly, decision makers at all levels are challenged not by the lack of information, but by the absence of effective tools to synthesize the large volume of data available, and to utilize the data to frame policy options in a straightforward and understandable manner. While geographic information system (GIS) technology has been widely applied to this end, systems with the necessary analytical power have been usable only by trained operators. The USGS is addressing the need for more accessible, manageable data tools by developing a suite of Web-based geospatial applications that will incorporate USGS and cooperating partner data into the decision making process for a variety of critical issues. Examples of Web-based geospatial tools being used to address societal issues follow.

  8. Volcano Hazard Tracking and Disaster Risk Mitigation: A Detailed Gap Analysis from Data-Collection to User Implementation

    NASA Astrophysics Data System (ADS)

    Faied, D.; Sanchez, A.

    2009-04-01

    While numerous global initiatives exist to address the potential hazards posed by volcanic eruption events and assess impacts from a civil security viewpoint, there does not yet exist a single, unified, international system of early warning and hazard tracking for eruptions. Numerous gaps exist in the risk reduction cycle, from data collection, to data processing, and finally dissemination of salient information to relevant parties. As part of the 2008 International Space University's Space Studies Program, a detailed gap analysis of the state of volcano disaster risk reduction was undertaken, and this paper presents the principal results. This gap analysis considered current sensor technologies, data processing algorithms, and utilization of data products by various international organizations. Recommendations for strategies to minimize or eliminate certain gaps are also provided. To address these gaps, a system-level framework evolved. This framework, known as VIDA, is a tool to develop user requirements for civil security in hazardous contexts, and a candidate system concept for a detailed design phase. VIDA also offers substantial educational potential: the framework includes a centralized clearinghouse for volcanology data which could support education at a variety of levels. Basic geophysical data, satellite maps, and raw sensor data are combined and accessible in a way that allows the relationships between these data types to be explored and used in a training environment. Such a resource naturally lends itself to research efforts in the subject but also research in operational tools, system architecture, and human/machine interaction in civil protection or emergency scenarios.

  9. Field Guide for Testing Existing Photovoltaic Systems for Ground Faults and Installing Equipment to Mitigate Fire Hazards

    SciTech Connect

    Brooks, William; Basso, Thomas; Coddington, Michael

    2015-10-01

    Ground faults and arc faults are the two most common reasons for fires in photovoltaic (PV) arrays and methods exist that can mitigate the hazards. This report provides field procedures for testing PV arrays for ground faults, and for implementing high resolution ground fault and arc fault detectors in existing and new PV system designs.

  10. 75 FR 29569 - Recovery Policy RP9526.1, Hazard Mitigation Funding Under Section 406 (Stafford Act)

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-05-26

    ... SECURITY Federal Emergency Management Agency Recovery Policy RP9526.1, Hazard Mitigation Funding Under Section 406 (Stafford Act) AGENCY: Federal Emergency Management Agency, DHS. ACTION: Notice of... help build disaster-resistant communities. FEMA has revised this policy to reflect the alignment...

  11. The Identification of Filters and Interdependencies for Effective Resource Allocation: Coupling the Mitigation of Natural Hazards to Economic Development.

    NASA Astrophysics Data System (ADS)

    Agar, S. M.; Kunreuther, H.

    2005-12-01

    Policy formulation for the mitigation and management of risks posed by natural hazards requires that governments confront difficult decisions for resource allocation and be able to justify their spending. Governments also need to recognize when spending offers little improvement and the circumstances in which relatively small amounts of spending can make substantial differences. Because natural hazards can have detrimental impacts on local and regional economies, patterns of economic development can also be affected by spending decisions for disaster mitigation. This paper argues that by mapping interdependencies among physical, social and economic factors, governments can improve resource allocation to mitigate the risks of natural hazards while improving economic development on local and regional scales. Case studies of natural hazards in Turkey have been used to explore specific "filters" that act to modify short- and long-term outcomes. Pre-event filters can prevent an event from becoming a natural disaster or change a routine event into a disaster. Post-event filters affect both short and long-term recovery and development. Some filters cannot be easily modified by spending (e.g., rural-urban migration) but others (e.g., land-use practices) provide realistic spending targets. Net social benefits derived from spending, however, will also depend on the ways by which filters are linked, or so-called "interdependencies". A single weak link in an interdependent system, such as a power grid, can trigger a cascade of failures. Similarly, weak links in social and commercial networks can send waves of disruption through communities. Conversely, by understanding the positive impacts of interdependencies, spending can be targeted to maximize net social benefits while mitigating risks and improving economic development. 
Detailed information on public spending was not available for this study, but case studies illustrate how networks of interdependent filters can modify social benefits and costs. For example, spending after the 1992 Erzincan earthquake targeted local businesses, but limited alternative employment, labor losses, and diminished local markets all contributed to economic stagnation. Spending after the 1995 Dinar earthquake provided rent subsidies, supporting a major exodus from the town. Consequently, many local people were excluded from reconstruction decisions and benefits offered by reconstruction funds. After the 1999 Marmara earthquakes, a 3-year economic decline in Yalova illustrates the vulnerability of local economic stability to weak regulation enforcement by a few agents. A resource allocation framework indicates that government-community relations, lack of economic diversification, beliefs, and compensation are weak links for effective spending. Stronger positive benefits could be achieved through spending to target land-use regulation enforcement, labor losses, time-critical needs of small businesses, and infrastructure. While the impacts of the Marmara earthquakes were devastating, strong commercial networks and international interests helped to re-establish the regional economy. Interdependencies may have helped to drive a recovery. Smaller events in eastern Turkey, however, can wipe out entire communities and can have long-lasting impacts on economic development. These differences may accelerate rural to urban migration and perpetuate regional economic divergence in the country. 1: Research performed in the Wharton MBA Program, Univ. of Pennsylvania.
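The cascade-through-a-weak-link idea can be made concrete with a minimal dependency graph: a failure propagates to every node that critically depends on an already-failed node. The graph and node names below are hypothetical, chosen only to illustrate why one weak link can disrupt a whole network while another fails in isolation.

```python
from collections import deque

# hypothetical interdependency graph: an edge u -> v means
# "if u fails, v loses a critical input and also fails"
dependents = {
    "power_grid":         ["water_pumps", "telecom"],
    "water_pumps":        ["hospital"],
    "telecom":            ["emergency_dispatch"],
    "road_bridge":        [],   # nothing depends on it: failure stays local
    "hospital":           [],
    "emergency_dispatch": [],
}

def cascade(initial_failure):
    """Return every node that fails once `initial_failure` goes down
    (breadth-first propagation along dependency edges)."""
    failed = {initial_failure}
    queue = deque([initial_failure])
    while queue:
        u = queue.popleft()
        for v in dependents.get(u, []):
            if v not in failed:
                failed.add(v)
                queue.append(v)
    return failed

print(cascade("power_grid"))   # one weak link takes out 5 of the 6 nodes
print(cascade("road_bridge"))  # an isolated node fails alone
```

Targeting spending at the nodes whose failure sets are largest is one simple formalization of "spending targeted to maximize net social benefits".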

  12. Challenges in understanding, modelling, and mitigating Lake Outburst Flood Hazard: experiences from Central Asia

    NASA Astrophysics Data System (ADS)

    Mergili, Martin; Schneider, Demian; Andres, Norina; Worni, Raphael; Gruber, Fabian; Schneider, Jean F.

    2010-05-01

    Lake Outburst Floods can evolve from complex process chains like avalanches of rock or ice that produce flood waves in a lake which may overtop and eventually breach glacial, morainic, landslide, or artificial dams. Rising lake levels can lead to progressive incision and destabilization of a dam, to enhanced ground water flow (piping), or even to hydrostatic failure of ice dams which can cause sudden outflow of accumulated water. These events often have a highly destructive potential because a large amount of water is released in a short time, with a high capacity to erode loose debris, leading to a powerful debris flow with a long travel distance. The best-known example of a lake outburst flood is the Vajont event (Northern Italy, 1963), where a landslide rushed into an artificial lake which spilled over and caused a flood leading to almost 2000 fatalities. Hazards from the failure of landslide dams are often (though not always) fairly manageable: most breaches occur in the first few days or weeks after the landslide event, and the rapid construction of a spillway - though problematic - has solved some hazardous situations (e.g. in the case of the Hattian landslide in 2005 in Pakistan). Older dams, like Usoi dam (Lake Sarez) in Tajikistan, are usually fairly stable, though landslides into the lakes may create flood waves overtopping and eventually weakening the dams. The analysis and the mitigation of glacial lake outburst flood (GLOF) hazard remains a challenge. A number of GLOFs resulting in fatalities and severe damage have occurred during the previous decades, particularly in the Himalayas and in the mountains of Central Asia (Pamir, Tien Shan). The source area is usually far away from the area of impact and events occur at very long intervals or as singularities, so that the population at risk is usually not prepared. 
Even though potentially hazardous lakes can be identified relatively easily with remote sensing and field work, modeling and prediction of GLOFs (and also the outburst of landslide-dammed lakes) remains a challenge: • The knowledge about the onset of the process is often limited (bathymetry of the lakes, subsurface water, properties of the dam (ice content), type of dam breach, understanding of process chains and interactions). • The size of glacial lakes may change rapidly but continuously, and many lakes break out within a short time after their development. Continuous monitoring is therefore required to keep updated on the existing hazards. • Even the outburst of small glacial lakes may lead to significant debris floods or even debris flows if there is plenty of erodible material available. • The available modeling software packages are of limited suitability for lake outburst floods: e.g. software developed by the hydrological community is specialized to simulate (debris) floods with input hydrographs on moderately steep flow channels and with lower sediment loads. In contrast to this, programs for rapid mass movements are better suited to steeper slopes and a sudden onset of the movement. The typical characteristics of GLOFs are in between and vary for different channel sections. In summary, the major bottlenecks remain in deriving realistic or worst case scenarios and predicting their magnitude and area of impact. This mainly concerns uncertainties in the dam break process, involved volumes, erosion rates, changing rheologies, and the limited capabilities of available software packages to simulate process interactions and transformations such as the development of a hyperconcentrated flow into a debris flow. In addition, many areas prone to lake outburst floods are located in developing countries with a limited scope of the threatened population for decision-making and limited resources for mitigation.
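A first-cut outburst hydrograph is sometimes estimated by treating a fixed breach as a broad-crested weir and integrating the lake's mass balance; even this crude model reproduces the sharp peak and long recession typical of outburst floods. All geometry and coefficients below are invented for illustration, and a fixed breach is exactly the simplification the abstract warns about: real GLOF modeling must handle breach growth, erosion, and flow transformation.

```python
# toy drawdown of a breached moraine-dammed lake: outflow through a
# rectangular breach treated as a broad-crested weir, Q = C * b * h^1.5;
# geometry, C, and the fixed breach width are illustrative assumptions
C = 1.7          # weir coefficient (m^0.5/s), typical broad-crested value
b = 20.0         # breach width (m)
A = 2.0e5        # lake surface area (m^2), assumed constant with depth
h = 10.0         # initial head over the breach invert (m)
dt = 10.0        # time step (s)

t, peak_q = 0.0, 0.0
while h > 0.01:
    q = C * b * h ** 1.5        # discharge (m^3/s)
    peak_q = max(peak_q, q)
    h -= q / A * dt             # mass balance: A * dh = -Q * dt
    t += dt

print(f"peak discharge ~{peak_q:.0f} m^3/s, lake drains in ~{t / 3600:.1f} h")
```

Note the peak occurs immediately because the breach is assumed fully formed at t = 0; coupling breach erosion to the flow would delay and reshape the peak, which is where the real modeling difficulty lies.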

  13. Environmental legislation as the legal framework for mitigating natural hazards in Spain

    NASA Astrophysics Data System (ADS)

    Garrido, Jesús; Arana, Estanislao; Jiménez Soto, Ignacio; Delgado, José

    2015-04-01

    In Spain, the socioeconomic losses due to natural hazards (floods, earthquakes or landslides) are considerable, and the indirect costs associated with them are rarely considered because they are very difficult to evaluate. The prevention of losses due to natural hazards is more economical and efficient through legislation and spatial planning than through structural measures, such as walls, anchorages or structural reinforcements. However, there is no Spanish natural hazards law, and national and regional sector legislation makes only sparse mention of them. After 1978, when the Spanish Constitution was enacted, the Autonomous Communities (Spanish regions) were able to legislate according to the different competences (urban planning, environment or civil protection), which were established in the Constitution. In the 1990s, the Civil Protection legislation (national law and regional civil protection tools) dealt specifically with natural hazards (floods, earthquakes and volcanoes), but this was before any soil, seismic or hydrological studies were recommended in the national sector legislation. On the other hand, some Autonomous Communities referred to natural hazards in the Environmental Impact Assessment legislation (EIA) and also in the spatial and urban planning legislation and tools. The National Land Act, enacted in 1998, established, for the first time, that those lands exposed to natural hazards should be classified as non-developable. The Spanish recast text of the Land Act, enacted by Royal Legislative Decree 2/2008, requires that a natural hazards map be included in the Environmental Sustainability Report (ESR), which is compulsory for all master plans, according to the provisions set out by Act 9/2006, known as Spanish Strategic Environmental Assessment (SEA). 
Consequently, the environmental legislation, after the aforementioned transposition of the SEA European Directive 2001/42/EC, is the legal framework for preventing losses due to natural hazards through land use planning. However, most of the Spanish master plans approved after 2008 do not include a natural hazards map and/or do not classify the areas exposed to natural hazards as non-developable. Restrictions or prohibitions on building in natural hazard-prone areas are not usually established in the master plans. According to the jurisprudence, the environmental legislation prevails over spatial and urban planning regulations. On the other hand, the precedence of the national competence in public security would allow reclassification of the land, independently of the political or economic motivations of the municipal government. Despite the technical building code and the seismic building code, which establish some recommendations for avoiding "geotechnical" or seismic hazards, there are no compulsory guidelines requiring technical studies or hazard maps for floods or landslides. The current legislation should be improved from a technical point of view, and some mechanisms for enforcing the law should also be considered.

  14. The Effective Organization and Use of Data in Bridging the Hazard Mitigation-Climate Change Adaptation Divide (Invited)

    NASA Astrophysics Data System (ADS)

    Smith, G. P.; Fox, J.; Shuford, S.

    2010-12-01

    The costs associated with managing natural hazards and disasters continue to rise in the US and elsewhere. Many climate change impacts are manifested in stronger or more frequent natural hazards such as floods, wildfire, hurricanes and typhoons, droughts, and heat waves. Despite this common problem, the climate change adaptation and hazards management communities have largely failed to acknowledge each other’s work in reducing hazard impacts. This is even reflected in the language that each community uses; for example, the hazards management community refers to hazard risk reduction as mitigation while the climate change community refers to it as adaptation. In order to help bridge this divide, we suggest each community utilize data in a more formally organized and effective manner based on four principles: 1. The scale of the data must reflect the needs of the decision maker. In most cases, decision makers’ needs are most effectively met through the development of multiple alternatives that take into account a variety of possible impacts. 2. Investments intended to reduce vulnerability and increase resilience should be driven by the wise use of available data using a “risk-based” strategy. 3. Climate change adaptation and hazard mitigation strategies must be integrated with other value drivers when building resiliency. Development and use of data that underscore the concept of “no regrets” risk reduction can be used to accomplish this aim. 4. The use of common data is critical in building a bridge between the climate change adaptation and hazards management communities. We will explore how the creation of data repositories that collect, analyze, display and archive hazards and disaster data can help address the challenges posed by the current hazards management and climate change adaptation divide.

  15. New Multi-HAzard and MulTi-RIsk Assessment MethodS for Europe (MATRIX): A research program towards mitigating multiple hazards and risks in Europe

    NASA Astrophysics Data System (ADS)

    Fleming, K. M.; Zschau, J.; Gasparini, P.; Modaressi, H.; Matrix Consortium

    2011-12-01

    Scientists, engineers, civil protection and disaster managers typically treat natural hazards and risks individually. This leads to the situation where the frequent causal relationships between the different hazards and risks, e.g., earthquakes and volcanoes, or floods and landslides, are ignored. Such an oversight may potentially lead to inefficient mitigation planning. As part of their efforts to confront this issue, the European Union, under its FP7 program, is supporting the New Multi-HAzard and MulTi-RIsK Assessment MethodS for Europe or MATRIX project. The focus of MATRIX is on natural hazards, in particular earthquakes, landslides, volcanoes, wildfires, storms and fluvial and coastal flooding. MATRIX will endeavour to develop methods and tools to tackle multi-type natural hazards and risks within a common framework, focusing on methodologies that are suited to the European context. The work will involve an assessment of current single-type hazard and risk assessment methodologies, including a comparison and quantification of uncertainties and harmonization of single-type methods, examining the consequence of cascade effects within a multi-hazard environment, time-dependent vulnerability, decision making and support for multi-hazard mitigation and adaptation, and a series of test cases. Three test sites are being used to assess the methods developed within the project (Naples, Cologne, and the French West Indies), as well as a "virtual city" based on a comprehensive IT platform that will allow scenarios not represented by the test cases to be examined. In addition, a comprehensive dissemination program that will involve national platforms for disaster management, as well as various outreach activities, will be undertaken. The MATRIX consortium consists of ten research institutions (nine European and one Canadian), an end-user (i.e., one of the European national platforms for disaster reduction) and a partner from industry.

  16. A review of accidents, prevention and mitigation options related to hazardous gases

    SciTech Connect

    Fthenakis, V.M.

    1993-05-01

    Statistics on industrial accidents are incomplete due to lack of specific criteria on what constitutes a release or accident. In the United States, most major industrial accidents were related to explosions and fires of flammable materials, not to releases of chemicals into the environment. An EPA study of 6,928 accidental releases of toxic chemicals revealed that accidents at stationary facilities accounted for 75% of the total number of releases, and transportation accidents for the other 25%. About 7% of all reported accidents (468 cases) resulted in 138 deaths and 4,717 injuries ranging from temporary respiratory problems to critical injuries. In-plant accidents accounted for 65% of the casualties. The most efficient strategy to reduce hazards is to choose technologies which do not require the use of large quantities of hazardous gases. For new technologies this approach can be implemented early in development, before large financial resources and efforts are committed to specific options. Once specific materials and options have been selected, strategies to prevent accident initiating events need to be evaluated and implemented. The next step is to implement safety options which suppress a hazard when an accident initiating event occurs. Releases can be prevented or reduced with fail-safe equipment and valves, adequate warning systems and controls to reduce and interrupt gas leakage. If an accident occurs and safety systems fail to contain a hazardous gas release, then engineering control systems will be relied on to reduce or minimize environmental releases. As a final defensive barrier, the prevention of human exposure and its consequences is needed if a hazardous gas is released in spite of the previous strategies. Medical facilities close by that can accommodate victims of the worst accident can reduce the consequences of personnel exposure to hazardous gases.
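The quoted shares are internally consistent, as a quick arithmetic check on the figures from the abstract shows:

```python
# consistency check of the EPA release statistics quoted above
total_releases = 6928
with_casualties = 468
deaths, injuries = 138, 4717

share = with_casualties / total_releases
print(f"{share:.1%} of releases caused casualties")  # ~6.8%, i.e. "about 7%"
print(f"{deaths + injuries} total casualties reported")
```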

  17. Integrated Tsunami Data Supports Forecast, Warning, Research, Hazard Assessment, and Mitigation (Invited)

    NASA Astrophysics Data System (ADS)

    Dunbar, P. K.; Stroker, K. J.

    2009-12-01

    With nearly 230,000 fatalities, the 26 December 2004 Indian Ocean tsunami was the deadliest tsunami in history, illustrating the importance of developing basinwide warning systems. Key to creating these systems is easy access to quality-controlled, verified data on past tsunamis. It is essential that warning centers, emergency managers, and modelers can determine if and when similar events have occurred. Following the 2004 tsunami, the National Oceanic and Atmospheric Administration's (NOAA) National Geophysical Data Center (NGDC) began examining all aspects of the tsunami data archive to help answer questions regarding the frequency and severity of past tsunamis. Historical databases span insufficient time to reveal a region's full tsunami hazard, so a global database of citations to articles on tsunami deposits was added to the archive. NGDC further expanded the archive to include high-resolution tide gauge data, deep-ocean sensor data, and digital elevation models used for propagation and inundation modeling. NGDC continuously reviews the data for accuracy, making modifications as new information is obtained. These added databases allow NGDC to provide the tsunami data necessary for warning guidance, hazard assessments, and mitigation efforts. NGDC is also at the forefront of standards-based Web delivery of integrated science data through a variety of tools, from Web-form interfaces to interactive maps. The majority of the data in the tsunami archive are discoverable online. Scientists, journalists, educators, planners, and emergency managers are among the many users of these public domain data, which may be used without restriction provided that users cite data sources.

  18. Improving Tsunami Hazard Mitigation and Preparedness Using Real-Time and Post-Tsunami Field Data

    NASA Astrophysics Data System (ADS)

    Wilson, R. I.; Miller, K. M.

    2012-12-01

    The February 27, 2010 Chile and March 11, 2011 Japan tsunamis caused dramatic loss of life and damage in the near-source region, and notable impacts in distant coastal regions like California. Comprehensive real-time and post-tsunami field surveys and the availability of hundreds of videos within harbors and marinas allow for detailed documentation of these two events by the State of California Tsunami Program, which receives funding through the National Tsunami Hazard Mitigation Program. Although neither event caused significant inundation of dry land in California, dozens of harbors sustained damage totaling nearly $100 million. Information gathered from these events has guided new strategies in tsunami evacuation planning and maritime preparedness. Scenario-specific tsunami evacuation "playbook" maps and guidance are being produced, detailing inundation from tsunamis of various sizes and source locations. These products help coastal emergency managers prepare local response plans when minor distant-source tsunamis or larger tsunamis from local and regional sources are generated. In maritime communities, evaluations of strong tsunami currents and damage are being used to validate and calibrate numerical tsunami model currents, to produce in-harbor hazard maps, and to identify offshore safety zones for potential boat evacuation when a tsunami Warning is issued for a distant-source event. Real-time and post-tsunami field teams have been expanded to capture additional detailed information that can be shared in a more timely manner during and after an event through a state-wide clearinghouse. These new products and related efforts will result in more accurate and efficient emergency response by coastal communities, potentially reducing the loss of lives and property during future tsunamis.

  19. Physical Prototype Development for the Real-Time Detection and Mitigation of Hazardous Releases into a Flow System

    NASA Astrophysics Data System (ADS)

    Rimer, Sara; Katopodes, Nikolaos

    2013-11-01

    The threat of accidental or deliberate toxic chemicals released into public spaces is a significant concern to public safety. The real-time detection and mitigation of such hazardous contaminants has the potential to minimize harm and save lives. In this study, we demonstrate the feasibility of feedback control of a hazardous contaminant by means of a laboratory-scale physical prototype integrated with a previously-developed robust predictive control numerical model. The physical prototype is designed to imitate a public space characterized by a long conduit with an ambient flow (e.g. an airport terminal). Unidirectional air flows through a 24-foot long duct. The "contaminant" plume of propylene glycol smoke is released into the duct. Camera sensors are used to visually measure concentration of the plume. A pneumatic system is utilized to localize the contaminant via air curtains, and draw it out via vacuum nozzles. The control prescribed to the pneumatic system is based on the numerical model. NSF-CMMI 0856438.
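The study drives its pneumatic system from a robust predictive control model; as a much simpler stand-in for that loop, a proportional feedback law on the sensed concentration illustrates the sense-actuate cycle. The first-order mixing model and every parameter value below are invented for illustration, not taken from the prototype.

```python
# toy closed loop: camera senses concentration c, controller commands extra
# extraction u (air curtains / vacuum nozzles) proportional to the excess
# over a setpoint; the plant is a well-mixed first-order model (assumed)
dt = 0.1            # control interval (s)
source = 5.0        # contaminant injection rate (units/s)
base_vent = 0.2     # ambient flushing rate (1/s)
setpoint = 2.0      # concentration tolerated near occupants
Kp, u_max = 2.0, 5.0

c, history = 0.0, []
for step in range(600):                       # 60 s of simulated time
    error = c - setpoint
    u = min(max(Kp * error, 0.0), u_max)      # actuator command, >= 0, saturated
    c += dt * (source - (base_vent + u) * c)  # first-order mixing dynamics
    history.append(c)

print(f"final concentration: {history[-1]:.2f} (setpoint {setpoint})")
```

A proportional law leaves a steady-state offset above the setpoint; a predictive controller of the kind used in the study can anticipate the plume's transport and act before the excess appears at the sensor.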

  20. Mitigation of EMU Glove Cut Hazard by MMOD Impact Craters on Exposed ISS Handrails

    NASA Technical Reports Server (NTRS)

    Christiansen, Eric L.; Ryan, Shannon

    2009-01-01

    Recent cut damage to crewmember extravehicular mobility unit (EMU) gloves during extravehicular activity (EVA) onboard the International Space Station (ISS) has been found to result from contact with sharp edges or pinch points rather than general wear or abrasion. One possible source of cut hazards is protruding sharp-edged crater lips from impacts of micrometeoroid and orbital debris (MMOD) particles on external metallic handrails along EVA translation paths. During hypervelocity impact of MMOD particles, an excavation flow develops behind the shock wave, resulting in the formation of crater lips that can protrude above the target surface. In this study, two methods were evaluated to limit EMU glove cut hazards due to MMOD impact craters. In the first phase, four flexible overwrap configurations were evaluated: a felt reusable surface insulation (FRSI), polyurethane polyether foam with a beta-cloth cover, double-layer polyurethane polyether foam with a beta-cloth cover, and multi-layer beta-cloth with intermediate Dacron netting spacers. These overwraps are suitable for retrofitting ground equipment that has yet to be flown; they are not intended to protect the handrail from impact of MMOD particles, but rather to act as a spacer between hazardous impact profiles and crewmember gloves. At the impact conditions considered, all four overwrap configurations were effective in limiting contact between EMU gloves and impact crater profiles. The multi-layer beta-cloth configuration was the most effective in reducing the height of potentially hazardous profiles in handrail-representative targets. In the second phase of the study, four material alternatives to the current aluminum and stainless steel alloys were evaluated: a metal matrix composite, carbon fiber reinforced plastic (CFRP), fiberglass, and a fiber metal laminate.
Alternative material handrails are intended to prevent the formation of hazardous damage profiles during MMOD impact and are suitable for flight hardware yet to be constructed. Of the four materials evaluated, only the fiberglass formed a less hazardous damage profile than the baseline metallic target. Although the CFRP laminate did not form any noticeable crater lip, brittle protruding fibers are considered a puncture risk. In parallel with EMU glove redesign efforts, modifications to metallic ISS handrails such as those evaluated in this study provide the means to significantly reduce cut hazards from MMOD impact craters.

  1. Earthquake Hazard Mitigation and Real-Time Warnings of Tsunamis and Earthquakes

    NASA Astrophysics Data System (ADS)

    Kanamori, Hiroo

    2015-09-01

    With better understanding of earthquake physics and the advent of broadband seismology and GPS, seismologists can forecast the future activity of large earthquakes on a sound scientific basis. Such forecasts are critically important for long-term hazard mitigation, but because stochastic fracture processes are complex, the forecasts are inevitably subject to large uncertainties, and unexpected events will inevitably occur. Recent developments in real-time seismology help seismologists cope with and prepare for such unexpected events, including tsunamis and earthquakes. For a tsunami warning, the required warning time is fairly long (usually 5 min or longer), which permits use of a rigorous method; significant advances have already been made. In contrast, early warning of earthquakes is far more challenging because the required warning time is very short (as short as three seconds). Despite this difficulty, the methods used for regional warnings have advanced substantially, and several systems have already been developed and implemented. A future strategy for more challenging, rapid (a few seconds) warnings, which are critically important for saving property and lives, is discussed.

  2. California Real Time Network: Test Bed for Mitigation of Geological and Atmospheric Hazards within a Modern Data Portal Environment

    NASA Astrophysics Data System (ADS)

    Bock, Y.

    2008-12-01

    Global geological and atmospheric hazards such as earthquakes, volcanoes, tsunamis, landslides, storms, and floods continue to wreak havoc on the lives of millions of people worldwide. High-precision geodetic observations of surface displacements and atmospheric water vapor are indispensable tools for studying natural hazards alongside more traditional seismic and atmospheric measurements. The rapid proliferation of dense in situ GPS networks for crustal deformation studies, such as the EarthScope Plate Boundary Observatory, provides us with unique data sets. However, the full information content and timeliness of these observations have not been fully exploited, in particular at higher frequencies than traditional daily continuous GPS position time series. Nor have scientists taken full advantage of the complementary nature of space-based and in situ observations in forecasting, assessing, and mitigating natural hazards. The primary operating mode for in situ GPS networks has been daily download of GPS data sampled at a 15-30 s rate, and the production of daily position time series or hourly tropospheric zenith delay estimates. However, as continuous GPS networks are upgraded to provide higher-frequency information approaching the sampling rates (1-50 Hz) of modern GPS receivers, with a latency of less than 1 second, new data processing approaches are being developed. Low-latency, high-rate measurements are being applied to earthquake source modeling, early warning of natural hazards (geological and atmospheric), and structural monitoring. Since 2002, more than 80 CGPS stations in southern California have been upgraded to a 1 Hz sample rate, including stations from the SCIGN and PBO networks, and several large earthquakes have been recorded. The upgraded stations comprise the California Real Time Network (CRTN - http://sopac.ucsd.edu/projects/realtime/).
This prototype network provides continuous 1 Hz (upgradable to 10 Hz at some stations) GPS relative displacements and troposphere delay estimates with a latency of less than 1 s. Steps are being taken to tie these higher-order data products to the global (ITRF) reference frame, and discussions are underway to extend this capability throughout California by upgrading more UNAVCO/PBO stations. With funding from NASA, CRTN provides a test bed for developing advanced in situ-based observation systems within a modern data portal environment, which can be extended seamlessly to the entire PBO region and to other plate boundaries. I describe a prototype early warning system for earthquakes using CRTN, which is also being deployed at other plate boundaries. I show how researchers can access advanced observations of displacements and troposphere delay in real time and replay significant events within the GPS Explorer data portal and Geophysical Resource Web Services (GRWS) environment.

  3. Mitigation of Earthquake-Induced Catastrophic Landslide Hazard on Gentle Slopes by Surface Loading

    NASA Astrophysics Data System (ADS)

    Trandafir, A. C.; Sidle, R. C.; Kamai, T.

    2005-05-01

    Catastrophic landslides have occurred repeatedly on gentle slopes during past earthquakes in Japan, sometimes causing great loss of life and significant environmental damage. Reconnaissance reports on the October 23, 2004, Chuetsu earthquake in Niigata Prefecture, Japan, also include collapse of gentle slopes associated with damage to roads and railways that lost foundation support. Additionally, investigations of the spatial distribution and features of landslides triggered by the Chuetsu earthquake revealed that 23% of the landslides mapped within a 2.9 km radius around the major earthquake epicenter occurred on slopes with gradients between 10 and 20 degrees. Past experience demonstrated that such earthquake-induced catastrophic landslides occurred along shear surfaces in saturated cohesionless materials, and that the main factor controlling the high mobility of the slide mass after failure was the gradual loss in shear strength with progressive shear displacement. Given the high rainfall prior to the earthquake, it is likely that the same failure mechanism characterizes the catastrophic landslides triggered on gentle slopes during the October 23, 2004, Chuetsu event in Niigata Prefecture. Thus, we introduce a procedure based on applying an additional confining stress at the surface to increase the stability of gentle slopes in saturated cohesionless soils subject to catastrophic failure during earthquakes. This surface pressure is achieved by concrete plates tied back with prestressed steel anchors that penetrate through the soil well below the potential sliding surface. Results of a dynamic analysis of undrained seismic performance, conducted for a gentle infinite slope that experienced different levels of increase in effective confining stress due to a uniform load applied normal to the surface, illustrate the effectiveness of this measure in mitigating earthquake-induced catastrophic landslide hazards.

  4. India's Initiative in Mitigating Tsunami Hazard & Tsunami Potential in Northern Bay of Bengal (Invited)

    NASA Astrophysics Data System (ADS)

    Gupta, H. K.

    2009-12-01

    Soon after the occurrence of the most devastating tsunami, caused by the 26th December 2004 Sumatra earthquake, India took the initiative to set up an end-to-end system to mitigate tsunami and storm surge hazards. The system includes all the necessary elements: networking of seismic stations; deployment of ocean-bottom pressure recorders; real-time sea level monitoring stations; establishment of radar-based monitoring stations for real-time measurement of surface currents and waves; modeling of tsunamis and storm surges; generation of coastal inundation and vulnerability maps; operation of a tsunami and storm surge warning centre on a 24/7 basis; capacity building and training of all stakeholders; and communication with the global community. This initiative was estimated to have a direct cost of US $30 million and was to be operational by August 2007; this has been achieved. The Indian National Centre for Ocean Information and Services (INCOIS), belonging to the Ministry of Earth Sciences (MoES), Government of India, and located at Hyderabad, is the nodal agency for this program. The system is functioning well. We also examine the tsunami potential in the northern Bay of Bengal, where a large coastal population (about 100 million) makes the region very vulnerable if a large tsunami were to occur. It is observed that: i) oblique plate motion characterizes the region, resulting in strike-slip dominated earthquakes with low tsunami-generating potential; ii) in the northern Bay of Bengal, the deformation front associated with the plate boundary between the India and Sunda plates is either landward or in shallow water in the Arakan region, and therefore a great earthquake would not displace large amounts of water and cause a major tsunami; and iii) there is no evidence of the region having been affected by a large tsunami in the past 2000 years.
We therefore conclude that though a great earthquake could occur in the Arakan region, it would not generate a large tsunami in the northern Bay of Bengal.

  5. Past, Present, and Future Challenges in Earthquake Hazard Mitigation of Indonesia: A Collaborative Work of Geological Agency Indonesia and Geoscience Australia

    NASA Astrophysics Data System (ADS)

    Hidayati, S.; Cummins, P. R.; Cipta, A.; Omang, A.; Griffin, J.; Horspool, N.; Robiana, R.; Sulaeman, C.

    2012-12-01

    In the last decade, Indonesia has suffered repeated earthquake disasters: four of the twelve large earthquakes worldwide with more than 1,000 casualties occurred in Indonesia. The great Sumatra earthquake of December 26, 2004, followed by a tsunami that cost 227,898 lives, has brought Indonesia and its active tectonic setting to the world's attention. The government of Indonesia therefore encourages hazard mitigation efforts that focus on the pre-disaster phase. In response to government policy on earthquake disaster mitigation, the Geological Agency of Indonesia aims to deliver rigorous earthquake hazard maps for the whole country at provincial scale by 2014. Collaborative work with Geoscience Australia, through short-term training missions, on-going training, mentoring, assistance, and study in Australia under the auspices of the Australia-Indonesia Facility for Disaster Reduction (AIFDR), has accelerated production of these maps. Since the collaboration began in 2010, provincial earthquake hazard maps based on probabilistic seismic hazard assessment (PSHA) have been published for Central Java (2010) and for West Sulawesi, Gorontalo, and North Maluku (2011). In 2012, maps for the remaining provinces of Sulawesi Island, Papua, North Sumatra, and Jambi will be published by the same method. By the end of 2014, hazard maps for all 33 Indonesian provinces will be delivered. The future challenges are to work together with stakeholders, to produce district-scale maps, and to establish a national standard for earthquake hazard maps. Most important is building the capacity to update, maintain, and revise the maps as new information becomes available.

  6. PREDICTION/MITIGATION OF SUBSIDENCE DAMAGE TO HAZARDOUS WASTE LANDFILL COVERS

    EPA Science Inventory

    Characteristics of Resource Conservation and Recovery Act hazardous waste landfills and of landfilled hazardous wastes have been described to permit development of models and other analytical techniques for predicting, reducing, and preventing landfill settlement and related cove...

  7. Geohazard studies and their policy implications in Nicaragua

    NASA Astrophysics Data System (ADS)

    Strauch, W.

    2007-05-01

    Nicaragua, situated on the Central American subduction zone and in the path of tropical storms and hurricanes, is a frequent scene of natural disasters, which have multiplied the negative effects of a long-term socioeconomic crisis that leaves Nicaragua currently the second-poorest country of the Americas. In recent years, multiple efforts have been undertaken to prevent or mitigate the impact of these natural phenomena on the country. National and local authorities have become more involved in disaster prevention policy, and international cooperation has boosted funding for disaster prevention and mitigation measures. The national geosciences institution (INETER), in cooperation with foreign partners, developed a national monitoring and early warning system for geological and hydro-meteorological phenomena. Geological and risk mapping projects were conducted by INETER and international partners. Universities, NGOs, international technical assistance programs, and foreign scientific groups cooperated to train Nicaraguan geoscientists and to improve higher education in disaster prevention up to the master's level. Funded by a World Bank loan, coordinated by the National System for Disaster Prevention, Mitigation and Attention (SINAPRED), and scientifically supervised by INETER, multidisciplinary hazard and vulnerability studies were carried out between 2003 and 2005 with emphasis on seismic hazard. These GIS-based studies provided proposals for land use policies at the local level in 30 municipalities, and seismic vulnerability and risk information for every building in Managua, the capital of Nicaragua. Another large multidisciplinary project produced high-resolution air photos, 1:50,000 vectorized topographic maps, and a digital elevation model for western Nicaragua.
These data, integrated in GIS, were used to assess: 1) seismic hazard for metropolitan Managua; 2) tsunami hazard for the Pacific coast; 3) volcano hazard for the Telica-Cerro Negro and El Hoyo volcanoes; and 4) flood hazard for the Maravilla river. This study was carried out between 2004 and 2006 through technical cooperation between the Japan International Cooperation Agency and INETER, at the request of the Government of Nicaragua. The results of the mapping and investigations feed into a national GIS on geohazards maintained by INETER and developed in the frame of a regional cooperation project with BGR, Germany, and other international institutions. Many maps, project reports, and GIS coverages are made available to the general public on INETER's website (www.ineter.gob.ni/geofisica/geofisica.html).

  8. Disruption mitigation studies in DIII-D

    SciTech Connect

    Taylor, P.L.; Kellman, A.G.; Evans, T.E.

    1999-01-01

    Data on the discharge behavior, thermal loads, halo currents, and runaway electrons have been obtained in disruptions on the DIII-D tokamak. These experiments have also evaluated techniques to mitigate the disruptions while minimizing runaway electron production. Experiments injecting cryogenic impurity killer pellets of neon and argon and massive amounts of helium gas have successfully reduced these disruption effects. The halo current generation, scaling, and mitigation are understood and are in good agreement with predictions of a semianalytic model. Results from killer pellet injection have been used to benchmark theoretical models of the pellet ablation and energy loss. Runaway electrons are often generated by the pellets and new runaway generation mechanisms, modifications of the standard Dreicer process, have been found to explain the runaways. Experiments with the massive helium gas puff have also effectively mitigated disruptions without the formation of runaway electrons that can occur with killer pellets.

  9. Mitigation and benefits measures as policy tools for siting potentially hazardous facilities: determinants of effectiveness and appropriateness.

    PubMed

    Jenkins-Smith, H; Kunreuther, H

    2001-04-01

    How do mitigation and benefits measures affect public acceptance for siting different kinds of potentially hazardous facilities? What kinds of benefits measures are seen as most (or least) appropriate for different kinds of facilities? This study used a nationwide telephone survey consisting of 1,234 interviews with randomly selected respondents to test for the effects of packages of safety and benefits measures for siting a landfill, prison, incinerator and nuclear waste repository. The experimental design used in the survey permits analysis of the fractions of respondents who are willing to change their initial levels of acceptance (or opposition) when presented with a sequence of the safety and benefit measures. The measures vary significantly in their impact on levels of acceptance for the facilities, and some measures that would at face value appear to reassure residents of facility safety turn out to lack credibility and therefore diminish facility acceptance. Ordering of the benefits versus safety measures significantly affects changes in acceptance in surprising ways. The perceived appropriateness of different kinds of benefits measures varies systematically by the type of facility under consideration. It appears that successful benefits packages will directly address the underlying dimensions of concern caused by the facility. These findings point to the importance of further research on "commensurable" benefits measures. PMID:11414544

  10. Catastrophic debris flows transformed from landslides in volcanic terrains : mobility, hazard assessment and mitigation strategies

    USGS Publications Warehouse

    Scott, Kevin M.; Macias, Jose Luis; Naranjo, Jose Antonio; Rodriguez, Sergio; McGeehin, John P.

    2001-01-01

    Communities in lowlands near volcanoes are vulnerable to significant volcanic flow hazards in addition to those associated directly with eruptions. The largest such risk is from debris flows beginning as volcanic landslides, with the potential to travel over 100 kilometers. Stratovolcanic edifices commonly are hydrothermal aquifers composed of unstable, altered rock forming steep slopes at high altitudes, and the terrain surrounding them is commonly mantled by readily mobilized, weathered airfall and ashflow deposits. We propose that volcano hazard assessments integrate the potential for unanticipated debris flows with, at active volcanoes, the greater but more predictable potential of magmatically triggered flows. This proposal reinforces the already powerful arguments for minimizing populations in potential flow pathways below both active and selected inactive volcanoes. It also addresses the potential for volcano flank collapse to occur with instability early in a magmatic episode, as well as the 'false-alarm problem': the difficulty in evacuating the potential paths of these large mobile flows. Debris flows that transform from volcanic landslides, characterized by cohesive (muddy) deposits, create risk comparable to that of their syneruptive counterparts of snow and ice-melt origin, which yield noncohesive (granular) deposits, because: (1) Volcano collapses and the failures of airfall- and ashflow-mantled slopes commonly yield highly mobile debris flows as well as debris avalanches with limited runout potential. Runout potential of debris flows may increase severalfold as their volumes enlarge beyond volcanoes through bulking (entrainment) of sediment. Through this mechanism, the runouts of even relatively small collapses at Cascade Range volcanoes, in the range of 0.1 to 0.2 cubic kilometers, can extend to populated lowlands.
(2) Collapse is caused by a variety of triggers: tectonic and volcanic earthquakes, gravitational failure, hydrovolcanism, and precipitation, as well as magmatic activity and eruptions. (3) Risk of collapse begins with initial magmatic activity and increases as intrusion proceeds. An archetypal debris flow from volcanic terrain occurred in Colombia with a tectonic earthquake (M 6.4) in 1994. The Rio Paez conveyed a catastrophic wave of debris flow over 100 kilometers, coalesced from multiple slides of surficial material weakened both by weathering and by hydrothermal alteration in a large stratovolcano. Similar seismogenic flows occurred in Mexico in 1920 (M ~6.5), Chile in 1960 (M 9.2), and Ecuador in 1987 (M 6.1 and 6.9). Velocities of wave fronts in two examples were 60 to 90 km/hr (17-25 meters per second) over the initial 30 kilometers. Volcano flank and sector collapses may produce untransformed debris avalanches, as occurred initially at Mount St. Helens in 1980. However, at least as common is direct transformation of the failed mass to a debris flow. At two other volcanoes in the Cascade Range, Mount Rainier and Mount Baker, rapid transformation and high mobility were typical of most of at least 15 Holocene flows. This danger exists downstream from many stratovolcanoes worldwide; the population at risk is near 150,000 and increasing at Mount Rainier. The first step in preventing future catastrophes is documenting past flows. Deposits of some debris flows, however, can be mistaken for those of less-mobile debris avalanches on the basis of mounds formed by buoyed megaclasts. Megaclasts may record only the proximal phase of a debris flow that began as a debris avalanche. Runout may have extended much farther, and thus future flow mobility may be underestimated. Processes and behaviors of megaclast-bearing paleoflows are best inferred from the intermegaclast matrix.
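The wave-front velocities quoted above can be checked with simple arithmetic. The sketch below (variable names are ours) converts the 60-90 km/hr range to m/s and estimates travel time over the initial 30-kilometer reach, which roughly bounds the warning time available to communities at that distance.

```python
# Arithmetic check on the wave-front velocities quoted above: 60-90 km/hr
# over the initial 30 km of runout. Variable names are ours.

def kmh_to_ms(v_kmh):
    """Convert a speed from km/hr to m/s."""
    return v_kmh * 1000.0 / 3600.0

v_slow, v_fast = 60.0, 90.0                 # km/hr, as quoted
assert round(kmh_to_ms(v_slow)) == 17       # ~17 m/s
assert round(kmh_to_ms(v_fast)) == 25       # 25 m/s

# Travel time of the front over the initial 30-km reach: this is roughly
# the maximum warning time available to communities at that distance.
reach_km = 30.0
t_fast_min = reach_km / v_fast * 60.0       # minutes, fastest front
t_slow_min = reach_km / v_slow * 60.0       # minutes, slowest front
assert abs(t_fast_min - 20.0) < 1e-9 and abs(t_slow_min - 30.0) < 1e-9
```

Twenty to thirty minutes for the first 30 kilometers underlines why instrumental event warnings, rather than evacuation on visual confirmation, matter for these flows.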
Mitigation strategy can respond to volcanic flows regardless of type and trigger by: (1) Avoidance: Limit settlement in flow pathways to numbers that can be evacuated after event warnings (flow is occurring). (2) Instrumental even

  11. Mitigating hazards to aircraft from drifting volcanic clouds by comparing and combining IR satellite data with forward transport models

    NASA Astrophysics Data System (ADS)

    Matiella Novak, M. Alexandra

    Volcanic ash clouds in the upper atmosphere (>10 km) present a significant hazard to the aviation community and in some cases cause near-disastrous situations for aircraft that inadvertently encounter them. The two most commonly used techniques for mitigating hazards to aircraft from drifting volcanic clouds are (1) using data from satellite observations and (2) forecasting dispersion and trajectories with numerical models. This dissertation aims to aid in the mitigation of this hazard by using Moderate Resolution Imaging Spectroradiometer (MODIS) and Advanced Very High Resolution Radiometer (AVHRR) infrared (IR) satellite data to quantitatively analyze and constrain the uncertainties in the PUFF volcanic ash transport model. Furthermore, this dissertation has experimented with the viability of combining IR data with the PUFF model to increase the model's reliability. Comparing IR satellite data with forward transport models provides valuable information concerning the uncertainty and sensitivity of the transport models. A study analyzing the viability of combining satellite-based information with the PUFF model was also performed. Factors controlling the cloud-shape evolution, such as the horizontal dispersion coefficient, the vertical distribution of particles, and the height and location of the cloud, were all updated based on observations from satellite data in an attempt to increase the reliability of the simulations. Comparing center-of-mass locations calculated from satellite data to HYSPLIT trajectory simulations provides insight into the vertical distribution of the cloud. A case study of the May 10, 2003 Anatahan Volcano eruption was undertaken to assess methods of calculating errors in PUFF simulations with respect to the transport and dispersion of the erupted cloud.
An analysis of the factors controlling the cloud-shape evolution in the model was also completed and compared to the shape evolution of the cloud observed in the IR satellite data. An accurate eruption length of 28 hours, based on satellite imagery, resulted in an error growth rate 50% lower than that of the original simulation. Using a dispersion coefficient calculated from satellite imagery further improved the PUFF simulation. Results show that using satellite-based information in the PUFF model decreases the error growth of the simulation by as much as 60%. PUFF simulations were also compared to IR satellite data of four other eruptions: Augustine (2006), Cleveland (2001), Hekla (2000), and Soufriere Hills (2006). The Anatahan, Augustine, and Cleveland eruptions produced clouds that were ash-rich; the Hekla and Soufriere Hills eruptions produced clouds that were ice-rich. Mass retrievals performed on the satellite data for these eruptions were holistically compared to determine whether the evolution of the ash clouds depended on the cloud species and the atmospheric environment into which they were ejected (arctic vs. tropical environments). Analyses show that the ice-rich clouds decrease in mass, area, and optical depth more rapidly than the ash-rich clouds. Moreover, error growth rates of the simulated low-latitude eruptions were linear, whereas error growth rates of simulated high-latitude clouds were exponential. Results from this study provide some insight into possible implications for volcanic cloud simulations of ash and ice clouds in differing environments. Finally, a sensitivity analysis of the PUFF model was implemented. This part of the research generated collaborative research with the University of Alaska, Fairbanks/Alaska Volcano Observatory, where the PUFF model is housed. Transport simulations of the May 10, 2003 Anatahan volcanic cloud were done from the PUFF model's command line.
Unlike the web-based version of PUFF, the command line is not publicly accessible but provides substantially more control over the user's definition of certain variables. Results of this study show it is viable and practical to continually update the PUFF simulation of a volcanic cloud's evolution and locatio

  12. Evaluation and mitigation of lightning hazards to the space shuttle Solid Rocket Motors (SRM)

    NASA Technical Reports Server (NTRS)

    Rigden, Gregory J.; Papazian, Peter B.

    1988-01-01

    The objective was to quantify electric field strengths in the Solid Rocket Motor (SRM) propellant in the event of a worst case lightning strike. Using transfer impedance measurements for selected lightning protection materials and 3D finite difference modeling, a retrofit design approach for the existing dielectric grain cover and railcar covers was evaluated and recommended for SRM segment transport. A safe level of 300 kV/m was determined for the propellant. The study indicated that a significant potential hazard exists for unprotected segments during rail transport. However, modified railcar covers and grain covers are expected to prevent lightning attachment to the SRM and to reduce the levels to several orders of magnitude below 300 kV/m.

  13. Disruption mitigation studies in DIII-D

    SciTech Connect

    Taylor, P.L.; Kellman, A.G.; Evans, T.E.; Gray, D.S.; Humphreys, D.A.; Hyatt, A.W.; Jernigan, T.C.; Lee, R.L.; Leuer, J.A.; Luckhardt, S.C.; Parks, P.B.; Schaffer, M.J.; Whyte, D.G.; Zhang, J.

    1999-05-01

    Data on the discharge behavior, thermal loads, halo currents, and runaway electrons have been obtained in disruptions on the DIII-D tokamak [J. L. Luxon and L. G. Davis, Fusion Technol. 8, 2A, 441 (1985)]. These experiments have also evaluated techniques to mitigate the disruptions while minimizing runaway electron production. Experiments injecting cryogenic impurity "killer" pellets of neon and argon and massive amounts of helium gas have successfully reduced these disruption effects. The halo current generation, scaling, and mitigation are understood and are in good agreement with predictions of a semianalytic model. Results from "killer" pellet injection have been used to benchmark theoretical models of the pellet ablation and energy loss. Runaway electrons are often generated by the pellets, and new runaway generation mechanisms, modifications of the standard Dreicer process, have been found to explain the runaways. Experiments with the massive helium gas puff have also effectively mitigated disruptions without the formation of runaway electrons that can occur with "killer" pellets. © 1999 American Institute of Physics.

  14. Pulsed Electric Processing of the Seismic-Active Fault for Earthquake Hazard Mitigation

    NASA Astrophysics Data System (ADS)

    Novikov, V. A.; Zeigarnik, V. A.; Konev, Yu. B.; Klyuchkin, V. N.

    2010-03-01

    Previous field and laboratory investigations performed in Russia (1999-2008) showed the possibility of using high-power electric current pulses, generated by a pulsed MHD power system, to trigger weak seismicity and release tectonic stresses in the Earth's crust for earthquake hazard mitigation. The mechanism of the influence of man-made electromagnetic fields on regional seismicity is not yet clear. One possible cause of the phenomenon may be the formation of cracks in the rocks under increased fluid pressure due to Joule heating by the electric current injected into the crust. A detailed 3D calculation of electric current density in the crust of the northern Tien Shan, supplied by a pulsed MHD power system connected to a grounded electric dipole, showed that at the depth of earthquake epicenters (>5 km) the electric current density is lower than 10^-7 A/m^2, which is not sufficient to raise the pressure in the fluid-saturated porous geological medium through Joule heating to the point of forming cracks, propagating the fault, and releasing tectonic stresses. Nevertheless, under certain conditions, when the current is injected into the fault through the casing pipes of deep wells after preliminary injection of conductive fluid into the fault, the current density may be high enough to significantly increase mechanical pressure in the porous two-phase geological medium. A numerical analysis of crack formation triggered by high-power electric pulses, based on generation of mechanical pressure in the geological medium, was carried out. It was shown that the mechanical pressure impulse due to a high-power electric current in the porous two-phase medium may be calculated, neglecting thermal conduction, by solving the non-stationary equation of piezo-conductivity with Joule heat generation.
    To calculate the heat generation, the known solution for current spreading from a spherical or elliptical electrode submerged in an unbounded medium is used. The pressure increase due to the electric current is determined by the source voltage and the medium parameters; it does not depend on the electrode radius. The pressure increase is proportional to the parameter ησ/r^2, where η is the fluid viscosity, σ is the electric conductivity of the fluid in the pores, and r is the average capillary radius. This parameter may vary by many orders of magnitude for different media and pore fluids. For water the pressure increase is insignificant, but if a highly mineralized fluid (e.g. sludge) is injected into the medium, the pressure may increase by several orders of magnitude, reaching tens of kilobars, so that growth of cracks becomes possible in the medium exposed to the high-power electric current. Parameters of a portable pulsed power system for electric processing of the fault were estimated for the case in which the current is injected into the fault through the casing tubes of deep wells after preliminary injection of a conductive fluid into the fault between the wells. The work is supported by grant No. 09-05-12059 of the Russian Foundation for Basic Research.
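
The scaling argument above can be sketched numerically. The following is an illustration only: all fluid properties are assumed order-of-magnitude values, not data from the paper.

```python
# Illustration of the scaling parameter eta*sigma/r^2 from the abstract: the
# electrically driven pressure rise grows with fluid viscosity (eta) and
# conductivity (sigma) and falls with the square of the capillary radius (r).
# All numerical values are assumed orders of magnitude, not from the paper.

def pressure_scale(eta, sigma, r):
    """Relative pressure-rise parameter, eta * sigma / r**2 (arbitrary units)."""
    return eta * sigma / r**2

water = pressure_scale(eta=1.0e-3, sigma=0.05, r=1.0e-6)  # fresh pore water
brine = pressure_scale(eta=2.0e-3, sigma=20.0, r=1.0e-6)  # mineralized fluid

print(f"brine/water pressure-rise ratio: {brine / water:.0f}x")
# Swapping the pore fluid alone changes the parameter by orders of magnitude,
# consistent with the abstract's claim that injecting a highly mineralized
# fluid can raise the attainable pressure by several orders.
```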

  15. Looking Before We Leap: Recent Results From An Ongoing Quantitative Investigation Of Asteroid And Comet Impact Hazard Mitigation.

    NASA Astrophysics Data System (ADS)

    Plesko, Catherine; Weaver, R. P.; Korycansky, D. G.; Huebner, W. F.

    2010-10-01

    The asteroid and comet impact hazard is now part of public consciousness, as demonstrated by movies, Super Bowl commercials, and popular news stories. However, there is a popular misconception that hazard mitigation is a solved problem. Many people think, "we'll just nuke it." There are, however, significant scientific questions remaining in the hazard mitigation problem. Before we can say with certainty that an explosive yield Y at height of burst h will produce a momentum change in, or dispersion of, a potentially hazardous object (PHO), we need to quantify how and where energy is deposited into the rubble pile or conglomerate that may make up the PHO. We then need to understand how shock waves propagate through the system, what causes it to disrupt, and how long gravitationally bound fragments take to recombine. Here we present numerical models of energy deposition from an energy source into various materials that are known PHO constituents, and rigid-body dynamics models of the recombination of disrupted objects. In the energy deposition models, we explore the effects of porosity and standoff distance as well as that of composition. In the dynamical models, we explore the effects of fragment size and velocity distributions on the time it takes for gravitationally bound fragments to recombine. Initial models indicate that this recombination time is relatively short, as little as 24 hours for a 1-km PHO composed of 1000 meter-scale self-gravitating fragments with an initial velocity field of v/r = 0.001 1/s.
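
A back-of-envelope check makes the short recombination time plausible: fragments moving slower than the body's escape speed remain gravitationally bound. The radius and bulk density below are assumed, illustrative values; only the v/r = 0.001 1/s velocity field is taken from the abstract.

```python
import math

# Back-of-envelope check on fragment recombination for a ~1 km rubble-pile PHO.
# Radius and bulk density are assumed, illustrative values; the v/r velocity
# field is the one quoted in the abstract.
G = 6.674e-11                       # gravitational constant, m^3 kg^-1 s^-2
radius = 500.0                      # m, assumed (1 km diameter body)
density = 2000.0                    # kg/m^3, assumed rubble-pile bulk density

mass = (4.0 / 3.0) * math.pi * radius**3 * density
v_escape = math.sqrt(2.0 * G * mass / radius)
v_fragment = 0.001 * radius         # v/r = 0.001 1/s evaluated at the surface

print(f"escape speed  : {v_escape:.2f} m/s")
print(f"fragment speed: {v_fragment:.2f} m/s")
# Fragments slower than the escape speed stay gravitationally bound, which is
# consistent with the short (~24 h) recombination times the models find.
```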

  16. Bike Helmets and Black Riders: Experiential Approaches to Helping Students Understand Natural Hazard Assessment and Mitigation Issues

    NASA Astrophysics Data System (ADS)

    Stein, S. A.; Kley, J.; Hindle, D.; Friedrich, A. M.

    2014-12-01

    Defending society against natural hazards is a high-stakes game of chance against nature, involving tough decisions. How should a developing nation allocate its budget between building schools for towns without them or making existing schools earthquake-resistant? Does it make more sense to build levees to protect against floods, or to prevent development in the areas at risk? Would more lives be saved by making hospitals earthquake-resistant, or by using the funds for patient care? These topics are challenging because they are far from normal experience, in that they involve rare events and large sums. To help students in natural hazard classes conceptualize them, we pose tough, thought-provoking questions about the complex issues involved and explore them together via lectures, videos, field trips, and in-class and homework questions. We discuss analogous examples from the students' experiences, drawing on a new book, "Playing Against Nature: Integrating Science and Economics to Mitigate Natural Hazards in an Uncertain World". Asking whether they wear bicycle helmets, and why or why not, shows the cultural perception of risk. Individual students' responses vary, and the overall results vary dramatically between the US, UK, and Germany. Challenges in hazard assessment in an uncertain world are illustrated by asking German students whether they buy a ticket on public transportation - accepting a known cost - or "ride black" - not paying but risking a heavy fine if caught. We explore the challenge of balancing mitigation costs and benefits via the question "If you were a student in Los Angeles, how much more would you pay in rent each month to live in an earthquake-safe building?" Students learn that interdisciplinary thinking is needed, and that due to both uncertainties and sociocultural factors, no unique or right strategies exist for a particular community, much less for all communities.
However, we can seek robust policies that give sensible results given uncertainties.
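
The "riding black" question reduces to comparing a certain cost with a rare, large loss. A minimal expected-cost sketch, in which the fare, fine, and inspection probability are all assumed values chosen only for illustration:

```python
# Expected-cost version of the "riding black" classroom example. Fare, fine
# and inspection probability are assumed values, chosen only to illustrate
# the trade-off between a certain cost and a rare, large loss.

fare = 3.0          # ticket price, assumed
fine = 60.0         # penalty if caught, assumed
p_inspect = 0.04    # chance of an inspection per trip, assumed

cost_ticket = fare              # certain cost of paying
cost_black = p_inspect * fine   # expected cost of not paying

print(f"ticket: {cost_ticket:.2f}  ride black (expected): {cost_black:.2f}")
# With these numbers the expected cost of riding black is lower, yet many
# riders still pay: risk aversion and social norms, not just expected value,
# drive mitigation choices.
```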

  17. Exploratory Studies Facility Subsurface Fire Hazards Analysis

    SciTech Connect

    Richard C. Logan

    2002-03-28

    The primary objective of this Fire Hazard Analysis (FHA) is to confirm that the requirements for a comprehensive fire and related hazards protection program for the Exploratory Studies Facility (ESF) are sufficient to minimize the potential for: the occurrence of a fire or related event; a fire that causes an unacceptable on-site or off-site release of hazardous or radiological material that would threaten the health and safety of employees, the public or the environment; vital U.S. Department of Energy (DOE) programs suffering unacceptable interruptions as a result of fire and related hazards; property losses from fire and related events exceeding limits established by DOE; and critical process controls and safety class systems being damaged as a result of fire and related events.

  18. Exploratory Studies Facility Subsurface Fire Hazards Analysis

    SciTech Connect

    J. L. Kubicek

    2001-09-07

    The primary objective of this Fire Hazard Analysis (FHA) is to confirm that the requirements for a comprehensive fire and related hazards protection program for the Exploratory Studies Facility (ESF) are sufficient to minimize the potential for: (1) the occurrence of a fire or related event; (2) a fire that causes an unacceptable on-site or off-site release of hazardous or radiological material that would threaten the health and safety of employees, the public or the environment; (3) vital U.S. Department of Energy (DOE) programs suffering unacceptable interruptions as a result of fire and related hazards; (4) property losses from fire and related events exceeding limits established by DOE; and (5) critical process controls and safety class systems being damaged as a result of fire and related events.

  19. A mission template for exploration and damage mitigation of potential hazard of Near Earth Asteroids

    NASA Astrophysics Data System (ADS)

    Hyland, D. C.; Altwaijry, H. A.; Margulieux, R.; Doyle, J.; Sandberg, J.; Young, B.; Satak, N.; Lopez, J.; Ge, S.; Bai, X.

    2010-10-01

    The Apophis Exploratory and Mitigation Platform (AEMP) concept was developed as a prototype mission to explore and potentially deflect the Near Earth Asteroid (NEA) 99942 Apophis. Deflection of the asteroid from the potential 2036 impact will be achieved using a gravity tractor technique, while a permanent deflection, eliminating future threats, will be imparted using a novel albedo manipulation technique. This mission will serve as an archetypal template for future missions to small NEAs and could be adapted to mitigate the threat of collision with other potential Earth-crossing objects.

  20. Laboratory scale studies on mitigation of high 222Rn concentrations in air and water

    NASA Astrophysics Data System (ADS)

    Mamoon, A.; Gomma, M. A.; Sohsah, M.

    2004-01-01

    In view of the occasional occurrence of high 222Rn concentrations in air and water under certain circumstances, and of the potential health hazards of elevated 222Rn levels in respirable air and potable water, mitigation of such high 222Rn concentrations has become a primary concern. Simple laboratory systems were used to study the efficiency of various 222Rn mitigation measures. Altered alkali granite was used as the radon source to enrich air, and a piece of pitchblende was used as the radon source to enrich water samples; both enriched media were then subjected to the mitigation treatments. The charcoal canister technique, along with gamma spectrometry, was used to measure 222Rn concentrations in air before and after the different mitigating treatments: ventilation, radon barriers such as geo-membranes and aluminum sheet, and sealants such as epoxy and vinyl tape. For high levels of 222Rn in air, ventilation was the most efficient mitigating factor. Standard liquid scintillation counting was used to measure 222Rn concentrations in water before and after the different mitigation treatments: aeration, activated charcoal, and heating. For high levels of 222Rn in water, aeration using bubblers and a large volume of air was most effective in removing radon in a short time. All the mitigating factors, however, proved effective to varying degrees in decreasing 222Rn concentrations in the respective media. The results of these studies are in general agreement with reports in the literature. It can be concluded that the different 222Rn mitigating factors can be tested and compared effectively under controlled conditions using simple laboratory-scale systems.
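
The dominance of ventilation is consistent with a simple steady-state box model, C = S / (lam_rn + lam_v): because the Rn-222 decay constant is tiny compared with typical air-exchange rates, even modest ventilation slashes the equilibrium concentration. A sketch with assumed source and air-exchange values (not data from the study):

```python
import math

# Steady-state box model illustrating why ventilation dominates indoor radon
# mitigation: C = S / (lam_rn + lam_v). The source term and air-exchange
# rates are assumed illustration values, not data from the study.

half_life_h = 3.82 * 24.0                 # Rn-222 half-life in hours
lam_rn = math.log(2) / half_life_h        # decay constant, ~0.0076 per hour
source = 100.0                            # radon entry rate, Bq/(m^3 h), assumed

for lam_v in (0.1, 0.5, 2.0):             # air changes per hour, assumed
    conc = source / (lam_rn + lam_v)
    print(f"ventilation {lam_v:.1f}/h -> steady-state ~{conc:.0f} Bq/m^3")
# Because lam_rn << lam_v, the equilibrium concentration scales almost
# inversely with the ventilation rate.
```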

  1. Debris flows hazard mitigation: an example in the F. Menotre basin (central Italy)

    NASA Astrophysics Data System (ADS)

    Taramelli, A.; Melelli, L.; Cattuto, C.; Gregori, L.

    2003-04-01

    Debris flows are important geomorphic events in a wide variety of landscapes. Although repeated landsliding causes bedrock to underlie many hillslopes of mountain belts, some landslide debris remains, stored as a thin colluvial cover, particularly in "hollows". These colluvial pockets act as slope-failure "hot spots" by focusing infiltration of storm runoff, leading to local groundwater concentration above perched water tables and therefore enhanced failure potential. Consequently, colluvial deposits in hollows are particularly susceptible to landsliding. Once a failure occurs, hillslope sediment transport processes refill the scar, resulting in a cycle of gradual colluvium accumulation and periodic instability. Hillslope debris moves downslope as a result of hillslope processes, and where overland flow is either non-erosive or infrequent, colluvium accumulates along the line of descent. The association of debris flows with hollows is thus governed by the relations between sediment transport, hillslope hydrology and slope stability, and hollows consequently define mappable debris-flow hazard source areas. With these goals in mind we propose a GIS model built on a record of landslide activity (debris flow events) in response to specific storms over areas of diverse geology, geomorphology, microclimate and vegetation. When combined with information on hillslope form and gradient, rainfall characteristics, and travel distance, such inventories can provide a foundation for the development of accurate and meaningful susceptibility maps. In particular, the susceptibility index identifies the geologic units that produced the most debris flows in each study area in response to specific rainfall conditions. To examine the relations between geologic units and debris flow susceptibility, we calculated an index of debris flow susceptibility for each geologic unit in each representative elementary area.
    This is done by dividing the number of landslide initiation locations within a particular unit by the areal extent of that unit in the study area. The final aim of our research is to identify the terrain attributes related to debris-flow occurrence and to quantify their relative contribution to the hazard assessment. The modeling work has focused on slope failure in the Menotre basin because the connection between severe storms and debris flows is clear and we had access to appropriate data to constrain the modeling.
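
The susceptibility index described above, initiation count divided by unit area, can be sketched as follows. Unit names, counts and areas are hypothetical, for illustration only:

```python
# Sketch of the susceptibility index defined in the abstract: debris-flow
# initiation count per geologic unit divided by that unit's areal extent.
# Unit names, counts and areas are hypothetical, for illustration only.

initiation_counts = {"flysch": 18, "limestone": 4, "alluvium": 2}
unit_area_km2 = {"flysch": 12.5, "limestone": 20.0, "alluvium": 8.0}

susceptibility = {
    unit: initiation_counts[unit] / unit_area_km2[unit]
    for unit in initiation_counts
}

# Rank units from most to least susceptible:
for unit, index in sorted(susceptibility.items(), key=lambda kv: -kv[1]):
    print(f"{unit:10s} {index:.2f} initiations/km^2")
```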

  2. Studies Update Vinyl Chloride Hazards.

    ERIC Educational Resources Information Center

    Rawls, Rebecca

    1980-01-01

    Extensive study affirms that vinyl chloride is a potent animal carcinogen. Epidemiological studies show elevated rates of human cancers in association with extended contact with the compound. (Author/RE)

  3. Advances in Remote Sensing Approaches for Hazard Mitigation and Natural Resource Protection in Pacific Latin America: A Workshop for Advanced Graduate Students, Post- Doctoral Researchers, and Junior Faculty

    NASA Astrophysics Data System (ADS)

    Gierke, J. S.; Rose, W. I.; Waite, G. P.; Palma, J. L.; Gross, E. L.

    2008-12-01

    Though much of the developing world has the potential to gain significantly from remote sensing techniques in terms of public health and safety, it often lacks the resources for advancing the development and practice of remote sensing. All countries share a mutual interest in furthering remote sensing capabilities for natural hazard mitigation and resource development. With National Science Foundation support from the Partnerships in International Research and Education program, we are developing a new educational system of applied research and engineering for advancing collaborative linkages among agencies and institutions in Pacific Latin American countries (to date: Guatemala, El Salvador, Nicaragua, Costa Rica, Panama, and Ecuador) in the development of remote sensing tools for hazard mitigation and water resources management. The project aims to prepare students for careers in science and engineering through their efforts to solve suites of problems needing creative solutions: collaboration with foreign agencies; living abroad immersed in different cultures; and adapting their academic training to contend with potentially difficult field conditions and limited resources. The ultimate goal of integrating research with education is to encourage cross-disciplinary, creative, and critical thinking in problem solving and foster the ability to deal with uncertainty in analyzing problems and designing appropriate solutions. In addition to traditional approaches for graduate and undergraduate research, we have built new educational systems of applied research and engineering: (1) the Peace Corps/Master's International program in Natural Hazards, which features a 2-year field assignment during service in the U.S.
Peace Corps, (2) the Michigan Tech Enterprise program for undergraduates, which gives teams of students from different disciplines the opportunity to work for three years in a business-like setting to solve real-world problems, and (3) a unique university exchange program in natural hazards (E-Haz). Advancements in research have been made, for example, in using thermal remote sensing methods for studying vent and eruptive processes, and in fusing RADARSAT with ASTER imagery to delineate lineaments in volcanic terrains for siting water wells. While these and other advancements are developed in conjunction with our foreign counterparts, the impacts of this work can be broadened through more comprehensive dissemination activities. Towards this end, we are in the planning phase of a Pan American workshop on applications of remote sensing techniques for natural hazards and water resources management. The workshop will be at least two weeks, sometime in July/August 2009, and involve 30-40 participants, with balanced participation from the U.S. and Latin America. In addition to fundamental aspects of remote sensing and digital image processing, the workshop topics will be presented in the context of new developments for studying volcanic processes and hazards and for characterizing groundwater systems.

  4. Probabilistic Tsunami Hazard Assessment: the Seaside, Oregon Pilot Study

    NASA Astrophysics Data System (ADS)

    Gonzalez, F. I.; Geist, E. L.; Synolakis, C.; Titov, V. V.

    2004-12-01

    A pilot study of Seaside, Oregon is underway, to develop methodologies for probabilistic tsunami hazard assessments that can be incorporated into Flood Insurance Rate Maps (FIRMs) developed by FEMA's National Flood Insurance Program (NFIP). Current NFIP guidelines for tsunami hazard assessment rely on the science, technology and methodologies developed in the 1970s; although generally regarded as groundbreaking and state-of-the-art for its time, this approach is now superseded by modern methods that reflect substantial advances in tsunami research achieved in the last two decades. In particular, post-1990 technical advances include: improvements in tsunami source specification; improved tsunami inundation models; better computational grids by virtue of improved bathymetric and topographic databases; a larger database of long-term paleoseismic and paleotsunami records and short-term, historical earthquake and tsunami records that can be exploited to develop improved probabilistic methodologies; better understanding of earthquake recurrence and probability models. The NOAA-led U.S. National Tsunami Hazard Mitigation Program (NTHMP), in partnership with FEMA, USGS, NSF and Emergency Management and Geotechnical agencies of the five Pacific States, incorporates these advances into site-specific tsunami hazard assessments for coastal communities in Alaska, California, Hawaii, Oregon and Washington. NTHMP hazard assessment efforts currently focus on developing deterministic, "credible worst-case" scenarios that provide valuable guidance for hazard mitigation and emergency management. The NFIP focus, on the other hand, is on actuarial needs that require probabilistic hazard assessments such as those that characterize 100- and 500-year flooding events. There are clearly overlaps in NFIP and NTHMP objectives. 
NTHMP worst-case scenario assessments that include an estimated probability of occurrence could benefit the NFIP; NFIP probabilistic assessments of 100- and 500-yr events could benefit the NTHMP. The joint NFIP/NTHMP pilot study at Seaside, Oregon is organized into three closely related components: Probabilistic, Modeling, and Impact studies. Probabilistic studies (Geist, et al., this session) are led by the USGS and include the specification of near- and far-field seismic tsunami sources and their associated probabilities. Modeling studies (Titov, et al., this session) are led by NOAA and include the development and testing of a Seaside tsunami inundation model and an associated database of computed wave height and flow velocity fields. Impact studies (Synolakis, et al., this session) are led by USC and include the computation and analyses of indices for the categorization of hazard zones. The results of each component study will be integrated to produce a Seaside tsunami hazard map. This presentation will provide a brief overview of the project and an update on progress, while the above-referenced companion presentations will provide details on the methods used and the preliminary results obtained by each project component.
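
The "100-year" and "500-year" language used for NFIP flood events corresponds to an annual exceedance probability of 1/T. The following sketch shows this standard identity (general hydrology convention, not a result of the Seaside study):

```python
# The NFIP's "100-year" and "500-year" event language means an annual
# exceedance probability of 1/T. This standard identity (not specific to the
# Seaside study) gives the chance of at least one such event in n years.

def prob_at_least_one(T, n):
    """P(at least one T-year event in n years) = 1 - (1 - 1/T)**n."""
    return 1.0 - (1.0 - 1.0 / T) ** n

for T in (100, 500):
    print(f"{T}-year event: annual p = {1.0 / T:.3f}, "
          f"P(>=1 in 30 yr) = {prob_at_least_one(T, 30):.2f}")
```

Note that a "100-year" event has roughly a one-in-four chance of occurring during a 30-year period, which is why actuarial assessments cannot treat such events as negligible.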

  5. Remote Sensing for Hazard Mitigation and Resource Protection in Pacific Latin America: New NSF sponsored initiative at Michigan Tech.

    NASA Astrophysics Data System (ADS)

    Rose, W. I.; Bluth, G. J.; Gierke, J. S.; Gross, E.

    2005-12-01

    Though much of the developing world has the potential to gain significantly from remote sensing techniques in terms of public health and safety and, eventually, economic development, it lacks the resources required to advance the development and practice of remote sensing. Both developed and developing countries share a mutual interest in furthering remote sensing capabilities for natural hazard mitigation and resource development, and this common commitment creates a solid foundation upon which to build an integrated education and research project. This will prepare students for careers in science and engineering through their efforts to solve a suite of problems needing creative solutions: collaboration with foreign agencies; living abroad immersed in different cultures; and adapting their academic training to contend with potentially difficult field conditions and limited resources. This project makes two important advances: (1) We intend to develop the first formal linkage among geoscience agencies from four Pacific Latin American countries (Guatemala, El Salvador, Nicaragua and Ecuador), focusing on the collaborative development of remote sensing tools for hazard mitigation and water resource development; (2) We will build a new educational system of applied research and engineering, using two existing educational programs at Michigan Tech: a new Peace Corps/Master's International (PC/MI) program in Natural Hazards, which features a 2-year field assignment, and an "Enterprise" program for undergraduates, which gives teams of geoengineering students the opportunity to work for three years in a business-like setting to solve real-world problems. This project will involve 1-2 post-doctoral researchers, 3 Ph.D., 9 PC/MI, and roughly 20 undergraduate students each year.

  6. The respiratory health hazards of volcanic ash: a review for volcanic risk mitigation

    NASA Astrophysics Data System (ADS)

    Horwell, Claire J.; Baxter, Peter J.

    2006-07-01

    Studies of the respiratory health effects of different types of volcanic ash have been undertaken only in the last 40 years, and mostly since the eruption of Mt. St. Helens in 1980. This review of all published clinical, epidemiological and toxicological studies, and other work known to the authors up to and including 2005, highlights the sparseness of studies on acute health effects after eruptions and the complexity of evaluating the long-term health risk (silicosis, non-specific pneumoconiosis and chronic obstructive pulmonary disease) in populations from prolonged exposure to ash due to persistent eruptive activity. The acute and chronic health effects of volcanic ash depend upon particle size (particularly the proportion of respirable-sized material), mineralogical composition (including the crystalline silica content) and the physico-chemical properties of the surfaces of the ash particles, all of which vary between volcanoes and even eruptions of the same volcano, but adequate information on these key characteristics is not reported for most eruptions. The incidence of acute respiratory symptoms (e.g. asthma, bronchitis) varies greatly after ashfalls, from very few, if any, reported cases to population outbreaks of asthma. The studies are inadequate for excluding increases in acute respiratory mortality after eruptions. Individuals with pre-existing lung disease, including asthma, can be at increased risk of their symptoms being exacerbated after falls of fine ash. A comprehensive risk assessment, including toxicological studies, to determine the long-term risk of silicosis from chronic exposure to volcanic ash, has been undertaken only in the eruptions of Mt. St. Helens (1980), USA, and Soufrière Hills, Montserrat (1995 onwards). 
In the Soufrière Hills eruption, a long-term silicosis hazard has been identified and sufficient exposure and toxicological information obtained to make a probabilistic risk assessment for the development of silicosis in outdoor workers and the general population. A more systematic approach to multi-disciplinary studies in future eruptions is recommended, including establishing an archive of ash samples and a website containing health advice for the public, together with scientific and medical study guidelines for volcanologists and health-care workers.

  7. Piloted Simulation to Evaluate the Utility of a Real Time Envelope Protection System for Mitigating In-Flight Icing Hazards

    NASA Technical Reports Server (NTRS)

    Ranaudo, Richard J.; Martos, Borja; Norton, Bill W.; Gingras, David R.; Barnhart, Billy P.; Ratvasky, Thomas P.; Morelli, Eugene

    2011-01-01

    The utility of the Icing Contamination Envelope Protection (ICEPro) system for mitigating a potentially hazardous icing condition was evaluated by 29 pilots using the NASA Ice Contamination Effects Flight Training Device (ICEFTD). ICEPro provides real time envelope protection cues and alerting messages on pilot displays. The pilots participating in this test were divided into two groups: a control group using baseline displays without ICEPro, and an experimental group using ICEPro-driven display cueing. Each group flew identical precision approach and missed approach procedures with a simulated failure case icing condition. Pilot performance, workload, and survey questionnaires were collected for both groups of pilots. Results showed that real time assessment cues were effective in reducing the number of potentially hazardous upset events and in lessening exposure to loss of control following an incipient upset condition. Pilot workload with the added ICEPro displays was not measurably affected, and pilot opinion surveys showed that real time cueing greatly improved their situation awareness of a hazardous aircraft state.

  8. Hazardous near Earth asteroid mitigation campaign planning based on uncertain information on fundamental asteroid characteristics

    NASA Astrophysics Data System (ADS)

    Sugimoto, Y.; Radice, G.; Ceriotti, M.; Sanchez, J. P.

    2014-10-01

    Given a limited warning time, an asteroid impact mitigation campaign would hinge on uncertainty-based information consisting of remote observational data of the identified Earth-threatening object, general knowledge of near-Earth asteroids (NEAs), and engineering judgment. Due to these ambiguities, the campaign credibility could be profoundly compromised. It is therefore imperative to comprehensively evaluate the inherent uncertainty in deflection and plan the campaign accordingly to ensure successful mitigation. This research demonstrates dual-deflection mitigation campaigns consisting of primary (instantaneous/quasi-instantaneous) and secondary (slow-push) deflection missions, where both deflection efficiency and campaign credibility are taken into account. The results of the dual-deflection campaign analysis show that there are trade-offs between the competing aspects: the launch cost, mission duration, deflection distance, and the confidence in successful deflection. The design approach is found to be useful for multi-deflection campaign planning, allowing us to select the best possible combination of missions from a catalogue of campaign options, without compromising the campaign credibility.

  9. The Relation of Hazard Awareness to Adoption of Approved Mitigation Measures.

    ERIC Educational Resources Information Center

    Saarinen, Thomas F.

    The relationship between an individual's or community's awareness of natural hazards and subsequent behavior change is examined in this review of research. The document is presented in seven sections. Following Section I, the introduction, Section II discusses the role of experience in behavior change. Section III examines the role of education

  10. Overcoming obstacles to implementation: addressing political, institutional and behavioral problems in earthquake hazard mitigation policies

    NASA Astrophysics Data System (ADS)

    Alesch, Daniel J.; Petak, William J.

    2002-06-01

    This project is aimed at bridging the three planes, from basic research, through enabling processes, to engineered systems. At the basic research plane, we have been working to improve our collective understanding about obstacles to implementing mitigation practices, owner decision processes (in connection with other MCEER projects), and public policy processes. At the level of enabling processes, we have been seeking to develop an understanding of how obstacles to greater mitigation can be overcome by improved policy design and processes. At the engineered systems plane, our work is intended to result in practical guidelines for devising policies and programs with appropriate motivation and incentives for implementing policies and programs once adopted. This phase of the research has been aimed, first, at a thorough, multidisciplinary review of the literature concerning obstacles to implementation. Second, the research has focused on advancing the state of the art by developing means for integrating the insights offered by diverse perspectives on the implementation process from the several social, behavioral, and decision sciences. The research establishes a basis for testing our understanding of these processes in the case of hospital retrofit decisions.

  11. Marine and Hydrokinetic Renewable Energy Devices, Potential Navigational Hazards and Mitigation Measures

    SciTech Connect

    Cool, Richard M.; Hudon, Thomas J.; Basco, David R.; Rondorf, Neil E.

    2009-12-01

    On April 15, 2008, the Department of Energy (DOE) issued a Funding Opportunity Announcement for Advanced Water Power Projects which included a Topic Area for Marine and Hydrokinetic Renewable Energy Market Acceleration Projects. Within this Topic Area, DOE identified potential navigational impacts of marine and hydrokinetic renewable energy technologies and measures to prevent adverse impacts on navigation as a sub-topic area. DOE defines marine and hydrokinetic technologies as those capable of utilizing one or more of the following resource categories for energy generation: ocean waves; tides or ocean currents; free flowing water in rivers or streams; and energy generation from the differentials in ocean temperature. PCCI was awarded Cooperative Agreement DE-FC36-08GO18177 from the DOE to identify the potential navigational impacts and mitigation measures for marine hydrokinetic technologies. A technical report addressing our findings is available on this Science and Technology Information site under the Product Title, "Marine and Hydrokinetic Renewable Energy Technologies: Potential Navigational Impacts and Mitigation Measures". This product is a brochure, primarily for project developers, that summarizes important issues in that more comprehensive report, identifies locations where that report can be downloaded, and identifies points of contact for more information.

  12. Debris flood hazard documentation and mitigation on the Tilcara alluvial fan (Quebrada de Humahuaca, Jujuy province, North-West Argentina)

    NASA Astrophysics Data System (ADS)

    Marcato, G.; Bossi, G.; Rivelli, F.; Borgatti, L.

    2012-06-01

    For some decades, mass wasting processes such as landslides and debris floods have been threatening villages and transportation routes in the Rio Grande Valley, named Quebrada de Humahuaca. One of the most significant examples is the urban area of Tilcara, built on a large alluvial fan. In recent years, debris flood phenomena have been triggered in the tributary valley of the Huasamayo Stream and have reached the alluvial fan on a decadal basis. In view of proper development of the area, hazard and risk assessment together with risk mitigation strategies are of paramount importance. The need is urgent also because the Quebrada de Humahuaca was recently included in the UNESCO World Cultural Heritage. The growing tourism industry may therefore lead to uncontrolled exploitation and urbanization of the valley, with a consequent increase in the vulnerability of the elements exposed to risk. In this context, structural and non-structural mitigation measures not only have to be based on an understanding of natural processes, but also have to consider environmental and sociological factors that could hinder the effectiveness of the countermeasure works. The hydrogeological processes are described with reference to present-day hazard and risk conditions. Considering the socio-economic context, some possible interventions are outlined, which account for budget constraints and local practices. One viable solution would be to build a protecting dam upstream of the fan apex and an artificial channel, in order to divert the floodwaters into a gully that would then convey water and sediments into the Rio Grande, some kilometers downstream of Tilcara. The proposed remedial measures should employ easily available and relatively cheap technologies and local workers, with low environmental and visual impact, in order to ensure both the future conservation of the site and its safe exploitation by inhabitants and tourists.

  13. Societal transformation and adaptation necessary to manage dynamics in flood hazard and risk mitigation (TRANS-ADAPT)

    NASA Astrophysics Data System (ADS)

    Fuchs, Sven; Thaler, Thomas; Bonnefond, Mathieu; Clarke, Darren; Driessen, Peter; Hegger, Dries; Gatien-Tournat, Amandine; Gralepois, Mathilde; Fournier, Marie; Mees, Heleen; Murphy, Conor; Servain-Courant, Sylvie

    2015-04-01

    Facing the challenges of climate change, this project aims to analyse and evaluate the multiple uses of flood alleviation schemes with respect to social transformation in communities exposed to flood hazards in Europe. The overall goals are: (1) the identification of indicators and parameters necessary for strategies to increase societal resilience, (2) an analysis of the institutional settings needed for societal transformation, and (3) perspectives on the changing divisions of responsibilities between public and private actors necessary to arrive at more resilient societies. Each risk mitigation measure is built on a narrative of exchanges and relations between people, which may condition its outcomes: governance is enacted by people interacting, so risk mitigation measures and climate change adaptation are simultaneously outcomes of, and productive of, public and private responsibilities. Building on current knowledge, this project will focus on different dimensions of adaptation and mitigation strategies based on social, economic and institutional incentives and settings, centring on the linkages between these dimensions and complementing existing flood risk governance arrangements. The policy dimension of adaptation, predominantly decisions on the societally admissible level of vulnerability and risk, will be evaluated by a human-environment interaction approach using multiple methods and the assessment of social capacities of stakeholders across scales. The challenges of adaptation to flood risk will thus be tackled by converting scientific frameworks into practical assessment and policy advice. 
In addressing the relationship between these dimensions of adaptation on different temporal and spatial scales, this project is both scientifically innovative and policy relevant, thereby supporting climate policy needs in Europe towards a concept of risk governance. Key words: climate change adaptation; transformation; flood risk management; resilience; vulnerability; innovative bottom-up developments; multifunctional use

  14. Detecting Slow Deformation Signals Preceding Dynamic Failure: A New Strategy For The Mitigation Of Natural Hazards (SAFER)

    NASA Astrophysics Data System (ADS)

    Vinciguerra, Sergio; Colombero, Chiara; Comina, Cesare; Ferrero, Anna Maria; Mandrone, Giuseppe; Umili, Gessica; Fiaschi, Andrea; Saccorotti, Gilberto

    2014-05-01

    Rock slope monitoring is a major aim in territorial risk assessment and mitigation. The high velocity that usually characterizes the failure phase of rock instabilities makes traditional instruments based on slope deformation measurements unsuitable for early warning systems. On the other hand, acoustic emission records have often proved a good monitoring tool in underground mining. Here we aim to identify the characteristic signs of impending failure by deploying a site-specific microseismic monitoring system on an unstable patch of the Madonna del Sasso landslide in the Italian Western Alps, designed to monitor subtle changes in the mechanical properties of the medium and installed as close as possible to the source region. An initial characterization based on geomechanical and geophysical tests allowed us to understand the instability mechanism and to design the monitoring systems. Stability analysis showed that the slope is currently held in place by rock bridges; their progressive failure could result in a global slope failure, so the rock bridges potentially generating dynamic ruptures need to be monitored. A first array, consisting of instruments provided by the University of Turin, was deployed in October 2013: four triaxial 4.5 Hz seismometers connected to a 12-channel data logger, arranged in a 'large aperture' configuration encompassing the entire unstable rock mass. Preliminary data indicate the occurrence of microseismic swarms with different spectral contents. Two additional geophones and four triaxial piezoelectric accelerometers able to operate at frequencies up to 23 kHz will be installed during summer 2014. This will allow us to develop a network capable of recording events with Mw < 0.5 and frequencies between 700 Hz and 20 kHz. 
Rock physical and mechanical characterization, along with rock deformation laboratory experiments in which the evolution of the related physical parameters will be studied under simulated conditions of stress and fluid content, together with theoretical modelling, will allow us to arrive at a full hazard assessment and to test new methodologies for a much wider range of applications within the EU.
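Microseismic monitoring systems like the one described above typically run an automatic event detector on the continuous records. The abstract does not name a detection algorithm, so the classic short-term/long-term average (STA/LTA) ratio is sketched here in Python purely as an illustration; all window lengths and amplitudes are made-up values.

```python
import numpy as np

def sta_lta(signal, sta_len, lta_len):
    """Short-term / long-term average ratio on signal energy.
    Values well above 1 suggest a transient (e.g. a microseismic event)."""
    sq = signal.astype(float) ** 2
    sta = np.convolve(sq, np.ones(sta_len) / sta_len, mode="same")
    lta = np.convolve(sq, np.ones(lta_len) / lta_len, mode="same")
    lta[lta == 0] = np.finfo(float).tiny   # guard against division by zero
    return sta / lta

# synthetic trace: low-amplitude noise with one short high-energy burst
rng = np.random.default_rng(0)
trace = rng.normal(0.0, 0.1, 2000)
trace[1000:1040] += rng.normal(0.0, 2.0, 40)   # the "event"

ratio = sta_lta(trace, sta_len=20, lta_len=400)
print("peak ratio inside burst:", round(float(ratio[1000:1040].max()), 1))
```

In practice causal (trailing) windows and per-station trigger thresholds would be tuned to the 700 Hz to 20 kHz band the network is designed to record.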

  15. Volcanic risk: mitigation of lava flow invasion hazard through optimized barrier configuration

    NASA Astrophysics Data System (ADS)

    Scifoni, S.; Coltelli, M.; Marsella, M.; Napoleoni, Q.; Del Negro, C.; Proietti, C.; Vicari, A.

    2009-04-01

    In order to mitigate the destructive effects of lava flows along volcanic slopes, the building of artificial barriers is a fundamental action for controlling and slowing down the advance of lava flows, as experienced during a few recent eruptions of Etna. The simulated lava path can be used to optimize the siting of the works, but timely action also requires that a barrier be constructed quickly. This work therefore investigates the different types of engineering works that can be adopted to build a lava-containing barrier, with the aim of improving the efficiency of the structure. From the analysis of historical cases it is clear that barriers were generally constructed by piling up earth, lava blocks and incoherent, low-density material. This solution implies complex operational constraints and logistical problems that justify the search for alternative designs. To optimize barrier construction, an alternative design based on gabions is proposed here: the volume of mobilized material is lower than that of an earthen barrier, reducing the time needed to build up the structure. A second crucial aspect is the geometry of the barrier, which is one of the few parameters that can be modulated, the others being dictated by the morphological and topographical characteristics of the ground. Once the walls have been built, it may be necessary to extend the structure vertically. Gabions have many advantages over loose riprap (earthen walls) owing to their modularity and their capability to be stacked in various shapes. Furthermore, elements that are not inundated by lava can be removed and rapidly reused for other barriers. The combination of numerical simulations and gabions will allow quicker mitigation of lava flow risk, an important aspect for civil protection interventions in emergency situations.

  16. Smart Oceans BC: Supporting Coastal and Ocean Natural Hazards Mitigation for British Columbia

    NASA Astrophysics Data System (ADS)

    Moran, K.; Insua, T. L.; Pirenne, B.; Hoeberechts, M.; McLean, S.

    2014-12-01

    Smart Oceans BC is a new multi-faceted program to support decision-makers faced with responding to natural disasters and hazards in Canada's Province of British Columbia. It leverages the unique capabilities of Ocean Networks Canada's cabled ocean observatories, NEPTUNE and VENUS, to enhance public safety, marine safety and environmental monitoring. Smart Oceans BC combines existing and new marine sensing technology with Ocean Networks Canada's robust data management and archive system, Oceans 2.0, to deliver the information and science needed for good ocean management and responsible ocean use. Smart Oceans BC includes new ocean observing infrastructure for: public safety, through natural hazard detection for earthquake ground shaking and near-field tsunamis; marine safety, by monitoring and providing alerts on sea state, ship traffic, and marine mammal presence; and environmental protection, by establishing baseline data in critical areas and providing real-time environmental observations. Here we present the elements of this new ocean observing initiative that are focused on tsunami and earthquake early warning, including cabled and autonomous sensor systems, real-time data delivery, software developments that enable rapid detection, the analytics used to develop notifications, and stakeholder engagement plans.

  17. A fast global tsunami modeling suite as a trans-oceanic tsunami hazard prediction and mitigation tool

    NASA Astrophysics Data System (ADS)

    Mohammed, F.; Li, S.; Jalali Farahani, R.; Williams, C. R.; Astill, S.; Wilson, P. S.; B, S.; Lee, R.

    2014-12-01

    The past decade has witnessed two mega-tsunami events, the 2004 Indian Ocean tsunami and the 2011 Japan tsunami, and multiple major tsunami events: 2006 Java, Kuril Islands, 2007 Solomon Islands, 2009 Samoa and 2010 Chile, to name a few. These events generated both local and far-field tsunami inundations, with runup ranging from a few meters to around 40 m in the coastal impact regions. With a majority of the coastal population at risk, there is a need for a sophisticated outlook on catastrophe risk estimation and quick mitigation response, as well as for tools and information to aid advanced tsunami hazard prediction. Insurers, reinsurers and federal hazard management agencies increasingly need to quantify coastal inundations and the vulnerability of coastal habitat to tsunami inundation. A novel tool is developed to model local and far-field tsunami generation, propagation and inundation in order to estimate tsunami hazards. The tool combines the NOAA MOST propagation database with an efficient and fast GPU (Graphical Processing Unit)-based non-linear shallow water wave model solver. Tsunamigenic seismic sources are mapped onto the NOAA unit source distribution along subduction zones in the ocean basin. Slip models are defined for tsunamigenic seismic sources through a slip distribution on the unit sources while maintaining limits on fault areas. A GPU-based finite volume solver is used to simulate non-linear shallow water wave propagation, inundation and runup. Deformation on the unit sources provides initial conditions for modeling local impacts, while the wave history from the propagation database provides boundary conditions for far-field impacts. The modeling suite shows good agreement for basin-wide tsunami propagation, validating both local and far-field tsunami inundations.
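The abstract's GPU finite-volume solver is not public, but the non-linear shallow water equations it integrates can be illustrated with a deliberately minimal 1-D Lax-Friedrichs scheme (CPU, flat bathymetry, copy-outflow boundaries). The scheme, grid, and dam-break-style initial condition below are illustrative assumptions, not the suite's actual method.

```python
import numpy as np

g = 9.81  # gravitational acceleration, m/s^2

def swe_step(h, hu, dx, dt):
    """One Lax-Friedrichs step of the 1-D non-linear shallow water
    equations in conserved variables h (depth) and hu (momentum)."""
    def flux(h, hu):
        u = hu / h
        return np.array([hu, hu * u + 0.5 * g * h ** 2])
    q = np.array([h, hu])
    f = flux(h, hu)
    qn = np.empty_like(q)
    # interior update; simple copy (outflow) boundaries
    qn[:, 1:-1] = 0.5 * (q[:, 2:] + q[:, :-2]) \
        - dt / (2 * dx) * (f[:, 2:] - f[:, :-2])
    qn[:, 0] = qn[:, 1]
    qn[:, -1] = qn[:, -2]
    return qn[0], qn[1]

# 1 km flat channel with a 1 m free-surface hump on 10 m still water
n, dx = 200, 5.0
x = np.arange(n) * dx
h = 10.0 + np.where(np.abs(x - 500.0) < 50.0, 1.0, 0.0)
hu = np.zeros(n)
dt = 0.4 * dx / np.sqrt(g * h.max())   # CFL-limited time step
for _ in range(100):
    h, hu = swe_step(h, hu, dx, dt)
print("total volume:", float(h.sum() * dx))  # conserved away from boundaries
```

The hump splits into two outgoing gravity waves; because the disturbance never reaches the open boundaries in this run, total water volume is conserved to round-off.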

  18. Field Guide for Testing Existing Photovoltaic Systems for Ground Faults and Installing Equipment to Mitigate Fire Hazards: November 2012 - October 2013

    SciTech Connect

    Brooks, William

    2015-02-01

    Ground faults and arc faults are the two most common causes of fires in photovoltaic (PV) arrays, and methods exist that can mitigate these hazards. This report provides field procedures for testing PV arrays for ground faults, and for implementing high-resolution ground fault and arc fault detectors in existing and new PV system designs.

  19. Novel bio-inspired smart control for hazard mitigation of civil structures

    NASA Astrophysics Data System (ADS)

    Kim, Yeesock; Kim, Changwon; Langari, Reza

    2010-11-01

    In this paper, a new bio-inspired controller is proposed for vibration mitigation of smart structures subjected to ground disturbances (i.e. earthquakes). The control system is developed through the integration of a brain emotional learning (BEL) algorithm with a proportional-integral-derivative (PID) controller and a semiactive inversion (Inv) algorithm. The BEL algorithm is based on the neurologically inspired computational model of the amygdala and the orbitofrontal cortex. To demonstrate the effectiveness of the proposed hybrid BEL-PID-Inv control algorithm, a seismically excited building structure equipped with a magnetorheological (MR) damper is investigated. The performance of the proposed hybrid BEL-PID-Inv control algorithm is compared with that of passive, PID, linear quadratic Gaussian (LQG), and BEL control systems. In the simulation, the robustness of the hybrid BEL-PID-Inv control algorithm in the presence of modeling uncertainties as well as external disturbances is investigated. It is shown that the proposed hybrid BEL-PID-Inv control algorithm is effective in improving the dynamic responses of seismically excited building structure-MR damper systems.
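The BEL component of the hybrid controller is beyond a short sketch, but the PID building block it is integrated with is standard. Below is a minimal discrete PID in Python driving a toy first-order plant; the plant and the gains are illustrative stand-ins, not the paper's building/MR-damper model.

```python
class PID:
    """Minimal discrete PID controller (parallel form)."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_err = 0.0

    def update(self, setpoint, measured):
        err = setpoint - measured
        self.integral += err * self.dt          # accumulate integral term
        deriv = (err - self.prev_err) / self.dt  # backward-difference derivative
        self.prev_err = err
        return self.kp * err + self.ki * self.integral + self.kd * deriv

# regulate a toy first-order plant x' = -x + u toward zero displacement
dt = 0.01
pid = PID(kp=4.0, ki=1.0, kd=0.1, dt=dt)
x = 1.0                       # initial displacement
for _ in range(5000):         # 50 s of simulated time, explicit Euler
    u = pid.update(0.0, x)
    x += (-x + u) * dt
print(f"final displacement: {x:.2e}")
```

In the paper's scheme the BEL algorithm adapts the command and the inversion stage maps it to a realizable MR-damper voltage; the PID above is only the linear core of such a hybrid.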

  20. Protecting new health facilities from natural hazards: guidelines for the promotion of disaster mitigation.

    PubMed

    2004-01-01

    The health sector is particularly vulnerable to naturally occurring events, and the vulnerability of the health infrastructure (hospitals and clinics) is of particular concern. Not only are such facilities vulnerable structurally, but their ability to continue to provide essential functions may be severely compromised, leaving the stricken population without essential services. This paper summarizes a more detailed document, Guidelines for Vulnerability Reduction in the Design of New Health Facilities, published by the Pan American Health Organization (PAHO)/World Health Organization (WHO), emphasizing how the Guidelines may be used, by whom, and for what purpose. Potential users of the Guidelines include, but are not limited to: (1) initiators of health facility construction projects; (2) executors and supervisors of health facility construction projects; and (3) financing bodies in charge of funding health facility construction projects. The Guidelines cover: (1) the implications of natural phenomena for the health infrastructure; (2) guidelines for vulnerability reduction for incorporation into development project cycles; and (3) the phases and stages of development projects, namely (I) Project Assessment (needs assessment; assessment of options; the preliminary project); (II) Investment (project design, construction); and (III) Operational Activities (operations and maintenance). In addition, investment in damage reduction measures, policies and regulations, training and education, and the role of international organizations in the promotion and funding of mitigation strategies are addressed. PMID:15645629

  1. Using Darwin's theory of atoll formation to improve tsunami hazard mitigation in the Pacific

    NASA Astrophysics Data System (ADS)

    Goff, J. R.; Terry, J. P.

    2012-12-01

    It is 130 years since Charles Darwin's death and 176 years since he penned his subsidence theory of atoll formation on 12 April 1836, during the voyage of the Beagle through the Pacific. This theory, founded on the premise of a subsiding volcano and the corresponding upward growth of coral reef, was astonishing for its time, considering the absence of an underpinning awareness of plate tectonics. Furthermore, with the exception of the occasional permutation and opposing idea, his theory has endured and has an enviable longevity amongst paradigms in geomorphology. In his theory, Darwin emphasised the generally circular morphology of the atoll, and surprisingly, the validity of this simple morphological premise has never been questioned. There are, however, few atolls in the Pacific Ocean that attain such a simple morphology, with most manifesting one or more arcuate 'bight-like' structures (ABLSs). These departures from the circular form complicate his simple model and are indicative of geomorphological processes in the Pacific Ocean that cannot be ignored. ABLSs represent the surface morphological expression of major submarine failures of atoll volcanic foundations. Such failures can occur during any stage of atoll formation and are a valuable addition to Darwin's theory because they indicate the instability of the volcanic foundations. It is widely recognized in the research community that sector/flank collapses of island edifices are invariably tsunamigenic, and yet we have no clear understanding of how significant such events are for tsunami hazard. The recognition of ABLSs, however, now offers scientists the opportunity to establish a first-order database of potential local and regional tsunamigenic sources associated with sector/flank collapses of island edifices. We illustrate the talk with examples of arcuate 'bight-like' structures and associated tsunamis in atoll and atoll-like environments. 
The implications for our understanding of tsunami hazards are profound. In essence, at present we are seriously underestimating the significance of locally and regionally generated tsunamis throughout the entire Pacific Ocean, but we now have the opportunity to improve our understanding of such events.

  2. Educational Approach to Seismic Risk Mitigation in Indian Himalayas -Hazard Map Making Workshops at High Schools-

    NASA Astrophysics Data System (ADS)

    Koketsu, K.; Oki, S.; Kimura, M.; Chadha, R. K.; Davuluri, S.

    2014-12-01

    How can we encourage people to take preventive measures against damage risks and empower them to take the right actions in emergencies to save their lives? The conventional approach taken by scientists had been disseminating intelligible information on up-to-date seismological knowledge. However, it has been proven that knowledge alone does not have enough impact to modify people's behaviors in emergencies (Oki and Nakayachi, 2012). On the other hand, the conventional approach taken by practitioners had been to conduct emergency drills at schools or workplaces. The loss of many lives from the 2011 Tohoku earthquake has proven that these emergency drills were not enough to save people's lives, unless they were empowered to assess the given situation on their own and react flexibly. Our challenge is to bridge the gap between knowledge and practice. With reference to best practices observed in Tohoku, such as The Miracles of Kamaishi, our endeavor is to design an effective Disaster Preparedness Education Program that is applicable to other disaster-prone regions in the world, even with different geological, socio-economical and cultural backgrounds. The key concepts for this new approach are 1) empowering individuals to take preventive actions to save their lives, 2) granting community-based understanding of disaster risks and 3) building a sense of reality and relevancy to disasters. With these in mind, we held workshops at some high schools in the Lesser Himalayan Region, combining lectures with an activity called "Hazard Map Making" where students proactively identify and assess the hazards around their living areas and learn practical strategies on how to manage risks. We observed the change of awareness of the students by conducting a preliminary questionnaire survey and interviews after each session. 
Results strongly implied that the significant change in students' attitudes towards disaster preparedness occurred not through the lectures on scientific knowledge, but after completing the whole program of activities. Students closed their presentations by spontaneously adding messages to others about the importance of life and preparedness. In this presentation, we share good practices in program design and facilitation that encouraged participants' transition from learners to actors.

  3. Hawaiian cultural influences on support for lava flow hazard mitigation measures during the January 1960 eruption of Kīlauea volcano, Kapoho, Hawai‘i

    USGS Publications Warehouse

    Gregg, Chris E.; Houghton, B.F.; Paton, Douglas; Swanson, D.A.; Lachman, R.; Bonk, W.J.

    2008-01-01

    On average, 72% of respondents favored the construction of earthen barriers to hold back or divert lava and protect Kapoho, but far fewer (14%) agreed with the military's use of bombs to protect Kapoho. In contrast, about one-third of respondents conditionally agreed with the use of bombs. It is suggested that local participation in the bombing strategy may explain the increased conditional acceptance of bombs as a mitigation tool, although this cannot be conclusively demonstrated. Belief in Pele and being of Hawaiian ethnicity did not reduce support for the use of barriers, but did reduce support for bombs in both bombing scenarios. The disparity in levels of acceptance of barriers versus bombing, and of one bombing strategy versus another, suggests that public attitudes toward lava flow hazard mitigation strategies were historically complex. A modern comparative study is needed before the next damaging eruption to inform debates and decisions about whether or not to interfere with the flow of lava. Recent changes in the current eruption of Kīlauea make this a timely topic.

  4. The subsurface cross section resistivity using magnetotelluric method in Pelabuhan Ratu area, West Java, implication for geological hazard mitigation

    NASA Astrophysics Data System (ADS)

    Gaffar, Eddy Z.

    2016-02-01

    Pelabuhan Ratu is located on the south coast of West Java. Its rapid development and population growth were partly stimulated by Indonesian Government Regulation No. 66 of 1998, which made Pelabuhan Ratu the capital of the district of Sukabumi. It is therefore very important to create a geological hazard mitigation plan for the area. Pelabuhan Ratu is crossed by two major faults: the Cimandiri fault in the west and the Citarik fault in the east. The Cimandiri fault runs from the upstream reaches of the Cimandiri River to the south of the cities of Sukabumi and Cianjur, while the Citarik fault runs from the Citarik River to Mount Salak. These two faults need to be observed closely, as they are prone to causing earthquakes in the area. To help mitigate earthquakes expected to occur on the Cimandiri or Citarik fault, the Research Center for Geotechnology LIPI conducted research using the magnetotelluric (MT) method with a Phoenix MT instrument to determine the resistivity cross-section of Pelabuhan Ratu and the surrounding area. Measurements were taken at 40 points along the highway from Jampang to Pelabuhan Ratu, and towards Bandung via Cibadak, with a distance of less than 500 meters between measuring points. The measurements yield an AMT resistivity cross-section to a depth of 1500 meters below the surface. The resistivity cross-sections showed rock layers with resistivities from about 10 Ohm-m to 1000 Ohm-m. Rock with a resistivity of 10 Ohm-m was interpreted as conductive rock, either loose material or water-bearing sandstone. If an earthquake were to occur in this area, it would lead to strong ground motion and liquefaction that could destroy buildings and potentially cause casualties.
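The abstract does not spell out the relations behind MT depth sounding, so for illustration the standard Cagniard apparent-resistivity formula, ρa = |Z|² / (ωμ0), and the engineering skin-depth rule, δ ≈ 503·√(ρ/f) m, are sketched below in Python. The numerical values are illustrative only.

```python
import math

MU0 = 4e-7 * math.pi   # vacuum permeability, H/m

def apparent_resistivity(z_ohm, freq_hz):
    """Cagniard apparent resistivity (ohm-m) from the MT impedance
    magnitude |Z| = |E/H| in SI units (ohms) at frequency f."""
    return abs(z_ohm) ** 2 / (2 * math.pi * freq_hz * MU0)

def skin_depth_m(rho_ohm_m, freq_hz):
    """Rule-of-thumb EM penetration depth: delta ~ 503 * sqrt(rho / f) m."""
    return 503.0 * math.sqrt(rho_ohm_m / freq_hz)

# a 10 ohm-m conductive layer (the water-bearing sand interpreted in the
# survey) is reached around 1 Hz: delta ~ 1.6 km, comparable to the
# 1500 m section depth quoted in the abstract
print(round(skin_depth_m(10.0, 1.0)))  # → 1591
```

Lower frequencies penetrate deeper in resistive ground, which is why a broadband MT sounding maps resistivity as a function of depth.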

  5. Volcanic Ash Image Products from MODIS for Aviation Safety and Natural Hazard Mitigation

    NASA Astrophysics Data System (ADS)

    Stephens, G.; Ellrod, G. P.; Im, J.

    2003-12-01

    Multi-spectral volcanic ash image products have been developed using Moderate Resolution Imaging Spectroradiometer (MODIS) data from the NASA Terra spacecraft (Ellrod and Im 2003). Efforts are now underway to integrate these new products into the MODIS Data Retrieval System at NESDIS, for use in the operational Hazard Mapping System (HMS). The images will be used at the Washington Volcanic Ash Advisory Center (W-VAAC) in the issuance of volcanic ash advisory statements to aircraft. In addition, the images will be made available to users in the global volcano and emergency management community via the World Wide Web. During the development process, good results (a high detection rate with low false alarms) were obtained from a tri-spectral combination of MODIS infrared (IR) bands centered near 8.6, 11.0 and 12.0 μm (Bands 29, 31, and 32). Optimized Red-Green-Blue false color composite images were developed to provide information on ash cloud location, as well as cloud phase and surface characteristics, to aid interpretation both day and night. Information on volcanic ash derived from the tri-spectral product is displayed in the red color gun. During daylight this information is combined with visible (0.6 μm) and near-IR (1.6 μm) data in green and blue, respectively. At night, the 8.6 − 11.0 μm combination and the 11.0 μm band are used for the green and blue colors in the RGB product. Currently, raw MODIS data in five-minute "granules" are processed for the following regions: (1) southern Alaska; (2) Mexico, Central America and the Caribbean; and (3) the northern Andes region of South America. Image products are converted to Geographic Information System (GIS)-compatible formats for use in the HMS, and to Man-computer Interactive Data Access System (McIDAS) "Area File" format for use in currently configured W-VAAC display systems. 
The installation of a high speed, fiber optic line from NASA Goddard Space Flight Center to the World Weather Building, Camp Springs, Maryland (scheduled for completion by Fall, 2003) will allow a full set of data to be processed from both Terra and Aqua spacecraft.
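The tri-spectral technique builds on the widely used split-window "reverse absorption" signal: silicate ash tends to drive the 11 μm minus 12 μm brightness temperature difference negative, while water and ice clouds usually keep it positive. A toy sketch follows; the threshold and pixel values are invented for illustration and are not the product's operational settings.

```python
import numpy as np

def ash_flag(bt11, bt12, threshold=-0.5):
    """Split-window test: a BT(11 um) - BT(12 um) difference below a
    (negative) threshold flags likely silicate ash; water/ice clouds
    usually give positive differences. Inputs are brightness
    temperatures in kelvin."""
    return (bt11 - bt12) < threshold

# toy scene: two meteorological-cloud pixels, then two ash-contaminated ones
bt11 = np.array([265.0, 270.0, 255.0, 250.0])
bt12 = np.array([263.5, 268.0, 257.0, 253.0])
print(ash_flag(bt11, bt12))
```

An operational product would add the 8.6 μm band to help separate ash from SO2 and desert dust, which is the point of the tri-spectral combination described above.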

  6. Scientific Animations for Tsunami Hazard Mitigation: The Pacific Tsunami Warning Center's YouTube Channel

    NASA Astrophysics Data System (ADS)

    Becker, N. C.; Wang, D.; Shiro, B.; Ward, B.

    2013-12-01

    Outreach and education save lives, and the Pacific Tsunami Warning Center (PTWC) has a new tool--a YouTube Channel--to advance its mission to protect lives and property from dangerous tsunamis. Such outreach and education is critical for coastal populations nearest an earthquake since they may not get an official warning before a tsunami reaches them and will need to know what to do when they feel strong shaking. Those who live far enough away to receive useful official warnings and react to them, however, can also benefit from PTWC's education and outreach efforts. They can better understand a tsunami warning message when they receive one, can better understand the danger facing them, and can better anticipate how events will unfold while the warning is in effect. The same holds true for emergency managers, who have the authority to evacuate the public they serve, and for the news media, critical partners in disseminating tsunami hazard information. PTWC's YouTube channel supplements its formal outreach and education efforts by making its computer animations available 24/7 to anyone with an Internet connection. Though the YouTube channel is only a month old (as of August 2013), it should rapidly develop a large global audience since similar videos on PTWC's Facebook page have reached over 70,000 viewers during organized media events, while PTWC's official web page has received tens of millions of hits during damaging tsunamis. These animations are not mere cartoons but use scientific data and calculations to render graphical depictions of real-world phenomena as accurately as possible. This practice holds true whether the animation is a simple comparison of historic earthquake magnitudes or a complex simulation cycling through thousands of high-resolution data grids to render tsunami waves propagating across an entire ocean basin. PTWC's animations fall into two broad categories. 
The first group illustrates concepts about seismology and how it is critical to tsunami warning operations, such as those about earthquake magnitudes, how earthquakes are located, where and how often earthquakes occur, and fault rupture length. The second group uses the PTWC-developed tsunami forecast model, RIFT (Wang et al., 2012), to show how various historic tsunamis propagated through the world's oceans. These animations illustrate important concepts about tsunami behavior such as their speed, how they bend around and bounce off of seafloor features, how their wave heights vary from place to place and in time, and how their behavior is strongly influenced by the type of earthquake that generated them. PTWC's YouTube channel also includes an animation that simulates both seismic and tsunami phenomena together as they occurred for the 2011 Japan tsunami including actual sea-level measurements and proper timing for tsunami alert status, thus serving as a video 'time line' for that event and showing the time scales involved in tsunami warning operations. Finally, PTWC's scientists can use their YouTube channel to communicate with their colleagues in the research community by supplementing their peer-reviewed papers with video 'figures' (e.g., Wang et al., 2012).
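Several of the animations illustrate tsunami speed; the underlying long-wave relation, c = √(gh), is simple enough to sketch. The depths below are illustrative round numbers, not values from any PTWC product.

```python
import math

def tsunami_speed_kmh(depth_m):
    """Shallow-water (long-wave) phase speed c = sqrt(g * h), in km/h."""
    return math.sqrt(9.81 * depth_m) * 3.6

# deep open ocean (~4000 m) versus a continental shelf (~100 m)
print(round(tsunami_speed_kmh(4000)), round(tsunami_speed_kmh(100)))  # → 713 113
```

This depth dependence is why tsunamis cross ocean basins at jet-aircraft speeds yet slow, steepen, and grow as they shoal, which the propagation animations make visible.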

  7. Towards the Establishment of the Hawaii Integrated Seismic Network for Tsunami, Seismic, and Volcanic Hazard Mitigation

    NASA Astrophysics Data System (ADS)

    Shiro, B. R.; Koyanagi, S. K.; Okubo, P. G.; Wolfe, C. J.

    2006-12-01

    The NOAA Pacific Tsunami Warning Center (PTWC), located in `Ewa Beach, Hawai`i, provides warnings to the State of Hawai`i regarding locally generated tsunamis. The USGS Hawaiian Volcano Observatory (HVO), located in Hawai`i National Park, monitors earthquakes on the island of Hawai`i in order to characterize volcanic and earthquake activity and hazards. In support of these missions, PTWC and HVO operate seismic networks for rapidly detecting and evaluating earthquakes for their tsunamigenic potential and volcanic risk, respectively. These existing seismic networks are composed mostly of short-period vertical seismometers with analog data collection and transmission based on decades-old technology. The USGS National Strong Motion Program (NSMP) operates 31 accelerometers throughout the state, but none currently transmits its data in real time. As a result of enhancements to the U.S. Tsunami Program in the wake of the December 2004 Indian Ocean tsunami disaster, PTWC is upgrading and expanding its seismic network using digital real-time telemetry from broadband and strong motion accelerometer stations. Through new cooperative agreements with partners including the USGS (HVO and NSMP), IRIS, the University of Hawai`i, and Germany's GEOFON, the enhanced seismic network has been designed to ensure maximum benefit to all stakeholders. The Hawaii Integrated Seismic Network (HISN) will provide a statewide resource for tsunami, earthquake, and volcanic warnings. Furthermore, because all data will be archived by the IRIS Data Management Center (DMC), the HISN will become a research resource for the greater scientific community. The performance target for the enhanced HISN is for PTWC to provide initial local tsunami warnings within 90 seconds of the earthquake origin time. This will be accomplished using real-time digital data transmission over redundant paths and by implementing contemporary analysis algorithms in real time and near-real time. 
Earthquake location, depth, and magnitude determination will be improved due to the better station coverage. More advanced seismic analysis techniques such as rapid characterization of the earthquake source will also be possible with HISN broadband data. Anticipated products from upgraded strong motion monitoring include ShakeMaps and earthquake rupture models. The HISN will ultimately consist of the following three types of stations: 12 broadband stations built to ANSS (Advanced National Seismic System) standards using STS-2 broadband seismometers and strong motion accelerometers, 15 new strong motion accelerometer stations, and at least 12 NSMP stations upgraded to real time digital communications. Combined with other existing broadband, short-period, and strong motion stations throughout Hawai`i, the HISN will greatly enhance seismic monitoring capabilities throughout the region. Although most seismicity in Hawai`i occurs under the Island of Hawai`i, large earthquakes do happen further up the island chain. Therefore, stations will be located on all major islands in order to optimize coverage. PTWC is currently finalizing site selection for new sites located on the islands of Kaua`i, O`ahu, Moloka`i, Lana`i, Maui, and Hawai`i. PTWC has begun installation of new stations and expects to have the entire HISN completed by late 2007 or early 2008.

  8. Public Policy Issues Associated with Tsunami Hazard Mitigation, Response and Recovery: Transferable Lessons from Recent Global Disasters

    NASA Astrophysics Data System (ADS)

    Johnson, L.

    2014-12-01

    Since 2004, a sequence of devastating tsunamis has taken the lives of more than 300,000 people worldwide. The inland reach of each tsunami's destruction is typically measured in hundreds of meters to a few kilometers, while its breadth can extend for hundreds or even thousands of kilometers, crossing towns and countries and even traversing an entire oceanic basin. Tsunami disasters in Indonesia, Chile, Japan, and elsewhere have also shown that the almost binary nature of tsunami impacts can present some unique risk reduction, response, recovery, and rebuilding challenges, with transferable lessons for other tsunami-vulnerable coastal communities around the world. In particular, the trauma can motivate survivors to relocate homes, jobs, and even whole communities to safer ground, sometimes at tremendous social and financial cost. For governments, the level of concentrated devastation usually exceeds the local capacity to respond and thus requires complex intergovernmental arrangements with regional, national, and even international partners to support the recovery of impacted communities, infrastructure, and economies. Two parallel projects underway in California since 2011, the SAFRR (Science Application for Risk Reduction) tsunami scenario project and the California Tsunami Policy Working Group (CTPWG), have worked to digest key lessons from recent tsunami disasters, with an emphasis on identifying gaps to be addressed in the current state and federal policy framework to enhance tsunami risk awareness, hazard mitigation, and response and recovery planning ahead of a disaster, and also to improve post-disaster implementation practices following a future California or U.S. tsunami event.

  9. Conceptual Study on Air Ingress Mitigation for VHTRs

    SciTech Connect

    Chang H. Oh; Eung S. Kim

    2012-09-01

    An air-ingress accident following a pipe break is considered a critical event for very high temperature gas-cooled reactor (VHTR) safety. Following helium depressurization, it is anticipated that, unless countermeasures are taken, air will enter the core through the break, leading to oxidation of the in-core graphite structure. Thus, without mitigation features, this accident might lead to severe exothermic chemical reactions between graphite and oxygen, depending on the accident scenario and the design. Under extreme circumstances, a loss of core structural integrity may occur and lead to a situation detrimental to VHTR safety. This paper discusses various air-ingress mitigation concepts applicable to VHTRs. The study begins by identifying important factors (or phenomena) associated with the air-ingress accident using a root-cause analysis. By preventing the main causes of the important events identified in the root-cause diagram, basic air-ingress mitigation ideas were conceptually developed. Among them, two concepts were finally evaluated as effective candidates. One concept is to inject helium into the lower plenum (direct in-vessel helium injection). The other is to enclose the reactor with a non-pressure boundary having an opening at the bottom (ex-vessel enclosure boundary). Computational fluid dynamics (CFD) methods were used to validate these concepts. As a result, it was shown that both concepts can effectively mitigate the air-ingress process. In the first concept, the injected helium replaces the air in the core and the upper part of the lower plenum by buoyancy because of its low density, preventing air from moving into the reactor core and showing great potential for mitigating graphite oxidation in the core. In the second concept, the air-ingress rate is controlled by molecular diffusion through the opening at the bottom of the enclosure after depressurization. A modified reactor cavity design is expected to play this role in VHTRs.
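    The buoyancy argument behind the in-vessel helium injection concept can be illustrated with ideal-gas densities. This is a rough sketch at room conditions, not the accident conditions modeled in the paper; all values below are illustrative:

```python
# Ideal-gas density rho = P * M / (R * T): helium is far less dense than air,
# so injected helium stratifies and displaces air from the core region.
R = 8.314          # J/(mol K), universal gas constant
P = 101_325.0      # Pa, illustrative pressure (post-depressurization ~ atmospheric)
T = 293.15         # K, illustrative temperature (room conditions, not core temperature)

M_HE = 4.003e-3    # kg/mol, helium
M_AIR = 28.964e-3  # kg/mol, dry air

rho_he = P * M_HE / (R * T)
rho_air = P * M_AIR / (R * T)
density_ratio = rho_air / rho_he   # ~7.2: a strong buoyancy contrast
```

At equal pressure and temperature the ratio reduces to the molar-mass ratio, so the roughly sevenfold density contrast holds at core temperatures as well.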

  10. Probing Aircraft Flight Test Hazard Mitigation for the Alternative Fuel Effects on Contrails and Cruise Emissions (ACCESS) Research Team . Volume 2; Appendices

    NASA Technical Reports Server (NTRS)

    Kelly, Michael J.

    2013-01-01

    The Alternative Fuel Effects on Contrails and Cruise Emissions (ACCESS) Project Integration Manager requested in July 2012 that the NASA Engineering and Safety Center (NESC) form a team to independently assess aircraft structural failure hazards associated with the ACCESS experiment and to identify potential flight test hazard mitigations to ensure flight safety. The ACCESS Project Integration Manager subsequently requested that the assessment scope be focused predominantly on structural failure risks to the aircraft empennage (horizontal and vertical tail). This report contains the Appendices to Volume I.

  11. Multi-scale earthquake hazard and risk in the Chinese mainland and countermeasures for the preparedness, mitigation, and management: an overview

    NASA Astrophysics Data System (ADS)

    Wu, Z.; Jiang, C.; Ma, T.

    2012-12-01

    Earthquake hazard and risk in the Chinese mainland exhibit multi-scale characteristics. Temporal scales from centuries to months, spatial scales from the whole mainland to specific engineering structures, and energy scales from great disastrous earthquakes to small earthquakes causing social disturbance and economic loss all feature in the complexity of earthquake disasters. To cope with this complex challenge, several research and application projects have been undertaken in recent years. Lessons and experiences of the 2008 Wenchuan earthquake contributed much to the launching and conduct of these projects. The understanding of the scientific problems and the technical approaches taken in the mainstream studies in the Chinese mainland do not differ significantly from those in the international scientific communities, although some terminology is used with "cultural differences": for instance, in the China Earthquake Administration (CEA), the term "earthquake forecast/prediction (study)" is generally used in a much broader sense, mainly indicating time-dependent seismic hazard at different spatio-temporal scales. Several scientific products have been produced to serve society in different forms. These scientific products have unique academic merits due to their long-term persistence and forward-forecast nature, both essential for evaluating technical performance and falsifying scientific ideas. On the other hand, in the language of actor-network theory (ANT) in science studies (or the sociology of science), the hierarchical "actors' network" that translates the science into actions by the public and government for the preparedness, mitigation, and management of multi-scale earthquake disasters is still in need of careful construction and improvement.

  12. Natural hazards and motivation for mitigation behavior: people cannot predict the affect evoked by a severe flood.

    PubMed

    Siegrist, Michael; Gutscher, Heinz

    2008-06-01

    Past research indicates that personal flood experience is an important factor in motivating mitigation behavior. It is not fully clear, however, why such experience is so important. This study tested the hypothesis that people without flooding experience underestimate the negative affect evoked by such an event. People who were affected by a severe recent flood disaster were compared with people who were not affected, but who also lived in flood-prone areas. Face-to-face interviews with open and closed questions were conducted (n = 201). Results suggest that people without flood experience envisaged the consequences of a flood differently from people who had actually experienced severe losses due to a flood. People who were not affected strongly underestimated the negative affect associated with a flood. Based on the results, it can be concluded that risk communication must not focus solely on technical aspects; in order to trigger motivation for mitigation behavior, successful communication must also help people to envisage the negative emotional consequences of natural disasters. PMID:18643832

  13. RAGE Hydrocode Modeling of Asteroid Mitigation: new simulations with parametric studies for uncertainty quantification

    NASA Astrophysics Data System (ADS)

    Weaver, R.; Plesko, C. S.; Gisler, G. R.

    2013-12-01

    We are performing detailed hydrodynamic simulations of the interaction of a strong explosion with sample asteroid objects. The purpose of these simulations is to apply modern hydrodynamic codes that have been well verified and validated (V&V) to the problem of mitigating the hazard from a potentially hazardous object (PHO), an asteroid or comet that is on an Earth-crossing orbit. The code we use for these simulations is the RAGE code from Los Alamos National Laboratory [1-6]. Initial runs were performed using a spherical object. Next we ran simulations using the shape of a known asteroid: 25143 Itokawa. This particular asteroid is not a PHO, but we use its shape to consider the influence of non-spherical objects. The initial work was performed using 2D cylindrically symmetric simulations and simple geometries. We then performed a major fully 3D simulation. For an Itokawa-size object (~500 m) and explosion energies ranging from 0.5 to 1 megatons, the velocities imparted to all of the PHO "rocks" were in all cases many m/s. The velocities calculated were much larger than escape velocity and would preclude re-assembly of the fragments. The dispersion of the asteroid remnants from a surface burst is very directional, with all fragments moving away from the point of the explosion. This detail can be used to time the intercept for maximum movement off the original orbit. Results from these previous studies will be summarized for background. In the new work presented here we show a variety of parametric studies around these initial simulations. We modified the explosion energy by +/- 20% and varied the internal composition from a few large "rocks" to several hundred smaller rocks. The results of these parametric studies will be presented. We have also extended our work [6],[7] to stand-off nuclear bursts and will present the initial results for the energy deposition by a generic source into the non-uniform composition asteroid. 
The goal of this new work is to obtain an "optimal stand-off" distance from detailed radiation-transport hydrodynamic simulations of generic explosion properties. The initial results of these two studies will also be presented. References: [1] Gittings, Weaver et al., 'The RAGE radiation-hydrodynamics code,' Comp. Sci. Disc. 1 (2008) 015005, November 21, 2008. [2] Huebner, W.F. et al., 'The Engagement Space for Countermeasures Against Potentially Hazardous Objects (PHOs),' International Conference on Asteroid and Comet Hazards, Russian Academy of Sciences, St. Petersburg, 21-25 September 2009. [3] Gisler, Weaver, Mader, & Gittings, 'Two- and three-dimensional asteroid impact simulations,' Computing in Science & Engineering, 6, 38 (2004). [4] NASA geometry courtesy of S.J. Ostro et al. (2002) in Asteroids III. [5] Itokawa image courtesy of JAXA: http://www.isas.jaxa.jp/e/snews/2005/1102.shtml#pic001 [6] Plesko, C. et al., 'Looking Before We Leap: Recent Results from an Ongoing, Quantitative Investigation of Asteroid and Comet Impact Hazard Mitigation,' Division for Planetary Sciences, 2010. [7] Plesko, C. et al., 'Numerical Models of Asteroid and Comet Impact Hazard Mitigation by Nuclear Stand-Off Burst,' Planetary Defense Conference, 2011.
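    The claim that fragment speeds of many m/s far exceed escape velocity is easy to check with a back-of-the-envelope estimate. Only the ~500 m size comes from the abstract; the bulk density below is an assumed, illustrative value:

```python
import math

G = 6.674e-11            # m^3 kg^-1 s^-2, gravitational constant
radius = 250.0           # m, half of the ~500 m Itokawa-size object (from the abstract)
density = 2000.0         # kg/m^3, assumed bulk density (illustrative rubble-pile value)

# Treat the object as a uniform sphere and evaluate surface escape velocity.
mass = (4.0 / 3.0) * math.pi * radius**3 * density
v_esc = math.sqrt(2.0 * G * mass / radius)

# v_esc comes out well under 1 m/s, so fragments moving at "many m/s"
# cannot gravitationally re-assemble.
```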

  14. A design study on complexity reduced multipath mitigation

    NASA Astrophysics Data System (ADS)

    Wasenmüller, U.; Brack, T.; Groh, I.; Staudinger, E.; Sand, S.; Wehn, N.

    2012-09-01

    Global navigation satellite systems, e.g. the current GPS and the future European Galileo system, are frequently used in car navigation systems or smart phones to determine the position of a user. The calculation of the mobile position is based on the signal propagation times between the satellites and the mobile terminal. At least four time-of-arrival (TOA) measurements from four different satellites are required to resolve the position uniquely. Further, the satellites need to be in line-of-sight to the receiver for exact position calculation. In an urban area, however, the direct path may be blocked, and the resulting multipath propagation causes errors on the order of tens of meters for each measurement; in the case of non-line-of-sight (NLOS) reception, positive errors on the order of hundreds of meters result. In this paper an advanced algorithm for multipath mitigation, known as CRMM, is presented. CRMM features reduced algorithmic complexity and superior performance in comparison with other state-of-the-art multipath mitigation algorithms. Simulation results demonstrate the significant improvements in position calculation in environments with severe multipath propagation. Nevertheless, relative to traditional algorithms, an increased effort is required for real-time signal processing due to the large amount of data that has to be processed in parallel. Based on CRMM, we performed a comprehensive design study including a design-space exploration for the tracking-unit hardware part and a prototype implementation for hardware complexity estimation.
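    The TOA positioning problem described above can be sketched as a nonlinear least-squares fit. The 2-D Gauss-Newton toy below is not the CRMM algorithm; the anchor layout, true position, and NLOS bias are all invented for illustration. It also reproduces the positive position error that a single NLOS-biased range induces:

```python
import math

def toa_solve(anchors, ranges, x0, iters=20):
    """Gauss-Newton least-squares fit of a 2-D position to range measurements."""
    x, y = x0
    for _ in range(iters):
        # Normal equations J^T J d = -J^T r for residuals r_i = |p - a_i| - rho_i.
        a11 = a12 = a22 = b1 = b2 = 0.0
        for (ax, ay), rho in zip(anchors, ranges):
            d = math.hypot(x - ax, y - ay)
            ux, uy = (x - ax) / d, (y - ay) / d   # Jacobian row: unit vector from anchor
            r = d - rho
            a11 += ux * ux; a12 += ux * uy; a22 += uy * uy
            b1 -= ux * r;  b2 -= uy * r
        det = a11 * a22 - a12 * a12
        dx = (b1 * a22 - b2 * a12) / det
        dy = (a11 * b2 - a12 * b1) / det
        x, y = x + dx, y + dy
    return x, y

anchors = [(0.0, 0.0), (100.0, 0.0), (0.0, 100.0), (100.0, 100.0)]
truth = (30.0, 40.0)
ranges = [math.hypot(truth[0] - ax, truth[1] - ay) for ax, ay in anchors]
est = toa_solve(anchors, ranges, (50.0, 50.0))        # recovers the true position

# A positive NLOS bias on one range (here +30 m) pulls the fix tens of meters
# away from the truth, matching the error behavior the abstract describes.
biased = list(ranges); biased[0] += 30.0
est_nlos = toa_solve(anchors, biased, (50.0, 50.0))
```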

  15. United States studies in orbital debris - Prevention and mitigation

    NASA Technical Reports Server (NTRS)

    Loftus, Joseph P., Jr.; Potter, Andrew E.

    1990-01-01

    Debris in space has become an issue that has commanded considerable interest in recent years as society has become both more dependent upon space-based systems and more aware of its dependence. After many years of study, the United States Space Policy of February 1988 directed that all sectors of the U.S. community minimize space debris. Other space organizations have adopted similar policies. Among the study activities leading to the policy and to subsequent implementing directives were discussions with ESA, NASDA, and other space-operating agencies. The policy derived from technical consensus on the nature of the issues and on the courses of action available to mitigate the problem, but there remains concern as to the adequacy of the data for defining cost-effective strategies. Mechanisms are now in place to continue technical discussions in more formal terms.

  16. Natural Hazard Mitigation thru Water Augmentation Strategies to Provide Additional Snow Pack for Water Supply and Hydropower Generation in Drought Stressed Alps/Mountains

    NASA Astrophysics Data System (ADS)

    Matthews, D.; Brilly, M.

    2009-12-01

    Climate variability and change are clearly stressing water supplies in high alpine regions of the Earth. These long-term natural hazards present critical challenges to policy makers and water managers. This paper addresses strategies that use enhanced scientific methods to mitigate the problem. Recent rapid depletion of glaciers and intense droughts throughout the world have created a need to reexamine modern water augmentation technologies for enhancing snow pack in mountainous regions. Today's reliance on clean, efficient hydroelectric power in the Alps and the Rocky Mountains poses a critical need for sustainable snow packs and high-elevation water supplies throughout the year; hence the need to make natural cloud systems more efficient precipitators during the cold season through anthropogenic weather modification techniques. The Bureau of Reclamation, US Department of the Interior, spent over $39M in research from 1963 to 1990 to develop the scientific basis for snow pack augmentation in the headwaters of the Colorado, American, and Columbia River Basins in the western United States, and through USAID in the High Atlas Mountains of Morocco. This paper presents a brief summary of the research findings and shows that, even during drought conditions, potential exists for significant, cost-effective enhancement of water supplies. Examples of ground-based propane and AgI seeding generators, and cloud physics studies of supercooled cloud droplets and ice crystal characteristics that indicate seeding potential, will be shown. Hypothetical analyses of seeding potential in 17 western states from Montana to California will be presented, based on observed SNOTEL snow water equivalent measurements and distributed by elevation and observed winter precipitation. Early studies indicated that increases in snow pack of 5 to 20% were possible if winter storm systems were seeded effectively. 
If this potential had been realized in the drought conditions observed in 2003, over 1.08 million acre-feet (1.33 × 10^9 m^3) of additional water could have been captured by seeding efficiently and effectively in just 10 storms. Recent projects sponsored by the National Science Foundation, NOAA, and the States of Wyoming, Utah and Nevada, and conducted by the National Center for Atmospheric Research, will be discussed briefly. Examples of conditions in extreme droughts of the western United States will be presented that show potential to mitigate droughts in these regions through cloud seeding. Implications for American and European hydropower generation and sustainable water supplies will be discussed.
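    The volume figure quoted above is an acre-foot conversion, which can be verified directly; only standard conversion factors are used, and the 1.08 million acre-feet value is taken from the abstract:

```python
ACRE_M2 = 4046.8564224             # m^2 per international acre
FOOT_M = 0.3048                    # m per international foot
ACRE_FOOT_M3 = ACRE_M2 * FOOT_M    # ~1233.48 m^3 per acre-foot

# 1.08 million acre-feet, as quoted in the abstract:
volume_m3 = 1.08e6 * ACRE_FOOT_M3
# Agrees with the ~1.33 x 10^9 m^3 figure to within a fraction of a percent.
```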

  17. Airflow Hazard Visualization for Helicopter Pilots: Flight Simulation Study Results

    NASA Technical Reports Server (NTRS)

    Aragon, Cecilia R.; Long, Kurtis R.

    2005-01-01

    Airflow hazards such as vortices or low level wind shear have been identified as a primary contributing factor in many helicopter accidents. US Navy ships generate airwakes over their decks, creating potentially hazardous conditions for shipboard rotorcraft launch and recovery. Recent sensor developments may enable the delivery of airwake data to the cockpit, where visualizing the hazard data may improve safety and possibly extend ship/helicopter operational envelopes. A prototype flight-deck airflow hazard visualization system was implemented on a high-fidelity rotorcraft flight dynamics simulator. Experienced helicopter pilots, including pilots from all five branches of the military, participated in a usability study of the system. Data was collected both objectively from the simulator and subjectively from post-test questionnaires. Results of the data analysis are presented, demonstrating a reduction in crash rate and other trends that illustrate the potential of airflow hazard visualization to improve flight safety.

  18. AMENDING SOILS WITH PHOSPHATE AS MEANS TO MITIGATE SOIL LEAD HAZARD: A CRITICAL REVIEW OF THE STATE OF THE SCIENCE

    EPA Science Inventory

    Ingested soil and surface dust may be important contributors to elevated blood lead (Pb) levels in children exposed to Pb contaminated environments. Mitigation strategies have typically focused on excavation and removal of the contaminated soil. However, this is not always feas...

  19. Peru mitigation assessment of greenhouse gases: Sector -- Energy. Peru climate change country study; Final report

    SciTech Connect

    1996-08-01

    The aim of this study is to determine the inventory of greenhouse gases and propose mitigation alternatives, in order to address the future development of the country in a clean environmental setting without delaying the development process required to improve the Peruvian standard of living. This executive abstract concisely presents the results of the greenhouse gas mitigation analysis for Peru for the period 1990-2015. The mitigation studies for the energy sector are summarized here.

  20. Experimental Studies of Mitigation Materials for Blast Induced TBI

    NASA Astrophysics Data System (ADS)

    Alley, Matthew; Son, Steven

    2009-06-01

    The objective of this experimental study is to compare the effects of various materials obstructing the flow of a blast wave and the ability of the given material to reduce the damage caused by the blast. Several methods of energy transfer in blast wave flows are known or expected including: material interfaces with impedance mismatches, density changes in a given material, internal shearing, and particle fracture. The theory applied to this research is that the greatest energy transfer within the obstructing material will yield the greatest mitigation effects to the blast. Sample configurations of foam were varied to introduce material interfaces and filler materials with varying densities and impedances (liquids and powders). The samples were loaded according to a small scale blast produced by an explosive driven shock tube housing gram-range charges. The transmitted blast profiles were analyzed for variations in impulse characteristics and frequency components as compared to standard free field profiles. The results showed a rounding effect of the transmitted blast profile for all samples with the effects of the low density fillers surpassing all others tested.

  1. Experimental Studies of Mitigation Materials for Blast Induced TBI

    NASA Astrophysics Data System (ADS)

    Alley, M. D.; Son, S. F.; Christou, G.; Goel, R.; Young, L.

    2009-12-01

    The objective of this experimental study is to compare the effects of various materials obstructing the flow of a blast wave and the ability of the material to reduce the damage caused by the blast. Several methods of energy transfer in blast wave flows are expected including: material interfaces with impedance mismatches, density changes in a given material, internal shearing, and particle fracture. Our hypothesis is that the greatest energy transfer within the obstructing material will yield the greatest mitigation effects to the blast. Sample configurations of foam were varied to introduce material interfaces and filler materials with varying densities and impedances (liquids and powders). The samples were dynamically loaded using a small scale blast produced by an explosive driven shock tube housing gram-scale explosive charges. The transmitted blast profiles were analyzed for variations in impulse characteristics and frequency components as compared to standard free field profiles. The results showed a rounding effect of the transmitted blast profile for all samples with the effects of the high density fillers surpassing all others tested. These results lead to a conclusion that low porosity, high density materials offer superior attenuation by reducing air blast features and spatially distributing the transmitted wave.
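    The impedance-mismatch mechanism invoked in both blast-mitigation abstracts can be quantified with the normal-incidence acoustic energy reflection coefficient, R_E = ((Z2 - Z1)/(Z2 + Z1))^2. The impedance values below are textbook order-of-magnitude figures, not measurements from the study, and the foam value in particular is an illustrative assumption:

```python
def energy_reflection(z1, z2):
    """Fraction of incident acoustic energy reflected at a planar interface."""
    return ((z2 - z1) / (z2 + z1)) ** 2

# Characteristic acoustic impedances Z = rho * c (Pa*s/m), approximate values:
Z_AIR = 1.2 * 343.0          # ~4.1e2, air at room conditions
Z_WATER = 1000.0 * 1480.0    # ~1.5e6, water (liquid filler)
Z_FOAM = 30.0 * 500.0        # ~1.5e4, low-density polymer foam (illustrative)

# A large mismatch (air -> water) reflects nearly all incident energy;
# repeated mismatched interfaces are one route to attenuating a transmitted blast.
r_air_water = energy_reflection(Z_AIR, Z_WATER)
r_air_foam = energy_reflection(Z_AIR, Z_FOAM)
```

This simple acoustic picture ignores nonlinearity and material failure in a real blast, but it shows why interface count and impedance contrast are natural design variables for the sample configurations tested.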

  2. From structural investigation towards multi-parameter early warning systems: geophysical contributions to hazard mitigation at the landslide of Gschliefgraben (Gmunden, Upper Austria)

    NASA Astrophysics Data System (ADS)

    Supper, Robert; Baron, Ivo; Jochum, Birgit; Ita, Anna; Winkler, Edmund; Motschka, Klaus; Moser, Günter

    2010-05-01

    In December 2007 the large landslide system inside the Gschliefgraben valley (located at the eastern edge of Lake Traun, Upper Austria), known over centuries for its repeated activity, was reactivated. Although a hazard zone map had already been drawn up in 1974, imposing a complete prohibition on building, some hundreds of people live on the alluvial fan close to the lake. Consequently, as part of the first emergency measures, 55 buildings had to be evacuated. Within the first phase of mitigation, measures focused on property and infrastructure protection. Around 220 wells and one deep channel were implemented to drain the sliding mass. Additionally, a large quantity of sliding material was removed close to the inhabited areas. Differential GPS and water level measurements were performed to evaluate the effectiveness of the measures, which led to a significant slowdown of the movement. Soon after the suspension of the evacuation, several investigations, including drilling, borehole logging and complex geophysical measurements, were performed to investigate the structure of the landslide area in order to evaluate maximum hazard scenarios as a basis for planning further measures. Based on these results, measuring techniques for an adapted future early warning system are currently being tested. This emergency system should enable local stakeholders to take appropriate and timely measures in case of a future event, thus lessening the impact of a future disaster significantly. Within this three-step plan, the application of geophysical methodologies was an integral part of the research and contributed considerably to its success. Several innovative approaches were implemented, which will be described in more detail in the talk. Airborne multi-sensor geophysical surveying is one of the new and progressive approaches that can contribute remarkably to effectively analysing the triggering processes of large landslides and to better predicting their hazard. 
It was tested in the Gschliefgraben earthflow and landslide complex in September 2009. The principal parameters studied included vegetation thickness, soil moisture, potassium and thorium content (gamma ray), and four-layer resistivity. These parameters were compared with the landslide inventory map of Gschliefgraben developed from differential airborne laser scan terrain models. Mass wasting is usually triggered by rising pore-water pressure due to heavy rainfall or seismic tremors, often promoted by changes in the shape, structure, hydrology, or vegetation cover of a slope. As the electrical resistivity of the subsurface mainly depends on porosity, saturation, pore fluid conductivity and clay content, the geoelectric method is a reliable way to investigate the structure of the landslide and its surroundings, and could be an emerging tool for observing those triggering factors. Therefore, a multi-electrode geoelectrical survey was first performed in a broader area of the active earthflow to verify the subsurface structure and to optimise the location for a monitoring system, followed by the installation of the geoelectric monitoring system Geomon4D in September 2009. The monitoring profiles were complemented by an automatic DMS inclinometer to correlate measured resistivity values with displacement rates. Since the installation, the system has worked continuously, and data are processed on a daily basis at the monitoring centre in Vienna. These works were supported by the 7th FP project "SafeLand - Living with landslide risk in Europe".
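    The stated dependence of resistivity on porosity, saturation, and pore-fluid conductivity is commonly modeled with Archie's law, rho_t = a * rho_w * phi^(-m) * S_w^(-n). The parameter values below are generic illustrative choices, not values from the Gschliefgraben survey, and Archie's law neglects clay conduction, which does matter in landslide material:

```python
def archie_resistivity(rho_w, phi, s_w, a=1.0, m=2.0, n=2.0):
    """Bulk resistivity of a partially saturated porous medium (Archie's law)."""
    return a * rho_w * phi**(-m) * s_w**(-n)

RHO_W = 20.0   # ohm*m, pore-water resistivity (illustrative)
PHI = 0.3      # porosity (illustrative)

# Rising saturation (e.g. after heavy rainfall) lowers bulk resistivity:
# exactly the kind of change a time-lapse geoelectric monitoring system detects.
saturations = [0.4, 0.6, 0.8, 1.0]
profile = [archie_resistivity(RHO_W, PHI, s) for s in saturations]
```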

  3. Integrated Data Products to Forecast, Mitigate, and Educate for Natural Hazard Events Based on Recent and Historical Observations

    NASA Astrophysics Data System (ADS)

    McCullough, H. L.; Dunbar, P. K.; Varner, J. D.

    2011-12-01

    Immediately following a damaging or fatal natural hazard event, there is demand for access to authoritative data and information. The National Geophysical Data Center (NGDC) maintains and archives a comprehensive collection of natural hazards data. The NGDC global historic event database includes all tsunami events, regardless of intensity, as well as earthquakes and volcanic eruptions that caused fatalities, moderate damage, or generated a tsunami. Examining the past record provides clues to what might happen in the future. NGDC also archives tide gauge data from stations operated by the NOAA/NOS Center for Operational Oceanographic Products and Services and the NOAA Tsunami Warning Centers. In addition to the tide gauge data, NGDC preserves deep-ocean water-level data, sampled at 15-second intervals, as collected by the Deep-ocean Assessment and Reporting of Tsunami (DART) buoys. Water-level data provide evidence of sea-level fluctuation and possible inundation events. NGDC houses an extensive collection of geologic hazards photographs available online as digital images. Visual media provide invaluable pre- and post-event data for natural hazards. Images can be used to illustrate inundation and possible damage or effects. These images are organized by event or hazard type (earthquake, volcano, tsunami, landslide, etc.), along with description and location. They may be viewed via interactive online maps and are integrated with historic event details. The planning required to achieve collection and dissemination of hazard event data is extensive. After a damaging or fatal event, NGDC begins to collect and integrate data and information from many people and organizations into the hazards databases. Sources of data include the U.S. NOAA Tsunami Warning Centers, the U.S. Geological Survey, the U.S. NOAA National Data Buoy Center, the UNESCO Intergovernmental Oceanographic Commission (IOC), the Smithsonian Institution's Global Volcanism Program, news organizations, etc. 
NGDC then works to promptly distribute data and information for the appropriate audience. For example, when a major tsunami occurs, all of the related tsunami data are combined into one timely resource. NGDC posts a publicly accessible online report which includes: 1) event summary; 2) eyewitness and instrumental recordings from preliminary field surveys; 3) regional historical observations including similar past events and effects; 4) observed water heights and calculated tsunami travel times; and 5) near-field effects. This report is regularly updated to incorporate the most recent news and observations. Providing timely access to authoritative data and information ultimately benefits researchers, state officials, the media and the public.

  4. Experimental study designs to improve the evaluation of road mitigation measures for wildlife.

    PubMed

    Rytwinski, Trina; van der Ree, Rodney; Cunnington, Glenn M; Fahrig, Lenore; Findlay, C Scott; Houlahan, Jeff; Jaeger, Jochen A G; Soanes, Kylie; van der Grift, Edgar A

    2015-05-01

    An experimental approach to road mitigation that maximizes inferential power is essential to ensure that mitigation is both ecologically-effective and cost-effective. Here, we set out the need for and standards of using an experimental approach to road mitigation, in order to improve knowledge of the influence of mitigation measures on wildlife populations. We point out two key areas that need to be considered when conducting mitigation experiments. First, researchers need to get involved at the earliest stage of the road or mitigation project to ensure the necessary planning and funds are available for conducting a high quality experiment. Second, experimentation will generate new knowledge about the parameters that influence mitigation effectiveness, which ultimately allows better prediction for future road mitigation projects. We identify seven key questions about mitigation structures (i.e., wildlife crossing structures and fencing) that remain largely or entirely unanswered at the population-level: (1) Does a given crossing structure work? What type and size of crossing structures should we use? (2) How many crossing structures should we build? (3) Is it more effective to install a small number of large-sized crossing structures or a large number of small-sized crossing structures? (4) How much barrier fencing is needed for a given length of road? (5) Do we need funnel fencing to lead animals to crossing structures, and how long does such fencing have to be? (6) How should we manage/manipulate the environment in the area around the crossing structures and fencing? (7) Where should we place crossing structures and barrier fencing? 
We provide experimental approaches to answering each of them using example Before-After-Control-Impact (BACI) study designs for two stages in the road/mitigation project where researchers may become involved: (1) at the beginning of a road/mitigation project, and (2) after the mitigation has been constructed; highlighting real case studies when available. PMID:25704749
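    The BACI designs recommended above estimate a mitigation effect as a difference-in-differences between impact and control sites. A minimal sketch with invented survey counts (not data from the paper):

```python
def mean(xs):
    return sum(xs) / len(xs)

# Hypothetical road-kill counts per survey at an impact site (road section given
# a crossing structure plus fencing) and a control site (untreated section).
impact_before = [12, 15, 11, 14]
impact_after = [5, 6, 4, 5]
control_before = [13, 12, 14, 13]
control_after = [12, 13, 11, 12]

# BACI effect: change at the impact site net of the change at the control site.
baci_effect = (mean(impact_after) - mean(impact_before)) - \
              (mean(control_after) - mean(control_before))
# A negative value means mortality dropped at the mitigated site beyond the
# background trend captured by the control.
```

In practice this point estimate would be embedded in a statistical model with replication and uncertainty, which is exactly why the authors stress early researcher involvement and adequate funding.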

  5. Mitigation of hazards from future lahars from Mount Merapi in the Krasak River channel near Yogyakarta, central Java

    USGS Publications Warehouse

    Ege, John R.; Sutikno

    1983-01-01

    Procedures for reducing hazards from future lahars and debris flows in the Krasak River channel near Yogyakarta, Central Java, Indonesia, include (1) determining the history of the location, size, and effects of previous lahars and debris flows, and (2) decreasing flow velocities. The first may be accomplished by geologic field mapping along with information acquired by interviewing local residents, and the second by increasing the cross-sectional area of the river channel and constructing barriers in the flow path.
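The second mitigation principle follows from continuity: for a given discharge Q, the mean flow velocity is v = Q/A, so enlarging the channel cross-section lowers the velocity. A toy illustration (all numbers are hypothetical, not from the study):

```python
# Continuity-equation sketch: v = Q / A. Enlarging the channel's
# cross-sectional area A lowers the mean velocity for the same
# discharge Q. Values below are illustrative only.
def flow_velocity(discharge_m3s, area_m2):
    """Mean velocity (m/s) from continuity: v = Q / A."""
    return discharge_m3s / area_m2

Q = 500.0                                     # hypothetical lahar discharge, m^3/s
v_before = flow_velocity(Q, area_m2=100.0)    # original channel
v_after = flow_velocity(Q, area_m2=250.0)     # widened/deepened channel
print(v_before, v_after)  # 5.0 2.0
```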

  6. Odor mitigation with vegetative buffers: Swine production case study

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Vegetative environmental buffers (VEB) are a potentially low cost sustainable odor mitigation strategy, but there is little to no data supporting their effectiveness. Wind tunnel experiments and field monitoring were used to determine the effect VEB had on wind flow patterns within a swine facility....

  7. 3D modelling of Mt. Talaga Bodas Crater (Indonesia) by using terrestrial laser scanner for volcano hazard mitigation

    NASA Astrophysics Data System (ADS)

    Gumilar, Irwan; Abidin, Hasanuddin Z.; Putra, Andreas D.; Haerani, Nia

    2015-04-01

    Indonesia is a country with many volcanoes, and each volcano typically has its own crater characteristics. One of them is Mt. Talaga Bodas, located in Garut, West Java. Research on crater characteristics is needed for the volcanic disaster mitigation process, and one element of such research is modelling the shape of the crater. One method that can be used to model a volcanic crater is the Terrestrial Laser Scanner (TLS). This research aims to create a three-dimensional (3D) model of the crater of Mt. Talaga Bodas that can be utilized for volcanic disaster mitigation. The methodology combines scanning data obtained with TLS and GPS measurements that provide the coordinates of the reference points. The data processing consists of several steps, namely target-to-target registration, filtering, georeferencing, point-cloud meshing, surface creation, drawing, and 3D modelling. These steps were carried out using the Cyclone 7 software, with 3DS MAX used for the 3D modelling. The result is a 3D model of the crater of Mt. Talaga Bodas that closely matches the real shape. The calculations show that the height of the crater is 62.522 m, the diameter is 467.231 m, and the total area is 2961054.652 m2. The main obstacle in this research was the dense vegetation, which introduces noise and affects the crater model.

  8. The Value of Linking Mitigation and Adaptation: A Case Study of Bangladesh

    NASA Astrophysics Data System (ADS)

    Ayers, Jessica M.; Huq, Saleemul

    2009-05-01

    There are two principal strategies for managing climate change risks: mitigation and adaptation. Until recently, mitigation and adaptation have been considered separately in both climate change science and policy. Mitigation has been treated as an issue for developed countries, which hold the greatest responsibility for climate change, while adaptation is seen as a priority for the South, where mitigative capacity is low and vulnerability is high. This conceptual divide has hindered progress against the achievement of the fundamental sustainable development challenges of climate change. Recent attention to exploring the synergies between mitigation and adaptation suggests that an integrated approach could go some way to bridging the gap between the development and adaptation priorities of the South and the need to achieve global engagement in mitigation. These issues are explored through a case study analysis of climate change policy and practice in Bangladesh. Using the example of waste-to-compost projects, a mitigation-adaptation-development nexus is demonstrated, as projects contribute to mitigation through reducing methane emissions; adaptation through soil improvement in drought-prone areas; and sustainable development, because poverty is exacerbated when climate change reduces the flows of ecosystem services. Further, linking adaptation to mitigation makes mitigation action more relevant to policymakers in Bangladesh, increasing engagement in the international climate change agenda in preparation for a post-Kyoto global strategy. This case study strengthens the argument that while combining mitigation and adaptation is not a magic bullet for climate policy, synergies, particularly at the project level, can contribute to the sustainable development goals of climate change and are worth exploring.

  9. The value of linking mitigation and adaptation: a case study of Bangladesh.

    PubMed

    Ayers, Jessica M; Huq, Saleemul

    2009-05-01

    There are two principal strategies for managing climate change risks: mitigation and adaptation. Until recently, mitigation and adaptation have been considered separately in both climate change science and policy. Mitigation has been treated as an issue for developed countries, which hold the greatest responsibility for climate change, while adaptation is seen as a priority for the South, where mitigative capacity is low and vulnerability is high. This conceptual divide has hindered progress against the achievement of the fundamental sustainable development challenges of climate change. Recent attention to exploring the synergies between mitigation and adaptation suggests that an integrated approach could go some way to bridging the gap between the development and adaptation priorities of the South and the need to achieve global engagement in mitigation. These issues are explored through a case study analysis of climate change policy and practice in Bangladesh. Using the example of waste-to-compost projects, a mitigation-adaptation-development nexus is demonstrated, as projects contribute to mitigation through reducing methane emissions; adaptation through soil improvement in drought-prone areas; and sustainable development, because poverty is exacerbated when climate change reduces the flows of ecosystem services. Further, linking adaptation to mitigation makes mitigation action more relevant to policymakers in Bangladesh, increasing engagement in the international climate change agenda in preparation for a post-Kyoto global strategy. This case study strengthens the argument that while combining mitigation and adaptation is not a magic bullet for climate policy, synergies, particularly at the project level, can contribute to the sustainable development goals of climate change and are worth exploring. PMID:18956222

  10. Prevalence and predictors of residential health hazards: a pilot study.

    PubMed

    Klitzman, Susan; Caravanos, Jack; Deitcher, Deborah; Rothenberg, Laura; Belanoff, Candice; Kramer, Rachel; Cohen, Louise

    2005-06-01

    This article reports the results of a pilot study designed to ascertain the prevalence of lead-based paint (LBP), vermin, mold, and safety conditions and hazards and to validate observations and self-reports against environmental sampling data. Data are based on a convenience sample of 70 dwellings in a low-income, urban neighborhood in Brooklyn, New York. The vast majority of residences (96%) contained multiple conditions and/or hazards: LBP hazards (80%), vermin (79%), elevated levels of airborne mold (39%), and safety hazards (100%). Observations and occupant reports were associated with environmental sampling data. In general, the more proximate an observed condition was to an actual hazard, the more likely it was to be associated with environmental sampling results (e.g., peeling LBP was associated with windowsill dust lead levels, and cockroach sightings by tenants were associated with Blatella germanica [Bla g 1] levels). Conversely, the more distal an observed condition was to an actual hazard, the less likely it was to be associated with environmental sampling results (e.g., water damage, alone, was not statistically associated with elevated levels of dust lead, Bla g 1, or airborne mold). Based on the findings from this pilot study, there is a need for industrial hygienists and others to adopt more comprehensive and integrative approaches to residential hazard assessment and remediation. Further research--using larger, randomly drawn samples, representing a range of housing types and geographical areas--is needed to clarify the relationship between readily observable conditions, occupant reports, and environmental sampling data and to assess the cumulative impact on human health. PMID:16020089

  11. Using fine-scale fuel measurements to assess wildland fuels, potential fire behavior and hazard mitigation treatments in the southeastern USA.

    SciTech Connect

    Ottmar, Roger D.; Blake, John I.; Crolly, William T.

    2012-01-01

    The inherent spatial and temporal heterogeneity of fuelbeds in forests of the southeastern United States may require fine-scale fuel measurements to provide reliable estimates of fire hazard and fuel treatment effectiveness. In a series of five papers, an intensive, fine-scale fuel inventory from the Savannah River Site in the southeastern United States is used for building fuelbeds and mapping fire behavior potential, evaluating fuel treatment options for effectiveness, and providing a comparative analysis of landscape-modeled fire behavior using three different data sources: the Fuel Characteristic Classification System, LANDFIRE, and the Southern Wildfire Risk Assessment. The research demonstrates that fine-scale fuel measurements associated with fuel inventories repeated over time can be used to assess broad-scale wildland fire potential and hazard mitigation treatment effectiveness in the southeastern USA and similar fire-prone regions. Additional investigations will be needed to modify and improve these processes and capture the true potential of these fine-scale data sets for fire and fuel management planning.

  12. Leak detection, monitoring, and mitigation technology trade study update

    SciTech Connect

    HERTZEL, J.S.

    1998-11-10

    This document is a revision and update to the initial report that describes various leak detection, monitoring, and mitigation (LDMM) technologies that can be used to support the retrieval of waste from the single-shell tanks (SST) at the Hanford Site. This revision focuses on the improvements in the technical performance of previously identified and useful technologies, and it introduces new technologies that might prove to be useful.

  13. Remote sensing applied in natural hazards mitigation - experiences from the international UNESCO/IUGS GARS-Program 1984 - 2002

    NASA Astrophysics Data System (ADS)

    Bannert, D.

    Worldwide, resources of arable land, water, groundwater, forest, and expanding human habitat are under increasing pressure almost everywhere. Especially the non-industrialised countries, with their rapidly increasing populations, are facing severe problems from natural catastrophes such as landslides, volcanic and seismic hazards, soil degradation, and shortage of water or flooding. Geo-environmental research can help to identify the causes of these events, define rehabilitation measures, and lead to early warning systems. Remote sensing adds considerable knowledge by providing a wide variety of sensors applied from airborne and space platforms, the data of which, once analysed, can provide completely new observations of natural risk areas. Since 1984 the UNESCO/IUGS-sponsored GARS-Program has conducted joint research with institutions in industrialised and developing countries. As of today, more than 40 institutes and individuals worldwide have joined the GARS-Program. Results of their research include, among others, contributions to landslide assessment, volcanic risk, coastal hazards, and desertification processes. Space organisations and financing institutions serving developing nations are requested to help deploy new sensors to monitor geo-dynamic processes, providing free and direct data reception in all parts of the world in order to allow national institutes to develop their own early warning capabilities.

  14. Amending soils with phosphate as means to mitigate soil lead hazard: a critical review of the state of the science.

    PubMed

    Scheckel, Kirk G; Diamond, Gary L; Burgess, Michele F; Klotzbach, Julie M; Maddaloni, Mark; Miller, Bradley W; Partridge, Charles R; Serda, Sophia M

    2013-01-01

    Ingested soil and surface dust may be important contributors to elevated blood lead (Pb) levels in children exposed to Pb contaminated environments. Mitigation strategies have typically focused on excavation and removal of the contaminated soil. However, this is not always feasible for addressing widely disseminated contamination in populated areas often encountered in urban environments. The rationale for amending soils with phosphate is that phosphate will promote formation of highly insoluble Pb species (e.g., pyromorphite minerals) in soil, which will remain insoluble after ingestion and, therefore, inaccessible to absorption mechanisms in the gastrointestinal tract (GIT). Amending soil with phosphate might potentially be used in combination with other methods that reduce contact with or migration of contaminated soils, such as covering the soil with a green cap such as sod, clean soil with mulch, raised garden beds, or gravel. These remediation strategies may be less expensive and far less disruptive than excavation and removal of soil. This review evaluates evidence for efficacy of phosphate amendments for decreasing soil Pb bioavailability. Evidence is reviewed for (1) physical and chemical interactions of Pb and phosphate that would be expected to influence bioavailability, (2) effects of phosphate amendments on soil Pb bioaccessibility (i.e., predicted solubility of Pb in the GIT), and (3) results of bioavailability bioassays of amended soils conducted in humans and animal models. Practical implementation issues, such as criteria and methods for evaluating efficacy, and potential effects of phosphate on mobility and bioavailability of co-contaminants in soil are also discussed. PMID:24151967

  15. Hazards analysis and prediction from remote sensing and GIS using spatial data mining and knowledge discovery: a case study for landslide hazard zonation

    NASA Astrophysics Data System (ADS)

    Hsu, Pai-Hui; Su, Wen-Ray; Chang, Chy-Chang

    2011-11-01

    Due to its particular geographical location and geological conditions, Taiwan suffers from many natural hazards, which often cause serious property damage and loss of life. To reduce damage and casualties, an effective real-time system for hazard prediction and mitigation is necessary. In this study, a case study of Landslide Hazard Zonation (LHZ) is carried out using Spatial Data Mining and Knowledge Discovery (SDMKD) from databases. Many different kinds of geospatial data, such as terrain elevation, land cover type, distance to roads and rivers, geology maps, NDVI, and monitored rainfall data, are collected into the database for SDMKD. To guarantee data quality, spatial data cleaning is essential to remove the noise, errors, outliers, and inconsistencies hidden in the input spatial data sets. In this paper, Kriging interpolation is used to calibrate the QPESUMS rainfall data against the rainfall observations from rain gauge stations to remove data inconsistency. After the data cleaning, an artificial neural network (ANN) is applied to generate the LHZ map throughout the test area. The experimental results show that the accuracy of the LHZ is about 92.3% with the ANN analysis, and that landslides induced by heavy rainfall can be mapped efficiently from remotely sensed images and geospatial data using SDMKD technologies.
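The study calibrates gridded QPESUMS radar-rainfall estimates against rain-gauge observations using Kriging. As a simplified stand-in for that calibration idea (not the study's actual method), the sketch below fits an ordinary least-squares linear correction mapping radar estimates to gauge readings; all values are hypothetical:

```python
# Hedged sketch: fit a least-squares linear correction y = a*x + b
# mapping radar-rainfall estimates x to rain-gauge readings y, as a
# simple stand-in for gauge-based calibration of a rainfall grid.
def fit_linear_correction(radar, gauge):
    """Return slope a and intercept b of the OLS fit gauge ~ radar."""
    n = len(radar)
    mx = sum(radar) / n
    my = sum(gauge) / n
    sxx = sum((x - mx) ** 2 for x in radar)
    sxy = sum((x - mx) * (y - my) for x, y in zip(radar, gauge))
    a = sxy / sxx
    b = my - a * mx
    return a, b

# Hypothetical paired values (mm/hr) at gauge locations; here the
# radar systematically underestimates rainfall by ~20%.
radar = [5.0, 10.0, 20.0, 40.0]
gauge = [6.0, 12.0, 24.0, 48.0]
a, b = fit_linear_correction(radar, gauge)
corrected = [a * x + b for x in radar]   # calibrated grid values
```

Kriging goes further by spreading the gauge information spatially with distance-dependent weights, but the goal, removing the systematic inconsistency between grid and gauges, is the same.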

  16. Remote Sensing Applied in Natural Hazards Mitigation: Experiences from the International UNESCO/IUGS GARS-Program 1984-2002

    NASA Astrophysics Data System (ADS)

    Bannert, D.

    The Geological Application of Remote Sensing Program (GARS-Program) has since 1984 devoted its efforts to the application of remote sensing from aircraft and space platforms, with the understanding that it adds considerably to the knowledge of geo-dynamic processes in the widest sense. Remote sensing provides a large variety of sensors, the data of which, once analysed, can provide completely new observations of areas threatened by natural hazards. UNESCO and IUGS, through the GARS-Program, provide a forum where remote sensing techniques are continuously scrutinised concerning their geological application, especially their potential for assessing geo-environmental issues under various geological, climatic, and morphological conditions. In the past, international co-operation projects in developing countries have been carried out under the GARS-Program. Currently, the GARS-Program is strongly involved in the IGOS Geohazards Theme Working Group and in the UNESCO/IAH Middle East Transboundary Aquifer Initiative, as well as numerous individual projects by member institutes. There are more than 40 institutes co-operating worldwide under the GARS-Program. Space organisations and financing institutions serving developing nations are requested to help deploy new sensors to monitor geo-dynamic processes, providing free and direct data reception in all parts of the world in order to allow national institutes to develop their own early warning capabilities.

  17. S. 353: A bill to require the Director of the National Institute for Occupational Safety and Health to conduct a study of the prevalence and issues related to contamination of workers' homes with hazardous chemicals and substances transported from their workplace and to issue or report on regulations to prevent or mitigate the future contamination of workers' homes, and for other purposes, introduced in the United States Senate, One Hundred Second Congress, First Session, November 27, 1991

    SciTech Connect

    Not Available

    1991-01-01

    This bill was introduced in the Senate of the United States on February 5, 1991, to require NIOSH to conduct a study concerning the contamination of workers' homes with hazardous materials transported from their workplace and to issue or report on regulations to prevent future contamination. These hazardous chemicals and substances are being transported out of industries on workers' clothing and pose a threat to the health and welfare of workers and their families. Separate sections address the following: evaluation of employee-transported contaminant releases; regulations or standards; and authorization of appropriations.

  18. RADON MITIGATION IN SCHOOLS: CASE STUDIES OF RADON MITIGATION SYSTEMS INSTALLED BY EPA IN FOUR MARYLAND SCHOOLS ARE PRESENTED

    EPA Science Inventory

    The first part of this two-part paper discusses radon entry into schools, radon mitigation approaches for schools, and school characteristics (e.g., heating, ventilation, and air-conditioning -- HVAC-- system design and operation) that influence radon entry and mitigation system ...

  19. Monitoring volcanic activity with satellite remote sensing to reduce aviation hazard and mitigate the risk: application to the North Pacific

    NASA Astrophysics Data System (ADS)

    Webley, P. W.; Dehn, J.

    2012-12-01

    Volcanic activity across the North Pacific (NOPAC) occurs on a daily basis, and as such monitoring needs to occur 24 hours a day, 365 days a year. The risk to the local population and to aviation traffic is too high for this not to happen. Given the size and remoteness of the NOPAC region, satellite remote sensing has become an invaluable tool to monitor ground activity at the region's volcanoes as well as to observe, detect and analyze the volcanic ash clouds that traverse the Pacific. Here, we describe the satellite data collection, data analysis, real-time alert/alarm systems, observational database, and nearly 20-year archive of both automated and manual observations of volcanic activity. We provide examples where satellite remote sensing has detected precursory activity at volcanoes prior to eruption, as well as different types of eruptive behavior that can be inferred from the time series data. Additionally, we illustrate how the remote sensing data can be used to detect volcanic ash in the atmosphere, with some of the pros and cons of the method as applied to the NOPAC, and how the data can be combined with other volcano monitoring techniques, such as seismic monitoring and infrasound, to provide a more complete understanding of a volcano's behavior. We focus on several large volcanic events across the region since our archive started in 1993, and show how the system can detect both these large-scale events and the smaller but more frequent events. The goal is to reduce risk, improve scenario planning and situational awareness, and at the same time provide the best and most reliable hazard assessment of any volcanic activity.
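One widely used satellite ash-detection method of the kind alluded to above is the split-window "reverse absorption" test: ash clouds tend to produce a negative 11 µm minus 12 µm brightness temperature difference (BTD), while meteorological cloud tends to give a positive one. A toy sketch, where the threshold and the pixel values are illustrative assumptions rather than operational settings:

```python
# Sketch of the split-window BTD test for volcanic ash: flag pixels
# whose 11-um minus 12-um brightness temperature difference falls
# below a (tunable) negative threshold.
def ash_mask(bt11, bt12, threshold=-0.5):
    """Return True for pixels whose BTD (K) falls below the threshold."""
    return [(t11 - t12) < threshold for t11, t12 in zip(bt11, bt12)]

# Hypothetical brightness temperatures (K) for four pixels:
bt11 = [265.0, 250.0, 270.0, 255.0]
bt12 = [263.5, 252.0, 269.0, 257.5]
print(ash_mask(bt11, bt12))  # [False, True, False, True]
```

In practice the threshold is tuned per sensor and region, and the test is combined with other checks, since water vapor and cold surfaces can confound the simple BTD sign.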

  20. Deepwater Gulf of Mexico Shallow Hazards Studies Best Practices

    NASA Astrophysics Data System (ADS)

    Fernandez, M.; Hobbs, B.

    2005-05-01

    ConocoPhillips (COPC) has been involved in deepwater exploration in the Gulf of Mexico for the last 5 years using a dynamically positioned (DP) drillship. As part of the Federal (MMS) and State permitting process for deepwater exploration, COPC secures seabed and shallow subsurface hazard surveys and analyses for every potential drillsite. COPC conducts these surveys for two main reasons: to be a safe, efficient operator (seabed and shallow subsurface hazard surveys and analyses are necessary steps of the Exploration Work Flow to help ensure a safe well), and to fulfill MMS (or local government) regulatory requirements. The purpose of shallow geohazards studies is to determine seafloor and sub-bottom conditions, inspect for possible chemosynthetic communities, and provide a shallow hazards assessment in accordance with NTL 2003-G17. During the five years of deepwater exploration, COPC has contracted Fugro Geoservices to perform hazards studies in over 30 offshore blocks. The results of the seabed and shallow geohazards studies are a critical part of the construction of all of our well plans and are dynamically used in all MDT's. These investigations have greatly improved our drilling efficiency by predicting and avoiding possible chemosynthetic communities, seafloor faults, shallow gas, and shallow water flow. COPC's outstanding safety record and environmental stewardship with regard to geohazards has helped us accelerate certain Exploration Plans (within MMS guidelines). These efforts have saved money and kept the drilling schedule running smoothly. In the last two years, the MMS has given COPC approval to use existing 3D spec seismic volumes for Shallow Hazards Assessment at several locations where applicable. This type of effort has saved ConocoPhillips hundreds of thousands of dollars that would have been spent in either acquiring 2D high-resolution seismic data or reprocessing an existing 3D data volume. Examples from selected prospects: Magnolia (Garden Banks 783/784); Voss (Keathley Canyon 347/391/435); Lorien (Green Canyon 199); Yorick (Green Canyon 391/435)

  1. Detecting Slow Deformation Signals Preceding Dynamic Failure: A New Strategy For The Mitigation Of Natural Hazards (SAFER)

    NASA Astrophysics Data System (ADS)

    Vinciguerra, Sergio; Colombero, Chiara; Comina, Cesare; Ferrero, Anna Maria; Mandrone, Giuseppe; Umili, Gessica; Fiaschi, Andrea; Saccorotti, Gilberto

    2015-04-01

    Rock slope monitoring is a major aim in territorial risk assessment and mitigation. The high velocity that usually characterizes the failure phase of rock instabilities makes traditional instruments based on slope deformation measurements unsuitable for early warning systems. The use of site-specific microseismic monitoring systems, with particular reference to potential destabilizing factors such as rainfall and temperature changes, can allow detection of pre-failure signals in unstable sectors within the rock mass and prediction of the possible acceleration to failure. In October 2013 we deployed a microseismic monitoring system, developed by the University of Turin/Compagnia San Paolo, consisting of a network of 4 triaxial 4.5 Hz seismometers connected to a 12-channel data logger, on an unstable patch of Madonna del Sasso, Italian Western Alps. The initial characterization, based on geomechanical and geophysical tests, allowed us to understand the instability mechanism and to design a 'large aperture' configuration that encompasses the entire unstable rock mass and can monitor subtle changes in the mechanical properties of the medium. Stability analysis showed that the stability of the slope is due to rock bridges. Continuous recording at a 250 Hz sampling frequency (switched to 1 kHz in March 2014 to improve first-arrival picking and obtain wider frequency content) and trigger recording based on an STA/LTA (Short Time Average over Long Time Average) detection algorithm have been used. More than 2000 events with different waveforms, durations, and frequency contents were recorded between November 2013 and March 2014. By inspecting the acquired events we identified the key parameters for a reliable distinction among the natures of the signals, i.e., the signal shape in terms of amplitude, duration, and kurtosis, and the frequency content in terms of the range of maximum frequency and its distribution in spectrograms. Four main classes of recorded signals can be recognised: microseismic events; regional earthquakes; electrical noise and calibration signals; and unclassified events (probably grouping rockfalls, quarry blasts, and other anthropic and natural sources of seismic noise). Since the seismic velocity inside the rock mass is highly heterogeneous, as shown by the geophysical investigations, and the signals are often noisy, accurate location is not possible with a simple velocity model. To overcome this limitation, a three-dimensional P-wave velocity model has been built, linking the DSM (Digital Surface Model) of the cliff obtained from a laser-scanner survey to the results of the cross-hole seismic tomography, the geological observations, and the geomechanical measurements of the most pervasive fracture planes. As a next step we will proceed to the localization of event sources, to the improvement and automation of data analysis procedures, and to a search for correlations between event rates and meteorological data, for a better understanding of the processes driving the rock mass instability.
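The STA/LTA trigger named in the abstract can be sketched in a few lines: the ratio of a short-term average to a long-term average of signal amplitude rises sharply when a transient event arrives. The window lengths, threshold, and synthetic trace below are illustrative assumptions, not the deployed system's parameters:

```python
# Minimal STA/LTA (Short Time Average over Long Time Average) trigger:
# flag samples where the ratio of short- to long-window mean absolute
# amplitude exceeds a threshold. Parameters are illustrative.
def sta_lta_trigger(signal, nsta=5, nlta=50, threshold=3.0):
    """Return sample indices where STA/LTA exceeds the threshold."""
    triggers = []
    for i in range(nlta, len(signal)):
        sta = sum(abs(s) for s in signal[i - nsta:i]) / nsta
        lta = sum(abs(s) for s in signal[i - nlta:i]) / nlta
        if lta > 0 and sta / lta > threshold:
            triggers.append(i)
    return triggers

# Synthetic trace: low background noise with a burst starting at sample 80.
trace = [0.1] * 80 + [2.0] * 10 + [0.1] * 30
hits = sta_lta_trigger(trace)
```

Production systems (e.g. the recursive STA/LTA in ObsPy) use running sums or recursion instead of re-summing each window, but the detection principle is the same.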

  2. The use of questionnaires for acquiring information on public perception of natural hazards and risk mitigation - a review of current knowledge and practice

    NASA Astrophysics Data System (ADS)

    Bird, D. K.

    2009-07-01

    Questionnaires are popular and fundamental tools for acquiring information on public knowledge and perception of natural hazards, and they can provide valuable information to emergency management agencies for developing risk management procedures. Although many natural hazards researchers describe results generated from questionnaires, few explain the techniques used for their development and implementation. Methodological detail should include, as a minimum, the response format (open/closed questions), mode of delivery, sampling technique, response rate, and access to the questionnaire to allow reproduction of, or comparison with, similar studies. This article reviews current knowledge and practice for developing and implementing questionnaires; key features include questionnaire design, delivery mode, sampling techniques and data analysis. To illustrate these aspects, a case study examines the methods chosen for developing and implementing questionnaires used to obtain information on knowledge and perception of volcanic hazards in a tourist region in southern Iceland. Face-to-face interviews highlighted certain issues with respect to question structure and sequence, and recommendations are made to overcome these problems before the questionnaires are applied in future research projects. In conclusion, the basic steps that should be disclosed in the literature are provided as a checklist to ensure that reliable, replicable and valid results are produced from questionnaire-based hazard knowledge and risk perception research.

  3. Knowledge of occupational hazards in photography: a pilot study.

    PubMed

    Marlenga, B; Parker-Conrad, J E

    1993-04-01

    1. Artists use many materials composed of the same chemicals that cause major occupational health problems in industry. 2. The majority of artists are unaware of the potential hazards in the materials and processes they use. 3. The pilot study revealed that greater than 90% of the amateur photographers did not use safety precautions in the darkroom. 4. The most common perceived barrier to the use of safety precautions was the lack of knowledge about chemical safety. PMID:8507283

  4. Household hazardous waste quantification, characterization and management in China's cities: a case study of Suzhou.

    PubMed

    Gu, Binxian; Zhu, Weimo; Wang, Haikun; Zhang, Rongrong; Liu, Miaomiao; Chen, Yangqing; Wu, Yi; Yang, Xiayu; He, Sheng; Cheng, Rong; Yang, Jie; Bi, Jun

    2014-11-01

    A four-stage systematic tracking survey of 240 households was conducted from the summer of 2011 to the spring of 2012 in the Chinese city of Suzhou to determine the characteristics of the household hazardous waste (HHW) generated by the city. Factor analysis and a regression model were used to study the major driving forces of HHW generation. The results indicate that the rate of HHW generation was 6.16 (0.16-31.74, 95% CI) g/person/day, which accounted for 2.23% of the household solid waste stream. The major waste categories contributing to total HHW were home cleaning products (21.33%), medicines (17.67%) and personal care products (15.19%). Packaging and containers (one-way) and products (single-use) accounted for over 80% of total HHW generation, implying a considerable potential to mitigate HHW generation by changing the packaging design and materials used by manufacturing enterprises. Strong correlations were observed between HHW generation (g/person/day) and the driving-force groups of "household structure" and "consumer preferences" (among which the educational level of the household financial manager has the greatest impact). Furthermore, the HHW generation stream in Suzhou suggested the influence of another set of variables, such as local customs and culture, consumption patterns, and urban residential lifestyle. This study emphasizes that HHW should be categorized at its source (residential households) as an important step toward controlling the HHW hazards of Chinese cities. PMID:25022547

  5. HOUSEHOLD HAZARDOUS WASTE CHARACTERIZATION STUDY FOR PALM BEACH COUNTY, FLORIDA: A MITE PROGRAM EVALUATION

    EPA Science Inventory

    The objectives of the Household hazardous Waste Characterization Study (the HHW Study) were to quantify the annual household hazardous waste (HHW) tonnages disposed in Palm Beach County, Florida's (the county) residential solid waste (characterized in this study as municipal soli...

  6. Study on the health hazards of scrap metal cutters.

    PubMed

    Ho, S F; Wong, P H; Kwok, S F

    1989-12-01

    Scrap metal cutters seem to be left out of most preventive programmes because they are mainly contract workers. The health hazards of scrap metal cutting were evaluated in 54 workers from a foundry and a ship-breaking plant. Environmental sampling showed lead levels ranging from 0.02 to 0.57 mg/m3 (the threshold limit value is 0.15 mg/m3). Exposure to lead came mainly from the paint coat of the metals being cut. Metal fume fever was not reported, although the workers' main complaints were cough and rhinitis. Skin burns at all stages of healing and residual scars were seen over the hands, forearms and thighs. 96% of the cutters had blood lead levels exceeding 40 micrograms/100 ml, with 10 workers exceeding 70 micrograms/100 ml. None had clinical evidence of lead poisoning. The study showed that scrap metal cutting is a hazardous industry associated with significant lead exposure. With proper medical supervision, the blood lead levels of this group of workers decreased, illustrating the importance of identifying the hazard and implementing appropriate medical surveillance programmes. PMID:2635395

  7. Interdisciplinary approach to hydrological hazard mitigation and disaster response and effects of climate change on the occurrence of flood severity in central Alaska

    NASA Astrophysics Data System (ADS)

    Kontar, Y. Y.; Bhatt, U. S.; Lindsey, S. D.; Plumb, E. W.; Thoman, R. L.

    2015-06-01

    In May 2013, a massive ice jam on the Yukon River caused flooding that destroyed much of the infrastructure in the Interior Alaska village of Galena and forced the long-term evacuation of nearly 70% of its residents. This case study compares the communication efforts of the out-of-state emergency response agents with those of the Alaska River Watch program, a state-operated flood preparedness and community outreach initiative. For over 50 years, the River Watch program has been fostering long-lasting, open, and reciprocal communication with flood-prone communities, as well as local emergency management and tribal officials. By taking into account cultural, ethnic, and socioeconomic features of rural Alaskan communities, the River Watch program was able to establish and maintain a sense of partnership and reliable communication patterns with communities at risk. As a result, officials and residents in these communities are open to information and guidance from the River Watch during a flood, and thus are poised to take prompt action. By informing communities of existing ice conditions and flood threats on a regular basis, the River Watch provides effective mitigation in reducing the effects of ice jam floods. Although other ice jam mitigation attempts have been made throughout US and Alaskan history, the majority proved futile and/or cost-ineffective. Galena, along with other rural riverine Alaskan communities, has to rely primarily on disaster response and recovery strategies to withstand the shock of disasters. Significant government funds are spent on these challenging efforts, and these expenses might be reduced through an improved understanding of both the physical and climatological principles behind river ice breakup and risk mitigation. This study finds that long-term dialogue is critical for effective disaster response and recovery during extreme hydrological events connected to changing climate, the timing of river ice breakup, and flood occurrence in rural communities of the Far North.

  8. Decay extent evaluation of wood degraded by a fungal community using NIRS: application for ecological engineering structures used for natural hazard mitigation

    NASA Astrophysics Data System (ADS)

    Baptiste Barr, Jean; Bourrier, Franck; Bertrand, David; Rey, Freddy

    2015-04-01

    Ecological engineering refers to the design of efficient solutions for protection against natural hazards such as shallow landslides and soil erosion. In particular, bioengineering structures can be composed of a living part, made of plants, cuttings or seeds, and an inert part, a timber-log structure. As the wood is not treated with preservatives, fungal degradation can occur from the start of construction. It results in wood strength loss, which practitioners try to evaluate with non-destructive tools (NDT). Classical NDT are mainly based on density measurements. However, fungal activity reduces the mechanical properties (modulus of elasticity - MOE) well before a density change can be measured. In this context, it would be useful to provide a tool for assessing the residual mechanical strength at different stages of decay due to a fungal community. Near-infrared spectroscopy (NIRS) can be used for that purpose, as it allows evaluating wood mechanical properties as well as wood chemical changes due to brown and white rots. We monitored 160 silver fir samples (30x30x6000mm) from the green state to different levels of decay. The degradation process took place in a greenhouse, and samples were inoculated with decayed silver fir debris in order to accelerate the process. For each sample, we calculated the normalized bending modulus of elasticity loss (Dw moe) and defined it as decay extent. Near-infrared spectra collected from both green and decayed ground samples were corrected by subtraction of the baseline offset. Spectra of the green samples were averaged into one mean spectrum, and decayed spectra were subtracted from the mean spectrum to calculate the absorption loss. Partial least squares regression (PLSR) was performed between the normalized MOE loss Dw moe (0 < Dw moe < 1) and the absorption loss, with a correlation coefficient R equal to 0.85. Finally, the prediction of the silver fir biodegradation rate by NIRS was significant (RMSEP = 0.13). This tool improves the accuracy of evaluating wood decay extent in the context of ecological engineering structures used for natural hazard mitigation.
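    The PLSR step described above can be sketched in miniature. The snippet below fits a one-latent-variable PLS model in pure Python on synthetic "spectra"; it is a didactic illustration of regressing a mechanical-loss value on a multivariate absorption loss, not the study's data or code:

```python
# Minimal one-component PLS regression (PLS1), pure Python.
# Synthetic data stand in for the study's absorption-loss spectra (X)
# and normalized MOE loss (y); both are centered by construction.
import math

def pls1_fit(X, y):
    """Fit a one-component PLS model on centered data.
    Returns (w, q): unit weight vector and regression scalar."""
    n, p = len(X), len(X[0])
    # Weight vector w is proportional to X'y (the covariance direction).
    w = [sum(X[i][j] * y[i] for i in range(n)) for j in range(p)]
    norm = math.sqrt(sum(wj * wj for wj in w))
    w = [wj / norm for wj in w]
    # Latent scores t = Xw.
    t = [sum(X[i][j] * w[j] for j in range(p)) for i in range(n)]
    # Regress y on t: q = (y.t)/(t.t).
    q = sum(yi * ti for yi, ti in zip(y, t)) / sum(ti * ti for ti in t)
    return w, q

def pls1_predict(w, q, x):
    """Predict the response for one (centered) spectrum x."""
    return q * sum(xj * wj for xj, wj in zip(x, w))

# Spectra lying along one spectral direction d, scaled by latent scores;
# the response is proportional to the same scores.
d = [1.0, 2.0, 3.0]
scores = [-2.0, -1.0, 0.0, 1.0, 2.0]
X = [[s * dj for dj in d] for s in scores]
y = [0.5 * s for s in scores]

w, q = pls1_fit(X, y)
pred = pls1_predict(w, q, [1.5 * dj for dj in d])
print(round(pred, 6))  # latent score 1.5 -> predicted loss 0.75
```

    With real spectra one would use several latent components and cross-validate their number, which is what statistics like R = 0.85 and RMSEP = 0.13 summarize.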

  9. Channelized debris flow hazard mitigation through the use of flexible barriers: a simplified computational approach for a sensitivity analysis.

    NASA Astrophysics Data System (ADS)

    Segalini, Andrea; Ferrero, Anna Maria; Brighenti, Roberto

    2013-04-01

    A channelized debris flow is usually represented by a mixture of solid particles of various sizes and water, flowing along a laterally confined, inclined, channel-shaped region up to an unconfined area where it slows down and spreads out into a flat-shaped mass. The study of these phenomena is very difficult due to their short duration and unpredictability, the lack of historical data for a given basin, and the complexity of the mechanical phenomena involved. Post-event surveys allow identification of some depositional features and provide an indication of the maximum flow height; however, they lack information about the development of the phenomena over time. For this purpose, the monitoring of recurrent events has been carried out by several authors. Most of the studies aimed at determining the characteristic features of a debris flow were carried out in artificial channels, where the main variables involved were measured and others were controlled during the tests; however, some uncertainties remained, and other scaled models were developed to simulate the deposition mechanics as well as to analyze the transportation mechanics and the energy dissipation. Assessment of the mechanical behavior of protection structures upon impact with the flow, as well as the energy associated with it, is necessary for the proper design of such structures, which, in densely populated areas, can avoid victims and limit the destructive effects of such a phenomenon. In this work a simplified structural model, developed by the authors for the safety assessment of retention barriers against channelized debris flows, is presented, and some parametric cases are interpreted through the proposed approach; the model is intended as a simplified and efficient tool for the verification of the supporting cables and foundations of a flexible debris flow barrier. The present analytical and numerical approach has a different aim from a FEM model. Computational experience with FEM modeling of this kind of structure has shown that a large amount of time is required both for the geometrical setup of the model and for its computation. The large effort FEM requires for this class of problems limits the practical possibility of investigating different geometrical configurations, load schemes, etc.; it is suitable for representing a specific configuration but does not allow investigation of the influence of parameter changes. On the other hand, parametric analyses are common practice in geotechnical design for the reasons quoted. Consequently, the authors felt the need to develop a simplified method (to our knowledge not yet available) that allows several parametric analyses to be performed in a limited time. It should be noted that, in this paper, no considerations regarding the mechanical and physical behavior of debris flows are made; the proposed model requires the input of parameters that must be acquired through a preliminary characterization of the design event. However, adopting the proposed tool, the designer will be able to perform sensitivity analyses that help quantify the influence of parameter variability, as commonly occurs in geotechnical design.

  10. Satellite Monitoring of Ash and Sulphur Dioxide for the mitigation of Aviation Hazards: Part II. Validation of satellite-derived Volcanic Sulphur Dioxide Levels.

    NASA Astrophysics Data System (ADS)

    Koukouli, MariLiza; Balis, Dimitris; Dimopoulos, Spiros; Clarisse, Lieven; Carboni, Elisa; Hedelt, Pascal; Spinetti, Claudia; Theys, Nicolas; Tampellini, Lucia; Zehner, Claus

    2014-05-01

    The eruption of the Icelandic volcano Eyjafjallajökull in the spring of 2010 turned the attention of both the public and the scientific community to the susceptibility of European airspace to the outflows of large volcanic eruptions. The ash-rich plume from Eyjafjallajökull drifted towards Europe and caused major disruptions of European air traffic for several weeks, affecting the everyday life of millions of people and having a strong economic impact. This unparalleled situation revealed limitations in the decision-making process due to the lack of information on the ash tolerance of commercial aircraft engines, as well as limitations in ash monitoring and prediction capabilities. The European Space Agency project Satellite Monitoring of Ash and Sulphur Dioxide for the mitigation of Aviation Hazards was introduced to facilitate the development of an optimal End-to-End System for Volcanic Ash Plume Monitoring and Prediction. This system is based on comprehensive satellite-derived ash plume and sulphur dioxide [SO2] level estimates, as well as widespread validation using supplementary satellite, aircraft and ground-based measurements. The validation of volcanic SO2 levels extracted from the sensors GOME-2/MetopA and IASI/MetopA is shown here, with emphasis on the total column observed right before, during and after the Eyjafjallajökull 2010 eruptions. Co-located ground-based Brewer spectrophotometer data extracted from the World Ozone and Ultraviolet Radiation Data Centre (WOUDC) were compared to the different satellite estimates. The findings are presented at length, alongside a comprehensive discussion of future scenarios.
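    The core of such a validation is a statistical comparison of co-located column pairs. A minimal sketch of the bias/RMSE arithmetic, with invented SO2 total columns in Dobson Units (the actual GOME-2, IASI and Brewer values are not reproduced here):

```python
# Mean bias and RMSE between matched satellite and ground-based SO2
# total columns. The column values are invented, in Dobson Units.
import math

satellite = [1.2, 0.8, 2.5, 1.9, 0.4]  # hypothetical satellite retrievals
brewer    = [1.0, 0.9, 2.2, 2.0, 0.5]  # hypothetical Brewer co-locations

diffs = [s - b for s, b in zip(satellite, brewer)]
bias = sum(diffs) / len(diffs)                          # mean difference
rmse = math.sqrt(sum(d * d for d in diffs) / len(diffs))  # scatter

print(f"bias = {bias:.3f} DU, rmse = {rmse:.3f} DU")
```

    A positive bias indicates the satellite retrievals read high relative to the ground truth; the RMSE folds in both that offset and the pair-to-pair scatter.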

  11. Satellite Monitoring of Ash and Sulphur Dioxide for the mitigation of Aviation Hazards: Part I. Validation of satellite-derived Volcanic Ash Levels.

    NASA Astrophysics Data System (ADS)

    Koukouli, MariLiza; Balis, Dimitris; Simopoulos, Spiros; Siomos, Nikos; Clarisse, Lieven; Carboni, Elisa; Wang, Ping; Siddans, Richard; Marenco, Franco; Mona, Lucia; Pappalardo, Gelsomina; Spinetti, Claudia; Theys, Nicolas; Tampellini, Lucia; Zehner, Claus

    2014-05-01

    The 2010 eruption of the Icelandic volcano Eyjafjallajökull attracted the attention of the public and the scientific community to the vulnerability of European airspace to volcanic eruptions. Major disruptions of European air traffic were observed for several weeks surrounding the two eruptive episodes, which had a strong impact on the everyday life of many Europeans as well as a noticeable economic loss of around 2-3 billion Euros in total. The eruptions made it obvious that decision-making bodies were not informed properly or in a timely manner about the tolerance of commercial aircraft to ash-laden air, and that the ash monitoring and prediction potential was rather limited. After the Eyjafjallajökull eruptions, new guidelines for aviation were introduced, changing from zero tolerance to newly established ash threshold values. In this spirit, the European Space Agency project Satellite Monitoring of Ash and Sulphur Dioxide for the mitigation of Aviation Hazards called for the creation of an optimal End-to-End System for Volcanic Ash Plume Monitoring and Prediction. This system is based on improved and dedicated satellite-derived ash plume and sulphur dioxide level assessments, as well as an extensive validation using auxiliary satellite, aircraft and ground-based measurements. The validation of volcanic ash levels extracted from the sensors GOME-2/MetopA, IASI/MetopA, MODIS/Terra and MODIS/Aqua is presented in this work, with emphasis on ash plume height and ash optical depth. Co-located aircraft flights, Cloud-Aerosol Lidar and Infrared Pathfinder Satellite Observation [CALIPSO] soundings, as well as European Aerosol Research Lidar Network [EARLINET] measurements, were compared to the different satellite estimates for those two eruptive episodes. The validation results are extremely promising, with most satellite sensors performing quite well and within the estimated uncertainties compared to the comparative datasets. The findings are extensively presented here and future directions discussed at length.

  12. Mitigating Hazards in School Facilities

    ERIC Educational Resources Information Center

    National Clearinghouse for Educational Facilities, 2008

    2008-01-01

    School safety is a human concern, one that every school and community must take seriously and strive continually to achieve. It is also a legal concern; schools can be held liable if they do not make good-faith efforts to provide a safe and secure school environment. How schools are built and maintained is an integral part of school safety and…

  13. 44 CFR 201.7 - Tribal Mitigation Plans.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... OF HOMELAND SECURITY DISASTER ASSISTANCE MITIGATION PLANNING § 201.7 Tribal Mitigation Plans. The... tribal government's pre- and post-disaster hazard management policies, programs, and capabilities...

  14. 44 CFR 201.7 - Tribal Mitigation Plans.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... OF HOMELAND SECURITY DISASTER ASSISTANCE MITIGATION PLANNING § 201.7 Tribal Mitigation Plans. The... tribal government's pre- and post-disaster hazard management policies, programs, and capabilities...

  15. A STUDY ON GREENHOUSE GAS EMISSIONS AND MITIGATION POTENTIALS IN AGRICULTURE

    NASA Astrophysics Data System (ADS)

    Hasegawa, Tomoko; Matsuoka, Yuzuru

    In this study, world food production and consumption are estimated from 2005 to 2030 using a model developed with the general-to-specific modeling methodology. Based on the agricultural production, we estimated GHG emissions and mitigation potentials and evaluated mitigation countermeasures in agriculture. As a result, world crop and meat production will increase by factors of 1.4 and 1.3, respectively, up to 2030. World GHG emissions from agriculture were 5.7 GtCO2eq in 2005. CH4 emissions from enteric fermentation and N2O emissions from nitrogen fertilizer contributed a large part of the emissions. In 2030, the technical and economic mitigation potentials will be 2.0 GtCO2eq and 1.2 GtCO2eq, respectively. These potentials correspond to 36% and 22% of total emissions in 2000. The countermeasures with the highest effects will be water management in rice paddies, such as "midseason drainage" and "off-season straw".

  16. Insights from EMF Associated Agricultural and Forestry Greenhouse Gas Mitigation Studies

    SciTech Connect

    McCarl, Bruce A.; Murray, Brian; Kim, Man-Keun; Lee, Heng-Chi; Sands, Ronald D.; Schneider, Uwe

    2007-11-19

    Integrated assessment modeling (IAM) as employed by the Energy Modeling Forum (EMF) generally involves a multi-sector appraisal of greenhouse gas emission (GHGE) mitigation alternatives and climate change effects, typically at the global level. Such a multi-sector evaluation encompasses potential climate change effects and mitigative actions within the agricultural and forestry (AF) sectors. In comparison with many of the other sectors covered by IAM, the AF sectors may require somewhat different treatment due to their critical dependence upon spatially and temporally varying resource and climatic conditions. In particular, in large countries like the United States, forest production conditions vary dramatically across the landscape. For example, some areas in the southern US present conditions favorable to production of fast-growing, heat-tolerant pine species, while more northern regions often favor slower-growing hardwood and softwood species. Moreover, some lands are currently not suitable for forest production (e.g., the arid western plains). Similarly, in agriculture, the US has areas where citrus and cotton can be grown and other areas where barley and wheat are more suitable. This diversity across the landscape causes differential GHGE mitigation potential in the face of climatic changes and/or responses to policy or price incentives. It is difficult for a reasonably sized global IAM system to reflect the full range of sub-national geographic AF production possibilities alluded to above. AF response in the face of climate-change-altered temperature and precipitation regimes or mitigation incentives will likely involve region-specific shifts in land use and agricultural/forest production. This chapter addresses AF sectoral responses in climate change mitigation analysis. Specifically, we draw upon US-based studies of AF GHGE mitigation possibilities that incorporate sub-national detail, largely a body of studies done by the authors in association with EMF activities. We discuss characteristics of AF sectoral responses that could be incorporated in future IAM efforts in climate change policy.

  17. Interventionist and participatory approaches to flood risk mitigation decisions: two case studies in the Italian Alps

    NASA Astrophysics Data System (ADS)

    Bianchizza, C.; Del Bianco, D.; Pellizzoni, L.; Scolobig, A.

    2012-04-01

    Flood risk mitigation decisions pose key challenges not only from a technical but also from a social, economic and political viewpoint. There is an increasing demand to improve the quality of these processes by including different stakeholders - especially by involving local residents in the decision making - and by guaranteeing the actual improvement of local social capacities during and after the decision making. In this paper we analyse two case studies of flood risk mitigation decisions, Malborghetto-Valbruna and Vipiteno-Sterzing, in the Italian Alps. In both, mitigation works have been completed or planned, yet following completely different approaches, especially in terms of the responses of residents and the involvement of local authorities. In Malborghetto-Valbruna an 'interventionist' approach (i.e. leaning towards a top-down/technocratic decision process) was used to make decisions after the flood event that affected the municipality in 2003. In Vipiteno-Sterzing, a 'participatory' approach (i.e. leaning towards a bottom-up/inclusive decision process) was applied: decisions about risk mitigation measures were made by submitting different projects to the local citizens and involving them in the decision making. The analysis of the two case studies is grounded in the results of two research projects. Structured and in-depth interviews, as well as questionnaire surveys, were used to explore residents' and local authorities' orientations toward flood risk mitigation. A SWOT analysis (Strengths, Weaknesses, Opportunities and Threats) involving key stakeholders was also used to better understand the characteristics of the communities and their perception of flood risk mitigation issues. The results highlight some key differences between interventionist and participatory approaches, together with some implications of their adoption in the local context. Strengths and weaknesses of the two approaches, as well as key challenges for the future, are also discussed.

  18. CASE STUDY OF RADON DIAGNOSTICS AND MITIGATION IN A NEW YORK STATE SCHOOL

    EPA Science Inventory

    The paper discusses a case study of radon diagnostics and mitigation performed by EPA in a New York State school building. esearch focused on active subslab depressurization (ASD) in the basement and, to a lesser degree, the potential for radon reduction in the basement and slab-...

  19. Versatile gas gun target assembly for studying blast wave mitigation in materials

    NASA Astrophysics Data System (ADS)

    Bartyczak, S.; Mock, W., Jr.

    2012-03-01

    Traumatic brain injury (TBI) has become a serious problem for military personnel returning from recent conflicts. This has increased interest in investigating blast mitigating materials for use in helmets. In this paper we describe a new versatile target assembly that is used with an existing gas gun for studying these materials.

  20. Feasibility study of tank leakage mitigation using subsurface barriers

    SciTech Connect

    Treat, R.L.; Peters, B.B.; Cameron, R.J.; McCormak, W.D.; Trenkler, T.; Walters, M.F.; Rouse, J.K.; McLaughlin, T.J.; Cruse, J.M.

    1994-09-21

    The US Department of Energy (DOE) has established the Tank Waste Remediation System (TWRS) to safely manage and dispose of the waste currently stored in the underground storage tanks. The retrieval element of TWRS includes a work scope to develop subsurface impermeable barriers beneath single-shell tanks (SSTs). The barriers could serve as a means to contain leakage that may result from waste retrieval operations and could also support site closure activities by facilitating cleanup. Three types of subsurface barrier systems have emerged for further consideration: (1) chemical grout, (2) freeze walls, and (3) desiccant, represented in this feasibility study as a circulating air barrier. This report contains analyses of the costs and relative risks associated with combinations of retrieval technologies and barrier technologies that form 14 alternatives. Eight of the alternatives include the use of subsurface barriers; the remaining six nonbarrier alternatives are included in order to compare the costs, relative risks and other values of retrieval with subsurface barriers. Each alternative includes various combinations of technologies that can affect, to varying degrees, the risks associated with future contamination of the groundwater beneath the Hanford Site. Other potential risks associated with these alternatives, such as those related to accidents and airborne contamination resulting from retrieval and barrier emplacement operations, are not quantitatively evaluated in this report.

  1. Process support for risk mitigation: a case study of variability and resilience in vascular surgery

    PubMed Central

    Faxvaag, Arild; Seim, Andreas

    2011-01-01

    Objective To inform the design of IT support, the authors explored the characteristics and sources of process variability in a surgical care process that transcends multiple institutions and professional boundaries. Setting A case study of the care process in the Abdominal Aortic Aneurysm surveillance programme of three hospitals in Norway. Design Observational study of encounters between patients and surgeons accompanied by semistructured interviews of patients and key health personnel. Results Four process variety dimensions were identified. The captured process variations were further classified into intended and unintended variations according to the cause of the variations. Our main findings, however, suggest that the care process is best understood as systematised analysis and mitigation of risk. Even if major variations accommodated for the flexibility needed to achieve particular clinical aims and/or to satisfy patient preferences, other variations reflected healthcare actors' responses to risks arising from a lack of resilience in the existing system. On this basis, the authors outlined suggestions for a resilience-based approach by including awareness in workflow as well as feedback loops for adaptive learning. The authors suggest that IT process support should be designed to prevent process breakdowns with patient dropouts as well as to sustain risk-mitigating performance. Conclusion Process variation was in part induced by systemised risk mitigation. IT-based process support for monitoring processes such as that studied here should aim to ensure resilience and further mitigate risk to enhance patient safety. PMID:21325658

  2. A study of shock mitigating materials in a split Hopkinson bar configuration. Phase 1

    SciTech Connect

    Bateman, V.I.; Brown, F.A.; Hansen, N.R.

    1998-06-01

    Sandia National Laboratories (SNL) designs mechanical systems with electronics that must survive high-shock environments. These mechanical systems include penetrators that must survive soil, rock, and ice penetration; nuclear transportation casks that must survive transportation environments; and laydown weapons that must survive delivery impacts of 125 fps. These mechanical systems contain electronics that may operate during and after the high-shock environment and that must be protected from it. A study has been started to improve the packaging techniques for the advanced electronics utilized in these mechanical systems, because current packaging techniques are inadequate for these more sensitive electronics. In many cases, it has been found that the packaging techniques currently used not only do not mitigate the shock environment but actually amplify it. An ambitious goal for this packaging study is to avoid amplification and possibly attenuate the shock environment before it reaches the electronics contained in the various mechanical systems. As part of the investigation of packaging techniques, a two-phase study of shock mitigating materials is being conducted. The purpose of the first phase, reported here, is to examine the performance of a joint that consists of shock mitigating material sandwiched between steel, and to compare the performance of the shock mitigating materials. A split Hopkinson bar experimental configuration simulates this joint and has been used to study the shock mitigating characteristics of seventeen unconfined materials. The nominal input for these tests is an incident compressive wave with 50 fps peak (1,500 με peak) amplitude and a 100 μs duration (measured at 10% amplitude).
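    As a back-of-envelope consistency check on the quoted input (50 fps peak, roughly 1,500 με): assuming, as a working hypothesis not stated in the abstract, that the steel incident bar is struck by an equal-impedance striker at 50 fps, one-dimensional elastic wave theory gives an incident strain of v/(2c), with bar wave speed c = sqrt(E/ρ):

```python
# Incident-bar strain from striker velocity in a split Hopkinson bar,
# using nominal steel properties (assumed values, not from the report).
import math

E = 200e9     # Young's modulus of steel, Pa (nominal)
rho = 7850.0  # density of steel, kg/m^3 (nominal)

c = math.sqrt(E / rho)   # elastic bar wave speed, ~5,050 m/s
v = 50 * 0.3048          # 50 ft/s converted to m/s
strain = v / (2 * c)     # incident strain for an equal-impedance striker

print(round(strain * 1e6), "microstrain")
```

    With these nominal properties the computed incident strain is about 1,510 με, consistent with the 1,500 με quoted alongside the 50 fps peak.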

  3. The fujairah united arab emirates (uae) (ml = 5.1) earthquake of march 11, 2002 a reminder for the immediate need to develop and implement a national hazard mitigation strategy

    NASA Astrophysics Data System (ADS)

    Al-Homoud, A.

    2003-04-01

    On March 11, 2002, at midnight, the Fujairah Masafi region in the UAE was shaken by a shallow earthquake of local magnitude m = 5.1 on the Richter scale. The earthquake occurred on the Dibba fault in the UAE, with its epicenter 20 km NW of Fujairah city and a focal depth of just 10 km. The earthquake was felt in most parts of the northern emirates: Dubai, Sharjah, Ajman, Ras Al-Khaima, and Um-Qwain. The "main shock" was followed in the subsequent weeks by more than twenty-five earthquakes with local magnitudes ranging from m = 4 to m = 4.8. Those earthquakes were located along the Zagros reverse faulting system on the Iranian side of the Arabian Gulf, opposite the shores of the UAE. Most of these earthquakes were also shallow and were actually felt by the people. However, there was another strong earthquake in early April 2002 in the same Masafi region with local magnitude m = 5.1 and a focal depth of 30 km; it was therefore not felt by the northern emirates' residents. No major structural damage to buildings and lifeline systems was reported in the several cities located in the vicinity of the earthquake epicenter. The very small ground accelerations were not enough to test the structural integrity of tall buildings and major infrastructure. Future major earthquakes anticipated in close vicinity of the northern emirates, once they occur, and considering the noticeable local site effects of the emirates' sandy soils with high water table levels, will surely put these newly constructed buildings to a real test. Even though there were no casualties in the March 11th event, there was major fear as a result of the loud sound of rock rupture heard in the mountains close to Masafi, the noticeable disturbance of animals and birds minutes before and during the incident, cracks in a good number of Masafi buildings, and major damage to "old" buildings of the Fujairah Masafi area, the closest city to the epicenter. Indeed, the March 11, 2002 event and its "aftershocks" scared the citizens of Masafi and surrounding regions and drew the attention of the public and government to the subject of earthquake hazard, especially as this earthquake came one year after the nearby destructive m = 6.5 earthquake in India. The recent destructive m = 6.2 earthquake of June 22 that hit northwest Iran has again reminded the UAE public and government of the need to take quick and concrete steps to mitigate any anticipated earthquake hazard. This study reflects in some detail on the following aspects of the region and its vicinity: geological and tectonic setting, seismicity, the earthquake activity database, and seismic hazard assessment. Moreover, it documents the following aspects of the March 11, 2002 earthquake: tectonics, seismology, instrumental seismic data, aftershocks, strong motion recordings with response spectra and local site effect analysis, geotechnical effects, and structural observations in the region affected by the earthquake. The study identifies local site ground amplification effects and liquefaction hazard potential in some parts of the UAE. Moreover, the study reflects on the coverage of the incident in the media, public and government response, the state of earthquake engineering practice in the UAE construction industry, and national preparedness and public awareness issues. It is concluded for this event that the mild damage that occurred in the Masafi region was due to the poor quality of construction and underestimation of the design base shear. Practical recommendations are suggested for the authorities to avoid damage to newly constructed buildings and lifelines in future stronger earthquakes, in addition to recommendations on a national strategy for earthquake hazard mitigation in the UAE, which is still missing. The recommendations include the development and implementation of a design code for earthquake loading in the UAE, development of macro and micro seismic hazard maps, development of local site effect and liquefaction hazard maps, installation of a national earthquake monitoring network, assessment of the vulnerability of critical structures and lifeline facilities, public awareness, training of rescue teams in civil defense, etc.

  4. Hazards and operability study for the surface moisture monitoring system

    SciTech Connect

    Board, B.D.

    1996-04-04

The Hanford Nuclear Reservation Tank Farms' underground waste tanks have been used to store liquid radioactive waste from defense materials production since the 1940s. Waste in certain tanks may contain material in the form of ferrocyanide or various organic compounds that could potentially be susceptible to condensed-phase chemical reactions. Because of the presence of oxidizing materials (nitrate compounds) and heat sources (radioactive decay and chemical reactions), the ferrocyanide or organic material could potentially fuel a propagating exothermic reaction with undesirable consequences. Analysis and experiments indicate that reaction propagation and/or initiation may be prevented by the presence of sufficient moisture in the waste. Because the reaction would probably be initiated at the surface of the waste, evidence of sufficient moisture concentration there would help demonstrate that the tank waste can continue to be safely stored. The Surface Moisture Measurement System (SMMS) was developed to collect data on surface moisture in the waste by inserting two types of probes (singly) into a waste tank: a neutron probe and an electromagnetic induction (EMI) probe. The sensor probes are placed on the surface of the waste using a movable deployment arm to lower them through an available riser. The movement of the SMMS within the tank is monitored by a camera lowered through an adjacent riser. The SMMS equipment is the subject of this study. Hazards and Operability Analysis (HAZOP) is a systematic technique for assessing potential hazards and/or operability problems in a new activity; it uses a multidisciplinary team of knowledgeable individuals in a systematic brainstorming effort. The results of this study will be used as input to an Unreviewed Safety Question determination.

  5. Methodological Issues In Forestry Mitigation Projects: A Case Study Of Kolar District

    SciTech Connect

    Ravindranath, N.H.; Murthy, I.K.; Sudha, P.; Ramprasad, V.; Nagendra, M.D.V.; Sahana, C.A.; Srivathsa, K.G.; Khan, H.

    2007-06-01

There is a need to assess climate change mitigation opportunities in the forest sector in India in the context of methodological issues such as additionality, permanence, leakage, measurement and baseline development in formulating forestry mitigation projects. A case study of a forestry mitigation project in semi-arid community grazing lands and farmlands in Kolar district of Karnataka was undertaken with regard to baseline and project scenario development, estimation of carbon stock change in the project, leakage estimation and assessment of the cost-effectiveness of mitigation projects. Further, the transaction costs to develop the project, and the environmental and socio-economic impacts of the mitigation project, were assessed. The study shows the feasibility of establishing baselines and project C-stock changes. Since the area has low or insignificant biomass, leakage is not an issue. The overall mitigation potential in Kolar for a total area of 14,000 ha under various mitigation options is 278,380 tC at a rate of 20 tC/ha for the period 2005-2035, which is approximately 0.67 tC/ha/yr inclusive of harvest regimes under short-rotation and long-rotation mitigation options. The transaction cost for baseline establishment is less than a rupee/tC and for project scenario development is about Rs. 1.5-3.75/tC. The project enhances biodiversity and the socio-economic impact is also significant.

  6. OFFSHORE PLATFORM HAZARDOUS WASTE INCINERATION FACILITY: FEASIBILITY STUDY

    EPA Science Inventory

    This report describes a program conducted to evaluate the technical and environmental feasibility of using a proposed offshore platform incineration facility in the destruction of hazardous wastes and for incineration research.

  7. Feasibility Study of Radiometry for Airborne Detection of Aviation Hazards

    NASA Technical Reports Server (NTRS)

    Gimmestad, Gary G.; Papanicolopoulos, Chris D.; Richards, Mark A.; Sherman, Donald L.; West, Leanne L.; Johnson, James W. (Technical Monitor)

    2001-01-01

Radiometric sensors for aviation hazards have the potential for widespread and inexpensive deployment on aircraft. This report contains discussions of three aviation hazards - icing, turbulence, and volcanic ash - as well as candidate radiometric detection techniques for each hazard. Dual-polarization microwave radiometry is the only viable radiometric technique for detection of icing conditions, but more research will be required to assess its usefulness to the aviation community. Passive infrared techniques are being developed for detection of turbulence and volcanic ash by researchers in the United States and in Australia. Further investigation of the infrared airborne radiometric hazard detection approaches will also be required in order to develop reliable detection/discrimination techniques. This report includes a description of a commercial hyperspectral imager for investigating the infrared detection techniques for turbulence and volcanic ash.

  8. Management of agricultural soils for greenhouse gas mitigation: Learning from a case study in NE Spain.

    PubMed

    Sánchez, B; Iglesias, A; McVittie, A; Álvaro-Fuentes, J; Ingram, J; Mills, J; Lesschen, J P; Kuikman, P J

    2016-04-01

A portfolio of agricultural practices is now available that can contribute to reaching European mitigation targets. Among them, the management of agricultural soils has a large potential for reducing GHG emissions or sequestering carbon. Many of the practices are based on well-tested agronomic and technical know-how, with proven benefits for farmers and the environment. A suite of practices must be used, since no single practice provides a complete solution. However, there are limitations in the process of policy development: (a) agricultural activities are based on biological processes; thus these practices are location specific, and climate, soils and crops determine their agronomic potential; (b) since agriculture sustains rural communities, the costs and potential for implementation also have to be regionally evaluated; and (c) the aggregated regional potential of the combination of practices has to be defined in order to inform abatement targets. We believe that, when implementing mitigation practices, three questions are important: Are they cost-effective for farmers? Do they reduce GHG emissions? What policies favour their implementation? This study addressed these questions in three sequential steps. First, mapping the use of representative soil management practices in the European regions to provide a spatial context for upscaling the local results. Second, using a Marginal Abatement Cost Curve (MACC) in a Mediterranean case study (NE Spain) to rank soil management practices in terms of their cost-effectiveness. Finally, using a wedge approach to the practices as a complementary tool to link science to mitigation policy. A set of soil management practices was found to be financially attractive for Mediterranean farmers, which in turn could achieve significant abatement (e.g., 1.34 MtCO2e in the case study region). The quantitative analysis was completed by a discussion of potential farming and policy choices to shape realistic mitigation policy at the European regional level. PMID:26789201
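The MACC ranking the abstract describes can be sketched as sorting practices by marginal cost per tonne abated and accumulating their abatement potentials. The practice names, costs and potentials below are invented for illustration, not the study's NE Spain data; negative cost means a practice saves farmers money:

```python
# Minimal marginal abatement cost curve (MACC) sketch with hypothetical data.
# Each tuple: (practice, marginal cost in EUR/tCO2e, abatement in MtCO2e).
practices = [
    ("no-till",           -12.0, 0.40),
    ("cover crops",         5.0, 0.30),
    ("optimized N rates",  -8.0, 0.25),
    ("manure management",  20.0, 0.15),
]

# Rank by marginal cost (cheapest first), then accumulate abatement
# across the curve, as a MACC does along its x-axis.
macc = sorted(practices, key=lambda p: p[1])
cumulative = 0.0
for name, cost, abatement in macc:
    cumulative += abatement
    print(f"{name:20s} {cost:6.1f} EUR/tCO2e  cumulative {cumulative:.2f} MtCO2e")

# Abatement achievable at zero or negative cost: the "financially
# attractive" practices the study highlights for farmers.
free = sum(a for _, c, a in practices if c <= 0)
print(f"cost-negative abatement: {free:.2f} MtCO2e")
```

Ranking this way makes the policy question concrete: a carbon price (or subsidy) only needs to cover the practices whose marginal cost exceeds zero.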

  9. Volcanic hazard studies for the Yucca Mountain project

    SciTech Connect

    Crowe, B.; Harrington, C.; Turrin, B.; Champion, D.; Wells, S.; Perry, F.; McFadden, L.; Renault, C.

    1989-12-31

Volcanic hazard studies are ongoing to evaluate the risk of future volcanism with respect to siting of a repository for disposal of high-level radioactive waste at the Yucca Mountain site. Seven Quaternary basaltic volcanic centers are located between 8 and 47 km from the outer boundary of the exploration block. The conditional probability of disruption of a repository by future basaltic volcanism is bounded by the range of 10^-8 to 10^-10 yr^-1. These bounds are currently being reexamined based on new developments in the understanding of the evolution of small-volume basaltic volcanic centers, including: (1) many of the volcanic centers exhibit brief periods of eruptive activity separated by longer periods of inactivity; (2) the centers may be active for time spans exceeding 10^5 yrs; (3) there is a decline in the volume of eruptions of the centers through time; and (4) small-volume eruptions occurred at two of the Quaternary centers during the latest Pleistocene or Holocene. The authors classify the basalt centers as polycyclic, and distinguish them from polygenetic volcanoes. Polycyclic volcanism is characterized by small-volume, episodic eruptions of magma of uniform composition over time spans of 10^3 to 10^5 yrs. Magma eruption rates are low and the time between eruptions exceeds the cooling time of the magma volumes.

  10. Volcanic hazard studies for the Yucca Mountain project

    SciTech Connect

    Crowe, B.; Turrin, B.; Wells, S.; Perry, F.; McFadden, L.; Renault, C.E.; Champion, D.; Harrington, C.

    1989-05-01

Volcanic hazard studies are ongoing to evaluate the risk of future volcanism with respect to siting of a repository for disposal of high-level radioactive waste at the Yucca Mountain site. Seven Quaternary basaltic volcanic centers are located a minimum distance of 12 km and a maximum distance of 47 km from the outer boundary of the exploration block. The conditional probability of disruption of a repository by future basaltic volcanism is bounded by the range of 10^-8 to 10^-10 yr^-1. These values are currently being reexamined based on new developments in the understanding of the evolution of small-volume basaltic volcanic centers, including: (1) many, perhaps most, of the volcanic centers exhibit brief periods of eruptive activity separated by longer periods of inactivity; (2) the centers may be active for time spans exceeding 10^5 yrs; (3) there is a decline in the volume of eruptions of the centers through time; and (4) small-volume eruptions occurred at two of the Quaternary centers during latest Pleistocene or Holocene time. We classify the basalt centers as polycyclic, and distinguish them from polygenetic volcanoes. Polycyclic volcanism is characterized by small-volume, episodic eruptions of magma of uniform composition over time spans of 10^3 to 10^5 yrs. Magma eruption rates are low and the time between eruptions exceeds the cooling time of the magma volumes. 25 refs., 2 figs.

  11. The 5 key questions coping with risks due to natural hazards, answered by a case study

    NASA Astrophysics Data System (ADS)

    Hardegger, P.; Sausgruber, J. T.; Schiegg, H. O.

    2009-04-01

Based on Maslow's hierarchy of needs, human endeavours concern primarily existential needs, and consequently the need to be safeguarded against both natural and man-made threats. Subsequent needs are to realize opportunities in a variety of fields, such as economics and many others. Either way, the 5 crucial questions are the same as for coping with risks due to natural hazards specifically. These 5 key questions are: I) What is the impact as a function of space and time? II) What protection measures comply with the general opinion, and how much do they mitigate the threat? III) How can the loss be adequately quantified and monetized? IV) What budget for prevention, and what reserves for restoration and compensation, are to be planned? V) Which mix of measures and allocation of resources is sustainable, and thus optimal? The 5 answers, exemplified by a case study concerning the sustainable management of the risk due to the debris flows of the Enterbach / Inzing / Tirol / Austria, are as follows: I) The impact, created by the propagation of both flooding and sedimentation, has been forecast by modeling (numerical simulation) the 30, 50, 100, 150, 300 and 1000 year debris flows. The input was specified by detailed studies in meteorology, precipitation and runoff; in geology, hydrogeology, geomorphology and slope stability; in hydraulics, sediment transport and debris flow; and in forestry, agriculture and the development of communal settlement and infrastructure. All investigations were performed according to the method of ETAlp (Erosion and Transport in Alpine systems). ETAlp has been developed in order to achieve sustainable development in alpine areas and has been evaluated by the research project "nab" within the context of the EU Interreg IIIb projects. II) The risk mitigation measures of concern are hydraulic on the one hand and forestry-related on the other. 
Such risk management is evaluated according to sustainability, meaning economic, ecological and social, in short "triple", compatibility. 100% protection against the 100 year event proves to be the optimal degree of protection. Consequently, impacts statistically less frequent than once in 100 years are accepted as the remaining risk. Such floods and debris flows cause a fan of propagation which is substantially reduced by the protection measures against the 100 year event. III) The "triple loss distribution" shows the monetized triple damage as a function of its probability. The monetization is performed through the social process of participation of the impacted interests or, failing that, by official experts in representation. The triple loss distribution rises over time, mainly due to the rise in density and value of precious goods. A comparison of the distributions of the triple loss and the triple risk, which behave in opposite directions, is shown and explained within the project. IV) The recommended yearly reserves to be stocked for restoration and compensation of losses caused by debris flows amount to 70'000.- according to the approach of the "technical risk premium". The discrepancy in comparison with the much higher amounts according to the common approaches of natural hazards engineering is discussed. V) The sustainable mix of hydraulic and forestry measures with the highest return on investment at lowest risk is determined according to portfolio theory (Markowitz), based on the triple value curves generated by the method of TripelBudgetierung. Accordingly, the optimum mix of measures to protect the community of Inzing against the natural hazard of debris flow, and thus the most efficient allocation of resources, equals 2/3 for hydraulic and 1/3 for forestry measures. In detail, the results of the research pilot project "Nachhaltiges Risikomanagement - Enterbach / Inzing / Tirol / Austria" may be consulted at www.ibu.hsr.ch/inzing.
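The Markowitz-style optimization in answer V can be illustrated with a minimal two-"asset" sketch: a weighted mix of hydraulic and forestry measures with an expected return and a combined risk. All returns, risks and the correlation below are invented numbers, not the study's triple value curves:

```python
# Hypothetical two-asset Markowitz sketch for a mix of hydraulic and
# forestry measures. All inputs are illustrative assumptions.
import math

def portfolio(w, mu1, mu2, s1, s2, rho):
    """Expected return and standard deviation of a w / (1 - w) mix of
    two assets with returns mu1, mu2, risks s1, s2, correlation rho."""
    mu = w * mu1 + (1 - w) * mu2
    var = (w * s1) ** 2 + ((1 - w) * s2) ** 2 \
        + 2 * w * (1 - w) * s1 * s2 * rho
    return mu, math.sqrt(var)

# Hydraulic measures: higher return, higher risk; forestry: lower both.
# The 2/3 - 1/3 split mirrors the mix reported for Inzing.
mu, sigma = portfolio(w=2/3, mu1=0.08, mu2=0.05, s1=0.20, s2=0.10, rho=0.2)
print(round(mu, 4), round(sigma, 4))
```

Because the two measure classes are imperfectly correlated (rho < 1), the mixed portfolio's risk is below the weighted average of the individual risks, which is the core reason a mix can dominate either pure strategy.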

  12. Best Practices in Grid Integration of Variable Wind Power: Summary of Recent US Case Study Results and Mitigation Measures

    SciTech Connect

    Smith, J. Charles; Parsons, Brian; Acker, Thomas; Milligan, Michael; Zavidil, Robert; Schuerger, Matthew; DeMeo, Edgar

    2010-01-22

    This paper will summarize results from a number of utility wind integration case studies conducted recently in the US, and outline a number of mitigation measures based on insights from those studies.

  13. HOUSEHOLD HAZARDOUS WASTE CHARACTERIZATION STUDY FOR PALM BEACH COUNTY, FLORIDA - A MITE PROGRAM EVALUATION

    EPA Science Inventory

The objectives of the Household Hazardous Waste Characterization Study (the HHW Study) were to: 1) Quantify the annual household hazardous waste (HHW) tonnages disposed in Palm Beach County Florida's (the County) residential solid waste (characterized in this study as municipal s...

  14. HOUSEHOLD HAZARDOUS WASTE CHARACTERIZATION STUDY FOR PALM BEACH COUNTY, FLORIDA - A MITE PROGRAM EVALUATION

    EPA Science Inventory

The objectives of the Household Hazardous Waste Characterization Study (the HHW Study) were to: 1) Quantify the annual household hazardous waste (HHW) tonnages disposed in Palm Beach County Florida's (the County) residential solid waste (characterized in this study as municipal s...

  15. Sonic boom focusing prediction and delta wing shape optimization for boom mitigation studies

    NASA Astrophysics Data System (ADS)

    Khasdeo, Nitin

Supersonic travel over land would become a reality if new aircraft were designed to produce quieter ground sonic booms, no louder than 0.3 psf according to the FAA requirement. An attempt is made to address the challenging goals of predicting sonic boom focusing effects and mitigating the sonic boom ground overpressure for a delta wing geometry. Sonic boom focusing is fundamentally a nonlinear phenomenon and can be predicted by numerically solving the nonlinear Tricomi equation. A conservative type-differencing scheme, solved in the time domain, is developed to carry out the sonic boom focusing, or "super boom," studies. The finite volume method is used on a structured grid topology. A number of input signals (Concorde wave, symmetric and axisymmetric ramp, flat-top, and typical N-wave types) are simulated for sonic boom focusing prediction. A parametric study is launched to investigate several key parameters that affect the magnitude of shock wave amplification and the location of the surface of amplification, or "caustic surface." The parametric study includes the effects of longitudinal and lateral boundaries, the footprint and initial shock strength of the incoming wave, and the type of input signal on sonic boom focusing. Another very important aspect is the mitigation of the sonic boom ground signature: aerodynamic reshaping and geometrical optimization are the main approaches for reducing the ground signal to the level acceptable to the FAA. A biconvex delta wing with a chord length of 60 ft and a maximum thickness ratio of 5% of the chord, flying at an altitude of 40,000 ft at a Mach number of 2.0, is used as the baseline model for the fundamental research. 
Boom mitigation work focuses on investigating the effects of wing thickness ratio, wing camber ratio, wing nose angle and dihedral angle on mitigating the sonic-boom ground signature. Optimal shape design for a low sonic boom ground signature with the least degradation of aerodynamic performance is the main goal of the present work. Response surface methodology is used for the wing shape optimization. Far-field computations are carried out to predict the sonic boom signature on the ground using the full-potential code and the Thomas ray code.

  16. Reducing risk from lahar hazards: concepts, case studies, and roles for scientists

    USGS Publications Warehouse

    Pierson, Thomas C.; Wood, Nathan J.; Driedger, Carolyn L.

    2014-01-01

    Lahars are rapid flows of mud-rock slurries that can occur without warning and catastrophically impact areas more than 100 km downstream of source volcanoes. Strategies to mitigate the potential for damage or loss from lahars fall into four basic categories: (1) avoidance of lahar hazards through land-use planning; (2) modification of lahar hazards through engineered protection structures; (3) lahar warning systems to enable evacuations; and (4) effective response to and recovery from lahars when they do occur. Successful application of any of these strategies requires an accurate understanding and assessment of the hazard, an understanding of the applicability and limitations of the strategy, and thorough planning. The human and institutional components leading to successful application can be even more important: engagement of all stakeholders in hazard education and risk-reduction planning; good communication of hazard and risk information among scientists, emergency managers, elected officials, and the at-risk public during crisis and non-crisis periods; sustained response training; and adequate funding for risk-reduction efforts. This paper reviews a number of methods for lahar-hazard risk reduction, examines the limitations and tradeoffs, and provides real-world examples of their application in the U.S. Pacific Northwest and in other volcanic regions of the world. An overriding theme is that lahar-hazard risk reduction cannot be effectively accomplished without the active, impartial involvement of volcano scientists, who are willing to assume educational, interpretive, and advisory roles to work in partnership with elected officials, emergency managers, and vulnerable communities.

  17. STUDY ON AIR INGRESS MITIGATION METHODS IN THE VERY HIGH TEMPERATURE GAS COOLED REACTOR (VHTR)

    SciTech Connect

    Chang H. Oh

    2011-03-01

An air-ingress accident following a pipe break is considered a critical event for a very high temperature gas-cooled reactor (VHTR). Following helium depressurization, it is anticipated that, unless countermeasures are taken, air will enter the core through the break, leading to oxidation of the in-core graphite structure. Thus, without mitigation features, this accident might lead to severe exothermic chemical reactions between graphite and oxygen. Under extreme circumstances, a loss of core structural integrity may occur along with excessive release of the radiological inventory. Idaho National Laboratory, under the auspices of the U.S. Department of Energy, is performing research and development (R&D) that focuses on key phenomena important during challenging scenarios that may occur in the VHTR. Phenomena Identification and Ranking Table (PIRT) studies to date have identified the air ingress event, following on the heels of a VHTR depressurization, as very important (Oh et al. 2006, Schultz et al. 2006). Consequently, the development of advanced air-ingress-related models and verification and validation (V&V) requirements are part of the experimental validation plan. This paper discusses various air-ingress mitigation concepts applicable to VHTRs. The study begins by identifying important factors (or phenomena) associated with the air-ingress accident using a root-cause analysis. By preventing the main causes of the important events identified in the root-cause diagram, basic air-ingress mitigation ideas can be conceptually derived. The main concepts include (1) preventing structural degradation of the graphite supports; (2) preventing local stress concentration in the supports; (3) preventing graphite oxidation; (4) preventing air ingress; (5) preventing density-gradient-driven flow; (6) preventing a fluid density gradient; (7) preventing a fluid temperature gradient; and (8) preventing high temperature. 
Based on these basic concepts, various air-ingress mitigation methods are proposed in this study. Among them, the following two mitigation ideas are investigated extensively using computational fluid dynamics (CFD) codes: (1) helium injection into the lower plenum, and (2) a reactor enclosure opened at the bottom. The main idea of the helium injection method is to displace air in the core and the upper part of the lower plenum by buoyancy force, reducing graphite oxidation damage at the most severely affected locations inside the reactor. To validate this method, CFD simulations are addressed here. A simple 2-D CFD model was developed based on the GT-MHR 600 MWt design. The simulation results showed that the injected helium displaces the air flowing into the core and significantly reduces the air concentration in the core and bottom reflector, potentially preventing oxidation damage. According to the simulation results, even a small helium flow was sufficient to remove air from the core, successfully mitigating the air ingress. The idea of the reactor enclosure with an opening at the bottom changes the overall air-ingress mechanism from natural convection to molecular diffusion. This method can be applied to the current system by some design modification of the reactor cavity. To validate this concept, this study also uses CFD simulations based on a simplified 2-D geometry. The simulation results showed that the enclosure opened at the bottom can successfully mitigate air ingress into the reactor even after the onset of natural circulation.

  18. Countermeasures to hazardous chemicals

    SciTech Connect

    Holmes, J.M.; Byers, C.H.

    1989-04-01

Recent major incidents involving the airborne release of hazardous chemicals have led to this study of the effective strategies that must be developed to prevent and to deal with such emergencies. The comprehensive scope of the study for FEMA and the various other entities involved required that the project be divided into three tasks. In Task 1, the nature of the threat from incidents involving airborne hazardous chemicals is described; based on available databases, a new methodology for ranking chemical hazards is proposed and tested. In Task 2, the existing responsibilities of federal, state, and local agencies, as well as the part played by the private sector, are surveyed; legislation at all levels of government is reviewed, and in light of this analysis the role of FEMA is examined; institutional options to new and existing approaches for reducing risk are reevaluated, and recommendations are made for these approaches. In Task 3, technical options are discussed in light of the most hazardous situations, and recommendations are made for action or research where needed, with emphasis on new and emerging technologies in the area. Recommendations are offered regarding actions that would improve FEMA's preparation, training, mitigation, and response with respect to releases of hazardous chemicals. 180 refs., 13 figs., 34 tabs.

  19. Hazardous drinking-related characteristics of depressive disorders in Korea: the CRESCEND study.

    PubMed

    Park, Seon-Cheol; Lee, Sang Kyu; Oh, Hong Seok; Jun, Tae-Youn; Lee, Min-Soo; Kim, Jae-Min; Kim, Jung-Bum; Yim, Hyeon-Woo; Park, Yong Chon

    2015-01-01

This study aimed to identify clinical correlates of hazardous drinking in a large cohort of Korean patients with depression. We recruited a total of 402 depressed patients aged > 18 yr from the Clinical Research Center for Depression (CRESCEND) study in Korea. Patients' drinking habits were assessed using the Korean Alcohol Use Disorder Identification Test (AUDIT-K). Psychometric scales, including the HAMD, HAMA, BPRS, CGI-S, SSI-Beck, SOFAS, and WHOQOL-BREF, were used to assess depression, anxiety, overall psychiatric symptoms, global severity, suicidal ideation, social functioning, and quality of life, respectively. We compared demographic and clinical features and psychometric scores between patients with and without hazardous drinking behavior after adjusting for the effects of age and sex. We then performed binary logistic regression analysis to identify independent correlates of hazardous drinking in the study population. Our results revealed that hazardous drinking was associated with current smoking status, history of attempted suicide, greater psychomotor retardation, suicidal ideation, weight loss, and lower hypochondriasis than non-hazardous drinking. The regression model also demonstrated that more frequent smoking, higher levels of suicidal ideation, and lower levels of hypochondriasis were independent correlates of hazardous drinking in depressed patients. In conclusion, depressed patients who are hazardous drinkers experience more severe symptoms and a greater burden of illness than non-hazardous drinkers. In Korea, screening depressed patients for signs of hazardous drinking could help identify subjects who may benefit from comprehensive therapeutic approaches. PMID:25552886
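Before entering a logistic regression, an association such as smoking vs. hazardous drinking is commonly summarized by an odds ratio from a 2x2 table. A minimal sketch using Woolf's log method for the confidence interval; the counts below are invented for illustration, not CRESCEND data:

```python
# Odds ratio and approximate 95% CI from a 2x2 table (hypothetical counts).
import math

def odds_ratio(a, b, c, d):
    """2x2 table: a = exposed cases, b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases."""
    return (a * d) / (b * c)

def ci_95(a, b, c, d):
    """Approximate 95% CI for the odds ratio (Woolf's log method)."""
    log_or = math.log(odds_ratio(a, b, c, d))
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    return math.exp(log_or - 1.96 * se), math.exp(log_or + 1.96 * se)

# e.g. 40 of 90 smokers vs. 30 of 312 non-smokers are hazardous drinkers
or_ = odds_ratio(40, 50, 30, 282)
lo, hi = ci_95(40, 50, 30, 282)
print(round(or_, 2), round(lo, 2), round(hi, 2))
```

A multivariable logistic regression then adjusts such crude odds ratios for the other covariates, which is how the study isolates smoking, suicidal ideation, and hypochondriasis as independent correlates.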

  20. Occupational Health Hazards among Healthcare Workers in Kampala, Uganda

    PubMed Central

    Yu, Xiaozhong; Buregyeya, Esther; Musoke, David; Wang, Jia-Sheng; Halage, Abdullah Ali; Whalen, Christopher; Bazeyo, William; Williams, Phillip; Ssempebwa, John

    2015-01-01

Objective. To assess the occupational health hazards faced by healthcare workers and the measures in place to mitigate them. Methods. We conducted a cross-sectional study utilizing quantitative data collection methods among 200 respondents who worked in 8 major health facilities in Kampala. Results. Overall, 50.0% of respondents reported experiencing an occupational health hazard. Among these, 39.5% experienced biological hazards while 31.5% experienced nonbiological hazards. Predictors for experiencing hazards included not wearing the necessary personal protective equipment (PPE), working overtime, job-related pressures, and working in multiple health facilities. Control measures to mitigate hazards included providing separate areas and containers to store medical waste and providing safety tools and equipment. Conclusion. Healthcare workers in this setting experience several hazards in their workplaces. Associated factors include not wearing all necessary protective equipment, working overtime, experiencing work-related pressures, and working in multiple facilities. Interventions should be instituted to mitigate these hazards. Specifically, PPE supply gaps, job-related pressures, and complacency in adhering to mitigation measures should be addressed. PMID:25802531

  1. Factors in Perception of Tornado Hazard: An Exploratory Study.

    ERIC Educational Resources Information Center

    de Man, Anton; Simpson-Housley, Paul

    1987-01-01

Administered a questionnaire on tornado hazard to 142 adults. Results indicated that subjects' gender and education level were the best predictors of perceived probability of tornado recurrence; that ratings of the severity of potential damage were related to education level; and that gender accounted for a significant percentage of the variance in anxiety...

  2. LAND CLASSIFICATION USED TO SELECT ABANDONED HAZARDOUS WASTE STUDY SITES

    EPA Science Inventory

    The biological effects of hazardous substances in the environment are influenced by climate, physiography, and biota. These factors interact to determine the transport and fate of chemicals, but are difficult to model accurately except for small areas with a large data base. The ...

  3. 44 CFR 78.5 - Flood Mitigation Plan development.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 44 Emergency Management and Assistance 1 2014-10-01 2014-10-01 false Flood Mitigation Plan..., DEPARTMENT OF HOMELAND SECURITY INSURANCE AND HAZARD MITIGATION National Flood Insurance Program FLOOD MITIGATION ASSISTANCE § 78.5 Flood Mitigation Plan development. A Flood Mitigation Plan will articulate...

  4. 44 CFR 78.5 - Flood Mitigation Plan development.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 44 Emergency Management and Assistance 1 2011-10-01 2011-10-01 false Flood Mitigation Plan..., DEPARTMENT OF HOMELAND SECURITY INSURANCE AND HAZARD MITIGATION National Flood Insurance Program FLOOD MITIGATION ASSISTANCE § 78.5 Flood Mitigation Plan development. A Flood Mitigation Plan will articulate...

  5. 44 CFR 78.5 - Flood Mitigation Plan development.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 44 Emergency Management and Assistance 1 2012-10-01 2011-10-01 true Flood Mitigation Plan..., DEPARTMENT OF HOMELAND SECURITY INSURANCE AND HAZARD MITIGATION National Flood Insurance Program FLOOD MITIGATION ASSISTANCE § 78.5 Flood Mitigation Plan development. A Flood Mitigation Plan will articulate...

  6. Plant protection system optimization studies to mitigate consequences of large breaks in the Advanced Neutron Source Reactor

    SciTech Connect

    Khayat, M.I.; March-Leuba, J.

    1993-10-01

    This paper documents some of the optimization studies performed to maximize the performance of the engineered safety features and scram systems to mitigate the consequences of large breaks in the primary cooling system of the Advanced Neutron Source (ANS) Reactor.

  7. Geospatial Approach on Landslide Hazard Zonation Mapping Using Multicriteria Decision Analysis: A Study on Coonoor and Ooty, Part of Kallar Watershed, The Nilgiris, Tamil Nadu

    NASA Astrophysics Data System (ADS)

    Rahamana, S. Abdul; Aruchamy, S.; Jegankumar, R.

    2014-12-01

    Landslides are among the critical natural phenomena that frequently lead to serious problems in hilly areas, resulting in loss of human life and property, as well as severe damage to natural resources. The local geology, with steep slopes coupled with high-intensity rainfall and unplanned human activities, causes many landslides in this region. The study area attracts tourists throughout the year, so preventive measures must be considered. Geospatial multicriteria decision analysis (MCDA) is increasingly used for landslide vulnerability and hazard zonation mapping, since it enables the integration of different data layers with different levels of uncertainty. In the present study, the analytic hierarchy process (AHP) method was used to prepare landslide hazard zones of Coonoor and Ooty, part of the Kallar watershed, The Nilgiris, Tamil Nadu. The study was carried out using remote sensing data, field surveys and geographic information system (GIS) tools. Ten factors that influence landslide occurrence were considered: elevation, slope aspect, slope angle, drainage density, lineament density, soil, precipitation, land use/land cover (LULC), distance from road and NDVI. These factor layers were extracted from the related spatial data, evaluated, and assigned individual factor weights and class weights. The Landslide Hazard Zone Index (LHZI) was then calculated using the MCDA technique, based on the weights and ratings given by the AHP method. The final cumulative map of the study area was categorized into four hazard zones, classified as zones I to IV: 3.56% of the area falls in hazard zone IV, followed by 48.19% in zone III, 43.63% in zone II and 4.61% in zone I. The resulting hazard zone map and the land use/land cover map were overlaid to check the hazard status, and an existing inventory of known landslides within the study area was compared with the resulting vulnerability and hazard zone maps. The landslide hazard zonation map is useful for landslide hazard prevention, mitigation, and improvement to society, and for proper planning of land use and construction in the future.
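The workflow in this abstract reduces to a weighted-overlay calculation: AHP-derived factor weights, a weighted sum of class ratings per map cell (the LHZI), and binning into zones I to IV. A minimal Python sketch of that arithmetic follows; the pairwise judgments, ratings and zone thresholds are illustrative assumptions, not the paper's actual values.

```python
import math

def ahp_weights(pairwise):
    """Approximate AHP priority weights from a pairwise comparison
    matrix using row geometric means (a common eigenvector surrogate)."""
    n = len(pairwise)
    gm = [math.prod(row) ** (1.0 / n) for row in pairwise]
    total = sum(gm)
    return [g / total for g in gm]

def lhzi(class_ratings, factor_weights):
    """Landslide Hazard Zone Index for one map cell: the weighted sum
    of each factor's class rating."""
    return sum(w * r for w, r in zip(factor_weights, class_ratings))

def hazard_zone(index, breaks=(1.5, 2.5, 3.5)):
    """Bin the index into zones I-IV (illustrative thresholds)."""
    zone = 1 + sum(index > b for b in breaks)
    return "zone " + "I II III IV".split()[zone - 1]

# Three hypothetical factors (e.g. slope angle, precipitation, LULC)
# compared pairwise on Saaty's 1-9 scale:
weights = ahp_weights([[1, 3, 5], [1 / 3, 1, 3], [1 / 5, 1 / 3, 1]])
cell_index = lhzi([4, 3, 2], weights)  # ratings of one cell, scale 1-4
```

In a GIS the same weighted sum would be applied per raster cell; here a single cell stands in for the whole overlay.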

  8. Development, Implementation, and Pilot Evaluation of a Model-Driven Envelope Protection System to Mitigate the Hazard of In-Flight Ice Contamination on a Twin-Engine Commuter Aircraft

    NASA Technical Reports Server (NTRS)

    Martos, Borja; Ranaudo, Richard; Norton, Billy; Gingras, David; Barnhart, Billy

    2014-01-01

    Fatal loss-of-control accidents have been directly related to in-flight airframe icing. The prototype system presented in this report directly addresses the need for real-time onboard envelope protection in icing conditions. The combination of prior information and real-time aerodynamic parameter estimation is shown to provide sufficient information for determining safe limits of the flight envelope during in-flight icing encounters. The Icing Contamination Envelope Protection (ICEPro) system was designed and implemented to identify degradations in airplane performance and flying qualities resulting from ice contamination and to provide safe flight-envelope cues to the pilot. The utility of the ICEPro system for mitigating a potentially hazardous icing condition was evaluated by 29 pilots using the NASA Ice Contamination Effects Flight Training Device. Results showed that real-time assessment cues were effective in reducing the number of potentially hazardous upset events and in lessening exposure to loss of control following an incipient upset condition. Pilot workload with the added ICEPro displays was not measurably affected, but pilot opinion surveys showed that real-time cueing greatly improved their awareness of a hazardous aircraft state. The performance of the ICEPro system was further evaluated under various levels of sensor noise and atmospheric turbulence.

  9. Nonpoint-Source Agricultural Hazard Index: A Case Study of the Province of Cremona, Italy

    NASA Astrophysics Data System (ADS)

    Trevisan, Marco; Padovani, Laura; Capri, Ettore

    2000-11-01

    This paper reports the results of a study aimed at evaluating the hazard level of farming activities in the province of Cremona, Italy, with particular reference to groundwater. The applied methodology employs a parametric approach based on the definition of potential hazard indexes (nonpoint-source agricultural hazard indexes, NPSAHI). Two categories of parameters were considered: the hazard factors (HF), which represent all farming activities that cause or might cause an impact on groundwater (use of fertilizers and pesticides, application of livestock and poultry manure, food industry wastewater, and urban sludge), and the control factors (CF), which adapt the hazard factor to the characteristics of the site (geographical location, slope, agronomic practices, and type of irrigation). The hazard index (HI) is calculated by multiplying the hazard factors by the control factors, and, finally, the NPSAHI are obtained by dividing the HI values into classes on a percentile basis using a scale ranging from 1 to 10. Organization, processing, and display of all data layers were performed using the geographical information system (GIS) ArcView and its Spatial Analyst extension. Results show that the potential hazard of groundwater pollution by farming activities in the province of Cremona falls mainly in the fifth class (very low hazard).
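The index arithmetic described above (HI = HF × CF per mapping unit, then NPSAHI classes 1-10 assigned on a percentile basis) can be sketched as follows; all numbers are illustrative, not the Cremona study's data.

```python
import math

def hazard_index(hf, cf):
    """Hazard index: the hazard factor scaled by the site's control factor."""
    return hf * cf

def npsahi_classes(hi_values):
    """Assign each mapping unit a class 1..10 from the percentile rank
    of its HI among all units (higher HI -> higher class)."""
    n = len(hi_values)
    ordered = sorted(hi_values)
    classes = []
    for hi in hi_values:
        rank = sum(v <= hi for v in ordered) / n  # percentile rank in (0, 1]
        classes.append(min(10, max(1, math.ceil(rank * 10))))
    return classes

# Three hypothetical units: (HF, CF) pairs
his = [hazard_index(hf, cf) for hf, cf in [(10, 0.5), (20, 1.0), (5, 0.2)]]
```

Under this scheme, units with HI values of 5, 20 and 1 fall into classes 7, 10 and 4 respectively; the classing is relative, so the same HI can land in a different class with a different population of units.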

  10. Accuracy Assessment Study of UNB3m Neutral Atmosphere Model for Global Tropospheric Delay Mitigation

    NASA Astrophysics Data System (ADS)

    Farah, Ashraf

    2015-12-01

    Tropospheric delay is the second major source of error after the ionospheric delay for satellite navigation systems. The transmitted signal can experience a tropospheric delay of over 2 m at zenith and 20 m at satellite elevation angles of 10 degrees and below. Positioning errors of 10 m or greater can result from inaccurate mitigation of the tropospheric delay. Many techniques are available for tropospheric delay mitigation, comprising surface meteorological models and global empirical models. Surface meteorological models require surface meteorological data to achieve high-accuracy mitigation, while global empirical models do not. Several hybrid neutral atmosphere delay models have been developed by researchers at the University of New Brunswick (UNB), Canada, over the past decade or so. The most widely applicable current version is UNB3m, which uses the Saastamoinen zenith delays, Niell mapping functions, and a look-up table with annual mean and amplitude for temperature, pressure, and water vapour pressure varying with respect to latitude and height. This paper presents an assessment study of the behaviour of the UNB3m model compared with highly accurate IGS tropospheric estimates for three IGS stations at different latitudes and heights. The study was performed over four nonconsecutive weeks in different seasons of one year (October 2014 to July 2015). It can be concluded that the UNB3m model gives a tropospheric delay correction accuracy of 0.050 m on average for low-latitude regions in all seasons. The model's accuracy is about 0.075 m for mid-latitude regions, while its highest accuracy is about 0.014 m for high-latitude regions.
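UNB3m builds its zenith delays on the Saastamoinen model, with meteorological inputs interpolated from its look-up table. The standard Saastamoinen zenith delay formulas can be sketched as below (constants follow the commonly cited formulation); the look-up table and the Niell mapping functions are not reproduced here.

```python
import math

def saastamoinen_zhd(pressure_hpa, lat_rad, height_km):
    """Zenith hydrostatic delay in metres from surface pressure,
    latitude and orthometric height (Saastamoinen model)."""
    return 0.0022768 * pressure_hpa / (
        1.0 - 0.00266 * math.cos(2.0 * lat_rad) - 0.00028 * height_km)

def saastamoinen_zwd(temp_k, vapour_pressure_hpa):
    """Zenith wet delay in metres from surface temperature and
    water vapour pressure (Saastamoinen model)."""
    return 0.002277 * (1255.0 / temp_k + 0.05) * vapour_pressure_hpa

# Sea-level standard atmosphere at 45 deg latitude:
zhd = saastamoinen_zhd(1013.25, math.radians(45.0), 0.0)
zwd = saastamoinen_zwd(288.15, 11.75)
```

With standard sea-level pressure the hydrostatic term alone comes out near 2.3 m, consistent with the "over 2 m at zenith" figure quoted in the abstract.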

  11. A Study on Integrated Community Based Flood Mitigation with Remote Sensing Technique in Kota Bharu, Kelantan

    NASA Astrophysics Data System (ADS)

    'Ainullotfi, A. A.; Ibrahim, A. L.; Masron, T.

    2014-02-01

    This study is conducted to establish a community-based flood management system integrated with remote sensing techniques. To understand local knowledge, the demographics of the local society are obtained using a survey approach. The local authorities are approached first to obtain information about the society in the study areas, such as the population, gender, and the tabulation of settlement. Information about age, religion, ethnicity, occupation, and years of experience facing floods in the area is recorded to understand how local knowledge emerges. Geographic data are then obtained, such as rainfall, land use, land elevation, and river discharge, and used to establish a hydrological model of flooding in the study area. Analyses of the survey data are made to understand the patterns of society and how people react to floods, while analyses of the geographic data are used to assess the water extent and the damage done by floods. The final result of this research is a flood mitigation method with a community-based framework for the state of Kelantan. With flood mitigation that combines the community's understanding of floods with remote sensing techniques to forecast heavy rainfall and flood occurrence, it is hoped that the casualties and damage that floods might cause to society and infrastructure in the study area can be reduced.

  12. Landslide hazard analysis - a case study in WuShe reservoir catchment

    NASA Astrophysics Data System (ADS)

    Huang, C. M.

    2014-12-01

    A complete landslide inventory covering a long time span is the foundation for landslide hazard analysis. Previous studies could only estimate landslide susceptibility because sufficient landslide inventories were usually unavailable. This study collects SPOT images of ten events from 1994 to 2009 in the WuShe reservoir catchment in central Taiwan. All landslide inventories were manually interpreted from the SPOT images. New and enlarged landslides caused by each event were identified by comparing the pre- and post-event inventories, and these event-based inventories were used to conduct the landslide hazard evaluation. This study follows the landslide hazard analysis framework of CNR-IRPI in Italy, which demonstrated how the spatial probability, temporal probability and size probability of each slope unit are calculated. Landslide hazard in the WuShe reservoir catchment was obtained following this procedure. The preliminary validation shows that the prediction in the high-hazard region was better than that of landslide susceptibility alone. However, the reliability of the temporal probability and size probability depends on the number of landslide inventories: when no landslide was recorded, or a suitable inventory was lacking, the hazard analysis could be incorrect. The event-based spatial probability was affected by rainfall distribution, but the rainfall distributions of different typhoons were usually highly variable. The next phase of this study will attempt to address these issues and develop a suitable framework for landslide hazard analysis in Taiwan.
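The framework cited above combines, for each slope unit, a spatial probability (susceptibility), a temporal probability, and a size probability into a single hazard value. A hedged sketch of that combination follows, assuming a Poisson temporal model; the model choice and all numbers are illustrative, not values from the WuShe analysis.

```python
import math

def temporal_probability(mean_recurrence_years, horizon_years):
    """Poisson probability of at least one landslide event occurring
    in the slope unit within the forecast horizon."""
    return 1.0 - math.exp(-horizon_years / mean_recurrence_years)

def slope_unit_hazard(p_spatial, mean_recurrence_years, horizon_years, p_size):
    """Joint hazard for one slope unit, assuming independence of the
    three component probabilities."""
    return (p_spatial
            * temporal_probability(mean_recurrence_years, horizon_years)
            * p_size)

# Hypothetical unit: susceptibility 0.8, one event per 10 years on
# average, 10-year horizon, probability 0.5 of exceeding the size of interest
h = slope_unit_hazard(0.8, 10.0, 10.0, 0.5)
```

The product form makes the inventory dependence in the abstract concrete: with no recorded events, neither the recurrence interval nor the size distribution can be estimated, and the hazard value is undefined rather than zero.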

  13. Water Induced Hazard Mapping in Nepal: A Case Study of East Rapti River Basin

    NASA Astrophysics Data System (ADS)

    Neupane, N.

    2010-12-01

    This paper presents a typical water-induced hazard mapping of the East Rapti River Basin under the DWIDP, GON. The basin covers an area of 2398 sq km. The methodology includes making a base map of water-induced disasters in the basin. Landslide hazard maps were prepared by the SINMAP approach. Debris flow hazard maps were prepared by considering geology, slope, and saturation. Flood hazard maps were prepared using two approaches: HEC-RAS and satellite imagery interpretation. The composite water-induced hazard maps were produced by compiling the hazards rendered by landslides, debris flows, and floods. The monsoon average rainfall in the basin is 1907 mm, whereas the maximum 24-hour precipitation is 456.8 mm. The peak discharge of the Rapti River in 1993 at the gauging station was 1220 cu m/sec; this discharge nearly corresponds to the discharge of a 100-year return period. The landslides, floods, and debris flows triggered by the heavy rain of July 1993 claimed 265 lives, affected 148516 people, and damaged 1500 houses in the basin. The field investigation and integrated GIS interpretation showed that the very high and high landslide hazard zones collectively cover 38.38% of the area, the debris flow hazard zone constitutes 6.58%, and the high flood hazard zone occupies 4.28% of the watershed. Mitigation measures are recommended according to an Integrated Watershed Management Approach, under which non-structural and structural measures are proposed. The non-structural measures include disaster management training, formulation of an evacuation system (arrangement of an information plan about disasters), agricultural management practices, protection of water sources, slope protection and removal of excessive bed load from the river channel. Similarly, structural measures such as dikes, spurs, rehabilitation of existing preventive measures and river training at some locations are recommended.
The major factors that have contributed to induce high incidences of various types of mass movements and inundation in the basin are rock and soil properties, prolonged and high-intensity rainfall, steep topography and various anthropogenic factors.

  14. Hazardous waste cleanup: A case study for developing efficient programs

    SciTech Connect

    Elcock, D.; Puder, M.G.

    1995-06-01

    As officials in Pacific Basin Countries develop laws and policies for cleaning up hazardous wastes, experiences of countries with such instruments in place may be instructive. The United States has addressed cleanups of abandoned hazardous waste sites through the Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA). The US Congress enacted CERCLA in 1980. The task of cleaning up waste sites became larger and more costly than originally envisioned and as a result, Congress strengthened and expanded CERCLA in 1986. Today, many industry representatives, environmentalists, and other interested parties say the program is still costly and ineffective, and Congress is responding through a reauthorization process to change the law once again. Because the law and modifications to it can affect company operations and revenues, industries want to know the potential consequences of such changes. Argonne National Laboratory (ANL) recently developed a baseline for one economic sector -- the US energy industry -- against which impacts of proposed changes to CERCLA could be measured. Difficulties encountered in locating and interpreting the data for developing that baseline suggest that legislation should not only provide for meeting its stated goals (e.g., protection of human health and the environment) but also allow for its efficient evaluation over time. This lesson can be applied to any nation contemplating hazardous waste cleanup laws and policies.

  15. Observational Studies of Earthquake Preparation and Generation to Mitigate Seismic Risks in Mines

    NASA Astrophysics Data System (ADS)

    Durrheim, R. J.; Ogasawara, H.; Nakatani, M.; Milev, A.; Cichowicz, A.; Kawakata, H.; Yabe, Y.; Murakami, O.; Naoi, M. M.; Moriya, H.; Satoh, T.

    2011-12-01

    We provide a status report on a 5-year project to monitor in-situ fault instability and strong ground motion in South African gold mines. The project has two main aims: (1) to learn more about earthquake preparation and generation mechanisms by deploying dense arrays of high-sensitivity sensors within rock volumes where mining is likely to induce significant seismic activity; and (2) to upgrade the South African national surface seismic network in the mining districts. This knowledge will contribute to efforts to upgrade schemes of seismic hazard assessment and to limit and mitigate the seismic risks in deep mines. As of 31 July 2011, 46 boreholes totalling 1.9 km in length had been drilled at project sites at the Ezulwini, Moab-Khotsong and Driefontein gold mines. Several dozen more holes are still to be drilled. Acoustic emission sensors, strain- and tiltmeters, and controlled seismic sources are being installed to monitor the deformation of the rock mass, the accumulation of damage during the preparation phase, and changes in dynamic stress as the rupture front propagates. These data will be integrated with measurements of stope closure, stope strong motion, seismic data recorded by the mine-wide network, and stress modelling. Preliminary results will be reported at the AGU meeting. The project is endorsed by the Japan Science and Technology Agency (JST), the Japan International Cooperation Agency (JICA) and the South African government. It is funded by the JST-JICA program for Science and Technology Research Partnership for Sustainable Development (SATREPS), the Council for Scientific and Industrial Research (CSIR), the Council for Geoscience, the University of the Witwatersrand and the Department of Science and Technology. The contributions of Seismogen CC, OHMS Ltd, the AngloGold Ashanti Rock Engineering Applied Research Group, First Uranium, the Gold Fields Seismic Department and the Institute of Mine Seismology are gratefully acknowledged.

  16. Hazard mitigation related to water and sediment fluxes in the Yellow River basin, China, based on comparable basins of the United States

    USGS Publications Warehouse

    Osterkamp, W.R.; Gray, J.R.

    2003-01-01

    The Yellow River, north-central China, and comparative rivers of the western United States, the Rio Grande and the Colorado River, derive much of their flows from melting snow at high elevations, but derive most of their sediment loads from semiarid central parts of the basins. The three rivers are regulated by large reservoirs that store water and sediment, causing downstream channel scour and, farther downstream, flood hazard owing to re-deposition of sediment. Potential approaches to reducing continuing bed aggradation and increasing flood hazard along the lower Yellow River include flow augmentation, retirement of irrigation that decreases flows and increases erosion, and re-routing of the middle Yellow River to bypass large sediment inputs of the Loess Plateau.

  17. Study of the ELM fluctuation characteristics during the mitigation of type-I ELMs

    NASA Astrophysics Data System (ADS)

    Bogomolov, A. V.; Classen, I. G. J.; Boom, J. E.; Donné, A. J. H.; Wolfrum, E.; Fischer, R.; Viezzer, E.; Schneider, P.; Manz, P.; Suttrop, W.; Luhmann, N. C., Jr.

    2015-08-01

    Transitions from type-I to small edge localized modes (ELMs) and back are studied using the electron cyclotron emission imaging (ECEI) diagnostic on ASDEX Upgrade (AUG). ECEI measurements show that the average poloidal velocity of the temperature fluctuations of both type-I ELM onsets and small ELMs is the same, close to 5-6 km/s. Radially, the temperature fluctuations are distributed in the same narrow 2 cm region, 0.975 ≤ ρ_pol ≤ 1.025, with associated poloidal mode numbers m = 96 ± 18 and toroidal mode numbers n = 16 ± 4. The observed fluctuations related to both type-I ELMs and small ELMs vary simultaneously over the transition, while showing slightly different behaviour. The similarities between type-I ELMs and small ELMs observed on AUG suggest that they have the same nature and evolve together. In the transition phase a temperature fluctuation mode (inter-ELM mode) appears, which becomes continuous in the mitigated ELM phase and might cause the ELM mitigation. The mode characteristics (velocity, frequency and wave number) obtained in the analysis can be used for direct comparison with various code simulations.

  18. Methane emissions from landfills in Serbia and potential mitigation strategies: a case study.

    PubMed

    Stanisavljevic, Nemanja; Ubavin, Dejan; Batinic, Bojan; Fellner, Johann; Vujic, Goran

    2012-10-01

    Open dumping and landfilling have represented the predominant methods of waste management in Serbia during the past decades. This practice has resulted in over 3600 waste disposal sites distributed all over the country. The locations of the sites and their characteristics were determined in the framework of the presented study. The vast majority of disposal sites (up to 3300) are characterized by a small waste deposition depth and total waste volumes of less than 10,000 m3. Only about 50 landfills in Serbia contain more than 100,000 m3 of waste. These large landfills are responsible for more than 95% of the total CH4 emissions from waste disposal, assessed as 60,000 tons of CH4 in 2010. The evaluation of different measures [soil cover, compost cover and landfill gas (LFG) systems] for mitigating greenhouse gas emissions from Serbian landfills indicated that enhanced microbial CH4 oxidation (using a compost cover), as well as the installation of LFG systems, could generate net revenues, as saved CH4 emissions are creditable under the European Greenhouse Gas Emissions Trading Scheme. In total, between 4 and 7 million tons of CO2-equivalent emissions could be avoided within the next 20 years by mitigating CH4 emissions from Serbian landfills. PMID:22751946

  19. Study on mitigation of pulsed heat load for ITER cryogenic system

    NASA Astrophysics Data System (ADS)

    Peng, N.; Xiong, L. Y.; Jiang, Y. C.; Tang, J. C.; Liu, L. Q.

    2015-03-01

    One of the key requirements for the ITER cryogenic system is the mitigation of the pulsed heat load deposited in the magnet system by magnetic field variation and pulsed DT neutron production. As one of the control strategies, bypass valves of the Toroidal Field (TF) case helium loop are adjusted to mitigate the pulsed heat load to the LHe plant. A quasi-3D time-dependent thermal-hydraulic analysis of the TF winding packs and TF case has been performed to study the behaviour of the TF magnets during the reference plasma scenario, with pulses of 400 s burn and a repetition time of 1800 s. The model is based on a 1D helium flow and quasi-3D solid heat conduction model. The whole TF magnet is simulated, taking into account thermal conduction between the winding pack and the case, which are cooled separately. The heat loads are given as input: AC losses in the conductor, eddy current losses in the structure, thermal radiation, thermal conduction and nuclear heating. The simulation results indicate that the temperature variation of the TF magnet stays within the allowable range when the smooth control strategy is active.

  20. Hazardous Waste/Mixed Waste Treatment Building throughput study

    SciTech Connect

    England, J.L.; Kanzleiter, J.P.

    1991-12-18

    The hazardous waste/mixed waste (HW/MW) Treatment Building (TB) is the specified treatment location for solid hazardous waste/mixed waste at SRS. This report provides throughput information on the facility based on known and projected waste generation rates. The HW/MW TB will have an annual waste input for the first four years of approximately 38,000 ft{sup 3} and an annual treated waste output of approximately 50,000 ft{sup 3}. After the first four years of operation it will have an annual waste input of approximately 16,000 ft{sup 3} and an annual waste output of approximately 18,000 ft{sup 3}. There are several waste streams that cannot be accurately predicted (e.g., environmental restoration, decommissioning, and decontamination); the equipment and process area sizing for the initial four years should allow excess processing capability for these poorly defined waste streams. A treatment process description and a process flow of the waste are included to aid in understanding the throughput computations. A description of the treated wastes is also included.

  1. Landslide hazard evaluation: a review of current techniques and their application in a multi-scale study, Central Italy

    NASA Astrophysics Data System (ADS)

    Guzzetti, Fausto; Carrara, Alberto; Cardinali, Mauro; Reichenbach, Paola

    1999-12-01

    In recent years, growing population and expansion of settlements and life-lines over hazardous areas have largely increased the impact of natural disasters both in industrialized and developing countries. Third-world countries have difficulty meeting the high costs of controlling natural hazards through major engineering works and rational land-use planning. Industrialized societies are increasingly reluctant to invest money in structural measures that can reduce natural risks. Hence, the new issue is to implement warning systems and land utilization regulations aimed at minimizing the loss of lives and property without investing in long-term, costly projects of ground stabilization. Government and research institutions worldwide have long attempted to assess landslide hazard and risk and to portray their spatial distribution in maps. Several different methods for assessing landslide hazard have been proposed or implemented, but the reliability of these maps and the criteria behind these hazard evaluations are ill-formalized or poorly documented. Geomorphological information remains largely descriptive and subjective. It is, hence, somewhat unsuitable to engineers, policy-makers or developers when planning land resources and mitigating the effects of geological hazards. In the Umbria and Marche Regions of Central Italy, attempts at testing the proficiency and limitations of multivariate statistical techniques and of different methodologies for dividing the territory into areas suitable for landslide hazard assessment have been completed, or are in progress, at various scales. These experiments showed that, despite the operational and conceptual limitations, landslide hazard assessment may indeed constitute a suitable, cost-effective aid to land-use planning. Within this framework, engineering geomorphology may play a renewed role in assessing areas at high landslide hazard, and helping mitigate the associated risk.

  2. Methodological framework for the probabilistic risk assessment of multi-hazards at a municipal scale: a case study in the Fella river valley, Eastern Italian Alps

    NASA Astrophysics Data System (ADS)

    Hussin, Haydar; van Westen, Cees; Reichenbach, Paola

    2013-04-01

    Local and regional authorities in mountainous areas that deal with hydro-meteorological hazards like landslides and floods try to set aside budgets for emergencies and risk mitigation. However, future losses are often not calculated in a probabilistic manner when allocating budgets or determining how much risk is acceptable. The absence of probabilistic risk estimates can create a lack of preparedness for reconstruction and risk reduction costs and a deficiency in promoting risk mitigation and prevention in an effective way. The probabilistic risk of natural hazards at the local scale is usually ignored altogether due to the difficulty of acknowledging, processing and incorporating uncertainties in the estimation of losses (e.g. physical damage, fatalities and monetary loss). This study attempts to set up a working framework for a probabilistic risk assessment (PRA) of landslides and floods at a municipal scale, using the Fella river valley (Eastern Italian Alps) as a multi-hazard case study area. The emphasis is on the evaluation and determination of the uncertainty in the estimation of losses from multi-hazards. Carrying out this framework requires several steps: (1) using physically based stochastic landslide and flood models, we calculate the probability of the physical impact on individual elements at risk; (2) this is combined with a statistical analysis of the vulnerability and monetary value of the elements at risk in order to include their uncertainty in the risk assessment; (3) finally, the uncertainty from each risk component is propagated into the loss estimation. The combined effect of landslides and floods on the direct risk to communities in narrow alpine valleys is also one of the important aspects that need to be studied.

  3. ASSESSMENT OF HAZARDOUS WASTE SURFACE IMPOUNDMENT TECHNOLOGY CASE STUDIES AND PERSPECTIVES OF EXPERTS

    EPA Science Inventory

    The available data were gathered for a large number of case studies of hazardous waste surface impoundments (SI). Actual and projected performances were compared. This collection, analysis and dissemination of the accumulated experience can contribute significantly to improving S...

  4. A study of shock mitigating materials in a split Hopkinson bar configuration. Phase 2

    SciTech Connect

    Bateman, V.I.; Brown, F.A.; Hansen, N.R.

    1997-12-31

    Sandia National Laboratories (SNL) designs mechanical systems with electronics that must survive high-shock environments. These mechanical systems include penetrators that must survive soil and rock penetration, nuclear transportation casks that must survive transportation environments, and laydown weapons that must survive delivery impact. These mechanical systems contain electronics that may operate during and after the high-shock environment and that must be protected from it. A study has been started to improve the packaging techniques for the advanced electronics utilized in these mechanical systems, because current packaging techniques are inadequate for these sensitive electronics. In many cases, it has been found that the packaging techniques currently used not only fail to mitigate the shock environment but actually amplify it. An ambitious goal for this packaging study is to avoid amplification and possibly attenuate the shock environment before it reaches the electronics contained in the various mechanical systems. Here, a study comparing two thicknesses, 0.125 and 0.250 in., of five materials (GE RTV 630, HS II Silicone, Polysulfide Rubber, Sylgard 184, and Teflon) for their shock-mitigating characteristics in a split Hopkinson bar configuration has been completed. The five materials were tested in both unconfined and confined conditions at ambient temperature with two applied loads: 750 µε peak (25 fps peak) and 1500 µε peak (50 fps peak), each with a 100 µs duration measured at 10% amplitude. The five materials were also tested at ambient, cold (-65 °F), and hot (+165 °F) temperatures in the unconfined condition with the 750 µε peak (25 fps peak) applied load.
Time domain and frequency domain analyses of the split Hopkinson bar data have been performed to compare how these materials lengthen the shock pulse, attenuate the shock pulse, reflect high frequency content in the shock pulse, and transmit energy.

  5. Melatonin mitigates cerebral vasospasm after experimental subarachnoid hemorrhage: a study of synchrotron radiation angiography

    NASA Astrophysics Data System (ADS)

    Cai, J.; He, C.; Chen, L.; Han, T.; Huang, S.; Huang, Y.; Bai, Y.; Bao, Y.; Zhang, H.; Ling, F.

    2013-06-01

    Cerebral vasospasm (CV) after subarachnoid hemorrhage (SAH) is a devastating and unsolved clinical issue. In this study, rat models in which SAH had been induced by prechiasmatic cistern injection were treated with melatonin. Synchrotron radiation angiography (SRA) was employed to detect and evaluate CV in the animal models, while neurological scoring and histological examinations were used to assess the neurological deficits and CV as well. Using SRA techniques and histological analyses, the anterior cerebral artery diameters of SAH rats treated with melatonin were larger than those of untreated rats (p < 0.05), and the neurological deficits of SAH rats treated with melatonin were smaller than those of untreated rats (p < 0.05). We conclude that SRA is a precise in vivo tool to observe and evaluate CV in SAH rats, and that intraperitoneal administration of melatonin can mitigate CV after experimental SAH.

  6. BICAPA case study of natural hazards that trigger technological disasters

    NASA Astrophysics Data System (ADS)

    Boca, Gabriela; Ozunu, Alexandru; Nicolae Vlad, Serban

    2010-05-01

    Industrial facilities are vulnerable to natural disasters. Natural disasters and technological accidents are not always singular or isolated events. The example in this paper shows that they can occur in complex combinations and/or in rapid succession, known as NaTech disasters, thereby triggering multiple impacts. This analysis indicates that NaTech disasters have the potential to trigger hazmat releases and other types of technological accidents. Climate change plays an important role in the prevalence and triggering mechanisms of NaTech events. Projections under the IPCC IS92a scenario (similar to SRES A1B; IPCC, 1992) and two GCMs indicate that the risk of floods increases in central and eastern Europe, and an increase in intense short-duration precipitation is likely to lead to increased risk of flash floods (Lehner et al., 2006). It is urgent to develop tools for assessing the risks of NaTech events in industrial processes, in a framework that starts with characterizing the frequency and severity of natural disasters and continues with complex analysis of industrial processes, risk assessment, and residual functionality analysis. Ponds holding dangerous technological residues are the most vulnerable targets of natural hazards. Technological accidents such as those in Baia Mare (January to March 2000) had an important international echo. Extreme weather phenomena, like those in the winter of 2000 in Baia Mare, and other natural disasters such as floods or earthquakes, could cause a similar disaster at Târnăveni in the Transylvania Depression. Between 1972 and 1978, three decanting ponds were built on the Chemical Platform Târnăveni, now SC BICAPA SA, for disposal of the hazardous wastes resulting from the manufacture of sodium dichromate, inorganic salts, sludge from wastewater purification and filtration, and wet gas production from carbide. The ponds are located on the right bank of the river Târnava, about 35-50 m from the flood defense dam.
The total amount of toxic waste stored in the three ponds is about 2,500 tons, equivalent to 128 tons expressed as hexavalent chromium. The ponds' contour dikes are strongly damaged in many places; their safety is jeopardized by leakages, sliding slopes, and ravines. The upstream dike has an increased failure risk: in that section the safety coefficients are below the allowable limit, both under static loading and under earthquake loading. The risk of failure is also very high because of the dike slopes, and it becomes higher still in case of heavy rainfall, floods, or an earthquake.

  7. Multihazard risk analysis and disaster planning for emergency services as a basis for efficient provision in the case of natural hazards - case study municipality of Au, Austria

    NASA Astrophysics Data System (ADS)

    Maltzkait, Anika; Pfurtscheller, Clemens

    2014-05-01

    The extreme flood events of 2002, 2005 and 2013 in Austria underlined the importance of local emergency services being able to withstand and reduce the adverse impacts of natural hazards. Although municipal emergency and crisis management plans exist in Austria for legal reasons, they mostly do not cover risk analyses of natural hazards, that is, a sound, comparable assessment to identify and evaluate risks. Moreover, total losses and operational emergencies triggered by natural hazards have increased in recent decades. Given sparse public funds, objective budget decisions are needed to ensure the efficient provision of operating resources, such as personnel, vehicles and equipment, in the case of natural hazards. We present a case study of the municipality of Au, Austria, which was severely affected by the 2005 floods. Our approach is primarily based on a qualitative risk analysis combining existing hazard plans, GIS data, field mapping and data on the operational efforts of the fire departments. The risk analysis includes a map of phenomena discussed in a workshop with local experts, as well as a list of risks and a risk matrix prepared at that workshop. On this basis, the exact requirements for technical and non-technical mitigation measures for each natural hazard risk were analysed in close collaboration with members of the municipal operation control and members of the local emergency services (fire brigade, Red Cross). The measures include warning, evacuation, and technical interventions with heavy equipment and personnel.
These results are used, first, to improve the municipal emergency and crisis management plan by providing a risk map and a list of risks and, second, to check whether the local emergency forces can cope with the different risk scenarios using locally available resources. The emergency response plans will identify possible resource deficiencies in personnel, vehicles and equipment. Because qualitative methods and data are used, uncertainties emerged in defining safety targets, in constructing the different risk scenarios, in the inherent uncertainty about the probability of occurrence and intensity of natural hazards, and in the expected losses. Finally, we used available studies and expert interviews to develop objective rules for investment decisions by the fire departments and the Red Cross, providing an empirically sound basis for the efficient provision of intervention in the case of natural hazards for the municipality of Au. Again, the regulations for objective provision were developed in close collaboration with the emergency services.

  8. Echo-sounding method aids earthquake hazard studies

    USGS Publications Warehouse

    U.S. Geological Survey

    1995-01-01

    Dramatic examples of catastrophic damage from an earthquake occurred in 1989, when the M 7.1 Loma Prieta earthquake rocked the San Francisco Bay area, and in 1994, when the M 6.6 Northridge earthquake jolted southern California. The surprising amount and distribution of damage to private property and infrastructure emphasizes the importance of seismic-hazard research in urbanized areas, where the potential for damage and loss of life is greatest. During April 1995, a group of scientists from the U.S. Geological Survey and the University of Tennessee, using an echo-sounding method described below, is collecting data in San Antonio Park, California, to examine the Monte Vista fault which runs through this park. The Monte Vista fault in this vicinity shows evidence of movement within the last 10,000 years or so. The data will give them a "picture" of the subsurface rock deformation near this fault. The data will also be used to help locate a trench that will be dug across the fault by scientists from William Lettis & Associates.

  9. 44 CFR 78.6 - Flood Mitigation Plan approval process.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 44 Emergency Management and Assistance 1 2014-10-01 2014-10-01 false Flood Mitigation Plan..., DEPARTMENT OF HOMELAND SECURITY INSURANCE AND HAZARD MITIGATION National Flood Insurance Program FLOOD MITIGATION ASSISTANCE § 78.6 Flood Mitigation Plan approval process. The State POC will forward all...

  10. 44 CFR 78.6 - Flood Mitigation Plan approval process.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 44 Emergency Management and Assistance 1 2011-10-01 2011-10-01 false Flood Mitigation Plan..., DEPARTMENT OF HOMELAND SECURITY INSURANCE AND HAZARD MITIGATION National Flood Insurance Program FLOOD MITIGATION ASSISTANCE § 78.6 Flood Mitigation Plan approval process. The State POC will forward all...

  11. 44 CFR 78.6 - Flood Mitigation Plan approval process.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 44 Emergency Management and Assistance 1 2010-10-01 2010-10-01 false Flood Mitigation Plan..., DEPARTMENT OF HOMELAND SECURITY INSURANCE AND HAZARD MITIGATION National Flood Insurance Program FLOOD MITIGATION ASSISTANCE § 78.6 Flood Mitigation Plan approval process. The State POC will forward all...

  12. 44 CFR 78.6 - Flood Mitigation Plan approval process.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 44 Emergency Management and Assistance 1 2013-10-01 2013-10-01 false Flood Mitigation Plan..., DEPARTMENT OF HOMELAND SECURITY INSURANCE AND HAZARD MITIGATION National Flood Insurance Program FLOOD MITIGATION ASSISTANCE § 78.6 Flood Mitigation Plan approval process. The State POC will forward all...

  13. 44 CFR 78.6 - Flood Mitigation Plan approval process.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 44 Emergency Management and Assistance 1 2012-10-01 2011-10-01 true Flood Mitigation Plan approval..., DEPARTMENT OF HOMELAND SECURITY INSURANCE AND HAZARD MITIGATION National Flood Insurance Program FLOOD MITIGATION ASSISTANCE § 78.6 Flood Mitigation Plan approval process. The State POC will forward all...

  14. 44 CFR 78.5 - Flood Mitigation Plan development.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 44 Emergency Management and Assistance 1 2010-10-01 2010-10-01 false Flood Mitigation Plan development. 78.5 Section 78.5 Emergency Management and Assistance FEDERAL EMERGENCY MANAGEMENT AGENCY, DEPARTMENT OF HOMELAND SECURITY INSURANCE AND HAZARD MITIGATION National Flood Insurance Program FLOOD MITIGATION ASSISTANCE § 78.5 Flood Mitigation...

  15. 44 CFR 78.5 - Flood Mitigation Plan development.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 44 Emergency Management and Assistance 1 2013-10-01 2013-10-01 false Flood Mitigation Plan development. 78.5 Section 78.5 Emergency Management and Assistance FEDERAL EMERGENCY MANAGEMENT AGENCY, DEPARTMENT OF HOMELAND SECURITY INSURANCE AND HAZARD MITIGATION National Flood Insurance Program FLOOD MITIGATION ASSISTANCE § 78.5 Flood Mitigation...

  16. Modeling effects of urban heat island mitigation strategies on heat-related morbidity: a case study for Phoenix, Arizona, USA

    NASA Astrophysics Data System (ADS)

    Silva, Humberto R.; Phelan, Patrick E.; Golden, Jay S.

    2010-01-01

    A zero-dimensional energy balance model was previously developed to serve as a user-friendly mitigation tool for practitioners seeking to study the urban heat island (UHI) effect. Accordingly, this established model is applied here to show the relative effects of four common mitigation strategies: increasing the overall (1) emissivity, (2) percentage of vegetated area, (3) thermal conductivity, and (4) albedo of the urban environment in a series of percentage increases by 5, 10, 15, and 20% from baseline values. In addition to modeling mitigation strategies, we present how the model can be utilized to evaluate human health vulnerability from excessive heat-related events, based on heat-related emergency service data from 2002 to 2006. The 24-h average heat index is shown to have the greatest correlation to heat-related emergency calls in the Phoenix (Arizona, USA) metropolitan region. The four modeled UHI mitigation strategies, taken in combination, would lead to a 48% reduction in annual heat-related emergency service calls, where increasing the albedo is the single most effective UHI mitigation strategy.
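The direction of the albedo effect reported above can be illustrated with a minimal zero-dimensional radiative balance. This sketch is NOT the authors' model; the solar forcing, baseline albedo, and emissivity values are illustrative assumptions used only to show why raising albedo lowers the equilibrium surface temperature.

```python
# Minimal steady-state radiative balance: absorbed shortwave = emitted longwave.
# All parameter values are illustrative assumptions, not values from the study.
SIGMA = 5.67e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def equilibrium_temp_k(solar_wm2=800.0, albedo=0.15, emissivity=0.95):
    """Surface temperature T solving (1 - albedo) * S = emissivity * SIGMA * T**4."""
    return (((1.0 - albedo) * solar_wm2) / (emissivity * SIGMA)) ** 0.25

baseline = equilibrium_temp_k()
# Raising albedo 20% from its baseline value (one of the modeled strategies)
# reduces absorbed shortwave and hence the equilibrium temperature
mitigated = equilibrium_temp_k(albedo=0.15 * 1.20)
cooling_k = baseline - mitigated
```

A real UHI model would add conduction into the substrate, evapotranspiration from vegetated area, and convective exchange, which is why the published model treats four mitigation levers rather than albedo alone.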

  17. A combined approach to physical vulnerability of large cities exposed to natural hazards - the case study of Arequipa, Peru

    NASA Astrophysics Data System (ADS)

    Thouret, Jean-Claude; Ettinger, Susanne; Zuccaro, Giulio; Guitton, Mathieu; Martelli, Kim; Degregorio, Daniela; Nardone, Stefano; Santoni, Olivier; Magill, Christina; Luque, Juan Alexis; Arguedas, Ana

    2013-04-01

    Arequipa, the second largest city in Peru with almost one million inhabitants, is exposed to various natural hazards, such as earthquakes, landslides, flash floods, and volcanic eruptions. This study focuses on the vulnerability and response of housing, infrastructure and lifelines in Arequipa to flash floods and eruption-induced hazards, notably lahars from El Misti volcano. We propose a combined approach for assessing physical vulnerability in a large city based on: (1) remote sensing utilizing high-resolution imagery (SPOT5, Google Earth Pro, Bing, Pléïades) to map the distribution and type of land use and the properties of city blocks in terms of exposure to the hazard (elevation above river level, distance to channel, impact angle, etc.); (2) in situ survey of buildings, critical infrastructure (e.g., bridges) and strategic resources (e.g., potable water, irrigation, sewage); (3) information gained from interviews with engineers involved in construction works, previous crises (e.g., the June 2001 earthquake) and risk mitigation in Arequipa. Remote sensing and mapping at the scale of the city has focused on three pilot areas: the perennial Rio Chili valley, which crosses the city and oasis from north to south, and two of its east-margin tributaries, termed Quebrada (ravine): San Lazaro, crossing the northern districts, and Huarangal, crossing the northeastern districts. Sampling of city blocks through these districts provides varying geomorphic, structural, historical, and socio-economic characteristics for each sector. A reconnaissance survey covered about 900 edifices located in 40 city blocks across districts of the pilot areas, distinct in age, construction, land use and demographics. A building acts as a structural system, and its strength and resistance to flash floods and lahars therefore depend highly on the type of construction and the materials used.
Each building surveyed was assigned to one of eight building categories based on physical criteria (dominant building materials, number of floors, percentage and quality of openings, etc.). Future steps in this study include mapping potential impacts from flash floods and lahars as a function of frequency of occurrence and magnitude. For this purpose, we will regroup the eight building types identified in Arequipa into a reduced number of vulnerability categories. Fragility functions will then be established for each vulnerability category and hazard, relating percentage damage to parameters such as flow velocity, depth, and dynamic and hydrostatic pressure. These functions will be applied to flow simulations for each of the three river channels considered, with the final goal of determining potential losses, identifying areas of particularly high risk, and preparing plans for evacuation, relocation and rehabilitation. In the long term, this investigation aims to contribute towards a multi-hazard risk analysis including earthquake and other volcanic hazards, e.g. ashfall and pyroclastic flows, by considering the cascading effects of a hazard chain. We also plan to address the consequences of failure of two artificial lake dams located 40 and 70 km north of the city. A lake-breakout flood or lahar would propagate beyond the city and would call for an immediate response, including contingency plans and evacuation practices.
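The fragility-function step described above might be sketched as follows. The category names, median depths, and dispersions are hypothetical, not values from the Arequipa study; a lognormal form is a common convention for flow-depth fragility, used here purely for illustration.

```python
# Illustrative fragility curves: probability that a damage state is exceeded
# at a given flow depth. All category parameters are hypothetical.
import math

# (median depth in metres at which P(exceedance) = 0.5, lognormal dispersion)
CATEGORIES = {
    "adobe_1_floor": (0.6, 0.5),
    "masonry_2_floor": (1.2, 0.5),
    "reinforced_concrete": (2.0, 0.4),
}

def p_exceed(depth_m, median_m, beta):
    """Lognormal fragility: P(damage state exceeded | flow depth)."""
    if depth_m <= 0.0:
        return 0.0
    z = math.log(depth_m / median_m) / beta
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# A weaker category is more likely to be damaged at the same depth
p_adobe = p_exceed(1.0, *CATEGORIES["adobe_1_floor"])
p_rc = p_exceed(1.0, *CATEGORIES["reinforced_concrete"])
```

In the workflow the abstract outlines, each of the regrouped vulnerability categories would get its own fitted curve, possibly with flow velocity or dynamic pressure in place of depth as the intensity measure.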

  18. Blast effect on the lower extremities and its mitigation: a computational study.

    PubMed

    Dong, Liqiang; Zhu, Feng; Jin, Xin; Suresh, Mahi; Jiang, Binhui; Sevagan, Gopinath; Cai, Yun; Li, Guangyao; Yang, King H

    2013-12-01

    A series of computational studies was performed to investigate the response of the lower extremities of mounted soldiers under landmine detonation. A numerical human body model newly developed at Wayne State University was used to simulate two types of experimental studies, and the model predictions were validated against test data in terms of tibia axial force as well as bone fracture pattern. Based on the validated model, the minimum axial force causing tibia fracture was found. Then a series of parametric studies was conducted to determine the critical velocity (peak velocity of the floor plate) causing tibia fracture at different upper/lower leg angles. In addition, to limit the load transmission through the vehicle floor, two types of energy-absorbing materials, namely IMPAXX(®) foam and aluminum alloy honeycomb, were selected for floor matting. Their performance in terms of blast-effect mitigation was compared using the validated numerical model, and honeycomb was found to be the more efficient material for blast injury prevention under the loading conditions studied. PMID:23973770

  19. Public willingness to pay for CO2 mitigation and the determinants under climate change: a case study of Suzhou, China.

    PubMed

    Yang, Jie; Zou, Liping; Lin, Tiansheng; Wu, Ying; Wang, Haikun

    2014-12-15

    This study explored the factors that influence respondents' willingness to pay (WTP) for CO2 mitigation under climate change. A questionnaire survey combining contingent valuation and psychometric paradigm methods was conducted in the city of Suzhou, Jiangsu Province, China. Respondents' traditional demographic attributes, risk perception of greenhouse gases (GHGs), and attitudes toward the government's risk management practices were analyzed with a Tobit model to identify the determinants. The results showed that about 55% of the respondents refused to pay for CO2 mitigation, and respondents' WTP increased with the CO2 mitigation percentage. Important factors influencing WTP include people's feeling of dread of GHGs, confidence in policy, the timeliness of governmental information disclosure, age, education, and income level. PMID:25151109
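A Tobit model is the natural choice here because many respondents report exactly zero WTP, which censors the latent variable. The sketch below fits a Tobit by maximum likelihood on simulated data; the covariate, coefficients, and sample are invented for illustration and have nothing to do with the Suzhou survey.

```python
# Hypothetical Tobit (censored regression) fit: latent WTP is linear in a
# covariate, but observed WTP is censored at zero for refusals.
# All data and coefficients are simulated, not from the study.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(42)
n = 1000
income = rng.normal(size=n)                                  # standardized covariate
latent = 0.5 + 2.0 * income + rng.normal(scale=1.5, size=n)  # latent WTP
wtp = np.maximum(latent, 0.0)                                # observed, censored at 0

def neg_loglik(params):
    """Negative log-likelihood of the Tobit model, left-censored at zero."""
    b0, b1, log_sigma = params
    sigma = np.exp(log_sigma)                 # parameterized to keep sigma > 0
    mu = b0 + b1 * income
    censored = wtp <= 0
    ll = np.where(censored,
                  norm.logcdf(-mu / sigma),   # P(latent <= 0): refusal
                  norm.logpdf(wtp, loc=mu, scale=sigma))
    return -ll.sum()

res = minimize(neg_loglik, x0=np.zeros(3), method="Nelder-Mead")
b0_hat, b1_hat, sigma_hat = res.x[0], res.x[1], np.exp(res.x[2])
```

With enough observations the estimates recover the true coefficients, whereas an OLS fit of `wtp` on `income` would be biased toward zero by the censored mass at WTP = 0.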

  20. A simulator study investigating how motorcyclists approach side-road hazards.

    PubMed

    Crundall, Elizabeth; Stedmon, Alex W; Saikayasit, Rossukorn; Crundall, David

    2013-03-01

    The most common form of motorcycle collision in the UK occurs when another road user fails to give way and pulls out from a side road in front of an oncoming motorcyclist. While research has considered these collisions from the car driver's perspective, no research to date has addressed how motorcyclists approach these potential hazards. This study conducted a detailed analysis of motorcyclist speed and road position on approach to side-roads in a simulated suburban setting. Novice, Experienced and Advanced riders rode two laps of a simulated route, encountering five side-roads on each lap. On the second lap, a car emerged from the first side-road in a typical 'looked but failed to see' accident scenario. Three Experienced riders and one Novice rider collided with the hazard. The Advanced rider group adopted the safest strategy when approaching side-roads, with a lane position closer to the centre of the road and slower speeds. In contrast, Experienced riders chose faster speeds, often over the speed limit, especially when approaching junctions with good visibility. Rider behaviour at non-hazard junctions was compared between laps, to investigate if riders modified their behaviour after experiencing the hazard. Whilst all riders were generally more cautious after the hazard, the Advanced riders modified their behaviour more than the other groups after the hazard vehicle had pulled out. The results suggest that advanced training can lead to safer riding styles that are not acquired by experience alone. PMID:23182782

  1. Underground Coal-Fires in Xinjiang, China: A Continued Effort in Applying Geophysics to Solve a Local Problem and to Mitigate a Global Hazard

    NASA Astrophysics Data System (ADS)

    Wuttke, M. W.; Halisch, M.; Tanner, D. C.; Cai, Z. Y.; Zeng, Q.; Wang, C.

    2012-04-01

    Spontaneous, uncontrolled coal-seam fires are a well-known phenomenon that causes severe environmental problems and severely impacts natural coal reserves. Coal fires occur worldwide, but especially in Xinjiang, which covers 17.3% of China's area and hosts approximately 42% of its coal resources. In Xinjiang, a rigorous fire-fighting strategy at local and regional scales has been pursued for more than 50 years. The Xinjiang Coalfield Fire Fighting Bureau (FFB) has developed technologies and methods to deal with any known fire. Many fires have already been extinguished, but the problem remains and may even be growing. It is not only a problem for China, due to the loss of valuable energy resources, but also a worldwide threat because of the generation of substantial amounts of greenhouse gases. Through the FFB, China is struggling to overcome this, but the activities could be much enhanced by continuing the already successful joint operations. The last ten years have seen two successful cooperative projects between China and Germany in the field of coal-fire fighting, namely the German Technical Cooperation Project on Coal Fire in Xinjiang and the Sino-German Coal Fire Research Initiative, funded by the corresponding ministries of both countries. A persistent task in fire fighting is the identification and supervision of areas at higher risk of coal-fire ignition, the exploration of already ignited fire zones in order to extinguish the fires, and the monitoring of extinguished fires to detect, as early as possible, processes that may foster re-ignition. This can be achieved by modeling both the structures and the processes involved. This was also a promising part of the past cooperation projects, yet to be transformed into a standard part of fire-fighting procedures.
In this contribution we describe the plans for a new joint project between China and Germany in which, on the basis of field investigations and laboratory measurements, realistic dynamical models of fire zones will be constructed to increase the understanding of particular coal fires, to interpret the surface signatures of a coal fire in terms of location and propagation, and to estimate the output of hazardous exhaust products in order to evaluate the economic benefit of fire extinction.

  2. Studying and Improving Human Response to Natural Hazards: Lessons from the Virtual Hurricane Lab

    NASA Astrophysics Data System (ADS)

    Meyer, R.; Broad, K.; Orlove, B. S.

    2010-12-01

    One of the most critical challenges facing communities in areas prone to natural hazards is how best to encourage residents to invest in individual and collective actions that would reduce the damaging impact of low-probability, high-consequence environmental events. Unfortunately, what makes this goal difficult to achieve is that the relative rarity of natural hazards means that many who face them have no previous experience to draw on when making preparation decisions, or have prior experience that provides misleading guidance on how best to prepare. For example, individuals who have experienced strings of minor earthquakes or near-misses from tropical cyclones may become overly complacent about the risks that extreme events actually pose. In this presentation we report the preliminary findings of a program of work that explores the use of realistic multimedia hazard simulations designed for two purposes: 1) to serve as a basic research tool for studying, in laboratory settings, how individuals make decisions to prepare for rare natural hazards; and 2) to serve as an educational tool for giving people in hazard-prone areas virtual experience in hazard preparation. We demonstrate a prototype simulation in which participants experience the approach of a virtual hurricane and have the opportunity to invest in different kinds of action to protect their home from damage. As the hurricane approaches, participants have access to an "information dashboard" in which they can gather information about the storm threat from a variety of sources, including mock television weather broadcasts, web sites, and conversations with neighbors. In response to this information they can then invest in different levels of protective action.
Some versions of the simulation are designed as games, in which participants are rewarded based on their ability to make the optimal trade-off between under- and over-preparing for the threat. From a basic research perspective, the data provide valuable insights into the dynamics of information gathering prior to hurricane impacts, as well as a laboratory in which we can study how both information gathering and responses vary in response to controlled variations in factors such as the complexity of forecast information. From an applied perspective, the simulations provide an opportunity for residents of hazard-prone areas to learn about different kinds of information and receive feedback on their potential biases prior to an actual encounter with a hazard. The presentation concludes with a summary of some of the basic research findings that have emerged from the hurricane lab to date, as well as a discussion of the prospects for extending the technology to a broad range of environmental hazards.

  3. Economic valuation of flood mitigation services: A case study from the Otter Creek, VT.

    NASA Astrophysics Data System (ADS)

    Galford, G. L.; Ricketts, T.; Bryan, K. L.; ONeil-Dunne, J.; Polasky, S.

    2014-12-01

    The ecosystem services provided by wetlands are widely recognized but difficult to quantify. In particular, estimating the effect of land cover and land use on downstream flood outcomes remains challenging, but it is increasingly important in light of climate change predictions of increased precipitation in many areas. Economic valuation can help incorporate ecosystem services into decisions and enable communities to plan for climate and flood resiliency. Here we estimate the economic value of the Otter Creek wetlands to Middlebury, VT in mitigating the flood that followed Tropical Storm Irene, as well as ten historic floods. Observationally, hydrographs above and below the wetlands indicated that in each storm the wetlands functioned as a temporary reservoir, slowing the delivery of water to Middlebury. We compare observed floods, based on Middlebury's hydrograph, with simulated floods for scenarios without wetlands. To simulate these "without wetlands" scenarios, we assume the same volume of water was delivered to Middlebury, but in a shorter pulse similar to a hydrograph upstream of the wetlands. For scenarios with and without wetlands, we map the spatial extent of flooding using LiDAR digital elevation data. We then estimate flood depth at each affected building and calculate monetary losses as a function of flood depth and house value using established depth-damage relationships. For example, we expect damages equal to 20% of the house's value for a flood depth of two feet in a two-story home with a basement. We define the value of flood mitigation services as the difference in damages between the with- and without-wetlands scenarios, and find that the Otter Creek wetlands reduced flood damage in Middlebury by 88% following Tropical Storm Irene. Using the 10 additional historic floods, we estimate an ongoing mean value of $400,000 in avoided damages per year.
Economic impacts of this magnitude stress the importance of wetland conservation and warrant the consideration of ecosystem services in land-use decisions. Our study indicates that here and elsewhere, green infrastructure may have the potential to increase the resilience of communities to projected changes in climate.
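The depth-damage valuation described above can be sketched in a few lines. The 20%-of-value-at-two-feet point is taken from the abstract; the other curve points, the building value, and the flood depths are illustrative assumptions, not figures from the study.

```python
# Sketch of a depth-damage valuation. Only the (2 ft, 20%) point comes from
# the abstract; every other number is an illustrative assumption.
DEPTH_DAMAGE_FT = [(0.0, 0.00), (2.0, 0.20), (4.0, 0.35), (8.0, 0.55)]

def damage_fraction(depth_ft):
    """Linearly interpolate damage (fraction of building value) from the table."""
    pts = DEPTH_DAMAGE_FT
    if depth_ft <= pts[0][0]:
        return pts[0][1]
    for (d0, f0), (d1, f1) in zip(pts, pts[1:]):
        if depth_ft <= d1:
            return f0 + (f1 - f0) * (depth_ft - d0) / (d1 - d0)
    return pts[-1][1]

def avoided_damages(values, depths_with, depths_without):
    """Flood-mitigation value = damages without wetlands minus damages with them."""
    total = lambda depths: sum(v * damage_fraction(depths[b]) for b, v in values.items())
    return total(depths_without) - total(depths_with)

# Toy example: one $250,000 home, 0.5 ft of water with wetlands, 3 ft without
value = avoided_damages({"house_a": 250_000},
                        {"house_a": 0.5},
                        {"house_a": 3.0})
```

Summing this avoided-damage quantity over every affected building in the with/without flood maps, storm by storm, yields the annual mitigation value the abstract reports.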

  4. Bayesian hierarchical modeling of latent period switching in small-area putative health hazard studies.

    PubMed

    Al-Hadhrami, Ahmed; Lawson, Andrew B

    2011-02-01

    In recent years, small area risk assessment modelling and data analysis around putative hazard sources has become a fundamental part of public health and environmental sciences. In this study, we address a fundamental problem in the analysis of such data, when intermittent operation of facilities could lead to evidence for latent periods of risk. This study examines the development of Bayesian models for the latent switching operating period of putative hazard sources such as nuclear processing plants and waste disposal incinerators. The developed methodology is applied in a simulation study as well as to a real data example. PMID:20798124

  5. Communicating Volcanic Hazards in the North Pacific

    NASA Astrophysics Data System (ADS)

    Dehn, J.; Webley, P.; Cunningham, K. W.

    2014-12-01

    For over 25 years, effective hazard communication has been key to effective mitigation of volcanic hazards in the North Pacific. These hazards are omnipresent, with a large event happening in Alaska every few years to a decade, though in many cases events occur with little or no warning (e.g., Kasatochi and Okmok in 2008). Here a useful hazard mitigation strategy has been built on (1) a large database of historic activity from many datasets, (2) an operational alert system with graduated levels of concern, (3) scenario planning, and (4) routine checks and communication with emergency managers and the public. These baseline efforts are then enhanced in times of crisis with coordinated talking points, targeted studies, and public outreach. Scientists naturally tend to target other scientists as their audience, whereas in monitoring hazards that may occur only on yearly to decadal timescales, details can distract from the essential information. Creating talking points and practicing public communication can help make hazard response part of the culture. Promoting situational awareness and familiarity can relieve indecision and concern at the time of a crisis.

  6. Assessment of environmental impact on air quality by cement industry and mitigating measures: a case study.

    PubMed

    Kabir, G; Madugu, A I

    2010-01-01

    In this study, the environmental impact on air quality of a typical cement industry in Nigeria was evaluated. The air pollutants in the atmosphere around the cement plant and neighbouring settlements were determined using appropriate sampling techniques. Atmospheric dust and CO2 were the prevalent pollutants during the sampling period; their concentrations were in the ranges of 249-3,745 mg/m3 and 2,440-2,600 mg/m3, respectively. Besides atmospheric dust and CO2, pollutants such as NOx, SOx and CO were present in trace concentrations, below the safe limits approved by FEPA, which are 0.0062-0.093 mg/m3 for NOx, 0.026 mg/m3 for SOx and 114.3 mg/m3 for CO, respectively. Some cost-effective mitigating measures were recommended, including the utilisation of readily available, low-cost pozzolan material to produce blended cement, whereby not only could energy efficiency be improved but carbon dioxide emission during clinker production could also be minimised, and the installation of advanced high-pressure grinding rolls (clinker roller-press process) to raise energy efficiency above what is obtainable from traditional ball mills and to minimise CO2 emission from the power plant. PMID:19067202

  7. A Case Study in Ethical Decision Making Regarding Remote Mitigation of Botnets

    NASA Astrophysics Data System (ADS)

    Dittrich, David; Leder, Felix; Werner, Tillmann

    It is becoming more common for researchers to find themselves in a position of being able to take over control of a malicious botnet. If this happens, should they use this knowledge to clean up all the infected hosts? How would this affect not only the owners and operators of the zombie computers, but also other researchers, law enforcement agents serving justice, or even the criminals themselves? What dire circumstances would change the calculus about what is or is not appropriate action to take? We review two case studies of long-lived malicious botnets that present serious challenges to researchers and responders and use them to illuminate many ethical issues regarding aggressive mitigation. We make no judgments about the questions raised, instead laying out the pros and cons of possible choices and allowing workshop attendees to consider how and where they would draw lines. By this, we hope to expose where there is clear community consensus as well as where controversy or uncertainty exists.

  8. HAZARDOUS CHEMICALS IN FISH: WISCONSIN POWER PLANT IMPACT STUDY

    EPA Science Inventory

    The role of fish as vectors for organic chemical contaminants arising from the operation of a coal-fired power plant was assessed by in vivo studies of the fate of selected chemicals and in vitro studies of liver xenobiotic biotransformation enzymes. The results indicate that sel...

  9. Recommendations for water supply in arsenic mitigation: a case study from Bangladesh.

    PubMed

    Hoque, B A; Mahmood, A A; Quadiruzzaman, M; Khan, F; Ahmed, S A; Shafique, S A; Rahman, M; Morshed, G; Chowdhury, T; Rahman, M M; Khan, F H; Shahjahan, M; Begum, M; Hoque, M M

    2000-11-01

    Arsenic problems have been observed in several countries around the world. The challenges of arsenic mitigation are more difficult for developing and poor countries due to resource and other limitations. Bangladesh is experiencing the worst arsenic problem in the world, as about 30 million people are possibly drinking arsenic-contaminated water. Lack of knowledge has hampered mitigation initiatives. This paper presents experience gained during action research on water supply for arsenic mitigation in rural Singair, Bangladesh. Mitigation has been implemented there through integrated research and development of appropriate water supply options and their use through community participation. Political leaders and women played key roles in the success of the mitigation. More than one option for safe water has been developed and/or identified. The main recommendations include: integration of screening of tubewells with supply of safe water; research on technological and social aspects; community, women and local government participation; education and training of all stakeholders; immediate and appropriate use of the available knowledge; links between intermediate/immediate and long-term investment; and effective coordination and immediate attention to the arsenic issue by health, nutrition, agriculture, education, and other programs. PMID:11114764

  10. Seismic hazard analysis application of methodology, results, and sensitivity studies. Volume 4

    SciTech Connect

    Bernreuter, D. L

    1981-08-08

    As part of the Site Specific Spectra Project, this report seeks to identify the sources of and minimize uncertainty in estimates of seismic hazards in the Eastern United States. Findings are being used by the Nuclear Regulatory Commission to develop a synthesis among various methods that can be used in evaluating seismic hazard at the various plants in the Eastern United States. In this volume, one of a five-volume series, we discuss the application of the probabilistic approach using expert opinion. The seismic hazard is developed at nine sites in the Central and Northeastern United States, and both individual experts' and synthesis results are obtained. We also discuss and evaluate the ground motion models used to develop the seismic hazard at the various sites, analyzing extensive sensitivity studies to determine the important parameters and the significance of uncertainty in them. Comparisons are made between probabilistic and real spectra for a number of Eastern earthquakes. The uncertainty in the real spectra is examined as a function of the key earthquake source parameters. In our opinion, the single most important conclusion of this study is that the use of expert opinion to supplement the sparse data available on Eastern United States earthquakes is a viable approach for determining estimated seismic hazard in this region of the country. 29 refs., 15 tabs.

  11. Pyrotechnic hazards classification and evaluation program test report. Heat flux study of deflagrating pyrotechnic munitions

    NASA Technical Reports Server (NTRS)

    Fassnacht, P. O.

    1971-01-01

    A heat flux study of deflagrating pyrotechnic munitions is presented. Three tests were authorized to investigate whether heat flux measurements may be used as effective hazards evaluation criteria to determine safe quantity distances for pyrotechnics. A passive sensor study was conducted simultaneously to investigate their usefulness in recording events and conditions. It was concluded that heat flux measurements can effectively be used to evaluate hazards criteria and that passive sensors are an inexpensive tool to record certain events in the vicinity of deflagrating pyrotechnic stacks.

  12. Mini-Sosie high-resolution seismic method aids hazards studies

    USGS Publications Warehouse

    Stephenson, W.J.; Odum, J.; Shedlock, K.M.; Pratt, T.L.; Williams, R.A.

    1992-01-01

    The Mini-Sosie high-resolution seismic method has been effective in imaging shallow structural and stratigraphic features that aid in seismic-hazard and neotectonic studies. The method is not an alternative to Vibroseis acquisition for large-scale studies. However, it has two major advantages over Vibroseis as it is being used by the USGS in its seismic-hazards program. First, the sources are extremely portable and can be used in both rural and urban environments. Second, the shifting-and-summation process during acquisition improves the signal-to-noise ratio and cancels out seismic noise sources such as cars and pedestrians. -from Authors
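
    The signal-to-noise benefit of summation during acquisition can be illustrated numerically: stacking N aligned records whose noise is independent improves SNR by roughly √N. Below is a synthetic sketch (not USGS code); the wavelet, noise level, and record count are invented for illustration.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n_records, n_samples = 64, 500
    t = np.linspace(0, 1, n_samples)
    signal = np.sin(2 * np.pi * 25 * t) * np.exp(-5 * t)  # synthetic source wavelet

    # Each record: the same aligned signal plus independent ambient noise
    # (the cars and pedestrians mentioned in the abstract)
    records = signal + rng.normal(0, 2.0, size=(n_records, n_samples))

    stack = records.mean(axis=0)  # summation step (records already time-aligned)

    def snr(trace):
        noise = trace - signal
        return np.sqrt((signal**2).mean() / (noise**2).mean())

    print(f"single-record SNR: {snr(records[0]):.3f}")
    print(f"stacked SNR:       {snr(stack):.3f}")  # improves by roughly sqrt(64) = 8x
    ```
    
    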

  13. Study of the environmental hazard caused by the oil shale industry solid waste.

    PubMed

    Pllumaa, L; Maloveryan, A; Trapido, M; Sillak, H; Kahru, A

    2001-01-01

    The environmental hazard was studied of eight soil and solid waste samples originating from a region of Estonia heavily polluted by the oil shale industry. The samples were contaminated mainly with oil products (up to 7231 mg/kg) and polycyclic aromatic hydrocarbons (PAHs; up to 434 mg/kg). Concentrations of heavy metals and water-extractable phenols were low. The toxicities of the aqueous extracts of solid-phase samples were evaluated by using a battery of Toxkit tests (involving crustaceans, protozoa, rotifers and algae). Waste rock and fresh semi-coke were classified as of "high acute toxic hazard", whereas aged semi-coke and most of the polluted soils were classified as of "acute toxic hazard". Analysis of the soil slurries by using the photobacterial solid-phase flash assay showed the presence of particle-bound toxicity in most samples. In the case of four samples out of the eight, chemical and toxicological evaluations both showed that the levels of PAHs, oil products or both exceeded their respective permitted limit values for the living zone (20 mg PAHs/kg and 500 mg oil products/kg); the toxicity tests showed a toxic hazard. However, in the case of three samples, the chemical and toxicological hazard predictions differed markedly: polluted soil from the Erra River bank contained 2334 mg oil/kg, but did not show any water-extractable toxicity. In contrast, spent rock and aged semi-coke that contained none of the pollutants in hazardous concentrations, showed adverse effects in toxicity tests. The environmental hazard of solid waste deposits from the oil shale industry needs further assessment. PMID:11387023

  14. Seismic hazard assessment of the cultural heritage sites: A case study in Cappadocia (Turkey)

    NASA Astrophysics Data System (ADS)

    Seyrek, Evren; Orhan, Ahmet; Diner, İsmail

    2014-05-01

    Turkey is one of the most seismically active regions in the world. Major earthquakes with the potential to threaten life and property occur frequently here. In the last decade, over 50,000 residents lost their lives, commonly as a result of building failures in seismic events. The Cappadocia region is one of the most important touristic sites in Turkey. At the same time, the region was included in the World Heritage List by UNESCO in 1985 due to its natural, historical and cultural values. The region is adversely affected by several environmental conditions, which have been the subject of many previous studies, but studies of the seismic evaluation of the region are limited. Some of the important historical and cultural heritage sites are: Goreme Open Air Museum, Uchisar Castle, Ortahisar Castle, Derinkuyu Underground City and Ihlara Valley. According to the seismic hazard zonation map published by the Ministry of Reconstruction and Settlement, these heritage sites fall in Zone III, Zone IV and Zone V. This map shows peak ground acceleration with a 10 percent probability of exceedance in 50 years for bedrock. In this connection, the seismic hazard at these heritage sites has to be evaluated. In this study, seismic hazard calculations are performed with both deterministic and probabilistic approaches, taking local site conditions into account. A catalog of historical and instrumental earthquakes was prepared and used in this study. The seismic sources were identified for seismic hazard assessment based on geological, seismological and geophysical information. Peak Ground Acceleration (PGA) at bedrock level is calculated for the different seismic sources using available attenuation relationships applicable to Turkey. The results of the present study reveal that the seismic hazard at these sites closely matches the seismic zonation map published by the Ministry of Reconstruction and Settlement. 
Keywords: Seismic Hazard Assessment, Probabilistic Approach, Deterministic Approach, Historical Heritage, Cappadocia.
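
    The probabilistic branch of such an assessment combines source occurrence rates with an attenuation relationship into an annual rate of exceedance, then reads off the PGA at the design rate (10% probability of exceedance in 50 years corresponds to roughly a 475-year return period). The sketch below is a toy illustration of that pipeline; the source parameters and attenuation coefficients are invented, not values from the study or any published GMPE.

    ```python
    import numpy as np
    from math import log, sqrt, erf

    def prob_exceed(a, median, sigma_ln=0.6):
        """P(PGA > a) for ground motion lognormally distributed about a median."""
        z = (log(a) - log(median)) / sigma_ln
        return 0.5 * (1.0 - erf(z / sqrt(2.0)))

    # Hypothetical seismic sources: (annual rate of events, magnitude, distance km)
    sources = [(0.05, 6.5, 30.0), (0.20, 5.5, 15.0), (0.01, 7.2, 60.0)]

    def median_pga(m, r):
        # Toy attenuation relationship in g (illustrative only)
        return np.exp(-3.5 + 0.9 * m - 1.3 * np.log(r + 10.0))

    def annual_exceedance(a):
        # Total annual rate of exceeding acceleration a, summed over all sources
        return sum(nu * prob_exceed(a, median_pga(m, r)) for nu, m, r in sources)

    # Design target: 10% probability of exceedance in 50 years (Poisson assumption)
    target_rate = -np.log(0.9) / 50.0

    grid = np.linspace(0.01, 1.0, 1000)
    rates = np.array([annual_exceedance(a) for a in grid])
    design_pga = grid[np.argmax(rates < target_rate)]  # first PGA below the target rate
    print(f"design PGA (~475-yr return period): {design_pga:.2f} g")
    ```

    A deterministic assessment would instead take the single controlling scenario (largest credible magnitude at the closest distance) and evaluate `median_pga` for it directly.
    
    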

  15. Voltage Sag Mitigation Strategies for an Indian Power System: A Case Study

    NASA Astrophysics Data System (ADS)

    Goswami, A. K.; Gupta, C. P.; Singh, G. K.

    2014-08-01

    Under the modern deregulated environment, both utilities and customers are concerned with power quality improvement, but with different objectives and interests. The utility reconfigures its power network and installs mitigation devices, if needed, to improve power quality. This paper presents a strategy for selecting cost-effective solutions to mitigate voltage sags, the most frequent power quality disturbance. Mitigation devices are placed in the optimal network topology at suitable locations to maximize their effectiveness in further improving power quality. Optimal placement is considered from the utility's perspective for overall benefit. Finally, performance is evaluated on the basis of the reduction in the total number of voltage sags, the total number of process trips, and the total financial losses due to voltage sags.

  16. A computational study of explosive hazard potential for reusable launch vehicles.

    SciTech Connect

    Langston, Leo J.; Freitas, Christopher J.; Langley, Patrick; Palmer, Donald; Saul, W. Venner; Chocron, Sidney; Kipp, Marlin E.

    2004-09-01

    Catastrophic failure of a Reusable Launch Vehicle (RLV) during launch poses a significant engineering problem in the context of crew escape. The explosive hazard potential of the RLV changes during the various phases of the launch. The hazard potential in the on-pad environment is characterized by release and formation of a gas phase mixture in an oxidizer-rich environment, while the hazard during the in-flight phase is dominated by the boundary layer and wake flow formed around the vehicle and their interaction with the exhaust gas plume. In order to address crew escape in these explosive environments more effectively, a computational analysis program was undertaken by Lockheed Martin, funded by NASA JSC, with simulations and analyses completed by Southwest Research Institute and Sandia National Laboratories. This paper presents the details of the methodology used in this analysis, the results of the study, and the important conclusions drawn from it.

  17. An Independent Evaluation of the FMEA/CIL Hazard Analysis Alternative Study

    NASA Technical Reports Server (NTRS)

    Ray, Paul S.

    1996-01-01

    The present instruments of safety and reliability risk control for a majority of the National Aeronautics and Space Administration (NASA) programs/projects consist of Failure Mode and Effects Analysis (FMEA), Hazard Analysis (HA), Critical Items List (CIL), and Hazard Report (HR). This extensive analytical approach was introduced in the early 1970s and was implemented for the Space Shuttle Program by NHB 5300.4 (1D-2). Since the Challenger accident in 1986, the process has been expanded considerably and has resulted in the introduction of similar and/or duplicated activities in safety/reliability risk analysis. A study initiated in 1995 to search for an alternative to the current FMEA/CIL Hazard Analysis methodology generated a proposed method on April 30, 1996. The objective of this Summer Faculty Study was to participate in and conduct an independent evaluation of the proposed alternative to simplify the present safety and reliability risk control procedure.

  18. Coastal dynamics studies for evaluation of hazard and vulnerability to coastal erosion. Case study: the town of La Bocana, Buenaventura, Colombian Pacific

    NASA Astrophysics Data System (ADS)

    Coca-Domínguez, Oswaldo; Ricaurte-Villota, Constanza

    2015-04-01

    The analysis of hazard and vulnerability in coastal areas affected by erosion is based on studies of coastal dynamics, since these provide the level of detail needed for decision-making in prevention, mitigation, disaster reduction and integrated risk management. The town of La Bocana, located in Buenaventura (Colombian Pacific), was selected for a coastal erosion hazard assessment based on three components: i) magnitude, ii) occurrence and iii) susceptibility. Vulnerability, meanwhile, is likewise evaluated through three main components: i) exposure, ii) fragility and iii) resilience, which in turn are assessed across six dimensions of vulnerability: physical, social, economic, ecological, institutional and cultural. The hazard analysis used a semi-quantitative approach and an index of variables such as type of geomorphological unit, type of beach, exposure of the coast to wave action, and occurrence, among others. Quantitative data on coastal retreat were obtained with DSAS (Digital Shoreline Analysis System), an ArcGIS application, together with digital elevation models of the beach and six beach profiles strategically located along the coast and surveyed with GNSS technology. Sediment samples collected from these beaches and mean wave height and direction were used as complementary data. The information was integrated along the coastline in segments of 250 x 250 meters. Four sectors make up the coastal area of La Bocana: Pianguita, Vistahermosa, Downtown and Shangay. The six vulnerability dimensions were evaluated for these populations, along with population density for exposure, which was analyzed through a multi-array method that includes variables such as land use, population, type of structure, education and basic services, among others, to measure fragility, together with a corresponding indicator of resilience. 
The hazard analysis results indicate that Vistahermosa is under very high threat, while Downtown and Pianguita are under medium hazard; these two sectors have the highest population density and the largest hotel development and services infrastructure. Shangay was scored as low hazard because wave action has no direct impact on it. The vulnerability analysis suggests that Shangay has very high vulnerability because it lacks basic services and has low levels of schooling, while Downtown, Vistahermosa and Pianguita are at average vulnerability. Additionally, it was determined that in recent years erosion rates in Vistahermosa have reached up to -xx m yr-1, while in the other sectors the regression of the coastline can be associated with local tidal peaks that occur during April and October, the other months of the year typically being periods of recovery and stability.
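
    DSAS computes shoreline change statistics per transect from shorelines digitized at different dates, commonly the linear regression rate (LRR) and the end point rate (EPR). A minimal sketch of those two statistics, using hypothetical transect data (the years and positions below are invented, not measurements from La Bocana):

    ```python
    import numpy as np

    # Hypothetical transect data: survey year vs. shoreline position (m, seaward +)
    years     = np.array([2003, 2006, 2009, 2012, 2015])
    positions = np.array([52.0, 48.5, 46.1, 41.8, 39.0])

    # Linear Regression Rate (LRR): least-squares slope over all shorelines
    rate, intercept = np.polyfit(years, positions, 1)

    # End Point Rate (EPR): oldest vs. most recent shoreline only
    epr = (positions[-1] - positions[0]) / (years[-1] - years[0])

    print(f"LRR: {rate:.2f} m/yr, EPR: {epr:.2f} m/yr")  # negative => shoreline retreat
    ```
    
    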

  19. FMECA application to rainfall hazard prevention in olive tree growing

    NASA Astrophysics Data System (ADS)

    Buendia-Buenda, F. S.; Bermudez, R.; Tarquis, A. M.; Andina, D.

    2010-05-01

    The FMECA (Failure Mode Effects and Criticality Analysis) is a widely used System Safety tool applied in industries such as aerospace in order to prevent hazards. This methodology studies the different failure modes of a system and tries to mitigate them through a systematic procedure. In this paper the tool is applied to mitigate economic impact hazards derived from rainfall on olive tree growing in Granada (Spain), understanding hazard from the System Safety perspective (any real or potential condition that can cause injury, illness, or death to personnel; damage to or loss of a system, equipment or property; or damage to the environment). The work includes a brief introduction to the System Safety and FMECA methodologies, and then applies these concepts to analyze olive trees as a system and identify the hazards during the different stages of the whole life cycle of plant production.
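
    FMECA commonly ranks failure modes by a Risk Priority Number, RPN = severity × occurrence × detection, and mitigates the highest-ranked modes first. The abstract does not give the authors' scoring, so the failure modes and ratings below are hypothetical, just to show the ranking mechanics:

    ```python
    from dataclasses import dataclass

    @dataclass
    class FailureMode:
        name: str
        severity: int    # 1 (negligible) .. 10 (catastrophic crop loss)
        occurrence: int  # 1 (rare) .. 10 (frequent)
        detection: int   # 1 (easily detected) .. 10 (undetectable)

        @property
        def rpn(self) -> int:
            # Risk Priority Number: the standard FMECA ranking metric
            return self.severity * self.occurrence * self.detection

    # Hypothetical rainfall-related failure modes over the olive life cycle
    modes = [
        FailureMode("Waterlogging of roots (heavy autumn rain)", 8, 4, 3),
        FailureMode("Fungal infection after prolonged wet canopy", 6, 6, 5),
        FailureMode("Flower drop from spring storms", 7, 3, 2),
    ]

    # Mitigation effort goes to the highest RPN first
    for fm in sorted(modes, key=lambda m: m.rpn, reverse=True):
        print(f"RPN {fm.rpn:4d}  {fm.name}")
    ```
    
    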

  20. Mask roughness induced LER control and mitigation: aberrations sensitivity study and alternate illumination scheme

    SciTech Connect

    McClinton, Brittany M.; Naulleau, Patrick P.

    2011-03-11

    Here we conduct a mask-roughness-induced line-edge-roughness (LER) aberrations sensitivity study both as a random distribution amongst the first 16 Fringe Zernikes (for overall aberration levels of 0.25, 0.50, and 0.75 nm rms) as well as an individual aberrations sensitivity matrix over the first 37 Fringe Zernikes. Full 2D aerial image modeling for an imaging system with NA = 0.32 was done for both the 22-nm and 16-nm half-pitch nodes on a rough mask with a replicated surface roughness (RSR) of 100 pm and a correlation length of 32 nm at the nominal extreme-ultraviolet lithography (EUVL) wavelength of 13.5 nm. As the ideal RSR value for commercialization of EUVL is 50 pm and under, and furthermore, as has been shown elsewhere, a correlation length of 32 nm of roughness on the mask sits on the peak LER value for an NA = 0.32 imaging optic, these mask roughness values, and consequently the aberration sensitivity study presented here, represent a worst-case scenario. The illumination conditions were chosen based on the possible candidates for the 22-nm and 16-nm half-pitch nodes, respectively. In the 22-nm case, a disk illumination setting of σ = 0.50 was used, and for the 16-nm case, crosspole illumination with σ = 0.10 at an optimum offset of dx = 0 and dy = 0.67 in sigma space. In examining how to mitigate mask roughness induced LER, we considered an alternate illumination scheme whereby a traditional dipole's angular spectrum is extended in the direction parallel to the line-and-space mask absorber pattern to represent a 'strip'. While this illumination surprisingly provides minimal improvement to the LER as compared to several alternate illumination schemes, the overall imaging quality in terms of image-log-slope (ILS) and contrast is improved.

  1. Software safety hazard analysis

    SciTech Connect

    Lawrence, J.D.

    1996-02-01

    Techniques for analyzing the safety and reliability of analog-based electronic protection systems that serve to mitigate hazards in process control systems have been developed over many years, and are reasonably well understood. An example is the protection system in a nuclear power plant. The extension of these techniques to systems which include digital computers is not well developed, and there is little consensus among software engineering experts and safety experts on how to analyze such systems. One possible technique is to extend hazard analysis to include digital computer-based systems. Software is frequently overlooked during system hazard analyses, but this is unacceptable when the software is in control of a potentially hazardous operation. In such cases, hazard analysis should be extended to fully cover the software. A method for performing software hazard analysis is proposed in this paper.

  2. Landslide hazard mapping with selected dominant factors: A study case of Penang Island, Malaysia

    SciTech Connect

    Tay, Lea Tien; Alkhasawneh, Mutasem Sh.; Ngah, Umi Kalthum; Lateh, Habibah

    2015-05-15

    Landslides are among the most destructive natural geohazards in Malaysia. In addition to rainfall as a triggering factor, topographical and geological factors play an important role in landslide susceptibility analysis. Conventional topographic factors such as elevation, slope angle, slope aspect, plan curvature and profile curvature have been considered as landslide causative factors in many research works. However, other topographic factors such as diagonal length, surface area, surface roughness and rugosity have not been considered, especially in landslide hazard analysis in Malaysia. This paper presents landslide hazard mapping using the Frequency Ratio (FR) method, with Penang Island, Malaysia as the study area. The frequency ratio approach is a variant of the probabilistic method that is based on the observed relationships between the distribution of landslides and each landslide-causative factor. The landslide hazard map of Penang Island is produced by considering twenty-two (22) landslide causative factors. Among these, fourteen (14) are topographic factors: elevation, slope gradient, slope aspect, plan curvature, profile curvature, general curvature, tangential curvature, longitudinal curvature, cross section curvature, total curvature, diagonal length, surface area, surface roughness and rugosity. These topographic factors are extracted from the digital elevation model of Penang Island. The other eight (8) non-topographic factors considered are land cover, vegetation cover, distance from road, distance from stream, distance from fault line, geology, soil texture and rainfall precipitation. After considering all twenty-two factors for landslide hazard mapping, the analysis is repeated with fourteen dominant factors selected from the twenty-two. The landslide hazard map was segregated into four risk categories: 
highly hazardous area, hazardous area, moderately hazardous area and not hazardous area. The map was assessed using the ROC (Receiver Operating Characteristic) area-under-the-curve (AUC) method. The result indicates an increase in accuracy from 77.76% (with all 22 factors) to 79.00% (with 14 dominant factors) in the prediction of landslide occurrence.
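
    The frequency ratio for a factor class is the proportion of landslides falling in that class divided by the proportion of study area occupied by it; a cell's hazard index is then the sum of the FR values of its classes across all factors. A minimal sketch with toy data (the class labels, cell values, and single `slope_class` factor are made up for illustration):

    ```python
    import pandas as pd

    # Toy inventory: each grid cell has a factor class and a landslide flag (1/0)
    cells = pd.DataFrame({
        "slope_class": ["0-15", "0-15", "15-30", "15-30", ">30", ">30", ">30", "0-15"],
        "landslide":   [0,      0,      1,       0,       1,     1,     0,     0],
    })

    # FR per class = (% of landslides in class) / (% of study area in class)
    landslide_pct = cells.groupby("slope_class")["landslide"].sum() / cells["landslide"].sum()
    area_pct = cells.groupby("slope_class").size() / len(cells)
    fr = landslide_pct / area_pct
    print(fr)  # FR > 1 => class positively correlated with landsliding

    # With several factors, a cell's hazard index sums the FR values of its classes;
    # here there is only one factor, so it is just the mapped FR value
    cells["hazard_index"] = cells["slope_class"].map(fr)
    ```

    The resulting index map is what gets thresholded into categories such as "highly hazardous" through "not hazardous" and validated against observed landslides with ROC/AUC.
    
    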

  3. Landslide hazard mapping with selected dominant factors: A study case of Penang Island, Malaysia

    NASA Astrophysics Data System (ADS)

    Tay, Lea Tien; Alkhasawneh, Mutasem Sh.; Ngah, Umi Kalthum; Lateh, Habibah

    2015-05-01

    Landslides are among the most destructive natural geohazards in Malaysia. In addition to rainfall as a triggering factor, topographical and geological factors play an important role in landslide susceptibility analysis. Conventional topographic factors such as elevation, slope angle, slope aspect, plan curvature and profile curvature have been considered as landslide causative factors in many research works. However, other topographic factors such as diagonal length, surface area, surface roughness and rugosity have not been considered, especially in landslide hazard analysis in Malaysia. This paper presents landslide hazard mapping using the Frequency Ratio (FR) method, with Penang Island, Malaysia as the study area. The frequency ratio approach is a variant of the probabilistic method that is based on the observed relationships between the distribution of landslides and each landslide-causative factor. The landslide hazard map of Penang Island is produced by considering twenty-two (22) landslide causative factors. Among these, fourteen (14) are topographic factors: elevation, slope gradient, slope aspect, plan curvature, profile curvature, general curvature, tangential curvature, longitudinal curvature, cross section curvature, total curvature, diagonal length, surface area, surface roughness and rugosity. These topographic factors are extracted from the digital elevation model of Penang Island. The other eight (8) non-topographic factors considered are land cover, vegetation cover, distance from road, distance from stream, distance from fault line, geology, soil texture and rainfall precipitation. After considering all twenty-two factors for landslide hazard mapping, the analysis is repeated with fourteen dominant factors selected from the twenty-two. The landslide hazard map was segregated into four risk categories: 
highly hazardous area, hazardous area, moderately hazardous area and not hazardous area. The map was assessed using the ROC (Receiver Operating Characteristic) area-under-the-curve (AUC) method. The result indicates an increase in accuracy from 77.76% (with all 22 factors) to 79.00% (with 14 dominant factors) in the prediction of landslide occurrence.

  4. A Study of Airline Passenger Susceptibility to Atmospheric Turbulence Hazard

    NASA Technical Reports Server (NTRS)

    Stewart, Eric C.

    2000-01-01

    A simple, generic, simulation math model of a commercial airliner has been developed to study the susceptibility of unrestrained passengers to large, discrete gust encounters. The math model simulates the longitudinal motion to vertical gusts and includes (1) motion of an unrestrained passenger in the rear cabin, (2) fuselage flexibility, (3) the lag in the downwash from the wing to the tail, and (4) unsteady lift effects. Airplane and passenger response contours are calculated for a matrix of gust amplitudes and gust lengths of a simulated mountain rotor. A comparison of the model-predicted responses to data from three accidents indicates that the accelerations in actual accidents are sometimes much larger than the simulated gust encounters.

  5. Flood hazards studies in the Mississippi River basin using remote sensing

    NASA Technical Reports Server (NTRS)

    Rango, A.; Anderson, A. T.

    1974-01-01

    The Spring 1973 Mississippi River flood was investigated using remotely sensed data from ERTS-1. Both manual and automatic analyses of the data indicated that ERTS-1 is extremely useful as a regional tool for flood management. Quantitative estimates of the area flooded were made in St. Charles County, Missouri, and in Arkansas. Flood hazard mapping was conducted in three study areas along the Mississippi River using pre-flood ERTS-1 imagery enlarged to 1:250,000 and 1:100,000 scale. Initial results indicate that ERTS-1 digital mapping of flood-prone areas can be performed at 1:62,500, which is comparable to some conventional flood hazard map scales.

  6. When does highway construction to mitigate congestion reduce carbon emissions? A Case Study: The Caldecott Tunnel

    NASA Astrophysics Data System (ADS)

    Thurlow, M. E.; Maness, H.; Wiersema, D. J.; Mcdonald, B. C.; Harley, R.; Fung, I. Y.

    2014-12-01

    The construction of the fourth bore of the Caldecott Tunnel, which connects Oakland and Moraga, CA on State Route 24, was the second largest roadway construction project in California last year, with a total cost of $417 million. The objective of the fourth bore was to reduce traffic congestion before the tunnel entrance in the off-peak direction of travel, but the project was a source of conflict between policy makers and environmental and community groups concerned about the air quality and traffic impacts. We analyze the impact of the opening of the fourth bore on CO2 emissions associated with traffic. We made surface observations of CO2 from a mobile platform along State Route 24 for several weeks in November 2013, spanning the period before and after the opening of the fourth bore on November 16, 2013. We directly compare bottom-up and top-down approaches to estimate the change in traffic emissions associated with the fourth bore opening. A bottom-up emissions inventory was derived from the high-resolution Performance Measurement System (PeMS) dataset and the Multi-scale Motor Vehicle and Equipment Emissions System (MOVES). The emissions inventory was used to drive a box model as well as a high-resolution regional transport model (the Weather Research and Forecasting model). The box model was also used to derive emissions from observations in a basic inversion. We also present an analysis of long-term traffic patterns and consider the potential for compensating changes in behavior that offset the observed emissions reductions on longer timescales. Finally, we examine how the results from the Caldecott study demonstrate the general benefit of using mobile measurements for quantifying environmental impacts of congestion mitigation projects.
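
    A steady-state box model of the kind mentioned here inverts a measured concentration enhancement into an emission rate by mass balance: what is emitted into the box must be advected out, so E = ΔC · u · H · W. The sketch below shows that inversion; every number is an illustrative assumption, not a value from the study.

    ```python
    # Minimal steady-state box-model inversion: roadway CO2 enhancement -> emission rate.
    # All inputs are illustrative assumptions (hypothetical, not study measurements).
    delta_c = 3.6e-5   # CO2 enhancement over background, kg/m^3 (~20 ppm)
    wind    = 2.0      # mean wind speed through the box, m/s
    height  = 100.0    # mixing height, m
    width   = 500.0    # crosswind box width, m

    # Mass balance: emissions in = advection out  =>  E = dC * u * H * W
    emissions_kg_s = delta_c * wind * height * width
    print(f"inferred emission rate: {emissions_kg_s:.1f} kg CO2/s")  # 3.6 kg/s
    ```

    Comparing this top-down rate against the bottom-up inventory (traffic counts × per-vehicle emission factors) is the cross-check the abstract describes.
    
    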

  7. THE SOCIAL IMPLICATIONS OF FLAME RETARDANT CHEMICALS: A CASE STUDY IN RISK AND HAZARD PERCEPTION

    EPA Science Inventory

    This study is expected to fill an important gap in the literature by focusing on how individuals characterize exposure in terms of risk and hazard, and how this understanding can lead to concrete changes in their personal and professional lives. I expect that people differ gre...

  8. A PRELIMINARY FEASIBILITY STUDY FOR AN OFFSHORE HAZARDOUS WASTE INCINERATION FACILITY

    EPA Science Inventory

    The report summarizes a feasibility study of using an existing offshore oil platform, being offered to the Government, as a site for incineration of hazardous wastes and related research. The platform, located in the Gulf of Mexico about 100 km south of Mobile, AL, has potential ...

  9. Preparation of a national Copernicus service for detection and monitoring of land subsidence and mass movements in the context of remote sensing assisted hazard mitigation

    NASA Astrophysics Data System (ADS)

    Kalia, Andre C.; Frei, Michaela; Lege, Thomas

    2014-10-01

    Land subsidence can cause severe damage to e.g. infrastructure and buildings, and mass movements can even lead to loss of life. Detection and monitoring of these processes by terrestrial measurement techniques remain a challenge due to limitations in spatial coverage and temporal resolution. Since the launch of ERS-1 in 1991, numerous scientific studies have demonstrated the capability of differential SAR interferometry (DInSAR) for the detection of surface deformation, proving the usability of this method. In order to assist the utilization of DInSAR for governmental tasks, a national service concept within the EU-ESA Program "Copernicus" is in preparation. This is done by i) analyzing the user requirements, ii) developing a concept and iii) performing case studies as a "proof of concept". Due to the iterative nature of this procedure, governmental users as well as DInSAR experts are involved. This paper introduces the concept and presents the available SAR data archive from ERS-1/2, TerraSAR-X and TanDEM-X as well as the proposed case study. The case study focuses on the application of advanced DInSAR methods for the detection of subsidence in a region with active gas extraction. The area of interest is located in the state of Lower Saxony in the northwest of Germany. The DInSAR analysis will be based on ERS-1/2 and on TerraSAR-X/TanDEM-X SAR data. The usability of the DInSAR products will be discussed with the responsible mining authority (LBEG) in order to adapt the products to the user needs and to evaluate the proposed concept.

  10. Urban Vulnerability Assessment to Seismic Hazard through Spatial Multi-Criteria Analysis. Case Study: the Bucharest Municipality/Romania

    NASA Astrophysics Data System (ADS)

    Armas, Iuliana; Dumitrascu, Silvia; Bostenaru, Maria

    2010-05-01

    In the context of an explosive increase in value of the damage caused by natural disasters, an alarming challenge in the third millennium is the rapid growth of urban population in vulnerable areas. Cities are, by definition, very fragile socio-ecological systems with a high level of vulnerability when it comes to environmental changes and that are responsible for important transformations of the space, determining dysfunctions shown in the state of the natural variables (Parker and Mitchell, 1995, The OFDA/CRED International Disaster Database). A contributing factor is the demographic dynamic that affects urban areas. The aim of this study is to estimate the overall vulnerability of the urban area of Bucharest in the context of the seismic hazard, by using environmental, socio-economic, and physical measurable variables in the framework of a spatial multi-criteria analysis. For this approach the capital city of Romania was chosen based on its high vulnerability due to the explosive urban development and the advanced state of degradation of the buildings (most of the building stock being built between 1940 and 1977). Combining these attributes with the seismic hazard induced by the Vrancea source, Bucharest was ranked as the 10th capital city worldwide in the terms of seismic risk. Over 40 years of experience in the natural risk field shows that the only directly accessible way to reduce the natural risk is by reducing the vulnerability of the space (Adger et al., 2001, Turner et al., 2003; UN/ISDR, 2004, Dayton-Johnson, 2004, Kasperson et al., 2005; Birkmann, 2006 etc.). In effect, reducing the vulnerability of urban spaces would imply lower costs produced by natural disasters. 
By applying the SMCA method, the result reveals a circular pattern, signaling as hot spots the Bucharest historic centre (located on a river terrace and with an aged building stock) and peripheral areas (isolated from the emergency centers and defined by precarious social and economic conditions). In effect, the example of Bucharest demonstrates how the results shape the city's profile of vulnerability to seismic hazard, on the basis of which decision makers could develop proper mitigation strategies. To sum up, the use of an analytical framework such as the standard Spatial Multi-Criteria Analysis (SMCA) - despite all the difficulties in creating justifiable weights (Yeh et al., 1999) - results in accurate estimations of the state of the urban system. Although this method has often been mistrusted by decision makers (Janssen, 2001), we consider that the results can represent, precisely because of their level of generalization, a decision support framework for policy makers to critically reflect on possible risk mitigation plans. Further study will lead to the improvement of the analysis by integrating a series of daytime and nighttime scenarios and a better definition of the constructed space variables.
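The core of an SMCA-style assessment is a weighted linear combination of normalized indicators per spatial unit. A minimal sketch of that idea, with hypothetical zone names, indicator values, and weights (the study's actual criteria and weights are not reproduced here):

```python
# Minimal SMCA-style weighted overlay (illustrative values, not the
# Bucharest study's actual indicators or weights).

def smca_score(indicators, weights):
    """Weighted linear combination of normalized vulnerability indicators."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9, "weights must sum to 1"
    return sum(indicators[k] * w for k, w in weights.items())

# Indicators normalized to [0, 1]; higher means more vulnerable.
zones = {
    "historic_centre": {"building_age": 0.9, "isolation": 0.3, "social": 0.5},
    "periphery":       {"building_age": 0.4, "isolation": 0.8, "social": 0.7},
    "mid_ring":        {"building_age": 0.3, "isolation": 0.2, "social": 0.3},
}
weights = {"building_age": 0.5, "isolation": 0.2, "social": 0.3}

ranked = sorted(zones, key=lambda z: smca_score(zones[z], weights), reverse=True)
print(ranked)  # hot spots first: ['historic_centre', 'periphery', 'mid_ring']
```

In a real analysis each "zone" would be a raster cell or census polygon, and the weights would come from expert elicitation, which is exactly where the justifiability concerns cited above (Yeh et al., 1999) arise.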

  11. Natural phenomena hazards site characterization criteria

    SciTech Connect

    Not Available

    1994-03-01

    The criteria and recommendations in this standard shall apply to site characterization for the purpose of mitigating Natural Phenomena Hazards (wind, floods, landslide, earthquake, volcano, etc.) in all DOE facilities covered by DOE Order 5480.28. Criteria for site characterization not related to NPH are not included unless necessary for clarification. General and detailed site characterization requirements are provided in areas of meteorology, hydrology, geology, seismology, and geotechnical studies.

  12. A Study of Aircraft Fire Hazards Related to Natural Electrical Phenomena

    NASA Technical Reports Server (NTRS)

    Kester, Frank L.; Gerstein, Melvin; Plumer, J. A.

    1960-01-01

    The problems of natural electrical phenomena as a fire hazard to aircraft are evaluated. Assessment of the hazard is made over the range of low level electrical discharges, such as static sparks, to high level discharges, such as lightning strikes to aircraft. In addition, some fundamental work is presented on the problem of flame propagation in aircraft fuel vent systems. This study consists of a laboratory investigation in five parts: (1) a study of the ignition energies and flame propagation rates of kerosene-air and JP-6-air foams, (2) a study of the rate of flame propagation of n-heptane, n-octane, n-nonane, and n-decane in aircraft vent ducts, (3) a study of the damage to aluminum, titanium, and stainless steel aircraft skin materials by lightning strikes, (4) a study of fuel ignition by lightning strikes to aircraft skins, and (5) a study of lightning induced flame propagation in an aircraft vent system.

  13. Use of geotextiles for mitigation of the effects of man-made hazards such as greening of waste deposits in frame of the conversion of industrial areas

    NASA Astrophysics Data System (ADS)

    Bostenaru, Magdalena; Siminea, Ioana; Bostenaru, Maria

    2010-05-01

    The city of Karlsruhe lies in the Rhine valley; however, it is situated at some distance from the Rhine river, and the waterfront is not integrated into the urban development. Nevertheless, the Rhine port developed into the second-largest inland port in Germany. With the process of deindustrialisation, industrial use is now shrinking. Together with the simultaneous ecological reclamation of rivers, this calls for the conversion of the industrial area to green and residential areas. In the 1990s, a project for the conversion of the Rhine port area of Karlsruhe into such a nature-residential use was prepared by the third author of this contribution together with Andrea Ciobanu, as students of the University of Karlsruhe. The area also included a waste deposit, proposed to be transformed into a "green hill". Such an integration of a waste deposit into a park during the conversion of an industrial area is not singular in Germany; several such projects were proposed, and some of them realised, at the IBA Emscher Park in the Ruhr area. Some of them were coupled with artistic projects. The technical details are also a subject of this contribution. Studies were made by the first two authors on the conditions under which plants grow on former waste deposits when supported by intermediate layers of a geotextile. The characteristics of the geotextiles, together with the technological process for producing them and the results of laboratory and field experiments for use on waste deposits under comparable conditions in Romania, will be shown. The geotextile is also usable for ash deposits such as those in the Ruhr area.

  14. Natural hazard risk perception of Italian population: case studies along national territory.

    NASA Astrophysics Data System (ADS)

    Gravina, Teresita; Tupputi Schinosa, Francesca De Luca; Zuddas, Isabella; Preto, Mattia; Marengo, Angelo; Esposito, Alessandro; Figliozzi, Emanuele; Rapinatore, Matteo

    2015-04-01

    Risk perception is the judgment that people make about the characteristics and severity of risks. In recent years, risk perception studies have focused on providing cognitive elements to communication experts in order to design appropriate information and awareness strategies for citizens. Several authors have used questionnaires to determine the perception of natural hazard risks (seismic, landslide, cyclone, flood, volcanic), as a tool providing reliable quantitative data and permitting comparison of the results with those of similar surveys. In Italy, survey-based risk perception studies have also been carried out on natural risks of national importance, in particular the volcanic risks of Somma-Vesuvio and the Phlegrean Fields, but risk perception studies on local situations distributed across the whole national territory were lacking. Natural hazards of national importance are frequently reported by the national mass media, and there is debate about civil protection emergency plans, whereas it can be difficult to obtain information on local and regional natural hazards diffused across the national territory. In fact, the Italian peninsula is a geologically young area subject to endogenous phenomena (volcanoes, earthquakes) and exogenous phenomena that determine land evolution and natural hazards (landslides, coastal erosion, hydrogeological instability, sinkholes) for the population. For this reason we decided to investigate the perception of natural risks in different Italian places where natural hazards have occurred but were not reported by the mass media, being only of local or historical relevance. We carried out surveys in different Italian places affected by different types of natural hazard (landslide, coastal erosion, hydrogeological instability, sinkhole, volcanic phenomena and earthquake) and compared the results, in order to understand the population's perception level, awareness and preparation through civil protection exercises. 
Our findings support the view that risk communication has to be based on citizens' knowledge and awareness of natural hazards. In fact, informed citizens can participate actively in decisions on urban development planning and accept more positively the legislation and regulations introduced to avoid natural risks. The study goes some way towards enhancing the understanding of citizens' awareness of natural risks, and allows us to say that communication on natural risks cannot be based only on transferring emergency behaviour to citizens, but must also allow people to improve their knowledge of landscape evolution in order to adopt aware environmental behaviour.

  15. Effectiveness of water spray mitigation systems for accidental releases of hydrogen fluoride

    SciTech Connect

    Schatz, K.W.; Koopman, R.P.

    1989-06-01

    This report is one of several work products generated by the Industry Cooperative HF Mitigation/Assessment Program. This ad hoc industry program began in late 1987 to study and test techniques for mitigating accidental releases of hydrogen fluoride (HF) and alkylation unit acid (AUA) and to better estimate ambient impacts from such releases. The hazards of HF have long been recognized. Standard operating practices focused on minimizing the possibility of a release and mitigating the effects if a release should occur. These practices are continually monitored and improved to maximize safety protection based on the available technical data. This recent program targeted further improvements based on new technical data.

  16. Climate engineering of vegetated land for hot extremes mitigation: an ESM sensitivity study

    NASA Astrophysics Data System (ADS)

    Wilhelm, Micah; Davin, Edouard; Seneviratne, Sonia

    2014-05-01

    Mitigation efforts to reduce anthropogenic climate forcing have thus far proven inadequate, as evident from accelerating greenhouse gas emissions. Many subtropical and mid-latitude regions are expected to experience longer and more frequent heat waves and droughts within the next century. This increased occurrence of weather extremes has important implications for human health and mortality and for socio-economic factors including forest fires, water availability and agricultural production. Various solar radiation management (SRM) schemes that attempt to homogeneously counter the anthropogenic forcing have been examined with different Earth System Models (ESMs). Land climate engineering schemes, which reduce the amount of solar radiation absorbed at the surface, have also been investigated. However, few studies have investigated their effects on extremes rather than on the mean climate response. Here we present the results of a series of climate engineering sensitivity experiments performed with the Community Earth System Model (CESM) version 1.0.2 at 2° resolution. This configuration entails five fully coupled model components responsible for simulating the Earth's atmosphere, land, land ice, ocean and sea ice, which interact through a central coupler. Historical and RCP8.5 scenarios were performed with transient land-cover changes and prognostic terrestrial carbon/nitrogen cycles. Four sets of experiments were performed in which surface albedo over snow-free vegetated grid points is increased by 0.05, 0.10, 0.15 and 0.20. The simulations show a strong preferential cooling of hot extremes throughout the Northern mid-latitudes during boreal summer. A strong linear scaling between the cooling of extremes and the additional surface albedo applied to the land model is observed. 
The strongest preferential cooling is found in southeastern Europe and the central United States, where increases of soil moisture and evaporative fraction are the largest relative to the control simulation. This preferential cooling is found to intensify in the future scenario. Cloud cover strongly limits the efficacy of a given surface albedo increase to reflect incoming solar radiation back into space. As anthropogenic forcing increases, cloud cover decreases over much of the northern mid-latitudes in CESM.

  17. Hazardous Drugs

    MedlinePLUS

    ... Commission . OSHA Trade News Release, (2011, April 7). Healthcare Wide Hazards - Hazardous Chemicals . OSHA Hospital eTool. Provides hazards and solutions for employee exposure to ... Occupational ...

  18. Comparative risk judgements for oral health hazards among Norwegian adults: a cross sectional study

    PubMed Central

    Åstrøm, Anne Nordrehaug

    2002-01-01

    Background This study identified optimistic biases in health and oral health hazards, and explored whether comparative risk judgements for oral health hazards vary systematically with socio-economic characteristics and self-reported risk experience. Methods A simple random sample of 1,190 residents born in 1972 was drawn from the population resident in three counties of Norway. A total of 735 adults (51% women) completed postal questionnaires at home. Results Mean ratings of comparative risk judgements differed significantly (p < 0.001) from the mid point of the scales. T-values ranged from -13.1 and -12.1 for the perceived risk of being divorced and losing all teeth to -8.2 and -7.8 (p < 0.001) for having gum disease and tooth decay. Multivariate analyses using General Linear Models, GLM, revealed gender differences in comparative risk judgements for gum disease, whereas social position varied systematically with risk judgements for tooth decay, gum disease and air pollution. The odds ratios for being comparatively optimistic with respect to having gum disease were 2.9, 1.9, 1.8 and 1.5 if being satisfied with dentition, having a favourable view of health situation, and having high and low involvement with health enhancing and health detrimental behaviour, respectively. Conclusion Optimism in comparative judgements for health and oral health hazards was evident in young Norwegian adults. When judging their comparative susceptibility for oral health hazards, they consider personal health situation and risk behaviour experience. PMID:12186656

  19. Robot-assisted home hazard assessment for fall prevention: a feasibility study.

    PubMed

    Sadasivam, Rajani S; Luger, Tana M; Coley, Heather L; Taylor, Benjamin B; Padir, Taskin; Ritchie, Christine S; Houston, Thomas K

    2014-01-01

    We examined the feasibility of using a remotely manoeuvrable robot to make home hazard assessments for fall prevention. We employed use-case simulations to compare robot assessments with in-person assessments. We screened the homes of nine elderly patients (aged 65 years or more) for fall risks using the HEROS screening assessment. We also assessed the participants' perspectives of the remotely-operated robot in a survey. The nine patients had a median Short Blessed Test score of 8 (interquartile range, IQR 2-20) and a median Life-Space Assessment score of 46 (IQR 27-75). Compared to the in-person assessment (mean = 4.2 hazards identified per participant), significantly more home hazards were perceived in the robot video assessment (mean = 7.0). Only two checklist items (adequate bedroom lighting and a clear path from bed to bathroom) had more than 60% agreement between in-person and robot video assessment. Participants were enthusiastic about the robot and did not think it violated their privacy. The study found little agreement between the in-person and robot video hazard assessments. However, it identified several research questions about how to best use remotely-operated robots. PMID:24352900

  20. Integration of Airborne Laser Scanning Altimetry Data in Alpine Geomorphological and Hazard Studies

    NASA Astrophysics Data System (ADS)

    Seijmonsbergen, A. C.

    2007-12-01

    A digital terrain and surface model derived from an airborne laser scanning (ALS) altimetry dataset was used in the Austrian Alps for the preparation, improvement and the evaluation of a digital geomorphological hazard map. The geomorphology in the study area consists of a wide variety of landforms, which include glacial landforms such as cirques, hanging valleys, and moraine deposits, of pre- and postglacial mass movement landforms and processes, such as deep seated slope failures, rock fall, debris flows and solifluction. The area includes naked and covered gypsum karst, collapse dolines and fluvial landforms and deposits such as river terraces, incisions, alluvial fans and gullies. A detailed symbol based paper geomorphological map served as a basis for the digitization of basic morphogenetic landform and process units. These units were assigned a 'geomorphological unit type', 'hazard type' and 'activity' code in the attribute table, according to a morphogenetic classification scheme. Selected zonal statistical attributes - mean height, aspect and slope angle - were calculated in a GIS using the vector based morphogenetic landform and process units and the underlying 1 m resolution laser altimetry raster dataset. This statistical information was added to the attribute table of the 'geomorphological hazard map'. Interpretation of the zonal statistical information shows that indicative topographic signatures exist for the various geomorphological and hazard units in this region of the Alps. Based on this experience a further step is made towards semi-automated geomorphological hazard classification of segmented laser altimetry data using expert knowledge rules. The first results indicate a classification accuracy of 50-70 percent for most landform associations. Areas affected by slide processes resulted in less accurate classification, probably because of their polygenetic history in this area. 
It is concluded that the use of lidar data improves visual recognition of landslides, especially in combination with high resolution 3D air-photo information. The zonal statistical information helps to more realistically delineate, identify and classify landscape units and promotes consistent data extraction. This leads to efficient and time saving (semi-automated) classification procedures in alpine areas. For future developments it seems promising to further integrate lidar in automated landscape classification tools and in hazard mapping strategies. If high resolution satellite borne laser data becomes available on a regular base a further step towards monitoring of landscape evolution in geomorphological hazard studies can be made.
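Zonal statistics of the kind described above reduce to masking a raster by each mapped unit and aggregating the cell values. A small sketch with a synthetic DEM and a hypothetical two-unit zone raster (not the study's actual 1 m data or mapping units):

```python
import numpy as np

# Synthetic DEM (elevations in metres) on a regular grid.
dem = np.array([[100., 102., 130., 135.],
                [101., 103., 132., 138.],
                [ 99., 100., 128., 140.]])

# Zone raster aligned with the DEM: 1 = river terrace, 2 = slope unit
# (hypothetical labels standing in for morphogenetic units).
zones = np.array([[1, 1, 2, 2],
                  [1, 1, 2, 2],
                  [1, 1, 2, 2]])

# Per-zone mean elevation via boolean masking; the same pattern works
# for aspect or slope-angle rasters.
stats = {int(z): {"mean_elev": float(dem[zones == z].mean())}
         for z in np.unique(zones)}
print(stats)
```

In practice the zone raster would be rasterized from the vector landform polygons, and additional aggregates (min, max, standard deviation) would be written back into the attribute table as described in the abstract.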

  1. Success in transmitting hazard science

    NASA Astrophysics Data System (ADS)

    Price, J. G.; Garside, T.

    2010-12-01

    Money motivates mitigation. An example of success in communicating scientific information about hazards, coupled with information about available money, is the follow-up action by local governments to actually mitigate. The Nevada Hazard Mitigation Planning Committee helps local governments prepare competitive proposals for federal funds to reduce risks from natural hazards. Composed of volunteers with expertise in emergency management, building standards, and earthquake, flood, and wildfire hazards, the committee advises the Nevada Division of Emergency Management on (1) the content of the State's hazard mitigation plan and (2) projects that have been proposed by local governments and state agencies for funding from various post- and pre-disaster hazard mitigation programs of the Federal Emergency Management Agency. Local governments must have FEMA-approved hazard mitigation plans in place before they can receive this funding. The committee has been meeting quarterly with elected and appointed county officials, at their offices, to encourage them to update their mitigation plans and apply for this funding. We have settled on a format that includes the county's giving the committee an overview of its infrastructure, hazards, and preparedness. The committee explains the process for applying for mitigation grants and presents the latest information that we have about earthquake hazards, including locations of nearby active faults, historical seismicity, geodetic strain, loss-estimation modeling, scenarios, and documents about what to do before, during, and after an earthquake. Much of the county-specific information is available on the web. 
The presentations have been well received, in part because the committee makes the effort to go to their communities, and in part because the committee is helping them attract federal funds for local mitigation of not only earthquake hazards but also floods (including canal breaches) and wildfires, the other major concerns in Nevada. Local citizens appreciate the efforts of the state officials to present the information in a public forum. The Committee's earthquake presentations to the counties are supplemented by regular updates in the two most populous counties during quarterly meetings of the Nevada Earthquake Safety Council, generally alternating between Las Vegas and Reno. We have only 17 counties in Nevada, so we are making good progress at reaching each within a few years. The Committee is also learning from the county officials about their frustrations in dealing with the state and federal bureaucracies. Success is documented by the mitigation projects that FEMA has funded.

  2. 44 CFR 201.4 - Standard State Mitigation Plans.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... post-disaster hazard management policies, programs, and capabilities to mitigate the hazards in the... 44 Emergency Management and Assistance 1 2010-10-01 2010-10-01 false Standard State Mitigation Plans. 201.4 Section 201.4 Emergency Management and Assistance FEDERAL EMERGENCY MANAGEMENT...

  3. Google Earth Views of Probabilistic Tsunami Hazard Analysis Pilot Study, Seaside, Oregon

    NASA Astrophysics Data System (ADS)

    Wong, F. L.; Venturato, A. J.; Geist, E. L.

    2006-12-01

    Virtual globes such as Google Earth provide immediate geographic context for research data for coastal hazard planning. We present Google Earth views of data from a Tsunami Pilot Study conducted within and near Seaside and Gearhart, Oregon, as part of FEMA's Flood Insurance Rate Map Modernization Program (Tsunami Pilot Study Working Group, 2006). Two goals of the pilot study were to develop probabilistic 100- year and 500-year tsunami inundation maps using Probabilistic Tsunami Hazard Analysis (PTHA) and to provide recommendations for improved tsunami hazard assessment guidelines. The Seaside area was chosen because it is typical of many coastal communities along the Cascadia subduction zone that extends from Cape Mendocino, California, to the Strait of Juan de Fuca, Washington. State and local stakeholders also expressed considerable interest in mapping the tsunami threat to this area. The study was an interagency effort by the National Oceanic and Atmospheric Administration, U.S. Geological Survey, and FEMA, in collaboration with the University of Southern California, Middle East Technical University, Portland State University, Horning Geoscience, Northwest Hydraulics Consultants, and the Oregon Department of Geological and Mineral Industries. The pilot study report will be augmented by a separate geographic information systems (GIS) data publication that provides model data and results. In addition to traditional GIS data formats, Google Earth kmz files are available to provide rapid visualization of the data against the rich base map provided by the interface. The data include verbal and geologic observations of historic tsunami events, newly constructed DEMs, historic shorelines, earthquake sources, models of tsunami wave heights, and maps of the estimated 100- and 500-year probabilistic floods. Tsunami Pilot Study Working Group, 2006, Seaside, Oregon Tsunami Pilot Study - Modernization of FEMA Flood Hazard Maps: U.S. 
Geological Survey Open-file Report 2006-1234, http://pubs.usgs.gov/of/2006/1234/.
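Publishing model results as kmz files, as the pilot study did, amounts to serializing placemarks and overlays into KML. A minimal stdlib-only sketch of building a KML document (the placemark name and coordinates are placeholders, not the Seaside study data):

```python
# Minimal KML writer using only the standard library; the data below are
# hypothetical placeholders, not the Seaside pilot-study results.
import xml.etree.ElementTree as ET

def make_kml(placemarks):
    """Build a KML document string from (name, lon, lat) tuples."""
    kml = ET.Element("kml", xmlns="http://www.opengis.net/kml/2.2")
    doc = ET.SubElement(kml, "Document")
    for name, lon, lat in placemarks:
        pm = ET.SubElement(doc, "Placemark")
        ET.SubElement(pm, "name").text = name
        pt = ET.SubElement(pm, "Point")
        # KML coordinates are lon,lat,altitude
        ET.SubElement(pt, "coordinates").text = f"{lon},{lat},0"
    return ET.tostring(kml, encoding="unicode")

kml_text = make_kml([("hypothetical 100-yr flood observation", -123.92, 46.00)])
```

Zipping the resulting `.kml` file (optionally with icon imagery) produces the `.kmz` format that Google Earth loads directly, which is what gives such publications their "rapid visualization" value.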

  4. Hazardous Materials Management and Emergency Response (HAMMER) Training Center feasibility study

    SciTech Connect

    Curry, R.

    1989-11-01

    This report presents the results of a feasibility study for location of a Hazardous Materials Management and Emergency Response Training Center at Hanford. Westinghouse Hanford Company conducted the study at the request of the US Department of Energy-Richland Operations Office. The study resulted from an initiative by Tri-Cities, Washington area municipal and county entities proposing such a training center. The initiative responded to changes in federal law requiring extensive and specific personnel training for response to incidents involving hazardous materials. Washington Congressman Sid Morrison requested the Department of Energy-Richland Operations Office to evaluate the initiative as a potential business opportunity for the Department of Energy's Hanford Site. 8 refs., 6 figs., 1 tab.

  5. Proportional hazards regression in epidemiologic follow-up studies: an intuitive consideration of primary time scale.

    PubMed

    Cologne, John; Hsu, Wan-Ling; Abbott, Robert D; Ohishi, Waka; Grant, Eric J; Fujiwara, Saeko; Cullings, Harry M

    2012-07-01

    In epidemiologic cohort studies of chronic diseases, such as heart disease or cancer, confounding by age can bias the estimated effects of risk factors under study. With Cox proportional-hazards regression modeling in such studies, it would generally be recommended that chronological age be handled nonparametrically as the primary time scale. However, studies involving baseline measurements of biomarkers or other factors frequently use follow-up time since measurement as the primary time scale, with no explicit justification. The effects of age are adjusted for by modeling age at entry as a parametric covariate. Parametric adjustment raises the question of model adequacy, in that it assumes a known functional relationship between age and disease, whereas using age as the primary time scale does not. We illustrate this graphically and show intuitively why the parametric approach to age adjustment using follow-up time as the primary time scale provides a poor approximation to age-specific incidence. Adequate parametric adjustment for age could require extensive modeling, which is wasteful, given the simplicity of using age as the primary time scale. Furthermore, the underlying hazard with follow-up time based on arbitrary timing of study initiation may have no inherent meaning in terms of risk. Given the potential for biased risk estimates, age should be considered as the preferred time scale for proportional-hazards regression with epidemiologic follow-up data when confounding by age is a concern. PMID:22517300
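The two time scales contrasted in the abstract lead to different risk sets in the Cox partial likelihood: on the age scale, a subject enters the risk set only after their entry age (left truncation), whereas on the follow-up-time scale everyone is at risk from time zero. A toy sketch with three hypothetical subjects (illustrative data, not the paper's cohort):

```python
# Risk sets under two time scales for Cox regression
# (hypothetical subjects; ages in years).

def risk_set_age_scale(subjects, t):
    """Risk set at age t with delayed entry (left truncation)."""
    return [s["id"] for s in subjects
            if s["entry_age"] < t <= s["exit_age"]]

def risk_set_followup_scale(subjects, u):
    """Risk set at follow-up time u since the baseline measurement."""
    return [s["id"] for s in subjects
            if (s["exit_age"] - s["entry_age"]) >= u]

subjects = [
    {"id": "A", "entry_age": 50, "exit_age": 60, "event": True},
    {"id": "B", "entry_age": 62, "exit_age": 70, "event": False},
    {"id": "C", "entry_age": 55, "exit_age": 58, "event": True},
]

# C's event at age 58: B (entered at age 62) is not yet at risk.
print(risk_set_age_scale(subjects, 58))      # ['A', 'C']
# The same event at follow-up time 3: all three subjects are at risk.
print(risk_set_followup_scale(subjects, 3))  # ['A', 'B', 'C']
```

Because the risk sets differ, the two analyses compare different people at each event, which is why parametric adjustment for entry age on the follow-up scale can only approximate the age-specific incidence that the age scale handles nonparametrically.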

  6. Social and ethical perspectives of landslide risk mitigation measures

    NASA Astrophysics Data System (ADS)

    Kalsnes, Bjørn; Vangelsten, Bjørn V.

    2015-04-01

    Landslide risk may be mitigated by use of a wide range of measures. Mitigation and prevention options may include (1) structural measures to reduce the frequency, severity or exposure to the hazard, (2) non-structural measures, such as land-use planning and early warning systems, to reduce the hazard frequency and consequences, and (3) measures to pool and transfer the risks. In a given situation the appropriate system of mitigation measures may be a combination of various types of measures, both structural and non-structural. In the process of choosing mitigation measures for a given landslide risk situation, the role of the geoscientist is normally to propose possible mitigation measures on the basis of the risk level and technical feasibility. Social and ethical perspectives are often neglected in this process. However, awareness of the need to consider social as well as ethical issues in the design and management of landslide risk mitigation is rising. There is a growing understanding that technical experts acting alone cannot determine what will be considered the appropriate set of mitigation and prevention measures. Issues such as environment versus development, questions of acceptable risk, who bears the risks and benefits, and who makes the decisions, also need to be addressed. Policymakers and stakeholders engaged in solving environmental risk problems are increasingly recognising that traditional expert-based decision-making processes are insufficient. This paper analyses the process of choosing appropriate measures to mitigate landslide risk from a social and ethical perspective, considering technical, cultural, economic, environmental and political elements. The paper focuses on stakeholder involvement in the decision-making process, and shows how developing strategies for risk communication is key to a successful process. The study is supported by case study examples from Norway and Italy. 
In the Italian case study, three different risk mitigation options were presented to the local community. The options were based on a thorough stakeholder involvement process that ended up in three different views on how to deal with the landslide risk situation: i) protect lives and properties (hierarchical); ii) careful stewardship of the mountains (egalitarian); and iii) rational individual choice (individualist).

  7. A web-based tool for ranking landslide mitigation measures

    NASA Astrophysics Data System (ADS)

    Lacasse, S.; Vaciago, G.; Choi, Y. J.; Kalsnes, B.

    2012-04-01

    As part of the research done in the European project SafeLand "Living with landslide risk in Europe: Assessment, effects of global change, and risk management strategies", a compendium of structural and non-structural mitigation measures for different landslide types in Europe was prepared, and the measures were assembled into a web-based "toolbox". Emphasis was placed on providing a rational and flexible framework applicable to existing and future mitigation measures. The purpose of the web-based toolbox is to assist decision-making and to guide the user in the choice of the most appropriate mitigation measures. The mitigation measures were classified into three categories, describing whether the mitigation measures addressed the landslide hazard, the vulnerability or the elements at risk themselves. The measures considered include structural measures reducing the hazard and non-structural mitigation measures reducing either the hazard or the consequences (the vulnerability and exposure of elements at risk). The structural measures include surface protection and control of surface erosion; measures modifying the slope geometry and/or mass distribution; measures modifying the surface water regime - surface drainage; measures modifying the groundwater regime - deep drainage; measures modifying the mechanical characteristics of the unstable mass; transfer of loads to more competent strata; retaining structures (to modify slope geometry and/or to transfer stress to a competent layer); deviating the path of landslide debris; dissipating the energy of debris flows; and arresting and containing landslide debris or rock fall. The non-structural mitigation measures, reducing either the hazard or the consequences, include: early warning systems; restricting or discouraging construction activities; increasing the resistance or coping capacity of elements at risk; relocation of elements at risk; and sharing of risk through insurance. 
The measures are described in the toolbox with fact sheets providing a brief description, guidance on design, schematic details, practical examples and references for each mitigation measure. Each of the measures was given a score on its ability and applicability for different types of landslides and boundary conditions, and a decision support matrix was established. The web-based toolbox organizes the information in the compendium and provides an algorithm to rank the measures on the basis of the decision support matrix, and on the basis of the risk level estimated at the site. The toolbox includes a description of the case under study and offers a simplified option for estimating the hazard and risk levels of the slide at hand. The user selects the mitigation measures to be included in the assessment. The toolbox then ranks, with built-in assessment factors and weights and/or with user-defined ranking values and criteria, the mitigation measures included in the analysis. The toolbox includes data management, e.g. saving data half-way in an analysis, returning to an earlier case, looking up prepared examples or looking up information on mitigation measures. The toolbox also generates a report and has user-forum and help features. The presentation will give an overview of the mitigation measures considered and examples of the use of the toolbox, and will take the attendees through the application of the toolbox.
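The ranking step described above (a decision support matrix filtered by landslide type, scored with built-in or user-defined weights) can be sketched as follows. The measure names come from the compendium categories listed in the abstract, but the applicability sets, scores, and weights are invented for illustration, not the toolbox's actual matrix:

```python
# Hypothetical decision-matrix ranking in the spirit of the SafeLand
# toolbox; scores, weights, and applicability sets are invented.

measures = {
    "surface drainage":    {"types": {"slide", "flow"},         "hazard_reduction": 3, "cost_score": 2},
    "retaining structure": {"types": {"slide"},                 "hazard_reduction": 4, "cost_score": 1},
    "early warning":       {"types": {"slide", "flow", "fall"}, "hazard_reduction": 2, "cost_score": 3},
}

def rank(landslide_type, w_hazard=0.7, w_cost=0.3):
    """Rank measures applicable to a landslide type by weighted score.

    Higher hazard_reduction and cost_score (cheaper) are both better;
    the weights mimic user-defined ranking criteria.
    """
    applicable = {m: v for m, v in measures.items()
                  if landslide_type in v["types"]}
    return sorted(applicable,
                  key=lambda m: (w_hazard * applicable[m]["hazard_reduction"]
                                 + w_cost * applicable[m]["cost_score"]),
                  reverse=True)

print(rank("slide"))  # ['retaining structure', 'surface drainage', 'early warning']
print(rank("fall"))   # ['early warning']
```

The real toolbox additionally folds in the estimated hazard and risk level of the case at hand, but the filter-score-sort pattern above is the essence of a decision support matrix.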

  8. Effects of Long-Term Land Subsidence on the Flood Hazard- A Case Study in the Southwest Coastal Area, Taiwan

    NASA Astrophysics Data System (ADS)

    Chang, Yin-Lung; Tsai, Tung-Lin; Yang, Jinn-Chuang; Chiu, Hsin-Yu

    2014-05-01

    Typically, flood hazard assessments are conducted for the current state of the topography. However, the spatial-temporal variability of land subsidence should be considered when assessing the flood hazard in land-subsidence-prone areas. This study numerically investigated the effects of pumping-induced land subsidence on the freeboard and inundation depth in the southwest coastal area of Taiwan. Firstly, the spatial distribution of accumulative land subsidence between 2012 and 2021 was predicted by a process-based land subsidence model. The digital elevation model (DEM) and channel geometry in 2021 were produced based on the predicted land subsidence field along with the present topography data. The freeboard and inundation depth before and after ten years of land subsidence were simulated with the SOBEK Suite (1D channel flow coupled with 2D overland flow). The analysis of freeboard showed that, except in the extremely low-lying area where the elevation is lower than the spring high tide level, the change of freeboard after ten years of land subsidence is mostly influenced by the spatial variation of the land subsidence field. A higher rate of change of the land subsidence field along the channel direction induces a more significant change of freeboard. Moreover, the freeboard at a cross section tends to decrease after ten years of land subsidence if the land subsidence decreases in the channel downstream direction, and vice versa. However, the decreased (or increased) freeboard at a cross section is typically less than 0.3 times the magnitude of land subsidence at the same cross section. The spatial variation of the land subsidence field also significantly influences the change of inundation depth outside the extremely low-lying area. The inundation depth at a computation grid tends to increase after ten years of land subsidence if the land subsidence at this grid is greater than that at its neighbouring grids, and vice versa.
However, the increased (or decreased) inundation depth at a grid is significantly less than the magnitude of land subsidence at the same grid. Unlike the changes of freeboard and inundation depth outside the extremely low-lying area, the freeboard and inundation depth within the extremely low-lying area always decrease and increase, respectively, after ten years of land subsidence. Furthermore, the decreased freeboard at a channel cross section or the increased inundation depth at a grid is typically 0.8 to 1.0 times the magnitude of land subsidence at the same cross section or grid. To mitigate the effect of pumping-induced land subsidence on the flood hazard, a groundwater quantity management model was developed. The management model determines the optimal pumping patterns which prevent the flood hazard from increasing due to long-term subsidence while satisfying the groundwater demand. This study showed that for a coastal area with a potential land subsidence problem, the spatial-temporal variability of future land subsidence should be quantified and incorporated into flood management.
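The quantitative rules reported in this abstract (freeboard change bounded by roughly 0.3 times the local subsidence outside the extremely low-lying area, and 0.8 to 1.0 times within it) can be expressed as a back-of-envelope bound. The function below only restates those reported factors; the elevation threshold logic and the numeric inputs are invented for illustration.

```python
# Back-of-envelope bounds on freeboard change after subsidence, using the
# 0.3x and 0.8-1.0x factors reported in the abstract. Inputs are hypothetical.

def freeboard_change_bounds(subsidence_m, elevation_m, spring_high_tide_m):
    """Return (lower, upper) bound on freeboard change in metres
    (negative values mean a loss of freeboard)."""
    if elevation_m < spring_high_tide_m:
        # extremely low-lying area: freeboard always decreases,
        # by 0.8 to 1.0 times the local subsidence
        return (-1.0 * subsidence_m, -0.8 * subsidence_m)
    # elsewhere the change can go either way, bounded by 0.3x the subsidence
    return (-0.3 * subsidence_m, 0.3 * subsidence_m)

lo, hi = freeboard_change_bounds(subsidence_m=0.5, elevation_m=0.2,
                                 spring_high_tide_m=1.1)
print(lo, hi)   # low-lying case: freeboard drops by 0.4 to 0.5 m
```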

  9. Assessment and indirect adjustment for confounding by smoking in cohort studies using relative hazards models.

    PubMed

    Richardson, David B; Laurier, Dominique; Schubauer-Berigan, Mary K; Tchetgen Tchetgen, Eric; Cole, Stephen R

    2014-11-01

    Workers' smoking histories are not measured in many occupational cohort studies. Here we discuss the use of negative control outcomes to detect and adjust for confounding in analyses that lack information on smoking. We clarify the assumptions necessary to detect confounding by smoking and the additional assumptions necessary to indirectly adjust for such bias. We illustrate these methods using data from 2 studies of radiation and lung cancer: the Colorado Plateau cohort study (1950-2005) of underground uranium miners (in which smoking was measured) and a French cohort study (1950-2004) of nuclear industry workers (in which smoking was unmeasured). A cause-specific relative hazards model is proposed for estimation of indirectly adjusted associations. Among the miners, the proposed method suggests no confounding by smoking of the association between radon and lung cancer--a conclusion supported by adjustment for measured smoking. Among the nuclear workers, the proposed method suggests substantial confounding by smoking of the association between radiation and lung cancer. Indirect adjustment for confounding by smoking resulted in an 18% decrease in the adjusted estimated hazard ratio, yet this cannot be verified because smoking was unmeasured. Assumptions underlying this method are described, and a cause-specific proportional hazards model that allows easy implementation using standard software is presented. PMID:25245043
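For intuition about what "indirect adjustment for confounding by smoking" involves, the classical (Axelson-style) calculation for a binary unmeasured confounder is sketched below. This is a simplified illustration, not the negative-control estimator of the abstract, and the prevalences and rate ratios are hypothetical.

```python
# Classical indirect adjustment for an unmeasured binary confounder (smoking).
# All numeric inputs are hypothetical; this is not the paper's estimator.

def bias_factor(p_exposed, p_unexposed, rr_smoking):
    """Bias in the exposure-outcome rate ratio induced by a binary confounder.

    p_exposed / p_unexposed -- smoking prevalence among exposed/unexposed workers
    rr_smoking              -- rate ratio for smoking on the outcome
    """
    num = p_exposed * rr_smoking + (1.0 - p_exposed)
    den = p_unexposed * rr_smoking + (1.0 - p_unexposed)
    return num / den

# Hypothetical inputs: 60% vs 50% smoking prevalence, smoking rate ratio 10.
crude_hr = 1.5
b = bias_factor(p_exposed=0.6, p_unexposed=0.5, rr_smoking=10.0)
adjusted_hr = crude_hr / b
print(round(b, 3), round(adjusted_hr, 3))   # 1.164 1.289
```

Dividing the crude hazard ratio by the bias factor removes the confounding implied by the assumed prevalences, which is the same general move the paper makes, there driven by negative-control outcomes rather than assumed smoking data.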

  11. Proportional hazards regression for the analysis of clustered survival data from case-cohort studies

    PubMed Central

    Schaubel, Douglas E.; Kalbfleisch, John D.

    2015-01-01

    Summary Case-cohort sampling is a commonly used and efficient method for studying large cohorts. Most existing methods of analysis for case-cohort data have concerned the analysis of univariate failure time data. However, clustered failure time data are commonly encountered in public health studies. For example, patients treated at the same center are unlikely to be independent. In this article, we consider methods based on estimating equations for case-cohort designs for clustered failure time data. We assume a marginal hazards model, with a common baseline hazard and common regression coefficient across clusters. The proposed estimators of the regression parameter and cumulative baseline hazard are shown to be consistent and asymptotically normal, and consistent estimators of the asymptotic covariance matrices are derived. The regression parameter estimator is easily computed using any standard Cox regression software that allows for offset terms. The proposed estimators are investigated in simulation studies, and demonstrated empirically to have increased efficiency relative to some existing methods. The proposed methods are applied to a study of mortality among Canadian dialysis patients. PMID:20560939
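The remark that the regression parameter "is easily computed using any standard Cox regression software that allows for offset terms" rests on the identity exp(b*x + log w) = w * exp(b*x): entering log sampling weights as offsets makes an unweighted Cox fit behave like a weighted one. The toy Breslow-type log partial likelihood below (invented data and weights) demonstrates only that identity; it is not the paper's full clustered estimating-equation procedure.

```python
import math

def log_partial_likelihood(beta, data, use_offset):
    """Breslow log partial likelihood; data = [(time, event, x, weight), ...]."""
    ll = 0.0
    for t_i, event, x_i, w_i in data:
        if not event:
            continue
        risk_set = [(x, w) for (t, _, x, w) in data if t >= t_i]
        if use_offset:
            # offset formulation: linear predictor is beta*x + log(w)
            num = beta * x_i + math.log(w_i)
            den = sum(math.exp(beta * x + math.log(w)) for x, w in risk_set)
        else:
            # explicit-weight formulation: weight multiplies exp(beta*x)
            num = math.log(w_i * math.exp(beta * x_i))
            den = sum(w * math.exp(beta * x) for x, w in risk_set)
        ll += num - math.log(den)
    return ll

# Toy data: (time, event indicator, covariate, inverse sampling weight).
data = [(2.0, 1, 1.0, 4.0), (3.0, 0, 0.0, 1.0), (5.0, 1, 0.5, 4.0)]
a = log_partial_likelihood(0.7, data, use_offset=True)
b = log_partial_likelihood(0.7, data, use_offset=False)
print(abs(a - b) < 1e-9)   # True: the two formulations coincide
```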

  12. Space Debris & its Mitigation

    NASA Astrophysics Data System (ADS)

    Kaushal, Sourabh; Arora, Nishant

    2012-07-01

    Space debris has become a growing concern in recent years, since collisions at orbital velocities can be highly damaging to functioning satellites and can also produce even more space debris in the process. Some spacecraft, like the International Space Station, are now armored to deal with this hazard, but armor and mitigation measures can be prohibitively costly when trying to protect satellites or human spaceflight vehicles like the shuttle. This paper describes the current orbital debris environment, outlines its main sources, and identifies mitigation measures to reduce orbital debris growth by controlling these sources. We reviewed the literature on space debris, propose some methods to address the problem, and highlight the shortcomings of methods already proposed by space experts, along with modifications to those methods; some of the existing methods can be very effective for space debris mitigation, while others need modification. Methods recently proposed by space experts include maneuvering, shielding of a space elevator with foil, vaporizing or redirecting space debris back to Earth with lasers, the use of aerogel as a protective layer, the construction of large junkyards around the International Space Station, the use of electrodynamic tethers, and, most recently, the use of nanosatellites to clear space debris. Limitations of the already proposed methods are as follows: - Maneuvering cannot be the final solution to the problem, as it is only an act of self-defence. - Shielding cannot be applied to parts such as solar panels and optical devices. - Vaporizing or redirecting space debris can affect human life on Earth if it is not done properly. - Aerogel has a threshold limit up to which it can resist the impact of a collision. - Large junkyards can be effective only for large-sized debris. In this paper we propose: A.
The use of nanotubes to create a mesh: in this technique, a mesh of nanotubes acts like the touch panel of a touch-screen cell phone. When any small particle touches the mesh, the corresponding processor or sensor registers its coordinates, and a destructive laser beam can then be used to destroy the particle. B. The use of nanotubes and nanobots for the collection of space debris: in this method, too, a mesh made of nanotubes is arranged to act as a touch panel, as in touch-screen phones. When tiny particles strike the mesh, nanobots at the registered coordinates collect the particles and store them in a garbage store. C. Space debris can also be put to other uses: since space debris can be any tiny particle in space, instead of decomposing or destroying the particles we could use them for energy production in fuel cells, on the condition that the particle material is capable of forming an ionized liquid or solution that can be used successfully in a fuel cell. This is useful only for large projects in which even the smallest amount of energy is in great demand. D. Recycling of space debris: the general idea of making space structures by recycling space debris is to capture the aluminum of the upper stages, melt it, and form it into new aluminum structures, perhaps by coating the inside of inflatable balloons, to make very large structures of thin aluminum shells. CONCLUSION: Space debris has become a topic of great concern in recent years. The creation of space debris cannot be stopped completely, but it can be minimized by adopting certain measures.
Many methods of space debris mitigation have been proposed by space experts, but some of them have limitations. After modification, those measures could prove beneficial for space debris mitigation. Some new methods of space debris mitigation are proposed in this paper, including the nanobot and nanotube mesh techniques, together with the use of debris for energy production and for making space structures. We end this paper by appealing that ``we have already polluted our own planet Earth; we should now ensure that space is kept as unpolluted as possible, for our own safe exploration of outer space and also for the safety of aliens from other planets, if they happen to exist.''

  13. Analytic study to evaluate associations between hazardous waste sites and birth defects. Final report

    SciTech Connect

    Marshall, E.G.; Gensburg, L.J.; Geary, N.S.; Deres, D.A.; Cayo, M.R.

    1995-06-01

    A study was conducted to evaluate the risk of two types of birth defects (central nervous system and musculoskeletal defects) associated with mothers' exposure to solvents, metals, and pesticides through residence near hazardous waste sites. The only environmental factor showing a statistically significant elevation in risk was living within one mile of industrial or commercial facilities emitting solvents into the air. Residence near these facilities showed elevated risk for central nervous system defects but no elevated risks for musculoskeletal defects.

  14. Hazards theory and the Bhopal tragedy

    SciTech Connect

    Bogard, W.C.

    1986-01-01

    The recent industrial tragedy in Bhopal, India is placed within the framework of a general theory of hazards that could explain it. In that tragedy, resulting from an accidental release of toxic chemicals from Union Carbide of India's pesticide production facility, over 2500 persons lost their lives and thousands more face possible long-term negative effects on their health and livelihood. The Bhopal tragedy is a painful reminder of disturbing facts about the hazards of global technologies in the 20th century. For the poor and disadvantaged of the Third World, these technologies are ostensibly designed to intervene in the hazards process and improve the overall quality of life. Yet evidence suggests that just the opposite may in fact be occurring. The environment may be becoming more hazardous as a result of interventions, and this hazardousness may itself be disproportionately placed on the shoulders of the poor and disadvantaged of Third World nations. A theory of hazards is proposed in which the possibility of detecting environmental threats forms a key concept for explaining the rise of hazardousness and vulnerability among the poor. Without basic detection capabilities, it is argued that all other social attempts to reduce or mitigate the potential for harm from hazards in the environment automatically fail and leave persons exposed to increased risks. This dissertation links a case study of the Bhopal tragedy and hazards theory to suggestions for enhancing the possibility of detecting hazards. If the Bhopal tragedy is not to repeat itself, hard decisions must be made concerning the irreversibilities, catastrophic potentials, and dependencies created by technologies. These decisions reflect the inevitable tradeoffs that are made in assessing issues of long term safety or harm.

  15. Reviewing and visualising relationships between anthropic processes and natural hazards within a multi-hazard framework

    NASA Astrophysics Data System (ADS)

    Gill, Joel C.; Malamud, Bruce D.

    2014-05-01

    Here we present a broad overview of the interaction relationships between 17 anthropic processes and 21 different natural hazard types. Anthropic processes are grouped into seven categories (subsurface extraction, subsurface addition, land use change, explosions, hydrological change, surface construction processes, miscellaneous). Natural hazards are grouped into six categories (geophysical, hydrological, shallow earth processes, atmospheric, biophysical and space). A wide-ranging review based on grey- and peer-reviewed literature from many scientific disciplines identified 54 relationships where anthropic processes have been noted to trigger natural hazards. We record case studies for all but three of these relationships. Based on the results of this review, we find that the anthropic processes of deforestation, explosions (conventional and nuclear) and reservoir construction could trigger the widest range of different natural hazard types. We also note that within the natural hazards, landslides and earthquakes are those that could be triggered by the widest range of anthropic processes. This work also examines the possibility of anthropic processes (i) resulting in an increased occurrence of a particular hazard interaction (e.g., deforestation could result in an increased interaction between storms and landslides); and (ii) inadvertently reducing the likelihood of a natural hazard or natural hazard interaction (e.g., poor drainage or deforestation reducing the likelihood of wildfires triggered by lightning). This study synthesises, using accessible visualisation techniques, the large amounts of anthropic process and natural hazard information from our review. In it we have outlined the importance of considering anthropic processes within any analysis of hazard interactions, and we reinforce the importance of a holistic approach to natural hazard assessment, mitigation and management.
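The review's core bookkeeping, a matrix of which anthropic processes can trigger which natural hazards, reduces to a simple adjacency structure that supports both queries mentioned above (widest-triggering process, most-triggerable hazard). The three entries below are an invented miniature subset for illustration, not the paper's full 17 x 21 matrix or its 54 relationships.

```python
from collections import Counter

# Tiny invented subset of a process -> triggerable-hazards matrix.
triggers = {
    "deforestation":          {"landslide", "flood", "wildfire"},
    "reservoir construction": {"earthquake", "landslide"},
    "subsurface extraction":  {"earthquake", "ground subsidence"},
}

# Process that can trigger the widest range of hazard types.
widest_process = max(triggers, key=lambda p: len(triggers[p]))

# How many processes can trigger each hazard type.
hazard_counts = Counter(h for hazards in triggers.values() for h in hazards)

print(widest_process)              # deforestation
print(hazard_counts["landslide"])  # 2
```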

  16. Incorporating induced seismicity in the 2014 United States National Seismic Hazard Model: results of the 2014 workshop and sensitivity studies

    USGS Publications Warehouse

    Petersen, Mark D.; Mueller, Charles S.; Moschetti, Morgan P.; Hoover, Susan M.; Rubinstein, Justin L.; Llenos, Andrea L.; Michael, Andrew J.; Ellsworth, William L.; McGarr, Arthur F.; Holland, Austin A.; Anderson, John G.

    2015-01-01

    The U.S. Geological Survey National Seismic Hazard Model for the conterminous United States was updated in 2014 to account for new methods, input models, and data necessary for assessing the seismic ground shaking hazard from natural (tectonic) earthquakes. The U.S. Geological Survey National Seismic Hazard Model project uses probabilistic seismic hazard analysis to quantify the rate of exceedance for earthquake ground shaking (ground motion). For the 2014 National Seismic Hazard Model assessment, the seismic hazard from potentially induced earthquakes was intentionally not considered because we had not determined how to properly treat these earthquakes for the seismic hazard analysis. The phrases "potentially induced" and "induced" are used interchangeably in this report; however, it is acknowledged that this classification is based on circumstantial evidence and scientific judgment. For the 2014 National Seismic Hazard Model update, the potentially induced earthquakes were removed from the NSHM's earthquake catalog, and the documentation states that we would consider alternative models for including induced seismicity in a future version of the National Seismic Hazard Model. As part of the process of incorporating induced seismicity into the seismic hazard model, we evaluate the sensitivity of the seismic hazard from induced seismicity to five parts of the hazard model: (1) the earthquake catalog, (2) earthquake rates, (3) earthquake locations, (4) earthquake Mmax (maximum magnitude), and (5) earthquake ground motions. We describe alternative input models for each of the five parts that represent differences in scientific opinions on induced seismicity characteristics. In this report, however, we do not weight these input models to come up with a preferred final model. Instead, we present a sensitivity study showing uniform seismic hazard maps obtained by applying the alternative input models for induced seismicity.
The final model will be released after further consideration of the reliability and scientific acceptability of each alternative input model. Forecasting the seismic hazard from induced earthquakes is fundamentally different from forecasting the seismic hazard for natural, tectonic earthquakes. This is because the spatio-temporal patterns of induced earthquakes are reliant on economic forces and public policy decisions regarding extraction and injection of fluids. As such, the rates of induced earthquakes are inherently variable and nonstationary. Therefore, we only make maps based on an annual rate of exceedance rather than the 50-year rates calculated for previous U.S. Geological Survey hazard maps.
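The closing distinction between annual rates of exceedance and 50-year measures comes down to the standard probabilistic-seismic-hazard conversion between a rate and a probability of exceedance, which assumes a stationary Poisson occurrence model, exactly the assumption induced seismicity violates. A minimal sketch, using the familiar 475-year return period as the worked number:

```python
import math

def prob_exceedance(annual_rate, years):
    """P(at least one exceedance in `years` years) under a Poisson model."""
    return 1.0 - math.exp(-annual_rate * years)

# A ground-motion level exceeded at an annual rate of 1/475 has roughly a
# 10% chance of being exceeded at least once in a 50-year window:
p = prob_exceedance(1 / 475, 50)
print(round(p, 3))   # 0.1
```

When the rate itself changes year to year, as with injection-driven earthquakes, this conversion has no fixed input, which is why the report presents annualized maps instead.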

  17. Mitigation of indirect environmental effects of GM crops.

    PubMed

    Pidgeon, J D; May, M J; Perry, J N; Poppy, G M

    2007-06-22

    Currently, the UK has no procedure for the approval of novel agricultural practices that is based on environmental risk management principles. Here, we make a first application of the 'bow-tie' risk management approach in agriculture, for assessment of land use changes, in a case study of the introduction of genetically modified herbicide tolerant (GMHT) sugar beet. There are agronomic and economic benefits, but indirect environmental harm from increased weed control is a hazard. The Farm Scale Evaluation (FSE) trials demonstrated reduced broad-leaved weed biomass and seed production at the field scale. The simplest mitigation measure is to leave a proportion of rows unsprayed in each GMHT crop field. Our calculations, based on FSE data, show that a maximum of 2% of field area left unsprayed is required to mitigate weed seed production and 4% to mitigate weed biomass production. Tilled margin effects could simply be mitigated by increasing the margin width from 0.5 to 1.5 m. Such changes are cheap and simple to implement in farming practices. This case study demonstrates the usefulness of the bow-tie risk management approach and the transparency with which hazards can be addressed. If adopted generally, it would help to enable agriculture to adopt new practices with due environmental precaution. PMID:17439853

  18. FDA-iRISK--a comparative risk assessment system for evaluating and ranking food-hazard pairs: case studies on microbial hazards.

    PubMed

    Chen, Yuhuan; Dennis, Sherri B; Hartnett, Emma; Paoli, Greg; Pouillot, Régis; Ruthman, Todd; Wilson, Margaret

    2013-03-01

    Stakeholders in the system of food safety, in particular federal agencies, need evidence-based, transparent, and rigorous approaches to estimate and compare the risk of foodborne illness from microbial and chemical hazards and the public health impact of interventions. FDA-iRISK (referred to here as iRISK), a Web-based quantitative risk assessment system, was developed to meet this need. The modeling tool enables users to assess, compare, and rank the risks posed by multiple food-hazard pairs at all stages of the food supply system, from primary production, through manufacturing and processing, to retail distribution and, ultimately, to the consumer. Using standard data entry templates, built-in mathematical functions, and Monte Carlo simulation techniques, iRISK integrates data and assumptions from seven components: the food, the hazard, the population of consumers, process models describing the introduction and fate of the hazard up to the point of consumption, consumption patterns, dose-response curves, and health effects. Beyond risk ranking, iRISK enables users to estimate and compare the impact of interventions and control measures on public health risk. iRISK provides estimates of the impact of proposed interventions in various ways, including changes in the mean risk of illness and burden of disease metrics, such as losses in disability-adjusted life years. Case studies for Listeria monocytogenes and Salmonella were developed to demonstrate the application of iRISK for the estimation of risks and the impact of interventions for microbial hazards. iRISK was made available to the public at http://irisk.foodrisk.org in October 2012. PMID:23462073
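The Monte Carlo risk-ranking idea described above can be sketched in a few lines: for each food-hazard pair, sample a dose, apply a dose-response model, and scale by consumption to get an expected burden. Every distribution and parameter below is invented for illustration; this is a toy in the spirit of iRISK, not its actual model or data.

```python
import math
import random

def mean_risk(log10_dose_mu, log10_dose_sigma, r, servings, n=100_000, seed=1):
    """Monte Carlo mean illness risk per serving, scaled by annual servings.

    Assumes a lognormal dose and an exponential dose-response
    P(ill | dose) = 1 - exp(-r * dose). All parameters are hypothetical.
    """
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        dose = 10 ** rng.gauss(log10_dose_mu, log10_dose_sigma)  # CFU/serving
        total += 1.0 - math.exp(-r * dose)
    return servings * total / n   # expected illnesses per year

pairs = {
    "deli meat / L. monocytogenes": mean_risk(2.0, 1.0, 1e-7, 5e8),
    "poultry / Salmonella":         mean_risk(1.0, 0.8, 1e-5, 2e9),
}
for name, burden in sorted(pairs.items(), key=lambda kv: -kv[1]):
    print(f"{name}: {burden:.3g} expected illnesses/year")
```

Comparing such burdens before and after a proposed intervention (e.g. a process step that shifts the dose distribution down) is the same move iRISK makes when it estimates the public health impact of control measures.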

  19. Study on anaerobic digestion treatment of hazardous colistin sulphate contained pharmaceutical sludge.

    PubMed

    Yin, Fubin; Wang, Dongling; Li, Zifu; Ohlsen, Thomas; Hartwig, Peter; Czekalla, Sven

    2015-02-01

    Pharmaceutical sludge is considered a hazardous substance with high treatment and disposal fees. Anaerobic digestion can not only transform the hazardous substance into activated sludge, but also generate valuable biogas. This research had two objectives: first, to study the feasibility of anaerobic digestion and determine the biochemical methane potential (BMP) of pharmaceutical sludge under different inoculum-to-substrate TS ratios (ISRs) of 0, 0.65, 2.58 and 10.32 under mesophilic conditions (37 ± 1 °C); and second, to investigate the removal efficiency of colistin sulphate during anaerobic digestion. The results showed that the use of anaerobic digestion to treat the pharmaceutical sludge is feasible and that it can completely eliminate the colistin sulphate. The highest biogas production from pharmaceutical sludge was 499.46 mL/g TS at an ISR of 10.32. PMID:25490101

  20. Field studies on exposure, effects, and risk mitigation of aquatic nonpoint-source insecticide pollution: a review.

    PubMed

    Schulz, Ralf

    2004-01-01

    Recently, much attention has been focused on insecticides as a group of chemicals combining high toxicity to invertebrates and fishes with low application rates, which complicates detection in the field. Assessment of these chemicals is greatly facilitated by the description and understanding of exposure, resulting biological effects, and risk mitigation strategies in natural surface waters under field conditions due to normal farming practice. More than 60 reports of insecticide-compound detection in surface waters due to agricultural nonpoint-source pollution have been published in the open literature during the past 20 years, about one-third of them having been undertaken in the past 3.5 years. Recent reports tend to concentrate on specific routes of pesticide entry, such as runoff, but there are very few studies on spray drift-borne contamination. Reported aqueous-phase insecticide concentrations are negatively correlated with the catchment size, and all concentrations of > 10 µg/L (19 out of 133) were found in smaller-scale catchments (< 100 km²). Field studies on effects of insecticide contamination often lack appropriate exposure characterization. About 15 of the 42 effect studies reviewed here revealed a clear relationship between quantified, non-experimental exposure and observed effects in situ, on abundance, drift, community structure, or dynamics. Azinphos-methyl, chlorpyrifos, and endosulfan were frequently detected at levels above those reported to reveal effects in the field; however, knowledge about effects of insecticides in the field is still sparse. Following a short overview of various risk mitigation or best management practices, constructed wetlands and vegetated ditches are described as risk mitigation strategies that have only recently been established for agricultural insecticides. Although only 11 studies are available, the results in terms of pesticide retention and toxicity reduction are very promising.
Based on the reviewed literature, recommendations are made for future research activities. PMID:15074794

  1. Status of volcanic hazard studies for the Nevada Nuclear Waste Storage Investigations

    SciTech Connect

    Crowe, B.M.; Vaniman, D.T.; Carr, W.J.

    1983-03-01

    Volcanism studies of the Nevada Test Site (NTS) region are concerned with hazards of future volcanism with respect to underground disposal of high-level radioactive waste. The hazards of silicic volcanism are judged to be negligible; hazards of basaltic volcanism are judged through research approaches combining hazard appraisal and risk assessment. The NTS region is cut obliquely by a N-NE trending belt of volcanism. This belt developed about 8 Myr ago following cessation of silicic volcanism and contemporaneous with migration of basaltic activity toward the southwest margin of the Great Basin. Two types of fields are present in the belt: (1) large-volume, long-lived basalt and local rhyolite fields with numerous eruptive centers and (2) small-volume fields formed by scattered basaltic scoria cones. Late Cenozoic basalts of the NTS region belong to the second field type. Monogenetic basalt centers of this region were formed mostly by Strombolian eruptions; Surtseyean activity has been recognized at three centers. Geochemically, the basalts of the NTS region are classified as straddle A-type basalts of the alkalic suite. Petrological studies indicate a volumetric dominance of evolved hawaiite magmas. Trace- and rare-earth-element abundances of younger basalt (<4 Myr) of the NTS region and southern Death Valley area, California, indicate an enrichment in incompatible elements, with the exception of rubidium. The conditional probability of recurring basaltic volcanism and disruption of a repository by that event is bounded by the range of 10^-8 to 10^-10 as calculated for a 1-yr period. Potential disruptive and dispersal effects of magmatic penetration of a repository are controlled primarily by the geometry of basalt feeder systems, the mechanism of waste incorporation in magma, and Strombolian eruption processes.
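For scale, the 1-yr disruption probability quoted above can be compounded over a long isolation period. Treating years as independent with a constant probability is a simplification (the abstract states only a conditional annual bound), but it shows the orders of magnitude involved:

```python
# Compounding an annual disruption probability over a long horizon,
# assuming independent, identically distributed years (a simplification).

def cumulative_prob(annual_p, years):
    """P(at least one disruption over `years`) under the independence assumption."""
    return 1.0 - (1.0 - annual_p) ** years

# The abstract's bounding annual probabilities over a 10,000-year horizon:
for p in (1e-8, 1e-10):
    print(f"{p:.0e}/yr -> {cumulative_prob(p, 10_000):.2e} per 10,000 yr")
```

Even at the upper bound, the cumulative probability over ten millennia stays near one in ten thousand.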

  2. Mitigation potential of horizontal ground coupled heat pumps for current and future climatic conditions: UK environmental modelling and monitoring studies

    NASA Astrophysics Data System (ADS)

    García González, Raquel; Verhoef, Anne; Vidale, Pier Luigi; Gan, Guohui; Wu, Yupeng; Hughes, Andrew; Mansour, Majdi; Blyth, Eleanor; Finch, Jon; Main, Bruce

    2010-05-01

    An increased uptake of alternative low or non-CO2 emitting energy sources is one of the key priorities for policy makers to mitigate the effects of environmental change. Relatively little work has been undertaken on the mitigation potential of Ground Coupled Heat Pumps (GCHPs) despite the fact that a GCHP could significantly reduce CO2 emissions from heating systems. It is predicted that under climate change the most probable scenario is for UK temperatures to increase and for winter rainfall to become more abundant; the latter is likely to cause a general rise in groundwater levels. Summer rainfall may reduce considerably, while vegetation type and density may change. Furthermore, recent studies underline the likelihood of an increase in the number of heat waves. Under such a scenario, GCHPs will increasingly be used for cooling as well as heating. These factors will affect long-term performance of horizontal GCHP systems and hence their economic viability and mitigation potential during their life span (~50 years). The seasonal temperature differences encountered in soil are harnessed by GCHPs to provide heating in the winter and cooling in the summer. The performance of a GCHP system will depend on technical factors (heat exchanger (HE) type, length, depth, and spacing of pipes), but also it will be determined to a large extent by interactions between the below-ground parts of the system and the environment (atmospheric conditions, vegetation and soil characteristics). Depending on the balance between extraction and rejection of heat from and to the ground, the soil temperature in the neighbourhood of the HE may fall or rise. The GROMIT project (GROund coupled heat pumps MITigation potential), funded by the Natural Environment Research Council (UK), is a multi-disciplinary research project, in collaboration with EarthEnergy Ltd., which aims to quantify the CO2 mitigation potential of horizontal GCHPs.
It considers changing environmental conditions and combines model predictions of soil moisture content and soil temperature with measurements at different GCHP locations over the UK. The combined effect of environment dynamics and horizontal GCHP technical properties on long-term GCHP performance will be assessed using a detailed land surface model (JULES: Joint UK Land Environment Simulator, Meteorological Office, UK) with additional equations embedded describing the interaction between GCHP heat exchangers and the surrounding soil. However, a number of key soil physical processes are currently not incorporated in JULES, such as groundwater flow, which, especially in lowland areas, can have an important effect on the heat flow between soil and HE. Furthermore, the interaction between HE and soil may also cause soil vapour and moisture fluxes. These will affect soil thermal conductivity and hence heat flow between the HE and the surrounding soil, which will in turn influence system performance. The project will address these issues. We propose to drive an improved version of JULES (with equations to simulate GCHP exchange embedded), with long-term gridded (1 km) atmospheric, soil and vegetation data (reflecting current and future environmental conditions) to reliably assess the mitigation potential of GCHPs over the entire domain of the UK, where uptake of GCHPs has been low traditionally. In this way we can identify areas that are most suitable for the installation of GCHPs. Only then can recommendations be made to local and regional governments, for example, on how to improve the mitigation potential in less suitable areas by adjusting GCHP configurations or design.

  3. Examination of Icing Induced Loss of Control and Its Mitigations

    NASA Technical Reports Server (NTRS)

    Reehorst, Andrew L.; Addy, Harold E., Jr.; Colantonio, Renato O.

    2010-01-01

    Factors external to the aircraft are often a significant causal factor in loss of control (LOC) accidents. In today's aviation world, very few accidents stem from a single cause; they typically have a number of causal factors that culminate in a LOC accident. Very often the "trigger" that initiates an accident sequence is an external environment factor. In a recent NASA statistical analysis of LOC accidents, aircraft icing was shown to be the most common external environmental LOC causal factor for scheduled operations. When investigating LOC accidents or incidents, aircraft icing causal factors can be categorized into groups of 1) in-flight encounter with super-cooled liquid water clouds, 2) take-off with ice contamination, or 3) in-flight encounter with high concentrations of ice crystals. As with other flight hazards, icing-induced LOC accidents can be prevented through avoidance, detection, and recovery mitigations. For icing hazards, avoidance can take the form of avoiding flight into icing conditions or avoiding the hazard of icing by making the aircraft tolerant to icing conditions. Icing detection mitigations can take the form of detecting icing conditions or detecting early performance degradation caused by icing. Recovery from icing-induced LOC requires flight crews or automated systems capable of accounting for reduced aircraft performance and degraded control authority during the recovery maneuvers. In this report we review the icing-induced LOC accident mitigations defined in a recent LOC study and, for each mitigation, describe a research topic required to enable or strengthen the mitigation. Many of these research topics are already included in ongoing or planned NASA icing research activities or are being addressed by members of the icing research community. These research activities are described, and the status of the ongoing or planned research to address the technology needs is discussed.

  4. Study on FPGA SEU Mitigation for the Readout Electronics of DAMPE BGO Calorimeter in Space

    NASA Astrophysics Data System (ADS)

    Shen, Zhongtao; Feng, Changqing; Gao, Shanshan; Zhang, Deliang; Jiang, Di; Liu, Shubin; An, Qi

    2015-06-01

    The BGO calorimeter, which provides a wide measurement range of the primary cosmic ray spectrum, is a key sub-detector of the Dark Matter Particle Explorer (DAMPE). The readout electronics of the calorimeter consist of 16 pieces of Actel ProASIC Plus FLASH-based FPGA, in which the design-level flip-flops and embedded block RAMs are single event upset (SEU) sensitive in the harsh space environment. Therefore, to comply with radiation hardness assurance (RHA), SEU mitigation methods, including partial triple modular redundancy (TMR), CRC checksum, and multi-domain reset, were analyzed and verified in a heavy-ion beam test. Composed of multi-level redundancy, an FPGA design with the characteristics of SEU tolerance and low resource consumption is implemented for the readout electronics.
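The core of the TMR technique mentioned above is a bitwise 2-of-3 majority vote over three copies of the same register, so that one upset copy is outvoted by the other two. A minimal sketch (in Python rather than HDL, purely to illustrate the voting logic; register widths and values are made up):

```python
def tmr_vote(a: int, b: int, c: int) -> int:
    """Bitwise 2-of-3 majority vote: for each bit position, the output
    bit is 1 iff at least two of the three input copies have a 1."""
    return (a & b) | (b & c) | (a & c)

# A single-event upset flips one bit in one of the three register copies;
# the vote restores the original value.
golden = 0b1011_0010
upset = golden ^ 0b0000_1000  # SEU flips bit 3 in one copy
assert tmr_vote(golden, golden, upset) == golden
```

In a real design the vote is instantiated per flip-flop (and CRC checksums guard the block RAMs), but the correction principle is exactly this expression.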

  5. Using respondents' uncertainty scores to mitigate hypothetical bias in community-based health insurance studies.

    PubMed

    Donfouet, Hermann Pythagore Pierre; Mahieu, Pierre-Alexandre; Malin, Eric

    2013-04-01

    Community-based health insurance has been implemented in several developing countries to help the poor gain access to adequate health-care services. Assessing what the poor are willing to pay (WTP) is of paramount importance for policymaking. The contingent valuation method, which relies on a hypothetical market, is commonly used for this purpose. But the hypothetical bias that is most often inherent in this method tends to bias the estimates upward and compromises policymaking. This paper uses respondents' uncertainty scores in an attempt to mitigate hypothetical bias in community-based health insurance in a rural setting in Cameroon. Uncertainty scores are often employed in single dichotomous choice surveys. An originality of the paper is to use such an approach in a double-bounded dichotomous choice survey. The results suggest that this instrument is effective at decreasing the mean WTP. PMID:22160944
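A common way to apply uncertainty scores in contingent valuation is certainty calibration: a "yes" answer whose self-reported certainty falls below a cutoff is recoded as "no" before estimating WTP. The sketch below shows that recoding step only; the threshold, scale, and data are illustrative assumptions, not the paper's exact rule:

```python
def recode_by_certainty(responses, scores, threshold=7):
    """Certainty calibration: keep a 'yes' (True) only if the respondent's
    certainty score (assumed 0-10 scale) meets the threshold; uncertain
    'yes' answers are treated as 'no' (False)."""
    return [r and (s >= threshold) for r, s in zip(responses, scores)]

yes_no = [True, True, False, True]
certainty = [9, 4, 8, 7]
recode_by_certainty(yes_no, certainty)  # -> [True, False, False, True]
```

Because uncertain "yes" answers are the main carrier of hypothetical bias, this recoding pushes the estimated mean WTP downward, consistent with the result the abstract reports.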

  6. Effectiveness of protected areas in mitigating fire within their boundaries: case study of Chiapas, Mexico.

    PubMed

    Román-Cuesta, María Rosa; Martínez-Vilalta, Jordi

    2006-08-01

    Since the severe 1982-1983 El Niño drought, recurrent burning has been reported inside tropical protected areas (TPAs). Despite the key role of fire in habitat degradation, little is known about the effectiveness of TPAs in mitigating fire incidence and burned areas. We used a GPS fire database (1995-2005) (n=3590 forest fires) obtained from the National Forest Commission to compare fire incidence (number of fires) and burned areas inside TPAs and their adjacent buffer areas in Southern Mexico (Chiapas). Burned areas inside parks ranged from 2% (Palenque) to 45% (Lagunas de Montebello) of a park's area, and the amount burned was influenced by two severe El Niño events (1998 and 2003). These two years together accounted for 67% and 46% of the total area burned in TPAs and buffers, respectively, during the period under analysis. Larger burned areas in TPAs than in their buffers were exclusively related to the extent of natural habitats (flammable area excluding agrarian and pasture lands). Higher fuel loads, together with access and extinction difficulties, were likely behind this trend. A higher incidence of fire in TPAs than in their buffers was exclusively related to anthropogenic factors such as higher road densities and agrarian extensions. Our results suggest that TPAs are failing to mitigate fire impacts, with both fire incidence and total burned areas being significantly higher in the reserves than in adjacent buffer areas. Management plans should consider those factors that facilitate fires in TPAs: the anthropogenic origin of fires, the sensitivity of TPAs to El Niño droughts, large fuel loads and fuel continuity inside parks, and limited financial resources. Consideration of these factors favors lines of action such as alternatives to the use of fire (e.g., the mucuna-maize system), climatic prediction to follow the evolution of El Niño, fuel management strategies that favor extinction practices, and the strengthening of local communities and ecotourism. PMID:16922224

  7. ERTS-1 flood hazard studies in the Mississippi River Basin. [Missouri, Mississippi, and Arkansas

    NASA Technical Reports Server (NTRS)

    Rango, A.; Anderson, A. T.

    1974-01-01

    The Spring 1973 Mississippi River flood was investigated using remotely sensed data from ERTS-1. Both manual and automatic analyses of the data indicate that ERTS-1 is extremely useful as a regional tool for flood and floodplain management. The maximum error of such flood area measurements is conservatively estimated to be less than five percent. Change detection analysis indicates that the flood had major impacts on soil moisture, land pattern stability, and vegetation stress. Flood hazard identification was conducted using photointerpretation techniques in three study areas along the Mississippi River using pre-flood ERTS-1 imagery down to 1:100,000 scale. Flood-prone area boundaries obtained from ERTS-1 were generally in agreement with flood hazard maps produced by the U.S. Army Corps of Engineers and the U.S. Geological Survey, although the latter are somewhat more detailed because of their larger scale. Initial results indicate that ERTS-1 digital mapping of flood-prone areas can be performed at scales of at least 1:62,500, which is comparable to conventional flood hazard map scales.

  8. Probabilistic tephra hazard maps for the Neapolitan area: Quantitative volcanological study of Campi Flegrei eruptions

    NASA Astrophysics Data System (ADS)

    Mastrolorenzo, G.; Pappalardo, L.; Troise, C.; Panizza, A.; de Natale, G.

    2008-07-01

    Tephra fall is a relevant hazard of the Campi Flegrei caldera (Southern Italy), due to the high vulnerability of the Naples metropolitan area to such an event. Here, tephra derive from magmatic as well as phreatomagmatic activity. On the basis of both new and literature data on known past eruptions (Volcanic Explosivity Index (VEI), grain size parameters, velocity at the vent, column heights and erupted mass) and factors controlling tephra dispersion (wind velocity and direction), 2D numerical simulations of fallout dispersion and deposition have been performed for a large number of case events. A Bayesian inversion has been applied to retrieve the best values of critical parameters (e.g., vertical mass distribution, diffusion coefficients, velocity at the vent) not directly inferable by volcanological study. Simulations are run in parallel on multiple processors to allow a fully probabilistic analysis on a very large catalogue preserving the statistical properties of the past eruptive history. Using the simulation results, hazard maps have been computed for different scenarios: an upper-limit scenario (worst-expected scenario), an eruption-range scenario, and a whole-eruption scenario. Results indicate that, although high hazard characterizes the Campi Flegrei caldera itself, the territory to the east of the caldera center, including the whole district of Naples, is exposed to high hazard values due to the dominant westerly winds. Consistent with the stratigraphic evidence on the nature of past eruptions, our numerical simulations reveal that even in the case of a subplinian eruption (VEI = 3), Naples is exposed to tephra fall thicknesses of some decimeters, thereby exceeding the critical limit for roof collapse. Because of the total number of people living in Campi Flegrei and the city of Naples (ca. two million inhabitants), the tephra fallout risk related to a plinian eruption of Campi Flegrei largely matches or exceeds the risk related to a similar eruption at Vesuvius.
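The probabilistic hazard maps described here ultimately reduce, at each map point, to an exceedance computation over the simulated catalogue: the probability-weighted fraction of scenarios whose deposit exceeds a critical threshold (e.g. the roof-collapse load). A minimal sketch with entirely illustrative numbers, not values from the study:

```python
def exceedance_probability(thicknesses_cm, weights, threshold_cm):
    """Probability that tephra thickness at a site exceeds a threshold,
    as the normalized probability-weighted fraction of simulated
    eruption scenarios whose deposit is thicker than the threshold."""
    total = sum(weights)
    return sum(w for t, w in zip(thicknesses_cm, weights)
               if t > threshold_cm) / total

# One simulated deposit thickness per scenario at a hypothetical site
# east of the caldera, with scenario probabilities from the catalogue.
sims = [2.0, 12.0, 35.0, 60.0]   # cm
probs = [0.6, 0.25, 0.1, 0.05]
p_roof = exceedance_probability(sims, probs, threshold_cm=30.0)
```

Repeating this per grid cell, for each scenario family, yields the scenario-specific hazard maps the abstract describes.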

  9. Effects of anthropogenic land-subsidence on river flood hazard: a case study in Ravenna, Italy

    NASA Astrophysics Data System (ADS)

    Carisi, Francesca; Domeneghetti, Alessio; Castellarin, Attilio

    2015-04-01

    Can differential land-subsidence significantly alter river flooding dynamics, and thus flood risk, in flood-prone areas? Many studies show how the lowering of coastal areas is closely related to an increase in flood hazard due to more severe tidal flooding and sea level rise. By contrast, the literature on the relationship between differential land-subsidence and possible alterations to the riverine flood hazard of inland areas is still sparse, even though several areas characterized by significant land-subsidence rates during the second half of the 20th century experienced an intensification in both inundation magnitude and frequency. This study investigates the possible impact of significant differential ground lowering on flood hazard in proximity of Ravenna, one of the oldest Italian cities, former capital of the Western Roman Empire, located a few kilometers from the Adriatic coast and about 60 km south of the Po River delta. The rate of land-subsidence in the area, naturally on the order of a few mm/year, dramatically increased up to 110 mm/year after World War II, primarily due to groundwater pumping and a number of deep onshore and offshore gas production platforms. The subsidence caused in the last century a cumulative drop larger than 1.5 m in the historical center of the city. Starting from this evidence, and taking advantage of a recent digital elevation model of 10 m resolution, we reconstructed the ground elevation in 1897 for an area of about 65 km2 around the city of Ravenna. We used these two digital elevation models (i.e. current topography and topographic reconstruction) with a 2D finite-element numerical model to simulate the inundation dynamics associated with several levee failure scenarios along the embankment system of the river Montone. For each scenario and digital elevation model, the flood hazard is quantified in terms of water depth, speed and dynamics of the flooding front.
The comparison enabled us to quantify alterations to the flooding hazard due to large and rapid differential land-subsidence, shedding some light on whether to consider anthropogenic land-subsidence among the relevant human-induced drivers of flood-risk change.
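The historical-vs-current comparison rests on a simple cell-by-cell DEM difference; a minimal sketch with toy 2x3 grids (elevations in metres are invented for illustration, only the ">1.5 m cumulative drop" magnitude echoes the study):

```python
def subsidence_map(dem_1897, dem_now):
    """Cell-by-cell ground lowering (m) between a reconstructed historical
    DEM and the current one; positive values mean subsidence."""
    return [[old - new for old, new in zip(row_old, row_new)]
            for row_old, row_new in zip(dem_1897, dem_now)]

dem_1897 = [[2.1, 2.0, 1.9],
            [2.3, 2.2, 2.0]]   # reconstructed 1897 elevations (m a.s.l.)
dem_now  = [[0.5, 0.8, 1.0],
            [1.1, 1.3, 1.4]]   # current 10 m-resolution DEM values (m a.s.l.)
drop = subsidence_map(dem_1897, dem_now)
max_drop = max(max(row) for row in drop)  # largest cumulative drop in the grid
```

Running the same 2D inundation model on both surfaces and differencing the resulting depth fields is what isolates the subsidence contribution to flood hazard.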

  10. Study on landslide hazard zonation based on factor weighting-rating theory in Slanic Prahova

    NASA Astrophysics Data System (ADS)

    Maftei, R.-M.; Vina, G.; Filipciuc, C.

    2012-04-01

    Studying the risks caused by landslides is important in the context of forecasting their triggering. This study mainly integrates background data related to historical and environmental factors as well as current triggering factors. The theory of zoning hazard caused by landslides, Landslide Hazard Zonation (LHZ), appeared in the 1960s. In this period the U.S. and many European countries began to use other trigger factors, besides the slope factor, in achieving hazard zoning. This theory has progressed due to the development of remote sensing and GIS technology, which were used to develop and analyse methods and techniques for combining data from different sources. The study of an area involves analysing the geographical position data, estimating the surface, the type of terrain, altitude, identifying the landslides in the area and some summary geological data. Data sources. The data used in this study are: · Landsat 7 satellite images of 30 m spatial resolution, from which the vegetation index is derived; · topographic maps at 1:25,000, from which the numerical altitude model (DEM) is obtained (used to calculate the slope and relative altitude of the land); · geological maps at 1:50,000. Studied factors. The main factors used and studied in achieving landslide hazard zoning are: the rate of displacement; the angle of slope; lithology; the index of vegetation or ground coverage by vegetation (NDVI); the river network; and the structural factor. 1. The calculation of the normalized vegetation index is based on Landsat ETM satellite images. This vegetation factor can be either a principal or a secondary trigger factor for landslides: in areas devoid of vegetation, landslides are triggered more often than in areas where coverage is greater. 2. The factors derived from the numerical model are the slope and the relative altitude. This operation was carried out using the 1:25,000 topographic map, from which the contour lines were extracted by digitization and then converted into points that were interpolated. Lithological and structural factors were extracted from the geological map by vectorization, and the hydrological factor from the topographic map and satellite imagery. 3. Weights selection. All these elements were transformed into raster format with a spatial resolution of 25 m. Each element was given an importance rating from 0-9, depending on its share in causing the phenomenon, and then a quantitative index based on each specific characteristic. A risk index was computed for each area separately. The LHZ classes will then be established from the important steps of the risk-index histogram. This work is presented within the framework of the SafeLand project funded by the EC (FP7).
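The factor weighting-rating step described above amounts, per raster cell, to a weighted linear combination of the factor ratings. A minimal per-cell sketch; the factor order, weights, and ratings are illustrative assumptions, not values from the study:

```python
def hazard_index(factor_ratings, weights):
    """Weighted average of factor ratings (0-9 scale), the per-cell core
    of the factor weighting-rating LHZ method."""
    return sum(r * w for r, w in zip(factor_ratings, weights)) / sum(weights)

# Hypothetical factor order: slope, lithology, NDVI (inverted, so bare
# ground rates high), river-network proximity, structural factor.
weights = [3, 2, 2, 1, 1]
cell_steep_bare = hazard_index([8, 6, 7, 4, 3], weights)  # high-risk cell
cell_gentle_veg = hazard_index([2, 3, 1, 2, 3], weights)  # low-risk cell
```

Applying this to every 25 m cell produces the continuous risk-index raster whose histogram is then sliced into the LHZ classes.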

  11. Studies On The Influence Of Soil Components On Adsorption-Desorption Of Hazardous Organics And Their Insitu Biodegradation

    NASA Astrophysics Data System (ADS)

    Khan, Z.

    2003-12-01

    Currently, approximately 155 cubic yards of soil are contaminated with hazardous organics in the Patancheru industrial area (Hyderabad, India). These hazardous organic contaminants are frequently part of hazardous waste disposed of on land, and the study of waste-site interaction is key to assessing the potential for offsite and onsite contamination. In the present study the authors report results on the adsorption, soil leaching potential and persistence of phenol, p-nitrophenol, 2,4-dichlorophenol and 4-chloro-2-nitrophenol, which are common constituents of the hazardous waste generated. The role of soil components such as organic matter, clay, and iron and aluminium oxides in the adsorption capacity has been studied. Desorption isotherms of soil-adsorbed hazardous organics exhibited hysteresis at high initial concentration, indicating the degree of irreversibility of the adsorption-desorption process. The leaching potential of the hazardous organics decreases with their increasing hydrophobicity and soil organic matter content, while their persistence in terms of half-life time (DT50) increases. In situ biodegradation has been carried out by developing mixed-culture systems which can degrade the phenols to complete mineralisation by utilizing them as the sole source of carbon, and the corresponding biodegradation kinetic constants were evaluated. Based on the data generated, preparing hazardous waste dumpsites with a suitable soil surface having a high holding capacity for organics, combined with in situ biodegradation by mixing with specific bacterial cultures enriched from different soils, can be exploited as a cost-effective technology for reclamation of contaminated sites.
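The persistence metric DT50 used above is conventionally tied to first-order degradation kinetics, C(t) = C0·exp(−k·t), giving DT50 = ln(2)/k. A minimal sketch; the rate constant is an invented example, not one of the paper's fitted values:

```python
import math

def dt50_from_rate(k_per_day):
    """Half-life DT50 (days) for first-order degradation C(t) = C0*exp(-k*t)."""
    return math.log(2) / k_per_day

def fraction_remaining(t_days, k_per_day):
    """Fraction of the initial concentration left after t days."""
    return math.exp(-k_per_day * t_days)

k = 0.05                       # hypothetical degradation rate constant, 1/day
half_life = dt50_from_rate(k)  # ~13.9 days
```

Under this model, a compound whose fitted k falls (e.g. with rising hydrophobicity or organic-matter binding) shows exactly the longer DT50 persistence the abstract reports.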

  12. Prediction of Ungauged River Basin for Hydro Power Potential and Flood Risk Mitigation; a Case Study at Gin River, Sri Lanka

    NASA Astrophysics Data System (ADS)

    Ratnayake, A. S.

    2011-12-01

    Most of the primary civilizations of the world emerged in or near river valleys or floodplains. River channels and floodplains form a single hydrologic and geomorphic system. The failure to appreciate the integral connection between floodplains and channels underlies many socioeconomic and environmental problems in river management today. However, collecting reliable field hydrological data is a difficult task. Under such conditions, either synthetic or statistically generated data were used for hydraulic engineering design and flood modeling. The fundamentals of the precipitation-runoff relationship, through a synthetic unit hydrograph for the Gin River basin, were prepared using the method of the Flood Studies Report of the National Environmental Research Council, United Kingdom (1975). A Triangular Irregular Network model was constructed using a Geographic Information System (GIS) to determine hazard-prone zones. The 1:10,000 and 1:50,000 topography maps and field excursions were also used for initial site selection of mini-hydro power units and to determine the flooding area. The turbine output power was calculated using the parameters of net head and turbine efficiency. The peak discharge is reached within 4.74 hours from the onset of the rainstorm, and the Gin River basin takes 11.95 hours to return to normal discharge conditions. The stream frequency of the Gin River is 4.56 junctions/km2, while the channel slope is 7.90 m/km. The regional coefficient of the catchment is 0.00296. A higher stream frequency and a gentle channel slope were recognized as the flood-triggering factors of the Gin River basin; other parameters, such as the basin's catchment area, main stream length, standard average annual rainfall and soil, do not show any significant variations from other catchments of Sri Lanka.
The flood management process, including control of flood disaster, preparedness for a flood, and minimizing its impacts, is complicated in floodplains encroached upon and modified by human populations. Thus modern GIS technology has been productively employed to prepare hazard maps based on the flood modeling, and it can be further utilized for disaster preparedness and mitigation activities. Five suitable hydraulic heads were recognized for mini-hydro power sites, which would be the most economical and applicable flood-controlling hydraulic engineering structures considering all morphologic, climatic, environmental and socioeconomic proxies of the study area. The mini-hydro power sites would also serve as a clean, eco-friendly and reliable energy source (8630.0 kW). Finally, the Francis turbine can be employed as the most efficient turbine for the selected sites, bearing in mind both technical and economical parameters.
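The turbine output calculation from net head and efficiency follows the standard hydropower relation P = ρ·g·Q·H·η. A minimal sketch; the discharge and head values are hypothetical single-site figures, not the study's (its 8630.0 kW is the total across the five sites):

```python
def turbine_power_kw(q_m3s, head_m, efficiency=0.9,
                     rho_kg_m3=1000.0, g_m_s2=9.81):
    """Hydropower output in kW: P = rho * g * Q * H * eta.
    Q is discharge (m^3/s), H is net head (m), eta the turbine efficiency."""
    return rho_kg_m3 * g_m_s2 * q_m3s * head_m * efficiency / 1000.0

# One hypothetical mini-hydro site on the Gin River:
p_site = turbine_power_kw(q_m3s=12.0, head_m=20.0)  # ~2119 kW
```

The choice of a Francis turbine fits this picture: it keeps η high over the medium-head, variable-discharge range typical of such sites.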

  13. Study on the Application of Probabilistic Tsunami Hazard Analysis for the Nuclear Power Plant Site in Korean Peninsula

    NASA Astrophysics Data System (ADS)

    Rhee, H. M.; Kim, M.; Sheen, D. H.; Choi, I. K.

    2014-12-01

    The necessity of studying tsunami hazard assessment for Nuclear Power Plant (NPP) sites has been highlighted since the Fukushima event of 2011. It is particularly emphasized because all of the NPPs on the Korean Peninsula are located in coastal regions. The tsunami hazard is expressed as the annual exceedance probability for wave heights. The methodology for tsunami hazard analysis is based on seismic hazard analysis. Seismic hazard analysis has been performed using both deterministic and probabilistic methods. Recently, the probabilistic method has received more attention than the deterministic method because the uncertainties of the hazard analysis can be considered through the logic tree approach. In this study, a probabilistic tsunami hazard analysis for the Uljin NPP site was performed using the information on fault sources published by the Atomic Energy Society of Japan (AESJ). The wave parameter is the parameter that differs most from seismic hazard analysis; it can be estimated from the results of tsunami propagation analysis. TSUNAMI_ver1.0, developed by the Japan Nuclear Energy Safety Organization (JNES), was used for the tsunami simulation. Eighty tsunami simulation cases were performed and the wave parameters were estimated. To reduce the sensitivity to the location of the sampling point, the wave parameters were estimated from groups of sampling points. The probability density function of the tsunami height was computed using the recurrence intervals and the wave parameters, and the exceedance probability distribution was then calculated from the probability density function. The tsunami hazards for the sampling groups were calculated. Fractile curves, which show the uncertainties of the input parameters, were estimated from the hazards using a round-robin algorithm. In general, tsunami hazard analysis focuses on the maximum wave heights, but the minimum wave height should also be considered for tsunami hazard analysis at an NPP site since it affects the water intake system. The results of the tsunami hazard analysis for the NPP site are presented as annual exceedance probabilities of wave heights. This study shows that the PTHA method can be applied for the estimation of tsunami wave heights at NPP sites.
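At its core, the PTHA combination sums, over fault sources, each source's annual rate times the conditional probability that its tsunami exceeds a given height at the site. A minimal sketch with invented source rates and step-function conditional probabilities standing in for the simulated wave-parameter distributions:

```python
def annual_exceedance(sources, height_m):
    """Annual rate of exceeding a wave height at the site:
    sum over sources of (annual occurrence rate) * P(H > h | source)."""
    return sum(src["annual_rate"] * src["p_exceed"](height_m)
               for src in sources)

# Two hypothetical fault sources (rates and conditionals are illustrative):
sources = [
    {"annual_rate": 1 / 500.0,  "p_exceed": lambda h: 0.8 if h < 3 else 0.2},
    {"annual_rate": 1 / 2000.0, "p_exceed": lambda h: 0.95 if h < 3 else 0.5},
]
rate_5m = annual_exceedance(sources, 5.0)  # events/yr with height > 5 m
```

Repeating this for each logic-tree branch, and taking fractiles across branches, yields the hazard and fractile curves the abstract describes.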

  14. GIS data for the Seaside, Oregon, Tsunami Pilot Study to modernize FEMA flood hazard maps

    USGS Publications Warehouse

    Wong, Florence L.; Venturato, Angie J.; Geist, Eric L.

    2007-01-01

    A Tsunami Pilot Study was conducted for the area surrounding the coastal town of Seaside, Oregon, as part of the Federal Emergency Management Agency's (FEMA) Flood Insurance Rate Map Modernization Program (Tsunami Pilot Study Working Group, 2006). The Cascadia subduction zone extends from Cape Mendocino, California, to Vancouver Island, Canada. The Seaside area was chosen because it is typical of many coastal communities subject to tsunamis generated by far- and near-field (Cascadia) earthquakes. Two goals of the pilot study were to develop probabilistic 100-year and 500-year tsunami inundation maps using Probabilistic Tsunami Hazard Analysis (PTHA) and to provide recommendations for improving tsunami hazard assessment guidelines for FEMA and state and local agencies. The study was an interagency effort by the National Oceanic and Atmospheric Administration, U.S. Geological Survey, and FEMA, in collaboration with the University of Southern California, Middle East Technical University, Portland State University, Horning Geoscience, Northwest Hydraulics Consultants, and the Oregon Department of Geological and Mineral Industries. The pilot study model data and results are published separately as a geographic information systems (GIS) data report (Wong and others, 2006). The flood maps and GIS data are briefly described here.

  15. Creative mitigation

    SciTech Connect

    Ayer, F.; Lagassa, G.

    1989-10-01

    On May 9, 1989, in front of a small but enthusiastic group composed of residents of Columbia Falls, Maine, and Downeast fishermen, a crew of Bangor Hydro-Electric employees removed some of the wooden sections of the Columbia Falls dam. The dam is located at the mouth of the Pleasant River on the Maine seacoast, only thirty miles from the border between Maine and New Brunswick, Canada. In so doing, they provided unobstructed access by Atlantic salmon to crucial upstream aquatic habitat for the first time since the dam was constructed in 1981. At the same time they made possible the efficient operation of a 13 MW hydroelectric facility some 75 miles inland at West Enfield, Maine, on the Penobscot River. This article describes the creative strategies used by Bangor Pacific Hydro Associates to satisfy environmental mitigation requirements at West Enfield, Maine.

  16. Macroscopic to microscopic studies of flue gas desulfurization byproducts for acid mine drainage mitigation

    SciTech Connect

    Robbins, E.I.; Kalyoncu, R.S.; Finkelman, R.B.; Matos, G.R.; Barsotti, A.F.; Haefner, R.J.; Rowe, G.L. Jr.; Savela, C.E.; Eddy, J.I.

    1996-12-31

    The use of flue gas desulfurization (FGD) systems to reduce SO2 emissions has resulted in the generation of large quantities of byproducts. These and other byproducts are being stockpiled at the very time that alkaline materials having high neutralization potential are needed to mitigate acid mine drainage (AMD). FGD byproducts are highly alkaline materials composed primarily of unreacted sorbents (lime or limestone) and sulfates and sulfites of Ca. The American Coal Ash Association estimated that approximately 20 million tons of FGD material were generated by electric power utilities equipped with wet lime-limestone FGD systems in 1993. Less than 5% of this material has been put to beneficial use for agricultural soil amendments and for the production of wallboard and cement. Four USGS projects are examining FGD byproduct use to address these concerns. These projects involve (1) calculating the volume of FGD byproduct generation and its geographic locations in relation to AMD, (2) determining byproduct chemistry and mineralogy, (3) evaluating the hydrology and geochemistry of atmospheric fluidized bed combustion byproduct as a soil amendment in Ohio, and (4) analyzing microbial degradation of gypsum in anoxic limestone drains in West Virginia.

  17. Viscoelastic Materials Study for the Mitigation of Blast-Related Brain Injury

    NASA Astrophysics Data System (ADS)

    Bartyczak, Susan; Mock, Willis, Jr.

    2011-06-01

    Recent preliminary research into the causes of blast-related brain injury indicates that exposure to blast pressures, such as from IED detonation or multiple firings of a weapon, causes damage to brain tissue resulting in Traumatic Brain Injury (TBI) and Post Traumatic Stress Disorder (PTSD). Current combat helmets are not sufficient to protect the warfighter from this danger, and the effects are debilitating, costly, and long-lasting. Commercially available viscoelastic materials, designed to dampen vibration caused by shock waves, might be useful as helmet liners to dampen blast waves. The objective of this research is to develop an experimental technique to test these commercially available materials when subjected to blast waves and to evaluate their blast-mitigating behavior. A 40-mm-bore gas gun is being used as a shock tube to generate blast waves (ranging from 1 to 500 psi) in a test fixture at the gun muzzle. A fast-opening valve is used to release nitrogen gas from the breech to impact instrumented targets. The targets consist of aluminum / viscoelastic polymer / aluminum sandwiches. Blast attenuation is determined through the measurement of pressure and accelerometer data in front of and behind the target. The experimental technique, calibration and checkout procedures, and results will be presented.
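Given front and back pressure gauges, a layup's blast attenuation is often summarized as a log-ratio of the two peaks. A minimal sketch of that reduction step; the gauge readings are invented for illustration, not data from these experiments:

```python
import math

def attenuation_db(p_incident, p_transmitted):
    """Attenuation of a target layup from peak pressures measured in front
    of and behind it, expressed in decibels (20*log10 of the amplitude ratio)."""
    return 20.0 * math.log10(p_incident / p_transmitted)

# Hypothetical peak readings (psi) for one aluminum/polymer/aluminum target:
att = attenuation_db(p_incident=100.0, p_transmitted=25.0)  # ~12 dB
```

Comparing this figure across candidate polymers, over the 1-500 psi range the gas gun covers, is exactly the ranking the experimental technique is meant to enable.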

  18. Study of cover source mismatch in steganalysis and ways to mitigate its impact

    NASA Astrophysics Data System (ADS)

    Kodovský, Jan; Sedighi, Vahid; Fridrich, Jessica

    2014-02-01

    When a steganalysis detector trained on one cover source is applied to images from a different source, the detection error generally increases due to the mismatch between the sources. In steganography, this situation is recognized as the so-called cover source mismatch (CSM). The drop in detection accuracy depends on many factors, including the properties of both sources, the detector construction, the feature space used to represent the covers, and the steganographic algorithm. Although well recognized as the single most important factor negatively affecting the performance of steganalyzers in practice, the CSM has received surprisingly little attention from researchers. One of the reasons for this is the diversity with which the CSM can manifest. In a series of experiments in the spatial and JPEG domains, we refute some of the common misconceptions that the severity of the CSM is tied to the feature dimensionality or the features' "fragility." The CSM impact on detection appears too difficult to predict due to the effect of complex dependencies among the features. We also investigate ways to mitigate the negative effect of the CSM using simple measures, such as enlarging the diversity of the training set (training on a mixture of sources) and employing a bank of detectors trained on multiple different sources, testing with the detector trained on the closest source.
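The "closest source" strategy in the detector-bank mitigation needs a rule for picking which trained detector to use; one simple choice is nearest feature-space centroid. A minimal sketch, with toy 3-D vectors standing in for real high-dimensional steganalysis features and invented source names:

```python
def closest_source(test_centroid, source_centroids):
    """Return the name of the training source whose feature centroid is
    nearest (squared Euclidean distance) to the test set's centroid."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(source_centroids,
               key=lambda name: dist2(test_centroid, source_centroids[name]))

# Hypothetical bank of per-source feature centroids:
banks = {"camera_A": [0.1, 0.4, 0.2],
         "camera_B": [0.9, 0.5, 0.7],
         "scanned":  [0.5, 0.5, 0.5]}
best = closest_source([0.85, 0.45, 0.65], banks)  # -> "camera_B"
```

The test images would then be steganalyzed with the detector trained on that source, rather than with a single mismatched one.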

  19. Numerical study of potential heat flux mitigation effects in the TCV snowflake divertor

    NASA Astrophysics Data System (ADS)

    Lunt, T.; Canal, G. P.; Duval, B. P.; Feng, Y.; Labit, B.; McCarthy, P.; Reimerdes, H.; Vijvers, W. A. J.; Wischmeier, M.

    2016-04-01

    We report on EMC3-Eirene simulations of the plasma and neutral particle transport in the TCV boundary layer for a series of snowflake (SF) equilibria characterized by the normalized poloidal flux coordinate ρ_x2 of the secondary X-point x2. We refer to a snowflake plus (SF+) for ρ_x2 < 1, a snowflake minus (SF−) for ρ_x2 > 1, and a single-null (SN) for |ρ_x2 − 1| ≫ 0. Four effects are identified that have the potential to mitigate the heat flux density at the outer strike point in an LFS SF−, where x2 is located on the low-field side of the primary X-point x1: (1) a scrape-off layer heat flux splitting, (2) an impurity radiation cloud forming at x2, (3) the increased connection length to the outer target, and (4) increased transport between x1 and x2. The LFS SF− is thus expected to tolerate a larger power flux P_sep over the separatrix than a comparable SN configuration.

  20. First Production of C60 Nanoparticle Plasma Jet for Study of Disruption Mitigation for ITER

    NASA Astrophysics Data System (ADS)

    Bogatu, I. N.; Thompson, J. R.; Galkin, S. A.; Kim, J. S.; Brockington, S.; Case, A.; Messer, S. J.; Witherspoon, F. D.

    2012-10-01

    The unique fast response and large mass-velocity delivery of nanoparticle plasma jets (NPPJs) provide a novel application for ITER disruption mitigation, runaway electron diagnostics and deep fueling. NPPJs carry a much larger mass than usual gases. An electromagnetic plasma gun provides a very high injection velocity (many km/s). An NPPJ has a much higher ram pressure than any standard gas injection method and penetrates the tokamak confining magnetic field. Assimilation is enhanced due to the nanoparticles' large surface-to-volume ratio. Radially expanding NPPJs help achieve toroidal uniformity of radiation power. FAR-TECH's NPPJ system was successfully tested: a coaxial plasma gun prototype (35 cm length, 96 kJ energy) using a solid state TiH2/C60 pulsed power cartridge injector produced a hyper-velocity (>4 km/s), high-density (>10^23 m^-3) C60 plasma jet in 0.5 ms, with a 1-2 ms overall response-delivery time. We present the TiH2/C60 cartridge injector output characterization (180 mg of sublimated C60 gas) and the first production results of a high-momentum C60 plasma jet (0.6 g·km/s).

  1. Implications of Adhesion Studies for Dust Mitigation on Thermal Control Surfaces

    NASA Technical Reports Server (NTRS)

    Gaier, James R.; Berkebile, Stephen P.

    2012-01-01

    Experiments measuring the adhesion forces under ultrahigh vacuum conditions (10^-10 torr) between a synthetic volcanic glass and commonly used space exploration materials have recently been described. The glass has a chemistry and surface structure typical of the lunar regolith. Van der Waals forces between the glass and common spacecraft materials were found to be negligible. Charge transfer between the materials was induced by mechanically striking the spacecraft material pin against the glass plate. No measurable adhesion occurred when striking the highly conducting materials; on striking insulating dielectric materials, however, the adhesion increased dramatically. This indicates that electrostatic forces dominate over van der Waals forces under these conditions. The presence of small amounts of surface contaminants was found to lower adhesive forces by at least two orders of magnitude, and perhaps more. Both particle and space exploration material surfaces will be cleaned by interaction with the solar wind and other energetic processes, and will stay clean because of the extremely high vacuum (10^-12 torr), so the atomically clean adhesion values are probably the relevant ones for the lunar surface environment. These results are used to interpret the results of dust mitigation technology experiments utilizing textured surfaces, work-function-matching surfaces, and brushing. They have also been used to reinterpret the results of the Apollo 14 Thermal Degradation Samples experiment.

  2. Towards the Seismic Hazard Reassessment of Paks NPP (Hungary) Site: Seismicity and Sensitivity Studies

    NASA Astrophysics Data System (ADS)

    Toth, Laszlo; Monus, Peter; Gyori, Erzsebet; Grenerczy, Gyula; Janos Katona, Tamas; Kiszely, Marta

    2015-04-01

    In the context of the extension of the Paks Nuclear Power Plant by new units, a comprehensive site seismic hazard evaluation program has been developed and has already been approved by the Hungarian authorities. It includes a 3D seismic survey, the drilling of several deep boreholes, extensive geological mapping, and geophysical investigations at the site and its vicinity, as well as on the near-regional and regional scales. Furthermore, all relevant techniques of modern space geodesy (GPS, PSInSAR) will also be utilized to construct a new seismotectonic model. The implementation of the project is still in progress. In the presentation, some important elements of the new seismic hazard assessment are highlighted, and some results obtained in the preliminary phase of the program are presented and discussed. The first and most important component of the program is the compilation of the seismological database, which is developed on different time scales zooming in on different event recurrence rates, such as paleo-earthquakes (10^-1/a). In 1995, Paks NPP installed and started to operate a sensitive microseismic monitoring network capable of locating earthquakes as small as magnitude 2.0 within about 100 km of the NPP site. During two decades of operation, the microseismic monitoring network located some 2,000 earthquakes within the region of latitude 45.5-49 N and longitude 16-23 E. Of the total number of events, 130 earthquakes were reported as 'felt events'. The largest earthquake was an event of ML 4.8, which caused significant damage in the epicentral area. The results of microseismic monitoring provide valuable data for seismotectonic modelling and result in more accurate earthquake recurrence equations. The first modern PSHA of the Paks NPP site was carried out in 1995. A complex site characterization project was implemented and hazard curves were evaluated for annual frequencies of 10^-3 to 10^-5. 
As a follow-up, the PSHA results have been reviewed and updated in the frame of periodic safety reviews, and the hazard characterization of the site has been confirmed. The hazard curves have been extended to lower-probability events, as required by the probabilistic safety analysis. These earlier projects resulted in mean PGAs of 0.22-0.26 g and 0.43-0.54 g at return periods of 10^4 and 10^5 years, respectively. The site effect and liquefaction probability have also been evaluated. As expected for a site with soft soil conditions, the amplification is greater at shorter periods for the lower-amplitude ground motion of the 10^4-year return period, and at longer periods for the higher-amplitude ground motion of the 10^5-year level. Further studies will be based on the improved regional seismotectonic model, state-of-the-art hazard evaluation software, and better knowledge of the local soil conditions. The preliminary results presented demonstrate the adequacy of the planned program and highlight the progress in the hazard assessment.
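The quoted PGA/return-period pairs trace out a hazard curve that is roughly linear in PGA versus the logarithm of the return period. As a hedged illustration (the 0.24 g and 0.49 g anchor points below are simply midpoints of the ranges quoted in the abstract, not results from the study), PGA at an intermediate return period can be interpolated like this:

```python
import math

# Hypothetical hazard-curve anchor points: (return period in years, mean PGA in g).
# These are midpoints of the abstract's quoted ranges, used for illustration only.
curve = [(1e4, 0.24), (1e5, 0.49)]

def pga_at(return_period):
    """Interpolate PGA assuming it varies linearly with log10(return period)."""
    (t1, a1), (t2, a2) = curve
    frac = (math.log10(return_period) - math.log10(t1)) / (math.log10(t2) - math.log10(t1))
    return a1 + frac * (a2 - a1)

print(f"PGA at a 3x10^4 yr return period ~ {pga_at(3e4):.2f} g")
```

The same log-linear reading also converts between annual exceedance frequency (e.g. 10^-4/a) and return period (10^4 years), the two conventions mixed in such abstracts.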

  3. A Hazard Assessment and Proposed Risk Index for Art, Architecture, Archive and Artifact Protection: Case Studies for Assorted International Museums

    NASA Astrophysics Data System (ADS)

    Kirk, Clara J.

    This study proposes a hazard/risk index for environmental, technological, and social hazards that may threaten a museum or other place of cultural storage and accession. The index can be utilized to measure the risk at the locations of these storage facilities in relationship to their geologic, geographic, environmental, and social settings. A model case study of the 1966 flood of the Arno River and its impact on the city of Florence and the Uffizi Gallery was used as the index focus. From this focus, an additional eleven museums and their related risk were assessed. Each index addressed a diverse range of hazards based on past frequency and magnitude. It was found that locations nearest a hazard had exceptionally high levels of risk; however, more distant locations could have influences that would increase their risk to levels similar to those of locations near the hazard. Locations not normally associated with a given natural hazard can be susceptible should the right conditions be met, and this research identified, compiled, and assessed the factors found to influence natural hazard risk at these research sites.

  4. Climate change and mitigation.

    PubMed

    Nibleus, Kerstin; Lundin, Rickard

    2010-01-01

    Planet Earth has experienced repeated changes of its climate throughout time. Periods warmer than today, as well as much colder ones during glacial episodes, have alternated. In our time, rapid population growth, with its increased demand for natural resources and energy, has made society increasingly vulnerable to environmental changes, both natural and those caused by man; human activity is clearly affecting the radiation balance of the Earth. In the session "Climate Change and Mitigation" the speakers offered four different views on coal and CO2: the basis for life, but also a major hazard with impact on Earth's climate. A common denominator in the presentations was that science and technology are required more than ever. We need not only to understand the mechanisms of climate change and climate variability; we also need to identify means to remedy the anthropogenic influence on Earth's climate. PMID:20873680

  5. Hazardous materials

    MedlinePLUS

    HazCom; Hazard communication; Material Safety Data Sheet; MSDS ... Hazardous materials are substances that could harm human health or the environment. Hazardous means dangerous, so these materials must ...

  6. Thermal study of payload module for the next-generation infrared space telescope SPICA in risk mitigation phase

    NASA Astrophysics Data System (ADS)

    Shinozaki, Keisuke; Sato, Yoichi; Sawada, Kenichiro; Ando, Makiko; Sugita, Hiroyuki; Yamawaki, Toshihiko; Mizutani, Tadahito; Komatsu, Keiji; Okazaki, Shun; Ogawa, Hiroyuki; Nakagawa, Takao; Matsuhara, Hideo; Takada, Makoto; Okabayashi, Akinobu; Tsunematsu, Shoji; Narasaki, Katsuhiro

    2014-08-01

    The Space Infrared Telescope for Cosmology and Astrophysics (SPICA) is a pre-project of JAXA, in collaboration with ESA, to be launched around 2025. The SPICA mission will be placed in a halo orbit around the second Lagrangian point of the Sun-Earth system, which allows effective radiant cooling to be used in combination with a mechanical cooling system to cool a 3 m class IR telescope below 6 K. The use of 4 K / 1 K class Joule-Thomson coolers is proposed in order to cool the telescope and provide 4 K / 1 K temperature regions for the Focal Plane Instruments (FPIs). This paper presents details of the thermal design study for the SPICA payload module in the Risk Mitigation Phase (RMP), in which activity is focused on mitigating the mission's highest risks. As a result of the RMP activity, most of the goals have been fully satisfied and the thermal design of the payload module has been dramatically improved.

  7. The newest achievements of studies on the reutilization, treatment, and disposal technology of hazardous wastes

    SciTech Connect

    Liu Peizhe

    1996-12-31

    From 1991 to 1996, key studies on the reutilization, treatment, and disposal technology of hazardous wastes were incorporated into the national plan for environmental protection science and technology. At present, the research achievements have been accomplished, have passed national approval, and have been accepted. The author of this paper, as leader of the national group for this research work, expounds the newest achievements of the studies, covering four parts: (1) reutilization technology for electroplating sludge, including the ion-exchange process for recovering the sludge and waste liquor to produce chromium tanning agent and to extract chromium and colloidal protein from tanning waste residue; the recovery of heavy metals from electroplating waste liquor by microbial purification; the demonstration project producing modified plastics from the sludge and waste plastics; and the demonstrated recovery of heavy metals from waste electroplating sludge by the ammonia-leaching process; (2) demonstrative research on reutilization technology for chromium waste residues, including the production of self-melting ore and smelting of chromium-containing pig iron, and pyrolytic detoxification of the residue in a cyclone furnace; (3) incineration technology for hazardous wastes, with successful results from the industrial incinerator system for polychlorinated biphenyls; and (4) safe landfill technology for the disposal of hazardous wastes, with a complete set of technologies for pretreatment, site selection, development of anti-percolation materials, and design and construction of the landfill. Only a part of the achievements is introduced in this paper, most of which have been built and are being operated as demonstrations to further spread their application and accumulate experience. 6 refs., 7 figs., 6 tabs.

  8. An evaluation of soil erosion hazard: A case study in Southern Africa using geomatics technologies

    NASA Astrophysics Data System (ADS)

    Eiswerth, Barbara Alice

    Accelerated soil erosion in Malawi, Southern Africa, increasingly threatens agricultural productivity, given current and projected population growth trends. Previous attempts to document soil erosion potential have had limited success, lacking appropriate information and diagnostic tools. This study utilized geomatics technologies and the latest available information on topography, soils, climate, vegetation, and land use for a watershed in southern Malawi. The Soil Loss Estimation Model for Southern Africa (SLEMSA), developed for conditions in Zimbabwe, was evaluated and used to create a soil erosion hazard map for the watershed under Malawi conditions. The SLEMSA sub-models for cover, soil loss, and topography were computed from energy interception; rainfall energy and soil erodibility; and slope length and steepness, respectively. Geomatics technologies, including remote sensing and Geographic Information Systems (GIS), provided the tools with which land cover/land use, a digital elevation model, and slope length and steepness were extracted and integrated with rainfall and soils spatial information; they also enable rapid updates of the model as new and better data sets become available. Sensitivity analyses of the SLEMSA model revealed that rainfall energy and slope steepness have the greatest influence on soil erosion hazard estimates in this watershed. Energy interception was intermediate in sensitivity, whereas slope length and soil erodibility ranked lowest. Parameter behavior analysis showed that energy interception and soil erodibility behave linearly with respect to soil erosion hazard, whereas rainfall energy, slope steepness, and slope length behave non-linearly. When SLEMSA input parameters and results were compared to alternative methods of soil erosion assessment, such as drainage density and drainage texture, the model provided more spatially explicit information using 30-meter grid cells. 
Results of this study indicate that more accurate soil erosion estimates can be made when: (1) higher-resolution digital elevation models are used; (2) data from an improved precipitation station network are available; and (3) greater investment is made in rainfall energy research.
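    SLEMSA combines its three sub-models multiplicatively. A minimal sketch of that structure is below; the parameter values are entirely hypothetical (the study's actual calibration is not given in the abstract).

    ```python
    def slemsa_soil_loss(k, c, x):
        """Predicted soil loss Z (t/ha/yr) as the product of the three SLEMSA
        sub-models: K = soil loss from bare fallow (rainfall energy and soil
        erodibility), C = crop cover factor (energy interception), and
        X = topographic factor (slope length and steepness)."""
        return k * c * x

    # Hypothetical grid cell: bare-soil loss K = 40 t/ha/yr, cover factor
    # C = 0.35, topographic factor X = 1.5 for a moderately steep slope.
    z = slemsa_soil_loss(k=40.0, c=0.35, x=1.5)
    print(f"Estimated soil loss: {z:.1f} t/ha/yr")
    ```

    Because the model is a simple product, the sensitivity ranking reported above reflects how strongly each input drives its own sub-model, evaluated cell by cell over the GIS grid.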

  9. Determination of metal ion content of beverages and estimation of target hazard quotients: a comparative study

    PubMed Central

    Hague, Theresa; Petroczi, Andrea; Andrews, Paul LR; Barker, James; Naughton, Declan P

    2008-01-01

    Background Considerable research has been directed towards the roles of metal ions in nutrition with metal ion toxicity attracting particular attention. The aim of this study is to measure the levels of metal ions found in selected beverages (red wine, stout and apple juice) and to determine their potential detrimental effects via calculation of the Target Hazard Quotients (THQ) for 250 mL daily consumption. Results The levels (mean ± SEM) and diversity of metals determined by ICP-MS were highest for red wine samples (30 metals totalling 5620.54 ± 123.86 ppb) followed by apple juice (15 metals totalling 1339.87 ± 10.84 ppb) and stout (14 metals totalling 464.85 ± 46.74 ppb). The combined THQ values were determined based upon levels of V, Cr, Mn, Ni, Cu, Zn and Pb which gave red wine samples the highest value (5100.96 ± 118.93 ppb) followed by apple juice (666.44 ± 7.67 ppb) and stout (328.41 ± 42.36 ppb). The THQ values were as follows: apple juice (male 3.11, female 3.87), stout (male 1.84, female 2.19), red wine (male 126.52, female 157.22) and ultra-filtered red wine (male 110.48, female 137.29). Conclusion This study reports relatively high levels of metal ions in red wine, which give a very high THQ value suggesting potential hazardous exposure over a lifetime for those who consume at least 250 mL daily. In addition to the known hazardous metals (e.g. Pb), many metals (e.g. Rb) have not had their biological effects systematically investigated and hence the impact of sustained ingestion is not known. PMID:18578877
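    The target hazard quotient in such studies typically follows the standard US EPA chronic-exposure form, THQ = (EF × ED × IR × C) / (RfD × BW × AT), summed over metals for a combined value. The sketch below uses that formula with entirely hypothetical concentrations and reference doses (the study's per-metal inputs are not given in the abstract) and the 250 mL/day intake considered in the paper.

    ```python
    def thq(conc_mg_per_l, rfd_mg_per_kg_day, intake_l_per_day=0.25,
            body_weight_kg=70.0, exposure_freq_days=365, exposure_dur_yr=70,
            averaging_time_days=365 * 70):
        """US EPA-style target hazard quotient for one metal ingested via a
        beverage; THQ > 1 suggests potential non-carcinogenic risk."""
        dose = exposure_freq_days * exposure_dur_yr * intake_l_per_day * conc_mg_per_l
        return dose / (rfd_mg_per_kg_day * body_weight_kg * averaging_time_days)

    # Hypothetical beverage with two metals (concentration mg/L, oral RfD mg/kg/day);
    # both numbers are illustrative, not measured or regulatory values.
    metals = {"Cu": (0.10, 0.04), "Zn": (0.50, 0.30)}
    combined = sum(thq(c, rfd) for c, rfd in metals.values())
    print(f"Combined THQ = {combined:.3f}")
    ```

    With daily lifelong exposure, EF × ED cancels AT, so the quotient reduces to (IR × C) / (RfD × BW), which is why high metal concentrations in a daily 250 mL serving can push the combined THQ far above 1.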

  10. Experimental study of a highway bridge with shape memory alloy restrainers focusing on the mitigation of unseating and pounding

    NASA Astrophysics Data System (ADS)

    Guo, Anxin; Zhao, Qingjie; Li, Hui

    2012-03-01

    This paper presents an experimental study to investigate the performance of shape memory alloy (SMA) restrainers for mitigating the pounding and unseating of highway bridges when subjected to seismic excitations. Mechanical property tests of the SMA wire used in the restrainers are conducted first to understand the pseudo-elastic characteristics of the material. Then, a series of shaking table tests are carried out on a highway bridge model. The structural responses of the highway bridge model equipped with SMA restrainers, installed in the form of deck-deck and deck-pile connections, are analyzed and compared with the uncontrolled structures. The test results of this study indicate that the SMA restrainers are not only effective in preventing unseating but also in suppressing the seismic-induced pounding of the highway bridge model used in this study.

  11. Awareness of occupational hazards and use of safety measures among welders: a cross-sectional study from eastern Nepal

    PubMed Central

    Budhathoki, Shyam Sundar; Singh, Suman Bahadur; Sagtani, Reshu Agrawal; Niraula, Surya Raj; Pokharel, Paras Kumar

    2014-01-01

    Objective The proper use of safety measures by welders is an important way of preventing and/or reducing the variety of health hazards they are exposed to during welding. There is a lack of knowledge about hazards and personal protective equipment (PPE), and the use of PPE among welders in Nepal is limited. We designed a study to assess welders' awareness of hazards and PPE, and their use of PPE, among the welders of eastern Nepal, and to find a possible correlation between awareness and use of PPE. Materials and methods A cross-sectional study of 300 welders, selected by simple random sampling from three districts of eastern Nepal, was conducted using a semistructured questionnaire. Data regarding age, education level, duration of employment, awareness of hazards, safety measures and the actual use of safety measures were recorded. Results Overall, 272 (90.7%) welders were aware of at least one hazard of welding, and a similar proportion were aware of at least one type of PPE. However, only 47.7% used one or more types of PPE. Education and duration of employment were significantly associated with awareness of hazards and of PPE and its use. The welders who reported using PPE during welding were two times more likely to have been aware of hazards (OR=2.52, 95% CI 1.09 to 5.81) and five times more likely to have been aware of PPE than the welders who did not report the use of PPE (OR=5.13, 95% CI 2.34 to 11.26). Conclusions The welders using PPE were those who were aware of hazards and PPE. There is a gap between awareness of hazards and PPE (90%) and use of PPE (47%) at work. Further research is needed to identify the underlying factors leading to low utilisation of PPE despite the welders of eastern Nepal being knowledgeable about it. PMID:24889850

  12. Economics of Tsunami Mitigation in the Pacific Northwest

    NASA Astrophysics Data System (ADS)

    Goettel, K. A.; Rizzo, A.; Sigrist, D.; Bernard, E. N.

    2011-12-01

    The death toll in a major Cascadia Subduction Zone (CSZ) tsunami may be comparable to that of the Tohoku tsunami: tens of thousands. To date, tsunami risk reduction activities have been almost exclusively hazard mapping and evacuation planning. Reducing deaths in locations where evacuation to high ground is impossible in the short time between ground shaking and the arrival of tsunamis requires measures such as vertical evacuation facilities or engineered pathways to safe ground. Yet very few, if any, such tsunami mitigation projects have been done. In contrast, many tornado safe room and earthquake mitigation projects, driven entirely or largely by life safety, have been done, with costs in the billions of dollars. The absence of tsunami mitigation measures results from the belief that tsunamis are too infrequent and the costs too high to justify life safety mitigation measures. A simple analysis based on return periods, death rates, and the geographic distribution of high-risk areas for these hazards demonstrates that this belief is incorrect: well-engineered tsunami mitigation projects are more cost-effective, with higher benefit-cost ratios, than almost all tornado or earthquake mitigation projects. Goldfinger's paleoseismic studies of CSZ turbidites indicate return periods for major CSZ tsunamis of about 250-500 years (USGS Prof. Paper 1661-F, in press). Tsunami return periods are comparable to those for major earthquakes at a given location in high-seismicity areas, and are much shorter than those for tornadoes at any given location, which range from >4,000 to >16,000 years for >EF2 and >EF4 tornadoes, respectively. The average earthquake death rate in the US over the past 100 years is about 1/year, or about 30/year including the 1906 San Francisco earthquake. The average death rate for tornadoes is about 90/year. For CSZ tsunamis, the estimated average death rate ranges from about 20/year (10,000 every 500 years) to 80/year (20,000 every 250 years). 
Thus, the long-term death rates from tsunamis, earthquakes, and tornadoes are comparable. High-hazard areas for tornadoes and earthquakes cover ~40% and ~15% of the contiguous US (~1,250,000 and ~500,000 square miles), respectively. In marked contrast, tsunami life safety risk is concentrated in communities with significant populations in areas where evacuation to high ground is impossible: probably <4,000 square miles, or <0.1% of the US. The geographic distribution of life safety risk profoundly affects the economics of tsunami life safety mitigation projects. Consider a tsunami life safety project that saves an average of one life per year (500 lives per 500 years). Using FEMA's value of human life ($5.8 million), a 7% discount rate, and a 50-year project useful lifetime, the net present value of avoided deaths is $80 million. Thus, the benefit-cost ratio would be about 16 or about 80 for tsunami mitigation projects costing $5 million or $1 million, respectively. These rough calculations indicate that tsunami mitigation projects in high-risk locations are economically justified. More importantly, these results indicate that national and local priorities for natural hazard mitigation should be reconsidered, with tsunami mitigation given a very high priority.
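The abstract's benefit-cost arithmetic can be reproduced directly: the $80 million figure is the present value of a $5.8 million/year benefit stream over 50 years at a 7% discount rate (the standard annuity formula), and the benefit-cost ratios follow by dividing by project cost. A sketch using the stated numbers:

```python
def present_value(annual_benefit, rate, years):
    """Present value of a constant annual benefit: the standard annuity formula."""
    return annual_benefit * (1 - (1 + rate) ** -years) / rate

value_of_life = 5.8e6  # FEMA value of a statistical life, $ (one life saved per year)
npv = present_value(value_of_life, rate=0.07, years=50)  # ~ $80 million

for cost in (5e6, 1e6):
    bcr = npv / cost  # benefit-cost ratio for the two project costs considered
    print(f"Project cost ${cost/1e6:.0f}M -> BCR ~ {bcr:.0f}")
```

A BCR well above 1 is the usual threshold for mitigation grant eligibility, which is the sense in which these projects are "economically justified."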

  13. Hazard Ranking Methodology for Assessing Health Impacts of Unconventional Natural Gas Development and Production: The Maryland Case Study.

    PubMed

    Boyle, Meleah D; Payne-Sturges, Devon C; Sangaramoorthy, Thurka; Wilson, Sacoby; Nachman, Keeve E; Babik, Kelsey; Jenkins, Christian C; Trowell, Joshua; Milton, Donald K; Sapkota, Amir

    2016-01-01

    The recent growth of unconventional natural gas development and production (UNGDP) has outpaced research on the potential health impacts associated with the process. The Maryland Marcellus Shale Public Health Study was conducted to inform the Maryland Marcellus Shale Safe Drilling Initiative Advisory Commission, State legislators and the Governor about potential public health impacts associated with UNGDP so they could make an informed decision that considers the health and well-being of Marylanders. In this paper, we describe an impact assessment and hazard ranking methodology we used to assess the potential public health impacts for eight hazards associated with the UNGDP process. The hazard ranking included seven metrics: 1) presence of vulnerable populations (e.g. children under the age of 5, individuals over the age of 65, surface owners), 2) duration of exposure, 3) frequency of exposure, 4) likelihood of health effects, 5) magnitude/severity of health effects, 6) geographic extent, and 7) effectiveness of setbacks. Overall public health concern was determined by a color-coded ranking system (low, moderately high, and high) that was generated based on the overall sum of the scores for each hazard. We provide three illustrative examples of applying our methodology for air quality and health care infrastructure which were ranked as high concern and for water quality which was ranked moderately high concern. The hazard ranking was a valuable tool that allowed us to systematically evaluate each of the hazards and provide recommendations to minimize the hazards. PMID:26726918
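    The ranking logic described (score each of the seven metrics, sum the scores, and map the total to a color-coded concern level) can be sketched as follows. The 0-3 score scale and the thresholds are hypothetical assumptions, since the paper's actual cut-offs are not given in the abstract.

    ```python
    # Hypothetical sketch of the seven-metric hazard ranking: scores (assumed
    # 0-3 each) are summed and the total mapped to a concern level. The scale
    # and thresholds are illustrative, not the study's actual values.
    METRICS = ["vulnerable_populations", "duration", "frequency", "likelihood",
               "severity", "geographic_extent", "setback_effectiveness"]

    def concern_level(scores):
        total = sum(scores[m] for m in METRICS)
        if total >= 16:
            return "high"
        elif total >= 11:
            return "moderately high"
        return "low"

    air_quality = dict.fromkeys(METRICS, 3)               # worst case on every metric
    water = {**dict.fromkeys(METRICS, 2), "severity": 1}  # mixed scores
    print(concern_level(air_quality), "/", concern_level(water))
    ```

    Keeping the metric scores explicit, rather than reporting only the final color, is what let the authors trace each hazard's ranking back to specific drivers when making recommendations.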

  15. Mapping Europe's Seismic Hazard

    NASA Astrophysics Data System (ADS)

    Giardini, Domenico; Wössner, Jochen; Danciu, Laurentiu

    2014-07-01

    From the rift that cuts through the heart of Iceland to the complex tectonic convergence that causes frequent and often deadly earthquakes in Italy, Greece, and Turkey to the volcanic tremors that rattle the Mediterranean, seismic activity is a prevalent and often life-threatening reality across Europe. Any attempt to mitigate the seismic risk faced by society requires an accurate estimate of the seismic hazard.

  16. Concerns About Climate Change Mitigation Projects: Summary of Findings from Case Studies in Brazil, India, Mexico, and South Africa

    SciTech Connect

    Sathaye, Jayant A.; Andrasko, Kenneth; Makundi, Willy; La Rovere, Emilio Lebre; Ravinandranath, N.H.; Melli, Anandi; Rangachari, Anita; Amaz, Mireya; Gay, Carlos; Friedmann, Rafael; Goldberg, Beth; van Horen, Clive; Simmonds, Gillina; Parker, Gretchen

    1998-11-01

    The concept of joint implementation as a way to implement climate change mitigation projects in another country has been controversial ever since its inception. Developing countries have raised numerous issues at the project-specific technical level, and broader concerns having to do with equity and burden sharing. This paper summarizes the findings of studies for Brazil, India, Mexico and South Africa, four countries that have large greenhouse gas emissions and are heavily engaged in the debate on climate change projects under the Kyoto Protocol. The studies examine potential or current projects/programs to determine whether eight technical concerns about joint implementation can be adequately addressed. They conclude that about half the concerns were minor or well managed by project developers, but concerns about additionality of funds, host country institutions and guarantees of performance (including the issues of baselines and possible leakage) need much more effort to be adequately addressed. All the papers agree on the need to develop institutional arrangements for approving and monitoring such projects in each of the countries represented. The case studies illustrate that these projects have the potential to bring new technology, investment, employment and ancillary socioeconomic and environmental benefits to developing countries. These benefits are consistent with the goal of sustainable development in the four study countries. At a policy level, the studies' authors note that in their view, the Annex I countries should consider limits on the use of jointly implemented projects as a way to get credits against their own emissions at home, and stress the importance of industrialized countries developing new technologies that will benefit all countries. 
The authors also observe that, if all countries accepted caps on their emissions (with a longer time period allowed for developing countries to do so), project-based GHG mitigation would be significantly facilitated by the improved private investment climate.

  17. Expert study to select indicators of the occurrence of emerging mycotoxin hazards.

    PubMed

    Kandhai, M C; Booij, C J H; Van der Fels-Klerx, H J

    2011-01-01

    This article describes a Delphi-based expert judgment study aimed at the selection of indicators to identify the occurrence of emerging mycotoxin hazards related to Fusarium spp. in wheat supply chains. A panel of 29 experts from 12 European countries followed a holistic approach to evaluate the most important indicators for different chain stages (growth, transport and storage, and processing) and their relative importance. After three e-mailing rounds, the experts reached consensus on the most important indicators for each of the three stages. For wheat growth, these indicators include relative humidity/rainfall, crop rotation, temperature, tillage practice, water activity of the kernels, and crop variety/cultivar. For the transport and storage stage, they include water activity of the kernels, relative humidity, ventilation, temperature, storage capacity, and logistics. For wheat processing, indicators include quality data, fraction of the cereal used, water activity of the kernels, quality management and traceability systems, and carryover of contamination. The indicators selected in this study can be used in an identification system for the occurrence of emerging mycotoxin hazards in wheat supply chains. Such a system can be used by risk managers within governmental (related) organizations and/or the food and feed industry in order to react proactively to the occurrence of these emerging mycotoxins. PMID:20846166

  18. Vertical Field of View Reference Point Study for Flight Path Control and Hazard Avoidance

    NASA Technical Reports Server (NTRS)

    Comstock, J. Raymond, Jr.; Rudisill, Marianne; Kramer, Lynda J.; Busquets, Anthony M.

    2002-01-01

    Researchers within the eXternal Visibility System (XVS) element of the High-Speed Research (HSR) program developed and evaluated display concepts that would provide the flight crew of the proposed High-Speed Civil Transport (HSCT) with integrated imagery and symbology to permit path control and hazard avoidance functions while maintaining required situation awareness. The challenge of the XVS program is to develop concepts that would permit a no-nose-droop configuration of an HSCT and expanded low-visibility HSCT operational capabilities. This study was one of a series of experiments exploring the 'design space' restrictions for the physical placement of an XVS display. The primary experimental issue here was the 'conformality' of the forward display's vertical position with respect to the side window in simulated flight: 'conformality' refers to the condition in which the horizon and objects appear in the same relative positions whether viewed through the forward windows or display and the side windows. This study quantified the effects of visual conformality on pilot flight path control and hazard avoidance performance. Here, conformality concerned the positioning and relationship of the artificial horizon line and associated symbology presented on the forward display, and the horizon and associated ground, horizon, and sky textures as they would appear in a real view through a window, presented in the side window display. No significant performance consequences were found for the non-conformal conditions.

  19. Standardization of Seismic Microzonification and Probabilistic Seismic Hazard Study Considering Site Effect for Metropolitan Areas in the State of Veracruz

    NASA Astrophysics Data System (ADS)

    Torres Morales, G. F.; Leonardo Suárez, M.; Dávalos Sotelo, R.; Castillo Aguilar, S.; Mora González, I.

    2014-12-01

    Preliminary results obtained from the project "Seismic Hazard in the State of Veracruz and Xalapa Conurbation" and "Microzonation of geological and hydrometeorological hazards for conurbations of Orizaba, Veracruz, and major sites located in the lower sub-basins: The Antigua and Jamapa" are presented. These projects were sponsored respectively by the PROMEP program and the Joint Funds CONACyT-Veracruz state government. The study consists of evaluating the probabilistic seismic hazard considering the site effect (SE) in the urban zones of the cities of Xalapa and Orizaba; the site effects in this preliminary stage were incorporated through a standard format proposed in microzonation studies and implemented in computer systems, which makes it possible to optimize and condense the microzonation studies of a city. This study stems from the need to know the seismic hazard (SH) in the State of Veracruz and its major cities, defining SH as the probabilistic description of exceedance of a given level of ground motion intensity (generally expressed as the peak ground acceleration or the maximum ordinate of the pseudo-acceleration response spectrum, PGA and Sa, respectively) as a result of the action of an earthquake in the area of influence for a specified period of time. The evaluation results are presented through maps of seismic hazard exceedance rate curves and uniform hazard spectra (UHS) for different spectral ordinates and return periods, respectively.

  20. Photobiomodulation Mitigates Diabetes-Induced Retinopathy by Direct and Indirect Mechanisms: Evidence from Intervention Studies in Pigmented Mice

    PubMed Central

    Liu, Haitao; Patel, Shyam; Roberts, Robin; Berkowitz, Bruce A.; Kern, Timothy S.

    2015-01-01

    Objective Daily application of far-red light from the onset of diabetes mitigated diabetes-induced abnormalities in retinas of albino rats. Here, we test the hypothesis that photobiomodulation (PBM) is effective in diabetic, pigmented mice, even when delayed until weeks after onset of diabetes. Direct and indirect effects of PBM on the retina also were studied. Methods Diabetes was induced in C57Bl/6J mice using streptozotocin. Some diabetics were exposed to PBM therapy (4 min/day; 670 nm) daily. In one study, mice were diabetic for 4 weeks before initiation of PBM for an additional 10 weeks. Retinal oxidative stress, inflammation, and retinal function were measured. In some mice, heads were covered with a lead shield during PBM to prevent direct illumination of the eye, or animals were treated with an inhibitor of heme oxygenase-1. In a second study, PBM was initiated immediately after onset of diabetes, and administered daily for 2 months. These mice were examined using manganese-enhanced MRI to assess effects of PBM on transretinal calcium channel function in vivo. Results PBM intervention improved diabetes-induced changes in superoxide generation, leukostasis, expression of ICAM-1, and visual performance. PBM acted in part remotely from the retina because the beneficial effects were achieved even with the head shielded from the light therapy, and because leukocyte-mediated cytotoxicity of retinal endothelial cells was less in diabetics treated with PBM. SnPP+PBM significantly reduced iNOS expression compared to PBM alone, but significantly exacerbated leukostasis. In study 2, PBM largely mitigated diabetes-induced retinal calcium channel dysfunction in all retinal layers. Conclusions PBM induces retinal protection against abnormalities induced by diabetes in pigmented animals, and even as an intervention. Beneficial effects on the retina likely are mediated by both direct and indirect mechanisms. 
PBM is a novel non-pharmacologic treatment strategy to inhibit early changes of diabetic retinopathy. PMID:26426815

  1. A Randomized, Controlled Trial of Home Injury Hazard Reduction: The HOME Injury Study

    PubMed Central

    Phelan, Kieran J.; Khoury, Jane; Xu, Yingying; Liddy, Stacey; Hornung, Richard; Lanphear, Bruce P.

    2013-01-01

    Objective Test the efficacy of an intervention of safety device installation on medically-attended injury in children birth to 3 years of age. Design A nested, prospective, randomized, controlled trial. Setting Indoor environment of housing units of mothers and children. Participants Mothers and their children enrolled in a birth cohort examining the effects of prevalent neurotoxicants on child development, the Home Observation and Measures of the Environment (HOME) Study. Intervention Installation of multiple, passive measures (stair gates, window locks, smoke and carbon monoxide detectors) to reduce exposure to injury hazards present in housing units. Outcome measures Self-reported, medically attended, and modifiable injury. Methods 1263 (14%) prenatal patients were eligible, 413 (33%) agreed to participate and 355 were randomly assigned to the experimental (n=181) or control (n=174) groups. Injury hazards were assessed at home visits by teams of trained research assistants using a validated survey. Safety devices were installed in intervention homes. Intention-to-treat analyses to test efficacy were conducted on: 1) total injury rates and 2) injuries deemed, a priori, modifiable by the installation of safety devices. Rates of medically attended injuries (phone calls, office or emergency visits) were calculated using generalized estimating equations. Results The mean age of the children at intervention was 6 months. Injury hazards were significantly reduced in intervention but not in control group homes at one and two years (p<0.004). There was not a significant difference in the rate for all medically-attended injuries in intervention compared with control group children, 14.3 (95%CI 9.7, 21.1) vs. 20.8 (14.4, 29.9) per 100 child-years (p=0.17) respectively; but there was a significant reduction in modifiable medically attended injuries in intervention compared with control group children, 2.3 (1.0, 5.5) vs. 
7.7 (4.2, 14.2) per 100 child-years, respectively (p=0.026). Conclusions An intervention to reduce exposure to hazards in the homes of young children led to a 70% reduction in modifiable medically-attended injury. PMID:21464382

  2. The Study on Ecological Treatment of Saline Lands to Mitigate the Effects of Climate Change

    NASA Astrophysics Data System (ADS)

    Xie, Jiancang; Zhu, Jiwei; Wang, Tao

    2010-05-01

    The movement of soil water and salt is strongly influenced by frequent droughts, floods, and climate change. In addition, with continued population growth, large-scale reclamation of arable land, and long-term unreasonable irrigation, saline land is increasing at a rate of 1,000,000~15,000,000 mu per year worldwide. Traditional management, which takes drainage as the main measure, has a series of problems: larger engineering works, greater occupation of land, obstacles to water saving, and downstream pollution. In response to global climate change, it has become a common understanding that energy saving and environmental protection should be promoted, the current model reconsidered, and an ecological management model explored. In this paper, we take the severely saline land of Lubotan in Shaanxi Province as an example. Through nearly 10 years of reclamation practice and observation of meteorological, hydrological, and soil indicators, we analyze the influence of climate change on soil salinity movement in different seasons and years, and then put forward and apply a new model of saline land management that mitigates the effects of climate change and allows the environment to self-rehabilitate. This model changes "drainage" to "storage": through engineering that takes storage as the main measure, combined with comprehensive "engineering - biology - agriculture" measures, saline land is being converted into arable land. The system adapts to natural variations in climate, rainfall, irrigation return flow, and groundwater level, reducing human intervention to achieve dynamic equilibrium. Over the ten years, the salt content of the plough horizon has fallen from 0.74% to 0.20%, organic matter has increased from 0.70% to 0.92%, and various soil indicators have begun to improve. At the same time, irrigation water use, drainage pollution, and investment costs have been reduced. Through this model, 18,900 mu of severely saline land have been reclaimed and 16,500 mu of new cultivated land added, with significant overall benefits, ensuring the coordinated development of "water - biology - environment" in the region. Application and promotion of the model can treat saline-alkali land and add cultivated land effectively, while easing the pressure on urban construction land and promoting energy saving, emission reduction, and ecological restoration, thus helping to build a resource-saving, environment-friendly society and to realize sustainable development of population, resources, and environment.

  3. Hydrogen Hazards Assessment Protocol (HHAP): Approach and Methodology

    NASA Technical Reports Server (NTRS)

    Woods, Stephen

    2009-01-01

    This viewgraph presentation reviews the approach and methodology used to develop an assessment protocol for hydrogen hazards. The presentation covers the reasons for performing hazard assessments, the types of hazard assessment that exist, an analysis of hydrogen hazards, and specific information about the Hydrogen Hazards Assessment Protocol (HHAP). The assessment is specifically tailored to hydrogen behavior. The end product of the assessment is a compilation of hazards, mitigations, and associated factors to facilitate decision making and achieve best practice.

  4. Cost-benefit analysis of alternative LNG vapor-mitigation measures. Topical report, September 14, 1987-January 15, 1991

    SciTech Connect

    Atallah, S.

    1992-06-25

    A generalized methodology is presented for comparing the costs and safety benefits of alternative hazard mitigation measures for a large LNG vapor release. The procedure involves the quantification of the risk to the public before and after the application of LNG vapor mitigation measures. In the study, risk was defined as the product of the annual accident frequency, estimated from a fault tree analysis, and the severity of the accident. Severity was measured in terms of the number of people who may be exposed to concentrations of 2.5% or higher. The ratios of the annual costs of the various mitigation measures to their safety benefits (as determined by the differences between the risk before and after mitigation measure implementation) were then used to identify the most cost-effective approaches to vapor cloud mitigation.
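    The ranking logic described above (risk = annual frequency × severity; ratio = annual cost / risk reduction) can be sketched in a few lines of Python. All measures, costs, frequencies, and exposure figures below are hypothetical and purely illustrative; none are taken from the report.

```python
# Sketch of the cost-benefit ranking described in the abstract.
# All numbers are hypothetical illustration, not values from the report.

def risk(annual_frequency, people_exposed):
    """Risk = annual accident frequency x severity (people exposed)."""
    return annual_frequency * people_exposed

def cost_benefit_ratio(annual_cost, risk_before, risk_after):
    """Annual cost per unit of risk averted; lower is more cost-effective."""
    return annual_cost / (risk_before - risk_after)

baseline = risk(1e-4, 5000)  # unmitigated vapor release

measures = {
    "vapor fence":   {"cost": 40_000, "risk_after": risk(1e-4, 1500)},
    "water curtain": {"cost": 90_000, "risk_after": risk(1e-4, 800)},
}

# Most cost-effective measure first.
ranked = sorted(
    measures.items(),
    key=lambda kv: cost_benefit_ratio(kv[1]["cost"], baseline, kv[1]["risk_after"]),
)
for name, m in ranked:
    r = cost_benefit_ratio(m["cost"], baseline, m["risk_after"])
    print(f"{name}: {r:,.0f} $/unit of annual risk averted")
```

    With these toy numbers the fence ranks first even though it averts less risk than the curtain, because its cost per unit of risk averted is lower; exposing that trade-off is exactly what the ratio is for.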

  5. Application of multi-agent coordination methods to the design of space debris mitigation tours

    NASA Astrophysics Data System (ADS)

    Stuart, Jeffrey; Howell, Kathleen; Wilson, Roby

    2016-04-01

    The growth in the number of defunct and fragmented objects near the Earth poses a growing hazard to launch operations as well as to existing on-orbit assets. Numerous studies have demonstrated the positive impact of active debris mitigation campaigns on the growth of debris populations, but comparatively few investigations incorporate specific mission scenarios. Furthermore, while many active mitigation methods have been proposed, certain classes of debris objects are amenable to mitigation campaigns employing chaser spacecraft with existing chemical and low-thrust propulsive technologies. This investigation incorporates an ant colony optimization routing algorithm and multi-agent coordination via auctions into a debris mitigation tour scheme suitable for preliminary mission design and analysis as well as spacecraft flight operations.
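    The auction element mentioned above can be illustrated with a toy single-item auction: each chaser spacecraft bids its estimated delta-v for each debris object, and the cheapest bid wins. The chaser names, debris IDs, and delta-v values below are invented; in a real tour design the bids would come from trajectory optimization (e.g. the ant colony routing stage).

```python
# Toy auction-based task allocation, loosely in the spirit of the
# multi-agent coordination described above. All values are invented.

bids = {
    # (chaser, debris): estimated delta-v cost in m/s (hypothetical)
    ("C1", "D1"): 120, ("C1", "D2"): 300,
    ("C2", "D1"): 200, ("C2", "D2"): 150,
}
chasers = ("C1", "C2")

assignment = {}
for debris in ("D1", "D2"):
    # Each debris object is awarded to the chaser with the lowest bid.
    winner = min(chasers, key=lambda c: bids[(c, debris)])
    assignment[debris] = winner

print(assignment)
```

    This sequential auction ignores coupling between awards (winning D1 changes a chaser's true cost for D2); handling that interaction is what the combined routing-plus-auction machinery in the paper is for.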

  6. 78 FR 36212 - Proposed Flood Hazard Determinations

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-06-17

    ..., identified by Docket No. FEMA-B-1323, to Luis Rodriguez, Chief, Engineering Management Branch, Federal..., Engineering Management Branch, Federal Insurance and Mitigation Administration, FEMA, 500 C Street SW... SECURITY Federal Emergency Management Agency Proposed Flood Hazard Determinations AGENCY: Federal...

  7. 77 FR 56669 - Proposed Flood Hazard Determinations

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-09-13

    ..., identified by Docket No. FEMA-B-1265, to Luis Rodriguez, Chief, Engineering Management Branch, Federal..., Engineering Management Branch, Federal Insurance and Mitigation Administration, FEMA, 500 C Street SW... SECURITY Federal Emergency Management Agency Proposed Flood Hazard Determinations AGENCY: Federal...

  8. 78 FR 32679 - Proposed Flood Hazard Determinations

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-31

    ..., identified by Docket No. FEMA-B-1309, to Luis Rodriguez, Chief, Engineering Management Branch, Federal..., Engineering Management Branch, Federal Insurance and Mitigation Administration, FEMA, 500 C Street SW... SECURITY Federal Emergency Management Agency Proposed Flood Hazard Determinations AGENCY: Federal...

  9. Model-Predictive Cascade Mitigation in Electric Power Systems With Storage and Renewables-Part II: Case-Study

    SciTech Connect

    Almassalkhi, MR; Hiskens, IA

    2015-01-01

    The novel cascade-mitigation scheme developed in Part I of this paper is implemented within a receding-horizon model predictive control (MPC) scheme with a linear controller model. This present paper illustrates the MPC strategy with a case-study that is based on the IEEE RTS-96 network, though with energy storage and renewable generation added. It is shown that the MPC strategy alleviates temperature overloads on transmission lines by rescheduling generation, energy storage, and other network elements, while taking into account ramp-rate limits and network limitations. Resilient performance is achieved despite the use of a simplified linear controller model. The MPC scheme is compared against a base-case that seeks to emulate human operator behavior.

  10. Eco-efficiency for greenhouse gas emissions mitigation of municipal solid waste management: a case study of Tianjin, China.

    PubMed

    Zhao, Wei; Huppes, Gjalt; van der Voet, Ester

    2011-06-01

    The issue of municipal solid waste (MSW) management has been highlighted in China due to the continually increasing MSW volumes being generated and the limited capacity of waste treatment facilities. This article presents a quantitative eco-efficiency (E/E) analysis on MSW management in terms of greenhouse gas (GHG) mitigation. A methodology for E/E analysis has been proposed, with an emphasis on the consistent integration of life cycle assessment (LCA) and life cycle costing (LCC). The environmental and economic impacts derived from LCA and LCC have been normalized and defined as a quantitative E/E indicator. The proposed method was applied in a case study of Tianjin, China. The study assessed the current MSW management system, as well as a set of alternative scenarios, to investigate trade-offs between economy and GHG emissions mitigation. Additionally, contribution analysis was conducted on both LCA and LCC to identify key issues driving environmental and economic impacts. The results show that Tianjin's current MSW management system emits the most GHG and costs the least, whereas the situation is reversed in the integrated scenario. The key issues identified by the contribution analysis show no linear relationship between the global warming impact and the cost impact in the MSW management system. The landfill gas utilization scenario is indicated as a potential optimum scenario by the proposed E/E analysis, given the characteristics of MSW, technology levels, and chosen methodologies. The E/E analysis provides an attractive direction towards sustainable waste management, though some questions with respect to uncertainty need to be discussed further. PMID:21316937
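    One way to read the E/E indicator described above is as GHG mitigated per unit of extra cost relative to the current system. The sketch below uses that reading with made-up, normalized scenario values (baseline = 1.0 on both axes); it is not the paper's actual LCA/LCC data, but it reproduces the qualitative finding that a landfill-gas scenario can beat a more expensive integrated scenario.

```python
# Hypothetical eco-efficiency comparison of MSW scenarios, normalized to
# the current system (= 1.0 on both axes). All values are invented.

scenarios = {
    "current":          {"ghg": 1.00, "cost": 1.00},
    "landfill gas use": {"ghg": 0.70, "cost": 1.10},
    "integrated":       {"ghg": 0.45, "cost": 1.60},
}
baseline = scenarios["current"]

def mitigation_per_extra_cost(s):
    """GHG mitigated relative to the baseline, per unit of extra cost."""
    return (baseline["ghg"] - s["ghg"]) / (s["cost"] - baseline["cost"])

# The alternative with the best mitigation-per-cost trade-off.
best = max(
    (name for name in scenarios if name != "current"),
    key=lambda name: mitigation_per_extra_cost(scenarios[name]),
)
print(best)
```

    The integrated scenario mitigates more GHG in absolute terms, but at these (invented) costs the landfill-gas scenario mitigates more per unit of extra spending, which is the sense in which an E/E indicator can prefer it.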

  11. Eco-efficiency for greenhouse gas emissions mitigation of municipal solid waste management: A case study of Tianjin, China

    SciTech Connect

    Zhao Wei; Huppes, Gjalt; Voet, Ester van der

    2011-06-15

    The issue of municipal solid waste (MSW) management has been highlighted in China due to the continually increasing MSW volumes being generated and the limited capacity of waste treatment facilities. This article presents a quantitative eco-efficiency (E/E) analysis on MSW management in terms of greenhouse gas (GHG) mitigation. A methodology for E/E analysis has been proposed, with an emphasis on the consistent integration of life cycle assessment (LCA) and life cycle costing (LCC). The environmental and economic impacts derived from LCA and LCC have been normalized and defined as a quantitative E/E indicator. The proposed method was applied in a case study of Tianjin, China. The study assessed the current MSW management system, as well as a set of alternative scenarios, to investigate trade-offs between economy and GHG emissions mitigation. Additionally, contribution analysis was conducted on both LCA and LCC to identify key issues driving environmental and economic impacts. The results show that Tianjin's current MSW management system emits the most GHG and costs the least, whereas the situation is reversed in the integrated scenario. The key issues identified by the contribution analysis show no linear relationship between the global warming impact and the cost impact in the MSW management system. The landfill gas utilization scenario is indicated as a potential optimum scenario by the proposed E/E analysis, given the characteristics of MSW, technology levels, and chosen methodologies. The E/E analysis provides an attractive direction towards sustainable waste management, though some questions with respect to uncertainty need to be discussed further.

  12. Distinguishing Realistic Military Blasts from Firecrackers in Mitigation Studies of Blast Induced Traumatic Brain Injury

    SciTech Connect

    Moss, W C; King, M J; Blackman, E G

    2011-01-21

    In their Contributed Article, Nyein et al. (1,2) present numerical simulations of blast waves interacting with a helmeted head and conclude that a face shield may significantly mitigate blast induced traumatic brain injury (TBI). A face shield may indeed be important for future military helmets, but the authors derive their conclusions from a much smaller explosion than typically experienced on the battlefield. The blast from the 3.16 gm TNT charge of (1) has the following approximate peak overpressures, positive phase durations, and incident impulses (3): 10 atm, 0.25 ms, and 3.9 psi-ms at the front of the head (14 cm from charge), and 1.4 atm, 0.32 ms, and 1.7 psi-ms at the back of a typical 20 cm head (34 cm from charge). The peak pressure of the wave decreases by a factor of 7 as it traverses the head. The blast conditions are at the threshold for injury at the front of the head, but well below threshold at the back of the head (4). The blast traverses the head in 0.3 ms, roughly equal to the positive phase duration of the blast. Therefore, when the blast reaches the back of the head, near ambient conditions exist at the front. Because the headform is so close to the charge, it experiences a wave with significant curvature. By contrast, a realistic blast from a 2.2 kg TNT charge ({approx} an uncased 105 mm artillery round) is fatal at an overpressure of 10 atm (4). For an injury level (4) similar to (1), a 2.2 kg charge has the following approximate peak overpressures, positive phase durations, and incident impulses (3): 2.1 atm, 2.3 ms, and 18 psi-ms at the front of the head (250 cm from charge), and 1.8 atm, 2.5 ms, and 16.8 psi-ms at the back of the head (270 cm from charge). The peak pressure decreases by only a factor of 1.2 as it traverses the head. Because the 0.36 ms traversal time is much smaller than the positive phase duration, pressures on the head become relatively uniform when the blast reaches the back of the head. 
    The larger standoff implies that the headform locally experiences a nearly planar blast wave. Also, the positive phase durations and blast impulses are much larger than those of (1). Consequently, the blast model used in (1) is spatially and temporally very different from a military blast. It would be useful to repeat the calculations using military blast parameters. Finally, (1) overlooks a significant part of (5). On page 1 and on page 3, (1) states that (5) did not consider helmet pads. But pages 3 and 4 of (5) present simulations of blast wave propagation across an ACH helmeted head form with and without pads. (5) states that when the pads are present, the 'underwash' of air under the helmet is blocked when compared to the case without. (1) reaches this same conclusion, but reports it as a new result rather than a confirmation of the result already found in (5).
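    The figures quoted in this comment are enough to reproduce its two headline comparisons. The snippet below recomputes the pressure-decay factor across the head for each charge directly from those numbers; the "quasi-uniform" flag is a crude illustrative criterion of ours (traversal time well below the positive phase duration), not one stated in the comment.

```python
# Recomputing the comment's two comparisons from the figures it quotes.
# The quasi-uniform flag is our own crude criterion, for illustration.

near = {"p_front_atm": 10.0, "p_back_atm": 1.4, "dur_ms": 0.25, "traverse_ms": 0.30}  # 3.16 g TNT
far  = {"p_front_atm": 2.1,  "p_back_atm": 1.8, "dur_ms": 2.3,  "traverse_ms": 0.36}  # 2.2 kg TNT

for label, b in (("3.16 g charge", near), ("2.2 kg charge", far)):
    decay = b["p_front_atm"] / b["p_back_atm"]
    quasi_uniform = b["traverse_ms"] < 0.5 * b["dur_ms"]
    print(f"{label}: pressure decays x{decay:.1f} across the head; "
          f"quasi-uniform loading: {quasi_uniform}")
```

    The decay factors come out at roughly 7 and 1.2, matching the comment; only the large charge satisfies the uniformity criterion, which is the comment's point about spatial and temporal dissimilarity.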

  13. Balancing Mitigation Against Impact: A Case Study From the 2005 Chicxulub Seismic Survey

    NASA Astrophysics Data System (ADS)

    Barton, P.; Diebold, J.; Gulick, S.

    2006-05-01

    In early 2005 the R/V Maurice Ewing conducted a large-scale deep seismic reflection-refraction survey offshore Yucatan, Mexico, to investigate the internal structure of the Chicxulub impact crater, centred on the coastline. Shots from a tuned 20 airgun, 6970 cu in array were recorded on a 6 km streamer and 25 ocean bottom seismometers (OBS). The water is exceptionally shallow to large distances offshore, reaching 30 m about 60 km from the land, making it unattractive to the larger marine mammals, although there are small populations of Atlantic and spotted dolphins living in the area, as well as several turtle breeding and feeding grounds on the Yucatan peninsula. In the light of calibrated tests of the Ewing's array (Tolstoy et al., 2004, Geophysical Research Letters 31, L14310), a 180 dB safety radius of 3.5 km around the gun array was adopted. An energetic campaign was organised by environmentalists opposing the work. In addition to the usual precautions of visual and listening watches by independent observers, gradual ramp-ups of the gun arrays, and power-downs or shut-downs for sightings, constraints were also placed to limit the survey to daylight hours and weather conditions not exceeding Beaufort 4. The operations were subject to several on-board inspections by the Mexican environmental authorities, causing logistical difficulties. Although less than 1% of the total working time was lost to shutdowns due to actual observation of dolphins or turtles, approximately 60% of the cruise time was taken up in precautionary inactivity. A diver in the water 3.5 km from the profiling ship reported that the sound in the water was barely noticeable, leading us to examine the actual sound levels recorded by both the 6 km streamer and the OBS hydrophones. 
The datasets are highly self-consistent, and give the same pattern of decay with distance past about 2 km offset, but with different overall levels: this may be due to geometry or calibration differences under investigation. Both datasets indicate significantly lower levels than reported by Tolstoy et al. (2004). There was no evidence of environmental damage created by this survey. It can be concluded that the mitigation measures were extremely successful, but there is also a concern that the overhead cost of the environmental protection made this one of the most costly academic surveys ever undertaken, and that not all of this protection was necessary. In particular, the predicted 180 dB safety radius appeared to be overly conservative, even though based on calibrated measurements in very similar physical circumstances, and we suggest that these differences were a result of local seismic velocity structure in the water column and/or shallow seabed, which resulted in different partitioning of the energy. These results suggest that real time monitoring of hydrophone array data may provide a method of determining the safety radius dynamically, in response to local conditions.

  14. Economic optimization of natural hazard protection - conceptual study of existing approaches

    NASA Astrophysics Data System (ADS)

    Spackova, Olga; Straub, Daniel

    2013-04-01

    Risk-based planning of protection measures against natural hazards has become a common practice in many countries. The selection procedure aims at identifying an economically efficient strategy with regard to the estimated costs and risk (i.e. expected damage). A correct setting of the evaluation methodology and decision criteria should ensure an optimal selection of the portfolio of risk protection measures under a limited state budget. To demonstrate the efficiency of investments, indicators such as Benefit-Cost Ratio (BCR), Marginal Costs (MC) or Net Present Value (NPV) are commonly used. However, the methodologies for efficiency evaluation differ amongst different countries and different hazard types (floods, earthquakes etc.). Additionally, several inconsistencies can be found in the application of the indicators in practice. This is likely to lead to a suboptimal selection of the protection strategies. This study provides a general formulation for the optimization of natural hazard protection measures from a socio-economic perspective. It assumes that all costs and risks can be expressed in monetary values. The study regards the problem as a discrete hierarchical optimization, where the state level sets the criteria and constraints, while the actual optimization is made on the regional level (towns, catchments) when designing particular protection measures and selecting the optimal protection level. The study shows that in the case of an unlimited budget, the task is quite trivial, as it is sufficient to optimize the protection measures in individual regions independently (by minimizing the sum of risk and cost). However, if the budget is limited, the need for an optimal allocation of resources amongst the regions arises. To ensure this, minimum values of BCR or MC can be required by the state, which must be achieved in each region. 
The study investigates the meaning of these indicators in the optimization task at the conceptual level and compares their suitability. To illustrate the theoretical findings, the indicators are tested on a hypothetical example of five regions with different risk levels. Last but not least, political and societal aspects and limitations in the use of the risk-based optimization framework are discussed.
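    The hierarchical scheme sketched above (the state sets a criterion, regions optimize locally) can be illustrated with a toy calculation. The regions, costs, and risk reductions below are hypothetical; the point is only that a common minimum marginal BCR stops different regions at different protection levels, steering a limited budget toward where it buys the most risk reduction.

```python
# Toy budget steering via a minimum marginal benefit-cost ratio (BCR).
# Each region offers protection levels as (annual cost, annual risk
# reduction), ordered by increasing cost. All values are hypothetical.

regions = {
    "A": [(1.0, 4.0), (2.5, 6.0)],  # high-risk region
    "B": [(1.0, 1.5), (3.0, 2.0)],  # lower-risk region
}

def select(options, min_bcr):
    """Highest protection level whose marginal BCR stays above min_bcr."""
    chosen = None
    prev_cost, prev_benefit = 0.0, 0.0
    for cost, benefit in options:
        marginal_bcr = (benefit - prev_benefit) / (cost - prev_cost)
        if marginal_bcr < min_bcr:
            break
        chosen = (cost, benefit)
        prev_cost, prev_benefit = cost, benefit
    return chosen

for name, options in regions.items():
    print(name, select(options, min_bcr=1.0))
```

    With a threshold of 1.0, region A upgrades to its second level (marginal BCR 2.0/1.5 ≈ 1.33) while region B stops at its first (marginal BCR 0.5/2.0 = 0.25): the shared criterion allocates resources across regions without the state designing each measure itself.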

  15. New approach to inventorying army hazardous materials. A study done for the Eighth U. S. Army, Korea. Volume 2. Hazardous-material data. Final report

    SciTech Connect

    Kim, B.J.; Gee, C.S.; Lee, Y.H.; Mikulich, L.R.; Grafmyer, D.E.

    1991-09-01

    The goal of the Army hazardous waste minimization program is to achieve a 50 percent reduction of the hazardous waste generated by calendar year 1992 (CY92), as compared to baseline CY85. A first step in achieving effective hazardous waste management is to conduct a thorough hazardous material inventory. Volume I describes a method created to inventory hazardous material by collecting supply data from Logistics Control Activity (LCA) at the Presidio, San Francisco, CA, and comparing these data with the Material Safety Data Sheet (MSDS) in the Hazardous Material Information System (HMIS). Volume II lists hazardous material data collected for the Eighth U.S. Army (EUSA), Korea. Common elements between the two data bases were compiled, analyzed, and validated. It was found that the intersection of the two data bases created a composite list that substantially reduced the number of nonhazardous wastes included in the individual lists. This method may also be applied to supply data from other Army installations.

  16. Progress in NTHMP Hazard Assessment

    USGS Publications Warehouse

    Gonzalez, F.I.; Titov, V.V.; Mofjeld, H.O.; Venturato, A.J.; Simmons, R.S.; Hansen, R.; Combellick, R.; Eisner, R.K.; Hoirup, D.F.; Yanagi, B.S.; Yong, S.; Darienzo, M.; Priest, G.R.; Crawford, G.L.; Walsh, T.J.

    2005-01-01

    The Hazard Assessment component of the U.S. National Tsunami Hazard Mitigation Program has completed 22 modeling efforts covering 113 coastal communities with an estimated population of 1.2 million residents that are at risk. Twenty-three evacuation maps have also been completed. Important improvements in organizational structure have been made with the addition of two State geotechnical agency representatives to Steering Group membership, and progress has been made on other improvements suggested by program reviewers. © Springer 2005.

  17. Case study: Mapping tsunami hazards associated with debris flow into a reservoir

    USGS Publications Warehouse

    Walder, J.S.; Watts, P.; Waythomas, C.F.

    2006-01-01

    Debris-flow generated impulse waves (tsunamis) pose hazards in lakes, especially those used for hydropower or recreation. We describe a method for assessing tsunami-related hazards for the case in which inundation by coherent water waves, rather than chaotic splashing, is of primary concern. The method involves an experimentally based initial condition (tsunami source) and a Boussinesq model for tsunami propagation and inundation. Model results are used to create hazard maps that offer guidance for emergency planners and responders. An example application explores tsunami hazards associated with potential debris flows entering Baker Lake, a reservoir on the flanks of the Mount Baker volcano in the northwestern United States. © 2006 ASCE.

  18. Status of volcanic hazard studies for the Nevada Nuclear Waste Storage Investigations. Volume II

    SciTech Connect

    Crowe, B.M.; Wohletz, K.H.; Vaniman, D.T.; Gladney, E.; Bower, N.

    1986-01-01

    Volcanic hazard investigations during FY 1984 focused on five topics: the emplacement mechanism of shallow basalt intrusions, geochemical trends through time for volcanic fields of the Death Valley-Pancake Range volcanic zone, the possibility of bimodal basalt-rhyolite volcanism, the age and process of enrichment for incompatible elements in young basalts of the Nevada Test Site (NTS) region, and the possibility of hydrovolcanic activity. The stress regime of Yucca Mountain may favor formation of shallow basalt intrusions. However, combined field and drill-hole studies suggest shallow basalt intrusions are rare in the geologic record of the southern Great Basin. The geochemical patterns of basaltic volcanism through time in the NTS region provide no evidence for evolution toward a large-volume volcanic field or increases in future rates of volcanism. Existing data are consistent with a declining volcanic system comparable to the late stages of the southern Death Valley volcanic field. The hazards of bimodal volcanism in this area are judged to be low. The source of a 6-Myr pumice discovered in alluvial deposits of Crater Flat has not been found. Geochemical studies show that the enrichment of trace elements in the younger rift basalts must be related to an enrichment of their mantle source rocks. This geochemical enrichment event, which may have been metasomatic alteration, predates the basalts of the silicic episode and is, therefore, not a young event. Studies of crater dimensions of hydrovolcanic landforms indicate that the worst case scenario (exhumation of a repository at Yucca Mountain by hydrovolcanic explosions) is unlikely. Theoretical models of melt-water vapor explosions, particularly the thermal detonation model, suggest hydrovolcanic explosions are possible at Yucca Mountain. 80 refs., 21 figs., 5 tabs.

  19. GIS-based pollution hazard mapping and assessment framework of shallow lakes: southeastern Pampean lakes (Argentina) as a case study.

    PubMed

    Romanelli, A; Esquius, K S; Massone, H E; Escalante, A H

    2013-08-01

The assessment of water vulnerability and pollution hazard traditionally places particular emphasis on the study of groundwaters rather than surface waters. Consequently, a GIS-based Lake Pollution Hazard Index (LPHI) was proposed for assessing and mapping the potential pollution hazard for shallow lakes due to the interaction between the Potential Pollutant Load and the Lake Vulnerability. It includes easily measurable and commonly used parameters: land cover, terrain slope and direction, and soil media. Three shallow lake ecosystems of the southeastern Pampa Plain (Argentina) were chosen to test the usefulness and applicability of this suggested index. Moreover, anthropogenic and natural medium influence on biophysical parameters in these three ecosystems was examined. The evaluation of the LPHI map shows the highest pollution hazard for La Brava and Los Padres lakes (≥30% in the high to very high category), while Nahuel Rucá Lake seems to be the least hazardous water body (just 9.33% with high LPHI). The increase in LPHI value is attributed to a different loading of pollutants governed by land cover category and/or the exposure to high slopes and the influence of slope direction. Dissolved oxygen and biochemical oxygen demand values indicate a moderately polluted and eutrophized condition of shallow lake waters, mainly related to moderate agricultural activities and/or cattle production. The information obtained from the LPHI calculation is useful for a local diagnosis of the potential pollution hazard to a freshwater ecosystem and for implementing basic guidelines to improve lake sustainability. PMID:23355019
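The abstract names only the ingredients of the LPHI (a pollutant load interacting with a vulnerability built from land cover, slope and soil); a minimal sketch of such an index, with entirely hypothetical ratings and weights that are not from the paper, might look like:

```python
# Hypothetical rating table (1 = low load, 5 = high load); the study's
# actual classes and weights are not given in the abstract.
LAND_COVER_RATING = {"urban": 5, "agriculture": 4, "pasture": 3, "native": 1}

def lake_pollution_hazard_index(land_cover, slope_deg, soil_permeability):
    """Toy LPHI: Potential Pollutant Load x Lake Vulnerability."""
    load = LAND_COVER_RATING[land_cover]            # pollutant load proxy
    slope_factor = 1 + min(slope_deg, 10) / 10      # steeper -> faster runoff
    vulnerability = slope_factor * soil_permeability  # permeability in [0, 1]
    return load * vulnerability

# Two illustrative raster cells: cultivated steep cell vs. flat native cover.
cells = [("agriculture", 6, 0.8), ("native", 2, 0.3)]
scores = [lake_pollution_hazard_index(*c) for c in cells]
```

In a real GIS workflow each parameter would be a raster layer and the index would be computed cell by cell before classifying the result into hazard categories.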

  20. Property-close source separation of hazardous waste and waste electrical and electronic equipment--a Swedish case study.

    TOXLINE Toxicology Bibliographic Information

    Bernstad A; la Cour Jansen J; Aspegren H

    2011-03-01

    Through an agreement with EEE producers, Swedish municipalities are responsible for collection of hazardous waste and waste electrical and electronic equipment (WEEE). In most Swedish municipalities, collection of these waste fractions is concentrated to waste recycling centres where households can source-separate and deposit hazardous waste and WEEE free of charge. However, the centres are often located on the outskirts of city centres and cars are needed in order to use the facilities in most cases. A full-scale experiment was performed in a residential area in southern Sweden to evaluate effects of a system for property-close source separation of hazardous waste and WEEE. After the system was introduced, results show a clear reduction in the amount of hazardous waste and WEEE disposed of incorrectly amongst residual waste or dry recyclables. The systems resulted in a source separation ratio of 70 wt% for hazardous waste and 76 wt% in the case of WEEE. Results show that households in the study area were willing to increase source separation of hazardous waste and WEEE when accessibility was improved and that this and similar collection systems can play an important role in building up increasingly sustainable solid waste management systems.

  1. Property-close source separation of hazardous waste and waste electrical and electronic equipment--a Swedish case study.

    PubMed

    Bernstad, Anna; la Cour Jansen, Jes; Aspegren, Henrik

    2011-03-01

    Through an agreement with EEE producers, Swedish municipalities are responsible for collection of hazardous waste and waste electrical and electronic equipment (WEEE). In most Swedish municipalities, collection of these waste fractions is concentrated to waste recycling centres where households can source-separate and deposit hazardous waste and WEEE free of charge. However, the centres are often located on the outskirts of city centres and cars are needed in order to use the facilities in most cases. A full-scale experiment was performed in a residential area in southern Sweden to evaluate effects of a system for property-close source separation of hazardous waste and WEEE. After the system was introduced, results show a clear reduction in the amount of hazardous waste and WEEE disposed of incorrectly amongst residual waste or dry recyclables. The systems resulted in a source separation ratio of 70 wt% for hazardous waste and 76 wt% in the case of WEEE. Results show that households in the study area were willing to increase source separation of hazardous waste and WEEE when accessibility was improved and that this and similar collection systems can play an important role in building up increasingly sustainable solid waste management systems. PMID:20952178

  2. Property-close source separation of hazardous waste and waste electrical and electronic equipment - A Swedish case study

    SciTech Connect

    Bernstad, Anna; Cour Jansen, Jes la; Aspegren, Henrik

    2011-03-15

    Through an agreement with EEE producers, Swedish municipalities are responsible for collection of hazardous waste and waste electrical and electronic equipment (WEEE). In most Swedish municipalities, collection of these waste fractions is concentrated to waste recycling centres where households can source-separate and deposit hazardous waste and WEEE free of charge. However, the centres are often located on the outskirts of city centres and cars are needed in order to use the facilities in most cases. A full-scale experiment was performed in a residential area in southern Sweden to evaluate effects of a system for property-close source separation of hazardous waste and WEEE. After the system was introduced, results show a clear reduction in the amount of hazardous waste and WEEE disposed of incorrectly amongst residual waste or dry recyclables. The systems resulted in a source separation ratio of 70 wt% for hazardous waste and 76 wt% in the case of WEEE. Results show that households in the study area were willing to increase source separation of hazardous waste and WEEE when accessibility was improved and that this and similar collection systems can play an important role in building up increasingly sustainable solid waste management systems.
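The source separation ratio reported in these studies is a simple weight fraction of a waste type captured by the dedicated collection rather than lost to residual waste or recyclables; a sketch with illustrative masses (the study reports only the resulting percentages, 70 wt% and 76 wt%) is:

```python
def source_separation_ratio(separated_kg, in_residual_kg, in_recyclables_kg):
    """Share (wt%) of a waste fraction captured by the property-close system."""
    total = separated_kg + in_residual_kg + in_recyclables_kg
    return 100.0 * separated_kg / total

# Illustrative masses only, chosen to reproduce the reported 70 wt% figure.
hazardous_ratio = source_separation_ratio(70.0, 25.0, 5.0)
```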

  3. Classification of residential areas according to physical vulnerability to natural hazards: a case study of Çanakkale, Turkey.

    PubMed

    Başaran-Uysal, Arzu; Sezen, Funda; Özden, Süha; Karaca, Öznur

    2014-01-01

    The selection of new settlement areas and the construction of safe buildings, as well as rendering built-up areas safe, are of great importance in mitigating the damage caused by natural disasters. Most cities in Turkey are unprepared for natural hazards. In this paper, Çanakkale, located in a first-degree seismic zone and sprawled around the Sarıçay Delta, is examined in terms of its physical vulnerability to natural hazards. Residential areas are analysed using GIS (geographic information system) and remote-sensing technologies in relation to selected indicators. Residential areas of the city are divided into zones according to an evaluation of geological characteristics, the built-up area's features, and urban infrastructure, and four risk zones are determined. The results of the analysis show that the areas of the city suitable for housing are very limited. In addition, the historical centre and the housing areas near the Sarıçay stream are shown to be most problematic in terms of natural disasters and sustainability. PMID:24325245

  4. Impact and effectiveness of risk mitigation strategies on the insurability of nanomaterial production: evidences from industrial case studies.

    PubMed

    Bergamaschi, Enrico; Murphy, Finbarr; Poland, Craig A; Mullins, Martin; Costa, Anna L; McAlea, Eamonn; Tran, Lang; Tofail, Syed A M

    2015-01-01

    Workers involved in producing nanomaterials or using nanomaterials in manufacturing plants are likely to have earlier and higher exposure to manufactured/engineered nanomaterials (ENM) than the general population. This is because both the volume handled and the probability of the effluence of 'free' nanoparticles from the handled volume are much higher during a production process than at any other stage in the lifecycle of nanomaterials and nanotechnology-enabled products. Risk assessment (RA) techniques using control banding (CB) as a framework for risk transfer represent a robust theory, but further progress on implementing the model is required so that risk can be transferred to insurance companies. Following a review of RA in general and hazard measurement in particular, we subject a Structural Alert Scheme methodology to three industrial case studies using ZrO2, TiO2, and multi-walled carbon nanotubes (MWCNT). The materials are tested in a pristine state and in a remediated (coated) state, and the respective emission and hazard rates are tested alongside the material performance as originally designed. To our knowledge, this is the first such implementation of a CB RA in conjunction with an ENM performance test, and it offers both manufacturers and underwriters an insight into future applications. PMID:25808636

  5. UAV-based Natural Hazard Management in High-Alpine Terrain - Case Studies from Austria

    NASA Astrophysics Data System (ADS)

    Sotier, Bernadette; Adams, Marc; Lechner, Veronika

    2015-04-01

    Unmanned Aerial Vehicles (UAVs) have become a standard tool for geodata collection, as they allow conducting on-demand mapping missions in a flexible, cost-effective manner at an unprecedented level of detail. Easy-to-use, high-performance image matching software makes it possible to process the collected aerial images to orthophotos and 3D-terrain models. Such up-to-date geodata have proven to be an important asset in natural hazard management: Processes like debris flows, avalanches, landslides, fluvial erosion and rock-fall can be detected and quantified; damages can be documented and evaluated. In the Alps, these processes mostly originate in remote areas, which are difficult and hazardous to access, thus presenting a challenging task for RPAS data collection. In particular, the problems include finding suitable landing and piloting-places, dealing with bad or no GPS-signals and the installation of ground control points (GCP) for georeferencing. At the BFW, RPAS have been used since 2012 to aid natural hazard management of various processes, of which three case studies are presented below. The first case study deals with the results from an attempt to employ UAV-based multi-spectral remote sensing to monitor the state of natural hazard protection forests. Images in the visible and near-infrared (NIR) band were collected using modified low-cost cameras, combined with different optical filters. Several UAV-flights were performed in 2014 in the 72 ha study site, which lies in the Wattental, Tyrol (Austria) between 1700 and 2050 m a.s.l., where the main tree species are stone pine and mountain pine. The matched aerial images were analysed using different UAV-specific vitality indices, evaluating both single- and dual-camera UAV-missions. To calculate the mass balance of a debris flow in the Tyrolean Halltal (Austria), an RPAS flight was conducted in autumn 2012.
The extreme alpine environment was challenging for both the mission and the evaluation of the aerial images: In the upper part of the steep channel no GPS-signal was available because of the high surrounding rock faces, and the landing area consisted of coarse gravel. Therefore, only a manual flight with a high risk of damage was possible. With the calculated RPAS-based digital surface model, created from the 600 aerial images, a chronologically resolved back-calculation of the last big debris-flow event could be performed. In a third case study, aerial images from RPAS were used for a similar investigation in Virgen, Eastern Tyrol (Austria). A debris flow in the Firschnitzbach catchment caused severe damage to the village of Virgen in August 2012. An RPAS-flight was performed in order to refine the estimated displaced debris mass for assessment purposes. The upper catchment of the Firschnitzbach is situated above the timberline and covers an area of 6.5 ha at a height difference of 1000 m. Therefore, three separate flights were necessary to achieve a sufficient image overlap. The central part of the Firschnitzbach consists of a steep and partly densely forested canyon / gorge, so no flight has been possible for this section up to now. The evaluation of the surface model from the images showed that only half of the estimated debris mass came from the upper part of the catchment.
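Mass-balance and displaced-volume estimates like the ones above are typically obtained by differencing pre- and post-event digital surface models; a minimal sketch (hypothetical grids and a uniform cell size, not the study's data) is:

```python
import numpy as np

def volume_change(dsm_before, dsm_after, cell_size):
    """Erosion and deposition volumes (m3) from DSM differencing."""
    dz = dsm_after - dsm_before          # elevation change per cell (m)
    cell_area = cell_size ** 2           # ground area of one cell (m2)
    deposition = dz[dz > 0].sum() * cell_area
    erosion = -dz[dz < 0].sum() * cell_area
    return erosion, deposition

# Toy 2x2 grids with 0.5 m cells: one cell scoured 1 m, one raised 2 m.
before = np.array([[10.0, 10.0], [10.0, 10.0]])
after = np.array([[9.0, 10.0], [10.0, 12.0]])
ero, dep = volume_change(before, after, cell_size=0.5)
```

In practice both surface models must be co-registered (e.g. via ground control points) before differencing, otherwise systematic offsets dominate the volume estimate.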

  6. A Study for Health Hazard Evaluation of Methylene Chloride Evaporated from the Tear Gas Mixture

    PubMed Central

    Chung, Eun-Kyo; Yi, Gwang-Yong; Chung, Kwang-Jae; Shin, Jung-Ah; Lee, In-Seop

    2010-01-01

    This study explored the health hazard to those exposed to methylene chloride by assessing its atmospheric concentration when a tear gas mixture was aerially dispersed. The concentration of methylene chloride ranged from 311.1 to 980.3 ppm (geometric mean, 555.8 ppm) 30 seconds after the dispersion started. However, the concentration fell rapidly to below 10 ppm after dispersion was completed. The concentration during the dispersion did not surpass the National Institute for Occupational Safety and Health 'immediately dangerous to life or health' value of 2,300 ppm, but did exceed the American Conference of Governmental Industrial Hygienists excursion limit of 250 ppm. Since methylene chloride is highly volatile (vapor pressure, 349 mmHg at 20 °C), the post-dispersion atmospheric concentration can rise instantaneously. Moreover, the o-chlorobenzylidenemalononitrile formulation of tear gas (CS gas) is an acute upper respiratory tract irritant. Therefore, tear gas mixtures should be handled with great care. PMID:22953168
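Screening a measured concentration against the two short-term reference values cited in the abstract is a simple comparison; a sketch using those published limits is:

```python
# Reference values cited in the abstract: NIOSH IDLH and ACGIH excursion limit
# for methylene chloride (ppm).
IDLH_PPM = 2300.0
EXCURSION_LIMIT_PPM = 250.0

def classify_exposure(conc_ppm):
    """Crude screening of a short-term concentration against the two limits."""
    if conc_ppm >= IDLH_PPM:
        return "immediately dangerous to life or health"
    if conc_ppm > EXCURSION_LIMIT_PPM:
        return "exceeds excursion limit"
    return "below short-term limits"

# Geometric mean, observed maximum, and a post-dispersion reading (ppm).
flags = [classify_exposure(c) for c in (555.8, 980.3, 8.0)]
```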

  7. CMMAD Usability Case Study in Support of Countermine and Hazard Sensing

    SciTech Connect

    Victor G. Walker; David I. Gertman

    2010-04-01

    During field trials, operator usability data were collected in support of lane clearing missions and hazard sensing for two robot platforms with Robot Intelligence Kernel (RIK) software and sensor scanning payloads onboard. The tests featured autonomous and shared robot autonomy levels where tasking of the robot used a graphical interface featuring mine location and sensor readings. The goal of this work was to provide insights that could be used to further technology development. The efficacy of countermine systems in terms of mobility, search, path planning, detection, and localization were assessed. Findings from objective and subjective operator interaction measures are reviewed along with commentary from soldiers having taken part in the study who strongly endorse the system.

  8. Geomorphological surveys and software simulations for rock fall hazard assessment: a case study in the Italian Alps

    NASA Astrophysics Data System (ADS)

    Devoto, S.; Boccali, C.; Podda, F.

    2014-12-01

    In northern Italy, fast-moving landslides represent a significant threat to the population and human facilities. In the eastern portion of the Italian Alps, rock falls are recurrent and are often responsible for casualties or severe damage to roads and buildings. This type of landslide is frequent in mountain ranges characterised by strong relief energy and is triggered by earthquakes or copious rainfall, which often exceeds 2000 mm yr-1. These factors cause morphological dynamics with intense slope erosion and degradation processes. This work appraises the rock-fall hazard related to the presence of several large unstable blocks located at the top of a limestone peak, approximately 500 m NW of the Village of Cimolais. Field surveys recognised a limestone block exceeding a volume of 400 m3 and identified this block as the most hazardous for Cimolais Village because of its proximity to the rocky cliff. A first assessment of the possible transit and stop areas was carried out through in-depth traditional activities, such as geomorphological mapping and aerial photo analysis. The output of the field surveys was a detailed land use map, which provided a fundamental starting point for rock fall software analysis. The geomorphological observations were correlated with DTMs derived from regional topography and Airborne Laser Scanning (ALS) surveys to recognise possible rock fall routes. To properly simulate rock fall trajectories with a hybrid computer program, particular attention was devoted to the correct quantification of input parameters, such as restitution coefficients and the horizontal acceleration associated with earthquakes, which historically occur in this portion of Italy. The simulation outputs regarding the distribution of rock fall end points and kinetic energy along rock-fall paths highlight the hazardous situation for Cimolais Village.
For this reason, mitigation works have been suggested to reduce the landslide risk immediately. This proposal accounts for the large volume of the blocks, which, in the event of a fall, would render the passive mitigation measures already in place behind Cimolais ineffective.
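Lumped-mass rock-fall programs of the kind referenced above reduce a block to a point whose speed is cut at each impact by restitution coefficients; a toy sketch under assumed values (the abstract does not give the calibrated coefficients or drop height) is:

```python
import math

def impact_energies(mass_kg, drop_height_m, restitution=0.35, n_impacts=4):
    """Kinetic energy (J) at successive impacts for a free-falling block whose
    rebound speed is scaled by a single restitution coefficient (toy model)."""
    g = 9.81
    v = math.sqrt(2 * g * drop_height_m)  # impact speed after the first fall
    energies = []
    for _ in range(n_impacts):
        energies.append(0.5 * mass_kg * v ** 2)
        v *= restitution                  # speed retained after each rebound
    return energies

# A ~400 m3 limestone block (density ~2600 kg/m3, volume as in the abstract),
# with an assumed 100 m drop and restitution 0.35 -- illustrative values only.
e = impact_energies(400 * 2600, 100.0)
```

Real codes distinguish normal and tangential restitution and follow the ballistic trajectory between impacts on the actual terrain profile; this sketch only shows why impact energy decays rapidly over successive rebounds.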

  9. A Geophysical Study of the Cadell Fault Scarp for Earthquake Hazard Assessment in Southeast Australia

    NASA Astrophysics Data System (ADS)

    Collins, C. D.

    2004-12-01

    The historical record of seismicity in Australia is too short (less than 150 years) to confidently define seismic source zones, particularly the recurrence rates for large, potentially damaging earthquakes, and this leads to uncertainty in hazard assessments. One way to extend this record is to search for evidence of earthquakes in the landscape, including Quaternary fault scarps, tilt blocks and disruptions to drainage patterns. A recent Geoscience Australia compilation of evidence of Quaternary tectonics identified over one hundred examples of potentially recent structures in Australia, testifying to the fact that a greater hazard may exist from large earthquakes than is evident from the recorded history alone. Most of these structures have not been studied in detail and have not been dated, so the recurrence rate for damaging events is unknown. One example of recent tectonic activity lies on the Victoria-New South Wales border, where geologically recent uplift has resulted in the formation of the Cadell Fault Scarp, damming Australia's largest river, the Murray River, and diverting its course. The scarp extends along a north-south strike for at least 50 km and reaches a maximum height of about 13 metres. The scarp displaces sands and clays of the Murray Basin sediments which overlie Palaeozoic bedrock at a depth of 100 to 250 m. There is evidence that the river system has eroded the scarp and displaced the topographic expression away from the location where the fault, or faults, meets the surface. Thus, to locate potential sites for trenching which intersect the faults, Geoscience Australia acquired ground-penetrating radar, resistivity and multi-channel high-resolution seismic reflection and refraction data along traverses across the scarp. The seismic data were acquired using an IVI T15000 MiniVib vibrator operating in p-wave mode, and a 24-channel Stratavisor acquisition system. 
Four 10-second sweeps, with a frequency range of 10-240 Hz, were carried out every 10 m at each receiver location; the receivers comprised groups of four vertical component 10 Hz geophones. Additional sources were located at offsets of up to a kilometre to record refraction data from the basement. A hammer source was also used for comparison. As the resolution of the seismic data precludes imaging at very shallow depths, GPR and resistivity data were acquired at selected locations to sample the upper 3 metres. The data are currently being processed, and a synthesis of recent geophysical and geological investigations will be presented to describe the architecture of the Cadell Fault Scarp. The results will be used to constrain earthquake hazard assessments for the region and for south east Australia in general.

  10. Case studies. [hazardous effects of early medical use of X-rays

    NASA Technical Reports Server (NTRS)

    1975-01-01

    The characteristics of the technology assessment process which were manifested in response to the hazardous effects of early medical uses of X-rays are considered. Other topics discussed include: controlling the potential hazards of government-sponsored technology, genetic technology, community level impacts of expanded underground coal mining, and an integrated strategy for aircraft/airport noise abatement.

  11. AMERICAN HEALTHY HOMES SURVEY: A NATIONAL STUDY OF RESIDENTIAL RELATED HAZARDS

    EPA Science Inventory

    The US Environmental Protection Agency's (EPA) National Exposure Research Laboratory (NERL) and the US Department of Housing and Urban Development's (HUD) Office of Healthy Homes and Lead Hazard Control conducted a national survey of housing related hazards in US residences. The...

  12. Marginal hazards model for case-cohort studies with multiple disease outcomes.

    PubMed

    Kang, S; Cai, J

    2009-12-01

    Case-cohort study designs are widely used to reduce the cost of large cohort studies while achieving the same goals, especially when the disease rate is low. A key advantage of the case-cohort study design is its capacity to use the same subcohort for several diseases or for several subtypes of disease. In order to compare the effect of a risk factor on different types of diseases, times to different events need to be modelled simultaneously. Valid statistical methods that take the correlations among the outcomes from the same subject into account need to be developed. To this end, we consider marginal proportional hazards regression models for case-cohort studies with multiple disease outcomes. We also consider generalized case-cohort designs that do not require sampling all the cases, which is more realistic for multiple disease outcomes. We propose an estimating equation approach for parameter estimation with two different types of weights. Consistency and asymptotic normality of the proposed estimators are established. The large-sample approximation works well in small samples in simulation studies. The proposed methods are applied to the Busselton Health Study. PMID:23946547
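For a single covariate, the weighted partial-likelihood score underlying such estimating equations can be sketched as follows; the weights `w` stand in for the case-cohort sampling weights (the paper's two specific weight types are not reproduced here), and the data are simulated rather than from the Busselton Health Study:

```python
import numpy as np

def cox_score(beta, x, time, event, w):
    """Weighted Cox partial-likelihood score U(beta) for a single covariate."""
    score = 0.0
    for i in np.where(event == 1)[0]:
        at_risk = time >= time[i]                    # risk set at event time i
        r = w[at_risk] * np.exp(beta * x[at_risk])   # weighted risk scores
        score += w[i] * (x[i] - np.sum(r * x[at_risk]) / np.sum(r))
    return score

def solve_beta(x, time, event, w, lo=-5.0, hi=5.0, tol=1e-8):
    """Bisection on the monotonically decreasing score to find U(beta) = 0."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if cox_score(mid, x, time, event, w) > 0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Simulated data: hazard proportional to exp(0.5 * x), no censoring.
rng = np.random.default_rng(0)
n = 200
x = rng.normal(size=n)
time = rng.exponential(scale=np.exp(-0.5 * x))
event = np.ones(n, dtype=int)
w = np.ones(n)        # case-cohort sampling weights would replace these
beta_hat = solve_beta(x, time, event, w)
```

With unit weights this reduces to the ordinary Cox score; the multiple-outcome extension stacks one such equation per disease and adjusts the variance for within-subject correlation.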

  13. Review: Assessment of completeness of reporting in intervention studies using livestock: an example from pain mitigation interventions in neonatal piglets.

    PubMed

    O'Connor, A; Anthony, R; Bergamasco, L; Coetzee, J F; Dzikamunhenga, R S; Johnson, A K; Karriker, L A; Marchant-Forde, J N; Martineau, G P; Millman, S T; Pajor, E A; Rutherford, K; Sprague, M; Sutherland, M A; von Borell, E; Webb, S R

    2016-04-01

    Accurate and complete reporting of study methods, results and interpretation are essential components for any scientific process, allowing end-users to evaluate the internal and external validity of a study. When animals are used in research, excellence in reporting is expected as a matter of continued ethical acceptability of animal use in the sciences. Our primary objective was to assess completeness of reporting for a series of studies relevant to mitigation of pain in neonatal piglets undergoing routine management procedures. Our second objective was to illustrate how authors can report the items in the Reporting guidElines For randomized controLled trials for livEstoCk and food safety (REFLECT) statement using examples from the animal welfare science literature. A total of 52 studies from 40 articles were evaluated using a modified REFLECT statement. No single study reported all REFLECT checklist items. Seven studies reported specific objectives with testable hypotheses. Six studies identified primary or secondary outcomes. Randomization and blinding were considered to be partially reported in 21 and 18 studies, respectively. No studies reported the rationale for sample sizes. Several studies failed to report key design features such as units for measurement, means, standard deviations, standard errors for continuous outcomes or comparative characteristics for categorical outcomes expressed as either rates or proportions. In the discipline of animal welfare science, authors, reviewers and editors are encouraged to use available reporting guidelines to ensure that scientific methods and results are adequately described and free of misrepresentations and inaccuracies. Complete and accurate reporting increases the ability to apply the results of studies to the decision-making process and prevent wastage of financial and animal resources. PMID:26556522

  14. Seismic Hazard Analysis in EL Paso/juarez Area from Study of Young Fault Scarps

    NASA Astrophysics Data System (ADS)

    Ashenfelter, K. R.

    2012-12-01

    The El Paso-Juarez metropolitan area contains a known record of active faulting, but also has one of the most poorly known paleoseismic records. The scarcity of data means that nearly 2 million people are exposed to a seismic hazard with little known about the actual risk. Active faults are known along the eastern side of the Franklin Mountains, as are young ruptures within the Hueco Bolson in East El Paso, yet the only fault with paleoseismic studies is the East Franklins fault. Recent population increases in the El Paso region have led to a construction boom in east El Paso, and many of these construction sites cross known Quaternary fault ruptures. This research project contains two potential components: 1) An exploratory component: students can use existing fault maps and high-resolution aerial photography to seek out sites where active construction might be unearthing exposures of young fault ruptures. The study is exploratory, and involves careful mapping using field GIS systems to document areas for potential study, map possible faults, etc. 2) An active fault study in an urbanized environment: The East Franklins fault is a known active fault. The scarp is exposed near Trans-Mountain Road and along some side streets in NE El Paso. Potential studies include careful mapping of fault scarp topographic profiles and mapping of surface traces.

  15. Remedial Action Assessment System (RAAS): Evaluation of selected feasibility studies of CERCLA (Comprehensive Environmental Response, Compensation, and Liability Act) hazardous waste sites

    SciTech Connect

    Whelan, G.; Hartz, K.E.; Hilliard, N.D. and Associates, Seattle, WA

    1990-04-01

    Congress and the public have mandated much closer scrutiny of the management of chemically hazardous and radioactive mixed wastes. Legislative language, regulatory intent, and prudent technical judgment call for using scientifically based studies to assess current conditions and to evaluate and select cost-effective strategies for mitigating unacceptable situations. The NCP requires that a Remedial Investigation (RI) and a Feasibility Study (FS) be conducted at each site targeted for remedial response action. The goal of the RI is to obtain the site data needed so that the potential impacts on public health or welfare or on the environment can be evaluated and so that the remedial alternatives can be identified and selected. The goal of the FS is to identify and evaluate alternative remedial actions (including a no-action alternative) in terms of their cost, effectiveness, and engineering feasibility. The NCP also requires the analysis of impacts on public health and welfare and on the environment; this analysis is the endangerment assessment (EA). In summary, the RI, EA, and FS processes require assessment of the contamination at a site, of the potential impacts on public health or the environment from that contamination, and of alternative RAs that could address potential impacts to the environment. 35 refs., 7 figs., 1 tab.

  16. Mitigating Challenges of Using Virtual Reality in Online Courses: A Case Study

    ERIC Educational Resources Information Center

    Stewart, Barbara; Hutchins, Holly M.; Ezell, Shirley; De Martino, Darrell; Bobba, Anil

    2010-01-01

    Case study methodology was used to describe the challenges experienced in the development of a virtual component for a freshman-level undergraduate course. The purpose of the project was to use a virtual environment component to provide an interactive and engaging learning environment. While some student and faculty feedback was positive, this…

  17. A Competence-Based Science Learning Framework Illustrated through the Study of Natural Hazards and Disaster Risk Reduction

    ERIC Educational Resources Information Center

    Oyao, Sheila G.; Holbrook, Jack; Rannikmäe, Miia; Pagunsan, Marmon M.

    2015-01-01

    This article proposes a competence-based learning framework for science teaching, applied to the study of "big ideas", in this case to the study of natural hazards and disaster risk reduction (NH&DRR). The framework focuses on new visions of competence, placing emphasis on nurturing connectedness and behavioral actions toward…

  18. 75 FR 3912 - Agency Information Collection Activities: Proposed Collection; Comment Request, 1660-0076; Hazard...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-01-25

    ...; Comment Request, 1660-0076; Hazard Mitigation Grant Program Application and Reporting AGENCY: Federal... Notice seeks comments concerning the Hazard Mitigation Grant Program application and reporting..., Grants Policy Branch, Mitigation Division, (202) 646-3321 for additional information. You may contact...

  19. Natural phenomena hazards, Hanford Site, Washington

    SciTech Connect

    Conrads, T.J.

    1998-09-29

    This document presents the natural phenomena hazard loads for use in implementing DOE Order 5480.28, Natural Phenomena Hazards Mitigation, and supports development of double-shell tank systems specifications at the Hanford Site in south-central Washington State. The natural phenomena covered are seismic, flood, wind, volcanic ash, lightning, snow, temperature, solar radiation, suspended sediment, and relative humidity.

  20. Methane emission from ruminants and solid waste: A critical analysis of baseline and mitigation projections for climate and policy studies

    NASA Astrophysics Data System (ADS)

    Matthews, E.

    2012-12-01

    Current and projected estimates of methane (CH4) emission from anthropogenic sources are numerous but largely unexamined or compared. Presented here is a critical appraisal of CH4 projections used in climate-chemistry and policy studies. We compare emissions for major CH4 sources from several groups, including our own new data and RCP projections developed for climate-chemistry models for the next IPCC Assessment Report (AR5). We focus on current and projected baseline and mitigation emissions from ruminant animals and solid waste that are both predicted to rise dramatically in coming decades, driven primarily by developing countries. For waste, drivers include increasing urban populations, higher per capita waste generation due to economic growth and increasing landfilling rates. Analysis of a new global data base detailing waste composition, collection and disposal indicates that IPCC-based methodologies and default data overestimate CH4 emission for the current period which cascades into substantial overestimates in future projections. CH4 emission from solid waste is estimated to be ~10-15 Tg CH4/yr currently rather than the ~35 Tg/yr often reported in the literature. Moreover, emissions from developing countries are unlikely to rise rapidly in coming decades because new management approaches, such as sanitary landfills, that would increase emissions are maladapted to infrastructures in these countries and therefore unlikely to be implemented. The low current emission associated with solid waste (~10 Tg), together with future modest growth, implies that mitigation of waste-related CH4 emission is a poor candidate for slowing global warming. In the case of ruminant animals (~90 Tg CH4/yr currently), the dominant assumption driving future trajectories of CH4 emission is a substantial increase in meat and dairy consumption in developing countries to be satisfied by growing animal populations. 
Unlike solid waste, current ruminant emission estimates exhibit a narrow range across studies, which does not necessarily signal low uncertainty but rather a reliance on similar animal statistics and emission factors. The UN Food and Agriculture Organization (FAO) projects 2000-2030 growth rates of livestock for most developing countries at 2% to >3% annually. However, the assumption of rapidly rising meat consumption is supported neither by current trends nor by resource availability. For example, increased meat consumption in China and other developing countries is primarily of poultry and pork, which do not affect CH4 emissions, suggesting that the rapid growth projected for all animals, boosting growth in CH4 emission, will not occur. From a resource standpoint, large increases in cattle, sheep, and goat populations, especially for African countries (~60% by 2030), are not supportable on arid grazing lands that require very low stocking rates and semi-nomadic management. Increases projected for African animal populations would require either that about 2/3 more animals be grazed on increasingly drier lands or that all non-forested areas become grazing lands. Similar to solid waste, future methane emission from ruminant animals is likely to grow only modestly; even so, animals are a poor candidate for CH4 mitigation because they are dispersed across widely varying agricultural systems under very local management.
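The scale of the projections discussed above follows from simple compound-growth arithmetic. A minimal sketch (illustrative only; the rates are the FAO-style 2-3% annual figures cited in the abstract, not data from the underlying study):

```python
def projected_growth_factor(annual_rate, years):
    """Compound annual growth: multiplier after `years` at rate `annual_rate`."""
    return (1.0 + annual_rate) ** years

# Applying the cited 2% and 3% annual livestock growth rates over 2000-2030
for rate in (0.02, 0.03):
    factor = projected_growth_factor(rate, 30)
    print(f"{rate:.0%}/yr over 30 yr -> x{factor:.2f} ({factor - 1:.0%} increase)")
```

Note that even the lower 2% path implies an ~81% increase over 30 years, well above the ~60% rise the abstract cites for African herds, underscoring how sensitive the emission projections are to the assumed annual rate.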

  1. Safety in earth orbit study. Volume 2: Analysis of hazardous payloads, docking, on-board survivability

    NASA Technical Reports Server (NTRS)

    1972-01-01

    Detailed and supporting analyses are presented of the hazardous payloads, docking, and on-board survivability aspects connected with earth orbital operations of the space shuttle program. The hazards resulting from delivery, deployment, and retrieval of hazardous payloads, and from handling and transport of cargo between orbiter, sortie modules, and space station are identified and analyzed. The safety aspects of shuttle orbiter to modular space station docking include docking for assembly of the space station, normal resupply docking, and emergency docking. Personnel traffic patterns, escape routes, and on-board survivability are analyzed for the orbiter with crew and passengers, sortie modules, and the modular space station, under normal, emergency, and EVA and IVA operations.

  2. Exploratory study of burn time, duty factor, and fluence on ITER activation hazards

    SciTech Connect

    Piet, S.J.

    1992-08-01

    The safety analyses for the Conceptual Design Activity (CDA) of the International Thermonuclear Experimental Reactor (ITER) were based on the simplifying assumption that the activation of materials occurs continuously. Since the analyses showed a significant hazard, it is appropriate to examine how much hazard reduction might occur if this conservative assumption were relaxed. This report explores how much reduction might be gained by considering non-continuous operation, that is, by considering plasma burn time, duty factor, and integrated fluence. Other factors impacting activation hazards - material choice, flux, and size - are not considered here.

  3. Field study of exhaust fans for mitigating indoor air quality problems: Final report

    SciTech Connect

    Grimsrud, D.T.; Szydlowski, R.F.; Turk, B.H.

    1986-09-01

    Residential ventilation in the United States housing stock is provided primarily by infiltration, the natural leakage of outdoor air into a building through cracks and holes in the building shell. Since ventilation is the dominant mechanism for control of indoor pollutant concentrations, low infiltration rates caused by fluctuations in weather conditions may lead to high indoor pollutant concentrations. Supplemental mechanical ventilation can be used to eliminate these periods of low infiltration. This study examined the effects of small, continuously operating exhaust fans on pollutant concentrations and energy use in residences.
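The mechanism described above, ventilation diluting indoor pollutant sources, can be sketched with the standard well-mixed single-zone steady-state mass balance, C = C_out + S/Q. The house volume, source strength, and air-change rates below are hypothetical illustration values, not figures from the study:

```python
def steady_state_concentration(source_mg_per_h, airflow_m3_per_h, outdoor_mg_per_m3=0.0):
    """Well-mixed single-zone steady state: C = C_out + S / Q."""
    return outdoor_mg_per_m3 + source_mg_per_h / airflow_m3_per_h

volume = 300.0  # m^3, hypothetical house volume
source = 10.0   # mg/h, hypothetical constant indoor pollutant source

# Compare a low-infiltration period (0.2 ach) with supplemental
# exhaust-fan ventilation raising the total to 0.5 ach.
for ach in (0.2, 0.5):
    q = ach * volume  # air-change rate converted to volumetric flow, m^3/h
    c = steady_state_concentration(source, q)
    print(f"{ach} ach -> {c:.3f} mg/m^3")
```

The sketch shows why a small continuously operating fan matters: steady-state concentration scales inversely with total airflow, so raising 0.2 ach to 0.5 ach cuts the indoor concentration by a factor of 2.5.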

  4. Laboratory Study of Polychlorinated Biphenyl Contamination and Mitigation in Buildings -- Part 4. Evaluation of the Activated Metal Treatment System (AMTS) for On-site Destruction of PCBs

    EPA Science Inventory

    This is the fourth and final report in the series entitled Laboratory Study of Polychlorinated Biphenyl (PCB) Contamination and Mitigation in Buildings. It evaluates the performance of an on-site PCB destruction method known as the AMTS method...
