Science.gov

Sample records for hazard mitigation studies

  1. Unacceptable Risk: Earthquake Hazard Mitigation in One California School District. Hazard Mitigation Case Study.

    ERIC Educational Resources Information Center

    California State Office of Emergency Services, Sacramento.

    Earthquakes are a perpetual threat to California's school buildings. School administrators must be aware that hazard mitigation means much more than simply having a supply of water bottles in the school; it means getting everyone involved in efforts to prevent tragedies from occurring in school building in the event of an earthquake. The PTA in…

  2. Study proposes wholesale change in thinking about natural hazards mitigation

    NASA Astrophysics Data System (ADS)

    Showstack, Randy

    The “lollapaloozas,” the major natural catastrophes, are getting bigger and bigger, and it is time to confront this growing problem by dramatically changing the way that society approaches natural hazard mitigation, conducts itself in relation to the natural environment, and accepts responsibility for activities that could lead to or increase disasters, according to Dennis Mileti, principal investigator of a new study on natural hazards, and director of the Natural Hazards Research and Applications Information Center at the University of Colorado at Boulder.Since 1989, the United States has been struck by seven of the nation's 10 most costly natural disasters, including the 1994 Northridge earthquake in California that caused $25 billion in damages. Also since 1989, the financial cost of natural hazards in the United States—which includes floods, earthquakes, hurricanes, and wildfires, as well as landslides, heat, and fog—has frequently averaged $1 billion per week, a price that some experts say will continue rising. Internationally, the Kobe, Japan, earthquake cost more than $100 billion and is the most financially costly disaster in world history None of these figures include indirect losses related to natural disasters, such as lost economic productivity

  3. Mitigating lightning hazards

    SciTech Connect

    Hasbrouck, R.

    1996-05-01

    A new draft document provides guidance for assessing and mitigating the effects of lightning hazards on a Department of Energy (or any other) facility. Written by two Lawrence Livermore Engineers, the document combines lightning hazard identification and facility categorization with a new concept, the Lightning Safety System, to help dispel the confusion and mystery surrounding lightning and its effects. The guidance is of particular interest to DOE facilities storing and handling nuclear and high-explosive materials. The concepts presented in the document were used to evaluate the lightning protection systems of the Device Assembly Facility at the Nevada Test Site.

  4. Numerical Study on Tsunami Hazard Mitigation Using a Submerged Breakwater

    PubMed Central

    Yoo, Jeseon; Han, Sejong; Cho, Yong-Sik

    2014-01-01

    Most coastal structures have been built in surf zones to protect coastal areas. In general, the transformation of waves in the surf zone is quite complicated and numerous hazards to coastal communities may be associated with such phenomena. Therefore, the behavior of waves in the surf zone should be carefully analyzed and predicted. Furthermore, an accurate analysis of deformed waves around coastal structures is directly related to the construction of economically sound and safe coastal structures because wave height plays an important role in determining the weight and shape of a levee body or armoring material. In this study, a numerical model using a large eddy simulation is employed to predict the runup heights of nonlinear waves that passed a submerged structure in the surf zone. Reduced runup heights are also predicted, and their characteristics in terms of wave reflection, transmission, and dissipation coefficients are investigated. PMID:25215334

  5. Numerical study on tsunami hazard mitigation using a submerged breakwater.

    PubMed

    Ha, Taemin; Yoo, Jeseon; Han, Sejong; Cho, Yong-Sik

    2014-01-01

    Most coastal structures have been built in surf zones to protect coastal areas. In general, the transformation of waves in the surf zone is quite complicated and numerous hazards to coastal communities may be associated with such phenomena. Therefore, the behavior of waves in the surf zone should be carefully analyzed and predicted. Furthermore, an accurate analysis of deformed waves around coastal structures is directly related to the construction of economically sound and safe coastal structures because wave height plays an important role in determining the weight and shape of a levee body or armoring material. In this study, a numerical model using a large eddy simulation is employed to predict the runup heights of nonlinear waves that passed a submerged structure in the surf zone. Reduced runup heights are also predicted, and their characteristics in terms of wave reflection, transmission, and dissipation coefficients are investigated. PMID:25215334

  6. Teaching Hazards Mitigation.

    ERIC Educational Resources Information Center

    Abernethy, James

    1980-01-01

    It is recommended that courses be provided for architectural students in postoccupancy building performance and user experience. A course in disaster mitigation is described. It was introduced to increase student awareness of the importance of design decisions in building safety. (MSE)

  7. Washington Tsunami Hazard Mitigation Program

    NASA Astrophysics Data System (ADS)

    Walsh, T. J.; Schelling, J.

    2012-12-01

    Washington State has participated in the National Tsunami Hazard Mitigation Program (NTHMP) since its inception in 1995. We have participated in the tsunami inundation hazard mapping, evacuation planning, education, and outreach efforts that generally characterize the NTHMP efforts. We have also investigated hazards of significant interest to the Pacific Northwest. The hazard from locally generated earthquakes on the Cascadia subduction zone, which threatens tsunami inundation in less than hour following a magnitude 9 earthquake, creates special problems for low-lying accretionary shoreforms in Washington, such as the spits of Long Beach and Ocean Shores, where high ground is not accessible within the limited time available for evacuation. To ameliorate this problem, we convened a panel of the Applied Technology Council to develop guidelines for construction of facilities for vertical evacuation from tsunamis, published as FEMA 646, now incorporated in the International Building Code as Appendix M. We followed this with a program called Project Safe Haven (http://www.facebook.com/ProjectSafeHaven) to site such facilities along the Washington coast in appropriate locations and appropriate designs to blend with the local communities, as chosen by the citizens. This has now been completed for the entire outer coast of Washington. In conjunction with this effort, we have evaluated the potential for earthquake-induced ground failures in and near tsunami hazard zones to help develop cost estimates for these structures and to establish appropriate tsunami evacuation routes and evacuation assembly areas that are likely to to be available after a major subduction zone earthquake. We intend to continue these geotechnical evaluations for all tsunami hazard zones in Washington.

  8. Contributions of Nimbus 7 TOMS Data to Volcanic Study and Hazard Mitigation

    NASA Technical Reports Server (NTRS)

    Krueger, Arlin J.; Bluth, G. J. S.; Schaefer, S. A.

    1998-01-01

    Nimbus TOMS data have led to advancements among many volcano-related scientific disciplines, from the initial ability to quantify SO2 clouds leading to derivations of eruptive S budgets and fluxes, to tracking of individual clouds, assessing global volcanism and atmospheric impacts. Some of the major aspects of TOMS-related research, listed below, will be reviewed and updated: (1) Measurement of volcanic SO2 clouds: Nimbus TOMS observed over 100 individual SO2 clouds during its mission lifetime; large explosive eruptions are now routinely and reliably measured by satellite. (2) Eruption processes: quantification of SO2 emissions have allowed assessments of eruption sulfur budgets, the evaluation of "excess" sulfur, and inferences of H2S emissions. (3) Detection of ash: TOMS data are now used to detect volcanic particulates in the atmosphere, providing complementary analyses to infrared methods of detection. Paired TOMS and AVHRR studies have provided invaluable information on volcanic cloud compositions and processes. (4) Cloud tracking and hazard mitigation: volcanic clouds can be considered gigantic tracers in the atmosphere, and studies of the fates of these clouds have led to new knowledge of their physical and chemical dispersion in the atmosphere for predictive models. (5) Global trends: the long term data set has provided researchers an unparalleled record of explosive volcanism, and forms a key component in assessing annual to decadal trends in global S emissions. (6) Atmospheric impacts: TOMS data have been linked to independent records of atmospheric change, in order to compare cause and effect processes following a massive injection of SO2 into the atmosphere. (7) Future TOMS instruments and applications: Nimbus TOMS has given way to new satellite platforms, with several wavelength and resolution modifications. New efforts to launch a geostationary TOMS could provide unprecedented observations of volcanic activity.

  9. Landslide hazard mitigation in North America

    USGS Publications Warehouse

    Wieczorek, G.F.; Leahy, P.P.

    2008-01-01

    Active landslides throughout the states and territories of the United States result in extensive property loss and 25-50 deaths per year. The U.S. Geological Survey (USGS) has a long history of detailed examination of landslides since the work of Howe (1909) in the San Juan Mountains of Colorado. In the last four decades, landslide inventory maps and landslide hazard maps have depicted landslides of different ages, identified fresh landslide scarps, and indicated the direction of landslide movement for different regions of the states of Colorado, California, and Pennsylvania. Probability-based methods improve landslide hazards assessments. Rainstorms, earthquakes, wildfires, and volcanic eruptions can trigger landslides. Improvements in remote sensing of rainfall make it possible to issue landslide advisories and warnings for vulnerable areas. From 1986 to 1995, the USGS issued hazard warnings based on rainfall in the San Francisco Bay area. USGS workers also identified rainfall thresholds triggering landslides in Puerto Rico, Hawaii, Washington, and the Blue Ridge Mountains of central Virginia. Detailed onsite monitoring of landslides near highways in California and Colorado aided transportation officials. The USGS developed a comprehensive, multi-sector, and multi-agency strategy to mitigate landslide hazards nationwide. This study formed the foundation of the National Landslide Hazards Mitigation Strategy. The USGS, in partnership with the U.S. National Weather Service and the State of California, began to develop a real-time warning system for landslides from wildfires in Southern California as a pilot study in 2005.

  10. Predictability and extended-range prognosis in natural hazard risk mitigation process: A case study over west Greece

    NASA Astrophysics Data System (ADS)

    Matsangouras, Ioannis T.; Nastos, Panagiotis T.

    2014-05-01

    Natural hazards pose an increasing threat to society and new innovative techniques or methodologies are necessary to be developed, in order to enhance the risk mitigation process in nowadays. It is commonly accepted that disaster risk reduction is a vital key for future successful economic and social development. The systematic improvement accuracy of extended-range prognosis products, relating with monthly and seasonal predictability, introduced them as a new essential link in risk mitigation procedure. Aiming at decreasing the risk, this paper presents the use of seasonal and monthly forecasting process that was tested over west Greece from September to December, 2013. During that season significant severe weather events occurred, causing significant impact to the local society (severe storms/rainfalls, hail, flash floods, etc). Seasonal and monthly forecasting products from European Centre for Medium-Range Weather Forecasts (ECMWF) depicted, with probabilities stratified by terciles, areas of Greece where significant weather may occur. As atmospheric natural hazard early warning systems are able to deliver warnings up to 72 hours in advance, this study illustrates that extended-range prognosis could be introduced as a new technique in risk mitigation. Seasonal and monthly forecast products could highlight extended areas where severe weather events may occur in one month lead time. In addition, a risk mitigation procedure, that extended prognosis products are adopted, is also presented providing useful time to preparedness process at regional administration level.

  11. An economic and geographic appraisal of a spatial natural hazard risk: a study of landslide mitigation rules

    USGS Publications Warehouse

    Bernknopf, R.L.; Brookshire, D.S.; Campbell, R.H.; Shapiro, C.D.

    1988-01-01

    Efficient mitigation of natural hazards requires a spatial representation of the risk, based upon the geographic distribution of physical parameters and man-related development activities. Through such a representation, the spatial probability of landslides based upon physical science concepts is estimated for Cincinnati, Ohio. Mitigation programs designed to reduce loss from landslide natural hazards are then evaluated. An optimum mitigation rule is suggested that is spatially selective and is determined by objective measurements of hillside slope and properties of the underlying soil. -Authors

  12. Playing against nature: improving earthquake hazard mitigation

    NASA Astrophysics Data System (ADS)

    Stein, S. A.; Stein, J.

    2012-12-01

    The great 2011 Tohoku earthquake dramatically demonstrated the need to improve earthquake and tsunami hazard assessment and mitigation policies. The earthquake was much larger than predicted by hazard models, and the resulting tsunami overtopped coastal defenses, causing more than 15,000 deaths and $210 billion damage. Hence if and how such defenses should be rebuilt is a challenging question, because the defences fared poorly and building ones to withstand tsunamis as large as March's is too expensive,. A similar issue arises along the Nankai Trough to the south, where new estimates warning of tsunamis 2-5 times higher than in previous models raise the question of what to do, given that the timescale on which such events may occur is unknown. Thus in the words of economist H. Hori, "What should we do in face of uncertainty? Some say we should spend our resources on present problems instead of wasting them on things whose results are uncertain. Others say we should prepare for future unknown disasters precisely because they are uncertain". Thus society needs strategies to mitigate earthquake and tsunami hazards that make economic and societal sense, given that our ability to assess these hazards is poor, as illustrated by highly destructive earthquakes that often occur in areas predicted by hazard maps to be relatively safe. Conceptually, we are playing a game against nature "of which we still don't know all the rules" (Lomnitz, 1989). Nature chooses tsunami heights or ground shaking, and society selects the strategy to minimize the total costs of damage plus mitigation costs. As in any game of chance, we maximize our expectation value by selecting the best strategy, given our limited ability to estimate the occurrence and effects of future events. We thus outline a framework to find the optimal level of mitigation by balancing its cost against the expected damages, recognizing the uncertainties in the hazard estimates. This framework illustrates the role of the uncertainties and the need to candidly assess them. It can be applied to exploring policies under various hazard scenarios and mitigating other natural hazards.ariation in total cost, the sum of expected loss and mitigation cost, as a function of mitigation level. The optimal level of mitigation, n*, minimizes the total cost. The expected loss depends on the hazard model, so the better the hazard model, the better the mitigation policy (Stein and Stein, 2012).

  13. Mitigation of Hazardous Comets and Asteroids

    NASA Astrophysics Data System (ADS)

    Belton, Michael J. S.; Morgan, Thomas H.; Samarasinha, Nalin H.; Yeomans, Donald K.

    2004-11-01

    Preface; 1. Recent progress in interpreting the nature of the near-Earth object population W. Bottke, A. Morbidelli and R. Jedicke; 2. Earth impactors: orbital characteristics and warning times S. R. Chesley and T. B. Spahr; 3. The role of radar in predicting and preventing asteroid and comet collisions with Earth S. J. Ostro and J. D. Giorgini; 4. Interior structures for asteroids and cometary nuclei E. Asphaug; 5. What we know and don't know about surfaces of potentially hazardous small bodies C. R. Chapman; 6. About deflecting asteroids and comets K. A. Holsapple; 7. Scientific requirements for understanding the near-Earth asteroid population A. W. Harris; 8. Physical properties of comets and asteroids inferred from fireball observations M. D. Martino and A. Cellino; 9. Mitigation technologies and their requirements C. Gritzner and R. Kahle; 10. Peering inside near-Earth objects with radio tomography W. Kofman and A. Safaeinili; 11. Seismological imvestigation of asteroid and comet interiors J. D. Walker and W. F. Huebner; 12. Lander and penetrator science for near-Earth object mitigation studies A. J. Ball, P. Lognonne, K. Seiferlin, M. Patzold and T. Spohn; 13. Optimal interpretation and deflection of Earth-approaching asteroids using low-thrust electric propulsion B. A. Conway; 14. Close proximity operations at small bodies: orbiting, hovering, and hopping D. J. Scheeres; 15. Mission operations in low gravity regolith and dust D. Sears, M. Franzen, S. Moore, S. Nichols, M. Kareev and P. Benoit; 16. Impacts and the public: communicating the nature of the impact hazard D. Morrison, C. R. Chapman, D. Steel and R. P. Binzel; 17. Towards a program to remove the threat of hazardous NEOs M. J. S. Belton.

  14. Mitigation of Hazardous Comets and Asteroids

    NASA Astrophysics Data System (ADS)

    Belton, Michael J. S.; Morgan, Thomas H.; Samarasinha, Nalin H.; Yeomans, Donald K.

    2011-03-01

    Preface; 1. Recent progress in interpreting the nature of the near-Earth object population W. Bottke, A. Morbidelli and R. Jedicke; 2. Earth impactors: orbital characteristics and warning times S. R. Chesley and T. B. Spahr; 3. The role of radar in predicting and preventing asteroid and comet collisions with Earth S. J. Ostro and J. D. Giorgini; 4. Interior structures for asteroids and cometary nuclei E. Asphaug; 5. What we know and don't know about surfaces of potentially hazardous small bodies C. R. Chapman; 6. About deflecting asteroids and comets K. A. Holsapple; 7. Scientific requirements for understanding the near-Earth asteroid population A. W. Harris; 8. Physical properties of comets and asteroids inferred from fireball observations M. D. Martino and A. Cellino; 9. Mitigation technologies and their requirements C. Gritzner and R. Kahle; 10. Peering inside near-Earth objects with radio tomography W. Kofman and A. Safaeinili; 11. Seismological imvestigation of asteroid and comet interiors J. D. Walker and W. F. Huebner; 12. Lander and penetrator science for near-Earth object mitigation studies A. J. Ball, P. Lognonne, K. Seiferlin, M. Patzold and T. Spohn; 13. Optimal interpretation and deflection of Earth-approaching asteroids using low-thrust electric propulsion B. A. Conway; 14. Close proximity operations at small bodies: orbiting, hovering, and hopping D. J. Scheeres; 15. Mission operations in low gravity regolith and dust D. Sears, M. Franzen, S. Moore, S. Nichols, M. Kareev and P. Benoit; 16. Impacts and the public: communicating the nature of the impact hazard D. Morrison, C. R. Chapman, D. Steel and R. P. Binzel; 17. Towards a program to remove the threat of hazardous NEOs M. J. S. Belton.

  15. Volcano hazard mitigation program in Indonesia

    USGS Publications Warehouse

    Sudradjat, A.

    1990-01-01

    Volcanological investigations in Indonesia were started in the 18th century, when Valentijn in 1726 prepared a chronological report of the eruption of Banda Api volcno, Maluku. Modern and intensive volcanological studies did not begin until the catastrophic eruption of Kelut volcano, East Java, in 1919. The eruption took 5,011 lives and destroyed thousands of acres of coffee plantation. An eruption lahar generated by the crater lake water mixed with volcanic eruptions products was the cause of death for a high number of victims. An effort to mitigate the danger from volcanic eruption was first initiated in 1921 by constructing a tunnel to drain the crater lake water of Kelut volcano. At the same time a Volcanological Survey was established by the government with the responsibility of seeking every means for minimizing the hazard caused by volcanic eruption. 

  16. PROFILE: Examining Hazard Mitigation Within the Context of Public Goods.

    PubMed

    Reddy

    2000-02-01

    / This paper presents a case study of an American barrier island devastated by a hurricane to show how it is addressing the free-riding problem and protecting its public goods, thereby contributing to hazard mitigation. It examines hazard mitigation and the free-riding problem within the public goods framework. Free-riding is a term used in the public choice theory and common pool resource literature. It is a term used for describing the actions of rational individuals who freely exploit a collective or public good at the expense of others. Free-riding is a major problem faced by public goods. The problem very frequently occurs in the context of hazard mitigation and coastal resource management. Very little is known about the factors that contribute to the promotion of hazard mitigation. This paper identifies some of the important factors that help local institutions provide and sustain hazard mitigation measures. Theoretical and practical implications for hazards research and disaster management policy are presented. PMID:10594187

  17. WHC natural phenomena hazards mitigation implementation plan

    SciTech Connect

    Conrads, T.J.

    1996-09-11

    Natural phenomena hazards (NPH) are unexpected acts of nature which pose a threat or danger to workers, the public or to the environment. Earthquakes, extreme winds (hurricane and tornado),snow, flooding, volcanic ashfall, and lightning strike are examples of NPH at Hanford. It is the policy of U.S. Department of Energy (DOE) to design, construct and operate DOE facilitiesso that workers, the public and the environment are protected from NPH and other hazards. During 1993 DOE, Richland Operations Office (RL) transmitted DOE Order 5480.28, ``Natural Phenomena Hazards Mitigation,`` to Westinghouse Hanford COmpany (WHC) for compliance. The Order includes rigorous new NPH criteria for the design of new DOE facilities as well as for the evaluation and upgrade of existing DOE facilities. In 1995 DOE issued Order 420.1, ``Facility Safety`` which contains the same NPH requirements and invokes the same applicable standards as Order 5480.28. It will supersede Order 5480.28 when an in-force date for Order 420.1 is established through contract revision. Activities will be planned and accomplished in four phases: Mobilization; Prioritization; Evaluation; and Upgrade. The basis for the graded approach is the designation of facilities/structures into one of five performance categories based upon safety function, mission and cost. This Implementation Plan develops the program for the Prioritization Phase, as well as an overall strategy for the implemention of DOE Order 5480.2B.

  18. California Earthquakes: Science, Risks, and the Politics of Hazard Mitigation

    NASA Astrophysics Data System (ADS)

    Shedlock, Kaye M.

    "Politics" should be the lead word in the sub-title of this engrossing study of the emergence and growth of the California and federal earthquake hazard reduction infrastructures. Beginning primarily with the 1906 San Francisco earthquake, scientists, engineers, and other professionals cooperated and clashed with state and federal officials, the business community, " boosters," and the general public to create programs, agencies, and commissions to support earthquake research and hazards mitigation. Moreover, they created a "regulatory-state" apparatus that governs human behavior without sustained public support for its creation. The public readily accepts that earthquake research and mitigation are government responsibilities. The government employs or funds the scientists, engineers, emergency response personnel, safety officials, building inspectors, and others who are instrumental in reducing earthquake hazards. This book clearly illustrates how, and why all of this came to pass.

  19. 76 FR 61070 - Disaster Assistance; Hazard Mitigation Grant Program

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-10-03

    ... (63 FR 24143), is withdrawn as of October 3, 2011. ADDRESSES: The Notice of Proposed Rulemaking and... not all-inclusive. FEMA published a Notice of Proposed Rulemaking (NPRM) (63 FR 24143, May 1, 1998... hazard mitigation and erosion hazard mitigation in the list of eligible activities; it proposed to...

  20. Risk perception and volcanic hazard mitigation: Individual and social perspectives

    NASA Astrophysics Data System (ADS)

    Paton, Douglas; Smith, Leigh; Daly, Michele; Johnston, David

    2008-05-01

    This paper discusses how people's interpretation of their experience of volcanic hazards and public volcanic hazard education programs influences their risk perception and whether or not they adopt measures that can mitigate their risk. Drawing on four studies of volcanic risk perception and preparedness, the paper first examines why experiencing volcanic hazards need not necessarily motivate people to prepare for future volcanic crises. This work introduces how effective risk communication requires communities and civic agencies to play complementary roles in the risk management process. Next, the findings of a study evaluating the effectiveness of a public volcanic hazard education program introduce the important role that social interaction amongst community members plays in risk management. Building on the conclusions of these studies, a model that depicts preparing as a social process is developed and tested. The model predicts that it is the quality of the relationships between people, communities and civic agencies that determines whether people adopt measures that can reduce their risk from volcanic hazard consequences. The implications of the model for conceptualizing and delivering volcanic hazard public education programs in ways that accommodate these relationships is discussed.

  1. Debris flow hazards mitigation--Mechanics, prediction, and assessment

    USGS Publications Warehouse

    2007-01-01

    These proceedings contain papers presented at the Fourth International Conference on Debris-Flow Hazards Mitigation: Mechanics, Prediction, and Assessment held in Chengdu, China, September 10-13, 2007. The papers cover a wide range of topics on debris-flow science and engineering, including the factors triggering debris flows, geomorphic effects, mechanics of debris flows (e.g., rheology, fluvial mechanisms, erosion and deposition processes), numerical modeling, various debris-flow experiments, landslide-induced debris flows, assessment of debris-flow hazards and risk, field observations and measurements, monitoring and alert systems, structural and non-structural countermeasures against debris-flow hazards and case studies. The papers reflect the latest devel-opments and advances in debris-flow research. Several studies discuss the development and appli-cation of Geographic Information System (GIS) and Remote Sensing (RS) technologies in debris-flow hazard/risk assessment. Timely topics presented in a few papers also include the development of new or innovative techniques for debris-flow monitoring and alert systems, especially an infra-sound acoustic sensor for detecting debris flows. Many case studies illustrate a wide variety of debris-flow hazards and related phenomena as well as their hazardous effects on human activities and settlements.

  2. Potentially Hazardous Objects (PHO) Mitigation Program

    NASA Astrophysics Data System (ADS)

    Huebner, Walter

    Southwest Research Institute (SwRI) and its partner, Los Alamos National Laboratory (LANL), are prepared to develop, implement, and expand procedures to avert collisions of potentially hazardous objects (PHOs) with Earth as recommended by NASA in its White Paper "Near- Earth Object Survey and Deflection Analysis of Alternatives" requested by the US Congress and submitted to it in March 2007. In addition to developing the general mitigation program as outlined in the NASA White Paper, the program will be expanded to include aggressive mitigation procedures for small (e.g., Tunguska-sized) PHOs and other short warning-time PHOs such as some long-period comet nuclei. As a first step the program will concentrate on the most likely and critical cases, namely small objects and long-period comet nuclei with short warning-times, but without losing sight of objects with longer warning-times. Objects smaller than a few hundred meters are of interest because they are about 1000 times more abundant than kilometer-sized objects and are fainter and more difficult to detect, which may lead to short warning times and hence short reaction times. Yet, even these small PHOs can have devastating effects as the 30 June 1908, Tungaska event has shown. In addition, long-period comets, although relatively rare but large (sometimes tens of kilometers in size), cannot be predicted because of their long orbital periods. Comet C/1983 H1 (IRAS-Araki-Alcock), for example, has an orbital period of 963.22 years, was discovered 27 April 1983, and passed Earth only two weeks later, on 11 May 1983, at a distance of 0.0312 AU. Aggressive methods and continuous alertness will be needed to defend against objects with such short warning times. While intact deflection of a PHO remains a key objective, destruction of a PHO and dispersion of the pieces must also be considered. The effectiveness of several alternative methods including nuclear demolition munitions, conventional explosives, and hyper-velocity impacts will be investigated and compared. This comparison is important for technical as well as political reasons, both domestic and international. The long-range plan includes evaluation of technical readiness including launch capabilities, tests for effectiveness using materials simulating PHOs, and building and testing several modular systems appropriate for alternative applications depending on the type of PHO.

  3. Influence of behavioral biases on the assessment of multi-hazard risks and the implementation of multi-hazard risks mitigation measures: case study of multi-hazard cyclone shelters in Tamil Nadu, India

    NASA Astrophysics Data System (ADS)

    Komendantova, Nadejda; Patt, Anthony

    2013-04-01

    In December 2004, a multiple hazards event devastated the Tamil Nadu province of India. The Sumatra -Andaman earthquake with a magnitude of Mw=9.1-9.3 caused the Indian Ocean tsunami with wave heights up to 30 m, and flooding that reached up to two kilometers inland in some locations. More than 7,790 persons were killed in the province of Tamil Nadu, with 206 in its capital Chennai. The time lag between the earthquake and the tsunami's arrival in India was over an hour, therefore, if a suitable early warning system existed, a proper means of communicating the warning and shelters existing for people would exist, than while this would not have prevented the destruction of infrastructure, several thousands of human lives would have been saved. India has over forty years of experience in the construction of cyclone shelters. With additional efforts and investment, these shelters could be adapted to other types of hazards such as tsunamis and flooding, as well as the construction of new multi-hazard cyclone shelters (MPCS). It would therefore be possible to mitigate one hazard such as cyclones by the construction of a network of shelters while at the same time adapting these shelters to also deal with, for example, tsunamis, with some additional investment. In this historical case, the failure to consider multiple hazards caused significant human losses. The current paper investigates the patterns of the national decision-making process with regards to multiple hazards mitigation measures and how the presence of behavioral and cognitive biases influenced the perceptions of the probabilities of multiple hazards and the choices made for their mitigation by the national decision-makers. Our methodology was based on the analysis of existing reports from national and international organizations as well as available scientific literature on behavioral economics and natural hazards. The results identified several biases in the national decision-making process when the construction of cyclone shelters was being undertaken. The availability heuristics caused a perception of low probability of tsunami following an earthquake, as the last large similar event happened over a hundred years ago. Another led to a situation when decisions were taken on the basis of experience and not statistical evidence, namely, experience showed that the so-called "Ring of Fire" generates underground earthquakes and tsunamis in the Pacific Ocean. This knowledge made decision-makers to neglect the numerical estimations about probability of underground earthquake in the Indian Ocean even though seismologists were warning about probability of a large underground earthquake in the Indian Ocean. The bounded rationality bias led to misperception of signals from the early warning center in the Pacific Ocean. The resulting limited concern resulted in risk mitigation measures that considered cyclone risks, but much less about tsunami. Under loss aversion considerations, the decision-makers perceived the losses connected with the necessary additional investment as being greater than benefits from mitigating a less probable hazard.

  4. Destructive Interactions Between Mitigation Strategies and the Causes of Unexpected Failures in Natural Hazard Mitigation Systems

    NASA Astrophysics Data System (ADS)

    Day, S. J.; Fearnley, C. J.

    2013-12-01

    Large investments in the mitigation of natural hazards, using a variety of technology-based mitigation strategies, have proven to be surprisingly ineffective in some recent natural disasters. These failures reveal a need for a systematic classification of mitigation strategies; an understanding of the scientific uncertainties that affect the effectiveness of such strategies; and an understanding of how the different types of strategy within an overall mitigation system interact destructively to reduce the effectiveness of the overall mitigation system. We classify mitigation strategies into permanent, responsive and anticipatory. Permanent mitigation strategies such as flood and tsunami defenses or land use restrictions, are both costly and 'brittle': when they malfunction they can increase mortality. Such strategies critically depend on the accuracy of the estimates of expected hazard intensity in the hazard assessments that underpin their design. Responsive mitigation strategies such as tsunami and lahar warning systems rely on capacities to detect and quantify the hazard source events and to transmit warnings fast enough to enable at risk populations to decide and act effectively. Self-warning and voluntary evacuation is also usually a responsive mitigation strategy. Uncertainty in the nature and magnitude of the detected hazard source event is often the key scientific obstacle to responsive mitigation; public understanding of both the hazard and the warnings, to enable decision making, can also be a critical obstacle. Anticipatory mitigation strategies use interpretation of precursors to hazard source events and are used widely in mitigation of volcanic hazards. Their critical limitations are due to uncertainties in time, space and magnitude relationships between precursors and hazard events. Examples of destructive interaction between different mitigation strategies are provided by the Tohoku 2011 earthquake and tsunami; recent earthquakes that have impacted population centers with poor enforcement of building codes, unrealistic expectations of warning systems or failures to understand local seismic damage mechanisms; and the interaction of land use restriction strategies and responsive warning strategies around lahar-prone volcanoes. A more complete understanding of the interactions between these different types of mitigation strategy, especially the consequences for the expectations and behaviors of the populations at risk, requires models of decision-making under high levels of both uncertainty and danger. The Observation-Orientation-Decision-Action (OODA) loop model (Boyd, 1987) may be a particularly useful model. It emphasizes the importance of 'orientation' (the interpretation of observations and assessment of their significance for the observer and decision-maker), the feedback between decisions and subsequent observations and orientations, and the importance of developing mitigation strategies that are flexible and so able to respond to the occurrence of the unexpected. REFERENCE: Boyd, J.R. A Discourse on Winning and Losing [http://dnipogo.org/john-r-boyd/

  5. Volcanic hazards and their mitigation: Progress and problems

    NASA Astrophysics Data System (ADS)

    Tilling, Robert I.

    1989-05-01

    At the beginning of the twentieth century, volcanology began to emerge as a modern science as a result of increased interest in eruptive phenomena following some of the worst volcanic disasters in recorded history: Krakatau (Indonesia) in 1883 and Mont Pelée (Martinique), Soufrière (St. Vincent), and Santa María (Guatemala) in 1902. Volcanology is again experiencing a period of heightened public awareness and scientific growth in the 1980s, the worst period since 1902 in terms of volcanic disasters and crises. A review of hazards mitigation approaches and techniques indicates that significant advances have been made in hazards assessment, volcano monitoring, and eruption forecasting. For example, the remarkable accuracy of the predictions of dome-building events at Mount St. Helens since June 1980 is unprecedented. Yet a predictive capability for more voluminous and explosive eruptions still has not been achieved. Studies of magma-induced seismicity and ground deformation continue to provide the most systematic and reliable data for early detection of precursors to eruptions and shallow intrusions. In addition, some other geophysical monitoring techniques and geochemical methods have been refined and are being more widely applied and tested. Comparison of the four major volcanic disasters of the 1980s (Mount St. Helens, U.S.A. (1980), El Chichón, Mexico (1982); Galunggung, Indonesia (1982); and Nevado del Ruíz, Colombia (1985) illustrates the importance of predisaster geoscience studies, volcanic hazards assessments, volcano monitoring, contingency planning, and effective communications between scientists and authorities. The death toll (>22,000) from the Ruíz catastrophe probably could have been greatly reduced; the reasons for the tragically ineffective implementation of evacuation measures are still unclear and puzzling in view of the fact that sufficient warnings were given. The most pressing problem in the mitigation of volcanic and associated hazards on a global scale is that most of the world's dangerous volcanoes are in densely populated countries that lack the economic and scientific resources or the political will to adequately study and monitor them. This problem afflicts both developed and developing countries, but it is especially acute for the latter. The greatest advances in volcanic hazards mitigation in the near future are most likely to be achieved by wider application of existing technology to poorly understood and studied volcanoes, rather than by refinements or new discoveries in technology alone.

  6. Space options for tropical cyclone hazard mitigation

    NASA Astrophysics Data System (ADS)

    Dicaire, Isabelle; Nakamura, Ryoko; Arikawa, Yoshihisa; Okada, Kazuyuki; Itahashi, Takamasa; Summerer, Leopold

    2015-02-01

    This paper investigates potential space options for mitigating the impact of tropical cyclones on cities and civilians. Ground-based techniques combined with space-based remote sensing instrumentation are presented together with space-borne concepts employing space solar power technology. Two space-borne mitigation options are considered: atmospheric warming based on microwave irradiation and laser-induced cloud seeding based on laser power transfer. Finally technology roadmaps dedicated to the space-borne options are presented, including a detailed discussion on the technological viability and technology readiness level of our proposed systems. Based on these assessments, the space-borne cyclone mitigation options presented in this paper may be established in a quarter of a century.

  7. Rockslide susceptibility and hazard assessment for mitigation works design along vertical rocky cliffs: workflow proposal based on a real case-study conducted in Sacco (Campania), Italy

    NASA Astrophysics Data System (ADS)

    Pignalosa, Antonio; Di Crescenzo, Giuseppe; Marino, Ermanno; Terracciano, Rosario; Santo, Antonio

    2015-04-01

    The work here presented concerns a case study in which a complete multidisciplinary workflow has been applied for an extensive assessment of the rockslide susceptibility and hazard in a common scenario such as a vertical and fractured rocky cliffs. The studied area is located in a high-relief zone in Southern Italy (Sacco, Salerno, Campania), characterized by wide vertical rocky cliffs formed by tectonized thick successions of shallow-water limestones. The study concerned the following phases: a) topographic surveying integrating of 3d laser scanning, photogrammetry and GNSS; b) gelogical surveying, characterization of single instabilities and geomecanichal surveying, conducted by geologists rock climbers; c) processing of 3d data and reconstruction of high resolution geometrical models; d) structural and geomechanical analyses; e) data filing in a GIS-based spatial database; f) geo-statistical and spatial analyses and mapping of the whole set of data; g) 3D rockfall analysis; The main goals of the study have been a) to set-up an investigation method to achieve a complete and thorough characterization of the slope stability conditions and b) to provide a detailed base for an accurate definition of the reinforcement and mitigation systems. For this purposes the most up-to-date methods of field surveying, remote sensing, 3d modelling and geospatial data analysis have been integrated in a systematic workflow, accounting of the economic sustainability of the whole project. A novel integrated approach have been applied both fusing deterministic and statistical surveying methods. This approach enabled to deal with the wide extension of the studied area (near to 200.000 m2), without compromising an high accuracy of the results. The deterministic phase, based on a field characterization of single instabilities and their further analyses on 3d models, has been applied for delineating the peculiarity of each single feature. The statistical approach, based on geostructural field mapping and on punctual geomechanical data from scan-line surveying, allowed the rock mass partitioning in homogeneous geomechanical sectors and data interpolation through bounded geostatistical analyses on 3d models. All data, resulting from both approaches, have been referenced and filed in a single spatial database and considered in global geo-statistical analyses for deriving a fully modelled and comprehensive evaluation of the rockslide susceptibility. The described workflow yielded the following innovative results: a) a detailed census of single potential instabilities, through a spatial database recording the geometrical, geological and mechanical features, along with the expected failure modes; b) an high resolution characterization of the whole slope rockslide susceptibility, based on the partitioning of the area according to the stability and mechanical conditions which can be directly related to specific hazard mitigation systems; c) the exact extension of the area exposed to the rockslide hazard, along with the dynamic parameters of expected phenomena; d) an intervention design for hazard mitigation.

  8. Mitigation options for accidental releases of hazardous gases

    SciTech Connect

    Fthenakis, V.M.

    1995-05-01

    The objective of this paper is to review and compare technologies available for mitigation of unconfined releases of toxic and flammable gases. These technologies include: secondary confinement, deinventory, vapor barriers, foam spraying, and water sprays/monitors. Guidelines for the design and/or operation of effective post-release mitigation systems and case studies involving actual industrial mitigation systems are also presented.

  9. New Approaches to Tsunami Hazard Mitigation Demonstrated in Oregon

    NASA Astrophysics Data System (ADS)

    Priest, G. R.; Rizzo, A.; Madin, I.; Lyles Smith, R.; Stimely, L.

    2012-12-01

    Oregon Department of Geology and Mineral Industries and Oregon Emergency Management collaborated over the last four years to increase tsunami preparedness for residents and visitors to the Oregon coast. Utilizing support from the National Tsunami Hazards Mitigation Program (NTHMP), new approaches to outreach and tsunami hazard assessment were developed and then applied. Hazard assessment was approached by first doing two pilot studies aimed at calibrating theoretical models to direct observations of tsunami inundation gleaned from the historical and prehistoric (paleoseismic/paleotsunami) data. The results of these studies were then submitted to peer-reviewed journals and translated into 1:10,000-12,000-scale inundation maps. The inundation maps utilize a powerful new tsunami model, SELFE, developed by Joseph Zhang at the Oregon Health & Science University. SELFE uses unstructured computational grids and parallel processing technique to achieve fast accurate simulation of tsunami interactions with fine-scale coastal morphology. The inundation maps were simplified into tsunami evacuation zones accessed as map brochures and an interactive mapping portal at http://www.oregongeology.org/tsuclearinghouse/. Unique in the world are new evacuation maps that show separate evacuation zones for distant versus locally generated tsunamis. The brochure maps explain that evacuation time is four hours or more for distant tsunamis but 15-20 minutes for local tsunamis that are invariably accompanied by strong ground shaking. Since distant tsunamis occur much more frequently than local tsunamis, the two-zone maps avoid needless over evacuation (and expense) caused by one-zone maps. Inundation mapping for the entire Oregon coast will be complete by ~2014. Educational outreach was accomplished first by doing a pilot study to measure effectiveness of various approaches using before and after polling and then applying the most effective methods. In descending order, the most effective methods were: (1) door-to-door (person-to-person) education, (2) evacuation drills, (3) outreach to K-12 schools, (4) media events, and (5) workshops targeted to key audiences (lodging facilities, teachers, and local officials). Community organizers were hired to apply these five methods to clusters of small communities, measuring performance by before and after polling. Organizers were encouraged to approach the top priority, person-to-person education, by developing Community Emergency Response Teams (CERT) or CERT-like organizations in each community, thereby leaving behind a functioning volunteer-based group that will continue the outreach program and build long term resiliency. One of the most effective person-to-person educational tools was the Map Your Neighborhood program that brings people together so they can sketch the basic layout of their neighborhoods to depict key earthquake and tsunami hazards and mitigation solutions. The various person-to-person volunteer efforts and supporting outreach activities are knitting communities together and creating a permanent culture of tsunami and earthquake preparedness. All major Oregon coastal population centers will have been covered by this intensive outreach program by ~2014.

  10. Collaborative Monitoring and Hazard Mitigation at Fuego Volcano, Guatemala

    NASA Astrophysics Data System (ADS)

    Lyons, J. J.; Bluth, G. J.; Rose, W. I.; Patrick, M.; Johnson, J. B.; Stix, J.

    2007-05-01

    A portable, digital sensor network has been installed to closely monitor changing activity at Fuego volcano, which takes advantage of an international collaborative effort among Guatemala, U.S. and Canadian universities, and the Peace Corps. The goal of this effort is to improve the understanding shallow internal processes, and consequently to more effectively mitigate volcanic hazards. Fuego volcano has had more than 60 historical eruptions and nearly-continuous activity make it an ideal laboratory to study volcanic processes. Close monitoring is needed to identify base-line activity, and rapidly identify and disseminate changes in the activity which might threaten nearby communities. The sensor network is comprised of a miniature DOAS ultraviolet spectrometer fitted with a system for automated plume scans, a digital video camera, and two seismo-acoustic stations and portable dataloggers. These sensors are on loan from scientists who visited Fuego during short field seasons and donated use of their sensors to a resident Peace Corps Masters International student from Michigan Technological University for extended data collection. The sensor network is based around the local volcano observatory maintained by Instituto National de Sismologia, Vulcanologia, Metrologia e Hidrologia (INSIVUMEH). INSIVUMEH provides local support and historical knowledge of Fuego activity as well as a secure location for storage of scientific equipment, data processing, and charging of the batteries that power the sensors. The complete sensor network came online in mid-February 2007 and here we present preliminary results from concurrent gas, seismic, and acoustic monitoring of activity from Fuego volcano.

  11. A mitigation program for potentially hazardous long-period comets

    NASA Astrophysics Data System (ADS)

    Boice, Daniel; Huebner, Walter

    A program is being developed to avert collisions of potentially hazardous objects (PHOs) with Earth by Southwest Research Institute and Los Alamos National Laboratory. In addition to developing the general mitigation strategies, the program will be expanded to include aggressive mitigation procedures for small (e.g., Tunguska-sized) potentially hazardous objects (PHO) and other short warning-time PHOs, such as some long-period comet nuclei. The program will initially concentrate on the most likely and critical cases, namely small objects and long-period comet nuclei with short warning-times. In this paper we discuss the threat posed by long-period comets. Although relatively rare but large (sometimes tens of kilometers in size) and fast moving, their detection cannot be predicted because of their long orbital periods, for example, comet C/1983 H1 (IRAS-Araki-Alcock) has an orbital period of 963.22 years. It was discovered on 27 April 1983, and passed Earth at a distance of 0.0312 AU on 11 May 1983, only two weeks later. Aggressive methods and continuous alertness will be needed to defend against objects with such short warning times. We summarize results on anticipated warning times of long-period comets given advances in modern telescopic facilities searching for such objects to present nominal and worst-case scenarios for these potentially hazardous objects.

  12. GO/NO-GO - When is medical hazard mitigation acceptable for launch?

    NASA Technical Reports Server (NTRS)

    Hamilton, Douglas R.; Polk, James D.

    2005-01-01

    Medical support of spaceflight missions is composed of complex tasks and decisions that dedicated to maintaining the health and performance of the crew and the completion of mission objectives. Spacecraft represent one of the most complex vehicles built by humans, and are built to very rigorous design specifications. In the course of a Flight Readiness Review (FRR) or a mission itself, the flight surgeon must be able to understand the impact of hazards and risks that may not be completely mitigated by design alone. Some hazards are not mitigated because they are never actually identified. When a hazard is identified, it must be reduced or waivered. Hazards that cannot be designed out of the vehicle or mission, are usually mitigated through other means to bring the residual risk to an acceptable level. This is possible in most engineered systems because failure modes are usually predictable and analysis can include taking these systems to failure. Medical support of space missions is complicated by the inability of flight surgeons to provide "exact" hazard and risk numbers to the NASA engineering community. Taking humans to failure is not an option. Furthermore, medical dogma is mostly comprised of "medical prevention" strategies that mitigate risk by examining the behaviour of a cohort of humans similar to astronauts. Unfortunately, this approach does not lend itself well for predicting the effect of a hazard in the unique environment of space. This presentation will discuss how Medical Operations uses an evidence-based approach to decide if hazard mitigation strategies are adequate to reduce mission risk to acceptable levels. Case studies to be discussed will include: 1. Risk of electrocution risk during EVA 2. Risk of cardiac event risk during long and short duration missions 3. Degraded cabin environmental monitoring on the ISS. Learning Objectives 1.) The audience will understand the challenges of mitigating medical risk caused by nominal and off-nominal mission events. 2.) The audience will understand the process by which medical hazards are identified and mitigated before launch. 3.) The audience will understand the roles and responsibilities of all the other flight control positions in participating in the process of reducing hazards and reducing medical risk to an acceptable level.

  13. Composite Materials for Hazard Mitigation of Reactive Metal Hydrides.

    SciTech Connect

    Pratt, Joseph William; Cordaro, Joseph Gabriel; Sartor, George B.; Dedrick, Daniel E.; Reeder, Craig L.

    2012-02-01

    In an attempt to mitigate the hazards associated with storing large quantities of reactive metal hydrides, polymer composite materials were synthesized and tested under simulated usage and accident conditions. The composites were made by polymerizing vinyl monomers using free-radical polymerization chemistry, in the presence of the metal hydride. Composites with vinyl-containing siloxane oligomers were also polymerized with and without added styrene and divinyl benzene. Hydrogen capacity measurements revealed that addition of the polymer to the metal hydride reduced the inherent hydrogen storage capacity of the material. The composites were found to be initially effective at reducing the amount of heat released during oxidation. However, upon cycling the composites, the mitigating behavior was lost. While the polymer composites we investigated have mitigating potential and are physically robust, they undergo a chemical change upon cycling that makes them subsequently ineffective at mitigating heat release upon oxidation of the metal hydride. Acknowledgements The authors would like to thank the following people who participated in this project: Ned Stetson (U.S. Department of Energy) for sponsorship and support of the project. Ken Stewart (Sandia) for building the flow-through calorimeter and cycling test stations. Isidro Ruvalcaba, Jr. (Sandia) for qualitative experiments on the interaction of sodium alanate with water. Terry Johnson (Sandia) for sharing his expertise and knowledge of metal hydrides, and sodium alanate in particular. Marcina Moreno (Sandia) for programmatic assistance. John Khalil (United Technologies Research Corp) for insight into the hazards of reactive metal hydrides and real-world accident scenario experiments. Summary In an attempt to mitigate and/or manage hazards associated with storing bulk quantities of reactive metal hydrides, polymer composite materials (a mixture of a mitigating polymer and a metal hydride) were synthesized and tested under simulated usage and accident conditions. Mitigating the hazards associated with reactive metal hydrides during an accident while finding a way to keep the original capability of the active material intact during normal use has been the focus of this work. These composites were made by polymerizing vinyl monomers using free-radical polymerization chemistry, in the presence of the metal hydride, in this case a prepared sodium alanate (chosen as a representative reactive metal hydride). It was found that the polymerization of styrene and divinyl benzene could be initiated using AIBN in toluene at 70 degC. The resulting composite materials can be either hard or brittle solids depending on the cross-linking density. Thermal decomposition of these styrene-based composite materials is lower than neat polystyrene indicating that the chemical nature of the polymer is affected by the formation of the composite. The char-forming nature of cross-linked polystyrene is low and therefore, not an ideal polymer for hazard mitigation. To obtain composite materials containing a polymer with higher char-forming potential, siloxane-based monomers were investigated. Four vinyl-containing siloxane oligomers were polymerized with and without added styrene and divinyl benzene. Like the styrene materials, these composite materials exhibited thermal decomposition behavior significantly different than the neat polymers. Specifically, the thermal decomposition temperature was shifted approximately 100 degC lower than the neat polymer signifying a major chemical change to the polymer network. Thermal analysis of the cycled samples was performed on the siloxane-based composite materials. It was found that after 30 cycles the siloxane-containing polymer composite material has similar TGA/DSC-MS traces as the virgin composite material indicating that the polymer is physically intact upon cycling. Hydrogen capacity measurements revealed that addition of the polymer to the metal hydride in the form of a composite material reduced the inherent hydrogen storage capacity of the material. This reduction in capacity was observed to be independent of the amount of charge/discharge cycles except for the composites containing siloxane, which showed less of an impact on hydrogen storage capacity as it was cycled further. While the reason for this is not clear, it may be due to a chemically stabilizing effect of the siloxane on the metal hydride. Flow-through calorimetry was used to characterize the mitigating effectiveness of the different composites relative to the neat (no polymer) material. The composites were found to be initially effective at reducing the amount of heat released during oxidation, and the best performing material was the siloxane-containing composite which reduced the heat release to less than 50% of the value of the neat material. However, upon cycling the composites, all mitigating behavior was lost. The combined results of the flow-through calorimetry, hydrogen capacity, and thermogravimetric analysis tests lead to the proposed conclusion that while the polymer composites have mitigating potential and are physically robust under cycling, they undergo a chemical change upon cycling that makes them ineffective at mitigating heat release upon oxidation of the metal hydride.

  14. Modeling and mitigating natural hazards: Stationarity is immortal!

    NASA Astrophysics Data System (ADS)

    Montanari, Alberto; Koutsoyiannis, Demetris

    2014-12-01

    Environmental change is a reason of relevant concern as it is occurring at an unprecedented pace and might increase natural hazards. Moreover, it is deemed to imply a reduced representativity of past experience and data on extreme hydroclimatic events. The latter concern has been epitomized by the statement that "stationarity is dead." Setting up policies for mitigating natural hazards, including those triggered by floods and droughts, is an urgent priority in many countries, which implies practical activities of management, engineering design, and construction. These latter necessarily need to be properly informed, and therefore, the research question on the value of past data is extremely important. We herein argue that there are mechanisms in hydrological systems that are time invariant, which may need to be interpreted through data inference. In particular, hydrological predictions are based on assumptions which should include stationarity. In fact, any hydrological model, including deterministic and nonstationary approaches, is affected by uncertainty and therefore should include a random component that is stationary. Given that an unnecessary resort to nonstationarity may imply a reduction of predictive capabilities, a pragmatic approach, based on the exploitation of past experience and data is a necessary prerequisite for setting up mitigation policies for environmental risk.

  15. 77 FR 24505 - Hazard Mitigation Assistance for Wind Retrofit Projects for Existing Residential Buildings

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-04-24

    ... SECURITY Federal Emergency Management Agency Hazard Mitigation Assistance for Wind Retrofit Projects for... comments on Hazard Mitigation Assistance for Wind Retrofit Projects for Existing Residential Buildings... property from hazards and their effects. One such activity is the implementation of wind retrofit...

  16. Deterministic and Nondeterministic Behavior of Earthquakes and Hazard Mitigation Strategy

    NASA Astrophysics Data System (ADS)

    Kanamori, H.

    2014-12-01

    Earthquakes exhibit both deterministic and nondeterministic behavior. Deterministic behavior is controlled by length and time scales such as the dimension of seismogenic zones and plate-motion speed. Nondeterministic behavior is controlled by the interaction of many elements, such as asperities, in the system. Some subduction zones have strong deterministic elements which allow forecasts of future seismicity. For example, the forecasts of the 2010 Mw=8.8 Maule, Chile, earthquake and the 2012 Mw=7.6, Costa Rica, earthquake are good examples in which useful forecasts were made within a solid scientific framework using GPS. However, even in these cases, because of the nondeterministic elements uncertainties are difficult to quantify. In some subduction zones, nondeterministic behavior dominates because of complex plate boundary structures and defies useful forecasts. The 2011 Mw=9.0 Tohoku-Oki earthquake may be an example in which the physical framework was reasonably well understood, but complex interactions of asperities and insufficient knowledge about the subduction-zone structures led to the unexpected tragic consequence. Despite these difficulties, broadband seismology, GPS, and rapid data processing-telemetry technology can contribute to effective hazard mitigation through scenario earthquake approach and real-time warning. A scale-independent relation between M0 (seismic moment) and the source duration, t, can be used for the design of average scenario earthquakes. However, outliers caused by the variation of stress drop, radiation efficiency, and aspect ratio of the rupture plane are often the most hazardous and need to be included in scenario earthquakes. The recent development in real-time technology would help seismologists to cope with, and prepare for, devastating tsunamis and earthquakes. Combining a better understanding of earthquake diversity and modern technology is the key to effective and comprehensive hazard mitigation practices.

  17. Meteorological Hazard Assessment and Risk Mitigation in Rwanda.

    NASA Astrophysics Data System (ADS)

    Nduwayezu, Emmanuel; Jaboyedoff, Michel; Bugnon, Pierre-Charles; Nsengiyumva, Jean-Baptiste; Horton, Pascal; Derron, Marc-Henri

    2015-04-01

    Between 10 and 13 April 2012, heavy rains hit sectors adjacent to the Vulcanoes National Park (Musanze District in the Northern Province and Nyabihu and Rubavu Districts in the Western Province of RWANDA), causing floods that affected about 11,000 persons. Flooding caused deaths and injuries among the affected population, and extensive damage to houses and properties. 348 houses were destroyed and 446 were partially damaged or have been underwater for several days. Families were forced to leave their flooded homes and seek temporal accommodation with their neighbors, often in overcrowded places. Along the West-northern border of RWANDA, Virunga mountain range consists of 6 major volcanoes. Mount Karisimbi is the highest volcano at 4507m. The oldest mountain is mount Sabyinyo which rises 3634m. The hydraulic network in Musanze District is formed by temporary torrents and permanent watercourses. Torrents surge during strong storms, and are provoked by water coming downhill from the volcanoes, some 20 km away. This area is periodically affected by flooding and landslides because of heavy rain (Rwanda has 2 rainy seasons from February to April and from September to November each year in general and 2 dry seasons) striking the Volcano National Park. Rain water creates big water channels (in already known torrents or new ones) that impact communities, agricultural soils and crop yields. This project aims at identifying hazardous and risky areas by producing susceptibility maps for floods, debris flow and landslides over this sector. Susceptibility maps are being drawn using field observations, during and after the 2012 events, and an empirical model of propagation for regional susceptibility assessments of debris flows (Flow-R). Input data are 10m and 30m resolution DEMs, satellite images, hydrographic network, and some information on geological substratum and soil occupation. Combining susceptibility maps with infrastructures, houses and population density maps will be used in identifying the most risky areas. Finally, based on practical experiences in this kind of field and produced documents some recommendations for low-cost mitigation measures will be proposed. Reference: MIDIMAR, Impacts of floods and landslides on socio-economic development profile. Case study: Musanze District. Kigali, June 2012.

  18. Tsunami hazard mitigation in tourism in the tropical and subtropical coastal areas: a case study in the Ryukyu Islands, southwest of Japan

    NASA Astrophysics Data System (ADS)

    Matsumoto, T.

    2006-12-01

    Life and economy (including tourism) in tropical and subtropical coastal areas, such as Okinawa Prefecture (Ryukyu) are highly relying on the sea. The sea has both "gentle" side to give people healing and "dangerous" side to kill people. If we are going to utilise the sea for marine tourism such as constructing resort facilities on the oceanfront, we should know all of the sea, including the both sides of the sea: especially the nature of tsunamis. And also we islanders should issue accurate information about the sea towards outsiders, especially tourists visiting the island. We have already learned a lesson about this issue from the Sumatra tsunami in 2004. However, measures against the tsunami disaster by marine tourism industry are still inadequate in these areas. The goal of tsunami hazard mitigation for those engaged in tourism industry in tropical and subtropical coastal areas should be as follows. (1) Preparedness against tsunamis: "Be aware of the characteristics of tsunamis." "Prepare tsunamis when you feel an earthquake." "Prepare tsunamis when an earthquake takes place somewhere in the world." (2) Maintenance of an exact tsunami hazard map under quantitative analyses of the characteristics of tsunamis: "Flooding areas by tsunami attacks are dependent not only on altitude but also on amplification and inundation due to the seafloor topography near the coast and the onland topographic relief." "Tsunami damage happens repeatedly." (3) Maintenance of a tsunami disaster prevention manual and training after the manual: "Who should do what in case of tsunamis?" "How should the resort hotel employees lead the guests to the safe place?" Such a policy for disaster prevention is discussed in the class of the general education of "Ocean Sciences" in University of the Ryukyus (UR) and summer school for high school students. The students (most of them are from Okinawa Prefecture) consider, discuss and make reports about what to do in case of tsunamis as an islander. Especially, students of Department of Tourism Sciences (DTS) are keen on the discussion and make excellent reports/proposals. Here, the author would also like to introduce some of them in the presentation.

  19. Mitigation of unconfined releases of hazardous gases via liquid spraying

    SciTech Connect

    Fthenakis, V.M.

    1997-02-01

    The capability of water sprays in mitigating clouds of hydrofluoric acid (HF) has been demonstrated in the large-scale field experiments of Goldfish and Hawk, which took place at the DOE Nevada Test Site. The effectiveness of water sprays and fire water monitors to remove HF from vapor plume, has also been studied theoretically using the model HGSPRAY5 with the near-field and far-field dispersion described by the HGSYSTEM models. This paper presents options to select and evaluate liquid spraying systems, based on the industry experience and mathematical modeling.

  20. ANALYSIS AND MITIGATION OF X-RAY HAZARD GENERATED FROM HIGH INTENSITY LASER-TARGET INTERACTIONS

    SciTech Connect

    Qiu, R.; Liu, J.C.; Prinz, A.A.; Rokni, S.H.; Woods, M.; Xia, Z.; ,

    2011-03-21

    Interaction of a high intensity laser with matter may generate an ionizing radiation hazard. Very limited studies have been made, however, on the laser-induced radiation protection issue. This work reviews available literature on the physics and characteristics of laser-induced X-ray hazards. Important aspects include the laser-to-electron energy conversion efficiency, electron angular distribution, electron energy spectrum and effective temperature, and bremsstrahlung production of X-rays in the target. The possible X-ray dose rates for several femtosecond Ti:sapphire laser systems used at SLAC, including the short pulse laser system for the Matter in Extreme Conditions Instrument (peak power 4 TW and peak intensity 2.4 x 10{sup 18} W/cm{sup 2}) were analysed. A graded approach to mitigate the laser-induced X-ray hazard with a combination of engineered and administrative controls is also proposed.

  1. The seismic project of the National Tsunami Hazard Mitigation Program

    USGS Publications Warehouse

    Oppenheimer, D.H.; Bittenbinder, A.N.; Bogaert, B.M.; Buland, R.P.; Dietz, L.D.; Hansen, R.A.; Malone, S.D.; McCreery, C.S.; Sokolowski, T.J.; Whitmore, P.M.; Weaver, C.S.

    2005-01-01

    In 1997, the Federal Emergency Management Agency (FEMA), National Oceanic and Atmospheric Administration (NOAA), U.S. Geological Survey (USGS), and the five western States of Alaska, California, Hawaii, Oregon, and Washington joined in a partnership called the National Tsunami Hazard Mitigation Program (NTHMP) to enhance the quality and quantity of seismic data provided to the NOAA tsunami warning centers in Alaska and Hawaii. The NTHMP funded a seismic project that now provides the warning centers with real-time seismic data over dedicated communication links and the Internet from regional seismic networks monitoring earthquakes in the five western states, the U.S. National Seismic Network in Colorado, and from domestic and global seismic stations operated by other agencies. The goal of the project is to reduce the time needed to issue a tsunami warning by providing the warning centers with high-dynamic range, broadband waveforms in near real time. An additional goal is to reduce the likelihood of issuing false tsunami warnings by rapidly providing to the warning centers parametric information on earthquakes that could indicate their tsunamigenic potential, such as hypocenters, magnitudes, moment tensors, and shake distribution maps. New or upgraded field instrumentation was installed over a 5-year period at 53 seismic stations in the five western states. Data from these instruments has been integrated into the seismic network utilizing Earthworm software. This network has significantly reduced the time needed to respond to teleseismic and regional earthquakes. Notably, the West Coast/Alaska Tsunami Warning Center responded to the 28 February 2001 Mw 6.8 Nisqually earthquake beneath Olympia, Washington within 2 minutes compared to an average response time of over 10 minutes for the previous 18 years. ?? Springer 2005.

  2. Next-Generation GPS Station for Hazards Mitigation (Invited)

    NASA Astrophysics Data System (ADS)

    Bock, Y.

    2013-12-01

    Our objective is to better forecast, assess, and mitigate natural hazards, including earthquakes, tsunamis, and extreme storms and flooding through development and implementation of a modular technology for the next-generation in-situ geodetic station to support the flow of information from multiple stations to scientists, mission planners, decision makers, and first responders. The same technology developed under NASA funding can be applied to enhance monitoring of large engineering structures such as bridges, hospitals and other critical infrastructure. Meaningful warnings save lives when issued within 1-2 minutes for destructive earthquakes, several tens of minutes for tsunamis, and up to several hours for extreme storms and flooding, and can be provided by on-site fusion of multiple data types and generation of higher-order data products: GPS/GNSS and accelerometer measurements to estimate point displacements, and GPS/GNSS and meteorological measurements to estimate moisture variability in the free atmosphere. By operating semi-autonomously, each station can then provide low-latency, high-fidelity and compact data products within the constraints of narrow communications bandwidth that often accompanies natural disasters. We have developed a power-efficient, low-cost, plug-in Geodetic Module for fusion of data from in situ sensors including GPS, a strong-motion accelerometer module, and a meteorological sensor package, for deployment at existing continuous GPS stations in southern California; fifteen stations have already been upgraded. The low-cost modular design is scalable to the many existing continuous GPS stations worldwide. New on-the-fly data products are estimated with 1 mm precision and accuracy, including three-dimensional seismogeodetic displacements for earthquake, tsunami and structural monitoring and precipitable water for forecasting extreme weather events such as summer monsoons and atmospheric rivers experienced in California. Unlike more traditional approaches where data are collected and analyzed from a network of stations at a central processing facility, we are embedding these capabilities in the Geodetic Module's processor for in situ analysis and data delivery through TCP/IP to avoid single points of failure during emergencies. We are infusing our technology to several local and state groups, including the San Diego County Office of Emergency Services for earthquake and tsunami early warnings, UC San Diego Health Services for hospital monitoring and early warning, Caltrans for bridge monitoring, and NOAA's Weather Forecasting Offices in San Diego and Los Angeles Counties for forecasting extreme weather events. We describe our overall system and the ongoing efforts at technology infusion.

  3. Aligning Natural Resource Conservation and Flood Hazard Mitigation in California

    PubMed Central

    Calil, Juliano; Beck, Michael W.; Gleason, Mary; Merrifield, Matthew; Klausmeyer, Kirk; Newkirk, Sarah

    2015-01-01

    Flooding is the most common and damaging of all natural disasters in the United States, and was a factor in almost all declared disasters in U.S. history. Direct flood losses in the U.S. in 2011 totaled $8.41 billion and flood damage has also been on the rise globally over the past century. The National Flood Insurance Program paid out more than $38 billion in claims since its inception in 1968, more than a third of which has gone to the one percent of policies that experienced multiple losses and are classified as “repetitive loss.” During the same period, the loss of coastal wetlands and other natural habitat has continued, and funds for conservation and restoration of these habitats are very limited. This study demonstrates that flood losses could be mitigated through action that meets both flood risk reduction and conservation objectives. We found that there are at least 11,243km2 of land in coastal California, which is both flood-prone and has natural resource conservation value, and where a property/structure buyout and habitat restoration project could meet multiple objectives. For example, our results show that in Sonoma County, the extent of land that meets these criteria is 564km2. Further, we explore flood mitigation grant programs that can be a significant source of funds to such projects. We demonstrate that government funded buyouts followed by restoration of targeted lands can support social, environmental, and economic objectives: reduction of flood exposure, restoration of natural resources, and efficient use of limited governmental funds. PMID:26200353

  4. Implementation strategies for U.S. DOE Order 5480.28 Natural Phenomena Hazards Mitigation

    SciTech Connect

    Conrads, T.J.

    1995-01-01

    This paper describes the strategies used by Westinghouse Hanford Company for implementing a new U.S. Department of Energy Order 5480.28, Natural Phenomena Hazards Mitigation. The order requires that all new and existing structures, systems, and components be designed and evaluated for the effects of natural phenomena (seismic, wind, flood, and volcano) applicable at a given site. It also requires that instrumentation be available to record the expected seismic events and that procedures be available to inspect facilities for damage following a natural phenomena event. This order requires that probabilistic hazards studies be conducted for the applicable natural phenomena to determine appropriate loads to be applied in a graded approach to structures, systems, and components important to safety. This paper discusses the processes, tasks, and methods used to implement this directive, which altered the standard design basis for new and existing structures, systems, and components at the Hanford Site. It also addresses a correlation between the performance category nomenclature of DOE Order 5480.28 and the safety classification described in DOE Order 5480.23, Nuclear Safety Analysis Reports. This correlation was deemed to be a prerequisite for the cost-effective implementation of the new DOE Order on natural phenomena hazards mitigation.

  5. Numerical and probabilistic analysis of asteroid and comet impact hazard mitigation

    SciTech Connect

    Plesko, Catherine S; Weaver, Robert P; Huebner, Walter F

    2010-09-09

    The possibility of asteroid and comet impacts on Earth has received significant recent media and scientific attention. Still, there are many outstanding questions about the correct response once a potentially hazardous object (PHO) is found. Nuclear munitions are often suggested as a deflection mechanism because they have a high internal energy per unit launch mass. However, major uncertainties remain about the use of nuclear munitions for hazard mitigation. There are large uncertainties in a PHO's physical response to a strong deflection or dispersion impulse like that delivered by nuclear munitions. Objects smaller than 100 m may be solid, and objects at all sizes may be 'rubble piles' with large porosities and little strength. Objects with these different properties would respond very differently, so the effects of object properties must be accounted for. Recent ground-based observations and missions to asteroids and comets have improved the planetary science community's understanding of these objects. Computational power and simulation capabilities have improved such that it is possible to numerically model the hazard mitigation problem from first principles. Before we know that explosive yield Y at height h or depth -h from the target surface will produce a momentum change in or dispersion of a PHO, we must quantify energy deposition into the system of particles that make up the PHO. Here we present the initial results of a parameter study in which we model the efficiency of energy deposition from a stand-off nuclear burst onto targets made of PHO constituent materials.

  6. Earthquake and Volcanic Hazard Mitigation and Capacity Building in Sub-Saharan Africa

    NASA Astrophysics Data System (ADS)

    Ayele, A.

    2012-04-01

    The East African Rift System (EARS) is a classic example of active continental rifting, and a natural laboratory setting to study initiation and early stage evolution of continental rifts. The EARS is at different stages of development that varies from relatively matured rift (16 mm/yr) in the Afar to a weakly extended Okavango Delta in the south with predicted opening velocity < 3 mm/yr. Recent studies in the region helped researchers to highlight the length and timescales of magmatism and faulting, the partitioning of strain between faulting and magmatism, and their implications for the development of along-axis segmentation. Although the human resource and instrument coverage is sparse in the continent, our understanding of rift processes and deep structure has improved in the last decade after the advent of space geodesy and broadband seismology. The recent major earthquakes, volcanic eruptions and mega dike intrusions that occurred along the EARS attracted several earth scientist teams across the globe. However, most African countries traversed by the rift do not have the full capacity to monitor and mitigate earthquake and volcanic hazards. Few monitoring facilities exist in some countries, and the data acquisition is rarely available in real-time for mitigation purpose. Many sub-Saharan Africa governments are currently focused on achieving the millennium development goals with massive infrastructure development scheme and urbanization while impending natural hazards of such nature are severely overlooked. Collaborations with overseas researchers and other joint efforts by the international community are opportunities to be used by African institutions to best utilize limited resources and to mitigate earthquake and volcano hazards.

  7. Evaluating fuel complexes for fire hazard mitigation planning in the southeastern United States.

    SciTech Connect

    Andreu, Anne G.; Shea, Dan; Parresol, Bernard, R.; Ottmar, Roger, D.

    2012-01-01

    Fire hazard mitigation planning requires an accurate accounting of fuel complexes to predict potential fire behavior and effects of treatment alternatives. In the southeastern United States, rapid vegetation growth coupled with complex land use history and forest management options requires a dynamic approach to fuel characterization. In this study we assessed potential surface fire behavior with the Fuel Characteristic Classification System (FCCS), a tool which uses inventoried fuelbed inputs to predict fire behavior. Using inventory data from 629 plots established in the upper Atlantic Coastal Plain, South Carolina, we constructed FCCS fuelbeds representing median fuel characteristics by major forest type and age class. With a dry fuel moisture scenario and 6.4 km h{sub 1} midflame wind speed, the FCCS predicted moderate to high potential fire hazard for the majority of the fuelbeds under study. To explore fire hazard under potential future fuel conditions, we developed fuelbeds representing the range of quantitative inventorydata for fuelbed components that drive surface fire behavior algorithms and adjusted shrub species composition to represent 30% and 60% relative cover of highly flammable shrub species. Results indicate that the primary drivers of surface fire behavior vary by forest type, age and surface fire behavior rating. Litter tends to be a primary or secondary driver in most forest types. In comparison to other surface fire contributors, reducing shrub loading results in reduced flame lengths most consistently across forest types. FCCS fuelbeds and the results from this project can be used for fire hazard mitigation planning throughout the southern Atlantic Coastal Plain where similar forest types occur. The approach of building simulated fuelbeds across the range of available surface fuel data produces sets of incrementally different fuel characteristics that can be applied to any dynamic forest types in which surface fuel conditions change rapidly.

  8. Spatio-temporal patterns of hazards and their use in risk assessment and mitigation. Case study of road accidents in Romania

    NASA Astrophysics Data System (ADS)

    Catalin Stanga, Iulian

    2013-04-01

    Road accidents are among the leading causes of death in many world countries, partly as an inherent consequence of the increasing mobility of today society. The World Health Organization estimates that 1.3 million people died in road accidents in 2011, which means 186 deaths per million. The tragic picture is completed by millions of peoples experiencing different physical injuries or by the enormous social and economic costs that these events imply. Romania has one of the most unsafe road networks within the European Union, with annual averages of 9400 accidents, 8300 injuries and almost 2680 fatalities (2007-2012). An average of 141 death per million is more than twice the average fatality rate in European Union (about 60 death per million). Other specific indicators (accidents or fatalities reported to the road length, vehicle fleet size, driving license owners or adult population etc.) are even worst in the same European context. Road accidents are caused by a complex series of factors, some of them being a relatively constant premise, while others act as catalyzing factors or triggering agent: road features and quality, vehicle technical state, weather conditions, human related factors etc. All these lead to a complex equation with too many unknown variables, making almost impossible a probabilistic approach. However, the high concentration of accidents in a region or in some road sectors is caused by the existence of a specific context, created by factors with permanent or repetitive character, and leads to the idea of a spatial autocorrelation between locations of different adjoining accident. In the same way, the increasing frequency of road accidents and of their causes repeatability in different periods of the year would allow to identify those black timeframes with higher incidence of road accidents. Identifying and analyzing the road blackspots (hotspots) and black zones would help to improve road safety by acting against the common causes that create the spatial or temporal clustering of crash accidents. Since the 1990's, Geographical Informational Systems (GIS) became a very important tool for traffic and road safety management, allowing not only the spatial and multifactorial analysis, but also graphical and non-graphical outputs. The current paper presents an accessible GIS methodology to study the spatio-temporal pattern of injury related road accidents, to identify the high density accidents zones, to make a cluster analysis, to create multicriterial typologies, to identify spatial and temporal similarities and to explain them. In this purpose, a Geographical Information System was created, allowing a complex analysis that involves not only the events, but also a large set of interrelated and spatially linked attributes. The GIS includes the accidents as georeferenced point elements with a spatially linked attribute database: identification information (date, location details); accident type; main, secondary and aggravating causes; data about driver; vehicle information; consequences (damages, injured peoples and fatalities). Each attribute has its own number code that allows both the statistical analysis and the spatial interrogation. The database includes those road accidents that led to physical injuries and loss of human lives between 2007 and 2012 and the spatial analysis was realized using TNTmips 7.3 software facilities. Data aggregation and processing allowed creating the spatial pattern of injury related road accidents through Kernel density estimation at three different levels (national - Romania; county level - Iasi County; local level - Iasi town). Spider graphs were used to create the temporal pattern or road accidents at three levels (daily, weekly and monthly) directly related to their causes. Moreover the spatial and temporal database relates the natural hazards (glazed frost, fog, and blizzard) with the human made ones, giving the opportunity to evaluate the nature of uncertainties in risk assessment. At the end, this paper provides a clustering methodology based on several environmental indicators intended to classify the spatial and temporal hotspots of road traffic insecurity. The results are a useful guide for planners and decision makers in developing effective road safety strategies and measures.

  9. The variational effects of jurisdictional attributes on hazard mitigation planning costs.

    PubMed

    Jackman, Andrea M; Beruvides, Mario G

    2015-01-01

    Under the Disaster Mitigation Act of 2000 and Federal Emergency Management Agency's subsequent Interim Final Rule, the requirement was placed on local governments to author and gain approval for a Hazard Mitigation Plan (HMP) for the areas under their jurisdiction. Low completion percentages for HMPs-less than one-third of eligible governments-were found by an analysis conducted 3 years after the final deadline for the aforementioned legislation took place. Follow-up studies showed little improvement at 5 and 8 years after the deadline. Based on these results, a previous study hypothesized that the cost of creating a HMP might be an influential factor in explaining why most jurisdictions had failed to write or gain approval for a HMP. The frequency of natural hazards experienced by the planning jurisdiction, the number of jurisdictions participating in the plan, and the population and population density were found to explain more than half of the variation in HMP costs. This study is a continuation of that effort, finding that there are significant differences in cost both across ranges of values for the jurisdictional attributes and single-jurisdictional versus multijurisdictional plans. PMID:25779899

  10. Threshold effects of hazard mitigation in coastal human-environmental systems

    NASA Astrophysics Data System (ADS)

    Lazarus, E. D.

    2014-01-01

    Despite improved scientific insight into physical and social dynamics related to natural disasters, the financial cost of extreme events continues to rise. This paradox is particularly evident along developed coastlines, where future hazards are projected to intensify with consequences of climate change, and where the presence of valuable infrastructure exacerbates risk. By design, coastal hazard mitigation buffers human activities against the variability of natural phenomena such as storms. But hazard mitigation also sets up feedbacks between human and natural dynamics. This paper explores developed coastlines as exemplary coupled human-environmental systems in which hazard mitigation is the key coupling mechanism. Results from a simplified numerical model of an agent-managed seawall illustrate the nonlinear effects that economic and physical thresholds can impart into coastal human-environmental system dynamics. The scale of mitigation action affects the time frame over which human activities and natural hazards interact. By accelerating environmental changes observable in some settings over human timescales of years to decades, climate change may temporarily strengthen the coupling between human and environmental dynamics. However, climate change could ultimately result in weaker coupling at those human timescales as mitigation actions increasingly engage global-scale systems.

  11. Assessing the costs of hazard mitigation through landscape interventions in the urban structure

    NASA Astrophysics Data System (ADS)

    Bostenaru-Dan, Maria; Aldea Mendes, Diana; Panagopoulos, Thomas

    2014-05-01

    In this paper we look at an issue rarely approached, the economic efficiency of natural hazard risk mitigation. The urban scale at which a natural hazard can impact leads to the importance of urban planning strategy in risk management. However, usually natural, engineering, and social sciences deal with it, and the role of architecture and urban planning is neglected. Climate change can lead to risks related to increased floods, desertification, sea level rise among others. Reducing the sealed surfaces in cities through green spaces in the crowded centres can mitigate them, and can be foreseen in restructuration plans in presence or absence of disasters. For this purpose we reviewed the role of green spaces and community centres such as churches in games, which can build the core for restructuration efforts, as also field and archive studies show. We look at the way ICT can contribute to organize the information from the building survey to economic computations in direct modeling or through games. The roles of game theory, agent based modeling and networks and urban public policies in designing decision systems for risk management are discussed. Games rules are at the same time supported by our field and archive studies, as well as research by design. Also we take into consideration at a rare element, which is the role of landscape planning, through the inclusion of green elements in reconstruction after the natural and man-made disasters, or in restructuration efforts to mitigate climate change. Apart of existing old city tissue also landscape can be endangered by speculation and therefore it is vital to highlight its high economic value, also in this particular case. As ICOMOS highlights for the 2014 congress, heritage and landscape are two sides of the same coin. Landscape can become or be connected to a community centre, the first being necessary for building a settlement, the second raising its value, or can build connections between landmarks in urban routes. For this reason location plays a role not only for mitigating the effects of hazards but also for increasing the value of land through vicinities. Games are only another way to build a model of the complex system which is the urban organism in this regard, and a model is easier to be analysed than the system while displaying its basic rules. The role of landscape of building roads of memory between landmarks in the reconstruction is yet to be investigated in a future proposed COST action.

  12. The price of safety: costs for mitigating and coping with Alpine hazards

    NASA Astrophysics Data System (ADS)

    Pfurtscheller, C.; Thieken, A. H.

    2013-10-01

    Due to limited public budgets and the need to economize, the analysis of costs of hazard mitigation and emergency management of natural hazards becomes increasingly important for public natural hazard and risk management. In recent years there has been a growing body of literature on the estimation of losses which supported to help to determine benefits of measures in terms of prevented losses. On the contrary, the costs of mitigation are hardly addressed. This paper thus aims to shed some light on expenses for mitigation and emergency services. For this, we analysed the annual costs of mitigation efforts in four regions/countries of the Alpine Arc: Bavaria (Germany), Tyrol (Austria), South Tyrol (Italy) and Switzerland. On the basis of PPP values (purchasing power parities), annual expenses on public safety ranged from EUR 44 per capita in the Free State of Bavaria to EUR 216 in the Autonomous Province of South Tyrol. To analyse the (variable) costs for emergency services in case of an event, we used detailed data from the 2005 floods in the Federal State of Tyrol (Austria) as well as aggregated data from the 2002 floods in Germany. The analysis revealed that multi-hazards, the occurrence and intermixture of different natural hazard processes, contribute to increasing emergency costs. Based on these findings, research gaps and recommendations for costing Alpine natural hazards are discussed.

  13. Assessment and mitigation of combustible dust hazards in the plastics industry

    NASA Astrophysics Data System (ADS)

    Stern, Michael C.; Ibarreta, Alfonso; Myers, Timothy J.

    2015-05-01

    A number of recent industrial combustible dust fires and explosions, some involving powders used in the plastics industry, have led to heightened awareness of combustible dust hazards, increased regulatory enforcement, and changes to the current standards and regulations. This paper provides a summary of the fundamentals of combustible dust explosion hazards, comparing and contrasting combustible dust to flammable gases and vapors. The types of tests used to quantify and evaluate the potential hazard posed by plastic dusts are explored. Recent changes in NFPA 654, a standard applicable to combustible dust in the plastics industry, are also discussed. Finally, guidance on the primary methods for prevention and mitigation of combustible dust hazards are provided.

  14. Numerical and Probabilistic Analysis of Asteroid and Comet Impact Hazard Mitigation

    NASA Astrophysics Data System (ADS)

    Plesko, C.; Weaver, R.; Huebner, W.

    2010-09-01

    The possibility of asteroid and comet nucleus impacts on Earth has received significant recent media and scientific attention. Still, there are many outstanding questions about the correct response once a potentially hazardous object (PHO) is found. Nuclear explosives are often suggested as a deflection mechanism because they have a high internal energy per unit launch mass. However, major uncertainties remain about the use of nuclear explosives for hazard mitigation. There are large uncertainties in a PHO’s physical response to a strong deflection or dispersion impulse like that delivered by nuclear munitions. Objects smaller than 100 m may be solid, and objects at all sizes may be “rubble piles” with large porosities and little strength [1]. Objects with these different properties would respond very differently, so the effects of object properties must be accounted for. Recent ground-based observations and missions to asteroids and comets have improved the planetary science community’s understanding of these objects. Computational power and simulation capabilities have improved to such an extent that it is possible to numerically model the hazard mitigation problem from first principles. Before we know that explosive yield Y at height h or depth -h from the target surface will produce a momentum change in or dispersion of a PHO, we must quantify the energy deposition into the system of particles that make up the PHO. Here we present the initial results of a parameter study in which we model the efficiency of energy deposition from a stand-off nuclear burst onto targets made of PHO constituent materials.

  15. Experimentally Benchmarked Numerical Approaches to Lightning Hazard Assessment and Mitigation

    NASA Astrophysics Data System (ADS)

    Jones, Malcolm; Newton, David

    2013-04-01

    A natural hazard that has been with us since the beginning of time is the lighting strike. Not only does it represent a direct hazard to humans but also to the facilities that they work within and the products they produce. The latter categories are of particular concern when they are related to potentially hazardous processes and products. For this reason experimental and numerical modelling techniques are developed to understand the nature of the hazards, to develop appropriate protective approaches which can be put in place and finally to gain assurance that the overall risks fall within national, international accepted standards and those appropriate to the special nature of the work. The latter is of particular importance when the processes and the products within such facilities have a potential susceptibility to lightning strike and where failure is deemed unacceptable. This paper covers examples of the modelling approaches applied to such facilities within which high consequence operations take place, together with the protection that is required for high consequence products. In addition examples are given of how the numerical techniques are benchmarked by supporting experimental programmes. Not only should such a safety rationale be laid down and agreed early for these facilities and products but that it is maintained during the inevitable changes that will occur during the design, development, production and maintenance phases. For example an 'improvement', as seen by a civil engineer or a facility manager, may well turn out to be detrimental to lightning safety. Constant vigilance is key to ensuring the maintenance of safety.

  16. Advances(?) in mitigating volcano hazards in Latin America

    USGS Publications Warehouse

    Hall, M.L.

    1991-01-01

    The 1980's were incredible years for volcanology. As a consequence of the Mount St. Helens and other eruptions, major advances in our understanding of volcanic processes and eruption dynamics were made. the decade also witnessed the greatest death toll caused by volcanism since 1902. Following Mount St. Helens, awareness of volcano hazards increased throughout the world; however, in Latin America, subsequent events showed that much was still to be learned. 

  17. Fourth DOE Natural Phenomena Hazards Mitigation Conference: Proceedings. Volume 1

    SciTech Connect

    Not Available

    1993-12-31

    This conference allowed an interchange in the natural phenomena area among designers, safety professionals, and managers. The papers presented in Volume I of the proceedings are from sessions I - VIII which cover the general topics of: DOE standards, lessons learned and walkdowns, wind, waste tanks, ground motion, testing and materials, probabilistic seismic hazards, risk assessment, base isolation and energy dissipation, and lifelines and floods. Individual papers are indexed separately. (GH)

  18. Department of Energy Natural Phenomena Hazards Mitigation Program

    SciTech Connect

    Murray, R.C.

    1993-09-01

    This paper will present a summary of past and present accomplishments of the Natural Phenomena Hazards Program that has been ongoing at Lawrence Livermore National Laboratory since 1975. The Natural Phenomena covered includes earthquake; winds, hurricanes, and tornadoes; flooding and precipitation; lightning; and volcanic events. The work is organized into four major areas (1) Policy, requirements, standards, and guidance (2) Technical support, research development, (3) Technology transfer, and (4) Oversight.

  19. Interdisciplinary approach for Tsunami Hazard Mitigation in Algeria (West Mediterranean)

    NASA Astrophysics Data System (ADS)

    Amir, L. A.; Cisternas, A.; Vigneresse, J. D.

    2009-12-01

    Numerous tsunamis occurred in the West Mediterranean with magnitudes ranging from m=-1 to m=2 (Imamura-Iida scale). In Algeria, tsunamis are reported from the 14th century to 2003. Northern Algeria is located at the border between the African and the Eurasian plate. Destructive earthquakes with magnitude greater than 6.7 occurred 3 times in the last century. The North Algeria western region is characterized by the Murdjadjo anticline. A destructive earthquake hit Oran city on October 1790 (Intensity: X, West of Algeria). A tsunami was triggered in the Alboran sea. The Spanish and North Africa coasts were flooded. Run-up’s of 2 meters in height are reported in historical documents (Lopez Marinas and Salord, 1990). Here, the 1790 Alboran tsunami is studied from a modelling approach. The tsunami source is determined from the Okada equations and the tsunami propagation is estimated from the SWAN code (Mader, 2004). Results show that active thrust faulting related to the Murdjadjo structure is responsible for the tsunami. In the central part of Algeria, the Algiers city (capital of Algeria) was the location of destructive earthquakes (Intensity: X) that were followed by tsunamis in 1365 and in 1773. Flooding and run-up’s of 2 meters in height are reported in historical documents for the 1365 event. The central part of Algeria is the site of the Sahel anticline. A tsunami modelling is also performed considering the Sahel fault system as a potential tsunami source. Results show that it takes less than 15 minutes for the tsunami waves to reach the Spanish coast. Run-up’s are estimated lower than 2 meters in height. Discrepancies are attributed to the resolution of the bathymetry and the limits of the modelling. In the eastern region, historical reports also reveal run-up’s up to 5 meters in height after a tsunami triggered by a destructive earthquake in 1856 in Jijel city (intensity: VIII). From tsunami catalogs, seismic and tsunami data are plotted using a tsunami vulnerability parameter. The vulnerability index is estimated from the tsunami intensity and the seismic intensity using the Papadopoulos and the EMS scale. Results show that in Algeria, tsunami damages are minor relative to seismic damages. Since the 2004 Sumatra-Andaman tsunami, intergovernmental coordinated groups are working on an Indian and a Mediterranean tsunami alert system. To reduce vulnerability and increase resilience, it is very important to implement an efficiency warning system and a communication policy for fast urbanized coastal cities. In that context, lessons from the pacific case study are of major interest. Chile is marked by a very high seismic and tsunami hazard. The Iquique area is a threaten zone for a potential earthquake of magnitude greater than 8 and a local tsunami that could generate run-up’s up to 20 meters in height. In addition to the Pacific Tsunami Warning centre based in Hawaii, the Chile has elaborated a local tsunami warning centre. The Chilean case study is presented in discussion to highlight some lessons that may serve as an example for fast urbanized coastal cities that have to face local tsunamis.

  20. Looking before we leap: an ongoing, quantative investigation of asteroid and comet impact hazard mitigation

    SciTech Connect

    Plesko, Catherine S; Weaver, Robert P; Bradley, Paul A; Huebner, Walter F

    2010-01-01

    There are many outstanding questions about the correct response to an asteroid or comet impact threat on Earth. Nuclear munitions are currently thought to be the most efficient method of delivering an impact-preventing impulse to a potentially hazardous object (PHO). However, there are major uncertainties about the response of PHOs to a nuclear burst, and the most appropriate ways to use nuclear munitions for hazard mitigation.

  1. Hazards in the Heliosphere: Forecasting and Mitigation Techniques

    NASA Astrophysics Data System (ADS)

    Crosby, N.

    2007-08-01

    Spacecraft have to survive very hostile environments which can severely limit space missions as well as pose threats to humans. Shielding requirements, including space storm shelters, both on the spacecraft as well as radiation protection facilities on the target, need to be taken into consideration with respect to travel time, local target space weather conditions and the phase of the solar cycle. Be it on Mars or a different planet, once we reach our target the local space weather conditions will be a function of the planet's location in the solar system and whether it has a magnetosphere and/or atmosphere around it. This presentation will look at the various opportunities that heliospheric exploration offers while in parallel evaluating the obstacles that must be overcome to realize these scenarios considering the feasibility to use and integrate existing systems (e.g. forecasting), as well as presenting innovative mitigation techniques.

  2. Assessment of indirect losses and costs of emergency for project planning of alpine hazard mitigation

    NASA Astrophysics Data System (ADS)

    Amenda, Lisa; Pfurtscheller, Clemens

    2013-04-01

    By virtue of augmented settling in hazardous areas and increased asset values, natural disasters such as floods, landslides and rockfalls cause high economic losses in Alpine lateral valleys. Especially in small municipalities, indirect losses, mainly stemming from a breakdown of transport networks, and costs of emergency can reach critical levels. A quantification of these losses is necessary to estimate the worthiness of mitigation measures, to determine the appropriate level of disaster assistance and to improve risk management strategies. There are comprehensive approaches available for assessing direct losses. However, indirect losses and costs of emergency are widely not assessed and the empirical basis for estimating these costs is weak. To address the resulting uncertainties of project appraisals, a standardized methodology has been developed dealing with issues of local economic effects and emergency efforts needed. In our approach, the cost-benefit-analysis for technical mitigation of the Austrian Torrent and Avalanche Control (TAC) will be optimized and extended using the 2005-debris flow as a design event, which struggled a small town in the upper Inn valley in southwest Tyrol (Austria). Thereby, 84 buildings were affected, 430 people were evacuated and due to this, the TAC implemented protection measures for 3.75 million Euros. Upgrading the method of the TAC and analyzing to what extent the cost-benefit-ratio is about to change, is one of the main objectives of this study. For estimating short-run indirect effects and costs of emergency on the local level, data was collected via questionnaires, field mapping, guided interviews, as well as intense literature research. According to this, up-to-date calculation methods were evolved and the cost-benefit-analysis of TAC was recalculated with these new-implemented results. The cost-benefit-ratio will be more precise and specific and hence, the decision, which mitigation alternative will be carried out. Based on this, the worthiness of the mitigation measures can be determined in more detail and the proper level of emergency assistance can be calculated more adequately. By dint of this study, a better data basis will be created evaluating technical and non-technical mitigation measures, which is useful for government agencies, insurance companies and research.

  3. Monitoring Fogo Island, Cape Verde Archipelago, for Volcanic Hazard Mitigation

    NASA Astrophysics Data System (ADS)

    Faria, B. V.; Heleno, S. I.; Barros, I. J.; d'Oreye, N.; Bandomo, Z.; Fonseca, J. F.

    2001-12-01

    Fogo Island, in the Cape Verde Archipelago (North Atlantic), with a total area of 476 km2 and a population of about 40000, is an active ocean island volcano raising from an average sea-bottom depth of the order of -3000m to a maximum altitude of 2820m. All of the 28 historically recorded eruptions (Ribeiro, 1960) since the arrival of the first settlers in the 15th Century took place in Cha das Caldeiras, a 9 km-wide flat zone 1700 meters above sea level that resulted from the infill of a large lateral collapse caldera (Day et al., 2000). The last eruptions occurred in 1951 and 1995, through secondary cones at the basis of Pico do Fogo, the main volcanic edifice. A tall scarp surrounds Cha das Calderas on its western side only, and the eastern limit leads to a very steep sub-aerial slope down to the coastline. With this morphology, the volcanic hazard is significant inside Cha das Caldeiras - with a resident population of the order of 800 - and particularly in the villages of the eastern coast. Because the magma has low viscosity, eruptions in Fogo have scarce precursory activity, and its forecast is therefore challenging. The VIGIL monitoring network was installed between 1997 and 2001, and is currently in full operation. It consists of seven seismographic stations - two of which broadband - four tilt stations, a CO2 monitoring station and a meteo station. The data is telemetred in real time to the central laboratory in the neighbor island of Santiago, and analyzed on a daily basis. The continuous data acquisition is complemented by periodic GPS, gravity and leveling surveys (Lima et al., this conference). In this paper we present the methodology adopted to monitor the level of volcanic activity of Fogo Volcano, and show examples of the data being collected. Anomalous data recorded at the end of September 2000, which led to the only occurrence of an alert warning so far, are also presented and discussed.

  4. Mitigation of EMU Cut Glove Hazard from Micrometeoroid and Orbital Debris Impacts on ISS Handrails

    NASA Technical Reports Server (NTRS)

    Ryan, Shannon; Christiansen, Eric L.; Davis, Bruce A.; Ordonez, Erick

    2009-01-01

    Recent cut damages sustained on crewmember gloves during extravehicular activity (ISS) onboard the International Space Station (ISS) have been caused by contact with sharp edges or a pinch point according to analysis of the damages. One potential source are protruding sharp edged crater lips from micrometeoroid and orbital debris (MMOD) impacts on metallic handrails along EVA translation paths. A number of hypervelocity impact tests were performed on ISS handrails, and found that mm-sized projectiles were capable of inducing crater lip heights two orders of magnitude above the minimum value for glove abrasion concerns. Two techniques were evaluated for mitigating the cut glove hazard of MMOD impacts on ISS handrails: flexible overwraps which act to limit contact between crewmember gloves and impact sites, and; alternate materials which form less hazardous impact crater profiles. In parallel with redesign efforts to increase the cut resilience of EMU gloves, the modifications to ISS handrails evaluated in this study provide the means to significantly reduce cut glove risk from MMOD impact craters

  5. The U.S. National Tsunami Hazard Mitigation Program: Successes in Tsunami Preparedness

    NASA Astrophysics Data System (ADS)

    Whitmore, P.; Wilson, R. I.

    2012-12-01

    Formed in 1995 by Congressional Action, the National Tsunami Hazards Mitigation Program (NTHMP) provides the framework for tsunami preparedness activities in the United States. The Program consists of the 28 U.S. coastal states, territories, and commonwealths (STCs), as well as three Federal agencies: the National Oceanic and Atmospheric Administration (NOAA), the Federal Emergency Management Agency (FEMA), and the United States Geological Survey (USGS). Since its inception, the NTHMP has advanced tsunami preparedness in the United States through accomplishments in many areas of tsunami preparedness: - Coordination and funding of tsunami hazard analysis and preparedness activities in STCs; - Development and execution of a coordinated plan to address education and outreach activities (materials, signage, and guides) within its membership; - Lead the effort to assist communities in meeting National Weather Service (NWS) TsunamiReady guidelines through development of evacuation maps and other planning activities; - Determination of tsunami hazard zones in most highly threatened coastal communities throughout the country by detailed tsunami inundation studies; - Development of a benchmarking procedure for numerical tsunami models to ensure models used in the inundation studies meet consistent, NOAA standards; - Creation of a national tsunami exercise framework to test tsunami warning system response; - Funding community tsunami warning dissemination and reception systems such as sirens and NOAA Weather Radios; and, - Providing guidance to NOAA's Tsunami Warning Centers regarding warning dissemination and content. NTHMP activities have advanced the state of preparedness of United States coastal communities, and have helped save lives and property during recent tsunamis. Program successes as well as future plans, including maritime preparedness, are discussed.

  6. New Activities of the U.S. National Tsunami Hazard Mitigation Program, Mapping and Modeling Subcommittee

    NASA Astrophysics Data System (ADS)

    Wilson, R. I.; Eble, M. C.

    2013-12-01

    The U.S. National Tsunami Hazard Mitigation Program (NTHMP) is comprised of representatives from coastal states and federal agencies who, under the guidance of NOAA, work together to develop protocols and products to help communities prepare for and mitigate tsunami hazards. Within the NTHMP are several subcommittees responsible for complimentary aspects of tsunami assessment, mitigation, education, warning, and response. The Mapping and Modeling Subcommittee (MMS) is comprised of state and federal scientists who specialize in tsunami source characterization, numerical tsunami modeling, inundation map production, and warning forecasting. Until September 2012, much of the work of the MMS was authorized through the Tsunami Warning and Education Act, an Act that has since expired but the spirit of which is being adhered to in parallel with reauthorization efforts. Over the past several years, the MMS has developed guidance and best practices for states and territories to produce accurate and consistent tsunami inundation maps for community level evacuation planning, and has conducted benchmarking of numerical inundation models. Recent tsunami events have highlighted the need for other types of tsunami hazard analyses and products for improving evacuation planning, vertical evacuation, maritime planning, land-use planning, building construction, and warning forecasts. As the program responsible for producing accurate and consistent tsunami products nationally, the NTHMP-MMS is initiating a multi-year plan to accomplish the following: 1) Create and build on existing demonstration projects that explore new tsunami hazard analysis techniques and products, such as maps identifying areas of strong currents and potential damage within harbors as well as probabilistic tsunami hazard analysis for land-use planning. 2) Develop benchmarks for validating new numerical modeling techniques related to current velocities and landslide sources. 3) Generate guidance and protocols for the production and use of new tsunami hazard analysis products. 4) Identify multistate collaborations and funding partners interested in these new products. Application of these new products will improve the overall safety and resilience of coastal communities exposed to tsunami hazards.

  7. Defending Planet Earth: Near-Earth Object Surveys and Hazard Mitigation Strategies

    NASA Technical Reports Server (NTRS)

    2010-01-01

    The United States spends approximately four million dollars each year searching for near-Earth objects (NEOs). The objective is to detect those that may collide with Earth. The majority of this funding supports the operation of several observatories that scan the sky searching for NEOs. This, however, is insufficient in detecting the majority of NEOs that may present a tangible threat to humanity. A significantly smaller amount of funding supports ways to protect the Earth from such a potential collision or "mitigation." In 2005, a Congressional mandate called for NASA to detect 90 percent of NEOs with diameters of 140 meters of greater by 2020. Defending Planet Earth: Near-Earth Object Surveys and Hazard Mitigation Strategies identifies the need for detection of objects as small as 30 to 50 meters as these can be highly destructive. The book explores four main types of mitigation including civil defense, "slow push" or "pull" methods, kinetic impactors and nuclear explosions. It also asserts that responding effectively to hazards posed by NEOs requires national and international cooperation. Defending Planet Earth: Near-Earth Object Surveys and Hazard Mitigation Strategies is a useful guide for scientists, astronomers, policy makers and engineers.

  8. Developing a scientific procedure for community based hazard mapping and risk mitigation

    NASA Astrophysics Data System (ADS)

    Verrier, M.

    2011-12-01

    As an international exchange student from the Geological Sciences Department at San Diego State University (SDSU), I joined the KKN-PPM program at Universitas Gadjah Mada (UGM), Yogyakarta, Indonesia, in July 2011 for 12 days (July 4th to July 16th) of its two month duration (July 4th to August 25th). The KKN-PPM group I was attached was designated 154 and was focused in Plosorejo Village, Karanganyar, Kerjo, Central Java, Indonesia. The mission of KKN-PPM 154 was to survey Plosorejo village for existing landslides, to generate a simple hazard susceptibility map that can be understood by local villagers, and then to begin dissemination of that map into the community. To generate our susceptibility map we first conducted a geological survey of the existing landslides in the field study area, with a focus on determining landslide triggers and gauging areas for susceptibility for future landslides. The methods for gauging susceptibility included lithological observation, the presence of linear cracking, visible loss of structural integrity in structures such as villager homes, as well as collaboration with local residents and with the local rescue and response team. There were three color distinctions used in representing susceptibility which were green, where there is no immediate danger of landslide damage; orange, where transportation routes are at risk of being disrupted by landslides; and red, where imminent landslide potential puts a home in direct danger. The landslide inventory and susceptibility data was compiled into digital mediums such as CorelDraw, ArcGIS and Google Earth. Once a technical map was generated, we presented it to the village leadership for confirmation and modification based on their experience. Finally, we began to use the technical susceptibility map to draft evacuation routes and meeting points in the event of landslides, as well as simple susceptibility maps that can be understood and utilized by local villagers. Landslide mitigation projects that are being conducted alongside the community hazard map include marking evacuation routes with painted bamboo signs, creating a meaningful landslide awareness mural, and installing simple early warning systems that detect land movement and alert residents that evacuation routes should be used. KKN-PPM is scheduled to continue until August 25th, 2011. In the future, research will be done into using the model for community based hazard mapping outlined here in the Geological Sciences Department at SDSU to increase georisk awareness and improve mitigation of landslides in local areas of need such as Tijuana, Mexico.

  9. Almost strict liability: Wind River Petroleum and the Utah Hazardous Substance Mitigation Act

    SciTech Connect

    1996-12-31

    In Wind River, the Utah Supreme Court developed a two-step liability standard. The court ruled that under the act, statutorily responsible parties are strictly liable for any release of hazardous material from their facility. Among responsible parties, liability is to be apportioned on an equitable contribution standard. However, the Utah Legislature has subsequently amended the Mitigation Act to prohibit the application of unapportioned strict liability. Therefore, Wind River can no longer be relied upon as the law regarding liability under the Mitigation Act.

  10. The influence of hazard models on GIS-based regional risk assessments and mitigation policies

    USGS Publications Warehouse

    Bernknopf, R.L.; Rabinovici, S.J.M.; Wood, N.J.; Dinitz, L.B.

    2006-01-01

    Geographic information systems (GIS) are important tools for understanding and communicating the spatial distribution of risks associated with natural hazards in regional economies. We present a GIS-based decision support system (DSS) for assessing community vulnerability to natural hazards and evaluating potential mitigation policy outcomes. The Land Use Portfolio Modeler (LUPM) integrates earth science and socioeconomic information to predict the economic impacts of loss-reduction strategies. However, the potential use of such systems in decision making may be limited when multiple but conflicting interpretations of the hazard are available. To explore this problem, we conduct a policy comparison using the LUPM to test the sensitivity of three available assessments of earthquake-induced lateral-spread ground failure susceptibility in a coastal California community. We find that the uncertainty regarding the interpretation of the science inputs can influence the development and implementation of natural hazard management policies. Copyright ?? 2006 Inderscience Enterprises Ltd.

  11. The NEOShield Project: Understanding the Mitigation-Relevant Physical Properties of Potentially Hazardous Asteroids

    NASA Astrophysics Data System (ADS)

    Harris, Alan W.; Drube, L.; Consortium, NEOShield

    2012-10-01

    NEOShield is a European-Union funded project to address impact hazard mitigation issues, coordinated by the German Aerospace Center, DLR. The NEOShield consortium consists of 13 research institutes, universities, and industrial partners from 6 countries and includes leading US and Russian space organizations. The primary aim of the 5.8 million euro, 3.5 year project, which commenced in January 2012, is to investigate in detail promising mitigation techniques, such as the kinetic impactor, blast deflection, and the gravity tractor, and devise feasible demonstration missions. Options for an international strategy for implementation when an actual impact threat arises will also be investigated. Our current scientific work is focused on examining the mitigation-relevant physical properties of the NEO population via observational data and laboratory experiments on asteroid surface analog materials. We are attempting to narrow the range of the expected properties of objects that are most likely to threaten the Earth and trigger space-borne mitigation attempts, and investigate how such objects would respond to different mitigation techniques. The results of our scientific work will flow into the technical phase of the project, during which detailed designs of feasible mitigation demonstration missions will be developed. We briefly describe the scope of the project and report on results obtained to date. Funded under EU FP7 program agreement no. 282703.

  12. Planning ahead for asteroid and comet hazard mitigation, phase 1: parameter space exploration and scenario modeling

    SciTech Connect

    Plesko, Catherine S; Clement, R Ryan; Weaver, Robert P; Bradley, Paul A; Huebner, Walter F

    2009-01-01

    The mitigation of impact hazards resulting from Earth-approaching asteroids and comets has received much attention in the popular press. However, many questions remain about the near-term and long-term, feasibility and appropriate application of all proposed methods. Recent and ongoing ground- and space-based observations of small solar-system body composition and dynamics have revolutionized our understanding of these bodies (e.g., Ryan (2000), Fujiwara et al. (2006), and Jedicke et al. (2006)). Ongoing increases in computing power and algorithm sophistication make it possible to calculate the response of these inhomogeneous objects to proposed mitigation techniques. Here we present the first phase of a comprehensive hazard mitigation planning effort undertaken by Southwest Research Institute and Los Alamos National Laboratory. We begin by reviewing the parameter space of the object's physical and chemical composition and trajectory. We then use the radiation hydrocode RAGE (Gittings et al. 2008), Monte Carlo N-Particle (MCNP) radiation transport (see Clement et al., this conference), and N-body dynamics codes to explore the effects these variations in object properties have on the coupling of energy into the object from a variety of mitigation techniques, including deflection and disruption by nuclear and conventional munitions, and a kinetic impactor.

  13. A perspective multidisciplinary geological approach for mitigation of effects due to the asbestos hazard

    NASA Astrophysics Data System (ADS)

    Vignaroli, Gianluca; Rossetti, Federico; Belardi, Girolamo; Billi, Andrea

    2010-05-01

    Asbestos-bearing rock sequences constitute a remarkable natural hazard that poses important threat to human health and may be at the origin of diseases such as asbestosis, mesothelioma and lung cancer). Presently, asbestos is classified as Category 1 carcinogen by world health authorities. Although regulatory agencies in many countries prohibit or restrict the use of asbestos, and discipline the environmental asbestos exposure, the impact of asbestos on human life still constitutes a major problem. Naturally occurring asbestos includes serpentine and amphibole minerals characterised by fibrous morphology and it is a constituent of mineralogical associations typical of mafic and ultramafic rocks within the ophiolitic sequences. Release of fibres can occur both through natural processes (erosion) and through human activities requiring fragmentation of ophiolite rocks (quarrying, tunnelling, railways construction, etc.). As a consequence, vulnerability is increasing in sites where workers and living people are involved by dispersion of fibres during mining and milling of ophiolitic rocks. By analysing in the field different exposures of ophiolitic sequences from the Italian peninsula and after an extensive review of the existing literature, we remark the importance of the geological context (origin, tectonic and deformation history) of ophiolites as a first-order parameter in evaluating the asbestos hazard. Integrated structural, textural, mineralogical and petrological studies significantly improve our understanding of the mechanisms governing the nucleation/growth of fibrous minerals in deformation structures (both ductile and brittle) within the ophiolitic rocks. A primary role is recognised in the structural processes favouring the fibrous mineralization, with correlation existing between the fibrous parameters (such as mineralogical composition, texture, mechanics characteristics) and the particles released in the air (such as shape, size, and amount liberated during rock fragmentation). Accordingly, we are confident that definition of an analytical protocol based on the geological attributes of the asbestos-bearing rocks may constitute a propaedeutical tool to evaluate the asbestos hazard in natural environments. This approach may have important implications for mitigation effects of the asbestos hazard from the medical field to the engineering operations.

  14. Earthquake Scaling and Development of Ground Motion Prediction for Earthquake Hazard Mitigation in Taiwan

    NASA Astrophysics Data System (ADS)

    Ma, K.; Yen, Y.

    2011-12-01

    For earthquake hazard mitigation toward risk management, integration study from development of source model to ground motion prediction is crucial. The simulation for high frequency component ( > 1 Hz) of strong ground motions in the near field was not well resolved due to the insufficient resolution in velocity structure. Using the small events as Green's functions (i.e. empirical Green's function (EGF) method) can resolve the problem of lack of precise velocity structure to replace the path effect evaluation. If the EGF is not available, a stochastic Green's function (SGF) method can be employed. Through characterizing the slip models derived from the waveform inversion, we directly extract the parameters needed for the ground motion prediction in the EGF method or the SGF method. The slip models had been investigated from Taiwan dense strong motion and global teleseismic data. In addition, the low frequency ( < 1 Hz) can obtained numerically by the Frequency-Wavenumber (FK) method. Thus, broadband frequency strong ground motion can be calculated by a hybrid method that combining a deterministic FK method for the low frequency simulation and the EGF or SGF method for high frequency simulation. Characterizing the definitive source parameters from the empirical scaling study can provide directly to the ground motion simulation. To give the ground motion prediction for a scenario earthquake, we compiled the earthquake scaling relationship from the inverted finite-fault models of moderate to large earthquakes in Taiwan. The studies show the significant involvement of the seismogenic depth to the development of rupture width. In addition to that, several earthquakes from blind fault show distinct large stress drop, which yield regional high PGA. According to the developing scaling relationship and the possible high stress drops for earthquake from blind faults, we further deploy the hybrid method mentioned above to give the simulation of the strong motion in displacement, velocity and acceleration. We now give this exercise to the high stress drop event, and the events, which might have potential seismic hazard to a specific site to give further estimation on seismic hazard evaluation.

  15. L-Reactor Habitat Mitigation Study

    SciTech Connect

    Not Available

    1988-02-01

    The L-Reactor Fish and Wildlife Resource Mitigation Study was conducted to quantify the effects on habitat of the L-Reactor restart and to identify the appropriate mitigation for these impacts. The completed project evaluated in this study includes construction of a 1000 acre reactor cooling reservoir formed by damming Steel Creek. Habitat impacts identified include a loss of approximately 3,700 average annual habitat units. This report presents a mitigation plan, Plan A, to offset these habitat losses. Plan A will offset losses for all species studied, except whitetailed deer. The South Carolina Wildlife and Marine Resources Department strongly recommends creation of a game management area to provide realistic mitigation for loss of deer habitats. 10 refs., 5 figs., 3 tabs. (MHB)

  16. A portfolio approach to evaluating natural hazard mitigation policies: An Application to lateral-spread ground failure in Coastal California

    USGS Publications Warehouse

    Bernknopf, R.L.; Dinitz, L.B.; Rabinovici, S.J.M.; Evans, A.M.

    2001-01-01

    In the past, efforts to prevent catastrophic losses from natural hazards have largely been undertaken by individual property owners based on site-specific evaluations of risks to particular buildings. Public efforts to assess community vulnerability and encourage mitigation have focused on either aggregating site-specific estimates or adopting standards based upon broad assumptions about regional risks. This paper develops an alternative, intermediate-scale approach to regional risk assessment and the evaluation of community mitigation policies. Properties are grouped into types with similar land uses and levels of hazard, and hypothetical community mitigation strategies for protecting these properties are modeled like investment portfolios. The portfolios consist of investments in mitigation against the risk to a community posed by a specific natural hazard, and are defined by a community's mitigation budget and the proportion of the budget invested in locations of each type. The usefulness of this approach is demonstrated through an integrated assessment of earthquake-induced lateral-spread ground failure risk in the Watsonville, California area. Data from the magnitude 6.9 Loma Prieta earthquake of 1989 are used to model lateral-spread ground failure susceptibility. Earth science and economic data are combined and analyzed in a Geographic Information System (GIS). The portfolio model is then used to evaluate the benefits of mitigating the risk in different locations. Two mitigation policies, one that prioritizes mitigation by land use type and the other by hazard zone, are compared with a status quo policy of doing no further mitigation beyond that which already exists. The portfolio representing the hazard zone rule yields a higher expected return than the land use portfolio does: However, the hazard zone portfolio experiences a higher standard deviation. Therefore, neither portfolio is clearly preferred. The two mitigation policies both reduce expected losses and increase overall expected community wealth compared to the status quo policy.

  17. Lidar and Electro-Optics for Atmospheric Hazard Sensing and Mitigation

    NASA Technical Reports Server (NTRS)

    Clark, Ivan O.

    2012-01-01

    This paper provides an overview of the research and development efforts of the Lidar and Electro-Optics element of NASA's Aviation Safety Program. This element is seeking to improve the understanding of the atmospheric environments encountered by aviation and to provide enhanced situation awareness for atmospheric hazards. The improved understanding of atmospheric conditions is specifically to develop sensor signatures for atmospheric hazards. The current emphasis is on kinetic air hazards such as turbulence, aircraft wake vortices, mountain rotors, and windshear. Additional efforts are underway to identify and quantify the hazards arising from multi-phase atmospheric conditions including liquid and solid hydrometeors and volcanic ash. When the multi-phase conditions act as obscurants that result in reduced visual awareness, the element seeks to mitigate the hazards associated with these diminished visual environments. The overall purpose of these efforts is to enable safety improvements for air transport class and business jet class aircraft as the transition to the Next Generation Air Transportation System occurs.

  18. The asteroid and comet impact hazard: risk assessment and mitigation options

    NASA Astrophysics Data System (ADS)

    Gritzner, Christian; Drfeld, Kai; Kasper, Jan; Fasoulas, Stefanos

    2006-08-01

    The impact of extraterrestrial matter onto Earth is a continuous process. On average, some 50,000 tons of dust are delivered to our planet every year. While objects smaller than about 30 m mainly disintegrate in the Earths atmosphere, larger ones can penetrate through it and cause damage on the ground. When an object of hundreds of meters in diameter impacts an ocean, a tsunami is created that can devastate coastal cities. Further, if a km-sized object hit the Earth it would cause a global catastrophe due to the transport of enormous amounts of dust and vapour into the atmosphere resulting in a change in the Earths climate. This article gives an overview of the near-Earth asteroid and comet (near-Earth object-NEO) impact hazard and the NEO search programmes which are gathering important data on these objects. It also points out options for impact hazard mitigation by using deflection systems. It further discusses the critical constraints for NEO deflection strategies and systems as well as mitigation and evacuation costs and benefits. Recommendations are given for future activities to solve the NEO impact hazard problem.

  19. Linear Aerospike SR-71 Experiment (LASRE): Aerospace Propulsion Hazard Mitigation Systems

    NASA Technical Reports Server (NTRS)

    Mizukami, Masashi; Corpening, Griffin P.; Ray, Ronald J.; Hass, Neal; Ennix, Kimberly A.; Lazaroff, Scott M.

    1998-01-01

    A major hazard posed by the propulsion system of hypersonic and space vehicles is the possibility of fire or explosion in the vehicle environment. The hazard is mitigated by minimizing or detecting, in the vehicle environment, the three ingredients essential to producing fire: fuel, oxidizer, and an ignition source. The Linear Aerospike SR-71 Experiment (LASRE) consisted of a linear aerospike rocket engine integrated into one-half of an X-33-like lifting body shape, carried on top of an SR-71 aircraft. Gaseous hydrogen and liquid oxygen were used as propellants. Although LASRE is a one-of-a-kind experimental system, it must be rated for piloted flight, so this test presented a unique challenge. To help meet safety requirements, the following propulsion hazard mitigation systems were incorporated into the experiment: pod inert purge, oxygen sensors, a hydrogen leak detection algorithm, hydrogen sensors, fire detection and pod temperature thermocouples, water misting, and control room displays. These systems are described, and their development discussed. Analyses, ground test, and flight test results are presented, as are findings and lessons learned.

  20. The asteroid and comet impact hazard: risk assessment and mitigation options.

    PubMed

    Gritzner, Christian; Dürfeld, Kai; Kasper, Jan; Fasoulas, Stefanos

    2006-08-01

    The impact of extraterrestrial matter onto Earth is a continuous process. On average, some 50,000 tons of dust are delivered to our planet every year. While objects smaller than about 30 m mainly disintegrate in the Earth's atmosphere, larger ones can penetrate through it and cause damage on the ground. When an object of hundreds of meters in diameter impacts an ocean, a tsunami is created that can devastate coastal cities. Further, if a km-sized object hit the Earth it would cause a global catastrophe due to the transport of enormous amounts of dust and vapour into the atmosphere resulting in a change in the Earth's climate. This article gives an overview of the near-Earth asteroid and comet (near-Earth object-NEO) impact hazard and the NEO search programmes which are gathering important data on these objects. It also points out options for impact hazard mitigation by using deflection systems. It further discusses the critical constraints for NEO deflection strategies and systems as well as mitigation and evacuation costs and benefits. Recommendations are given for future activities to solve the NEO impact hazard problem. PMID:16670908

  1. The Puerto Rico Component of the National Tsunami Hazard and Mitigation Program Pr-Nthmp

    NASA Astrophysics Data System (ADS)

    Huerfano Moreno, V. A.; Hincapie-Cardenas, C. M.

    2014-12-01

    Tsunami hazard assessment, detection, warning, education and outreach efforts are intended to reduce losses to life and property. The Puerto Rico Seismic Network (PRSN) is participating in an effort with local and federal agencies, to developing tsunami hazard risk reduction strategies under the National Tsunami Hazards and Mitigation Program (NTHMP). This grant supports the TsunamiReady program which is the base of the tsunami preparedness and mitigation in PR. The Caribbean region has a documented history of damaging tsunamis that have affected coastal areas. The seismic water waves originating in the prominent fault systems around PR are considered to be a near-field hazard for Puerto Rico and the Virgin islands (PR/VI) because they can reach coastal areas within a few minutes after the earthquake. Sources for local, regional and tele tsunamis have been identified and modeled and tsunami evacuation maps were prepared for PR. These maps were generated in three phases: First, hypothetical tsunami scenarios on the basis of the parameters of potential underwater earthquakes were developed. Secondly, each of these scenarios was simulated. The third step was to determine the worst case scenario (MOM). The run-ups were drawn on GIS referenced maps and aerial photographs. These products are being used by emergency managers to educate the public and develop mitigation strategies. Online maps and related evacuation products are available to the public via the PR-TDST (PR Tsunami Decision Support Tool). Currently all the 44 coastal municipalities were recognized as TsunamiReady by the US NWS. The main goal of the program is to declare Puerto Rico as TsunamiReady, including two cities that are not coastal but could be affected by tsunamis. Based on these evacuation maps, tsunami signs were installed, vulnerability profiles were created, communication systems to receive and disseminate tsunami messages were installed in each TWFP, and tsunami response plans were approved. Also, the existing tsunami protocol and criteria in the PR/VI was updated. This paper describes the PR-NTHMP project, including the real time earthquake and tsunami monitoring as well as the specific protocols used to broadcast tsunami messages. The paper highlights tsunami hazards assessment, detection, warning, education and outreach in Puerto Rico.

  2. Fluor Daniel Hanford implementation plan for DOE Order 5480.28, Natural phenomena hazards mitigation

    SciTech Connect

    Conrads, T.J.

    1997-09-12

    Natural phenomena hazards (NPH) are unexpected acts of nature that pose a threat or danger to workers, the public, or the environment. Earthquakes, extreme winds (hurricane and tornado), snow, flooding, volcanic ashfall, and lightning strikes are examples of NPH that could occur at the Hanford Site. U.S. Department of Energy (DOE) policy requires facilities to be designed, constructed, and operated in a manner that protects workers, the public, and the environment from hazards caused by natural phenomena. DOE Order 5480.28, Natural Phenomena Hazards Mitigation, includes rigorous new natural phenomena criteria for the design of new DOE facilities, as well as for the evaluation and, if necessary, upgrade of existing DOE facilities. The Order was transmitted to Westinghouse Hanford Company in 1993 for compliance and is also identified in the Project Hanford Management Contract, Section J, Appendix C. Criteria and requirements of DOE Order 5480.28 are included in five standards, the last of which, DOE-STD-1023, was released in fiscal year 1996. Because the Order was released before all of its required standards were released, enforcement of the Order was waived pending release of the last standard and determination of an in-force date by DOE Richland Operations Office (DOE-RL). Agreement also was reached between the Management and Operations Contractor and DOE-RL that the Order would become enforceable for new structures, systems, and components (SSCS) 60 days following issue of a new order-based design criteria in HNF-PRO-97, Engineering Design and Evaluation. The order also requires that commitments addressing existing SSCs be included in an implementation plan that is to be issued 1 year following the release of the last standard. Subsequently, WHC-SP-1175, Westinghouse Hanford Company Implementation Plan for DOE Order 5480.28, Natural Phenomena Hazards Mitigation, Rev. 0, was issued in November 1996, and this document, HNF-SP-1175, Fluor Daniel Hanford Implementation Plan for DOE Order 5480.28, Natural Phenomena Hazards Mitigation, is Rev. 1 of that plan.

  3. Use of a Novel Visual Metaphor Measure (PRISM) to Evaluate School Children's Perceptions of Natural Hazards, Sources of Hazard Information, Hazard Mitigation Organizations, and the Effectiveness of Future Hazard Education Programs in Dominica, Eastern Car

    NASA Astrophysics Data System (ADS)

    Parham, Martin; Day, Simon; Teeuw, Richard; Solana, Carmen; Sensky, Tom

    2015-04-01

    This project aims to study the development of understanding of natural hazards (and of hazard mitigation) from the age of 11 to the age of 15 in secondary school children from 5 geographically and socially different schools on Dominica, through repeated interviews with the students and their teachers. These interviews will be coupled with a structured course of hazard education in the Geography syllabus; the students not taking Geography will form a control group. To avoid distortion of our results arising from the developing verbalization and literacy skills of the students over the 5 years of the project, we have adapted the PRISM tool used in clinical practice to assess patient perceptions of illness and treatment (Buchi & Sensky, 1999). This novel measure is essentially non-verbal, and uses spatial positions of moveable markers ("object" markers) on a board, relative to a fixed marker that represents the subject's "self", as a visual metaphor for the importance of the object to the subject. The subjects also explain their reasons for placing the markers as they have, to provide additional qualitative information. The PRISM method thus produces data on the perceptions measured on the board that can be subjected to statistical analysis, and also succinct qualitative data about each subject. Our study will gather data on participants' perceptions of different natural hazards, different sources of information about these, and organizations or individuals to whom they would go for help in a disaster, and investigate how these vary with geographical and social factors. To illustrate the method, which is generalisable, we present results from our initial interviews of the cohort of 11 year olds whom we will follow through their secondary school education. Büchi, S., & Sensky, T. (1999). PRISM: Pictorial Representation of Illness and Self Measure: a brief nonverbal measure of illness impact and therapeutic aid in psychosomatic medicine. Psychosomatics, 40(4), 314-320.

  4. Use of a Novel Visual Metaphor Measure (PRISM) to Evaluate School Children's Perceptions of Natural Hazards, Sources of Hazard Information, Hazard Mitigation Organizations, and the Effectiveness of Future Hazard Education Programs in Dominica, Eastern Caribbean

    NASA Astrophysics Data System (ADS)

    Parham, M.; Day, S. J.; Teeuw, R. M.; Solana, C.; Sensky, T.

    2014-12-01

    This project aims to study the development of understanding of natural hazards (and of hazard mitigation) from the age of 11 to the age of 15 in secondary school children from 5 geographically and socially different schools on Dominica, through repeated interviews with the students and their teachers. These interviews will be coupled with a structured course of hazard education in the Geography syllabus; the students not taking Geography will form a control group. To avoid distortion of our results arising from the developing verbalization and literacy skills of the students over the 5 years of the project, we have adapted the PRISM tool used in clinical practice to assess patient perceptions of illness and treatment (Buchi & Sensky, 1999). This novel measure is essentially non-verbal, and uses spatial positions of moveable markers ("object" markers) on a board, relative to a fixed marker that represents the subject's "self", as a visual metaphor for the importance of the object to the subject. The subjects also explain their reasons for placing the markers as they have, to provide additional qualitative information. The PRISM method thus produces data on the perceptions measured on the board that can be subjected to statistical analysis, and also succinct qualitative data about each subject. Our study will gather data on participants' perceptions of different natural hazards, different sources of information about these, and organizations or individuals to whom they would go for help in a disaster, and investigate how these vary with geographical and social factors. To illustrate the method, which is generalisable, we present results from our initial interviews of the cohort of 11 year olds whom we will follow through their secondary school education.Büchi, S., & Sensky, T. (1999). PRISM: Pictorial Representation of Illness and Self Measure: a brief nonverbal measure of illness impact and therapeutic aid in psychosomatic medicine. Psychosomatics, 40(4), 314-320.

  5. Impact hazard mitigation: understanding the effects of nuclear explosive outputs on comets and asteroids

    SciTech Connect

    Clement, Ralph R C; Plesko, Catherine S; Bradley, Paul A; Conlon, Leann M

    2009-01-01

    The NASA 2007 white paper ''Near-Earth Object Survey and Deflection Analysis of Alternatives'' affirms deflection as the safest and most effective means of potentially hazardous object (PHO) impact prevention. It also calls for further studies of object deflection. In principle, deflection of a PHO may be accomplished by using kinetic impactors, chemical explosives, gravity tractors, solar sails, or nuclear munitions. Of the sudden impulse options, nuclear munitions are by far the most efficient in terms of yield-per-unit-mass launched and are technically mature. However, there are still significant questions about the response of a comet or asteroid to a nuclear burst. Recent and ongoing observational and experimental work is revolutionizing our understanding of the physical and chemical properties of these bodies (e.g ., Ryan (2000) Fujiwara et al. (2006), and Jedicke et al. (2006)). The combination of this improved understanding of small solar-system bodies combined with current state-of-the-art modeling and simulation capabilities, which have also improved dramatically in recent years, allow for a science-based, comprehensive study of PHO mitigation techniques. Here we present an examination of the effects of radiation from a nuclear explosion on potentially hazardous asteroids and comets through Monte Carlo N-Particle code (MCNP) simulation techniques. MCNP is a general-purpose particle transport code commonly used to model neutron, photon, and electron transport for medical physics reactor design and safety, accelerator target and detector design, and a variety of other applications including modeling the propagation of epithermal neutrons through the Martian regolith (Prettyman 2002). It is a massively parallel code that can conduct simulations in 1-3 dimensions, complicated geometries, and with extremely powerful variance reduction techniques. It uses current nuclear cross section data, where available, and fills in the gaps with analytical models where data are not available. MCNP has undergone extensive verification and validation and is considered the gold-standard for particle transport. (Forrest B. Brown, et al., ''MCNP Version 5,'' Trans. Am. Nucl. Soc., 87, 273, November 2002.) Additionally, a new simulation capability using MCNP has become available to this collaboration. The first results of this new capability will also be presented.

  6. Impact Hazard Mitigation: Understanding the Effects of Nuclear Explosive Outputs on Comets and Asteroids

    NASA Astrophysics Data System (ADS)

    Clement, R.

    The NASA 2007 white paper "Near-Earth Object Survey and Deflection Analysis of Alternatives" affirms deflection as the safest and most effective means of potentially hazardous object (PHO) impact prevention. It also calls for further studies of object deflection. In principle, deflection of a PHO may be accomplished by using kinetic impactors, chemical explosives, gravity tractors, solar sails, or nuclear munitions. Of the sudden impulse options, nuclear munitions are by far the most efficient in terms of yield-per-unit-mass launched and are technically mature. However, there are still significant questions about the response of a comet or asteroid to a nuclear burst. Recent and ongoing observational and experimental work is revolutionizing our understanding of the physical and chemical properties of these bodies (e.g., Ryan (2000), Fujiwara et al. (2006), and Jedicke et al. (2006)). The combination of this improved understanding of small solar-system bodies combined with current state-of-the-art modeling and simulation capabilities, which have also improved dramatically in recent years, allow for a science-based, comprehensive study of PHO mitigation techniques. Here we present an examination of the effects of radiation from a nuclear explosion on potentially hazardous asteroids and comets through Monte Carlo N-Particle code (MCNP) simulation techniques. MCNP is a general-purpose particle transport code commonly used to model neutron, photon, and electron transport for medical physics, reactor design and safety, accelerator target and detector design, and a variety of other applications including modeling the propagation of epithermal neutrons through the Martian regolith (Prettyman 2002). It is a massively parallel code that can conduct simulations in 1-3 dimensions, complicated geometries, and with extremely powerful variance reduction techniques. It uses current nuclear cross section data, where available, and fills in the gaps with analytical models where data are not available. MCNP has undergone extensive verification and validation and is considered the gold-standard for particle transport. (Forrest B. Brown, et al., "MCNP Version 5," Trans. Am. Nucl. Soc., 87, 273, November 2002.) Additionally, a new simulation capability using MCNP has become available to this collaboration. The first results of this new capability will also be presented. In particular, we will show results of neutron and gamma-ray energy deposition and flux as a function of material depth, composition, density, geometry, and distance from the source (nuclear burst). We will also discuss the benefits and shortcomings of linear Monte Carlo. Finally, we will set the stage for the correct usage and limitations of these results in coupled radiation-hydrodynamic calculations (see Plesko et al, this conference).

  7. Identification, prediction, and mitigation of sinkhole hazards in evaporite karst areas

    USGS Publications Warehouse

    Gutierrez, F.; Cooper, A.H.; Johnson, K.S.

    2008-01-01

    Sinkholes usually have a higher probability of occurrence and a greater genetic diversity in evaporite terrains than in carbonate karst areas. This is because evaporites have a higher solubility and, commonly, a lower mechanical strength. Subsidence damage resulting from evaporite dissolution generates substantial losses throughout the world, but the causes are only well understood in a few areas. To deal with these hazards, a phased approach is needed for sinkhole identification, investigation, prediction, and mitigation. Identification techniques include field surveys and geomorphological mapping combined with accounts from local people and historical sources. Detailed sinkhole maps can be constructed from sequential historical maps, recent topographical maps, and digital elevation models (DEMs) complemented with building-damage surveying, remote sensing, and high-resolution geodetic surveys. On a more detailed level, information from exposed paleosubsidence features (paleokarst), speleological explorations, geophysical investigations, trenching, dating techniques, and boreholes may help in investigating dissolution and subsidence features. Information on the hydrogeological pathways including caves, springs, and swallow holes are particularly important especially when corroborated by tracer tests. These diverse data sources make a valuable database-the karst inventory. From this dataset, sinkhole susceptibility zonations (relative probability) may be produced based on the spatial distribution of the features and good knowledge of the local geology. Sinkhole distribution can be investigated by spatial distribution analysis techniques including studies of preferential elongation, alignment, and nearest neighbor analysis. More objective susceptibility models may be obtained by analyzing the statistical relationships between the known sinkholes and the conditioning factors. Chronological information on sinkhole formation is required to estimate the probability of occurrence of sinkholes (number of sinkholes/km2 year). Such spatial and temporal predictions, frequently derived from limited records and based on the assumption that past sinkhole activity may be extrapolated to the future, are non-corroborated hypotheses. Validation methods allow us to assess the predictive capability of the susceptibility maps and to transform them into probability maps. Avoiding the most hazardous areas by preventive planning is the safest strategy for development in sinkhole-prone areas. Corrective measures could be applied to reduce the dissolution activity and subsidence processes. A more practical solution for safe development is to reduce the vulnerability of the structures by using subsidence-proof designs. ?? 2007 Springer-Verlag.

  8. Monitoring active volcanoes and mitigating volcanic hazards: the case for including simple approaches

    NASA Astrophysics Data System (ADS)

    Stoiber, Richard E.; Williams, Stanley N.

    1990-07-01

    Simple approaches to problems brought about eruptions and their ensuing hazardous effects should be advocated and used by volcanologists while awaiting more sophisticated remedies. The expedients we advocate have all or many of the following attributes: only locally available materials are required; no extensive training of operators or installation is necessary; they are affordable and do not require foreign aid or exports; they are often labor intensive and are sustainable without outside assistance. Where appropriate, the involvement of local residents is advocated. Examples of simple expedients which can be used in forecasting or mitigating the effects of crises emphasize the relative ease and the less elaborate requirements with which simple approaches can be activated. Emphasis is on visual observations often by untrained observers, simple meteorogical measurements, observations of water level in lakes, temperature and chemistry of springs and fumaroles, new springs and collapse areas and observations of volcanic plumes. Simple methods are suggested which can be applied to mitigating damage from mudflows, nuées ardentes, tephra falls and gas discharge. A review in hindsight at Ruiz includes the use of both chemical indicators and simple mudflow alarms. Simple expedients are sufficiently effective that any expert volcanologist called to aid in a crisis must include them in the package of advice offered. Simple approaches are a critical and logical complement to highly technical solutions to hazardous situations.

  9. 2009 ERUPTION OF REDOUBT VOLCANO: Lahars, Oil, and the Role of Science in Hazards Mitigation (Invited)

    NASA Astrophysics Data System (ADS)

    Swenson, R.; Nye, C. J.

    2009-12-01

    In March, 2009, Redoubt Volcano erupted for the third time in 45 years. More than 19 explosions produced ash plumes to 60,000 ft asl, lahar flows of mud and ice down the Drift river ~30 miles to the coast, and tephra fall up to 1.5 mm onto surrounding communities. The eruption had severe impact on many operations. Airlines were forced to cancel or divert hundreds of international and domestic passenger and cargo flights, and Anchorage International airport closed for over 12 hours. Mudflows and floods down the Drift River to the coast impacted operations at the Drift River Oil Terminal (DROT) which was forced to shut down and ultimately be evacuated. Prior mitigation efforts to protect the DROT oil tank farm from potential impacts associated with a major eruptive event were successful, and none of the 148,000 barrels of oil stored at the facility was spilled or released. Nevertheless, the threat of continued eruptive activity at Redoubt, with the possibility of continued lahar flows down the Drift River alluvial fan, required an incident command post be established so that the US Coast Guard, Alaska Dept. of Environmental Conservation, and the Cook Inlet Pipeline Company could coordinate a response to the potential hazards. Ultimately, the incident command team relied heavily on continuous real-time data updates from the Alaska Volcano Observatory, as well as continuous geologic interpretations and risk analysis by the USGS Volcanic Hazards group, the State Division of Geological and Geophysical Surveys and the University of Alaska Geophysical Institute, all members of the collaborative effort of the Alaska Volcano Observatory. The great success story that unfolded attests to the efforts of the incident command team, and their reliance on real-time scientific analysis from scientific experts. The positive results also highlight how pre-disaster mitigation and monitoring efforts, in concert with hazards response planning, can be used in a cooperative industry / multi-agency effort to positively affect hazards mitigation. The final outcomes from this potentially disastrous event included: 1) no on-site personnel were injured; 2) no detrimental environmental impacts associated with the oil terminal occurred; and 3) incident command personnel, together with numerous industry representatives, were able to make well-informed, although costly decisions that resulted in safe removal of the oil from the storage facilities. The command team’s efforts also furthered the process of restarting the Cook Inlet oil production after a forced five month shutdown.

  10. Bringing New Tools and Techniques to Bear on Earthquake Hazard Analysis and Mitigation

    NASA Astrophysics Data System (ADS)

    Willemann, R. J.; Pulliam, J.; Polanco, E.; Louie, J. N.; Huerta-Lopez, C.; Schmitz, M.; Moschetti, M. P.; Huerfano Moreno, V.; Pasyanos, M.

    2013-12-01

    During July 2013, IRIS held an Advanced Studies Institute in Santo Domingo, Dominican Republic, that was designed to enable early-career scientists who already have mastered the fundamentals of seismology to begin collaborating in frontier seismological research. The Institute was conceived of at a strategic planning workshop in Heredia, Costa Rica, that was supported and partially funded by USAID, with a goal of building geophysical capacity to mitigate the effects of future earthquakes. To address this broad goal, we drew participants from a dozen different countries of Middle America. Our objectives were to develop understanding of the principles of earthquake hazard analysis, particularly site characterization techniques, and to facilitate future research collaborations. The Institute was divided into three main sections: overviews on the fundamentals of earthquake hazard analysis and lectures on the theory behind methods of site characterization; fieldwork where participants acquired new data of the types typically used in site characterization; and computer-based analysis projects in which participants applied their newly-learned techniques to the data they collected. This was the first IRIS institute to combine an instructional short course with field work for data acquisition. Participants broke into small teams to acquire data, analyze it on their own computers, and then make presentations to the assembled group describing their techniques and results.Using broadband three-component seismometers, the teams acquired data for Spatial Auto-Correlation (SPAC) analysis at seven array locations, and Horizontal to Vertical Spectral Ratio (HVSR) analysis at 60 individual sites along six profiles throughout Santo Domingo. Using a 24-channel geophone string, the teams acquired data for Refraction Microtremor (SeisOptReMi™ from Optim) analysis at 11 sites, with supplementary data for active-source Multi-channel Spectral Analysis of Surface Waves (MASW) analysis at five of them. The results showed that teams quickly learned to collect high-quality data for each method of analysis. SPAC and refraction microtremor analysis each demonstrated that dispersion relations based on ambient noise and from arrays with an aperture of less than 200 meters could be used to determine the depth of a weak, disaggregated layer known to underlie the fast near-surface limestone terraces on which Santo Domingo is situated, and indicated the presence of unexpectedly strong rocks below. All three array methods concurred that most Santo Domingo sites has relatively high VS30 (average shear velocity to a depth of 30 m), generally at the B-C NEHRP hazard class boundary or higher. HVSR analysis revealed that the general pattern of resonance was short periods close to the coast, and an increase with distance from the shore line. In the east-west direction, significant variations were also evident at the highest elevation terrace, and near the Ozama River. In terms of the sub-soil conditions, the observed pattern of HVSR values, departs form the expected increase of sediments thickness close to the coast.

  11. Coupling Radar Rainfall Estimation and Hydrological Modelling For Flash-flood Hazard Mitigation

    NASA Astrophysics Data System (ADS)

    Borga, M.; Creutin, J. D.

    Flood risk mitigation is accomplished through managing either or both the hazard and vulnerability. Flood hazard may be reduced through structural measures which alter the frequency of flood levels in the area. The vulnerability of a community to flood loss can be mitigated through changing or regulating land use and through flood warning and effective emergency response. When dealing with flash-flood hazard, it is gener- ally accepted that the most effective way (and in many instances the only affordable in a sustainable perspective) to mitigate the risk is by reducing the vulnerability of the involved communities, in particular by implementing flood warning systems and community self-help programs. However, both the inherent characteristics of the at- mospheric and hydrologic processes involved in flash-flooding and the changing soci- etal needs provide a tremendous challenge to traditional flood forecasting and warning concepts. In fact, the targets of these systems are traditionally localised like urbanised sectors or hydraulic structures. Given the small spatial scale that characterises flash floods and the development of dispersed urbanisation, transportation, green tourism and water sports, human lives and property are exposed to flash flood risk in a scat- tered manner. This must be taken into consideration in flash flood warning strategies and the investigated region should be considered as a whole and every section of the drainage network as a potential target for hydrological warnings. Radar technology offers the potential to provide information describing rain intensities almost contin- uously in time and space. Recent research results indicate that coupling radar infor- mation to distributed hydrologic modelling can provide hydrologic forecasts at all potentially flooded points of a region. Nevertheless, very few flood warning services use radar data more than on a qualitative basis. After a short review of current under- standing in this area, two issues are examined: advantages and caveats of using radar rainfall estimates in operational flash flood forecasting, methodological problems as- sociated to the use of hydrological models for distributed flash flood forecasting with rainfall input estimated from radar.

  12. Earth sciences, GIS and geomatics for natural hazards assessment and risks mitigation: a civil protection perspective

    NASA Astrophysics Data System (ADS)

    Perotti, Luigi; Conte, Riccardo; Lanfranco, Massimo; Perrone, Gianluigi; Giardino, Marco; Ratto, Sara

    2010-05-01

    Geo-information and remote sensing are proper tools to enhance functional strategies for increasing awareness on natural hazards and risks and for supporting research and operational activities devoted to disaster reduction. An improved Earth Sciences knowledge coupled with Geomatics advanced technologies has been developed by the joint research group and applied by the ITHACA (Information Technology for Humanitarian Assistance, Cooperation and Action) centre, within its partnership with the UN World Food Programme (WFP) with the goal of reducing human, social, economic and environmental losses due to natural hazards and related disasters. By cooperating with local and regional authorities (Municipalities, Centro Funzionale of the Aosta Valley, Civil Protection Agency of Regione Piemonte), data on natural hazards and risks have been collected, compared to national and global data, then interpreted for helping communities and civil protection agencies of sensitive mountain regions to make strategic choices and decisions to better mitigation and adaption measures. To enhance the application of GIS and Remote-sensing technologies for geothematic mapping of geological and geomorphological risks of mountain territories of Europe and Developing Countries, research activities led to the collection and evaluation of data from scientific literature and historical technical archives, for the definition of predisposing/triggering factors and evolutionary processes of natural instability phenomena (landslides, floods, storms, …) and for the design and implementation of early-warning and early-impact systems. Geodatabases, Remote Sensing and Mobile-GIS applications were developed to perform analysis of : 1) large climate-related disaster (Hurricane Mitch, Central America), by the application of remote sensing techniques, either for early warning or mitigation measures at the national and international scale; 2) distribution of slope instabilities at the regional scale (Aosta Valley, NW-Italy), for preventing and recovering measures; 3) geological and geomorphological controlling factors of seismicity, to provide microzonation maps and scenarios for co-seismic response of instable zones (Dronero, NW- Italian Alps); 4) earthquake effects on ground and infrastructures, in order to register early assessment for awareness situations and for compile damage inventories (Asti-Alessandria seismic events, 2000, 2001, 2003). The research results has been able to substantiate early warning models by structuring geodatabases on natural disasters, and to support humanitarian relief and disaster management activities by creating and testing SRG2, a mobile-GIS application for field-data collection on natural hazards and risks.

  13. Solutions Network Formulation Report. NASA's Potential Contributions using ASTER Data in Marine Hazard Mitigation

    NASA Technical Reports Server (NTRS)

    Fletcher, Rose

    2010-01-01

    The 28-foot storm surge from Hurricane Katrina pushed inland along bays and rivers for a distance of 12 miles in some areas, contributing to the damage or destruction of about half of the fleet of boats in coastal Mississippi. Most of those boats had sought refuge in back bays and along rivers. Some boats were spared damage because the owners chose their mooring site well. Gulf mariners need a spatial analysis tool that provides guidance on the safest places to anchor their boats during future hurricanes. This product would support NOAA s mission to minimize the effects of coastal hazards through awareness, education, and mitigation strategies and could be incorporated in the Coastal Risk Atlas decision support tool.

  14. Rockfall hazard assessment, risk quantification, and mitigation options for reef cove resort development, False Cape, Queensland, Australia

    NASA Astrophysics Data System (ADS)

    Schlotfeldt, P.

    2009-04-01

    GIS and 2-D rock fall simulations were used as the primary tools during a rock fall hazard assessment and analyses for a major resort and township development near Cairns, Queensland in Australia. The methods used included 1) the development of a digital elevation model (DEM); undertaking rock fall trajectory analyses to determine the end points of rockfalls, the distribution of kinetic energy for identified rock fall runout Zones, and 3) undertaking event tree analyses based on a synthesis of all data in order to establish Zones with the highest risk of fatalities. This paper describes the methodology used and the results of this work. Recommendations to mitigate the hazard included having exclusions zones with no construction, scaling (including trim blasting), construction of berms and rockfall catch fences. Keywords: GIS, rockfall simulation, rockfall runout Zones, mitigation options INTRODUCTION False Cape is located on the east side of the Trinity inlet near Cairns (Figure 1). Construction is underway for a multi-million dollar development close the beach front. The development will ultimately cover about 1.5 km of prime coast line. The granite slopes above the development are steep and are covered with a number of large, potentially unstable boulders. Sheet jointing is present in the in-situ bedrock and these combined with other tectonic joint sets have provided a key mechanism for large side down slope on exposed bedrock. With each rock fall (evidence by boulders strew in gullies, over the lower parts of the slope, and on the beach) the failure mechanism migrates upslope. In order for the Developer to proceed with construction he needs to mitigate the identified rock fall hazard. The method used to study the hazard and key finding are presented in this paper. Discussion is provided in the conclusion on mitigation options. KEY METHODS USED TO STUDY THE HAZARD In summary the methods used to study the hazard for the False Cape project include; 1. The development of a digital elevation model (DEM) used to delineate rock fall runout Zones [1] that included the spatial location of boulder fields mapped within Zones(Figure 2). A Zone is defined as an area above the development on steep sided slopes where falling rocks are channeled into gullies / and or are contained between topographic features such as ridges and spurs that extend down the mountainside. These natural barriers generally ensure that falling rocks do not fall or roll into adjacent Zones; 2. The use of ‘Flow Path Tracing Tool' in Arc GIS spatial analyst to confirm typical descents of boulders in Zones. These were shown to correlated strongly with the endpoints of boulders observed within the development and major clusters of boulders on the beach front; 3. The use of 2-D rockfall trajectory analyses [2] using sections cut along typical 3-D trajectory paths mapped out in ARC GIS per Zone. Sections along typical paths in Zones simulated, to some degree, the 3-D affect or path of rocks as they bounce roll down slope (Figure 3); 4. The calibration of rockfall input parameters (coefficients of normal and tangential restitution, slope roughness, friction angle, etc.) using field identified endpoints and size of fallen rock and boulder; and 5. Undertaking risk evolutions in order to quantify the potential risk for each independent rockfall Zone. KEY FINDINGS FROM THE STUDIES The key findings from the study include; 1. Multiple potentially unstable in-situ boulders (some in excess of several thousand tonnes) are present above the development. 2. Similar geological structures (dykes, jointing, etc.) are present in the boulders on the beach front and within the development exposed in-situ bedrock located above the development. Measurement and comparison of the orientation of these geological structures present in boulders with that observed in the in-situ bedrock provided strong evidence that that the boulders have mitigated down slope. 3. Eight discrete Rockfall Runout Zones were identified using the digital elevation model set up in ARC GIS (Figure 4). The boundaries were field verified as far as possible. The identified Zones formed the basis of all subsequent work. 4. Once calibrated the rockfall trajectory modeling showed that only between 1% and in the worst case 28% of falling rocks (percentage of 1000 seeding events) per Zones would actually reach the development. While this indicated a reduced likelihood of an incident and hence the risk, the kinetic energy in the case of an impact in most Zones was so high (for the given design block size) that the consequence would be untenable without some form of mitigation. 5. An event tree analysis showed that five out of the eight Zones identified had risk profiles that fell above or very close to what was considered to be an acceptable annual probability of occurrence of a fatality or fatalities. CONCLUSIONS Each Zone has unique characteristics that influence the risk profile associated with the rock fall hazard to the development. Mitigation options and recommendations needed to be adjusted accordingly to fit the physical characteristics and assessed risk profile of each Zone. These included: 1. The possible implantation of exclusion zones (no build areas); 2. Scaling (including controlled blasting) to reduce the potential kinetic energy associated with identified potentially unstable boulders; and 3. The design and construction of Berms and rockfall catch fences.

  15. Making the Handoff from Earthquake Hazard Assessments to Effective Mitigation Measures (Invited)

    NASA Astrophysics Data System (ADS)

    Applegate, D.

    2010-12-01

    This year has witnessed a barrage of large earthquakes worldwide with the resulting damages ranging from inconsequential to truly catastrophic. We cannot predict when earthquakes will strike, but we can build communities that are resilient to strong shaking as well as to secondary hazards such as landslides and liquefaction. The contrasting impacts of the magnitude-7 earthquake that struck Haiti in January and the magnitude-8.8 event that struck Chile in April underscore the difference that mitigation and preparedness can make. In both cases, millions of people were exposed to severe shaking, but deaths in Chile were measured in the hundreds rather than the hundreds of thousands that perished in Haiti. Numerous factors contributed to these disparate outcomes, but the most significant is the presence of strong building codes in Chile and their total absence in Haiti. The financial cost of the Chilean earthquake still represents an unacceptably high percentage of that nation’s gross domestic product, a reminder that life safety is the paramount, but not the only, goal of disaster risk reduction measures. For building codes to be effective, both in terms of lives saved and economic cost, they need to reflect the hazard as accurately as possible. As one of four federal agencies that make up the congressionally mandated National Earthquake Hazards Reduction Program (NEHRP), the U.S. Geological Survey (USGS) develops national seismic hazard maps that form the basis for seismic provisions in model building codes through the Federal Emergency Management Agency and private-sector practitioners. This cooperation is central to NEHRP, which both fosters earthquake research and establishes pathways to translate research results into implementation measures. That translation depends on the ability of hazard-focused scientists to interact and develop mutual trust with risk-focused engineers and planners. Strengthening that interaction is an opportunity for the next generation of earthquake scientists and engineers. In addition to the national maps, the USGS produces more detailed urban seismic hazard maps that communities have used to prioritize retrofits and design critical infrastructure that can withstand large earthquakes. At a regional scale, the USGS and its partners in California have developed a time-dependent earthquake rupture forecast that is being used by the insurance sector, which can serve to distribute risk and foster mitigation if the right incentives are in place. What the USGS and partners are doing at the urban, regional, and national scales, the Global Earthquake Model project is seeking to do for the world. A significant challenge for engaging the public to prepare for earthquakes is making low-probability, high-consequence events real enough to merit personal action. Scenarios help by starting with the hazard posed by a specific earthquake and then exploring the fragility of the built environment, cascading failures, and the real-life consequences for the public. To generate such a complete picture takes multiple disciplines working together. Earthquake scenarios are being used both for emergency management exercises and much broader public preparedness efforts like the Great California ShakeOut, which engaged nearly 7 million people.

  16. Probing Aircraft Flight Test Hazard Mitigation for the Alternative Fuel Effects on Contrails & Cruise Emissions (ACCESS) Research Team

    NASA Technical Reports Server (NTRS)

    Kelly, Michael J.

    2013-01-01

    The Alternative Fuel Effects on Contrails & Cruise Emissions (ACCESS) Project Integration Manager requested in July 2012 that the NASA Engineering and Safety Center (NESC) form a team to independently assess aircraft structural failure hazards associated with the ACCESS experiment and to identify potential flight test hazard mitigations to ensure flight safety. The ACCESS Project Integration Manager subsequently requested that the assessment scope be focused predominantly on structural failure risks to the aircraft empennage raft empennage.

  17. Using Robust Decision Making to Assess and Mitigate the Risks of Natural Hazards in Developing Countries

    NASA Astrophysics Data System (ADS)

    Kalra, N.; Lempert, R. J.; Peyraud, S.

    2012-12-01

    Ho Chi Minh City (HCMC) ranks fourth globally among coastal cities most vulnerable to climate change and already experiences extensive routine flooding. In the coming decades, increased precipitation, rising sea levels, and land subsidence could permanently inundate a large portion of the city's population, place the poor at particular risk, and threaten new economic development in low-lying areas. HCMC is not alone in facing the impacts of natural hazards exacerbated by uncertain future climate change, development, and other deep uncertainties. Assessing and managing these risks is a tremendous challenge, particularly in developing countries which face pervasive shortages of the data and models generally used to plan for such changes. Using HCMC as a case study, this talk will demonstrate how a scenario-based approach that uses robustness as a decision and planning element can help developing countries assess future climate risk and manage the risk of natural disasters. In contrast to traditional approaches which treat uncertainty with a small number of handcrafted scenarios, this talk will emphasize how robust decision making, which uses modeling to explore over thousands of scenarios, can identify potential vulnerabilities to HCMC's emerging flood risk management strategy and suggest potential responses. The talk will highlight several novel features of the collaboration with the HCMC Steering Committee for Flood Control. First, it examines several types of risk -- risk to the poor, risk to the non-poor, and risk to the economy -- and illustrates how management policies have different implications for these sectors. Second, it demonstrates how diverse and sometimes incomplete climate, hydrologic, socioeconomic, GIS, and other data and models can be integrated into a modeling framework to develop and evaluate many scenarios of flood risk. Third, it illustrates the importance of non-structural policies such as land use management and building design to manage flood risk. Finally, it demonstrates how an adaptive management strategy that evolves over time and implements management options in response to new information can more effectively mitigate risks from natural disasters than can static policies.; A scatter plot of risk to the poor and non-poor in 1000 different scenarios under eight different risk management options (differentiated by color).

  18. Evaluation Of Risk And Possible Mitigation Schemes For Previously Unidentified Hazards

    NASA Technical Reports Server (NTRS)

    Linzey, William; McCutchan, Micah; Traskos, Michael; Gilbrech, Richard; Cherney, Robert; Slenski, George; Thomas, Walter, III

    2006-01-01

    This report presents the results of arc track testing conducted to determine if such a transfer of power to un-energized wires is possible and/or likely during an arcing event, and to evaluate an array of protection schemes that may significantly reduce the possibility of such a transfer. The results of these experiments may be useful for determining the level of protection necessary to guard against spurious voltage and current being applied to safety critical circuits. It was not the purpose of these experiments to determine the probability of the initiation of an arc track event only if an initiation did occur could it cause the undesired event: an inadvertent thruster firing. The primary wire insulation used in the Orbiter is aromatic polyimide, or Kapton , a construction known to arc track under certain conditions [3]. Previous Boeing testing has shown that arc tracks can initiate in aromatic polyimide insulated 28 volts direct current (VDC) power circuits using more realistic techniques such as chafing with an aluminum blade (simulating the corner of an avionics box or lip of a wire tray), or vibration of an aluminum plate against a wire bundle [4]. Therefore, an arc initiation technique was chosen that provided a reliable and consistent technique of starting the arc and not a realistic simulation of a scenario on the vehicle. Once an arc is initiated, the current, power and propagation characteristics of the arc depend on the power source, wire gauge and insulation type, circuit protection and series resistance rather than type of initiation. The initiation method employed for these tests was applying an oil and graphite mixture to the ends of a powered twisted pair wire. The flight configuration of the heater circuits, the fuel/oxider (or ox) wire, and the RCS jet solenoid were modeled in the test configuration so that the behavior of these components during an arcing event could be studied. To determine if coil activation would occur with various protection wire schemes, 145 tests were conducted using various fuel/ox wire alternatives (shielded and unshielded) and/or different combinations of polytetrafuloroethylene (PTFE), Mystik tape and convoluted wraps to prevent unwanted coil activation. Test results were evaluated along with other pertinent data and information to develop a mitigation strategy for an inadvertent RCS firing. The SSP evaluated civilian aircraft wiring failures to search for aging trends in assessing the wire-short hazard. Appendix 2 applies Weibull statistical methods to the same data with a similar purpose.

  19. Bulgaria country study address climate change mitigation

    SciTech Connect

    Simeonova, K.

    1996-09-01

    The Bulgaria Country Study to address climate change is a research project that incorporates three major elements: inventory of greenhouse gases (GHG), assessment of vulnerability and adaptation to climate change, and mitigation analysis. The mitigation analysis is the central part of the study. Based on the assumptions of the socioeconomic development of the country up to the year 2020, it allows identification of policies and measures that may lead to the stabilization of GHG emissions, as required by the United Nations Framework Convention on Climate Change (FCCC). Bulgaria signed the FCCC in 1992, ratified it in 1995, and is now undertaking measures related to its implementation. At present Bulgaria is in the process of transition from a centrally planned economy to a market driven economy. This process is characterized by basic economic structural changes, freeing of prices (with only energy prices still controlled by the government), a drastic drop in GDP by 27% in 1992 compared to 1988, and reduction of energy consumption by 44% in 1992 compared to 1988. The analysis of the mitigation measures and their impact on the future development of economy and on the energy sector is a very complicated task. This paper will focus on the choice of methodology, some basic assumptions, and results of the study to date. 9 refs., 13 figs.

  20. Web-Based Geospatial Tools to Address Hazard Mitigation, Natural Resource Management, and Other Societal Issues

    USGS Publications Warehouse

    Hearn,, Paul P., Jr.

    2009-01-01

    Federal, State, and local government agencies in the United States face a broad range of issues on a daily basis. Among these are natural hazard mitigation, homeland security, emergency response, economic and community development, water supply, and health and safety services. The U.S. Geological Survey (USGS) helps decision makers address these issues by providing natural hazard assessments, information on energy, mineral, water and biological resources, maps, and other geospatial information. Increasingly, decision makers at all levels are challenged not by the lack of information, but by the absence of effective tools to synthesize the large volume of data available, and to utilize the data to frame policy options in a straightforward and understandable manner. While geographic information system (GIS) technology has been widely applied to this end, systems with the necessary analytical power have been usable only by trained operators. The USGS is addressing the need for more accessible, manageable data tools by developing a suite of Web-based geospatial applications that will incorporate USGS and cooperating partner data into the decision making process for a variety of critical issues. Examples of Web-based geospatial tools being used to address societal issues follow.

  1. Volcano Hazard Tracking and Disaster Risk Mitigation: A Detailed Gap Analysis from Data-Collection to User Implementation

    NASA Astrophysics Data System (ADS)

    Faied, D.; Sanchez, A.

    2009-04-01

    Volcano Hazard Tracking and Disaster Risk Mitigation: A Detailed Gap Analysis from Data-Collection to User Implementation Dohy Faied, Aurora Sanchez (on behalf of SSP08 VAPOR Project Team) Dohy.Faied@masters.isunet.edu While numerous global initiatives exist to address the potential hazards posed by volcanic eruption events and assess impacts from a civil security viewpoint, there does not yet exist a single, unified, international system of early warning and hazard tracking for eruptions. Numerous gaps exist in the risk reduction cycle, from data collection, to data processing, and finally dissemination of salient information to relevant parties. As part of the 2008 International Space University's Space Studies Program, a detailed gap analysis of the state of volcano disaster risk reduction was undertaken, and this paper presents the principal results. This gap analysis considered current sensor technologies, data processing algorithms, and utilization of data products by various international organizations. Recommendations for strategies to minimize or eliminate certain gaps are also provided. In the effort to address the gaps, a framework evolved at system level. This framework, known as VIDA, is a tool to develop user requirements for civil security in hazardous contexts, and a candidate system concept for a detailed design phase. VIDA also offers substantial educational potential: the framework includes a centralized clearinghouse for volcanology data which could support education at a variety of levels. Basic geophysical data, satellite maps, and raw sensor data are combined and accessible in a way that allows the relationships between these data types to be explored and used in a training environment. Such a resource naturally lends itself to research efforts in the subject but also research in operational tools, system architecture, and human/machine interaction in civil protection or emergency scenarios.

  2. Field Guide for Testing Existing Photovoltaic Systems for Ground Faults and Installing Equipment to Mitigate Fire Hazards

    SciTech Connect

    Brooks, William; Basso, Thomas; Coddington, Michael

    2015-10-01

    Ground faults and arc faults are the two most common reasons for fires in photovoltaic (PV) arrays and methods exist that can mitigate the hazards. This report provides field procedures for testing PV arrays for ground faults, and for implementing high resolution ground fault and arc fault detectors in existing and new PV system designs.

  3. The Identification of Filters and Interdependencies for Effective Resource Allocation: Coupling the Mitigation of Natural Hazards to Economic Development.

    NASA Astrophysics Data System (ADS)

    Agar, S. M.; Kunreuther, H.

    2005-12-01

    Policy formulation for the mitigation and management of risks posed by natural hazards requires that governments confront difficult decisions for resource allocation and be able to justify their spending. Governments also need to recognize when spending offers little improvement and the circumstances in which relatively small amounts of spending can make substantial differences. Because natural hazards can have detrimental impacts on local and regional economies, patterns of economic development can also be affected by spending decisions for disaster mitigation. This paper argues that by mapping interdependencies among physical, social and economic factors, governments can improve resource allocation to mitigate the risks of natural hazards while improving economic development on local and regional scales. Case studies of natural hazards in Turkey have been used to explore specific "filters" that act to modify short- and long-term outcomes. Pre-event filters can prevent an event from becoming a natural disaster or change a routine event into a disaster. Post-event filters affect both short and long-term recovery and development. Some filters cannot be easily modified by spending (e.g., rural-urban migration) but others (e.g., land-use practices) provide realistic spending targets. Net social benefits derived from spending, however, will also depend on the ways by which filters are linked, or so-called "interdependencies". A single weak link in an interdependent system, such as a power grid, can trigger a cascade of failures. Similarly, weak links in social and commercial networks can send waves of disruption through communities. Conversely, by understanding the positive impacts of interdependencies, spending can be targeted to maximize net social benefits while mitigating risks and improving economic development. Detailed information on public spending was not available for this study but case studies illustrate how networks of interdependent filters can modify social benefits and costs. For example, spending after the 1992 Erzincan earthquake targeted local businesses but limited alternative employment, labor losses and diminished local markets all contributed to economic stagnation. Spending after the 1995 Dinar earthquake provided rent subsidies, supporting a major exodus from the town. Consequently many local people were excluded from reconstruction decisions and benefits offered by reconstruction funds. After the 1999 Marmara earthquakes, a 3-year economic decline in Yalova illustrates the vulnerability of local economic stability to weak regulation enforcement by a few agents. A resource allocation framework indicates that government-community relations, lack of economic diversification, beliefs, and compensation are weak links for effective spending. Stronger positive benefits could be achieved through spending to target land-use regulation enforcement, labor losses, time-critical needs of small businesses, and infrastructure. While the impacts of the Marmara earthquakes were devastating, strong commercial networks and international interests helped to re-establish the regional economy. Interdependencies may have helped to drive a recovery. Smaller events in eastern Turkey, however, can wipe out entire communities and can have long-lasting impacts on economic development. These differences may accelerate rural to urban migration and perpetuate regional economic divergence in the country. 1: Research performed in the Wharton MBA Program, Univ. of Pennsylvania.

  4. Challenges in understanding, modelling, and mitigating Lake Outburst Flood Hazard: experiences from Central Asia

    NASA Astrophysics Data System (ADS)

    Mergili, Martin; Schneider, Demian; Andres, Norina; Worni, Raphael; Gruber, Fabian; Schneider, Jean F.

    2010-05-01

    Lake Outburst Floods can evolve from complex process chains like avalanches of rock or ice that produce flood waves in a lake which may overtop and eventually breach glacial, morainic, landslide, or artificial dams. Rising lake levels can lead to progressive incision and destabilization of a dam, to enhanced ground water flow (piping), or even to hydrostatic failure of ice dams which can cause sudden outflow of accumulated water. These events often have a highly destructive potential because a large amount of water is released in a short time, with a high capacity to erode loose debris, leading to a powerful debris flow with a long travel distance. The best-known example of a lake outburst flood is the Vajont event (Northern Italy, 1963), where a landslide rushed into an artificial lake which spilled over and caused a flood leading to almost 2000 fatalities. Hazards from the failure of landslide dams are often (not always) fairly manageable: most breaches occur in the first few days or weeks after the landslide event and the rapid construction of a spillway - though problematic - has solved some hazardous situations (e.g. in the case of Hattian landslide in 2005 in Pakistan). Older dams, like Usoi dam (Lake Sarez) in Tajikistan, are usually fairly stable, though landsildes into the lakes may create floodwaves overtopping and eventually weakening the dams. The analysis and the mitigation of glacial lake outburst flood (GLOF) hazard remains a challenge. A number of GLOFs resulting in fatalities and severe damage have occurred during the previous decades, particularly in the Himalayas and in the mountains of Central Asia (Pamir, Tien Shan). The source area is usually far away from the area of impact and events occur at very long intervals or as singularities, so that the population at risk is usually not prepared. Even though potentially hazardous lakes can be identified relatively easily with remote sensing and field work, modeling and predicting of GLOFs (and also the outburst of landslide-dammed lakes) remains a challenge: • The knowledge about the onset of the process is often limited (bathymetry of the lakes, subsurface water, properties of dam (content of ice), type of dam breach, understanding of process chains and interactions). • The size of glacial lakes may change rapidly but continuously, and many lakes break out within a short time after their development. Continuous monitoring is therefore required to keep updated on the existing hazards. • Also the outburst of small glacial lakes may lead to significant debris floods or even debris flows if there is plenty of erodible material available. • The available modeling software packages are of limited suitability for lake outburst floods: e.g. software developed by the hydrological community is specialized to simulate (debris) floods with input hydrographs on moderately steep flow channels and with lower sediment loads. In contrast to this, programs for rapid mass movements are better suited on steeper slopes and sudden onset of the movement. The typical characteristics of GLOFs are in between and vary for different channel sections. In summary, the major bottlenecks remain in deriving realistic or worst case scenarios and predicting their magnitude and area of impact. This mainly concerns uncertainties in the dam break process, involved volumes, erosion rates, changing rheologies, and the limited capabilities of available software packages to simulate process interactions and transformations such as the development of a hyperconcentrated flow into a debris flow. In addition, many areas prone to lake outburst floods are located in developing countries with a limited scope of the threatened population for decision-making and limited resources for mitigation.

  5. Environmental legislation as the legal framework for mitigating natural hazards in Spain

    NASA Astrophysics Data System (ADS)

    Garrido, Jesús; Arana, Estanislao; Jiménez Soto, Ignacio; Delgado, José

    2015-04-01

    In Spain, the socioeconomic losses due to natural hazards (floods, earthquakes or landslides) are considerable, and the indirect costs associated with them are rarely considered because they are very difficult to evaluate. The prevention of losses due to natural hazards is more economic and efficient through legislation and spatial planning rather than through structural measures, such as walls, anchorages or structural reinforcements. However, there isn't a Spanish natural hazards law and national and regional sector legislation make only sparse mention of them. After 1978, when the Spanish Constitution was enacted, the Autonomous Communities (Spanish regions) were able to legislate according to the different competences (urban planning, environment or civil protection), which were established in the Constitution. In the 1990's, the Civil Protection legislation (national law and regional civil protection tools) dealt specifically with natural hazards (floods, earthquakes and volcanoes), but this was before any soil, seismic or hydrological studies were recommended in the national sector legislation. On the other hand, some Autonomous Communities referred to natural hazards in the Environmental Impact Assessment legislation (EIA) and also in the spatial and urban planning legislation and tools. The National Land Act, enacted in 1998, established, for the first time, that those lands exposed to natural hazards should be classified as non-developable. The Spanish recast text of the Land Act, enacted by Royal Legislative Decree 2/2008, requires that a natural hazards map be included in the Environmental Sustainability Report (ESR), which is compulsory for all master plans, according to the provisions set out by Act 9/2006, known as Spanish Strategic Environmental Assessment (SEA). Consequently, the environmental legislation, after the aforementioned transposition of the SEA European Directive 2001/42/EC, is the legal framework to prevent losses due to natural hazards through land use planning. However, most of the Spanish master plans approved after 2008 don't include a natural hazards map or/and don't classify the areas exposed to natural hazards as non-developable. Restrictions or prohibitions for building in natural hazard-prone areas are not usually established in the master plans. According to the jurisprudence, the environmental legislation prevails over spatial and urban planning regulations. On the other hand, the precedence of the national competence in public security would allow reclassification or the land, independently of the political or economic motivations of the municipal government. Despite of the technical building code or the seismic building code where some recommendations for avoiding "geotechnical" or seismic hazards are established, there are not compulsory guidelines to do technical studies/hazard maps for floods or landslides. The current legislation should be improved, under a technical point of view, and some mechanisms for enforcing the law should be also considered.

  6. The Effective Organization and Use of Data in Bridging the Hazard Mitigation-Climate Change Adaptation Divide (Invited)

    NASA Astrophysics Data System (ADS)

    Smith, G. P.; Fox, J.; Shuford, S.

    2010-12-01

    The costs associated with managing natural hazards and disasters continue to rise in the US and elsewhere. Many climate change impacts are manifested in stronger or more frequent natural hazards such as floods, wildfire, hurricanes and typhoons, droughts, and heat waves. Despite this common problem, the climate change adaptation and hazards management communities have largely failed to acknowledge each other’s work in reducing hazard impacts. This is even reflected in the language that each community uses; for example, the hazards management community refers to hazard risk reduction as mitigation while the climate change community refers to it as adaptation. In order to help bridge this divide, we suggest each community utilize data in a more formally-organized and effective manner based on four principles: 1. The scale of the data must reflect the needs of the decision maker. In most cases, decision makers’ needs are most effectively met through the development of a multiple alternatives that takes into account a variety of possible impacts. 2. Investments intended to reduce vulnerability and increase resilience should be driven by the wise use of available data using a “risk-based” strategy. 3. Climate change adaptation and hazard mitigation strategies must be integrated with other value drivers when building resiliency. Development and use of data that underscore the concept of “no regrets” risk reduction can be used to accomplish this aim. 4. The use of common data is critical in building a bridge between the climate change adaptation and hazards management communities. We will explore how the creation of data repositories that collect, analyze, display and archive hazards and disaster data can help address the challenges posed by the current and hazards management and climate change adaptation divide.

  7. New Multi-HAzard and MulTi-RIsk Assessment MethodS for Europe (MATRIX): A research program towards mitigating multiple hazards and risks in Europe

    NASA Astrophysics Data System (ADS)

    Fleming, K. M.; Zschau, J.; Gasparini, P.; Modaressi, H.; Matrix Consortium

    2011-12-01

    Scientists, engineers, civil protection and disaster managers typically treat natural hazards and risks individually. This leads to the situation where the frequent causal relationships between the different hazards and risks, e.g., earthquakes and volcanos, or floods and landslides, are ignored. Such an oversight may potentially lead to inefficient mitigation planning. As part of their efforts to confront this issue, the European Union, under its FP7 program, is supporting the New Multi-HAzard and MulTi-RIsK Assessment MethodS for Europe or MATRIX project. The focus of MATRIX is on natural hazards, in particular earthquakes, landslides, volcanos, wild fires, storms and fluvial and coastal flooding. MATRIX will endeavour to develop methods and tools to tackle multi-type natural hazards and risks within a common framework, focusing on methodologies that are suited to the European context. The work will involve an assessment of current single-type hazard and risk assessment methodologies, including a comparison and quantification of uncertainties and harmonization of single-type methods, examining the consequence of cascade effects within a multi-hazard environment, time-dependent vulnerability, decision making and support for multi-hazard mitigation and adaption, and a series of test cases. Three test sites are being used to assess the methods developed within the project (Naples, Cologne, and the French West Indies), as well as a "virtual city" based on a comprehensive IT platform that will allow scenarios not represented by the test cases to be examined. In addition, a comprehensive dissemination program that will involve national platforms for disaster management, as well as various outreach activities, will be undertaken. The MATRIX consortium consists of ten research institutions (nine European and one Canadian), an end-user (i.e., one of the European national platforms for disaster reduction) and a partner from industry.

  8. A review of accidents, prevention and mitigation options related to hazardous gases

    SciTech Connect

    Fthenakis, V.M.

    1993-05-01

    Statistics on industrial accidents are incomplete due to lack of specific criteria on what constitutes a release or accident. In this country, most major industrial accidents were related to explosions and fires of flammable materials, not to releases of chemicals into the environment. The EPA in a study of 6,928 accidental releases of toxic chemicals revealed that accidents at stationary facilities accounted for 75% of the total number of releases, and transportation accidents for the other 25%. About 7% of all reported accidents (468 cases) resulted in 138 deaths and 4,717 injuries ranging from temporary respiratory problems to critical injuries. In-plant accidents accounted for 65% of the casualties. The most efficient strategy to reduce hazards is to choose technologies which do not require the use of large quantities of hazardous gases. For new technologies this approach can be implemented early in development, before large financial resources and efforts are committed to specific options. Once specific materials and options have been selected, strategies to prevent accident initiating events need to be evaluated and implemented. The next step is to implement safety options which suppress a hazard when an accident initiating event occurs. Releases can be prevented or reduced with fail-safe equipment and valves, adequate warning systems and controls to reduce and interrupt gas leakage. If an accident occurs and safety systems fail to contain a hazardous gas release, then engineering control systems will be relied on to reduce/minimize environmental releases. As a final defensive barrier, the prevention of human exposure is needed if a hazardous gas is released, in spite of previous strategies. Prevention of consequences forms the final defensive barrier. Medical facilities close by that can accommodate victims of the worst accident can reduce the consequences of personnel exposure to hazardous gases.

  9. Volcanic hazard in Mexico: a comprehensive on-line database for risk mitigation

    NASA Astrophysics Data System (ADS)

    Manea, Marina; Constantin Manea, Vlad; Capra, Lucia; Bonasia, Rosanna

    2013-04-01

    Researchers are currently working on several key aspects of the Mexican volcanoes, such as remote sensing, field data of old and recent volcaniclastic deposits, structural framework, monitoring (rainfall data and visual observation of lahars), and laboratory experiment (analogue models and numerical simulations - fall3D, titan2D). Each investigation is focused on specific processes, but it is fundamental to visualize the global status of the volcano in order to understand its behavior and to mitigate future hazards. The Mexican Volcanoes @nline represents a novel initiative aimed to collect, on a systematic basis, the complete set of data obtained so far on the volcanoes, and to continuously update the database with new data. All the information is compiled from published works and updated frequently. Maps, such as the geological map of the Mexican volcanos and the associated hazard zonation, as well as point data, such as stratigraphic sections, sedimentology and diagrams of rainfall intensities, are presented in Google Earth format in order to be easily accessed by the scientific community and the general public. An important section of this online database is the presentation of numerical simulations results for ash dispersion associated with the principal Mexican active volcanoes. Daily prediction of ash flow dispersion (based on real-time data from CENAPRED and the Mexican Meteorological Service), as well as large-scale high-resolution subduction simulations performed on HORUS (the Computational Geodynamics Laboratory's supercomputer) represent a central part of the Mexican Volcanos @nline database. The Mexican Volcanoes @nline database is maintained by the Computational Geodynamics Laboratory and it is based entirely on Open Source software. The website can be visited at: http://www.geociencias.unam.mx/mexican_volcanoes.

  10. Improving Tsunami Hazard Mitigation and Preparedness Using Real-Time and Post-Tsunami Field Data

    NASA Astrophysics Data System (ADS)

    Wilson, R. I.; Miller, K. M.

    2012-12-01

    The February 27, 2010 Chile and March 11, 2011 Japan tsunamis caused dramatic loss of life and damage in the near-source region, and notable impacts in distant coastal regions like California. Comprehensive real-time and post-tsunami field surveys and the availability of hundreds of videos within harbors and marinas allow for detailed documentation of these two events by the State of California Tsunami Program, which receives funding through the National Tsunami Hazard Mitigation Program. Although neither event caused significant inundation of dry land in California, dozens of harbors sustained damage totaling nearly $100-million. Information gathered from these events has guided new strategies in tsunami evacuation planning and maritime preparedness. Scenario-specific, tsunami evacuation "playbook" maps and guidance are being produced detailing inundation from tsunamis of various size and source location. These products help coastal emergency managers prepare local response plans when minor distant source tsunamis or larger tsunamis from local and regional sources are generated. In maritime communities, evaluation of strong tsunami currents and damage are being used to validate/calibrate numerical tsunami model currents and produce in-harbor hazard maps and identify offshore safety zones for potential boat evacuation when a tsunami Warning is issued for a distant source event. Real-time and post-tsunami field teams have been expanded to capture additional detailed information that can be shared in a more timely manner during and after an event through a state-wide clearinghouse. These new products and related efforts will result in more accurate and efficient emergency response by coastal communities, potentially reducing the loss of lives and property during future tsunamis.

  11. Integrated Tsunami Data Supports Forecast, Warning, Research, Hazard Assessment, and Mitigation (Invited)

    NASA Astrophysics Data System (ADS)

    Dunbar, P. K.; Stroker, K. J.

    2009-12-01

    With nearly 230,000 fatalities, the 26 December 2004 Indian Ocean tsunami was the deadliest tsunami in history, illustrating the importance of developing basinwide warning systems. Key to creating these systems is easy access to quality-controlled, verified data on past tsunamis. It is essential that warning centers, emergency managers, and modelers can determine if and when similar events have occurred. Following the 2004 tsunami, the National Oceanic and Atmospheric Administration’s (NOAA) National Geophysical Data Center (NGDC) began examining all aspects of the tsunami data archive to help answer questions regarding the frequency and severity of past tsunamis. Historical databases span insufficient time to reveal a region’s full tsunami hazard, so a global database of citations to articles on tsunami deposits was added to the archive. NGDC further expanded the archive to include high-resolution tide gauge data, deep-ocean sensor data, and digital elevation models used for propagation and inundation modeling. NGDC continuously reviews the data for accuracy, making modifications as new information is obtained. These added databases allow NGDC to provide the tsunami data necessary for warning guidance, hazard assessments, and mitigation efforts. NGDC is also at the forefront of standards-based Web delivery of integrated science data through a variety of tools, from Web-form interfaces to interactive maps. The majority of the data in the tsunami archive are discoverable online. Scientists, journalists, educators, planners, and emergency managers are among the many users of these public domain data, which may be used without restriction provided that users cite data sources.

  12. Physical Prototype Development for the Real-Time Detection and Mitigation of Hazardous Releases into a Flow System

    NASA Astrophysics Data System (ADS)

    Rimer, Sara; Katopodes, Nikolaos

    2013-11-01

    The threat of accidental or deliberate toxic chemicals released into public spaces is a significant concern to public safety. The real-time detection and mitigation of such hazardous contaminants has the potential to minimize harm and save lives. In this study, we demonstrate the feasibility of feedback control of a hazardous contaminant by means of a laboratory-scale physical prototype integrated with a previously-developed robust predictive control numerical model. The physical prototype is designed to imitate a public space characterized by a long conduit with an ambient flow (e.g. airport terminal). Unidirectional air flows through a 24-foot long duct. The ``contaminant'' plume of propylene glycol smoke is released into the duct. Camera sensors are used to visually measure concentration of the plume. A pneumatic system is utilized to localize the contaminant via air curtains, and draw it out via vacuum nozzles. The control prescribed to the pneumatic system is based on the numerical model. NSF-CMMI 0856438.

  13. Mitigation of EMU Glove Cut Hazard by MMOD Impact Craters on Exposed ISS Handrails

    NASA Technical Reports Server (NTRS)

    Christiansen, Eric L.; Ryan, Shannon

    2009-01-01

    Recent cut damages to crewmember extravehicular mobility unit (EMU) gloves during extravehicular activity (EVA) onboard the International Space Station (ISS) has been found to result from contact with sharp edges or pinch points rather than general wear or abrasion. One possible source of cut-hazards are protruding sharp edged crater lips from impact of micrometeoroid and orbital debris (MMOD) particles on external metallic handrails along EVA translation paths. During impact of MMOD particles at hypervelocity an evacuation flow develops behind the shock wave, resulting in the formation of crater lips that can protrude above the target surface. In this study, two methods were evaluated to limit EMU glove cut-hazards due to MMOD impact craters. In the first phase, four flexible overwrap configurations are evaluated: a felt-reusable surface insulation (FRSI), polyurethane polyether foam with beta-cloth cover, double-layer polyurethane polyether foam with beta-cloth cover, and multi-layer beta-cloth with intermediate Dacron netting spacers. These overwraps are suitable for retrofitting ground equipment that has yet to be flown, and are not intended to protect the handrail from impact of MMOD particles, rather to act as a spacer between hazardous impact profiles and crewmember gloves. At the impact conditions considered, all four overwrap configurations evaluated were effective in limiting contact between EMU gloves and impact crater profiles. The multi-layer beta-cloth configuration was the most effective in reducing the height of potentially hazardous profiles in handrail-representative targets. In the second phase of the study, four material alternatives to current aluminum and stainless steel alloys were evaluated: a metal matrix composite, carbon fiber reinforced plastic (CFRP), fiberglass, and a fiber metal laminate. Alternative material handrails are intended to prevent the formation of hazardous damage profiles during MMOD impact and are suitable for flight hardware yet to be constructed. Of the four materials evaluated, only the fiberglass formed a less hazardous damage profile than the baseline metallic target. Although the CFRP laminate did not form any noticeable crater lip, brittle protruding fibers are considered a puncture risk. In parallel with EMU glove redesign efforts, modifications to metallic ISS handrails such as those evaluated in this study provide the means to significantly reduce cut-hazards from MMOD impact craters.

  14. Earthquake Hazard Mitigation and Real-Time Warnings of Tsunamis and Earthquakes

    NASA Astrophysics Data System (ADS)

    Kanamori, Hiroo

    2015-09-01

    With better understanding of earthquake physics and the advent of broadband seismology and GPS, seismologists can forecast the future activity of large earthquakes on a sound scientific basis. Such forecasts are critically important for long-term hazard mitigation, but because stochastic fracture processes are complex, the forecasts are inevitably subject to large uncertainties, and unexpected events will inevitably occur. Recent developments in real-time seismology helps seismologists cope with and prepare for such unexpected events, including tsunamis and earthquakes. For a tsunami warning, the required warning time is fairly long (usually 5 min or longer) and enables use of a rigorous method for this purpose. Significant advances have already been made. In contrast, early warning of earthquakes is far more challenging because the required warning time is very short (as short as three seconds). Despite this difficulty the methods used for regional warnings have advanced substantially, and several systems have been already developed and implemented. A future strategy for more challenging, rapid (a few second) warnings, which are critically important for saving properties and lives, is discussed.

  15. Health hazards and mitigation of chronic poisoning from arsenic in drinking water: Taiwan experiences.

    PubMed

    Chen, Chien-Jen

    2014-01-01

    There are two endemic areas of long-term exposure to arsenic from drinking water in Taiwan. Residents in the southwestern and northeastern endemic areas started using high-arsenic artesian well water in the early 1910s and late 1940s, respectively. Public water supply system using surface water was implemented in southwestern and northeastern endemic areas in the 1970s and 1990s, respectively. Systemic health hazards of long-term exposure to arsenic in drinking water have been intensively investigated since the 1960s, especially after 1985 in Taiwan. Several diseases have been well documented to be associated with chronic arsenic poisoning from drinking water showing a dose-response relation. They include characteristic skin lesions like hyperpigmentation or depigmentation, hyperkeratosis in palms and soles, and Bowen disease, peripheral vascular disease (specifically blackfoot disease), ischemic heart disease, cerebral infarction, microvascular diseases, abnormal peripheral microcirculation, carotid atherosclerosis, QT prolongation and increased dispersion in electrocardiography, hypertension, goiter, diabetes mellitus, cataract (specifically posterior subcapsular lens opacity), pterygium, slow neural conduction, retarded neurobehavioral development, erectile dysfunction, and cancers of the skin, lung, urinary bladder, kidney, and liver. The method of choice to mitigate arsenic poisoning through drinking water is to use safe drinking water from uncontaminated sources. PMID:24552958

  16. California Real Time Network: Test Bed for Mitigation of Geological and Atmospheric Hazards within a Modern Data Portal Environment

    NASA Astrophysics Data System (ADS)

    Bock, Y.

    2008-12-01

    Global geological and atmospheric hazards such as earthquakes, volcanoes, tsunamis, landslides, storms and floods continue to wreak havoc on the lives of millions of people worldwide. High precision geodetic observations of surface displacements and atmospheric water vapor are indispensable tools in studying natural hazards along side more traditional seismic and atmospheric measurements. The rapid proliferation of dense in situ GPS networks for crustal deformation studies such as the Earthscope Plate Boundary Observatory provides us with unique data sets. However, the full information content and timeliness of these observations have not been fully developed, in particular at higher frequencies than traditional daily continuous GPS position time series. Nor have scientists taken full advantage of the complementary nature of space-based and in situ observations in forecasting, assessing and mitigating natural hazards. The primary operating mode for in situ GPS networks has been daily download of GPS data sampled at a 15-30 s sample rate, and the production of daily position time series or hourly tropospheric zenith delay estimates. However, as continuous GPS networks are being upgraded to provide even higher-frequency information approaching the sampling rates (1-50 Hz) of modern GPS receivers, and with a latency of less than 1 second, new data processing approaches are being developed. Low-latency high-rate measurements are being applied to earthquake source modeling, early warning of natural hazards (geological and atmospheric), and structural monitoring. Since 2002, more than 80 CGPS stations in southern California have been upgraded to a 1 Hz sample rate, including stations from the SCIGN and PBO networks, and several large earthquakes have been recorded. The upgraded stations comprise the California Real Time Network (CRTN - http://sopac.ucsd.edu/projects/realtime/). This prototype network provides continuous 1 Hz (upgradable to 10 Hz at some stations) GPS relative displacements and troposphere delay estimates with a latency of less than 1 s. Steps are being taken to tie these higher-order data products to the global (ITRF) reference frame, and discussions are underway to extend this capability throughout California by upgrading more UNAVCO/PBO stations. With funding from NASA, CRTN provides a test bed for developing advanced in situ- based observation systems within a modern data portal environment, which can be extended seamlessly to the entire PBO region and to other plate boundaries. I describe a prototype early warning system for earthquakes using CRTN, which is also being deployed at other plate boundaries. I show how researchers can access advanced observations of displacements and troposphere delay in real-time and replay significant events within the GPS Explorer data portal and Geophysical Resource Web Services (GRWS) environment.

  17. Past, Present, and Future Challenges in Earthquake Hazard Mitigation of Indonesia: A Collaborative Work of Geological Agency Indonesia and Geoscience Australia

    NASA Astrophysics Data System (ADS)

    Hidayati, S.; Cummins, P. R.; Cipta, A.; Omang, A.; Griffin, J.; Horspool, N.; Robiana, R.; Sulaeman, C.

    2012-12-01

    In the last decade, Indonesia has suffered from earthquakes disaster since four out of twelve of the world's large earthquakes with more than 1000 causalities occurred in Indonesia. The great Sumatra earthquake of December 26, 2004 followed by tsunami which cost 227,898 of lives has brought Indonesia and its active tectonic setting to the world's attention. Therefore the government of Indonesia encourages hazard mitigation efforts that are more focused on the pre-disaster phase. In response to government policy in earthquake disaster mitigation, Geological Agency Indonesia attempts to meet the need for rigorous earthquake hazard map throughout the country in provincial scale in 2014. A collaborative work with Geoscience Australia through short-term training missions; on-going training, mentoring, assistance and studying in Australia, under the auspices of Australia-Indonesia Facility for Disaster Reduction (AIFDR) have accelerated the execution of these maps. Since 2010 to date of collaboration, by using probabilistic seismic hazard assessment (PSHA) method, provincial earthquake hazard maps of Central Java (2010), West Sulawesi, Gorontalo, and North Maluku (2011) have been published. In 2012, by the same method, the remaining provinces of Sulawesi Island, Papua, North Sumatera and Jambi will be published. In the end of 2014, all 33 Indonesian provinces hazard maps will be delivered. The future challenges are to work together with the stakeholders, to produce district scale maps and establish a national standard for earthquake hazard maps. Moreover, the most important consideration is to build the capacity to update, maintain and revise the maps as recent information available.

  18. PREDICTION/MITIGATION OF SUBSIDENCE DAMAGE TO HAZARDOUS WASTE LANDFILL COVERS

    EPA Science Inventory

    Characteristics of Resource Conservation and Recovery Act hazardous waste landfills and of landfilled hazardous wastes have been described to permit development of models and other analytical techniques for predicting, reducing, and preventing landfill settlement and related cove...

  19. Geo hazard studies and their policy implications in Nicaragua

    NASA Astrophysics Data System (ADS)

    Strauch, W.

    2007-05-01

    Nicaragua, situated at the Central American Subduction zone and placed in the trajectory of tropical storms and hurricanes, is a frequent showplace of natural disasters which have multiplied the negative effects of a long term socioeconomic crisis leaving Nicaragua currently as the second poorest country of the Americas. In the last years, multiple efforts were undertaken to prevent or mitigate the affectation of the natural phenomena to the country. National and local authorities have become more involved in disaster prevention policy and international cooperation boosted funding for disaster prevention and mitigation measures in the country. The National Geosciences Institution (INETER) in cooperation with foreign partners developed a national monitoring and early warning system on geological and hydro-meteorological phenomena. Geological and risk mapping projects were conducted by INETER and international partners. Universities, NGO´s, International Technical Assistance, and foreign scientific groups cooperated to capacitate Nicaraguan geoscientists and to improve higher education on disaster prevention up to the master degree. Funded by a World Bank loan, coordinated by the National System for Disaster Prevention, Mitigation and Attention (SINAPRED) and scientifically supervised by INETER, multidisciplinary hazard and vulnerability studies were carried out between 2003 and 2005 with emphasis on seismic hazard. These GIS based works provided proposals for land use policies on a local level in 30 municipalities and seismic vulnerability and risk information for each single building in Managua, Capital of Nicaragua. Another large multidisciplinary project produced high resolution air photos, elaborated 1:50,000 vectorized topographic maps, and a digital elevation model for Western Nicaragua. These data, integrated in GIS, were used to assess: 1) Seismic Hazard for Metropolitan Managua; 2) Tsunami hazard for the Pacific coast; 3) Volcano hazard for Telica-Cerro Negro and El Hoyo volcanoes; and 4) Flood hazard map of Maravilla river. This study was realized between 2004 and 2006, through technical cooperation of Japan International Cooperation Agency with INETER, upon the request of the Government of Nicaragua. The results of the mapping and investigations are fed in a National GIS on Geohazards maintained by INETER and developed in the frame of a regional cooperation project with BGR, Germany, and other international institutions. Many maps, project reports and GIS coverage are made available on INETER´s Website to the general public. (www.ineter.gob.ni/geofisica/geofisica.html ).

  20. Catastrophic debris flows transformed from landslides in volcanic terrains : mobility, hazard assessment and mitigation strategies

    USGS Publications Warehouse

    Scott, Kevin M.; Macias, Jose Luis; Naranjo, Jose Antonio; Rodriguez, Sergio; McGeehin, John P.

    2001-01-01

    Communities in lowlands near volcanoes are vulnerable to significant volcanic flow hazards in addition to those associated directly with eruptions. The largest such risk is from debris flows beginning as volcanic landslides, with the potential to travel over 100 kilometers. Stratovolcanic edifices commonly are hydrothermal aquifers composed of unstable, altered rock forming steep slopes at high altitudes, and the terrain surrounding them is commonly mantled by readily mobilized, weathered airfall and ashflow deposits. We propose that volcano hazard assessments integrate the potential for unanticipated debris flows with, at active volcanoes, the greater but more predictable potential of magmatically triggered flows. This proposal reinforces the already powerful arguments for minimizing populations in potential flow pathways below both active and selected inactive volcanoes. It also addresses the potential for volcano flank collapse to occur with instability early in a magmatic episode, as well as the 'false-alarm problem'-the difficulty in evacuating the potential paths of these large mobile flows. Debris flows that transform from volcanic landslides, characterized by cohesive (muddy) deposits, create risk comparable to that of their syneruptive counterparts of snow and ice-melt origin, which yield noncohesive (granular) deposits, because: (1) Volcano collapses and the failures of airfall- and ashflow-mantled slopes commonly yield highly mobile debris flows as well as debris avalanches with limited runout potential. Runout potential of debris flows may increase several fold as their volumes enlarge beyond volcanoes through bulking (entrainment) of sediment. Through this mechanism, the runouts of even relatively small collapses at Cascade Range volcanoes, in the range of 0.1 to 0.2 cubic kilometers, can extend to populated lowlands. (2) Collapse is caused by a variety of triggers: tectonic and volcanic earthquakes, gravitational failure, hydrovolcanism, and precipitation, as well as magmatic activity and eruptions. (3) Risk of collapse begins with initial magmatic activity and increases as intrusion proceeds. An archetypal debris flow from volcanic terrain occurred in Colombia with a tectonic earthquake (M 6.4) in 1994. The Rio Piez conveyed a catastrophic wave of debris flow over 100 kilometers, coalesced from multiple slides of surflcial material weakened both by weathering and by hydrothermal alteration in a large strato- volcano. Similar seismogenic flows occurred in Mexico in 1920 (M -6.5), Chile in 1960 (M 9.2), and Ecuador in 1987 (M 6.1 and 6.9). Velocities of wave fronts in two examples were 60 to 90 km/hr (17-25 meters per second) over the initial 30 kilometers. Volcano flank and sector collapses may produce untransformed debris avalanches, as occurred initially at Mount St. Helens in 1980. However, at least as common is direct transformation of the failed mass to a debris flow. At two other volcanoes in the Cascade Range-- Mount Rainier and Mount Baker--rapid transformation and high mobility were typical of most of at least 15 Holocene flows. This danger exists downstream from many stratovolcanoes worldwide; the population at risk is near 150,000 and increasing at Mount Rainier. The first step in preventing future catastrophes is documenting past flows. Deposits of some debris flows, however, can be mistaken for those of less-mobile debris avalanches on the basis of mounds formed by buoyed megaclasts. Megaclasts may record only the proximal phase of a debris flow that began as a debris avalanche. Runout may have extended much farther, and thus furore flow mobility may be underestimated. Processes and behaviors of megaclast-bearing paleoflows are best inferred from the intermegaclast matrix. Mitigation strategy can respond to volcanic flows regardless of type and trigger by: (1) Avoidance: Limit settlement in flow pathways to numbers that can be evacuated after event warnings (flow is occurring). (2) Instrumental even

  1. Seismicity and seismotectonics of southern Ghana: lessons for seismic hazard mitigation

    NASA Astrophysics Data System (ADS)

    Amponsah, Paulina

    2014-05-01

    Ghana is located on the West African craton and is far from the major earthquake zone of the world. It is therefore largely considered a stable region. However, the southern part of the country is seismically active. Records of damaging earthquakes in Ghana date as far back as 1615. A study on the microseismic activity in southern Ghana shows that the seismic activity is linked with active faulting between the east-west trending Coastal boundary fault and a northeast-southwest trending Akwapim fault zone. Epicentres of most of the earthquakes have been located close to the area where the two major faults intersect. This can be related to the level of activity of the faults. Some of the epicentres have been located offshore and can be associated with the level of activity of the coastal boundary fault. A review of the geological and instrumental recordings of earthquakes in Ghana show that earthquakes have occurred in the past and are still liable to occur within the vicinity of the intersection of the Akwapim fault zone and the Coastal boundary fault. Data from both historical and instrumental records indicate that the most seismically active areas in Ghana are the west of Accra, where the Akwapim fault zone and the Coastal boundary fault intersect. There are numerous minor faults in the intersection area between the Akwapim fault zone and the Coastal boundary fault. This mosaic of faults has a major implication for seismic activity in the area. Earthquake disaster mitigation measures are being put in place in recent times to reduce the impact of any major event that may occur in the country. The National Disaster Management Organization has come out with a building guide to assist in the mitigation effort of earthquake disasters and floods in the country. The building guide clearly stipulates the kind of material to be used, the proportion, what should go into the foundation for one or two storey building, the electrical materials to be used and many others.

  2. Disruption mitigation studies in DIII-D

    SciTech Connect

    Taylor, P.L.; Kellman, A.G.; Evans, T.E.

    1999-01-01

    Data on the discharge behavior, thermal loads, halo currents, and runaway electrons have been obtained in disruptions on the DIII-D tokamak. These experiments have also evaluated techniques to mitigate the disruptions while minimizing runaway electron production. Experiments injecting cryogenic impurity killer pellets of neon and argon and massive amounts of helium gas have successfully reduced these disruption effects. The halo current generation, scaling, and mitigation are understood and are in good agreement with predictions of a semianalytic model. Results from killer pellet injection have been used to benchmark theoretical models of the pellet ablation and energy loss. Runaway electrons are often generated by the pellets and new runaway generation mechanisms, modifications of the standard Dreicer process, have been found to explain the runaways. Experiments with the massive helium gas puff have also effectively mitigated disruptions without the formation of runaway electrons that can occur with killer pellets.

  3. Experimental and Numerical Study of Free-Field Blast Mitigation

    NASA Astrophysics Data System (ADS)

    Allen, R. M.; Kirkpatrick, D. J.; Longbottom, A. W.; Milne, A. M.; Bourne, N. K.

    2004-07-01

    The development of a fundamental understanding of the mechanisms governing the attenuation of explosives effects by a surrounding mitigant material or system would benefit many civilian and military applications. Current approaches rely almost exclusively on empirical data, few if any truly predictive models exist. Dstl has recently pursued an experimental programme investigating the mitigation of effects from detonating explosives in support of general requirements to attenuate blast and fragmentation. The physical properties of a range of mitigant materials have been studied at a more fundamental level, both experimentally and numerically. A preliminary numerical parameter study has been undertaken by FGE using two-phase numerical simulations to complement the experimental studies. Initial work used idealised equations of state for generic mitigants but more recently material characterisation experiments have been undertaken at RMCS. Results confirm that porosity and particle density are dominant factors affecting the efficiency of the mitigant in reducing free-field blast.

  4. Pulsed Electric Processing of the Seismic-Active Fault for Earthquake Hazard Mitigation

    NASA Astrophysics Data System (ADS)

    Novikov, V. A.; Zeigarnik, V. A.; Konev, Yu. B.; Klyuchkin, V. N.

    2010-03-01

    Previous field and laboratory investigations performed in Russia (1999-2008) showed a possibility of application of high-power electric current pulses generated by pulsed MHD power system for triggering the weak seismicity and release of tectonic stresses in the Earth crust for earthquake hazard mitigation. The mechanism of the influence of man-made electromagnetic field on the regional seismicity is not clear yet. One of possible cause of the phenomenon may be formation of cracks in the rocks under fluid pressure increase due to Joule heat generation by electric current injected into the Earth crust. Detailed 3D-calculaton of electric current density in the Earth crust of Northern Tien Shan provided by pulsed MHD power system connected to grounded electric dipole showed that at the depth of earthquake epicenters (> 5km) the electric current density is lower than 10-7 A/m2 that is not sufficient for increase of pressure in the fluid-saturated porous geological medium due to Joule heat generation, which may provide formation of cracks resulting in the fault propagation and release of tectonic stresses in the Earth crust. Nevertheless, under certain conditions, when electric current will be injected into the fault through the casing pipes of deep wells with preliminary injection of conductive fluid into the fault, the current density may be high enough for significant increase of mechanic pressure in the porous two-phase geological medium. Numerical analysis of a crack formation triggered by high-power electric pulses based on generation of mechanical pressure in the geological medium was carried out. It was shown that calculation of mechanical pressure impulse due to high-power electrical current in the porous two-phase medium may be performed neglecting thermal conductance by solving the non-stationary equation of piezo-conductivity with Joule heat generation. For calculation of heat generation the known solution of the task of current spreading from spherical or elliptic electrode submerged into unbounded medium is used. Pressure increase due to electric current is determined by voltage of the current source and the medium parameters, and it does not depend on the electrode radius. The pressure increase is proportional to parameter ? ? /r2, where ? is viscosity factor, ? is electric conductivity of fluid in pores, r is average radius of capillaries. This parameter may vary for different media and fluids in the pores by many orders of magnitude. The pressure increase for water is insignificant. If a high-mineralized fluid (e.g. sludge) is injected into the medium, the pressure may be increased by several orders. In that case the pressure may obtain tens kilobars, and an intergrowth of cracks will be possible in the medium exposed to high-power electric current. An estimation of parameters of portable pulsed power system for electric processing of the fault was performed, when the current is injected into the fault through the casing tubes of deep wells with preliminary injection of conductive fluid into the fault between the wells. The work is supported by grant No. 09-05-12059 of Russian Foundation for Basic Research.

  5. Exploratory Studies Facility Subsurface Fire Hazards Analysis

    SciTech Connect

    J. L. Kubicek

    2001-09-07

    The primary objective of this Fire Hazard Analysis (FHA) is to confirm the requirements for a comprehensive fire and related hazards protection program for the Exploratory Studies Facility (ESF) are sufficient to minimize the potential for: (1) The occurrence of a fire or related event. (2) A fire that causes an unacceptable on-site or off-site release of hazardous or radiological material that will threaten the health and safety of employees, the public or the environment. (3) Vital US. Department of Energy (DOE) programs suffering unacceptable interruptions as a result of fire and related hazards. (4) Property losses from a fire and related events exceeding limits established by DOE. (5) Critical process controls and safety class systems being damaged as a result of a fire and related events.

  6. Exploratory Studies Facility Subsurface Fire Hazards Analysis

    SciTech Connect

    Richard C. Logan

    2002-03-28

    The primary objective of this Fire Hazard Analysis (FHA) is to confirm the requirements for a comprehensive fire and related hazards protection program for the Exploratory Studies Facility (ESF) are sufficient to minimize the potential for: The occurrence of a fire or related event; A fire that causes an unacceptable on-site or off-site release of hazardous or radiological material that will threaten the health and safety of employees, the public or the environment; Vital U.S. Department of Energy (DOE) programs suffering unacceptable interruptions as a result of fire and related hazards; Property losses from a fire and related events exceeding limits established by DOE; and Critical process controls and safety class systems being damaged as a result of a fire and related events.

  7. Looking Before We Leap: Recent Results From An Ongoing Quantitative Investigation Of Asteroid And Comet Impact Hazard Mitigation.

    NASA Astrophysics Data System (ADS)

    Plesko, Catherine; Weaver, R. P.; Korycansky, D. G.; Huebner, W. F.

    2010-10-01

    The asteroid and comet impact hazard is now part of public consciousness, as demonstrated by movies, Super Bowl commercials, and popular news stories. However, there is a popular misconception that hazard mitigation is a solved problem. Many people think, `we'll just nuke it.’ There are, however, significant scientific questions remaining in the hazard mitigation problem. Before we can say with certainty that an explosive yield Y at height of burst h will produce a momentum change in or dispersion of a potentially hazardous object (PHO), we need to quantify how and where energy is deposited into the rubble pile or conglomerate that may make up the PHO. We then need to understand how shock waves propagate through the system, what causes them to disrupt, and how long gravitationally bound fragments take to recombine. Here we present numerical models of energy deposition from an energy source into various materials that are known PHO constituents, and rigid body dynamics models of the recombination of disrupted objects. In the energy deposition models, we explore the effects of porosity and standoff distance as well as that of composition. In the dynamical models, we explore the effects of fragment size and velocity distributions on the time it takes for gravitationally bound fragments to recombine. Initial models indicate that this recombination time is relatively short, as little as 24 hours for a 1 km sized PHO composed of 1000 meter-scale self-gravitating fragments with an initial velocity field of v/r = 0.001 1/s.

  8. Bike Helmets and Black Riders: Experiential Approaches to Helping Students Understand Natural Hazard Assessment and Mitigation Issues

    NASA Astrophysics Data System (ADS)

    Stein, S. A.; Kley, J.; Hindle, D.; Friedrich, A. M.

    2014-12-01

    Defending society against natural hazards is a high-stakes game of chance against nature, involving tough decisions. How should a developing nation allocate its budget between building schools for towns without ones or making existing schools earthquake-resistant? Does it make more sense to build levees to protect against floods, or to prevent development in the areas at risk? Would more lives be saved by making hospitals earthquake-resistant, or using the funds for patient care? These topics are challenging because they are far from normal experience, in that they involve rare events and large sums. To help students in natural hazard classes conceptualize them, we pose tough and thought-provoking questions about complex issues involved and explore them together via lectures, videos, field trips, and in-class and homework questions. We discuss analogous examples from the students' experiences, drawing on a new book "Playing Against Nature, Integrating Science and Economics to Mitigate Natural Hazards in an Uncertain World". Asking whether they wear bicycle helmets and why or why not shows the cultural perception of risk. Individual students' responses vary, and the overall results vary dramatically between the US, UK, and Germany. Challenges in hazard assessment in an uncertain world are illustrated by asking German students whether they buy a ticket on public transportation - accepting a known cost - or "ride black" - not paying but risking a heavy fine if caught. We explore the challenge of balancing mitigation costs and benefits via the question "If you were a student in Los Angeles, how much more would you pay in rent each month to live in an earthquake-safe building?" Students learn that interdisciplinary thinking is needed, and that due to both uncertainties and sociocultural factors, no unique or right strategies exist for a particular community, much the less all communities. However, we can seek robust policies that give sensible results given uncertainties.

  9. Disruption mitigation studies in DIII-D

    SciTech Connect

    Taylor, P.L.; Kellman, A.G.; Evans, T.E.; Gray, D.S.; Humphreys, D.A.; Hyatt, A.W.; Jernigan, T.C.; Lee, R.L.; Leuer, J.A.; Luckhardt, S.C.; Parks, P.B.; Schaffer, M.J.; Whyte, D.G.; Zhang, J.

    1999-05-01

    Data on the discharge behavior, thermal loads, halo currents, and runaway electrons have been obtained in disruptions on the DIII-D tokamak [J. L. Luxon and L. G. Davis, Fusion Technol. {bold 8}, 2A 441 (1985)]. These experiments have also evaluated techniques to mitigate the disruptions while minimizing runaway electron production. Experiments injecting cryogenic impurity {open_quotes}killer{close_quotes} pellets of neon and argon and massive amounts of helium gas have successfully reduced these disruption effects. The halo current generation, scaling, and mitigation are understood and are in good agreement with predictions of a semianalytic model. Results from {open_quotes}killer{close_quotes} pellet injection have been used to benchmark theoretical models of the pellet ablation and energy loss. Runaway electrons are often generated by the pellets and new runaway generation mechanisms, modifications of the standard Dreicer process, have been found to explain the runaways. Experiments with the massive helium gas puff have also effectively mitigated disruptions without the formation of runaway electrons that can occur with {open_quotes}killer{close_quotes} pellets. {copyright} {ital 1999 American Institute of Physics.}

  10. 76 FR 23613 - Draft Programmatic Environmental Assessment for Hazard Mitigation Safe Room Construction

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-04-27

    ... SECURITY Federal Emergency Management Agency Draft Programmatic Environmental Assessment for Hazard..., tornado). FEMA has prepared a draft Programmatic Environmental Assessment (PEA) to address the potential... on the draft Programmatic Environmental Assessment may be submitted on or before May 27,...

  11. Debris flows hazard mitigation: an example in the F. Menotre basin (central Italy)

    NASA Astrophysics Data System (ADS)

    Taramelli, A.; Melelli, L.; Cattuto, C.; Gregori, L.

    2003-04-01

    Debris flows are important geomorphic events in a wide variety of landscape. Although repeated landsliding causes bedrock to underlie many hill slopes of mountain belts, some landslide debris remains and is stored as a thin colluvial cover, particularly in "hollows". These colluvial pockets act as slope failure "hot spots" by focusing infiltration of storm runoff, leading to local groundwater concentration above perched water tables and therefore enhanced failure potential. Consequently, colluvial deposits in hollows are particularly susceptible to landsliding. Once a failure occurs, hillslope sediment transport processes refill the scar, resulting in a cycle of gradual colluvium accumulation and periodic instability. Hillslope debris moves down-slope as a result of hillslope processes and where overland flow is either non-erosive or infrequent colluvium accumulates along the line of descendent. So the association of debris flow with hollows is governed by relations between sediment transport, hillslope hydrology and slope stability. Hollows consequently define a mappable debris flow hazard source areas. With these goals in mind we propose a GIS model where we have provided a record of landslide activity (debris flow events) in response to specific storm over areas of diverse geology, geomorpholgy, microclimates and vegetation. When combined with information on the physical properties of hillslope form and gradient, rainfall characteristics, and travel distance, these inventories could provide a foundation for the development of accurate and meaningful susceptibility maps. In particular susceptibility index identify those geologic units that produced the most debris flow in each study area in response to specific rainfall condition. To examine relations between geologic units and debris flow susceptibility, we calculated an index of debris flow susceptibility for each geologic unit in each representative elementary area. This is done by dividing the number of landslide initiation locations within a particular unit by the areal extent of that unit in the study area. The final aim of our research has been the identification of the terrain attributes related to the occurrence of debris flow and to quantify their relative contribution to the hazard assessment. The modeling work has focused on slope failure in the Menotre basin because the connection between severe storms and debris flows is clear and we had access to appropriate data to constrain the modeling.

  12. Studies Update Vinyl Chloride Hazards.

    ERIC Educational Resources Information Center

    Rawls, Rebecca

    1980-01-01

    Extensive study affirms that vinyl chloride is a potent animal carcinogen. Epidemiological studies show elevated rates of human cancers in association with extended contact with the compound. (Author/RE)

  13. Probabilistic Tsunami Hazard Assessment: the Seaside, Oregon Pilot Study

    NASA Astrophysics Data System (ADS)

    Gonzalez, F. I.; Geist, E. L.; Synolakis, C.; Titov, V. V.

    2004-12-01

    A pilot study of Seaside, Oregon is underway, to develop methodologies for probabilistic tsunami hazard assessments that can be incorporated into Flood Insurance Rate Maps (FIRMs) developed by FEMA's National Flood Insurance Program (NFIP). Current NFIP guidelines for tsunami hazard assessment rely on the science, technology and methodologies developed in the 1970s; although generally regarded as groundbreaking and state-of-the-art for its time, this approach is now superseded by modern methods that reflect substantial advances in tsunami research achieved in the last two decades. In particular, post-1990 technical advances include: improvements in tsunami source specification; improved tsunami inundation models; better computational grids by virtue of improved bathymetric and topographic databases; a larger database of long-term paleoseismic and paleotsunami records and short-term, historical earthquake and tsunami records that can be exploited to develop improved probabilistic methodologies; better understanding of earthquake recurrence and probability models. The NOAA-led U.S. National Tsunami Hazard Mitigation Program (NTHMP), in partnership with FEMA, USGS, NSF and Emergency Management and Geotechnical agencies of the five Pacific States, incorporates these advances into site-specific tsunami hazard assessments for coastal communities in Alaska, California, Hawaii, Oregon and Washington. NTHMP hazard assessment efforts currently focus on developing deterministic, "credible worst-case" scenarios that provide valuable guidance for hazard mitigation and emergency management. The NFIP focus, on the other hand, is on actuarial needs that require probabilistic hazard assessments such as those that characterize 100- and 500-year flooding events. There are clearly overlaps in NFIP and NTHMP objectives. NTHMP worst-case scenario assessments that include an estimated probability of occurrence could benefit the NFIP; NFIP probabilistic assessments of 100- and 500-yr events could benefit the NTHMP. The joint NFIP/NTHMP pilot study at Seaside, Oregon is organized into three closely related components: Probabilistic, Modeling, and Impact studies. Probabilistic studies (Geist, et al., this session) are led by the USGS and include the specification of near- and far-field seismic tsunami sources and their associated probabilities. Modeling studies (Titov, et al., this session) are led by NOAA and include the development and testing of a Seaside tsunami inundation model and an associated database of computed wave height and flow velocity fields. Impact studies (Synolakis, et al., this session) are led by USC and include the computation and analyses of indices for the categorization of hazard zones. The results of each component study will be integrated to produce a Seaside tsunami hazard map. This presentation will provide a brief overview of the project and an update on progress, while the above-referenced companion presentations will provide details on the methods used and the preliminary results obtained by each project component.

  14. Advances in Remote Sensing Approaches for Hazard Mitigation and Natural Resource Protection in Pacific Latin America: A Workshop for Advanced Graduate Students, Post- Doctoral Researchers, and Junior Faculty

    NASA Astrophysics Data System (ADS)

    Gierke, J. S.; Rose, W. I.; Waite, G. P.; Palma, J. L.; Gross, E. L.

    2008-12-01

    Though much of the developing world has the potential to gain significantly from remote sensing techniques in terms of public health and safety, they often lack resources for advancing the development and practice of remote sensing. All countries share a mutual interest in furthering remote sensing capabilities for natural hazard mitigation and resource development. With National Science Foundation support from the Partnerships in International Research and Education program, we are developing a new educational system of applied research and engineering for advancing collaborative linkages among agencies and institutions in Pacific Latin American countries (to date: Guatemala, El Salvador, Nicaragua, Costa Rica, Panama, and Ecuador) in the development of remote sensing tools for hazard mitigation and water resources management. The project aims to prepare students for careers in science and engineering through their efforts to solve suites of problems needing creative solutions: collaboration with foreign agencies; living abroad immersed in different cultures; and adapting their academic training to contend with potentially difficult field conditions and limited resources. The ultimate goal of integrating research with education is to encourage cross-disciplinary, creative, and critical thinking in problem solving and foster the ability to deal with uncertainty in analyzing problems and designing appropriate solutions. In addition to traditional approaches for graduate and undergraduate research, we have built new educational systems of applied research and engineering: (1) the Peace Corp/Master's International program in Natural Hazards which features a 2-year field assignment during service in the U.S. Peace Corps, (2) the Michigan Tech Enterprise program for undergraduates, which gives teams of students from different disciplines the opportunity to work for three years in a business-like setting to solve real-world problems, and (3) a unique university exchange program in natural hazards (E-Haz). Advancements in research have been made, for example, in using thermal remote sensing methods for studying vent and eruptive processes, and in fusing RADARSAT with ASTER imagery to delineate lineaments in volcanic terrains for siting water wells. While these and other advancements are developed in conjunction with our foreign counterparts, the impacts of this work can be broadened through more comprehensive dissemination activities. Towards this end, we are in the planning phase of a Pan American workshop on applications of remote sensing techniques for natural hazards and water resources management. The workshop will be at least two weeks, sometime in July/August 2009, and involve 30-40 participants, with balanced participation from the U.S. and Latin America. In addition to fundamental aspects of remote sensing and digital image processing, the workshop topics will be presented in the context of new developments for studying volcanic processes and hazards and for characterizing groundwater systems.

  15. Remote Sensing for Hazard Mitigation and Resource Protection in Pacific Latin America: New NSF sponsored initiative at Michigan Tech.

    NASA Astrophysics Data System (ADS)

    Rose, W. I.; Bluth, G. J.; Gierke, J. S.; Gross, E.

    2005-12-01

    Though much of the developing world has the potential to gain significantly from remote sensing techniques in terms of public health and safety and, eventually, economic development, they lack the resources required to advance the development and practice of remote sensing. Both developed and developing countries share a mutual interest in furthering remote sensing capabilities for natural hazard mitigation and resource development, and this common commitment creates a solid foundation upon which to build an integrated education and research project. This will prepare students for careers in science and engineering through their efforts to solve a suite of problems needing creative solutions: collaboration with foreign agencies; living abroad immersed in different cultures; and adapting their academic training to contend with potentially difficult field conditions and limited resources. This project makes two important advances: (1) We intend to develop the first formal linkage among geoscience agencies from four Pacific Latin American countries (Guatemala, El Salvador, Nicaragua and Ecuador), focusing on the collaborative development of remote sensing tools for hazard mitigation and water resource development; (2) We will build a new educational system of applied research and engineering, using two existing educational programs at Michigan Tech: a new Peace Corp/Master's International (PC/MI) program in Natural Hazards which features a 2-year field assignment, and an "Enterprise" program for undergraduates, which gives teams of geoengineering students the opportunity to work for three years in a business-like setting to solve real-world problems This project will involve 1-2 post-doctoral researchers, 3 Ph.D., 9 PC/MI, and roughly 20 undergraduate students each year.

  16. The respiratory health hazards of volcanic ash: a review for volcanic risk mitigation

    NASA Astrophysics Data System (ADS)

    Horwell, Claire J.; Baxter, Peter J.

    2006-07-01

    Studies of the respiratory health effects of different types of volcanic ash have been undertaken only in the last 40 years, and mostly since the eruption of Mt. St. Helens in 1980. This review of all published clinical, epidemiological and toxicological studies, and other work known to the authors up to and including 2005, highlights the sparseness of studies on acute health effects after eruptions and the complexity of evaluating the long-term health risk (silicosis, non-specific pneumoconiosis and chronic obstructive pulmonary disease) in populations from prolonged exposure to ash due to persistent eruptive activity. The acute and chronic health effects of volcanic ash depend upon particle size (particularly the proportion of respirable-sized material), mineralogical composition (including the crystalline silica content) and the physico-chemical properties of the surfaces of the ash particles, all of which vary between volcanoes and even eruptions of the same volcano, but adequate information on these key characteristics is not reported for most eruptions. The incidence of acute respiratory symptoms (e.g. asthma, bronchitis) varies greatly after ashfalls, from very few, if any, reported cases to population outbreaks of asthma. The studies are inadequate for excluding increases in acute respiratory mortality after eruptions. Individuals with pre-existing lung disease, including asthma, can be at increased risk of their symptoms being exacerbated after falls of fine ash. A comprehensive risk assessment, including toxicological studies, to determine the long-term risk of silicosis from chronic exposure to volcanic ash, has been undertaken only in the eruptions of Mt. St. Helens (1980), USA, and Soufrière Hills, Montserrat (1995 onwards). In the Soufrière Hills eruption, a long-term silicosis hazard has been identified and sufficient exposure and toxicological information obtained to make a probabilistic risk assessment for the development of silicosis in outdoor workers and the general population. A more systematic approach to multi-disciplinary studies in future eruptions is recommended, including establishing an archive of ash samples and a website containing health advice for the public, together with scientific and medical study guidelines for volcanologists and health-care workers.

  17. Piloted Simulation to Evaluate the Utility of a Real Time Envelope Protection System for Mitigating In-Flight Icing Hazards

    NASA Technical Reports Server (NTRS)

    Ranaudo, Richard J.; Martos, Borja; Norton, Bill W.; Gingras, David R.; Barnhart, Billy P.; Ratvasky, Thomas P.; Morelli, Eugene

    2011-01-01

    The utility of the Icing Contamination Envelope Protection (ICEPro) system for mitigating a potentially hazardous icing condition was evaluated by 29 pilots using the NASA Ice Contamination Effects Flight Training Device (ICEFTD). ICEPro provides real time envelope protection cues and alerting messages on pilot displays. The pilots participating in this test were divided into two groups; a control group using baseline displays without ICEPro, and an experimental group using ICEPro driven display cueing. Each group flew identical precision approach and missed approach procedures with a simulated failure case icing condition. Pilot performance, workload, and survey questionnaires were collected for both groups of pilots. Results showed that real time assessment cues were effective in reducing the number of potentially hazardous upset events and in lessening exposure to loss of control following an incipient upset condition. Pilot workload with the added ICEPro displays was not measurably affected, but pilot opinion surveys showed that real time cueing greatly improved their situation awareness of a hazardous aircraft state.

  18. Hazardous near Earth asteroid mitigation campaign planning based on uncertain information on fundamental asteroid characteristics

    NASA Astrophysics Data System (ADS)

    Sugimoto, Y.; Radice, G.; Ceriotti, M.; Sanchez, J. P.

    2014-10-01

    Given a limited warning time, an asteroid impact mitigation campaign would hinge on uncertainty-based information consisting of remote observational data of the identified Earth-threatening object, general knowledge of near-Earth asteroids (NEAs), and engineering judgment. Due to these ambiguities, the campaign credibility could be profoundly compromised. It is therefore imperative to comprehensively evaluate the inherent uncertainty in deflection and plan the campaign accordingly to ensure successful mitigation. This research demonstrates dual-deflection mitigation campaigns consisting of primary (instantaneous/quasi-instantaneous) and secondary (slow-push) deflection missions, where both deflection efficiency and campaign credibility are taken into account. The results of the dual-deflection campaign analysis show that there are trade-offs between the competing aspects: the launch cost, mission duration, deflection distance, and the confidence in successful deflection. The design approach is found to be useful for multi-deflection campaign planning, allowing us to select the best possible combination of missions from a catalogue of campaign options, without compromising the campaign credibility.

  19. The Relation of Hazard Awareness to Adoption of Approved Mitigation Measures.

    ERIC Educational Resources Information Center

    Saarinen, Thomas F.

    The relationship between an individual's or community's awareness of natural hazards and subsequent behavior change is examined in this review of research. The document is presented in seven sections. Following Section I, the introduction, Section II discusses the role of experience in behavior change. Section III examines the role of education…

  20. The Relation of Hazard Awareness to Adoption of Approved Mitigation Measures.

    ERIC Educational Resources Information Center

    Saarinen, Thomas F.

    The relationship between an individual's or community's awareness of natural hazards and subsequent behavior change is examined in this review of research. The document is presented in seven sections. Following Section I, the introduction, Section II discusses the role of experience in behavior change. Section III examines the role of education

  1. Marine and Hydrokinetic Renewable Energy Devices, Potential Navigational Hazards and Mitigation Measures

    SciTech Connect

    Cool, Richard, M.; Hudon, Thomas, J.; Basco, David, R.; Rondorf, Neil, E.

    2009-12-01

    On April 15, 2008, the Department of Energy (DOE) issued a Funding Opportunity Announcement for Advanced Water Power Projects which included a Topic Area for Marine and Hydrokinetic Renewable Energy Market Acceleration Projects. Within this Topic Area, DOE identified potential navigational impacts of marine and hydrokinetic renewable energy technologies and measures to prevent adverse impacts on navigation as a sub-topic area. DOE defines marine and hydrokinetic technologies as those capable of utilizing one or more of the following resource categories for energy generation: ocean waves; tides or ocean currents; free flowing water in rivers or streams; and energy generation from the differentials in ocean temperature. PCCI was awarded Cooperative Agreement DE-FC36-08GO18177 from the DOE to identify the potential navigational impacts and mitigation measures for marine hydrokinetic technologies. A technical report addressing our findings is available on this Science and Technology Information site under the Product Title, "Marine and Hydrokinetic Renewable Energy Technologies: Potential Navigational Impacts and Mitigation Measures". This product is a brochure, primarily for project developers, that summarizes important issues in that more comprehensive report, identifies locations where that report can be downloaded, and identifies points of contact for more information.

  2. Debris flood hazard documentation and mitigation on the Tilcara alluvial fan (Quebrada de Humahuaca, Jujuy province, North-West Argentina)

    NASA Astrophysics Data System (ADS)

    Marcato, G.; Bossi, G.; Rivelli, F.; Borgatti, L.

    2012-06-01

    For some decades, mass wasting processes such as landslides and debris floods have been threatening villages and transportation routes in the Rio Grande Valley, named Quebrada de Humauhuaca. One of the most significant examples is the urban area of Tilcara, built on a large alluvial fan. In recent years, debris flood phenomena have been triggered in the tributary valley of the Huasamayo Stream and reached the alluvial fan on a decadal basis. In view of proper development of the area, hazard and risk assessment together with risk mitigation strategies are of paramount importance. The need is urgent also because the Quebrada de Humahuaca was recently included in the UNESCO World Cultural Heritage. Therefore, the growing tourism industry may lead to uncontrolled exploitation and urbanization of the valley, with a consequent increase of the vulnerability of the elements exposed to risk. In this context, structural and non structural mitigation measures not only have to be based on the understanding of natural processes, but also have to consider environmental and sociological factors that could hinder the effectiveness of the countermeasure works. The hydrogeological processes are described with reference to present-day hazard and risk conditions. Considering the socio-economic context, some possible interventions are outlined, which encompass budget constraints and local practices. One viable solution would be to build a protecting dam upstream of the fan apex and an artificial channel, in order to divert the floodwaters in a gully that would then convey water and sediments into the Rio Grande, some kilometers downstream of Tilcara. The proposed remedial measures should employ easily available and relatively cheap technologies and local workers, incorporating low environmental and visual impacts issues, in order to ensure both the future conservation of the site and its safe exploitation for inhabitants and tourists.

  3. Societal transformation and adaptation necessary to manage dynamics in flood hazard and risk mitigation (TRANS-ADAPT)

    NASA Astrophysics Data System (ADS)

    Fuchs, Sven; Thaler, Thomas; Bonnefond, Mathieu; Clarke, Darren; Driessen, Peter; Hegger, Dries; Gatien-Tournat, Amandine; Gralepois, Mathilde; Fournier, Marie; Mees, Heleen; Murphy, Conor; Servain-Courant, Sylvie

    2015-04-01

    Facing the challenges of climate change, this project aims to analyse and to evaluate the multiple use of flood alleviation schemes with respect to social transformation in communities exposed to flood hazards in Europe. The overall goals are: (1) the identification of indicators and parameters necessary for strategies to increase societal resilience, (2) an analysis of the institutional settings needed for societal transformation, and (3) perspectives of changing divisions of responsibilities between public and private actors necessary to arrive at more resilient societies. This proposal assesses societal transformations from the perspective of changing divisions of responsibilities between public and private actors necessary to arrive at more resilient societies. Yet each risk mitigation measure is built on a narrative of exchanges and relations between people and therefore may condition the outputs. As such, governance is done by people interacting and defining risk mitigation measures as well as climate change adaptation are therefore simultaneously both outcomes of, and productive to, public and private responsibilities. Building off current knowledge this project will focus on different dimensions of adaptation and mitigation strategies based on social, economic and institutional incentives and settings, centring on the linkages between these different dimensions and complementing existing flood risk governance arrangements. The policy dimension of adaptation, predominantly decisions on the societal admissible level of vulnerability and risk, will be evaluated by a human-environment interaction approach using multiple methods and the assessment of social capacities of stakeholders across scales. As such, the challenges of adaptation to flood risk will be tackled by converting scientific frameworks into practical assessment and policy advice. In addressing the relationship between these dimensions of adaptation on different temporal and spatial scales, this project is both scientifically innovative and policy relevant, thereby supporting climate policy needs in Europe towards a concept of risk governance. Key words: climate change adaptation; transformation; flood risk management; resilience; vulnerability; innovative bottom-up developments; multifunctional use

  4. Detecting Slow Deformation Signals Preceding Dynamic Failure: A New Strategy For The Mitigation Of Natural Hazards (SAFER)

    NASA Astrophysics Data System (ADS)

    Vinciguerra, Sergio; Colombero, Chiara; Comina, Cesare; Ferrero, Anna Maria; Mandrone, Giuseppe; Umili, Gessica; Fiaschi, Andrea; Saccorotti, Gilberto

    2014-05-01

    Rock slope monitoring is a major aim in territorial risk assessment and mitigation. The high velocity that usually characterizes the failure phase of rock instabilities makes the traditional instruments based on slope deformation measurements not applicable for early warning systems. On the other hand the use of acoustic emission records has been often a good tool in underground mining for slope monitoring. Here we aim to identify the characteristic signs of impending failure, by deploying a "site specific" microseismic monitoring system on an unstable patch of the Madonna del Sasso landslide on the Italian Western Alps designed to monitor subtle changes of the mechanical properties of the medium and installed as close as possible to the source region. The initial characterization based on geomechanical and geophysical tests allowed to understand the instability mechanism and to design the monitoring systems to be placed. Stability analysis showed that the stability of the slope is due to rock bridges. Their failure progress can results in a global slope failure. Consequently the rock bridges potentially generating dynamic ruptures need to be monitored. A first array consisting of instruments provided by University of Turin, has been deployed on October 2013, consisting of 4 triaxial 4.5 Hz seismometers connected to a 12 channel data logger arranged in a 'large aperture' configuration which encompasses the entire unstable rock mass. Preliminary data indicate the occurrence of microseismic swarms with different spectral contents. Two additional geophones and 4 triaxial piezoelectric accelerometers able to operate at frequencies up to 23 KHz will be installed during summer 2014. This will allow us to develop a network capable of recording events with Mw < 0.5 and frequencies between 700 Hz and 20 kHz. Rock physical and mechanical characterization along with rock deformation laboratory experiments during which the evolution of related physical parameters under simulated conditions of stress and fluid content will be also studied and theoretical modelling will allow to come up with a full hazard assessment and test new methodologies for a much wider scale of applications within EU.

  5. A probabilistic framework for hazard assessment and mitigation of induced seismicity related to deep geothermal systems

    NASA Astrophysics Data System (ADS)

    Wiemer, S.; Bachmann, C. E.; Allmann, B.; Giardini, D.; Woessner, J.; Catalli, F.; Mena Carbrera, B.

    2011-12-01

    Slip on tectonic faults take place over a wide range of spatial and temporal scales as earthquakes, continuous aseismic creep, or transient creep events. Shallow creep events on continental strike-slip faults can occur spontaneously, or are coupled with earthquake afterslip, or are triggered by nearby earthquakes. Despite more than five decades of observations, the mechanism of shallow creep events and their implications for seismic hazard are still not fully understood. To understand the mechanism of creep events, we developed a physics-based numerical model to simulate shallow creep events on a strike-slip fault with rate-and-state frictional properties (Wei et al., 2013). We show that the widely used synoptic model (Scholz, 1998) cannot reproduce both rapid afterslip and frequent creep events as observed on the Superstition Hills fault in the Salton Trough after the 1987 Mw 6.6 earthquake. Rather, an unstable layer embedded in the shallow stable zone is required to match the geodetic observations of the creep behavior. Using the strike-slip fault model, we studied the triggering process of creep events, by either static or dynamic, or combined stress perturbations induced on the fault by nearby earthquakes. Preliminary results show that static stress perturbations in the effective normal stress on a system with spontaneous creep events can advance or delay creep events. The magnitude and timing of perturbations determines the clock change of creep events. The magnitude and interval of creep events changes permanently after static stress perturbation. Dynamic stress perturbations in effective normal stress can advance the timings of creep events when the perturbation temporally decreases the effective normal stress. A threshold exists for instantaneous triggering. The size of triggered slip increases as the dynamic perturbation increases in the direction of less normal stress. The system returns to pre-perturbation state after a long period of no slip. The length of the recovery time depends on the size of triggered slip therefore the magnitude and duration of perturbation. Perturbations that temporally increase effective normal stress do not have significant influence on the timings of future creep events. We applied our theoretical models to the Salton Trough, California, where both shallow creep events and earthquakes are common. We systematically analyzed the level of dynamic and static triggering from nearby earthquakes for the last 30 years, including moderate (> M5) to large (>M6) earthquakes. By incorporating these triggering to our fault model, we are trying to understand 1) which mechanism is dominant, static or dynamic; 2) whether a critical threshold exists, like in the generic model with synthetic dynamic perturbations for the instantaneous triggering of shallow creep events in Salton Trough; 3) the effect of fault orientation with respect to the incoming seismic waves. By developing state-of-the-art models and constraining parameters with rich datasets from Southern California, we aim to transition from a conceptual understanding of fault creep towards a quantitative and predictive understanding of the physical mechanism of creep events on continental strike-slip faults.

  6. Smart Oceans BC: Supporting Coastal and Ocean Natural Hazards Mitigation for British Columbia

    NASA Astrophysics Data System (ADS)

    Moran, K.; Insua, T. L.; Pirenne, B.; Hoeberechts, M.; McLean, S.

    2014-12-01

    Smart Oceans BC is a new multi-faceted program to support decision-makers faced with responding to natural disasters and hazards in Canada's Province of British Columbia. It leverages the unique capabilities of Ocean Networks Canada's cabled ocean observatories, NEPTUNE and VENUS to enhance public safety, marine safety and environmental monitoring. Smart Oceans BC combines existing and new marine sensing technology with its robust data management and archive system, Oceans 2.0, to deliver information and science for good ocean management and responsible ocean use. Smart Oceans BC includes new ocean observing infrastructure for: public safety, through natural hazard detection for earthquake groundshaking and near-field tsunamis; marine safety, by monitoring and providing alerts on sea state, ship traffic, and marine mammal presence; and environmental protection, by establishing baseline data in critical areas, and providing real-time environmental observations. Here we present the elements of this new ocean observing initiative that are focused on tsunami and earthquake early warning including cabled and autonomous sensor systems, real-time data delivery, software developments that enable rapid detection, analytics used in notification development, and stakeholder engagement plans.

  7. A fast global tsunami modeling suite as a trans-oceanic tsunami hazard prediction and mitigation tool

    NASA Astrophysics Data System (ADS)

    Mohammed, F.; Li, S.; Jalali Farahani, R.; Williams, C. R.; Astill, S.; Wilson, P. S.; B, S.; Lee, R.

    2014-12-01

    The past decade has been witness to two mega-tsunami events, 2004 Indian ocean tsunami and 2011 Japan tsunami and multiple major tsunami events; 2006 Java, Kuril Islands, 2007 Solomon Islands, 2009 Samoa and 2010 Chile, to name a few. These events generated both local and far field tsunami inundations with runup ranging from a few meters to around 40 m in the coastal impact regions. With a majority of the coastal population at risk, there is need for a sophisticated outlook towards catastrophe risk estimation and a quick mitigation response. At the same time tools and information are needed to aid advanced tsunami hazard prediction. There is an increased need for insurers, reinsurers and Federal hazard management agencies to quantify coastal inundations and vulnerability of coastal habitat to tsunami inundations. A novel tool is developed to model local and far-field tsunami generation, propagation and inundation to estimate tsunami hazards. The tool is a combination of the NOAA MOST propagation database and an efficient and fast GPU (Graphical Processing Unit)-based non-linear shallow water wave model solver. The tsunamigenic seismic sources are mapped on to the NOAA unit source distribution along subduction zones in the ocean basin. Slip models are defined for tsunamigenic seismic sources through a slip distribution on the unit sources while maintaining limits of fault areas. A GPU based finite volume solver is used to simulate non-linear shallow water wave propagation, inundation and runup. Deformation on the unit sources provide initial conditions for modeling local impacts, while the wave history from propagation database provides boundary conditions for far field impacts. The modeling suite provides good agreement with basins for basin wide tsunami propagation to validate local and far field tsunami inundations.

  8. Field Guide for Testing Existing Photovoltaic Systems for Ground Faults and Installing Equipment to Mitigate Fire Hazards: November 2012 - October 2013

    SciTech Connect

    Brooks, William

    2015-02-01

    Ground faults and arc faults are the two most common reasons for fires in photovoltaic (PV) arrays and methods exist that can mitigate the hazards. This report provides field procedures for testing PV arrays for ground faults, and for implementing high resolution ground fault and arc fault detectors in existing and new PV system designs.

  9. Protecting new health facilities from natural hazards: guidelines for the promotion of disaster mitigation.

    PubMed

    2004-01-01

    The health sector is particularly vulnerable to naturally occurring events. The vulnerability of the health infrastructure (hospitals and clinics) is of particular concern. Not only are such facilities vulnerable structurally, but their ability to continue to provide essential functions may be severely compromised, thus leaving the stricken population without essential services. This paper summarizes a more detailed document, Guidelines for Vulnerability Reduction in the Design of New Health Facilities published by the Pan-American Health Organization (PAHO)/ World Health Organization (WHO). The current document summarizes these Guidelines emphasizing how they may be used, by whom, and for what purpose. Potential users of the Guidelines include, but are not limited to: (1) initiators of health facility construction projects; (2) executors and supervisors of health facility construction projects; and (3) financing bodies in charge of funding health facility construction projects. The Guidelines include: (1) implications of natural phenomena upon the health infrastructure; (2) guidelines for vulnerability reduction for incorporation into development project cycles; (3) definitive phases and stages within the phases for development projects including: (I) Projects Assessment (needs assessment; assessment of options, the preliminary project); (II) Investment (project design, construction); and (III) Operational Activities (operations and maintenance). In addition, investment in damage reduction measures, policies and regulations, training and education, and the role of international organizations in the promotion and funding of mitigation strategies are addressed. PMID:15645629

  10. Novel bio-inspired smart control for hazard mitigation of civil structures

    NASA Astrophysics Data System (ADS)

    Kim, Yeesock; Kim, Changwon; Langari, Reza

    2010-11-01

    In this paper, a new bio-inspired controller is proposed for vibration mitigation of smart structures subjected to ground disturbances (i.e. earthquakes). The control system is developed through the integration of a brain emotional learning (BEL) algorithm with a proportional-integral-derivative (PID) controller and a semiactive inversion (Inv) algorithm. The BEL algorithm is based on the neurologically inspired computational model of the amygdala and the orbitofrontal cortex. To demonstrate the effectiveness of the proposed hybrid BEL-PID-Inv control algorithm, a seismically excited building structure equipped with a magnetorheological (MR) damper is investigated. The performance of the proposed hybrid BEL-PID-Inv control algorithm is compared with that of passive, PID, linear quadratic Gaussian (LQG), and BEL control systems. In the simulation, the robustness of the hybrid BEL-PID-Inv control algorithm in the presence of modeling uncertainties as well as external disturbances is investigated. It is shown that the proposed hybrid BEL-PID-Inv control algorithm is effective in improving the dynamic responses of seismically excited building structure-MR damper systems.

  11. Using Darwin's theory of atoll formation to improve tsunami hazard mitigation in the Pacific

    NASA Astrophysics Data System (ADS)

    Goff, J. R.; Terry, J. P.

    2012-12-01

    It is 130 years since Charles Darwin's death and 176 years since he his penned his subsidence theory of atoll formation on 12th April 1836 during the voyage of the Beagle through the Pacific. This theory, founded on the premise of a subsiding volcano and the corresponding upward growth of coral reef, was astonishing for the time considering the absence of an underpinning awareness of plate tectonics. Furthermore, with the exception of the occasional permutation and opposing idea his theory has endured and has an enviable longevity amongst paradigms in geomorphology. In his theory, Darwin emphasised the generally circular morphology of the atoll shape and surprisingly, the validity of this simple morphological premise has never been questioned. There are however, few atolls in the Pacific Ocean that attain such a simple morphology with most manifesting one or more arcuate 'bight-like' structures (ABLSs). These departures from the circular form complicate his simplistic model and are indicative of geomorphological processes in the Pacific Ocean which cannot be ignored. ABLSs represent the surface morphological expression of major submarine failures of atoll volcanic foundations. Such failures can occur during any stage of atoll formation and are a valuable addition to Darwin's theory because they indicate the instability of the volcanic foundations. It is widely recognized in the research community that sector/flank collapses of island edifices are invariably tsunamigenic and yet we have no clear understanding of how significant such events are in the tsunami hazard arena. The recognition of ABLSs however, now offers scientists the opportunity to establish a first order database of potential local and regional tsunamigenic sources associated with the sector/flank collapses of island edifices. We illustrate the talk with examples of arcuate 'bight-like' structures and associated tsunamis in atoll and atoll-like environments. The implications for our understanding of tsunami hazards are profound. In essence, at present we are seriously under-estimating the significance of locally and regionally generated tsunamis throughout the entire Pacific Ocean, but we now have the opportunity to enhance our understanding of such events.

  12. Educational Approach to Seismic Risk Mitigation in Indian Himalayas -Hazard Map Making Workshops at High Schools-

    NASA Astrophysics Data System (ADS)

    Koketsu, K.; Oki, S.; Kimura, M.; Chadha, R. K.; Davuluri, S.

    2014-12-01

    How can we encourage people to take preventive measures against damage risks and empower them to take the right actions in emergencies to save their lives? The conventional approach taken by scientists had been disseminating intelligible information on up-to-date seismological knowledge. However, it has been proven that knowledge alone does not have enough impact to modify people's behaviors in emergencies (Oki and Nakayachi, 2012). On the other hand, the conventional approach taken by practitioners had been to conduct emergency drills at schools or workplaces. The loss of many lives from the 2011 Tohoku earthquake has proven that these emergency drills were not enough to save people's lives, unless they were empowered to assess the given situation on their own and react flexibly. Our challenge is to bridge the gap between knowledge and practice. With reference to best practices observed in Tohoku, such as The Miracles of Kamaishi, our endeavor is to design an effective Disaster Preparedness Education Program that is applicable to other disaster-prone regions in the world, even with different geological, socio-economical and cultural backgrounds. The key concepts for this new approach are 1) empowering individuals to take preventive actions to save their lives, 2) granting community-based understanding of disaster risks and 3) building a sense of reality and relevancy to disasters. With these in mind, we held workshops at some high schools in the Lesser Himalayan Region, combining lectures with an activity called "Hazard Map Making" where students proactively identify and assess the hazards around their living areas and learn practical strategies on how to manage risks. We observed the change of awareness of the students by conducting a preliminary questionnaire survey and interviews after each session. Results strongly implied that the significant change of students' attitudes towards disaster preparedness occurred not by the lectures of scientific knowledge, but after completing the whole program of activities. Students closed their presentation by spontaneously adding messages to others about importance of life and preparedness. In this presentation, we share good practices in terms of program design and facilitation that encouraged the transition of participants from a learner to an actor.

  13. Developing Oceanic Convective Products to Mitigate the Impact of Weather Hazards on Transoceanic Flights

    NASA Astrophysics Data System (ADS)

    Nierow, A.

    2003-12-01

    Transoceanic flights will increase significantly in the next decade. To manage this increased demand for capacity, while maintaining safety, the Federal Aviation Administration (FAA) is exploring whether the separation minima normally used between aircraft crossing oceanic regions can be reduced both horizontally and vertically. However, before reducing separation standards, the increased hazard of encountering convective weather over oceanic routes must be considered. New evidence has shown that roughly half of the turbulence encounters over oceanic regions were likely associated with convective activity. This phenomenon, Convectively-Induced Turbulence (CIT), can occur several kilometers from convective cores. Operational decision-makers need to detect turbulence associated with oceanic convective activity to route or reroute aircraft safely. However, the only weather data consistently available is from satellite imagery, which can reveal potential areas of convection, but can't unambiguously isolate the hazardous regions from the benign regions. Being able to do this would improve routing and rerouting decisions. The FAA and other agencies are collaborating to develop oceanic convective products. The National Weather Service's Aviation Weather Center created a product that identifies thunderstorms by using the output from different satellite imagers. The technique exploits the difference between the 11-micron infrared (IR) channel and the 6.7-micron water vapor channel. The National Center for Atmospheric Research has developed a new product that maps cloud top temperatures drawn from IR satellite imagery and converts them to aircraft flight levels. In addition, the Naval Research Lab in Monterey, CA is developing cloud classification algorithms that will distinguish between cirrus and convective clouds. We have compared these new convective diagnostic techniques to long-range ground base lightning data and lightning data from the National Aeronautics and Space Administration (NASA) Tropical Rainfall Measuring Mission (TRMM) satellite. We will present the results of the comparison at the meeting. Developing oceanic convective and turbulence nowcasting and short-term forecasting products would have a significant positive impact on flight operations since they would show possible locations of turbulence and wind shear associated with convection. These products would also increase airspace capacity by enabling the FAA to decrease oceanic aircraft separation standards while preserving safety. We wish to reduce the incidence of noncoordinated deviations (because of unexpected encounters with turbulence associated with convection), through greater situational awareness and more time for coordination. By helping pilots avoid areas of convective activity and associated turbulence over oceanic regions, these products have the potential to improve safety of flight and increase efficiency (e.g., facilitate routing and rerouting resulting in smaller flight track deviations and reduced fuel costs).

  14. Sea otter oil-spill mitigation study

    SciTech Connect

    Davis, R.W.; Thomas, J.; Williams, T.M.; Kastelein, R.; Cornell, L.

    1986-05-01

    The objective of the study was to analyze the effectiveness of existing capture, transport, cleaning, and rehabilitation methods and develop new methods to reduce the impact of an accidental oil spill to California sea otters, resulting from the present conditions or from future Outer Continental Shelf (OCS) oil and gas development in State or Federal waters. In addition, the study investigated whether or not a systematic difference in thermal conductivity existed between the pelts of Alaska and California Sea otters. This was done to assure that conclusions drawn from the oiling experiments carried out at Hubbs Marine Research Institute, Tetra Tech, Inc. contributed to the overall study by preparing a literature review and report on the fate and effects of oil dispersants and chemically dispersed oil.

  15. Hawaiian cultural influences on support for lava flow hazard mitigation measures during the January 1960 eruption of Kīlauea volcano, Kapoho, Hawai‘i

    USGS Publications Warehouse

    Gregg, Chris E.; Houghton, B.F.; Paton, Douglas; Swanson, D.A.; Lachman, R.; Bonk, W.J.

    2008-01-01

    On average, 72% of respondents favored the construction of earthen barriers to hold back or divert lava and protect Kapoho, but far fewer agreed with the military's use of bombs (14%) to protect Kapoho. In contrast, about one-third of respondents conditionally agreed with the use of bombs. It is suggested that local participation in the bombing strategy may explain the increased conditional acceptance of bombs as a mitigation tool, although this can not be conclusively demonstrated. Belief in Pele and being of Hawaiian ethnicity did not reduce support for the use of barriers, but did reduce support for bombs in both bombing scenarios. The disparity in levels of acceptance of barriers versus bombing and of one bombing strategy versus another suggests that historically public attitudes toward lava flow hazard mitigation strategies were complex. A modern comparative study is needed before the next damaging eruption to inform debates and decisions about whether or not to interfere with the flow of lava. Recent changes in the current eruption of Kīlauea make this a timely topic.

  16. The subsurface cross section resistivity using magnetotelluric method in Pelabuhan Ratu area, West Java, implication for geological hazard mitigation

    NASA Astrophysics Data System (ADS)

    Gaffar, Eddy Z.

    2016-02-01

    Pelabuhan Ratu area is located on the south coast of West Java. Pelabuhan Ratu area's rapid development and population growth were partly stimulated by the Indonesian Government Regulation No. 66 the year 1998 that made Pelabuhan Ratu the capital city of the district of Sukabumi. Because of this fact, it is very important to create a geological hazard mitigation plan for the area. Pelabuhan Ratu were passed by two major faults: Cimandiri fault in the western and Citarik fault in the eastern. Cimandiri fault starts from the upstream of Cimandiri River to the southern of Sukabumi and Cianjur city. While Citarik fault starts from the Citarik River until the Salak Mountain. These two faults needs to be observed closely as they are prone to cause earthquake in the area. To mitigate earthquake that is estimated will occur at Cimandiri fault or the Citarik fault, the Research Center for Geotechnology LIPI conducted research using Magnetotelluric (MT) method with artificial Phoenix MT tool to determine the cross-section resistivity of the Pelabuhan Ratu and the surrounding area. Measurements were taken at 40 points along the highway towards Jampang to Pelabuhan Ratu, and to Bandung towards Cibadak with a distance of less than 500 meters between the measuring points. Measurement results using this tool will generate AMT cross-section resistivity to a depth of 1500 meters below the surface. Cross-section resistivity measurement results showed that there was a layer of rock with about 10 Ohm-m to 1000 Ohm-m resistivity. Rocks with resistivity of 10 Ohm-m was interpreted as conductive rocks that were loose or sandstone containing water. If an earthquake to occur in this area, it will lead to a strong movement and liquefaction that will destroy buildings and potentially cause casualties in this area.

  17. Volcanic Ash Image Products from MODIS for Aviation Safety and Natural Hazard Mitigation

    NASA Astrophysics Data System (ADS)

    Stephens, G.; Ellrod, G. P.; Im, J.

    2003-12-01

    Multi-spectral volcanic ash image products have been developed using Moderate Resolution Imaging Spectroradiometer (MODIS) data from the NASA Terra spacecraft (Ellrod and Im 2003). Efforts are now underway to integrate these new products into the MODIS Data Retrieval System at NESDIS, for use in the operational Hazard Mapping System (HMS). The images will be used at the Washington Volcanic Ash Advisory Center (W-VAAC) in the issuance of volcanic ash advisory statements to aircraft. In addition, the images will be made available to users in the global volcano and emergency management community via the World Wide Web. During the development process, good results (high detection rate with low false alarms") were obtained from a tri-spectral combination of MODIS Infrared (IR) bands centered near 8.6, 11.0 and 12.0 Ym (Bands 29, 31, and 32). Optimum Red-Green-Blue false color composite images were developed to provide information on ash cloud location, as well as cloud phase and surface characteristics, to aid in interpretation both day and night. Information on volcanic ash derived from the tri-spectral product was displayed using the red color gun. This information was combined with visible (0.6 Ym) and near-IR (1.6 Ym) data for green and blue, respectively, during daylight periods. At night, the 8.6 V 11.0 Ym combination and 11.0 Ym band were used for the green and blue colors in the RGB product. Currently, raw MODIS data in five minute granules" are processed for the following regions: (1) southern Alaska, (2) Mexico, Central America and the Caribbean, and (3) northern Andes region of South America. Image products are converted to Geo-spatial Information System (GIS) compatible formats for use in the HMS, and to Man-Computer Interactive Data Access System (McIDAS) Area File" format for use in currently configured W-VAAC display systems. The installation of a high speed, fiber optic line from NASA Goddard Space Flight Center to the World Weather Building, Camp Springs, Maryland (scheduled for completion by Fall, 2003) will allow a full set of data to be processed from both Terra and Aqua spacecraft.

  18. Scientific Animations for Tsunami Hazard Mitigation: The Pacific Tsunami Warning Center's YouTube Channel

    NASA Astrophysics Data System (ADS)

    Becker, N. C.; Wang, D.; Shiro, B.; Ward, B.

    2013-12-01

    Outreach and education save lives, and the Pacific Tsunami Warning Center (PTWC) has a new tool--a YouTube Channel--to advance its mission to protect lives and property from dangerous tsunamis. Such outreach and education is critical for coastal populations nearest an earthquake since they may not get an official warning before a tsunami reaches them and will need to know what to do when they feel strong shaking. Those who live far enough away to receive useful official warnings and react to them, however, can also benefit from PTWC's education and outreach efforts. They can better understand a tsunami warning message when they receive one, can better understand the danger facing them, and can better anticipate how events will unfold while the warning is in effect. The same holds true for emergency managers, who have the authority to evacuate the public they serve, and for the news media, critical partners in disseminating tsunami hazard information. PTWC's YouTube channel supplements its formal outreach and education efforts by making its computer animations available 24/7 to anyone with an Internet connection. Though the YouTube channel is only a month old (as of August 2013), it should rapidly develop a large global audience since similar videos on PTWC's Facebook page have reached over 70,000 viewers during organized media events, while PTWC's official web page has received tens of millions of hits during damaging tsunamis. These animations are not mere cartoons but use scientific data and calculations to render graphical depictions of real-world phenomena as accurately as possible. This practice holds true whether the animation is a simple comparison of historic earthquake magnitudes or a complex simulation cycling through thousands of high-resolution data grids to render tsunami waves propagating across an entire ocean basin. PTWC's animations fall into two broad categories. The first group illustrates concepts about seismology and how it is critical to tsunami warning operations, such as those about earthquake magnitudes, how earthquakes are located, where and how often earthquakes occur, and fault rupture length. The second group uses the PTWC-developed tsunami forecast model, RIFT (Wang et al., 2012), to show how various historic tsunamis propagated through the world's oceans. These animations illustrate important concepts about tsunami behavior such as their speed, how they bend around and bounce off of seafloor features, how their wave heights vary from place to place and in time, and how their behavior is strongly influenced by the type of earthquake that generated them. PTWC's YouTube channel also includes an animation that simulates both seismic and tsunami phenomena together as they occurred for the 2011 Japan tsunami including actual sea-level measurements and proper timing for tsunami alert status, thus serving as a video 'time line' for that event and showing the time scales involved in tsunami warning operations. Finally, PTWC's scientists can use their YouTube channel to communicate with their colleagues in the research community by supplementing their peer-reviewed papers with video 'figures' (e.g., Wang et al., 2012).

  19. Public Policy Issues Associated with Tsunami Hazard Mitigation, Response and Recovery: Transferable Lessons from Recent Global Disasters

    NASA Astrophysics Data System (ADS)

    Johnson, L.

    2014-12-01

    Since 2004, a sequence of devastating tsunamis has taken the lives of more than 300,000 people worldwide. The path of destruction left by each is typically measured in hundreds of meters to a few kilometers and its breadth can extend for hundreds even thousands of kilometers, crossing towns and countries and even traversing an entire oceanic basin. Tsunami disasters in Indonesia, Chile, Japan and elsewhere have also shown that the almost binary nature of tsunami impacts can present some unique risk reduction, response, recovery and rebuilding challenges, with transferable lessons to other tsunami vulnerable coastal communities around the world. In particular, the trauma can motivate survivors to relocate homes, jobs, and even whole communities to safer ground, sometimes at tremendous social and financial costs. For governments, the level of concentrated devastation usually exceeds the local capacity to respond and thus requires complex inter-governmental arrangements with regional, national and even international partners to support the recovery of impacted communities, infrastructure and economies. Two parallel projects underway in California since 2011—the SAFRR (Science Application for Risk Reduction) tsunami scenario project and the California Tsunami Policy Working Group (CTPWG)—have worked to digest key lessons from recent tsunami disasters, with an emphasis on identifying gaps to be addressed in the current state and federal policy framework to enhance tsunami risk awareness, hazard mitigation, and response and recovery planning ahead of disaster and also improve post-disaster implementation practices following a future California or U.S. tsunami event.

  20. Conceptual Study on Air Ingress Mitigation for VHTRs

    SciTech Connect

    Chang H. Oh; Eung S. Kim

    2012-09-01

    An air-ingress accident followed by a pipe break is considered as a critical event for a very high temperature gas-cooled reactor (VHTR) safety. Following helium depressurization, it is anticipated that unless countermeasures are taken, air will enter the core through the break leading to oxidation of the in-core graphite structure. Thus, without mitigation features, this accident might lead to severe exothermic chemical reactions of graphite and oxygen depending on the accident scenario and the design. Under extreme circumstances, a loss of core structural integrity may occur and lead to a detrimental situation for the VHTR safety. This paper discusses various air-ingress mitigation concepts applicable for the VHTRs. The study begins with identifying important factors (or phenomena) associated with the air-ingress accident using a root-cause analysis. By preventing main causes of the important events identified in the root-cause diagram, the basic air-ingress mitigation ideas were conceptually developed. Among them, two concepts were finally evaluated as effective candidates. One concept is to inject helium into the lower plenum which is a direct in-vessel helium injection. The other concept is to enclose the reactor with a non-pressure boundary consisting of an opening at the bottom, which is an ex-vessel enclosure boundary. Computational fluid dynamics (CFD) methods were used to validate these concepts. As a result, it was shown that both concepts can effectively mitigate the air-ingress process. In the first concept, the injected helium replaces the air in the core and the lower plenum upper part by buoyancy force because of its low density. It prevented air from moving into the reactor core showing great potential for mitigating graphite oxidation in the core. In the second concept, the air-ingress rate is controlled by molecular diffusion through the opening at the enclosure bottom after depressurization. Some modified reactor cavity design is expected to play this role in the VHTRs.

  1. Probing Aircraft Flight Test Hazard Mitigation for the Alternative Fuel Effects on Contrails and Cruise Emissions (ACCESS) Research Team . Volume 2; Appendices

    NASA Technical Reports Server (NTRS)

    Kelly, Michael J.

    2013-01-01

    The Alternative Fuel Effects on Contrails and Cruise Emissions (ACCESS) Project Integration Manager requested in July 2012 that the NASA Engineering and Safety Center (NESC) form a team to independently assess aircraft structural failure hazards associated with the ACCESS experiment and to identify potential flight test hazard mitigations to ensure flight safety. The ACCESS Project Integration Manager subsequently requested that the assessment scope be focused predominantly on structural failure risks to the aircraft empennage (horizontal and vertical tail). This report contains the Appendices to Volume I.

  2. Multi-scale earthquake hazard and risk in the Chinese mainland and countermeasures for the preparedness, mitigation, and management: an overview

    NASA Astrophysics Data System (ADS)

    Wu, Z.; Jiang, C.; Ma, T.

    2012-12-01

    Earthquake hazard and risk in the Chinese mainland exhibit multi-scale characteristics. Temporal scales from centuries to months, spatial scales from the whole mainland to specific engineering structures, and energy scales from great disastrous earthquakes to small earthquakes causing social disturbance and economic loss, feature the complexity of earthquake disasters. Coping with such complex challenge, several research and application projects have been undertaken since recent years. Lessons and experiences of the 2008 Wenchuan earthquake contributed much to the launching and conducting of these projects. Understandings of the scientific problems and technical approaches taken in the mainstream studies in the Chinese mainland have no significant difference from those in the international scientific communities, albeit using of some of the terminologies have "cultural differences" - for instance, in the China Earthquake Administration (CEA), the terminology "earthquake forecast/prediction (study)" is generally used in a much broader sense, mainly indicating time-dependent seismic hazard at different spatio-temporal scales. Several scientific products have been produced serving the society in different forms. These scientific products have unique academic merits due to the long-term persistence feature and the forward forecast nature, which are all essential for the evaluation of the technical performance and the falsification of the scientific ideas. On the other hand, using the language of the "actor network theory (ANT)" in science studies (or the sociology of science), at present, the hierarchical "actors' network", making the science transformed to the actions of the public and government for the preparedness, mitigation, and management of multi-scale earthquake disasters, is still in need of careful construction and improvement.

  3. RAGE Hydrocode Modeling of Asteroid Mitigation: new simulations with parametric studies for uncertainty quantification

    NASA Astrophysics Data System (ADS)

    Weaver, R.; Plesko, C. S.; Gisler, G. R.

    2013-12-01

    We are performing detailed hydrodynamic simulations of the interaction from a strong explosion with sample Asteroid objects. The purpose of these simulations is to apply modern hydrodynamic codes that have been well verified and validated (V&V) to the problem of mitigating the hazard from a potentially hazardous object (PHO), an asteroid or comet that is on an Earth crossing orbit. The code we use for these simulations is the RAGE code from Los Alamos National Laboratory [1-6]. Initial runs were performed using a spherical object. Next we ran simulations using the shape form from a known asteroid: 25143 Itokawa. This particular asteroid is not a PHO but we use its shape to consider the influence of non-spherical objects. The initial work was performed using 2D cylindrically symmetric simulations and simple geometries. We then performed a major fully 3D simulation. For an Itokawa size object (~500 m) and an explosion energies ranging from 0.5 - 1 megatons, the velocities imparted to all of the PHO "rocks" in all cases were many m/s. The velocities calculated were much larger than escape velocity and would preclude re-assembly of the fragments. The dispersion of the asteroid remnants is very directional from a surface burst, with all fragments moving away from the point of the explosion. This detail can be used to time the intercept for maximum movement off the original orbit. Results from these previous studies will be summarized for background. In the new work presented here we show a variety of parametric studies around these initial simulations. We modified the explosion energy by +/- 20% and varied the internal composition from a few large "rocks" to several hundred smaller rocks. The results of these parametric studies will be presented. We have also extended our work [6],[7] to stand-off nuclear bursts and will present the initial results for the energy deposition by a generic source into the non-uniform composition asteroid. The goal of this new work is to obtain an "optimal stand-off" distance from detailed radiation transport-hydrodynamic simulations from generic explosion properties. The initial results of these two studies will also be presented. References [1] Gitting, Weaver et al 'The RAGE radiation-hydrodynamics Code,' Comp. Sci. Disc. 1 (2008) 015005 November 21, 2008 [2] Huebner, W.F. et al, 'The Engagement Space for Countermeasures Against Potentially Hazardous Objects (PHOs),' International Conference in Asteroid and Comet Hazards, 2009 held at the Russian Academy of Sciences, St. Petersburg 21-25-September 2009. [3] Gisler, Weaver, Mader, & Gittings, Two and three dimensional asteroid impact simulations, Computing in Science & Engineering, 6, 38 (2004). [4] NASA geometry courtesy of S.J. Osto et al. (2002) in Asteroids Book 3 [5] Itokawa image courtesy of JAXA: http://www.isas.jaxa.jp/e/snews/2005/1102.shtml#pic001 [6] Plesko, C et al "Looking Before we Leap: Recent Results from an Ongoing, Quantitative Investigation of Asteroid and Comet Impact Hazard Mitigation" Division of Planetary Sciences 2010. [7] Plesko, C et. al. "Numerical Models of Asteroid and Comet Impact Hazard Mitigation by Nuclear Stand-Off Burst." Planetary Defense Conference 2011.

  4. Energy Deposition in Hazard Mitigation by Nuclear Burst: Sensitivity to Energy Source Characteristics, Geometry, and Target Composition

    NASA Astrophysics Data System (ADS)

    Plesko, C. S.; Weaver, R. P.; Huebner, W. F.

    2011-03-01

    We present hydrocode and particle transport code models of energy deposition from nuclear bursts onto materials relevant to PHO mitigation. We find that momentum transfer is affected by burst geometry and PHO composition.

  5. A design study on complexity reduced multipath mitigation

    NASA Astrophysics Data System (ADS)

    Wasenmüller, U.; Brack, T.; Groh, I.; Staudinger, E.; Sand, S.; Wehn, N.

    2012-09-01

    Global navigation satellite systems, e.g. the current GPS and the future European Galileo system, are frequently used in car navigation systems or smart phones to determine the position of a user. The calculation of the mobile position is based on the signal propagation times between the satellites and the mobile terminal. At least four time of arrival (TOA) measurements from four different satellites are required to resolve the position uniquely. Further, the satellites need to be line-of-sight to the receiver for exact position calculation. However, in an urban area, the direct path may be blocked and the resulting multipath propagation causes errors in the order of tens of meters for each measurement. and in the case of non-line-of-sight (NLOS), positive errors in the order of hundreds of meters. In this paper an advanced algorithm for multipath mitigation known as CRMM is presented. CRMM features reduced algorithmic complexity and superior performance in comparison with other state of the art multipath mitigation algorithms. Simulation results demonstrate the significant improvements in position calculation in environments with severe multipath propagation. Nevertheless, in relation to traditional algorithms an increased effort is required for real-time signal processing due to the large amount of data, which has to be processed in parallel. Based on CRMM, we performed a comprehensive design study including a design space exploration for the tracking unit hardware part, and prototype implementation for hardware complexity estimation.

  6. United States studies in orbital debris - Prevention and mitigation

    NASA Technical Reports Server (NTRS)

    Loftus, Joseph P., Jr.; Potter, Andrew E.

    1990-01-01

    Debris in space has become an issue that has commanded considerable interest in recent years as society has become both more dependent upon space based systems, and more aware of its dependence. After many years of study the United States Space Policy of February 1988 directed that all sectors of the U.S. community minimize space debris. Other space organizations have adopted similar policies. Among the study activities leading to the policy and to subsequent implementing directives were discussions with the ESA, NASDA, and other space operating agencies. The policy derived from technical consensus on the nature of the issues and upon the courses of action available to mitigate the problem, but there remains the concern as to the adequacy of the data to define cost effective strategies. There are now in place mechanisms to continue technical discussions in more formal terms.

  7. Natural Hazard Mitigation thru Water Augmentation Strategies to Provide Additional Snow Pack for Water Supply and Hydropower Generation in Drought Stressed Alps/Mountains

    NASA Astrophysics Data System (ADS)

    Matthews, D.; Brilly, M.

    2009-12-01

    Climate variability and change are clearly stressing water supplies in high alpine regions of the Earth. These recent long-term natural hazards present critical challenges to policy makers and water managers. This paper addresses strategies to use enhanced scientific methods to mitigate the problem. Recent rapid depletions of glaciers and intense droughts throughout the world have created a need to reexamine modern water augmentation technologies for enhancing snow pack in mountainous regions. Today’s reliance on clean efficient hydroelectric power in the Alps and the Rocky Mountains poses a critical need for sustainable snow packs and high elevation water supplies through out the year. Hence, the need to make natural cloud systems more efficient precipitators during the cold season through anthropogenic weather modification techniques. The Bureau of Reclamation, US Department of the Interior, has spent over $39M in research from 1963 to 1990 to develop the scientific basis for snow pack augmentation in the headwaters of the Colorado, American, and Columbia River Basins in the western United States, and through USAID in Morocco in the High Atlas Mountains. This paper presents a brief summary of the research findings and shows that even during drought conditions potential exists for significant, cost-effective enhancement of water supplies. Examples of ground based propane and AgI seeding generators, cloud physics studies of supercooled cloud droplets and ice crystal characteristics that indicate seeding potential will be shown. Hypothetical analyses of seeding potential in 17 western states from Montana to California will be presented based on observed SNOTEL snow water equivalent measurements, and distributed by elevation and observed winter precipitation. Early studies indicated from 5 to 20% increases in snow pack were possible, if winter storm systems were seeded effectively. If this potential was realized in drought conditions observed in 2003, over 1.08 million acre feet (1.33 x 10**9 m3) of additional water could be captured by seeding efficiently and effectively in just 10 storms. Recent projects sponsored by the National Science Foundation, NOAA, and the States of Wyoming, Utah and Nevada, and conducted by the National Center for Atmospheric Research will be discussed briefly. Examples of conditions in extreme droughts of the Western United States will be presented that show potential to mitigate droughts in these regions through cloud seeding. Implications for American and European hydropower generation and sustainable water supplies will be discussed.

  8. Mitigating Resistance to Teaching Science Through Inquiry: Studying Self

    NASA Astrophysics Data System (ADS)

    Spector, Barbara; Burkett, Ruth S.; Leard, Cyndy

    2007-04-01

    This is the report of a qualitative emergent-design study of 2 different Web-enhanced science methods courses for preservice elementary teachers in which an experiential learning strategy, labeled “using yourself as a learning laboratory,” was implemented. Emergent grounded theory indicated this strategy, when embedded in a course organized as an inquiry with specified action foci, contributed to mitigating participants’ resistance to learning and teaching through inquiry. Enroute to embracing inquiry, learners experienced stages resembling the stages of grief one experiences after a major loss. Data sources included participant observation, electronic artifacts in WebCT, and interviews. Findings are reported in 3 major sections: “Action Foci Common to Both Courses,” “Participants’ Growth and Change,” and “Challenges and Tradeoffs.”

  9. Airflow Hazard Visualization for Helicopter Pilots: Flight Simulation Study Results

    NASA Technical Reports Server (NTRS)

    Aragon, Cecilia R.; Long, Kurtis R.

    2005-01-01

    Airflow hazards such as vortices or low level wind shear have been identified as a primary contributing factor in many helicopter accidents. US Navy ships generate airwakes over their decks, creating potentially hazardous conditions for shipboard rotorcraft launch and recovery. Recent sensor developments may enable the delivery of airwake data to the cockpit, where visualizing the hazard data may improve safety and possibly extend ship/helicopter operational envelopes. A prototype flight-deck airflow hazard visualization system was implemented on a high-fidelity rotorcraft flight dynamics simulator. Experienced helicopter pilots, including pilots from all five branches of the military, participated in a usability study of the system. Data was collected both objectively from the simulator and subjectively from post-test questionnaires. Results of the data analysis are presented, demonstrating a reduction in crash rate and other trends that illustrate the potential of airflow hazard visualization to improve flight safety.

  10. AMENDING SOILS WITH PHOSPHATE AS MEANS TO MITIGATE SOIL LEAD HAZARD: A CRITICAL REVIEW OF THE STATE OF THE SCIENCE

    EPA Science Inventory

    Ingested soil and surface dust may be important contributors to elevated blood lead (Pb) levels in children exposed to Pb contaminated environments. Mitigation strategies have typically focused on excavation and removal of the contaminated soil. However, this is not always feas...

  11. Peru mitigation assessment of greenhouse gases: Sector -- Energy. Peru climate change country study; Final report

    SciTech Connect

    1996-08-01

    The aim of this study is to determine the Inventory and propose Greenhouse Gases Mitigation alternatives in order to face the future development of the country in a clean environmental setting without delaying the development process required to improve Peruvian standard of living. The main idea of this executive abstract is to show concisely the results of the Greenhouse Gases Mitigation for Peru in the period 1990--2015. The studies about mitigation for the Energy Sector are shown in this summary.

  12. From structural investigation towards multi-parameter early warning systems: geophysical contributions to hazard mitigation at the landslide of Gschliefgraben (Gmunden, Upper Austria)

    NASA Astrophysics Data System (ADS)

    Supper, Robert; Baron, Ivo; Jochum, Birgit; Ita, Anna; Winkler, Edmund; Motschka, Klaus; Moser, Günter

    2010-05-01

    In December 2007 the large landslide system inside the Gschliefgraben valley (located at the east edge of the Traun lake, Upper Austria), known over centuries for its repeated activity, was reactivated. Although a hazard zone map was already set up in 1974, giving rise to a complete prohibition on building, some hundreds of people are living on the alluvial fan close to the lake. Consequently, in frame of the first emergency measures, 55 building had to be evacuated. Within the first phase of mitigation, measures were focused on property and infrastructure protection. Around 220 wells and one deep channel were implemented to drain the sliding mass. Additionally a big quantity of sliding material was removed close to the inhabited areas. Differential GPS and water level measurements were performed to evaluate the effectiveness of the measures, which led to a significant slowdown of the movement. Soon after the suspension of the evacuation several investigations, including drilling, borehole logging and complex geophysical measurements were performed to investigate the structure of the landslide area in order to evaluate maximum hazard scenarios as a basis for planning further measures. Based on these results, measuring techniques for an adapted, future early warning system are currently being tested. This emergency system should enable local stakeholders to take appropriate and timely measures in case of a future event thus lessening the impact of a future disaster significantly. Within this tree-step-plan the application of geophysical methodologies was an integral part of the research and could considerably contribute to the success. Several innovative approaches were implemented which will be described in more detail within the talk. Airborne multi-sensor geophysical surveying is one of new and progressive approaches which can remarkably contribute to effectively analyse triggering processes of large landslides and to better predict their hazard. It was tested in Gschliefgraben earthflow and landslide complex in September 2009. Several parameters, such as vegetation thickness, soil moisture, potassium and thorium content (gamma ray) or four layer resistivity were the principal studied parameters. These parameters were compared with the landslide inventory map of Gschliefgraben developed from differential airborne laser scan terrain models. Since mass wasting is usually triggered by rising water pore pressure due to heavy rainfall or seismic tremors, often supported by changes in the shape, structure, and hydrology of a slope or vegetation cover. As the electrical resistivity of the subsurface mainly depends on porosity, saturation, pore fluid conductivity and clay content, the geoelectric method is a reliable method to investigate the structure of the landslide and surrounding and could be an emerging tool for observing those triggering factors. Therefore, first a multi-electrode geoelectrical survey was performed in a broader area of the active earthflow to verify the subsurface structure and to optimise the location for a monitoring system, followed by the installacion of the geoelectric monitoring system Geomon4D in September 2009. The monitoring profiles were complemented by an automatic DMS inclinometer to correlate measured resistivity values with displacement rates. Since the installation, the system works continuously and data is processed on a daily basis at the monitoring centre in Vienna. These works were supported by the 7th FP project "Safeland - Living with the landslide risk in Europe".

  13. Integrated Data Products to Forecast, Mitigate, and Educate for Natural Hazard Events Based on Recent and Historical Observations

    NASA Astrophysics Data System (ADS)

    McCullough, H. L.; Dunbar, P. K.; Varner, J. D.

    2011-12-01

    Immediately following a damaging or fatal natural hazard event there is interest to access authoritative data and information. The National Geophysical Data Center (NGDC) maintains and archives a comprehensive collection of natural hazards data. The NGDC global historic event database includes all tsunami events, regardless of intensity, as well as earthquakes and volcanic eruptions that caused fatalities, moderate damage, or generated a tsunami. Examining the past record provides clues to what might happen in the future. NGDC also archives tide gauge data from stations operated by the NOAA/NOS Center for Operational Oceanographic Products and Services and the NOAA Tsunami Warning Centers. In addition to the tide gauge data, NGDC preserves deep-ocean water-level, 15-second sampled data as collected by the Deep-ocean Assessment and Reporting of Tsunami (DART) buoys. Water-level data provide evidence of sea-level fluctuation and possible inundation events. NGDC houses an extensive collection of geologic hazards photographs available online as digital images. Visual media provide invaluable pre- and post-event data for natural hazards. Images can be used to illustrate inundation and possible damage or effects. These images are organized by event or hazard type (earthquake, volcano, tsunami, landslide, etc.), along with description and location. They may be viewed via interactive online maps and are integrated with historic event details. The planning required to achieve collection and dissemination of hazard event data is extensive. After a damaging or fatal event, NGDC begins to collect and integrate data and information from many people and organizations into the hazards databases. Sources of data include the U.S. NOAA Tsunami Warning Centers, the U.S. Geological Survey, the U.S. NOAA National Data Buoy Center, the UNESCO Intergovernmental Oceanographic Commission (IOC), Smithsonian Institution's Global Volcanism Program, news organizations, etc. NGDC then works to promptly distribute data and information for the appropriate audience. For example, when a major tsunami occurs, all of the related tsunami data are combined into one timely resource. NGDC posts a publicly accessible online report which includes: 1) event summary; 2) eyewitness and instrumental recordings from preliminary field surveys; 3) regional historical observations including similar past events and effects; 4) observed water heights and calculated tsunami travel times; and 5) near-field effects. This report is regularly updated to incorporate the most recent news and observations. Providing timely access to authoritative data and information ultimately benefits researchers, state officials, the media and the public.

  14. Mitigation of hazards from future lahars from Mount Merapi in the Krasak River channel near Yogyakarta, central Java

    USGS Publications Warehouse

    Ege, John R.; Sutikno

    1983-01-01

    Procedures for reducing hazards from future lahars and debris flows in the Krasak River channel near Yogyakarta, Central Java, Indonesia, include (1) determining the history of the location, size, and effects of previous lahars and debris flows, and (2) decreasing flow velocities. The first may be accomplished by geologic field mapping along with acquiring information by interviewing local residents, and the second by increasing the cross sectional area of the river channel and constructing barriers in the flow path.

  15. Preliminary study on the transport of hazardous materials through tunnels.

    PubMed

    Bubbico, Roberto; Di Cave, Sergio; Mazzarotta, Barbara; Silvetti, Barbara

    2009-11-01

    The risk associated to road and rail transportation of some hazardous materials along two routes, one including a significant portion in tunnels, and the other following the same path, but running completely in the open, is assessed. The results show that, for rail transport, no particular risk increase or mitigation is associated to the circulation of the dangerous goods through tunnels; on the contrary, for road transport, a risk increase is generally observed in the presence of tunnels. However, for LPG, the risk curve in the open lies above that in tunnels in the high frequency-low fatality zone, according to the different evolution of the accidental scenarios in the tunnel (assuming no ventilation). The transportation of liquefied nitrogen, not hazardous in the open but potentially asphyxiating in a tunnel, gives rise to a negligible risk when performed by rail, but to a not negligible one, when performed by road. These preliminary results focused on the risk for the exposed population, suggest that it may be unnecessary to limit dangerous goods circulation through rail tunnels, while, at least for some types of dangerous goods, the circulation through road tunnels may be allowed/forbidden based on the results of a specific risk analysis. PMID:19819368

  16. 3D modelling of Mt. Talaga Bodas Crater (Indonesia) by using terrestrial laser scanner for volcano hazard mitigation

    NASA Astrophysics Data System (ADS)

    Gumilar, Irwan; Abidin, Hasanuddin Z.; Putra, Andreas D.; Haerani, Nia

    2015-04-01

    Indonesia is a country with many volcanoes. Each volcano in Indonesia typically has its own crater characteristics. One of them is the Mt.Talaga Bodas, located in Garut, West Java. Researches regarding the crater characteristics are necessary for volcanic disaster mitigation process. One of them is the modelling of the shape of the crater. One of the methods that can be used to model the volcanic crater is using Terrestrial Laser Scanner (TLS). This research aims to create a 3 dimensional (3D) model of the crater of the Mt. Talaga Bodas, that hopefully can be utilized for volcanic disaster mitigation. The methodology used in this research is by obtaining the scanning data using TLS and GPS measurements to obtain the coordinates of the reference points. The data processing methods consist of several steps, namely target to target registration, filterization, georeference, meshing point cloud, surface making, drawing, and 3D modelling. These steps were done using the Cyclone 7 software, and also using 3DS MAX for 3D modelling. The result of this data processing is a 3D model of the crater of the Mt. Talaga Bodas which is similar with the real shape. The calculation result shows that the height of the crater is 62.522 m, the diameter of the crater is 467.231 m, and the total area is 2961054.652 m2. The main obstacle in this research is the dense vegetation which becomes the noise and affects the crater model.

  17. Experimental study designs to improve the evaluation of road mitigation measures for wildlife.

    PubMed

    Rytwinski, Trina; van der Ree, Rodney; Cunnington, Glenn M; Fahrig, Lenore; Findlay, C Scott; Houlahan, Jeff; Jaeger, Jochen A G; Soanes, Kylie; van der Grift, Edgar A

    2015-05-01

    An experimental approach to road mitigation that maximizes inferential power is essential to ensure that mitigation is both ecologically-effective and cost-effective. Here, we set out the need for and standards of using an experimental approach to road mitigation, in order to improve knowledge of the influence of mitigation measures on wildlife populations. We point out two key areas that need to be considered when conducting mitigation experiments. First, researchers need to get involved at the earliest stage of the road or mitigation project to ensure the necessary planning and funds are available for conducting a high quality experiment. Second, experimentation will generate new knowledge about the parameters that influence mitigation effectiveness, which ultimately allows better prediction for future road mitigation projects. We identify seven key questions about mitigation structures (i.e., wildlife crossing structures and fencing) that remain largely or entirely unanswered at the population-level: (1) Does a given crossing structure work? What type and size of crossing structures should we use? (2) How many crossing structures should we build? (3) Is it more effective to install a small number of large-sized crossing structures or a large number of small-sized crossing structures? (4) How much barrier fencing is needed for a given length of road? (5) Do we need funnel fencing to lead animals to crossing structures, and how long does such fencing have to be? (6) How should we manage/manipulate the environment in the area around the crossing structures and fencing? (7) Where should we place crossing structures and barrier fencing? We provide experimental approaches to answering each of them using example Before-After-Control-Impact (BACI) study designs for two stages in the road/mitigation project where researchers may become involved: (1) at the beginning of a road/mitigation project, and (2) after the mitigation has been constructed; highlighting real case studies when available. PMID:25704749

  18. Change of seismic process under irradiation of the crust by electromagnetic discharge and the problem of seismic hazard mitigation.

    NASA Astrophysics Data System (ADS)

    Velikhov, E. P.; Gliko, A. O.; Tarasov, N. T.

    2008-12-01

    The effect of high-energy electromagnetic pulses emitted by a magnetohydrodynamic generator used as a source for deep electrical sounding of the crust on spatial-temporal structure of seismicity is explored. It has been shown that 2-6 days after effect of electromagnetic pulses there is statistically significant activation of relatively small earthquakes in the regions of study. The total seismic energy of initiated earthquakes is five- six orders of magnitude higher than the energy transmitted by the generator to the radiating dipole. The spatial correlation of earthquake density for each two contiguous years as a function of time was studied. Five-six years periodicity of changes in spatial distribution of seismicity was detected. It was shown that the effect of electromagnetic pulses increases the stability of the spatial distribution of earthquakes over time and simultaneously speeds up cycles of its transformation, which develop on stabilization background. The time change of the fractal dimension of the spatial epicenter's distribution was investigated too. It was found that the fractal dimensions fluctuates about a level 1.48 for the time period before the generator experiments. The beginning of the regular irradiation of the are under study by electromagnetic pulses is followed by gradual falloff in values of fractal dimension inside near field area of the radiating dipole. The decrease of the former is observed during all period of conducting of the electrical sounding. Therefore, the effect of electromagnetic pulses causes the increase of the spatial clustering of the earthquakes. Then fractal dimension gradually increases to a background level. Note that the time interval when the values of the fractal dimension were less that the background level, coincides with the time period in which increasing the stability of the spatial distribution of seismicity over time was observed. The results of the study show that action of high energy electromagnetic discharges radiated by the generators causes substantial changes of spatial temporal property of the seismic process of earthquake source zones and accelerates the release of energy stored in the crust due to the activity of natural tectonic processes, thus serving as a kind of trigger. Based on the energy balance assumption for the crust, we conclude that a man-made increase in part of the seismic energy radiated in the form of flow of relatively small earthquakes leads to an additional release of tectonic stresses, thereby diminishing the likelihood of catastrophic events (or at any rate, reduces the energy of such events). If this assumption will be confirmed by further investigations, the detected effect could be used to develop a technique of decrease of seismic hazard by artificial discharge of tectonic stress by actions of electromagnetic pulses on the crust in the earthquake source zones.

  19. Odor mitigation with vegetative buffers: Swine production case study

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Vegetative environmental buffers (VEB) are a potentially low cost sustainable odor mitigation strategy, but there is little to no data supporting their effectiveness. Wind tunnel experiments and field monitoring were used to determine the effect VEB had on wind flow patterns within a swine facility....

  20. Odor Mitigation with Tree Buffers: Swine Production Case Study

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Tree buffers are a potential low cost sustainable odor mitigation strategy, but there is little to no data on their effectiveness. Odor transport is thought to occur one of two ways either directly through vapor phase transport or indirectly through sorption onto particles. Consequently, monitoring...

  1. Prevalence and predictors of residential health hazards: a pilot study.

    PubMed

    Klitzman, Susan; Caravanos, Jack; Deitcher, Deborah; Rothenberg, Laura; Belanoff, Candice; Kramer, Rachel; Cohen, Louise

    2005-06-01

    This article reports the results of a pilot study designed to ascertain the prevalence of lead-based paint (LBP), vermin, mold, and safety conditions and hazards and to validate observations and self-reports against environmental sampling data. Data are based on a convenience sample of 70 dwellings in a low-income, urban neighborhood in Brooklyn, New York. The vast majority of residences (96%) contained multiple conditions and/or hazards: LBP hazards (80%), vermin (79%), elevated levels of airborne mold (39%), and safety hazards (100%). Observations and occupant reports were associated with environmental sampling data. In general, the more proximate an observed condition was to an actual hazard, the more likely it was to be associated with environmental sampling results (e.g., peeling LBP was associated with windowsill dust lead levels, and cockroach sightings by tenants were associated with Blatella germanica [Bla g 1] levels). Conversely, the more distal an observed condition was to an actual hazard, the less likely it was to be associated with environmental sampling results (e.g., water damage, alone, was not statistically associated with elevated levels of dust lead, Bla g 1, or airborne mold). Based on the findings from this pilot study, there is a need for industrial hygienists and others to adopt more comprehensive and integrative approaches to residential hazard assessment and remediation. Further research--using larger, randomly drawn samples, representing a range of housing types and geographical areas--is needed to clarify the relationship between readily observable conditions, occupant reports, and environmental sampling data and to assess the cumulative impact on human health. PMID:16020089

  2. Using fine-scale fuel measurements to assess wildland fuels, potential fire behavior and hazard mitigation treatments in the southeastern USA.

    SciTech Connect

    Ottmar, Roger, D.; Blake, John, I.; Crolly, William, T.

    2012-01-01

    The inherent spatial and temporal heterogeneity of fuelbeds in forests of the southeastern United States may require fine scale fuel measurements for providing reliable fire hazard and fuel treatment effectiveness estimates. In a series of five papers, an intensive, fine scale fuel inventory from the Savanna River Site in the southeastern United States is used for building fuelbeds and mapping fire behavior potential, evaluating fuel treatment options for effectiveness, and providing a comparative analysis of landscape modeled fire behavior using three different data sources including the Fuel Characteristic Classification System, LANDFIRE, and the Southern Wildfire Risk Assessment. The research demonstrates that fine scale fuel measurements associated with fuel inventories repeated over time can be used to assess broad scale wildland fire potential and hazard mitigation treatment effectiveness in the southeastern USA and similar fire prone regions. Additional investigations will be needed to modify and improve these processes and capture the true potential of these fine scale data sets for fire and fuel management planning.

  3. Remote sensing applied in natural hazards mitigation - experiences from the international UNESCO/IUGS GARS-Program 1984 - 2002

    NASA Astrophysics Data System (ADS)

    Bannert, D.

    Worldwide resources of arable land, water, groundwater, forest and expanding human habitat are under increasing pressure almost anywhere. Especially the non- industrialised countries with their rapidly increasing population are facing severe problems from natural catastrophes such as landslides, volcanic and seismic hazards, soil degradation and shortage of water or flooding. Geo-environmental research can help to identify the causes for these events, define the rehabilitation and can lead to early warning systems. Remote sensing adds considerable knowledge by providing a wide variety of sensors applied form airborne and space platforms, the data of which, once analysed, can provide completely new observations on natural risk areas. The UNESCO/IUGS sponsored GARS Program since 1984 is conducting- joint research with institutions in industrialised and developing countries. As of today, more than 40 institutes and individuals worldwide have joined the GARS- Program. Results of their research are among others contributions toLandslide assessment qVolcanic risk qCoastal hazards qDesertification processes q Space organisations and financing institutions serving developing nations are requested to help to deploy new sensors to monitor geo-dynamic processes, providing free and direct data reception in all parts of the world in order to allow national institutes to develop their own early warning capabilities.

  4. Hazard & Operability Study for Removal of Spent Nuclear Fuel from the 324 Building

    SciTech Connect

    VAN KEUREN, J.C.

    2002-05-07

    A hazard and operability (HAZOP) study was conducted to examine the hazards associated with the removal of the spent nuclear fuel from the 324 Building. Fifty-nine potentially hazardous conditions were identified.

  5. Leak detection, monitoring, and mitigation technology trade study update

    SciTech Connect

    HERTZEL, J.S.

    1998-11-10

    This document is a revision and update to the initial report that describes various leak detection, monitoring, and mitigation (LDMM) technologies that can be used to support the retrieval of waste from the single-shell tanks (SST) at the Hanford Site. This revision focuses on the improvements in the technical performance of previously identified and useful technologies, and it introduces new technologies that might prove to be useful.

  6. Hazards analysis and prediction from remote sensing and GIS using spatial data mining and knowledge discovery: a case study for landslide hazard zonation

    NASA Astrophysics Data System (ADS)

    Hsu, Pai-Hui; Su, Wen-Ray; Chang, Chy-Chang

    2011-11-01

    Due to the particular geographical location and geological condition, Taiwan suffers from many natural hazards which often cause series property damages and life losses. To reduce the damages and casualty, an effective real-time system for hazard prediction and mitigation is necessary. In this study, a case study for Landslide Hazard Zonation (LHZ) is tested in accordance with Spatial Data Mining and Knowledge Discovery (SDMKD) from database. Many different kinds of geospatial data, such as the terrain elevation, land cover types, the distance to roads and rivers, geology maps, NDVI, and monitoring rainfall data etc., are collected into the database for SDMKD. In order to guarantee the data quality, the spatial data cleaning is essential to remove the noises, errors, outliers, and inconsistency hiding in the input spatial data sets. In this paper, the Kriging interpolation is used to calibrate the QPESUMS rainfall data to the rainfall observations from rain gauge stations to remove the data inconsistency. After the data cleaning, the artificial neural networks (ANNs) is applied to generate the LHZ map throughout the test area. The experiment results show that the accuracy of LHZ is about 92.3% with the ANNs analysis, and the landslides induced by heavy-rainfall can be mapped efficiently from remotely sensed images and geospatial data using SDMKD technologies.

  7. Remote Sensing Applied in Natural Hazards Mitigation Experiences from the International UNESCO/IUGS Gars-Program 1984- 2002

    NASA Astrophysics Data System (ADS)

    Bannert, D.

    The Geological Application of Remote Sensing-Program (GARS-Program) has since 1984 devoted its efforts towards the application of remote sensing from aircraft and space platforms with the understanding that it adds considerably to the knowledge of geo-dynamic processes in the widest range. Remote sensing provides a large variety of sensors, the data of which, once analysed, can provide completely new observations on areas threatened by natural hazards. UNESCO and IUGS through the GARS-Program provide a forum, where remote sensing techniques are continuously scrutinised concerning their geological application. Especially their potential in assessing geo-environmental issues under various geological, climatic and morphological conditions is considered. In the past, international co-operation projects in developing countries have been carried out under the GARS-Program, addressing the following subjects: Currently, the GARS-Program is strongly involved in the IGOS Geohazards Theme Working Group and in the UNESCO / IAH Middle East Transboundary Aquifer Initiative, as well as numerous individual projects by member institutes. There are more than 40 institutes co- operating world-wide under the GARS-Program. Space organisations and financing institutions serving developing nations are requested to help to deploy new sensors to monitor geo-dynamic processes, providing free and direct data reception in all parts of the world in order to allow national institutes to develop their own early warning capabilities.

  8. S. 353: A bill to require the Director of the National Institute for Occupational Safety and Health to conduct a study of the prevalence and issues related to contamination of workers' homes with hazardous chemicals and substances transported from their workplace and to issue or report on regulations to prevent or mitigate the future contamination of workers' homes, and for other purposes, introduced in the United States Senate, One Hundred Second Congress, First Session, November 27, 1991

    SciTech Connect

    Not Available

    1991-01-01

    This bill was introduced into the Senate of the United States on February 5, 1991 to require NIOSH to conduct a study concerning the contamination of worker's homes with hazardous materials transported from their workplace and to issue or report on regulations to prevent future contamination. These hazardous chemicals and substances are being transported out of industries on worker's clothing and pose a threat to the health and welfare of workers and their families. Separate sections address the following: evaluation of employee transported contaminant releases; regulations or standards; and authorization of appropriations.

  9. Monitoring volcanic activity with satellite remote sensing to reduce aviation hazard and mitigate the risk: application to the North Pacific

    NASA Astrophysics Data System (ADS)

    Webley, P. W.; Dehn, J.

    2012-12-01

    Volcanic activity across the North Pacific (NOPAC) occurs on a daily basis and as such monitoring needs to occur on a 24 hour, 365 days a year basis. The risk to the local population and aviation traffic is too high for this not to happen. Given the size and remoteness of the NOPAC region, satellite remote sensing has become an invaluable tool to monitor the ground activity from the regions volcanoes as well as observe, detect and analyze the volcanic ash clouds that transverse across the Pacific. Here, we describe the satellite data collection, data analysis, real-time alert/alarm systems, observational database and nearly 20-year archive of both automated and manual observations of volcanic activity. We provide examples of where satellite remote sensing has detected precursory activity at volcanoes, prior to the volcanic eruption, as well as different types of eruptive behavior that can be inferred from the time series data. Additionally, we illustrate how the remote sensing data be used to detect volcanic ash in the atmosphere, with some of the pro's and con's to the method as applied to the NOPAC, and how the data can be used with other volcano monitoring techniques, such as seismic monitoring and infrasound, to provide a more complete understanding of a volcanoes behavior. We focus on several large volcanic events across the region, since our archive started in 1993, and show how the system can detect both these large scale events as well as the smaller in size but higher in frequency type events. It's all about how to reduce the risk, improve scenario planning and situational awareness and at the same time providing the best and most reliable hazard assessment from any volcanic activity.

  10. Deepwater Gulf of Mexico Shallow Hazards Studies Best Practices

    NASA Astrophysics Data System (ADS)

    Fernandez, M.; Hobbs, B.

    2005-05-01

    ConocoPhillips (hConoco) has been involved in deepwater exploration in the Gulf of Mexico for the last 5 years using a dynamically positioned (DP) drillship. As part of the Federal (MMS) and State permitting process for deepwater exploration, ConocoPhillips (COPC) actively undertakes in securing seabed and shallow subsurface hazard surveys and analyses for every potential drillsite. COPC conducts seabed and shallow subsurface hazards surveys for at least two main reasons: To be a safe, efficient operator, seabed and shallow subsurface hazard surveys and analyses are necessary steps of the Exploration Work Flow to help ensure a safe well, and to fulfill MMS (or local government) regulatory requirements The purpose of shallow geohazards studies is to determine seafloor and sub-bottom conditions, inspect for possible chemosynthetic communities, and to provide a shallow hazards assessment in accordance with NTL 2003-G17. During the five years of deepwater exploration COPC has contracted Fugro Geoservices to perform hazards studies in over 30 offshore blocks. Deepwater Gulf of Mexico Shallow Hazards Studies Best Practices The results of the seabed and shallow geohazards are a critical part of the construction of all of our well plans and are dynamically used in all MDT's. The results of the seabed and shallow geohazards investigations have greatly improved our drilling efficiency by predicting and avoiding possible chemosynthetic communities, sea floor faults, shallow gas, and shallow water flow. CoP's outstanding safety record and environmental stewardship with regards to geohazards has helped us in accelerating certain Exploration Plans (within MMS guidelines). These types of efforts has saved money and kept the drilling schedule running smoothly. In the last two years, the MMS has given COPC approval to use existing 3D spec seismic volumes for Shallow Hazards Assessment at several locations where applicable. This type of effort has saved ConocoPhillips hundreds of thousands of dollars that would have been spent in either acquiring 2D high resolution seismic data or reprocessing an existing 3D data volume. Examples from Selected Prospects: Magnolia (Garden Banks 783/784); Voss (Keathley Canyon 347/391/435); Lorien (Green Canyon 199); Yorick (Green Canyon 391/435)

  11. A study on seismicity and seismic hazard for Karnataka State

    NASA Astrophysics Data System (ADS)

    Sitharam, T. G.; James, Naveen; Vipin, K. S.; Raj, K. Ganesha

    2012-04-01

    This paper presents a detailed study on the seismic pattern of the state of Karnataka and also quantifies the seismic hazard for the entire state. In the present work, historical and instrumental seismicity data for Karnataka (within 300 km from Karnataka political boundary) were compiled and hazard analysis was done based on this data. Geographically, Karnataka forms a part of peninsular India which is tectonically identified as an intraplate region of Indian plate. Due to the convergent movement of the Indian plate with the Eurasian plate, movements are occurring along major intraplate faults resulting in seismic activity of the region and hence the hazard assessment of this region is very important. Apart from referring to seismotectonic atlas for identifying faults and fractures, major lineaments in the study area were also mapped using satellite data. The earthquake events reported by various national and international agencies were collected until 2009. Declustering of earthquake events was done to remove foreshocks and aftershocks. Seismic hazard analysis was done for the state of Karnataka using both deterministic and probabilistic approaches incorporating logic tree methodology. The peak ground acceleration (PGA) at rock level was evaluated for the entire state considering a grid size of 0.05° × 0.05°. The attenuation relations proposed for stable continental shield region were used in evaluating the seismic hazard with appropriate weightage factors. Response spectra at rock level for important Tier II cities and Bangalore were evaluated. The contour maps showing the spatial variation of PGA values at bedrock are presented in this work.

  12. RADON MITIGATION IN SCHOOLS: CASE STUDIES OF RADON MITIGATION SYSTEMS INSTALLED BY EPA IN FOUR MARYLAND SCHOOLS ARE PRESENTED

    EPA Science Inventory

    The first part of this two-part paper discusses radon entry into schools, radon mitigation approaches for schools, and school characteristics (e.g., heating, ventilation, and air-conditioning -- HVAC-- system design and operation) that influence radon entry and mitigation system ...

  13. Detecting Slow Deformation Signals Preceding Dynamic Failure: A New Strategy For The Mitigation Of Natural Hazards (SAFER)

    NASA Astrophysics Data System (ADS)

    Vinciguerra, Sergio; Colombero, Chiara; Comina, Cesare; Ferrero, Anna Maria; Mandrone, Giuseppe; Umili, Gessica; Fiaschi, Andrea; Saccorotti, Gilberto

    2015-04-01

    Rock slope monitoring is a major aim in territorial risk assessment and mitigation. The high velocity that usually characterizes the failure phase of rock instabilities makes the traditional instruments based on slope deformation measurements not applicable for early warning systems. The use of "site specific" microseismic monitoring systems, with particular reference to potential destabilizing factors, such as rainfalls and temperature changes, can allow to detect pre-failure signals in unstable sectors within the rock mass and to predict the possible acceleration to the failure. We deployed a microseismic monitoring system in October 2013 developed by the University of Turin/Compagnia San Paolo and consisting of a network of 4 triaxial 4.5 Hz seismometers connected to a 12 channel data logger on an unstable patch of the Madonna del Sasso, Italian Western Alps. The initial characterization based on geomechanical and geophysical tests allowed to understand the instability mechanism and to design a 'large aperture' configuration which encompasses the entire unstable rock and can monitor subtle changes of the mechanical properties of the medium. Stability analysis showed that the stability of the slope is due to rock bridges. A continuous recording at 250 Hz sampling frequency (switched in March 2014 to 1 kHz for improving the first arrival time picking and obtain wider frequency content information) and a trigger recording based on a STA/LTA (Short Time Average over Long Time Average) detection algorithm have been used. More than 2000 events with different waveforms, duration and frequency content have been recorded between November 2013 and March 2014. By inspecting the acquired events we identified the key parameters for a reliable distinction among the nature of each signal, i.e. the signal shape in terms of amplitude, duration, kurtosis and the frequency content in terms of range of maximum frequency content, frequency distribution in spectrograms. Four main classes of recorded signals can be recognised: microseismic events, regional earthquakes, electrical noises and calibration signals, and unclassified events (probably grouping rockfalls, quarry blasts, other anthropic and natural sources of seismic noise). Since the seismic velocity inside the rock mass is highly heterogeneous, as it resulted from the geophysical investigations and the signals are often noisy an accurate location is not possible. To overcome this limitation a three-dimensional P-wave velocity model linking the DSM (Digital Surface Model) of the cliff obtained from a laser-scanner survey to the results of the cross-hole seismic tomography, the geological observations and the geomechanical measures of the most pervasive fracture planes has been built. As a next step we will proceed to the localization of event sources, to the improvement and automation of data analysis procedures and to search for correlations between event rates and meteorological data, for a better understanding of the processes driving the rock mass instability.

  14. The use of questionnaires for acquiring information on public perception of natural hazards and risk mitigation - a review of current knowledge and practice

    NASA Astrophysics Data System (ADS)

    Bird, D. K.

    2009-07-01

    Questionnaires are popular and fundamental tools for acquiring information on public knowledge and perception of natural hazards. Questionnaires can provide valuable information to emergency management agencies for developing risk management procedures. Although many natural hazards researchers describe results generated from questionnaires, few explain the techniques used for their development and implementation. Methodological detail should include, as a minimum, response format (open/closed questions), mode of delivery, sampling technique, response rate and access to the questionnaire to allow reproduction of or comparison with similar studies. This article reviews current knowledge and practice for developing and implementing questionnaires. Key features include questionnaire design, delivery mode, sampling techniques and data analysis. In order to illustrate these aspects, a case study examines methods chosen for the development and implementation of questionnaires used to obtain information on knowledge and perception of volcanic hazards in a tourist region in southern Iceland. Face-to-face interviews highlighted certain issues with respect to question structure and sequence. Recommendations are made to overcome these problems before the questionnaires are applied in future research projects. In conclusion, basic steps that should be disclosed in the literature are provided as a checklist to ensure that reliable, replicable and valid results are produced from questionnaire based hazard knowledge and risk perception research.

  15. Seismic hazard studies at the Department of Energy owned Paducah and Portsmouth Gaseous Diffusion plants

    SciTech Connect

    Beavers, J.E.; Brock, W.R.; Hunt, R.J. )

    1991-01-01

    Seismic hazard levels for free-field rock motion are defined and presented in this paper as annual exceedance probabilities versus peak acceleration and as uniform hazard response spectra. The conclusions of an independent review are also summarized. Based on the seismic hazard studies, peak horizontal acceleration values and uniform hazard response spectra for rock conditions are recommended. 15 refs., 6 figs., 1 tab.

  16. Knowledge of occupational hazards in photography: a pilot study.

    PubMed

    Marlenga, B; Parker-Conrad, J E

    1993-04-01

    1. Artists use many materials composed of the same chemicals that cause major occupational health problems in industry. 2. The majority of artists are unaware of the potential hazards in the materials and processes they use. 3. The pilot study revealed that greater than 90% of the amateur photographers did not use safety precautions in the darkroom. 4. The most common perceived barrier to the use of safety precautions was the lack of knowledge about chemical safety. PMID:8507283

  17. Household hazardous waste quantification, characterization and management in China's cities: a case study of Suzhou.

    PubMed

    Gu, Binxian; Zhu, Weimo; Wang, Haikun; Zhang, Rongrong; Liu, Miaomiao; Chen, Yangqing; Wu, Yi; Yang, Xiayu; He, Sheng; Cheng, Rong; Yang, Jie; Bi, Jun

    2014-11-01

    A four-stage systematic tracking survey of 240 households was conducted from the summer of 2011 to the spring of 2012 in a Chinese city of Suzhou to determine the characteristics of household hazardous waste (HHW) generated by the city. Factor analysis and a regression model were used to study the major driving forces of HHW generation. The results indicate that the rate of HHW generation was 6.16 (0.16-31.74, 95% CI) g/person/day, which accounted for 2.23% of the household solid waste stream. The major waste categories contributing to total HHW were home cleaning products (21.33%), medicines (17.67%) and personal care products (15.19%). Packaging and containers (one-way) and products (single-use) accounted for over 80% of total HHW generation, implying a considerable potential to mitigate HHW generation by changing the packaging design and materials used by manufacturing enterprises. Strong correlations were observed between HHW generation (g/person/day) and the driving forces group of "household structure" and "consumer preferences" (among which the educational level of the household financial manager has the greatest impact). Furthermore, the HHW generation stream in Suzhou suggested the influence of another set of variables, such as local customs and culture, consumption patterns, and urban residential life-style. This study emphasizes that HHW should be categorized at its source (residential households) as an important step toward controlling the HHW hazards of Chinese cities. PMID:25022547

  18. HOUSEHOLD HAZARDOUS WASTE CHARACTERIZATION STUDY FOR PALM BEACH COUNTY, FLORIDA: A MITE PROGRAM EVALUATION

    EPA Science Inventory

    The objectives of the Household hazardous Waste Characterization Study (the HHW Study) were to quantify the annual household hazardous waste (HHW) tonnages disposed in Palm Beach County, Florida's (the county) residential solid waste (characterized in this study as municipal soli...

  19. Study on the health hazards of scrap metal cutters.

    PubMed

    Ho, S F; Wong, P H; Kwok, S F

    1989-12-01

    Scrap metal cutters seemed to be left out in most preventive programmes as the workers were mainly contract workers. The health hazards of scrap metal cutting in 54 workers from a foundry and a ship breaking plant were evaluated. Environmental sampling showed lead levels ranging from 0.02 to 0.57 mg/m3 (threshold limit values is 0.15 mg/m3). Exposure to lead came mainly from the paint coat of the metals cut. Metal fume fever was not reported although their main complaints were cough and rhinitis. Skin burns at all stages of healing and residual scars were seen over hands, forearms and thighs. 96% of the cutters had blood lead levels exceeding 40 micrograms/100 ml with 10 workers exceeding 70 micrograms/100 ml. None had clinical evidence of lead poisoning. The study showed that scrap metal cutting is a hazardous industry associated with significant lead exposure. With proper medical supervision, the blood lead levels of this group of workers decreased illustrating the importance of identifying the hazard and implementing appropriate medical surveillance programmes. PMID:2635395

  20. Interdisciplinary approach to hydrological hazard mitigation and disaster response and effects of climate change on the occurrence of flood severity in central Alaska

    NASA Astrophysics Data System (ADS)

    Kontar, Y. Y.; Bhatt, U. S.; Lindsey, S. D.; Plumb, E. W.; Thoman, R. L.

    2015-06-01

    In May 2013, a massive ice jam on the Yukon River caused flooding that destroyed much of the infrastructure in the Interior Alaska village of Galena and forced the long-term evacuation of nearly 70% of its residents. This case study compares the communication efforts of the out-of-state emergency response agents with those of the Alaska River Watch program, a state-operated flood preparedness and community outreach initiative. For over 50 years, the River Watch program has been fostering long-lasting, open, and reciprocal communication with flood prone communities, as well as local emergency management and tribal officials. By taking into account cultural, ethnic, and socioeconomic features of rural Alaskan communities, the River Watch program was able to establish and maintain a sense of partnership and reliable communication patterns with communities at risk. As a result, officials and residents in these communities are open to information and guidance from the River Watch during the time of a flood, and thus are poised to take prompt actions. By informing communities of existing ice conditions and flood threats on a regular basis, the River Watch provides effective mitigation efforts in terms of ice jam flood effects reduction. Although other ice jam mitigation attempts had been made throughout US and Alaskan history, the majority proved to be futile and/or cost-ineffective. Galena, along with other rural riverine Alaskan communities, has to rely primarily on disaster response and recovery strategies to withstand the shock of disasters. Significant government funds are spent on these challenging efforts and these expenses might be reduced through an improved understanding of both the physical and climatological principals behind river ice breakup and risk mitigation. This study finds that long term dialogue is critical for effective disaster response and recovery during extreme hydrological events connected to changing climate, timing of river ice breakup, and flood occurrence in rural communities of the Far North.

  1. Decay extent evaluation of wood degraded by a fungal community using NIRS: application for ecological engineering structures used for natural hazard mitigation

    NASA Astrophysics Data System (ADS)

    Baptiste Barré, Jean; Bourrier, Franck; Bertrand, David; Rey, Freddy

    2015-04-01

    Ecological engineering corresponds to the design of efficient solutions for protection against natural hazards such as shallow landslides and soil erosion. In particular, bioengineering structures can be composed of a living part, made of plants, cuttings or seeds, and an inert part, a timber logs structure. As wood is not treated by preservatives, fungal degradation can occur from the start of the construction. It results in wood strength loss, which practitioners try to evaluate with non-destructive tools (NDT). Classical NDT are mainly based on density measurements. However, the fungal activity reduces the mechanical properties (modulus of elasticity - MOE) well before well before a density change could be measured. In this context, it would be useful to provide a tool for assessing the residual mechanical strength at different decay stages due to a fungal community. Near-infrared spectroscopy (NIRS) can be used for that purpose, as it can allow evaluating wood mechanical properties as well as wood chemical changes due to brown and white rots. We monitored 160 silver fir samples (30x30x6000mm) from green state to different levels of decay. The degradation process took place in a greenhouse and samples were inoculated with silver fir decayed debris in order to accelerate the process. For each sample, we calculated the normalized bending modulus of elasticity loss (Dw moe) and defined it as decay extent. Near infrared spectra collected from both green and decayed ground samples were corrected by the subtraction of baseline offset. Spectra of green samples were averaged into one mean spectrum and decayed spectra were subtracted from the mean spectrum to calculate the absorption loss. Partial least square regression (PLSR) has been performed between the normalized MOE loss Dw moe (0 < Dw moe < 1) and the absorption loss, with a correlation coefficient R² equal to 0.85. Finally, the prediction of silver fir biodegradation rate by NIRS was significant (RMSEP = 0.13). This tool improves the evaluation accuracy of wood decay extent in the context of ecological engineering structures used for natural hazard mitigation.

  2. Mitigating mass movement caused by earthquakes and typhoons: a case study of central Taiwan

    NASA Astrophysics Data System (ADS)

    Lin, Jiun-Chuan

    2013-04-01

    Typhoons caused huge damages to Taiwan at the average of 3.8 times a year in the last 100 years, according to Central Weather Bureau data. After the Chi-Chi earthquake of 1999 at the magnitude of Richard Scale 7.3, typhoons with huge rainfall would cause huge debris flow and deposits at river channels. As a result of earthquakes, loose debris falls and flows became significant hazards in central Taiwan. Analysis of rainfall data and data about the sites of slope failure show that damage from natural hazards was enhanced in the last 20 years, as a result of the Chi-Chi earthquake. There are three main types of mass movement in Central Taiwan: landslides, debris flows and gully erosion. Landslides occurred mainly along hill slopes and river channel banks. Many dams, check dams, housing structures and even river channels can be raised to as high as 60 meters as a result of stacking up floating materials of landslides. Debris flows occurred mainly through typhoon periods and activated ancient debris deposition. New gullies were thus developed from deposits loosened and shaken up by earthquakes. Extreme earthquakes and typhoon events occurred frequently in the last 20 years. This paper analyzes the geological and geomorphologic background for the precarious areas and typhoons in central Taiwan, to make a systematic understanding of mass movement harzards. The mechanism and relations of debris flows and rainfall data in central Taiwan are analyzed. Ways for mitigating mass movement threats are also proposed in this paper. Keywords: mass movement, earthquakes, typhoons, hazard mitigation, central Ta

  3. Satellite Monitoring of Ash and Sulphur Dioxide for the mitigation of Aviation Hazards: Part I. Validation of satellite-derived Volcanic Ash Levels.

    NASA Astrophysics Data System (ADS)

    Koukouli, MariLiza; Balis, Dimitris; Simopoulos, Spiros; Siomos, Nikos; Clarisse, Lieven; Carboni, Elisa; Wang, Ping; Siddans, Richard; Marenco, Franco; Mona, Lucia; Pappalardo, Gelsomina; Spinetti, Claudia; Theys, Nicolas; Tampellini, Lucia; Zehner, Claus

    2014-05-01

    The 2010 eruption of the Icelandic volcano Eyjafjallajökull attracted the attention of the public and the scientific community to the vulnerability of the European airspace to volcanic eruptions. Major disruptions in European air traffic were observed for several weeks surrounding the two eruptive episodes, which had a strong impact on the everyday life of many Europeans as well as a noticable economic loss of around 2-3 billion Euros in total. The eruptions made obvious that the decision-making bodies were not informed properly and timely about the commercial aircraft capabilities to ash-leaden air, and that the ash monitoring and prediction potential is rather limited. After the Eyjafjallajökull eruptions new guidelines for aviation, changing from zero tolerance to newly established ash threshold values, were introduced. Within this spirit, the European Space Agency project Satellite Monitoring of Ash and Sulphur Dioxide for the mitigation of Aviation Hazards, called for the creation of an optimal End-to-End System for Volcanic Ash Plume Monitoring and Prediction . This system is based on improved and dedicated satellite-derived ash plume and sulphur dioxide level assessments, as well as an extensive validation using auxiliary satellite, aircraft and ground-based measurements. The validation of volcanic ash levels extracted from the sensors GOME-2/MetopA, IASI/MetopA and MODIS/Terra and MODIS/Aqua is presented in this work with emphasis on the ash plume height and ash optical depth levels. Co-located aircraft flights, Cloud-Aerosol Lidar and Infrared Pathfinder Satellite Observation [CALIPSO] soundings and well as European Aerosol Research Lidar Network [EARLINET] measurements were compared to the different satellite estimates for the those two eruptive episodes. The validation results are extremely promising with most satellite sensors performing quite well and within the estimated uncertainties compared to the comparative datasets. The findings are extensively presented here and future directions discussed in length.

  4. Satellite Monitoring of Ash and Sulphur Dioxide for the mitigation of Aviation Hazards: Part II. Validation of satellite-derived Volcanic Sulphur Dioxide Levels.

    NASA Astrophysics Data System (ADS)

    Koukouli, MariLiza; Balis, Dimitris; Dimopoulos, Spiros; Clarisse, Lieven; Carboni, Elisa; Hedelt, Pascal; Spinetti, Claudia; Theys, Nicolas; Tampellini, Lucia; Zehner, Claus

    2014-05-01

    The eruption of the Icelandic volcano Eyjafjallajökull in the spring of 2010 turned the attention of both the public and the scientific community to the susceptibility of the European airspace to the outflows of large volcanic eruptions. The ash-rich plume from Eyjafjallajökull drifted towards Europe and caused major disruptions of European air traffic for several weeks affecting the everyday life of millions of people and with a strong economic impact. This unparalleled situation revealed limitations in the decision making process due to the lack of information on the tolerance to ash of commercial aircraft engines as well as limitations in the ash monitoring and prediction capabilities. The European Space Agency project Satellite Monitoring of Ash and Sulphur Dioxide for the mitigation of Aviation Hazards, was introduced to facilitate the development of an optimal End-to-End System for Volcanic Ash Plume Monitoring and Prediction. This system is based on comprehensive satellite-derived ash plume and sulphur dioxide [SO2] level estimates, as well as a widespread validation using supplementary satellite, aircraft and ground-based measurements. The validation of volcanic SO2 levels extracted from the sensors GOME-2/MetopA and IASI/MetopA are shown here with emphasis on the total column observed right before, during and after the Eyjafjallajökull 2010 eruptions. Co-located ground-based Brewer Spectrophotometer data extracted from the World Ozone and Ultraviolet Radiation Data Centre, WOUDC, were compared to the different satellite estimates. The findings are presented at length, alongside a comprehensive discussion of future scenarios.

  5. Mitigating Hazards in School Facilities

    ERIC Educational Resources Information Center

    National Clearinghouse for Educational Facilities, 2008

    2008-01-01

    School safety is a human concern, one that every school and community must take seriously and strive continually to achieve. It is also a legal concern; schools can be held liable if they do not make good-faith efforts to provide a safe and secure school environment. How schools are built and maintained is an integral part of school safety and…

  6. A study of shock mitigating materials in a split Hopkinson bar configuration

    SciTech Connect

    Bateman, V.I.; Bell, R.G. III; Brown, F.A.; Hansen, N.R.

    1996-12-31

    Sandia National Laboratories (SNL) designs mechanical systems with electronics that must survive high shock environments. These mechanical systems include penetrators that must survive soil, rock, and ice penetration, nuclear transportation casks that must survive transportation environments, and laydown weapons that must survive delivery impact of 125-fps. These mechanical systems contain electronics that may operate during and after the high shock environment and that must be protected from the high shock environments. A study has been started to improve the packaging techniques for the advanced electronics utilized in these mechanical systems because current packaging techniques are inadequate for these more sensitive electronics. In many cases, it has been found that the packaging techniques currently used not only do not mitigate the shock environment but actually amplify the shock environment. An ambitious goal for this packaging study is to avoid amplification and possibly attenuate the shock environment before it reaches the electronics contained in the various mechanical system. As part of the investigation of packaging techniques, a two part study of shock mitigating materials is being conducted. This paper reports the first part of the shock mitigating materials study. A study to compare three thicknesses (0.125, 0.250, and 0.500 in.) of seventeen, unconfined materials for their shock mitigating characteristics has been completed with a split Hopkinson bar configuration. The nominal input as measured by strain gages on the incident Hopkinson bar is 50 fps {at} 100 {micro}s for these tests. It is hypothesized that a shock mitigating material has four purposes: to lengthen the shock pulse, to attenuate the shock pulse, to mitigate high frequency content in the shock pulse, and to absorb energy. Both time domain and frequency domain analyses of the split Hopkinson bar data have been performed to compare the materials` achievement of these purposes.

  7. A STUDY ON GREENHOUSE GAS EMISSIONS AND MITIGATION POTENTIALS IN AGRICULRE

    NASA Astrophysics Data System (ADS)

    Hasegawa, Tomoko; Matsuoka, Yuzuru

    In this study, world food production and consumption are estimated from 2005 to 2030 using a model developed by General-to-specific modeling methodology. Based on the agricultural production, we estimated GHG emissions and mitigation potentials and evaluated mitigation countermeasures in agriculture. As a result, world crop and meat production will increase by 1.4 and 1.3 times respectively up to 2030. World GHG emissions from agriculture were 5.7 GtCO2eq in 2005. CH4 emission from enteric fermentation and N2O emission from nitrogen fertilizer contributed a large part of the emissions. In 2030, technical and economical mitigation potentials will be 2.0 GtCO2eq and 1.2 GtCO2eq respectively. The potentials correspond to 36% and 22% of total emissions in 2000. The countermeasures with highest effects will be water management in rice paddy such as "Midseason drainage" and "Off-season straw".

  8. An exploratory study on perceptions of seismic risk and mitigation in two districts of Istanbul.

    PubMed

    Eraybar, Korel; Okazaki, Kenji; Ilki, Alper

    2010-01-01

    Istanbul is one of the world's cities most vulnerable to seismic events. According to seismologists, the probability of a severe earthquake in the next 30 years is approximately 40 per cent. Following an outline of the seismicity of this vital Turkish city and a summary of current seismic risks and mitigation studies, this paper presents the results of a survey conducted in two districts of Istanbul, Avcilar and Bakirkoy. The survey comprised some 60 questions on the seismic risk perceptions of individuals and requested basic personal data, such as on age, education level, employment type, financial income, and gender. Despite various differences among the survey population, such as academic background and level of financial income, responses were surprisingly similar, especially in terms of having no plan for a safer house. The data may help those planning mitigation programmes and public awareness campaigns on preparedness and particularly mitigation in highly vulnerable regions. PMID:19624701

  9. Insights from EMF Associated Agricultural and Forestry Greenhouse Gas Mitigation Studies

    SciTech Connect

    McCarl, Bruce A.; Murray, Brian; Kim, Man-Keun; Lee, Heng-Chi; Sands, Ronald D.; Schneider, Uwe

    2007-11-19

    Integrated assessment modeling (IAM) as employed by the Energy Modeling Forum (EMF) generally involves a multi-sector appraisal of greenhouse gas emission (GHGE) mitigation alternatives and climate change effects typically at the global level. Such a multi-sector evaluation encompasses potential climate change effects and mitigative actions within the agricultural and forestry (AF) sectors. In comparison with many of the other sectors covered by IAM, the AF sectors may require somewhat different treatment due to their critical dependence upon spatially and temporally varying resource and climatic conditions. In particular, in large countries like the United States, forest production conditions vary dramatically across the landscape. For example, some areas in the southern US present conditions favorable to production of fast growing, heat tolerant pine species, while more northern regions often favor slower-growing hardwood and softwood species. Moreover, some lands are currently not suitable for forest production (e.g., the arid western plains). Similarly, in agriculture, the US has areas where citrus and cotton can be grown and other areas where barley and wheat are more suitable. This diversity across the landscape causes differential GHGE mitigation potential in the face of climatic changes and/or responses to policy or price incentives. It is difficult for a reasonably sized global IAM system to reflect the full range of sub-national geographic AF production possibilities alluded to above. AF response in the face of climate change altered temperature precipitation regimes or mitigation incentives will likely involve region-specific shifts in land use and agricultural/forest production. This chapter addresses AF sectoral responses in climate change mitigation analysis. Specifically, we draw upon US-based studies of AF GHGE mitigation possibilities that incorporate sub-national detail drawing largely on a body of studies done by the authors in association with EMF activities. We discuss characteristics of AF sectoral responses that could be incorporated in future IAM efforts in climate change policy.

  10. Interventionist and participatory approaches to flood risk mitigation decisions: two case studies in the Italian Alps

    NASA Astrophysics Data System (ADS)

    Bianchizza, C.; Del Bianco, D.; Pellizzoni, L.; Scolobig, A.

    2012-04-01

    Flood risk mitigation decisions pose key challenges not only from a technical but also from a social, economic and political viewpoint. There is an increasing demand for improving the quality of these processes by including different stakeholders - and especially by involving the local residents in the decision making process - and by guaranteeing the actual improvement of local social capacities during and after the decision making. In this paper we analyse two case studies of flood risk mitigation decisions, Malborghetto-Valbruna and Vipiteno-Sterzing, in the Italian Alps. In both of them, mitigation works have been completed or planned, yet following completely different approaches especially in terms of responses of residents and involvement of local authorities. In Malborghetto-Valbruna an 'interventionist' approach (i.e. leaning towards a top down/technocratic decision process) was used to make decisions after the flood event that affected the municipality in the year 2003. In Vipiteno-Sterzing, a 'participatory' approach (i.e. leaning towards a bottom-up/inclusive decision process) was applied: decisions about risk mitigation measures were made by submitting different projects to the local citizens and by involving them in the decision making process. The analysis of the two case studies presented in the paper is grounded on the results of two research projects. Structured and in-depth interviews, as well as questionnaire surveys were used to explore residents' and local authorities' orientations toward flood risk mitigation. Also a SWOT analysis (Strengths, Weaknesses, Opportunities and Threats) involving key stakeholders was used to better understand the characteristics of the communities and their perception of flood risk mitigation issues. The results highlight some key differences between interventionist and participatory approaches, together with some implications of their adoption in the local context. Strengths and weaknesses of the two approaches, as well as key challenges for the future are also discussed.

  11. CASE STUDY OF RADON DIAGNOSTICS AND MITIGATION IN A NEW YORK STATE SCHOOL

    EPA Science Inventory

    The paper discusses a case study of radon diagnostics and mitigation performed by EPA in a New York State school building. esearch focused on active subslab depressurization (ASD) in the basement and, to a lesser degree, the potential for radon reduction in the basement and slab-...

  12. Feasibility study of tank leakage mitigation using subsurface barriers

    SciTech Connect

    Treat, R.L.; Peters, B.B.; Cameron, R.J.; McCormak, W.D.; Trenkler, T.; Walters, M.F.; Rouse, J.K.; McLaughlin, T.J.; Cruse, J.M.

    1994-09-21

    The US Department of Energy (DOE) has established the Tank Waste Remediation System (TWRS) to satisfy manage and dispose of the waste currently stored in the underground storage tanks. The retrieval element of TWRS includes a work scope to develop subsurface impermeable barriers beneath SSTs. The barriers could serve as a means to contain leakage that may result from waste retrieval operations and could also support site closure activities by facilitating cleanup. Three types of subsurface barrier systems have emerged for further consideration: (1) chemical grout, (2) freeze walls, and (3) desiccant, represented in this feasibility study as a circulating air barrier. This report contains analyses of the costs and relative risks associated with combinations retrieval technologies and barrier technologies that from 14 alternatives. Eight of the alternatives include the use of subsurface barriers; the remaining six nonbarrier alternative are included in order to compare the costs, relative risks and other values of retrieval with subsurface barriers. Each alternative includes various combinations of technologies that can impact the risks associated with future contamination of the groundwater beneath the Hanford Site to varying degrees. Other potential risks associated with these alternatives, such as those related to accidents and airborne contamination resulting from retrieval and barrier emplacement operations, are not quantitatively evaluated in this report.

  13. Studies of millimeter-wave phenomenology for helicopter brownout mitigation

    NASA Astrophysics Data System (ADS)

    Schuetz, Christopher A.; Stein, E. Lee, Jr.; Samluk, Jesse; Mackrides, Daniel; Wilson, John P.; Martin, Richard D.; Dillon, Thomas E.; Prather, Dennis W.

    2009-09-01

    The unique ability of the millimeter-wave portion of the spectrum to penetrate typical visual obscurants has resulted in a wide range of possible applications for imagers in this spectrum. Of particular interest to the military community are imagers that can operate effectively in Degraded Visual Environments (DVE's) experienced by helicopter pilots when landing in dry, dusty environments, otherwise known as "brownout." One of the first steps to developing operational requirements for imagers in this spectrum is to develop a quantitative understanding of the phenomenology that governs imaging in these environments. While preliminary studies have been done in this area, quantitative, calibrated measurements of typical targets and degradation of target contrasts due to brownout conditions are not available. To this end, we will present results from calibrated, empirical measurements of typical targets of interest to helicopter pilots made in a representative desert environment. In addition, real-time measurements of target contrast reduction due to brownout conditions generated by helicopter downwash will be shown. These data were acquired using a W-band, dual-polarization radiometric scanner using optical-upconversion detectors.

  14. The fujairah united arab emirates (uae) (ml = 5.1) earthquake of march 11, 2002 a reminder for the immediate need to develop and implement a national hazard mitigation strategy

    NASA Astrophysics Data System (ADS)

    Al-Homoud, A.

    2003-04-01

    On March 11, 2002, at mid nigh, the Fujairah Masafi region in the UAE was shaken by an earthquake of shallow depth and local magnitude m = 5.1 on Richter Scale. The earthquake occurred on Dibba fault in the UAE with epicenter of the earthquake at 20 km NW of Fujairha city. The focal depth was just 10 km. The earthquake was felt in most parts of the northern emirates: Dubai, Sharjah, Ajman, Ras Al-Khaima, and Um-Qwain. The "main shock" was followed in the following weeks by more than twenty five earthquakes with local magnitude ranging from m = 4 to m = 4.8. The location of those earthquakes was along Zagros Reverse Faulting System in the Iranian side the Arabian Gulf, opposite to the Shores of the UAE. Most of these earthquakes were shallow too and were actually felt by the people. However, there was another strong earthquake in early April 2002 in the same Masafi region with local magnitude m = 5.1 and focal depth 30 km, therefore it was not felt by the northern emirates residents. No major structural damages to buildings and lifeline systems were reported in the several cities located in the vicinity of the earthquake epicenter. The very small values of ground accelerations were not enough to test the structural integrity of tall building and major infrastructures. Future major earthquakes anticipated in the region in close vicinity of northern emirates, once they occur, and considering the noticeable local site effect of the emirates sandy soils of high water table levels, will surely put these newly constructed building into the real test. Even though there were no casualties in the March 11th event, but there was major fear as a result of the loud sound of rock rupture heard in the mountains close to Maafi, the noticeable disturbance of animals and birds minutes before the earthquake incident and during the incident, cracks in the a good number of Masafi buildings and major damages that occurred in "old" buildings of Fujairah Masafi area, the closest city to the epicenter of the earthquake. Indeed, the March 11, 2002 and "aftershocks" scared the citizens of Masafi and surrounding regions and ignited the attention of the public and government to the subject matter of earthquake hazard, specialty this earthquake came one year after the near by Indian m = 6.5 destructive Earthquake. Indeed the recent m = 6.2 June 22 destructive earthquake too that hit north west Iran, has again reminded the UAE public and government with the need to take quick and concrete measures to dtake the necessary steps to mitigate any anticipated earthquake hazard. This study reflects in some details on the following aspects related to the region and vicinity: geological and tectonic setting, seismicity, earthquake activity data base and seismic hazard assessment. Moreover, it documents the following aspects of the March 11, 2002 earthquake: tectonic, seismological, instrumental seismic data, aftershocks, strong motion recordings and response spectral and local site effect analysis, geotechnical effects and structural observations in the region affected by the earthquake. The study identifies local site ground amplification effects and liquefaction hazard potential in some parts of the UAE. Moreover, the study reflects on the coverage of the incident in the media, public and government response, state of earthquake engineering practice in the construction industry in the UAE, and the national preparedness and public awareness issues. However, it is concluded for this event that the mild damages that occurred in Masafi region were due to poor quality of construction, and lack of underestimating of the design base shear. Practical recommendations are suggested for the authorities to avoid damages in newly constructed buildings and lifelines as a result of future stronger earthquakes, in addition to recommendations on a national strategy for earthquake hazard mitigation in the UAE, which is still missing. The recommendations include the development and implementation of a design code for earthquake loading in the UAE, development of macro and micro seismic hazard maps, development of local site effect and liquefaction hazard maps, installation of a national earthquake monitoring network, assessment of the vulnerability of critical structures and life line facilities, public awareness, training of rescue teams in civil defense, etc.

  15. A study of shock mitigating materials in a split Hopkinson bar configuration. Phase 1

    SciTech Connect

    Bateman, V.I.; Brown, F.A.; Hansen, N.R.

    1998-06-01

    Sandia National Laboratories (SNL) designs mechanical systems with electronics that must survive high shock environments. These mechanical systems include penetrators that must survive soil, rock, and ice penetration, nuclear transportation casks that must survive transportation environments, and laydown weapons that must survive delivery impact of 125 fps. These mechanical systems contain electronics that may operate during and after the high shock environment and that must be protected from the high shock environments. A study has been started to improve the packaging techniques for the advanced electronics utilized in these mechanical systems because current packaging techniques are inadequate for these more sensitive electronics. In many cases, it has been found that the packaging techniques currently used not only do not mitigate the shock environment but actually amplify the shock environment. An ambitious goal for this packaging study is to avoid amplification and possibly attenuate the shock environment before it reaches the electronics contained in the various mechanical systems. As part of the investigation of packaging techniques, a two phase study of shock mitigating materials is being conducted. The purpose of the first phase reported here is to examine the performance of a joint that consists of shock mitigating material sandwiched in between steel and to compare the performance of the shock mitigating materials. A split Hopkinson bar experimental configuration simulates this joint and has been used to study the shock mitigating characteristics of seventeen, unconfined materials. The nominal input for these tests is an incident compressive wave with 50 fps peak (1,500 {micro}{var_epsilon} peak) amplitude and a 100 {micro}s duration (measured at 10% amplitude).

  16. Methodological Issues In Forestry Mitigation Projects: A CaseStudy Of Kolar District

    SciTech Connect

    Ravindranath, N.H.; Murthy, I.K.; Sudha, P.; Ramprasad, V.; Nagendra, M.D.V.; Sahana, C.A.; Srivathsa, K.G.; Khan, H.

    2007-06-01

    There is a need to assess climate change mitigationopportunities in forest sector in India in the context of methodologicalissues such as additionality, permanence, leakage, measurement andbaseline development in formulating forestry mitigation projects. A casestudy of forestry mitigation project in semi-arid community grazing landsand farmlands in Kolar district of Karnataka, was undertaken with regardto baseline and project scenariodevelopment, estimation of carbon stockchange in the project, leakage estimation and assessment ofcost-effectiveness of mitigation projects. Further, the transaction coststo develop project, and environmental and socio-economic impact ofmitigation project was assessed.The study shows the feasibility ofestablishing baselines and project C-stock changes. Since the area haslow or insignificant biomass, leakage is not an issue. The overallmitigation potential in Kolar for a total area of 14,000 ha under variousmitigation options is 278,380 tC at a rate of 20 tC/ha for the period2005-2035, which is approximately 0.67 tC/ha/yr inclusive of harvestregimes under short rotation and long rotation mitigation options. Thetransaction cost for baseline establishment is less than a rupee/tC andfor project scenario development is about Rs. 1.5-3.75/tC. The projectenhances biodiversity and the socio-economic impact is alsosignificant.

  17. Experimental Study on Electrostatic Hazards in Sprayed Liquid

    NASA Astrophysics Data System (ADS)

    Choi, Kwang Seok; Yamaguma, Mizuki; Ohsawa, Atsushi

    2007-12-01

    In this study, to evaluate ignition hazards in a paint process, electrostatic sparks in the sprayed area and the amount of charge while spraying were observed. With the objective of preventing accidents involving fires and/or explosions, we deal also with the ignitability due to an electrostatic spark of a sprayed liquid relative to the percentage of nitrogen (N2), including compression in an air cylinder. For this study, an air-spray-type handheld gun with a 1-mm-internal-diameter orifice and a supply of air pressure in the range of 0.1 to 1 MPa were used. With regard to the materials, water, including some sodium chloride, was used to investigate the charge amount of the sprayed liquid, and kerosene was selected for ignition tests while spraying. Several electrostatic sparks in the sprayed region were observed while spraying. Some values of the electrostatic charge observed in the course of this study would be unsafe in the painting industry. Thus, if any of the conductive parts of the equipment are not grounded, incendiary electrostatic sparks can result. The ignitability of sprayed liquid was markedly reduced; the percentage of N2 in the air was substituted for pressurized pure air, and its efficiency increased with air pressure.

  18. OFFSHORE PLATFORM HAZARDOUS WASTE INCINERATION FACILITY: FEASIBILITY STUDY

    EPA Science Inventory

    This report describes a program conducted to evaluate the technical and environmental feasibility of using a proposed offshore platform incineration facility in the destruction of hazardous wastes and for incineration research.

  19. Feasibility Study of Radiometry for Airborne Detection of Aviation Hazards

    NASA Technical Reports Server (NTRS)

    Gimmestad, Gary G.; Papanicolopoulos, Chris D.; Richards, Mark A.; Sherman, Donald L.; West, Leanne L.; Johnson, James W. (Technical Monitor)

    2001-01-01

    Radiometric sensors for aviation hazards have the potential for widespread and inexpensive deployment on aircraft. This report contains discussions of three aviation hazards - icing, turbulence, and volcanic ash - as well as candidate radiometric detection techniques for each hazard. Dual-polarization microwave radiometry is the only viable radiometric technique for detection of icing conditions, but more research will be required to assess its usefulness to the aviation community. Passive infrared techniques are being developed for detection of turbulence and volcanic ash by researchers in this country and also in Australia. Further investigation of the infrared airborne radiometric hazard detection approaches will also be required in order to develop reliable detection/discrimination techniques. This report includes a description of a commercial hyperspectral imager for investigating the infrared detection techniques for turbulence and volcanic ash.

  20. 44 CFR 201.7 - Tribal Mitigation Plans.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... HOMELAND SECURITY DISASTER ASSISTANCE MITIGATION PLANNING § 201.7 Tribal Mitigation Plans. The Indian...-disaster hazard management policies, programs, and capabilities to mitigate the hazards in the area... from natural hazards, serving as a guide for decision makers as they commit resources to reducing...

  1. 44 CFR 201.4 - Standard State Mitigation Plans.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ..., DEPARTMENT OF HOMELAND SECURITY DISASTER ASSISTANCE MITIGATION PLANNING § 201.4 Standard State Mitigation...-disaster Mitigation (PDM) program, authorized under section 203 of the Stafford Act, 42 U.S.C. 5133, will... post-disaster hazard management policies, programs, and capabilities to mitigate the hazards in...

  2. 44 CFR 201.4 - Standard State Mitigation Plans.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ..., DEPARTMENT OF HOMELAND SECURITY DISASTER ASSISTANCE MITIGATION PLANNING § 201.4 Standard State Mitigation...-disaster Mitigation (PDM) program, authorized under section 203 of the Stafford Act, 42 U.S.C. 5133, will... post-disaster hazard management policies, programs, and capabilities to mitigate the hazards in...

  3. Volcanic hazard studies for the Yucca Mountain project

    SciTech Connect

    Crowe, B.; Turrin, B.; Wells, S.; Perry, F.; McFadden, L.; Renault, C.E.; Champion, D.; Harrington, C.

    1989-05-01

    Volcanic hazard studies are ongoing to evaluate the risk of future volcanism with respect to siting of a repository for disposal of high-level radioactive waste at the Yucca Mountain site. Seven Quaternary basaltic volcanic centers are located a minimum distance of 12 km and a maximum distance of 47 km from the outer boundary of the exploration block. The conditional probability of disruption of a repository by future basaltic volcanism is bounded by the range of 10{sup {minus}8} to 10{sup {minus}10} yr{sup {minus}1}. These values are currently being reexamined based on new developments in the understanding of the evaluation of small volume, basaltic volcanic centers including: (1) Many, perhaps most, of the volcanic centers exhibit brief periods of eruptive activity separated by longer periods of inactivity. (2) The centers may be active for time spans exceeding 10{sup 5} yrs, (3) There is a decline in the volume of eruptions of the centers through time, and (4) Small volume eruptions occurred at two of the Quaternary centers during latest Pleistocene or Holocene time. We classify the basalt centers as polycyclic, and distinguish them from polygenetic volcanoes. Polycyclic volcanism is characterized by small volume, episodic eruptions of magma of uniform composition over time spans of 10{sup 3} to 10{sup 5} yrs. Magma eruption rates are low and the time between eruptions exceeds the cooling time of the magma volumes. 25 refs., 2 figs.

  4. A procedure for global flood hazard mapping - the Africa case study

    NASA Astrophysics Data System (ADS)

    Dottori, Francesco; Salamon, Peter; Feyen, Luc; Barbosa, Paulo

    2015-04-01

    River floods are recognized as one of the major causes of economic damages and loss of human lives worldwide, and their impact in the next decades could be dramatically increased by socio-economic and climatic changes. In Africa, the availability of tools and models for predicting, mapping and analysing flood hazard and risk is still limited. Consistent, high-resolution (1km or less), continental-scale hazard maps are extremely valuable for local authorities and water managers to mitigate flood risk and to reduce catastrophic impacts on population and assets. The present work describes the development of a procedure for global flood hazard mapping, which is tested and applied over Africa to derive continental flood hazard maps. We derive a long-term dataset of daily river discharges from global hydrological simulations to design flood hydrographs for different return periods for the major African river network. We then apply a hydrodynamic model to identify flood-prone areas in major river catchments, which are merged to create pan-African flood hazard maps at 900m resolution. The flood map designed for a return period of 20 years is compared with a mosaic of satellite images showing all flooded areas in the period 2000-2014. We discuss strengths and limitations emerging from the comparison and present potential future applications and developments of the methodology.

  5. Management of agricultural soils for greenhouse gas mitigation: Learning from a case study in NE Spain.

    PubMed

    Sánchez, B; Iglesias, A; McVittie, A; Álvaro-Fuentes, J; Ingram, J; Mills, J; Lesschen, J P; Kuikman, P J

    2016-04-01

    A portfolio of agricultural practices is now available that can contribute to reaching European mitigation targets. Among them, the management of agricultural soils has a large potential for reducing GHG emissions or sequestering carbon. Many of the practices are based on well tested agronomic and technical know-how, with proven benefits for farmers and the environment. A suite of practices has to be used since none of the practices can provide a unique solution. However, there are limitations in the process of policy development: (a) agricultural activities are based on biological processes and thus, these practices are location specific and climate, soils and crops determine their agronomic potential; (b) since agriculture sustains rural communities, the costs and potential for implementation have also to be regionally evaluated and (c) the aggregated regional potential of the combination of practices has to be defined in order to inform abatement targets. We believe that, when implementing mitigation practices, three questions are important: Are they cost-effective for farmers? Do they reduce GHG emissions? What policies favour their implementation? This study addressed these questions in three sequential steps. First, mapping the use of representative soil management practices in the European regions to provide a spatial context to upscale the local results. Second, using a Marginal Abatement Cost Curve (MACC) in a Mediterranean case study (NE Spain) for ranking soil management practices in terms of their cost-effectiveness. Finally, using a wedge approach of the practices as a complementary tool to link science to mitigation policy. A set of soil management practices was found to be financially attractive for Mediterranean farmers, which in turn could achieve significant abatements (e.g., 1.34 MtCO2e in the case study region). The quantitative analysis was completed by a discussion of potential farming and policy choices to shape realistic mitigation policy at European regional level. PMID:26789201

  6. The 5 key questions coping with risks due to natural hazards, answered by a case study

    NASA Astrophysics Data System (ADS)

    Hardegger, P.; Sausgruber, J. T.; Schiegg, H. O.

    2009-04-01

    Based on Maslow's hierarchy of needs, human endeavours concern primarily existential needs, consequently, to be safeguarded against both natural as well as man made threads. The subsequent needs are to realize chances in a variety of fields, as economics and many others. Independently, the 5 crucial questions are the same as for coping with risks due to natural hazards specifically. These 5 key questions are I) What is the impact in function of space and time ? II) What protection measures comply with the general opinion and how much do they mitigate the threat? III) How can the loss be adequately quantified and monetized ? IV) What budget for prevention and reserves for restoration and compensation are to be planned ? V) Which mix of measures and allocation of resources is sustainable, thus, optimal ? The 5 answers, exemplified by a case study, concerning the sustainable management of risk due to the debris flows by the Enterbach / Inzing / Tirol / Austria, are as follows : I) The impact, created by both the propagation of flooding and sedimentation, has been forecasted by modeling (numerical simulation) the 30, 50, 100, 150, 300 and 1000 year debris flow. The input was specified by detailed studies in meteorology, precipitation and runoff, in geology, hydrogeology, geomorphology and slope stability, in hydraulics, sediment transport and debris flow, in forestry, agriculture and development of communal settlement and infrastructure. All investigations were performed according to the method of ETAlp (Erosion and Transport in Alpine systems). ETAlp has been developed in order to achieve a sustainable development in alpine areas and has been evaluated by the research project "nab", within the context of the EU-Interreg IIIb projects. II) The risk mitigation measures of concern are in hydraulics at the one hand and in forestry at the other hand. Such risk management is evaluated according to sustainability, which means economic, ecologic and social, in short, "triple" compatibility. 100% protection against the 100 year event shows to be the optimal degree of protection. Consequently, impacts statistically less frequent than once in 100 year are accepted as the remaining risk. Such floods and debris flows respectively cause a fan of propagation which is substantially reduced due to the protection measures against the 100 year event. III) The "triple loss distribution" shows the monetized triple damage, dependent on its probability. The monetization is performed by the social process of participation of the impacted interests, if not, by official experts in representation. The triple loss distribution rises in time mainly due to the rise in density and value of precious goods. A comparison of the distributions of the triple loss and the triple risk, behaving in opposite direction, is shown and explained within the project. IV) The recommended yearly reserves to be stocked for restoration and compensation of losses, caused by debris flows, amount to € 70'000.- according to the approach of the "technical risk premium". The discrepancy in comparison with the much higher amounts according to the common approaches of natural hazards engineering are discussed. V) The sustainable mix of hydraulic and forestry measures with the highest return on investment at lowest risk is performed according to the portfolio theory (Markowitz), based on the triple value curves, generated by the method of TripelBudgetierung®. Accordingly, the optimum mix of measures to protect the community of Inzing against the natural hazard of debris flow, thus, the most efficient allocation of resources equals to 2/3 for hydraulic, 1/3 for forestry measures. In detail, the results of the research pilot project "Nachhaltiges Risikomanagement - Enterbach / Inzing / Tirol / Austria" may be consulted under www.ibu.hsr.ch/inzing.

  7. HOUSEHOLD HAZARDOUS WASTE CHARACTERIZATION STUDY FOR PALM BEACH COUNTY, FLORIDA - A MITE PROGRAM EVALUATION

    EPA Science Inventory

    The objectives of the Household Hazardous Waste Characterization Study (the HHW Study) were to: 1) Quantity the annual household hazardous waste (HHW) tonnages disposed in Palm Beach County Florida’s (the County) residential solid waste (characterized in this study as municipal s...

  8. HOUSEHOLD HAZARDOUS WASTE CHARACTERIZATION STUDY FOR PALM BEACH COUNTY, FLORIDA - A MITE PROGRAM EVALUATION

    EPA Science Inventory

    The objectives of the Household Hazardous Waste Characterization Study (the HHW Study) were to: 1) Quantity the annual household hazardous waste (HHW) tonnages disposed in Palm Beach County Floridas (the County) residential solid waste (characterized in this study as municipal s...

  9. Reducing risk from lahar hazards: concepts, case studies, and roles for scientists

    USGS Publications Warehouse

    Pierson, Thomas C.; Wood, Nathan J.; Driedger, Carolyn L.

    2014-01-01

    Lahars are rapid flows of mud-rock slurries that can occur without warning and catastrophically impact areas more than 100 km downstream of source volcanoes. Strategies to mitigate the potential for damage or loss from lahars fall into four basic categories: (1) avoidance of lahar hazards through land-use planning; (2) modification of lahar hazards through engineered protection structures; (3) lahar warning systems to enable evacuations; and (4) effective response to and recovery from lahars when they do occur. Successful application of any of these strategies requires an accurate understanding and assessment of the hazard, an understanding of the applicability and limitations of the strategy, and thorough planning. The human and institutional components leading to successful application can be even more important: engagement of all stakeholders in hazard education and risk-reduction planning; good communication of hazard and risk information among scientists, emergency managers, elected officials, and the at-risk public during crisis and non-crisis periods; sustained response training; and adequate funding for risk-reduction efforts. This paper reviews a number of methods for lahar-hazard risk reduction, examines the limitations and tradeoffs, and provides real-world examples of their application in the U.S. Pacific Northwest and in other volcanic regions of the world. An overriding theme is that lahar-hazard risk reduction cannot be effectively accomplished without the active, impartial involvement of volcano scientists, who are willing to assume educational, interpretive, and advisory roles to work in partnership with elected officials, emergency managers, and vulnerable communities.

  10. Best Practices in Grid Integration of Variable Wind Power: Summary of Recent US Case Study Results and Mitigation Measures

    SciTech Connect

    Smith, J. Charles; Parsons, Brian; Acker, Thomas; Milligan, Michael; Zavidil, Robert; Schuerger, Matthew; DeMeo, Edgar

    2010-01-22

    This paper will summarize results from a number of utility wind integration case studies conducted recently in the US, and outline a number of mitigation measures based on insights from those studies.

  11. Factors in Perception of Tornado Hazard: An Exploratory Study.

    ERIC Educational Resources Information Center

    de Man, Anton; Simpson-Housley, Paul

    1987-01-01

    Administered questionnaire on tornado hazard to 142 adults. Results indicated that subject's gender and education level were best predictors of perceived probability of tornado recurrence; that ratings of severity of potential damage were related to education level; and that gender accounted for significant percentage of variance in anxiety…

  12. STUDY ON AIR INGRESS MITIGATION METHODS IN THE VERY HIGH TEMPERATURE GAS COOLED REACTOR (VHTR)

    SciTech Connect

    Chang H. Oh

    2011-03-01

    An air-ingress accident followed by a pipe break is considered as a critical event for a very high temperature gas-cooled reactor (VHTR). Following helium depressurization, it is anticipated that unless countermeasures are taken, air will enter the core through the break leading to oxidation of the in-core graphite structure. Thus, without mitigation features, this accident might lead to severe exothermic chemical reactions of graphite and oxygen. Under extreme circumstances, a loss of core structural integrity may occur along with excessive release of radiological inventory. Idaho National Laboratory under the auspices of the U.S. Department of Energy is performing research and development (R&D) that focuses on key phenomena important during challenging scenarios that may occur in the VHTR. Phenomena Identification and Ranking Table (PIRT) studies to date have identified the air ingress event, following on the heels of a VHTR depressurization, as very important (Oh et al. 2006, Schultz et al. 2006). Consequently, the development of advanced air ingress-related models and verification and validation (V&V) requirements are part of the experimental validation plan. This paper discusses about various air-ingress mitigation concepts applicable for the VHTRs. The study begins with identifying important factors (or phenomena) associated with the air-ingress accident by using a root-cause analysis. By preventing main causes of the important events identified in the root-cause diagram, the basic air-ingress mitigation ideas can be conceptually derived. The main concepts include (1) preventing structural degradation of graphite supporters; (2) preventing local stress concentration in the supporter; (3) preventing graphite oxidation; (4) preventing air ingress; (5) preventing density gradient driven flow; (4) preventing fluid density gradient; (5) preventing fluid temperature gradient; (6) preventing high temperature. Based on the basic concepts listed above, various air-ingress mitigation methods are proposed in this study. Among them, the following two mitigation ideas are extensively investigated using computational fluid dynamic codes (CFD): (1) helium injection in the lower plenum, and (2) reactor enclosure opened at the bottom. The main idea of the helium injection method is to replace air in the core and the lower plenum upper part by buoyancy force. This method reduces graphite oxidation damage in the severe locations of the reactor inside. To validate this method, CFD simulations are addressed here. A simple 2-D CFD model is developed based on the GT-MHR 600MWt design. The simulation results showed that the helium replace the air flow into the core and significantly reduce the air concentration in the core and bottom reflector potentially protecting oxidation damage. According to the simulation results, even small helium flow was sufficient to remove air in the core, mitigating the air-ingress successfully. The idea of the reactor enclosure with an opening at the bottom changes overall air-ingress mechanism from natural convection to molecular diffusion. This method can be applied to the current system by some design modification of the reactor cavity. To validate this concept, this study also uses CFD simulations based on the simplified 2-D geometry. The simulation results showed that the enclosure open at the bottom can successfully mitigate air-ingress into the reactor even after on-set natural circulation occurs.

  13. Occupational Health Hazards among Healthcare Workers in Kampala, Uganda

    PubMed Central

    Yu, Xiaozhong; Buregyeya, Esther; Musoke, David; Wang, Jia-Sheng; Halage, Abdullah Ali; Whalen, Christopher; Bazeyo, William; Williams, Phillip; Ssempebwa, John

    2015-01-01

    Objective. To assess the occupational health hazards faced by healthcare workers and the mitigation measures. Methods. We conducted a cross-sectional study utilizing quantitative data collection methods among 200 respondents who worked in 8 major health facilities in Kampala. Results. Overall, 50.0% of respondents reported experiencing an occupational health hazard. Among these, 39.5% experienced biological hazards while 31.5% experienced nonbiological hazards. Predictors for experiencing hazards included not wearing the necessary personal protective equipment (PPE), working overtime, job related pressures, and working in multiple health facilities. Control measures to mitigate hazards were availing separate areas and containers to store medical waste and provision of safety tools and equipment. Conclusion. Healthcare workers in this setting experience several hazards in their workplaces. Associated factors include not wearing all necessary protective equipment, working overtime, experiencing work related pressures, and working in multiple facilities. Interventions should be instituted to mitigate the hazards. Specifically PPE supply gaps, job related pressures, and complacence in adhering to mitigation measures should be addressed. PMID:25802531

  14. Geospatial Approach on Landslide Hazard Zonation Mapping Using Multicriteria Decision Analysis: A Study on Coonoor and Ooty, Part of Kallar Watershed, The Nilgiris, Tamil Nadu

    NASA Astrophysics Data System (ADS)

    Rahamana, S. Abdul; Aruchamy, S.; Jegankumar, R.

    2014-12-01

    Landslides are one of the critical natural phenomena that frequently lead to serious problems in hilly area, resulting to loss of human life and property, as well as causing severe damage to natural resources. The local geology with high degree of slope coupled with high intensity of rainfall along with unplanned human activities of the study area causes many landslides in this region. The present study area is more attracted by tourist throughout the year, so this area must be considered for preventive measures. Geospatial based Multicriteria decision analysis (MCDA) technique is increasingly used for landslide vulnerability and hazard zonation mapping. It enables the integration of different data layers with different levels of uncertainty. In this present study, it is used analytic hierarchy process (AHP) method to prepare landslide hazard zones of the Coonoor and Ooty, part of Kallar watershed, The Nilgiris, Tamil Nadu. The study was carried out using remote sensing data, field surveys and geographic information system (GIS) tools. The ten factors that influence landslide occurrence, such as elevation, slope aspect, slope angle, drainage density, lineament density, soil, precipitation, land use/land cover (LULC), distance from road and NDVI were considered. These factors layers were extracted from the various related spatial data's. These factors were evaluated, and then, the individual factor weight and class weight were assigned to each of the related factors. The Landslide Hazard Zone Index (LHZI) was calculated using Multicriteria decision analysis (MCDA) the technique based on the assigned weight and the rating is given by the Analytical Hierarchy Process (AHP) method. The final cumulative map of the study area was categorized into four hazard zones and classified as zone I to IV. There are 3.56% of the area comes under the hazard zone IV fallowed by 48.19% of the area comes under zone III, 43.63 % of the area in zone II and 4.61% of the area comes hazard zone I. Further resulted hazard zone map and landuse/landcover map are overlaid to check the hazard status, and existing inventory of known landslides within the present study area was compared with the resulting vulnerable and hazard zone maps. The landslide hazard zonation map is useful for landslide hazard prevention, mitigation, and improvement to society, and proper planning for land use and construction in the future.

  15. Development, Implementation, and Pilot Evaluation of a Model-Driven Envelope Protection System to Mitigate the Hazard of In-Flight Ice Contamination on a Twin-Engine Commuter Aircraft

    NASA Technical Reports Server (NTRS)

    Martos, Borja; Ranaudo, Richard; Norton, Billy; Gingras, David; Barnhart, Billy

    2014-01-01

    Fatal loss-of-control accidents have been directly related to in-flight airframe icing. The prototype system presented in this report directly addresses the need for real-time onboard envelope protection in icing conditions. The combination of prior information and real-time aerodynamic parameter estimations are shown to provide sufficient information for determining safe limits of the flight envelope during inflight icing encounters. The Icing Contamination Envelope Protection (ICEPro) system was designed and implemented to identify degradations in airplane performance and flying qualities resulting from ice contamination and provide safe flight-envelope cues to the pilot. The utility of the ICEPro system for mitigating a potentially hazardous icing condition was evaluated by 29 pilots using the NASA Ice Contamination Effects Flight Training Device. Results showed that real time assessment cues were effective in reducing the number of potentially hazardous upset events and in lessening exposure to loss of control following an incipient upset condition. Pilot workload with the added ICEPro displays was not measurably affected, but pilot opinion surveys showed that real time cueing greatly improved their awareness of a hazardous aircraft state. The performance of ICEPro system was further evaluated by various levels of sensor noise and atmospheric turbulence.

  16. 44 CFR 78.5 - Flood Mitigation Plan development.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 44 Emergency Management and Assistance 1 2011-10-01 2011-10-01 false Flood Mitigation Plan..., DEPARTMENT OF HOMELAND SECURITY INSURANCE AND HAZARD MITIGATION National Flood Insurance Program FLOOD MITIGATION ASSISTANCE § 78.5 Flood Mitigation Plan development. A Flood Mitigation Plan will articulate...

  17. 44 CFR 78.5 - Flood Mitigation Plan development.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 44 Emergency Management and Assistance 1 2014-10-01 2014-10-01 false Flood Mitigation Plan..., DEPARTMENT OF HOMELAND SECURITY INSURANCE AND HAZARD MITIGATION National Flood Insurance Program FLOOD MITIGATION ASSISTANCE § 78.5 Flood Mitigation Plan development. A Flood Mitigation Plan will articulate...

  18. 44 CFR 78.5 - Flood Mitigation Plan development.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 44 Emergency Management and Assistance 1 2010-10-01 2010-10-01 false Flood Mitigation Plan..., DEPARTMENT OF HOMELAND SECURITY INSURANCE AND HAZARD MITIGATION National Flood Insurance Program FLOOD MITIGATION ASSISTANCE § 78.5 Flood Mitigation Plan development. A Flood Mitigation Plan will articulate...

  19. 44 CFR 78.5 - Flood Mitigation Plan development.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 44 Emergency Management and Assistance 1 2012-10-01 2011-10-01 true Flood Mitigation Plan..., DEPARTMENT OF HOMELAND SECURITY INSURANCE AND HAZARD MITIGATION National Flood Insurance Program FLOOD MITIGATION ASSISTANCE § 78.5 Flood Mitigation Plan development. A Flood Mitigation Plan will articulate...

  20. 44 CFR 78.5 - Flood Mitigation Plan development.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 44 Emergency Management and Assistance 1 2013-10-01 2013-10-01 false Flood Mitigation Plan..., DEPARTMENT OF HOMELAND SECURITY INSURANCE AND HAZARD MITIGATION National Flood Insurance Program FLOOD MITIGATION ASSISTANCE § 78.5 Flood Mitigation Plan development. A Flood Mitigation Plan will articulate...

  1. Nonpoint-Source Agricultural Hazard Index: A Case Study of the Province of Cremona, Italy

    NASA Astrophysics Data System (ADS)

    Trevisan, Marco; Padovani, Laura; Capri, Ettore

    2000-11-01

    This paper reports the results of a study aimed at the evaluation of the hazard level of farming activities in the province of Cremona, Italy, with particular reference to groundwater. The applied methodology employs a parametric approach based on the definition of potential hazard indexes (nonpoint-source agricultural hazard indexes, NPSAHI). Two categories of parameters were considered: the hazard factors (HF), which represent all farming activities that cause or might cause an impact on groundwater (use of fertilizers and pesticides, application of livestock and poultry manure, food industry wastewater, and urban sludge), and the control factors (CF), which adapt the hazard factor to the characteristics of the site (geographical location, slope, agronomic practices, and type of irrigation). The hazard index (HI) can be calculated multiplying the hazard factors by the control factors and, finally, the NPSAHI are obtained dividing HI into classes on a percentile basis using a scale ranging from 1 to 10. Organization, processing, and display of all data layers were performed using the geographical information system (GIS) ArcView and its Spatial Analyst extension. Results show that the potential hazard of groundwater pollution by farming activities in the province of Cremona falls mainly in the fifth class (very low hazard).

  2. Plant protection system optimization studies to mitigate consequences of large breaks in the Advanced Neutron Source Reactor

    SciTech Connect

    Khayat, M.I.; March-Leuba, J.

    1993-10-01

    This paper documents some of the optimization studies performed to maximize the performance of the engineered safety features and scram systems to mitigate the consequences of large breaks in the primary cooling system of the Advanced Neutron Source (ANS) Reactor.

  3. Hazardous waste cleanup: A case study for developing efficient programs

    SciTech Connect

    Elcock, D.; Puder, M.G.

    1995-06-01

    As officials in Pacific Basin Countries develop laws and policies for cleaning up hazardous wastes, experiences of countries with such instruments in place may be instructive. The United States has addressed cleanups of abandoned hazardous waste sites through the Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA). The US Congress enacted CERCLA in 1980. The task of cleaning up waste sites became larger and more costly than originally envisioned and as a result, Congress strengthened and expanded CERCLA in 1986. Today, many industry representatives, environmentalists, and other interested parties say the program is still costly and ineffective, and Congress is responding through a reauthorization process to change the law once again. Because the law and modifications to it can affect company operations and revenues, industries want to know the potential consequences of such changes. Argonne National Laboratory (ANL) recently developed a baseline for one economic sector -- the US energy industry -- against which impacts of proposed changes to CERCLA could be measured. Difficulties encountered in locating and interpreting the data for developing that baseline suggest that legislation should not only provide for meeting its stated goals (e.g., protection of human health and the environment) but also allow for its efficient evaluation over time. This lesson can be applied to any nation contemplating hazardous waste cleanup laws and policies.

  4. Landslide hazard analysis - a case study in WuShe reservoir catchment

    NASA Astrophysics Data System (ADS)

    Huang, C. M.

    2014-12-01

    A complete landslide inventory covering a long time span is the foundation for landslide hazard analysis. Previous studies only estimate landslide susceptibility because enough landslide inventory was usually unavailable. This study collects the spot image of ten events from 1994 to 2009 of WuShe reservoir catchment in central Taiwan. All of landslide inventories were manual interpretation from spot image. The recent and expanded landslide inventory due to the event was identified from the landslide inventory of pre and post event. The recent landslide inventory was used to conduct landslide hazard evaluation.This study follows the landslide hazard analysis framework of CNR-IRPR in Italy, which had demonstrated how the spatial probability, temporal probability and size probability of each slope-unit were calculated. Landslide hazard in WuShe reservoir catchment was obtained following the procedure. The preliminary validated result shows that the prediction of landslide hazard in high landslide hazard region was better, when compared with landslide susceptibility.However, the reliability of temporal probability and size probability were related to the number of landslide inventories. When no landslide was recorded, or lack of suitable landslide inventory, would lead to the incorrect analysis of landslide hazard. The event based spatial probability was affected by rainfall distribution, but the rainfall distribution of different typhoons were usually highly variable. Next phase of this study would attempt to discuss these issues and develop a suitable framework for landslide hazard analysis in Taiwan.

  5. Accuracy Assessment Study of UNB3m Neutral Atmosphere Model for Global Tropospheric Delay Mitigation

    NASA Astrophysics Data System (ADS)

    Farah, Ashraf

    2015-12-01

    Tropospheric delay is the second major source of error after the ionospheric delay for satellite navigation systems. The transmitted signal could face a delay caused by the troposphere of over 2m at zenith and 20m at lower satellite elevation angles of 10 degrees and below. Positioning errors of 10m or greater can result from the inaccurate mitigation of the tropospheric delay. Many techniques are available for tropospheric delay mitigation consisting of surface meteorological models and global empirical models. Surface meteorological models need surface meteorological data to give high accuracy mitigation while the global empirical models need not. Several hybrid neutral atmosphere delay models have been developed by (University of New Brunswick, Canada) UNB researchers over the past decade or so. The most widely applicable current version is UNB3m, which uses the Saastamoinen zenith delays, Niell mapping functions, and a look-up table with annual mean and amplitude for temperature, pressure, and water vapour pressure varying with respect to latitude and height. This paper presents an assessment study of the behaviour of the UNB3m model compared with highly accurate IGS-tropospheric estimation for three different (latitude/height) IGS stations. The study was performed over four nonconsecutive weeks on different seasons over one year (October 2014 to July 2015). It can be concluded that using UNB3m model gives tropospheric delay correction accuracy of 0.050m in average for low latitude regions in all seasons. The model's accuracy is about 0.075m for medium latitude regions, while its highest accuracy is about 0.014m for high latitude regions.

  6. A Study on Integrated Community Based Flood Mitigation with Remote Sensing Technique in Kota Bharu, Kelantan

    NASA Astrophysics Data System (ADS)

    'Ainullotfi, A. A.; Ibrahim, A. L.; Masron, T.

    2014-02-01

    This study is conducted to establish a community based flood management system that is integrated with remote sensing technique. To understand local knowledge, the demographic of the local society is obtained by using the survey approach. The local authorities are approached first to obtain information regarding the society in the study areas such as the population, the gender and the tabulation of settlement. The information about age, religion, ethnic, occupation, years of experience facing flood in the area, are recorded to understand more on how the local knowledge emerges. Then geographic data is obtained such as rainfall data, land use, land elevation, river discharge data. This information is used to establish a hydrological model of flood in the study area. Analysis were made from the survey approach to understand the pattern of society and how they react to floods while the analysis of geographic data is used to analyse the water extent and damage done by the flood. The final result of this research is to produce a flood mitigation method with a community based framework in the state of Kelantan. With the flood mitigation that involves the community's understanding towards flood also the techniques to forecast heavy rainfall and flood occurrence using remote sensing, it is hope that it could reduce the casualties and damage that might cause to the society and infrastructures in the study area.

  7. Hazard mitigation related to water and sediment fluxes in the Yellow River basin, China, based on comparable basins of the United States

    USGS Publications Warehouse

    Osterkamp, W.R.; Gray, J.R.

    2003-01-01

    The Yellow River, north-central China, and comparative rivers of the western United States, the Rio Grande and the Colorado River, derive much of their flows from melting snow at high elevations, but derive most of their se diment loads from semiarid central parts of the basins. The three rivers are regulated by larg e reservoirs that store water and sediment, causing downstream channel scour and, farthe r downstream, flood hazard owing to re- deposition of sediment. Potential approaches to reducing continui ng bed aggradation and increasing flood hazard along the lower Yellow Ri ver include flow augmentation, retirement of irrigation that decreases flows and increas es erosion, and re-routing of the middle Yellow River to bypass large sediment i nputs of the Loess Plateau.

  8. Observational Studies of Earthquake Preparation and Generation to Mitigate Seismic Risks in Mines

    NASA Astrophysics Data System (ADS)

    Durrheim, R. J.; Ogasawara, H.; Nakatani, M.; Milev, A.; Cichowicz, A.; Kawakata, H.; Yabe, Y.; Murakami, O.; Naoi, M. M.; Moriya, H.; Satoh, T.

    2011-12-01

    We provide a status report on a 5-year project to monitor in-situ fault instability and strong motion in South African gold mines. The project has two main aims: (1) To learn more about earthquake preparation and generation mechanisms by deploying dense arrays of high-sensitivity sensors within rock volumes where mining is likely to induce significant seismic activity. (2) To upgrade the South African national surface seismic network in the mining districts. This knowledge will contribute to efforts to upgrade schemes of seismic hazard assessment and to limit and mitigate the seismic risks in deep mines. As of 31 July 2011, 46 boreholes totalling 1.9 km in length had been drilled at project sites at Ezulwini, Moab-Khotsong and Driefontein gold mines. Several dozen more holes are still to be drilled. Acoustic emission sensors, strain- and tiltmeters, and controlled seismic sources are being installed to monitor the deformation of the rock mass, the accumulation of damage during the preparation phase, and changes in dynamic stress as the rupture front propagates. These data will be integrated with measurements of stope closure, stope strong motion, seismic data recorded by the mine-wide network, and stress modelling. Preliminary results will be reported at AGU meeting. The project is endorsed by the Japan Science and Technology Agency (JST), Japan International Cooperation Agency (JICA) and the South African government. It is funded by the JST-JICA program for Science and Technology Research Partnership for Sustainable development (SATREPS, the Council for Scientific and Industrial Research (CSIR), the Council for Geoscience, the University of the Witwatersrand and the Department of Science and Technology. The contributions of Seismogen CC, OHMS Ltd, AnglogoldAshanti Rock Engineering Applied Research Group, First Uranium, the Gold Fields Seismic Department and the Institute of Mine Seismology are gratefully acknowledged.

  9. Feasibility study--computerized application of the hazardous material regulations

    SciTech Connect

    Ferrada, J.J.; Green, V.M.; Rawl, R.R.

    1992-09-01

    The feasibility of developing a full expert system for transportation and packaging of hazardous and radioactive materials was initiated within the framework of three subtasks: (1) analysis of commercial packages related to regulation scanning, (2) analysis of computer languages to develop the expert system, and (3) development of expert system prototypes. The strategy to develop the latter subtask was to first,develop modules to capture the knowledge of different areas of transportation and packaging and second, to analyze the feasibility of appending these different modules in one final full package. The individual modules development contemplated one prototype for transporting and packaging of radioactive material and another for transporting hazardous chemical materials. In the event that it is not feasible to link these two packages, the modules can always be used as stand-alone tools, or linked as a single package with some restrictions in their applicability. The work done during this fiscal year has focused on developing a prototype for transporting radioactive materials.

  10. Landslide hazard evaluation: a review of current techniques and their application in a multi-scale study, Central Italy

    NASA Astrophysics Data System (ADS)

    Guzzetti, Fausto; Carrara, Alberto; Cardinali, Mauro; Reichenbach, Paola

    1999-12-01

    In recent years, growing population and expansion of settlements and life-lines over hazardous areas have largely increased the impact of natural disasters both in industrialized and developing countries. Third world countries have difficulty meeting the high costs of controlling natural hazards through major engineering works and rational land-use planning. Industrialized societies are increasingly reluctant to invest money in structural measures that can reduce natural risks. Hence, the new issue is to implement warning systems and land utilization regulations aimed at minimizing the loss of lives and property without investing in long-term, costly projects of ground stabilization. Government and research institutions worldwide have long attempted to assess landslide hazard and risks and to portray its spatial distribution in maps. Several different methods for assessing landslide hazard were proposed or implemented. The reliability of these maps and the criteria behind these hazard evaluations are ill-formalized or poorly documented. Geomorphological information remains largely descriptive and subjective. It is, hence, somewhat unsuitable to engineers, policy-makers or developers when planning land resources and mitigating the effects of geological hazards. In the Umbria and Marche Regions of Central Italy, attempts at testing the proficiency and limitations of multivariate statistical techniques and of different methodologies for dividing the territory into suitable areas for landslide hazard assessment have been completed, or are in progress, at various scales. These experiments showed that, despite the operational and conceptual limitations, landslide hazard assessment may indeed constitute a suitable, cost-effective aid to land-use planning. Within this framework, engineering geomorphology may play a renewed role in assessing areas at high landslide hazard, and helping mitigate the associated risk.

  11. Study on mitigation of pulsed heat load for ITER cryogenic system

    NASA Astrophysics Data System (ADS)

    Peng, N.; Xiong, L. Y.; Jiang, Y. C.; Tang, J. C.; Liu, L. Q.

    2015-03-01

    One of the key requirements for ITER cryogenic system is the mitigation of the pulsed heat load deposited in the magnet system due to magnetic field variation and pulsed DT neutron production. As one of the control strategies, bypass valves of Toroidal Field (TF) case helium loop would be adjusted to mitigate the pulsed heat load to the LHe plant. A quasi-3D time-dependent thermal-hydraulic analysis of the TF winding packs and TF case has been performed to study the behaviors of TF magnets during the reference plasma scenario with the pulses of 400 s burn and repetition time of 1800 s. The model is based on a 1D helium flow and quasi-3D solid heat conduction model. The whole TF magnet is simulated taking into account thermal conduction between winding pack and case which are cooled separately. The heat loads are given as input information, which include AC losses in the conductor, eddy current losses in the structure, thermal radiation, thermal conduction and nuclear heating. The simulation results indicate that the temperature variation of TF magnet stays within the allowable range when the smooth control strategy is active.

  12. Emergency planning for hazardous industrial areas: a Brazilian case study.

    PubMed

    de Souza, A B

    2000-08-01

    One of the characteristics of modern industrial development is the emergence of a new typology of accidents whose effects can be spread, in space as well as in time, well beyond the borders of the installations where they occur, sometimes impacting the local population and the environment in a catastrophic fashion. This is the result of a number of factors that have changed the risk profile of modern industrial activities. For a number of reasons, the developing countries have proved to be more vulnerable to industrial disasters. Three of the most catastrophic industrial accidents--Bhopal, San Juan de Ixhuatepec, and Cubatão--occurred in developing countries, claiming thousands of lives. During the 1970s and 1980s the higher degree of public visibility of industrial hazards as a result of serious accidents, led to the creation, especially in the more industrialized countries, of regulations for greater control over industrial activities, either by means of new laws or by updating existing legislation. Some of these regulations were designed to improve the response to accidents with potential impacts outside the industrial sites. This article attempts to describe the current status and identify the shortcomings of off-site emergency planning for hazardous industrial areas in Brazil. The most important problems are the lack of specific legislation and the absence of awareness and active participation of public authorities. The experience of an off-site emergency planning process for a Brazilian industrial area is presented. This experience illustrates how difficult it is to prepare and implement emergency planning processes in an industrializing country. PMID:11051072

  13. Interactive hazards education program for youth in a low SES community: a quasi-experimental pilot study.

    PubMed

    Webb, Michelle; Ronan, Kevin R

    2014-10-01

    A pilot study of an interactive hazards education program was carried out in Canberra (Australia), with direct input from youth participants. Effects were evaluated in relation to youths' interest in disasters, motivation to prepare, risk awareness, knowledge indicators, perceived preparedness levels, planning and practice for emergencies, and fear and anxiety indicators. Parents also provided ratings, including of actual home-based preparedness activities. Using a single group pretest-posttest with benchmarking design, a sample of 20 youths and their parents from a low SES community participated. Findings indicated beneficial changes on a number of indicators. Preparedness indicators increased significantly from pre- to posttest on both youth (p < 0.01) and parent ratings (p < 0.01). Parent ratings reflected an increase of just under six home-based preparedness activities. Youth knowledge about disaster mitigation also was seen to increase significantly (p < 0.001), increasing 39% from pretest levels. While personalized risk perceptions significantly increased (p < 0.01), anxiety and worry levels were seen either not to change (generalized anxiety, p > 0.05) or to reduce between pre- and posttest (hazards-specific fears, worry, and distress, ps ranged from p < 0.05 to < 0.001). In terms of predictors of preparedness, a number of variables were found to predict posttest preparedness levels, including information searching done by participants between education sessions. These pilot findings are the first to reflect quasi-experimental outcomes for a youth hazards education program carried out in a setting other than a school that focused on a sample of youth from a low SES community. PMID:24888406

  14. BICAPA case study of natural hazards that trigger technological disasters

    NASA Astrophysics Data System (ADS)

    Boca, Gabriela; Ozunu, Alexandru; Nicolae Vlad, Serban

    2010-05-01

    Industrial facilities are vulnerable to natural disasters. Natural disasters and technological accidents are not always singular or isolated events. The example in this paper show that they can occur in complex combinations and/or in rapid succession, known as NaTech disasters, thereby triggering multiple impacts. This analysis indicates that NaTech disasters have the potential to trigger hazmat releases and other types of technological accidents. Climate changes play an important role in prevalence and NATECH triggering mechanisms. Projections under the IPCC IS92 a scenario (similar to SRES A1B; IPCC, 1992) and two GCMs indicate that the risk of floods increases in central and eastern Europe. Increase in intense short-duration precipitation is likely to lead to increased risk of flash floods. (Lehner et al., 2006). It is emergent to develop tools for the assessment of risks due to NATECH events in the industrial processes, in a framework starting with the characterization of frequency and severity of natural disasters and continuing with complex analysis of industrial processes, to risk assessment and residual functionality analysis. The Ponds with dangerous technological residues are the most vulnerable targets of natural hazards. Technological accidents such as those in Baia Mare, (from January to March 2000) had an important international echo. Extreme weather phenomena, like those in the winter of 2000 in Baia Mare, and other natural disasters such as floods or earthquakes, can cause a similar disaster at Târnăveni in Transylvania Depression. During 1972 - 1978 three decanting ponds were built on the Chemical Platform Târnăveni, now SC BICAPA SA, for disposal of the hazardous-wastes resulting from the manufacture of sodium dichromate, inorganic salts, sludge from waste water purification and filtration, wet gas production from carbide. The ponds are located on the right bank of the river Târnava at about 35-50m from the flooding defense dam. The total amount of toxic waste stored in the three ponds is about 2500 tons, equivalent at 128 tons expressed in hexavalent chromium. The ponds contour dikes are strongly damaged in many places, their safety is jeopardized by leakages, sliding slopes and ravens. The upstream dike has an increased failure risk. The upstream dike has an increased failure risk. In that section the coefficients of safety are under the allowable limit, both in static applications, and the earthquake. The risk of failure is very high also due to the dikes slopes. The risk becomes higher in case of heavy rainfall, floods or an earthquake.

  15. Mitigation and prevention of exertional heat stress in firefighters: a review of cooling strategies for structural firefighting and hazardous materials responders.

    PubMed

    McEntire, Serina J; Suyama, Joe; Hostler, David

    2013-01-01

    Most duties performed by firefighters require the use of personal protective equipment, which inhibits normal thermoregulation during exertion, creating an uncompensable heat stress. Structured rest periods are required to correct the effects of uncompensable heat stress and ensure that firefighter safety is maintained and that operations can be continued until their conclusion. While considerable work has been done to optimize firefighter cooling during fireground operations, there is little consensus on when or how cooling should be deployed. A systematic review of cooling techniques and practices among firefighters and hazardous materials operators was conducted to describe the state of the science and provide recommendations for deploying resources for fireground rehab (i.e., structured rest periods during an incident). Five electronic databases were searched using a selected combination of key words. One hundred forty publications were found in the initial search, with 27 meeting all the inclusion criteria. Two independent reviewers performed a qualitative assessment of each article based on nine specific questions. From the selected literature, the efficacy of multiple cooling strategies was compared during exertion and immediately following exertion under varying environmental conditions. When considering the literature available for cooling firefighters and hazardous materials technicians during emergency incident rehabilitation, widespread use of cooling devices does not appear to be warranted if ambient temperature and humidity approximate room temperature and protective garments can be removed. When emergency incident rehabilitation must be conducted in hot or humid conditions, active cooling devices are needed. Hand/forearm immersion is likely the best modality for cooling during rehab under hot, humid conditions; however, this therapy has a number of limitations. Cooling during work thus far has been limited primarily to cooling vests and liquid- or air-cooled suits. In general, liquid-perfused suits appear to be superior to air-cooled garments, but both add weight to the firefighter, making current iterations less desirable. There is still considerable work to be done to determine the optimal cooling strategies for firefighters and hazardous materials operators during work. PMID:23379781

  16. A study of shock mitigating materials in a split Hopkins bar configuration. Phase 2

    SciTech Connect

    Bateman, V.I.; Brown, F.A.; Hansen, N.R.

    1997-12-31

    Sandia National Laboratories (SNL) designs mechanical systems with electronics that must survive high shock environments. These mechanical systems include penetrators that must survive soil and rock penetration, nuclear transportation casks that must survive transportation environments, and laydown weapons that must survive delivery impact. These mechanical systems contain electronics that may operate during and after the high shock environment and that must be protected from the high shock environments. A study has been started to improve the packaging techniques for the advanced electronics utilized in these mechanical systems because current packaging techniques are inadequate for these sensitive electronics. In many cases, it has been found that the packaging techniques currently used not only do not mitigate the shock environment but actually amplify the shock environment. An ambitious goal for this packaging study is to avoid amplification and possibly attenuate the shock environment before it reached the electronics contained in the various mechanical systems. Here, a study to compare two thickness values, 0.125 and 0.250 in. of five materials, GE RTV 630, HS II Silicone, Polysulfide Rubber, Sylgard 184, and Teflon for their shock mitigating characteristics with a split Hopkinson bar configuration has been completed. The five materials have been tested in both unconfined and confined conditions at ambient temperature and with two applied loads of 750 {mu}{epsilon} peak (25 fps peak) with a 100 {micro}s duration, measured at 10% amplitude, and 1500 {mu}{epsilon} peak (50 fps peak) with a 100 {micro}s duration, measured at 10% amplitude. The five materials have been tested at ambient, cold ({minus}65 F), and hot (+165 F) for the unconfined condition with the 750 {mu}{epsilon} peak (25 fps peak) applied load. Time domain and frequency domain analyses of the split Hopkinson bar data have been performed to compare how these materials lengthen the shock pulse, attenuate the shock pulse, reflect high frequency content in the shock pulse, and transmit energy.

  17. Echo-sounding method aids earthquake hazard studies

    USGS Publications Warehouse

    U.S. Geological Survey

    1995-01-01

    Dramatic examples of catastrophic damage from an earthquake occurred in 1989, when the M 7.1 Lorna Prieta rocked the San Francisco Bay area, and in 1994, when the M 6.6 Northridge earthquake jolted southern California. The surprising amount and distribution of damage to private property and infrastructure emphasizes the importance of seismic-hazard research in urbanized areas, where the potential for damage and loss of life is greatest. During April 1995, a group of scientists from the U.S. Geological Survey and the University of Tennessee, using an echo-sounding method described below, is collecting data in San Antonio Park, California, to examine the Monte Vista fault which runs through this park. The Monte Vista fault in this vicinity shows evidence of movement within the last 10,000 years or so. The data will give them a "picture" of the subsurface rock deformation near this fault. The data will also be used to help locate a trench that will be dug across the fault by scientists from William Lettis & Associates.

  18. Multihazard risk analysis and disaster planning for emergency services as a basis for efficient provision in the case of natural hazards - case study municipality of Au, Austria

    NASA Astrophysics Data System (ADS)

    Maltzkait, Anika; Pfurtscheller, Clemens

    2014-05-01

    Multihazard risk analysis and disaster planning for emergency services as a basis for efficient provision in the case of natural hazards - case study municipality of Au, Austria A. Maltzkait (1) & C. Pfurtscheller (1) (1) Institute for Interdisciplinary Mountain Research (IGF), Austrian Academy of Sciences, Innsbruck, Austria The extreme flood events of 2002, 2005 and 2013 in Austria underlined the importance of local emergency services being able to withstand and reduce the adverse impacts of natural hazards. Although for legal reasons municipal emergency and crisis management plans exist in Austria, they mostly do not cover risk analyses of natural hazards - a sound, comparable assessment to identify and evaluate risks. Moreover, total losses and operational emergencies triggered by natural hazards have increased in recent decades. Given sparse public funds, objective budget decisions are needed to ensure the efficient provision of operating resources, like personnel, vehicles and equipment in the case of natural hazards. We present a case study of the municipality of Au, Austria, which was hardly affected during the 2005 floods. Our approach is primarily based on a qualitative risk analysis, combining existing hazard plans, GIS data, field mapping and data on operational efforts of the fire departments. The risk analysis includes a map of phenomena discussed in a workshop with local experts and a list of risks as well as a risk matrix prepared at that workshop. On the basis for the exact requirements for technical and non-technical mitigation measures for each natural hazard risk were analysed in close collaboration with members of the municipal operation control and members of the local emergency services (fire brigade, Red Cross). The measures includes warning, evacuation and, technical interventions with heavy equipment and personnel. These results are used, first, to improve the municipal emergency and crisis management plan by providing a risk map, and a list of risks and, second, to check if the local emergency forces can cope with the different risk scenarios using locally available resources. The emergency response plans will identify possible resource deficiencies in personnel, vehicles and equipment. As qualitative methods and data are used, uncertainties in the study emerged in finding definitions for safety targets, in the construction of the different risk scenarios, in the inherent uncertainty beyond the probability of occurrence and the intensity of natural hazards, also in the case of the expectable losses. Finally, we used available studies and expert interviews to develop objective rules for investment decisions for the fire departments and the Red Cross to present an empirically sound basis for the efficient provision of intervention in the case of natural hazards for the municipality of Au. Again, the regulations for objective provision were developed in close collaboration with the emergency services.

  19. Melatonin mitigate cerebral vasospasm after experimental subarachnoid hemorrhage: a study of synchrotron radiation angiography

    NASA Astrophysics Data System (ADS)

    Cai, J.; He, C.; Chen, L.; Han, T.; Huang, S.; Huang, Y.; Bai, Y.; Bao, Y.; Zhang, H.; Ling, F.

    2013-06-01

    Cerebral vasospasm (CV) after subarachnoid hemorrhage (SAH) is a devastating and unsolved clinical issue. In this study, the rat models, which had been induced SAH by prechiasmatic cistern injection, were treated with melatonin. Synchrotron radiation angiography (SRA) was employed to detect and evaluate CV of animal models. Neurological scoring and histological examinations were used to assess the neurological deficits and CV as well. Using SRA techniques and histological analyses, the anterior cerebral artery diameters of SAH rats with melatonin administration were larger than those without melatonin treatment (p < 0.05). The neurological deficits of SAH rats treated with melatonin were less than those without melatonin treatment (p < 0.05). We concluded that SRA was a precise and in vivo tool to observe and evaluate CV of SAH rats; intraperitoneally administration of melatonin could mitigate CV after experimental SAH.

  20. Study on mobility-disadvantage group' risk perception and coping behaviors of abrupt geological hazards in coastal rural area of China.

    PubMed

    Pan, Anping

    2016-07-01

    China is a country highly vulnerable to abrupt geological hazards. The present study aims to investigate disaster preparedness and perception of abrupt geological disasters (such as rock avalanches, landslide, mud-rock flows etc) in mobility-disadvantage group living in coastal rural area of China. This research is to take into account all factors regarding disasters and to design the questionnaires accordingly. Two debris flow vulnerable townships are selected as study areas including Hedi Township in Qinyuan County and Xianxi Township in Yueqing City which are located in East China's Zhejiang Province. SPSS was applied to conduct descriptive analysis, which results in an effective empirical model for evacuation behavior of the disable groups. The result of this study shows mobility-disadvantage groups' awareness on disaster prevention and mitigation is poor and their knowledge about basic theory and emergency response is limited. Errors and distortions in public consciousness on disaster prevention and mitigation stimulate the development of areas with frequent disasters, which will expose more life and property to danger and aggravate the vulnerability of hazard bearing body. In conclusion, before drafting emergency planning, the government should consider more the disable group's expectations and actual evacuation behavior than the request of the situation to ensure the planning is good to work. PMID:27174691

  1. A combined approach to physical vulnerability of large cities exposed to natural hazards - the case study of Arequipa, Peru

    NASA Astrophysics Data System (ADS)

    Thouret, Jean-Claude; Ettinger, Susanne; Zuccaro, Giulio; Guitton, Mathieu; Martelli, Kim; Degregorio, Daniela; Nardone, Stefano; Santoni, Olivier; Magill, Christina; Luque, Juan Alexis; Arguedas, Ana

    2013-04-01

    Arequipa, the second largest city in Peru with almost one million inhabitants, is exposed to various natural hazards, such as earthquakes, landslides, flash floods, and volcanic eruptions. This study focuses on the vulnerability and response of housing, infrastructure and lifelines in Arequipa to flash floods and eruption induced hazards, notably lahars from El Misti volcano. We propose a combined approach for assessing physical vulnerability in a large city based on: (1) remote sensing utilizing high-resolution imagery (SPOT5, Google Earth Pro, Bing, Pléïades) to map the distribution and type of land use, properties of city blocks in terms of exposure to the hazard (elevation above river level, distance to channel, impact angle, etc.); (2) in situ survey of buildings and critical infrastructure (e.g., bridges) and strategic resources (e.g., potable water, irrigation, sewage); (3) information gained from interviews with engineers involved in construction works, previous crises (e.g., June 2001 earthquake) and risk mitigation in Arequipa. Remote sensing and mapping at the scale of the city has focused on three pilot areas, along the perennial Rio Chili valley that crosses the city and oasis from north to south, and two of the east-margin tributaries termed Quebrada (ravine): San Lazaro crossing the northern districts and Huarangal crossing the northeastern districts. Sampling of city blocks through these districts provides varying geomorphic, structural, historical, and socio-economic characteristics for each sector. A reconnaissance survey included about 900 edifices located in 40 city blocks across districts of the pilot areas, distinct in age, construction, land use and demographics. A building acts as a structural system and its strength and resistance to flashfloods and lahars therefore highly depends on the type of construction and the used material. Each building surveyed was assigned to one of eight building categories based on physical criteria (dominant building materials, number of floors, percentage and quality of openings, etc). Future steps in this study include mapping potential impacts from flash flood and lahars as a function of frequency of occurrence and magnitude. For this purpose, we will regroup the eight building types identified in Arequipa to obtain a reduced number of vulnerability categories. Fragility functions will then be established for each vulnerability category and hazard relating percentage damage to parameters such as flow velocity, depth, and dynamic and hydrostatic pressure. These functions will be applied to flow simulations for each of the three river channels considered with the final goal to determine potential losses, identify areas of particularly high risk and to prepare plans for evacuation, relocation and rehabilitation. In the long term, this investigation aims to contribute towards a multi-hazard risk analysis including earthquake- and other volcanic hazards, e.g. ashfall and pyroclastic flows, all by considering the cascading effects of a hazard chain. We also plan to address the consequences of failure of two artificial lake dams located 40 and 70 km north of the city. A lake breakout flood or lahar would propagate beyond the city and would call for an immediate response including contingency plans and evacuation practices.

  2. 44 CFR 78.6 - Flood Mitigation Plan approval process.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 44 Emergency Management and Assistance 1 2010-10-01 2010-10-01 false Flood Mitigation Plan..., DEPARTMENT OF HOMELAND SECURITY INSURANCE AND HAZARD MITIGATION National Flood Insurance Program FLOOD MITIGATION ASSISTANCE § 78.6 Flood Mitigation Plan approval process. The State POC will forward all...

  3. 44 CFR 78.6 - Flood Mitigation Plan approval process.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 44 Emergency Management and Assistance 1 2013-10-01 2013-10-01 false Flood Mitigation Plan..., DEPARTMENT OF HOMELAND SECURITY INSURANCE AND HAZARD MITIGATION National Flood Insurance Program FLOOD MITIGATION ASSISTANCE § 78.6 Flood Mitigation Plan approval process. The State POC will forward all...

  4. 44 CFR 78.6 - Flood Mitigation Plan approval process.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 44 Emergency Management and Assistance 1 2012-10-01 2011-10-01 true Flood Mitigation Plan approval..., DEPARTMENT OF HOMELAND SECURITY INSURANCE AND HAZARD MITIGATION National Flood Insurance Program FLOOD MITIGATION ASSISTANCE § 78.6 Flood Mitigation Plan approval process. The State POC will forward all...

  5. 44 CFR 78.6 - Flood Mitigation Plan approval process.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 44 Emergency Management and Assistance 1 2011-10-01 2011-10-01 false Flood Mitigation Plan..., DEPARTMENT OF HOMELAND SECURITY INSURANCE AND HAZARD MITIGATION National Flood Insurance Program FLOOD MITIGATION ASSISTANCE § 78.6 Flood Mitigation Plan approval process. The State POC will forward all...

  6. 44 CFR 78.6 - Flood Mitigation Plan approval process.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 44 Emergency Management and Assistance 1 2014-10-01 2014-10-01 false Flood Mitigation Plan..., DEPARTMENT OF HOMELAND SECURITY INSURANCE AND HAZARD MITIGATION National Flood Insurance Program FLOOD MITIGATION ASSISTANCE § 78.6 Flood Mitigation Plan approval process. The State POC will forward all...

  7. A simulator study investigating how motorcyclists approach side-road hazards.

    PubMed

    Crundall, Elizabeth; Stedmon, Alex W; Saikayasit, Rossukorn; Crundall, David

    2013-03-01

    The most common form of motorcycle collision in the UK occurs when another road user fails to give way and pulls out from a side road in front of an oncoming motorcyclist. While research has considered these collisions from the car driver's perspective, no research to date has addressed how motorcyclists approach these potential hazards. This study conducted a detailed analysis of motorcyclist speed and road position on approach to side-roads in a simulated suburban setting. Novice, Experienced and Advanced riders rode two laps of a simulated route, encountering five side-roads on each lap. On the second lap, a car emerged from the first side-road in a typical 'looked but failed to see' accident scenario. Three Experienced riders and one Novice rider collided with the hazard. The Advanced rider group adopted the safest strategy when approaching side-roads, with a lane position closer to the centre of the road and slower speeds. In contrast, Experienced riders chose faster speeds, often over the speed limit, especially when approaching junctions with good visibility. Rider behaviour at non-hazard junctions was compared between laps, to investigate if riders modified their behaviour after experiencing the hazard. Whilst all riders were generally more cautious after the hazard, the Advanced riders modified their behaviour more than the other groups after the hazard vehicle had pulled out. The results suggest that advanced training can lead to safer riding styles that are not acquired by experience alone. PMID:23182782

  8. Modeling effects of urban heat island mitigation strategies on heat-related morbidity: a case study for Phoenix, Arizona, USA

    NASA Astrophysics Data System (ADS)

    Silva, Humberto R.; Phelan, Patrick E.; Golden, Jay S.

    2010-01-01

    A zero-dimensional energy balance model was previously developed to serve as a user-friendly mitigation tool for practitioners seeking to study the urban heat island (UHI) effect. Accordingly, this established model is applied here to show the relative effects of four common mitigation strategies: increasing the overall (1) emissivity, (2) percentage of vegetated area, (3) thermal conductivity, and (4) albedo of the urban environment in a series of percentage increases by 5, 10, 15, and 20% from baseline values. In addition to modeling mitigation strategies, we present how the model can be utilized to evaluate human health vulnerability from excessive heat-related events, based on heat-related emergency service data from 2002 to 2006. The 24-h average heat index is shown to have the greatest correlation to heat-related emergency calls in the Phoenix (Arizona, USA) metropolitan region. The four modeled UHI mitigation strategies, taken in combination, would lead to a 48% reduction in annual heat-related emergency service calls, where increasing the albedo is the single most effective UHI mitigation strategy.

  9. Blast effect on the lower extremities and its mitigation: a computational study.

    PubMed

    Dong, Liqiang; Zhu, Feng; Jin, Xin; Suresh, Mahi; Jiang, Binhui; Sevagan, Gopinath; Cai, Yun; Li, Guangyao; Yang, King H

    2013-12-01

    A series of computational studies were performed to investigate the response of the lower extremities of mounted soldiers under landmine detonation. A numerical human body model newly developed at Wayne State University was used to simulate two types of experimental studies and the model predictions were validated against test data in terms of the tibia axial force as well as bone fracture pattern. Based on the validated model, the minimum axial force causing tibia facture was found. Then a series of parametric studies was conducted to determine the critical velocity (peak velocity of the floor plate) causing tibia fracture at different upper/lower leg angles. In addition, to limit the load transmission through the vehicular floor, two types of energy absorbing materials, namely IMPAXX(®) foam and aluminum alloy honeycomb, were selected for floor matting. Their performances in terms of blast effect mitigation were compared using the validated numerical model, and it has been found that honeycomb is a more efficient material for blast injury prevention under the loading conditions studied. PMID:23973770

  10. Public willingness to pay for CO2 mitigation and the determinants under climate change: a case study of Suzhou, China.

    PubMed

    Yang, Jie; Zou, Liping; Lin, Tiansheng; Wu, Ying; Wang, Haikun

    2014-12-15

    This study explored the factors that influence respondents' willingness to pay (WTP) for CO2 mitigation under climate change. A questionnaire survey combined with contingent valuation and psychometric paradigm methods were conducted in the city of Suzhou, Jiangsu Province in China. Respondents' traditional demographic attributes, risk perception of greenhouse gas (GHG), and attitude toward the government's risk management practices were established using a Tobit model to analyze the determinants. The results showed that about 55% of the respondents refused to pay for CO2 mitigation, respondent's WTP increased with increasing CO2 mitigation percentage. Important factors influencing WTP include people's feeling of dread of GHGs, confidence in policy, the timeliness of governmental information disclosure, age, education and income level. PMID:25151109

  11. Studying and Improving Human Response to Natural Hazards: Lessons from the Virtual Hurricane Lab

    NASA Astrophysics Data System (ADS)

    Meyer, R.; Broad, K.; Orlove, B. S.

    2010-12-01

    One of the most critical challenges facing communities in areas prone to natural hazards is how to best encourage residents to invest in individual and collective actions that would reduce the damaging impact of low-probability, high-consequence, environmental events. Unfortunately, what makes this goal difficult to achieve is that the relative rarity natural hazards implies that many who face the risk of natural hazards have no previous experience to draw on when making preparation decisions, or have prior experience that provides misleading guidance on how best to prepare. For example, individuals who have experienced strings of minor earthquakes or near-misses from tropical cyclones may become overly complacent about the risks that extreme events actually pose. In this presentation we report the preliminary findings of a program of work that explores the use of realistic multi-media hazard simulations designed for two purposes: 1) to serve as a basic research tool for studying of how individuals make decisions to prepare for rare natural hazards in laboratory settings; and 2) to serve as an educational tool for giving people in hazard-prone areas virtual experience in hazard preparation. We demonstrate a prototype simulation in which participants experience the approach of a virtual hurricane, where they have the opportunity to invest in different kinds of action to protect their home from damage. As the hurricane approaches participants have access to an “information dashboard” in which they can gather information about the storm threat from a variety of natural sources, including mock television weather broadcasts, web sites, and conversations with neighbors. In response to this information they then have the opportunity to invest in different levels of protective actions. Some versions of the simulation are designed as games, where participants are rewarded based on their ability to make the optimal trade-off between under and over-preparing for the threat. From a basic research perspective the data provide valuable potential insights into the dynamics of information gathering prior to hurricane impacts, as well as laboratory in which we can study how both information gathering and responses varies in responses to controlled variations in such factors as the complexity of forecast information. From an applied perspective the simulations provide an opportunity for residents in hazard-prone areas to learn about different kinds of information and receive feedback on their potential biases prior to an actual encounter with a hazard. The presentation concludes with a summary of some of the basic research findings that have emerged from the hurricane lab to date, as well as a discussion of the prospects for extending the technology to a broad range of environmental hazards.

  12. Underground Coal-Fires in Xinjiang, China: A Continued Effort in Applying Geophysics to Solve a Local Problem and to Mitigate a Global Hazard

    NASA Astrophysics Data System (ADS)

    Wuttke, M. W.; Halisch, M.; Tanner, D. C.; Cai, Z. Y.; Zeng, Q.; Wang, C.

    2012-04-01

    Spontaneous uncontrolled coal seam fires are a well known phenomenon that causes severe environmental problems and severe impact on natural coal reserves. Coal fires are a worldwide phenomenon, but in particular in Xinjiang, that covers 17.3 % of Chinas area and hosts approx 42 % of its coal resources. In Xinjiang since more than 50 years a rigorous strategy for fire fighting on local and regional scale is persued. The Xinjiang Coalfield Fire Fighting Bureau (FFB) has developed technologies and methods to deal with any known fire. Many fires have been extinguished already, but the problem is still there if not even growing. This problem is not only a problem for China due to the loss of valuable energy resources, but it is also a worldwide threat because of the generation of substantial amounts of greenhouse gases. Through the FFB, China is struggling to overcome this, but the activities could be much enhanced by the continuation of the already successful conjoint operations. The last ten years have seen two successful cooperative projects between China and Germany on the field of coal-fire fighting, namely the German Technical Cooperation Project on Coal Fire in Xinjiang and the Sino-German Coal Fire Research Initiative funded by the corresponding ministeries of both countries. A persistent task in the fire fighting is the identification and supervision of areas with higher risks for the ignition of coal fires, the exploration of already ignited fire zones to extinguish the fires and the monitoring of extinguished fires to detect as early as possible process that may foster re-ignition. This can be achieved by modeling both the structures and the processes that are involved. This has also been a promising part of the past cooperation projects, yet to be transformed into a standard application of fire fighting procedures. In this contribution we describe the plans for a new conjoint project between China and Germany where on the basis of field investigations and laboratory measurements realistic dynamical models of fire-zones are constructed to increase the understanding of particular coal-fires, to interpret the surface signatures of the coal-fire in terms of location and propagation and to estimate the output of hazardous exhaust products to evaluate the economic benefit of fire extinction.

  13. Communicating Volcanic Hazards in the North Pacific

    NASA Astrophysics Data System (ADS)

    Dehn, J.; Webley, P.; Cunningham, K. W.

    2014-12-01

    For over 25 years, effective hazard communication has been key to effective mitigation of volcanic hazards in the North Pacific. These hazards are omnipresent, with a large event happening in Alaska every few years to a decade, though in many cases can happen with little or no warning (e.g. Kasatochi and Okmok in 2008). Here a useful hazard mitigation strategy has been built on (1) a large database of historic activity from many datasets, (2) an operational alert system with graduated levels of concern, (3) scenario planning, and (4) routine checks and communication with emergency managers and the public. These baseline efforts are then enhanced in the time of crisis with coordinated talking points, targeted studies and public outreach. Scientists naturally tend to target other scientists as their audience, whereas in effective monitoring of hazards that may only occur on year to decadal timescales, details can distract from the essentially important information. Creating talking points and practice in public communications can help make hazard response a part of the culture. Promoting situational awareness and familiarity can relieve indecision and concerns at the time of a crisis.

  14. Economic valuation of flood mitigation services: A case study from the Otter Creek, VT.

    NASA Astrophysics Data System (ADS)

    Galford, G. L.; Ricketts, T.; Bryan, K. L.; ONeil-Dunne, J.; Polasky, S.

    2014-12-01

    The ecosystem services provided by wetlands are widely recognized but difficult to quantify. In particular, estimating the effect of landcover and land use on downstream flood outcomes remains challenging, but is increasingly important in light of climate change predictions of increased precipitation in many areas. Economic valuation can help incorporate ecosystem services into decisions and enable communities to plan for climate and flood resiliency. Here we estimate the economic value of Otter Creek wetlands for Middlebury, VT in mitigating the flood that followed Tropical Storm Irene, as well as for ten historic floods. Observationally, hydrographs above and below the wetlands in the case of each storm indicated the wetlands functioned as a temporary reservoir, slowing the delivery of water to Middlebury. We compare observed floods, based on Middlebury's hydrograph, with simulated floods for scenarios without wetlands. To simulate these "without wetlands" scenarios, we assume the same volume of water was delivered to Middlebury, but in a shorter time pulse similar to a hydrograph upstream of the wetlands. For scenarios with and without wetlands, we map the spatial extent of flooding using LiDAR digital elevation data. We then estimate flood depth at each affected building, and calculate monetary losses as a function of the flood depth and house value using established depth damage relationships. For example, we expect damages equal to 20% of the houses value for a flood depth of two feet in a two-story home with a basement. We define the value of flood mitigation services as the difference in damages between the with and without wetlands scenario, and find that the Otter Creek wetlands reduced flood damage in Middlebury by 88% following Hurricane Irene. Using the 10 additional historic floods, we estimate an ongoing mean value of $400,000 in avoided damages per year. Economic impacts of this magnitude stress the importance of wetland conservation and warrant the consideration of ecosystem services in land use decisions. Our study indicates that here and elsewhere, green infrastructure may have to potential to increase the resilience of communities to projected changes in climate.

  15. Seismic hazard analysis application of methodology, results, and sensitivity studies. Volume 4

    SciTech Connect

    Bernreuter, D. L

    1981-08-08

    As part of the Site Specific Spectra Project, this report seeks to identify the sources of and minimize uncertainty in estimates of seismic hazards in the Eastern United States. Findings are being used by the Nuclear Regulatory Commission to develop a synthesis among various methods that can be used in evaluating seismic hazard at the various plants in the Eastern United States. In this volume, one of a five-volume series, we discuss the application of the probabilistic approach using expert opinion. The seismic hazard is developed at nine sites in the Central and Northeastern United States, and both individual experts' and synthesis results are obtained. We also discuss and evaluate the ground motion models used to develop the seismic hazard at the various sites, analyzing extensive sensitivity studies to determine the important parameters and the significance of uncertainty in them. Comparisons are made between probabilistic and real spectral for a number of Eastern earthquakes. The uncertainty in the real spectra is examined as a function of the key earthquake source parameters. In our opinion, the single most important conclusion of this study is that the use of expert opinion to supplement the sparse data available on Eastern United States earthquakes is a viable approach for determining estimted seismic hazard in this region of the country. 29 refs., 15 tabs.

  16. Mini-Sosie high-resolution seismic method aids hazards studies

    USGS Publications Warehouse

    Stephenson, W.J.; Odum, J.; Shedlock, K.M.; Pratt, T.L.; Williams, R.A.

    1992-01-01

    The Mini-Sosie high-resolution seismic method has been effective in imaging shallow-structure and stratigraphic features that aid in seismic-hazard and neotectonic studies. The method is not an alternative to Vibroseis acquisition for large-scale studies. However, it has two major advantages over Vibroseis as it is being used by the USGS in its seismic-hazards program. First, the sources are extremely portable and can be used in both rural and urban environments. Second, the shifting-and-summation process during acquisition improves the signal-to-noise ratio and cancels out seismic noise sources such as cars and pedestrians. -from Authors

  17. Pyrotechnic hazards classification and evaluation program test report. Heat flux study of deflagrating pyrotechnic munitions

    NASA Technical Reports Server (NTRS)

    Fassnacht, P. O.

    1971-01-01

    A heat flux study of deflagrating pyrotechnic munitions is presented. Three tests were authorized to investigate whether heat flux measurements may be used as effective hazards evaluation criteria to determine safe quantity distances for pyrotechnics. A passive sensor study was conducted simultaneously to investigate their usefulness in recording events and conditions. It was concluded that heat flux measurements can effectively be used to evaluate hazards criteria and that passive sensors are an inexpensive tool to record certain events in the vicinity of deflagrating pyrotechnic stacks.

  18. Study of the environmental hazard caused by the oil shale industry solid waste.

    PubMed

    Põllumaa, L; Maloveryan, A; Trapido, M; Sillak, H; Kahru, A

    2001-01-01

    The environmental hazard was studied of eight soil and solid waste samples originating from a region of Estonia heavily polluted by the oil shale industry. The samples were contaminated mainly with oil products (up to 7231mg/kg) and polycyclic aromatic hydrocarbons (PAHs; up to 434mg/kg). Concentrations of heavy metals and water-extractable phenols were low. The toxicities of the aqueous extracts of solid-phase samples were evaluated by using a battery of Toxkit tests (involving crustaceans, protozoa, rotifers and algae). Waste rock and fresh semi-coke were classified as of "high acute toxic hazard", whereas aged semi-coke and most of the polluted soils were classified as of "acute toxic hazard". Analysis of the soil slurries by using the photobacterial solid-phase flash assay showed the presence of particle-bound toxicity in most samples. In the case of four samples out of the eight, chemical and toxicological evaluations both showed that the levels of PAHs, oil products or both exceeded their respective permitted limit values for the living zone (20mg PAHs/kg and 500mg oil products/kg); the toxicity tests showed a toxic hazard. However, in the case of three samples, the chemical and toxicological hazard predictions differed markedly: polluted soil from the Erra River bank contained 2334mg oil/kg, but did not show any water-extractable toxicity. In contrast, spent rock and aged semi-coke that contained none of the pollutants in hazardous concentrations, showed adverse effects in toxicity tests. The environmental hazard of solid waste deposits from the oil shale industry needs further assessment. PMID:11387023

  19. Seismic hazard assessment of the cultural heritage sites: A case study in Cappadocia (Turkey)

    NASA Astrophysics Data System (ADS)

    Seyrek, Evren; Orhan, Ahmet; Dinçer, İsmail

    2014-05-01

    Turkey is one of the most seismically active regions in the world. Major earthquakes with the potential of threatening life and property occur frequently here. In the last decade, over 50,000 residents lost their lives, commonly as a result of building failures in seismic events. The Cappadocia region is one of the most important touristic sites in Turkey. At the same time, the region has been included to the Word Heritage List by UNESCO at 1985 due to its natural, historical and cultural values. The region is undesirably affected by several environmental conditions, which are subjected in many previous studies. But, there are limited studies about the seismic evaluation of the region. Some of the important historical and cultural heritage sites are: Goreme Open Air Museum, Uchisar Castle, Ortahisar Castle, Derinkuyu Underground City and Ihlara Valley. According to seismic hazard zonation map published by the Ministry of Reconstruction and Settlement these heritage sites fall in Zone III, Zone IV and Zone V. This map show peak ground acceleration or 10 percent probability of exceedance in 50 years for bedrock. In this connection, seismic hazard assessment of these heritage sites has to be evaluated. In this study, seismic hazard calculations are performed both deterministic and probabilistic approaches with local site conditions. A catalog of historical and instrumental earthquakes is prepared and used in this study. The seismic sources have been identified for seismic hazard assessment based on geological, seismological and geophysical information. Peak Ground Acceleration (PGA) at bed rock level is calculated for different seismic sources using available attenuation relationship formula applicable to Turkey. The result of the present study reveals that the seismic hazard at these sites is closely matching with the Seismic Zonation map published by the Ministry of Reconstruction and Settlement. Keywords: Seismic Hazard Assessment, Probabilistic Approach, Deterministic Approach, Historical Heritage, Cappadocia.

  20. An Independent Evaluation of the FMEA/CIL Hazard Analysis Alternative Study

    NASA Technical Reports Server (NTRS)

    Ray, Paul S.

    1996-01-01

    The present instruments of safety and reliability risk control for a majority of the National Aeronautics and Space Administration (NASA) programs/projects consist of Failure Mode and Effects Analysis (FMEA), Hazard Analysis (HA), Critical Items List (CIL), and Hazard Report (HR). This extensive analytical approach was introduced in the early 1970's and was implemented for the Space Shuttle Program by NHB 5300.4 (1D-2. Since the Challenger accident in 1986, the process has been expanded considerably and resulted in introduction of similar and/or duplicated activities in the safety/reliability risk analysis. A study initiated in 1995, to search for an alternative to the current FMEA/CIL Hazard Analysis methodology generated a proposed method on April 30, 1996. The objective of this Summer Faculty Study was to participate in and conduct an independent evaluation of the proposed alternative to simplify the present safety and reliability risk control procedure.

  1. A Food Effect Study of an Oral Thrombin Inhibitor and Prodrug Approach To Mitigate It.

    PubMed

    Lee, Jihye; Kim, Bongchan; Kim, Tae Hun; Lee, Sun Hwa; Park, Hee Dong; Chung, Kyungha; Lee, Sung-Hack; Paek, Seungyup; Kim, Eunice EunKyeong; Yoon, SukKyoon; Kim, Aeri

    2016-04-01

    LB30870, a new direct thrombin inhibitor, showed 80% reduction in oral bioavailability in fed state. The present study aims to propose trypsin binding as a mechanism for such negative food effect and demonstrate a prodrug approach to mitigate food effect. Effect of food composition on fed state oral bioavailability of LB30870 was studied in dogs. Various prodrugs were synthesized, and their solubility, permeability, and trypsin binding affinity were measured. LB30870 and prodrugs were subject to cocrystallization with trypsin, and the X-ray structures of cocrystals were determined. Food effect was studied in dogs for selected prodrugs. Protein or lipid meal appeared to affect oral bioavailability of LB30870 in dogs more than carbohydrate meal. Blocking both carboxyl and amidine groups of LB30870 resulted in trypsin Ki values orders of magnitude higher than that of LB30870. Prodrugs belonged to either Biopharmaceutical Classification System I, II, or III. X-ray crystallography revealed that prodrugs did not bind to trypsin, but instead their hydrolysis product at the amidine blocking group formed cocrystal with trypsin. A prodrug with significantly less food effect than LB30870 was identified. Binding of prodrugs to food components such as dietary fiber appeared to counteract the positive effect brought with the prodrug approach. Further formulation research is warranted to enhance the oral bioavailability of prodrugs. In conclusion, this study is the first to demonstrate that the negative food effect of LB30870 can be attributed to trypsin binding. Trypsin binding study is proposed as a screening tool during lead optimization to minimize food effect. PMID:26886576

  2. A Case Study in Ethical Decision Making Regarding Remote Mitigation of Botnets

    NASA Astrophysics Data System (ADS)

    Dittrich, David; Leder, Felix; Werner, Tillmann

    It is becoming more common for researchers to find themselves in a position of being able to take over control of a malicious botnet. If this happens, should they use this knowledge to clean up all the infected hosts? How would this affect not only the owners and operators of the zombie computers, but also other researchers, law enforcement agents serving justice, or even the criminals themselves? What dire circumstances would change the calculus about what is or is not appropriate action to take? We review two case studies of long-lived malicious botnets that present serious challenges to researchers and responders and use them to illuminate many ethical issues regarding aggressive mitigation. We make no judgments about the questions raised, instead laying out the pros and cons of possible choices and allowing workshop attendees to consider how and where they would draw lines. By this, we hope to expose where there is clear community consensus as well as where controversy or uncertainty exists.

  3. Assessment of environmental impact on air quality by cement industry and mitigating measures: a case study.

    PubMed

    Kabir, G; Madugu, A I

    2010-01-01

    In this study, environmental impact on air quality was evaluated for a typical Cement Industry in Nigeria. The air pollutants in the atmosphere around the cement plant and neighbouring settlements were determined using appropriate sampling techniques. Atmospheric dust and CO2 were prevalent pollutants during the sampling period; their concentrations were recorded to be in the range of 249-3,745 mg/m3 and 2,440-2,600 mg/m3, respectively. Besides atmospheric dust and CO2, the air pollutants such as NOx, SOx and CO were in trace concentrations, below the safe limits approved by FEPA that are 0.0062-0.093 mg/m3 NOx, 0.026 mg/m3 SOx and 114.3 mg/m3 CO, respectively. Some cost-effective mitigating measures were recommended that include the utilisation of readily available and low-cost pozzolans material to produce blended cement, not only could energy efficiency be improved, but carbon dioxide emission could also be minimised during clinker production; and the installation of an advance high-pressure grinding rolls (clinker-roller-press process) to maximise energy efficiency to above what is obtainable from the traditional ball mills and to minimise CO2 emission from the power plant. PMID:19067202

  4. Radon mitigation studies: South Central Florida demonstration. Final report, November 1987-January 1991

    SciTech Connect

    Fowler, C.S.; Williamson, A.D.; Pyle, B.E.; Belzer, F.E.; Coker, R.N.

    1992-10-01

    The report gives results of an EPA radon mitigation project involving 14 slab-on-grade houses in Polk County, FL, having indoor radon levels of 320-3810 Bq/cu m (8.7-103 pCi/L), using sub-slab depressurization (SSD) in a variety of applications to evaluate optimal design criteria to be recommended as cost-effective and capable of reducing indoor radon concentrations in houses built over compacted soil fills. For all houses, obvious accessible radon entry points were sealed, and 53-90 L (12-20 gal.) suction pits were dug into the fill material. Two of the houses were mitigated with exterior horizontal suction holes drilled through the stem walls. In four houses, one or more suction pipes were in the garage. The remainder of the interior suction holes were in closets or some other unobtrusive location. Except for the two houses with exterior systems, the other 12 had mitigation fans in the attic. In-line centrifugal fans were used to mitigate each house, although a larger radial blower was installed overnight for experimental purposes in one house, and a vaccumcleaner was used to simulate a larger suction in another house for pressure field measurements only. Post-mitigation worst-case radon concentrations in these houses ranged from 40 to 290 Bq/cu m.

  5. Recommendations for water supply in arsenic mitigation: a case study from Bangladesh.

    PubMed

    Hoque, B A; Mahmood, A A; Quadiruzzaman, M; Khan, F; Ahmed, S A; Shafique, S A; Rahman, M; Morshed, G; Chowdhury, T; Rahman, M M; Khan, F H; Shahjahan, M; Begum, M; Hoque, M M

    2000-11-01

    Arsenic problems have been observed in several countries around the world. The challenges of arsenic mitigation are more difficult for developing and poor countries due to resource and other limitations. Bangladesh is experiencing the worst arsenic problem in the world, as about 30 million people are possibly drinking arsenic contaminated water. Lack of knowledge has hampered the mitigation initiatives. This paper presents experience gained during an action research on water supply in arsenic mitigation in rural Singair, Bangladesh. The mitigation has been implemented there through integrated research and development of appropriate water supply options and its use through community participation. Political leaders and women played key roles in the success of the mitigation. More than one option for safe water has been developed and/or identified. The main recommendations include: integration of screening of tubewells and supply of safe water, research on technological and social aspects, community, women and local government participation, education and training of all stakeholders, immediate and appropriate use of the available knowledge, links between intermediate/immediate and long term investment, effective coordination and immediate attention by health, nutrition, agriculture, education, and other programs to this arsenic issue. PMID:11114764

  6. Coastal dynamics studies for evaluation of hazard and vulnerability for coastal erosion. case study the town La Bocana, Buenaventura, colombian pacific

    NASA Astrophysics Data System (ADS)

    Coca-Domínguez, Oswaldo; Ricaurte-Villota, Constanza

    2015-04-01

    The analysis of the hazard and vulnerability in coastal areas caused for erosion is based on studies of coastal dynamics since that allows having a better information detail that is useful for decision-making in aspects like prevention, mitigation, disaster reduction and integrated risk management. The Town of La Bocana, located in Buenaventura (Colombian Pacific) was selected to carry out the threat assessment for coastal erosion based on three components: i) magnitude, ii) occurrence and iii) susceptibility. Vulnerability meanwhile, is also composed of three main components for its evaluation: i) exposure ii) fragility and iii) resilience, which in turn are evaluated in 6 dimensions of vulnerability: physical, social, economic, ecological, institutional and cultural. The hazard analysis performed used a semi-quantitative approach, and an index of variables such as type of geomorphological unit, type of beach, exposure of the surfing coast, occurrence, among others. Quantitative data of coastal retreat was measured through the use of DSAS (Digital Shoreline Analysis System) an application of ArcGIS, as well as the development of digital elevation models from the beach and 6 beach profiles strategically located on the coast obtained with GNSS technology. Sediment samples collected from these beaches, medium height and wave direction were used as complementary data. The information was integrated across the coast line into segments of 250 x 250 meters. 4 sectors are part of the coastal area of La Bocana: Pianguita, Vistahermosa, Donwtown and Shangay. 6 vulnerability dimensions units were taken from these population, as well as its density for exposure, wich was analyzed through a multi-array method that include variables such as, land use, population, type of structure, education, basic services, among others, to measure frailty, and their respective indicator of resilience. The hazard analysis results indicate that Vistahermosa is in very high threat, while Donwtown and Pianguita are in a medium hazard. Particularly these two sectors have the mayor population density and the biggest hotel development and services infraestructure; meanwhile Shangay was scored with low hazard because the wave action has no direct impact on it. Vulnerability analysis suggest that the sector of Shangay has a very high vulnerability status because it is a sector that does not have any basic services and have low levels of schooling, meanwhile Downtown, Vistahermosa and Pianguita are in the average of vulnerability. Additionally, it was determined that in recent years the sector of Vista hermosa the erosion rates are up to -xx m yr-1, while in other sectors the regression of the coastline can be associated with local tidal peaks that occur during April and October, while other months of the year are typically for recovery and stability processes.

  7. FMECA application to Rainfall Hazard prevention in olive trees growings

    NASA Astrophysics Data System (ADS)

    Buendia-Buendía, F. S.; Bermudez, R.; Tarquis, A. M.; Andina, D.

    2010-05-01

    The FMECA (Failure Mode Effects and Criticality Analysis) is a broadly extended System Safety tool applied in industries as Aerospace in order to prevent hazards. This methodology studies the different failure modes of a system and try to mitigate them in a systematic procedure. In this paper this tool is applied in order to mitigate economical impact hazards derived from Rainfalls to olive trees growing in Granada (Spain), understanding hazard from the System Safety perspective (Any real or potential condition that can cause injury, illness, or death to personnel; damage to or loss of a system, equipment or property; or damage to the environment). The work includes a brief introduction to the System Safety and FMECA methodologies, applying then these concepts to analyze the Olive trees as a system and identify the hazards during the different stages of the whole life cycle plant production.

  8. Software safety hazard analysis

    SciTech Connect

    Lawrence, J.D.

    1996-02-01

    Techniques for analyzing the safety and reliability of analog-based electronic protection systems that serve to mitigate hazards in process control systems have been developed over many years, and are reasonably well understood. An example is the protection system in a nuclear power plant. The extension of these techniques to systems which include digital computers is not well developed, and there is little consensus among software engineering experts and safety experts on how to analyze such systems. One possible technique is to extend hazard analysis to include digital computer-based systems. Software is frequently overlooked during system hazard analyses, but this is unacceptable when the software is in control of a potentially hazardous operation. In such cases, hazard analysis should be extended to fully cover the software. A method for performing software hazard analysis is proposed in this paper.

  9. A Study of Airline Passenger Susceptibility to Atmospheric Turbulence Hazard

    NASA Technical Reports Server (NTRS)

    Stewart, Eric C.

    2000-01-01

    A simple, generic, simulation math model of a commercial airliner has been developed to study the susceptibility of unrestrained passengers to large, discrete gust encounters. The math model simulates the longitudinal motion to vertical gusts and includes (1) motion of an unrestrained passenger in the rear cabin, (2) fuselage flexibility, (3) the lag in the downwash from the wing to the tail, and (4) unsteady lift effects. Airplane and passenger response contours are calculated for a matrix of gust amplitudes and gust lengths of a simulated mountain rotor. A comparison of the model-predicted responses to data from three accidents indicates that the accelerations in actual accidents are sometimes much larger than the simulated gust encounters.

  10. Hazardous waste source-reduction study with treated groundwater recycling

    SciTech Connect

    Chang, L.Y.; McCoy, B.J. )

    1993-08-01

    A feasibility study is presented for modifying electroplating processes for source reduction. Ion exchange and reverse osmosis units are suggested to allow reclaiming and recycling of metal solutions. A particular example of water conservation in an electroplating shop is presented for the treatment and utilization of groundwater contaminated by hydrocarbon chemicals, including volatile organic compounds (VOCs) and gasoline products. Granular carbon adsorption, UV oxidation, and demineralization steps and alkalinity control measures for the groundwater are discussed. Engineering and economic analyses provide a basis for comparing alternative designs. An integrated scheme, including groundwater remediation and source reduction, is feasible for the plating shop. The removal of VOCs and demineralization of the polluted groundwater are important steps. With the integrated plan, 90% removal or recovery of heavy metals can be achieved, and water usage and wastewater can be reduced by 90%. Thus, it is feasible to prevent water pollution at the source and to recycle treated groundwater and wastewater for the manufacturing process.

  11. Voltage Sag Mitigation Strategies for an Indian Power Systems: A Case Study

    NASA Astrophysics Data System (ADS)

    Goswami, A. K.; Gupta, C. P.; Singh, G. K.

    2014-08-01

    Under modern deregulated environment, both utilities and customers are concerned with the power quality improvement but with different objectives/interests. The utility reconfigure its power network and install mitigation devices, if needed, to improve power quality. The paper presents a strategy for selecting cost-effective solutions to mitigate voltage sags, the most frequent power quality disturbance. In this paper, mitigation device(s) is/are inducted in the optimal network topology at suitable places for their better effectiveness for further improvement in power quality. The optimal placement is looked from utility perspectives for overall benefit. Finally, their performance is evaluated on the basis of reduction in total number of voltage sags, reduction in total number of process trips and reduction in total financial losses due to voltage sags.

  12. Landslide hazard mapping with selected dominant factors: A study case of Penang Island, Malaysia

    NASA Astrophysics Data System (ADS)

    Tay, Lea Tien; Alkhasawneh, Mutasem Sh.; Ngah, Umi Kalthum; Lateh, Habibah

    2015-05-01

    Landslide is one of the destructive natural geohazards in Malaysia. In addition to rainfall as triggering factos for landslide in Malaysia, topographical and geological factors play important role in the landslide susceptibility analysis. Conventional topographic factors such as elevation, slope angle, slope aspect, plan curvature and profile curvature have been considered as landslide causative factors in many research works. However, other topographic factors such as diagonal length, surface area, surface roughness and rugosity have not been considered, especially for the research work in landslide hazard analysis in Malaysia. This paper presents landslide hazard mapping using Frequency Ratio (FR) and the study area is Penang Island of Malaysia. Frequency ratio approach is a variant of probabilistic method that is based on the observed relationships between the distribution of landslides and each landslide-causative factor. Landslide hazard map of Penang Island is produced by considering twenty-two (22) landslide causative factors. Among these twenty-two (22) factors, fourteen (14) factors are topographic factors. They are elevation, slope gradient, slope aspect, plan curvature, profile curvature, general curvature, tangential curvature, longitudinal curvature, cross section curvature, total curvature, diagonal length, surface area, surface roughness and rugosity. These topographic factors are extracted from the digital elevation model of Penang Island. The other eight (8) non-topographic factors considered are land cover, vegetation cover, distance from road, distance from stream, distance from fault line, geology, soil texture and rainfall precipitation. After considering all twenty-two factors for landslide hazard mapping, the analysis is repeated with fourteen dominant factors which are selected from the twenty-two factors. Landslide hazard map was segregated into four categories of risks, i.e. Highly hazardous area, Hazardous area, Moderately hazardous area and Not hazardous area. The maps was assessed using ROC (Rate of Curve) based on the area under the curve method (AUC). The result indicates an increase of accuracy from 77.76% (with all 22 factors) to 79.00% (with 14 dominant factors) in the prediction of landslide occurrence.

  13. Landslide hazard mapping with selected dominant factors: A study case of Penang Island, Malaysia

    SciTech Connect

    Tay, Lea Tien; Alkhasawneh, Mutasem Sh.; Ngah, Umi Kalthum; Lateh, Habibah

    2015-05-15

    Landslide is one of the destructive natural geohazards in Malaysia. In addition to rainfall as triggering factos for landslide in Malaysia, topographical and geological factors play important role in the landslide susceptibility analysis. Conventional topographic factors such as elevation, slope angle, slope aspect, plan curvature and profile curvature have been considered as landslide causative factors in many research works. However, other topographic factors such as diagonal length, surface area, surface roughness and rugosity have not been considered, especially for the research work in landslide hazard analysis in Malaysia. This paper presents landslide hazard mapping using Frequency Ratio (FR) and the study area is Penang Island of Malaysia. Frequency ratio approach is a variant of probabilistic method that is based on the observed relationships between the distribution of landslides and each landslide-causative factor. Landslide hazard map of Penang Island is produced by considering twenty-two (22) landslide causative factors. Among these twenty-two (22) factors, fourteen (14) factors are topographic factors. They are elevation, slope gradient, slope aspect, plan curvature, profile curvature, general curvature, tangential curvature, longitudinal curvature, cross section curvature, total curvature, diagonal length, surface area, surface roughness and rugosity. These topographic factors are extracted from the digital elevation model of Penang Island. The other eight (8) non-topographic factors considered are land cover, vegetation cover, distance from road, distance from stream, distance from fault line, geology, soil texture and rainfall precipitation. After considering all twenty-two factors for landslide hazard mapping, the analysis is repeated with fourteen dominant factors which are selected from the twenty-two factors. Landslide hazard map was segregated into four categories of risks, i.e. Highly hazardous area, Hazardous area, Moderately hazardous area and Not hazardous area. The maps was assessed using ROC (Rate of Curve) based on the area under the curve method (AUC). The result indicates an increase of accuracy from 77.76% (with all 22 factors) to 79.00% (with 14 dominant factors) in the prediction of landslide occurrence.

  14. Flood hazards studies in the Mississippi River basin using remote sensing

    NASA Technical Reports Server (NTRS)

    Rango, A.; Anderson, A. T.

    1974-01-01

    The Spring 1973 Mississippi River flood was investigated using remotely sensed data from ERTS-1. Both manual and automatic analyses of the data indicated that ERTS-1 is extremely useful as a regional tool for flood mamagement. Quantitative estimates of area flooded were made in St. Charles County, Missouri and Arkansas. Flood hazard mapping was conducted in three study areas along the Mississippi River using pre-flood ERTS-1 imagery enlarged to 1:250,000 and 1:100,000 scale. Initial results indicate that ERTS-1 digital mapping of flood prone areas can be performed at 1:62,500 which is comparable to some conventional flood hazard map scales.

  15. Respiratory hazards in hard metal workers: a cross sectional study.

    PubMed Central

    Meyer-Bisch, C; Pham, Q T; Mur, J M; Massin, N; Moulin, J J; Teculescu, D; Carton, B; Pierre, F; Baruthio, F

    1989-01-01

    A cross sectional study was conducted on 513 employees at three hard metal plants: 425 exposed workers (351 men, 74 women) and 88 controls (69 men, 19 women). Cough and sputum were more frequent in workers engaged in "soft powder" and presintering workshops compared with controls (12.5% and 16.5% v 3.5%). Spirometric abnormalities were more frequent among women in sintering and finishing workshops compared with control women (56.8% v 23.8%) and abnormalities of carbon monoxide test were more frequent in exposed groups than in controls; this difference was more pronounced in women (31.4% v 5.6%) than in men (18.5% v 13%). No significant correlation was observed between duration of exposure and age adjusted lung function tests. Slight abnormalities of chest radiographs (0/1, 1/1 according to ILO classification) were more frequent in exposed men than controls (12.8% v 1.9%) and mostly in soft powder workers. In subjects with abnormal chest radiographs FVC, FEV1 and carbon monoxide indices (fractional uptake of CO or CO transfer index or both) were lower compared with those with normal chest radiographs. Although relatively mild, the clinical, radiological, and functional abnormalities uncovered call for a regular supervision of workers exposed to hard metal dust. PMID:2787666

  16. A Study on Health Hazards to Employees near Main Streets

    PubMed Central

    2012-01-01

    In order to evaluate the physical and psychological health effects from automobile air pollution, 99 employees who worked near a main street were given a general health questionnaire, and the prevalence of their subjective complaints was measured. The collected data were classified according to gender, sleep time, degree of regular exercise, self-consciousness of symptoms, length of employment, work time, rest time, and smoking status. The results obtained were summarized as follows: The scores related to health complaints regarding physical and psychological items were higher in females than in males. THI scores were higher for the < 4 hour sleep time group. The health complaint scores for physical items were higher in the regular exercise group, whereas most scores for mental items were higher in the irregular exercise groups. The health complaints scores for physical and psychological items were higher in the unhealthy symptom group than in other groups. Those employees who had worked for > 4 years showed significantly higher rates of complaints regarding the eyes and skin. THI scores were higher for the < 6 hour working time group. The smoking group showed higher scores regarding health complaints related to physical items. The THI scores of the respiratory organs, mouth, anus, and digestive organs were significantly higher for the smoking group than for the non-smoking group. In summary, this study shows that the health complaint scores regarding physical and psychological symptoms tended to be higher among the unhealthy group, the less sleep time group, the less work time group, smokers, and females. These results can be used to improve the psychosomatic health status and working environments of employees who work near a main street. PMID:24278611

  17. Mask roughness induced LER control and mitigation: aberrations sensitivity study and alternate illumination scheme

    SciTech Connect

    McClinton, Brittany M.; Naulleau, Patrick P.

    2011-03-11

    Here we conduct a mask-roughness-induced line-edge-roughness (LER) aberrations sensitivity study both as a random distribution amongst the first 16 Fringe Zernikes (for overall aberration levels of 0.25, 0.50, and 0.75nm rms) as well as an individual aberrations sensitivity matrix over the first 37 Fringe Zernikes. Full 2D aerial image modeling for an imaging system with NA = 0.32 was done for both the 22-nm and 16-nm half-pitch nodes on a rough mask with a replicated surface roughness (RSR) of 100 pm and a correlation length of 32 nm at the nominal extreme-ultraviolet lithography (EUVL) wavelength of 13.5nm. As the ideal RSR value for commercialization of EUVL is 50 pm and under, and furthermore as has been shown elsewhere, a correlation length of 32 nm of roughness on the mask sits on the peak LER value for an NA = 0.32 imaging optic, these mask roughness values and consequently the aberration sensitivity study presented here, represent a worst-case scenario. The illumination conditions were chosen based on the possible candidates for the 22-nm and 16-nm half-pitch nodes, respectively. In the 22-nm case, a disk illumination setting of {sigma} = 0.50 was used, and for the 16-nm case, crosspole illumination with {sigma} = 0.10 at an optimum offset of dx = 0 and dy = .67 in sigma space. In examining how to mitigate mask roughness induced LER, we considered an alternate illumination scheme whereby a traditional dipole's angular spectrum is extended in the direction parallel to the line-and-space mask absorber pattern to represent a 'strip'. While this illumination surprisingly provides minimal improvement to the LER as compared to several alternate illumination schemes, the overall imaging quality in terms of image-log-slope (ILS) and contrast is improved.

  18. Mask roughness induced LER control and mitigation: aberrations sensitivity study and alternate illumination scheme

    NASA Astrophysics Data System (ADS)

    McClinton, Brittany M.; Naulleau, Patrick P.

    2011-04-01

    Here we conduct a mask-roughness-induced line-edge-roughness (LER) aberrations sensitivity study both as a random distribution amongst the first 16 Fringe Zernikes (for overall aberration levels of 0.25, 0.50, and 0.75nm rms) as well as an individual aberrations sensitivity matrix over the first 37 Fringe Zernikes. Full 2D aerial image modeling for an imaging system with NA = 0.32 was done for both the 22-nm and 16-nm half-pitch nodes on a rough mask with a replicated surface roughness (RSR) of 100 pm and a correlation length of 32 nm at the nominal extreme-ultraviolet lithography (EUVL) wavelength of 13.5nm. As the ideal RSR value for commercialization of EUVL is 50 pm and under, and furthermore as has been shown elsewhere, a correlation length of 32 nm of roughness on the mask sits on the peak LER value for an NA = 0.32 imaging optic, these mask roughness values and consequently the aberration sensitivity study presented here, represent a worst-case scenario. The illumination conditions were chosen based on the possible candidates for the 22-nm and 16-nm half-pitch nodes, respectively. In the 22-nm case, a disk illumination setting of σ = 0.50 was used, and for the 16-nm case, crosspole illumination with σ = 0.10 at an optimum offset of dx = 0 and dy = .67 in sigma space. In examining how to mitigate mask roughness induced LER, we considered an alternate illumination scheme whereby a traditional dipole's angular spectrum is extended in the direction parallel to the line-and-space mask absorber pattern to represent a "strip". While this illumination surprisingly provides minimal improvement to the LER as compared to several alternate illumination schemes, the overall imaging quality in terms of image-log-slope (ILS) and contrast is improved.

  19. THE SOCIAL IMPLICATIONS OF FLAME RETARDANT CHEMICALS: A CASE STUDY IN RISK AND HAZARD PERCEPTION

    EPA Science Inventory

    This study is expected to fill an important gap in the literature by focusing on how individuals characterize exposure in terms of risk and hazard, and how this understanding can lead to concrete changes in their personal and professional lives. I expect that people differ gre...

  20. A PRELIMINARY FEASIBILITY STUDY FOR AN OFFSHORE HAZARDOUS WASTE INCINERATION FACILITY

    EPA Science Inventory

    The report summarizes a feasibility study of using an existing offshore oil platform, being offered to the Government, as a site for incineration of hazardous wastes and related research. The platform, located in the Gulf of Mexico about 100 km south of Mobile, AL, has potential ...

  1. Preparation of a national Copernicus service for detection and monitoring of land subsidence and mass movements in the context of remote sensing assisted hazard mitigation

    NASA Astrophysics Data System (ADS)

    Kalia, Andre C.; Frei, Michaela; Lege, Thomas

    2014-10-01

    Land subsidence can cause severe damage for e.g. infrastructure and buildings and mass movements even can lead to loss of live. Detection and monitoring of these processes by terrestrial measurement techniques remain a challenge due to limitations in spatial coverage and temporal resolution. Since the launch of ERS-1 in 1991 numerous scientific studies demonstrated the capability of differential SAR-Interferometry (DInSAR) for the detection of surface deformation proving the usability of this method. In order to assist the utilization of DInSAR for governmental tasks a national service-concept within the EU-ESA Program "Copernicus" is in the process of preparation. This is done by i) analyzing the user requirements, ii) developing a concept and iii) perform case studies as "proof of concept". Due to the iterative nature of this procedure governmental users as well as DInSAR experts are involved. This paper introduces the concept, shows the available SAR data archive from ERS-1/2, TerraSAR-X and TanDEM-X as well as the proposed case study. The case study is focusing on the application of advanced DInSAR methods for the detection of subsidence in a region with active gas extraction. The area of interest is located in the state of Lower Saxony in the northwest of Germany. The DInSAR analysis will be based on ERS-1/2 and on TerraSARX/ TanDEM-X SAR data. The usability of the DInSAR products will be discussed with the responsible mining authority (LBEG) in order to adapt the DInSAR products to the user needs and to evaluate the proposed concept.

  2. When does highway construction to mitigate congestion reduce carbon emissions? A Case Study: The Caldecott Tunnel

    NASA Astrophysics Data System (ADS)

    Thurlow, M. E.; Maness, H.; Wiersema, D. J.; Mcdonald, B. C.; Harley, R.; Fung, I. Y.

    2014-12-01

    The construction of the fourth bore of the Caldecott Tunnel, which connects Oakland and Moraga, CA on State Route 24, was the second largest roadway construction project in California last year with a total cost of $417 million. The objective of the fourth bore was to reduce traffic congestion before the tunnel entrance in the off-peak direction of travel, but the project was a source of conflict between policy makers and environmental and community groups concerned about the air quality and traffic impacts. We analyze the impact of the opening of the fourth bore on CO2 emissions associated with traffic. We made surface observations of CO2from a mobile platform along State Route 24 for several weeks in November 2013 incorporating the period prior to and after the opening of the fourth bore on November 16, 2013. We directly compare bottom-up and top-down approaches to estimate the change in traffic emissions associated with the fourth bore opening. A bottom-up emissions inventory was derived from the high-resolution Performance Measurement System (PeMs) dataset and the Multi-scale Motor Vehicle and Equipment Emissions System (MOVES). The emissions inventory was used to drive a box model as well as a high-resolution regional transport model (the Weather and Regional Forecasting Model). The box model was also used to derive emissions from observations in a basic inversion. We also present an analysis of long-term traffic patterns and consider the potential for compensating changes in behavior that offset the observed emissions reductions on longer timescales. Finally, we examine how the results from the Caldecott study demonstrate the general benefit of using mobile measurements for quantifying environmental impacts of congestion mitigation projects.

  3. Urban Vulnerability Assessment to Seismic Hazard through Spatial Multi-Criteria Analysis. Case Study: the Bucharest Municipality/Romania

    NASA Astrophysics Data System (ADS)

    Armas, Iuliana; Dumitrascu, Silvia; Bostenaru, Maria

    2010-05-01

    In the context of an explosive increase in value of the damage caused by natural disasters, an alarming challenge in the third millennium is the rapid growth of urban population in vulnerable areas. Cities are, by definition, very fragile socio-ecological systems with a high level of vulnerability when it comes to environmental changes and that are responsible for important transformations of the space, determining dysfunctions shown in the state of the natural variables (Parker and Mitchell, 1995, The OFDA/CRED International Disaster Database). A contributing factor is the demographic dynamic that affects urban areas. The aim of this study is to estimate the overall vulnerability of the urban area of Bucharest in the context of the seismic hazard, by using environmental, socio-economic, and physical measurable variables in the framework of a spatial multi-criteria analysis. For this approach the capital city of Romania was chosen based on its high vulnerability due to the explosive urban development and the advanced state of degradation of the buildings (most of the building stock being built between 1940 and 1977). Combining these attributes with the seismic hazard induced by the Vrancea source, Bucharest was ranked as the 10th capital city worldwide in the terms of seismic risk. Over 40 years of experience in the natural risk field shows that the only directly accessible way to reduce the natural risk is by reducing the vulnerability of the space (Adger et al., 2001, Turner et al., 2003; UN/ISDR, 2004, Dayton-Johnson, 2004, Kasperson et al., 2005; Birkmann, 2006 etc.). In effect, reducing the vulnerability of urban spaces would imply lower costs produced by natural disasters. By applying the SMCA method, the result reveals a circular pattern, signaling as hot spots the Bucharest historic centre (located on a river terrace and with aged building stock) and peripheral areas (isolated from the emergency centers and defined by precarious social and economic conditions). In effect, the example of Bucharest demonstrates how the results shape the ‘vulnerability to seismic hazard profile of the city, based on which decision makers could develop proper mitigation strategies. To sum up, the use of an analytical framework as the standard Spatial Multi-Criteria Analysis (SMCA) - despite all difficulties in creating justifiable weights (Yeh et al., 1999) - results in accurate estimations of the state of the urban system. Although this method was often mistrusted by decision makers (Janssen, 2001), we consider that the results can represent, based on precisely the level of generalization, a decision support framework for policy makers to critically reflect on possible risk mitigation plans. Further study will lead to the improvement of the analysis by integrating a series of daytime and nighttime scenarios and a better definition of the constructed space variables.

  4. A Study of Aircraft Fire Hazards Related to Natural Electrical Phenomena

    NASA Technical Reports Server (NTRS)

    Kester, Frank L.; Gerstein, Melvin; Plumer, J. A.

    1960-01-01

    The problems of natural electrical phenomena as a fire hazard to aircraft are evaluated. Assessment of the hazard is made over the range of low level electrical discharges, such as static sparks, to high level discharges, such as lightning strikes to aircraft. In addition, some fundamental work is presented on the problem of flame propagation in aircraft fuel vent systems. This study consists of a laboratory investigation in five parts: (1) a study of the ignition energies and flame propagation rates of kerosene-air and JP-6-air foams, (2) a study of the rate of flame propagation of n-heptane, n-octane, n-nonane, and n-decane in aircraft vent ducts, (3) a study of the damage to aluminum, titanium, and stainless steel aircraft skin materials by lightning strikes, (4) a study of fuel ignition by lightning strikes to aircraft skins, and (5) a study of lightning induced flame propagation in an aircraft vent system.

  5. Natural phenomena hazards site characterization criteria

    SciTech Connect

    Not Available

    1994-03-01

    The criteria and recommendations in this standard shall apply to site characterization for the purpose of mitigating Natural Phenomena Hazards (wind, floods, landslide, earthquake, volcano, etc.) in all DOE facilities covered by DOE Order 5480.28. Criteria for site characterization not related to NPH are not included unless necessary for clarification. General and detailed site characterization requirements are provided in areas of meteorology, hydrology, geology, seismology, and geotechnical studies.

  6. Natural hazard understanding in the middle schools of the Colorado Front Range

    SciTech Connect

    Grogger, P.K.

    1995-12-01

    The best form of mitigation is not to put one`s self in a position that mitigation is required. For the last five years the University of Colorado`s Department of Geology has teamed with local school districts to implement an understanding of natural hazards. By working with middle school students the dangers and possible mitigation of North America are learned at an early age. Over the years, the knowledge gained by these communities citizens will hopefully help lessen the dangers from natural hazards society faces. Education of the general public about natural hazards needs to be addressed by the professional societies studying and developing answers to natural hazards problems. By working with school children this process of educating the general public starts early in the education system and will bear fruit many years in the future. This paper describes the course that is being given to students in Colorado.

  7. Use of geotextiles for mitigation of the effects of man-made hazards such as greening of waste deposits in frame of the conversion of industrial areas

    NASA Astrophysics Data System (ADS)

    Bostenaru, Magdalena; Siminea, Ioana; Bostenaru, Maria

    2010-05-01

    The city of Karlsruhe lays on the Rhine valley; however, it is situated at a certain distance from the Rhine river and the coastal front is not integrated in the urban development. However, the port to the Rhine developed to the second largest internal port in Germany. With the process of deindustrialisation, industrial use is now shrinking. With the simultaneous process of the ecological re-win of rivers, the conversion of the industrial area to green and residential areals is imposed. In the 1990s a project was made by the third author of the contribution with Andrea Ciobanu as students of the University of Karlsruhe for the conversion of the Rhine port area of Karlsruhe into such a nature-residential use. The area included also a waste deposit, proposed to be transformed into a "green hill". Such an integration of a waste deposit into a park in the process of the conversion of an industrial area is not singular in Germany; several such projects were proposed and some of them realised at the IBA Emscher Park in the Ruhr area. Some of them were coupled with artistic projects. The technical details are also subject of the contribution. Studies were made by the first two authors on the conditions in which plants grow on former waste deposits if supported by intermediar layers of a geotextile. The characteristics of the geotextiles, together with the technologic process of obtaining, and the results of laboratory and field experiments for use on waste deposits in comparable conditions in Romania will be shown. The geotextile is also usable for ash deposits such as those in the Ruhr area.

  8. Natural hazard risk perception of Italian population: case studies along national territory.

    NASA Astrophysics Data System (ADS)

    Gravina, Teresita; Tupputi Schinosa, Francesca De Luca; Zuddas, Isabella; Preto, Mattia; Marengo, Angelo; Esposito, Alessandro; Figliozzi, Emanuele; Rapinatore, Matteo

    2015-04-01

    Risk perception is judgment that people make about the characteristics and severity of risks, in last few years risk perception studies focused on provide cognitive elements to communication experts responsible in order to design citizenship information and awareness appropriate strategies. Several authors in order to determine natural hazards risk (Seismic, landslides, cyclones, flood, Volcanic) perception used questionnaires as tool for providing reliable quantitative data and permitting comparison the results with those of similar surveys. In Italy, risk perception studies based on surveys, were also carried out in order to investigate on national importance Natural risk, in particular on Somma-Vesuvio and Phlegrean Fields volcanic Risks, but lacked risk perception studies on local situation distributed on whole national territory. National importance natural hazard were frequently reported by national mass media and there were debate about emergencies civil protection plans, otherwise could be difficult to obtain information on bonded and regional nature natural hazard which were diffuses along National territory. In fact, Italian peninsula was a younger geological area subjected to endogenous phenomena (volcanoes, earthquake) and exogenous phenomena which determine land evolution and natural hazard (landslide, coastal erosion, hydrogeological instability, sinkhole) for population. For this reason we decided to investigate on natural risks perception in different Italian place were natural hazard were taken place but not reported from mass media, as were only local relevant or historical event. We carried out surveys in different Italian place interested by different types of natural Hazard (landslide, coastal erosion, hydrogeological instability, sinkhole, volcanic phenomena and earthquake) and compared results, in order to understand population perception level, awareness and civil protection exercises preparation. Our findings support that risks communication have to be based on citizen knowledge and conscious in natural hazards. In fact, informed citizen could participate actively in decision in urban development planning and accept positively legislation and regulation introduced to avoid natural risks. The study has gone some way towards enhancing understanding in citizens conscious in natural risks and allow us to say that communication on natural risks could not be based only in transferring emergency behavior to citizens but also allow people to improve their knowledge in landscape evolution in order to assume aware environmental behavior.

  9. Identifying hazard parameter to develop quantitative and dynamic hazard map of an active volcano in Indonesia

    NASA Astrophysics Data System (ADS)

    Suminar, Wulan; Saepuloh, Asep; Meilano, Irwan

    2016-05-01

    Analysis of hazard assessment to active volcanoes is crucial for risk management. The hazard map of volcano provides information to decision makers and communities before, during, and after volcanic crisis. The rapid and accurate hazard assessment, especially to an active volcano is necessary to be developed for better mitigation on the time of volcanic crises in Indonesia. In this paper, we identified the hazard parameters to develop quantitative and dynamic hazard map of an active volcano. The Guntur volcano in Garut Region, West Java, Indonesia was selected as study area due population are resided adjacent to active volcanoes. The development of infrastructures, especially related to tourism at the eastern flank from the Summit, are growing rapidly. The remote sensing and field investigation approaches were used to obtain hazard parameters spatially. We developed a quantitative and dynamic algorithm to map spatially hazard potential of volcano based on index overlay technique. There were identified five volcano hazard parameters based on Landsat 8 and ASTER imageries: volcanic products including pyroclastic fallout, pyroclastic flows, lava and lahar, slope topography, surface brightness temperature, and vegetation density. Following this proposed technique, the hazard parameters were extracted, indexed, and calculated to produce spatial hazard values at and around Guntur Volcano. Based on this method, the hazard potential of low vegetation density is higher than high vegetation density. Furthermore, the slope topography, surface brightness temperature, and fragmental volcanic product such as pyroclastics influenced to the spatial hazard value significantly. Further study to this proposed approach will be aimed for effective and efficient analyses of volcano risk assessment.

  10. Comparative risk judgements for oral health hazards among Norwegian adults: a cross sectional study

    PubMed Central

    Åstrøm, Anne Nordrehaug

    2002-01-01

    Background This study identified optimistic biases in health and oral health hazards, and explored whether comparative risk judgements for oral health hazards vary systematically with socio-economic characteristics and self-reported risk experience. Methods A simple random sample of 1,190 residents born in 1972 was drawn from the population resident in three counties of Norway. A total of 735 adults (51% women) completed postal questionnaires at home. Results Mean ratings of comparative risk judgements differed significantly (p < 0.001) from the mid point of the scales. T-values ranged from -13.1 and -12.1 for the perceived risk of being divorced and loosing all teeth to -8.2 and -7.8 (p < 0.001) for having gum disease and toothdecay. Multivariate analyses using General Linear Models, GLM, revealed gender differences in comparative risk judgements for gum disease, whereas social position varied systematically with risk judgements for tooth decay, gum disease and air pollution. The odds ratios for being comparatively optimistic with respect to having gum disease were 2.9, 1.9, 1.8 and 1.5 if being satisfied with dentition, having a favourable view of health situation, and having high and low involvement with health enhancing and health detrimental behaviour, respectively. Conclusion Optimism in comparative judgements for health and oral health hazards was evident in young Norwegian adults. When judging their comparative susceptibility for oral health hazards, they consider personal health situation and risk behaviour experience. PMID:12186656

  11. Robot-assisted home hazard assessment for fall prevention: a feasibility study.

    PubMed

    Sadasivam, Rajani S; Luger, Tana M; Coley, Heather L; Taylor, Benjamin B; Padir, Taskin; Ritchie, Christine S; Houston, Thomas K

    2014-01-01

    We examined the feasibility of using a remotely manoeuverable robot to make home hazard assessments for fall prevention. We employed use-case simulations to compare robot assessments with in-person assessments. We screened the homes of nine elderly patients (aged 65 years or more) for fall risks using the HEROS screening assessment. We also assessed the participants' perspectives of the remotely-operated robot in a survey. The nine patients had a median Short Blessed Test score of 8 (interquartile range, IQR 2-20) and a median Life-Space Assessment score of 46 (IQR 27-75). Compared to the in-person assessment (mean = 4.2 hazards identified per participant), significantly more home hazards were perceived in the robot video assessment (mean = 7.0). Only two checklist items (adequate bedroom lighting and a clear path from bed to bathroom) had more than 60% agreement between in-person and robot video assessment. Participants were enthusiastic about the robot and did not think it violated their privacy. The study found little agreement between the in-person and robot video hazard assessments. However, it identified several research questions about how to best use remotely-operated robots. PMID:24352900

  12. Natural Hazards in Ólafsfjörður, Iceland. A conceptual study.

    NASA Astrophysics Data System (ADS)

    Moran, A.; Wastl, M.; Stötter, J.; Ploner, A.; Sönser, T.

    2003-04-01

    This study focuses on the conceptual approach to natural hazard investigations in regions lacking hazard zoning or with only the existence of rudimentary hazard assessments. Fulfilling these specifications, the community of Ólafsfjörður, with a population of ca. 1100 inhabitants, is selected as a sample area. It is located in a high mountain environment in northern Iceland (66.07° N, 18.65°W) and is subject to various natural hazard processes such as snow avalanches, debris and mud flows, rock fall and flooding. The objective of this procedure is to identify the risk potential of the area on a regional scale of 1:25,000. The input data consists of only aerial photography and topographical maps, enabling by the usage of various software programs, including a geographical information system (GIS), the generation of a digital terrain model and creation of orthophotos for mapping. The following step in approaching this task is hazard identification. This consists of the interpretation of aerial photography, field mapping and the evaluation of historic events by means of interviews and using written chronicles. In a second step GIS-based avalanche and rock fall simulation models are run in order to quantify worst-case scenarios. Parallel to the assessment of geomorphological processes, the areas of human settlement and building functions in the entire community are acquired. In a final step the information representing the spatial extents of natural processes and existing human settlements are overlapped allowing an overview of potential conflict zones. The results reveal that a considerable number of buildings and road segments are found to lie within the reach of potential avalanche run-out. This calls for urgent measures to be taken. This study obtains quick results in identifying areas of conflict while forming the foundation for more detailed planning where necessary.

  13. Study of EEPN mitigation using modified RF pilot and Viterbi-Viterbi based phase noise compensation.

    PubMed

    Jacobsen, Gunnar; Xu, Tianhua; Popov, Sergei; Sergeyev, Sergey

    2013-05-20

    We propose--as a modification of the optical (RF) pilot scheme--a balanced phase modulation between two polarizations of the optical signal in order to generate correlated equalization enhanced phase noise (EEPN) contributions in the two polarizations. The method is applicable for n-level PSK system. The EEPN can be compensated, the carrier phase extracted and the nPSK signal regenerated by complex conjugation and multiplication in the receiver. The method is tested by system simulations in a single channel QPSK system at 56 Gb/s system rate. It is found that the conjugation and multiplication scheme in the Rx can mitigate the EEPN to within ½ orders of magnitude. Results are compared to using the Viterbi-Viterbi algorithm to mitigate the EEPN. The latter method improves the sensitivity more than two orders of magnitude. Important novel insight into the statistical properties of EEPN is identified and discussed in the paper. PMID:23736453

  14. Success in transmitting hazard science

    NASA Astrophysics Data System (ADS)

    Price, J. G.; Garside, T.

    2010-12-01

    Money motivates mitigation. An example of success in communicating scientific information about hazards, coupled with information about available money, is the follow-up action by local governments to actually mitigate. The Nevada Hazard Mitigation Planning Committee helps local governments prepare competitive proposals for federal funds to reduce risks from natural hazards. Composed of volunteers with expertise in emergency management, building standards, and earthquake, flood, and wildfire hazards, the committee advises the Nevada Division of Emergency Management on (1) the content of the State’s hazard mitigation plan and (2) projects that have been proposed by local governments and state agencies for funding from various post- and pre-disaster hazard mitigation programs of the Federal Emergency Management Agency. Local governments must have FEMA-approved hazard mitigation plans in place before they can receive this funding. The committee has been meeting quarterly with elected and appointed county officials, at their offices, to encourage them to update their mitigation plans and apply for this funding. We have settled on a format that includes the county’s giving the committee an overview of its infrastructure, hazards, and preparedness. The committee explains the process for applying for mitigation grants and presents the latest information that we have about earthquake hazards, including locations of nearby active faults, historical seismicity, geodetic strain, loss-estimation modeling, scenarios, and documents about what to do before, during, and after an earthquake. Much of the county-specific information is available on the web. The presentations have been well received, in part because the committee makes the effort to go to their communities, and in part because the committee is helping them attract federal funds for local mitigation of not only earthquake hazards but also floods (including canal breaches) and wildfires, the other major concerns in Nevada. Local citizens appreciate the efforts of the state officials to present the information in a public forum. The Committee’s earthquake presentations to the counties are supplemented by regular updates in the two most populous counties during quarterly meetings of the Nevada Earthquake Safety Council, generally alternating between Las Vegas and Reno. We have only 17 counties in Nevada, so we are making good progress at reaching each within a few years. The Committee is also learning from the county officials about their frustrations in dealing with the state and federal bureaucracies. Success is documented by the mitigation projects that FEMA has funded.

  15. Thermal study of payload module for the next-generation infrared space telescope SPICA in risk mitigation phase

    NASA Astrophysics Data System (ADS)

    Shinozaki, Keisuke; Sato, Yoichi; Sawada, Kenichiro; Ando, Makiko; Sugita, Hiroyuki; Yamawaki, Toshihiro; Mizutani, Tadahiro; Komatsu, Keiji; Nakagawa, Takao; Murakami, Hiroshi; Matsuhara, Hideo; Takada, Makoto; Takai, Shigeki; Okabayashi, Akinobu; Tsunematsu, Shoji; Kanao, Kenichi; Narasaki, Katsuhiro

    2014-11-01

    SPace Infrared telescope for Cosmology and Astrophysics (SPICA) is a pre-project of JAXA in collaboration with ESA to be launched around 2020. The SPICA is transferred into a halo orbit around the second Lagrangian point (L2) in the Sun-Earth system, which enables us to use effective radiant cooling in combination with mechanical cooling system in order to cool a 3 m large IR telescope below 6 K. At a present, a conceptional study of SPICA is underway to assess and mitigate mission's risks; the thermal study for the risk mitigation sets a goal of a 25% margin on cooling power of 4 K/1 K temperature regions, a 25% margin on the heat load from Focal Plane Instruments (FPIs) at intermediated temperature region, to enhance the reliability of the mechanical cooler system, and to enhance feasibility of ground tests. Thermal property measurements of FRP materials are also important. This paper introduces details of the thermal design study for risk mitigation, including development of the truss separation mechanism, the cryogenic radiator, mechanical cooler system, and thermal property measurements of materials.

  16. Hazardous Materials Management and Emergency Response (HAMMER) Training Center feasibility study

    SciTech Connect

    Curry, R.

    1989-11-01

    This report presents the results of a feasibility study for location of a Hazardous Materials Management and Emergency Response Training Center at Hanford. Westinghouse Hanford Company conducted the study at the request of the US Department of Energy-Richland Operations Office. The study resulted from an initiative by Tri-Cities, Washington area municipal and county entities proposing such a training center. The initiative responded to changes in federal law requiring extensive and specific personnel training for response to incidents involving hazardous materials. Washington Congressman Sid Morrison requested the Department of Energy-Richland Operations Office to evaluate the initiative as a potential business opportunity for the Department of Energy's Hanford Site. 8 refs., 6 figs., 1 tab.

  17. Climate engineering of vegetated land for hot extremes mitigation: an ESM sensitivity study

    NASA Astrophysics Data System (ADS)

    Wilhelm, Micah; Davin, Edouard; Seneviratne, Sonia

    2014-05-01

    Mitigation efforts to reduce anthropogenic climate forcing have thus far proven inadequate, as evident from accelerating greenhouse gas emissions. Many subtropical and mid-latitude regions are expected to experience longer and more frequent heat waves and droughts within the next century. This increased occurrence of weather extremes has important implications for human health, mortality and for socio-economic factors including forest fires, water availability and agricultural production. Various solar radiation management (SRM) schemes that attempt to homogeneously counter the anthropogenic forcing have been examined with different Earth System Models (ESM). Land climate engineering schemes have also been investigated which reduces the amount of solar radiation that is absorbed at the surface. However, few studies have investigated their effects on extremes but rather on mean climate response. Here we present the results of a series of climate engineering sensitivity experiments performed with the Community Earth System Model (CESM) version 1.0.2 at 2°-resolution. This configuration entails 5 fully coupled model components responsible for simulating the Earth's atmosphere, land, land-ice, ocean and sea-ice that interact through a central coupler. Historical and RCP8.5 scenarios were performed with transient land-cover changes and prognostic terrestrial Carbon/Nitrogen cycles. Four sets of experiments are performed in which surface albedo over snow-free vegetated grid points is increased by 0.5, 0.10, 0.15 and 0.20. The simulations show a strong preferential cooling of hot extremes throughout the Northern mid-latitudes during boreal summer. A strong linear scaling between the cooling of extremes and additional surface albedo applied to the land model is observed. The strongest preferential cooling is found in southeastern Europe and the central United States, where increases of soil moisture and evaporative fraction are the largest relative to the control simulation. This preferential cooling is found to intensify in the future scenario. Cloud cover strongly limits the efficacy of a given surface albedo increase to reflect incoming solar radiation back into space. As anthropogenic forcing increases, cloud cover decreases over much of the northern mid-latitudes in CESM.

  18. Proportional hazards regression in epidemiologic follow-up studies: an intuitive consideration of primary time scale.

    PubMed

    Cologne, John; Hsu, Wan-Ling; Abbott, Robert D; Ohishi, Waka; Grant, Eric J; Fujiwara, Saeko; Cullings, Harry M

    2012-07-01

    In epidemiologic cohort studies of chronic diseases, such as heart disease or cancer, confounding by age can bias the estimated effects of risk factors under study. With Cox proportional-hazards regression modeling in such studies, it would generally be recommended that chronological age be handled nonparametrically as the primary time scale. However, studies involving baseline measurements of biomarkers or other factors frequently use follow-up time since measurement as the primary time scale, with no explicit justification. The effects of age are adjusted for by modeling age at entry as a parametric covariate. Parametric adjustment raises the question of model adequacy, in that it assumes a known functional relationship between age and disease, whereas using age as the primary time scale does not. We illustrate this graphically and show intuitively why the parametric approach to age adjustment using follow-up time as the primary time scale provides a poor approximation to age-specific incidence. Adequate parametric adjustment for age could require extensive modeling, which is wasteful, given the simplicity of using age as the primary time scale. Furthermore, the underlying hazard with follow-up time based on arbitrary timing of study initiation may have no inherent meaning in terms of risk. Given the potential for biased risk estimates, age should be considered as the preferred time scale for proportional-hazards regression with epidemiologic follow-up data when confounding by age is a concern. PMID:22517300

  19. Seaside, Oregon, Tsunami Pilot Study-Modernization of FEMA Flood Hazard Maps: GIS Data

    USGS Publications Warehouse

    Wong, Florence L.; Venturato, Angie J.; Geist, Eric L.

    2006-01-01

    Introduction: The Federal Emergency Management Agency (FEMA) Federal Insurance Rate Map (FIRM) guidelines do not currently exist for conducting and incorporating tsunami hazard assessments that reflect the substantial advances in tsunami research achieved in the last two decades; this conclusion is the result of two FEMA-sponsored workshops and the associated Tsunami Focused Study (Chowdhury and others, 2005). Therefore, as part of FEMA's Map Modernization Program, a Tsunami Pilot Study was carried out in the Seaside/Gearhart, Oregon, area to develop an improved Probabilistic Tsunami Hazard Analysis (PTHA) methodology and to provide recommendations for improved tsunami hazard assessment guidelines (Tsunami Pilot Study Working Group, 2006). The Seaside area was chosen because it is typical of many coastal communities in the section of the Pacific Coast from Cape Mendocino to the Strait of Juan de Fuca, and because State agencies and local stakeholders expressed considerable interest in mapping the tsunami threat to this area. The study was an interagency effort by FEMA, U.S. Geological Survey, and the National Oceanic and Atmospheric Administration (NOAA), in collaboration with the University of Southern California, Middle East Technical University, Portland State University, Horning Geoscience, Northwest Hydraulics Consultants, and the Oregon Department of Geological and Mineral Industries. We present the spatial (geographic information system, GIS) data from the pilot study in standard GIS formats and provide files for visualization in Google Earth, a global map viewer.

  20. 44 CFR 201.7 - Tribal Mitigation Plans.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... OF HOMELAND SECURITY DISASTER ASSISTANCE MITIGATION PLANNING § 201.7 Tribal Mitigation Plans. The... tribal government's pre- and post-disaster hazard management policies, programs, and capabilities to... reduce risks from natural hazards, serving as a guide for decision makers as they commit resources...

  1. Seismic Hazard and Risk Assessment for Georgia

    NASA Astrophysics Data System (ADS)

    Tsereteli, Nino; Varazanashvili, Otar; Arabidze, Vakhtang; Gugeshashvili, Tengiz; Mukhadze, Teimuraz; Gvencadze, Aleksandre

    2014-05-01

    Risks of natural hazards caused by natural disaster are closely related to the development process of society. The high level of natural disasters in many countries makes necessary to work out the national programs and strategy. The main goal of these programs is to reduce the natural disasters risk and caused losses. Risk mitigation is the cornerstone of the approach to reduce the nation's vulnerability to disasters from natural hazards. So proper investigation and assessment of natural hazards and vulnerability of the element at risk to hazards is very important for an effective and proper assessment of risk. This work issues a call for advance planning and action to reduce natural disaster risks, notably seismic risk through the investigation of vulnerability and seismic hazard for Georgia. Firstly, detail inventory map of element at risk was created. Here elements at risk are comprised of buildings and population. Secondly, seismic hazard maps were calculated based on modern approach of selecting and ranking global and regional ground motion prediction equation for region. Thirdly, on the bases of empirical data that was collected for some earthquake intensity based vulnerability study were completed for Georgian buildings. Finally, probabilistic seismic risk assessment in terms of structural damage and casualties were calculated for the territory of Georgia using obtained results. This methodology gave prediction of damage and casualty for a given probability of recurrence, based on a probabilistic seismic hazard model, population distribution, inventory, and vulnerability of buildings.

  2. Assessment and Indirect Adjustment for Confounding by Smoking in Cohort Studies Using Relative Hazards Models

    PubMed Central

    Richardson, David B.; Laurier, Dominique; Schubauer-Berigan, Mary K.; Tchetgen, Eric Tchetgen; Cole, Stephen R.

    2014-01-01

    Workers' smoking histories are not measured in many occupational cohort studies. Here we discuss the use of negative control outcomes to detect and adjust for confounding in analyses that lack information on smoking. We clarify the assumptions necessary to detect confounding by smoking and the additional assumptions necessary to indirectly adjust for such bias. We illustrate these methods using data from 2 studies of radiation and lung cancer: the Colorado Plateau cohort study (1950–2005) of underground uranium miners (in which smoking was measured) and a French cohort study (1950–2004) of nuclear industry workers (in which smoking was unmeasured). A cause-specific relative hazards model is proposed for estimation of indirectly adjusted associations. Among the miners, the proposed method suggests no confounding by smoking of the association between radon and lung cancer—a conclusion supported by adjustment for measured smoking. Among the nuclear workers, the proposed method suggests substantial confounding by smoking of the association between radiation and lung cancer. Indirect adjustment for confounding by smoking resulted in an 18% decrease in the adjusted estimated hazard ratio, yet this cannot be verified because smoking was unmeasured. Assumptions underlying this method are described, and a cause-specific proportional hazards model that allows easy implementation using standard software is presented. PMID:25245043

  3. Assessment and indirect adjustment for confounding by smoking in cohort studies using relative hazards models.

    PubMed

    Richardson, David B; Laurier, Dominique; Schubauer-Berigan, Mary K; Tchetgen Tchetgen, Eric; Cole, Stephen R

    2014-11-01

    Workers' smoking histories are not measured in many occupational cohort studies. Here we discuss the use of negative control outcomes to detect and adjust for confounding in analyses that lack information on smoking. We clarify the assumptions necessary to detect confounding by smoking and the additional assumptions necessary to indirectly adjust for such bias. We illustrate these methods using data from 2 studies of radiation and lung cancer: the Colorado Plateau cohort study (1950-2005) of underground uranium miners (in which smoking was measured) and a French cohort study (1950-2004) of nuclear industry workers (in which smoking was unmeasured). A cause-specific relative hazards model is proposed for estimation of indirectly adjusted associations. Among the miners, the proposed method suggests no confounding by smoking of the association between radon and lung cancer--a conclusion supported by adjustment for measured smoking. Among the nuclear workers, the proposed method suggests substantial confounding by smoking of the association between radiation and lung cancer. Indirect adjustment for confounding by smoking resulted in an 18% decrease in the adjusted estimated hazard ratio, yet this cannot be verified because smoking was unmeasured. Assumptions underlying this method are described, and a cause-specific proportional hazards model that allows easy implementation using standard software is presented. PMID:25245043

  4. Seaside, Oregon Tsunami Pilot Study - modernization of FEMA flood hazard maps

    USGS Publications Warehouse

    Tsunami Pilot Study Working Group

    2006-01-01

    FEMA Flood Insurance Rate Map (FIRM) guidelines do not currently exist for conducting and incorporating tsunami hazard assessments that reflect the substantial advances in tsunami research achieved in the last two decades; this conclusion is the result of two FEMA-sponsored workshops and the associated Tsunami Focused Study. Therefore, as part of FEMA's Map Modernization Program, a Tsunami Pilot Study was carried out in the Seaside/Gearhart, Oregon, area to develop an improved Probabilistic Tsunami Hazard Assessment (PTHA) methodology and to provide recommendations for improved tsunami hazard assessment guidelines. The Seaside area was chosen because it is typical of many coastal communities in the section of the Pacific Coast from Cape Mendocino to the Strait of Juan de Fuca, and because State Agencies and local stakeholders expressed considerable interest in mapping the tsunami threat to this area. The study was an interagency effort by FEMA, U.S. Geological Survey and the National Oceanic and Atmospheric Administration, in collaboration with the University of Southern California, Middle East Technical University. Portland State University, Horning Geosciences, Northwest Hydraulics Consultants, and the Oregon Department of Geological and Mineral Industries. Draft copies and a briefing on the contents, results and recommendations of this document were provided to FEMA officials before final publication.

  5. Social and ethical perspectives of landslide risk mitigation measures

    NASA Astrophysics Data System (ADS)

    Kalsnes, Bjørn; Vangelsten, Bjørn V.

    2015-04-01

    Landslide risk may be mitigated by use of a wide range of measures. Mitigation and prevention options may include (1) structural measures to reduce the frequency, severity or exposure to the hazard, (2) non-structural measures, such as land-use planning and early warning systems, to reduce the hazard frequency and consequences, and (3) measures to pool and transfer the risks. In a given situation the appropriate system of mitigation measures may be a combination of various types of measures, both structural and non-structural. In the process of choosing mitigation measures for a given landslide risk situation, the role of the geoscientist is normally to propose possible mitigation measures on basis of the risk level and technical feasibility. Social and ethical perspectives are often neglected in this process. However, awareness of the need to consider social as well as ethical issues in the design and management of mitigating landslide risk is rising. There is a growing understanding that technical experts acting alone cannot determine what will be considered the appropriate set of mitigation and prevention measures. Issues such as environment versus development, questions of acceptable risk, who bears the risks and benefits, and who makes the decisions, also need to be addressed. Policymakers and stakeholders engaged in solving environmental risk problems are increasingly recognising that traditional expert-based decision-making processes are insufficient. This paper analyse the process of choosing appropriate mitigation measures to mitigate landslide risk from a social and ethical perspective, considering technical, cultural, economical, environmental and political elements. The paper focus on stakeholder involvement in the decision making process, and shows how making strategies for risk communication is a key for a successful process. The study is supported by case study examples from Norway and Italy. In the Italian case study, three different risk mitigation options was presented to the local community. The options were based on a thorough stakeholder involvement process ending up in three different views on how to deal with the landslide risk situation: i) protect lives and properties (hierarchical) ; ii) careful stewardship of the mountains (egalitarian); and iii) rational individual choice (individualist).

  6. Reviewing and visualising relationships between anthropic processes and natural hazards within a multi-hazard framework

    NASA Astrophysics Data System (ADS)

    Gill, Joel C.; Malamud, Bruce D.

    2014-05-01

    Here we present a broad overview of the interaction relationships between 17 anthropic processes and 21 different natural hazard types. Anthropic processes are grouped into seven categories (subsurface extraction, subsurface addition, land use change, explosions, hydrological change, surface construction processes, miscellaneous). Natural hazards are grouped into six categories (geophysical, hydrological, shallow earth processes, atmospheric, biophysical and space). A wide-ranging review based on grey- and peer-reviewed literature from many scientific disciplines identified 54 relationships where anthropic processes have been noted to trigger natural hazards. We record case studies for all but three of these relationships. Based on the results of this review, we find that the anthropic processes of deforestation, explosions (conventional and nuclear) and reservoir construction could trigger the widest range of different natural hazard types. We also note that within the natural hazards, landslides and earthquakes are those that could be triggered by the widest range of anthropic processes. This work also examines the possibility of anthropic processes (i) resulting in an increased occurrence of a particular hazard interaction (e.g., deforestation could result in an increased interaction between storms and landslides); and (ii) inadvertently reducing the likelihood of a natural hazard or natural hazard interaction (e.g., poor drainage or deforestation reducing the likelihood of wildfires triggered by lightning). This study synthesises, using accessible visualisation techniques, the large amounts of anthropic process and natural hazard information from our review. In it we have outlined the importance of considering anthropic processes within any analysis of hazard interactions, and we reinforce the importance of a holistic approach to natural hazard assessment, mitigation and management.

  7. A web-based tool for ranking landslide mitigation measures

    NASA Astrophysics Data System (ADS)

    Lacasse, S.; Vaciago, G.; Choi, Y. J.; Kalsnes, B.

    2012-04-01

    As part of the research done in the European project SafeLand "Living with landslide risk in Europe: Assessment, effects of global change, and risk management strategies", a compendium of structural and non-structural mitigation measures for different landslide types in Europe was prepared, and the measures were assembled into a web-based "toolbox". Emphasis was placed on providing a rational and flexible framework applicable to existing and future mitigation measures. The purpose of web-based toolbox is to assist decision-making and to guide the user in the choice of the most appropriate mitigation measures. The mitigation measures were classified into three categories, describing whether the mitigation measures addressed the landslide hazard, the vulnerability or the elements at risk themselves. The measures considered include structural measures reducing hazard and non-structural mitigation measures, reducing either the hazard or the consequences (or vulnerability and exposure of elements at risk). The structural measures include surface protection and control of surface erosion; measures modifying the slope geometry and/or mass distribution; measures modifying surface water regime - surface drainage; measures mo¬difying groundwater regime - deep drainage; measured modifying the mechanical charac¬teristics of unstable mass; transfer of loads to more competent strata; retaining structures (to modify slope geometry and/or to transfer stress to compe¬tent layer); deviating the path of landslide debris; dissipating the energy of debris flows; and arresting and containing landslide debris or rock fall. The non-structural mitigation measures, reducing either the hazard or the consequences: early warning systems; restricting or discouraging construction activities; increasing resistance or coping capacity of elements at risk; relocation of elements at risk; sharing of risk through insurance. The measures are described in the toolbox with fact sheets providing a brief description, guidance on design, schematic details, practical examples and references for each mitigation measure. Each of the measures was given a score on its ability and applicability for different types of landslides and boundary conditions, and a decision support matrix was established. The web-based toolbox organizes the information in the compendium and provides an algorithm to rank the measures on the basis of the decision support matrix, and on the basis of the risk level estimated at the site. The toolbox includes a description of the case under study and offers a simplified option for estimating the hazard and risk levels of the slide at hand. The user selects the mitigation measures to be included in the assessment. The toolbox then ranks, with built-in assessment factors and weights and/or with user-defined ranking values and criteria, the mitigation measures included in the analysis. The toolbox includes data management, e.g. saving data half-way in an analysis, returning to an earlier case, looking up prepared examples or looking up information on mitigation measures. The toolbox also generates a report and has user-forum and help features. The presentation will give an overview of the mitigation measures considered and examples of the use of the toolbox, and will take the attendees through the application of the toolbox.

  8. Incorporating induced seismicity in the 2014 United States National Seismic Hazard Model: results of the 2014 workshop and sensitivity studies

    USGS Publications Warehouse

    Petersen, Mark D.; Mueller, Charles S.; Moschetti, Morgan P.; Hoover, Susan M.; Rubinstein, Justin L.; Llenos, Andrea L.; Michael, Andrew J.; Ellsworth, William L.; McGarr, Arthur F.; Holland, Austin A.; Anderson, John G.

    2015-01-01

    The U.S. Geological Survey National Seismic Hazard Model for the conterminous United States was updated in 2014 to account for new methods, input models, and data necessary for assessing the seismic ground shaking hazard from natural (tectonic) earthquakes. The U.S. Geological Survey National Seismic Hazard Model project uses probabilistic seismic hazard analysis to quantify the rate of exceedance for earthquake ground shaking (ground motion). For the 2014 National Seismic Hazard Model assessment, the seismic hazard from potentially induced earthquakes was intentionally not considered because we had not determined how to properly treat these earthquakes for the seismic hazard analysis. The phrases “potentially induced” and “induced” are used interchangeably in this report, however it is acknowledged that this classification is based on circumstantial evidence and scientific judgment. For the 2014 National Seismic Hazard Model update, the potentially induced earthquakes were removed from the NSHM’s earthquake catalog, and the documentation states that we would consider alternative models for including induced seismicity in a future version of the National Seismic Hazard Model. As part of the process of incorporating induced seismicity into the seismic hazard model, we evaluate the sensitivity of the seismic hazard from induced seismicity to five parts of the hazard model: (1) the earthquake catalog, (2) earthquake rates, (3) earthquake locations, (4) earthquake Mmax (maximum magnitude), and (5) earthquake ground motions. We describe alternative input models for each of the five parts that represent differences in scientific opinions on induced seismicity characteristics. In this report, however, we do not weight these input models to come up with a preferred final model. Instead, we present a sensitivity study showing uniform seismic hazard maps obtained by applying the alternative input models for induced seismicity. The final model will be released after further consideration of the reliability and scientific acceptability of each alternative input model. Forecasting the seismic hazard from induced earthquakes is fundamentally different from forecasting the seismic hazard for natural, tectonic earthquakes. This is because the spatio-temporal patterns of induced earthquakes are reliant on economic forces and public policy decisions regarding extraction and injection of fluids. As such, the rates of induced earthquakes are inherently variable and nonstationary. Therefore, we only make maps based on an annual rate of exceedance rather than the 50-year rates calculated for previous U.S. Geological Survey hazard maps.

  9. FDA-iRISK--a comparative risk assessment system for evaluating and ranking food-hazard pairs: case studies on microbial hazards.

    PubMed

    Chen, Yuhuan; Dennis, Sherri B; Hartnett, Emma; Paoli, Greg; Pouillot, Rgis; Ruthman, Todd; Wilson, Margaret

    2013-03-01

    Stakeholders in the system of food safety, in particular federal agencies, need evidence-based, transparent, and rigorous approaches to estimate and compare the risk of foodborne illness from microbial and chemical hazards and the public health impact of interventions. FDA-iRISK (referred to here as iRISK), a Web-based quantitative risk assessment system, was developed to meet this need. The modeling tool enables users to assess, compare, and rank the risks posed by multiple food-hazard pairs at all stages of the food supply system, from primary production, through manufacturing and processing, to retail distribution and, ultimately, to the consumer. Using standard data entry templates, built-in mathematical functions, and Monte Carlo simulation techniques, iRISK integrates data and assumptions from seven components: the food, the hazard, the population of consumers, process models describing the introduction and fate of the hazard up to the point of consumption, consumption patterns, dose-response curves, and health effects. Beyond risk ranking, iRISK enables users to estimate and compare the impact of interventions and control measures on public health risk. iRISK provides estimates of the impact of proposed interventions in various ways, including changes in the mean risk of illness and burden of disease metrics, such as losses in disability-adjusted life years. Case studies for Listeria monocytogenes and Salmonella were developed to demonstrate the application of iRISK for the estimation of risks and the impact of interventions for microbial hazards. iRISK was made available to the public at http://irisk.foodrisk.org in October 2012. PMID:23462073

  10. Space Debris & its Mitigation

    NASA Astrophysics Data System (ADS)

    Kaushal, Sourabh; Arora, Nishant

    2012-07-01

    Space debris has become a growing concern in recent years, since collisions at orbital velocities can be highly damaging to functioning satellites and can also produce even more space debris in the process. Some spacecraft, like the International Space Station, are now armored to deal with this hazard but armor and mitigation measures can be prohibitively costly when trying to protect satellites or human spaceflight vehicles like the shuttle. This paper describes the current orbital debris environment, outline its main sources, and identify mitigation measures to reduce orbital debris growth by controlling these sources. We studied the literature on the topic Space Debris. We have proposed some methods to solve this problem of space debris. We have also highlighted the shortcomings of already proposed methods by space experts and we have proposed some modification in those methods. Some of them can be very effective in the process of mitigation of space debris, but some of them need some modification. Recently proposed methods by space experts are maneuver, shielding of space elevator with the foil, vaporizing or redirecting of space debris back to earth with the help of laser, use of aerogel as a protective layer, construction of large junkyards around international space station, use of electrodynamics tether & the latest method proposed is the use of nano satellites in the clearing of the space debris. Limitations of the already proposed methods are as follows: - Maneuvering can't be the final solution to our problem as it is the act of self-defence. - Shielding can't be done on the parts like solar panels and optical devices. - Vaporizing or redirecting of space debris can affect the human life on earth if it is not done in proper manner. - Aerogel has a threshold limit up to which it can bear (resist) the impact of collision. - Large junkyards can be effective only for large sized debris. In this paper we propose: A. The Use of Nano Tubes by creating a mesh: In this technique we will use the nano tubes. We will create a mesh that will act as a touch panel of the touch screen cell phone. When any small or tiny particle will come on this mesh and touch it then the mesh will act as a touch panel and so that the corresponding processor or sensor will come to know the co-ordinates of it then further by using Destructive laser beam we can destroy that particle. B. Use of the Nano tubes and Nano Bots for the collection of the Space Debris: In this method also we will use a nano mesh which is made up of the nano tubes and the corresponding arrangement will be done so that that mesh will act as a touch panel same as that of the touch screen phones. So when tiny particles will dash on the nano mesh then the Nano Bots which will be at the specific co-ordinates collect the particles and store them into the garbage storage. C. Further the space Debris can be use for the other purposes too:- As we know that the space debris can be any tiny particle in the space. So instead of decomposing that particles or destroying it we can use those particles for the purpose of energy production by using the fuel cells, but for this the one condition is that the particle material should be capable of forming the ionize liquid or solution which can be successfully use in the fuel cell for energy production. But this is useful for only the big projects where in smallest amount of energy has also the great demand or value. D. RECYCLING OF SPACE DEBRIS The general idea of making space structures by recycling space debris is to capture the aluminum of the upper stages, melt it, and form it into new aluminum structures, perhaps by coating the inside of inflatable balloons, to make very large structures of thin aluminum shells. CONCLUSION Space debris has become the topic of great concern in recent years. Space debris creation can't be stopped completely but it can be minimized by adopting some measures. Many methods of space debris mitigation have been proposed earlier by many space experts, but some of them have limitations in them. After some modification those measures can proved beneficial in the process of space debris mitigation. Some new methods of space debris mitigation have been proposed by us in this paper which includes use of nanobot and nanotube mesh technique. Moreover we have to use it for energy purpose or the making of space structures. We end this paper by appealing that ``We have already polluted our own planet earth; we should now ensure that the space is kept least polluted for our own safe exploration of the outer space and also for the safety of aliens from other planets if they happen to exist.

  11. Intrinsic vulnerability, hazard and risk mapping for karst aquifers: A case study

    NASA Astrophysics Data System (ADS)

    Mimi, Ziad A.; Assi, Amjad

    2009-01-01

    SummaryGroundwater from karst aquifers is among the most important resources of drinking water supply of the worldwide population. The European COST action 620 proposed a comprehensive approach to karst groundwater protection, comprising methods of intrinsic and specific vulnerability mapping, hazard and risk mapping. This paper presents the first application of all components of this European approach to the groundwater underlying the Ramallah district, a karst hydrogeology system in Palestine. The vulnerability maps which were developed can assist in the implementation of groundwater management strategies to prevent degradation of groundwater quality. Large areas in the case study area can be classified as low or very low risk area corresponding to the pollution sources due to the absence of hazards and also due to low vulnerabilities. These areas could consequently be interesting for future development as they are preferable in view of ground water protection.

  12. CMMAD usability case study in support of countermine and hazard sensing

    NASA Astrophysics Data System (ADS)

    Walker, Victor G.; Gertman, David I.

    2010-04-01

    During field trials, operator usability data were collected in support of lane clearing missions and hazard sensing for two robot platforms with Robot Intelligence Kernel (RIK) software and sensor scanning payloads onboard. The tests featured autonomous and shared robot autonomy levels where tasking of the robot used a graphical interface featuring mine location and sensor readings. The goal of this work was to provide insights that could be used to further technology development. The efficacy of countermine and hazard systems in terms of mobility, search, path planning, detection, and localization were assessed. Findings from objective and subjective operator interaction measures are reviewed along with commentary from soldiers having taken part in the study who strongly endorse the system.

  13. Study on anaerobic digestion treatment of hazardous colistin sulphate contained pharmaceutical sludge.

    PubMed

    Yin, Fubin; Wang, Dongling; Li, Zifu; Ohlsen, Thomas; Hartwig, Peter; Czekalla, Sven

    2015-02-01

    Pharmaceutical sludge is considered as a hazardous substance with high treatment and disposal fees. Anaerobic digestion could not only transform the hazardous substance into activated sludge, but also generate valuable biogas. This research had two objectives. First: studying the feasibility of anaerobic digestion and determining the biochemical methane potential (BMP) of pharmaceutical sludge under different Inoculum to substrate TS ratios (ISRs) of 0, 0.65, 2.58 and 10.32 in mesophilic condition of 37±1°C. Secondly, investigating the removal efficiency of colistin sulphate during anaerobic digestion. The results showed that the use of anaerobic digestion to treat the pharmaceutical sludge is feasible and that it can completely eliminate the colistin sulphate. The highest biogas production from pharmaceutical sludge is 499.46 mL/g TS at an ISR of 10.32. PMID:25490101

  14. Status of volcanic hazard studies for the Nevada Nuclear Waste Storage Investigations

    SciTech Connect

    Crowe, B.M.; Vaniman, D.T.; Carr, W.J.

    1983-03-01

    Volcanism studies of the Nevada Test Site (NTS) region are concerned with hazards of future volcanism with respect to underground disposal of high-level radioactive waste. The hazards of silicic volcanism are judged to be negligible; hazards of basaltic volcanism are judged through research approaches combining hazard appraisal and risk assessment. The NTS region is cut obliquely by a N-NE trending belt of volcanism. This belt developed about 8 Myr ago following cessation of silicic volcanism and contemporaneous with migration of basaltic activity toward the southwest margin of the Great Basin. Two types of fields are present in the belt: (1) large-volume, long-lived basalt and local rhyolite fields with numerous eruptive centers and (2) small-volume fields formed by scattered basaltic scoria cones. Late Cenozoic basalts of the NTS region belong to the second field type. Monogenetic basalt centers of this region were formed mostly by Strombolian eruptions; Surtseyean activity has been recognized at three centers. Geochemically, the basalts of the NTS region are classified as straddle A-type basalts of the alkalic suite. Petrological studies indicate a volumetric dominance of evolved hawaiite magmas. Trace- and rare-earth-element abundances of younger basalt (<4 Myr) of the NTS region and southern Death Valley area, California, indicate an enrichment in incompatible elements, with the exception of rubidium. The conditional probability of recurring basaltic volcanism and disruption of a repository by that event is bounded by the range of 10{sup -8} to 10{sup -10} as calculated for a 1-yr period. Potential disruptive and dispersal effects of magmatic penetration of a repository are controlled primarily by the geometry of basalt feeder systems, the mechanism of waste incorporation in magma, and Strombolian eruption processes.

  15. 42 CFR 93.408 - Mitigating and aggravating factors in HHS administrative actions.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 42 Public Health 1 2014-10-01 2014-10-01 false Mitigating and aggravating factors in HHS administrative actions. 93.408 Section 93.408 Public Health PUBLIC HEALTH SERVICE, DEPARTMENT OF HEALTH AND HUMAN SERVICES HEALTH ASSESSMENTS AND HEALTH EFFECTS STUDIES OF HAZARDOUS SUBSTANCES RELEASES AND FACILITIES PUBLIC HEALTH SERVICE POLICIES ON...

  16. Probabilistic tephra hazard maps for the Neapolitan area: Quantitative volcanological study of Campi Flegrei eruptions

    NASA Astrophysics Data System (ADS)

    Mastrolorenzo, G.; Pappalardo, L.; Troise, C.; Panizza, A.; de Natale, G.

    2008-07-01

    Tephra fall is a relevant hazard of Campi Flegrei caldera (Southern Italy), due to the high vulnerability of Naples metropolitan area to such an event. Here, tephra derive from magmatic as well as phreatomagmatic activity. On the basis of both new and literature data on known, past eruptions (Volcanic Explosivity Index (VEI), grain size parameters, velocity at the vent, column heights and erupted mass), and factors controlling tephra dispersion (wind velocity and direction), 2D numerical simulations of fallout dispersion and deposition have been performed for a large number of case events. A bayesian inversion has been applied to retrieve the best values of critical parameters (e.g., vertical mass distribution, diffusion coefficients, velocity at the vent), not directly inferable by volcanological study. Simulations are run in parallel on multiple processors to allow a fully probabilistic analysis, on a very large catalogue preserving the statistical proprieties of past eruptive history. Using simulation results, hazard maps have been computed for different scenarios: upper limit scenario (worst-expected scenario), eruption-range scenario, and whole-eruption scenario. Results indicate that although high hazard characterizes the Campi Flegrei caldera, the territory to the east of the caldera center, including the whole district of Naples, is exposed to high hazard values due to the dominant westerly winds. Consistently with the stratigraphic evidence of nature of past eruptions, our numerical simulations reveal that even in the case of a subplinian eruption (VEI = 3), Naples is exposed to tephra fall thicknesses of some decimeters, thereby exceeding the critical limit for roof collapse. Because of the total number of people living in Campi Flegrei and the city of Naples (ca. two million of inhabitants), the tephra fallout risk related to a plinian eruption of Campi Flegrei largely matches or exceeds the risk related to a similar eruption at Vesuvius.

  17. Mitigation potential of horizontal ground coupled heat pumps for current and future climatic conditions: UK environmental modelling and monitoring studies

    NASA Astrophysics Data System (ADS)

    García González, Raquel; Verhoef, Anne; Vidale, Pier Luigi; Gan, Guohui; Wu, Yupeng; Hughes, Andrew; Mansour, Majdi; Blyth, Eleanor; Finch, Jon; Main, Bruce

    2010-05-01

    An increased uptake of alternative low or non-CO2 emitting energy sources is one of the key priorities for policy makers to mitigate the effects of environmental change. Relatively little work has been undertaken on the mitigation potential of Ground Coupled Heat Pumps (GCHPs) despite the fact that a GCHP could significantly reduce CO2 emissions from heating systems. It is predicted that under climate change the most probable scenario is for UK temperatures to increase and for winter rainfall to become more abundant; the latter is likely to cause a general rise in groundwater levels. Summer rainfall may reduce considerably, while vegetation type and density may change. Furthermore, recent studies underline the likelihood of an increase in the number of heat waves. Under such a scenario, GCHPs will increasingly be used for cooling as well as heating. These factors will affect long-term performance of horizontal GCHP systems and hence their economic viability and mitigation potential during their life span ( 50 years). The seasonal temperature differences encountered in soil are harnessed by GCHPs to provide heating in the winter and cooling in the summer. The performance of a GCHP system will depend on technical factors (heat exchanger (HE) type, length, depth, and spacing of pipes), but also it will be determined to a large extent by interactions between the below-ground parts of the system and the environment (atmospheric conditions, vegetation and soil characteristics). Depending on the balance between extraction and rejection of heat from and to the ground, the soil temperature in the neighbourhood of the HE may fall or rise. The GROMIT project (GROund coupled heat pumps MITigation potential), funded by the Natural Environment Research Council (UK), is a multi-disciplinary research project, in collaboration with EarthEnergy Ltd., which aims to quantify the CO2 mitigation potential of horizontal GCHPs. It considers changing environmental conditions and combines model predictions of soil moisture content and soil temperature with measurements at different GCHP locations over the UK. The combined effect of environment dynamics and horizontal GCHP technical properties on long-term GCHP performance will be assessed using a detailed land surface model (JULES: Joint UK Land Environment Simulator, Meteorological Office, UK) with additional equations embedded describing the interaction between GCHP heat exchangers and the surrounding soil. However, a number of key soil physical processes are currently not incorporated in JULES, such as groundwater flow, which, especially in lowland areas, can have an important effect on the heat flow between soil and HE. Furthermore, the interaction between HE and soil may also cause soil vapour and moisture fluxes. These will affect soil thermal conductivity and hence heat flow between the HE and the surrounding soil, which will in turn influence system performance. The project will address these issues. We propose to drive an improved version of JULES (with equations to simulate GCHP exchange embedded), with long-term gridded (1 km) atmospheric, soil and vegetation data (reflecting current and future environmental conditions) to reliably assess the mitigation potential of GCHPs over the entire domain of the UK, where uptake of GCHPs has been low traditionally. In this way we can identify areas that are most suitable for the installation of GCHPs. Only then recommendations can be made to local and regional governments, for example, on how to improve the mitigation potential in less suitable areas by adjusting GCHP configurations or design.

  18. Examination of Icing Induced Loss of Control and Its Mitigations

    NASA Technical Reports Server (NTRS)

    Reehorst, Andrew L.; Addy, Harold E., Jr.; Colantonio, Renato O.

    2010-01-01

    Factors external to the aircraft are often a significant causal factor in loss of control (LOC) accidents. In today s aviation world, very few accidents stem from a single cause and typically have a number of causal factors that culminate in a LOC accident. Very often the "trigger" that initiates an accident sequence is an external environment factor. In a recent NASA statistical analysis of LOC accidents, aircraft icing was shown to be the most common external environmental LOC causal factor for scheduled operations. When investigating LOC accident or incidents aircraft icing causal factors can be categorized into groups of 1) in-flight encounter with super-cooled liquid water clouds, 2) take-off with ice contamination, or 3) in-flight encounter with high concentrations of ice crystals. As with other flight hazards, icing induced LOC accidents can be prevented through avoidance, detection, and recovery mitigations. For icing hazards, avoidance can take the form of avoiding flight into icing conditions or avoiding the hazard of icing by making the aircraft tolerant to icing conditions. Icing detection mitigations can take the form of detecting icing conditions or detecting early performance degradation caused by icing. Recovery from icing induced LOC requires flight crew or automated systems capable of accounting for reduced aircraft performance and degraded control authority during the recovery maneuvers. In this report we review the icing induced LOC accident mitigations defined in a recent LOC study and for each mitigation describe a research topic required to enable or strengthen the mitigation. Many of these research topics are already included in ongoing or planned NASA icing research activities or are being addressed by members of the icing research community. These research activities are described and the status of the ongoing or planned research to address the technology needs is discussed

  19. Effects of anthropogenic land-subsidence on river flood hazard: a case study in Ravenna, Italy

    NASA Astrophysics Data System (ADS)

    Carisi, Francesca; Domeneghetti, Alessio; Castellarin, Attilio

    2015-04-01

    Can differential land-subsidence significantly alter the river flooding dynamics, and thus flood risk in flood prone areas? Many studies show how the lowering of the coastal areas is closely related to an increase in the flood-hazard due to more important tidal flooding and see level rise. On the contrary, the literature on the relationship between differential land-subsidence and possible alterations to riverine flood-hazard of inland areas is still sparse, while several areas characterized by significant land-subsidence rates during the second half of the 20th century experienced an intensification in both inundation magnitude and frequency. This study investigates the possible impact of a significant differential ground lowering on flood hazard in proximity of Ravenna, which is one of the oldest Italian cities, former capital of the Western Roman Empire, located a few kilometers from the Adriatic coast and about 60 km south of the Po River delta. The rate of land-subsidence in the area, naturally in the order of a few mm/year, dramatically increased up to 110 mm/year after World War II, primarily due to groundwater pumping and a number of deep onshore and offshore gas production platforms. The subsidence caused in the last century a cumulative drop larger than 1.5 m in the historical center of the city. Starting from these evidences and taking advantage of a recent digital elevation model of 10m resolution, we reconstructed the ground elevation in 1897 for an area of about 65 km2 around the city of Ravenna. We referred to these two digital elevation models (i.e. current topography and topographic reconstruction) and a 2D finite-element numerical model for the simulation of the inundation dynamics associated with several levee failure scenarios along embankment system of the river Montone. For each scenario and digital elevation model, the flood hazard is quantified in terms of water depth, speed and dynamics of the flooding front. The comparison enabled us to quantify alterations to the flooding hazard due to large and rapid differential land-subsidence, shedding some light on whether to consider anthropogenic land-subsidence among the relevant human-induced drivers of flood-risk change.

  20. First-phase study design for the US Navy Radon Assessment and Mitigation Program (NAVRAMP)

    SciTech Connect

    Wilson, D.L.; Gammage, R.B.; Matthews, T.G.

    1990-01-01

    In 1988, the US Navy initiated a worldwide multi-year program for the assessment and mitigation of radon inside buildings under its control. During the first two years of the program, radon levels in residences occupied by Navy personnel and their dependents are being surveyed. Also being surveyed are all schools, child care centers, hospitals, and brigs in addition to a small random sample of bachelor quarters. Passive alpha-track detectors, numbering about 25,000, are being used as monitoring devices. A substantial fraction of the monitors (20%) are being used for quality assurance. Data management programs have been developed to record the chain of custody of the monitors and handle the associated questionnaire data. Program objectives and implementation emphasize quality assurance, records maintenance, and monitor placement and retrieval. 5 refs., 2 tabs.

  1. Study on FPGA SEU Mitigation for the Readout Electronics of DAMPE BGO Calorimeter in Space

    NASA Astrophysics Data System (ADS)

    Shen, Zhongtao; Feng, Changqing; Gao, Shanshan; Zhang, Deliang; Jiang, Di; Liu, Shubin; An, Qi

    2015-06-01

    The BGO calorimeter, which provides a wide measurement range of the primary cosmic ray spectrum, is a key sub-detector of Dark Matter Particle Explorer (DAMPE). The readout electronics of calorimeter consists of 16 pieces of Actel ProASIC Plus FLASH-based FPGA, of which the design-level flip-flops and embedded block RAMs are single event upset (SEU) sensitive in the harsh space environment. Therefore to comply with radiation hardness assurance (RHA), SEU mitigation methods, including partial triple modular redundancy (TMR), CRC checksum, and multi-domain reset are analyzed and tested by the heavy-ion beam test. Composed of multi-level redundancy, a FPGA design with the characteristics of SEU tolerance and low resource consumption is implemented for the readout electronics.

  2. Study of the radiated energy loss during massive gas injection mitigated disruptions on EAST

    NASA Astrophysics Data System (ADS)

    Duan, Y. M.; Hao, Z. K.; Hu, L. Q.; Wang, L.; Xu, P.; Xu, L. Q.; Zhuang, H. D.

    2015-08-01

    The MGI mitigated disruption experiments were carried out on EAST with a new fast gas controlling valve in 2012. Different amounts of noble gas He or mixed gas of 99% He + 1% Ar are injected into plasma in current flat-top phase and current ramp-down phase separately. The initial results of MGI experiments are described. The MGI system and the radiation measurement system are briefly introduced. The characteristics of radiation distribution and radiation energy loss are analyzed. About 50% of the stored thermal energy Wdia is dissipated by radiation during the entire disruption process and the impurities of C and Li from the PFC play important roles to radiative energy loss. The amount of the gas can affect the pre-TQ phase. Strong poloidal asymmetry of radiation begins to appear in the CQ phase, which is possibly caused by the plasma configuration changes as a result of VDE. No toroidal radiation asymmetry is observed presently.

  3. First-phase study design for the US Navy Radon Assessment and Mitigation Program (NAVRAMP)

    SciTech Connect

    Gammage, R.B.; Wilson, D.L.; Dudney, C.S.; Matthews, T.G.

    1990-01-01

    In 1988, the Navy initiated a multi-year program for the assessment and mitigation of radon inside buildings at its worldwide distribution of bases. During the first two years of the program, a survey is being made of indoor radon levels in residences occupied by Navy personnel and their dependents. In addition, a small random sample of other structures is being monitored for elevated radon. Passive alpha-track detectors, numbering about 25,000, are being used as monitoring devices. A substantial fraction of the monitors (20%) are being used for quality assurance. Data management programs have been developed to record the chain of custody of the monitors and handle the associated questionnaire data. Program objectives and implementation emphasize quality assurance, records maintenance and monitor placement and retrieval. 5 refs., 2 tabs.

  4. Using respondents' uncertainty scores to mitigate hypothetical bias in community-based health insurance studies.

    PubMed

    Donfouet, Hermann Pythagore Pierre; Mahieu, Pierre-Alexandre; Malin, Eric

    2013-04-01

    Community-based health insurance has been implemented in several developing countries to help the poor to gain access to adequate health-care services. Assessing what the poor are willing to pay is of paramount importance for policymaking. The contingent valuation method, which relies on a hypothetical market, is commonly used for this purpose. But the presence of the hypothetical bias that is most often inherent in this method tends to bias the estimates upward and compromises policymaking. This paper uses respondents' uncertainty scores in an attempt to mitigate hypothetical bias in community-based health insurance in one rural setting in Cameroon. Uncertainty scores are often employed in single dichotomous choice surveys. An originality of the paper is to use such an approach in a double-bounded dichotomous choice survey. The results suggest that this instrument is effective at decreasing the mean WTP. PMID:22160944

  5. Effectiveness of protected areas in mitigating fire within their boundaries: case study of Chiapas, Mexico.

    PubMed

    Román-Cuesta, María Rosa; Martínez-Vilalta, Jordi

    2006-08-01

    Since the severe 1982-1983 El Niño drought, recurrent burning has been reported inside tropical protected areas (TPAs). Despite the key role of fire in habitat degradation, little is known about the effectiveness of TPAs in mitigating fire incidence and burned areas. We used a GPS fire database (1995-2005) (n=3590 forest fires) obtained from the National Forest Commission to compare fire incidence (number of fires) and burned areas inside TPAs and their surrounding adjacent buffer areas in Southern Mexico (Chiapas). Burned areas inside parks ranged from 2% (Palenque) to 45% (Lagunas de Montebello) of a park's area, and the amount burned was influenced by two severe El Niño events (1998 and 2003). These two years together resulted in 67% and 46% of the total area burned in TPAs and buffers, respectively during the period under analysis. Larger burned areas in TPAs than in their buffers were exclusively related to the extent of natural habitats (flammable area excluding agrarian and pasture lands). Higher fuel loads together with access and extinction difficulties were likely behind this trend. A higher incidence of fire in TPAs than in their buffers was exclusively related to anthropogenic factors such as higher road densities and agrarian extensions. Our results suggest that TPAs are failing to mitigate fire impacts, with both fire incidence and total burned areas being significantly higher in the reserves than in adjacent buffer areas. Management plans should consider those factors that facilitate fires in TPAs: anthropogenic origin of fires, sensitivity of TPAs to El Niñio-droughts, large fuel loads and fuel continuity inside parks, and limited financial resources. Consideration of these factors favors lines of action such as alternatives to the use of fire (e.g., mucuna-maize system), climatic prediction to follow the evolution of El Niño, fuel management strategies that favor extinction practices, and the strengthening of local communities and ecotourism. PMID:16922224

  6. Study on landslide hazard zonation based on factor weighting-rating theory in Slanic Prahova

    NASA Astrophysics Data System (ADS)

    Maftei, R.-M.; Vina, G.; Filipciuc, C.

    2012-04-01

    Studying the risks caused by landslides is important in the context of its forecast triggering. This study mainly integrates the background data that are related to historical and environmental factors and also current triggering factors. The theory on zoning hazard caused by landslides, Landslide Hazard Zonation, (LHZ) appeared in the 1960s. In this period the U.S. and many European countries began to use other triggers factors, besides the slope factor, in achieving hazard zoning. This theory has progressed due to the development of remote sensing and GIS technology, which were used to develop and analys methods and techniques consisting in combining data from different sources. The study of an area involves analysing the geographical position data, estimating the surface, the type of terrain, altitude, identifing the landslides in the area and some geological summary data. Data sources. The data used in this study are: · Landsat 7 satellite images; · 30 m spatial resolution, from which is derived the vegetation index; · topographic maps 1:25 000 from which we can obtain the numerical altitude model (DEM) (used to calculate the slope and relative altitude to land) · geological maps 1:50 000. Studied factors. The main factors used and studied in achieving land slides hazard zoning are: - the rate of displacement, the angle of slope, lithology - the index of vegetation or ground coverage of vegetation (NDVI) - river network, structural factor 1. The calculation of normalized vegetation index is made based on Landsat ETM satellite images. This vegetation factor can be both a principal and a secondary trigger factor in landslides. In areas devoid of vegetation, landslides are triggered more often compared with those in which coverage is greater. 2. Factors derived from the numerical model are the slope and elevation relative altitude. This operation was made using the topographic map 1:25 000 from were the level curvs contour was extracted by digitization, and then they were converted into points that have been interpolated. Lithological and structural factors have been extracted from the geological map by vectorization and the hydrological one from the topographic map and satellite imagery. 3. Weights Selection All these elements were transfomated in raster format with spatial resolution of 25 m. Each element was given a rating of importance from 0-9, depending on its share in causing the phenomenon, and then a quantitative index based on each specific characteristic. Was performed for each subject in hand, a risk index for each area separately. LHZ will be the established from the risk index histogram important steps. This work is presented within the framework of the SafeLand project funded by the EC (FP7).

  7. Studies On The Influence Of Soil Components On Adsorption-Desorption Of Hazardous Organics And Their Insitu Biodegradation

    NASA Astrophysics Data System (ADS)

    Khan, Z.

    2003-12-01

    Currently approximately 155 cubic yards of soil is contaminated with hazardous organics at Patancheru Industrial area (Hyderabad, India). These hazardous organic contaminants are frequently part of hazardous waste disposed on land and the study of waste site interaction is the key to assess the potential for offsite and onsite contamination. In the present study the authors report the results on the adsorption, soil leaching potential and persistence of phenol, p-nitrophenol,2,4-dichlorophenol and 4,chloro-2,nitrophenol which are the common constituents of the hazardous waste generated. The role of soil components like organic matter, clay, iron and aluminium oxides in the adsorption capacity has been studied. Desorption isotherms of soil adsorbed hazardous organics exhibited hysterisis at high initial concentration indicating the degree of irreversibility of adsorption-deesorption process. Leaching potential of the hazardous organics decreases with their increasing hydrophobicity and soil organic matter content while their persistence in terms of half life time (DT50) increases. Insitu biodegradation has been carried out by developing mixed culture systems which can degrade the phenols to complete mineralisation by utilizing them as the sole source of carbon and their corresponding biodegradation kinetic constants were evaluated. Based on the above data generated preparation of hazardous waste dumpsites with suitable soil surface having high holding capacity for organics and their insitu biodegradation by mixing with specific bacterial cultures enriched from different soils can be exploited as a cost effective technology for reclamation of contaminated sites.

  8. Study on the Application of Probabilistic Tsunami Hazard Analysis for the Nuclear Power Plant Site in Korean Peninsula

    NASA Astrophysics Data System (ADS)

    Rhee, H. M.; Kim, M.; Sheen, D. H.; Choi, I. K.

    2014-12-01

    The necessity of study on the tsunami hazard assessment for Nuclear Power Plant (NPP) site was suggested since the event of Fukushima in 2011 had been occurred. It has being emphasized because all of the NPPs in Korean Peninsula are located in coastal region. The tsunami hazard is regarded as the annual exceedance probability for the wave heights. The methodology for analysis of tsunami hazard is based on the seismic hazard analysis. The seismic hazard analysis had been performed by using both deterministic and probabilistic method. Recently, the probabilistic method had been received more attention than the deterministic method because the uncertainties of hazard analysis could be considered by using the logic tree approach. In this study, the probabilistic tsunami hazard analysis for Uljin NPP site was performed by using the information of fault sources which was published by Atomic Energy Society of Japan (AESJ). The wave parameter is the most different parameter with seismic hazard. It could be estimated from the results of tsunami propagation analysis. The TSUNAMI_ver1.0 which was developed by Japan nuclear energy safety organization (JNES), was used for the tsunami simulation. The 80 cases tsunami simulations were performed and then the wave parameters were estimated. For reducing the sensitivity which was encouraged by location of sampling point, the wave parameters were estimated from group of sampling points.The probability density function on the tsunami height was computed by using the recurrence intervals and the wave parameters. And then the exceedance probability distribution was calculated from the probability density function. The tsunami hazards for the sampling groups were calculated. The fractile curves which were shown the uncertainties of input parameters were estimated from the hazards by using the round-robin algorithm. In general, tsunami hazard analysis is focused on the maximum wave heights. But the minimum wave height should be considered for the tsunami hazard analysis on the NPP site since it is connected with water intake system. The results of tsunami hazard analysis for the NPP site was suggested by the annual exceedance probability with the wave heights. This study shows that the PTHA method could be applied for the estimation of tsunami wave height in NPP sites

  9. GIS data for the Seaside, Oregon, Tsunami Pilot Study to modernize FEMA flood hazard maps

    USGS Publications Warehouse

    Wong, Florence L.; Venturato, Angie J.; Geist, Eric L.

    2007-01-01

    A Tsunami Pilot Study was conducted for the area surrounding the coastal town of Seaside, Oregon, as part of the Federal Emergency Management's (FEMA) Flood Insurance Rate Map Modernization Program (Tsunami Pilot Study Working Group, 2006). The Cascadia subduction zone extends from Cape Mendocino, California, to Vancouver Island, Canada. The Seaside area was chosen because it is typical of many coastal communities subject to tsunamis generated by far- and near-field (Cascadia) earthquakes. Two goals of the pilot study were to develop probabilistic 100-year and 500-year tsunami inundation maps using Probabilistic Tsunami Hazard Analysis (PTHA) and to provide recommendations for improving tsunami hazard assessment guidelines for FEMA and state and local agencies. The study was an interagency effort by the National Oceanic and Atmospheric Administration, U.S. Geological Survey, and FEMA, in collaboration with the University of Southern California, Middle East Technical University, Portland State University, Horning Geoscience, Northwest Hydraulics Consultants, and the Oregon Department of Geological and Mineral Industries. The pilot study model data and results are published separately as a geographic information systems (GIS) data report (Wong and others, 2006). The flood maps and GIS data are briefly described here.

  10. Hazardous Waste

    MedlinePlus

    ... you throw these substances away, they become hazardous waste. Some hazardous wastes come from products in our homes. Our garbage can include such hazardous wastes as old batteries, bug spray cans and paint ...

  11. Towards the Seismic Hazard Reassessment of Paks NPP (Hungary) Site: Seismicity and Sensitivity Studies

    NASA Astrophysics Data System (ADS)

    Toth, Laszlo; Monus, Peter; Gyori, Erzsebet; Grenerczy, Gyula; Janos Katona, Tamas; Kiszely, Marta

    2015-04-01

    In context of extension of Paks Nuclear Power Plant by new units, a comprehensive site seismic hazard evaluation program has been developed that is already approved by the Hungarian Authorities. This includes a 3D seismic survey, drilling of several deep boreholes, extensive geological mapping, and geophysical investigations at the site and its vicinity, as well as on near regional, and regional scale. Furthermore, all relevant techniques of modern space geodesy (GPS, PSInSAR) will be also utilized to construct a new seismotectonic model. The implementation of the project is still in progress. In the presentation, some important elements of the new seismic hazard assessment are highlighted, and some results obtained in the preliminary phase of the program are presented and discussed. The first and most important component of the program is the compilation of the seismological database that is developed on different time scale zooming on different event recurrence rates such as paleo-earthquakes (10-1/a). In 1995, Paks NPP installed and started to operate a sensitive microseismic monitoring network capable for locating earthquakes as small as magnitude 2.0 within about 100 km of the NPP site. During the two decades of operation, the microseismic monitoring network located some 2,000 earthquakes within the region of latitude 45.5 - 49 N and longitude 16 - 23 E. Out of the total number of events, 130 earthquakes were reported as 'felt events'. The largest earthquake was an event of ML 4.8, causing significant damage in the epicenter area. The results of microseismic monitoring provide valuable data for seismotectonic modelling and results in more accurate earthquake recurrence equations. The first modern PSHA of Paks NPP site was carried out in 1995. Complex site characterization project was implemented and hazard curves had been evaluated for 10-3 - 10-5 annual frequency. As a follow-up, PSHA results have been reviewed and updated in the frame of periodic safety reviews, and hazard characterization of the site has been confirmed. The hazard curves have been extended to lower probability events, as it is required by the probabilistic safety analysis. These earlier projects resulted in 0.22-0.26 g and 0.43-0.54 g mean PGA at 104 and 105 return periods. The site effect and liquefaction probability have also been evaluated. As it is expected for the site of soft soil conditions, the amplification is greater at shorter periods for the lower amplitude ground motion of 104 return period compared to the longer periods for the higher amplitude of the 105 year level ground motion. Further studies will be based on the improved regional seismotectonic model, state-of-the-art hazard evaluation software, and better knowledge of the local soil conditions. The presented preliminary results can demonstrate the adequacy of the planned program and highlight the progress in the hazard assessment.

  12. A Hazard Assessment and Proposed Risk Index for Art, Architecture, Archive and Artifact Protection: Case Studies for Assorted International Museums

    NASA Astrophysics Data System (ADS)

    Kirk, Clara J.

    This study proposes a hazard/risk index for environmental, technological, and social hazards that may threaten a museum or other place of cultural storage and accession. This index can be utilized and implemented to measure the risk at the locations of these storage facilities in relationship to their geologic, geographic, environmental, and social settings. A model case study of the 1966 flood of the Arno River and its impact on the city of Florence and the Uffizi Gallery was used as the index focus. From this focus an additional eleven museums and their related risk were assessed. Each index addressed a diverse range of hazards based on past frequency and magnitude. It was found that locations nearest a hazard had exceptionally high levels of risk, however more distant locations could have influences that would increase their risk to levels similar to those locations near the hazard. Locations not normally associated with a given natural hazard can be susceptible should the right conditions be met and this research identified, complied and assessed those factions found to influence natural hazard risk at these research sites.

  13. Prediction of Ungauged River Basin for Hydro Power Potential and Flood Risk Mitigation; a Case Study at Gin River, Sri Lanka

    NASA Astrophysics Data System (ADS)

    Ratnayake, A. S.

    2011-12-01

    The most of the primary civilizations of the world emerged in or near river valleys or floodplains. The river channels and floodplains are single hydrologic and geomorphic system. The failure to appreciate the integral connection between floodplains and channel underlies many socioeconomic and environmental problems in river management today. However it is a difficult task of collecting reliable field hydrological data. Under such situations either synthetic or statistically generated data were used for hydraulic engineering designing and flood modeling. The fundamentals of precipitation-runoff relationship through synthetic unit hydrograph for Gin River basin were prepared using the method of the Flood Studies Report of the National Environmental Research Council, United Kingdom (1975). The Triangular Irregular Network model was constructed using Geographic Information System (GIS) to determine hazard prone zones. The 1:10,000 and 1:50,000 topography maps and field excursions were also used for initial site selection of mini-hydro power units and determine flooding area. The turbines output power generations were calculated using the parameters of net head and efficiency of turbine. The peak discharge achieves within 4.74 hours from the onset of the rainstorm and 11.95 hours time takes to reach its normal discharge conditions of Gin River basin. Stream frequency of Gin River is 4.56 (Junctions/ km2) while the channel slope is 7.90 (m/km). The regional coefficient on the catchment is 0.00296. Higher stream frequency and gentle channel slope were recognized as the flood triggering factors of Gin River basin and other parameters such as basins catchment area, main stream length, standard average annual rainfall and soil do not show any significant variations with other catchments of Sri Lanka. The flood management process, including control of flood disaster, prepared for a flood, and minimize it impacts are complicated in human population encroached and modified floodplains. Thus modern GIS technology has been productively executed to prepare hazard maps based on the flood modeling and also it would be further utilized for disaster preparedness and mitigation activities. Five suitable hydraulic heads were recognized for mini-hydro power sites and it would be the most economical and applicable flood controlling hydraulic engineering structure considering all morphologic, climatic, environmental and socioeconomic proxies of the study area. Mini-hydro power sites also utilized as clean, eco friendly and reliable energy source (8630.0 kW). Finally Francis Turbine can be employed as the most efficiency turbine for the selected sites bearing in mind of both technical and economical parameters.

  14. An evaluation of soil erosion hazard: A case study in Southern Africa using geomatics technologies

    NASA Astrophysics Data System (ADS)

    Eiswerth, Barbara Alice

    Accelerated soil erosion in Malawi, Southern Africa, increasingly threatens agricultural productivity, given current and projected population growth trends. Previous attempts to document soil erosion potential have had limited success, lacking appropriate information and diagnostic tools. This study utilized geomatics technologies and the latest available information from topography, soils, climate, vegetation, and land use of a watershed in southern Malawi. The Soil Loss Estimation Model for Southern Africa (SLEMSA), developed for conditions in Zimbabwe, was evaluated and used to create a soil erosion hazard map for the watershed under Malawi conditions. The SLEMSA sub-models of cover, soil loss, and topography were computed from energy interception, rainfall energy, and soil erodibility, and slope length and steepness, respectively. Geomatics technologies including remote sensing and Geographic Information Systems (GIS) provided the tools with which land cover/land use, a digital elevation model, and slope length and steepness were extracted and integrated with rainfall and soils spatial information. Geomatics technologies enable rapid update of the model as new and better data sets become available. Sensitivity analyses of the SLEMSA model revealed that rainfall energy and slope steepness have the greatest influence on soil erosion hazard estimates in this watershed. Energy interception was intermediate in sensitivity level, whereas slope length and soil erodibility ranked lowest. Energy interception and soil erodibility were shown by parameter behavior analysis to behave in a linear fashion with respect to soil erosion hazard, whereas rainfall energy, slope steepness, and slope length exhibit non-linear behavior. When SLEMSA input parameters and results were compared to alternative methods of soil erosion assessment, such as drainage density and drainage texture, the model provided more spatially explicit information using 30 meter grid cells. Results of this study indicate that more accurate soil erosion estimates can be made when: (1) higher resolution digital elevation models are used; (2) data from improved precipitation station network are available, and; (3) greater investment in rainfall energy research.

  15. Determination of metal ion content of beverages and estimation of target hazard quotients: a comparative study

    PubMed Central

    Hague, Theresa; Petroczi, Andrea; Andrews, Paul LR; Barker, James; Naughton, Declan P

    2008-01-01

    Background Considerable research has been directed towards the roles of metal ions in nutrition with metal ion toxicity attracting particular attention. The aim of this study is to measure the levels of metal ions found in selected beverages (red wine, stout and apple juice) and to determine their potential detrimental effects via calculation of the Target Hazard Quotients (THQ) for 250 mL daily consumption. Results The levels (mean ± SEM) and diversity of metals determined by ICP-MS were highest for red wine samples (30 metals totalling 5620.54 ± 123.86 ppb) followed by apple juice (15 metals totalling 1339.87 ± 10.84 ppb) and stout (14 metals totalling 464.85 ± 46.74 ppb). The combined THQ values were determined based upon levels of V, Cr, Mn, Ni, Cu, Zn and Pb which gave red wine samples the highest value (5100.96 ± 118.93 ppb) followed by apple juice (666.44 ± 7.67 ppb) and stout (328.41 ± 42.36 ppb). The THQ values were as follows: apple juice (male 3.11, female 3.87), stout (male 1.84, female 2.19), red wine (male 126.52, female 157.22) and ultra-filtered red wine (male 110.48, female 137.29). Conclusion This study reports relatively high levels of metal ions in red wine, which give a very high THQ value suggesting potential hazardous exposure over a lifetime for those who consume at least 250 mL daily. In addition to the known hazardous metals (e.g. Pb), many metals (e.g. Rb) have not had their biological effects systematically investigated and hence the impact of sustained ingestion is not known. PMID:18578877

  16. The newest achievements of studies on the reutilization, treatment, and disposal technology of hazardous wastes

    SciTech Connect

    Liu Peizhe

    1996-12-31

    From 1991 to 1996, key studies on the reutilization, treatment, and disposal technology of hazardous wastes have been incorporated into the national plan for environmental protection science and technology. At present, the research achievements have been accomplished, have passed national approval, and have been accepted. The author of this paper, as leader of the national group for this research work, expounds the newest achievements of the studies involving four parts: (1) the reutilization technology of electroplating sludge, including the ion-exchange process for recovering the sludge and waste liquor for producing chromium tanning agent and extracting chromium and colloidal protein from tanning waste residue; on the recovery of heavy metals from the electroplating waste liquor with microbic purification; on the demonstration project of producing modified plastics from the sludge and the waste plastics; and on the demonstration of the recovery of heavy metals from waste electroplating sludge by using the ammonia-leaching process; (2) the demonstrative research of reutilization technology of chromium waste residues, including production of self-melting ore and smelting of chromium-containing pig iron, and of pyrolytic detoxification of the residue with cyclone furnace; (3) the incineration technology of hazardous wastes with successful results of the industrial incinerator system for polychlorinated biphenyls; and (4) the safety landfill technology for disposal of hazardous wastes, with a complete set of technology for pretreatment, selection of the site, development of the antipercolating materials, and design and construction of the landfill. Only a part of the achievements is introduced in this paper, most of which has been built and is being operated for demonstration to further spreading application and accumulate experience. 6 refs., 7 figs., 6 tabs.

  17. Creative mitigation

    SciTech Connect

    Ayer, F.; Lagassa, G.

    1989-10-01

    On May 9, 1989, in front of a small but enthusiastic group composed of residents of Columbia Falls, Maine, Downeast fisherman and a crew of Bangor Hydro-Electric employees removed some of the wooden sections of the Columbia Falls dam. The dam is located at the mouth of the Pleasant River on the Maine seacoast, only thirty miles from the border between Maine and New Brunswick, Canada. In so doing, they provided unobstructed access by Atlantic salmon to crucial upstream aquatic habitat for the first time since the day was constructed in 1981. At the same time they made possible the efficient operation of a 13 MW hydroelectric facility some 75 miles inland at West Enfield, Maine, on the Penobscot River. This article describes the creative strategies used by Bangor Pacific Hydro Associated to satisfy environmental mitigation requirements at West Enfield, Maine.

  18. First Production of C60 Nanoparticle Plasma Jet for Study of Disruption Mitigation for ITER

    NASA Astrophysics Data System (ADS)

    Bogatu, I. N.; Thompson, J. R.; Galkin, S. A.; Kim, J. S.; Brockington, S.; Case, A.; Messer, S. J.; Witherspoon, F. D.

    2012-10-01

    Unique fast response and large mass-velocity delivery of nanoparticle plasma jets (NPPJs) provide a novel application for ITER disruption mitigation, runaway electrons diagnostics and deep fueling. NPPJs carry a much larger mass than usual gases. An electromagnetic plasma gun provides a very high injection velocity (many km/s). NPPJ has much higher ram pressure than any standard gas injection method and penetrates the tokamak confining magnetic field. Assimilation is enhanced due to the NP large surface-to-volume ratio. Radially expanding NPPJs help achieving toroidal uniformity of radiation power. FAR-TECH's NPPJ system was successfully tested: a coaxial plasma gun prototype (˜35 cm length, 96 kJ energy) using a solid state TiH2/C60 pulsed power cartridge injector produced a hyper-velocity (>4 km/s), high-density (>10^23 m-3), C60 plasma jet in ˜0.5 ms, with ˜1-2 ms overall response-delivery time. We present the TiH2/C60 cartridge injector output characterization (˜180 mg of sublimated C60 gas) and first production results of a high momentum C60 plasma jet (˜0.6 g.km/s).

  19. Implications of Adhesion Studies for Dust Mitigation on Thermal Control Surfaces

    NASA Technical Reports Server (NTRS)

    Gaier, James R.; Berkebile, Stephen P.

    2012-01-01

    Experiments measuring the adhesion forces under ultrahigh vacuum conditions (10 (exp -10) torr) between a synthetic volcanic glass and commonly used space exploration materials have recently been described. The glass has a chemistry and surface structure typical of the lunar regolith. It was found that Van der Waals forces between the glass and common spacecraft materials was negligible. Charge transfer between the materials was induced by mechanically striking the spacecraft material pin against the glass plate. No measurable adhesion occurred when striking the highly conducting materials, however, on striking insulating dielectric materials the adhesion increased dramatically. This indicates that electrostatic forces dominate over Van der Waals forces under these conditions. The presence of small amounts of surface contaminants was found to lower adhesive forces by at least two orders of magnitude, and perhaps more. Both particle and space exploration material surfaces will be cleaned by the interaction with the solar wind and other energetic processes and stay clean because of the extremely high vacuum (10 (exp -12) torr) so the atomically clean adhesion values are probably the relevant ones for the lunar surface environment. These results are used to interpret the results of dust mitigation technology experiments utilizing textured surfaces, work function matching surfaces and brushing. They have also been used to reinterpret the results of the Apollo 14 Thermal Degradation Samples experiment.

  20. Viscoelastic Materials Study for the Mitigation of Blast-Related Brain Injury

    NASA Astrophysics Data System (ADS)

    Bartyczak, Susan; Mock, Willis, Jr.

    2011-06-01

    Recent preliminary research into the causes of blast-related brain injury indicates that exposure to blast pressures, such as from IED detonation or multiple firings of a weapon, causes damage to brain tissue resulting in Traumatic Brain Injury (TBI) and Post Traumatic Stress Disorder (PTSD). Current combat helmets are not sufficient to protect the warfighter from this danger and the effects are debilitating, costly, and long-lasting. Commercially available viscoelastic materials, designed to dampen vibration caused by shock waves, might be useful as helmet liners to dampen blast waves. The objective of this research is to develop an experimental technique to test these commercially available materials when subject to blast waves and evaluate their blast mitigating behavior. A 40-mm-bore gas gun is being used as a shock tube to generate blast waves (ranging from 1 to 500 psi) in a test fixture at the gun muzzle. A fast opening valve is used to release nitrogen gas from the breech to impact instrumented targets. The targets consist of aluminum/ viscoelastic polymer/ aluminum materials. Blast attenuation is determined through the measurement of pressure and accelerometer data in front of and behind the target. The experimental technique, calibration and checkout procedures, and results will be presented.

  1. Macroscopic to microscopic studies of flue gas desulfurization byproducts for acid mine drainage mitigation

    SciTech Connect

    Robbins, E.I.; Kalyoncu, R.S.; Finkelman, R.B.; Matos, G.R.; Barsotti, A.F.; Haefner, R.J.; Rowe, G.L. Jr.; Savela, C.E.; Eddy, J.I.

    1996-12-31

    The use of flue gas desulfurization (FGD) systems to reduce SO{sub 2} emissions has resulted in the generation of large quantities of byproducts. These and other byproducts are being stockpiled at the very time that alkaline materials having high neutralization potential are needed to mitigate acid mine drainage (AMD). FGD byproducts are highly alkaline materials composed primarily of unreacted sorbents (lime or limestone and sulfates and sulfites of Ca). The American Coal Ash Association estimated that approximately 20 million tons of FGD material were generated by electric power utilities equipped with wet lime-limestone PGD systems in 1993. Less than 5% of this material has been put to beneficial use for agricultural soil amendments and for the production of wallboard and cement. Four USGS projects are examining FGD byproduct use to address these concerns. These projects involve (1) calculating the volume of flue gas desulfurization (FGD) byproduct generation and their geographic locations in relation to AMD, (2) determining byproduct chemistry and mineralogy, (3) evaluating hydrology and geochemistry of atmospheric fluidized bed combustion byproduct as soil amendment in Ohio, and (4) analyzing microbial degradation of gypsum in anoxic limestone drains in West Virginia.

  2. Conforth Ranch Wildlife Mitigation Feasibility Study, McNary, Oregon : Annual Report.

    SciTech Connect

    Rasmussen, Larry; Wright, Patrick; Giger, Richard

    1991-03-01

    The 2,860-acre Conforth Ranch near Umatilla, Oregon is being considered for acquisition and management to partially mitigate wildlife losses associated with McNary Hydroelectric Project. The Habitat Evaluation Procedures (HEP) estimated that management for wildlife would result in habitat unit gains of 519 for meadowlark, 420 for quail, 431 for mallard, 466 for Canada goose, 405 for mink, 49 for downy woodpecker, 172 for yellow warbler, and 34 for spotted sandpiper. This amounts to a total combined gain of 2,495 habitat units -- a 110 percent increase over the existing values for these species combined of 2,274 habitat units. Current water delivery costs, estimated at $50,000 per year, are expected to increase to $125,000 per year. A survey of local interest indicated a majority of respondents favored the concept with a minority opposed. No contaminants that would preclude the Fish and Wildlife Service from agreeing to accept the property were identified. 21 refs., 3 figs., 5 tabs.

  3. Numerical study of potential heat flux mitigation effects in the TCV snowflake divertor

    NASA Astrophysics Data System (ADS)

    Lunt, T.; Canal, G. P.; Duval, B. P.; Feng, Y.; Labit, B.; McCarthy, P.; Reimerdes, H.; Vijvers, W. A. J.; Wischmeier, M.

    2016-04-01

    We report on EMC3-Eirene simulations of the plasma and neutral particle transport the TCV boundary layer of a series of snowflake (SF) equilibria characterized by the normalized poloidal flux coordinate {ρx2} of the secondary X-point x 2. We refer to a snowflake plus (SF+) for {ρx2}<1 , a snowflake minus (SF-) for {ρx2}>1 and a single-null (SN) for |{ρx2}-1|\\gg 0 . Four effects are identified that have the potential to mitigate the heat flux density at the outer strike point in a LFS SF-where x 2 is located on the low field side of the primary X-point x 1: (1) a scrape-off layer heat flux splitting, (2) an impurity radiation cloud forming at x 2 (3) the increased connection length to the outer target and (4) increased transport between x 1 and x 2. The LFS SF- is thus expected to tolerate a larger power flux {{P}\\text{sep}} over the separatrix than a comparable SN configuration.

  4. Study of cover source mismatch in steganalysis and ways to mitigate its impact

    NASA Astrophysics Data System (ADS)

    Kodovský, Jan; Sedighi, Vahid; Fridrich, Jessica

    2014-02-01

    When a steganalysis detector trained on one cover source is applied to images from a different source, generally the detection error increases due to the mismatch between both sources. In steganography, this situation is recognized as the so-called cover source mismatch (CSM). The drop in detection accuracy depends on many factors, including the properties of both sources, the detector construction, the feature space used to represent the covers, and the steganographic algorithm. Although well recognized as the single most important factor negatively affecting the performance of steganalyzers in practice, the CSM received surprisingly little attention from researchers. One of the reasons for this is the diversity with which the CSM can manifest. On a series of experiments in the spatial and JPEG domains, we refute some of the common misconceptions that the severity of the CSM is tied to the feature dimensionality or their "fragility." The CSM impact on detection appears too difficult to predict due to the effect of complex dependencies among the features. We also investigate ways to mitigate the negative effect of the CSM using simple measures, such as by enlarging the diversity of the training set (training on a mixture of sources) and by employing a bank of detectors trained on multiple different sources and testing on a detector trained on the closest source.

  5. Awareness of occupational hazards and use of safety measures among welders: a cross-sectional study from eastern Nepal

    PubMed Central

    Budhathoki, Shyam Sundar; Singh, Suman Bahadur; Sagtani, Reshu Agrawal; Niraula, Surya Raj; Pokharel, Paras Kumar

    2014-01-01

    Objective The proper use of safety measures by welders is an important way of preventing and/or reducing a variety of health hazards that they are exposed to during welding. There is a lack of knowledge about hazards and personal protective equipments (PPEs) and the use of PPE among the welders in Nepal is limited. We designed a study to assess welders’ awareness of hazards and PPE, and the use of PPE among the welders of eastern Nepal and to find a possible correlation between awareness and use of PPE among them. Materials and methods A cross-sectional study of 300 welders selected by simple random sampling from three districts of eastern Nepal was conducted using a semistructured questionnaire. Data regarding age, education level, duration of employment, awareness of hazards, safety measures and the actual use of safety measures were recorded. Results Overall, 272 (90.7%) welders were aware of at least one hazard of welding and a similar proportion of welders were aware of at least one PPE. However, only 47.7% used one or more types of PPE. Education and duration of employment were significantly associated with the awareness of hazards and of PPE and its use. The welders who reported using PPE during welding were two times more likely to have been aware of hazards (OR=2.52, 95% CI 1.09 to 5.81) and five times more likely to have been aware of PPE compared with the welders who did not report the use of PPE (OR=5.13, 95% CI 2.34 to 11.26). Conclusions The welders using PPE were those who were aware of hazards and PPE. There is a gap between being aware of hazards and PPE (90%) and use of PPE (47%) at work. Further research is needed to identify the underlying factors leading to low utilisation of PPE despite the welders of eastern Nepal being knowledgeable of it. PMID:24889850

  6. Climate change and mitigation.

    PubMed

    Nibleus, Kerstin; Lundin, Rickard

    2010-01-01

    Planet Earth has experienced repeated changes of its climate throughout time. Periods warmer than today as well as much colder, during glacial episodes, have alternated. In our time, rapid population growth with increased demand for natural resources and energy, has made society increasingly vulnerable to environmental changes, both natural and those caused by man; human activity is clearly affecting the radiation balance of the Earth. In the session "Climate Change and Mitigation" the speakers offered four different views on coal and CO2: the basis for life, but also a major hazard with impact on Earth's climate. A common denominator in the presentations was that more than ever science and technology is required. We need not only understand the mechanisms for climate change and climate variability, we also need to identify means to remedy the anthropogenic influence on Earth's climate. PMID:20873680

  7. Hazard Ranking Methodology for Assessing Health Impacts of Unconventional Natural Gas Development and Production: The Maryland Case Study.

    PubMed

    Boyle, Meleah D; Payne-Sturges, Devon C; Sangaramoorthy, Thurka; Wilson, Sacoby; Nachman, Keeve E; Babik, Kelsey; Jenkins, Christian C; Trowell, Joshua; Milton, Donald K; Sapkota, Amir

    2016-01-01

    The recent growth of unconventional natural gas development and production (UNGDP) has outpaced research on the potential health impacts associated with the process. The Maryland Marcellus Shale Public Health Study was conducted to inform the Maryland Marcellus Shale Safe Drilling Initiative Advisory Commission, State legislators and the Governor about potential public health impacts associated with UNGDP so they could make an informed decision that considers the health and well-being of Marylanders. In this paper, we describe an impact assessment and hazard ranking methodology we used to assess the potential public health impacts for eight hazards associated with the UNGDP process. The hazard ranking included seven metrics: 1) presence of vulnerable populations (e.g. children under the age of 5, individuals over the age of 65, surface owners), 2) duration of exposure, 3) frequency of exposure, 4) likelihood of health effects, 5) magnitude/severity of health effects, 6) geographic extent, and 7) effectiveness of setbacks. Overall public health concern was determined by a color-coded ranking system (low, moderately high, and high) that was generated based on the overall sum of the scores for each hazard. We provide three illustrative examples of applying our methodology for air quality and health care infrastructure which were ranked as high concern and for water quality which was ranked moderately high concern. The hazard ranking was a valuable tool that allowed us to systematically evaluate each of the hazards and provide recommendations to minimize the hazards. PMID:26726918

  8. Hazard Ranking Methodology for Assessing Health Impacts of Unconventional Natural Gas Development and Production: The Maryland Case Study

    PubMed Central

    Sangaramoorthy, Thurka; Wilson, Sacoby; Nachman, Keeve E.; Babik, Kelsey; Jenkins, Christian C.; Trowell, Joshua; Milton, Donald K.; Sapkota, Amir

    2016-01-01

    The recent growth of unconventional natural gas development and production (UNGDP) has outpaced research on the potential health impacts associated with the process. The Maryland Marcellus Shale Public Health Study was conducted to inform the Maryland Marcellus Shale Safe Drilling Initiative Advisory Commission, State legislators and the Governor about potential public health impacts associated with UNGDP so they could make an informed decision that considers the health and well-being of Marylanders. In this paper, we describe an impact assessment and hazard ranking methodology we used to assess the potential public health impacts for eight hazards associated with the UNGDP process. The hazard ranking included seven metrics: 1) presence of vulnerable populations (e.g. children under the age of 5, individuals over the age of 65, surface owners), 2) duration of exposure, 3) frequency of exposure, 4) likelihood of health effects, 5) magnitude/severity of health effects, 6) geographic extent, and 7) effectiveness of setbacks. Overall public health concern was determined by a color-coded ranking system (low, moderately high, and high) that was generated based on the overall sum of the scores for each hazard. We provide three illustrative examples of applying our methodology for air quality and health care infrastructure which were ranked as high concern and for water quality which was ranked moderately high concern. The hazard ranking was a valuable tool that allowed us to systematically evaluate each of the hazards and provide recommendations to minimize the hazards. PMID:26726918

  9. Mapping Europe's Seismic Hazard

    NASA Astrophysics Data System (ADS)

    Giardini, Domenico; Wössner, Jochen; Danciu, Laurentiu

    2014-07-01

    From the rift that cuts through the heart of Iceland to the complex tectonic convergence that causes frequent and often deadly earthquakes in Italy, Greece, and Turkey to the volcanic tremors that rattle the Mediterranean, seismic activity is a prevalent and often life-threatening reality across Europe. Any attempt to mitigate the seismic risk faced by society requires an accurate estimate of the seismic hazard.

  10. Vertical Field of View Reference Point Study for Flight Path Control and Hazard Avoidance

    NASA Technical Reports Server (NTRS)

    Comstock, J. Raymond, Jr.; Rudisill, Marianne; Kramer, Lynda J.; Busquets, Anthony M.

    2002-01-01

    Researchers within the eXternal Visibility System (XVS) element of the High-Speed Research (HSR) program developed and evaluated display concepts that will provide the flight crew of the proposed High-Speed Civil Transport (HSCT) with integrated imagery and symbology to permit path control and hazard avoidance functions while maintaining required situation awareness. The challenge of the XVS program is to develop concepts that would permit a no-nose-droop configuration of an HSCT and expanded low visibility HSCT operational capabilities. This study was one of a series of experiments exploring the 'design space' restrictions for physical placement of an XVS display. The primary experimental issues here was 'conformality' of the forward display vertical position with respect to the side window in simulated flight. 'Conformality' refers to the case such that the horizon and objects appear in the same relative positions when viewed through the forward windows or display and the side windows. This study quantified the effects of visual conformality on pilot flight path control and hazard avoidance performance. Here, conformality related to the positioning and relationship of the artificial horizon line and associated symbology presented on the forward display and the horizon and associated ground, horizon, and sky textures as they would appear in the real view through a window presented in the side window display. No significant performance consequences were found for the non-conformal conditions.

  11. Comparative psychometric study of a range of hazardous drinking measures administered online in a youth population.

    TOXLINE Toxicology Bibliographic Information

    Thomas BA; McCambridge J

    2008-07-01

    AIMS: To compare the psychometric performance of a range of existing alcohol measures when data are collected online with young people, and thereby to gain insights into the reliability and validity of this mode of data collection.METHOD: One hundred and sixty-seven U.K. resident young people aged 16-24 who had drunk alcohol within the past week participated in a cross-sectional psychometric study with a test-retest reliability component. Eight hazardous drinking measures were used: the alcohol use disorders identification test (AUDIT) summary instrument and dedicated assessments of consumption (timeline follow-back and diary-format recall of alcohol drunk in the last 7 days), dependence (Leeds dependence questionnaire and severity of dependence scale) and problems (Rutgers alcohol problem index, alcohol problems scale and academic role expectations and alcohol scale).RESULTS: Internal consistency and test-retest correlation statistics were generally satisfactory, providing evidence of reliability. Validation data obtained in principal components analyses, investigation of the correlation matrix and in a multiple regression model of total AUDIT score were also supportive of the online use of these measures. Evidence was weakest for the alcohol problems scale.CONCLUSIONS: A range of hazardous drinking measures exhibit sound psychometric properties when administered online. Further comparative study of the relationships between different measures is needed.

  12. Experimental study of a highway bridge with shape memory alloy restrainers focusing on the mitigation of unseating and pounding

    NASA Astrophysics Data System (ADS)

    Guo, Anxin; Zhao, Qingjie; Li, Hui

    2012-03-01

    This paper presents an experimental study to investigate the performance of shape memory alloy (SMA) restrainers for mitigating the pounding and unseating of highway bridges when subjected to seismic excitations. Mechanical property tests of the SMA wire used in the restrainers are conducted first to understand the pseudo-elastic characteristics of the material. Then, a series of shaking table tests are carried out on a highway bridge model. The structural responses of the highway bridge model equipped with SMA restrainers, installed in the form of deck-deck and deck-pile connections, are analyzed and compared with the uncontrolled structures. The test results of this study indicate that the SMA restrainers are not only effective in preventing unseating but also in suppressing the seismic-induced pounding of the highway bridge model used in this study.

  13. Standardization of Seismic Microzonification and Probabilistic Seismic Hazard Study Considering Site Effect for Metropolitan Areas in the State of Veracruz

    NASA Astrophysics Data System (ADS)

    Torres Morales, G. F.; Leonardo Suárez, M.; Dávalos Sotelo, R.; Castillo Aguilar, S.; Mora González, I.

    2014-12-01

    Preliminary results obtained from the project "Seismic Hazard in the State of Veracruz and Xalapa Conurbation" and "Microzonation of geological and hydrometeorological hazards for conurbations of Orizaba, Veracruz, and major sites located in the lower sub-basins: The Antigua and Jamapa" are presented. These projects were sponsored respectively by the PROMEP program and the Joint Funds CONACyT-Veracruz state government. The study consists of evaluating the probabilistic seismic hazard considering the site effect (SE) in the urban zones of cities of Xalapa and Orizaba; the site effects in this preliminary stage were incorporated through a standard format proposed in studies of microzonation and application in computer systems, which allows to optimize and condense microzonation studies of a city. This study stems from the need to know the seismic hazard (SH) in the State of Veracruz and its major cities, defining SH as the probabilistic description of exceedance of a given level of ground motion intensity (generally designated by the acceleration soil or maximum ordinate in the response spectrum of pseudo-acceleration, PGA and Sa, respectively) as a result of the action of an earthquake in the area of influence for a specified period of time. The evaluation results are presented through maps of seismic hazard exceedance rate curves and uniform hazard spectra (UHS) for different spectral ordinates and return periods, respectively.

  14. A Randomized, Controlled Trial of Home Injury Hazard Reduction: The HOME Injury Study

    PubMed Central

    Phelan, Kieran J.; Khoury, Jane; Xu, Yingying; Liddy, Stacey; Hornung, Richard; Lanphear, Bruce P.

    2013-01-01

    Objective Test the efficacy of an intervention of safety device installation on medically-attended injury in children birth to 3 years of age. Design A nested, prospective, randomized, controlled trial. Setting Indoor environment of housing units of mothers and children. Participants Mothers and their children enrolled in a birth cohort examining the effects of prevalent neurotoxicants on child development, the Home Observation and Measures of the Environment (HOME) Study. Intervention Installation of multiple, passive measures (stairgates, window locks, smoke & carbon monoxide detectors, to reduce exposure to injury hazards present in housing units. Outcome measure Self-reported and medically-attended and modifiable injury. Methods 1263 (14%) prenatal patients were eligible, 413 (33%) agreed to participate and 355 were randomly assigned to the experimental (n=181) or control (n=174) groups. Injury hazards were assessed at home visits by teams of trained research assistants using a validated survey. Safety devices were installed in intervention homes. Intention-to-treat analyses to test efficacy were conducted on: 1) total injury rates and 2) on injuries deemed, a priori, modifiable by the installation of safety devices. Rates of medically attended injuries (phone calls, office or emergency visits) were calculated using generalized estimating equations. Results The mean age of the children at intervention was 6 months. Injury hazards were significantly reduced in the intervention but not in control group homes at one and two years (p<0.004). There was not a significant difference in the rate for all medically-attended injuries in intervention compared with control group children, 14.3 (95%CI 9.7, 21.1) vs. 20.8 (14.4, 29.9) per 100 child-years (p=0.17) respectively; but there was a significant reduction in modifiable medically attended injuries in intervention compared with control group children, 2.3 (1.0, 5.5) vs. 7.7 (4.2, 14.2) per 100 child-years, respectively (p=0.026). Conclusions An intervention to reduce exposure to hazards in the homes of young children led to a 70% reduction in modifiable medically-attended injury. PMID:21464382

  15. Economics of Tsunami Mitigation in the Pacific Northwest

    NASA Astrophysics Data System (ADS)

    Goettel, K. A.; Rizzo, A.; Sigrist, D.; Bernard, E. N.

    2011-12-01

    The death total in a major Cascadia Subduction Zone (CSZ) tsunami may be comparable to the Tohoku tsunami - tens of thousands. To date, tsunami risk reduction activities have been almost exclusively hazard mapping and evacuation planning. Reducing deaths in locations where evacuation to high ground is impossible in the short time between ground shaking and arrival of tsunamis requires measures such as vertical evacuation facilities or engineered pathways to safe ground. Yet, very few, if any, such tsunami mitigation projects have been done. In contrast, many tornado safe room and earthquake mitigation projects driven entirely or in largely by life safety have been done with costs in the billions of dollars. The absence of tsunami mitigation measures results from the belief that tsunamis are too infrequent and the costs too high to justify life safety mitigation measures. A simple analysis based on return periods, death rates, and the geographic distribution of high risk areas for these hazards demonstrates that this belief is incorrect: well-engineered tsunami mitigation projects are more cost-effective with higher benefit-cost ratios than almost all tornado or earthquake mitigation projects. Goldfinger's paleoseismic studies of CSZ turbidites indicate return periods for major CSZ tsunamis of about 250-500 years (USGS Prof. Paper 1661-F in press). Tsunami return periods are comparable to those for major earthquakes at a given location in high seismic areas and are much shorter than those for tornados at any location which range from >4,000 to >16,000 years for >EF2 and >EF4 tornadoes, respectively. The average earthquake death rate in the US over the past 100-years is about 1/year, or about 30/year including the 1906 San Francisco earthquake. The average death rate for tornadoes is about 90/year. For CSZ tsunamis, the estimated average death rate ranges from about 20/year (10,000 every 500 years) to 80/year (20,000 every 250 years). Thus, the long term deaths rates from tsunamis, earthquakes and tornadoes are comparable. High hazard areas for tornadoes and earthquakes cover ~40% and ~15% of the contiguous US, ~1,250,000 and ~500,000 square miles, respectively. In marked contrast, tsunami life safety risk is concentrated in communities with significant populations in areas where evacuation to high ground is impossible: probably <4,000 square miles or <0.1% of the US. The geographic distribution of life safety risk profoundly affects the economics of tsunami life safety mitigation projects. Consider a tsunami life safety project which saves an average of one life per year (500 lives per 500 years). Using FEMA's value of human life (5.8 million), 7% discount rate and a 50-year project useful lifetime, the net present value of avoided deaths is 80 million. Thus, the benefit-cost ratio would be about 16 or about 80 for tsunami mitigation projects which cost 5 million or 1 million, respectively. These rough calculations indicate that tsunami mitigation projects in high risk locations are economically justified. More importantly, these results indicate that national and local priorities for natural hazard mitigation should be reconsidered, with tsunami mitigation given a very high priority.

  16. Concerns About Climate Change Mitigation Projects: Summary of Findings from Case Studies in Brazil, India, Mexico, and South Africa

    SciTech Connect

    Sathaye, Jayant A.; Andrasko, Kenneth; Makundi, Willy; La Rovere, Emilio Lebre; Ravinandranath, N.H.; Melli, Anandi; Rangachari, Anita; Amaz, Mireya; Gay, Carlos; Friedmann, Rafael; Goldberg, Beth; van Horen, Clive; Simmonds, Gillina; Parker, Gretchen

    1998-11-01

    The concept of joint implementation as a way to implement climate change mitigation projects in another country has been controversial ever since its inception. Developing countries have raised numerous issues at the project-specific technical level, and broader concerns having to do with equity and burden sharing. This paper summarizes the findings of studies for Brazil, India, Mexico and South Africa, four countries that have large greenhouse gas emissions and are heavily engaged in the debate on climate change projects under the Kyoto Protocol. The studies examine potential or current projects/programs to determine whether eight technical concerns about joint implementation can be adequately addressed. They conclude that about half the concerns were minor or well managed by project developers, but concerns about additionality of funds, host country institutions and guarantees of performance (including the issues of baselines and possible leakage) need much more effort to be adequately addressed. All the papers agree on the need to develop institutional arrangements for approving and monitoring such projects in each of the countries represented. The case studies illustrate that these projects have the potential to bring new technology, investment, employment and ancillary socioeconomic and environmental benefits to developing countries. These benefits are consistent with the goal of sustainable development in the four study countries. At a policy level, the studies' authors note that in their view, the Annex I countries should consider limits on the use of jointly implemented projects as a way to get credits against their own emissions at home, and stress the importance of industrialized countries developing new technologies that will benefit all countries. The authors also observe that if all countries accepted caps on their emissions (with a longer time period allowed for developing countries to do so) project-based GHG mitigation would be significantly facilitated by the improved private investment climate.

  17. Hydrogen Hazards Assessment Protocol (HHAP): Approach and Methodology

    NASA Technical Reports Server (NTRS)

    Woods, Stephen

    2009-01-01

    This viewgraph presentation reviews the approach and methodology to develop a assessment protocol for hydrogen hazards. Included in the presentation are the reasons to perform hazards assessment, the types of hazard assessments that exist, an analysis of hydrogen hazards, specific information about the Hydrogen Hazards Assessment Protocol (HHAP). The assessment is specifically tailored for hydrogen behavior. The end product of the assesment is a compilation of hazard, mitigations and associated factors to facilitate decision making and achieve the best practice.

  18. Application of a Data Mining Model and It's Cross Application for Landslide Hazard Analysis: a Case Study in Malaysia

    NASA Astrophysics Data System (ADS)

    Pradhan, Biswajeet; Lee, Saro; Shattri, Mansor

    This paper deals with landslide hazard analysis and cross-application using Geographic Information System (GIS) and remote sensing data for Cameron Highland, Penang Island and Selangor in Malaysia. The aim of this study was to cross-apply and verify a spatial probabilistic model for landslide hazard analysis. Landslide locations were identified in the study area from interpretation of aerial photographs and field surveys. Topographical/geological data and satellite images were collected and processed using GIS and image processing tools. There are ten landslide inducing parameters which are considered for the landslide hazard analysis. These parameters are topographic slope, aspect, curvature and distance from drainage, all derived from the topographic database; geology and distance from lineament, derived from the geologic database; landuse from Landsat satellite images; soil from the soil database; precipitation amount, derived from the rainfall database; and the vegetation index value from SPOT satellite images. These factors were analyzed using an artificial neural network model to generate the landslide hazard map. Each factor's weight was determined by the back-propagation training method. Then the landslide hazard indices were calculated using the trained back-propagation weights, and finally the landslide hazard map was generated using GIS tools. Landslide hazard maps were drawn for these three areas using artificial neural network model derived not only from the data for that area but also using the weight for each parameters, one of the statistical model, calculated from each of the other two areas (nine maps in all) as a cross-check of the validity of the method. For verification, the results of the analyses were compared, in each study area, with actual landslide locations. The verification results showed sufficient agreement between the presumptive hazard map and the existing data on landslide areas.

  19. Photobiomodulation Mitigates Diabetes-Induced Retinopathy by Direct and Indirect Mechanisms: Evidence from Intervention Studies in Pigmented Mice

    PubMed Central

    Liu, Haitao; Patel, Shyam; Roberts, Robin; Berkowitz, Bruce A.; Kern, Timothy S.

    2015-01-01

    Objective Daily application of far-red light from the onset of diabetes mitigated diabetes-induced abnormalities in retinas of albino rats. Here, we test the hypothesis that photobiomodulation (PBM) is effective in diabetic, pigmented mice, even when delayed until weeks after onset of diabetes. Direct and indirect effects of PBM on the retina also were studied. Methods Diabetes was induced in C57Bl/6J mice using streptozotocin. Some diabetics were exposed to PBM therapy (4 min/day; 670 nm) daily. In one study, mice were diabetic for 4 weeks before initiation of PBM for an additional 10 weeks. Retinal oxidative stress, inflammation, and retinal function were measured. In some mice, heads were covered with a lead shield during PBM to prevent direct illumination of the eye, or animals were treated with an inhibitor of heme oxygenase-1. In a second study, PBM was initiated immediately after onset of diabetes, and administered daily for 2 months. These mice were examined using manganese-enhanced MRI to assess effects of PBM on transretinal calcium channel function in vivo. Results PBM intervention improved diabetes-induced changes in superoxide generation, leukostasis, expression of ICAM-1, and visual performance. PBM acted in part remotely from the retina because the beneficial effects were achieved even with the head shielded from the light therapy, and because leukocyte-mediated cytotoxicity of retinal endothelial cells was less in diabetics treated with PBM. SnPP+PBM significantly reduced iNOS expression compared to PBM alone, but significantly exacerbated leukostasis. In study 2, PBM largely mitigated diabetes-induced retinal calcium channel dysfunction in all retinal layers. Conclusions PBM induces retinal protection against abnormalities induced by diabetes in pigmented animals, and even as an intervention. Beneficial effects on the retina likely are mediated by both direct and indirect mechanisms. PBM is a novel non-pharmacologic treatment strategy to inhibit early changes of diabetic retinopathy. PMID:26426815

  20. The Study on Ecological Treatment of Saline Lands to Mitigate the Effects of Climate Change

    NASA Astrophysics Data System (ADS)

    Xie, Jiancang; Zhu, Jiwei; Wang, Tao

    2010-05-01

    The soil water and salt movement is influenced strongly by the frequent droughts, floods and climate change. Additionally, as continued population growth, large-scale reclaiming of arable land and long-term unreasonable irrigation, saline land is increasing at the rate of 1,000,000~15,000,000 mu each year all over the world. In the tradition management, " drainage as the main " measure has series of problem, which appears greater project, more occupation of land, harmful for water saving and downstream pollution. To response the global climate change, it has become the common understanding, which promote energy-saving and environment protection, reflect the current model, explore the ecological management model. In this paper, we take severe saline land—Lubotan in Shaanxi Province as an example. Through nearly 10 years harnessing practice and observing to meteorology, hydrology, soil indicators of climate, we analyze the influence of climate change to soil salinity movement at different seasons and years, then put forward and apply a new model of saline land harnessing to mitigate the Effects of Climate Change and self-rehabilitate entironment. This model will be changed "drainage" to "storage", through the establishment engineering of " storage as the main ", taken comprehensive measures of " project - biology - agriculture ", we are changing saline land into arable land. Adapted to natural changes of climate, rainfall, irrigation backwater, groundwater level, reduced human intervention to achieve system dynamic equilibrium. During the ten years, the salt of plough horizon has reduced from 0.74% to 0.20%, organic matter has increased from 0.7% to 0.92%, various indicators of soil is begining to go better. At the same time, reduced the water for irrigation, drainage pollution and investment costs. Through the model, reformed severe saline land 18,900 mu, increased new cultivated land 16,500 mu, comprehensive efficient significant, ensured the coordinated development of " water - biology - environment " in the region. Model application and promotion can treat saline-alkali and add cultivated land effectively, at the same time, ease the pressure for urban construction land, promote energy saving and emission reducting and ecological restoration, so we can construct a resource-saving and environment-friendly society, realize sustainable development of the population, resources and environment.

  1. New approach to inventorying army hazardous materials. A study done for the Eighth U. S. Army, Korea. Volume 2. Hazardous-material data. Final report

    SciTech Connect

    Kim, B.J.; Gee, C.S.; Lee, Y.H.; Mikulich, L.R.; Grafmyer, D.E.

    1991-09-01

    The goal of the Army hazardous waste minimization program is to achieve a 50 percent reduction of the hazardous waste generated by calendar year 1992 (CY92), as compared to baseline CY85. A first step in achieving effective hazardous waste management is to conduct a thorough hazardous material inventory. Volume I describes a method created to inventory hazardous material by collecting supply data from Logistics Control Activity (LCA) at the Presidio, San Francisco, CA, an comparing this data with the Material Safety Data Sheet (MSDS) in the Hazardous Material Information System (HMIS). Volume H lists hazardous material data collected for the Eighth U.S. Army (EUSA), Korea. Common elements between the two data bases were compiled, analyzed, and validated. It was found that the intersection of the two data bases created a composite list that substantially reduced the number of nonhazardous wastes included in the individual lists. This method may also be applied to supply data from other Army installations.

  2. Economic optimization of natural hazard protection - conceptual study of existing approaches

    NASA Astrophysics Data System (ADS)

    Spackova, Olga; Straub, Daniel

    2013-04-01

    Risk-based planning of protection measures against natural hazards has become a common practice in many countries. The selection procedure aims at identifying an economically efficient strategy with regard to the estimated costs and risk (i.e. expected damage). A correct setting of the evaluation methodology and decision criteria should ensure an optimal selection of the portfolio of risk protection measures under a limited state budget. To demonstrate the efficiency of investments, indicators such as Benefit-Cost Ratio (BCR), Marginal Costs (MC) or Net Present Value (NPV) are commonly used. However, the methodologies for efficiency evaluation differ amongst different countries and different hazard types (floods, earthquakes etc.). Additionally, several inconsistencies can be found in the applications of the indicators in practice. This is likely to lead to a suboptimal selection of the protection strategies. This study provides a general formulation for optimization of the natural hazard protection measures from a socio-economic perspective. It assumes that all costs and risks can be expressed in monetary values. The study regards the problem as a discrete hierarchical optimization, where the state level sets the criteria and constraints, while the actual optimization is made on the regional level (towns, catchments) when designing particular protection measures and selecting the optimal protection level. The study shows that in case of an unlimited budget, the task is quite trivial, as it is sufficient to optimize the protection measures in individual regions independently (by minimizing the sum of risk and cost). However, if the budget is limited, the need for an optimal allocation of resources amongst the regions arises. To ensure this, minimum values of BCR or MC can be required by the state, which must be achieved in each region. The study investigates the meaning of these indicators in the optimization task at the conceptual level and compares their suitability. To illustrate the theoretical findings, the indicators are tested on a hypothetical example of five regions with different risk levels. Last but not least, political and societal aspects and limitations in the use of the risk-based optimization framework are discussed.

  3. Progress in NTHMP Hazard Assessment

    USGS Publications Warehouse

    Gonzalez, F.I.; Titov, V.V.; Mofjeld, H.O.; Venturato, A.J.; Simmons, R.S.; Hansen, R.; Combellick, R.; Eisner, R.K.; Hoirup, D.F.; Yanagi, B.S.; Yong, S.; Darienzo, M.; Priest, G.R.; Crawford, G.L.; Walsh, T.J.

    2005-01-01

    The Hazard Assessment component of the U.S. National Tsunami Hazard Mitigation Program has completed 22 modeling efforts covering 113 coastal communities with an estimated population of 1.2 million residents that are at risk. Twenty-three evacuation maps have also been completed. Important improvements in organizational structure have been made with the addition of two State geotechnical agency representatives to Steering Group membership, and progress has been made on other improvements suggested by program reviewers. ?? Springer 2005.

  4. Cost-benefit analysis of alternative LNG vapor-mitigation measures. Topical report, September 14, 1987-January 15, 1991

    SciTech Connect

    Atallah, S.

    1992-06-25

    A generalized methodology is presented for comparing the costs and safety benefits of alternative hazard mitigation measures for a large LNG vapor release. The procedure involves the quantification of the risk to the public before and after the application of LNG vapor mitigation measures. In the study, risk was defined as the product of the annual accident frequency, estimated from a fault tree analysis, and the severity of the accident. Severity was measured in terms of the number of people who may be exposed to 2.5% or higher concentration. The ratios of the annual costs of the various mitigation measures to their safety benefits (as determined by the differences between the risk before and after mitigation measure implementation), were then used to identify the most cost-effective approaches to vapor cloud mitigation.

  5. Case study: Mapping tsunami hazards associated with debris flow into a reservoir

    USGS Publications Warehouse

    Walder, J.S.; Watts, P.; Waythomas, C.F.

    2006-01-01

    Debris-flow generated impulse waves (tsunamis) pose hazards in lakes, especially those used for hydropower or recreation. We describe a method for assessing tsunami-related hazards for the case in which inundation by coherent water waves, rather than chaotic splashing, is of primary concern. The method involves an experimentally based initial condition (tsunami source) and a Boussinesq model for tsunami propagation and inundation. Model results are used to create hazard maps that offer guidance for emergency planners and responders. An example application explores tsunami hazards associated with potential debris flows entering Baker Lake, a reservoir on the flanks of the Mount Baker volcano in the northwestern United States. ?? 2006 ASCE.

  6. Application of multi-agent coordination methods to the design of space debris mitigation tours

    NASA Astrophysics Data System (ADS)

    Stuart, Jeffrey; Howell, Kathleen; Wilson, Roby

    2016-04-01

    The growth in the number of defunct and fragmented objects near to the Earth poses a growing hazard to launch operations as well as existing on-orbit assets. Numerous studies have demonstrated the positive impact of active debris mitigation campaigns upon the growth of debris populations, but comparatively fewer investigations incorporate specific mission scenarios. Furthermore, while many active mitigation methods have been proposed, certain classes of debris objects are amenable to mitigation campaigns employing chaser spacecraft with existing chemical and low-thrust propulsive technologies. This investigation incorporates an ant colony optimization routing algorithm and multi-agent coordination via auctions into a debris mitigation tour scheme suitable for preliminary mission design and analysis as well as spacecraft flight operations.

  7. Model-Predictive Cascade Mitigation in Electric Power Systems With Storage and Renewables-Part II: Case-Study

    SciTech Connect

    Almassalkhi, MR; Hiskens, IA

    2015-01-01

    The novel cascade-mitigation scheme developed in Part I of this paper is implemented within a receding-horizon model predictive control (MPC) scheme with a linear controller model. This present paper illustrates the MPC strategy with a case-study that is based on the IEEE RTS-96 network, though with energy storage and renewable generation added. It is shown that the MPC strategy alleviates temperature overloads on transmission lines by rescheduling generation, energy storage, and other network elements, while taking into account ramp-rate limits and network limitations. Resilient performance is achieved despite the use of a simplified linear controller model. The MPC scheme is compared against a base-case that seeks to emulate human operator behavior.

  8. Status of volcanic hazard studies for the Nevada Nuclear Waste Storage Investigations. Volume II

    SciTech Connect

    Crowe, B.M.; Wohletz, K.H.; Vaniman, D.T.; Gladney, E.; Bower, N.

    1986-01-01

    Volcanic hazard investigations during FY 1984 focused on five topics: the emplacement mechanism of shallow basalt intrusions, geochemical trends through time for volcanic fields of the Death Valley-Pancake Range volcanic zone, the possibility of bimodal basalt-rhyolite volcanism, the age and process of enrichment for incompatible elements in young basalts of the Nevada Test Site (NTS) region, and the possibility of hydrovolcanic activity. The stress regime of Yucca Mountain may favor formation of shallow basalt intrusions. However, combined field and drill-hole studies suggest shallow basalt intrusions are rare in the geologic record of the southern Great Basin. The geochemical patterns of basaltic volcanism through time in the NTS region provide no evidence for evolution toward a large-volume volcanic field or increases in future rates of volcanism. Existing data are consistent with a declining volcanic system comparable to the late stages of the southern Death Valley volcanic field. The hazards of bimodal volcanism in this area are judged to be low. The source of a 6-Myr pumice discovered in alluvial deposits of Crater Flat has not been found. Geochemical studies show that the enrichment of trace elements in the younger rift basalts must be related to an enrichment of their mantle source rocks. This geochemical enrichment event, which may have been metasomatic alteration, predates the basalts of the silicic episode and is, therefore, not a young event. Studies of crater dimensions of hydrovolcanic landforms indicate that the worst case scenario (exhumation of a repository at Yucca Mountain by hydrovolcanic explosions) is unlikely. Theoretical models of melt-water vapor explosions, particularly the thermal detonation model, suggest hydrovolcanic explosion are possible at Yucca Mountain. 80 refs., 21 figs., 5 tabs.

  9. Eco-efficiency for greenhouse gas emissions mitigation of municipal solid waste management: A case study of Tianjin, China

    SciTech Connect

    Zhao Wei; Huppes, Gjalt; Voet, Ester van der

    2011-06-15

    The issue of municipal solid waste (MSW) management has been highlighted in China due to the continually increasing MSW volumes being generated and the limited capacity of waste treatment facilities. This article presents a quantitative eco-efficiency (E/E) analysis on MSW management in terms of greenhouse gas (GHG) mitigation. A methodology for E/E analysis has been proposed, with an emphasis on the consistent integration of life cycle assessment (LCA) and life cycle costing (LCC). The environmental and economic impacts derived from LCA and LCC have been normalized and defined as a quantitative E/E indicator. The proposed method was applied in a case study of Tianjin, China. The study assessed the current MSW management system, as well as a set of alternative scenarios, to investigate trade-offs between economy and GHG emissions mitigation. Additionally, contribution analysis was conducted on both LCA and LCC to identify key issues driving environmental and economic impacts. The results show that the current Tianjin's MSW management system emits the highest GHG and costs the least, whereas the situation reverses in the integrated scenario. The key issues identified by the contribution analysis show no linear relationship between the global warming impact and the cost impact in MSW management system. The landfill gas utilization scenario is indicated as a potential optimum scenario by the proposed E/E analysis, given the characteristics of MSW, technology levels, and chosen methodologies. The E/E analysis provides an attractive direction towards sustainable waste management, though some questions with respect to uncertainty need to be discussed further.

  10. Eco-efficiency for greenhouse gas emissions mitigation of municipal solid waste management: a case study of Tianjin, China.

    PubMed

    Zhao, Wei; Huppes, Gjalt; van der Voet, Ester

    2011-06-01

    The issue of municipal solid waste (MSW) management has been highlighted in China due to the continually increasing MSW volumes being generated and the limited capacity of waste treatment facilities. This article presents a quantitative eco-efficiency (E/E) analysis on MSW management in terms of greenhouse gas (GHG) mitigation. A methodology for E/E analysis has been proposed, with an emphasis on the consistent integration of life cycle assessment (LCA) and life cycle costing (LCC). The environmental and economic impacts derived from LCA and LCC have been normalized and defined as a quantitative E/E indicator. The proposed method was applied in a case study of Tianjin, China. The study assessed the current MSW management system, as well as a set of alternative scenarios, to investigate trade-offs between economy and GHG emissions mitigation. Additionally, contribution analysis was conducted on both LCA and LCC to identify key issues driving environmental and economic impacts. The results show that the current Tianjin's MSW management system emits the highest GHG and costs the least, whereas the situation reverses in the integrated scenario. The key issues identified by the contribution analysis show no linear relationship between the global warming impact and the cost impact in MSW management system. The landfill gas utilization scenario is indicated as a potential optimum scenario by the proposed E/E analysis, given the characteristics of MSW, technology levels, and chosen methodologies. The E/E analysis provides an attractive direction towards sustainable waste management, though some questions with respect to uncertainty need to be discussed further. PMID:21316937

  11. Distinguishing Realistic Military Blasts from Firecrackers in Mitigation Studies of Blast Induced Traumatic Brain Injury

    SciTech Connect

    Moss, W C; King, M J; Blackman, E G

    2011-01-21

    In their Contributed Article, Nyein et al. (1,2) present numerical simulations of blast waves interacting with a helmeted head and conclude that a face shield may significantly mitigate blast induced traumatic brain injury (TBI). A face shield may indeed be important for future military helmets, but the authors derive their conclusions from a much smaller explosion than typically experienced on the battlefield. The blast from the 3.16 gm TNT charge of (1) has the following approximate peak overpressures, positive phase durations, and incident impulses (3): 10 atm, 0.25 ms, and 3.9 psi-ms at the front of the head (14 cm from charge), and 1.4 atm, 0.32 ms, and 1.7 psi-ms at the back of a typical 20 cm head (34 cm from charge). The peak pressure of the wave decreases by a factor of 7 as it traverses the head. The blast conditions are at the threshold for injury at the front of the head, but well below threshold at the back of the head (4). The blast traverses the head in 0.3 ms, roughly equal to the positive phase duration of the blast. Therefore, when the blast reaches the back of the head, near ambient conditions exist at the front. Because the headform is so close to the charge, it experiences a wave with significant curvature. By contrast, a realistic blast from a 2.2 kg TNT charge ({approx} an uncased 105 mm artillery round) is fatal at an overpressure of 10 atm (4). For an injury level (4) similar to (1), a 2.2 kg charge has the following approximate peak overpressures, positive phase durations, and incident impulses (3): 2.1 atm, 2.3 ms, and 18 psi-ms at the front of the head (250 cm from charge), and 1.8 atm, 2.5 ms, and 16.8 psi-ms at the back of the head (270 cm from charge). The peak pressure decreases by only a factor of 1.2 as it traverses the head. Because the 0.36 ms traversal time is much smaller than the positive phase duration, pressures on the head become relatively uniform when the blast reaches the back of the head. The larger standoff implies that the headform locally experiences a nearly planar blast wave. Also, the positive phase durations and blast impulses are much larger than those of (1). Consequently, the blast model used in (1) is spatially and temporally very different from a military blast. It would be useful to repeat the calculations using military blast parameters. Finally, (1) overlooks a significant part of (5). On page 1 and on page 3, (1) states that (5) did not consider helmet pads. But pages pages 3 and 4 of (5) present simulations of blast wave propagation across an ACH helmeted head form with and without pads. (5) states that when the pads are present, the 'underwash' of air under the helmet is blocked when compared to the case without. (1) reaches this same conclusion, but reports it as a new result rather than a confirmation of that already found in (5).

  12. Balancing Mitigation Against Impact: A Case Study From the 2005 Chicxulub Seismic Survey

    NASA Astrophysics Data System (ADS)

    Barton, P.; Diebold, J.; Gulick, S.

    2006-05-01

    In early 2005 the R/V Maurice Ewing conducted a large-scale deep seismic reflection-refraction survey offshore Yucatan, Mexico, to investigate the internal structure of the Chicxulub impact crater, centred on the coastline. Shots from a tuned 20 airgun, 6970 cu in array were recorded on a 6 km streamer and 25 ocean bottom seismometers (OBS). The water is exceptionally shallow to large distances offshore, reaching 30 m about 60 km from the land, making it unattractive to the larger marine mammals, although there are small populations of Atlantic and spotted dolphins living in the area, as well as several turtle breeding and feeding grounds on the Yucatan peninsula. In the light of calibrated tests of the Ewing's array (Tolstoy et al., 2004, Geophysical Research Letters 31, L14310), a 180 dB safety radius of 3.5 km around the gun array was adopted. An energetic campaign was organised by environmentalists opposing the work. In addition to the usual precautions of visual and listening watches by independent observers, gradual ramp-ups of the gun arrays, and power-downs or shut-downs for sightings, constraints were also placed to limit the survey to daylight hours and weather conditions not exceeding Beaufort 4. The operations were subject to several on-board inspections by the Mexican environmental authorities, causing logistical difficulties. Although less than 1% of the total working time was lost to shutdowns due to actual observation of dolphins or turtles, approximately 60% of the cruise time was taken up in precautionary inactivity. A diver in the water 3.5 km from the profiling ship reported that the sound in the water was barely noticeable, leading us to examine the actual sound levels recorded by both the 6 km streamer and the OBS hydrophones. The datasets are highly self-consistent, and give the same pattern of decay with distance past about 2 km offset, but with different overall levels: this may be due to geometry or calibration differences under investigation. Both datasets indicate significantly lower levels than reported by Tolstoy et al. (2004). There was no evidence of environmental damage created by this survey. It can be concluded that the mitigation measures were extremely successful, but there is also a concern that the overhead cost of the environmental protection made this one of the most costly academic surveys ever undertaken, and that not all of this protection was necessary. In particular, the predicted 180 dB safety radius appeared to be overly conservative, even though based on calibrated measurements in very similar physical circumstances, and we suggest that these differences were a result of local seismic velocity structure in the water column and/or shallow seabed, which resulted in different partitioning of the energy. These results suggest that real time monitoring of hydrophone array data may provide a method of determining the safety radius dynamically, in response to local conditions.

  13. Property-close source separation of hazardous waste and waste electrical and electronic equipment--a Swedish case study.

    PubMed

    Bernstad, Anna; la Cour Jansen, Jes; Aspegren, Henrik

    2011-03-01

    Through an agreement with EEE producers, Swedish municipalities are responsible for collection of hazardous waste and waste electrical and electronic equipment (WEEE). In most Swedish municipalities, collection of these waste fractions is concentrated to waste recycling centres where households can source-separate and deposit hazardous waste and WEEE free of charge. However, the centres are often located on the outskirts of city centres and cars are needed in order to use the facilities in most cases. A full-scale experiment was performed in a residential area in southern Sweden to evaluate effects of a system for property-close source separation of hazardous waste and WEEE. After the system was introduced, results show a clear reduction in the amount of hazardous waste and WEEE disposed of incorrectly amongst residual waste or dry recyclables. The systems resulted in a source separation ratio of 70 wt% for hazardous waste and 76 wt% in the case of WEEE. Results show that households in the study area were willing to increase source separation of hazardous waste and WEEE when accessibility was improved and that this and similar collection systems can play an important role in building up increasingly sustainable solid waste management systems. PMID:20952178

  14. GIS-based pollution hazard mapping and assessment framework of shallow lakes: southeastern Pampean lakes (Argentina) as a case study.

    PubMed

    Romanelli, A; Esquius, K S; Massone, H E; Escalante, A H

    2013-08-01

    The assessment of water vulnerability and pollution hazard traditionally places particular emphasis on the study on groundwaters more than on surface waters. Consequently, a GIS-based Lake Pollution Hazard Index (LPHI) was proposed for assessing and mapping the potential pollution hazard for shallow lakes due to the interaction between the Potential Pollutant Load and the Lake Vulnerability. It includes easily measurable and commonly used parameters: land cover, terrain slope and direction, and soil media. Three shallow lake ecosystems of the southeastern Pampa Plain (Argentina) were chosen to test the usefulness and applicability of this suggested index. Moreover, anthropogenic and natural medium influence on biophysical parameters in these three ecosystems was examined. The evaluation of the LPHI map shows for La Brava and Los Padres lakes the highest pollution hazard (?30% with high to very high category) while Nahuel Ruc Lake seems to be the less hazardous water body (just 9.33% with high LPHI). The increase in LPHI value is attributed to a different loading of pollutants governed by land cover category and/or the exposure to high slopes and influence of slope direction. Dissolved oxygen and biochemical oxygen demand values indicate a moderately polluted and eutrophized condition of shallow lake waters, mainly related to moderate agricultural activities and/or cattle production. Obtained information by means of LPHI calculation result useful to perform a local diagnosis of the potential pollution hazard to a freshwater ecosystem in order to implement basic guidelines to improve lake sustainability. PMID:23355019

  15. Property-close source separation of hazardous waste and waste electrical and electronic equipment - A Swedish case study

    SciTech Connect

    Bernstad, Anna; Cour Jansen, Jes la; Aspegren, Henrik

    2011-03-15

    Through an agreement with EEE producers, Swedish municipalities are responsible for collection of hazardous waste and waste electrical and electronic equipment (WEEE). In most Swedish municipalities, collection of these waste fractions is concentrated to waste recycling centres where households can source-separate and deposit hazardous waste and WEEE free of charge. However, the centres are often located on the outskirts of city centres and cars are needed in order to use the facilities in most cases. A full-scale experiment was performed in a residential area in southern Sweden to evaluate effects of a system for property-close source separation of hazardous waste and WEEE. After the system was introduced, results show a clear reduction in the amount of hazardous waste and WEEE disposed of incorrectly amongst residual waste or dry recyclables. The systems resulted in a source separation ratio of 70 wt% for hazardous waste and 76 wt% in the case of WEEE. Results show that households in the study area were willing to increase source separation of hazardous waste and WEEE when accessibility was improved and that this and similar collection systems can play an important role in building up increasingly sustainable solid waste management systems.

  16. Classification of residential areas according to physical vulnerability to natural hazards: a case study of Çanakkale, Turkey.

    PubMed

    Başaran-Uysal, Arzu; Sezen, Funda; Ozden, Süha; Karaca, Oznur

    2014-01-01

    The selection of new settlement areas and the construction of safe buildings, as well as rendering built-up areas safe, are of great importance in mitigating the damage caused by natural disasters. Most cities in Turkey are unprepared for natural hazards. In this paper, Çanakkale, located in a first-degree seismic zone and sprawled around the Sartçay Delta, is examined in terms of its physical vulnerability to natural hazards. Residential areas are analysed using GIS (geographic information system) and remote-sensing technologies in relation to selected indicators. Residential areas of the city are divided into zones according to an evaluation of geological characteristics, the built-up area's features, and urban infrastructure, and four risk zones are determined. The results of the analysis show that the areas of the city suitable for housing are very limited. In addition, the historical centre and the housing areas near Sartçay stream are shown to be most problematic in terms of natural disasters and sustainability. PMID:24325245

  17. UAV-based Natural Hazard Management in High-Alpine Terrain - Case Studies from Austria

    NASA Astrophysics Data System (ADS)

    Sotier, Bernadette; Adams, Marc; Lechner, Veronika

    2015-04-01

    Unmanned Aerial Vehicles (UAV) have become a standard tool for geodata collection, as they allow conducting on-demand mapping missions in a flexible, cost-effective manner at an unprecedented level of detail. Easy-to-use, high-performance image matching software make it possible to process the collected aerial images to orthophotos and 3D-terrain models. Such up-to-date geodata have proven to be an important asset in natural hazard management: Processes like debris flows, avalanches, landslides, fluvial erosion and rock-fall can be detected and quantified; damages can be documented and evaluated. In the Alps, these processes mostly originate in remote areas, which are difficult and hazardous to access, thus presenting a challenging task for RPAS data collection. In particular, the problems include finding suitable landing and piloting-places, dealing with bad or no GPS-signals and the installation of ground control points (GCP) for georeferencing. At the BFW, RPAS have been used since 2012 to aid natural hazard management of various processes, of which three case studies are presented below. The first case study deals with the results from an attempt to employ UAV-based multi-spectral remote sensing to monitor the state of natural hazard protection forests. Images in the visible and near-infrared (NIR) band were collected using modified low-cost cameras, combined with different optical filters. Several UAV-flights were performed in the 72 ha large study site in 2014, which lies in the Wattental, Tyrol (Austria) between 1700 and 2050 m a.s.l., where the main tree species are stone pine and mountain pine. The matched aerial images were analysed using different UAV-specific vitality indices, evaluating both single- and dual-camera UAV-missions. To calculate the mass balance of a debris flow in the Tyrolean Halltal (Austria), an RPAS flight was conducted in autumn 2012. The extreme alpine environment was challenging for both the mission and the evaluation of the aerial images: In the upper part of the steep channel there was no GPS-signal available, because of the high surrounding rock faces, the landing area consisted of coarse gravel. Therefore, only a manual flight with a high risk of damage was possible. With the calculated RPAS-based digital surface model, created from the 600 aerial images, a chronologically resolved back-calculation of the last big debris-flow event could be performed. In a third case study, aerial images from RPAS were used for a similar investigation in Virgen, Eastern Tyrol (Austria). A debris flow in the Firschnitzbach catchment caused severe damages to the village of Virgen in August 2012. An RPAS-flight was performed, in order to refine the estimated displaced debris mass for assessment purposes. The upper catchment of the Firschnitzbach is situated above the timberline and covers an area of 6.5 ha at a height difference of 1000 m. Therefore, three separate flights were necessary to achieve a sufficient image overlap. The central part of the Firschnitzbach consists of a steep and partly dense forested canyon / gorge, so there was no flight possible for this section up to now. The evaluation of the surface model from the images showed, that only half of the estimated debris mass came from the upper part of the catchment.

  18. CMMAD Usability Case Study in Support of Countermine and Hazard Sensing

    SciTech Connect

    Victor G. Walker; David I. Gertman

    2010-04-01

    During field trials, operator usability data were collected in support of lane clearing missions and hazard sensing for two robot platforms with Robot Intelligence Kernel (RIK) software and sensor scanning payloads onboard. The tests featured autonomous and shared robot autonomy levels where tasking of the robot used a graphical interface featuring mine location and sensor readings. The goal of this work was to provide insights that could be used to further technology development. The efficacy of countermine systems in terms of mobility, search, path planning, detection, and localization were assessed. Findings from objective and subjective operator interaction measures are reviewed along with commentary from soldiers having taken part in the study who strongly endorse the system.

  19. Experimental study on thermal hazard of tributyl phosphate-nitric acid mixtures using micro calorimeter technique.

    PubMed

    Sun, Qi; Jiang, Lin; Gong, Liang; Sun, Jin-Hua

    2016-08-15

    During PUREX spent nuclear fuel reprocessing, mixture of tributyl phosphate (TBP) and hydrocarbon solvent are employed as organic solvent to extract uranium in consideration of radiation contaminated safety and resource recycling, meanwhile nitric acid is utilized to dissolve the spent fuel into small pieces. However, once TBP contacts with nitric acid or nitrates above 130°C, a heavy "red oil" layer would occur accompanied by thermal runaway reactions, even caused several nuclear safety accident. Considering nitric acid volatility and weak exothermic detection, C80micro calorimeter technique was used in this study to investigate thermal decomposition of TBP mixed with nitric acid. Results show that the concentration of nitric acid greatly influences thermal hazard of the system by direct reactions. Even with a low heating rate, if the concentration of nitric acid increases due to evaporation of water or improper operations, thermal runaway in the closed system could start at a low temperature. PMID:27136728

  20. Hazardous alcohol consumption among university students in Ireland: a cross-sectional study

    PubMed Central

    Davoren, Martin P; Shiely, Frances; Byrne, Michael; Perry, Ivan J

    2015-01-01

    Objective There is considerable evidence of a cultural shift towards heavier alcohol consumption among university students, especially women. The aim of this study is to investigate the prevalence and correlates of hazardous alcohol consumption (HAC) among university students with particular reference to gender and to compare different modes of data collection in this population. Setting A large Irish university. Design A cross-sectional study using a classroom distributed paper questionnaire. Participants A total of 2275 undergraduates completed the classroom survey, 84% of those in class and 51% of those registered for the relevant module. Main outcome measures Prevalence of HAC measured using the Alcohol Use Disorders Identification Test for Consumption (AUDIT-C) and the proportion of university students reporting 1 or more of 13 adverse consequences linked to HAC. HAC was defined as an AUDIT-C score of 6 or more among males and 5 or more among females. Results In the classroom sample, 66.4% (95% CI 64.4 to 68.3) reported HAC (65.2% men and 67.3% women). In women, 57.4% met HAC thresholds for men. Similar patterns of adverse consequences were observed among men and women. Students with a hazardous consumption pattern were more likely to report smoking, illicit drug use and being sexually active. Conclusions The findings highlight the high prevalence of HAC among university students relative to the general population. Public policy measures require review to tackle the short-term and long-term risks to physical, mental and social health and well-being. PMID:25633284

  1. AMERICAN HEALTHY HOMES SURVEY: A NATIONAL STUDY OF RESIDENTIAL RELATED HAZARDS

    EPA Science Inventory

    The US Environmental Protection Agency's (EPA) National Exposure Research Laboratory (NERL) and the US Department of Housing and Urban Development's (HUD) Office of Healthy Homes and Lead Hazard Control conducted a national survey of housing related hazards in US residences. The...

  2. Case studies. [hazardous effects of early medical use of X-rays

    NASA Technical Reports Server (NTRS)

    1975-01-01

    The characteristics of the technology assessment process which were manifested in response to the hazardous effects of early medical uses of X-rays are considered. Other topics discussed include: controlling the potential hazards of government-sponsored technology, genetic technology, community level impacts of expanded underground coal mining, and an integrated strategy for aircraft/airport noise abatement.

  3. Geomorphological surveys and software simulations for rock fall hazard assessment: a case study in the Italian Alps

    NASA Astrophysics Data System (ADS)

    Devoto, S.; Boccali, C.; Podda, F.

    2014-12-01

    In northern Italy, fast-moving landslides represent a significant threat to the population and human facilities. In the eastern portion of the Italian Alps, rock falls are recurrent and are often responsible for casualties or severe damage to roads and buildings. The above-cited type of landslide is frequent in mountain ranges, is characterised by strong relief energy and is triggered by earthquakes or copious rainfall, which often exceed 2000 mm yr-1. These factors cause morphological dynamics with intense slope erosion and degradation processes. This work investigates the appraisal of the rock-fall hazard related to the presence of several large unstable blocks located at the top of a limestone peak, approximately 500 m NW with respect to the Village of Cimolais. Field surveys recognised a limestone block exceeding a volume of 400 m3 and identified this block as the most hazardous for Cimolais Village because of its proximity to the rocky cliff. A first assessment of the possible transit and stop areas has been investigated through in-depth traditional activities, such as geomorphological mapping and aerial photo analysis. The output of field surveys was a detailed land use map, which provided a fundamental starting point for rock fall software analysis. The geomorphological observations were correlated with DTMs derived by regional topography and Airborne Laser Scanning (ALS) surveys to recognise possible rock fall routes. To simulate properly rock fall trajectories with a hybrid computer program, particular attention was devoted to the correct quantification of rates of input parameters, such as restitution coefficients and horizontal acceleration associated to earthquakes, which historically occur in this portion of Italy. The simulation outputs regarding the distribution of rock fall end points and kinetic energy along rock falling paths highlight the hazardous situation for Cimolais Village. Because of this reason, mitigation works have been suggested to immediately reduce the landslide risk. This proposal accounts for the high volume of blocks, which, in case of a fall, render the passive mitigation measures already in place at the back of Cimolais worthless.

  4. Impact and effectiveness of risk mitigation strategies on the insurability of nanomaterial production: evidences from industrial case studies.

    PubMed

    Bergamaschi, Enrico; Murphy, Finbarr; Poland, Craig A; Mullins, Martin; Costa, Anna L; McAlea, Eamonn; Tran, Lang; Tofail, Syed A M

    2015-01-01

    Workers involved in producing nanomaterials or using nanomaterials in manufacturing plants are likely to have earlier and higher exposure to manufactured/engineered nanomaterials (ENM) than the general population. This is because both the volume handled and the probability of the effluence of 'free' nanoparticles from the handled volume are much higher during a production process than at any other stage in the lifecycle of nanomaterials and nanotechnology-enabled products. Risk assessment (RA) techniques using control banding (CB) as a framework for risk transfer represents a robust theory but further progress on implementing the model is required so that risk can be transferred to insurance companies. Following a review of RA in general and hazard measurement in particular, we subject a Structural Alert Scheme methodology to three industrial case studies using ZrO2 , TiO2 , and multi-walled carbon nanotubes (MWCNT). The materials are tested in a pristine state and in a remediated (coated) state, and the respective emission and hazard rates are tested alongside the material performance as originally designed. To our knowledge, this is the first such implementation of a CB RA in conjunction with an ENM performance test and offers both manufacturers and underwriters an insight into future applications. PMID:25808636

  5. Marginal hazards model for case-cohort studies with multiple disease outcomes.

    PubMed

    Kang, S; Cai, J

    2009-12-01

    Case-cohort study designs are widely used to reduce the cost of large cohort studies while achieving the same goals, especially when the disease rate is low. A key advantage of the case-cohort study design is its capacity to use the same subcohort for several diseases or for several subtypes of disease. In order to compare the effect of a risk factor on different types of diseases, times to different events need to be modelled simultaneously. Valid statistical methods that take the correlations among the outcomes from the same subject into account need to be developed. To this end, we consider marginal proportional hazards regression models for case-cohort studies with multiple disease outcomes. We also consider generalized case-cohort designs that do not require sampling all the cases, which is more realistic for multiple disease outcomes. We propose an estimating equation approach for parameter estimation with two different types of weights. Consistency and asymptotic normality of the proposed estimators are established. Large sample approximation works well in small samples in simulation studies. The proposed methods are applied to the Busselton Health Study. PMID:23946547

  6. The mechanism and mitigation of the landslides of Leye region in Alishan, Taiwan

    NASA Astrophysics Data System (ADS)

    Feng, Z.-Y.; Ding, Z.-Z.; Chang, K.-C.; Lai, H.-Y.; Chen, S.-C.

    2012-04-01

    Many serious landslides occurred in Leye region in Alishan, Taiwan during Typhoon Morakot in 2009. This study investigated the mechanism of the Leye landslides and discussed the mitigation measures for future complex hazards and their effectiveness. Leye region is located at west side of Mountain Ali in central-southern Taiwan. The toe of Leye slope is surrounded by Creek Tsengwen and is strongly influenced by the landform processes such as river cutting and riverbed widening. The special hydrological and geomorphological conditions at Leye with extreme rainfall and flood induced the landslides in Leye region. Many aboriginal residences and cultivated slopelands were destroyed. The landslide areas were over one hundred hectares. In addition, large amounts of debris were accumulated on the streambeds that cause a high potential of secondary hazards to the region. This study clarified the causes and mechanisms of the Leye landslides, estimated the volume, and discussed the influences of the "flat-iron" landform and dip-slope in sedimentary strata. The mitigation works are still ongoing to prevent possible complex hazards, such as landslide lakes, debris flows, and additional circular landslides. We discussed the mitigation works for the effectiveness to slope stability and their influence on future landform changes in Leye region. An alert level criterion for emergency evacuation was also proposed for "software" mitigation strategy to reduce damages and loses in Leye region.

  7. Remedial Action Assessment System (RAAS): Evaluation of selected feasibility studies of CERCLA (Comprehensive Environmental Response, Compensation, and Liability Act) hazardous waste sites

    SciTech Connect

    Whelan, G. ); Hartz, K.E.; Hilliard, N.D. and Associates, Seattle, WA )

    1990-04-01

    Congress and the public have mandated much closer scrutiny of the management of chemically hazardous and radioactive mixed wastes. Legislative language, regulatory intent, and prudent technical judgment, call for using scientifically based studies to assess current conditions and to evaluate and select costeffective strategies for mitigating unacceptable situations. The NCP requires that a Remedial Investigation (RI) and a Feasibility Study (FS) be conducted at each site targeted for remedial response action. The goal of the RI is to obtain the site data needed so that the potential impacts on public health or welfare or on the environment can be evaluated and so that the remedial alternatives can be identified and selected. The goal of the FS is to identify and evaluate alternative remedial actions (including a no-action alternative) in terms of their cost, effectiveness, and engineering feasibility. The NCP also requires the analysis of impacts on public health and welfare and on the environment; this analysis is the endangerment assessment (EA). In summary, the RI, EA, and FS processes require assessment of the contamination at a site, of the potential impacts in public health or the environment from that contamination, and of alternative RAs that could address potential impacts to the environment. 35 refs., 7 figs., 1 tab.

  8. Review: Assessment of completeness of reporting in intervention studies using livestock: an example from pain mitigation interventions in neonatal piglets.

    PubMed

    O'Connor, A; Anthony, R; Bergamasco, L; Coetzee, J F; Dzikamunhenga, R S; Johnson, A K; Karriker, L A; Marchant-Forde, J N; Martineau, G P; Millman, S T; Pajor, E A; Rutherford, K; Sprague, M; Sutherland, M A; von Borell, E; Webb, S R

    2016-04-01

    Accurate and complete reporting of study methods, results and interpretation are essential components for any scientific process, allowing end-users to evaluate the internal and external validity of a study. When animals are used in research, excellence in reporting is expected as a matter of continued ethical acceptability of animal use in the sciences. Our primary objective was to assess completeness of reporting for a series of studies relevant to mitigation of pain in neonatal piglets undergoing routine management procedures. Our second objective was to illustrate how authors can report the items in the Reporting guidElines For randomized controLled trials for livEstoCk and food safety (REFLECT) statement using examples from the animal welfare science literature. A total of 52 studies from 40 articles were evaluated using a modified REFLECT statement. No single study reported all REFLECT checklist items. Seven studies reported specific objectives with testable hypotheses. Six studies identified primary or secondary outcomes. Randomization and blinding were considered to be partially reported in 21 and 18 studies, respectively. No studies reported the rationale for sample sizes. Several studies failed to report key design features such as units for measurement, means, standard deviations, standard errors for continuous outcomes or comparative characteristics for categorical outcomes expressed as either rates or proportions. In the discipline of animal welfare science, authors, reviewers and editors are encouraged to use available reporting guidelines to ensure that scientific methods and results are adequately described and free of misrepresentations and inaccuracies. Complete and accurate reporting increases the ability to apply the results of studies to the decision-making process and prevent wastage of financial and animal resources. PMID:26556522

  9. A Competence-Based Science Learning Framework Illustrated through the Study of Natural Hazards and Disaster Risk Reduction

    ERIC Educational Resources Information Center

    Oyao, Sheila G.; Holbrook, Jack; Rannikmäe, Miia; Pagunsan, Marmon M.

    2015-01-01

    This article proposes a competence-based learning framework for science teaching, applied to the study of "big ideas", in this case to the study of natural hazards and disaster risk reduction (NH&DRR). The framework focuses on new visions of competence, placing emphasis on nurturing connectedness and behavioral actions toward…

  10. A Competence-Based Science Learning Framework Illustrated through the Study of Natural Hazards and Disaster Risk Reduction

    ERIC Educational Resources Information Center

    Oyao, Sheila G.; Holbrook, Jack; Rannikme, Miia; Pagunsan, Marmon M.

    2015-01-01

    This article proposes a competence-based learning framework for science teaching, applied to the study of "big ideas", in this case to the study of natural hazards and disaster risk reduction (NH&DRR). The framework focuses on new visions of competence, placing emphasis on nurturing connectedness and behavioral actions toward

  11. Exploratory study of burn time, duty factor, and fluence on ITER activation hazards

    SciTech Connect

    Piet, S.J.

    1992-08-01

    The safety analyses for the Conceptual Design Activity (CDA) of the International Thermonuclear Experimental Reactor (ITER) were based on the simplifying assumption that the activation of materials occurs continuously. Since the analyses showed a significant hazard, it is appropriate to examine how much hazard reduction might occur if this conservative assumption were relaxed. This report explores how much reduction might be gained by considering non-continuous operation, that is, by considering plasma burn time, duty factor, and integrated fluence. Other factors impacting activation hazards - material choice, flux, and size - are not considered here.

  12. Safety in earth orbit study. Volume 2: Analysis of hazardous payloads, docking, on-board survivability

    NASA Technical Reports Server (NTRS)

    1972-01-01

    Detailed and supporting analyses are presented of the hazardous payloads, docking, and on-board survivability aspects connected with earth orbital operations of the space shuttle program. The hazards resulting from delivery, deployment, and retrieval of hazardous payloads, and from handling and transport of cargo between orbiter, sortie modules, and space station are identified and analyzed. The safety aspects of shuttle orbiter to modular space station docking includes docking for assembly of space station, normal resupply docking, and emergency docking. Personnel traffic patterns, escape routes, and on-board survivability are analyzed for orbiter with crew and passenger, sortie modules, and modular space station, under normal, emergency, and EVA and IVA operations.

  13. Natural phenomena hazards, Hanford Site, Washington

    SciTech Connect

    Conrads, T.J.

    1998-09-29

    This document presents the natural phenomena hazard loads for use in implementing DOE Order 5480.28, Natural Phenomena Hazards Mitigation, and supports development of double-shell tank systems specifications at the Hanford Site in south-central Washington State. The natural phenomena covered are seismic, flood, wind, volcanic ash, lightning, snow, temperature, solar radiation, suspended sediment, and relative humidity.

  14. Los Alamos Radiation Hydrocode Models of Asteroid Mitigation by a Subsurface Explosion

    NASA Astrophysics Data System (ADS)

    Weaver, R.; Plesko, C. S.; Dearholt, W.

    2010-12-01

    Mitigation of a potentially hazardous object (PHO) by a nuclear subsurface explosion is considered. In this new work we examine non-central subsurface emplacements and seek an optimal depth-of-burial for various explosion energies. This intervention methodology has been popularized in media presentations and is considered as one possible method of impact-hazard mitigation. We present new RAGE radiation hydrocode models of the shock-generated disruption of PHOs by subsurface nuclear bursts and deflection from shallow buried bursts using scenario-specific models from authentic RADAR shape models. We will show 2D and 3D models for the disruption by a large energy source at the center and near the edge (mitigation) of such PHO models (1-10 Mton TNT equivalent), specifically for asteroid 25143 Itokawa. Parametric studies will be done on: the value of the source energy (from 100 Kton to 10 Mton), the parameters in the Steinberg-Guinan strength model used and the internal composition of the object from uniform composition to a “rubble pile” distribution. Specifically we are interested in assessing the optimum depth of burial and energy required to essentially disrupt and/or move the PHO and therefore mitigate the hazard. Recollection will be considered. (LA-UR-10-05860) A subsurface 1 Mt explosion near the long-axis surface of an Itokawa shape model with a non-uniform internal composition. The resulting velocity imparted to the bulk remainder of the object is ~50 m/s.

  15. Los Alamos Radiation Hydrocode Models of Asteroid Mitigation by an Internal Explosion

    NASA Astrophysics Data System (ADS)

    Weaver, Robert; Plesko, C.; Dearholdt, W.

    2010-10-01

    Mitigation of a potentially hazardous object (PHO) by a conventional or nuclear subsurface burst is considered. This intervention methodology has been popularized in media presentations and is considered as one possible method of impact-hazard mitigation. We present RAGE radiation hydrocode models of the shock-generated disruption of PHOs by subsurface nuclear bursts and deflection from shallow buried bursts using scenario-specific models from authentic RADAR shape models. We will show 2D and 3D models for the disruption by a large energy source at the center and near the edge (mitigation) of such PHO models (1-10 Mton TNT equivalent), specifically for asteroid 25143 Itokawa. Parametric studies will be done on: the value of the source energy (from 1 Mton to 10 Mton), the parameters in the Steinberg-Guinan strength model used and the internal composition of the object from uniform composition to a "rubble pile” distribution. Specifically we are interested in assessing the optimum depth of burial and energy required to essentially disrupt and/or move the PHO and therefore mitigate the hazard. Recollection will be considered.

  16. Household hazardous wastes as a potential source of pollution: a generation study.

    PubMed

    Ojeda-Benítez, Sara; Aguilar-Virgen, Quetzalli; Taboada-González, Paul; Cruz-Sotelo, Samantha E

    2013-12-01

    Certain domestic wastes exhibit characteristics that render them dangerous, such as explosiveness, flammability, spontaneous combustion, reactivity, toxicity and corrosiveness. The lack of information about their generation and composition hinders the creation of special programs for their collection and treatment, making these wastes a potential threat to human health and the environment. We attempted to quantify the levels of hazardous household waste (HHW) generated in Mexicali, Mexico. The analysis considered three socioeconomic strata and eight categories. The sampling was undertaken on a house-by-house basis, and hypothesis testing was based on differences between two proportions for each of the eight categories. In this study, HHW comprised 3.49% of the total generated waste, which exceeded that reported in previous studies in Mexico. The greatest quantity of HHW was generated by the middle stratum; in the upper stratum, most packages were discarded with their contents remaining. Cleaning products represent 45.86% of the HHW generated. Statistical differences were not observed for only two categories among the three social strata. The scarcity of studies on HHW generation limits direct comparisons. Any decrease in waste generation within the middle social stratum will have a large effect on the total amount of waste generated, and decrease their impact on environmental and human health. PMID:24293231

  17. Vulnerability studies and integrated assessments for hazard risk reduction in Pittsburgh, PA (Invited)

    NASA Astrophysics Data System (ADS)

    Klima, K.

    2013-12-01

    Today's environmental problems stretch beyond the bounds of most academic disciplines, and thus solutions require an interdisciplinary approach. For instance, the scientific consensus is changes in the frequency and severity of many types of extreme weather events are increasing (IPCC 2012). Yet despite our efforts to reduce greenhouse gases, we continue to experience severe weather events such as Superstorm Sandy, record heat and blizzards, and droughts. These natural hazards, combined with increased vulnerability and exposure, result in longer-lasting disruptions to critical infrastructure and business continuity throughout the world. In order to protect both our lives and the economy, we must think beyond the bounds of any one discipline to include an integrated assessment of relevant work. In the wake of recent events, New York City, Washington, DC, Chicago, and a myriad of other cities have turned to their academic powerhouses for assistance in better understanding their vulnerabilities. This talk will share a case study of the state of integrated assessments and vulnerability studies of energy, transportation, water, real estate, and other main sectors in Pittsburgh, PA. Then the talk will use integrated assessment models and other vulnerability studies to create coordinated sets of climate projections for use by the many public agencies and private-sector organizations in the region.

  18. Mitigating Challenges of Using Virtual Reality in Online Courses: A Case Study

    ERIC Educational Resources Information Center

    Stewart, Barbara; Hutchins, Holly M.; Ezell, Shirley; De Martino, Darrell; Bobba, Anil

    2010-01-01

    Case study methodology was used to describe the challenges experienced in the development of a virtual component for a freshman-level undergraduate course. The purpose of the project was to use a virtual environment component to provide an interactive and engaging learning environment. While some student and faculty feedback was positive, this…

  19. Methane emission from ruminants and solid waste: A critical analysis of baseline and mitigation projections for climate and policy studies

    NASA Astrophysics Data System (ADS)

    Matthews, E.

    2012-12-01

    Current and projected estimates of methane (CH4) emission from anthropogenic sources are numerous but largely unexamined or compared. Presented here is a critical appraisal of CH4 projections used in climate-chemistry and policy studies. We compare emissions for major CH4 sources from several groups, including our own new data and RCP projections developed for climate-chemistry models for the next IPCC Assessment Report (AR5). We focus on current and projected baseline and mitigation emissions from ruminant animals and solid waste that are both predicted to rise dramatically in coming decades, driven primarily by developing countries. For waste, drivers include increasing urban populations, higher per capita waste generation due to economic growth and increasing landfilling rates. Analysis of a new global data base detailing waste composition, collection and disposal indicates that IPCC-based methodologies and default data overestimate CH4 emission for the current period which cascades into substantial overestimates in future projections. CH4 emission from solid waste is estimated to be ~10-15 Tg CH4/yr currently rather than the ~35 Tg/yr often reported in the literature. Moreover, emissions from developing countries are unlikely to rise rapidly in coming decades because new management approaches, such as sanitary landfills, that would increase emissions are maladapted to infrastructures in these countries and therefore unlikely to be implemented. The low current emission associated with solid waste (~10 Tg), together with future modest growth, implies that mitigation of waste-related CH4 emission is a poor candidate for slowing global warming. In the case of ruminant animals (~90 Tg CH4/yr currently), the dominant assumption driving future trajectories of CH4 emission is a substantial increase in meat and dairy consumption in developing countries to be satisfied by growing animal populations. Unlike solid waste, current ruminant emissions among studies exhibit a narrow range that does not necessarily signal low uncertainty but rather a reliance on similar animal statistics and emission factors. The UN Food and Agriculture Organization (FAO) projects 2000-2030 growth rates of livestock for most developing countries at 2% to >3% annually. However, the assumption of rapidly rising meat consumption is not supported by current trends nor by resource availability. For example, increased meat consumption in China and other developing countries is poultry and pork that do not affect CH4 emissions, suggesting that the rapid growth projected for all animals, boosting growth in CH4 emission, will not occur. From a resource standpoint, large increases in cattle, sheep and goat populations, especially for African countries (~60% by 2030), are not supportable on arid grazing lands that require very low stocking rates and semi-nomadic management. Increases projected for African animal populations would require either that about 2/3 more animals are grazed on increasingly drier lands or that all non-forested areas become grazing lands. Similar to solid waste, future methane emission from ruminant animals is likely to grow modestly although animals are not a likely candidate for CH4 mitigation due to their dispersed distribution throughout widely varying agricultural systems under very local management.

  20. Social Uptake of Scientific Understanding of Seismic Hazard in Sumatra and Cascadia

    NASA Astrophysics Data System (ADS)

    Shannon, R.; McCloskey, J.; Guyer, C.; McDowell, S.; Steacy, S.

    2007-12-01

    The importance of science within hazard mitigation cannot be underestimated. Robust mitigation polices rely strongly on a sound understanding of the science underlying potential natural disasters and the transference of that knowledge from the scientific community to the general public via governments and policy makers. We aim to investigate how and why the public's knowledge, perceptions, response, adjustments and values towards science have changed throughout two decades of research conducted in areas along and adjacent to the Sumatran and Cascadia subduction zones. We will focus on two countries subject to the same potential hazard, but which encompass starkly contrasting political, economic, social and environmental settings. The transfer of scientific knowledge into the public/ social arena is a complex process, the success of which is reflected in a community's ability to withstand large scale devastating events. Although no one could have foreseen the magnitude of the 2004 Boxing Day tsunami, the social devastation generated underscored the stark absence of mitigation measures in the nations most heavily affected. It furthermore emphasized the need for the design and implementation of disaster preparedness measures. Survey of existing literature has already established timelines for major events and public policy changes in the case study areas. Clear evidence exists of the link between scientific knowledge and its subsequent translation into public policy, particularly in the Cascadia context. The initiation of the National Tsunami Hazard Mitigation Program following the Cape Mendocino earthquake in 1992 embodies this link. Despite a series of environmental disasters with recorded widespread fatalities dating back to the mid 1900s and a heightened impetus for scientific research into tsunami/ earthquake hazard following the 2004 Boxing Day tsunami, the translation of science into the public realm is not widely obvious in the Sumatran context. This research aims to further investigate how the enhanced understanding of earthquake and tsunami hazards is being used to direct hazard mitigation strategies and enables direct comparison with the scientific and public policy developments in Cascadia.

  1. A comparative study of frequency ratio, statistical index and poisson method for landslide hazard mapping along East-West highway

    NASA Astrophysics Data System (ADS)

    Azizat, Nazirah; Lateh, Habibah; Tay, Lea Tien; Yusoff, Izham Mohamad

    2015-05-01

    The purpose of this study is to evaluate of the landslide hazard along East-west Highway (Gerik- Jeli) using frequency ratio, statistical index and poisson method. Historical data in this area were identified by field work in years 2007 until 2012. The factors chosen in producing of landslide hazard map were geology, rainfall, slope angle, soil texture, stream, profile curvature, plan curvature, NDVI, fault Line, elevation, age of rock and aspect. The results of the analysis for these three methods were verified using actual landslide location data. The validation results showed satisfactory agreement between the landslide hazard map and the existing data on landslide locations. The estimating of influencing factor also was carried out using sensitivity analysis.

  2. Exercise as an intervention for sedentary hazardous drinking college students: A pilot study

    PubMed Central

    Weinstock, Jeremiah; Capizzi, Jeffrey; Weber, Stefanie M.; Pescatello, Linda S.; Petry, Nancy M.

    2014-01-01

    Young adults 18–24 years have the highest rates of problems associated with alcohol use among all age groups, and substance use is inversely related to engagement in substance-free activities. This pilot study investigated the promotion of one specific substance-free activity, exercise, on alcohol use in college students. Thirty-one sedentary college students who engaged in hazardous drinking (Alcohol Use Disorders Identification Test scores ≥ 8) were randomized to one of two conditions: (a) one 50-minute session of motivational enhancement therapy (MET) focused on increasing exercise, or (b) one 50-minute session of MET focused on increasing exercise plus 8 weeks of contingency management (CM) for adhering to specific exercise activities. All participants completed evaluations at baseline and post-treatment (2-months later) assessing exercise participation and alcohol use. Results of the pilot study suggest the interventions were well received by participants, the MET+CM condition showed an increased self-reported frequency of exercise in comparison to the MET alone condition, but other indices of exercise, physical fitness, and alcohol use did not differ between the interventions over time. These results suggest that a larger scale trial could better assess efficacy of this well received combined intervention. Investigation in other clinically relevant populations is also warranted. PMID:24949085

  3. Role of human- and animal-sperm studies in the evaluation of male reproductive hazards

    SciTech Connect

    Wyrobek, A.J.; Gordon, L.; Watchmaker, G.

    1982-04-07

    Human sperm tests provide a direct means of assessing chemically induced spermatogenic dysfunction in man. Available tests include sperm count, motility, morphology (seminal cytology), and Y-body analyses. Over 70 different human exposures have been monitored in various groups of exposed men. The majority of exposures studied showed a significant change from control in one or more sperm tests. When carefully controlled, the sperm morphology test is statistically the most sensitive of these human sperm tests. Several sperm tests have been developed in nonhuman mammals for the study of chemical spermatotoxins. The sperm morphology test in mice has been the most widely used. Results with this test seem to be related to germ-cell mutagenicity. In general, animal sperm tests should play an important role in the identification and assessment of potential human reproductive hazards. Exposure to spermatotoxins may lead to infertility, and more importantly, to heritable genetic damage. While there are considerable animal and human data suggesting that sperm tests may be used to detect agents causing infertility, the extent to which these tests detect heritable genetic damage remains unclear. (ERB)

  4. Field study of exhaust fans for mitigating indoor air quality problems: Final report

    SciTech Connect

    Grimsrud, D.T.; Szydlowski, R.F.; Turk, B.H.

    1986-09-01

    Residential ventilation in the United States housing stock is provided primarily by infiltration, the natural leakage of outdoor air into a building through cracks and holes in the building shell. Since ventilation is the dominant mechanism for control of indoor pollutant concentrations, low infiltration rates caused fluctuation in weather conditions may lead to high indoor pollutant concentrations. Supplemental mechanical ventilation can be used to eliminate these periods of low infiltration. This study examined effects of small continuously-operating exhaust fan on pollutant concentrations and energy use in residences.

  5. The Relative Severity of Single Hazards within a Multi-Hazard Framework

    NASA Astrophysics Data System (ADS)

    Gill, Joel C.; Malamud, Bruce D.

    2013-04-01

    Here we present a description of the relative severity of single hazards within a multi-hazard framework, compiled through examining, quantifying and ranking the extent to which individual hazards trigger or increase the probability of other hazards. Hazards are broken up into six major groupings (geophysical, hydrological, shallow earth processes, atmospheric, biophysical and space), with the interactions for 21 different hazard types examined. These interactions include both one primary hazard triggering a secondary hazard, and one primary hazard increasing the probability of a secondary hazard occurring. We identify, through a wide-ranging review of grey- and peer-review literature, >90 interactions. The number of hazard-type linkages are then summed for each hazard in terms of their influence (the number of times one hazard type triggers another type of hazard, or itself) and their sensitivity (the number of times one hazard type is triggered by other hazard types, or itself). The 21 different hazards are then ranked based on (i) influence and (ii) sensitivity. We found, by quantification and ranking of these hazards, that: (i) The strongest influencers (those triggering the most secondary hazards) are volcanic eruptions, earthquakes and storms, which when taken together trigger almost a third of the possible hazard interactions identified; (ii) The most sensitive hazards (those being triggered by the most primary hazards) are identified to be landslides, volcanic eruptions and floods; (iii) When sensitivity rankings are adjusted to take into account the differential likelihoods of different secondary hazards being triggered, the most sensitive hazards are found to be landslides, floods, earthquakes and ground heave. We believe that by determining the strongest influencing and the most sensitive hazards for specific spatial areas, the allocation of resources for mitigation measures might be done more effectively.

  6. Laboratory Study of Polychlorinated Biphenyl (PCB) Contamination and Mitigation in Buildings -- Part 4. Evaluation of the Activated Metal Treatment System (AMTS) for On-site Destruction of PCBs

    EPA Science Inventory

    This is the fourth, also the last, report of the report series entitled “Laboratory Study of Polychlorinated Biphenyl (PCB) Contamination and Mitigation in Buildings.” This report evaluates the performance of an on-site PCB destruction method, known as the AMTS method, developed ...

  7. Laboratory Study of Polychlorinated Biphenyl Contamination and Mitigation in Buildings -- Part 4. Evaluation of the Activated Metal Treatment System (AMTS) for On-site Destruction of PCBs

    EPA Science Inventory

    This is the fourth, also the last, report of the report series entitled “Laboratory Study of Polychlorinated Biphenyl (PCB) Contamination and Mitigation in Buildings.” This report evaluates the performance of an on-site PCB destruction method, known as the AMTS method...

  8. Laboratory Study of Polychlorinated Biphenyl Contamination and Mitigation in Buildings -- Part 4. Evaluation of the Activated Metal Treatment System (AMTS) for On-site Destruction of PCBs

    EPA Science Inventory

    This is the fourth, also the last, report of the report series entitled Laboratory Study of Polychlorinated Biphenyl (PCB) Contamination and Mitigation in Buildings. This report evaluates the performance of an on-site PCB destruction method, known as the AMTS method...

  9. Laboratory Study of Polychlorinated Biphenyl (PCB) Contamination and Mitigation in Buildings -- Part 4. Evaluation of the Activated Metal Treatment System (AMTS) for On-site Destruction of PCBs

    EPA Science Inventory

    This is the fourth, also the last, report of the report series entitled Laboratory Study of Polychlorinated Biphenyl (PCB) Contamination and Mitigation in Buildings. This report evaluates the performance of an on-site PCB destruction method, known as the AMTS method, developed ...

  10. Legal and institutional tools to mitigate plastic pollution affecting marine species: Argentina as a case study.

    PubMed

    González Carman, Victoria; Machain, Natalia; Campagna, Claudio

    2015-03-15

    Plastics are the most common form of debris found along the Argentine coastline. The Río de la Plata estuarine area is a relevant case study to describe a situation where ample policy exists against a backdrop of plastics disposed by populated coastal areas, industries, and vessels; with resultant high impacts of plastic pollution on marine turtles and mammals. Policy and institutions are in place but the impact remains due to ineffective waste management, limited public education and awareness, and weaknesses in enforcement of regulations. This context is frequently repeated all over the world. We list possible interventions to increase the effectiveness of policy that require integrating efforts among governments, the private sector, non-governmental organizations and the inhabitants of coastal cities to reduce the amount of plastics reaching the Río de la Plata and protect threatened marine species. What has been identified for Argentina applies to the region and globally. PMID:25627195

  11. Dietary Flaxseed Mitigates Impaired Skeletal Muscle Regeneration: in Vivo, in Vitro and in Silico Studies

    PubMed Central