Sample records for hazard mitigation studies

  1. Unacceptable Risk: Earthquake Hazard Mitigation in One California School District. Hazard Mitigation Case Study.

    ERIC Educational Resources Information Center

    California State Office of Emergency Services, Sacramento.

    Earthquakes are a perpetual threat to California's school buildings. School administrators must be aware that hazard mitigation means much more than simply having a supply of water bottles in the school; it means getting everyone involved in efforts to prevent tragedies from occurring in school buildings in the event of an earthquake. The PTA in…

  2. Mitigating lightning hazards

    SciTech Connect

    Hasbrouck, R.

    1996-05-01

    A new draft document provides guidance for assessing and mitigating the effects of lightning hazards on a Department of Energy (or any other) facility. Written by two Lawrence Livermore engineers, the document combines lightning hazard identification and facility categorization with a new concept, the Lightning Safety System, to help dispel the confusion and mystery surrounding lightning and its effects. The guidance is of particular interest to DOE facilities storing and handling nuclear and high-explosive materials. The concepts presented in the document were used to evaluate the lightning protection systems of the Device Assembly Facility at the Nevada Test Site.

  3. Numerical Study on Tsunami Hazard Mitigation Using a Submerged Breakwater

    PubMed Central

    Yoo, Jeseon; Han, Sejong; Cho, Yong-Sik

    2014-01-01

    Most coastal structures have been built in surf zones to protect coastal areas. In general, the transformation of waves in the surf zone is quite complicated and numerous hazards to coastal communities may be associated with such phenomena. Therefore, the behavior of waves in the surf zone should be carefully analyzed and predicted. Furthermore, an accurate analysis of deformed waves around coastal structures is directly related to the construction of economically sound and safe coastal structures because wave height plays an important role in determining the weight and shape of a levee body or armoring material. In this study, a numerical model using a large eddy simulation is employed to predict the runup heights of nonlinear waves that passed a submerged structure in the surf zone. Reduced runup heights are also predicted, and their characteristics in terms of wave reflection, transmission, and dissipation coefficients are investigated. PMID:25215334
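
    For orientation, the reflection, transmission, and dissipation coefficients referred to in this abstract are conventionally defined from the incident, reflected, and transmitted wave heights; the relations below are a standard textbook formulation in notation of our own choosing, not reproduced from the paper:

    ```latex
    % Standard coefficient definitions for a submerged breakwater
    % (illustrative notation; not taken from the record above).
    \[
    K_r = \frac{H_r}{H_i}, \qquad
    K_t = \frac{H_t}{H_i}, \qquad
    K_d = \sqrt{\,1 - K_r^{2} - K_t^{2}\,},
    \]
    ```

    where H_i, H_r, and H_t are the incident, reflected, and transmitted wave heights; the last relation follows from conservation of linear wave energy, so a structure that reflects and dissipates more energy transmits less and therefore reduces runup.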

  4. Numerical study on tsunami hazard mitigation using a submerged breakwater.

    PubMed

    Ha, Taemin; Yoo, Jeseon; Han, Sejong; Cho, Yong-Sik

    2014-01-01

    Most coastal structures have been built in surf zones to protect coastal areas. In general, the transformation of waves in the surf zone is quite complicated and numerous hazards to coastal communities may be associated with such phenomena. Therefore, the behavior of waves in the surf zone should be carefully analyzed and predicted. Furthermore, an accurate analysis of deformed waves around coastal structures is directly related to the construction of economically sound and safe coastal structures because wave height plays an important role in determining the weight and shape of a levee body or armoring material. In this study, a numerical model using a large eddy simulation is employed to predict the runup heights of nonlinear waves that passed a submerged structure in the surf zone. Reduced runup heights are also predicted, and their characteristics in terms of wave reflection, transmission, and dissipation coefficients are investigated. PMID:25215334

  5. Collective action for community-based hazard mitigation: a case study of Tulsa project impact

    E-print Network

    Lee, Hee Min

    2005-11-01

    During the past two decades, community-based hazard mitigation (CBHM) has been newly proposed and implemented as an alternative conceptual model for emergency management to deal with disasters comprehensively in order to curtail skyrocketing...

  6. Mitigating Volcanic Hazards

    NSDL National Science Digital Library

    Lynne Elkins

    This activity spans two in-class sessions of 1-1.5 hours each, and includes both a small group activity focused on a set of volcanic case studies and a full-class role-playing activity where the class must decide, as a fictional town, how to respond to a nearby volcano exhibiting increased activity.

  7. Some hazards and catastrophes: Recognition and mitigation

    NASA Astrophysics Data System (ADS)

    Smith, Joseph V.

    Because some hazards to the human race embrace the technical competence of the American Geophysical Union, a Union session entitled “Some Hazards and Catastrophes: Recognition and Mitigation” will be held at the 1986 AGU Spring Meeting in Baltimore, Md., on Tuesday morning, May 20. Scientific study of natural hazards has already been shown to be valuable to society and indeed has already been the subject of many AGU sessions. However, most programs of hazard reduction have been underfunded and restricted to only a few parts of the world. It is time to examine all natural hazards in the widest possible context of public affairs and to determine how a bold international effort would benefit the human race.

  8. Washington Tsunami Hazard Mitigation Program

    NASA Astrophysics Data System (ADS)

    Walsh, T. J.; Schelling, J.

    2012-12-01

    Washington State has participated in the National Tsunami Hazard Mitigation Program (NTHMP) since its inception in 1995. We have participated in the tsunami inundation hazard mapping, evacuation planning, education, and outreach efforts that generally characterize the NTHMP efforts. We have also investigated hazards of significant interest to the Pacific Northwest. The hazard from locally generated earthquakes on the Cascadia subduction zone, which threatens tsunami inundation in less than an hour following a magnitude 9 earthquake, creates special problems for low-lying accretionary shoreforms in Washington, such as the spits of Long Beach and Ocean Shores, where high ground is not accessible within the limited time available for evacuation. To ameliorate this problem, we convened a panel of the Applied Technology Council to develop guidelines for construction of facilities for vertical evacuation from tsunamis, published as FEMA 646, now incorporated in the International Building Code as Appendix M. We followed this with a program called Project Safe Haven (http://www.facebook.com/ProjectSafeHaven) to site such facilities along the Washington coast in appropriate locations and with appropriate designs to blend with the local communities, as chosen by the citizens. This has now been completed for the entire outer coast of Washington. In conjunction with this effort, we have evaluated the potential for earthquake-induced ground failures in and near tsunami hazard zones to help develop cost estimates for these structures and to establish appropriate tsunami evacuation routes and evacuation assembly areas that are likely to be available after a major subduction zone earthquake. We intend to continue these geotechnical evaluations for all tsunami hazard zones in Washington.

  9. Florida households' expected responses to hurricane hazard mitigation incentives.

    PubMed

    Ge, Yue; Peacock, Walter Gillis; Lindell, Michael K

    2011-10-01

    This study tested a series of models predicting household expectations of participating in hurricane hazard mitigation incentive programs. Data from 599 households in Florida revealed that mitigation incentive adoption expectations were most strongly and consistently related to hazard intrusiveness and risk perception and, to a lesser extent, worry. Demographic characteristics and hazard exposure had indirect effects on mitigation incentive adoption expectations that were mediated by the psychological variables. The results also revealed differences in the factors affecting mitigation incentive adoption expectations for each of five specific incentive programs. Overall, the results suggest that hazard managers are more likely to increase participation in mitigation incentive programs if they provide messages that repeatedly (thus increasing hazard intrusiveness) remind people of the likelihood of severe negative consequences of hurricane impact (thus increasing risk perception). PMID:21449959
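
    The mediated structure described in this abstract can be sketched schematically; the regression below uses placeholder symbols (E = adoption expectation, HI = hazard intrusiveness, RP = risk perception, W = worry, D = demographics, X = hazard exposure) and illustrates the reported model structure, not the authors' estimated equations:

    ```latex
    % Placeholder notation; illustrates the mediated structure, not the fitted model.
    \[
    \begin{aligned}
    E_i  &= \beta_0 + \beta_1\,HI_i + \beta_2\,RP_i + \beta_3\,W_i + \varepsilon_i,\\
    HI_i &= \gamma_0 + \boldsymbol{\gamma}^{\top}\mathbf{D}_i + \delta\,X_i + \nu_i
            \quad\text{(and similarly for } RP_i,\ W_i\text{)},
    \end{aligned}
    \]
    ```

    so demographics and hazard exposure influence adoption expectations only indirectly, through the psychological variables.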

  10. 76 FR 61070 - Disaster Assistance; Hazard Mitigation Grant Program

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-10-03

    ...programs for wildfire hazard mitigation and erosion hazard mitigation in the list of eligible...Department of Agriculture. Wildfire and Erosion Under the NPRM, vegetation management related to wildfire and erosion hazard mitigation measures would be...

  11. Seismic hazard status and mitigation actions

    E-print Network

    Paris-Sud XI, Université de

    Seismic hazard status and mitigation actions in Guadeloupe and Martinique, French West Indies (FWI). Topics include hazard mapping in the FWI (seismic microzonation) and a PGA of 0.3 g for a 475-year return period. [rap.obs.ujf-grenoble.fr; hal-00662359, version 1, 23 Jan 2012]

  12. What Influences Hazard Mitigation? Household Decision Making About Wildfire Risks in Arizona's White Mountains

    Microsoft Academic Search

    Timothy W. Collins

    2008-01-01

    Through a study of human response to wildfire hazards, this article addresses the question: What influences hazard mitigation? Results from a household-level multiple regression analysis using structured survey, hazard exposure, and secondary data reveal that social vulnerability, place dependency, and contextual influences are important determinants of mitigation of wildfire hazards. Lower income and renter households engage in less mitigation than

  13. An economic and geographic appraisal of a spatial natural hazard risk: a study of landslide mitigation rules

    USGS Publications Warehouse

    Bernknopf, R.L.; Brookshire, D.S.; Campbell, R.H.; Shapiro, C.D.

    1988-01-01

    Efficient mitigation of natural hazards requires a spatial representation of the risk, based upon the geographic distribution of physical parameters and man-related development activities. Through such a representation, the spatial probability of landslides based upon physical science concepts is estimated for Cincinnati, Ohio. Mitigation programs designed to reduce loss from landslide natural hazards are then evaluated. An optimum mitigation rule is suggested that is spatially selective and is determined by objective measurements of hillside slope and properties of the underlying soil. -Authors
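
    The "optimum mitigation rule" referred to above is, in essence, a spatially selective cost-benefit criterion. A hedged sketch of such a rule, in notation of our own choosing rather than the authors', is:

    ```latex
    % Illustrative spatially selective decision rule; notation is ours, not the authors'.
    \[
    \text{mitigate at } x \iff
    p\big(\text{slide} \mid s(x),\, c(x)\big)\, L(x) \;>\; C_m(x),
    \]
    ```

    where p is the landslide probability at location x conditioned on hillside slope s(x) and soil properties c(x), L(x) is the expected loss if a slide occurs there, and C_m(x) is the annualized cost of mitigation; mitigation is then required only on parcels where the slope- and soil-conditioned expected avoided loss exceeds the cost of compliance.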

  14. Earthquake Hazard Mitigation Strategy in Indonesia

    NASA Astrophysics Data System (ADS)

    Karnawati, D.; Anderson, R.; Pramumijoyo, S.

    2008-05-01

    Because of the active tectonic setting of the region, the risks of geological hazards inevitably increase in the Indonesian Archipelago and other Asian countries. Encouraging communities living in vulnerable areas to adapt to the geological conditions will be the most appropriate strategy for earthquake risk reduction. Updating the earthquake hazard maps, enhancing existing land-use management, establishing a public education strategy and methods, strengthening linkages among stakeholders of disaster mitigation institutions, and establishing continuous public consultation are the main strategic programs for community resilience in earthquake-vulnerable areas. This paper highlights some important achievements of earthquake hazard mitigation programs in Indonesia, together with the difficulties in implementing such programs. Case examples of the Yogyakarta and Bengkulu earthquake mitigation efforts will also be discussed as lessons learned. A new approach for developing earthquake hazard maps, initiated by mapping the psychological condition of the people living in vulnerable areas, will be addressed as well.

  15. Mitigation of Hazardous Comets and Asteroids

    NASA Astrophysics Data System (ADS)

    Belton, Michael J. S.; Morgan, Thomas H.; Samarasinha, Nalin H.; Yeomans, Donald K.

    2004-11-01

    Preface; 1. Recent progress in interpreting the nature of the near-Earth object population W. Bottke, A. Morbidelli and R. Jedicke; 2. Earth impactors: orbital characteristics and warning times S. R. Chesley and T. B. Spahr; 3. The role of radar in predicting and preventing asteroid and comet collisions with Earth S. J. Ostro and J. D. Giorgini; 4. Interior structures for asteroids and cometary nuclei E. Asphaug; 5. What we know and don't know about surfaces of potentially hazardous small bodies C. R. Chapman; 6. About deflecting asteroids and comets K. A. Holsapple; 7. Scientific requirements for understanding the near-Earth asteroid population A. W. Harris; 8. Physical properties of comets and asteroids inferred from fireball observations M. D. Martino and A. Cellino; 9. Mitigation technologies and their requirements C. Gritzner and R. Kahle; 10. Peering inside near-Earth objects with radio tomography W. Kofman and A. Safaeinili; 11. Seismological investigation of asteroid and comet interiors J. D. Walker and W. F. Huebner; 12. Lander and penetrator science for near-Earth object mitigation studies A. J. Ball, P. Lognonne, K. Seiferlin, M. Patzold and T. Spohn; 13. Optimal interception and deflection of Earth-approaching asteroids using low-thrust electric propulsion B. A. Conway; 14. Close proximity operations at small bodies: orbiting, hovering, and hopping D. J. Scheeres; 15. Mission operations in low gravity regolith and dust D. Sears, M. Franzen, S. Moore, S. Nichols, M. Kareev and P. Benoit; 16. Impacts and the public: communicating the nature of the impact hazard D. Morrison, C. R. Chapman, D. Steel and R. P. Binzel; 17. Towards a program to remove the threat of hazardous NEOs M. J. S. Belton.

  16. Mitigation of Hazardous Comets and Asteroids

    NASA Astrophysics Data System (ADS)

    Belton, Michael J. S.; Morgan, Thomas H.; Samarasinha, Nalin H.; Yeomans, Donald K.

    2011-03-01

    Preface; 1. Recent progress in interpreting the nature of the near-Earth object population W. Bottke, A. Morbidelli and R. Jedicke; 2. Earth impactors: orbital characteristics and warning times S. R. Chesley and T. B. Spahr; 3. The role of radar in predicting and preventing asteroid and comet collisions with Earth S. J. Ostro and J. D. Giorgini; 4. Interior structures for asteroids and cometary nuclei E. Asphaug; 5. What we know and don't know about surfaces of potentially hazardous small bodies C. R. Chapman; 6. About deflecting asteroids and comets K. A. Holsapple; 7. Scientific requirements for understanding the near-Earth asteroid population A. W. Harris; 8. Physical properties of comets and asteroids inferred from fireball observations M. D. Martino and A. Cellino; 9. Mitigation technologies and their requirements C. Gritzner and R. Kahle; 10. Peering inside near-Earth objects with radio tomography W. Kofman and A. Safaeinili; 11. Seismological investigation of asteroid and comet interiors J. D. Walker and W. F. Huebner; 12. Lander and penetrator science for near-Earth object mitigation studies A. J. Ball, P. Lognonne, K. Seiferlin, M. Patzold and T. Spohn; 13. Optimal interception and deflection of Earth-approaching asteroids using low-thrust electric propulsion B. A. Conway; 14. Close proximity operations at small bodies: orbiting, hovering, and hopping D. J. Scheeres; 15. Mission operations in low gravity regolith and dust D. Sears, M. Franzen, S. Moore, S. Nichols, M. Kareev and P. Benoit; 16. Impacts and the public: communicating the nature of the impact hazard D. Morrison, C. R. Chapman, D. Steel and R. P. Binzel; 17. Towards a program to remove the threat of hazardous NEOs M. J. S. Belton.

  17. Reduce toxic hazards using passive mitigation

    SciTech Connect

    Flamberg, S.A.; Torti, K.S.; Myers, P.M. [ERM-Four Elements, Inc., Columbus, OH (United States)]

    1998-07-01

    The primary goal of the Risk Management Program Rule promulgated under Section 112(r) of the 1990 US Clean Air Act Amendments is to prevent the accidental release of those chemicals that pose the greatest threat to the public and the environment, and to encourage emergency preparedness to mitigate the severity of such releases. The Rule requires facility owners to identify, evaluate, and communicate to the public any potential worst-case scenarios that could involve accidental releases of toxic and flammable substances. A worst-case scenario is defined by the US Environmental Protection Agency (EPA; Washington, DC) as: "...the release of the largest quantity of a regulated substance from a vessel or process line failure that results in the greatest distance to an endpoint." When designing systems to store or process hazardous materials, passive-mitigation methods--those that function without human, mechanical, or energy input--should be considered. Such systems contain or limit a potential release of hazardous materials. And, because they have no mechanical requirements, passive-mitigation techniques are considered more reliable than active methods, such as emergency-shutdown and water-spray systems. Passive mitigation should also be considered when defining potential release scenarios and modeling hazard zones.

  18. WHC natural phenomena hazards mitigation implementation plan

    SciTech Connect

    Conrads, T.J.

    1996-09-11

    Natural phenomena hazards (NPH) are unexpected acts of nature which pose a threat or danger to workers, the public, or the environment. Earthquakes, extreme winds (hurricane and tornado), snow, flooding, volcanic ashfall, and lightning strikes are examples of NPH at Hanford. It is the policy of the U.S. Department of Energy (DOE) to design, construct, and operate DOE facilities so that workers, the public, and the environment are protected from NPH and other hazards. During 1993, the DOE Richland Operations Office (RL) transmitted DOE Order 5480.28, "Natural Phenomena Hazards Mitigation," to Westinghouse Hanford Company (WHC) for compliance. The Order includes rigorous new NPH criteria for the design of new DOE facilities as well as for the evaluation and upgrade of existing DOE facilities. In 1995 DOE issued Order 420.1, "Facility Safety," which contains the same NPH requirements and invokes the same applicable standards as Order 5480.28. It will supersede Order 5480.28 when an in-force date for Order 420.1 is established through contract revision. Activities will be planned and accomplished in four phases: Mobilization, Prioritization, Evaluation, and Upgrade. The basis for the graded approach is the designation of facilities/structures into one of five performance categories based upon safety function, mission, and cost. This Implementation Plan develops the program for the Prioritization Phase, as well as an overall strategy for the implementation of DOE Order 5480.28.

  19. G188 Research Paper Volcano Tourism: Hazards and Mitigation

    E-print Network

    Polly, David

    Madi McNew, G188 Research Paper, 6/14/14. Volcano Tourism: Hazards and Mitigation. ABSTRACT: The Long [...] of visitors each year. The various products and effects of volcanoes make hazards unavoidable. Regardless, monitoring and mitigation of the volcanoes are necessary in order to protect the residents and tourists.

  20. EVALUATION OF FOAMS FOR MITIGATING AIR POLLUTION FROM HAZARDOUS SPILLS

    EPA Science Inventory

    This program has been conducted to evaluate commercially available water base foams for mitigating the vapors from hazardous chemical spills. Foam systems were evaluated in the laboratory to define those foam properties which are important in mitigating hazardous vapors. Larger s...

  1. Influence of behavioral biases on the assessment of multi-hazard risks and the implementation of multi-hazard risks mitigation measures: case study of multi-hazard cyclone shelters in Tamil Nadu, India

    NASA Astrophysics Data System (ADS)

    Komendantova, Nadejda; Patt, Anthony

    2013-04-01

    In December 2004, a multiple-hazard event devastated the Tamil Nadu province of India. The Sumatra-Andaman earthquake, with a magnitude of Mw = 9.1-9.3, caused the Indian Ocean tsunami, with wave heights up to 30 m and flooding that reached up to two kilometers inland in some locations. More than 7,790 persons were killed in the province of Tamil Nadu, with 206 in its capital Chennai. The time lag between the earthquake and the tsunami's arrival in India was over an hour; therefore, if a suitable early warning system, a proper means of communicating the warning, and shelters for people had existed, then, while this would not have prevented the destruction of infrastructure, several thousand human lives would have been saved. India has over forty years of experience in the construction of cyclone shelters. With additional effort and investment, these shelters could be adapted to other types of hazards such as tsunamis and flooding, and new multi-hazard cyclone shelters (MPCS) could be constructed. It would therefore be possible to mitigate one hazard, such as cyclones, by constructing a network of shelters while at the same time, with some additional investment, adapting these shelters to also deal with, for example, tsunamis. In this historical case, the failure to consider multiple hazards caused significant human losses. The current paper investigates the patterns of the national decision-making process with regard to multiple-hazard mitigation measures and how the presence of behavioral and cognitive biases influenced the perceptions of the probabilities of multiple hazards and the choices made for their mitigation by the national decision-makers. Our methodology was based on the analysis of existing reports from national and international organizations as well as the available scientific literature on behavioral economics and natural hazards. The results identified several biases in the national decision-making process when the construction of cyclone shelters was being undertaken. The availability heuristic caused a perception of low probability of a tsunami following an earthquake, as the last large similar event happened over a hundred years ago. Another bias led to a situation in which decisions were taken on the basis of experience and not statistical evidence; namely, experience showed that the so-called "Ring of Fire" generates undersea earthquakes and tsunamis in the Pacific Ocean. This knowledge led decision-makers to neglect the numerical estimates of the probability of an undersea earthquake in the Indian Ocean, even though seismologists were warning about the probability of a large undersea earthquake there. The bounded-rationality bias led to misperception of signals from the early warning center in the Pacific Ocean. The resulting limited concern produced risk mitigation measures that considered cyclone risks but gave much less consideration to tsunamis. Under loss aversion, the decision-makers perceived the losses connected with the necessary additional investment as being greater than the benefits of mitigating a less probable hazard.

  2. Potentially Hazardous Objects (PHO) Mitigation Program

    NASA Astrophysics Data System (ADS)

    Huebner, Walter

    Southwest Research Institute (SwRI) and its partner, Los Alamos National Laboratory (LANL), are prepared to develop, implement, and expand procedures to avert collisions of potentially hazardous objects (PHOs) with Earth as recommended by NASA in its White Paper "Near-Earth Object Survey and Deflection Analysis of Alternatives" requested by the US Congress and submitted to it in March 2007. In addition to developing the general mitigation program as outlined in the NASA White Paper, the program will be expanded to include aggressive mitigation procedures for small (e.g., Tunguska-sized) PHOs and other short warning-time PHOs such as some long-period comet nuclei. As a first step the program will concentrate on the most likely and critical cases, namely small objects and long-period comet nuclei with short warning times, but without losing sight of objects with longer warning times. Objects smaller than a few hundred meters are of interest because they are about 1000 times more abundant than kilometer-sized objects and are fainter and more difficult to detect, which may lead to short warning times and hence short reaction times. Yet, even these small PHOs can have devastating effects, as the 30 June 1908 Tunguska event has shown. In addition, long-period comets, although relatively rare, can be large (sometimes tens of kilometers in size), and their appearance cannot be predicted because of their long orbital periods. Comet C/1983 H1 (IRAS-Araki-Alcock), for example, has an orbital period of 963.22 years, was discovered 27 April 1983, and passed Earth only two weeks later, on 11 May 1983, at a distance of 0.0312 AU. Aggressive methods and continuous alertness will be needed to defend against objects with such short warning times. While intact deflection of a PHO remains a key objective, destruction of a PHO and dispersion of the pieces must also be considered. The effectiveness of several alternative methods including nuclear demolition munitions, conventional explosives, and hyper-velocity impacts will be investigated and compared. This comparison is important for technical as well as political reasons, both domestic and international. The long-range plan includes evaluation of technical readiness including launch capabilities, tests for effectiveness using materials simulating PHOs, and building and testing several modular systems appropriate for alternative applications depending on the type of PHO.

  3. A mitigation program for potentially hazardous long-period comets

    Microsoft Academic Search

    Daniel Boice; Walter Huebner

    2010-01-01

    A program is being developed by Southwest Research Institute and Los Alamos National Laboratory to avert collisions of potentially hazardous objects (PHOs) with Earth. In addition to developing general mitigation strategies, the program will be expanded to include aggressive mitigation procedures for small (e.g., Tunguska-sized) potentially hazardous objects (PHO) and other short warning-time PHOs, such as some long-period comet

  4. Benefit-Cost Analysis of FEMA Hazard Mitigation Grants

    Microsoft Academic Search

    Adam Rose; Keith Porter; Nicole Dash; Jawhar Bouabid; Charles Huyck; John C. Whitehead; Douglass Shaw; Ronald T. Eguchi; Craig Taylor; Thomas R. McLane; L. Thomas Tobin; Philip T. Ganderton; David Godschalk; Anne S. Kiremidjian; Kathleen Tierney; Carol Taylor West

    2006-01-01

    Mitigation ameliorates the impact of natural hazards on communities by reducing loss of life and injury, property and environmental damage, and social and economic disruption. The potential to reduce these losses brings many benefits, but every mitigation activity has a cost that must be considered in our world of limited resources. In principle benefit-cost analysis (BCA) can be used to

  5. Destructive Interactions Between Mitigation Strategies and the Causes of Unexpected Failures in Natural Hazard Mitigation Systems

    NASA Astrophysics Data System (ADS)

    Day, S. J.; Fearnley, C. J.

    2013-12-01

    Large investments in the mitigation of natural hazards, using a variety of technology-based mitigation strategies, have proven to be surprisingly ineffective in some recent natural disasters. These failures reveal a need for a systematic classification of mitigation strategies; an understanding of the scientific uncertainties that affect the effectiveness of such strategies; and an understanding of how the different types of strategy within an overall mitigation system interact destructively to reduce the effectiveness of the overall mitigation system. We classify mitigation strategies into permanent, responsive and anticipatory. Permanent mitigation strategies, such as flood and tsunami defenses or land use restrictions, are both costly and 'brittle': when they malfunction they can increase mortality. Such strategies critically depend on the accuracy of the estimates of expected hazard intensity in the hazard assessments that underpin their design. Responsive mitigation strategies, such as tsunami and lahar warning systems, rely on capacities to detect and quantify the hazard source events and to transmit warnings fast enough to enable at-risk populations to decide and act effectively. Self-warning and voluntary evacuation is also usually a responsive mitigation strategy. Uncertainty in the nature and magnitude of the detected hazard source event is often the key scientific obstacle to responsive mitigation; public understanding of both the hazard and the warnings, to enable decision making, can also be a critical obstacle. Anticipatory mitigation strategies use interpretation of precursors to hazard source events and are used widely in mitigation of volcanic hazards. Their critical limitations are due to uncertainties in the time, space and magnitude relationships between precursors and hazard events. Examples of destructive interaction between different mitigation strategies are provided by the Tohoku 2011 earthquake and tsunami; recent earthquakes that have impacted population centers with poor enforcement of building codes, unrealistic expectations of warning systems or failures to understand local seismic damage mechanisms; and the interaction of land use restriction strategies and responsive warning strategies around lahar-prone volcanoes. A more complete understanding of the interactions between these different types of mitigation strategy, especially the consequences for the expectations and behaviors of the populations at risk, requires models of decision-making under high levels of both uncertainty and danger. The Observation-Orientation-Decision-Action (OODA) loop model (Boyd, 1987) may be a particularly useful model. It emphasizes the importance of 'orientation' (the interpretation of observations and assessment of their significance for the observer and decision-maker), the feedback between decisions and subsequent observations and orientations, and the importance of developing mitigation strategies that are flexible and so able to respond to the occurrence of the unexpected. REFERENCE: Boyd, J.R. A Discourse on Winning and Losing [http://dnipogo.org/john-r-boyd/]

  6. Mitigation of earthquake hazards using seismic base isolation systems

    SciTech Connect

    Wang, C.Y.

    1994-06-01

    This paper deals with mitigation of earthquake hazards using seismic base-isolation systems. A numerical algorithm is described for system response analysis of isolated structures with laminated elastomer bearings. The focus of this paper is on the adaptation of a nonlinear constitutive equation for the isolation bearing, and the treatment of foundation embedment for the soil-structure-interaction analysis. Sample problems are presented to illustrate the mitigating effect of using base-isolation systems.
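
    The record does not reproduce the governing equations; as background, a linearized two-degree-of-freedom idealization of a base-isolated structure under ground acceleration (a simplification in our own notation, not the paper's nonlinear bearing model) is:

    ```latex
    % Linearized two-DOF sketch of a base-isolated building under ground motion;
    % subscript b: base/isolation level, s: superstructure; u are displacements
    % relative to the ground and \ddot{u}_g is the ground acceleration.
    \[
    \begin{aligned}
    m_s\ddot{u}_s + c_s(\dot{u}_s-\dot{u}_b) + k_s(u_s-u_b) &= -m_s\ddot{u}_g,\\
    m_b\ddot{u}_b + c_b\dot{u}_b + k_b u_b
      - c_s(\dot{u}_s-\dot{u}_b) - k_s(u_s-u_b) &= -m_b\ddot{u}_g.
    \end{aligned}
    \]
    ```

    Isolation works by choosing the bearing stiffness k_b low enough that the isolated period is several times the fixed-base period, so the superstructure responds nearly as a rigid body and interstory drifts and accelerations are reduced.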

  7. Mitigation of earthquake hazards using seismic isolation systems

    SciTech Connect

    Wang, C.-Y.

    1996-06-01

    This paper describes mitigation of earthquake hazards using seismic base isolation systems. A numerical algorithm for analyzing system response of base-isolated structures with laminated elastomer bearings is briefly described. Seismic response analyses of both base-isolated and unisolated buildings under earthquakes #42 and #44 are performed and the results are compared to illustrate the mitigating effect of base-isolated systems.

  8. Mitigation options for accidental releases of hazardous gases

    SciTech Connect

    Fthenakis, V.M.

    1995-05-01

    The objective of this paper is to review and compare technologies available for mitigation of unconfined releases of toxic and flammable gases. These technologies include: secondary confinement, deinventory, vapor barriers, foam spraying, and water sprays/monitors. Guidelines for the design and/or operation of effective post-release mitigation systems and case studies involving actual industrial mitigation systems are also presented.

  9. Rainfall-triggered landslides, anthropogenic hazards, and mitigation strategies

    NASA Astrophysics Data System (ADS)

    Larsen, M. C.

    2008-01-01

    Rainfall-triggered landslides are part of a natural process of hillslope erosion that can result in catastrophic loss of life and extensive property damage in mountainous, densely populated areas. As global population expansion on or near steep hillslopes continues, the human and economic costs associated with landslides will increase. Landslide hazard mitigation strategies generally involve hazard assessment mapping, warning systems, control structures, and regional landslide planning and policy development. To be sustainable, hazard mitigation requires that management of natural resources is closely connected to local economic and social interests. A successful strategy is dependent on a combination of multi-disciplinary scientific and engineering approaches, and the political will to take action at the local community to national scale.

  10. 77 FR 24505 - Hazard Mitigation Assistance for Wind Retrofit Projects for Existing Residential Buildings

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-04-24

    ...FEMA-2012-0007] Hazard Mitigation Assistance for Wind Retrofit Projects for Existing Residential...comments on Hazard Mitigation Assistance for Wind Retrofit Projects for Existing Residential...such activity is the implementation of wind retrofit projects to protect existing...

  11. New Approaches to Tsunami Hazard Mitigation Demonstrated in Oregon

    NASA Astrophysics Data System (ADS)

    Priest, G. R.; Rizzo, A.; Madin, I.; Lyles Smith, R.; Stimely, L.

    2012-12-01

    Oregon Department of Geology and Mineral Industries and Oregon Emergency Management collaborated over the last four years to increase tsunami preparedness for residents and visitors to the Oregon coast. Utilizing support from the National Tsunami Hazard Mitigation Program (NTHMP), new approaches to outreach and tsunami hazard assessment were developed and then applied. Hazard assessment was approached by first doing two pilot studies aimed at calibrating theoretical models to direct observations of tsunami inundation gleaned from the historical and prehistoric (paleoseismic/paleotsunami) data. The results of these studies were then submitted to peer-reviewed journals and translated into 1:10,000-12,000-scale inundation maps. The inundation maps utilize a powerful new tsunami model, SELFE, developed by Joseph Zhang at the Oregon Health & Science University. SELFE uses unstructured computational grids and parallel processing techniques to achieve fast, accurate simulation of tsunami interactions with fine-scale coastal morphology. The inundation maps were simplified into tsunami evacuation zones accessed as map brochures and an interactive mapping portal at http://www.oregongeology.org/tsuclearinghouse/. Unique in the world are new evacuation maps that show separate evacuation zones for distant versus locally generated tsunamis. The brochure maps explain that evacuation time is four hours or more for distant tsunamis but 15-20 minutes for local tsunamis that are invariably accompanied by strong ground shaking. Since distant tsunamis occur much more frequently than local tsunamis, the two-zone maps avoid needless over-evacuation (and expense) caused by one-zone maps. Inundation mapping for the entire Oregon coast will be complete by ~2014. Educational outreach was accomplished first by doing a pilot study to measure effectiveness of various approaches using before-and-after polling and then applying the most effective methods. In descending order, the most effective methods were: (1) door-to-door (person-to-person) education, (2) evacuation drills, (3) outreach to K-12 schools, (4) media events, and (5) workshops targeted to key audiences (lodging facilities, teachers, and local officials). Community organizers were hired to apply these five methods to clusters of small communities, measuring performance by before-and-after polling. Organizers were encouraged to approach the top priority, person-to-person education, by developing Community Emergency Response Teams (CERT) or CERT-like organizations in each community, thereby leaving behind a functioning volunteer-based group that will continue the outreach program and build long-term resiliency. One of the most effective person-to-person educational tools was the Map Your Neighborhood program that brings people together so they can sketch the basic layout of their neighborhoods to depict key earthquake and tsunami hazards and mitigation solutions. The various person-to-person volunteer efforts and supporting outreach activities are knitting communities together and creating a permanent culture of tsunami and earthquake preparedness. All major Oregon coastal population centers will have been covered by this intensive outreach program by ~2014.

  12. Evaluating fuel complexes for fire hazard mitigation planning in the southeastern United States

    Microsoft Academic Search

    Anne G. Andreu; Dan Shea; Bernard R. Parresol; Roger D. Ottmar

    2012-01-01

    Fire hazard mitigation planning requires an accurate accounting of fuel complexes to predict potential fire behavior and effects of treatment alternatives. In the southeastern United States, rapid vegetation growth coupled with complex land use history and forest management options requires a dynamic approach to fuel characterization. In this study we assessed potential surface fire behavior with the Fuel Characteristic Classification

  13. Composite Materials for Hazard Mitigation of Reactive Metal Hydrides.

    SciTech Connect

    Pratt, Joseph William; Cordaro, Joseph Gabriel; Sartor, George B.; Dedrick, Daniel E.; Reeder, Craig L.

    2012-02-01

    In an attempt to mitigate the hazards associated with storing large quantities of reactive metal hydrides, polymer composite materials were synthesized and tested under simulated usage and accident conditions. The composites were made by polymerizing vinyl monomers using free-radical polymerization chemistry, in the presence of the metal hydride. Composites with vinyl-containing siloxane oligomers were also polymerized with and without added styrene and divinyl benzene. Hydrogen capacity measurements revealed that addition of the polymer to the metal hydride reduced the inherent hydrogen storage capacity of the material. The composites were found to be initially effective at reducing the amount of heat released during oxidation. However, upon cycling the composites, the mitigating behavior was lost. While the polymer composites we investigated have mitigating potential and are physically robust, they undergo a chemical change upon cycling that makes them subsequently ineffective at mitigating heat release upon oxidation of the metal hydride.

    Acknowledgements: The authors would like to thank the following people who participated in this project: Ned Stetson (U.S. Department of Energy) for sponsorship and support of the project. Ken Stewart (Sandia) for building the flow-through calorimeter and cycling test stations. Isidro Ruvalcaba, Jr. (Sandia) for qualitative experiments on the interaction of sodium alanate with water. Terry Johnson (Sandia) for sharing his expertise and knowledge of metal hydrides, and sodium alanate in particular. Marcina Moreno (Sandia) for programmatic assistance. John Khalil (United Technologies Research Corp) for insight into the hazards of reactive metal hydrides and real-world accident scenario experiments.

    Summary: In an attempt to mitigate and/or manage hazards associated with storing bulk quantities of reactive metal hydrides, polymer composite materials (a mixture of a mitigating polymer and a metal hydride) were synthesized and tested under simulated usage and accident conditions. Mitigating the hazards associated with reactive metal hydrides during an accident while finding a way to keep the original capability of the active material intact during normal use has been the focus of this work. These composites were made by polymerizing vinyl monomers using free-radical polymerization chemistry, in the presence of the metal hydride, in this case a prepared sodium alanate (chosen as a representative reactive metal hydride). It was found that the polymerization of styrene and divinyl benzene could be initiated using AIBN in toluene at 70 °C. The resulting composite materials can be either hard or brittle solids depending on the cross-linking density. Thermal decomposition of these styrene-based composite materials is lower than neat polystyrene indicating that the chemical nature of the polymer is affected by the formation of the composite. The char-forming nature of cross-linked polystyrene is low and therefore, not an ideal polymer for hazard mitigation. To obtain composite materials containing a polymer with higher char-forming potential, siloxane-based monomers were investigated. Four vinyl-containing siloxane oligomers were polymerized with and without added styrene and divinyl benzene. Like the styrene materials, these composite materials exhibited thermal decomposition behavior significantly different than the neat polymers. Specifically, the thermal decomposition temperature was shifted approximately 100 °C lower than the neat polymer signifying a major chemical change to the polymer network. Thermal analysis of the cycled samples was performed on the siloxane-based composite materials. It was found that after 30 cycles the siloxane-containing polymer composite material has similar TGA/DSC-MS traces as the virgin composite material indicating that the polymer is physically intact upon cycling. Hydrogen capacity measurements revealed that addition of the polymer to the metal hydride in the form of a composite material reduced the inherent hydrogen storage capacity of the material. This

  14. A mitigation program for potentially hazardous long-period comets

    NASA Astrophysics Data System (ADS)

    Boice, Daniel; Huebner, Walter

    A program is being developed by Southwest Research Institute and Los Alamos National Laboratory to avert collisions of potentially hazardous objects (PHOs) with Earth. In addition to developing general mitigation strategies, the program will be expanded to include aggressive mitigation procedures for small (e.g., Tunguska-sized) PHOs and other short warning-time PHOs, such as some long-period comet nuclei. The program will initially concentrate on the most likely and critical cases, namely small objects and long-period comet nuclei with short warning times. In this paper we discuss the threat posed by long-period comets. Although relatively rare, they can be large (sometimes tens of kilometers in size) and fast moving, and their appearance cannot be predicted because of their long orbital periods; for example, comet C/1983 H1 (IRAS-Araki-Alcock) has an orbital period of 963.22 years. It was discovered on 27 April 1983, and passed Earth at a distance of 0.0312 AU on 11 May 1983, only two weeks later. Aggressive methods and continuous alertness will be needed to defend against objects with such short warning times. We summarize results on the anticipated warning times of long-period comets, given advances in the modern telescopic facilities searching for such objects, to present nominal and worst-case scenarios for these potentially hazardous objects.

  15. National Landslide Hazards Mitigation Strategy: A Framework for Loss Reduction

    NSDL National Science Digital Library

    This report outlines the key elements of a comprehensive and effective national strategy for reducing losses from landslides nationwide and provides an assessment of the status, needs, and associated costs of this strategy. Topics include the rising costs resulting from landslide damage, the role of the federal government in mitigating this hazard, and some features of the new strategy such as partnerships among government, academia and the private sector, expanded research, and making use of technological advantages. The material may be downloaded in PDF or plain text format.

  16. Modeling and mitigating natural hazards: Stationarity is immortal!

    NASA Astrophysics Data System (ADS)

    Montanari, Alberto; Koutsoyiannis, Demetris

    2014-12-01

    Environmental change is a matter of considerable concern, as it is occurring at an unprecedented pace and might increase natural hazards. Moreover, it is deemed to imply a reduced representativeness of past experience and data on extreme hydroclimatic events. The latter concern has been epitomized by the statement that "stationarity is dead." Setting up policies for mitigating natural hazards, including those triggered by floods and droughts, is an urgent priority in many countries, and it implies practical activities of management, engineering design, and construction. The latter necessarily need to be properly informed, and therefore the research question on the value of past data is extremely important. We herein argue that there are mechanisms in hydrological systems that are time invariant, and these may need to be interpreted through data inference. In particular, hydrological predictions are based on assumptions, which should include stationarity. In fact, any hydrological model, including deterministic and nonstationary approaches, is affected by uncertainty and therefore should include a random component that is stationary. Given that an unnecessary resort to nonstationarity may imply a reduction of predictive capabilities, a pragmatic approach based on the exploitation of past experience and data is a necessary prerequisite for setting up mitigation policies for environmental risk.
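
    The argument can be stated compactly: any hydrological prediction separates into a deterministic part and a random error, and it is the error that must be treated as stationary for past data to inform the future. In placeholder notation (ours, not the paper's):

    ```latex
    % Schematic of the argument; notation is illustrative, not from the paper.
    \[
    Q(t) = S\big(\mathbf{x}(t);\,\boldsymbol{\theta}\big) + \varepsilon(t),
    \qquad \varepsilon(t)\ \text{a stationary stochastic process},
    \]
    ```

    where Q is the predicted hydrological variable, S a (possibly nonstationary, time-varying-input) deterministic model with parameters θ, and ε the model error whose stationarity makes the prediction statistically meaningful.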

  17. Natural hazards phenomena mitigation with respect to seismic hazards at the Environmental Restoration Disposal Facility

    SciTech Connect

    Reidel, S.P.

    1994-01-06

    This report provides information on the seismic hazard for design of the proposed Environmental Restoration Disposal Facility (ERDF), a facility designed for the disposal of wastes generated during the cleanup of Hanford Site aggregate areas. The preferred ERDF site is located south and east of 200 East and 200 West Areas. The Washington State Groundwater Protection Program (WAC 173-303-806 (4)(a)(xxi)) requires that the characteristics of local and regional hydrogeology be defined. A plan for that work has been developed (Weekes and Borghese 1993). In addition, WAC 173-303-282 provides regulatory guidance on siting a dangerous waste facility, and US Department of Energy (DOE) Order 5480.28 requires consideration of natural phenomena hazards mitigation for DOE sites and facilities. This report provides information to evaluate the ERDF site with respect to seismic hazard. The ERDF will be a Corrective Action Management Unit (CAMU) as defined by 40 CFR 260.10.

  18. Mitigating mountain hazards in Austria - legislation, risk transfer, and awareness building

    NASA Astrophysics Data System (ADS)

    Holub, M.; Fuchs, S.

    2009-04-01

    Embedded in the overall concept of integral risk management, mitigating mountain hazards rests on the pillars of land use regulations, risk transfer, and information. In this paper, aspects of legislation related to natural hazards in Austria are summarised, with a particular focus on spatial planning activities and hazard mapping, and possible adaptations focussing on enhanced resilience are outlined. Furthermore, the system of risk transfer is discussed, highlighting the importance of creating incentives for risk-aware behaviour, above all with respect to individual precaution and insurance solutions. The issue of creating awareness through information is therefore essential and is presented subsequently. The study results in recommendations on how administrative units on different federal and local levels could increase the enforcement of regulations related to the minimisation of natural hazard risk. Moreover, the nexus to risk transfer mechanisms is provided, focusing on the current compensation system in Austria and some possible adjustments in order to provide economic incentives for (private) investments in mitigation measures, i.e. local structural protection. These incentives should be supported by delivering hazard and risk information in a target-oriented way to every stakeholder involved. Therefore, coping strategies have to be adjusted and the interaction between prevention and precaution has to be highlighted. The paper closes with recommendations on how these efforts could be achieved, with a particular focus on the situation in the Republic of Austria.

  19. ANALYSIS AND MITIGATION OF X-RAY HAZARD GENERATED FROM HIGH INTENSITY LASER-TARGET INTERACTIONS

    SciTech Connect

    Qiu, R.; Liu, J.C.; Prinz, A.A.; Rokni, S.H.; Woods, M.; Xia, Z. [SLAC]

    2011-03-21

    Interaction of a high intensity laser with matter may generate an ionizing radiation hazard. Very limited studies have been made, however, on the laser-induced radiation protection issue. This work reviews available literature on the physics and characteristics of laser-induced X-ray hazards. Important aspects include the laser-to-electron energy conversion efficiency, electron angular distribution, electron energy spectrum and effective temperature, and bremsstrahlung production of X-rays in the target. The possible X-ray dose rates for several femtosecond Ti:sapphire laser systems used at SLAC, including the short pulse laser system for the Matter in Extreme Conditions Instrument (peak power 4 TW and peak intensity 2.4 x 10^18 W/cm^2), were analysed. A graded approach to mitigate the laser-induced X-ray hazard with a combination of engineered and administrative controls is also proposed.
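
    For orientation, the hot-electron effective temperature that drives bremsstrahlung X-ray production is often estimated with the ponderomotive (Wilks) scaling; the form below is the commonly quoted one and is not necessarily the exact expression adopted in the paper:

    ```latex
    % Commonly quoted ponderomotive (Wilks) scaling for the hot-electron temperature;
    % I in W/cm^2, wavelength in micrometers.
    \[
    T_{\mathrm{hot}} \approx 0.511\,\mathrm{MeV}
    \left(\sqrt{1 + \frac{I\,\lambda_{\mu\mathrm{m}}^{2}}
    {1.37\times10^{18}\ \mathrm{W\,cm^{-2}\,\mu m^{2}}}}\; - \;1\right).
    \]
    ```

    For the quoted MEC peak intensity of 2.4 x 10^18 W/cm^2 at a Ti:sapphire wavelength of about 0.8 micrometers, this gives a hot-electron temperature of roughly 0.2 MeV, which is why bremsstrahlung from the target, rather than the laser light itself, sets the radiological hazard.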

  20. Standards and Guidelines for Numerical Models for Tsunami Hazard Mitigation

    NASA Astrophysics Data System (ADS)

    Titov, V.; Gonzalez, F.; Kanoglu, U.; Yalciner, A.; Synolakis, C. E.

    2006-12-01

    An increasing number of nations around the world need to develop tsunami mitigation plans, which invariably involve inundation maps for warning guidance and evacuation planning. There is the risk that inundation maps may be produced with older or untested methodology, as there are currently no standards for modeling tools. In the aftermath of the 2004 megatsunami, some models were used to model inundation for Cascadia events with results much larger than sediment records and existing state-of-the-art studies suggest, leading to confusion among emergency managers. Incorrectly assessing tsunami impact is hazardous, as recent events in 2006 in Tonga, Kythira (Greece), and Central Java have suggested (Synolakis and Bernard, 2006). To calculate tsunami currents, forces, and runup on coastal structures, and inundation of coastlines, one must calculate the evolution of the tsunami wave from the deep ocean to its target site numerically. No matter what the numerical model, validation (the process of ensuring that the model solves the parent equations of motion accurately) and verification (the process of ensuring that the model used represents geophysical reality appropriately) are both essential. Validation ensures that the model performs well in a wide range of circumstances and is accomplished through comparison with analytical solutions. Verification ensures that the computational code performs well over a range of geophysical problems. A few analytic solutions have themselves been validated with laboratory data. Even fewer existing numerical models have been both validated with the analytical solutions and verified with both laboratory measurements and field measurements, thus establishing a gold standard for numerical codes for inundation mapping. While there is in principle no absolute certainty that a numerical code that has performed well in all the benchmark tests will also produce correct inundation predictions with any given source motions, validated codes reduce the level of uncertainty in their results to the uncertainty in the geophysical initial conditions. Further, when coupled with real-time free-field tsunami measurements from tsunameters, validated codes are the only choice for realistic forecasting of inundation; the consequences of failure are too ghastly to take chances with numerical procedures that have not been validated. We discuss a ten-step process of benchmark tests for models used for inundation mapping. The associated methodology and algorithms have to be validated first with analytical solutions, then verified with laboratory measurements and field data. The models need to be published in the scientific literature in peer-reviewed journals indexed by ISI. While this process may appear onerous, it reflects our state of knowledge, and is the only defensible methodology when human lives are at stake. Synolakis, C.E., and Bernard, E.N., Tsunami science before and beyond Boxing Day 2004, Phil. Trans. R. Soc. A 364(1845), 2231-2263, 2005.

  1. Mitigation of unconfined releases of hazardous gases via liquid spraying

    SciTech Connect

    Fthenakis, V.M.

    1997-02-01

    The capability of water sprays in mitigating clouds of hydrofluoric acid (HF) has been demonstrated in the large-scale field experiments of Goldfish and Hawk, which took place at the DOE Nevada Test Site. The effectiveness of water sprays and fire water monitors in removing HF from a vapor plume has also been studied theoretically using the model HGSPRAY5, with the near-field and far-field dispersion described by the HGSYSTEM models. This paper presents options to select and evaluate liquid spraying systems, based on industry experience and mathematical modeling.

  2. Examining Local Jurisdictions' Capacity and Commitment For Hazard Mitigation Policies and Strategies along the Texas Coast

    E-print Network

    Husein, Rahmawati

    2012-07-16

    and development regulations can be effectively used for hazard mitigation, particularly if they are backed by state planning mandates (Berke & French, 1994; Berke et al., 1996; Burby & Dalton, 1994; Burby & May, 1997; Godschalk et al., 1999). However, most...

  3. The seismic project of the National Tsunami Hazard Mitigation Program

    USGS Publications Warehouse

    Oppenheimer, D.H.; Bittenbinder, A.N.; Bogaert, B.M.; Buland, R.P.; Dietz, L.D.; Hansen, R.A.; Malone, S.D.; McCreery, C.S.; Sokolowski, T.J.; Whitmore, P.M.; Weaver, C.S.

    2005-01-01

    In 1997, the Federal Emergency Management Agency (FEMA), National Oceanic and Atmospheric Administration (NOAA), U.S. Geological Survey (USGS), and the five western states of Alaska, California, Hawaii, Oregon, and Washington joined in a partnership called the National Tsunami Hazard Mitigation Program (NTHMP) to enhance the quality and quantity of seismic data provided to the NOAA tsunami warning centers in Alaska and Hawaii. The NTHMP funded a seismic project that now provides the warning centers with real-time seismic data over dedicated communication links and the Internet from regional seismic networks monitoring earthquakes in the five western states, the U.S. National Seismic Network in Colorado, and from domestic and global seismic stations operated by other agencies. The goal of the project is to reduce the time needed to issue a tsunami warning by providing the warning centers with high-dynamic range, broadband waveforms in near real time. An additional goal is to reduce the likelihood of issuing false tsunami warnings by rapidly providing to the warning centers parametric information on earthquakes that could indicate their tsunamigenic potential, such as hypocenters, magnitudes, moment tensors, and shake distribution maps. New or upgraded field instrumentation was installed over a 5-year period at 53 seismic stations in the five western states. Data from these instruments have been integrated into the seismic network utilizing Earthworm software. This network has significantly reduced the time needed to respond to teleseismic and regional earthquakes. Notably, the West Coast/Alaska Tsunami Warning Center responded to the 28 February 2001 Mw 6.8 Nisqually earthquake beneath Olympia, Washington, within 2 minutes, compared to an average response time of over 10 minutes for the previous 18 years. © Springer 2005.

  4. Numerical and probabilistic analysis of asteroid and comet impact hazard mitigation

    SciTech Connect

    Plesko, Catherine S [Los Alamos National Laboratory; Weaver, Robert P [Los Alamos National Laboratory; Huebner, Walter F [Los Alamos National Laboratory

    2010-09-09

    The possibility of asteroid and comet impacts on Earth has received significant recent media and scientific attention. Still, there are many outstanding questions about the correct response once a potentially hazardous object (PHO) is found. Nuclear munitions are often suggested as a deflection mechanism because they have a high internal energy per unit launch mass. However, major uncertainties remain about the use of nuclear munitions for hazard mitigation. There are large uncertainties in a PHO's physical response to a strong deflection or dispersion impulse like that delivered by nuclear munitions. Objects smaller than 100 m may be solid, and objects of all sizes may be 'rubble piles' with large porosities and little strength. Objects with these different properties would respond very differently, so the effects of object properties must be accounted for. Recent ground-based observations and missions to asteroids and comets have improved the planetary science community's understanding of these objects. Computational power and simulation capabilities have improved such that it is possible to numerically model the hazard mitigation problem from first principles. Before we can know whether an explosive yield Y at height h above, or depth h below, the target surface will produce a given momentum change in, or dispersion of, a PHO, we must quantify energy deposition into the system of particles that make up the PHO. Here we present the initial results of a parameter study in which we model the efficiency of energy deposition from a stand-off nuclear burst onto targets made of PHO constituent materials.
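    The parameter study described above requires radiation-hydrodynamics and particle-transport codes; purely as a geometric illustration of why stand-off distance matters for energy coupling, the sketch below computes the fraction of isotropically radiated burst energy that is geometrically intercepted by a spherical target. The target radius and stand-off heights are hypothetical.

```python
import math

def intercepted_fraction(target_radius_m: float, standoff_height_m: float) -> float:
    """Fraction of isotropically radiated energy intercepted by a spherical
    target of radius R from a point source at height h above its surface
    (D = R + h from the centre): f = (1 - sqrt(1 - (R/D)**2)) / 2."""
    D = target_radius_m + standoff_height_m
    return 0.5 * (1.0 - math.sqrt(1.0 - (target_radius_m / D) ** 2))

# Illustrative scan: a 150 m radius object and a range of stand-off heights.
for h in (10.0, 50.0, 150.0, 500.0):
    print(f"stand-off {h:6.0f} m -> intercepted fraction "
          f"{intercepted_fraction(150.0, h):.3f}")
```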

  5. Next-Generation GPS Station for Hazards Mitigation (Invited)

    NASA Astrophysics Data System (ADS)

    Bock, Y.

    2013-12-01

    Our objective is to better forecast, assess, and mitigate natural hazards, including earthquakes, tsunamis, and extreme storms and flooding through development and implementation of a modular technology for the next-generation in-situ geodetic station to support the flow of information from multiple stations to scientists, mission planners, decision makers, and first responders. The same technology developed under NASA funding can be applied to enhance monitoring of large engineering structures such as bridges, hospitals and other critical infrastructure. Meaningful warnings save lives when issued within 1-2 minutes for destructive earthquakes, several tens of minutes for tsunamis, and up to several hours for extreme storms and flooding, and can be provided by on-site fusion of multiple data types and generation of higher-order data products: GPS/GNSS and accelerometer measurements to estimate point displacements, and GPS/GNSS and meteorological measurements to estimate moisture variability in the free atmosphere. By operating semi-autonomously, each station can then provide low-latency, high-fidelity and compact data products within the constraints of narrow communications bandwidth that often accompanies natural disasters. We have developed a power-efficient, low-cost, plug-in Geodetic Module for fusion of data from in situ sensors including GPS, a strong-motion accelerometer module, and a meteorological sensor package, for deployment at existing continuous GPS stations in southern California; fifteen stations have already been upgraded. The low-cost modular design is scalable to the many existing continuous GPS stations worldwide. New on-the-fly data products are estimated with 1 mm precision and accuracy, including three-dimensional seismogeodetic displacements for earthquake, tsunami and structural monitoring and precipitable water for forecasting extreme weather events such as summer monsoons and atmospheric rivers experienced in California. Unlike more traditional approaches where data are collected and analyzed from a network of stations at a central processing facility, we are embedding these capabilities in the Geodetic Module's processor for in situ analysis and data delivery through TCP/IP to avoid single points of failure during emergencies. We are infusing our technology to several local and state groups, including the San Diego County Office of Emergency Services for earthquake and tsunami early warnings, UC San Diego Health Services for hospital monitoring and early warning, Caltrans for bridge monitoring, and NOAA's Weather Forecasting Offices in San Diego and Los Angeles Counties for forecasting extreme weather events. We describe our overall system and the ongoing efforts at technology infusion.
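    On-site fusion of GPS/GNSS displacement and accelerometer data of the kind described above is commonly done with a Kalman filter. The single-axis sketch below is a minimal illustration, not the authors' operational algorithm: it assumes, for simplicity, that both sensors are sampled at the same rate, and all noise levels and the synthetic data are made up.

```python
import numpy as np

def fuse_gps_accel(gps_disp, accel, dt, sigma_gps=0.01, sigma_acc=0.1):
    """Minimal loosely coupled Kalman filter: the accelerometer drives the
    prediction and the GPS displacement (metres) corrects it.
    State x = [displacement, velocity]."""
    A = np.array([[1.0, dt], [0.0, 1.0]])   # constant-velocity transition
    B = np.array([0.5 * dt**2, dt])         # acceleration input mapping
    H = np.array([[1.0, 0.0]])              # GPS observes displacement only
    Q = sigma_acc**2 * np.outer(B, B)       # process noise from accel noise
    R = np.array([[sigma_gps**2]])          # GPS measurement noise
    x, P, out = np.zeros(2), np.eye(2), []
    for z, a in zip(gps_disp, accel):
        x = A @ x + B * a                   # predict with accelerometer sample
        P = A @ P @ A.T + Q
        y = z - H @ x                       # update with GPS sample
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        x = x + (K @ y).ravel()
        P = (np.eye(2) - K @ H) @ P
        out.append(x[0])
    return np.array(out)

# Synthetic 100 Hz example with a 2 cm step displacement at t = 0.5 s.
dt = 0.01
t = np.arange(0.0, 1.0, dt)
true_disp = 0.02 * (t > 0.5)
gps = true_disp + np.random.normal(0.0, 0.01, t.size)
acc = np.gradient(np.gradient(true_disp, dt), dt) + np.random.normal(0.0, 0.1, t.size)
fused = fuse_gps_accel(gps, acc, dt)
```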

  6. Evaluating fuel complexes for fire hazard mitigation planning in the southeastern United States.

    SciTech Connect

    Andreu, Anne G.; Shea, Dan; Parresol, Bernard R.; Ottmar, Roger D.

    2012-01-01

    Fire hazard mitigation planning requires an accurate accounting of fuel complexes to predict potential fire behavior and effects of treatment alternatives. In the southeastern United States, rapid vegetation growth coupled with complex land use history and forest management options requires a dynamic approach to fuel characterization. In this study we assessed potential surface fire behavior with the Fuel Characteristic Classification System (FCCS), a tool which uses inventoried fuelbed inputs to predict fire behavior. Using inventory data from 629 plots established in the upper Atlantic Coastal Plain, South Carolina, we constructed FCCS fuelbeds representing median fuel characteristics by major forest type and age class. With a dry fuel moisture scenario and 6.4 km/h midflame wind speed, the FCCS predicted moderate to high potential fire hazard for the majority of the fuelbeds under study. To explore fire hazard under potential future fuel conditions, we developed fuelbeds representing the range of quantitative inventory data for fuelbed components that drive surface fire behavior algorithms and adjusted shrub species composition to represent 30% and 60% relative cover of highly flammable shrub species. Results indicate that the primary drivers of surface fire behavior vary by forest type, age, and surface fire behavior rating. Litter tends to be a primary or secondary driver in most forest types. In comparison to other surface fire contributors, reducing shrub loading results in reduced flame lengths most consistently across forest types. FCCS fuelbeds and the results from this project can be used for fire hazard mitigation planning throughout the southern Atlantic Coastal Plain where similar forest types occur. The approach of building simulated fuelbeds across the range of available surface fuel data produces sets of incrementally different fuel characteristics that can be applied to any dynamic forest types in which surface fuel conditions change rapidly.
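    The FCCS has its own fire behavior algorithms; purely as a generic illustration of how flame length scales with fuel loading and spread rate, the sketch below uses Byram's classical fireline-intensity and flame-length relations. The heat content, fuel loads, and spread rate are illustrative numbers, not values from the study.

```python
def byram_intensity(heat_kj_per_kg: float, fuel_kg_per_m2: float,
                    spread_m_per_s: float) -> float:
    """Byram's fireline intensity I = H * w * r, in kW/m."""
    return heat_kj_per_kg * fuel_kg_per_m2 * spread_m_per_s

def byram_flame_length(intensity_kw_per_m: float) -> float:
    """Byram's flame-length relation L = 0.0775 * I**0.46, in metres."""
    return 0.0775 * intensity_kw_per_m ** 0.46

# Hypothetical comparison of a baseline and a reduced shrub load.
for label, fuel_load in (("baseline shrub load", 0.6), ("reduced shrub load", 0.3)):
    I = byram_intensity(heat_kj_per_kg=18000.0, fuel_kg_per_m2=fuel_load,
                        spread_m_per_s=0.05)
    print(f"{label}: I = {I:.0f} kW/m, flame length = {byram_flame_length(I):.2f} m")
```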

  7. How much do hazard mitigation plans cost? An analysis of federal grant data.

    PubMed

    Jackman, Andrea M; Beruvides, Mario G

    2013-01-01

    Under the Disaster Mitigation Act of 2000 and the Federal Emergency Management Agency's subsequent Interim Final Rule, the requirement was placed on local governments to author and gain approval for a Hazard Mitigation Plan (HMP) for the areas under their jurisdiction. Low completion percentages for HMPs--less than one-third of eligible governments--were found by an analysis conducted 3 years after the legislation's final deadline. Follow-up studies showed little improvement at 5 and 8 years after the deadline. It was hypothesized that the cost of an HMP is a significant factor in determining whether or not a plan is completed. A study was conducted using Boolean Matrix Analysis methods to determine what, if any, characteristics of a community most influence the cost of an HMP. The frequency of natural hazards experienced by the planning area, the number of jurisdictions participating in the HMP, the population, and the population density were found to significantly affect cost. These variables were used in a regression analysis to determine their predictive power for cost. It was found that, along with two interaction terms, the variables explain approximately half of the variation in HMP cost. PMID:24303771
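    A minimal sketch of the kind of multiple regression with interaction terms described above, using ordinary least squares on synthetic data; the predictor names follow the abstract, but the data, coefficients, and interaction choices are hypothetical, not the study's grant data.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
freq = rng.poisson(3, n).astype(float)            # natural hazard frequency
juris = rng.integers(1, 15, n).astype(float)      # participating jurisdictions
pop = rng.lognormal(10, 1, n)                     # population
dens = rng.lognormal(5, 1, n)                     # population density

# Synthetic plan cost used only to exercise the fit (not real data).
cost = (20000 + 4000 * freq + 2500 * juris + 0.05 * pop + 10 * dens
        + 0.002 * freq * pop + rng.normal(0, 15000, n))

# Design matrix: intercept, main effects, and two interaction terms.
X = np.column_stack([np.ones(n), freq, juris, pop, dens, freq * pop, juris * dens])
beta, *_ = np.linalg.lstsq(X, cost, rcond=None)
pred = X @ beta
r2 = 1.0 - np.sum((cost - pred) ** 2) / np.sum((cost - cost.mean()) ** 2)
print("coefficients:", np.round(beta, 4))
print(f"R^2 = {r2:.2f}")
```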

  8. Assessing the costs of hazard mitigation through landscape interventions in the urban structure

    NASA Astrophysics Data System (ADS)

    Bostenaru-Dan, Maria; Aldea Mendes, Diana; Panagopoulos, Thomas

    2014-05-01

    In this paper we examine an issue that is rarely approached: the economic efficiency of natural hazard risk mitigation. The urban scale at which a natural hazard can strike makes urban planning strategy important in risk management. However, the topic is usually addressed by the natural, engineering, and social sciences, and the role of architecture and urban planning is neglected. Climate change can increase risks related to floods, desertification, and sea level rise, among others. Reducing sealed surfaces in cities through green spaces in crowded centres can mitigate these risks and can be foreseen in restructuring plans in the presence or absence of disasters. For this purpose we reviewed the role in games of green spaces and community centres such as churches, which can form the core of restructuring efforts, as our field and archive studies also show. We consider how ICT can help organize information, from the building survey to economic computations, either through direct modeling or through games. The roles of game theory, agent-based modeling, networks, and urban public policies in designing decision systems for risk management are discussed. The game rules are supported by our field and archive studies, as well as by research by design. We also consider a rarely addressed element, the role of landscape planning, through the inclusion of green elements in reconstruction after natural and man-made disasters, or in restructuring efforts to mitigate climate change. Apart from the existing old city fabric, the landscape can also be endangered by speculation, and it is therefore vital to highlight its high economic value in this particular case as well. As ICOMOS highlights for the 2014 congress, heritage and landscape are two sides of the same coin. A landscape can become, or be connected to, a community centre, the first being necessary for building a settlement and the second raising its value, or it can build connections between landmarks along urban routes. For this reason, location plays a role not only in mitigating the effects of hazards but also in increasing the value of land through its vicinities. Games are simply another way to build a model of the complex system that is the urban organism, and a model is easier to analyse than the system itself while still displaying its basic rules. The role of landscape in building roads of memory between landmarks in reconstruction is yet to be investigated in a future proposed COST action.

  9. The price of safety: costs for mitigating and coping with Alpine hazards

    NASA Astrophysics Data System (ADS)

    Pfurtscheller, C.; Thieken, A. H.

    2013-10-01

    Due to limited public budgets and the need to economize, the analysis of the costs of hazard mitigation and emergency management of natural hazards becomes increasingly important for public natural hazard and risk management. In recent years there has been a growing body of literature on the estimation of losses, which has helped to determine the benefits of measures in terms of prevented losses. In contrast, the costs of mitigation are hardly addressed. This paper thus aims to shed some light on expenses for mitigation and emergency services. For this, we analysed the annual costs of mitigation efforts in four regions/countries of the Alpine Arc: Bavaria (Germany), Tyrol (Austria), South Tyrol (Italy), and Switzerland. On the basis of PPP values (purchasing power parities), annual expenses on public safety ranged from EUR 44 per capita in the Free State of Bavaria to EUR 216 in the Autonomous Province of South Tyrol. To analyse the (variable) costs of emergency services in case of an event, we used detailed data from the 2005 floods in the Federal State of Tyrol (Austria) as well as aggregated data from the 2002 floods in Germany. The analysis revealed that multi-hazards, i.e. the occurrence and intermixture of different natural hazard processes, contribute to increasing emergency costs. Based on these findings, research gaps and recommendations for costing Alpine natural hazards are discussed.
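    A minimal sketch of the PPP-normalised per-capita comparison used above: divide an annual expense by a purchasing-power-parity index and by population. The expense figures, index values, and populations are hypothetical, not the paper's data.

```python
def per_capita_ppp(annual_expense_eur: float, ppp_index: float,
                   population: int) -> float:
    """Annual mitigation expense per capita, adjusted to a common price
    level by dividing through a purchasing-power-parity index."""
    return (annual_expense_eur / ppp_index) / population

# Hypothetical regions A and B (illustrative figures only).
print(f"region A: EUR {per_capita_ppp(5.0e8, 1.05, 7_000_000):.0f} per capita")
print(f"region B: EUR {per_capita_ppp(1.1e8, 0.95, 510_000):.0f} per capita")
```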

  10. Spatio-temporal patterns of hazards and their use in risk assessment and mitigation. Case study of road accidents in Romania

    NASA Astrophysics Data System (ADS)

    Catalin Stanga, Iulian

    2013-04-01

    Road accidents are among the leading causes of death in many countries, partly as an inherent consequence of the increasing mobility of today's society. The World Health Organization estimates that 1.3 million people died in road accidents in 2011, which means 186 deaths per million. The tragic picture is completed by the millions of people experiencing physical injuries and by the enormous social and economic costs that these events imply. Romania has one of the most unsafe road networks in the European Union, with annual averages of 9,400 accidents, 8,300 injuries, and almost 2,680 fatalities (2007-2012). An average of 141 deaths per million is more than twice the average fatality rate in the European Union (about 60 deaths per million). Other specific indicators (accidents or fatalities relative to road length, vehicle fleet size, driving licence holders, adult population, etc.) are even worse in the same European context. Road accidents are caused by a complex series of factors, some of them relatively constant premises, while others act as catalysts or triggers: road features and quality, vehicle technical state, weather conditions, human-related factors, etc. All these lead to a complex equation with too many unknown variables, making a purely probabilistic approach almost impossible. However, the high concentration of accidents in a region or on certain road sectors is caused by a specific context, created by factors with a permanent or repetitive character, and leads to the idea of spatial autocorrelation between the locations of adjoining accidents. In the same way, the increasing frequency of road accidents, and the repeatability of their causes in different periods of the year, makes it possible to identify black timeframes with a higher incidence of road accidents. Identifying and analyzing road blackspots (hotspots) and black zones would help improve road safety by acting against the common causes that create the spatial or temporal clustering of accidents. Since the 1990s, Geographical Information Systems (GIS) have become a very important tool for traffic and road safety management, allowing not only spatial and multifactorial analysis but also graphical and non-graphical outputs. The current paper presents an accessible GIS methodology for studying the spatio-temporal pattern of injury-related road accidents, identifying high-density accident zones, performing cluster analysis, creating multicriterial typologies, and identifying and explaining spatial and temporal similarities. For this purpose, a Geographical Information System was created, allowing a complex analysis that involves not only the events but also a large set of interrelated and spatially linked attributes. The GIS includes the accidents as georeferenced point elements with a spatially linked attribute database: identification information (date, location details); accident type; main, secondary, and aggravating causes; data about the driver; vehicle information; and consequences (damages, injured people, and fatalities). Each attribute has its own numeric code that allows both statistical analysis and spatial interrogation. The database includes those road accidents that led to physical injuries and loss of human lives between 2007 and 2012, and the spatial analysis was carried out using TNTmips 7.3 software. Data aggregation and processing allowed the creation of a spatial pattern of injury-related road accidents through kernel density estimation at three different levels (national - Romania; county level - Iasi County; local level - Iasi town). Spider graphs were used to create the temporal pattern of road accidents at three levels (daily, weekly, and monthly) directly related to their causes. Moreover, the spatial and temporal database relates the natural hazards (glazed frost, fog, and blizzard) to the human-made ones, giving the opportunity to evaluate the nature of uncertainties in risk assessment. Finally, this paper provides a clustering methodology based on several environmenta
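    As an illustration of the kernel density estimation step described above, the sketch below builds a density surface over synthetic, projected accident coordinates and reports the strongest hotspot cell. The coordinates and clusters are made up, and the paper's own workflow used TNTmips rather than Python.

```python
import numpy as np
from scipy.stats import gaussian_kde

# Hypothetical georeferenced accident locations (projected coordinates, km).
rng = np.random.default_rng(1)
cluster_a = rng.normal(loc=(10.0, 20.0), scale=0.5, size=(300, 2))
cluster_b = rng.normal(loc=(14.0, 22.0), scale=1.0, size=(200, 2))
points = np.vstack([cluster_a, cluster_b]).T   # shape (2, n) for gaussian_kde

# Kernel density estimate on a regular grid, analogous to an accident
# density surface used to delineate blackspots.
kde = gaussian_kde(points)
xg, yg = np.mgrid[8:16:200j, 18:24:200j]
density = kde(np.vstack([xg.ravel(), yg.ravel()])).reshape(xg.shape)

i, j = np.unravel_index(np.argmax(density), density.shape)
print(f"strongest hotspot near x = {xg[i, j]:.2f} km, y = {yg[i, j]:.2f} km")
```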

  11. Looking before we leap: an ongoing, quantative investigation of asteroid and comet impact hazard mitigation

    SciTech Connect

    Plesko, Catherine S [Los Alamos National Laboratory; Weaver, Robert P [Los Alamos National Laboratory; Bradley, Paul A [Los Alamos National Laboratory; Huebner, Walter F [Los Alamos National Laboratory

    2010-01-01

    There are many outstanding questions about the correct response to an asteroid or comet impact threat on Earth. Nuclear munitions are currently thought to be the most efficient method of delivering an impact-preventing impulse to a potentially hazardous object (PHO). However, there are major uncertainties about the response of PHOs to a nuclear burst, and the most appropriate ways to use nuclear munitions for hazard mitigation.

  12. Fourth DOE Natural Phenomena Hazards Mitigation Conference: Proceedings. Volume 1

    SciTech Connect

    Not Available

    1993-12-31

    This conference allowed an interchange in the natural phenomena area among designers, safety professionals, and managers. The papers presented in Volume I of the proceedings are from sessions I - VIII which cover the general topics of: DOE standards, lessons learned and walkdowns, wind, waste tanks, ground motion, testing and materials, probabilistic seismic hazards, risk assessment, base isolation and energy dissipation, and lifelines and floods. Individual papers are indexed separately. (GH)

  13. Experimentally Benchmarked Numerical Approaches to Lightning Hazard Assessment and Mitigation

    NASA Astrophysics Data System (ADS)

    Jones, Malcolm; Newton, David

    2013-04-01

    A natural hazard that has been with us since the beginning of time is the lightning strike. It represents a direct hazard not only to humans but also to the facilities they work within and the products they produce. The latter categories are of particular concern when they are related to potentially hazardous processes and products. For this reason, experimental and numerical modelling techniques are developed to understand the nature of the hazards, to develop appropriate protective approaches which can be put in place, and finally to gain assurance that the overall risks fall within nationally and internationally accepted standards and those appropriate to the special nature of the work. The latter is of particular importance when the processes and the products within such facilities have a potential susceptibility to lightning strike and where failure is deemed unacceptable. This paper covers examples of the modelling approaches applied to such facilities within which high consequence operations take place, together with the protection that is required for high consequence products. In addition, examples are given of how the numerical techniques are benchmarked by supporting experimental programmes. Not only should such a safety rationale be laid down and agreed early for these facilities and products, but it must also be maintained during the inevitable changes that will occur during the design, development, production and maintenance phases. For example, an 'improvement', as seen by a civil engineer or a facility manager, may well turn out to be detrimental to lightning safety. Constant vigilance is key to ensuring the maintenance of safety.

  14. Department of Energy Natural Phenomena Hazards Mitigation Program

    SciTech Connect

    Murray, R.C.

    1993-09-01

    This paper will present a summary of past and present accomplishments of the Natural Phenomena Hazards Program that has been ongoing at Lawrence Livermore National Laboratory since 1975. The natural phenomena covered include earthquakes; winds, hurricanes, and tornadoes; flooding and precipitation; lightning; and volcanic events. The work is organized into four major areas: (1) policy, requirements, standards, and guidance; (2) technical support, research, and development; (3) technology transfer; and (4) oversight.

  15. Avalanche hazards and mitigation in Austria: a review

    Microsoft Academic Search

    Peter Höller

    2007-01-01

    At all times natural hazards like torrents or avalanches pose a threat to settlements and infrastructures in the Austrian Alps. Since 1950 more than 1,600 persons have been killed by avalanches in Austria, which is on average approximately 30 fatalities per year. In particular, the winter periods 1950/1951 and 1953/1954 stand out with more than 100 fatalities. Those events led

  16. A comparison of fire hazard mitigation alternatives in pinyon–juniper woodlands of Arizona

    Microsoft Academic Search

    David W. Huffman; Peter Z. Fulé; Joseph E. Crouse; Kristen M. Pearson

    2009-01-01

    Concern over uncontrollable wildfire in pinyon–juniper woodlands has led public land managers in the southwestern United States to seek approaches for mitigating wildfire hazard, yet little information is available concerning effectiveness and ecological responses of alternative treatments. We established a randomized block experiment at a pinyon–juniper site in northern Arizona and tested effects of no treatment (Control), thinning only (Thin),

  17. Mitigation of the heat island effect in urban New Jersey. Environmental Hazards 6 (2005) 39-49

    E-print Network

    ...more often in low-density suburban and rural areas, such as tree canopies, grass, and fields. Montclair State University, USA; Barnard College, USA. Abstract: Implementation of urban heat island (UHI

  18. Assessment of indirect losses and costs of emergency for project planning of alpine hazard mitigation

    NASA Astrophysics Data System (ADS)

    Amenda, Lisa; Pfurtscheller, Clemens

    2013-04-01

    Owing to increased settlement in hazardous areas and increased asset values, natural disasters such as floods, landslides, and rockfalls cause high economic losses in Alpine lateral valleys. Especially in small municipalities, indirect losses, mainly stemming from a breakdown of transport networks, and costs of emergency can reach critical levels. A quantification of these losses is necessary to estimate the worthiness of mitigation measures, to determine the appropriate level of disaster assistance, and to improve risk management strategies. Comprehensive approaches are available for assessing direct losses. However, indirect losses and costs of emergency are widely not assessed, and the empirical basis for estimating these costs is weak. To address the resulting uncertainties of project appraisals, a standardized methodology has been developed dealing with issues of local economic effects and emergency efforts needed. In our approach, the cost-benefit analysis for technical mitigation of the Austrian Torrent and Avalanche Control (TAC) will be optimized and extended using the 2005 debris flow as a design event, which struck a small town in the upper Inn valley in southwest Tyrol (Austria). There, 84 buildings were affected, 430 people were evacuated, and in response the TAC implemented protection measures for 3.75 million Euros. Upgrading the method of the TAC and analyzing to what extent the cost-benefit ratio changes is one of the main objectives of this study. For estimating short-run indirect effects and costs of emergency at the local level, data were collected via questionnaires, field mapping, and guided interviews, as well as intensive literature research. On this basis, up-to-date calculation methods were developed and the cost-benefit analysis of the TAC was recalculated with these newly implemented results. The cost-benefit ratio will be more precise and specific, and hence so will the decision about which mitigation alternative to carry out. Based on this, the worthiness of the mitigation measures can be determined in more detail and the proper level of emergency assistance can be calculated more adequately. This study will create a better data basis for evaluating technical and non-technical mitigation measures, which is useful for government agencies, insurance companies, and research.
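    As a minimal illustration of the cost-benefit logic referred to above, the sketch below discounts hypothetical annual avoided losses (direct, indirect, and emergency costs) over a planning horizon and divides by an up-front mitigation cost. The discount rate, horizon, and all monetary figures are illustrative assumptions, not TAC or study values.

```python
def present_value(annual_amount: float, rate: float, years: int) -> float:
    """Present value of a constant annual amount over a planning horizon."""
    return annual_amount * (1.0 - (1.0 + rate) ** -years) / rate

def benefit_cost_ratio(avoided_direct: float, avoided_indirect: float,
                       avoided_emergency: float, mitigation_cost: float,
                       rate: float = 0.03, years: int = 50) -> float:
    """Ratio of discounted avoided losses to the cost of the measure."""
    annual_benefit = avoided_direct + avoided_indirect + avoided_emergency
    return present_value(annual_benefit, rate, years) / mitigation_cost

# Hypothetical annual avoided losses (EUR) against a 3.75 million EUR measure.
print(f"B/C = {benefit_cost_ratio(120_000, 40_000, 25_000, 3_750_000):.2f}")
```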

  19. Linear Aerospike SR-71 Experiment (LASRE): Aerospace Propulsion Hazard Mitigation Systems

    E-print Network

    Masashi Mizukami; Griffin P. Corpening; Ronald J. Ray; Neal Hass; Kimberly A. Ennix; Scott M. Lazaroff

    1998-01-01

    A major hazard posed by the propulsion system of hypersonic and space vehicles is the possibility of fire or explosion in the vehicle environment. The hazard is mitigated by minimizing or detecting, in the vehicle environment, the three ingredients essential to producing fire: fuel, oxidizer, and an ignition source. The Linear Aerospike SR-71 Experiment (LASRE) consisted of a linear aerospike rocket engine integrated into one-half of an X-33-like lifting body shape, carried on top of an SR-71 aircraft. Gaseous hydrogen and liquid oxygen were used as propellants. Although LASRE is a one-of-a-kind experimental system, it must be rated for piloted flight, so this test presented a unique challenge. To help meet safety requirements, the following propulsion hazard mitigation systems were incorporated into the experiment: pod inert purge, oxygen sensors, a hydrogen leak detection algorithm, hydrogen sensors, fire detection and pod temperature thermocouples, water misting, and control room di...

  20. Mitigation of EMU Cut Glove Hazard from Micrometeoroid and Orbital Debris Impacts on ISS Handrails

    NASA Technical Reports Server (NTRS)

    Ryan, Shannon; Christiansen, Eric L.; Davis, Bruce A.; Ordonez, Erick

    2009-01-01

    Recent cut damages sustained on crewmember gloves during extravehicular activity (EVA) onboard the International Space Station (ISS) have been caused by contact with sharp edges or a pinch point, according to analysis of the damages. One potential source is protruding sharp-edged crater lips from micrometeoroid and orbital debris (MMOD) impacts on metallic handrails along EVA translation paths. A number of hypervelocity impact tests were performed on ISS handrails and found that mm-sized projectiles were capable of inducing crater lip heights two orders of magnitude above the minimum value for glove abrasion concerns. Two techniques were evaluated for mitigating the cut glove hazard of MMOD impacts on ISS handrails: flexible overwraps, which act to limit contact between crewmember gloves and impact sites; and alternate materials, which form less hazardous impact crater profiles. In parallel with redesign efforts to increase the cut resilience of EMU gloves, the modifications to ISS handrails evaluated in this study provide the means to significantly reduce cut glove risk from MMOD impact craters.

  1. The U.S. National Tsunami Hazard Mitigation Program: Successes in Tsunami Preparedness

    NASA Astrophysics Data System (ADS)

    Whitmore, P.; Wilson, R. I.

    2012-12-01

    Formed in 1995 by Congressional action, the National Tsunami Hazard Mitigation Program (NTHMP) provides the framework for tsunami preparedness activities in the United States. The Program consists of the 28 U.S. coastal states, territories, and commonwealths (STCs), as well as three Federal agencies: the National Oceanic and Atmospheric Administration (NOAA), the Federal Emergency Management Agency (FEMA), and the United States Geological Survey (USGS). Since its inception, the NTHMP has advanced tsunami preparedness in the United States through accomplishments in many areas: - Coordination and funding of tsunami hazard analysis and preparedness activities in STCs; - Development and execution of a coordinated plan to address education and outreach activities (materials, signage, and guides) within its membership; - Leading the effort to assist communities in meeting National Weather Service (NWS) TsunamiReady guidelines through development of evacuation maps and other planning activities; - Determination of tsunami hazard zones in the most highly threatened coastal communities throughout the country through detailed tsunami inundation studies; - Development of a benchmarking procedure for numerical tsunami models to ensure that models used in the inundation studies meet consistent NOAA standards; - Creation of a national tsunami exercise framework to test tsunami warning system response; - Funding of community tsunami warning dissemination and reception systems such as sirens and NOAA Weather Radios; and - Provision of guidance to NOAA's Tsunami Warning Centers regarding warning dissemination and content. NTHMP activities have advanced the state of preparedness of United States coastal communities and have helped save lives and property during recent tsunamis. Program successes as well as future plans, including maritime preparedness, are discussed.

  2. New Activities of the U.S. National Tsunami Hazard Mitigation Program, Mapping and Modeling Subcommittee

    NASA Astrophysics Data System (ADS)

    Wilson, R. I.; Eble, M. C.

    2013-12-01

    The U.S. National Tsunami Hazard Mitigation Program (NTHMP) is comprised of representatives from coastal states and federal agencies who, under the guidance of NOAA, work together to develop protocols and products to help communities prepare for and mitigate tsunami hazards. Within the NTHMP are several subcommittees responsible for complementary aspects of tsunami assessment, mitigation, education, warning, and response. The Mapping and Modeling Subcommittee (MMS) is comprised of state and federal scientists who specialize in tsunami source characterization, numerical tsunami modeling, inundation map production, and warning forecasting. Until September 2012, much of the work of the MMS was authorized through the Tsunami Warning and Education Act, an Act that has since expired but the spirit of which is being adhered to in parallel with reauthorization efforts. Over the past several years, the MMS has developed guidance and best practices for states and territories to produce accurate and consistent tsunami inundation maps for community level evacuation planning, and has conducted benchmarking of numerical inundation models. Recent tsunami events have highlighted the need for other types of tsunami hazard analyses and products for improving evacuation planning, vertical evacuation, maritime planning, land-use planning, building construction, and warning forecasts. As the program responsible for producing accurate and consistent tsunami products nationally, the NTHMP-MMS is initiating a multi-year plan to accomplish the following: 1) Create and build on existing demonstration projects that explore new tsunami hazard analysis techniques and products, such as maps identifying areas of strong currents and potential damage within harbors as well as probabilistic tsunami hazard analysis for land-use planning. 2) Develop benchmarks for validating new numerical modeling techniques related to current velocities and landslide sources. 3) Generate guidance and protocols for the production and use of new tsunami hazard analysis products. 4) Identify multistate collaborations and funding partners interested in these new products. Application of these new products will improve the overall safety and resilience of coastal communities exposed to tsunami hazards.

  3. Developing a scientific procedure for community based hazard mapping and risk mitigation

    NASA Astrophysics Data System (ADS)

    Verrier, M.

    2011-12-01

    As an international exchange student from the Geological Sciences Department at San Diego State University (SDSU), I joined the KKN-PPM program at Universitas Gadjah Mada (UGM), Yogyakarta, Indonesia, in July 2011 for 12 days (July 4th to July 16th) of its two-month duration (July 4th to August 25th). The KKN-PPM group I was attached to was designated 154 and was focused on Plosorejo Village, Karanganyar, Kerjo, Central Java, Indonesia. The mission of KKN-PPM 154 was to survey Plosorejo village for existing landslides, to generate a simple hazard susceptibility map that can be understood by local villagers, and then to begin dissemination of that map into the community. To generate our susceptibility map we first conducted a geological survey of the existing landslides in the field study area, with a focus on determining landslide triggers and gauging areas' susceptibility to future landslides. The methods for gauging susceptibility included lithological observation, the presence of linear cracking, and visible loss of structural integrity in structures such as villagers' homes, as well as collaboration with local residents and with the local rescue and response team. Three color distinctions were used to represent susceptibility: green, where there is no immediate danger of landslide damage; orange, where transportation routes are at risk of being disrupted by landslides; and red, where imminent landslide potential puts a home in direct danger. The landslide inventory and susceptibility data were compiled into digital media such as CorelDraw, ArcGIS, and Google Earth. Once a technical map was generated, we presented it to the village leadership for confirmation and modification based on their experience. Finally, we began to use the technical susceptibility map to draft evacuation routes and meeting points in the event of landslides, as well as simple susceptibility maps that can be understood and utilized by local villagers. Landslide mitigation projects being conducted alongside the community hazard map include marking evacuation routes with painted bamboo signs, creating a meaningful landslide awareness mural, and installing simple early warning systems that detect land movement and alert residents that evacuation routes should be used. KKN-PPM is scheduled to continue until August 25th, 2011. In the future, research will be done on using the model for community-based hazard mapping outlined here in the Geological Sciences Department at SDSU to increase georisk awareness and improve mitigation of landslides in local areas of need such as Tijuana, Mexico.

  4. The influence of hazard models on GIS-based regional risk assessments and mitigation policies

    USGS Publications Warehouse

    Bernknopf, R.L.; Rabinovici, S.J.M.; Wood, N.J.; Dinitz, L.B.

    2006-01-01

    Geographic information systems (GIS) are important tools for understanding and communicating the spatial distribution of risks associated with natural hazards in regional economies. We present a GIS-based decision support system (DSS) for assessing community vulnerability to natural hazards and evaluating potential mitigation policy outcomes. The Land Use Portfolio Modeler (LUPM) integrates earth science and socioeconomic information to predict the economic impacts of loss-reduction strategies. However, the potential use of such systems in decision making may be limited when multiple but conflicting interpretations of the hazard are available. To explore this problem, we conduct a policy comparison using the LUPM to test the sensitivity of three available assessments of earthquake-induced lateral-spread ground failure susceptibility in a coastal California community. We find that the uncertainty regarding the interpretation of the science inputs can influence the development and implementation of natural hazard management policies. Copyright © 2006 Inderscience Enterprises Ltd.

  5. Monitoring Fogo Island, Cape Verde Archipelago, for Volcanic Hazard Mitigation

    NASA Astrophysics Data System (ADS)

    Faria, B. V.; Heleno, S. I.; Barros, I. J.; d'Oreye, N.; Bandomo, Z.; Fonseca, J. F.

    2001-12-01

    Fogo Island, in the Cape Verde Archipelago (North Atlantic), with a total area of 476 km2 and a population of about 40,000, is an active ocean island volcano rising from an average sea-bottom depth of the order of -3000 m to a maximum altitude of 2820 m. All of the 28 historically recorded eruptions (Ribeiro, 1960) since the arrival of the first settlers in the 15th Century took place in Cha das Caldeiras, a 9 km-wide flat zone 1700 meters above sea level that resulted from the infill of a large lateral collapse caldera (Day et al., 2000). The last eruptions occurred in 1951 and 1995, through secondary cones at the base of Pico do Fogo, the main volcanic edifice. A tall scarp surrounds Cha das Caldeiras on its western side only, and the eastern limit leads to a very steep sub-aerial slope down to the coastline. With this morphology, the volcanic hazard is significant inside Cha das Caldeiras - with a resident population of the order of 800 - and particularly in the villages of the eastern coast. Because the magma has low viscosity, eruptions at Fogo have scarce precursory activity, and forecasting them is therefore challenging. The VIGIL monitoring network was installed between 1997 and 2001 and is currently in full operation. It consists of seven seismographic stations - two of which are broadband - four tilt stations, a CO2 monitoring station, and a meteorological station. The data are telemetered in real time to the central laboratory on the neighboring island of Santiago and analyzed on a daily basis. The continuous data acquisition is complemented by periodic GPS, gravity, and leveling surveys (Lima et al., this conference). In this paper we present the methodology adopted to monitor the level of volcanic activity of Fogo Volcano and show examples of the data being collected. Anomalous data recorded at the end of September 2000, which led to the only occurrence of an alert warning so far, are also presented and discussed.

  6. Risk monitoring of buildings for natural and man-made hazards mitigation (Kurata, Spencer and Ruiz-Sandoval)

    E-print Network

    Spencer Jr., B.F.

    Risk monitoring of buildings for natural and man-made hazards mitigation is discussed in this paper, based on ubiquitous monitoring using a network of wireless sensors. Buildings are subjected to natural hazards such as severe earthquakes and strong winds, as well as man-made hazards such as fire

  7. Planning ahead for asteroid and comet hazard mitigation, phase 1: parameter space exploration and scenario modeling

    SciTech Connect

    Plesko, Catherine S [Los Alamos National Laboratory; Clement, R Ryan [Los Alamos National Laboratory; Weaver, Robert P [Los Alamos National Laboratory; Bradley, Paul A [Los Alamos National Laboratory; Huebner, Walter F [Los Alamos National Laboratory

    2009-01-01

    The mitigation of impact hazards resulting from Earth-approaching asteroids and comets has received much attention in the popular press. However, many questions remain about the near-term and long-term feasibility and appropriate application of all proposed methods. Recent and ongoing ground- and space-based observations of small solar-system body composition and dynamics have revolutionized our understanding of these bodies (e.g., Ryan (2000), Fujiwara et al. (2006), and Jedicke et al. (2006)). Ongoing increases in computing power and algorithm sophistication make it possible to calculate the response of these inhomogeneous objects to proposed mitigation techniques. Here we present the first phase of a comprehensive hazard mitigation planning effort undertaken by Southwest Research Institute and Los Alamos National Laboratory. We begin by reviewing the parameter space of the object's physical and chemical composition and trajectory. We then use the radiation hydrocode RAGE (Gittings et al. 2008), Monte Carlo N-Particle (MCNP) radiation transport (see Clement et al., this conference), and N-body dynamics codes to explore the effects these variations in object properties have on the coupling of energy into the object from a variety of mitigation techniques, including deflection and disruption by nuclear and conventional munitions, and a kinetic impactor.

  8. A portfolio approach to evaluating natural hazard mitigation policies: An Application to lateral-spread ground failure in Coastal California

    USGS Publications Warehouse

    Bernknopf, R.L.; Dinitz, L.B.; Rabinovici, S.J.M.; Evans, A.M.

    2001-01-01

    In the past, efforts to prevent catastrophic losses from natural hazards have largely been undertaken by individual property owners based on site-specific evaluations of risks to particular buildings. Public efforts to assess community vulnerability and encourage mitigation have focused on either aggregating site-specific estimates or adopting standards based upon broad assumptions about regional risks. This paper develops an alternative, intermediate-scale approach to regional risk assessment and the evaluation of community mitigation policies. Properties are grouped into types with similar land uses and levels of hazard, and hypothetical community mitigation strategies for protecting these properties are modeled like investment portfolios. The portfolios consist of investments in mitigation against the risk to a community posed by a specific natural hazard, and are defined by a community's mitigation budget and the proportion of the budget invested in locations of each type. The usefulness of this approach is demonstrated through an integrated assessment of earthquake-induced lateral-spread ground failure risk in the Watsonville, California area. Data from the magnitude 6.9 Loma Prieta earthquake of 1989 are used to model lateral-spread ground failure susceptibility. Earth science and economic data are combined and analyzed in a Geographic Information System (GIS). The portfolio model is then used to evaluate the benefits of mitigating the risk in different locations. Two mitigation policies, one that prioritizes mitigation by land use type and the other by hazard zone, are compared with a status quo policy of doing no further mitigation beyond that which already exists. The portfolio representing the hazard zone rule yields a higher expected return than the land use portfolio does; however, the hazard zone portfolio experiences a higher standard deviation. Therefore, neither portfolio is clearly preferred. The two mitigation policies both reduce expected losses and increase overall expected community wealth compared to the status quo policy.
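    A minimal sketch of treating mitigation strategies as portfolios with an expected return and a standard deviation, as described above; the weights, returns, and covariance matrix are hypothetical, not the Watsonville analysis.

```python
import numpy as np

def portfolio_stats(weights, expected_returns, cov):
    """Expected return and standard deviation of a mitigation 'portfolio'
    whose weights are the shares of the budget invested in each
    property type or hazard zone."""
    w = np.asarray(weights, dtype=float)
    mu = np.asarray(expected_returns, dtype=float)
    cov = np.asarray(cov, dtype=float)
    return float(w @ mu), float(np.sqrt(w @ cov @ w))

# Hypothetical per-unit returns (avoided loss per unit invested) for three
# location types, and two candidate allocation rules.
mu = [0.08, 0.12, 0.05]
cov = [[0.010, 0.002, 0.001],
       [0.002, 0.030, 0.004],
       [0.001, 0.004, 0.006]]
for name, w in (("land-use rule", [0.5, 0.2, 0.3]),
                ("hazard-zone rule", [0.2, 0.6, 0.2])):
    r, s = portfolio_stats(w, mu, cov)
    print(f"{name}: expected return {r:.3f}, std dev {s:.3f}")
```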

  9. Mitigating hazards to aircraft from drifting volcanic clouds by comparing and combining IR satellite data with forward transport models

    Microsoft Academic Search

    M. Alexandra Matiella Novak

    2008-01-01

    Volcanic ash clouds in the upper atmosphere (>10km) present a significant hazard to the aviation community and in some cases cause near-disastrous situations for aircraft that inadvertently encounter them. The two most commonly used techniques for mitigating hazards to aircraft from drifting volcanic clouds are (1) using data from satellite observations and (2) the forecasting of dispersion and trajectories with

  10. Lidar and Electro-Optics for Atmospheric Hazard Sensing and Mitigation

    NASA Technical Reports Server (NTRS)

    Clark, Ivan O.

    2012-01-01

    This paper provides an overview of the research and development efforts of the Lidar and Electro-Optics element of NASA's Aviation Safety Program. This element is seeking to improve the understanding of the atmospheric environments encountered by aviation and to provide enhanced situation awareness for atmospheric hazards. The improved understanding of atmospheric conditions is specifically to develop sensor signatures for atmospheric hazards. The current emphasis is on kinetic air hazards such as turbulence, aircraft wake vortices, mountain rotors, and windshear. Additional efforts are underway to identify and quantify the hazards arising from multi-phase atmospheric conditions including liquid and solid hydrometeors and volcanic ash. When the multi-phase conditions act as obscurants that result in reduced visual awareness, the element seeks to mitigate the hazards associated with these diminished visual environments. The overall purpose of these efforts is to enable safety improvements for air transport class and business jet class aircraft as the transition to the Next Generation Air Transportation System occurs.

  11. Linear Aerospike SR-71 Experiment (LASRE): Aerospace Propulsion Hazard Mitigation Systems

    NASA Technical Reports Server (NTRS)

    Mizukami, Masashi; Corpening, Griffin P.; Ray, Ronald J.; Hass, Neal; Ennix, Kimberly A.; Lazaroff, Scott M.

    1998-01-01

    A major hazard posed by the propulsion system of hypersonic and space vehicles is the possibility of fire or explosion in the vehicle environment. The hazard is mitigated by minimizing or detecting, in the vehicle environment, the three ingredients essential to producing fire: fuel, oxidizer, and an ignition source. The Linear Aerospike SR-71 Experiment (LASRE) consisted of a linear aerospike rocket engine integrated into one-half of an X-33-like lifting body shape, carried on top of an SR-71 aircraft. Gaseous hydrogen and liquid oxygen were used as propellants. Although LASRE is a one-of-a-kind experimental system, it must be rated for piloted flight, so this test presented a unique challenge. To help meet safety requirements, the following propulsion hazard mitigation systems were incorporated into the experiment: pod inert purge, oxygen sensors, a hydrogen leak detection algorithm, hydrogen sensors, fire detection and pod temperature thermocouples, water misting, and control room displays. These systems are described, and their development discussed. Analyses, ground test, and flight test results are presented, as are findings and lessons learned.

  12. The asteroid and comet impact hazard: risk assessment and mitigation options.

    PubMed

    Gritzner, Christian; Dürfeld, Kai; Kasper, Jan; Fasoulas, Stefanos

    2006-08-01

    The impact of extraterrestrial matter onto Earth is a continuous process. On average, some 50,000 tons of dust are delivered to our planet every year. While objects smaller than about 30 m mainly disintegrate in the Earth's atmosphere, larger ones can penetrate through it and cause damage on the ground. When an object of hundreds of meters in diameter impacts an ocean, a tsunami is created that can devastate coastal cities. Further, if a km-sized object were to hit the Earth, it would cause a global catastrophe due to the transport of enormous amounts of dust and vapour into the atmosphere, resulting in a change in the Earth's climate. This article gives an overview of the near-Earth asteroid and comet (near-Earth object, NEO) impact hazard and the NEO search programmes which are gathering important data on these objects. It also points out options for impact hazard mitigation by using deflection systems. It further discusses the critical constraints for NEO deflection strategies and systems as well as mitigation and evacuation costs and benefits. Recommendations are given for future activities to solve the NEO impact hazard problem. PMID:16670908

  13. Fluor Daniel Hanford implementation plan for DOE Order 5480.28, Natural phenomena hazards mitigation

    SciTech Connect

    Conrads, T.J.

    1997-09-12

    Natural phenomena hazards (NPH) are unexpected acts of nature that pose a threat or danger to workers, the public, or the environment. Earthquakes, extreme winds (hurricane and tornado), snow, flooding, volcanic ashfall, and lightning strikes are examples of NPH that could occur at the Hanford Site. U.S. Department of Energy (DOE) policy requires facilities to be designed, constructed, and operated in a manner that protects workers, the public, and the environment from hazards caused by natural phenomena. DOE Order 5480.28, Natural Phenomena Hazards Mitigation, includes rigorous new natural phenomena criteria for the design of new DOE facilities, as well as for the evaluation and, if necessary, upgrade of existing DOE facilities. The Order was transmitted to Westinghouse Hanford Company in 1993 for compliance and is also identified in the Project Hanford Management Contract, Section J, Appendix C. Criteria and requirements of DOE Order 5480.28 are included in five standards, the last of which, DOE-STD-1023, was released in fiscal year 1996. Because the Order was released before all of its required standards were released, enforcement of the Order was waived pending release of the last standard and determination of an in-force date by DOE Richland Operations Office (DOE-RL). Agreement also was reached between the Management and Operations Contractor and DOE-RL that the Order would become enforceable for new structures, systems, and components (SSCs) 60 days following issue of new Order-based design criteria in HNF-PRO-97, Engineering Design and Evaluation. The Order also requires that commitments addressing existing SSCs be included in an implementation plan that is to be issued 1 year following the release of the last standard. Subsequently, WHC-SP-1175, Westinghouse Hanford Company Implementation Plan for DOE Order 5480.28, Natural Phenomena Hazards Mitigation, Rev. 0, was issued in November 1996, and this document, HNF-SP-1175, Fluor Daniel Hanford Implementation Plan for DOE Order 5480.28, Natural Phenomena Hazards Mitigation, is Rev. 1 of that plan.

  14. Impact Hazard Mitigation: Understanding the Effects of Nuclear Explosive Outputs on Comets and Asteroids

    NASA Astrophysics Data System (ADS)

    Clement, R.

    The NASA 2007 white paper "Near-Earth Object Survey and Deflection Analysis of Alternatives" affirms deflection as the safest and most effective means of potentially hazardous object (PHO) impact prevention. It also calls for further studies of object deflection. In principle, deflection of a PHO may be accomplished by using kinetic impactors, chemical explosives, gravity tractors, solar sails, or nuclear munitions. Of the sudden impulse options, nuclear munitions are by far the most efficient in terms of yield-per-unit-mass launched and are technically mature. However, there are still significant questions about the response of a comet or asteroid to a nuclear burst. Recent and ongoing observational and experimental work is revolutionizing our understanding of the physical and chemical properties of these bodies (e.g., Ryan (2000), Fujiwara et al. (2006), and Jedicke et al. (2006)). The combination of this improved understanding of small solar-system bodies combined with current state-of-the-art modeling and simulation capabilities, which have also improved dramatically in recent years, allow for a science-based, comprehensive study of PHO mitigation techniques. Here we present an examination of the effects of radiation from a nuclear explosion on potentially hazardous asteroids and comets through Monte Carlo N-Particle code (MCNP) simulation techniques. MCNP is a general-purpose particle transport code commonly used to model neutron, photon, and electron transport for medical physics, reactor design and safety, accelerator target and detector design, and a variety of other applications including modeling the propagation of epithermal neutrons through the Martian regolith (Prettyman 2002). It is a massively parallel code that can conduct simulations in 1-3 dimensions, complicated geometries, and with extremely powerful variance reduction techniques. It uses current nuclear cross section data, where available, and fills in the gaps with analytical models where data are not available. MCNP has undergone extensive verification and validation and is considered the gold-standard for particle transport. (Forrest B. Brown, et al., "MCNP Version 5," Trans. Am. Nucl. Soc., 87, 273, November 2002.) Additionally, a new simulation capability using MCNP has become available to this collaboration. The first results of this new capability will also be presented. In particular, we will show results of neutron and gamma-ray energy deposition and flux as a function of material depth, composition, density, geometry, and distance from the source (nuclear burst). We will also discuss the benefits and shortcomings of linear Monte Carlo. Finally, we will set the stage for the correct usage and limitations of these results in coupled radiation-hydrodynamic calculations (see Plesko et al, this conference).
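    The energy-deposition results discussed above come from full MCNP transport calculations; purely as a toy contrast, the sketch below uses simple exponential (Beer-Lambert) attenuation to show how the cumulative deposited-energy fraction varies with depth for an assumed mean free path. The mean-free-path values are hypothetical and ignore scattering, secondaries, and geometry.

```python
import numpy as np

def deposited_fraction(depth_cm, mean_free_path_cm):
    """Cumulative fraction of incident radiation energy deposited within a
    given depth for simple exponential attenuation: 1 - exp(-x / lambda).
    Real transport (as with MCNP) also treats scattering, secondary
    particles, and 3-D geometry; this is only a toy model."""
    return 1.0 - np.exp(-np.asarray(depth_cm, dtype=float) / mean_free_path_cm)

depths = np.array([1.0, 5.0, 10.0, 50.0])
for material, mfp in (("porous, regolith-like", 12.0), ("dense, rock-like", 5.0)):
    print(material, np.round(deposited_fraction(depths, mfp), 3))
```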

  15. Impact hazard mitigation: understanding the effects of nuclear explosive outputs on comets and asteroids

    SciTech Connect

    Clement, Ralph R C [Los Alamos National Laboratory; Plesko, Catherine S [Los Alamos National Laboratory; Bradley, Paul A [Los Alamos National Laboratory; Conlon, Leann M [Los Alamos National Laboratory

    2009-01-01

    The NASA 2007 white paper ''Near-Earth Object Survey and Deflection Analysis of Alternatives'' affirms deflection as the safest and most effective means of potentially hazardous object (PHO) impact prevention. It also calls for further studies of object deflection. In principle, deflection of a PHO may be accomplished by using kinetic impactors, chemical explosives, gravity tractors, solar sails, or nuclear munitions. Of the sudden impulse options, nuclear munitions are by far the most efficient in terms of yield-per-unit-mass launched and are technically mature. However, there are still significant questions about the response of a comet or asteroid to a nuclear burst. Recent and ongoing observational and experimental work is revolutionizing our understanding of the physical and chemical properties of these bodies (e.g., Ryan (2000), Fujiwara et al. (2006), and Jedicke et al. (2006)). The combination of this improved understanding of small solar-system bodies combined with current state-of-the-art modeling and simulation capabilities, which have also improved dramatically in recent years, allow for a science-based, comprehensive study of PHO mitigation techniques. Here we present an examination of the effects of radiation from a nuclear explosion on potentially hazardous asteroids and comets through Monte Carlo N-Particle code (MCNP) simulation techniques. MCNP is a general-purpose particle transport code commonly used to model neutron, photon, and electron transport for medical physics, reactor design and safety, accelerator target and detector design, and a variety of other applications including modeling the propagation of epithermal neutrons through the Martian regolith (Prettyman 2002). It is a massively parallel code that can conduct simulations in 1-3 dimensions, complicated geometries, and with extremely powerful variance reduction techniques. It uses current nuclear cross section data, where available, and fills in the gaps with analytical models where data are not available. MCNP has undergone extensive verification and validation and is considered the gold-standard for particle transport. (Forrest B. Brown, et al., ''MCNP Version 5,'' Trans. Am. Nucl. Soc., 87, 273, November 2002.) Additionally, a new simulation capability using MCNP has become available to this collaboration. The first results of this new capability will also be presented.

  16. Identification, prediction, and mitigation of sinkhole hazards in evaporite karst areas

    USGS Publications Warehouse

    Gutierrez, F.; Cooper, A.H.; Johnson, K.S.

    2008-01-01

    Sinkholes usually have a higher probability of occurrence and a greater genetic diversity in evaporite terrains than in carbonate karst areas. This is because evaporites have a higher solubility and, commonly, a lower mechanical strength. Subsidence damage resulting from evaporite dissolution generates substantial losses throughout the world, but the causes are only well understood in a few areas. To deal with these hazards, a phased approach is needed for sinkhole identification, investigation, prediction, and mitigation. Identification techniques include field surveys and geomorphological mapping combined with accounts from local people and historical sources. Detailed sinkhole maps can be constructed from sequential historical maps, recent topographical maps, and digital elevation models (DEMs), complemented with building-damage surveying, remote sensing, and high-resolution geodetic surveys. On a more detailed level, information from exposed paleosubsidence features (paleokarst), speleological explorations, geophysical investigations, trenching, dating techniques, and boreholes may help in investigating dissolution and subsidence features. Information on the hydrogeological pathways, including caves, springs, and swallow holes, is particularly important, especially when corroborated by tracer tests. These diverse data sources make up a valuable database, the karst inventory. From this dataset, sinkhole susceptibility zonations (relative probability) may be produced based on the spatial distribution of the features and good knowledge of the local geology. Sinkhole distribution can be investigated by spatial distribution analysis techniques, including studies of preferential elongation, alignment, and nearest neighbor analysis. More objective susceptibility models may be obtained by analyzing the statistical relationships between the known sinkholes and the conditioning factors. Chronological information on sinkhole formation is required to estimate the probability of occurrence of sinkholes (number of sinkholes per km2 per year). Such spatial and temporal predictions, frequently derived from limited records and based on the assumption that past sinkhole activity may be extrapolated to the future, are non-corroborated hypotheses. Validation methods allow us to assess the predictive capability of the susceptibility maps and to transform them into probability maps. Avoiding the most hazardous areas through preventive planning is the safest strategy for development in sinkhole-prone areas. Corrective measures could be applied to reduce the dissolution activity and subsidence processes. A more practical solution for safe development is to reduce the vulnerability of the structures by using subsidence-proof designs. © 2007 Springer-Verlag.
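    As a small illustration of turning the sinkhole occurrence rate mentioned above (number of sinkholes per km2 per year) into a probability of occurrence, the sketch below assumes a homogeneous Poisson process; the zone rates, area, and time window are hypothetical, not values from the paper.

```python
import math

def prob_at_least_one_sinkhole(rate_per_km2_year: float, area_km2: float,
                               years: float) -> float:
    """Probability of at least one sinkhole in a given area and time window,
    assuming occurrences follow a homogeneous Poisson process."""
    return 1.0 - math.exp(-rate_per_km2_year * area_km2 * years)

# Hypothetical rates for three susceptibility zones.
for zone, rate in (("low", 0.001), ("moderate", 0.01), ("high", 0.05)):
    p = prob_at_least_one_sinkhole(rate, area_km2=2.0, years=50.0)
    print(f"{zone} zone: P(>=1 sinkhole in 2 km^2 over 50 yr) = {p:.2f}")
```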

  17. Rio Soliette (Haiti): An International Initiative for Flood-Hazard Assessment and Mitigation

    NASA Astrophysics Data System (ADS)

    Gandolfi, S.; Castellarin, A.; Barbarella, M.; Brath, A.; Domeneghetti, A.; Brandimarte, L.; Di Baldassarre, G.

    2013-01-01

    Catastrophic natural events are among the most critical threats to health and economies all around the world, and their impact can be far more dramatic in poor regions than in other countries. Isla Hispaniola (Haiti and the Dominican Republic), one of the poorest regions of the planet, has repeatedly been hit by catastrophic natural disasters that caused incalculable human and economic losses. After the catastrophic flood event that occurred in the basin of the River Soliette on May 24th, 2004, the General Direction for Development and Cooperation of the Italian Department of Foreign Affairs funded an international cooperation initiative (ICI), coordinated by the University of Bologna, that involved Haitian and Dominican institutions. The main purpose of the ICI was a hydrological and hydraulic analysis of the May 2004 flood event aimed at formulating a suitable and affordable flood risk mitigation plan, consisting of structural and non-structural measures. In this context, a topographic survey was necessary to set up the hydrological model and to improve knowledge of some areas that are candidate sites for mitigation measures. To overcome the difficulties arising from the scarcity of funds and surveyors and the limited time available for the survey, only GPS techniques were used, both for the reference frame (using a PPP approach) and for the geometrical survey of the river by means of river cross-sections and detailed surveys in two areas (RTK technique). This allowed us to reconstruct both the river geometry and the DTMs of two expansion areas (useful for designing hydraulic solutions to mitigate flood hazard).

  18. 2009 ERUPTION OF REDOUBT VOLCANO: Lahars, Oil, and the Role of Science in Hazards Mitigation (Invited)

    NASA Astrophysics Data System (ADS)

    Swenson, R.; Nye, C. J.

    2009-12-01

    In March 2009, Redoubt Volcano erupted for the third time in 45 years. More than 19 explosions produced ash plumes to 60,000 ft asl, lahar flows of mud and ice down the Drift River ~30 miles to the coast, and tephra fall of up to 1.5 mm onto surrounding communities. The eruption had severe impacts on many operations. Airlines were forced to cancel or divert hundreds of international and domestic passenger and cargo flights, and Anchorage International Airport closed for over 12 hours. Mudflows and floods down the Drift River to the coast impacted operations at the Drift River Oil Terminal (DROT), which was forced to shut down and ultimately be evacuated. Prior mitigation efforts to protect the DROT oil tank farm from potential impacts associated with a major eruptive event were successful, and none of the 148,000 barrels of oil stored at the facility was spilled or released. Nevertheless, the threat of continued eruptive activity at Redoubt, with the possibility of continued lahar flows down the Drift River alluvial fan, required that an incident command post be established so that the US Coast Guard, Alaska Dept. of Environmental Conservation, and the Cook Inlet Pipeline Company could coordinate a response to the potential hazards. Ultimately, the incident command team relied heavily on continuous real-time data updates from the Alaska Volcano Observatory, as well as continuous geologic interpretations and risk analysis by the USGS Volcanic Hazards group, the State Division of Geological and Geophysical Surveys, and the University of Alaska Geophysical Institute, all members of the collaborative effort of the Alaska Volcano Observatory. The great success story that unfolded attests to the efforts of the incident command team and their reliance on real-time analysis from scientific experts. The positive results also highlight how pre-disaster mitigation and monitoring efforts, in concert with hazards response planning, can be used in a cooperative industry / multi-agency effort to positively affect hazards mitigation. The final outcomes from this potentially disastrous event included: 1) no on-site personnel were injured; 2) no detrimental environmental impacts associated with the oil terminal occurred; and 3) incident command personnel, together with numerous industry representatives, were able to make well-informed, although costly, decisions that resulted in safe removal of the oil from the storage facilities. The command team's efforts also furthered the process of restarting Cook Inlet oil production after a forced five-month shutdown.

  19. Bringing New Tools and Techniques to Bear on Earthquake Hazard Analysis and Mitigation

    NASA Astrophysics Data System (ADS)

    Willemann, R. J.; Pulliam, J.; Polanco, E.; Louie, J. N.; Huerta-Lopez, C.; Schmitz, M.; Moschetti, M. P.; Huerfano Moreno, V.; Pasyanos, M.

    2013-12-01

    During July 2013, IRIS held an Advanced Studies Institute in Santo Domingo, Dominican Republic, that was designed to enable early-career scientists who have already mastered the fundamentals of seismology to begin collaborating in frontier seismological research. The Institute was conceived at a strategic planning workshop in Heredia, Costa Rica, that was supported and partially funded by USAID, with a goal of building geophysical capacity to mitigate the effects of future earthquakes. To address this broad goal, we drew participants from a dozen different countries of Middle America. Our objectives were to develop understanding of the principles of earthquake hazard analysis, particularly site characterization techniques, and to facilitate future research collaborations. The Institute was divided into three main sections: overviews on the fundamentals of earthquake hazard analysis and lectures on the theory behind methods of site characterization; fieldwork, where participants acquired new data of the types typically used in site characterization; and computer-based analysis projects in which participants applied their newly learned techniques to the data they collected. This was the first IRIS institute to combine an instructional short course with field work for data acquisition. Participants broke into small teams to acquire data, analyze it on their own computers, and then make presentations to the assembled group describing their techniques and results. Using broadband three-component seismometers, the teams acquired data for Spatial Auto-Correlation (SPAC) analysis at seven array locations, and Horizontal to Vertical Spectral Ratio (HVSR) analysis at 60 individual sites along six profiles throughout Santo Domingo. Using a 24-channel geophone string, the teams acquired data for Refraction Microtremor (SeisOptReMi™ from Optim) analysis at 11 sites, with supplementary data for active-source Multi-channel Spectral Analysis of Surface Waves (MASW) analysis at five of them. The results showed that teams quickly learned to collect high-quality data for each method of analysis. SPAC and refraction microtremor analysis each demonstrated that dispersion relations based on ambient noise and from arrays with an aperture of less than 200 meters could be used to determine the depth of a weak, disaggregated layer known to underlie the fast near-surface limestone terraces on which Santo Domingo is situated, and indicated the presence of unexpectedly strong rocks below. All three array methods concurred that most Santo Domingo sites have relatively high VS30 (average shear velocity to a depth of 30 m), generally at the B-C NEHRP hazard class boundary or higher. HVSR analysis revealed that the general pattern of resonance was short periods close to the coast, increasing with distance from the shoreline. In the east-west direction, significant variations were also evident at the highest-elevation terrace and near the Ozama River. In terms of the sub-soil conditions, the observed pattern of HVSR values departs from the expected increase in sediment thickness close to the coast.
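    The VS30 values referred to above are time-averaged shear-wave velocities over the uppermost 30 m, VS30 = 30 / sum(h_i / Vs_i); the sketch below computes this for a hypothetical layered profile (not a Santo Domingo measurement) so the NEHRP class comparison is concrete.

```python
# Time-averaged shear-wave velocity over the top 30 m (VS30), the quantity
# used to assign NEHRP site classes such as the B-C boundary mentioned above.
# The layered profile below is hypothetical, not a Santo Domingo measurement.

def vs30(layers):
    """layers: list of (thickness_m, vs_m_per_s), ordered from the surface down."""
    depth, travel_time = 0.0, 0.0
    for thickness, vs in layers:
        if depth >= 30.0:
            break
        dz = min(thickness, 30.0 - depth)   # only the material above 30 m counts
        travel_time += dz / vs
        depth += dz
    if depth < 30.0:
        raise ValueError("layer stack is shallower than 30 m")
    return 30.0 / travel_time

profile = [(5.0, 300.0), (10.0, 550.0), (40.0, 900.0)]   # (thickness m, Vs m/s)
print(f"VS30 = {vs30(profile):.0f} m/s")
```

    For this hypothetical profile the result is roughly 580 m/s, i.e., NEHRP class C; the B-C class boundary corresponds to VS30 = 760 m/s.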

  20. Coupling Radar Rainfall Estimation and Hydrological Modelling For Flash-flood Hazard Mitigation

    NASA Astrophysics Data System (ADS)

    Borga, M.; Creutin, J. D.

    Flood risk mitigation is accomplished through managing either or both the hazard and vulnerability. Flood hazard may be reduced through structural measures which alter the frequency of flood levels in the area. The vulnerability of a community to flood loss can be mitigated through changing or regulating land use and through flood warning and effective emergency response. When dealing with flash-flood hazard, it is generally accepted that the most effective way (and in many instances the only affordable one in a sustainable perspective) to mitigate the risk is by reducing the vulnerability of the involved communities, in particular by implementing flood warning systems and community self-help programs. However, both the inherent characteristics of the atmospheric and hydrologic processes involved in flash-flooding and the changing societal needs provide a tremendous challenge to traditional flood forecasting and warning concepts. In fact, the targets of these systems are traditionally localised, like urbanised sectors or hydraulic structures. Given the small spatial scale that characterises flash floods and the development of dispersed urbanisation, transportation, green tourism and water sports, human lives and property are exposed to flash flood risk in a scattered manner. This must be taken into consideration in flash flood warning strategies, and the investigated region should be considered as a whole, with every section of the drainage network as a potential target for hydrological warnings. Radar technology offers the potential to provide information describing rain intensities almost continuously in time and space. Recent research results indicate that coupling radar information to distributed hydrologic modelling can provide hydrologic forecasts at all potentially flooded points of a region. Nevertheless, very few flood warning services use radar data more than on a qualitative basis. After a short review of current understanding in this area, two issues are examined: the advantages and caveats of using radar rainfall estimates in operational flash flood forecasting, and the methodological problems associated with the use of hydrological models for distributed flash flood forecasting with rainfall input estimated from radar.
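    A minimal sketch of the kind of coupling described above, assuming a standard Marshall-Palmer Z-R conversion (Z = 200 R^1.6) and a single linear-reservoir catchment response; the reflectivity series and the reservoir constant are illustrative, not a calibrated operational chain.

```python
# Reflectivity -> rain rate via the Marshall-Palmer relation Z = 200 * R**1.6,
# then a single linear reservoir turns the rainfall series into runoff.
# The reflectivity series and the reservoir constant are illustrative only,
# not a calibrated flash-flood forecasting chain.

def rain_rate_mm_per_h(dbz, a=200.0, b=1.6):
    z_linear = 10.0 ** (dbz / 10.0)           # dBZ -> Z in mm^6/m^3
    return (z_linear / a) ** (1.0 / b)

def linear_reservoir(rain_mm_per_step, dt_h=0.25, k_h=2.0):
    """Outflow q = S/K; storage updated explicitly each time step (all in mm)."""
    storage, runoff = 0.0, []
    for p in rain_mm_per_step:
        q = storage / k_h * dt_h
        storage += p - q
        runoff.append(q)
    return runoff

radar_dbz = [20, 35, 45, 50, 40, 25, 10]                  # one pixel, 15-min scans
rain = [rain_rate_mm_per_h(z) * 0.25 for z in radar_dbz]  # mm per 15 min
print([round(q, 2) for q in linear_reservoir(rain)])
```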

  1. Probing Aircraft Flight Test Hazard Mitigation for the Alternative Fuel Effects on Contrails & Cruise Emissions (ACCESS) Research Team

    NASA Technical Reports Server (NTRS)

    Kelly, Michael J.

    2013-01-01

    The Alternative Fuel Effects on Contrails & Cruise Emissions (ACCESS) Project Integration Manager requested in July 2012 that the NASA Engineering and Safety Center (NESC) form a team to independently assess aircraft structural failure hazards associated with the ACCESS experiment and to identify potential flight test hazard mitigations to ensure flight safety. The ACCESS Project Integration Manager subsequently requested that the assessment scope be focused predominantly on structural failure risks to the aircraft empennage.

  2. Making the Handoff from Earthquake Hazard Assessments to Effective Mitigation Measures (Invited)

    NASA Astrophysics Data System (ADS)

    Applegate, D.

    2010-12-01

    This year has witnessed a barrage of large earthquakes worldwide with the resulting damages ranging from inconsequential to truly catastrophic. We cannot predict when earthquakes will strike, but we can build communities that are resilient to strong shaking as well as to secondary hazards such as landslides and liquefaction. The contrasting impacts of the magnitude-7 earthquake that struck Haiti in January and the magnitude-8.8 event that struck Chile in February underscore the difference that mitigation and preparedness can make. In both cases, millions of people were exposed to severe shaking, but deaths in Chile were measured in the hundreds rather than the hundreds of thousands that perished in Haiti. Numerous factors contributed to these disparate outcomes, but the most significant is the presence of strong building codes in Chile and their total absence in Haiti. The financial cost of the Chilean earthquake still represents an unacceptably high percentage of that nation’s gross domestic product, a reminder that life safety is the paramount, but not the only, goal of disaster risk reduction measures. For building codes to be effective, both in terms of lives saved and economic cost, they need to reflect the hazard as accurately as possible. As one of four federal agencies that make up the congressionally mandated National Earthquake Hazards Reduction Program (NEHRP), the U.S. Geological Survey (USGS) develops national seismic hazard maps that form the basis for seismic provisions in model building codes through the Federal Emergency Management Agency and private-sector practitioners. This cooperation is central to NEHRP, which both fosters earthquake research and establishes pathways to translate research results into implementation measures. That translation depends on the ability of hazard-focused scientists to interact and develop mutual trust with risk-focused engineers and planners. Strengthening that interaction is an opportunity for the next generation of earthquake scientists and engineers. In addition to the national maps, the USGS produces more detailed urban seismic hazard maps that communities have used to prioritize retrofits and design critical infrastructure that can withstand large earthquakes. At a regional scale, the USGS and its partners in California have developed a time-dependent earthquake rupture forecast that is being used by the insurance sector, which can serve to distribute risk and foster mitigation if the right incentives are in place. What the USGS and partners are doing at the urban, regional, and national scales, the Global Earthquake Model project is seeking to do for the world. A significant challenge for engaging the public to prepare for earthquakes is making low-probability, high-consequence events real enough to merit personal action. Scenarios help by starting with the hazard posed by a specific earthquake and then exploring the fragility of the built environment, cascading failures, and the real-life consequences for the public. To generate such a complete picture takes multiple disciplines working together. Earthquake scenarios are being used both for emergency management exercises and much broader public preparedness efforts like the Great California ShakeOut, which engaged nearly 7 million people.

  3. Robust Satellite Techniques (RST) for Natural and Environmental Hazards Monitoring and Mitigation: Theory and Applications

    Microsoft Academic Search

    V. Tramutoli

    2007-01-01

    Several algorithms and data analysis techniques have been proposed using satellite observations (within atmospheric spectral windows) for cloud and surface parameter studies and for human environment monitoring applications. Nearly all of these algorithms are difficult to extend to different geographical and seasonal conditions, generally offering poor performance and uncertain reliability, especially when applied to environmental risk prediction, monitoring and/or mitigation. In this

  4. Evaluation Of Risk And Possible Mitigation Schemes For Previously Unidentified Hazards

    NASA Technical Reports Server (NTRS)

    Linzey, William; McCutchan, Micah; Traskos, Michael; Gilbrech, Richard; Cherney, Robert; Slenski, George; Thomas, Walter, III

    2006-01-01

    This report presents the results of arc track testing conducted to determine if such a transfer of power to un-energized wires is possible and/or likely during an arcing event, and to evaluate an array of protection schemes that may significantly reduce the possibility of such a transfer. The results of these experiments may be useful for determining the level of protection necessary to guard against spurious voltage and current being applied to safety-critical circuits. It was not the purpose of these experiments to determine the probability of the initiation of an arc track event, but only whether, if an initiation did occur, it could cause the undesired event: an inadvertent thruster firing. The primary wire insulation used in the Orbiter is aromatic polyimide, or Kapton, a construction known to arc track under certain conditions [3]. Previous Boeing testing has shown that arc tracks can initiate in aromatic polyimide insulated 28 volts direct current (VDC) power circuits using more realistic techniques such as chafing with an aluminum blade (simulating the corner of an avionics box or lip of a wire tray), or vibration of an aluminum plate against a wire bundle [4]. Therefore, an arc initiation technique was chosen that provided a reliable and consistent means of starting the arc, not a realistic simulation of a scenario on the vehicle. Once an arc is initiated, the current, power and propagation characteristics of the arc depend on the power source, wire gauge and insulation type, circuit protection and series resistance rather than the type of initiation. The initiation method employed for these tests was applying an oil and graphite mixture to the ends of a powered twisted-pair wire. The flight configuration of the heater circuits, the fuel/oxidizer (or ox) wire, and the RCS jet solenoid were modeled in the test configuration so that the behavior of these components during an arcing event could be studied. To determine if coil activation would occur with various wire protection schemes, 145 tests were conducted using various fuel/ox wire alternatives (shielded and unshielded) and/or different combinations of polytetrafluoroethylene (PTFE), Mystik tape and convoluted wraps to prevent unwanted coil activation. Test results were evaluated along with other pertinent data and information to develop a mitigation strategy for an inadvertent RCS firing. The SSP evaluated civilian aircraft wiring failures to search for aging trends in assessing the wire-short hazard. Appendix 2 applies Weibull statistical methods to the same data with a similar purpose.
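    The Weibull analysis mentioned for the wiring-failure data can be sketched as follows, assuming synthetic times-to-failure and SciPy's two-parameter fit with the location fixed at zero; a shape parameter above one is the usual indication of an aging (wear-out) trend.

```python
import numpy as np
from scipy import stats

# Fit a two-parameter Weibull distribution to times-to-failure and read off the
# shape parameter: beta > 1 suggests a wear-out (aging) trend, beta ~ 1 a roughly
# constant failure rate. The failure times below are synthetic, not the civil
# aircraft wiring dataset referred to in the report.

rng = np.random.default_rng(0)
hours_to_failure = rng.weibull(2.2, size=80) * 15000.0   # synthetic sample

beta, _, eta = stats.weibull_min.fit(hours_to_failure, floc=0)
print(f"shape beta = {beta:.2f}, scale eta = {eta:.0f} h")
if beta > 1.0:
    print("beta > 1: failure rate increases with age (wear-out trend)")
```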

  5. Using Robust Decision Making to Assess and Mitigate the Risks of Natural Hazards in Developing Countries

    NASA Astrophysics Data System (ADS)

    Kalra, N.; Lempert, R. J.; Peyraud, S.

    2012-12-01

    Ho Chi Minh City (HCMC) ranks fourth globally among coastal cities most vulnerable to climate change and already experiences extensive routine flooding. In the coming decades, increased precipitation, rising sea levels, and land subsidence could permanently inundate a large portion of the city's population, place the poor at particular risk, and threaten new economic development in low-lying areas. HCMC is not alone in facing the impacts of natural hazards exacerbated by uncertain future climate change, development, and other deep uncertainties. Assessing and managing these risks is a tremendous challenge, particularly in developing countries, which face pervasive shortages of the data and models generally used to plan for such changes. Using HCMC as a case study, this talk will demonstrate how a scenario-based approach that uses robustness as a decision and planning element can help developing countries assess future climate risk and manage the risk of natural disasters. In contrast to traditional approaches, which treat uncertainty with a small number of handcrafted scenarios, this talk will emphasize how robust decision making, which uses modeling to explore over thousands of scenarios, can identify potential vulnerabilities of HCMC's emerging flood risk management strategy and suggest potential responses. The talk will highlight several novel features of the collaboration with the HCMC Steering Committee for Flood Control. First, it examines several types of risk -- risk to the poor, risk to the non-poor, and risk to the economy -- and illustrates how management policies have different implications for these sectors. Second, it demonstrates how diverse and sometimes incomplete climate, hydrologic, socioeconomic, GIS, and other data and models can be integrated into a modeling framework to develop and evaluate many scenarios of flood risk. Third, it illustrates the importance of non-structural policies such as land use management and building design to manage flood risk. Finally, it demonstrates how an adaptive management strategy that evolves over time and implements management options in response to new information can more effectively mitigate risks from natural disasters than can static policies. (Figure: a scatter plot of risk to the poor and non-poor in 1000 different scenarios under eight different risk management options, differentiated by color.)
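    A toy version of the scenario sweep described above is sketched below: one candidate protection level is evaluated across thousands of sampled futures, and the futures in which it fails are collected for inspection. The damage proxy, parameter ranges, and failure threshold are invented for illustration and stand in for the coupled climate, hydrologic, and GIS models used in the actual study.

```python
import random

# Evaluate one candidate protection level across many sampled futures and flag
# the futures in which it fails. The residual-depth proxy, parameter ranges,
# and failure threshold are invented for illustration; a real analysis would
# drive coupled climate, hydrologic, and GIS models instead.

def residual_flood_depth(sea_rise_m, subsidence_m, rain_factor, protection_m):
    hazard_m = 0.8 * rain_factor + sea_rise_m + subsidence_m   # crude depth proxy
    return max(0.0, hazard_m - protection_m)

random.seed(1)
failures = []
for _ in range(5000):
    scenario = {
        "sea_rise_m": random.uniform(0.1, 1.0),
        "subsidence_m": random.uniform(0.0, 0.5),
        "rain_factor": random.uniform(0.9, 1.6),
    }
    if residual_flood_depth(protection_m=1.5, **scenario) > 0.3:
        failures.append(scenario)          # strategy deemed to fail here

print(f"candidate strategy fails in {len(failures)} of 5000 scenarios")
# Inspecting the failing scenarios (e.g., high subsidence plus intense rainfall)
# is what points analysts toward adaptive triggers for strengthening the plan.
```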

  6. The Identification of Filters and Interdependencies for Effective Resource Allocation: Coupling the Mitigation of Natural Hazards to Economic Development.

    NASA Astrophysics Data System (ADS)

    Agar, S. M.; Kunreuther, H.

    2005-12-01

    Policy formulation for the mitigation and management of risks posed by natural hazards requires that governments confront difficult decisions for resource allocation and be able to justify their spending. Governments also need to recognize when spending offers little improvement and the circumstances in which relatively small amounts of spending can make substantial differences. Because natural hazards can have detrimental impacts on local and regional economies, patterns of economic development can also be affected by spending decisions for disaster mitigation. This paper argues that by mapping interdependencies among physical, social and economic factors, governments can improve resource allocation to mitigate the risks of natural hazards while improving economic development on local and regional scales. Case studies of natural hazards in Turkey have been used to explore specific "filters" that act to modify short- and long-term outcomes. Pre-event filters can prevent an event from becoming a natural disaster or change a routine event into a disaster. Post-event filters affect both short and long-term recovery and development. Some filters cannot be easily modified by spending (e.g., rural-urban migration) but others (e.g., land-use practices) provide realistic spending targets. Net social benefits derived from spending, however, will also depend on the ways by which filters are linked, or so-called "interdependencies". A single weak link in an interdependent system, such as a power grid, can trigger a cascade of failures. Similarly, weak links in social and commercial networks can send waves of disruption through communities. Conversely, by understanding the positive impacts of interdependencies, spending can be targeted to maximize net social benefits while mitigating risks and improving economic development. Detailed information on public spending was not available for this study but case studies illustrate how networks of interdependent filters can modify social benefits and costs. For example, spending after the 1992 Erzincan earthquake targeted local businesses but limited alternative employment, labor losses and diminished local markets all contributed to economic stagnation. Spending after the 1995 Dinar earthquake provided rent subsidies, supporting a major exodus from the town. Consequently many local people were excluded from reconstruction decisions and benefits offered by reconstruction funds. After the 1999 Marmara earthquakes, a 3-year economic decline in Yalova illustrates the vulnerability of local economic stability to weak regulation enforcement by a few agents. A resource allocation framework indicates that government-community relations, lack of economic diversification, beliefs, and compensation are weak links for effective spending. Stronger positive benefits could be achieved through spending to target land-use regulation enforcement, labor losses, time-critical needs of small businesses, and infrastructure. While the impacts of the Marmara earthquakes were devastating, strong commercial networks and international interests helped to re-establish the regional economy. Interdependencies may have helped to drive a recovery. Smaller events in eastern Turkey, however, can wipe out entire communities and can have long-lasting impacts on economic development. These differences may accelerate rural to urban migration and perpetuate regional economic divergence in the country. 1: Research performed in the Wharton MBA Program, Univ. of Pennsylvania.

  7. Web-Based Geospatial Tools to Address Hazard Mitigation, Natural Resource Management, and Other Societal Issues

    USGS Publications Warehouse

    Hearn, Paul P.

    2009-01-01

    Federal, State, and local government agencies in the United States face a broad range of issues on a daily basis. Among these are natural hazard mitigation, homeland security, emergency response, economic and community development, water supply, and health and safety services. The U.S. Geological Survey (USGS) helps decision makers address these issues by providing natural hazard assessments, information on energy, mineral, water and biological resources, maps, and other geospatial information. Increasingly, decision makers at all levels are challenged not by the lack of information, but by the absence of effective tools to synthesize the large volume of data available, and to utilize the data to frame policy options in a straightforward and understandable manner. While geographic information system (GIS) technology has been widely applied to this end, systems with the necessary analytical power have been usable only by trained operators. The USGS is addressing the need for more accessible, manageable data tools by developing a suite of Web-based geospatial applications that will incorporate USGS and cooperating partner data into the decision making process for a variety of critical issues. Examples of Web-based geospatial tools being used to address societal issues follow.

  8. The Effective Organization and Use of Data in Bridging the Hazard Mitigation-Climate Change Adaptation Divide (Invited)

    NASA Astrophysics Data System (ADS)

    Smith, G. P.; Fox, J.; Shuford, S.

    2010-12-01

    The costs associated with managing natural hazards and disasters continue to rise in the US and elsewhere. Many climate change impacts are manifested in stronger or more frequent natural hazards such as floods, wildfire, hurricanes and typhoons, droughts, and heat waves. Despite this common problem, the climate change adaptation and hazards management communities have largely failed to acknowledge each other’s work in reducing hazard impacts. This is even reflected in the language that each community uses; for example, the hazards management community refers to hazard risk reduction as mitigation while the climate change community refers to it as adaptation. In order to help bridge this divide, we suggest each community utilize data in a more formally organized and effective manner based on four principles: 1. The scale of the data must reflect the needs of the decision maker. In most cases, decision makers’ needs are most effectively met through the development of multiple alternatives that take into account a variety of possible impacts. 2. Investments intended to reduce vulnerability and increase resilience should be driven by the wise use of available data using a “risk-based” strategy. 3. Climate change adaptation and hazard mitigation strategies must be integrated with other value drivers when building resiliency. Development and use of data that underscore the concept of “no regrets” risk reduction can be used to accomplish this aim. 4. The use of common data is critical in building a bridge between the climate change adaptation and hazards management communities. We will explore how the creation of data repositories that collect, analyze, display and archive hazards and disaster data can help address the challenges posed by the current hazards management and climate change adaptation divide.

  9. New Multi-HAzard and MulTi-RIsk Assessment MethodS for Europe (MATRIX): A research program towards mitigating multiple hazards and risks in Europe

    NASA Astrophysics Data System (ADS)

    Fleming, K. M.; Zschau, J.; Gasparini, P.; Modaressi, H.; Matrix Consortium

    2011-12-01

    Scientists, engineers, civil protection and disaster managers typically treat natural hazards and risks individually. This leads to the situation where the frequent causal relationships between the different hazards and risks, e.g., earthquakes and volcanoes, or floods and landslides, are ignored. Such an oversight may potentially lead to inefficient mitigation planning. As part of their efforts to confront this issue, the European Union, under its FP7 program, is supporting the New Multi-HAzard and MulTi-RIsK Assessment MethodS for Europe or MATRIX project. The focus of MATRIX is on natural hazards, in particular earthquakes, landslides, volcanoes, wildfires, storms and fluvial and coastal flooding. MATRIX will endeavour to develop methods and tools to tackle multi-type natural hazards and risks within a common framework, focusing on methodologies that are suited to the European context. The work will involve an assessment of current single-type hazard and risk assessment methodologies, including a comparison and quantification of uncertainties and harmonization of single-type methods, examining the consequences of cascade effects within a multi-hazard environment, time-dependent vulnerability, decision making and support for multi-hazard mitigation and adaptation, and a series of test cases. Three test sites are being used to assess the methods developed within the project (Naples, Cologne, and the French West Indies), as well as a "virtual city" based on a comprehensive IT platform that will allow scenarios not represented by the test cases to be examined. In addition, a comprehensive dissemination program that will involve national platforms for disaster management, as well as various outreach activities, will be undertaken. The MATRIX consortium consists of ten research institutions (nine European and one Canadian), an end-user (i.e., one of the European national platforms for disaster reduction) and a partner from industry.

  10. Rockfall hazard assessment, risk quantification, and mitigation options for reef cove resort development, False Cape, Queensland, Australia

    NASA Astrophysics Data System (ADS)

    Schlotfeldt, P.

    2009-04-01

    GIS and 2-D rock fall simulations were used as the primary tools during a rock fall hazard assessment and analyses for a major resort and township development near Cairns, Queensland, Australia. The methods used included 1) the development of a digital elevation model (DEM); 2) undertaking rock fall trajectory analyses to determine the end points of rockfalls and the distribution of kinetic energy for identified rock fall runout Zones; and 3) undertaking event tree analyses based on a synthesis of all data in order to establish the Zones with the highest risk of fatalities. This paper describes the methodology used and the results of this work. Recommendations to mitigate the hazard included having exclusion zones with no construction, scaling (including trim blasting), construction of berms and rockfall catch fences. Keywords: GIS, rockfall simulation, rockfall runout Zones, mitigation options. INTRODUCTION: False Cape is located on the east side of the Trinity Inlet near Cairns (Figure 1). Construction is underway for a multi-million dollar development close to the beach front. The development will ultimately cover about 1.5 km of prime coastline. The granite slopes above the development are steep and are covered with a number of large, potentially unstable boulders. Sheet jointing is present in the in-situ bedrock, and this, combined with other tectonic joint sets, has provided a key mechanism for large slides down slope on exposed bedrock. With each rock fall (evidenced by boulders strewn in gullies, over the lower parts of the slope, and on the beach) the failure mechanism migrates upslope. In order to proceed with construction, the developer needs to mitigate the identified rock fall hazard. The methods used to study the hazard and the key findings are presented in this paper. Discussion of mitigation options is provided in the conclusion. KEY METHODS USED TO STUDY THE HAZARD: In summary, the methods used to study the hazard for the False Cape project include: 1. The development of a digital elevation model (DEM) used to delineate rock fall runout Zones [1], including the spatial location of boulder fields mapped within Zones (Figure 2). A Zone is defined as an area above the development on steep-sided slopes where falling rocks are channeled into gullies and/or are contained between topographic features such as ridges and spurs that extend down the mountainside. These natural barriers generally ensure that falling rocks do not fall or roll into adjacent Zones; 2. The use of the 'Flow Path Tracing Tool' in ArcGIS Spatial Analyst to confirm typical descents of boulders in Zones. These were shown to correlate strongly with the endpoints of boulders observed within the development and major clusters of boulders on the beach front; 3. The use of 2-D rockfall trajectory analyses [2] using sections cut along typical 3-D trajectory paths mapped out in ArcGIS per Zone. Sections along typical paths in Zones simulated, to some degree, the 3-D effect or path of rocks as they bounce and roll down slope (Figure 3); 4. The calibration of rockfall input parameters (coefficients of normal and tangential restitution, slope roughness, friction angle, etc.) using field-identified endpoints and sizes of fallen rocks and boulders; and 5. Undertaking risk evaluations in order to quantify the potential risk for each independent rockfall Zone. KEY FINDINGS FROM THE STUDY: The key findings from the study include: 1. Multiple potentially unstable in-situ boulders (some in excess of several thousand tonnes) are present above the development. 2. Similar geological structures (dykes, jointing, etc.) are present in the boulders on the beach front and within the development, and in the exposed in-situ bedrock located above the development. Measurement and comparison of the orientation of these geological structures in the boulders with that observed in the in-situ bedrock provided strong evidence that the boulders have migrated down slope. 3. Eight discrete Rockfall Runout Zones were identified using the digital elevation model set up in ArcGIS (Figure 4). The bound
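    The 2-D trajectory analyses described above can be illustrated with a minimal ballistic-bounce model: free flight under gravity, with normal and tangential restitution coefficients applied at each impact on a planar slope. The slope angle, restitution values, and release height below are placeholders, not calibrated False Cape parameters.

```python
import math

# 2-D ballistic-bounce rockfall sketch: free flight under gravity, with normal
# and tangential restitution coefficients (rn, rt) applied at each impact on a
# planar slope. Slope angle, restitution values, and release height are
# placeholders, not calibrated False Cape parameters.

def simulate_rockfall(slope_deg=38.0, drop_h=5.0, rn=0.35, rt=0.85,
                      dt=0.002, g=9.81, max_x=500.0):
    theta = math.radians(slope_deg)
    tx, ty = math.cos(theta), -math.sin(theta)   # downslope tangent (unit vector)
    nx, ny = math.sin(theta), math.cos(theta)    # outward slope normal (unit vector)
    x, y, vx, vy = 0.0, drop_h, 0.0, 0.0         # released at rest above the slope
    impacts = []
    while x < max_x:
        vy -= g * dt                             # ballistic step
        x += vx * dt
        y += vy * dt
        if y < -x * math.tan(theta):             # crossed the slope surface
            vn = vx * nx + vy * ny               # impact velocity components
            vt = vx * tx + vy * ty
            impacts.append((x, 0.5 * (vx * vx + vy * vy)))   # (x, KE per unit mass)
            vx = rt * vt * tx - rn * vn * nx     # damp and reflect at the surface
            vy = rt * vt * ty - rn * vn * ny
            y = -x * math.tan(theta)             # snap back onto the surface
            if vx * vx + vy * vy < 0.05:         # rock has effectively stopped
                break
    return impacts

for xi, ke in simulate_rockfall()[:5]:
    print(f"impact at x = {xi:6.1f} m, specific KE = {ke:6.1f} J/kg")
```

    In a real analysis the restitution coefficients would be calibrated against field-identified endpoints, as described in item 4 of the methods above.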

  11. Studying Fire Mitigation Strategies in Multi-Ownership Landscapes

    E-print Network

    He, Hong S.

    Studying Fire Mitigation Strategies in Multi-Ownership Landscapes: Balancing the Management of Fire ... and succession model (LANDIS) to evaluate the relative effectiveness of four alternative fire mitigation ... urban interface; forest succession; simulation modeling; fire risk mitigation. INTRODUCTION

  12. Improving Tsunami Hazard Mitigation and Preparedness Using Real-Time and Post-Tsunami Field Data

    NASA Astrophysics Data System (ADS)

    Wilson, R. I.; Miller, K. M.

    2012-12-01

    The February 27, 2010 Chile and March 11, 2011 Japan tsunamis caused dramatic loss of life and damage in the near-source region, and notable impacts in distant coastal regions like California. Comprehensive real-time and post-tsunami field surveys and the availability of hundreds of videos within harbors and marinas allow for detailed documentation of these two events by the State of California Tsunami Program, which receives funding through the National Tsunami Hazard Mitigation Program. Although neither event caused significant inundation of dry land in California, dozens of harbors sustained damage totaling nearly $100-million. Information gathered from these events has guided new strategies in tsunami evacuation planning and maritime preparedness. Scenario-specific, tsunami evacuation "playbook" maps and guidance are being produced detailing inundation from tsunamis of various size and source location. These products help coastal emergency managers prepare local response plans when minor distant source tsunamis or larger tsunamis from local and regional sources are generated. In maritime communities, evaluation of strong tsunami currents and damage are being used to validate/calibrate numerical tsunami model currents and produce in-harbor hazard maps and identify offshore safety zones for potential boat evacuation when a tsunami Warning is issued for a distant source event. Real-time and post-tsunami field teams have been expanded to capture additional detailed information that can be shared in a more timely manner during and after an event through a state-wide clearinghouse. These new products and related efforts will result in more accurate and efficient emergency response by coastal communities, potentially reducing the loss of lives and property during future tsunamis.

  13. Volcanic hazard in Mexico: a comprehensive on-line database for risk mitigation

    NASA Astrophysics Data System (ADS)

    Manea, Marina; Constantin Manea, Vlad; Capra, Lucia; Bonasia, Rosanna

    2013-04-01

    Researchers are currently working on several key aspects of the Mexican volcanoes, such as remote sensing, field data of old and recent volcaniclastic deposits, structural framework, monitoring (rainfall data and visual observation of lahars), and laboratory experiments (analogue models and numerical simulations - Fall3D, Titan2D). Each investigation is focused on specific processes, but it is fundamental to visualize the global status of the volcano in order to understand its behavior and to mitigate future hazards. The Mexican Volcanoes @nline database represents a novel initiative aimed at collecting, on a systematic basis, the complete set of data obtained so far on the volcanoes, and at continuously updating the database with new data. All the information is compiled from published works and updated frequently. Maps, such as the geological map of the Mexican volcanoes and the associated hazard zonation, as well as point data, such as stratigraphic sections, sedimentology and diagrams of rainfall intensities, are presented in Google Earth format in order to be easily accessed by the scientific community and the general public. An important section of this online database is the presentation of numerical simulation results for ash dispersion associated with the principal Mexican active volcanoes. Daily prediction of ash flow dispersion (based on real-time data from CENAPRED and the Mexican Meteorological Service), as well as large-scale high-resolution subduction simulations performed on HORUS (the Computational Geodynamics Laboratory's supercomputer), represent a central part of the Mexican Volcanoes @nline database. The Mexican Volcanoes @nline database is maintained by the Computational Geodynamics Laboratory and is based entirely on Open Source software. The website can be visited at: http://www.geociencias.unam.mx/mexican_volcanoes.

  14. Past, Present, and Future Challenges in Earthquake Hazard Mitigation of Indonesia: A Collaborative Work of Geological Agency Indonesia and Geoscience Australia

    NASA Astrophysics Data System (ADS)

    Hidayati, S.; Cummins, P. R.; Cipta, A.; Omang, A.; Griffin, J.; Horspool, N.; Robiana, R.; Sulaeman, C.

    2012-12-01

    In the last decade, Indonesia has suffered from earthquake disasters: four of the twelve earthquakes worldwide with more than 1,000 casualties occurred in Indonesia. The great Sumatra earthquake of December 26, 2004, followed by a tsunami that cost 227,898 lives, brought Indonesia and its active tectonic setting to the world's attention. Therefore the government of Indonesia encourages hazard mitigation efforts that are more focused on the pre-disaster phase. In response to government policy in earthquake disaster mitigation, the Geological Agency of Indonesia aims to meet the need for rigorous earthquake hazard maps throughout the country at provincial scale by 2014. A collaborative work with Geoscience Australia, through short-term training missions, on-going training, mentoring, assistance and studying in Australia, under the auspices of the Australia-Indonesia Facility for Disaster Reduction (AIFDR), has accelerated the production of these maps. Since the collaboration began in 2010, provincial earthquake hazard maps of Central Java (2010) and of West Sulawesi, Gorontalo, and North Maluku (2011) have been published using the probabilistic seismic hazard assessment (PSHA) method. In 2012, by the same method, maps for the remaining provinces of Sulawesi Island, Papua, North Sumatera and Jambi will be published. By the end of 2014, hazard maps for all 33 Indonesian provinces will be delivered. The future challenges are to work together with the stakeholders, to produce district-scale maps and to establish a national standard for earthquake hazard maps. Moreover, the most important consideration is to build the capacity to update, maintain and revise the maps as new information becomes available.
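    A stripped-down version of the PSHA calculation behind such maps is sketched below for a single source zone: a truncated Gutenberg-Richter recurrence model is combined, via the total-probability summation, with a simple hypothetical ground-motion relation with lognormal scatter to give annual exceedance rates for peak ground acceleration. All rates and coefficients are illustrative, not those used for the Indonesian provincial maps.

```python
import math

# Single-source PSHA sketch: truncated Gutenberg-Richter recurrence, one site at
# fixed distance, and a hypothetical ground-motion relation
# ln(PGA[g]) = c0 + c1*M - c2*ln(R) with lognormal scatter (sigma). All rates and
# coefficients are illustrative, not those used for the Indonesian provincial maps.

def annual_exceedance_rate(pga_g, rate_mmin=0.05, b=1.0, m_min=5.0, m_max=8.0,
                           r_km=50.0, c0=-3.5, c1=0.9, c2=1.0, sigma=0.6):
    beta = b * math.log(10.0)
    dm, lam, m = 0.1, 0.0, m_min + 0.05
    while m < m_max:
        # truncated-exponential magnitude pdf (Gutenberg-Richter between m_min and m_max)
        f_m = beta * math.exp(-beta * (m - m_min)) / (1.0 - math.exp(-beta * (m_max - m_min)))
        ln_median = c0 + c1 * m - c2 * math.log(r_km)
        z = (math.log(pga_g) - ln_median) / sigma
        p_exceed = 0.5 * math.erfc(z / math.sqrt(2.0))       # P(PGA > pga | m, r)
        lam += rate_mmin * f_m * dm * p_exceed               # total-probability sum
        m += dm
    return lam

for a in (0.05, 0.1, 0.2, 0.4):
    lam = annual_exceedance_rate(a)
    print(f"PGA > {a:.2f} g: {lam:.4f}/yr (return period ~ {1.0 / lam:,.0f} yr)")
```

    Repeating this summation over many source zones and mapping the ground motion with a fixed exceedance probability (for example, 10% in 50 years) is what produces a provincial-scale hazard map.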

  15. Looking Before We Leap: Recent Results From An Ongoing Quantitative Investigation Of Asteroid And Comet Impact Hazard Mitigation.

    NASA Astrophysics Data System (ADS)

    Plesko, Catherine; Weaver, R. P.; Korycansky, D. G.; Huebner, W. F.

    2010-10-01

    The asteroid and comet impact hazard is now part of public consciousness, as demonstrated by movies, Super Bowl commercials, and popular news stories. However, there is a popular misconception that hazard mitigation is a solved problem. Many people think, 'we'll just nuke it.' There are, however, significant scientific questions remaining in the hazard mitigation problem. Before we can say with certainty that an explosive yield Y at height of burst h will produce a momentum change in or dispersion of a potentially hazardous object (PHO), we need to quantify how and where energy is deposited into the rubble pile or conglomerate that may make up the PHO. We then need to understand how shock waves propagate through the system, what causes them to disrupt, and how long gravitationally bound fragments take to recombine. Here we present numerical models of energy deposition from an energy source into various materials that are known PHO constituents, and rigid body dynamics models of the recombination of disrupted objects. In the energy deposition models, we explore the effects of porosity and standoff distance as well as that of composition. In the dynamical models, we explore the effects of fragment size and velocity distributions on the time it takes for gravitationally bound fragments to recombine. Initial models indicate that this recombination time is relatively short, as little as 24 hours for a 1 km sized PHO composed of 1000 meter-scale self-gravitating fragments with an initial velocity field of v/r = 0.001 1/s.
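    The quoted ~24-hour recombination time can be checked to order of magnitude with a one-fragment toy model: integrate the radial motion of a test fragment launched from the surface of a 1-km-diameter body with the stated v/r = 0.001 1/s velocity field, under the point-mass gravity of the total mass. The assumed bulk density (2000 kg/m^3) and the single-fragment treatment are simplifications, not the paper's N-body rigid-fragment simulations.

```python
import math

# Radial fall-back of a single test fragment launched from the surface of a
# 1-km-diameter body with the stated velocity field v = 0.001/s * r, under the
# point-mass gravity of the total mass. Bulk density is an assumed placeholder;
# this is an order-of-magnitude check, not the paper's N-body fragment model.

G = 6.674e-11                         # m^3 kg^-1 s^-2
radius = 500.0                        # m (1-km-diameter parent body)
density = 2000.0                      # kg/m^3 (assumption)
mu = G * (4.0 / 3.0) * math.pi * radius**3 * density

r, v = radius, 0.001 * radius         # start at the surface, moving radially outward
t, dt = 0.0, 1.0                      # seconds

while True:
    v += -mu / (r * r) * dt           # semi-implicit Euler step under gravity
    r += v * dt
    t += dt
    if v < 0.0 and r <= radius:       # fragment has fallen back to the surface
        break
    if t > 30 * 86400.0:              # bail out if it is effectively escaping
        break

print(f"surface escape speed ~ {math.sqrt(2 * mu / radius):.2f} m/s")
print(f"fall-back time ~ {t / 3600.0:.1f} hours")
```

    With these assumptions the launch speed (0.5 m/s) sits just below the surface escape speed (~0.53 m/s) and the fragment falls back after roughly a day, consistent with the timescale quoted above.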

  16. PREDICTION/MITIGATION OF SUBSIDENCE DAMAGE TO HAZARDOUS WASTE LANDFILL COVERS

    EPA Science Inventory

    Characteristics of Resource Conservation and Recovery Act hazardous waste landfills and of landfilled hazardous wastes have been described to permit development of models and other analytical techniques for predicting, reducing, and preventing landfill settlement and related cove...

  17. Mitigation and benefits measures as policy tools for siting potentially hazardous facilities: determinants of effectiveness and appropriateness.

    PubMed

    Jenkins-Smith, H; Kunreuther, H

    2001-04-01

    How do mitigation and benefits measures affect public acceptance for siting different kinds of potentially hazardous facilities? What kinds of benefits measures are seen as most (or least) appropriate for different kinds of facilities? This study used a nationwide telephone survey consisting of 1,234 interviews with randomly selected respondents to test for the effects of packages of safety and benefits measures for siting a landfill, prison, incinerator and nuclear waste repository. The experimental design used in the survey permits analysis of the fractions of respondents who are willing to change their initial levels of acceptance (or opposition) when presented with a sequence of the safety and benefit measures. The measures vary significantly in their impact on levels of acceptance for the facilities, and some measures that would at face value appear to reassure residents of facility safety turn out to lack credibility and therefore diminish facility acceptance. Ordering of the benefits versus safety measures significantly affects changes in acceptance in surprising ways. The perceived appropriateness of different kinds of benefits measures varies systematically by the type of facility under consideration. It appears that successful benefits packages will directly address the underlying dimensions of concern caused by the facility. These findings point to the importance of further research on "commensurable" benefits measures. PMID:11414544

  18. Mitigating hazards to aircraft from drifting volcanic clouds by comparing and combining IR satellite data with forward transport models

    NASA Astrophysics Data System (ADS)

    Matiella Novak, M. Alexandra

    Volcanic ash clouds in the upper atmosphere (>10 km) present a significant hazard to the aviation community and in some cases cause near-disastrous situations for aircraft that inadvertently encounter them. The two most commonly used techniques for mitigating hazards to aircraft from drifting volcanic clouds are (1) using data from satellite observations and (2) the forecasting of dispersion and trajectories with numerical models. This dissertation aims to aid in the mitigation of this hazard by using Moderate Resolution Imaging Spectroradiometer (MODIS) and Advanced Very High Resolution Radiometer (AVHRR) infrared (IR) satellite data to quantitatively analyze and constrain the uncertainties in the PUFF volcanic ash transport model. Furthermore, this dissertation has experimented with the viability of combining IR data with the PUFF model to increase the model's reliability. Comparing IR satellite data with forward transport models provides valuable information concerning the uncertainty and sensitivity of the transport models. A study analyzing the viability of combining satellite-based information with the PUFF model was also done. Factors controlling the cloud-shape evolution, such as the horizontal dispersion coefficient, vertical distribution of particles, the height of the cloud, and the location of the cloud were all updated based on observations from satellite data in an attempt to increase the reliability of the simulations. Comparing center of mass locations--calculated from satellite data--to HYSPLIT trajectory simulations provides insight into the vertical distribution of the cloud. A case study of the May 10, 2003 Anatahan Volcano eruption was undertaken to assess methods of calculating errors in PUFF simulations with respect to the transport and dispersion of the erupted cloud. An analysis of the factors controlling the cloud-shape evolution in the model was also completed and compared to the shape evolution of the cloud observed in the IR satellite data. An accurate eruption length of 28 hours--based on satellite imagery--resulted in an error growth rate that decreased by 50% from the original simulation. Using a dispersion coefficient that was calculated from satellite imagery further improved the PUFF simulation. Results show that using satellite-based information in the PUFF model decreases the error growth of the simulation by as much as 60%. PUFF simulations were also compared to IR satellite data of four other eruptions: Augustine (2006), Cleveland (2001), Hekla (2000) and Soufriere Hills (2006). The Anatahan, Augustine and Cleveland eruptions produced clouds that were ash-rich. The Hekla and Soufriere Hills eruptions produced clouds that were ice-rich. Mass retrievals performed on the satellite data for these eruptions were holistically compared to determine if the evolution of the ash clouds was dependent on the cloud species and the atmospheric environment into which they were ejected (arctic vs. tropical environments). Analyses show that the ice-rich clouds decrease in mass, area and optical depth more rapidly than the ash-rich clouds. Moreover, error growth rates of the simulated low-latitude eruptions were linear, whereas error growth rates of simulated high-latitude clouds were exponential. Results from this study provide some insight into possible implications for volcanic cloud simulations of ash and ice clouds in differing environments. Finally, a sensitivity analysis of the PUFF model was implemented.
This part of the research generated collaborative research with the University of Alaska, Fairbanks/Alaska Volcano Observatory, where the PUFF model is housed. Transport simulations of the May 10, 2003 Anatahan volcanic cloud were done from the PUFF model's command line. Unlike the web-based version of PUFF, the command line is not publicly accessible but provides substantially more control over the user's definition of certain variables. Results of this study show it is viable and practical to continually update the PUFF simulation of a volcanic cloud's evolution and locatio
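    One of the satellite-derived quantities used in the comparisons above, the cloud's center of mass, reduces to a mass-weighted centroid of the gridded retrieval; the sketch below uses a synthetic mass-loading grid rather than an actual MODIS or AVHRR scene.

```python
import numpy as np

# Mass-weighted center of a drifting cloud from a gridded ash mass-loading
# retrieval (e.g., per-pixel g/m^2). The grid below is synthetic, not an
# actual MODIS or AVHRR scene.

def cloud_center_of_mass(mass_loading, lats, lons):
    """mass_loading: 2-D array (nlat, nlon); lats, lons: 1-D coordinate vectors."""
    total = mass_loading.sum()
    lat_c = (mass_loading.sum(axis=1) * lats).sum() / total
    lon_c = (mass_loading.sum(axis=0) * lons).sum() / total
    return lat_c, lon_c

lats = np.linspace(15.0, 17.0, 40)        # degrees north (synthetic domain)
lons = np.linspace(145.0, 148.0, 60)      # degrees east
lon_grid, lat_grid = np.meshgrid(lons, lats)
# a blob of ash centred near 16.2 N, 146.5 E
loading = np.exp(-((lat_grid - 16.2) ** 2 / 0.1 + (lon_grid - 146.5) ** 2 / 0.3))

print("center of mass: %.2f N, %.2f E" % cloud_center_of_mass(loading, lats, lons))
```

    Tracking how far this centroid drifts from the simulated particle cloud's centroid over time is one simple way to express the error growth rates discussed above.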

  19. Assessment and mitigation of risk from low-probability, high-consequence hazards

    Microsoft Academic Search

    Bruce R. Ellingwood

    2007-01-01

    Modern probabilistic risk assessment of civil infrastructure supports risk mitigation policy by providing insight into the factors that govern performance of civil infrastructure subjected to severe events beyond the design basis and the relative efficiency of various options for risk mitigation. A fully-coupled risk assessment of a system provides estimates of the annual probability of exceeding pre-defined performance levels, defined

  20. Studying Hazard and Risk in Pastoral Societies

    Microsoft Academic Search

    Michael Bollig

    This book centres around the comparison of hazards, risk perception and risk minimising strategies in two African pastoral societies, the Pokot of northern Kenya and the Himba of northern Namibia (see Map 1). Both societies were studied over several years of intensive field research between 1987 and 1999. The central questions guiding the comparative approach are: (1) How are hazards

  1. Catastrophic debris flows transformed from landslides in volcanic terrains: mobility, hazard assessment and mitigation strategies

    USGS Publications Warehouse

    Scott, Kevin M.; Macias, Jose Luis; Naranjo, Jose Antonio; Rodriguez, Sergio; McGeehin, John P.

    2001-01-01

    Communities in lowlands near volcanoes are vulnerable to significant volcanic flow hazards in addition to those associated directly with eruptions. The largest such risk is from debris flows beginning as volcanic landslides, with the potential to travel over 100 kilometers. Stratovolcanic edifices commonly are hydrothermal aquifers composed of unstable, altered rock forming steep slopes at high altitudes, and the terrain surrounding them is commonly mantled by readily mobilized, weathered airfall and ashflow deposits. We propose that volcano hazard assessments integrate the potential for unanticipated debris flows with, at active volcanoes, the greater but more predictable potential of magmatically triggered flows. This proposal reinforces the already powerful arguments for minimizing populations in potential flow pathways below both active and selected inactive volcanoes. It also addresses the potential for volcano flank collapse to occur with instability early in a magmatic episode, as well as the 'false-alarm problem' -- the difficulty in evacuating the potential paths of these large mobile flows. Debris flows that transform from volcanic landslides, characterized by cohesive (muddy) deposits, create risk comparable to that of their syneruptive counterparts of snow and ice-melt origin, which yield noncohesive (granular) deposits, because: (1) Volcano collapses and the failures of airfall- and ashflow-mantled slopes commonly yield highly mobile debris flows as well as debris avalanches with limited runout potential. Runout potential of debris flows may increase several fold as their volumes enlarge beyond volcanoes through bulking (entrainment) of sediment. Through this mechanism, the runouts of even relatively small collapses at Cascade Range volcanoes, in the range of 0.1 to 0.2 cubic kilometers, can extend to populated lowlands. (2) Collapse is caused by a variety of triggers: tectonic and volcanic earthquakes, gravitational failure, hydrovolcanism, and precipitation, as well as magmatic activity and eruptions. (3) Risk of collapse begins with initial magmatic activity and increases as intrusion proceeds. An archetypal debris flow from volcanic terrain occurred in Colombia with a tectonic earthquake (M 6.4) in 1994. The Rio Paez conveyed a catastrophic wave of debris flow over 100 kilometers, coalesced from multiple slides of surficial material weakened both by weathering and by hydrothermal alteration in a large stratovolcano. Similar seismogenic flows occurred in Mexico in 1920 (M ~6.5), Chile in 1960 (M 9.2), and Ecuador in 1987 (M 6.1 and 6.9). Velocities of wave fronts in two examples were 60 to 90 km/hr (17-25 meters per second) over the initial 30 kilometers. Volcano flank and sector collapses may produce untransformed debris avalanches, as occurred initially at Mount St. Helens in 1980. However, at least as common is direct transformation of the failed mass to a debris flow. At two other volcanoes in the Cascade Range--Mount Rainier and Mount Baker--rapid transformation and high mobility were typical of most of at least 15 Holocene flows. This danger exists downstream from many stratovolcanoes worldwide; the population at risk is near 150,000 and increasing at Mount Rainier. The first step in preventing future catastrophes is documenting past flows. Deposits of some debris flows, however, can be mistaken for those of less-mobile debris avalanches on the basis of mounds formed by buoyed megaclasts. Megaclasts may record only the proximal phase of a debris flow that began as a debris avalanche. Runout may have extended much farther, and thus future flow mobility may be underestimated. Processes and behaviors of megaclast-bearing paleoflows are best inferred from the intermegaclast matrix. Mitigation strategy can respond to volcanic flows regardless of type and trigger by: (1) Avoidance: Limit settlement in flow pathways to numbers that can be evacuated after event warnings (flow is occurring). (2) Instrumental even

  2. Seismicity and seismotectonics of southern Ghana: lessons for seismic hazard mitigation

    NASA Astrophysics Data System (ADS)

    Amponsah, Paulina

    2014-05-01

    Ghana is located on the West African craton, far from the major earthquake zones of the world, and is therefore largely considered a stable region. However, the southern part of the country is seismically active. Records of damaging earthquakes in Ghana date as far back as 1615. A study of the microseismic activity in southern Ghana shows that the seismic activity is linked with active faulting between the east-west trending Coastal boundary fault and the northeast-southwest trending Akwapim fault zone. Epicentres of most of the earthquakes have been located close to the area where the two major faults intersect, which can be related to the level of activity of the faults. Some of the epicentres have been located offshore and can be associated with the level of activity of the Coastal boundary fault. A review of the geological and instrumental records of earthquakes in Ghana shows that earthquakes have occurred in the past and are still liable to occur within the vicinity of the intersection of the Akwapim fault zone and the Coastal boundary fault. Data from both historical and instrumental records indicate that the most seismically active area in Ghana is west of Accra, where the Akwapim fault zone and the Coastal boundary fault intersect. There are numerous minor faults in the intersection area between the Akwapim fault zone and the Coastal boundary fault, and this mosaic of faults has major implications for seismic activity in the area. Earthquake disaster mitigation measures have been put in place in recent times to reduce the impact of any major event that may occur in the country. The National Disaster Management Organization has produced a building guide to assist in the mitigation of earthquake and flood disasters in the country. The building guide clearly stipulates the kinds of materials to be used and their proportions, what should go into the foundation of a one- or two-storey building, the electrical materials to be used, and many other details.

  3. Exploratory Studies Facility Subsurface Fire Hazards Analysis

    SciTech Connect

    J. L. Kubicek

    2001-09-07

    The primary objective of this Fire Hazard Analysis (FHA) is to confirm the requirements for a comprehensive fire and related hazards protection program for the Exploratory Studies Facility (ESF) are sufficient to minimize the potential for: (1) The occurrence of a fire or related event. (2) A fire that causes an unacceptable on-site or off-site release of hazardous or radiological material that will threaten the health and safety of employees, the public or the environment. (3) Vital U.S. Department of Energy (DOE) programs suffering unacceptable interruptions as a result of fire and related hazards. (4) Property losses from a fire and related events exceeding limits established by DOE. (5) Critical process controls and safety class systems being damaged as a result of a fire and related events.

  4. Exploratory Studies Facility Subsurface Fire Hazards Analysis

    SciTech Connect

    Richard C. Logan

    2002-03-28

    The primary objective of this Fire Hazard Analysis (FHA) is to confirm the requirements for a comprehensive fire and related hazards protection program for the Exploratory Studies Facility (ESF) are sufficient to minimize the potential for: The occurrence of a fire or related event; A fire that causes an unacceptable on-site or off-site release of hazardous or radiological material that will threaten the health and safety of employees, the public or the environment; Vital U.S. Department of Energy (DOE) programs suffering unacceptable interruptions as a result of fire and related hazards; Property losses from a fire and related events exceeding limits established by DOE; and Critical process controls and safety class systems being damaged as a result of a fire and related events.

  5. Probabilistic Tsunami Hazard Assessment: the Seaside, Oregon Pilot Study

    NASA Astrophysics Data System (ADS)

    Gonzalez, F. I.; Geist, E. L.; Synolakis, C.; Titov, V. V.

    2004-12-01

    A pilot study of Seaside, Oregon is underway, to develop methodologies for probabilistic tsunami hazard assessments that can be incorporated into Flood Insurance Rate Maps (FIRMs) developed by FEMA's National Flood Insurance Program (NFIP). Current NFIP guidelines for tsunami hazard assessment rely on the science, technology and methodologies developed in the 1970s; although generally regarded as groundbreaking and state-of-the-art for its time, this approach is now superseded by modern methods that reflect substantial advances in tsunami research achieved in the last two decades. In particular, post-1990 technical advances include: improvements in tsunami source specification; improved tsunami inundation models; better computational grids by virtue of improved bathymetric and topographic databases; a larger database of long-term paleoseismic and paleotsunami records and short-term, historical earthquake and tsunami records that can be exploited to develop improved probabilistic methodologies; better understanding of earthquake recurrence and probability models. The NOAA-led U.S. National Tsunami Hazard Mitigation Program (NTHMP), in partnership with FEMA, USGS, NSF and Emergency Management and Geotechnical agencies of the five Pacific States, incorporates these advances into site-specific tsunami hazard assessments for coastal communities in Alaska, California, Hawaii, Oregon and Washington. NTHMP hazard assessment efforts currently focus on developing deterministic, "credible worst-case" scenarios that provide valuable guidance for hazard mitigation and emergency management. The NFIP focus, on the other hand, is on actuarial needs that require probabilistic hazard assessments such as those that characterize 100- and 500-year flooding events. There are clearly overlaps in NFIP and NTHMP objectives. NTHMP worst-case scenario assessments that include an estimated probability of occurrence could benefit the NFIP; NFIP probabilistic assessments of 100- and 500-yr events could benefit the NTHMP. The joint NFIP/NTHMP pilot study at Seaside, Oregon is organized into three closely related components: Probabilistic, Modeling, and Impact studies. Probabilistic studies (Geist, et al., this session) are led by the USGS and include the specification of near- and far-field seismic tsunami sources and their associated probabilities. Modeling studies (Titov, et al., this session) are led by NOAA and include the development and testing of a Seaside tsunami inundation model and an associated database of computed wave height and flow velocity fields. Impact studies (Synolakis, et al., this session) are led by USC and include the computation and analyses of indices for the categorization of hazard zones. The results of each component study will be integrated to produce a Seaside tsunami hazard map. This presentation will provide a brief overview of the project and an update on progress, while the above-referenced companion presentations will provide details on the methods used and the preliminary results obtained by each project component.
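    As a rough illustration of what the 100- and 500-year event terminology above means probabilistically, the sketch below (Python) converts a mean recurrence interval into an exceedance probability over an exposure window. It assumes a simple stationary Poisson recurrence model, which is our assumption for illustration and not necessarily the probabilistic model adopted in the pilot study.

```python
import math

def exceedance_probability(return_period_yr: float, exposure_yr: float) -> float:
    """P(at least one event in exposure_yr) under a stationary Poisson model."""
    return 1.0 - math.exp(-exposure_yr / return_period_yr)

for T in (100.0, 500.0):                      # "100-year" and "500-year" events
    p30 = exceedance_probability(T, 30.0)     # e.g. a 30-year exposure horizon
    print(f"{T:.0f}-yr event: {p30:.0%} chance of at least one in 30 years")
# 100-yr event: 26% chance of at least one in 30 years
# 500-yr event: 6% chance of at least one in 30 years
```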

  6. Evaluation and mitigation of lightning hazards to the space shuttle Solid Rocket Motors (SRM)

    NASA Technical Reports Server (NTRS)

    Rigden, Gregory J.; Papazian, Peter B.

    1988-01-01

    The objective was to quantify electric field strengths in the Solid Rocket Motor (SRM) propellant in the event of a worst case lightning strike. Using transfer impedance measurements for selected lightning protection materials and 3D finite difference modeling, a retrofit design approach for the existing dielectric grain cover and railcar covers was evaluated and recommended for SRM segment transport. A safe level of 300 kV/m was determined for the propellant. The study indicated that a significant potential hazard exists for unprotected segments during rail transport. However, modified railcar covers and grain covers are expected to prevent lightning attachment to the SRM and to reduce the levels to several orders of magnitude below 300 kV/m.
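    To put the quoted field levels in perspective, the short Python sketch below computes the attenuation a protective cover would need in order to bring a hypothetical unprotected interior field down to the 300 kV/m propellant threshold, and to a level two orders of magnitude below it. The unprotected field value is a made-up placeholder, not a number from the study.

```python
import math

SAFE_LIMIT_KV_PER_M = 300.0          # propellant safety threshold from the study

def attenuation_db(field_in: float, field_out: float) -> float:
    """Field attenuation, in dB, needed to reduce field_in to field_out."""
    return 20.0 * math.log10(field_in / field_out)

unprotected = 3000.0                              # kV/m, hypothetical placeholder
for target in (SAFE_LIMIT_KV_PER_M, SAFE_LIMIT_KV_PER_M / 100.0):
    print(f"to reach {target:g} kV/m: {attenuation_db(unprotected, target):.0f} dB")
# to reach 300 kV/m: 20 dB
# to reach 3 kV/m: 60 dB
```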

  7. Advances in Remote Sensing Approaches for Hazard Mitigation and Natural Resource Protection in Pacific Latin America: A Workshop for Advanced Graduate Students, Post-Doctoral Researchers, and Junior Faculty

    NASA Astrophysics Data System (ADS)

    Gierke, J. S.; Rose, W. I.; Waite, G. P.; Palma, J. L.; Gross, E. L.

    2008-12-01

    Though much of the developing world has the potential to gain significantly from remote sensing techniques in terms of public health and safety, it often lacks resources for advancing the development and practice of remote sensing. All countries share a mutual interest in furthering remote sensing capabilities for natural hazard mitigation and resource development. With National Science Foundation support from the Partnerships in International Research and Education program, we are developing a new educational system of applied research and engineering for advancing collaborative linkages among agencies and institutions in Pacific Latin American countries (to date: Guatemala, El Salvador, Nicaragua, Costa Rica, Panama, and Ecuador) in the development of remote sensing tools for hazard mitigation and water resources management. The project aims to prepare students for careers in science and engineering through their efforts to solve suites of problems needing creative solutions: collaboration with foreign agencies; living abroad immersed in different cultures; and adapting their academic training to contend with potentially difficult field conditions and limited resources. The ultimate goal of integrating research with education is to encourage cross-disciplinary, creative, and critical thinking in problem solving and foster the ability to deal with uncertainty in analyzing problems and designing appropriate solutions. In addition to traditional approaches for graduate and undergraduate research, we have built new educational systems of applied research and engineering: (1) the Peace Corps/Master's International program in Natural Hazards, which features a 2-year field assignment during service in the U.S. Peace Corps, (2) the Michigan Tech Enterprise program for undergraduates, which gives teams of students from different disciplines the opportunity to work for three years in a business-like setting to solve real-world problems, and (3) a unique university exchange program in natural hazards (E-Haz). Advancements in research have been made, for example, in using thermal remote sensing methods for studying vent and eruptive processes, and in fusing RADARSAT with ASTER imagery to delineate lineaments in volcanic terrains for siting water wells. While these and other advancements are developed in conjunction with our foreign counterparts, the impacts of this work can be broadened through more comprehensive dissemination activities. Towards this end, we are in the planning phase of a Pan American workshop on applications of remote sensing techniques for natural hazards and water resources management. The workshop will last at least two weeks, sometime in July/August 2009, and involve 30-40 participants, with balanced participation from the U.S. and Latin America. In addition to fundamental aspects of remote sensing and digital image processing, the workshop topics will be presented in the context of new developments for studying volcanic processes and hazards and for characterizing groundwater systems.

  8. Disruption Mitigation Studies in DIII--D

    NASA Astrophysics Data System (ADS)

    Taylor, P. L.

    1998-11-01

    Critical to the viability of the tokamak concept, along with the operation and survivability of future devices, is the development of techniques to terminate the discharge safely and mitigate the destructive effects of disruptions: high thermal and electromagnetic loads as well as intense high energy runaway electron beams. A series of dedicated disruption experiments on DIII--D have provided further data on the discharge behavior, thermal loads, halo currents, and runaway electrons and have evaluated techniques to mitigate the disruptions while minimizing runaway electron production. Non-axisymmetric halo currents occur with currents up to 30% of the plasma current and with toroidal peaking factors of 2 at the time of peak halo current. Large heat fluxes are also measured, with up to 100% of the pre-disruption thermal energy deposited on the divertor floor. Fundamental questions on the halo current generation, scaling, and mitigation are being addressed by comparisons of DIII--D plasmas during disruptions with the DINA time-dependent resistive MHD code and with semi-analytic halo current evolution models. Experiments injecting cryogenic impurity ``killer'' pellets of neon, argon, and methane have successfully mitigated these disruption effects; reducing the halo currents by 30%--50%, lowering the halo current asymmetry to near unity, reducing the vertical force on the vessel, and enhancing the power loss through the radiation channel to ~90%. Often runaway electrons are generated following the pellet injection, and results of recent experiments using pre-emptive ``killer'' pellets help benchmark theoretical models of the pellet ablation and plasma energy loss (KPRAD and TSC codes), and of the runaway electron generation (CQL3D Fokker-Planck code). Use of the models has led to two new runaway generation mechanisms: a modification of the standard Dreicer process, and one arising out of instability-induced transport or time-dependent effects. Experiments with a massive helium gas puff (3000 T-l in 7 ms) have also effectively mitigated disruptions, but without the formation of runaway electrons that can occur with ``killer'' pellets. The massive gas puff results provide encouragement for longer term approaches to disruption mitigation, which are focusing on liquid jets.
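    The halo-current figures quoted above (up to 30% of the plasma current with a toroidal peaking factor of about 2) are often combined to estimate the worst-case local halo current that drives local vessel forces. The Python sketch below shows that arithmetic; the 1.5 MA plasma current is a representative DIII-D-scale value assumed for illustration, not a number from the abstract, and the "mitigated" case simply applies the ~40% reduction and near-unity peaking reported for killer-pellet injection.

```python
import math

def peak_local_halo_current_per_radian(i_plasma_MA: float,
                                       halo_fraction: float,
                                       toroidal_peaking_factor: float) -> float:
    """Worst-case halo current per radian of toroidal angle (MA/rad):
    toroidal average (halo_fraction * I_p / 2*pi) scaled by the peaking factor."""
    return halo_fraction * i_plasma_MA / (2.0 * math.pi) * toroidal_peaking_factor

ip = 1.5   # MA, assumed representative plasma current (not from the abstract)
print(f"unmitigated: {peak_local_halo_current_per_radian(ip, 0.30, 2.0):.3f} MA/rad")
print(f"mitigated  : {peak_local_halo_current_per_radian(ip, 0.30 * 0.6, 1.1):.3f} MA/rad")
# The 'mitigated' inputs assume the ~40% halo-current reduction and near-unity
# peaking reported above for killer-pellet injection; exact values vary shot to shot.
```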

  9. Linear Aerospike SR71 Experiment (LASRE): Aerospace Propulsion Hazard Mitigation Systems

    Microsoft Academic Search

    Masashi Mizukami; Griffin P. Corpening; Ronald J. Ray; Neal Hass; Kimberly A. Ennix; Scott M. Lazaroff

    1998-01-01

    A major hazard posed by the propulsion system of hypersonic and space vehicles is the possibility of fire or explosion in the vehicle environment. The hazard is mitigated by minimizing or detecting, in the vehicle environment, the three ingredients essential to producing fire: fuel, oxidizer, and an ignition source. The Linear Aerospike SR-71 Experiment (LASRE) consisted of a linear aerospike rocket engine integrated into one-half of an X-33-like

  10. Piloted Simulation to Evaluate the Utility of a Real Time Envelope Protection System for Mitigating In-Flight Icing Hazards

    NASA Technical Reports Server (NTRS)

    Ranaudo, Richard J.; Martos, Borja; Norton, Bill W.; Gingras, David R.; Barnhart, Billy P.; Ratvasky, Thomas P.; Morelli, Eugene

    2011-01-01

    The utility of the Icing Contamination Envelope Protection (ICEPro) system for mitigating a potentially hazardous icing condition was evaluated by 29 pilots using the NASA Ice Contamination Effects Flight Training Device (ICEFTD). ICEPro provides real time envelope protection cues and alerting messages on pilot displays. The pilots participating in this test were divided into two groups: a control group using baseline displays without ICEPro, and an experimental group using ICEPro-driven display cueing. Each group flew identical precision approach and missed approach procedures with a simulated failure case icing condition. Pilot performance, workload, and survey questionnaires were collected for both groups of pilots. Results showed that real time assessment cues were effective in reducing the number of potentially hazardous upset events and in lessening exposure to loss of control following an incipient upset condition. Pilot workload with the added ICEPro displays was not measurably affected, but pilot opinion surveys showed that real time cueing greatly improved their situation awareness of a hazardous aircraft state.

  11. The respiratory health hazards of volcanic ash: a review for volcanic risk mitigation

    NASA Astrophysics Data System (ADS)

    Horwell, Claire J.; Baxter, Peter J.

    2006-07-01

    Studies of the respiratory health effects of different types of volcanic ash have been undertaken only in the last 40 years, and mostly since the eruption of Mt. St. Helens in 1980. This review of all published clinical, epidemiological and toxicological studies, and other work known to the authors up to and including 2005, highlights the sparseness of studies on acute health effects after eruptions and the complexity of evaluating the long-term health risk (silicosis, non-specific pneumoconiosis and chronic obstructive pulmonary disease) in populations from prolonged exposure to ash due to persistent eruptive activity. The acute and chronic health effects of volcanic ash depend upon particle size (particularly the proportion of respirable-sized material), mineralogical composition (including the crystalline silica content) and the physico-chemical properties of the surfaces of the ash particles, all of which vary between volcanoes and even eruptions of the same volcano, but adequate information on these key characteristics is not reported for most eruptions. The incidence of acute respiratory symptoms (e.g. asthma, bronchitis) varies greatly after ashfalls, from very few, if any, reported cases to population outbreaks of asthma. The studies are inadequate for excluding increases in acute respiratory mortality after eruptions. Individuals with pre-existing lung disease, including asthma, can be at increased risk of their symptoms being exacerbated after falls of fine ash. A comprehensive risk assessment, including toxicological studies, to determine the long-term risk of silicosis from chronic exposure to volcanic ash, has been undertaken only in the eruptions of Mt. St. Helens (1980), USA, and Soufrière Hills, Montserrat (1995 onwards). In the Soufrière Hills eruption, a long-term silicosis hazard has been identified and sufficient exposure and toxicological information obtained to make a probabilistic risk assessment for the development of silicosis in outdoor workers and the general population. A more systematic approach to multi-disciplinary studies in future eruptions is recommended, including establishing an archive of ash samples and a website containing health advice for the public, together with scientific and medical study guidelines for volcanologists and health-care workers.

  12. Assessing NEO hazard mitigation in terms of astrodynamics and propulsion systems requirements.

    PubMed

    Remo, John L

    2004-05-01

    Uncertainties associated with assessing valid near-Earth object (NEO) threats and carrying out interception missions place unique and stringent burdens on designing mission architecture, astrodynamics, and spacecraft propulsion systems. A prime uncertainty is associated with the meaning of NEO orbit predictability regarding Earth impact. Analyses of past NEO orbits and impact probabilities indicate uncertainties in determining if a projected NEO threat will actually materialize within a given time frame. Other uncertainties concern the estimated mass, composition, and structural integrity of the NEO body. At issue is whether one can reliably estimate a NEO threat and its magnitude. Parameters that determine NEO deflection requirements within various time frames, including the terminal orbital pass before impact, and necessary energy payloads, are quantitatively discussed. Propulsion system requirements for extending space capabilities to rapidly interact with NEOs at ranges of up to about 1 AU (astronomical unit) from Earth are outlined. Such missions, without gravitational boosts, are deemed critical for a practical and effective response to mitigation. If an impact threat is confirmed on an immediate orbital pass, the option for interactive reconnaissance, interception, and subsequent NEO orbit deflection must be promptly carried out. There also must be an option to abort the mitigation mission if the NEO is subsequently found not to be Earth threatening. These options require optimal decision latitude and operational possibilities for NEO threat removal while minimizing alarm. Acting too far in advance of the projected impact could induce perturbations that ultimately exacerbate the threat. Given the dilemmas, uncertainties, and limited options associated with timely NEO mitigation within a decision making framework, currently available propulsion technologies that appear most viable to carry out a NEO interception/mitigation mission with the greatest margin of control and reliability are those based on a combined (bimodal) nuclear thermal/nuclear electric propulsion platform. Elements of required and currently available performance characteristics for nuclear and electric propulsion systems are also discussed. PMID:15220155
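    One way to see why lead time dominates the deflection requirements discussed above is the standard small-impulse rule of thumb that an along-track velocity change Δv applied a time t before the projected impact shifts the encounter position by roughly 3·Δv·t. The Python sketch below uses that approximation, which is our illustration and not a result from the paper, to estimate the Δv needed to build up a miss distance of one Earth radius for several warning times.

```python
# Rule-of-thumb NEO deflection sizing: an along-track delta-v applied a
# time t before encounter displaces the arrival position by ~3 * dv * t
# (small-impulse, near-circular-orbit approximation). Illustrative only.

EARTH_RADIUS_M = 6.371e6
YEAR_S = 3.156e7

def required_dv(lead_time_years: float, miss_distance_m: float = EARTH_RADIUS_M) -> float:
    """Approximate along-track delta-v (m/s) to build up miss_distance_m."""
    return miss_distance_m / (3.0 * lead_time_years * YEAR_S)

for years in (1.0, 10.0, 30.0):
    print(f"{years:4.0f} yr lead time: delta-v ~ {required_dv(years)*1000:.1f} mm/s")
#    1 yr lead time: delta-v ~ 67.3 mm/s
#   10 yr lead time: delta-v ~ 6.7 mm/s
#   30 yr lead time: delta-v ~ 2.2 mm/s
```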

  13. The Relation of Hazard Awareness to Adoption of Approved Mitigation Measures.

    ERIC Educational Resources Information Center

    Saarinen, Thomas F.

    The relationship between an individual's or community's awareness of natural hazards and subsequent behavior change is examined in this review of research. The document is presented in seven sections. Following Section I, the introduction, Section II discusses the role of experience in behavior change. Section III examines the role of education…

  14. GLACIER HAZARDS AT BELVEDERE GLACIER AND THE MONTE ROSA EAST FACE, ITALIAN ALPS: PROCESSES AND MITIGATION

    E-print Network

    Kääb, Andreas

    In summer 2001, the Belvedere glacier, Macugnaga, Italian Alps, started a surge ... confronting the responsible authorities with complex problems without precedent in the European Alps.

  15. Earth sciences, GIS and geomatics for natural hazards assessment and risks mitigation: a civil protection perspective

    Microsoft Academic Search

    Luigi Perotti; Riccardo Conte; Massimo Lanfranco; Gianluigi Perrone; Marco Giardino; Sara Ratto

    2010-01-01

    Geo-information and remote sensing are powerful tools for enhancing functional strategies that increase awareness of natural hazards and risks and for supporting research and operational activities devoted to disaster reduction. Improved Earth Sciences knowledge coupled with advanced Geomatics technologies has been developed by the joint research group and applied by the ITHACA (Information Technology for Humanitarian Assistance, Cooperation and

  16. Airborne surveys and monitoring of the Earth - application to the mitigation of natural and anthropogenic hazards

    NASA Astrophysics Data System (ADS)

    Okuma, Shigeo

    2014-03-01

    Airborne surveys are highly useful for their capacity to map, monitor, and forecast natural and anthropogenic hazards safely and efficiently from the air, and for understanding the Earth's global structures. This special issue is based on papers presented at a session of the Japan Geoscience Union Meeting 2012 which was held on 23 May 2012 in Chiba, Japan.

  17. Debris flood hazard documentation and mitigation on the Tilcara alluvial fan (Quebrada de Humahuaca, Jujuy province, North-West Argentina)

    NASA Astrophysics Data System (ADS)

    Marcato, G.; Bossi, G.; Rivelli, F.; Borgatti, L.

    2012-06-01

    For some decades, mass wasting processes such as landslides and debris floods have been threatening villages and transportation routes in the Rio Grande Valley, known as the Quebrada de Humahuaca. One of the most significant examples is the urban area of Tilcara, built on a large alluvial fan. In recent years, debris flood phenomena have been triggered in the tributary valley of the Huasamayo Stream and reached the alluvial fan on a decadal basis. In view of proper development of the area, hazard and risk assessment together with risk mitigation strategies are of paramount importance. The need is urgent also because the Quebrada de Humahuaca was recently included in the UNESCO World Cultural Heritage list. Therefore, the growing tourism industry may lead to uncontrolled exploitation and urbanization of the valley, with a consequent increase of the vulnerability of the elements exposed to risk. In this context, structural and non-structural mitigation measures not only have to be based on the understanding of natural processes, but also have to consider environmental and sociological factors that could hinder the effectiveness of the countermeasure works. The hydrogeological processes are described with reference to present-day hazard and risk conditions. Considering the socio-economic context, some possible interventions are outlined, which take account of budget constraints and local practices. One viable solution would be to build a protective dam upstream of the fan apex and an artificial channel, in order to divert the floodwaters into a gully that would then convey water and sediments into the Rio Grande, some kilometers downstream of Tilcara. The proposed remedial measures should employ easily available and relatively cheap technologies and local workers, and should have low environmental and visual impacts, in order to ensure both the future conservation of the site and its safe exploitation for inhabitants and tourists.

  18. Detecting Slow Deformation Signals Preceding Dynamic Failure: A New Strategy For The Mitigation Of Natural Hazards (SAFER)

    NASA Astrophysics Data System (ADS)

    Vinciguerra, Sergio; Colombero, Chiara; Comina, Cesare; Ferrero, Anna Maria; Mandrone, Giuseppe; Umili, Gessica; Fiaschi, Andrea; Saccorotti, Gilberto

    2014-05-01

    Rock slope monitoring is a major aim in territorial risk assessment and mitigation. The high velocity that usually characterizes the failure phase of rock instabilities makes traditional instruments based on slope deformation measurements unsuitable for early warning systems. On the other hand, the use of acoustic emission records has often been a good tool for slope monitoring in underground mining. Here we aim to identify the characteristic signs of impending failure by deploying a "site specific" microseismic monitoring system on an unstable patch of the Madonna del Sasso landslide in the Italian Western Alps, designed to monitor subtle changes of the mechanical properties of the medium and installed as close as possible to the source region. The initial characterization, based on geomechanical and geophysical tests, allowed us to understand the instability mechanism and to design the monitoring systems to be placed. Stability analysis showed that the stability of the slope is due to rock bridges. Their progressive failure can result in a global slope failure. Consequently, the rock bridges potentially generating dynamic ruptures need to be monitored. A first array, consisting of instruments provided by the University of Turin, was deployed in October 2013: 4 triaxial 4.5 Hz seismometers connected to a 12-channel data logger, arranged in a 'large aperture' configuration which encompasses the entire unstable rock mass. Preliminary data indicate the occurrence of microseismic swarms with different spectral contents. Two additional geophones and 4 triaxial piezoelectric accelerometers able to operate at frequencies up to 23 kHz will be installed during summer 2014. This will allow us to develop a network capable of recording events with Mw < 0.5 and frequencies between 700 Hz and 20 kHz. Rock physical and mechanical characterization, along with rock deformation laboratory experiments in which the evolution of related physical parameters under simulated conditions of stress and fluid content will be studied, and theoretical modelling will allow us to come up with a full hazard assessment and to test new methodologies for a much wider scale of applications within the EU.

  19. Natural Hazard Mitigation thru Water Augmentation Strategies to Provide Additional Snow Pack for Water Supply and Hydropower Generation in Drought Stressed Alps/Mountains

    Microsoft Academic Search

    D. Matthews; M. Brilly

    2009-01-01

    Climate variability and change are clearly stressing water supplies in high alpine regions of the Earth. These recent long-term natural hazards present critical challenges to policy makers and water managers. This paper addresses strategies to use enhanced scientific methods to mitigate the problem. Recent rapid depletions of glaciers and intense droughts throughout the world have created a need to reexamine

  20. Marine and Hydrokinetic Renewable Energy Devices, Potential Navigational Hazards and Mitigation Measures

    SciTech Connect

    Cool, Richard, M.; Hudon, Thomas, J.; Basco, David, R.; Rondorf, Neil, E.

    2009-12-01

    On April 15, 2008, the Department of Energy (DOE) issued a Funding Opportunity Announcement for Advanced Water Power Projects which included a Topic Area for Marine and Hydrokinetic Renewable Energy Market Acceleration Projects. Within this Topic Area, DOE identified potential navigational impacts of marine and hydrokinetic renewable energy technologies and measures to prevent adverse impacts on navigation as a sub-topic area. DOE defines marine and hydrokinetic technologies as those capable of utilizing one or more of the following resource categories for energy generation: ocean waves; tides or ocean currents; free flowing water in rivers or streams; and energy generation from the differentials in ocean temperature. PCCI was awarded Cooperative Agreement DE-FC36-08GO18177 from the DOE to identify the potential navigational impacts and mitigation measures for marine hydrokinetic technologies. A technical report addressing our findings is available on this Science and Technology Information site under the Product Title, "Marine and Hydrokinetic Renewable Energy Technologies: Potential Navigational Impacts and Mitigation Measures". This product is a brochure, primarily for project developers, that summarizes important issues in that more comprehensive report, identifies locations where that report can be downloaded, and identifies points of contact for more information.

  1. Natural Hazards and Effects on Local Populations: Applications of NSF MARGINS research to hazards mitigation in Central America

    E-print Network

    Marshall, Jeffrey S.

    , with less attention directed toward the behavior of quiescent volcanoes, or to the long-term evolution ... assessments will benefit from studies of magma generation and movement, volatile and fluid fluxes, ground

  2. Protecting new health facilities from natural hazards: guidelines for the promotion of disaster mitigation.

    PubMed

    2004-01-01

    The health sector is particularly vulnerable to naturally occurring events. The vulnerability of the health infrastructure (hospitals and clinics) is of particular concern. Not only are such facilities vulnerable structurally, but their ability to continue to provide essential functions may be severely compromised, thus leaving the stricken population without essential services. This paper summarizes a more detailed document, Guidelines for Vulnerability Reduction in the Design of New Health Facilities published by the Pan-American Health Organization (PAHO)/ World Health Organization (WHO). The current document summarizes these Guidelines emphasizing how they may be used, by whom, and for what purpose. Potential users of the Guidelines include, but are not limited to: (1) initiators of health facility construction projects; (2) executors and supervisors of health facility construction projects; and (3) financing bodies in charge of funding health facility construction projects. The Guidelines include: (1) implications of natural phenomena upon the health infrastructure; (2) guidelines for vulnerability reduction for incorporation into development project cycles; (3) definitive phases and stages within the phases for development projects including: (I) Projects Assessment (needs assessment; assessment of options, the preliminary project); (II) Investment (project design, construction); and (III) Operational Activities (operations and maintenance). In addition, investment in damage reduction measures, policies and regulations, training and education, and the role of international organizations in the promotion and funding of mitigation strategies are addressed. PMID:15645629

  3. Using Darwin's theory of atoll formation to improve tsunami hazard mitigation in the Pacific

    NASA Astrophysics Data System (ADS)

    Goff, J. R.; Terry, J. P.

    2012-12-01

    It is 130 years since Charles Darwin's death and 176 years since he penned his subsidence theory of atoll formation on 12th April 1836 during the voyage of the Beagle through the Pacific. This theory, founded on the premise of a subsiding volcano and the corresponding upward growth of coral reef, was astonishing for the time considering the absence of an underpinning awareness of plate tectonics. Furthermore, with the exception of the occasional permutation and opposing idea, his theory has endured and has an enviable longevity amongst paradigms in geomorphology. In his theory, Darwin emphasised the generally circular morphology of the atoll shape and, surprisingly, the validity of this simple morphological premise has never been questioned. There are, however, few atolls in the Pacific Ocean that attain such a simple morphology, with most manifesting one or more arcuate 'bight-like' structures (ABLSs). These departures from the circular form complicate his simple model and are indicative of geomorphological processes in the Pacific Ocean which cannot be ignored. ABLSs represent the surface morphological expression of major submarine failures of atoll volcanic foundations. Such failures can occur during any stage of atoll formation and are a valuable addition to Darwin's theory because they indicate the instability of the volcanic foundations. It is widely recognized in the research community that sector/flank collapses of island edifices are invariably tsunamigenic, and yet we have no clear understanding of how significant such events are in the tsunami hazard arena. The recognition of ABLSs, however, now offers scientists the opportunity to establish a first order database of potential local and regional tsunamigenic sources associated with the sector/flank collapses of island edifices. We illustrate the talk with examples of arcuate 'bight-like' structures and associated tsunamis in atoll and atoll-like environments. The implications for our understanding of tsunami hazards are profound. In essence, at present we are seriously under-estimating the significance of locally and regionally generated tsunamis throughout the entire Pacific Ocean, but we now have the opportunity to enhance our understanding of such events.

  4. Correlates of hazards education for youth: a replication study

    Microsoft Academic Search

    Kevin R. Ronan; Kylie Crellin; David Johnston

    2010-01-01

    Youth and families have been identified as particularly vulnerable to the effects of hazardous events. This study examined correlates of hazards education involvement for youth. Participants were 407 youth between the ages of 7 and 18 who filled out several indices reflecting hazards awareness, risk perceptions, psychological factors, knowledge, and adoption of hazards adjustments and family emergency plans. Additionally, interactive

  5. Sea otter oil-spill mitigation study

    SciTech Connect

    Davis, R.W.; Thomas, J.; Williams, T.M.; Kastelein, R.; Cornell, L.

    1986-05-01

    The objective of the study was to analyze the effectiveness of existing capture, transport, cleaning, and rehabilitation methods and develop new methods to reduce the impact of an accidental oil spill on California sea otters, resulting from present conditions or from future Outer Continental Shelf (OCS) oil and gas development in State or Federal waters. In addition, the study investigated whether or not a systematic difference in thermal conductivity existed between the pelts of Alaska and California sea otters. This was done to assure the validity of conclusions drawn from the oiling experiments carried out at Hubbs Marine Research Institute. Tetra Tech, Inc. contributed to the overall study by preparing a literature review and report on the fate and effects of oil dispersants and chemically dispersed oil.

  6. State of Colorado Wildfire Hazard Mitigation Plan

    E-print Network

    State of Colorado Wildfire Hazard Mitigation Plan / Colorado Multi-Hazards Mitigation Plan, July 2002 ... the importance of the August 1995 Wildfire Hazard Mitigation Plan and its predecessors as foundation documents on which to build and judge progress in wildfire hazard mitigation. The text version of the 1995 Plan

  7. Probing Aircraft Flight Test Hazard Mitigation for the Alternative Fuel Effects on Contrails and Cruise Emissions (ACCESS) Research Team. Volume 2; Appendices

    NASA Technical Reports Server (NTRS)

    Kelly, Michael J.

    2013-01-01

    The Alternative Fuel Effects on Contrails and Cruise Emissions (ACCESS) Project Integration Manager requested in July 2012 that the NASA Engineering and Safety Center (NESC) form a team to independently assess aircraft structural failure hazards associated with the ACCESS experiment and to identify potential flight test hazard mitigations to ensure flight safety. The ACCESS Project Integration Manager subsequently requested that the assessment scope be focused predominantly on structural failure risks to the aircraft empennage (horizontal and vertical tail). This report contains the Appendices to Volume I.

  8. Volcanic Ash Image Products from MODIS for Aviation Safety and Natural Hazard Mitigation

    NASA Astrophysics Data System (ADS)

    Stephens, G.; Ellrod, G. P.; Im, J.

    2003-12-01

    Multi-spectral volcanic ash image products have been developed using Moderate Resolution Imaging Spectroradiometer (MODIS) data from the NASA Terra spacecraft (Ellrod and Im 2003). Efforts are now underway to integrate these new products into the MODIS Data Retrieval System at NESDIS, for use in the operational Hazard Mapping System (HMS). The images will be used at the Washington Volcanic Ash Advisory Center (W-VAAC) in the issuance of volcanic ash advisory statements to aircraft. In addition, the images will be made available to users in the global volcano and emergency management community via the World Wide Web. During the development process, good results (a high detection rate with few false alarms) were obtained from a tri-spectral combination of MODIS infrared (IR) bands centered near 8.6, 11.0 and 12.0 µm (Bands 29, 31, and 32). Optimum Red-Green-Blue false color composite images were developed to provide information on ash cloud location, as well as cloud phase and surface characteristics, to aid in interpretation both day and night. Information on volcanic ash derived from the tri-spectral product was displayed using the red color gun. This information was combined with visible (0.6 µm) and near-IR (1.6 µm) data for green and blue, respectively, during daylight periods. At night, the 8.6 - 11.0 µm combination and the 11.0 µm band were used for the green and blue colors in the RGB product. Currently, raw MODIS data in five-minute "granules" are processed for the following regions: (1) southern Alaska, (2) Mexico, Central America and the Caribbean, and (3) the northern Andes region of South America. Image products are converted to Geo-spatial Information System (GIS) compatible formats for use in the HMS, and to Man-Computer Interactive Data Access System (McIDAS) "Area File" format for use in currently configured W-VAAC display systems. The installation of a high speed, fiber optic line from NASA Goddard Space Flight Center to the World Weather Building, Camp Springs, Maryland (scheduled for completion by Fall, 2003) will allow a full set of data to be processed from both the Terra and Aqua spacecraft.
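    The tri-spectral test described above builds on the classic 'reverse absorption' signature: silicate ash tends to make the 11 µm brightness temperature colder than the 12 µm one, the opposite of most meteorological cloud, while the 8.6 µm band helps separate ash- and SO2-bearing plumes from other scenes. The sketch below (Python/NumPy) is a minimal, generic brightness-temperature-difference flag with assumed thresholds; it is not the NESDIS operational algorithm and does not use its actual coefficients.

```python
import numpy as np

def ash_flag(bt086: np.ndarray, bt110: np.ndarray, bt120: np.ndarray,
             split_window_thresh: float = -0.5,
             tri_spectral_thresh: float = 0.0) -> np.ndarray:
    """Boolean ash mask from brightness temperatures (K) in MODIS bands
    29 (8.6 um), 31 (11.0 um), and 32 (12.0 um).

    split-window test : BT11 - BT12 < split_window_thresh  (reverse absorption)
    tri-spectral test : BT8.6 - BT11 < tri_spectral_thresh (assumed extra screen)
    Thresholds here are illustrative guesses, not operational values.
    """
    split_window = (bt110 - bt120) < split_window_thresh
    tri_spectral = (bt086 - bt110) < tri_spectral_thresh
    return split_window & tri_spectral

# Tiny synthetic example: one ash-like pixel, one cloud-like pixel.
bt086 = np.array([265.0, 272.0])
bt110 = np.array([268.0, 270.0])
bt120 = np.array([270.0, 268.0])
print(ash_flag(bt086, bt110, bt120))   # [ True False]
```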

  9. Conceptual Study on Air Ingress Mitigation for VHTRs

    SciTech Connect

    Chang H. Oh; Eung S. Kim

    2012-09-01

    An air-ingress accident following a pipe break is considered a critical event for very high temperature gas-cooled reactor (VHTR) safety. Following helium depressurization, it is anticipated that, unless countermeasures are taken, air will enter the core through the break, leading to oxidation of the in-core graphite structure. Thus, without mitigation features, this accident might lead to severe exothermic chemical reactions of graphite and oxygen, depending on the accident scenario and the design. Under extreme circumstances, a loss of core structural integrity may occur and lead to a detrimental situation for VHTR safety. This paper discusses various air-ingress mitigation concepts applicable to VHTRs. The study begins by identifying important factors (or phenomena) associated with the air-ingress accident using a root-cause analysis. By preventing the main causes of the important events identified in the root-cause diagram, basic air-ingress mitigation ideas were conceptually developed. Among them, two concepts were finally evaluated as effective candidates. One concept is to inject helium into the lower plenum, which is a direct in-vessel helium injection. The other concept is to enclose the reactor with a non-pressure boundary having an opening at the bottom, which is an ex-vessel enclosure boundary. Computational fluid dynamics (CFD) methods were used to validate these concepts. As a result, it was shown that both concepts can effectively mitigate the air-ingress process. In the first concept, the injected helium replaces the air in the core and the upper part of the lower plenum by buoyancy force because of its low density. It prevented air from moving into the reactor core, showing great potential for mitigating graphite oxidation in the core. In the second concept, the air-ingress rate is controlled by molecular diffusion through the opening at the enclosure bottom after depressurization. A modified reactor cavity design is expected to play this role in the VHTRs.

  10. Assessing the Relationship Between Hazard Mitigation Plan Quality and Rural Status in a Cohort of 57 Counties from 3 States in the Southeastern U.S.

    E-print Network

    Horney, Jennifer A.; Naimi, Ashley I.; Lyles, Ward; Simon, Matt; Salvesen, David; Berke, Philip

    2013-08-13

    similar hazard exposures that predominantly include flooding, hurricanes and tornadoes. Establishing that consistent differences exist between rural and urban plans across these plan quality principles could play an important role in mitigating... with plan coding and secondary data collection. Funding was provided by the United States Department of Agriculture’s National Institute of Food and Agriculture (UDSA NIFA) 2009-06143. The findings and conclusions in this report are those of the authors...

  11. Scientific Animations for Tsunami Hazard Mitigation: The Pacific Tsunami Warning Center's YouTube Channel

    NASA Astrophysics Data System (ADS)

    Becker, N. C.; Wang, D.; Shiro, B.; Ward, B.

    2013-12-01

    Outreach and education save lives, and the Pacific Tsunami Warning Center (PTWC) has a new tool--a YouTube Channel--to advance its mission to protect lives and property from dangerous tsunamis. Such outreach and education is critical for coastal populations nearest an earthquake since they may not get an official warning before a tsunami reaches them and will need to know what to do when they feel strong shaking. Those who live far enough away to receive useful official warnings and react to them, however, can also benefit from PTWC's education and outreach efforts. They can better understand a tsunami warning message when they receive one, can better understand the danger facing them, and can better anticipate how events will unfold while the warning is in effect. The same holds true for emergency managers, who have the authority to evacuate the public they serve, and for the news media, critical partners in disseminating tsunami hazard information. PTWC's YouTube channel supplements its formal outreach and education efforts by making its computer animations available 24/7 to anyone with an Internet connection. Though the YouTube channel is only a month old (as of August 2013), it should rapidly develop a large global audience since similar videos on PTWC's Facebook page have reached over 70,000 viewers during organized media events, while PTWC's official web page has received tens of millions of hits during damaging tsunamis. These animations are not mere cartoons but use scientific data and calculations to render graphical depictions of real-world phenomena as accurately as possible. This practice holds true whether the animation is a simple comparison of historic earthquake magnitudes or a complex simulation cycling through thousands of high-resolution data grids to render tsunami waves propagating across an entire ocean basin. PTWC's animations fall into two broad categories. The first group illustrates concepts about seismology and how it is critical to tsunami warning operations, such as those about earthquake magnitudes, how earthquakes are located, where and how often earthquakes occur, and fault rupture length. The second group uses the PTWC-developed tsunami forecast model, RIFT (Wang et al., 2012), to show how various historic tsunamis propagated through the world's oceans. These animations illustrate important concepts about tsunami behavior such as their speed, how they bend around and bounce off of seafloor features, how their wave heights vary from place to place and in time, and how their behavior is strongly influenced by the type of earthquake that generated them. PTWC's YouTube channel also includes an animation that simulates both seismic and tsunami phenomena together as they occurred for the 2011 Japan tsunami including actual sea-level measurements and proper timing for tsunami alert status, thus serving as a video 'time line' for that event and showing the time scales involved in tsunami warning operations. Finally, PTWC's scientists can use their YouTube channel to communicate with their colleagues in the research community by supplementing their peer-reviewed papers with video 'figures' (e.g., Wang et al., 2012).
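    Several of the concepts these animations illustrate, such as open-ocean tsunami speed, follow directly from shallow-water (long-wave) theory, where the phase speed is c = sqrt(g*h). A minimal Python sketch of that relationship and the resulting trans-ocean travel time is shown below; the depths and distance are illustrative values, not data from PTWC products or the RIFT model.

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def long_wave_speed(depth_m: float) -> float:
    """Shallow-water tsunami phase speed c = sqrt(g * h), in m/s."""
    return math.sqrt(G * depth_m)

for depth in (4000.0, 1000.0, 50.0):          # deep ocean, shelf, nearshore (m)
    c = long_wave_speed(depth)
    print(f"depth {depth:6.0f} m: {c:5.0f} m/s ({c*3.6:4.0f} km/h)")

distance_km = 7500.0                          # illustrative trans-Pacific path
hours = distance_km * 1000.0 / long_wave_speed(4000.0) / 3600.0
print(f"~{hours:.0f} hours to cross {distance_km:.0f} km of 4000 m deep ocean")
# depth 4000 m: 198 m/s (713 km/h); depth 50 m: 22 m/s (80 km/h); ~11 hours
```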

  12. Multi-scale earthquake hazard and risk in the Chinese mainland and countermeasures for the preparedness, mitigation, and management: an overview

    NASA Astrophysics Data System (ADS)

    Wu, Z.; Jiang, C.; Ma, T.

    2012-12-01

    Earthquake hazard and risk in the Chinese mainland exhibit multi-scale characteristics. Temporal scales from centuries to months, spatial scales from the whole mainland to specific engineering structures, and energy scales from great disastrous earthquakes to small earthquakes causing social disturbance and economic loss together characterize the complexity of earthquake disasters. To cope with this complex challenge, several research and application projects have been undertaken in recent years. Lessons and experiences of the 2008 Wenchuan earthquake contributed much to the launching and conducting of these projects. Understanding of the scientific problems and the technical approaches taken in the mainstream studies in the Chinese mainland does not differ significantly from that in the international scientific community, although the use of some terminologies shows "cultural differences" - for instance, in the China Earthquake Administration (CEA), the terminology "earthquake forecast/prediction (study)" is generally used in a much broader sense, mainly indicating time-dependent seismic hazard at different spatio-temporal scales. Several scientific products have been produced to serve society in different forms. These scientific products have unique academic merits due to their long-term persistence and forward-forecast nature, which are essential for the evaluation of technical performance and the falsification of scientific ideas. On the other hand, using the language of the "actor network theory" (ANT) in science studies (or the sociology of science), at present the hierarchical "actors' network" that transforms the science into actions of the public and government for the preparedness, mitigation, and management of multi-scale earthquake disasters is still in need of careful construction and improvement.

  13. 3-D seismic structure of the Kachchh, Gujarat, and its implications for the earthquake hazard mitigation

    Microsoft Academic Search

    A. P. Singh; O. P. Mishra; B. K. Rastogi; Dinesh Kumar

    2011-01-01

    Several studies of the January 26, 2001, Bhuj earthquake (Mw 7.6) revealed that the mainshock was triggered on a hidden, unmapped fault in the western part of the Indian stable continental region and caused huge losses in the entire Kachchh rift basin of Gujarat, India. Occurrences of infrequent earthquakes of Mw 7.6 due to the existence of hidden and

  14. A shape memory alloy-based reusable hysteretic damper for seismic hazard mitigation

    NASA Astrophysics Data System (ADS)

    Zhang, Y.; Zhu, S.

    2007-10-01

    This paper presents a special shape memory alloy-based hysteretic damper with distinctive features such as tunable hysteretic behavior and ability to withstand several design level earthquakes. Superelastic nitinol stranded wires are used for energy dissipation in this damping device, termed a reusable hysteretic damper (RHD). By adjusting its design parameters, the hysteretic behavior of the RHD can be modified to best fit the needs for passive structural control applications. Adjustable design parameters of the RHD include the inclination angle of the nitinol wires, pretension level, and friction effect. A simulation-based parametric study was carried out to examine the effects of these design parameters of the RHD on its energy dissipating performance. The effectiveness of the RHD in passive seismic response control of civil engineering structures is examined through a nonlinear dynamic analysis of a three-story steel frame building with and without an RHD. The simulation results suggest that it can effectively reduce the structural response of building structures subjected to strong earthquakes. With proper design, an RHD can be reused for several strong earthquakes without the need for repair, due to the high fatigue life of nitinol wires.

  15. RAGE Hydrocode Modeling of Asteroid Mitigation: new simulations with parametric studies for uncertainty quantification

    NASA Astrophysics Data System (ADS)

    Weaver, R.; Plesko, C. S.; Gisler, G. R.

    2013-12-01

    We are performing detailed hydrodynamic simulations of the interaction of a strong explosion with sample asteroid objects. The purpose of these simulations is to apply modern hydrodynamic codes that have been well verified and validated (V&V) to the problem of mitigating the hazard from a potentially hazardous object (PHO), an asteroid or comet that is on an Earth crossing orbit. The code we use for these simulations is the RAGE code from Los Alamos National Laboratory [1-6]. Initial runs were performed using a spherical object. Next we ran simulations using the shape form of a known asteroid: 25143 Itokawa. This particular asteroid is not a PHO, but we use its shape to consider the influence of non-spherical objects. The initial work was performed using 2D cylindrically symmetric simulations and simple geometries. We then performed a major fully 3D simulation. For an Itokawa-size object (~500 m) and explosion energies ranging from 0.5 to 1 megaton, the velocities imparted to all of the PHO "rocks" in all cases were many m/s. The velocities calculated were much larger than escape velocity and would preclude re-assembly of the fragments. The dispersion of the asteroid remnants is very directional from a surface burst, with all fragments moving away from the point of the explosion. This detail can be used to time the intercept for maximum movement off the original orbit. Results from these previous studies will be summarized for background. In the new work presented here we show a variety of parametric studies around these initial simulations. We modified the explosion energy by +/- 20% and varied the internal composition from a few large "rocks" to several hundred smaller rocks. The results of these parametric studies will be presented. We have also extended our work [6],[7] to stand-off nuclear bursts and will present the initial results for the energy deposition by a generic source into the non-uniform composition asteroid. The goal of this new work is to obtain an "optimal stand-off" distance from detailed radiation transport-hydrodynamic simulations with generic explosion properties. The initial results of these two studies will also be presented. References [1] Gittings, Weaver et al., 'The RAGE radiation-hydrodynamics code,' Comp. Sci. Disc. 1 (2008) 015005, November 21, 2008. [2] Huebner, W.F. et al., 'The Engagement Space for Countermeasures Against Potentially Hazardous Objects (PHOs),' International Conference on Asteroid and Comet Hazards, 2009, held at the Russian Academy of Sciences, St. Petersburg, 21-25 September 2009. [3] Gisler, Weaver, Mader, & Gittings, 'Two and three dimensional asteroid impact simulations,' Computing in Science & Engineering, 6, 38 (2004). [4] NASA geometry courtesy of S.J. Ostro et al. (2002) in Asteroids III. [5] Itokawa image courtesy of JAXA: http://www.isas.jaxa.jp/e/snews/2005/1102.shtml#pic001 [6] Plesko, C. et al., 'Looking Before we Leap: Recent Results from an Ongoing, Quantitative Investigation of Asteroid and Comet Impact Hazard Mitigation,' Division of Planetary Sciences 2010. [7] Plesko, C. et al., 'Numerical Models of Asteroid and Comet Impact Hazard Mitigation by Nuclear Stand-Off Burst,' Planetary Defense Conference 2011.
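    The claim above that fragment velocities of many m/s preclude re-assembly can be checked against the escape velocity of an Itokawa-sized body. The Python sketch below does that arithmetic for an assumed 500 m diameter and an assumed bulk density of 2000 kg/m3 (roughly Itokawa-like); both values are our assumptions for illustration, not inputs quoted from the simulations.

```python
import math

G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2

def escape_velocity(radius_m: float, density_kg_m3: float) -> float:
    """Escape velocity (m/s) of a homogeneous sphere: sqrt(2*G*M/r)."""
    mass = (4.0 / 3.0) * math.pi * radius_m**3 * density_kg_m3
    return math.sqrt(2.0 * G * mass / radius_m)

v_esc = escape_velocity(radius_m=250.0, density_kg_m3=2000.0)
print(f"escape velocity ~ {v_esc:.2f} m/s")   # ~0.26 m/s
# Fragment speeds of several m/s therefore exceed escape velocity by an
# order of magnitude or more, consistent with the no-re-assembly argument.
```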

  16. Landsat TM and ETM+ Time Sequence of Lahar Hazards on Fuego Volcano, Guatemala

    Microsoft Academic Search

    S. L. Reif; G. J. Bluth; W. I. Rose; O. Matias

    2003-01-01

    Volcanic hazards pose a threat to a large portion of the world's population, especially secondary hazards due to remobilization of volcanic material, such as landslides and lahars. Many hazard-prone areas would benefit from remote sensing tools for hazard mitigation. In this study, we propose to use remote sensing and GIS techniques to map these hazard-prone areas around Fuego volcano,

  17. Hazardous and radioactive waste incineration studies

    SciTech Connect

    Vavruska, J.S.; Stretz, L.A.; Borduin, L.C.

    1981-01-01

    Development and demonstration of a transuranic (TRU) waste volume-reduction process is described. A production-scale controlled air incinerator using commercially available equipment and technology has been modified for solid radioactive waste service. This unit successfully demonstrated the volume reduction of transuranic (TRU) waste with an average TRU content of about 20 nCi/g. The same incinerator and offgas treatment system is being modified further to evaluate the destruction of hazardous liquid wastes such as polychlorinated biphenyls (PCBs) and hazardous solid wastes such as pentachlorophenol (PCP)-treated wood.

  18. Natural Hazard Mitigation thru Water Augmentation Strategies to Provide Additional Snow Pack for Water Supply and Hydropower Generation in Drought Stressed Alps/Mountains

    NASA Astrophysics Data System (ADS)

    Matthews, D.; Brilly, M.

    2009-12-01

    Climate variability and change are clearly stressing water supplies in high alpine regions of the Earth. These recent long-term natural hazards present critical challenges to policy makers and water managers. This paper addresses strategies to use enhanced scientific methods to mitigate the problem. Recent rapid depletions of glaciers and intense droughts throughout the world have created a need to reexamine modern water augmentation technologies for enhancing snow pack in mountainous regions. Today’s reliance on clean efficient hydroelectric power in the Alps and the Rocky Mountains poses a critical need for sustainable snow packs and high elevation water supplies throughout the year. Hence the need to make natural cloud systems more efficient precipitators during the cold season through anthropogenic weather modification techniques. The Bureau of Reclamation, US Department of the Interior, has spent over $39M in research from 1963 to 1990 to develop the scientific basis for snow pack augmentation in the headwaters of the Colorado, American, and Columbia River Basins in the western United States, and through USAID in Morocco in the High Atlas Mountains. This paper presents a brief summary of the research findings and shows that even during drought conditions potential exists for significant, cost-effective enhancement of water supplies. Examples of ground-based propane and AgI seeding generators, and of cloud physics studies of supercooled cloud droplets and ice crystal characteristics that indicate seeding potential, will be shown. Hypothetical analyses of seeding potential in 17 western states from Montana to California will be presented based on observed SNOTEL snow water equivalent measurements, and distributed by elevation and observed winter precipitation. Early studies indicated that 5 to 20% increases in snow pack were possible if winter storm systems were seeded effectively. If this potential was realized in drought conditions observed in 2003, over 1.08 million acre-feet (1.33 × 10^9 m3) of additional water could be captured by seeding efficiently and effectively in just 10 storms. Recent projects sponsored by the National Science Foundation, NOAA, and the States of Wyoming, Utah and Nevada, and conducted by the National Center for Atmospheric Research will be discussed briefly. Examples of conditions in extreme droughts of the Western United States will be presented that show potential to mitigate droughts in these regions through cloud seeding. Implications for American and European hydropower generation and sustainable water supplies will be discussed.
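    The volume quoted above can be checked with a simple unit conversion, and the 5-20% seeding-potential range can be turned into illustrative yields for an assumed baseline snowpack volume. The Python sketch below shows both; the baseline snow-water volume is a placeholder assumption, and only the 1.08 million acre-feet figure and the 5-20% range come from the abstract.

```python
ACRE_FOOT_M3 = 1233.48                      # 1 acre-foot in cubic metres

# Check the volume quoted in the abstract.
af = 1.08e6
print(f"{af:.2e} acre-feet = {af * ACRE_FOOT_M3:.2e} m^3")   # ~1.33e+09 m^3

# Illustrative seeding yield for an assumed baseline snow-water volume.
baseline_af = 10.0e6                        # acre-feet, placeholder assumption
for gain in (0.05, 0.20):                   # 5-20% range cited in the abstract
    print(f"{gain:.0%} increase: ~{baseline_af * gain / 1e6:.1f} million acre-feet")
```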

  19. Collaborative Studies Target Volcanic Hazards in Central America

    E-print Network

    Bluth, Gregg

    with the local volcanological and hazard mitigation agencies, and all benefit from the sharing of manpower. ... Currently, two main agencies within Guatemala deal with volcano hazards. INSIVUMEH (Instituto Nacional de ...) ... who maintain small observatories and observe the volcanoes, reporting daily by radio.

  20. A design study on complexity reduced multipath mitigation

    NASA Astrophysics Data System (ADS)

    Wasenmüller, U.; Brack, T.; Groh, I.; Staudinger, E.; Sand, S.; Wehn, N.

    2012-09-01

    Global navigation satellite systems, e.g. the current GPS and the future European Galileo system, are frequently used in car navigation systems or smart phones to determine the position of a user. The calculation of the mobile position is based on the signal propagation times between the satellites and the mobile terminal. At least four time of arrival (TOA) measurements from four different satellites are required to resolve the position uniquely. Further, the satellites need to be in line-of-sight to the receiver for exact position calculation. However, in an urban area, the direct path may be blocked and the resulting multipath propagation causes errors on the order of tens of meters for each measurement and, in the case of non-line-of-sight (NLOS) reception, positive errors on the order of hundreds of meters. In this paper an advanced algorithm for multipath mitigation known as CRMM is presented. CRMM features reduced algorithmic complexity and superior performance in comparison with other state-of-the-art multipath mitigation algorithms. Simulation results demonstrate the significant improvements in position calculation in environments with severe multipath propagation. Nevertheless, in relation to traditional algorithms, an increased effort is required for real-time signal processing due to the large amount of data that has to be processed in parallel. Based on CRMM, we performed a comprehensive design study including a design space exploration for the tracking unit hardware part and a prototype implementation for hardware complexity estimation.
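    For context, the sketch below shows the standard way a receiver position and clock bias are estimated from four or more TOA/pseudorange measurements by iterative least squares. The satellite geometry and measurement values are illustrative assumptions, and the snippet does not implement the CRMM algorithm described in the paper.

```python
import numpy as np

def solve_position(sat_pos, pseudoranges, iters=10):
    """Iterative least-squares solution of the TOA/pseudorange equations.

    sat_pos      : (N, 3) satellite positions in metres (N >= 4)
    pseudoranges : (N,)   measured pseudoranges in metres
    Returns the estimated receiver position (3,) and clock bias in metres.
    """
    x = np.zeros(4)                                      # [x, y, z, clock bias in metres]
    for _ in range(iters):
        rho = np.linalg.norm(sat_pos - x[:3], axis=1)    # geometric ranges
        residual = pseudoranges - (rho + x[3])           # measured minus predicted
        # Jacobian: unit vectors from receiver towards satellites (negated), plus clock column
        H = np.hstack([-(sat_pos - x[:3]) / rho[:, None], np.ones((len(rho), 1))])
        dx, *_ = np.linalg.lstsq(H, residual, rcond=None)
        x += dx
    return x[:3], x[3]

# Illustrative geometry (positions in metres), not real ephemeris data:
sats = np.array([[20200e3, 0, 0], [0, 20200e3, 0],
                 [0, 0, 20200e3], [14000e3, 14000e3, 5000e3]])
truth = np.array([6371e3, 0.0, 0.0])
clock_bias = 300.0                                       # receiver clock error, in metres
ranges = np.linalg.norm(sats - truth, axis=1) + clock_bias

pos, bias = solve_position(sats, ranges)
print(pos, bias)                                         # recovers truth and the 300 m bias
```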

  1. System Safety Hazards Assessment in Conceptual Program Trade Studies

    NASA Technical Reports Server (NTRS)

    Eben, Dennis M.; Saemisch, Michael K.

    2003-01-01

    Providing a program in the concept development phase with a method of determining the system safety benefits of potential concepts has always been a challenge. Lockheed Martin Space and Strategic Missiles has developed a methodology for deriving a relative system safety ranking from the potential hazards of each concept. The resulting output supports program decisions by providing system safety as an evaluation criterion, together with supporting data. This approach begins with a generic hazards list that is tailored for the program being studied and augmented with an initial hazard analysis. Each proposed concept is assessed against the list of program hazards and ranked in three derived areas. The hazards can be weighted to emphasize those that are of greater concern to the program. Sensitivities can also be determined to test the robustness of the conclusions.

  2. Airflow Hazard Visualization for Helicopter Pilots: Flight Simulation Study Results

    NASA Technical Reports Server (NTRS)

    Aragon, Cecilia R.; Long, Kurtis R.

    2005-01-01

    Airflow hazards such as vortices or low level wind shear have been identified as a primary contributing factor in many helicopter accidents. US Navy ships generate airwakes over their decks, creating potentially hazardous conditions for shipboard rotorcraft launch and recovery. Recent sensor developments may enable the delivery of airwake data to the cockpit, where visualizing the hazard data may improve safety and possibly extend ship/helicopter operational envelopes. A prototype flight-deck airflow hazard visualization system was implemented on a high-fidelity rotorcraft flight dynamics simulator. Experienced helicopter pilots, including pilots from all five branches of the military, participated in a usability study of the system. Data was collected both objectively from the simulator and subjectively from post-test questionnaires. Results of the data analysis are presented, demonstrating a reduction in crash rate and other trends that illustrate the potential of airflow hazard visualization to improve flight safety.

  3. Caribbean Tsunami and Earthquake Hazards Studies

    NSDL National Science Digital Library

    This portal provides information on the seismicity and plate tectonics of the active boundary between the North American plate and the northeast corner of the Caribbean plate, and the research program being conducted there by the United States Geological Survey (USGS). There are links to maps and remote imagery of the plate boundary and the Caribbean Trench, and to publications and news articles on seismic and tsunami hazards, seafloor mapping, plate interactions, and submarine slides. There is also a movie that describes the geologic background and USGS research efforts in the area.

  4. Mitigating Resistance to Teaching Science Through Inquiry: Studying Self

    NASA Astrophysics Data System (ADS)

    Spector, Barbara; Burkett, Ruth S.; Leard, Cyndy

    2007-04-01

    This is the report of a qualitative emergent-design study of 2 different Web-enhanced science methods courses for preservice elementary teachers in which an experiential learning strategy, labeled “using yourself as a learning laboratory,” was implemented. Emergent grounded theory indicated this strategy, when embedded in a course organized as an inquiry with specified action foci, contributed to mitigating participants’ resistance to learning and teaching through inquiry. Enroute to embracing inquiry, learners experienced stages resembling the stages of grief one experiences after a major loss. Data sources included participant observation, electronic artifacts in WebCT, and interviews. Findings are reported in 3 major sections: “Action Foci Common to Both Courses,” “Participants’ Growth and Change,” and “Challenges and Tradeoffs.”

  5. Peru mitigation assessment of greenhouse gases: Sector -- Energy. Peru climate change country study; Final report

    SciTech Connect

    NONE

    1996-08-01

    The aim of this study is to compile the greenhouse gas inventory and propose greenhouse gas mitigation alternatives so that the country can face its future development in a clean environmental setting without delaying the development process required to improve the Peruvian standard of living. This executive abstract concisely presents the greenhouse gas mitigation results for Peru for the period 1990--2015. The mitigation studies for the energy sector are summarized here.

  6. Experimental study designs to improve the evaluation of road mitigation measures for wildlife.

    PubMed

    Rytwinski, Trina; van der Ree, Rodney; Cunnington, Glenn M; Fahrig, Lenore; Findlay, C Scott; Houlahan, Jeff; Jaeger, Jochen A G; Soanes, Kylie; van der Grift, Edgar A

    2015-05-01

    An experimental approach to road mitigation that maximizes inferential power is essential to ensure that mitigation is both ecologically effective and cost-effective. Here, we set out the need for and standards of using an experimental approach to road mitigation, in order to improve knowledge of the influence of mitigation measures on wildlife populations. We point out two key areas that need to be considered when conducting mitigation experiments. First, researchers need to get involved at the earliest stage of the road or mitigation project to ensure the necessary planning and funds are available for conducting a high quality experiment. Second, experimentation will generate new knowledge about the parameters that influence mitigation effectiveness, which ultimately allows better prediction for future road mitigation projects. We identify seven key questions about mitigation structures (i.e., wildlife crossing structures and fencing) that remain largely or entirely unanswered at the population level: (1) Does a given crossing structure work? What type and size of crossing structures should we use? (2) How many crossing structures should we build? (3) Is it more effective to install a small number of large-sized crossing structures or a large number of small-sized crossing structures? (4) How much barrier fencing is needed for a given length of road? (5) Do we need funnel fencing to lead animals to crossing structures, and how long does such fencing have to be? (6) How should we manage/manipulate the environment in the area around the crossing structures and fencing? (7) Where should we place crossing structures and barrier fencing? We provide experimental approaches to answering each of them using example Before-After-Control-Impact (BACI) study designs for two stages in the road/mitigation project where researchers may become involved: (1) at the beginning of a road/mitigation project, and (2) after the mitigation has been constructed, highlighting real case studies when available. PMID:25704749
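    To make the BACI logic concrete, the following minimal sketch computes the basic before-after-control-impact contrast (the change at the impact site minus the change at the control site) for an invented road-kill data set; the numbers are hypothetical and are not drawn from the study.

```python
import numpy as np

# Hypothetical road-kill counts (animals/km/year) at impact and control sites,
# before and after a crossing structure plus fencing is installed.
before_impact  = np.array([12.0, 15.0, 11.0])
after_impact   = np.array([ 4.0,  5.0,  6.0])
before_control = np.array([10.0, 13.0, 12.0])
after_control  = np.array([11.0, 12.0, 10.0])

# The BACI contrast: change at the impact site minus change at the control site.
baci_effect = (after_impact.mean() - before_impact.mean()) - (
    after_control.mean() - before_control.mean()
)
print(f"BACI effect: {baci_effect:+.1f} animals/km/year")  # negative => mitigation reduced road-kill
```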

  7. Hazardous materials incidents on major highways -- A case study

    SciTech Connect

    McElhaney, M.S.

    1995-12-31

    Personnel from both the public and private sectors have been involved for many years in pre-planning for hazardous materials releases at fixed installations all over the world. As a result of several major petroleum releases during marine transportation, oil companies, private contractors and government agencies have been preparing contingency plans for oil spills and other petroleum product releases in marine settings. Various industry groups have also developed plans for railway and pipeline disasters. These response plans are of varying quality, complexity and usefulness. Organizations such as plant emergency response teams, government agencies, contract response and clean-up crews and fire departments use these plans as a basis for training and resource allocation, hopefully becoming familiar enough with them that the plans are truly useful when product releases occur. Planners and emergency responders to hazardous materials releases must overcome some of the deficiencies which have long stood in the way of efficient and effective response and mitigation efforts. Specifically, they must recognize and involve all resources with which they may respond or interact during an incident. This involvement should begin with the planning stages and carry through to training and emergency response and recovery efforts. They must ensure that they adopt and utilize a common command and control system and that all potential resources know this system thoroughly and train together before the incident occurs. It is only by incorporating these two factors that responders can successfully combat the ever-growing number of unwanted product releases occurring in the more difficult realm of transportation.

  8. Preliminary study on the transport of hazardous materials through tunnels.

    PubMed

    Bubbico, Roberto; Di Cave, Sergio; Mazzarotta, Barbara; Silvetti, Barbara

    2009-11-01

    The risk associated with the road and rail transportation of some hazardous materials along two routes, one including a significant portion in tunnels and the other following the same path but running completely in the open, is assessed. The results show that, for rail transport, no particular risk increase or mitigation is associated with the circulation of dangerous goods through tunnels; on the contrary, for road transport, a risk increase is generally observed in the presence of tunnels. However, for LPG, the risk curve in the open lies above that in tunnels in the high frequency-low fatality zone, owing to the different evolution of the accidental scenarios in the tunnel (assuming no ventilation). The transportation of liquefied nitrogen, not hazardous in the open but potentially asphyxiating in a tunnel, gives rise to a negligible risk when performed by rail, but to a non-negligible one when performed by road. These preliminary results, focused on the risk to the exposed population, suggest that it may be unnecessary to limit dangerous goods circulation through rail tunnels, while, at least for some types of dangerous goods, circulation through road tunnels may be allowed or forbidden based on the results of a specific risk analysis. PMID:19819368
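    The "risk curves" referred to above are societal frequency-fatality (F-N) curves. The sketch below shows how such a curve can be assembled from a list of accident scenarios for one route; the scenario frequencies and fatality estimates are entirely hypothetical.

```python
import numpy as np

# Hypothetical accident scenarios for one transport route:
# (annual frequency, expected fatalities).  Placeholders, not results from the study.
scenarios = [(1e-4, 2), (5e-5, 10), (1e-6, 50)]

freqs = np.array([f for f, _ in scenarios])
fatalities = np.array([n for _, n in scenarios])

# F-N curve: cumulative annual frequency of scenarios causing N or more fatalities.
for n in np.sort(np.unique(fatalities)):
    F = freqs[fatalities >= n].sum()
    print(f"N >= {n:3d} fatalities: F = {F:.1e} per year")
```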

  9. Success in transmitting hazard science

    Microsoft Academic Search

    J. G. Price; T. Garside

    2010-01-01

    Money motivates mitigation. An example of success in communicating scientific information about hazards, coupled with information about available money, is the follow-up action by local governments to actually mitigate. The Nevada Hazard Mitigation Planning Committee helps local governments prepare competitive proposals for federal funds to reduce risks from natural hazards. Composed of volunteers with expertise in emergency management, building standards,

  10. Concerns About Climate Change Mitigation Projects: Summary of Findings from Case Studies in Brazil, India, Mexico, and South Africa

    E-print Network

    1998-01-01

  11. A study on seismicity and seismic hazard for Karnataka State

    NASA Astrophysics Data System (ADS)

    Sitharam, T. G.; James, Naveen; Vipin, K. S.; Raj, K. Ganesha

    2012-04-01

    This paper presents a detailed study of the seismic pattern of the state of Karnataka and also quantifies the seismic hazard for the entire state. In the present work, historical and instrumental seismicity data for Karnataka (within 300 km of the Karnataka political boundary) were compiled and hazard analysis was carried out based on these data. Geographically, Karnataka forms a part of peninsular India, which is tectonically identified as an intraplate region of the Indian plate. Due to the convergent movement of the Indian plate with the Eurasian plate, movements are occurring along major intraplate faults, resulting in seismic activity in the region; hence the hazard assessment of this region is very important. Apart from referring to the seismotectonic atlas for identifying faults and fractures, major lineaments in the study area were also mapped using satellite data. The earthquake events reported by various national and international agencies were collected up to 2009. Declustering of earthquake events was done to remove foreshocks and aftershocks. Seismic hazard analysis was done for the state of Karnataka using both deterministic and probabilistic approaches incorporating logic tree methodology. The peak ground acceleration (PGA) at rock level was evaluated for the entire state considering a grid size of 0.05° × 0.05°. The attenuation relations proposed for stable continental shield regions were used in evaluating the seismic hazard with appropriate weightage factors. Response spectra at rock level for important Tier II cities and Bangalore were evaluated. The contour maps showing the spatial variation of PGA values at bedrock are presented in this work.
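    The logic-tree step mentioned above amounts to combining ground-motion estimates from several attenuation relations using branch weights. The sketch below illustrates that weighting for a single grid cell; the relation names, PGA values and weights are placeholders chosen for illustration, not values from the study.

```python
# Logic-tree weighting of PGA estimates at one 0.05 deg x 0.05 deg grid cell.
# Relation names, PGA values (in g) and branch weights are illustrative only.
pga_estimates = {
    "GMPE_A": 0.12,
    "GMPE_B": 0.09,
    "GMPE_C": 0.15,
}
branch_weights = {"GMPE_A": 0.4, "GMPE_B": 0.3, "GMPE_C": 0.3}

assert abs(sum(branch_weights.values()) - 1.0) < 1e-9      # branch weights must sum to one
weighted_pga = sum(pga_estimates[k] * branch_weights[k] for k in pga_estimates)
print(f"Logic-tree weighted PGA: {weighted_pga:.3f} g")     # 0.120 g for these numbers
```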

  12. Mitigation of hazards from future lahars from Mount Merapi in the Krasak River channel near Yogyakarta, central Java

    USGS Publications Warehouse

    Ege, John R.; Sutikno

    1983-01-01

    Procedures for reducing hazards from future lahars and debris flows in the Krasak River channel near Yogyakarta, Central Java, Indonesia, include (1) determining the history of the location, size, and effects of previous lahars and debris flows, and (2) decreasing flow velocities. The first may be accomplished by geologic field mapping along with acquiring information by interviewing local residents, and the second by increasing the cross sectional area of the river channel and constructing barriers in the flow path.

  13. Using fine-scale fuel measurements to assess wildland fuels, potential fire behavior and hazard mitigation treatments in the southeastern USA.

    SciTech Connect

    Ottmar, Roger, D.; Blake, John, I.; Crolly, William, T.

    2012-01-01

    The inherent spatial and temporal heterogeneity of fuelbeds in forests of the southeastern United States may require fine scale fuel measurements for providing reliable fire hazard and fuel treatment effectiveness estimates. In a series of five papers, an intensive, fine scale fuel inventory from the Savanna River Site in the southeastern United States is used for building fuelbeds and mapping fire behavior potential, evaluating fuel treatment options for effectiveness, and providing a comparative analysis of landscape modeled fire behavior using three different data sources including the Fuel Characteristic Classification System, LANDFIRE, and the Southern Wildfire Risk Assessment. The research demonstrates that fine scale fuel measurements associated with fuel inventories repeated over time can be used to assess broad scale wildland fire potential and hazard mitigation treatment effectiveness in the southeastern USA and similar fire prone regions. Additional investigations will be needed to modify and improve these processes and capture the true potential of these fine scale data sets for fire and fuel management planning.

  14. Studies of RF Biological Hazards from High Power HF Transmitters

    Microsoft Academic Search

    Albert R. Kall; Frank Campellone; H. M. Watts; George C. Henny

    1969-01-01

    A one-year research program is reported concerning an intensive investigation into biological RF radiation hazards from high power, high frequency (HF) antennas. This program consisted of two major phases: the measurement of field strengths in the near vicinity of high power HF band antennas, and coordinated clinical irradiation studies using male albino Wistar rats as the test specimens. An extensive

  15. Safety Hazards in Child Care Settings. CPSC Staff Study.

    ERIC Educational Resources Information Center

    Consumer Product Safety Commission, Washington, DC.

    Each year, thousands of children in child care settings are injured seriously enough to need emergency medical treatment. This national study identified potential safety hazards in 220 licensed child care settings in October and November 1998. Eight product areas were examined: cribs, soft bedding, playground surfacing, playground surface…

  16. A CASE STUDY OF HAZARDOUS WASTES IN CLASS I LANDFILLS

    EPA Science Inventory

    This study documents the average concentration, estimated daily deposition, and partitioning of 17 metal species in hazardous wastes discharged to five Class I landfill sites in the greater Los Angeles area. These sites receive a combined estimated daily volume of 2.3 x 10 to the...

  17. A numerical study of water mist mitigation of tunnel fires

    Microsoft Academic Search

    F. Nmira; J. L. Consalvi; A. Kaiss; A. C. Fernandez-Pello; B. Porterie

    2009-01-01

    A Eulerian–Eulerian two-phase model is developed to numerically investigate the efficiency of water mist systems in mitigating thermoplastic fires in a tunnel. The polydisperse nature of the spray is modeled using the moments of the droplet size distribution function. The system of Favre-averaged Navier–Stokes equations for the gas phase is closed using the k–ε RNG turbulence model, the Eddy-Break-Up–Arrhenius model for

  18. EVOLVE 4.0 orbital debris mitigation studies

    Microsoft Academic Search

    P. H. Krisko; N. L. Johnson; J. N. Opiela

    2001-01-01

    In a continuing effort to limit future space debris generation, the NASA Policy Directive 8710.3 was issued in May 1997. It requires all NASA-sponsored programs to conduct formal assessments in accordance with NASA Safety Standard 1740.14 to quantify the potential to generate debris and to consider debris mitigation options. Recent improvements to the NASA long-term debris environment model, EVOLVE 4.0,

  19. Leak detection, monitoring, and mitigation technology trade study update

    SciTech Connect

    HERTZEL, J.S.

    1998-11-10

    This document is a revision and update to the initial report that describes various leak detection, monitoring, and mitigation (LDMM) technologies that can be used to support the retrieval of waste from the single-shell tanks (SST) at the Hanford Site. This revision focuses on the improvements in the technical performance of previously identified and useful technologies, and it introduces new technologies that might prove to be useful.

  20. Deepwater Gulf of Mexico Shallow Hazards Studies Best Practices

    NASA Astrophysics Data System (ADS)

    Fernandez, M.; Hobbs, B.

    2005-05-01

    ConocoPhillips (COPC) has been involved in deepwater exploration in the Gulf of Mexico for the last 5 years using a dynamically positioned (DP) drillship. As part of the Federal (MMS) and State permitting process for deepwater exploration, COPC actively secures seabed and shallow subsurface hazard surveys and analyses for every potential drillsite. COPC conducts these surveys for at least two main reasons: to be a safe, efficient operator, because seabed and shallow subsurface hazard surveys and analyses are necessary steps of the Exploration Work Flow that help ensure a safe well, and to fulfill MMS (or local government) regulatory requirements. The purpose of shallow geohazards studies is to determine seafloor and sub-bottom conditions, inspect for possible chemosynthetic communities, and provide a shallow hazards assessment in accordance with NTL 2003-G17. During the five years of deepwater exploration, COPC has contracted Fugro Geoservices to perform hazards studies in over 30 offshore blocks. The results of the seabed and shallow geohazards studies are a critical part of the construction of all of our well plans and are dynamically used in all MDTs. These investigations have greatly improved our drilling efficiency by predicting and avoiding possible chemosynthetic communities, sea floor faults, shallow gas, and shallow water flow. COPC's outstanding safety record and environmental stewardship with regard to geohazards has helped us in accelerating certain Exploration Plans (within MMS guidelines). These efforts have saved money and kept the drilling schedule running smoothly. In the last two years, the MMS has given COPC approval to use existing 3D spec seismic volumes for Shallow Hazards Assessment at several locations where applicable. This has saved ConocoPhillips hundreds of thousands of dollars that would otherwise have been spent either acquiring 2D high resolution seismic data or reprocessing an existing 3D data volume. Examples from Selected Prospects: Magnolia (Garden Banks 783/784); Voss (Keathley Canyon 347/391/435); Lorien (Green Canyon 199); Yorick (Green Canyon 391/435)

  1. HOUSEHOLD HAZARDOUS WASTE CHARACTERIZATION STUDY FOR PALM BEACH COUNTY, FLORIDA: A MITE PROGRAM EVALUATION

    EPA Science Inventory

    The objectives of the Household hazardous Waste Characterization Study (the HHW Study) were to quantify the annual household hazardous waste (HHW) tonnages disposed in Palm Beach County, Florida's (the county) residential solid waste (characterized in this study as municipal soli...

  2. The use of questionnaires for acquiring information on public perception of natural hazards and risk mitigation - a review of current knowledge and practice

    NASA Astrophysics Data System (ADS)

    Bird, D. K.

    2009-07-01

    Questionnaires are popular and fundamental tools for acquiring information on public knowledge and perception of natural hazards. Questionnaires can provide valuable information to emergency management agencies for developing risk management procedures. Although many natural hazards researchers describe results generated from questionnaires, few explain the techniques used for their development and implementation. Methodological detail should include, as a minimum, response format (open/closed questions), mode of delivery, sampling technique, response rate and access to the questionnaire to allow reproduction of or comparison with similar studies. This article reviews current knowledge and practice for developing and implementing questionnaires. Key features include questionnaire design, delivery mode, sampling techniques and data analysis. In order to illustrate these aspects, a case study examines methods chosen for the development and implementation of questionnaires used to obtain information on knowledge and perception of volcanic hazards in a tourist region in southern Iceland. Face-to-face interviews highlighted certain issues with respect to question structure and sequence. Recommendations are made to overcome these problems before the questionnaires are applied in future research projects. In conclusion, basic steps that should be disclosed in the literature are provided as a checklist to ensure that reliable, replicable and valid results are produced from questionnaire based hazard knowledge and risk perception research.

  3. RADON MITIGATION IN SCHOOLS: CASE STUDIES OF RADON MITIGATION SYSTEMS INSTALLED BY EPA IN FOUR MARYLAND SCHOOLS ARE PRESENTED

    EPA Science Inventory

    The first part of this two-part paper discusses radon entry into schools, radon mitigation approaches for schools, and school characteristics (e.g., heating, ventilation, and air-conditioning -- HVAC-- system design and operation) that influence radon entry and mitigation system ...

  4. Monitoring volcanic activity with satellite remote sensing to reduce aviation hazard and mitigate the risk: application to the North Pacific

    NASA Astrophysics Data System (ADS)

    Webley, P. W.; Dehn, J.

    2012-12-01

    Volcanic activity across the North Pacific (NOPAC) occurs on a daily basis, and as such monitoring needs to occur 24 hours a day, 365 days a year. The risk to the local population and aviation traffic is too high for this not to happen. Given the size and remoteness of the NOPAC region, satellite remote sensing has become an invaluable tool to monitor the ground activity of the region's volcanoes as well as to observe, detect and analyze the volcanic ash clouds that traverse the Pacific. Here, we describe the satellite data collection, data analysis, real-time alert/alarm systems, observational database and nearly 20-year archive of both automated and manual observations of volcanic activity. We provide examples of where satellite remote sensing has detected precursory activity at volcanoes prior to eruption, as well as different types of eruptive behavior that can be inferred from the time series data. Additionally, we illustrate how the remote sensing data can be used to detect volcanic ash in the atmosphere, with some of the pros and cons of the method as applied to the NOPAC, and how the data can be used with other volcano monitoring techniques, such as seismic monitoring and infrasound, to provide a more complete understanding of a volcano's behavior. We focus on several large volcanic events across the region since our archive started in 1993, and show how the system can detect both these large events and the smaller but more frequent ones. The goal is to reduce risk and improve scenario planning and situational awareness while providing the best and most reliable hazard assessment of any volcanic activity.

  5. Study on the health hazards of scrap metal cutters.

    PubMed

    Ho, S F; Wong, P H; Kwok, S F

    1989-12-01

    Scrap metal cutters seemed to be left out of most preventive programmes because the workers were mainly contract workers. The health hazards of scrap metal cutting in 54 workers from a foundry and a ship breaking plant were evaluated. Environmental sampling showed lead levels ranging from 0.02 to 0.57 mg/m3 (the threshold limit value is 0.15 mg/m3). Exposure to lead came mainly from the paint coat of the metals cut. Metal fume fever was not reported, although the main complaints were cough and rhinitis. Skin burns at all stages of healing and residual scars were seen over hands, forearms and thighs. 96% of the cutters had blood lead levels exceeding 40 micrograms/100 ml, with 10 workers exceeding 70 micrograms/100 ml. None had clinical evidence of lead poisoning. The study showed that scrap metal cutting is a hazardous industry associated with significant lead exposure. With proper medical supervision, the blood lead levels of this group of workers decreased, illustrating the importance of identifying the hazard and implementing appropriate medical surveillance programmes. PMID:2635395

  6. Satellite Monitoring of Ash and Sulphur Dioxide for the mitigation of Aviation Hazards: Part II. Validation of satellite-derived Volcanic Sulphur Dioxide Levels.

    NASA Astrophysics Data System (ADS)

    Koukouli, MariLiza; Balis, Dimitris; Dimopoulos, Spiros; Clarisse, Lieven; Carboni, Elisa; Hedelt, Pascal; Spinetti, Claudia; Theys, Nicolas; Tampellini, Lucia; Zehner, Claus

    2014-05-01

    The eruption of the Icelandic volcano Eyjafjallajökull in the spring of 2010 turned the attention of both the public and the scientific community to the susceptibility of the European airspace to the outflows of large volcanic eruptions. The ash-rich plume from Eyjafjallajökull drifted towards Europe and caused major disruptions of European air traffic for several weeks, affecting the everyday life of millions of people and having a strong economic impact. This unparalleled situation revealed limitations in the decision making process due to the lack of information on the tolerance of commercial aircraft engines to ash, as well as limitations in the ash monitoring and prediction capabilities. The European Space Agency project Satellite Monitoring of Ash and Sulphur Dioxide for the mitigation of Aviation Hazards was introduced to facilitate the development of an optimal End-to-End System for Volcanic Ash Plume Monitoring and Prediction. This system is based on comprehensive satellite-derived ash plume and sulphur dioxide [SO2] level estimates, as well as a widespread validation using supplementary satellite, aircraft and ground-based measurements. The validation of volcanic SO2 levels extracted from the sensors GOME-2/MetopA and IASI/MetopA is shown here, with emphasis on the total column observed right before, during and after the Eyjafjallajökull 2010 eruptions. Co-located ground-based Brewer Spectrophotometer data extracted from the World Ozone and Ultraviolet Radiation Data Centre, WOUDC, were compared to the different satellite estimates. The findings are presented at length, alongside a comprehensive discussion of future scenarios.

  7. Active Amplification of the Terrestrial Albedo to Mitigate Climate Change: An Exploratory Study

    E-print Network

    Hamwey, R M

    2005-01-01

    This study explores the potential to enhance the reflectance of solar insolation by the human settlement and grassland components of the Earth's terrestrial surface as a climate change mitigation measure. Preliminary estimates derived using a static radiative transfer model indicate that such efforts could amplify the planetary albedo enough to offset the current global annual average level of radiative forcing caused by anthropogenic greenhouse gases by as much as 30 percent or 0.76 W/m2. Terrestrial albedo amplification may thus extend, by about 25 years, the time available to advance the development and use of low-emission energy conversion technologies which ultimately remain essential to mitigate long-term climate change. However, additional study is needed to confirm the estimates reported here and to assess the economic and environmental impacts of active land-surface albedo amplification as a climate change mitigation measure.
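    The two headline numbers are mutually consistent; the one-line calculation below recovers the baseline anthropogenic greenhouse-gas forcing they imply (about 2.5 W/m2), a value inferred here rather than quoted in the paper.

```python
# Arithmetic behind the "30 percent or 0.76 W/m^2" statement above.  The baseline
# anthropogenic greenhouse-gas forcing is inferred from those two numbers.
offset_w_m2 = 0.76          # albedo-driven forcing offset quoted in the abstract
offset_fraction = 0.30      # "as much as 30 percent"

implied_ghg_forcing = offset_w_m2 / offset_fraction
print(f"Implied baseline GHG forcing: {implied_ghg_forcing:.2f} W/m^2")  # ~2.5 W/m^2
```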

  8. Channelized debris flow hazard mitigation through the use of flexible barriers: a simplified computational approach for a sensitivity analysis.

    NASA Astrophysics Data System (ADS)

    Segalini, Andrea; Ferrero, Anna Maria; Brighenti, Roberto

    2013-04-01

    A channelized debris flow is usually represented by a mixture of solid particles of various sizes and water, flowing along a laterally confined, inclined, channel-shaped region up to an unconfined area where it slows down and spreads out into a flat-shaped mass. The study of these phenomena is very difficult due to their short duration and unpredictability, the lack of historical data for a given basin and the complexity of the mechanical phenomena involved. Post-event surveys allow the identification of some depositional features and provide an indication of the maximum flow height; however, they lack information about the development of the phenomenon over time. For this purpose the monitoring of recursive events has been carried out by several authors. Most of the studies aimed at determining the characteristic features of a debris flow were carried out in artificial channels, where the main variables involved were measured and others were controlled during the tests; however, some uncertainties remained, and other scaled models were developed to simulate the deposition mechanics as well as to analyze the transport mechanics and the energy dissipation. The assessment of the mechanical behavior of the protection structures upon impact with the flow, as well as of the energy associated with it, is necessary for the proper design of such structures, which, in densely populated areas, can avoid victims and limit the destructive effects of such a phenomenon. In this work a simplified structural model, developed by the authors for the safety assessment of retention barriers against channelized debris flows, is presented, and some parametric cases are interpreted through the proposed approach; this model is intended as a simplified and efficient tool for the verification of the supporting cables and foundations of a flexible debris flow barrier. The present analytical and numerical approach has a different aim from a FEM model. Computational experience with FEM modeling of this kind of structure has shown that a large amount of time is needed for both the geometrical setup of the model and its computation. The big effort required by FEM for this class of problems limits the practical possibility of investigating different geometrical configurations, load schemes, etc.; FEM is suitable for representing a specific configuration, but it does not allow the influence of parameter changes to be investigated. On the other hand, parametric analyses are common practice in geotechnical design for the reasons quoted. Consequently, the authors felt the need to develop a simplified method (which, to our knowledge, is not yet available) that allows several parametric analyses to be performed in a limited time. It should be noted that, in this paper, no consideration is given to the mechanical and physical behavior of debris flows; the proposed model requires the input of parameters that must be acquired through a preliminary characterization of the design event. However, adopting the proposed tool, the designer will be able to perform sensitivity analyses that will help quantify the influence of parameter variability, as commonly occurs in geotechnical design.

  9. 44 CFR 78.5 - Flood Mitigation Plan development.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ...Assistance 1 2011-10-01 2011-10-01 false Flood Mitigation Plan development. 78.5 Section...SECURITY INSURANCE AND HAZARD MITIGATION National Flood Insurance Program FLOOD MITIGATION ASSISTANCE § 78.5 Flood...

  10. 44 CFR 78.5 - Flood Mitigation Plan development.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ...Assistance 1 2010-10-01 2010-10-01 false Flood Mitigation Plan development. 78.5 Section...SECURITY INSURANCE AND HAZARD MITIGATION National Flood Insurance Program FLOOD MITIGATION ASSISTANCE § 78.5 Flood...

  11. 44 CFR 78.5 - Flood Mitigation Plan development.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ...Assistance 1 2012-10-01 2011-10-01 true Flood Mitigation Plan development. 78.5 Section...SECURITY INSURANCE AND HAZARD MITIGATION National Flood Insurance Program FLOOD MITIGATION ASSISTANCE § 78.5 Flood...

  12. 44 CFR 78.5 - Flood Mitigation Plan development.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ...Assistance 1 2014-10-01 2014-10-01 false Flood Mitigation Plan development. 78.5 Section...SECURITY INSURANCE AND HAZARD MITIGATION National Flood Insurance Program FLOOD MITIGATION ASSISTANCE § 78.5 Flood...

  13. 44 CFR 78.5 - Flood Mitigation Plan development.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ...Assistance 1 2013-10-01 2013-10-01 false Flood Mitigation Plan development. 78.5 Section...SECURITY INSURANCE AND HAZARD MITIGATION National Flood Insurance Program FLOOD MITIGATION ASSISTANCE § 78.5 Flood...

  14. Interventionist and participatory approaches to flood risk mitigation decisions: two case studies in the Italian Alps

    NASA Astrophysics Data System (ADS)

    Bianchizza, C.; Del Bianco, D.; Pellizzoni, L.; Scolobig, A.

    2012-04-01

    Flood risk mitigation decisions pose key challenges not only from a technical but also from a social, economic and political viewpoint. There is an increasing demand for improving the quality of these processes by including different stakeholders - especially by involving the local residents in the decision making process - and by guaranteeing the actual improvement of local social capacities during and after the decision making. In this paper we analyse two case studies of flood risk mitigation decisions, Malborghetto-Valbruna and Vipiteno-Sterzing, in the Italian Alps. In both of them, mitigation works have been completed or planned, yet following completely different approaches, especially in terms of residents' responses and the involvement of local authorities. In Malborghetto-Valbruna an 'interventionist' approach (i.e. leaning towards a top-down/technocratic decision process) was used to make decisions after the flood event that affected the municipality in 2003. In Vipiteno-Sterzing, a 'participatory' approach (i.e. leaning towards a bottom-up/inclusive decision process) was applied: decisions about risk mitigation measures were made by submitting different projects to the local citizens and by involving them in the decision making process. The analysis of the two case studies presented in the paper is grounded in the results of two research projects. Structured and in-depth interviews, as well as questionnaire surveys, were used to explore residents' and local authorities' orientations toward flood risk mitigation. A SWOT analysis (Strengths, Weaknesses, Opportunities and Threats) involving key stakeholders was also used to better understand the characteristics of the communities and their perception of flood risk mitigation issues. The results highlight some key differences between interventionist and participatory approaches, together with some implications of their adoption in the local context. Strengths and weaknesses of the two approaches, as well as key challenges for the future, are also discussed.

  15. Insights from EMF Associated Agricultural and Forestry Greenhouse Gas Mitigation Studies

    SciTech Connect

    McCarl, Bruce A.; Murray, Brian; Kim, Man-Keun; Lee, Heng-Chi; Sands, Ronald D.; Schneider, Uwe

    2007-11-19

    Integrated assessment modeling (IAM) as employed by the Energy Modeling Forum (EMF) generally involves a multi-sector appraisal of greenhouse gas emission (GHGE) mitigation alternatives and climate change effects, typically at the global level. Such a multi-sector evaluation encompasses potential climate change effects and mitigative actions within the agricultural and forestry (AF) sectors. In comparison with many of the other sectors covered by IAM, the AF sectors may require somewhat different treatment due to their critical dependence upon spatially and temporally varying resource and climatic conditions. In particular, in large countries like the United States, forest production conditions vary dramatically across the landscape. For example, some areas in the southern US present conditions favorable to production of fast growing, heat tolerant pine species, while more northern regions often favor slower-growing hardwood and softwood species. Moreover, some lands are currently not suitable for forest production (e.g., the arid western plains). Similarly, in agriculture, the US has areas where citrus and cotton can be grown and other areas where barley and wheat are more suitable. This diversity across the landscape causes differential GHGE mitigation potential in the face of climatic changes and/or responses to policy or price incentives. It is difficult for a reasonably sized global IAM system to reflect the full range of sub-national geographic AF production possibilities alluded to above. AF responses to climate-change-altered temperature and precipitation regimes or to mitigation incentives will likely involve region-specific shifts in land use and agricultural/forest production. This chapter addresses AF sectoral responses in climate change mitigation analysis. Specifically, we draw upon US-based studies of AF GHGE mitigation possibilities that incorporate sub-national detail, largely a body of studies done by the authors in association with EMF activities. We discuss characteristics of AF sectoral responses that could be incorporated in future IAM efforts in climate change policy.

  16. Mitigating Hazards in School Facilities

    ERIC Educational Resources Information Center

    National Clearinghouse for Educational Facilities, 2008

    2008-01-01

    School safety is a human concern, one that every school and community must take seriously and strive continually to achieve. It is also a legal concern; schools can be held liable if they do not make good-faith efforts to provide a safe and secure school environment. How schools are built and maintained is an integral part of school safety and…

  17. Versatile gas gun target assembly for studying blast wave mitigation in materials

    NASA Astrophysics Data System (ADS)

    Bartyczak, S.; Mock, W., Jr.

    2012-03-01

    Traumatic brain injury (TBI) has become a serious problem for military personnel returning from recent conflicts. This has increased interest in investigating blast mitigating materials for use in helmets. In this paper we describe a new versatile target assembly that is used with an existing gas gun for studying these materials.

  18. CASE STUDY OF RADON DIAGNOSTICS AND MITIGATION IN A NEW YORK STATE SCHOOL

    EPA Science Inventory

    The paper discusses a case study of radon diagnostics and mitigation performed by EPA in a New York State school building. Research focused on active subslab depressurization (ASD) in the basement and, to a lesser degree, the potential for radon reduction in the basement and slab-...

  19. Case Study: Flood Mitigation of the Muda River, Malaysia P. Y. Julien, M.ASCE1

    E-print Network

    Julien, Pierre Y.

    P. Y. Julien, A. Ab. Ghani, N. A. Zakaria, R. Abdullah, and C. K. Chang. Abstract: The 2003 flood of the Muda River reached 1,340 m3/s at Ladang Victoria and adversely impacted 45,000 people in Malaysia. A flood control

  20. Occupational Health Hazards among Healthcare Workers in Kampala, Uganda

    PubMed Central

    Yu, Xiaozhong; Buregyeya, Esther; Musoke, David; Wang, Jia-Sheng; Halage, Abdullah Ali; Whalen, Christopher; Bazeyo, William; Williams, Phillip; Ssempebwa, John

    2015-01-01

    Objective. To assess the occupational health hazards faced by healthcare workers and the mitigation measures. Methods. We conducted a cross-sectional study utilizing quantitative data collection methods among 200 respondents who worked in 8 major health facilities in Kampala. Results. Overall, 50.0% of respondents reported experiencing an occupational health hazard. Among these, 39.5% experienced biological hazards while 31.5% experienced nonbiological hazards. Predictors of experiencing hazards included not wearing the necessary personal protective equipment (PPE), working overtime, job-related pressures, and working in multiple health facilities. Control measures to mitigate hazards included providing separate areas and containers to store medical waste and the provision of safety tools and equipment. Conclusion. Healthcare workers in this setting experience several hazards in their workplaces. Associated factors include not wearing all necessary protective equipment, working overtime, experiencing work-related pressures, and working in multiple facilities. Interventions should be instituted to mitigate the hazards. Specifically, PPE supply gaps, job-related pressures, and complacency in adhering to mitigation measures should be addressed.

  1. 24 CFR 51.205 - Mitigating measures.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ...Projects Near Hazardous Operations Handling Conventional Fuels or Chemicals of an Explosive or Flammable Nature § 51.205 Mitigating...existing permanent fire resistant structure of adequate size and strength will shield the proposed project from the hazard....

  2. 24 CFR 51.205 - Mitigating measures.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ...Projects Near Hazardous Operations Handling Conventional Fuels or Chemicals of an Explosive or Flammable Nature § 51.205 Mitigating...existing permanent fire resistant structure of adequate size and strength will shield the proposed project from the hazard....

  3. Mitigating Resistance to Teaching Science through Inquiry: Studying Self

    ERIC Educational Resources Information Center

    Spector, Barbara; Burkett, Ruth S.; Leard, Cyndy

    2007-01-01

    This is the report of a qualitative emergent-design study of 2 different Web-enhanced science methods courses for preservice elementary teachers in which an experiential learning strategy, labeled "using yourself as a learning laboratory," was implemented. Emergent grounded theory indicated this strategy, when embedded in a course organized as an…

  4. Methodological Issues In Forestry Mitigation Projects: A CaseStudy Of Kolar District

    SciTech Connect

    Ravindranath, N.H.; Murthy, I.K.; Sudha, P.; Ramprasad, V.; Nagendra, M.D.V.; Sahana, C.A.; Srivathsa, K.G.; Khan, H.

    2007-06-01

    There is a need to assess climate change mitigation opportunities in the forest sector in India in the context of methodological issues such as additionality, permanence, leakage, measurement and baseline development in formulating forestry mitigation projects. A case study of a forestry mitigation project in semi-arid community grazing lands and farmlands in Kolar district of Karnataka was undertaken with regard to baseline and project scenario development, estimation of carbon stock change in the project, leakage estimation and assessment of the cost-effectiveness of mitigation projects. Further, the transaction costs to develop the project, and the environmental and socio-economic impact of the mitigation project, were assessed. The study shows the feasibility of establishing baselines and project C-stock changes. Since the area has low or insignificant biomass, leakage is not an issue. The overall mitigation potential in Kolar for a total area of 14,000 ha under various mitigation options is 278,380 tC at a rate of 20 tC/ha for the period 2005-2035, which is approximately 0.67 tC/ha/yr inclusive of harvest regimes under short rotation and long rotation mitigation options. The transaction cost for baseline establishment is less than a rupee/tC and for project scenario development is about Rs. 1.5-3.75/tC. The project enhances biodiversity and the socio-economic impact is also significant.
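    The quoted mitigation figures can be cross-checked with simple arithmetic, as in the sketch below (area, total potential and project period taken from the abstract).

```python
# Consistency check of the mitigation-potential figures quoted above.
total_tC = 278_380            # total mitigation potential, tonnes of carbon
area_ha  = 14_000             # project area, hectares
years    = 2035 - 2005        # 30-year project period

per_ha        = total_tC / area_ha     # ~19.9 tC/ha, matching the quoted ~20 tC/ha
per_ha_per_yr = per_ha / years         # ~0.66 tC/ha/yr, matching the quoted ~0.67
print(f"{per_ha:.1f} tC/ha over {years} years = {per_ha_per_yr:.2f} tC/ha/yr")
```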

  5. NOAA Technical Memorandum ERL PMEL-37 FEASIBILITY STUDY ON MITIGATING TSUNAMI HAZARDS IN THE PACIFIC

    E-print Network

  6. Proposed seismic hazard maps of Sumatra and Java islands and microzonation study of Jakarta city, Indonesia

    Microsoft Academic Search

    Masyhur Irsyam; Donny T. Dangkua; Hendriyawan; Drajat Hoedajanto; Bigman M. Hutapea; Engkon K. Kertapati; Teddy Boen; Mark D. Petersen

    2008-01-01

    This paper presents the development of spectral hazard maps for Sumatra and Java islands, Indonesia and microzonation study for Jakarta city. The purpose of this study is to propose a revision of the seismic hazard map in Indonesian Seismic Code SNI 03-1726-2002. Some improvements in seismic hazard analysis were implemented in the analysis by considering the recent seismic activities around

  7. Feasibility study of tank leakage mitigation using subsurface barriers

    SciTech Connect

    Treat, R.L.; Peters, B.B.; Cameron, R.J.; McCormak, W.D.; Trenkler, T.; Walters, M.F. [Ensearch Environmental, Inc. (United States); Rouse, J.K.; McLaughlin, T.J. [Bovay Northwest, Inc., Richland, WA (United States); Cruse, J.M. [Westinghouse Hanford Co., Richland, WA (United States)

    1994-09-21

    The US Department of Energy (DOE) has established the Tank Waste Remediation System (TWRS) to safely manage and dispose of the waste currently stored in the underground storage tanks. The retrieval element of TWRS includes a work scope to develop subsurface impermeable barriers beneath SSTs. The barriers could serve as a means to contain leakage that may result from waste retrieval operations and could also support site closure activities by facilitating cleanup. Three types of subsurface barrier systems have emerged for further consideration: (1) chemical grout, (2) freeze walls, and (3) desiccant, represented in this feasibility study as a circulating air barrier. This report contains analyses of the costs and relative risks associated with combinations of retrieval technologies and barrier technologies that form 14 alternatives. Eight of the alternatives include the use of subsurface barriers; the remaining six nonbarrier alternatives are included in order to compare the costs, relative risks and other values of retrieval with those of retrieval using subsurface barriers. Each alternative includes various combinations of technologies that can affect, to varying degrees, the risks associated with future contamination of the groundwater beneath the Hanford Site. Other potential risks associated with these alternatives, such as those related to accidents and airborne contamination resulting from retrieval and barrier emplacement operations, are not quantitatively evaluated in this report.

  8. HOUSEHOLD HAZARDOUS WASTE CHARACTERIZATION STUDY FOR PALM BEACH COUNTY, FLORIDA - A MITE PROGRAM EVALUATION

    EPA Science Inventory

    The objectives of the Household Hazardous Waste Characterization Study (the HHW Study) were to: 1) Quantify the annual household hazardous waste (HHW) tonnages disposed in Palm Beach County, Florida's (the County) residential solid waste (characterized in this study as municipal s...

  9. The 5 key questions coping with risks due to natural hazards, answered by a case study

    NASA Astrophysics Data System (ADS)

    Hardegger, P.; Sausgruber, J. T.; Schiegg, H. O.

    2009-04-01

    Based on Maslow's hierarchy of needs, human endeavours concern primarily existential needs and, consequently, the need to be safeguarded against both natural and man-made threats. Subsequent needs are to realize opportunities in a variety of fields, such as economics and many others. Independently of the field, the 5 crucial questions are the same as those for coping with risks due to natural hazards specifically. These 5 key questions are: I) What is the impact as a function of space and time? II) What protection measures comply with the general opinion, and how much do they mitigate the threat? III) How can the loss be adequately quantified and monetized? IV) What budget for prevention and what reserves for restoration and compensation are to be planned? V) Which mix of measures and allocation of resources is sustainable, and thus optimal? The 5 answers, exemplified by a case study concerning the sustainable management of the risk due to debris flows of the Enterbach / Inzing / Tirol / Austria, are as follows: I) The impact, created by both the propagation of flooding and sedimentation, has been forecast by modeling (numerical simulation) the 30, 50, 100, 150, 300 and 1000 year debris flows. The input was specified by detailed studies in meteorology, precipitation and runoff; in geology, hydrogeology, geomorphology and slope stability; in hydraulics, sediment transport and debris flow; and in forestry, agriculture and the development of communal settlement and infrastructure. All investigations were performed according to the method of ETAlp (Erosion and Transport in Alpine systems). ETAlp has been developed in order to achieve sustainable development in alpine areas and has been evaluated by the research project "nab" within the context of the EU-Interreg IIIb projects. II) The risk mitigation measures of concern are in hydraulics on the one hand and in forestry on the other. Such risk management is evaluated according to sustainability, which means economic, ecological and social, in short, "triple" compatibility. 100% protection against the 100 year event proves to be the optimal degree of protection. Consequently, impacts statistically less frequent than once in 100 years are accepted as the remaining risk. Such floods and debris flows cause a fan of propagation which is substantially reduced by the protection measures against the 100 year event. III) The "triple loss distribution" shows the monetized triple damage as a function of its probability. The monetization is performed through the social process of participation of the impacted interests or, failing that, by official experts acting as their representatives. The triple loss distribution rises over time mainly due to the rise in the density and value of the goods at risk. A comparison of the distributions of the triple loss and the triple risk, which behave in opposite directions, is shown and explained within the project. IV) The recommended yearly reserves to be stocked for restoration and compensation of losses caused by debris flows amount to € 70'000.- according to the approach of the "technical risk premium". The discrepancy in comparison with the much higher amounts obtained by the common approaches of natural hazards engineering is discussed. V) The sustainable mix of hydraulic and forestry measures with the highest return on investment at the lowest risk is determined according to portfolio theory (Markowitz), based on the triple value curves generated by the method of TripelBudgetierung®.
    Accordingly, the optimum mix of measures to protect the community of Inzing against the natural hazard of debris flow, and thus the most efficient allocation of resources, is 2/3 for hydraulic and 1/3 for forestry measures. In detail, the results of the research pilot project "Nachhaltiges Risikomanagement - Enterbach / Inzing / Tirol / Austria" may be consulted at www.ibu.hsr.ch/inzing.
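    As a rough illustration of the "technical risk premium" idea in answer IV, the sketch below sums probability-weighted losses over a few discrete design events. The return periods and monetized losses are hypothetical placeholders, not figures from the Enterbach study, and the simple sum is only a crude approximation of the loss-frequency integral.

```python
# Expected-annual-loss ("technical risk premium") style calculation over
# discrete design events.  All numbers are hypothetical placeholders.
scenarios = [                 # (return period in years, monetized loss in EUR)
    (30,  0.5e6),
    (100, 2.0e6),
    (300, 6.0e6),
]

# Approximate the annual exceedance probability of each event as 1 / return period
# and accumulate probability-weighted losses (a crude scenario-bin approximation).
expected_annual_loss = sum(loss / return_period for return_period, loss in scenarios)
print(f"Expected annual loss: EUR {expected_annual_loss:,.0f}")
```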

  10. Feasibility Study of Radiometry for Airborne Detection of Aviation Hazards

    NASA Technical Reports Server (NTRS)

    Gimmestad, Gary G.; Papanicolopoulos, Chris D.; Richards, Mark A.; Sherman, Donald L.; West, Leanne L.; Johnson, James W. (Technical Monitor)

    2001-01-01

    Radiometric sensors for aviation hazards have the potential for widespread and inexpensive deployment on aircraft. This report contains discussions of three aviation hazards - icing, turbulence, and volcanic ash - as well as candidate radiometric detection techniques for each hazard. Dual-polarization microwave radiometry is the only viable radiometric technique for detection of icing conditions, but more research will be required to assess its usefulness to the aviation community. Passive infrared techniques are being developed for detection of turbulence and volcanic ash by researchers in this country and also in Australia. Further investigation of the infrared airborne radiometric hazard detection approaches will also be required in order to develop reliable detection/discrimination techniques. This report includes a description of a commercial hyperspectral imager for investigating the infrared detection techniques for turbulence and volcanic ash.

  11. 44 CFR 201.7 - Tribal Mitigation Plans.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ...OF HOMELAND SECURITY DISASTER ASSISTANCE MITIGATION...commitment to reduce risks from natural hazards...identified hazards. The risk assessment shall include...losses identified in the risk assessment, based on...government's pre- and post-disaster hazard management...

  12. 44 CFR 201.7 - Tribal Mitigation Plans.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ...OF HOMELAND SECURITY DISASTER ASSISTANCE MITIGATION...commitment to reduce risks from natural hazards...identified hazards. The risk assessment shall include...losses identified in the risk assessment, based on...government's pre- and post-disaster hazard management...

  13. 44 CFR 201.7 - Tribal Mitigation Plans.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ...OF HOMELAND SECURITY DISASTER ASSISTANCE MITIGATION...commitment to reduce risks from natural hazards...identified hazards. The risk assessment shall include...losses identified in the risk assessment, based on...government's pre- and post-disaster hazard management...

  14. 44 CFR 201.7 - Tribal Mitigation Plans.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ...OF HOMELAND SECURITY DISASTER ASSISTANCE MITIGATION...commitment to reduce risks from natural hazards...identified hazards. The risk assessment shall include...losses identified in the risk assessment, based on...government's pre- and post-disaster hazard management...

  15. Geospatial Approach on Landslide Hazard Zonation Mapping Using Multicriteria Decision Analysis: A Study on Coonoor and Ooty, Part of Kallar Watershed, The Nilgiris, Tamil Nadu

    NASA Astrophysics Data System (ADS)

    Rahamana, S. Abdul; Aruchamy, S.; Jegankumar, R.

    2014-12-01

    Landslides are one of the critical natural phenomena that frequently lead to serious problems in hilly areas, resulting in loss of human life and property as well as severe damage to natural resources. The local geology, with steep slopes coupled with high-intensity rainfall and unplanned human activities, causes many landslides in this region. The study area attracts tourists throughout the year, so preventive measures must be considered. Geospatial multicriteria decision analysis (MCDA) is increasingly used for landslide vulnerability and hazard zonation mapping; it enables the integration of different data layers with different levels of uncertainty. In the present study, the analytic hierarchy process (AHP) method was used to prepare landslide hazard zones of Coonoor and Ooty, part of the Kallar watershed, The Nilgiris, Tamil Nadu. The study was carried out using remote sensing data, field surveys and geographic information system (GIS) tools. Ten factors that influence landslide occurrence were considered: elevation, slope aspect, slope angle, drainage density, lineament density, soil, precipitation, land use/land cover (LULC), distance from road and NDVI. These factor layers were extracted from the various related spatial data sets. The factors were evaluated, and individual factor weights and class weights were assigned to each of the related factors. The Landslide Hazard Zone Index (LHZI) was calculated using the MCDA technique, based on the weights and ratings given by the AHP method. The final cumulative map of the study area was categorized into four hazard zones, classified as zone I to IV: 3.56% of the area falls in hazard zone IV, followed by 48.19% in zone III, 43.63% in zone II and 4.61% in zone I. The resulting hazard zone map and the land use/land cover map were then overlaid to check the hazard status, and the existing inventory of known landslides within the study area was compared with the resulting vulnerability and hazard zone maps. The landslide hazard zonation map is useful for landslide hazard prevention, mitigation and improvement to society, and for proper planning of land use and construction in the future.
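
    A minimal illustration of the AHP-weighted overlay described above: the pairwise comparison matrix, factor weights and class ratings below are invented placeholders (three factors only, not the ten used in the study), intended only to show how an LHZI raster would be computed and sliced into zones.

    # Hedged sketch of an AHP-weighted overlay; all numbers are illustrative placeholders.
    import numpy as np

    # Hypothetical pairwise comparison matrix for three factors (slope, rainfall, land cover).
    A = np.array([[1.0,  3.0,  5.0],
                  [1/3., 1.0,  3.0],
                  [1/5., 1/3., 1.0]])

    # AHP weights: principal eigenvector of A, normalized to sum to 1.
    eigvals, eigvecs = np.linalg.eig(A)
    w = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
    w = w / w.sum()

    # Consistency ratio check (random index RI = 0.58 for a 3x3 matrix).
    lam_max = np.max(np.real(eigvals))
    CI = (lam_max - len(A)) / (len(A) - 1)
    assert CI / 0.58 < 0.1, "pairwise judgements are inconsistent"

    # Class ratings per factor, reclassified onto a common 1-4 scale (toy 2x2 rasters).
    slope_rating    = np.array([[4, 3], [2, 1]])
    rainfall_rating = np.array([[3, 3], [2, 4]])
    lulc_rating     = np.array([[2, 1], [4, 3]])

    # Landslide Hazard Zone Index = weighted linear combination of the rated layers.
    LHZI = w[0]*slope_rating + w[1]*rainfall_rating + w[2]*lulc_rating

    # Slice the index into four hazard zones (I = lowest ... IV = highest).
    zones = np.digitize(LHZI, bins=np.quantile(LHZI, [0.25, 0.5, 0.75])) + 1
    print(w.round(3), "\n", LHZI.round(2), "\n", zones)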

  16. Land use /Land Cover Approaches as Instruments of Natural Hazard Mitigation in the Manjira River Sub-Basin, Andhra Pradesh, India.

    NASA Astrophysics Data System (ADS)

    THATIPARTI, V. L.

    2001-05-01

    Rapid industrialization during the last three decades had a profound adverse effect on the land use / land cover practices in, and the water quality of, the Manjira River sub-basin, Medak district, Andhra Pradesh, India. As water interacts with all other components of the environment, such as geology, soils, weather and climate, flora and fauna, the pollution of water has affected both the biophysical and the socioeconomic and cultural environments. The area of study is the catchment of Nakkavagu (stream) in the Manjira river system, which lies between long. 78 05' - 78 25' E. and lat. 17 25' - 17 45' N., and covers an area of 734 sq. km. Remote sensing and GIS techniques have been employed to identify and quantify measures for mitigating the adverse impacts of the industrialization and for being prepared for extreme weather events. The methodology employed in the present study involves the generation of various thematic layers such as slope, hydrogeomorphology and land use / land cover maps using Landsat MSS, IRS 1A LISS II, IRS 1D LISS III and PAN merged data on the EASI/PACE ver. 6.3 platform. By overlaying all the above thematic maps, action plan maps are generated to devise various ways and means of rolling back the degradation of the environment, and to develop low-cost, people-participatory strategies (such as agricultural practices, use of water bodies and land under urbanization, and structural and non-structural, particularly vegetation, methods) for reducing the vulnerability of the population to extreme weather events.

  17. The fujairah united arab emirates (uae) (ml = 5.1) earthquake of march 11, 2002 a reminder for the immediate need to develop and implement a national hazard mitigation strategy

    NASA Astrophysics Data System (ADS)

    Al-Homoud, A.

    2003-04-01

    On March 11, 2002, at midnight, the Fujairah Masafi region in the UAE was shaken by an earthquake of shallow depth and local magnitude m = 5.1 on the Richter scale. The earthquake occurred on the Dibba fault in the UAE, with its epicenter 20 km NW of Fujairah city and a focal depth of just 10 km. The earthquake was felt in most parts of the northern emirates: Dubai, Sharjah, Ajman, Ras Al-Khaima, and Um-Qwain. The "main shock" was followed in the subsequent weeks by more than twenty-five earthquakes with local magnitudes ranging from m = 4 to m = 4.8. Those earthquakes were located along the Zagros reverse faulting system on the Iranian side of the Arabian Gulf, opposite the shores of the UAE. Most of them were shallow as well and were actually felt by the population. Another strong earthquake occurred in early April 2002 in the same Masafi region, with local magnitude m = 5.1 but a focal depth of 30 km, and was therefore not felt by the residents of the northern emirates. No major structural damage to buildings or lifeline systems was reported in the several cities located in the vicinity of the earthquake epicenter. The very small ground accelerations were not sufficient to test the structural integrity of tall buildings and major infrastructure. Future major earthquakes anticipated in close vicinity of the northern emirates, once they occur, and considering the noticeable local site effects of the emirates' sandy soils with high water tables, will put these newly constructed buildings to a real test. Even though there were no casualties in the March 11th event, there was considerable fear as a result of the loud sound of rock rupture heard in the mountains close to Masafi, the noticeable disturbance of animals and birds minutes before and during the incident, cracks in a good number of Masafi buildings, and major damage to "old" buildings of the Fujairah Masafi area, the closest town to the epicenter. Indeed, the March 11, 2002 event and its "aftershocks" scared the citizens of Masafi and the surrounding region and drew the attention of the public and government to the subject of earthquake hazard, especially since this earthquake came one year after the nearby destructive Indian earthquake (m = 6.5). The recent destructive m = 6.2 earthquake of June 22 that hit northwestern Iran has again reminded the UAE public and government of the need to take quick and concrete measures to mitigate any anticipated earthquake hazard. This study reflects in some detail on the following aspects related to the region and its vicinity: geological and tectonic setting, seismicity, the earthquake activity database and seismic hazard assessment. Moreover, it documents the following aspects of the March 11, 2002 earthquake: tectonics, seismology, instrumental seismic data, aftershocks, strong motion recordings, response spectra and local site effect analysis, and geotechnical effects and structural observations in the region affected by the earthquake. The study identifies local ground amplification effects and liquefaction hazard potential in some parts of the UAE. Moreover, the study reflects on the coverage of the incident in the media, the public and government response, the state of earthquake engineering practice in the UAE construction industry, and national preparedness and public awareness issues.
It is concluded that the mild damage that occurred in the Masafi region was due to poor quality of construction and underestimation of the design base shear. Practical recommendations are suggested for the authorities to avoid damage in newly constructed buildings and lifelines from future stronger earthquakes, in addition to recommendations on a national strategy for earthquake hazard mitigation in the UAE, which is still missing. The recommendations include the development and implementation of a design code for earthquake loading in the UAE,

  18. Volcanic hazard studies for the Yucca Mountain project

    SciTech Connect

    Crowe, B.; Turrin, B.; Wells, S.; Perry, F.; McFadden, L.; Renault, C.E.; Champion, D.; Harrington, C.

    1989-05-01

    Volcanic hazard studies are ongoing to evaluate the risk of future volcanism with respect to siting of a repository for disposal of high-level radioactive waste at the Yucca Mountain site. Seven Quaternary basaltic volcanic centers are located a minimum distance of 12 km and a maximum distance of 47 km from the outer boundary of the exploration block. The conditional probability of disruption of a repository by future basaltic volcanism is bounded by the range of 10^-8 to 10^-10 yr^-1. These values are currently being reexamined based on new developments in the understanding of the evaluation of small-volume basaltic volcanic centers, including: (1) many, perhaps most, of the volcanic centers exhibit brief periods of eruptive activity separated by longer periods of inactivity; (2) the centers may be active for time spans exceeding 10^5 yrs; (3) there is a decline in the volume of eruptions of the centers through time; and (4) small-volume eruptions occurred at two of the Quaternary centers during latest Pleistocene or Holocene time. We classify the basalt centers as polycyclic, and distinguish them from polygenetic volcanoes. Polycyclic volcanism is characterized by small-volume, episodic eruptions of magma of uniform composition over time spans of 10^3 to 10^5 yrs. Magma eruption rates are low and the time between eruptions exceeds the cooling time of the magma volumes. 25 refs., 2 figs.
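
    To put the quoted annual disruption probabilities in perspective, the short sketch below converts a bounding annual rate into the probability of at least one disruptive event over an isolation period, using a simple Poisson assumption; the 10,000-year window is an illustrative choice, not a value taken from the study.

    # Hedged sketch: Poisson bound on disruption probability; the time window is illustrative.
    import math

    for annual_rate in (1e-8, 1e-10):        # bounding annual disruption rates from the abstract
        t = 1.0e4                            # hypothetical isolation period in years
        p_at_least_one = 1.0 - math.exp(-annual_rate * t)
        print(f"rate {annual_rate:.0e}/yr over {t:.0e} yr -> P(>=1 event) ~ {p_at_least_one:.1e}")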

  19. Hazardous drinking-related characteristics of depressive disorders in Korea: the CRESCEND study.

    PubMed

    Park, Seon-Cheol; Lee, Sang Kyu; Oh, Hong Seok; Jun, Tae-Youn; Lee, Min-Soo; Kim, Jae-Min; Kim, Jung-Bum; Yim, Hyeon-Woo; Park, Yong Chon

    2015-01-01

    This study aimed to identify clinical correlates of hazardous drinking in a large cohort of Korean patients with depression. We recruited a total of 402 depressed patients aged > 18 yr from the Clinical Research Center for Depression (CRESCEND) study in Korea. Patients' drinking habits were assessed using the Korean Alcohol Use Disorder Identification Test (AUDIT-K). Psychometric scales, including the HAMD, HAMA, BPRS, CGI-S, SSI-Beck, SOFAS, and WHOQOL-BREF, were used to assess depression, anxiety, overall psychiatric symptoms, global severity, suicidal ideation, social functioning, and quality of life, respectively. We compared demographic and clinical features and psychometric scores between patients with and without hazardous drinking behavior after adjusting for the effects of age and sex. We then performed binary logistic regression analysis to identify independent correlates of hazardous drinking in the study population. Our results revealed that hazardous drinking was associated with current smoking status, history of attempted suicide, greater psychomotor retardation, suicidal ideation, weight loss, and lower hypochondriasis compared with non-hazardous drinking. The regression model also demonstrated that more frequent smoking, higher levels of suicidal ideation, and lower levels of hypochondriasis were independent correlates of hazardous drinking in depressed patients. In conclusion, depressed patients who are hazardous drinkers experience more severe symptoms and a greater burden of illness than non-hazardous drinkers. In Korea, screening depressed patients for signs of hazardous drinking could help identify subjects who may benefit from comprehensive therapeutic approaches. PMID:25552886
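
    The analysis above rests on a binary logistic regression; the sketch below shows the general form of such a model on synthetic data. The variable names echo the abstract, but the numbers are random placeholders, not CRESCEND data, and the model is a generic illustration rather than the study's actual specification.

    # Hedged sketch: binary logistic regression of the kind described above, on synthetic data.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    n = 402                                              # cohort size mentioned in the abstract
    X = np.column_stack([
        rng.integers(0, 2, n),                           # current smoking (0/1), synthetic
        rng.normal(0, 1, n),                             # suicidal ideation score (standardized), synthetic
        rng.normal(0, 1, n),                             # hypochondriasis score (standardized), synthetic
    ])
    # Synthetic outcome loosely following the reported direction of effects.
    logit = -1.0 + 0.8*X[:, 0] + 0.6*X[:, 1] - 0.5*X[:, 2]
    y = rng.binomial(1, 1/(1 + np.exp(-logit)))          # 1 = hazardous drinker (AUDIT-K above cutoff)

    model = LogisticRegression().fit(X, y)
    odds_ratios = np.exp(model.coef_[0])                 # adjusted odds ratios per predictor
    print(dict(zip(["smoking", "suicidal_ideation", "hypochondriasis"], odds_ratios.round(2))))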

  20. The critical need for moderate to high resolution thermal infrared data for volcanic hazard mitigation and process monitoring from the micron to the kilometer scale

    NASA Astrophysics Data System (ADS)

    Ramsey, M. S.

    2006-12-01

    The use of satellite thermal infrared (TIR) data to rapidly detect and monitor transient thermal events such as volcanic eruptions commonly relies on datasets with coarse spatial resolution (1.0 - 8.0 km) and high temporal resolution (minutes to hours). However, the growing need to extract physical parameters at meter to sub- meter scales requires data with improved spectral and spatial resolution. Current orbital systems such as the Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) and the Landsat Enhanced Thematic Mapper plus (ETM+) can provide TIR data ideal for this type of scientific analysis, assessment of hazard risks, and to perform smaller scale monitoring; but at the expense of rapid repeat observations. A potential solution to this apparent conflict is to combine the spatial and temporal scales of TIR data in order to provide the benefits of rapid detection together with the potential of detailed science return. Such a fusion is now in place using ASTER data collected in the north Pacific region to monitor the Aleutian and Kamchatka arcs. However, this approach of cross-instrument/cross-satellite monitoring is in jeopardy with the lack of planned moderate resolution TIR instruments following ETM+ and ASTER. This data collection program is also being expanded globally, and was used in 2006 to assist in the response and monitoring of the volcanic crisis at Merapi Volcano in Indonesia. Merapi Volcano is one of the most active volcanoes in the country and lies in central Java north of the densely-populated city of Yogyakarta. Pyroclastic flows and lahars are common following the growth and collapse of the summit lava dome. These flows can be fatal and were the major hazard concern during a period of renewed activity beginning in April 2006. Lava at the surface was confirmed on 25 April and ASTER was tasked with an urgent request observation, subsequently collecting data on 26 April (daytime) and 28 April (nighttime). The TIR revealed thermally-elevated pixels (max = 25.9 C) clustered near the summit with a lesser anomaly (max = 15.5 C) approximately 650 m to the southwest and down slope from the summit. Such small-scale and low-grade thermal features confirmed the increased activity state of the volcano and were only made possible with the moderate spatial, spectral, and radiometric resolution of ASTER. ASTER continued to collect data for the next 12 weeks tracking the progress of large scale pyroclastic flows, the growth of the lava dome, and the path of ash-rich plumes. Data from these observations were reported world-wide and used for evacuation and hazard planning purposes. With the pending demise of such TIR data from orbit, research is also focused on the use of handheld TIR instruments such as the forward-looking infrared radiometer (FLIR) camera. These instruments provide the highest spatial resolution in-situ TIR data and have been used to observe numerous volcanic phenomena and quantitatively model others (e.g., the rise of the magma body preceding the eruption of Mt. St. Helens Volcano; the changes on the lava dome at Bezymianny Volcano; the behavior of basalt crusts during pahoehoe flow inflation). Studies such as these confirm the utility and importance of future moderate to high resolution TIR data in order to understand volcanic processes and their accompanying hazards.
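
    A common first step in the kind of TIR monitoring described above is flagging thermally elevated pixels relative to the scene background. The sketch below shows one simple background-plus-threshold approach on a synthetic brightness-temperature grid; the threshold and temperatures are illustrative and are not the values or algorithm used in the ASTER urgent-request workflow.

    # Hedged sketch: flag thermally elevated pixels against a scene background (synthetic data).
    import numpy as np

    rng = np.random.default_rng(1)
    bt = rng.normal(8.0, 1.5, size=(60, 60))      # synthetic brightness temperatures (deg C)
    bt[30:32, 30:32] += 18.0                      # implant a small hot anomaly near the "summit"
    bt[40, 20] += 8.0                             # weaker anomaly downslope

    background = np.median(bt)                    # simple scene background estimate
    mad = np.median(np.abs(bt - background))      # robust spread (median absolute deviation)
    threshold = background + 6.0 * mad            # illustrative detection threshold

    anomalies = np.argwhere(bt > threshold)
    print(f"background ~ {background:.1f} C, threshold ~ {threshold:.1f} C")
    print(f"{len(anomalies)} thermally elevated pixels, max = {bt.max():.1f} C")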

  1. Health effects from hazardous waste incineration facilities: five case studies.

    PubMed

    Pleus, R C; Kelly, K E

    1996-01-01

    Environmental pollution, primarily from industrialization, has caused significant adverse effects to humans, animals, and the ecosystem. Attempts have been made to reduce and prevent these pollutants through better waste management practices. Incineration is one such practice, which seeks to prevent adverse health impacts to future generations by destroying waste today, without increasing risk to those living near incineration facilities in the process. As with any industrial process, however, proper design and operation are important requirements to ensure the facility can be operated safely. Any technology that cannot be managed safely should not be considered acceptable. This paper reviews the scientific basis of past allegations associated with the process of hazardous waste incineration. These five case studies, which have attracted considerable public attention, have not been shown to be scientifically accurate or factually based. This paper attempts to separate fact from fiction and to show some of the consistent inaccuracies that were repeated throughout all five studies. In reviewing the above cases and others in the literature, several common elements become apparent. 1. Most of the reports are based on single newspaper articles, activist newsletters, interviews with admittedly biased respondents, and other secondary or inappropriate sources of information that do not withstand scientific scrutiny. 2. Research studies are quoted incompletely or out of context. Often the original point made by the researcher is the exact opposite of the impression left by Costner and Thornton. 3. In four of five cases, no data were supplied to substantiate the claims. As an observation, where substantive research data do not exist to support allegations of adverse health effects, there seems to be a growing tendency to make allegations without providing supporting data. Because public damage is often done simply by making the allegation, this tactic appears to be effective. Thus, unsubstantiated allegations should not go unchallenged. 4. A relatively small group of people appears to be consistently generating most of the allegations. 5. The format of the allegations tends to be similar; often just the name of the facility changes. 6. Furthermore, these same few individuals tend to repeat the same allegations about the same facilities, even after the allegations have long since been proven incorrect. Despite the widespread prevalence of incineration facilities around the world and the millions of tons of waste destroyed in them each year, surprisingly few reports of adverse health effects exist in the scientific literature relative to other types of waste management practices. 7. The existing reports do not indicate that hazardous waste incineration has widespread potential for adverse health effects. However, as with all industrial processes, care must be taken to ensure that facilities are well designed and well operated to minimize or prevent adverse health effects. As with all environmental exposures, potential impacts on public health need to be addressed scientifically. Making a scientifically valid connection between operation of an incinerator and resulting disease within a population is a difficult undertaking, requiring the combined efforts of toxicologists, epidemiologists, chemists, physicians, and persons in other disciplines.
Nevertheless, concerns regarding potential impacts of incineration must be addressed and communicated, both accurately and effectively, if the actual risks of incineration are to become widely understood. PMID:8794540

  2. Rock cliffs hazard analysis based on remote geostructural surveys: The Campione del Garda case study (Lake Garda, Northern Italy)

    NASA Astrophysics Data System (ADS)

    Ferrero, A. M.; Migliazza, M.; Roncella, R.; Segalini, A.

    2011-02-01

    The town of Campione del Garda (located on the west coast of Lake Garda) and its access road have historically been subject to rockfall phenomena, with risk to public safety in several areas of the coast. This paper presents a study devoted to the determination of risk for the coastal cliffs and the design of mitigation measures. Our study was based on statistical rockfall analysis performed with a commercial code and on stability analysis of rock slopes based on the key block method. Hazard from block kinematics and rock-slope failure are coupled by applying the Rockfall Hazard Assessment Procedure (RHAP). Because of the huge dimensions of the slope, its morphology and the geostructural survey were particularly complicated and demanding. For these reasons, non-contact measurement methods, based on aerial photogrammetry by helicopter, were adopted. A special software program, developed by the authors, was applied for discontinuity identification and orientation measurement. The potential of aerial photogrammetric surveys in rock mechanics applications, and their contribution to improved knowledge of the rock mass, is analysed in the article.

  3. Land Use/Land Cover Approaches as Instruments of Natural Hazard Mitigation in the Manjira River Sub-Basin, Andhra Pradesh, India

    NASA Astrophysics Data System (ADS)

    Lakshmi, T. V.; Reddy, M. A.; Anjaneyulu, Y.

    2001-05-01

    Rapid industrialization during the last three decades had a profound adverse effect on the land use/land cover practices in, and the water quality of, the Manjira River sub-basin, Medak District, Andhra Pradesh, India. As water interacts with all other components of the environment, such as geology, soils, weather and climate, flora and fauna, the pollution of water has affected both the biophysical and the socioeconomic and cultural environments. The area of study is the catchment of Nakkavagu (stream) in the Manjira river system, which lies between long. 78 05' - 78 25' E. and lat. 17 25' - 17 45' N., and covers an area of 734 sq. km. Remote sensing and GIS techniques have been employed to identify and quantify measures for mitigating the adverse impacts of the industrialization and for being prepared for extreme weather events. The methodology employed in the present study involves the generation of various thematic layers like slope, hydrogeomorphology and land use / land cover maps using Landsat MSS, IRS 1A LISS II, IRS 1D LISS III and PAN merged data on the EASI/PACE ver. 6.3 platform. By overlaying all the above thematic maps, action plan maps are generated to devise various ways and means of rolling back the degradation of the environment, and to develop low-cost, people-participatory strategies (such as agricultural practices, use of water bodies and land under urbanization, and structural and non-structural, particularly vegetation, methods) for reducing the vulnerability of the population to extreme weather events.

  4. SIMPLE ADVICE FOR INJURED HAZARDOUS DRINKERS: AN IMPLEMENTATION STUDY

    Microsoft Academic Search

    ALICIA RODRIGUEZ-MARTOS; YOLANDA CASTELLANO; JOAN M. SALMERÓN; GEMMA DOMINGO

    2007-01-01

    Aim: To evaluate the implementation of a screening and intervention procedure for hazardous drinkers in the routine praxis of an emergency service, without increasing the ED (emergency department) staff. Methods: Four stages of the implementation process were undertaken: exploration and adoption, programme installation, and initial implementation. Two hospitals participated, with a coordinator, four trainers and all the emergency nursing staff.

  5. LAND CLASSIFICATION USED TO SELECT ABANDONED HAZARDOUS WASTE STUDY SITES

    EPA Science Inventory

    The biological effects of hazardous substances in the environment are influenced by climate, physiography, and biota. These factors interact to determine the transport and fate of chemicals, but are difficult to model accurately except for small areas with a large data base. The ...

  6. STUDY ON AIR INGRESS MITIGATION METHODS IN THE VERY HIGH TEMPERATURE GAS COOLED REACTOR (VHTR)

    SciTech Connect

    Chang H. Oh

    2011-03-01

    An air-ingress accident following a pipe break is considered a critical event for a very high temperature gas-cooled reactor (VHTR). Following helium depressurization, it is anticipated that, unless countermeasures are taken, air will enter the core through the break, leading to oxidation of the in-core graphite structure. Thus, without mitigation features, this accident might lead to severe exothermic chemical reactions between graphite and oxygen. Under extreme circumstances, a loss of core structural integrity may occur along with excessive release of radiological inventory. Idaho National Laboratory, under the auspices of the U.S. Department of Energy, is performing research and development (R&D) that focuses on key phenomena important during challenging scenarios that may occur in the VHTR. Phenomena Identification and Ranking Table (PIRT) studies to date have identified the air ingress event, following on the heels of a VHTR depressurization, as very important (Oh et al. 2006, Schultz et al. 2006). Consequently, the development of advanced air-ingress-related models and verification and validation (V&V) requirements are part of the experimental validation plan. This paper discusses various air-ingress mitigation concepts applicable to VHTRs. The study begins by identifying important factors (or phenomena) associated with the air-ingress accident using a root-cause analysis. By preventing the main causes of the important events identified in the root-cause diagram, the basic air-ingress mitigation ideas can be conceptually derived. The main concepts include (1) preventing structural degradation of the graphite supporters; (2) preventing local stress concentration in the supporter; (3) preventing graphite oxidation; (4) preventing air ingress; (5) preventing density-gradient-driven flow; (6) preventing fluid density gradients; (7) preventing fluid temperature gradients; and (8) preventing high temperatures. Based on these basic concepts, various air-ingress mitigation methods are proposed in this study. Among them, the following two mitigation ideas are investigated extensively using computational fluid dynamics (CFD) codes: (1) helium injection in the lower plenum, and (2) a reactor enclosure opened at the bottom. The main idea of the helium injection method is to displace the air in the core and the upper part of the lower plenum by buoyancy force. This method reduces graphite oxidation damage at the most severely affected locations inside the reactor. To validate this method, CFD simulations are presented here. A simple 2-D CFD model was developed based on the GT-MHR 600 MWt design. The simulation results showed that the injected helium displaces the air flowing into the core and significantly reduces the air concentration in the core and bottom reflector, potentially protecting them from oxidation damage. According to the simulation results, even a small helium flow was sufficient to remove air from the core, successfully mitigating the air ingress. The idea of the reactor enclosure with an opening at the bottom changes the overall air-ingress mechanism from natural convection to molecular diffusion. This method can be applied to the current system by some design modification of the reactor cavity. To validate this concept, this study also uses CFD simulations based on a simplified 2-D geometry. The simulation results showed that the enclosure opened at the bottom can successfully mitigate air ingress into the reactor even after the onset of natural circulation.
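
    The second mitigation idea hinges on the large gap between molecular-diffusion and natural-convection timescales; the sketch below compares the two on order-of-magnitude grounds. The geometry and property values are illustrative placeholders, not figures from the GT-MHR design or the paper's CFD model.

    # Hedged sketch: order-of-magnitude comparison of air ingress by diffusion vs. convection.
    # Geometry and property values are illustrative placeholders.
    L = 3.0          # characteristic ingress path length (m), hypothetical
    D = 2.0e-4       # air-helium binary diffusion coefficient at high temperature (m^2/s), approximate
    u = 0.5          # characteristic natural-circulation velocity (m/s), hypothetical

    t_diffusion  = L**2 / D      # diffusive transport timescale
    t_convection = L / u         # convective transport timescale

    print(f"diffusion timescale  ~ {t_diffusion/3600:.1f} h")
    print(f"convection timescale ~ {t_convection:.1f} s")
    print(f"ratio ~ {t_diffusion/t_convection:.0f}x slower by diffusion")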

  7. Development, Implementation, and Pilot Evaluation of a Model-Driven Envelope Protection System to Mitigate the Hazard of In-Flight Ice Contamination on a Twin-Engine Commuter Aircraft

    NASA Technical Reports Server (NTRS)

    Martos, Borja; Ranaudo, Richard; Norton, Billy; Gingras, David; Barnhart, Billy

    2014-01-01

    Fatal loss-of-control accidents have been directly related to in-flight airframe icing. The prototype system presented in this report directly addresses the need for real-time onboard envelope protection in icing conditions. The combination of prior information and real-time aerodynamic parameter estimation is shown to provide sufficient information for determining safe limits of the flight envelope during in-flight icing encounters. The Icing Contamination Envelope Protection (ICEPro) system was designed and implemented to identify degradations in airplane performance and flying qualities resulting from ice contamination and to provide safe flight-envelope cues to the pilot. The utility of the ICEPro system for mitigating a potentially hazardous icing condition was evaluated by 29 pilots using the NASA Ice Contamination Effects Flight Training Device. Results showed that real-time assessment cues were effective in reducing the number of potentially hazardous upset events and in lessening exposure to loss of control following an incipient upset condition. Pilot workload with the added ICEPro displays was not measurably affected, but pilot opinion surveys showed that real-time cueing greatly improved their awareness of a hazardous aircraft state. The performance of the ICEPro system was further evaluated under various levels of sensor noise and atmospheric turbulence.
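
    The report summary does not spell out how the real-time aerodynamic parameter estimation is done; the sketch below shows one generic way such estimates are often obtained, a recursive least-squares fit of a lift-curve slope from noisy measurements. The model structure, noise levels and "iced" degradation factor are assumptions for illustration and are not the ICEPro algorithm.

    # Hedged sketch: recursive least squares (RLS) estimation of a lift-curve slope from noisy data.
    # This is a generic identification scheme, not the ICEPro implementation.
    import numpy as np

    rng = np.random.default_rng(2)
    true_CL0, true_CLalpha = 0.25, 4.5            # "clean" lift model CL = CL0 + CLalpha * alpha
    ice_factor = 0.7                              # hypothetical degradation of CLalpha after icing

    theta = np.zeros(2)                           # parameter estimate [CL0, CLalpha]
    P = np.eye(2) * 100.0                         # estimate covariance
    lam = 0.98                                    # forgetting factor so the estimate tracks changes

    for k in range(400):
        alpha = np.deg2rad(rng.uniform(0.0, 8.0))             # measured angle of attack (rad)
        slope = true_CLalpha * (ice_factor if k > 200 else 1.0)
        CL = true_CL0 + slope * alpha + rng.normal(0, 0.01)   # noisy lift-coefficient measurement

        phi = np.array([1.0, alpha])                          # regressor
        gain = P @ phi / (lam + phi @ P @ phi)
        theta = theta + gain * (CL - phi @ theta)
        P = (P - np.outer(gain, phi) @ P) / lam

    print(f"estimated CLalpha after icing ~ {theta[1]:.2f} (true {true_CLalpha*ice_factor:.2f})")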

  8. Methodological framework for the probabilistic risk assessment of multi-hazards at a municipal scale: a case study in the Fella river valley, Eastern Italian Alps

    NASA Astrophysics Data System (ADS)

    Hussin, Haydar; van Westen, Cees; Reichenbach, Paola

    2013-04-01

    Local and regional authorities in mountainous areas that deal with hydro-meteorological hazards like landslides and floods try to set aside budgets for emergencies and risk mitigation. However, future losses are often not calculated in a probabilistic manner when allocating budgets or determining how much risk is acceptable. The absence of probabilistic risk estimates can create a lack of preparedness for reconstruction and risk reduction costs and a deficiency in promoting risk mitigation and prevention in an effective way. The probabilistic risk of natural hazards at the local scale is usually ignored altogether due to the difficulty of acknowledging, processing and incorporating uncertainties in the estimation of losses (e.g. physical damage, fatalities and monetary loss). This study attempts to set up a working framework for a probabilistic risk assessment (PRA) of landslides and floods at a municipal scale, using the Fella river valley (Eastern Italian Alps) as a multi-hazard case study area. The emphasis is on evaluating and determining the uncertainty in the estimation of losses from multiple hazards. The framework comprises the following steps: (1) using physically based stochastic landslide and flood models, we calculate the probability of the physical impact on individual elements at risk; (2) this is then combined with a statistical analysis of the vulnerability and monetary value of the elements at risk in order to include their uncertainty in the risk assessment; (3) finally, the uncertainty from each risk component is propagated into the loss estimation. The combined effect of landslides and floods on the direct risk to communities in narrow alpine valleys is also one of the important aspects that needs to be studied.
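
    Step (3) of the framework, propagating uncertainty from each risk component into the loss estimate, is commonly done by Monte Carlo sampling. The sketch below illustrates this generic idea with made-up distributions for hazard intensity, vulnerability and exposed value; the parameters are not taken from the Fella valley study.

    # Hedged sketch: Monte Carlo propagation of uncertainty into a loss estimate.
    # All distributions and parameters are illustrative placeholders.
    import numpy as np

    rng = np.random.default_rng(3)
    n = 100_000

    p_event   = 0.02                                        # annual probability of the hazard scenario
    intensity = rng.lognormal(mean=0.0, sigma=0.5, size=n)  # uncertain hazard intensity (e.g. flow depth, m)
    vuln      = np.clip(0.3 * intensity, 0.0, 1.0)          # simple vulnerability curve: damage fraction
    vuln     *= rng.uniform(0.7, 1.3, size=n)               # uncertainty on the vulnerability model itself
    value     = rng.normal(2.0e6, 0.3e6, size=n)            # uncertain exposed value (EUR)

    loss = np.clip(vuln, 0, 1) * value                      # scenario loss per sample
    annual_loss = p_event * loss                            # expected annualized contribution

    print(f"mean annual loss ~ EUR {annual_loss.mean():,.0f}")
    print(f"5th-95th percentile: EUR {np.percentile(annual_loss, 5):,.0f} "
          f"- EUR {np.percentile(annual_loss, 95):,.0f}")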

  9. Seismic hazard studies for the High Flux Beam Reactor at Brookhaven National Laboratory

    SciTech Connect

    Costantino, C.J.; Heymsfield, E. (City Coll., New York, NY (United States). Dept. of Civil Engineering); Park, Y.J.; Hofmayer, C.H. (Brookhaven National Lab., Upton, NY (United States))

    1991-01-01

    This paper presents the results of a calculation to determine the site-specific seismic hazard appropriate for the deep soil site at Brookhaven National Laboratory (BNL), which is to be used in the risk assessment studies being conducted for the High Flux Beam Reactor (HFBR). The calculations use as input the seismic hazard defined for the bedrock outcrop by a study conducted at Lawrence Livermore National Laboratory (LLNL). Variability in site soil properties was included in the calculations to obtain the seismic hazard at the ground surface and to compare these results with those using the generic amplification factors from the LLNL study. 9 refs., 8 figs.

  10. Landslide hazard evaluation: a review of current techniques and their application in a multi-scale study, Central Italy

    NASA Astrophysics Data System (ADS)

    Guzzetti, Fausto; Carrara, Alberto; Cardinali, Mauro; Reichenbach, Paola

    1999-12-01

    In recent years, growing population and expansion of settlements and life-lines over hazardous areas have largely increased the impact of natural disasters both in industrialized and developing countries. Third world countries have difficulty meeting the high costs of controlling natural hazards through major engineering works and rational land-use planning. Industrialized societies are increasingly reluctant to invest money in structural measures that can reduce natural risks. Hence, the new issue is to implement warning systems and land utilization regulations aimed at minimizing the loss of lives and property without investing in long-term, costly projects of ground stabilization. Government and research institutions worldwide have long attempted to assess landslide hazard and risks and to portray its spatial distribution in maps. Several different methods for assessing landslide hazard were proposed or implemented. The reliability of these maps and the criteria behind these hazard evaluations are ill-formalized or poorly documented. Geomorphological information remains largely descriptive and subjective. It is, hence, somewhat unsuitable to engineers, policy-makers or developers when planning land resources and mitigating the effects of geological hazards. In the Umbria and Marche Regions of Central Italy, attempts at testing the proficiency and limitations of multivariate statistical techniques and of different methodologies for dividing the territory into suitable areas for landslide hazard assessment have been completed, or are in progress, at various scales. These experiments showed that, despite the operational and conceptual limitations, landslide hazard assessment may indeed constitute a suitable, cost-effective aid to land-use planning. Within this framework, engineering geomorphology may play a renewed role in assessing areas at high landslide hazard, and helping mitigate the associated risk.

  11. In-Situ Mitigation of Effluents from Acid Waste Rock Dumps Using Reactive Surface Barriers –– a Feasibility Study

    Microsoft Academic Search

    P. Schneider; K. Osenbrück; P. L. Neitzel; K. Nindel

    2002-01-01

    The long-term mitigation of pore waters of acid waste rock dumps formed during uranium mining in the former G.D.R. requires new remediation approaches. A study was performed to evaluate the feasibility of reactive surface barriers (RSB) as part of an alternative covering system. One topic of the investigation was to evaluate suitable reactive materials for the mitigation of radionuclides

  12. A Study on Integrated Community Based Flood Mitigation with Remote Sensing Technique in Kota Bharu, Kelantan

    NASA Astrophysics Data System (ADS)

    'Ainullotfi, A. A.; Ibrahim, A. L.; Masron, T.

    2014-02-01

    This study is conducted to establish a community-based flood management system integrated with remote sensing techniques. To understand local knowledge, the demographics of the local society are obtained using a survey approach. The local authorities are approached first to obtain information about the society in the study areas, such as the population, gender and the distribution of settlement. Information about age, religion, ethnicity, occupation and years of experience facing floods in the area is recorded to better understand how local knowledge emerges. Geographic data such as rainfall, land use, land elevation and river discharge are then obtained and used to establish a hydrological model of flooding in the study area. Analyses of the survey data are made to understand the pattern of society and how people react to floods, while the geographic data are analysed to assess the water extent and the damage done by floods. The final result of this research is a flood mitigation method with a community-based framework for the state of Kelantan. By combining the community's understanding of floods with remote sensing techniques for forecasting heavy rainfall and flood occurrence, it is hoped that casualties and damage to society and infrastructure in the study area can be reduced.

  13. Multihazard risk analysis and disaster planning for emergency services as a basis for efficient provision in the case of natural hazards - case study municipality of Au, Austria

    NASA Astrophysics Data System (ADS)

    Maltzkait, Anika; Pfurtscheller, Clemens

    2014-05-01

    The extreme flood events of 2002, 2005 and 2013 in Austria underlined the importance of local emergency services being able to withstand and reduce the adverse impacts of natural hazards. Although municipal emergency and crisis management plans exist in Austria for legal reasons, they mostly do not cover risk analyses of natural hazards, i.e. a sound, comparable assessment to identify and evaluate risks. Moreover, total losses and operational emergencies triggered by natural hazards have increased in recent decades. Given sparse public funds, objective budget decisions are needed to ensure the efficient provision of operating resources, such as personnel, vehicles and equipment, in the case of natural hazards. We present a case study of the municipality of Au, Austria, which was heavily affected during the 2005 floods. Our approach is primarily based on a qualitative risk analysis, combining existing hazard plans, GIS data, field mapping and data on the operational efforts of the fire departments. The risk analysis includes a map of phenomena discussed in a workshop with local experts, as well as a list of risks and a risk matrix prepared at that workshop. On this basis, the exact requirements for technical and non-technical mitigation measures for each natural hazard risk were analysed in close collaboration with members of the municipal operation control and of the local emergency services (fire brigade, Red Cross). The measures include warning, evacuation and technical interventions with heavy equipment and personnel. These results are used, first, to improve the municipal emergency and crisis management plan by providing a risk map and a list of risks and, second, to check whether the local emergency forces can cope with the different risk scenarios using locally available resources. The emergency response plans will identify possible resource deficiencies in personnel, vehicles and equipment. As qualitative methods and data are used, uncertainties emerged in defining safety targets, in constructing the different risk scenarios, in the inherent uncertainty about the probability of occurrence and the intensity of natural hazards, and in the expectable losses. Finally, we used available studies and expert interviews to develop objective rules for investment decisions for the fire departments and the Red Cross, providing an empirically sound basis for the efficient provision of intervention in the case of natural hazards for the municipality of Au. Again, the rules for objective provision were developed in close collaboration with the emergency services.
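
    The workshop output described above is a qualitative risk matrix; a minimal sketch of how such a matrix can be encoded and used to rank scenarios is given below. The likelihood and consequence classes, class boundaries and scenario entries are invented for illustration and are not the Au workshop results.

    # Hedged sketch: a qualitative risk matrix of the kind produced in such workshops.
    LIKELIHOOD  = ["rare", "unlikely", "possible", "likely", "almost certain"]
    CONSEQUENCE = ["insignificant", "minor", "moderate", "major", "catastrophic"]

    def risk_class(likelihood: str, consequence: str) -> str:
        """Map a (likelihood, consequence) pair to a risk class via the score product."""
        score = (LIKELIHOOD.index(likelihood) + 1) * (CONSEQUENCE.index(consequence) + 1)
        if score >= 15:
            return "high"
        if score >= 6:
            return "medium"
        return "low"

    scenarios = {                      # hypothetical scenarios, not the Au workshop results
        "100-yr flood, village centre": ("possible", "major"),
        "debris flow, access road":     ("unlikely", "moderate"),
        "windthrow, forest road":       ("likely", "minor"),
    }
    for name, (lik, con) in scenarios.items():
        print(f"{name:32s} {lik:>14s} x {con:<13s} -> {risk_class(lik, con)}")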

  14. Using fine-scale fuel measurements to assess wildland fuels, potential fire behavior and hazard mitigation treatments in the southeastern USA

    Microsoft Academic Search

    Roger D. Ottmar; John I. Blake; William T. Crolly

    2012-01-01

    The inherent spatial and temporal heterogeneity of fuelbeds in forests of the southeastern United States may require fine scale fuel measurements for providing reliable fire hazard and fuel treatment effectiveness estimates. In a series of five papers, an intensive, fine scale fuel inventory from the Savanna River Site in the southeastern United States is used for building fuelbeds and mapping

  15. ASSESSMENT OF HAZARDOUS WASTE SURFACE IMPOUNDMENT TECHNOLOGY CASE STUDIES AND PERSPECTIVES OF EXPERTS

    EPA Science Inventory

    The available data were gathered for a large number of case studies of hazardous waste surface impoundments (SI). Actual and projected performances were compared. This collection, analysis and dissemination of the accumulated experience can contribute significantly to improving S...

  16. Observational Studies of Earthquake Preparation and Generation to Mitigate Seismic Risks in Mines

    NASA Astrophysics Data System (ADS)

    Durrheim, R. J.; Ogasawara, H.; Nakatani, M.; Milev, A.; Cichowicz, A.; Kawakata, H.; Yabe, Y.; Murakami, O.; Naoi, M. M.; Moriya, H.; Satoh, T.

    2011-12-01

    We provide a status report on a 5-year project to monitor in-situ fault instability and strong motion in South African gold mines. The project has two main aims: (1) To learn more about earthquake preparation and generation mechanisms by deploying dense arrays of high-sensitivity sensors within rock volumes where mining is likely to induce significant seismic activity. (2) To upgrade the South African national surface seismic network in the mining districts. This knowledge will contribute to efforts to upgrade schemes of seismic hazard assessment and to limit and mitigate the seismic risks in deep mines. As of 31 July 2011, 46 boreholes totalling 1.9 km in length had been drilled at project sites at Ezulwini, Moab-Khotsong and Driefontein gold mines. Several dozen more holes are still to be drilled. Acoustic emission sensors, strain- and tiltmeters, and controlled seismic sources are being installed to monitor the deformation of the rock mass, the accumulation of damage during the preparation phase, and changes in dynamic stress as the rupture front propagates. These data will be integrated with measurements of stope closure, stope strong motion, seismic data recorded by the mine-wide network, and stress modelling. Preliminary results will be reported at the AGU meeting. The project is endorsed by the Japan Science and Technology Agency (JST), the Japan International Cooperation Agency (JICA) and the South African government. It is funded by the JST-JICA program for Science and Technology Research Partnership for Sustainable Development (SATREPS), the Council for Scientific and Industrial Research (CSIR), the Council for Geoscience, the University of the Witwatersrand and the Department of Science and Technology. The contributions of Seismogen CC, OHMS Ltd, AnglogoldAshanti Rock Engineering Applied Research Group, First Uranium, the Gold Fields Seismic Department and the Institute of Mine Seismology are gratefully acknowledged.

  17. Hazardous Waste/Mixed Waste Treatment Building throughput study

    SciTech Connect

    England, J.L.; Kanzleiter, J.P.

    1991-12-18

    The hazardous waste/mixed waste (HW/MW) Treatment Building (TB) is the specified treatment location for solid hazardous waste/mixed waste at SRS. This report provides throughput information on the facility based on known and projected waste generation rates. For the first four years the HW/MW TB will have an annual waste input of approximately 38,000 ft^3 and an annual treated waste output of approximately 50,000 ft^3. After the first four years of operation it will have an annual waste input of approximately 16,000 ft^3 and an annual waste output of approximately 18,000 ft^3. There are several waste streams that cannot be accurately predicted (e.g. environmental restoration, decommissioning, and decontamination). The equipment and process area sizing for the initial four years should allow excess processing capability for these poorly defined waste streams. A treatment process description and process flow of the waste are included to aid in understanding the throughput computations. A description of the treated wastes is also included.
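
    The quoted input and output volumes imply a treatment expansion factor and a cumulative volume over any planning horizon; the short sketch below works these out from the figures in the abstract. The ten-year horizon (four initial years plus six further years) is an illustrative assumption, not a figure from the report.

    # Hedged sketch: throughput arithmetic from the figures quoted above (horizon is illustrative).
    phases = [
        # (years, annual input ft^3, annual treated output ft^3)
        (4, 38_000, 50_000),     # first four years of operation
        (6, 16_000, 18_000),     # subsequent years; six more years assumed here
    ]

    total_in = sum(y * q_in for y, q_in, _ in phases)
    total_out = sum(y * q_out for y, _, q_out in phases)

    for years, q_in, q_out in phases:
        print(f"{years} yr phase: expansion factor ~ {q_out / q_in:.2f}")
    print(f"10-year totals: input ~ {total_in:,} ft^3, treated output ~ {total_out:,} ft^3")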

  18. Applications of 3D QRA technique to the fire\\/explosion simulation and hazard mitigation within a naphtha-cracking plant

    Microsoft Academic Search

    Yet-Pole I; Chi-Min Shu; Ching-Hong Chong

    2009-01-01

    This research employed computational fluid dynamics (CFD) fire and explosion simulation software, FLACS, to evaluate the possible hazards of different worst-case scenarios within a petrochemical plant. After the effect factors (overpressure, pressure impulse, and thermal radiation temperature) were calculated, the results of interest were, in turn, fed into a self-developed risk analysis module to estimate the corresponding 3D individual risk

  19. Study on mitigation of pulsed heat load for ITER cryogenic system

    NASA Astrophysics Data System (ADS)

    Peng, N.; Xiong, L. Y.; Jiang, Y. C.; Tang, J. C.; Liu, L. Q.

    2015-03-01

    One of the key requirements for the ITER cryogenic system is the mitigation of the pulsed heat load deposited in the magnet system due to magnetic field variation and pulsed DT neutron production. As one of the control strategies, the bypass valves of the Toroidal Field (TF) case helium loop would be adjusted to mitigate the pulsed heat load to the LHe plant. A quasi-3D time-dependent thermal-hydraulic analysis of the TF winding packs and TF case has been performed to study the behavior of the TF magnets during the reference plasma scenario, with pulses of 400 s burn and a repetition time of 1800 s. The model is based on a 1D helium flow and quasi-3D solid heat conduction model. The whole TF magnet is simulated, taking into account thermal conduction between the winding pack and the case, which are cooled separately. The heat loads are given as input information and include AC losses in the conductor, eddy current losses in the structure, thermal radiation, thermal conduction and nuclear heating. The simulation results indicate that the temperature variation of the TF magnet stays within the allowable range when the smooth control strategy is active.

  20. Experimental study of disruption mitigation using massive injection of noble gases on Tore Supra

    NASA Astrophysics Data System (ADS)

    Reux, C.; Bucalossi, J.; Saint-Laurent, F.; Gil, C.; Moreau, P.; Maget, P.

    2010-09-01

    Disruptions are a major threat for future tokamaks, including ITER. Disruption-generated heat loads, electromagnetic forces and runaway electrons will not be tolerable for next-generation devices. Massive noble gas injection is foreseen as a standard mitigation system for these tokamaks. Disruption mitigation experiments have been carried out on Tore Supra to study various injection scenarios and to investigate gas jet penetration and mixing. Comparisons of different gases (He, Ne, Ar, He/Ar mixture) and amounts (from 5 to 500 Pa m3) were made, showing that light gases are more efficient regarding runaway electron suppression than heavier gases. Eddy currents in the limiter are moderately reduced by all the gases, and may be more dependent on the time constants of the structures than on the gas species. The density rise induced by the massive injection before the thermal quench is higher and faster with light gases. Gas jet penetration in the cooling phase is observed to be shallow and independent of the gas nature and amount. The gas cold front is stopped along the q = 2 surface where it triggers MHD instabilities, expelling thermal energy from the plasma core.

  1. Study on Emergency Announcement Systems to Reduce Damages by Hazardous Disasters in Coastal Fishery Communities

    NASA Astrophysics Data System (ADS)

    Gotoh, Hiroshi

    In recent years, a number of hazardous disasters have occurred throughout the world. In Japan, although the administrative authorities have constructed protective structures as countermeasures against hazards, it is also important that people evacuate adequately in the event of a disaster. In this study, questionnaires and field investigations were carried out on the effectiveness of emergency announcement systems installed in coastal fishery communities facing the Pacific Ocean. Ways to improve emergency announcement systems are also discussed.

  2. Assessing sensitivity of Probabilistic Seismic Hazard Analysis (PSHA) to fault parameters: Sumatra case study

    NASA Astrophysics Data System (ADS)

    Omang, A.; Cummins, P. R.; Horspool, N.; Hidayati, S.

    2012-12-01

    Slip rate data and fault geometry are two important inputs in determining seismic hazard, because they are used to estimate earthquake recurrence intervals, which strongly influence the hazard level in an area. However, the uncertainties in the slip rates and geometry of a fault are rarely considered in probabilistic seismic hazard analysis (PSHA), which is surprising given that slip-rate estimates can vary significantly between data sources (e.g. geological vs. geodetic). We use the PSHA method to assess the sensitivity of seismic hazard to fault slip rates along the Great Sumatran Fault in Sumatra, Indonesia. We consider the epistemic uncertainty of the fault slip rate by employing logic trees to include alternative slip-rate models. The weighting of the logic tree is determined by the probability density function of the slip-rate estimates, using the approach of Zechar and Frankel (2009). We examine how the PSHA result accounting for slip-rate uncertainty differs from that for a specific slip rate by evaluating hazard values as a function of return period and distance from the fault. We also consider the geometry of the fault, especially the top and bottom of the rupture area within a fault, to study the effect of different depths. Based on the results of this study, in some cases the uncertainty in fault slip rates, fault geometry and maximum magnitude has a significant effect on the hazard level and the area impacted by earthquakes, and should be considered in PSHA studies.
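
    The logic-tree treatment of slip-rate uncertainty amounts to weighting alternative recurrence models by the probability density of the slip-rate estimate. The sketch below illustrates that weighting step for a single fault, with invented slip-rate statistics and a deliberately simplified recurrence relation; these are not the Great Sumatran Fault parameters used in the study.

    # Hedged sketch: weighting logic-tree slip-rate branches by a slip-rate PDF.
    # Slip-rate statistics and the recurrence relation are illustrative placeholders.
    import numpy as np
    from scipy.stats import norm

    mean_slip, sd_slip = 15.0, 4.0                  # hypothetical slip-rate estimate, mm/yr
    branches = np.array([9.0, 15.0, 21.0])          # alternative slip-rate branches (mm/yr)

    # Branch weights proportional to the PDF of the slip-rate estimate, normalized to 1.
    w = norm.pdf(branches, loc=mean_slip, scale=sd_slip)
    w /= w.sum()

    # Simplified characteristic-earthquake recurrence: event rate proportional to slip rate.
    slip_per_event_m = 2.0                          # hypothetical average slip per characteristic event (m)
    rates = (branches / 1000.0) / slip_per_event_m  # events per year on each branch

    mean_rate = np.sum(w * rates)
    print("branch weights:", w.round(3))
    print(f"weighted mean recurrence interval ~ {1.0/mean_rate:.0f} years")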

  3. A combined approach to physical vulnerability of large cities exposed to natural hazards - the case study of Arequipa, Peru

    NASA Astrophysics Data System (ADS)

    Thouret, Jean-Claude; Ettinger, Susanne; Zuccaro, Giulio; Guitton, Mathieu; Martelli, Kim; Degregorio, Daniela; Nardone, Stefano; Santoni, Olivier; Magill, Christina; Luque, Juan Alexis; Arguedas, Ana

    2013-04-01

    Arequipa, the second largest city in Peru with almost one million inhabitants, is exposed to various natural hazards, such as earthquakes, landslides, flash floods, and volcanic eruptions. This study focuses on the vulnerability and response of housing, infrastructure and lifelines in Arequipa to flash floods and eruption-induced hazards, notably lahars from El Misti volcano. We propose a combined approach for assessing physical vulnerability in a large city based on: (1) remote sensing utilizing high-resolution imagery (SPOT5, Google Earth Pro, Bing, Pléïades) to map the distribution and type of land use, and the properties of city blocks in terms of exposure to the hazard (elevation above river level, distance to channel, impact angle, etc.); (2) in situ survey of buildings and critical infrastructure (e.g., bridges) and strategic resources (e.g., potable water, irrigation, sewage); (3) information gained from interviews with engineers involved in construction works, previous crises (e.g., the June 2001 earthquake) and risk mitigation in Arequipa. Remote sensing and mapping at the scale of the city have focused on three pilot areas, along the perennial Rio Chili valley that crosses the city and oasis from north to south, and two of the east-margin tributaries termed Quebrada (ravine): San Lazaro crossing the northern districts and Huarangal crossing the northeastern districts. Sampling of city blocks through these districts provides varying geomorphic, structural, historical, and socio-economic characteristics for each sector. A reconnaissance survey included about 900 edifices located in 40 city blocks across districts of the pilot areas, distinct in age, construction, land use and demographics. A building acts as a structural system, and its strength and resistance to flash floods and lahars therefore depend highly on the type of construction and the materials used. Each building surveyed was assigned to one of eight building categories based on physical criteria (dominant building materials, number of floors, percentage and quality of openings, etc.). Future steps in this study include mapping potential impacts from flash floods and lahars as a function of frequency of occurrence and magnitude. For this purpose, we will regroup the eight building types identified in Arequipa to obtain a reduced number of vulnerability categories. Fragility functions will then be established for each vulnerability category and hazard, relating percentage damage to parameters such as flow velocity, depth, and dynamic and hydrostatic pressure. These functions will be applied to flow simulations for each of the three river channels considered, with the final goal of determining potential losses, identifying areas of particularly high risk and preparing plans for evacuation, relocation and rehabilitation. In the long term, this investigation aims to contribute towards a multi-hazard risk analysis including earthquake and other volcanic hazards, e.g. ashfall and pyroclastic flows, by considering the cascading effects of a hazard chain. We also plan to address the consequences of failure of two artificial lake dams located 40 and 70 km north of the city. A lake breakout flood or lahar would propagate beyond the city and would call for an immediate response including contingency plans and evacuation practices.
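
    The planned fragility functions relate a damage ratio to flow parameters; the sketch below shows one common functional form, a lognormal CDF in flow depth, evaluated for two hypothetical vulnerability categories. The parameter values are invented for illustration and are not the Arequipa fragility curves, which the abstract says are still to be derived.

    # Hedged sketch: lognormal fragility curves, damage ratio vs. flow depth (illustrative parameters).
    import numpy as np
    from scipy.stats import lognorm

    depths = np.linspace(0.1, 4.0, 8)     # flow depth (m)

    # Hypothetical categories: (median depth for 50% damage in m, lognormal sigma)
    categories = {
        "weak masonry":        (0.8, 0.5),
        "reinforced concrete": (2.5, 0.4),
    }

    for name, (median, sigma) in categories.items():
        damage = lognorm.cdf(depths, s=sigma, scale=median)   # mean damage ratio in [0, 1]
        print(name, np.round(damage, 2))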

  4. Melatonin mitigate cerebral vasospasm after experimental subarachnoid hemorrhage: a study of synchrotron radiation angiography

    NASA Astrophysics Data System (ADS)

    Cai, J.; He, C.; Chen, L.; Han, T.; Huang, S.; Huang, Y.; Bai, Y.; Bao, Y.; Zhang, H.; Ling, F.

    2013-06-01

    Cerebral vasospasm (CV) after subarachnoid hemorrhage (SAH) is a devastating and unsolved clinical issue. In this study, rat models in which SAH had been induced by prechiasmatic cistern injection were treated with melatonin. Synchrotron radiation angiography (SRA) was employed to detect and evaluate CV in the animal models. Neurological scoring and histological examinations were used to assess the neurological deficits and CV as well. Using SRA techniques and histological analyses, the anterior cerebral artery diameters of SAH rats with melatonin administration were larger than those of rats without melatonin treatment (p < 0.05). The neurological deficits of SAH rats treated with melatonin were less than those of rats without melatonin treatment (p < 0.05). We concluded that SRA was a precise in vivo tool to observe and evaluate CV in SAH rats, and that intraperitoneal administration of melatonin could mitigate CV after experimental SAH.

  5. 44 CFR 65.16 - Standard Flood Hazard Determination Form and Instructions.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ...2010-10-01 2010-10-01 false Standard Flood Hazard Determination Form and Instructions...INSURANCE AND HAZARD MITIGATION National Flood Insurance Program IDENTIFICATION AND...SPECIAL HAZARD AREAS § 65.16 Standard Flood Hazard Determination Form and...

  6. 44 CFR 65.16 - Standard Flood Hazard Determination Form and Instructions.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ...2012-10-01 2011-10-01 true Standard Flood Hazard Determination Form and Instructions...INSURANCE AND HAZARD MITIGATION National Flood Insurance Program IDENTIFICATION AND...SPECIAL HAZARD AREAS § 65.16 Standard Flood Hazard Determination Form and...

  7. 44 CFR 65.16 - Standard Flood Hazard Determination Form and Instructions.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ...2014-10-01 2014-10-01 false Standard Flood Hazard Determination Form and Instructions...INSURANCE AND HAZARD MITIGATION National Flood Insurance Program IDENTIFICATION AND...SPECIAL HAZARD AREAS § 65.16 Standard Flood Hazard Determination Form and...

  8. 44 CFR 65.16 - Standard Flood Hazard Determination Form and Instructions.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ...2013-10-01 2013-10-01 false Standard Flood Hazard Determination Form and Instructions...INSURANCE AND HAZARD MITIGATION National Flood Insurance Program IDENTIFICATION AND...SPECIAL HAZARD AREAS § 65.16 Standard Flood Hazard Determination Form and...

  9. 44 CFR 65.16 - Standard Flood Hazard Determination Form and Instructions.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ...2011-10-01 2011-10-01 false Standard Flood Hazard Determination Form and Instructions...INSURANCE AND HAZARD MITIGATION National Flood Insurance Program IDENTIFICATION AND...SPECIAL HAZARD AREAS § 65.16 Standard Flood Hazard Determination Form and...

  10. Echo-sounding method aids earthquake hazard studies

    USGS Publications Warehouse

    U.S. Geological Survey

    1995-01-01

    Dramatic examples of catastrophic damage from an earthquake occurred in 1989, when the M 7.1 Loma Prieta earthquake rocked the San Francisco Bay area, and in 1994, when the M 6.6 Northridge earthquake jolted southern California. The surprising amount and distribution of damage to private property and infrastructure emphasize the importance of seismic-hazard research in urbanized areas, where the potential for damage and loss of life is greatest. During April 1995, a group of scientists from the U.S. Geological Survey and the University of Tennessee, using an echo-sounding method described below, is collecting data in San Antonio Park, California, to examine the Monte Vista fault, which runs through this park. The Monte Vista fault in this vicinity shows evidence of movement within the last 10,000 years or so. The data will give them a "picture" of the subsurface rock deformation near this fault. The data will also be used to help locate a trench that will be dug across the fault by scientists from William Lettis & Associates.

  11. Google Earth Views of Probabilistic Tsunami Hazard Analysis Pilot Study, Seaside, Oregon

    Microsoft Academic Search

    F. L. Wong; A. J. Venturato; E. L. Geist

    2006-01-01

    Virtual globes such as Google Earth provide immediate geographic context for research data for coastal hazard planning. We present Google Earth views of data from a Tsunami Pilot Study conducted within and near Seaside and Gearhart, Oregon, as part of FEMA's Flood Insurance Rate Map Modernization Program (Tsunami Pilot Study Working Group, 2006). Two goals of the pilot study were to develop probabilistic 100-year and 500-year tsunami inundation maps using Probabilistic Tsunami Hazard Analysis (PTHA) and to provide recommendations for improved tsunami hazard assessment guidelines.

  12. Mitigation and prevention of exertional heat stress in firefighters: a review of cooling strategies for structural firefighting and hazardous materials responders.

    PubMed

    McEntire, Serina J; Suyama, Joe; Hostler, David

    2013-01-01

    Most duties performed by firefighters require the use of personal protective equipment, which inhibits normal thermoregulation during exertion, creating an uncompensable heat stress. Structured rest periods are required to correct the effects of uncompensable heat stress and ensure that firefighter safety is maintained and that operations can be continued until their conclusion. While considerable work has been done to optimize firefighter cooling during fireground operations, there is little consensus on when or how cooling should be deployed. A systematic review of cooling techniques and practices among firefighters and hazardous materials operators was conducted to describe the state of the science and provide recommendations for deploying resources for fireground rehab (i.e., structured rest periods during an incident). Five electronic databases were searched using a selected combination of key words. One hundred forty publications were found in the initial search, with 27 meeting all the inclusion criteria. Two independent reviewers performed a qualitative assessment of each article based on nine specific questions. From the selected literature, the efficacy of multiple cooling strategies was compared during exertion and immediately following exertion under varying environmental conditions. When considering the literature available for cooling firefighters and hazardous materials technicians during emergency incident rehabilitation, widespread use of cooling devices does not appear to be warranted if ambient temperature and humidity approximate room temperature and protective garments can be removed. When emergency incident rehabilitation must be conducted in hot or humid conditions, active cooling devices are needed. Hand/forearm immersion is likely the best modality for cooling during rehab under hot, humid conditions; however, this therapy has a number of limitations. Cooling during work thus far has been limited primarily to cooling vests and liquid- or air-cooled suits. In general, liquid-perfused suits appear to be superior to air-cooled garments, but both add weight to the firefighter, making current iterations less desirable. There is still considerable work to be done to determine the optimal cooling strategies for firefighters and hazardous materials operators during work. PMID:23379781

  13. Earthquake Hazard Map, A Small Step to Adapt with the Teeterboard Named Indonesia Case study: Papua Island

    NASA Astrophysics Data System (ADS)

    Cipta, A.

    2012-12-01

    The availability of earthquake hazard maps throughout Indonesia is a must, since earthquake mitigation efforts are focused mainly on the pre-disaster phase. The hazard map is created using the PSHA (Probabilistic Seismic Hazard Assessment) method and developed in OpenQuake. Earthquake sources, site classes and return periods are required, and a GMPE must be selected for each earthquake zone. Site class is determined from geological and geomorphic aspects: geomorphology is used to estimate Vs30, i.e., the site class at each site. On this basis, the geomorphic map of Papua is divided into 13 geomorphic classes. Earthquake sources are classified as active faults, subduction zones and background. Parameters used for the probabilistic analyses are the fault trace, slip rate, dip and fault dimensions. Four subduction-zone segments, intraslab sources and 14 active faults, both onshore and offshore, are used for this purpose. Parameters needed for the subduction source model are area, dip, slip rate, and a- and b-values. The background source model accommodates earthquakes that may occur randomly away from mapped faults and allows for larger earthquakes near the locations of historical small and medium events. Since no attenuation function specific to Indonesia is yet available, attenuation functions introduced by other researchers are used. Hazard levels were calculated for a 10% probability of exceedance in 50 years (a 475-year return period), and an earthquake hazard map of Papua was produced, with hazard levels classified on the MMI scale.
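
    The 475-year figure quoted above follows from the standard Poisson relation between exceedance probability, exposure time and return period; a minimal check of that arithmetic:

      import math

      def return_period(prob_exceedance, exposure_years):
          """Return period under a Poisson occurrence model:
          P = 1 - exp(-t/T)  =>  T = -t / ln(1 - P)."""
          return -exposure_years / math.log(1.0 - prob_exceedance)

      print(round(return_period(0.10, 50)))  # ~475 years, as quoted in the abstract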

  14. Studying and Improving Human Response to Natural Hazards: Lessons from the Virtual Hurricane Lab

    NASA Astrophysics Data System (ADS)

    Meyer, R.; Broad, K.; Orlove, B. S.

    2010-12-01

    One of the most critical challenges facing communities in areas prone to natural hazards is how to best encourage residents to invest in individual and collective actions that would reduce the damaging impact of low-probability, high-consequence environmental events. Unfortunately, what makes this goal difficult to achieve is that the relative rarity of natural hazards implies that many who face the risk of natural hazards have no previous experience to draw on when making preparation decisions, or have prior experience that provides misleading guidance on how best to prepare. For example, individuals who have experienced strings of minor earthquakes or near-misses from tropical cyclones may become overly complacent about the risks that extreme events actually pose. In this presentation we report the preliminary findings of a program of work that explores the use of realistic multi-media hazard simulations designed for two purposes: 1) to serve as a basic research tool for studying how individuals make decisions to prepare for rare natural hazards in laboratory settings; and 2) to serve as an educational tool for giving people in hazard-prone areas virtual experience in hazard preparation. We demonstrate a prototype simulation in which participants experience the approach of a virtual hurricane, where they have the opportunity to invest in different kinds of action to protect their home from damage. As the hurricane approaches, participants have access to an “information dashboard” in which they can gather information about the storm threat from a variety of natural sources, including mock television weather broadcasts, web sites, and conversations with neighbors. In response to this information they then have the opportunity to invest in different levels of protective actions. Some versions of the simulation are designed as games, where participants are rewarded based on their ability to make the optimal trade-off between under- and over-preparing for the threat. From a basic research perspective the data provide valuable potential insights into the dynamics of information gathering prior to hurricane impacts, as well as a laboratory in which we can study how both information gathering and responses vary in response to controlled variations in such factors as the complexity of forecast information. From an applied perspective the simulations provide an opportunity for residents in hazard-prone areas to learn about different kinds of information and receive feedback on their potential biases prior to an actual encounter with a hazard. The presentation concludes with a summary of some of the basic research findings that have emerged from the hurricane lab to date, as well as a discussion of the prospects for extending the technology to a broad range of environmental hazards.

  15. Public engagement in neighbourhood level wildfire mitigation and preparedness: case studies from Canada, the US and Australia.

    PubMed

    McGee, T K

    2011-10-01

    This study examined neighbourhood level wildfire mitigation programs being implemented in neighbourhoods in Canada (FireSmart-ForestWise), Australia (Community Fireguard) and the US (Firewise Communities). Semi-structured interviews were completed with 19 residents participating in the programs. A wide range of activities were completed as part of the three programs. Despite differences between the three programs, participants appeared to participate in the programs for three main reasons: Fire experience, agency involvement, and personal and family protection. A fire therefore provides a window of opportunity to engage residents in neighbourhood level wildfire mitigation programs. The neighbourhood level wildfire mitigation programs helped to reduce the wildfire risk, but also enhanced both community resilience and relationships between residents and government agencies. PMID:21684061

  16. Public willingness to pay for CO2 mitigation and the determinants under climate change: a case study of Suzhou, China.

    PubMed

    Yang, Jie; Zou, Liping; Lin, Tiansheng; Wu, Ying; Wang, Haikun

    2014-12-15

    This study explored the factors that influence respondents' willingness to pay (WTP) for CO2 mitigation under climate change. A questionnaire survey combining contingent valuation and psychometric paradigm methods was conducted in the city of Suzhou, Jiangsu Province, China. Respondents' traditional demographic attributes, risk perception of greenhouse gases (GHG), and attitude toward the government's risk management practices were analyzed with a Tobit model to identify the determinants of WTP. The results showed that about 55% of the respondents refused to pay for CO2 mitigation, while respondents' WTP increased with the CO2 mitigation percentage. Important factors influencing WTP include people's feeling of dread of GHGs, confidence in policy, the timeliness of governmental information disclosure, age, education and income level. PMID:25151109
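
    Because many respondents report a WTP of exactly zero, a Tobit (left-censored) regression of the kind the authors cite is a natural fit. The sketch below is a minimal maximum-likelihood Tobit on synthetic data, not the authors' model or data; the covariate and parameter values are hypothetical.

      import numpy as np
      from scipy.optimize import minimize
      from scipy.stats import norm

      rng = np.random.default_rng(0)
      n = 500
      X = np.column_stack([np.ones(n), rng.normal(size=n)])    # intercept + one covariate
      beta_true, sigma_true = np.array([-0.5, 1.0]), 1.0
      y_star = X @ beta_true + sigma_true * rng.normal(size=n)  # latent WTP
      y = np.clip(y_star, 0.0, None)                            # observed WTP, censored at 0

      def neg_loglik(params):
          beta, sigma = params[:-1], np.exp(params[-1])
          xb = X @ beta
          cens = y <= 0.0
          ll_cens = norm.logcdf(-xb[cens] / sigma)              # P(latent WTP <= 0)
          ll_obs = norm.logpdf((y[~cens] - xb[~cens]) / sigma) - np.log(sigma)
          return -(ll_cens.sum() + ll_obs.sum())

      res = minimize(neg_loglik, x0=np.zeros(X.shape[1] + 1), method="BFGS")
      print(res.x[:-1], np.exp(res.x[-1]))   # estimated coefficients and sigma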

  17. Pyrotechnic hazards classification and evaluation program test report. Heat flux study of deflagrating pyrotechnic munitions

    NASA Technical Reports Server (NTRS)

    Fassnacht, P. O.

    1971-01-01

    A heat flux study of deflagrating pyrotechnic munitions is presented. Three tests were authorized to investigate whether heat flux measurements may be used as effective hazards evaluation criteria to determine safe quantity distances for pyrotechnics. A passive sensor study was conducted simultaneously to investigate their usefulness in recording events and conditions. It was concluded that heat flux measurements can effectively be used to evaluate hazards criteria and that passive sensors are an inexpensive tool to record certain events in the vicinity of deflagrating pyrotechnic stacks.

  18. Mini-Sosie high-resolution seismic method aids hazards studies

    USGS Publications Warehouse

    Stephenson, W.J.; Odum, J.; Shedlock, K.M.; Pratt, T.L.; Williams, R.A.

    1992-01-01

    The Mini-Sosie high-resolution seismic method has been effective in imaging shallow structural and stratigraphic features that aid in seismic-hazard and neotectonic studies. The method is not an alternative to Vibroseis acquisition for large-scale studies. However, it has two major advantages over Vibroseis as it is being used by the USGS in its seismic-hazards program. First, the sources are extremely portable and can be used in both rural and urban environments. Second, the shifting-and-summation process during acquisition improves the signal-to-noise ratio and cancels out seismic noise sources such as cars and pedestrians. -from Authors
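
    The shift-and-sum step can be illustrated with a toy example: many weak records of the same arrival, each delayed by a known impact time, are aligned and averaged, so incoherent noise falls roughly as the square root of the number of impacts. This is synthetic data illustrating the principle, not the USGS processing code.

      import numpy as np

      rng = np.random.default_rng(1)
      n_samples, n_impacts = 1000, 64
      wavelet = np.zeros(n_samples)
      wavelet[200:220] = np.hanning(20)               # weak coherent arrival

      # Each raw record: the same arrival delayed by a known impact time, plus noise.
      delays = rng.integers(0, 300, size=n_impacts)
      records = np.array([
          np.roll(np.roll(wavelet, d) + rng.normal(scale=1.0, size=n_samples), -d)
          for d in delays                             # shift back using recorded impact times
      ])

      stack = records.mean(axis=0)                    # summation step
      print(np.std(records[0][600:]), np.std(stack[600:]))  # noise drops by ~sqrt(64) = 8x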

  19. Seismic hazard analysis application of methodology, results, and sensitivity studies. Volume 4

    SciTech Connect

    Bernreuter, D. L

    1981-08-08

    As part of the Site Specific Spectra Project, this report seeks to identify the sources of and minimize uncertainty in estimates of seismic hazards in the Eastern United States. Findings are being used by the Nuclear Regulatory Commission to develop a synthesis among various methods that can be used in evaluating seismic hazard at the various plants in the Eastern United States. In this volume, one of a five-volume series, we discuss the application of the probabilistic approach using expert opinion. The seismic hazard is developed at nine sites in the Central and Northeastern United States, and both individual experts' and synthesis results are obtained. We also discuss and evaluate the ground motion models used to develop the seismic hazard at the various sites, analyzing extensive sensitivity studies to determine the important parameters and the significance of uncertainty in them. Comparisons are made between probabilistic and real spectra for a number of Eastern earthquakes. The uncertainty in the real spectra is examined as a function of the key earthquake source parameters. In our opinion, the single most important conclusion of this study is that the use of expert opinion to supplement the sparse data available on Eastern United States earthquakes is a viable approach for determining estimated seismic hazard in this region of the country. 29 refs., 15 tabs.

  20. Seaside, Oregon Tsunami Pilot Study--Modernization of FEMA Flood Hazard Maps

    E-print Network

    Seaside, Oregon Tsunami Pilot Study--Modernization of FEMA Flood Hazard Maps. By the Tsunami Pilot Study Working Group, National Oceanic and Atmospheric Administration, U.S. Department of Commerce. [Cover figure: 500-year tsunami maximum wave height (m), i.e., with a 0.002 annual probability of exceedance.]

  1. Remote sensing techniques for landslide studies and hazard zonation in Europe

    Microsoft Academic Search

    Franco Mantovani; Robert Soeters; C. J. Van Westen

    1996-01-01

    An inventory is presented of research concerning the use of remote sensing for landslide studies and hazard zonation, mainly carried out in the countries belonging to the European Community. An overview is given of the applicability of remote sensing in the following phases of landslide studies: (1) detection and classification of landslides. Special emphasis is given to the types

  2. Seismic hazard assessment of the cultural heritage sites: A case study in Cappadocia (Turkey)

    NASA Astrophysics Data System (ADS)

    Seyrek, Evren; Orhan, Ahmet; Dinçer, ?smail

    2014-05-01

    Turkey is one of the most seismically active regions in the world. Major earthquakes with the potential of threatening life and property occur frequently here. In the last decade, over 50,000 residents lost their lives, commonly as a result of building failures in seismic events. The Cappadocia region is one of the most important touristic sites in Turkey. At the same time, the region was included in the World Heritage List by UNESCO in 1985 for its natural, historical and cultural values. The region is adversely affected by several environmental conditions, which have been the subject of many previous studies, but there are few studies on the seismic evaluation of the region. Some of the important historical and cultural heritage sites are: Goreme Open Air Museum, Uchisar Castle, Ortahisar Castle, Derinkuyu Underground City and Ihlara Valley. According to the seismic hazard zonation map published by the Ministry of Reconstruction and Settlement, these heritage sites fall in Zone III, Zone IV and Zone V. This map shows peak ground acceleration with a 10 percent probability of exceedance in 50 years for bedrock. In this connection, the seismic hazard of these heritage sites has to be evaluated. In this study, seismic hazard calculations are performed with both deterministic and probabilistic approaches, taking local site conditions into account. A catalog of historical and instrumental earthquakes is prepared and used in this study. The seismic sources have been identified for seismic hazard assessment based on geological, seismological and geophysical information. Peak Ground Acceleration (PGA) at bedrock level is calculated for the different seismic sources using available attenuation relationships applicable to Turkey. The results of the present study reveal that the seismic hazard at these sites closely matches the seismic zonation map published by the Ministry of Reconstruction and Settlement. Keywords: Seismic Hazard Assessment, Probabilistic Approach, Deterministic Approach, Historical Heritage, Cappadocia.
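
    To show what "PGA from an attenuation relationship" means in practice, the sketch below evaluates a generic attenuation form for magnitude and distance. The functional form and coefficients are hypothetical placeholders, not one of the published relationships actually applied to Turkey in this study.

      import numpy as np

      def generic_gmpe_pga(magnitude, r_epi_km, c0=-1.0, c1=0.5, c2=1.0, h_km=10.0):
          """Generic attenuation form ln(PGA[g]) = c0 + c1*M - c2*ln(sqrt(R^2 + h^2)).
          Coefficients are hypothetical, not a published GMPE."""
          r_eff = np.sqrt(r_epi_km ** 2 + h_km ** 2)
          return np.exp(c0 + c1 * magnitude - c2 * np.log(r_eff))

      for r in (10, 30, 100):
          print(r, round(generic_gmpe_pga(6.5, r), 3))   # PGA (g) decays with distance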

  3. Study of the environmental hazard caused by the oil shale industry solid waste.

    PubMed

    Põllumaa, L; Maloveryan, A; Trapido, M; Sillak, H; Kahru, A

    2001-01-01

    The environmental hazard of eight soil and solid waste samples originating from a region of Estonia heavily polluted by the oil shale industry was studied. The samples were contaminated mainly with oil products (up to 7231 mg/kg) and polycyclic aromatic hydrocarbons (PAHs; up to 434 mg/kg). Concentrations of heavy metals and water-extractable phenols were low. The toxicities of the aqueous extracts of solid-phase samples were evaluated by using a battery of Toxkit tests (involving crustaceans, protozoa, rotifers and algae). Waste rock and fresh semi-coke were classified as of "high acute toxic hazard", whereas aged semi-coke and most of the polluted soils were classified as of "acute toxic hazard". Analysis of the soil slurries by using the photobacterial solid-phase flash assay showed the presence of particle-bound toxicity in most samples. In the case of four samples out of the eight, chemical and toxicological evaluations both showed that the levels of PAHs, oil products or both exceeded their respective permitted limit values for the living zone (20 mg PAHs/kg and 500 mg oil products/kg); the toxicity tests showed a toxic hazard. However, in the case of three samples, the chemical and toxicological hazard predictions differed markedly: polluted soil from the Erra River bank contained 2334 mg oil/kg but did not show any water-extractable toxicity. In contrast, spent rock and aged semi-coke that contained none of the pollutants in hazardous concentrations showed adverse effects in toxicity tests. The environmental hazard of solid waste deposits from the oil shale industry needs further assessment. PMID:11387023

  4. Natural phenomena hazards site characterization criteria

    SciTech Connect

    Not Available

    1994-03-01

    The criteria and recommendations in this standard shall apply to site characterization for the purpose of mitigating Natural Phenomena Hazards (wind, floods, landslide, earthquake, volcano, etc.) in all DOE facilities covered by DOE Order 5480.28. Criteria for site characterization not related to NPH are not included unless necessary for clarification. General and detailed site characterization requirements are provided in areas of meteorology, hydrology, geology, seismology, and geotechnical studies.

  5. An Independent Evaluation of the FMEA/CIL Hazard Analysis Alternative Study

    NASA Technical Reports Server (NTRS)

    Ray, Paul S.

    1996-01-01

    The present instruments of safety and reliability risk control for a majority of the National Aeronautics and Space Administration (NASA) programs/projects consist of Failure Mode and Effects Analysis (FMEA), Hazard Analysis (HA), Critical Items List (CIL), and Hazard Report (HR). This extensive analytical approach was introduced in the early 1970s and was implemented for the Space Shuttle Program by NHB 5300.4 (1D-2). Since the Challenger accident in 1986, the process has been expanded considerably and resulted in the introduction of similar and/or duplicated activities in safety/reliability risk analysis. A study initiated in 1995 to search for an alternative to the current FMEA/CIL Hazard Analysis methodology generated a proposed method on April 30, 1996. The objective of this Summer Faculty Study was to participate in and conduct an independent evaluation of the proposed alternative to simplify the present safety and reliability risk control procedure.

  6. Concerns about climate change mitigation projects: summary of findings from case studies in Brazil, India, Mexico and South Africa

    Microsoft Academic Search

    Jayant A. Sathaye; Kenneth Andrasko; Willy Makundi; Emilio Lebre La Rovere; N. H. Ravindranath; Anandi Melli; Anita Rangachari; Mireya Imaz; Carlos Gay; Rafael Friedmann; Beth Goldberg; Clive van Horen; Gillian Simmonds; Gretchen Parker

    1999-01-01

    The concept of joint implementation as a way to implement climate change mitigation projects in another country has been controversial ever since its inception. Developing countries have raised numerous issues at the project-specific technical level and broader concerns having to do with equity and burden sharing. This paper summarizes the findings of studies for Brazil, India, Mexico and South Africa,

  7. Gas jet disruption mitigation studies on Alcator C-Mod and DIII-D

    E-print Network

    Hollmann, E. M.

    High-pressure noble gas jet injection is a mitigation technique which potentially satisfies the requirements of fast response time and reliability, without degrading subsequent discharges. Previously reported gas jet ...

  8. Study on the Mitigation Method of Zero-Sequence Harmonic Current in Building Electric System

    Microsoft Academic Search

    Ming Shen; Xiangqian Tong; Biying Ren

    2010-01-01

    The tuned harmonic filter in series with the neutral line is an effective method for zero-sequence harmonic mitigation. However, the series tuned filter not only has a de-tuning problem but also injects fundamental-frequency impedance into the neutral line. In order to mitigate those negative effects, a continuous auto-tuning zero-sequence harmonic filter based on an active reactor was presented in this

  9. Reducing aluminum dust explosion hazards: case study of dust inerting in an aluminum buffing operation.

    PubMed

    Myers, Timothy J

    2008-11-15

    Metal powders or dusts can represent significant dust explosion hazards in industry, due to their relatively low ignition energy and high explosivity. The hazard is well known in industries that produce or use aluminum powders, but is sometimes not recognized by facilities that produce aluminum dust as a byproduct of bulk aluminum processing. As demonstrated by the 2003 dust explosion at aluminum wheel manufacturer Hayes Lemmerz, facilities that process bulk metals are at risk due to dust generated during machining and finishing operations [U.S. Chemical Safety and Hazard Investigation Board, Investigation Report, Aluminum Dust Explosion Hayes Lemmerz International, Inc., Huntington, Indiana, Report No. 2004-01-I-IN, September 2005]. Previous studies have shown that aluminum dust explosions are more difficult to suppress with flame retardants or inerting agents than dust explosions fueled by other materials such as coal [A.G. Dastidar, P.R. Amyotte, J. Going, K. Chatrathi, Flammability limits of dust-minimum inerting concentrations, Proc. Saf. Progr., 18-1 (1999) 56-63]. In this paper, an inerting method is discussed to reduce the dust explosion hazard of residue created in an aluminum buffing operation as the residue is generated. This technique reduces the dust explosion hazard throughout the buffing process and within the dust collector systems making the process inherently safer. Dust explosion testing results are presented for process dusts produced during trials with varying amounts of flame retardant additives. PMID:18423857

  10. A hazard and operability study of anhydrous ammonia application in agriculture.

    PubMed

    Spencer, A B; Gressel, M G

    1993-11-01

    Researchers from the National Institute for Occupational Safety and Health (NIOSH) applied Hazard and Operability (HAZOP) analysis to examine hazards during the use of anhydrous ammonia by farmers. This analysis evaluated the storage, transfer, and application of anhydrous ammonia, identifying credible hazard scenarios, practical solutions, and research needs. Ninety-five findings were developed that are of use to farmers, distributors of ammonia and application equipment, and manufacturers of application equipment. The findings generally involve training, equipment design changes, preventive maintenance, and material compatibilities. The HAZOP team found that additional safety features need to be developed or implemented. The study also pointed out where correct operator procedure and preventive maintenance can prevent inadvertent releases. Other inadvertent releases are caused by incompatible materials, or by using equipment in ways other than intended. Several examples of the findings are given to emphasize the HAZOP technique and the high-risk scenarios. Strategies for dissemination to the agricultural community are presented. PMID:8256691

  11. Flood hazards studies in the Mississippi River basin using remote sensing

    NASA Technical Reports Server (NTRS)

    Rango, A.; Anderson, A. T.

    1974-01-01

    The Spring 1973 Mississippi River flood was investigated using remotely sensed data from ERTS-1. Both manual and automatic analyses of the data indicated that ERTS-1 is extremely useful as a regional tool for flood management. Quantitative estimates of flooded area were made in St. Charles County, Missouri, and in Arkansas. Flood hazard mapping was conducted in three study areas along the Mississippi River using pre-flood ERTS-1 imagery enlarged to 1:250,000 and 1:100,000 scale. Initial results indicate that ERTS-1 digital mapping of flood-prone areas can be performed at 1:62,500, which is comparable to some conventional flood hazard map scales.

  12. Mitigating urban heat island effects in high-density cities based on sky view factor and urban morphological understanding: a study of Hong Kong

    Microsoft Academic Search

    Chao Yuan; Liang Chen

    2011-01-01

    The urban heat island (UHI) effect is one of the most studied topics in many mega cities because of rapid urbanization. The literature review showed a significant correlation between sky view factor (SVF) and UHI. However, this climate knowledge has low impact on urban planning process to mitigate UHI. More studies should be devoted to ways of mitigating UHI based

  13. A PRELIMINARY FEASIBILITY STUDY FOR AN OFFSHORE HAZARDOUS WASTE INCINERATION FACILITY

    EPA Science Inventory

    The report summarizes a feasibility study of using an existing offshore oil platform, being offered to the Government, as a site for incineration of hazardous wastes and related research. The platform, located in the Gulf of Mexico about 100 km south of Mobile, AL, has potential ...

  14. SLUDGE TREATMENT PROJECT ENGINEERED CONTAINER RETRIEVAL AND TRANSFER SYSTEM PRELMINARY DESIGN HAZARD AND OPERABILITY STUDY

    Microsoft Academic Search

    CARRO CA

    2011-01-01

    This Hazard and Operability (HAZOP) study addresses the Sludge Treatment Project (STP) Engineered Container Retrieval and Transfer System (ECRTS) preliminary design for retrieving sludge from underwater engineered containers located in the 105-K West (KW) Basin, transferring the sludge as a sludge-water slurry (hereafter referred to as 'slurry') to a Sludge Transport and Storage Container (STSC) located in a Modified KW

  15. THE SOCIAL IMPLICATIONS OF FLAME RETARDANT CHEMICALS: A CASE STUDY IN RISK AND HAZARD PERCEPTION

    EPA Science Inventory

    This study is expected to fill an important gap in the literature by focusing on how individuals characterize exposure in terms of risk and hazard, and how this understanding can lead to concrete changes in their personal and professional lives. I expect that people differ gre...

  16. Urban Vulnerability Assessment to Seismic Hazard through Spatial Multi-Criteria Analysis. Case Study: the Bucharest Municipality/Romania

    NASA Astrophysics Data System (ADS)

    Armas, Iuliana; Dumitrascu, Silvia; Bostenaru, Maria

    2010-05-01

    In the context of an explosive increase in the value of the damage caused by natural disasters, an alarming challenge in the third millennium is the rapid growth of urban population in vulnerable areas. Cities are, by definition, very fragile socio-ecological systems with a high level of vulnerability when it comes to environmental changes, and they are responsible for important transformations of space, producing dysfunctions reflected in the state of natural variables (Parker and Mitchell, 1995, The OFDA/CRED International Disaster Database). A contributing factor is the demographic dynamic that affects urban areas. The aim of this study is to estimate the overall vulnerability of the urban area of Bucharest in the context of the seismic hazard, by using environmental, socio-economic, and physical measurable variables in the framework of a spatial multi-criteria analysis. For this approach the capital city of Romania was chosen based on its high vulnerability due to explosive urban development and the advanced state of degradation of the buildings (most of the building stock having been built between 1940 and 1977). Combining these attributes with the seismic hazard induced by the Vrancea source, Bucharest is ranked as the 10th capital city worldwide in terms of seismic risk. Over 40 years of experience in the natural risk field show that the only directly accessible way to reduce the natural risk is by reducing the vulnerability of the space (Adger et al., 2001; Turner et al., 2003; UN/ISDR, 2004; Dayton-Johnson, 2004; Kasperson et al., 2005; Birkmann, 2006; etc.). In effect, reducing the vulnerability of urban spaces would imply lower costs produced by natural disasters. Applying the SMCA method reveals a circular pattern, signaling as hot spots the Bucharest historic centre (located on a river terrace and with an aged building stock) and peripheral areas (isolated from the emergency centers and defined by precarious social and economic conditions). In effect, the example of Bucharest demonstrates how the results shape the 'vulnerability to seismic hazard' profile of the city, on the basis of which decision makers could develop proper mitigation strategies. To sum up, the use of an analytical framework such as the standard Spatial Multi-Criteria Analysis (SMCA) - despite all difficulties in creating justifiable weights (Yeh et al., 1999) - results in accurate estimations of the state of the urban system. Although this method has often been mistrusted by decision makers (Janssen, 2001), we consider that the results can represent, precisely because of their level of generalization, a decision support framework for policy makers to critically reflect on possible risk mitigation plans. Further study will improve the analysis by integrating a series of daytime and nighttime scenarios and a better definition of the constructed-space variables.
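
    At its core, an SMCA vulnerability score is a weighted combination of standardized criteria per spatial unit. The sketch below illustrates that aggregation step only; the criteria, cell values and weights are hypothetical, not the ones used for Bucharest.

      import numpy as np

      # Hypothetical standardized criteria (0-1) for three city cells:
      # building age, population density, distance to emergency centres.
      criteria = np.array([
          [0.9, 0.8, 0.7],   # historic centre cell
          [0.3, 0.6, 0.2],   # newer central cell
          [0.5, 0.4, 0.9],   # peripheral cell
      ])
      weights = np.array([0.5, 0.3, 0.2])   # hypothetical expert weights, sum to 1

      vulnerability = criteria @ weights    # weighted linear combination per cell
      print(vulnerability)                  # higher score = more vulnerable cell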

  17. A Case Study in Ethical Decision Making Regarding Remote Mitigation of Botnets

    NASA Astrophysics Data System (ADS)

    Dittrich, David; Leder, Felix; Werner, Tillmann

    It is becoming more common for researchers to find themselves in a position of being able to take over control of a malicious botnet. If this happens, should they use this knowledge to clean up all the infected hosts? How would this affect not only the owners and operators of the zombie computers, but also other researchers, law enforcement agents serving justice, or even the criminals themselves? What dire circumstances would change the calculus about what is or is not appropriate action to take? We review two case studies of long-lived malicious botnets that present serious challenges to researchers and responders and use them to illuminate many ethical issues regarding aggressive mitigation. We make no judgments about the questions raised, instead laying out the pros and cons of possible choices and allowing workshop attendees to consider how and where they would draw lines. By this, we hope to expose where there is clear community consensus as well as where controversy or uncertainty exists.

  18. Seismic Hazard and Risk Assessment for Georgia

    NASA Astrophysics Data System (ADS)

    Tsereteli, Nino; Varazanashvili, Otar; Arabidze, Vakhtang; Gugeshashvili, Tengiz; Mukhadze, Teimuraz; Gvencadze, Aleksandre

    2014-05-01

    Risks caused by natural hazards are closely related to the development process of society. The high level of natural disasters in many countries makes it necessary to work out national programs and strategies whose main goal is to reduce natural disaster risk and the losses it causes. Risk mitigation is the cornerstone of the approach to reduce the nation's vulnerability to disasters from natural hazards, so proper investigation and assessment of natural hazards and of the vulnerability of elements at risk is very important for an effective and proper assessment of risk. This work issues a call for advance planning and action to reduce natural disaster risks, notably seismic risk, through the investigation of vulnerability and seismic hazard for Georgia. First, a detailed inventory map of elements at risk was created; here the elements at risk comprise buildings and population. Second, seismic hazard maps were calculated based on a modern approach to selecting and ranking global and regional ground motion prediction equations for the region. Third, on the basis of empirical data collected for past earthquakes, intensity-based vulnerability studies were completed for Georgian buildings. Finally, probabilistic seismic risk assessments in terms of structural damage and casualties were calculated for the territory of Georgia using the obtained results. This methodology gives predictions of damage and casualties for a given probability of recurrence, based on a probabilistic seismic hazard model, population distribution, inventory, and vulnerability of buildings.
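
    As a hedged illustration of how hazard, vulnerability and exposure combine into a risk estimate, the sketch below folds a discretized hazard curve into mean damage ratios to get an expected annual loss. All rates, damage ratios and exposure values are hypothetical, not the Georgian data.

      import numpy as np

      # Discretized hazard curve: annual rate of exceeding each intensity (MMI),
      # converted to annual occurrence rates per intensity bin (hypothetical values).
      intensities = np.array([6, 7, 8, 9])
      exceed_rate = np.array([2e-2, 8e-3, 2e-3, 4e-4])        # lambda(I >= i) per year
      occur_rate = exceed_rate - np.append(exceed_rate[1:], 0.0)

      # Mean damage ratio for one building class at each intensity (hypothetical).
      mean_damage_ratio = np.array([0.05, 0.15, 0.40, 0.75])

      exposed_value = 1.0e9   # replacement value of the building stock (hypothetical)
      expected_annual_loss = np.sum(occur_rate * mean_damage_ratio * exposed_value)
      print(expected_annual_loss)   # ~2.4 million per year under these assumptions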

  19. Subsurface Fire Hazards Technical Report

    SciTech Connect

    Logan, R.C.

    1999-09-27

    The results from this report are preliminary and cannot be used as input into documents supporting procurement, fabrication, or construction. This technical report identifies fire hazards and proposes their mitigation for the subsurface repository fire protection system. The proposed mitigation establishes the minimum level of fire protection needed to meet NRC regulations and DOE fire protection orders, ensure fire containment and adequate life safety provisions, and minimize property loss. Equipment requiring automatic fire suppression systems is identified. The subsurface fire hazards that are identified can be adequately mitigated.

  20. Success in transmitting hazard science

    NASA Astrophysics Data System (ADS)

    Price, J. G.; Garside, T.

    2010-12-01

    Money motivates mitigation. An example of success in communicating scientific information about hazards, coupled with information about available money, is the follow-up action by local governments to actually mitigate. The Nevada Hazard Mitigation Planning Committee helps local governments prepare competitive proposals for federal funds to reduce risks from natural hazards. Composed of volunteers with expertise in emergency management, building standards, and earthquake, flood, and wildfire hazards, the committee advises the Nevada Division of Emergency Management on (1) the content of the State’s hazard mitigation plan and (2) projects that have been proposed by local governments and state agencies for funding from various post- and pre-disaster hazard mitigation programs of the Federal Emergency Management Agency. Local governments must have FEMA-approved hazard mitigation plans in place before they can receive this funding. The committee has been meeting quarterly with elected and appointed county officials, at their offices, to encourage them to update their mitigation plans and apply for this funding. We have settled on a format that includes the county’s giving the committee an overview of its infrastructure, hazards, and preparedness. The committee explains the process for applying for mitigation grants and presents the latest information that we have about earthquake hazards, including locations of nearby active faults, historical seismicity, geodetic strain, loss-estimation modeling, scenarios, and documents about what to do before, during, and after an earthquake. Much of the county-specific information is available on the web. The presentations have been well received, in part because the committee makes the effort to go to their communities, and in part because the committee is helping them attract federal funds for local mitigation of not only earthquake hazards but also floods (including canal breaches) and wildfires, the other major concerns in Nevada. Local citizens appreciate the efforts of the state officials to present the information in a public forum. The Committee’s earthquake presentations to the counties are supplemented by regular updates in the two most populous counties during quarterly meetings of the Nevada Earthquake Safety Council, generally alternating between Las Vegas and Reno. We have only 17 counties in Nevada, so we are making good progress at reaching each within a few years. The Committee is also learning from the county officials about their frustrations in dealing with the state and federal bureaucracies. Success is documented by the mitigation projects that FEMA has funded.

  1. A feasibility study on the influence of the geomorphological feature in identifying the potential landslide hazard

    NASA Astrophysics Data System (ADS)

    Baek, M. H.; Kim, T. H.

    2014-11-01

    In this study we focused on identifying geomorphological features that control the location of landslides. The representation of these features is based on a high-resolution DEM (Digital Elevation Model) derived from airborne laser altimetry (LiDAR) and evaluated by statistical analysis of axial orientation data. The main principle of this analysis is generating eigenvalues from the axial orientation data and comparing them. The planarity, a ratio of eigenvalues, indicates the degree of roughness of the ground surface. Results are compared to a recent landslide case in Korea in order to evaluate the feasibility of the proposed methodology for identifying potential landslide hazard. The preliminary landslide assessment based on the planarity analysis discriminates well between stable and unstable domains in the study area, especially in the landslide initiation zones. Results also show that the approach is beneficial for preliminary landslide hazard and inventory mapping where no historical landslide records exist. Combined with other physical procedures such as geotechnical monitoring, landslide hazard assessment using geomorphological features promises a better understanding of landslides and their mechanisms, and provides an enhanced methodology to evaluate their hazards and plan appropriate actions.
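
    One common way to carry out this kind of eigenvalue analysis is to build the orientation (scatter) matrix of unit surface-normal vectors in a DEM window and compare its eigenvalues; the sketch below uses a log-ratio of the two largest eigenvalues as a planarity-type roughness measure. The exact ratio and window definition used by the authors may differ, and the data here are synthetic.

      import numpy as np

      def orientation_eigenvalues(normals):
          """Eigenvalues of the orientation matrix T = (1/N) * sum(n n^T)
          for a set of surface-normal vectors (N x 3 array)."""
          n = normals / np.linalg.norm(normals, axis=1, keepdims=True)
          T = (n.T @ n) / len(n)
          return np.sort(np.linalg.eigvalsh(T))[::-1]   # lambda1 >= lambda2 >= lambda3

      def planarity_ratio(normals):
          """One common roughness measure: ln(lambda1/lambda2). Large values
          indicate a smooth, planar surface; small values a rough one."""
          l1, l2, _ = orientation_eigenvalues(normals)
          return np.log(l1 / l2)

      # Nearly planar patch (normals cluster around +z) vs. rough patch (random normals).
      rng = np.random.default_rng(2)
      smooth = np.column_stack([rng.normal(0, 0.05, 500),
                                rng.normal(0, 0.05, 500),
                                np.ones(500)])
      rough = rng.normal(size=(500, 3))
      print(planarity_ratio(smooth), planarity_ratio(rough))   # smooth >> rough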

  2. Voltage Sag Mitigation Strategies for an Indian Power Systems: A Case Study

    NASA Astrophysics Data System (ADS)

    Goswami, A. K.; Gupta, C. P.; Singh, G. K.

    2014-08-01

    Under the modern deregulated environment, both utilities and customers are concerned with power quality improvement, but with different objectives and interests. The utility reconfigures its power network and installs mitigation devices, if needed, to improve power quality. The paper presents a strategy for selecting cost-effective solutions to mitigate voltage sags, the most frequent power quality disturbance. In this paper, mitigation devices are introduced into the optimal network topology at suitable places for better effectiveness and further improvement in power quality. The optimal placement is considered from the utility perspective for overall benefit. Finally, performance is evaluated on the basis of the reduction in the total number of voltage sags, the total number of process trips and the total financial losses due to voltage sags.

  3. Adsorption studies of hazardous malachite green onto treated ginger waste.

    PubMed

    Ahmad, Rais; Kumar, Rajeev

    2010-01-01

    Adsorption of malachite green (MG) from aqueous solution onto treated ginger waste (TGW) was investigated by batch and column methods. The effect of various factors such as initial dye concentration, contact time, pH and temperature was studied. The maximum adsorption of MG was observed at pH 9. Langmuir and Freundlich isotherms were employed to describe the MG adsorption equilibrium. The monolayer adsorption capacities were found to be 84.03, 163.9 and 188.6 mg/g at 30, 40 and 50 °C, respectively. The values of the thermodynamic parameters ΔG°, ΔH° and ΔS° indicated that adsorption was spontaneous and endothermic in nature. The pseudo-second-order kinetic model fitted the experimental results well. Reichenberg's equation was employed to determine the mechanism of adsorption; the results indicated that film diffusion was a major mode of adsorption. The breakthrough capacities were also investigated. PMID:20060638
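
    As an illustration of the isotherm-fitting step, the sketch below fits the Langmuir equation qe = qmax*KL*Ce/(1 + KL*Ce) to synthetic equilibrium data chosen to be roughly consistent with a monolayer capacity near the 84 mg/g reported at 30 °C; the data points and fitted constants are illustrative, not the study's measurements.

      import numpy as np
      from scipy.optimize import curve_fit

      def langmuir(ce, q_max, k_l):
          """Langmuir isotherm: qe = q_max * K_L * Ce / (1 + K_L * Ce)."""
          return q_max * k_l * ce / (1.0 + k_l * ce)

      # Synthetic equilibrium data: Ce (mg/L) and qe (mg/g), illustrative only.
      ce = np.array([5, 10, 20, 40, 80, 160], dtype=float)
      qe = np.array([28, 44, 60, 72, 79, 82], dtype=float)

      (q_max_fit, k_l_fit), _ = curve_fit(langmuir, ce, qe, p0=(80.0, 0.1))
      print(q_max_fit, k_l_fit)  # fitted monolayer capacity (mg/g) and Langmuir constant (L/mg)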

  4. Mask roughness induced LER control and mitigation: aberrations sensitivity study and alternate illumination scheme

    SciTech Connect

    McClinton, Brittany M.; Naulleau, Patrick P.

    2011-03-11

    Here we conduct a mask-roughness-induced line-edge-roughness (LER) aberrations sensitivity study both as a random distribution amongst the first 16 Fringe Zernikes (for overall aberration levels of 0.25, 0.50, and 0.75 nm rms) as well as an individual aberrations sensitivity matrix over the first 37 Fringe Zernikes. Full 2D aerial image modeling for an imaging system with NA = 0.32 was done for both the 22-nm and 16-nm half-pitch nodes on a rough mask with a replicated surface roughness (RSR) of 100 pm and a correlation length of 32 nm at the nominal extreme-ultraviolet lithography (EUVL) wavelength of 13.5 nm. As the ideal RSR value for commercialization of EUVL is 50 pm and under, and furthermore as has been shown elsewhere, a correlation length of 32 nm of roughness on the mask sits on the peak LER value for an NA = 0.32 imaging optic, these mask roughness values and consequently the aberration sensitivity study presented here, represent a worst-case scenario. The illumination conditions were chosen based on the possible candidates for the 22-nm and 16-nm half-pitch nodes, respectively. In the 22-nm case, a disk illumination setting of σ = 0.50 was used, and for the 16-nm case, crosspole illumination with σ = 0.10 at an optimum offset of dx = 0 and dy = 0.67 in sigma space. In examining how to mitigate mask roughness induced LER, we considered an alternate illumination scheme whereby a traditional dipole's angular spectrum is extended in the direction parallel to the line-and-space mask absorber pattern to represent a 'strip'. While this illumination surprisingly provides minimal improvement to the LER as compared to several alternate illumination schemes, the overall imaging quality in terms of image-log-slope (ILS) and contrast is improved.

  5. A web-based tool for ranking landslide mitigation measures

    NASA Astrophysics Data System (ADS)

    Lacasse, S.; Vaciago, G.; Choi, Y. J.; Kalsnes, B.

    2012-04-01

    As part of the research done in the European project SafeLand "Living with landslide risk in Europe: Assessment, effects of global change, and risk management strategies", a compendium of structural and non-structural mitigation measures for different landslide types in Europe was prepared, and the measures were assembled into a web-based "toolbox". Emphasis was placed on providing a rational and flexible framework applicable to existing and future mitigation measures. The purpose of the web-based toolbox is to assist decision-making and to guide the user in the choice of the most appropriate mitigation measures. The mitigation measures were classified into three categories, describing whether the mitigation measures addressed the landslide hazard, the vulnerability or the elements at risk themselves. The measures considered include structural measures reducing the hazard and non-structural mitigation measures reducing either the hazard or the consequences (i.e., the vulnerability and exposure of elements at risk). The structural measures include surface protection and control of surface erosion; measures modifying the slope geometry and/or mass distribution; measures modifying the surface water regime (surface drainage); measures modifying the groundwater regime (deep drainage); measures modifying the mechanical characteristics of the unstable mass; transfer of loads to more competent strata; retaining structures (to modify slope geometry and/or to transfer stress to a competent layer); deviating the path of landslide debris; dissipating the energy of debris flows; and arresting and containing landslide debris or rock fall. The non-structural mitigation measures, which reduce either the hazard or the consequences, include: early warning systems; restricting or discouraging construction activities; increasing the resistance or coping capacity of elements at risk; relocation of elements at risk; and sharing of risk through insurance. The measures are described in the toolbox with fact sheets providing a brief description, guidance on design, schematic details, practical examples and references for each mitigation measure. Each of the measures was given a score on its ability and applicability for different types of landslides and boundary conditions, and a decision support matrix was established. The web-based toolbox organizes the information in the compendium and provides an algorithm to rank the measures on the basis of the decision support matrix and of the risk level estimated at the site. The toolbox includes a description of the case under study and offers a simplified option for estimating the hazard and risk levels of the slide at hand. The user selects the mitigation measures to be included in the assessment; the toolbox then ranks them, with built-in assessment factors and weights and/or with user-defined ranking values and criteria. The toolbox includes data management features, e.g. saving data half-way through an analysis, returning to an earlier case, looking up prepared examples or looking up information on mitigation measures. The toolbox also generates a report and has user-forum and help features. The presentation will give an overview of the mitigation measures considered and examples of the use of the toolbox, and will take the attendees through the application of the toolbox.
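
    A minimal sketch of the kind of weighted decision-matrix ranking such a toolbox performs is shown below; the measure names, scores and weights are hypothetical placeholders, not the SafeLand scoring.

      import numpy as np

      # Hypothetical applicability scores (0-3) of three mitigation measures against
      # three criteria (suitability for the landslide type, cost, reliability).
      measures = ["Deep drainage", "Retaining structure", "Early warning system"]
      scores = np.array([
          [3, 2, 2],
          [2, 1, 3],
          [2, 3, 2],
      ], dtype=float)
      weights = np.array([0.5, 0.2, 0.3])   # built-in or user-defined weights, sum to 1

      ranking_score = scores @ weights      # weighted score per measure
      for i in np.argsort(ranking_score)[::-1]:
          print(f"{measures[i]}: {ranking_score[i]:.2f}")   # best-ranked measure first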

  6. Comparative risk judgements for oral health hazards among Norwegian adults: a cross sectional study

    PubMed Central

    Åstrøm, Anne Nordrehaug

    2002-01-01

    Background This study identified optimistic biases in health and oral health hazards, and explored whether comparative risk judgements for oral health hazards vary systematically with socio-economic characteristics and self-reported risk experience. Methods A simple random sample of 1,190 residents born in 1972 was drawn from the population resident in three counties of Norway. A total of 735 adults (51% women) completed postal questionnaires at home. Results Mean ratings of comparative risk judgements differed significantly (p < 0.001) from the mid point of the scales. T-values ranged from -13.1 and -12.1 for the perceived risk of being divorced and losing all teeth to -8.2 and -7.8 (p < 0.001) for having gum disease and tooth decay. Multivariate analyses using General Linear Models, GLM, revealed gender differences in comparative risk judgements for gum disease, whereas social position varied systematically with risk judgements for tooth decay, gum disease and air pollution. The odds ratios for being comparatively optimistic with respect to having gum disease were 2.9, 1.9, 1.8 and 1.5 if being satisfied with dentition, having a favourable view of health situation, and having high and low involvement with health enhancing and health detrimental behaviour, respectively. Conclusion Optimism in comparative judgements for health and oral health hazards was evident in young Norwegian adults. When judging their comparative susceptibility for oral health hazards, they consider personal health situation and risk behaviour experience. PMID:12186656

  7. Google Earth Views of Probabilistic Tsunami Hazard Analysis Pilot Study, Seaside, Oregon

    NASA Astrophysics Data System (ADS)

    Wong, F. L.; Venturato, A. J.; Geist, E. L.

    2006-12-01

    Virtual globes such as Google Earth provide immediate geographic context for research data for coastal hazard planning. We present Google Earth views of data from a Tsunami Pilot Study conducted within and near Seaside and Gearhart, Oregon, as part of FEMA's Flood Insurance Rate Map Modernization Program (Tsunami Pilot Study Working Group, 2006). Two goals of the pilot study were to develop probabilistic 100- year and 500-year tsunami inundation maps using Probabilistic Tsunami Hazard Analysis (PTHA) and to provide recommendations for improved tsunami hazard assessment guidelines. The Seaside area was chosen because it is typical of many coastal communities along the Cascadia subduction zone that extends from Cape Mendocino, California, to the Strait of Juan de Fuca, Washington. State and local stakeholders also expressed considerable interest in mapping the tsunami threat to this area. The study was an interagency effort by the National Oceanic and Atmospheric Administration, U.S. Geological Survey, and FEMA, in collaboration with the University of Southern California, Middle East Technical University, Portland State University, Horning Geoscience, Northwest Hydraulics Consultants, and the Oregon Department of Geological and Mineral Industries. The pilot study report will be augmented by a separate geographic information systems (GIS) data publication that provides model data and results. In addition to traditional GIS data formats, Google Earth kmz files are available to provide rapid visualization of the data against the rich base map provided by the interface. The data include verbal and geologic observations of historic tsunami events, newly constructed DEMs, historic shorelines, earthquake sources, models of tsunami wave heights, and maps of the estimated 100- and 500-year probabilistic floods. Tsunami Pilot Study Working Group, 2006, Seaside, Oregon Tsunami Pilot Study - Modernization of FEMA Flood Hazard Maps: U.S. Geological Survey Open-file Report 2006-1234, http://pubs.usgs.gov/of/2006/1234/.

  8. 14 CFR 437.55 - Hazard analysis.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... § 437.55 Hazard analysis. (a) A permittee...hazards and assess the risk to public health and...results, or (iii) Analysis. (b) A permittee must carry out the risk elimination and mitigation...derived from its hazard analysis. (c) A...

  9. 14 CFR 437.55 - Hazard analysis.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... § 437.55 Hazard analysis. (a) A permittee...hazards and assess the risk to public health and...results, or (iii) Analysis. (b) A permittee must carry out the risk elimination and mitigation...derived from its hazard analysis. (c) A...

  10. 14 CFR 437.55 - Hazard analysis.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... § 437.55 Hazard analysis. (a) A permittee...hazards and assess the risk to public health and...results, or (iii) Analysis. (b) A permittee must carry out the risk elimination and mitigation...derived from its hazard analysis. (c) A...

  11. 14 CFR 437.55 - Hazard analysis.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... § 437.55 Hazard analysis. (a) A permittee...hazards and assess the risk to public health and...results, or (iii) Analysis. (b) A permittee must carry out the risk elimination and mitigation...derived from its hazard analysis. (c) A...

  12. 14 CFR 437.55 - Hazard analysis.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... § 437.55 Hazard analysis. (a) A permittee...hazards and assess the risk to public health and...results, or (iii) Analysis. (b) A permittee must carry out the risk elimination and mitigation...derived from its hazard analysis. (c) A...

  13. Development based climate change adaptation and mitigation—conceptual issues and lessons learned in studies in developing countries

    Microsoft Academic Search

    Kirsten Halsnæs; Jan Verhagen

    2007-01-01

    This paper discusses the conceptual basis for linking development policies and climate change adaptation and mitigation and suggests an analytical approach that can be applied to studies in developing countries. The approach is centred on a broad set of policy evaluation criteria that merge traditional economic and sectoral goals and broader social issues related to health and income distribution. The

  14. Development based climate change adaptation and mitigation—conceptual issues and lessons learned in studies in developing countries

    Microsoft Academic Search

    Kirsten Halsnæs; Jan Verhagen

    This paper discusses the conceptual basis for linking development policies and climate change adaptation and mitigation and suggests an analytical approach that can be applied to studies in developing countries. The approach is centred on a broad set of policy evaluation criteria that merge traditional economic and sectoral goals and broader social issues related to health and income distribution. The approach is inspired by institutional economics and development paradigms

  15. Mitigation of indirect environmental effects of GM crops.

    PubMed

    Pidgeon, J D; May, M J; Perry, J N; Poppy, G M

    2007-06-22

    Currently, the UK has no procedure for the approval of novel agricultural practices that is based on environmental risk management principles. Here, we make a first application of the 'bow-tie' risk management approach in agriculture, for assessment of land use changes, in a case study of the introduction of genetically modified herbicide tolerant (GMHT) sugar beet. There are agronomic and economic benefits, but indirect environmental harm from increased weed control is a hazard. The Farm Scale Evaluation (FSE) trials demonstrated reduced broad-leaved weed biomass and seed production at the field scale. The simplest mitigation measure is to leave a proportion of rows unsprayed in each GMHT crop field. Our calculations, based on FSE data, show that a maximum of 2% of field area left unsprayed is required to mitigate weed seed production and 4% to mitigate weed biomass production. Tilled margin effects could simply be mitigated by increasing the margin width from 0.5 to 1.5 m. Such changes are cheap and simple to implement in farming practices. This case study demonstrates the usefulness of the bow-tie risk management approach and the transparency with which hazards can be addressed. If adopted generally, it would help to enable agriculture to adopt new practices with due environmental precaution. PMID:17439853

  16. Seaside, Oregon, Tsunami Pilot Study-Modernization of FEMA Flood Hazard Maps: GIS Data

    USGS Publications Warehouse

    Wong, Florence L.; Venturato, Angie J.; Geist, Eric L.

    2006-01-01

    Introduction: The Federal Emergency Management Agency (FEMA) Flood Insurance Rate Map (FIRM) guidelines do not currently exist for conducting and incorporating tsunami hazard assessments that reflect the substantial advances in tsunami research achieved in the last two decades; this conclusion is the result of two FEMA-sponsored workshops and the associated Tsunami Focused Study (Chowdhury and others, 2005). Therefore, as part of FEMA's Map Modernization Program, a Tsunami Pilot Study was carried out in the Seaside/Gearhart, Oregon, area to develop an improved Probabilistic Tsunami Hazard Analysis (PTHA) methodology and to provide recommendations for improved tsunami hazard assessment guidelines (Tsunami Pilot Study Working Group, 2006). The Seaside area was chosen because it is typical of many coastal communities in the section of the Pacific Coast from Cape Mendocino to the Strait of Juan de Fuca, and because State agencies and local stakeholders expressed considerable interest in mapping the tsunami threat to this area. The study was an interagency effort by FEMA, U.S. Geological Survey, and the National Oceanic and Atmospheric Administration (NOAA), in collaboration with the University of Southern California, Middle East Technical University, Portland State University, Horning Geoscience, Northwest Hydraulics Consultants, and the Oregon Department of Geological and Mineral Industries. We present the spatial (geographic information system, GIS) data from the pilot study in standard GIS formats and provide files for visualization in Google Earth, a global map viewer.

  17. Proportional hazards regression in epidemiologic follow-up studies: an intuitive consideration of primary time scale.

    PubMed

    Cologne, John; Hsu, Wan-Ling; Abbott, Robert D; Ohishi, Waka; Grant, Eric J; Fujiwara, Saeko; Cullings, Harry M

    2012-07-01

    In epidemiologic cohort studies of chronic diseases, such as heart disease or cancer, confounding by age can bias the estimated effects of risk factors under study. With Cox proportional-hazards regression modeling in such studies, it would generally be recommended that chronological age be handled nonparametrically as the primary time scale. However, studies involving baseline measurements of biomarkers or other factors frequently use follow-up time since measurement as the primary time scale, with no explicit justification. The effects of age are adjusted for by modeling age at entry as a parametric covariate. Parametric adjustment raises the question of model adequacy, in that it assumes a known functional relationship between age and disease, whereas using age as the primary time scale does not. We illustrate this graphically and show intuitively why the parametric approach to age adjustment using follow-up time as the primary time scale provides a poor approximation to age-specific incidence. Adequate parametric adjustment for age could require extensive modeling, which is wasteful, given the simplicity of using age as the primary time scale. Furthermore, the underlying hazard with follow-up time based on arbitrary timing of study initiation may have no inherent meaning in terms of risk. Given the potential for biased risk estimates, age should be considered as the preferred time scale for proportional-hazards regression with epidemiologic follow-up data when confounding by age is a concern. PMID:22517300
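
    To make the time-scale distinction above concrete, the following sketch (not from the paper) builds the risk set for a single event under the two parameterizations: follow-up time since entry with age handled as a covariate, versus attained age as the primary time scale with delayed entry (left truncation). The subject records are hypothetical.

        # Illustrative sketch of how Cox risk sets differ between two time scales.
        # Subjects are hypothetical: (id, age_at_entry, follow_up_years, event_flag).
        subjects = [
            ("A", 40, 12, 1),
            ("B", 55,  8, 0),
            ("C", 62,  5, 1),
            ("D", 48, 15, 0),
        ]

        def risk_set_follow_up(subjects, event_time):
            """Time scale = time since study entry; everyone enters at t = 0.
            At an event at follow-up time t, the risk set is everyone still under
            observation at t, regardless of age (age enters only as a covariate)."""
            return [sid for sid, _, fu, _ in subjects if fu >= event_time]

        def risk_set_attained_age(subjects, event_age):
            """Time scale = attained age with delayed entry (left truncation).
            At an event at age a, the risk set is everyone who entered before a
            and was still under observation at a, i.e. an age-matched comparison."""
            return [sid for sid, age0, fu, _ in subjects
                    if age0 <= event_age <= age0 + fu]

        # Subject C has an event 5 years after entry, i.e. at attained age 67.
        print(risk_set_follow_up(subjects, event_time=5))     # ['A', 'B', 'C', 'D']
        print(risk_set_attained_age(subjects, event_age=67))  # ['C'], only subjects observed at age 67

    Under the age scale, subjects are compared only with others of the same attained age, so no parametric form for the age effect is required.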

  18. Thermal study of payload module for the next-generation infrared space telescope SPICA in risk mitigation phase

    NASA Astrophysics Data System (ADS)

    Shinozaki, Keisuke; Sato, Yoichi; Sawada, Kenichiro; Ando, Makiko; Sugita, Hiroyuki; Yamawaki, Toshihiro; Mizutani, Tadahiro; Komatsu, Keiji; Nakagawa, Takao; Murakami, Hiroshi; Matsuhara, Hideo; Takada, Makoto; Takai, Shigeki; Okabayashi, Akinobu; Tsunematsu, Shoji; Kanao, Kenichi; Narasaki, Katsuhiro

    2014-11-01

    SPace Infrared telescope for Cosmology and Astrophysics (SPICA) is a pre-project of JAXA in collaboration with ESA to be launched around 2020. SPICA is transferred into a halo orbit around the second Lagrangian point (L2) of the Sun-Earth system, which enables the use of effective radiant cooling in combination with a mechanical cooling system in order to cool a large, 3-m IR telescope below 6 K. At present, a conceptual study of SPICA is underway to assess and mitigate the mission's risks; the thermal study for risk mitigation sets goals of a 25% margin on the cooling power of the 4 K/1 K temperature regions and a 25% margin on the heat load from the Focal Plane Instruments (FPIs) at the intermediate temperature region, as well as enhancing the reliability of the mechanical cooler system and the feasibility of ground tests. Thermal property measurements of FRP materials are also important. This paper introduces details of the thermal design study for risk mitigation, including development of the truss separation mechanism, the cryogenic radiator, the mechanical cooler system, and thermal property measurements of materials.

  19. Preparation of a national Copernicus service for detection and monitoring of land subsidence and mass movements in the context of remote sensing assisted hazard mitigation

    NASA Astrophysics Data System (ADS)

    Kalia, Andre C.; Frei, Michaela; Lege, Thomas

    2014-10-01

    Land subsidence can cause severe damage to infrastructure and buildings, and mass movements can even lead to loss of life. Detection and monitoring of these processes by terrestrial measurement techniques remain a challenge due to limitations in spatial coverage and temporal resolution. Since the launch of ERS-1 in 1991, numerous scientific studies have demonstrated the capability of differential SAR interferometry (DInSAR) for the detection of surface deformation, proving the usability of this method. In order to assist the utilization of DInSAR for governmental tasks, a national service concept within the EU-ESA Program "Copernicus" is in preparation. This is done by i) analyzing the user requirements, ii) developing a concept and iii) performing case studies as a "proof of concept". Due to the iterative nature of this procedure, governmental users as well as DInSAR experts are involved. This paper introduces the concept and shows the available SAR data archive from ERS-1/2, TerraSAR-X and TanDEM-X as well as the proposed case study. The case study focuses on the application of advanced DInSAR methods for the detection of subsidence in a region with active gas extraction. The area of interest is located in the state of Lower Saxony in the northwest of Germany. The DInSAR analysis will be based on ERS-1/2 and TerraSAR-X/TanDEM-X SAR data. The usability of the DInSAR products will be discussed with the responsible mining authority (LBEG) in order to adapt the products to the user needs and to evaluate the proposed concept.

  20. New Seismic Hazard study in Spain Aimed at the revision of the Spanish Building Code

    NASA Astrophysics Data System (ADS)

    Rivas-Medina, A.; Benito, B.; Cabañas, L.; Martínez-Solares, J. M.; Ruíz, S.; Gaspar-Escribano, J. M.; Carreño, E.; Crespo, M.; García-Mayordomo, J.

    2013-05-01

    In this paper we present a global overview of the recent study carried out in Spain for the new hazard map, whose final goal is the revision of the Spanish Building Code (NCSE-02). The study was carried out by a working group joining experts from the Instituto Geografico Nacional (IGN) and the Technical University of Madrid (UPM), with the different phases of the work supervised by an expert committee of national experts from public institutions involved in the subject of seismic hazard. The PSHA method (Probabilistic Seismic Hazard Assessment) has been followed, quantifying the epistemic uncertainties through a logic tree and the aleatory ones, linked to the variability of parameters, by means of probability density functions and Monte Carlo simulations. In a first phase, the inputs were prepared, which essentially are: 1) a project catalogue update and homogenization to Mw; 2) proposal of zoning models and source characterization; 3) calibration of Ground Motion Prediction Equations (GMPEs) with actual data and development of a local model with data collected in Spain for Mw < 5.5. In a second phase, a sensitivity analysis of the different input options on hazard results was carried out in order to have criteria for defining the branches of the logic tree and their weights. Finally, the hazard estimation was done with the logic tree shown in figure 1, including nodes for quantifying uncertainties corresponding to: 1) the method for estimation of hazard (zoning and zoneless); 2) zoning models; 3) GMPE combinations used; and 4) the regression method for estimation of source parameters. In addition, the aleatory uncertainties corresponding to the magnitude of the events, recurrence parameters and maximum magnitude for each zone have also been considered through probability density functions and Monte Carlo simulations. The main conclusions of the study are presented here, together with the results obtained in terms of PGA and other spectral accelerations SA(T) for return periods of 475, 975 and 2475 years. Maps of the coefficient of variation (COV) are also presented to give an idea of the zones where the dispersion among results is highest and the zones where the results are robust.
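
    As a schematic illustration of the Monte Carlo treatment of aleatory variability described in the abstract (not the Spanish source model, zonation, or GMPEs), the sketch below simulates a long catalogue from a single hypothetical source zone and reads off the PGA with a roughly 475-year return period; every parameter value is a placeholder.

        # Schematic single-zone Monte Carlo PSHA sketch; all parameters are hypothetical.
        import numpy as np

        rng = np.random.default_rng(0)

        N_YEARS   = 200_000   # simulated catalogue length, years
        RATE      = 0.05      # mean events/year above Mmin in the zone (hypothetical)
        MMIN, MMAX, B = 5.0, 7.0, 1.0   # truncated Gutenberg-Richter parameters
        SIGMA_LN  = 0.6       # aleatory ground-motion variability (natural log units)

        def sample_magnitudes(n):
            """Sample from a truncated Gutenberg-Richter distribution by inversion."""
            u = rng.random(n)
            beta = B * np.log(10.0)
            return MMIN - np.log(1.0 - u * (1.0 - np.exp(-beta * (MMAX - MMIN)))) / beta

        def gmpe_pga(m, r_km):
            """Toy ground-motion relation (placeholder, not a published GMPE): PGA in g."""
            ln_pga = -4.0 + 1.0 * m - 1.2 * np.log(r_km + 10.0)
            return np.exp(ln_pga + SIGMA_LN * rng.standard_normal(len(m)))

        n_events = rng.poisson(RATE * N_YEARS)
        mags  = sample_magnitudes(n_events)
        dists = rng.uniform(5.0, 80.0, n_events)      # site-to-source distance, km (hypothetical)
        pga   = gmpe_pga(mags, dists)

        # Annual exceedance rate curve and the 475-year (10% in 50 yr) PGA.
        grid = np.linspace(0.01, 1.0, 200)
        annual_rate = np.array([(pga > a).sum() / N_YEARS for a in grid])
        pga_475 = grid[np.argmin(np.abs(annual_rate - 1.0 / 475.0))]
        print(f"PGA with ~475-year return period: {pga_475:.3f} g")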

  1. Examination of Icing Induced Loss of Control and Its Mitigations

    NASA Technical Reports Server (NTRS)

    Reehorst, Andrew L.; Addy, Harold E., Jr.; Colantonio, Renato O.

    2010-01-01

    Factors external to the aircraft are often a significant causal factor in loss of control (LOC) accidents. In today's aviation world, very few accidents stem from a single cause; they typically have a number of causal factors that culminate in a LOC accident. Very often the "trigger" that initiates an accident sequence is an external environmental factor. In a recent NASA statistical analysis of LOC accidents, aircraft icing was shown to be the most common external environmental LOC causal factor for scheduled operations. When investigating LOC accidents or incidents, aircraft icing causal factors can be categorized into groups of 1) in-flight encounter with super-cooled liquid water clouds, 2) take-off with ice contamination, or 3) in-flight encounter with high concentrations of ice crystals. As with other flight hazards, icing-induced LOC accidents can be prevented through avoidance, detection, and recovery mitigations. For icing hazards, avoidance can take the form of avoiding flight into icing conditions or avoiding the hazard of icing by making the aircraft tolerant to icing conditions. Icing detection mitigations can take the form of detecting icing conditions or detecting early performance degradation caused by icing. Recovery from icing-induced LOC requires flight crew or automated systems capable of accounting for reduced aircraft performance and degraded control authority during the recovery maneuvers. In this report we review the icing-induced LOC accident mitigations defined in a recent LOC study and for each mitigation describe a research topic required to enable or strengthen the mitigation. Many of these research topics are already included in ongoing or planned NASA icing research activities or are being addressed by members of the icing research community. These research activities are described and the status of the ongoing or planned research to address the technology needs is discussed.

  2. Characteristics and predictors of home injury hazards among toddlers in Wenzhou, China: a community-based cross-sectional study

    PubMed Central

    2014-01-01

    Background Home hazards are associated with toddlers receiving unintentional home injuries (UHI). These result in not only physical and psychological difficulties for children, but also economic losses and additional stress for their families. Few researchers pay attention to predictors of home hazards among toddlers in a systematic way. The purpose of this study is firstly to describe the characteristics of homes with hazards and secondly to explore the predicted relationship of child, parent and family factors to home hazards among toddlers aged 24-47 months in Wenzhou, China. Methods Random cluster sampling was employed to select 366 parents of children aged 24-47 months from 13 kindergartens between March and April of 2012. Four instruments assessed home hazards, demographics, parents' awareness of UHI, and family functioning. Results Descriptive statistics showed that the mean number of home hazards was 12.29 (SD = 6.39). The nine kinds of home hazards identified in over 50% of households included plastic bags (74.3%), coin buttons (69.1%), and toys with small components (66.7%), among others. Multivariate linear regression revealed that the predictors of home hazards were the child's age, the child's residential status and family functioning (b = .19, 2.02, and -.07, respectively). Home hazards were significantly attributed to older toddlers, migrant toddlers and poorer family functioning. This result suggests that health care providers should focus on vulnerable families and help parents assess home hazards. Further study is needed to identify interventions for managing home hazards for toddlers in China. PMID:24953678

  3. Seaside, Oregon Tsunami Pilot Study - modernization of FEMA flood hazard maps

    USGS Publications Warehouse

    Tsunami Pilot Study Working Group

    2006-01-01

    FEMA Flood Insurance Rate Map (FIRM) guidelines do not currently exist for conducting and incorporating tsunami hazard assessments that reflect the substantial advances in tsunami research achieved in the last two decades; this conclusion is the result of two FEMA-sponsored workshops and the associated Tsunami Focused Study. Therefore, as part of FEMA's Map Modernization Program, a Tsunami Pilot Study was carried out in the Seaside/Gearhart, Oregon, area to develop an improved Probabilistic Tsunami Hazard Assessment (PTHA) methodology and to provide recommendations for improved tsunami hazard assessment guidelines. The Seaside area was chosen because it is typical of many coastal communities in the section of the Pacific Coast from Cape Mendocino to the Strait of Juan de Fuca, and because State agencies and local stakeholders expressed considerable interest in mapping the tsunami threat to this area. The study was an interagency effort by FEMA, U.S. Geological Survey and the National Oceanic and Atmospheric Administration, in collaboration with the University of Southern California, Middle East Technical University, Portland State University, Horning Geosciences, Northwest Hydraulics Consultants, and the Oregon Department of Geological and Mineral Industries. Draft copies and a briefing on the contents, results and recommendations of this document were provided to FEMA officials before final publication.

  4. A Simulation Study of the Proactive Server Roaming for Mitigating Denial of Service Attacks

    Microsoft Academic Search

    Chatree Sangpachatanaruk; Sherif M. Khattab; Taieb Znati; Rami G. Melhem; Daniel Mossé

    2003-01-01

    The main goal of the NETSEC project is to design and implement a framework for mitigating the effects of the node-based and link-based DoS attacks. Our strategy employs three lines of defense. The first line of defense is to restrict the access to the defended services using offline service subscription, encryption and other traditional security techniques. The second line of defense is server roaming, by

  5. FDA-iRISK--a comparative risk assessment system for evaluating and ranking food-hazard pairs: case studies on microbial hazards.

    PubMed

    Chen, Yuhuan; Dennis, Sherri B; Hartnett, Emma; Paoli, Greg; Pouillot, Régis; Ruthman, Todd; Wilson, Margaret

    2013-03-01

    Stakeholders in the system of food safety, in particular federal agencies, need evidence-based, transparent, and rigorous approaches to estimate and compare the risk of foodborne illness from microbial and chemical hazards and the public health impact of interventions. FDA-iRISK (referred to here as iRISK), a Web-based quantitative risk assessment system, was developed to meet this need. The modeling tool enables users to assess, compare, and rank the risks posed by multiple food-hazard pairs at all stages of the food supply system, from primary production, through manufacturing and processing, to retail distribution and, ultimately, to the consumer. Using standard data entry templates, built-in mathematical functions, and Monte Carlo simulation techniques, iRISK integrates data and assumptions from seven components: the food, the hazard, the population of consumers, process models describing the introduction and fate of the hazard up to the point of consumption, consumption patterns, dose-response curves, and health effects. Beyond risk ranking, iRISK enables users to estimate and compare the impact of interventions and control measures on public health risk. iRISK provides estimates of the impact of proposed interventions in various ways, including changes in the mean risk of illness and burden of disease metrics, such as losses in disability-adjusted life years. Case studies for Listeria monocytogenes and Salmonella were developed to demonstrate the application of iRISK for the estimation of risks and the impact of interventions for microbial hazards. iRISK was made available to the public at http://irisk.foodrisk.org in October 2012. PMID:23462073
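
    The sketch below illustrates, in schematic form, the kind of Monte Carlo chain a comparative risk tool like the one above assembles from its components (contamination, consumption, dose-response, health impact) and how an intervention's effect on mean risk can be compared; the distributions, the exponential dose-response parameter, and the intervention's log reduction are all hypothetical, not values from the tool.

        # Schematic food-hazard Monte Carlo risk sketch; all parameter values are hypothetical.
        import numpy as np

        rng = np.random.default_rng(1)
        N = 1_000_000                      # simulated servings

        def mean_risk_per_serving(log_reduction=0.0):
            """Mean probability of illness per serving for a hypothetical pathogen/food pair."""
            # Contamination at retail, log10 CFU/g (hypothetical distribution).
            log_conc = rng.normal(loc=-1.0, scale=1.0, size=N) - log_reduction
            serving_g = rng.uniform(20.0, 100.0, size=N)          # serving size, g
            dose = (10.0 ** log_conc) * serving_g                 # ingested CFU
            r = 1e-6                                              # exponential dose-response parameter
            p_ill = 1.0 - np.exp(-r * dose)                       # P(illness | dose)
            return p_ill.mean()

        baseline = mean_risk_per_serving()
        with_control = mean_risk_per_serving(log_reduction=2.0)   # e.g. a 2-log process control

        print(f"baseline mean risk/serving: {baseline:.2e}")
        print(f"with intervention:          {with_control:.2e}")
        print(f"relative risk reduction:    {1 - with_control / baseline:.1%}")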

  6. CMMAD usability case study in support of countermine and hazard sensing

    NASA Astrophysics Data System (ADS)

    Walker, Victor G.; Gertman, David I.

    2010-04-01

    During field trials, operator usability data were collected in support of lane clearing missions and hazard sensing for two robot platforms with Robot Intelligence Kernel (RIK) software and sensor scanning payloads onboard. The tests featured autonomous and shared robot autonomy levels where tasking of the robot used a graphical interface featuring mine location and sensor readings. The goal of this work was to provide insights that could be used to further technology development. The efficacy of countermine and hazard systems in terms of mobility, search, path planning, detection, and localization were assessed. Findings from objective and subjective operator interaction measures are reviewed along with commentary from soldiers having taken part in the study who strongly endorse the system.

  7. Intrinsic vulnerability, hazard and risk mapping for karst aquifers: A case study

    NASA Astrophysics Data System (ADS)

    Mimi, Ziad A.; Assi, Amjad

    2009-01-01

    Summary Groundwater from karst aquifers is among the most important resources of drinking water supply for the worldwide population. The European COST Action 620 proposed a comprehensive approach to karst groundwater protection, comprising methods of intrinsic and specific vulnerability mapping, hazard and risk mapping. This paper presents the first application of all components of this European approach to the groundwater underlying the Ramallah district, a karst hydrogeology system in Palestine. The vulnerability maps which were developed can assist in the implementation of groundwater management strategies to prevent degradation of groundwater quality. Large areas in the case study area can be classified as low or very low risk areas with respect to the pollution sources, due to the absence of hazards and also due to low vulnerabilities. These areas could consequently be interesting for future development as they are preferable in view of groundwater protection.

  8. Climate engineering of vegetated land for hot extremes mitigation: an ESM sensitivity study

    NASA Astrophysics Data System (ADS)

    Wilhelm, Micah; Davin, Edouard; Seneviratne, Sonia

    2014-05-01

    Mitigation efforts to reduce anthropogenic climate forcing have thus far proven inadequate, as is evident from accelerating greenhouse gas emissions. Many subtropical and mid-latitude regions are expected to experience longer and more frequent heat waves and droughts within the next century. This increased occurrence of weather extremes has important implications for human health and mortality and for socio-economic factors including forest fires, water availability and agricultural production. Various solar radiation management (SRM) schemes that attempt to homogeneously counter the anthropogenic forcing have been examined with different Earth System Models (ESMs). Land climate engineering schemes, which reduce the amount of solar radiation absorbed at the surface, have also been investigated. However, few studies have investigated their effects on climate extremes, as opposed to the mean climate response. Here we present the results of a series of climate engineering sensitivity experiments performed with the Community Earth System Model (CESM) version 1.0.2 at 2° resolution. This configuration entails 5 fully coupled model components responsible for simulating the Earth's atmosphere, land, land-ice, ocean and sea-ice that interact through a central coupler. Historical and RCP8.5 scenarios were performed with transient land-cover changes and prognostic terrestrial carbon/nitrogen cycles. Four sets of experiments are performed in which surface albedo over snow-free vegetated grid points is increased by 0.05, 0.10, 0.15 and 0.20. The simulations show a strong preferential cooling of hot extremes throughout the Northern mid-latitudes during boreal summer. A strong linear scaling between the cooling of extremes and the additional surface albedo applied to the land model is observed. The strongest preferential cooling is found in southeastern Europe and the central United States, where increases of soil moisture and evaporative fraction are the largest relative to the control simulation. This preferential cooling is found to intensify in the future scenario. Cloud cover strongly limits the efficacy of a given surface albedo increase in reflecting incoming solar radiation back into space. As anthropogenic forcing increases, cloud cover decreases over much of the northern mid-latitudes in CESM.

  9. 44 CFR 65.11 - Evaluation of sand dunes in mapping coastal flood hazard areas.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ...Evaluation of sand dunes in mapping coastal flood hazard areas. 65.11 Section 65...INSURANCE AND HAZARD MITIGATION National Flood Insurance Program IDENTIFICATION AND...Evaluation of sand dunes in mapping coastal flood hazard areas. (a) General...

  10. 44 CFR 65.11 - Evaluation of sand dunes in mapping coastal flood hazard areas.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ...Evaluation of sand dunes in mapping coastal flood hazard areas. 65.11 Section 65...INSURANCE AND HAZARD MITIGATION National Flood Insurance Program IDENTIFICATION AND...Evaluation of sand dunes in mapping coastal flood hazard areas. (a) General...

  11. 44 CFR 65.11 - Evaluation of sand dunes in mapping coastal flood hazard areas.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ...Evaluation of sand dunes in mapping coastal flood hazard areas. 65.11 Section 65...INSURANCE AND HAZARD MITIGATION National Flood Insurance Program IDENTIFICATION AND...Evaluation of sand dunes in mapping coastal flood hazard areas. (a) General...

  12. 44 CFR 65.11 - Evaluation of sand dunes in mapping coastal flood hazard areas.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ...Evaluation of sand dunes in mapping coastal flood hazard areas. 65.11 Section 65...INSURANCE AND HAZARD MITIGATION National Flood Insurance Program IDENTIFICATION AND...Evaluation of sand dunes in mapping coastal flood hazard areas. (a) General...

  13. 44 CFR 65.11 - Evaluation of sand dunes in mapping coastal flood hazard areas.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ...Evaluation of sand dunes in mapping coastal flood hazard areas. 65.11 Section 65...INSURANCE AND HAZARD MITIGATION National Flood Insurance Program IDENTIFICATION AND...Evaluation of sand dunes in mapping coastal flood hazard areas. (a) General...

  14. Status of volcanic hazard studies for the Nevada Nuclear Waste Storage Investigations

    SciTech Connect

    Crowe, B.M.; Vaniman, D.T.; Carr, W.J.

    1983-03-01

    Volcanism studies of the Nevada Test Site (NTS) region are concerned with hazards of future volcanism with respect to underground disposal of high-level radioactive waste. The hazards of silicic volcanism are judged to be negligible; hazards of basaltic volcanism are judged through research approaches combining hazard appraisal and risk assessment. The NTS region is cut obliquely by a N-NE trending belt of volcanism. This belt developed about 8 Myr ago following cessation of silicic volcanism and contemporaneous with migration of basaltic activity toward the southwest margin of the Great Basin. Two types of fields are present in the belt: (1) large-volume, long-lived basalt and local rhyolite fields with numerous eruptive centers and (2) small-volume fields formed by scattered basaltic scoria cones. Late Cenozoic basalts of the NTS region belong to the second field type. Monogenetic basalt centers of this region were formed mostly by Strombolian eruptions; Surtseyean activity has been recognized at three centers. Geochemically, the basalts of the NTS region are classified as straddle A-type basalts of the alkalic suite. Petrological studies indicate a volumetric dominance of evolved hawaiite magmas. Trace- and rare-earth-element abundances of younger basalt (<4 Myr) of the NTS region and southern Death Valley area, California, indicate an enrichment in incompatible elements, with the exception of rubidium. The conditional probability of recurring basaltic volcanism and disruption of a repository by that event is bounded by the range of 10^-8 to 10^-10 as calculated for a 1-yr period. Potential disruptive and dispersal effects of magmatic penetration of a repository are controlled primarily by the geometry of basalt feeder systems, the mechanism of waste incorporation in magma, and Strombolian eruption processes.
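
    For context on how an annual disruption probability of this magnitude translates to longer isolation periods, the short calculation below treats disruption as a Poisson process with the bounding annual rates quoted above; the time horizons are chosen only for illustration.

        # Probability of at least one disruptive event over an isolation period,
        # assuming a Poisson process with the bounding annual rates quoted in the abstract.
        import math

        for annual_rate in (1e-8, 1e-10):
            for years in (1_000, 10_000):
                p = 1.0 - math.exp(-annual_rate * years)
                print(f"rate {annual_rate:.0e}/yr over {years:>6} yr: P = {p:.2e}")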

  15. MITIGATION OF SEDIMENTATION HAZARDS DOWNSTREAM FROM RESERVOIRS

    Microsoft Academic Search

    Ellen WOHL; Sara RATHBURN

    Many reservoirs currently in operation trap most or all of the sediment entering the reservoir, creating sediment-depleted conditions downstream. This may cause channel adjustment in the form of bank erosion, bed erosion, substrate coarsening, and channel planform change. Channel adjustment may also result from episodic sediment releases during reservoir operation, or from sediment evacuation following dam removal. Channel adjustment to

  16. ERTS-1 flood hazard studies in the Mississippi River Basin. [Missouri, Mississippi, and Arkansas

    NASA Technical Reports Server (NTRS)

    Rango, A.; Anderson, A. T.

    1974-01-01

    The Spring 1973 Mississippi River flood was investigated using remotely sensed data from ERTS-1. Both manual and automatic analyses of the data indicate that ERTS-1 is extremely useful as a regional tool for flood and floodplain management. The maximum error of such flood area measurements is conservatively estimated to be less than five percent. Change detection analysis indicates that the flood had major impacts on soil moisture, land pattern stability, and vegetation stress. Flood hazard identification was conducted using photointerpretation techniques in three study areas along the Mississippi River using pre-flood ERTS-1 imagery down to 1:100,000 scale. Flood prone area boundaries obtained from ERTS-1 were generally in agreement with flood hazard maps produced by the U.S. Army Corps of Engineers and the U.S. Geological Survey, although the latter are somewhat more detailed because of their larger scale. Initial results indicate that ERTS-1 digital mapping of the flood-prone areas can be performed at scales of at least 1:62,500, which is comparable to conventional flood hazard map scales.

  17. Hydrogen Hazards Assessment Protocol (HHAP): Approach and Methodology

    NASA Technical Reports Server (NTRS)

    Woods, Stephen

    2009-01-01

    This viewgraph presentation reviews the approach and methodology used to develop an assessment protocol for hydrogen hazards. Included in the presentation are the reasons to perform hazard assessments, the types of hazard assessments that exist, an analysis of hydrogen hazards, and specific information about the Hydrogen Hazards Assessment Protocol (HHAP). The assessment is specifically tailored to hydrogen behavior. The end product of the assessment is a compilation of hazards, mitigations and associated factors to facilitate decision making and achieve best practice.

  18. 49 CFR 195.579 - What must I do to mitigate internal corrosion?

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ...What must I do to mitigate internal corrosion? 195.579 Section 195.579...TRANSPORTATION OF HAZARDOUS LIQUIDS BY PIPELINE Corrosion Control § 195.579 What must I do to mitigate internal corrosion? (a) General. If you...

  19. 49 CFR 195.579 - What must I do to mitigate internal corrosion?

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ...What must I do to mitigate internal corrosion? 195.579 Section 195.579...TRANSPORTATION OF HAZARDOUS LIQUIDS BY PIPELINE Corrosion Control § 195.579 What must I do to mitigate internal corrosion? (a) General. If you...

  20. Hazardous-Materials Robot

    NASA Technical Reports Server (NTRS)

    Stone, Henry W.; Edmonds, Gary O.

    1995-01-01

    Remotely controlled mobile robot used to locate, characterize, identify, and eventually mitigate incidents involving hazardous-materials spills/releases. Possesses number of innovative features, allowing it to perform mission-critical functions such as opening and unlocking doors and sensing for hazardous materials. Provides safe means for locating and identifying spills and eliminates risks of injury associated with use of manned entry teams. Current version of vehicle, called HAZBOT III, also features unique mechanical and electrical design enabling vehicle to operate safely within combustible atmosphere.

  1. A Hazard Assessment and Proposed Risk Index for Art, Architecture, Archive and Artifact Protection: Case Studies for Assorted International Museums

    NASA Astrophysics Data System (ADS)

    Kirk, Clara J.

    This study proposes a hazard/risk index for environmental, technological, and social hazards that may threaten a museum or other place of cultural storage and accession. This index can be utilized and implemented to measure the risk at the locations of these storage facilities in relationship to their geologic, geographic, environmental, and social settings. A model case study of the 1966 flood of the Arno River and its impact on the city of Florence and the Uffizi Gallery was used as the index focus. From this focus an additional eleven museums and their related risk were assessed. Each index addressed a diverse range of hazards based on past frequency and magnitude. It was found that locations nearest a hazard had exceptionally high levels of risk; however, more distant locations could have influences that would increase their risk to levels similar to those of locations near the hazard. Locations not normally associated with a given natural hazard can be susceptible should the right conditions be met, and this research identified, compiled and assessed those factors found to influence natural hazard risk at these research sites.

  2. Economics of Tsunami Mitigation in the Pacific Northwest

    NASA Astrophysics Data System (ADS)

    Goettel, K. A.; Rizzo, A.; Sigrist, D.; Bernard, E. N.

    2011-12-01

    The death toll in a major Cascadia Subduction Zone (CSZ) tsunami may be comparable to that of the Tohoku tsunami - tens of thousands. To date, tsunami risk reduction activities have been almost exclusively hazard mapping and evacuation planning. Reducing deaths in locations where evacuation to high ground is impossible in the short time between ground shaking and arrival of tsunamis requires measures such as vertical evacuation facilities or engineered pathways to safe ground. Yet, very few, if any, such tsunami mitigation projects have been done. In contrast, many tornado safe room and earthquake mitigation projects driven entirely or in large part by life safety have been done, with costs in the billions of dollars. The absence of tsunami mitigation measures results from the belief that tsunamis are too infrequent and the costs too high to justify life safety mitigation measures. A simple analysis based on return periods, death rates, and the geographic distribution of high risk areas for these hazards demonstrates that this belief is incorrect: well-engineered tsunami mitigation projects are more cost-effective, with higher benefit-cost ratios, than almost all tornado or earthquake mitigation projects. Goldfinger's paleoseismic studies of CSZ turbidites indicate return periods for major CSZ tsunamis of about 250-500 years (USGS Prof. Paper 1661-F, in press). Tsunami return periods are comparable to those for major earthquakes at a given location in high seismic areas and are much shorter than those for tornadoes at any location, which range from >4,000 to >16,000 years for >EF2 and >EF4 tornadoes, respectively. The average earthquake death rate in the US over the past 100 years is about 1/year, or about 30/year including the 1906 San Francisco earthquake. The average death rate for tornadoes is about 90/year. For CSZ tsunamis, the estimated average death rate ranges from about 20/year (10,000 every 500 years) to 80/year (20,000 every 250 years). Thus, the long-term death rates from tsunamis, earthquakes and tornadoes are comparable. High hazard areas for tornadoes and earthquakes cover ~40% and ~15% of the contiguous US, ~1,250,000 and ~500,000 square miles, respectively. In marked contrast, tsunami life safety risk is concentrated in communities with significant populations in areas where evacuation to high ground is impossible: probably <4,000 square miles, or <0.1% of the US. The geographic distribution of life safety risk profoundly affects the economics of tsunami life safety mitigation projects. Consider a tsunami life safety project which saves an average of one life per year (500 lives per 500 years). Using FEMA's value of human life ($5.8 million), a 7% discount rate and a 50-year project useful lifetime, the net present value of avoided deaths is about $80 million. Thus, the benefit-cost ratio would be about 16 or about 80 for tsunami mitigation projects which cost $5 million or $1 million, respectively. These rough calculations indicate that tsunami mitigation projects in high risk locations are economically justified. More importantly, these results indicate that national and local priorities for natural hazard mitigation should be reconsidered, with tsunami mitigation given a very high priority.
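
    The benefit-cost arithmetic in the abstract can be reproduced directly; the sketch below recomputes the present value of one statistical life saved per year over a 50-year project life at a 7% discount rate, and the resulting benefit-cost ratios for the two project costs cited.

        # Reproduce the abstract's benefit-cost arithmetic for a tsunami life-safety project.
        VALUE_OF_LIFE = 5.8e6    # value of a statistical life cited in the abstract, $
        DISCOUNT_RATE = 0.07
        LIFETIME_YRS  = 50
        LIVES_PER_YR  = 1.0      # average lives saved per year (500 per 500 years)

        # Present value of an annuity of annual benefits over the project lifetime.
        annuity_factor = (1.0 - (1.0 + DISCOUNT_RATE) ** -LIFETIME_YRS) / DISCOUNT_RATE
        npv_benefits = LIVES_PER_YR * VALUE_OF_LIFE * annuity_factor
        print(f"present value of avoided deaths: ${npv_benefits / 1e6:.0f} million")  # ~ $80 million

        for cost in (5e6, 1e6):
            print(f"project cost ${cost / 1e6:.0f} million -> benefit-cost ratio {npv_benefits / cost:.0f}")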

  3. GIS data for the Seaside, Oregon, Tsunami Pilot Study to modernize FEMA flood hazard maps

    USGS Publications Warehouse

    Wong, Florence L.; Venturato, Angie J.; Geist, Eric L.

    2007-01-01

    A Tsunami Pilot Study was conducted for the area surrounding the coastal town of Seaside, Oregon, as part of the Federal Emergency Management Agency's (FEMA) Flood Insurance Rate Map Modernization Program (Tsunami Pilot Study Working Group, 2006). The Cascadia subduction zone extends from Cape Mendocino, California, to Vancouver Island, Canada. The Seaside area was chosen because it is typical of many coastal communities subject to tsunamis generated by far- and near-field (Cascadia) earthquakes. Two goals of the pilot study were to develop probabilistic 100-year and 500-year tsunami inundation maps using Probabilistic Tsunami Hazard Analysis (PTHA) and to provide recommendations for improving tsunami hazard assessment guidelines for FEMA and state and local agencies. The study was an interagency effort by the National Oceanic and Atmospheric Administration, U.S. Geological Survey, and FEMA, in collaboration with the University of Southern California, Middle East Technical University, Portland State University, Horning Geoscience, Northwest Hydraulics Consultants, and the Oregon Department of Geological and Mineral Industries. The pilot study model data and results are published separately as a geographic information systems (GIS) data report (Wong and others, 2006). The flood maps and GIS data are briefly described here.

  4. Simulation study of HEMT structures with HfO2 cap layer for mitigating inverse piezoelectric effect related device failures

    NASA Astrophysics Data System (ADS)

    Nagulapally, Deepthi; Joshi, Ravi P.; Pradhan, Aswini

    2015-01-01

    The Inverse Piezoelectric Effect (IPE) is thought to contribute to possible device failure of GaN High Electron Mobility Transistors (HEMTs). Here we focus on a simulation study to probe the possible mitigation of the IPE by reducing the internal electric fields and related elastic energy through the use of high-k materials. Inclusion of a HfO2 "cap layer" above the AlGaN barrier particularly with a partial mesa structure is shown to have potential advantages. Simulations reveal even greater reductions in the internal electric fields by using "field plates" in concert with high-k oxides.

  5. Awareness of occupational hazards and use of safety measures among welders: a cross-sectional study from eastern Nepal

    PubMed Central

    Budhathoki, Shyam Sundar; Singh, Suman Bahadur; Sagtani, Reshu Agrawal; Niraula, Surya Raj; Pokharel, Paras Kumar

    2014-01-01

    Objective The proper use of safety measures by welders is an important way of preventing and/or reducing a variety of health hazards that they are exposed to during welding. There is a lack of knowledge about hazards and personal protective equipment (PPE), and the use of PPE among the welders in Nepal is limited. We designed a study to assess welders' awareness of hazards and PPE, and the use of PPE among the welders of eastern Nepal, and to find a possible correlation between awareness and use of PPE among them. Materials and methods A cross-sectional study of 300 welders selected by simple random sampling from three districts of eastern Nepal was conducted using a semistructured questionnaire. Data regarding age, education level, duration of employment, awareness of hazards, safety measures and the actual use of safety measures were recorded. Results Overall, 272 (90.7%) welders were aware of at least one hazard of welding and a similar proportion of welders were aware of at least one PPE. However, only 47.7% used one or more types of PPE. Education and duration of employment were significantly associated with the awareness of hazards and of PPE and its use. The welders who reported using PPE during welding were two times more likely to have been aware of hazards (OR=2.52, 95% CI 1.09 to 5.81) and five times more likely to have been aware of PPE compared with the welders who did not report the use of PPE (OR=5.13, 95% CI 2.34 to 11.26). Conclusions The welders using PPE were those who were aware of hazards and PPE. There is a gap between being aware of hazards and PPE (90%) and use of PPE (47%) at work. Further research is needed to identify the underlying factors leading to low utilisation of PPE despite the welders of eastern Nepal being knowledgeable about it. PMID:24889850

  6. Prediction of Ungauged River Basin for Hydro Power Potential and Flood Risk Mitigation; a Case Study at Gin River, Sri Lanka

    NASA Astrophysics Data System (ADS)

    Ratnayake, A. S.

    2011-12-01

    Most of the world's primary civilizations emerged in or near river valleys or floodplains. River channels and floodplains form a single hydrologic and geomorphic system, and failure to appreciate the integral connection between them underlies many socioeconomic and environmental problems in river management today. Collecting reliable field hydrological data, however, is a difficult task; under such situations either synthetic or statistically generated data are used for hydraulic engineering design and flood modeling. The precipitation-runoff relationship for the Gin River basin was characterized through a synthetic unit hydrograph prepared using the method of the Flood Studies Report of the National Environmental Research Council, United Kingdom (1975). A Triangular Irregular Network model was constructed using a Geographic Information System (GIS) to determine hazard-prone zones, and 1:10,000 and 1:50,000 topographic maps together with field excursions were used for initial site selection of mini-hydro power units and for delineating the flooded area. Turbine output power was calculated from the net head and turbine efficiency. The peak discharge of the Gin River basin is reached within 4.74 hours from the onset of a rainstorm, and 11.95 hours are required to return to normal discharge conditions. The stream frequency of the Gin River is 4.56 junctions/km2, the channel slope is 7.90 m/km, and the regional coefficient of the catchment is 0.00296. High stream frequency and a gentle channel slope were recognized as the flood-triggering factors of the Gin River basin; other parameters such as catchment area, main stream length, standard average annual rainfall and soil do not show significant variation from other catchments of Sri Lanka. The flood management process, including controlling flood disasters, preparing for floods, and minimizing their impacts, is complicated in floodplains encroached on and modified by human populations. Modern GIS technology has therefore been productively used to prepare hazard maps based on flood modeling, which can be further utilized for disaster preparedness and mitigation activities. Five suitable hydraulic heads were identified for mini-hydro power sites, which would be the most economical and applicable flood-controlling hydraulic engineering structures considering the morphologic, climatic, environmental and socioeconomic characteristics of the study area; the sites could also be utilized as a clean, eco-friendly and reliable energy source (8,630.0 kW). Finally, the Francis turbine was selected as the most efficient turbine for the selected sites, bearing in mind both technical and economical parameters.
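
    The turbine output calculation mentioned in the abstract follows the standard hydropower relation P = eta * rho * g * Q * H; the sketch below applies it to hypothetical head and discharge values, since the actual Gin River site parameters are not given in the record.

        # Standard hydropower relation P = eta * rho * g * Q * H, applied to hypothetical sites.
        RHO = 1000.0   # water density, kg/m^3
        G   = 9.81     # gravitational acceleration, m/s^2

        def hydro_power_kw(head_m, discharge_m3s, efficiency=0.90):
            """Electrical output (kW) for a given net head, discharge, and turbine efficiency."""
            return efficiency * RHO * G * discharge_m3s * head_m / 1000.0

        # Hypothetical mini-hydro sites (net head in m, design discharge in m^3/s).
        sites = [(12.0, 8.0), (20.0, 5.0), (35.0, 3.0)]
        for h, q in sites:
            print(f"head {h:5.1f} m, Q {q:4.1f} m3/s -> {hydro_power_kw(h, q):7.1f} kW")
        print(f"total potential: {sum(hydro_power_kw(h, q) for h, q in sites):.1f} kW")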

  7. A seismic landslide hazard analysis with topographic effect, a case study in the 99 Peaks region, Central Taiwan

    NASA Astrophysics Data System (ADS)

    Peng, Wen-Fei; Wang, Chein-Lee; Chen, Shih-Tsu; Lee, Shing-Tsz

    2009-04-01

    It has been known that ground motion amplitude will be amplified at mountaintops; however, such topographic effects are not included in conventional landslide hazard models. In this study, a modified procedure that considers the topographic effects is proposed to analyze the seismic landslide hazard. The topographic effect is estimated by back analysis. First, a 3D dynamic numerical model with irregular topography is constructed. The theoretical topographic amplification factors are derived from the dynamic numerical model. The ground motion record is regarded as the reference motion in the plane area. By combining the topographic amplification factors with the reference motions, the amplified acceleration time history and amplified seismic intensity parameters are obtained. Newmark's displacement model is chosen to perform the seismic landslide hazard analysis. By combining the regression equation with the seismic parameters of peak ground acceleration and Arias intensity, the Newmark displacement distribution is generated. Subsequently, the calculated Newmark displacement maps are transformed into hazard maps. The landslide hazard maps of the 99 Peaks region, Central Taiwan are evaluated. The actual landslide inventory maps triggered by the 21 September 1999 Chi-Chi earthquake are compared with the calculated hazard maps. Relative to the conventional procedure, the results show that the proposed procedure, which includes the topographic effect, obtains a better result for seismic landslide hazard analysis.
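
    As a minimal illustration of the displacement model named above (not the study's regression equations or topographic amplification), the sketch below integrates a Newmark rigid sliding block over a synthetic acceleration pulse and also evaluates the Arias intensity of the record; the input motion and yield acceleration are hypothetical.

        # Minimal Newmark rigid sliding-block integration over a synthetic acceleration record,
        # plus the Arias intensity of the record. Input motion and yield acceleration are hypothetical.
        import numpy as np

        G  = 9.81
        dt = 0.01                                   # time step, s
        t  = np.arange(0.0, 20.0, dt)
        # Synthetic ground acceleration: a decaying 1.5 Hz sinusoid peaking near 0.4 g.
        accel = 0.4 * G * np.exp(-0.15 * t) * np.sin(2.0 * np.pi * 1.5 * t)

        def newmark_displacement(accel, dt, a_yield):
            """Cumulative downslope displacement (m) of a rigid block with yield acceleration a_yield."""
            v, d = 0.0, 0.0
            for a in accel:
                if v > 0.0 or a > a_yield:          # block is sliding or starts to slide
                    v = max(0.0, v + (a - a_yield) * dt)
                    d += v * dt
            return d

        def arias_intensity(accel, dt):
            """Arias intensity (m/s): Ia = pi / (2 g) * integral of a(t)^2 dt."""
            return np.pi / (2.0 * G) * np.sum(accel ** 2) * dt

        a_c = 0.1 * G                               # hypothetical critical (yield) acceleration
        print(f"Newmark displacement: {100 * newmark_displacement(accel, dt, a_c):.1f} cm")
        print(f"Arias intensity:      {arias_intensity(accel, dt):.2f} m/s")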

  8. Properties of Hazards

    NSDL National Science Digital Library

    2001-01-01

    A Hazard is any physical, biological or chemical agent that poses a threat to human health. You can get exposed to hazards due to the way you live, the job you do, or the environment you are surrounded by. This "Properties of Hazards" module has five instructional units. Follow the links above in order. Each lesson contains study material for one day. Learn more about just how close you are to environmental health hazards and how they can affect you.

  9. Hazardous Waste

    MedlinePLUS

    ... you throw these substances away, they become hazardous waste. Some hazardous wastes come from products in our homes. Our garbage can include such hazardous wastes as old batteries, bug spray cans and paint ...

  10. An evaluation of soil erosion hazard: A case study in Southern Africa using geomatics technologies

    NASA Astrophysics Data System (ADS)

    Eiswerth, Barbara Alice

    Accelerated soil erosion in Malawi, Southern Africa, increasingly threatens agricultural productivity, given current and projected population growth trends. Previous attempts to document soil erosion potential have had limited success, lacking appropriate information and diagnostic tools. This study utilized geomatics technologies and the latest available information from topography, soils, climate, vegetation, and land use of a watershed in southern Malawi. The Soil Loss Estimation Model for Southern Africa (SLEMSA), developed for conditions in Zimbabwe, was evaluated and used to create a soil erosion hazard map for the watershed under Malawi conditions. The SLEMSA sub-models of cover, soil loss, and topography were computed from energy interception, rainfall energy, and soil erodibility, and slope length and steepness, respectively. Geomatics technologies including remote sensing and Geographic Information Systems (GIS) provided the tools with which land cover/land use, a digital elevation model, and slope length and steepness were extracted and integrated with rainfall and soils spatial information. Geomatics technologies enable rapid update of the model as new and better data sets become available. Sensitivity analyses of the SLEMSA model revealed that rainfall energy and slope steepness have the greatest influence on soil erosion hazard estimates in this watershed. Energy interception was intermediate in sensitivity level, whereas slope length and soil erodibility ranked lowest. Energy interception and soil erodibility were shown by parameter behavior analysis to behave in a linear fashion with respect to soil erosion hazard, whereas rainfall energy, slope steepness, and slope length exhibit non-linear behavior. When SLEMSA input parameters and results were compared to alternative methods of soil erosion assessment, such as drainage density and drainage texture, the model provided more spatially explicit information using 30 meter grid cells. Results of this study indicate that more accurate soil erosion estimates can be made when: (1) higher resolution digital elevation models are used; (2) data from improved precipitation station network are available, and; (3) greater investment in rainfall energy research.
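
    The SLEMSA estimate is the product of its three sub-models, Z = K x X x C; the sketch below shows only this multiplicative, per-cell grid computation. The sub-model functions here are schematic placeholders, not the published SLEMSA equations or the coefficients used for the Malawi study.

        # Schematic per-cell SLEMSA-style computation Z = K * X * C over a raster grid.
        # The three sub-model functions below are placeholders, NOT the published SLEMSA equations.
        import numpy as np

        def k_bare_soil_loss(rain_energy, erodibility):
            """Placeholder: soil loss from bare soil, increasing with rainfall energy and erodibility."""
            return 0.001 * rain_energy * erodibility

        def x_topographic(slope_len_m, slope_pct):
            """Placeholder: topographic ratio, increasing with slope length and steepness."""
            return np.sqrt(slope_len_m / 22.0) * (slope_pct / 9.0)

        def c_cover(energy_interception_pct):
            """Placeholder: cover ratio, decreasing as vegetation intercepts more rainfall energy."""
            return np.exp(-0.05 * energy_interception_pct)

        # Hypothetical 2x2 grids of model inputs (one value per raster cell).
        rain_energy  = np.array([[9000.0, 9500.0], [8800.0, 9100.0]])   # J/m^2
        erodibility  = np.array([[0.4, 0.5], [0.6, 0.3]])
        slope_len    = np.array([[30.0, 60.0], [45.0, 90.0]])           # m
        slope_pct    = np.array([[4.0, 12.0], [8.0, 20.0]])             # %
        interception = np.array([[60.0, 20.0], [35.0, 10.0]])           # %

        Z = (k_bare_soil_loss(rain_energy, erodibility)
             * x_topographic(slope_len, slope_pct)
             * c_cover(interception))
        print(Z)   # relative erosion-hazard surface; classify into hazard classes downstream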

  11. Study on FPGA SEU Mitigation for Readout Electronics of DAMPE BGO Calorimeter

    E-print Network

    Zhongtao Shen; Changqing Feng; Shanshan Gao; Deliang Zhang; Di Jiang; Shubin Liu; Qi An

    2014-06-16

    The BGO calorimeter, which provides a wide measurement range of the primary cosmic ray spectrum, is a key sub-detector of Dark Matter Particle Explorer (DAMPE). The readout electronics of calorimeter consists of 16 pieces of Actel ProASIC Plus FLASH-based FPGA, of which the design-level flip-flops and embedded block RAMs are single event upset (SEU) sensitive in the harsh space environment. Therefore to comply with radiation hardness assurance (RHA), SEU mitigation methods, including partial triple modular redundancy (TMR), CRC checksum, and multi-domain reset are analyzed and tested by the heavy-ion beam test. Composed of multi-level redundancy, a FPGA design with the characteristics of SEU tolerance and low resource consumption is implemented for the readout electronics.

  12. Study on FPGA SEU Mitigation for Readout Electronics of DAMPE BGO Calorimeter

    E-print Network

    Shen, Zhongtao; Gao, Shanshan; Zhang, Deliang; Jiang, Di; Liu, Shubin; An, Qi

    2014-01-01

    The BGO calorimeter, which provides a wide measurement range of the primary cosmic ray spectrum, is a key sub-detector of Dark Matter Particle Explorer (DAMPE). The readout electronics of calorimeter consists of 16 pieces of Actel ProASIC Plus FLASH-based FPGA, of which the design-level flip-flops and embedded block RAMs are single event upset (SEU) sensitive in the harsh space environment. Therefore to comply with radiation hardness assurance (RHA), SEU mitigation methods, including partial triple modular redundancy (TMR), CRC checksum, and multi-domain reset are analyzed and tested by the heavy-ion beam test. Composed of multi-level redundancy, a FPGA design with the characteristics of SEU tolerance and low resource consumption is implemented for the readout electronics.
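
    Triple modular redundancy as described above is implemented in the FPGA fabric itself, but the voting logic is easy to illustrate in software; the Python sketch below shows bitwise two-out-of-three majority voting over three redundant register copies, one of which has suffered a simulated single-event upset.

        # Bitwise 2-of-3 majority vote, the core idea of triple modular redundancy (TMR).
        # In the real design this is synthesized in FPGA logic; this is only a software illustration.
        def tmr_vote(a: int, b: int, c: int) -> int:
            """Return the bitwise majority of three redundant register copies."""
            return (a & b) | (b & c) | (a & c)

        original = 0b1011_0110
        upset    = original ^ (1 << 3)          # simulated SEU: bit 3 flipped in one copy

        corrected = tmr_vote(original, upset, original)
        assert corrected == original            # the two good copies outvote the upset one
        print(f"original  {original:08b}")
        print(f"upset     {upset:08b}")
        print(f"voted     {corrected:08b}")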

  13. K Basins hazard analysis

    SciTech Connect

    MCCALL, T.B.

    2002-10-09

    The 105-K East (KE) and 105-K West (KW) Basins in the 100 K Area of the Hanford Site have been used for storage of irradiated N Reactor and single-pass reactor fuel. Remaining spent fuel is continuing to be stored underwater in racks and canisters in the basins while fuel retrieval activities proceed to remove the fuel from the basins. The Spent Nuclear Fuel (SNF) Project is adding equipment to the facility in preparation for removing the fuel and sludge from the basins. In preparing this hazard analysis, a variety of hazard analysis techniques were used by the K Basins hazard analysis teams, including hazard and operability studies, preliminary hazard analyses, and ''what if'' analyses. This document summarizes the hazard analyses performed as part of the safety evaluations for the various modification projects and combines them with the original hazard analyses to create a living hazard analysis document.

  14. K Basin Hazard Analysis

    SciTech Connect

    MCCALL, T.B.

    2002-03-21

    The 105-K East (KE) and 105-K West (KW) Basins in the 100 K Area of the Hanford Site have been used for storage of irradiated N Reactor and single-pass reactor fuel. Remaining spent fuel is continuing to be stored underwater in racks and canisters in the basins while fuel retrieval activities proceed to remove the fuel from the basins. The Spent Nuclear Fuel (SNF) Project is adding equipment to the facility in preparation for removing the fuel and sludge from the basins. In preparing this hazard analysis, a variety of hazard analysis techniques were used by the K Basins hazard analysis teams, including hazard and operability studies, preliminary hazard analyses, and ''what if'' analyses. This document summarizes the hazard analyses performed as part of the safety evaluations for the various modification projects and combines them with the original hazard analyses to create a living hazard analysis document.

  15. Effectiveness of protected areas in mitigating fire within their boundaries: case study of Chiapas, Mexico.

    PubMed

    Román-Cuesta, María Rosa; Martínez-Vilalta, Jordi

    2006-08-01

    Since the severe 1982-1983 El Niño drought, recurrent burning has been reported inside tropical protected areas (TPAs). Despite the key role of fire in habitat degradation, little is known about the effectiveness of TPAs in mitigating fire incidence and burned areas. We used a GPS fire database (1995-2005) (n=3590 forest fires) obtained from the National Forest Commission to compare fire incidence (number of fires) and burned areas inside TPAs and their surrounding adjacent buffer areas in Southern Mexico (Chiapas). Burned areas inside parks ranged from 2% (Palenque) to 45% (Lagunas de Montebello) of a park's area, and the amount burned was influenced by two severe El Niño events (1998 and 2003). These two years together resulted in 67% and 46% of the total area burned in TPAs and buffers, respectively, during the period under analysis. Larger burned areas in TPAs than in their buffers were exclusively related to the extent of natural habitats (flammable area excluding agrarian and pasture lands). Higher fuel loads together with access and extinction difficulties were likely behind this trend. A higher incidence of fire in TPAs than in their buffers was exclusively related to anthropogenic factors such as higher road densities and agrarian extensions. Our results suggest that TPAs are failing to mitigate fire impacts, with both fire incidence and total burned areas being significantly higher in the reserves than in adjacent buffer areas. Management plans should consider those factors that facilitate fires in TPAs: anthropogenic origin of fires, sensitivity of TPAs to El Niño droughts, large fuel loads and fuel continuity inside parks, and limited financial resources. Consideration of these factors favors lines of action such as alternatives to the use of fire (e.g., mucuna-maize system), climatic prediction to follow the evolution of El Niño, fuel management strategies that favor extinction practices, and the strengthening of local communities and ecotourism. PMID:16922224

  16. National evaluation of the US Department of Housing and Urban Development Lead-Based Paint Hazard Control Grant Program: Study methods

    Microsoft Academic Search

    Warren Galke; Scott Clark; Pat McLaine; Robert Bornschein; Jonathan Wilson; Paul Succop; Sandy Roda; Jill Breysse; David Jacobs; JoAnn Grote; William Menrath; Sherry Dixon; Mei Chen; Ralph Buncher

    2005-01-01

    The US Department of Housing and Urban Development (HUD) undertook an evaluation of its Lead Hazard Control Grant Program between 1994 and 1999. The Evaluation is the largest study ever done on the effectiveness of lead hazard controls implemented in residential dwellings. The Evaluation had several major objectives: determining the effectiveness of various lead hazard controls in reducing residential dust

  17. Progress in NTHMP Hazard Assessment

    USGS Publications Warehouse

    Gonzalez, F.I.; Titov, V.V.; Mofjeld, H.O.; Venturato, A.J.; Simmons, R.S.; Hansen, R.; Combellick, R.; Eisner, R.K.; Hoirup, D.F.; Yanagi, B.S.; Yong, S.; Darienzo, M.; Priest, G.R.; Crawford, G.L.; Walsh, T.J.

    2005-01-01

    The Hazard Assessment component of the U.S. National Tsunami Hazard Mitigation Program has completed 22 modeling efforts covering 113 coastal communities with an estimated population of 1.2 million residents that are at risk. Twenty-three evacuation maps have also been completed. Important improvements in organizational structure have been made with the addition of two State geotechnical agency representatives to Steering Group membership, and progress has been made on other improvements suggested by program reviewers. © Springer 2005.

  18. Application of a Data Mining Model and Its Cross Application for Landslide Hazard Analysis: a Case Study in Malaysia

    NASA Astrophysics Data System (ADS)

    Pradhan, Biswajeet; Lee, Saro; Shattri, Mansor

    This paper deals with landslide hazard analysis and cross-application using Geographic Information System (GIS) and remote sensing data for Cameron Highland, Penang Island and Selangor in Malaysia. The aim of this study was to cross-apply and verify a spatial probabilistic model for landslide hazard analysis. Landslide locations were identified in the study area from interpretation of aerial photographs and field surveys. Topographical/geological data and satellite images were collected and processed using GIS and image processing tools. Ten landslide-inducing parameters were considered for the landslide hazard analysis. These parameters are topographic slope, aspect, curvature and distance from drainage, all derived from the topographic database; geology and distance from lineament, derived from the geologic database; landuse from Landsat satellite images; soil from the soil database; precipitation amount, derived from the rainfall database; and the vegetation index value from SPOT satellite images. These factors were analyzed using an artificial neural network model to generate the landslide hazard map. Each factor's weight was determined by the back-propagation training method. Then the landslide hazard indices were calculated using the trained back-propagation weights, and finally the landslide hazard map was generated using GIS tools. Landslide hazard maps were drawn for these three areas using the artificial neural network model derived not only from the data for that area but also from the parameter weights calculated from each of the other two areas (nine maps in all), as a cross-check of the validity of the method. For verification, the results of the analyses were compared, in each study area, with actual landslide locations. The verification results showed sufficient agreement between the presumptive hazard map and the existing data on landslide areas.
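    As a rough illustration of this workflow (not the authors' code), the sketch below trains a small back-propagation network on synthetic cells described by the ten factors and uses the network's output probability as a hazard index; every name and value here is an assumption.

```python
# Minimal sketch: back-propagation network on landslide/non-landslide cells,
# with the predicted landslide probability serving as the hazard index.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
factors = ["slope", "aspect", "curvature", "dist_drainage", "geology",
           "dist_lineament", "landuse", "soil", "precipitation", "ndvi"]

X = rng.normal(size=(500, len(factors)))          # one row per grid cell
y = (X[:, 0] + 0.5 * X[:, 3] + rng.normal(scale=0.5, size=500) > 0).astype(int)

model = make_pipeline(StandardScaler(),
                      MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000,
                                    random_state=0))
model.fit(X, y)

# Hazard index per cell: the network's landslide probability, which would then
# be written back to a raster layer in the GIS to draw the hazard map.
hazard_index = model.predict_proba(X)[:, 1]
print(hazard_index[:5].round(3))
```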

  19. Delayed geochemical hazard: A tool for risk assessment of heavy metal polluted sites and case study.

    PubMed

    Zheng, Mingxia; Feng, Liu; He, Juanni; Chen, Ming; Zhang, Jiawen; Zhang, Minying; Wang, Jing

    2015-04-28

    A concept of delayed geochemical hazard (DGH) was proposed, in place of the term chemical time bomb, to represent an ecological and environmental hazard caused by the sudden reactivation and release of long-term accumulated pollutants in a soil/sediment system due to a change in physicochemical conditions or a decrease in environmental capacity. A DGH model was also established to provide a quantitative tool to assess and predict the potential environmental risk caused by heavy metals and especially its dynamic evolution. A case study of DGH was carried out for a mercury-polluted area in southern China. Results of a soil column experiment showed that DGH resulted directly from the transformation and release of the pollutant from the releasable species to the active ones through a chain-reaction mechanism. The most likely chain reaction was summarized as Hg(E+C+F+O+R) → Hg(E+C+F+O) → Hg(E+C+F) → Hg(E+C) → Hg(E). Although the total releasable content of mercury (TRCP_Hg) exceeded the DGH critical point value of 16.667 mg/kg in 8.3% of the studied area, implying the possibility of a DGH burst, the area was classified as low-risk for DGH. This confirmed that the DGH model could contribute to the risk assessment and early warning of soil/sediment pollution. PMID:25661167
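    A minimal sketch of the screening step implied by the critical value quoted above; only the 16.667 mg/kg threshold comes from the abstract, the sample concentrations are invented.

```python
# Flag sampling points whose total releasable mercury content exceeds the
# reported DGH critical point and report the exceeding share of the points.
# Sample values are illustrative only.
TRCP_CRITICAL_MG_KG = 16.667

samples_mg_kg = [2.1, 5.4, 18.9, 0.8, 23.5, 7.2, 12.0, 3.3, 1.1, 16.9, 4.0, 2.8]

exceeding = [v for v in samples_mg_kg if v > TRCP_CRITICAL_MG_KG]
share = 100.0 * len(exceeding) / len(samples_mg_kg)
print(f"{len(exceeding)} of {len(samples_mg_kg)} points exceed the critical value "
      f"({share:.1f}% of sampled points)")
```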

  20. Vertical Field of View Reference Point Study for Flight Path Control and Hazard Avoidance

    NASA Technical Reports Server (NTRS)

    Comstock, J. Raymond, Jr.; Rudisill, Marianne; Kramer, Lynda J.; Busquets, Anthony M.

    2002-01-01

    Researchers within the eXternal Visibility System (XVS) element of the High-Speed Research (HSR) program developed and evaluated display concepts that will provide the flight crew of the proposed High-Speed Civil Transport (HSCT) with integrated imagery and symbology to permit path control and hazard avoidance functions while maintaining required situation awareness. The challenge of the XVS program is to develop concepts that would permit a no-nose-droop configuration of an HSCT and expanded low-visibility HSCT operational capabilities. This study was one of a series of experiments exploring the 'design space' restrictions for physical placement of an XVS display. The primary experimental issue here was 'conformality' of the forward display vertical position with respect to the side window in simulated flight. 'Conformality' refers to the case in which the horizon and objects appear in the same relative positions when viewed through the forward windows or display and the side windows. This study quantified the effects of visual conformality on pilot flight path control and hazard avoidance performance. Here, conformality related to the positioning of the artificial horizon line and associated symbology presented on the forward display relative to the horizon and the associated ground, horizon, and sky textures as they would appear in the real view through a window, presented in the side window display. No significant performance consequences were found for the non-conformal conditions.

  1. A Randomized, Controlled Trial of Home Injury Hazard Reduction: The HOME Injury Study

    PubMed Central

    Phelan, Kieran J.; Khoury, Jane; Xu, Yingying; Liddy, Stacey; Hornung, Richard; Lanphear, Bruce P.

    2013-01-01

    Objective: To test the efficacy of an intervention of safety device installation on medically attended injury in children from birth to 3 years of age. Design: A nested, prospective, randomized, controlled trial. Setting: Indoor environment of housing units of mothers and children. Participants: Mothers and their children enrolled in a birth cohort examining the effects of prevalent neurotoxicants on child development, the Home Observation and Measures of the Environment (HOME) Study. Intervention: Installation of multiple passive measures (stair gates, window locks, smoke and carbon monoxide detectors) to reduce exposure to injury hazards present in housing units. Outcome measure: Self-reported, medically attended, and modifiable injury. Methods: 1263 (14%) prenatal patients were eligible, 413 (33%) agreed to participate, and 355 were randomly assigned to the experimental (n=181) or control (n=174) groups. Injury hazards were assessed at home visits by teams of trained research assistants using a validated survey. Safety devices were installed in intervention homes. Intention-to-treat analyses to test efficacy were conducted on: 1) total injury rates and 2) injuries deemed, a priori, modifiable by the installation of safety devices. Rates of medically attended injuries (phone calls, office or emergency visits) were calculated using generalized estimating equations. Results: The mean age of the children at intervention was 6 months. Injury hazards were significantly reduced in the intervention but not in the control group homes at one and two years (p<0.004). There was not a significant difference in the rate of all medically attended injuries in intervention compared with control group children, 14.3 (95% CI 9.7, 21.1) vs. 20.8 (14.4, 29.9) per 100 child-years (p=0.17), respectively; but there was a significant reduction in modifiable medically attended injuries in intervention compared with control group children, 2.3 (1.0, 5.5) vs. 7.7 (4.2, 14.2) per 100 child-years, respectively (p=0.026). Conclusions: An intervention to reduce exposure to hazards in the homes of young children led to a 70% reduction in modifiable medically attended injury. PMID:21464382
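    The trial's rates were estimated with generalized estimating equations; the sketch below only illustrates, with made-up counts, how an injury rate per 100 child-years and a simple Poisson-style confidence interval are formed.

```python
# Injury rate per 100 child-years with an approximate Poisson (log-scale Wald)
# confidence interval. Event counts and follow-up time are illustrative.
import math

def rate_per_100_child_years(events, child_years):
    rate = 100.0 * events / child_years
    se_log = 1.0 / math.sqrt(events)      # SE of log(rate) for a Poisson count
    return rate, rate * math.exp(-1.96 * se_log), rate * math.exp(1.96 * se_log)

intervention = rate_per_100_child_years(events=8, child_years=350.0)
control = rate_per_100_child_years(events=27, child_years=350.0)
print("intervention: %.1f (%.1f, %.1f) per 100 child-years" % intervention)
print("control:      %.1f (%.1f, %.1f) per 100 child-years" % control)
```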

  2. State of Tsunami Mitigation in the U.S. states and territories

    NASA Astrophysics Data System (ADS)

    Rizzo, A.

    2012-12-01

    On March 11, 2011, the west coast of the United States was vividly reminded of the hazard that sits about a hundred miles off the coast. Coastal populations could have less than 15 minutes to evacuate from a Cascadia-induced tsunami. The 2011 Japanese tsunami provides many lessons on when mitigation works and when mitigation fails. Emergency managers, policy makers, scientists, and the public have many options when it comes to mitigation strategies. Topics covered by this presentation range from understanding the effects of probability and possibility on mitigation strategies to the existing and emerging mitigation strategies being used by coastal States and Federal partners to mitigate the tsunami hazard.

  3. Factors influencing treatment decisions of hazardous waste generators: a study of factories in the Bangkok region.

    PubMed

    Jarusombat, Soparatana; Nurul Amin, A T M

    2002-12-01

    While the developed countries are engaged in developing new technology for waste avoidance, recycling and reuse, developing countries are still wrestling to decide on the best option to treat and dispose of wastes. This is particularly so in the case of hazardous industrial wastes (HIW). In order to ensure safe HIW management, one policy response has been to make available a central treatment facility for HIW generators. Such a facility was established in Thailand--the Bang Khuntien Treatment Facility (BKTF)--for the HIW generators in the Bangkok region. However, this facility operates much below its capacity. A study was therefore undertaken between April 1996 and April 1997 to investigate the factors that influence the HIW-producing factories' decisions regarding the use of this facility. The survey reveals a distinct pattern: the majority of the non-users of the facility are small- and medium-scale factories, small generators of hazardous wastes, non-recipients of Board of Investment (BOI) concessions, and located far away from the facility. The results from a multiple regression analysis show that a factory's decision on usage of the facility is positively influenced by its size, amount of wastes generated, knowledge of the facility, pressure from regulators, and consideration of employee and community welfare. It is negatively influenced by distance and costs of using the treatment facility. PMID:12549660

  4. Social Uptake of Scientific Understanding of Seismic Hazard in Sumatra and Cascadia

    NASA Astrophysics Data System (ADS)

    Shannon, R.; McCloskey, J.; Guyer, C.; McDowell, S.; Steacy, S.

    2007-12-01

    The importance of science within hazard mitigation cannot be overstated. Robust mitigation policies rely strongly on a sound understanding of the science underlying potential natural disasters and the transference of that knowledge from the scientific community to the general public via governments and policy makers. We aim to investigate how and why the public's knowledge, perceptions, response, adjustments and values towards science have changed throughout two decades of research conducted in areas along and adjacent to the Sumatran and Cascadia subduction zones. We will focus on two countries subject to the same potential hazard, but which encompass starkly contrasting political, economic, social and environmental settings. The transfer of scientific knowledge into the public/social arena is a complex process, the success of which is reflected in a community's ability to withstand large-scale devastating events. Although no one could have foreseen the magnitude of the 2004 Boxing Day tsunami, the social devastation generated underscored the stark absence of mitigation measures in the nations most heavily affected. It furthermore emphasized the need for the design and implementation of disaster preparedness measures. A survey of existing literature has already established timelines for major events and public policy changes in the case study areas. Clear evidence exists of the link between scientific knowledge and its subsequent translation into public policy, particularly in the Cascadia context. The initiation of the National Tsunami Hazard Mitigation Program following the Cape Mendocino earthquake in 1992 embodies this link. Despite a series of environmental disasters with recorded widespread fatalities dating back to the mid 1900s and a heightened impetus for scientific research into tsunami/earthquake hazard following the 2004 Boxing Day tsunami, the translation of science into the public realm is not widely obvious in the Sumatran context. This research aims to further investigate how the enhanced understanding of earthquake and tsunami hazards is being used to direct hazard mitigation strategies and enables direct comparison with the scientific and public policy developments in Cascadia.

  5. 44 CFR 65.5 - Revision to special hazard area boundaries with no change to base flood elevation determinations.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ...hazard area boundaries with no change to base flood elevation determinations. 65.5 Section...INSURANCE AND HAZARD MITIGATION National Flood Insurance Program IDENTIFICATION AND...hazard area boundaries with no change to base flood elevation determinations. (a)...

  6. 44 CFR 65.5 - Revision to special hazard area boundaries with no change to base flood elevation determinations.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ...hazard area boundaries with no change to base flood elevation determinations. 65.5 Section...INSURANCE AND HAZARD MITIGATION National Flood Insurance Program IDENTIFICATION AND...hazard area boundaries with no change to base flood elevation determinations. (a)...

  7. 44 CFR 65.5 - Revision to special hazard area boundaries with no change to base flood elevation determinations.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ...hazard area boundaries with no change to base flood elevation determinations. 65.5 Section...INSURANCE AND HAZARD MITIGATION National Flood Insurance Program IDENTIFICATION AND...hazard area boundaries with no change to base flood elevation determinations. (a)...

  8. 44 CFR 65.5 - Revision to special hazard area boundaries with no change to base flood elevation determinations.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ...hazard area boundaries with no change to base flood elevation determinations. 65.5 Section...INSURANCE AND HAZARD MITIGATION National Flood Insurance Program IDENTIFICATION AND...hazard area boundaries with no change to base flood elevation determinations. (a)...

  9. 44 CFR 65.5 - Revision to special hazard area boundaries with no change to base flood elevation determinations.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ...hazard area boundaries with no change to base flood elevation determinations. 65.5 Section...INSURANCE AND HAZARD MITIGATION National Flood Insurance Program IDENTIFICATION AND...hazard area boundaries with no change to base flood elevation determinations. (a)...

  10. Study of cover source mismatch in steganalysis and ways to mitigate its impact

    NASA Astrophysics Data System (ADS)

    Kodovský, Jan; Sedighi, Vahid; Fridrich, Jessica

    2014-02-01

    When a steganalysis detector trained on one cover source is applied to images from a different source, the detection error generally increases due to the mismatch between the two sources. In steganography, this situation is recognized as the so-called cover source mismatch (CSM). The drop in detection accuracy depends on many factors, including the properties of both sources, the detector construction, the feature space used to represent the covers, and the steganographic algorithm. Although well recognized as the single most important factor negatively affecting the performance of steganalyzers in practice, the CSM has received surprisingly little attention from researchers. One of the reasons for this is the diversity with which the CSM can manifest. In a series of experiments in the spatial and JPEG domains, we refute some common misconceptions, namely that the severity of the CSM is tied to the feature dimensionality or to the features' "fragility." The CSM impact on detection appears too difficult to predict due to the effect of complex dependencies among the features. We also investigate ways to mitigate the negative effect of the CSM using simple measures, such as enlarging the diversity of the training set (training on a mixture of sources) and employing a bank of detectors trained on multiple different sources and testing with the detector trained on the closest source.
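    The two mitigation strategies mentioned above can be sketched as follows; synthetic features and a generic classifier stand in for real steganalysis features and detectors, so everything here is an assumption for illustration.

```python
# (a) one detector trained on a mixture of cover sources, versus
# (b) a bank of per-source detectors, testing with the closest source.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(1)

def make_source(shift, n=300, d=20):
    X = rng.normal(loc=shift, size=(n, d))
    y = rng.integers(0, 2, size=n)        # 0 = cover, 1 = stego
    X[y == 1] += 0.3                      # crude stand-in for an embedding signature
    return X, y

sources = {name: make_source(s)
           for name, s in {"camera_A": 0.0, "camera_B": 1.0, "scanner": 2.0}.items()}

# (a) mixture-trained detector
X_mix = np.vstack([X for X, _ in sources.values()])
y_mix = np.concatenate([y for _, y in sources.values()])
mixture_detector = RandomForestClassifier(random_state=0).fit(X_mix, y_mix)

# (b) bank of detectors, one per training source
bank = {name: RandomForestClassifier(random_state=0).fit(X, y)
        for name, (X, y) in sources.items()}

X_test, y_test = make_source(0.9)         # unseen source, closest to camera_B
closest = min(sources, key=lambda n: np.linalg.norm(sources[n][0].mean(0)
                                                    - X_test.mean(0)))
print("closest source:", closest)
print("mixture accuracy:", round(mixture_detector.score(X_test, y_test), 3))
print("bank accuracy:   ", round(bank[closest].score(X_test, y_test), 3))
```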

  11. Conforth Ranch Wildlife Mitigation Feasibility Study, McNary, Oregon : Annual Report.

    SciTech Connect

    Rasmussen, Larry; Wright, Patrick; Giger, Richard

    1991-03-01

    The 2,860-acre Conforth Ranch near Umatilla, Oregon is being considered for acquisition and management to partially mitigate wildlife losses associated with McNary Hydroelectric Project. The Habitat Evaluation Procedures (HEP) estimated that management for wildlife would result in habitat unit gains of 519 for meadowlark, 420 for quail, 431 for mallard, 466 for Canada goose, 405 for mink, 49 for downy woodpecker, 172 for yellow warbler, and 34 for spotted sandpiper. This amounts to a total combined gain of 2,495 habitat units -- a 110 percent increase over the existing values for these species combined of 2,274 habitat units. Current water delivery costs, estimated at $50,000 per year, are expected to increase to $125,000 per year. A survey of local interest indicated a majority of respondents favored the concept with a minority opposed. No contaminants that would preclude the Fish and Wildlife Service from agreeing to accept the property were identified. 21 refs., 3 figs., 5 tabs.
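    As a quick arithmetic check of the figures quoted above (values as reported; the per-species gains sum to roughly the stated combined total, with small differences attributable to rounding):

```python
# Sum the reported per-species habitat-unit gains and compare the combined gain
# with the existing 2,274 habitat units (about a 110 percent increase).
gains = {"meadowlark": 519, "quail": 420, "mallard": 431, "Canada goose": 466,
         "mink": 405, "downy woodpecker": 49, "yellow warbler": 172,
         "spotted sandpiper": 34}
existing = 2274

total_gain = sum(gains.values())
print(total_gain)                              # ~2,495 habitat units as reported
print(round(100 * total_gain / existing))      # ~110 percent increase
```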

  12. First Production of C60 Nanoparticle Plasma Jet for Study of Disruption Mitigation for ITER

    NASA Astrophysics Data System (ADS)

    Bogatu, I. N.; Thompson, J. R.; Galkin, S. A.; Kim, J. S.; Brockington, S.; Case, A.; Messer, S. J.; Witherspoon, F. D.

    2012-10-01

    Unique fast response and large mass-velocity delivery of nanoparticle plasma jets (NPPJs) provide a novel application for ITER disruption mitigation, runaway electrons diagnostics and deep fueling. NPPJs carry a much larger mass than usual gases. An electromagnetic plasma gun provides a very high injection velocity (many km/s). NPPJ has much higher ram pressure than any standard gas injection method and penetrates the tokamak confining magnetic field. Assimilation is enhanced due to the NP large surface-to-volume ratio. Radially expanding NPPJs help achieve toroidal uniformity of radiation power. FAR-TECH's NPPJ system was successfully tested: a coaxial plasma gun prototype (~35 cm length, 96 kJ energy) using a solid state TiH2/C60 pulsed power cartridge injector produced a hyper-velocity (>4 km/s), high-density (>10^23 m^-3) C60 plasma jet in ~0.5 ms, with ~1-2 ms overall response-delivery time. We present the TiH2/C60 cartridge injector output characterization (~180 mg of sublimated C60 gas) and first production results of a high momentum C60 plasma jet (~0.6 g·km/s).

  13. Implications of Adhesion Studies for Dust Mitigation on Thermal Control Surfaces

    NASA Technical Reports Server (NTRS)

    Gaier, James R.; Berkebile, Stephen P.

    2012-01-01

    Experiments measuring the adhesion forces under ultrahigh vacuum conditions (10^-10 torr) between a synthetic volcanic glass and commonly used space exploration materials have recently been described. The glass has a chemistry and surface structure typical of the lunar regolith. It was found that Van der Waals forces between the glass and common spacecraft materials were negligible. Charge transfer between the materials was induced by mechanically striking the spacecraft material pin against the glass plate. No measurable adhesion occurred when striking the highly conducting materials; however, on striking insulating dielectric materials the adhesion increased dramatically. This indicates that electrostatic forces dominate over Van der Waals forces under these conditions. The presence of small amounts of surface contaminants was found to lower adhesive forces by at least two orders of magnitude, and perhaps more. Both particle and space exploration material surfaces will be cleaned by the interaction with the solar wind and other energetic processes and will stay clean because of the extremely high vacuum (10^-12 torr), so the atomically clean adhesion values are probably the relevant ones for the lunar surface environment. These results are used to interpret the results of dust mitigation technology experiments utilizing textured surfaces, work-function-matching surfaces and brushing. They have also been used to reinterpret the results of the Apollo 14 Thermal Degradation Samples experiment.

  14. Viscoelastic Materials Study for the Mitigation of Blast-Related Brain Injury

    NASA Astrophysics Data System (ADS)

    Bartyczak, Susan; Mock, Willis, Jr.

    2011-06-01

    Recent preliminary research into the causes of blast-related brain injury indicates that exposure to blast pressures, such as from IED detonation or multiple firings of a weapon, causes damage to brain tissue resulting in Traumatic Brain Injury (TBI) and Post Traumatic Stress Disorder (PTSD). Current combat helmets are not sufficient to protect the warfighter from this danger and the effects are debilitating, costly, and long-lasting. Commercially available viscoelastic materials, designed to dampen vibration caused by shock waves, might be useful as helmet liners to dampen blast waves. The objective of this research is to develop an experimental technique to test these commercially available materials when subject to blast waves and evaluate their blast mitigating behavior. A 40-mm-bore gas gun is being used as a shock tube to generate blast waves (ranging from 1 to 500 psi) in a test fixture at the gun muzzle. A fast opening valve is used to release nitrogen gas from the breech to impact instrumented targets. The targets consist of aluminum/ viscoelastic polymer/ aluminum materials. Blast attenuation is determined through the measurement of pressure and accelerometer data in front of and behind the target. The experimental technique, calibration and checkout procedures, and results will be presented.

  15. Concerns About Climate Change Mitigation Projects: Summary of Findings from Case Studies in Brazil, India, Mexico, and South Africa

    SciTech Connect

    Sathaye, Jayant A.; Andrasko, Kenneth; Makundi, Willy; La Rovere, Emilio Lebre; Ravinandranath, N.H.; Melli, Anandi; Rangachari, Anita; Amaz, Mireya; Gay, Carlos; Friedmann, Rafael; Goldberg, Beth; van Horen, Clive; Simmonds, Gillina; Parker, Gretchen

    1998-11-01

    The concept of joint implementation as a way to implement climate change mitigation projects in another country has been controversial ever since its inception. Developing countries have raised numerous issues at the project-specific technical level, and broader concerns having to do with equity and burden sharing. This paper summarizes the findings of studies for Brazil, India, Mexico and South Africa, four countries that have large greenhouse gas emissions and are heavily engaged in the debate on climate change projects under the Kyoto Protocol. The studies examine potential or current projects/programs to determine whether eight technical concerns about joint implementation can be adequately addressed. They conclude that about half the concerns were minor or well managed by project developers, but concerns about additionality of funds, host country institutions and guarantees of performance (including the issues of baselines and possible leakage) need much more effort to be adequately addressed. All the papers agree on the need to develop institutional arrangements for approving and monitoring such projects in each of the countries represented. The case studies illustrate that these projects have the potential to bring new technology, investment, employment and ancillary socioeconomic and environmental benefits to developing countries. These benefits are consistent with the goal of sustainable development in the four study countries. At a policy level, the studies' authors note that in their view, the Annex I countries should consider limits on the use of jointly implemented projects as a way to get credits against their own emissions at home, and stress the importance of industrialized countries developing new technologies that will benefit all countries. The authors also observe that if all countries accepted caps on their emissions (with a longer time period allowed for developing countries to do so) project-based GHG mitigation would be significantly facilitated by the improved private investment climate.

  16. GIS-based pollution hazard mapping and assessment framework of shallow lakes: southeastern Pampean lakes (Argentina) as a case study.

    PubMed

    Romanelli, A; Esquius, K S; Massone, H E; Escalante, A H

    2013-08-01

    The assessment of water vulnerability and pollution hazard traditionally places particular emphasis on the study of groundwaters more than on surface waters. Consequently, a GIS-based Lake Pollution Hazard Index (LPHI) was proposed for assessing and mapping the potential pollution hazard for shallow lakes due to the interaction between the Potential Pollutant Load and the Lake Vulnerability. It includes easily measurable and commonly used parameters: land cover, terrain slope and direction, and soil media. Three shallow lake ecosystems of the southeastern Pampa Plain (Argentina) were chosen to test the usefulness and applicability of this suggested index. Moreover, the influence of the anthropogenic and natural medium on biophysical parameters in these three ecosystems was examined. The evaluation of the LPHI map shows the highest pollution hazard for La Brava and Los Padres lakes (≥30 % in the high to very high category), while Nahuel Rucá Lake seems to be the least hazardous water body (just 9.33 % with high LPHI). The increase in LPHI value is attributed to a different loading of pollutants governed by land cover category and/or the exposure to high slopes and the influence of slope direction. Dissolved oxygen and biochemical oxygen demand values indicate a moderately polluted and eutrophized condition of the shallow lake waters, mainly related to moderate agricultural activities and/or cattle production. The information obtained by means of the LPHI calculation is useful for performing a local diagnosis of the potential pollution hazard to a freshwater ecosystem in order to implement basic guidelines to improve lake sustainability. PMID:23355019
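    A minimal sketch of how such an index can be assembled on a raster grid; the weighting, the synthetic layers and the class breaks below are illustrative assumptions, not the published method.

```python
# Combine a Potential Pollutant Load layer with a Lake Vulnerability layer on a
# raster grid and classify the resulting hazard index.
import numpy as np

rng = np.random.default_rng(2)
shape = (50, 50)                               # raster cells around a lake

# Normalised input layers in [0, 1] (land cover pressure, slope, soil media, ...)
pollutant_load = rng.uniform(size=shape)       # e.g. driven by land cover
vulnerability = 0.5 * rng.uniform(size=shape) + 0.5 * rng.uniform(size=shape)

lphi = pollutant_load * vulnerability          # interaction of the two layers
category = np.digitize(lphi, bins=[0.2, 0.4, 0.6, 0.8])   # 0=very low .. 4=very high

high_or_worse = np.mean(category >= 3) * 100
print(f"{high_or_worse:.1f}% of cells fall in the high/very-high classes")
```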

  17. Property-close source separation of hazardous waste and waste electrical and electronic equipment - A Swedish case study

    SciTech Connect

    Bernstad, Anna, E-mail: anna.bernstad@chemeng.lth.se [Dep. of Chem. Eng., Faculty of Eng., Lund University, Lund (Sweden); Cour Jansen, Jes la [Dep. of Chem. Eng., Faculty of Eng., Lund University, Lund (Sweden); Aspegren, Henrik [VA SYD, City of Malmoe (Sweden)

    2011-03-15

    Through an agreement with EEE producers, Swedish municipalities are responsible for the collection of hazardous waste and waste electrical and electronic equipment (WEEE). In most Swedish municipalities, collection of these waste fractions is concentrated at waste recycling centres where households can source-separate and deposit hazardous waste and WEEE free of charge. However, the centres are often located on the outskirts of city centres, and in most cases cars are needed in order to use the facilities. A full-scale experiment was performed in a residential area in southern Sweden to evaluate the effects of a system for property-close source separation of hazardous waste and WEEE. After the system was introduced, results show a clear reduction in the amount of hazardous waste and WEEE disposed of incorrectly amongst residual waste or dry recyclables. The system resulted in a source separation ratio of 70 wt% for hazardous waste and 76 wt% in the case of WEEE. Results show that households in the study area were willing to increase source separation of hazardous waste and WEEE when accessibility was improved and that this and similar collection systems can play an important role in building up increasingly sustainable solid waste management systems.

  18. Economic optimization of natural hazard protection - conceptual study of existing approaches

    NASA Astrophysics Data System (ADS)

    Spackova, Olga; Straub, Daniel

    2013-04-01

    Risk-based planning of protection measures against natural hazards has become a common practice in many countries. The selection procedure aims at identifying an economically efficient strategy with regard to the estimated costs and risk (i.e. expected damage). A correct setting of the evaluation methodology and decision criteria should ensure an optimal selection of the portfolio of risk protection measures under a limited state budget. To demonstrate the efficiency of investments, indicators such as Benefit-Cost Ratio (BCR), Marginal Costs (MC) or Net Present Value (NPV) are commonly used. However, the methodologies for efficiency evaluation differ amongst different countries and different hazard types (floods, earthquakes etc.). Additionally, several inconsistencies can be found in the applications of the indicators in practice. This is likely to lead to a suboptimal selection of the protection strategies. This study provides a general formulation for optimization of the natural hazard protection measures from a socio-economic perspective. It assumes that all costs and risks can be expressed in monetary values. The study regards the problem as a discrete hierarchical optimization, where the state level sets the criteria and constraints, while the actual optimization is made on the regional level (towns, catchments) when designing particular protection measures and selecting the optimal protection level. The study shows that in case of an unlimited budget, the task is quite trivial, as it is sufficient to optimize the protection measures in individual regions independently (by minimizing the sum of risk and cost). However, if the budget is limited, the need for an optimal allocation of resources amongst the regions arises. To ensure this, minimum values of BCR or MC can be required by the state, which must be achieved in each region. The study investigates the meaning of these indicators in the optimization task at the conceptual level and compares their suitability. To illustrate the theoretical findings, the indicators are tested on a hypothetical example of five regions with different risk levels. Last but not least, political and societal aspects and limitations in the use of the risk-based optimization framework are discussed.
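    The indicators discussed above can be made concrete for a single region with a short sketch; the discount rate, planning horizon and damage figures below are illustrative assumptions, not values from the study.

```python
# Expected annual risk discounted to a net present value, compared with the cost
# of each candidate protection level; objective is to minimise cost + residual risk.
def npv(annual_amount, rate=0.03, horizon=50):
    """Present value of a constant annual amount over the planning horizon."""
    return sum(annual_amount / (1 + rate) ** t for t in range(1, horizon + 1))

# (protection level, investment cost, expected annual damage after protection)
options = [("no protection",   0.0,    4.0e6),
           ("30-year design",  8.0e6,  1.5e6),
           ("100-year design", 20.0e6, 0.4e6)]

baseline_risk = npv(options[0][2])
for name, cost, annual_damage in options:
    residual_risk = npv(annual_damage)
    total = cost + residual_risk               # objective: minimise cost + risk
    benefit = baseline_risk - residual_risk    # avoided risk vs. doing nothing
    bcr = benefit / cost if cost > 0 else float("inf")
    print(f"{name:16s} total={total/1e6:6.1f} M  BCR={bcr:5.2f}")
```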

  19. 44 CFR 201.6 - Local Mitigation Plans.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ...OF HOMELAND SECURITY DISASTER ASSISTANCE MITIGATION...jurisdiction's commitment to reduce risks from natural hazards...the effects of natural disasters, the planning process...was involved. (2) A risk assessment that...

  20. 44 CFR 201.6 - Local Mitigation Plans.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ...OF HOMELAND SECURITY DISASTER ASSISTANCE MITIGATION...jurisdiction's commitment to reduce risks from natural hazards...the effects of natural disasters, the planning process...was involved. (2) A risk assessment that...

  1. 44 CFR 201.6 - Local Mitigation Plans.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ...OF HOMELAND SECURITY DISASTER ASSISTANCE MITIGATION...jurisdiction's commitment to reduce risks from natural hazards...the effects of natural disasters, the planning process...was involved. (2) A risk assessment that...

  2. 44 CFR 201.6 - Local Mitigation Plans.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ...OF HOMELAND SECURITY DISASTER ASSISTANCE MITIGATION...jurisdiction's commitment to reduce risks from natural hazards...the effects of natural disasters, the planning process...was involved. (2) A risk assessment that...

  3. Space Debris & its Mitigation

    NASA Astrophysics Data System (ADS)

    Kaushal, Sourabh; Arora, Nishant

    2012-07-01

    Space debris has become a growing concern in recent years, since collisions at orbital velocities can be highly damaging to functioning satellites and can also produce even more space debris in the process. Some spacecraft, like the International Space Station, are now armored to deal with this hazard, but armor and mitigation measures can be prohibitively costly when trying to protect satellites or human spaceflight vehicles like the shuttle. This paper describes the current orbital debris environment, outlines its main sources, and identifies mitigation measures to reduce orbital debris growth by controlling these sources. We studied the literature on space debris, propose some methods to address the problem, highlight the shortcomings of methods already proposed by space experts, and suggest modifications to those methods. Some of them can be very effective for debris mitigation, but others need modification. Recently proposed methods include collision-avoidance maneuvers, shielding of a space elevator with foil, vaporizing or redirecting space debris back to Earth with lasers, use of aerogel as a protective layer, construction of large junkyards around the International Space Station, use of electrodynamic tethers, and, most recently, the use of nano satellites for clearing space debris. Limitations of the already proposed methods are as follows: maneuvering cannot be the final solution to the problem, as it is only an act of self-defence; shielding cannot be applied to parts such as solar panels and optical devices; vaporizing or redirecting space debris can affect human life on Earth if it is not done properly; aerogel has a threshold limit up to which it can resist the impact of a collision; and large junkyards are effective only for large-sized debris. In this paper we propose: A. Use of nano tubes to create a mesh: the mesh acts like the touch panel of a touch-screen cell phone, so that when any small particle touches it, a processor or sensor registers its coordinates and a destructive laser beam can then be used to destroy the particle. B. Use of nano tubes and nano bots for the collection of space debris: a nano mesh made of nano tubes is again arranged to act as a touch panel, and when tiny particles strike it, nano bots at the registered coordinates collect the particles and store them in a garbage store. C. Reuse of space debris for other purposes: since space debris can be any tiny particle in space, instead of destroying such particles they could be used for energy production in fuel cells, provided the particle material can form an ionized liquid or solution suitable for a fuel cell; this is useful only for large projects where even small amounts of energy are of great value. D. Recycling of space debris: the general idea of making space structures by recycling space debris is to capture the aluminum of upper stages, melt it, and form it into new aluminum structures, perhaps by coating the inside of inflatable balloons, to make very large structures of thin aluminum shells. CONCLUSION: Space debris has become a topic of great concern in recent years. Space debris creation cannot be stopped completely, but it can be minimized by adopting some measures. Many methods of space debris mitigation have been proposed by space experts, but some of them have limitations. After some

  4. Stakeholders’ perception in identification of river bank erosion hazard: a case study

    Microsoft Academic Search

    Bela Das

    2011-01-01

    Hazards due to riverbank erosion, despite usually being considered a natural phenomenon, have become a critical problem in recent times, as the introduction of new technology and a one-sided, engineering-based solution approach to combat natural hazards, without taking into account the opinions of all categories of stakeholders, particularly the hazard victims and policy makers, has aggravated the problem in many

  5. Creative mitigation

    SciTech Connect

    Ayer, F.; Lagassa, G.

    1989-10-01

    On May 9, 1989, in front of a small but enthusiastic group composed of residents of Columbia Falls, Maine, and Downeast fishermen, a crew of Bangor Hydro-Electric employees removed some of the wooden sections of the Columbia Falls dam. The dam is located at the mouth of the Pleasant River on the Maine seacoast, only thirty miles from the border between Maine and New Brunswick, Canada. In so doing, they provided unobstructed access by Atlantic salmon to crucial upstream aquatic habitat for the first time since the dam was constructed in 1981. At the same time they made possible the efficient operation of a 13 MW hydroelectric facility some 75 miles inland at West Enfield, Maine, on the Penobscot River. This article describes the creative strategies used by Bangor Pacific Hydro Associates to satisfy environmental mitigation requirements at West Enfield, Maine.

  6. 44 CFR 66.3 - Establishment of community case file and flood elevation study docket.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ...Establishment of community case file and flood elevation study docket. 66.3 Section...INSURANCE AND HAZARD MITIGATION National Flood Insurance Program CONSULTATION WITH LOCAL...Establishment of community case file and flood elevation study docket. (a)...

  7. 44 CFR 66.3 - Establishment of community case file and flood elevation study docket.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ...Establishment of community case file and flood elevation study docket. 66.3 Section...INSURANCE AND HAZARD MITIGATION National Flood Insurance Program CONSULTATION WITH LOCAL...Establishment of community case file and flood elevation study docket. (a)...

  8. 44 CFR 66.3 - Establishment of community case file and flood elevation study docket.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ...Establishment of community case file and flood elevation study docket. 66.3 Section...INSURANCE AND HAZARD MITIGATION National Flood Insurance Program CONSULTATION WITH LOCAL...Establishment of community case file and flood elevation study docket. (a)...

  9. 44 CFR 66.3 - Establishment of community case file and flood elevation study docket.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ...Establishment of community case file and flood elevation study docket. 66.3 Section...INSURANCE AND HAZARD MITIGATION National Flood Insurance Program CONSULTATION WITH LOCAL...Establishment of community case file and flood elevation study docket. (a)...

  10. Case study: Mapping tsunami hazards associated with debris flow into a reservoir

    USGS Publications Warehouse

    Walder, J.S.; Watts, P.; Waythomas, C.F.

    2006-01-01

    Debris-flow generated impulse waves (tsunamis) pose hazards in lakes, especially those used for hydropower or recreation. We describe a method for assessing tsunami-related hazards for the case in which inundation by coherent water waves, rather than chaotic splashing, is of primary concern. The method involves an experimentally based initial condition (tsunami source) and a Boussinesq model for tsunami propagation and inundation. Model results are used to create hazard maps that offer guidance for emergency planners and responders. An example application explores tsunami hazards associated with potential debris flows entering Baker Lake, a reservoir on the flanks of the Mount Baker volcano in the northwestern United States. © 2006 ASCE.

  11. Natural phenomena hazards, Hanford Site, Washington

    SciTech Connect

    Conrads, T.J.

    1998-09-29

    This document presents the natural phenomena hazard loads for use in implementing DOE Order 5480.28, Natural Phenomena Hazards Mitigation, and supports development of double-shell tank systems specifications at the Hanford Site in south-central Washington State. The natural phenomena covered are seismic, flood, wind, volcanic ash, lightning, snow, temperature, solar radiation, suspended sediment, and relative humidity.

  12. Geologic hazards and Alaska's communities in a changing climate

    NASA Astrophysics Data System (ADS)

    Wolken, G. J.

    2010-12-01

    Observations indicate that changes in climate modify or intensify geomorphic processes in high-latitude regions. Changes in these processes can increase the magnitude and frequency of geologic hazards, leading to casualties, damage to property and infrastructure, and a host of socio-economic problems. Numerous communities in Alaska are threatened by geologic hazards and are currently involved in adaptation or mitigation efforts to cope with these risks. In many communities, relocation is the preferred method for managing risk, but a lack of baseline geoscience data prohibits a sound evaluation of geologic hazards and recent landscape change and prevents informed community decision making. In an attempt to bridge this information gap, the Climate Change Hazards Program at the Alaska Division of Geological & Geophysical Surveys (DGGS) is collecting baseline geoscience data, quantifying landscape change, and conducting hazards assessments in and around imperiled communities in Alaska. An important and challenging step in each study is effectively communicating scientific results to community residents, other government agencies, and policy makers, which requires communication beyond peer-reviewed publications. Community visits, public meetings, and workshops are potentially important mechanisms for disseminating geologic hazards information to stakeholders in Alaska. Current DGGS pilot projects in the areas of Kivalina and Koyukuk illustrate the need for conducting geologic hazards assessments and properly disseminating scientific information.

  13. Status of volcanic hazard studies for the Nevada Nuclear Waste Storage Investigations. Volume II

    SciTech Connect

    Crowe, B.M.; Wohletz, K.H.; Vaniman, D.T.; Gladney, E.; Bower, N.

    1986-01-01

    Volcanic hazard investigations during FY 1984 focused on five topics: the emplacement mechanism of shallow basalt intrusions, geochemical trends through time for volcanic fields of the Death Valley-Pancake Range volcanic zone, the possibility of bimodal basalt-rhyolite volcanism, the age and process of enrichment for incompatible elements in young basalts of the Nevada Test Site (NTS) region, and the possibility of hydrovolcanic activity. The stress regime of Yucca Mountain may favor formation of shallow basalt intrusions. However, combined field and drill-hole studies suggest shallow basalt intrusions are rare in the geologic record of the southern Great Basin. The geochemical patterns of basaltic volcanism through time in the NTS region provide no evidence for evolution toward a large-volume volcanic field or increases in future rates of volcanism. Existing data are consistent with a declining volcanic system comparable to the late stages of the southern Death Valley volcanic field. The hazards of bimodal volcanism in this area are judged to be low. The source of a 6-Myr pumice discovered in alluvial deposits of Crater Flat has not been found. Geochemical studies show that the enrichment of trace elements in the younger rift basalts must be related to an enrichment of their mantle source rocks. This geochemical enrichment event, which may have been metasomatic alteration, predates the basalts of the silicic episode and is, therefore, not a young event. Studies of crater dimensions of hydrovolcanic landforms indicate that the worst case scenario (exhumation of a repository at Yucca Mountain by hydrovolcanic explosions) is unlikely. Theoretical models of melt-water vapor explosions, particularly the thermal detonation model, suggest hydrovolcanic explosions are possible at Yucca Mountain. 80 refs., 21 figs., 5 tabs.

  14. Remedial Action Assessment System (RAAS): Evaluation of selected feasibility studies of CERCLA (Comprehensive Environmental Response, Compensation, and Liability Act) hazardous waste sites

    SciTech Connect

    Whelan, G. (Pacific Northwest Lab., Richland, WA (USA)); Hartz, K.E.; Hilliard, N.D. (Beck (R.W.) and Associates, Seattle, WA (USA))

    1990-04-01

    Congress and the public have mandated much closer scrutiny of the management of chemically hazardous and radioactive mixed wastes. Legislative language, regulatory intent, and prudent technical judgment call for using scientifically based studies to assess current conditions and to evaluate and select cost-effective strategies for mitigating unacceptable situations. The NCP requires that a Remedial Investigation (RI) and a Feasibility Study (FS) be conducted at each site targeted for remedial response action. The goal of the RI is to obtain the site data needed so that the potential impacts on public health or welfare or on the environment can be evaluated and so that the remedial alternatives can be identified and selected. The goal of the FS is to identify and evaluate alternative remedial actions (including a no-action alternative) in terms of their cost, effectiveness, and engineering feasibility. The NCP also requires the analysis of impacts on public health and welfare and on the environment; this analysis is the endangerment assessment (EA). In summary, the RI, EA, and FS processes require assessment of the contamination at a site, of the potential impacts on public health or the environment from that contamination, and of alternative RAs that could address potential impacts to the environment. 35 refs., 7 figs., 1 tab.

  15. Geomorphological surveys and software simulations for rock fall hazard assessment: a case study in the Italian Alps

    NASA Astrophysics Data System (ADS)

    Devoto, S.; Boccali, C.; Podda, F.

    2014-12-01

    In northern Italy, fast-moving landslides represent a significant threat to the population and human facilities. In the eastern portion of the Italian Alps, rock falls are recurrent and are often responsible for casualties or severe damage to roads and buildings. This type of landslide is frequent in mountain ranges characterised by strong relief energy and is triggered by earthquakes or copious rainfall, which often exceeds 2000 mm yr^-1. These factors cause morphological dynamics with intense slope erosion and degradation processes. This work investigates the appraisal of the rock fall hazard related to the presence of several large unstable blocks located at the top of a limestone peak, approximately 500 m NW of the village of Cimolais. Field surveys recognised a limestone block exceeding a volume of 400 m^3 and identified this block as the most hazardous for Cimolais Village because of its proximity to the rocky cliff. A first assessment of the possible transit and stop areas was made through in-depth traditional activities, such as geomorphological mapping and aerial photo analysis. The output of the field surveys was a detailed land use map, which provided a fundamental starting point for the rock fall software analysis. The geomorphological observations were correlated with DTMs derived from regional topography and Airborne Laser Scanning (ALS) surveys to recognise possible rock fall routes. To properly simulate rock fall trajectories with a hybrid computer program, particular attention was devoted to the correct quantification of input parameters, such as restitution coefficients and the horizontal acceleration associated with earthquakes, which historically occur in this portion of Italy. The simulation outputs regarding the distribution of rock fall end points and kinetic energy along the falling paths highlight the hazardous situation for Cimolais Village. For this reason, mitigation works have been suggested to immediately reduce the landslide risk. This proposal accounts for the high volume of the blocks, which, in the case of a fall, would render the passive mitigation measures already in place at the back of Cimolais worthless.
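    The trajectory modelling summarized above can be caricatured with a simple lumped-mass bounce model; the geometry, restitution coefficients and stopping rule below are illustrative assumptions, not the hybrid code used in the study.

```python
# A block bouncing down a planar slope that flattens after 40 m, with
# normal/tangential restitution coefficients applied at each impact.
import math

G = 9.81
BETA = math.radians(35.0)            # slope angle of the upper segment
SLOPE_LEN = 40.0                     # horizontal extent of the inclined segment
RN, RT = 0.35, 0.85                  # normal / tangential restitution

def ground_y(x):
    return -math.tan(BETA) * min(x, SLOPE_LEN)

def ground_normal(x):
    if x < SLOPE_LEN:
        return math.sin(BETA), math.cos(BETA)
    return 0.0, 1.0

x, y, vx, vy, dt = 0.0, 5.0, 1.0, 0.0, 1e-3   # block detaches 5 m above the slope
impacts = []
for _ in range(500_000):
    vy -= G * dt
    x, y = x + vx * dt, y + vy * dt
    if y < ground_y(x):
        nx, ny = ground_normal(x)
        vn = vx * nx + vy * ny
        if vn < 0.0:                           # moving into the ground: bounce
            tx, ty = ny, -nx                   # downslope tangent direction
            vt = vx * tx + vy * ty
            vn, vt = -RN * vn, RT * vt
            vx, vy = vn * nx + vt * tx, vn * ny + vt * ty
            impacts.append((x, 0.5 * (vx**2 + vy**2)))   # impact x, specific KE
            if impacts[-1][1] < 0.05:          # block has essentially stopped
                break
        y = ground_y(x)                        # keep the block on the surface

print(f"{len(impacts)} impacts, final position ~{x:.1f} m downrange")
```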

  16. CMMAD Usability Case Study in Support of Countermine and Hazard Sensing

    SciTech Connect

    Victor G. Walker; David I. Gertman

    2010-04-01

    During field trials, operator usability data were collected in support of lane clearing missions and hazard sensing for two robot platforms with Robot Intelligence Kernel (RIK) software and sensor scanning payloads onboard. The tests featured autonomous and shared robot autonomy levels, where tasking of the robot used a graphical interface featuring mine location and sensor readings. The goal of this work was to provide insights that could be used to further technology development. The efficacy of countermine systems in terms of mobility, search, path planning, detection, and localization was assessed. Findings from objective and subjective operator interaction measures are reviewed along with commentary from soldiers who took part in the study and strongly endorsed the system.

  17. Eco-efficiency for greenhouse gas emissions mitigation of municipal solid waste management: a case study of Tianjin, China.

    PubMed

    Zhao, Wei; Huppes, Gjalt; van der Voet, Ester

    2011-06-01

    The issue of municipal solid waste (MSW) management has been highlighted in China due to the continually increasing MSW volumes being generated and the limited capacity of waste treatment facilities. This article presents a quantitative eco-efficiency (E/E) analysis on MSW management in terms of greenhouse gas (GHG) mitigation. A methodology for E/E analysis has been proposed, with an emphasis on the consistent integration of life cycle assessment (LCA) and life cycle costing (LCC). The environmental and economic impacts derived from LCA and LCC have been normalized and defined as a quantitative E/E indicator. The proposed method was applied in a case study of Tianjin, China. The study assessed the current MSW management system, as well as a set of alternative scenarios, to investigate trade-offs between economy and GHG emissions mitigation. Additionally, contribution analysis was conducted on both LCA and LCC to identify key issues driving environmental and economic impacts. The results show that Tianjin's current MSW management system emits the most GHG and costs the least, whereas the situation reverses in the integrated scenario. The key issues identified by the contribution analysis show no linear relationship between the global warming impact and the cost impact in the MSW management system. The landfill gas utilization scenario is indicated as a potential optimum scenario by the proposed E/E analysis, given the characteristics of MSW, technology levels, and chosen methodologies. The E/E analysis provides an attractive direction towards sustainable waste management, though some questions with respect to uncertainty need to be discussed further. PMID:21316937
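    The normalization step described above can be sketched as follows; scenario names, figures, reference values and the equal weighting are illustrative assumptions rather than the study's results.

```python
# Normalise the life cycle GHG result and the life cycle cost of each scenario
# and combine them into a single eco-efficiency score (lower is better).
scenarios = {                         # (GHG, kt CO2-eq/yr), (cost, M CNY/yr)
    "current system":        (620.0, 180.0),
    "landfill gas recovery": (430.0, 210.0),
    "integrated scenario":   (300.0, 340.0),
}

ghg_ref = max(g for g, _ in scenarios.values())
cost_ref = max(c for _, c in scenarios.values())

for name, (ghg, cost) in scenarios.items():
    # lower is better for both terms; equal weighting assumed here
    ee = 0.5 * (ghg / ghg_ref) + 0.5 * (cost / cost_ref)
    print(f"{name:22s} normalised E/E score = {ee:.2f} (lower is better)")
```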

  18. Reducing aluminum dust explosion hazards: Case study of dust inerting in an aluminum buffing operation

    Microsoft Academic Search

    Timothy J. Myers

    2008-01-01

    Metal powders or dusts can represent significant dust explosion hazards in industry, due to their relatively low ignition energy and high explosivity. The hazard is well known in industries that produce or use aluminum powders, but is sometimes not recognized by facilities that produce aluminum dust as a byproduct of bulk aluminum processing. As demonstrated by the 2003 dust explosion

  19. AMERICAN HEALTHY HOMES SURVEY: A NATIONAL STUDY OF RESIDENTIAL RELATED HAZARDS

    EPA Science Inventory

    The US Environmental Protection Agency's (EPA) National Exposure Research Laboratory (NERL) and the US Department of Housing and Urban Development's (HUD) Office of Healthy Homes and Lead Hazard Control conducted a national survey of housing related hazards in US residences. The...

  20. Probabilistic Hazard Curves for Tornadic Winds, Wind Gusts, and Extreme Rainfall Events

    SciTech Connect

    Weber, A.H.

    1999-07-29

    This paper summarizes a study carried out at the Savannah River Site (SRS) for determining probabilistic hazard curves for tornadic winds, wind gusts, and extreme rainfall events. DOE Order 420.1, Facility Safety, outlines the requirements for Natural Phenomena Hazards (NPH) mitigation for new and existing DOE facilities. Specifically, NPH include tornadic winds, maximum wind gusts, and extreme rainfall events. Probabilistic hazard curves for each phenomenon indicate the recurrence frequency, and these hazard curves must be updated at least every 10 years to account for recent data, improved methodologies, or criteria changes. Also, emergency response exercises often use hypothetical weather data to initiate accident scenarios. The hazard curves in these reports provide a means to use extreme weather events based on models and measurements rather than scenarios that are created ad hoc, as is often the case.
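    A minimal sketch of how such a hazard curve can be derived from annual maxima; the Gumbel fit and the synthetic wind-gust series below are illustrative assumptions, not the SRS methodology.

```python
# Fit an extreme-value (Gumbel) distribution to synthetic annual maximum gusts
# and report annual exceedance probabilities and return periods.
import numpy as np
from scipy.stats import gumbel_r

rng = np.random.default_rng(3)
annual_max_gust = rng.gumbel(60.0, 8.0, size=40)        # mph, synthetic record

loc, scale = gumbel_r.fit(annual_max_gust)

for gust in (70, 80, 90, 100):
    p_exceed = gumbel_r.sf(gust, loc=loc, scale=scale)  # annual exceedance prob.
    print(f"{gust:3d} mph: P(exceed)/yr = {p_exceed:.4f}, "
          f"return period ~ {1.0 / p_exceed:,.0f} yr")
```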

  1. The Study on Ecological Treatment of Saline Lands to Mitigate the Effects of Climate Change

    NASA Astrophysics Data System (ADS)

    Xie, Jiancang; Zhu, Jiwei; Wang, Tao

    2010-05-01

    Soil water and salt movement is strongly influenced by frequent droughts, floods and climate change. Additionally, with continued population growth, large-scale reclamation of arable land and long-term unreasonable irrigation, saline land is increasing at a rate of 1,000,000 to 15,000,000 mu each year worldwide. In traditional management, the "drainage as the main measure" approach has a series of problems: larger works, more occupation of land, losses for water saving, and downstream pollution. In responding to global climate change, it has become a common understanding that energy saving and environmental protection should be promoted, the current model reconsidered, and an ecological management model explored. In this paper we take a severely saline area, Lubotan in Shaanxi Province, as an example. Through nearly 10 years of harnessing practice and observation of meteorological, hydrological and soil indicators, we analyze the influence of climate change on soil salinity movement in different seasons and years, and then put forward and apply a new model of saline land harnessing to mitigate the effects of climate change and allow the environment to self-rehabilitate. This model changes "drainage" to "storage": through engineering built around "storage as the main measure" and comprehensive "project - biology - agriculture" measures, saline land is being converted into arable land. By adapting to natural changes in climate, rainfall, irrigation return water and groundwater level, human intervention is reduced to achieve a dynamic system equilibrium. Over the ten years, the salt content of the plough horizon was reduced from 0.74% to 0.20%, organic matter increased from 0.7% to 0.92%, and various soil indicators began to improve. At the same time, water for irrigation, drainage pollution and investment costs were reduced. Through the model, 18,900 mu of severely saline land was reclaimed and 16,500 mu of new cultivated land was added, with significant overall efficiency, ensuring the coordinated development of "water - biology - environment" in the region. Application and promotion of the model can treat saline-alkali land and add cultivated land effectively while easing the pressure on urban construction land, promoting energy saving, emission reduction and ecological restoration, so that a resource-saving and environment-friendly society can be built and sustainable development of population, resources and environment realized.

  2. 24 CFR 51.205 - Mitigating measures.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ...Chemicals of an Explosive or Flammable Nature § 51.205 Mitigating measures...potential hazard of an explosion or fire prone nature is predicated on level topography with...eliminated or modified if: (a) The nature of the topography shields the...

  3. 24 CFR 51.205 - Mitigating measures.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ...Chemicals of an Explosive or Flammable Nature § 51.205 Mitigating measures...potential hazard of an explosion or fire prone nature is predicated on level topography with...eliminated or modified if: (a) The nature of the topography shields the...

  4. 24 CFR 51.205 - Mitigating measures.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ...Chemicals of an Explosive or Flammable Nature § 51.205 Mitigating measures...potential hazard of an explosion or fire prone nature is predicated on level topography with...eliminated or modified if: (a) The nature of the topography shields the...

  5. Natural Hazards, Second Edition

    NASA Astrophysics Data System (ADS)

    Rouhban, Badaoui

    Natural disaster loss is on the rise, and the vulnerability of the human and physical environment to the violent forces of nature is increasing. In many parts of the world, disasters caused by natural hazards such as earthquakes, floods, landslides, drought, wildfires, intense windstorms, tsunami, and volcanic eruptions have caused the loss of human lives, injury, homelessness, and the destruction of economic and social infrastructure. Over the last few years, there has been an increase in the occurrence, severity, and intensity of disasters, culminating with the devastating tsunami of 26 December 2004 in South East Asia. Natural hazards are often unexpected or uncontrollable natural events of varying magnitude. Understanding their mechanisms and assessing their distribution in time and space are necessary for refining risk mitigation measures. This second edition of Natural Hazards (following a first edition published in 1991 by Cambridge University Press), written by Edward Bryant, associate dean of science at Wollongong University, Australia, grapples with this crucial issue, aspects of hazard prediction, and other issues. The book presents a comprehensive analysis of different categories of hazards of climatic and geological origin.

  6. Distinguishing Realistic Military Blasts from Firecrackers in Mitigation Studies of Blast Induced Traumatic Brain Injury

    SciTech Connect

    Moss, W C; King, M J; Blackman, E G

    2011-01-21

    In their Contributed Article, Nyein et al. (1,2) present numerical simulations of blast waves interacting with a helmeted head and conclude that a face shield may significantly mitigate blast induced traumatic brain injury (TBI). A face shield may indeed be important for future military helmets, but the authors derive their conclusions from a much smaller explosion than typically experienced on the battlefield. The blast from the 3.16 gm TNT charge of (1) has the following approximate peak overpressures, positive phase durations, and incident impulses (3): 10 atm, 0.25 ms, and 3.9 psi-ms at the front of the head (14 cm from charge), and 1.4 atm, 0.32 ms, and 1.7 psi-ms at the back of a typical 20 cm head (34 cm from charge). The peak pressure of the wave decreases by a factor of 7 as it traverses the head. The blast conditions are at the threshold for injury at the front of the head, but well below threshold at the back of the head (4). The blast traverses the head in 0.3 ms, roughly equal to the positive phase duration of the blast. Therefore, when the blast reaches the back of the head, near ambient conditions exist at the front. Because the headform is so close to the charge, it experiences a wave with significant curvature. By contrast, a realistic blast from a 2.2 kg TNT charge (approximately an uncased 105 mm artillery round) is fatal at an overpressure of 10 atm (4). For an injury level (4) similar to (1), a 2.2 kg charge has the following approximate peak overpressures, positive phase durations, and incident impulses (3): 2.1 atm, 2.3 ms, and 18 psi-ms at the front of the head (250 cm from charge), and 1.8 atm, 2.5 ms, and 16.8 psi-ms at the back of the head (270 cm from charge). The peak pressure decreases by only a factor of 1.2 as it traverses the head. Because the 0.36 ms traversal time is much smaller than the positive phase duration, pressures on the head become relatively uniform when the blast reaches the back of the head. The larger standoff implies that the headform locally experiences a nearly planar blast wave. Also, the positive phase durations and blast impulses are much larger than those of (1). Consequently, the blast model used in (1) is spatially and temporally very different from a military blast. It would be useful to repeat the calculations using military blast parameters. Finally, (1) overlooks a significant part of (5). On page 1 and on page 3, (1) states that (5) did not consider helmet pads. But pages 3 and 4 of (5) present simulations of blast wave propagation across an ACH helmeted head form with and without pads. (5) states that when the pads are present, the 'underwash' of air under the helmet is blocked when compared to the case without. (1) reaches this same conclusion, but reports it as a new result rather than a confirmation of that already found in (5).
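    The comparison in this comment rests on simple ratios of the quoted blast parameters. The short calculation below merely reproduces those ratios from the figures given in the record (front and back peak overpressures, positive phase durations, and head traversal times); no additional blast physics is assumed.

    ```python
    # Values quoted in the record for the front and back of a ~20 cm head.
    charges = {
        "3.16 g TNT at 14 cm":  {"front_atm": 10.0, "back_atm": 1.4,
                                 "duration_ms": 0.25, "traversal_ms": 0.30},
        "2.2 kg TNT at 250 cm": {"front_atm": 2.1, "back_atm": 1.8,
                                 "duration_ms": 2.3, "traversal_ms": 0.36},
    }

    for name, c in charges.items():
        decay = c["front_atm"] / c["back_atm"]          # pressure drop across the head
        ratio = c["traversal_ms"] / c["duration_ms"]    # traversal time vs. positive phase
        print(f"{name}: front/back pressure = {decay:.1f}, "
              f"traversal/positive-phase = {ratio:.2f}")
    ```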

  7. Balancing Mitigation Against Impact: A Case Study From the 2005 Chicxulub Seismic Survey

    NASA Astrophysics Data System (ADS)

    Barton, P.; Diebold, J.; Gulick, S.

    2006-05-01

    In early 2005 the R/V Maurice Ewing conducted a large-scale deep seismic reflection-refraction survey offshore Yucatan, Mexico, to investigate the internal structure of the Chicxulub impact crater, centred on the coastline. Shots from a tuned 20-airgun, 6970 cu in array were recorded on a 6 km streamer and 25 ocean bottom seismometers (OBS). The water is exceptionally shallow to large distances offshore, reaching 30 m about 60 km from the land, making it unattractive to the larger marine mammals, although there are small populations of Atlantic and spotted dolphins living in the area, as well as several turtle breeding and feeding grounds on the Yucatan peninsula. In the light of calibrated tests of the Ewing's array (Tolstoy et al., 2004, Geophysical Research Letters 31, L14310), a 180 dB safety radius of 3.5 km around the gun array was adopted. An energetic campaign was organised by environmentalists opposing the work. In addition to the usual precautions of visual and listening watches by independent observers, gradual ramp-ups of the gun arrays, and power-downs or shut-downs for sightings, constraints were also placed to limit the survey to daylight hours and weather conditions not exceeding Beaufort 4. The operations were subject to several on-board inspections by the Mexican environmental authorities, causing logistical difficulties. Although less than 1% of the total working time was lost to shutdowns due to actual observation of dolphins or turtles, approximately 60% of the cruise time was taken up in precautionary inactivity. A diver in the water 3.5 km from the profiling ship reported that the sound in the water was barely noticeable, leading us to examine the actual sound levels recorded by both the 6 km streamer and the OBS hydrophones. The datasets are highly self-consistent, and give the same pattern of decay with distance past about 2 km offset, but with different overall levels: this may be due to geometry or calibration differences that are still under investigation. Both datasets indicate significantly lower levels than reported by Tolstoy et al. (2004). There was no evidence of environmental damage created by this survey. It can be concluded that the mitigation measures were extremely successful, but there is also a concern that the overhead cost of the environmental protection made this one of the most costly academic surveys ever undertaken, and that not all of this protection was necessary. In particular, the predicted 180 dB safety radius appeared to be overly conservative, even though based on calibrated measurements in very similar physical circumstances, and we suggest that these differences were a result of local seismic velocity structure in the water column and/or shallow seabed, which resulted in different partitioning of the energy. These results suggest that real time monitoring of hydrophone array data may provide a method of determining the safety radius dynamically, in response to local conditions.
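    Whether a 180 dB safety radius is conservative comes down to how quickly the received level falls off with range. A minimal sketch, assuming simple geometric spreading and a hypothetical array source level (it does not reproduce the calibrated Tolstoy et al. measurements or the velocity-structure effects discussed above), is:

    ```python
    import math

    def received_level_db(source_level_db: float, range_m: float,
                          spreading_coeff: float = 20.0) -> float:
        """Received level under simple geometric spreading:
        RL = SL - k * log10(r); k = 20 for spherical, 10 for cylindrical spreading."""
        return source_level_db - spreading_coeff * math.log10(range_m)

    def safety_radius_m(source_level_db: float, threshold_db: float,
                        spreading_coeff: float = 20.0) -> float:
        """Range at which the received level falls to the given threshold."""
        return 10 ** ((source_level_db - threshold_db) / spreading_coeff)

    SL = 250.0   # hypothetical array source level, dB re 1 uPa at 1 m
    print(f"Received level at 3.5 km: {received_level_db(SL, 3500.0):.1f} dB")
    print(f"180 dB radius: {safety_radius_m(SL, 180.0):.0f} m")
    ```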

  8. Case study to remove radioactive hazardous sludge from long horizontal storage tanks

    SciTech Connect

    Hylton, T.D.; Youngblood, E.L.; Cummins, R.L.

    1995-12-31

    The removal of radioactive hazardous sludge from waste tanks is a significant problem at several US Department of Energy (DOE) sites. The use of submerged jets produced by mixing pumps lowered into the supernatant/sludge interface to produce a homogeneous slurry is being studied at several DOE facilities. The homogeneous slurry can be pumped from the tanks to a treatment facility or alternative storage location. Most of the previous and current studies with this method are for flat-bottom tanks with vertical walls. Because of the difference in geometry, the results of these studies are not directly applicable to long horizontal tanks such as those used at the Oak Ridge National Laboratory. Mobilization and mixing studies were conducted with a surrogate sludge (e.g., kaolin clay) using submerged jets in two sizes of horizontal tanks. The nominal capacities of these tanks were 0.87 m³ (230 gal) and 95 m³ (25,000 gal). Mobilization efficiencies and mixing times were determined for single and bidirectional jets in both tanks with the discharge nozzles positioned at two locations in the tanks. Approximately 80% of the surrogate sludge was mobilized in the 95-m³ tank using a fixed bidirectional jet (inside diameter = 0.035 m) and a jet velocity of 6.4 m/s (21 ft/s).
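    For orientation, the reported jet parameters (0.035 m inside diameter, 6.4 m/s velocity) fix the volumetric flow and momentum flux delivered by each nozzle. The quick calculation below is only an illustrative back-of-the-envelope check, with an assumed slurry density, and is not part of the study:

    ```python
    import math

    d = 0.035        # nozzle inside diameter, m (from the record)
    v = 6.4          # jet velocity, m/s (from the record)
    rho = 1000.0     # assumed slurry density, kg/m^3 (illustrative)

    area = math.pi * d ** 2 / 4.0     # nozzle cross-sectional area, m^2
    q = area * v                      # volumetric flow per nozzle, m^3/s
    momentum_flux = rho * q * v       # jet momentum flux, N

    print(f"Flow per nozzle: {q * 1000:.2f} L/s")
    print(f"Momentum flux:   {momentum_flux:.1f} N")
    ```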

  9. Natural Hazards Observer, volume 2, number 4, June 1978

    Microsoft Academic Search

    A. White; P. Waterstone

    1978-01-01

    The Natural Hazards Observer is intended to strengthen communication between research workers and the individuals, organizations and agencies concerned with public action relating to natural hazards. The feature article concerns the strengthened commitment of the Federal Disaster Assistance Administration (FDAA) to hazard mitigation. Also included in this issue are discussions of: (1) preparation of 60-second public service announcements for radio

  10. Energy considerations in ground motion attenuation and probabilistic seismic hazard studies

    NASA Astrophysics Data System (ADS)

    Sari, Ali

    2003-06-01

    The severity of earthquake ground motion is generally characterized using peak ground acceleration and spectral acceleration, which are force-based or strength-based parameters. Modern seismic provisions adopt such strength-based design procedures, where seismic demand is represented in the form of an elastic response spectrum. Such design procedures implicitly account for the ductility capacity that a structure might possess by the use of reduction factors, but they do not include in a direct way consideration of the cyclic nature of the response that the structure undergoes and the resulting cumulative damage. In contrast with such strength-based parameters, energy-based parameters may be easily defined that include both the effects of oscillation amplitude as well as cycles experienced. Hence, these parameters may be expected to correlate better with structural damage. The correlation of structural damage measures with strength- and energy-based parameters is studied using inelastic dynamic analyses of reinforced concrete and steel buildings. The two earthquakes that occurred in Turkey in 1999 and the damage suffered by structures in those events offer a regional framework within which to study recorded ground motions to understand strength and energy demands over different distance ranges and for structures of different natural periods. This same framework, focused on a region in Northwestern Turkey, is used in probabilistic seismic hazard analysis (PSHA) studies where design ground motions associated with different return periods derived from both strength and energy considerations are compared for four sites as well as using maps for a larger area around Istanbul. Finally, new attenuation models for strength- and energy-based parameters are developed using an extensive database of earthquakes in Turkey. A noteworthy aspect of the new attenuation models is that they are developed using random effects models that account for distinctions between inter-event and intra-event variability in estimates of ground shaking. These attenuation models highlight differences in strength and energy demands felt at different sites. Results from hazard studies that use the new models are compared with those obtained for Western U.S. models and show some differences in design motions that are based on strength but smaller differences when energy is considered.
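    The attenuation models described here separate inter-event and intra-event variability with random effects. A minimal sketch of that general model form, with entirely hypothetical coefficients and standard deviations (not the models developed in this work), is:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical GMPE form: ln(Y) = c0 + c1*M + c2*ln(R + c3) + eta_event + eps_site
    c0, c1, c2, c3 = -2.0, 1.1, -1.3, 10.0
    tau, sigma = 0.30, 0.55   # inter-event and intra-event standard deviations (ln units)

    def simulate_ln_y(magnitude: float, distances_km: np.ndarray) -> np.ndarray:
        """Simulate ln(ground-motion parameter) at several sites for a single event."""
        eta = rng.normal(0.0, tau)                              # one inter-event term per event
        eps = rng.normal(0.0, sigma, size=distances_km.shape)   # intra-event term per site
        return c0 + c1 * magnitude + c2 * np.log(distances_km + c3) + eta + eps

    ln_y = simulate_ln_y(7.4, np.array([10.0, 30.0, 100.0]))
    print(np.exp(ln_y))   # simulated ground-motion values, arbitrary units
    ```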

  11. Safety in earth orbit study. Volume 2: Analysis of hazardous payloads, docking, on-board survivability

    NASA Technical Reports Server (NTRS)

    1972-01-01

    Detailed and supporting analyses are presented of the hazardous payloads, docking, and on-board survivability aspects connected with earth orbital operations of the space shuttle program. The hazards resulting from delivery, deployment, and retrieval of hazardous payloads, and from handling and transport of cargo between orbiter, sortie modules, and space station are identified and analyzed. The safety aspects of shuttle orbiter to modular space station docking include docking for assembly of space station, normal resupply docking, and emergency docking. Personnel traffic patterns, escape routes, and on-board survivability are analyzed for orbiter with crew and passengers, sortie modules, and modular space station, under normal, emergency, and EVA and IVA operations.

  12. 44 CFR 201.4 - Standard State Mitigation Plans.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ...provided through the Pre-disaster Mitigation (PDM...commitment to reduce risks from natural hazards...as well as the State risk assessment. The State...losses identified in the risk assessment. This section...State's pre- and post-disaster hazard...

  13. 44 CFR 201.4 - Standard State Mitigation Plans.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ...provided through the Pre-disaster Mitigation (PDM...commitment to reduce risks from natural hazards...as well as the State risk assessment. The State...losses identified in the risk assessment. This section...State's pre- and post-disaster hazard...

  14. 44 CFR 201.4 - Standard State Mitigation Plans.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ...provided through the Pre-disaster Mitigation (PDM...commitment to reduce risks from natural hazards...as well as the State risk assessment. The State...losses identified in the risk assessment. This section...State's pre- and post-disaster hazard...

  15. 44 CFR 201.4 - Standard State Mitigation Plans.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ...provided through the Pre-disaster Mitigation (PDM...commitment to reduce risks from natural hazards...as well as the State risk assessment. The State...losses identified in the risk assessment. This section...State's pre- and post-disaster hazard...

  16. Experimental investigation of hydrogen jet fire mitigation by barrier walls

    Microsoft Academic Search

    R. W. Schefer; E. G. Merilo; M. A. Groethe; W. G. Houf

    2011-01-01

    Hydrogen jet flames resulting from ignition of unintended releases can be extensive in length and pose significant radiation and impingement hazards. One possible mitigation strategy to reduce exposure to jet flames is to incorporate barriers around hydrogen storage and delivery equipment. While reducing the extent of unacceptable consequences, the walls may introduce other hazards if not properly configured. This paper

  17. Vulnerability studies and integrated assessments for hazard risk reduction in Pittsburgh, PA (Invited)

    NASA Astrophysics Data System (ADS)

    Klima, K.

    2013-12-01

    Today's environmental problems stretch beyond the bounds of most academic disciplines, and thus solutions require an interdisciplinary approach. For instance, the scientific consensus is that the frequency and severity of many types of extreme weather events are increasing (IPCC 2012). Yet despite our efforts to reduce greenhouse gases, we continue to experience severe weather events such as Superstorm Sandy, record heat and blizzards, and droughts. These natural hazards, combined with increased vulnerability and exposure, result in longer-lasting disruptions to critical infrastructure and business continuity throughout the world. In order to protect both our lives and the economy, we must think beyond the bounds of any one discipline to include an integrated assessment of relevant work. In the wake of recent events, New York City, Washington, DC, Chicago, and a myriad of other cities have turned to their academic powerhouses for assistance in better understanding their vulnerabilities. This talk will share a case study of the state of integrated assessments and vulnerability studies of energy, transportation, water, real estate, and other main sectors in Pittsburgh, PA. Then the talk will use integrated assessment models and other vulnerability studies to create coordinated sets of climate projections for use by the many public agencies and private-sector organizations in the region.

  18. Investigations on health hazards of chimney sweeps in Germany: results of a follow-up study.

    PubMed

    Letzel, S; Weber, A; Schaller, K H; Angerer, J; Iro, H; Waitz, G; Knorr-Held, F; Weltle, D; Lehnert, G

    1992-01-01

    Within the framework of a longitudinal study, 127 chimney sweeps from the area of Upper and Middle Franconia (Bavaria, Germany), who had participated in a first medical check-up in 1974, were offered follow-up examinations in 1990. Eighty-one subjects participated in these examinations; in addition individual occupational case histories and medical case histories were obtained for a further 15 and 35 chimney sweeps, respectively. Five test subjects had died before the evaluation deadline (August 15, 1990). The causes of death were a non-Hodgkin's lymphoma, a bladder carcinoma, pulmonary metastases with unknown primary tumour, a suicide and an acute myocardial infarction. Conspicuous results were carcinoma of the oesophagus in one case and leucoplakia of the mucous membranes in the mouth and pharyngeal region in three cases; furthermore one chimney sweep had two haemorrhagic lumps on his vocal cords. Taking into account important non-occupational hazards (alcohol and nicotine abuse) as possible causes of these changes and the lack of relevant occupational exposure to products of incineration over a number of years, none of these cases nor any of the other ascertained results could be considered likely to be causally related to occupational activities. Due to the small number of cases, an epidemiological risk evaluation did not seem useful. Comparison with the results of other chimney sweep studies published in the international literature is not helpful due to the differences in study design, the varying case frequencies, and the different conditions of exposure. PMID:1399014

  19. Geomatics application for geological and geomorphological mapping and landslide hazard evaluation: case studies in the Italian NW-Alps

    Microsoft Academic Search

    Marco Giardino; Enrico Borgogno Mondino; Luigi Perotti

    Theoretical assumptions, research methodologies, and applications of results of innovative geological and geomorphological mapping for landslide hazard evaluation in the alpine environment are presented in the paper. Case studies from the Italian NW-Alps highlight the role of Geomatics techniques in enhancing the accuracy of multitemporal reconstructions of geomorphic processes and landforms. Aerial and satellite remote sensing stereoscopy, derived DTMs, high resolution

  20. Laboratory Study of Polychlorinated Biphenyl (PCB) Contamination and Mitigation in Buildings -- Part 4. Evaluation of the Activated Metal Treatment System (AMTS) for On-site Destruction of PCBs

    EPA Science Inventory

    This is the fourth, also the last, report of the report series entitled “Laboratory Study of Polychlorinated Biphenyl (PCB) Contamination and Mitigation in Buildings.” This report evaluates the performance of an on-site PCB destruction method, known as the AMTS method, developed ...

  1. Laboratory Study of Polychlorinated Biphenyl Contamination and Mitigation in Buildings -- Part 4. Evaluation of the Activated Metal Treatment System (AMTS) for On-site Destruction of PCBs

    EPA Science Inventory

    This is the fourth, also the last, report of the report series entitled “Laboratory Study of Polychlorinated Biphenyl (PCB) Contamination and Mitigation in Buildings.” This report evaluates the performance of an on-site PCB destruction method, known as the AMTS method...

  2. Further RAGE modeling of asteroid mitigation: surface and subsurface explosions in porous objects

    SciTech Connect

    Weaver, Robert P [Los Alamos National Laboratory; Plesko, Catherine S [Los Alamos National Laboratory; Dearholt, William R [Los Alamos National Laboratory

    2011-01-03

    Disruption or mitigation of a potentially hazardous object (PHO) by a high-energy subsurface burst is considered. This is just one possible method of impact-hazard mitigation. We present RAGE hydrocode models of the shock-generated disruption of PHOs by subsurface nuclear bursts using scenario-specific models from realistic RADAR shape models. We will show 2D and 3D models for the disruption by a large energy source at the center of such PHO models (~100 kt-10 Mt) specifically for the shape of the asteroid 25143 Itokawa. We study the effects of non-uniform composition (rubble pile), shallow buried bursts for the optimal depth of burial and porosity.

  3. Exercise as an intervention for sedentary hazardous drinking college students: A pilot study

    PubMed Central

    Weinstock, Jeremiah; Capizzi, Jeffrey; Weber, Stefanie M.; Pescatello, Linda S.; Petry, Nancy M.

    2014-01-01

    Young adults 18–24 years have the highest rates of problems associated with alcohol use among all age groups, and substance use is inversely related to engagement in substance-free activities. This pilot study investigated the effect of promoting one specific substance-free activity, exercise, on alcohol use in college students. Thirty-one sedentary college students who engaged in hazardous drinking (Alcohol Use Disorders Identification Test scores ≥ 8) were randomized to one of two conditions: (a) one 50-minute session of motivational enhancement therapy (MET) focused on increasing exercise, or (b) one 50-minute session of MET focused on increasing exercise plus 8 weeks of contingency management (CM) for adhering to specific exercise activities. All participants completed evaluations at baseline and post-treatment (2 months later) assessing exercise participation and alcohol use. Results of the pilot study suggest the interventions were well received by participants, the MET+CM condition showed an increased self-reported frequency of exercise in comparison to the MET alone condition, but other indices of exercise, physical fitness, and alcohol use did not differ between the interventions over time. These results suggest that a larger scale trial could better assess efficacy of this well received combined intervention. Investigation in other clinically relevant populations is also warranted. PMID:24949085

  4. Role of human- and animal-sperm studies in the evaluation of male reproductive hazards

    SciTech Connect

    Wyrobek, A.J.; Gordon, L.; Watchmaker, G.

    1982-04-07

    Human sperm tests provide a direct means of assessing chemically induced spermatogenic dysfunction in man. Available tests include sperm count, motility, morphology (seminal cytology), and Y-body analyses. Over 70 different human exposures have been monitored in various groups of exposed men. The majority of exposures studied showed a significant change from control in one or more sperm tests. When carefully controlled, the sperm morphology test is statistically the most sensitive of these human sperm tests. Several sperm tests have been developed in nonhuman mammals for the study of chemical spermatotoxins. The sperm morphology test in mice has been the most widely used. Results with this test seem to be related to germ-cell mutagenicity. In general, animal sperm tests should play an important role in the identification and assessment of potential human reproductive hazards. Exposure to spermatotoxins may lead to infertility, and more importantly, to heritable genetic damage. While there are considerable animal and human data suggesting that sperm tests may be used to detect agents causing infertility, the extent to which these tests detect heritable genetic damage remains unclear. (ERB)

  5. Over-Pressurized Drums: Their Causes and Mitigation

    SciTech Connect

    Simmons, Fred; Kuntamukkula, Murty; Quigley, David; Robertson, Janeen; Freshwater, David

    2009-07-10

    Having to contend with bulging or over-pressurized drums is, unfortunately, a common event for people storing chemicals and chemical wastes. The Department of Energy alone reported over 120 incidents of bulging drums between 1992 and 1999 (1). Bulging drums can be caused by many different mechanisms, represent a number of significant hazards, and can be tricky to mitigate. In this article, we will discuss reasons or mechanisms by which drums can become over-pressurized, recognition of the hazards associated with and mitigation of over-pressurized drums, and methods that can be used to prevent drum over-pressurization from ever occurring. Drum pressurization can represent a significant safety hazard. Unless recognized and properly mitigated, improperly manipulated pressurized drums can result in employee exposure, employee injury, and environmental contamination. Therefore, recognition of when a drum is pressurized and knowledge of pressurized drum mitigation techniques are essential.

  6. Seismic hazard map of the western hemisphere

    USGS Publications Warehouse

    Shedlock, K.M.; Tanner, J.G.

    1999-01-01

    Vulnerability to natural disasters increases with urbanization and development of associated support systems (reservoirs, power plants, etc.). Catastrophic earthquakes account for 60% of worldwide casualties associated with natural disasters. Economic damage from earthquakes is increasing, even in technologically advanced countries with some level of seismic zonation, as shown by the 1989 Loma Prieta, CA ($6 billion), 1994 Northridge, CA ($25 billion), and 1995 Kobe, Japan (>$100 billion) earthquakes. The growth of megacities in seismically active regions around the world often includes the construction of seismically unsafe buildings and infrastructures, due to an insufficient knowledge of existing seismic hazard. Minimization of the loss of life, property damage, and social and economic disruption due to earthquakes depends on reliable estimates of seismic hazard. National, state, and local governments, decision makers, engineers, planners, emergency response organizations, builders, universities, and the general public require seismic hazard estimates for land use planning, improved building design and construction (including adoption of building construction codes), emergency response preparedness plans, economic forecasts, housing and employment decisions, and many more types of risk mitigation. The seismic hazard map of the Americas is the concatenation of various national and regional maps, involving a suite of approaches. The combined maps and documentation provide a useful global seismic hazard framework and serve as a resource for any national or regional agency for further detailed studies applicable to their needs. This seismic hazard map depicts Peak Ground Acceleration (PGA) with a 10% chance of exceedance in 50 years for the western hemisphere. PGA, a short-period ground motion parameter that is proportional to force, is the most commonly mapped ground motion parameter because current building codes that include seismic provisions specify the horizontal force a building should be able to withstand during an earthquake. This seismic hazard map of the Americas depicts the likely level of short-period ground motion from earthquakes in a fifty-year window. Short-period ground motions affect short-period structures (e.g., one-to-two story buildings). The largest seismic hazard values in the western hemisphere generally occur in areas that have been, or are likely to be, the sites of the largest plate boundary earthquakes. Although the largest earthquakes ever recorded are the 1960 Chile and 1964 Alaska subduction zone earthquakes, the largest seismic hazard (PGA) value in the Americas is in Southern California (U.S.), along the San Andreas fault.
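    The mapped quantity, PGA with a 10% chance of exceedance in 50 years, corresponds under the usual Poisson assumption to an annual exceedance rate of about 0.0021 per year, i.e. a return period of roughly 475 years. A one-line check of that standard conversion (not part of the map documentation itself):

    ```python
    import math

    p, t = 0.10, 50.0                        # 10% probability of exceedance in 50 years
    annual_rate = -math.log(1.0 - p) / t     # Poisson assumption
    print(f"Annual exceedance rate: {annual_rate:.5f} per year")
    print(f"Return period: {1.0 / annual_rate:.0f} years")   # ~475 years
    ```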

  7. Effect of anxiolytic and hypnotic drug prescriptions on mortality hazards: retrospective cohort study

    PubMed Central

    2014-01-01

    Objective To test the hypothesis that people taking anxiolytic and hypnotic drugs are at increased risk of premature mortality, using primary care prescription records and after adjusting for a wide range of potential confounders. Design Retrospective cohort study. Setting 273 UK primary care practices contributing data to the General Practice Research Database. Participants 34 727 patients aged 16 years and older first prescribed anxiolytic or hypnotic drugs, or both, between 1998 and 2001, and 69 418 patients with no prescriptions for such drugs (controls) matched by age, sex, and practice. Patients were followed up for a mean of 7.6 years (range 0.1-13.4 years). Main outcome All cause mortality ascertained from practice records. Results Physical and psychiatric comorbidities and prescribing of non-study drugs were significantly more prevalent among those prescribed study drugs than among controls. The age adjusted hazard ratio for mortality during the whole follow-up period for use of any study drug in the first year after recruitment was 3.46 (95% confidence interval 3.34 to 3.59) and 3.32 (3.19 to 3.45) after adjusting for other potential confounders. Dose-response associations were found for all three classes of study drugs (benzodiazepines, Z drugs (zaleplon, zolpidem, and zopiclone), and other drugs). After excluding deaths in the first year, there were approximately four excess deaths linked to drug use per 100 people followed for an average of 7.6 years after their first prescription. Conclusions In this large cohort of patients attending UK primary care, anxiolytic and hypnotic drugs were associated with significantly increased risk of mortality over a seven year period, after adjusting for a range of potential confounders. As with all observational findings, however, these results are prone to bias arising from unmeasured and residual confounding. PMID:24647164
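    The adjusted hazard ratios reported here are the kind of output produced by a Cox proportional-hazards regression on a matched cohort. The sketch below fits such a model to synthetic data with the lifelines package, purely to illustrate the form of the analysis; it is not the study's actual model, covariate set, or data.

    ```python
    import numpy as np
    import pandas as pd
    from lifelines import CoxPHFitter   # third-party survival-analysis package

    rng = np.random.default_rng(1)
    n = 2000

    # Synthetic cohort: "exposure" roughly triples the baseline hazard; age adds risk.
    exposed = rng.integers(0, 2, n)
    age = rng.uniform(16, 90, n)
    baseline_hazard = 0.01 * np.exp(0.03 * (age - 50))
    hazard = baseline_hazard * np.where(exposed == 1, 3.3, 1.0)
    time_to_death = rng.exponential(1.0 / hazard)
    follow_up = rng.uniform(0.1, 13.4, n)

    df = pd.DataFrame({
        "duration": np.minimum(time_to_death, follow_up),
        "event": (time_to_death <= follow_up).astype(int),
        "exposed": exposed,
        "age": age,
    })

    cph = CoxPHFitter()
    cph.fit(df, duration_col="duration", event_col="event")
    print(cph.summary[["exp(coef)"]])   # hazard ratios; 'exposed' should come out near 3.3
    ```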

  8. Methane emission from ruminants and solid waste: A critical analysis of baseline and mitigation projections for climate and policy studies

    NASA Astrophysics Data System (ADS)

    Matthews, E.

    2012-12-01

    Current and projected estimates of methane (CH4) emission from anthropogenic sources are numerous but have rarely been examined or compared. Presented here is a critical appraisal of CH4 projections used in climate-chemistry and policy studies. We compare emissions for major CH4 sources from several groups, including our own new data and RCP projections developed for climate-chemistry models for the next IPCC Assessment Report (AR5). We focus on current and projected baseline and mitigation emissions from ruminant animals and solid waste that are both predicted to rise dramatically in coming decades, driven primarily by developing countries. For waste, drivers include increasing urban populations, higher per capita waste generation due to economic growth and increasing landfilling rates. Analysis of a new global data base detailing waste composition, collection and disposal indicates that IPCC-based methodologies and default data overestimate CH4 emission for the current period which cascades into substantial overestimates in future projections. CH4 emission from solid waste is estimated to be ~10-15 Tg CH4/yr currently rather than the ~35 Tg/yr often reported in the literature. Moreover, emissions from developing countries are unlikely to rise rapidly in coming decades because new management approaches, such as sanitary landfills, that would increase emissions are maladapted to infrastructures in these countries and therefore unlikely to be implemented. The low current emission associated with solid waste (~10 Tg), together with future modest growth, implies that mitigation of waste-related CH4 emission is a poor candidate for slowing global warming. In the case of ruminant animals (~90 Tg CH4/yr currently), the dominant assumption driving future trajectories of CH4 emission is a substantial increase in meat and dairy consumption in developing countries to be satisfied by growing animal populations. Unlike solid waste, current ruminant emissions among studies exhibit a narrow range that does not necessarily signal low uncertainty but rather a reliance on similar animal statistics and emission factors. The UN Food and Agriculture Organization (FAO) projects 2000-2030 growth rates of livestock for most developing countries at 2% to >3% annually. However, the assumption of rapidly rising meat consumption is not supported by current trends nor by resource availability. For example, increased meat consumption in China and other developing countries is mostly poultry and pork, which do not affect CH4 emissions, suggesting that the rapid growth projected for all animals, boosting growth in CH4 emission, will not occur. From a resource standpoint, large increases in cattle, sheep and goat populations, especially for African countries (~60% by 2030), are not supportable on arid grazing lands that require very low stocking rates and semi-nomadic management. Increases projected for African animal populations would require either that about 2/3 more animals are grazed on increasingly drier lands or that all non-forested areas become grazing lands. Similar to solid waste, future methane emission from ruminant animals is likely to grow modestly although animals are not a likely candidate for CH4 mitigation due to their dispersed distribution throughout widely varying agricultural systems under very local management.
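    The livestock projections cited above translate into cumulative increases by simple compounding; the short calculation below makes that arithmetic explicit (the 2-3% annual rates and the ~60% increase for African ruminant populations are taken from the record, the compounding itself is elementary).

    ```python
    def compound_growth(annual_rate: float, years: int) -> float:
        """Fractional increase after `years` of constant annual growth."""
        return (1.0 + annual_rate) ** years - 1.0

    # FAO-style 2000-2030 growth rates quoted in the record: 2% to >3% per year.
    for rate in (0.02, 0.03):
        print(f"{rate:.0%}/yr over 30 years -> +{compound_growth(rate, 30):.0%}")

    # Constant annual rate implied by a ~60% increase over the same 30 years.
    implied = 1.60 ** (1.0 / 30.0) - 1.0
    print(f"~60% over 30 years implies ~{implied:.1%}/yr")
    ```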

  9. Does computer use pose an occupational hazard for forearm pain; from the NUDATA study

    PubMed Central

    Kryger, A; Andersen, J; Lassen, C; Brandt, L; Vilstrup, I; Overgaard, E; Thomsen, J; Mikkelsen, S

    2003-01-01

    Methods: A total of 6943 participants with a wide range of computer use and work tasks were studied. At baseline and at one-year follow-up, participants completed a questionnaire. Participants with relevant forearm symptoms were offered a clinical examination. Symptom cases and clinical cases were defined on the basis of self reported pain score and palpation tenderness in the muscles of the forearm. Results: The seven-day prevalence of moderate to severe forearm pain was 4.3%. Sixteen of 296 symptom cases met criteria for being a clinical forearm case, and 12 had signs of potential nerve entrapment. The one-year incidence of reported symptom cases was 1.3%; no subjects developed new signs of nerve entrapment. Increased risk of new forearm pain was associated with use of a mouse device for more than 30 hours per week, and with keyboard use for more than 15 hours per week. High job demands and time pressure at baseline were risk factors for onset of forearm pain; women had a twofold increased risk of developing forearm pain. Self reported ergonomic workplace factors at baseline did not predict future forearm pain. Conclusion: Intensive use of a mouse device, and to a lesser extent keyboard usage, were the main risk factors for forearm pain. The occurrence of clinical disorders was low, suggesting that computer use is not commonly associated with any severe occupational hazard to the forearm. PMID:14573725

  10. Can hazardous waste become a raw material? The case study of an aluminium residue: a review.

    PubMed

    López-Delgado, Aurora; Tayibi, Hanan

    2012-05-01

    The large number of research studies carried out during recent decades, focused on finding effective solutions for waste treatment, has allowed some of these residues to become new raw materials for many industries. Achieving this reduces energy and natural resource consumption, diminishes negative environmental impacts, and creates secondary and tertiary industries. A good example is provided by the metallurgical industry, in general, and the aluminium industry in this particular case. The aluminium recycling industry is a beneficial activity for the environment, since it recovers resources from primary industry, manufacturing and post-consumer waste. Slag and scrap, which were previously considered as waste, are nowadays the raw material for some highly profitable secondary and tertiary industries. The most recent European Directive on waste establishes that if waste is used as a common product and fulfils the existing legislation for this product, then this waste can be defined as 'end-of-waste'. The review presented here attempts to show several proposals for making added-value materials using an aluminium residue which is still considered as a hazardous waste, and accordingly disposed of in secure storage. The present proposal includes the use of this waste to manufacture glass, glass-ceramic, boehmite and calcium aluminate. Thus the waste might effectively be recovered as a secondary source material for various industries. PMID:22071175

  11. Study of hazardous metals in iron slag waste using laser induced breakdown spectroscopy.

    PubMed

    Gondal, M A; Hussain, T; Yamani, Z H; Bakry, A H

    2007-05-01

    Laser Induced Breakdown Spectroscopy (LIBS) was applied for quantitative elemental analysis of slag samples collected from a local steel plant using an Nd:YAG laser emitting radiation at 1064 nm wavelength. The concentrations of different elements of environmental significance such as cadmium, calcium, sulfur, magnesium, chromium, manganese, titanium, barium, phosphorus, and silicon were 44, 2193, 1724, 78578, 217260, 22220, 5178, 568, 2805, and 77871 mg kg-1, respectively. Optimal experimental conditions for analysis were investigated. The calibration curves were drawn for different elements. The concentrations determined with our Laser-Induced Breakdown Spectrometers were compared with the results obtained using Inductively Coupled Plasma (ICP) emission spectroscopy. Our study demonstrates that LIBS could be highly appropriate for rapid online analysis of iron slag waste. The relative accuracy of our LIBS system for various elements as compared with the ICP method is in the range of 0.001-0.049 at 2.5% error confidence. Limits of detection (LOD) of our LIBS system were also estimated for the elements noted here. The hazardous effects of some of the trace elements present in iron slag exceeding permissible safe limits are also discussed. PMID:17474003
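    Limits of detection for a LIBS calibration are commonly estimated from the blank noise and the slope of the calibration curve, LOD = 3*sigma_blank / slope. The sketch below shows that generic calculation on made-up calibration data; it is not the paper's calibration, and the 3-sigma criterion is only one common convention.

    ```python
    import numpy as np

    # Hypothetical calibration data: known concentrations (mg/kg) vs. line intensity (counts).
    conc = np.array([0.0, 10.0, 25.0, 50.0, 100.0])
    intensity = np.array([2.1, 55.0, 132.0, 262.0, 518.0])

    slope, intercept = np.polyfit(conc, intensity, 1)   # linear calibration curve
    sigma_blank = 1.8                                   # assumed std. dev. of repeated blank readings

    lod = 3.0 * sigma_blank / slope                     # 3-sigma limit of detection
    print(f"Slope: {slope:.2f} counts per (mg/kg); LOD: {lod:.2f} mg/kg")
    ```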

  12. Ground motion input in seismic evaluation studies: impacts on risk assessment of uniform hazard spectra

    SciTech Connect

    Wu, S.C.; Sewell, R.T.

    1996-07-01

    Conservatism and variability in seismic risk estimates are studied: effects of the uniform hazard spectrum (UHS) are examined for deriving probabilistic estimates of risk and in-structure demand levels, as compared to the more-exact use of realistic time history inputs (of given probability) that depend explicitly on magnitude and distance. This approach differs from the conventional one in its exhaustive treatment of the ground-motion threat and in its more detailed assessment of component responses to that threat. The approximate UH-ISS (in-structure spectra) obtained from the UHS appear to be very close to the more-exact results computed directly from scenario earthquakes. This conclusion does not depend on site configurations and structural characteristics. Also, the UH-ISS has a composite shape and may not correspond to the characteristics possessed by a single earthquake. The shape is largely affected by the structural properties in most cases and can be derived approximately from the corresponding UHS. Motions with smooth spectra, however, will not have the same damage potential as those of more realistic motions with jagged spectral shapes. As a result, UHS-based analysis may underestimate the real demands in nonlinear structural analyses.

  13. Flood hazard studies in Central Texas using orbital and suborbital remote sensing imagery

    NASA Technical Reports Server (NTRS)

    Baker, V. R.; Holz, R. K.; Patton, P. C.

    1975-01-01

    Central Texas is subject to infrequent, unusually intense rainstorms which cause extremely rapid runoff from drainage basins developed on the deeply dissected limestone and marl bedrock of the Edwards Plateau. One approach to flood hazard evaluation in this area is a parametric model relating flood hydrograph characteristics to quantitative geomorphic properties of the drainage basins. The preliminary model uses multiple regression techniques to predict potential peak flood discharge from basin magnitude, drainage density, and ruggedness number. After mapping small catchment networks from remote sensing imagery, input data for the model are generated by network digitization and analysis by a computer assisted routine of watershed analysis. The study evaluated the network resolution capabilities of the following data formats: (1) large-scale (1:24,000) topographic maps, employing Strahler's "method of v's," (2) standard low altitude black and white aerial photography (1:13,000 and 1:20,000 scales), (3) NASA - generated aerial infrared photography at scales ranging from 1:48,000 to 1:123,000, and (4) Skylab Earth Resources Experiment Package S-190A and S-190B sensors (1:750,000 and 1:500,000 respectively).
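    The parametric model described above regresses peak discharge on basin magnitude, drainage density, and ruggedness number. The sketch below fits a log-linear multiple regression of that general shape to synthetic catchments; the coefficients and data are invented for illustration and are not the study's calibrated model.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    n = 40

    # Synthetic geomorphic predictors for n small catchments.
    magnitude = rng.integers(5, 200, n).astype(float)     # number of first-order streams
    drainage_density = rng.uniform(2.0, 12.0, n)          # km of channel per km^2
    ruggedness = rng.uniform(0.1, 1.5, n)                 # relief times drainage density

    # Synthetic peak discharge with an assumed log-linear dependence plus noise.
    log_q = (1.0 + 0.8 * np.log(magnitude) + 0.6 * np.log(drainage_density)
             + 0.4 * np.log(ruggedness) + rng.normal(0.0, 0.2, n))

    # Ordinary least squares on the log-transformed variables.
    X = np.column_stack([np.ones(n), np.log(magnitude),
                         np.log(drainage_density), np.log(ruggedness)])
    coef, *_ = np.linalg.lstsq(X, log_q, rcond=None)
    print("Fitted coefficients (intercept, magnitude, density, ruggedness):", coef.round(2))
    ```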

  14. Sequential analysis of China's hazards in geoscience

    Microsoft Academic Search

    Chen Zhiming

    1996-01-01

    At the request of the IGU Study Group on Rapid Geomorphologic Hazards, the author compiled a large amount of information on China's hazards in the course of preparing geomorphic hazard inventories. While analyzing those hazards throughout history, especially over the past two decades, the author found that many hazards have been either interrelated or associated with each

  15. Fire and Risk Mitigation through Community education Case study: Program Bushfire Smart, The Pine Rivers Shire

    Microsoft Academic Search

    Andrew McLoughlin; Craig Welden

    This paper is a case study of the Bushfire Smart Program. This program was developed as a local partnership between Pine Rivers Shire Council (PRSC), the Queensland Fire and Rescue Service Rural and Urban Divisions (QFRS), The Southeast Queensland Fire and Biodiversity Consortium (FABC), Queensland Parks and Wildlife Service (QPWS), and Griffith University to deliver bushfire community education to reduce

  16. Mitigating challenges of using virtual reality in online courses: a case study

    Microsoft Academic Search

    Barbara Stewart; Holly M. Hutchins; Shirley Ezell; Darrell De Martino; Anil Bobba

    2010-01-01

    Case study methodology was used to describe the challenges experienced in the development of a virtual component for a freshman-level undergraduate course. The purpose of the project was to use a virtual environment component to provide an interactive and engaging learning environment. While some student and faculty feedback was positive, this pilot was riddled with challenges. Five categories of challenges

  17. Geoengineering, climate change scepticism and the ‘moral hazard’ argument: an experimental study of UK public perceptions

    PubMed Central

    Corner, Adam; Pidgeon, Nick

    2014-01-01

    Many commentators have expressed concerns that researching and/or developing geoengineering technologies may undermine support for existing climate policies—the so-called moral hazard argument. This argument plays a central role in policy debates about geoengineering. However, there has not yet been a systematic investigation of how members of the public view the moral hazard argument, or whether it impacts on people's beliefs about geoengineering and climate change. In this paper, we describe an online experiment with a representative sample of the UK public, in which participants read one of two arguments (either endorsing or rejecting the idea that geoengineering poses a moral hazard). The argument endorsing the idea of geoengineering as a moral hazard was perceived as more convincing overall. However, people with more sceptical views and those who endorsed ‘self-enhancing’ values were more likely to agree that the prospect of geoengineering would reduce their motivation to make changes in their own behaviour in response to climate change. The findings suggest that geoengineering is likely to pose a moral hazard for some people more than others, and the implications for engaging the public are discussed. PMID:25404680

  18. Geoengineering, climate change scepticism and the 'moral hazard' argument: an experimental study of UK public perceptions.

    PubMed

    Corner, Adam; Pidgeon, Nick

    2014-12-28

    Many commentators have expressed concerns that researching and/or developing geoengineering technologies may undermine support for existing climate policies-the so-called moral hazard argument. This argument plays a central role in policy debates about geoengineering. However, there has not yet been a systematic investigation of how members of the public view the moral hazard argument, or whether it impacts on people's beliefs about geoengineering and climate change. In this paper, we describe an online experiment with a representative sample of the UK public, in which participants read one of two arguments (either endorsing or rejecting the idea that geoengineering poses a moral hazard). The argument endorsing the idea of geoengineering as a moral hazard was perceived as more convincing overall. However, people with more sceptical views and those who endorsed 'self-enhancing' values were more likely to agree that the prospect of geoengineering would reduce their motivation to make changes in their own behaviour in response to climate change. The findings suggest that geoengineering is likely to pose a moral hazard for some people more than others, and the implications for engaging the public are discussed. PMID:25404680

  19. Doser study in Maryland coastal plain: Use of lime doser to mitigate stream acidification. Final report

    SciTech Connect

    Hall, L.W.; Fischer, S.A.; Killen, W.D.; Ziegenfuss, M.C.; Klauda, R.J.

    1992-07-01

    The purpose of the 1991 doser study was to determine the efficacy of automated lime slurry dosers to neutralize acidic pulses and improve water quality in Bacon Ridge Branch and Mattawoman Creek; measure physicochemical responses of Bacon Ridge Branch, Mattawoman Creek and Faulkner Branch to rain events and determine the use of the above three streams, North River, and Tull Branch for spawning by yellow perch, white perch, alewife and blueback herring.

  20. Study of gas cluster ion beam surface treatments for mitigating RF breakdown

    Microsoft Academic Search

    D. R. Swenson; E. Degenkolb; Z. Insepov

    2006-01-01

    Surface processing with high-energy gas cluster ion beams (GCIB) is investigated for increasing the high voltage breakdown strength of RF cavities and electrodes in general. Various GCIB treatments were studied for Nb, Cu, stainless steel, and Ti electrode materials using beams of Ar, Ar+H2, O2, N2, Ar+CH4, or O2+NF3 clusters with accelerating potentials up to 35 kV. Etching using chemically active

  1. Houston's Novel Strategy to Control Hazardous Air Pollutants: A Case Study in Policy Innovation and Political Stalemate.

    PubMed

    Sexton, Ken; Linder, Stephen H

    2015-01-01

    Although ambient concentrations have declined steadily over the past 30 years, Houston has recorded some of the highest levels of hazardous air pollutants in the United States. Nevertheless, federal and state regulatory efforts historically have emphasized compliance with the National Ambient Air Quality Standard for ozone, treating "air toxics" in Houston as a residual problem to be solved through application of technology-based standards. Between 2004 and 2009, Mayor Bill White and his administration challenged the well-established hierarchy of air quality management spelled out in the Clean Air Act, whereby federal and state authorities are assigned primacy over local municipalities for the purpose of designing and implementing air pollution control strategies. The White Administration believed that existing regulations were not sufficient to protect the health of Houstonians and took a diversity of both collaborative and combative policy actions to mitigate air toxic emissions from stationary sources. Opposition was substantial from a local coalition of entrenched interests satisfied with the status quo, which hindered the city's attempts to take unilateral policy actions. In the short term, the White Administration successfully raised the profile of the air toxics issue, pushed federal and state regulators to pay more attention, and induced a few polluting facilities to reduce emissions. But since White left office in 2010, air quality management in Houston has returned to the way it was before, and today there is scant evidence that his policies have had any lasting impact. PMID:25698880

  2. Houston’s Novel Strategy to Control Hazardous Air Pollutants: A Case Study in Policy Innovation and Political Stalemate

    PubMed Central

    Sexton, Ken; Linder, Stephen H

    2015-01-01

    Although ambient concentrations have declined steadily over the past 30 years, Houston has recorded some of the highest levels of hazardous air pollutants in the United States. Nevertheless, federal and state regulatory efforts historically have emphasized compliance with the National Ambient Air Quality Standard for ozone, treating “air toxics” in Houston as a residual problem to be solved through application of technology-based standards. Between 2004 and 2009, Mayor Bill White and his administration challenged the well-established hierarchy of air quality management spelled out in the Clean Air Act, whereby federal and state authorities are assigned primacy over local municipalities for the purpose of designing and implementing air pollution control strategies. The White Administration believed that existing regulations were not sufficient to protect the health of Houstonians and took a diversity of both collaborative and combative policy actions to mitigate air toxic emissions from stationary sources. Opposition was substantial from a local coalition of entrenched interests satisfied with the status quo, which hindered the city’s attempts to take unilateral policy actions. In the short term, the White Administration successfully raised the profile of the air toxics issue, pushed federal and state regulators to pay more attention, and induced a few polluting facilities to reduce emissions. But since White left office in 2010, air quality management in Houston has returned to the way it was before, and today there is scant evidence that his policies have had any lasting impact. PMID:25698880

  3. The California Hazards Institute

    NASA Astrophysics Data System (ADS)

    Rundle, J. B.; Kellogg, L. H.; Turcotte, D. L.

    2006-12-01

    California's abundant resources are linked with its natural hazards. Earthquakes, landslides, wildfires, floods, tsunamis, volcanic eruptions, severe storms, fires, and droughts afflict the state regularly. These events have the potential to become great disasters, like the San Francisco earthquake and fire of 1906, that overwhelm the capacity of society to respond. At such times, the fabric of civic life is frayed, political leadership is tested, economic losses can dwarf available resources, and full recovery can take decades. A patchwork of Federal, state, and local programs is in place to address individual hazards, but California lacks effective coordination to forecast, prevent, prepare for, mitigate, respond to, and recover from the harmful effects of natural disasters. Moreover, we do not know enough about the frequency, size, time, or locations where they may strike, nor about how the natural environment and man-made structures would respond. As California's population grows and becomes more interdependent, even moderate events have the potential to trigger catastrophes. Natural hazards need not become natural disasters if they are addressed proactively and effectively, rather than reactively. The University of California, with 10 campuses distributed across the state, has world-class faculty and students engaged in research and education in all fields of direct relevance to hazards. For that reason, the UC can become a world leader in anticipating and managing natural hazards in order to prevent loss of life and property and degradation of environmental quality. The University of California, Office of the President, has therefore established a new system-wide Multicampus Research Project, the California Hazards Institute (CHI), as a mechanism to research innovative, effective solutions for California. The CHI will build on the rich intellectual capital and expertise of the Golden State to provide the best available science, knowledge and tools for leaders, managers, stakeholders, policy makers, educators and the public to effectively and comprehensively combat the problems caused by the natural hazards that threaten California. During this first year of operation, UC faculty involved in the CHI will identify the science and technology research priorities of the Institute, followed by the solicitation of participation by other important stakeholders within California. The CHI is founded upon the idea that the hazards associated with events such as earthquakes and floods need not become great disasters such as the San Francisco earthquake of 1906 and 2005 Hurricane Katrina if these hazards can be anticipated proactively, before they must be dealt with reactively.

  4. Legal and institutional tools to mitigate plastic pollution affecting marine species: Argentina as a case study.

    PubMed

    González Carman, Victoria; Machain, Natalia; Campagna, Claudio

    2015-03-15

    Plastics are the most common form of debris found along the Argentine coastline. The Río de la Plata estuarine area is a relevant case study to describe a situation where ample policy exists against a backdrop of plastics disposed by populated coastal areas, industries, and vessels; with resultant high impacts of plastic pollution on marine turtles and mammals. Policy and institutions are in place but the impact remains due to ineffective waste management, limited public education and awareness, and weaknesses in enforcement of regulations. This context is frequently repeated all over the world. We list possible interventions to increase the effectiveness of policy that require integrating efforts among governments, the private sector, non-governmental organizations and the inhabitants of coastal cities to reduce the amount of plastics reaching the Río de la Plata and protect threatened marine species. What has been identified for Argentina applies to the region and globally. PMID:25627195

  5. Naringin mitigates erythrocytes aging induced by paclitaxel: an in vitro study.

    PubMed

    Harisa, Gamaleldin I

    2014-03-01

    In this study, the protective role of naringin (NAR) against paclitaxel (PTX)-induced erythrocyte aging has been investigated using human erythrocytes as an in vitro model. Erythrocytes were incubated with PTX in the presence and absence of NAR. Incubation of erythrocytes with PTX resulted in increased protein carbonyl content, malondialdehyde, and hemolysis percentage compared with the control. Furthermore, a significant increase in the ratios of glutathione peroxidase/glutathione reductase, superoxide dismutase/glutathione peroxidase, and superoxide dismutase/catalase in PTX-treated cells was observed, compared with control cells. In contrast, the reduced glutathione/oxidized glutathione ratio and glucose-6-phosphate dehydrogenase activity were decreased upon PTX treatment. The simultaneous incubation of erythrocytes with PTX and NAR restored these variables to values similar to those of control erythrocytes. These results suggest that NAR inhibited PTX-induced aging by lessening the PTX-induced oxidative stress. PMID:24375949

  6. Integrated multi-parameters Probabilistic Seismic Landslide Hazard Analysis (PSLHA): the case study of Ischia island, Italy

    NASA Astrophysics Data System (ADS)

    Caccavale, Mauro; Matano, Fabio; Sacchi, Marco; Mazzola, Salvatore; Somma, Renato; Troise, Claudia; De Natale, Giuseppe

    2014-05-01

    The Ischia island is a large, complex, partly submerged, active volcanic field located about 20 km from the Campi Flegrei, a major active volcano-tectonic area near Naples. The island is morphologically characterized in its central part by the resurgent block of Mt. Epomeo, controlled by NW-SE and NE-SW trending fault systems, by mountain stream basins with high relief energy, and by a heterogeneous coastline alternating beaches and tuff/lava cliffs that are continuously reshaped by weather and sea erosion. The volcano-tectonic process is a main factor for slope stability, as it produces seismic activity and has generated steep slopes in volcanic deposits (lava, tuff, pumice and ash layers) characterized by variable strength. In the Campi Flegrei and surrounding areas the possible occurrence of a moderate/large seismic event represents a serious threat for the inhabitants, for the infrastructure, and for the environment. The most relevant seismic sources for Ischia are the Campi Flegrei caldera and a 5 km long fault located below the island's north coast. However, those sources are difficult to constrain: the first, because its onshore and offshore extent is not yet completely defined; the second, characterized by only a few large historical events, is difficult to parameterize in the framework of a probabilistic hazard approach. The high population density, the presence of many infrastructures, and relevant archaeological sites combined with natural and artistic values make this area a strategic natural laboratory for developing new methodologies. Moreover, Ischia is the only sector of the Campi Flegrei area with documented historical landslides triggered by earthquakes, which allows the adequacy and stability of the method to be tested. In the framework of the Italian project MON.I.C.A (infrastructural coastlines monitoring), an innovative and dedicated probabilistic methodology has been applied to identify the areas with higher susceptibility to landslides induced by seismic effects. The PSLHA combines probability of exceedance maps for different ground motion (GM) parameters with geological and geomorphological information, in terms of critical acceleration and dynamic stability factor. Generally, maps are evaluated for peak ground acceleration, velocity or intensity, which correlate well with damage to anthropic infrastructure (e.g. streets, buildings, etc.). Each ground motion parameter represents a different aspect of the hazard and has a different correlation with the generation of possible damage. Many works have pointed out that other GM parameters, such as Arias and Housner intensities or absolute displacement, could be a better choice for analysing, for example, cliff stability. The selection of the GM parameter is therefore of crucial importance to obtain the most useful hazard maps. In the last decades, Ground Motion Prediction Equations for a new set of GM parameters have been published. Based on this information, a series of landslide hazard maps can be produced. The new maps will lead to the identification of areas with the highest probability of landslides induced by an earthquake. In a strategic site like Ischia, this new methodology will represent an innovative and advanced tool for landslide hazard mitigation.
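
    As context for the critical-acceleration term mentioned above, a minimal hedged sketch of a Newmark-style screening step is given below. It assumes the standard infinite-slope relation a_c = (FS - 1) g sin(alpha) and invented example values; it is not the MON.I.C.A implementation.

        # Illustrative only: Newmark-style yield-acceleration screening for
        # earthquake-induced landslides (assumed relation, example numbers).
        import math

        G = 9.81  # gravitational acceleration, m/s^2

        def critical_acceleration(factor_of_safety, slope_deg):
            """Yield acceleration (m/s^2) above which permanent sliding displacement begins."""
            return (factor_of_safety - 1.0) * G * math.sin(math.radians(slope_deg))

        def exceeds_yield(pga, factor_of_safety, slope_deg):
            """True if the mapped ground-motion level exceeds the yield acceleration of a cell."""
            return pga > critical_acceleration(factor_of_safety, slope_deg)

        # Example grid cell: static FS = 1.3, 35 degree slope, and a PGA of 2.5 m/s^2
        # taken from a probability-of-exceedance map.
        print(exceeds_yield(2.5, 1.3, 35.0))  # True -> cell flagged as susceptible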

  7. Simultaneous transcutaneous electrical nerve stimulation mitigates simulator sickness symptoms in healthy adults: a crossover study

    PubMed Central

    2013-01-01

    Background Flight simulators have been used to train pilots to experience and recognize spatial disorientation, a condition in which pilots incorrectly perceive the position, location, and movement of their aircraft. However, during or after simulator training, simulator sickness (SS) may develop. Spatial disorientation and SS share common symptoms and signs and may involve a similar mechanism of dys-synchronization of neural inputs from the vestibular, visual, and proprioceptive systems. Transcutaneous electrical nerve stimulation (TENS), a maneuver used for pain control, was found to influence autonomic cardiovascular responses and enhance visuospatial abilities, postural control, and cognitive function. The purpose of the present study was to investigate the protective effects of TENS on SS. Methods Fifteen healthy young men (age: 28.6 ± 0.9 years, height: 172.5 ± 1.4 cm, body weight: 69.3 ± 1.3 kg, body mass index: 23.4 ± 1.8 kg/m2) participated in this within-subject crossover study. SS was induced by a flight simulator. TENS treatment involved 30 minutes of simultaneous electrical stimulation of the posterior neck and the right Zusanli acupoint. Each subject completed 4 sessions (control, SS, TENS, and TENS + SS) in a randomized order. Outcome indicators included SS symptom severity and cognitive function, evaluated with the Simulator Sickness Questionnaire (SSQ) and d2 test of attention, respectively. Sleepiness was rated using the Visual Analogue Scales for Sleepiness Symptoms (VAS-SS). Autonomic and stress responses were evaluated by heart rate, heart rate variability (HRV) and salivary stress biomarkers (salivary alpha-amylase activity and salivary cortisol concentration). Results Simulator exposure increased SS symptoms (SSQ and VAS-SS scores) and decreased task response speed and concentration. The heart rate, salivary stress biomarker levels, and the sympathetic parameter of HRV increased with simulator exposure, but parasympathetic parameters decreased (p …

  8. Study and mitigation of calibration error sources in a water vapour Raman lidar

    NASA Astrophysics Data System (ADS)

    David, Leslie; Bock, Olivier; Bosser, Pierre; Thom, Christian; Pelon, Jacques

    2014-05-01

    The monitoring of water vapour throughout the atmosphere is important for many scientific applications (weather forecasting, climate research, calibration of GNSS altimetry measurements). Measuring water vapour remains a technical challenge because of its high variability in space and time. The major issues are achieving long-term stability (e.g., for climate trend monitoring) and high accuracy (e.g., for calibration/validation applications). LAREG and LOEMI at the Institut National de l'Information Géographique et Forestière (IGN) have developed a mobile scanning water vapour Raman lidar in collaboration with LATMOS at CNRS. This system aims at providing high accuracy water vapour measurements throughout the troposphere for calibrating GNSS wet delay signals and thus improving vertical positioning. Current developments aim at improving the calibration method and long-term stability of the system to allow the Raman lidar to be used as a reference instrument. The IGN-LATMOS lidar was deployed in the DEMEVAP (Development of Methodologies for Water Vapour Measurement) campaign that took place in 2011 at the Observatoire de Haute Provence. The goals of DEMEVAP were to inter-compare different water vapour sounding techniques (lidars, operational and research radiosondes, GPS,…) and to study various calibration methods for the Raman lidar. A significant decrease of the signals and of the calibration constants of the IGN-LATMOS Raman lidar was noticed throughout the campaign. This led us to study the likely sources of uncertainty and drift in each part of the instrument: emission, reception and detection. We inventoried several error sources as well as instability sources. The impact of the temperature dependence of the Raman lines on the filter transmission, and fluorescence in the fibre, are examples of the error sources. We investigated each error source and each instability source (uncontrolled laser beam jitter, temporal fluctuations of the photomultiplier gain, spatial inhomogeneity in the sensitivity of the photomultiplier photocathode,…) separately using theoretical analysis, numerical and optical simulations, and laboratory experiments. The instability induced by the use of an optical fibre for coupling the signal collected by the telescope to the detectors was investigated in particular. We quantified the impact of all these error sources on the water vapour and nitrogen Raman channel measurements and on the change in the differential calibration constant, and we tried to implement an experimental solution to minimize the variations.
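
    The calibration constant discussed above enters the retrieval linearly, which is why its drift matters. A minimal sketch of the standard water-vapour Raman ratio retrieval is shown below; the signal values, backgrounds and calibration constant are placeholders, and the code is not the IGN-LATMOS processing chain.

        # Illustrative ratio retrieval: mixing ratio is proportional to the
        # background-corrected H2O / N2 Raman channel ratio times a calibration constant.
        import numpy as np

        def mixing_ratio(signal_h2o, signal_n2, bg_h2o, bg_n2, calib_const):
            """Water vapour mixing ratio profile from background-corrected Raman channel signals."""
            s_h2o = np.asarray(signal_h2o, dtype=float) - bg_h2o
            s_n2 = np.asarray(signal_n2, dtype=float) - bg_n2
            return calib_const * s_h2o / s_n2   # a drift in calib_const biases the profile linearly

        q = mixing_ratio([1200.0, 800.0, 450.0], [90000.0, 70000.0, 50000.0], 50.0, 200.0, 120.0)
        print(q)  # placeholder profile, in the units carried by calib_const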

  9. Feasibility study of tank leakage mitigation using subsurface barriers. Revision 1

    SciTech Connect

    Treat, R.L.; Peters, B.B.; Cameron, R.J. [Enserch Environmental, Inc., Richland, WA (United States)] [and others]

    1995-01-01

    This document reflects the evaluations and analyses performed in response to Tri-Party Agreement Milestone M-45-07A - "Complete Evaluation of Subsurface Barrier Feasibility" (September 1994). In addition, this feasibility study was revised reflecting ongoing work supporting a pending decision by the DOE Richland Operations Office, the Washington State Department of Ecology, and the US Environmental Protection Agency regarding further development of subsurface barrier options for single-shell tanks (SSTs) and whether to proceed with demonstration plans at the Hanford Site (Tri-Party Agreement Milestone M-45-07B). Analyses of 14 integrated SST tank farm remediation alternatives were conducted in response to the three stated objectives of Tri-Party Agreement Milestone M-45-07A. The alternatives include eight with subsurface barriers and six without. Technologies used in the alternatives include three types of tank waste retrieval, seven types of subsurface barriers, a method of stabilizing the void space of emptied tanks, two types of in situ soil flushing, one type of surface barrier, and a clean-closure method. A no-action alternative and a surface-barrier-only alternative were included as nonviable alternatives for comparison. All other alternatives were designed to result in closure of SST tank farms as landfills or in clean-closure. Revision 1 incorporates additional analyses of worker safety, large leak scenarios, and sensitivity to the leach rates of risk-controlling constituents. The additional analyses were conducted to support TPA Milestone M-45-07B.

  10. Land use and management change under climate change adaptation and mitigation strategies: a U.S. case study

    USGS Publications Warehouse

    Mu, Jianhong E.; Wein, Anne; McCarl, Bruce

    2013-01-01

    We examine the effects of crop management adaptation and climate mitigation strategies on land use and land management, as well as on related environmental and economic outcomes. We find that crop management adaptation (e.g., crop mix, new species) increases greenhouse gas (GHG) emissions by 1.7 % under a more severe climate projection, while a carbon price reduces total forest and agriculture GHG annual flux by 15 % and 9 %, respectively. This shows that trade-offs are likely between mitigation and adaptation. Climate change coupled with crop management adaptation has small and mostly negative effects on welfare; mitigation, which is implemented as a carbon price starting at $15 per metric ton of carbon dioxide (CO2) equivalent with a 5 % annual increase rate, bolsters welfare through carbon payments. When both crop management adaptation and the carbon price are implemented, the effects of the latter dominate.

  11. Evaluation of impacts and mitigation assessments for the UMTRA Project: Gunnison and Durango pilot studies. Final report

    SciTech Connect

    Beranich, S.J. [Southwest Environmental, Albuquerque, NM (United States)]

    1994-08-24

    This report evaluates the impact assessments and proposed mitigations provided in environmental documents concerning the US Department of Energy's (DOE) Uranium Mill Tailings Remedial Action (UMTRA) Project. The projected impacts and proposed mitigations identified in UMTRA Project environmental documents were evaluated for two UMTRA Project sites, Gunnison and Durango, which are representative of currently active and inactive UMTRA Project sites, respectively. National Environmental Policy Act (NEPA) documentation was prepared for the remedial action at Durango and Gunnison as well as for the provision of an alternate water supply system at Gunnison. Additionally, environmental analysis was completed for mill site demolition at Gunnison and for a new road related to the Durango remedial action. The results in this report pertain only to the impact assessments prepared by the Regulatory Compliance staff as a part of the NEPA compliance requirements. Similarly, the mitigative measures documented are those that were identified during the NEPA process.

  12. Nanocytological Field Carcinogenesis Detection to Mitigate Overdiagnosis of Prostate Cancer: A Proof of Concept Study

    PubMed Central

    Roy, Hemant K.; Brendler, Charles B.; Subramanian, Hariharan; Zhang, Di; Maneval, Charles; Chandler, John; Bowen, Leah; Kaul, Karen L.; Helfand, Brian T.; Wang, Chi-Hsiung; Quinn, Margo; Petkewicz, Jacqueline; Paterakos, Michael; Backman, Vadim

    2015-01-01

    Purpose To determine whether nano-architectural interrogation of prostate field carcinogenesis can be used to predict prognosis in patients with early stage (Gleason 6) prostate cancer (PCa), which is mostly indolent but frequently unnecessarily treated. Materials and Methods We previously developed partial wave spectroscopic microscopy (PWS), which enables quantification of the nanoscale intracellular architecture (20–200 nm length scale) with remarkable accuracy. We adapted this technique to assess prostate needle core biopsies in a case-control study of men with Gleason 6 disease who either progressed (n = 20) or remained indolent (n = 18) over a ~3-year follow-up. We measured the parameter disorder strength (Ld), characterizing the spatial heterogeneity of the nanoscale cellular structure and nuclear morphology, in ~150 histologically normal epithelial cells from the microscopically normal mucosa. Results There was a profound difference in nano-architectural disorder between progressors and non-progressors. Indeed, Ld was dramatically increased in future progressors compared to future non-progressors (1.30 ± 0.0614 versus 1 ± 0.065; p = 0.002). The area under the receiver operator characteristic curve (AUC) was 0.79, yielding a sensitivity of 88% and specificity of 72% for discriminating between progressors and non-progressors. This was not confounded by demographic factors (age, smoking status, race, obesity), supporting the robustness of the approach. Conclusions We demonstrate, for the first time, that nano-architectural alterations occur in prostate cancer field carcinogenesis and can be exploited to predict prognosis of early stage PCa. This approach has promise in addressing the clinically vexing dilemma of management of Gleason 6 PCa and may provide a paradigm for dealing with the larger issue of cancer overdiagnosis. PMID:25706755

  13. Integrated study to define the hazard of the unstable flanks of Mt. Etna: the Italian DPC-INGV FLANK Project

    NASA Astrophysics Data System (ADS)

    Acocella, Valerio; Puglisi, Giuseppe

    2010-05-01

    Volcanoes are often characterized by unstable flanks. The eastern and south-eastern flanks of Mt. Etna (Italy) have shown repeated evidence of instability in the recent past. The extent and frequency of these processes vary widely, from nearly continuous creep-like movements of specific portions of the flank to the rarer slip of the entire eastern sector, involving also the offshore portion. Estimated slip rates vary enormously, from mm/yr to m/week. The most dramatic instability events are associated with major eruptions and shallow seismic activity, as in 2002-2003, posing a serious hazard to the inhabited flanks of the volcano. The Italian Department of Civil Defense (DPC), with the National Institute of Geophysics and Volcanology (INGV) and with the involvement of Italian universities and other research institutes, has launched a 2-year project (May 2008-May 2010) devoted to minimizing the hazard deriving from the instability of the Etna flanks. This multidisciplinary project embraces geological, geophysical, volcanological, modeling and hazard studies, on both the onshore and offshore portions of the E and SE flanks of the volcano. Indeed, the main aims are to define: (a) the 3D geometry of the collapsing sector(s); (b) the relationships between flank movement and volcanic and seismic activity; (c) the hazard related to the flank instability. The collected data populate a GIS database implemented according to the WoVo rules. This project represents the first attempt, at least in Europe, to use an integrated approach to minimize the hazard deriving from flank instability at a volcano. Here we briefly summarize the state of the art of the project at an advanced stage, highlighting the path of the different Tasks, as well as the main results.

  14. Trade study of leakage detection, monitoring, and mitigation technologies to support Hanford single-shell waste retrieval

    SciTech Connect

    Hertzel, J.S.

    1996-03-01

    The U.S. Department of Energy has established the Tank Waste Remediation System to safely manage and dispose of low-level, high-level, and transuranic wastes currently stored in underground storage tanks at the Hanford Site in eastern Washington. This report supports Hanford Federal Facility Agreement and Consent Order (Tri-Party Agreement) Milestone No. M-45-08-T01, addresses additional issues regarding single-shell tank leakage detection, monitoring, and mitigation technologies, and provides an indication of the scope of leakage detection, monitoring, and mitigation activities necessary to support the Tank Waste Remediation System Initial Single-Shell Tank Retrieval System project.

  15. A study of hazardous air pollutants at the Tidd PFBC Demonstration Plant

    SciTech Connect

    NONE

    1994-10-01

    The US Department of Energy (DOE) Clean Coal Technology (CCT) Program is a joint effort between government and industry to develop a new generation of coal utilization processes. In 1986, the Ohio Power Company, a subsidiary of American Electric Power (AEP), was awarded cofunding through the CCT Program for the Tidd Pressurized Fluidized Bed Combustion (PFBC) Demonstration Plant located in Brilliant, Ohio. The Tidd PFBC unit began operation in 1990 and was later selected as a test site for an advanced particle filtration (APF) system designed for hot gas particulate removal. The APF system was sponsored by the DOE Morgantown Energy Technology Center (METC) through their Hot Gas Cleanup Research and Development Program. A complementary goal of the DOE CCT and METC R&D programs has always been to demonstrate the environmental acceptability of these emerging technologies. The Clean Air Act Amendments of 1990 (CAAA) have focused that commitment toward evaluating the fate of hazardous air pollutants (HAPs) associated with advanced coal-based and hot gas cleanup technologies. Radian Corporation was contracted by AEP to perform this assessment of HAPs at the Tidd PFBC demonstration plant. The objective of this study is to assess the major input, process, and emission streams at Plant Tidd for the HAPs identified in Title III of the CAAA. Four flue gas stream locations were tested: ESP inlet, ESP outlet, APF inlet, and APF outlet. Other process streams sampled were raw coal, coal paste, sorbent, bed ash, cyclone ash, individual ESP hopper ash, APF ash, and service water. Samples were analyzed for trace elements, minor and major elements, anions, volatile organic compounds, dioxin/furan compounds, ammonia, cyanide, formaldehyde, and semivolatile organic compounds. The particle size distribution in the ESP inlet and outlet gas streams and in collected ash from individual ESP hoppers was also determined.

  16. Multi-Gas Mitigation Analysis by IPAC

    Microsoft Academic Search

    Xiulian Hu; Zhu Songli; Kejun Jiang

    2006-01-01

    Recognizing the importance of non-CO2 gas mitigation for climate change abatement, a modeling study of multi-gas scenarios was conducted using the IPAC model. This is also part of the EMF-21 study comparing the costs of CO2 mitigation and multi-gas mitigation. The main objective of this analysis is to evaluate the international potential and costs of non-CO2 greenhouse gas abatement. Three

  17. CASE STUDIES ADDENDUM: 1-8. REMEDIAL RESPONSE AT HAZARDOUS WASTE SITES

    EPA Science Inventory

    In response to the threat to human health and the environment posed by numerous uncontrolled hazardous waste sites across the country, new remedial action technologies are evolving and known technologies are being retrofitted and adapted for use in cleaning up these sites. The re...

  18. Division of Surveillance, Hazard Evaluations and Field Studies (DSHEFS). Medical aspects were reviewed

    E-print Network

    Steven W. Lenhart, CIH

    This Health Hazard Evaluation (HHE) report and any recommendations made herein are for the specific facility evaluated and may not be universally applicable. Any recommendations made are not to be considered as final statements of NIOSH policy or of any agency or individual involved. Additional HHE reports are available at

  19. CASE STUDIES 1-23: REMEDIAL RESPONSE AT HAZARDOUS WASTE SITES

    EPA Science Inventory

    In response to the threat to human health and the environment posed by numerous uncontrolled hazardous waste sites across the country, new remedial action technologies are evolving and known technologies are being retrofitted and adapted for use in cleaning up these sites. This r...

  20. Mitigating GHG emissions from agriculture under climate change constraints - a case study for the State of Saxony, Germany

    NASA Astrophysics Data System (ADS)

    Haas, E.; Kiese, R.; Klatt, S.; Butterbach-Bahl, K.

    2012-12-01

    Mitigating greenhouse gas (N2O, CO2, CH4) emissions from agricultural soils under conditions of projected climate change (IPCC SRES scenarios) is a prerequisite to limiting global warming. In this study we used the recently developed regional biogeochemical ecosystem model LandscapeDNDC (Haas et al., 2012, Landscape Ecology) and two time slices for present-day (1998-2018) and future climate (2078-2098) (regional downscaling of the IPCC SRES A1B climate simulation), and compared a business-as-usual agricultural management scenario (winter rapeseed - winter barley - winter wheat rotation; fertilization: 170 / 150 / 110 kg N mineral fertilizer; straw harvest for barley/wheat: 90 %) with scenarios in which either one or all of the following options were realized: no-till, 100 % return of crop residues to the field, or a 10 % reduction of N fertilization. The spatial domain is the State of Saxony (1 073 523 hectares of arable land), a typical region for agricultural production in Central Europe. The simulations are based on a high-resolution polygonal dataset (5 517 agricultural grid cells) for which relevant information on soil properties is available. The regionalization of the N2O emissions was validated against the IPCC Tier I methodology, which gives N2O emissions of 1 824 / 1 610 / 1 180 t N2O-N yr-1 for the baseline years, whereas the simulations yield 6 955 / 6 039 / 2 207 t N2O-N yr-1 for the first three years of the baseline scenario and range between 621 and 6 955 t N2O-N yr-1 in the following years (mean of 2 923). The influence of climate change (elevated mean temperature of approx. 2°C and minor changes in precipitation) results in an increase of 259 t N2O-N yr-1 (mean 3 182), or approx. 9 percent on average (with a minimum of 618 and a maximum of 6 553 t N2O-N yr-1). Focusing on mitigation, the recarbonization scenario resulted in an increase of soil carbon stocks of 2 585 kg C/ha over the simulation time span (from 161 868 kg C/ha at the initial stage to 164 453 kg C/ha at the end of the 21 years of simulation; mean 163 444). The study shows that the carbon sequestration due to the incorporation of residues is fully compensated by a steady increase of soil N2O emissions, driven by the additional nitrogen supplied by the mineralization of the organic material (residues). Toward sustainable land use, the study presents an optimal scenario that keeps yields high while increasing soil C and reducing gaseous N losses and leaching.
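
    For readers unfamiliar with the Tier I benchmark mentioned above, the sketch below shows the arithmetic of the IPCC (2006) Tier 1 default for direct N2O from synthetic fertiliser (EF1 = 1 % of applied N). The area is taken from the abstract and the per-hectare rate from the rotation it describes; everything else is illustrative and is not the authors' code.

        EF1 = 0.01  # kg N2O-N emitted per kg N applied (IPCC 2006 Tier 1 default)

        def tier1_n2o_n(n_applied_kg):
            """Direct N2O-N emission (kg) from synthetic fertiliser N, Tier 1."""
            return n_applied_kg * EF1

        def n2o_n_to_n2o(n2o_n_kg):
            """Convert N2O-N mass to N2O mass (factor 44/28)."""
            return n2o_n_kg * 44.0 / 28.0

        arable_ha = 1_073_523     # State of Saxony arable land, from the abstract
        n_rate = 150.0            # kg N/ha, the barley year of the rotation
        emission_t = tier1_n2o_n(arable_ha * n_rate) / 1000.0
        print(emission_t)         # roughly 1.6e3 t N2O-N yr-1 for this illustrative year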

  1. Hazard Assessment of Chemical Air Contaminants Measured in Residences

    SciTech Connect

    Logue, J.M.; McKone, T.E.; Sherman, M. H.; Singer, B.C.

    2010-05-10

    Identifying air pollutants that pose a potential hazard indoors can facilitate exposure mitigation. In this study, we compiled summary results from 77 published studies reporting measurements of chemical pollutants in residences in the United States and in countries with similar lifestyles. These data were used to calculate representative mid-range and upper bound concentrations relevant to chronic exposures for 267 pollutants and representative peak concentrations relevant to acute exposures for 5 activity-associated pollutants. Representative concentrations are compared to available chronic and acute health standards for 97 pollutants. Fifteen pollutants appear to exceed chronic health standards in a large fraction of homes. Nine other pollutants are identified as potential chronic health hazards in a substantial minority of homes and an additional nine are identified as potential hazards in a very small percentage of homes. Nine pollutants are identified as priority hazards based on the robustness of measured concentration data and the fraction of residences that appear to be impacted: acetaldehyde; acrolein; benzene; 1,3-butadiene; 1,4-dichlorobenzene; formaldehyde; naphthalene; nitrogen dioxide; and PM2.5. Activity-based emissions are shown to pose potential acute health hazards for PM2.5, formaldehyde, CO, chloroform, and NO2.
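
    A hedged sketch of the screening comparison described above: a pollutant is flagged when its representative concentration exceeds the corresponding health standard (hazard quotient above 1). The concentrations and guideline values below are placeholders, not the study's data.

        def hazard_quotient(concentration, guideline):
            """Ratio of representative indoor concentration to the health standard (same units)."""
            return concentration / guideline

        measured_ugm3 = {"formaldehyde": 25.0, "acrolein": 0.5, "benzene": 2.0}      # placeholders
        chronic_std_ugm3 = {"formaldehyde": 9.0, "acrolein": 0.35, "benzene": 3.0}   # placeholders

        priority = [p for p in measured_ugm3
                    if hazard_quotient(measured_ugm3[p], chronic_std_ugm3[p]) > 1.0]
        print(priority)  # pollutants whose mid-range concentration exceeds the chronic standard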

  2. Case studies in risk assessment for hazardous waste burning cement kilns

    SciTech Connect

    Fraiser, L.H.; Lund, L.G.; Tyndall, K.H. [Texas Natural Resource Conservation Commission (TNRCC), Austin, TX (United States)

    1996-12-31

    In November of 1994, the Environmental Protection Agency (EPA) issued its final Strategy for Hazardous Waste Minimization and Combustion. In the Strategy, EPA outlined the role of risk assessment in assuring the safe operation of hazardous waste combustion facilities. In accordance with the goals of the Strategy, the Texas Natural Resource Conservation Commission performed screening analyses for two cement companies that supplement their fuel with hazardous waste-derived fuel. The methodology employed was that outlined in the Guidance for Performing Screening Level Risk Analyses at Combustion Facilities Burning Hazardous Wastes. A tiered screening approach, allowing progression from a generic worst-case risk assessment to an increasingly more detailed site-specific risk assessment, was developed for applying EPA's methodology. Interactive spreadsheets, consisting of approximately 50 fate and transport equations and an additional 30 algorithms for estimating human health risks by indirect and direct pathways, were developed for performing the screening analyses. Exposure scenarios included adult and child residents, subsistence farmers (dairy and beef), and a subsistence fisher. Residents were assumed to consume soil and vegetables and to inhale contaminated air. Farmers were assumed to consume soil, vegetables, beef and/or milk (as appropriate) and to inhale contaminated air. In addition to inhaling contaminated air, the fisher was assumed to consume soil, vegetables and fish. The subsistence fisher scenario dominated the estimated risks posed by the two cement companies evaluated. As expected, indirect pathways contributed the majority of the risk. In conclusion, the results indicate that the cumulative (indirect and direct) cancer risks and non-cancer hazard indices did not exceed target levels of 1E-05 and 0.5, respectively.
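
    To make the screening arithmetic concrete, the sketch below sums pathway cancer risks (intake times slope factor) and compares the total with the 1E-05 target cited above. The intakes and slope factors are invented placeholders, not TNRCC spreadsheet values.

        TARGET_CANCER_RISK = 1.0e-5    # target level cited in the study

        def pathway_risk(chronic_daily_intake, slope_factor):
            """Excess lifetime cancer risk = intake (mg/kg-day) x slope factor (per mg/kg-day)."""
            return chronic_daily_intake * slope_factor

        # Placeholder subsistence-fisher pathways: (intake mg/kg-day, slope factor per mg/kg-day)
        pathways = {
            "fish ingestion": (4.0e-7, 7.5),
            "soil ingestion": (1.0e-8, 7.5),
            "inhalation":     (2.0e-8, 3.9),
        }
        total = sum(pathway_risk(cdi, sf) for cdi, sf in pathways.values())
        print(total, total <= TARGET_CANCER_RISK)  # ~3.2e-6, within the target for these inputs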

  3. Addition of straw or sawdust to mitigate greenhouse gas emissions from slurry produced by housed cattle: a field incubation study.

    PubMed

    van der Weerden, Tony J; Luo, Jiafa; Dexter, Moira

    2014-07-01

    The use of housed wintering systems (e.g., barns) associated with dairy cattle farming is increasing in southern New Zealand. Typically, these wintering systems use straw or a woodmix as bedding material. Ammonia (NH3) and greenhouse gas (GHG) emissions (nitrous oxide [N2O] and methane [CH4]) associated with storage of slurry + bedding material from wintering systems are poorly understood. A field incubation study was conducted to determine such emissions from stored slurry where bedding material (straw and sawdust) was added at two rates and stored for 7 mo. During the first 4 mo of storage, compared with untreated slurry, the addition of sawdust significantly reduced NH3 and CH4 emissions from 29 to 3% of the initial slurry nitrogen (N) content and from 0.5 to <0.01% of the initial slurry carbon (C) content. However, sawdust enhanced N2O emissions to 0.7% of the initial slurry-N content, compared with <0.01% for untreated slurry. Straw generally had an intermediate effect. Extending the storage period to 7 mo increased emissions from all treatments. Ammonia emissions were inversely related to the slurry C:N ratio and total solid (TS) content, and CH4 emissions were inversely related to slurry TS content. Mitigation of GHG emissions from stored slurry can be achieved by reducing the storage period as much as possible after winter slurry collection, providing ground conditions allow access for land spreading and nutrient inputs match pasture requirements. Although adding bedding material can reduce GHG emissions during storage, increased manure volumes for carting and spreading need to be considered. PMID:25603082

  4. Development and application of the EPIC model for carbon cycle, greenhouse-gas mitigation, and biofuel studies

    SciTech Connect

    Izaurralde, Roberto C.; Mcgill, William B.; Williams, J.R.

    2012-06-01

    This chapter provides a comprehensive review of the EPIC model in relation to carbon cycle, greenhouse-gas mitigation, and biofuel applications. From its original capabilities and purpose (i.e., quantify the impacts or erosion on soil productivity), the EPIC model has evolved into a comprehensive terrestrial ecosystem model for simulating with more or less process-level detail many ecosystem processes such as weather, hydrology, plant growth and development, carbon cycle (including erosion), nutrient cycling, greenhouse-gas emissions, and the most complete set of manipulations that can be implemented on a parcel of land (e.g. tillage, harvest, fertilization, irrigation, drainage, liming, burning, pesticide application). The chapter also provides details and examples of the latest efforts in model development such as the coupled carbon-nitrogen model, a microbial denitrification model with feedback to the carbon decomposition model, updates on calculation of ecosystem carbon balances, and carbon emissions from fossil fuels. The chapter has included examples of applications of the EPIC model in soil carbon sequestration, net ecosystem carbon balance, and biofuel studies. Finally, the chapter provides the reader with an update on upcoming improvements in EPIC such as the additions of modules for simulating biochar amendments, sorption of soluble C in subsoil horizons, nitrification including the release of N2O, and the formation and consumption of methane in soils. Completion of these model development activities will render an EPIC model with one of the most complete representation of biogeochemical processes and capable of simulating the dynamic feedback of soils to climate and management in terms not only of transient processes (e.g., soil water content, heterotrophic respiration, N2O emissions) but also of fundamental soil properties (e.g. soil depth, soil organic matter, soil bulk density, water limits).

  5. Volcanic hazards and aviation safety

    USGS Publications Warehouse

    Casadevall, Thomas J.; Thompson, Theodore B.; Ewert, John W.

    1996-01-01

    An aeronautical chart was developed to determine the relative proximity of volcanoes or ash clouds to the airports and flight corridors that may be affected by volcanic debris. The map aims to inform and increase awareness about the close spatial relationship between volcanoes and aviation operations. It shows the locations of active volcanoes together with selected aeronautical navigation aids and great-circle routes. In doing so, the map is intended to help mitigate the threat that volcanic hazards pose to aircraft and to improve aviation safety.

  6. Ground fissure hazards in USA and China

    NASA Astrophysics Data System (ADS)

    da-Yu, Geng; Li, Zhong-Sheng

    2000-07-01

    Tremendous losses have been caused by ground fissure hazards in both the USA and China. Six states of the southwestern USA and seven provinces of central China have been affected by destructive ground fissures. These aseismic ground fissure hazards usually take place in land subsidence areas. A comparison of the two countries' ground fissures is given, covering ground fissure formation, evolution, mechanics of destruction and countermeasures against them. Destructive ground fissures occurred about half a century earlier in the USA than in China. The mechanisms of various ground fissures were analyzed through interdisciplinary studies. It has been found that pre-existing faults serve as the basis for the formation of modern ground fissures, and that human activities, e.g. overpumping of groundwater or oil, can accelerate the creeping of the fissures and make them destructive to many kinds of civil engineering. Countermeasures to mitigate the ground fissure hazard are put forward, not only in science and technology but also in social administration. Successful practices in the two countries are introduced as examples.

  7. The Impact Hazard

    NASA Astrophysics Data System (ADS)

    Morrison, D.

    2009-12-01

    Throughout its existence, Earth has been pummelled by rocks from space. The cratered face of the Moon testifies to this continuing cosmic bombardment, and the 1908 Tunguska impact in Siberia should have been a wake-up call to the impact hazard. For most scientists, however, it was the discovery 30 years ago that the KT mass extinction was caused by an impact that opened our eyes to this important aspect of Earth history -- that some geological and biological changes have an external origin, and that the biosphere is much more sensitive to impact disturbance than was imagined. While life adapts beautifully to slow changes in the environment, a sudden event, like a large impact, can have catastrophic consequences. While we do not face any known hazard today for an extinction-level event, we are becoming aware that more than a million near-Earth asteroids (NEAs) exist with the capacity to take out a city if they hit in the wrong place. The NASA Spaceguard Survey has begun to discover and track the larger NEAs, but we do not yet have the capability to find more than a few percent of objects as small as the Tunguska impactor (about 40 m diameter). This continuing impact hazard is at roughly the hazard level of volcanic eruptions, including the rare supervolcano eruptions. The difference is that an incoming cosmic projectile can be detected and tracked, and by application of modern space technology, most impactors could be deflected. Impacts are the only natural hazard that can be eliminated. This motivates our NEA search programs such as Spaceguard and argues for extending them to smaller sizes. At the same time we realize that the most likely warning time for the next impact remains a few seconds, and we may therefore need to fall back on the more conventional responses of disaster mitigation and relief.

  8. Hazardous Substances and Hazardous Waste

    MedlinePLUS

    ... of to prevent an accidental release into the environment. Advances in technology have greatly improved our ability to treat or dispose of hazardous waste in a way that prevents it from harming people or the environment. Typical methods of hazardous waste storage and disposal ...

  9. NEOShield - A global approach to NEO Impact Threat Mitigation

    NASA Astrophysics Data System (ADS)

    Michel, Patrick

    2015-03-01

    NEOShield is a European-Union funded project coordinated by the German Aero-space Center, DLR, to address near-Earth object (NEO) impact hazard mitigation issues. The NEOShield consortium consists of 13 research institutes, universities, and industrial partners from 6 countries and includes leading US and Russian space organizations. The project is funded for a period of 3.5 years from January 2012 with a total of 5.8 million euros. The primary aim of the project is to investigate in detail promising mitigation techniques, such as the kinetic impactor, blast deflection, and the gravity tractor, and devise feasible demonstration missions. Options for an international strategy for implementation when an actual impact threat arises will also be investigated. The NEOShield work plan consists of scientific investigations into the nature of the impact hazard and the physical properties of NEOs, and technical and engineering studies of practical means of deflecting NEOs. There exist many ideas for asteroid deflection techniques, many of which would require considerable scientific and technological development. The emphasis of NEOShield is on techniques that are feasible with current technology, requiring a minimum of research and development work. NEOShield aims to provide detailed designs of feasible mitigation demonstration missions, targeting NEOs of the kind most likely to trigger the first space-based mitigation action. Most of the asteroid deflection techniques proposed to date require physical contact with the threatening object, an example being the kinetic impactor. NEOShield includes research into the mitigation-relevant physical properties of NEOs on the basis of remotely-sensed astronomical data and the results of rendezvous missions, the observational techniques required to efficiently gather mitigation-relevant data on the dynamical state and physical properties of a threatening NEO, and laboratory investigations using gas guns to fire projectiles into asteroid regolith analog materials. The gas-gun investigations enable state-of-the-art numerical models to be verified at small scales. Computer simulations at realistic NEO scales are used to investigate how NEOs with a range of properties would respond to a pulse of energy applied in a deflection attempt. The technical work includes the development of crucial technologies, such as the autonomous guidance of a kinetic impactor to a precise point on the surface of the target, and the detailed design of realistic missions for the purpose of demonstrating the applicability and feasibility of one or more of the techniques investigated. Theoretical work on the blast deflection method of mitigation is designed to probe the circumstances in which this last line of defense may be the only viable option and the issues relating to its deployment. A global response campaign roadmap will be developed based on realistic scenarios presented, for example, by the discovery of an object such as 99942 Apophis or 2011 AG5 on a threatening orbit. The work will include considerations of the timeline of orbit knowledge and impact probability development, reconnaissance observations and fly-by or rendezvous missions, the political decision to mount a mitigation attempt, and the design, development, and launch of the mitigation mission. Collaboration with colleagues outside the NEOShield Consortium involved in complementary activities (e.g. under the auspices of the UN, NASA, or ESA) is being sought in order to establish a broad international strategy. 
We present a brief overview of the history and planned scope of the project, and progress made to date. The NEOShield project (http://www.neoshield.net) has received funding from the European Union Seventh Framework Program (FP7/2007-2013) under Grant Agreement no. 282703.
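
    As a rough illustration of the kinetic-impactor scaling discussed above (and only that: the masses, speed, density and momentum-enhancement factor below are assumptions, not NEOShield mission parameters), the velocity change imparted to an asteroid is the impactor momentum divided by the asteroid mass, multiplied by a factor beta that accounts for the extra momentum carried away by ejecta.

        import math

        def asteroid_mass(diameter_m, density_kg_m3=2000.0):
            """Mass (kg) of a spherical asteroid of given diameter and assumed bulk density."""
            return density_kg_m3 * math.pi / 6.0 * diameter_m ** 3

        def kinetic_impactor_delta_v(impactor_mass_kg, rel_speed_m_s, target_mass_kg, beta=2.0):
            """Velocity change (m/s) imparted to the target; beta > 1 when ejecta add momentum."""
            return beta * impactor_mass_kg * rel_speed_m_s / target_mass_kg

        m_target = asteroid_mass(140.0)                           # a ~140 m object
        print(kinetic_impactor_delta_v(500.0, 6000.0, m_target))  # ~2 mm/s for these assumptions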

  10. Geologic and Geophysical Studies of Natural Hazards and Risks in the Gulf of Peter the Great, Japan Sea

    NASA Astrophysics Data System (ADS)

    Anokhin, Vladimir; Shcherbakov, Viktor; Motychko, Viktor; Slinchenkov, Vladimir; Sokolov, Georgy; Kotov, Sergey; Kartashov, Sergey

    2013-04-01

    The area of the Gulf of Peter the Great is socially, economically and culturally one of the most important regions of the Russian Far East. At the same time, palpable natural hazards have been reported there, posing a real threat to local infrastructure. A multidisciplinary field team of the Gramberg VNIIOkeangeologia institute carried out geological and geophysical studies of natural hazards in the water area and coastal zone of the gulf in the summer and autumn of 2012. The research program included geodetic deformation monitoring of the coastal zone with a Leica HDS 3000 laser scanner; echo sounding of the underwater part of the coastal slope with a Lowrance LCX-37C depth sounder equipped with an active external 12-channel GPS antenna (LGC-3000); high-frequency acoustic profiling with a GeoPulse subbottom profiler (oscillator frequency of 12.2 kHz) for the study of bottom sediments to a depth of 40 m; hydromagnetic measurements with a Marine Magnetics SeaSPY magnetometer for investigation of the deep geological structure; sonar measurements with the GEO SM C-MAX system (325 kHz emitters) for studying seafloor features; studies of the water column (sensing and sampling); and bottom sediment sampling. Analytical work was performed by mass spectrometry, atomic absorption spectrophotometry, chromatography, gas chromatography-mass spectrometry and gamma spectrometry, and included the following. For water: the content of Fe, Mn, Cd, As, Pb, Cu, Co, Ni, Cr, Zn and Hg in solution and in suspension, polycyclic aromatic compounds, organochlorine pesticides, oil and methane. For sediments: grain-size analysis, mineralogical analysis of sand, determination of Fe, Mn, Cd, As, Pb, Cu, Co, Ni, Cr, Zn and Hg content, identification of petroleum products, polychlorinated biphenyls and organochlorine pesticides, and the specific activity of Cs-137. As a result, a set of geological maps was composed: maps of pre-Quaternary and Quaternary rocks and deposits, a lithological map, a geomorphological map, a map of engineering geological zoning, a map of the major hydro- and lithodynamic processes, hydraulic and geochemical maps and sections, a seismotectonic map, a map of endogenous geodynamics, a map of exogenous geological processes, a map assessing the overall geo-ecological situation, etc. As a result of the first stage of these studies we identified the following significant hazards and risks faced by the region: 1. Seismic hazards along seismically active faults in the region. 2. Tsunami hazards on the coasts of Amursky, Ussuriysky and other gulfs of the region. 3. Shore destruction, including landslides, in many littoral zones of the region. 4. Avalanche sedimentation in the Amursky and Ussuriysky gulfs. 5. Gas emissions from the bottom of the shelf zone. 6. Industrial pollution in water areas near industrial centres. Estimation of hazards and risks in the Gulf of Peter the Great will be continued.

  11. Hazard and operability study of the multi-function Waste Tank Facility. Revision 1

    SciTech Connect

    Hughes, M.E.

    1995-05-15

    The Multi-Function Waste Tank Facility (MWTF) East site will be constructed on the west side of the 200E Area, and the MWTF West site will be constructed in the SW quadrant of the 200W Area at the Hanford Site. This document describes facility hazards to which site personnel or the general public could potentially be exposed during operation. A list of preliminary Design Basis Accidents was developed.

  12. SLUDGE TREATMENT PROJECT ENGINEERED CONTAINER RETRIEVAL AND TRANSFER SYSTEM PRELIMINARY DESIGN HAZARD AND OPERABILITY STUDY

    SciTech Connect

    CARRO CA

    2011-07-15

    This Hazard and Operability (HAZOP) study addresses the Sludge Treatment Project (STP) Engineered Container Retrieval and Transfer System (ECRTS) preliminary design for retrieving sludge from underwater engineered containers located in the 105-K West (KW) Basin, transferring the sludge as a sludge-water slurry (hereafter referred to as 'slurry') to a Sludge Transport and Storage Container (STSC) located in a Modified KW Basin Annex, and preparing the STSC for transport to T Plant using the Sludge Transport System (STS). There are six underwater engineered containers located in the KW Basin that, at the time of sludge retrieval, will contain an estimated volume of 5.2 m³ of KW Basin floor and pit sludge, 18.4 m³ of 105-K East (KE) Basin floor, pit, and canister sludge, and 3.5 m³ of settler tank sludge. The KE and KW Basin sludge consists of fuel corrosion products (including metallic uranium, and fission and activation products), small fuel fragments, iron and aluminum oxide, sand, dirt, operational debris, and biological debris. The settler tank sludge consists of sludge generated by the washing of KE and KW Basin fuel in the Primary Clean Machine. A detailed description of the origin of sludge and its chemical and physical characteristics can be found in HNF-41051, Preliminary STP Container and Settler Sludge Process System Description and Material Balance. In summary, the ECRTS retrieves sludge from the engineered containers and hydraulically transfers it as a slurry into an STSC positioned within a trailer-mounted STS cask located in a Modified KW Basin Annex. The slurry is allowed to settle within the STSC to concentrate the solids and clarify the supernate. After a prescribed settling period the supernate is decanted. The decanted supernate is filtered through a sand filter and returned to the basin. Subsequent batches of slurry are added to the STSC, settled, and excess supernate removed until the prescribed quantity of sludge is collected. The sand filter is then backwashed into the STSC. The STSC and STS cask are then inerted and transported to T Plant.

  13. Flood fatality hazard and flood damage hazard: combining multiple hazard characteristics into meaningful maps for spatial planning

    NASA Astrophysics Data System (ADS)

    de Bruijn, K. M.; Klijn, F.; van de Pas, B.; Slager, C. T. J.

    2015-01-01

    For comprehensive flood risk management, accurate information on flood hazards is crucial. While in the past an estimate of potential flood consequences in large areas was often sufficient to make decisions on flood protection, there is currently an increasing demand for detailed hazard maps that make it possible to consider other risk-reducing measures as well. Hazard maps are a prerequisite for spatial planning, but can also support emergency management, the design of flood mitigation measures, and the setting of insurance policies. The increase in flood risks due to population growth and economic development in hazardous areas in the past shows that sensible spatial planning is crucial to prevent risks from increasing further. Assigning the least hazardous locations for development, or adapting developments to the actual hazard, requires comprehensive flood hazard maps. Since flood hazard is a multi-dimensional phenomenon, many different maps could be relevant. Having large numbers of maps to take into account does, however, not make planning easier. To support flood risk management planning, we therefore introduce a new approach in which all relevant flood hazard parameters are combined into two comprehensive maps: one of flood damage hazard and one of flood fatality hazard.
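
    One simple way to collapse several hazard characteristics into a single fatality-hazard class, sketched below purely for illustration, is to combine water depth and flow velocity cell by cell and threshold the result. The indicator and thresholds are assumptions, not the classification rules used by the authors.

        def fatality_hazard_class(depth_m, velocity_m_s):
            """Classify a grid cell from depth and velocity using an assumed depth-velocity indicator."""
            if depth_m <= 0.0:
                return "none"
            dv = depth_m * (velocity_m_s + 0.5)   # assumed combination rule
            if dv < 0.75:
                return "low"
            if dv < 1.5:
                return "moderate"
            return "high"

        # Shallow, slow water vs. deep, fast-flowing water
        print(fatality_hazard_class(0.3, 0.2), fatality_hazard_class(2.0, 1.5))  # low high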

  14. U.S. Postal Service radon assessment and mitigation program. Progress report, September 1993--November 1994

    SciTech Connect

    Velazquez, L.E.; Petty, J.L. Jr.

    1994-12-31

    In 1992, the US Postal Service (USPS) entered into an Interagency Agreement with the Department of Energy (DOE) whereby DOE would provide technical assistance in support of the USPS Radon Assessment and Mitigation Program. To aid in this effort, DOE tasked the Hazardous Waste Remedial Actions Program (HAZWRAP), which is managed by Martin Marietta Energy Systems, Inc., for DOE under contract AC05-84OR21400. Since that time, HAZWRAP has developed and finalized the sampling protocol, mitigation diagnostic protocol, and the quality assurance and quality control procedures. These procedures were validated during the Protocol Validation (1992-1993) and Pilot Study (1993-1994) phases of the program. To date, HAZWRAP has performed approximately 16,000 radon measurements in 250 USPS buildings. Mitigation diagnostics have been performed in 27 buildings. Thus far, 13% of the measurements have been above the Environmental Protection Agency action level of 4 pCi/L. This report summarizes the pilot program radon testing data and mitigation diagnostic data for 22 sites and contains recommendations for mitigation diagnostics.

  15. GUIDELINES FOR THE USE OF CHEMICALS IN REMOVING HAZARDOUS SUBSTANCE DISCHARGES

    EPA Science Inventory

    This report was undertaken to develop guidelines on the use of various chemical and biological agents to mitigate discharge of hazardous substances. Eight categories of mitigative agents and their potential uses in removing hazardous substances discharged on land and in waterways...

  16. Probabilistic short-term volcanic hazard in phases of unrest: A case study for tephra fallout

    NASA Astrophysics Data System (ADS)

    Selva, Jacopo; Costa, Antonio; Sandri, Laura; Macedonio, Giovanni; Marzocchi, Warner

    2014-12-01

    During volcanic crises, volcanologists usually estimate the impact of possible imminent eruptions through deterministic modeling of the effects of one or a few pre-established scenarios. Although such an approach may bring important information to decision makers, the sole use of deterministic scenarios does not allow scientists to properly take all uncertainties into consideration, and it cannot be used to assess the risk quantitatively, because the latter unavoidably requires a probabilistic approach. We present a model based on the concept of the Bayesian event tree (hereinafter named BET_VH_ST, standing for Bayesian event tree for short-term volcanic hazard) for short-term, near-real-time probabilistic volcanic hazard analysis, formulated for any potentially hazardous phenomenon accompanying an eruption. The specific goal of BET_VH_ST is to produce a quantitative assessment of the probability of exceedance of any potential level of intensity for a given volcanic hazard due to eruptions within restricted time windows (hours to days) in any area surrounding the volcano, accounting for all natural and epistemic uncertainties. BET_VH_ST assesses the conditional probability at each level of the event tree accounting for any relevant information derived from the monitoring system, theoretical models, and the past history of the volcano, propagating any relevant epistemic uncertainty underlying these assessments. As an application example of the model, we apply BET_VH_ST to assess short-term volcanic hazard related to tephra loading during the Major Emergency Simulation Exercise, a major exercise at Mount Vesuvius that took place from 19 to 23 October 2006, consisting of a blind simulation of Vesuvius reactivation, from the early warning phase up to the final eruption, including the evacuation of a sample of about 2000 people from the area at risk. The results show that BET_VH_ST is able to produce short-term forecasts of the impact of tephra fall during a rapidly evolving crisis, accurately accounting for and propagating all uncertainties and enabling rational decision making under uncertainty.
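
    The event-tree arithmetic behind such an assessment can be sketched as a product of conditional probabilities along one branch, summed over branches in a full analysis. The node names and numbers below are invented for illustration and are not Vesuvius assessments or BET_VH_ST outputs.

        # One branch of a generic event tree: the exceedance probability along the
        # branch is the product of the conditional probabilities at each node.
        branch = {
            "eruption within the time window | unrest":              0.10,
            "vent opens in sector S | eruption":                     0.30,
            "size class >= reference | vent in S":                   0.25,
            "tephra load exceeds threshold at site | that eruption": 0.40,
        }

        def branch_exceedance_probability(conditionals):
            p = 1.0
            for prob in conditionals.values():
                p *= prob
            return p

        print(branch_exceedance_probability(branch))  # 0.003 for this illustrative branch
        # A full assessment sums such products over all vent sectors and size classes
        # and propagates the epistemic uncertainty attached to every node.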

  17. Evaluating a multi-criteria model for hazard assessment in urban design. The Porto Marghera case study

    SciTech Connect

    Luria, Paolo; Aspinall, Peter A

    2003-08-01

    The aim of this paper is to describe a new approach to major industrial hazard assessment, which has recently been studied by the authors in conjunction with the Italian Environmental Protection Agency ('ARPAV'). The real opportunity for developing a different approach arose from the need of the Italian EPA to provide the Venice Port Authority with an appropriate estimation of major industrial hazards in Porto Marghera, an industrial estate near Venice (Italy). However, the standard model, quantitative risk analysis (QRA), only provided a list of individual quantitative risk values related to single locations. The experimental model is based on a multi-criteria approach--the Analytic Hierarchy Process--which introduces the use of expert opinions, complementary skills and expertise from different disciplines in conjunction with traditional quantitative analysis. This permitted the generation of quantitative risk-assessment data from a series of qualitative assessments of the present situation and of three future scenarios, and the use of this information as indirect quantitative measures that could be aggregated to obtain a global risk rating. This approach is in line with the main concepts proposed by the latest European directive on Major Hazard Accidents, which recommends increasing the participation of operators, taking the other players into account and, moreover, paying more attention to the concepts of 'urban control', 'subjective risk' (risk perception) and intangible factors (factors not directly quantifiable).
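
    For readers unfamiliar with the Analytic Hierarchy Process step referred to above, the sketch below derives criterion weights from a reciprocal pairwise-comparison matrix via its principal eigenvector and computes Saaty's consistency index. The 3x3 matrix is a made-up example, not the Porto Marghera judgements.

        import numpy as np

        # Reciprocal pairwise-comparison matrix for three hypothetical criteria
        A = np.array([[1.0,  3.0, 5.0],
                      [1/3., 1.0, 2.0],
                      [1/5., 1/2., 1.0]])

        eigvals, eigvecs = np.linalg.eig(A)
        k = int(np.argmax(eigvals.real))
        weights = np.abs(eigvecs[:, k].real)
        weights /= weights.sum()                              # normalised priority weights

        n = A.shape[0]
        consistency_index = (eigvals.real[k] - n) / (n - 1)   # Saaty's CI; small means consistent
        print(weights, consistency_index)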

  18. Design and construction of a rotary kiln simulator for use in studying the incineration of hazardous waste

    NASA Astrophysics Data System (ADS)

    Lemieux, Paul M.; Pershing, David W.

    1989-08-01

    Rotary kiln systems are widely used in industrial applications to transfer energy from high temperature flames to irregular solids. Recently these systems have been shown to be suitable for the incineration of hazardous solid waste materials and the thermal treatment of contaminated soils. Destruction and removal efficiencies in excess of 99.99% have been reported for hazardous species, but the rate-controlling steps of the incineration process are not well understood. This article describes the design, construction, and operation of a laboratory scale simulator which was developed to investigate the fundamentals of hazardous waste incineration in a rotary kiln environment. This 2 ft×2 ft refractory-lined kiln allows time-resolved characterization of contaminant evolution and destruction. Continuous thermal and exhaust concentration measurements are used to characterize the fate of the solid charge as a function of residence time within the kiln. Overall destruction efficiency can be measured by subsequent analysis of the solid phase. The initial performance of this facility has been demonstrated by studying the combustion of waste zirconium metal and by characterizing the thermal cleanup of solid sorbent contaminated with toluene. The rotary kiln simulator has been shown suitable for investigation of parameters such as amount of charge, contaminant loading, rotation speed, temperature, excess oxygen, and particle size.
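
    The 99.99 % figure quoted above is a destruction and removal efficiency (DRE), which compares the mass feed rate of a hazardous constituent with its mass emission rate in the stack gas. The sketch below shows the arithmetic with placeholder feed and emission rates.

        def destruction_removal_efficiency(feed_rate_kg_h, stack_emission_kg_h):
            """DRE in percent: fraction of the fed constituent that is destroyed or removed."""
            return (feed_rate_kg_h - stack_emission_kg_h) / feed_rate_kg_h * 100.0

        print(destruction_removal_efficiency(10.0, 0.0005))  # 99.995 % for these placeholder rates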

  19. Reproductive Hazards

    MedlinePLUS

    ... and female reproductive systems play a role in pregnancy. Problems with these systems can affect fertility and ... a reproductive hazard can cause different effects during pregnancy, depending on when she is exposed. During the ...

  20. Hazardous materials

    MedlinePLUS

    ... from the body that may carry harmful germs Gases that are used to make patients sleep during ... hazardous materials are used and stored. Some common areas are where: X-rays and other imaging tests ...

  1. Coastal Hazards.

    ERIC Educational Resources Information Center

    Vandas, Steve

    1998-01-01

    Focuses on hurricanes and tsunamis and uses these topics to address other parts of the science curriculum. In addition to a discussion on beach erosion, a poster is provided that depicts these natural hazards that threaten coastlines. (DDR)

  2. Hazardous Drugs

    MedlinePLUS

    ... Worker exposure to ... in the workplace. Pharmacy. Reviews safety and health topics related to hazardous drugs including drug handling, administration, ...

  3. Hazardous Waste

    NSDL National Science Digital Library

    Harris, Kathryn Louise.

    Given media attention to the US Navy's recent problems with the disposal of a large amount of napalm, an incendiary compound, this week's In the News examines the issue of hazardous waste and materials. The eight resources discussed provide information on various aspects of the topic. Due to the large number of companies specializing in the management and remediation of hazardous waste contamination, private firms will not be noted.

  4. Mitigating Radiation Effects on ICs at Device and Architectural Levels: the SpaceWire Router Case Study

    Microsoft Academic Search

    Esa Petri; Sergio Saponara; Marco Tonarelli; Iacopo Del Corona; Luca Fanucci; Pierangelo Terreni

    2007-01-01

    The mitigation of radiation effects on integrated circuits is discussed in this paper with reference to the design of hardware macrocells for reliable high-speed networking based on the SpaceWire standard. Target applications include on-board distributed control systems for aerospace, military, avionics or automotive scenarios. Due to CMOS technology scaling, digital ICs are becoming more susceptible to radiation effects, both total

  5. Hazardous substance liability insurance

    SciTech Connect

    Not Available

    1982-03-01

    The study was carried out to meet requirements of the Comprehensive Environmental Response, Compensation and Liability Act of 1980. It considers the adequacy and feasibility of private insurance to protect owners and operators of ships covered by the Act and for post-closure financial responsibility for hazardous waste disposal facilities. The report is in three parts: Pt. 1 is an introduction to the hazardous substance insurance problem; Pt. 2 considers the adequacy of private insurance for owners and operators of vessels and facilities; Pt. 3 focuses on the problem of a private insurance alternative to the Post-Closure Liability Fund for 'inactive' hazardous waste disposal facilities.

  6. Determinants of spatial and temporal patterns in compensatory wetland mitigation.

    PubMed

    BenDor, Todd; Brozović, Nicholas

    2007-09-01

    Development projects that impact wetlands commonly require compensatory mitigation, usually through creation or restoration of wetlands on or off the project site. Over the last decade, federal support has increased for third-party off-site mitigation methods. At the same time, regulators have lowered the minimum impact size that triggers the requirement for compensatory mitigation. Few studies have examined the aggregate impact of individual wetland mitigation projects. No previous study has compared the choice of mitigation method by regulatory agency or development size. We analyze 1058 locally and federally permitted wetland mitigation transactions in the Chicago region between 1993 and 2004. We show that decreasing mitigation thresholds have had striking effects on the methods and spatial distribution of wetland mitigation. In particular, the observed increase in mitigation bank use is driven largely by the needs of the smallest impacts. Conversely, throughout the time period studied, large developments have rarely used mitigation banking, and have been relatively unaffected by changing regulatory focus and banking industry growth. We surmise that small developments lack the scale economies necessary for feasible permittee responsible mitigation. Finally, we compare the rates at which compensation required by both county and federal regulators is performed across major watershed boundaries. We show that local regulations prohibiting cross-county mitigation lead to higher levels of cross-watershed mitigation than federal regulations without cross-county prohibitions. Our data suggest that local control over wetland mitigation may prioritize administrative boundaries over hydrologic function in the matter of selecting compensation sites. PMID:17602255

  7. Determinants of Spatial and Temporal Patterns in Compensatory Wetland Mitigation

    NASA Astrophysics Data System (ADS)

    Bendor, Todd; Brozović, Nicholas

    2007-09-01

    Development projects that impact wetlands commonly require compensatory mitigation, usually through creation or restoration of wetlands on or off the project site. Over the last decade, federal support has increased for third-party off-site mitigation methods. At the same time, regulators have lowered the minimum impact size that triggers the requirement for compensatory mitigation. Few studies have examined the aggregate impact of individual wetland mitigation projects. No previous study has compared the choice of mitigation method by regulatory agency or development size. We analyze 1058 locally and federally permitted wetland mitigation transactions in the Chicago region between 1993 and 2004. We show that decreasing mitigation thresholds have had striking effects on the methods and spatial distribution of wetland mitigation. In particular, the observed increase in mitigation bank use is driven largely by the needs of the smallest impacts. Conversely, throughout the time period studied, large developments have rarely used mitigation banking, and have been relatively unaffected by changing regulatory focus and banking industry growth. We surmise that small developments lack the scale economies necessary for feasible permittee responsible mitigation. Finally, we compare the rates at which compensation required by both county and federal regulators is performed across major watershed boundaries. We show that local regulations prohibiting cross-county mitigation lead to higher levels of cross-watershed mitigation than federal regulations without cross-county prohibitions. Our data suggest that local control over wetland mitigation may prioritize administrative boundaries over hydrologic function in the matter of selecting compensation sites.

  8. Rockfall hazard assessment by using terrestrial laser scanning. A case study in Funchal (Madeira)

    NASA Astrophysics Data System (ADS)

    Nguyen, Hieu Trung; Fernandez-Steeger, Tomas; Domingos, Rodriguez; Wiatr, Thomas; Azzam, Rafig

    2010-05-01

    Rockfall hazard assessment in a high-relief volcanic environment is a difficult task, facing the challenge of missing standard rating systems and procedures. As in other mountainous areas, further handicaps are the restricted accessibility of the rock faces and the high effort, in terms of time and labour, required to identify and rate these problems. To develop a procedure for rockfall hazard assessment, the island of Madeira is a good research area in which to investigate rockfalls in a volcanic environment under sub-tropical humid climate conditions. As the entire island is characterised by high mountain ridges and steep, deep valleys cut into lava flows and tuff layers, the occurrence of rockfalls is a frequent and serious problem. These hazards are the most frequent ones causing severe damage to infrastructure and fatalities. In this research, slopes in Funchal city have been mapped and investigated regarding their rockfall hazard potential. The analysed slopes are built up of lava flows with columnar structures and intercalated breccias, pyroclastics or tuff layers. Many of the columns already lack basal support and show a wide joint spacing, threatening houses and streets in the city. TLS data acquisitions in May and December 2008 provide information for detailed structural analysis, detection of unstable areas within a slope and rockfall simulations. High-resolution scans have been recorded on uncovered rock surfaces with detectable joints, while in areas with dense vegetation a lower resolution has been chosen. Although it makes sense to scan an entire area at the best acquirable resolution, the resulting enormous data volumes require powerful computing environments and slow down data processing. To speed up the data processing, a conventional local digital elevation model (DEM) forms the basic geometric model. Its main disadvantage is that it cannot represent overhanging parts or notches within the steep slopes, which have an important influence on the accuracy of any rockfall simulation. By integrating the high-resolution TLS scans into the local DEM, an improvement close to a purely high-resolution digital elevation model (HRDEM) can be achieved. The rockfall hazard assessment starts with a comparison of time-shifted datasets and with additional automatic jointing analysis. Based on these data, 3-D displacements and the associated kinematic failure mechanisms can be identified. Using this information, it becomes possible to determine specific parameters for numerical rockfall simulations such as average block sizes, shapes or potential sources. Including additional data such as surface roughness, the results of numerical rockfall simulations allow different hazard areas to be classified based on run-out distances, frequency of impacts and related kinetic energy. The analysis shows that rockfall preferentially occurs in areas where notches and undercuts appear, due to the lower erosion resistance of pyroclastics or tuff layers. In case of a rockfall, the typical blocks have a cylindrical shape and a volume of 1 m3 and are able to hit the entire area. The results can help to provide useful information for civil protection and engineering countermeasures. Repeated TLS scans of the same area will allow continued observation of the progress of instability and mass movement occurrence.

  9. USGS National Seismic Hazard Maps

    NSDL National Science Digital Library

    This set of resources provides seismic hazard assessments and information on design values and mitigation for the U.S. and areas around the world. Map resources include the U.S. National and Regional probabilistic ground motion map collection, which covers the 50 states, Puerto Rico, and selected countries. These maps display peak ground acceleration (PGA) values, and are used as the basis for seismic provisions in building codes and for new construction. There is also a custom mapping and analysis tool, which enables users to re-plot these maps for an area of interest, get hazard values using latitude/longitude or zip code, find predominant magnitudes and distances, and map the probability of a given magnitude within a certain distance from a site. The ground motion calculator, a Java application, determines hazard curves, uniform hazard response spectra, and design parameters for sites in the 50 states and most territories. There is also a two-part earthquake hazards 'primer', which provides links to hazard maps and frequently asked questions, and more detailed information for building and safety planners.

  10. Area-scale landslide hazard and risk assessment

    NASA Astrophysics Data System (ADS)

    Romeo, Roberto W.; Floris, Mario; Veneri, Francesco

    2006-10-01

    The paper deals with a methodology for quantitative landslide hazard and risk assessments over wide-scale areas. The approach was designed to fulfil the following requirements: (1) rapid investigation of large study areas; (2) use of elementary information, in order to satisfy the first requirement and to ensure validation, repetition and real-time updating of the assessments every time new data are available; (3) computation of the landslide frequency of occurrence, in order to compare different hazard conditions objectively and to minimize references to qualitative hazard attributes such as activity states. The idea of multi-temporal analysis set forth by Cardinali et al. (Nat Hazards Earth Syst Sci 2:57-72, 2002) is extended here to compute the average recurrence time for individual landslides and to forecast their behaviour within reference time periods. The method is based on the observation of landslide activity through aerial-photo surveys carried out at several time steps. The output is a landslide hazard map showing the mean return period of landslide reactivation. Assessing the hazard in a quantitative way allows the risk to be estimated quantitatively as well; thus, the probability that exposed elements (such as people and real estate) suffer damage due to the occurrence of landslides can be calculated. The methodology presented here is illustrated with reference to a sample area in Central Italy (Umbria region), for which both the landslide hazard and the risk to human life are analysed and computed. Results show the power of the quantitative approach for assessing the exposure of human activities to the landslide threat and for choosing the countermeasures needed to mitigate the risk.
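
    A minimal sketch of the recurrence computation underlying such a map is given below: the number of reactivations observed for each landslide over the aerial-photo observation window yields a mean return period and, assuming Poisson occurrence, a probability of reactivation within a reference period. The landslide identifiers, counts and time spans are hypothetical.

```python
# Mean return period and reactivation probability from multi-temporal
# air-photo counts, assuming Poisson occurrence. All inputs are hypothetical.
import math

observation_window_yr = 50.0          # span covered by the aerial-photo surveys
reference_period_yr = 10.0            # forecast horizon

reactivations = {"LS-01": 5, "LS-02": 2, "LS-03": 1}   # hypothetical counts

for landslide, n in reactivations.items():
    mean_return_period = observation_window_yr / n
    # Poisson probability of at least one reactivation in the reference period
    p = 1.0 - math.exp(-reference_period_yr / mean_return_period)
    print(f"{landslide}: T = {mean_return_period:.1f} yr, "
          f"P(>=1 event in {reference_period_yr:.0f} yr) = {p:.2f}")
```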

  11. Modeling, Forecasting and Mitigating Extreme Earthquakes

    NASA Astrophysics Data System (ADS)

    Ismail-Zadeh, A.; Le Mouel, J.; Soloviev, A.

    2012-12-01

    Recent earthquake disasters highlighted the importance of multi- and trans-disciplinary studies of earthquake risk. A major component of earthquake disaster risk analysis is hazards research, which should cover not only a traditional assessment of ground shaking, but also studies of geodetic, paleoseismic, geomagnetic, hydrological, deep drilling and other geophysical and geological observations together with comprehensive modeling of earthquakes and forecasting extreme events. Extreme earthquakes (large magnitude and rare events) are manifestations of complex behavior of the lithosphere structured as a hierarchical system of blocks of different sizes. Understanding of physics and dynamics of the extreme events comes from observations, measurements and modeling. A quantitative approach to simulate earthquakes in models of fault dynamics will be presented. The models reproduce basic features of the observed seismicity (e.g., the frequency-magnitude relationship, clustering of earthquakes, occurrence of extreme seismic events). They provide a link between geodynamic processes and seismicity, allow studying extreme events, influence of fault network properties on seismic patterns and seismic cycles, and assist, in a broader sense, in earthquake forecast modeling. Some aspects of predictability of large earthquakes (how well can large earthquakes be predicted today?) will be also discussed along with possibilities in mitigation of earthquake disasters (e.g., on 'inverse' forensic investigations of earthquake disasters).

  12. A probabilistic tsunami hazard assessment for Indonesia

    NASA Astrophysics Data System (ADS)

    Horspool, N.; Pranantyo, I.; Griffin, J.; Latief, H.; Natawidjaja, D. H.; Kongko, W.; Cipta, A.; Bustaman, B.; Anugrah, S. D.; Thio, H. K.

    2014-11-01

    Probabilistic hazard assessments are a fundamental tool for assessing the threats posed by hazards to communities and are important for underpinning evidence-based decision-making regarding risk mitigation activities. Indonesia has been the focus of intense tsunami risk mitigation efforts following the 2004 Indian Ocean tsunami, but this has been largely concentrated on the Sunda Arc with little attention to other tsunami-prone areas of the country such as eastern Indonesia. We present the first nationally consistent probabilistic tsunami hazard assessment (PTHA) for Indonesia. This assessment produces time-independent forecasts of tsunami hazards at the coast using data from tsunamis generated by local, regional and distant earthquake sources. The methodology is based on the established Monte Carlo approach to probabilistic seismic hazard assessment (PSHA) and has been adapted to tsunami. We account for sources of epistemic and aleatory uncertainty in the analysis through the use of logic trees and sampling probability density functions. For short return periods (100 years) the highest tsunami hazard is along the west coast of Sumatra, the south coast of Java and the north coast of Papua. For longer return periods (500-2500 years), the tsunami hazard is highest along the Sunda Arc, reflecting the larger maximum magnitudes. The annual probability of experiencing a tsunami with a height of > 0.5 m at the coast is greater than 10% for Sumatra, Java, the Sunda islands (Bali, Lombok, Flores, Sumba) and north Papua. The annual probability of experiencing a tsunami with a height of > 3.0 m, which would cause significant inundation and fatalities, is 1-10% in Sumatra, Java, Bali, Lombok and north Papua, and 0.1-1% for north Sulawesi, Seram and Flores. The results of this national-scale hazard assessment provide evidence for disaster managers to prioritise regions for risk mitigation activities and/or more detailed hazard or risk assessment.
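
    The Monte Carlo logic referred to in the abstract can be sketched as follows: a long synthetic catalogue is drawn from a truncated Gutenberg-Richter source model, each event is mapped to a coastal tsunami height with aleatory scatter, and annual exceedance probabilities are counted for the 0.5 m and 3.0 m thresholds quoted above. The source parameters and the magnitude-to-height scaling are illustrative assumptions, not the relations used in the Indonesian PTHA.

```python
# Toy Monte Carlo tsunami hazard sketch: synthetic catalogue -> coastal heights
# -> annual exceedance probabilities. All parameters are assumed placeholders.
import numpy as np

rng = np.random.default_rng(0)
years = 100_000                 # length of the synthetic catalogue
annual_rate = 0.2               # events/yr with M >= Mmin on the source (assumed)
m_min, m_max, b = 7.0, 9.0, 1.0 # truncated Gutenberg-Richter parameters (assumed)

n_events = rng.poisson(annual_rate * years)

# Sample magnitudes from the truncated Gutenberg-Richter distribution.
u = rng.random(n_events)
beta = b * np.log(10.0)
mags = m_min - np.log(1.0 - u * (1.0 - np.exp(-beta * (m_max - m_min)))) / beta

# Hypothetical median coastal height vs. magnitude, with lognormal scatter.
median_height = 10.0 ** (0.8 * (mags - 8.0))          # metres (assumed scaling)
heights = median_height * rng.lognormal(0.0, 0.6, n_events)

for threshold in (0.5, 3.0):
    exceedances_per_year = np.count_nonzero(heights > threshold) / years
    annual_prob = 1.0 - np.exp(-exceedances_per_year)
    print(f"h > {threshold} m: annual probability ~ {annual_prob:.3f}")
```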

  13. Electric shock hazard

    Microsoft Academic Search

    Charles F. Dalziel

    1972-01-01

    A long-standing expert on electric-shock hazards summarizes the studies that determined the effective body impedance under varying conditions. He describes perception currents, reaction currents, let-go currents, and fibrillating currents. Turning to means for reducing low-voltage (120-240-volt) hazards, double insulation, shock limitation, isolation transformers, and the use of either high frequency or direct current are discussed for various environments. Macroshock is

  14. Pilot studies of seismic hazard and risk in North Sulawesi Province, Indonesia

    USGS Publications Warehouse

    Thenhaus, P.C.; Hanson, S.L.; Effendi, I.; Kertapati, E.K.; Algermissen, S.T.

    1993-01-01

    Earthquake ground motions in North Sulawesi on soft soil that have a 90% probability of not being exceeded in 560 years are estimated to be 0.63 g (63% of the acceleration of gravity) at Palu, 0.31 g at Gorontalo, and 0.27 g at Manado. Estimated ground motions for rock conditions for the same probability level and exposure time are 56% of those for soft soil. The hazard estimates are obtained from seismic sources that model the earthquake potential to a depth of 100 km beneath northern and central Sulawesi and include the Palu fault zone of western Sulawesi, the North Sulawesi subduction zone, and the southernmost segment of the Sangihe subduction zone beneath the Molucca Sea. An attenuation relation based on Japanese strong-motion data and considered appropriate for subduction environments of the western Pacific was used in determination of ground motions. Following the 18 April 1990 North Sulawesi earthquake (Ms 7.3) a seismic hazard and risk assessment was carried out. -from Authors
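
    For orientation, a non-exceedance probability over a given exposure time corresponds to a mean return period through the usual Poissonian relation (a generic identity, not a detail of this pilot study):

```latex
% Probability P of non-exceedance of a ground-motion level over an exposure
% time t, for a level with mean return period T (Poisson assumption):
\begin{align}
  P &= e^{-t/T}, &
  T &= \frac{-t}{\ln P}.
\end{align}
```

    With the quoted P = 0.90 and t = 560 years, this gives T = -560 / ln 0.90, roughly 5300 years.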

  15. SOME APPLICATIONS OF SEISMIC SOURCE MECHANISM STUDIES TO ASSESSING UNDERGROUND HAZARD.

    USGS Publications Warehouse

    McGarr, A.

    1984-01-01

    Various measures of the seismic source mechanism of mine tremors, such as magnitude, moment, stress drop, apparent stress, and seismic efficiency, can be related directly to several aspects of the problem of determining the underground hazard arising from strong ground motion of large seismic events. First, the relation between the sum of seismic moments of tremors and the volume of stope closure caused by mining during a given period can be used in conjunction with magnitude-frequency statistics and an empirical relation between moment and magnitude to estimate the maximum possible size of tremor for a given mining situation. Second, it is shown that the 'energy release rate,' a commonly used parameter for predicting underground seismic hazard, may be misleading in that the importance of overburden stress, or depth, is overstated. Third, results involving the relation between peak velocity and magnitude, magnitude-frequency statistics, and the maximum possible magnitude are applied to the problem of estimating the frequency at which the design limits of certain underground support equipment are likely to be exceeded.
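
    For reference, the magnitude-frequency statistics and the empirical moment-magnitude relation invoked above are usually written in the following standard forms (the coefficients fitted to mine tremors may differ):

```latex
% Gutenberg-Richter magnitude-frequency statistics and the standard moment
% magnitude definition (IASPEI convention, seismic moment M_0 in N m):
\begin{align}
  \log_{10} N(\geq M) &= a - bM, \\
  M_\mathrm{w} &= \tfrac{2}{3}\left(\log_{10} M_0 - 9.1\right).
\end{align}
```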

  16. Evaluation of soil bioassays for use at Washington state hazardous waste sites: A pilot study

    SciTech Connect

    Blakley, N.; Norton, D.; Stinson, M. [Washington Department of Ecology, Olympia, WA (United States); Boyer, R. [WCFWRU, Seattle, WA (United States). School of Fisheries

    1994-12-31

    The Washington State Department of Ecology (Ecology) is developing guidelines to assess soil toxicity at hazardous waste sites being investigated under the Washington Model Toxics Control Act Cleanup Regulation. To evaluate soil toxicity, Ecology selected five bioassay protocols -- Daphnia, Earthworm, Seedling, Fathead Minnow, and Frog Embryo Teratogenesis Assay Xenopus (FETAX) -- for use as screening-level assessment tools at six State hazardous waste sites. Sites contained a variety of contaminants including metals, creosote, pesticides, and petroleum products (leaking underground storage tanks). Three locations, representing high, medium, and low levels of contamination, were sampled at each site. In general, the high contaminant samples resulted in the highest toxic response in all bioassays. The order of site toxicity, as assessed by overall toxic response, is creosote, petroleum products, metals, and pesticides. Results indicate that human health standards, especially for metals, may not adequately protect some of the species tested. The FETAX bioassay had the greatest overall number of toxic responses and lowest variance. The seedling and Daphnia bioassays had lower and similar overall toxic response results, followed by the earthworm and fathead minnow. Variability was markedly highest for the seedling bioassay. The Daphnia and fathead minnow variability were similar to the FETAX level, while the earthworm variability was slightly higher.

  17. ATTITUDES TOWARD ENVIRONMENTAL HAZARDS: WHERE DO TOXIC WASTES FIT?

    Microsoft Academic Search

    Joanna Burger; Malcolm Martin; Keith Cooper; Michael Gochfeld

    1997-01-01

    The public is continually faced with making decisions about the risks associated with environmental hazards, and, along with managers and government officials, must make informed decisions concerning possible regulation, mitigation, and restoration of degraded sites or other environmental threats. We explored the attitudes regarding several environmental hazards of six groups of people: undergraduate science majors, undergraduate nonscience majors,

  18. Attitudes toward environmental hazards: Where do toxic wastes fit?

    Microsoft Academic Search

    Joanna Burger; K. Cooper; M. Martin; Michael Gochfeld

    1997-01-01

    The public is continually faced with making decisions about the risks associated with environmental hazards, and, along with managers and government officials, must make informed decisions concerning possible regulation, mitigation, and restoration of degraded sites or other environmental threats. We explored the attitudes regarding several environmental hazards of six groups of people: undergraduate science majors, undergraduate nonscience majors, and graduate

  19. Hazard and consequence analysis for waste emplacement at the Waste Isolation Pilot Plant

    SciTech Connect

    Gerstner, D.M.; Clayton, S.G.; Farrell, R.F.; McCormick, J.A.; Ortiz, C.; Standiford, D.L.

    1996-05-01

    The Carlsbad Area Office established and analyzed the safety bases for the design and operations as documented in the WIPP Safety Analysis Report (SAR). Additional independent efforts are currently underway to assess the hazards associated with the long-term (10,000 year) isolation period as required by 40 CFR 191. The structure of the WIPP SAR is unique due to the hazards involved, and the agreement between the State of New Mexico and the DOE regarding SAR content and format. However, the hazards and accident analysis philosophy as contained in DOE-STD-3009-94 was followed as closely as possible, while adhering to state agreements. Hazards associated with WIPP waste receipt, emplacement, and disposal operations were systematically identified using a modified Hazard and Operability Study (HAZOP) technique. The WIPP HAZOP assessed the potential internal, external, and natural phenomena events that can cause the identified hazards to develop into accidents. The hazard assessment identified deviations from the intended design and operation of the waste handling system, analyzed potential accident consequences to the public and workers, estimated likelihood of occurrence, and evaluated associated preventative and mitigative features. It was concluded from the assessment that the proposed WIPP waste emplacement operations and design are sufficient to ensure safety of the public, workers, and environment, over the 35 year disposal phase.

  20. Voluntary Climate Change Mitigation Actions of Young Adults: A Classification of Mitigators through Latent Class Analysis

    PubMed Central

    Korkala, Essi A. E.; Hugg, Timo T.; Jaakkola, Jouni J. K.

    2014-01-01

    Encouraging individuals to take action is important for the overall success of climate change mitigation. Campaigns promoting climate change mitigation could address particular groups of the population on the basis of what kind of mitigation actions the group is already taking. To increase the knowledge of such groups performing similar mitigation actions we conducted a population-based cross-sectional study in Finland. The study population comprised 1623 young adults who returned a self-administered questionnaire (response rate 64%). Our aims were to identify groups of people engaged in similar climate change mitigation actions and to study the gender differences in the grouping. We also determined if socio-demographic characteristics can predict group membership. We performed latent class analysis using 14 mitigation actions as manifest variables. Three classes were identified among men: the Inactive (26%), the Semi-active (63%) and the Active (11%), and two classes among women: the Semi-active (72%) and the Active (28%). The Active among both genders were likely to have mitigated climate change through several actions, such as recycling, using environmentally friendly products, preferring public transport, and conserving energy. The Semi-active had most probably recycled and preferred public transport because of climate change. The Inactive, a class identified among men only, had very probably done nothing to mitigate climate change. Among males, being single or divorced predicted little involvement in climate change mitigation. Among females, those without a tertiary degree and those with an annual income of €16,801 or less were less involved in climate change mitigation. Our results illustrate to what extent young adults are engaged in climate change mitigation and which factors predict little involvement in mitigation, and give insight into which segments of the public could be the audiences of targeted mitigation campaigns. PMID:25054549
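
    The latent class model used in the study can be sketched with a small EM fit on binary action indicators. The sketch below uses simulated responses and a hand-rolled EM purely for illustration (the authors could equally have used dedicated LCA software), so the class counts and probabilities are not those of the Finnish sample.

```python
# Minimal latent class analysis (binary items) fitted with EM.
# Data are simulated; class labels such as Inactive/Semi-active/Active are
# only recovered implicitly through the estimated action probabilities.
import numpy as np

rng = np.random.default_rng(1)

def fit_lca(X, n_classes, n_iter=200, tol=1e-6):
    """EM for a latent class model with binary items X (n_obs x n_items)."""
    n, m = X.shape
    pi = np.full(n_classes, 1.0 / n_classes)              # class prevalences
    theta = rng.uniform(0.25, 0.75, size=(n_classes, m))  # P(item = 1 | class)
    prev_ll = -np.inf
    for _ in range(n_iter):
        # E-step: log joint P(x_i, class k), then posterior responsibilities.
        log_p = (np.log(pi)[None, :]
                 + X @ np.log(theta).T
                 + (1 - X) @ np.log(1 - theta).T)          # shape (n, K)
        log_norm = np.logaddexp.reduce(log_p, axis=1, keepdims=True)
        resp = np.exp(log_p - log_norm)
        # M-step: update prevalences and item-response probabilities.
        nk = resp.sum(axis=0)
        pi = nk / n
        theta = np.clip((resp.T @ X) / nk[:, None], 1e-6, 1 - 1e-6)
        ll = log_norm.sum()
        if ll - prev_ll < tol:
            break
        prev_ll = ll
    return pi, theta, resp

# Simulate 14 hypothetical yes/no mitigation actions for 1600 respondents,
# drawn from three true classes with low/medium/high action probabilities.
true_theta = np.vstack([np.full(14, p) for p in (0.1, 0.4, 0.8)])
z = rng.choice(3, size=1600, p=[0.25, 0.6, 0.15])
X = (rng.random((1600, 14)) < true_theta[z]).astype(float)

pi, theta, resp = fit_lca(X, n_classes=3)
print("estimated class shares:", np.round(pi, 2))
print("mean action probability per class:", np.round(theta.mean(axis=1), 2))
```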

  1. Recording and cataloging hazards information, revision A

    NASA Technical Reports Server (NTRS)

    Stein, R. J.

    1974-01-01

    A data collection process is described for the purpose of discerning the causation factors of accidents and establishing boundaries or controls aimed at mitigating and eliminating accidents. A procedure is proposed that suggests a disciplined approach to hazard identification based on energy interrelationships, together with an integrated control technique which takes the form of checklists.

  2. Natural Hazards Journal of the International

    E-print Network

    varying socio-cultural settings, with different geological characteristics and different debris flow, is that socio-cultural issues are at least as important as technical choices on the effectiveness of hazard-triggering events. Keywords: Debris flow · Socio-cultural · Mitigation · Vulnerability. 1 Introduction. Debris flows

  3. CPSC experiment: pitfalls of hazard management

    Microsoft Academic Search

    T. Bick; R. E. Kasperson

    1978-01-01

    The article is the third in a series dealing with the problems of hazards and hazard management. In the first article, the full range of hazard impacts and consequences was explored. In the second, some generic problems that beset hazard management were discussed. The present article is the first of several case studies focused on the Federal regulatory effort. In treating

  4. Thinking of Wildfire as a Natural Hazard

    Microsoft Academic Search

    SARAH MCCAFFREY

    2004-01-01

    Natural hazards theory with its emphasis on understanding the human–hazard interaction has much to offer in better understanding how individuals respond to the wildfire hazard. Ironically, very few natural hazards studies have actually looked at wildfires, despite the insights the field might offer. This report is structured around four interrelated questions that are often heard from individuals involved with wildfire

  5. Automated Standard Hazard Tool

    NASA Technical Reports Server (NTRS)

    Stebler, Shane

    2014-01-01

    The current system used to generate standard hazard reports is considered cumbersome and iterative. This study defines a structure for this system's process in a clear, algorithmic way so that standard hazard reports and basic hazard analysis may be completed using a centralized, web-based computer application. To accomplish this task, a test server is used to host a prototype of the tool during development. The prototype is configured to easily integrate into NASA's current server systems with minimal alteration. Additionally, the tool is easily updated and provides NASA with a system that may grow to accommodate future requirements and, possibly, different applications. The project's success is reflected in positive, subjective reviews completed by payload providers and NASA Safety and Mission Assurance personnel. Ideally, this prototype will increase interest in the concept of standard hazard automation and lead to the full-scale production of a user-ready application.

  6. Saving Lives and Mitigating Losses

    E-print Network

    Duchowski, Andrew T.

    thicker version of the original nail], with 20,000 pounds of load on an 8-foot shear wall mimicking what ... Saving Lives and Mitigating Losses: Wind and Structural Engineering Research Facility. Clemson University's Wind and Structural Engineering Research (WiSER) Facility is a premier laboratory for the study

  7. A Moore's cellular automaton model to get probabilistic seismic hazard maps for different magnitude releases: A case study for Greece

    NASA Astrophysics Data System (ADS)

    Jiménez, A.; Posadas, A. M.

    2006-09-01

    Cellular automata are simple mathematical idealizations of natural systems and they supply useful models for many investigations in natural science. Examples include sandpile models, forest fire models, and slider block models used in seismology. In the present paper, they have been used for establishing temporal relations between the energy releases of the seismic events that occurred in neighboring parts of the crust. The catalogue is divided into time intervals, and the region is divided into cells which are declared active or inactive by means of a threshold energy release criterion. Thus, a pattern of active and inactive cells which evolves over time is determined. A stochastic cellular automaton is constructed starting from these patterns, in order to simulate their spatio-temporal evolution, by assuming a Moore neighborhood interaction between the cells. The best model is chosen by maximizing the mutual information between the past and the future states. Finally, a Probabilistic Seismic Hazard Map is given for the different energy releases considered. The method has been applied to the Greece catalogue from 1900 to 1999. The Probabilistic Seismic Hazard Maps for energies corresponding to m = 4 and m = 5 are close to the real seismicity observed in the area after the catalogue period, and they correspond to a background seismicity over the whole area. This background seismicity seems to cover the whole area in periods of around 25-50 years. The optimum cell size is in agreement with other studies; for m > 6 the optimum cell size increases up to the threshold of clear spatial resolution, and the active cells are not so clustered. The results are coherent with other hazard studies in the zone and with the seismicity recorded after the data set, and they provide an interaction model which points out the large-scale nature of earthquake occurrence.
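
    A minimal sketch of the stochastic cellular automaton idea is given below: binary active/inactive grids per time interval are used to estimate the probability that a cell is active in the next interval as a function of the number of active cells in its Moore neighbourhood, and that conditional probability is then applied to the last observed pattern to produce a one-step probabilistic map. The input grids are synthetic, and the model-selection step via mutual information is omitted.

```python
# Stochastic Moore-neighbourhood cellular automaton sketch on synthetic
# active/inactive grids; the real model is calibrated on an earthquake
# catalogue and selected by maximising past/future mutual information.
import numpy as np

rng = np.random.default_rng(2)

def moore_neighbour_count(grid):
    """Number of active Moore neighbours of every cell (zero-padded borders)."""
    padded = np.pad(grid, 1)
    count = np.zeros_like(grid, dtype=int)
    for di in (-1, 0, 1):
        for dj in (-1, 0, 1):
            if di or dj:
                count += padded[1 + di:padded.shape[0] - 1 + di,
                                1 + dj:padded.shape[1] - 1 + dj]
    return count

# Synthetic stack of active/inactive patterns (one grid per time interval).
patterns = (rng.random((40, 20, 20)) < 0.2).astype(int)

# Estimate P(active at t+1 | k active Moore neighbours at t), k = 0..8.
hits = np.zeros(9)
trials = np.zeros(9)
for t in range(len(patterns) - 1):
    k = moore_neighbour_count(patterns[t])
    for kk in range(9):
        mask = (k == kk)
        trials[kk] += mask.sum()
        hits[kk] += patterns[t + 1][mask].sum()
p_active = np.divide(hits, trials, out=np.zeros(9), where=trials > 0)

# One-step probabilistic forecast from the last observed pattern; this map of
# activation probabilities plays the role of the hazard map for the chosen
# energy (magnitude) threshold.
forecast = p_active[moore_neighbour_count(patterns[-1])]
print("mean forecast activation probability:", round(float(forecast.mean()), 3))
```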

  8. Volcanic Hazard Education through Virtual Field studies of Vesuvius and Laki Volcanoes

    NASA Astrophysics Data System (ADS)

    Carey, S.; Sigurdsson, H.

    2011-12-01

    Volcanic eruptions pose significant hazards to human populations and have the potential to cause significant economic impacts as shown by the recent ash-producing eruptions in Iceland. Demonstrating both the local and global impact of eruptions is important for developing an appreciation of the scale of hazards associated with volcanic activity. In order to address this need, Web-based virtual field exercises at Vesuvius volcano in Italy and Laki volcano in Iceland have been developed as curriculum enhancements for undergraduate geology classes. The exercises are built upon previous research by the authors dealing with the 79 AD explosive eruption of Vesuvius and the 1783 lava flow eruption of Laki. QuickTime virtual reality images (QTVR), video clips, user-controlled Flash animations and interactive measurement tools are used to allow students to explore archeological and geological sites, collect field data in an electronic field notebook, and construct hypotheses about the impacts of the eruptions on the local and global environment. The QTVR images provide 360° views of key sites where students can observe volcanic deposits and formations in the context of a defined field area. Video sequences from recent explosive and effusive eruptions of Caribbean and Hawaiian volcanoes are used to illustrate specific styles of eruptive activity, such as ash fallout, pyroclastic flows and surges, and lava flows and their effects on the surrounding environment. The exercises use an inquiry-based approach to build critical relationships between volcanic processes and the deposits that they produce in the geologic record. A primary objective of the exercises is to simulate the role of a field volcanologist who collects information from the field and reconstructs the sequence of eruptive processes based on specific features of the deposits. Testing of the Vesuvius and Laki exercises in undergraduate classes from a broad spectrum of educational institutions shows a preference for the web-based interactive tools compared with traditional paper-based laboratory exercises. The exercises are freely accessible for undergraduate classes such as introductory geology, geologic hazards, or volcanology. Accompanying materials, such as lecture-based PowerPoint presentations about Vesuvius and Laki, are also being developed for instructors to better integrate the web-based exercises into their existing curriculum.

  9. Preliminary hazards analysis -- vitrification process

    SciTech Connect

    Coordes, D.; Ruggieri, M.; Russell, J.; TenBrook, W.; Yimbo, P. [Science Applications International Corp., Pleasanton, CA (United States)] [Science Applications International Corp., Pleasanton, CA (United States)

    1994-06-01

    This paper presents a Preliminary Hazards Analysis (PHA) for mixed waste vitrification by joule heating. The purpose of performing a PHA is to establish an initial hazard categorization for a DOE nuclear facility and to identify those processes and structures which may have an impact on or be important to safety. The PHA is typically performed during, and provides input to, project conceptual design. The PHA is then followed by a Preliminary Safety Analysis Report (PSAR) performed during Title 1 and 2 design. The PSAR then leads to performance of the Final Safety Analysis Report, performed during the facility's construction and testing, which should be completed before routine operation of the facility commences. This PHA addresses the first four chapters of the safety analysis process, in accordance with the requirements of DOE Safety Guidelines in SG 830.110. The hazards associated with vitrification processes are evaluated using standard safety analysis methods which include: identification of credible potential hazardous energy sources; identification of preventative features of the facility or system; identification of mitigative features; and analyses of credible hazards. Maximal facility inventories of radioactive and hazardous materials are postulated to evaluate worst case accident consequences. These inventories were based on DOE-STD-1027-92 guidance and the surrogate waste streams defined by Mayberry, et al. Radiological assessments indicate that a facility, depending on the radioactive material inventory, may be an exempt, Category 3, or Category 2 facility. The calculated impacts would result in no significant impact to offsite personnel or the environment. Hazardous materials assessment indicates that a Mixed Waste Vitrification facility will be a Low Hazard facility having minimal impacts to offsite personnel and the environment.

  10. Study to determine the possible hazard of methylmercury in seafood to the fetus in utero. Final report, 1980-1985

    SciTech Connect

    Marsh, D.O.; Turner, M.D.; Smith, J.C.

    1985-12-01

    This study was conducted to determine the possible hazards of methylmercury (MeHg) in seafood to the fetus in utero. Hair and blood samples of pregnant women in New Bedford, MA, Manta, Ecuador, and Mancora, Peru (all areas of high seafood consumption) were examined. These samples were collected and studied at various stages of gestation and during pre- and post-natal periods. In some cases, blood and hair samples of some of the infants born to these women were also examined. The women of Manta and Mancora showed higher levels of MeHg than the women in New Bedford. However, no health hazards to any of the infants could be linked to the ingestion of MeHg in marine fish. Although no human data exist, experimental evidence suggests that marine fish may contain elements that reduce the toxicity of MeHg, and it is possible that selenium contributes to the protective effect of fish versus grain diets. Tables of MeHg levels in the study groups are attached to the report.

  11. Contributions to the Characterization and Mitigation of Rotorcraft Brownout

    NASA Astrophysics Data System (ADS)

    Tritschler, John Kirwin

    Rotorcraft brownout, the condition in which the flow field of a rotorcraft mobilizes sediment from the ground to generate a cloud that obscures the pilot's field of view, continues to be a significant hazard to civil and military rotorcraft operations. This dissertation presents methodologies for: (i) the systematic mitigation of rotorcraft brownout through operational and design strategies and (ii) the quantitative characterization of the visual degradation caused by a brownout cloud. In Part I of the dissertation, brownout mitigation strategies are developed through simulation-based brownout studies that are mathematically formulated within a numerical optimization framework. Two optimization studies are presented. The first study involves the determination of approach-to-landing maneuvers that result in reduced brownout severity. The second study presents a potential methodology for the design of helicopter rotors with improved brownout characteristics. The results of both studies indicate that the fundamental mechanisms underlying brownout mitigation are aerodynamic in nature, and the evolution of a ground vortex ahead of the rotor disk is seen to be a key element in the development of a brownout cloud. In Part II of the dissertation, brownout cloud characterizations are based upon the Modulation Transfer Function (MTF), a metric commonly used in the optics community for the characterization of imaging systems. The use of the MTF in experimentation is examined first, and the application of MTF calculation and interpretation methods to actual flight test data is described. The potential for predicting the MTF from numerical simulations is examined second, and an initial methodology is presented for the prediction of the MTF of a brownout cloud. Results from the experimental and analytical studies rigorously quantify the intuitively-known facts that the visual degradation caused by brownout is a space and time-dependent phenomenon, and that high spatial frequency features, i.e., fine-grained detail, are obscured before low spatial frequency features, i.e., large objects. As such, the MTF is a metric that is amenable to Handling Qualities (HQ) analyses.
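
    The MTF characterization described in Part II can be sketched as follows: an edge imaged through the obscurant gives an edge spread function, whose derivative (the line spread function) is Fourier-transformed and normalised to obtain the MTF; comparing a sharp and a blurred edge shows high spatial frequencies being attenuated first. The synthetic edges and blur levels below are illustrative, not the dissertation's flight-test processing chain.

```python
# Modulation Transfer Function from a 1-D edge: ESF -> LSF -> |FFT|, normalised.
# The "clear air" and "through cloud" edges below are synthetic placeholders.
import numpy as np
from math import erf

def mtf_from_edge(edge_profile):
    """MTF from a 1-D edge spread function."""
    lsf = np.gradient(edge_profile)          # line spread function
    lsf = lsf / lsf.sum()                    # unit area -> MTF(0) = 1
    return np.abs(np.fft.rfft(lsf))

x = np.linspace(-5.0, 5.0, 512)              # spatial axis, arbitrary units

def blurred_edge(sigma):
    """Synthetic edge spread function with Gaussian blur of width sigma."""
    return np.array([0.5 * (1.0 + erf(xi / (sigma * np.sqrt(2.0)))) for xi in x])

mtf_clear = mtf_from_edge(blurred_edge(0.05))   # sharp edge (clear air)
mtf_cloud = mtf_from_edge(blurred_edge(0.60))   # edge blurred by the cloud

# Higher spatial frequencies are attenuated first, which is the quantitative
# counterpart of fine detail disappearing before large objects do.
freqs = np.fft.rfftfreq(x.size, d=x[1] - x[0])
for f_idx in (5, 20, 60):
    print(f"f = {freqs[f_idx]:.2f} cyc/unit: "
          f"clear {mtf_clear[f_idx]:.2f}, cloud {mtf_cloud[f_idx]:.2f}")
```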

  12. Observational studies in South African mines to mitigate seismic risks: implications for mine safety and tectonic earthquakes

    NASA Astrophysics Data System (ADS)

    Durrheim, Raymond; Ogaswara, Hiroshi; Nakatani, Masao; Yabe, Yasuo; Milev, Alexander; Cichowicz, Artur; Kawakata, Hironori; Moriya, Hirokazu; Naoi, Makoto; Kgarume, Thabang; Murakami, Osamu; Mngadi, Siyanda

    2014-05-01

    Seismicity poses a significant risk to workers in deep and overstressed mines, such as the gold mines in the Witwatersrand basin of South Africa, as well as to inhabitants of earthquake-prone regions such as Japan. A 5-year collaborative project entitled "Observational studies in South African mines to mitigate seismic risks" was launched in 2010 to address these risks, drawing on over a century of South African and Japanese research experience with respect to mining-related and tectonic earthquakes, respectively. The project has three main aims: (1) to learn more about earthquake preparation and triggering mechanisms by deploying arrays of sensitive sensors within rock volumes where mining is likely to induce seismic activity; (2) to learn more about earthquake rupture and rockburst damage phenomena by deploying robust strong ground motion sensors close to potential fault zones and on stope hangingwalls; and (3) to upgrade the South African surface national seismic network in the mining districts. Research sites have been established at mines operated by Sibanye Gold (Hlanganani Shaft and Cooke #4 Shaft) and Anglogold Ashanti (Moab-Khotsong). More than 70 boreholes (totalling more than 2.8 km in length) have been drilled to locate "capable" faults, i.e. faults that are considered likely to become seismically active as a result of mining activity, and to deploy sensors. Acoustic emission sensors, strain- and tiltmeters, and controlled seismic sources were installed to monitor the deformation of the rock mass, the accumulation of damage during the earthquake preparation phase, and changes in dynamic stress produced by the propagation of the rupture front. These data are being integrated with measurements of rock properties, stope closure, stope strong motion, seismic data recorded by the mine-wide network, and stress modelling. The mid-point of the 5-year project has passed. New observations of stress and the response of the rock mass to mining have already been made, and many more are expected in the next two years as the mining front sweeps through the monitoring arrays. We will describe examples of technology adaptation and transfer, as well as preliminary research findings. The strain cell and associated tools required for the compact conical-ended borehole overcoring (CCBO) technique, which determines the 3D stress tensor from a single overcoring of a strain cell, have been reduced to the core size used in South African mines. This modified method was tested at three sites, where it was demonstrated that three overcoring measurements can be made within two shifts. A large number of acoustic emission (AE) sensors were installed at Cooke #4 mine. In the period from 30 September to 5 October 2011 the monitoring system automatically located 40,555 AE events, some of which were located by Moriya et al. using the joint hypocenter location method. Moriya et al. applied multiplet and double-difference analysis to the selected multiplets, successfully delineating multiple planar structures. Ultimately we hope that this project will produce knowledge and technology that will reduce the risk posed by both mining-induced and tectonic earthquakes.

  13. An analysis of land use planning and equity issues surrounding hazardous liquid and natural gas transmission pipelines in North Carolina

    NASA Astrophysics Data System (ADS)

    Osland, Anna Christine

    Hazardous liquid and natural gas transmission pipelines have received limited attention by planning scholars even though local development decisions can have broad consequences if a rupture occurs. In this dissertation, I evaluated the implications of land-use planning for reducing risk to transmission pipeline hazards in North Carolina via three investigations. First, using a survey of planning directors in jurisdictions with transmission pipeline hazards, I investigated the land use planning tools used to mitigate pipeline hazards and the factors associated with tool adoption. Planning scholars have documented the difficulty of inducing planning in hazardous areas, yet there remain gaps in knowledge about the factors associated with tool adoption. Despite the risks associated with pipeline ruptures, I found most localities use few mitigation tools, and the adoption of regulatory and informational tools appear to be influenced by divergent factors. Whereas risk perception, commitment, capacity, and community context were associated with total tool and information tool use, only risk perception and capacity factors were associated with regulatory tool use. Second, using interviews of emergency managers and planning directors, I examined the role of agency collaboration for building mitigation capacity. Scholars have highlighted the potential of technical collaboration, yet less research has investigated how inter-agency collaboration shapes mitigation capacity. I identify three categories of technical collaboration, discuss how collaborative spillovers can occur from one planning area to another, and challenge the notion that all technical collaborations result in equal mitigation outcomes. Third, I evaluated characteristics of the population near pipelines to address equity concerns. Surprisingly, I did not find broad support for differences in exposure of vulnerable populations. Nonetheless, my analyses uncovered statistically significant clusters of vulnerable groups within the hazard area. Interestingly, development closer to pipelines was newer than areas farther away, illustrating the failure of land-use planning to reduce development encroachment. Collectively, these results highlight the potential of land-use planning to keep people and development from encroaching on pipeline hazards. While this study indicates that planners in many areas address pipeline hazards, it also illustrates how changes to local practices can further reduce risks to human health, homeland security, and the environment.

  14. Future trends in paleoseismology: Integrated study of the seismic landscape as a vital tool in seismic hazard analyses

    NASA Astrophysics Data System (ADS)

    Michetti, Alessandro Maria; Audemard M., Franck A.; Marco, Shmuel

    2005-10-01

    This paper forms the Introduction to this Special Issue of Tectonophysics, devoted to selected scientific research presented during events sponsored by the INQUA Subcommission on Paleoseismicity in the past few years. In this note, we summarize the contents of the contributed papers and use the issues they raise to review the state-of-the-art in paleoseismology from a Quaternary geology perspective. In our opinion, the evolution of paleoseismological studies in the past decade clearly demonstrates that in order to properly understand the seismic potential of a region, and to assess the associated hazards, broad-based/multidisciplinary studies are necessary to take full advantage from the geological evidence of past earthquakes. A major challenge in future paleoseismic research is to build detailed empirical relations between various categories of coseismic effects in the natural environment and earthquake magnitude/intensity. These relations should be compiled in a way that is fully representative of the wide variety of natural environments on Earth, in terms of climatic settings, Quaternary tectonic evolution, rheological parameters of the seismogenic crust, and stress environment. For instance, available data indicate that between earthquake magnitude and surface faulting parameters different scaling laws exist, and they are a function of the local geodynamic setting (including style of faulting, typical focal depths, heat flow). In this regard, we discuss in some detail the concept of seismic landscape, which provides the necessary background for developing paleoseismological research strategies. The large amount of paleoseismological data collected in recent years shows that each earthquake source creates a signature on the geology and the geomorphology of an area that is unequivocally related with the order of magnitude of its earthquake potential. This signature is defined as the seismic landscape of the area (e.g., Serva, L., Vittori, E., Ferreli, L., Michetti, A.M., 1997. Geology and seismic hazard. In: Grellet, B., Mohammadioun, B., Hays, W. (Eds.), Proceedings of the Second France-United States Workshop on Earthquake Hazard Assessment in Intraplate Regions: Central and Eastern United States and Western Europe, October 16, 1995, Nice, France, 20-24, Ouest Editions, Nantes, France; Michetti, A.M., Hancock, P.L., 1997. Paleoseismology: understanding past earthquakes using quaternary geology. Journal of Geodynamics 24 (1-4), 3-10). We then illustrate how this relatively new framework is helpful in understanding the seismic behavior of faults capable of producing surface faulting and provides a comprehensive approach for the use of paleoseismicity data in earthquake hazard characterization.

  15. Elements of infrastructure and seismic hazard in the central United States

    USGS Publications Warehouse

    Wheeler, Russell L.; Rhea, B. Susan; Tarr, Arthur C.

    1994-01-01

    In the winter of 1811-12, three of the largest historic earthquakes in the United States occurred near New Madrid, Missouri. Seismicity continues to the present day throughout a tightly clustered pattern of epicenters centered on the bootheel of Missouri, including parts of northeastern Arkansas, northwestern Tennessee, western Kentucky, and southern Illinois. In 1990, the New Madrid seismic zone/central United States became the first seismically active region east of the Rocky Mountains to be designated a priority research area within the National Earthquake Hazards Reduction Program (NEHRP). This Professional Paper is a collection of papers, some published separately, presenting results of the newly intensified research program in this area. Major components of this research program include tectonic framework studies, seismicity and deformation monitoring and modeling, improved seismic hazard and risk assessments, and cooperative hazard mitigation studies.

  16. Flood hazard and flood risk assessment using a time series of satellite images: a case study in Namibia.

    PubMed

    Skakun, Sergii; Kussul, Nataliia; Shelestov, Andrii; Kussul, Olga

    2014-08-01

    In this article, the use of a time series of satellite imagery for flood hazard mapping and flood risk assessment is presented. Flooded areas are extracted from satellite images for the flood-prone territory, and a maximum flood extent image for each flood event is produced. These maps are further fused to determine the relative frequency of inundation (RFI). The study shows that RFI values and relative water depth exhibit the same probabilistic distribution, which is confirmed by a Kolmogorov-Smirnov test. The produced RFI map can be used as a flood hazard map, especially in cases when flood modeling is complicated by a lack of available data and high uncertainties. The derived RFI map is further used for flood risk assessment. The efficiency of the presented approach is demonstrated for the Katima Mulilo region (Namibia). A time series of Landsat-5/7 satellite images acquired from 1989 to 2012 is processed to derive the RFI map using the presented approach. The following direct damage categories are considered in the study for flood risk assessment: dwelling units, roads, health facilities, and schools. The produced flood risk map shows that the risk is distributed uniformly all over the region. The cities and villages with the highest risk are identified. The proposed approach has minimum data requirements, and RFI maps can be generated rapidly to assist rescuers and decision makers in case of emergencies. On the other hand, limitations include a strong dependence on the available data sets and restrictions on simulations with extrapolated water depth values. PMID:24372226
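
    The RFI computation is simple enough to sketch directly: binary maximum-flood-extent masks are stacked and averaged per pixel, and the resulting distribution can be compared with a relative water depth map using a two-sample Kolmogorov-Smirnov test. The arrays below are random placeholders standing in for classified Landsat-5/7 scenes, so the test will not reproduce the agreement reported in the study.

```python
# Relative frequency of inundation (RFI) from a stack of binary flood masks,
# plus a two-sample KS comparison against a (synthetic) relative depth map.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)

# Stack of maximum flood-extent masks, one per flood event (1 = flooded).
n_events, ny, nx = 12, 100, 100
flood_masks = (rng.random((n_events, ny, nx)) < 0.3).astype(np.uint8)

# Relative frequency of inundation: fraction of events flooding each pixel.
rfi = flood_masks.mean(axis=0)

# Placeholder relative water depth map (0-1), e.g. from a DEM-based proxy.
rel_depth = rng.beta(2.0, 5.0, size=(ny, nx))

# Test whether RFI and relative depth follow the same distribution.
stat, p_value = stats.ks_2samp(rfi.ravel(), rel_depth.ravel())
print(f"KS statistic = {stat:.3f}, p-value = {p_value:.3g}")

# Pixels flooded in more than half of the observed events: a simple hazard class.
high_hazard = rfi > 0.5
print("high-hazard pixel share:", round(float(high_hazard.mean()), 3))
```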

  17. Landslide Hazards

    NSDL National Science Digital Library

    This fact sheet provides an overview of the hazards presented by landslides and debris flows, particluarly in wet weather conditions in landslide-prone areas. Topics include a discussion of the characteristics of debris flows, and suggestions for residents who live in steep, hilly terrain which is subject to periodic heavy rains.

  18. Main natural hazards and vulnerability studies for some historical constructions and urban sectors of Valparaiso City (Chile)

    NASA Astrophysics Data System (ADS)

    Romanelli, F.

    2009-04-01

    The Project "MAR VASTO" ("Risk Management in Valparaíso/Manejo de Riesgos en Valparaíso, Servicios Técnicos", 2007) started in March 2007, with coordination of ENEA (Italian Agency for New Technologies, Energy and Environment), participation of several partners (Italy: University of Ferrara, Faculties of Architecture and Engineering; University of Padua, Faculty of Engineering; Abdus Salam International Centre for Theoretical Physics of Trieste; Chile: Valparaíso Technical University Federico Santa Maria, Civil Works Department; Santiago University of Chile, Division Structures Constructions Geotechnics), and support of local stakeholders. Being Valparaíso included since 2003 in the UNESCO Word Heritage List of protected sites, the project main goals are the following: to collect, analyze and elaborate the existing information, with a satisfying evaluation of main hazards; to develop a GIS digital archive, well organized, user-friendly and easy to be implemented in the future, providing maps and scenarios of specific and multiple risk; to provide a vulnerability analysis for three historical churches (San Francisco del Baron, Las Hermanitas de la Providencia, La Matríz, made by various materials - masonry, concrete, wood and adobe - and located in different city sites) and for a building stock in the Cerro Cordillera (partially inside the UNESCO area), analyzing more than 200 constructions; to suggest guidelines for future urban planning and strengthening interventions. In the framework of the MAR VASTO Project, the most important hazards have been investigated carried out. With regard to seismic hazard, "state-of-the-art" information has been provided by Chilean partners and stakeholders, using materials of several studies and stored in original earthquake reports, local newspapers and magazines. The activities performed by the Italian team regarded the definition, for the city of Valparaiso, of earthquake scenarios and maps based on the neo-deterministic approach. With regard to tsunami, the information has been provided by SHOA (Servicio Hidrografico y Oceanografico de la Armada de Chile). Tsunami scenarios and inundation maps (for both the 1906 and 1985 earthquakes) have been evaluated by the Italian team, taking into account also worse scenarios (namely the 1730 seismic event). Landslide hazard (identifying main flow areas and pointing out most affected zones, with a deeper investigation in the Cerro Cordillera, pilot area for the MAR VASTO Project) and fire hazard have been also evaluated. Finally, a GIS database has been developed, to store the hazard maps produced in the project. In addition, the GIS database has been verified by using the data obtained by a DGPS survey, performed during the in situ work in Valparaiso. In the framework of the MAR VASTO Project a building stock located in the Cerro Cordillera, partially inside the UNESCO area, has been investigated. The work done in the above said Cerro Cordillera sector by the Italian team can be considered a pilot experience which would be enlarged to all the Valparaiso City area in the framework of the town planning, actually in progress.

  19. Climate change induced heat wave hazard in eastern Africa: Dar Es Salaam (Tanzania) and Addis Ababa (Ethiopia) case study

    NASA Astrophysics Data System (ADS)

    Capuano, Paolo; Sellerino, Mariangela; Di Ruocco, Angela; Kombe, Wilbard; Yeshitela, Kumelachew

    2013-04-01

    In recent decades, new records have been set worldwide for tornadoes, drought, wind, floods, wildfires and hot temperatures, testifying to unusual weather and climate patterns with increasing frequency and intensity of extreme weather events. Extreme heat events are natural hazards affecting many regions of the world; nevertheless, limited work has been done on the analysis and effects of extreme heat events in Africa, a continent considered particularly vulnerable to the effects of climate change. In fact, the temperature increase expected for the African continent during the 21st century is larger than the global mean warming: about 3° to 4° C, roughly 1.5 times the global temperature increase (Christensen et al., 2007; Gualdi et al., 2012), with the subtropical regions projected to warm more than the tropical regions. Observations and downscaled model simulations (RCP4.5 and RCP8.5 IPCC scenarios) are analyzed to describe heat wave characteristics in Dar es Salaam (Tanzania) and Addis Ababa (Ethiopia), spanning the last five decades as well as the period projected for the 21st century. The observed data are daily maximum and minimum temperatures collected in the period 1961-2011; the downscaled model simulations extend to 2050. Heat waves are defined following a peak-over-threshold approach, by statistical comparison to site-dependent historical meteorological baselines using a fixed absolute threshold. Projected future warming in Dar es Salaam and Addis Ababa shows a further increase in the heat wave parameters. Heat wave duration and the number of hot days are strongly correlated, showing that the temperature rise could generate not only an increase in the number of heat waves but, above all, a longer average duration, which can strongly affect the resilience of the population, particularly the elderly. Indeed, the impacts of heat waves on society are determined by their temporal duration as well as their frequency (Stephenson, 2008), since the capacity for adaptation can be reduced by prolonged exposure to high temperature and humidity. The expected persistence of long-lived heat waves, lasting approximately 1.5-2 weeks, is clearly longer than in the climatological period (1961-1990). Over 100 years, short-lived but more intense heat waves more than double in duration. The need for national health services to develop strategies to mitigate heat wave effects and to enhance the resilience of the population, particularly the elderly, is evident.
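
    Purely as an illustration of the peak-over-threshold detection described in the abstract, the sketch below flags heat waves as uninterrupted runs of days whose daily maximum temperature exceeds a fixed absolute threshold. The 35 °C threshold, the three-day minimum duration, and the data layout are assumptions for the example, not the values used in the study.

      # Minimal sketch: detect heat waves as runs of consecutive days whose daily
      # maximum temperature exceeds a fixed absolute threshold (peak over threshold).
      # Threshold (35 C) and minimum duration (3 days) are illustrative assumptions.
      import pandas as pd

      def detect_heat_waves(tmax: pd.Series, threshold: float = 35.0, min_days: int = 3) -> pd.DataFrame:
          """Return one row per detected heat wave (start, end, duration, peak).

          tmax: daily maximum temperature indexed by a gap-free DatetimeIndex.
          """
          hot = tmax > threshold
          # Label consecutive runs: the counter increments whenever the hot/not-hot
          # state changes, so every uninterrupted hot spell shares a single label.
          run_id = (hot != hot.shift()).cumsum()
          events = []
          for _, spell in tmax[hot].groupby(run_id[hot]):
              if len(spell) >= min_days:
                  events.append({
                      "start": spell.index.min(),
                      "end": spell.index.max(),
                      "duration_days": len(spell),
                      "peak_tmax": spell.max(),
                  })
          return pd.DataFrame(events)

      # Usage with a daily series covering, e.g., 1961-2011:
      # waves = detect_heat_waves(daily_tmax)
      # waves["duration_days"].mean()  # average heat wave duration

    From the resulting table, the number of events per year and the average duration can be compared between the baseline period and the projected decades, which is the kind of comparison the abstract describes.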

  20. Mitigation Action Plan

    SciTech Connect

    Not Available

    1994-02-01

    This Mitigation Action Plan (MAP) focuses on the mitigation commitments stated in the Supplemental Environmental Impact Statement (SEIS) and the Record of Decision (ROD) for Naval Petroleum Reserve No. 1 (NPR-1). Specific commitments and mitigation implementation actions are listed in Appendix A (Mitigation Actions) and form the central focus of this MAP. They will be updated as needed to allow for organizational, regulatory, or policy changes. It is the intent of DOE to comply with all applicable federal, state, and local environmental, safety, and health laws and regulations. Eighty-six specific commitments that pertain to continued operation of NPR-1 with petroleum production at the Maximum Efficient Rate (MER) were identified in the SEIS and associated ROD. The proposed mitigation measures are expected to reduce impacts as much as feasible; however, as experience is gained in actual implementation of these measures, some changes may be warranted.

  1. Applications of a Forward-Looking Interferometer for the On-board Detection of Aviation Weather Hazards

    NASA Technical Reports Server (NTRS)

    West, Leanne; Gimmestad, Gary; Smith, William; Kireev, Stanislav; Cornman, Larry B.; Schaffner, Philip R.; Tsoucalas, George

    2008-01-01

    The Forward-Looking Interferometer (FLI) is a new instrument concept for obtaining measurements of potential weather hazards to alert flight crews. The FLI concept is based on high-resolution Infrared (IR) Fourier Transform Spectrometry (FTS) technologies that have been developed for satellite remote sensing and have also been applied to the detection of aerosols and gases for other purposes. It is being evaluated for multiple hazards including clear air turbulence (CAT), volcanic ash, wake vortices, low slant range visibility, dry wind shear, and icing, during all phases of flight. Previous sensitivity and characterization studies addressed the phenomenology that supports detection and mitigation by the FLI. Techniques for determining the range, and hence warning time, were demonstrated for several of the hazards, and a table of research instrument parameters was developed for investigating all of the hazards discussed above. This work supports the feasibility of detecting multiple hazards with an FLI multi-hazard airborne sensor and of producing enhanced IR images in reduced visibility conditions; however, further research must be performed to develop a means to estimate the intensities of the hazards posed to an aircraft and to develop robust algorithms to relate sensor measurables to hazard levels. In addition, validation tests need to be performed with a prototype system.

  2. Household Hazardous Waste

    MedlinePLUS

    ... Links: The National Library of Medicine's Household Products Database; Household Hazardous Products; Environmental Hazards Management Institute; Minnesota Pollution Control Agency; North America Hazardous ...

  3. High-dimensional additive hazards regression for oral squamous cell carcinoma using microarray data: a comparative study.

    PubMed

    Hamidi, Omid; Tapak, Lily; Jafarzadeh Kohneloo, Aarefeh; Sadeghifar, Majid

    2014-01-01

    Microarray technology yields high-dimensional, low-sample-size data sets. Fitting sparse models is therefore essential, because only a small number of influential genes can reliably be identified. A number of variable selection approaches have been proposed for high-dimensional time-to-event data based on Cox proportional hazards regression in the presence of censoring. The present study applied three sparse variable selection techniques - the Lasso, the smoothly clipped absolute deviation, and the smooth integration of counting and absolute deviation - to gene expression survival data using the additive risk model, which is adopted when the absolute effects of multiple predictors on the hazard function are of interest. The performance of these techniques was evaluated using time-dependent ROC curves and bootstrap .632+ prediction error curves. The genes selected by all methods were highly significant (P < 0.001). The Lasso showed the highest median area under the ROC curve over time (0.95), and the smoothly clipped absolute deviation showed the lowest prediction error (0.105). The genes selected by all methods improved the prediction of a purely clinical model, indicating the valuable information contained in the microarray features. It was therefore concluded that the applied approaches can satisfactorily predict survival based on selected gene expression measurements. PMID:24982876

  4. High-Dimensional Additive Hazards Regression for Oral Squamous Cell Carcinoma Using Microarray Data: A Comparative Study

    PubMed Central

    Hamidi, Omid; Tapak, Lily; Jafarzadeh Kohneloo, Aarefeh; Sadeghifar, Majid

    2014-01-01

    Microarray technology yields high-dimensional, low-sample-size data sets. Fitting sparse models is therefore essential, because only a small number of influential genes can reliably be identified. A number of variable selection approaches have been proposed for high-dimensional time-to-event data based on Cox proportional hazards regression in the presence of censoring. The present study applied three sparse variable selection techniques - the Lasso, the smoothly clipped absolute deviation, and the smooth integration of counting and absolute deviation - to gene expression survival data using the additive risk model, which is adopted when the absolute effects of multiple predictors on the hazard function are of interest. The performance of these techniques was evaluated using time-dependent ROC curves and bootstrap .632+ prediction error curves. The genes selected by all methods were highly significant (P < 0.001). The Lasso showed the highest median area under the ROC curve over time (0.95), and the smoothly clipped absolute deviation showed the lowest prediction error (0.105). The genes selected by all methods improved the prediction of a purely clinical model, indicating the valuable information contained in the microarray features. It was therefore concluded that the applied approaches can satisfactorily predict survival based on selected gene expression measurements. PMID:24982876
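
    Purely as an illustration of the sparse variable selection idea in the two records above, the sketch below fits an L1-penalized (Lasso-type) survival model with the lifelines library. It uses a penalized Cox proportional hazards model as a stand-in for the additive risk model actually fitted in the study, and the file name, column names, and penalty value are assumptions for the example.

      # Minimal sketch: Lasso-type gene selection on high-dimensional survival data.
      # A penalized Cox model stands in for the study's additive risk model;
      # file name, column names, and penalty value are illustrative assumptions.
      import pandas as pd
      from lifelines import CoxPHFitter

      # Assumed layout: gene-expression columns plus 'time' (follow-up) and
      # 'event' (1 = death observed, 0 = censored).
      df = pd.read_csv("expression_with_survival.csv")

      # l1_ratio=1.0 turns the elastic-net penalty into a pure Lasso penalty,
      # shrinking the coefficients of uninformative genes toward zero.
      cph = CoxPHFitter(penalizer=0.1, l1_ratio=1.0)
      cph.fit(df, duration_col="time", event_col="event")

      # Genes whose coefficients are not shrunk to (near) zero are the selected features.
      coefs = cph.params_
      selected = coefs[coefs.abs() > 1e-6]
      print(selected.abs().sort_values(ascending=False).head(20))

    In practice the penalty strength would be chosen by cross-validation, and the selected genes would then be assessed with time-dependent ROC and prediction error curves as described in the abstract.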

  5. Urban plant physiology: adaptation-mitigation strategies under permanent stress.

    PubMed

    Calfapietra, Carlo; Peñuelas, Josep; Niinemets, Ülo

    2015-02-01

    Urban environments that are stressful for plant function and growth will become increasingly widespread in the future. In this opinion article, we define the concept of 'urban plant physiology', which focuses on plant responses and long-term adaptations to urban conditions and on the capacity of urban vegetation to mitigate environmental hazards in urbanized settings, such as air and soil pollution. Use of appropriate control treatments would allow studies in urban environments to be comparable to expensive manipulative experiments. We propose to couple two approaches, based either on environmental gradients or on manipulated gradients, to develop the concept of urban plant physiology for assessing how single or multiple environmental factors affect the key environmental services provided by urban forests. PMID:25476199

  6. Non Governmental Organizations (NGOs) in disaster mitigation and preparedness education programs

    Microsoft Academic Search

    E. Vye

    2009-01-01

    Integrating disaster preparedness programs and working locally with existing development projects (pre- and post-disaster) can reduce vulnerabilities to natural hazards and, consequently, the dependency on external assistance. Non Governmental Organizations (NGOs) play a strong communicative role in the field of education for disaster mitigation and preparedness (DMP). Strategies and initiatives pertaining to earthquake