These are representative sample records related to your search topic.

Unacceptable Risk: Earthquake Hazard Mitigation in One California School District. Hazard Mitigation Case Study.  

ERIC Educational Resources Information Center

Earthquakes are a perpetual threat to California's school buildings. School administrators must be aware that hazard mitigation means much more than simply having a supply of water bottles in the school; it means getting everyone involved in efforts to prevent tragedies from occurring in school buildings in the event of an earthquake. The PTA in…

California State Office of Emergency Services, Sacramento.


Study proposes wholesale change in thinking about natural hazards mitigation  

NASA Astrophysics Data System (ADS)

The “lollapaloozas,” the major natural catastrophes, are getting bigger and bigger, and it is time to confront this growing problem by dramatically changing the way that society approaches natural hazard mitigation, conducts itself in relation to the natural environment, and accepts responsibility for activities that could lead to or increase disasters, according to Dennis Mileti, principal investigator of a new study on natural hazards and director of the Natural Hazards Research and Applications Information Center at the University of Colorado at Boulder. Since 1989, the United States has been struck by seven of the nation's 10 most costly natural disasters, including the 1994 Northridge earthquake in California, which caused $25 billion in damage. Also since 1989, the financial cost of natural hazards in the United States—which includes floods, earthquakes, hurricanes, and wildfires, as well as landslides, heat, and fog—has frequently averaged $1 billion per week, a price that some experts say will continue rising. Internationally, the Kobe, Japan, earthquake cost more than $100 billion and is the most financially costly disaster in world history. None of these figures include indirect losses related to natural disasters, such as lost economic productivity.

Showstack, Randy


Mitigating lightning hazards  

SciTech Connect

A new draft document provides guidance for assessing and mitigating the effects of lightning hazards on a Department of Energy (or any other) facility. Written by two Lawrence Livermore engineers, the document combines lightning hazard identification and facility categorization with a new concept, the Lightning Safety System, to help dispel the confusion and mystery surrounding lightning and its effects. The guidance is of particular interest to DOE facilities storing and handling nuclear and high-explosive materials. The concepts presented in the document were used to evaluate the lightning protection systems of the Device Assembly Facility at the Nevada Test Site.

Hasbrouck, R.



Numerical study on tsunami hazard mitigation using a submerged breakwater.  


Most coastal structures have been built in surf zones to protect coastal areas. In general, the transformation of waves in the surf zone is quite complicated and numerous hazards to coastal communities may be associated with such phenomena. Therefore, the behavior of waves in the surf zone should be carefully analyzed and predicted. Furthermore, an accurate analysis of deformed waves around coastal structures is directly related to the construction of economically sound and safe coastal structures because wave height plays an important role in determining the weight and shape of a levee body or armoring material. In this study, a numerical model using a large eddy simulation is employed to predict the runup heights of nonlinear waves that passed a submerged structure in the surf zone. Reduced runup heights are also predicted, and their characteristics in terms of wave reflection, transmission, and dissipation coefficients are investigated. PMID:25215334

Ha, Taemin; Yoo, Jeseon; Han, Sejong; Cho, Yong-Sik
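The wave reflection, transmission, and dissipation coefficients investigated in the abstract above are linked by a standard energy balance in linear wave theory (Kr² + Kt² + Kd² = 1). A minimal sketch follows; the coefficient values are invented for illustration and are not taken from the paper:

```python
import math

def dissipation_coefficient(kr, kt):
    """Energy-dissipation coefficient Kd for a submerged breakwater,
    assuming incident wave energy splits into reflected, transmitted,
    and dissipated parts: Kr^2 + Kt^2 + Kd^2 = 1 (linear wave theory)."""
    return math.sqrt(max(0.0, 1.0 - kr**2 - kt**2))

# Hypothetical reflection and transmission coefficients:
kd = dissipation_coefficient(0.3, 0.7)
print(round(kd, 3))
```

A larger Kd at fixed Kr indicates the structure is converting more incident wave energy into turbulence rather than transmitting it shoreward, which is the mechanism behind the reduced runup heights the study reports.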



Washington Tsunami Hazard Mitigation Program  

NASA Astrophysics Data System (ADS)

Washington State has participated in the National Tsunami Hazard Mitigation Program (NTHMP) since its inception in 1995. We have participated in the tsunami inundation hazard mapping, evacuation planning, education, and outreach efforts that generally characterize the NTHMP efforts. We have also investigated hazards of significant interest to the Pacific Northwest. The hazard from locally generated earthquakes on the Cascadia subduction zone, which threatens tsunami inundation in less than an hour following a magnitude 9 earthquake, creates special problems for low-lying accretionary shoreforms in Washington, such as the spits of Long Beach and Ocean Shores, where high ground is not accessible within the limited time available for evacuation. To ameliorate this problem, we convened a panel of the Applied Technology Council to develop guidelines for construction of facilities for vertical evacuation from tsunamis, published as FEMA 646, now incorporated in the International Building Code as Appendix M. We followed this with a program called Project Safe Haven to site such facilities along the Washington coast in appropriate locations and appropriate designs to blend with the local communities, as chosen by the citizens. This has now been completed for the entire outer coast of Washington. In conjunction with this effort, we have evaluated the potential for earthquake-induced ground failures in and near tsunami hazard zones to help develop cost estimates for these structures and to establish appropriate tsunami evacuation routes and evacuation assembly areas that are likely to be available after a major subduction zone earthquake. We intend to continue these geotechnical evaluations for all tsunami hazard zones in Washington.

Walsh, T. J.; Schelling, J.



Contributions of Nimbus 7 TOMS Data to Volcanic Study and Hazard Mitigation  

NASA Technical Reports Server (NTRS)

Nimbus TOMS data have led to advancements among many volcano-related scientific disciplines, from the initial ability to quantify SO2 clouds leading to derivations of eruptive S budgets and fluxes, to tracking of individual clouds, assessing global volcanism and atmospheric impacts. Some of the major aspects of TOMS-related research, listed below, will be reviewed and updated: (1) Measurement of volcanic SO2 clouds: Nimbus TOMS observed over 100 individual SO2 clouds during its mission lifetime; large explosive eruptions are now routinely and reliably measured by satellite. (2) Eruption processes: quantification of SO2 emissions has allowed assessments of eruption sulfur budgets, the evaluation of "excess" sulfur, and inferences of H2S emissions. (3) Detection of ash: TOMS data are now used to detect volcanic particulates in the atmosphere, providing complementary analyses to infrared methods of detection. Paired TOMS and AVHRR studies have provided invaluable information on volcanic cloud compositions and processes. (4) Cloud tracking and hazard mitigation: volcanic clouds can be considered gigantic tracers in the atmosphere, and studies of the fates of these clouds have led to new knowledge of their physical and chemical dispersion in the atmosphere for predictive models. (5) Global trends: the long-term data set has provided researchers with an unparalleled record of explosive volcanism, and forms a key component in assessing annual to decadal trends in global S emissions. (6) Atmospheric impacts: TOMS data have been linked to independent records of atmospheric change, in order to compare cause and effect processes following a massive injection of SO2 into the atmosphere. (7) Future TOMS instruments and applications: Nimbus TOMS has given way to new satellite platforms, with several wavelength and resolution modifications. New efforts to launch a geostationary TOMS could provide unprecedented observations of volcanic activity.

Krueger, Arlin J.; Bluth, G. J. S.; Schaefer, S. A.



76 FR 61070 - Disaster Assistance; Hazard Mitigation Grant Program  

Federal Register 2010, 2011, 2012, 2013, 2014

...programs for wildfire hazard mitigation and erosion hazard mitigation in the list of eligible...Department of Agriculture. Wildfire and Erosion Under the NPRM, vegetation management related to wildfire and erosion hazard mitigation measures would be...



Predictability and extended-range prognosis in natural hazard risk mitigation process: A case study over west Greece  

NASA Astrophysics Data System (ADS)

Natural hazards pose an increasing threat to society, and new techniques and methodologies must be developed to enhance the risk mitigation process. It is commonly accepted that disaster risk reduction is a vital key to future successful economic and social development. Systematic improvements in the accuracy of extended-range prognosis products, relating to monthly and seasonal predictability, have introduced them as an essential new link in the risk mitigation procedure. Aiming at decreasing the risk, this paper presents a seasonal and monthly forecasting process that was tested over west Greece from September to December 2013. During that season significant severe weather events occurred, causing significant impacts on local society (severe storms/rainfalls, hail, flash floods, etc.). Seasonal and monthly forecasting products from the European Centre for Medium-Range Weather Forecasts (ECMWF) depicted, with probabilities stratified by terciles, areas of Greece where significant weather might occur. As atmospheric natural hazard early warning systems are able to deliver warnings up to 72 hours in advance, this study illustrates that extended-range prognosis could be introduced as a new technique in risk mitigation. Seasonal and monthly forecast products could highlight extended areas where severe weather events may occur at one month's lead time. In addition, a risk mitigation procedure that adopts extended-range prognosis products is also presented, providing useful time for the preparedness process at the regional administration level.

Matsangouras, Ioannis T.; Nastos, Panagiotis T.



An economic and geographic appraisal of a spatial natural hazard risk: a study of landslide mitigation rules  

USGS Publications Warehouse

Efficient mitigation of natural hazards requires a spatial representation of the risk, based upon the geographic distribution of physical parameters and man-related development activities. Through such a representation, the spatial probability of landslides based upon physical science concepts is estimated for Cincinnati, Ohio. Mitigation programs designed to reduce loss from landslide natural hazards are then evaluated. An optimum mitigation rule is suggested that is spatially selective and is determined by objective measurements of hillside slope and properties of the underlying soil. -Authors

Bernknopf, R.L.; Brookshire, D.S.; Campbell, R.H.; Shapiro, C.D.



Playing against nature: improving earthquake hazard mitigation  

NASA Astrophysics Data System (ADS)

The great 2011 Tohoku earthquake dramatically demonstrated the need to improve earthquake and tsunami hazard assessment and mitigation policies. The earthquake was much larger than predicted by hazard models, and the resulting tsunami overtopped coastal defenses, causing more than 15,000 deaths and $210 billion in damage. Hence if and how such defenses should be rebuilt is a challenging question, because the defenses fared poorly and building ones to withstand tsunamis as large as March's would be too expensive. A similar issue arises along the Nankai Trough to the south, where new estimates warning of tsunamis 2-5 times higher than in previous models raise the question of what to do, given that the timescale on which such events may occur is unknown. Thus in the words of economist H. Hori, "What should we do in face of uncertainty? Some say we should spend our resources on present problems instead of wasting them on things whose results are uncertain. Others say we should prepare for future unknown disasters precisely because they are uncertain". Thus society needs strategies to mitigate earthquake and tsunami hazards that make economic and societal sense, given that our ability to assess these hazards is poor, as illustrated by highly destructive earthquakes that often occur in areas predicted by hazard maps to be relatively safe. Conceptually, we are playing a game against nature "of which we still don't know all the rules" (Lomnitz, 1989). Nature chooses tsunami heights or ground shaking, and society selects the strategy to minimize the total costs of damage plus mitigation costs. As in any game of chance, we maximize our expectation value by selecting the best strategy, given our limited ability to estimate the occurrence and effects of future events. We thus outline a framework to find the optimal level of mitigation by balancing its cost against the expected damages, recognizing the uncertainties in the hazard estimates. 
This framework illustrates the role of the uncertainties and the need to candidly assess them. It can be applied to exploring policies under various hazard scenarios and to mitigating other natural hazards. The variation in total cost, the sum of expected loss and mitigation cost, can be examined as a function of mitigation level; the optimal level of mitigation, n*, minimizes the total cost. The expected loss depends on the hazard model, so the better the hazard model, the better the mitigation policy (Stein and Stein, 2012).

Stein, S. A.; Stein, J.
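The cost-balancing framework Stein and Stein describe can be sketched numerically: pick the mitigation level n* that minimizes total cost, the sum of mitigation cost and expected loss. The cost and loss curves below are invented for illustration and are not taken from the paper:

```python
import numpy as np

# Hypothetical curves: mitigation cost C(n) rises with mitigation level n
# (e.g., seawall height), while expected loss L(n) falls as defenses improve.
def mitigation_cost(n):
    return 2.0 * n

def expected_loss(n):
    return 100.0 * np.exp(-0.5 * n)

levels = np.linspace(0.0, 10.0, 1001)
total = mitigation_cost(levels) + expected_loss(levels)   # total cost curve
n_star = levels[np.argmin(total)]                          # optimal level n*
print(round(float(n_star), 2))
```

Because the expected-loss curve comes from the hazard model, any error in that model shifts n*, which is the paper's point that a better hazard model yields a better mitigation policy.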



Mitigation of Hazardous Comets and Asteroids  

NASA Astrophysics Data System (ADS)

Preface; 1. Recent progress in interpreting the nature of the near-Earth object population W. Bottke, A. Morbidelli and R. Jedicke; 2. Earth impactors: orbital characteristics and warning times S. R. Chesley and T. B. Spahr; 3. The role of radar in predicting and preventing asteroid and comet collisions with Earth S. J. Ostro and J. D. Giorgini; 4. Interior structures for asteroids and cometary nuclei E. Asphaug; 5. What we know and don't know about surfaces of potentially hazardous small bodies C. R. Chapman; 6. About deflecting asteroids and comets K. A. Holsapple; 7. Scientific requirements for understanding the near-Earth asteroid population A. W. Harris; 8. Physical properties of comets and asteroids inferred from fireball observations M. D. Martino and A. Cellino; 9. Mitigation technologies and their requirements C. Gritzner and R. Kahle; 10. Peering inside near-Earth objects with radio tomography W. Kofman and A. Safaeinili; 11. Seismological investigation of asteroid and comet interiors J. D. Walker and W. F. Huebner; 12. Lander and penetrator science for near-Earth object mitigation studies A. J. Ball, P. Lognonne, K. Seiferlin, M. Patzold and T. Spohn; 13. Optimal interception and deflection of Earth-approaching asteroids using low-thrust electric propulsion B. A. Conway; 14. Close proximity operations at small bodies: orbiting, hovering, and hopping D. J. Scheeres; 15. Mission operations in low gravity regolith and dust D. Sears, M. Franzen, S. Moore, S. Nichols, M. Kareev and P. Benoit; 16. Impacts and the public: communicating the nature of the impact hazard D. Morrison, C. R. Chapman, D. Steel and R. P. Binzel; 17. Towards a program to remove the threat of hazardous NEOs M. J. S. Belton.

Belton, Michael J. S.; Morgan, Thomas H.; Samarasinha, Nalin H.; Yeomans, Donald K.




Collective action for community-based hazard mitigation: a case study of Tulsa project impact  

E-print Network

disaster losses. Local community members have increasingly been required to share information and responsibilities for reducing community vulnerabilities to natural and technological hazards and building a safer community. Consequently they are encouraged...

Lee, Hee Min



WHC natural phenomena hazards mitigation implementation plan  

SciTech Connect

Natural phenomena hazards (NPH) are unexpected acts of nature which pose a threat or danger to workers, the public, or the environment. Earthquakes, extreme winds (hurricane and tornado), snow, flooding, volcanic ashfall, and lightning strikes are examples of NPH at Hanford. It is the policy of the U.S. Department of Energy (DOE) to design, construct, and operate DOE facilities so that workers, the public, and the environment are protected from NPH and other hazards. During 1993 the DOE Richland Operations Office (RL) transmitted DOE Order 5480.28, ``Natural Phenomena Hazards Mitigation,`` to Westinghouse Hanford Company (WHC) for compliance. The Order includes rigorous new NPH criteria for the design of new DOE facilities as well as for the evaluation and upgrade of existing DOE facilities. In 1995 DOE issued Order 420.1, ``Facility Safety,`` which contains the same NPH requirements and invokes the same applicable standards as Order 5480.28. It will supersede Order 5480.28 when an in-force date for Order 420.1 is established through contract revision. Activities will be planned and accomplished in four phases: Mobilization, Prioritization, Evaluation, and Upgrade. The basis for the graded approach is the designation of facilities/structures into one of five performance categories based upon safety function, mission, and cost. This Implementation Plan develops the program for the Prioritization Phase, as well as an overall strategy for the implementation of DOE Order 5480.28.

Conrads, T.J.




EPA Science Inventory

This program has been conducted to evaluate commercially available water base foams for mitigating the vapors from hazardous chemical spills. Foam systems were evaluated in the laboratory to define those foam properties which are important in mitigating hazardous vapors. Larger s...


Benefit-Cost Analysis of FEMA Hazard Mitigation Grants  

Microsoft Academic Search

Mitigation decreases the losses from natural hazards by reducing our vulnerability or by reducing the frequency and magnitude of causal factors. Reducing these losses brings many benefits, but every mitigation activity has a cost that must be considered in our world of limited resources. In principle, benefit-cost analysis (BCA) attempts to assess a mitigation activity's expected net benefits discounted future…

Adam Z. Rose; Keith Porter; Nicole Dash; Jawhar Bouabid; Charles Huyck; John Whitehead; Douglass Shaw; Ronald Eguchi; Craig Taylor; Thomas McLane; L. Thomas Tobin; Philip T. Ganderton; Anne S. Kiremidjian; Kathleen Tierney; Carol Taylor West



Potentially Hazardous Objects (PHO) Mitigation Program  

NASA Astrophysics Data System (ADS)

Southwest Research Institute (SwRI) and its partner, Los Alamos National Laboratory (LANL), are prepared to develop, implement, and expand procedures to avert collisions of potentially hazardous objects (PHOs) with Earth as recommended by NASA in its White Paper "Near-Earth Object Survey and Deflection Analysis of Alternatives" requested by the US Congress and submitted to it in March 2007. In addition to developing the general mitigation program as outlined in the NASA White Paper, the program will be expanded to include aggressive mitigation procedures for small (e.g., Tunguska-sized) PHOs and other short warning-time PHOs such as some long-period comet nuclei. As a first step the program will concentrate on the most likely and critical cases, namely small objects and long-period comet nuclei with short warning times, but without losing sight of objects with longer warning times. Objects smaller than a few hundred meters are of interest because they are about 1000 times more abundant than kilometer-sized objects and are fainter and more difficult to detect, which may lead to short warning times and hence short reaction times. Yet even these small PHOs can have devastating effects, as the 30 June 1908 Tunguska event has shown. In addition, long-period comets, although relatively rare, can be large (sometimes tens of kilometers in size) and cannot be predicted far in advance because of their long orbital periods. Comet C/1983 H1 (IRAS-Araki-Alcock), for example, has an orbital period of 963.22 years, was discovered 27 April 1983, and passed Earth only two weeks later, on 11 May 1983, at a distance of 0.0312 AU. Aggressive methods and continuous alertness will be needed to defend against objects with such short warning times. While intact deflection of a PHO remains a key objective, destruction of a PHO and dispersion of the pieces must also be considered. 
The effectiveness of several alternative methods including nuclear demolition munitions, conventional explosives, and hyper-velocity impacts will be investigated and compared. This comparison is important for technical as well as political reasons, both domestic and international. The long-range plan includes evaluation of technical readiness including launch capabilities, tests for effectiveness using materials simulating PHOs, and building and testing several modular systems appropriate for alternative applications depending on the type of PHO.

Huebner, Walter


Benefit-Cost Analysis of FEMA Hazard Mitigation Grants  

Microsoft Academic Search

Mitigation ameliorates the impact of natural hazards on communities by reducing loss of life and injury, property and environmental damage, and social and economic disruption. The potential to reduce these losses brings many benefits, but every mitigation activity has a cost that must be considered in our world of limited resources. In principle benefit-cost analysis (BCA) can be used to

Adam Rose; Keith Porter; Nicole Dash; Jawhar Bouabid; Charles Huyck; John C. Whitehead; Douglass Shaw; Ronald T. Eguchi; Craig Taylor; Thomas R. McLane; L. Thomas Tobin; Philip T. Ganderton; David Godschalk; Anne S. Kiremidjian; Kathleen Tierney; Carol Taylor West
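In its simplest form, the BCA logic described in these two FEMA grant abstracts reduces to comparing the discounted stream of future avoided losses against a mitigation activity's up-front cost. A minimal sketch follows; all dollar figures, horizons, and rates are hypothetical, not drawn from the study:

```python
# Net present benefit of a mitigation activity: discounted avoided losses
# over the project horizon, minus the up-front cost. Positive means the
# activity passes the benefit-cost test in this simplified model.
def net_present_benefit(annual_avoided_loss, years, discount_rate, upfront_cost):
    pv = sum(annual_avoided_loss / (1.0 + discount_rate) ** t
             for t in range(1, years + 1))
    return pv - upfront_cost

# e.g., $120k/yr of avoided losses over 30 years at a 5% discount rate,
# for a $1.5M retrofit (all numbers invented):
npb = net_present_benefit(120_000, 30, 0.05, 1_500_000)
print(npb > 0)
```

Note how sensitive the result is to the discount rate and horizon: the same retrofit can pass or fail the test depending on those assumptions, which is why BCA of hazard mitigation grants is contested in practice.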



Destructive Interactions Between Mitigation Strategies and the Causes of Unexpected Failures in Natural Hazard Mitigation Systems  

NASA Astrophysics Data System (ADS)

Large investments in the mitigation of natural hazards, using a variety of technology-based mitigation strategies, have proven to be surprisingly ineffective in some recent natural disasters. These failures reveal a need for a systematic classification of mitigation strategies; an understanding of the scientific uncertainties that affect the effectiveness of such strategies; and an understanding of how the different types of strategy within an overall mitigation system interact destructively to reduce the effectiveness of the overall mitigation system. We classify mitigation strategies into permanent, responsive, and anticipatory. Permanent mitigation strategies, such as flood and tsunami defenses or land use restrictions, are both costly and 'brittle': when they malfunction they can increase mortality. Such strategies critically depend on the accuracy of the estimates of expected hazard intensity in the hazard assessments that underpin their design. Responsive mitigation strategies such as tsunami and lahar warning systems rely on capacities to detect and quantify the hazard source events and to transmit warnings fast enough to enable at-risk populations to decide and act effectively. Self-warning and voluntary evacuation is also usually a responsive mitigation strategy. Uncertainty in the nature and magnitude of the detected hazard source event is often the key scientific obstacle to responsive mitigation; public understanding of both the hazard and the warnings, to enable decision making, can also be a critical obstacle. Anticipatory mitigation strategies use interpretation of precursors to hazard source events and are used widely in mitigation of volcanic hazards. Their critical limitations are due to uncertainties in time, space, and magnitude relationships between precursors and hazard events. 
Examples of destructive interaction between different mitigation strategies are provided by the Tohoku 2011 earthquake and tsunami; recent earthquakes that have impacted population centers with poor enforcement of building codes, unrealistic expectations of warning systems, or failures to understand local seismic damage mechanisms; and the interaction of land use restriction strategies and responsive warning strategies around lahar-prone volcanoes. A more complete understanding of the interactions between these different types of mitigation strategy, especially the consequences for the expectations and behaviors of the populations at risk, requires models of decision-making under high levels of both uncertainty and danger. The Observation-Orientation-Decision-Action (OODA) loop model (Boyd, 1987) may be a particularly useful model. It emphasizes the importance of 'orientation' (the interpretation of observations and assessment of their significance for the observer and decision-maker), the feedback between decisions and subsequent observations and orientations, and the importance of developing mitigation strategies that are flexible and so able to respond to the occurrence of the unexpected. REFERENCE: Boyd, J.R., A Discourse on Winning and Losing.

Day, S. J.; Fearnley, C. J.



Space options for tropical cyclone hazard mitigation  

NASA Astrophysics Data System (ADS)

This paper investigates potential space options for mitigating the impact of tropical cyclones on cities and civilians. Ground-based techniques combined with space-based remote sensing instrumentation are presented together with space-borne concepts employing space solar power technology. Two space-borne mitigation options are considered: atmospheric warming based on microwave irradiation and laser-induced cloud seeding based on laser power transfer. Finally technology roadmaps dedicated to the space-borne options are presented, including a detailed discussion on the technological viability and technology readiness level of our proposed systems. Based on these assessments, the space-borne cyclone mitigation options presented in this paper may be established in a quarter of a century.

Dicaire, Isabelle; Nakamura, Ryoko; Arikawa, Yoshihisa; Okada, Kazuyuki; Itahashi, Takamasa; Summerer, Leopold



Potentially Hazardous Objects (PHO) Mitigation Program  

Microsoft Academic Search

Southwest Research Institute (SwRI) and its partner, Los Alamos National Laboratory (LANL), are prepared to develop, implement, and expand procedures to avert collisions of potentially hazardous objects (PHOs) with Earth as recommended by NASA in its White Paper \\

Walter Huebner



Mitigation of the most hazardous tank at the Hanford Site  

SciTech Connect

Various tanks at the Hanford Site have been declared to be unresolved safety problems. This means that the tank has the potential to be beyond the limits covered by the current safety documentation. Tank 241-SY-101 poses the greatest hazard. The waste stored in this tank has periodically released hydrogen gas at concentrations exceeding the lower flammability limit. A mixer pump was installed in this tank to stir the waste. Stirring the waste would allow the hydrogen to be released slowly in a controlled manner and mitigate the hazard associated with this tank. The testing of this mixer pump is reported in this document. The mixer pump has been successful in controlling the hydrogen concentration in the tank dome to below the flammability limit, which has mitigated the hazardous gas releases.

Reynolds, D.A.



77 FR 24505 - Hazard Mitigation Assistance for Wind Retrofit Projects for Existing Residential Buildings  

Federal Register 2010, 2011, 2012, 2013, 2014

...FEMA-2012-0007] Hazard Mitigation Assistance for Wind Retrofit Projects for Existing Residential...comments on Hazard Mitigation Assistance for Wind Retrofit Projects for Existing Residential...such activity is the implementation of wind retrofit projects to protect existing...



Seismic hazard assessment and mitigation in India: an overview  

NASA Astrophysics Data System (ADS)

The Indian subcontinent is characterized by various tectonic units, viz., the Himalayan collision zone in the north, the Indo-Burmese arc in the north-east, failed rift zones in the interior Peninsular Indian shield, and the Andaman-Sumatra trench in the south-east Indian Territory. During the last about 100 years, the country has witnessed four great and several major earthquakes. Soon after the occurrence of the first great earthquake, the Shillong earthquake (Mw 8.1) in 1897, efforts were started to assess the seismic hazard in the country. The first such attempt was made by the Geological Survey of India in 1898, and since then considerable progress has been made. The current seismic zonation map prepared and published by the Bureau of Indian Standards broadly places seismic risk in different parts of the country in four major zones. However, this map is not sufficient for the assessment of area-specific seismic risks, necessitating detailed seismic zoning, that is, microzonation for earthquake disaster mitigation and management. Recently, seismic microzonation studies have been introduced in India, and first-level seismic microzonation has already been completed for selected urban centres including Jabalpur, Guwahati, Delhi, Bangalore, Ahmadabad, and Dehradun. The maps prepared for these cities are being further refined on larger scales as per the requirements, and a plan has also been firmed up for taking up microzonation of 30 selected cities, which lie in seismic zones V and IV and have a population of half a million. The paper highlights the efforts made in India so far towards seismic hazard assessment as well as the future road map for such studies.

Verma, Mithila; Bansal, Brijesh K.



New Approaches to Tsunami Hazard Mitigation Demonstrated in Oregon  

NASA Astrophysics Data System (ADS)

Oregon Department of Geology and Mineral Industries and Oregon Emergency Management collaborated over the last four years to increase tsunami preparedness for residents and visitors to the Oregon coast. Utilizing support from the National Tsunami Hazard Mitigation Program (NTHMP), new approaches to outreach and tsunami hazard assessment were developed and then applied. Hazard assessment was approached by first doing two pilot studies aimed at calibrating theoretical models to direct observations of tsunami inundation gleaned from the historical and prehistoric (paleoseismic/paleotsunami) data. The results of these studies were then submitted to peer-reviewed journals and translated into 1:10,000-12,000-scale inundation maps. The inundation maps utilize a powerful new tsunami model, SELFE, developed by Joseph Zhang at the Oregon Health & Science University. SELFE uses unstructured computational grids and parallel processing techniques to achieve fast, accurate simulation of tsunami interactions with fine-scale coastal morphology. The inundation maps were simplified into tsunami evacuation zones accessed as map brochures and an interactive mapping portal. Unique in the world are new evacuation maps that show separate evacuation zones for distant versus locally generated tsunamis. The brochure maps explain that evacuation time is four hours or more for distant tsunamis but 15-20 minutes for local tsunamis, which are invariably accompanied by strong ground shaking. Since distant tsunamis occur much more frequently than local tsunamis, the two-zone maps avoid needless over-evacuation (and expense) caused by one-zone maps. Inundation mapping for the entire Oregon coast will be complete by ~2014. Educational outreach was accomplished first by doing a pilot study to measure the effectiveness of various approaches using before-and-after polling and then applying the most effective methods. 
In descending order, the most effective methods were: (1) door-to-door (person-to-person) education, (2) evacuation drills, (3) outreach to K-12 schools, (4) media events, and (5) workshops targeted to key audiences (lodging facilities, teachers, and local officials). Community organizers were hired to apply these five methods to clusters of small communities, measuring performance by before-and-after polling. Organizers were encouraged to approach the top priority, person-to-person education, by developing Community Emergency Response Teams (CERT) or CERT-like organizations in each community, thereby leaving behind a functioning volunteer-based group that will continue the outreach program and build long-term resiliency. One of the most effective person-to-person educational tools was the Map Your Neighborhood program, which brings people together to sketch the basic layout of their neighborhoods and depict key earthquake and tsunami hazards and mitigation solutions. The various person-to-person volunteer efforts and supporting outreach activities are knitting communities together and creating a permanent culture of tsunami and earthquake preparedness. All major Oregon coastal population centers will have been covered by this intensive outreach program by ~2014.
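The two-zone logic behind the new evacuation maps (distant events allow four hours or more; local events, announced by strong shaking, allow only 15-20 minutes) can be sketched as a small decision rule. This is an illustrative sketch only; the function name, zone labels, and return format are our own assumptions, not DOGAMI's actual products.

```python
# Hypothetical sketch of the two-zone evacuation logic described above.
# Zone names and lead times are assumptions drawn from the abstract.

def evacuation_guidance(source: str) -> dict:
    """Return the evacuation zone and rough lead time for a tsunami source.

    source: "local" (Cascadia-type, felt as strong shaking) or "distant".
    """
    if source == "local":
        # Local tsunamis arrive in 15-20 minutes and follow strong shaking.
        return {"zone": "local", "lead_time_min": 15,
                "cue": "strong ground shaking"}
    elif source == "distant":
        # Distant tsunamis allow four hours or more for evacuation.
        return {"zone": "distant", "lead_time_min": 240,
                "cue": "official warning"}
    raise ValueError(f"unknown tsunami source: {source}")

print(evacuation_guidance("local")["lead_time_min"])    # 15
print(evacuation_guidance("distant")["lead_time_min"])  # 240
```

Keeping the two zones distinct is what lets a map avoid over-evacuating for the far more frequent distant events.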

Priest, G. R.; Rizzo, A.; Madin, I.; Lyles Smith, R.; Stimely, L.



GO/NO-GO - When is medical hazard mitigation acceptable for launch?  

NASA Technical Reports Server (NTRS)

Medical support of spaceflight missions is composed of complex tasks and decisions dedicated to maintaining the health and performance of the crew and the completion of mission objectives. Spacecraft are among the most complex vehicles built by humans and are built to very rigorous design specifications. In the course of a Flight Readiness Review (FRR) or a mission itself, the flight surgeon must be able to understand the impact of hazards and risks that may not be completely mitigated by design alone. Some hazards are not mitigated because they are never actually identified. When a hazard is identified, it must be reduced or waived. Hazards that cannot be designed out of the vehicle or mission are usually mitigated through other means to bring the residual risk to an acceptable level. This is possible in most engineered systems because failure modes are usually predictable and analysis can include taking these systems to failure. Medical support of space missions is complicated by the inability of flight surgeons to provide "exact" hazard and risk numbers to the NASA engineering community. Taking humans to failure is not an option. Furthermore, medical dogma is mostly comprised of "medical prevention" strategies that mitigate risk by examining the behavior of a cohort of humans similar to astronauts. Unfortunately, this approach does not lend itself well to predicting the effect of a hazard in the unique environment of space. This presentation will discuss how Medical Operations uses an evidence-based approach to decide if hazard mitigation strategies are adequate to reduce mission risk to acceptable levels. Case studies to be discussed will include: 1. Risk of electrocution during EVA. 2. Risk of a cardiac event during long- and short-duration missions. 3. Degraded cabin environmental monitoring on the ISS. 
Learning Objectives: 1.) The audience will understand the challenges of mitigating medical risk caused by nominal and off-nominal mission events. 2.) The audience will understand the process by which medical hazards are identified and mitigated before launch. 3.) The audience will understand the roles and responsibilities of all the other flight control positions in participating in the process of reducing hazards and reducing medical risk to an acceptable level.

Hamilton, Douglas R.; Polk, James D.



Composite Materials for Hazard Mitigation of Reactive Metal Hydrides.  

SciTech Connect

In an attempt to mitigate the hazards associated with storing large quantities of reactive metal hydrides, polymer composite materials were synthesized and tested under simulated usage and accident conditions. The composites were made by polymerizing vinyl monomers using free-radical polymerization chemistry, in the presence of the metal hydride. Composites with vinyl-containing siloxane oligomers were also polymerized with and without added styrene and divinyl benzene. Hydrogen capacity measurements revealed that addition of the polymer to the metal hydride reduced the inherent hydrogen storage capacity of the material. The composites were found to be initially effective at reducing the amount of heat released during oxidation. However, upon cycling the composites, the mitigating behavior was lost. While the polymer composites we investigated have mitigating potential and are physically robust, they undergo a chemical change upon cycling that makes them subsequently ineffective at mitigating heat release upon oxidation of the metal hydride.

Acknowledgements: The authors would like to thank the following people who participated in this project: Ned Stetson (U.S. Department of Energy) for sponsorship and support of the project. Ken Stewart (Sandia) for building the flow-through calorimeter and cycling test stations. Isidro Ruvalcaba, Jr. (Sandia) for qualitative experiments on the interaction of sodium alanate with water. Terry Johnson (Sandia) for sharing his expertise and knowledge of metal hydrides, and sodium alanate in particular. Marcina Moreno (Sandia) for programmatic assistance. John Khalil (United Technologies Research Corp) for insight into the hazards of reactive metal hydrides and real-world accident scenario experiments. 
Summary: In an attempt to mitigate and/or manage hazards associated with storing bulk quantities of reactive metal hydrides, polymer composite materials (a mixture of a mitigating polymer and a metal hydride) were synthesized and tested under simulated usage and accident conditions. Mitigating the hazards associated with reactive metal hydrides during an accident, while keeping the original capability of the active material intact during normal use, has been the focus of this work. These composites were made by polymerizing vinyl monomers using free-radical polymerization chemistry in the presence of the metal hydride, in this case a prepared sodium alanate (chosen as a representative reactive metal hydride). It was found that the polymerization of styrene and divinyl benzene could be initiated using AIBN in toluene at 70 °C. The resulting composite materials can be either hard or brittle solids depending on the cross-linking density. The thermal decomposition temperature of these styrene-based composite materials is lower than that of neat polystyrene, indicating that the chemical nature of the polymer is affected by the formation of the composite. The char-forming tendency of cross-linked polystyrene is low, and it is therefore not an ideal polymer for hazard mitigation. To obtain composite materials containing a polymer with higher char-forming potential, siloxane-based monomers were investigated. Four vinyl-containing siloxane oligomers were polymerized with and without added styrene and divinyl benzene. Like the styrene materials, these composite materials exhibited thermal decomposition behavior significantly different from that of the neat polymers. Specifically, the thermal decomposition temperature was shifted approximately 100 °C lower than that of the neat polymer, signifying a major chemical change to the polymer network. Thermal analysis of the cycled samples was performed on the siloxane-based composite materials. 
It was found that after 30 cycles the siloxane-containing polymer composite material has TGA/DSC-MS traces similar to those of the virgin composite material, indicating that the polymer is physically intact upon cycling. Hydrogen capacity measurements revealed that addition of the polymer to the metal hydride in the form of a composite material reduced the inherent hydrogen storage capacity of the material.

Pratt, Joseph William; Cordaro, Joseph Gabriel; Sartor, George B.; Dedrick, Daniel E.; Reeder, Craig L.



National Landslide Hazards Mitigation Strategy: A Framework for Loss Reduction  

NSDL National Science Digital Library

This report outlines the key elements of a comprehensive and effective national strategy for reducing losses from landslides nationwide and provides an assessment of the status, needs, and associated costs of this strategy. Topics include the rising costs resulting from landslide damage, the role of the federal government in mitigating this hazard, and some features of the new strategy such as partnerships among government, academia and the private sector, expanded research, and making use of technological advances. The material may be downloaded in PDF or plain text format.



Modeling and mitigating natural hazards: Stationarity is immortal!  

NASA Astrophysics Data System (ADS)

Climate change is a cause of relevant concern as it is occurring at an unprecedented pace and might increase natural hazards. Moreover, it is deemed to imply a reduced representativeness of past experience and data on extreme hydroclimatic events. The latter concern has been epitomized by the statement that "stationarity is dead." Setting up policies for mitigating natural hazards, including those triggered by floods and droughts, is an urgent priority in many countries, and implies practical activities of management, engineering design, and construction. These activities need to be properly informed, and therefore the research question on the value of past data is extremely important. We herein argue that there are mechanisms in hydrological systems that are time invariant, and that these may need to be interpreted through data inference. In particular, hydrological predictions are based on assumptions which should include stationarity. In fact, any hydrological model, including deterministic and nonstationary approaches, is affected by uncertainty and therefore should include a random component that is stationary. Given that an unnecessary resort to nonstationarity may imply a reduction of predictive capabilities, a pragmatic approach based on the exploitation of past experience and data is a necessary prerequisite for setting up mitigation policies for environmental risk.
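The pragmatic stationary approach argued for here can be illustrated by fitting a time-invariant distribution to past data. The sketch below fits a Gumbel distribution to invented annual flood maxima by the method of moments and reads off a 100-year return level; this is a generic textbook procedure offered as an illustration, not the authors' specific method.

```python
import math
import statistics

# Stationary design-value sketch: fit a time-invariant Gumbel distribution
# to past annual maxima and compute a return level. Sample data are invented.

def gumbel_return_level(annual_maxima, return_period):
    mean = statistics.mean(annual_maxima)
    std = statistics.stdev(annual_maxima)
    scale = std * math.sqrt(6) / math.pi   # method-of-moments estimate
    loc = mean - 0.5772 * scale            # 0.5772 = Euler-Mascheroni constant
    p = 1 - 1 / return_period              # non-exceedance probability
    return loc - scale * math.log(-math.log(p))

peaks = [310, 285, 402, 350, 298, 455, 330, 368, 290, 410]  # m^3/s, invented
q100 = gumbel_return_level(peaks, 100)
print(q100 > max(peaks))  # True: the 100-yr level exceeds all observed peaks
```

The stationarity assumption is exactly what licenses using the whole record to estimate the design value.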

Montanari, Alberto; Koutsoyiannis, Demetris



Mitigation of unconfined releases of hazardous gases via liquid spraying  

Microsoft Academic Search

The capability of water sprays to mitigate clouds of hydrofluoric acid (HF) has been demonstrated in the large-scale Goldfish and Hawk field experiments, which took place at the DOE Nevada Test Site. The effectiveness of water sprays and fire water monitors in removing HF from the vapor plume has also been studied theoretically using the HGSPRAY5 model with the near-field





SciTech Connect

Interaction of a high intensity laser with matter may generate an ionizing radiation hazard. Very limited studies have been made, however, on the laser-induced radiation protection issue. This work reviews available literature on the physics and characteristics of laser-induced X-ray hazards. Important aspects include the laser-to-electron energy conversion efficiency, electron angular distribution, electron energy spectrum and effective temperature, and bremsstrahlung production of X-rays in the target. The possible X-ray dose rates for several femtosecond Ti:sapphire laser systems used at SLAC, including the short pulse laser system for the Matter in Extreme Conditions Instrument (peak power 4 TW and peak intensity 2.4 × 10^18 W/cm^2), were analysed. A graded approach to mitigate the laser-induced X-ray hazard with a combination of engineered and administrative controls is also proposed.

Qiu, R.; Liu, J.C.; Prinz, A.A.; Rokni, S.H.; Woods, M.; Xia, Z.



Standards and Guidelines for Numerical Models for Tsunami Hazard Mitigation  

NASA Astrophysics Data System (ADS)

An increasing number of nations around the world need to develop tsunami mitigation plans, which invariably involve inundation maps for warning guidance and evacuation planning. There is a risk that inundation maps may be produced with older or untested methodology, as there are currently no standards for modeling tools. In the aftermath of the 2004 megatsunami, some models were used to model inundation for Cascadia events with results much larger than sediment records and existing state-of-the-art studies suggest, leading to confusion among emergency managers. Incorrectly assessing tsunami impact is hazardous, as recent events in 2006 in Tonga, Kythira (Greece) and Central Java have suggested (Synolakis and Bernard, 2006). To calculate tsunami currents, forces and runup on coastal structures, and inundation of coastlines, one must numerically calculate the evolution of the tsunami wave from the deep ocean to its target site. No matter what the numerical model, validation (the process of ensuring that the model solves the parent equations of motion accurately) and verification (the process of ensuring that the model represents geophysical reality appropriately) are both essential. Validation ensures that the model performs well in a wide range of circumstances and is accomplished through comparison with analytical solutions. Verification ensures that the computational code performs well over a range of geophysical problems. A few analytic solutions have themselves been validated with laboratory data. Even fewer existing numerical models have been both validated with the analytical solutions and verified with laboratory and field measurements, thus establishing a gold standard for numerical codes for inundation mapping. 
While there is in principle no absolute certainty that a numerical code that has performed well in all the benchmark tests will also produce correct inundation predictions with any given source motions, validated codes reduce the level of uncertainty in their results to the uncertainty in the geophysical initial conditions. Further, when coupled with real-time free-field tsunami measurements from tsunameters, validated codes are the only choice for realistic forecasting of inundation; the consequences of failure are too ghastly to take chances with numerical procedures that have not been validated. We discuss a ten-step process of benchmark tests for models used for inundation mapping. The associated methodology and algorithms have to first be validated with analytical solutions, then verified with laboratory measurements and field data. The models need to be published in peer-reviewed journals indexed by ISI. While this process may appear onerous, it reflects our state of knowledge, and is the only defensible methodology when human lives are at stake. Synolakis, C.E., and Bernard, E.N., Tsunami science before and beyond Boxing Day 2004, Phil. Trans. R. Soc. A 364(1845), 2231-2263, 2006.
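The benchmark process described above hinges on quantitative comparison of a model's output against analytical solutions. A minimal sketch of one such check is shown below, assuming a plain relative L2 error metric and an invented 5% tolerance; the actual NTHMP benchmarks define their own acceptance criteria, so this is illustration only.

```python
import math

# One illustrative validation step: accept a model run only if its relative
# L2 error against the analytical solution is below a tolerance.

def relative_l2_error(numerical, analytical):
    """Relative L2 norm of the difference between two sampled solutions."""
    num = math.sqrt(sum((n - a) ** 2 for n, a in zip(numerical, analytical)))
    den = math.sqrt(sum(a ** 2 for a in analytical))
    return num / den

def passes_benchmark(numerical, analytical, tol=0.05):
    return relative_l2_error(numerical, analytical) <= tol

# Toy "model run": a sampled sin(x) with a small uniform discretization error.
xs = [i * 0.1 for i in range(50)]
analytical = [math.sin(x) for x in xs]
numerical = [math.sin(x) + 0.001 for x in xs]
print(passes_benchmark(numerical, analytical))  # True
```

A full benchmark suite would repeat such checks across many analytical, laboratory, and field cases before a code is trusted for inundation mapping.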

Titov, V.; Gonzalez, F.; Kanoglu, U.; Yalciner, A.; Synolakis, C. E.



Local hazard mitigation plans: a preliminary estimation of state-level completion from 2004 to 2009.  


According to the Disaster Mitigation Act of 2000 and subsequent federal policy, local governments are required to have a Hazard Mitigation Plan (HMP) written and approved by the Federal Emergency Management Agency (FEMA) to be eligible for federal mitigation assistance. This policy took effect on November 1, 2004. Using FEMA's database of approved HMPs and the US Census Bureau's 2002 Survey of Local Governments, it is estimated that 3 years after the original deadline, 67 percent of the country's active local governments were without an approved HMP. A follow-up examination in 2009 of the eight states with the lowest completion percentages did not indicate significant improvement following the initial study and revealed inconsistencies in plan completion data over time. The completion percentage varied greatly by state and did not appear to follow any expected pattern, such as wealth or hazard vulnerability, that might encourage prompt completion of a plan. Further, the results indicate that 92 percent of the approved plans were completed by a multijurisdictional entity, which suggests single governments seldom complete and gain approval for plans. Based on these results, it is believed that state-level resolution is not adequate for explaining the variation in plan completion, and further study at the local level is warranted. PMID:24180092

Jackman, Andrea M; Beruvides, Mario G



Next-Generation GPS Station for Hazards Mitigation (Invited)  

NASA Astrophysics Data System (ADS)

Our objective is to better forecast, assess, and mitigate natural hazards, including earthquakes, tsunamis, and extreme storms and flooding through development and implementation of a modular technology for the next-generation in-situ geodetic station to support the flow of information from multiple stations to scientists, mission planners, decision makers, and first responders. The same technology developed under NASA funding can be applied to enhance monitoring of large engineering structures such as bridges, hospitals and other critical infrastructure. Meaningful warnings save lives when issued within 1-2 minutes for destructive earthquakes, several tens of minutes for tsunamis, and up to several hours for extreme storms and flooding, and can be provided by on-site fusion of multiple data types and generation of higher-order data products: GPS/GNSS and accelerometer measurements to estimate point displacements, and GPS/GNSS and meteorological measurements to estimate moisture variability in the free atmosphere. By operating semi-autonomously, each station can then provide low-latency, high-fidelity and compact data products within the constraints of narrow communications bandwidth that often accompanies natural disasters. We have developed a power-efficient, low-cost, plug-in Geodetic Module for fusion of data from in situ sensors including GPS, a strong-motion accelerometer module, and a meteorological sensor package, for deployment at existing continuous GPS stations in southern California; fifteen stations have already been upgraded. The low-cost modular design is scalable to the many existing continuous GPS stations worldwide. New on-the-fly data products are estimated with 1 mm precision and accuracy, including three-dimensional seismogeodetic displacements for earthquake, tsunami and structural monitoring and precipitable water for forecasting extreme weather events such as summer monsoons and atmospheric rivers experienced in California. 
Unlike more traditional approaches where data are collected and analyzed from a network of stations at a central processing facility, we are embedding these capabilities in the Geodetic Module's processor for in situ analysis and data delivery through TCP/IP to avoid single points of failure during emergencies. We are infusing our technology to several local and state groups, including the San Diego County Office of Emergency Services for earthquake and tsunami early warnings, UC San Diego Health Services for hospital monitoring and early warning, Caltrans for bridge monitoring, and NOAA's Weather Forecasting Offices in San Diego and Los Angeles Counties for forecasting extreme weather events. We describe our overall system and the ongoing efforts at technology infusion.

Bock, Y.



Evaluating fuel complexes for fire hazard mitigation planning in the southeastern United States.  

SciTech Connect

Fire hazard mitigation planning requires an accurate accounting of fuel complexes to predict potential fire behavior and the effects of treatment alternatives. In the southeastern United States, rapid vegetation growth coupled with complex land use history and forest management options requires a dynamic approach to fuel characterization. In this study we assessed potential surface fire behavior with the Fuel Characteristic Classification System (FCCS), a tool which uses inventoried fuelbed inputs to predict fire behavior. Using inventory data from 629 plots established in the upper Atlantic Coastal Plain, South Carolina, we constructed FCCS fuelbeds representing median fuel characteristics by major forest type and age class. With a dry fuel moisture scenario and a 6.4 km h^-1 midflame wind speed, the FCCS predicted moderate to high potential fire hazard for the majority of the fuelbeds under study. To explore fire hazard under potential future fuel conditions, we developed fuelbeds representing the range of quantitative inventory data for fuelbed components that drive surface fire behavior algorithms, and adjusted shrub species composition to represent 30% and 60% relative cover of highly flammable shrub species. Results indicate that the primary drivers of surface fire behavior vary by forest type, age and surface fire behavior rating. Litter tends to be a primary or secondary driver in most forest types. In comparison to other surface fire contributors, reducing shrub loading reduces flame lengths most consistently across forest types. FCCS fuelbeds and the results from this project can be used for fire hazard mitigation planning throughout the southern Atlantic Coastal Plain where similar forest types occur. The approach of building simulated fuelbeds across the range of available surface fuel data produces sets of incrementally different fuel characteristics that can be applied to any dynamic forest type in which surface fuel conditions change rapidly.

Andreu, Anne G.; Shea, Dan; Parresol, Bernard R.; Ottmar, Roger D.



NEOShield: A European Project to Address Impact Hazard Mitigation Issues  

NASA Astrophysics Data System (ADS)

The European Union NEOShield project is described. The primary aim of the project is to investigate in detail the three most promising mitigation techniques: the kinetic impactor, blast deflection, and the gravity tractor, and to devise feasible demonstration missions.

Fulchignoni, M.; Harris, A. W.; Barucci, M. A.; Cano, J. L.; Fitzsimmons, A.; Green, S. F.; Hestroffer, D.; Lappas, V.; Lork, W.; Michel, P.; Morrison, D.; Payson, D.; Schäfer, F.; Thuillot, W.



Threshold effects of hazard mitigation in coastal human-environmental systems  

NASA Astrophysics Data System (ADS)

Despite improved scientific insight into physical and social dynamics related to natural disasters, the financial cost of extreme events continues to rise. This paradox is particularly evident along developed coastlines, where future hazards are projected to intensify with consequences of climate change, and where the presence of valuable infrastructure exacerbates risk. By design, coastal hazard mitigation buffers human activities against the variability of natural phenomena such as storms. But hazard mitigation also sets up feedbacks between human and natural dynamics. This paper explores developed coastlines as exemplary coupled human-environmental systems in which hazard mitigation is the key coupling mechanism. Results from a simplified numerical model of an agent-managed seawall illustrate the nonlinear effects that economic and physical thresholds can impart into coastal human-environmental system dynamics. The scale of mitigation action affects the time frame over which human activities and natural hazards interact. By accelerating environmental changes observable in some settings over human timescales of years to decades, climate change may temporarily strengthen the coupling between human and environmental dynamics. However, climate change could ultimately result in weaker coupling at those human timescales as mitigation actions increasingly engage global-scale systems.
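The threshold feedback the abstract describes can be illustrated with a toy agent model: storm damage accumulates until an economic threshold triggers a seawall upgrade, producing stepwise, nonlinear dynamics. The parameter values and the upgrade rule below are invented for illustration and are not the paper's actual model.

```python
# Toy agent-managed seawall: mitigation happens only after accumulated
# losses cross an economic threshold. All numbers are illustrative.

def simulate(storm_surges, wall=1.0, damage_per_m=100.0, threshold=150.0):
    """Return (final wall height, total damage, number of upgrades)."""
    total_damage, upgrades, recent = 0.0, 0, 0.0
    for surge in storm_surges:
        damage = max(0.0, surge - wall) * damage_per_m  # overtopping loss
        total_damage += damage
        recent += damage
        if recent > threshold:
            # Threshold behavior: rebuild higher than the damaging surge.
            wall = surge + 0.5
            upgrades += 1
            recent = 0.0
    return wall, total_damage, upgrades

wall, total, upgrades = simulate([0.5, 1.2, 2.0, 1.5, 2.6])
print(wall, round(total), upgrades)  # 2.0 230 1
```

Even this minimal rule shows the coupling the paper studies: the hazard drives mitigation, and the mitigation in turn reshapes which future storms cause damage.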

Lazarus, E. D.



Assessing the costs of hazard mitigation through landscape interventions in the urban structure  

NASA Astrophysics Data System (ADS)

In this paper we look at an issue rarely approached: the economic efficiency of natural hazard risk mitigation. The urban scale at which a natural hazard can impact makes urban planning strategy important in risk management. However, risk is usually dealt with by the natural, engineering, and social sciences, and the role of architecture and urban planning is neglected. Climate change can lead to risks related to increased floods, desertification, and sea level rise, among others. Reducing the sealed surfaces in cities through green spaces in crowded centres can mitigate these risks, and such measures can be foreseen in restructuring plans in the presence or absence of disasters. For this purpose we reviewed the role in games of green spaces and of community centres such as churches, which can form the core of restructuring efforts, as our field and archive studies also show. We look at the way ICT can help organize the information from the building survey for economic computations, in direct modeling or through games. The roles of game theory, agent-based modeling, networks, and urban public policies in designing decision systems for risk management are discussed. The game rules are supported by our field and archive studies, as well as by research by design. We also consider a rarely examined element, the role of landscape planning, through the inclusion of green elements in reconstruction after natural and man-made disasters, or in restructuring efforts to mitigate climate change. Apart from the existing old city tissue, the landscape can also be endangered by speculation, and it is therefore vital to highlight its high economic value, also in this particular case. As ICOMOS highlights for the 2014 congress, heritage and landscape are two sides of the same coin. A landscape can become, or be connected to, a community centre, the first being necessary for building a settlement, the second raising its value, or it can build connections between landmarks along urban routes. 
For this reason, location plays a role not only in mitigating the effects of hazards but also in increasing the value of land through vicinities. Games are simply another way to build a model of the complex system that is the urban organism, and a model is easier to analyse than the system itself while displaying its basic rules. The role of the landscape in building roads of memory between landmarks in reconstruction is yet to be investigated in a proposed future COST action.

Bostenaru-Dan, Maria; Aldea Mendes, Diana; Panagopoulos, Thomas



Spatio-temporal patterns of hazards and their use in risk assessment and mitigation. Case study of road accidents in Romania  

NASA Astrophysics Data System (ADS)

Road accidents are among the leading causes of death in many countries, partly as an inherent consequence of the increasing mobility of today's society. The World Health Organization estimates that 1.3 million people died in road accidents in 2011, which means 186 deaths per million. The tragic picture is completed by the millions of people experiencing physical injuries and by the enormous social and economic costs that these events imply. Romania has one of the most unsafe road networks in the European Union, with annual averages of 9400 accidents, 8300 injuries and almost 2680 fatalities (2007-2012). An average of 141 deaths per million is more than twice the average fatality rate in the European Union (about 60 deaths per million). Other specific indicators (accidents or fatalities relative to road length, vehicle fleet size, driving license owners or adult population, etc.) are even worse in the same European context. Road accidents are caused by a complex series of factors, some of them being relatively constant premises, while others act as catalyzing factors or triggering agents: road features and quality, vehicle technical state, weather conditions, human-related factors, etc. All these lead to a complex equation with too many unknown variables, making a probabilistic approach almost impossible. However, the high concentration of accidents in a region or in some road sectors is caused by the existence of a specific context, created by factors with a permanent or repetitive character, and leads to the idea of a spatial autocorrelation between the locations of adjoining accidents. In the same way, the increasing frequency of road accidents and the repeatability of their causes in different periods of the year make it possible to identify those black timeframes with a higher incidence of road accidents. 
Identifying and analyzing road blackspots (hotspots) and black zones would help improve road safety by acting against the common causes that create the spatial or temporal clustering of accidents. Since the 1990s, Geographical Information Systems (GIS) have become a very important tool for traffic and road safety management, allowing not only spatial and multifactorial analysis, but also graphical and non-graphical outputs. The current paper presents an accessible GIS methodology to study the spatio-temporal pattern of injury-related road accidents, to identify high-density accident zones, to perform cluster analysis, to create multicriterial typologies, and to identify and explain spatial and temporal similarities. For this purpose, a Geographical Information System was created, allowing a complex analysis that involves not only the events, but also a large set of interrelated and spatially linked attributes. The GIS includes the accidents as georeferenced point elements with a spatially linked attribute database: identification information (date, location details); accident type; main, secondary and aggravating causes; data about the driver; vehicle information; and consequences (damages, injured people and fatalities). Each attribute has its own numeric code that allows both statistical analysis and spatial interrogation. The database includes those road accidents that led to physical injuries and loss of human lives between 2007 and 2012, and the spatial analysis was realized using TNTmips 7.3 software facilities. Data aggregation and processing allowed creating the spatial pattern of injury-related road accidents through kernel density estimation at three different levels (national - Romania; county level - Iasi County; local level - Iasi town). Spider graphs were used to create the temporal pattern of road accidents at three levels (daily, weekly and monthly) directly related to their causes. 
Moreover, the spatial and temporal database relates the natural hazards (glazed frost, fog, and blizzard) to the human-made ones, giving the opportunity to evaluate the nature of uncertainties in risk assessment. Finally, this paper provides a clustering methodology based on several environmenta
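The kernel density step used above for hotspot mapping can be sketched in a few lines, assuming a plain Gaussian kernel over accident coordinates. The study itself uses TNTmips; the coordinates, bandwidth, and grid below are invented purely for illustration.

```python
import math

# Hedged sketch of kernel density estimation for accident hotspot mapping:
# evaluate a Gaussian KDE over a coarse grid and pick the densest cell.

def kde(points, x, y, bandwidth=1.0):
    """Gaussian kernel density at (x, y) from a list of (px, py) points."""
    h2 = bandwidth ** 2
    return sum(math.exp(-((x - px) ** 2 + (y - py) ** 2) / (2 * h2))
               for px, py in points) / (2 * math.pi * h2 * len(points))

# A small cluster of accidents near (2, 2) plus two isolated events.
accidents = [(2.0, 2.0), (2.2, 1.9), (1.8, 2.1), (6.0, 6.0), (0.0, 5.0)]
grid = [(gx, gy) for gx in range(8) for gy in range(8)]
hotspot = max(grid, key=lambda cell: kde(accidents, *cell))
print(hotspot)  # (2, 2)
```

In a real GIS workflow the same idea runs over georeferenced points at national, county, and town scales, with bandwidth chosen to match the analysis level.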

Catalin Stanga, Iulian



The Diversity of Large Earthquakes and Its Implications for Hazard Mitigation  

NASA Astrophysics Data System (ADS)

With the advent of broadband seismology and GPS, significant diversity in the source radiation spectra of large earthquakes has been clearly demonstrated. This diversity requires different approaches to mitigate hazards. In certain tectonic environments, seismologists can forecast the future occurrence of large earthquakes within a solid scientific framework using the results from seismology and GPS. Such forecasts are critically important for long-term hazard mitigation practices, but because stochastic fracture processes are complex, the forecasts are inevitably subject to large uncertainty, and unexpected events will continue to surprise seismologists. Recent developments in real-time seismology will help seismologists to cope with and prepare for tsunamis and earthquakes. Combining a better understanding of earthquake diversity with modern technology is the key to effective and comprehensive hazard mitigation practices.

Kanamori, Hiroo



Experimentally Benchmarked Numerical Approaches to Lightning Hazard Assessment and Mitigation  

NASA Astrophysics Data System (ADS)

A natural hazard that has been with us since the beginning of time is the lightning strike. Not only does it represent a direct hazard to humans, but also to the facilities they work within and the products they produce. The latter categories are of particular concern when they are related to potentially hazardous processes and products. For this reason, experimental and numerical modelling techniques are developed to understand the nature of the hazards, to develop appropriate protective approaches which can be put in place, and finally to gain assurance that the overall risks fall within nationally and internationally accepted standards and those appropriate to the special nature of the work. The latter is of particular importance when the processes and products within such facilities have a potential susceptibility to lightning strike and where failure is deemed unacceptable. This paper covers examples of the modelling approaches applied to such facilities within which high consequence operations take place, together with the protection that is required for high consequence products. In addition, examples are given of how the numerical techniques are benchmarked by supporting experimental programmes. Not only should such a safety rationale be laid down and agreed early for these facilities and products, but it must be maintained during the inevitable changes that will occur during the design, development, production and maintenance phases. For example, an 'improvement', as seen by a civil engineer or a facility manager, may well turn out to be detrimental to lightning safety. Constant vigilance is key to ensuring the maintenance of safety.

Jones, Malcolm; Newton, David



Assessment of indirect losses and costs of emergency for project planning of alpine hazard mitigation  

NASA Astrophysics Data System (ADS)

Owing to increased settlement in hazardous areas and rising asset values, natural disasters such as floods, landslides and rockfalls cause high economic losses in Alpine lateral valleys. Especially in small municipalities, indirect losses, mainly stemming from a breakdown of transport networks, and the costs of emergency response can reach critical levels. Quantifying these losses is necessary to estimate the worthiness of mitigation measures, to determine the appropriate level of disaster assistance and to improve risk management strategies. Comprehensive approaches are available for assessing direct losses; however, indirect losses and emergency costs are largely not assessed, and the empirical basis for estimating them is weak. To address the resulting uncertainties in project appraisals, a standardized methodology has been developed dealing with local economic effects and the emergency efforts needed. In our approach, the cost-benefit analysis for technical mitigation used by the Austrian Torrent and Avalanche Control (TAC) is optimized and extended, using as a design event the 2005 debris flow that struck a small town in the upper Inn valley in southwest Tyrol (Austria). In this event, 84 buildings were affected and 430 people were evacuated, and in response the TAC implemented protection measures costing 3.75 million euros. Upgrading the TAC method and analyzing how the cost-benefit ratio changes as a result is one of the main objectives of this study. To estimate short-run indirect effects and emergency costs at the local level, data were collected via questionnaires, field mapping and guided interviews, as well as intensive literature research. On this basis, up-to-date calculation methods were developed and the TAC cost-benefit analysis was recalculated with the new results. The cost-benefit ratio, and hence the decision on which mitigation alternative to carry out, will be more precise and specific.
Based on this, the worthiness of the mitigation measures can be determined in more detail and the proper level of emergency assistance calculated more adequately. This study creates a better data basis for evaluating technical and non-technical mitigation measures, which is useful for government agencies, insurance companies and research.
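The extension of a cost-benefit analysis by indirect losses and emergency costs can be sketched numerically. The following Python fragment is a minimal illustration with entirely invented figures (none of the values come from the TAC study); the function names and parameters are assumptions for the sketch.

```python
def benefit_cost_ratio(direct_losses_avoided,
                       indirect_losses_avoided,
                       emergency_costs_avoided,
                       annual_event_probability,
                       project_cost,
                       discount_rate=0.03,
                       horizon_years=80):
    """Discounted expected benefits divided by project cost (toy model)."""
    annual_benefit = annual_event_probability * (
        direct_losses_avoided + indirect_losses_avoided + emergency_costs_avoided
    )
    # Present value of a constant annual benefit over the project horizon
    pv_factor = (1 - (1 + discount_rate) ** -horizon_years) / discount_rate
    return annual_benefit * pv_factor / project_cost

# Direct losses only vs. direct + indirect + emergency (EUR, invented figures)
bc_direct = benefit_cost_ratio(4.0e6, 0.0, 0.0, 1 / 100, 3.75e6)
bc_full = benefit_cost_ratio(4.0e6, 1.5e6, 0.5e6, 1 / 100, 3.75e6)
print(bc_direct, bc_full)  # accounting for indirect losses raises the ratio
```

Because benefits enter linearly, including indirect and emergency costs scales the ratio up proportionally, which is exactly how such costs can tip a project appraisal from "not worthwhile" to "worthwhile".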

Amenda, Lisa; Pfurtscheller, Clemens



Mitigation of EMU Cut Glove Hazard from Micrometeoroid and Orbital Debris Impacts on ISS Handrails  

NASA Technical Reports Server (NTRS)

Recent cut damage sustained on crewmember gloves during extravehicular activity (EVA) onboard the International Space Station (ISS) has been caused by contact with sharp edges or a pinch point, according to analysis of the damage. One potential source is protruding sharp-edged crater lips from micrometeoroid and orbital debris (MMOD) impacts on metallic handrails along EVA translation paths. A number of hypervelocity impact tests were performed on ISS handrails, which found that mm-sized projectiles were capable of inducing crater lip heights two orders of magnitude above the minimum value for glove abrasion concerns. Two techniques were evaluated for mitigating the cut glove hazard of MMOD impacts on ISS handrails: flexible overwraps, which act to limit contact between crewmember gloves and impact sites; and alternate materials, which form less hazardous impact crater profiles. In parallel with redesign efforts to increase the cut resilience of EMU gloves, the modifications to ISS handrails evaluated in this study provide the means to significantly reduce the cut glove risk from MMOD impact craters.

Ryan, Shannon; Christiansen, Eric L.; Davis, Bruce A.; Ordonez, Erick



The U.S. National Tsunami Hazard Mitigation Program: Successes in Tsunami Preparedness  

NASA Astrophysics Data System (ADS)

Formed in 1995 by Congressional action, the National Tsunami Hazard Mitigation Program (NTHMP) provides the framework for tsunami preparedness activities in the United States. The Program consists of the 28 U.S. coastal states, territories, and commonwealths (STCs), as well as three Federal agencies: the National Oceanic and Atmospheric Administration (NOAA), the Federal Emergency Management Agency (FEMA), and the United States Geological Survey (USGS). Since its inception, the NTHMP has advanced tsunami preparedness in the United States through accomplishments in many areas: - Coordination and funding of tsunami hazard analysis and preparedness activities in STCs; - Development and execution of a coordinated plan to address education and outreach activities (materials, signage, and guides) within its membership; - Leading the effort to assist communities in meeting National Weather Service (NWS) TsunamiReady guidelines through development of evacuation maps and other planning activities; - Determination of tsunami hazard zones in the most highly threatened coastal communities throughout the country through detailed tsunami inundation studies; - Development of a benchmarking procedure for numerical tsunami models to ensure that models used in the inundation studies meet consistent NOAA standards; - Creation of a national tsunami exercise framework to test tsunami warning system response; - Funding of community tsunami warning dissemination and reception systems such as sirens and NOAA Weather Radios; and - Provision of guidance to NOAA's Tsunami Warning Centers regarding warning dissemination and content. NTHMP activities have advanced the state of preparedness of United States coastal communities, and have helped save lives and property during recent tsunamis. Program successes as well as future plans, including maritime preparedness, are discussed.

Whitmore, P.; Wilson, R. I.



New Activities of the U.S. National Tsunami Hazard Mitigation Program, Mapping and Modeling Subcommittee  

NASA Astrophysics Data System (ADS)

The U.S. National Tsunami Hazard Mitigation Program (NTHMP) comprises representatives from coastal states and federal agencies who, under the guidance of NOAA, work together to develop protocols and products to help communities prepare for and mitigate tsunami hazards. Within the NTHMP are several subcommittees responsible for complementary aspects of tsunami assessment, mitigation, education, warning, and response. The Mapping and Modeling Subcommittee (MMS) comprises state and federal scientists who specialize in tsunami source characterization, numerical tsunami modeling, inundation map production, and warning forecasting. Until September 2012, much of the work of the MMS was authorized through the Tsunami Warning and Education Act, an Act that has since expired but the spirit of which is being adhered to in parallel with reauthorization efforts. Over the past several years, the MMS has developed guidance and best practices for states and territories to produce accurate and consistent tsunami inundation maps for community-level evacuation planning, and has conducted benchmarking of numerical inundation models. Recent tsunami events have highlighted the need for other types of tsunami hazard analyses and products for improving evacuation planning, vertical evacuation, maritime planning, land-use planning, building construction, and warning forecasts. As the program responsible for producing accurate and consistent tsunami products nationally, the NTHMP-MMS is initiating a multi-year plan to accomplish the following: 1) Create and build on existing demonstration projects that explore new tsunami hazard analysis techniques and products, such as maps identifying areas of strong currents and potential damage within harbors, as well as probabilistic tsunami hazard analysis for land-use planning. 2) Develop benchmarks for validating new numerical modeling techniques related to current velocities and landslide sources. 
3) Generate guidance and protocols for the production and use of new tsunami hazard analysis products. 4) Identify multistate collaborations and funding partners interested in these new products. Application of these new products will improve the overall safety and resilience of coastal communities exposed to tsunami hazards.

Wilson, R. I.; Eble, M. C.



The NEOShield Project: Understanding the Mitigation-Relevant Physical Properties of Potentially Hazardous Asteroids  

NASA Astrophysics Data System (ADS)

NEOShield is a European-Union funded project to address impact hazard mitigation issues, coordinated by the German Aerospace Center, DLR. The NEOShield consortium consists of 13 research institutes, universities, and industrial partners from 6 countries and includes leading US and Russian space organizations. The primary aim of the 5.8 million euro, 3.5 year project, which commenced in January 2012, is to investigate in detail promising mitigation techniques, such as the kinetic impactor, blast deflection, and the gravity tractor, and devise feasible demonstration missions. Options for an international strategy for implementation when an actual impact threat arises will also be investigated. Our current scientific work is focused on examining the mitigation-relevant physical properties of the NEO population via observational data and laboratory experiments on asteroid surface analog materials. We are attempting to narrow the range of the expected properties of objects that are most likely to threaten the Earth and trigger space-borne mitigation attempts, and investigate how such objects would respond to different mitigation techniques. The results of our scientific work will flow into the technical phase of the project, during which detailed designs of feasible mitigation demonstration missions will be developed. We briefly describe the scope of the project and report on results obtained to date. Funded under EU FP7 program agreement no. 282703.

Harris, Alan W.; Drube, L.; Consortium, NEOShield



Monitoring Fogo Island, Cape Verde Archipelago, for Volcanic Hazard Mitigation  

NASA Astrophysics Data System (ADS)

Fogo Island, in the Cape Verde Archipelago (North Atlantic), with a total area of 476 km2 and a population of about 40,000, is an active ocean island volcano rising from an average sea-bottom depth of the order of -3000 m to a maximum altitude of 2820 m. All of the 28 historically recorded eruptions (Ribeiro, 1960) since the arrival of the first settlers in the 15th Century took place in Cha das Caldeiras, a 9 km-wide flat zone 1700 meters above sea level that resulted from the infill of a large lateral collapse caldera (Day et al., 2000). The last eruptions occurred in 1951 and 1995, through secondary cones at the base of Pico do Fogo, the main volcanic edifice. A tall scarp surrounds Cha das Caldeiras on its western side only, and the eastern limit leads to a very steep sub-aerial slope down to the coastline. With this morphology, the volcanic hazard is significant inside Cha das Caldeiras - with a resident population of the order of 800 - and particularly in the villages of the eastern coast. Because the magma has low viscosity, eruptions in Fogo have scarce precursory activity, and forecasting them is therefore challenging. The VIGIL monitoring network was installed between 1997 and 2001, and is currently in full operation. It consists of seven seismographic stations - two of which are broadband - four tilt stations, a CO2 monitoring station and a meteorological station. The data is telemetered in real time to the central laboratory on the neighboring island of Santiago, and analyzed on a daily basis. The continuous data acquisition is complemented by periodic GPS, gravity and leveling surveys (Lima et al., this conference). In this paper we present the methodology adopted to monitor the level of volcanic activity of Fogo Volcano, and show examples of the data being collected. Anomalous data recorded at the end of September 2000, which led to the only occurrence of an alert warning so far, are also presented and discussed.

Faria, B. V.; Heleno, S. I.; Barros, I. J.; d'Oreye, N.; Bandomo, Z.; Fonseca, J. F.



A portfolio approach to evaluating natural hazard mitigation policies: An Application to lateral-spread ground failure in Coastal California  

USGS Publications Warehouse

In the past, efforts to prevent catastrophic losses from natural hazards have largely been undertaken by individual property owners based on site-specific evaluations of risks to particular buildings. Public efforts to assess community vulnerability and encourage mitigation have focused on either aggregating site-specific estimates or adopting standards based upon broad assumptions about regional risks. This paper develops an alternative, intermediate-scale approach to regional risk assessment and the evaluation of community mitigation policies. Properties are grouped into types with similar land uses and levels of hazard, and hypothetical community mitigation strategies for protecting these properties are modeled like investment portfolios. The portfolios consist of investments in mitigation against the risk to a community posed by a specific natural hazard, and are defined by a community's mitigation budget and the proportion of the budget invested in locations of each type. The usefulness of this approach is demonstrated through an integrated assessment of earthquake-induced lateral-spread ground failure risk in the Watsonville, California area. Data from the magnitude 6.9 Loma Prieta earthquake of 1989 are used to model lateral-spread ground failure susceptibility. Earth science and economic data are combined and analyzed in a Geographic Information System (GIS). The portfolio model is then used to evaluate the benefits of mitigating the risk in different locations. Two mitigation policies, one that prioritizes mitigation by land use type and the other by hazard zone, are compared with a status quo policy of doing no further mitigation beyond that which already exists. The portfolio representing the hazard zone rule yields a higher expected return than the land use portfolio does; however, the hazard zone portfolio experiences a higher standard deviation. Therefore, neither portfolio is clearly preferred. 
The two mitigation policies both reduce expected losses and increase overall expected community wealth compared to the status quo policy.
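The portfolio comparison described above rests on two statistics per policy: the expected return and its standard deviation over possible hazard outcomes. A minimal sketch of that calculation, with entirely invented probabilities and returns (not the Watsonville figures), might look like this:

```python
import numpy as np

# Two hazard states: no lateral-spread event, or an event (probabilities invented)
probs = np.array([0.9, 0.1])

# Net community return per policy in each state (units and values invented):
# mitigation costs something every year but pays off when the event occurs.
returns = {
    "status_quo":  np.array([0.0, -10.0]),
    "land_use":    np.array([-1.0,  6.0]),
    "hazard_zone": np.array([-1.0,  9.0]),
}

stats = {}
for name, r in returns.items():
    mean = float(probs @ r)                          # expected return
    std = float(np.sqrt(probs @ (r - mean) ** 2))    # standard deviation
    stats[name] = (mean, std)
    print(f"{name}: expected return {mean:.2f}, std {std:.2f}")
```

With these invented numbers the hazard-zone portfolio has the higher expected return but also the higher standard deviation, reproducing in miniature why neither policy dominates the other.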

Bernknopf, R.L.; Dinitz, L.B.; Rabinovici, S.J.M.; Evans, A.M.



Earthquake Scaling and Development of Ground Motion Prediction for Earthquake Hazard Mitigation in Taiwan  

NASA Astrophysics Data System (ADS)

For earthquake hazard mitigation and risk management, an integrated study spanning source-model development to ground motion prediction is crucial. Simulation of the high-frequency components (> 1 Hz) of strong ground motion in the near field has been poorly resolved owing to insufficient resolution of the velocity structure. Using small events as Green's functions (the empirical Green's function, EGF, method) replaces the path-effect evaluation and thus circumvents the lack of a precise velocity structure. Where no suitable EGF is available, a stochastic Green's function (SGF) method can be employed. By characterizing the slip models derived from waveform inversion, we directly extract the parameters needed for ground motion prediction with the EGF or SGF method. The slip models were derived from Taiwan's dense strong-motion network and global teleseismic data. In addition, the low-frequency components (< 1 Hz) can be obtained numerically by the frequency-wavenumber (FK) method. Thus, broadband strong ground motion can be calculated by a hybrid method that combines a deterministic FK simulation for the low frequencies with the EGF or SGF method for the high frequencies. Definitive source parameters characterized from empirical scaling studies feed directly into the ground motion simulation. To predict ground motion for a scenario earthquake, we compiled earthquake scaling relationships from the inverted finite-fault models of moderate to large earthquakes in Taiwan. These studies show that the seismogenic depth significantly controls the development of rupture width. In addition, several earthquakes on blind faults show distinctly large stress drops, which yield regionally high PGA. 
Using the developed scaling relationships and the possible high stress drops of blind-fault earthquakes, we apply the hybrid method described above to simulate strong motion in displacement, velocity and acceleration. We apply this exercise to the high-stress-drop event, and to events that might pose a seismic hazard to a specific site, to provide further estimates for seismic hazard evaluation.
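The core of a hybrid broadband method is the merger of a deterministic low-frequency trace with a stochastic high-frequency trace using complementary filters around a crossover frequency (here 1 Hz). The sketch below uses synthetic stand-in traces and an assumed filter shape, not the actual FK or SGF synthetics of the study:

```python
import numpy as np

dt, n = 0.01, 4096                       # 100 Hz sampling, ~41 s trace
freqs = np.fft.rfftfreq(n, dt)

rng = np.random.default_rng(0)
low_trace = np.sin(2 * np.pi * 0.3 * np.arange(n) * dt)  # stands in for the FK synthetic
high_trace = rng.standard_normal(n)                       # stands in for the SGF synthetic

fc = 1.0  # crossover frequency in Hz
# Complementary weights in the frequency domain: w_low + w_high = 1 at every bin
w_low = 1.0 / (1.0 + (freqs / fc) ** 4)
w_high = 1.0 - w_low

# Merge the spectra and return to the time domain
broadband = np.fft.irfft(w_low * np.fft.rfft(low_trace) +
                         w_high * np.fft.rfft(high_trace), n)
```

Below ~1 Hz the result follows the deterministic trace, above it the stochastic one; the fourth-power filter roll-off is an arbitrary choice for the sketch.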

Ma, K.; Yen, Y.



The asteroid and comet impact hazard: risk assessment and mitigation options  

NASA Astrophysics Data System (ADS)

The impact of extraterrestrial matter onto Earth is a continuous process. On average, some 50,000 tons of dust are delivered to our planet every year. While objects smaller than about 30 m mainly disintegrate in the Earth’s atmosphere, larger ones can penetrate through it and cause damage on the ground. When an object hundreds of meters in diameter impacts an ocean, a tsunami is created that can devastate coastal cities. Further, if a km-sized object were to hit the Earth, it would cause a global catastrophe, as the transport of enormous amounts of dust and vapour into the atmosphere would change the Earth’s climate. This article gives an overview of the near-Earth asteroid and comet (near-Earth object, NEO) impact hazard and the NEO search programmes that are gathering important data on these objects. It also points out options for impact hazard mitigation by using deflection systems. It further discusses the critical constraints for NEO deflection strategies and systems, as well as mitigation and evacuation costs and benefits. Recommendations are given for future activities to solve the NEO impact hazard problem.

Gritzner, Christian; Dürfeld, Kai; Kasper, Jan; Fasoulas, Stefanos



Seismic hazard studies in Egypt  

NASA Astrophysics Data System (ADS)

The study of earthquake activity and seismic hazard assessment in Egypt is very important because of the rapid growth of large investments in national projects, especially the nuclear power plant planned for the northern part of Egypt. Although Egypt is characterized by low seismicity, it has experienced damaging earthquakes throughout its history. The seismotectonic setting of Egypt suggests that large earthquakes are possible, particularly along the Gulf of Aqaba-Dead Sea transform, the subduction zone along the Hellenic and Cyprean Arcs, and the northern Red Sea triple-junction point. In addition, some significant inland sources at Aswan, Dahshour, and the Cairo-Suez district should be considered. The seismic hazard for Egypt is calculated using a probabilistic approach (for a grid of 0.5° × 0.5°) within a logic-tree framework. Alternative seismogenic models and ground motion scaling relationships are selected to account for epistemic uncertainty. Seismic hazard values on rock were calculated to create contour maps for four ground motion spectral periods and for different return periods. In addition, uniform hazard spectra for rock sites are given for 25 different periods, and probabilistic hazard curves are graphed for the cities of Cairo and Alexandria. The highest peak ground acceleration (PGA) values were found close to the Gulf of Aqaba, about 220 gal for a 475-year return period, while the lowest values, less than 25 gal, were detected in the western part of the Western Desert.
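A probabilistic hazard curve of the kind graphed for Cairo and Alexandria relates ground motion levels to annual exceedance rates. A schematic Cornell-type calculation for a single source can be sketched as follows; the activity rate, Gutenberg-Richter parameters, distance range, and ground motion relation are all invented for the sketch, not values from this study:

```python
import numpy as np

rng = np.random.default_rng(42)
nu = 0.05                        # annual rate of M >= 5 events (assumed)
b, m_min, m_max = 1.0, 5.0, 7.5  # truncated Gutenberg-Richter parameters (assumed)

# Sample magnitudes from the truncated G-R distribution by inverse transform
u = rng.random(200_000)
beta = b * np.log(10)
m = m_min - np.log(1 - u * (1 - np.exp(-beta * (m_max - m_min)))) / beta

r = rng.uniform(10.0, 100.0, m.size)   # source-to-site distances in km (assumed)
# Toy ground motion relation: ln PGA (g) with lognormal scatter (invented coefficients)
ln_pga = -3.5 + 1.0 * m - 1.2 * np.log(r) + rng.normal(0, 0.6, m.size)

accels = np.array([0.05, 0.1, 0.2, 0.4])  # PGA thresholds in g
rates = np.array([nu * np.mean(ln_pga > np.log(a)) for a in accels])
# Probability of exceedance in 50 years under the Poisson assumption
p50 = 1 - np.exp(-rates * 50)
```

The resulting (acceleration, annual rate) pairs trace out one branch of the hazard curve; a logic tree repeats this calculation over alternative source models and scaling relationships and weights the branches.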

Mohamed, Abuo El-Ela A.; El-Hadidy, M.; Deif, A.; Abou Elenean, K.



Fluor Daniel Hanford implementation plan for DOE Order 5480.28, Natural phenomena hazards mitigation  

SciTech Connect

Natural phenomena hazards (NPH) are unexpected acts of nature that pose a threat or danger to workers, the public, or the environment. Earthquakes, extreme winds (hurricane and tornado), snow, flooding, volcanic ashfall, and lightning strikes are examples of NPH that could occur at the Hanford Site. U.S. Department of Energy (DOE) policy requires facilities to be designed, constructed, and operated in a manner that protects workers, the public, and the environment from hazards caused by natural phenomena. DOE Order 5480.28, Natural Phenomena Hazards Mitigation, includes rigorous new natural phenomena criteria for the design of new DOE facilities, as well as for the evaluation and, if necessary, upgrade of existing DOE facilities. The Order was transmitted to Westinghouse Hanford Company in 1993 for compliance and is also identified in the Project Hanford Management Contract, Section J, Appendix C. Criteria and requirements of DOE Order 5480.28 are included in five standards, the last of which, DOE-STD-1023, was released in fiscal year 1996. Because the Order was released before all of its required standards were released, enforcement of the Order was waived pending release of the last standard and determination of an in-force date by the DOE Richland Operations Office (DOE-RL). Agreement also was reached between the Management and Operations Contractor and DOE-RL that the Order would become enforceable for new structures, systems, and components (SSCs) 60 days following issuance of new Order-based design criteria in HNF-PRO-97, Engineering Design and Evaluation. The Order also requires that commitments addressing existing SSCs be included in an implementation plan to be issued 1 year following the release of the last standard. Subsequently, WHC-SP-1175, Westinghouse Hanford Company Implementation Plan for DOE Order 5480.28, Natural Phenomena Hazards Mitigation, Rev. 0, was issued in November 1996, and this document, HNF-SP-1175, Fluor Daniel Hanford Implementation Plan for DOE Order 5480.28, Natural Phenomena Hazards Mitigation, is Rev. 1 of that plan.

Conrads, T.J.



Impact hazard mitigation: understanding the effects of nuclear explosive outputs on comets and asteroids  

SciTech Connect

The NASA 2007 white paper ''Near-Earth Object Survey and Deflection Analysis of Alternatives'' affirms deflection as the safest and most effective means of potentially hazardous object (PHO) impact prevention. It also calls for further studies of object deflection. In principle, deflection of a PHO may be accomplished using kinetic impactors, chemical explosives, gravity tractors, solar sails, or nuclear munitions. Of the sudden-impulse options, nuclear munitions are by far the most efficient in terms of yield per unit mass launched and are technically mature. However, there are still significant questions about the response of a comet or asteroid to a nuclear burst. Recent and ongoing observational and experimental work is revolutionizing our understanding of the physical and chemical properties of these bodies (e.g., Ryan (2000), Fujiwara et al. (2006), and Jedicke et al. (2006)). This improved understanding of small solar-system bodies, combined with current state-of-the-art modeling and simulation capabilities, which have also improved dramatically in recent years, allows for a science-based, comprehensive study of PHO mitigation techniques. Here we present an examination of the effects of radiation from a nuclear explosion on potentially hazardous asteroids and comets through Monte Carlo N-Particle code (MCNP) simulation techniques. MCNP is a general-purpose particle transport code commonly used to model neutron, photon, and electron transport for medical physics, reactor design and safety, accelerator target and detector design, and a variety of other applications, including modeling the propagation of epithermal neutrons through the Martian regolith (Prettyman 2002). It is a massively parallel code that can conduct simulations in one to three dimensions, with complicated geometries, and with extremely powerful variance reduction techniques. 
It uses current nuclear cross-section data where available and fills in the gaps with analytical models where data are not available. MCNP has undergone extensive verification and validation and is considered the gold standard for particle transport (Forrest B. Brown et al., ''MCNP Version 5,'' Trans. Am. Nucl. Soc., 87, 273, November 2002). Additionally, a new simulation capability using MCNP has become available to this collaboration. The first results of this new capability will also be presented.
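The basic idea behind Monte Carlo particle transport of the kind MCNP performs can be illustrated far more simply. The toy sketch below (this is not MCNP, and the attenuation coefficient and slab thickness are invented) samples exponential free-flight distances for photons crossing a uniform slab and compares the transmitted fraction with the analytic answer:

```python
import numpy as np

rng = np.random.default_rng(1)
mu = 0.2          # total attenuation coefficient, 1/cm (assumed)
thickness = 5.0   # slab thickness, cm (assumed)
n = 1_000_000     # number of simulated photon histories

# Distance to first interaction follows an exponential with mean 1/mu
path = rng.exponential(1.0 / mu, n)
transmitted = float(np.mean(path > thickness))

print(transmitted, np.exp(-mu * thickness))  # MC estimate vs. analytic e^(-mu*t)
```

Real transport codes add scattering physics, 3-D geometry, cross-section libraries, and variance reduction on top of this same sampling core.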

Clement, Ralph R C [Los Alamos National Laboratory; Plesko, Catherine S [Los Alamos National Laboratory; Bradley, Paul A [Los Alamos National Laboratory; Conlon, Leann M [Los Alamos National Laboratory



Rio Soliette (haiti): AN International Initiative for Flood-Hazard Assessment and Mitigation  

NASA Astrophysics Data System (ADS)

Catastrophic natural events are among the most critical threats to health and economies all around the world; however, their impact in a poor region can be far more dramatic than in other countries. Isla Hispaniola (Haiti and the Dominican Republic), one of the poorest regions of the planet, has repeatedly been hit by catastrophic natural disasters that caused incalculable human and economic losses. After the catastrophic flood event that occurred in the basin of River Soliette on May 24th, 2004, the General Direction for Development and Cooperation of the Italian Department of Foreign Affairs funded an international cooperation initiative (ICI), coordinated by the University of Bologna, that involved Haitian and Dominican institutions. The main purpose of the ICI was hydrological and hydraulic analysis of the May 2004 flood event, aimed at formulating a suitable and affordable flood risk mitigation plan consisting of structural and non-structural measures. In this context, a topographic survey was necessary to build the hydrological model and to improve knowledge of some areas that were candidate sites for mitigation measures. To overcome the difficulties arising from the scarcity of funds, surveyors and time available for the survey, only GPS techniques were used, both for framing aspects (using the PPP approach) and for the geometrical survey of the river by means of river cross-sections and detailed surveys in two areas (RTK technique). This allowed us to reconstruct both the river geometry and the DTMs of two expansion areas (useful for designing hydraulic solutions to mitigate flood hazard).

Gandolfi, S.; Castellarin, A.; Barbarella, M.; Brath, A.; Domeneghetti, A.; Brandimarte, L.; Di Baldassarre, G.



Seismic Hazard and Risk Assessment in Multi-Hazard Prone Urban Areas: The Case Study of Cologne, Germany  

NASA Astrophysics Data System (ADS)

Most hazard and risk assessment studies usually analyze and represent different kinds of hazards and risks separately, although risk assessment and mitigation programs in multi-hazard prone urban areas should take into consideration possible interactions of different hazards. This is particularly true for communities located in seismically active zones, where, on the one hand, earthquakes are capable of triggering other types of hazards, while, on the other hand, one should bear in mind that temporal coincidence or succession of different hazardous events may influence the vulnerability of the existing built environment and, correspondingly, the level of the total risk. Therefore, possible inter-dependencies and inter-influences of different hazards should be reflected properly in the hazard, vulnerability and risk analyses. This work presents some methodological aspects and preliminary results of a study being implemented within the framework of the MATRIX (New Multi-Hazard and Multi-Risk Assessment Methods for Europe) project. One of the test cases of the MATRIX project is the city of Cologne, which is one of the largest cities of Germany. The area of Cologne, being exposed to windstorm, flood and earthquake hazards, has already been considered in comparative risk assessments. However, possible interactions of these different hazards have been neglected. The present study is aimed at the further development of a holistic multi-risk assessment methodology, taking into consideration possible time coincidence and inter-influences of flooding and earthquakes in the area.

Tyagunov, S.; Fleming, K.; Parolai, S.; Pittore, M.; Vorogushyn, S.; Wieland, M.; Zschau, J.



Bringing New Tools and Techniques to Bear on Earthquake Hazard Analysis and Mitigation  

NASA Astrophysics Data System (ADS)

During July 2013, IRIS held an Advanced Studies Institute in Santo Domingo, Dominican Republic, designed to enable early-career scientists who have already mastered the fundamentals of seismology to begin collaborating in frontier seismological research. The Institute was conceived at a strategic planning workshop in Heredia, Costa Rica, that was supported and partially funded by USAID, with the goal of building geophysical capacity to mitigate the effects of future earthquakes. To address this broad goal, we drew participants from a dozen different countries of Middle America. Our objectives were to develop understanding of the principles of earthquake hazard analysis, particularly site characterization techniques, and to facilitate future research collaborations. The Institute was divided into three main sections: overviews of the fundamentals of earthquake hazard analysis and lectures on the theory behind methods of site characterization; fieldwork in which participants acquired new data of the types typically used in site characterization; and computer-based analysis projects in which participants applied their newly learned techniques to the data they collected. This was the first IRIS institute to combine an instructional short course with fieldwork for data acquisition. Participants broke into small teams to acquire data, analyze it on their own computers, and then make presentations to the assembled group describing their techniques and results. Using broadband three-component seismometers, the teams acquired data for Spatial Auto-Correlation (SPAC) analysis at seven array locations, and for Horizontal to Vertical Spectral Ratio (HVSR) analysis at 60 individual sites along six profiles throughout Santo Domingo. 
Using a 24-channel geophone string, the teams acquired data for Refraction Microtremor (SeisOptReMi™ from Optim) analysis at 11 sites, with supplementary data for active-source Multi-channel Spectral Analysis of Surface Waves (MASW) analysis at five of them. The results showed that teams quickly learned to collect high-quality data for each method of analysis. SPAC and refraction microtremor analysis each demonstrated that dispersion relations based on ambient noise, from arrays with an aperture of less than 200 meters, could be used to determine the depth of a weak, disaggregated layer known to underlie the fast near-surface limestone terraces on which Santo Domingo is situated, and indicated the presence of unexpectedly strong rocks below. All three array methods concurred that most Santo Domingo sites have relatively high VS30 (average shear velocity to a depth of 30 m), generally at the B-C NEHRP hazard class boundary or higher. HVSR analysis revealed a general pattern of short resonance periods close to the coast, increasing with distance from the shoreline. In the east-west direction, significant variations were also evident at the highest-elevation terrace and near the Ozama River. In terms of the sub-soil conditions, the observed pattern of HVSR values departs from the expected increase of sediment thickness close to the coast.
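The HVSR technique used at the 60 sites amounts to taking the ratio of smoothed horizontal to vertical amplitude spectra of ambient noise and reading off the peak frequency. A minimal sketch on synthetic records (the data, smoothing width, and 2 Hz resonance are all invented for illustration) might look like this:

```python
import numpy as np

def hvsr(north, east, vertical, dt, smooth=11):
    """Smoothed horizontal-to-vertical spectral ratio (toy implementation)."""
    freqs = np.fft.rfftfreq(len(vertical), dt)
    amp = lambda x: np.abs(np.fft.rfft(x * np.hanning(len(x))))
    h = np.sqrt(amp(north) ** 2 + amp(east) ** 2)   # combined horizontal spectrum
    v = amp(vertical)
    k = np.ones(smooth) / smooth                    # simple moving-average smoothing
    return freqs, np.convolve(h, k, "same") / np.maximum(np.convolve(v, k, "same"), 1e-12)

# Synthetic test: the horizontal components carry an extra 2 Hz resonance
rng = np.random.default_rng(0)
dt, n = 0.01, 8192
t = np.arange(n) * dt
res = np.sin(2 * np.pi * 2.0 * t)
north = rng.standard_normal(n) + 5 * res
east = rng.standard_normal(n) + 5 * res
vert = rng.standard_normal(n)

freqs, ratio = hvsr(north, east, vert, dt)
print(freqs[np.argmax(ratio)])  # peak frequency, near the 2 Hz resonance
```

Field practice adds windowing into many segments, averaging, and stability checks, but the peak-of-the-ratio logic is the same.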

Willemann, R. J.; Pulliam, J.; Polanco, E.; Louie, J. N.; Huerta-Lopez, C.; Schmitz, M.; Moschetti, M. P.; Huerfano Moreno, V.; Pasyanos, M.



Coupling Radar Rainfall Estimation and Hydrological Modelling for Flash-Flood Hazard Mitigation  

NASA Astrophysics Data System (ADS)

Flood risk mitigation is accomplished through managing either or both the hazard and vulnerability. Flood hazard may be reduced through structural measures which alter the frequency of flood levels in the area. The vulnerability of a community to flood loss can be mitigated through changing or regulating land use and through flood warning and effective emergency response. When dealing with flash-flood hazard, it is generally accepted that the most effective way (and in many instances the only affordable in a sustainable perspective) to mitigate the risk is by reducing the vulnerability of the involved communities, in particular by implementing flood warning systems and community self-help programs. However, both the inherent characteristics of the atmospheric and hydrologic processes involved in flash-flooding and the changing societal needs provide a tremendous challenge to traditional flood forecasting and warning concepts. In fact, the targets of these systems are traditionally localised, like urbanised sectors or hydraulic structures. Given the small spatial scale that characterises flash floods and the development of dispersed urbanisation, transportation, green tourism and water sports, human lives and property are exposed to flash flood risk in a scattered manner. This must be taken into consideration in flash flood warning strategies, and the investigated region should be considered as a whole and every section of the drainage network as a potential target for hydrological warnings. Radar technology offers the potential to provide information describing rain intensities almost continuously in time and space. Recent research results indicate that coupling radar information to distributed hydrologic modelling can provide hydrologic forecasts at all potentially flooded points of a region. Nevertheless, very few flood warning services use radar data more than on a qualitative basis. 
After a short review of current understanding in this area, two issues are examined: the advantages and caveats of using radar rainfall estimates in operational flash flood forecasting, and the methodological problems associated with the use of hydrological models for distributed flash flood forecasting with rainfall input estimated from radar.
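The quantitative step the abstract alludes to, turning radar reflectivity into rain-rate input for a distributed model, typically relies on a Z-R power law. A minimal sketch, using the classic Marshall-Palmer coefficients (a = 200, b = 1.6) as an assumption; operational services retune these by climate, season and event type:

```python
def rain_rate_from_dbz(dbz, a=200.0, b=1.6):
    """Convert radar reflectivity (dBZ) to rain rate (mm/h) via Z = a * R**b.

    Default a and b are the classic Marshall-Palmer values; they are an
    illustrative assumption, not calibrated for any particular radar.
    """
    z_linear = 10.0 ** (dbz / 10.0)      # reflectivity factor, mm^6 m^-3
    return (z_linear / a) ** (1.0 / b)   # invert the power law for R
```

For example, 40 dBZ maps to roughly 11.5 mm/h under these coefficients; the sensitivity of such numbers to a and b is one reason many warning services still use radar only qualitatively.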

Borga, M.; Creutin, J. D.


Probing Aircraft Flight Test Hazard Mitigation for the Alternative Fuel Effects on Contrails & Cruise Emissions (ACCESS) Research Team  

NASA Technical Reports Server (NTRS)

The Alternative Fuel Effects on Contrails & Cruise Emissions (ACCESS) Project Integration Manager requested in July 2012 that the NASA Engineering and Safety Center (NESC) form a team to independently assess aircraft structural failure hazards associated with the ACCESS experiment and to identify potential flight test hazard mitigations to ensure flight safety. The ACCESS Project Integration Manager subsequently requested that the assessment scope be focused predominantly on structural failure risks to the aircraft empennage.

Kelly, Michael J.



Solutions Network Formulation Report. NASA's Potential Contributions using ASTER Data in Marine Hazard Mitigation  

NASA Technical Reports Server (NTRS)

The 28-foot storm surge from Hurricane Katrina pushed inland along bays and rivers for a distance of 12 miles in some areas, contributing to the damage or destruction of about half of the fleet of boats in coastal Mississippi. Most of those boats had sought refuge in back bays and along rivers. Some boats were spared damage because the owners chose their mooring site well. Gulf mariners need a spatial analysis tool that provides guidance on the safest places to anchor their boats during future hurricanes. This product would support NOAA's mission to minimize the effects of coastal hazards through awareness, education, and mitigation strategies and could be incorporated in the Coastal Risk Atlas decision support tool.

Fletcher, Rose



Bulgaria country study to address climate change mitigation  

SciTech Connect

The Bulgaria Country Study to address climate change is a research project that incorporates three major elements: inventory of greenhouse gases (GHG), assessment of vulnerability and adaptation to climate change, and mitigation analysis. The mitigation analysis is the central part of the study. Based on the assumptions of the socioeconomic development of the country up to the year 2020, it allows identification of policies and measures that may lead to the stabilization of GHG emissions, as required by the United Nations Framework Convention on Climate Change (FCCC). Bulgaria signed the FCCC in 1992, ratified it in 1995, and is now undertaking measures related to its implementation. At present Bulgaria is in the process of transition from a centrally planned economy to a market-driven economy. This process is characterized by basic economic structural changes, freeing of prices (with only energy prices still controlled by the government), a drastic 27% drop in GDP in 1992 compared to 1988, and a 44% reduction in energy consumption over the same period. The analysis of the mitigation measures and their impact on the future development of the economy and of the energy sector is a very complicated task. This paper will focus on the choice of methodology, some basic assumptions, and results of the study to date. 9 refs., 13 figs.

Simeonova, K. [ENERGOPROEKT, Sofia (Bulgaria)




E-print Network

AND HAZARD MITIGATION PROJECTS. Dave Gauthier, Michael Conlan and Bruce Jamieson, Dept. of Geological, Calgary, AB, Canada. ABSTRACT: Recent advances in both digital photography and computer processing power have led to the ability

Jamieson, Bruce


The Identification of Filters and Interdependencies for Effective Resource Allocation: Coupling the Mitigation of Natural Hazards to Economic Development.  

NASA Astrophysics Data System (ADS)

Policy formulation for the mitigation and management of risks posed by natural hazards requires that governments confront difficult decisions for resource allocation and be able to justify their spending. Governments also need to recognize when spending offers little improvement and the circumstances in which relatively small amounts of spending can make substantial differences. Because natural hazards can have detrimental impacts on local and regional economies, patterns of economic development can also be affected by spending decisions for disaster mitigation. This paper argues that by mapping interdependencies among physical, social and economic factors, governments can improve resource allocation to mitigate the risks of natural hazards while improving economic development on local and regional scales. Case studies of natural hazards in Turkey have been used to explore specific "filters" that act to modify short- and long-term outcomes. Pre-event filters can prevent an event from becoming a natural disaster or change a routine event into a disaster. Post-event filters affect both short and long-term recovery and development. Some filters cannot be easily modified by spending (e.g., rural-urban migration) but others (e.g., land-use practices) provide realistic spending targets. Net social benefits derived from spending, however, will also depend on the ways by which filters are linked, or so-called "interdependencies". A single weak link in an interdependent system, such as a power grid, can trigger a cascade of failures. Similarly, weak links in social and commercial networks can send waves of disruption through communities. Conversely, by understanding the positive impacts of interdependencies, spending can be targeted to maximize net social benefits while mitigating risks and improving economic development. 
Detailed information on public spending was not available for this study, but case studies illustrate how networks of interdependent filters can modify social benefits and costs. For example, spending after the 1992 Erzincan earthquake targeted local businesses, but limited alternative employment, labor losses and diminished local markets all contributed to economic stagnation. Spending after the 1995 Dinar earthquake provided rent subsidies, supporting a major exodus from the town. Consequently, many local people were excluded from reconstruction decisions and from the benefits offered by reconstruction funds. After the 1999 Marmara earthquakes, a 3-year economic decline in Yalova illustrated the vulnerability of local economic stability to weak regulation enforcement by a few agents. A resource allocation framework indicates that government-community relations, lack of economic diversification, beliefs, and compensation are weak links for effective spending. Stronger positive benefits could be achieved through spending that targets land-use regulation enforcement, labor losses, time-critical needs of small businesses, and infrastructure. While the impacts of the Marmara earthquakes were devastating, strong commercial networks and international interests helped to re-establish the regional economy. Interdependencies may have helped to drive a recovery. Smaller events in eastern Turkey, however, can wipe out entire communities and can have long-lasting impacts on economic development. These differences may accelerate rural to urban migration and perpetuate regional economic divergence in the country. 1: Research performed in the Wharton MBA Program, Univ. of Pennsylvania.
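The "single weak link" argument can be made concrete with a toy failure-cascade model. The network below is hypothetical (the node names and edges are illustrative, not drawn from the Turkish case studies): each filter or service fails as soon as anything it depends on fails, so one weak link can take down everything downstream of it.

```python
from collections import deque

def cascade(dependents, initial_failures):
    """Propagate failures through a dependency network (illustrative toy model).

    dependents maps each node to the nodes that rely on it; a node fails
    as soon as anything it relies on fails.
    """
    failed = set(initial_failures)
    queue = deque(initial_failures)
    while queue:
        node = queue.popleft()
        for downstream in dependents.get(node, ()):
            if downstream not in failed:
                failed.add(downstream)
                queue.append(downstream)
    return failed

# Hypothetical network: power feeds water pumps and commerce; water feeds health.
net = {"power": ["water", "commerce"], "water": ["health"], "roads": ["commerce"]}
```

Mapping real interdependencies onto such a graph is what lets spending be aimed at the links whose failure propagates furthest.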

Agar, S. M.; Kunreuther, H.



Web-Based Geospatial Tools to Address Hazard Mitigation, Natural Resource Management, and Other Societal Issues  

USGS Publications Warehouse

Federal, State, and local government agencies in the United States face a broad range of issues on a daily basis. Among these are natural hazard mitigation, homeland security, emergency response, economic and community development, water supply, and health and safety services. The U.S. Geological Survey (USGS) helps decision makers address these issues by providing natural hazard assessments, information on energy, mineral, water and biological resources, maps, and other geospatial information. Increasingly, decision makers at all levels are challenged not by the lack of information, but by the absence of effective tools to synthesize the large volume of data available, and to utilize the data to frame policy options in a straightforward and understandable manner. While geographic information system (GIS) technology has been widely applied to this end, systems with the necessary analytical power have been usable only by trained operators. The USGS is addressing the need for more accessible, manageable data tools by developing a suite of Web-based geospatial applications that will incorporate USGS and cooperating partner data into the decision making process for a variety of critical issues. Examples of Web-based geospatial tools being used to address societal issues follow.

Hearn, Paul P.



The Effective Organization and Use of Data in Bridging the Hazard Mitigation-Climate Change Adaptation Divide (Invited)  

NASA Astrophysics Data System (ADS)

The costs associated with managing natural hazards and disasters continue to rise in the US and elsewhere. Many climate change impacts are manifested in stronger or more frequent natural hazards such as floods, wildfire, hurricanes and typhoons, droughts, and heat waves. Despite this common problem, the climate change adaptation and hazards management communities have largely failed to acknowledge each other’s work in reducing hazard impacts. This is even reflected in the language that each community uses; for example, the hazards management community refers to hazard risk reduction as mitigation while the climate change community refers to it as adaptation. In order to help bridge this divide, we suggest each community utilize data in a more formally organized and effective manner based on four principles: 1. The scale of the data must reflect the needs of the decision maker. In most cases, decision makers’ needs are most effectively met through the development of multiple alternatives that take into account a variety of possible impacts. 2. Investments intended to reduce vulnerability and increase resilience should be driven by the wise use of available data using a “risk-based” strategy. 3. Climate change adaptation and hazard mitigation strategies must be integrated with other value drivers when building resiliency. Development and use of data that underscore the concept of “no regrets” risk reduction can be used to accomplish this aim. 4. The use of common data is critical in building a bridge between the climate change adaptation and hazards management communities. We will explore how the creation of data repositories that collect, analyze, display and archive hazards and disaster data can help address the challenges posed by the current hazards management and climate change adaptation divide.

Smith, G. P.; Fox, J.; Shuford, S.



Natural Hazards and Effects on Local Populations: Applications of NSF MARGINS research to hazards mitigation in Central America  

E-print Network

The combined focus of two MARGINS research initiatives in Central America (Subduction Factory and SEIZE) hazards: A significant contribution of MARGINS research in Central America will be the development

Marshall, Jeffrey S.


A review of accidents, prevention and mitigation options related to hazardous gases  

SciTech Connect

Statistics on industrial accidents are incomplete due to lack of specific criteria on what constitutes a release or accident. In this country, most major industrial accidents were related to explosions and fires of flammable materials, not to releases of chemicals into the environment. An EPA study of 6,928 accidental releases of toxic chemicals revealed that accidents at stationary facilities accounted for 75% of the total number of releases, and transportation accidents for the other 25%. About 7% of all reported accidents (468 cases) resulted in 138 deaths and 4,717 injuries ranging from temporary respiratory problems to critical injuries. In-plant accidents accounted for 65% of the casualties. The most efficient strategy to reduce hazards is to choose technologies which do not require the use of large quantities of hazardous gases. For new technologies this approach can be implemented early in development, before large financial resources and efforts are committed to specific options. Once specific materials and options have been selected, strategies to prevent accident-initiating events need to be evaluated and implemented. The next step is to implement safety options which suppress a hazard when an accident-initiating event occurs. Releases can be prevented or reduced with fail-safe equipment and valves, adequate warning systems, and controls to reduce and interrupt gas leakage. If an accident occurs and safety systems fail to contain a hazardous gas release, then engineering control systems will be relied on to reduce or minimize environmental releases. As a final defensive barrier, prevention of human exposure, and failing that, prevention of consequences, is needed if a hazardous gas is released in spite of the previous strategies: medical facilities close by that can accommodate victims of the worst accident can reduce the consequences of personnel exposure to hazardous gases.
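The quoted shares are internally consistent, as a quick worked check of the abstract's figures shows (the in-plant casualty count is derived from the stated 65%, not independently reported):

```python
# Worked check of the casualty figures quoted in the abstract.
total_releases = 6928          # accidental releases of toxic chemicals in the EPA study
injurious_accidents = 468      # accidents that caused deaths or injuries
deaths, injuries = 138, 4717

share_with_casualties = injurious_accidents / total_releases   # about 0.07, i.e. "about 7%"
in_plant_casualties = 0.65 * (deaths + injuries)               # implied by "65% of the casualties"
```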

Fthenakis, V.M.



Mitigation of EMU Glove Cut Hazard by MMOD Impact Craters on Exposed ISS Handrails  

NASA Technical Reports Server (NTRS)

Recent cut damage to crewmember extravehicular mobility unit (EMU) gloves during extravehicular activity (EVA) onboard the International Space Station (ISS) has been found to result from contact with sharp edges or pinch points rather than general wear or abrasion. One possible source of cut hazards is protruding sharp-edged crater lips from impacts of micrometeoroid and orbital debris (MMOD) particles on external metallic handrails along EVA translation paths. During hypervelocity impact of MMOD particles, an excavation flow develops behind the shock wave, resulting in the formation of crater lips that can protrude above the target surface. In this study, two methods were evaluated to limit EMU glove cut hazards due to MMOD impact craters. In the first phase, four flexible overwrap configurations were evaluated: a felt-reusable surface insulation (FRSI), polyurethane polyether foam with beta-cloth cover, double-layer polyurethane polyether foam with beta-cloth cover, and multi-layer beta-cloth with intermediate Dacron netting spacers. These overwraps are suitable for retrofitting ground equipment that has yet to be flown, and are not intended to protect the handrail from impact of MMOD particles, but rather to act as a spacer between hazardous impact profiles and crewmember gloves. At the impact conditions considered, all four overwrap configurations evaluated were effective in limiting contact between EMU gloves and impact crater profiles. The multi-layer beta-cloth configuration was the most effective in reducing the height of potentially hazardous profiles in handrail-representative targets. In the second phase of the study, four material alternatives to current aluminum and stainless steel alloys were evaluated: a metal matrix composite, carbon fiber reinforced plastic (CFRP), fiberglass, and a fiber metal laminate. 
Alternative material handrails are intended to prevent the formation of hazardous damage profiles during MMOD impact and are suitable for flight hardware yet to be constructed. Of the four materials evaluated, only the fiberglass formed a less hazardous damage profile than the baseline metallic target. Although the CFRP laminate did not form any noticeable crater lip, brittle protruding fibers are considered a puncture risk. In parallel with EMU glove redesign efforts, modifications to metallic ISS handrails such as those evaluated in this study provide the means to significantly reduce cut-hazards from MMOD impact craters.

Christiansen, Eric L.; Ryan, Shannon



Methodology for mitigation of seismic hazards in existing unreinforced masonry buildings: Diaphragm testing  

NASA Astrophysics Data System (ADS)

An experimental program conducted on horizontal diaphragms subjected to quasi-static, cyclic, in-plane displacement and dynamic, in-plane earthquake shaking is described. The experimental program is one of several tasks in an overall research program, sponsored by the National Science Foundation, whose objective is to develop a methodology for mitigation of seismic hazards in existing unreinforced masonry buildings. Full-scale component tests on horizontal diaphragms subjected to quasi-static, cyclic, in-plane displacement and dynamic, in-plane earthquake shaking were designed and conducted on 14 diaphragm specimens subjected to 139 test sequences that consisted of intermingled static and dynamic loadings. The quasi-static tests produced deformations in the diaphragms that ranged from elastic excursions to excursions that produced ductilities of 2, 3, 4, and 6; however, only one diaphragm was driven to the ductility of 6. The dynamic earthquake loadings covered the full range of seismicity in the United States, from an Effective Peak Acceleration of 0.1 g to 0.4 g.



Past, Present, and Future Challenges in Earthquake Hazard Mitigation of Indonesia: A Collaborative Work of Geological Agency Indonesia and Geoscience Australia  

NASA Astrophysics Data System (ADS)

In the last decade, Indonesia has suffered repeated earthquake disasters: four of the twelve large earthquakes worldwide with more than 1,000 casualties occurred in Indonesia. The great Sumatra earthquake of December 26, 2004, followed by a tsunami that cost 227,898 lives, brought Indonesia and its active tectonic setting to the world's attention. The government of Indonesia therefore encourages hazard mitigation efforts that focus on the pre-disaster phase. In response to government policy on earthquake disaster mitigation, the Geological Agency Indonesia aims to meet the need for rigorous earthquake hazard maps at provincial scale throughout the country by 2014. Collaborative work with Geoscience Australia through short-term training missions, ongoing training, mentoring, assistance and study in Australia, under the auspices of the Australia-Indonesia Facility for Disaster Reduction (AIFDR), has accelerated the production of these maps. Since the collaboration began in 2010, provincial earthquake hazard maps of Central Java (2010) and of West Sulawesi, Gorontalo, and North Maluku (2011) have been published using the probabilistic seismic hazard assessment (PSHA) method. In 2012, by the same method, maps for the remaining provinces of Sulawesi Island, Papua, North Sumatera and Jambi will be published. By the end of 2014, hazard maps for all 33 Indonesian provinces will be delivered. The future challenges are to work together with stakeholders, to produce district-scale maps, and to establish a national standard for earthquake hazard maps. Most important is building the capacity to update, maintain and revise the maps as new information becomes available.
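The PSHA method behind the provincial maps combines earthquake occurrence rates with a ground-motion model. The sketch below shows that machinery for a single point source; the Gutenberg-Richter parameters and the attenuation coefficients are placeholders chosen for illustration, not values from the Indonesian maps or any published ground-motion prediction equation.

```python
from math import erf, sqrt, log
import numpy as np

def _normal_sf(z):
    """Standard normal survival function P(Z > z)."""
    return 0.5 * (1.0 - erf(z / sqrt(2.0)))

def hazard_curve(pga_levels_g, r_km, a_gr=4.0, b_gr=1.0,
                 mmin=5.0, mmax=8.0, dm=0.1, sigma=0.6):
    """Toy PSHA hazard curve for one point source at distance r_km.

    lambda(PGA > a) = sum over magnitude bins of rate(m) * P(PGA > a | m, r).
    Magnitude-frequency follows Gutenberg-Richter (log10 N = a_gr - b_gr*M);
    the median ground motion uses the deliberately simple placeholder
    ln PGA = -3.5 + 1.0*M - 1.2*ln(r + 10), with lognormal scatter sigma.
    """
    mags = np.arange(mmin, mmax, dm)
    # annual rate of events falling in each magnitude bin [m, m + dm)
    bin_rates = 10.0 ** (a_gr - b_gr * mags) - 10.0 ** (a_gr - b_gr * (mags + dm))
    ln_median = -3.5 + 1.0 * mags - 1.2 * log(r_km + 10.0)
    curve = []
    for a in pga_levels_g:
        z = (log(a) - ln_median) / sigma               # standardized residuals
        p_exceed = np.array([_normal_sf(zi) for zi in z])
        curve.append(float(np.sum(bin_rates * p_exceed)))
    return curve
```

A real assessment sums such curves over many faults and area sources, uses published ground-motion models, and reads off the motion at a target exceedance probability (e.g., 10% in 50 years) to contour the map.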

Hidayati, S.; Cummins, P. R.; Cipta, A.; Omang, A.; Griffin, J.; Horspool, N.; Robiana, R.; Sulaeman, C.




NSDL National Science Digital Library

USGS provides detailed fact sheets, research reports and case studies of coastal erosion caused by El Nino events in California. Other natural hazards are also covered, including earthquakes, tsunami, and landslides, illustrated with maps, photos, animated simulations, and extensive links to resources by experts. Sort by topic to see all entries about coastal issues, environmental issues, mapping and data, natural hazards, and other subjects.


Looking Before We Leap: Recent Results From An Ongoing Quantitative Investigation Of Asteroid And Comet Impact Hazard Mitigation.  

NASA Astrophysics Data System (ADS)

The asteroid and comet impact hazard is now part of public consciousness, as demonstrated by movies, Super Bowl commercials, and popular news stories. However, there is a popular misconception that hazard mitigation is a solved problem. Many people think, "we'll just nuke it." There are, however, significant scientific questions remaining in the hazard mitigation problem. Before we can say with certainty that an explosive yield Y at height of burst h will produce a momentum change in or dispersion of a potentially hazardous object (PHO), we need to quantify how and where energy is deposited into the rubble pile or conglomerate that may make up the PHO. We then need to understand how shock waves propagate through the system, what causes them to disrupt, and how long gravitationally bound fragments take to recombine. Here we present numerical models of energy deposition from an energy source into various materials that are known PHO constituents, and rigid body dynamics models of the recombination of disrupted objects. In the energy deposition models, we explore the effects of porosity and standoff distance as well as that of composition. In the dynamical models, we explore the effects of fragment size and velocity distributions on the time it takes for gravitationally bound fragments to recombine. Initial models indicate that this recombination time is relatively short, as little as 24 hours for a 1 km sized PHO composed of 1000 meter-scale self-gravitating fragments with an initial velocity field of v/r = 0.001 1/s.
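A back-of-the-envelope check makes the short recombination time plausible. With the quoted velocity field v/r = 0.001 1/s, a surface fragment of a 1 km body moves at about 0.5 m/s, close to the parent body's escape speed for a bulk density of 2000 kg/m^3 (the density is our assumption; the abstract does not state one), so most fragments remain bound and fall back on roughly the gravitational free-fall timescale:

```python
from math import pi, sqrt

G = 6.674e-11   # gravitational constant, m^3 kg^-1 s^-2

def escape_speed(radius_m, density_kg_m3=2000.0):
    """Escape speed at the surface of a uniform sphere of assumed density."""
    mass = (4.0 / 3.0) * pi * radius_m ** 3 * density_kg_m3
    return sqrt(2.0 * G * mass / radius_m)

def free_fall_time(density_kg_m3=2000.0):
    """Gravitational free-fall timescale sqrt(3*pi / (32*G*rho)), in seconds."""
    return sqrt(3.0 * pi / (32.0 * G * density_kg_m3))

r = 500.0                # parent-body radius for a 1 km PHO, m
v_fragment = 0.001 * r   # surface speed implied by the quoted v/r field
v_esc = escape_speed(r)  # ~0.5 m/s at the assumed density
```

The free-fall time at this density is on the order of tens of minutes, so a ~24 hour recombination is consistent once fragments launched near the escape speed are included.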

Plesko, Catherine; Weaver, R. P.; Korycansky, D. G.; Huebner, W. F.




EPA Science Inventory

Characteristics of Resource Conservation and Recovery Act hazardous waste landfills and of landfilled hazardous wastes have been described to permit development of models and other analytical techniques for predicting, reducing, and preventing landfill settlement and related cove...


Catastrophic debris flows transformed from landslides in volcanic terrains : mobility, hazard assessment and mitigation strategies  

USGS Publications Warehouse

Communities in lowlands near volcanoes are vulnerable to significant volcanic flow hazards in addition to those associated directly with eruptions. The largest such risk is from debris flows beginning as volcanic landslides, with the potential to travel over 100 kilometers. Stratovolcanic edifices commonly are hydrothermal aquifers composed of unstable, altered rock forming steep slopes at high altitudes, and the terrain surrounding them is commonly mantled by readily mobilized, weathered airfall and ashflow deposits. We propose that volcano hazard assessments integrate the potential for unanticipated debris flows with, at active volcanoes, the greater but more predictable potential of magmatically triggered flows. This proposal reinforces the already powerful arguments for minimizing populations in potential flow pathways below both active and selected inactive volcanoes. It also addresses the potential for volcano flank collapse to occur with instability early in a magmatic episode, as well as the 'false-alarm problem': the difficulty in evacuating the potential paths of these large mobile flows. Debris flows that transform from volcanic landslides, characterized by cohesive (muddy) deposits, create risk comparable to that of their syneruptive counterparts of snow and ice-melt origin, which yield noncohesive (granular) deposits, because: (1) Volcano collapses and the failures of airfall- and ashflow-mantled slopes commonly yield highly mobile debris flows as well as debris avalanches with limited runout potential. Runout potential of debris flows may increase several fold as their volumes enlarge beyond volcanoes through bulking (entrainment) of sediment. Through this mechanism, the runouts of even relatively small collapses at Cascade Range volcanoes, in the range of 0.1 to 0.2 cubic kilometers, can extend to populated lowlands. 
(2) Collapse is caused by a variety of triggers: tectonic and volcanic earthquakes, gravitational failure, hydrovolcanism, and precipitation, as well as magmatic activity and eruptions. (3) Risk of collapse begins with initial magmatic activity and increases as intrusion proceeds. An archetypal debris flow from volcanic terrain occurred in Colombia with a tectonic earthquake (M 6.4) in 1994. The Río Páez conveyed a catastrophic wave of debris flow over 100 kilometers, coalesced from multiple slides of surficial material weakened both by weathering and by hydrothermal alteration in a large stratovolcano. Similar seismogenic flows occurred in Mexico in 1920 (M ~6.5), Chile in 1960 (M 9.2), and Ecuador in 1987 (M 6.1 and 6.9). Velocities of wave fronts in two examples were 60 to 90 km/hr (17-25 meters per second) over the initial 30 kilometers. Volcano flank and sector collapses may produce untransformed debris avalanches, as occurred initially at Mount St. Helens in 1980. However, at least as common is direct transformation of the failed mass to a debris flow. At two other volcanoes in the Cascade Range, Mount Rainier and Mount Baker, rapid transformation and high mobility were typical of most of at least 15 Holocene flows. This danger exists downstream from many stratovolcanoes worldwide; the population at risk is near 150,000 and increasing at Mount Rainier. The first step in preventing future catastrophes is documenting past flows. Deposits of some debris flows, however, can be mistaken for those of less-mobile debris avalanches on the basis of mounds formed by buoyed megaclasts. Megaclasts may record only the proximal phase of a debris flow that began as a debris avalanche. Runout may have extended much farther, and thus future flow mobility may be underestimated. Processes and behaviors of megaclast-bearing paleoflows are best inferred from the intermegaclast matrix. 
Mitigation strategy can respond to volcanic flows regardless of type and trigger by: (1) Avoidance: Limit settlement in flow pathways to numbers that can be evacuated after event warnings (flow is occurring). (2) Instrumental even
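The link between collapse volume and inundation can be made semi-quantitative with the LAHARZ-style power law B = c * V**(2/3) for planimetric inundation area, calibrated by Iverson and others (1998) with c of roughly 200 in SI units. A sketch applying it to the 0.1-0.2 cubic kilometer Cascade Range collapses mentioned above; treat the outputs as order-of-magnitude guides, not site-specific predictions:

```python
def lahar_planimetric_area_km2(volume_m3, coeff=200.0):
    """Planimetric inundation area B = coeff * V**(2/3), returned in km^2.

    The power-law form and coefficient follow the LAHARZ calibration of
    Iverson and others (1998); the coefficient is an assumption here.
    """
    return coeff * volume_m3 ** (2.0 / 3.0) / 1.0e6   # convert m^2 to km^2

# the 0.1-0.2 km^3 collapse volumes cited for Cascade Range volcanoes
area_small = lahar_planimetric_area_km2(0.1e9)   # roughly 43 km^2
area_large = lahar_planimetric_area_km2(0.2e9)   # roughly 68 km^2
```

Tens of square kilometers of inundation from even the smaller collapses is what puts populated lowlands within reach, consistent with the bulking argument above.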

Scott, Kevin M.; Macias, Jose Luis; Naranjo, Jose Antonio; Rodriguez, Sergio; McGeehin, John P.



Seismicity and seismotectonics of southern Ghana: lessons for seismic hazard mitigation  

NASA Astrophysics Data System (ADS)

Ghana is located on the West African craton, far from the major earthquake zones of the world, and is therefore largely considered a stable region. However, the southern part of the country is seismically active. Records of damaging earthquakes in Ghana date as far back as 1615. A study of the microseismic activity in southern Ghana shows that the seismic activity is linked with active faulting between the east-west trending Coastal boundary fault and the northeast-southwest trending Akwapim fault zone. Epicentres of most of the earthquakes have been located close to the area where the two major faults intersect, which can be related to the level of activity of the faults. Some of the epicentres have been located offshore and can be associated with the level of activity of the Coastal boundary fault. A review of the geological and instrumental recordings of earthquakes in Ghana shows that earthquakes have occurred in the past and are still liable to occur within the vicinity of the intersection of the Akwapim fault zone and the Coastal boundary fault. Data from both historical and instrumental records indicate that the most seismically active area in Ghana is west of Accra, where the Akwapim fault zone and the Coastal boundary fault intersect. There are numerous minor faults in the intersection area between the two major faults, and this mosaic of faults has major implications for seismic activity in the area. Earthquake disaster mitigation measures have been put in place in recent times to reduce the impact of any major event that may occur in the country. The National Disaster Management Organization has produced a building guide to assist in the mitigation of earthquake disasters and floods. The guide clearly stipulates the kinds of materials to be used and in what proportions, what should go into the foundation of a one- or two-storey building, the electrical materials to be used, and more.

Amponsah, Paulina



Evaluation and mitigation of lightning hazards to the space shuttle Solid Rocket Motors (SRM)  

NASA Technical Reports Server (NTRS)

The objective was to quantify electric field strengths in the Solid Rocket Motor (SRM) propellant in the event of a worst case lightning strike. Using transfer impedance measurements for selected lightning protection materials and 3D finite difference modeling, a retrofit design approach for the existing dielectric grain cover and railcar covers was evaluated and recommended for SRM segment transport. A safe level of 300 kV/m was determined for the propellant. The study indicated that a significant potential hazard exists for unprotected segments during rail transport. However, modified railcar covers and grain covers are expected to prevent lightning attachment to the SRM and to reduce the levels to several orders of magnitude below 300 kV/m.
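The role of transfer impedance in the retrofit evaluation can be illustrated with a one-line coupling estimate: the voltage appearing inside a shield per metre of run is the product of the shield's transfer impedance and the external stroke current. The numbers below (a 200 kA stroke and a 1 milliohm-per-metre cover) are hypothetical, not the measured values from the SRM study:

```python
def induced_voltage_per_metre(z_transfer_ohm_per_m, stroke_current_a):
    """Interior voltage per metre of shielded run: V' = Z_t * I.

    Transfer impedance couples the external lightning current on a shield
    or cover to the voltage seen inside it.
    """
    return z_transfer_ohm_per_m * stroke_current_a

# hypothetical worst-case values for illustration only
v_per_m = induced_voltage_per_metre(1.0e-3, 200.0e3)   # volts per metre
margin = 300.0e3 / v_per_m   # ratio to the 300 kV/m propellant limit
```

Even these coarse numbers show how a conductive cover can hold interior stress orders of magnitude below the 300 kV/m propellant limit, which is the kind of margin the modeling sought to demonstrate.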

Rigden, Gregory J.; Papazian, Peter B.



Exploratory Studies Facility Subsurface Fire Hazards Analysis  

SciTech Connect

The primary objective of this Fire Hazard Analysis (FHA) is to confirm that the requirements for a comprehensive fire and related hazards protection program for the Exploratory Studies Facility (ESF) are sufficient to minimize the potential for: The occurrence of a fire or related event; A fire that causes an unacceptable on-site or off-site release of hazardous or radiological material that will threaten the health and safety of employees, the public or the environment; Vital U.S. Department of Energy (DOE) programs suffering unacceptable interruptions as a result of fire and related hazards; Property losses from a fire and related events exceeding limits established by DOE; and Critical process controls and safety class systems being damaged as a result of a fire and related events.

Richard C. Logan



Exploratory Studies Facility Subsurface Fire Hazards Analysis  

SciTech Connect

The primary objective of this Fire Hazard Analysis (FHA) is to confirm that the requirements for a comprehensive fire and related hazards protection program for the Exploratory Studies Facility (ESF) are sufficient to minimize the potential for: (1) The occurrence of a fire or related event. (2) A fire that causes an unacceptable on-site or off-site release of hazardous or radiological material that will threaten the health and safety of employees, the public or the environment. (3) Vital U.S. Department of Energy (DOE) programs suffering unacceptable interruptions as a result of fire and related hazards. (4) Property losses from a fire and related events exceeding limits established by DOE. (5) Critical process controls and safety class systems being damaged as a result of a fire and related events.

J. L. Kubicek



Disruption Mitigation Studies in DIII--D  

NASA Astrophysics Data System (ADS)

Critical to the viability of the tokamak concept, along with the operation and survivability of future devices, is the development of techniques to terminate the discharge safely and mitigate the destructive effects of disruptions: high thermal and electromagnetic loads as well as intense high-energy runaway electron beams. A series of dedicated disruption experiments on DIII-D have provided further data on the discharge behavior, thermal loads, halo currents, and runaway electrons, and have evaluated techniques to mitigate the disruptions while minimizing runaway electron production. Non-axisymmetric halo currents occur with currents up to 30% of the plasma current and with toroidal peaking factors of 2 at the time of peak halo current. Large heat fluxes are also measured, with up to 100% of the pre-disruption thermal energy deposited on the divertor floor. Fundamental questions on halo current generation, scaling, and mitigation are being addressed by comparisons of DIII-D plasmas during disruptions with the DINA time-dependent resistive MHD code and with semi-analytic halo current evolution models. Experiments injecting cryogenic impurity "killer" pellets of neon, argon, and methane have successfully mitigated these disruption effects: reducing the halo currents by 30%-50%, lowering the halo current asymmetry to near unity, reducing the vertical force on the vessel, and enhancing the power loss through the radiation channel to ~90%. Runaway electrons are often generated following the pellet injection, and results of recent experiments using pre-emptive "killer" pellets help benchmark theoretical models of the pellet ablation and plasma energy loss (the KPRAD and TSC codes) and of the runaway electron generation (the CQL3D Fokker-Planck code). Use of the models has led to the identification of two new runaway generation mechanisms: one a modification of the standard Dreicer process and one arising from instability-induced transport or time-dependent effects.
Experiments with a massive helium gas puff (3000 T-l in 7 ms) have also effectively mitigated disruptions, but without the formation of runaway electrons that can occur with "killer" pellets. The massive gas puff results provide encouragement for longer-term approaches to disruption mitigation, which are focusing on liquid jets.
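The halo-current figures quoted in this abstract (up to 30% of the plasma current, toroidal peaking factor of 2, pellet mitigation cutting the current by ~50% and the asymmetry to near unity) can be combined into a rough load estimate. The sketch below is illustrative arithmetic only; the plasma current value is a hypothetical DIII-D-like number, not one stated in the record.

```python
# Back-of-envelope sketch of peak local halo-current load, using the
# percentages quoted in the abstract. Ip is an assumed plasma current.

def peak_halo_current(ip_ma, halo_fraction, toroidal_peaking_factor):
    """Peak local halo load ~ halo fraction * toroidal peaking factor * Ip."""
    return ip_ma * halo_fraction * toroidal_peaking_factor

ip = 1.5  # MA, hypothetical pre-disruption plasma current

unmitigated = peak_halo_current(ip, 0.30, 2.0)        # worst case quoted
mitigated = peak_halo_current(ip, 0.30 * 0.5, 1.0)    # ~50% lower current, TPF ~1

print(f"unmitigated peak halo load: {unmitigated} MA-equivalent")
print(f"pellet-mitigated peak load: {mitigated} MA-equivalent")
```

Under these assumed numbers, pellet injection reduces the peak asymmetric load by roughly a factor of four, which is why both the current reduction and the peaking-factor reduction matter for vessel forces.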

Taylor, P. L.




E-print Network

are extending towards danger zones in the cryospheric systems. A number of recent glacier hazards and disasters particularly affects terrestrial systems where surface and sub-surface ice is involved. Changes in glacier

Kääb, Andreas


Piloted Simulation to Evaluate the Utility of a Real Time Envelope Protection System for Mitigating In-Flight Icing Hazards  

NASA Technical Reports Server (NTRS)

The utility of the Icing Contamination Envelope Protection (ICEPro) system for mitigating a potentially hazardous icing condition was evaluated by 29 pilots using the NASA Ice Contamination Effects Flight Training Device (ICEFTD). ICEPro provides real-time envelope protection cues and alerting messages on pilot displays. The pilots participating in this test were divided into two groups: a control group using baseline displays without ICEPro, and an experimental group using ICEPro-driven display cueing. Each group flew identical precision approach and missed approach procedures with a simulated failure-case icing condition. Pilot performance, workload, and survey questionnaires were collected for both groups of pilots. Results showed that real-time assessment cues were effective in reducing the number of potentially hazardous upset events and in lessening exposure to loss of control following an incipient upset condition. Pilot workload with the added ICEPro displays was not measurably affected, and pilot opinion surveys showed that real-time cueing greatly improved situation awareness of a hazardous aircraft state.

Ranaudo, Richard J.; Martos, Borja; Norton, Bill W.; Gingras, David R.; Barnhart, Billy P.; Ratvasky, Thomas P.; Morelli, Eugene



Assessing NEO hazard mitigation in terms of astrodynamics and propulsion systems requirements.  


Uncertainties associated with assessing valid near-Earth object (NEO) threats and carrying out interception missions place unique and stringent burdens on the design of mission architecture, astrodynamics, and spacecraft propulsion systems. A prime uncertainty is associated with the meaning of NEO orbit predictability regarding Earth impact. Analyses of past NEO orbits and impact probabilities indicate uncertainties in determining whether a projected NEO threat will actually materialize within a given time frame. Other uncertainties concern the estimated mass, composition, and structural integrity of the NEO body. At issue is whether one can reliably estimate a NEO threat and its magnitude. Parameters that determine NEO deflection requirements within various time frames, including the terminal orbital pass before impact, and the necessary energy payloads are quantitatively discussed. Propulsion system requirements for extending space capabilities to rapidly interact with NEOs at ranges of up to about 1 AU (astronomical unit) from Earth are outlined. Such missions, without gravitational boosts, are deemed critical for practical and effective mitigation. If an impact threat is confirmed on an immediate orbital pass, the options for interactive reconnaissance, interception, and subsequent NEO orbit deflection must be promptly exercised. There must also be an option to abort the mitigation mission if the NEO is subsequently found not to be Earth-threatening. These options require optimal decision latitude and operational possibilities for NEO threat removal while minimizing alarm. Acting too far in advance of the projected impact could induce perturbations that ultimately exacerbate the threat.
Given the dilemmas, uncertainties, and limited options associated with timely NEO mitigation within a decision-making framework, the currently available propulsion technologies that appear most viable for carrying out a NEO interception/mitigation mission with the greatest margin of control and reliability are those based on a combined (bimodal) nuclear thermal/nuclear electric propulsion platform. Elements of required and currently available performance characteristics for nuclear and electric propulsion systems are also discussed. PMID:15220155

Remo, John L



A mission template for exploration and damage mitigation of potential hazard of Near Earth Asteroids  

Microsoft Academic Search

The Apophis Exploratory and Mitigation Platform (AEMP) concept was developed as a prototype mission to explore and potentially deflect the Near Earth Asteroid (NEA) 99942 Apophis. Deflection of the asteroid from the potential 2036 impact will be achieved using a gravity tractor technique, while a permanent deflection, eliminating future threats, will be imparted using a novel albedo manipulation technique. This

D. C. Hyland; H. A. Altwaijry; R. Margulieux; J. Doyle; J. Sandberg; B. Young; N. Satak; J. Lopez; S. Ge; X. Bai



Marine and Hydrokinetic Renewable Energy Devices, Potential Navigational Hazards and Mitigation Measures  

SciTech Connect

On April 15, 2008, the Department of Energy (DOE) issued a Funding Opportunity Announcement for Advanced Water Power Projects which included a Topic Area for Marine and Hydrokinetic Renewable Energy Market Acceleration Projects. Within this Topic Area, DOE identified potential navigational impacts of marine and hydrokinetic renewable energy technologies and measures to prevent adverse impacts on navigation as a sub-topic area. DOE defines marine and hydrokinetic technologies as those capable of utilizing one or more of the following resource categories for energy generation: ocean waves; tides or ocean currents; free flowing water in rivers or streams; and energy generation from the differentials in ocean temperature. PCCI was awarded Cooperative Agreement DE-FC36-08GO18177 from the DOE to identify the potential navigational impacts and mitigation measures for marine hydrokinetic technologies. A technical report addressing our findings is available on this Science and Technology Information site under the Product Title, "Marine and Hydrokinetic Renewable Energy Technologies: Potential Navigational Impacts and Mitigation Measures". This product is a brochure, primarily for project developers, that summarizes important issues in that more comprehensive report, identifies locations where that report can be downloaded, and identifies points of contact for more information.

Cool, Richard, M.; Hudon, Thomas, J.; Basco, David, R.; Rondorf, Neil, E.



Examining Local Jurisdictions' Capacity and Commitment For Hazard Mitigation Policies and Strategies along the Texas Coast  

E-print Network

In recent decades, coastal areas around the world have experienced coastal hazards such as tsunamis, hurricanes, and tropical storms that have caused the loss of human life as well as immense economic losses. The tsunami that hit Asia in 2004, for example, killed more than 200,000 people (Lay et al., 2005), and the recent earthquake followed by a tsunami that hit Japan on March 11, 2011 killed 15,839 people and has cost estimates up to $235 billion in damages, making it the most expensive natural disaster...

Husein, Rahmawati



Using Darwin's theory of atoll formation to improve tsunami hazard mitigation in the Pacific  

NASA Astrophysics Data System (ADS)

It is 130 years since Charles Darwin's death and 176 years since he penned his subsidence theory of atoll formation on 12th April 1836 during the voyage of the Beagle through the Pacific. This theory, founded on the premise of a subsiding volcano and the corresponding upward growth of coral reef, was astonishing for its time considering the absence of an underpinning awareness of plate tectonics. Furthermore, with the exception of the occasional permutation and opposing idea, his theory has endured and has an enviable longevity among paradigms in geomorphology. In his theory, Darwin emphasised the generally circular morphology of the atoll shape and, surprisingly, the validity of this simple morphological premise has never been questioned. There are, however, few atolls in the Pacific Ocean that attain such a simple morphology, with most manifesting one or more arcuate 'bight-like' structures (ABLSs). These departures from the circular form complicate his simple model and are indicative of geomorphological processes in the Pacific Ocean which cannot be ignored. ABLSs represent the surface morphological expression of major submarine failures of atoll volcanic foundations. Such failures can occur during any stage of atoll formation and are a valuable addition to Darwin's theory because they indicate the instability of the volcanic foundations. It is widely recognized in the research community that sector/flank collapses of island edifices are invariably tsunamigenic, and yet we have no clear understanding of how significant such events are in the tsunami hazard arena. The recognition of ABLSs, however, now offers scientists the opportunity to establish a first-order database of potential local and regional tsunamigenic sources associated with the sector/flank collapses of island edifices. We illustrate the talk with examples of arcuate 'bight-like' structures and associated tsunamis in atoll and atoll-like environments.
The implications for our understanding of tsunami hazards are profound. In essence, at present we are seriously under-estimating the significance of locally and regionally generated tsunamis throughout the entire Pacific Ocean, but we now have the opportunity to enhance our understanding of such events.

Goff, J. R.; Terry, J. P.



Sea otter oil-spill mitigation study  

SciTech Connect

The objective of the study was to analyze the effectiveness of existing capture, transport, cleaning, and rehabilitation methods, and to develop new methods to reduce the impact on California sea otters of an accidental oil spill resulting from present conditions or from future Outer Continental Shelf (OCS) oil and gas development in State or Federal waters. In addition, the study investigated whether or not a systematic difference in thermal conductivity existed between the pelts of Alaska and California sea otters. This was done to assure that conclusions drawn from the oiling experiments carried out at Hubbs Marine Research Institute applied to California sea otters. Tetra Tech, Inc. contributed to the overall study by preparing a literature review and report on the fate and effects of oil dispersants and chemically dispersed oil.

Davis, R.W.; Thomas, J.; Williams, T.M.; Kastelein, R.; Cornell, L.



Probing Aircraft Flight Test Hazard Mitigation for the Alternative Fuel Effects on Contrails and Cruise Emissions (ACCESS) Research Team . Volume 2; Appendices  

NASA Technical Reports Server (NTRS)

The Alternative Fuel Effects on Contrails and Cruise Emissions (ACCESS) Project Integration Manager requested in July 2012 that the NASA Engineering and Safety Center (NESC) form a team to independently assess aircraft structural failure hazards associated with the ACCESS experiment and to identify potential flight test hazard mitigations to ensure flight safety. The ACCESS Project Integration Manager subsequently requested that the assessment scope be focused predominantly on structural failure risks to the aircraft empennage (horizontal and vertical tail). This report contains the Appendices to Volume I.

Kelly, Michael J.



Volcanic Ash Image Products from MODIS for Aviation Safety and Natural Hazard Mitigation  

NASA Astrophysics Data System (ADS)

Multi-spectral volcanic ash image products have been developed using Moderate Resolution Imaging Spectroradiometer (MODIS) data from the NASA Terra spacecraft (Ellrod and Im 2003). Efforts are now underway to integrate these new products into the MODIS Data Retrieval System at NESDIS, for use in the operational Hazard Mapping System (HMS). The images will be used at the Washington Volcanic Ash Advisory Center (W-VAAC) in the issuance of volcanic ash advisory statements to aircraft. In addition, the images will be made available to users in the global volcano and emergency management community via the World Wide Web. During the development process, good results (a high detection rate with low "false alarms") were obtained from a tri-spectral combination of MODIS infrared (IR) bands centered near 8.6, 11.0, and 12.0 µm (Bands 29, 31, and 32). Optimum Red-Green-Blue false color composite images were developed to provide information on ash cloud location, as well as cloud phase and surface characteristics, to aid interpretation both day and night. Information on volcanic ash derived from the tri-spectral product was displayed using the red color gun. This information was combined with visible (0.6 µm) and near-IR (1.6 µm) data for green and blue, respectively, during daylight periods. At night, the 8.6 − 11.0 µm combination and the 11.0 µm band were used for the green and blue colors in the RGB product. Currently, raw MODIS data in five-minute "granules" are processed for the following regions: (1) southern Alaska; (2) Mexico, Central America, and the Caribbean; and (3) the northern Andes region of South America. Image products are converted to Geo-spatial Information System (GIS) compatible formats for use in the HMS, and to Man-Computer Interactive Data Access System (McIDAS) "Area File" format for use in currently configured W-VAAC display systems.
The installation of a high-speed, fiber optic line from NASA Goddard Space Flight Center to the World Weather Building, Camp Springs, Maryland (scheduled for completion by Fall 2003) will allow a full set of data to be processed from both the Terra and Aqua spacecraft.
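The tri-spectral combination described in this record follows the classic split-window idea: silicate ash tends to make the 11.0 − 12.0 µm brightness-temperature difference negative, while the 8.6 µm band helps separate ash from other cloud types. The sketch below illustrates that decision logic only; the brightness temperatures and thresholds are hypothetical, not values from the operational NESDIS product.

```python
import numpy as np

# Minimal sketch of a tri-spectral ash test on MODIS bands 29/31/32
# (8.6, 11.0, 12.0 um brightness temperatures). Thresholds are assumed;
# operational products tune them per region, season, and viewing geometry.

def ash_flag(bt86, bt110, bt120, split_thresh=-0.5, tri_thresh=0.0):
    """Flag pixels where BT11 - BT12 is negative (reverse absorption by
    silicate ash) and BT8.6 - BT11 is low (helps reject non-ash cloud)."""
    split_window = bt110 - bt120
    tri_spectral = bt86 - bt110
    return (split_window < split_thresh) & (tri_spectral < tri_thresh)

bt86 = np.array([265.0, 270.0])   # hypothetical pixel values, in kelvin
bt110 = np.array([268.0, 269.0])
bt120 = np.array([270.0, 268.0])
print(ash_flag(bt86, bt110, bt120))  # first pixel flagged, second not
```

In an RGB composite like the one described above, such a flag (or the raw band differences) would drive the red gun, with the other guns carrying visible/near-IR or window-channel information.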

Stephens, G.; Ellrod, G. P.; Im, J.



Mitigating Resistance to Teaching Science Through Inquiry: Studying Self  

Microsoft Academic Search

This is the report of a qualitative emergent-design study of 2 different Web-enhanced science methods courses for preservice elementary teachers in which an experiential learning strategy, labeled "using yourself as a learning laboratory," was implemented. Emergent grounded theory indicated this strategy, when embedded in a course organized as an inquiry with specified action foci, contributed to mitigating participants' resistance to

Barbara Spector; Ruth S. Burkett; Cyndy Leard



Multi-scale earthquake hazard and risk in the Chinese mainland and countermeasures for the preparedness, mitigation, and management: an overview  

NASA Astrophysics Data System (ADS)

Earthquake hazard and risk in the Chinese mainland exhibit multi-scale characteristics. Temporal scales from centuries to months, spatial scales from the whole mainland to specific engineering structures, and energy scales from great disastrous earthquakes to small earthquakes causing social disturbance and economic loss all feature in the complexity of earthquake disasters. To cope with this complex challenge, several research and application projects have been undertaken in recent years. Lessons and experiences of the 2008 Wenchuan earthquake contributed much to the launching and conduct of these projects. Understanding of the scientific problems and technical approaches taken in the mainstream studies in the Chinese mainland does not differ significantly from that in the international scientific communities, albeit the use of some terminologies shows "cultural differences" - for instance, in the China Earthquake Administration (CEA), the terminology "earthquake forecast/prediction (study)" is generally used in a much broader sense, mainly indicating time-dependent seismic hazard at different spatio-temporal scales. Several scientific products have been produced to serve society in different forms. These scientific products have unique academic merits due to their long-term persistence and forward-forecast nature, both essential for evaluating technical performance and falsifying scientific ideas. On the other hand, in the language of actor network theory (ANT) in science studies (or the sociology of science), the hierarchical "actors' network" that transforms the science into public and government action for the preparedness, mitigation, and management of multi-scale earthquake disasters is still in need of careful construction and improvement.

Wu, Z.; Jiang, C.; Ma, T.



RAGE Hydrocode Modeling of Asteroid Mitigation: new simulations with parametric studies for uncertainty quantification  

NASA Astrophysics Data System (ADS)

We are performing detailed hydrodynamic simulations of the interaction of a strong explosion with sample asteroid objects. The purpose of these simulations is to apply modern hydrodynamic codes that have been well verified and validated (V&V) to the problem of mitigating the hazard from a potentially hazardous object (PHO), an asteroid or comet that is on an Earth-crossing orbit. The code we use for these simulations is the RAGE code from Los Alamos National Laboratory [1-6]. Initial runs were performed using a spherical object. Next we ran simulations using the shape of a known asteroid, 25143 Itokawa. This particular asteroid is not a PHO, but we use its shape to consider the influence of non-spherical objects. The initial work was performed using 2D cylindrically symmetric simulations and simple geometries. We then performed a major fully 3D simulation. For an Itokawa-size object (~500 m) and explosion energies ranging from 0.5-1 megatons, the velocities imparted to all of the PHO "rocks" were in all cases many m/s. The velocities calculated were much larger than escape velocity and would preclude re-assembly of the fragments. The dispersion of the asteroid remnants is very directional from a surface burst, with all fragments moving away from the point of the explosion. This detail can be used to time the intercept for maximum movement off the original orbit. Results from these previous studies will be summarized for background. In the new work presented here we show a variety of parametric studies around these initial simulations. We modified the explosion energy by +/- 20% and varied the internal composition from a few large "rocks" to several hundred smaller rocks. The results of these parametric studies will be presented. We have also extended our work [6],[7] to stand-off nuclear bursts and will present initial results for the energy deposition by a generic source into the non-uniform-composition asteroid.
The goal of this new work is to obtain an "optimal stand-off" distance from detailed radiation transport-hydrodynamic simulations of generic explosion properties. The initial results of these two studies will also be presented. References: [1] Gittings, Weaver, et al., 'The RAGE radiation-hydrodynamics code,' Comp. Sci. Disc. 1 (2008) 015005, November 21, 2008. [2] Huebner, W.F., et al., 'The Engagement Space for Countermeasures Against Potentially Hazardous Objects (PHOs),' International Conference on Asteroid and Comet Hazards, 2009, held at the Russian Academy of Sciences, St. Petersburg, 21-25 September 2009. [3] Gisler, Weaver, Mader, & Gittings, 'Two and three dimensional asteroid impact simulations,' Computing in Science & Engineering, 6, 38 (2004). [4] NASA geometry courtesy of S.J. Ostro et al. (2002) in the Asteroids III book. [5] Itokawa image courtesy of JAXA: [6] Plesko, C., et al., 'Looking Before We Leap: Recent Results from an Ongoing, Quantitative Investigation of Asteroid and Comet Impact Hazard Mitigation,' Division for Planetary Sciences, 2010. [7] Plesko, C., et al., 'Numerical Models of Asteroid and Comet Impact Hazard Mitigation by Nuclear Stand-Off Burst,' Planetary Defense Conference, 2011.
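The abstract's claim that fragment velocities of "many m/s" far exceed escape velocity and preclude re-assembly is easy to check with a back-of-envelope calculation. The sketch below assumes a spherical body and a rubble-pile bulk density of 2000 kg/m³; both are assumptions, not values stated in the record.

```python
import math

# Escape velocity of an Itokawa-size (~500 m) body, assuming a sphere of
# assumed bulk density 2000 kg/m^3. Fragment speeds of "many m/s" dwarf this.

G = 6.674e-11    # gravitational constant, m^3 kg^-1 s^-2
radius = 250.0   # m, for a ~500 m diameter object
density = 2000.0 # kg/m^3, assumed rubble-pile bulk density

mass = (4.0 / 3.0) * math.pi * radius**3 * density
v_esc = math.sqrt(2.0 * G * mass / radius)
print(f"escape velocity ~ {v_esc:.2f} m/s")
```

Under these assumptions the escape velocity comes out well below 1 m/s, so fragments dispersing at several m/s cannot gravitationally re-assemble, consistent with the simulation result quoted above.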

Weaver, R.; Plesko, C. S.; Gisler, G. R.



Natural Hazard Mitigation thru Water Augmentation Strategies to Provide Additional Snow Pack for Water Supply and Hydropower Generation in Drought Stressed Alps/Mountains  

NASA Astrophysics Data System (ADS)

Climate variability and change are clearly stressing water supplies in high alpine regions of the Earth. These long-term natural hazards present critical challenges to policy makers and water managers. This paper addresses strategies that use enhanced scientific methods to mitigate the problem. Recent rapid depletion of glaciers and intense droughts throughout the world have created a need to reexamine modern water augmentation technologies for enhancing snow pack in mountainous regions. Today's reliance on clean, efficient hydroelectric power in the Alps and the Rocky Mountains creates a critical need for sustainable snow packs and high-elevation water supplies throughout the year. Hence the need to make natural cloud systems more efficient precipitators during the cold season through anthropogenic weather modification techniques. The Bureau of Reclamation, US Department of the Interior, spent over $39M on research from 1963 to 1990 to develop the scientific basis for snow pack augmentation in the headwaters of the Colorado, American, and Columbia River Basins in the western United States, and through USAID in the High Atlas Mountains of Morocco. This paper presents a brief summary of the research findings and shows that even during drought conditions potential exists for significant, cost-effective enhancement of water supplies. Examples of ground-based propane and AgI seeding generators, and of cloud physics studies of supercooled cloud droplets and ice crystal characteristics that indicate seeding potential, will be shown. Hypothetical analyses of seeding potential in 17 western states from Montana to California will be presented based on observed SNOTEL snow water equivalent measurements, distributed by elevation and observed winter precipitation. Early studies indicated that increases in snow pack of 5 to 20% were possible if winter storm systems were seeded effectively.
If this potential were realized under the drought conditions observed in 2003, over 1.08 million acre-feet (1.33 x 10^9 m^3) of additional water could be captured by seeding efficiently and effectively in just 10 storms. Recent projects sponsored by the National Science Foundation, NOAA, and the States of Wyoming, Utah, and Nevada, and conducted by the National Center for Atmospheric Research, will be discussed briefly. Examples of conditions in extreme droughts of the western United States will be presented that show potential to mitigate droughts in these regions through cloud seeding. Implications for American and European hydropower generation and sustainable water supplies will be discussed.
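The two volume figures quoted above are just a unit conversion, which the short check below confirms (1 acre-foot = 1233.48 m³ is the standard conversion factor; the rest of the numbers come from the abstract).

```python
# Unit check on the water-volume figure quoted above: 1.08 million acre-feet
# expressed in cubic metres.

ACRE_FOOT_M3 = 1233.48  # standard conversion: 1 acre-foot in m^3

volume_m3 = 1.08e6 * ACRE_FOOT_M3
print(f"{volume_m3:.3e} m^3")  # ~1.33e+09 m^3, matching the abstract
```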

Matthews, D.; Brilly, M.



Study of geological hazard's assessment on coastline change  

NASA Astrophysics Data System (ADS)

In order to discuss methods of coastal geological hazard assessment, the author chooses the typical hazards caused by coastline change as the object of study. A fuzzy hierarchy comprehensive evaluation approach based on GIS is applied to study and demonstrate the main principles and process of coastal hazard assessment. Then, taking the Bao'an district of Shenzhen as an example, we carried out a hazard assessment of the coastline change using the GIS-based fuzzy hierarchy evaluation method, and then conducted regionalization research on the coastal geological hazard. Data from three periods were selected to characterize the coastline change. The remote sensing results indicate that, apart from the west seacoast, which has essentially not changed, the coastline change in Bao'an district was quite obvious during the seventeen years from 1989 to 2006.
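The core step of a fuzzy hierarchy comprehensive evaluation is a weighted composition of a criterion-weight vector with a membership matrix over hazard grades. The sketch below shows that composition in its simplest (weighted-average) form; the criteria, weights, and membership grades are all hypothetical, since the abstract does not give the study's actual numbers.

```python
import numpy as np

# Minimal sketch of one fuzzy comprehensive evaluation step. Weights and
# membership grades below are invented for illustration only.

weights = np.array([0.5, 0.3, 0.2])  # hypothetical criterion weights (sum to 1)

# Membership of each criterion in three hazard grades (low, medium, high):
membership = np.array([
    [0.1, 0.3, 0.6],   # e.g. rate of coastline change
    [0.2, 0.5, 0.3],   # e.g. land-use intensity
    [0.6, 0.3, 0.1],   # e.g. protective works
])

# Weighted-average fuzzy operator: B = W . R
result = weights @ membership
grade = ["low", "medium", "high"][int(result.argmax())]
print(result, "->", grade)
```

In a GIS setting this composition is evaluated per map unit, and the winning grade drives the hazard regionalization map.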

Zhao, X. L.; Liu, J.; Yang, X. L.



Regional governance and hazard information: the role of co-ordinated risk assessment and regional spatial accounting in wildfire hazard mitigation  

Microsoft Academic Search

With the threat of wildfire hanging over many communities in the Western and Southern United States, wildfire mitigation is evolving into a significant public responsibility for rural and urban edge county governments. Regional governance is an important piece of the effort to reduce wildfire risks although still weakly developed as a policy arena. This project explores two dimensions in which

Brian H. Muller; Li Yin



System Safety Hazards Assessment in Conceptual Program Trade Studies  

NASA Technical Reports Server (NTRS)

Providing a program in the concept development phase with a method of determining the system safety benefits of potential concepts has always been a challenge. Lockheed Martin Space and Strategic Missiles has developed a methodology for producing a relative system safety ranking using the potential hazards of each concept. The resulting output supports program decisions with system safety as an evaluation criterion, with supporting data for evaluation. This approach begins with a generic hazards list that has been tailored for the program being studied and augmented with an initial hazard analysis. Each proposed concept is assessed against the list of program hazards and ranked in three derived areas. The hazards can be weighted to emphasize those of most concern to the program. Sensitivities can also be determined to test the robustness of the conclusions.
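The weighted relative-ranking idea described above can be sketched in a few lines: score each concept against a weighted hazard list, then sort. The hazards, weights, and severity scores below are entirely hypothetical; the record does not disclose Lockheed Martin's actual scoring scheme or its three derived ranking areas.

```python
# Toy sketch of a weighted hazard ranking across candidate concepts.
# All hazards, weights, and severity scores are invented for illustration.

hazards = {"fire": 3.0, "toxic release": 2.0, "structural failure": 1.0}

concepts = {
    "concept A": {"fire": 2, "toxic release": 1, "structural failure": 3},
    "concept B": {"fire": 1, "toxic release": 3, "structural failure": 1},
}

def weighted_score(severities, weights):
    """Sum of hazard severity times program-assigned hazard weight."""
    return sum(weights[h] * s for h, s in severities.items())

# Lower total score = lower weighted hazard exposure, so sort ascending.
ranking = sorted(concepts, key=lambda c: weighted_score(concepts[c], hazards))
for name in ranking:
    print(name, weighted_score(concepts[name], hazards))
```

Sensitivity testing, as the abstract describes, amounts to perturbing the weights and checking whether the ranking order changes.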

Eben, Dennis M.; Saemisch, Michael K.



Airflow Hazard Visualization for Helicopter Pilots: Flight Simulation Study Results  

NASA Technical Reports Server (NTRS)

Airflow hazards such as vortices or low level wind shear have been identified as a primary contributing factor in many helicopter accidents. US Navy ships generate airwakes over their decks, creating potentially hazardous conditions for shipboard rotorcraft launch and recovery. Recent sensor developments may enable the delivery of airwake data to the cockpit, where visualizing the hazard data may improve safety and possibly extend ship/helicopter operational envelopes. A prototype flight-deck airflow hazard visualization system was implemented on a high-fidelity rotorcraft flight dynamics simulator. Experienced helicopter pilots, including pilots from all five branches of the military, participated in a usability study of the system. Data was collected both objectively from the simulator and subjectively from post-test questionnaires. Results of the data analysis are presented, demonstrating a reduction in crash rate and other trends that illustrate the potential of airflow hazard visualization to improve flight safety.

Aragon, Cecilia R.; Long, Kurtis R.



Caribbean Tsunami and Earthquake Hazards Studies  

NSDL National Science Digital Library

This portal provides information on the seismicity and plate tectonics of the active boundary between the North American plate and the northeast corner of the Caribbean plate, and the research program being conducted there by the United States Geological Survey (USGS). There are links to maps and remote imagery of the plate boundary and the Caribbean Trench, and to publications and news articles on seismic and tsunami hazards, seafloor mapping, plate interactions, and submarine slides. There is also a movie that describes the geologic background and USGS research efforts in the area.


Comparison of flood hazard assessments on desert piedmonts and playas: A case study in Ivanpah Valley, Nevada  

NASA Astrophysics Data System (ADS)

Accurate and realistic characterizations of flood hazards on desert piedmonts and playas are increasingly important given the rapid urbanization of arid regions. Flood behavior in arid fluvial systems differs greatly from that of the perennial rivers upon which most conventional flood hazard assessment methods are based. Additionally, hazard assessments may vary widely between studies or even contradict other maps. This study's chief objective was to compare and evaluate landscape interpretation and hazard assessment between types of maps depicting assessments of flood risk in Ivanpah Valley, NV, as a case study. As a secondary goal, we explain likely causes of discrepancy between data sets to ameliorate confusion for map users. Four maps, including three different flood hazard assessments of Ivanpah Valley, NV, were compared: (i) a regulatory map prepared by FEMA, (ii) a soil survey map prepared by NRCS, (iii) a surficial geologic map, and (iv) a flood hazard map derived from the surficial geologic map, both of which were prepared by NBMG. GIS comparisons revealed that only 3.4% (33.9 km²) of Ivanpah Valley was found to lie within a FEMA floodplain, while the geologic flood hazard map indicated that ~44% of Ivanpah Valley runs some risk of flooding (Fig. 2D). Due to differences in mapping methodology and scale, NRCS data could not be quantitatively compared, and other comparisons were complicated by differences in flood hazard class criteria and terminology between maps. Owing to its scale and scope of attribute data, the surficial geologic map provides the most useful information on flood hazards for land-use planning. This research has implications for future soil geomorphic mapping and flood risk mitigation on desert piedmonts and playas. The Ivanpah Valley study area also includes the location of a planned new international airport, thus this study has immediate implications for urban development and land-use planning near Las Vegas, NV.
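The contrast between the FEMA and geologic-map assessments follows directly from the percentages quoted in the abstract: if 33.9 km² is 3.4% of the compared area, the total area and the ~44% figure convert to absolute areas as sketched below (the derived totals are implied by, not stated in, the record).

```python
# Arithmetic implied by the GIS comparison above: derive the total compared
# area from the FEMA figures, then express the geologic map's ~44% in km^2.

fema_area_km2 = 33.9   # area within a FEMA floodplain
fema_fraction = 0.034  # 3.4% of the compared area

total_area_km2 = fema_area_km2 / fema_fraction       # implied total area
geologic_hazard_km2 = 0.44 * total_area_km2          # geologic map's ~44%
print(f"total ~{total_area_km2:.0f} km^2, "
      f"geologic flood hazard ~{geologic_hazard_km2:.0f} km^2")
```

That is roughly a thirteen-fold difference in mapped flood-prone area between the regulatory and geologic assessments, which is the study's central discrepancy.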

Robins, Colin R.; Buck, Brenda J.; Williams, Amanda J.; Morton, Janice L.; House, P. Kyle; Howell, Michael S.; Yonovitz, Maureen L.
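The GIS overlay comparison described in this record can be sketched with raster masks. The arrays, cell size, and flagged fractions below are synthetic stand-ins for the FEMA and geologic maps, not the study's data.

```python
import numpy as np

# Hypothetical 100 m rasters of the valley: True = mapped as flood-prone.
rng = np.random.default_rng(0)
valley = np.ones((200, 300), dtype=bool)      # study-area extent
fema = rng.random(valley.shape) < 0.034       # ~3.4% flagged (illustrative)
geologic = rng.random(valley.shape) < 0.44    # ~44% flagged (illustrative)

cell_area_km2 = 0.01                          # 100 m x 100 m cells

def coverage(mask, extent):
    """Fraction of the study extent flagged as flood-prone."""
    return mask[extent].mean()

fema_pct = 100 * coverage(fema, valley)
geo_pct = 100 * coverage(geologic, valley)
# Area (km^2) where the two assessments disagree:
disagree_km2 = np.logical_xor(fema, geologic)[valley].sum() * cell_area_km2
```

A real comparison would rasterize the published polygon maps to a common grid before differencing, which is where the scale and class-criteria mismatches noted in the abstract become visible.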



Assessing the Benefits and Costs of Earthquake Mitigation  

Microsoft Academic Search

Over the past decade, as the costs of natural disasters have skyrocketed in the United States, much emphasis has been placed on mitigating these hazards. However, to date, limited published data exist on the vulnerability of structures to ground shaking, and quantitative cost-benefit studies on earthquake mitigation are nearly non-existent.

Patricia A. Grossi; M. EERI



Integrated Data Products to Forecast, Mitigate, and Educate for Natural Hazard Events Based on Recent and Historical Observations  

NASA Astrophysics Data System (ADS)

Immediately following a damaging or fatal natural hazard event there is interest in accessing authoritative data and information. The National Geophysical Data Center (NGDC) maintains and archives a comprehensive collection of natural hazards data. The NGDC global historic event database includes all tsunami events, regardless of intensity, as well as earthquakes and volcanic eruptions that caused fatalities, moderate damage, or generated a tsunami. Examining the past record provides clues to what might happen in the future. NGDC also archives tide gauge data from stations operated by the NOAA/NOS Center for Operational Oceanographic Products and Services and the NOAA Tsunami Warning Centers. In addition to the tide gauge data, NGDC preserves deep-ocean water-level data, sampled at 15-second intervals, collected by the Deep-ocean Assessment and Reporting of Tsunamis (DART) buoys. Water-level data provide evidence of sea-level fluctuation and possible inundation events. NGDC houses an extensive collection of geologic hazards photographs available online as digital images. Visual media provide invaluable pre- and post-event data for natural hazards. Images can be used to illustrate inundation and possible damage or effects. These images are organized by event or hazard type (earthquake, volcano, tsunami, landslide, etc.), along with descriptions and locations. They may be viewed via interactive online maps and are integrated with historic event details. The planning required to achieve collection and dissemination of hazard event data is extensive. After a damaging or fatal event, NGDC begins to collect and integrate data and information from many people and organizations into the hazards databases. Sources of data include the U.S. NOAA Tsunami Warning Centers, the U.S. Geological Survey, the U.S. NOAA National Data Buoy Center, the UNESCO Intergovernmental Oceanographic Commission (IOC), the Smithsonian Institution's Global Volcanism Program, news organizations, etc. 
NGDC then works to promptly distribute data and information for the appropriate audience. For example, when a major tsunami occurs, all of the related tsunami data are combined into one timely resource. NGDC posts a publicly accessible online report which includes: 1) event summary; 2) eyewitness and instrumental recordings from preliminary field surveys; 3) regional historical observations including similar past events and effects; 4) observed water heights and calculated tsunami travel times; and 5) near-field effects. This report is regularly updated to incorporate the most recent news and observations. Providing timely access to authoritative data and information ultimately benefits researchers, state officials, the media and the public.

McCullough, H. L.; Dunbar, P. K.; Varner, J. D.
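The event-selection rule described in this record (keep all tsunamis regardless of intensity; keep earthquakes and eruptions only when they caused fatalities, at least moderate damage, or generated a tsunami) can be sketched as a simple filter. The records and field names below are illustrative, not the NGDC schema.

```python
# Synthetic event records standing in for database rows.
events = [
    {"type": "tsunami", "deaths": 0, "damage": "none", "caused_tsunami": False},
    {"type": "earthquake", "deaths": 0, "damage": "none", "caused_tsunami": False},
    {"type": "earthquake", "deaths": 12, "damage": "moderate", "caused_tsunami": False},
    {"type": "volcano", "deaths": 0, "damage": "none", "caused_tsunami": True},
]

def include(ev):
    """Selection rule paraphrasing the abstract's inclusion criteria."""
    if ev["type"] == "tsunami":          # all tsunamis, regardless of size
        return True
    return (ev["deaths"] > 0
            or ev["damage"] in ("moderate", "severe")
            or ev["caused_tsunami"])

selected = [ev for ev in events if include(ev)]
```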



Integration of Tsunami Analysis Tools into a GIS Workspace – Research, Modeling, and Hazard Mitigation efforts Within NOAA’s Center for Tsunami Research  

Microsoft Academic Search

The National Oceanic and Atmospheric Administration's (NOAA) Center for Tsunami Research (NCTR) uses geospatial data and GIS analysis techniques in support of building an accurate tsunami forecasting system for the US Tsunami Warning Centers. The resulting forecast products can be integrated into applications and visualizations to assess hazard risk and provide mitigation for US coastal communities ranging from small towns...

Nazila Merati; Christopher Chamberlin; Christopher Moore; Vasily Titov; Tiffany C. Vance


Odor mitigation with vegetative buffers: Swine production case study  

Technology Transfer Automated Retrieval System (TEKTRAN)

Vegetative environmental buffers (VEB) are a potentially low cost sustainable odor mitigation strategy, but there is little to no data supporting their effectiveness. Wind tunnel experiments and field monitoring were used to determine the effect VEB had on wind flow patterns within a swine facility....


Mitigation of hazards from future lahars from Mount Merapi in the Krasak River channel near Yogyakarta, central Java  

USGS Publications Warehouse

Procedures for reducing hazards from future lahars and debris flows in the Krasak River channel near Yogyakarta, Central Java, Indonesia, include (1) determining the history of the location, size, and effects of previous lahars and debris flows, and (2) decreasing flow velocities. The first may be accomplished by geologic field mapping along with acquiring information by interviewing local residents, and the second by increasing the cross sectional area of the river channel and constructing barriers in the flow path.

Ege, John R.; Sutikno
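The second measure above, decreasing flow velocity by enlarging the channel cross-section, follows directly from flow continuity: for a fixed discharge Q, mean velocity is v = Q / A. The discharge and areas below are illustrative numbers, not values from the Krasak River study.

```python
# Continuity: doubling the cross-sectional area halves the mean velocity
# for the same discharge.
def mean_velocity(discharge_m3s, area_m2):
    return discharge_m3s / area_m2

Q = 120.0                                     # design discharge, m^3/s (illustrative)
v_before = mean_velocity(Q, area_m2=20.0)     # original channel
v_after = mean_velocity(Q, area_m2=40.0)      # enlarged channel
```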



Using fine-scale fuel measurements to assess wildland fuels, potential fire behavior and hazard mitigation treatments in the southeastern USA.  

SciTech Connect

The inherent spatial and temporal heterogeneity of fuelbeds in forests of the southeastern United States may require fine scale fuel measurements for providing reliable fire hazard and fuel treatment effectiveness estimates. In a series of five papers, an intensive, fine scale fuel inventory from the Savannah River Site in the southeastern United States is used for building fuelbeds and mapping fire behavior potential, evaluating fuel treatment options for effectiveness, and providing a comparative analysis of landscape modeled fire behavior using three different data sources including the Fuel Characteristic Classification System, LANDFIRE, and the Southern Wildfire Risk Assessment. The research demonstrates that fine scale fuel measurements associated with fuel inventories repeated over time can be used to assess broad scale wildland fire potential and hazard mitigation treatment effectiveness in the southeastern USA and similar fire prone regions. Additional investigations will be needed to modify and improve these processes and capture the true potential of these fine scale data sets for fire and fuel management planning.

Ottmar, Roger, D.; Blake, John, I.; Crolly, William, T.



A study on seismicity and seismic hazard for Karnataka State  

NASA Astrophysics Data System (ADS)

This paper presents a detailed study on the seismic pattern of the state of Karnataka and also quantifies the seismic hazard for the entire state. In the present work, historical and instrumental seismicity data for Karnataka (within 300 km from Karnataka political boundary) were compiled and hazard analysis was done based on this data. Geographically, Karnataka forms a part of peninsular India which is tectonically identified as an intraplate region of Indian plate. Due to the convergent movement of the Indian plate with the Eurasian plate, movements are occurring along major intraplate faults resulting in seismic activity of the region and hence the hazard assessment of this region is very important. Apart from referring to seismotectonic atlas for identifying faults and fractures, major lineaments in the study area were also mapped using satellite data. The earthquake events reported by various national and international agencies were collected until 2009. Declustering of earthquake events was done to remove foreshocks and aftershocks. Seismic hazard analysis was done for the state of Karnataka using both deterministic and probabilistic approaches incorporating logic tree methodology. The peak ground acceleration (PGA) at rock level was evaluated for the entire state considering a grid size of 0.05° × 0.05°. The attenuation relations proposed for stable continental shield region were used in evaluating the seismic hazard with appropriate weightage factors. Response spectra at rock level for important Tier II cities and Bangalore were evaluated. The contour maps showing the spatial variation of PGA values at bedrock are presented in this work.

Sitharam, T. G.; James, Naveen; Vipin, K. S.; Raj, K. Ganesha
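The declustering step mentioned in this abstract is commonly done with magnitude-dependent space-time windows (in the spirit of Gardner and Knopoff, 1974). The window function below is a simplified illustration, not the published coefficients or the catalog used in the study.

```python
from dataclasses import dataclass

@dataclass
class Quake:
    t_days: float   # origin time
    x_km: float     # projected epicentre coordinates
    y_km: float
    mag: float

def window(mag):
    """Illustrative (distance_km, time_days) window growing with magnitude."""
    return 10 * 2 ** (mag - 4), 30 * 2 ** (mag - 4)

def decluster(catalog):
    """Keep the largest events; drop events inside a mainshock's window."""
    mainshocks = []
    for q in sorted(catalog, key=lambda e: -e.mag):
        is_dependent = False
        for m in mainshocks:
            d_win, t_win = window(m.mag)
            dist = ((q.x_km - m.x_km) ** 2 + (q.y_km - m.y_km) ** 2) ** 0.5
            if dist <= d_win and abs(q.t_days - m.t_days) <= t_win:
                is_dependent = True   # flagged as foreshock/aftershock
                break
        if not is_dependent:
            mainshocks.append(q)
    return mainshocks
```

A declustered catalog of this kind is then binned onto the hazard grid before the attenuation relations are applied.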



Leak detection, monitoring, and mitigation technology trade study update  

SciTech Connect

This document is a revision and update to the initial report that describes various leak detection, monitoring, and mitigation (LDMM) technologies that can be used to support the retrieval of waste from the single-shell tanks (SST) at the Hanford Site. This revision focuses on the improvements in the technical performance of previously identified and useful technologies, and it introduces new technologies that might prove to be useful.




Highest hazard Lowest hazard  

E-print Network

The U.S. Geological Survey's National Seismic Hazard Maps are the basis for seismic design provisions of building codes, insurance rate structures, earthquake loss studies, and retrofit decisions. The U.S. Geological Survey recently updated the National Seismic Hazard Maps by incorporating new seismic, geologic...

Wilcock, William


NOAA Technical Memorandum ERL PMEL-113 Strategic Implementation Plan for Tsunami Mitigation Projects  

E-print Network

NOAA Technical Memorandum ERL PMEL-113: Strategic Implementation Plan for Tsunami Mitigation Projects, approved by the Mitigation Subcommittee of the National Tsunami Hazard Mitigation Program, April ...



EPA Science Inventory

The first part of this two-part paper discusses radon entry into schools, radon mitigation approaches for schools, and school characteristics (e.g., heating, ventilation, and air-conditioning -- HVAC-- system design and operation) that influence radon entry and mitigation system ...



EPA Science Inventory

The objectives of the Household hazardous Waste Characterization Study (the HHW Study) were to quantify the annual household hazardous waste (HHW) tonnages disposed in Palm Beach County, Florida's (the county) residential solid waste (characterized in this study as municipal soli...


Household hazardous waste quantification, characterization and management in China's cities: a case study of Suzhou.  


A four-stage systematic tracking survey of 240 households was conducted from the summer of 2011 to the spring of 2012 in the Chinese city of Suzhou to determine the characteristics of household hazardous waste (HHW) generated by the city. Factor analysis and a regression model were used to study the major driving forces of HHW generation. The results indicate that the rate of HHW generation was 6.16 (0.16-31.74, 95% CI) g/person/day, which accounted for 2.23% of the household solid waste stream. The major waste categories contributing to total HHW were home cleaning products (21.33%), medicines (17.67%) and personal care products (15.19%). One-way packaging and containers and single-use products accounted for over 80% of total HHW generation, implying a considerable potential to mitigate HHW generation by changing the packaging design and materials used by manufacturing enterprises. Strong correlations were observed between HHW generation (g/person/day) and the driving-force groups of "household structure" and "consumer preferences" (among which the educational level of the household financial manager has the greatest impact). Furthermore, the HHW generation stream in Suzhou suggested the influence of another set of variables, such as local customs and culture, consumption patterns, and urban residential life-style. This study emphasizes that HHW should be categorized at its source (residential households) as an important step toward controlling the HHW hazards of Chinese cities. PMID:25022547

Gu, Binxian; Zhu, Weimo; Wang, Haikun; Zhang, Rongrong; Liu, Miaomiao; Chen, Yangqing; Wu, Yi; Yang, Xiayu; He, Sheng; Cheng, Rong; Yang, Jie; Bi, Jun



A study of shock mitigating materials in a split Hopkinson bar configuration  

SciTech Connect

Sandia National Laboratories (SNL) designs mechanical systems with electronics that must survive high shock environments. These mechanical systems include penetrators that must survive soil, rock, and ice penetration, nuclear transportation casks that must survive transportation environments, and laydown weapons that must survive delivery impact of 125 fps. These mechanical systems contain electronics that may operate during and after the high shock environment and that must be protected from the high shock environments. A study has been started to improve the packaging techniques for the advanced electronics utilized in these mechanical systems because current packaging techniques are inadequate for these more sensitive electronics. In many cases, it has been found that the packaging techniques currently used not only do not mitigate the shock environment but actually amplify the shock environment. An ambitious goal for this packaging study is to avoid amplification and possibly attenuate the shock environment before it reaches the electronics contained in the various mechanical systems. As part of the investigation of packaging techniques, a two-part study of shock mitigating materials is being conducted. This paper reports the first part of the shock mitigating materials study. A study to compare three thicknesses (0.125, 0.250, and 0.500 in.) of seventeen, unconfined materials for their shock mitigating characteristics has been completed with a split Hopkinson bar configuration. The nominal input as measured by strain gages on the incident Hopkinson bar is 50 fps at 100 µs for these tests. It is hypothesized that a shock mitigating material has four purposes: to lengthen the shock pulse, to attenuate the shock pulse, to mitigate high frequency content in the shock pulse, and to absorb energy. Both time domain and frequency domain analyses of the split Hopkinson bar data have been performed to compare the materials' achievement of these purposes.

Bateman, V.I.; Bell, R.G. III; Brown, F.A.; Hansen, N.R. [Sandia National Labs., Albuquerque, NM (United States). Design, Evaluation and Test Technology Center
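The four stated purposes of a shock-mitigating material suggest simple time- and frequency-domain metrics: pulse duration, peak attenuation, and high-frequency content. The half-sine pulses below are synthetic stand-ins for incident and transmitted strain-gage records, not SNL test data.

```python
import numpy as np

fs = 1.0e6                                 # 1 MHz sampling (illustrative)
t = np.arange(0, 400e-6, 1 / fs)

def half_sine(peak, duration, t):
    """Synthetic half-sine strain pulse of given peak and duration."""
    pulse = peak * np.sin(np.pi * t / duration)
    return np.where(t < duration, pulse, 0.0)

incident = half_sine(1500.0, 100e-6, t)     # ~1,500 microstrain, 100 us
transmitted = half_sine(900.0, 180e-6, t)   # attenuated and lengthened

attenuation = transmitted.max() / incident.max()   # < 1 means the pulse shrank

def duration_us(p):
    """Pulse duration at the 10%-amplitude level, in microseconds."""
    above = t[p > 0.1 * p.max()]
    return (above[-1] - above[0]) * 1e6

# Frequency content of the transmitted pulse:
spectrum = np.abs(np.fft.rfft(transmitted))
```

A good mitigator, by the abstract's criteria, shows attenuation < 1, a lengthened 10%-amplitude duration, and reduced high-frequency spectral content.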



Satellite Monitoring of Ash and Sulphur Dioxide for the mitigation of Aviation Hazards: Part II. Validation of satellite-derived Volcanic Sulphur Dioxide Levels.  

NASA Astrophysics Data System (ADS)

The eruption of the Icelandic volcano Eyjafjallajökull in the spring of 2010 turned the attention of both the public and the scientific community to the susceptibility of the European airspace to the outflows of large volcanic eruptions. The ash-rich plume from Eyjafjallajökull drifted towards Europe and caused major disruptions of European air traffic for several weeks, affecting the everyday life of millions of people and with a strong economic impact. This unparalleled situation revealed limitations in the decision making process due to the lack of information on the tolerance to ash of commercial aircraft engines, as well as limitations in the ash monitoring and prediction capabilities. The European Space Agency project Satellite Monitoring of Ash and Sulphur Dioxide for the mitigation of Aviation Hazards was introduced to facilitate the development of an optimal End-to-End System for Volcanic Ash Plume Monitoring and Prediction. This system is based on comprehensive satellite-derived ash plume and sulphur dioxide [SO2] level estimates, as well as a widespread validation using supplementary satellite, aircraft and ground-based measurements. The validation of volcanic SO2 levels extracted from the sensors GOME-2/MetopA and IASI/MetopA is shown here, with emphasis on the total column observed right before, during and after the Eyjafjallajökull 2010 eruptions. Co-located ground-based Brewer Spectrophotometer data extracted from the World Ozone and Ultraviolet Radiation Data Centre, WOUDC, were compared to the different satellite estimates. The findings are presented at length, alongside a comprehensive discussion of future scenarios.

Koukouli, MariLiza; Balis, Dimitris; Dimopoulos, Spiros; Clarisse, Lieven; Carboni, Elisa; Hedelt, Pascal; Spinetti, Claudia; Theys, Nicolas; Tampellini, Lucia; Zehner, Claus
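Validation against co-located Brewer total columns typically reduces to simple comparison statistics such as bias, RMSE, and correlation. The Dobson-unit values below are synthetic, not GOME-2 or IASI retrievals.

```python
import numpy as np

# Synthetic co-located total SO2 columns in Dobson units (DU).
ground = np.array([1.2, 3.5, 8.0, 15.4, 4.1])     # Brewer spectrophotometer
satellite = np.array([1.0, 3.9, 7.1, 14.0, 4.8])  # satellite retrieval

bias = np.mean(satellite - ground)                 # mean difference
rmse = np.sqrt(np.mean((satellite - ground) ** 2)) # scatter about zero
corr = np.corrcoef(ground, satellite)[0, 1]        # linear agreement
```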



Satellite Monitoring of Ash and Sulphur Dioxide for the mitigation of Aviation Hazards: Part I. Validation of satellite-derived Volcanic Ash Levels.  

NASA Astrophysics Data System (ADS)

The 2010 eruption of the Icelandic volcano Eyjafjallajökull attracted the attention of the public and the scientific community to the vulnerability of the European airspace to volcanic eruptions. Major disruptions in European air traffic were observed for several weeks surrounding the two eruptive episodes, which had a strong impact on the everyday life of many Europeans as well as a noticeable economic loss of around 2-3 billion Euros in total. The eruptions made it obvious that decision-making bodies were not properly and promptly informed about the tolerance of commercial aircraft to ash-laden air, and that the ash monitoring and prediction potential was rather limited. After the Eyjafjallajökull eruptions new guidelines for aviation, changing from zero tolerance to newly established ash threshold values, were introduced. Within this spirit, the European Space Agency project Satellite Monitoring of Ash and Sulphur Dioxide for the mitigation of Aviation Hazards called for the creation of an optimal End-to-End System for Volcanic Ash Plume Monitoring and Prediction. This system is based on improved and dedicated satellite-derived ash plume and sulphur dioxide level assessments, as well as an extensive validation using auxiliary satellite, aircraft and ground-based measurements. The validation of volcanic ash levels extracted from the sensors GOME-2/MetopA, IASI/MetopA, MODIS/Terra and MODIS/Aqua is presented in this work, with emphasis on the ash plume height and ash optical depth levels. Co-located aircraft flights, Cloud-Aerosol Lidar and Infrared Pathfinder Satellite Observation [CALIPSO] soundings, as well as European Aerosol Research Lidar Network [EARLINET] measurements, were compared to the different satellite estimates for those two eruptive episodes. The validation results are extremely promising, with most satellite sensors performing quite well, within the estimated uncertainties, against the comparison datasets. 
The findings are extensively presented here and future directions discussed at length.

Koukouli, MariLiza; Balis, Dimitris; Simopoulos, Spiros; Siomos, Nikos; Clarisse, Lieven; Carboni, Elisa; Wang, Ping; Siddans, Richard; Marenco, Franco; Mona, Lucia; Pappalardo, Gelsomina; Spinetti, Claudia; Theys, Nicolas; Tampellini, Lucia; Zehner, Claus



Channelized debris flow hazard mitigation through the use of flexible barriers: a simplified computational approach for a sensitivity analysis.  

NASA Astrophysics Data System (ADS)

A channelized debris flow is usually represented by a mixture of solid particles of various sizes and water, flowing along a laterally confined, inclined, channel-shaped region up to an unconfined area where it slows down and spreads out into a flat-shaped mass. The study of these phenomena is very difficult due to their short duration and unpredictability, the lack of historical data for a given basin, and the complexity of the mechanical phenomena involved. Post-event surveys allow for the identification of some depositional features and provide an indication of the maximum flow height; however, they lack information about the development of the phenomena over time. For this purpose, the monitoring of recursive events has been carried out by several authors. Most of the studies aimed at determining the characteristic features of a debris flow were carried out in artificial channels, where the main variables involved were measured and others were controlled during the tests; however, some uncertainties remained, and other scaled models were developed to simulate the deposition mechanics as well as to analyze the transport mechanics and the energy dissipation. The assessment of the mechanical behavior of protection structures upon impact with the flow, as well as of the energy associated with it, is necessary for the proper design of such structures, which in densely populated areas can avoid victims and limit the destructive effects of such phenomena. In this work a simplified structural model, developed by the authors for the safety assessment of retention barriers against channelized debris flows, is presented, and some parametric cases are interpreted through the proposed approach; this model is developed as a simplified and efficient tool for the verification of the supporting cables and foundations of a flexible debris flow barrier. The present analytical and numerical approach has a different aim than a FEM model. 
Computational experience with FEM modeling of this kind of structure has shown that a large amount of time is needed both for the geometrical setup of the model and for its computation. The large effort required by FEM for this class of problems limits the practical possibility of investigating different geometrical configurations, load schemes, etc.; FEM is suitable for representing a specific configuration but does not allow the influence of parameter changes to be investigated. On the other hand, parametric analyses are common practice in geotechnical design for the reasons quoted above. Consequently, the authors felt the need to develop a simplified method (not yet available, to our knowledge) that allows several parametric analyses to be performed in a limited time. It should be noted that, in this paper, no considerations regarding the mechanical and physical behavior of debris flows are presented; the proposed model requires the input of parameters that must be acquired through a preliminary characterization of the design event. However, adopting the proposed tool, the designer will be able to perform sensitivity analyses that will help quantify the influence of parameter variability, as commonly occurs in geotechnical design.

Segalini, Andrea; Ferrero, Anna Maria; Brighenti, Roberto
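A closed-form check of the kind such a simplified model enables: a supporting cable of span L with mid-span sag f under a uniform line load q carries a horizontal force H = qL^2/(8f) and a maximum tension T = sqrt(H^2 + (qL/2)^2) in the parabolic-cable approximation. The load, span, and sag values below are illustrative, not design values from the paper.

```python
import math

def cable_tension(q_kN_m, span_m, sag_m):
    """Maximum cable tension (kN) for a parabolic cable under uniform load."""
    H = q_kN_m * span_m ** 2 / (8 * sag_m)   # horizontal component
    V = q_kN_m * span_m / 2                  # vertical reaction at support
    return math.hypot(H, V)

# Parametric sweep: sensitivity of cable tension to the allowed sag.
sweep = {sag: cable_tension(q_kN_m=40.0, span_m=15.0, sag_m=sag)
         for sag in (0.5, 1.0, 2.0)}
```

Sweeps like this, run over load, span, and sag, are exactly the kind of parametric analysis the abstract argues is impractical with a full FEM model.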



Interventionist and participatory approaches to flood risk mitigation decisions: two case studies in the Italian Alps  

NASA Astrophysics Data System (ADS)

Flood risk mitigation decisions pose key challenges not only from a technical but also from a social, economic and political viewpoint. There is an increasing demand for improving the quality of these processes by including different stakeholders - and especially by involving the local residents in the decision making process - and by guaranteeing the actual improvement of local social capacities during and after the decision making. In this paper we analyse two case studies of flood risk mitigation decisions, Malborghetto-Valbruna and Vipiteno-Sterzing, in the Italian Alps. In both of them, mitigation works have been completed or planned, yet following completely different approaches especially in terms of responses of residents and involvement of local authorities. In Malborghetto-Valbruna an 'interventionist' approach (i.e. leaning towards a top down/technocratic decision process) was used to make decisions after the flood event that affected the municipality in the year 2003. In Vipiteno-Sterzing, a 'participatory' approach (i.e. leaning towards a bottom-up/inclusive decision process) was applied: decisions about risk mitigation measures were made by submitting different projects to the local citizens and by involving them in the decision making process. The analysis of the two case studies presented in the paper is grounded on the results of two research projects. Structured and in-depth interviews, as well as questionnaire surveys were used to explore residents' and local authorities' orientations toward flood risk mitigation. Also a SWOT analysis (Strengths, Weaknesses, Opportunities and Threats) involving key stakeholders was used to better understand the characteristics of the communities and their perception of flood risk mitigation issues. The results highlight some key differences between interventionist and participatory approaches, together with some implications of their adoption in the local context. 
Strengths and weaknesses of the two approaches, as well as key challenges for the future are also discussed.

Bianchizza, C.; Del Bianco, D.; Pellizzoni, L.; Scolobig, A.



Versatile gas gun target assembly for studying blast wave mitigation in materials  

NASA Astrophysics Data System (ADS)

Traumatic brain injury (TBI) has become a serious problem for military personnel returning from recent conflicts. This has increased interest in investigating blast mitigating materials for use in helmets. In this paper we describe a new versatile target assembly that is used with an existing gas gun for studying these materials.

Bartyczak, S.; Mock, W., Jr.




EPA Science Inventory

The paper discusses a case study of radon diagnostics and mitigation performed by EPA in a New York State school building. esearch focused on active subslab depressurization (ASD) in the basement and, to a lesser degree, the potential for radon reduction in the basement and slab-...


Integrating geologic fault data into tsunami hazard studies  

NASA Astrophysics Data System (ADS)

We present the realization of a fault-source data set designed to become the starting point in regional-scale tsunami hazard studies. Our approach focuses on the parametric fault characterization in terms of geometry, kinematics, and assessment of activity rates, and includes a systematic classification into six justification levels of the epistemic uncertainty related to the existence and behaviour of fault sources. We set up a case study in the central Mediterranean Sea, an area at the intersection of the European, African, and Aegean plates, characterized by a complex and debated tectonic structure and where several tsunamis occurred in the past. Using tsunami scenarios of maximum wave height due to crustal earthquakes (Mw=7) and subduction earthquakes (Mw=7 and Mw=8), we illustrate first-order consequences of critical choices in addressing the seismogenic and tsunamigenic potentials of fault sources. Although tsunamis generated by Mw=8 earthquakes predictably affect the entire basin, the impact of tsunamis generated by Mw=7 earthquakes on either crustal or subduction fault sources can still be strong at many locales. Such scenarios show that the location and orientation of faults relative to target coastlines, coupled with bathymetric features, argue against preselecting fault sources without addressing their possible impact on hazard analysis results.

Basili, R.; Tiberti, M. M.; Kastelic, V.; Romano, F.; Piatanesi, A.; Selva, J.; Lorito, S.



A study of shock mitigating materials in a split Hopkinson bar configuration. Phase 1  

SciTech Connect

Sandia National Laboratories (SNL) designs mechanical systems with electronics that must survive high shock environments. These mechanical systems include penetrators that must survive soil, rock, and ice penetration, nuclear transportation casks that must survive transportation environments, and laydown weapons that must survive delivery impact of 125 fps. These mechanical systems contain electronics that may operate during and after the high shock environment and that must be protected from the high shock environments. A study has been started to improve the packaging techniques for the advanced electronics utilized in these mechanical systems because current packaging techniques are inadequate for these more sensitive electronics. In many cases, it has been found that the packaging techniques currently used not only do not mitigate the shock environment but actually amplify the shock environment. An ambitious goal for this packaging study is to avoid amplification and possibly attenuate the shock environment before it reaches the electronics contained in the various mechanical systems. As part of the investigation of packaging techniques, a two-phase study of shock mitigating materials is being conducted. The purpose of the first phase reported here is to examine the performance of a joint that consists of shock mitigating material sandwiched in between steel and to compare the performance of the shock mitigating materials. A split Hopkinson bar experimental configuration simulates this joint and has been used to study the shock mitigating characteristics of seventeen, unconfined materials. The nominal input for these tests is an incident compressive wave with 50 fps peak (1,500 µε peak) amplitude and a 100 µs duration (measured at 10% amplitude).

Bateman, V.I.; Brown, F.A.; Hansen, N.R.



Methodological Issues In Forestry Mitigation Projects: A CaseStudy Of Kolar District  

SciTech Connect

There is a need to assess climate change mitigation opportunities in the forest sector in India in the context of methodological issues such as additionality, permanence, leakage, measurement and baseline development in formulating forestry mitigation projects. A case study of a forestry mitigation project in semi-arid community grazing lands and farmlands in Kolar district of Karnataka was undertaken with regard to baseline and project scenario development, estimation of carbon stock change in the project, leakage estimation and assessment of cost-effectiveness of mitigation projects. Further, the transaction costs to develop the project, and the environmental and socio-economic impact of the mitigation project, were assessed. The study shows the feasibility of establishing baselines and project C-stock changes. Since the area has low or insignificant biomass, leakage is not an issue. The overall mitigation potential in Kolar for a total area of 14,000 ha under various mitigation options is 278,380 tC at a rate of 20 tC/ha for the period 2005-2035, which is approximately 0.67 tC/ha/yr inclusive of harvest regimes under short rotation and long rotation mitigation options. The transaction cost for baseline establishment is less than a rupee/tC and for project scenario development is about Rs. 1.5-3.75/tC. The project enhances biodiversity and the socio-economic impact is also significant.

Ravindranath, N.H.; Murthy, I.K.; Sudha, P.; Ramprasad, V.; Nagendra, M.D.V.; Sahana, C.A.; Srivathsa, K.G.; Khan, H.
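The headline figures in this abstract are internally consistent, as a quick arithmetic check shows:

```python
# Back out the per-hectare and annual rates from the reported totals.
area_ha = 14_000
total_tC = 278_380
rate_tC_per_ha = total_tC / area_ha       # ~19.9 tC/ha, i.e. the "20 tC/ha"
years = 2035 - 2005                        # 30-year project period
annual_rate = rate_tC_per_ha / years       # ~0.66 tC/ha/yr, i.e. the "0.67"
```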




Microsoft Academic Search

Seismic hazard analyses of Bangalore city have been done based on the local soil conditions using geotechnical data. Seismic hazard analysis parameters are evaluated in terms of amplification rating, peak ground acceleration and factor of safety against liquefaction by considering a hypothetical earthquake. Amplification rating is done based on soil profiles using the Finn (1991) recommendation. The peak horizontal ground

T. G. Sitharam; P. Anbazhagan; G. U. Mahesh; K. Bharathi; P. Nischala Reddy
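The factor of safety against liquefaction mentioned above is conventionally FS = CRR / CSR, with the Seed-Idriss simplified cyclic stress ratio CSR = 0.65 (a_max/g) (sigma_v / sigma_v') r_d. The CRR, stresses, and depth-reduction factor below are illustrative inputs, not values from the Bangalore study.

```python
def csr(a_max_g, sigma_v, sigma_v_eff, r_d):
    """Seed-Idriss simplified cyclic stress ratio."""
    return 0.65 * a_max_g * (sigma_v / sigma_v_eff) * r_d

def factor_of_safety(crr, a_max_g, sigma_v, sigma_v_eff, r_d=0.95):
    """FS = cyclic resistance ratio / cyclic stress ratio."""
    return crr / csr(a_max_g, sigma_v, sigma_v_eff, r_d)

# Illustrative layer: total stress 100 kPa, effective stress 60 kPa.
fs = factor_of_safety(crr=0.25, a_max_g=0.15, sigma_v=100.0,
                      sigma_v_eff=60.0, r_d=0.95)   # FS > 1: not expected to liquefy
```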


Countermeasures to hazardous chemicals  

SciTech Connect

Recent major incidents involving the airborne release of hazardous chemicals have led to this study of the effective strategies that must be developed to prevent and to deal with emergencies. The comprehensive study for FEMA and the various other entities required that the project be divided into three tasks. In Task 1, the nature of the threat from incidents involving airborne hazardous chemicals is described; based on available databases, a new methodology for ranking chemical hazards is proposed and tested. In Task 2, the existing responsibilities of federal, state, and local agencies, as well as the part played by the private sector, are surveyed; legislation at all levels of government is reviewed, and in light of this analysis the role of FEMA is examined; institutional options for new and existing approaches to reducing risk are reevaluated, and recommendations are made for these approaches. In Task 3, technical options are discussed in light of the most hazardous situations, and recommendations are made for action or research where needed, with emphasis on new and emerging technologies in the area. Recommendations are offered regarding actions which would improve preparation, training, mitigation, and response on the part of FEMA to the release of hazardous chemicals. 180 refs., 13 figs., 34 tabs.

Holmes, J.M.; Byers, C.H.



The 5 key questions coping with risks due to natural hazards, answered by a case study  

NASA Astrophysics Data System (ADS)

Based on Maslow's hierarchy of needs, human endeavours concern primarily existential needs, and consequently safeguarding against both natural and man-made threats. Subsequent needs concern realizing opportunities in a variety of fields, such as economics and many others. The 5 crucial questions are the same as for coping with risks due to natural hazards specifically. These 5 key questions are: I) What is the impact as a function of space and time? II) What protection measures comply with the general opinion, and how much do they mitigate the threat? III) How can the loss be adequately quantified and monetized? IV) What budget for prevention, and what reserves for restoration and compensation, are to be planned? V) Which mix of measures and allocation of resources is sustainable, and thus optimal? The 5 answers, exemplified by a case study concerning the sustainable management of the risk due to debris flows of the Enterbach / Inzing / Tirol / Austria, are as follows: I) The impact, created by the propagation of both flooding and sedimentation, has been forecast by modeling (numerical simulation) the 30-, 50-, 100-, 150-, 300- and 1000-year debris flows. The input was specified by detailed studies in meteorology, precipitation and runoff; in geology, hydrogeology, geomorphology and slope stability; in hydraulics, sediment transport and debris flow; and in forestry, agriculture and the development of communal settlement and infrastructure. All investigations were performed according to the ETAlp method (Erosion and Transport in Alpine systems). ETAlp has been developed to support sustainable development in alpine areas and has been evaluated by the research project "nab", within the context of the EU Interreg IIIb projects. II) The risk mitigation measures of concern are hydraulic on the one hand and forestry-related on the other.
Such risk management is evaluated according to sustainability, meaning economic, ecological and social, in short "triple", compatibility. 100% protection against the 100-year event proves to be the optimal degree of protection. Consequently, impacts statistically less frequent than once in 100 years are accepted as the remaining risk. Such floods and debris flows cause a fan of propagation which is substantially reduced by the protection measures against the 100-year event. III) The "triple loss distribution" shows the monetized triple damage as a function of its probability. The monetization is performed through the social process of participation of the impacted interests or, failing that, by official experts acting in representation. The triple loss distribution rises over time, mainly due to the rise in density and value of precious goods. A comparison of the distributions of the triple loss and the triple risk, which behave in opposite directions, is shown and explained within the project. IV) The recommended yearly reserves to be stocked for restoration and compensation of losses caused by debris flows amount to €70,000 according to the "technical risk premium" approach. The discrepancy with the much higher amounts given by the common approaches of natural hazards engineering is discussed. V) The sustainable mix of hydraulic and forestry measures with the highest return on investment at the lowest risk is derived according to portfolio theory (Markowitz), based on the triple value curves generated by the TripelBudgetierung® method. Accordingly, the optimum mix of measures to protect the community of Inzing against the natural hazard of debris flows, and thus the most efficient allocation of resources, equals 2/3 for hydraulic and 1/3 for forestry measures. In detail, the results of the research pilot project "Nachhaltiges Risikomanagement - Enterbach / Inzing / Tirol / Austria" may be consulted under
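The portfolio step in answer V can be illustrated with a minimal two-asset Markowitz sketch. All numbers below (variances, zero covariance, resulting weights) are invented for illustration; they are not the study's triple value curves or TripelBudgetierung® inputs.

```python
# Illustrative two-asset Markowitz mix: hydraulic vs. forestry measures.
# Variances of the risk-reduction "returns" are hypothetical; the two
# measure classes are assumed uncorrelated (cov_ab = 0).

def min_variance_weights(var_a, var_b, cov_ab):
    """Weights of the minimum-variance two-asset portfolio (sum to 1)."""
    denom = var_a + var_b - 2.0 * cov_ab
    w_a = (var_b - cov_ab) / denom
    return w_a, 1.0 - w_a

w_hydraulic, w_forestry = min_variance_weights(var_a=0.09, var_b=0.16, cov_ab=0.0)
print(round(w_hydraulic, 2), round(w_forestry, 2))  # -> 0.64 0.36
```

With these assumed inputs the minimum-variance mix happens to land near the study's reported 2/3 hydraulic, 1/3 forestry allocation.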

Hardegger, P.; Sausgruber, J. T.; Schiegg, H. O.




EPA Science Inventory

The objectives of the Household Hazardous Waste Characterization Study (the HHW Study) were to: 1) Quantify the annual household hazardous waste (HHW) tonnages disposed in Palm Beach County, Florida's (the County) residential solid waste (characterized in this study as municipal s...


Geospatial Approach on Landslide Hazard Zonation Mapping Using Multicriteria Decision Analysis: A Study on Coonoor and Ooty, Part of Kallar Watershed, The Nilgiris, Tamil Nadu  

NASA Astrophysics Data System (ADS)

Landslides are among the critical natural phenomena that frequently lead to serious problems in hilly areas, resulting in loss of human life and property as well as severe damage to natural resources. The local geology, with steep slopes coupled with high-intensity rainfall and unplanned human activities, causes many landslides in this region. The study area attracts many tourists throughout the year, so it must be considered for preventive measures. Geospatial multicriteria decision analysis (MCDA) is increasingly used for landslide vulnerability and hazard zonation mapping, as it enables the integration of different data layers with different levels of uncertainty. In the present study, the analytic hierarchy process (AHP) method was used to prepare landslide hazard zones of Coonoor and Ooty, part of the Kallar watershed, The Nilgiris, Tamil Nadu. The study was carried out using remote sensing data, field surveys and geographic information system (GIS) tools. Ten factors that influence landslide occurrence were considered: elevation, slope aspect, slope angle, drainage density, lineament density, soil, precipitation, land use/land cover (LULC), distance from road, and NDVI. These factor layers were extracted from the various related spatial data. The factors were evaluated, and then individual factor weights and class weights were assigned to each of the related factors. The Landslide Hazard Zone Index (LHZI) was calculated using the MCDA technique based on the weights and ratings given by the AHP method. The final cumulative map of the study area was categorized into four hazard zones, classified as zones I to IV.
3.56% of the area falls in hazard zone IV, followed by 48.19% in zone III, 43.63% in zone II, and 4.61% in hazard zone I. The resulting hazard zone map and the land use/land cover map were then overlaid to check hazard status, and the existing inventory of known landslides within the study area was compared with the resulting vulnerability and hazard zone maps. The landslide hazard zonation map is useful for landslide hazard prevention, mitigation, and improvement to society, and for proper planning of land use and construction in the future.
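The AHP weighting step behind the LHZI can be sketched as follows. The three factors and Saaty-style pairwise judgments below are hypothetical (the study used ten factors), and the priority vector is approximated by averaging normalized columns rather than computing an exact eigenvector.

```python
# Sketch of AHP factor weighting for MCDA hazard zonation.
# Pairwise matrix entries are hypothetical Saaty-scale judgments over three
# assumed factors (e.g., slope angle, precipitation, land use).

def ahp_weights(matrix):
    """Approximate the AHP priority vector by averaging normalized columns."""
    n = len(matrix)
    col_sums = [sum(row[j] for row in matrix) for j in range(n)]
    norm = [[matrix[i][j] / col_sums[j] for j in range(n)] for i in range(n)]
    return [sum(norm[i]) / n for i in range(n)]

pairwise = [
    [1.0, 3.0, 5.0],   # factor 1 judged 3x factor 2, 5x factor 3
    [1/3, 1.0, 3.0],
    [1/5, 1/3, 1.0],
]
weights = ahp_weights(pairwise)
print([round(w, 3) for w in weights])  # weights sum to 1

# A cell's hazard index is then the weighted sum of its factor ratings:
ratings = [4, 2, 3]  # hypothetical class ratings for one map cell
lhzi = sum(w * r for w, r in zip(weights, ratings))
```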

Rahamana, S. Abdul; Aruchamy, S.; Jegankumar, R.



The critical need for moderate to high resolution thermal infrared data for volcanic hazard mitigation and process monitoring from the micron to the kilometer scale  

NASA Astrophysics Data System (ADS)

The use of satellite thermal infrared (TIR) data to rapidly detect and monitor transient thermal events such as volcanic eruptions commonly relies on datasets with coarse spatial resolution (1.0 - 8.0 km) and high temporal resolution (minutes to hours). However, the growing need to extract physical parameters at meter to sub-meter scales requires data with improved spectral and spatial resolution. Current orbital systems such as the Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) and the Landsat Enhanced Thematic Mapper Plus (ETM+) can provide TIR data ideal for this type of scientific analysis, hazard risk assessment, and smaller-scale monitoring, but at the expense of rapid repeat observations. A potential solution to this apparent conflict is to combine the spatial and temporal scales of TIR data in order to provide the benefits of rapid detection together with the potential of detailed science return. Such a fusion is now in place using ASTER data collected in the north Pacific region to monitor the Aleutian and Kamchatka arcs. However, this approach of cross-instrument/cross-satellite monitoring is in jeopardy given the lack of planned moderate-resolution TIR instruments to follow ETM+ and ASTER. This data collection program is also being expanded globally, and was used in 2006 to assist in the response to and monitoring of the volcanic crisis at Merapi Volcano in Indonesia. Merapi Volcano is one of the most active volcanoes in the country and lies in central Java, north of the densely populated city of Yogyakarta. Pyroclastic flows and lahars are common following the growth and collapse of the summit lava dome. These flows can be fatal and were the major hazard concern during a period of renewed activity beginning in April 2006. Lava at the surface was confirmed on 25 April, and ASTER was tasked with an urgent request observation, subsequently collecting data on 26 April (daytime) and 28 April (nighttime).
The TIR revealed thermally-elevated pixels (max = 25.9 C) clustered near the summit with a lesser anomaly (max = 15.5 C) approximately 650 m to the southwest and down slope from the summit. Such small-scale and low-grade thermal features confirmed the increased activity state of the volcano and were only made possible with the moderate spatial, spectral, and radiometric resolution of ASTER. ASTER continued to collect data for the next 12 weeks tracking the progress of large scale pyroclastic flows, the growth of the lava dome, and the path of ash-rich plumes. Data from these observations were reported world-wide and used for evacuation and hazard planning purposes. With the pending demise of such TIR data from orbit, research is also focused on the use of handheld TIR instruments such as the forward-looking infrared radiometer (FLIR) camera. These instruments provide the highest spatial resolution in-situ TIR data and have been used to observe numerous volcanic phenomena and quantitatively model others (e.g., the rise of the magma body preceding the eruption of Mt. St. Helens Volcano; the changes on the lava dome at Bezymianny Volcano; the behavior of basalt crusts during pahoehoe flow inflation). Studies such as these confirm the utility and importance of future moderate to high resolution TIR data in order to understand volcanic processes and their accompanying hazards.
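The detection of thermally elevated pixels against a cooler background can be sketched as a simple statistical threshold. The toy temperature grid and the one-sigma cutoff below are invented; real ASTER processing additionally involves radiometric calibration, emissivity and atmospheric correction.

```python
# Flag "thermally elevated" pixels: anything more than k standard deviations
# above the scene mean is treated as an anomaly. Grid values are invented,
# loosely echoing the 25.9 C and 15.5 C maxima quoted in the abstract.
from statistics import mean, stdev

def thermal_anomalies(grid, k=1.0):
    pixels = [t for row in grid for t in row]
    mu, sigma = mean(pixels), stdev(pixels)
    return [(i, j) for i, row in enumerate(grid)
            for j, t in enumerate(row) if t > mu + k * sigma]

scene = [
    [4.0, 4.2, 3.9, 4.1],
    [4.0, 25.9, 4.2, 4.0],   # strong summit hot pixel
    [4.1, 4.0, 15.5, 3.8],   # lesser anomaly down slope
]
print(thermal_anomalies(scene))  # -> [(1, 1), (2, 2)]
```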

Ramsey, M. S.



Hazardous Drinking-Related Characteristics of Depressive Disorders in Korea: The CRESCEND Study.  


This study aimed to identify clinical correlates of hazardous drinking in a large cohort of Korean patients with depression. We recruited a total of 402 depressed patients aged over 18 yr from the Clinical Research Center for Depression (CRESCEND) study in Korea. Patients' drinking habits were assessed using the Korean Alcohol Use Disorder Identification Test (AUDIT-K). Psychometric scales, including the HAMD, HAMA, BPRS, CGI-S, SSI-Beck, SOFAS, and WHOQOL-BREF, were used to assess depression, anxiety, overall psychiatric symptoms, global severity, suicidal ideation, social functioning, and quality of life, respectively. We compared demographic and clinical features and psychometric scores between patients with and without hazardous drinking behavior after adjusting for the effects of age and sex. We then performed binary logistic regression analysis to identify independent correlates of hazardous drinking in the study population. Our results revealed that hazardous drinking was associated with current smoking status, history of attempted suicide, greater psychomotor retardation, suicidal ideation, weight loss, and lower hypochondriasis compared with non-hazardous drinking. The regression model also demonstrated that more frequent smoking, higher levels of suicidal ideation, and lower levels of hypochondriasis were independent correlates of hazardous drinking in depressed patients. In conclusion, depressed patients who are hazardous drinkers experience more severe symptoms and a greater burden of illness than non-hazardous drinkers. In Korea, screening depressed patients for signs of hazardous drinking could help identify subjects who may benefit from comprehensive therapeutic approaches. PMID:25552886
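The binary logistic regression used to isolate independent correlates can be sketched with a tiny gradient-descent fit. The single predictor and the synthetic outcomes below are purely illustrative, not CRESCEND data.

```python
# Minimal one-predictor logistic regression fitted by gradient descent.
# x = hypothetical suicidal-ideation score, y = hazardous drinker (0/1).
import math

def fit_logistic(xs, ys, lr=0.1, steps=5000):
    w, b = 0.0, 0.0
    n = len(xs)
    for _ in range(steps):
        gw = gb = 0.0
        for x, y in zip(xs, ys):
            p = 1.0 / (1.0 + math.exp(-(w * x + b)))  # predicted probability
            gw += (p - y) * x
            gb += (p - y)
        w -= lr * gw / n
        b -= lr * gb / n
    return w, b

scores = [0, 1, 2, 3, 4, 5, 6, 7]
drinker = [0, 0, 0, 0, 1, 1, 1, 1]   # higher score -> more likely hazardous
w, b = fit_logistic(scores, drinker)
print(w > 0)  # a positive coefficient mirrors the reported association
```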

Park, Seon-Cheol; Lee, Sang Kyu; Oh, Hong Seok; Jun, Tae-Youn; Lee, Min-Soo; Kim, Jae-Min; Kim, Jung-Bum; Yim, Hyeon-Woo; Park, Yong Chon



Volcanic hazard studies for the Yucca Mountain project  

SciTech Connect

Volcanic hazard studies are ongoing to evaluate the risk of future volcanism with respect to siting of a repository for disposal of high-level radioactive waste at the Yucca Mountain site. Seven Quaternary basaltic volcanic centers are located a minimum distance of 12 km and a maximum distance of 47 km from the outer boundary of the exploration block. The conditional probability of disruption of a repository by future basaltic volcanism is bounded by the range of 10^-8 to 10^-10 yr^-1. These values are currently being reexamined based on new developments in the understanding of the evolution of small-volume basaltic volcanic centers, including: (1) many, perhaps most, of the volcanic centers exhibit brief periods of eruptive activity separated by longer periods of inactivity; (2) the centers may be active for time spans exceeding 10^5 yr; (3) there is a decline in the volume of eruptions of the centers through time; and (4) small-volume eruptions occurred at two of the Quaternary centers during latest Pleistocene or Holocene time. We classify the basalt centers as polycyclic, and distinguish them from polygenetic volcanoes. Polycyclic volcanism is characterized by small-volume, episodic eruptions of magma of uniform composition over time spans of 10^3 to 10^5 yr. Magma eruption rates are low and the time between eruptions exceeds the cooling time of the magma volumes. 25 refs., 2 figs.
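Under the usual Poisson assumption, the bounded annual disruption rates translate into probabilities over a period of interest via P = 1 - exp(-λt). The 10,000-yr horizon below is a hypothetical illustration, not a figure from the abstract.

```python
# Poisson conversion of an annual disruption rate to a probability over a
# regulatory period. The 10,000-yr horizon is assumed for illustration.
import math

def disruption_probability(annual_rate, years):
    return 1.0 - math.exp(-annual_rate * years)

for rate in (1e-8, 1e-10):   # bounding rates from the study
    p = disruption_probability(rate, 1e4)
    print(f"{rate:.0e}/yr over 10,000 yr -> P = {p:.1e}")
```

For rates this small, P ≈ λt, so the bounds map almost linearly onto 10^-4 to 10^-6 over the assumed horizon.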

Crowe, B.; Turrin, B.; Wells, S.; Perry, F.; McFadden, L.; Renault, C.E.; Champion, D.; Harrington, C.




SciTech Connect

An air-ingress accident following a pipe break is considered a critical event for a very high temperature gas-cooled reactor (VHTR). Following helium depressurization, it is anticipated that, unless countermeasures are taken, air will enter the core through the break, leading to oxidation of the in-core graphite structure. Thus, without mitigation features, this accident might lead to severe exothermic chemical reactions between graphite and oxygen. Under extreme circumstances, a loss of core structural integrity may occur along with excessive release of radiological inventory. Idaho National Laboratory, under the auspices of the U.S. Department of Energy, is performing research and development (R&D) that focuses on key phenomena important during challenging scenarios that may occur in the VHTR. Phenomena Identification and Ranking Table (PIRT) studies to date have identified the air-ingress event, following on the heels of a VHTR depressurization, as very important (Oh et al. 2006, Schultz et al. 2006). Consequently, the development of advanced air-ingress-related models and verification and validation (V&V) requirements are part of the experimental validation plan. This paper discusses various air-ingress mitigation concepts applicable to VHTRs. The study begins by identifying important factors (or phenomena) associated with the air-ingress accident using a root-cause analysis. By preventing the main causes of the important events identified in the root-cause diagram, basic air-ingress mitigation ideas can be conceptually derived. The main concepts include (1) preventing structural degradation of the graphite supporters; (2) preventing local stress concentration in the supporter; (3) preventing graphite oxidation; (4) preventing air ingress; (5) preventing density-gradient-driven flow; (6) preventing fluid density gradients; (7) preventing fluid temperature gradients; and (8) preventing high temperatures.
Based on the basic concepts listed above, various air-ingress mitigation methods are proposed in this study. Among them, the following two mitigation ideas are extensively investigated using computational fluid dynamics (CFD) codes: (1) helium injection in the lower plenum, and (2) a reactor enclosure opened at the bottom. The main idea of the helium injection method is to displace air in the core and the upper part of the lower plenum by buoyancy force. This method reduces graphite oxidation damage in the most severely affected locations inside the reactor. To validate this method, CFD simulations are addressed here. A simple 2-D CFD model was developed based on the GT-MHR 600MWt design. The simulation results showed that the helium displaces the air flowing into the core and significantly reduces the air concentration in the core and bottom reflector, potentially protecting against oxidation damage. According to the simulation results, even a small helium flow was sufficient to remove air from the core, successfully mitigating the air ingress. The idea of a reactor enclosure with an opening at the bottom changes the overall air-ingress mechanism from natural convection to molecular diffusion. This method can be applied to the current system through some design modification of the reactor cavity. To validate this concept, this study also uses CFD simulations based on a simplified 2-D geometry. The simulation results showed that the enclosure open at the bottom can successfully mitigate air ingress into the reactor even after the onset of natural circulation.
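The benefit of replacing natural convection with molecular diffusion can be seen in a back-of-envelope timescale comparison: diffusive transport over a length L scales as L²/D, versus L/u for convective transport. The length, diffusivity and velocity below are rough assumed values, not from the INL CFD study.

```python
# Order-of-magnitude comparison of air transport timescales.
# L, D and u are assumed round numbers, not study inputs.

def diffusion_time(length_m, diffusivity_m2s):
    return length_m ** 2 / diffusivity_m2s   # t ~ L^2 / D

def convection_time(length_m, velocity_ms):
    return length_m / velocity_ms            # t ~ L / u

L = 10.0    # assumed flow-path length, m
D = 2e-5    # rough gas-phase molecular diffusivity, m^2/s
u = 0.1     # modest assumed natural-circulation velocity, m/s

t_diff = diffusion_time(L, D)    # ~5e6 s, i.e. weeks
t_conv = convection_time(L, u)   # ~100 s
print(t_diff / t_conv)
```

Even with generous assumptions, diffusion is slower by several orders of magnitude, which is why closing off the convection path delays graphite oxidation.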

Chang H. Oh



Development, Implementation, and Pilot Evaluation of a Model-Driven Envelope Protection System to Mitigate the Hazard of In-Flight Ice Contamination on a Twin-Engine Commuter Aircraft  

NASA Technical Reports Server (NTRS)

Fatal loss-of-control accidents have been directly related to in-flight airframe icing. The prototype system presented in this report directly addresses the need for real-time onboard envelope protection in icing conditions. The combination of prior information and real-time aerodynamic parameter estimation is shown to provide sufficient information for determining safe limits of the flight envelope during in-flight icing encounters. The Icing Contamination Envelope Protection (ICEPro) system was designed and implemented to identify degradations in airplane performance and flying qualities resulting from ice contamination and to provide safe flight-envelope cues to the pilot. The utility of the ICEPro system for mitigating a potentially hazardous icing condition was evaluated by 29 pilots using the NASA Ice Contamination Effects Flight Training Device. Results showed that real-time assessment cues were effective in reducing the number of potentially hazardous upset events and in lessening exposure to loss of control following an incipient upset condition. Pilot workload with the added ICEPro displays was not measurably affected, but pilot opinion surveys showed that real-time cueing greatly improved their awareness of a hazardous aircraft state. The performance of the ICEPro system was further evaluated under various levels of sensor noise and atmospheric turbulence.

Martos, Borja; Ranaudo, Richard; Norton, Billy; Gingras, David; Barnhart, Billy



Assessment of Ground Motion Variability and Its Effects on Seismic Hazard Analysis: A Case Study  

E-print Network

Probabilistic seismic hazard analysis (PSHA) generally relies on the basic assumption that observed ground motion and its variability at the considered sites can be modelled by the selected ground motion prediction equations (GMPEs).
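The PSHA rate calculation that such GMPE assumptions feed into can be sketched as a toy example: the annual exceedance rate sums, over earthquake scenarios, the activity rate times the probability that lognormally scattered ground motion exceeds the target level. The scenario rates, median PGAs and sigma below are invented for illustration.

```python
# Toy PSHA: annual rate of exceeding a PGA level, assuming a lognormal
# GMPE scatter (sigma_ln) about each scenario's median prediction.
import math

def normal_cdf(z):
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def exceedance_rate(a_g, scenarios, sigma_ln=0.6):
    """scenarios: list of (annual_rate, median_pga_g) pairs, all assumed."""
    rate = 0.0
    for nu, median in scenarios:
        z = (math.log(a_g) - math.log(median)) / sigma_ln
        rate += nu * (1.0 - normal_cdf(z))  # P(ln PGA > ln a | scenario)
    return rate

# Two hypothetical scenarios affecting the site.
scenarios = [(0.01, 0.10), (0.001, 0.30)]
print(f"{exceedance_rate(0.2, scenarios):.2e}")  # annual rate of PGA > 0.2 g
```

Changing sigma_ln in this sketch shows directly how assumed ground-motion variability inflates or deflates the computed hazard, which is the sensitivity the case study examines.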

Paris-Sud XI, Université de


The asteroid impact threat: instrumentation for mitigation precursor and demo missions, a study from the NEOShield project  

NASA Astrophysics Data System (ADS)

The NEOShield project [1], started in January 2012, has been funded by the European Union for a period of 3.5 years. The primary aim of the project is to study in detail the three most promising techniques to mitigate the asteroid impact risk: the kinetic impactor, blast deflection, and the gravity tractor, and to devise feasible demonstration missions. NEOShield also aims to address the issue of a still-missing international agreement on how to deal with the impact threat and how to organize, prepare, and implement mitigation plans. Within the NEOShield consortium, LESIA is the leading institute for the physical characterization of near-Earth objects (NEOs). We are currently studying the appropriate instrumentation for both mitigation precursor missions and mitigation demonstration missions.

Perna, D.; Barucci, M. A.; Fulchignoni, M.; Fornasier, S.



Factors in Perception of Tornado Hazard: An Exploratory Study.  

ERIC Educational Resources Information Center

Administered questionnaire on tornado hazard to 142 adults. Results indicated that subject's gender and education level were best predictors of perceived probability of tornado recurrence; that ratings of severity of potential damage were related to education level; and that gender accounted for significant percentage of variance in anxiety…

de Man, Anton; Simpson-Housley, Paul




EPA Science Inventory

The biological effects of hazardous substances in the environment are influenced by climate, physiography, and biota. These factors interact to determine the transport and fate of chemicals, but are difficult to model accurately except for small areas with a large data base. The ...


Seismic hazard studies for the High Flux Beam Reactor at Brookhaven National Laboratory  

SciTech Connect

This paper presents the results of a calculation to determine the site-specific seismic hazard appropriate for the deep soil site at Brookhaven National Laboratory (BNL), to be used in the risk assessment studies being conducted for the High Flux Beam Reactor (HFBR). The calculations use as input the seismic hazard defined for the bedrock outcrop by a study conducted at Lawrence Livermore National Laboratory (LLNL). Variability in site soil properties was included in the calculations to obtain the seismic hazard at the ground surface and to compare these results with those using the generic amplification factors from the LLNL study. 9 refs., 8 figs.
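Folding soil-property variability into the surface hazard can be sketched as a Monte Carlo over an uncertain amplification factor: surface PGA = bedrock PGA × AF, with AF treated as lognormal. The AF parameters and PGA levels below are assumed for illustration, not taken from the LLNL or BNL calculations.

```python
# Monte Carlo over an uncertain soil amplification factor (AF).
# median_af and sigma_ln are assumed; real studies derive them from
# site response analyses of the soil column.
import math
import random

random.seed(0)  # reproducible sampling

def surface_exceedance(bedrock_pga, target, median_af=2.0, sigma_ln=0.3, n=20000):
    """Fraction of sampled soil columns whose surface PGA exceeds `target`."""
    hits = 0
    for _ in range(n):
        af = math.exp(math.log(median_af) + random.gauss(0.0, sigma_ln))
        if bedrock_pga * af > target:
            hits += 1
    return hits / n

# With the target set exactly at bedrock x median AF, about half exceed.
print(surface_exceedance(bedrock_pga=0.1, target=0.2))
```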

Costantino, C.J.; Heymsfield, E. (City Coll., New York, NY (United States). Dept. of Civil Engineering); Park, Y.J.; Hofmayer, C.H. (Brookhaven National Lab., Upton, NY (United States))



Potential of the theory of compensation for mitigating public opposition to hazardous waste treatment facility siting: Some evidence from five Massachusetts Communities  

Microsoft Academic Search

The theory of economic compensation is explained. The way in which residents of five Massachusetts communities react to proposals directed at overcoming local opposition to siting a hazardous waste treatment facility is analyzed. Residents were asked to respond to 11 specific proposals designed to allay people's fears and to compensate them for tangible losses or costs they might incur. None

Kent E. Portney



Multihazard risk analysis and disaster planning for emergency services as a basis for efficient provision in the case of natural hazards - case study municipality of Au, Austria  

NASA Astrophysics Data System (ADS)

The extreme flood events of 2002, 2005 and 2013 in Austria underlined the importance of local emergency services being able to withstand and reduce the adverse impacts of natural hazards. Although municipal emergency and crisis management plans exist in Austria for legal reasons, they mostly do not cover risk analyses of natural hazards - a sound, comparable assessment to identify and evaluate risks. Moreover, total losses and operational emergencies triggered by natural hazards have increased in recent decades. Given sparse public funds, objective budget decisions are needed to ensure the efficient provision of operating resources, such as personnel, vehicles and equipment, in the case of natural hazards. We present a case study of the municipality of Au, Austria, which was severely affected by the 2005 floods. Our approach is primarily based on a qualitative risk analysis combining existing hazard plans, GIS data, field mapping and data on the operational efforts of the fire departments. The risk analysis includes a map of phenomena discussed in a workshop with local experts, as well as a list of risks and a risk matrix prepared at that workshop. On this basis, the exact requirements for technical and non-technical mitigation measures for each natural hazard risk were analysed in close collaboration with members of the municipal operation control and the local emergency services (fire brigade, Red Cross). The measures include warning, evacuation, and technical interventions with heavy equipment and personnel.
These results are used, first, to improve the municipal emergency and crisis management plan by providing a risk map and a list of risks and, second, to check whether the local emergency forces can cope with the different risk scenarios using locally available resources. The emergency response plans identify possible resource deficiencies in personnel, vehicles and equipment. As qualitative methods and data were used, uncertainties emerged in defining safety targets, in constructing the different risk scenarios, in the inherent uncertainty in the probability of occurrence and intensity of natural hazards, and in the expectable losses. Finally, we used available studies and expert interviews to develop objective rules for investment decisions for the fire departments and the Red Cross, presenting an empirically sound basis for the efficient provision of intervention in the case of natural hazards for the municipality of Au. Again, the rules for objective provision were developed in close collaboration with the emergency services.

Maltzkait, Anika; Pfurtscheller, Clemens




EPA Science Inventory

The available data were gathered for a large number of case studies of hazardous waste surface impoundments (SI). Actual and projected performances were compared. This collection, analysis and dissemination of the accumulated experience can contribute significantly to improving S...


New debris flow mitigation measures in southern Gansu, China: a case study of the Zhouqu Region  

NASA Astrophysics Data System (ADS)

A devastating debris flow occurred in Zhouqu, Gansu Province, China, on 8 August 2010, resulting in a catastrophic disaster in which 1,463 people perished. The debris flow valleys, like numerous other debris-flow valleys in the mountainous region, had preventive engineering structures such as check dams, properly designed according to common engineering practice to safeguard the town located directly on the debris flow fan. However, failures of such preventive measures often cause even heavier disasters than occur where there has been no human intervention, as the mitigations give a false impression of safety. Given this situation, and in order to explore a more effective disaster prevention strategy against debris flows in the mountainous region, this paper makes a comparative study based on two cases, one of which had preventive structures and one of which did not. The result shows that the inappropriate mitigation measures commonly applied in disaster reduction practice in the region are questionable. It is concluded that working with nature and following natural processes is the best strategy for disaster reduction in the region. Key words: debris flow disasters, disaster reduction strategy, preventive measures

Xiong, Muqi; Meng, Xingmin; Li, Yajun



Hazardous Waste/Mixed Waste Treatment Building throughput study  

SciTech Connect

The hazardous waste/mixed waste (HW/MW) Treatment Building (TB) is the specified treatment location for solid hazardous waste/mixed waste at SRS. This report provides throughput information on the facility based on known and projected waste generation rates. The HW/MW TB will have an annual waste input for the first four years of approximately 38,000 ft^3 and an annual treated waste output of approximately 50,000 ft^3. After the first four years of operation it will have an annual waste input of approximately 16,000 ft^3 and an annual waste output of approximately 18,000 ft^3. There are several waste streams that cannot be accurately predicted (e.g., environmental restoration, decommissioning, and decontamination). The equipment and process area sizing for the initial four years should allow excess processing capability for these poorly defined waste streams. A treatment process description and process flow of the waste are included to aid in understanding the throughput computations. A description of the treated wastes is also included.

England, J.L.; Kanzleiter, J.P.



A study of shock mitigating materials in a split Hopkinson bar configuration. Phase 2  

SciTech Connect

Sandia National Laboratories (SNL) designs mechanical systems with electronics that must survive high-shock environments. These mechanical systems include penetrators that must survive soil and rock penetration, nuclear transportation casks that must survive transportation environments, and laydown weapons that must survive delivery impact. These mechanical systems contain electronics that may operate during and after the high-shock environment and that must be protected from it. A study has been started to improve the packaging techniques for the advanced electronics utilized in these mechanical systems, because current packaging techniques are inadequate for these sensitive electronics. In many cases, it has been found that the packaging techniques currently used not only do not mitigate the shock environment but actually amplify it. An ambitious goal for this packaging study is to avoid amplification and possibly attenuate the shock environment before it reaches the electronics contained in the various mechanical systems. Here, a study comparing two thicknesses, 0.125 and 0.250 in., of five materials (GE RTV 630, HS II Silicone, Polysulfide Rubber, Sylgard 184, and Teflon) for their shock-mitigating characteristics in a split Hopkinson bar configuration has been completed. The five materials have been tested in both unconfined and confined conditions at ambient temperature with two applied loads: 750 με peak (25 fps peak) with a 100 μs duration, measured at 10% amplitude, and 1500 με peak (50 fps peak) with a 100 μs duration, measured at 10% amplitude. The five materials have also been tested at ambient, cold (-65 °F), and hot (+165 °F) temperatures in the unconfined condition with the 750 με peak (25 fps peak) applied load.
Time domain and frequency domain analyses of the split Hopkinson bar data have been performed to compare how these materials lengthen the shock pulse, attenuate the shock pulse, reflect high frequency content in the shock pulse, and transmit energy.

Bateman, V.I.; Brown, F.A.; Hansen, N.R.



Comparative risk judgements for oral health hazards among Norwegian adults: a cross sectional study  

Microsoft Academic Search

BACKGROUND: This study identified optimistic biases in health and oral health hazards, and explored whether comparative risk judgements for oral health hazards vary systematically with socio-economic characteristics and self-reported risk experience. METHODS: A simple random sample of 1,190 residents born in 1972 was drawn from the population resident in three counties of Norway. A total of 735 adults (51% women)

Anne Nordrehaug Åstrøm



Assessing the Relationship Between Hazard Mitigation Plan Quality and Rural Status in a Cohort of 57 Counties from 3 States in the Southeastern U.S.  

E-print Network

Challenges 2012, 3, 183-193; doi:10.3390/challe3020183; ISSN 2078-1547. References cited include: Prater, C.S.; Brody, S.D. Measuring tsunami planning capacity on the U.S. Pacific Coast. Nat. Hazard. Rev. 2008, 9(2), 91-100; and Reynnells, L.; John, P.L-C. What is Rural? USDA Rural Information Center: Beltsville, MD, USA, 2008. Available online: http...

Horney, Jennifer A.; Naimi, Ashley I.; Lyles, Ward; Simon, Matt; Salvesen, David; Berke, Philip



Natural Hazards Observer  

NSDL National Science Digital Library

The Natural Hazards Center of the University of Colorado Boulder offers a free online professional hazards publication called the Natural Hazards Observer. Readers will find information on current disaster issues; new international, national, and local disaster management, mitigation, and education programs; hazards research; political and policy developments; new information sources; upcoming conferences; and recent publications. The January 2003 issue (the latest of the bimonthly publication, which dates back to 1996) includes reports with titles such as Congress Passes Inland Flood Warning Bill and Dam Safety Act Passed. Those interested can view the issues online, download and view them, and even search their content by various parameters.



A combined approach to physical vulnerability of large cities exposed to natural hazards - the case study of Arequipa, Peru  

NASA Astrophysics Data System (ADS)

Arequipa, the second largest city in Peru with almost one million inhabitants, is exposed to various natural hazards, such as earthquakes, landslides, flash floods, and volcanic eruptions. This study focuses on the vulnerability and response of housing, infrastructure and lifelines in Arequipa to flash floods and eruption-induced hazards, notably lahars from El Misti volcano. We propose a combined approach for assessing physical vulnerability in a large city based on: (1) remote sensing utilizing high-resolution imagery (SPOT5, Google Earth Pro, Bing, Pléiades) to map the distribution and type of land use and the properties of city blocks in terms of exposure to the hazard (elevation above river level, distance to channel, impact angle, etc.); (2) an in situ survey of buildings, critical infrastructure (e.g., bridges) and strategic resources (e.g., potable water, irrigation, sewage); (3) information gained from interviews with engineers involved in construction works, previous crises (e.g., the June 2001 earthquake) and risk mitigation in Arequipa. Remote sensing and mapping at the scale of the city has focused on three pilot areas: the perennial Rio Chili valley, which crosses the city and oasis from north to south, and two of the east-margin tributaries, termed Quebrada (ravine): San Lazaro, crossing the northern districts, and Huarangal, crossing the northeastern districts. Sampling of city blocks through these districts provides varying geomorphic, structural, historical, and socio-economic characteristics for each sector. A reconnaissance survey included about 900 edifices located in 40 city blocks across districts of the pilot areas, distinct in age, construction, land use and demographics. A building acts as a structural system, and its strength and resistance to flash floods and lahars therefore depend strongly on the type of construction and the materials used.
Each building surveyed was assigned to one of eight building categories based on physical criteria (dominant building materials, number of floors, percentage and quality of openings, etc.). Future steps in this study include mapping potential impacts from flash floods and lahars as a function of frequency of occurrence and magnitude. For this purpose, we will regroup the eight building types identified in Arequipa into a reduced number of vulnerability categories. Fragility functions will then be established for each vulnerability category and hazard, relating percentage damage to parameters such as flow velocity, depth, and dynamic and hydrostatic pressure. These functions will be applied to flow simulations for each of the three river channels considered, with the final goal of determining potential losses, identifying areas of particularly high risk, and preparing plans for evacuation, relocation and rehabilitation. In the long term, this investigation aims to contribute towards a multi-hazard risk analysis including earthquake and other volcanic hazards (e.g., ashfall and pyroclastic flows), while considering the cascading effects of a hazard chain. We also plan to address the consequences of failure of two artificial lake dams located 40 and 70 km north of the city. A lake breakout flood or lahar would propagate beyond the city and would call for an immediate response, including contingency plans and evacuation practices.
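The fragility-function step described above can be sketched as follows. This is a hypothetical illustration, not the functions the authors will fit: the lognormal form, the median depth, the dispersion, and the building values are all invented placeholders.

```python
"""Illustrative fragility-function sketch: a lognormal curve mapping flow
depth to expected damage fraction for one vulnerability category, then an
expected-loss sum over a (fabricated) city block."""
import math

def damage_fraction(flow_depth_m, median_m=1.5, beta=0.6):
    """Lognormal CDF fragility: expected damage fraction given flow depth.
    median_m = depth at 50% damage; beta = lognormal dispersion (assumed)."""
    if flow_depth_m <= 0:
        return 0.0
    z = math.log(flow_depth_m / median_m) / beta
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# Expected loss for a block = sum over buildings of value * damage fraction.
# (replacement value in USD, simulated flow depth in m) -- fabricated values
buildings = [(40_000, 0.5), (55_000, 1.5), (30_000, 2.5)]
loss = sum(value * damage_fraction(depth) for value, depth in buildings)
print(f"expected block loss: ${loss:,.0f}")
```

In the study, one such curve would be fitted per vulnerability category and per hazard parameter (velocity, depth, dynamic and hydrostatic pressure), then driven by the flow simulations for each channel.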

Thouret, Jean-Claude; Ettinger, Susanne; Zuccaro, Giulio; Guitton, Mathieu; Martelli, Kim; Degregorio, Daniela; Nardone, Stefano; Santoni, Olivier; Magill, Christina; Luque, Juan Alexis; Arguedas, Ana



Emergency planning for hazardous industrial areas: a Brazilian case study.  


One of the characteristics of modern industrial development is the emergence of a new typology of accidents whose effects can spread, in space as well as in time, well beyond the borders of the installations where they occur, sometimes impacting the local population and the environment in a catastrophic fashion. This is the result of a number of factors that have changed the risk profile of modern industrial activities. For a number of reasons, developing countries have proved to be more vulnerable to industrial disasters. Three of the most catastrophic industrial accidents--Bhopal, San Juan de Ixhuatepec, and Cubatão--occurred in developing countries, claiming thousands of lives. During the 1970s and 1980s, the higher public visibility of industrial hazards resulting from serious accidents led to the creation, especially in the more industrialized countries, of regulations for greater control over industrial activities, either by means of new laws or by updating existing legislation. Some of these regulations were designed to improve the response to accidents with potential impacts outside the industrial sites. This article attempts to describe the current status and identify the shortcomings of off-site emergency planning for hazardous industrial areas in Brazil. The most important problems are the lack of specific legislation and the absence of awareness and active participation of public authorities. The experience of an off-site emergency planning process for a Brazilian industrial area is presented. This experience illustrates how difficult it is to prepare and implement emergency planning processes in an industrializing country. PMID:11051072

de Souza, A B



Field Study of Exhaust Fans for Mitigating Indoor Air Quality Problems & Indoor Air Quality - Exhaust Fan Mitigation.  

SciTech Connect

Overall, the findings show that exhaust fans basically provide small amounts of ventilation compensation. By monitoring the common indoor air pollutants (radon, formaldehyde, carbon monoxide, nitrogen dioxide, and water vapor), it was found that the quality of the indoor air was not adversely affected by the use of exhaust fans. Nor did their use provide any measurable or significant benefits since no improvement in air quality was ascertained. While exhaust fans of this small size did not increase radon, which is the contaminant of most concern, the researchers caution that operation of a larger fan or installation in a very tight home could result in higher levels because depressurization is greater. The daily energy consumption for use of these appliances during the heating season was calculated to be 1.5 kilowatt hours or approximately 3% of the energy consumption in the study homes. The information collected in this collaborative field study indicates that the use of these particular ventilation systems has no significant effect on indoor air quality.

United States. Bonneville Power Administration.



Modeling effects of urban heat island mitigation strategies on heat-related morbidity: a case study for Phoenix, Arizona, USA.  


A zero-dimensional energy balance model was previously developed to serve as a user-friendly mitigation tool for practitioners seeking to study the urban heat island (UHI) effect. Accordingly, this established model is applied here to show the relative effects of four common mitigation strategies: increasing the overall (1) emissivity, (2) percentage of vegetated area, (3) thermal conductivity, and (4) albedo of the urban environment in a series of percentage increases by 5, 10, 15, and 20% from baseline values. In addition to modeling mitigation strategies, we present how the model can be utilized to evaluate human health vulnerability from excessive heat-related events, based on heat-related emergency service data from 2002 to 2006. The 24-h average heat index is shown to have the greatest correlation to heat-related emergency calls in the Phoenix (Arizona, USA) metropolitan region. The four modeled UHI mitigation strategies, taken in combination, would lead to a 48% reduction in annual heat-related emergency service calls, where increasing the albedo is the single most effective UHI mitigation strategy. PMID:19633989
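The correlation step described above, relating a 24-h average heat index series to daily heat-related emergency calls, can be sketched as follows. The numbers are fabricated placeholders, not the Phoenix data.

```python
"""Toy sketch: Pearson correlation between a daily 24-h mean heat index
series and daily heat-related emergency call counts (fabricated values)."""
import numpy as np

heat_index = np.array([32.0, 35.5, 38.0, 41.2, 43.0, 39.5, 36.0])  # degC, assumed
calls      = np.array([  3,    6,   10,   18,   22,   12,    7])   # per day, assumed

r = np.corrcoef(heat_index, calls)[0, 1]
print(f"Pearson r between 24-h mean heat index and calls: {r:.2f}")
```

The study's finding is that, among the candidate heat metrics tested, this 24-h average heat index gave the strongest such correlation for the 2002-2006 emergency service data.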

Silva, Humberto R; Phelan, Patrick E; Golden, Jay S



Public willingness to pay for CO2 mitigation and the determinants under climate change: a case study of Suzhou, China.  


This study explored the factors that influence respondents' willingness to pay (WTP) for CO2 mitigation under climate change. A questionnaire survey combining contingent valuation and psychometric paradigm methods was conducted in the city of Suzhou, Jiangsu Province, China. Respondents' traditional demographic attributes, risk perception of greenhouse gases (GHGs), and attitude toward the government's risk management practices were entered into a Tobit model to analyze the determinants. The results showed that about 55% of the respondents refused to pay for CO2 mitigation, and respondents' WTP increased with the CO2 mitigation percentage. Important factors influencing WTP include people's dread of GHGs, confidence in policy, the timeliness of governmental information disclosure, age, education and income level. PMID:25151109
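A Tobit model is the natural choice here because so many responses are exactly zero (refusals to pay). A minimal sketch of such a censored-at-zero regression, on fabricated data with a single invented "income" regressor, assuming maximum likelihood via `scipy`:

```python
"""Minimal Tobit (left-censored at zero) regression sketch: zero WTP
responses are treated as censored observations of a latent WTP. Data and
the single regressor are fabricated for illustration."""
import numpy as np
from scipy import optimize, stats

rng = np.random.default_rng(0)
income = rng.uniform(1, 10, 200)
latent = -4.0 + 1.0 * income + rng.normal(0, 2.0, 200)  # latent WTP
wtp = np.maximum(latent, 0.0)                           # observed, censored at 0

def neg_loglik(params):
    b0, b1, log_sigma = params
    sigma = np.exp(log_sigma)
    mu = b0 + b1 * income
    censored = wtp == 0
    ll = np.where(
        censored,
        stats.norm.logcdf(-mu / sigma),               # P(latent <= 0)
        stats.norm.logpdf(wtp, loc=mu, scale=sigma),  # density of observed WTP
    )
    return -ll.sum()

res = optimize.minimize(neg_loglik, x0=[0.0, 0.0, 0.0], method="Nelder-Mead",
                        options={"maxiter": 5000, "xatol": 1e-6, "fatol": 1e-6})
b0, b1, log_sigma = res.x
print(f"intercept={b0:.2f}, income effect={b1:.2f}, sigma={np.exp(log_sigma):.2f}")
```

In the study, the regressors would be the demographic attributes, GHG risk-perception scores, and attitude measures rather than this single placeholder variable.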

Yang, Jie; Zou, Liping; Lin, Tiansheng; Wu, Ying; Wang, Haikun



Echo-sounding method aids earthquake hazard studies  

USGS Publications Warehouse

Dramatic examples of catastrophic damage from an earthquake occurred in 1989, when the M 7.1 Loma Prieta earthquake rocked the San Francisco Bay area, and in 1994, when the M 6.6 Northridge earthquake jolted southern California. The surprising amount and distribution of damage to private property and infrastructure emphasize the importance of seismic-hazard research in urbanized areas, where the potential for damage and loss of life is greatest. During April 1995, a group of scientists from the U.S. Geological Survey and the University of Tennessee, using an echo-sounding method described below, is collecting data in San Antonio Park, California, to examine the Monte Vista fault, which runs through this park. The Monte Vista fault in this vicinity shows evidence of movement within the last 10,000 years or so. The data will give the scientists a "picture" of the subsurface rock deformation near this fault. The data will also be used to help locate a trench that will be dug across the fault by scientists from William Lettis & Associates.

U.S. Geological Survey



Seaside, Oregon Tsunami Pilot Study--Modernization of FEMA Flood Hazard Maps  

E-print Network

Seaside, Oregon Tsunami Pilot Study -- Modernization of FEMA Flood Hazard Maps. By the Tsunami Pilot Study Working Group. Joint NOAA/USGS/FEMA Special Report, U.S. National Oceanic and Atmospheric Administration. The study considers tsunami flood hazard at the 0.002 annual probability of exceedance level.


Studying and Improving Human Response to Natural Hazards: Lessons from the Virtual Hurricane Lab  

NASA Astrophysics Data System (ADS)

One of the most critical challenges facing communities in areas prone to natural hazards is how best to encourage residents to invest in individual and collective actions that would reduce the damaging impact of low-probability, high-consequence environmental events. Unfortunately, what makes this goal difficult to achieve is that the relative rarity of natural hazards means that many who face their risk have no previous experience to draw on when making preparation decisions, or have prior experience that provides misleading guidance on how best to prepare. For example, individuals who have experienced strings of minor earthquakes or near-misses from tropical cyclones may become overly complacent about the risks that extreme events actually pose. In this presentation we report the preliminary findings of a program of work that explores the use of realistic multimedia hazard simulations designed for two purposes: 1) to serve as a basic research tool for studying how individuals make decisions to prepare for rare natural hazards in laboratory settings; and 2) to serve as an educational tool for giving people in hazard-prone areas virtual experience in hazard preparation. We demonstrate a prototype simulation in which participants experience the approach of a virtual hurricane and have the opportunity to invest in different kinds of action to protect their home from damage. As the hurricane approaches, participants have access to an “information dashboard” in which they can gather information about the storm threat from a variety of sources, including mock television weather broadcasts, web sites, and conversations with neighbors. In response to this information they then have the opportunity to invest in different levels of protective actions. Some versions of the simulation are designed as games, where participants are rewarded based on their ability to make the optimal trade-off between under- and over-preparing for the threat.
From a basic research perspective the data provide valuable insights into the dynamics of information gathering prior to hurricane impacts, as well as a laboratory in which we can study how both information gathering and responses vary in response to controlled variations in such factors as the complexity of forecast information. From an applied perspective the simulations provide an opportunity for residents in hazard-prone areas to learn about different kinds of information and receive feedback on their potential biases prior to an actual encounter with a hazard. The presentation concludes with a summary of some of the basic research findings that have emerged from the hurricane lab to date, as well as a discussion of the prospects for extending the technology to a broad range of environmental hazards.

Meyer, R.; Broad, K.; Orlove, B. S.



Natural phenomena hazards site characterization criteria  

SciTech Connect

The criteria and recommendations in this standard shall apply to site characterization for the purpose of mitigating Natural Phenomena Hazards (wind, floods, landslide, earthquake, volcano, etc.) in all DOE facilities covered by DOE Order 5480.28. Criteria for site characterization not related to NPH are not included unless necessary for clarification. General and detailed site characterization requirements are provided in areas of meteorology, hydrology, geology, seismology, and geotechnical studies.

Not Available



Mini-Sosie high-resolution seismic method aids hazards studies  

USGS Publications Warehouse

The Mini-Sosie high-resolution seismic method has been effective in imaging shallow structural and stratigraphic features that aid seismic-hazard and neotectonic studies. The method is not an alternative to Vibroseis acquisition for large-scale studies. However, it has two major advantages over Vibroseis as used by the USGS in its seismic-hazards program. First, the sources are extremely portable and can be used in both rural and urban environments. Second, the shifting-and-summation process during acquisition improves the signal-to-noise ratio and cancels out seismic noise sources such as cars and pedestrians. -from Authors
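The signal-to-noise benefit of the shifting-and-summation process can be illustrated with a stacking sketch: aligning and summing N records of a repeated source grows the coherent signal N-fold while incoherent noise grows only as sqrt(N). The synthetic wavelet, record count, and noise level below are invented for illustration.

```python
"""Sketch of stacking gain: many noisy records of the same wavelet are
averaged; coherent signal survives, incoherent noise averages down."""
import numpy as np

rng = np.random.default_rng(1)
n_records, n_samples = 400, 256
signal = np.sin(2 * np.pi * np.arange(n_samples) / 32)              # repeated wavelet
records = signal + rng.normal(0, 5.0, size=(n_records, n_samples))  # noisy shots

def snr(trace):
    """Ratio of signal amplitude to residual-noise amplitude in a trace."""
    noise = trace - signal * (trace @ signal) / (signal @ signal)
    return np.sqrt((signal ** 2).sum() / (noise ** 2).sum())

stacked = records.mean(axis=0)
print(f"single-record SNR ~{snr(records[0]):.2f}, stacked SNR ~{snr(stacked):.2f}")
```

This same averaging is why random noise from cars and pedestrians largely cancels in the summed record.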

Stephenson, W.J.; Odum, J.; Shedlock, K.M.; Pratt, T.L.; Williams, R.A.



Remote sensing techniques for landslide studies and hazard zonation in Europe  

Microsoft Academic Search

An inventory is presented of researches concerning the use of remote sensing for landslide studies and hazard zonation as mainly carried out in the countries belonging to the European Community. An overview is given of the applicability of remote sensing in the following phases of landslide studies: 1.(1) Detection and classification of landslides. Special emphasis is given to the types

Franco Mantovani; Robert Soeters; C. J. Van Westen



Seismic hazard assessment of the cultural heritage sites: A case study in Cappadocia (Turkey)  

NASA Astrophysics Data System (ADS)

Turkey is one of the most seismically active regions in the world. Major earthquakes with the potential to threaten life and property occur frequently here. In the last decade, over 50,000 residents lost their lives, commonly as a result of building failures in seismic events. The Cappadocia region is one of the most important touristic sites in Turkey. At the same time, the region was added to the World Heritage List by UNESCO in 1985 for its natural, historical and cultural values. The region is adversely affected by several environmental conditions, which have been addressed in many previous studies; however, few studies have evaluated the region seismically. Some of the important historical and cultural heritage sites are: Goreme Open Air Museum, Uchisar Castle, Ortahisar Castle, Derinkuyu Underground City and Ihlara Valley. According to the seismic hazard zonation map published by the Ministry of Reconstruction and Settlement, these heritage sites fall in Zone III, Zone IV and Zone V. This map shows peak ground acceleration with a 10 percent probability of exceedance in 50 years for bedrock. In this connection, a seismic hazard assessment of these heritage sites has to be carried out. In this study, seismic hazard calculations are performed with both deterministic and probabilistic approaches, accounting for local site conditions. A catalog of historical and instrumental earthquakes was prepared and used in this study. The seismic sources were identified for seismic hazard assessment based on geological, seismological and geophysical information. Peak Ground Acceleration (PGA) at bedrock level is calculated for different seismic sources using available attenuation relationships applicable to Turkey. The results of the present study reveal that the seismic hazard at these sites closely matches the seismic zonation map published by the Ministry of Reconstruction and Settlement.
Keywords: Seismic Hazard Assessment, Probabilistic Approach, Deterministic Approach, Historical Heritage, Cappadocia.
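The deterministic step described above can be sketched with a generic attenuation (ground-motion) relationship of the form ln(PGA) = c1 + c2*M - c3*ln(R + c4). The coefficients and scenario below are illustrative placeholders, not the relationships the authors selected for Turkey.

```python
"""Hedged sketch of deterministic PGA at bedrock from one seismic source
via a generic attenuation relationship. Coefficients are hypothetical."""
import math

def pga_g(magnitude, distance_km, c1=-3.5, c2=0.9, c3=1.2, c4=10.0):
    """Median PGA in g from a generic point-source attenuation form."""
    return math.exp(c1 + c2 * magnitude - c3 * math.log(distance_km + c4))

# Deterministic scenario: assumed M 6.5 source 20 km from a heritage site
print(f"scenario PGA: {pga_g(6.5, 20.0):.3f} g")
```

A probabilistic assessment would integrate such relationships over all sources and magnitude-recurrence models to obtain, e.g., the PGA with 10 percent probability of exceedance in 50 years.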

Seyrek, Evren; Orhan, Ahmet; Dinçer, İsmail



Study of the environmental hazard caused by the oil shale industry solid waste.  


The environmental hazard of eight soil and solid-waste samples originating from a region of Estonia heavily polluted by the oil shale industry was studied. The samples were contaminated mainly with oil products (up to 7231 mg/kg) and polycyclic aromatic hydrocarbons (PAHs; up to 434 mg/kg). Concentrations of heavy metals and water-extractable phenols were low. The toxicities of the aqueous extracts of solid-phase samples were evaluated by using a battery of Toxkit tests (involving crustaceans, protozoa, rotifers and algae). Waste rock and fresh semi-coke were classified as of "high acute toxic hazard", whereas aged semi-coke and most of the polluted soils were classified as of "acute toxic hazard". Analysis of the soil slurries by using the photobacterial solid-phase flash assay showed the presence of particle-bound toxicity in most samples. In the case of four samples out of the eight, chemical and toxicological evaluations both showed that the levels of PAHs, oil products or both exceeded their respective permitted limit values for the living zone (20 mg PAHs/kg and 500 mg oil products/kg); the toxicity tests showed a toxic hazard. However, in the case of three samples, the chemical and toxicological hazard predictions differed markedly: polluted soil from the Erra River bank contained 2334 mg oil/kg but did not show any water-extractable toxicity. In contrast, spent rock and aged semi-coke, which contained none of the pollutants in hazardous concentrations, showed adverse effects in toxicity tests. The environmental hazard of solid waste deposits from the oil shale industry needs further assessment. PMID:11387023

Põllumaa, L; Maloveryan, A; Trapido, M; Sillak, H; Kahru, A



An Independent Evaluation of the FMEA/CIL Hazard Analysis Alternative Study  

NASA Technical Reports Server (NTRS)

The present instruments of safety and reliability risk control for a majority of the National Aeronautics and Space Administration (NASA) programs/projects consist of Failure Mode and Effects Analysis (FMEA), Hazard Analysis (HA), Critical Items List (CIL), and Hazard Report (HR). This extensive analytical approach was introduced in the early 1970s and was implemented for the Space Shuttle Program by NHB 5300.4 (1D-2). Since the Challenger accident in 1986, the process has been expanded considerably and has resulted in the introduction of similar and/or duplicated activities in safety/reliability risk analysis. A study initiated in 1995 to search for an alternative to the current FMEA/CIL Hazard Analysis methodology generated a proposed method on April 30, 1996. The objective of this Summer Faculty Study was to participate in and conduct an independent evaluation of the proposed alternative to simplify the present safety and reliability risk control procedure.

Ray, Paul S.



A Case Study in Ethical Decision Making Regarding Remote Mitigation of Botnets  

NASA Astrophysics Data System (ADS)

It is becoming more common for researchers to find themselves in a position of being able to take over control of a malicious botnet. If this happens, should they use this knowledge to clean up all the infected hosts? How would this affect not only the owners and operators of the zombie computers, but also other researchers, law enforcement agents serving justice, or even the criminals themselves? What dire circumstances would change the calculus about what is or is not appropriate action to take? We review two case studies of long-lived malicious botnets that present serious challenges to researchers and responders and use them to illuminate many ethical issues regarding aggressive mitigation. We make no judgments about the questions raised, instead laying out the pros and cons of possible choices and allowing workshop attendees to consider how and where they would draw lines. By this, we hope to expose where there is clear community consensus as well as where controversy or uncertainty exists.

Dittrich, David; Leder, Felix; Werner, Tillmann


Reducing aluminum dust explosion hazards: case study of dust inerting in an aluminum buffing operation.  


Metal powders or dusts can represent significant dust explosion hazards in industry, due to their relatively low ignition energy and high explosivity. The hazard is well known in industries that produce or use aluminum powders, but is sometimes not recognized by facilities that produce aluminum dust as a byproduct of bulk aluminum processing. As demonstrated by the 2003 dust explosion at aluminum wheel manufacturer Hayes Lemmerz, facilities that process bulk metals are at risk due to dust generated during machining and finishing operations [U.S. Chemical Safety and Hazard Investigation Board, Investigation Report, Aluminum Dust Explosion Hayes Lemmerz International, Inc., Huntington, Indiana, Report No. 2004-01-I-IN, September 2005]. Previous studies have shown that aluminum dust explosions are more difficult to suppress with flame retardants or inerting agents than dust explosions fueled by other materials such as coal [A.G. Dastidar, P.R. Amyotte, J. Going, K. Chatrathi, Flammability limits of dust-minimum inerting concentrations, Proc. Saf. Progr., 18-1 (1999) 56-63]. In this paper, an inerting method is discussed to reduce the dust explosion hazard of residue created in an aluminum buffing operation as the residue is generated. This technique reduces the dust explosion hazard throughout the buffing process and within the dust collector systems making the process inherently safer. Dust explosion testing results are presented for process dusts produced during trials with varying amounts of flame retardant additives. PMID:18423857

Myers, Timothy J



Adaptive management for mitigating Cryptosporidium risk in source water: a case study in an agricultural catchment in South Australia.  


Water-borne pathogens such as Cryptosporidium pose a significant human health risk and catchments provide the first critical pollution 'barrier' in mitigating risk in drinking water supply. In this paper we apply an adaptive management framework to mitigating Cryptosporidium risk in source water using a case study of the Myponga catchment in South Australia. Firstly, we evaluated the effectiveness of past water quality management programs in relation to the adoption of practices by landholders using a socio-economic survey of land use and management in the catchment. The impact of past management on the mitigation of Cryptosporidium risk in source water was also evaluated based on analysis of water quality monitoring data. Quantitative risk assessment was used in planning the next round of management in the adaptive cycle. Specifically, a pathogen budget model was used to identify the major remaining sources of Cryptosporidium in the catchment and estimate the mitigation impact of 30 alternative catchment management scenarios. Survey results show that earlier programs have resulted in the comprehensive adoption of best management practices by dairy farmers including exclusion of stock from watercourses and effluent management from 2000 to 2007. Whilst median Cryptosporidium concentrations in source water have decreased since 2004 they remain above target levels and put pressure on other barriers to mitigate risk, particularly the treatment plant. Non-dairy calves were identified as the major remaining source of Cryptosporidium in the Myponga catchment. The restriction of watercourse access of non-dairy calves could achieve a further reduction in Cryptosporidium export to the Myponga reservoir of around 90% from current levels. The adaptive management framework applied in this study was useful in guiding learning from past management, and in analysing, planning and refocusing the next round of catchment management strategies to achieve water quality targets. PMID:19515479
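The pathogen-budget idea described above can be sketched as a simple source accounting exercise: sum oocyst loads by source, then apply a management scenario as a fractional reduction on one source. The source names and load figures below are invented placeholders, not the Myponga estimates.

```python
"""Toy pathogen-budget sketch: total Cryptosporidium export by source and
the effect of one management scenario. All figures are fabricated."""
# Annual oocyst export to the reservoir by source (assumed values)
loads = {"non_dairy_calves": 9.0e9, "dairy_cattle": 0.5e9, "other": 0.5e9}

def scenario_export(loads, source, reduction):
    """Total export after reducing one source's load by `reduction` (0-1)."""
    total = sum(loads.values())
    return total - loads[source] * reduction

baseline = sum(loads.values())
restricted = scenario_export(loads, "non_dairy_calves", 0.95)
print(f"reduction vs baseline: {1 - restricted / baseline:.1%}")
```

With a dominant source, restricting it drives most of the catchment-wide reduction, which is the logic behind the ~90% figure reported for the calf-access scenario.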

Bryan, Brett A; Kandulu, John; Deere, Daniel A; White, Monique; Frizenschaf, Jacqueline; Crossman, Neville D



A hazard and operability study of anhydrous ammonia application in agriculture.  


Researchers from the National Institute for Occupational Safety and Health (NIOSH) applied Hazard and Operability (HAZOP) analysis to examine hazards during the use of anhydrous ammonia by farmers. This analysis evaluated the storage, transfer, and application of anhydrous ammonia, identifying credible hazard scenarios, practical solutions, and research needs. Ninety-five findings were developed that are of use to farmers, distributors of ammonia and application equipment, and manufacturers of application equipment. The findings generally involve training, equipment design changes, preventive maintenance, and material compatibilities. The HAZOP team found that additional safety features need to be developed or implemented. The study also pointed out where correct operator procedure and preventive maintenance can prevent inadvertent releases. Other inadvertent releases are caused by incompatible materials, or by using equipment in ways other than intended. Several examples of the findings are given to emphasize the HAZOP technique and the high-risk scenarios. Strategies for dissemination to the agricultural community are presented. PMID:8256691

Spencer, A B; Gressel, M G



Urban Vulnerability Assessment to Seismic Hazard through Spatial Multi-Criteria Analysis. Case Study: the Bucharest Municipality/Romania  

NASA Astrophysics Data System (ADS)

In the context of an explosive increase in the value of damage caused by natural disasters, an alarming challenge in the third millennium is the rapid growth of urban population in vulnerable areas. Cities are, by definition, very fragile socio-ecological systems, highly vulnerable to environmental change and responsible for important transformations of space, producing dysfunctions that show up in the state of natural variables (Parker and Mitchell, 1995; The OFDA/CRED International Disaster Database). A contributing factor is the demographic dynamic that affects urban areas. The aim of this study is to estimate the overall vulnerability of the urban area of Bucharest to seismic hazard, using environmental, socio-economic, and physical measurable variables in the framework of a spatial multi-criteria analysis. The capital city of Romania was chosen for this approach because of its high vulnerability, due to explosive urban development and the advanced state of degradation of its buildings (most of the building stock having been built between 1940 and 1977). Combining these attributes with the seismic hazard induced by the Vrancea source, Bucharest has been ranked the 10th capital city worldwide in terms of seismic risk. Over 40 years of experience in the natural risk field show that the only directly accessible way to reduce natural risk is to reduce the vulnerability of the space (Adger et al., 2001; Turner et al., 2003; UN/ISDR, 2004; Dayton-Johnson, 2004; Kasperson et al., 2005; Birkmann, 2006, etc.). In effect, reducing the vulnerability of urban spaces would imply lower costs produced by natural disasters.
By applying the SMCA method, the result reveals a circular pattern, signaling as hot spots the Bucharest historic centre (located on a river terrace and with an aged building stock) and peripheral areas (isolated from the emergency centers and defined by precarious social and economic conditions). In effect, the example of Bucharest demonstrates how the results shape the 'vulnerability to seismic hazard' profile of the city, on the basis of which decision makers could develop proper mitigation strategies. To sum up, the use of an analytical framework such as the standard Spatial Multi-Criteria Analysis (SMCA) - despite all the difficulties in creating justifiable weights (Yeh et al., 1999) - results in accurate estimations of the state of the urban system. Although this method has often been mistrusted by decision makers (Janssen, 2001), we consider that the results can represent, precisely because of their level of generalization, a decision-support framework for policy makers to critically reflect on possible risk mitigation plans. Further study will improve the analysis by integrating a series of daytime and nighttime scenarios and a better definition of the constructed-space variables.
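The core SMCA computation described above can be sketched as a weighted overlay: min-max normalise each criterion across spatial units, then combine with expert weights into a vulnerability score per unit. The criteria, values, and weights below are invented placeholders, not the Bucharest dataset.

```python
"""Sketch of a spatial multi-criteria weighted overlay: normalised criteria
times expert weights gives a vulnerability score per sector. Fabricated data."""
import numpy as np

# Rows = city sectors, columns = criteria (building age in years, population
# density per km2, distance to emergency services in km) -- assumed values
criteria = np.array([
    [80.0, 12000.0, 0.5],   # historic centre
    [30.0,  6000.0, 1.5],
    [45.0,  9000.0, 6.0],   # periphery
])
weights = np.array([0.4, 0.3, 0.3])   # assumed expert weights, sum to 1

# Min-max normalisation per criterion (higher = more vulnerable here)
norm = (criteria - criteria.min(axis=0)) / (criteria.max(axis=0) - criteria.min(axis=0))
vulnerability = norm @ weights
print("vulnerability scores:", np.round(vulnerability, 2))
```

In this toy setup the historic centre scores highest, mirroring the hot-spot pattern the study reports; in practice the weighting step is exactly where the "justifiable weights" difficulty cited above arises.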

Armas, Iuliana; Dumitrascu, Silvia; Bostenaru, Maria



Voltage Sag Mitigation Strategies for an Indian Power Systems: A Case Study  

NASA Astrophysics Data System (ADS)

Under the modern deregulated environment, both utilities and customers are concerned with power quality improvement, but with different objectives and interests. The utility reconfigures its power network and installs mitigation devices, if needed, to improve power quality. The paper presents a strategy for selecting cost-effective solutions to mitigate voltage sags, the most frequent power quality disturbance. In this paper, mitigation devices are introduced into the optimal network topology at suitable places to maximize their effectiveness in further improving power quality. The optimal placement is considered from the utility's perspective for overall benefit. Finally, performance is evaluated on the basis of the reduction in the total number of voltage sags, the total number of process trips, and the total financial losses due to voltage sags.

Goswami, A. K.; Gupta, C. P.; Singh, G. K.



Success in transmitting hazard science  

NASA Astrophysics Data System (ADS)

Money motivates mitigation. An example of success in communicating scientific information about hazards, coupled with information about available money, is the follow-up action by local governments to actually mitigate. The Nevada Hazard Mitigation Planning Committee helps local governments prepare competitive proposals for federal funds to reduce risks from natural hazards. Composed of volunteers with expertise in emergency management, building standards, and earthquake, flood, and wildfire hazards, the committee advises the Nevada Division of Emergency Management on (1) the content of the State’s hazard mitigation plan and (2) projects that have been proposed by local governments and state agencies for funding from various post- and pre-disaster hazard mitigation programs of the Federal Emergency Management Agency. Local governments must have FEMA-approved hazard mitigation plans in place before they can receive this funding. The committee has been meeting quarterly with elected and appointed county officials, at their offices, to encourage them to update their mitigation plans and apply for this funding. We have settled on a format that includes the county’s giving the committee an overview of its infrastructure, hazards, and preparedness. The committee explains the process for applying for mitigation grants and presents the latest information that we have about earthquake hazards, including locations of nearby active faults, historical seismicity, geodetic strain, loss-estimation modeling, scenarios, and documents about what to do before, during, and after an earthquake. Much of the county-specific information is available on the web. 
The presentations have been well received, in part because the committee makes the effort to go to their communities, and in part because the committee is helping them attract federal funds for local mitigation of not only earthquake hazards but also floods (including canal breaches) and wildfires, the other major concerns in Nevada. Local citizens appreciate the efforts of the state officials to present the information in a public forum. The Committee’s earthquake presentations to the counties are supplemented by regular updates in the two most populous counties during quarterly meetings of the Nevada Earthquake Safety Council, generally alternating between Las Vegas and Reno. We have only 17 counties in Nevada, so we are making good progress at reaching each within a few years. The Committee is also learning from the county officials about their frustrations in dealing with the state and federal bureaucracies. Success is documented by the mitigation projects that FEMA has funded.

Price, J. G.; Garside, T.




EPA Science Inventory

This study is expected to fill an important gap in the literature by focusing on how individuals characterize exposure in terms of risk and hazard, and how this understanding can lead to concrete changes in their personal and professional lives. I expect that people differ gre...


Earthquakes and faults in the Kárahnjúkar area: Review of hazards and recommended further studies  

E-print Network

[Snippet of section headings from the report: Review of hazards and recommended further studies in the Kárahnjúkar area; 6. Strike-slip faults at Kárahnjúkar; 7. Potential future faulting and earthquakes; 8. Future ... in the Kárahnjúkar area due to earthquakes and faults, taking into consideration relevant new geological observations.]

Sigmundsson, Freysteinn


Disposing of hazardous waste. An update on waste management studies.  


Waste management in the dental office is not a limited issue involving only dentists from the Region of Hamilton-Wentworth. While the ODA has had the opportunity to work with the Hamilton Academy of Dentistry and has the support of this society for a two-phased project, the Metro Toronto component societies will be joining the existing MOEE/Hamilton study. The MOEE in Halton-Peel has informed us that they will be conducting a similar survey and study. The committee would like to thank the Executive of the Hamilton Academy of Dentistry who have provided needed follow-up on this project. We look forward to the cooperation of individual dentists in all communities involved in this environmental study. Dentists are encouraged to complete the survey and to consider volunteering to take part in the in-office sample study. If you have any questions, we invite you to contact members of the Health Care Committee or the staff in the Department of Professional Affairs. PMID:9468925

Samek, L



Studies on Hazard Characterization for Performance-based Structural Design  

E-print Network

[List-of-figures fragment: Figure 18, "Wind Speed and Surge Data (1995-2008) for Galveston Pier 21, TX, Including Hurricanes Gustav and Ike (2008)"; Figure 19, "Wind-ROD Pairs for Super Cell #2".] This has obvious implications for emergency management, including pre-disaster planning and post-disaster response. Additional, more recent hurricane event data (i.e., Gustav, 2008 and Ike, 2008) were included during the course of this study to improve...

Wang, Yue



Study on the urban heat island mitigation effect achieved by converting to grass-covered parking  

Microsoft Academic Search

The urban heat island mitigation effect of conversion from asphalt-covered parking areas to grass-covered ones is estimated by observation and calculation. The mean surface temperature in a parking lot is calculated from a thermal image captured by an infrared camera. The sensible heat flux in each parking space is calculated based on the surface heat budget. The reduction in the

Hideki Takebayashi; Masakazu Moriyama



Mitigation of indirect environmental effects of GM crops  

PubMed Central

Currently, the UK has no procedure for the approval of novel agricultural practices that is based on environmental risk management principles. Here, we make a first application of the 'bow-tie' risk management approach in agriculture, for assessment of land use changes, in a case study of the introduction of genetically modified herbicide tolerant (GMHT) sugar beet. There are agronomic and economic benefits, but indirect environmental harm from increased weed control is a hazard. The Farm Scale Evaluation (FSE) trials demonstrated reduced broad-leaved weed biomass and seed production at the field scale. The simplest mitigation measure is to leave a proportion of rows unsprayed in each GMHT crop field. Our calculations, based on FSE data, show that a maximum of 2% of field area left unsprayed is required to mitigate weed seed production and 4% to mitigate weed biomass production. Tilled margin effects could simply be mitigated by increasing the margin width from 0.5 to 1.5 m. Such changes are cheap and simple to implement in farming practices. This case study demonstrates the usefulness of the bow-tie risk management approach and the transparency with which hazards can be addressed. If adopted generally, it would help to enable agriculture to adopt new practices with due environmental precaution. PMID:17439853

Pidgeon, J.D; May, M.J; Perry, J.N; Poppy, G.M



Adsorption and desorption studies on hazardous dye Naphthol Yellow S.  


In the present study, the batch technique was adopted under a variety of conditions, viz., amount of adsorbent, contact time, concentration, temperature, and pH. The dye concentration was measured before and after adsorption with a UV spectrophotometer. Dye removal data were fitted to the Langmuir and Freundlich adsorption isotherm equations, and the values of their corresponding constants were determined. Thermodynamic parameters of the systems, namely free energy (ΔG), enthalpy (ΔH), and entropy (ΔS), were calculated using the Langmuir constant. The estimated ΔG values of -8.027×10³ and -28.46×10³ kJ mol⁻¹ over activated carbon and activated de-oiled mustard at 303 K (30 °C) indicate a spontaneous process. The adsorption process followed a pseudo-first-order model. The values of % removal and k(ad) for the dye systems were calculated at temperatures ranging from 303 to 323 K. Desorption studies indicate that the fixed-bed adsorbent columns could be regenerated by elution with dilute NaOH, and a quantitative recovery of Naphthol Yellow S can be achieved. PMID:20667651
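As a worked illustration of the thermodynamic step described above, the free energy can be computed from the Langmuir equilibrium constant via ΔG = -RT ln K; a negative result indicates spontaneous adsorption. The Langmuir constant below is a hypothetical value, not the study's fitted one:

```python
import math

# ΔG from the Langmuir constant: ΔG = -R * T * ln(K).
# K_langmuir is a hypothetical placeholder, not the study's value.

R = 8.314          # gas constant, J mol^-1 K^-1
T = 303.0          # temperature, K (30 °C, as in the study)
K_langmuir = 24.0  # hypothetical dimensionless equilibrium constant

delta_G = -R * T * math.log(K_langmuir)  # J mol^-1
# delta_G < 0 here, i.e. spontaneous adsorption, matching the study's sign.
```

Repeating this at several temperatures (303-323 K, as in the study) and applying the van't Hoff relation is how ΔH and ΔS are then extracted.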

Jain, Rajeev; Gupta, V K; Sikarwar, Shalini



Development based climate change adaptation and mitigation—conceptual issues and lessons learned in studies in developing countries  

Microsoft Academic Search

This paper discusses the conceptual basis for linking development policies and climate change adaptation and mitigation and suggests an analytical approach that can be applied to studies in developing countries. The approach is centred on a broad set of policy evaluation criteria that merge traditional economic and sectoral goals and broader social issues related to health and income distribution. The

Kirsten Halsnæs; Jan Verhagen



A feasibility study on the influence of the geomorphological feature in identifying the potential landslide hazard  

NASA Astrophysics Data System (ADS)

In this study we focused on identifying geomorphological features that control the location of landslides. The representation of these features is based on a high-resolution DEM (Digital Elevation Model) derived from airborne laser altimetry (LiDAR) and evaluated by statistical analysis of axial orientation data. The main principle of this analysis is to generate eigenvalues from the axial orientation data and compare them: Planarity, a ratio of eigenvalues, indicates the degree of roughness of the ground surface. Results are compared to a recent landslide case in Korea in order to evaluate the feasibility of the proposed methodology for identifying potential landslide hazard. The preliminary landslide assessment based on the Planarity analysis discriminates well between stable and unstable domains in the study area, especially in the landslide initiation zones. The results also show that the method is beneficial for preliminary landslide hazard inventory mapping in areas where no historical landslide records exist. Combined with other physical procedures such as geotechnical monitoring, landslide hazard assessment using geomorphological features promises a better understanding of landslides and their mechanisms, and provides an enhanced methodology for evaluating hazards and appropriate actions.
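A minimal sketch of the eigenvalue analysis of axial orientation data follows. It assumes Planarity is taken as the dominant normalized eigenvalue of the orientation tensor of DEM surface normals (the study's exact ratio may differ), and uses synthetic normals in place of LiDAR-derived ones:

```python
import numpy as np

# Eigenvalue analysis of axial orientation data: surface normals are
# accumulated into an orientation tensor T = (1/N) sum n n^T, and the
# spread of T's eigenvalues measures surface roughness. Synthetic data;
# the study derives normals from a LiDAR DEM.

def orientation_eigenvalues(normals):
    """Eigenvalues (descending) of the orientation tensor of unit normals."""
    n = np.asarray(normals, dtype=float)
    n /= np.linalg.norm(n, axis=1, keepdims=True)
    T = n.T @ n / len(n)                      # eigenvalues sum to 1
    return np.sort(np.linalg.eigvalsh(T))[::-1]

def planarity(normals):
    """Dominant normalized eigenvalue: near 1 for a flat patch whose
    normals cluster tightly, lower for rough terrain."""
    return orientation_eigenvalues(normals)[0]

# Flat patch: normals all point nearly straight up.
flat = [[0.01, 0.0, 1.0], [0.0, 0.02, 1.0], [-0.01, 0.01, 1.0]]
# Rough patch: normals scattered widely.
rough = [[1.0, 0.0, 0.3], [0.0, 1.0, 0.3], [-1.0, 0.2, 0.4], [0.1, -1.0, 0.5]]

assert planarity(flat) > planarity(rough)  # rougher ground -> lower value
```

Thresholding such a ratio cell-by-cell over the DEM is what separates the stable (smooth) from unstable (rough) domains described above.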

Baek, M. H.; Kim, T. H.



Thermal study of payload module for the next-generation infrared space telescope SPICA in risk mitigation phase  

NASA Astrophysics Data System (ADS)

The SPace Infrared telescope for Cosmology and Astrophysics (SPICA) is a pre-project of JAXA, in collaboration with ESA, to be launched around 2020. SPICA will be transferred into a halo orbit around the second Lagrangian point (L2) of the Sun-Earth system, which enables effective radiant cooling in combination with a mechanical cooling system to cool a large 3 m IR telescope below 6 K. At present, a conceptual study of SPICA is underway to assess and mitigate the mission's risks. The thermal study for risk mitigation sets the goals of a 25% margin on the cooling power of the 4 K/1 K temperature regions and a 25% margin on the heat load from the Focal Plane Instruments (FPIs) at the intermediate temperature region, and aims to enhance the reliability of the mechanical cooler system and the feasibility of ground tests. Thermal property measurements of FRP materials are also important. This paper introduces the details of the thermal design study for risk mitigation, including development of the truss separation mechanism, the cryogenic radiator, the mechanical cooler system, and thermal property measurements of materials.

Shinozaki, Keisuke; Sato, Yoichi; Sawada, Kenichiro; Ando, Makiko; Sugita, Hiroyuki; Yamawaki, Toshihiro; Mizutani, Tadahiro; Komatsu, Keiji; Nakagawa, Takao; Murakami, Hiroshi; Matsuhara, Hideo; Takada, Makoto; Takai, Shigeki; Okabayashi, Akinobu; Tsunematsu, Shoji; Kanao, Kenichi; Narasaki, Katsuhiro



Respiratory hazards in hard metal workers: a cross sectional study.  

PubMed Central

A cross sectional study was conducted on 513 employees at three hard metal plants: 425 exposed workers (351 men, 74 women) and 88 controls (69 men, 19 women). Cough and sputum were more frequent in workers engaged in "soft powder" and presintering workshops than in controls (12.5% and 16.5% v 3.5%). Spirometric abnormalities were more frequent among women in sintering and finishing workshops than among control women (56.8% v 23.8%), and abnormalities of the carbon monoxide test were more frequent in exposed groups than in controls; this difference was more pronounced in women (31.4% v 5.6%) than in men (18.5% v 13%). No significant correlation was observed between duration of exposure and age-adjusted lung function tests. Slight abnormalities of chest radiographs (0/1, 1/1 according to the ILO classification) were more frequent in exposed men than in controls (12.8% v 1.9%), mostly in soft powder workers. In subjects with abnormal chest radiographs, FVC, FEV1, and carbon monoxide indices (fractional uptake of CO, CO transfer index, or both) were lower than in those with normal chest radiographs. Although relatively mild, the clinical, radiological, and functional abnormalities uncovered call for regular supervision of workers exposed to hard metal dust. PMID:2787666

Meyer-Bisch, C; Pham, Q T; Mur, J M; Massin, N; Moulin, J J; Teculescu, D; Carton, B; Pierre, F; Baruthio, F



Examination of Icing Induced Loss of Control and Its Mitigations  

NASA Technical Reports Server (NTRS)

Factors external to the aircraft are often a significant causal factor in loss of control (LOC) accidents. In today's aviation world, very few accidents stem from a single cause; they typically have a number of causal factors that culminate in a LOC accident. Very often the "trigger" that initiates an accident sequence is an external environmental factor. In a recent NASA statistical analysis of LOC accidents, aircraft icing was shown to be the most common external environmental LOC causal factor for scheduled operations. When investigating LOC accidents or incidents, aircraft icing causal factors can be categorized into groups of (1) in-flight encounter with super-cooled liquid water clouds, (2) take-off with ice contamination, or (3) in-flight encounter with high concentrations of ice crystals. As with other flight hazards, icing-induced LOC accidents can be prevented through avoidance, detection, and recovery mitigations. For icing hazards, avoidance can take the form of avoiding flight into icing conditions or avoiding the hazard of icing by making the aircraft tolerant to icing conditions. Icing detection mitigations can take the form of detecting icing conditions or detecting early performance degradation caused by icing. Recovery from icing-induced LOC requires a flight crew or automated systems capable of accounting for reduced aircraft performance and degraded control authority during the recovery maneuvers. In this report we review the icing-induced LOC accident mitigations defined in a recent LOC study and, for each mitigation, describe a research topic required to enable or strengthen the mitigation. Many of these research topics are already included in ongoing or planned NASA icing research activities or are being addressed by members of the icing research community. These research activities are described, and the status of the ongoing or planned research to address the technology needs is discussed.

Reehorst, Andrew L.; Addy, Harold E., Jr.; Colantonio, Renato O.



Preparation of a national Copernicus service for detection and monitoring of land subsidence and mass movements in the context of remote sensing assisted hazard mitigation  

NASA Astrophysics Data System (ADS)

Land subsidence can cause severe damage, e.g. to infrastructure and buildings, and mass movements can even lead to loss of life. Detection and monitoring of these processes by terrestrial measurement techniques remain a challenge due to limitations in spatial coverage and temporal resolution. Since the launch of ERS-1 in 1991, numerous scientific studies have demonstrated the capability of differential SAR interferometry (DInSAR) to detect surface deformation, proving the usability of this method. In order to assist the utilization of DInSAR for governmental tasks, a national service concept within the EU-ESA Program "Copernicus" is in preparation. This is done by (i) analyzing the user requirements, (ii) developing a concept, and (iii) performing case studies as a "proof of concept". Due to the iterative nature of this procedure, governmental users as well as DInSAR experts are involved. This paper introduces the concept and presents the available SAR data archive from ERS-1/2, TerraSAR-X, and TanDEM-X as well as the proposed case study. The case study focuses on the application of advanced DInSAR methods for the detection of subsidence in a region with active gas extraction. The area of interest is located in the state of Lower Saxony in northwestern Germany. The DInSAR analysis will be based on ERS-1/2 and TerraSAR-X/TanDEM-X SAR data. The usability of the DInSAR products will be discussed with the responsible mining authority (LBEG) in order to adapt the products to user needs and to evaluate the proposed concept.

Kalia, Andre C.; Frei, Michaela; Lege, Thomas



Seaside, Oregon, Tsunami Pilot Study-Modernization of FEMA Flood Hazard Maps: GIS Data  

USGS Publications Warehouse

Introduction: The Federal Emergency Management Agency (FEMA) Federal Insurance Rate Map (FIRM) guidelines do not currently exist for conducting and incorporating tsunami hazard assessments that reflect the substantial advances in tsunami research achieved in the last two decades; this conclusion is the result of two FEMA-sponsored workshops and the associated Tsunami Focused Study (Chowdhury and others, 2005). Therefore, as part of FEMA's Map Modernization Program, a Tsunami Pilot Study was carried out in the Seaside/Gearhart, Oregon, area to develop an improved Probabilistic Tsunami Hazard Analysis (PTHA) methodology and to provide recommendations for improved tsunami hazard assessment guidelines (Tsunami Pilot Study Working Group, 2006). The Seaside area was chosen because it is typical of many coastal communities in the section of the Pacific Coast from Cape Mendocino to the Strait of Juan de Fuca, and because State agencies and local stakeholders expressed considerable interest in mapping the tsunami threat to this area. The study was an interagency effort by FEMA, U.S. Geological Survey, and the National Oceanic and Atmospheric Administration (NOAA), in collaboration with the University of Southern California, Middle East Technical University, Portland State University, Horning Geoscience, Northwest Hydraulics Consultants, and the Oregon Department of Geological and Mineral Industries. We present the spatial (geographic information system, GIS) data from the pilot study in standard GIS formats and provide files for visualization in Google Earth, a global map viewer.

Wong, Florence L.; Venturato, Angie J.; Geist, Eric L.



Proportional hazards regression in epidemiologic follow-up studies: an intuitive consideration of primary time scale.  


In epidemiologic cohort studies of chronic diseases, such as heart disease or cancer, confounding by age can bias the estimated effects of risk factors under study. With Cox proportional-hazards regression modeling in such studies, it would generally be recommended that chronological age be handled nonparametrically as the primary time scale. However, studies involving baseline measurements of biomarkers or other factors frequently use follow-up time since measurement as the primary time scale, with no explicit justification. The effects of age are adjusted for by modeling age at entry as a parametric covariate. Parametric adjustment raises the question of model adequacy, in that it assumes a known functional relationship between age and disease, whereas using age as the primary time scale does not. We illustrate this graphically and show intuitively why the parametric approach to age adjustment using follow-up time as the primary time scale provides a poor approximation to age-specific incidence. Adequate parametric adjustment for age could require extensive modeling, which is wasteful, given the simplicity of using age as the primary time scale. Furthermore, the underlying hazard with follow-up time based on arbitrary timing of study initiation may have no inherent meaning in terms of risk. Given the potential for biased risk estimates, age should be considered as the preferred time scale for proportional-hazards regression with epidemiologic follow-up data when confounding by age is a concern. PMID:22517300
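The key mechanical difference when attained age is the primary time scale is delayed entry (left truncation): a subject contributes to the risk set at an event age only if already enrolled and still under follow-up at that age. A minimal sketch with hypothetical cohort data:

```python
# Risk-set construction with age as the primary time scale (delayed entry).
# Subjects and ages are hypothetical illustrations.

subjects = [
    # (id, entry_age, exit_age, event) -- event=True: disease at exit_age
    ("a", 40, 55, True),
    ("b", 50, 70, True),
    ("c", 45, 60, False),  # censored
    ("d", 58, 72, False),  # enrolls at 58: left-truncated before then
]

def risk_set(subjects, event_age):
    """Subjects at risk at a given attained age: already entered
    (entry_age <= event_age) and not yet exited (event_age <= exit_age)."""
    return {s[0] for s in subjects if s[1] <= event_age <= s[2]}

# Event for "a" at age 55: "d" (entry age 58) is correctly excluded.
rs_55 = risk_set(subjects, 55)
# Event for "b" at age 70: only "b" and "d" are still under follow-up.
rs_70 = risk_set(subjects, 70)
```

Comparing risk sets like these with the ones obtained on a follow-up-time scale makes the article's point concrete: on the age scale, like is compared with like at each event age, with no parametric age adjustment needed.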

Cologne, John; Hsu, Wan-Ling; Abbott, Robert D; Ohishi, Waka; Grant, Eric J; Fujiwara, Saeko; Cullings, Harry M



Climate engineering of vegetated land for hot extremes mitigation: an ESM sensitivity study  

NASA Astrophysics Data System (ADS)

Mitigation efforts to reduce anthropogenic climate forcing have thus far proven inadequate, as is evident from accelerating greenhouse gas emissions. Many subtropical and mid-latitude regions are expected to experience longer and more frequent heat waves and droughts within the next century. This increased occurrence of weather extremes has important implications for human health and mortality and for socio-economic factors including forest fires, water availability, and agricultural production. Various solar radiation management (SRM) schemes that attempt to homogeneously counter the anthropogenic forcing have been examined with different Earth System Models (ESMs). Land climate engineering schemes that reduce the amount of solar radiation absorbed at the surface have also been investigated. However, few studies have investigated their effects on extremes rather than on the mean climate response. Here we present the results of a series of climate engineering sensitivity experiments performed with the Community Earth System Model (CESM) version 1.0.2 at 2° resolution. This configuration entails five fully coupled model components responsible for simulating the Earth's atmosphere, land, land ice, ocean, and sea ice, which interact through a central coupler. Historical and RCP8.5 scenarios were performed with transient land-cover changes and prognostic terrestrial carbon/nitrogen cycles. Four sets of experiments are performed in which the surface albedo over snow-free vegetated grid points is increased by 0.05, 0.10, 0.15, and 0.20. The simulations show a strong preferential cooling of hot extremes throughout the Northern mid-latitudes during boreal summer. A strong linear scaling between the cooling of extremes and the additional surface albedo applied to the land model is observed. 
The strongest preferential cooling is found in southeastern Europe and the central United States, where increases of soil moisture and evaporative fraction are the largest relative to the control simulation. This preferential cooling is found to intensify in the future scenario. Cloud cover strongly limits the efficacy of a given surface albedo increase to reflect incoming solar radiation back into space. As anthropogenic forcing increases, cloud cover decreases over much of the northern mid-latitudes in CESM.

Wilhelm, Micah; Davin, Edouard; Seneviratne, Sonia



Use of geotextiles for mitigation of the effects of man-made hazards such as greening of waste deposits in frame of the conversion of industrial areas  

NASA Astrophysics Data System (ADS)

The city of Karlsruhe lies in the Rhine valley; however, it is situated at some distance from the Rhine river, and the waterfront is not integrated into the urban development. Nevertheless, its Rhine port developed into the second largest inland port in Germany. With the process of deindustrialisation, industrial use is now shrinking. Together with the simultaneous process of ecological reclamation of rivers, this suggests the conversion of the industrial area to green and residential uses. In the 1990s, a project was made by the third author of this contribution, together with Andrea Ciobanu, as students of the University of Karlsruhe, for the conversion of the Rhine port area of Karlsruhe to such nature-residential use. The area also included a waste deposit, proposed to be transformed into a "green hill". Such an integration of a waste deposit into a park during the conversion of an industrial area is not singular in Germany; several such projects were proposed and some of them realised at the IBA Emscher Park in the Ruhr area, some coupled with artistic projects. The technical details are also a subject of this contribution. Studies were made by the first two authors on the conditions under which plants grow on former waste deposits when supported by intermediate layers of a geotextile. The characteristics of the geotextiles, together with the production process and the results of laboratory and field experiments for use on waste deposits under comparable conditions in Romania, will be shown. The geotextile is also usable for ash deposits such as those in the Ruhr area.

Bostenaru, Magdalena; Siminea, Ioana; Bostenaru, Maria



Assessment and indirect adjustment for confounding by smoking in cohort studies using relative hazards models.  


Workers' smoking histories are not measured in many occupational cohort studies. Here we discuss the use of negative control outcomes to detect and adjust for confounding in analyses that lack information on smoking. We clarify the assumptions necessary to detect confounding by smoking and the additional assumptions necessary to indirectly adjust for such bias. We illustrate these methods using data from 2 studies of radiation and lung cancer: the Colorado Plateau cohort study (1950-2005) of underground uranium miners (in which smoking was measured) and a French cohort study (1950-2004) of nuclear industry workers (in which smoking was unmeasured). A cause-specific relative hazards model is proposed for estimation of indirectly adjusted associations. Among the miners, the proposed method suggests no confounding by smoking of the association between radon and lung cancer--a conclusion supported by adjustment for measured smoking. Among the nuclear workers, the proposed method suggests substantial confounding by smoking of the association between radiation and lung cancer. Indirect adjustment for confounding by smoking resulted in an 18% decrease in the adjusted estimated hazard ratio, yet this cannot be verified because smoking was unmeasured. Assumptions underlying this method are described, and a cause-specific proportional hazards model that allows easy implementation using standard software is presented. PMID:25245043
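The indirect adjustment step can be illustrated with a simple bias-factor calculation in the style of Axelson's classic approach, a deliberate simplification of the paper's cause-specific relative hazards model; all numbers below are hypothetical:

```python
# Indirect adjustment for an unmeasured binary confounder (e.g. smoking):
# divide the observed hazard ratio by an externally estimated bias factor.
# All inputs are hypothetical illustrations, not the study's estimates.

def bias_factor(hr_conf_outcome, p_conf_exposed, p_conf_unexposed):
    """Approximate multiplicative bias from a binary confounder."""
    num = p_conf_exposed * hr_conf_outcome + (1 - p_conf_exposed)
    den = p_conf_unexposed * hr_conf_outcome + (1 - p_conf_unexposed)
    return num / den

hr_observed = 1.50
# Hypothetical: smoking triples lung-cancer hazard and is somewhat more
# common among the exposed (55%) than the unexposed (45%).
b = bias_factor(hr_conf_outcome=3.0, p_conf_exposed=0.55, p_conf_unexposed=0.45)
hr_adjusted = hr_observed / b  # smaller than hr_observed, since b > 1
```

When the confounder is more prevalent among the exposed, the bias factor exceeds 1 and the adjusted hazard ratio shrinks, the same direction as the 18% decrease reported for the nuclear workers.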

Richardson, David B; Laurier, Dominique; Schubauer-Berigan, Mary K; Tchetgen Tchetgen, Eric; Cole, Stephen R



Characteristics and predictors of home injury hazards among toddlers in Wenzhou, China: a community-based cross-sectional study  

PubMed Central

Background: Home hazards are associated with toddlers receiving unintentional home injuries (UHI). These result not only in physical and psychological difficulties for children, but also in economic losses and additional stress for their families. Few researchers pay attention to predictors of home hazards among toddlers in a systematic way. The purpose of this study is, firstly, to describe the characteristics of homes with hazards and, secondly, to explore the predictive relationship of child, parent, and family factors to home hazards among toddlers aged 24-47 months in Wenzhou, China. Methods: Random cluster sampling was employed to select 366 parents of children aged 24-47 months from 13 kindergartens between March and April of 2012. Four instruments assessed home hazards, demographics, parents' awareness of UHI, and family functioning. Results: Descriptive statistics showed that the mean number of home hazards was 12.29 (SD = 6.39). The kinds of home hazards identified in over 50% of households included plastic bags (74.3%), coin buttons (69.1%), and toys with small components (66.7%). Multivariate linear regression revealed that the predictors of home hazards were the child's age, the child's residential status, and family functioning (b = .19, 2.02, and -.07, respectively). More hazards were significantly attributed to older toddlers, migrant toddlers, and poorer family functioning. This result suggests that health care providers should focus on vulnerable families and help parents assess home hazards. Further study is needed to find interventions on how to manage home hazards for toddlers in China. PMID:24953678




Microsoft Academic Search

Many reservoirs currently in operation trap most or all of the sediment entering the reservoir, creating sediment-depleted conditions downstream. This may cause channel adjustment in the form of bank erosion, bed erosion, substrate coarsening, and channel planform change. Channel adjustment may also result from episodic sediment releases during reservoir operation, or from sediment evacuation following dam removal. Channel adjustment to



Study on anaerobic digestion treatment of hazardous colistin sulphate-containing pharmaceutical sludge.  


Pharmaceutical sludge is considered a hazardous substance with high treatment and disposal fees. Anaerobic digestion can not only transform the hazardous substance into activated sludge, but also generate valuable biogas. This research had two objectives: first, studying the feasibility of anaerobic digestion and determining the biochemical methane potential (BMP) of pharmaceutical sludge under different inoculum-to-substrate TS ratios (ISRs) of 0, 0.65, 2.58, and 10.32 under mesophilic conditions of 37±1 °C; second, investigating the removal efficiency of colistin sulphate during anaerobic digestion. The results showed that using anaerobic digestion to treat pharmaceutical sludge is feasible and that it can completely eliminate the colistin sulphate. The highest biogas production from pharmaceutical sludge was 499.46 mL/g TS at an ISR of 10.32. PMID:25490101

Yin, Fubin; Wang, Dongling; Li, Zifu; Ohlsen, Thomas; Hartwig, Peter; Czekalla, Sven



CMMAD usability case study in support of countermine and hazard sensing  

NASA Astrophysics Data System (ADS)

During field trials, operator usability data were collected in support of lane clearing missions and hazard sensing for two robot platforms running Robot Intelligence Kernel (RIK) software with sensor scanning payloads onboard. The tests featured autonomous and shared robot autonomy levels, with robot tasking performed through a graphical interface displaying mine locations and sensor readings. The goal of this work was to provide insights that could be used to further technology development. The efficacy of countermine and hazard systems in terms of mobility, search, path planning, detection, and localization was assessed. Findings from objective and subjective operator interaction measures are reviewed, along with commentary from soldiers who took part in the study and strongly endorsed the system.

Walker, Victor G.; Gertman, David I.



49 CFR 195.579 - What must I do to mitigate internal corrosion?  

Code of Federal Regulations, 2010 CFR

...What must I do to mitigate internal corrosion? 195.579 Section 195.579...TRANSPORTATION OF HAZARDOUS LIQUIDS BY PIPELINE Corrosion Control § 195.579 What must I do to mitigate internal corrosion? (a) General. If you...





Status of volcanic hazard studies for the Nevada Nuclear Waste Storage Investigations  

SciTech Connect

Volcanism studies of the Nevada Test Site (NTS) region are concerned with hazards of future volcanism with respect to underground disposal of high-level radioactive waste. The hazards of silicic volcanism are judged to be negligible; hazards of basaltic volcanism are judged through research approaches combining hazard appraisal and risk assessment. The NTS region is cut obliquely by a N-NE trending belt of volcanism. This belt developed about 8 Myr ago following cessation of silicic volcanism and contemporaneous with migration of basaltic activity toward the southwest margin of the Great Basin. Two types of fields are present in the belt: (1) large-volume, long-lived basalt and local rhyolite fields with numerous eruptive centers and (2) small-volume fields formed by scattered basaltic scoria cones. Late Cenozoic basalts of the NTS region belong to the second field type. Monogenetic basalt centers of this region were formed mostly by Strombolian eruptions; Surtseyean activity has been recognized at three centers. Geochemically, the basalts of the NTS region are classified as straddle A-type basalts of the alkalic suite. Petrological studies indicate a volumetric dominance of evolved hawaiite magmas. Trace- and rare-earth-element abundances of younger basalt (<4 Myr) of the NTS region and southern Death Valley area, California, indicate an enrichment in incompatible elements, with the exception of rubidium. The conditional probability of recurring basaltic volcanism and disruption of a repository by that event is bounded by the range of 10^-8 to 10^-10 as calculated for a 1-yr period. Potential disruptive and dispersal effects of magmatic penetration of a repository are controlled primarily by the geometry of basalt feeder systems, the mechanism of waste incorporation in magma, and Strombolian eruption processes.
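As a rough illustration of what such an annual probability bound implies over a longer isolation period, the per-year figure can be compounded under an independence assumption; the 10,000-year period and the function name are illustrative assumptions, since the abstract gives only the 1-yr bounds:

```python
def prob_over_period(p_annual: float, years: int) -> float:
    """P(at least one disruptive event in `years`), assuming independent years."""
    return 1.0 - (1.0 - p_annual) ** years

# Compound the abstract's 1-yr bounds over a hypothetical 10,000-yr period
for p in (1e-10, 1e-8):
    print(f"p/yr = {p:.0e} -> P(10,000 yr) = {prob_over_period(p, 10_000):.2e}")
```

For probabilities this small the result is close to the simple product p·t, e.g. about 1e-4 for the upper bound.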

Crowe, B.M.; Vaniman, D.T.; Carr, W.J.



Mitigation potential of horizontal ground coupled heat pumps for current and future climatic conditions: UK environmental modelling and monitoring studies  

NASA Astrophysics Data System (ADS)

An increased uptake of alternative low or non-CO2 emitting energy sources is one of the key priorities for policy makers to mitigate the effects of environmental change. Relatively little work has been undertaken on the mitigation potential of Ground Coupled Heat Pumps (GCHPs), despite the fact that a GCHP could significantly reduce CO2 emissions from heating systems. It is predicted that under climate change the most probable scenario is for UK temperatures to increase and for winter rainfall to become more abundant; the latter is likely to cause a general rise in groundwater levels. Summer rainfall may reduce considerably, while vegetation type and density may change. Furthermore, recent studies underline the likelihood of an increase in the number of heat waves. Under such a scenario, GCHPs will increasingly be used for cooling as well as heating. These factors will affect the long-term performance of horizontal GCHP systems and hence their economic viability and mitigation potential during their life span (~50 years). The seasonal temperature differences encountered in soil are harnessed by GCHPs to provide heating in the winter and cooling in the summer. The performance of a GCHP system will depend on technical factors (heat exchanger (HE) type, length, depth, and spacing of pipes), but it will also be determined to a large extent by interactions between the below-ground parts of the system and the environment (atmospheric conditions, vegetation and soil characteristics). Depending on the balance between extraction and rejection of heat from and to the ground, the soil temperature in the neighbourhood of the HE may fall or rise. The GROMIT project (GROund coupled heat pumps MITigation potential), funded by the Natural Environment Research Council (UK), is a multi-disciplinary research project, in collaboration with EarthEnergy Ltd., which aims to quantify the CO2 mitigation potential of horizontal GCHPs.
It considers changing environmental conditions and combines model predictions of soil moisture content and soil temperature with measurements at different GCHP locations over the UK. The combined effect of environment dynamics and horizontal GCHP technical properties on long-term GCHP performance will be assessed using a detailed land surface model (JULES: Joint UK Land Environment Simulator, Meteorological Office, UK) with additional embedded equations describing the interaction between GCHP heat exchangers and the surrounding soil. However, a number of key soil physical processes are currently not incorporated in JULES, such as groundwater flow, which, especially in lowland areas, can have an important effect on the heat flow between soil and HE. Furthermore, the interaction between HE and soil may also cause soil vapour and moisture fluxes. These will affect soil thermal conductivity and hence heat flow between the HE and the surrounding soil, which will in turn influence system performance. The project will address these issues. We propose to drive an improved version of JULES (with equations to simulate GCHP exchange embedded) with long-term gridded (1 km) atmospheric, soil and vegetation data (reflecting current and future environmental conditions) to reliably assess the mitigation potential of GCHPs over the entire domain of the UK, where uptake of GCHPs has traditionally been low. In this way we can identify areas that are most suitable for the installation of GCHPs. Only then can recommendations be made to local and regional governments, for example, on how to improve the mitigation potential in less suitable areas by adjusting GCHP configurations or design.

García González, Raquel; Verhoef, Anne; Vidale, Pier Luigi; Gan, Guohui; Wu, Yupeng; Hughes, Andrew; Mansour, Majdi; Blyth, Eleanor; Finch, Jon; Main, Bruce



Economics of Tsunami Mitigation in the Pacific Northwest  

NASA Astrophysics Data System (ADS)

The death toll in a major Cascadia Subduction Zone (CSZ) tsunami may be comparable to the Tohoku tsunami: tens of thousands. To date, tsunami risk reduction activities have been almost exclusively hazard mapping and evacuation planning. Reducing deaths in locations where evacuation to high ground is impossible in the short time between ground shaking and the arrival of tsunamis requires measures such as vertical evacuation facilities or engineered pathways to safe ground. Yet very few, if any, such tsunami mitigation projects have been done. In contrast, many tornado safe room and earthquake mitigation projects driven entirely or largely by life safety have been done, with costs in the billions of dollars. The absence of tsunami mitigation measures results from the belief that tsunamis are too infrequent and the costs too high to justify life safety mitigation measures. A simple analysis based on return periods, death rates, and the geographic distribution of high risk areas for these hazards demonstrates that this belief is incorrect: well-engineered tsunami mitigation projects are more cost-effective, with higher benefit-cost ratios, than almost all tornado or earthquake mitigation projects. Goldfinger's paleoseismic studies of CSZ turbidites indicate return periods for major CSZ tsunamis of about 250-500 years (USGS Prof. Paper 1661-F, in press). Tsunami return periods are comparable to those for major earthquakes at a given location in high seismic areas and are much shorter than those for tornadoes at any location, which range from >4,000 to >16,000 years for >EF2 and >EF4 tornadoes, respectively. The average earthquake death rate in the US over the past 100 years is about 1/year, or about 30/year including the 1906 San Francisco earthquake. The average death rate for tornadoes is about 90/year. For CSZ tsunamis, the estimated average death rate ranges from about 20/year (10,000 every 500 years) to 80/year (20,000 every 250 years). 
Thus, the long-term death rates from tsunamis, earthquakes and tornadoes are comparable. High hazard areas for tornadoes and earthquakes cover ~40% and ~15% of the contiguous US, ~1,250,000 and ~500,000 square miles, respectively. In marked contrast, tsunami life safety risk is concentrated in communities with significant populations in areas where evacuation to high ground is impossible: probably <4,000 square miles, or <0.1% of the US. The geographic distribution of life safety risk profoundly affects the economics of tsunami life safety mitigation projects. Consider a tsunami life safety project which saves an average of one life per year (500 lives per 500 years). Using FEMA's value of human life ($5.8 million), a 7% discount rate, and a 50-year project useful lifetime, the net present value of avoided deaths is $80 million. Thus, the benefit-cost ratio would be about 16 or about 80 for tsunami mitigation projects which cost $5 million or $1 million, respectively. These rough calculations indicate that tsunami mitigation projects in high risk locations are economically justified. More importantly, these results indicate that national and local priorities for natural hazard mitigation should be reconsidered, with tsunami mitigation given a very high priority.
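The annuity arithmetic behind these figures can be sketched as follows. This reproduces the abstract's stated inputs (one life per year, $5.8 million per life, 7% discount rate, 50-year lifetime) with the standard present-value-of-an-annuity formula; it is an illustrative reconstruction, not the authors' actual worksheet:

```python
def npv_of_annuity(annual_benefit: float, rate: float, years: int) -> float:
    """Net present value of a level annual benefit: V * (1 - (1+r)^-n) / r."""
    return annual_benefit * (1 - (1 + rate) ** -years) / rate

# One statistical life ($5.8M) saved per year, 7% discount rate, 50 years
npv = npv_of_annuity(5.8e6, 0.07, 50)
print(f"NPV of avoided deaths: ${npv / 1e6:.0f} million")   # ~$80 million

# Benefit-cost ratios for the two project costs quoted in the abstract
for cost in (5e6, 1e6):
    print(f"cost ${cost / 1e6:.0f}M -> benefit-cost ratio {npv / cost:.0f}")
```

Running this recovers the abstract's round numbers: roughly $80 million in discounted benefits, hence ratios of about 16 and 80.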

Goettel, K. A.; Rizzo, A.; Sigrist, D.; Bernard, E. N.



ERTS-1 flood hazard studies in the Mississippi River Basin. [Missouri, Mississippi, and Arkansas  

NASA Technical Reports Server (NTRS)

The Spring 1973 Mississippi River flood was investigated using remotely sensed data from ERTS-1. Both manual and automatic analyses of the data indicate that ERTS-1 is extremely useful as a regional tool for flood and floodplain management. The maximum error of such flood area measurements is conservatively estimated to be less than five percent. Change detection analysis indicates that the flood had major impacts on soil moisture, land pattern stability, and vegetation stress. Flood hazard identification was conducted using photointerpretation techniques in three study areas along the Mississippi River using pre-flood ERTS-1 imagery down to 1:100,000 scale. Flood-prone area boundaries obtained from ERTS-1 were generally in agreement with flood hazard maps produced by the U.S. Army Corps of Engineers and the U.S. Geological Survey, although the latter are somewhat more detailed because of their larger scale. Initial results indicate that ERTS-1 digital mapping of flood-prone areas can be performed at scales of at least 1:62,500, which is comparable to conventional flood hazard map scales.

Rango, A.; Anderson, A. T.



Prediction of Ungauged River Basin for Hydro Power Potential and Flood Risk Mitigation; a Case Study at Gin River, Sri Lanka  

NASA Astrophysics Data System (ADS)

Most of the world's primary civilizations emerged in or near river valleys or floodplains. River channels and floodplains form a single hydrologic and geomorphic system, and the failure to appreciate the integral connection between them underlies many socioeconomic and environmental problems in river management today. However, collecting reliable field hydrological data is a difficult task; under such conditions, either synthetic or statistically generated data are used for hydraulic engineering design and flood modeling. The precipitation-runoff relationship for the Gin River basin was characterized through a synthetic unit hydrograph prepared using the method of the Flood Studies Report of the National Environmental Research Council, United Kingdom (1975). A Triangular Irregular Network model was constructed using a Geographic Information System (GIS) to determine hazard-prone zones. The 1:10,000 and 1:50,000 topographic maps and field excursions were also used for initial site selection of mini-hydropower units and to delineate the flooded area. Turbine output power was calculated from the net head and the efficiency of the turbine. Peak discharge is reached within 4.74 hours of the onset of a rainstorm, and the Gin River basin takes 11.95 hours to return to normal discharge conditions. The stream frequency of the Gin River is 4.56 junctions/km2, while the channel slope is 7.90 m/km. The regional coefficient of the catchment is 0.00296. High stream frequency and a gentle channel slope were recognized as the flood-triggering factors of the Gin River basin; other parameters, such as catchment area, main stream length, standard average annual rainfall and soil, do not show any significant variation from other catchments of Sri Lanka. 
The flood management process, including controlling flood disasters, preparing for floods, and minimizing their impacts, is complicated in floodplains encroached upon and modified by human populations. Modern GIS technology has therefore been productively applied to prepare hazard maps based on flood modeling, and these can be further utilized for disaster preparedness and mitigation activities. Five suitable hydraulic heads were identified for mini-hydropower sites; these would be the most economical and applicable flood-controlling hydraulic engineering structures considering all morphologic, climatic, environmental and socioeconomic proxies of the study area. The mini-hydropower sites would also serve as a clean, eco-friendly and reliable energy source (8630.0 kW). Finally, the Francis turbine can be employed as the most efficient turbine for the selected sites, bearing in mind both technical and economic parameters.
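The turbine output-power relation referred to above (net head and turbine efficiency) is conventionally P = η·ρ·g·Q·H. A minimal sketch follows; the flow rate and head values are hypothetical, since the abstract reports only the combined 8630.0 kW total:

```python
RHO_WATER = 1000.0  # density of water, kg/m^3
G = 9.81            # gravitational acceleration, m/s^2

def hydro_power_kw(flow_m3s: float, net_head_m: float, efficiency: float) -> float:
    """Turbine output P = eta * rho * g * Q * H, converted from W to kW."""
    return efficiency * RHO_WATER * G * flow_m3s * net_head_m / 1000.0

# Hypothetical site: 5 m^3/s through a 40 m net head at 90% turbine efficiency
print(f"{hydro_power_kw(5.0, 40.0, 0.90):.0f} kW")  # 1766 kW
```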

Ratnayake, A. S.



Evaluation of the fire and explosion hazards of oil-shale mining and processing. Volume 1. Analytical studies and accident scenarios. Open file report, 16 June 1977-15 July 1983  

SciTech Connect

The objectives of this research were to identify and evaluate potential fire and explosion hazards in oil-shale mining and processing through laboratory testing, to provide recommendations for mitigation and safety monitoring, and to establish a basis for regulation. A series of scenarios was developed describing hypothetical fire and explosion incidents that might occur in oil-shale mining. The objectives were achieved through the following accomplishments: (1) It was found that the fire and explosion properties of oil shale increase with oil-shale richness and with decreasing particle size. (2) Data from a dust loading study in several mines showed that the total potential yield of combustibles was about one-tenth the amount required to fuel a propagating explosion. (3) Aging of oil-shale dusts over a period of several years reduces the content of volatile combustibles and the corresponding fire and explosion properties. (4) Data and information from the completed program indicate that the hazard of dust explosions is less severe than the hazard of fire in mine muck piles. Laboratory data were used to relate the fire and explosivity properties of oil shales to those of coals and other carbonaceous materials and to assist in the identification and evaluation of potentially hazardous situations that may be encountered in oil-shale mining and processing.

Crookston, R.B.; Atwood, M.T.; Williams, R.E.; McGuire, M.E.



Study on FPGA SEU Mitigation for Readout Electronics of DAMPE BGO Calorimeter  

E-print Network

The BGO calorimeter, which provides a wide measurement range for the primary cosmic ray spectrum, is a key sub-detector of the Dark Matter Particle Explorer (DAMPE). The readout electronics of the calorimeter consist of 16 Actel ProASIC Plus flash-based FPGAs, whose design-level flip-flops and embedded block RAMs are sensitive to single event upsets (SEUs) in the harsh space environment. Therefore, to comply with radiation hardness assurance (RHA), SEU mitigation methods including partial triple modular redundancy (TMR), CRC checksums, and multi-domain reset are analyzed and verified in a heavy-ion beam test. Built from multi-level redundancy, an FPGA design with SEU tolerance and low resource consumption is implemented for the readout electronics.
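The core idea of the TMR scheme mentioned above can be illustrated with a bitwise 2-of-3 majority vote: three copies of a register are compared and the majority wins, masking an upset in any single copy. This is a conceptual sketch of the principle only, not the Actel FPGA implementation:

```python
def tmr_vote(a: int, b: int, c: int) -> int:
    """Bitwise 2-of-3 majority of three redundant register values."""
    return (a & b) | (b & c) | (a & c)

good = 0b1011_0101
upset = good ^ 0b0000_1000          # one bit flipped by a single-event upset
print(bin(tmr_vote(good, good, upset)))  # 0b10110101 -- the upset is voted out
```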

Shen, Zhongtao; Gao, Shanshan; Zhang, Deliang; Jiang, Di; Liu, Shubin; An, Qi



Study on FPGA SEU Mitigation for Readout Electronics of DAMPE BGO Calorimeter  

E-print Network

The BGO calorimeter, which provides a wide measurement range for the primary cosmic ray spectrum, is a key sub-detector of the Dark Matter Particle Explorer (DAMPE). The readout electronics of the calorimeter consist of 16 Actel ProASIC Plus flash-based FPGAs, whose design-level flip-flops and embedded block RAMs are sensitive to single event upsets (SEUs) in the harsh space environment. Therefore, to comply with radiation hardness assurance (RHA), SEU mitigation methods including partial triple modular redundancy (TMR), CRC checksums, and multi-domain reset are analyzed and verified in a heavy-ion beam test. Built from multi-level redundancy, an FPGA design with SEU tolerance and low resource consumption is implemented for the readout electronics.

Zhongtao Shen; Changqing Feng; Shanshan Gao; Deliang Zhang; Di Jiang; Shubin Liu; Qi An



Mapping Europe's Seismic Hazard  

NASA Astrophysics Data System (ADS)

From the rift that cuts through the heart of Iceland to the complex tectonic convergence that causes frequent and often deadly earthquakes in Italy, Greece, and Turkey to the volcanic tremors that rattle the Mediterranean, seismic activity is a prevalent and often life-threatening reality across Europe. Any attempt to mitigate the seismic risk faced by society requires an accurate estimate of the seismic hazard.

Giardini, Domenico; Wössner, Jochen; Danciu, Laurentiu



GIS data for the Seaside, Oregon, Tsunami Pilot Study to modernize FEMA flood hazard maps  

USGS Publications Warehouse

A Tsunami Pilot Study was conducted for the area surrounding the coastal town of Seaside, Oregon, as part of the Federal Emergency Management Agency's (FEMA) Flood Insurance Rate Map Modernization Program (Tsunami Pilot Study Working Group, 2006). The Cascadia subduction zone extends from Cape Mendocino, California, to Vancouver Island, Canada. The Seaside area was chosen because it is typical of many coastal communities subject to tsunamis generated by far- and near-field (Cascadia) earthquakes. Two goals of the pilot study were to develop probabilistic 100-year and 500-year tsunami inundation maps using Probabilistic Tsunami Hazard Analysis (PTHA) and to provide recommendations for improving tsunami hazard assessment guidelines for FEMA and state and local agencies. The study was an interagency effort by the National Oceanic and Atmospheric Administration, U.S. Geological Survey, and FEMA, in collaboration with the University of Southern California, Middle East Technical University, Portland State University, Horning Geoscience, Northwest Hydraulics Consultants, and the Oregon Department of Geological and Mineral Industries. The pilot study model data and results are published separately as a geographic information systems (GIS) data report (Wong and others, 2006). The flood maps and GIS data are briefly described here.

Wong, Florence L.; Venturato, Angie J.; Geist, Eric L.



Awareness of occupational hazards and use of safety measures among welders: a cross-sectional study from eastern Nepal  

PubMed Central

Objective The proper use of safety measures by welders is an important way of preventing and/or reducing the variety of health hazards they are exposed to during welding. Knowledge of hazards and personal protective equipment (PPE) among welders in Nepal is lacking, and their use of PPE is limited. We designed a study to assess welders' awareness of hazards and PPE, and the use of PPE, among the welders of eastern Nepal, and to find a possible correlation between awareness and use of PPE among them. Materials and methods A cross-sectional study of 300 welders, selected by simple random sampling from three districts of eastern Nepal, was conducted using a semistructured questionnaire. Data regarding age, education level, duration of employment, awareness of hazards, safety measures and the actual use of safety measures were recorded. Results Overall, 272 (90.7%) welders were aware of at least one hazard of welding, and a similar proportion were aware of at least one item of PPE. However, only 47.7% used one or more types of PPE. Education and duration of employment were significantly associated with the awareness of hazards and of PPE and its use. The welders who reported using PPE during welding were two times more likely to have been aware of hazards (OR=2.52, 95% CI 1.09 to 5.81) and five times more likely to have been aware of PPE than the welders who did not report using PPE (OR=5.13, 95% CI 2.34 to 11.26). Conclusions The welders using PPE were those who were aware of hazards and PPE. There is a gap between awareness of hazards and PPE (90%) and use of PPE (47%) at work. Further research is needed to identify the underlying factors leading to low utilisation of PPE despite the welders of eastern Nepal being knowledgeable about it. PMID:24889850
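The odds ratios quoted above follow the standard 2x2-table formula with a Woolf-type 95% confidence interval. A sketch with hypothetical cell counts (the study's raw counts are not given in the abstract, so these numbers are made up purely to show the formula):

```python
import math

def odds_ratio_ci(a: int, b: int, c: int, d: int, z: float = 1.96):
    """OR = (a*d)/(b*c); Woolf CI = exp(ln OR +/- z*sqrt(1/a+1/b+1/c+1/d))."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    return or_, math.exp(math.log(or_) - z * se), math.exp(math.log(or_) + z * se)

# Hypothetical 2x2 table: aware/unaware among PPE users and non-users
or_, lo, hi = odds_ratio_ci(130, 13, 120, 30)
print(f"OR={or_:.2f}, 95% CI {lo:.2f} to {hi:.2f}")
```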

Budhathoki, Shyam Sundar; Singh, Suman Bahadur; Sagtani, Reshu Agrawal; Niraula, Surya Raj; Pokharel, Paras Kumar



Properties of Hazards  

NSDL National Science Digital Library

A hazard is any physical, biological or chemical agent that poses a threat to human health. You can be exposed to hazards through the way you live, the job you do, or the environment that surrounds you. This "Properties of Hazards" module has five instructional units. Follow the links above in order. Each lesson contains study material for one day. Learn more about just how close you are to environmental health hazards and how they can affect you.



Hazardous Waste  


... you throw these substances away, they become hazardous waste. Some hazardous wastes come from products in our homes. Our garbage can include such hazardous wastes as old batteries, bug spray cans and paint ...


Results of alkylation unit hazard, operability study are analyzed and summarized  

Microsoft Academic Search

Hazard and operability analysis (Hazop) has recently been applied to hydrocarbon alkylation units. The Hazop technique is a structured process hazards analysis method that can be used to help identify hazards and manage the associated risks. This analysis tool is gaining widespread acceptance as a key tool for process safety management in the hydrocarbon processing industry. Important results and observations have

D. K. Whittle; D. K. Lorenzo; J. Q. Kirkman



Space Debris & its Mitigation  

NASA Astrophysics Data System (ADS)

Space debris has become a growing concern in recent years, since collisions at orbital velocities can be highly damaging to functioning satellites and can also produce even more space debris in the process. Some spacecraft, like the International Space Station, are now armored to deal with this hazard, but armor and mitigation measures can be prohibitively costly when trying to protect satellites or human spaceflight vehicles like the shuttle. This paper describes the current orbital debris environment, outlines its main sources, and identifies mitigation measures to reduce orbital debris growth by controlling these sources. We studied the literature on space debris, propose some methods to address the problem, highlight the shortcomings of methods already proposed by space experts, and suggest modifications to those methods. Some of the existing methods can be very effective in mitigating space debris, but others need modification. Methods recently proposed by space experts include maneuvering, shielding the space elevator with foil, vaporizing or redirecting space debris back to Earth with lasers, using aerogel as a protective layer, constructing large junkyards around the International Space Station, using electrodynamic tethers and, most recently, using nanosatellites to clear space debris. Limitations of the already proposed methods are as follows: - Maneuvering cannot be the final solution to the problem, as it is only an act of self-defence. - Shielding cannot be applied to parts such as solar panels and optical devices. - Vaporizing or redirecting space debris can affect human life on Earth if not done properly. - Aerogel has a threshold limit up to which it can resist the impact of a collision. - Large junkyards can be effective only for large debris. In this paper we propose: A. 
The use of nanotubes in a mesh: in this technique a mesh of nanotubes acts like the touch panel of a touch-screen cell phone. When any small or tiny particle touches the mesh, a processor or sensor registers its coordinates, and a destructive laser beam can then be used to destroy the particle. B. Use of nanotubes and nanobots for collecting space debris: this method also uses a nano mesh made of nanotubes, arranged to act as a touch panel as in touch-screen phones. When tiny particles strike the mesh, nanobots at the registered coordinates collect the particles and store them in garbage storage. C. Space debris can also be put to other uses: since space debris can be any tiny particle in space, instead of decomposing or destroying the particles they could be used for energy production in fuel cells, provided the particle material is capable of forming an ionized liquid or solution usable in a fuel cell. This is useful only for large projects in which even the smallest amount of energy is of great value. D. RECYCLING OF SPACE DEBRIS: The general idea of making space structures by recycling space debris is to capture the aluminum of spent upper stages, melt it, and form it into new aluminum structures, perhaps by coating the inside of inflatable balloons, to make very large structures of thin aluminum shells. CONCLUSION: Space debris has become a topic of great concern in recent years. Its creation cannot be stopped completely, but it can be minimized by adopting some measures. 
Many methods of space debris mitigation have been proposed by space experts, but some of them have limitations. After some

Kaushal, Sourabh; Arora, Nishant



Application of a Data Mining Model and Its Cross-Application for Landslide Hazard Analysis: a Case Study in Malaysia  

NASA Astrophysics Data System (ADS)

This paper deals with landslide hazard analysis and cross-application using Geographic Information System (GIS) and remote sensing data for Cameron Highland, Penang Island and Selangor in Malaysia. The aim of this study was to cross-apply and verify a spatial probabilistic model for landslide hazard analysis. Landslide locations were identified in the study area from interpretation of aerial photographs and field surveys. Topographical/geological data and satellite images were collected and processed using GIS and image processing tools. Ten landslide-inducing parameters were considered for the landslide hazard analysis: topographic slope, aspect, curvature and distance from drainage, all derived from the topographic database; geology and distance from lineament, derived from the geologic database; landuse from Landsat satellite images; soil from the soil database; precipitation amount, derived from the rainfall database; and the vegetation index value from SPOT satellite images. These factors were analyzed using an artificial neural network model to generate the landslide hazard map. Each factor's weight was determined by the back-propagation training method. The landslide hazard indices were then calculated using the trained back-propagation weights, and finally the landslide hazard map was generated using GIS tools. Landslide hazard maps were drawn for the three areas using an artificial neural network model trained not only on data from each area itself but also using the factor weights calculated from each of the other two areas (nine maps in all), as a cross-check of the validity of the method. For verification, the results of the analyses were compared, in each study area, with actual landslide locations. The verification results showed sufficient agreement between the presumptive hazard maps and the existing data on landslide areas.
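The hazard-index step described above can be sketched as a weighted sum of normalised factor values passed through an activation, with the weights playing the role of the back-propagation-trained weights. The single-neuron model, the factor values and the weights below are all illustrative assumptions, not the study's trained network:

```python
import math

def hazard_index(factors: list[float], weights: list[float], bias: float = 0.0) -> float:
    """Weighted sum of normalised factor values, squashed to (0, 1) by a sigmoid."""
    s = sum(f * w for f, w in zip(factors, weights)) + bias
    return 1.0 / (1.0 + math.exp(-s))

# Hypothetical grid cell: [slope, aspect, curvature, dist_from_drainage], normalised 0-1
cell = [0.8, 0.3, 0.6, 0.2]
w = [1.5, 0.2, 0.9, -1.1]  # made-up weights standing in for trained ones
print(f"landslide hazard index: {hazard_index(cell, w):.3f}")
```

In the study each cell of the GIS raster would receive such an index, and the map is produced by classifying the indices.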

Pradhan, Biswajeet; Lee, Saro; Shattri, Mansor


Concerns About Climate Change Mitigation Projects: Summary of Findings from Case Studies in Brazil, India, Mexico, and South Africa  

SciTech Connect

The concept of joint implementation as a way to implement climate change mitigation projects in another country has been controversial ever since its inception. Developing countries have raised numerous issues at the project-specific technical level, and broader concerns having to do with equity and burden sharing. This paper summarizes the findings of studies for Brazil, India, Mexico and South Africa, four countries that have large greenhouse gas emissions and are heavily engaged in the debate on climate change projects under the Kyoto Protocol. The studies examine potential or current projects/programs to determine whether eight technical concerns about joint implementation can be adequately addressed. They conclude that about half the concerns were minor or well managed by project developers, but concerns about additionality of funds, host country institutions and guarantees of performance (including the issues of baselines and possible leakage) need much more effort to be adequately addressed. All the papers agree on the need to develop institutional arrangements for approving and monitoring such projects in each of the countries represented. The case studies illustrate that these projects have the potential to bring new technology, investment, employment and ancillary socioeconomic and environmental benefits to developing countries. These benefits are consistent with the goal of sustainable development in the four study countries. At a policy level, the studies' authors note that in their view, the Annex I countries should consider limits on the use of jointly implemented projects as a way to get credits against their own emissions at home, and stress the importance of industrialized countries developing new technologies that will benefit all countries. 
The authors also observe that if all countries accepted caps on their emissions (with a longer time period allowed for developing countries to do so) project-based GHG mitigation would be significantly facilitated by the improved private investment climate.

Sathaye, Jayant A.; Andrasko, Kenneth; Makundi, Willy; La Rovere, Emilio Lebre; Ravinandranath, N.H.; Melli, Anandi; Rangachari, Anita; Amaz, Mireya; Gay, Carlos; Friedmann, Rafael; Goldberg, Beth; van Horen, Clive; Simmonds, Gillina; Parker, Gretchen



Viscoelastic Materials Study for the Mitigation of Blast-Related Brain Injury  

NASA Astrophysics Data System (ADS)

Recent preliminary research into the causes of blast-related brain injury indicates that exposure to blast pressures, such as from IED detonation or multiple firings of a weapon, causes damage to brain tissue resulting in Traumatic Brain Injury (TBI) and Post Traumatic Stress Disorder (PTSD). Current combat helmets are not sufficient to protect the warfighter from this danger and the effects are debilitating, costly, and long-lasting. Commercially available viscoelastic materials, designed to dampen vibration caused by shock waves, might be useful as helmet liners to dampen blast waves. The objective of this research is to develop an experimental technique to test these commercially available materials when subject to blast waves and evaluate their blast mitigating behavior. A 40-mm-bore gas gun is being used as a shock tube to generate blast waves (ranging from 1 to 500 psi) in a test fixture at the gun muzzle. A fast opening valve is used to release nitrogen gas from the breech to impact instrumented targets. The targets consist of aluminum/viscoelastic polymer/aluminum materials. Blast attenuation is determined through the measurement of pressure and accelerometer data in front of and behind the target. The experimental technique, calibration and checkout procedures, and results will be presented.

Bartyczak, Susan; Mock, Willis, Jr.



Implications of Adhesion Studies for Dust Mitigation on Thermal Control Surfaces  

NASA Technical Reports Server (NTRS)

Experiments measuring the adhesion forces under ultrahigh vacuum conditions (10 (exp -10) torr) between a synthetic volcanic glass and commonly used space exploration materials have recently been described. The glass has a chemistry and surface structure typical of the lunar regolith. It was found that Van der Waals forces between the glass and common spacecraft materials were negligible. Charge transfer between the materials was induced by mechanically striking the spacecraft material pin against the glass plate. No measurable adhesion occurred when striking the highly conducting materials; however, on striking insulating dielectric materials the adhesion increased dramatically. This indicates that electrostatic forces dominate over Van der Waals forces under these conditions. The presence of small amounts of surface contaminants was found to lower adhesive forces by at least two orders of magnitude, and perhaps more. Both particle and space exploration material surfaces will be cleaned by interaction with the solar wind and other energetic processes, and will stay clean because of the extremely high vacuum (10 (exp -12) torr), so the atomically clean adhesion values are probably the relevant ones for the lunar surface environment. These results are used to interpret the results of dust mitigation technology experiments utilizing textured surfaces, work-function-matching surfaces, and brushing. They have also been used to reinterpret the results of the Apollo 14 Thermal Degradation Samples experiment.

Gaier, James R.; Berkebile, Stephen P.



Study of cover source mismatch in steganalysis and ways to mitigate its impact  

NASA Astrophysics Data System (ADS)

When a steganalysis detector trained on one cover source is applied to images from a different source, the detection error generally increases due to the mismatch between the sources. In steganalysis, this situation is recognized as the so-called cover source mismatch (CSM). The drop in detection accuracy depends on many factors, including the properties of both sources, the detector construction, the feature space used to represent the covers, and the steganographic algorithm. Although well recognized as the single most important factor negatively affecting the performance of steganalyzers in practice, the CSM has received surprisingly little attention from researchers. One of the reasons for this is the diversity with which the CSM can manifest. In a series of experiments in the spatial and JPEG domains, we refute some common misconceptions, namely that the severity of the CSM is tied to the feature dimensionality or the features' "fragility." The CSM impact on detection appears too difficult to predict due to the effect of complex dependencies among the features. We also investigate ways to mitigate the negative effect of the CSM using simple measures, such as enlarging the diversity of the training set (training on a mixture of sources) and employing a bank of detectors trained on multiple different sources, testing on the detector trained on the closest source.

Kodovský, Jan; Sedighi, Vahid; Fridrich, Jessica



First Production of C60 Nanoparticle Plasma Jet for Study of Disruption Mitigation for ITER  

NASA Astrophysics Data System (ADS)

Unique fast response and large mass-velocity delivery of nanoparticle plasma jets (NPPJs) provide a novel application for ITER disruption mitigation, runaway electron diagnostics, and deep fueling. NPPJs carry a much larger mass than usual gases. An electromagnetic plasma gun provides a very high injection velocity (many km/s). An NPPJ has much higher ram pressure than any standard gas injection method and penetrates the tokamak confining magnetic field. Assimilation is enhanced due to the nanoparticles' large surface-to-volume ratio. Radially expanding NPPJs help achieve toroidal uniformity of radiation power. FAR-TECH's NPPJ system was successfully tested: a coaxial plasma gun prototype (~35 cm length, 96 kJ energy) using a solid state TiH2/C60 pulsed power cartridge injector produced a hyper-velocity (>4 km/s), high-density (>10^23 m-3) C60 plasma jet in ~0.5 ms, with ~1-2 ms overall response-delivery time. We present the TiH2/C60 cartridge injector output characterization (~180 mg of sublimated C60 gas) and first production results of a high momentum C60 plasma jet (~0.6

Bogatu, I. N.; Thompson, J. R.; Galkin, S. A.; Kim, J. S.; Brockington, S.; Case, A.; Messer, S. J.; Witherspoon, F. D.



Social Uptake of Scientific Understanding of Seismic Hazard in Sumatra and Cascadia  

NASA Astrophysics Data System (ADS)

The importance of science within hazard mitigation cannot be overstated. Robust mitigation policies rely strongly on a sound understanding of the science underlying potential natural disasters and the transfer of that knowledge from the scientific community to the general public via governments and policy makers. We aim to investigate how and why the public's knowledge, perceptions, response, adjustments and values towards science have changed throughout two decades of research conducted in areas along and adjacent to the Sumatran and Cascadia subduction zones. We will focus on two countries subject to the same potential hazard, but which encompass starkly contrasting political, economic, social and environmental settings. The transfer of scientific knowledge into the public/social arena is a complex process, the success of which is reflected in a community's ability to withstand large-scale devastating events. Although no one could have foreseen the magnitude of the 2004 Boxing Day tsunami, the social devastation it generated underscored the stark absence of mitigation measures in the nations most heavily affected. It furthermore emphasized the need for the design and implementation of disaster preparedness measures. A survey of the existing literature has already established timelines for major events and public policy changes in the case study areas. Clear evidence exists of the link between scientific knowledge and its subsequent translation into public policy, particularly in the Cascadia context. The initiation of the National Tsunami Hazard Mitigation Program following the Cape Mendocino earthquake in 1992 embodies this link. Despite a series of environmental disasters with recorded widespread fatalities dating back to the mid-1900s, and a heightened impetus for scientific research into tsunami/earthquake hazard following the 2004 Boxing Day tsunami, the translation of science into the public realm is not widely obvious in the Sumatran context.
This research aims to further investigate how the enhanced understanding of earthquake and tsunami hazards is being used to direct hazard mitigation strategies and enables direct comparison with the scientific and public policy developments in Cascadia.

Shannon, R.; McCloskey, J.; Guyer, C.; McDowell, S.; Steacy, S.



Assessing natural hazards in high-latitude fiords: A case study in Passage Canal, south-central Alaska  

NASA Astrophysics Data System (ADS)

Mountainous regions are susceptible to a range of natural hazards that can cause damage to property and loss of life. At high-latitudes, mountainous regions are typically characterized by glacial, paraglacial, and periglacial environments, making them highly sensitive to thermal perturbations, which can lead to a change in the magnitude and frequency of some natural hazards. A large number of communities in Alaska are located in remote steep fiords that are host to multiple hazards, many of which are connected by shared triggering mechanisms or by positive feedbacks. This study employs air- and spaceborne remote sensing and in situ observations to assess natural hazards and evaluate environmental change in Passage Canal, south-central Alaska.

Wolken, G. J.; Balazs, M. S.



Natural Hazards and Earth System Sciences, 6, 21-32, 2006, SRef-ID: 1684-9981/nhess/2006-6-21

E-print Network

resources for emergency management. A specific case study pertaining to the hydrological risk in the Val. The effects of natural hazards can be mitigated by the use of proper "pre-event" interventions on "key" elements; methodologies and tools should be studied to support decision makers in the analysis of a territory, in order

Boyer, Edmond


Natural phenomena hazards, Hanford Site, Washington  

SciTech Connect

This document presents the natural phenomena hazard loads for use in implementing DOE Order 5480.28, Natural Phenomena Hazards Mitigation, and supports development of double-shell tank systems specifications at the Hanford Site in south-central Washington State. The natural phenomena covered are seismic, flood, wind, volcanic ash, lightning, snow, temperature, solar radiation, suspended sediment, and relative humidity.

Conrads, T.J.



Property-close source separation of hazardous waste and waste electrical and electronic equipment - A Swedish case study  

SciTech Connect

Through an agreement with EEE producers, Swedish municipalities are responsible for collection of hazardous waste and waste electrical and electronic equipment (WEEE). In most Swedish municipalities, collection of these waste fractions is concentrated at waste recycling centres where households can source-separate and deposit hazardous waste and WEEE free of charge. However, the centres are often located on the outskirts of city centres, and in most cases cars are needed to use the facilities. A full-scale experiment was performed in a residential area in southern Sweden to evaluate the effects of a system for property-close source separation of hazardous waste and WEEE. After the system was introduced, results show a clear reduction in the amount of hazardous waste and WEEE disposed of incorrectly amongst residual waste or dry recyclables. The system resulted in a source separation ratio of 70 wt% for hazardous waste and 76 wt% in the case of WEEE. Results show that households in the study area were willing to increase source separation of hazardous waste and WEEE when accessibility was improved, and that this and similar collection systems can play an important role in building up increasingly sustainable solid waste management systems.

Bernstad, Anna, E-mail: [Dep. of Chem. Eng., Faculty of Eng., Lund University, Lund (Sweden); Cour Jansen, Jes la [Dep. of Chem. Eng., Faculty of Eng., Lund University, Lund (Sweden); Aspegren, Henrik [VA SYD, City of Malmoe (Sweden)



Geologic hazards and Alaska's communities in a changing climate  

NASA Astrophysics Data System (ADS)

Observations indicate that changes in climate modify or intensify geomorphic processes in high-latitude regions. Changes in these processes can increase the magnitude and frequency of geologic hazards, leading to casualties, damage to property and infrastructure, and a host of socio-economic problems. Numerous communities in Alaska are threatened by geologic hazards and are currently involved in adaptation or mitigation efforts to cope with these risks. In many communities, relocation is the preferred method for managing risk, but a lack of baseline geoscience data prohibits a sound evaluation of geologic hazards and recent landscape change and prevents informed community decision making. In an attempt to bridge this information gap, the Climate Change Hazards Program at the Alaska Division of Geological & Geophysical Surveys (DGGS) is collecting baseline geoscience data, quantifying landscape change, and conducting hazards assessments in and around imperiled communities in Alaska. An important and challenging step in each study is effectively communicating scientific results to community residents, other government agencies, and policy makers, which requires communication beyond peer-reviewed publications. Community visits, public meetings, and workshops are potentially important mechanisms for disseminating geologic hazards information to stakeholders in Alaska. Current DGGS pilot projects in the areas of Kivalina and Koyukuk illustrate the need for conducting geologic hazards assessments and properly disseminating scientific information.

Wolken, G. J.



Eco-efficiency for greenhouse gas emissions mitigation of municipal solid waste management: A case study of Tianjin, China  

SciTech Connect

The issue of municipal solid waste (MSW) management has been highlighted in China due to the continually increasing MSW volumes being generated and the limited capacity of waste treatment facilities. This article presents a quantitative eco-efficiency (E/E) analysis of MSW management in terms of greenhouse gas (GHG) mitigation. A methodology for E/E analysis is proposed, with an emphasis on the consistent integration of life cycle assessment (LCA) and life cycle costing (LCC). The environmental and economic impacts derived from LCA and LCC were normalized and defined as a quantitative E/E indicator. The proposed method was applied in a case study of Tianjin, China. The study assessed the current MSW management system, as well as a set of alternative scenarios, to investigate trade-offs between economy and GHG emissions mitigation. Additionally, contribution analysis was conducted on both LCA and LCC to identify the key issues driving environmental and economic impacts. The results show that Tianjin's current MSW management system emits the most GHG and costs the least, whereas the situation reverses in the integrated scenario. The key issues identified by the contribution analysis show no linear relationship between the global warming impact and the cost impact in the MSW management system. The landfill gas utilization scenario is indicated as a potential optimum by the proposed E/E analysis, given the characteristics of the MSW, technology levels, and chosen methodologies. The E/E analysis provides an attractive direction towards sustainable waste management, though some questions with respect to uncertainty need further discussion.

Zhao Wei, E-mail: [College of Civil Engineering and Architecture, Liaoning University of Technology, 121000 Jinzhou (China); Institute of Environmental Sciences (CML), Leiden University, P.O. Box 9518, 2300RA Leiden (Netherlands); Huppes, Gjalt, E-mail: [Institute of Environmental Sciences (CML), Leiden University, P.O. Box 9518, 2300RA Leiden (Netherlands); Voet, Ester van der, E-mail: [Institute of Environmental Sciences (CML), Leiden University, P.O. Box 9518, 2300RA Leiden (Netherlands)
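A minimal sketch of how a normalized E/E indicator of this kind could be computed, assuming GHG impact (from LCA) and cost (from LCC) are each normalized against the current reference system; the scenario names and all numbers below are illustrative, not the Tianjin results.

```python
# Hypothetical eco-efficiency indicator: fraction of GHG avoided relative
# to the reference system, divided by cost relative to the reference.
def eco_efficiency(ghg_kg_co2e, cost_usd, ref_ghg, ref_cost):
    """Higher value = more GHG mitigated per unit of normalized cost."""
    ghg_saving = (ref_ghg - ghg_kg_co2e) / ref_ghg  # fraction of GHG avoided
    rel_cost = cost_usd / ref_cost                  # cost relative to reference
    return ghg_saving / rel_cost

# Invented per-tonne figures: (GHG in kg CO2e, cost in USD).
ref = (800.0, 40.0)  # current system: highest GHG, lowest cost
scenarios = {"landfill_gas_use": (620.0, 48.0), "incineration": (540.0, 75.0)}
for name, (ghg, cost) in scenarios.items():
    print(name, round(eco_efficiency(ghg, cost, *ref), 3))
```

With these invented numbers the landfill-gas scenario scores higher than incineration, mirroring the abstract's trade-off between GHG mitigation and cost.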



A Review of Hazardous Chemical Toxicity Studies Utilizing Genetically-Modified Animals-Their Applications for Risk Assessment  

Microsoft Academic Search

Studies on the mechanisms of chemical toxicity carried out using knockout mice lacking genes of enzymes for drug metabolism or nuclear receptor proteins were reviewed, and the studies were compared with the respective conventional mechanistic studies. While the toxicity of many hazardous chemicals was observed only in wild-type or knockout mice, which clearly showed that their toxicity was involved in

Tamie NAKAJIMA; Rui-Sheng WANG; Yuki ITO; Toshifumi AOYAMA; Michihiro KAMIJIMA



Case study: Mapping tsunami hazards associated with debris flow into a reservoir  

USGS Publications Warehouse

Debris-flow generated impulse waves (tsunamis) pose hazards in lakes, especially those used for hydropower or recreation. We describe a method for assessing tsunami-related hazards for the case in which inundation by coherent water waves, rather than chaotic splashing, is of primary concern. The method involves an experimentally based initial condition (tsunami source) and a Boussinesq model for tsunami propagation and inundation. Model results are used to create hazard maps that offer guidance for emergency planners and responders. An example application explores tsunami hazards associated with potential debris flows entering Baker Lake, a reservoir on the flanks of the Mount Baker volcano in the northwestern United States. © 2006 ASCE.

Walder, J.S.; Watts, P.; Waythomas, C.F.



Geomorphological surveys and software simulations for rock fall hazard assessment: a case study in the Italian Alps  

NASA Astrophysics Data System (ADS)

In northern Italy, fast-moving landslides represent a significant threat to the population and human facilities. In the eastern portion of the Italian Alps, rock falls are recurrent and are often responsible for casualties or severe damage to roads and buildings. This type of landslide is frequent in mountain ranges characterised by strong relief energy, and is triggered by earthquakes or copious rainfall, which often exceeds 2000 mm yr-1. These factors cause morphological dynamics with intense slope erosion and degradation processes. This work investigates the appraisal of the rock fall hazard related to the presence of several large unstable blocks located at the top of a limestone peak, approximately 500 m NW of the village of Cimolais. Field surveys identified a limestone block exceeding a volume of 400 m3 as the most hazardous for Cimolais because of its proximity to the rocky cliff. A first assessment of the possible transit and stop areas was carried out through in-depth traditional activities, such as geomorphological mapping and aerial photo analysis. The output of the field surveys was a detailed land use map, which provided a fundamental starting point for rock fall software analysis. The geomorphological observations were correlated with DTMs derived from regional topography and Airborne Laser Scanning (ALS) surveys to recognise possible rock fall routes. To properly simulate rock fall trajectories with a hybrid computer program, particular attention was devoted to the correct quantification of input parameters, such as restitution coefficients and the horizontal acceleration associated with the earthquakes that historically occur in this portion of Italy. The simulation outputs regarding the distribution of rock fall end points and kinetic energy along the falling paths highlight the hazardous situation for the village of Cimolais.
For this reason, mitigation works have been suggested to reduce the landslide risk immediately. This proposal accounts for the high volume of the blocks, which, in case of a fall, would render the passive mitigation measures already in place at the back of Cimolais worthless.

Devoto, S.; Boccali, C.; Podda, F.



Status of volcanic hazard studies for the Nevada Nuclear Waste Storage Investigations. Volume II  

SciTech Connect

Volcanic hazard investigations during FY 1984 focused on five topics: the emplacement mechanism of shallow basalt intrusions, geochemical trends through time for volcanic fields of the Death Valley-Pancake Range volcanic zone, the possibility of bimodal basalt-rhyolite volcanism, the age and process of enrichment for incompatible elements in young basalts of the Nevada Test Site (NTS) region, and the possibility of hydrovolcanic activity. The stress regime of Yucca Mountain may favor formation of shallow basalt intrusions. However, combined field and drill-hole studies suggest shallow basalt intrusions are rare in the geologic record of the southern Great Basin. The geochemical patterns of basaltic volcanism through time in the NTS region provide no evidence for evolution toward a large-volume volcanic field or increases in future rates of volcanism. Existing data are consistent with a declining volcanic system comparable to the late stages of the southern Death Valley volcanic field. The hazards of bimodal volcanism in this area are judged to be low. The source of a 6-Myr pumice discovered in alluvial deposits of Crater Flat has not been found. Geochemical studies show that the enrichment of trace elements in the younger rift basalts must be related to an enrichment of their mantle source rocks. This geochemical enrichment event, which may have been metasomatic alteration, predates the basalts of the silicic episode and is, therefore, not a young event. Studies of crater dimensions of hydrovolcanic landforms indicate that the worst case scenario (exhumation of a repository at Yucca Mountain by hydrovolcanic explosions) is unlikely. Theoretical models of melt-water vapor explosions, particularly the thermal detonation model, suggest hydrovolcanic explosions are possible at Yucca Mountain. 80 refs., 21 figs., 5 tabs.

Crowe, B.M.; Wohletz, K.H.; Vaniman, D.T.; Gladney, E.; Bower, N.



Probabilistic Hazard Curves for Tornadic Winds, Wind Gusts, and Extreme Rainfall Events  

SciTech Connect

This paper summarizes a study carried out at the Savannah River Site (SRS) to determine probabilistic hazard curves for tornadic winds, wind gusts, and extreme rainfall events. DOE Order 420.1, Facility Safety, outlines the requirements for Natural Phenomena Hazards (NPH) mitigation for new and existing DOE facilities. Specifically, NPH include tornadic winds, maximum wind gusts, and extreme rainfall events. Probabilistic hazard curves for each phenomenon indicate the recurrence frequency, and these hazard curves must be updated at least every 10 years to account for recent data, improved methodologies, or criteria changes. Also, emergency response exercises often use hypothetical weather data to initiate accident scenarios. The hazard curves in these reports provide a means to use extreme weather events based on models and measurements rather than scenarios created ad hoc, as is often the case.

Weber, A.H.
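A hazard curve of the kind described reports, for each intensity threshold, how often that threshold is exceeded per year (the recurrence frequency). A minimal empirical sketch from a series of annual maxima; the gust values are invented, not SRS measurements:

```python
# Hypothetical sketch: empirical annual exceedance frequencies from a
# short record of annual-maximum wind gusts (values are illustrative).
def hazard_curve(annual_maxima):
    """Return (threshold, exceedances-per-year) pairs, frequency descending."""
    n_years = len(annual_maxima)
    points = []
    for x in sorted(set(annual_maxima)):
        exceed = sum(1 for v in annual_maxima if v >= x)
        points.append((x, exceed / n_years))
    return points

gusts_mph = [52, 61, 48, 75, 58, 66, 49, 83, 55, 70]  # 10 years of annual maxima
for speed, freq in hazard_curve(gusts_mph):
    print(f"{speed} mph exceeded with frequency {freq:.1f}/yr "
          f"(return period {1 / freq:.0f} yr)")
```

In practice such curves are fitted with extreme-value distributions rather than read directly from short records, which is why the order calls for periodic updates as data accumulate.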



CMMAD Usability Case Study in Support of Countermine and Hazard Sensing  

SciTech Connect

During field trials, operator usability data were collected in support of lane clearing missions and hazard sensing for two robot platforms with Robot Intelligence Kernel (RIK) software and sensor scanning payloads onboard. The tests featured autonomous and shared robot autonomy levels where tasking of the robot used a graphical interface featuring mine location and sensor readings. The goal of this work was to provide insights that could be used to further technology development. The efficacy of countermine systems in terms of mobility, search, path planning, detection, and localization were assessed. Findings from objective and subjective operator interaction measures are reviewed along with commentary from soldiers having taken part in the study who strongly endorse the system.

Victor G. Walker; David I. Gertman



Hazardous alcohol consumption among university students in Ireland: a cross-sectional study  

PubMed Central

Objective: There is considerable evidence of a cultural shift towards heavier alcohol consumption among university students, especially women. The aim of this study is to investigate the prevalence and correlates of hazardous alcohol consumption (HAC) among university students, with particular reference to gender, and to compare different modes of data collection in this population.
Setting: A large Irish university.
Design: A cross-sectional study using a classroom-distributed paper questionnaire.
Participants: A total of 2275 undergraduates completed the classroom survey, 84% of those in class and 51% of those registered for the relevant module.
Main outcome measures: Prevalence of HAC measured using the Alcohol Use Disorders Identification Test for Consumption (AUDIT-C) and the proportion of university students reporting 1 or more of 13 adverse consequences linked to HAC. HAC was defined as an AUDIT-C score of 6 or more among males and 5 or more among females.
Results: In the classroom sample, 66.4% (95% CI 64.4 to 68.3) reported HAC (65.2% men and 67.3% women). In women, 57.4% met HAC thresholds for men. Similar patterns of adverse consequences were observed among men and women. Students with a hazardous consumption pattern were more likely to report smoking, illicit drug use and being sexually active.
Conclusions: The findings highlight the high prevalence of HAC among university students relative to the general population. Public policy measures require review to tackle the short-term and long-term risks to physical, mental and social health and well-being. PMID:25633284

Davoren, Martin P; Shiely, Frances; Byrne, Michael; Perry, Ivan J
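The HAC definition used in this study is a simple sex-specific threshold rule on the AUDIT-C score, which can be written directly:

```python
# The study's classification rule: AUDIT-C score >= 6 for men, >= 5 for
# women, counts as hazardous alcohol consumption (HAC). The scoring of
# the three AUDIT-C items (0-4 each) follows the standard instrument.
def is_hazardous(audit_c_score, sex):
    """Classify HAC per the thresholds stated in the abstract."""
    threshold = 6 if sex == "male" else 5
    return audit_c_score >= threshold

print(is_hazardous(5, "female"))  # True
print(is_hazardous(5, "male"))    # False
```

The rule also makes the abstract's "57.4% of women met HAC thresholds for men" figure concrete: those are women scoring 6 or more.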



Distinguishing Realistic Military Blasts from Firecrackers in Mitigation Studies of Blast Induced Traumatic Brain Injury  

SciTech Connect

In their Contributed Article, Nyein et al. (1,2) present numerical simulations of blast waves interacting with a helmeted head and conclude that a face shield may significantly mitigate blast-induced traumatic brain injury (TBI). A face shield may indeed be important for future military helmets, but the authors derive their conclusions from a much smaller explosion than typically experienced on the battlefield. The blast from the 3.16 gm TNT charge of (1) has the following approximate peak overpressures, positive phase durations, and incident impulses (3): 10 atm, 0.25 ms, and 3.9 psi-ms at the front of the head (14 cm from the charge), and 1.4 atm, 0.32 ms, and 1.7 psi-ms at the back of a typical 20 cm head (34 cm from the charge). The peak pressure of the wave decreases by a factor of 7 as it traverses the head. The blast conditions are at the threshold for injury at the front of the head, but well below threshold at the back of the head (4). The blast traverses the head in 0.3 ms, roughly equal to the positive phase duration of the blast. Therefore, when the blast reaches the back of the head, near-ambient conditions exist at the front. Because the headform is so close to the charge, it experiences a wave with significant curvature. By contrast, a realistic blast from a 2.2 kg TNT charge (approximately an uncased 105 mm artillery round) is fatal at an overpressure of 10 atm (4). For an injury level (4) similar to (1), a 2.2 kg charge has the following approximate peak overpressures, positive phase durations, and incident impulses (3): 2.1 atm, 2.3 ms, and 18 psi-ms at the front of the head (250 cm from the charge), and 1.8 atm, 2.5 ms, and 16.8 psi-ms at the back of the head (270 cm from the charge). The peak pressure decreases by only a factor of 1.2 as it traverses the head. Because the 0.36 ms traversal time is much smaller than the positive phase duration, pressures on the head become relatively uniform when the blast reaches the back of the head.
The larger standoff implies that the headform locally experiences a nearly planar blast wave. Also, the positive phase durations and blast impulses are much larger than those of (1). Consequently, the blast model used in (1) is spatially and temporally very different from a military blast. It would be useful to repeat the calculations using military blast parameters. Finally, (1) overlooks a significant part of (5). On page 1 and on page 3, (1) states that (5) did not consider helmet pads. But pages 3 and 4 of (5) present simulations of blast wave propagation across an ACH-helmeted head form with and without pads. (5) states that when the pads are present, the 'underwash' of air under the helmet is blocked when compared to the case without. (1) reaches this same conclusion, but reports it as a new result rather than a confirmation of that already found in (5).
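The two pressure-decay factors quoted in this comment follow directly from the listed peak overpressures; a quick arithmetic check using only the values stated above:

```python
# Ratio of peak overpressure at the front vs. back of a ~20 cm head,
# for the small (3.16 g) and large (2.2 kg) TNT charges, using the peak
# values quoted in the text.
small_charge = (10.0, 1.4)  # atm: front (14 cm), back (34 cm from charge)
large_charge = (2.1, 1.8)   # atm: front (250 cm), back (270 cm from charge)

for name, (front, back) in [("3.16 g", small_charge), ("2.2 kg", large_charge)]:
    print(f"{name} charge: pressure falls by a factor of {front / back:.1f} "
          "across the head")
```

This reproduces the factor of ~7 for the laboratory-scale charge and ~1.2 for the realistic military charge, the core of the argument that the two blast environments load the head very differently.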

Moss, W C; King, M J; Blackman, E G



Balancing Mitigation Against Impact: A Case Study From the 2005 Chicxulub Seismic Survey  

NASA Astrophysics Data System (ADS)

In early 2005 the R/V Maurice Ewing conducted a large-scale deep seismic reflection-refraction survey offshore Yucatan, Mexico, to investigate the internal structure of the Chicxulub impact crater, centred on the coastline. Shots from a tuned 20 airgun, 6970 cu in array were recorded on a 6 km streamer and 25 ocean bottom seismometers (OBS). The water is exceptionally shallow to large distances offshore, reaching 30 m about 60 km from the land, making it unattractive to the larger marine mammals, although there are small populations of Atlantic and spotted dolphins living in the area, as well as several turtle breeding and feeding grounds on the Yucatan peninsula. In the light of calibrated tests of the Ewing's array (Tolstoy et al., 2004, Geophysical Research Letters 31, L14310), a 180 dB safety radius of 3.5 km around the gun array was adopted. An energetic campaign was organised by environmentalists opposing the work. In addition to the usual precautions of visual and listening watches by independent observers, gradual ramp-ups of the gun arrays, and power-downs or shut-downs for sightings, constraints were also placed to limit the survey to daylight hours and weather conditions not exceeding Beaufort 4. The operations were subject to several on-board inspections by the Mexican environmental authorities, causing logistical difficulties. Although less than 1% of the total working time was lost to shutdowns due to actual observation of dolphins or turtles, approximately 60% of the cruise time was taken up in precautionary inactivity. A diver in the water 3.5 km from the profiling ship reported that the sound in the water was barely noticeable, leading us to examine the actual sound levels recorded by both the 6 km streamer and the OBS hydrophones. 
The datasets are highly self-consistent, and give the same pattern of decay with distance past about 2 km offset, but with different overall levels: this may be due to geometry or calibration differences under investigation. Both datasets indicate significantly lower levels than reported by Tolstoy et al. (2004). There was no evidence of environmental damage created by this survey. It can be concluded that the mitigation measures were extremely successful, but there is also a concern that the overhead cost of the environmental protection made this one of the most costly academic surveys ever undertaken, and that not all of this protection was necessary. In particular, the predicted 180 dB safety radius appeared to be overly conservative, even though based on calibrated measurements in very similar physical circumstances, and we suggest that these differences were a result of local seismic velocity structure in the water column and/or shallow seabed, which resulted in different partitioning of the energy. These results suggest that real time monitoring of hydrophone array data may provide a method of determining the safety radius dynamically, in response to local conditions.

Barton, P.; Diebold, J.; Gulick, S.



Studying and Improving Human Response to Natural Hazards: Lessons from the Virtual Hurricane Lab  

Microsoft Academic Search

One of the most critical challenges facing communities in areas prone to natural hazards is how best to encourage residents to invest in individual and collective actions that would reduce the damaging impact of low-probability, high-consequence environmental events. Unfortunately, what makes this goal difficult to achieve is that the relative rarity of natural hazards implies that many who face the risk…

R. Meyer; K. Broad; B. S. Orlove




EPA Science Inventory

The US Environmental Protection Agency's (EPA) National Exposure Research Laboratory (NERL) and the US Department of Housing and Urban Development's (HUD) Office of Healthy Homes and Lead Hazard Control conducted a national survey of housing related hazards in US residences. The...


Mitigation: Interfaces between NASA, Risk Managers, and the Public Clark R. Chapman1,2  

E-print Network

Mitigation," Vail CO, 26 June 2006. Summary: Threat mitigation, in the disaster management community the public, officials, and disaster management agencies; strategic planning must guide response. NASA must's responsibility. I. NEO Threat Mitigation: All- Hazards Disaster Management Strategy The Congressional mandate

Chapman, Clark R.


Further RAGE modeling of asteroid mitigation: surface and subsurface explosions in porous objects  

SciTech Connect

Disruption or mitigation of a potentially hazardous object (PHO) by a high-energy subsurface burst is considered. This is just one possible method of impact-hazard mitigation. We present RAGE hydrocode models of the shock-generated disruption of PHOs by subsurface nuclear bursts, using scenario-specific models built from realistic radar shape models. We show 2D and 3D models of disruption by a large energy source (~100 kt-10 Mt) at the center of such PHO models, specifically for the shape of the asteroid 25143 Itokawa. We study the effects of non-uniform composition (rubble piles), shallow buried bursts at the optimal depth of burial, and porosity.

Weaver, Robert P [Los Alamos National Laboratory; Plesko, Catherine S [Los Alamos National Laboratory; Dearholt, William R [Los Alamos National Laboratory



Laboratory Study of Polychlorinated Biphenyl (PCB) Contamination and Mitigation in Buildings -- Part 4. Evaluation of the Activated Metal Treatment System (AMTS) for On-site Destruction of PCBs  

EPA Science Inventory

This is the fourth and final report in the series entitled “Laboratory Study of Polychlorinated Biphenyl (PCB) Contamination and Mitigation in Buildings.” This report evaluates the performance of an on-site PCB destruction method, known as the AMTS method...


Over-Pressurized Drums: Their Causes and Mitigation  

SciTech Connect

Having to contend with bulging or over-pressurized drums is, unfortunately, a common event for people storing chemicals and chemical wastes (Figure 1). The Department of Energy alone reported over 120 incidents of bulging drums between 1992 and 1999 (1). Bulging drums can be caused by many different mechanisms, represent a number of significant hazards, and can be tricky to mitigate. In this article, we discuss the mechanisms by which drums can become over-pressurized, recognition and mitigation of the hazards associated with over-pressurized drums, and methods that can be used to prevent drum over-pressurization from ever occurring. Drum pressurization can represent a significant safety hazard. Unless recognized and properly mitigated, improperly manipulated pressurized drums can result in employee exposure, employee injury, and environmental contamination. Therefore, recognition of when a drum is pressurized and knowledge of pressurized-drum mitigation techniques are essential.
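Two of the pressurization mechanisms implied above, heating of a sealed drum and gas generation inside it, can be put in rough quantitative terms with the ideal gas law. This is a minimal sketch with assumed values; a real drum assessment must also consider vapor pressure of the contents, drum deformation, and relief devices.

```python
# Ideal-gas estimates of headspace pressure in a sealed, rigid drum.

R_J_PER_MOL_K = 8.314  # universal gas constant

def pressure_after_heating(p1_kpa, t1_k, t2_k):
    """Gay-Lussac's law for a fixed amount of gas at fixed volume:
    P2 = P1 * T2 / T1 (absolute pressure and temperature)."""
    return p1_kpa * t2_k / t1_k

def pressure_after_gas_generation(p1_kpa, headspace_l, added_mol, t_k):
    """Partial pressure added by n moles of generated gas at constant
    temperature and headspace volume (PV = nRT; note 1 kPa*L = 1 J,
    so kPa comes out directly with V in litres)."""
    added_kpa = added_mol * R_J_PER_MOL_K * t_k / headspace_l
    return p1_kpa + added_kpa
```

Even these crude numbers show why chemical gas generation (fermentation, corrosion, radiolysis) is far more dangerous than warm storage: a single mole of gas in a small headspace adds far more pressure than a plausible temperature swing.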

Simmons, Fred; Kuntamukkula, Murty; Quigley, David; Robertson, Janeen; Freshwater, David



Methane emission from ruminants and solid waste: A critical analysis of baseline and mitigation projections for climate and policy studies  

NASA Astrophysics Data System (ADS)

Current and projected estimates of methane (CH4) emission from anthropogenic sources are numerous but have rarely been examined or compared. Presented here is a critical appraisal of CH4 projections used in climate-chemistry and policy studies. We compare emissions for major CH4 sources from several groups, including our own new data and RCP projections developed for climate-chemistry models for the next IPCC Assessment Report (AR5). We focus on current and projected baseline and mitigation emissions from ruminant animals and solid waste, both of which are predicted to rise dramatically in coming decades, driven primarily by developing countries. For waste, drivers include increasing urban populations, higher per capita waste generation due to economic growth, and increasing landfilling rates. Analysis of a new global database detailing waste composition, collection, and disposal indicates that IPCC-based methodologies and default data overestimate CH4 emission for the current period, which cascades into substantial overestimates in future projections. CH4 emission from solid waste is estimated to be ~10-15 Tg CH4/yr currently, rather than the ~35 Tg/yr often reported in the literature. Moreover, emissions from developing countries are unlikely to rise rapidly in coming decades because new management approaches, such as sanitary landfills, that would increase emissions are maladapted to infrastructures in these countries and therefore unlikely to be implemented. The low current emission associated with solid waste (~10 Tg), together with modest future growth, implies that mitigation of waste-related CH4 emission is a poor candidate for slowing global warming. In the case of ruminant animals (~90 Tg CH4/yr currently), the dominant assumption driving future trajectories of CH4 emission is a substantial increase in meat and dairy consumption in developing countries, to be satisfied by growing animal populations.
Unlike solid waste, current ruminant emission estimates exhibit a narrow range among studies; this does not necessarily signal low uncertainty but rather a reliance on similar animal statistics and emission factors. The UN Food and Agriculture Organization (FAO) projects 2000-2030 growth rates of livestock for most developing countries at 2% to >3% annually. However, the assumption of rapidly rising meat consumption is supported neither by current trends nor by resource availability. For example, increased meat consumption in China and other developing countries is predominantly poultry and pork, which do not affect CH4 emissions, suggesting that the rapid growth projected for all animals, and the resulting growth in CH4 emission, will not occur. From a resource standpoint, large increases in cattle, sheep, and goat populations, especially for African countries (~60% by 2030), are not supportable on arid grazing lands that require very low stocking rates and semi-nomadic management. Increases projected for African animal populations would require either that about 2/3 more animals be grazed on increasingly drier lands or that all non-forested areas become grazing lands. As with solid waste, future methane emission from ruminant animals is likely to grow modestly, although ruminants are not a likely candidate for CH4 mitigation owing to their dispersed distribution throughout widely varying agricultural systems under very local management.
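The IPCC-based waste methodologies the abstract critiques rest on a first-order-decay (FOD) idea: carbon landfilled in past years decomposes exponentially, and a fraction of the decomposed carbon is released as CH4. A schematic sketch of that idea follows; the decay rate, CH4 fraction, and deposit series are illustrative defaults, not the IPCC's country-specific parameters.

```python
import math

def fod_ch4_this_year(annual_docm, k=0.05, f_ch4=0.5):
    """Schematic first-order-decay estimate of CH4 released in the
    current year.  annual_docm: decomposable organic carbon landfilled
    per year (mass units), oldest year first.  Carbon deposited `age`
    years ago decomposes this year at a rate k * exp(-k * age); a
    fraction f_ch4 of the decomposed carbon leaves as CH4, and 16/12
    converts carbon mass to CH4 mass."""
    total = 0.0
    for age, dep in enumerate(reversed(annual_docm)):
        total += dep * k * math.exp(-k * age) * f_ch4 * (16.0 / 12.0)
    return total
```

The structure makes the abstract's point concrete: overestimated inputs (waste generation, decomposable fraction, landfilling rate) in any past year propagate into every future year's emission estimate.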

Matthews, E.



Vulnerability studies and integrated assessments for hazard risk reduction in Pittsburgh, PA (Invited)  

NASA Astrophysics Data System (ADS)

Today's environmental problems stretch beyond the bounds of most academic disciplines, and thus solutions require an interdisciplinary approach. For instance, the scientific consensus is that the frequency and severity of many types of extreme weather events are increasing (IPCC 2012). Yet despite our efforts to reduce greenhouse gases, we continue to experience severe weather events such as Superstorm Sandy, record heat and blizzards, and droughts. These natural hazards, combined with increased vulnerability and exposure, result in longer-lasting disruptions to critical infrastructure and business continuity throughout the world. In order to protect both our lives and the economy, we must think beyond the bounds of any one discipline to include an integrated assessment of relevant work. In the wake of recent events, New York City, Washington, DC, Chicago, and a myriad of other cities have turned to their academic powerhouses for assistance in better understanding their vulnerabilities. This talk will share a case study of the state of integrated assessments and vulnerability studies of energy, transportation, water, real estate, and other main sectors in Pittsburgh, PA. The talk will then use integrated assessment models and other vulnerability studies to create coordinated sets of climate projections for use by the many public agencies and private-sector organizations in the region.

Klima, K.



Early Intervention Following Trauma May Mitigate Genetic Risk for PTSD in Civilians: A Pilot Prospective Emergency Department Study  

PubMed Central

Background Civilian posttraumatic stress disorder (PTSD) and combat PTSD are major public health concerns. Although a number of psychosocial risk factors have been identified related to PTSD risk, there are no accepted, robust biological predictors that identify who will develop PTSD or who will respond to early intervention following trauma. We wished to examine whether genetic risk for PTSD can be mitigated with an early intervention. Method 65 emergency department patients recruited in 2009–2010 at Grady Memorial Hospital in Atlanta, Georgia, who met criterion A of DSM-IV PTSD, received either 3 sessions of an exposure intervention beginning in the emergency department shortly after trauma exposure, or assessment only. PTSD symptoms were assessed 4 and 12 weeks after trauma exposure. A composite additive risk score was derived from polymorphisms in 10 previously identified genes associated with stress response (ADCYAP1R1, COMT, CRHR1, DBH, DRD2, FAAH, FKBP5, NPY, NTRK2, and PCLO), and gene x treatment effects were examined. The intervention included 3 sessions of imaginal exposure to the trauma memory and additional exposure homework. The primary outcome measure was the PTSD Symptom Scale-Interview Version or DSM-IV–based PTSD diagnosis in patients related to genotype and treatment group. Results A gene x intervention x time effect was detected for individual polymorphisms, in particular the PACAP receptor, ADCYAP1R1, as well as with a combined genotype risk score created from independent SNP markers. Subjects who did not receive treatment had higher symptoms than those who received intervention. Furthermore, subjects with the “risk” genotypes who did not receive intervention had higher PTSD symptoms compared to those with the “low-risk” or “resilience” genotypes or those who received intervention.
Additionally, PTSD symptoms correlated with level of genetic risk at week 12 (P < .005) in the assessment-only group, but with no relationship in the intervention group, even after controlling for age, sex, race, education, income, and childhood trauma. Using logistic regression, the number of risk alleles was significantly associated with likelihood of PTSD diagnosis at week 12 (P < .05). Conclusions This pilot prospective study suggests that combined genetic variants may serve to predict those most at risk for developing PTSD following trauma. A psychotherapeutic intervention initiated in the emergency department within hours of the trauma may mitigate this risk. The role of genetic predictors of risk and resilience should be further evaluated in larger, prospective intervention and prevention trials. Trial Registration: ClinicalTrials.gov identifier NCT00895518 PMID:25188543
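A "composite additive risk score" of the kind described here is, at its core, a count of risk alleles carried across a panel of SNPs (0, 1, or 2 per marker). The sketch below shows that counting step only; the SNP identifiers, genotypes, and risk-allele assignments are hypothetical, not the study's actual markers, and the study's score further involved weighting and regression modeling not shown.

```python
def additive_risk_score(genotypes, risk_alleles):
    """Count risk alleles across a SNP panel.
    genotypes: dict mapping SNP id -> two-letter genotype, e.g. "AG".
    risk_alleles: dict mapping SNP id -> the allele counted as 'risk'.
    Each SNP contributes 0, 1, or 2 to the score."""
    return sum(g.count(risk_alleles[snp]) for snp, g in genotypes.items())

# Hypothetical 3-SNP panel (the study used markers in 10 genes):
geno = {"rs1": "AG", "rs2": "GG", "rs3": "TT"}
risk = {"rs1": "A", "rs2": "G", "rs3": "C"}
score = additive_risk_score(geno, risk)  # 1 + 2 + 0 = 3
```

Such a score then enters a logistic regression as a single predictor, which is how "number of risk alleles" was related to the likelihood of a PTSD diagnosis at week 12.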

Rothbaum, Barbara O.; Kearns, Megan C.; Reiser, Emily; Davis, Jennifer S.; Kerley, Kimberly A.; Rothbaum, Alex O.; Mercer, Kristina B.; Price, Matthew; Houry, Debra; Ressler, Kerry J.



Procedure for Prioritization of Natural Phenomena Hazards Evaluations for Existing DOE Facilities  

SciTech Connect

This document describes the procedure to be used for the prioritization of natural phenomena hazards evaluations of existing DOE facilities in conformance with DOE Order 5480.28, "Natural Phenomena Hazards Mitigation."

Conrads, T.J., Westinghouse Hanford



Natural Hazards  

NSDL National Science Digital Library

This lesson introduces students to a variety of natural hazards, emphasizing that when people understand these threats they are better able to avoid or reduce their potential impacts. First, the class discusses what they know about natural hazards and natural disasters. Then, working in pairs, they research particular hazards; the threats they pose and where and when those threats are most pronounced. Following this research period, the class assembles what they have learned into a general overview of natural hazards nationwide. Following the second class discussion, students work in pairs again to explore the hazards that affect particular towns, cities, or regions of the country. In doing so, they learn more about how and why certain natural hazards impact specific areas, as well as what people are doing to minimize the threats these hazards pose.



Role of human- and animal-sperm studies in the evaluation of male reproductive hazards  

SciTech Connect

Human sperm tests provide a direct means of assessing chemically induced spermatogenic dysfunction in man. Available tests include sperm count, motility, morphology (seminal cytology), and Y-body analyses. Over 70 different human exposures have been monitored in various groups of exposed men. The majority of exposures studied showed a significant change from control in one or more sperm tests. When carefully controlled, the sperm morphology test is statistically the most sensitive of these human sperm tests. Several sperm tests have been developed in nonhuman mammals for the study of chemical spermatotoxins. The sperm morphology test in mice has been the most widely used. Results with this test seem to be related to germ-cell mutagenicity. In general, animal sperm tests should play an important role in the identification and assessment of potential human reproductive hazards. Exposure to spermatotoxins may lead to infertility, and more importantly, to heritable genetic damage. While there are considerable animal and human data suggesting that sperm tests may be used to detect agents causing infertility, the extent to which these tests detect heritable genetic damage remains unclear. (ERB)

Wyrobek, A.J.; Gordon, L.; Watchmaker, G.



Mitigating Challenges of Using Virtual Reality in Online Courses: A Case Study  

ERIC Educational Resources Information Center

Case study methodology was used to describe the challenges experienced in the development of a virtual component for a freshman-level undergraduate course. The purpose of the project was to use a virtual environment component to provide an interactive and engaging learning environment. While some student and faculty feedback was positive, this…

Stewart, Barbara; Hutchins, Holly M.; Ezell, Shirley; De Martino, Darrell; Bobba, Anil



Effect of reminders on mitigating participation bias in a case-control study  

PubMed Central

Background Researchers commonly employ strategies to increase participation in health studies. These include use of incentives and intensive reminders. There is, however, little evidence regarding the quantitative effect that such strategies have on study results. We present an analysis of data from a case-control study of Campylobacter enteritis in England to assess the usefulness of a two-reminder strategy for control recruitment. Methods We compared sociodemographic characteristics of participants and non-participants, and calculated odds ratio estimates for a wide range of risk factors by mailing wave. Results Non-participants were more often male, younger and from more deprived areas. Among participants, early responders were more likely to be female, older and live in less deprived areas, but despite these differences, we found little evidence of a systematic bias in the results when using data from early responders only. Conclusions We conclude that the main benefit of using reminders in our study was the gain in statistical power from a larger sample size. PMID:21453477
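The odds ratio estimates compared across mailing waves in a study like this come from 2×2 exposure-by-outcome tables. A generic sketch with a Wald confidence interval follows; the cell counts below are invented for illustration, not taken from the Campylobacter data.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald confidence interval from a 2x2 table:
    a = exposed cases,    b = unexposed cases,
    c = exposed controls, d = unexposed controls.
    z = 1.96 gives an approximate 95% interval."""
    or_ = (a * d) / (b * c)
    se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se_log_or)
    hi = math.exp(math.log(or_) + z * se_log_or)
    return or_, lo, hi
```

Comparing participation bias across mailing waves amounts to recomputing these tables with controls restricted to wave 1 (early responders) versus all waves, and checking whether the odds ratios shift.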



Effect of anxiolytic and hypnotic drug prescriptions on mortality hazards: retrospective cohort study  

PubMed Central

Objective To test the hypothesis that people taking anxiolytic and hypnotic drugs are at increased risk of premature mortality, using primary care prescription records and after adjusting for a wide range of potential confounders. Design Retrospective cohort study. Setting 273 UK primary care practices contributing data to the General Practice Research Database. Participants 34,727 patients aged 16 years and older first prescribed anxiolytic or hypnotic drugs, or both, between 1998 and 2001, and 69,418 patients with no prescriptions for such drugs (controls) matched by age, sex, and practice. Patients were followed up for a mean of 7.6 years (range 0.1-13.4 years). Main outcome All cause mortality ascertained from practice records. Results Physical and psychiatric comorbidities and prescribing of non-study drugs were significantly more prevalent among those prescribed study drugs than among controls. The age adjusted hazard ratio for mortality during the whole follow-up period for use of any study drug in the first year after recruitment was 3.46 (95% confidence interval 3.34 to 3.59) and 3.32 (3.19 to 3.45) after adjusting for other potential confounders. Dose-response associations were found for all three classes of study drugs (benzodiazepines, Z drugs (zaleplon, zolpidem, and zopiclone), and other drugs). After excluding deaths in the first year, there were approximately four excess deaths linked to drug use per 100 people followed for an average of 7.6 years after their first prescription. Conclusions In this large cohort of patients attending UK primary care, anxiolytic and hypnotic drugs were associated with significantly increased risk of mortality over a seven year period, after adjusting for a range of potential confounders. As with all observational findings, however, these results are prone to bias arising from unmeasured and residual confounding. PMID:24647164



Field study of exhaust fans for mitigating indoor air quality problems: Final report  

SciTech Connect

Residential ventilation in the United States housing stock is provided primarily by infiltration, the natural leakage of outdoor air into a building through cracks and holes in the building shell. Since ventilation is the dominant mechanism for control of indoor pollutant concentrations, low infiltration rates caused by fluctuations in weather conditions may lead to high indoor pollutant concentrations. Supplemental mechanical ventilation can be used to eliminate these periods of low infiltration. This study examined the effects of a small, continuously operating exhaust fan on pollutant concentrations and energy use in residences.
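The mechanism described above, ventilation rate controlling indoor pollutant levels, is usually captured by a well-mixed single-zone mass balance: at steady state the indoor excess over outdoor concentration equals the source strength divided by the ventilation airflow. A minimal sketch with illustrative numbers (the source strength, air-change rates, and house volume below are assumptions, not the study's measurements):

```python
def steady_state_concentration(c_out, source_mg_h, ach, volume_m3):
    """Well-mixed single-zone steady state (no deposition or reaction):
    C_in = C_out + S / (ACH * V)
    c_out: outdoor concentration (mg/m3); source_mg_h: indoor emission
    rate (mg/h); ach: air changes per hour (1/h); volume_m3: zone volume."""
    return c_out + source_mg_h / (ach * volume_m3)

# Doubling the air-change rate halves the indoor excess over outdoors:
base = steady_state_concentration(0.0, 10.0, 0.25, 250.0)  # low infiltration
fan = steady_state_concentration(0.0, 10.0, 0.50, 250.0)   # with exhaust fan
```

This inverse relationship is why a small fan pays off mainly during the calm-weather periods when infiltration alone drops to very low air-change rates.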

Grimsrud, D.T.; Szydlowski, R.F.; Turk, B.H.



Studies on lithium salts to mitigate ASR-induced expansion in new concrete: a critical review  

SciTech Connect

This paper provides a critical review of the research work conducted so far on the suppressive effects of lithium compounds on expansion due to alkali-silica reaction (ASR) in concrete and on the mechanism or mechanisms by which lithium inhibits the expansion. After a thorough examination of the existing literature regarding lithium salts in controlling ASR expansion, a summary of research findings is provided. It shows that all the lithium salts studied, including LiF, LiCl, LiBr, LiOH, LiOH·H2O, LiNO3, LiNO2, Li2CO3, Li2SO4, Li2HPO4, and Li2SiO3, are effective in suppressing ASR expansion in new concrete, provided they are used at the appropriate dosages. Among these compounds, LiNO3 appears to be the most promising one. Although the mechanism(s) for the suppressive effects of lithium are not well understood, several mechanisms have been proposed. A detailed discussion of these existing mechanisms is provided in the paper. Finally, some recommendations for future studies are identified.
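Lithium dosage for ASR suppression is conventionally expressed as a lithium-to-alkali molar ratio, with the alkali content of the cement given as Na2O equivalent. The sketch below computes the moles of lithium needed for a target ratio; the 0.74 default is a guideline commonly cited in the later LiNO3 literature, not a finding of this paper, and the mix quantities are invented.

```python
NA2O_MOLAR_MASS_G = 61.98  # g/mol; each mole of Na2O supplies 2 mol of alkali

def lithium_moles_required(cement_kg, na2o_eq_pct, li_na_ratio=0.74):
    """Moles of Li needed to reach a target [Li]/[Na+K] molar ratio.
    cement_kg: cement content of the mix; na2o_eq_pct: total alkali as
    Na2O equivalent, percent of cement mass."""
    alkali_mol = cement_kg * 1000.0 * (na2o_eq_pct / 100.0) / NA2O_MOLAR_MASS_G * 2.0
    return li_na_ratio * alkali_mol

# Illustrative mix: 400 kg cement at 0.9% Na2O equivalent.
li_mol = lithium_moles_required(400.0, 0.9)
```

Tying the dose to the cement's alkali content, rather than specifying a fixed admixture mass, is what "appropriate dosages" means in practice: higher-alkali cements need proportionally more lithium.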

Feng, X.; Thomas, M.D.A.; Bremner, T.W. (Department of Civil Engineering, University of New Brunswick, PO Box 4400, Fredericton, NB E3B 5A3, Canada); Balcom, B.J. (MRI Center, University of New Brunswick, PO Box 4400, Fredericton, NB E3B 5A3, Canada); Folliard, K.J. (Department of Civil Engineering, University of Texas, Austin, TX 78712, United States)



An experimental study on tuned liquid damper for mitigation of structural response  

NASA Astrophysics Data System (ADS)

This paper investigates the performance of unidirectional tuned liquid damper (TLD) that relies upon the motion of shallow liquid in a rigid tank for changing the dynamic characteristics of a structure and dissipating its vibration energy under harmonic excitation. A series of experimental tests are conducted on a scaled model of structure-tuned liquid damper systems to evaluate their performance under harmonic excitation. One rectangular and one square TLD with various water depth ratios are examined over different frequency ratios, and time histories of accelerations are measured by precisely controlled shaking table tests. The behaviour of TLD is also studied by changing the orientation of the rectangular TLD subjected to the given range of harmonic excitation frequencies. The effectiveness of TLD is evaluated based on the response reduction of the structure. From the study, it is found that for each TLD, there exists an optimum water depth that corresponds to the minimum response amplitude, and the maximum control of vibration is obtained under resonance condition with the attachment of TLD.
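Tuning a TLD means choosing the tank length and water depth so that the liquid's fundamental sloshing frequency matches the structure's natural frequency. Linear wave theory gives that frequency in closed form for a rectangular tank; the sketch below uses it with illustrative dimensions (the tank sizes are assumptions, not those of the experiments, and the shallow-water sloshing in a real TLD departs from linear theory at large amplitudes).

```python
import math

def tld_fundamental_frequency(length_m, depth_m, g=9.81):
    """Fundamental sloshing frequency (Hz) of liquid in a rectangular
    tank, from linear wave theory:
        f = (1 / 2*pi) * sqrt((pi * g / L) * tanh(pi * h / L))
    length_m: tank length in the sloshing direction; depth_m: still-water
    depth.  Frequency rises with depth, which is why an optimum depth
    ratio exists for a given target frequency."""
    omega_sq = (math.pi * g / length_m) * math.tanh(math.pi * depth_m / length_m)
    return math.sqrt(omega_sq) / (2.0 * math.pi)

# Illustrative tank: L = 0.5 m, h = 0.05 m (depth ratio h/L = 0.1).
f1 = tld_fundamental_frequency(0.5, 0.05)
```

This also explains the orientation effect reported for the rectangular tank: rotating it changes the length in the direction of excitation, and hence the sloshing frequency and the tuning ratio.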

Bhattacharjee, Emili; Halder, Lipika; Sharma, Richi Prasad



Simultaneous transcutaneous electrical nerve stimulation mitigates simulator sickness symptoms in healthy adults: a crossover study  

PubMed Central

Background Flight simulators have been used to train pilots to experience and recognize spatial disorientation, a condition in which pilots incorrectly perceive the position, location, and movement of their aircrafts. However, during or after simulator training, simulator sickness (SS) may develop. Spatial disorientation and SS share common symptoms and signs and may involve a similar mechanism of dys-synchronization of neural inputs from the vestibular, visual, and proprioceptive systems. Transcutaneous electrical nerve stimulation (TENS), a maneuver used for pain control, was found to influence autonomic cardiovascular responses and enhance visuospatial abilities, postural control, and cognitive function. The purpose of the present study was to investigate the protective effects of TENS on SS. Methods Fifteen healthy young men (age: 28.6 ± 0.9 years, height: 172.5 ± 1.4 cm, body weight: 69.3 ± 1.3 kg, body mass index: 23.4 ± 1.8 kg/m2) participated in this within-subject crossover study. SS was induced by a flight simulator. TENS treatment involved 30 minutes of simultaneous electrical stimulation of the posterior neck and the right Zusanli acupoint. Each subject completed 4 sessions (control, SS, TENS, and TENS + SS) in a randomized order. Outcome indicators included SS symptom severity and cognitive function, evaluated with the Simulator Sickness Questionnaire (SSQ) and d2 test of attention, respectively. Sleepiness was rated using the Visual Analogue Scales for Sleepiness Symptoms (VAS-SS). Autonomic and stress responses were evaluated by heart rate, heart rate variability (HRV) and salivary stress biomarkers (salivary alpha-amylase activity and salivary cortisol concentration). Results Simulator exposure increased SS symptoms (SSQ and VAS-SS scores) and decreased the task response speed and concentration.
The heart rate, salivary stress biomarker levels, and the sympathetic parameter of HRV increased with simulator exposure, but parasympathetic parameters decreased (p…



Study and mitigation of calibration error sources in a water vapour Raman lidar  

NASA Astrophysics Data System (ADS)

The monitoring of water vapour throughout the atmosphere is important for many scientific applications (weather forecasting, climate research, calibration of GNSS altimetry measurements). Measuring water vapour remains a technical challenge because of its high variability in space and time. The major issues are achieving long-term stability (e.g., for climate trend monitoring) and high accuracy (e.g., for calibration/validation applications). LAREG and LOEMI at Institut National de l'Information Géographique et Forestière (IGN) have developed a mobile scanning water vapour Raman lidar in collaboration with LATMOS at CNRS. This system aims at providing high-accuracy water vapour measurements throughout the troposphere for calibrating GNSS wet delay signals and thus improving vertical positioning. Current developments aim at improving the calibration method and long-term stability of the system to allow the Raman lidar to be used as a reference instrument. The IGN-LATMOS lidar was deployed in the DEMEVAP (Development of Methodologies for Water Vapour Measurement) campaign that took place in 2011 at the Observatoire de Haute Provence. The goals of DEMEVAP were to inter-compare different water vapour sounding techniques (lidars, operational and research radiosondes, GPS,…) and to study various calibration methods for the Raman lidar. A significant decrease of the signals and of the calibration constants of the IGN-LATMOS Raman lidar was noticed over the course of the campaign. This led us to study the likely sources of uncertainty and drift in each part of the instrument: emission, reception, and detection. We inventoried several error sources as well as instability sources; examples include the temperature dependence of the Raman lines relative to the filter transmission, and fluorescence in the fibre.
We investigated each error source and each instability source (uncontrolled laser beam jitter, temporal fluctuations of the photomultiplier gain, spatial inhomogeneity in the sensitivity of the photomultiplier photocathode,…) separately, using theoretical analysis, numerical and optical simulations, and laboratory experiments. The instability induced by the use of an optical fibre to couple the signal collected by the telescope to the detectors is especially investigated. We quantified the impact of all these error sources on the water vapour and nitrogen Raman channel measurements and on the change in the differential calibration constant, and we tried to implement an experimental solution to minimize the variations.
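The reason calibration drift matters so directly here is the form of the standard Raman lidar retrieval: the water vapour mixing ratio is proportional to the ratio of the H2O and N2 Raman channel signals, so any drift in the calibration constant maps one-to-one into a bias in the retrieved mixing ratio. A minimal sketch (signal counts and constant are illustrative, not DEMEVAP values):

```python
def water_vapour_mixing_ratio(s_h2o, s_n2, calib_const):
    """Standard Raman lidar retrieval: q = C * S_H2O / S_N2,
    where S_H2O and S_N2 are the background-corrected Raman channel
    signals and C is the calibration constant.  A relative drift in C
    produces the same relative bias in q."""
    return calib_const * s_h2o / s_n2

# A 5% drift in the calibration constant gives a 5% bias in q:
q0 = water_vapour_mixing_ratio(1200.0, 90000.0, 120.0)
q1 = water_vapour_mixing_ratio(1200.0, 90000.0, 120.0 * 1.05)
```

This proportionality is why the paper treats each drift mechanism (filter transmission, fibre fluorescence, photomultiplier gain) in terms of its effect on the differential calibration constant.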

David, Leslie; Bock, Olivier; Bosser, Pierre; Thom, Christian; Pelon, Jacques



Feasibility study of tank leakage mitigation using subsurface barriers. Revision 1  

SciTech Connect

This document reflects the evaluations and analyses performed in response to Tri-Party Agreement Milestone M-45-07A, "Complete Evaluation of Subsurface Barrier Feasibility" (September 1994). In addition, this feasibility study was revised to reflect ongoing work supporting a pending decision by the DOE Richland Operations Office, the Washington State Department of Ecology, and the US Environmental Protection Agency regarding further development of subsurface barrier options for SSTs and whether to proceed with demonstration plans at the Hanford Site (Tri-Party Agreement Milestone M-45-07B). Analyses of 14 integrated SST tank farm remediation alternatives were conducted in response to the three stated objectives of Tri-Party Agreement Milestone M-45-07A. The alternatives include eight with subsurface barriers and six without. Technologies used in the alternatives include three types of tank waste retrieval, seven types of subsurface barriers, a method of stabilizing the void space of emptied tanks, two types of in situ soil flushing, one type of surface barrier, and a clean-closure method. A no-action alternative and a surface-barrier-only alternative were included as nonviable alternatives for comparison. All other alternatives were designed to result in closure of SST tank farms as landfills or in clean-closure. Revision 1 incorporates additional analyses of worker safety, large leak scenarios, and sensitivity to the leach rates of risk-controlling constituents. The additional analyses were conducted to support TPA Milestone M-45-07B.

Treat, R.L.; Peters, B.B.; Cameron, R.J. (Enserch Environmental, Inc., Richland, WA, United States) and others



Houston’s Novel Strategy to Control Hazardous Air Pollutants: A Case Study in Policy Innovation and Political Stalemate  

PubMed Central

Although ambient concentrations have declined steadily over the past 30 years, Houston has recorded some of the highest levels of hazardous air pollutants in the United States. Nevertheless, federal and state regulatory efforts historically have emphasized compliance with the National Ambient Air Quality Standard for ozone, treating “air toxics” in Houston as a residual problem to be solved through application of technology-based standards. Between 2004 and 2009, Mayor Bill White and his administration challenged the well-established hierarchy of air quality management spelled out in the Clean Air Act, whereby federal and state authorities are assigned primacy over local municipalities for the purpose of designing and implementing air pollution control strategies. The White Administration believed that existing regulations were not sufficient to protect the health of Houstonians and took a diversity of both collaborative and combative policy actions to mitigate air toxic emissions from stationary sources. Opposition was substantial from a local coalition of entrenched interests satisfied with the status quo, which hindered the city’s attempts to take unilateral policy actions. In the short term, the White Administration successfully raised the profile of the air toxics issue, pushed federal and state regulators to pay more attention, and induced a few polluting facilities to reduce emissions. But since White left office in 2010, air quality management in Houston has returned to the way it was before, and today there is scant evidence that his policies have had any lasting impact.

Sexton, Ken; Linder, Stephen H





Experimental and Numerical Studies of the Effects of Water Sprinkling on Urban Pavement on Heat Island Mitigation  

NASA Astrophysics Data System (ADS)

One of the main causes of the 'heat island' phenomenon is thought to be the artificial covering of the ground surface with asphalt or concrete, which greatly reduces the inherent cooling effect of water evaporation from the soil surface. In this study, water sprinkling on pavements is examined as a candidate method of mitigating the heat island, based on field experiments and numerical studies. Three field experiments of water sprinkling on asphalt/concrete pavements were performed on hot summer days in 2004-2006. To detect the changes in temperature, the authors developed and used a 3-D measurement system consisting of two vertical planes, 6 m high and 16 m wide, carrying networked arrays of 102 thermistors distributed spatially in the planes. The temperatures measured in and around the sprinkled area indicated that the ground surface temperature decreased uniformly by 5 to 15 degrees compared with the un-sprinkled area, while the relative decrease of atmospheric temperature was up to approximately 1 degree. The subsurface temperature at a depth of 14 cm under the pavement decreased significantly and remained lower than that at the same depth in the un-sprinkled area through the next morning. A numerical model was developed and applied to interpret the experimental results. It treats the heat balance of radiation, sensible/latent heat transfer at the ground surface, and heat conduction through the artificial and natural soil layers underground. Temperature and vapor changes at and near the ground surface were modeled using bulk formulas. Good agreement between the calculated time-temperature profiles and the experimental ones was obtained by assuming adequate physical parameters and meteorological conditions. The model could be improved to evaluate the changes of temperature and vapor content in the atmosphere near the ground surface caused by aerodynamic turbulent diffusion.
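The bulk-formula surface heat balance described in this abstract can be sketched roughly as follows. This is a minimal illustration assuming generic coefficient values (albedo, bulk transfer coefficients, radiation input) chosen for plausibility, not the parameters used in the study, and it omits ground heat conduction:

```python
# Sketch of a bulk-formula surface heat balance for dry vs. sprinkled
# pavement. All parameter values are illustrative assumptions.

def equilibrium_surface_temp(shortwave, t_air, wet, albedo=0.1,
                             h_bulk=15.0, le_bulk=80.0, emissivity=0.95):
    """Solve R_net = H + LE for surface temperature by relaxation.

    shortwave : incoming solar radiation [W/m^2]
    t_air     : air temperature [degC]
    wet       : if True, a latent (evaporative) heat flux is active
    h_bulk    : bulk sensible-heat transfer coefficient [W/m^2/K]
    le_bulk   : bulk latent-heat coefficient [W/m^2/K] (only when wet)
    """
    sigma = 5.67e-8                      # Stefan-Boltzmann constant
    t_s = t_air                          # initial guess [degC]
    for _ in range(200):
        absorbed = (1.0 - albedo) * shortwave
        longwave_out = emissivity * sigma * ((t_s + 273.15) ** 4
                                             - (t_air + 273.15) ** 4)
        sensible = h_bulk * (t_s - t_air)
        latent = le_bulk * (t_s - t_air) if wet else 0.0
        # Relax the surface temperature toward energy balance
        residual = absorbed - longwave_out - sensible - latent
        t_s += 0.002 * residual
    return t_s

t_dry = equilibrium_surface_temp(800.0, 30.0, wet=False)
t_wet = equilibrium_surface_temp(800.0, 30.0, wet=True)
```

With these assumed coefficients the sprinkled (wet) surface equilibrates tens of degrees cooler than the dry one, qualitatively matching the 5-15 degree surface cooling reported above.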

Yoshioka, M.; Tosaka, H.; Nakagawa, K.



Hazardous Materials  

NSDL National Science Digital Library

Hazardous Materials is a lesson plan which teaches students how to recognize and safely handle hazardous materials in the workplace. After completing this module, students should be able to interpret MSDS sheets and demonstrate their ability to identify hazardous materials during an experiential exercise. Note: This module is part of a modularized manufacturing technology curriculum created by the PSCME, found at

Alston, Michele


Mitigating GHG emissions from agriculture under climate change constraints - a case study for the State of Saxony, Germany  

NASA Astrophysics Data System (ADS)

Mitigating greenhouse gas (N2O, CO2, CH4) emissions from agricultural soils under conditions of projected climate change (IPCC SRES scenarios) is a prerequisite to limit global warming. In this study we used the recently developed regional biogeochemical ecosystem model LandscapeDNDC (Haas et al., 2012, Landscape Ecology) and two time slices for present-day (1998-2018) and future (2078-2098) climate (regional downscaling of the IPCC SRES A1B climate simulation), and compared a business-as-usual agricultural management scenario (winter rape seed - winter barley - winter wheat rotation; fertilization: 170 / 150 / 110 kg-N mineral fertilizer; straw harvest barley/wheat: 90%) with scenarios in which either one or all of the following options were realized: no-till, return of 100% of the residues to the field, or a 10% reduction of N fertilization. The spatial domain is the State of Saxony (1,073,523 hectares of arable land), a typical region for agricultural production in Central Europe. The simulations are based on a high-resolution polygonal dataset (5,517 agricultural grid cells) for which relevant information on soil properties is available. The regionalization of the N2O emissions was validated against the IPCC Tier I methodology, which yields N2O emissions of 1,824 / 1,610 / 1,180 [t N2O-N yr-1] for the baseline years, whereas the simulations give 6,955 / 6,039 / 2,207 [t N2O-N yr-1] for the first three years of the baseline scenario, ranging between 621 and 6,955 [t N2O-N yr-1] in the following years (mean of 2,923). The influence of climate change (elevated mean temperature of approx. 2°C and minor changes in precipitation) results in an increase of 259 [t N2O-N yr-1] (mean 3,182), or approx. 9 percent on average (with a minimum of 618 and a maximum of 6,553 [t N2O-N yr-1]). 
Focusing on mitigation, recarbonization resulted in an increase of soil carbon stocks of 2,585 [kg C/ha] over the simulation time span (161,868 [kg C/ha] at the initial stage and 164,453 [kg C/ha] at the end of the 21 years of simulation, mean of 163,444). The study shows that the carbon sequestration due to incorporation of the residues is fully compensated by a steady increase of soil N2O emissions, driven by the additional nitrogen supplied by the mineralization of the organic material (residues). For the derivation of a sustainable land use, the study presents an optimal scenario that keeps yields high while increasing soil C and reducing gaseous N losses and leaching.
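The IPCC Tier I benchmark mentioned in this abstract is a simple emission-factor calculation. The sketch below reproduces the quoted baseline figures (1 824 / 1 610 / 1 180 t N2O-N yr-1) from the fertilization rates and arable area given in the abstract, using the IPCC default direct emission factor EF1 = 0.01; it covers only direct emissions from mineral fertilizer, not the other Tier I terms:

```python
# Minimal IPCC Tier 1 estimate of direct soil N2O emissions from
# mineral fertiliser. EF1 = 0.01 is the IPCC default emission factor;
# area and N rates are taken from the abstract above.

EF1 = 0.01                        # kg N2O-N emitted per kg N applied

def tier1_n2o_n(n_rate_kg_ha, area_ha):
    """Direct N2O-N emissions [t N2O-N/yr] for one crop year."""
    return n_rate_kg_ha * area_ha * EF1 / 1000.0   # kg -> t

area = 1_073_523                  # arable land of Saxony [ha]
rotation = {"rape": 170, "barley": 150, "wheat": 110}   # kg N/ha
emissions = {crop: tier1_n2o_n(rate, area) for crop, rate in rotation.items()}
# emissions ~ {'rape': 1825, 'barley': 1610, 'wheat': 1181} t N2O-N/yr
```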

Haas, E.; Kiese, R.; Klatt, S.; Butterbach-Bahl, K.



Integrated multi-parameters Probabilistic Seismic Landslide Hazard Analysis (PSLHA): the case study of Ischia island, Italy  

NASA Astrophysics Data System (ADS)

The Ischia island is a large, complex, partly submerged, active volcanic field located about 20 km west of Campi Flegrei, a major active volcano-tectonic area near Naples. The island is morphologically characterized in its central part by the resurgent block of Mt. Epomeo, controlled by NW-SE and NE-SW trending fault systems, by mountain stream basins with high relief energy, and by a heterogeneous coastline where beaches alternate with tuff/lava cliffs continuously reshaped by weather and sea erosion. The volcano-tectonic process is a main factor for slope stability, as it produces seismic activity and has generated steep slopes in volcanic deposits (lava, tuff, pumice and ash layers) of variable strength. In the Campi Flegrei and surrounding areas the possible occurrence of a moderate/large seismic event represents a serious threat for the inhabitants, for the infrastructure and for the environment. The most relevant seismic sources for Ischia are the Campi Flegrei caldera and a 5 km long fault located below the island's north coast. Both sources, however, are difficult to constrain: the first because its onshore and offshore extent is not yet completely defined, the second because, being documented by only a few large historical events, it is hard to parameterize in the framework of a probabilistic hazard approach. The high population density, the presence of many infrastructures and highly relevant archaeological sites, together with the area's natural and artistic value, make it a strategic natural laboratory for developing new methodologies. Moreover, Ischia represents the only sector of the Campi Flegrei area with documented historical landslides triggered by earthquakes, allowing the adequacy and stability of the method to be tested. 
In the framework of the Italian project MON.I.C.A (infrastructural coastline monitoring), an innovative, dedicated probabilistic methodology has been applied to identify the areas with the highest susceptibility to landslide occurrence due to seismic effects. The PSLHA combines probability-of-exceedance maps for different ground motion (GM) parameters with the geological and geomorphological information, in terms of critical acceleration and dynamic stability factor. Hazard maps are generally evaluated for Peak Ground Acceleration, Velocity or Intensity, which correlate well with damage to anthropic infrastructure (e.g., streets, buildings, etc.). Each ground motion parameter represents a different aspect of the hazard and has a different correlation with the generation of possible damage. Many works have pointed out that other GM parameters, such as the Arias and Housner intensities and the absolute displacement, could be a better choice for analysing, for example, cliff stability. The selection of the GM parameter is therefore of crucial importance for obtaining the most useful hazard maps, and in the last decades Ground Motion Prediction Equations for these newer GM parameters have been published. Based on this information, a series of landslide hazard maps can be produced, leading to the identification of the areas with the highest probability of earthquake-induced landslides. In a strategic site like Ischia these new methodologies will represent an innovative and advanced tool for landslide hazard mitigation.
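The critical acceleration this abstract refers to is typically obtained from an infinite-slope stability analysis in the Newmark sliding-block framework. The sketch below is a generic illustration of that screening step with made-up soil and slope parameters, not the project's actual model:

```python
import math

# Newmark critical-acceleration screening for seismic landslide
# susceptibility. All parameter values below are illustrative.

def factor_of_safety(c, phi_deg, gamma, t, alpha_deg):
    """Static infinite-slope factor of safety (dry conditions).
    c: cohesion [kPa], phi_deg: friction angle, gamma: unit weight
    [kN/m^3], t: failure-surface depth [m], alpha_deg: slope angle."""
    phi = math.radians(phi_deg)
    alpha = math.radians(alpha_deg)
    return (c / (gamma * t * math.sin(alpha))
            + math.tan(phi) / math.tan(alpha))

def critical_acceleration(fs, alpha_deg):
    """Newmark critical acceleration a_c [g]: shaking above this
    threshold initiates sliding on the slope."""
    return (fs - 1.0) * math.sin(math.radians(alpha_deg))

fs = factor_of_safety(c=10.0, phi_deg=30.0, gamma=18.0, t=2.0, alpha_deg=35.0)
a_c = critical_acceleration(fs, 35.0)
# A map cell is flagged when the GM value with a chosen exceedance
# probability (here an assumed 0.3 g PGA) reaches the critical level.
susceptible = 0.3 >= a_c
```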

Caccavale, Mauro; Matano, Fabio; Sacchi, Marco; Mazzola, Salvatore; Somma, Renato; Troise, Claudia; De Natale, Giuseppe



Modelling the efficacy of proposed mitigation areas for shorebirds: a case study on the Seine estuary, France  

Microsoft Academic Search

A behaviour-based model was used to explore the effect of an extension of the port at Le Havre (Port 2000), and the effect of proposed mitigation measures, on the mortality and body condition of the three main shorebird species that overwinter in the estuary of the river Seine, France. In the model, a 20% reduction in the area of mudflats

Sarah E. A. Le V. dit Durell; Richard A. Stillman; Patrick Triplet; Christophe Aulert; Damien Ono dit Biot; Agnès Bouchet; Sylvain Duhamel; Sebastien Mayot; John D. Goss-Custard



Development and application of the EPIC model for carbon cycle, greenhouse-gas mitigation, and biofuel studies  

SciTech Connect

This chapter provides a comprehensive review of the EPIC model in relation to carbon cycle, greenhouse-gas mitigation, and biofuel applications. From its original capabilities and purpose (i.e., quantifying the impacts of erosion on soil productivity), the EPIC model has evolved into a comprehensive terrestrial ecosystem model for simulating with more or less process-level detail many ecosystem processes such as weather, hydrology, plant growth and development, carbon cycle (including erosion), nutrient cycling, greenhouse-gas emissions, and the most complete set of manipulations that can be implemented on a parcel of land (e.g. tillage, harvest, fertilization, irrigation, drainage, liming, burning, pesticide application). The chapter also provides details and examples of the latest efforts in model development such as the coupled carbon-nitrogen model, a microbial denitrification model with feedback to the carbon decomposition model, updates on calculation of ecosystem carbon balances, and carbon emissions from fossil fuels. The chapter includes examples of applications of the EPIC model in soil carbon sequestration, net ecosystem carbon balance, and biofuel studies. Finally, the chapter provides the reader with an update on upcoming improvements in EPIC such as the addition of modules for simulating biochar amendments, sorption of soluble C in subsoil horizons, nitrification including the release of N2O, and the formation and consumption of methane in soils. Completion of these model development activities will render an EPIC model with one of the most complete representations of biogeochemical processes, capable of simulating the dynamic feedback of soils to climate and management in terms not only of transient processes (e.g., soil water content, heterotrophic respiration, N2O emissions) but also of fundamental soil properties (e.g. soil depth, soil organic matter, soil bulk density, water limits).

Izaurralde, Roberto C.; Mcgill, William B.; Williams, J.R.



Integrated study to define the hazard of the unstable flanks of Mt. Etna: the Italian DPC-INGV FLANK Project  

NASA Astrophysics Data System (ADS)

Volcanoes are often characterized by unstable flanks. The eastern and south-eastern flanks of Mt. Etna (Italy) have shown repeated evidence of instability in the recent past. The extent and frequency of these processes vary widely, from nearly continuous creep-like movements of specific portions of the flank to the rarer slip of the entire eastern sector, involving also the off-shore portion. Estimated slip rates vary enormously, from mm/yr to m/week. The most dramatic instability events are associated with major eruptions and shallow seismic activity, as during 2002-2003, posing a serious hazard to the inhabited flanks of the volcano. The Italian Department of Civil Defense (DPC), with the National Institute of Geophysics and Volcanology (INGV) and with the involvement of Italian universities and other research institutes, has launched a 2-year project (May 2008-May 2010) devoted to minimizing the hazard deriving from the instability of the Etna flanks. This multidisciplinary project embraces geological, geophysical, volcanological, modeling and hazard studies on both the on-shore and off-shore portions of the E and SE flanks of the volcano. Indeed, the main aims are to define: (a) the 3D geometry of the collapsing sector(s); (b) the relationships between flank movement and volcanic and seismic activity; (c) the hazard related to the flank instability. The collected data populate a GIS database implemented according to the WoVo rules. This project represents the first attempt, at least in Europe, to use an integrated approach to minimize the hazard deriving from flank instability in a volcano. Here we briefly summarize the state of the art of the project at an advanced stage, highlighting the path of the different Tasks as well as the main results.

Acocella, Valerio; Puglisi, Giuseppe



Hazard Assessment of Chemical Air Contaminants Measured in Residences  

SciTech Connect

Identifying air pollutants that pose a potential hazard indoors can facilitate exposure mitigation. In this study, we compiled summary results from 77 published studies reporting measurements of chemical pollutants in residences in the United States and in countries with similar lifestyles. These data were used to calculate representative mid-range and upper bound concentrations relevant to chronic exposures for 267 pollutants and representative peak concentrations relevant to acute exposures for 5 activity-associated pollutants. Representative concentrations are compared to available chronic and acute health standards for 97 pollutants. Fifteen pollutants appear to exceed chronic health standards in a large fraction of homes. Nine other pollutants are identified as potential chronic health hazards in a substantial minority of homes and an additional nine are identified as potential hazards in a very small percentage of homes. Nine pollutants are identified as priority hazards based on the robustness of measured concentration data and the fraction of residences that appear to be impacted: acetaldehyde; acrolein; benzene; 1,3-butadiene; 1,4-dichlorobenzene; formaldehyde; naphthalene; nitrogen dioxide; and PM{sub 2.5}. Activity-based emissions are shown to pose potential acute health hazards for PM{sub 2.5}, formaldehyde, CO, chloroform, and NO{sub 2}.
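The screening this abstract describes amounts to computing hazard quotients: the ratio of a representative concentration to the corresponding health standard, with pollutants flagged when the quotient exceeds one. The concentrations and reference levels below are illustrative placeholders, not the study's data:

```python
# Hazard-quotient screening sketch. Values are made-up illustrations;
# real chronic reference levels differ by agency and pollutant.

chronic_standard = {"formaldehyde": 9.0, "benzene": 3.0, "toluene": 5000.0}  # ug/m^3
median_conc = {"formaldehyde": 20.0, "benzene": 1.0, "toluene": 15.0}        # ug/m^3

def hazard_quotients(conc, standard):
    """Concentration / reference level for every pollutant with both values."""
    return {p: conc[p] / standard[p] for p in conc if p in standard}

hq = hazard_quotients(median_conc, chronic_standard)
# Pollutants whose representative concentration exceeds the standard
priority = sorted(p for p, q in hq.items() if q > 1.0)
```

With these placeholder numbers only formaldehyde is flagged; the study applied the same comparison across 97 pollutants and two concentration levels (mid-range and upper bound).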

Logue, J.M.; McKone, T.E.; Sherman, M. H.; Singer, B.C.



A study of hazardous air pollutants at the Tidd PFBC Demonstration Plant  

SciTech Connect

The US Department of Energy (DOE) Clean Coal Technology (CCT) Program is a joint effort between government and industry to develop a new generation of coal utilization processes. In 1986, the Ohio Power Company, a subsidiary of American Electric Power (AEP), was awarded cofunding through the CCT program for the Tidd Pressurized Fluidized Bed Combustion (PFBC) Demonstration Plant located in Brilliant, Ohio. The Tidd PFBC unit began operation in 1990 and was later selected as a test site for an advanced particle filtration (APF) system designed for hot gas particulate removal. The APF system was sponsored by the DOE Morgantown Energy Technology Center (METC) through their Hot Gas Cleanup Research and Development Program. A complementary goal of the DOE CCT and METC R&D programs has always been to demonstrate the environmental acceptability of these emerging technologies. The Clean Air Act Amendments of 1990 (CAAA) have focused that commitment toward evaluating the fate of hazardous air pollutants (HAPs) associated with advanced coal-based and hot gas cleanup technologies. Radian Corporation was contracted by AEP to perform this assessment of HAPs at the Tidd PFBC demonstration plant. The objective of this study is to assess the major input, process, and emission streams at Plant Tidd for the HAPs identified in Title III of the CAAA. Four flue gas stream locations were tested: ESP inlet, ESP outlet, APF inlet, and APF outlet. Other process streams sampled were raw coal, coal paste, sorbent, bed ash, cyclone ash, individual ESP hopper ash, APF ash, and service water. Samples were analyzed for trace elements, minor and major elements, anions, volatile organic compounds, dioxin/furan compounds, ammonia, cyanide, formaldehyde, and semivolatile organic compounds. The particle size distribution in the ESP inlet and outlet gas streams and collected ash from individual ESP hoppers was also determined.




Hazardous materials  


... people how to work with hazardous materials and waste. There are many different kinds of hazardous materials, including: Chemicals, like some that are used for cleaning Drugs, like chemotherapy to treat cancer Radioactive material that is used for x-rays or ...


Site selection for hazardous wastes: A case study from the GAP area, Turkey  

Microsoft Academic Search

The increasing popularity of using environmental design criteria in town and country planning has brought about the need to fully identify the principles for determining the best locations for landfilling hazardous wastes. This environmental management issue has received considerable attention because of its applications in urban and rural infrastructure planning, industrial development planning as well as

Mehmet Irfan Yesilnacar; Hasan Cetin



Volcanic hazards and aviation safety  

USGS Publications Warehouse

An aeronautical chart was developed to show the relative proximity of volcanoes and ash clouds to the airports and flight corridors that may be affected by volcanic debris. The map aims to inform and increase awareness of the close spatial relationship between volcanoes and aviation operations. It shows the locations of active volcanoes together with selected aeronautical navigation aids and great-circle routes, helping to mitigate the threat that volcanic hazards pose to aircraft and to improve aviation safety.

Casadevall, Thomas J.; Thompson, Theodore B.; Ewert, John W.



Determinants of Spatial and Temporal Patterns in Compensatory Wetland Mitigation  

NASA Astrophysics Data System (ADS)

Development projects that impact wetlands commonly require compensatory mitigation, usually through creation or restoration of wetlands on or off the project site. Over the last decade, federal support has increased for third-party off-site mitigation methods. At the same time, regulators have lowered the minimum impact size that triggers the requirement for compensatory mitigation. Few studies have examined the aggregate impact of individual wetland mitigation projects. No previous study has compared the choice of mitigation method by regulatory agency or development size. We analyze 1058 locally and federally permitted wetland mitigation transactions in the Chicago region between 1993 and 2004. We show that decreasing mitigation thresholds have had striking effects on the methods and spatial distribution of wetland mitigation. In particular, the observed increase in mitigation bank use is driven largely by the needs of the smallest impacts. Conversely, throughout the time period studied, large developments have rarely used mitigation banking, and have been relatively unaffected by changing regulatory focus and banking industry growth. We surmise that small developments lack the scale economies necessary for feasible permittee-responsible mitigation. Finally, we compare the rates at which compensation required by both county and federal regulators is performed across major watershed boundaries. We show that local regulations prohibiting cross-county mitigation lead to higher levels of cross-watershed mitigation than federal regulations without cross-county prohibitions. Our data suggest that local control over wetland mitigation may prioritize administrative boundaries over hydrologic function in the matter of selecting compensation sites.

Bendor, Todd; Brozović, Nicholas



Flood fatality hazard and flood damage hazard: combining multiple hazard characteristics into meaningful maps for spatial planning  

NASA Astrophysics Data System (ADS)

For comprehensive flood risk management, accurate information on flood hazards is crucial. While in the past an estimate of potential flood consequences in large areas was often sufficient to make decisions on flood protection, there is currently an increasing demand for detailed hazard maps that allow other risk-reducing measures to be considered as well. Hazard maps are a prerequisite for spatial planning, but can also support emergency management, the design of flood mitigation measures, and the setting of insurance policies. The increase in flood risks due to population growth and economic development in hazardous areas in the past shows that sensible spatial planning is crucial to prevent risks from increasing further. Assigning the least hazardous locations for development, or adapting developments to the actual hazard, requires comprehensive flood hazard maps. Since flood hazard is a multi-dimensional phenomenon, many different maps could be relevant. Having large numbers of maps to take into account does not, however, make planning easier. To support flood risk management planning we therefore introduce a new approach in which all relevant flood hazard parameters are combined into two comprehensive maps: one of flood damage hazard and one of flood fatality hazard.
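Combining several hazard characteristics into a single fatality-hazard map can be sketched as below. The depth-velocity product and the class thresholds used here are common choices in the flood-hazard literature, not necessarily the authors' exact scheme:

```python
# Illustrative per-cell combination of flood characteristics into one
# fatality-hazard class. Thresholds are assumptions for illustration.

def fatality_hazard_class(depth_m, velocity_ms, rise_rate_mh=0.0):
    dv = depth_m * velocity_ms          # depth-velocity product [m^2/s]
    if depth_m > 2.0 or dv > 7.0 or rise_rate_mh > 0.5:
        return "high"                   # escape unlikely; deep or fast water
    if depth_m > 0.5 or dv > 1.0:
        return "medium"                 # people unstable, vehicles float
    return "low"

# (depth [m], velocity [m/s], rise rate [m/h]) for three example cells
cells = [(0.3, 0.5, 0.0), (1.2, 2.0, 0.1), (3.0, 1.0, 0.0)]
classes = [fatality_hazard_class(*c) for c in cells]
# classes -> ['low', 'medium', 'high']
```

A damage-hazard map would combine the same inputs with different weights (e.g. depth and duration dominating over velocity), which is why the two maps are kept separate.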

de Bruijn, K. M.; Klijn, F.; van de Pas, B.; Slager, C. T. J.



Resident perception of volcanic hazards and evacuation procedures  

NASA Astrophysics Data System (ADS)

Katla volcano, located beneath the Mýrdalsjökull ice cap in southern Iceland, is capable of producing catastrophic jökulhlaup. The Icelandic Civil Protection (ICP), in conjunction with scientists, local police and emergency managers, developed mitigation strategies for possible jökulhlaup produced during future Katla eruptions. These strategies were tested during a full-scale evacuation exercise in March 2006. A positive public response during a volcanic crisis not only depends upon the public's knowledge of the evacuation plan but also their knowledge and perception of the possible hazards. To improve the effectiveness of residents' compliance with warning and evacuation messages it is important that emergency management officials understand how the public interpret their situation in relation to volcanic hazards and their potential response during a crisis and apply this information to the ongoing development of risk mitigation strategies. We adopted a mixed methods approach in order to gain a broad understanding of residents' knowledge and perception of the Katla volcano in general, jökulhlaup hazards specifically and the regional emergency evacuation plan. This entailed field observations during the major evacuation exercise, interviews with key emergency management officials and questionnaire survey interviews with local residents. Our survey shows that despite living within the hazard zone, many residents do not perceive that their homes could be affected by a jökulhlaup, and many participants who perceive that their homes are safe, stated that they would not evacuate if an evacuation warning was issued. Alarmingly, most participants did not receive an evacuation message during the exercise. However, the majority of participants who took part in the exercise were positive about its implementation. This assessment of resident knowledge and perception of volcanic hazards and the evacuation plan is the first of its kind in this region. 
Our data can be used as a baseline by the ICP for more detailed studies in Iceland's volcanic regions.

Bird, D. K.; Gisladottir, G.; Dominey-Howes, D.



Pulsed focused ultrasound treatment of muscle mitigates paralysis-induced bone loss in the adjacent bone: a study in a mouse model.  


Bone loss can result from bed rest, space flight, spinal cord injury or age-related hormonal changes. Current bone loss mitigation techniques include pharmaceutical interventions, exercise, pulsed ultrasound targeted to bone and whole body vibration. In this study, we attempted to mitigate paralysis-induced bone loss by applying focused ultrasound to the midbelly of a paralyzed muscle. We employed a mouse model of disuse that uses onabotulinumtoxinA-induced paralysis, which causes rapid bone loss in 5 d. A focused 2 MHz transducer applied pulsed exposures with pulse repetition frequency mimicking that of motor neuron firing during walking (80 Hz), standing (20 Hz), or the standard pulsed ultrasound frequency used in fracture healing (1 kHz). Exposures were applied daily to calf muscle for 4 consecutive d. Trabecular bone changes were characterized using micro-computed tomography. Our results indicated that application of certain focused pulsed ultrasound parameters was able to mitigate some of the paralysis-induced bone loss. PMID:24857416

Poliachik, Sandra L; Khokhlova, Tatiana D; Wang, Yak-Nam; Simon, Julianna C; Bailey, Michael R



Geologic and Geophysical Studies of Natural Hazards and Risks in the Gulf of Peter the Great, Japan Sea  

NASA Astrophysics Data System (ADS)

The area of the Gulf of Peter the Great is socially, economically and culturally one of the most important regions of the Russian Far East. At the same time, palpable natural hazards have been reported there, posing a real threat to local infrastructure. A complex field team of the Gramberg VNIIOkeangeologia institute carried out geological and geophysical studies of natural hazards in the water area and coastal zone of the gulf in the summer and autumn of 2012. The research program included - geodetic deformation monitoring of the coastal zone with the Leica HDS 3000 tacheometer; - echo sounding of the underwater part of the coastal slope with the LCX-37C depth sounder equipped with an active external 12-channel GPS Lowrance antenna LGC-3000; - high-frequency acoustic profiling with the GeoPulse Subbottom Profiler (oscillator frequency of 12.2 kHz) for the study of bottom sediments to a depth of 40 m; - hydromagnetic measurements with the SeaSPY Marine Magnetics magnetometer for investigation of the deep geological structure; - sonar measurements with GEO SM C-MAX 325 kHz frequency emitters for studying seafloor features; - studies of the water column (sensing and sampling); - bottom sediment sampling. Analytic work was performed by mass spectrometry, atomic absorption spectrophotometry, chromatography, gas chromatography-mass spectrometry and gamma spectrometry and included the following. For water - the content of Fe, Mn, Cd, As, Pb, Cu, Co, Ni, Cr, Zn, Hg in solution and in suspension, polycyclic aromatic compounds, organochlorine pesticides, oil, and methane. For sediments - grade analysis, mineralogical analysis of sand, determination of Fe, Mn, Cd, As, Pb, Cu, Co, Ni, Cr, Zn, Hg content; identification of petroleum products, polychlorinated biphenyls, organochlorine pesticides, and the specific activity of Cs-137. 
As a result, a set of geological maps was composed: maps of pre-Quaternary and Quaternary rocks and deposits, a lithological map, a geomorphological map, a map of engineering geological zoning, a map of the major hydro- and lithodynamic processes, hydraulic and geochemical maps and sections, a seismotectonic map, a map of endogenous geodynamics, a map of exogenous geological processes, a map assessing the overall geo-ecological situation, etc. From the first stage of these studies we identified the following significant hazards and risks faced by the region: 1. Seismic hazards along the seismoactive faults of the region. 2. Tsunami hazards on the coasts of the Amursky, Ussuriysky and other gulfs of the region. 3. Shore destruction, including landslides, in many littoral zones of the region. 4. Avalanche sedimentation in the Amursky and Ussuriysky gulfs. 5. Gas emissions from the bottom of the shelf zone. 6. Industrial pollution in water areas near industrial centres. Estimation of hazards and risks in the Gulf of Peter the Great will be continued.

Anokhin, Vladimir; Shcherbakov, Viktor; Motychko, Viktor; Slinchenkov, Vladimir; Sokolov, Georgy; Kotov, Sergey; Kartashov, Sergey



Facts, Contradictions And Possible Improvement Actions For Hazardous Wastewater Management - A Case Study  

Microsoft Academic Search

Pollution caused by direct discharge of waste leaching in the tanning industry is a major source of contamination of sites (soil and groundwater) in Romania. Along with nitrogen-based organics, organic and inorganic sulphides and chromium are some of the most hazardous contaminants discharged and historically present on tannery sites. As the industrial branch enters the IPPC (Integrated Pollution Prevention and Control) Directive, the strategies

Maura Teodorescu; Carmen Gaidau


Asymptotic results for fitting marginal hazards models from stratified case-cohort studies with multiple disease outcomes  

PubMed Central

In stratified case-cohort designs, the case-cohort sample is selected via stratified random sampling based on covariate information available for the entire cohort. In this paper, we extended the work of Kang & Cai (2009) to a generalized stratified case-cohort study design for failure time data with multiple disease outcomes. Under this study design, we developed weighted estimating procedures for model parameters in marginal multiplicative intensity models and for the cumulative baseline hazard function. The asymptotic properties of the estimators are studied using martingales, modern empirical process theory, and results for finite population sampling. PMID:22442642
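The weighted estimating procedures this abstract refers to rest on inverse-probability weights: cases receive weight 1, while sampled non-cases in each stratum are up-weighted by the inverse of that stratum's sampling fraction. A minimal sketch of the weight construction, with made-up strata and sampling fractions:

```python
# Inverse-probability weights for a stratified case-cohort sample.
# Strata labels and sampling fractions below are illustrative only.

def case_cohort_weights(subjects, sampling_fraction):
    """subjects: list of (stratum, is_case) tuples for sampled subjects.
    Cases get weight 1; non-cases get 1 / stratum sampling fraction."""
    return [1.0 if is_case else 1.0 / sampling_fraction[stratum]
            for stratum, is_case in subjects]

sampling_fraction = {"A": 0.1, "B": 0.25}   # subcohort fractions per stratum
subjects = [("A", True), ("A", False), ("B", False)]
weights = case_cohort_weights(subjects, sampling_fraction)
# weights -> [1.0, 10.0, 4.0]
```

These weights then enter the weighted score equations of the marginal Cox-type intensity model; time-varying refinements of the weights are where the asymptotic theory becomes delicate.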

Kang, Sangwook; Cai, Jianwen



Modeling, Forecasting and Mitigating Extreme Earthquakes  

NASA Astrophysics Data System (ADS)

Recent earthquake disasters have highlighted the importance of multi- and trans-disciplinary studies of earthquake risk. A major component of earthquake disaster risk analysis is hazards research, which should cover not only a traditional assessment of ground shaking, but also studies of geodetic, paleoseismic, geomagnetic, hydrological, deep drilling and other geophysical and geological observations, together with comprehensive modeling of earthquakes and forecasting of extreme events. Extreme earthquakes (large-magnitude and rare events) are manifestations of the complex behavior of the lithosphere, structured as a hierarchical system of blocks of different sizes. Understanding of the physics and dynamics of extreme events comes from observations, measurements and modeling. A quantitative approach to simulating earthquakes in models of fault dynamics will be presented. The models reproduce basic features of the observed seismicity (e.g., the frequency-magnitude relationship, clustering of earthquakes, occurrence of extreme seismic events). They provide a link between geodynamic processes and seismicity, allow the study of extreme events and of the influence of fault network properties on seismic patterns and seismic cycles, and assist, in a broader sense, in earthquake forecast modeling. Some aspects of the predictability of large earthquakes (how well can large earthquakes be predicted today?) will also be discussed, along with possibilities for the mitigation of earthquake disasters (e.g., 'inverse' forensic investigations of earthquake disasters).
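The frequency-magnitude relationship mentioned in the abstract is conventionally modeled with the Gutenberg-Richter law, log10 N(≥M) = a − bM. As an illustrative sketch (not part of the authors' fault-dynamics models), the b-value can be estimated from an earthquake catalogue with Aki's maximum-likelihood formula; the catalogue below is hypothetical:

```python
import math

def aki_b_value(mags, m_min):
    """Maximum-likelihood b-value (Aki, 1965) for the
    Gutenberg-Richter relation log10 N(>=M) = a - b*M."""
    m = [x for x in mags if x >= m_min]
    # b = log10(e) / (mean magnitude - completeness magnitude)
    return math.log10(math.e) / (sum(m) / len(m) - m_min)

# Hypothetical catalogue, complete above m_min = 4.0
mags = [4.0, 4.1, 4.3, 4.0, 4.6, 5.0, 4.2, 4.4, 4.1, 4.8]
b = aki_b_value(mags, 4.0)  # ~1.24 for this sample
```

Real catalogues require careful treatment of magnitude binning and completeness; this estimator is only the simplest starting point.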

Ismail-Zadeh, A.; Le Mouel, J.; Soloviev, A.



Analyzing electrical hazards in the workplace.  


In resolving the issues in analyzing electrical hazards in an industry, we must follow a path that will lead to a comprehensive analysis of the problems that exist and provide a quantified value to ensure the selection of appropriate personal protective equipment and clothing. An analysis of all three hazards--shock, arc, and blast--must be completed and steps taken to prevent injuries. The following steps could be taken to ensure adequacy of the electrical safe work practices program and training of "qualified" electrical personnel: 1. Conduct a comprehensive Job Task Analysis. 2. Complete a Task Hazard Assessment including: a) shock hazard, b) arc flash hazard, c) arc blast hazard, d) other hazards (slip, fall, struck-by, environmental, etc.). 3. Analyze task for the personal protective equipment needed. 4. Conduct training needs assessment for qualified and non-qualified electrical workers. 5. Revise, update, or publish a complete electrical safe work practices program. Regulatory agencies and standards organizations have long recognized the need to analyze the hazards of electrical work and plan accordingly to mitigate the hazards. Unfortunately, many in the electrical industry have chosen to "take their chances," largely because nothing bad has yet happened. As more information becomes available on the economic and human costs of electrical accidents, it is hoped that more in the industry will recognize the need for systematic hazard analysis and an electrical safe work program that emphasizes hazard identification and abatement. PMID:24358642

Neitzel, Dennis K



Repetitive flood victims and acceptance of FEMA mitigation offers: an analysis with community-system policy implications.  


Of all natural disasters, flooding causes the greatest amount of economic and social damage. The United States' Federal Emergency Management Agency (FEMA) uses a number of hazard mitigation grant programmes for flood victims, including mitigation offers to relocate permanently repetitive flood loss victims. This study examines factors that help to explain the degree of difficulty repetitive flood loss victims experience when they make decisions about relocating permanently after multiple flood losses. Data are drawn from interviews with FEMA officials and a survey of flood victims from eight repetitive flooding sites. The qualitative and quantitative results show the importance of rational choices by flood victims in their mitigation decisions, as they relate to financial variables, perceptions of future risk, attachments to home and community, and the relationships between repetitive flood loss victims and the local flood management officials who help them. The results offer evidence to suggest the value of a more community-system approach to FEMA relocation practices. PMID:21272056

Kick, Edward L; Fraser, James C; Fulkerson, Gregory M; McKinney, Laura A; De Vries, Daniel H



Probabilistic short-term volcanic hazard in phases of unrest: A case study for tephra fallout  

NASA Astrophysics Data System (ADS)

During volcanic crises, volcanologists usually estimate the impact of possible imminent eruptions through deterministic modeling of the effects of one or a few preestablished scenarios. Although such an approach may provide important information to decision makers, the sole use of deterministic scenarios does not allow scientists to properly take all uncertainties into consideration, and it cannot be used to assess risk quantitatively, because the latter unavoidably requires a probabilistic approach. We present a model based on the concept of the Bayesian event tree (hereinafter named BET_VH_ST, standing for Bayesian event tree for short-term volcanic hazard) for short-term, near-real-time probabilistic volcanic hazard analysis, formulated for any potentially hazardous phenomenon accompanying an eruption. The specific goal of BET_VH_ST is to produce a quantitative assessment of the probability of exceedance of any potential level of intensity for a given volcanic hazard due to eruptions within restricted time windows (hours to days) in any area surrounding the volcano, accounting for all natural and epistemic uncertainties. BET_VH_ST properly assesses the conditional probability at each level of the event tree, accounting for any relevant information derived from the monitoring system, theoretical models, and the past history of the volcano, and propagating any relevant epistemic uncertainty underlying these assessments. As an application example of the model, we apply BET_VH_ST to assess short-term volcanic hazard related to tephra loading during the Major Emergency Simulation Exercise, a major exercise at Mount Vesuvius that took place from 19 to 23 October 2006, consisting of a blind simulation of Vesuvius reactivation, from the early warning phase up to the final eruption, including the evacuation of a sample of about 2000 people from the area at risk.
The results show that BET_VH_ST is able to produce short-term forecasts of the impact of tephra fall during a rapidly evolving crisis, accurately accounting for and propagating all uncertainties and enabling rational decision making under uncertainty.
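An event tree of the kind described factors the probability of a terminal outcome into conditional probabilities at successive nodes. A minimal sketch of that bookkeeping follows; the node names and probabilities are hypothetical illustrations, not BET_VH_ST values:

```python
def tree_probability(branches):
    """Probability of the terminal outcome as the product of
    conditional probabilities along one event-tree path."""
    p = 1.0
    for prob in branches.values():
        p *= prob
    return p

# Hypothetical short-term path for a tephra-load exceedance
path = {
    "unrest": 1.0,                  # unrest ongoing during the crisis
    "magmatic_origin": 0.7,         # unrest has a magmatic origin
    "eruption_in_window": 0.3,      # eruption within the time window
    "vent_in_sector": 0.5,          # vent opens in the sector of interest
    "load_exceeds_threshold": 0.2,  # tephra load exceeds the chosen level
}
p_exceed = tree_probability(path)  # 0.021 for these values
```

The full method additionally maintains probability distributions (not point values) at each node so that epistemic uncertainty propagates to the final estimate.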

Selva, Jacopo; Costa, Antonio; Sandri, Laura; Macedonio, Giovanni; Marzocchi, Warner



Reproductive Hazards  


... and female reproductive systems play a role in pregnancy. Problems with these systems can affect fertility and ... a reproductive hazard can cause different effects during pregnancy, depending on when she is exposed. During the ...


Coastal Hazards.  

ERIC Educational Resources Information Center

Focuses on hurricanes and tsunamis and uses these topics to address other parts of the science curriculum. In addition to a discussion on beach erosion, a poster is provided that depicts these natural hazards that threaten coastlines. (DDR)

Vandas, Steve



Evaluating a multi-criteria model for hazard assessment in urban design. The Porto Marghera case study  

SciTech Connect

The aim of this paper is to describe a new approach to major industrial hazard assessment, which has recently been studied by the authors in conjunction with the Italian Environmental Protection Agency ('ARPAV'). The real opportunity for developing a different approach arose from the need of the Italian EPA to provide the Venice Port Authority with an appropriate estimation of major industrial hazards in Porto Marghera, an industrial estate near Venice (Italy). However, the standard model, quantitative risk analysis (QRA), provided only a list of individual quantitative risk values related to single locations. The experimental model is based on a multi-criteria approach, the Analytic Hierarchy Process, which introduces the use of expert opinions, complementary skills and expertise from different disciplines in conjunction with traditional quantitative analysis. This permitted the generation of quantitative data on risk assessment from a series of qualitative assessments, on the present situation and on three other future scenarios, and the use of this information as indirect quantitative measures, which could be aggregated to obtain the global risk rate. This approach is in line with the main concepts proposed by the latest European directive on major hazard accidents, which recommends increasing the participation of operators, taking the other players into account and, moreover, paying more attention to the concepts of 'urban control', 'subjective risk' (risk perception) and intangible factors (factors not directly quantifiable).
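The Analytic Hierarchy Process turns expert pairwise comparisons of criteria into priority weights. The paper does not specify its aggregation method, so the sketch below uses the common geometric-mean approximation on a hypothetical comparison matrix:

```python
import math

def ahp_priorities(matrix):
    """Priority weights from an AHP pairwise-comparison matrix,
    using the geometric-mean (logarithmic least squares) method."""
    n = len(matrix)
    gm = [math.prod(row) ** (1.0 / n) for row in matrix]  # row geometric means
    total = sum(gm)
    return [g / total for g in gm]

# Hypothetical comparisons for three risk criteria (A, B, C):
# matrix[i][j] = how much more important criterion i is than j
pairwise = [
    [1.0, 3.0, 5.0],
    [1 / 3, 1.0, 3.0],
    [1 / 5, 1 / 3, 1.0],
]
w = ahp_priorities(pairwise)  # weights sum to 1; A ranked highest
```

A full AHP study would also check the consistency ratio of each expert's matrix before accepting the weights.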

Luria, Paolo; Aspinall, Peter A



A Multiresolution Hazard Model for Multicenter Survival Studies: Application to Tamoxifen Treatment in Early Stage Breast Cancer  

PubMed Central

In multicenter studies, one often needs to make inference about a population survival curve based on multiple, possibly heterogeneous survival data from individual centers. We investigate a flexible Bayesian method for estimating a population survival curve based on a semiparametric multiresolution hazard model that can incorporate covariates and account for center heterogeneity. The method yields a smooth estimate of the survival curve for “multiple resolutions” or time scales of interest. The Bayesian model used has the capability to accommodate general forms of censoring and a priori smoothness assumptions. We develop a model checking and diagnostic technique based on the posterior predictive distribution and use it to identify departures from the model assumptions. The hazard estimator is used to analyze data from 110 centers that participated in a multicenter randomized clinical trial to evaluate tamoxifen in the treatment of early stage breast cancer. Of particular interest are the estimates of center heterogeneity in the baseline hazard curves and in the treatment effects, after adjustment for a few key clinical covariates. Our analysis suggests that the treatment effect estimates are rather robust, even for a collection of small trial centers, despite variations in center characteristics.

BOUMAN, Peter; MENG, Xiao-Li; DIGNAM, James; DUKIĆ, Vanja



HAZARDOUS MATERIALS INCIDENTS What are hazardous materials?  

E-print Network

Hazardous materials are chemicals… Regulations contain requirements for hazardous material inventories, training, and proper waste disposal; regulations also define hazardous material groups… What should I do if there is a small spill in the area and personnel trained in Hazardous Material clean-up…

Fernandez, Eduardo


Protection of large alpine infrastructures against natural hazards  

NASA Astrophysics Data System (ADS)

Large infrastructures in alpine domains are threatened by a variety of natural hazards such as debris flows, rock falls and snow avalanches. Linear infrastructure in particular, including roads, railway lines, pipelines and power lines, passes through the entire mountain range, and the impact of natural hazards can be expected along distances of hundreds of kilometers. New infrastructure projects such as storage power plants or ski resorts, including their access roads, are often located in remote alpine domains without any historical record of hazardous events. Mitigation strategies against natural hazards require a detailed analysis of the exposure of the infrastructure to natural hazards. Following conventional concepts, extensive mapping and documentation of surface processes over hundreds to several thousand km² of steep alpine terrain would be essential but can hardly be performed. We present a case study from the Austrian Alps to demonstrate the ability of a multi-level concept to describe the impact of natural hazards on infrastructure by an iterative process. This includes new state-of-the-art numerical models, modern field work and GIS analysis, with an increasing level of refinement at each stage. A set of new numerical models for rock falls, debris flows and snow avalanches was designed to operate with field information of different qualities and spatial resolutions. Our analysis starts with simple and fast cellular automata for rock falls and debris flows to show the exposure of the infrastructure to natural hazards over huge domains and to detect "high risk areas" that are investigated in more detail in the field at the next refinement level. Finally, sophisticated 2D depth-averaged fluid dynamic models for all kinds of rapid mass movements are applied to support the development of protection structures.
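The fast first-level screening described above can be illustrated with a toy steepest-descent automaton on a DEM grid: a block released at a cell follows the lowest neighbour until it reaches a local minimum. This is a generic sketch of the idea, not the authors' models, and the elevation grid is hypothetical:

```python
def steepest_descent_path(dem, start):
    """First-pass rockfall screening: follow the steepest downhill
    neighbour on a DEM grid until a local minimum is reached."""
    rows, cols = len(dem), len(dem[0])
    path = [start]
    r, c = start
    while True:
        neighbours = [(r + dr, c + dc)
                      for dr in (-1, 0, 1) for dc in (-1, 0, 1)
                      if (dr or dc)
                      and 0 <= r + dr < rows and 0 <= c + dc < cols]
        nr, nc = min(neighbours, key=lambda p: dem[p[0]][p[1]])
        if dem[nr][nc] >= dem[r][c]:   # local minimum: deposition
            return path
        r, c = nr, nc
        path.append((r, c))

dem = [  # hypothetical 3 x 4 elevation grid (metres)
    [9.0, 8.0, 7.0, 6.0],
    [8.5, 7.0, 5.0, 4.0],
    [8.0, 6.0, 4.5, 3.0],
]
path = steepest_descent_path(dem, (0, 0))  # runs out to the low corner
```

Production screening models additionally track energy or friction-limited runout; the path search alone only identifies which cells can be reached at all.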

Robl, Jörg; Scheikl, Manfred; Hergarten, Stefan



A probabilistic tsunami hazard assessment for Indonesia  

NASA Astrophysics Data System (ADS)

Probabilistic hazard assessments are a fundamental tool for assessing the threats posed by hazards to communities and are important for underpinning evidence-based decision-making regarding risk mitigation activities. Indonesia has been the focus of intense tsunami risk mitigation efforts following the 2004 Indian Ocean tsunami, but these have been largely concentrated on the Sunda Arc, with little attention to other tsunami-prone areas of the country such as eastern Indonesia. We present the first nationally consistent probabilistic tsunami hazard assessment (PTHA) for Indonesia. This assessment produces time-independent forecasts of tsunami hazards at the coast using data from tsunamis generated by local, regional and distant earthquake sources. The methodology is based on the established Monte Carlo approach to probabilistic seismic hazard assessment (PSHA) and has been adapted to tsunami. We account for sources of epistemic and aleatory uncertainty in the analysis through the use of logic trees and sampling of probability density functions. For short return periods (100 years) the tsunami hazard is highest along the west coast of Sumatra, the south coast of Java and the north coast of Papua. For longer return periods (500-2500 years), the tsunami hazard is highest along the Sunda Arc, reflecting the larger maximum magnitudes there. The annual probability of experiencing a tsunami with a height of > 0.5 m at the coast is greater than 10% for Sumatra, Java, the Sunda islands (Bali, Lombok, Flores, Sumba) and north Papua. The annual probability of experiencing a tsunami with a height of > 3.0 m, which would cause significant inundation and fatalities, is 1-10% in Sumatra, Java, Bali, Lombok and north Papua, and 0.1-1% for north Sulawesi, Seram and Flores. The results of this national-scale hazard assessment provide evidence for disaster managers to prioritise regions for risk mitigation activities and/or more detailed hazard or risk assessments.
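At each coastal point, a time-independent PTHA of this kind ultimately counts synthetic events exceeding a height threshold and converts that rate into an annual Poisson probability. A hedged sketch with a hypothetical synthetic catalogue (the real assessment uses thousands of scenarios per logic-tree branch):

```python
import math

def annual_exceedance_probability(heights, catalogue_years, threshold):
    """Poissonian annual probability that tsunami height at the coast
    exceeds `threshold`, from a synthetic catalogue spanning
    `catalogue_years` years of simulated events."""
    rate = sum(1 for h in heights if h > threshold) / catalogue_years
    return 1.0 - math.exp(-rate)

# Hypothetical maximum coastal heights (metres) from a
# 10,000-year synthetic catalogue at one coastal point
heights = [0.2, 0.6, 1.1, 0.4, 3.5, 0.8, 2.2, 0.3, 0.7, 5.0]
p = annual_exceedance_probability(heights, 10_000, 0.5)
```

Repeating this for every threshold yields the hazard curve from which return-period maps such as those described above are drawn.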

Horspool, N.; Pranantyo, I.; Griffin, J.; Latief, H.; Natawidjaja, D. H.; Kongko, W.; Cipta, A.; Bustaman, B.; Anugrah, S. D.; Thio, H. K.



The best plan for flood mitigation: A case study in the north-eastern part of Iran  

NASA Astrophysics Data System (ADS)

The frequency and magnitude of floods and debris flows have risen dramatically in the north-eastern part of Iran in the past decade. The evidence shows that the peak discharge of the 2001 flood exceeded the estimated PMF (Probable Maximum Flood) of Golestan dam. The extreme floods of the region, which mostly occur in the summer, have claimed hundreds of lives and thousands of livestock and destroyed much infrastructure in recent years. Structural measures, in association with non-structural measures, have been identified as essential elements of flood mitigation in the master plan. Consequently, a two-phased plan comprising urgent measures and a master plan has been prepared for the basin, as mid-term and long-term solutions respectively. Considering the flash-flood character of the region, flood detention and attenuation in upstream areas has been assessed as an effective measure for mitigating flood magnitude in downstream areas. Therefore a detention dam has been designed in the upstream catchments, which contribute strongly to flood generation in the basin. In the design stage of the detention dam, several alternatives of reservoir and spillway capacity were assessed with regard to flood reduction in the whole catchment. The detention dam characteristics were finalized on the basis of the maximum justifiable flood attenuation, owing to the high vulnerability of the area. The designed detention dam can completely control floods of up to a 200-year return period and reduce the 1000-year peak discharge to less than the 100-year value at the dam site. Moreover, the dam would mitigate floods at the downstream damage center by at least 40% compared to the without-project situation. This paper introduces the proposed master plan and evaluates the efficiency of the detention dam in flood reduction for the whole basin.
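Detention-dam attenuation of the kind assessed here can be illustrated with the simplest routing scheme, a linear reservoir stepped explicitly (outflow proportional to storage). This is a generic sketch, not the study's hydraulic model, and the hydrograph and storage constant are hypothetical:

```python
def level_pool_routing(inflows, dt, k, s0=0.0):
    """Linear-reservoir (level-pool) routing: outflow O = S / k,
    with storage updated as dS = (I - O) * dt each step."""
    s, out = s0, []
    for i in inflows:
        o = s / k
        s += (i - o) * dt
        out.append(o)
    return out

# Hypothetical inflow hydrograph (m^3/s), hourly steps, k = 5 h
inflows = [0.0, 100.0, 200.0, 100.0, 0.0, 0.0]
out = level_pool_routing(inflows, dt=1.0, k=5.0)
# peak outflow (64.8) is well below the 200 m^3/s peak inflow
```

Even this toy model shows the mechanism the study exploits: the reservoir stores the flood peak and releases it more slowly, lowering the downstream maximum.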

Heidari, A.



Contributions to the Characterization and Mitigation of Rotorcraft Brownout  

NASA Astrophysics Data System (ADS)

Rotorcraft brownout, the condition in which the flow field of a rotorcraft mobilizes sediment from the ground to generate a cloud that obscures the pilot's field of view, continues to be a significant hazard to civil and military rotorcraft operations. This dissertation presents methodologies for: (i) the systematic mitigation of rotorcraft brownout through operational and design strategies and (ii) the quantitative characterization of the visual degradation caused by a brownout cloud. In Part I of the dissertation, brownout mitigation strategies are developed through simulation-based brownout studies that are mathematically formulated within a numerical optimization framework. Two optimization studies are presented. The first study involves the determination of approach-to-landing maneuvers that result in reduced brownout severity. The second study presents a potential methodology for the design of helicopter rotors with improved brownout characteristics. The results of both studies indicate that the fundamental mechanisms underlying brownout mitigation are aerodynamic in nature, and the evolution of a ground vortex ahead of the rotor disk is seen to be a key element in the development of a brownout cloud. In Part II of the dissertation, brownout cloud characterizations are based upon the Modulation Transfer Function (MTF), a metric commonly used in the optics community for the characterization of imaging systems. The use of the MTF in experimentation is examined first, and the application of MTF calculation and interpretation methods to actual flight test data is described. The potential for predicting the MTF from numerical simulations is examined second, and an initial methodology is presented for the prediction of the MTF of a brownout cloud. 
Results from the experimental and analytical studies rigorously quantify the intuitively-known facts that the visual degradation caused by brownout is a space and time-dependent phenomenon, and that high spatial frequency features, i.e., fine-grained detail, are obscured before low spatial frequency features, i.e., large objects. As such, the MTF is a metric that is amenable to Handling Qualities (HQ) analyses.
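In one common formulation, the MTF used above is the normalised magnitude of the Fourier transform of a line spread function (LSF). A minimal DFT-based sketch with a hypothetical blur profile (not flight-test data):

```python
import cmath

def mtf_from_lsf(lsf):
    """Modulation Transfer Function as the normalised magnitude of
    the discrete Fourier transform of a line spread function."""
    n = len(lsf)
    mags = []
    for k in range(n):
        s = sum(lsf[j] * cmath.exp(-2j * cmath.pi * k * j / n)
                for j in range(n))
        mags.append(abs(s))
    return [m / mags[0] for m in mags]  # normalise so DC response = 1

lsf = [0.0, 0.1, 0.4, 1.0, 0.4, 0.1, 0.0]  # hypothetical blur profile
mtf = mtf_from_lsf(lsf)
```

A wide LSF (heavy obscuration) drives the MTF down at high spatial frequencies first, which matches the observation that fine-grained detail is lost before large objects.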

Tritschler, John Kirwin


Studying geodesy and earthquake hazard in and around the New Madrid Seismic Zone  

USGS Publications Warehouse

Workshop on New Madrid Geodesy and the Challenges of Understanding Intraplate Earthquakes; Norwood, Massachusetts, 4 March 2011. Twenty-six researchers gathered for a workshop sponsored by the U.S. Geological Survey (USGS) and FM Global to discuss geodesy in and around the New Madrid seismic zone (NMSZ) and its relation to earthquake hazards. The group addressed the challenge of reconciling current geodetic measurements, which show low present-day surface strain rates, with paleoseismic evidence of recent, relatively frequent, major earthquakes in the region. The workshop presentations and conclusions will be available in a forthcoming USGS open-file report.

Boyd, Oliver Salz; Magistrale, Harold



Recording and cataloging hazards information, revision A  

NASA Technical Reports Server (NTRS)

A data collection process is described for the purpose of discerning the causation factors of accidents and establishing boundaries or controls aimed at mitigating and eliminating accidents. A procedure is proposed that suggests a disciplined approach to hazard identification based on energy interrelationships, together with an integrated control technique which takes the form of checklists.

Stein, R. J.



Evaluation of soil bioassays for use at Washington state hazardous waste sites: A pilot study  

SciTech Connect

The Washington State Department of Ecology (Ecology) is developing guidelines to assess soil toxicity at hazardous waste sites being investigated under the Washington Model Toxics Control Act Cleanup Regulation. To evaluate soil toxicity, Ecology selected five bioassay protocols -- Daphnia, Earthworm, Seedling, Fathead Minnow, and Frog Embryo Teratogenesis Assay Xenopus (FETAX) -- for use as screening-level assessment tools at six state hazardous waste sites. Sites contained a variety of contaminants including metals, creosote, pesticides, and petroleum products (leaking underground storage tanks). Three locations, representing high, medium, and low levels of contamination, were sampled at each site. In general, the high-contaminant samples produced the highest toxic response in all bioassays. The order of site toxicity, as assessed by overall toxic response, is creosote, petroleum products, metals, and pesticides. Results indicate that human health standards, especially for metals, may not adequately protect some of the species tested. The FETAX bioassay had the greatest overall number of toxic responses and the lowest variance. The seedling and Daphnia bioassays had lower and similar overall toxic response results, followed by the earthworm and fathead minnow. Variability was markedly highest for the seedling. The Daphnia and fathead minnow variability were similar to the FETAX level, while the earthworm variability was slightly higher.

Blakley, N.; Norton, D.; Stinson, M. [Washington Department of Ecology, Olympia, WA (United States); Boyer, R. [WCFWRU, Seattle, WA (United States). School of Fisheries




USGS Publications Warehouse

Various measures of the seismic source mechanism of mine tremors, such as magnitude, moment, stress drop, apparent stress, and seismic efficiency, can be related directly to several aspects of the problem of determining the underground hazard arising from strong ground motion of large seismic events. First, the relation between the sum of seismic moments of tremors and the volume of stope closure caused by mining during a given period can be used, in conjunction with magnitude-frequency statistics and an empirical relation between moment and magnitude, to estimate the maximum possible tremor size for a given mining situation. Second, it is shown that the 'energy release rate,' a commonly used parameter for predicting underground seismic hazard, may be misleading in that the importance of overburden stress, or depth, is overstated. Third, results involving the relation between peak velocity and magnitude, magnitude-frequency statistics, and the maximum possible magnitude are applied to the problem of estimating the frequency at which the design limits of certain underground support equipment are likely to be exceeded.
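The moment-magnitude link invoked above is commonly the Hanks-Kanamori relation, and the closure-volume argument bounds the summed moment by the elastic work of convergence (roughly sum M0 ≲ G·ΔV). The sketch below combines the two; the shear modulus and closure volume are illustrative values, not figures from the paper:

```python
import math

def moment_magnitude(m0_newton_metres):
    """Moment magnitude from seismic moment (Hanks & Kanamori):
    Mw = (2/3) * (log10 M0 - 9.1), with M0 in N*m."""
    return (2.0 / 3.0) * (math.log10(m0_newton_metres) - 9.1)

def max_moment_from_closure(shear_modulus_pa, closure_volume_m3):
    """Upper bound on summed seismic moment implied by a volume of
    stope closure (assumed bound: sum M0 <= G * dV)."""
    return shear_modulus_pa * closure_volume_m3

# Illustrative values: G = 30 GPa rock, 1e4 m^3 of closure
m0 = max_moment_from_closure(3.0e10, 1.0e4)
mw = moment_magnitude(m0)  # bound of about Mw 3.6 on a single event
```

The bound applies to the moment sum over a period; partitioning it among events via the magnitude-frequency statistics is what yields the maximum single-event estimate discussed in the abstract.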

McGarr, A.



Pilot studies of seismic hazard and risk in North Sulawesi Province, Indonesia  

USGS Publications Warehouse

Earthquake ground motions in North Sulawesi on soft soil that have a 90% probability of not being exceeded in 560 years are estimated to be 0.63 g (63% of the acceleration of gravity) at Palu, 0.31 g at Gorontalo, and 0.27 g at Manado. Estimated ground motions for rock conditions at the same probability level and exposure time are 56% of those for soft soil. The hazard estimates are obtained from seismic sources that model the earthquake potential to a depth of 100 km beneath northern and central Sulawesi and include the Palu fault zone of western Sulawesi, the North Sulawesi subduction zone, and the southernmost segment of the Sangihe subduction zone beneath the Molucca Sea. An attenuation relation based on Japanese strong-motion data and considered appropriate for subduction environments of the western Pacific was used in the determination of ground motions. Following the 18 April 1990 North Sulawesi earthquake (Ms 7.3), a seismic hazard and risk assessment was carried out. -from Authors
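Assuming Poissonian occurrence, a stated probability of non-exceedance p over an exposure time t implies a return period T = −t / ln(p); for the 90%-in-560-years level quoted here that works out to roughly 5300 years. A small sketch of the conversion:

```python
import math

def return_period(p_nonexceed, exposure_years):
    """Return period implied by a probability of non-exceedance
    over an exposure time, under a Poisson occurrence model:
    p = exp(-t/T)  =>  T = -t / ln(p)."""
    return -exposure_years / math.log(p_nonexceed)

t = return_period(0.90, 560)  # roughly 5315 years
```

The same formula explains the more familiar pairing of 90% in 50 years with a ~475-year return period.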

Thenhaus, P.C.; Hanson, S.L.; Effendi, I.; Kertapati, E.K.; Algermissen, S.T.



Lunar Dust: Characterization and Mitigation  

NASA Technical Reports Server (NTRS)

Lunar dust is a ubiquitous phenomenon which must be explicitly addressed during upcoming human lunar exploration missions. Near-term plans to revisit the moon as a stepping stone for further exploration of Mars, and beyond, place a primary emphasis on the characterization and mitigation of lunar dust. Comprised of regolith particles ranging in size from tens of nanometers to microns, lunar dust is a manifestation of the complex interaction of the lunar soil with multiple mechanical, electrical, and gravitational effects. The environmental and anthropogenic factors affecting the perturbation, transport, and deposition of lunar dust must be studied in order to mitigate its potentially harmful effects on exploration systems. The same holds true for assessing the risk it may pose for toxicological health problems if inhaled. This paper presents the current perspective and implementation of dust knowledge management and integration, and mitigation technology development activities within NASA's Exploration Technology Development Program. This work is presented within the context of the Constellation Program's Integrated Lunar Dust Management Strategy. This work further outlines the scientific basis for lunar dust behavior, its characteristics and potential effects, and surveys several potential strategies for its control and mitigation, both for lunar surface operations and within the working volumes of a lunar outpost. The paper also presents a perspective on lessons learned from Apollo and forensics engineering studies of Apollo hardware.

Hyatt, Mark J.; Feighery, John



Automated Standard Hazard Tool  

NASA Technical Reports Server (NTRS)

The current system used to generate standard hazard reports is considered cumbersome and iterative. This study defines a structure for this system's process in a clear, algorithmic way so that standard hazard reports and basic hazard analysis may be completed using a centralized, web-based computer application. To accomplish this task, a test server is used to host a prototype of the tool during development. The prototype is configured to integrate easily into NASA's current server systems with minimal alteration. Additionally, the tool is easily updated and provides NASA with a system that may grow to accommodate future requirements and, possibly, different applications. The project's success is reflected in positive, subjective reviews completed by payload providers and NASA Safety and Mission Assurance personnel. Ideally, this prototype will increase interest in the concept of standard hazard automation and lead to the full-scale production of a user-ready application.

Stebler, Shane



Field implementation complexities of EPA developmental methods during remediation at hazardous waste sites: Case study  

SciTech Connect

The objective of this presentation is to provide insight into the complexities of field implementation of Environmental Protection Agency developmental methods for polynuclear aromatic hydrocarbons, pentachlorophenol, particulates and metals at hazardous waste site remediations. A remedial action plan developed for the site called for the development and subsequent execution of an air monitoring plan during the removal of affected subsurface soils. Ambient air monitoring for polynuclear aromatic hydrocarbons, pentachlorophenol, total particulates and arsenic, chromium and copper was conducted from February through May, 1992. After May, sampling for arsenic, chromium and copper was dropped from the plan because of the extremely low levels of metals associated with the soils. Real-time monitoring for total suspended particulates was conducted from February through September, 1992.

Green, E.L. [Eagle Environmental Health, Inc., Houston, TX (United States); Cunningham, E.A. [Tenneco, Inc., Houston, TX (United States); Grabinski, C. [Joslyn Corp., Chicago, IL (United States)



Lunar Dust Mitigation Technology Development  

NASA Technical Reports Server (NTRS)

NASA's plans for implementing the Vision for Space Exploration include returning to the moon as a stepping stone for further exploration of Mars, and beyond. Dust on the lunar surface has a ubiquitous presence which must be explicitly addressed during upcoming human lunar exploration missions. While the operational challenges attributable to dust during the Apollo missions did not prove critical, the comparatively long duration of impending missions presents a different challenge. Near-term plans to revisit the moon place a primary emphasis on the characterization and mitigation of lunar dust. Comprised of regolith particles ranging in size from tens of nanometers to microns, lunar dust is a manifestation of the complex interaction of the lunar soil with multiple mechanical, electrical, and gravitational effects. The environmental and anthropogenic factors affecting the perturbation, transport, and deposition of lunar dust must be studied in order to mitigate its potentially harmful effects on exploration systems. This paper presents the current perspective and implementation of dust knowledge management and integration, and mitigation technology development activities within NASA's Exploration Technology Development Program (ETDP). This work is presented within the context of the Constellation Program's Integrated Lunar Dust Management Strategy. The Lunar Dust Mitigation Technology Development project has been implemented within the ETDP. Project scope and plans will be presented, along with a perspective on lessons learned from Apollo and forensics engineering studies of Apollo hardware. This paper further outlines the scientific basis for lunar dust behavior, its characteristics and potential effects, and surveys several potential strategies for its control and mitigation, both for lunar surface operations and within the working volumes of a lunar outpost.

Hyatt, Mark J.; Deluane, Paul B.



Applications of a Forward-Looking Interferometer for the On-board Detection of Aviation Weather Hazards  

NASA Technical Reports Server (NTRS)

The Forward-Looking Interferometer (FLI) is a new instrument concept for obtaining measurements of potential weather hazards to alert flight crews. The FLI concept is based on high-resolution Infrared (IR) Fourier Transform Spectrometry (FTS) technologies that have been developed for satellite remote sensing, and which have also been applied to the detection of aerosols and gases for other purposes. It is being evaluated for multiple hazards including clear air turbulence (CAT), volcanic ash, wake vortices, low slant range visibility, dry wind shear, and icing, during all phases of flight. Previous sensitivity and characterization studies addressed the phenomenology that supports detection and mitigation by the FLI. Techniques for determining the range, and hence warning time, were demonstrated for several of the hazards, and a table of research instrument parameters was developed for investigating all of the hazards discussed above. This work supports the feasibility of detecting multiple hazards with an FLI multi-hazard airborne sensor, and for producing enhanced IR images in reduced visibility conditions; however, further research must be performed to develop a means to estimate the intensities of the hazards posed to an aircraft and to develop robust algorithms to relate sensor measurables to hazard levels. In addition, validation tests need to be performed with a prototype system.

West, Leanne; Gimmestad, Gary; Smith, William; Kireev, Stanislav; Cornman, Larry B.; Schaffner, Philip R.; Tsoucalas, George



HAZARDOUS MATERIALS INCIDENTS What are hazardous materials?  

E-print Network

HAZARDOUS MATERIALS INCIDENTS: What are hazardous materials? Hazardous materials are chemicals… Place contaminated PPE in a compatible waste container; CLOSE the container, label it "Hazardous Waste," and specify the contents. These regulations contain requirements for hazardous material inventories, training, and proper waste disposal.

Fernandez, Eduardo



Flood hazard and flood risk assessment using a time series of satellite images: a case study in Namibia.  


In this article, the application of time series of satellite imagery to flood hazard mapping and flood risk assessment is presented. Flooded areas are extracted from satellite images of the flood-prone territory, and a maximum flood extent image is produced for each flood event. These maps are then fused to determine the relative frequency of inundation (RFI). The study shows that RFI values and relative water depth follow the same probability distribution, as confirmed by a Kolmogorov-Smirnov test. The resulting RFI map can be used as a flood hazard map, especially when flood modeling is complicated by a lack of available data and high uncertainties. The RFI map is further used for flood risk assessment. The efficiency of the approach is demonstrated for the Katima Mulilo region (Namibia): a time series of Landsat-5/7 satellite images acquired from 1989 to 2012 is processed to derive the RFI map. The following direct damage categories are considered for flood risk assessment: dwelling units, roads, health facilities, and schools. The produced flood risk map shows that the risk is distributed uniformly over the region, and the cities and villages at highest risk are identified. The proposed approach has minimal data requirements, and RFI maps can be generated rapidly to assist rescuers and decision-makers in emergencies. Limitations include a strong dependence on the available data sets and on simulations with extrapolated water depth values. PMID:24372226
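
The fusion step described above reduces to a per-pixel average over the per-event maximum flood extent masks. A minimal sketch (not the authors' code; the grid representation and 0/1 mask format are assumptions):

```python
def relative_frequency_of_inundation(flood_masks):
    """Fuse per-event maximum flood extent masks into an RFI map.

    flood_masks: list of 2-D lists of 0/1 (1 = pixel inundated during
    that event), all on the same grid. Returns a 2-D list of floats in
    [0, 1]: the fraction of observed events in which each pixel was wet.
    """
    n_events = len(flood_masks)
    rows, cols = len(flood_masks[0]), len(flood_masks[0][0])
    return [[sum(mask[r][c] for mask in flood_masks) / n_events
             for c in range(cols)]
            for r in range(rows)]

# Toy example: three flood events on a 2x2 grid.
events = [
    [[1, 0], [1, 1]],
    [[1, 0], [0, 1]],
    [[0, 0], [1, 1]],
]
rfi = relative_frequency_of_inundation(events)
# Pixel (0, 0) was wet in 2 of 3 events -> RFI = 2/3; pixel (1, 1) in all 3.
```

In practice each mask would come from classifying one satellite scene, and the RFI raster would be written back out as the hazard map layer.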

Skakun, Sergii; Kussul, Nataliia; Shelestov, Andrii; Kussul, Olga



Landslide Hazards  

NSDL National Science Digital Library

This fact sheet provides an overview of the hazards presented by landslides and debris flows, particularly in wet weather conditions in landslide-prone areas. Topics include a discussion of the characteristics of debris flows and suggestions for residents who live in steep, hilly terrain that is subject to periodic heavy rains.



Mitigation Monitoring Plan  

SciTech Connect

The Final Supplemental Environmental Impact Report (SEIR) (September 1992) for the Proposed Renewal of the Contract between the United States Department of Energy and The Regents of the University of California for the Operation and Management of the Lawrence Berkeley Laboratory identifies the environmental impacts associated with renewing the contract and specifies a series of measures designed to mitigate adverse impacts to the environment. This Mitigation Monitoring Plan describes the procedures the University will use to implement the mitigation measures adopted in connection with the approval of the Contract.

Not Available



Integration of spatial and temporal data for landslide hazard assessment. A case study in the Area North of Lisbon (Portugal)  

NASA Astrophysics Data System (ADS)

A general methodology for the probabilistic evaluation of landslide hazard is presented, taking into account both landslide susceptibility and the instability-triggering factors. The method is applied in the Fanhões-Trancão test site (north of Lisbon, Portugal), where 148 landslides were mapped and integrated into a database. For the landslide susceptibility assessment, it is assumed that future landslides can be predicted by statistical relationships between past landslides and the spatial data set of conditioning factors (e.g. slope, aspect, geomorphology, lithology, superficial deposits, land use, vegetation cover). Susceptibility is evaluated using algorithms based on statistical/probabilistic analysis (a Bayesian model) over unique-condition terrain units on a raster basis. The landslide susceptibility map is prepared by sorting all pixels in descending order of susceptibility value. To validate the susceptibility predictions, the landslide data set is divided into two parts using temporal criteria: the first subset is used to obtain a prediction image, and the second subset is compared with the prediction results for validation. The resulting prediction-rate curve is used for the quantitative interpretation of the initial susceptibility map (Fabbri et al., 2002). Landslides in the study area are triggered by rainfall. The integration of triggering information in hazard assessment includes (i) the definition of rainfall thresholds (quantity-duration) responsible for past landslide events; (ii) the calculation of the relevant return periods; and (iii) the assumption that the rainfall patterns (quantity/duration) that produced slope instability in the past will produce the same effects in the future (i.e. the same types of landslides and the same total affected area).
The landslide hazard is presented as the probability of each pixel being affected by a slope movement, and results from coupling the susceptibility map, the prediction-rate curve, and the return periods of critical rainfall events on a scenario basis. Using this methodology, different hazard scenarios can be assessed, each corresponding to a different return period.
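
The susceptibility scoring and temporal validation described above can be sketched as follows. This is a simplified illustration of the Bayesian favourability score and prediction-rate curve, not the authors' implementation; the unit bookkeeping and variable names are assumptions:

```python
def susceptibility_scores(units):
    """units: unit id -> (n_pixels, n_landslide_pixels), counted from the
    *training* (earlier) landslide subset. Returns unit id -> conditional
    probability P(landslide | unit), a simple Bayesian favourability score."""
    return {uid: slides / pixels for uid, (pixels, slides) in units.items()}

def prediction_rate_curve(units, validation_slides):
    """Rank units by descending score and accumulate the fraction of the
    *validation* (later) landslides captured versus area classified.

    validation_slides: unit id -> landslide pixels from the later subset.
    Returns a list of (cumulative_area_fraction, cumulative_slide_fraction).
    """
    scores = susceptibility_scores(units)
    total_area = sum(pixels for pixels, _ in units.values())
    total_valid = sum(validation_slides.values())
    curve, cum_area, cum_valid = [], 0, 0
    for uid in sorted(scores, key=scores.get, reverse=True):
        cum_area += units[uid][0]
        cum_valid += validation_slides.get(uid, 0)
        curve.append((cum_area / total_area, cum_valid / total_valid))
    return curve

# Three unique-condition units; unit "A" is most landslide-prone.
units = {"A": (100, 30), "B": (100, 5), "C": (100, 0)}
validation = {"A": 8, "B": 2}
curve = prediction_rate_curve(units, validation)
# The most susceptible third of the area captures 80% of later landslides.
```

A steep initial rise of the curve is what indicates a good prediction.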

Zezere, J. L.; Reis, E.; Vieira, G.; Rodrigues, M. L.; Garcia, R.; Oliveira, S.; Ferreira, A. B.



Augmented Reality Cues and Elderly Driver Hazard Perception  

PubMed Central

Objective Evaluate the effectiveness of augmented reality (AR) cues in improving driving safety in elderly drivers who are at increased crash risk due to cognitive impairments. Background Cognitively challenging driving environments pose a particular crash risk for elderly drivers. AR cueing is a promising technology to mitigate risk by directing driver attention to roadway hazards. This study investigates whether AR cues improve or interfere with hazard perception in elderly drivers with age-related cognitive decline. Methods Twenty elderly licensed drivers (mean age = 73 years, SD = 5 years) with a range of cognitive abilities, measured by a speed-of-processing (SOP) composite, participated in a one-hour drive in an interactive, fixed-base driving simulator. Each participant drove through six straight, six-mile-long rural roadway scenarios following a lead vehicle. AR cues directed attention to potential roadside hazards in three of the scenarios; the other three were uncued (baseline) drives. Effects of AR cueing were evaluated with respect to: 1) detection of hazardous target objects, 2) interference with detecting nonhazardous secondary objects, and 3) impairment in maintaining safe distance behind a lead vehicle. Results AR cueing improved the detection of hazardous target objects of low visibility. AR cues did not interfere with detection of nonhazardous secondary objects and did not impair the ability to maintain safe distance behind a lead vehicle. SOP capacity did not moderate those effects. Conclusion AR cues show promise for improving elderly driver safety by increasing hazard detection likelihood without interfering with other driving tasks such as maintaining safe headway. PMID:23829037

Schall, Mark C.; Rusch, Michelle L.; Lee, John D.; Dawson, Jeffrey D.; Thomas, Geb; Aksan, Nazan; Rizzo, Matthew



Seismic hazard and risk assessment for Baku, the capital city of Azerbaijan  

NASA Astrophysics Data System (ADS)

Rapid population growth, intensive civil and industrial building, land and water instabilities (e.g., landslides and significant underground water level fluctuations), and a lack of public awareness regarding seismic hazard all increase the vulnerability of the city of Baku to earthquakes. In this study, we use a quantitative approach to estimate earthquake risk in the city. The earthquake risk is determined as a convolution of seismic hazard in terms of peak ground acceleration, vulnerability due to ground and construction conditions, population features, the gross domestic product per capita, and exposure of infrastructure and critical facilities. Seismic hazard and earthquake risk for Baku are assessed for three earthquake scenarios (near, far, and local events). The developed earthquake loss maps help identify the factors influencing the loss. Our results support the elaboration of strategic countermeasure plans for seismic risk mitigation in Baku.
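
The abstract does not give the convolution formula. One common simplification of such hazard-vulnerability-exposure schemes treats the per-cell risk index as a product of factor scores normalized across the study grid; the sketch below is a hypothetical illustration of that general idea, not the authors' method:

```python
def risk_index(hazard, vulnerability, exposure):
    """Toy per-cell earthquake risk index: the product of factor scores,
    each first normalized to [0, 1] across the study grid.

    hazard/vulnerability/exposure: equally long lists of per-cell values
    (e.g. peak ground acceleration, a composite ground/building/population
    score, asset value). Returns a list of risk indices in [0, 1].
    """
    def normalize(values):
        lo, hi = min(values), max(values)
        return [(v - lo) / (hi - lo) if hi > lo else 0.0 for v in values]

    h, v, e = normalize(hazard), normalize(vulnerability), normalize(exposure)
    return [hc * vc * ec for hc, vc, ec in zip(h, v, e)]

# Three cells: only the last combines high PGA, high vulnerability, high exposure.
risk = risk_index([0.1, 0.2, 0.3], [1, 2, 3], [10, 10, 30])
```

A cell scores high only when all three factors are high, which is the qualitative behavior a multiplicative risk convolution is meant to capture.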

Babayev, Gulam; Ismail-Zadeh, Alik; Le Mouël, Jean-Louis



Dust Mitigation Vehicle  

NASA Technical Reports Server (NTRS)

A document describes the development and demonstration of an apparatus, called a dust mitigation vehicle, for reducing the amount of free dust on the surface of the Moon. The dust mitigation vehicle would be used to pave surfaces on the Moon to prevent dust from levitating or adhering to surfaces. The basic principle of operation is to use a lens or a dish mirror to concentrate solar thermal radiation onto a small spot to heat lunar regolith. In the case of the prototype dust mitigation vehicle, a Fresnel lens was used to heat a surface layer of regolith sufficiently to sinter or melt dust grains into a solid mass. The prototype vehicle has demonstrated paving rates of up to 1.8 square meters per day. The proposed flight design of the dust mitigation vehicle is also described.

Cardiff, Eric H.



Orbital Debris Mitigation  

NASA Technical Reports Server (NTRS)

Policies on limiting orbital debris are found throughout the US Government and many foreign space agencies, and as adopted guidelines in the United Nations. The underlying purpose of these policies is to ensure that the environment remains safe for the operation of robotic and human spacecraft in near-Earth orbit. For this reason, it is important to consider orbital debris mitigation during the design of all space vehicles. For many CubeSats, compliance with the debris mitigation guidelines is documented only after the vehicle has been designed and fabricated, whereas larger satellites are evaluated throughout the design process. This paper provides a brief explanation of the US Government Orbital Debris Mitigation Standard Practices, a discussion of international guidelines, and NASA's process for compliance evaluation. In addition, it discusses the educational value of considering orbital debris mitigation requirements as part of student-built satellite design.

Kelley, R. L.; Jarkey, D. R.; Stansbery, G.



Study of the seismicity temporal variation for the current seismic hazard evaluation in Val d'Agri, Italy  

NASA Astrophysics Data System (ADS)

This study examines the temporal variation of seismicity in the Val d'Agri (southern Italy) and adjacent areas for current seismic hazard evaluation. The temporal variation of seismicity is expressed as time series of the number of earthquakes, the b value of the Gutenberg-Richter frequency-magnitude relationship, and the seismic energy released in the form of log E^(2/3). The analysis was performed with a new research tool that includes visualization techniques, which support the interactive exploration and interpretation of changes in the temporal variation. The obtained time series show a precursory seismicity pattern, characterized by low- and high-probability periods, which preceded earthquakes of magnitude M ≥ 4.0. In 75% of the examined cases, a change in seismicity pattern was successfully correlated with an event. The average durations of the low- and high-probability periods are 10.6 and 13.8 months, respectively. These results indicate that monitoring the temporal variation of seismicity in a given area, and recognizing the low- and high-probability periods, can contribute to evaluating the current seismic hazard status at regular monthly intervals.
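
The abstract does not state how the b value or the energy series was computed. Two standard seismological formulas fit the description: Aki's maximum-likelihood b-value estimator (with Utsu's binning correction) and the Gutenberg-Richter magnitude-energy relation behind the log E^(2/3) series. Both are textbook formulas, not taken from the paper:

```python
import math

def b_value(magnitudes, mc, dm=0.1):
    """Aki/Utsu maximum-likelihood b-value for events with M >= mc.
    b = log10(e) / (mean(M) - (mc - dm/2)), where dm is the magnitude
    binning width (dm = 0 for unbinned magnitudes)."""
    m = [x for x in magnitudes if x >= mc]
    mean_m = sum(m) / len(m)
    return math.log10(math.e) / (mean_m - (mc - dm / 2))

def log_energy_23(magnitude):
    """log10 of E^(2/3), using the Gutenberg-Richter energy relation
    log10 E = 1.5*M + 4.8 (E in joules); this simplifies to M + 3.2,
    which is why E^(2/3) is a convenient linear proxy for magnitude."""
    return (2 / 3) * (1.5 * magnitude + 4.8)
```

With unbinned magnitudes averaging one unit above the completeness cutoff, `b_value` returns log10(e) ≈ 0.434, the expected value for a b = 0.43 population.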

Baskoutas, I.; D'Alessandro, A.



A performance improvement case study in aircraft maintenance and its implications for hazard identification.  


Aircraft maintenance is a highly regulated, safety critical, complex and competitive industry. There is a need to develop innovative solutions to address process efficiency without compromising safety and quality. This paper presents the case that in order to improve a highly complex system such as aircraft maintenance, it is necessary to develop a comprehensive and ecologically valid model of the operational system, which represents not just what is meant to happen, but what normally happens. This model then provides the backdrop against which to change or improve the system. A performance report, the Blocker Report, specific to aircraft maintenance and related to the model was developed gathering data on anything that 'blocks' task or check performance. A Blocker Resolution Process was designed to resolve blockers and improve the current check system. Significant results were obtained for the company in the first trial and implications for safety management systems and hazard identification are discussed. Statement of Relevance: Aircraft maintenance is a safety critical, complex, competitive industry with a need to develop innovative solutions to address process and safety efficiency. This research addresses this through the development of a comprehensive and ecologically valid model of the system linked with a performance reporting and resolution system. PMID:20099178

Ward, Marie; McDonald, Nick; Morrison, Rabea; Gaynor, Des; Nugent, Tony



Seismic hazard study for selected sites in New Mexico and Nevada  

NASA Astrophysics Data System (ADS)

Seismic hazard evaluations were conducted for specific sites in New Mexico and Nevada. For New Mexico, a model of seismicity was developed from historical accounts of medium to large shocks and the current microactivity record from local networks. Ninety percent confidence levels at Albuquerque and Roswell were computed to be 56 gals for a 10-year period and 77 gals for a 20-year period. Values of ground motion for Clovis were below these values. Peak velocity and displacement were also computed for each site. Deterministic spectra based on the estimated maximum credible earthquake for the zones which the sites occupy were also computed. For the sites in Nevada, the regionalizations used in Battis (1982) for the uniform seismicity model were slightly modified. For 10- and 20-year time periods, peak acceleration values for Indian Springs were computed to be 94 gals and 123 gals and for Hawthorne 206 gals and 268 gals. Deterministic spectra were also computed. The input parameters were well determined for the analysis for the Nevada sites because of the abundance of data. The values computed for New Mexico, however, are likely upper limits. As more data are collected from the area of the Rio Grande rift zone, the pattern of seismicity will become better understood. At this time a more detailed, and thus more accurate, model may emerge.
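
Fixed-period ground-motion values like the 10- and 20-year figures above are conventionally tied to a Poisson occurrence model, in which the probability of exceeding a hazard level at least once in T years at annual rate λ is P = 1 − e^(−λT). A sketch of that standard relation (not code from the study):

```python
import math

def exceedance_probability(annual_rate, years):
    """Probability of at least one exceedance in `years`, assuming a
    Poisson occurrence model: P = 1 - exp(-rate * years)."""
    return 1.0 - math.exp(-annual_rate * years)

def annual_rate_from_probability(p, years):
    """Inverse: the annual exceedance rate implied by a probability p
    of at least one exceedance over `years`."""
    return -math.log(1.0 - p) / years

# A "90 percent confidence level" ground motion for a 10-year period
# corresponds to a 10% exceedance probability over those 10 years.
rate = annual_rate_from_probability(0.10, 10)
```

Doubling the exposure window from 10 to 20 years at a fixed rate raises the exceedance probability, which is why the 20-year accelerations quoted above are larger than the 10-year ones.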

Johnston, J. C.



Diverse Applications of Structural Blast Mitigation in Steel Frame Buildings using a Common Connection Geometry  

Microsoft Academic Search

This paper highlights the design diversity of a new-generation steel frame connection technology that exhibits unique geometry characteristics capable of providing cost- effective multi-hazard mitigation using only a single structural system. In addition to providing a proven structural system for resisting natural hazards such as extreme wind and earthquake, it will be demonstrated, using actual project applications, how this connection

David L. Houghton; Jesse E. Karns


Hazardous Waste Facilities  

Microsoft Academic Search

Recent widely publicized studies claim that facilities for treatment, storage, and disposal of hazardous wastes (TSDFs) are located in areas with higher than average proportions of minorities, thereby exposing minorities to relatively greater levels of potential risk. These claims have influenced national policies and public perceptions. This article revisits those claims in the first national study of TSDFs to use

Douglas L. Anderton; Andy B. Anderson; Peter H. Rossi; John Michael Oakes; Michael R. Fraser; Eleanor W. Weber; Edward J. Calabrese



ETINDE. Improving the role of a methodological approach and ancillary ethnoarchaeological data application for place vulnerability and resilience to a multi-hazard environment: Mt. Cameroon volcano case study [MIA-VITA project -FP7-ENV-2007-1  

NASA Astrophysics Data System (ADS)

The FP7 MIA-VITA [Mitigate and assess risk from volcanic impact on terrain and human activities] project has been designed to address multidisciplinary aspects of volcanic threat assessment and management, from prevention to crisis management and recovery. In the socio-economic analysis carried out among the Bakweri and Bakossi ethnic groups at Mt. Cameroon, ancillary ethnoarchaeological information has been included to bring out the cultural interaction between the volcano and its residents. In 2009-2011, ethnoanthropological surveys and interviews for data collection were carried out in the Buea, Limbe, West Coast, Tiko, and Muyuka sub-divisions adjacent to Mt. Cameroon. One outstanding result from the study of Bakweri and Bakossi cultural tradition: natural hazards are managed and produced by supernatural forces, namely Epasa Moto, god of the mountain (Mt. Cameroon volcano), and Nyango Na Nwana, goddess of the sea (Gulf of Guinea). In the case of Mount Cameroon, people may seek the spirit or gods of the mountain before farming, hunting, and, most recently, the undertaking of the annual Mount Cameroon race. The spirit of the mountain must be sought to avert or stop a volcanic eruption, because an eruption is attributed to the anger of the spirit. Among the Northern Bakweri, the association of spirits with the mountain can also be explained in terms of the mountain's importance to the people. Most of their farming and hunting is done on the mountain, and some forest products, for instance wood for building and furniture, are obtained from its forest; the people thus rely on the mountain for food, game, and building materials. In addition, eruptions of the mountain directly affect the people: they not only destroy property but frustrate people and take human lives.
Because of this economic importance of the mountain and its unexpected and unwanted eruptions, the tendency is to believe that a supernatural force dwells in it: the god Epasa Moto. Since the social group is forever indebted to the gods, it must make amends to calm their anger. Rites are managed by traditional chiefs in the name of the group, who make offerings and sacrifices whose preciousness is directly proportional to the request. The perception of vulnerability to natural disasters is mitigated by ritual practices devoted to keeping under control the negative reactions of the genius loci (Epasa Moto), such as eruptions and tidal waves. In line with this landscape evolution, the present work describes the anthropogenically remodeled space and the related Hazards-of-Place model of vulnerability elaborated by S. Cutter in 1996. The results suggest a good approach to local geo-hazard management through traditional methods. Principles of geoethics are important tools in managing natural hazards in different cultural contexts. A geoethical approach to risk management guarantees respect for beliefs and cultural traditions and the development of strategies respectful of the values and sensibilities of the populations involved.

Ilaria Pannaccione Apa, Maria; Kouokam, Emmanuel; Mbe Akoko, Robert; Peppoloni, Silvia; Fabrizia Buongiorno, Maria; Thierry, Pierre



Decision analysis of mitigation and remediation of sedimentation within large wetland systems: a case study using Agassiz National Wildlife Refuge  

USGS Publications Warehouse

Sedimentation has been identified as an important stressor across a range of wetland systems. The U.S. Fish and Wildlife Service has the responsibility of maintaining wetlands within its National Wildlife Refuge System for use by migratory waterbirds and other wildlife. Many of these wetlands could be negatively affected by accelerated rates of sedimentation, especially those located in agricultural parts of the landscape. In this report we document the results of a decision analysis project designed to help U.S. Fish and Wildlife Service staff at the Agassiz National Wildlife Refuge (herein referred to as the Refuge) determine a strategy for managing and mitigating the negative effects of sediment loading within Refuge wetlands. The Refuge’s largest wetland, Agassiz Pool, has accumulated so much sediment that it has become dominated by hybrid cattail (Typha × glauca), and the ability of the staff to control water levels in the Agassiz Pool has been substantially reduced. This project consisted of a workshop with Refuge staff, local and regional stakeholders, and several technical and scientific experts. At the workshop we established Refuge management and stakeholder objectives, a range of possible management strategies, and assessed the consequences of those strategies. After deliberating a range of actions, the staff chose to consider the following three strategies: (1) an inexpensive strategy, which largely focused on using outreach to reduce external sediment inputs to the Refuge; (2) the most expensive option, which built on the first option and relied on additional infrastructure changes to the Refuge to increase management capacity; and (3) a strategy that was less expensive than strategy 2 and relied mostly on existing infrastructure to improve management capacity. Despite the fact that our assessments were qualitative, Refuge staff decided they had enough information to select the third strategy. 
Following our qualitative assessment, we discussed additional considerations and uncertainties that might affect implementation of this strategy.

Post van der Burg, Max; Jenni, Karen E.; Nieman, Timothy L.; Eash, Josh D.; Knutsen, Gregory A.



Environmental Hazards, Health, and Racial Inequity in Hazardous Waste Distribution  

Microsoft Academic Search

This study addresses the critical issue of hazardous wastes and associated human health problems. The issue of inequitable distribution of environmental hazards by race is discussed with special reference to a municipal solid waste landfill and the petrochemical plants as the principal environmental stressors in the Baton Rouge Standard Metropolitan Statistical Area (SMSA). In a random sample of 213 respondents,

Francis O. Adeola



Designing Effective Natural Hazards Preparedness Communications: Factors that Influence Perceptions and Action  

NASA Astrophysics Data System (ADS)

Even though most people believe that natural hazards preparation is important for mitigating damage to their homes and basic survival in the aftermath of a disaster, few actually disaster-proof their homes, create plans, or obtain supplies recommended by agencies such as the Federal Emergency Management Agency. Several observational studies suggest that socio-demographic characteristics such as income and psychological characteristics such as self-efficacy affect whether or not an individual takes action to prepare for a natural hazard. These studies, however, only suggest that these characteristics may play a role. There has been little research that systematically investigates how these characteristics play a role in people's perceptions of recommended preparatory activities and decisions to perform them. Therefore, in Study 1, we explore people's perceptions of natural hazards preparedness measures on four dimensions: time, cost, helpfulness, and sense of preparedness. We further investigate if these responses vary by the socio-demographic and psychological characteristics of self-efficacy, knowledge, and income level. In Study 2, we experimentally test whether people's sense of self-efficacy, as it relates to natural hazards, can be manipulated through exposure to an "easy-and-effective" versus a "hard-and-effective" set of preparation measures. Our findings have implications for the design of natural hazards communication materials for the general public.

Wong-Parodi, G.; Fischhoff, B.



Experimental study of SBS mitigation and transmission improvement from cross-phase modulation in 10.7 Gb/s unrepeatered systems.  


We experimentally study the effects of cross-phase modulation and stimulated Brillouin scattering limitations in long unrepeatered transmission systems at 10.7 Gb/s. We find significant SBS suppression in wavelength-division multiplexed systems and investigate system performance for different numbers of channels and channel spacing. We find greater than 4 orders of magnitude improvement in bit error rate (BER) with a WDM system in comparison to single channel transmission due to SBS impairment mitigation. Unrepeatered transmission over 323 km with a simple system configuration using low-attenuation fiber, NRZ modulation, and only EDFA amplification is demonstrated with more than 5 dB system margin over the forward error correction (FEC) threshold. PMID:19547300

Downie, John D; Hurley, Jason



Landslide hazard evaluation: a review of current techniques and their application in a multi-scale study, Central Italy  

Microsoft Academic Search

In recent years, growing population and expansion of settlements and life-lines over hazardous areas have largely increased the impact of natural disasters both in industrialized and developing countries. Third world countries have difficulty meeting the high costs of controlling natural hazards through major engineering works and rational land-use planning. Industrialized societies are increasingly reluctant to invest money in structural measures

Fausto Guzzetti; Alberto Carrara; Mauro Cardinali; Paola Reichenbach



HACCP (Hazard Analysis and Critical Control Points) to guarantee safe water reuse and drinking water production - a case study  

Microsoft Academic Search

To obtain a sustainable water catchment in the dune area of the Flemish west coast, the integration of treated domestic wastewater in the existing potable water production process is planned. The hygienic hazards associated with the introduction of treated domestic wastewater into the water cycle are well recognised. Therefore, the concept of HACCP (Hazard Analysis and Critical Control Points) was

T. Dewettinck; E. Van Houtte; D. Geenens; K. Van Hege; W. Verstraete



Communicating uncertainties in natural hazard forecasts  

NASA Astrophysics Data System (ADS)

Natural hazards research seeks to help society develop strategies that appropriately balance risks and mitigation costs in addressing potential imminent threats and possible longer-term hazards. However, because scientists have only limited knowledge of the future, they must also communicate the uncertainties in what they know about the hazards. How to do so has been the subject of extensive recent discussion [Sarewitz et al., 2000; Oreskes, 2000; Pilkey and Pilkey-Jarvis, 2006]. One approach is General Colin Powell's charge to intelligence officers [Powell, 2012]: “Tell me what you know. Tell me what you don't know. Then tell me what you think. Always distinguish which is which.” In dealing with natural hazards, the last point can be modified to “which is which and why.” To illustrate this approach, it is helpful to consider some successful and unsuccessful examples [Stein, 2010; Stein et al., 2012].

Stein, Seth; Geller, Robert J.



Effects of hazardous and harmful alcohol use on HIV incidence and sexual behaviour: a cohort study of Kenyan female sex workers  

PubMed Central

Aims To investigate putative links between alcohol use, and unsafe sex and incident HIV infection in sub-Saharan Africa. Methods A cohort of 400 HIV-negative female sex workers was established in Mombasa, Kenya. Associations between categories of the Alcohol Use Disorders Identification Test (AUDIT) and the incidence at one year of unsafe sex, HIV and pregnancy were assessed using Cox proportional hazards models. Violence or STIs other than HIV measured at one year was compared across AUDIT categories using multivariate logistic regression. Results Participants had high levels of hazardous (17.3%, 69/399) and harmful drinking (9.5%, 38/399), while 36.1% abstained from alcohol. Hazardous and harmful drinkers had more unprotected sex and higher partner numbers than abstainers. Sex while feeling drunk was frequent and associated with lower condom use. Occurrence of condom accidents rose step-wise with each increase in AUDIT category. Compared with non-drinkers, women with harmful drinking had 4.1-fold higher sexual violence (95% CI adjusted odds ratio [AOR]?=?1.9-8.9) and 8.4 higher odds of physical violence (95% CI AOR?=?3.9-18.0), while hazardous drinkers had 3.1-fold higher physical violence (95% CI AOR?=?1.7-5.6). No association was detected between AUDIT category and pregnancy, or infection with Syphilis or Trichomonas vaginalis. The adjusted hazard ratio of HIV incidence was 9.6 comparing women with hazardous drinking to non-drinkers (95% CI?=?1.1-87.9). Conclusions Unsafe sex, partner violence and HIV incidence were higher in women with alcohol use disorders. This prospective study, using validated alcohol measures, indicates that harmful or hazardous alcohol can influence sexual behaviour. Possible mechanisms include increased unprotected sex, condom accidents and exposure to sexual violence. Experimental evidence is required demonstrating that interventions to reduce alcohol use can avert unsafe sex. PMID:24708844
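
The AUDIT categories used in the cohort are defined by fixed score cutoffs. The function below uses the widely cited WHO zones; the paper's exact thresholds, and treating a score of 0 as abstention, are assumptions of this sketch:

```python
def audit_category(score):
    """Map an AUDIT questionnaire score (0-40) to a drinking category.

    Cutoffs follow the commonly used WHO zones: 1-7 low risk,
    8-15 hazardous, 16-19 harmful, 20+ probable dependence.
    Treating a score of 0 as abstention is a simplification here.
    """
    if not 0 <= score <= 40:
        raise ValueError("AUDIT scores range from 0 to 40")
    if score == 0:
        return "abstainer"
    if score <= 7:
        return "low-risk"
    if score <= 15:
        return "hazardous"
    if score <= 19:
        return "harmful"
    return "probable dependence"
```

These categories would then be the exposure variable in the Cox proportional hazards and logistic regression models the abstract describes.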



Landslides Hazards  

NSDL National Science Digital Library

At this USGS educational web site, the public can find out about the nature and problems of landslides. Individuals can learn how wildfires can induce debris flows and other types of landslides. Within the National Landslide Information Center link, students and educators can find landslide fact sheets, numerous images of landslides, an interactive module on debris flows, and materials about current USGS landslide projects. The website features a searchable bibliographic database, lists of publications, and links to local organizations dealing with this natural hazard.


77 FR 75441 - Healthy Home and Lead Hazard Control Grant Programs Data Collection; Progress Reporting  

Federal Register 2010, 2011, 2012, 2013, 2014

...Program, Healthy Homes Technical Studies Program, Lead-Based Paint Hazard Control Program, Lead Hazard Reduction Demonstration...Program, Healthy Homes Technical Studies Program, Lead-Based Paint Hazard Control Program, Lead Hazard Reduction...



Thermal hazard from propane fireballs  

Microsoft Academic Search

Tank trucks and rail cars containing such hazardous materials as commercial propane are often involved in accidents wherein the tanks are ruptured and fires occur. In this study, a model for determining the thermal hazard associated with the resulting fireball is developed, and the results are compared with available test data. This work was supported by the Atomic Energy Commission.

H. C. Hardee; D. O. Lee



Respiratory hospitalizations of children living near a hazardous industrial site adjusted for prevalent dust: A case-control study.  


The Neot Hovav Industrial Park (IP), located in southern Israel, hosts 23 chemical industry facilities and the national site for treatment of hazardous waste. Yet, information about its impact on the health of the local population has been mostly ecological, focused on Bedouins, and did not control for the possible confounding effect of prevalent dust storms. This case-control study examined whether living near the IP could lead to increased risk of pediatric hospitalization for respiratory diseases. Cases (n=3608) were residents of the Be'er Sheva sub-district aged 0-14 years who were admitted for respiratory illnesses between 2004 and 2009. These were compared to children admitted for non-respiratory conditions (n=3058). Home addresses were geocoded and the distances from the IP to the child's residence were calculated. The association between hospitalization and residential distance from the IP was examined for three age groups (0-1, 2-6, 7-14) by logistic regressions adjusting for gender, socioeconomic status, urbanity and temperature. We found that infants in the first year of life who lived within 10 km of the IP had increased risk of respiratory hospitalization when compared with those living >20 km from the IP (adjusted odds ratio, OR=2.07, 95% confidence interval, CI: 1.19-3.59). In models with both distance from the IP and particulate matter with aerodynamic diameter <10 µm (PM10), the estimated risk was modestly attenuated (OR=1.96, 95% CI: 1.09-3.51). Elevated risk was also observed for children 2-5 years of age, but without statistical significance (OR=1.16, 95% CI: 0.76-1.76). Our findings suggest that residential proximity to a hazardous industrial site may contribute to early life respiratory admissions, beyond that of prevailing PM10. PMID:25547415

Nirel, Ronit; Maimon, Nimrod; Fireman, Elizabeth; Agami, Sarit; Eyal, Arnona; Peretz, Alon



Hazards of Mercury.  

ERIC Educational Resources Information Center

Common concern for the protection and improvement of the environment and the enhancement of human health and welfare underscore the purpose of this special report on the hazards of mercury directed to the Secretary's Pesticide Advisory Committee, Department of Health, Education, and Welfare. The report summarizes the findings of a ten-member study

Environmental Research, 1971



Satellite Breakup Risk Mitigation  

NASA Technical Reports Server (NTRS)

Many satellite breakups occur as a result of an explosion of stored energy on-board spacecraft or rocket-bodies. These breakups generate a cloud of tens or possibly hundreds of thousands of debris fragments which may pose a transient elevated threat to spaceflight crews and vehicles. Satellite breakups pose a unique threat because the majority of the debris fragments are too small to be tracked from the ground. The United States Human Spaceflight Program is currently implementing a risk mitigation strategy that includes modeling breakup events, establishing action thresholds, and prescribing corresponding mitigation actions in response to satellite breakups.

Leleux, Darrin P.; Smith, Jason T.



NASA Hazard Analysis Process  

NASA Technical Reports Server (NTRS)

This viewgraph presentation reviews The NASA Hazard Analysis process. The contents include: 1) Significant Incidents and Close Calls in Human Spaceflight; 2) Subsystem Safety Engineering Through the Project Life Cycle; 3) The Risk Informed Design Process; 4) Types of NASA Hazard Analysis; 5) Preliminary Hazard Analysis (PHA); 6) Hazard Analysis Process; 7) Identify Hazardous Conditions; 8) Consider All Interfaces; 9) Work a Preliminary Hazard List; 10) NASA Generic Hazards List; and 11) Final Thoughts

Deckert, George




EPA Science Inventory

Lead-based paints (LEP) and primers have been used in the past by the Department of Defense (DoD) to protect steel structures from corrosion. DoD owns about 200 million sq ft of steel structures with lead-based paint (such as bridges, aircraft hangars, water tanks, etc.). The DoD...


Monitoring Fogo Island, Cape Verde Archipelago, for Volcanic Hazard Mitigation  

Microsoft Academic Search

Fogo Island, in the Cape Verde Archipelago (North Atlantic), with a total area of 476 km2 and a population of about 40,000, is an active ocean island volcano rising from an average sea-bottom depth of the order of -3000 m to a maximum altitude of 2820 m. All of the 28 historically recorded eruptions (Ribeiro, 1960) since the arrival of the first

B. V. Faria; S. I. Heleno; I. J. Barros; N. d'Oreye; Z. Bandomo; J. F. Fonseca



Completion Time Dynamics of Doctoral Studies at Makerere University: A Hazard Model Evaluation  

ERIC Educational Resources Information Center

Issues related to attrition and completion time of graduate studies are certainly an internationally challenging and important area of higher education literature. In this paper, completion time dynamics of doctoral studies at Makerere University were investigated based on data extracted for all 295 candidates in the commencement cohorts from 2000…

Wamala, Robert; Oonyu, Joseph; Ocaya, Bruno



Problems with mitigation translocation of herpetofauna.  


Mitigation translocation of nuisance animals is a commonly used management practice aimed at resolving human-animal conflict by removal and release of an individual animal. Long considered a reasonable undertaking, especially by the general public, it is now known that translocated subjects are negatively affected by the practice. Mitigation translocation is typically undertaken with individual adult organisms and has a much lower success rate than the more widely practiced conservation translocation of threatened and endangered species. Nonetheless, the public and many conservation practitioners believe that because population-level conservation translocations have been successful, mitigation translocation can be satisfactorily applied to a wide variety of human-wildlife conflict situations. We reviewed mitigation translocations of reptiles, including our own work with 3 long-lived species (Gila monsters [Heloderma suspectum], Sonoran desert tortoises [Gopherus morafkai], and western diamond-backed rattlesnakes [Crotalus atrox]). Overall, mitigation translocation had a low success rate when judged either by effects on individuals (in all studies reviewed they exhibited increased movement or increased mortality) or by the success of the resolution of the human-animal conflict (translocated individuals often returned to the capture site). Careful planning and identification of knowledge gaps are critical to increasing success rates in mitigation translocations in the face of increasing pressure to find solutions for species threatened by diverse anthropogenic factors, including climate change and exurban and energy development. PMID:25040040

Sullivan, Brian K; Nowak, Erika M; Kwiatkowski, Matthew A



Long-term volcanic hazard assessment on El Hierro (Canary Islands)  

NASA Astrophysics Data System (ADS)

Long-term hazard assessment, one of the bastions of risk-mitigation programs, is required for land-use planning and for developing emergency plans. To ensure quality and representative results, long-term volcanic hazard assessment requires several sequential steps to be completed, which include the compilation of geological and volcanological information, the characterisation of past eruptions, spatial and temporal probabilistic studies, and the simulation of different eruptive scenarios. Despite being a densely populated active volcanic region that receives millions of visitors per year, no systematic hazard assessment has ever been conducted on the Canary Islands. In this paper we focus our attention on El Hierro, the youngest of the Canary Islands and the most recently affected by an eruption. We analyse the past eruptive activity to determine the spatial and temporal probability, and likely style of a future eruption on the island, i.e. the where, when and how. By studying the past eruptive behaviour of the island and assuming that future eruptive patterns will be similar, we aim to identify the most likely volcanic scenarios and corresponding hazards, which include lava flows, pyroclastic fallout and pyroclastic density currents (PDCs). Finally, we estimate their probability of occurrence. The end result, through the combination of the most probable scenarios (lava flows, pyroclastic density currents and ashfall), is the first qualitative integrated volcanic hazard map of the island.

Becerril, L.; Bartolini, S.; Sobradelo, R.; Martí, J.; Morales, J. M.; Galindo, I.
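The final step the abstract describes, combining the most probable scenarios (lava flows, PDCs, ashfall) into an integrated hazard, can be illustrated with a minimal sketch: if each eruptive scenario is treated as independent with its own probability of occurrence (the values below are invented for illustration, not results from the El Hierro study), the chance that at least one occurs is:

```python
from functools import reduce

def combined_hazard(probabilities):
    """Probability that at least one of several independent eruptive
    scenarios occurs in a given period: P = 1 - prod(1 - p_i).
    Assumes independence, which real scenarios may violate."""
    survive = reduce(lambda acc, p: acc * (1.0 - p), probabilities, 1.0)
    return 1.0 - survive

# Hypothetical annual probabilities: lava flow, ashfall, PDC
print(round(combined_hazard([0.02, 0.01, 0.005]), 4))
```

Real long-term assessments weight these terms by spatial probability maps rather than single scalars; this only shows the combination rule.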



Willamette Wildlife Mitigation Program  

E-print Network

Willamette Wildlife Mitigation Program. Michael Pope, Oregon Department of Fish and Wildlife. [Presentation slides; charts not recoverable.] Impact assessment results for the Willamette River dams (Hills Creek, Lookout Point, Dexter, Cougar, Detroit, Green Peter): approximately 94,306 habitat units (HUs) lost.



Microsoft Academic Search

In 1987, the Port of Los Angeles entered into an interagency agreement to restore Batiquitos Lagoon in Carlsbad, California as mitigation for the construction of cargo terminals in the Outer Los Angeles Harbor. After ten years of planning, environmental review, design and permitting, the restoration was completed and the lagoon was opened to the Pacific Ocean. The $57 million project

Ralph G. Appy


Predicting Risk-Mitigating Behaviors From Indecisiveness and Trait Anxiety: Two Cognitive Pathways to Task Avoidance.  


Past research suggests that indecisiveness and trait anxiety may both decrease the likelihood of performing risk-mitigating preparatory behaviors (e.g., preparing for natural hazards) and suggests two cognitive processes (perceived control and worrying) as potential mediators. However, no single study to date has examined the influence of these traits and processes together. Examining them simultaneously is necessary to gain an integrated understanding of their relationship with risk-mitigating behaviors. We therefore examined these traits and mediators in relation to wildfire preparedness in a two-wave field study among residents of wildfire-prone areas in Western Australia (total N = 223). Structural equation modeling results showed that indecisiveness uniquely predicted preparedness, with higher indecisiveness predicting lower preparedness. This relationship was fully mediated by perceived control over wildfire-related outcomes. Trait anxiety did not uniquely predict preparedness or perceived control, but it did uniquely predict worry, with higher trait anxiety predicting more worrying. Also, worry trended toward uniquely predicting preparedness, albeit in an unpredicted positive direction. This shows how the lack of performing risk-mitigating behaviors can result from distinct cognitive processes that are linked to distinct personality traits. It also highlights how simultaneous examination of multiple pathways to behavior creates a fuller understanding of its antecedents. PMID:25234125

McNeill, Ilona M; Dunlop, Patrick D; Skinner, Timothy C; Morrison, David L



Hazard identification of environmental pollutants by combining results from ecological and biomarker studies: an example  

EPA Science Inventory

Objective: Linking exposures from environmental pollutants with adverse health effects is difficult because these exposures are usually low-dose and ill-defined. According to several investigators, a series of multidisciplinary, multilevel studies is needed to address this prob...


Preliminary hazards analysis for the National Ignition Facility  

SciTech Connect

This report documents the Preliminary Hazards Analysis (PHA) for the National Ignition Facility (NIF). In summary, it provides: a general description of the facility and its operation; identification of hazards at the facility; and details of the hazards analysis, including inventories, bounding releases, consequences, and conclusions. As part of the safety analysis procedure set forth by DOE, a PHA must be performed for the NIF. The PHA characterizes the level of intrinsic potential hazard associated with a facility, and provides the basis for hazard classification. The hazard classification determines the level of safety documentation required, and the DOE Order governing the safety analysis. The hazard classification also determines the level of review and approval required for the safety analysis report. The hazards of primary concern associated with NIF are radiological and toxicological in nature. The hazard classification is determined by comparing facility inventories of radionuclides and chemicals with threshold values for the various hazard classification levels and by examining postulated bounding accidents associated with the hazards of greatest significance. Such postulated bounding accidents cannot take into account active mitigative features; they must assume the unmitigated consequences of a release, taking into account only passive safety features. In this way, the intrinsic hazard level of the facility can be ascertained.

Brereton, S.J.
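The hazard-classification step the PHA abstract describes, comparing facility inventories of radionuclides and chemicals against threshold values, can be sketched as a simple ascending-threshold lookup; the category names and limits below are placeholders, not DOE values:

```python
def hazard_category(inventory, thresholds):
    """Classify a facility by comparing an inventory quantity against
    category thresholds given in ascending order. The highest threshold
    met determines the category. Limits here are illustrative only."""
    category = "below hazard thresholds"
    for name, limit in thresholds:  # must be sorted by ascending limit
        if inventory >= limit:
            category = name
    return category

# Illustrative inventory of 120 units against made-up category limits
limits = [("Category 3", 10.0), ("Category 2", 100.0), ("Category 1", 1000.0)]
print(hazard_category(120.0, limits))
```

As in the PHA, such a comparison deliberately ignores active mitigative features: only the unmitigated inventory drives the classification.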



Thermal degradation events as health hazards: Particle vs gas phase effects, mechanistic studies with particles  

NASA Astrophysics Data System (ADS)

Exposure to thermal degradation products arising from fire or smoke could be a major concern for manned space missions. Severe acute lung damage has been reported in people after accidental exposure to fumes from plastic materials, and animal studies revealed the extremely high toxicity of freshly generated fumes, whereas a decrease in toxicity of aged fumes has been found. This, and the fact that toxicity of the freshly generated fumes can be prevented with filters, raises the question of whether the toxicity may be due to the particulate rather than the gas phase components of the thermodegradation products. Indeed, results from recent studies implicate ultrafine particles (particle diameter in the nm range) as potentially severe pulmonary toxicants. We have conducted a number of in vivo (inhalation and instillation studies in rats) and in vitro studies to test the hypothesis that ultrafine particles possess an increased potential to injure the lung compared to larger-sized particles. We used as surrogate particles ultrafine TiO2 particles (12 and 20 nm diameter). Results in exposed rats showed that the ultrafine TiO2 particles not only induce a greater acute inflammatory reaction in the lung than larger-sized TiO2 particles, but can also lead to persistent chronic effects, as indicated by an adverse effect on alveolar macrophage-mediated clearance of particles. Release of mediators from alveolar macrophages during phagocytosis of the ultrafine particles and increased access of the ultrafine particles to the pulmonary interstitium are likely factors contributing to their pulmonary toxicity. In vitro studies with lung cells (alveolar macrophages) showed, in addition, that ultrafine TiO2 particles have a greater potential to induce cytokines than larger-sized particles.
We conclude from our present studies that ultrafine particles have a significant potential to injure the lung and that their occurrence in thermal degradation events can play a major role in the highly acute toxicity of fumes. Future studies will include adsorption of typical gas phase components (HCl, HF) on surrogate particles to differentiate between gas and particle phase effects and to perform mechanistic studies aimed at introducing therapeutic/preventive measures. These studies will be complemented by a comparison with actual thermal degradation products.

Oberdörster, G.; Ferin, J.; Finkelstein, J.; Soderholm, S.


Impacts of urbanization on the hazard, vulnerability and risk of pluvial disaster  

NASA Astrophysics Data System (ADS)

The design capacity of an urban drainage system is often smaller than that of a fluvial protection facility such as a levee. Many metropolises located in lowlands suffer pluvial inundation disasters more than fluvial flood disasters. For improving mitigation strategies, flood risk assessment is an important non-structural flood control measure, especially in countries frequently hit by tropical cyclones and monsoons. Located in the hot zone of typhoon tracks in the Western Pacific, Taiwan suffers three to five typhoons annually. As a result of urbanization in Taiwan, heavy rainfalls cause inundation disasters that grow with increasing population and the demand for land development. The purpose of this study is to evaluate the impacts of urbanization on the hazard, vulnerability and risk of pluvial disaster. This study applies the concept that risk is composed of hazard and vulnerability to assess the flood risk to human life. Two-dimensional overland-flow simulation is performed based on a design extreme rainfall event to calculate the scores of the pluvial hazard factors for human life, including flood depth, velocity and rising ratio. The score of pluvial vulnerability for human life is derived from resident and environment factors. A risk matrix is applied to express risk as the composition of inundation hazards and vulnerabilities. Additionally, the flood simulations consider different stages of drainage channel construction, indicating the progress of pluvial disaster mitigation, in order to evaluate the impacts of urbanization on inundation hazard; changes in land use and population density are considered for the impacts of urbanization on inundation vulnerability. Tainan City, one of the earliest cities in Taiwan, is selected as the case study because serious flooding was induced there by Typhoon Morakot in 2009.
Typhoon Morakot carried intense rain and moved from the east as slowly as 4 km/hr, while the southwest monsoon entered the region at the same time. The combined effect was that, in the area between the typhoon and the southwest monsoon, a sharp air-pressure gradient built up that unexpectedly brought heavy rainfall for about 72 hours, producing a record-breaking 625 mm of rain in 48 hours in the study area. Assessing the impacts of urbanization on the pluvial inundation risk of Tainan City in the Typhoon Morakot event, the results show that inundation hazard decreased while vulnerability increased due to urbanization. Finally, the pluvial inundation risk maps for human life can provide useful information for setting flood inundation mitigation strategies.

Pan, T.-Y.; Chang, T.-J.; Lai, J.-S.; Chang, H.-K.
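The risk-matrix idea this study applies, composing risk from ordinal hazard and vulnerability scores, can be sketched as a simple lookup; the level counts and thresholds below are illustrative assumptions, not the study's calibration:

```python
def risk_class(hazard, vulnerability):
    """Toy risk-matrix lookup: hazard and vulnerability are scored on
    ordinal levels 1..4 and composed by their product. The class
    thresholds are invented for illustration."""
    score = hazard * vulnerability
    if score >= 12:
        return "high"
    if score >= 6:
        return "moderate"
    if score >= 3:
        return "low"
    return "negligible"

# Hypothetical grid cell: hazard level 4 (deep, fast-rising flooding),
# vulnerability level 3 (densely populated, low coping capacity)
print(risk_class(4, 3))
```

In practice each cell of the matrix is assigned a class directly by expert judgment rather than a product rule; the sketch only shows the composition step.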



Environmental hazard mapping using GIS and AHP - A case study of Dong Trieu District in Quang Ninh Province, Vietnam  

NASA Astrophysics Data System (ADS)

In recent years, the Vietnamese economy has been growing rapidly, causing a serious decline in environmental quality, especially in industrial and mining areas. This poses an enormous threat to socially sustainable development and to human health. Environmental quality assessment and protection are complex and dynamic processes, since they involve spatial information from multi-sector, multi-region and multi-field sources and need complicated data processing. Therefore, an effective environmental protection information system is needed, in which considerable factors hidden in complex relationships become clear and visible. In this paper, the authors present the methodology used to generate environmental hazard maps through the integration of the Analytic Hierarchy Process (AHP) and Geographical Information Systems (GIS). We demonstrate the results obtained for the study area in Dong Trieu district. This study contributes an overall perspective of environmental quality and identifies the devastated areas where the administration urgently needs to establish an appropriate policy to improve and protect the environment.

Anh, N. K.; Phonekeo, V.; My, V. C.; Duong, N. D.; Dat, P. T.
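A minimal sketch of the AHP weighting step assumed in this kind of workflow: priority weights are derived from a pairwise-comparison matrix, here via the row geometric-mean approximation of the principal eigenvector (the 3x3 matrix of hypothetical criteria below is invented, not from the paper):

```python
import math

def ahp_weights(pairwise):
    """Approximate AHP priority weights from a reciprocal
    pairwise-comparison matrix using the row geometric-mean method,
    a common shortcut for the principal eigenvector."""
    n = len(pairwise)
    gmeans = [math.prod(row) ** (1.0 / n) for row in pairwise]
    total = sum(gmeans)
    return [g / total for g in gmeans]

# Hypothetical criteria, e.g. air, water, soil pollution (Saaty 1-9 scale)
m = [[1,     3,   5],
     [1 / 3, 1,   2],
     [1 / 5, 1 / 2, 1]]
print([round(w, 3) for w in ahp_weights(m)])
```

The resulting weights would then scale the corresponding GIS criterion layers before overlay; a full AHP also checks the consistency ratio, which this sketch omits.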



Genetic k-Means Clustering Approach for Mapping Human Vulnerability to Chemical Hazards in the Industrialized City: A Case Study of Shanghai, China  

PubMed Central

Reducing human vulnerability to chemical hazards in the industrialized city is a matter of great urgency. Vulnerability mapping is an alternative approach for providing vulnerability-reducing interventions in a region. This study presents a method for mapping human vulnerability to chemical hazards using clustering analysis for effective vulnerability reduction. Taking the city of Shanghai as the study area, we measure human exposure to chemical hazards using the proximity model, additionally considering the toxicity of hazardous substances, and capture sensitivity and coping capacity with corresponding indicators. We perform an improved k-means clustering approach based on a genetic algorithm, using a 500 m × 500 m geographical grid as the basic spatial unit. The sum of squared errors and the silhouette coefficient are combined to measure the quality of clustering and to determine the optimal number of clusters. The clustering result reveals a set of six typical human vulnerability patterns that show distinct combinations of vulnerability dimensions. The vulnerability mapping of the study area reflects cluster-specific vulnerability characteristics and their spatial distribution. Finally, we suggest specific points that can provide new insights in rationally allocating the limited funds for the vulnerability reduction of each cluster. PMID:23787337

Shi, Weifang; Zeng, Weihua
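The silhouette coefficient this study combines with the sum of squared errors can be sketched in plain Python; this toy 1-D version omits the genetic optimization and the gridded vulnerability data:

```python
def silhouette(points, labels):
    """Mean silhouette coefficient: for each point, a = mean distance
    to its own cluster, b = lowest mean distance to another cluster,
    s = (b - a) / max(a, b). Assumes >= 2 clusters and every cluster
    has >= 2 members; 1-D points for simplicity."""
    def dist(p, q):
        return abs(p - q)
    scores = []
    for i, (p, lab) in enumerate(zip(points, labels)):
        same = [dist(p, points[j]) for j in range(len(points))
                if labels[j] == lab and j != i]
        a = sum(same) / len(same)
        b = min(
            sum(dist(p, q) for q, l in zip(points, labels) if l == other)
            / sum(1 for l in labels if l == other)
            for other in set(labels) if other != lab
        )
        scores.append((b - a) / max(a, b))
    return sum(scores) / len(scores)

# Two well-separated 1-D clusters score close to the maximum of 1
print(round(silhouette([0.0, 0.1, 5.0, 5.1], [0, 0, 1, 1]), 3))
```

Scanning this score over candidate cluster counts, alongside the within-cluster SSE, is one common way to pick the number of clusters, as the abstract describes.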




EPA Science Inventory

This study was undertaken to decipher the mechanisms of solidification/stabilization (S/S) of water soluble organics - namely phenol and ethylene glycol. Portland cement Type I was the binder used in ratios that varied from 2% to 50% organics/waste mixture. The S/S product was st...


The Integrated Hazard Analysis Integrator  

NASA Technical Reports Server (NTRS)

Hazard analysis addresses hazards that arise in the design, development, manufacturing, construction, facilities, transportation, operations and disposal activities associated with hardware, software, maintenance, operations and environments. An integrated hazard is an event or condition that is caused by or controlled by multiple systems, elements, or subsystems. Integrated hazard analysis (IHA) is especially daunting and ambitious for large, complex systems such as NASA's Constellation program, which incorporates program, systems and element components that impact others (International Space Station, public, International Partners, etc.). An appropriate IHA should identify all hazards, causes, controls and verifications used to mitigate the risk of catastrophic loss of crew, vehicle and/or mission. Unfortunately, in the current age of increased technology dependence, there is a tendency to overlook the necessary and sufficient qualifications of the integrator, that is, the person or team that identifies the parts, analyzes the architectural structure, aligns the analysis with the program plan and then communicates and coordinates with large and small components, each contributing necessary hardware, software and/or information to prevent catastrophic loss. As seen in both the Challenger and Columbia accidents, lack of appropriate communication, management errors and lack of resources dedicated to safety were cited as major contributors to these fatalities. From the accident reports, it would appear that the organizational impact of managers, integrators and safety personnel contributes more significantly to mission success and mission failure than purely technological components. If this is so, then organizations that sincerely desire mission success must put as much effort into selecting managers and integrators as they do when designing the hardware, writing the software code and analyzing competitive proposals.
This paper will discuss the necessary and sufficient requirements of one of the significant contributors to mission success, the IHA integrator. Discussions will be provided to describe both the mindset required as well as deleterious assumptions/behaviors to avoid when integrating within a large scale system.

Morris, A. Terry; Massie, Michael J.



wind engineering & natural disaster mitigation  

E-print Network

wind engineering & natural disaster mitigation. [Presentation slides; charts not recoverable.] Investment: WindEEE Dome at Advanced Manufacturing Park, $31 million; Insurance Research Lab for Better Homes, $8 million; Advanced Facility for Avian Research, $9 million.

Denham, Graham


Creating Probabilistic Multi-Peril Hazard Maps  

NASA Astrophysics Data System (ADS)

An often overlooked component of natural hazards is the element of human involvement. Physical events--such as massive earthquakes--that do not affect society constitute natural phenomena, but are not necessarily natural hazards. Natural phenomena that occur in populated areas constitute hazardous events. Furthermore, hazardous events that cause damage--either in the form of structural damage or the loss or injury of lives--constitute natural disasters. Geographic areas that do not contain human interests, by definition, cannot suffer from hazardous events. Therefore, they do not contain a component of natural hazard. Note that this definition differs from the view of natural hazards as "unavoidable havoc wreaked by the unrestrained forces of nature". On the contrary, the burden of cause is shifted from purely natural processes to the concurrent presence of human society and natural events. Although individuals can do little to change the occurrences or intensities of most natural phenomena, they can mitigate their exposure to natural events and help ensure hazardous events do not become natural disasters. For example, choosing to build new settlements in known flood zones increases the exposure--and therefore risk--to natural flood events. Similarly, while volcanoes do erupt periodically, it is the conscious act of reappropriating the rich soils formed by ejecta as farmland that makes the volcanoes hazardous. Again, this empowers individuals and makes them responsible for their own exposure to natural hazards. Various local and governmental agencies--in particular, the United States Geological Survey (USGS)--do a good job of identifying and listing various local natural hazards. These listings, however, are often treated individually and independently. Thus, it is often difficult to construct a "big picture" image of total natural hazard exposure.
In this presentation, we discuss methods of identifying and combining different natural hazards for a given location. We then further refine our calculation into a single value--a hazard exposure index--and show how this value can be used to create multi-peril hazard maps. As an example, we contour and present one such map for all of California.

Holliday, J. R.; Page, N. A.; Rundle, J. B.



Numerical Study on the Chemical Hazard Aiming at the Security during 2008 Beijing Olympic Games  

Microsoft Academic Search

In this study, a comprehensive system was developed to meet the demands of security guarding during the 2008 Beijing Olympic Games. In the system, meteorological models, namely MM5 and RAMS 6.0, and a poisonous-cloud diffusion model over complex terrain (CDM) were configured in a one-way, off-line nested way. In the system, MM5 runs were performed in a real-time operational way

Shunxiang Huang; Feng Liu; Huimin Li; Yuan Zuo



Action on Hazardous Wastes.  

ERIC Educational Resources Information Center

U.S. EPA is gearing up to investigate about 300 hazardous waste dump sites per year that could pose an imminent health hazard. Prosecutions are expected to result from the priority effort at investigating illegal hazardous waste disposal. (RE)

EPA Journal, 1979



Hazard Alert: Trenches  


... Construction Chart Book, p. 39. CPWR. 2008. HAZARD ALERT Find out more about safe work in trenches: • ... about construction hazards. Get more of these Hazard Alert cards – and cards on other topics. Call 301- ...


The impact of overlapping processes on rockfall hazard analysis - the Bolonia Bay study (southern Spain)  

NASA Astrophysics Data System (ADS)

For rockfall simulations, competitive case studies and data sets are important to develop and evaluate the models or software. Especially for empirical or data driven stochastic modelling the quality of the reference data sets has a major impact on model skills and knowledge discovery. Therefore, rockfalls in the Bolonia Bay close to Tarifa (Spain) were mapped. Here, the siliciclastic Miocene rocks (megaturbidites) are intensively joined and disaggregated by a perpendicular joint system. Although bedding supports stability as the dip is not directed towards the rock face, the deposits indicate a continuous process of material loss from the 80 m high cliff of the San Bartolome mountain front by single large rock falls. For more than 300 blocks data on size, shape, type of rock, and location were collected. The work concentrated on rockfall blocks with a volume of more than 2 m³ and up to 350 m³. Occasionally very long "runout" distances of up to 2 km have been observed. For all major source areas and deposits, runout analysis using empirical models and a numerical trajectorian model has been performed. The most empirical models are principally based on the relation between fall height and travel distance. Beside the "Fahrböschung" from Heim (1932) the "shadow angle" introduced by Evans and Hungr (1993) is most common today. However, studies from different sites show a wide variance of the angle relations (Dorren 2003, Corominas 1996). The reasons for that might be different environments and trigger mechanisms, or varying secondary effects such as post-depositional movement. Today, "semi" numerical approaches based on trajectorian models are quite common to evaluate the rockfall energy and the runout distance for protection measures and risk evaluations. The results of the models highly depend on the quality of the input parameters. 
One problem here might be that some of the parameters, especially the dynamic ones, are not easy to determine and the quality of the digital elevation model has an large impact on energy estimations and travel paths. In the course of this study the model of "shadow angel", "Fahrböschung" and a numerical simulation using "Rockfall 6.2" (Spang & Sonser 1995) have been applied to the mapped rockfall deposits. The results revealed a good coherence of all three modeling attempts in some cases. Especially for deposition areas where many single rockfall events could be identified as young all models performed well and showed nearly identical results. In other areas with large deposits and long travel distances, the model predictions vary strongly and the shadow angles do not fit the usual ranges at all. Here, post-depositional transport by surface-near landslides in a piggy-back style is postulated. Medium- and large-scaled landslides and creep in soils are proven in the whole bay. Landslide bodies can be observed in the deposition areas and were proved with GPR. Additionally, the weathered marls and clays of the Flysch deposits below the rock face are highly active and likely to be subject to sliding after heavy rainfalls. Another reason for the extraordinary long runout distances might be seismic triggering. Paleoseismological and archeoseismological investigations already showed that the study area suffered destructive earthquakes even in historical times (Silva et al 2009). This trigger mechanism was simulated for various blocks, but did not lead to the expected results in all cases. Strong winds have also to be considered as an additionally trigger mechanism for rockfalls by leverage as wind forces > 5 Bft are present in the forested study area more than 300 days per year. The results show that simple stochastic analysis using large data sets without taking triggering mechanism and geological environment in consideration may lead to mere general models. 
More data sets and comparative studies are necessary to evaluate the threshold values for empirical models such as the shadow angle. However, the results from the des
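The two empirical runout measures named in this abstract share the same geometry: an angle whose tangent is fall height over horizontal travel distance, measured from the top of the source cliff (Fahrböschung) or from the apex of the talus slope (shadow angle). A minimal sketch of that relation, using the 80 m cliff height from the study area but illustrative angle values (not values fitted to the Bolonia Bay data):

```python
import math

def runout_from_angle(fall_height_m, angle_deg):
    """Horizontal travel distance implied by a travel-angle model.

    Both the Fahrböschung (Heim 1932) and the shadow angle
    (Evans & Hungr 1993) reduce to L = H / tan(angle); they differ
    only in where H is measured from (cliff top vs. talus apex)."""
    return fall_height_m / math.tan(math.radians(angle_deg))

# 80 m cliff height from the study; the angles are illustrative values
# within commonly reported literature ranges, not site-specific fits.
H = 80.0
for angle in (32.0, 27.5, 24.0):
    print(f"angle {angle:4.1f} deg -> runout {runout_from_angle(H, angle):6.1f} m")
```

A flatter angle implies a longer runout, which is why deposits far beyond the usual shadow-angle range (as observed here) point to secondary transport or additional triggers.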

Fernandez-Steeger, T.; Grützner, C.; Reicherter, K.; Braun, A.; Höbig, N.



The risk perception paradox--implications for governance and communication of natural hazards.  


This article reviews the main insights from selected literature on risk perception, particularly in connection with natural hazards. It includes numerous case studies on perception and social behavior dealing with floods, droughts, earthquakes, volcanic eruptions, wildfires, and landslides. The review reveals that personal experience of a natural hazard and trust, or lack of trust, in authorities and experts have the most substantial impact on risk perception. Cultural and individual factors such as media coverage, age, gender, education, income, social status, and others do not play such an important role but act as mediators or amplifiers of the main causal connections between experience, trust, perception, and preparedness to take protective actions. When analyzing the effects of experience and trust on risk perception and on the likelihood that individuals will take preparedness action, the review found a risk perception paradox: it is commonly assumed that high risk perception will lead to personal preparedness and, in the next step, to risk mitigation behavior, but this is not necessarily true. In fact, the opposite can occur when individuals with high risk perception still choose not to prepare themselves in the face of a natural hazard. Therefore, based on the results of the review, this article offers three explanations for why this paradox might occur. These findings have implications for future risk governance and communication as well as for the willingness of individuals to invest in risk preparedness or risk mitigation actions. PMID:23278120

Wachinger, Gisela; Renn, Ortwin; Begg, Chloe; Kuhlicke, Christian



Hazardous materials: chemistry and safe handling aspects of flammable, toxic and radioactive materials. A course of study  

SciTech Connect

The subject of this dissertation is a one-semester, three-credit course designed for students who have taken at least twelve credits of college chemistry, and for high school teachers as a continuing education course. The need for such a course arises from the increased concern for safety in recent years and the introduction of many regulations of which the working chemist should be aware, notably those issued by the Occupational Safety and Health Administration. A few colleges have recently started to offer courses in laboratory safety to undergraduate and graduate chemistry students; thus, there is a need for the development of courses in which chemical safety is taught. This course is divided into three units: (1) flammable materials; (2) toxic materials; and (3) radioactive materials. Each unit is self-contained and could be taught separately as a one-credit course. The material necessary for lecture presentation is given in the text of this dissertation; there are about seven topics in each unit. The chemical properties of selected substances are emphasized. Examples of governmental regulations are given, and there are sample examination questions for each unit and homework assignments that require the use of reference sources. Laboratory exercises are included to enable students to gain experience in the safe handling of hazardous chemicals and of some equipment and instruments used to analyze and study flammable, toxic, and radioactive materials.

Smith, M.W.



Treatability study on the use of by-product sulfur in Kazakhstan for the stabilization of hazardous and radioactive wastes  

SciTech Connect

The Republic of Kazakhstan generates significant quantities of excess elemental sulfur from the production and refining of petroleum reserves. In addition, the country also produces hazardous and radioactive wastes, which require treatment and stabilization. In an effort to find secondary uses for the elemental sulfur and simultaneously produce a material that could be used to encapsulate harmful contaminants and reduce their dispersion into the environment, BNL evaluated the use of sulfur polymer cement (SPC) produced from by-product sulfur in Kazakhstan. This thermoplastic binder material forms a durable waste form with low leaching properties and is compatible with a wide range of waste types. Several hundred kilograms of Kazakhstan sulfur were shipped to the US and converted to SPC (by reaction with 5 wt% organic modifiers) for use in this study. A phosphogypsum sand waste generated in Kazakhstan during the purification of phosphate fertilizer was selected for treatment. Waste loadings of 40 wt% were easily achieved. Waste form performance testing included compressive strength, water immersion, and accelerated leach testing.

Kalb, P.D.; Milian, L.W. [Brookhaven National Lab., Upton, NY (United States). Environmental and Waste Technology Center]; Yim, S.P. [Korea Atomic Energy Research Inst. (Korea, Republic of)]; Dyer, R.S.; Michaud, W.R. [Environmental Protection Agency (United States)]



Studies of ChemChar cocurrent flow gasification of hazardous wastes using radiolabeled compounds  

SciTech Connect

This paper describes investigations of the ChemChar thermal process using radiolabeled compounds. The process uses triple reverse-burn (TRB) char as the waste carrier. The production, weight/volume loss, and adsorption characteristics of TRB char were investigated. Batch-mode laboratory-scale gasification systems were developed to study the fate of radiolabeled organic compounds at low and high loadings. The following conclusions were drawn from the experimental results: (1) TRB char sorbs pollutants, whether they are in the aqueous phase or the gas phase; (2) TRB char incorporates carbon during cocurrent gasification of hydrocarbons and organohalides; (3) none of the compounds tested (benzoic acid; chlorobenzene; 1,4-dichlorobenzene; hexachlorobenzene; 3,3',4,4'-tetrachlorobiphenyl; and dibenzofuran) produced oxygenated organic byproducts during cocurrent gasification; (4) strong reductive destruction mechanisms are predominant during cocurrent gasification in a batch-mode reactor; and (5) none of the chlorinated compounds tested formed byproducts with a higher number of chlorine atoms than the parent compounds.

Gorman, M.K.; Velagaleti, R.R.; Larsen, D.W. [Analytical Bio-Chemistry Labs., Inc., Columbia, MO (United States)] [and others]



[Environmental Hazards Assessment Program annual report, June 1992--June 1993]. Summer undergraduate research program: Environmental studies  

SciTech Connect

The purpose of the summer undergraduate internship program for research in environmental studies is to provide an opportunity for well-qualified students to undertake an original research project as an apprentice to an active research scientist in basic environmental research. Ten students from throughout the midwestern and eastern areas of the country were accepted into the program. These students selected projects in the areas of marine sciences, biostatistics and epidemiology, and toxicology. The research experience for all these students and their mentors was very positive. The seminars were well attended and the students showed their interest in the presentations and environmental sciences as a whole by presenting the speakers with thoughtful and intuitive questions. This report contains the research project written presentations prepared by the student interns.

McMillan, J. [ed.]



In-Space Propulsion Engine Architecture Based on Sublimation of Planetary Resources: From Exploration Robots to NEO Mitigation  

NASA Technical Reports Server (NTRS)

The purpose of this NIAC study is to identify those volatile and mineral resources that are available on asteroids, comets, moons and planets in the solar system, and investigate methods to transform these resources into forms of power that will expand the capabilities of future robotic and human exploration missions to explore planetary bodies beyond the Moon and will mitigate hazards from NEOs. The sources of power used for deep space probe missions are usually derived from either solar panels for electrical energy, radioisotope thermal generators for thermal energy, or fuel cells and chemical reactions for chemical energy and propulsion.

Sibille, Laurent; Mantovani, James; Dominquez, Jesus



Applications of research from the U.S. Geological Survey program, assessment of regional earthquake hazards and risk along the Wasatch Front, Utah  

USGS Publications Warehouse

INTERACTIVE WORKSHOPS: ESSENTIAL ELEMENTS OF THE EARTHQUAKE HAZARDS RESEARCH AND REDUCTION PROGRAM IN THE WASATCH FRONT, UTAH: Interactive workshops provided the forum and stimulus necessary to foster collaboration among the participants in the multidisciplinary, 5-yr program of earthquake hazards reduction in the Wasatch Front, Utah. The workshop process validated well-documented social science theories on the importance of interpersonal interaction, including interaction between researchers and users of research, to increase the probability that research will be relevant to the user's needs and, therefore, more readily used. REDUCING EARTHQUAKE HAZARDS IN UTAH: THE CRUCIAL CONNECTION BETWEEN RESEARCHERS AND PRACTITIONERS: Complex scientific and engineering studies must be translated for and transferred to nontechnical personnel for use in reducing earthquake hazards in Utah. The three elements needed for effective translation (likelihood of occurrence, location, and severity of potential hazards) and the three elements needed for effective transfer (delivery, assistance, and encouragement) are described and illustrated for Utah. The importance of evaluating and revising earthquake hazard reduction programs and their components is emphasized. More than 30 evaluations of various natural hazard reduction programs and techniques are introduced. This report was prepared for research managers, funding sources, and evaluators of the Utah earthquake hazard reduction program who are concerned about effectiveness. An overview of the Utah program is provided for those researchers, engineers, planners, and decision makers, both public and private, who are committed to reducing human casualties, property damage, and interruptions of socioeconomic systems. 
PUBLIC PERCEPTIONS OF THE IMPLEMENTATION OF EARTHQUAKE MITIGATION POLICIES ALONG THE WASATCH FRONT IN UTAH: The earthquake hazard potential along the Wasatch Front in Utah has been well defined by a number of scientific and engineering studies. Translated earthquake hazard maps have also been developed to identify areas that are particularly vulnerable to various causes of damage such as ground shaking, surface rupturing, and liquefaction. The implementation of earthquake hazard reduction plans is now under way in various communities in Utah. The results of a survey presented in this paper indicate that technical public officials (planners and building officials) have an understanding of the earthquake hazards and how to mitigate the risks. Although the survey shows that the general public has a slightly lower concern about the potential for economic losses, they recognize the potential problems and can support a number of earthquake mitigation measures. The study suggests that many community groups along the Wasatch Front, including volunteer groups, business groups, and elected and appointed officials, are ready for action-oriented educational programs. These programs could lead to a significant reduction in the risks associated with earthquake hazards. A DATA BASE DESIGNED FOR URBAN SEISMIC HAZARDS STUDIES: A computerized data base has been designed for use in urban seismic hazards studies conducted by the U.S. Geological Survey. The design includes file structures for 16 linked data sets, which contain geological, geophysical, and seismological data used in preparing relative ground response maps of large urban areas. The data base is organized along relational data base principles. A prototype urban hazards data base has been created for evaluation in two urban areas currently under investigation: the Wasatch Front region of Utah and the Puget Sound area of Washington. 
The initial implementation of the urban hazards data base was accomplished on a microcomputer using dBASE III Plus software and transferred to minicomputers and a work station. A MAPPING OF GROUND-SHAKING INTENSITIES FOR SALT LAKE COUNTY, UTAH: This paper documents the development of maps showing a

Gori, Paula L., (Edited By)



Hazard Evaluation of Landslides Triggered by a Rainfall Storm, a Case Study of Valladolid River Basin in Ecuador  

NASA Astrophysics Data System (ADS)

This paper describes a meteorological event that occurred on May 24, 2007 and triggered an important debris flow in the upper Valladolid river basin, Ecuador. The generated debris flood affected several kilometers, producing erosion and sedimentation downstream and causing damage to a number of works located near the river channel; the predicted 250-year hydrological flood peak discharge for the river basin was about 80 m³/s, while the estimated actual debris flood discharge during the event reached 200 to 300 m³/s. Moreover, field observations showed that instability processes occurred in a well-conserved basin that did not show evidence of such phenomena in the recent past. The study was carried out by CAMINOSCA in the framework of a consultancy study developed to design the Valladolid Hydroelectric project. The main triggering factors, rainfall and soil moisture, were considered for the event analysis. To do so, the storm intensity for durations of less than 12 hours and the antecedent precipitation (AP), i.e., the cumulative rainfall in the 30, 60, 90, 120, 150, 160, and 180 days prior to the event, were studied through frequency analysis. The return period of the concurrence of the two factors was defined, as well as the risk. Results showed that the cause of the landslide was the combination of more than 1 900 mm of cumulative antecedent precipitation in the 160 days prior to the event, the near-continuous occurrence of rainfall (only 12 days without rain in 160 days), and a triggering short-duration storm with a 15-year return period that occurred on the day of the landslide. Finally, the magnitude of the triggering factors is compared with landslide precipitation thresholds proposed by different authors, based on data from several places around the world. Key Words: Hazard, Debris Flow, Meteorological Characterization, Landslide Thresholds.
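The frequency analysis described above, estimating the return period of an antecedent-precipitation total, can be sketched with a simple Gumbel (EV1) fit by the method of moments. The sample values below are hypothetical annual maxima (the abstract does not give the underlying series), and `gumbel_params` and `return_period` are illustrative names, not functions from the study:

```python
import math

def gumbel_params(sample):
    """Method-of-moments Gumbel (EV1) fit for an annual-maximum series."""
    n = len(sample)
    mean = sum(sample) / n
    var = sum((x - mean) ** 2 for x in sample) / (n - 1)
    beta = math.sqrt(var) * math.sqrt(6) / math.pi  # scale parameter
    mu = mean - 0.5772156649 * beta                 # location (Euler-Mascheroni const.)
    return mu, beta

def return_period(x, mu, beta):
    """Return period T = 1 / (1 - F(x)) under the Gumbel CDF F."""
    F = math.exp(-math.exp(-(x - mu) / beta))
    return 1.0 / (1.0 - F)

# Hypothetical annual maxima of 160-day antecedent precipitation (mm).
sample = [1250, 1400, 1100, 1520, 1350, 1600, 1290, 1450, 1380, 1200]
mu, beta = gumbel_params(sample)
print(f"T(1900 mm) ~ {return_period(1900, mu, beta):.0f} years")
```

With real data, the joint return period of antecedent precipitation and storm intensity would then be assessed, as the study does for the two concurrent factors.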

Heredia, E.



Microzonation of Seismic Hazard Potential in Tainan, Taiwan  

NASA Astrophysics Data System (ADS)

The majority of the Tainan area consists of densely populated alluvial plains. Medium- and high-story buildings are increasing in number, while the reconstruction of old buildings in some communities is slow. Considering that at least six active faults are distributed in this area, very high earthquake hazard potential can be foreseen. In this study, a catalog of 2044 shallow earthquakes that occurred from 1900 to 2010, with Mw magnitudes ranging from 5.0 to 8.2, and 11 disastrous earthquakes that occurred from 1683 to 1899 are used to estimate the seismic hazard potential in the Tainan area for seismic microzonation. The results reveal that the highest earthquake hazard potential in the Tainan area is located in the northern part, including Houbi Dist., Xinying Dist., Liuying Dist., western Baihe Dist., western Liujia Dist., northeastern Yanshui Dist., northeastern Xiaying Dist., northern Guantian Dist., and northern Madou Dist. The probabilities of seismic intensity exceeding MMI VIII in 10-, 30-, and 50-year periods in the above areas are greater than 30%, 50%, and 70%, respectively. Moreover, the probabilities of seismic intensity exceeding MMI VI in a 10-year period are greater than 70% in the central and northern areas of Tainan. Finally, by comparing with the seismic zoning map of Taiwan in the current building code, revised after the 921 earthquake, we find that classifying the whole of Tainan as zone I underestimates the hazard in the following areas: southern Houbi Dist., western Baihe Dist., western Tungshan Dist., central and eastern Liuying Dist., northern Liujia Dist., and northeastern Xinying Dist. The results of this study show high earthquake hazard potential in the Tainan area. They provide a valuable database for the seismic design of critical facilities, will help mitigate earthquake disaster losses in Tainan in the future, and provide critical information for emergency response plans.
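Exceedance probabilities over 10-, 30-, and 50-year windows of the kind reported above typically follow from a Poisson occurrence assumption, P = 1 - exp(-lambda*t), where lambda is the annual rate of intensity exceedance at a site. A minimal sketch; the rate below is hypothetical, chosen only so that the resulting probabilities clear the 30%/50%/70% thresholds quoted in the abstract:

```python
import math

def poisson_exceedance(rate_per_year, t_years):
    """Probability of at least one exceedance in t years for a Poisson
    process with the given annual exceedance rate (the standard
    probabilistic-seismic-hazard assumption)."""
    return 1.0 - math.exp(-rate_per_year * t_years)

# Hypothetical annual rate of exceeding MMI VIII at a high-hazard site;
# the abstract reports only the resulting probabilities, not the rates.
lam = 0.04
for t in (10, 30, 50):
    print(f"P(>=1 exceedance in {t:2d} yr) = {poisson_exceedance(lam, t):.2f}")
```

With lam = 0.04/yr this gives roughly 0.33, 0.70, and 0.86 for the three windows, consistent with the ">30%, >50%, >70%" pattern in the abstract.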

Liu, Kun Sung



Coal-mining seismicity and ground-shaking hazard: A case study in the Trail Mountain area, Emery County, Utah  

USGS Publications Warehouse

We describe a multipart study to quantify the potential ground-shaking hazard to Joes Valley Dam, a 58-m-high earthfill dam, posed by mining-induced seismicity (MIS) from future underground coal mining, which could approach as close as ~1 km to the dam. To characterize future MIS close to the dam, we studied MIS located ~3-7 km from the dam at the Trail Mountain coal mine. A 12-station local seismic network (11 stations above ground, one below, combining eight triaxial accelerometers and varied velocity sensors) was operated in the Trail Mountain area from late 2000 through mid-2001 for the dual purpose of (1) continuously monitoring and locating MIS associated with longwall mining at a depth of 0.5-0.6 km and (2) recording high-quality data to develop ground-motion prediction equations for the shallow MIS. (Ground-motion attenuation relationships and moment-tensor results are reported in companion articles.) Utilizing a data set of 1913 earthquakes (M ≤ 2.2), we describe space-time-magnitude distributions of the observed MIS and source-mechanism information. The MIS was highly correlated with mining activity both in space and time. Most of the better-located events have depths constrained within ±0.6 km of mine level. For the preponderance (98%) of the 1913 located events, only dilatational P-wave first motions were observed, consistent with other evidence for implosive or collapse-type mechanisms associated with coal mining in this region. We assess a probable maximum magnitude of M 3.9 (84th percentile of a cumulative distribution) for potential MIS close to Joes Valley Dam based on both the worldwide and regional record of coal-mining-related MIS and the local geology and future mining scenarios.

Arabasz, W.J.; Nava, S.J.; McCarter, M.K.; Pankow, K.L.; Pechmann, J.C.; Ake, J.; McGarr, A.



Natural hazard impact on the infrastructure in large cities: the case study of Moscow and St. Petersburg  

NASA Astrophysics Data System (ADS)

According to all forecasts, the severity of natural hazard impact on the cities of Russia is expected to increase in coming years and decades due to climate change, on the one hand, and due to many social, economic, and technical causes that reduce their protection and increase their vulnerability, on the other hand. In particular, the frequency and severity of various natural-technological accidents will multiply. The highest damage from natural hazard impact will affect large, densely populated cities with many technologically sophisticated objects, especially dangerous ones. Moscow and St. Petersburg have the highest number of natural-technological accidents among all Russian cities. Moscow is prone to the influence of various natural hazards such as strong winds, floods, landslides, karst, heavy rainfalls and snowfalls, heat waves, wildfires, storms and lightning, and others. These natural hazards cause many traffic accidents and disruptions in transportation, blackouts, breakdowns of communication lines, collapses of buildings, etc. The climatic and topographic conditions of St. Petersburg contribute to the manifestation of many natural hazard types such as floods, storms, strong winds, extreme heat and frost, snowfalls, heavy rains, hail, etc. Hydro-meteorological phenomena occur most often among them; more than 50 per cent are caused by storms and strong winds and about 25 per cent by floods. This situation can have serious consequences because a nuclear power plant and many chemical plants and other dangerous objects are located in this region. Power failures and transportation problems are the most frequent among all natural-technological accidents in St. Petersburg. A statistical analysis of data about natural-technological accidents in both large Russian cities has been carried out using information collected in a database created by the author. 
Power lines and other infrastructure objects are especially vulnerable to the impacts of natural hazards in both Moscow and St. Petersburg. Critical infrastructure facilities located in these regions need special protection and modernization.

Petrova, Elena



Planning Tools For Seismic Risk Mitigation. Rules And Applications  

SciTech Connect

Recently, Italian urban planning research in the field of seismic risk mitigation has been renewed. In particular, it promotes strategies that integrate urban rehabilitation and aseismic objectives, as well as policies directed at revitalizing urban systems by coupling physical renewal with socio-economic development. In Italy, the first law concerning planning for seismic mitigation dates back to 1974: law n. 64, 'Regulation for buildings with particular rules for the seismic areas', whose rules for buildings in seismic areas also concern local hazard. This law, in fact, forced the municipalities to acquire, during the formation of their plans, a preventive opinion on the compatibility between planning conditions and the geomorphological conditions of the territory. From this date, the conviction that seismic risk must be considered within territorial planning, especially in terms of mitigation strategies, has been strengthened. Town planners began to take an interest in seismic risk in the 1980s, when the Irpinia earthquake took place. The research developed after this earthquake established that a principal cause of building collapse was the poor siting of urban settlements (on slopes or crowns). After the Irpinia earthquake, the first research on seismic risk mitigation, in particular on aspects related to hazards and urban vulnerability, was carried out.

De Paoli, Rosa Grazia [Department of Landscape Planning, Mediterranean University of Reggio Calabria (Italy)



International Studies of Hazardous Groundwater/Surface Water Exchange in the Volcanic Eruption and Tsunami Affected Areas of Kamchatka  

NASA Astrophysics Data System (ADS)

During the US-Russia Geohazards Workshop held July 17-19, 2012 in Moscow, Russia, participants were asked to identify cooperative actions for disaster risk reduction, focusing on extreme geophysical events. As part of this recommendation, the PIRE project was developed to understand, quantify, forecast, and protect the coastal zone aquifers and inland water resources of Kamchatka (Russia) and its ecosystems affected by the November 4, 1952 Kamchatka tsunami (Khalatyrka Beach near Petropavlovsk-Kamchatskiy) and the January 2, 1996 Karymskiy volcano eruption and lake tsunami. This project brings together teams from U.S. universities and from research institutions located in Russia. The research consortium was briefed on recent technical developments and will utilize samples secured via major international volcanic and tsunami programs for the purpose of advancing the study of submarine groundwater discharge (SGD) in the volcanic eruption and tsunami affected coastal areas and inland lakes of Kamchatka. We plan to accomplish this project by developing and applying the next generation of field sampling, remote sensing, and laboratory techniques and mathematical tools to study groundwater-surface water interaction processes and SGD. We will develop a field and modeling approach to define the SGD environment, its key controls, and the influence of volcanic eruptions and tsunamis, which will provide a framework for making recommendations to combat contamination. This is valuable for politicians, water resource managers, and decision-makers, and for water supply and water quality in the volcanic eruption and tsunami affected regions of Kamchatka. Data mining and the results of our field work will be compiled for spatial modeling in a Geographic Information System (GIS) using a 3-D Earth Systems Visualization Lab. The field and model results will be communicated to interested stakeholders via an interactive web site. This will allow computation of SGD spatial patterns. 
In addition, thanks to the conceptual integrated approach, the mathematical tool will be transportable to other regions affected by volcanic eruption and tsunami. We will involve students in the work, incorporate the results into our teaching portfolio and work closely with the IUGG GeoRisk Commission and AGU Natural Hazards Focus Group to communicate our findings to the broader public, specifically local communities that will be most impacted. Under the PIRE education component, a cohort of U.S. and Russian post-doctoral researchers and students will receive training and contribute to the overall natural hazards SGD science agenda in cooperation with senior U.S. researchers and leading investigators from the Russian institutions. Overall, the extensive team of researchers, students and institutions is poised to deliver an innovative and broad spectrum of science associated with the study of SGD in the volcanic eruption and tsunami affected areas, in a way not possible to achieve in isolation.

Kontar, Y. A.; Gusiakov, V. K.; Izbekov, P. E.; Gordeev, E.; Titov, V. V.; Verstraeten, I. M.; Pinegina, T. K.; Tsadikovsky, E. I.; Heilweil, V. M.; Gingerich, S. B.



Ab initio study of adsorption properties of hazardous organic molecules on graphene: Phenol, phenyl azide, and phenylnitrene  

NASA Astrophysics Data System (ADS)

Phenol, phenyl azide, and phenylnitrene are hazardous organic molecules; therefore, the fabrication of sensors or filters with high sorption capabilities for the chemicals is necessary. Considering van der Waals interaction, we perform first-principles density functional theory calculations to investigate the adsorption properties of the hazardous molecules on graphene. For parallel stacking configurations, AB stacking is slightly more favorable than AA stacking for all the adsorbates that we considered. We find that phenyl azide has a higher adsorption energy than phenol. Phenylnitrene forms covalent bonds with graphene in oblique stacking structures, resulting in a bandgap opening in graphene.

Lee, Junsu; Min, Kyung-Ah; Hong, Suklyun; Kim, Gunn



A study of reactive adhesion promoters and their ability to mitigate pattern collapse in thin film lithography  

NASA Astrophysics Data System (ADS)

As integrated circuit fabrication continues to advance towards the 22 nm node and below, it has become clear that although line edge roughness and resolution are important, other issues such as pattern collapse must be addressed in order for technology to continue to progress. One of the primary modes of pattern collapse at small feature sizes is adhesion failure, caused by loss of adhesion of the resist to the substrate during the drying process. The main forces that govern pattern collapse by adhesion failure are related to substrate/resist interactions. Significant research has been conducted to find methods for reducing capillary forces, such as the use of surfactants in rinses, to reduce pattern collapse. However, spin drying has also been observed to exhibit other collapse-related effects that are not sensitive to such treatments. To this end, in this work a reactive adhesion promoter capable of covalently attaching to hydroxystyrene-based positive tone resist copolymers has been developed and demonstrated. A vinyl-ether-modified silane was prepared and effectively applied using a solution silanization reaction. A model hydroxystyrene-based positive tone resist was applied and subjected to a post-apply bake to cause the surface modifier to react with the photoresist prior to patterning using e-beam lithography. Contact angle studies and ellipsometry were used to characterize the surface silanization reaction. Pattern collapse test structures were fabricated and analyzed after development and drying on the different surfaces to quantify the impact of the covalent surface linker and compare it to more standard adhesion promoter processes such as those utilizing hexamethyldisilazane (HMDS). The effect of soft bake conditions on the performance of the reactive adhesion promoter has also been studied. 
Ultimately, the results of critical stress analysis and SEM studies of the resulting patterns confirm that use of surface priming agents that covalently attach the resist to the substrate can significantly enhance resist-substrate adhesion and dramatically reduce pattern collapse.

Yeh, Wei-Ming; Lawson, Richard A.; Tolbert, Laren M.; Henderson, Clifford L.



The discriminatory cost of ICD-10-CM transition between clinical specialties: metrics, case study, and mitigating tools  

PubMed Central

Objective Applying the science of networks to quantify the discriminatory impact of the ICD-9-CM to ICD-10-CM transition between clinical specialties. Materials and Methods Datasets were the Centers for Medicare and Medicaid Services ICD-9-CM to ICD-10-CM mapping files, general equivalence mappings, and statewide Medicaid emergency department billing. Diagnoses were represented as nodes and their mappings as directional relationships. The complex network was synthesized as an aggregate of simpler motifs and tabulated per clinical specialty. Results We identified five mapping motif categories: identity, class-to-subclass, subclass-to-class, convoluted, and no mapping. Convoluted mappings indicate that multiple ICD-9-CM and ICD-10-CM codes share complex, entangled, and non-reciprocal mappings. The proportions of convoluted diagnosis mappings (36% overall) range from 5% (hematology) to 60% (obstetrics and injuries). In a case study of 24,008 patient visits in 217 emergency departments, 27% of the costs are associated with convoluted diagnoses, with 'abdominal pain' and 'gastroenteritis' accounting for approximately 3.5%. Discussion Previous qualitative studies report that administrators and clinicians are likely to be challenged in understanding and managing their practice because of the ICD-10-CM transition. We substantiate the complexity of this transition with a thorough quantitative summary per clinical specialty, a case study, and the tools to apply this methodology easily to any clinical practice in the form of a web portal and analytic tables. Conclusions Post-transition, successful management of frequent diseases with convoluted mapping network patterns is critical. The web portal provides insight in linking onerous diseases to the ICD-10 transition. PMID:23645552
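The motif taxonomy above can be made concrete with a small sketch that classifies directed ICD-9-to-ICD-10 mapping pairs by their local network shape. This is a simplified reading of the five categories, not the authors' exact network analysis, and the code pairs below are illustrative rather than taken from the CMS general equivalence mapping files:

```python
from collections import defaultdict

def classify_motifs(pairs):
    """Classify (icd9, icd10) mapping pairs into the five motif
    categories named in the article (simplified interpretation)."""
    fwd, bwd = defaultdict(set), defaultdict(set)
    for src, dst in pairs:
        fwd[src]  # register the source code even if it maps to nothing
        if dst is not None:
            fwd[src].add(dst)
            bwd[dst].add(src)
    motif = {}
    for src, dsts in fwd.items():
        if not dsts:
            motif[src] = "no mapping"
        elif len(dsts) == 1:
            dst = next(iter(dsts))
            # reciprocal one-to-one vs. several ICD-9 codes collapsing into one ICD-10 code
            motif[src] = "identity" if len(bwd[dst]) == 1 else "subclass-to-class"
        elif all(len(bwd[d]) == 1 for d in dsts):
            motif[src] = "class-to-subclass"  # one ICD-9 expands to several ICD-10 codes
        else:
            motif[src] = "convoluted"  # entangled, non-reciprocal mappings
    return motif

# Illustrative pairs, not drawn from the real GEM files:
pairs = [
    ("001.0", "A00.0"),                         # identity
    ("250.00", "E11.9"), ("250.00", "E11.65"),  # class-to-subclass
    ("789.00", "R10.9"), ("789.07", "R10.9"),   # subclass-to-class
    ("558.9", "K52.9"), ("558.9", "K52.3"),
    ("009.1", "K52.9"),                         # convoluted cluster
    ("V90.1", None),                            # no mapping
]
motifs = classify_motifs(pairs)
```

Tabulating such motif labels per clinical specialty is what yields the 5%-60% convoluted-mapping proportions reported in the abstract.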

Boyd, Andrew D; Li, Jianrong ‘John’; Burton, Mike D; Jonen, Michael; Gardeux, Vincent; Achour, Ikbel; Luo, Roger Q; Zenku, Ilir; Bahroos, Neil; Brown, Stephen B; Vanden Hoek, Terry; Lussier, Yves A
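The motif classification described in the abstract above can be sketched as a small graph analysis. The sketch below is illustrative only, not the study's actual code: it assumes the mappings arrive as a list of directed (ICD-9, ICD-10) edges, and the motif-naming rules (one-to-one as identity, one-to-many as class-to-subclass, many-to-one as subclass-to-class, entangled as convoluted) are a plausible reading of the abstract, not the authors' published definitions. Codes with no edges at all would fall into the fifth category, "no mapping".

```python
from collections import defaultdict

def classify_motifs(edges):
    """Classify each ICD-9 source code by the shape of its mapping motif.

    edges: iterable of (icd9, icd10) directed mapping pairs.
    Returns a dict: icd9 code -> motif name (hypothetical naming rules).
    Codes absent from `edges` entirely would be the "no mapping" motif.
    """
    fwd = defaultdict(set)  # icd9 -> set of icd10 targets
    rev = defaultdict(set)  # icd10 -> set of icd9 sources
    for i9, i10 in edges:
        fwd[i9].add(i10)
        rev[i10].add(i9)

    motifs = {}
    for i9, targets in fwd.items():
        if len(targets) == 1:
            t = next(iter(targets))
            if len(rev[t]) == 1:
                motifs[i9] = "identity"            # reciprocal one-to-one
            else:
                motifs[i9] = "subclass-to-class"   # many ICD-9 -> one ICD-10
        elif any(len(rev[t]) > 1 for t in targets):
            # One-to-many where some target is also fed by other ICD-9
            # codes: the edges form an entangled, non-reciprocal component.
            motifs[i9] = "convoluted"
        else:
            motifs[i9] = "class-to-subclass"       # one ICD-9 -> many ICD-10
    return motifs
```

Grouping edges this way reduces the full mapping network to per-specialty motif counts, which is how the abstract's 36%-convoluted figure could be tabulated.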



Assessment of Nearshore Hazard due to Tsunami-Induced Currents (Invited)  

NASA Astrophysics Data System (ADS)

The California Tsunami Program coordinated by CalOES and CGS in cooperation with NOAA and FEMA has begun implementing a plan to increase awareness of tsunami-generated hazards to the maritime community (both ships and harbor infrastructure) through the development of in-harbor hazard maps, offshore safety zones for boater evacuation, and associated guidance for harbors and marinas before, during, and following tsunamis. The hope is that the maritime guidance and associated education and outreach program will help save lives and reduce the exposure of boats and harbor infrastructure to damage. An important step in this process is to understand the causative mechanism for damage in ports and harbors, and then ensure that the models used to generate hazard maps are able to accurately simulate these processes. Findings will be used to develop maps, guidance documents, and consistent policy recommendations for emergency managers and port authorities and provide information critical to real-time decisions required when responding to tsunami alert notifications. The goals of the study are to (1) evaluate the effectiveness and sensitivity of existing numerical models for assessing maritime tsunami hazards, (2) find a relationship between current speeds and expected damage levels, (3) evaluate California ports and harbors in terms of tsunami-induced hazards by identifying regions that are prone to higher current speeds and damage and identifying regions of relatively lower impact that may be used for evacuation of maritime assets, and (4) determine 'safe depths' for evacuation of vessels from ports and harbors during a tsunami event. This presentation will focus on the results from five California ports and harbors, and will include feedback we have received from initial discussion with local harbor masters and port authorities. This work in California will form the basis for tsunami hazard reduction for all U.S. maritime communities through the National Tsunami Hazard Mitigation Program.

Lynett, P. J.; Borrero, J. C.; Son, S.; Wilson, R. I.; Miller, K.



A probabilistic tsunami hazard assessment for Indonesia  

NASA Astrophysics Data System (ADS)

Probabilistic hazard assessments are a fundamental tool for assessing the threats posed by hazards to communities and are important for underpinning evidence-based decision making on risk mitigation activities. Indonesia has been the focus of intense tsunami risk mitigation efforts following the 2004 Indian Ocean Tsunami, but this has been largely concentrated on the Sunda Arc, with little attention to other tsunami-prone areas of the country such as eastern Indonesia. We present the first nationally consistent Probabilistic Tsunami Hazard Assessment (PTHA) for Indonesia. This assessment produces time-independent forecasts of tsunami hazard at the coast from tsunami generated by local, regional and distant earthquake sources. The methodology is based on the established Monte Carlo approach to probabilistic seismic hazard assessment (PSHA) and has been adapted to tsunami. We account for sources of epistemic and aleatory uncertainty in the analysis through the use of logic trees and through sampling probability density functions. For short return periods (100 years) the tsunami hazard is highest along the west coast of Sumatra, the south coast of Java, and the north coast of Papua. For longer return periods (500-2500 years), the tsunami hazard is highest along the Sunda Arc, reflecting larger maximum magnitudes there. The annual probability of experiencing a tsunami with a height at the coast of >0.5 m is greater than 10% for Sumatra, Java, the Sunda Islands (Bali, Lombok, Flores, Sumba) and north Papua. The annual probability of experiencing a tsunami with a height of >3.0 m, which would cause significant inundation and fatalities, is 1-10% in Sumatra, Java, Bali, Lombok and north Papua, and 0.1-1% for north Sulawesi, Seram and Flores. The results of this national scale hazard assessment provide evidence for disaster managers to prioritise regions for risk mitigation activities and/or more detailed hazard or risk assessment.

Horspool, N.; Pranantyo, I.; Griffin, J.; Latief, H.; Natawidjaja, D. H.; Kongko, W.; Cipta, A.; Bustaman, B.; Anugrah, S. D.; Thio, H. K.
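The Monte Carlo PTHA approach described above can be illustrated with a toy simulation: sample synthetic earthquake occurrences over many years, draw each magnitude from a truncated Gutenberg-Richter distribution, convert magnitude to a modeled coastal tsunami height, and count the fraction of years in which a height threshold is exceeded. Everything numeric here (the occurrence rate, magnitude bounds, and especially the magnitude-to-height scaling) is an invented placeholder for illustration, not a value or relation from the study, which uses full tsunami propagation modeling rather than a closed-form scaling.

```python
import math
import random

def sample_magnitude(m_min=7.0, m_max=9.0, b=1.0):
    """Inverse-CDF sample from a truncated Gutenberg-Richter distribution."""
    u = random.random()
    span = 10 ** (-b * m_min) - 10 ** (-b * m_max)
    return -math.log10(10 ** (-b * m_min) - u * span) / b

def exceedance_probability(threshold_m, years=20000, rate_per_year=0.1, seed=42):
    """Fraction of simulated years in which the modeled coastal tsunami
    height exceeds threshold_m (metres). The occurrence rate and the
    magnitude-to-height scaling below are illustrative assumptions."""
    random.seed(seed)
    exceed = 0
    for _ in range(years):
        # At most one event per year: a Bernoulli approximation of
        # Poisson occurrence, for simplicity.
        if random.random() < rate_per_year:
            m = sample_magnitude()
            height = 10 ** (m - 8.0)  # assumed toy scaling, not the study's
            if height > threshold_m:
                exceed += 1
    return exceed / years
```

Running this for thresholds such as 0.5 m and 3.0 m mirrors, in miniature, how the abstract's annual exceedance probabilities are framed; a real PTHA would replace the toy scaling with modeled wave heights per source and propagate logic-tree uncertainty.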



Toxic hazards of underground excavation  

SciTech Connect

Inadvertent intrusion into natural or man-made toxic or hazardous material deposits as a consequence of activities such as mining, excavation or tunnelling has resulted in numerous deaths and injuries in this country. This study is a preliminary investigation to identify and document instances of such fatal or injurious intrusion. An objective is to provide useful insights and information related to potential hazards due to future intrusion into underground radioactive-waste-disposal facilities. The methodology used in this study includes literature review and correspondence with appropriate government agencies and organizations. Key categories of intrusion hazards are asphyxiation, methane, hydrogen sulfide, silica and asbestos, naturally occurring radionuclides, and various mine or waste dump related hazards.

Smith, R.; Chitnis, V.; Damasian, M.; Lemm, M.; Popplesdorf, N.; Ryan, T.; Saban, C.; Cohen, J.; Smith, C.; Ciminesi, F.



Tracking World Aerosol Hazards  

NSDL National Science Digital Library

Worldwide patterns and sources of aerosols are analyzed and evaluated for potential hazards to aircraft safety. Using aerosol index maps created from data gathered by the TOMS instrument, student groups will analyze and compare aerosol data from either eight consecutive or eight random days. Each group will graph the data, rank the hazard level of their study area and analyze the patterns and probable causes of those aerosols. Directions and materials are included for classes with computer access and for those without computer access. The URL opens to the investigation directory, with links to teacher and student materials, lesson extensions, resources, teaching tips, and assessment strategies. Note that this is the last of three investigations found in the Grades 5-8 Module 1 of Mission Geography. The Mission Geography curriculum integrates data and images from NASA missions with the National Geography Standards. Each of the three investigations in Module 1, while related, can be done independently.



USGS Geologic Hazards  

NSDL National Science Digital Library

The Geologic Hazards section of the US Geological Survey (USGS) conducts research into the causes of geological phenomena such as landslides and earthquakes. The homepage connects visitors to the Geologic Hazards team's three main areas of endeavor. Geomagnetism provides links to the National Geomagnetic Information Center; Magnetic Observatories, Models, and Charts; and the Geomagnetic Information Node, which receives geomagnetic observatory data from around the world. The Landslide group studies the "causes and mechanisms of ground failure" to prevent "long-term losses and casualties." Their section provides links to the program and information center, publications, events, and current projects. The Earthquakes department hosts a wealth of information, including neotectonics, engineering seismology, and paleoseismology. Interactive maps are also provided.


Enhancing Students' Understanding of Risk and Geologic Hazards Using a Dartboard Model.  

ERIC Educational Resources Information Center

Uses dartboards to represent the magnitude-frequency relationships of natural hazards, engaging students at different levels of preparation, in different contexts, and for different lengths of time. Helps students correct the misconception that hazardous processes occur periodically by emphasizing the random nature of hazards. Includes 12 references.

Lutz, Timothy M.



Modeling household adoption of earthquake hazard adjustments: a longitudinal panel study of Southern California and Western Washington residents  

E-print Network

they are relevant to the adoption of seismic hazard adjustments. It also addressed three key attributes (knowledge, trustworthiness, and responsibility for protection) ascribed to these multiple stakeholders and the relationships of these stakeholder...

Arlikatti, Sudha S



An Environmental Justice Study of Los Angeles County Elementary Schools and Their Proximities to Liquefaction and Landslide Hazard Areas  

Microsoft Academic Search

My research centers on two separate GIS projects: the School Boundary Project and the Earthquake Hazards Project. I am compiling a GIS geography database for the School Boundary Project in preparation for environmental justice research. I constructed the attendance boundaries for the elementary schools in Los Angeles County using attendance boundary maps and street indexes obtained from school district offices. There are

Phuong Kim Chau




EPA Science Inventory

The purpose of the report is to provide data to the U.S. EPA on the use of sanitary landfills for hazardous waste disposal in Florida by small quantity generators. The report was completed in two stages, each resulting in a three volume interim report. The first interim report co...



EPA Science Inventory

The purpose of the report is to provide data to the U.S. EPA on the use of sanitary landfills for hazardous waste disposal in Florida by small quantity generators. The report was completed in two stages, each resulting in a three volume interim report. Each interim report consist...


Study of application of ERTS-A imagery to fracture related mine safety hazards in the coal mining industry  

NASA Technical Reports Server (NTRS)

The author has identified the following significant results. The utility of ERTS-1/high altitude aircraft imagery to detect underground mine hazards is strongly suggested. A 1:250,000 scale mined lands map of the Vincennes Quadrangle, Indiana has been prepared. This map is a prototype for a national mined lands inventory and will be distributed to State and Federal offices.

Wier, C. E.; Wobber, F. J. (principal investigators); Russell, O. R.; Amato, R. V.



Contributions to the creation of a regional model of forest fire hazard: Taquari River Springs Park, MS: a case study  

NASA Astrophysics Data System (ADS)

Using map algebra in a GIS (geographic information system) environment, this study integrates climate data from the B-RAMS (Brazilian Regional Atmospheric Modeling System; CPTEC, 2005) model with remote sensing data to produce a wildfire hazard map. The Taquari River Springs Park (TRSP) was chosen as a case study because its springs are important contributors to the Upper Paraguay River Basin and because it contains essential remnants of the Cerrado Biome. The B-RAMS model provided relative humidity, horizontal wind components, and temperature. TRSP land cover was identified by object-oriented classification of a LANDSAT ETM+ image, supported by field observations. From the characterization of land-cover phytophysiognomic types, a forest wildfire fuel map was produced. The different maps were integrated in the GIS, and a new map with an associated GIS database was generated, showing the zones most vulnerable to wildfire hazard.

de Albuquerque, L. M. Mercê; Paranhos Filho, A. C.; Torres, T. G.; Kassar, E.; de Matos Filho, H. J. S.; Carrijo, M. G. G.; Pavão, H. G.; de Souza, A.
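The map-algebra overlay described in that abstract amounts to a weighted per-cell combination of co-registered raster layers. The following is a minimal sketch of that idea only; the layer set matches the abstract (humidity, wind, temperature, fuel), but the weights, the [0, 1] normalization, and the inversion of humidity (drier means more hazardous) are assumptions for illustration, not the study's actual model.

```python
def hazard_index(humidity, wind, temperature, fuel,
                 weights=(0.3, 0.2, 0.2, 0.3)):
    """Weighted map-algebra overlay of four co-registered rasters.

    Each raster is a 2-D list of floats assumed pre-normalized to [0, 1].
    Humidity is inverted because low humidity raises fire hazard.
    Weights are illustrative assumptions. Returns a hazard-index raster.
    """
    rows, cols = len(fuel), len(fuel[0])
    w_h, w_w, w_t, w_f = weights
    out = [[0.0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            out[r][c] = (w_h * (1.0 - humidity[r][c])  # dry -> hazardous
                         + w_w * wind[r][c]
                         + w_t * temperature[r][c]
                         + w_f * fuel[r][c])
    return out
```

In a real GIS workflow the same expression would be evaluated over full rasters with a map-algebra tool rather than Python lists; the point of the sketch is only the per-cell weighted-sum structure of the overlay.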