Science.gov

Sample records for hazard mitigation studies

  1. Unacceptable Risk: Earthquake Hazard Mitigation in One California School District. Hazard Mitigation Case Study.

    ERIC Educational Resources Information Center

    California State Office of Emergency Services, Sacramento.

    Earthquakes are a perpetual threat to California's school buildings. School administrators must be aware that hazard mitigation means much more than simply having a supply of water bottles in the school; it means getting everyone involved in efforts to prevent tragedies from occurring in school buildings in the event of an earthquake. The PTA in…

  2. Numerical study on tsunami hazard mitigation using a submerged breakwater.

    PubMed

    Ha, Taemin; Yoo, Jeseon; Han, Sejong; Cho, Yong-Sik

    2014-01-01

    Most coastal structures have been built in surf zones to protect coastal areas. In general, the transformation of waves in the surf zone is quite complicated and numerous hazards to coastal communities may be associated with such phenomena. Therefore, the behavior of waves in the surf zone should be carefully analyzed and predicted. Furthermore, an accurate analysis of deformed waves around coastal structures is directly related to the construction of economically sound and safe coastal structures because wave height plays an important role in determining the weight and shape of a levee body or armoring material. In this study, a numerical model using a large eddy simulation is employed to predict the runup heights of nonlinear waves that passed a submerged structure in the surf zone. Reduced runup heights are also predicted, and their characteristics in terms of wave reflection, transmission, and dissipation coefficients are investigated. PMID:25215334
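
    The reflection, transmission, and dissipation coefficients investigated in this abstract can be related through a simple energy balance for waves crossing a submerged breakwater. The sketch below is a minimal illustration of that bookkeeping under linear wave theory, not the paper's large-eddy-simulation model; the wave heights are placeholder values.

```python
import math

def breakwater_coefficients(H_incident, H_reflected, H_transmitted):
    """Reflection/transmission/dissipation coefficients for a submerged breakwater.

    Assumes linear wave theory, where wave energy is proportional to the square
    of the wave height, so Kr**2 + Kt**2 + Kd**2 = 1.
    """
    Kr = H_reflected / H_incident      # reflection coefficient
    Kt = H_transmitted / H_incident    # transmission coefficient
    Kd = math.sqrt(max(0.0, 1.0 - Kr**2 - Kt**2))  # energy dissipated by the structure
    return Kr, Kt, Kd

# Placeholder wave heights (m) measured seaward and landward of the structure.
Kr, Kt, Kd = breakwater_coefficients(H_incident=2.0, H_reflected=0.6, H_transmitted=1.2)
print(f"Kr={Kr:.2f}, Kt={Kt:.2f}, Kd={Kd:.2f}")
```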

  3. Numerical Study on Tsunami Hazard Mitigation Using a Submerged Breakwater

    PubMed Central

    Yoo, Jeseon; Han, Sejong; Cho, Yong-Sik

    2014-01-01

    Most coastal structures have been built in surf zones to protect coastal areas. In general, the transformation of waves in the surf zone is quite complicated and numerous hazards to coastal communities may be associated with such phenomena. Therefore, the behavior of waves in the surf zone should be carefully analyzed and predicted. Furthermore, an accurate analysis of deformed waves around coastal structures is directly related to the construction of economically sound and safe coastal structures because wave height plays an important role in determining the weight and shape of a levee body or armoring material. In this study, a numerical model using a large eddy simulation is employed to predict the runup heights of nonlinear waves that passed a submerged structure in the surf zone. Reduced runup heights are also predicted, and their characteristics in terms of wave reflection, transmission, and dissipation coefficients are investigated. PMID:25215334

  4. Washington Tsunami Hazard Mitigation Program

    NASA Astrophysics Data System (ADS)

    Walsh, T. J.; Schelling, J.

    2012-12-01

    Washington State has participated in the National Tsunami Hazard Mitigation Program (NTHMP) since its inception in 1995. We have participated in the tsunami inundation hazard mapping, evacuation planning, education, and outreach efforts that generally characterize the NTHMP efforts. We have also investigated hazards of significant interest to the Pacific Northwest. The hazard from locally generated earthquakes on the Cascadia subduction zone, which threatens tsunami inundation in less than an hour following a magnitude 9 earthquake, creates special problems for low-lying accretionary shoreforms in Washington, such as the spits of Long Beach and Ocean Shores, where high ground is not accessible within the limited time available for evacuation. To ameliorate this problem, we convened a panel of the Applied Technology Council to develop guidelines for construction of facilities for vertical evacuation from tsunamis, published as FEMA 646, now incorporated in the International Building Code as Appendix M. We followed this with a program called Project Safe Haven (http://www.facebook.com/ProjectSafeHaven) to site such facilities along the Washington coast in appropriate locations and appropriate designs to blend with the local communities, as chosen by the citizens. This has now been completed for the entire outer coast of Washington. In conjunction with this effort, we have evaluated the potential for earthquake-induced ground failures in and near tsunami hazard zones to help develop cost estimates for these structures and to establish appropriate tsunami evacuation routes and evacuation assembly areas that are likely to be available after a major subduction zone earthquake. We intend to continue these geotechnical evaluations for all tsunami hazard zones in Washington.

  5. Contributions of Nimbus 7 TOMS Data to Volcanic Study and Hazard Mitigation

    NASA Technical Reports Server (NTRS)

    Krueger, Arlin J.; Bluth, G. J. S.; Schaefer, S. A.

    1998-01-01

    Nimbus TOMS data have led to advancements among many volcano-related scientific disciplines, from the initial ability to quantify SO2 clouds leading to derivations of eruptive S budgets and fluxes, to tracking of individual clouds, assessing global volcanism and atmospheric impacts. Some of the major aspects of TOMS-related research, listed below, will be reviewed and updated: (1) Measurement of volcanic SO2 clouds: Nimbus TOMS observed over 100 individual SO2 clouds during its mission lifetime; large explosive eruptions are now routinely and reliably measured by satellite. (2) Eruption processes: quantification of SO2 emissions has allowed assessments of eruption sulfur budgets, the evaluation of "excess" sulfur, and inferences of H2S emissions. (3) Detection of ash: TOMS data are now used to detect volcanic particulates in the atmosphere, providing complementary analyses to infrared methods of detection. Paired TOMS and AVHRR studies have provided invaluable information on volcanic cloud compositions and processes. (4) Cloud tracking and hazard mitigation: volcanic clouds can be considered gigantic tracers in the atmosphere, and studies of the fates of these clouds have led to new knowledge of their physical and chemical dispersion in the atmosphere for predictive models. (5) Global trends: the long-term data set has provided researchers with an unparalleled record of explosive volcanism, and forms a key component in assessing annual to decadal trends in global S emissions. (6) Atmospheric impacts: TOMS data have been linked to independent records of atmospheric change, in order to compare cause and effect processes following a massive injection of SO2 into the atmosphere. (7) Future TOMS instruments and applications: Nimbus TOMS has given way to new satellite platforms, with several wavelength and resolution modifications. New efforts to launch a geostationary TOMS could provide unprecedented observations of volcanic activity.
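
    Point (1) above, the quantification of SO2 clouds, rests on converting retrieved column amounts (in Dobson Units) into a total mass. The sketch below shows that unit conversion from physical constants; the column values and pixel size are placeholders, not actual TOMS retrievals.

```python
# Convert SO2 column amounts (Dobson Units) to a total cloud mass.
# 1 DU = 2.6867e16 molecules/cm^2; molar mass of SO2 = 64.06 g/mol.
AVOGADRO = 6.022e23          # molecules/mol
DU_TO_MOLEC_CM2 = 2.6867e16  # molecules/cm^2 per Dobson Unit
M_SO2 = 64.06                # g/mol

def so2_mass_kt(column_du, pixel_area_km2):
    """Total SO2 mass (kilotonnes) for a list of per-pixel column amounts."""
    total_g = 0.0
    for du in column_du:
        molec_per_cm2 = du * DU_TO_MOLEC_CM2
        g_per_cm2 = molec_per_cm2 * M_SO2 / AVOGADRO
        total_g += g_per_cm2 * pixel_area_km2 * 1e10   # 1 km^2 = 1e10 cm^2
    return total_g / 1e9                               # g -> kt

# Placeholder example: 100 pixels of 50 x 50 km, each with a 20 DU SO2 column.
print(f"{so2_mass_kt([20.0] * 100, pixel_area_km2=2500.0):.1f} kt SO2")
```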

  6. Landslide hazard mitigation in North America

    USGS Publications Warehouse

    Wieczorek, G.F.; Leahy, P.P.

    2008-01-01

    Active landslides throughout the states and territories of the United States result in extensive property loss and 25-50 deaths per year. The U.S. Geological Survey (USGS) has a long history of detailed examination of landslides since the work of Howe (1909) in the San Juan Mountains of Colorado. In the last four decades, landslide inventory maps and landslide hazard maps have depicted landslides of different ages, identified fresh landslide scarps, and indicated the direction of landslide movement for different regions of the states of Colorado, California, and Pennsylvania. Probability-based methods improve landslide hazard assessments. Rainstorms, earthquakes, wildfires, and volcanic eruptions can trigger landslides. Improvements in remote sensing of rainfall make it possible to issue landslide advisories and warnings for vulnerable areas. From 1986 to 1995, the USGS issued hazard warnings based on rainfall in the San Francisco Bay area. USGS workers also identified rainfall thresholds triggering landslides in Puerto Rico, Hawaii, Washington, and the Blue Ridge Mountains of central Virginia. Detailed onsite monitoring of landslides near highways in California and Colorado aided transportation officials. The USGS developed a comprehensive, multi-sector, and multi-agency strategy to mitigate landslide hazards nationwide. This study formed the foundation of the National Landslide Hazards Mitigation Strategy. The USGS, in partnership with the U.S. National Weather Service and the State of California, began to develop a real-time warning system for landslides from wildfires in Southern California as a pilot study in 2005.
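
    The rainfall thresholds mentioned above are commonly expressed as intensity-duration power laws. The sketch below uses the widely cited global threshold of Caine (1980), I = 14.82 D^-0.39 (I in mm/h, D in hours), purely as an illustration; the regional USGS thresholds referred to in the abstract differ by area and are not reproduced here.

```python
def caine_threshold_mm_per_hr(duration_hr):
    """Global rainfall intensity-duration threshold of Caine (1980): I = 14.82 * D**-0.39."""
    return 14.82 * duration_hr ** -0.39

def exceeds_threshold(mean_intensity_mm_per_hr, duration_hr):
    """True if a storm's mean intensity exceeds the illustrative landslide-triggering threshold."""
    return mean_intensity_mm_per_hr > caine_threshold_mm_per_hr(duration_hr)

# Placeholder storm: 90 mm of rain over 12 hours -> mean intensity 7.5 mm/h.
print(exceeds_threshold(90 / 12, duration_hr=12))   # threshold ~5.6 mm/h, so True
```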

  7. An economic and geographic appraisal of a spatial natural hazard risk: a study of landslide mitigation rules

    USGS Publications Warehouse

    Bernknopf, R.L.; Brookshire, D.S.; Campbell, R.H.; Shapiro, C.D.

    1988-01-01

    Efficient mitigation of natural hazards requires a spatial representation of the risk, based upon the geographic distribution of physical parameters and man-related development activities. Through such a representation, the spatial probability of landslides based upon physical science concepts is estimated for Cincinnati, Ohio. Mitigation programs designed to reduce loss from landslide natural hazards are then evaluated. An optimum mitigation rule is suggested that is spatially selective and is determined by objective measurements of hillside slope and properties of the underlying soil. -Authors
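
    The "objective measurements of hillside slope and properties of the underlying soil" that drive such a mitigation rule are often combined in an infinite-slope factor-of-safety model. The sketch below is a generic version of that model with placeholder soil parameters, not the study's calibrated Cincinnati values.

```python
import math

def infinite_slope_fs(slope_deg, depth_m, cohesion_kpa, friction_deg,
                      unit_weight_kn_m3=19.0, water_table_ratio=0.5,
                      gamma_w_kn_m3=9.81):
    """Factor of safety of an infinite slope (FS < 1 implies instability).

    water_table_ratio is the saturated fraction of the failure depth.
    FS = [c' + (gamma - m*gamma_w) z cos^2(beta) tan(phi')] / [gamma z sin(beta) cos(beta)]
    """
    beta = math.radians(slope_deg)
    phi = math.radians(friction_deg)
    resisting = (cohesion_kpa
                 + (unit_weight_kn_m3 - water_table_ratio * gamma_w_kn_m3)
                 * depth_m * math.cos(beta) ** 2 * math.tan(phi))
    driving = unit_weight_kn_m3 * depth_m * math.sin(beta) * math.cos(beta)
    return resisting / driving

# Placeholder hillside: 25 degree slope, 2 m of colluvium, c' = 5 kPa, phi' = 30 degrees.
print(f"FS = {infinite_slope_fs(25, 2.0, 5.0, 30.0):.2f}")
```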

  8. Earthquake Hazard Mitigation Strategy in Indonesia

    NASA Astrophysics Data System (ADS)

    Karnawati, D.; Anderson, R.; Pramumijoyo, S.

    2008-05-01

    Because of the active tectonic setting of the region, the risks of geological hazards inevitably increase in the Indonesian Archipelago and other Asian countries. Encouraging communities living in vulnerable areas to adapt to the geological conditions will be the most appropriate strategy for earthquake risk reduction. Updating the earthquake hazard maps, enhancing the existing land use management, establishing public education strategies and methods, strengthening linkages among stakeholders of disaster mitigation institutions, and establishing continuous public consultation are the main strategic programs for community resilience in earthquake-vulnerable areas. This paper highlights some important achievements of earthquake hazard mitigation programs in Indonesia, together with the difficulties in implementing such programs. Case examples of the Yogyakarta and Bengkulu earthquake mitigation efforts will also be discussed as lessons learned. The new approach for developing earthquake hazard maps, which begins by mapping the psychological aspects of the people living in vulnerable areas, will be addressed as well.

  9. Playing against nature: improving earthquake hazard mitigation

    NASA Astrophysics Data System (ADS)

    Stein, S. A.; Stein, J.

    2012-12-01

    The great 2011 Tohoku earthquake dramatically demonstrated the need to improve earthquake and tsunami hazard assessment and mitigation policies. The earthquake was much larger than predicted by hazard models, and the resulting tsunami overtopped coastal defenses, causing more than 15,000 deaths and $210 billion damage. Hence if and how such defenses should be rebuilt is a challenging question, because the defenses fared poorly and building ones to withstand tsunamis as large as March's is too expensive. A similar issue arises along the Nankai Trough to the south, where new estimates warning of tsunamis 2-5 times higher than in previous models raise the question of what to do, given that the timescale on which such events may occur is unknown. Thus in the words of economist H. Hori, "What should we do in face of uncertainty? Some say we should spend our resources on present problems instead of wasting them on things whose results are uncertain. Others say we should prepare for future unknown disasters precisely because they are uncertain". Thus society needs strategies to mitigate earthquake and tsunami hazards that make economic and societal sense, given that our ability to assess these hazards is poor, as illustrated by highly destructive earthquakes that often occur in areas predicted by hazard maps to be relatively safe. Conceptually, we are playing a game against nature "of which we still don't know all the rules" (Lomnitz, 1989). Nature chooses tsunami heights or ground shaking, and society selects the strategy to minimize the total costs of damage plus mitigation costs. As in any game of chance, we maximize our expectation value by selecting the best strategy, given our limited ability to estimate the occurrence and effects of future events. We thus outline a framework to find the optimal level of mitigation by balancing its cost against the expected damages, recognizing the uncertainties in the hazard estimates. This framework illustrates the role of the
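
    The framework sketched in this abstract treats mitigation as an optimization: pick the level of mitigation n that minimizes total cost, i.e. the up-front mitigation cost C(n) plus the expected (probability-weighted) loss L(n) it leaves behind. The toy search below illustrates that idea with entirely made-up cost and loss curves; it is not the authors' model.

```python
# Toy version of "mitigation cost + expected loss" minimization.
# All numbers are illustrative placeholders (costs in arbitrary currency units).

def mitigation_cost(n):
    """Cost of building defenses to level n (e.g., seawall height in metres)."""
    return 40.0 * n

def expected_loss(n, annual_prob=0.01, years=50, base_loss=10000.0):
    """Expected loss over the planning horizon; damage decays as defenses get higher."""
    damage_if_event = base_loss * 0.5 ** n          # each extra level halves the damage
    return annual_prob * years * damage_if_event

levels = [n * 0.5 for n in range(0, 21)]            # candidate mitigation levels 0..10
total = {n: mitigation_cost(n) + expected_loss(n) for n in levels}
best = min(total, key=total.get)
print(f"optimal mitigation level ~ {best}, total cost ~ {total[best]:.0f}")
```

    The same grid search with different assumed hazard probabilities shows how sensitive the "optimal" defense level is to the hazard estimate, which is the paper's central point about uncertainty.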

  10. Volcano hazard mitigation program in Indonesia

    USGS Publications Warehouse

    Sudradjat, A.

    1990-01-01

    Volcanological investigations in Indonesia were started in the 18th century, when Valentijn in 1726 prepared a chronological report of the eruption of Banda Api volcano, Maluku. Modern and intensive volcanological studies did not begin until the catastrophic eruption of Kelut volcano, East Java, in 1919. The eruption took 5,011 lives and destroyed thousands of acres of coffee plantation. An eruption lahar, generated by the crater lake water mixed with volcanic eruption products, was the cause of death for a high number of victims. An effort to mitigate the danger from volcanic eruption was first initiated in 1921 by constructing a tunnel to drain the crater lake water of Kelut volcano. At the same time a Volcanological Survey was established by the government with the responsibility of seeking every means for minimizing the hazard caused by volcanic eruption.

  11. Reduce toxic hazards using passive mitigation

    SciTech Connect

    Flamberg, S.A.; Torti, K.S.; Myers, P.M.

    1998-07-01

    The primary goal of the Risk Management Program Rule promulgated under Section 112(r) of the 1990 US Clean Air Act Amendments is to prevent the accidental release of those chemicals that pose the greatest threat to the public and the environment, and to encourage emergency preparedness to mitigate the severity of such releases. The Rule requires facility owners to identify, evaluate, and communicate to the public any potential worst-case scenarios that could involve accidental releases of toxic and flammable substances. A worst-case scenario is defined by the US Environmental Protection Agency (EPA; Washington, DC) as "…the release of the largest quantity of a regulated substance from a vessel or process line failure that results in the greatest distance to an endpoint." When designing systems to store or process hazardous materials, passive-mitigation methods--those that function without human, mechanical, or energy input--should be considered. Such systems contain or limit a potential release of hazardous materials. And, because they have no mechanical requirements, passive-mitigation techniques are considered more reliable than active methods, such as emergency-shutdown and water-spray systems. Passive mitigation should also be considered when defining potential release scenarios and modeling hazard zones.
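
    A concrete way to see why passive mitigation changes a worst-case scenario is through the source term of a liquid spill: evaporative release scales roughly with pool surface area, so a dike that confines the pool shrinks the release rate. The sketch below uses a generic mass-transfer form of the pool-evaporation rate with placeholder coefficients and a hypothetical chemical; it is not the EPA's prescribed RMP calculation.

```python
def evaporation_rate_kg_s(area_m2, vapor_pressure_pa, molar_mass_kg_mol,
                          mass_transfer_coeff_m_s=0.002, temp_k=298.15):
    """Rough evaporative emission rate from a liquid pool: E = k * A * P * M / (R * T)."""
    R = 8.314  # J/(mol K)
    return (mass_transfer_coeff_m_s * area_m2 * vapor_pressure_pa
            * molar_mass_kg_mol / (R * temp_k))

# Placeholder toxic liquid (P_vap ~ 12 kPa, M ~ 0.06 kg/mol) spilled from a vessel.
unconfined_area = 2000.0   # m2, thin pool spreading freely
diked_area = 100.0         # m2, pool confined by a dike (passive mitigation)

for label, area in [("unconfined", unconfined_area), ("diked", diked_area)]:
    rate = evaporation_rate_kg_s(area, 12_000.0, 0.06)
    print(f"{label:10s}: {rate:.2f} kg/s")
```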

  12. WHC natural phenomena hazards mitigation implementation plan

    SciTech Connect

    Conrads, T.J.

    1996-09-11

    Natural phenomena hazards (NPH) are unexpected acts of nature which pose a threat or danger to workers, the public or to the environment. Earthquakes, extreme winds (hurricane and tornado), snow, flooding, volcanic ashfall, and lightning strike are examples of NPH at Hanford. It is the policy of the U.S. Department of Energy (DOE) to design, construct and operate DOE facilities so that workers, the public and the environment are protected from NPH and other hazards. During 1993 the DOE Richland Operations Office (RL) transmitted DOE Order 5480.28, "Natural Phenomena Hazards Mitigation," to Westinghouse Hanford Company (WHC) for compliance. The Order includes rigorous new NPH criteria for the design of new DOE facilities as well as for the evaluation and upgrade of existing DOE facilities. In 1995 DOE issued Order 420.1, "Facility Safety," which contains the same NPH requirements and invokes the same applicable standards as Order 5480.28. It will supersede Order 5480.28 when an in-force date for Order 420.1 is established through contract revision. Activities will be planned and accomplished in four phases: Mobilization, Prioritization, Evaluation, and Upgrade. The basis for the graded approach is the designation of facilities/structures into one of five performance categories based upon safety function, mission and cost. This Implementation Plan develops the program for the Prioritization Phase, as well as an overall strategy for the implementation of DOE Order 5480.28.

  13. California Earthquakes: Science, Risks, and the Politics of Hazard Mitigation

    NASA Astrophysics Data System (ADS)

    Shedlock, Kaye M.

    "Politics" should be the lead word in the sub-title of this engrossing study of the emergence and growth of the California and federal earthquake hazard reduction infrastructures. Beginning primarily with the 1906 San Francisco earthquake, scientists, engineers, and other professionals cooperated and clashed with state and federal officials, the business community, " boosters," and the general public to create programs, agencies, and commissions to support earthquake research and hazards mitigation. Moreover, they created a "regulatory-state" apparatus that governs human behavior without sustained public support for its creation. The public readily accepts that earthquake research and mitigation are government responsibilities. The government employs or funds the scientists, engineers, emergency response personnel, safety officials, building inspectors, and others who are instrumental in reducing earthquake hazards. This book clearly illustrates how, and why all of this came to pass.

  14. The National Tsunami Hazard Mitigation Program

    NASA Astrophysics Data System (ADS)

    Bernard, E. N.

    2003-12-01

    The National Tsunami Hazard Mitigation Program (NTHMP) is a state/Federal partnership that was created to reduce the impacts of tsunamis on U.S. coastal areas. It is a coordinated effort between the states of Alaska, California, Hawaii, Oregon, and Washington and four Federal agencies: the National Oceanic and Atmospheric Administration (NOAA), the Federal Emergency Management Agency (FEMA), the U.S. Geological Survey (USGS), and the National Science Foundation (NSF). NOAA has led the effort to forge a solid partnership between the states and the Federal agencies because of its responsibility to provide tsunami warning services to the nation. This successful partnership has established a mitigation program in each state that is preparing coastal communities for the next tsunami. Inundation maps are now available for many of the coastal communities of Alaska, California, Hawaii, Oregon, and Washington. These maps are used to develop evacuation plans and, in the case of Oregon, for land use management. The partnership has successfully upgraded the warning capability in NOAA so that earthquakes can be detected within 5 minutes and tsunamis can be detected in the open ocean in real time, paving the way for improved tsunami forecasts. NSF's new Network for Earthquake Engineering Simulation (NEES) program has agreed to work with the NTHMP to focus tsunami research on national needs. An overview of the NTHMP will be given, including a discussion of accomplishments and the new collaboration with NEES.

  15. EVALUATION OF FOAMS FOR MITIGATING AIR POLLUTION FROM HAZARDOUS SPILLS

    EPA Science Inventory

    This program has been conducted to evaluate commercially available water base foams for mitigating the vapors from hazardous chemical spills. Foam systems were evaluated in the laboratory to define those foam properties which are important in mitigating hazardous vapors. Larger s...

  16. The National Tsunami Hazard Mitigation Program

    NASA Astrophysics Data System (ADS)

    Bernard, E. N.

    2004-12-01

    The National Tsunami Hazard Mitigation Program (NTHMP) is a state/Federal partnership that was created to reduce the impacts of tsunamis on U.S. coastal areas. It is a coordinated effort between the states of Alaska, California, Hawaii, Oregon, and Washington and four Federal agencies: the National Oceanic and Atmospheric Administration (NOAA), the Federal Emergency Management Agency (FEMA), the U.S. Geological Survey (USGS), and the National Science Foundation (NSF). NOAA has led the effort to forge a solid partnership between the states and the Federal agencies because of its responsibility to provide tsunami warning services to the nation. The successful partnership has established a mitigation program in each state that is developing tsunami-resilient coastal communities. Inundation maps are now available for many of the coastal communities of Alaska, California, Hawaii, Oregon, and Washington. These maps are used to develop evacuation plans and, in the case of Oregon, for land use management. The NTHMP mapping technology is now being applied to FEMA's Flood Insurance Rate Maps (FIRMs). The NTHMP has successfully upgraded the warning capability in NOAA so that earthquakes can be detected within 5 minutes and tsunamis can be detected in the open ocean in real time. Deep ocean reporting of tsunamis has already averted one unnecessary evacuation of Hawaii and demonstrated that real-time tsunami forecasting is now possible. NSF's new Network for Earthquake Engineering Simulation (NEES) program has agreed to work with the NTHMP to focus tsunami research on national needs. An overview of the NTHMP will be given, including a discussion of accomplishments and a progress report on NEES and FIRM activities.
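
    The difference between detecting the earthquake within minutes and detecting the tsunami itself in the open ocean makes sense once the wave's travel time is worked out: tsunamis propagate as shallow-water waves at roughly c = sqrt(g*h). The sketch below applies that textbook relation with placeholder depths and distances; it is not NOAA's forecast model.

```python
import math

G = 9.81  # m/s^2

def tsunami_speed_m_s(depth_m):
    """Shallow-water (long-wave) phase speed: c = sqrt(g * h)."""
    return math.sqrt(G * depth_m)

def travel_time_hr(distance_km, mean_depth_m):
    """Approximate travel time assuming a constant mean ocean depth along the path."""
    return distance_km * 1000.0 / tsunami_speed_m_s(mean_depth_m) / 3600.0

# Placeholder numbers: a deep-ocean basin ~4000 m deep, source 3000 km from the coast.
print(f"speed ~ {tsunami_speed_m_s(4000) * 3.6:.0f} km/h")
print(f"travel time ~ {travel_time_hr(3000, 4000):.1f} h")
```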

  17. Risk perception and volcanic hazard mitigation: Individual and social perspectives

    NASA Astrophysics Data System (ADS)

    Paton, Douglas; Smith, Leigh; Daly, Michele; Johnston, David

    2008-05-01

    This paper discusses how people's interpretation of their experience of volcanic hazards and public volcanic hazard education programs influences their risk perception and whether or not they adopt measures that can mitigate their risk. Drawing on four studies of volcanic risk perception and preparedness, the paper first examines why experiencing volcanic hazards need not necessarily motivate people to prepare for future volcanic crises. This work introduces how effective risk communication requires communities and civic agencies to play complementary roles in the risk management process. Next, the findings of a study evaluating the effectiveness of a public volcanic hazard education program introduce the important role that social interaction amongst community members plays in risk management. Building on the conclusions of these studies, a model that depicts preparing as a social process is developed and tested. The model predicts that it is the quality of the relationships between people, communities and civic agencies that determines whether people adopt measures that can reduce their risk from volcanic hazard consequences. The implications of the model for conceptualizing and delivering volcanic hazard public education programs in ways that accommodate these relationships are discussed.

  18. Debris flow hazards mitigation--Mechanics, prediction, and assessment

    USGS Publications Warehouse

    2007-01-01

    These proceedings contain papers presented at the Fourth International Conference on Debris-Flow Hazards Mitigation: Mechanics, Prediction, and Assessment held in Chengdu, China, September 10-13, 2007. The papers cover a wide range of topics on debris-flow science and engineering, including the factors triggering debris flows, geomorphic effects, mechanics of debris flows (e.g., rheology, fluvial mechanisms, erosion and deposition processes), numerical modeling, various debris-flow experiments, landslide-induced debris flows, assessment of debris-flow hazards and risk, field observations and measurements, monitoring and alert systems, structural and non-structural countermeasures against debris-flow hazards, and case studies. The papers reflect the latest developments and advances in debris-flow research. Several studies discuss the development and application of Geographic Information System (GIS) and Remote Sensing (RS) technologies in debris-flow hazard/risk assessment. Timely topics presented in a few papers also include the development of new or innovative techniques for debris-flow monitoring and alert systems, especially an infrasound acoustic sensor for detecting debris flows. Many case studies illustrate a wide variety of debris-flow hazards and related phenomena as well as their hazardous effects on human activities and settlements.

  19. Potentially Hazardous Objects (PHO) Mitigation Program

    NASA Astrophysics Data System (ADS)

    Huebner, Walter

    Southwest Research Institute (SwRI) and its partner, Los Alamos National Laboratory (LANL), are prepared to develop, implement, and expand procedures to avert collisions of potentially hazardous objects (PHOs) with Earth, as recommended by NASA in its White Paper "Near-Earth Object Survey and Deflection Analysis of Alternatives" requested by the US Congress and submitted to it in March 2007. In addition to developing the general mitigation program as outlined in the NASA White Paper, the program will be expanded to include aggressive mitigation procedures for small (e.g., Tunguska-sized) PHOs and other short warning-time PHOs such as some long-period comet nuclei. As a first step the program will concentrate on the most likely and critical cases, namely small objects and long-period comet nuclei with short warning times, but without losing sight of objects with longer warning times. Objects smaller than a few hundred meters are of interest because they are about 1000 times more abundant than kilometer-sized objects and are fainter and more difficult to detect, which may lead to short warning times and hence short reaction times. Yet even these small PHOs can have devastating effects, as the 30 June 1908 Tunguska event has shown. In addition, long-period comet nuclei, although relatively rare, can be large (sometimes tens of kilometers in size) and cannot be predicted because of their long orbital periods. Comet C/1983 H1 (IRAS-Araki-Alcock), for example, has an orbital period of 963.22 years, was discovered 27 April 1983, and passed Earth only two weeks later, on 11 May 1983, at a distance of 0.0312 AU. Aggressive methods and continuous alertness will be needed to defend against objects with such short warning times. While intact deflection of a PHO remains a key objective, destruction of a PHO and dispersion of the pieces must also be considered. The effectiveness of several alternative methods including nuclear demolition munitions, conventional explosives, and hyper
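
    The claim that even small PHOs can be devastating follows directly from kinetic energy scaling, E = 1/2 m v^2, with the mass set by the object's size and density. The sketch below evaluates that for a roughly Tunguska-sized stony body; the diameter, density, and speed are illustrative assumptions, not measured properties of the 1908 object.

```python
import math

MT_TNT_J = 4.184e15  # joules per megaton of TNT

def impact_energy_mt(diameter_m, density_kg_m3, speed_km_s):
    """Kinetic energy of an impactor, E = 0.5 * m * v**2, expressed in megatons of TNT."""
    radius = diameter_m / 2.0
    mass = density_kg_m3 * (4.0 / 3.0) * math.pi * radius ** 3
    energy_j = 0.5 * mass * (speed_km_s * 1000.0) ** 2
    return energy_j / MT_TNT_J

# Illustrative Tunguska-class body: ~50 m stony object arriving at ~20 km/s.
print(f"~{impact_energy_mt(50.0, 3000.0, 20.0):.0f} Mt TNT equivalent")
```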

  20. Influence of behavioral biases on the assessment of multi-hazard risks and the implementation of multi-hazard risks mitigation measures: case study of multi-hazard cyclone shelters in Tamil Nadu, India

    NASA Astrophysics Data System (ADS)

    Komendantova, Nadejda; Patt, Anthony

    2013-04-01

    In December 2004, a multiple hazards event devastated the Tamil Nadu province of India. The Sumatra-Andaman earthquake, with a magnitude of Mw=9.1-9.3, caused the Indian Ocean tsunami, with wave heights up to 30 m and flooding that reached up to two kilometers inland in some locations. More than 7,790 persons were killed in the province of Tamil Nadu, with 206 in its capital Chennai. The time lag between the earthquake and the tsunami's arrival in India was over an hour; therefore, if a suitable early warning system, a proper means of communicating the warning, and shelters for people had existed, then, while this would not have prevented the destruction of infrastructure, several thousand human lives could have been saved. India has over forty years of experience in the construction of cyclone shelters. With additional efforts and investment, these shelters could be adapted to other types of hazards such as tsunamis and flooding, and new multi-hazard cyclone shelters (MPCS) could be constructed. It would therefore be possible to mitigate one hazard such as cyclones by the construction of a network of shelters while at the same time adapting these shelters, with some additional investment, to also deal with, for example, tsunamis. In this historical case, the failure to consider multiple hazards caused significant human losses. The current paper investigates the patterns of the national decision-making process with regard to multiple hazards mitigation measures and how the presence of behavioral and cognitive biases influenced the perceptions of the probabilities of multiple hazards and the choices made for their mitigation by the national decision-makers. Our methodology was based on the analysis of existing reports from national and international organizations as well as available scientific literature on behavioral economics and natural hazards. The results identified several biases in the national decision-making process when the

  1. Destructive Interactions Between Mitigation Strategies and the Causes of Unexpected Failures in Natural Hazard Mitigation Systems

    NASA Astrophysics Data System (ADS)

    Day, S. J.; Fearnley, C. J.

    2013-12-01

    Large investments in the mitigation of natural hazards, using a variety of technology-based mitigation strategies, have proven to be surprisingly ineffective in some recent natural disasters. These failures reveal a need for a systematic classification of mitigation strategies; an understanding of the scientific uncertainties that affect the effectiveness of such strategies; and an understanding of how the different types of strategy within an overall mitigation system interact destructively to reduce the effectiveness of the overall mitigation system. We classify mitigation strategies into permanent, responsive and anticipatory. Permanent mitigation strategies such as flood and tsunami defenses or land use restrictions, are both costly and 'brittle': when they malfunction they can increase mortality. Such strategies critically depend on the accuracy of the estimates of expected hazard intensity in the hazard assessments that underpin their design. Responsive mitigation strategies such as tsunami and lahar warning systems rely on capacities to detect and quantify the hazard source events and to transmit warnings fast enough to enable at risk populations to decide and act effectively. Self-warning and voluntary evacuation is also usually a responsive mitigation strategy. Uncertainty in the nature and magnitude of the detected hazard source event is often the key scientific obstacle to responsive mitigation; public understanding of both the hazard and the warnings, to enable decision making, can also be a critical obstacle. Anticipatory mitigation strategies use interpretation of precursors to hazard source events and are used widely in mitigation of volcanic hazards. Their critical limitations are due to uncertainties in time, space and magnitude relationships between precursors and hazard events. Examples of destructive interaction between different mitigation strategies are provided by the Tohoku 2011 earthquake and tsunami; recent earthquakes that have impacted

  2. Volcanic hazards and their mitigation: Progress and problems

    NASA Astrophysics Data System (ADS)

    Tilling, Robert I.

    1989-05-01

    At the beginning of the twentieth century, volcanology began to emerge as a modern science as a result of increased interest in eruptive phenomena following some of the worst volcanic disasters in recorded history: Krakatau (Indonesia) in 1883 and Mont Pelée (Martinique), Soufrière (St. Vincent), and Santa María (Guatemala) in 1902. Volcanology is again experiencing a period of heightened public awareness and scientific growth in the 1980s, the worst period since 1902 in terms of volcanic disasters and crises. A review of hazards mitigation approaches and techniques indicates that significant advances have been made in hazards assessment, volcano monitoring, and eruption forecasting. For example, the remarkable accuracy of the predictions of dome-building events at Mount St. Helens since June 1980 is unprecedented. Yet a predictive capability for more voluminous and explosive eruptions still has not been achieved. Studies of magma-induced seismicity and ground deformation continue to provide the most systematic and reliable data for early detection of precursors to eruptions and shallow intrusions. In addition, some other geophysical monitoring techniques and geochemical methods have been refined and are being more widely applied and tested. Comparison of the four major volcanic disasters of the 1980s (Mount St. Helens, U.S.A., 1980; El Chichón, Mexico, 1982; Galunggung, Indonesia, 1982; and Nevado del Ruíz, Colombia, 1985) illustrates the importance of predisaster geoscience studies, volcanic hazards assessments, volcano monitoring, contingency planning, and effective communications between scientists and authorities. The death toll (>22,000) from the Ruíz catastrophe probably could have been greatly reduced; the reasons for the tragically ineffective implementation of evacuation measures are still unclear and puzzling in view of the fact that sufficient warnings were given. The most pressing problem in the mitigation of volcanic and associated hazards on

  3. Space options for tropical cyclone hazard mitigation

    NASA Astrophysics Data System (ADS)

    Dicaire, Isabelle; Nakamura, Ryoko; Arikawa, Yoshihisa; Okada, Kazuyuki; Itahashi, Takamasa; Summerer, Leopold

    2015-02-01

    This paper investigates potential space options for mitigating the impact of tropical cyclones on cities and civilians. Ground-based techniques combined with space-based remote sensing instrumentation are presented together with space-borne concepts employing space solar power technology. Two space-borne mitigation options are considered: atmospheric warming based on microwave irradiation and laser-induced cloud seeding based on laser power transfer. Finally technology roadmaps dedicated to the space-borne options are presented, including a detailed discussion on the technological viability and technology readiness level of our proposed systems. Based on these assessments, the space-borne cyclone mitigation options presented in this paper may be established in a quarter of a century.

  4. Mitigation of earthquake hazards using seismic base isolation systems

    SciTech Connect

    Wang, C.Y.

    1994-06-01

    This paper deals with mitigation of earthquake hazards using seismic base-isolation systems. A numerical algorithm is described for system response analysis of isolated structures with laminated elastomer bearings. The focus of this paper is on the adaptation of a nonlinear constitutive equation for the isolation bearing, and the treatment of foundation embedment for the soil-structure-interaction analysis. Sample problems are presented to illustrate the mitigating effect of using base-isolation systems.
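
    The paper's own nonlinear constitutive equation for the laminated elastomer bearing is not reproduced in this summary; a common simplification used to explain how such bearings behave is a bilinear hysteretic force-displacement law (stiff before yield, much softer after). The sketch below implements that generic idealization with placeholder bearing properties, not the paper's model.

```python
def bilinear_bearing_force(displacements, k1=8.0e6, k2=1.0e6, fy=1.0e5):
    """Restoring force history of a bilinear hysteretic isolation bearing.

    k1: pre-yield stiffness (N/m), k2: post-yield stiffness (N/m), fy: yield force (N).
    Elastic predictor with slope k1, clipped to the bounding lines +/-fy + k2*x,
    which makes the force follow the post-yield (k2) branch once it yields.
    """
    forces, f, x_prev = [], 0.0, 0.0
    for x in displacements:
        trial = f + k1 * (x - x_prev)       # elastic predictor
        f_upper = fy + k2 * x               # upper bound of the hysteresis loop
        f_lower = -fy + k2 * x              # lower bound
        f = min(max(trial, f_lower), f_upper)
        forces.append(f)
        x_prev = x
    return forces

# Placeholder displacement cycle (m): push, reverse, push again.
cycle = [0.0, 0.01, 0.02, 0.03, 0.02, 0.0, -0.02, -0.03, 0.0, 0.03]
print([round(f / 1e3, 1) for f in bilinear_bearing_force(cycle)])  # forces in kN
```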

  5. Rockslide susceptibility and hazard assessment for mitigation works design along vertical rocky cliffs: workflow proposal based on a real case-study conducted in Sacco (Campania), Italy

    NASA Astrophysics Data System (ADS)

    Pignalosa, Antonio; Di Crescenzo, Giuseppe; Marino, Ermanno; Terracciano, Rosario; Santo, Antonio

    2015-04-01

    The work presented here concerns a case study in which a complete multidisciplinary workflow was applied for an extensive assessment of rockslide susceptibility and hazard in a common scenario: vertical, fractured rocky cliffs. The study area is located in a high-relief zone in southern Italy (Sacco, Salerno, Campania), characterized by wide vertical rocky cliffs formed by tectonized thick successions of shallow-water limestones. The study comprised the following phases: a) topographic surveying integrating 3D laser scanning, photogrammetry and GNSS; b) geological surveying, characterization of single instabilities and geomechanical surveying, conducted by geologist rock climbers; c) processing of 3D data and reconstruction of high-resolution geometrical models; d) structural and geomechanical analyses; e) data filing in a GIS-based spatial database; f) geostatistical and spatial analyses and mapping of the whole data set; g) 3D rockfall analysis. The main goals of the study have been a) to set up an investigation method that achieves a complete and thorough characterization of the slope stability conditions and b) to provide a detailed basis for an accurate definition of the reinforcement and mitigation systems. For these purposes the most up-to-date methods of field surveying, remote sensing, 3D modelling and geospatial data analysis have been integrated in a systematic workflow, accounting for the economic sustainability of the whole project. A novel integrated approach has been applied, fusing deterministic and statistical surveying methods. This approach made it possible to deal with the wide extent of the studied area (nearly 200,000 m2) without compromising the accuracy of the results. The deterministic phase, based on field characterization of single instabilities and their further analysis on 3D models, has been applied to delineate the peculiarities of each single feature. The statistical approach, based on geostructural
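
    One of the routine structural checks in a workflow like this is a kinematic feasibility test for planar sliding on a discontinuity, based only on orientations and a friction angle. The sketch below implements the usual rule of thumb (the discontinuity dips less steeply than the slope face, more steeply than the friction angle, and daylights within a direction tolerance of the face); the tolerance and test orientations are illustrative, not values from the Sacco study.

```python
def planar_sliding_feasible(face_dip, face_dip_dir, joint_dip, joint_dip_dir,
                            friction_angle=30.0, dir_tolerance=20.0):
    """Kinematic (Markland-type) test for planar sliding on a rock slope.

    All angles in degrees. Feasible when the joint daylights in the face
    (dips less steeply than the face, within +/- dir_tolerance of its dip
    direction) and dips more steeply than the friction angle.
    """
    dir_diff = abs((joint_dip_dir - face_dip_dir + 180.0) % 360.0 - 180.0)
    return (dir_diff <= dir_tolerance
            and friction_angle < joint_dip < face_dip)

# Illustrative cliff face 80/150 (dip/dip direction) and a bedding joint 45/160.
print(planar_sliding_feasible(80.0, 150.0, 45.0, 160.0))  # True
```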

  6. Speakers urge a unified approach to mitigating natural hazards

    NASA Astrophysics Data System (ADS)

    White, M. Catherine

    On November 3, while wildfires consumed acres of coastal land in California, the U.S. Natural Hazards Symposium in Washington, D.C., addressed the threat of natural hazards in the United States, disaster mitigation and recovery, and the need to consider natural hazards in land development plans. Several of the scheduled speakers were unable to participate because they were called to California to investigate the fires, including keynote speaker James Witt, the new director of the Federal Emergency Management Agency (FEMA). Substitute keynote speaker Harvey Ryland, Witt's senior adviser at FEMA, emphasized that “we must sell mitigation as an effective means of protecting people and property.” He discussed FEMA's new “National Mitigation Strategy,” which will serve as the basis for its emergency management program. The strategy is expected to be in place by January 1995. As part of the approach, FEMA will establish a mitigation directorate to organize various disaster mitigation efforts in one office. Ryland also discussed the idea of creating risk reduction enterprise zones, designated high risk areas that would offer incentives to property owners who take proper mitigation measures. “Such incentives would be offset by reduced disaster assistance costs,” Ryland added.

  7. Mitigation options for accidental releases of hazardous gases

    SciTech Connect

    Fthenakis, V.M.

    1995-05-01

    The objective of this paper is to review and compare technologies available for mitigation of unconfined releases of toxic and flammable gases. These technologies include: secondary confinement, deinventory, vapor barriers, foam spraying, and water sprays/monitors. Guidelines for the design and/or operation of effective post-release mitigation systems and case studies involving actual industrial mitigation systems are also presented.

  8. Input space-dependent controller for multi-hazard mitigation

    NASA Astrophysics Data System (ADS)

    Cao, Liang; Laflamme, Simon

    2016-04-01

    Semi-active and active structural control systems are advanced mechanical devices and systems capable of high damping performance, ideal for mitigation of multi-hazards. The implementation of these devices within structural systems is still in its infancy because of the complexity of designing a robust closed-loop control system that can ensure reliable and high mitigation performance. Particular challenges in designing a controller for multi-hazard mitigation include: 1) very large uncertainties on dynamic parameters and unknown excitations; 2) limited measurements with probabilities of sensor failure; 3) immediate performance requirements; and 4) unavailable sets of input-output data during design. To facilitate the implementation of structural control systems, a new type of controller with high adaptive capabilities is proposed. It is based on real-time identification of an embedding that represents the essential dynamics found in the input space, or in the sensor measurements. These are termed input-space dependent controllers (ISDC). In this paper, the principle of ISDC is presented, their stability and performance are derived analytically for the case of harmonic inputs, and their performance is demonstrated for different types of hazards. Results show the promise of this new type of controller at mitigating multi-hazards by 1) relying on local and limited sensors only; 2) not requiring prior evaluation or training; and 3) adapting to system non-stationarities.
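
    The ISDC's "embedding that represents the essential dynamics found in the input space" is identified in real time from measurements; the specific construction is not given in this summary. A generic way to build such an input-space representation from a single sensor is a delay embedding, sketched below with placeholder parameters as a stand-in, not the authors' algorithm.

```python
import numpy as np

def delay_embed(signal, dim=3, lag=5):
    """Delay-coordinate embedding of a 1-D measurement stream.

    Returns an array of shape (N - (dim-1)*lag, dim) whose rows are
    [x(t), x(t - lag), ..., x(t - (dim-1)*lag)], a simple stand-in for an
    input-space representation built from limited sensor measurements.
    """
    signal = np.asarray(signal, dtype=float)
    n = len(signal) - (dim - 1) * lag
    return np.column_stack([signal[(dim - 1 - k) * lag : (dim - 1 - k) * lag + n]
                            for k in range(dim)])

# Placeholder "structural response" measurement: a noisy decaying oscillation.
t = np.linspace(0.0, 10.0, 500)
x = np.exp(-0.2 * t) * np.sin(2.0 * np.pi * 1.5 * t) + 0.01 * np.random.randn(t.size)
E = delay_embed(x, dim=3, lag=5)
print(E.shape)   # (490, 3)
```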

  9. Seismic hazard assessment and mitigation in India: an overview

    NASA Astrophysics Data System (ADS)

    Verma, Mithila; Bansal, Brijesh K.

    2013-07-01

    The Indian subcontinent is characterized by various tectonic units, viz., the Himalayan collision zone in the north, the Indo-Burmese arc in the north-east, failed rift zones in the interior of the Peninsular Indian shield, and the Andaman-Sumatra trench in the south-east Indian Territory. During the last about 100 years, the country has witnessed four great and several major earthquakes. Soon after the occurrence of the first great earthquake, the Shillong earthquake (Mw 8.1) in 1897, efforts were started to assess the seismic hazard in the country. The first such attempt was made by the Geological Survey of India in 1898, and since then considerable progress has been made. The current seismic zonation map, prepared and published by the Bureau of Indian Standards, broadly places seismic risk in different parts of the country into four major zones. However, this map is not sufficient for the assessment of area-specific seismic risks, necessitating detailed seismic zoning, that is, microzonation for earthquake disaster mitigation and management. Recently, seismic microzonation studies have been introduced in India, and the first level seismic microzonation has already been completed for selected urban centres including Jabalpur, Guwahati, Delhi, Bangalore, Ahmadabad, Dehradun, etc. The maps prepared for these cities are being further refined on larger scales as per the requirements, and a plan has also been firmed up for taking up microzonation of 30 selected cities, which lie in seismic zones V and IV and have a population density of half a million. The paper highlights the efforts made in India so far towards seismic hazard assessment as well as the future road map for such studies.

  10. New Approaches to Tsunami Hazard Mitigation Demonstrated in Oregon

    NASA Astrophysics Data System (ADS)

    Priest, G. R.; Rizzo, A.; Madin, I.; Lyles Smith, R.; Stimely, L.

    2012-12-01

    Oregon Department of Geology and Mineral Industries and Oregon Emergency Management collaborated over the last four years to increase tsunami preparedness for residents and visitors to the Oregon coast. Utilizing support from the National Tsunami Hazards Mitigation Program (NTHMP), new approaches to outreach and tsunami hazard assessment were developed and then applied. Hazard assessment was approached by first doing two pilot studies aimed at calibrating theoretical models to direct observations of tsunami inundation gleaned from the historical and prehistoric (paleoseismic/paleotsunami) data. The results of these studies were then submitted to peer-reviewed journals and translated into 1:10,000-12,000-scale inundation maps. The inundation maps utilize a powerful new tsunami model, SELFE, developed by Joseph Zhang at the Oregon Health & Science University. SELFE uses unstructured computational grids and parallel processing techniques to achieve fast, accurate simulation of tsunami interactions with fine-scale coastal morphology. The inundation maps were simplified into tsunami evacuation zones accessed as map brochures and an interactive mapping portal at http://www.oregongeology.org/tsuclearinghouse/. Unique in the world are new evacuation maps that show separate evacuation zones for distant versus locally generated tsunamis. The brochure maps explain that evacuation time is four hours or more for distant tsunamis but 15-20 minutes for local tsunamis, which are invariably accompanied by strong ground shaking. Since distant tsunamis occur much more frequently than local tsunamis, the two-zone maps avoid needless over-evacuation (and expense) caused by one-zone maps. Inundation mapping for the entire Oregon coast will be complete by ~2014. Educational outreach was accomplished first by doing a pilot study to measure the effectiveness of various approaches using before and after polling and then applying the most effective methods. In descending order, the most effective
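
    The case for separate local and distant evacuation zones comes down to simple arithmetic on warning time and walking speed, sketched below with commonly assumed placeholder values (a 1.1 m/s walking speed is a typical planning figure; it is not taken from the Oregon maps).

```python
def max_evacuation_distance_km(warning_minutes, walking_speed_m_s=1.1):
    """Distance a pedestrian can cover within the available warning time."""
    return warning_minutes * 60.0 * walking_speed_m_s / 1000.0

# Local tsunamis allow roughly 15-20 minutes; distant tsunamis allow four hours or more.
for label, minutes in [("local tsunami (~17 min)", 17.5), ("distant tsunami (4 h)", 240)]:
    print(f"{label}: ~{max_evacuation_distance_km(minutes):.1f} km of walking range")
```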

  11. Collaborative Monitoring and Hazard Mitigation at Fuego Volcano, Guatemala

    NASA Astrophysics Data System (ADS)

    Lyons, J. J.; Bluth, G. J.; Rose, W. I.; Patrick, M.; Johnson, J. B.; Stix, J.

    2007-05-01

    A portable, digital sensor network has been installed to closely monitor changing activity at Fuego volcano, taking advantage of an international collaborative effort among Guatemalan, U.S. and Canadian universities and the Peace Corps. The goal of this effort is to improve the understanding of shallow internal processes, and consequently to more effectively mitigate volcanic hazards. Fuego volcano has had more than 60 historical eruptions, and its nearly continuous activity makes it an ideal laboratory for studying volcanic processes. Close monitoring is needed to identify baseline activity and to rapidly identify and disseminate changes in activity which might threaten nearby communities. The sensor network comprises a miniature DOAS ultraviolet spectrometer fitted with a system for automated plume scans, a digital video camera, and two seismo-acoustic stations with portable dataloggers. These sensors are on loan from scientists who visited Fuego during short field seasons and donated use of their sensors to a resident Peace Corps Masters International student from Michigan Technological University for extended data collection. The sensor network is based around the local volcano observatory maintained by the Instituto Nacional de Sismologia, Vulcanologia, Meteorologia e Hidrologia (INSIVUMEH). INSIVUMEH provides local support and historical knowledge of Fuego activity as well as a secure location for storage of scientific equipment, data processing, and charging of the batteries that power the sensors. The complete sensor network came online in mid-February 2007, and here we present preliminary results from concurrent gas, seismic, and acoustic monitoring of activity at Fuego volcano.
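
    The scanning DOAS in such a network yields SO2 emission rates by integrating column amounts across a plume cross-section and multiplying by the plume speed. The sketch below shows that calculation in a generic form; the column values, scan spacing, and plume speed are placeholders rather than Fuego data.

```python
AVOGADRO = 6.022e23
M_SO2_G_MOL = 64.06

def so2_flux_t_per_day(columns_molec_cm2, step_m, plume_speed_m_s):
    """SO2 emission rate from one DOAS plume cross-section scan.

    columns_molec_cm2: column amounts at evenly spaced points across the plume;
    step_m: spacing between those points at plume distance.
    """
    # Integrate the column amount across the section: molecules per cm along the plume axis.
    molec_per_cm = sum(columns_molec_cm2) * step_m * 100.0
    molec_per_s = molec_per_cm * plume_speed_m_s * 100.0      # advect at the plume speed
    g_per_s = molec_per_s * M_SO2_G_MOL / AVOGADRO
    return g_per_s * 86400.0 / 1e6                             # g/s -> tonnes/day

# Placeholder scan: 20 points, 50 m apart, ~1e18 molecules/cm^2 in the plume core.
scan = [0, 0, 1e17, 5e17, 1e18, 1.5e18, 1e18, 5e17, 1e17, 0] * 2
print(f"~{so2_flux_t_per_day(scan, step_m=50.0, plume_speed_m_s=5.0):.0f} t/d")
```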

  12. Composite Materials for Hazard Mitigation of Reactive Metal Hydrides.

    SciTech Connect

    Pratt, Joseph William; Cordaro, Joseph Gabriel; Sartor, George B.; Dedrick, Daniel E.; Reeder, Craig L.

    2012-02-01

    In an attempt to mitigate the hazards associated with storing large quantities of reactive metal hydrides, polymer composite materials were synthesized and tested under simulated usage and accident conditions. The composites were made by polymerizing vinyl monomers using free-radical polymerization chemistry, in the presence of the metal hydride. Composites with vinyl-containing siloxane oligomers were also polymerized with and without added styrene and divinyl benzene. Hydrogen capacity measurements revealed that addition of the polymer to the metal hydride reduced the inherent hydrogen storage capacity of the material. The composites were found to be initially effective at reducing the amount of heat released during oxidation. However, upon cycling the composites, the mitigating behavior was lost. While the polymer composites we investigated have mitigating potential and are physically robust, they undergo a chemical change upon cycling that makes them subsequently ineffective at mitigating heat release upon oxidation of the metal hydride. Acknowledgements: The authors would like to thank the following people who participated in this project: Ned Stetson (U.S. Department of Energy) for sponsorship and support of the project. Ken Stewart (Sandia) for building the flow-through calorimeter and cycling test stations. Isidro Ruvalcaba, Jr. (Sandia) for qualitative experiments on the interaction of sodium alanate with water. Terry Johnson (Sandia) for sharing his expertise and knowledge of metal hydrides, and sodium alanate in particular. Marcina Moreno (Sandia) for programmatic assistance. John Khalil (United Technologies Research Corp) for insight into the hazards of reactive metal hydrides and real-world accident scenario experiments. Summary: In an attempt to mitigate and/or manage hazards associated with storing bulk quantities of reactive metal hydrides, polymer composite materials (a mixture of a mitigating polymer and a metal hydride) were synthesized and tested
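
    Part of the reported reduction in hydrogen storage capacity is simple dilution: the polymer adds mass that stores no hydrogen. The sketch below makes that arithmetic explicit with placeholder numbers (sodium alanate's reversible capacity is taken as roughly 5.6 wt%, and the 25 wt% polymer fraction is illustrative, not the report's formulation).

```python
def composite_capacity_wt_pct(hydride_capacity_wt_pct, polymer_mass_fraction):
    """Hydrogen capacity of a hydride/polymer composite, assuming the polymer stores no H2."""
    return hydride_capacity_wt_pct * (1.0 - polymer_mass_fraction)

# Illustrative: sodium alanate (~5.6 wt% reversible H2) with 25 wt% polymer binder.
print(f"{composite_capacity_wt_pct(5.6, 0.25):.2f} wt% H2")   # 4.20 wt%
```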

  13. GO/NO-GO - When is medical hazard mitigation acceptable for launch?

    NASA Technical Reports Server (NTRS)

    Hamilton, Douglas R.; Polk, James D.

    2005-01-01

    Medical support of spaceflight missions is composed of complex tasks and decisions dedicated to maintaining the health and performance of the crew and the completion of mission objectives. Spacecraft represent one of the most complex vehicles built by humans, and are built to very rigorous design specifications. In the course of a Flight Readiness Review (FRR) or a mission itself, the flight surgeon must be able to understand the impact of hazards and risks that may not be completely mitigated by design alone. Some hazards are not mitigated because they are never actually identified. When a hazard is identified, it must be reduced or waived. Hazards that cannot be designed out of the vehicle or mission are usually mitigated through other means to bring the residual risk to an acceptable level. This is possible in most engineered systems because failure modes are usually predictable and analysis can include taking these systems to failure. Medical support of space missions is complicated by the inability of flight surgeons to provide "exact" hazard and risk numbers to the NASA engineering community. Taking humans to failure is not an option. Furthermore, medical dogma is mostly composed of "medical prevention" strategies that mitigate risk by examining the behaviour of a cohort of humans similar to astronauts. Unfortunately, this approach does not lend itself well to predicting the effect of a hazard in the unique environment of space. This presentation will discuss how Medical Operations uses an evidence-based approach to decide if hazard mitigation strategies are adequate to reduce mission risk to acceptable levels. Case studies to be discussed will include: 1. Risk of electrocution during EVA; 2. Risk of a cardiac event during long- and short-duration missions; 3. Degraded cabin environmental monitoring on the ISS. Learning Objectives 1.) The audience will understand the challenges of mitigating medical risk caused by nominal and off

  14. 77 FR 24505 - Hazard Mitigation Assistance for Wind Retrofit Projects for Existing Residential Buildings

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-04-24

    ... SECURITY Federal Emergency Management Agency Hazard Mitigation Assistance for Wind Retrofit Projects for... comments on Hazard Mitigation Assistance for Wind Retrofit Projects for Existing Residential Buildings... property from hazards and their effects. One such activity is the implementation of wind retrofit...

  15. Deterministic and Nondeterministic Behavior of Earthquakes and Hazard Mitigation Strategy

    NASA Astrophysics Data System (ADS)

    Kanamori, H.

    2014-12-01

    Earthquakes exhibit both deterministic and nondeterministic behavior. Deterministic behavior is controlled by length and time scales such as the dimension of seismogenic zones and plate-motion speed. Nondeterministic behavior is controlled by the interaction of many elements, such as asperities, in the system. Some subduction zones have strong deterministic elements which allow forecasts of future seismicity. The forecasts of the 2010 Mw=8.8 Maule, Chile, earthquake and the 2012 Mw=7.6 Costa Rica earthquake, for example, were made within a solid scientific framework using GPS. However, even in these cases, because of the nondeterministic elements, uncertainties are difficult to quantify. In some subduction zones, nondeterministic behavior dominates because of complex plate boundary structures and defies useful forecasts. The 2011 Mw=9.0 Tohoku-Oki earthquake may be an example in which the physical framework was reasonably well understood, but complex interactions of asperities and insufficient knowledge about the subduction-zone structures led to the unexpected tragic consequence. Despite these difficulties, broadband seismology, GPS, and rapid data processing-telemetry technology can contribute to effective hazard mitigation through a scenario-earthquake approach and real-time warning. A scale-independent relation between M0 (seismic moment) and the source duration, t, can be used for the design of average scenario earthquakes. However, outliers caused by the variation of stress drop, radiation efficiency, and aspect ratio of the rupture plane are often the most hazardous and need to be included in scenario earthquakes. The recent development in real-time technology would help seismologists to cope with, and prepare for, devastating tsunamis and earthquakes. Combining a better understanding of earthquake diversity and modern technology is the key to effective and comprehensive hazard mitigation practices.
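
    The "scale-independent relation between M0 and the source duration" referred to here is, to leading order, a cube-root scaling: for constant stress drop, duration grows roughly as M0^(1/3). The sketch below applies that scaling anchored to an assumed reference event, together with the standard moment-magnitude definition; the anchor values are illustrative, not Kanamori's calibration.

```python
import math

def moment_magnitude(m0_nm):
    """Moment magnitude from seismic moment in N*m: Mw = (2/3) * (log10(M0) - 9.1)."""
    return (2.0 / 3.0) * (math.log10(m0_nm) - 9.1)

def scaled_duration_s(m0_nm, m0_ref_nm=4.0e22, duration_ref_s=160.0):
    """Source duration from cube-root scaling: t ~ t_ref * (M0 / M0_ref)**(1/3).

    Anchored (as an assumption) to a ~160 s duration for an Mw ~9 event.
    """
    return duration_ref_s * (m0_nm / m0_ref_nm) ** (1.0 / 3.0)

for m0 in (3.5e19, 1.1e21, 4.0e22):   # roughly Mw 7, 8, 9
    print(f"Mw {moment_magnitude(m0):.1f}: duration ~ {scaled_duration_s(m0):.0f} s")
```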

  16. 76 FR 23613 - Draft Programmatic Environmental Assessment for Hazard Mitigation Safe Room Construction

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-04-27

    ... SECURITY Federal Emergency Management Agency Draft Programmatic Environmental Assessment for Hazard Mitigation Safe Room Construction AGENCY: Federal Emergency Management Agency, DHS. ACTION: Notice of... Mitigation Grant Program (HMGP), the Federal Emergency Management Agency (FEMA) may provide funding...

  17. Meteorological Hazard Assessment and Risk Mitigation in Rwanda.

    NASA Astrophysics Data System (ADS)

    Nduwayezu, Emmanuel; Jaboyedoff, Michel; Bugnon, Pierre-Charles; Nsengiyumva, Jean-Baptiste; Horton, Pascal; Derron, Marc-Henri

    2015-04-01

    used in identifying the most risky areas. Finally, based on practical experience in this kind of fieldwork and on the documents produced, some recommendations for low-cost mitigation measures will be proposed. Reference: MIDIMAR, Impacts of floods and landslides on socio-economic development profile. Case study: Musanze District. Kigali, June 2012.

  18. Tsunami hazard mitigation in tourism in the tropical and subtropical coastal areas: a case study in the Ryukyu Islands, southwest of Japan

    NASA Astrophysics Data System (ADS)

    Matsumoto, T.

    2006-12-01

    Life and the economy (including tourism) in tropical and subtropical coastal areas such as Okinawa Prefecture (Ryukyu) rely heavily on the sea. The sea has both a "gentle" side that offers people healing and a "dangerous" side that can kill them. If we are going to use the sea for marine tourism, for example by constructing resort facilities on the oceanfront, we should understand the sea fully, including both of these sides and especially the nature of tsunamis. Islanders should also provide accurate information about the sea to outsiders, especially tourists visiting the islands. We have already learned a lesson about this issue from the Sumatra tsunami in 2004. However, measures against tsunami disasters by the marine tourism industry are still inadequate in these areas. The goals of tsunami hazard mitigation for those engaged in the tourism industry in tropical and subtropical coastal areas should be as follows. (1) Preparedness against tsunamis: "Be aware of the characteristics of tsunamis." "Prepare for tsunamis when you feel an earthquake." "Prepare for tsunamis when an earthquake takes place somewhere in the world." (2) Maintenance of an accurate tsunami hazard map based on quantitative analyses of the characteristics of tsunamis: "Areas flooded by tsunami attacks depend not only on altitude but also on amplification and inundation due to the seafloor topography near the coast and the onland topographic relief." "Tsunami damage happens repeatedly." (3) Maintenance of a tsunami disaster prevention manual and training based on the manual: "Who should do what in case of a tsunami?" "How should resort hotel employees lead guests to a safe place?" Such a policy for disaster prevention is discussed in the general education class "Ocean Sciences" at the University of the Ryukyus (UR) and in a summer school for high school students. The students (most of them from Okinawa Prefecture) consider, discuss and write reports about what to do in case of a tsunami as an islander

  19. Mitigating mountain hazards in Austria - legislation, risk transfer, and awareness building

    NASA Astrophysics Data System (ADS)

    Holub, M.; Fuchs, S.

    2009-04-01

    Embedded in the overall concept of integral risk management, mitigating mountain hazards rests on land use regulations, risk transfer, and information. In this paper, aspects of legislation related to natural hazards in Austria are summarised, with a particular focus on spatial planning activities and hazard mapping, and possible adaptations aimed at enhanced resilience are outlined. Furthermore, the system of risk transfer is discussed, highlighting the importance of creating incentives for risk-aware behaviour, above all with respect to individual precaution and insurance solutions. The issue of creating awareness through information is therefore essential, and is presented subsequently. The study results in recommendations on how administrative units at different federal and local levels could strengthen the enforcement of regulations aimed at minimising natural hazard risk. Moreover, the nexus to risk-transfer mechanisms is provided, focusing on the current compensation system in Austria and possible adjustments that would provide economic incentives for (private) investments in mitigation measures, i.e. local structural protection. These incentives should be supported by delivering information on hazard and risk in a target-oriented way to every stakeholder involved. Coping strategies therefore have to be adjusted, and the interaction between prevention and precaution has to be highlighted. The paper closes with recommendations on how these efforts could be achieved, with a particular focus on the situation in the Republic of Austria.

  20. Standards and Guidelines for Numerical Models for Tsunami Hazard Mitigation

    NASA Astrophysics Data System (ADS)

    Titov, V.; Gonzalez, F.; Kanoglu, U.; Yalciner, A.; Synolakis, C. E.

    2006-12-01

    An increased number of nations around the world need to develop tsunami mitigation plans, which invariably involve inundation maps for warning guidance and evacuation planning. There is a risk that inundation maps may be produced with older or untested methodology, as there are currently no standards for modeling tools. In the aftermath of the 2004 megatsunami, some models were used to model inundation for Cascadia events with results much larger than sediment records and existing state-of-the-art studies suggest, leading to confusion among emergency managers. Incorrectly assessing tsunami impact is hazardous, as recent events in 2006 in Tonga, Kythira, Greece and Central Java have suggested (Synolakis and Bernard, 2006). To calculate tsunami currents, forces and runup on coastal structures, and inundation of coastlines, one must calculate the evolution of the tsunami wave from the deep ocean to its target site numerically. No matter what the numerical model, validation (the process of ensuring that the model solves the parent equations of motion accurately) and verification (the process of ensuring that the model used represents geophysical reality appropriately) are both essential. Validation ensures that the model performs well in a wide range of circumstances and is accomplished through comparison with analytical solutions. Verification ensures that the computational code performs well over a range of geophysical problems. A few analytic solutions have themselves been validated with laboratory data. Even fewer existing numerical models have been both validated with the analytical solutions and verified with both laboratory measurements and field measurements, thus establishing a gold standard for numerical codes for inundation mapping. While there is in principle no absolute certainty that a numerical code that has performed well in all the benchmark tests will also produce correct inundation predictions with any given source motions, validated codes

  1. Climate resiliency: A unique multi-hazard mitigation approach.

    PubMed

    Baja, Kristin

    2016-01-01

    Baltimore's unique combination of shocks and stresses cuts across social, economic and environmental factors. Like many other post-industrial cities, over the past several decades, Baltimore has experienced a decline in its population -- resulting in a lower tax base. These trends have had deleterious effects on the city's ability to attend to much needed infrastructure improvements and human and social services. In addition to considerable social and economic issues, the city has begun to experience negative impacts due to climate change. The compounding nature of these trends has put Baltimore, like other post-industrial cities, in the position of having to do more with fewer available resources. Rather than wait for disaster to strike, Baltimore took a proactive approach to planning for shocks and stresses by determining unique ways to pre-emptively plan for and adapt to effects from climate change and incorporating these into the City's All Hazard Mitigation Plan. Since adopting the plan in 2013, Baltimore has been moving forward with various projects aimed at improving systems, enhancing adaptive capacity and building a more resilient and sustainable city. This paper describes the basis for the city's approach and offers a portrait of its efforts in order to broaden foundational knowledge of the emerging ways that cities are recasting the role of planning in light of unprecedented circumstances that demand complex solutions that draw on few resources. PMID:27318285

  2. ANALYSIS AND MITIGATION OF X-RAY HAZARD GENERATED FROM HIGH INTENSITY LASER-TARGET INTERACTIONS

    SciTech Connect

    Qiu, R.; Liu, J.C.; Prinz, A.A.; Rokni, S.H.; Woods, M.; Xia, Z.; /SLAC

    2011-03-21

    Interaction of a high intensity laser with matter may generate an ionizing radiation hazard. Very limited studies have been made, however, on the laser-induced radiation protection issue. This work reviews available literature on the physics and characteristics of laser-induced X-ray hazards. Important aspects include the laser-to-electron energy conversion efficiency, electron angular distribution, electron energy spectrum and effective temperature, and bremsstrahlung production of X-rays in the target. The possible X-ray dose rates for several femtosecond Ti:sapphire laser systems used at SLAC, including the short pulse laser system for the Matter in Extreme Conditions Instrument (peak power 4 TW and peak intensity 2.4 x 10^18 W/cm^2), were analysed. A graded approach to mitigate the laser-induced X-ray hazard with a combination of engineered and administrative controls is also proposed.
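
    The record mentions the effective temperature of laser-generated hot electrons, which, together with the conversion efficiency, drives the bremsstrahlung X-ray hazard. As background (an assumption here, not necessarily the scaling used by the authors), the ponderomotive scaling is often used to estimate that temperature from laser intensity and wavelength; the wavelength below is an assumed Ti:sapphire value.

```python
import math

def hot_electron_temp_mev(intensity_w_cm2, wavelength_um=0.8):
    """Ponderomotive ("Wilks") scaling for hot-electron temperature (MeV).

    T_hot ~ 0.511 * (sqrt(1 + I * lambda^2 / 1.37e18) - 1),
    with I in W/cm^2 and lambda in microns. Illustrative only.
    """
    a0_sq = intensity_w_cm2 * wavelength_um ** 2 / 1.37e18
    return 0.511 * (math.sqrt(1.0 + a0_sq) - 1.0)

# Peak intensity quoted for the MEC short-pulse laser; ~0.8 um wavelength assumed.
print(f"T_hot ~ {hot_electron_temp_mev(2.4e18):.2f} MeV")   # roughly 0.23 MeV
```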

  3. Mitigation of unconfined releases of hazardous gases via liquid spraying

    SciTech Connect

    Fthenakis, V.M.

    1997-02-01

    The capability of water sprays to mitigate clouds of hydrofluoric acid (HF) has been demonstrated in the large-scale Goldfish and Hawk field experiments, which took place at the DOE Nevada Test Site. The effectiveness of water sprays and fire-water monitors in removing HF from a vapor plume has also been studied theoretically using the model HGSPRAY5, with the near-field and far-field dispersion described by the HGSYSTEM models. This paper presents options for selecting and evaluating liquid spraying systems, based on industry experience and mathematical modeling.

  4. Local hazard mitigation plans: a preliminary estimation of state-level completion from 2004 to 2009.

    PubMed

    Jackman, Andrea M; Beruvides, Mario G

    2013-01-01

    According to the Disaster Mitigation Act of 2000 and subsequent federal policy, local governments are required to have a Hazard Mitigation Plan (HMP) written and approved by the Federal Emergency Management Agency (FEMA) to be eligible for federal mitigation assistance. This policy took effect on November 1, 2004. Using FEMA's database of approved HMPs and US Census Bureau's 2002 Survey of Local Governments, it is estimated that 3 years after the original deadline, 67 percent of the country's active local governments were without an approved HMP. A follow-up examination in 2009 of the eight states with the lowest completion percentages did not indicate significant improvement following the initial study and revealed inconsistencies in plan completion data over time. The completion percentage varied greatly by state and did not appear to follow any expected pattern such as wealth or hazard vulnerability that might encourage prompt completion of a plan. Further, the results indicate that approximately 92 percent of the approved plans were completed by a multijurisdictional entity, which suggests single governments seldom complete and gain approval for plans. Based on these results, it is believed that state-level resolution is not adequate for explaining the variation of plan completion, and further study at the local level is warranted. PMID:24180092

  5. Next-Generation GPS Station for Hazards Mitigation (Invited)

    NASA Astrophysics Data System (ADS)

    Bock, Y.

    2013-12-01

    Our objective is to better forecast, assess, and mitigate natural hazards, including earthquakes, tsunamis, and extreme storms and flooding through development and implementation of a modular technology for the next-generation in-situ geodetic station to support the flow of information from multiple stations to scientists, mission planners, decision makers, and first responders. The same technology developed under NASA funding can be applied to enhance monitoring of large engineering structures such as bridges, hospitals and other critical infrastructure. Meaningful warnings save lives when issued within 1-2 minutes for destructive earthquakes, several tens of minutes for tsunamis, and up to several hours for extreme storms and flooding, and can be provided by on-site fusion of multiple data types and generation of higher-order data products: GPS/GNSS and accelerometer measurements to estimate point displacements, and GPS/GNSS and meteorological measurements to estimate moisture variability in the free atmosphere. By operating semi-autonomously, each station can then provide low-latency, high-fidelity and compact data products within the constraints of narrow communications bandwidth that often accompanies natural disasters. We have developed a power-efficient, low-cost, plug-in Geodetic Module for fusion of data from in situ sensors including GPS, a strong-motion accelerometer module, and a meteorological sensor package, for deployment at existing continuous GPS stations in southern California; fifteen stations have already been upgraded. The low-cost modular design is scalable to the many existing continuous GPS stations worldwide. New on-the-fly data products are estimated with 1 mm precision and accuracy, including three-dimensional seismogeodetic displacements for earthquake, tsunami and structural monitoring and precipitable water for forecasting extreme weather events such as summer monsoons and atmospheric rivers experienced in California. Unlike more

  6. Aligning Natural Resource Conservation and Flood Hazard Mitigation in California

    PubMed Central

    Calil, Juliano; Beck, Michael W.; Gleason, Mary; Merrifield, Matthew; Klausmeyer, Kirk; Newkirk, Sarah

    2015-01-01

    Flooding is the most common and damaging of all natural disasters in the United States, and was a factor in almost all declared disasters in U.S. history. Direct flood losses in the U.S. in 2011 totaled $8.41 billion and flood damage has also been on the rise globally over the past century. The National Flood Insurance Program paid out more than $38 billion in claims since its inception in 1968, more than a third of which has gone to the one percent of policies that experienced multiple losses and are classified as “repetitive loss.” During the same period, the loss of coastal wetlands and other natural habitat has continued, and funds for conservation and restoration of these habitats are very limited. This study demonstrates that flood losses could be mitigated through action that meets both flood risk reduction and conservation objectives. We found that there are at least 11,243 km2 of land in coastal California that is both flood-prone and has natural resource conservation value, and where a property/structure buyout and habitat restoration project could meet multiple objectives. For example, our results show that in Sonoma County, the extent of land that meets these criteria is 564 km2. Further, we explore flood mitigation grant programs that can be a significant source of funds for such projects. We demonstrate that government funded buyouts followed by restoration of targeted lands can support social, environmental, and economic objectives: reduction of flood exposure, restoration of natural resources, and efficient use of limited governmental funds. PMID:26200353
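
    The core spatial analysis described above is an overlay of flood-prone land with land of conservation value. A minimal sketch of such an intersection using geopandas is shown below; the file names are hypothetical, and an equal-area projection (California Albers) is assumed so that areas come out in square kilometres.

```python
import geopandas as gpd

# Hypothetical input layers: flood-prone zones and lands with conservation value.
flood = gpd.read_file("flood_prone_zones.shp")
conservation = gpd.read_file("conservation_value_lands.shp")

# Reproject both layers to an equal-area CRS (California Albers, EPSG:3310)
# so that geometric areas are meaningful.
flood = flood.to_crs(epsg=3310)
conservation = conservation.to_crs(epsg=3310)

# Intersect the layers: polygons that are both flood-prone and of conservation value.
overlap = gpd.overlay(flood, conservation, how="intersection")

# Total overlapping area in km^2 (EPSG:3310 units are metres).
total_km2 = overlap.geometry.area.sum() / 1e6
print(f"Flood-prone land with conservation value: {total_km2:,.0f} km^2")
```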

  7. Numerical and probabilistic analysis of asteroid and comet impact hazard mitigation

    SciTech Connect

    Plesko, Catherine S; Weaver, Robert P; Huebner, Walter F

    2010-09-09

    The possibility of asteroid and comet impacts on Earth has received significant recent media and scientific attention. Still, there are many outstanding questions about the correct response once a potentially hazardous object (PHO) is found. Nuclear munitions are often suggested as a deflection mechanism because they have a high internal energy per unit launch mass. However, major uncertainties remain about the use of nuclear munitions for hazard mitigation. There are large uncertainties in a PHO's physical response to a strong deflection or dispersion impulse like that delivered by nuclear munitions. Objects smaller than 100 m may be solid, and objects at all sizes may be 'rubble piles' with large porosities and little strength. Objects with these different properties would respond very differently, so the effects of object properties must be accounted for. Recent ground-based observations and missions to asteroids and comets have improved the planetary science community's understanding of these objects. Computational power and simulation capabilities have improved such that it is possible to numerically model the hazard mitigation problem from first principles. Before we know that explosive yield Y at height h or depth -h from the target surface will produce a momentum change in or dispersion of a PHO, we must quantify energy deposition into the system of particles that make up the PHO. Here we present the initial results of a parameter study in which we model the efficiency of energy deposition from a stand-off nuclear burst onto targets made of PHO constituent materials.
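
    The parameter study described above sweeps burst yield and stand-off height or depth for different PHO constituent materials. The scaffold below only illustrates how such a sweep might be organized; the energy-deposition function is a placeholder for the radiation-hydrodynamics calculation, and the parameter ranges are assumed for illustration.

```python
import itertools

def deposited_energy_fraction(material, standoff_m):
    """Placeholder for the quantity the study actually computes with
    radiation-hydrodynamics simulations: the fraction of burst energy
    coupled into the target as a function of material and stand-off."""
    raise NotImplementedError("supplied by hydrocode simulations")

materials = ["porous rubble", "solid rock", "icy mixture"]   # assumed PHO analogues
yields_kt = [10.0, 100.0, 1000.0]                            # assumed burst yields (kt)
standoffs_m = [-10.0, 0.0, 50.0, 200.0]                      # depth (<0) or height (>0)

# Sweep the parameter space; each case would be handed to the simulation pipeline.
for material, yield_kt, standoff in itertools.product(materials, yields_kt, standoffs_m):
    case_id = f"{material}_{yield_kt:g}kt_{standoff:+g}m"
    print("queueing case:", case_id)
```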

  8. Implementation strategies for U.S. DOE Order 5480.28 Natural Phenomena Hazards Mitigation

    SciTech Connect

    Conrads, T.J.

    1995-01-01

    This paper describes the strategies used by Westinghouse Hanford Company for implementing a new U.S. Department of Energy Order 5480.28, Natural Phenomena Hazards Mitigation. The order requires that all new and existing structures, systems, and components be designed and evaluated for the effects of natural phenomena (seismic, wind, flood, and volcano) applicable at a given site. It also requires that instrumentation be available to record the expected seismic events and that procedures be available to inspect facilities for damage following a natural phenomena event. This order requires that probabilistic hazards studies be conducted for the applicable natural phenomena to determine appropriate loads to be applied in a graded approach to structures, systems, and components important to safety. This paper discusses the processes, tasks, and methods used to implement this directive, which altered the standard design basis for new and existing structures, systems, and components at the Hanford Site. It also addresses a correlation between the performance category nomenclature of DOE Order 5480.28 and the safety classification described in DOE Order 5480.23, Nuclear Safety Analysis Reports. This correlation was deemed to be a prerequisite for the cost-effective implementation of the new DOE Order on natural phenomena hazards mitigation.

  9. Evaluating fuel complexes for fire hazard mitigation planning in the southeastern United States.

    SciTech Connect

    Andreu, Anne G.; Shea, Dan; Parresol, Bernard, R.; Ottmar, Roger, D.

    2012-01-01

    Fire hazard mitigation planning requires an accurate accounting of fuel complexes to predict potential fire behavior and effects of treatment alternatives. In the southeastern United States, rapid vegetation growth coupled with complex land use history and forest management options requires a dynamic approach to fuel characterization. In this study we assessed potential surface fire behavior with the Fuel Characteristic Classification System (FCCS), a tool which uses inventoried fuelbed inputs to predict fire behavior. Using inventory data from 629 plots established in the upper Atlantic Coastal Plain, South Carolina, we constructed FCCS fuelbeds representing median fuel characteristics by major forest type and age class. With a dry fuel moisture scenario and 6.4 km h-1 midflame wind speed, the FCCS predicted moderate to high potential fire hazard for the majority of the fuelbeds under study. To explore fire hazard under potential future fuel conditions, we developed fuelbeds representing the range of quantitative inventory data for fuelbed components that drive surface fire behavior algorithms and adjusted shrub species composition to represent 30% and 60% relative cover of highly flammable shrub species. Results indicate that the primary drivers of surface fire behavior vary by forest type, age and surface fire behavior rating. Litter tends to be a primary or secondary driver in most forest types. In comparison to other surface fire contributors, reducing shrub loading results in reduced flame lengths most consistently across forest types. FCCS fuelbeds and the results from this project can be used for fire hazard mitigation planning throughout the southern Atlantic Coastal Plain where similar forest types occur. The approach of building simulated fuelbeds across the range of available surface fuel data produces sets of incrementally different fuel characteristics that can be applied to any dynamic forest types in which surface fuel conditions change rapidly.
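
    FCCS rates surface fire behavior with its own reformulated algorithms; as a generic illustration of how a fireline intensity maps to the flame lengths discussed above, Byram's (1959) relation is often used (shown here as background, not as the FCCS formulation).

```python
def byram_flame_length_m(fireline_intensity_kw_m):
    """Byram (1959): flame length (m) ~ 0.0775 * I^0.46, with I in kW/m."""
    return 0.0775 * fireline_intensity_kw_m ** 0.46

# Illustrative intensities spanning low to high surface fire behavior.
for intensity in (100.0, 500.0, 2000.0):
    print(f"I = {intensity:6.0f} kW/m -> flame length ~ {byram_flame_length_m(intensity):.1f} m")
```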

  10. Spatio-temporal patterns of hazards and their use in risk assessment and mitigation. Case study of road accidents in Romania

    NASA Astrophysics Data System (ADS)

    Catalin Stanga, Iulian

    2013-04-01

    the spatial or temporal clustering of crash accidents. Since the 1990s, Geographical Information Systems (GIS) have become a very important tool for traffic and road safety management, allowing not only spatial and multifactorial analysis but also graphical and non-graphical outputs. The current paper presents an accessible GIS methodology to study the spatio-temporal pattern of injury-related road accidents, to identify high-density accident zones, to perform a cluster analysis, to create multicriterial typologies, and to identify and explain spatial and temporal similarities. For this purpose, a Geographical Information System was created, allowing a complex analysis that involves not only the events but also a large set of interrelated and spatially linked attributes. The GIS includes the accidents as georeferenced point elements with a spatially linked attribute database: identification information (date, location details); accident type; main, secondary and aggravating causes; data about the driver; vehicle information; and consequences (damages, injured people and fatalities). Each attribute has its own numeric code that allows both statistical analysis and spatial interrogation. The database includes those road accidents that led to physical injuries and loss of human lives between 2007 and 2012, and the spatial analysis was carried out using TNTmips 7.3 software. Data aggregation and processing allowed creating the spatial pattern of injury-related road accidents through kernel density estimation at three different levels (national - Romania; county level - Iasi County; local level - Iasi town). Spider graphs were used to create the temporal pattern of road accidents at three levels (daily, weekly and monthly) directly related to their causes. Moreover, the spatial and temporal database relates natural hazards (glazed frost, fog, and blizzard) to human-made ones, giving the opportunity to evaluate the nature of uncertainties in risk
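
    The kernel density estimation step described above can be sketched with standard Python tools; the coordinates below are synthetic stand-ins for the georeferenced accident points (the study itself uses TNTmips).

```python
import numpy as np
from scipy.stats import gaussian_kde

# Synthetic stand-in for projected accident coordinates (metres), one row per accident.
rng = np.random.default_rng(0)
accidents = rng.normal(loc=[[5000.0, 3000.0]], scale=800.0, size=(500, 2))

# Fit a 2-D Gaussian kernel density estimate to the point pattern.
kde = gaussian_kde(accidents.T)

# Evaluate the density on a regular grid to map high-density accident zones.
x = np.linspace(0, 10000, 100)
y = np.linspace(0, 8000, 80)
xx, yy = np.meshgrid(x, y)
density = kde(np.vstack([xx.ravel(), yy.ravel()])).reshape(xx.shape)

print("highest-density cell value:", density.max())
```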

  12. Threshold effects of hazard mitigation in coastal human-environmental systems

    NASA Astrophysics Data System (ADS)

    Lazarus, E. D.

    2014-01-01

    Despite improved scientific insight into physical and social dynamics related to natural disasters, the financial cost of extreme events continues to rise. This paradox is particularly evident along developed coastlines, where future hazards are projected to intensify with consequences of climate change, and where the presence of valuable infrastructure exacerbates risk. By design, coastal hazard mitigation buffers human activities against the variability of natural phenomena such as storms. But hazard mitigation also sets up feedbacks between human and natural dynamics. This paper explores developed coastlines as exemplary coupled human-environmental systems in which hazard mitigation is the key coupling mechanism. Results from a simplified numerical model of an agent-managed seawall illustrate the nonlinear effects that economic and physical thresholds can impart into coastal human-environmental system dynamics. The scale of mitigation action affects the time frame over which human activities and natural hazards interact. By accelerating environmental changes observable in some settings over human timescales of years to decades, climate change may temporarily strengthen the coupling between human and environmental dynamics. However, climate change could ultimately result in weaker coupling at those human timescales as mitigation actions increasingly engage global-scale systems.
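
    The simplified seawall model is not specified in this record; the toy simulation below is a hypothetical illustration of the kind of threshold behavior described, in which an agent raises a seawall only after accumulated damage crosses an economic threshold, so the coupled trajectory changes nonlinearly with that threshold.

```python
import random

def simulate(years=100, damage_threshold=5.0, wall_height=1.0, seed=1):
    """Toy agent-managed seawall: storms cause damage only when they overtop the
    wall; the agent raises the wall after cumulative damage exceeds an economic
    threshold. All parameters are illustrative, not from the published model."""
    random.seed(seed)
    cumulative_damage, raises = 0.0, 0
    for _ in range(years):
        surge = random.expovariate(1.0)          # storm surge height (arbitrary units)
        overtop = max(0.0, surge - wall_height)  # hazard only when the wall is overtopped
        cumulative_damage += overtop
        if cumulative_damage > damage_threshold: # economic threshold triggers mitigation
            wall_height += 0.5
            cumulative_damage = 0.0
            raises += 1
    return wall_height, raises

for threshold in (2.0, 5.0, 10.0):
    height, raises = simulate(damage_threshold=threshold)
    print(f"threshold={threshold:4.1f}: final wall={height:.1f}, raises={raises}")
```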

  13. Threshold effects of hazard mitigation in coastal human-environmental systems

    NASA Astrophysics Data System (ADS)

    Lazarus, E. D.

    2013-10-01

    Despite improved scientific insight into physical and social dynamics related to natural disasters, the financial cost of extreme events continues to rise. This paradox is particularly evident along developed coastlines, where future hazards are projected to intensify with consequences of climate change, and where the presence of valuable infrastructure exacerbates risk. By design, coastal hazard mitigation buffers human activities against the variability of natural phenomena such as storms. But hazard mitigation also sets up feedbacks between human and natural dynamics. This paper explores developed coastlines as exemplary coupled human-environmental systems in which hazard mitigation is the key coupling mechanism. Results from a simplified numerical model of an agent-managed seawall illustrate the nonlinear effects that economic and physical thresholds can impart into coupled-system dynamics. The scale of mitigation action affects the time frame over which human activities and natural hazards interact. By accelerating environmental changes observable in some settings over human time scales of years to decades, climate change may temporarily strengthen the coupling between human and environmental dynamics. However, climate change could ultimately result in weaker coupling at those human time scales as mitigation actions increasingly engage global-scale systems.

  14. Assessing the costs of hazard mitigation through landscape interventions in the urban structure

    NASA Astrophysics Data System (ADS)

    Bostenaru-Dan, Maria; Aldea Mendes, Diana; Panagopoulos, Thomas

    2014-05-01

    In this paper we look at an issue that is rarely approached: the economic efficiency of natural hazard risk mitigation. The urban scale at which a natural hazard can have an impact makes urban planning strategy important in risk management. Usually, however, the natural, engineering, and social sciences deal with it, and the role of architecture and urban planning is neglected. Climate change can lead to risks related to increased floods, desertification, and sea level rise, among others. Reducing sealed surfaces in cities through green spaces in crowded centres can mitigate these risks and can be foreseen in restructuring plans in the presence or absence of disasters. For this purpose we reviewed the role in games of green spaces and community centres such as churches, which can form the core of restructuring efforts, as field and archive studies also show. We look at the way ICT can help organize the information, from the building survey to economic computations, in direct modeling or through games. The roles of game theory, agent-based modeling, networks, and urban public policies in designing decision systems for risk management are discussed. The game rules are supported by our field and archive studies as well as by research by design. We also consider a rarely addressed element, the role of landscape planning, through the inclusion of green elements in reconstruction after natural and man-made disasters, or in restructuring efforts to mitigate climate change. Apart from the existing old city fabric, the landscape can also be endangered by speculation, and it is therefore vital to highlight its high economic value in this particular case as well. As ICOMOS highlights for the 2014 congress, heritage and landscape are two sides of the same coin. A landscape can become or be connected to a community centre (the first being necessary for building a settlement, the second raising its value), or can build connections between landmarks in urban routes

  15. The price of safety: costs for mitigating and coping with Alpine hazards

    NASA Astrophysics Data System (ADS)

    Pfurtscheller, C.; Thieken, A. H.

    2013-10-01

    Due to limited public budgets and the need to economize, analysing the costs of hazard mitigation and emergency management becomes increasingly important for public natural hazard and risk management. In recent years there has been a growing body of literature on the estimation of losses, which has helped to determine the benefits of measures in terms of prevented losses. By contrast, the costs of mitigation are hardly addressed. This paper thus aims to shed some light on expenses for mitigation and emergency services. For this, we analysed the annual costs of mitigation efforts in four regions/countries of the Alpine Arc: Bavaria (Germany), Tyrol (Austria), South Tyrol (Italy) and Switzerland. On the basis of PPP values (purchasing power parities), annual expenses on public safety ranged from EUR 44 per capita in the Free State of Bavaria to EUR 216 in the Autonomous Province of South Tyrol. To analyse the (variable) costs of emergency services in case of an event, we used detailed data from the 2005 floods in the Federal State of Tyrol (Austria) as well as aggregated data from the 2002 floods in Germany. The analysis revealed that multi-hazards, the occurrence and intermixture of different natural hazard processes, contribute to increasing emergency costs. Based on these findings, research gaps and recommendations for costing Alpine natural hazards are discussed.

  16. Assessment and mitigation of combustible dust hazards in the plastics industry

    NASA Astrophysics Data System (ADS)

    Stern, Michael C.; Ibarreta, Alfonso; Myers, Timothy J.

    2015-05-01

    A number of recent industrial combustible dust fires and explosions, some involving powders used in the plastics industry, have led to heightened awareness of combustible dust hazards, increased regulatory enforcement, and changes to the current standards and regulations. This paper provides a summary of the fundamentals of combustible dust explosion hazards, comparing and contrasting combustible dust to flammable gases and vapors. The types of tests used to quantify and evaluate the potential hazard posed by plastic dusts are explored. Recent changes in NFPA 654, a standard applicable to combustible dust in the plastics industry, are also discussed. Finally, guidance on the primary methods for prevention and mitigation of combustible dust hazards is provided.
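
    Among the standard dust-explosibility tests referred to above, closed-vessel pressure-rise measurements are commonly summarized by the deflagration index K_St via the cube-root law; the snippet below shows that generic relation (the specific tests discussed in the paper may differ), with an illustrative 20 L vessel reading.

```python
def deflagration_index_kst(max_dp_dt_bar_s, vessel_volume_m3=0.020):
    """Cube-root law: K_St = (dP/dt)_max * V^(1/3), in bar*m/s.

    (dP/dt)_max is the maximum rate of pressure rise (bar/s) measured in a
    closed test vessel of volume V (m^3); 20 L is a common test volume."""
    return max_dp_dt_bar_s * vessel_volume_m3 ** (1.0 / 3.0)

# Illustrative reading from a 20 L sphere test (not data from the paper).
print(f"K_St ~ {deflagration_index_kst(550.0):.0f} bar*m/s")
```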

  17. An establishment on the hazard mitigation system of large scale landslides for Zengwen reservoir watershed management in Taiwan

    NASA Astrophysics Data System (ADS)

    Tsai, Kuang-Jung; Lee, Ming-Hsi; Chen, Yie-Ruey; Huang, Meng-Hsuan; Yu, Chia-Ching

    2016-04-01

    Extremely heavy rainfall, with accumulated amounts of more than 2900 mm within a continuous 3-day event, occurred in southern Taiwan during Typhoon Morakot in August 2009 and has been recognized as a serious natural disaster. Very destructive large-scale landslides and debris flows were induced by this heavy rainfall event. According to the satellite image processing and monitoring project conducted by the Soil and Water Conservation Bureau after Typhoon Morakot, more than 10,904 landslide sites with a total sliding area of 18,113 ha were identified. In this research, field investigations of all landslide areas were also carried out, covering disaster type, scale and location in relation to topographic conditions, colluvium soil characteristics, bedrock formation and geological structure after the Morakot disaster. The mechanisms, characteristics and behavior of these large-scale landslides combined with debris-flow disasters are analyzed and investigated in order to work out the interaction of the factors noted above and to identify the extent of rainfall-induced landslide disasters during the study period. To reduce the disaster risk of large-scale landslides and debris flows, an adaptation strategy and hazard mitigation system should be set up as soon as possible, taking into consideration slope-land conservation, landslide control countermeasure planning, disaster database establishment, environmental impact analysis and disaster risk assessment. As a result, this 3-year research has focused on field investigation using GPS/GIS/RS integration, studies of the mechanisms and behavior of rainfall-induced landslide occurrence, and the establishment of a disaster database and hazard mitigation system. This project has become an important issue of serious concern to the government and the people of Taiwan. Hopefully, all results from this research can be used as guidance for disaster prevention and

  18. The Diversity of Large Earthquakes and Its Implications for Hazard Mitigation

    NASA Astrophysics Data System (ADS)

    Kanamori, Hiroo

    2014-05-01

    With the advent of broadband seismology and GPS, significant diversity in the source radiation spectra of large earthquakes has been clearly demonstrated. This diversity requires different approaches to mitigate hazards. In certain tectonic environments, seismologists can forecast the future occurrence of large earthquakes within a solid scientific framework using the results from seismology and GPS. Such forecasts are critically important for long-term hazard mitigation practices, but because stochastic fracture processes are complex, the forecasts are inevitably subject to large uncertainty, and unexpected events will continue to surprise seismologists. Recent developments in real-time seismology will help seismologists to cope with and prepare for tsunamis and earthquakes. Combining a better understanding of earthquake diversity with modern technology is the key to effective and comprehensive hazard mitigation practices.

  19. Numerical and Probabilistic Analysis of Asteroid and Comet Impact Hazard Mitigation

    NASA Astrophysics Data System (ADS)

    Plesko, C.; Weaver, R.; Huebner, W.

    2010-09-01

    The possibility of asteroid and comet nucleus impacts on Earth has received significant recent media and scientific attention. Still, there are many outstanding questions about the correct response once a potentially hazardous object (PHO) is found. Nuclear explosives are often suggested as a deflection mechanism because they have a high internal energy per unit launch mass. However, major uncertainties remain about the use of nuclear explosives for hazard mitigation. There are large uncertainties in a PHO’s physical response to a strong deflection or dispersion impulse like that delivered by nuclear munitions. Objects smaller than 100 m may be solid, and objects at all sizes may be “rubble piles” with large porosities and little strength [1]. Objects with these different properties would respond very differently, so the effects of object properties must be accounted for. Recent ground-based observations and missions to asteroids and comets have improved the planetary science community’s understanding of these objects. Computational power and simulation capabilities have improved to such an extent that it is possible to numerically model the hazard mitigation problem from first principles. Before we know that explosive yield Y at height h or depth -h from the target surface will produce a momentum change in or dispersion of a PHO, we must quantify the energy deposition into the system of particles that make up the PHO. Here we present the initial results of a parameter study in which we model the efficiency of energy deposition from a stand-off nuclear burst onto targets made of PHO constituent materials.

  20. Fourth DOE Natural Phenomena Hazards Mitigation Conference: Proceedings. Volume 1

    SciTech Connect

    Not Available

    1993-12-31

    This conference allowed an interchange in the natural phenomena area among designers, safety professionals, and managers. The papers presented in Volume I of the proceedings are from sessions I - VIII which cover the general topics of: DOE standards, lessons learned and walkdowns, wind, waste tanks, ground motion, testing and materials, probabilistic seismic hazards, risk assessment, base isolation and energy dissipation, and lifelines and floods. Individual papers are indexed separately. (GH)

  1. Department of Energy Natural Phenomena Hazards Mitigation Program

    SciTech Connect

    Murray, R.C.

    1993-09-01

    This paper will present a summary of past and present accomplishments of the Natural Phenomena Hazards Program that has been ongoing at Lawrence Livermore National Laboratory since 1975. The natural phenomena covered include earthquakes; winds, hurricanes, and tornadoes; flooding and precipitation; lightning; and volcanic events. The work is organized into four major areas: (1) policy, requirements, standards, and guidance; (2) technical support and research and development; (3) technology transfer; and (4) oversight.

  2. Advances(?) in mitigating volcano hazards in Latin America

    USGS Publications Warehouse

    Hall, M.L.

    1991-01-01

    The 1980's were incredible years for volcanology. As a consequence of the Mount St. Helens and other eruptions, major advances in our understanding of volcanic processes and eruption dynamics were made. The decade also witnessed the greatest death toll caused by volcanism since 1902. Following Mount St. Helens, awareness of volcano hazards increased throughout the world; however, in Latin America, subsequent events showed that much was still to be learned.

  3. Development Of An Open System For Integration Of Heterogeneous Models For Flood Forecasting And Hazard Mitigation

    NASA Astrophysics Data System (ADS)

    Chang, W.; Tsai, W.; Lin, F.; Lin, S.; Lien, H.; Chung, T.; Huang, L.; Lee, K.; Chang, C.

    2008-12-01

    During a typhoon or a heavy storm event, various forecasting models can technically be used to predict rainfall intensity, water level variation in rivers, and flooding in urban areas. In practice, however, the following two issues tend to restrict the further application of these models as a decision support system (DSS) for hazard mitigation. The first is the difficulty of integrating heterogeneous models: one has to take into consideration the different formats used by the models, such as input files, output files, computational requirements, and so on. The second is that, because of the heterogeneity of models and systems, the development of a DSS requires a friendly user interface or platform that hides the complexity of the various tools from users. Users are expected to be government officials rather than professional experts, so a complicated DSS interface is not acceptable. Based on the above considerations, in the present study we develop an open system for the integration of several simulation models for flood forecasting by adopting the FEWS (Flood Early Warning System) platform developed by WL | Delft Hydraulics. It allows us to link heterogeneous models effectively and provides suitable display modules. In addition, FEWS has been adopted by the Water Resources Agency (WRA), Taiwan as the standard operational system for river flooding management, which means this work can readily be integrated with practical cases. In the present study, based on the FEWS platform, the basin rainfall-runoff model, the SOBEK channel-routing model, and an estuary tide forecasting model are linked and integrated through the physical connection of model initial and boundary definitions. The workflow of the integrated model processes is shown in Fig. 1. This differs from the typical single-model linking used in FEWS, which only aims at data exchange without much physical consideration. So it really
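
    The integration problem described above, passing one model's outputs into the next model's initial and boundary conditions, can be sketched as a simple adapter chain. The model functions and data keys below are hypothetical placeholders, not the actual FEWS or SOBEK interfaces.

```python
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class ModelStep:
    """One forecasting model in the chain, wrapped behind a uniform interface."""
    name: str
    run: Callable[[dict], dict]   # takes boundary/initial conditions, returns outputs

def rainfall_runoff(conditions: dict) -> dict:
    # Hypothetical stand-in: convert basin rainfall to a discharge hydrograph.
    return {"discharge_m3s": [1.2 * r for r in conditions["rainfall_mm"]]}

def channel_routing(conditions: dict) -> dict:
    # Hypothetical stand-in: route the upstream hydrograph to downstream stage.
    return {"stage_m": [0.01 * q for q in conditions["discharge_m3s"]]}

def run_chain(steps: List[ModelStep], initial: dict) -> dict:
    state = dict(initial)
    for step in steps:
        state.update(step.run(state))   # each model's outputs feed the next model
    return state

chain = [ModelStep("rainfall-runoff", rainfall_runoff),
         ModelStep("channel routing", channel_routing)]
print(run_chain(chain, {"rainfall_mm": [0.0, 10.0, 25.0, 5.0]}))
```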

  4. Looking before we leap: an ongoing, quantitative investigation of asteroid and comet impact hazard mitigation

    SciTech Connect

    Plesko, Catherine S; Weaver, Robert P; Bradley, Paul A; Huebner, Walter F

    2010-01-01

    There are many outstanding questions about the correct response to an asteroid or comet impact threat on Earth. Nuclear munitions are currently thought to be the most efficient method of delivering an impact-preventing impulse to a potentially hazardous object (PHO). However, there are major uncertainties about the response of PHOs to a nuclear burst, and the most appropriate ways to use nuclear munitions for hazard mitigation.

  5. Earthquake Hazard Mitigation Using a Systems Analysis Approach to Risk Assessment

    NASA Astrophysics Data System (ADS)

    Legg, M.; Eguchi, R. T.

    2015-12-01

    The earthquake hazard mitigation goal is to reduce losses due to severe natural events. The first step is to conduct a seismic risk assessment consisting of (1) hazard estimation, (2) vulnerability analysis, and (3) exposure compilation. Seismic hazards include ground deformation, shaking, and inundation. The hazard estimation may be probabilistic or deterministic. Probabilistic Seismic Hazard Assessment (PSHA) is generally applied to site-specific risk assessments, but may involve large areas as in a National Seismic Hazard Mapping program. Deterministic hazard assessments are needed for geographically distributed exposure such as lifelines (infrastructure), but may also be important for large communities. Vulnerability evaluation includes quantification of fragility for construction or components, including personnel. Exposure represents the existing or planned construction, facilities, infrastructure, and population in the affected area. Risk (expected loss) is the product of the quantified hazard, vulnerability (damage algorithm), and exposure, and may be used to prepare emergency response plans, retrofit existing construction, or guide community planning to avoid hazards. The risk estimate provides data needed to acquire earthquake insurance to assist with effective recovery following a severe event. Earthquake scenarios used in deterministic risk assessments provide detailed information on where hazards may be most severe, which system components are most susceptible to failure, and the combined effects of a severe earthquake on the whole system or community. Casualties (injuries and deaths) have been the primary factor in defining building codes for seismic-resistant construction. Economic losses may be equally significant factors that can influence proactive hazard mitigation. Large urban earthquakes may produce catastrophic losses due to a cascading of effects often missed in PSHA. Economic collapse may ensue if damaged workplaces, disruption of utilities, and
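
    The risk formulation summarized above (expected loss as the product of hazard, vulnerability, and exposure) can be written out directly; all probabilities, damage ratios, and asset values below are illustrative placeholders, not values from the study.

```python
# Expected loss = sum over assets of P(hazard level) * damage ratio * exposed value.
# All values below are illustrative, not from the study.

hazard = {"moderate shaking": 0.10, "severe shaking": 0.02}   # annual occurrence probabilities
assets = [
    {"name": "hospital", "value": 50e6,
     "damage_ratio": {"moderate shaking": 0.05, "severe shaking": 0.40}},
    {"name": "bridge", "value": 20e6,
     "damage_ratio": {"moderate shaking": 0.02, "severe shaking": 0.60}},
]

expected_loss = sum(
    prob * asset["damage_ratio"][level] * asset["value"]
    for level, prob in hazard.items()
    for asset in assets
)
print(f"Annualized expected loss: ${expected_loss:,.0f}")
```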

  6. Assessment of indirect losses and costs of emergency for project planning of alpine hazard mitigation

    NASA Astrophysics Data System (ADS)

    Amenda, Lisa; Pfurtscheller, Clemens

    2013-04-01

    Owing to increased settlement in hazardous areas and rising asset values, natural disasters such as floods, landslides and rockfalls cause high economic losses in Alpine lateral valleys. Especially in small municipalities, indirect losses, mainly stemming from the breakdown of transport networks, and the costs of emergency response can reach critical levels. A quantification of these losses is necessary to estimate the worthiness of mitigation measures, to determine the appropriate level of disaster assistance and to improve risk management strategies. Comprehensive approaches are available for assessing direct losses. However, indirect losses and costs of emergency response are largely not assessed, and the empirical basis for estimating these costs is weak. To address the resulting uncertainties in project appraisals, a standardized methodology has been developed dealing with local economic effects and the emergency efforts needed. In our approach, the cost-benefit analysis for technical mitigation of the Austrian Torrent and Avalanche Control (TAC) will be optimized and extended, using as a design event the 2005 debris flow that struck a small town in the upper Inn valley in southwest Tyrol (Austria). In that event, 84 buildings were affected, 430 people were evacuated and, in response, the TAC implemented protection measures costing 3.75 million euros. Upgrading the TAC method and analyzing to what extent the cost-benefit ratio changes is one of the main objectives of this study. For estimating short-run indirect effects and costs of emergency response at the local level, data were collected via questionnaires, field mapping and guided interviews, as well as intensive literature research. Based on this, up-to-date calculation methods were developed and the TAC cost-benefit analysis was recalculated with these newly implemented results. The cost-benefit ratio will be more precise and specific and hence the decision of which mitigation alternative will be carried out

  7. Monitoring Fogo Island, Cape Verde Archipelago, for Volcanic Hazard Mitigation

    NASA Astrophysics Data System (ADS)

    Faria, B. V.; Heleno, S. I.; Barros, I. J.; d'Oreye, N.; Bandomo, Z.; Fonseca, J. F.

    2001-12-01

    Fogo Island, in the Cape Verde Archipelago (North Atlantic), with a total area of 476 km2 and a population of about 40000, is an active ocean island volcano rising from an average sea-bottom depth of the order of -3000 m to a maximum altitude of 2820 m. All of the 28 historically recorded eruptions (Ribeiro, 1960) since the arrival of the first settlers in the 15th century took place in Cha das Caldeiras, a 9 km-wide flat zone 1700 meters above sea level that resulted from the infill of a large lateral-collapse caldera (Day et al., 2000). The last eruptions occurred in 1951 and 1995, through secondary cones at the base of Pico do Fogo, the main volcanic edifice. A tall scarp surrounds Cha das Caldeiras on its western side only, and the eastern limit leads to a very steep sub-aerial slope down to the coastline. With this morphology, the volcanic hazard is significant inside Cha das Caldeiras - with a resident population of the order of 800 - and particularly in the villages of the eastern coast. Because the magma has low viscosity, eruptions on Fogo have scarce precursory activity, and forecasting them is therefore challenging. The VIGIL monitoring network was installed between 1997 and 2001 and is currently in full operation. It consists of seven seismographic stations - two of which are broadband - four tilt stations, a CO2 monitoring station and a meteorological station. The data are telemetered in real time to the central laboratory on the neighboring island of Santiago and analyzed on a daily basis. The continuous data acquisition is complemented by periodic GPS, gravity and leveling surveys (Lima et al., this conference). In this paper we present the methodology adopted to monitor the level of volcanic activity of Fogo Volcano and show examples of the data being collected. Anomalous data recorded at the end of September 2000, which led to the only alert warning issued so far, are also presented and discussed.

  8. Mitigation of EMU Cut Glove Hazard from Micrometeoroid and Orbital Debris Impacts on ISS Handrails

    NASA Technical Reports Server (NTRS)

    Ryan, Shannon; Christiansen, Eric L.; Davis, Bruce A.; Ordonez, Erick

    2009-01-01

    Recent cut damage sustained on crewmember gloves during extravehicular activity (EVA) onboard the International Space Station (ISS) has been caused by contact with sharp edges or a pinch point, according to analysis of the damage. One potential source is protruding sharp-edged crater lips from micrometeoroid and orbital debris (MMOD) impacts on metallic handrails along EVA translation paths. A number of hypervelocity impact tests were performed on ISS handrails, and it was found that mm-sized projectiles were capable of inducing crater lip heights two orders of magnitude above the minimum value for glove abrasion concerns. Two techniques were evaluated for mitigating the cut glove hazard of MMOD impacts on ISS handrails: flexible overwraps, which act to limit contact between crewmember gloves and impact sites; and alternate materials, which form less hazardous impact crater profiles. In parallel with redesign efforts to increase the cut resilience of EMU gloves, the modifications to ISS handrails evaluated in this study provide the means to significantly reduce the cut glove risk from MMOD impact craters.

  9. Quantifying the effect of early warning systems for mitigating risks from alpine hazards

    NASA Astrophysics Data System (ADS)

    Straub, Daniel; Sättele, Martina; Bründl, Michael

    2016-04-01

    Early warning systems (EWS) are increasingly applied as flexible and non-intrusive measures for mitigating risks from alpine hazards. They are typically planned and installed in an ad-hoc manner and their effectiveness is not quantified, in contrast to structural risk mitigation measures. The effect of an EWS on the risk depends on human decision makers: experts interpret the signals from EWS, authorities decide on intervention measures and the public responds to the warnings. This interaction of the EWS with humans makes the quantification of their effectiveness challenging. Nevertheless, such a quantification is an important step in understanding, improving and justifying the use of EWS. We systematically discuss and demonstrate the factors that influence EWS effectiveness for alpine hazards, and present approaches and tools for analysing them. These include Bayesian network models, which are a powerful tool for an integral probabilistic assessment. The theory is illustrated through applications of warning systems for debris flow and rockfall hazards. References: Sättele M., Bründl M., Straub D. (in print). Quantifying the effectiveness of early warning systems for natural hazards. Natural Hazards and Earth System Sciences. Sättele M., Bründl M., Straub D. (2015). Reliability and Effectiveness of Warning Systems for Natural Hazards: Concepts and Application to Debris Flow Warning. Reliability Engineering & System Safety, 142: 192-202
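
    Quantifying EWS effectiveness hinges on chaining the probabilities of technical detection, warning issuance, and human response; a minimal sketch with assumed probabilities (not values from the cited studies) is shown below.

```python
def ews_risk_reduction(p_detect, p_warn_given_detect, p_respond_given_warn):
    """Fraction of the unmitigated risk removed by an early warning system,
    assuming loss is avoided only when detection, warning and response all occur."""
    return p_detect * p_warn_given_detect * p_respond_given_warn

# Assumed component reliabilities for a debris-flow warning system (illustrative).
reduction = ews_risk_reduction(p_detect=0.95, p_warn_given_detect=0.90, p_respond_given_warn=0.70)
print(f"Risk reduction ~ {reduction:.0%} of the unmitigated risk")
```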

  10. The U.S. National Tsunami Hazard Mitigation Program: Successes in Tsunami Preparedness

    NASA Astrophysics Data System (ADS)

    Whitmore, P.; Wilson, R. I.

    2012-12-01

    Formed in 1995 by Congressional action, the National Tsunami Hazard Mitigation Program (NTHMP) provides the framework for tsunami preparedness activities in the United States. The Program consists of the 28 U.S. coastal states, territories, and commonwealths (STCs), as well as three Federal agencies: the National Oceanic and Atmospheric Administration (NOAA), the Federal Emergency Management Agency (FEMA), and the United States Geological Survey (USGS). Since its inception, the NTHMP has advanced tsunami preparedness in the United States through accomplishments in many areas: - Coordination and funding of tsunami hazard analysis and preparedness activities in STCs; - Development and execution of a coordinated plan to address education and outreach activities (materials, signage, and guides) within its membership; - Leadership of the effort to assist communities in meeting National Weather Service (NWS) TsunamiReady guidelines through development of evacuation maps and other planning activities; - Determination of tsunami hazard zones in most highly threatened coastal communities throughout the country by detailed tsunami inundation studies; - Development of a benchmarking procedure for numerical tsunami models to ensure models used in the inundation studies meet consistent NOAA standards; - Creation of a national tsunami exercise framework to test tsunami warning system response; - Funding of community tsunami warning dissemination and reception systems such as sirens and NOAA Weather Radios; and - Provision of guidance to NOAA's Tsunami Warning Centers regarding warning dissemination and content. NTHMP activities have advanced the state of preparedness of United States coastal communities, and have helped save lives and property during recent tsunamis. Program successes as well as future plans, including maritime preparedness, are discussed.

  11. New Activities of the U.S. National Tsunami Hazard Mitigation Program, Mapping and Modeling Subcommittee

    NASA Astrophysics Data System (ADS)

    Wilson, R. I.; Eble, M. C.

    2013-12-01

    The U.S. National Tsunami Hazard Mitigation Program (NTHMP) comprises representatives from coastal states and federal agencies who, under the guidance of NOAA, work together to develop protocols and products to help communities prepare for and mitigate tsunami hazards. Within the NTHMP are several subcommittees responsible for complementary aspects of tsunami assessment, mitigation, education, warning, and response. The Mapping and Modeling Subcommittee (MMS) is comprised of state and federal scientists who specialize in tsunami source characterization, numerical tsunami modeling, inundation map production, and warning forecasting. Until September 2012, much of the work of the MMS was authorized through the Tsunami Warning and Education Act, an Act that has since expired but the spirit of which is being adhered to in parallel with reauthorization efforts. Over the past several years, the MMS has developed guidance and best practices for states and territories to produce accurate and consistent tsunami inundation maps for community level evacuation planning, and has conducted benchmarking of numerical inundation models. Recent tsunami events have highlighted the need for other types of tsunami hazard analyses and products for improving evacuation planning, vertical evacuation, maritime planning, land-use planning, building construction, and warning forecasts. As the program responsible for producing accurate and consistent tsunami products nationally, the NTHMP-MMS is initiating a multi-year plan to accomplish the following: 1) Create and build on existing demonstration projects that explore new tsunami hazard analysis techniques and products, such as maps identifying areas of strong currents and potential damage within harbors as well as probabilistic tsunami hazard analysis for land-use planning. 2) Develop benchmarks for validating new numerical modeling techniques related to current velocities and landslide sources. 3) Generate guidance and protocols for

  12. Defending Planet Earth: Near-Earth Object Surveys and Hazard Mitigation Strategies

    NASA Technical Reports Server (NTRS)

    2010-01-01

    The United States spends approximately four million dollars each year searching for near-Earth objects (NEOs). The objective is to detect those that may collide with Earth. The majority of this funding supports the operation of several observatories that scan the sky searching for NEOs. This, however, is insufficient for detecting the majority of NEOs that may present a tangible threat to humanity. A significantly smaller amount of funding supports ways to protect the Earth from such a potential collision, or "mitigation." In 2005, a Congressional mandate called for NASA to detect 90 percent of NEOs with diameters of 140 meters or greater by 2020. Defending Planet Earth: Near-Earth Object Surveys and Hazard Mitigation Strategies identifies the need for detection of objects as small as 30 to 50 meters as these can be highly destructive. The book explores four main types of mitigation including civil defense, "slow push" or "pull" methods, kinetic impactors and nuclear explosions. It also asserts that responding effectively to hazards posed by NEOs requires national and international cooperation. Defending Planet Earth: Near-Earth Object Surveys and Hazard Mitigation Strategies is a useful guide for scientists, astronomers, policy makers and engineers.

  13. RADON MITIGATION STUDIES: NASHVILLE DEMONSTRATION

    EPA Science Inventory

    The report gives results of an EPA radon mitigation demonstration project involving 14 houses in the Nashville, TN, area with indoor radon levels of 5.6-47.6 pCi/L, using a variety of techniques, designed to be the most cost effective methods possible to implement, and yet adequa...

  14. Developing a scientific procedure for community based hazard mapping and risk mitigation

    NASA Astrophysics Data System (ADS)

    Verrier, M.

    2011-12-01

    As an international exchange student from the Geological Sciences Department at San Diego State University (SDSU), I joined the KKN-PPM program at Universitas Gadjah Mada (UGM), Yogyakarta, Indonesia, in July 2011 for 12 days (July 4th to July 16th) of its two-month duration (July 4th to August 25th). The KKN-PPM group I was attached to was designated 154 and was focused on Plosorejo Village, Karanganyar, Kerjo, Central Java, Indonesia. The mission of KKN-PPM 154 was to survey Plosorejo village for existing landslides, to generate a simple hazard susceptibility map that can be understood by local villagers, and then to begin disseminating that map in the community. To generate our susceptibility map we first conducted a geological survey of the existing landslides in the field study area, with a focus on determining landslide triggers and gauging the susceptibility of areas to future landslides. The methods for gauging susceptibility included lithological observation, the presence of linear cracking, visible loss of structural integrity in structures such as villagers' homes, and collaboration with local residents and with the local rescue and response team. Three colors were used to represent susceptibility: green, where there is no immediate danger of landslide damage; orange, where transportation routes are at risk of being disrupted by landslides; and red, where imminent landslide potential puts a home in direct danger. The landslide inventory and susceptibility data were compiled into digital media such as CorelDraw, ArcGIS and Google Earth. Once a technical map was generated, we presented it to the village leadership for confirmation and modification based on their experience. Finally, we began to use the technical susceptibility map to draft evacuation routes and meeting points in the event of landslides, as well as simple susceptibility maps that can be understood and utilized by local villagers. Landslide mitigation

  15. Almost strict liability: Wind River Petroleum and the Utah Hazardous Substance Mitigation Act

    SciTech Connect

    1996-12-31

    In Wind River, the Utah Supreme Court developed a two-step liability standard. The court ruled that under the act, statutorily responsible parties are strictly liable for any release of hazardous material from their facility. Among responsible parties, liability is to be apportioned on an equitable contribution standard. However, the Utah Legislature has subsequently amended the Mitigation Act to prohibit the application of unapportioned strict liability. Therefore, Wind River can no longer be relied upon as the law regarding liability under the Mitigation Act.

  16. The influence of hazard models on GIS-based regional risk assessments and mitigation policies

    USGS Publications Warehouse

    Bernknopf, R.L.; Rabinovici, S.J.M.; Wood, N.J.; Dinitz, L.B.

    2006-01-01

    Geographic information systems (GIS) are important tools for understanding and communicating the spatial distribution of risks associated with natural hazards in regional economies. We present a GIS-based decision support system (DSS) for assessing community vulnerability to natural hazards and evaluating potential mitigation policy outcomes. The Land Use Portfolio Modeler (LUPM) integrates earth science and socioeconomic information to predict the economic impacts of loss-reduction strategies. However, the potential use of such systems in decision making may be limited when multiple but conflicting interpretations of the hazard are available. To explore this problem, we conduct a policy comparison using the LUPM to test the sensitivity of three available assessments of earthquake-induced lateral-spread ground failure susceptibility in a coastal California community. We find that the uncertainty regarding the interpretation of the science inputs can influence the development and implementation of natural hazard management policies. Copyright © 2006 Inderscience Enterprises Ltd.
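
    The sensitivity issue raised in this record can be illustrated with a deliberately small numerical example: the same mitigation investment is scored under three alternative susceptibility interpretations, and the preferred target location can change with the interpretation. The locations, probabilities, and dollar values below are invented for illustration and are not LUPM inputs or outputs.

      # Toy sensitivity check: expected mitigation benefit per location under
      # three alternative hazard interpretations. All values are hypothetical.
      import numpy as np

      locations = ["A", "B", "C"]
      value_at_risk = np.array([10.0, 20.0, 15.0])   # $M exposed per location
      # Rows: alternative susceptibility interpretations; columns: locations.
      failure_prob = np.array([
          [0.10, 0.02, 0.05],
          [0.04, 0.06, 0.05],
          [0.08, 0.03, 0.09],
      ])
      mitigation_effectiveness = 0.5                 # fraction of loss avoided

      expected_benefit = failure_prob * value_at_risk * mitigation_effectiveness
      for i, row in enumerate(expected_benefit, start=1):
          best = locations[int(np.argmax(row))]
          print(f"interpretation {i}: benefits {np.round(row, 2)}, best target {best}")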

  17. Planning ahead for asteroid and comet hazard mitigation, phase 1: parameter space exploration and scenario modeling

    SciTech Connect

    Plesko, Catherine S; Clement, R Ryan; Weaver, Robert P; Bradley, Paul A; Huebner, Walter F

    2009-01-01

    The mitigation of impact hazards resulting from Earth-approaching asteroids and comets has received much attention in the popular press. However, many questions remain about the near-term and long-term feasibility and appropriate application of all proposed methods. Recent and ongoing ground- and space-based observations of small solar-system body composition and dynamics have revolutionized our understanding of these bodies (e.g., Ryan (2000), Fujiwara et al. (2006), and Jedicke et al. (2006)). Ongoing increases in computing power and algorithm sophistication make it possible to calculate the response of these inhomogeneous objects to proposed mitigation techniques. Here we present the first phase of a comprehensive hazard mitigation planning effort undertaken by Southwest Research Institute and Los Alamos National Laboratory. We begin by reviewing the parameter space of the object's physical and chemical composition and trajectory. We then use the radiation hydrocode RAGE (Gittings et al. 2008), Monte Carlo N-Particle (MCNP) radiation transport (see Clement et al., this conference), and N-body dynamics codes to explore the effects these variations in object properties have on the coupling of energy into the object from a variety of mitigation techniques, including deflection and disruption by nuclear and conventional munitions, and a kinetic impactor.
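
    For orientation, the impulsive options surveyed in this record are usually summarized with a momentum-balance estimate; the relation below is the generic textbook form for a kinetic impactor, not a result of the study:

        \Delta v \;\approx\; \beta \, \frac{m\,U}{M},

    where m and U are the impactor mass and relative speed, M is the target mass, and \beta \ge 1 is the momentum-enhancement factor that accounts for momentum carried off by ejecta (\beta = 1 for a perfectly inelastic impact). Energy-coupling calculations such as those described above constrain \beta and its analogues for explosive deflection.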

  18. A perspective multidisciplinary geological approach for mitigation of effects due to the asbestos hazard

    NASA Astrophysics Data System (ADS)

    Vignaroli, Gianluca; Rossetti, Federico; Belardi, Girolamo; Billi, Andrea

    2010-05-01

    Asbestos-bearing rock sequences constitute a remarkable natural hazard that poses a serious threat to human health and can cause diseases such as asbestosis, mesothelioma, and lung cancer. Presently, asbestos is classified as a Category 1 carcinogen by world health authorities. Although regulatory agencies in many countries prohibit or restrict the use of asbestos and regulate environmental asbestos exposure, the impact of asbestos on human life still constitutes a major problem. Naturally occurring asbestos includes serpentine and amphibole minerals characterised by fibrous morphology, and it is a constituent of mineralogical associations typical of mafic and ultramafic rocks within ophiolitic sequences. Release of fibres can occur both through natural processes (erosion) and through human activities requiring fragmentation of ophiolite rocks (quarrying, tunnelling, railway construction, etc.). As a consequence, vulnerability is increasing at sites where workers and residents are exposed to fibres dispersed during the mining and milling of ophiolitic rocks. By analysing different exposures of ophiolitic sequences from the Italian peninsula in the field, and after an extensive review of the existing literature, we underline the importance of the geological context (origin, tectonic and deformation history) of ophiolites as a first-order parameter in evaluating the asbestos hazard. Integrated structural, textural, mineralogical and petrological studies significantly improve our understanding of the mechanisms governing the nucleation/growth of fibrous minerals in deformation structures (both ductile and brittle) within the ophiolitic rocks. A primary role is recognised for the structural processes favouring the fibrous mineralization, with correlations existing between the fibre parameters (such as mineralogical composition, texture, and mechanical characteristics) and the particles released into the air (such as shape, size, and amount liberated

  19. Multidisciplinary Approach to Identify and Mitigate the Hazard from Induced Seismicity in Oklahoma

    NASA Astrophysics Data System (ADS)

    Holland, A. A.; Keller, G. R., Jr.; Darold, A. P.; Murray, K. E.; Holloway, S. D.

    2014-12-01

    Oklahoma has experienced a very significant increase in seismicity rates over the last 5 years, with the greatest increase occurring in 2014. The observed rate increase indicates that the seismic hazard for at least some parts of Oklahoma has increased significantly. Many seismologists consider the large number of salt-water disposal wells operating in Oklahoma to be the largest contributing factor to this increase. However, unlike many cases of seismicity induced by injection, the greatest increase is occurring over a very large area, about 15% of the state. There are more than 3,000 disposal wells currently operating within Oklahoma, with injection volumes greater than 2010 rates. These factors add many significant challenges to identifying potential cases of induced seismicity and understanding the contributing factors well enough to mitigate such occurrences. In response to a clear need for a better geotechnical understanding of what is occurring in Oklahoma, a multi-year multidisciplinary study of some of the most active areas has begun at the University of Oklahoma. This study includes additional seismic monitoring, better geological and geophysical characterization of the subsurface, hydrological and reservoir modeling, and geomechanical studies to better understand the rise in seismicity rates. The Oklahoma Corporation Commission has added new rules regarding reporting and monitoring of salt-water disposal wells and continues to work with the Oklahoma Geological Survey and other researchers.

  20. Earthquake Scaling and Development of Ground Motion Prediction for Earthquake Hazard Mitigation in Taiwan

    NASA Astrophysics Data System (ADS)

    Ma, K.; Yen, Y.

    2011-12-01

    For earthquake hazard mitigation oriented toward risk management, an integrated study spanning source-model development to ground motion prediction is crucial. The simulation of the high-frequency component (> 1 Hz) of strong ground motion in the near field is not well resolved because of insufficient resolution in the velocity structure. Using small events as Green's functions (i.e., the empirical Green's function (EGF) method) replaces explicit evaluation of the path effect and thereby circumvents the lack of a precise velocity structure. If an EGF is not available, a stochastic Green's function (SGF) method can be employed. By characterizing the slip models derived from waveform inversion, we directly extract the parameters needed for ground motion prediction with the EGF or SGF method. The slip models had been investigated from Taiwan's dense strong-motion network and global teleseismic data. In addition, the low-frequency component (< 1 Hz) can be obtained numerically by the frequency-wavenumber (FK) method. Thus, broadband strong ground motion can be calculated by a hybrid method that combines a deterministic FK calculation for the low-frequency simulation with the EGF or SGF method for the high-frequency simulation. Source parameters characterized through the empirical scaling study feed directly into the ground motion simulation. To provide ground motion predictions for a scenario earthquake, we compiled the earthquake scaling relationship from the inverted finite-fault models of moderate to large earthquakes in Taiwan. The studies show that the seismogenic depth significantly controls the development of rupture width. In addition, several earthquakes on blind faults show distinctly large stress drops, which yield regionally high PGA. Based on the developed scaling relationship and the possible high stress drops for earthquakes on blind faults, we further deploy the hybrid method mentioned above to simulate the strong motion in
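
    The hybrid combination described in this record, a deterministic low-frequency synthetic merged with an EGF or stochastic high-frequency synthetic near a 1 Hz crossover, can be sketched with matched low-pass and high-pass filters. The traces below are random placeholders; the 1 Hz crossover is taken from the abstract, and everything else is an assumption for illustration.

      # Minimal sketch of hybrid broadband merging: low-pass the deterministic
      # (FK) trace and high-pass the EGF/SGF trace at a 1 Hz crossover, then sum.
      # The input traces here are synthetic placeholders.
      import numpy as np
      from scipy.signal import butter, filtfilt

      fs = 100.0                                   # sampling rate (Hz)
      t = np.arange(0, 60, 1.0 / fs)
      fk_trace = np.sin(2 * np.pi * 0.3 * t)       # stand-in for FK synthetic
      egf_trace = 0.2 * np.random.default_rng(0).standard_normal(t.size)

      fc = 1.0                                     # crossover frequency (Hz)
      b_lo, a_lo = butter(4, fc / (fs / 2), btype="low")
      b_hi, a_hi = butter(4, fc / (fs / 2), btype="high")

      broadband = filtfilt(b_lo, a_lo, fk_trace) + filtfilt(b_hi, a_hi, egf_trace)
      print("peak broadband amplitude:", float(np.abs(broadband).max()))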

  1. L-Reactor Habitat Mitigation Study

    SciTech Connect

    Not Available

    1988-02-01

    The L-Reactor Fish and Wildlife Resource Mitigation Study was conducted to quantify the effects on habitat of the L-Reactor restart and to identify the appropriate mitigation for these impacts. The completed project evaluated in this study includes construction of a 1,000-acre reactor cooling reservoir formed by damming Steel Creek. Habitat impacts identified include a loss of approximately 3,700 average annual habitat units. This report presents a mitigation plan, Plan A, to offset these habitat losses. Plan A will offset losses for all species studied, except white-tailed deer. The South Carolina Wildlife and Marine Resources Department strongly recommends creation of a game management area to provide realistic mitigation for loss of deer habitats. 10 refs., 5 figs., 3 tabs. (MHB)
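
    For readers unfamiliar with the habitat-unit bookkeeping cited in this record, the arithmetic typically used in such studies (the Habitat Evaluation Procedures convention) is, in simplified form (stated here for orientation, not taken from the report):

        HU_t = HSI_t \times A_t, \qquad AAHU \approx \frac{1}{T}\sum_{t=1}^{T} HU_t,

    where HSI_t is the habitat suitability index (0 to 1) for a target species in year t, A_t is the habitat area in acres, and T is the period of analysis; losses such as the roughly 3,700 average annual habitat units cited above are the difference in AAHU between with-project and without-project conditions.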

  2. A portfolio approach to evaluating natural hazard mitigation policies: An Application to lateral-spread ground failure in Coastal California

    USGS Publications Warehouse

    Bernknopf, R.L.; Dinitz, L.B.; Rabinovici, S.J.M.; Evans, A.M.

    2001-01-01

    In the past, efforts to prevent catastrophic losses from natural hazards have largely been undertaken by individual property owners based on site-specific evaluations of risks to particular buildings. Public efforts to assess community vulnerability and encourage mitigation have focused on either aggregating site-specific estimates or adopting standards based upon broad assumptions about regional risks. This paper develops an alternative, intermediate-scale approach to regional risk assessment and the evaluation of community mitigation policies. Properties are grouped into types with similar land uses and levels of hazard, and hypothetical community mitigation strategies for protecting these properties are modeled like investment portfolios. The portfolios consist of investments in mitigation against the risk to a community posed by a specific natural hazard, and are defined by a community's mitigation budget and the proportion of the budget invested in locations of each type. The usefulness of this approach is demonstrated through an integrated assessment of earthquake-induced lateral-spread ground failure risk in the Watsonville, California area. Data from the magnitude 6.9 Loma Prieta earthquake of 1989 are used to model lateral-spread ground failure susceptibility. Earth science and economic data are combined and analyzed in a Geographic Information System (GIS). The portfolio model is then used to evaluate the benefits of mitigating the risk in different locations. Two mitigation policies, one that prioritizes mitigation by land use type and the other by hazard zone, are compared with a status quo policy of doing no further mitigation beyond that which already exists. The portfolio representing the hazard zone rule yields a higher expected return than the land use portfolio does; however, the hazard zone portfolio also has a higher standard deviation. Therefore, neither portfolio is clearly preferred. The two mitigation policies both reduce expected losses
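
    The expected-return and standard-deviation comparison summarized in this record reduces to ordinary mean/variance arithmetic over loss scenarios. The sketch below scores two hypothetical mitigation portfolios; the scenario probabilities and dollar figures are invented for illustration and do not reproduce the Watsonville analysis.

      # Toy mean/variance comparison of two mitigation portfolios.
      # Returns are hypothetical net benefits (avoided loss minus cost) in $M.
      import numpy as np

      scenario_prob = np.array([0.6, 0.3, 0.1])    # no event, moderate, severe
      returns = {
          "land-use rule":    np.array([-1.0, 2.0,  8.0]),
          "hazard-zone rule": np.array([-1.5, 3.0, 12.0]),
      }

      for name, r in returns.items():
          mean = float(np.dot(scenario_prob, r))
          std = float(np.sqrt(np.dot(scenario_prob, (r - mean) ** 2)))
          print(f"{name}: expected return {mean:.2f}, standard deviation {std:.2f}")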

  3. Lidar and Electro-Optics for Atmospheric Hazard Sensing and Mitigation

    NASA Technical Reports Server (NTRS)

    Clark, Ivan O.

    2012-01-01

    This paper provides an overview of the research and development efforts of the Lidar and Electro-Optics element of NASA's Aviation Safety Program. This element is seeking to improve the understanding of the atmospheric environments encountered by aviation and to provide enhanced situation awareness for atmospheric hazards. The improved understanding of atmospheric conditions is specifically to develop sensor signatures for atmospheric hazards. The current emphasis is on kinetic air hazards such as turbulence, aircraft wake vortices, mountain rotors, and windshear. Additional efforts are underway to identify and quantify the hazards arising from multi-phase atmospheric conditions including liquid and solid hydrometeors and volcanic ash. When the multi-phase conditions act as obscurants that result in reduced visual awareness, the element seeks to mitigate the hazards associated with these diminished visual environments. The overall purpose of these efforts is to enable safety improvements for air transport class and business jet class aircraft as the transition to the Next Generation Air Transportation System occurs.

  4. Nationwide Operational Assessment of Hazards and success stories in disaster prevention and mitigation in the Philippines

    NASA Astrophysics Data System (ADS)

    Mahar Francisco Lagmay, Alfredo

    2016-04-01

    The Philippines, being a locus of typhoons, tsunamis, earthquakes, and volcanic eruptions, is a hotbed of disasters. Natural hazards inflict loss of lives and costly damage to property in the country. In 2011, after tropical storm Washi devastated cities in the southern Philippines, the Department of Science and Technology put in place a responsive program to warn communities and give them hours of advance lead time to prepare for imminent hazards, and to use advanced science and technology to enhance geohazard maps for more effective disaster prevention and mitigation. Since its launch, there have been many success stories on the use of Project NOAH, which after Typhoon Haiyan was integrated into the Pre-Disaster Risk Assessment (PDRA) system of the National Disaster Risk Reduction and Management Council (NDRRMC), the government agency tasked to prepare for, and respond to, natural calamities. Learning from past disasters, NDRRMC now issues warnings, through scientific advice from DOST-Project NOAH and PAGASA (Philippine Weather Bureau), that are hazard-specific, area-focused, and time-bound. Severe weather events in 2015 generated dangerous hazard phenomena such as widespread floods and massive debris flows which, if not for timely, accessible, and understandable warnings, could have turned into disasters. We call these events "disasters that did not happen". The innovative warning system of the Philippine government has so far proven effective in addressing the impacts of hydrometeorological hazards and can be employed elsewhere in the world.

  5. The Wenchuan, China M8.0 Earthquake: A Lesson and Implication for Seismic Hazard Mitigation

    NASA Astrophysics Data System (ADS)

    Wang, Z.

    2008-12-01

    The Wenchuan, China M8.0 earthquake caused great damage and an enormous number of casualties: 69,197 people were killed, 374,176 people were injured, and 18,341 people are still missing. The estimated direct economic loss is about 126 billion U.S. dollars. The Wenchuan earthquake again demonstrated that an earthquake does not kill people; the built environment and induced hazards, landslides in particular, do. Therefore, it is critical to strengthen the built environment, such as buildings and bridges, and to mitigate the induced hazards in order to avoid such disasters. As part of the so-called North-South Seismic Zone in China, the Wenchuan earthquake occurred along the Longmen Shan thrust belt, which forms a boundary between the Qinghai-Tibet Plateau and the Sichuan basin, and there is a long history (~4,000 years) of seismicity in the area. The historical records show that the area experienced high intensity (i.e., greater than IX) in the past several thousand years. In other words, the area is well known to have high seismic hazard because of its tectonic setting and seismicity. However, only intensity VII (0.1 to 0.15 g PGA) has been considered for the seismic design of the built environment in the area. This was one of the main reasons that so many buildings collapsed, particularly school buildings, during the Wenchuan earthquake. It is clear that the seismic design (i.e., the design ground motion or intensity) is not adequate in the Wenchuan earthquake stricken area. A lesson can be learned from the Wenchuan earthquake on seismic hazard and risk assessment. A lesson can also be learned from this earthquake on seismic hazard mitigation and/or seismic risk reduction.

  6. Linear Aerospike SR-71 Experiment (LASRE): Aerospace Propulsion Hazard Mitigation Systems

    NASA Technical Reports Server (NTRS)

    Mizukami, Masashi; Corpening, Griffin P.; Ray, Ronald J.; Hass, Neal; Ennix, Kimberly A.; Lazaroff, Scott M.

    1998-01-01

    A major hazard posed by the propulsion system of hypersonic and space vehicles is the possibility of fire or explosion in the vehicle environment. The hazard is mitigated by minimizing or detecting, in the vehicle environment, the three ingredients essential to producing fire: fuel, oxidizer, and an ignition source. The Linear Aerospike SR-71 Experiment (LASRE) consisted of a linear aerospike rocket engine integrated into one-half of an X-33-like lifting body shape, carried on top of an SR-71 aircraft. Gaseous hydrogen and liquid oxygen were used as propellants. Although LASRE is a one-of-a-kind experimental system, it must be rated for piloted flight, so this test presented a unique challenge. To help meet safety requirements, the following propulsion hazard mitigation systems were incorporated into the experiment: pod inert purge, oxygen sensors, a hydrogen leak detection algorithm, hydrogen sensors, fire detection and pod temperature thermocouples, water misting, and control room displays. These systems are described, and their development discussed. Analyses, ground test, and flight test results are presented, as are findings and lessons learned.

  7. Seismic hazard studies in Egypt

    NASA Astrophysics Data System (ADS)

    Mohamed, Abuo El-Ela A.; El-Hadidy, M.; Deif, A.; Abou Elenean, K.

    2012-12-01

    The study of earthquake activity and seismic hazard assessment in Egypt is very important due to the great and rapid spread of large investments in national projects, especially the nuclear power plant that will be built in the northern part of Egypt. Although Egypt is characterized by low seismicity, it has experienced damaging earthquakes throughout its history. The seismotectonic setting of Egypt suggests that large earthquakes are possible, particularly along the Gulf of Aqaba-Dead Sea transform, the subduction zone along the Hellenic and Cyprean Arcs, and the northern Red Sea triple junction point. In addition, some significant inland sources at Aswan, Dahshour, and the Cairo-Suez District should be considered. The seismic hazard for Egypt is calculated utilizing a probabilistic approach (for a grid of 0.5° × 0.5°) within a logic-tree framework. Alternative seismogenic models and ground motion scaling relationships are selected to account for the epistemic uncertainty. Seismic hazard values on rock were calculated to create contour maps for four ground motion spectral periods and for different return periods. In addition, the uniform hazard spectra for rock sites at 25 different periods and the probabilistic hazard curves for the cities of Cairo and Alexandria are presented. The highest peak ground acceleration (PGA) values were found close to the Gulf of Aqaba, about 220 gal for a 475-year return period, while the lowest PGA values, less than 25 gal, were found in the western part of the Western Desert.
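
    The probabilistic calculation summarized in this record follows the standard hazard-integral form, written here generically rather than with the study's specific source models:

        \lambda(PGA > a) \;=\; \sum_{i} \nu_i \int\!\!\int P(PGA > a \mid m, r)\, f_{M_i}(m)\, f_{R_i}(r)\, dm\, dr,
        \qquad P(\text{exceedance in } t \text{ years}) \;=\; 1 - e^{-\lambda t},

    where \nu_i is the activity rate of source i and f_{M_i}, f_{R_i} are its magnitude and distance densities; the 475-year return period quoted for the Gulf of Aqaba corresponds to roughly a 10% probability of exceedance in 50 years (1 - e^{-50/475} \approx 0.10).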

  8. Fluor Daniel Hanford implementation plan for DOE Order 5480.28, Natural phenomena hazards mitigation

    SciTech Connect

    Conrads, T.J.

    1997-09-12

    Natural phenomena hazards (NPH) are unexpected acts of nature that pose a threat or danger to workers, the public, or the environment. Earthquakes, extreme winds (hurricane and tornado), snow, flooding, volcanic ashfall, and lightning strikes are examples of NPH that could occur at the Hanford Site. U.S. Department of Energy (DOE) policy requires facilities to be designed, constructed, and operated in a manner that protects workers, the public, and the environment from hazards caused by natural phenomena. DOE Order 5480.28, Natural Phenomena Hazards Mitigation, includes rigorous new natural phenomena criteria for the design of new DOE facilities, as well as for the evaluation and, if necessary, upgrade of existing DOE facilities. The Order was transmitted to Westinghouse Hanford Company in 1993 for compliance and is also identified in the Project Hanford Management Contract, Section J, Appendix C. Criteria and requirements of DOE Order 5480.28 are included in five standards, the last of which, DOE-STD-1023, was released in fiscal year 1996. Because the Order was released before all of its required standards were released, enforcement of the Order was waived pending release of the last standard and determination of an in-force date by the DOE Richland Operations Office (DOE-RL). Agreement also was reached between the Management and Operations Contractor and DOE-RL that the Order would become enforceable for new structures, systems, and components (SSCs) 60 days following issue of new Order-based design criteria in HNF-PRO-97, Engineering Design and Evaluation. The Order also requires that commitments addressing existing SSCs be included in an implementation plan that is to be issued 1 year following the release of the last standard. Subsequently, WHC-SP-1175, Westinghouse Hanford Company Implementation Plan for DOE Order 5480.28, Natural Phenomena Hazards Mitigation, Rev. 0, was issued in November 1996, and this document, HNF-SP-1175, Fluor Daniel Hanford

  9. The Puerto Rico Component of the National Tsunami Hazard and Mitigation Program Pr-Nthmp

    NASA Astrophysics Data System (ADS)

    Huerfano Moreno, V. A.; Hincapie-Cardenas, C. M.

    2014-12-01

    Tsunami hazard assessment, detection, warning, education, and outreach efforts are intended to reduce losses to life and property. The Puerto Rico Seismic Network (PRSN) is participating in an effort with local and federal agencies to develop tsunami hazard risk reduction strategies under the National Tsunami Hazards and Mitigation Program (NTHMP). This grant supports the TsunamiReady program, which is the base of tsunami preparedness and mitigation in PR. The Caribbean region has a documented history of damaging tsunamis that have affected coastal areas. The seismic water waves originating in the prominent fault systems around PR are considered a near-field hazard for Puerto Rico and the Virgin Islands (PR/VI) because they can reach coastal areas within a few minutes after the earthquake. Sources for local, regional, and tele-tsunamis have been identified and modeled, and tsunami evacuation maps were prepared for PR. These maps were generated in three phases: first, hypothetical tsunami scenarios were developed on the basis of the parameters of potential underwater earthquakes. Second, each of these scenarios was simulated. The third step was to determine the worst-case scenario (MOM). The run-ups were drawn on GIS-referenced maps and aerial photographs. These products are being used by emergency managers to educate the public and develop mitigation strategies. Online maps and related evacuation products are available to the public via the PR-TDST (PR Tsunami Decision Support Tool). Currently, all 44 coastal municipalities are recognized as TsunamiReady by the US NWS. The main goal of the program is to declare Puerto Rico TsunamiReady, including two cities that are not coastal but could be affected by tsunamis. Based on these evacuation maps, tsunami signs were installed, vulnerability profiles were created, communication systems to receive and disseminate tsunami messages were installed in each TWFP, and tsunami response plans were approved
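
    The worst-case (MOM) step described in this record amounts to taking the cell-by-cell maximum over all simulated scenarios. A minimal sketch follows, with random placeholder grids standing in for the model outputs and an arbitrary depth threshold for flagging cells:

      # Minimal sketch of a Maximum-of-Maximums (MOM) grid: the cell-wise
      # maximum of inundation depth over all simulated tsunami scenarios.
      # Scenario grids and the 0.5 m threshold are placeholders.
      import numpy as np

      rng = np.random.default_rng(0)
      n_scenarios, ny, nx = 12, 50, 80
      scenario_depth = rng.gamma(shape=1.5, scale=0.8, size=(n_scenarios, ny, nx))

      mom = scenario_depth.max(axis=0)     # worst case at every grid cell
      evacuation_zone = mom > 0.5          # flag cells deeper than 0.5 m
      print("cells flagged for evacuation:", int(evacuation_zone.sum()))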

  10. Use of a Novel Visual Metaphor Measure (PRISM) to Evaluate School Children's Perceptions of Natural Hazards, Sources of Hazard Information, Hazard Mitigation Organizations, and the Effectiveness of Future Hazard Education Programs in Dominica, Eastern Caribbean

    NASA Astrophysics Data System (ADS)

    Parham, M.; Day, S. J.; Teeuw, R. M.; Solana, C.; Sensky, T.

    2014-12-01

    This project aims to study the development of understanding of natural hazards (and of hazard mitigation) from the age of 11 to the age of 15 in secondary school children from 5 geographically and socially different schools on Dominica, through repeated interviews with the students and their teachers. These interviews will be coupled with a structured course of hazard education in the Geography syllabus; the students not taking Geography will form a control group. To avoid distortion of our results arising from the developing verbalization and literacy skills of the students over the 5 years of the project, we have adapted the PRISM tool used in clinical practice to assess patient perceptions of illness and treatment (Buchi & Sensky, 1999). This novel measure is essentially non-verbal, and uses spatial positions of moveable markers ("object" markers) on a board, relative to a fixed marker that represents the subject's "self", as a visual metaphor for the importance of the object to the subject. The subjects also explain their reasons for placing the markers as they have, to provide additional qualitative information. The PRISM method thus produces data on the perceptions measured on the board that can be subjected to statistical analysis, and also succinct qualitative data about each subject. Our study will gather data on participants' perceptions of different natural hazards, different sources of information about these, and organizations or individuals to whom they would go for help in a disaster, and investigate how these vary with geographical and social factors. To illustrate the method, which is generalisable, we present results from our initial interviews of the cohort of 11 year olds whom we will follow through their secondary school education. Büchi, S., & Sensky, T. (1999). PRISM: Pictorial Representation of Illness and Self Measure: a brief nonverbal measure of illness impact and therapeutic aid in psychosomatic medicine. Psychosomatics, 40(4), 314-320.

  11. Use of a Novel Visual Metaphor Measure (PRISM) to Evaluate School Children's Perceptions of Natural Hazards, Sources of Hazard Information, Hazard Mitigation Organizations, and the Effectiveness of Future Hazard Education Programs in Dominica, Eastern Car

    NASA Astrophysics Data System (ADS)

    Parham, Martin; Day, Simon; Teeuw, Richard; Solana, Carmen; Sensky, Tom

    2015-04-01

    This project aims to study the development of understanding of natural hazards (and of hazard mitigation) from the age of 11 to the age of 15 in secondary school children from 5 geographically and socially different schools on Dominica, through repeated interviews with the students and their teachers. These interviews will be coupled with a structured course of hazard education in the Geography syllabus; the students not taking Geography will form a control group. To avoid distortion of our results arising from the developing verbalization and literacy skills of the students over the 5 years of the project, we have adapted the PRISM tool used in clinical practice to assess patient perceptions of illness and treatment (Buchi & Sensky, 1999). This novel measure is essentially non-verbal, and uses spatial positions of moveable markers ("object" markers) on a board, relative to a fixed marker that represents the subject's "self", as a visual metaphor for the importance of the object to the subject. The subjects also explain their reasons for placing the markers as they have, to provide additional qualitative information. The PRISM method thus produces data on the perceptions measured on the board that can be subjected to statistical analysis, and also succinct qualitative data about each subject. Our study will gather data on participants' perceptions of different natural hazards, different sources of information about these, and organizations or individuals to whom they would go for help in a disaster, and investigate how these vary with geographical and social factors. To illustrate the method, which is generalisable, we present results from our initial interviews of the cohort of 11 year olds whom we will follow through their secondary school education. Büchi, S., & Sensky, T. (1999). PRISM: Pictorial Representation of Illness and Self Measure: a brief nonverbal measure of illness impact and therapeutic aid in psychosomatic medicine. Psychosomatics, 40(4), 314-320.

  12. Impact Hazard Mitigation: Understanding the Effects of Nuclear Explosive Outputs on Comets and Asteroids

    NASA Astrophysics Data System (ADS)

    Clement, R.

    The NASA 2007 white paper "Near-Earth Object Survey and Deflection Analysis of Alternatives" affirms deflection as the safest and most effective means of potentially hazardous object (PHO) impact prevention. It also calls for further studies of object deflection. In principle, deflection of a PHO may be accomplished by using kinetic impactors, chemical explosives, gravity tractors, solar sails, or nuclear munitions. Of the sudden impulse options, nuclear munitions are by far the most efficient in terms of yield-per-unit-mass launched and are technically mature. However, there are still significant questions about the response of a comet or asteroid to a nuclear burst. Recent and ongoing observational and experimental work is revolutionizing our understanding of the physical and chemical properties of these bodies (e.g., Ryan (2000), Fujiwara et al. (2006), and Jedicke et al. (2006)). The combination of this improved understanding of small solar-system bodies combined with current state-of-the-art modeling and simulation capabilities, which have also improved dramatically in recent years, allow for a science-based, comprehensive study of PHO mitigation techniques. Here we present an examination of the effects of radiation from a nuclear explosion on potentially hazardous asteroids and comets through Monte Carlo N-Particle code (MCNP) simulation techniques. MCNP is a general-purpose particle transport code commonly used to model neutron, photon, and electron transport for medical physics, reactor design and safety, accelerator target and detector design, and a variety of other applications including modeling the propagation of epithermal neutrons through the Martian regolith (Prettyman 2002). It is a massively parallel code that can conduct simulations in 1-3 dimensions, complicated geometries, and with extremely powerful variance reduction techniques. It uses current nuclear cross section data, where available, and fills in the gaps with analytical models where data
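
    As context for the radiation-transport calculations described in this record, the deflection delivered by a standoff nuclear burst is often framed with a simple momentum estimate (a generic back-of-the-envelope relation, not a result of the MCNP study):

        \Delta v \;\approx\; \frac{m_{\mathrm{ej}}\,\langle v_{\mathrm{ej}}\rangle}{M},

    where m_{ej} is the mass of surface material ablated by the absorbed radiation, \langle v_{ej}\rangle is its mean ejection speed, and M is the mass of the asteroid or comet; the transport calculation determines how much energy is deposited and at what depth, and therefore constrains m_{ej} and \langle v_{ej}\rangle.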

  13. Impact hazard mitigation: understanding the effects of nuclear explosive outputs on comets and asteroids

    SciTech Connect

    Clement, Ralph R C; Plesko, Catherine S; Bradley, Paul A; Conlon, Leann M

    2009-01-01

    The NASA 2007 white paper "Near-Earth Object Survey and Deflection Analysis of Alternatives" affirms deflection as the safest and most effective means of potentially hazardous object (PHO) impact prevention. It also calls for further studies of object deflection. In principle, deflection of a PHO may be accomplished by using kinetic impactors, chemical explosives, gravity tractors, solar sails, or nuclear munitions. Of the sudden impulse options, nuclear munitions are by far the most efficient in terms of yield-per-unit-mass launched and are technically mature. However, there are still significant questions about the response of a comet or asteroid to a nuclear burst. Recent and ongoing observational and experimental work is revolutionizing our understanding of the physical and chemical properties of these bodies (e.g., Ryan (2000), Fujiwara et al. (2006), and Jedicke et al. (2006)). The combination of this improved understanding of small solar-system bodies combined with current state-of-the-art modeling and simulation capabilities, which have also improved dramatically in recent years, allow for a science-based, comprehensive study of PHO mitigation techniques. Here we present an examination of the effects of radiation from a nuclear explosion on potentially hazardous asteroids and comets through Monte Carlo N-Particle code (MCNP) simulation techniques. MCNP is a general-purpose particle transport code commonly used to model neutron, photon, and electron transport for medical physics, reactor design and safety, accelerator target and detector design, and a variety of other applications including modeling the propagation of epithermal neutrons through the Martian regolith (Prettyman 2002). It is a massively parallel code that can conduct simulations in 1-3 dimensions, complicated geometries, and with extremely powerful variance reduction techniques. It uses current nuclear cross section data, where available, and fills in the gaps with analytical models where

  14. Identification, prediction, and mitigation of sinkhole hazards in evaporite karst areas

    USGS Publications Warehouse

    Gutierrez, F.; Cooper, A.H.; Johnson, K.S.

    2008-01-01

    Sinkholes usually have a higher probability of occurrence and a greater genetic diversity in evaporite terrains than in carbonate karst areas. This is because evaporites have a higher solubility and, commonly, a lower mechanical strength. Subsidence damage resulting from evaporite dissolution generates substantial losses throughout the world, but the causes are only well understood in a few areas. To deal with these hazards, a phased approach is needed for sinkhole identification, investigation, prediction, and mitigation. Identification techniques include field surveys and geomorphological mapping combined with accounts from local people and historical sources. Detailed sinkhole maps can be constructed from sequential historical maps, recent topographical maps, and digital elevation models (DEMs) complemented with building-damage surveying, remote sensing, and high-resolution geodetic surveys. On a more detailed level, information from exposed paleosubsidence features (paleokarst), speleological explorations, geophysical investigations, trenching, dating techniques, and boreholes may help in investigating dissolution and subsidence features. Information on the hydrogeological pathways, including caves, springs, and swallow holes, is particularly important, especially when corroborated by tracer tests. These diverse data sources make a valuable database: the karst inventory. From this dataset, sinkhole susceptibility zonations (relative probability) may be produced based on the spatial distribution of the features and good knowledge of the local geology. Sinkhole distribution can be investigated by spatial distribution analysis techniques including studies of preferential elongation, alignment, and nearest neighbor analysis. More objective susceptibility models may be obtained by analyzing the statistical relationships between the known sinkholes and the conditioning factors. Chronological information on sinkhole formation is required to estimate the probability of
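
    The nearest neighbor analysis mentioned in this record is commonly summarized with the Clark-Evans ratio, where R < 1 indicates clustering, R ≈ 1 a random pattern, and R > 1 dispersion. The sketch below computes it for hypothetical sinkhole centres; the edge corrections used in practice are omitted.

      # Clark-Evans nearest-neighbour ratio for a set of sinkhole centres.
      # Coordinates and study-area size are hypothetical; no edge correction.
      import numpy as np

      rng = np.random.default_rng(1)
      points = rng.uniform(0, 1000, size=(60, 2))   # sinkhole centres (m)
      area = 1000.0 * 1000.0                        # study area (m^2)

      # Mean observed distance to the nearest neighbouring sinkhole.
      d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
      np.fill_diagonal(d, np.inf)
      mean_observed = d.min(axis=1).mean()

      # Expected mean nearest-neighbour distance for a random (Poisson) pattern.
      density = len(points) / area
      mean_expected = 0.5 / np.sqrt(density)

      print(f"Clark-Evans ratio R = {mean_observed / mean_expected:.2f}")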

  15. Monitoring active volcanoes and mitigating volcanic hazards: the case for including simple approaches

    NASA Astrophysics Data System (ADS)

    Stoiber, Richard E.; Williams, Stanley N.

    1990-07-01

    Simple approaches to problems brought about by eruptions and their ensuing hazardous effects should be advocated and used by volcanologists while awaiting more sophisticated remedies. The expedients we advocate have all or many of the following attributes: only locally available materials are required; no extensive training of operators or installation is necessary; they are affordable and do not require foreign aid or exports; they are often labor intensive and are sustainable without outside assistance. Where appropriate, the involvement of local residents is advocated. Examples of simple expedients that can be used in forecasting or in mitigating the effects of crises emphasize the relative ease and the less elaborate requirements with which simple approaches can be activated. Emphasis is on visual observations, often by untrained observers; simple meteorological measurements; observations of water level in lakes; temperature and chemistry of springs and fumaroles; new springs and collapse areas; and observations of volcanic plumes. Simple methods are suggested which can be applied to mitigating damage from mudflows, nuées ardentes, tephra falls, and gas discharge. A review in hindsight at Ruiz includes the use of both chemical indicators and simple mudflow alarms. Simple expedients are sufficiently effective that any expert volcanologist called to aid in a crisis must include them in the package of advice offered. Simple approaches are a critical and logical complement to highly technical solutions to hazardous situations.

  16. 2009 ERUPTION OF REDOUBT VOLCANO: Lahars, Oil, and the Role of Science in Hazards Mitigation (Invited)

    NASA Astrophysics Data System (ADS)

    Swenson, R.; Nye, C. J.

    2009-12-01

    In March 2009, Redoubt Volcano erupted for the third time in 45 years. More than 19 explosions produced ash plumes to 60,000 ft asl, lahar flows of mud and ice down the Drift River ~30 miles to the coast, and tephra fall of up to 1.5 mm onto surrounding communities. The eruption had a severe impact on many operations. Airlines were forced to cancel or divert hundreds of international and domestic passenger and cargo flights, and Anchorage International Airport closed for over 12 hours. Mudflows and floods down the Drift River to the coast impacted operations at the Drift River Oil Terminal (DROT), which was forced to shut down and ultimately be evacuated. Prior mitigation efforts to protect the DROT oil tank farm from potential impacts associated with a major eruptive event were successful, and none of the 148,000 barrels of oil stored at the facility was spilled or released. Nevertheless, the threat of continued eruptive activity at Redoubt, with the possibility of continued lahar flows down the Drift River alluvial fan, required that an incident command post be established so that the US Coast Guard, Alaska Dept. of Environmental Conservation, and the Cook Inlet Pipeline Company could coordinate a response to the potential hazards. Ultimately, the incident command team relied heavily on continuous real-time data updates from the Alaska Volcano Observatory, as well as continuous geologic interpretations and risk analysis by the USGS Volcanic Hazards group, the State Division of Geological and Geophysical Surveys, and the University of Alaska Geophysical Institute, all members of the collaborative effort of the Alaska Volcano Observatory. The great success story that unfolded attests to the efforts of the incident command team and their reliance on real-time scientific analysis from scientific experts. The positive results also highlight how pre-disaster mitigation and monitoring efforts, in concert with hazards response planning, can be used in a cooperative industry

  17. Bringing New Tools and Techniques to Bear on Earthquake Hazard Analysis and Mitigation

    NASA Astrophysics Data System (ADS)

    Willemann, R. J.; Pulliam, J.; Polanco, E.; Louie, J. N.; Huerta-Lopez, C.; Schmitz, M.; Moschetti, M. P.; Huerfano Moreno, V.; Pasyanos, M.

    2013-12-01

    During July 2013, IRIS held an Advanced Studies Institute in Santo Domingo, Dominican Republic, that was designed to enable early-career scientists who already have mastered the fundamentals of seismology to begin collaborating in frontier seismological research. The Institute was conceived of at a strategic planning workshop in Heredia, Costa Rica, that was supported and partially funded by USAID, with a goal of building geophysical capacity to mitigate the effects of future earthquakes. To address this broad goal, we drew participants from a dozen different countries of Middle America. Our objectives were to develop understanding of the principles of earthquake hazard analysis, particularly site characterization techniques, and to facilitate future research collaborations. The Institute was divided into three main sections: overviews on the fundamentals of earthquake hazard analysis and lectures on the theory behind methods of site characterization; fieldwork where participants acquired new data of the types typically used in site characterization; and computer-based analysis projects in which participants applied their newly-learned techniques to the data they collected. This was the first IRIS institute to combine an instructional short course with field work for data acquisition. Participants broke into small teams to acquire data, analyze it on their own computers, and then make presentations to the assembled group describing their techniques and results. Using broadband three-component seismometers, the teams acquired data for Spatial Auto-Correlation (SPAC) analysis at seven array locations, and Horizontal to Vertical Spectral Ratio (HVSR) analysis at 60 individual sites along six profiles throughout Santo Domingo. Using a 24-channel geophone string, the teams acquired data for Refraction Microtremor (SeisOptReMi™ from Optim) analysis at 11 sites, with supplementary data for active-source Multi-channel Spectral Analysis of Surface Waves (MASW) analysis at
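
    Of the site-characterization techniques listed in this record, HVSR is the simplest to illustrate: the ratio of the combined horizontal to vertical amplitude spectra of ambient noise. The sketch below runs on synthetic traces; the window selection, tapering, and spectral smoothing used in practice are omitted, so the peak it reports is meaningful only as a demonstration.

      # Minimal HVSR sketch: ratio of combined horizontal to vertical amplitude
      # spectra. Traces are synthetic noise; real processing adds window
      # selection, tapering, and spectral smoothing.
      import numpy as np

      fs = 100.0
      n = 2 ** 14
      rng = np.random.default_rng(2)
      north, east, vertical = rng.standard_normal((3, n))

      freqs = np.fft.rfftfreq(n, d=1.0 / fs)

      def amplitude(x):
          return np.abs(np.fft.rfft(x))

      horizontal = np.sqrt(0.5 * (amplitude(north) ** 2 + amplitude(east) ** 2))
      hvsr = horizontal / amplitude(vertical)

      band = (freqs > 0.5) & (freqs < 20.0)
      f0 = freqs[band][np.argmax(hvsr[band])]
      print(f"apparent HVSR peak near {f0:.2f} Hz (synthetic data)")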

  18. Coupling Radar Rainfall Estimation and Hydrological Modelling For Flash-flood Hazard Mitigation

    NASA Astrophysics Data System (ADS)

    Borga, M.; Creutin, J. D.

    Flood risk mitigation is accomplished through managing either or both the hazard and vulnerability. Flood hazard may be reduced through structural measures which alter the frequency of flood levels in the area. The vulnerability of a community to flood loss can be mitigated through changing or regulating land use and through flood warning and effective emergency response. When dealing with flash-flood hazard, it is generally accepted that the most effective way (and in many instances the only affordable in a sustainable perspective) to mitigate the risk is by reducing the vulnerability of the involved communities, in particular by implementing flood warning systems and community self-help programs. However, both the inherent characteristics of the atmospheric and hydrologic processes involved in flash-flooding and the changing societal needs provide a tremendous challenge to traditional flood forecasting and warning concepts. In fact, the targets of these systems are traditionally localised like urbanised sectors or hydraulic structures. Given the small spatial scale that characterises flash floods and the development of dispersed urbanisation, transportation, green tourism and water sports, human lives and property are exposed to flash flood risk in a scattered manner. This must be taken into consideration in flash flood warning strategies and the investigated region should be considered as a whole and every section of the drainage network as a potential target for hydrological warnings. Radar technology offers the potential to provide information describing rain intensities almost continuously in time and space. Recent research results indicate that coupling radar information to distributed hydrologic modelling can provide hydrologic forecasts at all potentially flooded points of a region. Nevertheless, very few flood warning services use radar data more than on a qualitative basis. After a short review of current understanding in this area, two
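
    A concrete example of the radar rainfall estimation step discussed in this record: reflectivity is converted to rain rate through a Z-R power law. The Marshall-Palmer coefficients in the sketch below are one common choice, not those of any particular operational radar.

      # Convert radar reflectivity (dBZ) to rain rate (mm/h) with a Z-R power
      # law, Z = a * R**b. Marshall-Palmer coefficients (a=200, b=1.6) are one
      # common choice; operational systems tune them per radar and regime.

      def rain_rate(dbz, a=200.0, b=1.6):
          z = 10.0 ** (dbz / 10.0)        # dBZ -> linear reflectivity (mm^6/m^3)
          return (z / a) ** (1.0 / b)     # invert Z = a * R**b

      for dbz in (20, 35, 50):
          print(f"{dbz} dBZ -> {rain_rate(dbz):5.1f} mm/h")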

  19. Earth sciences, GIS and geomatics for natural hazards assessment and risks mitigation: a civil protection perspective

    NASA Astrophysics Data System (ADS)

    Perotti, Luigi; Conte, Riccardo; Lanfranco, Massimo; Perrone, Gianluigi; Giardino, Marco; Ratto, Sara

    2010-05-01

    Geo-information and remote sensing are appropriate tools to enhance functional strategies for increasing awareness of natural hazards and risks and for supporting research and operational activities devoted to disaster reduction. Improved Earth Sciences knowledge coupled with advanced Geomatics technologies has been developed by the joint research group and applied by the ITHACA (Information Technology for Humanitarian Assistance, Cooperation and Action) centre, within its partnership with the UN World Food Programme (WFP), with the goal of reducing human, social, economic and environmental losses due to natural hazards and related disasters. By cooperating with local and regional authorities (Municipalities, Centro Funzionale of the Aosta Valley, Civil Protection Agency of Regione Piemonte), data on natural hazards and risks have been collected, compared to national and global data, and then interpreted to help communities and civil protection agencies of sensitive mountain regions make strategic choices and decisions for better mitigation and adaptation measures. To enhance the application of GIS and remote-sensing technologies for geothematic mapping of geological and geomorphological risks in the mountain territories of Europe and developing countries, research activities led to the collection and evaluation of data from the scientific literature and historical technical archives, for the definition of predisposing/triggering factors and evolutionary processes of natural instability phenomena (landslides, floods, storms, …) and for the design and implementation of early-warning and early-impact systems. Geodatabases, remote sensing and mobile-GIS applications were developed to perform analysis of: 1) a large climate-related disaster (Hurricane Mitch, Central America), by the application of remote sensing techniques, either for early warning or mitigation measures at the national and international scale; 2) the distribution of slope instabilities at the regional scale (Aosta

  20. The Puerto Rico Component of the National Tsunami Hazard and Mitigation Program (PR-NTHMP)

    NASA Astrophysics Data System (ADS)

    Vanacore, E. A.; Huerfano Moreno, V. A.; Lopez, A. M.

    2015-12-01

    The Caribbean region has a documented history of damaging tsunamis that have affected coastal areas. Of particular interest is the Puerto Rico - Virgin Islands (PRVI) region, where the proximity of the coast to prominent tectonic faults would result in near-field tsunamis. Tsunami hazard assessment, detection capabilities, warning, education, and outreach efforts are common tools intended to reduce loss of life and property. It is for these reasons that the PRSN is participating in an effort with local and federal agencies to develop tsunami hazard risk reduction strategies under the NTHMP. This grant supports the TsunamiReady program, which is the base of tsunami preparedness and mitigation in PR. In order for threatened communities in PR to be recognized as TsunamiReady by the US NWS, the PR component of the NTHMP has identified and modeled sources for local, regional, and tele-tsunamis, and the results of the simulations have been used to develop tsunami response plans. The main goal of the PR-NTHMP is to strengthen resilient coastal communities that are prepared for tsunami hazards, and to have PR recognized as TsunamiReady. Evacuation maps were generated in three phases: first, hypothetical tsunami scenarios of potential underwater earthquakes were developed; these scenarios were then modeled during the second phase. The third phase consisted of determining the worst-case scenario based on the Maximum of Maximums (MOM). Inundation and evacuation zones were drawn on GIS-referenced maps and aerial photographs. These products are being used by emergency managers to educate the public and develop mitigation strategies. Maps and related evacuation products, like evacuation times, can be accessed online via the PR Tsunami Decision Support Tool. Based on these evacuation maps, tsunami signs were installed, vulnerability profiles were created, communication systems to receive and disseminate tsunami messages were installed in each TWFP, and tsunami response plans were

  1. The Brave New World of Real-time GPS for Hazards Mitigation

    NASA Astrophysics Data System (ADS)

    Melbourne, T. I.; Szeliga, W. M.; Santillan, V. M.; Scrivner, C. W.

    2015-12-01

    Over 600 continuously operating, real-time telemetered GPS receivers operate throughout California, Oregon, Washington, and Alaska. These receivers straddle active crustal faults, volcanoes and landslides, the magnitude-9 Cascadia and northeastern Alaskan subduction zones, and their attendant tsunamigenic regions along the Pacific coast. Around the circum-Pacific there are hundreds more, and the number is growing steadily as real-time networks proliferate. Despite offering the potential for sub-cm positioning accuracy in real time, useful for a broad array of hazards mitigation, these GPS stations are only now being incorporated into routine seismic, tsunami, volcanic, landslide, space-weather, or meteorologic monitoring. We will discuss NASA's READI (Real-time Earthquake Analysis for DIsasters) initiative. This effort is focused on developing all aspects of real-time GPS for hazards mitigation, from establishing international data-sharing agreements to improving basic positioning algorithms. READI's long-term goal is to expand real-time GPS monitoring throughout the circum-Pacific as overseas data become freely available, so that it may be adopted by NOAA, USGS, and other operational agencies responsible for natural hazards monitoring. Currently ~100 stations are being jointly processed by CWU and Scripps Inst. of Oceanography for algorithm comparison and downstream merging purposes. The resultant solution streams include point-position estimates in a global reference frame every second with centimeter accuracy, ionospheric total electron content, and tropospheric zenith water content. These solutions are freely available to third-party agencies over several streaming protocols to enable their incorporation and use in hazards monitoring. This number will ramp up to ~400 stations over the next year. We will also discuss technical efforts underway to develop a variety of downstream applications of the real-time position streams, including the ability to broadcast
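
    One simple downstream use of the 1 Hz position streams described in this record is coseismic static-offset detection, comparing a short-term average of each position component against a longer-term average. The sketch below runs on a synthetic east-component series; the window lengths, threshold, and synthetic offset are assumptions for illustration, not READI specifications.

      # Toy static-offset detector on a 1 Hz GPS position component: flag an
      # event when the short-term mean departs from the long-term mean by more
      # than a threshold. Data, windows, and threshold are hypothetical.
      import numpy as np

      rng = np.random.default_rng(3)
      east = 0.01 * rng.standard_normal(600)   # 10 min of 1 Hz noise (m)
      east[400:] += 0.15                       # synthetic 15 cm coseismic offset

      short_win, long_win, threshold = 10, 120, 0.05   # s, s, m
      for i in range(long_win + short_win, len(east)):
          long_mean = east[i - long_win - short_win:i - short_win].mean()
          short_mean = east[i - short_win:i].mean()
          if abs(short_mean - long_mean) > threshold:
              print(f"offset of {short_mean - long_mean:.3f} m flagged at t = {i} s")
              break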

  2. Solutions Network Formulation Report. NASA's Potential Contributions using ASTER Data in Marine Hazard Mitigation

    NASA Technical Reports Server (NTRS)

    Fletcher, Rose

    2010-01-01

    The 28-foot storm surge from Hurricane Katrina pushed inland along bays and rivers for a distance of 12 miles in some areas, contributing to the damage or destruction of about half of the fleet of boats in coastal Mississippi. Most of those boats had sought refuge in back bays and along rivers. Some boats were spared damage because the owners chose their mooring site well. Gulf mariners need a spatial analysis tool that provides guidance on the safest places to anchor their boats during future hurricanes. This product would support NOAA's mission to minimize the effects of coastal hazards through awareness, education, and mitigation strategies and could be incorporated in the Coastal Risk Atlas decision support tool.

  3. Making the Handoff from Earthquake Hazard Assessments to Effective Mitigation Measures (Invited)

    NASA Astrophysics Data System (ADS)

    Applegate, D.

    2010-12-01

    This year has witnessed a barrage of large earthquakes worldwide with the resulting damages ranging from inconsequential to truly catastrophic. We cannot predict when earthquakes will strike, but we can build communities that are resilient to strong shaking as well as to secondary hazards such as landslides and liquefaction. The contrasting impacts of the magnitude-7 earthquake that struck Haiti in January and the magnitude-8.8 event that struck Chile in April underscore the difference that mitigation and preparedness can make. In both cases, millions of people were exposed to severe shaking, but deaths in Chile were measured in the hundreds rather than the hundreds of thousands that perished in Haiti. Numerous factors contributed to these disparate outcomes, but the most significant is the presence of strong building codes in Chile and their total absence in Haiti. The financial cost of the Chilean earthquake still represents an unacceptably high percentage of that nation’s gross domestic product, a reminder that life safety is the paramount, but not the only, goal of disaster risk reduction measures. For building codes to be effective, both in terms of lives saved and economic cost, they need to reflect the hazard as accurately as possible. As one of four federal agencies that make up the congressionally mandated National Earthquake Hazards Reduction Program (NEHRP), the U.S. Geological Survey (USGS) develops national seismic hazard maps that form the basis for seismic provisions in model building codes through the Federal Emergency Management Agency and private-sector practitioners. This cooperation is central to NEHRP, which both fosters earthquake research and establishes pathways to translate research results into implementation measures. That translation depends on the ability of hazard-focused scientists to interact and develop mutual trust with risk-focused engineers and planners. Strengthening that interaction is an opportunity for the next generation

  4. Probing Aircraft Flight Test Hazard Mitigation for the Alternative Fuel Effects on Contrails & Cruise Emissions (ACCESS) Research Team

    NASA Technical Reports Server (NTRS)

    Kelly, Michael J.

    2013-01-01

    The Alternative Fuel Effects on Contrails & Cruise Emissions (ACCESS) Project Integration Manager requested in July 2012 that the NASA Engineering and Safety Center (NESC) form a team to independently assess aircraft structural failure hazards associated with the ACCESS experiment and to identify potential flight test hazard mitigations to ensure flight safety. The ACCESS Project Integration Manager subsequently requested that the assessment scope be focused predominantly on structural failure risks to the aircraft empennage.

  5. Evaluation Of Risk And Possible Mitigation Schemes For Previously Unidentified Hazards

    NASA Technical Reports Server (NTRS)

    Linzey, William; McCutchan, Micah; Traskos, Michael; Gilbrech, Richard; Cherney, Robert; Slenski, George; Thomas, Walter, III

    2006-01-01

    protection wire schemes, 145 tests were conducted using various fuel/ox wire alternatives (shielded and unshielded) and/or different combinations of polytetrafluoroethylene (PTFE), Mystik tape, and convoluted wraps to prevent unwanted coil activation. Test results were evaluated along with other pertinent data and information to develop a mitigation strategy for an inadvertent RCS firing. The SSP evaluated civilian aircraft wiring failures to search for aging trends in assessing the wire-short hazard. Appendix 2 applies Weibull statistical methods to the same data with a similar purpose.
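
    For the Weibull analysis mentioned in the closing sentence of this record, a minimal example of fitting a two-parameter Weibull distribution to failure ages is sketched below; the ages are fabricated placeholders, not data from the referenced study or its appendix.

      # Fit a two-parameter Weibull distribution to a set of wire failure ages.
      # The ages below are placeholders, not data from the referenced study.
      import numpy as np
      from scipy.stats import weibull_min

      ages = np.array([6.1, 8.4, 9.0, 11.2, 12.5, 13.1, 14.8, 16.0, 17.3, 19.9])

      # Fixing the location parameter at zero gives the usual two-parameter form.
      shape, loc, scale = weibull_min.fit(ages, floc=0)
      print(f"shape (beta) = {shape:.2f}, scale (eta) = {scale:.2f} years")
      print("shape > 1 would suggest wear-out (aging) rather than random failure")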

  6. Using Robust Decision Making to Assess and Mitigate the Risks of Natural Hazards in Developing Countries

    NASA Astrophysics Data System (ADS)

    Kalra, N.; Lempert, R. J.; Peyraud, S.

    2012-12-01

    Ho Chi Minh City (HCMC) ranks fourth globally among coastal cities most vulnerable to climate change and already experiences extensive routine flooding. In the coming decades, increased precipitation, rising sea levels, and land subsidence could permanently inundate a large portion of the city's population, place the poor at particular risk, and threaten new economic development in low-lying areas. HCMC is not alone in facing the impacts of natural hazards exacerbated by uncertain future climate change, development, and other deep uncertainties. Assessing and managing these risks is a tremendous challenge, particularly in developing countries which face pervasive shortages of the data and models generally used to plan for such changes. Using HCMC as a case study, this talk will demonstrate how a scenario-based approach that uses robustness as a decision and planning element can help developing countries assess future climate risk and manage the risk of natural disasters. In contrast to traditional approaches which treat uncertainty with a small number of handcrafted scenarios, this talk will emphasize how robust decision making, which uses modeling to explore over thousands of scenarios, can identify potential vulnerabilities to HCMC's emerging flood risk management strategy and suggest potential responses. The talk will highlight several novel features of the collaboration with the HCMC Steering Committee for Flood Control. First, it examines several types of risk -- risk to the poor, risk to the non-poor, and risk to the economy -- and illustrates how management policies have different implications for these sectors. Second, it demonstrates how diverse and sometimes incomplete climate, hydrologic, socioeconomic, GIS, and other data and models can be integrated into a modeling framework to develop and evaluate many scenarios of flood risk. Third, it illustrates the importance of non-structural policies such as land use management and building design to manage
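
    A minimal sketch of the scenario-exploration step described above is given below: a hypothetical flood-management strategy is stress-tested over a full grid of sea-level-rise, subsidence, and precipitation scenarios, and the scenarios in which it fails are collected for inspection. All parameter ranges, the damage index, and the failure threshold are invented placeholders, not values or models from the HCMC study.

```python
import itertools
import numpy as np

# Hypothetical scenario ranges (not from the HCMC study): sea-level rise (m),
# land subsidence (m), and fractional precipitation change by mid-century.
sea_rise = np.linspace(0.0, 1.0, 21)
subsidence = np.linspace(0.0, 0.5, 11)
precip = np.linspace(-0.1, 0.4, 11)

def flood_damage(slr, sub, dp, protection_height=1.0):
    """Toy damage index for one strategy: damage accrues once the effective
    water level exceeds the protection height, and grows with heavier rain."""
    effective_level = slr + sub + 0.5 * max(dp, 0.0)
    excess = max(effective_level - protection_height, 0.0)
    return excess * (1.0 + dp)

# Explore the full scenario ensemble and collect the cases where the strategy fails.
threshold = 0.1  # damage level considered unacceptable (illustrative)
vulnerable = [
    (slr, sub, dp)
    for slr, sub, dp in itertools.product(sea_rise, subsidence, precip)
    if flood_damage(slr, sub, dp) > threshold
]

total = len(sea_rise) * len(subsidence) * len(precip)
print(f"{len(vulnerable)} of {total} scenarios exceed the damage threshold")
if vulnerable:
    arr = np.array(vulnerable)
    print("failures begin at sea-level rise >=", round(float(arr[:, 0].min()), 2), "m")
```

    In a full robust decision making analysis the toy damage index would be replaced by the city's coupled hydrologic and economic models, and the vulnerable set would be mined for the scenario conditions that best separate failure from success.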

  7. Volcano Hazard Tracking and Disaster Risk Mitigation: A Detailed Gap Analysis from Data-Collection to User Implementation

    NASA Astrophysics Data System (ADS)

    Faied, D.; Sanchez, A.

    2009-04-01

    While numerous global initiatives exist to address the potential hazards posed by volcanic eruption events and assess impacts from a civil security viewpoint, there does not yet exist a single, unified, international system of early warning and hazard tracking for eruptions. Numerous gaps exist in the risk reduction cycle, from data collection, to data processing, and finally dissemination of salient information to relevant parties. As part of the 2008 International Space University's Space Studies Program, a detailed gap analysis of the state of volcano disaster risk reduction was undertaken, and this paper presents the principal results. This gap analysis considered current sensor technologies, data processing algorithms, and utilization of data products by various international organizations. Recommendations for strategies to minimize or eliminate certain gaps are also provided. In the effort to address the gaps, a framework evolved at system level. This framework, known as VIDA, is a tool to develop user requirements for civil security in hazardous contexts, and a candidate system concept for a detailed design phase. VIDA also offers substantial educational potential: the framework includes a centralized clearinghouse for volcanology data which could support education at a variety of levels. Basic geophysical data, satellite maps, and raw sensor data are combined and accessible in a way that allows the relationships between these data types to be explored and used in a training environment. Such a resource naturally lends itself to research efforts in the subject but also research in operational tools, system architecture, and human/machine interaction in civil protection or emergency scenarios.

  8. Field Guide for Testing Existing Photovoltaic Systems for Ground Faults and Installing Equipment to Mitigate Fire Hazards

    SciTech Connect

    Brooks, William; Basso, Thomas; Coddington, Michael

    2015-10-01

    Ground faults and arc faults are the two most common reasons for fires in photovoltaic (PV) arrays and methods exist that can mitigate the hazards. This report provides field procedures for testing PV arrays for ground faults, and for implementing high resolution ground fault and arc fault detectors in existing and new PV system designs.

  9. The Identification of Filters and Interdependencies for Effective Resource Allocation: Coupling the Mitigation of Natural Hazards to Economic Development.

    NASA Astrophysics Data System (ADS)

    Agar, S. M.; Kunreuther, H.

    2005-12-01

    Policy formulation for the mitigation and management of risks posed by natural hazards requires that governments confront difficult decisions for resource allocation and be able to justify their spending. Governments also need to recognize when spending offers little improvement and the circumstances in which relatively small amounts of spending can make substantial differences. Because natural hazards can have detrimental impacts on local and regional economies, patterns of economic development can also be affected by spending decisions for disaster mitigation. This paper argues that by mapping interdependencies among physical, social and economic factors, governments can improve resource allocation to mitigate the risks of natural hazards while improving economic development on local and regional scales. Case studies of natural hazards in Turkey have been used to explore specific "filters" that act to modify short- and long-term outcomes. Pre-event filters can prevent an event from becoming a natural disaster or change a routine event into a disaster. Post-event filters affect both short and long-term recovery and development. Some filters cannot be easily modified by spending (e.g., rural-urban migration) but others (e.g., land-use practices) provide realistic spending targets. Net social benefits derived from spending, however, will also depend on the ways by which filters are linked, or so-called "interdependencies". A single weak link in an interdependent system, such as a power grid, can trigger a cascade of failures. Similarly, weak links in social and commercial networks can send waves of disruption through communities. Conversely, by understanding the positive impacts of interdependencies, spending can be targeted to maximize net social benefits while mitigating risks and improving economic development. Detailed information on public spending was not available for this study but case studies illustrate how networks of interdependent filters can modify
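
    The cascade argument above can be sketched with a toy dependency graph in which a single failed node propagates disruption downstream; the network and the failing node are hypothetical illustrations, not systems from the Turkish case studies.

```python
from collections import deque

# Hypothetical dependency graph: an edge A -> B means "failure of A disrupts B".
dependencies = {
    "power_grid": ["water_pumps", "hospitals", "telecom"],
    "water_pumps": ["hospitals"],
    "telecom": ["emergency_dispatch"],
    "roads": ["hospitals"],
    "hospitals": [],
    "emergency_dispatch": [],
}

def cascade(initial_failure, graph):
    """Breadth-first propagation of a failure through the dependency graph."""
    failed = {initial_failure}
    queue = deque([initial_failure])
    while queue:
        node = queue.popleft()
        for downstream in graph.get(node, []):
            if downstream not in failed:
                failed.add(downstream)
                queue.append(downstream)
    return failed

print(sorted(cascade("power_grid", dependencies)))
# ['emergency_dispatch', 'hospitals', 'power_grid', 'telecom', 'water_pumps']
```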

  10. Challenges in understanding, modelling, and mitigating Lake Outburst Flood Hazard: experiences from Central Asia

    NASA Astrophysics Data System (ADS)

    Mergili, Martin; Schneider, Demian; Andres, Norina; Worni, Raphael; Gruber, Fabian; Schneider, Jean F.

    2010-05-01

    Lake Outburst Floods can evolve from complex process chains like avalanches of rock or ice that produce flood waves in a lake which may overtop and eventually breach glacial, morainic, landslide, or artificial dams. Rising lake levels can lead to progressive incision and destabilization of a dam, to enhanced ground water flow (piping), or even to hydrostatic failure of ice dams which can cause sudden outflow of accumulated water. These events often have a highly destructive potential because a large amount of water is released in a short time, with a high capacity to erode loose debris, leading to a powerful debris flow with a long travel distance. The best-known example of a lake outburst flood is the Vajont event (Northern Italy, 1963), where a landslide rushed into an artificial lake which spilled over and caused a flood leading to almost 2000 fatalities. Hazards from the failure of landslide dams are often (though not always) fairly manageable: most breaches occur in the first few days or weeks after the landslide event, and the rapid construction of a spillway - though problematic - has resolved some hazardous situations (e.g., the Hattian landslide in Pakistan in 2005). Older dams, like Usoi dam (Lake Sarez) in Tajikistan, are usually fairly stable, though landslides into the lakes may create flood waves that overtop and eventually weaken the dams. The analysis and the mitigation of glacial lake outburst flood (GLOF) hazard remains a challenge. A number of GLOFs resulting in fatalities and severe damage have occurred during the previous decades, particularly in the Himalayas and in the mountains of Central Asia (Pamir, Tien Shan). The source area is usually far away from the area of impact and events occur at very long intervals or as singularities, so that the population at risk is usually not prepared. Even though potentially hazardous lakes can be identified relatively easily with remote sensing and field work, modeling and predicting of GLOFs (and also

  11. Environmental legislation as the legal framework for mitigating natural hazards in Spain

    NASA Astrophysics Data System (ADS)

    Garrido, Jesús; Arana, Estanislao; Jiménez Soto, Ignacio; Delgado, José

    2015-04-01

    In Spain, the socioeconomic losses due to natural hazards (floods, earthquakes or landslides) are considerable, and the indirect costs associated with them are rarely considered because they are very difficult to evaluate. The prevention of losses due to natural hazards is more economical and efficient through legislation and spatial planning than through structural measures, such as walls, anchorages or structural reinforcements. However, there is no Spanish natural hazards law, and national and regional sector legislation makes only sparse mention of them. After 1978, when the Spanish Constitution was enacted, the Autonomous Communities (Spanish regions) were able to legislate according to the different competences (urban planning, environment or civil protection) established in the Constitution. In the 1990s, the Civil Protection legislation (national law and regional civil protection tools) dealt specifically with natural hazards (floods, earthquakes and volcanoes), but this was before any soil, seismic or hydrological studies were recommended in the national sector legislation. On the other hand, some Autonomous Communities referred to natural hazards in the Environmental Impact Assessment legislation (EIA) and also in the spatial and urban planning legislation and tools. The National Land Act, enacted in 1998, established, for the first time, that lands exposed to natural hazards should be classified as non-developable. The Spanish recast text of the Land Act, enacted by Royal Legislative Decree 2/2008, requires that a natural hazards map be included in the Environmental Sustainability Report (ESR), which is compulsory for all master plans, according to the provisions set out by Act 9/2006, known as the Spanish Strategic Environmental Assessment (SEA). Consequently, the environmental legislation, after the aforementioned transposition of the SEA European Directive 2001/42/EC, is the legal framework to prevent losses due to natural hazards

  12. Determination of Bedrock Variations and S-wave Velocity Structure in the NW part of Turkey for Earthquake Hazard Mitigation

    NASA Astrophysics Data System (ADS)

    Ozel, A. O.; Arslan, M. S.; Aksahin, B. B.; Genc, T.; Isseven, T.; Tuncer, M. K.

    2015-12-01

    The Tekirdag region (NW Turkey) is quite close to the North Anatolian Fault, which is capable of producing a large earthquake. Therefore, earthquake hazard mitigation studies are important for urban areas close to the major faults. From this point of view, the integration of different geophysical methods plays an important role in the study of seismic hazard problems, including seismotectonic zoning. On the other hand, geological mapping and determination of the subsurface structure, which is key to managing newly developed areas, converting current urban areas, and assessing urban geological hazards, can be performed with integrated geophysical methods. This study has been performed in the framework of a national project complementary to the cooperative project between Turkey and Japan (JICA & JST) named "Earthquake and Tsunami Disaster Mitigation in the Marmara Region and Disaster Education". With this principal aim, this study is focused on Tekirdag and its surrounding region (NW of Turkey), where some uncertainties in subsurface knowledge (maps of bedrock depth, thickness of Quaternary sediments, basin geometry, and seismic velocity structure) need to be resolved. Several geophysical methods (microgravity, magnetic, and single-station and array microtremor measurements) are applied and the results are evaluated to characterize lithological changes in the region. Array microtremor measurements with several radii are taken at 30 locations and 1D S-wave velocity structures are determined by the inversion of phase velocities of surface waves; the resulting 1D structures are verified by theoretical Rayleigh wave modelling. Following the array measurements, single-station microtremor measurements are implemented at 75 locations to determine the predominant frequency distribution. The predominant frequencies range from 0.5 Hz to 8 Hz across the study area. On the other hand, microgravity and magnetic measurements are performed on
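
    For the single-station measurements, the predominant frequency is typically read from the peak of a horizontal-to-vertical (H/V) spectral ratio; the abstract does not give the exact processing, so the sketch below is only a schematic of that step, applied to a synthetic three-component record with an artificial 1.2 Hz site resonance.

```python
import numpy as np

def hv_ratio(north, east, vertical, fs, nfft=4096):
    """Return frequencies and the horizontal-to-vertical spectral ratio."""
    freqs = np.fft.rfftfreq(nfft, d=1.0 / fs)
    spec = lambda x: np.abs(np.fft.rfft(x, n=nfft))
    horizontal = np.sqrt(spec(north) * spec(east))   # geometric mean of horizontals
    return freqs, horizontal / np.maximum(spec(vertical), 1e-12)

# Synthetic three-component microtremor record (100 Hz sampling, 60 s) with an
# artificial 1.2 Hz resonance added to the horizontal components only.
fs = 100.0
t = np.arange(0, 60.0, 1.0 / fs)
rng = np.random.default_rng(0)
noise = lambda: rng.normal(size=t.size)
resonance = np.sin(2 * np.pi * 1.2 * t)
north, east, vertical = noise() + 3 * resonance, noise() + 3 * resonance, noise()

freqs, ratio = hv_ratio(north, east, vertical, fs)
band = (freqs > 0.2) & (freqs < 10.0)            # band of engineering interest
f0 = freqs[band][np.argmax(ratio[band])]
print(f"estimated predominant frequency ~ {f0:.2f} Hz")
```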

  13. A review of accidents, prevention and mitigation options related to hazardous gases

    SciTech Connect

    Fthenakis, V.M.

    1993-05-01

    Statistics on industrial accidents are incomplete due to a lack of specific criteria on what constitutes a release or accident. In the United States, most major industrial accidents have been related to explosions and fires of flammable materials, not to releases of chemicals into the environment. An EPA study of 6,928 accidental releases of toxic chemicals revealed that accidents at stationary facilities accounted for 75% of the total number of releases, and transportation accidents for the other 25%. About 7% of all reported accidents (468 cases) resulted in 138 deaths and 4,717 injuries ranging from temporary respiratory problems to critical injuries. In-plant accidents accounted for 65% of the casualties. The most efficient strategy to reduce hazards is to choose technologies which do not require the use of large quantities of hazardous gases. For new technologies this approach can be implemented early in development, before large financial resources and efforts are committed to specific options. Once specific materials and options have been selected, strategies to prevent accident-initiating events need to be evaluated and implemented. The next step is to implement safety options which suppress a hazard when an accident-initiating event occurs. Releases can be prevented or reduced with fail-safe equipment and valves, adequate warning systems, and controls to reduce and interrupt gas leakage. If an accident occurs and safety systems fail to contain a hazardous gas release, then engineering control systems will be relied on to reduce or minimize environmental releases. As a final defensive barrier, prevention of human exposure is needed if a hazardous gas is released in spite of the previous strategies. Prevention of consequences, such as nearby medical facilities that can accommodate victims of the worst accident, can further reduce the effects of personnel exposure to hazardous gases.

  14. Volcanic hazard in Mexico: a comprehensive on-line database for risk mitigation

    NASA Astrophysics Data System (ADS)

    Manea, Marina; Constantin Manea, Vlad; Capra, Lucia; Bonasia, Rosanna

    2013-04-01

    Researchers are currently working on several key aspects of the Mexican volcanoes, such as remote sensing, field data on old and recent volcaniclastic deposits, structural framework, monitoring (rainfall data and visual observation of lahars), and laboratory experiments (analogue models and numerical simulations - Fall3D, Titan2D). Each investigation is focused on specific processes, but it is fundamental to visualize the global status of the volcano in order to understand its behavior and to mitigate future hazards. The Mexican Volcanoes @nline represents a novel initiative aimed at collecting, on a systematic basis, the complete set of data obtained so far on the volcanoes and at continuously updating the database with new data. All the information is compiled from published works and updated frequently. Maps, such as the geological map of the Mexican volcanoes and the associated hazard zonation, as well as point data, such as stratigraphic sections, sedimentology and diagrams of rainfall intensities, are presented in Google Earth format in order to be easily accessed by the scientific community and the general public. An important section of this online database is the presentation of numerical simulation results for ash dispersion associated with the principal active Mexican volcanoes. Daily prediction of ash dispersion (based on real-time data from CENAPRED and the Mexican Meteorological Service), as well as large-scale high-resolution subduction simulations performed on HORUS (the Computational Geodynamics Laboratory's supercomputer), represent a central part of the Mexican Volcanoes @nline database. The Mexican Volcanoes @nline database is maintained by the Computational Geodynamics Laboratory and is based entirely on Open Source software. The website can be visited at: http://www.geociencias.unam.mx/mexican_volcanoes.

  15. Integrated Tsunami Data Supports Forecast, Warning, Research, Hazard Assessment, and Mitigation (Invited)

    NASA Astrophysics Data System (ADS)

    Dunbar, P. K.; Stroker, K. J.

    2009-12-01

    With nearly 230,000 fatalities, the 26 December 2004 Indian Ocean tsunami was the deadliest tsunami in history, illustrating the importance of developing basinwide warning systems. Key to creating these systems is easy access to quality-controlled, verified data on past tsunamis. It is essential that warning centers, emergency managers, and modelers can determine if and when similar events have occurred. Following the 2004 tsunami, the National Oceanic and Atmospheric Administration’s (NOAA) National Geophysical Data Center (NGDC) began examining all aspects of the tsunami data archive to help answer questions regarding the frequency and severity of past tsunamis. Historical databases span insufficient time to reveal a region’s full tsunami hazard, so a global database of citations to articles on tsunami deposits was added to the archive. NGDC further expanded the archive to include high-resolution tide gauge data, deep-ocean sensor data, and digital elevation models used for propagation and inundation modeling. NGDC continuously reviews the data for accuracy, making modifications as new information is obtained. These added databases allow NGDC to provide the tsunami data necessary for warning guidance, hazard assessments, and mitigation efforts. NGDC is also at the forefront of standards-based Web delivery of integrated science data through a variety of tools, from Web-form interfaces to interactive maps. The majority of the data in the tsunami archive are discoverable online. Scientists, journalists, educators, planners, and emergency managers are among the many users of these public domain data, which may be used without restriction provided that users cite data sources.

  16. Physical Prototype Development for the Real-Time Detection and Mitigation of Hazardous Releases into a Flow System

    NASA Astrophysics Data System (ADS)

    Rimer, Sara; Katopodes, Nikolaos

    2013-11-01

    The threat of accidental or deliberate toxic chemicals released into public spaces is a significant concern to public safety. The real-time detection and mitigation of such hazardous contaminants has the potential to minimize harm and save lives. In this study, we demonstrate the feasibility of feedback control of a hazardous contaminant by means of a laboratory-scale physical prototype integrated with a previously-developed robust predictive control numerical model. The physical prototype is designed to imitate a public space characterized by a long conduit with an ambient flow (e.g. airport terminal). Unidirectional air flows through a 24-foot long duct. The "contaminant" plume of propylene glycol smoke is released into the duct. Camera sensors are used to visually measure concentration of the plume. A pneumatic system is utilized to localize the contaminant via air curtains, and draw it out via vacuum nozzles. The control prescribed to the pneumatic system is based on the numerical model. NSF-CMMI 0856438.
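
    A highly simplified sketch of the feedback idea is shown below: a plume is advected along a toy 1-D duct, a "sensor" reads the concentration at the nozzle location, and a proportional controller sets the extraction rate. The geometry, gain, and flow values are invented, and the controller is far cruder than the robust predictive controller used in the study.

```python
import numpy as np

def run_duct(gain, steps=800):
    """Advect a plume along a toy 1-D duct; a proportional controller at the
    nozzle cell removes contaminant in proportion to the measured excess
    concentration. Returns the peak concentration escaping downstream."""
    nx, dx, dt, u = 200, 0.1, 0.02, 1.0      # cells, cell size (m), step (s), flow (m/s)
    c = np.zeros(nx)
    c[20:30] = 1.0                           # initial plume release
    nozzle, setpoint = 150, 0.05             # sensor/actuator cell and target level
    peak_downstream = 0.0
    for _ in range(steps):
        c[1:] -= u * dt / dx * (c[1:] - c[:-1])          # upwind advection
        extraction = gain * max(c[nozzle] - setpoint, 0.0)
        c[nozzle] = max(c[nozzle] - extraction * dt, 0.0)
        peak_downstream = max(peak_downstream, c[nozzle + 1:].max())
    return peak_downstream

print("no control        :", round(run_duct(gain=0.0), 3))
print("P control, gain 20:", round(run_duct(gain=20.0), 3))
```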

  17. Mitigation of EMU Glove Cut Hazard by MMOD Impact Craters on Exposed ISS Handrails

    NASA Technical Reports Server (NTRS)

    Christiansen, Eric L.; Ryan, Shannon

    2009-01-01

    Recent cut damage to crewmember extravehicular mobility unit (EMU) gloves during extravehicular activity (EVA) onboard the International Space Station (ISS) has been found to result from contact with sharp edges or pinch points rather than general wear or abrasion. One possible source of cut hazards is protruding sharp-edged crater lips from impacts of micrometeoroid and orbital debris (MMOD) particles on external metallic handrails along EVA translation paths. During hypervelocity impact of MMOD particles, an excavation flow develops behind the shock wave, resulting in the formation of crater lips that can protrude above the target surface. In this study, two methods were evaluated to limit EMU glove cut hazards due to MMOD impact craters. In the first phase, four flexible overwrap configurations were evaluated: a felt reusable surface insulation (FRSI), polyurethane polyether foam with beta-cloth cover, double-layer polyurethane polyether foam with beta-cloth cover, and multi-layer beta-cloth with intermediate Dacron netting spacers. These overwraps are suitable for retrofitting ground equipment that has yet to be flown, and are not intended to protect the handrail from impact of MMOD particles, but rather to act as a spacer between hazardous impact profiles and crewmember gloves. At the impact conditions considered, all four overwrap configurations evaluated were effective in limiting contact between EMU gloves and impact crater profiles. The multi-layer beta-cloth configuration was the most effective in reducing the height of potentially hazardous profiles in handrail-representative targets. In the second phase of the study, four material alternatives to the current aluminum and stainless steel alloys were evaluated: a metal matrix composite, carbon fiber reinforced plastic (CFRP), fiberglass, and a fiber metal laminate. Alternative material handrails are intended to prevent the formation of hazardous damage profiles during MMOD impact and are suitable for flight

  18. Earthquake Hazard Mitigation and Real-Time Warnings of Tsunamis and Earthquakes

    NASA Astrophysics Data System (ADS)

    Kanamori, Hiroo

    2015-09-01

    With better understanding of earthquake physics and the advent of broadband seismology and GPS, seismologists can forecast the future activity of large earthquakes on a sound scientific basis. Such forecasts are critically important for long-term hazard mitigation, but because stochastic fracture processes are complex, the forecasts are inevitably subject to large uncertainties, and unexpected events will inevitably occur. Recent developments in real-time seismology help seismologists cope with and prepare for such unexpected events, including tsunamis and earthquakes. For a tsunami warning, the required warning time is fairly long (usually 5 min or longer), which enables the use of a rigorous method for this purpose. Significant advances have already been made. In contrast, early warning of earthquakes is far more challenging because the required warning time is very short (as short as three seconds). Despite this difficulty, the methods used for regional warnings have advanced substantially, and several systems have already been developed and implemented. A future strategy for the more challenging, rapid (a few seconds) warnings, which are critically important for saving property and lives, is discussed.
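
    The contrast in warning times can be made concrete with a back-of-the-envelope timing sketch; the wave speeds, source depth, and processing delay below are rough, generic assumptions rather than values from the paper.

```python
import math

def eew_lead_time(epicentral_km, depth_km=10.0, vp=6.5, vs=3.5, processing_s=3.0):
    """Seconds between an alert issued `processing_s` after the local P arrival
    and the damaging S-wave arrival at the same site (on-site warning idealization)."""
    hypo_km = math.hypot(epicentral_km, depth_km)
    return hypo_km / vs - (hypo_km / vp + processing_s)

def tsunami_travel_minutes(distance_km, ocean_depth_m=4000.0):
    """Travel time across deep ocean using the long-wave speed sqrt(g * h)."""
    speed = math.sqrt(9.81 * ocean_depth_m)          # ~200 m/s for 4 km depth
    return distance_km * 1000.0 / speed / 60.0

for d in (20, 50, 100):
    print(f"earthquake warning lead time at {d:3d} km: {eew_lead_time(d):5.1f} s")
print(f"tsunami travel time over 1000 km of deep ocean: {tsunami_travel_minutes(1000):.0f} min")
```

    The near-zero lead time at short epicentral distances illustrates the "blind zone" that makes earthquake early warning so much harder than basin-scale tsunami warning.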

  19. California Real Time Network: Test Bed for Mitigation of Geological and Atmospheric Hazards within a Modern Data Portal Environment

    NASA Astrophysics Data System (ADS)

    Bock, Y.

    2008-12-01

    Global geological and atmospheric hazards such as earthquakes, volcanoes, tsunamis, landslides, storms and floods continue to wreak havoc on the lives of millions of people worldwide. High-precision geodetic observations of surface displacements and atmospheric water vapor are indispensable tools in studying natural hazards alongside more traditional seismic and atmospheric measurements. The rapid proliferation of dense in situ GPS networks for crustal deformation studies such as the EarthScope Plate Boundary Observatory provides us with unique data sets. However, the full information content and timeliness of these observations have not been fully developed, in particular at higher frequencies than traditional daily continuous GPS position time series. Nor have scientists taken full advantage of the complementary nature of space-based and in situ observations in forecasting, assessing and mitigating natural hazards. The primary operating mode for in situ GPS networks has been daily download of GPS data sampled at a 15-30 s sample rate, and the production of daily position time series or hourly tropospheric zenith delay estimates. However, as continuous GPS networks are being upgraded to provide even higher-frequency information approaching the sampling rates (1-50 Hz) of modern GPS receivers, and with a latency of less than 1 second, new data processing approaches are being developed. Low-latency, high-rate measurements are being applied to earthquake source modeling, early warning of natural hazards (geological and atmospheric), and structural monitoring. Since 2002, more than 80 CGPS stations in southern California have been upgraded to a 1 Hz sample rate, including stations from the SCIGN and PBO networks, and several large earthquakes have been recorded. The upgraded stations comprise the California Real Time Network (CRTN - http://sopac.ucsd.edu/projects/realtime/). This prototype network provides continuous 1 Hz (upgradable to 10 Hz at some stations) GPS
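
    One elementary use of such 1 Hz position streams is detecting a permanent coseismic offset in near real time. The sketch below applies a simple before/after window comparison to a synthetic east-component series; the noise level, offset size, and thresholds are invented for illustration and are not part of the CRTN processing.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic 1 Hz east-component position series (metres): 10 minutes of noise
# with a 12 cm coseismic step introduced at epoch 400 s. All values invented.
east = rng.normal(scale=0.01, size=600)
east[400:] += 0.12

def detect_offset(series, window=60, threshold=0.05):
    """Flag epochs where the mean of the trailing window differs from the mean
    of the leading window by more than `threshold` (metres)."""
    hits = []
    for i in range(window, len(series) - window):
        jump = series[i:i + window].mean() - series[i - window:i].mean()
        if abs(jump) > threshold:
            hits.append((i, jump))
    return hits

hits = detect_offset(east)
if hits:
    epoch, jump = max(hits, key=lambda h: abs(h[1]))
    print(f"coseismic offset detected near epoch {epoch} s, ~{jump * 100:.0f} cm")
```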

  20. PREDICTION/MITIGATION OF SUBSIDENCE DAMAGE TO HAZARDOUS WASTE LANDFILL COVERS

    EPA Science Inventory

    Characteristics of Resource Conservation and Recovery Act hazardous waste landfills and of landfilled hazardous wastes have been described to permit development of models and other analytical techniques for predicting, reducing, and preventing landfill settlement and related cove...

  1. Catastrophic debris flows transformed from landslides in volcanic terrains : mobility, hazard assessment and mitigation strategies

    USGS Publications Warehouse

    Scott, Kevin M.; Macias, Jose Luis; Naranjo, Jose Antonio; Rodriguez, Sergio; McGeehin, John P.

    2001-01-01

    Communities in lowlands near volcanoes are vulnerable to significant volcanic flow hazards in addition to those associated directly with eruptions. The largest such risk is from debris flows beginning as volcanic landslides, with the potential to travel over 100 kilometers. Stratovolcanic edifices commonly are hydrothermal aquifers composed of unstable, altered rock forming steep slopes at high altitudes, and the terrain surrounding them is commonly mantled by readily mobilized, weathered airfall and ashflow deposits. We propose that volcano hazard assessments integrate the potential for unanticipated debris flows with, at active volcanoes, the greater but more predictable potential of magmatically triggered flows. This proposal reinforces the already powerful arguments for minimizing populations in potential flow pathways below both active and selected inactive volcanoes. It also addresses the potential for volcano flank collapse to occur with instability early in a magmatic episode, as well as the 'false-alarm problem', the difficulty in evacuating the potential paths of these large mobile flows. Debris flows that transform from volcanic landslides, characterized by cohesive (muddy) deposits, create risk comparable to that of their syneruptive counterparts of snow and ice-melt origin, which yield noncohesive (granular) deposits, because: (1) Volcano collapses and the failures of airfall- and ashflow-mantled slopes commonly yield highly mobile debris flows as well as debris avalanches with limited runout potential. The runout potential of debris flows may increase severalfold as their volumes enlarge beyond volcanoes through bulking (entrainment) of sediment. Through this mechanism, the runouts of even relatively small collapses at Cascade Range volcanoes, in the range of 0.1 to 0.2 cubic kilometers, can extend to populated lowlands. (2) Collapse is caused by a variety of triggers: tectonic and volcanic earthquakes, gravitational failure, hydrovolcanism, and

  2. Mitigating hazards to aircraft from drifting volcanic clouds by comparing and combining IR satellite data with forward transport models

    NASA Astrophysics Data System (ADS)

    Matiella Novak, M. Alexandra

    Volcanic ash clouds in the upper atmosphere (>10 km) present a significant hazard to the aviation community and in some cases cause near-disastrous situations for aircraft that inadvertently encounter them. The two most commonly used techniques for mitigating hazards to aircraft from drifting volcanic clouds are (1) using data from satellite observations and (2) forecasting dispersion and trajectories with numerical models. This dissertation aims to aid in the mitigation of this hazard by using Moderate Resolution Imaging Spectroradiometer (MODIS) and Advanced Very High Resolution Radiometer (AVHRR) infrared (IR) satellite data to quantitatively analyze and constrain the uncertainties in the PUFF volcanic ash transport model. Furthermore, this dissertation has experimented with the viability of combining IR data with the PUFF model to increase the model's reliability. Comparing IR satellite data with forward transport models provides valuable information concerning the uncertainty and sensitivity of the transport models. A study analyzing the viability of combining satellite-based information with the PUFF model was also done. Factors controlling the cloud-shape evolution, such as the horizontal dispersion coefficient, the vertical distribution of particles, the height of the cloud, and the location of the cloud, were all updated based on observations from satellite data in an attempt to increase the reliability of the simulations. Comparing center-of-mass locations calculated from satellite data to HYSPLIT trajectory simulations provides insight into the vertical distribution of the cloud. A case study of the May 10, 2003 Anatahan Volcano eruption was undertaken to assess methods of calculating errors in PUFF simulations with respect to the transport and dispersion of the erupted cloud. An analysis of the factors controlling the cloud-shape evolution in the model was also completed and compared to the shape evolution of the cloud observed in the
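
    The satellite-to-model comparison step, locating the cloud's center of mass from an IR retrieval so it can be checked against trajectory simulations, reduces to a weighted average over ash-flagged pixels. The sketch below uses a synthetic brightness-temperature-difference (BTD) field and weights pixels by the magnitude of the negative BTD; the grid and values are placeholders, not data from MODIS or AVHRR.

```python
import numpy as np

# Synthetic brightness-temperature-difference (BTD) field on a small lat/lon
# grid; negative BTD is a common ash signature, so pixels are weighted by the
# magnitude of the negative BTD. All values are invented placeholders.
lats = np.linspace(15.0, 17.0, 5)
lons = np.linspace(145.0, 147.0, 5)
lon_grid, lat_grid = np.meshgrid(lons, lats)

btd = np.zeros((5, 5))
btd[1:4, 2:5] = [[-0.5, -1.2, -0.8],
                 [-1.0, -2.5, -1.5],
                 [-0.3, -0.9, -0.6]]          # ash-affected pixels

weights = np.clip(-btd, 0.0, None)            # zero weight where no ash is flagged
center_lat = (lat_grid * weights).sum() / weights.sum()
center_lon = (lon_grid * weights).sum() / weights.sum()
print(f"cloud center of mass: {center_lat:.2f} N, {center_lon:.2f} E")
```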

  3. Disruption mitigation studies in DIII-D

    SciTech Connect

    Taylor, P.L.; Kellman, A.G.; Evans, T.E.

    1999-01-01

    Data on the discharge behavior, thermal loads, halo currents, and runaway electrons have been obtained in disruptions on the DIII-D tokamak. These experiments have also evaluated techniques to mitigate the disruptions while minimizing runaway electron production. Experiments injecting cryogenic impurity killer pellets of neon and argon and massive amounts of helium gas have successfully reduced these disruption effects. The halo current generation, scaling, and mitigation are understood and are in good agreement with predictions of a semianalytic model. Results from killer pellet injection have been used to benchmark theoretical models of the pellet ablation and energy loss. Runaway electrons are often generated by the pellets and new runaway generation mechanisms, modifications of the standard Dreicer process, have been found to explain the runaways. Experiments with the massive helium gas puff have also effectively mitigated disruptions without the formation of runaway electrons that can occur with killer pellets.

  4. Evaluation and mitigation of lightning hazards to the space shuttle Solid Rocket Motors (SRM)

    NASA Technical Reports Server (NTRS)

    Rigden, Gregory J.; Papazian, Peter B.

    1988-01-01

    The objective was to quantify electric field strengths in the Solid Rocket Motor (SRM) propellant in the event of a worst case lightning strike. Using transfer impedance measurements for selected lightning protection materials and 3D finite difference modeling, a retrofit design approach for the existing dielectric grain cover and railcar covers was evaluated and recommended for SRM segment transport. A safe level of 300 kV/m was determined for the propellant. The study indicated that a significant potential hazard exists for unprotected segments during rail transport. However, modified railcar covers and grain covers are expected to prevent lightning attachment to the SRM and to reduce the levels to several orders of magnitude below 300 kV/m.

  5. Pulsed Electric Processing of the Seismic-Active Fault for Earthquake Hazard Mitigation

    NASA Astrophysics Data System (ADS)

    Novikov, V. A.; Zeigarnik, V. A.; Konev, Yu. B.; Klyuchkin, V. N.

    2010-03-01

    Previous field and laboratory investigations performed in Russia (1999-2008) showed the possibility of applying high-power electric current pulses, generated by a pulsed MHD power system, to trigger weak seismicity and release tectonic stresses in the Earth's crust for earthquake hazard mitigation. The mechanism of the influence of the man-made electromagnetic field on regional seismicity is not yet clear. One possible cause of the phenomenon may be the formation of cracks in the rocks as fluid pressure increases due to Joule heating by the electric current injected into the Earth's crust. A detailed 3D calculation of the electric current density in the Earth's crust of the Northern Tien Shan, produced by a pulsed MHD power system connected to a grounded electric dipole, showed that at the depth of earthquake hypocenters (>5 km) the electric current density is lower than 10^-7 A/m^2, which is not sufficient to increase pressure in the fluid-saturated porous geological medium through Joule heating and thereby form cracks leading to fault propagation and release of tectonic stresses in the Earth's crust. Nevertheless, under certain conditions, when the electric current is injected into the fault through the casing pipes of deep wells after preliminary injection of a conductive fluid into the fault, the current density may be high enough to significantly increase mechanical pressure in the porous two-phase geological medium. A numerical analysis of crack formation triggered by high-power electric pulses, based on the generation of mechanical pressure in the geological medium, was carried out. It was shown that the mechanical pressure impulse due to a high-power electric current in the porous two-phase medium may be calculated, neglecting thermal conduction, by solving the non-stationary equation of piezo-conductivity with Joule heat generation. For the calculation of heat generation, the known solution for current spreading from a spherical or
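
    As a zeroth-order check on the geometric spreading that underlies such current-density estimates, the textbook point-electrode formula for a homogeneous half-space, J = I / (2 pi r^2), can be evaluated at seismogenic depths. The injected current below is an illustrative assumption, and a uniform half-space ignores the conductive shallow layers represented in the study's 3-D model, which channel most of the current and lead to the much smaller densities quoted above.

```python
import math

def current_density(total_current_a, distance_m):
    """Current density (A/m^2) at distance r from a point electrode on the
    surface of a homogeneous half-space: J = I / (2 * pi * r**2)."""
    return total_current_a / (2.0 * math.pi * distance_m ** 2)

I_injected = 1000.0                 # assumed injected current (A), illustrative only
for depth_km in (1, 5, 10):
    j = current_density(I_injected, depth_km * 1000.0)
    print(f"r = {depth_km:2d} km : J ~ {j:.1e} A/m^2 (uniform half-space idealization)")
```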

  6. Experimental and Numerical Study of Free-Field Blast Mitigation

    NASA Astrophysics Data System (ADS)

    Allen, R. M.; Kirkpatrick, D. J.; Longbottom, A. W.; Milne, A. M.; Bourne, N. K.

    2004-07-01

    The development of a fundamental understanding of the mechanisms governing the attenuation of explosives effects by a surrounding mitigant material or system would benefit many civilian and military applications. Current approaches rely almost exclusively on empirical data; few, if any, truly predictive models exist. Dstl has recently pursued an experimental programme investigating the mitigation of effects from detonating explosives in support of general requirements to attenuate blast and fragmentation. The physical properties of a range of mitigant materials have been studied at a more fundamental level, both experimentally and numerically. A preliminary numerical parameter study has been undertaken by FGE using two-phase numerical simulations to complement the experimental studies. Initial work used idealised equations of state for generic mitigants, but more recently material characterisation experiments have been undertaken at RMCS. Results confirm that porosity and particle density are dominant factors affecting the efficiency of the mitigant in reducing free-field blast.

  7. Looking Before We Leap: Recent Results From An Ongoing Quantitative Investigation Of Asteroid And Comet Impact Hazard Mitigation.

    NASA Astrophysics Data System (ADS)

    Plesko, Catherine; Weaver, R. P.; Korycansky, D. G.; Huebner, W. F.

    2010-10-01

    The asteroid and comet impact hazard is now part of public consciousness, as demonstrated by movies, Super Bowl commercials, and popular news stories. However, there is a popular misconception that hazard mitigation is a solved problem. Many people think, "we'll just nuke it." There are, however, significant scientific questions remaining in the hazard mitigation problem. Before we can say with certainty that an explosive yield Y at height of burst h will produce a momentum change in or dispersion of a potentially hazardous object (PHO), we need to quantify how and where energy is deposited into the rubble pile or conglomerate that may make up the PHO. We then need to understand how shock waves propagate through the system, what causes them to disrupt, and how long gravitationally bound fragments take to recombine. Here we present numerical models of energy deposition from an energy source into various materials that are known PHO constituents, and rigid body dynamics models of the recombination of disrupted objects. In the energy deposition models, we explore the effects of porosity and standoff distance as well as that of composition. In the dynamical models, we explore the effects of fragment size and velocity distributions on the time it takes for gravitationally bound fragments to recombine. Initial models indicate that this recombination time is relatively short, as little as 24 hours for a 1 km sized PHO composed of 1000 meter-scale self-gravitating fragments with an initial velocity field of v/r = 0.001 1/s.
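
    The quoted velocity field can be checked against the escape velocity of a kilometre-scale rubble pile, which indicates why the fragments remain gravitationally bound and re-accumulate quickly; the bulk density below is an assumed value, not one taken from the models.

```python
import math

G = 6.674e-11          # gravitational constant (SI)
rho = 2000.0           # assumed bulk density of the rubble pile (kg/m^3)
R = 500.0              # radius of a ~1 km PHO (m)

mass = 4.0 / 3.0 * math.pi * R**3 * rho
v_escape = math.sqrt(2.0 * G * mass / R)
v_edge = 0.001 * R     # fragment speed at the cloud edge for v/r = 0.001 1/s

print(f"escape velocity of the reassembled body: {v_escape:.2f} m/s")
print(f"edge fragment speed for v/r = 0.001 1/s: {v_edge:.2f} m/s")
print("fragments remain gravitationally bound" if v_edge < v_escape
      else "fragments disperse")
```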

  8. Bike Helmets and Black Riders: Experiential Approaches to Helping Students Understand Natural Hazard Assessment and Mitigation Issues

    NASA Astrophysics Data System (ADS)

    Stein, S. A.; Kley, J.; Hindle, D.; Friedrich, A. M.

    2014-12-01

    Defending society against natural hazards is a high-stakes game of chance against nature, involving tough decisions. How should a developing nation allocate its budget between building schools for towns without them or making existing schools earthquake-resistant? Does it make more sense to build levees to protect against floods, or to prevent development in the areas at risk? Would more lives be saved by making hospitals earthquake-resistant, or by using the funds for patient care? These topics are challenging because they are far from normal experience, in that they involve rare events and large sums. To help students in natural hazard classes conceptualize them, we pose tough and thought-provoking questions about the complex issues involved and explore them together via lectures, videos, field trips, and in-class and homework questions. We discuss analogous examples from the students' experiences, drawing on a new book, "Playing Against Nature: Integrating Science and Economics to Mitigate Natural Hazards in an Uncertain World". Asking whether they wear bicycle helmets, and why or why not, shows the cultural perception of risk. Individual students' responses vary, and the overall results vary dramatically between the US, UK, and Germany. Challenges in hazard assessment in an uncertain world are illustrated by asking German students whether they buy a ticket on public transportation - accepting a known cost - or "ride black" - not paying but risking a heavy fine if caught. We explore the challenge of balancing mitigation costs and benefits via the question "If you were a student in Los Angeles, how much more would you pay in rent each month to live in an earthquake-safe building?" Students learn that interdisciplinary thinking is needed, and that due to both uncertainties and sociocultural factors, no unique or right strategies exist for a particular community, much less all communities. However, we can seek robust policies that give sensible results given

  9. Exploratory Studies Facility Subsurface Fire Hazards Analysis

    SciTech Connect

    J. L. Kubicek

    2001-09-07

    The primary objective of this Fire Hazard Analysis (FHA) is to confirm that the requirements for a comprehensive fire and related hazards protection program for the Exploratory Studies Facility (ESF) are sufficient to minimize the potential for: (1) the occurrence of a fire or related event; (2) a fire that causes an unacceptable on-site or off-site release of hazardous or radiological material that will threaten the health and safety of employees, the public or the environment; (3) vital U.S. Department of Energy (DOE) programs suffering unacceptable interruptions as a result of fire and related hazards; (4) property losses from a fire and related events exceeding limits established by DOE; and (5) critical process controls and safety class systems being damaged as a result of a fire and related events.

  10. Exploratory Studies Facility Subsurface Fire Hazards Analysis

    SciTech Connect

    Richard C. Logan

    2002-03-28

    The primary objective of this Fire Hazard Analysis (FHA) is to confirm the requirements for a comprehensive fire and related hazards protection program for the Exploratory Studies Facility (ESF) are sufficient to minimize the potential for: The occurrence of a fire or related event; A fire that causes an unacceptable on-site or off-site release of hazardous or radiological material that will threaten the health and safety of employees, the public or the environment; Vital U.S. Department of Energy (DOE) programs suffering unacceptable interruptions as a result of fire and related hazards; Property losses from a fire and related events exceeding limits established by DOE; and Critical process controls and safety class systems being damaged as a result of a fire and related events.

  11. Disruption mitigation studies in DIII-D

    SciTech Connect

    Taylor, P.L.; Kellman, A.G.; Evans, T.E.; Gray, D.S.; Humphreys, D.A.; Hyatt, A.W.; Jernigan, T.C.; Lee, R.L.; Leuer, J.A.; Luckhardt, S.C.; Parks, P.B.; Schaffer, M.J.; Whyte, D.G.; Zhang, J.

    1999-05-01

    Data on the discharge behavior, thermal loads, halo currents, and runaway electrons have been obtained in disruptions on the DIII-D tokamak [J. L. Luxon and L. G. Davis, Fusion Technol. 8, 2A, 441 (1985)]. These experiments have also evaluated techniques to mitigate the disruptions while minimizing runaway electron production. Experiments injecting cryogenic impurity "killer" pellets of neon and argon and massive amounts of helium gas have successfully reduced these disruption effects. The halo current generation, scaling, and mitigation are understood and are in good agreement with predictions of a semianalytic model. Results from "killer" pellet injection have been used to benchmark theoretical models of the pellet ablation and energy loss. Runaway electrons are often generated by the pellets and new runaway generation mechanisms, modifications of the standard Dreicer process, have been found to explain the runaways. Experiments with the massive helium gas puff have also effectively mitigated disruptions without the formation of runaway electrons that can occur with "killer" pellets. © 1999 American Institute of Physics.

  12. Studies Update Vinyl Chloride Hazards.

    ERIC Educational Resources Information Center

    Rawls, Rebecca

    1980-01-01

    Extensive study affirms that vinyl chloride is a potent animal carcinogen. Epidemiological studies show elevated rates of human cancers in association with extended contact with the compound. (Author/RE)

  13. Laboratory scale studies on mitigation of high 222Rn concentrations in air and water

    NASA Astrophysics Data System (ADS)

    Mamoon, A.; Gomma, M. A.; Sohsah, M.

    2004-01-01

    In view of the occasional occurrence of high 222Rn concentrations in air and water under certain circumstances, and in view of the potential health hazards of increased levels of 222Rn in respirable air and in potable water, mitigation of such high 222Rn concentrations has become a primary concern. To facilitate the study of the efficiency of the various 222Rn mitigating factors, simple laboratory systems were used. Altered alkali granite was used as the radon source to enrich air, and a piece of pitchblende was used as the radon source to enrich water samples. Both enriched media were then subjected to the mitigation treatments. The charcoal canister technique, along with gamma spectrometry, was used to measure 222Rn concentrations in air before and after the different mitigating treatments. These were: use of ventilation, radon barriers such as geo-membranes and aluminum sheet, and sealants such as epoxy and vinyl tape. Regarding high levels of 222Rn in air, ventilation was the most efficient mitigating factor. Standard liquid scintillation counting was used to measure 222Rn concentrations in water before and after the different mitigation treatments. These were: use of aeration, activated charcoal and heating. Regarding high levels of 222Rn in water, aeration using bubblers and a large volume of air was most effective in removing radon from water in a short time. However, all the mitigating factors proved effective, to different degrees, in decreasing 222Rn concentrations in the respective media. The results from these studies are in general agreement with reports in the literature. It can therefore be concluded that the different 222Rn mitigating factors can be tested and compared effectively under controlled conditions using simple laboratory-scale systems.
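
    Comparing the treatments ultimately reduces to a removal-efficiency calculation from the before/after activity concentrations; the sketch below shows that bookkeeping with invented concentrations, not the study's measurements.

```python
# Removal efficiency of 222Rn mitigation treatments from before/after activity
# concentrations (air in Bq/m^3, water in Bq/L). All numbers are invented
# placeholders, not measurements from the study.
treatments = {
    "ventilation (air)":          (800.0, 120.0),
    "geomembrane barrier (air)":  (800.0, 350.0),
    "aeration (water)":           (50.0, 4.0),
    "activated charcoal (water)": (50.0, 12.0),
}

for name, (before, after) in treatments.items():
    efficiency = 100.0 * (before - after) / before
    print(f"{name:28s}: {efficiency:5.1f} % removed")
```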

  14. Advances in Remote Sensing Approaches for Hazard Mitigation and Natural Resource Protection in Pacific Latin America: A Workshop for Advanced Graduate Students, Post-Doctoral Researchers, and Junior Faculty

    NASA Astrophysics Data System (ADS)

    Gierke, J. S.; Rose, W. I.; Waite, G. P.; Palma, J. L.; Gross, E. L.

    2008-12-01

    Though much of the developing world has the potential to gain significantly from remote sensing techniques in terms of public health and safety, these countries often lack the resources for advancing the development and practice of remote sensing. All countries share a mutual interest in furthering remote sensing capabilities for natural hazard mitigation and resource development. With National Science Foundation support from the Partnerships in International Research and Education program, we are developing a new educational system of applied research and engineering for advancing collaborative linkages among agencies and institutions in Pacific Latin American countries (to date: Guatemala, El Salvador, Nicaragua, Costa Rica, Panama, and Ecuador) in the development of remote sensing tools for hazard mitigation and water resources management. The project aims to prepare students for careers in science and engineering through their efforts to solve suites of problems needing creative solutions: collaboration with foreign agencies; living abroad immersed in different cultures; and adapting their academic training to contend with potentially difficult field conditions and limited resources. The ultimate goal of integrating research with education is to encourage cross-disciplinary, creative, and critical thinking in problem solving and foster the ability to deal with uncertainty in analyzing problems and designing appropriate solutions. In addition to traditional approaches for graduate and undergraduate research, we have built new educational systems of applied research and engineering: (1) the Peace Corps/Master's International program in Natural Hazards, which features a 2-year field assignment during service in the U.S. Peace Corps, (2) the Michigan Tech Enterprise program for undergraduates, which gives teams of students from different disciplines the opportunity to work for three years in a business-like setting to solve real-world problems, and (3) a unique university exchange

  15. Probabilistic Tsunami Hazard Assessment: the Seaside, Oregon Pilot Study

    NASA Astrophysics Data System (ADS)

    Gonzalez, F. I.; Geist, E. L.; Synolakis, C.; Titov, V. V.

    2004-12-01

    A pilot study of Seaside, Oregon is underway, to develop methodologies for probabilistic tsunami hazard assessments that can be incorporated into Flood Insurance Rate Maps (FIRMs) developed by FEMA's National Flood Insurance Program (NFIP). Current NFIP guidelines for tsunami hazard assessment rely on the science, technology and methodologies developed in the 1970s; although generally regarded as groundbreaking and state-of-the-art for its time, this approach is now superseded by modern methods that reflect substantial advances in tsunami research achieved in the last two decades. In particular, post-1990 technical advances include: improvements in tsunami source specification; improved tsunami inundation models; better computational grids by virtue of improved bathymetric and topographic databases; a larger database of long-term paleoseismic and paleotsunami records and short-term, historical earthquake and tsunami records that can be exploited to develop improved probabilistic methodologies; better understanding of earthquake recurrence and probability models. The NOAA-led U.S. National Tsunami Hazard Mitigation Program (NTHMP), in partnership with FEMA, USGS, NSF and Emergency Management and Geotechnical agencies of the five Pacific States, incorporates these advances into site-specific tsunami hazard assessments for coastal communities in Alaska, California, Hawaii, Oregon and Washington. NTHMP hazard assessment efforts currently focus on developing deterministic, "credible worst-case" scenarios that provide valuable guidance for hazard mitigation and emergency management. The NFIP focus, on the other hand, is on actuarial needs that require probabilistic hazard assessments such as those that characterize 100- and 500-year flooding events. There are clearly overlaps in NFIP and NTHMP objectives. NTHMP worst-case scenario assessments that include an estimated probability of occurrence could benefit the NFIP; NFIP probabilistic assessments of 100- and 500-yr
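
    The 100- and 500-year flood language used by the NFIP corresponds to annual exceedance probabilities of 1/100 and 1/500, and the chance of experiencing at least one such event over a finite exposure window follows directly; a short worked sketch (assuming independent years):

```python
def exceedance_probability(return_period_years, exposure_years):
    """Chance of at least one T-year event in an n-year window, independent years:
    P = 1 - (1 - 1/T) ** n."""
    return 1.0 - (1.0 - 1.0 / return_period_years) ** exposure_years

for T in (100, 500):
    for n in (30, 50):
        p = exceedance_probability(T, n)
        print(f"{T}-yr event, {n}-yr exposure: {p:.0%} chance of at least one occurrence")
```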

  16. Remote Sensing for Hazard Mitigation and Resource Protection in Pacific Latin America: New NSF sponsored initiative at Michigan Tech.

    NASA Astrophysics Data System (ADS)

    Rose, W. I.; Bluth, G. J.; Gierke, J. S.; Gross, E.

    2005-12-01

    Though much of the developing world has the potential to gain significantly from remote sensing techniques in terms of public health and safety and, eventually, economic development, these countries lack the resources required to advance the development and practice of remote sensing. Both developed and developing countries share a mutual interest in furthering remote sensing capabilities for natural hazard mitigation and resource development, and this common commitment creates a solid foundation upon which to build an integrated education and research project. This will prepare students for careers in science and engineering through their efforts to solve a suite of problems needing creative solutions: collaboration with foreign agencies; living abroad immersed in different cultures; and adapting their academic training to contend with potentially difficult field conditions and limited resources. This project makes two important advances: (1) We intend to develop the first formal linkage among geoscience agencies from four Pacific Latin American countries (Guatemala, El Salvador, Nicaragua and Ecuador), focusing on the collaborative development of remote sensing tools for hazard mitigation and water resource development; (2) We will build a new educational system of applied research and engineering, using two existing educational programs at Michigan Tech: a new Peace Corps/Master's International (PC/MI) program in Natural Hazards, which features a 2-year field assignment, and an "Enterprise" program for undergraduates, which gives teams of geoengineering students the opportunity to work for three years in a business-like setting to solve real-world problems. This project will involve 1-2 post-doctoral researchers, 3 Ph.D. students, 9 PC/MI students, and roughly 20 undergraduate students each year.

  17. The respiratory health hazards of volcanic ash: a review for volcanic risk mitigation

    NASA Astrophysics Data System (ADS)

    Horwell, Claire J.; Baxter, Peter J.

    2006-07-01

    Studies of the respiratory health effects of different types of volcanic ash have been undertaken only in the last 40 years, and mostly since the eruption of Mt. St. Helens in 1980. This review of all published clinical, epidemiological and toxicological studies, and other work known to the authors up to and including 2005, highlights the sparseness of studies on acute health effects after eruptions and the complexity of evaluating the long-term health risk (silicosis, non-specific pneumoconiosis and chronic obstructive pulmonary disease) in populations from prolonged exposure to ash due to persistent eruptive activity. The acute and chronic health effects of volcanic ash depend upon particle size (particularly the proportion of respirable-sized material), mineralogical composition (including the crystalline silica content) and the physico-chemical properties of the surfaces of the ash particles, all of which vary between volcanoes and even eruptions of the same volcano, but adequate information on these key characteristics is not reported for most eruptions. The incidence of acute respiratory symptoms (e.g. asthma, bronchitis) varies greatly after ashfalls, from very few, if any, reported cases to population outbreaks of asthma. The studies are inadequate for excluding increases in acute respiratory mortality after eruptions. Individuals with pre-existing lung disease, including asthma, can be at increased risk of their symptoms being exacerbated after falls of fine ash. A comprehensive risk assessment, including toxicological studies, to determine the long-term risk of silicosis from chronic exposure to volcanic ash, has been undertaken only in the eruptions of Mt. St. Helens (1980), USA, and Soufrière Hills, Montserrat (1995 onwards). In the Soufrière Hills eruption, a long-term silicosis hazard has been identified and sufficient exposure and toxicological information obtained to make a probabilistic risk assessment for the development of silicosis in outdoor

  18. Piloted Simulation to Evaluate the Utility of a Real Time Envelope Protection System for Mitigating In-Flight Icing Hazards

    NASA Technical Reports Server (NTRS)

    Ranaudo, Richard J.; Martos, Borja; Norton, Bill W.; Gingras, David R.; Barnhart, Billy P.; Ratvasky, Thomas P.; Morelli, Eugene

    2011-01-01

    The utility of the Icing Contamination Envelope Protection (ICEPro) system for mitigating a potentially hazardous icing condition was evaluated by 29 pilots using the NASA Ice Contamination Effects Flight Training Device (ICEFTD). ICEPro provides real time envelope protection cues and alerting messages on pilot displays. The pilots participating in this test were divided into two groups: a control group using baseline displays without ICEPro, and an experimental group using ICEPro-driven display cueing. Each group flew identical precision approach and missed approach procedures with a simulated failure case icing condition. Pilot performance, workload, and survey questionnaires were collected for both groups of pilots. Results showed that real time assessment cues were effective in reducing the number of potentially hazardous upset events and in lessening exposure to loss of control following an incipient upset condition. Pilot workload with the added ICEPro displays was not measurably affected, but pilot opinion surveys showed that real time cueing greatly improved their situation awareness of a hazardous aircraft state.

  19. Assessing NEO hazard mitigation in terms of astrodynamics and propulsion systems requirements.

    PubMed

    Remo, John L

    2004-05-01

    Uncertainties associated with assessing valid near-Earth object (NEO) threats and carrying out interception missions place unique and stringent burdens on designing mission architecture, astrodynamics, and spacecraft propulsion systems. A prime uncertainty is associated with the meaning of NEO orbit predictability regarding Earth impact. Analyses of past NEO orbits and impact probabilities indicate uncertainties in determining whether a projected NEO threat will actually materialize within a given time frame. Other uncertainties concern the estimated mass, composition, and structural integrity of the NEO body. At issue is whether one can reliably estimate a NEO threat and its magnitude. Parameters that determine NEO deflection requirements within various time frames, including the terminal orbital pass before impact, and the necessary energy payloads, are quantitatively discussed. Propulsion system requirements for extending space capabilities to rapidly interact with NEOs at ranges of up to about 1 AU (astronomical unit) from Earth are outlined. Such missions, without gravitational boosts, are deemed critical for a practical and effective mitigation response. If an impact threat is confirmed on an immediate orbital pass, the options for interactive reconnaissance, interception, and subsequent NEO orbit deflection must be promptly carried out. There must also be an option to abort the mitigation mission if the NEO is subsequently found not to be Earth threatening. These options require optimal decision latitude and operational possibilities for NEO threat removal while minimizing alarm. Acting too far in advance of the projected impact could induce perturbations that ultimately exacerbate the threat. Given the dilemmas, uncertainties, and limited options associated with timely NEO mitigation within a decision making framework, currently available propulsion technologies that appear most viable to carry out a NEO interception/mitigation mission within the greatest margin of
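
    The deflection requirements discussed above scale strongly with lead time. As a hedged illustration only (not taken from the paper), a commonly quoted rule of thumb is that an along-track velocity change dv applied a time t before impact shifts the arrival position by roughly 3*dv*t; the short sketch below inverts that relation for a one-Earth-radius miss distance. All numbers are illustrative assumptions.

        # Illustrative estimate of the along-track delta-v needed to shift a NEO's
        # arrival point by one Earth radius, using the rough rule miss ~ 3 * dv * t.
        # This heuristic and the numbers below are assumptions for illustration,
        # not values taken from the cited paper.

        EARTH_RADIUS_M = 6.371e6
        SECONDS_PER_YEAR = 3.15576e7

        def required_delta_v(lead_time_years: float, miss_distance_m: float = EARTH_RADIUS_M) -> float:
            """Return the approximate along-track delta-v (m/s) for a given lead time."""
            t = lead_time_years * SECONDS_PER_YEAR
            return miss_distance_m / (3.0 * t)

        if __name__ == "__main__":
            for years in (1, 5, 10, 20):
                dv = required_delta_v(years)
                print(f"lead time {years:2d} yr -> delta-v ~ {dv*1000:.2f} mm/s")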

  20. Hazardous near Earth asteroid mitigation campaign planning based on uncertain information on fundamental asteroid characteristics

    NASA Astrophysics Data System (ADS)

    Sugimoto, Y.; Radice, G.; Ceriotti, M.; Sanchez, J. P.

    2014-10-01

    Given a limited warning time, an asteroid impact mitigation campaign would hinge on uncertainty-based information consisting of remote observational data of the identified Earth-threatening object, general knowledge of near-Earth asteroids (NEAs), and engineering judgment. Due to these ambiguities, the campaign credibility could be profoundly compromised. It is therefore imperative to comprehensively evaluate the inherent uncertainty in deflection and plan the campaign accordingly to ensure successful mitigation. This research demonstrates dual-deflection mitigation campaigns consisting of primary (instantaneous/quasi-instantaneous) and secondary (slow-push) deflection missions, where both deflection efficiency and campaign credibility are taken into account. The results of the dual-deflection campaign analysis show that there are trade-offs between the competing aspects: the launch cost, mission duration, deflection distance, and the confidence in successful deflection. The design approach is found to be useful for multi-deflection campaign planning, allowing us to select the best possible combination of missions from a catalogue of campaign options, without compromising the campaign credibility.

  1. Hawaiian cultural influences on support for lava flow hazard mitigation measures during the January 1960 eruption of Kīlauea volcano, Kapoho, Hawai‘i

    NASA Astrophysics Data System (ADS)

    Gregg, C. E.; Houghton, B. F.; Paton, D.; Swanson, D. A.; Lachman, R.; Bonk, W. J.

    2008-05-01

    In 1960, Kīlauea volcano in Hawaii erupted, destroying most of the village of Kapoho and forcing evacuation of its approximately 300 residents. A large and unprecedented social science survey was undertaken during the eruption to develop an understanding of human behavior, beliefs, and coping strategies among the adult evacuees (n = 160). Identical studies were also performed in three control towns located at varying distances from the eruption site (n = 478). During these studies, data were collected that characterized ethnic grouping and attitudes toward Hawaiian cultural issues such as belief in Pele, and toward two lava flow mitigation measures (the use of barriers and of bombs to influence the flow of lava), but the data were never published. Using these forgotten data, we examined the relationship between Hawaiian cultural issues and attitudes toward the use of barriers and bombs as mitigation strategies to protect Kapoho. On average, 72% of respondents favored the construction of earthen barriers to hold back or divert lava and protect Kapoho, but far fewer agreed with the military's use of bombs (14%) to protect Kapoho. In contrast, about one-third of respondents conditionally agreed with the use of bombs. It is suggested that local participation in the bombing strategy may explain the increased conditional acceptance of bombs as a mitigation tool, although this cannot be conclusively demonstrated. Belief in Pele and being of Hawaiian ethnicity did not reduce support for the use of barriers, but did reduce support for bombs in both bombing scenarios. The disparity in levels of acceptance of barriers versus bombing and of one bombing strategy versus another suggests that historically public attitudes toward lava flow hazard mitigation strategies were complex. A modern comparative study is needed before the next damaging eruption to inform debates and decisions about whether or not to interfere with the flow of lava. Recent changes in the current eruption of Kīlauea make this a timely topic.

  2. Marine and Hydrokinetic Renewable Energy Devices, Potential Navigational Hazards and Mitigation Measures

    SciTech Connect

    Cool, Richard, M.; Hudon, Thomas, J.; Basco, David, R.; Rondorf, Neil, E.

    2009-12-01

    On April 15, 2008, the Department of Energy (DOE) issued a Funding Opportunity Announcement for Advanced Water Power Projects which included a Topic Area for Marine and Hydrokinetic Renewable Energy Market Acceleration Projects. Within this Topic Area, DOE identified potential navigational impacts of marine and hydrokinetic renewable energy technologies and measures to prevent adverse impacts on navigation as a sub-topic area. DOE defines marine and hydrokinetic technologies as those capable of utilizing one or more of the following resource categories for energy generation: ocean waves; tides or ocean currents; free flowing water in rivers or streams; and energy generation from the differentials in ocean temperature. PCCI was awarded Cooperative Agreement DE-FC36-08GO18177 from the DOE to identify the potential navigational impacts and mitigation measures for marine hydrokinetic technologies. A technical report addressing our findings is available on this Science and Technology Information site under the Product Title, "Marine and Hydrokinetic Renewable Energy Technologies: Potential Navigational Impacts and Mitigation Measures". This product is a brochure, primarily for project developers, that summarizes important issues in that more comprehensive report, identifies locations where that report can be downloaded, and identifies points of contact for more information.

  3. Debris flood hazard documentation and mitigation on the Tilcara alluvial fan (Quebrada de Humahuaca, Jujuy province, North-West Argentina)

    NASA Astrophysics Data System (ADS)

    Marcato, G.; Bossi, G.; Rivelli, F.; Borgatti, L.

    2012-06-01

    For some decades, mass wasting processes such as landslides and debris floods have been threatening villages and transportation routes in the Rio Grande Valley, named Quebrada de Humahuaca. One of the most significant examples is the urban area of Tilcara, built on a large alluvial fan. In recent years, debris flood phenomena have been triggered in the tributary valley of the Huasamayo Stream and reached the alluvial fan on a decadal basis. In view of proper development of the area, hazard and risk assessment together with risk mitigation strategies are of paramount importance. The need is urgent also because the Quebrada de Humahuaca was recently included in the UNESCO World Cultural Heritage list. Therefore, the growing tourism industry may lead to uncontrolled exploitation and urbanization of the valley, with a consequent increase in the vulnerability of the elements exposed to risk. In this context, structural and non-structural mitigation measures not only have to be based on the understanding of natural processes, but also have to consider environmental and sociological factors that could hinder the effectiveness of the countermeasure works. The hydrogeological processes are described with reference to present-day hazard and risk conditions. Considering the socio-economic context, some possible interventions are outlined, which account for budget constraints and local practices. One viable solution would be to build a protecting dam upstream of the fan apex and an artificial channel, in order to divert the floodwaters into a gully that would then convey water and sediments into the Rio Grande, some kilometers downstream of Tilcara. The proposed remedial measures should employ easily available and relatively cheap technologies and local workers, with low environmental and visual impact, in order to ensure both the future conservation of the site and its safe exploitation by inhabitants and tourists.

  4. Societal transformation and adaptation necessary to manage dynamics in flood hazard and risk mitigation (TRANS-ADAPT)

    NASA Astrophysics Data System (ADS)

    Fuchs, Sven; Thaler, Thomas; Bonnefond, Mathieu; Clarke, Darren; Driessen, Peter; Hegger, Dries; Gatien-Tournat, Amandine; Gralepois, Mathilde; Fournier, Marie; Mees, Heleen; Murphy, Conor; Servain-Courant, Sylvie

    2015-04-01

    Facing the challenges of climate change, this project aims to analyse and evaluate the multiple use of flood alleviation schemes with respect to social transformation in communities exposed to flood hazards in Europe. The overall goals are: (1) the identification of indicators and parameters necessary for strategies to increase societal resilience, (2) an analysis of the institutional settings needed for societal transformation, and (3) perspectives on changing divisions of responsibilities between public and private actors necessary to arrive at more resilient societies. This proposal assesses societal transformations from the perspective of changing divisions of responsibilities between public and private actors necessary to arrive at more resilient societies. Yet each risk mitigation measure is built on a narrative of exchanges and relations between people and therefore may condition the outputs. As such, governance is done by people interacting; risk mitigation measures and climate change adaptation are therefore simultaneously both outcomes of, and productive of, public and private responsibilities. Building on current knowledge, this project will focus on different dimensions of adaptation and mitigation strategies based on social, economic and institutional incentives and settings, centring on the linkages between these different dimensions and complementing existing flood risk governance arrangements. The policy dimension of adaptation, predominantly decisions on the societally admissible level of vulnerability and risk, will be evaluated by a human-environment interaction approach using multiple methods and the assessment of social capacities of stakeholders across scales. As such, the challenges of adaptation to flood risk will be tackled by converting scientific frameworks into practical assessment and policy advice. In addressing the relationship between these dimensions of adaptation on different temporal and spatial scales, this

  5. Status of Seismotectonic and seismic hazard studies in South Africa

    NASA Astrophysics Data System (ADS)

    Midzi, V.

    2012-04-01

    Though South Africa is considered to lie in a stable continental region, earthquakes are recorded and located daily. Large events have been recorded that resulted in severe damage to infrastructure in nearby towns, farms and underground mines and, in some circumstances, even deaths. Therefore, it is necessary that we consider the effects of these events in the design of our infrastructure. This mitigation is done by carrying out reliable seismic hazard and risk studies of our regions using state-of-the-art methodologies. In South Africa, several regional seismic hazard studies have been carried out and published. Continental-wide studies that include the South African region were also published by various scientists from the continent (e.g. GSHAP). However, to ensure that we conform to international best practice in such studies, more studies are needed and are being done to improve the data, knowledge and methodologies used in the assessments. We continue to collect and improve collection methods of historical and instrumental seismicity data. Available geological information is being used to identify and characterize active or capable faults.

  6. A probabilistic framework for hazard assessment and mitigation of induced seismicity related to deep geothermal systems

    NASA Astrophysics Data System (ADS)

    Wiemer, S.; Bachmann, C. E.; Allmann, B.; Giardini, D.; Woessner, J.; Catalli, F.; Mena Carbrera, B.

    2011-12-01

    Slip on tectonic faults takes place over a wide range of spatial and temporal scales as earthquakes, continuous aseismic creep, or transient creep events. Shallow creep events on continental strike-slip faults can occur spontaneously, or are coupled with earthquake afterslip, or are triggered by nearby earthquakes. Despite more than five decades of observations, the mechanism of shallow creep events and their implications for seismic hazard are still not fully understood. To understand the mechanism of creep events, we developed a physics-based numerical model to simulate shallow creep events on a strike-slip fault with rate-and-state frictional properties (Wei et al., 2013). We show that the widely used synoptic model (Scholz, 1998) cannot reproduce both rapid afterslip and frequent creep events as observed on the Superstition Hills fault in the Salton Trough after the 1987 Mw 6.6 earthquake. Rather, an unstable layer embedded in the shallow stable zone is required to match the geodetic observations of the creep behavior. Using the strike-slip fault model, we studied the triggering of creep events by static, dynamic, or combined stress perturbations induced on the fault by nearby earthquakes. Preliminary results show that static perturbations of the effective normal stress on a system with spontaneous creep events can advance or delay creep events. The magnitude and timing of the perturbations determine the clock change of creep events. The magnitude and interval of creep events change permanently after a static stress perturbation. Dynamic perturbations of the effective normal stress can advance the timing of creep events when the perturbation temporarily decreases the effective normal stress. A threshold exists for instantaneous triggering. The size of the triggered slip increases as the dynamic perturbation increases in the direction of reduced normal stress. The system returns to its pre-perturbation state after a long period of no slip. The length
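
    The rate-and-state frictional properties mentioned above follow a standard formulation in the friction literature; the sketch below encodes the conventional Dieterich-Ruina law with the aging law for state evolution. Parameter values are generic laboratory-scale assumptions, not those calibrated in the cited model.

        import numpy as np

        def rate_state_friction(v, theta, mu0=0.6, a=0.008, b=0.012, v0=1e-6, dc=0.01):
            """Friction coefficient under the standard rate-and-state law
            mu = mu0 + a*ln(v/v0) + b*ln(v0*theta/dc).
            v: slip rate (m/s), theta: state variable (s); parameters are
            illustrative, not values from the cited study."""
            return mu0 + a * np.log(v / v0) + b * np.log(v0 * theta / dc)

        def theta_rate(v, theta, dc=0.01):
            """Aging-law state evolution d(theta)/dt = 1 - v*theta/dc."""
            return 1.0 - v * theta / dc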

  7. Volcanic risk: mitigation of lava flow invasion hazard through optimized barrier configuration

    NASA Astrophysics Data System (ADS)

    Scifoni, S.; Coltelli, M.; Marsella, M.; Napoleoni, Q.; Del Negro, C.; Proietti, C.; Vicari, A.

    2009-04-01

    In order to mitigate the destructive effects of lava flows along volcanic slopes, the building of artificial barriers is a fundamental action for controlling and slowing down the advance of lava flows, as experienced during a few recent eruptions of Etna. The simulated lava path can be used to define an optimized design and location for the works, but for a timely action it is also necessary to construct the barrier quickly. This work therefore investigates different types of engineering works that can be adopted to build a lava-containing barrier and improve the efficiency of the structure. From the analysis of historical cases it is clear that barriers were generally constructed by building up earth, lava blocks and incoherent, low-density material. This solution implies complex operational constraints and logistical problems that justify the effort of looking for an alternative design. Moreover, to optimize the barrier construction, an alternative design of a gabion-made barrier is proposed here. In this way the volume of mobilized material is lower than that for an earthen barrier, thus reducing the time needed to build up the structure. A second crucial aspect to be considered is the geometry of the barrier, which is one of the few parameters that can be modulated, the others being linked to the morphological and topographical characteristics of the ground. Once the walls have been realized, it may be necessary to expand the structure vertically. The use of gabions has many advantages over loose riprap (earthen walls) owing to their modularity and capability of being stacked in various shapes. Furthermore, the elements which are not inundated by lava can be removed and rapidly reused for other barriers. The combination of numerical simulations and gabions will allow a quicker mitigation of lava flow risk, which is an important aspect for civil protection interventions in emergency cases.

  8. Smart Oceans BC: Supporting Coastal and Ocean Natural Hazards Mitigation for British Columbia

    NASA Astrophysics Data System (ADS)

    Moran, K.; Insua, T. L.; Pirenne, B.; Hoeberechts, M.; McLean, S.

    2014-12-01

    Smart Oceans BC is a new multi-faceted program to support decision-makers faced with responding to natural disasters and hazards in Canada's Province of British Columbia. It leverages the unique capabilities of Ocean Networks Canada's cabled ocean observatories, NEPTUNE and VENUS, to enhance public safety, marine safety and environmental monitoring. Smart Oceans BC combines existing and new marine sensing technology with Ocean Networks Canada's robust data management and archive system, Oceans 2.0, to deliver information and science for good ocean management and responsible ocean use. Smart Oceans BC includes new ocean observing infrastructure for: public safety, through natural hazard detection for earthquake ground shaking and near-field tsunamis; marine safety, by monitoring and providing alerts on sea state, ship traffic, and marine mammal presence; and environmental protection, by establishing baseline data in critical areas and providing real-time environmental observations. Here we present the elements of this new ocean observing initiative that are focused on tsunami and earthquake early warning, including cabled and autonomous sensor systems, real-time data delivery, software developments that enable rapid detection, analytics used in notification development, and stakeholder engagement plans.

  9. A fast global tsunami modeling suite as a trans-oceanic tsunami hazard prediction and mitigation tool

    NASA Astrophysics Data System (ADS)

    Mohammed, F.; Li, S.; Jalali Farahani, R.; Williams, C. R.; Astill, S.; Wilson, P. S.; B, S.; Lee, R.

    2014-12-01

    The past decade has been witness to two mega-tsunami events, the 2004 Indian Ocean tsunami and the 2011 Japan tsunami, and to multiple major tsunami events: 2006 Java and Kuril Islands, 2007 Solomon Islands, 2009 Samoa and 2010 Chile, to name a few. These events generated both local and far-field tsunami inundations with runup ranging from a few meters to around 40 m in the coastal impact regions. With a majority of the coastal population at risk, there is a need for a sophisticated outlook towards catastrophe risk estimation and a quick mitigation response. At the same time, tools and information are needed to aid advanced tsunami hazard prediction. There is an increased need for insurers, reinsurers and Federal hazard management agencies to quantify coastal inundations and the vulnerability of coastal habitat to tsunami inundations. A novel tool is developed to model local and far-field tsunami generation, propagation and inundation to estimate tsunami hazards. The tool is a combination of the NOAA MOST propagation database and an efficient and fast GPU (Graphics Processing Unit)-based non-linear shallow water wave model solver. The tsunamigenic seismic sources are mapped onto the NOAA unit source distribution along subduction zones in the ocean basin. Slip models are defined for tsunamigenic seismic sources through a slip distribution on the unit sources while maintaining limits on fault areas. A GPU-based finite volume solver is used to simulate non-linear shallow water wave propagation, inundation and runup. Deformation on the unit sources provides initial conditions for modeling local impacts, while the wave history from the propagation database provides boundary conditions for far-field impacts. The modeling suite shows good agreement for basin-wide tsunami propagation, validating local and far-field tsunami inundations.
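
    The solver described is a GPU-based finite volume scheme for the non-linear shallow water equations. As a minimal CPU-side sketch of the underlying numerics only (not the authors' implementation), the function below advances the 1-D shallow water equations one explicit step with a Rusanov (local Lax-Friedrichs) flux over a flat bed; boundary treatment, bathymetry and source terms are omitted.

        import numpy as np

        G = 9.81  # gravitational acceleration (m/s^2)

        def rusanov_step(h, hu, dx, dt):
            """One explicit finite-volume step for the 1-D nonlinear shallow water
            equations U_t + F(U)_x = 0 with a Rusanov flux. h and hu are 1-D numpy
            arrays of cell averages (depth and momentum); boundary cells are left
            untouched as a simple placeholder boundary treatment."""
            u = hu / np.maximum(h, 1e-8)
            # Physical flux F(U) = [h*u, h*u^2 + g*h^2/2]
            f1 = hu
            f2 = hu * u + 0.5 * G * h**2
            # Maximum wave speed at each interface for the numerical dissipation term
            c = np.abs(u) + np.sqrt(G * h)
            a = np.maximum(c[:-1], c[1:])
            # Rusanov interface fluxes
            F1 = 0.5 * (f1[:-1] + f1[1:]) - 0.5 * a * (h[1:] - h[:-1])
            F2 = 0.5 * (f2[:-1] + f2[1:]) - 0.5 * a * (hu[1:] - hu[:-1])
            # Conservative update of the interior cells
            h_new, hu_new = h.copy(), hu.copy()
            h_new[1:-1] -= dt / dx * (F1[1:] - F1[:-1])
            hu_new[1:-1] -= dt / dx * (F2[1:] - F2[:-1])
            return h_new, hu_new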

  10. Field Guide for Testing Existing Photovoltaic Systems for Ground Faults and Installing Equipment to Mitigate Fire Hazards: November 2012 - October 2013

    SciTech Connect

    Brooks, William

    2015-02-01

    Ground faults and arc faults are the two most common reasons for fires in photovoltaic (PV) arrays and methods exist that can mitigate the hazards. This report provides field procedures for testing PV arrays for ground faults, and for implementing high resolution ground fault and arc fault detectors in existing and new PV system designs.

  11. Novel bio-inspired smart control for hazard mitigation of civil structures

    NASA Astrophysics Data System (ADS)

    Kim, Yeesock; Kim, Changwon; Langari, Reza

    2010-11-01

    In this paper, a new bio-inspired controller is proposed for vibration mitigation of smart structures subjected to ground disturbances (i.e. earthquakes). The control system is developed through the integration of a brain emotional learning (BEL) algorithm with a proportional-integral-derivative (PID) controller and a semiactive inversion (Inv) algorithm. The BEL algorithm is based on the neurologically inspired computational model of the amygdala and the orbitofrontal cortex. To demonstrate the effectiveness of the proposed hybrid BEL-PID-Inv control algorithm, a seismically excited building structure equipped with a magnetorheological (MR) damper is investigated. The performance of the proposed hybrid BEL-PID-Inv control algorithm is compared with that of passive, PID, linear quadratic Gaussian (LQG), and BEL control systems. In the simulation, the robustness of the hybrid BEL-PID-Inv control algorithm in the presence of modeling uncertainties as well as external disturbances is investigated. It is shown that the proposed hybrid BEL-PID-Inv control algorithm is effective in improving the dynamic responses of seismically excited building structure-MR damper systems.
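
    The hybrid controller integrates a BEL network with a conventional PID loop and a semiactive inversion stage. Only the generic PID component is sketched below, as a discrete-time update that could map a measured structural response error to a desired control force; the BEL and inversion algorithms are specific to the paper and are not reproduced. Gains and signals are illustrative assumptions.

        class DiscretePID:
            """Discrete-time PID controller, e.g. for computing a desired control
            force from a measured structural response error. Gains are illustrative
            placeholders, not values from the cited study."""

            def __init__(self, kp: float, ki: float, kd: float, dt: float):
                self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
                self.integral = 0.0
                self.prev_error = 0.0

            def update(self, error: float) -> float:
                # Accumulate the integral term and approximate the derivative.
                self.integral += error * self.dt
                derivative = (error - self.prev_error) / self.dt
                self.prev_error = error
                return self.kp * error + self.ki * self.integral + self.kd * derivative

        # Example: drive a measured floor displacement (m) toward a zero setpoint.
        pid = DiscretePID(kp=50.0, ki=5.0, kd=2.0, dt=0.001)
        desired_force = pid.update(error=0.0 - 0.012)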

  12. Educational Approach to Seismic Risk Mitigation in Indian Himalayas -Hazard Map Making Workshops at High Schools-

    NASA Astrophysics Data System (ADS)

    Koketsu, K.; Oki, S.; Kimura, M.; Chadha, R. K.; Davuluri, S.

    2014-12-01

    How can we encourage people to take preventive measures against damage risks and empower them to take the right actions in emergencies to save their lives? The conventional approach taken by scientists had been disseminating intelligible information on up-to-date seismological knowledge. However, it has been proven that knowledge alone does not have enough impact to modify people's behaviors in emergencies (Oki and Nakayachi, 2012). On the other hand, the conventional approach taken by practitioners had been to conduct emergency drills at schools or workplaces. The loss of many lives in the 2011 Tohoku earthquake has proven that these emergency drills were not enough to save people's lives unless they were also empowered to assess the given situation on their own and react flexibly. Our challenge is to bridge the gap between knowledge and practice. With reference to best practices observed in Tohoku, such as The Miracles of Kamaishi, our endeavor is to design an effective Disaster Preparedness Education Program that is applicable to other disaster-prone regions in the world, even with different geological, socio-economical and cultural backgrounds. The key concepts for this new approach are 1) empowering individuals to take preventive actions to save their lives, 2) granting community-based understanding of disaster risks and 3) building a sense of reality and relevancy to disasters. With these in mind, we held workshops at some high schools in the Lesser Himalayan Region, combining lectures with an activity called "Hazard Map Making" in which students proactively identify and assess the hazards around their living areas and learn practical strategies on how to manage risks. We observed changes in the students' awareness by conducting a preliminary questionnaire survey and interviews after each session. Results strongly implied that the significant change in students' attitudes towards disaster preparedness occurred not through the lectures on scientific knowledge, but

  13. Using Darwin's theory of atoll formation to improve tsunami hazard mitigation in the Pacific

    NASA Astrophysics Data System (ADS)

    Goff, J. R.; Terry, J. P.

    2012-12-01

    It is 130 years since Charles Darwin's death and 176 years since he penned his subsidence theory of atoll formation on 12th April 1836 during the voyage of the Beagle through the Pacific. This theory, founded on the premise of a subsiding volcano and the corresponding upward growth of coral reef, was astonishing for the time considering the absence of an underpinning awareness of plate tectonics. Furthermore, with the exception of the occasional permutation and opposing idea, his theory has endured and has an enviable longevity amongst paradigms in geomorphology. In his theory, Darwin emphasised the generally circular morphology of the atoll shape and, surprisingly, the validity of this simple morphological premise has never been questioned. There are, however, few atolls in the Pacific Ocean that attain such a simple morphology, with most manifesting one or more arcuate 'bight-like' structures (ABLSs). These departures from the circular form complicate his simplistic model and are indicative of geomorphological processes in the Pacific Ocean which cannot be ignored. ABLSs represent the surface morphological expression of major submarine failures of atoll volcanic foundations. Such failures can occur during any stage of atoll formation and are a valuable addition to Darwin's theory because they indicate the instability of the volcanic foundations. It is widely recognized in the research community that sector/flank collapses of island edifices are invariably tsunamigenic, and yet we have no clear understanding of how significant such events are in the tsunami hazard arena. The recognition of ABLSs, however, now offers scientists the opportunity to establish a first-order database of potential local and regional tsunamigenic sources associated with the sector/flank collapses of island edifices. We illustrate the talk with examples of arcuate 'bight-like' structures and associated tsunamis in atoll and atoll-like environments. The implications for our understanding of

  14. Hawaiian cultural influences on support for lava flow hazard mitigation measures during the January 1960 eruption of Kīlauea volcano, Kapoho, Hawai‘i

    USGS Publications Warehouse

    Gregg, Chris E.; Houghton, B.F.; Paton, Douglas; Swanson, D.A.; Lachman, R.; Bonk, W.J.

    2008-01-01

    On average, 72% of respondents favored the construction of earthen barriers to hold back or divert lava and protect Kapoho, but far fewer agreed with the military's use of bombs (14%) to protect Kapoho. In contrast, about one-third of respondents conditionally agreed with the use of bombs. It is suggested that local participation in the bombing strategy may explain the increased conditional acceptance of bombs as a mitigation tool, although this cannot be conclusively demonstrated. Belief in Pele and being of Hawaiian ethnicity did not reduce support for the use of barriers, but did reduce support for bombs in both bombing scenarios. The disparity in levels of acceptance of barriers versus bombing and of one bombing strategy versus another suggests that historically public attitudes toward lava flow hazard mitigation strategies were complex. A modern comparative study is needed before the next damaging eruption to inform debates and decisions about whether or not to interfere with the flow of lava. Recent changes in the current eruption of Kīlauea make this a timely topic.

  15. Sea otter oil-spill mitigation study

    SciTech Connect

    Davis, R.W.; Thomas, J.; Williams, T.M.; Kastelein, R.; Cornell, L.

    1986-05-01

    The objective of the study was to analyze the effectiveness of existing capture, transport, cleaning, and rehabilitation methods and to develop new methods to reduce the impact of an accidental oil spill on California sea otters, resulting from the present conditions or from future Outer Continental Shelf (OCS) oil and gas development in State or Federal waters. In addition, the study investigated whether or not a systematic difference in thermal conductivity existed between the pelts of Alaska and California sea otters. This was done to assure that conclusions drawn from the oiling experiments carried out at Hubbs Marine Research Institute would also apply to California sea otters. Tetra Tech, Inc. contributed to the overall study by preparing a literature review and report on the fate and effects of oil dispersants and chemically dispersed oil.

  16. The subsurface cross section resistivity using magnetotelluric method in Pelabuhan Ratu area, West Java, implication for geological hazard mitigation

    NASA Astrophysics Data System (ADS)

    Gaffar, Eddy Z.

    2016-02-01

    Pelabuhan Ratu is located on the south coast of West Java. The area's rapid development and population growth were partly stimulated by Indonesian Government Regulation No. 66 of 1998, which made Pelabuhan Ratu the capital of the Sukabumi district. Because of this, it is very important to create a geological hazard mitigation plan for the area. Pelabuhan Ratu is crossed by two major faults: the Cimandiri fault in the west and the Citarik fault in the east. The Cimandiri fault runs from the upstream reaches of the Cimandiri River to the south of the cities of Sukabumi and Cianjur, while the Citarik fault runs from the Citarik River to Mount Salak. These two faults need to be observed closely, as they are prone to causing earthquakes in the area. To support mitigation of earthquakes that may occur on the Cimandiri or Citarik faults, the Research Center for Geotechnology LIPI conducted research using the magnetotelluric (MT) method with a Phoenix MT instrument to determine the resistivity cross-section of Pelabuhan Ratu and the surrounding area. Measurements were taken at 40 points along the highway from Jampang to Pelabuhan Ratu and towards Cibadak in the direction of Bandung, with a distance of less than 500 meters between measuring points. The measurements yield an AMT resistivity cross-section to a depth of 1500 meters below the surface. The resistivity cross-sections show layers of rock with resistivities of about 10 Ohm-m to 1000 Ohm-m. Rocks with a resistivity of about 10 Ohm-m were interpreted as conductive, loose rocks or water-bearing sandstone. If an earthquake were to occur in this area, it could lead to strong ground motion and liquefaction that would destroy buildings and potentially cause casualties.
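
    Magnetotelluric soundings of this kind recover resistivity from the impedance between orthogonal electric and magnetic field components. The sketch below applies the standard textbook apparent-resistivity and phase relations for a single frequency; it is an assumed illustration, not the processing chain used in the study.

        import numpy as np

        MU0 = 4e-7 * np.pi  # vacuum permeability (H/m)

        def apparent_resistivity(E, H, freq_hz):
            """Apparent resistivity (Ohm-m) and impedance phase (degrees) from
            orthogonal electric (V/m) and magnetic (A/m) field spectra at one
            frequency, using the standard MT relation rho_a = |Z|^2 / (mu0 * omega)."""
            Z = E / H                      # impedance (Ohm) for this polarization
            omega = 2.0 * np.pi * freq_hz
            rho_a = np.abs(Z) ** 2 / (MU0 * omega)
            phase = np.degrees(np.angle(Z))
            return rho_a, phase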

  17. Volcanic Ash Image Products from MODIS for Aviation Safety and Natural Hazard Mitigation

    NASA Astrophysics Data System (ADS)

    Stephens, G.; Ellrod, G. P.; Im, J.

    2003-12-01

    Multi-spectral volcanic ash image products have been developed using Moderate Resolution Imaging Spectroradiometer (MODIS) data from the NASA Terra spacecraft (Ellrod and Im 2003). Efforts are now underway to integrate these new products into the MODIS Data Retrieval System at NESDIS, for use in the operational Hazard Mapping System (HMS). The images will be used at the Washington Volcanic Ash Advisory Center (W-VAAC) in the issuance of volcanic ash advisory statements to aircraft. In addition, the images will be made available to users in the global volcano and emergency management community via the World Wide Web. During the development process, good results (a high detection rate with low "false alarms") were obtained from a tri-spectral combination of MODIS infrared (IR) bands centered near 8.6, 11.0 and 12.0 μm (Bands 29, 31, and 32). Optimum red-green-blue (RGB) false color composite images were developed to provide information on ash cloud location, as well as cloud phase and surface characteristics, to aid in interpretation both day and night. Information on volcanic ash derived from the tri-spectral product was displayed using the red color gun. This information was combined with visible (0.6 μm) and near-IR (1.6 μm) data for green and blue, respectively, during daylight periods. At night, the 8.6 − 11.0 μm combination and the 11.0 μm band were used for the green and blue colors in the RGB product. Currently, raw MODIS data in five-minute "granules" are processed for the following regions: (1) southern Alaska, (2) Mexico, Central America and the Caribbean, and (3) the northern Andes region of South America. Image products are converted to Geo-spatial Information System (GIS) compatible formats for use in the HMS, and to Man-Computer Interactive Data Access System (McIDAS) "Area File" format for use in currently configured W-VAAC display systems. The installation of a high speed, fiber optic line from NASA Goddard Space Flight Center to the World
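
    As an illustration of how such a false-color composite can be assembled (an assumed sketch, not the operational NESDIS code), the function below builds a daytime RGB in the spirit of the product described: the red channel carries the 11.0 minus 12.0 μm split-window ash signal, green the 0.6 μm visible reflectance, and blue the 1.6 μm near-IR reflectance. The scaling limits are placeholder values.

        import numpy as np

        def daytime_ash_rgb(bt110, bt120, refl_vis, refl_nir):
            """Illustrative daytime false-color composite: red carries the
            split-window (11 - 12 um) ash signal computed from brightness
            temperatures (K), green the 0.6 um reflectance, blue the 1.6 um
            reflectance. Scaling limits are assumptions, not operational values."""
            btd = bt110 - bt120
            # Map a BTD range of [-4 K, 0 K] to [1, 0]: more negative -> stronger ash signal.
            red = np.clip(-btd / 4.0, 0.0, 1.0)
            green = np.clip(refl_vis, 0.0, 1.0)
            blue = np.clip(refl_nir, 0.0, 1.0)
            return np.dstack([red, green, blue])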

  18. Volcanic sulfur dioxide index and volcanic explosivity index inferred from eruptive volume of volcanoes in Jeju Island, Korea: application to volcanic hazard mitigation

    NASA Astrophysics Data System (ADS)

    Ko, Bokyun; Yun, Sung-Hyo

    2016-04-01

    Jeju Island, located in the southwestern part of the Korean Peninsula, is a volcanic island composed of lava flows, pyroclasts, and around 450 monogenetic volcanoes. The volcanic activity of the island commenced with phreatomagmatic eruptions under subaqueous conditions ca. 1.8-2.0 Ma and lasted until ca. 1,000 years BP. For evaluating the volcanic activity of the most recently erupted volcanoes with reported ages, the volcanic explosivity index (VEI) and volcanic sulfur dioxide index (VSI) of three volcanoes (Ilchulbong tuff cone, Songaksan tuff ring, and Biyangdo scoria cone) are inferred from their eruptive volumes. The quantity of eruptive materials such as tuff, lava flow, scoria, and so on, is calculated using a model developed for the Auckland Volcanic Field, which has a volcanic setting similar to that of the island. The eruptive volumes are 11,911,534 m³, 24,987,557 m³, and 9,652,025 m³, which correspond to VEI of 3, 3, and 2, respectively. According to the correlation between VEI and VSI, the average quantity of SO2 emission during an eruption with VEI of 3 is 2-8 × 10³ kilotons, considering that the island was formed in an intraplate tectonic setting. Jeju Island was regarded as an extinct volcano; however, several studies have recently reported volcanic eruption ages within 10,000 years BP owing to developments in age-dating techniques. Thus, the island is a dormant volcano with a potentially high probability of erupting again in the future. The volcanoes might have explosive eruptions (vulcanian to plinian), with the possibility that SO2 emitted by an eruption reaches the stratosphere, causing climate change due to backscattering of incoming solar radiation, increases in cloud reflectivity, etc. Consequently, recommencement of volcanic eruptions on the island could result in serious volcanic hazards, and this study provides fundamental and important data for volcanic hazard mitigation in East Asia as well as on the island. ACKNOWLEDGMENTS: This research was supported by a grant [MPSS
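
    The VEI values quoted above are consistent with the widely used volume-based scale in which each VEI class from 2 upward spans one decade of erupted volume (e.g. 10^6-10^7 m³ for VEI 2, 10^7-10^8 m³ for VEI 3). A minimal classifier along those lines is sketched below, with boundary handling chosen here for illustration rather than taken from the paper.

        import math

        def vei_from_volume(volume_m3: float) -> int:
            """Return the VEI class implied by an eruptive volume in cubic metres,
            following the commonly used volume thresholds (Newhall & Self, 1982);
            boundary handling is an illustrative choice."""
            if volume_m3 < 1e4:
                return 0
            if volume_m3 < 1e6:
                return 1  # VEI 1 spans two decades (1e4-1e6 m^3)
            # From VEI 2 upward each class spans one decade of volume.
            return min(8, int(math.floor(math.log10(volume_m3))) - 4)

        # The three Jeju volcanoes quoted above:
        for name, vol in [("Ilchulbong", 11_911_534), ("Songaksan", 24_987_557), ("Biyangdo", 9_652_025)]:
            print(name, vei_from_volume(vol))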

  19. Scientific Animations for Tsunami Hazard Mitigation: The Pacific Tsunami Warning Center's YouTube Channel

    NASA Astrophysics Data System (ADS)

    Becker, N. C.; Wang, D.; Shiro, B.; Ward, B.

    2013-12-01

    Outreach and education save lives, and the Pacific Tsunami Warning Center (PTWC) has a new tool--a YouTube Channel--to advance its mission to protect lives and property from dangerous tsunamis. Such outreach and education is critical for coastal populations nearest an earthquake since they may not get an official warning before a tsunami reaches them and will need to know what to do when they feel strong shaking. Those who live far enough away to receive useful official warnings and react to them, however, can also benefit from PTWC's education and outreach efforts. They can better understand a tsunami warning message when they receive one, can better understand the danger facing them, and can better anticipate how events will unfold while the warning is in effect. The same holds true for emergency managers, who have the authority to evacuate the public they serve, and for the news media, critical partners in disseminating tsunami hazard information. PTWC's YouTube channel supplements its formal outreach and education efforts by making its computer animations available 24/7 to anyone with an Internet connection. Though the YouTube channel is only a month old (as of August 2013), it should rapidly develop a large global audience since similar videos on PTWC's Facebook page have reached over 70,000 viewers during organized media events, while PTWC's official web page has received tens of millions of hits during damaging tsunamis. These animations are not mere cartoons but use scientific data and calculations to render graphical depictions of real-world phenomena as accurately as possible. This practice holds true whether the animation is a simple comparison of historic earthquake magnitudes or a complex simulation cycling through thousands of high-resolution data grids to render tsunami waves propagating across an entire ocean basin. PTWC's animations fall into two broad categories. The first group illustrates concepts about seismology and how it is critical to

  20. Towards the Establishment of the Hawaii Integrated Seismic Network for Tsunami, Seismic, and Volcanic Hazard Mitigation

    NASA Astrophysics Data System (ADS)

    Shiro, B. R.; Koyanagi, S. K.; Okubo, P. G.; Wolfe, C. J.

    2006-12-01

    The NOAA Pacific Tsunami Warning Center (PTWC) located in `Ewa Beach, Hawai`i, provides warnings to the State of Hawai`i regarding locally generated tsunamis. The USGS Hawaiian Volcano Observatory (HVO) located in Hawai`i National Park monitors earthquakes on the island of Hawai`i in order to characterize volcanic and earthquake activity and hazards. In support of these missions, PTWC and HVO operate seismic networks for rapidly detecting and evaluating earthquakes for their tsunamigenic potential and volcanic risk, respectively. These existing seismic networks are composed mostly of short-period vertical seismometers with analog data collection and transmission based on decades-old technology. The USGS National Strong Motion Program (NSMP) operates 31 accelerometers throughout the state, but none currently transmit their data in real time. As a result of enhancements to the U.S. Tsunami Program in the wake of the December 2004 Indian Ocean tsunami disaster, PTWC is upgrading and expanding its seismic network using digital real-time telemetry from broadband and strong motion accelerometer stations. Through new cooperative agreements with partners including the USGS (HVO and NSMP), IRIS, the University of Hawai`i, and Germany's GEOFON, the enhanced seismic network has been designed to ensure maximum benefit to all stakeholders. The Hawaii Integrated Seismic Network (HISN) will provide a statewide resource for tsunami, earthquake, and volcanic warnings. Furthermore, because all data will be archived by the IRIS Data Management Center (DMC), the HISN will become a research resource for the greater scientific community. The performance target for the enhanced HISN is for PTWC to provide initial local tsunami warnings within 90 seconds of the earthquake origin time. This will be accomplished using real-time digital data transmission over redundant paths and by implementing contemporary analysis algorithms in real time and near-real time. Earthquake location, depth, and

  1. Public Policy Issues Associated with Tsunami Hazard Mitigation, Response and Recovery: Transferable Lessons from Recent Global Disasters

    NASA Astrophysics Data System (ADS)

    Johnson, L.

    2014-12-01

    Since 2004, a sequence of devastating tsunamis has taken the lives of more than 300,000 people worldwide. The path of destruction left by each is typically measured in hundreds of meters to a few kilometers and its breadth can extend for hundreds even thousands of kilometers, crossing towns and countries and even traversing an entire oceanic basin. Tsunami disasters in Indonesia, Chile, Japan and elsewhere have also shown that the almost binary nature of tsunami impacts can present some unique risk reduction, response, recovery and rebuilding challenges, with transferable lessons to other tsunami vulnerable coastal communities around the world. In particular, the trauma can motivate survivors to relocate homes, jobs, and even whole communities to safer ground, sometimes at tremendous social and financial costs. For governments, the level of concentrated devastation usually exceeds the local capacity to respond and thus requires complex inter-governmental arrangements with regional, national and even international partners to support the recovery of impacted communities, infrastructure and economies. Two parallel projects underway in California since 2011—the SAFRR (Science Application for Risk Reduction) tsunami scenario project and the California Tsunami Policy Working Group (CTPWG)—have worked to digest key lessons from recent tsunami disasters, with an emphasis on identifying gaps to be addressed in the current state and federal policy framework to enhance tsunami risk awareness, hazard mitigation, and response and recovery planning ahead of disaster and also improve post-disaster implementation practices following a future California or U.S. tsunami event.

  2. Conceptual Study on Air Ingress Mitigation for VHTRs

    SciTech Connect

    Chang H. Oh; Eung S. Kim

    2012-09-01

    An air-ingress accident following a pipe break is considered a critical event for very high temperature gas-cooled reactor (VHTR) safety. Following helium depressurization, it is anticipated that, unless countermeasures are taken, air will enter the core through the break, leading to oxidation of the in-core graphite structure. Thus, without mitigation features, this accident might lead to severe exothermic chemical reactions of graphite and oxygen, depending on the accident scenario and the design. Under extreme circumstances, a loss of core structural integrity may occur and lead to a detrimental situation for VHTR safety. This paper discusses various air-ingress mitigation concepts applicable to VHTRs. The study begins by identifying important factors (or phenomena) associated with the air-ingress accident using a root-cause analysis. By preventing the main causes of the important events identified in the root-cause diagram, basic air-ingress mitigation ideas were conceptually developed. Among them, two concepts were finally evaluated as effective candidates. One concept is to inject helium into the lower plenum, which is a direct in-vessel helium injection. The other concept is to enclose the reactor with a non-pressure boundary having an opening at the bottom, which is an ex-vessel enclosure boundary. Computational fluid dynamics (CFD) methods were used to validate these concepts. As a result, it was shown that both concepts can effectively mitigate the air-ingress process. In the first concept, the injected helium replaces the air in the core and the upper part of the lower plenum by buoyancy because of its low density. It prevented air from moving into the reactor core, showing great potential for mitigating graphite oxidation in the core. In the second concept, the air-ingress rate is controlled by molecular diffusion through the opening at the bottom of the enclosure after depressurization. Some modified reactor cavity design is expected to

  3. Probing Aircraft Flight Test Hazard Mitigation for the Alternative Fuel Effects on Contrails and Cruise Emissions (ACCESS) Research Team . Volume 2; Appendices

    NASA Technical Reports Server (NTRS)

    Kelly, Michael J.

    2013-01-01

    The Alternative Fuel Effects on Contrails and Cruise Emissions (ACCESS) Project Integration Manager requested in July 2012 that the NASA Engineering and Safety Center (NESC) form a team to independently assess aircraft structural failure hazards associated with the ACCESS experiment and to identify potential flight test hazard mitigations to ensure flight safety. The ACCESS Project Integration Manager subsequently requested that the assessment scope be focused predominantly on structural failure risks to the aircraft empennage (horizontal and vertical tail). This report contains the Appendices to Volume I.

  4. Multi-scale earthquake hazard and risk in the Chinese mainland and countermeasures for the preparedness, mitigation, and management: an overview

    NASA Astrophysics Data System (ADS)

    Wu, Z.; Jiang, C.; Ma, T.

    2012-12-01

    Earthquake hazard and risk in the Chinese mainland exhibit multi-scale characteristics. Temporal scales from centuries to months, spatial scales from the whole mainland to specific engineering structures, and energy scales from great disastrous earthquakes to small earthquakes causing social disturbance and economic loss all feature in the complexity of earthquake disasters. To cope with this complex challenge, several research and application projects have been undertaken in recent years. Lessons and experiences of the 2008 Wenchuan earthquake contributed much to the launching and conducting of these projects. The understanding of the scientific problems and the technical approaches taken in mainstream studies in the Chinese mainland do not differ significantly from those in the international scientific community, albeit with some "cultural differences" in the use of terminology - for instance, in the China Earthquake Administration (CEA), the terminology "earthquake forecast/prediction (study)" is generally used in a much broader sense, mainly indicating time-dependent seismic hazard at different spatio-temporal scales. Several scientific products have been produced serving the society in different forms. These scientific products have unique academic merits due to their long-term persistence and forward-forecast nature, which are essential for the evaluation of technical performance and the falsification of scientific ideas. On the other hand, in the language of actor-network theory (ANT) in science studies (or the sociology of science), the hierarchical "actors' network" that transforms the science into actions of the public and government for the preparedness, mitigation, and management of multi-scale earthquake disasters is, at present, still in need of careful construction and improvement.

  5. Natural hazards and motivation for mitigation behavior: people cannot predict the affect evoked by a severe flood.

    PubMed

    Siegrist, Michael; Gutscher, Heinz

    2008-06-01

    Past research indicates that personal flood experience is an important factor in motivating mitigation behavior. It is not fully clear, however, why such experience is so important. This study tested the hypothesis that people without flooding experience underestimate the negative affect evoked by such an event. People who were affected by a severe recent flood disaster were compared with people who were not affected, but who also lived in flood-prone areas. Face-to-face interviews with open and closed questions were conducted (n = 201). Results suggest that people without flood experience envisaged the consequences of a flood differently from people who had actually experienced severe losses due to a flood. People who were not affected strongly underestimated the negative affect associated with a flood. Based on the results, it can be concluded that risk communication must not focus solely on technical aspects; in order to trigger motivation for mitigation behavior, successful communication must also help people to envisage the negative emotional consequences of natural disasters. PMID:18643832

  6. Hazardous and radioactive waste incineration studies

    NASA Astrophysics Data System (ADS)

    Vavruska, J. S.; Stretz, L. A.; Borduin, L. C.

    Development and demonstration of a transuranic (TRU) waste volume-reduction process is described. A production-scale controlled air incinerator using commercially available equipment and technology was modified for solid radioactive waste service. This unit successfully demonstrated the volume reduction of transuranic (TRU) waste with an average TRU content of about 20 nCi/g. The same incinerator and offgas treatment system is being modified further to evaluate the destruction of hazardous liquid wastes such as polychlorinated biphenyls (PCBs) and hazardous solid wastes such as pentachlorophenol (PCP)-treated wood.

  7. A design study on complexity reduced multipath mitigation

    NASA Astrophysics Data System (ADS)

    Wasenmüller, U.; Brack, T.; Groh, I.; Staudinger, E.; Sand, S.; Wehn, N.

    2012-09-01

    Global navigation satellite systems, e.g. the current GPS and the future European Galileo system, are frequently used in car navigation systems or smart phones to determine the position of a user. The calculation of the mobile position is based on the signal propagation times between the satellites and the mobile terminal. At least four time of arrival (TOA) measurements from four different satellites are required to resolve the position uniquely. Further, the satellites need to be in line of sight of the receiver for exact position calculation. However, in an urban area, the direct path may be blocked and the resulting multipath propagation causes errors on the order of tens of meters for each measurement and, in the case of non-line-of-sight (NLOS) reception, positive errors on the order of hundreds of meters. In this paper an advanced algorithm for multipath mitigation known as CRMM is presented. CRMM features reduced algorithmic complexity and superior performance in comparison with other state-of-the-art multipath mitigation algorithms. Simulation results demonstrate significant improvements in position calculation in environments with severe multipath propagation. Nevertheless, in relation to traditional algorithms, an increased effort is required for real-time signal processing due to the large amount of data that has to be processed in parallel. Based on CRMM, we performed a comprehensive design study including a design space exploration for the tracking unit hardware part, and a prototype implementation for hardware complexity estimation.
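
    For context, the TOA positioning step that multipath errors corrupt can be written as a small nonlinear least-squares problem in the three receiver coordinates plus the clock bias. The sketch below is a generic Gauss-Newton solver for that step, under the assumption that multipath/NLOS errors have already been mitigated; it is independent of, and does not reproduce, the CRMM algorithm itself.

        import numpy as np

        C = 299_792_458.0  # speed of light (m/s)

        def solve_position(sat_pos, pseudoranges, x0=None, iterations=10):
            """Estimate receiver position (m) and clock bias (s) from >= 4 pseudoranges
            via Gauss-Newton least squares. sat_pos is an (N, 3) array of satellite
            positions; pseudoranges is length N. Multipath/NLOS errors are assumed
            already mitigated (e.g. by an algorithm such as CRMM)."""
            x = np.zeros(4) if x0 is None else np.asarray(x0, dtype=float)
            for _ in range(iterations):
                diff = x[:3] - sat_pos                 # receiver-to-satellite vectors
                ranges = np.linalg.norm(diff, axis=1)  # geometric ranges
                predicted = ranges + C * x[3]          # pseudorange model
                residual = pseudoranges - predicted
                # Jacobian: unit line-of-sight vectors plus the clock-bias column.
                H = np.hstack([diff / ranges[:, None], np.full((len(ranges), 1), C)])
                dx, *_ = np.linalg.lstsq(H, residual, rcond=None)
                x += dx
            return x[:3], x[3]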

  8. Natural Hazard Mitigation thru Water Augmentation Strategies to Provide Additional Snow Pack for Water Supply and Hydropower Generation in Drought Stressed Alps/Mountains

    NASA Astrophysics Data System (ADS)

    Matthews, D.; Brilly, M.

    2009-12-01

    Climate variability and change are clearly stressing water supplies in high alpine regions of the Earth. These long-term natural hazards present critical challenges to policy makers and water managers. This paper addresses strategies to use enhanced scientific methods to mitigate the problem. Recent rapid depletion of glaciers and intense droughts throughout the world have created a need to reexamine modern water augmentation technologies for enhancing snow pack in mountainous regions. Today's reliance on clean, efficient hydroelectric power in the Alps and the Rocky Mountains poses a critical need for sustainable snow packs and high-elevation water supplies throughout the year, and hence a need to make natural cloud systems more efficient precipitators during the cold season through anthropogenic weather modification techniques. The Bureau of Reclamation, US Department of the Interior, spent over $39M on research from 1963 to 1990 to develop the scientific basis for snow pack augmentation in the headwaters of the Colorado, American, and Columbia River Basins in the western United States, and through USAID in Morocco in the High Atlas Mountains. This paper presents a brief summary of the research findings and shows that even during drought conditions potential exists for significant, cost-effective enhancement of water supplies. Examples of ground-based propane and AgI seeding generators, and cloud physics studies of supercooled cloud droplets and ice crystal characteristics that indicate seeding potential, will be shown. Hypothetical analyses of seeding potential in 17 western states from Montana to California will be presented based on observed SNOTEL snow water equivalent measurements, distributed by elevation and observed winter precipitation. Early studies indicated that increases in snow pack of 5 to 20% were possible if winter storm systems were seeded effectively. If this potential was realized in drought conditions observed in 2003, over 1

  9. United States studies in orbital debris - Prevention and mitigation

    NASA Technical Reports Server (NTRS)

    Loftus, Joseph P., Jr.; Potter, Andrew E.

    1990-01-01

    Debris in space has become an issue that has commanded considerable interest in recent years as society has become both more dependent upon space based systems, and more aware of its dependence. After many years of study the United States Space Policy of February 1988 directed that all sectors of the U.S. community minimize space debris. Other space organizations have adopted similar policies. Among the study activities leading to the policy and to subsequent implementing directives were discussions with the ESA, NASDA, and other space operating agencies. The policy derived from technical consensus on the nature of the issues and upon the courses of action available to mitigate the problem, but there remains the concern as to the adequacy of the data to define cost effective strategies. There are now in place mechanisms to continue technical discussions in more formal terms.

  10. Mitigating Resistance to Teaching Science Through Inquiry: Studying Self

    NASA Astrophysics Data System (ADS)

    Spector, Barbara; Burkett, Ruth S.; Leard, Cyndy

    2007-04-01

    This is the report of a qualitative emergent-design study of 2 different Web-enhanced science methods courses for preservice elementary teachers in which an experiential learning strategy, labeled “using yourself as a learning laboratory,” was implemented. Emergent grounded theory indicated this strategy, when embedded in a course organized as an inquiry with specified action foci, contributed to mitigating participants’ resistance to learning and teaching through inquiry. Enroute to embracing inquiry, learners experienced stages resembling the stages of grief one experiences after a major loss. Data sources included participant observation, electronic artifacts in WebCT, and interviews. Findings are reported in 3 major sections: “Action Foci Common to Both Courses,” “Participants’ Growth and Change,” and “Challenges and Tradeoffs.”

  11. Airflow Hazard Visualization for Helicopter Pilots: Flight Simulation Study Results

    NASA Technical Reports Server (NTRS)

    Aragon, Cecilia R.; Long, Kurtis R.

    2005-01-01

    Airflow hazards such as vortices or low level wind shear have been identified as a primary contributing factor in many helicopter accidents. US Navy ships generate airwakes over their decks, creating potentially hazardous conditions for shipboard rotorcraft launch and recovery. Recent sensor developments may enable the delivery of airwake data to the cockpit, where visualizing the hazard data may improve safety and possibly extend ship/helicopter operational envelopes. A prototype flight-deck airflow hazard visualization system was implemented on a high-fidelity rotorcraft flight dynamics simulator. Experienced helicopter pilots, including pilots from all five branches of the military, participated in a usability study of the system. Data was collected both objectively from the simulator and subjectively from post-test questionnaires. Results of the data analysis are presented, demonstrating a reduction in crash rate and other trends that illustrate the potential of airflow hazard visualization to improve flight safety.

  12. AMENDING SOILS WITH PHOSPHATE AS MEANS TO MITIGATE SOIL LEAD HAZARD: A CRITICAL REVIEW OF THE STATE OF THE SCIENCE

    EPA Science Inventory

    Ingested soil and surface dust may be important contributors to elevated blood lead (Pb) levels in children exposed to Pb contaminated environments. Mitigation strategies have typically focused on excavation and removal of the contaminated soil. However, this is not always feas...

  13. Peru mitigation assessment of greenhouse gases: Sector -- Energy. Peru climate change country study; Final report

    SciTech Connect

    1996-08-01

    The aim of this study is to determine the greenhouse gas inventory and propose mitigation alternatives that allow the country to develop in a clean environmental setting without delaying the development process required to improve the Peruvian standard of living. This executive abstract concisely presents the greenhouse gas mitigation results for Peru for the period 1990-2015. The mitigation studies for the energy sector are shown in this summary.

  14. From structural investigation towards multi-parameter early warning systems: geophysical contributions to hazard mitigation at the landslide of Gschliefgraben (Gmunden, Upper Austria)

    NASA Astrophysics Data System (ADS)

    Supper, Robert; Baron, Ivo; Jochum, Birgit; Ita, Anna; Winkler, Edmund; Motschka, Klaus; Moser, Günter

    2010-05-01

    In December 2007 the large landslide system inside the Gschliefgraben valley (located at the east edge of the Traun lake, Upper Austria), known over centuries for its repeated activity, was reactivated. Although a hazard zone map had already been set up in 1974, giving rise to a complete prohibition on building, some hundreds of people are living on the alluvial fan close to the lake. Consequently, as part of the first emergency measures, 55 buildings had to be evacuated. Within the first phase of mitigation, measures focused on property and infrastructure protection. Around 220 wells and one deep channel were implemented to drain the sliding mass. Additionally, a large quantity of sliding material was removed close to the inhabited areas. Differential GPS and water level measurements were performed to evaluate the effectiveness of the measures, which led to a significant slowdown of the movement. Soon after the suspension of the evacuation, several investigations, including drilling, borehole logging and complex geophysical measurements, were performed to investigate the structure of the landslide area in order to evaluate maximum hazard scenarios as a basis for planning further measures. Based on these results, measuring techniques for an adapted future early warning system are currently being tested. This emergency system should enable local stakeholders to take appropriate and timely measures in case of a future event, thus lessening the impact of a future disaster significantly. Within this three-step plan the application of geophysical methodologies was an integral part of the research and contributed considerably to its success. Several innovative approaches were implemented which will be described in more detail within the talk. Airborne multi-sensor geophysical surveying is one of the new and progressive approaches that can contribute remarkably to effectively analysing the triggering processes of large landslides and to better predicting their hazard. It was tested in

  15. Integrated Data Products to Forecast, Mitigate, and Educate for Natural Hazard Events Based on Recent and Historical Observations

    NASA Astrophysics Data System (ADS)

    McCullough, H. L.; Dunbar, P. K.; Varner, J. D.

    2011-12-01

    Immediately following a damaging or fatal natural hazard event there is strong interest in accessing authoritative data and information. The National Geophysical Data Center (NGDC) maintains and archives a comprehensive collection of natural hazards data. The NGDC global historic event database includes all tsunami events, regardless of intensity, as well as earthquakes and volcanic eruptions that caused fatalities, moderate damage, or generated a tsunami. Examining the past record provides clues to what might happen in the future. NGDC also archives tide gauge data from stations operated by the NOAA/NOS Center for Operational Oceanographic Products and Services and the NOAA Tsunami Warning Centers. In addition to the tide gauge data, NGDC preserves deep-ocean water-level data, sampled at 15 seconds, as collected by the Deep-ocean Assessment and Reporting of Tsunami (DART) buoys. Water-level data provide evidence of sea-level fluctuation and possible inundation events. NGDC houses an extensive collection of geologic hazards photographs available online as digital images. Visual media provide invaluable pre- and post-event data for natural hazards. Images can be used to illustrate inundation and possible damage or effects. These images are organized by event or hazard type (earthquake, volcano, tsunami, landslide, etc.), along with description and location. They may be viewed via interactive online maps and are integrated with historic event details. The planning required to achieve collection and dissemination of hazard event data is extensive. After a damaging or fatal event, NGDC begins to collect and integrate data and information from many people and organizations into the hazards databases. Sources of data include the U.S. NOAA Tsunami Warning Centers, the U.S. Geological Survey, the U.S. NOAA National Data Buoy Center, the UNESCO Intergovernmental Oceanographic Commission (IOC), Smithsonian Institution's Global Volcanism Program, news organizations, etc. NGDC then works to

  16. US country studies program: Results from mitigation studies

    SciTech Connect

    1996-12-31

    This paper describes the U.S. Country Studies Program, which was implemented to support the principles and objectives of the Framework Convention on Climate Change (FCCC). The program had three principal objectives: to enhance capabilities to conduct climate change assessments, prepare action plans, and implement technology projects; to help establish a process for developing and implementing national policies and measures; and to support the principles and objectives of the FCCC. As a result, 55 countries are completing studies, more than 2000 analysts engaged in the studies have been trained, and there is a much broader understanding of and support for climate change concerns. The article describes the experiences of some countries, along with general observations and conclusions, which are broadly separated into developed countries and those with economies in transition.

  17. Mitigation of hazards from future lahars from Mount Merapi in the Krasak River channel near Yogyakarta, central Java

    USGS Publications Warehouse

    Ege, John R.; Sutikno

    1983-01-01

    Procedures for reducing hazards from future lahars and debris flows in the Krasak River channel near Yogyakarta, Central Java, Indonesia, include (1) determining the history of the location, size, and effects of previous lahars and debris flows, and (2) decreasing flow velocities. The first may be accomplished by geologic field mapping along with acquiring information by interviewing local residents, and the second by increasing the cross sectional area of the river channel and constructing barriers in the flow path.

  18. Is research on soil erosion hazard and mitigation in the Global South still needed? (Alexander von Humboldt Medal Lecture)

    NASA Astrophysics Data System (ADS)

    Poesen, Jean

    2016-04-01

    Soil erosion represents a geomorphological and geological hazard that may cause environmental damage (land degradation), property damage, loss of livelihoods and services, as well as social and economic disruption. Erosion not only lowers the quality of our soils on site, resulting in a drastic reduction of their ecosystem functions that play a vital role in daily life, but also causes significant sediment-related problems off site. To curb soil erosion problems, a range of soil conservation techniques and strategies have been designed and are being applied. Worldwide, ca. 62 000 research papers on soil erosion and 116 000 on soil conservation have been published (Web of Science, Dec. 2015). The number of such papers dealing with the Global South represents less than 20% of all papers, despite the fact that many regions in this part of the world face significant soil erosion problems, aggravated by a rapidly growing population and major environmental changes. Given the large number of research papers on this topic, one might therefore conclude that we now know almost everything about the various soil erosion processes and rates, their factors and consequences as well as their control, so that little new knowledge can still be added to the vast amount of available information. We refute this conclusion by pointing to some major research gaps that still need to be addressed if we want to use our soils in a more sustainable way. More specifically, the following topics need more research attention: 1) improved understanding of both natural and anthropogenic soil erosion processes and their interactions, 2) scaling up soil erosion processes and rates in space and time, and 3) innovative techniques and strategies to prevent or reduce erosion rates. This will be illustrated with case studies from the Global South. If future research focuses on these research gaps, we will 1) better understand processes and their interactions operating at a range of spatial and temporal

  19. Source-to-sink sediment transfers, environmental engineering and hazard mitigation in the steep Var River catchment, French Riviera, southeastern France

    NASA Astrophysics Data System (ADS)

    Anthony, Edward J.; Julian, Maurice

    1999-12-01

    Steep coastal margins are potentially subject to mass wasting processes involving notable landslide activity and sediment evacuation downstream by steep-gradient streams. Sediment transfer from short source-to-sink segments, coupled with mountain hydrological regimes, regulates patterns of river channel aggradation and coastal sediment supply in such geomorphic settings. On the steep French Riviera margin, sediment transfers from existing landslides or from various minor mass wasting processes to stream channels may follow bursts of heavy, concentrated rainfall. High-magnitude flooding and massive sediment transport downstream are generally related to unpredictable extreme rainfalls. Both mass movements and channel sediment storage pose serious hazards to downvalley settlements and infrastructure. A consideration of channel sediment storage patterns in the Var River catchment, the most important catchment in this area, highlights two important shortcomings in environmental engineering and hazard mitigation practices. In the first place, the appreciation of geomorphic processes is rather poor. This is illustrated by the undersized nature of engineering works constructed to mitigate hazards in the upstream bedload-dominated channels, and by the unforeseen effects that ten rock dams, constructed in the early 1970s, have had on downstream and coastal sediment storage and on sediment dispersal patterns and, consequently, valley flooding. Secondly, planners and environmental engineers have lacked foresight in valley and coastal management issues in this steep setting, notably as regards the reclaimed areas of the lower Var channel and delta liable to flooding. Urbanization and transport and environmental engineering works have progressively affected patterns of storage and transport of fine-grained sediments in the lower Var channel and delta. Meanwhile the problems raised by these changes have not been adequately addressed in terms of scientific

  20. 3D modelling of Mt. Talaga Bodas Crater (Indonesia) by using terrestrial laser scanner for volcano hazard mitigation

    NASA Astrophysics Data System (ADS)

    Gumilar, Irwan; Abidin, Hasanuddin Z.; Putra, Andreas D.; Haerani, Nia

    2015-04-01

    Indonesia is a country with many volcanoes, and each typically has its own crater characteristics. One of them is Mt. Talaga Bodas, located in Garut, West Java. Research on crater characteristics is necessary for the volcanic disaster mitigation process, and one element of this is modelling the shape of the crater. One method that can be used to model a volcanic crater is terrestrial laser scanning (TLS). This research aims to create a three-dimensional (3D) model of the crater of Mt. Talaga Bodas that can be utilized for volcanic disaster mitigation. The methodology consists of acquiring scanning data using TLS and GPS measurements to obtain the coordinates of the reference points. The data processing comprises several steps, namely target-to-target registration, filtering, georeferencing, point cloud meshing, surface creation, drawing, and 3D modelling. These steps were performed using the Cyclone 7 software, with 3DS MAX used for the 3D modelling. The result of this data processing is a 3D model of the crater of Mt. Talaga Bodas that closely resembles the real shape. The calculations show that the height of the crater is 62.522 m, the diameter of the crater is 467.231 m, and the total area is 2961054.652 m2. The main obstacle in this research is the dense vegetation, which introduces noise and affects the crater model.
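    As a rough illustration of how the reported crater dimensions could be derived, the sketch below (hypothetical, not the Cyclone 7/3DS MAX workflow used in the study) computes rim-to-floor height, diameter, and plan area from a georeferenced point cloud with numpy and scipy; the random cloud stands in for the real TLS data.

        import numpy as np
        from scipy.spatial import ConvexHull

        points = np.random.rand(5000, 3) * [450.0, 450.0, 60.0]   # placeholder (x, y, z) cloud, metres

        height_m = points[:, 2].max() - points[:, 2].min()         # rim-to-floor height
        xy = points[:, :2]
        hull = ConvexHull(xy)                                      # rim outline in plan view
        area_m2 = hull.volume                                      # for a 2-D hull, .volume is the enclosed area
        verts = xy[hull.vertices]
        diffs = verts[:, None, :] - verts[None, :, :]
        diameter_m = np.sqrt((diffs ** 2).sum(axis=-1)).max()      # largest rim-to-rim distance

        print(f"height ~{height_m:.1f} m, diameter ~{diameter_m:.1f} m, area ~{area_m2:.0f} m2")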

  1. Human uses of forested watersheds and riparian corridors: hazard mitigation as an ecosystem service, with examples from Panama, Puerto Rico, and Venezuela

    NASA Astrophysics Data System (ADS)

    Larsen, M. C.

    2015-12-01

    Humans have long favored settlement along rivers for access to water supply for drinking and agriculture, for transport corridors, and for food sources. Additionally, settlement in or near montane forests includes benefits such as food sources, wood supply, esthetic values, and high quality water resources derived from watersheds where upstream human disturbance and environmental degradation are generally reduced. However, the advantages afforded by these riparian and montane settings pose episodic risks for communities located there, as floods, landslides, and wildfires cause loss of life, destroy infrastructure, and damage or destroy crops. A basic understanding of flood probability and magnitude, as well as hillslope stability, by residents in these environments can mitigate these risks. Early humans presumably developed some degree of knowledge about these risks through their long periods of occupation in these environments and their observations of seasonal and storm rainfall patterns and river discharge, which became more refined as agriculture developed over the past 10,000 years. Modern global urbanization, particularly in regions of rapid economic growth, has resulted in much of this "organic" knowledge being lost, as rural populations move into megacities, many of which encroach on floodplains and mountain fronts. Moreover, the most likely occupants of these hazardous locations are often economically constrained, increasing their vulnerability. Effective stewardship of river floodplains and upstream montane forests yields a key ecosystem service which, in addition to the well-known services (i.e., water, hydroelectric energy, etc.), provides a risk mitigation service by reducing hazard and vulnerability. Puerto Rico, Panama, and Venezuela illustrate a range of practices and results, providing useful examples for planners and land use managers.

  2. Experimental study designs to improve the evaluation of road mitigation measures for wildlife.

    PubMed

    Rytwinski, Trina; van der Ree, Rodney; Cunnington, Glenn M; Fahrig, Lenore; Findlay, C Scott; Houlahan, Jeff; Jaeger, Jochen A G; Soanes, Kylie; van der Grift, Edgar A

    2015-05-01

    An experimental approach to road mitigation that maximizes inferential power is essential to ensure that mitigation is both ecologically-effective and cost-effective. Here, we set out the need for and standards of using an experimental approach to road mitigation, in order to improve knowledge of the influence of mitigation measures on wildlife populations. We point out two key areas that need to be considered when conducting mitigation experiments. First, researchers need to get involved at the earliest stage of the road or mitigation project to ensure the necessary planning and funds are available for conducting a high quality experiment. Second, experimentation will generate new knowledge about the parameters that influence mitigation effectiveness, which ultimately allows better prediction for future road mitigation projects. We identify seven key questions about mitigation structures (i.e., wildlife crossing structures and fencing) that remain largely or entirely unanswered at the population-level: (1) Does a given crossing structure work? What type and size of crossing structures should we use? (2) How many crossing structures should we build? (3) Is it more effective to install a small number of large-sized crossing structures or a large number of small-sized crossing structures? (4) How much barrier fencing is needed for a given length of road? (5) Do we need funnel fencing to lead animals to crossing structures, and how long does such fencing have to be? (6) How should we manage/manipulate the environment in the area around the crossing structures and fencing? (7) Where should we place crossing structures and barrier fencing? We provide experimental approaches to answering each of them using example Before-After-Control-Impact (BACI) study designs for two stages in the road/mitigation project where researchers may become involved: (1) at the beginning of a road/mitigation project, and (2) after the mitigation has been constructed; highlighting real case

  3. Odor Mitigation with Tree Buffers: Swine Production Case Study

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Tree buffers are a potential low cost sustainable odor mitigation strategy, but there is little to no data on their effectiveness. Odor transport is thought to occur in one of two ways: either directly through vapor-phase transport or indirectly through sorption onto particles. Consequently, monitoring...

  4. Odor mitigation with vegetative buffers: Swine production case study

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Vegetative environmental buffers (VEB) are a potentially low cost sustainable odor mitigation strategy, but there is little to no data supporting their effectiveness. Wind tunnel experiments and field monitoring were used to determine the effect VEB had on wind flow patterns within a swine facility....

  5. Comparison of flood hazard assessments on desert piedmonts and playas: A case study in Ivanpah Valley, Nevada

    NASA Astrophysics Data System (ADS)

    Robins, Colin R.; Buck, Brenda J.; Williams, Amanda J.; Morton, Janice L.; House, P. Kyle; Howell, Michael S.; Yonovitz, Maureen L.

    2009-02-01

    Accurate and realistic characterizations of flood hazards on desert piedmonts and playas are increasingly important given the rapid urbanization of arid regions. Flood behavior in arid fluvial systems differs greatly from that of the perennial rivers upon which most conventional flood hazard assessment methods are based. Additionally, hazard assessments may vary widely between studies or even contradict other maps. This study's chief objective was to compare and evaluate landscape interpretation and hazard assessment among types of maps depicting flood risk, using Ivanpah Valley, NV, as a case study. As a secondary goal, we explain likely causes of discrepancy between data sets to ameliorate confusion for map users. Four maps, including three different flood hazard assessments of Ivanpah Valley, NV, were compared: (i) a regulatory map prepared by FEMA, (ii) a soil survey map prepared by NRCS, (iii) a surficial geologic map, and (iv) a flood hazard map derived from the surficial geologic map, both of which were prepared by NBMG. GIS comparisons revealed that only 3.4% (33.9 km2) of Ivanpah Valley lies within a FEMA floodplain, while the geologic flood hazard map indicated that ~44% of Ivanpah Valley runs some risk of flooding (Fig. 2D). Due to differences in mapping methodology and scale, NRCS data could not be quantitatively compared, and other comparisons were complicated by differences in flood hazard class criteria and terminology between maps. Owing to its scale and scope of attribute data, the surficial geologic map provides the most useful information on flood hazards for land-use planning. This research has implications for future soil geomorphic mapping and flood risk mitigation on desert piedmonts and playas. The Ivanpah Valley study area also includes the location of a planned new international airport; thus this study has immediate implications for urban development and land-use planning near Las Vegas, NV.
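    A minimal sketch of the kind of GIS overlay behind the percentages above, assuming two boolean hazard rasters on a common grid; the masks and cell size below are invented placeholders, not the Ivanpah Valley data.

        import numpy as np

        cell_km2 = 0.01                                    # assumed 100 m x 100 m grid cells
        rng = np.random.default_rng(0)
        fema = rng.random((1000, 1000)) < 0.034            # stand-in for the FEMA floodplain mask
        geologic = rng.random((1000, 1000)) < 0.44         # stand-in for the geologic flood-hazard mask

        def summarize(name, mask):
            # report the flood-prone fraction of the study area and the corresponding area
            print(f"{name}: {mask.mean():.1%} of area, {mask.sum() * cell_km2:.1f} km2")

        summarize("FEMA map", fema)
        summarize("Geologic map", geologic)
        summarize("Agreement (both flag hazard)", fema & geologic)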

  6. Using fine-scale fuel measurements to assess wildland fuels, potential fire behavior and hazard mitigation treatments in the southeastern USA.

    SciTech Connect

    Ottmar, Roger, D.; Blake, John, I.; Crolly, William, T.

    2012-01-01

    The inherent spatial and temporal heterogeneity of fuelbeds in forests of the southeastern United States may require fine scale fuel measurements for providing reliable fire hazard and fuel treatment effectiveness estimates. In a series of five papers, an intensive, fine scale fuel inventory from the Savannah River Site in the southeastern United States is used for building fuelbeds and mapping fire behavior potential, evaluating fuel treatment options for effectiveness, and providing a comparative analysis of landscape modeled fire behavior using three different data sources: the Fuel Characteristic Classification System, LANDFIRE, and the Southern Wildfire Risk Assessment. The research demonstrates that fine scale fuel measurements associated with fuel inventories repeated over time can be used to assess broad scale wildland fire potential and hazard mitigation treatment effectiveness in the southeastern USA and similar fire prone regions. Additional investigations will be needed to modify and improve these processes and capture the true potential of these fine scale data sets for fire and fuel management planning.

  7. Hazard & Operability Study for Removal of Spent Nuclear Fuel from the 324 Building

    SciTech Connect

    VAN KEUREN, J.C.

    2002-05-07

    A hazard and operability (HAZOP) study was conducted to examine the hazards associated with the removal of the spent nuclear fuel from the 324 Building. Fifty-nine potentially hazardous conditions were identified.

  8. The Value of Linking Mitigation and Adaptation: A Case Study of Bangladesh

    NASA Astrophysics Data System (ADS)

    Ayers, Jessica M.; Huq, Saleemul

    2009-05-01

    There are two principal strategies for managing climate change risks: mitigation and adaptation. Until recently, mitigation and adaptation have been considered separately in both climate change science and policy. Mitigation has been treated as an issue for developed countries, which hold the greatest responsibility for climate change, while adaptation is seen as a priority for the South, where mitigative capacity is low and vulnerability is high. This conceptual divide has hindered progress in addressing the fundamental sustainable development challenges of climate change. Recent attention to exploring the synergies between mitigation and adaptation suggests that an integrated approach could go some way to bridging the gap between the development and adaptation priorities of the South and the need to achieve global engagement in mitigation. These issues are explored through a case study analysis of climate change policy and practice in Bangladesh. Using the example of waste-to-compost projects, a mitigation-adaptation-development nexus is demonstrated, as projects contribute to mitigation through reducing methane emissions; adaptation through soil improvement in drought-prone areas; and sustainable development, because poverty is exacerbated when climate change reduces the flows of ecosystem services. Further, linking adaptation to mitigation makes mitigation action more relevant to policymakers in Bangladesh, increasing engagement in the international climate change agenda in preparation for a post-Kyoto global strategy. This case study strengthens the argument that while combining mitigation and adaptation is not a magic bullet for climate policy, synergies, particularly at the project level, can contribute to the sustainable development goals of climate change and are worth exploring.

  9. Geographic studies of pediatric cancer near hazardous waste sites.

    PubMed

    White, E; Aldrich, T E

    1999-01-01

    The incidence of pediatric (i.e., 0-17 y of age) cancers in North Carolina was studied for the years 1990-1993 in counties and ZIP-code areas that contained a National Priorities List hazardous-waste site. We analyzed the areas to determine if there was an excess incidence of cancer among the pediatric population. We used geographic information systems technology to address-match and map the cancer cases, along with county and National Priorities List hazardous-waste site locations. No significantly elevated cancer incidence rates were found at the county level. Two ZIP-code areas had statistically significant elevations in cancer incidence (p < .05). Only 3 of the cancer cases we mapped resided within a 1.6-km (1 mi) buffer zone of a National Priorities List hazardous-waste site. These 3 cases were not in the ZIP-code areas that had increased incidence rates. The small numerators throughout the study led us to question the accuracy of the assessment of underlying rates. The general capabilities of the geographic information systems, as well as advantages and limitations of the system, are discussed. As an exploratory study, this study serves as a springboard into more in-depth environmental-health hypotheses and more-specific investigations of point sources of hazardous exposures. PMID:10634228
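    The buffer test described above can be sketched as a simple distance check. The coordinates below are invented placeholders, not the North Carolina cases or site locations, and the haversine distance stands in for a proper GIS buffer operation.

        import math

        def haversine_km(lat1, lon1, lat2, lon2):
            """Great-circle distance in kilometres."""
            r = 6371.0
            p1, p2 = math.radians(lat1), math.radians(lat2)
            dphi = math.radians(lat2 - lat1)
            dlmb = math.radians(lon2 - lon1)
            a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
            return 2 * r * math.asin(math.sqrt(a))

        npl_sites = [(35.78, -78.64), (36.07, -79.79)]               # placeholder site coordinates
        cases = [(35.79, -78.65), (35.20, -80.84), (36.08, -79.80)]  # placeholder geocoded case addresses

        for lat, lon in cases:
            near = any(haversine_km(lat, lon, s_lat, s_lon) <= 1.6 for s_lat, s_lon in npl_sites)
            print((lat, lon), "within 1.6 km buffer" if near else "outside buffer")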

  10. Safety Hazards in Child Care Settings. CPSC Staff Study.

    ERIC Educational Resources Information Center

    Consumer Product Safety Commission, Washington, DC.

    Each year, thousands of children in child care settings are injured seriously enough to need emergency medical treatment. This national study identified potential safety hazards in 220 licensed child care settings in October and November 1998. Eight product areas were examined: cribs, soft bedding, playground surfacing, playground surface…

  11. Hazards analysis and prediction from remote sensing and GIS using spatial data mining and knowledge discovery: a case study for landslide hazard zonation

    NASA Astrophysics Data System (ADS)

    Hsu, Pai-Hui; Su, Wen-Ray; Chang, Chy-Chang

    2011-11-01

    Due to its particular geographical location and geological conditions, Taiwan suffers from many natural hazards which often cause serious property damage and loss of life. To reduce damage and casualties, an effective real-time system for hazard prediction and mitigation is necessary. In this study, a case study of Landslide Hazard Zonation (LHZ) is carried out using Spatial Data Mining and Knowledge Discovery (SDMKD) from databases. Many different kinds of geospatial data, such as terrain elevation, land cover types, distance to roads and rivers, geology maps, NDVI, and monitored rainfall data, are collected into the database for SDMKD. In order to guarantee the data quality, spatial data cleaning is essential to remove the noise, errors, outliers, and inconsistencies hidden in the input spatial data sets. In this paper, Kriging interpolation is used to calibrate the QPESUMS rainfall data against the rainfall observations from rain gauge stations to remove data inconsistency. After the data cleaning, an artificial neural network (ANN) is applied to generate the LHZ map throughout the test area. The experimental results show that the accuracy of the LHZ is about 92.3% with the ANN analysis, and that landslides induced by heavy rainfall can be mapped efficiently from remotely sensed images and geospatial data using SDMKD technologies.
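    A minimal stand-in for the ANN classification step, assuming a tabular feature set (elevation, slope, distances to roads and rivers, NDVI, rainfall) with binary landslide labels; the data are synthetic and the scikit-learn model below is not the authors' network.

        import numpy as np
        from sklearn.model_selection import train_test_split
        from sklearn.neural_network import MLPClassifier
        from sklearn.metrics import accuracy_score

        rng = np.random.default_rng(0)
        X = rng.normal(size=(2000, 6))   # stand-in predictors per grid cell (elevation, slope, distances, NDVI, rainfall)
        y = (X[:, 0] + 0.5 * X[:, 5] + rng.normal(scale=0.5, size=2000) > 0).astype(int)  # synthetic labels

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
        model = MLPClassifier(hidden_layer_sizes=(16, 8), max_iter=1000, random_state=0)
        model.fit(X_tr, y_tr)
        print("hold-out accuracy:", accuracy_score(y_te, model.predict(X_te)))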

  12. Deepwater Gulf of Mexico Shallow Hazards Studies Best Practices

    NASA Astrophysics Data System (ADS)

    Fernandez, M.; Hobbs, B.

    2005-05-01

    ConocoPhillips (COPC) has been involved in deepwater exploration in the Gulf of Mexico for the last 5 years using a dynamically positioned (DP) drillship. As part of the Federal (MMS) and State permitting process for deepwater exploration, COPC secures seabed and shallow subsurface hazard surveys and analyses for every potential drillsite. COPC conducts these surveys for at least two main reasons: first, to be a safe, efficient operator, since seabed and shallow subsurface hazard surveys and analyses are necessary steps of the Exploration Work Flow that help ensure a safe well; and second, to fulfill MMS (or local government) regulatory requirements. The purpose of shallow geohazards studies is to determine seafloor and sub-bottom conditions, inspect for possible chemosynthetic communities, and provide a shallow hazards assessment in accordance with NTL 2003-G17. During the five years of deepwater exploration COPC has contracted Fugro Geoservices to perform hazards studies in over 30 offshore blocks. The results of the seabed and shallow geohazards studies are a critical part of the construction of all of our well plans and are dynamically used in all MDTs. These investigations have greatly improved our drilling efficiency by predicting and avoiding possible chemosynthetic communities, sea floor faults, shallow gas, and shallow water flow. COPC's outstanding safety record and environmental stewardship with regard to geohazards has helped us in accelerating certain Exploration Plans (within MMS guidelines). These efforts have saved money and kept the drilling schedule running smoothly. In the last two years, the MMS has given COPC approval to use existing 3D spec seismic volumes for Shallow Hazards Assessment at several locations where applicable. This type of effort has saved ConocoPhillips hundreds of

  13. A study on seismicity and seismic hazard for Karnataka State

    NASA Astrophysics Data System (ADS)

    Sitharam, T. G.; James, Naveen; Vipin, K. S.; Raj, K. Ganesha

    2012-04-01

    This paper presents a detailed study of the seismic pattern of the state of Karnataka and also quantifies the seismic hazard for the entire state. In the present work, historical and instrumental seismicity data for Karnataka (within 300 km of the Karnataka political boundary) were compiled and hazard analysis was carried out based on these data. Geographically, Karnataka forms a part of peninsular India, which is tectonically identified as an intraplate region of the Indian plate. Due to the convergent movement of the Indian plate with the Eurasian plate, movements are occurring along major intraplate faults, resulting in seismic activity of the region; hence the hazard assessment of this region is very important. Apart from referring to the seismotectonic atlas for identifying faults and fractures, major lineaments in the study area were also mapped using satellite data. The earthquake events reported by various national and international agencies were collected up to 2009. Declustering of earthquake events was done to remove foreshocks and aftershocks. Seismic hazard analysis was done for the state of Karnataka using both deterministic and probabilistic approaches incorporating logic tree methodology. The peak ground acceleration (PGA) at rock level was evaluated for the entire state considering a grid size of 0.05° × 0.05°. The attenuation relations proposed for stable continental shield regions were used in evaluating the seismic hazard with appropriate weighting factors. Response spectra at rock level for important Tier II cities and Bangalore were evaluated. The contour maps showing the spatial variation of PGA values at bedrock are presented in this work.
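    The logic-tree combination of attenuation relations can be sketched schematically as a weighted sum of GMPE branches evaluated on a 0.05° grid. Everything below (source, coefficients, weights) is invented for illustration and is not the study's hazard model.

        import numpy as np

        lats = np.arange(11.5, 18.5, 0.05)          # rough latitude span of Karnataka, degrees
        lons = np.arange(74.0, 78.6, 0.05)
        lon_g, lat_g = np.meshgrid(lons, lats)

        src_lat, src_lon, mag = 13.0, 76.0, 6.0     # single illustrative source (invented)
        dist_km = np.hypot((lat_g - src_lat) * 111.0, (lon_g - src_lon) * 111.0) + 10.0  # +10 km nominal depth term

        def gmpe_a(m, r):   # invented attenuation relation A, returns ln(PGA in g)
            return -2.0 + 0.6 * m - 1.3 * np.log(r)

        def gmpe_b(m, r):   # invented attenuation relation B
            return -1.8 + 0.55 * m - 1.25 * np.log(r)

        weights = (0.6, 0.4)                        # assumed logic-tree branch weights
        pga_g = weights[0] * np.exp(gmpe_a(mag, dist_km)) + weights[1] * np.exp(gmpe_b(mag, dist_km))
        print("peak weighted PGA on the grid: %.3f g" % pga_g.max())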

  14. Detecting Slow Deformation Signals Preceding Dynamic Failure: A New Strategy For The Mitigation Of Natural Hazards (SAFER)

    NASA Astrophysics Data System (ADS)

    Vinciguerra, Sergio; Colombero, Chiara; Comina, Cesare; Ferrero, Anna Maria; Mandrone, Giuseppe; Umili, Gessica; Fiaschi, Andrea; Saccorotti, Gilberto

    2015-04-01

    Rock slope monitoring is a major aim in territorial risk assessment and mitigation. The high velocity that usually characterizes the failure phase of rock instabilities makes traditional instruments based on slope deformation measurements unsuitable for early warning systems. The use of "site specific" microseismic monitoring systems, with particular reference to potential destabilizing factors such as rainfalls and temperature changes, can allow detection of pre-failure signals in unstable sectors within the rock mass and prediction of the possible acceleration to failure. In October 2013 we deployed a microseismic monitoring system, developed by the University of Turin/Compagnia San Paolo, consisting of a network of 4 triaxial 4.5 Hz seismometers connected to a 12 channel data logger on an unstable patch of the Madonna del Sasso, Italian Western Alps. The initial characterization, based on geomechanical and geophysical tests, allowed us to understand the instability mechanism and to design a 'large aperture' configuration which encompasses the entire unstable rock mass and can monitor subtle changes in the mechanical properties of the medium. Stability analysis showed that the stability of the slope is due to rock bridges. Continuous recording at a 250 Hz sampling frequency (switched to 1 kHz in March 2014 to improve first arrival time picking and obtain wider frequency content) and trigger recording based on a STA/LTA (Short Time Average over Long Time Average) detection algorithm have been used. More than 2000 events with different waveforms, durations and frequency contents were recorded between November 2013 and March 2014. By inspecting the acquired events we identified the key parameters for a reliable distinction of the nature of each signal, i.e. the signal shape in terms of amplitude, duration and kurtosis, and the frequency content in terms of the range of maximum frequency content and the frequency distribution in spectrograms. Four main
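    The STA/LTA trigger named above is a standard detection algorithm; a simplified, non-causal version on a synthetic trace is sketched below (assumed window lengths and threshold, not the project's acquisition software).

        import numpy as np

        fs = 250                                    # Hz, the initial sampling rate quoted above
        t = np.arange(0, 60, 1.0 / fs)
        rng = np.random.default_rng(0)
        trace = rng.normal(scale=1.0, size=t.size)
        trace[7500:7750] += 8.0 * np.sin(2 * np.pi * 15.0 * t[:250])   # injected synthetic "event" at t = 30 s

        def sta_lta(x, n_sta, n_lta):
            """Centred moving-average STA/LTA ratio on the squared trace (simplified, non-causal)."""
            e = x ** 2
            sta = np.convolve(e, np.ones(n_sta) / n_sta, mode="same")
            lta = np.convolve(e, np.ones(n_lta) / n_lta, mode="same")
            return sta / np.maximum(lta, 1e-12)

        ratio = sta_lta(trace, n_sta=int(0.5 * fs), n_lta=int(10 * fs))
        threshold = 4.0                             # assumed trigger threshold
        print("first trigger at t = %.2f s" % (np.flatnonzero(ratio > threshold)[0] / fs))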

  15. RADON MITIGATION IN SCHOOLS: CASE STUDIES OF RADON MITIGATION SYSTEMS INSTALLED BY EPA IN FOUR MARYLAND SCHOOLS ARE PRESENTED

    EPA Science Inventory

    The first part of this two-part paper discusses radon entry into schools, radon mitigation approaches for schools, and school characteristics (e.g., heating, ventilation, and air-conditioning -- HVAC-- system design and operation) that influence radon entry and mitigation system ...

  16. Seismic hazard studies at the Department of Energy owned Paducah and Portsmouth Gaseous Diffusion plants

    SciTech Connect

    Beavers, J.E.; Brock, W.R.; Hunt, R.J. )

    1991-01-01

    Seismic hazard levels for free-field rock motion are defined and presented in this paper as annual exceedance probabilities versus peak acceleration and as uniform hazard response spectra. The conclusions of an independent review are also summarized. Based on the seismic hazard studies, peak horizontal acceleration values and uniform hazard response spectra for rock conditions are recommended. 15 refs., 6 figs., 1 tab.

  17. Household hazardous waste quantification, characterization and management in China's cities: a case study of Suzhou.

    PubMed

    Gu, Binxian; Zhu, Weimo; Wang, Haikun; Zhang, Rongrong; Liu, Miaomiao; Chen, Yangqing; Wu, Yi; Yang, Xiayu; He, Sheng; Cheng, Rong; Yang, Jie; Bi, Jun

    2014-11-01

    A four-stage systematic tracking survey of 240 households was conducted from the summer of 2011 to the spring of 2012 in the Chinese city of Suzhou to determine the characteristics of household hazardous waste (HHW) generated by the city. Factor analysis and a regression model were used to study the major driving forces of HHW generation. The results indicate that the rate of HHW generation was 6.16 (0.16-31.74, 95% CI) g/person/day, which accounted for 2.23% of the household solid waste stream. The major waste categories contributing to total HHW were home cleaning products (21.33%), medicines (17.67%) and personal care products (15.19%). Packaging and containers (one-way) and products (single-use) accounted for over 80% of total HHW generation, implying considerable potential to mitigate HHW generation by changing the packaging design and materials used by manufacturing enterprises. Strong correlations were observed between HHW generation (g/person/day) and the driving-force groups of "household structure" and "consumer preferences" (among which the educational level of the household financial manager has the greatest impact). Furthermore, the HHW generation stream in Suzhou suggested the influence of another set of variables, such as local customs and culture, consumption patterns, and urban residential life-style. This study emphasizes that HHW should be categorized at its source (residential households) as an important step toward controlling the HHW hazards of Chinese cities. PMID:25022547
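    A hedged sketch of the driving-force regression: ordinary least squares of per-capita HHW generation on a few household variables. The synthetic data below merely mimic the survey's structure and are not the Suzhou observations.

        import numpy as np

        rng = np.random.default_rng(1)
        n = 240                                              # number of surveyed households
        household_size = rng.integers(1, 6, size=n)
        education_yrs = rng.integers(6, 20, size=n)          # schooling of the household financial manager
        income_kcny = rng.normal(8.0, 3.0, size=n).clip(1.0)
        hhw_gpd = 4.0 + 0.3 * education_yrs - 0.2 * household_size + rng.normal(scale=2.0, size=n)

        X = np.column_stack([np.ones(n), household_size, education_yrs, income_kcny])
        coef, *_ = np.linalg.lstsq(X, hhw_gpd, rcond=None)   # OLS fit
        print("intercept, household size, education, income coefficients:", np.round(coef, 3))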

  18. Study on the health hazards of scrap metal cutters.

    PubMed

    Ho, S F; Wong, P H; Kwok, S F

    1989-12-01

    Scrap metal cutters seemed to be left out of most preventive programmes, as the workers were mainly contract workers. The health hazards of scrap metal cutting in 54 workers from a foundry and a ship breaking plant were evaluated. Environmental sampling showed lead levels ranging from 0.02 to 0.57 mg/m3 (the threshold limit value is 0.15 mg/m3). Exposure to lead came mainly from the paint coat of the metals cut. Metal fume fever was not reported, although the main complaints were cough and rhinitis. Skin burns at all stages of healing and residual scars were seen over hands, forearms and thighs. 96% of the cutters had blood lead levels exceeding 40 micrograms/100 ml, with 10 workers exceeding 70 micrograms/100 ml. None had clinical evidence of lead poisoning. The study showed that scrap metal cutting is a hazardous industry associated with significant lead exposure. With proper medical supervision, the blood lead levels of this group of workers decreased, illustrating the importance of identifying the hazard and implementing appropriate medical surveillance programmes. PMID:2635395

  19. Interdisciplinary approach to hydrological hazard mitigation and disaster response and effects of climate change on the occurrence of flood severity in central Alaska

    NASA Astrophysics Data System (ADS)

    Kontar, Y. Y.; Bhatt, U. S.; Lindsey, S. D.; Plumb, E. W.; Thoman, R. L.

    2015-06-01

    In May 2013, a massive ice jam on the Yukon River caused flooding that destroyed much of the infrastructure in the Interior Alaska village of Galena and forced the long-term evacuation of nearly 70% of its residents. This case study compares the communication efforts of the out-of-state emergency response agents with those of the Alaska River Watch program, a state-operated flood preparedness and community outreach initiative. For over 50 years, the River Watch program has been fostering long-lasting, open, and reciprocal communication with flood prone communities, as well as local emergency management and tribal officials. By taking into account cultural, ethnic, and socioeconomic features of rural Alaskan communities, the River Watch program was able to establish and maintain a sense of partnership and reliable communication patterns with communities at risk. As a result, officials and residents in these communities are open to information and guidance from the River Watch during the time of a flood, and thus are poised to take prompt actions. By informing communities of existing ice conditions and flood threats on a regular basis, the River Watch provides effective mitigation in terms of reducing ice jam flood effects. Although other ice jam mitigation attempts have been made throughout US and Alaskan history, the majority proved to be futile and/or cost-ineffective. Galena, along with other rural riverine Alaskan communities, has to rely primarily on disaster response and recovery strategies to withstand the shock of disasters. Significant government funds are spent on these challenging efforts, and these expenses might be reduced through an improved understanding of both the physical and climatological principles behind river ice breakup and risk mitigation. This study finds that long term dialogue is critical for effective disaster response and recovery during extreme hydrological events connected to changing climate, timing of river ice breakup, and

  20. Channelized debris flow hazard mitigation through the use of flexible barriers: a simplified computational approach for a sensitivity analysis.

    NASA Astrophysics Data System (ADS)

    Segalini, Andrea; Ferrero, Anna Maria; Brighenti, Roberto

    2013-04-01

    A channelized debris flow is usually represented by a mixture of solid particles of various sizes and water, flowing along a laterally confined, inclined, channel-shaped region up to an unconfined area where it slows down and spreads out into a flat-shaped mass. The study of these phenomena is very difficult due to their short duration and unpredictability, the lack of historical data for a given basin, and the complexity of the mechanical phenomena involved. Post-event surveys allow for the identification of some depositional features and provide indications about the maximum flow height; however, they lack information about the development of the phenomena over time. For this purpose the monitoring of recursive events has been carried out by several authors. Most of the studies aimed at determining the characteristic features of a debris flow were carried out in artificial channels, where the main variables involved were measured and others were controlled during the tests; however, some uncertainties remained, and other scaled models were developed to simulate the deposition mechanics as well as to analyze the transport mechanics and the energy dissipation. The assessment of the mechanical behavior of protection structures upon impact with the flow, as well as the energy associated with it, is necessary for the proper design of such structures, which in densely populated areas can avoid victims and limit the destructive effects of such a phenomenon. In this work a simplified structural model, developed by the authors for the safety assessment of retention barriers against channelized debris flows, is presented and some parametric cases are interpreted through the proposed approach; this model is developed as a simplified and efficient tool to be used for the verification of the supporting cables and foundations of a flexible debris flow barrier. The present analytical and numerical approach has a different aim from a FEM model. The computational

  1. Decay extent evaluation of wood degraded by a fungal community using NIRS: application for ecological engineering structures used for natural hazard mitigation

    NASA Astrophysics Data System (ADS)

    Baptiste Barré, Jean; Bourrier, Franck; Bertrand, David; Rey, Freddy

    2015-04-01

    .13). This tool improves the evaluation accuracy of wood decay extent in the context of ecological engineering structures used for natural hazard mitigation.

  2. A European effort towards the development of tools for tsunami hazard and risk assessment and mitigation, and tsunami early warning: the EC-funded TRANSFER project

    NASA Astrophysics Data System (ADS)

    Tinti, S.; Armigliato, A.

    2007-12-01

    TRANSFER (acronym for "Tsunami Risk ANd Strategies For the European Region") is a European Community funded project coordinated by the University of Bologna (Italy) and involving 29 partners in Europe, Turkey and Israel. The main objectives of the project can be summarised as: 1) improving our understanding of tsunami processes in the Euro-Mediterranean region, 2) contributing to tsunami hazard, vulnerability and risk assessment, 3) identifying the best strategies for reduction of tsunami risk, 4) focussing on the gaps and needs for the implementation of an efficient tsunami early warning system (TEWS) in the Euro-Mediterranean area, which is a high-priority task considering that no tsunami early warning system is currently in place in the Euro-Mediterranean countries. This paper briefly outlines the results obtained in the first year of the project and the activities that are currently being carried out and planned for the future. In particular, we emphasize the efforts made so far in the following directions. 1) The improvement of existing numerical models for tsunami generation, propagation and impact, and the possible development of new ones. Existing numerical models have already been applied to selected benchmark problems. At the same time, the project is making an important effort in the development of standards for inundation maps in Europe. 2) The project Consortium has selected seven test areas in different countries facing the Mediterranean Sea and the eastern Atlantic Ocean, where innovative probabilistic and statistical approaches for tsunami hazard assessment and up-to-date and new methods to compute inundation maps are being and will be applied. For the same test areas, tsunami scenario approaches are being developed, vulnerability and risk assessed, and prevention and mitigation measures defined, also with the advice of end users organised in an End User Group. 3) A final key aspect is represented by the dissemination of

  3. Satellite Monitoring of Ash and Sulphur Dioxide for the mitigation of Aviation Hazards: Part II. Validation of satellite-derived Volcanic Sulphur Dioxide Levels.

    NASA Astrophysics Data System (ADS)

    Koukouli, MariLiza; Balis, Dimitris; Dimopoulos, Spiros; Clarisse, Lieven; Carboni, Elisa; Hedelt, Pascal; Spinetti, Claudia; Theys, Nicolas; Tampellini, Lucia; Zehner, Claus

    2014-05-01

    The eruption of the Icelandic volcano Eyjafjallajökull in the spring of 2010 turned the attention of both the public and the scientific community to the susceptibility of the European airspace to the outflows of large volcanic eruptions. The ash-rich plume from Eyjafjallajökull drifted towards Europe and caused major disruptions of European air traffic for several weeks, affecting the everyday life of millions of people and having a strong economic impact. This unparalleled situation revealed limitations in the decision-making process due to the lack of information on the tolerance to ash of commercial aircraft engines, as well as limitations in ash monitoring and prediction capabilities. The European Space Agency project Satellite Monitoring of Ash and Sulphur Dioxide for the mitigation of Aviation Hazards was introduced to facilitate the development of an optimal End-to-End System for Volcanic Ash Plume Monitoring and Prediction. This system is based on comprehensive satellite-derived ash plume and sulphur dioxide [SO2] level estimates, as well as widespread validation using supplementary satellite, aircraft and ground-based measurements. The validation of volcanic SO2 levels extracted from the sensors GOME-2/MetopA and IASI/MetopA is shown here, with emphasis on the total column observed right before, during and after the Eyjafjallajökull 2010 eruptions. Co-located ground-based Brewer Spectrophotometer data extracted from the World Ozone and Ultraviolet Radiation Data Centre, WOUDC, were compared to the different satellite estimates. The findings are presented at length, alongside a comprehensive discussion of future scenarios.
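    The co-location comparison reduces to simple bias and scatter statistics between satellite and Brewer columns. The sketch below uses invented values, not GOME-2/IASI or WOUDC data.

        import numpy as np

        brewer_du = np.array([1.2, 3.5, 10.4, 25.0, 7.8])     # assumed ground-based SO2 columns, Dobson Units
        satellite_du = np.array([1.0, 4.1, 9.6, 27.2, 7.1])   # assumed co-located satellite columns

        diff = satellite_du - brewer_du
        print("mean bias: %.2f DU" % diff.mean())
        print("RMSE: %.2f DU" % np.sqrt((diff ** 2).mean()))
        print("relative bias: %.1f%%" % (100.0 * diff.mean() / brewer_du.mean()))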

  4. Satellite Monitoring of Ash and Sulphur Dioxide for the mitigation of Aviation Hazards: Part I. Validation of satellite-derived Volcanic Ash Levels.

    NASA Astrophysics Data System (ADS)

    Koukouli, MariLiza; Balis, Dimitris; Simopoulos, Spiros; Siomos, Nikos; Clarisse, Lieven; Carboni, Elisa; Wang, Ping; Siddans, Richard; Marenco, Franco; Mona, Lucia; Pappalardo, Gelsomina; Spinetti, Claudia; Theys, Nicolas; Tampellini, Lucia; Zehner, Claus

    2014-05-01

    The 2010 eruption of the Icelandic volcano Eyjafjallajökull attracted the attention of the public and the scientific community to the vulnerability of the European airspace to volcanic eruptions. Major disruptions in European air traffic were observed for several weeks surrounding the two eruptive episodes, which had a strong impact on the everyday life of many Europeans as well as a noticeable economic loss of around 2-3 billion Euros in total. The eruptions made it obvious that decision-making bodies were not informed properly and in a timely manner about the tolerance of commercial aircraft to ash-laden air, and that ash monitoring and prediction capabilities were rather limited. After the Eyjafjallajökull eruptions, new guidelines for aviation, changing from zero tolerance to newly established ash threshold values, were introduced. In this spirit, the European Space Agency project Satellite Monitoring of Ash and Sulphur Dioxide for the mitigation of Aviation Hazards called for the creation of an optimal End-to-End System for Volcanic Ash Plume Monitoring and Prediction. This system is based on improved and dedicated satellite-derived ash plume and sulphur dioxide level assessments, as well as an extensive validation using auxiliary satellite, aircraft and ground-based measurements. The validation of volcanic ash levels extracted from the sensors GOME-2/MetopA, IASI/MetopA, MODIS/Terra and MODIS/Aqua is presented in this work, with emphasis on the ash plume height and ash optical depth levels. Co-located aircraft flights, Cloud-Aerosol Lidar and Infrared Pathfinder Satellite Observation [CALIPSO] soundings, as well as European Aerosol Research Lidar Network [EARLINET] measurements were compared to the different satellite estimates for those two eruptive episodes. The validation results are extremely promising, with most satellite sensors performing quite well and within the estimated uncertainties relative to the comparison datasets. The findings are

  5. Mitigating Hazards in School Facilities

    ERIC Educational Resources Information Center

    National Clearinghouse for Educational Facilities, 2008

    2008-01-01

    School safety is a human concern, one that every school and community must take seriously and strive continually to achieve. It is also a legal concern; schools can be held liable if they do not make good-faith efforts to provide a safe and secure school environment. How schools are built and maintained is an integral part of school safety and…

  6. Conceptual issues in the study of dynamic hazard warnings.

    PubMed

    Meyer, Joachim

    2004-01-01

    This paper presents a conceptual analysis of dynamic hazard warning systems. The normative aspects of responses to warnings are analyzed, and a distinction is made between two forms of responses to a warning system, referred to as compliance and reliance. Determinants of the responses to warnings are identified, and they are broadly classified into normative, task, and operator factors. Existing research on warnings and automation is assessed in view of this conceptual framework, and directions for future research are discussed. Some implications of this analysis for practitioners, designers, and researchers are indicated. Actual or potential applications of this research include recommendations for the analysis, design, and study of dynamic warning systems. PMID:15359670

  7. A study of shock mitigating materials in a split Hopkinson bar configuration

    SciTech Connect

    Bateman, V.I.; Bell, R.G. III; Brown, F.A.; Hansen, N.R.

    1996-12-31

    Sandia National Laboratories (SNL) designs mechanical systems with electronics that must survive high shock environments. These mechanical systems include penetrators that must survive soil, rock, and ice penetration, nuclear transportation casks that must survive transportation environments, and laydown weapons that must survive delivery impact of 125 fps. These mechanical systems contain electronics that may operate during and after the high shock environment and that must be protected from it. A study has been started to improve the packaging techniques for the advanced electronics utilized in these mechanical systems, because current packaging techniques are inadequate for these more sensitive electronics. In many cases, it has been found that the packaging techniques currently used not only do not mitigate the shock environment but actually amplify it. An ambitious goal for this packaging study is to avoid amplification and possibly attenuate the shock environment before it reaches the electronics contained in the various mechanical systems. As part of the investigation of packaging techniques, a two-part study of shock mitigating materials is being conducted. This paper reports the first part of the shock mitigating materials study. A study to compare three thicknesses (0.125, 0.250, and 0.500 in.) of seventeen unconfined materials for their shock mitigating characteristics has been completed with a split Hopkinson bar configuration. The nominal input as measured by strain gages on the incident Hopkinson bar is 50 fps at 100 μs for these tests. It is hypothesized that a shock mitigating material has four purposes: to lengthen the shock pulse, to attenuate the shock pulse, to mitigate high frequency content in the shock pulse, and to absorb energy. Both time domain and frequency domain analyses of the split Hopkinson bar data have been performed to compare the materials' achievement of these purposes.
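    A simplified sketch of the time- and frequency-domain comparison: given incident and transmitted strain pulses, estimate peak attenuation and a spectral transmissibility with an FFT. The pulses are synthetic idealizations, not the SNL test records.

        import numpy as np

        fs = 1.0e6                                        # assumed 1 MHz sampling of the strain gage signals
        t = np.arange(0.0, 400e-6, 1.0 / fs)
        incident = np.exp(-((t - 100e-6) / 30e-6) ** 2)            # idealized incident pulse
        transmitted = 0.6 * np.exp(-((t - 180e-6) / 60e-6) ** 2)   # attenuated, lengthened transmitted pulse

        print("peak attenuation: %.0f%%" % (100.0 * (1.0 - transmitted.max() / incident.max())))

        freqs = np.fft.rfftfreq(t.size, d=1.0 / fs)
        ratio = np.abs(np.fft.rfft(transmitted)) / np.maximum(np.abs(np.fft.rfft(incident)), 1e-12)
        print("transmissibility near 10 kHz: %.2f" % ratio[np.argmin(np.abs(freqs - 1.0e4))])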

  8. An exploratory study on perceptions of seismic risk and mitigation in two districts of Istanbul.

    PubMed

    Eraybar, Korel; Okazaki, Kenji; Ilki, Alper

    2010-01-01

    Istanbul is one of the world's cities most vulnerable to seismic events. According to seismologists, the probability of a severe earthquake in the next 30 years is approximately 40 per cent. Following an outline of the seismicity of this vital Turkish city and a summary of current seismic risk and mitigation studies, this paper presents the results of a survey conducted in two districts of Istanbul, Avcilar and Bakirkoy. The survey comprised some 60 questions on the seismic risk perceptions of individuals and requested basic personal data, such as age, education level, employment type, financial income, and gender. Despite various differences among the survey population, such as academic background and level of financial income, responses were surprisingly similar, especially in terms of having no plan for a safer house. The data may help those planning mitigation programmes and public awareness campaigns on preparedness and particularly mitigation in highly vulnerable regions. PMID:19624701

  9. A STUDY ON GREENHOUSE GAS EMISSIONS AND MITIGATION POTENTIALS IN AGRICULTURE

    NASA Astrophysics Data System (ADS)

    Hasegawa, Tomoko; Matsuoka, Yuzuru

    In this study, world food production and consumption are estimated from 2005 to 2030 using a model developed with the general-to-specific modeling methodology. Based on the agricultural production, we estimated GHG emissions and mitigation potentials and evaluated mitigation countermeasures in agriculture. As a result, world crop and meat production will increase by 1.4 and 1.3 times respectively up to 2030. World GHG emissions from agriculture were 5.7 GtCO2eq in 2005. CH4 emissions from enteric fermentation and N2O emissions from nitrogen fertilizer contributed a large part of the emissions. In 2030, the technical and economical mitigation potentials will be 2.0 GtCO2eq and 1.2 GtCO2eq respectively. These potentials correspond to 36% and 22% of total emissions in 2000. The countermeasures with the highest effects will be water management in rice paddies, such as "midseason drainage" and "off-season straw".

  10. Interventionist and participatory approaches to flood risk mitigation decisions: two case studies in the Italian Alps

    NASA Astrophysics Data System (ADS)

    Bianchizza, C.; Del Bianco, D.; Pellizzoni, L.; Scolobig, A.

    2012-04-01

    Flood risk mitigation decisions pose key challenges not only from a technical but also from a social, economic and political viewpoint. There is an increasing demand for improving the quality of these processes by including different stakeholders - and especially by involving the local residents in the decision making process - and by guaranteeing the actual improvement of local social capacities during and after the decision making. In this paper we analyse two case studies of flood risk mitigation decisions, Malborghetto-Valbruna and Vipiteno-Sterzing, in the Italian Alps. In both of them, mitigation works have been completed or planned, yet following completely different approaches especially in terms of responses of residents and involvement of local authorities. In Malborghetto-Valbruna an 'interventionist' approach (i.e. leaning towards a top down/technocratic decision process) was used to make decisions after the flood event that affected the municipality in the year 2003. In Vipiteno-Sterzing, a 'participatory' approach (i.e. leaning towards a bottom-up/inclusive decision process) was applied: decisions about risk mitigation measures were made by submitting different projects to the local citizens and by involving them in the decision making process. The analysis of the two case studies presented in the paper is grounded on the results of two research projects. Structured and in-depth interviews, as well as questionnaire surveys were used to explore residents' and local authorities' orientations toward flood risk mitigation. Also a SWOT analysis (Strengths, Weaknesses, Opportunities and Threats) involving key stakeholders was used to better understand the characteristics of the communities and their perception of flood risk mitigation issues. The results highlight some key differences between interventionist and participatory approaches, together with some implications of their adoption in the local context. Strengths and weaknesses of the two approaches

  11. Insights from EMF Associated Agricultural and Forestry Greenhouse Gas Mitigation Studies

    SciTech Connect

    McCarl, Bruce A.; Murray, Brian; Kim, Man-Keun; Lee, Heng-Chi; Sands, Ronald D.; Schneider, Uwe

    2007-11-19

    Integrated assessment modeling (IAM) as employed by the Energy Modeling Forum (EMF) generally involves a multi-sector appraisal of greenhouse gas emission (GHGE) mitigation alternatives and climate change effects typically at the global level. Such a multi-sector evaluation encompasses potential climate change effects and mitigative actions within the agricultural and forestry (AF) sectors. In comparison with many of the other sectors covered by IAM, the AF sectors may require somewhat different treatment due to their critical dependence upon spatially and temporally varying resource and climatic conditions. In particular, in large countries like the United States, forest production conditions vary dramatically across the landscape. For example, some areas in the southern US present conditions favorable to production of fast growing, heat tolerant pine species, while more northern regions often favor slower-growing hardwood and softwood species. Moreover, some lands are currently not suitable for forest production (e.g., the arid western plains). Similarly, in agriculture, the US has areas where citrus and cotton can be grown and other areas where barley and wheat are more suitable. This diversity across the landscape causes differential GHGE mitigation potential in the face of climatic changes and/or responses to policy or price incentives. It is difficult for a reasonably sized global IAM system to reflect the full range of sub-national geographic AF production possibilities alluded to above. AF response in the face of climate change altered temperature precipitation regimes or mitigation incentives will likely involve region-specific shifts in land use and agricultural/forest production. This chapter addresses AF sectoral responses in climate change mitigation analysis. Specifically, we draw upon US-based studies of AF GHGE mitigation possibilities that incorporate sub-national detail drawing largely on a body of studies done by the authors in association with

  12. CASE STUDY OF RADON DIAGNOSTICS AND MITIGATION IN A NEW YORK STATE SCHOOL

    EPA Science Inventory

    The paper discusses a case study of radon diagnostics and mitigation performed by EPA in a New York State school building. Research focused on active subslab depressurization (ASD) in the basement and, to a lesser degree, the potential for radon reduction in the basement and slab-...

  13. Versatile gas gun target assembly for studying blast wave mitigation in materials

    NASA Astrophysics Data System (ADS)

    Bartyczak, S.; Mock, W., Jr.

    2012-03-01

    Traumatic brain injury (TBI) has become a serious problem for military personnel returning from recent conflicts. This has increased interest in investigating blast mitigating materials for use in helmets. In this paper we describe a new versatile target assembly that is used with an existing gas gun for studying these materials.

  14. Studies of millimeter-wave phenomenology for helicopter brownout mitigation

    NASA Astrophysics Data System (ADS)

    Schuetz, Christopher A.; Stein, E. Lee, Jr.; Samluk, Jesse; Mackrides, Daniel; Wilson, John P.; Martin, Richard D.; Dillon, Thomas E.; Prather, Dennis W.

    2009-09-01

    The unique ability of the millimeter-wave portion of the spectrum to penetrate typical visual obscurants has resulted in a wide range of possible applications for imagers in this spectrum. Of particular interest to the military community are imagers that can operate effectively in Degraded Visual Environments (DVE's) experienced by helicopter pilots when landing in dry, dusty environments, otherwise known as "brownout." One of the first steps to developing operational requirements for imagers in this spectrum is to develop a quantitative understanding of the phenomenology that governs imaging in these environments. While preliminary studies have been done in this area, quantitative, calibrated measurements of typical targets and degradation of target contrasts due to brownout conditions are not available. To this end, we will present results from calibrated, empirical measurements of typical targets of interest to helicopter pilots made in a representative desert environment. In addition, real-time measurements of target contrast reduction due to brownout conditions generated by helicopter downwash will be shown. These data were acquired using a W-band, dual-polarization radiometric scanner using optical-upconversion detectors.

  15. Feasibility study of tank leakage mitigation using subsurface barriers

    SciTech Connect

    Treat, R.L.; Peters, B.B.; Cameron, R.J.; McCormak, W.D.; Trenkler, T.; Walters, M.F.; Rouse, J.K.; McLaughlin, T.J.; Cruse, J.M.

    1994-09-21

    The US Department of Energy (DOE) has established the Tank Waste Remediation System (TWRS) to manage and dispose of the waste currently stored in the underground storage tanks. The retrieval element of TWRS includes a work scope to develop subsurface impermeable barriers beneath single-shell tanks (SSTs). The barriers could serve as a means to contain leakage that may result from waste retrieval operations and could also support site closure activities by facilitating cleanup. Three types of subsurface barrier systems have emerged for further consideration: (1) chemical grout, (2) freeze walls, and (3) desiccant, represented in this feasibility study as a circulating air barrier. This report contains analyses of the costs and relative risks associated with combinations of retrieval technologies and barrier technologies that form 14 alternatives. Eight of the alternatives include the use of subsurface barriers; the remaining six nonbarrier alternatives are included in order to compare the costs, relative risks, and other values of retrieval with subsurface barriers. Each alternative includes various combinations of technologies that can affect the risks associated with future contamination of the groundwater beneath the Hanford Site to varying degrees. Other potential risks associated with these alternatives, such as those related to accidents and airborne contamination resulting from retrieval and barrier emplacement operations, are not quantitatively evaluated in this report.

  16. The fujairah united arab emirates (uae) (ml = 5.1) earthquake of march 11, 2002 a reminder for the immediate need to develop and implement a national hazard mitigation strategy

    NASA Astrophysics Data System (ADS)

    Al-Homoud, A.

    2003-04-01

    the epicenter of the earthquake. Indeed, the March 11, 2002 earthquake and its aftershocks scared the citizens of Masafi and surrounding regions and drew the attention of the public and government to the subject of earthquake hazard, especially as this earthquake came one year after the nearby destructive Indian m = 6.5 earthquake. The recent destructive m = 6.2 earthquake of June 22 that hit northwest Iran has again reminded the UAE public and government of the need to take quick and concrete steps to mitigate any anticipated earthquake hazard. This study reflects in some detail on the following aspects related to the region and vicinity: geological and tectonic setting, seismicity, the earthquake activity database, and seismic hazard assessment. Moreover, it documents the following aspects of the March 11, 2002 earthquake: tectonics, seismology, instrumental seismic data, aftershocks, strong motion recordings, response spectra and local site effect analysis, and geotechnical effects and structural observations in the region affected by the earthquake. The study identifies local site ground amplification effects and liquefaction hazard potential in some parts of the UAE. Moreover, the study reflects on the coverage of the incident in the media, public and government response, the state of earthquake engineering practice in the construction industry in the UAE, and national preparedness and public awareness issues. It is concluded for this event that the mild damage that occurred in the Masafi region was due to poor quality of construction and underestimation of the design base shear. Practical recommendations are suggested for the authorities to avoid damage in newly constructed buildings and lifelines as a result of future stronger earthquakes, in addition to recommendations on a national strategy for earthquake hazard mitigation in the UAE, which is still missing. The recommendations include the development and

  17. Hazards and operability study for the surface moisture monitoring system

    SciTech Connect

    Board, B.D.

    1996-04-04

    The Hanford Nuclear Reservation Tank Farms' underground waste tanks have been used to store liquid radioactive waste from defense materials production since the 1940s. Waste in certain of the tanks may contain material in the form of ferrocyanide or various organic compounds which could potentially be susceptible to condensed phase chemical reactions. Because of the presence of oxidizing materials (nitrate compounds) and heat sources (radioactive decay and chemical reactions), the ferrocyanide or organic material could potentially fuel a propagating exothermic reaction with undesirable consequences. Analysis and experiments indicate that the reaction propagation and/or initiation may be prevented by the presence of sufficient moisture in the waste. Because the reaction would probably be initiated at the surface of the waste, evidence of sufficient moisture concentration would help provide evidence that the tank waste can continue to be safely stored. The Surface Moisture Measurement System (SMMS) was developed to collect data on the surface moisture in the waste by inserting two types of probes (singly) into a waste tank: a neutron probe and an electromagnetic inductance (EMI) probe. The sensor probes will be placed on the surface of the waste utilizing a moveable deployment arm to lower them through an available riser. The movement of the SMMS within the tank will be monitored by a camera lowered through an adjacent riser. The SMMS equipment is the subject of this study. Hazards and Operability Analysis (HAZOP) is a systematic technique for assessing potential hazards and/or operability problems for a new activity. It utilizes a multidiscipline team of knowledgeable individuals in a systematic brainstorming effort. The results of this study will be used as input to an Unreviewed Safety Question determination.

  18. A study of shock mitigating materials in a split Hopkinson bar configuration. Phase 1

    SciTech Connect

    Bateman, V.I.; Brown, F.A.; Hansen, N.R.

    1998-06-01

    Sandia National Laboratories (SNL) designs mechanical systems with electronics that must survive high shock environments. These mechanical systems include penetrators that must survive soil, rock, and ice penetration, nuclear transportation casks that must survive transportation environments, and laydown weapons that must survive delivery impacts of 125 fps. These mechanical systems contain electronics that may operate during and after the high shock environment and that must be protected from the high shock environments. A study has been started to improve the packaging techniques for the advanced electronics utilized in these mechanical systems because current packaging techniques are inadequate for these more sensitive electronics. In many cases, it has been found that the packaging techniques currently used not only do not mitigate the shock environment but actually amplify the shock environment. An ambitious goal for this packaging study is to avoid amplification and possibly attenuate the shock environment before it reaches the electronics contained in the various mechanical systems. As part of the investigation of packaging techniques, a two-phase study of shock mitigating materials is being conducted. The purpose of the first phase reported here is to examine the performance of a joint that consists of shock mitigating material sandwiched between steel and to compare the performance of the shock mitigating materials. A split Hopkinson bar experimental configuration simulates this joint and has been used to study the shock mitigating characteristics of seventeen unconfined materials. The nominal input for these tests is an incident compressive wave with a 50 fps peak (1,500 µε peak) amplitude and a 100 µs duration (measured at 10% amplitude).

  19. Methodological Issues In Forestry Mitigation Projects: A Case Study Of Kolar District

    SciTech Connect

    Ravindranath, N.H.; Murthy, I.K.; Sudha, P.; Ramprasad, V.; Nagendra, M.D.V.; Sahana, C.A.; Srivathsa, K.G.; Khan, H.

    2007-06-01

    There is a need to assess climate change mitigation opportunities in the forest sector in India in the context of methodological issues such as additionality, permanence, leakage, measurement and baseline development in formulating forestry mitigation projects. A case study of a forestry mitigation project in semi-arid community grazing lands and farmlands in Kolar district of Karnataka was undertaken with regard to baseline and project scenario development, estimation of carbon stock change in the project, leakage estimation and assessment of cost-effectiveness of mitigation projects. Further, the transaction costs to develop the project, and the environmental and socio-economic impact of the mitigation project, were assessed. The study shows the feasibility of establishing baselines and project C-stock changes. Since the area has low or insignificant biomass, leakage is not an issue. The overall mitigation potential in Kolar for a total area of 14,000 ha under various mitigation options is 278,380 tC at a rate of 20 tC/ha for the period 2005-2035, which is approximately 0.67 tC/ha/yr inclusive of harvest regimes under short rotation and long rotation mitigation options. The transaction cost for baseline establishment is less than a rupee/tC and for project scenario development is about Rs. 1.5-3.75/tC. The project enhances biodiversity and the socio-economic impact is also significant.
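
    A quick arithmetic check of the per-hectare and annual rates implied by the figures quoted above (all inputs are taken from the abstract; the sketch is for illustration only):

    area_ha = 14_000          # ha under the various mitigation options
    total_stock = 278_380     # tC accumulated over the project period
    years = 2035 - 2005       # 30-year window, 2005-2035

    per_ha = total_stock / area_ha        # ~20 tC/ha
    per_ha_year = per_ha / years          # ~0.66 tC/ha/yr
    print(f"{per_ha:.1f} tC/ha over the period, {per_ha_year:.2f} tC/ha/yr")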

  20. Land use /Land Cover Approaches as Instruments of Natural Hazard Mitigation in the Manjira River Sub-Basin, Andhra Pradesh, India.

    NASA Astrophysics Data System (ADS)

    THATIPARTI, V. L.

    2001-05-01

    Rapid industrialization during the last three decades had a profound adverse effect on the land use/land cover practices in, and the water quality of, the Manjira River sub-basin, Medak district, Andhra Pradesh, India. As water interacts with all other components of the environment, such as geology, soils, weather and climate, and flora and fauna, the pollution of water has affected both the biophysical and the socioeconomic and cultural environments. The area of study is the catchment of Nakkavagu (stream) in the Manjira river system, which lies between long. 78 05' - 78 25' E. and lat. 17 25' - 17 45' N., and covers an area of 734 sq km. Remote sensing and GIS techniques have been employed to identify and quantify measures for mitigating the adverse impacts of the industrialization and for being prepared for extreme weather events. The methodology employed in the present study involves the generation of various thematic layers, such as slope, hydrogeomorphology, and land use/land cover maps, using Landsat MSS, IRS 1A LISS II and IRS 1D LISS III and PAN merged data on the EASI/PACE ver. 6.3 platform. By overlaying all the above thematic maps, action plan maps are generated to devise various ways and means of rolling back the degradation of the environment, and to develop low-cost, people-participatory strategies (such as agricultural practices, use of water bodies and land under urbanization, and structural and non-structural, particularly vegetation, methods) of reducing the vulnerability of the population to extreme weather events.

  1. OFFSHORE PLATFORM HAZARDOUS WASTE INCINERATION FACILITY: FEASIBILITY STUDY

    EPA Science Inventory

    This report describes a program conducted to evaluate the technical and environmental feasibility of using a proposed offshore platform incineration facility in the destruction of hazardous wastes and for incineration research.

  2. Feasibility Study of Radiometry for Airborne Detection of Aviation Hazards

    NASA Technical Reports Server (NTRS)

    Gimmestad, Gary G.; Papanicolopoulos, Chris D.; Richards, Mark A.; Sherman, Donald L.; West, Leanne L.; Johnson, James W. (Technical Monitor)

    2001-01-01

    Radiometric sensors for aviation hazards have the potential for widespread and inexpensive deployment on aircraft. This report contains discussions of three aviation hazards - icing, turbulence, and volcanic ash - as well as candidate radiometric detection techniques for each hazard. Dual-polarization microwave radiometry is the only viable radiometric technique for detection of icing conditions, but more research will be required to assess its usefulness to the aviation community. Passive infrared techniques are being developed for detection of turbulence and volcanic ash by researchers in this country and also in Australia. Further investigation of the infrared airborne radiometric hazard detection approaches will also be required in order to develop reliable detection/discrimination techniques. This report includes a description of a commercial hyperspectral imager for investigating the infrared detection techniques for turbulence and volcanic ash.

  3. Volcanic hazard studies for the Yucca Mountain project

    SciTech Connect

    Crowe, B.; Harrington, C.; Turrin, B.; Champion, D.; Wells, S.; Perry, F.; McFadden, L.; Renault, C.

    1989-12-31

    Volcanic hazard studies are ongoing to evaluate the risk of future volcanism with respect to siting of a repository for disposal of high-level radioactive waste at the Yucca Mountain site. Seven Quaternary basaltic volcanic centers are located between 8 and 47 km from the outer boundary of the exploration block. The conditional probability of disruption of a repository by future basaltic volcanism is bounded by the range of 10^-8 to 10^-10 yr^-1. These bounds are currently being reexamined based on new developments in the understanding of the evolution of small volume, basaltic volcanic centers, including: (1) many of the volcanic centers exhibit brief periods of eruptive activity separated by longer periods of inactivity; (2) the centers may be active for time spans exceeding 10^5 yrs; (3) there is a decline in the volume of eruptions of the centers through time; and (4) small volume eruptions occurred at two of the Quaternary centers during latest Pleistocene or Holocene time. The authors classify the basalt centers as polycyclic, and distinguish them from polygenetic volcanoes. Polycyclic volcanism is characterized by small volume, episodic eruptions of magma of uniform composition over time spans of 10^3 to 10^5 yrs. Magma eruption rates are low and the time between eruptions exceeds the cooling time of the magma volumes.

  4. Volcanic hazard studies for the Yucca Mountain project

    SciTech Connect

    Crowe, B.; Turrin, B.; Wells, S.; Perry, F.; McFadden, L.; Renault, C.E.; Champion, D.; Harrington, C.

    1989-05-01

    Volcanic hazard studies are ongoing to evaluate the risk of future volcanism with respect to siting of a repository for disposal of high-level radioactive waste at the Yucca Mountain site. Seven Quaternary basaltic volcanic centers are located a minimum distance of 12 km and a maximum distance of 47 km from the outer boundary of the exploration block. The conditional probability of disruption of a repository by future basaltic volcanism is bounded by the range of 10^-8 to 10^-10 yr^-1. These values are currently being reexamined based on new developments in the understanding of the evolution of small volume, basaltic volcanic centers, including: (1) many, perhaps most, of the volcanic centers exhibit brief periods of eruptive activity separated by longer periods of inactivity; (2) the centers may be active for time spans exceeding 10^5 yrs; (3) there is a decline in the volume of eruptions of the centers through time; and (4) small volume eruptions occurred at two of the Quaternary centers during latest Pleistocene or Holocene time. We classify the basalt centers as polycyclic, and distinguish them from polygenetic volcanoes. Polycyclic volcanism is characterized by small volume, episodic eruptions of magma of uniform composition over time spans of 10^3 to 10^5 yrs. Magma eruption rates are low and the time between eruptions exceeds the cooling time of the magma volumes. 25 refs., 2 figs.
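
    The bounding annual disruption probabilities quoted in these two entries can be converted into a probability of at least one disruptive event over a repository performance period under a Poisson assumption. The 10,000-year horizon below is illustrative and is not taken from the abstracts.

    import math

    def prob_at_least_one(annual_rate, years):
        """P(N >= 1) for a homogeneous Poisson process with the given annual rate."""
        return 1.0 - math.exp(-annual_rate * years)

    horizon = 10_000                    # yr, illustrative performance period
    for rate in (1e-8, 1e-10):          # bounding annual disruption probabilities
        print(f"rate {rate:.0e}/yr -> P(>=1 event in {horizon} yr) = "
              f"{prob_at_least_one(rate, horizon):.1e}")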

  5. A procedure for global flood hazard mapping - the Africa case study

    NASA Astrophysics Data System (ADS)

    Dottori, Francesco; Salamon, Peter; Feyen, Luc; Barbosa, Paulo

    2015-04-01

    River floods are recognized as one of the major causes of economic damages and loss of human lives worldwide, and their impact in the next decades could be dramatically increased by socio-economic and climatic changes. In Africa, the availability of tools and models for predicting, mapping and analysing flood hazard and risk is still limited. Consistent, high-resolution (1km or less), continental-scale hazard maps are extremely valuable for local authorities and water managers to mitigate flood risk and to reduce catastrophic impacts on population and assets. The present work describes the development of a procedure for global flood hazard mapping, which is tested and applied over Africa to derive continental flood hazard maps. We derive a long-term dataset of daily river discharges from global hydrological simulations to design flood hydrographs for different return periods for the major African river network. We then apply a hydrodynamic model to identify flood-prone areas in major river catchments, which are merged to create pan-African flood hazard maps at 900m resolution. The flood map designed for a return period of 20 years is compared with a mosaic of satellite images showing all flooded areas in the period 2000-2014. We discuss strengths and limitations emerging from the comparison and present potential future applications and developments of the methodology.
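
    One common way to obtain design flood peaks for given return periods from a long daily-discharge series is to extract annual maxima and fit an extreme-value distribution. The sketch below uses a method-of-moments Gumbel (EV1) fit on synthetic data; the distribution choice, parameters, and series are illustrative and not the paper's exact procedure.

    import numpy as np

    def gumbel_quantiles(annual_maxima, return_periods):
        """Method-of-moments Gumbel (EV1) fit to annual maximum discharges."""
        x = np.asarray(annual_maxima, dtype=float)
        mean, std = x.mean(), x.std(ddof=1)
        beta = std * np.sqrt(6.0) / np.pi        # scale parameter
        mu = mean - 0.5772 * beta                # location (Euler-Mascheroni constant)
        T = np.asarray(return_periods, dtype=float)
        # Quantile for non-exceedance probability F = 1 - 1/T
        return mu - beta * np.log(-np.log(1.0 - 1.0 / T))

    # Hypothetical series of annual maximum daily discharges (m3/s)
    rng = np.random.default_rng(0)
    annual_max = 800.0 + 250.0 * rng.gumbel(size=40)
    for T, q in zip((20, 100, 500), gumbel_quantiles(annual_max, (20, 100, 500))):
        print(f"{T:>4}-yr design peak ~ {q:7.0f} m3/s")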

  6. Management of agricultural soils for greenhouse gas mitigation: Learning from a case study in NE Spain.

    PubMed

    Sánchez, B; Iglesias, A; McVittie, A; Álvaro-Fuentes, J; Ingram, J; Mills, J; Lesschen, J P; Kuikman, P J

    2016-04-01

    A portfolio of agricultural practices is now available that can contribute to reaching European mitigation targets. Among them, the management of agricultural soils has a large potential for reducing GHG emissions or sequestering carbon. Many of the practices are based on well tested agronomic and technical know-how, with proven benefits for farmers and the environment. A suite of practices has to be used since none of the practices can provide a unique solution. However, there are limitations in the process of policy development: (a) agricultural activities are based on biological processes and thus, these practices are location specific and climate, soils and crops determine their agronomic potential; (b) since agriculture sustains rural communities, the costs and potential for implementation have also to be regionally evaluated and (c) the aggregated regional potential of the combination of practices has to be defined in order to inform abatement targets. We believe that, when implementing mitigation practices, three questions are important: Are they cost-effective for farmers? Do they reduce GHG emissions? What policies favour their implementation? This study addressed these questions in three sequential steps. First, mapping the use of representative soil management practices in the European regions to provide a spatial context to upscale the local results. Second, using a Marginal Abatement Cost Curve (MACC) in a Mediterranean case study (NE Spain) for ranking soil management practices in terms of their cost-effectiveness. Finally, using a wedge approach of the practices as a complementary tool to link science to mitigation policy. A set of soil management practices was found to be financially attractive for Mediterranean farmers, which in turn could achieve significant abatements (e.g., 1.34 MtCO2e in the case study region). The quantitative analysis was completed by a discussion of potential farming and policy choices to shape realistic mitigation policy at
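
    A marginal abatement cost curve of the kind used in the study ranks practices by cost per tonne abated and accumulates their potential. The practice names, costs, and potentials below are hypothetical placeholders, not the study's data; they are chosen only so the cumulative total matches the ~1.34 MtCO2e order of magnitude quoted above.

    # Each practice: (name, cost in EUR per tCO2e abated, abatement potential in MtCO2e)
    practices = [
        ("no-till",                  -12.0, 0.45),  # negative cost = net saving for farmers
        ("cover crops",               -4.0, 0.30),
        ("optimised N fertilisation",  6.0, 0.40),
        ("residue incorporation",     18.0, 0.19),
    ]

    # Rank by cost-effectiveness (cheapest first) and accumulate the abatement potential
    cumulative = 0.0
    for name, cost, potential in sorted(practices, key=lambda p: p[1]):
        cumulative += potential
        print(f"{name:<26} {cost:+6.1f} EUR/tCO2e  cumulative {cumulative:.2f} MtCO2e")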

  7. Land Use/Land Cover Approaches as Instruments of Natural Hazard Mitigation in the Manjira River Sub-Basin, Andhra Pradesh, India

    NASA Astrophysics Data System (ADS)

    Lakshmi, T. V.; Reddy, M. A.; Anjaneyulu, Y.

    2001-05-01

    Rapid industrialization during the last three decades had a profound adverse effect on the land use/land cover practices in, and the water quality, of the Manjira River sub-basin, Medak District, Andhra Pradesh, India. As water interacts with all other components of the environment, such as, geology, soils, weather and climate, flora and fauna, the pollution of water has affected both biophysical and socioeconomic and cultural environments. The area of study is the catchment of Nakkavagu (stream) in the Manjira river system, which lies between long. 78 05' - 78 25' E., and the lat. 17 25' - 17 45' N., and covers an area of 734 sq. km. Remote sensing and GIS techniques have been employed to identify and quantify measures for mitigating the adverse impacts of the industrialization and for being prepared for extreme weather events. The methodology employed in the present study involves the generation of various thematic layers like slope, hydrogeomorphology and land use / land cover maps using Landsat MSS, IRS 1A LISS II and IRS 1D LISS III and PAN merged data in EASI/PACE 6.3 ver. platform. By overlaying all the above thematic maps, action plan maps are generated to devise various ways and means of rolling back the degradation of the environment, and to develop low-cost, people-participatory strategies (such as, agricultural practices, use of water bodies and land under urbanization, structural and non-structural, particularly vegetation methods, etc.) of reducing the vulnerability of the population for extreme weather events.

  8. The 5 key questions coping with risks due to natural hazards, answered by a case study

    NASA Astrophysics Data System (ADS)

    Hardegger, P.; Sausgruber, J. T.; Schiegg, H. O.

    2009-04-01

    Based on Maslow's hierarchy of needs, human endeavours concern primarily existential needs and, consequently, being safeguarded against both natural and man-made threats. The subsequent needs are to realize opportunities in a variety of fields, such as economics and many others. Independently of the field, the 5 crucial questions are the same as those for coping with risks due to natural hazards specifically. These 5 key questions are: I) What is the impact as a function of space and time? II) What protection measures comply with the general opinion, and how much do they mitigate the threat? III) How can the loss be adequately quantified and monetized? IV) What budget for prevention and what reserves for restoration and compensation are to be planned? V) Which mix of measures and allocation of resources is sustainable, and thus optimal? The 5 answers, exemplified by a case study concerning the sustainable management of the risk due to debris flows of the Enterbach / Inzing / Tirol / Austria, are as follows: I) The impact, created by both the propagation of flooding and sedimentation, has been forecast by modeling (numerical simulation) the 30-, 50-, 100-, 150-, 300- and 1000-year debris flows. The input was specified by detailed studies in meteorology, precipitation and runoff; in geology, hydrogeology, geomorphology and slope stability; in hydraulics, sediment transport and debris flow; and in forestry, agriculture and the development of communal settlement and infrastructure. All investigations were performed according to the method of ETAlp (Erosion and Transport in Alpine systems). ETAlp has been developed in order to achieve sustainable development in alpine areas and has been evaluated by the research project "nab", within the context of the EU-Interreg IIIb projects. II) The risk mitigation measures of concern are in hydraulics on the one hand and in forestry on the other. Such risk management is evaluated according to sustainability, which means economic, ecologic and social, in short, "triple

  9. HOUSEHOLD HAZARDOUS WASTE CHARACTERIZATION STUDY FOR PALM BEACH COUNTY, FLORIDA - A MITE PROGRAM EVALUATION

    EPA Science Inventory

    The objectives of the Household Hazardous Waste Characterization Study (the HHW Study) were to: 1) Quantify the annual household hazardous waste (HHW) tonnages disposed in Palm Beach County Florida's (the County) residential solid waste (characterized in this study as municipal s...

  10. Reducing risk from lahar hazards: concepts, case studies, and roles for scientists

    USGS Publications Warehouse

    Pierson, Thomas C.; Wood, Nathan J.; Driedger, Carolyn L.

    2014-01-01

    Lahars are rapid flows of mud-rock slurries that can occur without warning and catastrophically impact areas more than 100 km downstream of source volcanoes. Strategies to mitigate the potential for damage or loss from lahars fall into four basic categories: (1) avoidance of lahar hazards through land-use planning; (2) modification of lahar hazards through engineered protection structures; (3) lahar warning systems to enable evacuations; and (4) effective response to and recovery from lahars when they do occur. Successful application of any of these strategies requires an accurate understanding and assessment of the hazard, an understanding of the applicability and limitations of the strategy, and thorough planning. The human and institutional components leading to successful application can be even more important: engagement of all stakeholders in hazard education and risk-reduction planning; good communication of hazard and risk information among scientists, emergency managers, elected officials, and the at-risk public during crisis and non-crisis periods; sustained response training; and adequate funding for risk-reduction efforts. This paper reviews a number of methods for lahar-hazard risk reduction, examines the limitations and tradeoffs, and provides real-world examples of their application in the U.S. Pacific Northwest and in other volcanic regions of the world. An overriding theme is that lahar-hazard risk reduction cannot be effectively accomplished without the active, impartial involvement of volcano scientists, who are willing to assume educational, interpretive, and advisory roles to work in partnership with elected officials, emergency managers, and vulnerable communities.

  11. Best Practices in Grid Integration of Variable Wind Power: Summary of Recent US Case Study Results and Mitigation Measures

    SciTech Connect

    Smith, J. Charles; Parsons, Brian; Acker, Thomas; Milligan, Michael; Zavidil, Robert; Schuerger, Matthew; DeMeo, Edgar

    2010-01-22

    This paper will summarize results from a number of utility wind integration case studies conducted recently in the US, and outline a number of mitigation measures based on insights from those studies.

  12. Hazardous drinking-related characteristics of depressive disorders in Korea: the CRESCEND study.

    PubMed

    Park, Seon-Cheol; Lee, Sang Kyu; Oh, Hong Seok; Jun, Tae-Youn; Lee, Min-Soo; Kim, Jae-Min; Kim, Jung-Bum; Yim, Hyeon-Woo; Park, Yong Chon

    2015-01-01

    This study aimed to identify clinical correlates of hazardous drinking in a large cohort of Korean patients with depression. We recruited a total of 402 depressed patients aged > 18 yr from the Clinical Research Center for Depression (CRESCEND) study in Korea. Patients' drinking habits were assessed using the Korean Alcohol Use Disorder Identification Test (AUDIT-K). Psychometric scales, including the HAMD, HAMA, BPRS, CGI-S, SSI-Beck, SOFAS, and WHOQOL-BREF, were used to assess depression, anxiety, overall psychiatric symptoms, global severity, suicidal ideation, social functioning, and quality of life, respectively. We compared demographic and clinical features and psychometric scores between patients with and without hazardous drinking behavior after adjusting for the effects of age and sex. We then performed binary logistic regression analysis to identify independent correlates of hazardous drinking in the study population. Our results revealed that hazardous drinking was associated with current smoking status, history of attempted suicide, greater psychomotor retardation, suicidal ideation, weight loss, and lower hypochondriasis compared with non-hazardous drinking. The regression model also demonstrated that more frequent smoking, higher levels of suicidal ideation, and lower levels of hypochondriasis were independent correlates of hazardous drinking in depressed patients. In conclusion, depressed patients who are hazardous drinkers experience more severe symptoms and a greater burden of illness than non-hazardous drinkers. In Korea, screening depressed patients for signs of hazardous drinking could help identify subjects who may benefit from comprehensive therapeutic approaches. PMID:25552886
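
    The kind of binary logistic regression used to identify independent correlates can be sketched as below. The predictors, effect sizes, and simulated outcome are hypothetical and only loosely follow the reported directions of association; the study's actual covariate set and adjustment strategy are richer.

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(1)
    n = 402  # cohort size reported in the abstract
    # Hypothetical predictors: smoking frequency, suicidal-ideation score, hypochondriasis score
    X = np.column_stack([
        rng.poisson(8, n),           # cigarettes per day
        rng.normal(10.0, 5.0, n),    # SSI-Beck-like score
        rng.normal(1.0, 0.5, n),     # hypochondriasis item score
    ])
    # Simulated outcome, loosely consistent with the reported directions of effect
    logit = 0.08 * X[:, 0] + 0.10 * X[:, 1] - 1.2 * X[:, 2] - 1.0
    y = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))

    model = LogisticRegression().fit(X, y)
    names = ["smoking", "suicidal ideation", "hypochondriasis"]
    for name, beta in zip(names, model.coef_[0]):
        print(f"{name:<18} adjusted OR ~ {np.exp(beta):.2f}")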

  13. Factors in Perception of Tornado Hazard: An Exploratory Study.

    ERIC Educational Resources Information Center

    de Man, Anton; Simpson-Housley, Paul

    1987-01-01

    Administered questionnaire on tornado hazard to 142 adults. Results indicated that subject's gender and education level were best predictors of perceived probability of tornado recurrence; that ratings of severity of potential damage were related to education level; and that gender accounted for significant percentage of variance in anxiety…

  14. Occupational Health Hazards among Healthcare Workers in Kampala, Uganda

    PubMed Central

    Yu, Xiaozhong; Buregyeya, Esther; Musoke, David; Wang, Jia-Sheng; Halage, Abdullah Ali; Whalen, Christopher; Bazeyo, William; Williams, Phillip; Ssempebwa, John

    2015-01-01

    Objective. To assess the occupational health hazards faced by healthcare workers and the mitigation measures. Methods. We conducted a cross-sectional study utilizing quantitative data collection methods among 200 respondents who worked in 8 major health facilities in Kampala. Results. Overall, 50.0% of respondents reported experiencing an occupational health hazard. Among these, 39.5% experienced biological hazards while 31.5% experienced nonbiological hazards. Predictors for experiencing hazards included not wearing the necessary personal protective equipment (PPE), working overtime, job related pressures, and working in multiple health facilities. Control measures to mitigate hazards were availing separate areas and containers to store medical waste and provision of safety tools and equipment. Conclusion. Healthcare workers in this setting experience several hazards in their workplaces. Associated factors include not wearing all necessary protective equipment, working overtime, experiencing work related pressures, and working in multiple facilities. Interventions should be instituted to mitigate the hazards. Specifically PPE supply gaps, job related pressures, and complacence in adhering to mitigation measures should be addressed. PMID:25802531

  15. STUDY ON AIR INGRESS MITIGATION METHODS IN THE VERY HIGH TEMPERATURE GAS COOLED REACTOR (VHTR)

    SciTech Connect

    Chang H. Oh

    2011-03-01

    An air-ingress accident following a pipe break is considered a critical event for a very high temperature gas-cooled reactor (VHTR). Following helium depressurization, it is anticipated that, unless countermeasures are taken, air will enter the core through the break, leading to oxidation of the in-core graphite structure. Thus, without mitigation features, this accident might lead to severe exothermic chemical reactions between graphite and oxygen. Under extreme circumstances, a loss of core structural integrity may occur along with excessive release of radiological inventory. Idaho National Laboratory, under the auspices of the U.S. Department of Energy, is performing research and development (R&D) that focuses on key phenomena important during challenging scenarios that may occur in the VHTR. Phenomena Identification and Ranking Table (PIRT) studies to date have identified the air ingress event, following on the heels of a VHTR depressurization, as very important (Oh et al. 2006, Schultz et al. 2006). Consequently, the development of advanced air ingress-related models and verification and validation (V&V) requirements are part of the experimental validation plan. This paper discusses various air-ingress mitigation concepts applicable to VHTRs. The study begins by identifying important factors (or phenomena) associated with the air-ingress accident by using a root-cause analysis. By preventing the main causes of the important events identified in the root-cause diagram, basic air-ingress mitigation ideas can be conceptually derived. The main concepts include (1) preventing structural degradation of the graphite supporters; (2) preventing local stress concentration in the supporter; (3) preventing graphite oxidation; (4) preventing air ingress; (5) preventing density-gradient-driven flow; (6) preventing fluid density gradients; (7) preventing fluid temperature gradients; and (8) preventing high temperatures. Based on the basic concepts listed above, various air

  16. Geospatial Approach on Landslide Hazard Zonation Mapping Using Multicriteria Decision Analysis: A Study on Coonoor and Ooty, Part of Kallar Watershed, The Nilgiris, Tamil Nadu

    NASA Astrophysics Data System (ADS)

    Rahamana, S. Abdul; Aruchamy, S.; Jegankumar, R.

    2014-12-01

    zone I. Further, the resulting hazard zone map and the land use/land cover map are overlaid to check the hazard status, and the existing inventory of known landslides within the present study area was compared with the resulting vulnerability and hazard zone maps. The landslide hazard zonation map is useful for landslide hazard prevention, mitigation, and improvement to society, and for proper planning of land use and construction in the future.
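
    The weighted-overlay step that is typical of multicriteria landslide hazard zonation (rank each thematic layer, weight it, sum, and slice the result into hazard classes) can be sketched as follows. The layers, weights, and quantile class breaks are illustrative and are not the weights derived in the study.

    import numpy as np

    # Hypothetical reclassified thematic rasters (ranks 1-5) on the same grid
    rng = np.random.default_rng(2)
    shape = (5, 5)
    layers = {
        "slope":      rng.integers(1, 6, shape),
        "lithology":  rng.integers(1, 6, shape),
        "land_cover": rng.integers(1, 6, shape),
        "rainfall":   rng.integers(1, 6, shape),
    }
    weights = {"slope": 0.40, "lithology": 0.25, "land_cover": 0.20, "rainfall": 0.15}

    # Weighted linear combination, then classification into five hazard zones
    index = sum(weights[k] * layers[k].astype(float) for k in layers)
    breaks = np.quantile(index, [0.2, 0.4, 0.6, 0.8])   # quantile class breaks
    hazard_zone = np.digitize(index, breaks) + 1        # zones 1 (low) .. 5 (very high)
    print(hazard_zone)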

  17. Development, Implementation, and Pilot Evaluation of a Model-Driven Envelope Protection System to Mitigate the Hazard of In-Flight Ice Contamination on a Twin-Engine Commuter Aircraft

    NASA Technical Reports Server (NTRS)

    Martos, Borja; Ranaudo, Richard; Norton, Billy; Gingras, David; Barnhart, Billy

    2014-01-01

    Fatal loss-of-control accidents have been directly related to in-flight airframe icing. The prototype system presented in this report directly addresses the need for real-time onboard envelope protection in icing conditions. The combination of prior information and real-time aerodynamic parameter estimation is shown to provide sufficient information for determining safe limits of the flight envelope during in-flight icing encounters. The Icing Contamination Envelope Protection (ICEPro) system was designed and implemented to identify degradations in airplane performance and flying qualities resulting from ice contamination and to provide safe flight-envelope cues to the pilot. The utility of the ICEPro system for mitigating a potentially hazardous icing condition was evaluated by 29 pilots using the NASA Ice Contamination Effects Flight Training Device. Results showed that real-time assessment cues were effective in reducing the number of potentially hazardous upset events and in lessening exposure to loss of control following an incipient upset condition. Pilot workload with the added ICEPro displays was not measurably affected, but pilot opinion surveys showed that real-time cueing greatly improved their awareness of a hazardous aircraft state. The performance of the ICEPro system was further evaluated under various levels of sensor noise and atmospheric turbulence.
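
    The real-time aerodynamic parameter estimation mentioned above can be illustrated with a recursive least-squares estimator; the sketch below fits a zero-lift coefficient and lift-curve slope to noisy synthetic data. It is a generic illustration of the technique, not the ICEPro implementation, and all values are hypothetical.

    import numpy as np

    def rls_update(theta, P, phi, y, lam=0.995):
        """One recursive-least-squares step with forgetting factor lam."""
        phi = phi.reshape(-1, 1)
        K = P @ phi / (lam + phi.T @ P @ phi)              # gain vector
        theta = theta + (K * (y - phi.T @ theta)).ravel()  # parameter update
        P = (P - K @ phi.T @ P) / lam                      # covariance update
        return theta, P

    rng = np.random.default_rng(3)
    theta = np.zeros(2)                  # estimates of [CL0, CL_alpha]
    P = np.eye(2) * 100.0
    true_cl0, true_cla = 0.25, 4.5       # hypothetical clean-wing values (per rad)
    for _ in range(500):
        alpha = rng.uniform(0.0, 0.15)                                # rad
        cl = true_cl0 + true_cla * alpha + rng.normal(0.0, 0.02)      # noisy measurement
        theta, P = rls_update(theta, P, np.array([1.0, alpha]), cl)
    print(f"estimated CL0 = {theta[0]:.2f}, CL_alpha = {theta[1]:.2f} per rad")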

  18. 44 CFR 78.5 - Flood Mitigation Plan development.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 44 Emergency Management and Assistance 1 2012-10-01 2011-10-01 true Flood Mitigation Plan..., DEPARTMENT OF HOMELAND SECURITY INSURANCE AND HAZARD MITIGATION National Flood Insurance Program FLOOD MITIGATION ASSISTANCE § 78.5 Flood Mitigation Plan development. A Flood Mitigation Plan will articulate...

  19. 44 CFR 78.5 - Flood Mitigation Plan development.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 44 Emergency Management and Assistance 1 2014-10-01 2014-10-01 false Flood Mitigation Plan..., DEPARTMENT OF HOMELAND SECURITY INSURANCE AND HAZARD MITIGATION National Flood Insurance Program FLOOD MITIGATION ASSISTANCE § 78.5 Flood Mitigation Plan development. A Flood Mitigation Plan will articulate...

  20. 44 CFR 78.5 - Flood Mitigation Plan development.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 44 Emergency Management and Assistance 1 2013-10-01 2013-10-01 false Flood Mitigation Plan..., DEPARTMENT OF HOMELAND SECURITY INSURANCE AND HAZARD MITIGATION National Flood Insurance Program FLOOD MITIGATION ASSISTANCE § 78.5 Flood Mitigation Plan development. A Flood Mitigation Plan will articulate...

  1. Method Study of Flood Hazard Analysis for Plain River Network Area, Taihu Basin, China

    NASA Astrophysics Data System (ADS)

    HAN, C.; Liu, S.; Zhong, G.; Zhang, X.

    2015-12-01

    Flood is one of the most common and serious natural calamities. The Taihu Basin is located in the delta region of the Yangtze River in East China (see Fig. 1). Because of abundant rainfall and low-lying terrain, the area frequently suffers from flood hazards, which have caused serious casualties and economic losses. In order to reduce the severe impacts of flood events, numerous polder areas and hydraulic structures (including pumps, water gates, etc.) were constructed. A flood hazard map is an effective non-structural flood mitigation measure. Numerical simulation of flood propagation is one of the key technologies of flood hazard mapping. Because of the complexity of its underlying surface characteristics, numerical simulation of flood propagation faces some special problems in the plain river network area of the Taihu Basin. In this paper, a coupled one- and two-dimensional hydrodynamic model was established. The densely distributed and interconnected river network, numerous polder areas, and the complex operation of hydraulic structures were generalized in the model. The model proved to be reliable and stable. Based on the results of the simulation of flood propagation, a flood hazard map was compiled.
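
    The coupled one- and two-dimensional model described above is far more elaborate than anything that fits here; the sketch below only illustrates the kind of one-dimensional channel-routing building block such models chain together for a river network, using a Muskingum reach with illustrative parameters.

    def muskingum_route(inflow, K=3.0, X=0.2, dt=1.0):
        """Route an inflow hydrograph through one channel reach (Muskingum method).
        K: storage time constant (h), X: weighting factor, dt: time step (h)."""
        denom = 2.0 * K * (1.0 - X) + dt
        c0 = (dt - 2.0 * K * X) / denom
        c1 = (dt + 2.0 * K * X) / denom
        c2 = (2.0 * K * (1.0 - X) - dt) / denom
        outflow = [inflow[0]]                     # start from the initial steady flow
        for i in range(1, len(inflow)):
            outflow.append(c0 * inflow[i] + c1 * inflow[i - 1] + c2 * outflow[i - 1])
        return outflow

    # Hypothetical inflow hydrograph (m3/s) at hourly steps
    inflow = [50, 50, 120, 300, 450, 380, 260, 170, 110, 80, 60, 55]
    print([round(q) for q in muskingum_route(inflow)])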

  2. Nonpoint-Source Agricultural Hazard Index: A Case Study of the Province of Cremona, Italy

    NASA Astrophysics Data System (ADS)

    Trevisan, Marco; Padovani, Laura; Capri, Ettore

    2000-11-01

    This paper reports the results of a study aimed at the evaluation of the hazard level of farming activities in the province of Cremona, Italy, with particular reference to groundwater. The applied methodology employs a parametric approach based on the definition of potential hazard indexes (nonpoint-source agricultural hazard indexes, NPSAHI). Two categories of parameters were considered: the hazard factors (HF), which represent all farming activities that cause or might cause an impact on groundwater (use of fertilizers and pesticides, application of livestock and poultry manure, food industry wastewater, and urban sludge), and the control factors (CF), which adapt the hazard factor to the characteristics of the site (geographical location, slope, agronomic practices, and type of irrigation). The hazard index (HI) can be calculated multiplying the hazard factors by the control factors and, finally, the NPSAHI are obtained dividing HI into classes on a percentile basis using a scale ranging from 1 to 10. Organization, processing, and display of all data layers were performed using the geographical information system (GIS) ArcView and its Spatial Analyst extension. Results show that the potential hazard of groundwater pollution by farming activities in the province of Cremona falls mainly in the fifth class (very low hazard).
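
    The index structure described above (hazard factors modulated by control factors, then sliced into percentile classes from 1 to 10) can be sketched as follows; the factor values and the simple aggregation are illustrative stand-ins rather than the paper's exact parameterization.

    import numpy as np

    rng = np.random.default_rng(4)
    n_units = 200                          # map units (e.g., cells or municipalities)
    # Hypothetical hazard factors (fertiliser, pesticide, manure, wastewater loads), scaled 0-1
    HF = rng.random((n_units, 4))
    # Hypothetical control factors (slope, agronomic practices, irrigation type), each 0-1
    CF = rng.random((n_units, 3))

    # Hazard index: aggregated hazard factors modulated by the control factors
    HI = HF.sum(axis=1) * CF.prod(axis=1)

    # Percentile-based classification into classes 1 (lowest) .. 10 (highest)
    deciles = np.percentile(HI, np.arange(10, 100, 10))
    NPSAHI = np.digitize(HI, deciles) + 1
    print(np.bincount(NPSAHI)[1:])         # count of map units per hazard class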

  3. Water Induced Hazard Mapping in Nepal: A Case Study of East Rapti River Basin

    NASA Astrophysics Data System (ADS)

    Neupane, N.

    2010-12-01

    This paper illustrates typical water-induced hazard mapping of the East Rapti River Basin under the DWIDP, GON. The basin covers an area of 2398 sq km. The methodology includes the preparation of a base map of water-induced disasters in the basin. Landslide hazard maps were prepared with the SINMAP approach. Debris flow hazard maps were prepared by considering geology, slope, and saturation. Flood hazard maps were prepared using two approaches: HEC-RAS and satellite imagery interpretation. The composite water-induced hazard maps were produced by compiling the hazards rendered by landslides, debris flows, and floods. The average monsoon rainfall in the basin is 1907 mm, whereas the maximum 24-hour precipitation is 456.8 mm. The peak discharge of the Rapti River in 1993 at the station was 1220 cu m/sec. This discharge nearly corresponds to the discharge of a 100-year return period. The landslides, floods, and debris flows triggered by the heavy rain of July 1993 claimed 265 lives, affected 148,516 people, and damaged 1500 houses in the basin. The field investigation and integrated GIS interpretation showed that the very high and high landslide hazard zones collectively cover 38.38% of the watershed and the debris flow hazard zone constitutes 6.58%. The high flood hazard zone occupies 4.28% of the watershed area. Mitigation measures are recommended according to an integrated watershed management approach, under which non-structural and structural measures are proposed. The non-structural measures include disaster management training, formulation of an evacuation system (arrangement of an information plan about disasters), agricultural management practices, protection of water sources, slope protection, and removal of excessive bed load from the river channel. Similarly, structural measures such as dikes, spurs, rehabilitation of existing preventive measures, and river training at some locations are recommended. The major factors that have contributed to induce high incidences of various types of mass

  4. A Study on Integrated Community Based Flood Mitigation with Remote Sensing Technique in Kota Bharu, Kelantan

    NASA Astrophysics Data System (ADS)

    'Ainullotfi, A. A.; Ibrahim, A. L.; Masron, T.

    2014-02-01

    This study is conducted to establish a community-based flood management system that is integrated with remote sensing techniques. To understand local knowledge, the demographics of the local society are obtained using a survey approach. The local authorities are approached first to obtain information about the society in the study areas, such as population, gender, and settlement distribution. Information about age, religion, ethnicity, occupation, and years of experience facing floods in the area is recorded to understand how the local knowledge emerges. Geographic data such as rainfall, land use, land elevation, and river discharge are then obtained and used to establish a hydrological model of flooding in the study area. Analyses of the survey data are made to understand the patterns of the society and how people react to floods, while analysis of the geographic data is used to assess the water extent and the damage done by the flood. The final result of this research is a flood mitigation method with a community-based framework for the state of Kelantan. With flood mitigation that combines the community's understanding of floods with techniques to forecast heavy rainfall and flood occurrence using remote sensing, it is hoped that casualties and damage to society and infrastructure in the study area can be reduced.

  5. Accuracy Assessment Study of UNB3m Neutral Atmosphere Model for Global Tropospheric Delay Mitigation

    NASA Astrophysics Data System (ADS)

    Farah, Ashraf

    2015-12-01

    Tropospheric delay is the second major source of error, after ionospheric delay, for satellite navigation systems. The transmitted signal can experience a tropospheric delay of over 2 m at zenith and 20 m at lower satellite elevation angles of 10 degrees and below. Positioning errors of 10 m or greater can result from inaccurate mitigation of the tropospheric delay. Many techniques are available for tropospheric delay mitigation, consisting of surface meteorological models and global empirical models. Surface meteorological models need surface meteorological data to give high-accuracy mitigation, while global empirical models do not. Several hybrid neutral atmosphere delay models have been developed by University of New Brunswick (UNB, Canada) researchers over the past decade or so. The most widely applicable current version is UNB3m, which uses the Saastamoinen zenith delays, Niell mapping functions, and a look-up table with annual mean and amplitude for temperature, pressure, and water vapour pressure varying with respect to latitude and height. This paper presents an assessment study of the behaviour of the UNB3m model compared with highly accurate IGS tropospheric estimates for three IGS stations at different latitudes and heights. The study was performed over four nonconsecutive weeks in different seasons over one year (October 2014 to July 2015). It can be concluded that the UNB3m model gives a tropospheric delay correction accuracy of 0.050 m on average for low-latitude regions in all seasons. The model's accuracy is about 0.075 m for mid-latitude regions, while its highest accuracy is about 0.014 m for high-latitude regions.
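
    Two of the ingredients named above (a look-up of surface meteorology as an annual mean plus annual amplitude, and a Saastamoinen-type zenith hydrostatic delay) can be sketched as below. The table entry, the cosine phase, and the station values are illustrative; the actual UNB3m model also interpolates over latitude, applies height corrections, adds a wet delay, and uses the Niell mapping functions.

    import math

    def seasonal(avg, amp, doy, doy_min=28.0):
        """Annual-mean plus annual-amplitude parameter model (UNB3m-style table entry)."""
        return avg - amp * math.cos(2.0 * math.pi * (doy - doy_min) / 365.25)

    def zenith_hydrostatic_delay(pressure_hpa, lat_rad, height_m):
        """Saastamoinen zenith hydrostatic delay in metres."""
        f = 1.0 - 0.00266 * math.cos(2.0 * lat_rad) - 0.00000028 * height_m
        return 0.0022768 * pressure_hpa / f

    # Hypothetical table entry for a mid-latitude site: mean pressure 1017 hPa, amplitude 5 hPa
    doy = 200                              # day of year (mid-July)
    p = seasonal(1017.0, 5.0, doy)
    zhd = zenith_hydrostatic_delay(p, math.radians(45.0), 150.0)
    print(f"surface pressure ~ {p:.1f} hPa, zenith hydrostatic delay ~ {zhd:.3f} m")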

  6. Seismic hazard studies for the High Flux Beam Reactor at Brookhaven National Laboratory

    SciTech Connect

    Costantino, C.J.; Heymsfield, E. . Dept. of Civil Engineering); Park, Y.J.; Hofmayer, C.H. )

    1991-01-01

    This paper presents the results of a calculation to determine the site-specific seismic hazard appropriate for the deep soil site at Brookhaven National Laboratory (BNL), which is to be used in the risk assessment studies being conducted for the High Flux Beam Reactor (HFBR). The calculations use as input the seismic hazard defined for the bedrock outcrop by a study conducted at Lawrence Livermore National Laboratory (LLNL). Variability in site soil properties was included in the calculations to obtain the seismic hazard at the ground surface and to compare these results with those using the generic amplification factors from the LLNL study. 9 refs., 8 figs.

  7. Observational Studies of Earthquake Preparation and Generation to Mitigate Seismic Risks in Mines

    NASA Astrophysics Data System (ADS)

    Durrheim, R. J.; Ogasawara, H.; Nakatani, M.; Milev, A.; Cichowicz, A.; Kawakata, H.; Yabe, Y.; Murakami, O.; Naoi, M. M.; Moriya, H.; Satoh, T.

    2011-12-01

    We provide a status report on a 5-year project to monitor in-situ fault instability and strong motion in South African gold mines. The project has two main aims: (1) to learn more about earthquake preparation and generation mechanisms by deploying dense arrays of high-sensitivity sensors within rock volumes where mining is likely to induce significant seismic activity; and (2) to upgrade the South African national surface seismic network in the mining districts. This knowledge will contribute to efforts to upgrade schemes of seismic hazard assessment and to limit and mitigate the seismic risks in deep mines. As of 31 July 2011, 46 boreholes totalling 1.9 km in length had been drilled at project sites at the Ezulwini, Moab-Khotsong and Driefontein gold mines. Several dozen more holes are still to be drilled. Acoustic emission sensors, strain- and tiltmeters, and controlled seismic sources are being installed to monitor the deformation of the rock mass, the accumulation of damage during the preparation phase, and changes in dynamic stress as the rupture front propagates. These data will be integrated with measurements of stope closure, stope strong motion, seismic data recorded by the mine-wide network, and stress modelling. Preliminary results will be reported at the AGU meeting. The project is endorsed by the Japan Science and Technology Agency (JST), the Japan International Cooperation Agency (JICA) and the South African government. It is funded by the JST-JICA program for Science and Technology Research Partnership for Sustainable Development (SATREPS), the Council for Scientific and Industrial Research (CSIR), the Council for Geoscience, the University of the Witwatersrand and the Department of Science and Technology. The contributions of Seismogen CC, OHMS Ltd, the AngloGold Ashanti Rock Engineering Applied Research Group, First Uranium, the Gold Fields Seismic Department and the Institute of Mine Seismology are gratefully acknowledged.

  8. Study of Landslide Disaster Prevention System in Malaysia as a Disaster Mitigation Prototype for South East Asia Countries

    NASA Astrophysics Data System (ADS)

    Koay, Swee Peng; Fukuoka, Hiroshi; Tien Tay, Lea; Murakami, Satoshi; Koyama, Tomofumi; Chan, Huah Yong; Sakai, Naoki; Hazarika, Hemanta; Jamaludin, Suhaimi; Lateh, Habibah

    2016-04-01

    Every year, hundreds of landslides occur in Malaysia and other tropical monsoon South East Asian countries. Therefore, preventing casualties and economic losses from rain-induced slope failures is among these governments' most important agenda items. In Malaysia, millions of Malaysian Ringgit are allocated for slope monitoring and mitigation in every annual budget. Besides monitoring the slopes, here we propose an IT system that provides hazard map information, landslide historical information, slope failure prediction, knowledge on natural hazards, and information on evacuation centres via the internet, so that users can understand the risk of landslides as well as floods. Moreover, the user can obtain information on rainfall intensity at the monitoring sites to predict the occurrence of slope failure. Furthermore, we are working with PWD, Malaysia to set the threshold value for the landslide prediction system, which will alert officers if there is a risk of slope failure at the monitoring sites, based on calculated rainfall intensity. Although IT plays a significant role in information dissemination, education is also important in disaster prevention: educating school students to be more alert to natural hazards provides a bottom-up approach to informing parents about natural hazards through conversation among family members, as most parents are busy and may not have time to attend natural hazard workshops. Many ethnic groups live in Malaysia, as in most South East Asian countries, and it is not easy to educate them with a single method, as living standards and education levels differ. We started landslide education workshops in primary schools in rural and urban areas of Malaysia. We found that we have to use the students' mother tongue while conducting natural hazard education for better understanding. We took questionnaires from the students before and after the education workshop. Learning from the questionnaire result, the students are

  9. Landslide hazard evaluation: a review of current techniques and their application in a multi-scale study, Central Italy

    NASA Astrophysics Data System (ADS)

    Guzzetti, Fausto; Carrara, Alberto; Cardinali, Mauro; Reichenbach, Paola

    1999-12-01

    In recent years, growing population and expansion of settlements and life-lines over hazardous areas have largely increased the impact of natural disasters both in industrialized and developing countries. Third world countries have difficulty meeting the high costs of controlling natural hazards through major engineering works and rational land-use planning. Industrialized societies are increasingly reluctant to invest money in structural measures that can reduce natural risks. Hence, the new issue is to implement warning systems and land utilization regulations aimed at minimizing the loss of lives and property without investing in long-term, costly projects of ground stabilization. Government and research institutions worldwide have long attempted to assess landslide hazard and risks and to portray its spatial distribution in maps. Several different methods for assessing landslide hazard were proposed or implemented. The reliability of these maps and the criteria behind these hazard evaluations are ill-formalized or poorly documented. Geomorphological information remains largely descriptive and subjective. It is, hence, somewhat unsuitable to engineers, policy-makers or developers when planning land resources and mitigating the effects of geological hazards. In the Umbria and Marche Regions of Central Italy, attempts at testing the proficiency and limitations of multivariate statistical techniques and of different methodologies for dividing the territory into suitable areas for landslide hazard assessment have been completed, or are in progress, at various scales. These experiments showed that, despite the operational and conceptual limitations, landslide hazard assessment may indeed constitute a suitable, cost-effective aid to land-use planning. Within this framework, engineering geomorphology may play a renewed role in assessing areas at high landslide hazard, and helping mitigate the associated risk.

  10. Why so many sperm cells? Not only a possible means of mitigating the hazards inherent to human reproduction but also an indicator of an exaptation

    PubMed Central

    Barlow, Peter W.

    2016-01-01

    ABSTRACT Redundancy—the excess of supply over necessity—has recently been proposed for human sperm cells. However, the apparent superfluity of cell numbers may be necessary in order to circumvent the hazards, many of which can be quantified, that can occur during the transition from gametogenesis within the testes to zygosis within the female reproductive tract. Sperm cell numbers are directly related to testicular volume, and it is owing to a redundancy, and the possible exaptation, of this latter parameter that a putative excess of sperm cells is perceived. PMID:27574542

  11. Computational study of laser imprint mitigation in foam-buffered inertial confinement fusion targets

    SciTech Connect

    Mason, R.J.; Kopp, R.A.; Vu, H.X.; Wilson, D.C.; Goldman, S.R.; Watt, R.G.; Dunne, M.; Willi, O.

    1998-01-01

    Recent experiments have shown that low density foam layers can significantly mitigate the perturbing effects of beam nonuniformities affecting the acceleration of thin shells. This problem is studied parametrically with two-dimensional LASNEX [G. B. Zimmerman and W. L. Kruer, Comments Plasma Phys. Controlled Fusion 2, 51 (1975)]. Foam-buffered targets are employed, consisting typically of 250 Å of gold and 50 μm of 50 mg/cm³ C₁₀H₈O₄ foam attached to a 10 μm foil. In simulation these were characteristically exposed to 1.2 ns, flat-topped green light pulses at 1.4×10¹⁴ W/cm² intensity, bearing 30 μm lateral perturbations of up to 60% variation in intensity. Without the buffer layers the foils were severely disrupted by 1 ns. With buffering only minimal distortion was manifest at 3 ns. The smoothing is shown to derive principally from the high thermal conductivity of the heated foam. The simulation results imply that (1) the foam thickness should exceed the disturbance wavelength; (2) intensities exceeding 5×10¹³ W/cm² are needed for assured stability beyond 2 ns; (3) longer foams at lower densities are needed for effective mitigation with shorter wavelength light; (4) the gold layer hastens conversion of the structured foam to a uniform plasma. © 1998 American Institute of Physics.

  12. New debris flow mitigation measures in southern Gansu, China: a case study of the Zhouqu Region

    NASA Astrophysics Data System (ADS)

    Xiong, Muqi; Meng, Xingmin; Li, Yajun

    2014-05-01

    A devastating debris flow occurred in Zhouqu, Gansu Province, China, on 8 August 2010, resulting in a catastrophic disaster in which 1,463 people perished. The debris flow valleys, like numerous other debris flow valleys in this mountainous region, had preventive engineering structures, such as check dams, properly designed according to common engineering practice to safeguard the town located directly on the debris flow fan. However, failures of such preventive measures often cause even heavier disasters than events with no human intervention, because the mitigation works give a false impression of safety. Given this situation, and in order to explore a more effective disaster prevention strategy against debris flows in the mountainous region, this paper presents a comparative study of two cases in the area, one with preventive structures and one without. The results show that the mitigation measures that have commonly been applied in disaster reduction practice in the region are of questionable appropriateness. It is concluded that working with nature and following natural processes is the best strategy for disaster reduction in the region. Key words: debris flow disasters, disaster reduction strategy, preventive measures

  13. Study of the ELM fluctuation characteristics during the mitigation of type-I ELMs

    NASA Astrophysics Data System (ADS)

    Bogomolov, A. V.; Classen, I. G. J.; Boom, J. E.; Donné, A. J. H.; Wolfrum, E.; Fischer, R.; Viezzer, E.; Schneider, P.; Manz, P.; Suttrop, W.; Luhmann, N. C., Jr.

    2015-08-01

    Transitions from type-I to small edge localized modes (ELMs) and back are studied using the electron cyclotron emission imaging (ECEI) diagnostic on the ASDEX Upgrade (AUG). ECEI measurements show that the average poloidal velocity of temperature fluctuations of both type-I ELM onsets and small ELMs is the same and is close to 5-6 km s⁻¹. Radially, the temperature fluctuations are distributed in the same narrow 2 cm region, 0.975 ≤ ρ_pol ≤ 1.025, with associated poloidal mode numbers m = 96 ± 18 and toroidal mode numbers n = 16 ± 4. The observed fluctuations related to both type-I ELMs and small ELMs vary over the transition simultaneously, however, showing slightly different behaviour. The similarities between type-I ELMs and small ELMs observed via AUG suggest that they have the same nature and evolve together. In the transition phase a temperature fluctuation mode ('inter-ELM mode') appears, which becomes continuous in the mitigated ELM phase and might cause the ELM mitigation. The mode characteristics (velocity, frequency and wave number) obtained in the analysis can be further used for direct comparison in various code simulations.

  14. Study on mitigation of pulsed heat load for ITER cryogenic system

    NASA Astrophysics Data System (ADS)

    Peng, N.; Xiong, L. Y.; Jiang, Y. C.; Tang, J. C.; Liu, L. Q.

    2015-03-01

    One of the key requirements for ITER cryogenic system is the mitigation of the pulsed heat load deposited in the magnet system due to magnetic field variation and pulsed DT neutron production. As one of the control strategies, bypass valves of Toroidal Field (TF) case helium loop would be adjusted to mitigate the pulsed heat load to the LHe plant. A quasi-3D time-dependent thermal-hydraulic analysis of the TF winding packs and TF case has been performed to study the behaviors of TF magnets during the reference plasma scenario with the pulses of 400 s burn and repetition time of 1800 s. The model is based on a 1D helium flow and quasi-3D solid heat conduction model. The whole TF magnet is simulated taking into account thermal conduction between winding pack and case which are cooled separately. The heat loads are given as input information, which include AC losses in the conductor, eddy current losses in the structure, thermal radiation, thermal conduction and nuclear heating. The simulation results indicate that the temperature variation of TF magnet stays within the allowable range when the smooth control strategy is active.
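
    As a rough illustration of how a thermal buffer between the magnets and the LHe plant smooths a pulsed load, the sketch below integrates a lumped first-order model. It is a toy model under stated assumptions (single lumped heat capacity, fixed effective time constant, illustrative numbers), not the quasi-3D thermal-hydraulic model described in the abstract.

```python
# Toy lumped model: a pulsed heat load Q_in(t) deposited in the TF case helium
# loop is buffered before reaching the LHe plant; the load seen by the plant
# relaxes with an effective time constant tau. All numbers are illustrative
# placeholders, not ITER parameters.

def q_pulse(t, period=1800.0, burn=400.0, q_burn=10.0, q_base=1.0):
    """Pulsed heat load (kW): q_burn during the burn phase, q_base otherwise."""
    return q_burn if (t % period) < burn else q_base

def smoothed_load(t_end=7200.0, dt=1.0, tau=900.0):
    """Explicit Euler integration of dQ_out/dt = (Q_in - Q_out)/tau."""
    q_out, history, t = 1.0, [], 0.0
    while t < t_end:
        q_out += dt * (q_pulse(t) - q_out) / tau
        history.append((t, q_out))
        t += dt
    return history

if __name__ == "__main__":
    hist = smoothed_load()
    peak = max(q for _, q in hist)
    print(f"peak load seen by the plant: {peak:.2f} kW (source peak 10.00 kW)")
```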

  15. Methane emissions from landfills in Serbia and potential mitigation strategies: a case study.

    PubMed

    Stanisavljevic, Nemanja; Ubavin, Dejan; Batinic, Bojan; Fellner, Johann; Vujic, Goran

    2012-10-01

    Open dumping and landfilling have represented the predominant method of waste management in Serbia during the past decades. This practice resulted in over 3600 waste disposal sites distributed all over the country. The locations of the sites and their characteristics have been determined in the framework of the presented study. The vast majority of disposal sites (up to 3300) are characterized by small deposition depth of waste and total waste volumes of less than 10,000 m³. Only about 50 landfills in Serbia contain more than 100,000 m³ of waste. These large landfills are responsible for more than 95% of the total CH₄ emissions from waste disposal, which was assessed as 60,000 tons of CH₄ in 2010. The evaluation of different measures [soil cover, compost cover and landfill gas (LFG) systems] for mitigating greenhouse gas emissions from Serbian landfills indicated that enhanced microbial CH₄ oxidation (using a compost cover), as well as the installation of LFG systems, could generate net revenues as saved CH₄ emissions are creditable for the European Greenhouse Gas Emissions Trading Scheme. In total between 4 and 7 million tons of CO₂ equivalent emissions could be avoided within the next 20 years by mitigating CH₄ emissions from Serbian landfills. PMID:22751946
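
    The order of magnitude of the avoided emissions quoted above can be checked with a back-of-the-envelope conversion. The 100-year global warming potential of 25 for CH4 (the IPCC AR4 value) and the assumed mitigation fractions below are illustrative assumptions, not figures taken from the study.

```python
# Rough consistency check: converting mitigated CH4 to CO2-equivalent.
# GWP_100 = 25 (IPCC AR4) and the mitigation fractions are assumptions for
# illustration; the study's own modelling gives 4-7 Mt CO2-eq over 20 years.

GWP_CH4 = 25                      # kg CO2-eq per kg CH4 (100-year horizon, AR4)
annual_ch4_tonnes = 60_000        # estimated CH4 emissions from Serbian landfills in 2010
years = 20

for mitigated_fraction in (0.15, 0.25):
    avoided = annual_ch4_tonnes * years * mitigated_fraction * GWP_CH4 / 1e6
    print(f"{mitigated_fraction:.0%} mitigation -> ~{avoided:.1f} Mt CO2-eq over {years} years")
```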

  16. Emergency planning for hazardous industrial areas: a Brazilian case study.

    PubMed

    de Souza, A B

    2000-08-01

    One of the characteristics of modern industrial development is the emergence of a new typology of accidents whose effects can be spread, in space as well as in time, well beyond the borders of the installations where they occur, sometimes impacting the local population and the environment in a catastrophic fashion. This is the result of a number of factors that have changed the risk profile of modern industrial activities. For a number of reasons, the developing countries have proved to be more vulnerable to industrial disasters. Three of the most catastrophic industrial accidents--Bhopal, San Juan de Ixhuatepec, and Cubatão--occurred in developing countries, claiming thousands of lives. During the 1970s and 1980s the higher degree of public visibility of industrial hazards as a result of serious accidents, led to the creation, especially in the more industrialized countries, of regulations for greater control over industrial activities, either by means of new laws or by updating existing legislation. Some of these regulations were designed to improve the response to accidents with potential impacts outside the industrial sites. This article attempts to describe the current status and identify the shortcomings of off-site emergency planning for hazardous industrial areas in Brazil. The most important problems are the lack of specific legislation and the absence of awareness and active participation of public authorities. The experience of an off-site emergency planning process for a Brazilian industrial area is presented. This experience illustrates how difficult it is to prepare and implement emergency planning processes in an industrializing country. PMID:11051072

  17. Field Study of Exhaust Fans for Mitigating Indoor Air Quality Problems & Indoor Air Quality - Exhaust Fan Mitigation.

    SciTech Connect

    United States. Bonneville Power Administration.

    1987-07-01

    Overall, the findings show that exhaust fans basically provide small amounts of ventilation compensation. By monitoring the common indoor air pollutants (radon, formaldehyde, carbon monoxide, nitrogen dioxide, and water vapor), it was found that the quality of the indoor air was not adversely affected by the use of exhaust fans. Nor did their use provide any measurable or significant benefits since no improvement in air quality was ascertained. While exhaust fans of this small size did not increase radon, which is the contaminant of most concern, the researchers caution that operation of a larger fan or installation in a very tight home could result in higher levels because depressurization is greater. The daily energy consumption for use of these appliances during the heating season was calculated to be 1.5 kilowatt hours or approximately 3% of the energy consumption in the study homes. The information collected in this collaborative field study indicates that the use of these particular ventilation systems has no significant effect on indoor air quality.

  18. Hazard Interactions and Interaction Networks (Cascades) within Multi-Hazard Methodologies

    NASA Astrophysics Data System (ADS)

    Gill, Joel; Malamud, Bruce D.

    2016-04-01

    Here we combine research and commentary to reinforce the importance of integrating hazard interactions and interaction networks (cascades) into multi-hazard methodologies. We present a synthesis of the differences between 'multi-layer single hazard' approaches and 'multi-hazard' approaches that integrate such interactions. This synthesis suggests that ignoring interactions could distort management priorities, increase vulnerability to other spatially relevant hazards or underestimate disaster risk. We proceed to present an enhanced multi-hazard framework, through the following steps: (i) describe and define three groups (natural hazards, anthropogenic processes and technological hazards/disasters) as relevant components of a multi-hazard environment; (ii) outline three types of interaction relationship (triggering, increased probability, and catalysis/impedance); and (iii) assess the importance of networks of interactions (cascades) through case-study examples (based on literature, field observations and semi-structured interviews). We further propose visualisation frameworks to represent these networks of interactions. Our approach reinforces the importance of integrating interactions between natural hazards, anthropogenic processes and technological hazards/disasters into enhanced multi-hazard methodologies. Multi-hazard approaches support the holistic assessment of hazard potential, and consequently disaster risk. We conclude by describing three ways by which understanding networks of interactions contributes to the theoretical and practical understanding of hazards, disaster risk reduction and Earth system management. Understanding interactions and interaction networks helps us to better (i) model the observed reality of disaster events, (ii) constrain potential changes in physical and social vulnerability between successive hazards, and (iii) prioritise resource allocation for mitigation and disaster risk reduction.
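
    A network of hazard interactions of the kind discussed above can be represented as a small directed graph, with edges labelled by interaction type. The hazards, edge labels and traversal below are a hypothetical illustration of the data structure, not an example taken from the paper.

```python
# Minimal sketch: a directed graph of hazard interactions, where an edge
# A -> B labelled "triggering" or "increased probability" records that hazard A
# can trigger, or make more likely, hazard B. The entries are illustrative only.

from collections import defaultdict

interactions = defaultdict(list)

def add_interaction(source, target, relation):
    interactions[source].append((target, relation))

add_interaction("earthquake", "landslide", "triggering")
add_interaction("earthquake", "tsunami", "triggering")
add_interaction("storm", "flood", "triggering")
add_interaction("landslide", "flood", "increased probability")  # e.g. a blocked river channel

def cascade_from(hazard, depth=0, seen=None):
    """Print the potential interaction network (cascade) starting from one hazard."""
    seen = set() if seen is None else seen
    for target, relation in interactions.get(hazard, []):
        print("  " * depth + f"{hazard} --{relation}--> {target}")
        if target not in seen:
            seen.add(target)
            cascade_from(target, depth + 1, seen)

if __name__ == "__main__":
    cascade_from("earthquake")
```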

  19. Assessment of bio-physical drought hazards. A case study of Karkheh River basin in Iran

    NASA Astrophysics Data System (ADS)

    Kamali, Bahareh; Abbaspour, Karim; Houshmand Kouchi, Delaram; Yang, Hong

    2016-04-01

    Iran has been affected by frequent droughts, and climate change is expected to intensify the situation in the future. Extreme drought events have had serious impacts on the hydrological and agricultural sectors. Thus, identification of bio-physical drought hazards is critically important for formulating effective adaptive measures to improve water and food security. This study aims to investigate the temporal and spatial patterns of drought hazards in the meteorological, hydrological, and agricultural (collectively, bio-physical) sectors in the Karkheh River Basin of Iran in the historical and future climate change context. To do so, drought hazard indices were built based on the severity and frequency of the standardized precipitation index (SPI), standardized runoff index (SRI), and standardized soil moisture index (SSMI), which represent the three aspects of drought hazards. Variables required for calculating these indices were obtained from a SWAT (Soil and Water Assessment Tool) model constructed for the basin. The model was calibrated against monthly runoff using the Sequential Uncertainty Fitting (SUFI-2) algorithm in SWAT-CUP. Based on the climate variability and drought analysis, three drought hazard classes, namely low, medium and high, were defined. This helps identify how the agricultural and hydrological sectors are related to meteorological droughts. Additionally, the bio-physical drivers of drought hazards were identified for each class. Comparing the results for the historical and future scenarios revealed that the frequency of high-severity hazards will increase, whereas the same is not predicted for areas with medium hazard intensity. The findings of this study indicate that the combined application of the SWAT model with the bio-physical drought hazard concept supports a better understanding of climate risks to water and food security. The developed approach is replicable at different scales to provide a robust planning tool for policy makers.
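
    The standardized indices named above (SPI, SRI, SSMI) all follow the same recipe: characterize the distribution of a hydrological variable over a reference period and express each value as a standard normal deviate. The sketch below uses a plain z-score approximation rather than the gamma-distribution fit usually used for SPI, and the three-class cut-offs are illustrative assumptions, so it shows the idea rather than the exact index definition.

```python
# Simplified standardized index (SPI/SRI/SSMI-style): express each monthly value
# as a deviation from the long-term mean in units of standard deviation.
# A proper SPI fits a gamma distribution and maps to the normal quantile;
# the plain z-score and the class thresholds below are only illustrative.

import statistics

def standardized_index(series):
    mu = statistics.mean(series)
    sigma = statistics.stdev(series)
    return [(x - mu) / sigma for x in series]

def classify(z):
    if z <= -1.5:
        return "high drought hazard"
    if z <= -1.0:
        return "medium drought hazard"
    return "low drought hazard"

if __name__ == "__main__":
    monthly_precip = [42.0, 55.0, 38.0, 61.0, 12.0, 47.0, 8.0, 50.0]  # mm, illustrative
    for month, z in enumerate(standardized_index(monthly_precip), start=1):
        print(f"month {month}: z = {z:+.2f} -> {classify(z)}")
```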

  20. Mitigation and prevention of exertional heat stress in firefighters: a review of cooling strategies for structural firefighting and hazardous materials responders.

    PubMed

    McEntire, Serina J; Suyama, Joe; Hostler, David

    2013-01-01

    Most duties performed by firefighters require the use of personal protective equipment, which inhibits normal thermoregulation during exertion, creating an uncompensable heat stress. Structured rest periods are required to correct the effects of uncompensable heat stress and ensure that firefighter safety is maintained and that operations can be continued until their conclusion. While considerable work has been done to optimize firefighter cooling during fireground operations, there is little consensus on when or how cooling should be deployed. A systematic review of cooling techniques and practices among firefighters and hazardous materials operators was conducted to describe the state of the science and provide recommendations for deploying resources for fireground rehab (i.e., structured rest periods during an incident). Five electronic databases were searched using a selected combination of key words. One hundred forty publications were found in the initial search, with 27 meeting all the inclusion criteria. Two independent reviewers performed a qualitative assessment of each article based on nine specific questions. From the selected literature, the efficacy of multiple cooling strategies was compared during exertion and immediately following exertion under varying environmental conditions. When considering the literature available for cooling firefighters and hazardous materials technicians during emergency incident rehabilitation, widespread use of cooling devices does not appear to be warranted if ambient temperature and humidity approximate room temperature and protective garments can be removed. When emergency incident rehabilitation must be conducted in hot or humid conditions, active cooling devices are needed. Hand/forearm immersion is likely the best modality for cooling during rehab under hot, humid conditions; however, this therapy has a number of limitations. Cooling during work thus far has been limited primarily to cooling vests and liquid- or

  1. BICAPA case study of natural hazards that trigger technological disasters

    NASA Astrophysics Data System (ADS)

    Boca, Gabriela; Ozunu, Alexandru; Nicolae Vlad, Serban

    2010-05-01

    Industrial facilities are vulnerable to natural disasters. Natural disasters and technological accidents are not always singular or isolated events. The examples in this paper show that they can occur in complex combinations and/or in rapid succession, known as NaTech disasters, thereby triggering multiple impacts. This analysis indicates that NaTech disasters have the potential to trigger hazmat releases and other types of technological accidents. Climate change plays an important role in the prevalence and triggering mechanisms of NaTech events. Projections under the IPCC IS92a scenario (similar to SRES A1B; IPCC, 1992) and two GCMs indicate that the risk of floods increases in central and eastern Europe, and an increase in intense short-duration precipitation is likely to lead to increased risk of flash floods (Lehner et al., 2006). It is urgent to develop tools for the assessment of risks due to NaTech events in industrial processes, in a framework starting with the characterization of the frequency and severity of natural disasters and continuing with complex analysis of industrial processes, risk assessment and residual functionality analysis. Ponds holding dangerous technological residues are the most vulnerable targets of natural hazards. Technological accidents such as those in Baia Mare (January to March 2000) had an important international echo. Extreme weather phenomena, like those in the winter of 2000 in Baia Mare, and other natural disasters such as floods or earthquakes, could cause a similar disaster at Târnăveni in the Transylvania Depression. During 1972-1978, three decanting ponds were built on the Chemical Platform Târnăveni, now SC BICAPA SA, for disposal of the hazardous wastes resulting from the manufacture of sodium dichromate and inorganic salts, sludge from waste water purification and filtration, and wet gas production from carbide. The ponds are located on the right bank of the river Târnava, about 35-50 m from the flood defense dam. The total

  2. Multihazard risk analysis and disaster planning for emergency services as a basis for efficient provision in the case of natural hazards - case study municipality of Au, Austria

    NASA Astrophysics Data System (ADS)

    Maltzkait, Anika; Pfurtscheller, Clemens

    2014-05-01

    The extreme flood events of 2002, 2005 and 2013 in Austria underlined the importance of local emergency services being able to withstand and reduce the adverse impacts of natural hazards. Although municipal emergency and crisis management plans exist in Austria for legal reasons, they mostly do not cover risk analyses of natural hazards - a sound, comparable assessment to identify and evaluate risks. Moreover, total losses and operational emergencies triggered by natural hazards have increased in recent decades. Given sparse public funds, objective budget decisions are needed to ensure the efficient provision of operating resources, such as personnel, vehicles and equipment, in the case of natural hazards. We present a case study of the municipality of Au, Austria, which was severely affected during the 2005 floods. Our approach is primarily based on a qualitative risk analysis, combining existing hazard plans, GIS data, field mapping and data on the operational efforts of the fire departments. The risk analysis includes a map of phenomena discussed in a workshop with local experts, as well as a list of risks and a risk matrix prepared at that workshop. On this basis, the exact requirements for technical and non-technical mitigation measures for each natural hazard risk were analysed in close collaboration with members of the municipal operation control and members of the local emergency services (fire brigade, Red Cross). The measures include warning, evacuation and technical interventions with heavy equipment and personnel. These results are used, first, to improve the municipal emergency and crisis management plan by providing a risk map, and a

  3. Melatonin mitigate cerebral vasospasm after experimental subarachnoid hemorrhage: a study of synchrotron radiation angiography

    NASA Astrophysics Data System (ADS)

    Cai, J.; He, C.; Chen, L.; Han, T.; Huang, S.; Huang, Y.; Bai, Y.; Bao, Y.; Zhang, H.; Ling, F.

    2013-06-01

    Cerebral vasospasm (CV) after subarachnoid hemorrhage (SAH) is a devastating and unsolved clinical issue. In this study, rat models in which SAH had been induced by prechiasmatic cistern injection were treated with melatonin. Synchrotron radiation angiography (SRA) was employed to detect and evaluate CV in the animal models. Neurological scoring and histological examinations were used to assess the neurological deficits and CV as well. Using SRA techniques and histological analyses, the anterior cerebral artery diameters of SAH rats with melatonin administration were larger than those without melatonin treatment (p < 0.05). The neurological deficits of SAH rats treated with melatonin were less than those without melatonin treatment (p < 0.05). We concluded that SRA is a precise in vivo tool to observe and evaluate CV in SAH rats, and that intraperitoneal administration of melatonin can mitigate CV after experimental SAH.

  4. Echo-sounding method aids earthquake hazard studies

    USGS Publications Warehouse

    U.S. Geological Survey

    1995-01-01

    Dramatic examples of catastrophic damage from an earthquake occurred in 1989, when the M 7.1 Loma Prieta earthquake rocked the San Francisco Bay area, and in 1994, when the M 6.6 Northridge earthquake jolted southern California. The surprising amount and distribution of damage to private property and infrastructure emphasize the importance of seismic-hazard research in urbanized areas, where the potential for damage and loss of life is greatest. During April 1995, a group of scientists from the U.S. Geological Survey and the University of Tennessee, using an echo-sounding method described below, is collecting data in San Antonio Park, California, to examine the Monte Vista fault, which runs through this park. The Monte Vista fault in this vicinity shows evidence of movement within the last 10,000 years or so. The data will give them a "picture" of the subsurface rock deformation near this fault. The data will also be used to help locate a trench that will be dug across the fault by scientists from William Lettis & Associates.

  5. Study on mobility-disadvantage group' risk perception and coping behaviors of abrupt geological hazards in coastal rural area of China.

    PubMed

    Pan, Anping

    2016-07-01

    China is a country highly vulnerable to abrupt geological hazards. The present study aims to investigate disaster preparedness and perception of abrupt geological disasters (such as rock avalanches, landslides and mud-rock flows) among the mobility-disadvantaged group living in coastal rural areas of China. The research takes into account all relevant disaster factors and designs the questionnaires accordingly. Two debris-flow-vulnerable townships were selected as study areas: Hedi Township in Qinyuan County and Xianxi Township in Yueqing City, both located in East China's Zhejiang Province. SPSS was applied to conduct descriptive analysis, which results in an effective empirical model for the evacuation behavior of mobility-disadvantaged groups. The results of this study show that mobility-disadvantaged groups' awareness of disaster prevention and mitigation is poor and their knowledge of basic theory and emergency response is limited. Errors and distortions in public awareness of disaster prevention and mitigation encourage development in areas with frequent disasters, which exposes more lives and property to danger and aggravates the vulnerability of hazard-bearing bodies. In conclusion, before drafting emergency plans, the government should give more consideration to the mobility-disadvantaged group's expectations and actual evacuation behavior, rather than only to the demands of the situation, to ensure that the plans work in practice. PMID:27174691

  6. Study of Seismic Hazards in the Center of the State of Veracruz, MÉXICO.

    NASA Astrophysics Data System (ADS)

    Torres Morales, G. F.; Leonardo Suárez, M.; Dávalos Sotelo, R.; Mora González, I.; Castillo Aguilar, S.

    2015-12-01

    Preliminary results obtained from the project "Microzonation of geological and hydrometeorological hazards for the conurbations of Orizaba, Veracruz, and major sites located in the lower sub-basins of the Antigua and Jamapa" are presented. This project was supported by the Joint Funds of CONACyT and the Veracruz state government. A probabilistic seismic hazard assessment (henceforth PSHA) was developed for the central area of Veracruz State, mainly in a region bounded by the watersheds of the Jamapa and Antigua rivers, with the aim of evaluating the geological and hydrometeorological hazards in this region. The project pays particular attention to extreme weather phenomena, floods and earthquakes, in order to calculate the risk they induce of landslides and rock falls. In addition, as part of the study, the PSHA was developed considering site effects in the urban zones of the cities of Xalapa and Orizaba; the site effects were incorporated through a standard format proposed in microzonation studies and their application in computer systems, which allows microzonation studies of a city to be optimized and condensed. The results of the PSHA are presented as seismic hazard maps (hazard footprints), exceedance rate curves and uniform hazard spectra for different spectral ordinates between 0.01 and 5.0 seconds, associated with selected return periods of 72, 225, 475 and 2475 years.
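
    For reference, the exceedance-rate curves mentioned above come from the standard probabilistic seismic hazard integral; in generic (Cornell-McGuire) notation it can be written as below. The symbols are generic and not tied to this project's particular implementation.

```latex
% Annual rate of exceeding a ground-motion level a at the site,
% summed over N seismic sources (generic Cornell-McGuire PSHA form):
\lambda(PGA > a) \;=\; \sum_{i=1}^{N} \nu_i
    \int_{m} \int_{r} P\left(PGA > a \mid m, r\right)\,
    f_{M_i}(m)\, f_{R_i}(r)\, \mathrm{d}r\, \mathrm{d}m
```

    Here ν_i is the activity rate of source i, f_M and f_R are the magnitude and source-to-site distance density functions, and P(PGA > a | m, r) comes from the attenuation (ground-motion prediction) relationship.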

  7. A combined approach to physical vulnerability of large cities exposed to natural hazards - the case study of Arequipa, Peru

    NASA Astrophysics Data System (ADS)

    Thouret, Jean-Claude; Ettinger, Susanne; Zuccaro, Giulio; Guitton, Mathieu; Martelli, Kim; Degregorio, Daniela; Nardone, Stefano; Santoni, Olivier; Magill, Christina; Luque, Juan Alexis; Arguedas, Ana

    2013-04-01

    Arequipa, the second largest city in Peru with almost one million inhabitants, is exposed to various natural hazards, such as earthquakes, landslides, flash floods, and volcanic eruptions. This study focuses on the vulnerability and response of housing, infrastructure and lifelines in Arequipa to flash floods and eruption-induced hazards, notably lahars from El Misti volcano. We propose a combined approach for assessing physical vulnerability in a large city based on: (1) remote sensing utilizing high-resolution imagery (SPOT5, Google Earth Pro, Bing, Pléïades) to map the distribution and type of land use and the properties of city blocks in terms of exposure to the hazard (elevation above river level, distance to channel, impact angle, etc.); (2) in situ survey of buildings, critical infrastructure (e.g., bridges) and strategic resources (e.g., potable water, irrigation, sewage); (3) information gained from interviews with engineers involved in construction works, previous crises (e.g., the June 2001 earthquake) and risk mitigation in Arequipa. Remote sensing and mapping at the scale of the city has focused on three pilot areas, along the perennial Rio Chili valley that crosses the city and oasis from north to south, and two of the east-margin tributaries termed Quebrada (ravine): San Lazaro crossing the northern districts and Huarangal crossing the northeastern districts. Sampling of city blocks through these districts provides varying geomorphic, structural, historical, and socio-economic characteristics for each sector. A reconnaissance survey included about 900 edifices located in 40 city blocks across districts of the pilot areas, distinct in age, construction, land use and demographics. A building acts as a structural system, and its strength and resistance to flash floods and lahars therefore depend highly on the type of construction and the materials used. Each building surveyed was assigned to one of eight building categories based on physical criteria (dominant

  8. 44 CFR 78.6 - Flood Mitigation Plan approval process.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 44 Emergency Management and Assistance 1 2014-10-01 2014-10-01 false Flood Mitigation Plan..., DEPARTMENT OF HOMELAND SECURITY INSURANCE AND HAZARD MITIGATION National Flood Insurance Program FLOOD MITIGATION ASSISTANCE § 78.6 Flood Mitigation Plan approval process. The State POC will forward all...

  9. 44 CFR 78.6 - Flood Mitigation Plan approval process.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 44 Emergency Management and Assistance 1 2012-10-01 2011-10-01 true Flood Mitigation Plan approval..., DEPARTMENT OF HOMELAND SECURITY INSURANCE AND HAZARD MITIGATION National Flood Insurance Program FLOOD MITIGATION ASSISTANCE § 78.6 Flood Mitigation Plan approval process. The State POC will forward all...

  10. 44 CFR 78.6 - Flood Mitigation Plan approval process.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 44 Emergency Management and Assistance 1 2013-10-01 2013-10-01 false Flood Mitigation Plan..., DEPARTMENT OF HOMELAND SECURITY INSURANCE AND HAZARD MITIGATION National Flood Insurance Program FLOOD MITIGATION ASSISTANCE § 78.6 Flood Mitigation Plan approval process. The State POC will forward all...

  11. 44 CFR 78.6 - Flood Mitigation Plan approval process.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 44 Emergency Management and Assistance 1 2011-10-01 2011-10-01 false Flood Mitigation Plan..., DEPARTMENT OF HOMELAND SECURITY INSURANCE AND HAZARD MITIGATION National Flood Insurance Program FLOOD MITIGATION ASSISTANCE § 78.6 Flood Mitigation Plan approval process. The State POC will forward all...

  12. 44 CFR 78.6 - Flood Mitigation Plan approval process.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 44 Emergency Management and Assistance 1 2010-10-01 2010-10-01 false Flood Mitigation Plan..., DEPARTMENT OF HOMELAND SECURITY INSURANCE AND HAZARD MITIGATION National Flood Insurance Program FLOOD MITIGATION ASSISTANCE § 78.6 Flood Mitigation Plan approval process. The State POC will forward all...

  13. Blast effect on the lower extremities and its mitigation: a computational study.

    PubMed

    Dong, Liqiang; Zhu, Feng; Jin, Xin; Suresh, Mahi; Jiang, Binhui; Sevagan, Gopinath; Cai, Yun; Li, Guangyao; Yang, King H

    2013-12-01

    A series of computational studies were performed to investigate the response of the lower extremities of mounted soldiers under landmine detonation. A numerical human body model newly developed at Wayne State University was used to simulate two types of experimental studies, and the model predictions were validated against test data in terms of the tibia axial force as well as the bone fracture pattern. Based on the validated model, the minimum axial force causing tibia fracture was found. Then a series of parametric studies was conducted to determine the critical velocity (peak velocity of the floor plate) causing tibia fracture at different upper/lower leg angles. In addition, to limit the load transmission through the vehicular floor, two types of energy absorbing materials, namely IMPAXX(®) foam and aluminum alloy honeycomb, were selected for floor matting. Their performances in terms of blast effect mitigation were compared using the validated numerical model, and it has been found that honeycomb is a more efficient material for blast injury prevention under the loading conditions studied. PMID:23973770

  14. Public willingness to pay for CO2 mitigation and the determinants under climate change: a case study of Suzhou, China.

    PubMed

    Yang, Jie; Zou, Liping; Lin, Tiansheng; Wu, Ying; Wang, Haikun

    2014-12-15

    This study explored the factors that influence respondents' willingness to pay (WTP) for CO2 mitigation under climate change. A questionnaire survey combining contingent valuation and psychometric paradigm methods was conducted in the city of Suzhou, Jiangsu Province, China. Respondents' traditional demographic attributes, risk perception of greenhouse gases (GHGs), and attitude toward the government's risk management practices were entered into a Tobit model to analyze the determinants of WTP. The results showed that about 55% of the respondents refused to pay for CO2 mitigation, and that respondents' WTP increased with increasing CO2 mitigation percentage. Important factors influencing WTP include people's feeling of dread of GHGs, confidence in policy, the timeliness of governmental information disclosure, age, education and income level. PMID:25151109

  15. Underground Coal-Fires in Xinjiang, China: A Continued Effort in Applying Geophysics to Solve a Local Problem and to Mitigate a Global Hazard

    NASA Astrophysics Data System (ADS)

    Wuttke, M. W.; Halisch, M.; Tanner, D. C.; Cai, Z. Y.; Zeng, Q.; Wang, C.

    2012-04-01

    laboratory measurements realistic dynamical models of fire-zones are constructed to increase the understanding of particular coal-fires, to interpret the surface signatures of the coal-fire in terms of location and propagation and to estimate the output of hazardous exhaust products to evaluate the economic benefit of fire extinction.

  16. Studying and Improving Human Response to Natural Hazards: Lessons from the Virtual Hurricane Lab

    NASA Astrophysics Data System (ADS)

    Meyer, R.; Broad, K.; Orlove, B. S.

    2010-12-01

    One of the most critical challenges facing communities in areas prone to natural hazards is how best to encourage residents to invest in individual and collective actions that would reduce the damaging impact of low-probability, high-consequence environmental events. Unfortunately, what makes this goal difficult to achieve is that the relative rarity of natural hazards implies that many who face the risk of natural hazards have no previous experience to draw on when making preparation decisions, or have prior experience that provides misleading guidance on how best to prepare. For example, individuals who have experienced strings of minor earthquakes or near-misses from tropical cyclones may become overly complacent about the risks that extreme events actually pose. In this presentation we report the preliminary findings of a program of work that explores the use of realistic multi-media hazard simulations designed for two purposes: 1) to serve as a basic research tool for studying how individuals make decisions to prepare for rare natural hazards in laboratory settings; and 2) to serve as an educational tool for giving people in hazard-prone areas virtual experience in hazard preparation. We demonstrate a prototype simulation in which participants experience the approach of a virtual hurricane, where they have the opportunity to invest in different kinds of action to protect their home from damage. As the hurricane approaches, participants have access to an "information dashboard" in which they can gather information about the storm threat from a variety of sources, including mock television weather broadcasts, web sites, and conversations with neighbors. In response to this information they then have the opportunity to invest in different levels of protective actions. Some versions of the simulation are designed as games, where participants are rewarded based on their ability to make the optimal trade-off between under- and over-preparing for the

  17. Alternatives to hazard ratios for comparing efficacy or safety of therapies in noninferiority studies

    PubMed Central

    Uno, Hajime; Wittes, Janet; Fu, Haoda; Solomon, Scott D.; Claggett, Brian; Tian, Lu; Cai, Tianxi; Pfeffer, Marc A.; Evans, Scott R.; Wei, Lee-Jen

    2015-01-01

    A noninferiority study is often used to investigate whether a treatment’s efficacy or safety profile is acceptable compared to an alternative therapy regarding the time to a clinical event. The empirical quantification of the treatment difference for such a study is routinely based on the hazard ratio estimate. The hazard ratio, which is not a relative risk, may be difficult to interpret clinically, especially when the underlying proportional hazards assumption is violated. The precision of the hazard ratio estimate depends primarily on the number of observed events, but not directly on either exposure times or sample size of the study population. If the event rate is low, the study may require an impractically large number of events to ensure that the prespecified noninferiority criterion for the hazard ratio is attainable. This article discusses deficiencies of the current approach for design and analysis of a noninferiority study. We then provide alternative procedures, which do not depend on any model assumption, to compare two treatments. For a noninferiority safety study, the patients’ exposure times are more clinically important than the observed number of events. If the study patients’ exposure times are long enough to evaluate safety reliably, these alternative procedures can effectively provide clinically interpretable evidence on safety, even with relatively few observed events. We illustrate these procedures with data from two studies. One explores the cardiovascular safety of a pain medicine; the second examines the cardiovascular safety of a new treatment for diabetes. These alternative strategies to evaluate safety or efficacy of an intervention lead to more meaningful interpretations of the analysis results than the conventional one via the hazard ratio estimate. PMID:26054047
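
    One model-free summary often discussed in this context is the restricted mean survival time (RMST), the area under the Kaplan-Meier curve up to a fixed horizon, which reflects exposure time directly rather than event counts alone. The sketch below is an illustration of that idea with made-up data; it is not a reproduction of the article's procedures.

```python
# Minimal sketch of a model-free between-group comparison using the restricted
# mean survival time (RMST) up to a horizon tau: the area under the
# Kaplan-Meier curve truncated at tau. The data are hypothetical.

def kaplan_meier(times, events):
    """Return (time, survival) steps from right-censored data (event=1, censored=0)."""
    data = sorted(zip(times, events))
    surv, steps, i = 1.0, [], 0
    while i < len(data):
        t = data[i][0]
        deaths = sum(1 for tt, e in data if tt == t and e == 1)
        n_at_risk = sum(1 for tt, _ in data if tt >= t)
        if deaths > 0:
            surv *= 1.0 - deaths / n_at_risk
            steps.append((t, surv))
        # skip all observations tied at time t
        while i < len(data) and data[i][0] == t:
            i += 1
    return steps

def rmst(times, events, tau):
    """Area under the Kaplan-Meier curve from 0 to tau."""
    area, prev_t, prev_s = 0.0, 0.0, 1.0
    for t, s in kaplan_meier(times, events):
        if t > tau:
            break
        area += prev_s * (t - prev_t)
        prev_t, prev_s = t, s
    area += prev_s * (tau - prev_t)
    return area

if __name__ == "__main__":
    # Hypothetical exposure times (months) and event indicators for two arms.
    arm_a = ([3, 6, 7, 10, 12, 15, 18, 24], [1, 0, 1, 0, 1, 0, 0, 0])
    arm_b = ([2, 4, 6, 9, 11, 14, 20, 24], [1, 1, 0, 1, 0, 1, 0, 0])
    tau = 24.0
    diff = rmst(*arm_a, tau) - rmst(*arm_b, tau)
    print(f"RMST difference (A - B) up to {tau} months: {diff:.2f} months")
```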

  18. Communicating Volcanic Hazards in the North Pacific

    NASA Astrophysics Data System (ADS)

    Dehn, J.; Webley, P.; Cunningham, K. W.

    2014-12-01

    For over 25 years, effective hazard communication has been key to the mitigation of volcanic hazards in the North Pacific. These hazards are omnipresent, with a large event happening in Alaska every few years to a decade, though in many cases events happen with little or no warning (e.g. Kasatochi and Okmok in 2008). Here a useful hazard mitigation strategy has been built on (1) a large database of historic activity from many datasets, (2) an operational alert system with graduated levels of concern, (3) scenario planning, and (4) routine checks and communication with emergency managers and the public. These baseline efforts are then enhanced in times of crisis with coordinated talking points, targeted studies and public outreach. Scientists naturally tend to target other scientists as their audience, whereas in effective monitoring of hazards that may only occur on yearly to decadal timescales, details can distract from the essential information. Creating talking points and practicing public communication can help make hazard response a part of the culture. Promoting situational awareness and familiarity can relieve indecision and concerns at the time of a crisis.

  19. Economic valuation of flood mitigation services: A case study from the Otter Creek, VT.

    NASA Astrophysics Data System (ADS)

    Galford, G. L.; Ricketts, T.; Bryan, K. L.; ONeil-Dunne, J.; Polasky, S.

    2014-12-01

    The ecosystem services provided by wetlands are widely recognized but difficult to quantify. In particular, estimating the effect of land cover and land use on downstream flood outcomes remains challenging, but is increasingly important in light of climate change predictions of increased precipitation in many areas. Economic valuation can help incorporate ecosystem services into decisions and enable communities to plan for climate and flood resiliency. Here we estimate the economic value of the Otter Creek wetlands for Middlebury, VT, in mitigating the flood that followed Tropical Storm Irene, as well as ten historic floods. Observationally, the hydrographs above and below the wetlands indicated that in each storm the wetlands functioned as a temporary reservoir, slowing the delivery of water to Middlebury. We compare observed floods, based on Middlebury's hydrograph, with simulated floods for scenarios without wetlands. To simulate these "without wetlands" scenarios, we assume the same volume of water was delivered to Middlebury, but in a shorter pulse similar to the hydrograph upstream of the wetlands. For scenarios with and without wetlands, we map the spatial extent of flooding using LiDAR digital elevation data. We then estimate flood depth at each affected building and calculate monetary losses as a function of flood depth and house value using established depth-damage relationships. For example, we expect damages equal to 20% of the house's value for a flood depth of two feet in a two-story home with a basement. We define the value of flood mitigation services as the difference in damages between the with- and without-wetlands scenarios, and find that the Otter Creek wetlands reduced flood damage in Middlebury by 88% following Hurricane Irene. Using the 10 additional historic floods, we estimate an ongoing mean value of $400,000 in avoided damages per year. Economic impacts of this magnitude stress the importance of wetland conservation and warrant the
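
    The depth-damage relationship described above can be sketched as a simple lookup. The curve below interpolates a few points; only the "20% of house value at two feet" point comes from the abstract, and the other values, building depths and prices are illustrative placeholders, not the study's actual depth-damage table.

```python
# Minimal depth-damage sketch: estimate flood losses per building as a fraction
# of building value that grows with flood depth. Only the 2 ft -> 20% point is
# taken from the abstract; the rest of the curve is a placeholder.

DEPTH_DAMAGE = [(0.0, 0.00), (1.0, 0.10), (2.0, 0.20), (4.0, 0.35), (8.0, 0.60)]

def damage_fraction(depth_ft: float) -> float:
    """Piecewise-linear interpolation of the depth-damage curve."""
    if depth_ft <= DEPTH_DAMAGE[0][0]:
        return DEPTH_DAMAGE[0][1]
    for (d0, f0), (d1, f1) in zip(DEPTH_DAMAGE, DEPTH_DAMAGE[1:]):
        if depth_ft <= d1:
            return f0 + (f1 - f0) * (depth_ft - d0) / (d1 - d0)
    return DEPTH_DAMAGE[-1][1]

def scenario_damage(buildings):
    """buildings: list of (flood_depth_ft, building_value_usd)."""
    return sum(damage_fraction(depth) * value for depth, value in buildings)

if __name__ == "__main__":
    with_wetlands = [(0.5, 250_000), (1.0, 300_000)]
    without_wetlands = [(2.0, 250_000), (3.5, 300_000)]
    avoided = scenario_damage(without_wetlands) - scenario_damage(with_wetlands)
    print(f"avoided damages attributed to wetlands: ${avoided:,.0f}")
```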

  20. Ocean thermal conversion (OTEC) project bottom cable protection study: environmental characteristics and hazards analysis

    SciTech Connect

    Chern, C.; Tudor, W.

    1981-10-01

    Seafloor cable-protection criteria and technology as applied to the four proposed OTEC plant sites and cable routes at Hawaii, Puerto Rico, Guam and Florida were examined. Study of environmental characteristics for each site covered: (A) natural factors of location, tide and currents, wind and wave, bottom soil type and seafloor movement; and (B) man-made factors such as ship traffic, fishing activities, ocean mining, government regulations. These characteristics were studied to determine the hazards which are potential sources of damage to a cable system. Hazards include: chafe and corrosion, hydrodynamic forces due to wave and current action, mudslides, earthquakes, trawler and/or dredge action and ship anchors. An analysis of the history of submarine-cable failures was conducted. Included are the probabilities of damage related to water depth. Probabilities become minimal for all hazards in water depths of 1500 feet and more. Chafe and corrosion had the highest probability of causing damage to a seafloor cable compared to the other hazards. Because of the hazards present at all sites, cable burial is recommended as the best means of protection.

  1. A study on the use of planarity for quick identification of potential landslide hazard

    NASA Astrophysics Data System (ADS)

    Baek, M. H.; Kim, T. H.

    2015-05-01

    In this study we focus on identifying a geomorphological feature that controls the location of landslides. The representation of the feature is based on a high-resolution digital elevation model derived from airborne laser altimetry (LiDAR) and evaluated by statistical analysis of axial orientation data. The main principle of this analysis is generating eigenvalues from the axial orientation data and comparing them. The planarity, a ratio of eigenvalues, indicates the degree of irregularity of the ground surface. Results are compared to a recent landslide case in Korea in order to evaluate the feasibility of the proposed methodology for identifying potential landslide hazard. The preliminary landslide hazard assessment based on the planarity analysis discriminates well between stable and unstable domains in the study area, especially in the landslide initiation zones. The results also show that the approach is beneficial for building landslide hazard inventory maps, especially where no historical records of landslides exist. By combining it with other physical procedures such as geotechnical monitoring, landslide hazard assessment using geomorphological features promises a better understanding of landslides and their mechanisms and provides an enhanced methodology to evaluate their hazards and take appropriate actions.
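
    The eigenvalue analysis of axial orientation data described above can be sketched with a small numerical example. The orientation-tensor construction is standard; the particular planarity ratio and the synthetic surface-normal samples below are illustrative assumptions rather than the definition used in the study.

```python
# Sketch: eigenvalue analysis of axial orientation data (unit surface normals,
# e.g. sampled from a LiDAR-derived DEM). The orientation tensor is
# T = (1/N) * sum(n n^T); its eigenvalues l1 >= l2 >= l3 describe how tightly
# the normals cluster. The ratio l1/l2 used here as "planarity" is one common
# choice and an illustrative assumption, not necessarily the study's definition.

import numpy as np

def orientation_eigenvalues(normals: np.ndarray) -> np.ndarray:
    """normals: (N, 3) array of vectors; returns eigenvalues sorted descending."""
    unit = normals / np.linalg.norm(normals, axis=1, keepdims=True)
    tensor = unit.T @ unit / len(unit)
    eigvals = np.linalg.eigvalsh(tensor)          # ascending for symmetric matrices
    return eigvals[::-1]

def planarity(normals: np.ndarray) -> float:
    l1, l2, _ = orientation_eigenvalues(normals)
    return l1 / l2

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Smooth planar slope: normals tightly clustered around one direction.
    smooth = np.array([0.0, 0.3, 1.0]) + 0.02 * rng.standard_normal((500, 3))
    # Irregular (e.g. landslide-disturbed) surface: widely scattered normals.
    rough = rng.standard_normal((500, 3))
    print(f"planarity, smooth slope:  {planarity(smooth):.1f}")
    print(f"planarity, rough surface: {planarity(rough):.1f}")
```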

  2. Seismic hazard analysis application of methodology, results, and sensitivity studies. Volume 4

    SciTech Connect

    Bernreuter, D. L

    1981-08-08

    As part of the Site Specific Spectra Project, this report seeks to identify the sources of, and minimize, uncertainty in estimates of seismic hazard in the Eastern United States. Findings are being used by the Nuclear Regulatory Commission to develop a synthesis among various methods that can be used in evaluating seismic hazard at the various plants in the Eastern United States. In this volume, one of a five-volume series, we discuss the application of the probabilistic approach using expert opinion. The seismic hazard is developed at nine sites in the Central and Northeastern United States, and both individual experts' and synthesis results are obtained. We also discuss and evaluate the ground motion models used to develop the seismic hazard at the various sites, analyzing extensive sensitivity studies to determine the important parameters and the significance of uncertainty in them. Comparisons are made between probabilistic and real spectra for a number of Eastern earthquakes. The uncertainty in the real spectra is examined as a function of the key earthquake source parameters. In our opinion, the single most important conclusion of this study is that the use of expert opinion to supplement the sparse data available on Eastern United States earthquakes is a viable approach for determining estimated seismic hazard in this region of the country. 29 refs., 15 tabs.

  3. Mini-Sosie high-resolution seismic method aids hazards studies

    USGS Publications Warehouse

    Stephenson, W.J.; Odum, J.; Shedlock, K.M.; Pratt, T.L.; Williams, R.A.

    1992-01-01

    The Mini-Sosie high-resolution seismic method has been effective in imaging shallow-structure and stratigraphic features that aid in seismic-hazard and neotectonic studies. The method is not an alternative to Vibroseis acquisition for large-scale studies. However, it has two major advantages over Vibroseis as it is being used by the USGS in its seismic-hazards program. First, the sources are extremely portable and can be used in both rural and urban environments. Second, the shifting-and-summation process during acquisition improves the signal-to-noise ratio and cancels out seismic noise sources such as cars and pedestrians. -from Authors

  4. Pyrotechnic hazards classification and evaluation program test report. Heat flux study of deflagrating pyrotechnic munitions

    NASA Technical Reports Server (NTRS)

    Fassnacht, P. O.

    1971-01-01

    A heat flux study of deflagrating pyrotechnic munitions is presented. Three tests were authorized to investigate whether heat flux measurements may be used as effective hazards evaluation criteria to determine safe quantity distances for pyrotechnics. A passive sensor study was conducted simultaneously to investigate their usefulness in recording events and conditions. It was concluded that heat flux measurements can effectively be used to evaluate hazards criteria and that passive sensors are an inexpensive tool to record certain events in the vicinity of deflagrating pyrotechnic stacks.

  5. Study of the environmental hazard caused by the oil shale industry solid waste.

    PubMed

    Põllumaa, L; Maloveryan, A; Trapido, M; Sillak, H; Kahru, A

    2001-01-01

    The environmental hazard of eight soil and solid waste samples originating from a region of Estonia heavily polluted by the oil shale industry was studied. The samples were contaminated mainly with oil products (up to 7231 mg/kg) and polycyclic aromatic hydrocarbons (PAHs; up to 434 mg/kg). Concentrations of heavy metals and water-extractable phenols were low. The toxicities of the aqueous extracts of solid-phase samples were evaluated by using a battery of Toxkit tests (involving crustaceans, protozoa, rotifers and algae). Waste rock and fresh semi-coke were classified as of "high acute toxic hazard", whereas aged semi-coke and most of the polluted soils were classified as of "acute toxic hazard". Analysis of the soil slurries by using the photobacterial solid-phase flash assay showed the presence of particle-bound toxicity in most samples. In the case of four samples out of the eight, chemical and toxicological evaluations both showed that the levels of PAHs, oil products or both exceeded their respective permitted limit values for the living zone (20 mg PAHs/kg and 500 mg oil products/kg); the toxicity tests showed a toxic hazard. However, in the case of three samples, the chemical and toxicological hazard predictions differed markedly: polluted soil from the Erra River bank contained 2334 mg oil/kg, but did not show any water-extractable toxicity. In contrast, spent rock and aged semi-coke that contained none of the pollutants in hazardous concentrations showed adverse effects in toxicity tests. The environmental hazard of solid waste deposits from the oil shale industry needs further assessment. PMID:11387023

  6. Seismic hazard assessment of the cultural heritage sites: A case study in Cappadocia (Turkey)

    NASA Astrophysics Data System (ADS)

    Seyrek, Evren; Orhan, Ahmet; Dinçer, İsmail

    2014-05-01

    Turkey is one of the most seismically active regions in the world. Major earthquakes with the potential of threatening life and property occur frequently here. In the last decade, over 50,000 residents lost their lives, commonly as a result of building failures in seismic events. The Cappadocia region is one of the most important touristic sites in Turkey. At the same time, the region was included in the World Heritage List by UNESCO in 1985 due to its natural, historical and cultural values. The region is adversely affected by several environmental conditions, which have been addressed in many previous studies, but there are only limited studies on the seismic evaluation of the region. Some of the important historical and cultural heritage sites are: Goreme Open Air Museum, Uchisar Castle, Ortahisar Castle, Derinkuyu Underground City and Ihlara Valley. According to the seismic hazard zonation map published by the Ministry of Reconstruction and Settlement, these heritage sites fall in Zone III, Zone IV and Zone V. This map shows peak ground acceleration for a 10 percent probability of exceedance in 50 years on bedrock. In this connection, the seismic hazard of these heritage sites has to be evaluated. In this study, seismic hazard calculations are performed with both deterministic and probabilistic approaches, taking local site conditions into account. A catalog of historical and instrumental earthquakes was prepared and used in this study. The seismic sources were identified for the seismic hazard assessment based on geological, seismological and geophysical information. Peak Ground Acceleration (PGA) at bedrock level is calculated for the different seismic sources using available attenuation relationships applicable to Turkey. The results of the present study reveal that the seismic hazard at these sites closely matches the seismic zonation map published by the Ministry of Reconstruction and Settlement. Keywords: Seismic Hazard Assessment, Probabilistic Approach

  7. Turning a hazardous waste lagoon into reclaimed land for wildlife management: A case study

    SciTech Connect

    Leong, A.K.

    1996-12-31

    Brownfields are turning back to green. This paper presents a case study of a former dump site for hazardous waste that has been remediated and will be developed into an enhanced wildlife management habitat. This successful remediation case combined various investigations, remedial designs, risk assessments, ecological studies, and engineering practices. 3 refs., 1 fig., 1 tab.

  8. A computational study of explosive hazard potential for reuseable launch vehicles.

    SciTech Connect

    Langston, Leo J.; Freitas, Christopher J.; Langley, Patrick; Palmer, Donald; Saul, W. Venner; Chocron, Sidney; Kipp, Marlin E.

    2004-09-01

    Catastrophic failure of a Reusable Launch Vehicle (RLV) during launch poses a significant engineering problem in the context of crew escape. The explosive hazard potential of the RLV changes during the various phases of the launch. The hazard potential in the on-pad environment is characterized by release and formation of a gas phase mixture in an oxidizer rich environment, while the hazard during the in-flight phase is dominated by the boundary layer and wake flow formed around the vehicle and the interaction with the exhaust gas plume. In order to address crew escape in these explosive environments more effectively, a computational analysis program was undertaken by Lockheed Martin, funded by NASA JSC, with simulations and analyses completed by Southwest Research Institute and Sandia National Laboratories. This paper then presents the details of the methodology used in this analysis, results of the study, and important conclusions that came out of the study.

  9. An Independent Evaluation of the FMEA/CIL Hazard Analysis Alternative Study

    NASA Technical Reports Server (NTRS)

    Ray, Paul S.

    1996-01-01

    The present instruments of safety and reliability risk control for a majority of the National Aeronautics and Space Administration (NASA) programs/projects consist of Failure Mode and Effects Analysis (FMEA), Hazard Analysis (HA), Critical Items List (CIL), and Hazard Report (HR). This extensive analytical approach was introduced in the early 1970s and was implemented for the Space Shuttle Program by NHB 5300.4 (1D-2). Since the Challenger accident in 1986, the process has been expanded considerably and has resulted in the introduction of similar and/or duplicated activities in safety/reliability risk analysis. A study initiated in 1995 to search for an alternative to the current FMEA/CIL Hazard Analysis methodology generated a proposed method on April 30, 1996. The objective of this Summer Faculty Study was to participate in and conduct an independent evaluation of the proposed alternative to simplify the present safety and reliability risk control procedure.

  10. A Food Effect Study of an Oral Thrombin Inhibitor and Prodrug Approach To Mitigate It.

    PubMed

    Lee, Jihye; Kim, Bongchan; Kim, Tae Hun; Lee, Sun Hwa; Park, Hee Dong; Chung, Kyungha; Lee, Sung-Hack; Paek, Seungyup; Kim, Eunice EunKyeong; Yoon, SukKyoon; Kim, Aeri

    2016-04-01

    LB30870, a new direct thrombin inhibitor, showed an 80% reduction in oral bioavailability in the fed state. The present study aims to propose trypsin binding as a mechanism for this negative food effect and to demonstrate a prodrug approach to mitigate it. The effect of food composition on the fed-state oral bioavailability of LB30870 was studied in dogs. Various prodrugs were synthesized, and their solubility, permeability, and trypsin binding affinity were measured. LB30870 and the prodrugs were subjected to cocrystallization with trypsin, and the X-ray structures of the cocrystals were determined. Food effect was studied in dogs for selected prodrugs. Protein or lipid meals appeared to affect the oral bioavailability of LB30870 in dogs more than carbohydrate meals. Blocking both the carboxyl and amidine groups of LB30870 resulted in trypsin Ki values orders of magnitude higher than that of LB30870. The prodrugs belonged to Biopharmaceutical Classification System class I, II, or III. X-ray crystallography revealed that the prodrugs did not bind to trypsin; instead, their hydrolysis product at the amidine blocking group formed a cocrystal with trypsin. A prodrug with significantly less food effect than LB30870 was identified. Binding of prodrugs to food components such as dietary fiber appeared to counteract the positive effect brought by the prodrug approach. Further formulation research is warranted to enhance the oral bioavailability of the prodrugs. In conclusion, this study is the first to demonstrate that the negative food effect of LB30870 can be attributed to trypsin binding. Trypsin binding studies are proposed as a screening tool during lead optimization to minimize food effect. PMID:26886576

  11. Assessment of environmental impact on air quality by cement industry and mitigating measures: a case study.

    PubMed

    Kabir, G; Madugu, A I

    2010-01-01

    In this study, the environmental impact on air quality was evaluated for a typical cement plant in Nigeria. The air pollutants in the atmosphere around the cement plant and neighbouring settlements were determined using appropriate sampling techniques. Atmospheric dust and CO2 were the prevalent pollutants during the sampling period; their concentrations were recorded in the ranges of 249-3,745 mg/m3 and 2,440-2,600 mg/m3, respectively. Besides atmospheric dust and CO2, the air pollutants NOx, SOx and CO were present in trace concentrations, below the safe limits approved by FEPA, namely 0.0062-0.093 mg/m3 NOx, 0.026 mg/m3 SOx and 114.3 mg/m3 CO, respectively. Some cost-effective mitigating measures were recommended, including the utilisation of readily available, low-cost pozzolan material to produce blended cement, so that not only could energy efficiency be improved but carbon dioxide emissions could also be minimised during clinker production, and the installation of advanced high-pressure grinding rolls (clinker roller press process) to raise energy efficiency above what is obtainable from traditional ball mills and to minimise CO2 emissions from the power plant. PMID:19067202

  12. A Case Study in Ethical Decision Making Regarding Remote Mitigation of Botnets

    NASA Astrophysics Data System (ADS)

    Dittrich, David; Leder, Felix; Werner, Tillmann

    It is becoming more common for researchers to find themselves in a position of being able to take over control of a malicious botnet. If this happens, should they use this knowledge to clean up all the infected hosts? How would this affect not only the owners and operators of the zombie computers, but also other researchers, law enforcement agents serving justice, or even the criminals themselves? What dire circumstances would change the calculus about what is or is not appropriate action to take? We review two case studies of long-lived malicious botnets that present serious challenges to researchers and responders and use them to illuminate many ethical issues regarding aggressive mitigation. We make no judgments about the questions raised, instead laying out the pros and cons of possible choices and allowing workshop attendees to consider how and where they would draw lines. By this, we hope to expose where there is clear community consensus as well as where controversy or uncertainty exists.

  13. An optimization model for regional air pollutants mitigation based on the economic structure adjustment and multiple measures: A case study in Urumqi city, China.

    PubMed

    Sun, Xiaowei; Li, Wei; Xie, Yulei; Huang, Guohe; Dong, Changjuan; Yin, Jianguang

    2016-11-01

    A model based on economic structure adjustment and pollutant mitigation was proposed and applied to Urumqi. Best-worst case analysis and scenario analysis were performed with the model to ensure parameter accuracy and to analyze the effect of changes in emission reduction approaches. Results indicated that pollutant mitigation in the electric power industry, the iron and steel industry, and traffic relied mainly on technological transformation measures, engineering transformation measures and structural emission reduction measures, respectively; pollutant mitigation in the cement industry relied mainly on structural emission reduction measures and technological transformation measures; and pollutant mitigation in the thermal industry relied on all four mitigation measures. The results also indicated that structural emission reduction was a preferable measure for pollutant mitigation in Urumqi. The iron and steel industry contributed greatly to SO2, NOx and PM (particulate matter) emission reductions and should be given special attention in pollutant emission reduction. In addition, the scale of the iron and steel industry should be reduced along with decreases in SO2 mitigation amounts, the scales of traffic and the electric power industry should be reduced along with decreases in NOx mitigation amounts, and the scales of the cement industry and the iron and steel industry should be reduced along with decreases in PM mitigation amounts. The study can provide decision-makers with reference pollutant mitigation schemes for regional economic and environmental development under the 12th Five-Year Plan on National Economic and Social Development of Urumqi. PMID:27454097

  14. Recommendations for water supply in arsenic mitigation: a case study from Bangladesh.

    PubMed

    Hoque, B A; Mahmood, A A; Quadiruzzaman, M; Khan, F; Ahmed, S A; Shafique, S A; Rahman, M; Morshed, G; Chowdhury, T; Rahman, M M; Khan, F H; Shahjahan, M; Begum, M; Hoque, M M

    2000-11-01

    Arsenic problems have been observed in several countries around the world. The challenges of arsenic mitigation are more difficult for developing and poor countries due to resource and other limitations. Bangladesh is experiencing the worst arsenic problem in the world, as about 30 million people are possibly drinking arsenic-contaminated water. Lack of knowledge has hampered mitigation initiatives. This paper presents experience gained during action research on water supply for arsenic mitigation in rural Singair, Bangladesh. Mitigation has been implemented there through integrated research and development of appropriate water supply options and their use through community participation. Political leaders and women played key roles in the success of the mitigation. More than one option for safe water has been developed and/or identified. The main recommendations include: integration of tubewell screening and safe water supply; research on technological and social aspects; community, women and local government participation; education and training of all stakeholders; immediate and appropriate use of the available knowledge; links between intermediate/immediate and long-term investment; effective coordination; and immediate attention to this arsenic issue by health, nutrition, agriculture, education and other programs. PMID:11114764

  15. Radon mitigation studies: South Central Florida demonstration. Final report, November 1987-January 1991

    SciTech Connect

    Fowler, C.S.; Williamson, A.D.; Pyle, B.E.; Belzer, F.E.; Coker, R.N.

    1992-10-01

    The report gives results of an EPA radon mitigation project involving 14 slab-on-grade houses in Polk County, FL, having indoor radon levels of 320-3810 Bq/cu m (8.7-103 pCi/L), using sub-slab depressurization (SSD) in a variety of applications to evaluate optimal design criteria to be recommended as cost-effective and capable of reducing indoor radon concentrations in houses built over compacted soil fills. For all houses, obvious accessible radon entry points were sealed, and 53-90 L (12-20 gal.) suction pits were dug into the fill material. Two of the houses were mitigated with exterior horizontal suction holes drilled through the stem walls. In four houses, one or more suction pipes were in the garage. The remainder of the interior suction holes were in closets or some other unobtrusive location. Except for the two houses with exterior systems, the other 12 had mitigation fans in the attic. In-line centrifugal fans were used to mitigate each house, although a larger radial blower was installed overnight for experimental purposes in one house, and a vacuum cleaner was used to simulate a larger suction in another house for pressure field measurements only. Post-mitigation worst-case radon concentrations in these houses ranged from 40 to 290 Bq/cu m.

  16. Coastal dynamics studies for evaluation of hazard and vulnerability for coastal erosion. case study the town La Bocana, Buenaventura, colombian pacific

    NASA Astrophysics Data System (ADS)

    Coca-Domínguez, Oswaldo; Ricaurte-Villota, Constanza

    2015-04-01

    The analysis of hazard and vulnerability caused by erosion in coastal areas is based on studies of coastal dynamics, since these provide the level of detail needed for decision-making on prevention, mitigation, disaster reduction and integrated risk management. The town of La Bocana, located in Buenaventura (Colombian Pacific), was selected for a coastal erosion hazard assessment based on three components: i) magnitude, ii) occurrence and iii) susceptibility. Vulnerability, meanwhile, is also evaluated through three main components: i) exposure, ii) fragility and iii) resilience, which in turn are assessed across six dimensions of vulnerability: physical, social, economic, ecological, institutional and cultural. The hazard analysis used a semi-quantitative approach and an index of variables such as type of geomorphological unit, type of beach, wave exposure of the coast and occurrence, among others. Quantitative data on coastal retreat were obtained with DSAS (Digital Shoreline Analysis System), an ArcGIS extension, together with digital elevation models of the beach and six beach profiles strategically located along the coast and surveyed with GNSS technology. Sediment samples collected from these beaches, mean wave height and wave direction were used as complementary data. The information was integrated along the coastline in segments of 250 x 250 meters. Four sectors make up the coastal area of La Bocana: Pianguita, Vistahermosa, Downtown and Shangay. The six vulnerability dimensions were evaluated for these populations, together with population density for exposure, which was analyzed through a multi-array method that includes variables such as land use, population, type of structure, education and basic services, among others, to measure fragility, along with their respective resilience indicators. The hazard analysis results indicate that Vistahermosa is under very high threat, while
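
    As a minimal illustration of the kind of shoreline-change statistic DSAS reports (computed here outside ArcGIS, and not with the study's own data), the sketch below derives an end-point rate of shoreline retreat along one transect; the survey dates and baseline distances are hypothetical.

    ```python
    from datetime import date

    # Hypothetical shoreline positions along one transect: distance (m) from an
    # onshore baseline to the shoreline at each survey date.
    surveys = [
        (date(1986, 3, 1), 142.0),
        (date(2005, 7, 1), 128.5),
        (date(2014, 9, 1), 121.0),
    ]

    def end_point_rate(surveys):
        """End-point rate: net shoreline movement divided by elapsed time (m/yr).
        Negative values indicate retreat (erosion) toward the baseline."""
        (t0, d0), (t1, d1) = surveys[0], surveys[-1]
        years = (t1 - t0).days / 365.25
        return (d1 - d0) / years

    print(f"End-point rate = {end_point_rate(surveys):.2f} m/yr")
    ```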

  17. Collaborative studies target volcanic hazards in Central America

    NASA Astrophysics Data System (ADS)

    Bluth, Gregg J. S.; Rose, William I.

    Central America is the second-most consistently active volcanic zone on Earth, after Indonesia. Centuries of volcanic activity have produced a spectacular landscape of collapsed calderas, debris flows, and thick blankets of pyroclastic materials. Volcanic activity dominates the history, culture, and daily life of Central American countries. January 2002 marked the third consecutive year in which a diverse group of volcanologists and geophysicists conducted focused field studies in Central America. This type of multi-institutional collaboration reflects the growing involvement of a number of U.S. and non-U.S. universities, and of other organizations, in Guatemala and El Salvador (Table 1).

  18. Voltage Sag Mitigation Strategies for an Indian Power Systems: A Case Study

    NASA Astrophysics Data System (ADS)

    Goswami, A. K.; Gupta, C. P.; Singh, G. K.

    2014-08-01

    In the modern deregulated environment, both utilities and customers are concerned with power quality improvement, but with different objectives and interests. The utility reconfigures its power network and installs mitigation devices, if needed, to improve power quality. The paper presents a strategy for selecting cost-effective solutions to mitigate voltage sags, the most frequent power quality disturbance. In this paper, mitigation devices are placed in the optimal network topology at suitable locations to maximize their effectiveness in further improving power quality. Optimal placement is considered from the utility's perspective for overall benefit. Finally, performance is evaluated on the basis of the reduction in the total number of voltage sags, the reduction in the total number of process trips, and the reduction in the total financial losses due to voltage sags.

  19. Landslide hazard mapping with selected dominant factors: A study case of Penang Island, Malaysia

    NASA Astrophysics Data System (ADS)

    Tay, Lea Tien; Alkhasawneh, Mutasem Sh.; Ngah, Umi Kalthum; Lateh, Habibah

    2015-05-01

    Landslides are among the most destructive natural geohazards in Malaysia. In addition to rainfall as a triggering factor for landslides in Malaysia, topographical and geological factors play an important role in landslide susceptibility analysis. Conventional topographic factors such as elevation, slope angle, slope aspect, plan curvature and profile curvature have been considered as landslide causative factors in many research works. However, other topographic factors such as diagonal length, surface area, surface roughness and rugosity have not been considered, especially in landslide hazard analysis in Malaysia. This paper presents landslide hazard mapping using the Frequency Ratio (FR) approach, and the study area is Penang Island, Malaysia. The frequency ratio approach is a probabilistic method based on the observed relationships between the distribution of landslides and each landslide-causative factor. The landslide hazard map of Penang Island is produced by considering twenty-two (22) landslide causative factors. Among these twenty-two (22) factors, fourteen (14) are topographic factors. They are elevation, slope gradient, slope aspect, plan curvature, profile curvature, general curvature, tangential curvature, longitudinal curvature, cross section curvature, total curvature, diagonal length, surface area, surface roughness and rugosity. These topographic factors are extracted from the digital elevation model of Penang Island. The other eight (8) non-topographic factors considered are land cover, vegetation cover, distance from road, distance from stream, distance from fault line, geology, soil texture and rainfall precipitation. After considering all twenty-two factors for landslide hazard mapping, the analysis is repeated with fourteen dominant factors selected from the twenty-two. The landslide hazard map was segregated into four categories of risk, i.e. Highly hazardous area, Hazardous area, Moderately hazardous area
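
    As a minimal sketch of the frequency ratio statistic described above (not the study's data), the snippet below computes FR values for the classes of a single causative factor; the factor, class boundaries and cell counts are hypothetical.

    ```python
    # Frequency Ratio (FR) for each class of a causative factor:
    # FR = (landslide cells in class / all landslide cells)
    #    / (cells in class / all cells in the study area).
    # FR > 1 indicates a class more prone to landslides than average.

    # Hypothetical counts per slope-angle class: (landslide cells, total cells).
    slope_classes = {
        "0-15 deg":  (12, 40000),
        "15-25 deg": (55, 30000),
        "25-35 deg": (88, 20000),
        ">35 deg":   (45, 10000),
    }

    total_slides = sum(s for s, _ in slope_classes.values())
    total_cells = sum(c for _, c in slope_classes.values())

    fr = {
        name: (s / total_slides) / (c / total_cells)
        for name, (s, c) in slope_classes.items()
    }

    # A cell's hazard index is the sum of the FR values of all factor classes
    # it falls into; here we just report the per-class FR for one factor.
    for name, value in fr.items():
        print(f"{name:>10s}: FR = {value:.2f}")
    ```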

  20. Landslide hazard mapping with selected dominant factors: A study case of Penang Island, Malaysia

    SciTech Connect

    Tay, Lea Tien; Alkhasawneh, Mutasem Sh.; Ngah, Umi Kalthum; Lateh, Habibah

    2015-05-15

    Landslides are among the most destructive natural geohazards in Malaysia. In addition to rainfall as a triggering factor for landslides in Malaysia, topographical and geological factors play an important role in landslide susceptibility analysis. Conventional topographic factors such as elevation, slope angle, slope aspect, plan curvature and profile curvature have been considered as landslide causative factors in many research works. However, other topographic factors such as diagonal length, surface area, surface roughness and rugosity have not been considered, especially in landslide hazard analysis in Malaysia. This paper presents landslide hazard mapping using the Frequency Ratio (FR) approach, and the study area is Penang Island, Malaysia. The frequency ratio approach is a probabilistic method based on the observed relationships between the distribution of landslides and each landslide-causative factor. The landslide hazard map of Penang Island is produced by considering twenty-two (22) landslide causative factors. Among these twenty-two (22) factors, fourteen (14) are topographic factors. They are elevation, slope gradient, slope aspect, plan curvature, profile curvature, general curvature, tangential curvature, longitudinal curvature, cross section curvature, total curvature, diagonal length, surface area, surface roughness and rugosity. These topographic factors are extracted from the digital elevation model of Penang Island. The other eight (8) non-topographic factors considered are land cover, vegetation cover, distance from road, distance from stream, distance from fault line, geology, soil texture and rainfall precipitation. After considering all twenty-two factors for landslide hazard mapping, the analysis is repeated with fourteen dominant factors selected from the twenty-two. The landslide hazard map was segregated into four categories of risk, i.e. Highly hazardous area, Hazardous area, Moderately hazardous area

  1. A Study of Airline Passenger Susceptibility to Atmospheric Turbulence Hazard

    NASA Technical Reports Server (NTRS)

    Stewart, Eric C.

    2000-01-01

    A simple, generic simulation math model of a commercial airliner has been developed to study the susceptibility of unrestrained passengers to large, discrete gust encounters. The math model simulates the longitudinal response to vertical gusts and includes (1) motion of an unrestrained passenger in the rear cabin, (2) fuselage flexibility, (3) the lag in the downwash from the wing to the tail, and (4) unsteady lift effects. Airplane and passenger response contours are calculated for a matrix of gust amplitudes and gust lengths of a simulated mountain rotor. A comparison of the model-predicted responses with data from three accidents indicates that the accelerations in actual accidents are sometimes much larger than those in the simulated gust encounters.

  2. Flood hazards studies in the Mississippi River basin using remote sensing

    NASA Technical Reports Server (NTRS)

    Rango, A.; Anderson, A. T.

    1974-01-01

    The Spring 1973 Mississippi River flood was investigated using remotely sensed data from ERTS-1. Both manual and automatic analyses of the data indicated that ERTS-1 is extremely useful as a regional tool for flood management. Quantitative estimates of the area flooded were made in St. Charles County, Missouri, and in Arkansas. Flood hazard mapping was conducted in three study areas along the Mississippi River using pre-flood ERTS-1 imagery enlarged to 1:250,000 and 1:100,000 scale. Initial results indicate that ERTS-1 digital mapping of flood-prone areas can be performed at 1:62,500, which is comparable to some conventional flood hazard map scales.

  3. Orbital debris hazard insights from spacecraft anomalies studies

    NASA Astrophysics Data System (ADS)

    McKnight, Darren S.

    2016-09-01

    Since the dawn of the space age, space operators have tallied spacecraft anomalies and failures and used these insights to improve space systems and operations. As space systems improved and their lifetimes increased, the anomaly and failure modes have multiplied. Primary triggers for space anomalies and failures include design issues, space environmental effects, and satellite operations. Attempts to correlate anomalies with the orbital debris environment started as early as the mid-1990s. Early attempts showed that tens of anomalies correlated well with altitudes where the cataloged debris population was the highest. However, due to the complexity of tracing debris impacts to mission anomalies, these analyses were found to be insufficient to prove causation. After the fragmentation of the Chinese Feng-Yun satellite in 2007, it was hypothesized that the nontrackable fragments causing anomalies in LEO would have increased significantly from this event. As a result, debris-induced anomalies should have gone up measurably in the vicinity of this breakup. Again, the analysis provided some subtle evidence of debris-induced anomalies, but it was not convincing. The continued difficulty in linking debris flux to satellite anomalies and failures prompted the creation of a series of spacecraft anomaly and failure workshops to investigate the identified shortfalls. These gatherings have produced insights into why this process is not straightforward. Summaries of these studies and workshops are presented, and observations are made about how to create solutions for anomaly attribution, especially as it relates to debris-induced spacecraft anomalies and failures.

  4. Respiratory hazards in hard metal workers: a cross sectional study.

    PubMed Central

    Meyer-Bisch, C; Pham, Q T; Mur, J M; Massin, N; Moulin, J J; Teculescu, D; Carton, B; Pierre, F; Baruthio, F

    1989-01-01

    A cross sectional study was conducted on 513 employees at three hard metal plants: 425 exposed workers (351 men, 74 women) and 88 controls (69 men, 19 women). Cough and sputum were more frequent in workers engaged in "soft powder" and presintering workshops compared with controls (12.5% and 16.5% v 3.5%). Spirometric abnormalities were more frequent among women in sintering and finishing workshops compared with control women (56.8% v 23.8%), and abnormalities of the carbon monoxide test were more frequent in exposed groups than in controls; this difference was more pronounced in women (31.4% v 5.6%) than in men (18.5% v 13%). No significant correlation was observed between duration of exposure and age adjusted lung function tests. Slight abnormalities of chest radiographs (0/1, 1/1 according to the ILO classification) were more frequent in exposed men than controls (12.8% v 1.9%), mostly in soft powder workers. In subjects with abnormal chest radiographs, FVC, FEV1 and carbon monoxide indices (fractional uptake of CO or CO transfer index or both) were lower compared with those with normal chest radiographs. Although relatively mild, the clinical, radiological, and functional abnormalities uncovered call for regular supervision of workers exposed to hard metal dust. PMID:2787666

  5. Mask roughness induced LER control and mitigation: aberrations sensitivity study and alternate illumination scheme

    NASA Astrophysics Data System (ADS)

    McClinton, Brittany M.; Naulleau, Patrick P.

    2011-04-01

    Here we conduct a mask-roughness-induced line-edge-roughness (LER) aberrations sensitivity study, both as a random distribution amongst the first 16 Fringe Zernikes (for overall aberration levels of 0.25, 0.50, and 0.75 nm rms) and as an individual aberrations sensitivity matrix over the first 37 Fringe Zernikes. Full 2D aerial image modeling for an imaging system with NA = 0.32 was done for both the 22-nm and 16-nm half-pitch nodes on a rough mask with a replicated surface roughness (RSR) of 100 pm and a correlation length of 32 nm at the nominal extreme-ultraviolet lithography (EUVL) wavelength of 13.5 nm. As the ideal RSR value for commercialization of EUVL is 50 pm and under, and furthermore, as has been shown elsewhere, a correlation length of 32 nm of roughness on the mask sits on the peak LER value for an NA = 0.32 imaging optic, these mask roughness values, and consequently the aberration sensitivity study presented here, represent a worst-case scenario. The illumination conditions were chosen based on the possible candidates for the 22-nm and 16-nm half-pitch nodes, respectively. In the 22-nm case, a disk illumination setting of σ = 0.50 was used, and for the 16-nm case, crosspole illumination with σ = 0.10 at an optimum offset of dx = 0 and dy = 0.67 in sigma space. In examining how to mitigate mask-roughness-induced LER, we considered an alternate illumination scheme whereby a traditional dipole's angular spectrum is extended in the direction parallel to the line-and-space mask absorber pattern to represent a "strip". While this illumination surprisingly provides minimal improvement to the LER as compared to several alternate illumination schemes, the overall imaging quality in terms of image-log-slope (ILS) and contrast is improved.

  6. Mask roughness induced LER control and mitigation: aberrations sensitivity study and alternate illumination scheme

    SciTech Connect

    McClinton, Brittany M.; Naulleau, Patrick P.

    2011-03-11

    Here we conduct a mask-roughness-induced line-edge-roughness (LER) aberrations sensitivity study, both as a random distribution amongst the first 16 Fringe Zernikes (for overall aberration levels of 0.25, 0.50, and 0.75 nm rms) and as an individual aberrations sensitivity matrix over the first 37 Fringe Zernikes. Full 2D aerial image modeling for an imaging system with NA = 0.32 was done for both the 22-nm and 16-nm half-pitch nodes on a rough mask with a replicated surface roughness (RSR) of 100 pm and a correlation length of 32 nm at the nominal extreme-ultraviolet lithography (EUVL) wavelength of 13.5 nm. As the ideal RSR value for commercialization of EUVL is 50 pm and under, and furthermore, as has been shown elsewhere, a correlation length of 32 nm of roughness on the mask sits on the peak LER value for an NA = 0.32 imaging optic, these mask roughness values, and consequently the aberration sensitivity study presented here, represent a worst-case scenario. The illumination conditions were chosen based on the possible candidates for the 22-nm and 16-nm half-pitch nodes, respectively. In the 22-nm case, a disk illumination setting of σ = 0.50 was used, and for the 16-nm case, crosspole illumination with σ = 0.10 at an optimum offset of dx = 0 and dy = 0.67 in sigma space. In examining how to mitigate mask-roughness-induced LER, we considered an alternate illumination scheme whereby a traditional dipole's angular spectrum is extended in the direction parallel to the line-and-space mask absorber pattern to represent a 'strip'. While this illumination surprisingly provides minimal improvement to the LER as compared to several alternate illumination schemes, the overall imaging quality in terms of image-log-slope (ILS) and contrast is improved.

  7. THE SOCIAL IMPLICATIONS OF FLAME RETARDANT CHEMICALS: A CASE STUDY IN RISK AND HAZARD PERCEPTION

    EPA Science Inventory

    This study is expected to fill an important gap in the literature by focusing on how individuals characterize exposure in terms of risk and hazard, and how this understanding can lead to concrete changes in their personal and professional lives. I expect that people differ gre...

  8. A PRELIMINARY FEASIBILITY STUDY FOR AN OFFSHORE HAZARDOUS WASTE INCINERATION FACILITY

    EPA Science Inventory

    The report summarizes a feasibility study of using an existing offshore oil platform, being offered to the Government, as a site for incineration of hazardous wastes and related research. The platform, located in the Gulf of Mexico about 100 km south of Mobile, AL, has potential ...

  9. When does highway construction to mitigate congestion reduce carbon emissions? A Case Study: The Caldecott Tunnel

    NASA Astrophysics Data System (ADS)

    Thurlow, M. E.; Maness, H.; Wiersema, D. J.; Mcdonald, B. C.; Harley, R.; Fung, I. Y.

    2014-12-01

    The construction of the fourth bore of the Caldecott Tunnel, which connects Oakland and Moraga, CA on State Route 24, was the second largest roadway construction project in California last year, with a total cost of $417 million. The objective of the fourth bore was to reduce traffic congestion before the tunnel entrance in the off-peak direction of travel, but the project was a source of conflict between policy makers and environmental and community groups concerned about the air quality and traffic impacts. We analyze the impact of the opening of the fourth bore on CO2 emissions associated with traffic. We made surface observations of CO2 from a mobile platform along State Route 24 for several weeks in November 2013, incorporating the period prior to and after the opening of the fourth bore on November 16, 2013. We directly compare bottom-up and top-down approaches to estimate the change in traffic emissions associated with the fourth bore opening. A bottom-up emissions inventory was derived from the high-resolution Performance Measurement System (PeMS) dataset and the Multi-scale Motor Vehicle and Equipment Emissions System (MOVES). The emissions inventory was used to drive a box model as well as a high-resolution regional transport model (the Weather Research and Forecasting model). The box model was also used to derive emissions from observations in a basic inversion. We also present an analysis of long-term traffic patterns and consider the potential for compensating changes in behavior that offset the observed emissions reductions on longer timescales. Finally, we examine how the results from the Caldecott study demonstrate the general benefit of using mobile measurements for quantifying environmental impacts of congestion mitigation projects.
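
    A minimal sketch of a ventilated-box emission estimate of the kind a basic inversion of roadside observations could use (not the study's actual model or data); the CO2 enhancement, wind speed and mixing height below are hypothetical.

    ```python
    # Minimal ventilated-box inversion: infer a roadway CO2 emission rate from the
    # observed enhancement over background. All numbers are hypothetical.

    co2_obs_ppm = 407.0       # mean CO2 measured downwind of the roadway
    co2_bg_ppm = 405.0        # background CO2 measured upwind
    wind_speed = 2.5          # m/s, component perpendicular to the road
    mix_height = 50.0         # m, assumed mixing depth of the traffic plume

    # Convert the ppm enhancement to a mass concentration (kg CO2 per m^3 of air),
    # using a nominal air molar density of ~41.6 mol/m^3 and 0.044 kg/mol for CO2.
    air_molar_density = 41.6
    enhancement_kg_m3 = (co2_obs_ppm - co2_bg_ppm) * 1e-6 * air_molar_density * 0.044

    # Steady-state box balance: emission per unit road length (kg CO2 / m / s)
    # equals the enhancement times the ventilation flux (wind speed x mixing height).
    emission_per_m = enhancement_kg_m3 * wind_speed * mix_height
    print(f"~{emission_per_m * 1000:.2f} g CO2 per metre of road per second")
    ```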

  10. A Study of Aircraft Fire Hazards Related to Natural Electrical Phenomena

    NASA Technical Reports Server (NTRS)

    Kester, Frank L.; Gerstein, Melvin; Plumer, J. A.

    1960-01-01

    The problems of natural electrical phenomena as a fire hazard to aircraft are evaluated. Assessment of the hazard is made over the range of low level electrical discharges, such as static sparks, to high level discharges, such as lightning strikes to aircraft. In addition, some fundamental work is presented on the problem of flame propagation in aircraft fuel vent systems. This study consists of a laboratory investigation in five parts: (1) a study of the ignition energies and flame propagation rates of kerosene-air and JP-6-air foams, (2) a study of the rate of flame propagation of n-heptane, n-octane, n-nonane, and n-decane in aircraft vent ducts, (3) a study of the damage to aluminum, titanium, and stainless steel aircraft skin materials by lightning strikes, (4) a study of fuel ignition by lightning strikes to aircraft skins, and (5) a study of lightning induced flame propagation in an aircraft vent system.

  11. Urban Vulnerability Assessment to Seismic Hazard through Spatial Multi-Criteria Analysis. Case Study: the Bucharest Municipality/Romania

    NASA Astrophysics Data System (ADS)

    Armas, Iuliana; Dumitrascu, Silvia; Bostenaru, Maria

    2010-05-01

    conditions). In effect, the example of Bucharest demonstrates how the results shape the ‘vulnerability to seismic hazard’ profile of the city, on the basis of which decision makers could develop proper mitigation strategies. To sum up, the use of an analytical framework such as the standard Spatial Multi-Criteria Analysis (SMCA) - despite all the difficulties in creating justifiable weights (Yeh et al., 1999) - results in accurate estimations of the state of the urban system. Although this method has often been mistrusted by decision makers (Janssen, 2001), we consider that the results can represent, precisely because of their level of generalization, a decision support framework for policy makers to critically reflect on possible risk mitigation plans. Further study will improve the analysis by integrating a series of daytime and nighttime scenarios and a better definition of the constructed space variables.

  12. Natural phenomena hazards site characterization criteria

    SciTech Connect

    Not Available

    1994-03-01

    The criteria and recommendations in this standard shall apply to site characterization for the purpose of mitigating Natural Phenomena Hazards (wind, floods, landslide, earthquake, volcano, etc.) in all DOE facilities covered by DOE Order 5480.28. Criteria for site characterization not related to NPH are not included unless necessary for clarification. General and detailed site characterization requirements are provided in areas of meteorology, hydrology, geology, seismology, and geotechnical studies.

  13. Natural hazard understanding in the middle schools of the Colorado Front Range

    SciTech Connect

    Grogger, P.K.

    1995-12-01

    The best form of mitigation is not to put oneself in a position where mitigation is required. For the last five years the University of Colorado's Department of Geology has teamed with local school districts to build an understanding of natural hazards. By working with middle school students, the dangers of natural hazards in North America and their possible mitigation are learned at an early age. Over the years, the knowledge gained by these communities' citizens will hopefully help lessen the dangers from natural hazards that society faces. Education of the general public about natural hazards needs to be addressed by the professional societies studying and developing answers to natural hazards problems. By working with school children, this process of educating the general public starts early in the education system and will bear fruit many years in the future. This paper describes the course that is being given to students in Colorado.

  14. Use of geotextiles for mitigation of the effects of man-made hazards such as greening of waste deposits in frame of the conversion of industrial areas

    NASA Astrophysics Data System (ADS)

    Bostenaru, Magdalena; Siminea, Ioana; Bostenaru, Maria

    2010-05-01

    The city of Karlsruhe lies in the Rhine valley; however, it is situated at a certain distance from the Rhine river, and the river front is not integrated into the urban development. Nevertheless, the Rhine port developed into the second largest inland port in Germany. With the process of deindustrialisation, industrial use is now shrinking. With the simultaneous process of the ecological recovery of rivers, the conversion of the industrial area into green and residential areas is called for. In the 1990s a project was developed by the third author of this contribution together with Andrea Ciobanu, then students at the University of Karlsruhe, for the conversion of the Rhine port area of Karlsruhe to such combined nature and residential use. The area also included a waste deposit, which was proposed to be transformed into a "green hill". Such integration of a waste deposit into a park during the conversion of an industrial area is not unique in Germany; several such projects were proposed and some of them realised at the IBA Emscher Park in the Ruhr area. Some of them were coupled with artistic projects. The technical details are also a subject of this contribution. Studies were made by the first two authors on the conditions under which plants grow on former waste deposits when supported by intermediate layers of a geotextile. The characteristics of the geotextiles, together with the technological process for producing them and the results of laboratory and field experiments for use on waste deposits under comparable conditions in Romania, will be shown. The geotextile is also usable for ash deposits such as those in the Ruhr area.

  15. Geometrical Scaling of the Magnitude Frequency Statistics of Fluid Injection Induced Earthquakes and Implications for Assessment and Mitigation of Seismic Hazard

    NASA Astrophysics Data System (ADS)

    Dinske, C.; Shapiro, S. A.

    2015-12-01

    To study the influence of the size and geometry of hydraulically perturbed rock volumes on the magnitude statistics of induced events, we compare b value and seismogenic index estimates derived from different algorithms. First, we use standard Gutenberg-Richter approaches such as the least squares fit and the maximum likelihood technique. Second, we apply the lower bound probability fit (Shapiro et al., 2013, JGR, doi:10.1002/jgrb.50264), which takes the finiteness of the perturbed volume into account. The different estimates systematically deviate from each other, and the deviations are larger for smaller perturbed rock volumes. This means that the frequency-magnitude distribution is most affected for small injection volumes and short injection times, resulting in a high apparent b value. In contrast, the specific magnitude value, the quotient of the seismogenic index and the b value (Shapiro et al., 2013, JGR, doi:10.1002/jgrb.50264), appears to be a unique seismotectonic parameter of a reservoir location. Our results confirm that it is independent of the size of the perturbed rock volume. The specific magnitude is hence an indicator of the magnitudes that one can expect for a given injection. Several performance tests to forecast the magnitude frequencies of induced events show that the seismogenic index model provides reliable predictions, which confirms its applicability as a forecasting tool, particularly if applied in real-time monitoring. The specific magnitude model can be used to predict an asymptotic upper limit of probable frequency-magnitude distributions of induced events. We also conclude from our analysis that the physical process of pore pressure diffusion as the event trigger, together with the scaling of the frequency-magnitude distribution by the size of the perturbed rock volume, is consistent with the reported relation between the upper bound of maximum seismic moment and injected fluid volume (McGarr, 2014, JGR, doi:10.1002/2013JB010597), particularly if nonlinear effects in the diffusion process
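
    For context on the estimators being compared, the sketch below computes a maximum-likelihood (Aki/Utsu) b value and a least-squares alternative from a small catalog; the magnitudes, completeness magnitude and binning are hypothetical and unrelated to the study's data.

    ```python
    import numpy as np

    # Hypothetical magnitudes of induced events above the completeness magnitude.
    mags = np.array([0.8, 1.1, 0.9, 1.4, 2.1, 1.0, 1.3, 0.9, 1.7, 1.2,
                     0.8, 1.0, 2.4, 1.1, 0.9, 1.5, 1.0, 1.2, 0.8, 1.9])
    m_c = 0.8          # completeness magnitude (assumed)
    dm = 0.1           # magnitude binning of the catalog

    # Aki/Utsu maximum-likelihood b value, with the usual half-bin correction.
    b_ml = np.log10(np.e) / (mags.mean() - (m_c - dm / 2.0))

    # Gutenberg-Richter a value from the total count: log10 N(>= m_c) = a - b * m_c.
    a_ml = np.log10(len(mags)) + b_ml * m_c

    # Least-squares alternative: fit log10 of cumulative counts against magnitude.
    m_bins = np.arange(m_c, mags.max() + dm, dm)
    cum_counts = np.array([(mags >= m - 1e-9).sum() for m in m_bins])
    slope, intercept = np.polyfit(m_bins, np.log10(cum_counts), 1)

    print(f"maximum-likelihood b = {b_ml:.2f}, a = {a_ml:.2f}")
    print(f"least-squares      b = {-slope:.2f}, a = {intercept:.2f}")
    ```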

  16. A feasibility study on the influence of the geomorphological feature in identifying the potential landslide hazard

    NASA Astrophysics Data System (ADS)

    Baek, M. H.; Kim, T. H.

    2014-11-01

    In this study we focused on identifying geomorphological features that control the location of landslides. The representation of these features is based on a high resolution DEM (Digital Elevation Model) derived from airborne laser altimetry (LiDAR) and evaluated by statistical analysis of axial orientation data. The main principle of this analysis is generating eigenvalues from the axial orientation data and comparing them. Planarity, a ratio of these eigenvalues, indicates the degree of roughness of the ground surface. Results are compared with a recent landslide case in Korea in order to evaluate the feasibility of the proposed methodology for identifying potential landslide hazard. The preliminary landslide assessment based on the planarity analysis discriminates well between stable and unstable domains in the study area, especially in the landslide initiation zones. Results also show that it is beneficial for building a preliminary landslide hazard assessment, and especially for inventory mapping where no historical records of landslides exist. Combined with other physical procedures such as geotechnical monitoring, landslide hazard assessment using geomorphological features promises a better understanding of landslides and their mechanisms, and provides an enhanced methodology to evaluate their hazards and take appropriate actions.
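
    A minimal sketch of an eigenvalue-based planarity measure in the spirit described above (the exact definition used in the study may differ): unit surface normals from a DEM window are assembled into an orientation tensor, and the share of its largest eigenvalue serves as the planarity. The DEM windows below are synthetic.

    ```python
    import numpy as np

    def unit_normals(z, cell=1.0):
        """Unit surface-normal vectors of a DEM window (z: 2D array of elevations)."""
        dzdy, dzdx = np.gradient(z, cell)
        n = np.stack([-dzdx, -dzdy, np.ones_like(z)], axis=-1)
        return (n / np.linalg.norm(n, axis=-1, keepdims=True)).reshape(-1, 3)

    def planarity(z, cell=1.0):
        """Eigenvalue-based planarity of a DEM window (illustrative definition).

        Builds the orientation tensor of the unit normals and returns the share of
        its largest eigenvalue; values near 1 indicate a smooth (planar) surface,
        lower values indicate rougher ground.
        """
        n = unit_normals(z, cell)
        tensor = n.T @ n / len(n)                   # 3x3 orientation tensor
        eigvals = np.sort(np.linalg.eigvalsh(tensor))[::-1]
        return eigvals[0] / eigvals.sum()

    # Synthetic 2 m DEM windows: a planar slope vs. the same slope plus noise.
    y, x = np.mgrid[0:20, 0:20].astype(float)
    smooth = 0.4 * x + 0.1 * y
    rough = smooth + np.random.default_rng(0).normal(0, 0.8, smooth.shape)
    print(f"smooth window planarity: {planarity(smooth, 2.0):.3f}")
    print(f"rough window planarity:  {planarity(rough, 2.0):.3f}")
    ```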

  17. Natural hazard risk perception of Italian population: case studies along national territory.

    NASA Astrophysics Data System (ADS)

    Gravina, Teresita; Tupputi Schinosa, Francesca De Luca; Zuddas, Isabella; Preto, Mattia; Marengo, Angelo; Esposito, Alessandro; Figliozzi, Emanuele; Rapinatore, Matteo

    2015-04-01

    Risk perception is the judgment that people make about the characteristics and severity of risks. In the last few years, risk perception studies have focused on providing cognitive elements to the communication experts responsible for designing appropriate information and awareness strategies for citizens. In order to determine the perception of natural hazard risks (seismic, landslide, cyclone, flood, volcanic), several authors have used questionnaires as a tool for providing reliable quantitative data and permitting comparison of the results with those of similar surveys. In Italy, survey-based risk perception studies have also been carried out to investigate risks of national importance, in particular the Somma-Vesuvio and Phlegrean Fields volcanic risks, but risk perception studies of local situations distributed across the whole national territory have been lacking. Natural hazards of national importance are frequently reported by the national mass media, and there is debate about emergency civil protection plans, whereas it can be difficult to obtain information on localized and regional natural hazards diffused across the national territory. In fact, the Italian peninsula is a geologically young area subject to endogenous phenomena (volcanoes, earthquakes) and exogenous phenomena that determine land evolution and create natural hazards (landslide, coastal erosion, hydrogeological instability, sinkhole) for the population. For this reason, we decided to investigate natural risk perception in different Italian places where natural hazards have taken place but were not reported by the mass media, being only of local relevance or historical events. We carried out surveys in different Italian places affected by different types of natural hazards (landslide, coastal erosion, hydrogeological instability, sinkhole, volcanic phenomena and earthquake) and compared the results in order to understand the population's level of perception, awareness and preparation through civil protection exercises. Our findings support that risks

  18. Identifying hazard parameter to develop quantitative and dynamic hazard map of an active volcano in Indonesia

    NASA Astrophysics Data System (ADS)

    Suminar, Wulan; Saepuloh, Asep; Meilano, Irwan

    2016-05-01

    Hazard assessment for active volcanoes is crucial for risk management. A volcano hazard map provides information to decision makers and communities before, during, and after a volcanic crisis. Rapid and accurate hazard assessment, especially for an active volcano, needs to be developed for better mitigation at the time of volcanic crises in Indonesia. In this paper, we identified the hazard parameters needed to develop a quantitative and dynamic hazard map of an active volcano. Guntur volcano in Garut Region, West Java, Indonesia, was selected as the study area because the population resides adjacent to the active volcano. Infrastructure development, especially related to tourism on the eastern flank of the summit, is growing rapidly. Remote sensing and field investigation approaches were used to obtain the hazard parameters spatially. We developed a quantitative and dynamic algorithm to map the spatial hazard potential of the volcano based on the index overlay technique. Five volcano hazard parameters were identified based on Landsat 8 and ASTER imagery: volcanic products including pyroclastic fallout, pyroclastic flows, lava and lahar; slope topography; surface brightness temperature; and vegetation density. Following the proposed technique, the hazard parameters were extracted, indexed, and calculated to produce spatial hazard values at and around Guntur Volcano. Based on this method, the hazard potential of areas with low vegetation density is higher than that of areas with high vegetation density. Furthermore, slope topography, surface brightness temperature, and fragmental volcanic products such as pyroclastics significantly influence the spatial hazard value. Further study of the proposed approach will aim at effective and efficient volcano risk assessment.
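
    A minimal sketch of a weighted index-overlay calculation of the kind named above (not the study's actual parameter scores or weights): class scores for each parameter raster are combined into a spatial hazard index and then classified. All rasters, weights and thresholds below are hypothetical.

    ```python
    import numpy as np

    # Hypothetical 4x4 rasters of hazard scores (1 = low ... 5 = high) already
    # assigned to each parameter class for a small area around a volcano.
    volcanic_products = np.array([[5, 5, 4, 2], [5, 4, 3, 2], [4, 3, 2, 1], [3, 2, 1, 1]])
    slope             = np.array([[4, 4, 3, 2], [4, 3, 3, 2], [3, 3, 2, 1], [2, 2, 1, 1]])
    brightness_temp   = np.array([[5, 4, 3, 2], [4, 4, 3, 2], [3, 3, 2, 2], [2, 2, 1, 1]])
    vegetation        = np.array([[5, 4, 3, 3], [4, 3, 3, 2], [3, 3, 2, 2], [3, 2, 2, 1]])  # low density -> high score

    # Index overlay: weighted average of the class scores; the weights are assumed.
    layers  = [volcanic_products, slope, brightness_temp, vegetation]
    weights = [0.4, 0.25, 0.2, 0.15]

    hazard_index = sum(w * layer for w, layer in zip(weights, layers))

    # Classify into qualitative hazard levels (thresholds are illustrative):
    # 0 = low, 1 = moderate, 2 = high, 3 = very high.
    levels = np.digitize(hazard_index, bins=[2.0, 3.0, 4.0])
    print(hazard_index.round(2))
    print(levels)
    ```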

  19. Treatment of kappa in Recent Western US Seismic Nuclear Plant Probabilistic Seismic Hazard Studies

    NASA Astrophysics Data System (ADS)

    Toro, G. R.; Di Alessandro, C.; Al Atik, L.

    2015-12-01

    The three operating nuclear plants (Diablo Canyon, Palo Verde, and Columbia Generating Station) in the western United States recently performed SSHAC Level 3 seismic hazard studies in response to a Request for Information by the Nuclear Regulatory Commission, following the accident at the Fukushima Dai-ichi nuclear facility. The treatment of zero-distance kappa, referred to as kappa_0 and commonly attributed to material damping and scattering in the shallow crust, was given extensive consideration in these studies. Available ground motion prediction equations (GMPEs) do not typically include kappa_0 as a prediction parameter and are developed for an average kappa_0 of the host region. Kappa scaling is routinely applied to adjust for the differences in average kappa between the GMPEs' host regions and the target regions. The impact of kappa scaling on the results of probabilistic seismic hazard analyses is significant for nuclear and other facilities that are sensitive to high-frequency ground motions (frequencies greater than about 10 Hz). There are several available approaches for deriving kappa scaling factors for GMPEs, all of which require estimating kappa_0 at the target site. It is difficult to constrain the target kappa_0 empirically due to the scarcity of ground-motion data from hard-rock sites in ground-motion databases. The hazard studies for the three nuclear power plants had different data, faced different challenges in the estimation of kappa_0, and used different methods for the estimation of the effect of kappa_0 on the site-specific ground motions. This presentation summarizes the approaches used for the evaluation of kappa_0 and for its incorporation in the probabilistic seismic hazard analysis. Emphasis is given to the quantification of the kappa_0 uncertainty and to the evaluation of its impact on the resulting seismic hazard at the different sites.
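
    For context, a common host-to-target kappa adjustment applies a high-frequency factor of exp(-pi f Δkappa) to the Fourier amplitude spectrum. The sketch below evaluates that factor for assumed host and target kappa_0 values; the spectrum and kappa values are illustrative, not those of the plant studies.

    ```python
    import numpy as np

    # Host-to-target kappa_0 adjustment of a Fourier amplitude spectrum (FAS).
    # High-frequency diminution is modelled as exp(-pi * kappa_0 * f), so the
    # target/host spectral ratio is exp(-pi * f * (kappa_target - kappa_host)).
    kappa_host = 0.040      # s, assumed average kappa_0 of the GMPE host region
    kappa_target = 0.006    # s, assumed hard-rock kappa_0 at the target site

    freqs = np.array([1.0, 5.0, 10.0, 20.0, 30.0])      # Hz
    fas_host = 100.0 / (1.0 + (freqs / 8.0) ** 2)       # placeholder host-region FAS

    scale = np.exp(-np.pi * freqs * (kappa_target - kappa_host))
    fas_target = fas_host * scale

    for f, s, fh, ft in zip(freqs, scale, fas_host, fas_target):
        print(f"{f:5.1f} Hz: kappa factor = {s:6.2f}, FAS {fh:6.2f} -> {ft:7.2f}")
    ```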

  20. Robot-assisted home hazard assessment for fall prevention: a feasibility study.

    PubMed

    Sadasivam, Rajani S; Luger, Tana M; Coley, Heather L; Taylor, Benjamin B; Padir, Taskin; Ritchie, Christine S; Houston, Thomas K

    2014-01-01

    We examined the feasibility of using a remotely manoeuvrable robot to make home hazard assessments for fall prevention. We employed use-case simulations to compare robot assessments with in-person assessments. We screened the homes of nine elderly patients (aged 65 years or more) for fall risks using the HEROS screening assessment. We also assessed the participants' perspectives of the remotely-operated robot in a survey. The nine patients had a median Short Blessed Test score of 8 (interquartile range, IQR 2-20) and a median Life-Space Assessment score of 46 (IQR 27-75). Compared to the in-person assessment (mean = 4.2 hazards identified per participant), significantly more home hazards were perceived in the robot video assessment (mean = 7.0). Only two checklist items (adequate bedroom lighting and a clear path from bed to bathroom) had more than 60% agreement between the in-person and robot video assessments. Participants were enthusiastic about the robot and did not think it violated their privacy. The study found little agreement between the in-person and robot video hazard assessments. However, it identified several research questions about how to best use remotely-operated robots. PMID:24352900

  1. Natural Hazards in Ólafsfjörður, Iceland. A conceptual study.

    NASA Astrophysics Data System (ADS)

    Moran, A.; Wastl, M.; Stötter, J.; Ploner, A.; Sönser, T.

    2003-04-01

    This study focuses on the conceptual approach to natural hazard investigations in regions lacking hazard zoning or with only rudimentary hazard assessments. Fulfilling these specifications, the community of Ólafsfjörður, with a population of ca. 1100 inhabitants, is selected as a sample area. It is located in a high mountain environment in northern Iceland (66.07° N, 18.65° W) and is subject to various natural hazard processes such as snow avalanches, debris and mud flows, rock fall and flooding. The objective of this procedure is to identify the risk potential of the area on a regional scale of 1:25,000. The input data consist only of aerial photography and topographical maps, which, with the use of various software programs including a geographical information system (GIS), enable the generation of a digital terrain model and the creation of orthophotos for mapping. The following step is hazard identification. This consists of the interpretation of aerial photography, field mapping and the evaluation of historic events by means of interviews and written chronicles. In a second step, GIS-based avalanche and rock fall simulation models are run in order to quantify worst-case scenarios. Parallel to the assessment of geomorphological processes, the areas of human settlement and the building functions in the entire community are recorded. In a final step, the information representing the spatial extents of natural processes and existing human settlements is overlaid, allowing an overview of potential conflict zones. The results reveal that a considerable number of buildings and road segments lie within the reach of potential avalanche run-out. This calls for urgent measures to be taken. This study obtains quick results in identifying areas of conflict while forming the foundation for more detailed planning where necessary.

  2. DIII-D Studies of Massive Gas Injection Fast Shutdowns for Disruption Mitigation

    SciTech Connect

    Hollmann, E; Jernigan, T; Antar, G; Bakhtiari, M; Boedo, J; Combs, S; Evans, T; Gray, D; Groth, M; Humphreys, D; Lasnier, C; Moyer, R; Parks, P; Rudakov, D; Strait, E; Wesley, J; West, W; Whyte, D; Yu, J

    2006-06-19

    Injection of massive quantities of gas is a promising technique for fast shutdown of ITER for the purpose of avoiding divertor and first wall damage from disruptions. Previous experiments using massive gas injection (MGI) to terminate discharges in the DIII-D tokamak have demonstrated rapid shutdown with reduced wall heating and halo currents (relative to natural disruptions) and with very small runaway electron (RE) generation [1]. Figure 1 shows time traces which give an overview of shutdown time scales. Typically, of order 5 x 10^22 Ar neutrals are fired over a pulse of 25 ms duration into stationary (non-disrupting) discharges. The observed results are consistent with the following scenario: within several ms of the jet trigger, sufficient Ar neutrals are delivered to the plasma to cause the edge temperature to collapse, initiating the inward propagation of a cold front. The exit flow of the jet [Fig. 1(a)] has a ~9 ms rise time; so the quantity of neutrals which initiates the edge collapse is small (<10^20). When the cold front reaches the q ~ 2 surface, global magnetohydrodynamic (MHD) modes are destabilized [2], mixing hot core plasma with edge impurities. Here, q is the safety factor. Most (>90%) of the plasma thermal energy is lost via impurity radiation during this thermal quench (TQ) phase. Conducted heat loads to the wall are low because of the cold edge temperature. After the TQ, the plasma is very cold (of order several eV), so conducted wall (halo) currents are low, even if the current channel contacts the wall. The plasma current profile broadens and begins decaying resistively. The decaying current generates a toroidal electric field which can accelerate REs; however, RE beam formation appears to be limited in MGI shutdowns. Presently, it is thought that the conducted heat flux and halo current mitigation qualities of the MGI shutdown technique will scale well to a reactor-sized tokamak. However, because of the larger RE gain

  3. Success in transmitting hazard science

    NASA Astrophysics Data System (ADS)

    Price, J. G.; Garside, T.

    2010-12-01

    Money motivates mitigation. An example of success in communicating scientific information about hazards, coupled with information about available money, is the follow-up action by local governments to actually mitigate. The Nevada Hazard Mitigation Planning Committee helps local governments prepare competitive proposals for federal funds to reduce risks from natural hazards. Composed of volunteers with expertise in emergency management, building standards, and earthquake, flood, and wildfire hazards, the committee advises the Nevada Division of Emergency Management on (1) the content of the State’s hazard mitigation plan and (2) projects that have been proposed by local governments and state agencies for funding from various post- and pre-disaster hazard mitigation programs of the Federal Emergency Management Agency. Local governments must have FEMA-approved hazard mitigation plans in place before they can receive this funding. The committee has been meeting quarterly with elected and appointed county officials, at their offices, to encourage them to update their mitigation plans and apply for this funding. We have settled on a format that includes the county’s giving the committee an overview of its infrastructure, hazards, and preparedness. The committee explains the process for applying for mitigation grants and presents the latest information that we have about earthquake hazards, including locations of nearby active faults, historical seismicity, geodetic strain, loss-estimation modeling, scenarios, and documents about what to do before, during, and after an earthquake. Much of the county-specific information is available on the web. The presentations have been well received, in part because the committee makes the effort to go to their communities, and in part because the committee is helping them attract federal funds for local mitigation of not only earthquake hazards but also floods (including canal breaches) and wildfires, the other major concerns in

  4. Thermal study of payload module for the next-generation infrared space telescope SPICA in risk mitigation phase

    NASA Astrophysics Data System (ADS)

    Shinozaki, Keisuke; Sato, Yoichi; Sawada, Kenichiro; Ando, Makiko; Sugita, Hiroyuki; Yamawaki, Toshihiro; Mizutani, Tadahiro; Komatsu, Keiji; Nakagawa, Takao; Murakami, Hiroshi; Matsuhara, Hideo; Takada, Makoto; Takai, Shigeki; Okabayashi, Akinobu; Tsunematsu, Shoji; Kanao, Kenichi; Narasaki, Katsuhiro

    2014-11-01

    SPace Infrared telescope for Cosmology and Astrophysics (SPICA) is a pre-project of JAXA in collaboration with ESA, to be launched around 2020. SPICA will be transferred into a halo orbit around the second Lagrangian point (L2) of the Sun-Earth system, which enables the use of effective radiative cooling in combination with a mechanical cooling system in order to cool a large 3 m IR telescope below 6 K. At present, a conceptual study of SPICA is underway to assess and mitigate the mission's risks; the thermal study for risk mitigation sets goals of a 25% margin on the cooling power in the 4 K/1 K temperature regions and a 25% margin on the heat load from the Focal Plane Instruments (FPIs) in the intermediate temperature region, and aims to enhance the reliability of the mechanical cooler system and the feasibility of ground tests. Thermal property measurements of FRP materials are also important. This paper introduces details of the thermal design study for risk mitigation, including development of the truss separation mechanism, the cryogenic radiator, the mechanical cooler system, and thermal property measurements of materials.

  5. Google Earth Views of Probabilistic Tsunami Hazard Analysis Pilot Study, Seaside, Oregon

    NASA Astrophysics Data System (ADS)

    Wong, F. L.; Venturato, A. J.; Geist, E. L.

    2006-12-01

Virtual globes such as Google Earth provide immediate geographic context for research data for coastal hazard planning. We present Google Earth views of data from a Tsunami Pilot Study conducted within and near Seaside and Gearhart, Oregon, as part of FEMA's Flood Insurance Rate Map Modernization Program (Tsunami Pilot Study Working Group, 2006). Two goals of the pilot study were to develop probabilistic 100-year and 500-year tsunami inundation maps using Probabilistic Tsunami Hazard Analysis (PTHA) and to provide recommendations for improved tsunami hazard assessment guidelines. The Seaside area was chosen because it is typical of many coastal communities along the Cascadia subduction zone that extends from Cape Mendocino, California, to the Strait of Juan de Fuca, Washington. State and local stakeholders also expressed considerable interest in mapping the tsunami threat to this area. The study was an interagency effort by the National Oceanic and Atmospheric Administration, U.S. Geological Survey, and FEMA, in collaboration with the University of Southern California, Middle East Technical University, Portland State University, Horning Geoscience, Northwest Hydraulics Consultants, and the Oregon Department of Geological and Mineral Industries. The pilot study report will be augmented by a separate geographic information systems (GIS) data publication that provides model data and results. In addition to traditional GIS data formats, Google Earth kmz files are available to provide rapid visualization of the data against the rich base map provided by the interface. The data include verbal and geologic observations of historic tsunami events, newly constructed DEMs, historic shorelines, earthquake sources, models of tsunami wave heights, and maps of the estimated 100- and 500-year probabilistic floods. Tsunami Pilot Study Working Group, 2006, Seaside, Oregon Tsunami Pilot Study - Modernization of FEMA Flood Hazard Maps: U.S. Geological Survey Open-file Report 2006

  6. Fitting additive hazards models for case-cohort studies: a multiple imputation approach.

    PubMed

    Jung, Jinhyouk; Harel, Ofer; Kang, Sangwook

    2016-07-30

In this paper, we consider fitting semiparametric additive hazards models for case-cohort studies using a multiple imputation approach. In a case-cohort study, main exposure variables are measured only on some selected subjects, but other covariates are often available for the whole cohort. We treat this as a special case of a covariate missing by design. We propose to employ a popular incomplete-data method, multiple imputation, for estimation of the regression parameters in additive hazards models. For the imputation models, an imputation procedure based on rejection sampling is developed. A simple imputation model that can naturally be applied to a general missing-at-random situation is also considered and compared with the rejection sampling method via extensive simulation studies. In addition, misspecification of the imputation model is investigated. The proposed procedures are illustrated using a cancer data example. Copyright © 2015 John Wiley & Sons, Ltd. PMID:26194861
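    Only the pooling step of such a multiple-imputation analysis is shown below, since it is generic regardless of the completed-data model. This is a minimal sketch assuming the imputation of the missing exposure and the per-imputation additive-hazards fits are done elsewhere; the function name and example numbers are illustrative, not taken from the paper.

```python
import numpy as np

def pool_rubin(estimates, variances):
    """Combine M completed-data estimates of a regression coefficient
    using Rubin's rules; returns the pooled estimate and total variance."""
    estimates = np.asarray(estimates, dtype=float)
    variances = np.asarray(variances, dtype=float)
    m = len(estimates)
    qbar = estimates.mean()                 # pooled point estimate
    ubar = variances.mean()                 # within-imputation variance
    b = estimates.var(ddof=1)               # between-imputation variance
    t = ubar + (1.0 + 1.0 / m) * b          # total variance
    return qbar, t

# Hypothetical exposure coefficients from M = 10 imputed additive-hazards fits.
est = [0.021, 0.018, 0.024, 0.020, 0.019, 0.023, 0.022, 0.017, 0.021, 0.020]
var = [4e-5] * 10
print(pool_rubin(est, var))
```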

  7. Climate engineering of vegetated land for hot extremes mitigation: an ESM sensitivity study

    NASA Astrophysics Data System (ADS)

    Wilhelm, Micah; Davin, Edouard; Seneviratne, Sonia

    2014-05-01

Mitigation efforts to reduce anthropogenic climate forcing have thus far proven inadequate, as is evident from accelerating greenhouse gas emissions. Many subtropical and mid-latitude regions are expected to experience longer and more frequent heat waves and droughts within the next century. This increased occurrence of weather extremes has important implications for human health and mortality and for socio-economic factors including forest fires, water availability and agricultural production. Various solar radiation management (SRM) schemes that attempt to homogeneously counter the anthropogenic forcing have been examined with different Earth System Models (ESMs). Land climate engineering schemes, which reduce the amount of solar radiation absorbed at the surface, have also been investigated. However, most studies have examined effects on the mean climate response rather than on extremes. Here we present the results of a series of climate engineering sensitivity experiments performed with the Community Earth System Model (CESM) version 1.0.2 at 2°-resolution. This configuration entails 5 fully coupled model components responsible for simulating the Earth's atmosphere, land, land-ice, ocean and sea-ice that interact through a central coupler. Historical and RCP8.5 scenarios were performed with transient land-cover changes and prognostic terrestrial carbon/nitrogen cycles. Four sets of experiments are performed in which surface albedo over snow-free vegetated grid points is increased by 0.05, 0.10, 0.15 and 0.20. The simulations show a strong preferential cooling of hot extremes throughout the Northern mid-latitudes during boreal summer. A strong linear scaling between the cooling of extremes and the additional surface albedo applied to the land model is observed. The strongest preferential cooling is found in southeastern Europe and the central United States, where the increases in soil moisture and evaporative fraction are the largest relative to the control

  8. 44 CFR 201.7 - Tribal Mitigation Plans.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... OF HOMELAND SECURITY DISASTER ASSISTANCE MITIGATION PLANNING § 201.7 Tribal Mitigation Plans. The... reduce risks from natural hazards, serving as a guide for decision makers as they commit resources to reducing the effects of natural hazards. (a) Plan requirement. (1) Indian tribal governments applying...

  9. Proportional hazards regression in epidemiologic follow-up studies: an intuitive consideration of primary time scale.

    PubMed

    Cologne, John; Hsu, Wan-Ling; Abbott, Robert D; Ohishi, Waka; Grant, Eric J; Fujiwara, Saeko; Cullings, Harry M

    2012-07-01

    In epidemiologic cohort studies of chronic diseases, such as heart disease or cancer, confounding by age can bias the estimated effects of risk factors under study. With Cox proportional-hazards regression modeling in such studies, it would generally be recommended that chronological age be handled nonparametrically as the primary time scale. However, studies involving baseline measurements of biomarkers or other factors frequently use follow-up time since measurement as the primary time scale, with no explicit justification. The effects of age are adjusted for by modeling age at entry as a parametric covariate. Parametric adjustment raises the question of model adequacy, in that it assumes a known functional relationship between age and disease, whereas using age as the primary time scale does not. We illustrate this graphically and show intuitively why the parametric approach to age adjustment using follow-up time as the primary time scale provides a poor approximation to age-specific incidence. Adequate parametric adjustment for age could require extensive modeling, which is wasteful, given the simplicity of using age as the primary time scale. Furthermore, the underlying hazard with follow-up time based on arbitrary timing of study initiation may have no inherent meaning in terms of risk. Given the potential for biased risk estimates, age should be considered as the preferred time scale for proportional-hazards regression with epidemiologic follow-up data when confounding by age is a concern. PMID:22517300
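    In practice, the difference between the two time scales comes down to how the data are handed to the fitting routine. The following is a hedged sketch with the Python lifelines package, assuming a data frame with hypothetical columns age_entry, age_exit, time_on_study, event and biomarker, and assuming a lifelines version that supports delayed entry via entry_col; it illustrates the two setups and is not the authors' analysis code.

```python
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("cohort.csv")  # hypothetical cohort file

# Option A: follow-up time as the primary time scale, with age at entry
# adjusted for parametrically as a covariate.
cph_followup = CoxPHFitter()
cph_followup.fit(df[["time_on_study", "event", "biomarker", "age_entry"]],
                 duration_col="time_on_study", event_col="event")

# Option B (preferred when age confounding is a concern): attained age as
# the primary time scale with delayed entry at the age of enrolment, so the
# baseline hazard absorbs the age effect nonparametrically.
cph_age = CoxPHFitter()
cph_age.fit(df[["age_entry", "age_exit", "event", "biomarker"]],
            duration_col="age_exit", event_col="event", entry_col="age_entry")
```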

  10. Seaside, Oregon, Tsunami Pilot Study-Modernization of FEMA Flood Hazard Maps: GIS Data

    USGS Publications Warehouse

    Wong, Florence L.; Venturato, Angie J.; Geist, Eric L.

    2006-01-01

Introduction: The Federal Emergency Management Agency (FEMA) Flood Insurance Rate Map (FIRM) guidelines do not currently exist for conducting and incorporating tsunami hazard assessments that reflect the substantial advances in tsunami research achieved in the last two decades; this conclusion is the result of two FEMA-sponsored workshops and the associated Tsunami Focused Study (Chowdhury and others, 2005). Therefore, as part of FEMA's Map Modernization Program, a Tsunami Pilot Study was carried out in the Seaside/Gearhart, Oregon, area to develop an improved Probabilistic Tsunami Hazard Analysis (PTHA) methodology and to provide recommendations for improved tsunami hazard assessment guidelines (Tsunami Pilot Study Working Group, 2006). The Seaside area was chosen because it is typical of many coastal communities in the section of the Pacific Coast from Cape Mendocino to the Strait of Juan de Fuca, and because State agencies and local stakeholders expressed considerable interest in mapping the tsunami threat to this area. The study was an interagency effort by FEMA, U.S. Geological Survey, and the National Oceanic and Atmospheric Administration (NOAA), in collaboration with the University of Southern California, Middle East Technical University, Portland State University, Horning Geoscience, Northwest Hydraulics Consultants, and the Oregon Department of Geological and Mineral Industries. We present the spatial (geographic information system, GIS) data from the pilot study in standard GIS formats and provide files for visualization in Google Earth, a global map viewer.

  11. Assessment and Indirect Adjustment for Confounding by Smoking in Cohort Studies Using Relative Hazards Models

    PubMed Central

    Richardson, David B.; Laurier, Dominique; Schubauer-Berigan, Mary K.; Tchetgen, Eric Tchetgen; Cole, Stephen R.

    2014-01-01

    Workers' smoking histories are not measured in many occupational cohort studies. Here we discuss the use of negative control outcomes to detect and adjust for confounding in analyses that lack information on smoking. We clarify the assumptions necessary to detect confounding by smoking and the additional assumptions necessary to indirectly adjust for such bias. We illustrate these methods using data from 2 studies of radiation and lung cancer: the Colorado Plateau cohort study (1950–2005) of underground uranium miners (in which smoking was measured) and a French cohort study (1950–2004) of nuclear industry workers (in which smoking was unmeasured). A cause-specific relative hazards model is proposed for estimation of indirectly adjusted associations. Among the miners, the proposed method suggests no confounding by smoking of the association between radon and lung cancer—a conclusion supported by adjustment for measured smoking. Among the nuclear workers, the proposed method suggests substantial confounding by smoking of the association between radiation and lung cancer. Indirect adjustment for confounding by smoking resulted in an 18% decrease in the adjusted estimated hazard ratio, yet this cannot be verified because smoking was unmeasured. Assumptions underlying this method are described, and a cause-specific proportional hazards model that allows easy implementation using standard software is presented. PMID:25245043
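    For intuition about the size of such an adjustment, a simple external (bias-factor) calculation for a binary confounder can be sketched; this is a generic illustration, not the authors' cause-specific relative hazards model, and the prevalences, smoking relative risk and observed hazard ratio below are invented.

```python
def bias_factor(p_smoke_exposed, p_smoke_unexposed, rr_smoking):
    """Confounding ratio for a binary confounder (smoking): the ratio of the
    smoking-driven rates expected in the more-exposed vs. less-exposed groups,
    given smoking prevalences and the smoking-lung cancer relative risk."""
    num = p_smoke_exposed * rr_smoking + (1.0 - p_smoke_exposed)
    den = p_smoke_unexposed * rr_smoking + (1.0 - p_smoke_unexposed)
    return num / den

# Hypothetical inputs: smoking slightly more common among the more exposed.
observed_hr = 1.30
b = bias_factor(p_smoke_exposed=0.55, p_smoke_unexposed=0.45, rr_smoking=10.0)
adjusted_hr = observed_hr / b
print(round(b, 3), round(adjusted_hr, 3))
```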

  12. New Seismic Hazard study in Spain Aimed at the revision of the Spanish Building Code

    NASA Astrophysics Data System (ADS)

    Rivas-Medina, A.; Benito, B.; Cabañas, L.; Martínez-Solares, J. M.; Ruíz, S.; Gaspar-Escribano, J. M.; Carreño, E.; Crespo, M.; García-Mayordomo, J.

    2013-05-01

In this paper we present a global overview of the recent study carried out in Spain for the new hazard map, whose final goal is the revision of the Building Code in our country (NCSE-02). The study was carried out by a working group joining experts from the Instituto Geografico Nacional (IGN) and the Technical University of Madrid (UPM), with the different phases of the work supervised by an expert committee composed of national experts from public institutions involved in the subject of seismic hazard. The PSHA method (Probabilistic Seismic Hazard Assessment) has been followed, quantifying the epistemic uncertainties through a logic tree and the aleatory ones, linked to the variability of parameters, by means of probability density functions and Monte Carlo simulations. In a first phase, the inputs have been prepared, which essentially are: 1) an updated project catalogue homogenized to Mw; 2) proposed zoning models and source characterization; 3) Ground Motion Prediction Equations (GMPEs) calibrated with observed data, together with a local model developed with data collected in Spain for Mw < 5.5. In a second phase, a sensitivity analysis of the different input options on the hazard results has been carried out in order to establish criteria for defining the branches of the logic tree and their weights. Finally, the hazard estimation was done with the logic tree shown in figure 1, including nodes for quantifying the uncertainties corresponding to: 1) the method for hazard estimation (zoning and zoneless); 2) the zoning models; 3) the GMPE combinations used; and 4) the regression method for estimating source parameters. In addition, the aleatory uncertainties corresponding to the magnitude of the events, the recurrence parameters, and the maximum magnitude for each zone have also been considered through probability density functions and Monte Carlo simulations. The main conclusions of the study are presented here, together with the results obtained in terms of PGA and other spectral accelerations
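    The final step of such an analysis, combining branch hazard curves with logic-tree weights into a mean curve and fractiles, can be sketched in a few lines. The branch names, weights and exceedance rates below are invented for illustration and do not reproduce the Spanish model.

```python
import numpy as np

pga = np.array([0.05, 0.1, 0.2, 0.3, 0.5])   # ground-motion levels (g)

# Annual exceedance rates per logic-tree branch (weight, curve); values invented.
branches = {
    "zoning_gmpeA":   (0.3, np.array([2.0e-2, 8.0e-3, 2.0e-3, 7.0e-4, 1.0e-4])),
    "zoning_gmpeB":   (0.3, np.array([3.0e-2, 1.0e-2, 3.0e-3, 1.0e-3, 2.0e-4])),
    "zoneless_gmpeA": (0.2, np.array([1.5e-2, 6.0e-3, 1.5e-3, 5.0e-4, 8.0e-5])),
    "zoneless_gmpeB": (0.2, np.array([2.5e-2, 9.0e-3, 2.5e-3, 8.0e-4, 1.5e-4])),
}

weights = np.array([w for w, _ in branches.values()])
curves = np.vstack([c for _, c in branches.values()])

mean_curve = weights @ curves  # weighted-mean hazard curve
# Unweighted percentiles shown for brevity; a weighted fractile would be used in practice.
p16, p84 = np.percentile(curves, [16, 84], axis=0)

for a, lam, lo, hi in zip(pga, mean_curve, p16, p84):
    print(f"PGA {a:.2f} g : mean rate {lam:.2e}, 16-84% range {lo:.2e}-{hi:.2e}")
```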

  13. Analytic study to evaluate associations between hazardous waste sites and birth defects. Final report

    SciTech Connect

    Marshall, E.G.; Gensburg, L.J.; Geary, N.S.; Deres, D.A.; Cayo, M.R.

    1995-06-01

A study was conducted to evaluate the risk of two types of birth defects (central nervous system and musculoskeletal defects) associated with mothers' exposure to solvents, metals, and pesticides through residence near hazardous waste sites. The only environmental factor showing a statistically significant elevation in risk was living within one mile of industrial or commercial facilities emitting solvents into the air. Residence near these facilities showed elevated risk for central nervous system defects but no elevated risks for musculoskeletal defects.

  14. Reviewing and visualising relationships between anthropic processes and natural hazards within a multi-hazard framework

    NASA Astrophysics Data System (ADS)

    Gill, Joel C.; Malamud, Bruce D.

    2014-05-01

    Here we present a broad overview of the interaction relationships between 17 anthropic processes and 21 different natural hazard types. Anthropic processes are grouped into seven categories (subsurface extraction, subsurface addition, land use change, explosions, hydrological change, surface construction processes, miscellaneous). Natural hazards are grouped into six categories (geophysical, hydrological, shallow earth processes, atmospheric, biophysical and space). A wide-ranging review based on grey- and peer-reviewed literature from many scientific disciplines identified 54 relationships where anthropic processes have been noted to trigger natural hazards. We record case studies for all but three of these relationships. Based on the results of this review, we find that the anthropic processes of deforestation, explosions (conventional and nuclear) and reservoir construction could trigger the widest range of different natural hazard types. We also note that within the natural hazards, landslides and earthquakes are those that could be triggered by the widest range of anthropic processes. This work also examines the possibility of anthropic processes (i) resulting in an increased occurrence of a particular hazard interaction (e.g., deforestation could result in an increased interaction between storms and landslides); and (ii) inadvertently reducing the likelihood of a natural hazard or natural hazard interaction (e.g., poor drainage or deforestation reducing the likelihood of wildfires triggered by lightning). This study synthesises, using accessible visualisation techniques, the large amounts of anthropic process and natural hazard information from our review. In it we have outlined the importance of considering anthropic processes within any analysis of hazard interactions, and we reinforce the importance of a holistic approach to natural hazard assessment, mitigation and management.

  15. Social and ethical perspectives of landslide risk mitigation measures

    NASA Astrophysics Data System (ADS)

    Kalsnes, Bjørn; Vangelsten, Bjørn V.

    2015-04-01

Landslide risk may be mitigated by use of a wide range of measures. Mitigation and prevention options may include (1) structural measures to reduce the frequency, severity or exposure to the hazard, (2) non-structural measures, such as land-use planning and early warning systems, to reduce the hazard frequency and consequences, and (3) measures to pool and transfer the risks. In a given situation the appropriate system of mitigation measures may be a combination of various types of measures, both structural and non-structural. In the process of choosing mitigation measures for a given landslide risk situation, the role of the geoscientist is normally to propose possible mitigation measures on the basis of the risk level and technical feasibility. Social and ethical perspectives are often neglected in this process. However, awareness of the need to consider social as well as ethical issues in the design and management of landslide risk mitigation is rising. There is a growing understanding that technical experts acting alone cannot determine what will be considered the appropriate set of mitigation and prevention measures. Issues such as environment versus development, questions of acceptable risk, who bears the risks and benefits, and who makes the decisions, also need to be addressed. Policymakers and stakeholders engaged in solving environmental risk problems are increasingly recognising that traditional expert-based decision-making processes are insufficient. This paper analyses the process of choosing appropriate measures to mitigate landslide risk from a social and ethical perspective, considering technical, cultural, economic, environmental and political elements. The paper focuses on stakeholder involvement in the decision-making process, and shows how developing strategies for risk communication is key to a successful process. The study is supported by case study examples from Norway and Italy. In the Italian case study, three different risk mitigation

  16. Seaside, Oregon Tsunami Pilot Study - modernization of FEMA flood hazard maps

    USGS Publications Warehouse

    Tsunami Pilot Study Working Group

    2006-01-01

    FEMA Flood Insurance Rate Map (FIRM) guidelines do not currently exist for conducting and incorporating tsunami hazard assessments that reflect the substantial advances in tsunami research achieved in the last two decades; this conclusion is the result of two FEMA-sponsored workshops and the associated Tsunami Focused Study. Therefore, as part of FEMA's Map Modernization Program, a Tsunami Pilot Study was carried out in the Seaside/Gearhart, Oregon, area to develop an improved Probabilistic Tsunami Hazard Assessment (PTHA) methodology and to provide recommendations for improved tsunami hazard assessment guidelines. The Seaside area was chosen because it is typical of many coastal communities in the section of the Pacific Coast from Cape Mendocino to the Strait of Juan de Fuca, and because State Agencies and local stakeholders expressed considerable interest in mapping the tsunami threat to this area. The study was an interagency effort by FEMA, U.S. Geological Survey and the National Oceanic and Atmospheric Administration, in collaboration with the University of Southern California, Middle East Technical University. Portland State University, Horning Geosciences, Northwest Hydraulics Consultants, and the Oregon Department of Geological and Mineral Industries. Draft copies and a briefing on the contents, results and recommendations of this document were provided to FEMA officials before final publication.

  17. A web-based tool for ranking landslide mitigation measures

    NASA Astrophysics Data System (ADS)

    Lacasse, S.; Vaciago, G.; Choi, Y. J.; Kalsnes, B.

    2012-04-01

    brief description, guidance on design, schematic details, practical examples and references for each mitigation measure. Each of the measures was given a score on its ability and applicability for different types of landslides and boundary conditions, and a decision support matrix was established. The web-based toolbox organizes the information in the compendium and provides an algorithm to rank the measures on the basis of the decision support matrix, and on the basis of the risk level estimated at the site. The toolbox includes a description of the case under study and offers a simplified option for estimating the hazard and risk levels of the slide at hand. The user selects the mitigation measures to be included in the assessment. The toolbox then ranks, with built-in assessment factors and weights and/or with user-defined ranking values and criteria, the mitigation measures included in the analysis. The toolbox includes data management, e.g. saving data half-way in an analysis, returning to an earlier case, looking up prepared examples or looking up information on mitigation measures. The toolbox also generates a report and has user-forum and help features. The presentation will give an overview of the mitigation measures considered and examples of the use of the toolbox, and will take the attendees through the application of the toolbox.
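    The ranking logic of such a decision-support matrix reduces to a weighted score per measure. Below is a minimal sketch with invented measures, criteria and weights; it illustrates the idea only and is not the toolbox's actual scoring scheme.

```python
# Scores (0-3) of each candidate measure against each criterion (illustrative).
scores = {
    "surface drainage": {"applicability": 3, "reliability": 2, "cost": 2, "environment": 3},
    "retaining wall":   {"applicability": 2, "reliability": 3, "cost": 1, "environment": 2},
    "early warning":    {"applicability": 3, "reliability": 2, "cost": 3, "environment": 3},
}

# User-defined (or built-in default) criterion weights.
weights = {"applicability": 0.4, "reliability": 0.3, "cost": 0.2, "environment": 0.1}

def rank(scores, weights):
    """Return measures sorted by weighted total score, highest first."""
    totals = {m: sum(weights[c] * s for c, s in crit.items())
              for m, crit in scores.items()}
    return sorted(totals.items(), key=lambda kv: kv[1], reverse=True)

for measure, total in rank(scores, weights):
    print(f"{measure:18s} {total:.2f}")
```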

  18. Space Debris & its Mitigation

    NASA Astrophysics Data System (ADS)

    Kaushal, Sourabh; Arora, Nishant

    2012-07-01

Space debris has become a growing concern in recent years, since collisions at orbital velocities can be highly damaging to functioning satellites and can also produce even more space debris in the process. Some spacecraft, like the International Space Station, are now armored to deal with this hazard, but armor and mitigation measures can be prohibitively costly when trying to protect satellites or human spaceflight vehicles like the shuttle. This paper describes the current orbital debris environment, outlines its main sources, and identifies mitigation measures to reduce orbital debris growth by controlling these sources. We reviewed the literature on space debris, propose some methods to address the problem, and highlight the shortcomings of methods already proposed by space experts, suggesting modifications where needed. Some of the existing proposals can be very effective for mitigating space debris, but others require modification. Recently proposed methods include collision-avoidance maneuvers, shielding of a space elevator with foil, vaporizing or redirecting space debris back to Earth with lasers, use of aerogel as a protective layer, construction of large junkyards around the International Space Station, use of electrodynamic tethers, and, most recently, the use of nano satellites for clearing space debris. Limitations of the already proposed methods are as follows: - Maneuvering cannot be the final solution to the problem, as it is only an act of self-defence. - Shielding cannot be applied to parts such as solar panels and optical devices. - Vaporizing or redirecting space debris can affect human life on Earth if it is not done properly. - Aerogel has a threshold limit up to which it can resist the impact of a collision. - Large junkyards can be effective only for large-sized debris. In this paper we propose: A. The use of nano tubes by creating a mesh

  19. Incorporating induced seismicity in the 2014 United States National Seismic Hazard Model: results of the 2014 workshop and sensitivity studies

    USGS Publications Warehouse

    Petersen, Mark D.; Mueller, Charles S.; Moschetti, Morgan P.; Hoover, Susan M.; Rubinstein, Justin L.; Llenos, Andrea L.; Michael, Andrew J.; Ellsworth, William L.; McGarr, Arthur F.; Holland, Austin A.; Anderson, John G.

    2015-01-01

The U.S. Geological Survey National Seismic Hazard Model for the conterminous United States was updated in 2014 to account for new methods, input models, and data necessary for assessing the seismic ground shaking hazard from natural (tectonic) earthquakes. The U.S. Geological Survey National Seismic Hazard Model project uses probabilistic seismic hazard analysis to quantify the rate of exceedance for earthquake ground shaking (ground motion). For the 2014 National Seismic Hazard Model assessment, the seismic hazard from potentially induced earthquakes was intentionally not considered because we had not determined how to properly treat these earthquakes for the seismic hazard analysis. The phrases “potentially induced” and “induced” are used interchangeably in this report; however, it is acknowledged that this classification is based on circumstantial evidence and scientific judgment. For the 2014 National Seismic Hazard Model update, the potentially induced earthquakes were removed from the NSHM’s earthquake catalog, and the documentation states that we would consider alternative models for including induced seismicity in a future version of the National Seismic Hazard Model. As part of the process of incorporating induced seismicity into the seismic hazard model, we evaluate the sensitivity of the seismic hazard from induced seismicity to five parts of the hazard model: (1) the earthquake catalog, (2) earthquake rates, (3) earthquake locations, (4) earthquake Mmax (maximum magnitude), and (5) earthquake ground motions. We describe alternative input models for each of the five parts that represent differences in scientific opinions on induced seismicity characteristics. In this report, however, we do not weight these input models to come up with a preferred final model. Instead, we present a sensitivity study showing uniform seismic hazard maps obtained by applying the alternative input models for induced seismicity. The final model will be released after

  20. DIII-D Studies of Massive Gas Injection Fast Shutdowns for Disruption Mitigation

    SciTech Connect

    Hollmann, E; Jernigan, T; Antar, G; Bakhtiari, M; Boedo, J; Combs, S; Evans, T; Gray, D; Groth, M; Humphreys, D; Lasnier, C; Moyer, R; Parks, P; Rudakov, D; Strait, E; Wesley, J; West, W; Whyte, D; Yu, J

    2006-09-29

Injection of massive quantities of gas is a promising technique for fast shutdown of ITER for the purpose of avoiding divertor and first wall damage from disruptions. Previous experiments using massive gas injection (MGI) to terminate discharges in the DIII-D tokamak have demonstrated rapid shutdown with reduced wall heating and halo currents (relative to natural disruptions) and with very small runaway electron (RE) generation [1]. Figure 1 shows time traces which give an overview of shutdown time scales. Typically, of order 5 × 10^22 Ar neutrals are fired over a pulse of 25 ms duration into stationary (non-disrupting) discharges. The observed results are consistent with the following scenario: within several ms of the jet trigger, sufficient Ar neutrals are delivered to the plasma to cause the edge temperature to collapse, initiating the inward propagation of a cold front. The exit flow of the jet [Fig. 1(a)] has a ~9 ms rise time; so the quantity of neutrals which initiates the edge collapse is small (<10^20). When the cold front reaches the q ≈ 2 surface, global magnetohydrodynamic (MHD) modes are destabilized [2], mixing hot core plasma with edge impurities. Here, q is the safety factor. Most (>90%) of the plasma thermal energy is lost via impurity radiation during this thermal quench (TQ) phase. Conducted heat loads to the wall are low because of the cold edge temperature. After the TQ, the plasma is very cold (of order several eV), so conducted wall (halo) currents are low, even if the current channel contacts the wall. The plasma current profile broadens and begins decaying resistively. The decaying current generates a toroidal electric field which can accelerate REs; however, RE beam formation appears to be limited in MGI shutdowns. Presently, it is thought that the conducted heat flux and halo current mitigation qualities of the MGI shutdown technique will scale well to a reactor-sized tokamak. However, because of the larger RE gain

  1. Neotectonic deformation models for probabilistic seismic hazard: a study in the External Dinarides

    NASA Astrophysics Data System (ADS)

    Kastelic, Vanja; Carafa, Michele M. C.; Visini, Francesco

    2016-06-01

    In Europe, common input data types for seismic hazard evaluation include earthquake catalogues, seismic zonation models and ground motion models, all with well-constrained epistemic uncertainties. In contrast, neotectonic deformation models and their related uncertainties are rarely considered in earthquake forecasting and seismic hazard studies. In this study, for the first time in Europe, we developed a seismic hazard model based exclusively on active fault and geodynamic deformation models. We applied it to the External Dinarides, a slow-deforming fold-and-thrust belt in the Central Mediterranean. The two deformation models furnish consistent long-term earthquake rates above the Mw 4.7 threshold on a latitude/longitude grid with 0.2° spacing. Results suggest that the use of deformation models is a valid alternative to empirical-statistical approaches in earthquake forecasting in slow-deforming regions of Europe. Furthermore, we show that the variability of different deformation models has a comparable effect on the peak ground motion acceleration uncertainty as do the ground motion prediction equations.

  2. Status of volcanic hazard studies for the Nevada Nuclear Waste Storage Investigations

    SciTech Connect

    Crowe, B.M.; Vaniman, D.T.; Carr, W.J.

    1983-03-01

Volcanism studies of the Nevada Test Site (NTS) region are concerned with hazards of future volcanism with respect to underground disposal of high-level radioactive waste. The hazards of silicic volcanism are judged to be negligible; hazards of basaltic volcanism are judged through research approaches combining hazard appraisal and risk assessment. The NTS region is cut obliquely by a N-NE trending belt of volcanism. This belt developed about 8 Myr ago following cessation of silicic volcanism and contemporaneous with migration of basaltic activity toward the southwest margin of the Great Basin. Two types of fields are present in the belt: (1) large-volume, long-lived basalt and local rhyolite fields with numerous eruptive centers and (2) small-volume fields formed by scattered basaltic scoria cones. Late Cenozoic basalts of the NTS region belong to the second field type. Monogenetic basalt centers of this region were formed mostly by Strombolian eruptions; Surtseyean activity has been recognized at three centers. Geochemically, the basalts of the NTS region are classified as straddle A-type basalts of the alkalic suite. Petrological studies indicate a volumetric dominance of evolved hawaiite magmas. Trace- and rare-earth-element abundances of younger basalt (<4 Myr) of the NTS region and southern Death Valley area, California, indicate an enrichment in incompatible elements, with the exception of rubidium. The conditional probability of recurring basaltic volcanism and disruption of a repository by that event is bounded by the range of 10^-8 to 10^-10 as calculated for a 1-yr period. Potential disruptive and dispersal effects of magmatic penetration of a repository are controlled primarily by the geometry of basalt feeder systems, the mechanism of waste incorporation in magma, and Strombolian eruption processes.

  3. 42 CFR 93.408 - Mitigating and aggravating factors in HHS administrative actions.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 42 Public Health 1 2011-10-01 2011-10-01 false Mitigating and aggravating factors in HHS administrative actions. 93.408 Section 93.408 Public Health PUBLIC HEALTH SERVICE, DEPARTMENT OF HEALTH AND HUMAN SERVICES HEALTH ASSESSMENTS AND HEALTH EFFECTS STUDIES OF HAZARDOUS SUBSTANCES RELEASES AND FACILITIES PUBLIC HEALTH SERVICE POLICIES ON...

  4. 42 CFR 93.408 - Mitigating and aggravating factors in HHS administrative actions.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 42 Public Health 1 2014-10-01 2014-10-01 false Mitigating and aggravating factors in HHS administrative actions. 93.408 Section 93.408 Public Health PUBLIC HEALTH SERVICE, DEPARTMENT OF HEALTH AND HUMAN SERVICES HEALTH ASSESSMENTS AND HEALTH EFFECTS STUDIES OF HAZARDOUS SUBSTANCES RELEASES AND FACILITIES PUBLIC HEALTH SERVICE POLICIES ON...

  5. ERTS-1 flood hazard studies in the Mississippi River Basin. [Missouri, Mississippi, and Arkansas

    NASA Technical Reports Server (NTRS)

    Rango, A.; Anderson, A. T.

    1974-01-01

The Spring 1973 Mississippi River flood was investigated using remotely sensed data from ERTS-1. Both manual and automatic analyses of the data indicate that ERTS-1 is extremely useful as a regional tool for flood and floodplain management. The maximum error of such flood area measurements is conservatively estimated to be less than five percent. Change detection analysis indicates that the flood had major impacts on soil moisture, land pattern stability, and vegetation stress. Flood hazard identification was conducted using photointerpretation techniques in three study areas along the Mississippi River using pre-flood ERTS-1 imagery down to 1:100,000 scale. Flood-prone area boundaries obtained from ERTS-1 were generally in agreement with flood hazard maps produced by the U.S. Army Corps of Engineers and the U.S. Geological Survey, although the latter are somewhat more detailed because of their larger scale. Initial results indicate that ERTS-1 digital mapping of flood-prone areas can be performed at scales of at least 1:62,500, which is comparable to conventional flood hazard map scales.

  6. [THE STUDY OF MANIFESTATIONS OF ENVIRONMENTAL HAZARDS AT THE REGIONAL LEVEL].

    PubMed

    Shmandiy, V M; Kharlamova, E V; Rugas, T E

    2015-01-01

The methodological approaches elaborated for monitoring the state of ecological safety are based on systems analysis of the conditions and consistent patterns of formation of ecological hazards and on the search for effective means and methods of safety management. Ecological hazard is considered as a hierarchical structure consisting of types, classes, species and subspecies. In industrially developed regions the most significant are the technogenic and sociogenic classes. The sociogenic class of hazard was shown to be primary in its formation: because the level of environmental awareness is largely determined by the degree of impact on human health and the environment, manifestations of hazards of the other classes depend on it. An anthropocentric approach was applied when analyzing the state of hazard. A set of characteristics reflecting the health status of the population of a given territory under the influence of environmental hazard factors was used. On the basis of a synthesis of literature data and the results of our own observations, a generalized index of the state of the population's health in socio-economic areas was suggested, and zones with differing levels of technogenic load, as well as rural areas beyond the reach of technogenic impacts, were selected. The results of the studies demonstrated a relationship between the level of environmental hazard and the state of the population's health in various socio-economic zones. PMID:26856150

  7. Examination of Icing Induced Loss of Control and Its Mitigations

    NASA Technical Reports Server (NTRS)

    Reehorst, Andrew L.; Addy, Harold E., Jr.; Colantonio, Renato O.

    2010-01-01

Factors external to the aircraft are often a significant causal factor in loss of control (LOC) accidents. In today's aviation world, very few accidents stem from a single cause; they typically have a number of causal factors that culminate in a LOC accident. Very often the "trigger" that initiates an accident sequence is an external environmental factor. In a recent NASA statistical analysis of LOC accidents, aircraft icing was shown to be the most common external environmental LOC causal factor for scheduled operations. When investigating LOC accidents or incidents, aircraft icing causal factors can be categorized into 1) in-flight encounter with supercooled liquid water clouds, 2) take-off with ice contamination, or 3) in-flight encounter with high concentrations of ice crystals. As with other flight hazards, icing-induced LOC accidents can be prevented through avoidance, detection, and recovery mitigations. For icing hazards, avoidance can take the form of avoiding flight into icing conditions or avoiding the hazard of icing by making the aircraft tolerant to icing conditions. Icing detection mitigations can take the form of detecting icing conditions or detecting early performance degradation caused by icing. Recovery from icing-induced LOC requires flight crew or automated systems capable of accounting for reduced aircraft performance and degraded control authority during the recovery maneuvers. In this report we review the icing-induced LOC accident mitigations defined in a recent LOC study and, for each mitigation, describe a research topic required to enable or strengthen the mitigation. Many of these research topics are already included in ongoing or planned NASA icing research activities or are being addressed by members of the icing research community. These research activities are described, and the status of the ongoing or planned research to address the technology needs is discussed.

  8. Mitigation potential of horizontal ground coupled heat pumps for current and future climatic conditions: UK environmental modelling and monitoring studies

    NASA Astrophysics Data System (ADS)

    García González, Raquel; Verhoef, Anne; Vidale, Pier Luigi; Gan, Guohui; Wu, Yupeng; Hughes, Andrew; Mansour, Majdi; Blyth, Eleanor; Finch, Jon; Main, Bruce

    2010-05-01

An increased uptake of alternative low or non-CO2 emitting energy sources is one of the key priorities for policy makers to mitigate the effects of environmental change. Relatively little work has been undertaken on the mitigation potential of Ground Coupled Heat Pumps (GCHPs) despite the fact that a GCHP could significantly reduce CO2 emissions from heating systems. It is predicted that under climate change the most probable scenario is for UK temperatures to increase and for winter rainfall to become more abundant; the latter is likely to cause a general rise in groundwater levels. Summer rainfall may reduce considerably, while vegetation type and density may change. Furthermore, recent studies underline the likelihood of an increase in the number of heat waves. Under such a scenario, GCHPs will increasingly be used for cooling as well as heating. These factors will affect long-term performance of horizontal GCHP systems and hence their economic viability and mitigation potential during their life span (~50 years). The seasonal temperature differences encountered in soil are harnessed by GCHPs to provide heating in the winter and cooling in the summer. The performance of a GCHP system will depend on technical factors (heat exchanger (HE) type, length, depth, and spacing of pipes), but also it will be determined to a large extent by interactions between the below-ground parts of the system and the environment (atmospheric conditions, vegetation and soil characteristics). Depending on the balance between extraction and rejection of heat from and to the ground, the soil temperature in the neighbourhood of the HE may fall or rise. The GROMIT project (GROund coupled heat pumps MITigation potential), funded by the Natural Environment Research Council (UK), is a multi-disciplinary research project, in collaboration with EarthEnergy Ltd., which aims to quantify the CO2 mitigation potential of horizontal GCHPs. It considers changing environmental conditions and combines

  9. Effects of anthropogenic land-subsidence on river flood hazard: a case study in Ravenna, Italy

    NASA Astrophysics Data System (ADS)

    Carisi, Francesca; Domeneghetti, Alessio; Castellarin, Attilio

    2015-04-01

Can differential land-subsidence significantly alter river flooding dynamics, and thus flood risk, in flood-prone areas? Many studies show how the lowering of coastal areas is closely related to an increase in flood hazard due to greater tidal flooding and sea level rise. On the contrary, the literature on the relationship between differential land-subsidence and possible alterations to the riverine flood hazard of inland areas is still sparse, even though several areas characterized by significant land-subsidence rates during the second half of the 20th century experienced an intensification in both inundation magnitude and frequency. This study investigates the possible impact of a significant differential ground lowering on flood hazard in proximity of Ravenna, which is one of the oldest Italian cities, former capital of the Western Roman Empire, located a few kilometers from the Adriatic coast and about 60 km south of the Po River delta. The rate of land-subsidence in the area, naturally in the order of a few mm/year, dramatically increased up to 110 mm/year after World War II, primarily due to groundwater pumping and a number of deep onshore and offshore gas production platforms. The subsidence caused in the last century a cumulative drop larger than 1.5 m in the historical center of the city. Starting from this evidence and taking advantage of a recent digital elevation model with 10 m resolution, we reconstructed the ground elevation in 1897 for an area of about 65 km2 around the city of Ravenna. We used these two digital elevation models (i.e. current topography and topographic reconstruction) and a 2D finite-element numerical model to simulate the inundation dynamics associated with several levee-failure scenarios along the embankment system of the Montone River. For each scenario and digital elevation model, the flood hazard is quantified in terms of water depth, speed and dynamics of the flooding front. The comparison enabled us to

  10. Study of the radiated energy loss during massive gas injection mitigated disruptions on EAST

    NASA Astrophysics Data System (ADS)

    Duan, Y. M.; Hao, Z. K.; Hu, L. Q.; Wang, L.; Xu, P.; Xu, L. Q.; Zhuang, H. D.

    2015-08-01

MGI-mitigated disruption experiments were carried out on EAST with a new fast gas valve in 2012. Different amounts of He, or of a 99% He + 1% Ar gas mixture, were injected into the plasma separately during the current flat-top phase and the current ramp-down phase. The initial results of the MGI experiments are described. The MGI system and the radiation measurement system are briefly introduced. The characteristics of the radiation distribution and the radiated energy loss are analyzed. About 50% of the stored thermal energy Wdia is dissipated by radiation during the entire disruption process, and the C and Li impurities from the plasma-facing components (PFCs) play important roles in the radiated energy loss. The amount of injected gas can affect the pre-TQ phase. Strong poloidal asymmetry of the radiation begins to appear in the CQ phase, possibly caused by changes in the plasma configuration as a result of the vertical displacement event (VDE). No toroidal radiation asymmetry has been observed so far.

  11. Study on FPGA SEU Mitigation for the Readout Electronics of DAMPE BGO Calorimeter in Space

    NASA Astrophysics Data System (ADS)

    Shen, Zhongtao; Feng, Changqing; Gao, Shanshan; Zhang, Deliang; Jiang, Di; Liu, Shubin; An, Qi

    2015-06-01

The BGO calorimeter, which provides a wide measurement range of the primary cosmic ray spectrum, is a key sub-detector of the Dark Matter Particle Explorer (DAMPE). The readout electronics of the calorimeter consist of 16 Actel ProASIC Plus flash-based FPGAs, whose design-level flip-flops and embedded block RAMs are sensitive to single event upsets (SEUs) in the harsh space environment. Therefore, to comply with radiation hardness assurance (RHA), SEU mitigation methods, including partial triple modular redundancy (TMR), CRC checksums, and multi-domain reset, were analyzed and verified in a heavy-ion beam test. Using multi-level redundancy, an FPGA design with SEU tolerance and low resource consumption was implemented for the readout electronics.
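    The mitigation techniques themselves are easy to illustrate outside the FPGA fabric. Below is a purely conceptual Python model of bit-level triple modular redundancy voting and a CRC-32 check over a memory block; the flight design implements these in FPGA logic, and the register values here are arbitrary.

```python
import zlib

def tmr_vote(a: int, b: int, c: int) -> int:
    """Bitwise 2-of-3 majority vote across three redundant register copies."""
    return (a & b) | (a & c) | (b & c)

def crc_ok(payload: bytes, stored_crc: int) -> bool:
    """Detect SEU-corrupted memory blocks by recomputing a CRC-32 checksum."""
    return zlib.crc32(payload) & 0xFFFFFFFF == stored_crc

# One copy of a 16-bit register suffers a single-bit upset; the vote masks it.
golden = 0b1010_0011_1100_0101
copies = [golden, golden ^ 0b0000_0100_0000_0000, golden]  # bit 10 flipped
assert tmr_vote(*copies) == golden

block = bytes(range(64))
assert crc_ok(block, zlib.crc32(block) & 0xFFFFFFFF)
```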

  12. First-phase study design for the US Navy Radon Assessment and Mitigation Program (NAVRAMP)

    SciTech Connect

    Gammage, R.B.; Wilson, D.L.; Dudney, C.S.; Matthews, T.G.

    1990-01-01

    In 1988, the Navy initiated a multi-year program for the assessment and mitigation of radon inside buildings at its worldwide distribution of bases. During the first two years of the program, a survey is being made of indoor radon levels in residences occupied by Navy personnel and their dependents. In addition, a small random sample of other structures is being monitored for elevated radon. Passive alpha-track detectors, numbering about 25,000, are being used as monitoring devices. A substantial fraction of the monitors (20%) are being used for quality assurance. Data management programs have been developed to record the chain of custody of the monitors and handle the associated questionnaire data. Program objectives and implementation emphasize quality assurance, records maintenance and monitor placement and retrieval. 5 refs., 2 tabs.

  13. Effectiveness of protected areas in mitigating fire within their boundaries: case study of Chiapas, Mexico.

    PubMed

    Román-Cuesta, María Rosa; Martínez-Vilalta, Jordi

    2006-08-01

    Since the severe 1982-1983 El Niño drought, recurrent burning has been reported inside tropical protected areas (TPAs). Despite the key role of fire in habitat degradation, little is known about the effectiveness of TPAs in mitigating fire incidence and burned areas. We used a GPS fire database (1995-2005) (n=3590 forest fires) obtained from the National Forest Commission to compare fire incidence (number of fires) and burned areas inside TPAs and their surrounding adjacent buffer areas in Southern Mexico (Chiapas). Burned areas inside parks ranged from 2% (Palenque) to 45% (Lagunas de Montebello) of a park's area, and the amount burned was influenced by two severe El Niño events (1998 and 2003). These two years together resulted in 67% and 46% of the total area burned in TPAs and buffers, respectively during the period under analysis. Larger burned areas in TPAs than in their buffers were exclusively related to the extent of natural habitats (flammable area excluding agrarian and pasture lands). Higher fuel loads together with access and extinction difficulties were likely behind this trend. A higher incidence of fire in TPAs than in their buffers was exclusively related to anthropogenic factors such as higher road densities and agrarian extensions. Our results suggest that TPAs are failing to mitigate fire impacts, with both fire incidence and total burned areas being significantly higher in the reserves than in adjacent buffer areas. Management plans should consider those factors that facilitate fires in TPAs: anthropogenic origin of fires, sensitivity of TPAs to El Niñio-droughts, large fuel loads and fuel continuity inside parks, and limited financial resources. Consideration of these factors favors lines of action such as alternatives to the use of fire (e.g., mucuna-maize system), climatic prediction to follow the evolution of El Niño, fuel management strategies that favor extinction practices, and the strengthening of local communities and ecotourism

  14. Studies On The Influence Of Soil Components On Adsorption-Desorption Of Hazardous Organics And Their Insitu Biodegradation

    NASA Astrophysics Data System (ADS)

    Khan, Z.

    2003-12-01

Currently approximately 155 cubic yards of soil is contaminated with hazardous organics at the Patancheru industrial area (Hyderabad, India). These hazardous organic contaminants are frequently part of hazardous waste disposed of on land, and the study of waste-site interaction is key to assessing the potential for offsite and onsite contamination. In the present study the authors report results on the adsorption, soil leaching potential, and persistence of phenol, p-nitrophenol, 2,4-dichlorophenol and 4-chloro-2-nitrophenol, which are common constituents of the hazardous waste generated. The role of soil components such as organic matter, clay, and iron and aluminium oxides in the adsorption capacity has been studied. Desorption isotherms of soil-adsorbed hazardous organics exhibited hysteresis at high initial concentrations, indicating a degree of irreversibility in the adsorption-desorption process. The leaching potential of the hazardous organics decreases with increasing hydrophobicity and soil organic matter content, while their persistence in terms of half-life (DT50) increases. In situ biodegradation has been carried out by developing mixed-culture systems that can degrade the phenols to complete mineralisation by utilizing them as the sole source of carbon, and the corresponding biodegradation kinetic constants were evaluated. Based on the data generated, preparation of hazardous waste dumpsites with a suitable soil surface having a high holding capacity for organics, combined with in situ biodegradation by mixing with specific bacterial cultures enriched from different soils, can be exploited as a cost-effective technology for reclamation of contaminated sites.
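    Assuming first-order degradation kinetics, the persistence figures reported as DT50 follow directly from the fitted rate constant. A short worked sketch with hypothetical rate constants (not the study's fitted values):

```python
import math

def dt50_from_k(k_per_day: float) -> float:
    """Half-life (days) for first-order decay C(t) = C0 * exp(-k t)."""
    return math.log(2.0) / k_per_day

def fraction_remaining(k_per_day: float, days: float) -> float:
    """Fraction of the initial concentration left after a given time."""
    return math.exp(-k_per_day * days)

# Hypothetical first-order rate constants (1/day) for two phenols.
for name, k in [("phenol", 0.173), ("2,4-dichlorophenol", 0.035)]:
    print(f"{name:20s} DT50 = {dt50_from_k(k):5.1f} d, "
          f"remaining after 30 d = {fraction_remaining(k, 30):.1%}")
```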

  15. Study on the Application of Probabilistic Tsunami Hazard Analysis for the Nuclear Power Plant Site in Korean Peninsula

    NASA Astrophysics Data System (ADS)

    Rhee, H. M.; Kim, M.; Sheen, D. H.; Choi, I. K.

    2014-12-01

The need for tsunami hazard assessment at Nuclear Power Plant (NPP) sites has been recognized since the Fukushima event of 2011. It is particularly emphasized in Korea because all of the NPPs on the Korean Peninsula are located in coastal regions. The tsunami hazard is expressed as the annual exceedance probability of wave heights. The methodology for tsunami hazard analysis is based on seismic hazard analysis. Seismic hazard analysis has been performed using both deterministic and probabilistic methods. Recently, the probabilistic method has received more attention than the deterministic method because the uncertainties of the hazard analysis can be treated through the logic tree approach. In this study, a probabilistic tsunami hazard analysis for the Uljin NPP site was performed using the fault source information published by the Atomic Energy Society of Japan (AESJ). The wave parameter is the main element that distinguishes tsunami hazard from seismic hazard; it is estimated from the results of tsunami propagation analysis. TSUNAMI_ver1.0, developed by the Japan Nuclear Energy Safety Organization (JNES), was used for the tsunami simulations. Eighty tsunami simulation cases were performed and the wave parameters were estimated. To reduce the sensitivity to the location of individual sampling points, the wave parameters were estimated from groups of sampling points. The probability density function of the tsunami height was computed using the recurrence intervals and the wave parameters, and the exceedance probability distribution was then calculated from the probability density function. The tsunami hazards for the sampling groups were calculated. Fractile curves, which show the uncertainties of the input parameters, were estimated from the hazards using a round-robin algorithm. In general, tsunami hazard analysis focuses on the maximum wave heights, but the minimum wave height should be considered
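    The core of the hazard calculation, turning scenario recurrence rates and simulated wave heights at a sampling point into an annual exceedance curve, can be sketched as follows. The rates and heights are invented for illustration and are not the AESJ fault parameters, and a Poisson occurrence model is assumed.

```python
import numpy as np

# Illustrative sources: annual occurrence rate and simulated wave height (m)
# at one sampling point for each scenario.
rates = np.array([1/500, 1/1000, 1/2000, 1/5000])
heights = np.array([1.2, 2.8, 4.5, 7.0])

levels = np.linspace(0.5, 8.0, 16)  # wave-height thresholds (m)

# Annual rate of exceeding each level = sum of rates of scenarios exceeding it.
exceed_rate = np.array([rates[heights > h].sum() for h in levels])

# Poisson assumption: annual exceedance probability from the rate.
annual_prob = 1.0 - np.exp(-exceed_rate)

for h, p in zip(levels, annual_prob):
    print(f"h > {h:4.1f} m : annual exceedance probability {p:.2e}")
```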

  16. Hazardous Waste

    MedlinePlus

    ... you throw these substances away, they become hazardous waste. Some hazardous wastes come from products in our homes. Our garbage can include such hazardous wastes as old batteries, bug spray cans and paint ...

  17. GIS data for the Seaside, Oregon, Tsunami Pilot Study to modernize FEMA flood hazard maps

    USGS Publications Warehouse

    Wong, Florence L.; Venturato, Angie J.; Geist, Eric L.

    2007-01-01

A Tsunami Pilot Study was conducted for the area surrounding the coastal town of Seaside, Oregon, as part of the Federal Emergency Management Agency's (FEMA) Flood Insurance Rate Map Modernization Program (Tsunami Pilot Study Working Group, 2006). The Cascadia subduction zone extends from Cape Mendocino, California, to Vancouver Island, Canada. The Seaside area was chosen because it is typical of many coastal communities subject to tsunamis generated by far- and near-field (Cascadia) earthquakes. Two goals of the pilot study were to develop probabilistic 100-year and 500-year tsunami inundation maps using Probabilistic Tsunami Hazard Analysis (PTHA) and to provide recommendations for improving tsunami hazard assessment guidelines for FEMA and state and local agencies. The study was an interagency effort by the National Oceanic and Atmospheric Administration, U.S. Geological Survey, and FEMA, in collaboration with the University of Southern California, Middle East Technical University, Portland State University, Horning Geoscience, Northwest Hydraulics Consultants, and the Oregon Department of Geological and Mineral Industries. The pilot study model data and results are published separately as a geographic information systems (GIS) data report (Wong and others, 2006). The flood maps and GIS data are briefly described here.

  18. Towards the Seismic Hazard Reassessment of Paks NPP (Hungary) Site: Seismicity and Sensitivity Studies

    NASA Astrophysics Data System (ADS)

    Toth, Laszlo; Monus, Peter; Gyori, Erzsebet; Grenerczy, Gyula; Janos Katona, Tamas; Kiszely, Marta

    2015-04-01

reviews, and hazard characterization of the site has been confirmed. The hazard curves have been extended to lower probability events, as it is required by the probabilistic safety analysis. These earlier projects resulted in 0.22-0.26 g and 0.43-0.54 g mean PGA at 10^4 and 10^5 year return periods. The site effect and liquefaction probability have also been evaluated. As it is expected for the site of soft soil conditions, the amplification is greater at shorter periods for the lower amplitude ground motion of 10^4 year return period compared to the longer periods for the higher amplitude of the 10^5 year level ground motion. Further studies will be based on the improved regional seismotectonic model, state-of-the-art hazard evaluation software, and better knowledge of the local soil conditions. The presented preliminary results can demonstrate the adequacy of the planned program and highlight the progress in the hazard assessment.
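    For reference, the quoted return periods translate into annual exceedance probabilities through a Poisson occurrence assumption, with p approximately 1/T for long return periods. The small sketch below also shows the probability over an arbitrary 50-year window, chosen purely for illustration and not as a statement about the plant's licensing basis.

```python
import math

# Annual exceedance probability for a mean return period T (years), assuming
# Poisson occurrence; for large T this is approximately 1/T.
for T in (1.0e4, 1.0e5):
    p_annual = 1.0 - math.exp(-1.0 / T)
    p_50yr = 1.0 - math.exp(-50.0 / T)   # probability over an arbitrary 50-year window
    print(f"T = {T:.0e} yr : annual {p_annual:.2e}, 50-year {p_50yr:.2e}")
```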

  19. Hazardous-Materials Robot

    NASA Technical Reports Server (NTRS)

    Stone, Henry W.; Edmonds, Gary O.

    1995-01-01

    Remotely controlled mobile robot used to locate, characterize, identify, and eventually mitigate incidents involving hazardous-materials spills/releases. Possesses number of innovative features, allowing it to perform mission-critical functions such as opening and unlocking doors and sensing for hazardous materials. Provides safe means for locating and identifying spills and eliminates risks of injury associated with use of manned entry teams. Current version of vehicle, called HAZBOT III, also features unique mechanical and electrical design enabling vehicle to operate safely within combustible atmosphere.

  20. A Hazard Assessment and Proposed Risk Index for Art, Architecture, Archive and Artifact Protection: Case Studies for Assorted International Museums

    NASA Astrophysics Data System (ADS)

    Kirk, Clara J.

This study proposes a hazard/risk index for environmental, technological, and social hazards that may threaten a museum or other place of cultural storage and accession. The index can be implemented to measure the risk at the locations of these storage facilities in relation to their geologic, geographic, environmental, and social settings. A model case study of the 1966 flood of the Arno River and its impact on the city of Florence and the Uffizi Gallery was used as the index focus. From this focus, an additional eleven museums and their related risk were assessed. Each index addressed a diverse range of hazards based on past frequency and magnitude. It was found that locations nearest a hazard had exceptionally high levels of risk; however, more distant locations could have influences that would increase their risk to levels similar to those of locations near the hazard. Locations not normally associated with a given natural hazard can be susceptible should the right conditions be met, and this research identified, compiled, and assessed the factors found to influence natural hazard risk at these research sites.
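    One plausible way to form such a composite index is a frequency-magnitude weighted sum over hazard types, modified by site exposure. The hazards, scales and weights below are invented to illustrate the arithmetic and are not the index actually proposed in the thesis.

```python
# Each hazard scored 0-5 for expected frequency and potential magnitude at a
# given storage location; an exposure factor modifies its contribution.
hazards = {
    "river flood":  {"frequency": 4, "magnitude": 5, "exposure": 1.0},
    "earthquake":   {"frequency": 2, "magnitude": 4, "exposure": 0.8},
    "fire":         {"frequency": 3, "magnitude": 4, "exposure": 0.9},
    "civil unrest": {"frequency": 1, "magnitude": 3, "exposure": 0.5},
}

def risk_index(hazards):
    """Composite site risk: sum of frequency x magnitude x exposure per hazard."""
    return sum(h["frequency"] * h["magnitude"] * h["exposure"]
               for h in hazards.values())

print(f"composite risk index: {risk_index(hazards):.1f}")
```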

  1. Prediction of Ungauged River Basin for Hydro Power Potential and Flood Risk Mitigation; a Case Study at Gin River, Sri Lanka

    NASA Astrophysics Data System (ADS)

    Ratnayake, A. S.

    2011-12-01

    floodplains. Thus, modern GIS technology has been productively applied to prepare hazard maps based on the flood modeling, and these can be further utilized for disaster preparedness and mitigation activities. Five suitable hydraulic heads were identified for mini-hydro power sites, which would be the most economical and applicable flood-controlling hydraulic engineering structures considering all morphologic, climatic, environmental, and socioeconomic proxies of the study area. The mini-hydro power sites would also serve as a clean, eco-friendly, and reliable energy source (8630.0 kW). Finally, the Francis turbine can be employed as the most efficient turbine for the selected sites, bearing in mind both technical and economic parameters.

  2. An evaluation of soil erosion hazard: A case study in Southern Africa using geomatics technologies

    NASA Astrophysics Data System (ADS)

    Eiswerth, Barbara Alice

    Accelerated soil erosion in Malawi, Southern Africa, increasingly threatens agricultural productivity, given current and projected population growth trends. Previous attempts to document soil erosion potential have had limited success, lacking appropriate information and diagnostic tools. This study utilized geomatics technologies and the latest available information from topography, soils, climate, vegetation, and land use of a watershed in southern Malawi. The Soil Loss Estimation Model for Southern Africa (SLEMSA), developed for conditions in Zimbabwe, was evaluated and used to create a soil erosion hazard map for the watershed under Malawi conditions. The SLEMSA sub-models of cover, soil loss, and topography were computed from energy interception; rainfall energy and soil erodibility; and slope length and steepness, respectively. Geomatics technologies including remote sensing and Geographic Information Systems (GIS) provided the tools with which land cover/land use, a digital elevation model, and slope length and steepness were extracted and integrated with rainfall and soils spatial information. Geomatics technologies enable rapid update of the model as new and better data sets become available. Sensitivity analyses of the SLEMSA model revealed that rainfall energy and slope steepness have the greatest influence on soil erosion hazard estimates in this watershed. Energy interception was intermediate in sensitivity level, whereas slope length and soil erodibility ranked lowest. Energy interception and soil erodibility were shown by parameter behavior analysis to behave in a linear fashion with respect to soil erosion hazard, whereas rainfall energy, slope steepness, and slope length exhibit non-linear behavior. When SLEMSA input parameters and results were compared to alternative methods of soil erosion assessment, such as drainage density and drainage texture, the model provided more spatially explicit information using 30 meter grid cells. Results of this
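
    As a reading aid for the multiplicative structure described above, the sketch below combines three illustrative sub-models (bare-soil loss from rainfall energy and erodibility, cover from energy interception, topography from slope length and steepness) into a per-cell hazard value on 30 m grids. The sub-model functions and all numbers are hypothetical placeholders, not the published SLEMSA calibrations.

    ```python
    # Illustrative sketch only: the sub-model forms below are placeholders,
    # not the SLEMSA calibration equations.
    import numpy as np

    def soil_loss_K(rain_energy, erodibility):
        """Placeholder bare-soil loss sub-model (hypothetical form)."""
        return np.exp(0.5 * np.log(rain_energy) - erodibility)

    def cover_C(energy_interception_pct):
        """Placeholder cover sub-model: more interception -> less erosion."""
        return np.exp(-0.06 * energy_interception_pct)

    def topography_X(slope_length_m, slope_pct):
        """Placeholder topographic sub-model."""
        return np.sqrt(slope_length_m) * (1.0 + 0.05 * slope_pct)

    def erosion_hazard(rain_energy, erodibility, interception, length, slope):
        """Per-cell erosion hazard as the product of the three sub-models."""
        return (soil_loss_K(rain_energy, erodibility)
                * cover_C(interception)
                * topography_X(length, slope))

    # Example: rasters as 2-D arrays (one value per 30 m grid cell), assumed units.
    shape = (4, 4)
    Z = erosion_hazard(np.full(shape, 250.0),   # rainfall energy
                       np.full(shape, 0.4),     # soil erodibility index
                       np.full(shape, 30.0),    # energy interception (%)
                       np.full(shape, 30.0),    # slope length (m)
                       np.full(shape, 12.0))    # slope steepness (%)
    print(Z.round(2))
    ```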

  3. The newest achievements of studies on the reutilization, treatment, and disposal technology of hazardous wastes

    SciTech Connect

    Liu Peizhe

    1996-12-31

    From 1991 to 1996, key studies on the reutilization, treatment, and disposal technology of hazardous wastes have been incorporated into the national plan for environmental protection science and technology. At present, the research achievements have been accomplished, have passed national approval, and have been accepted. The author of this paper, as leader of the national group for this research work, expounds the newest achievements of the studies involving four parts: (1) the reutilization technology of electroplating sludge, including the ion-exchange process for recovering the sludge and waste liquor for producing chromium tanning agent and extracting chromium and colloidal protein from tanning waste residue; the recovery of heavy metals from the electroplating waste liquor with microbic purification; the demonstration project of producing modified plastics from the sludge and the waste plastics; and the demonstration of the recovery of heavy metals from waste electroplating sludge by using the ammonia-leaching process; (2) the demonstrative research of reutilization technology of chromium waste residues, including production of self-melting ore and smelting of chromium-containing pig iron, and pyrolytic detoxification of the residue with a cyclone furnace; (3) the incineration technology of hazardous wastes, with successful results of the industrial incinerator system for polychlorinated biphenyls; and (4) the safety landfill technology for disposal of hazardous wastes, with a complete set of technology for pretreatment, selection of the site, development of the antipercolating materials, and design and construction of the landfill. Only a part of the achievements is introduced in this paper, most of which has been built and is being operated for demonstration to further spread its application and accumulate experience. 6 refs., 7 figs., 6 tabs.

  4. Implications of Adhesion Studies for Dust Mitigation on Thermal Control Surfaces

    NASA Technical Reports Server (NTRS)

    Gaier, James R.; Berkebile, Stephen P.

    2012-01-01

    Experiments measuring the adhesion forces under ultrahigh vacuum conditions (10^-10 torr) between a synthetic volcanic glass and commonly used space exploration materials have recently been described. The glass has a chemistry and surface structure typical of the lunar regolith. It was found that Van der Waals forces between the glass and common spacecraft materials were negligible. Charge transfer between the materials was induced by mechanically striking the spacecraft material pin against the glass plate. No measurable adhesion occurred when striking the highly conducting materials; however, on striking insulating dielectric materials, the adhesion increased dramatically. This indicates that electrostatic forces dominate over Van der Waals forces under these conditions. The presence of small amounts of surface contaminants was found to lower adhesive forces by at least two orders of magnitude, and perhaps more. Both particle and space exploration material surfaces will be cleaned by the interaction with the solar wind and other energetic processes and will stay clean because of the extremely high vacuum (10^-12 torr), so the atomically clean adhesion values are probably the relevant ones for the lunar surface environment. These results are used to interpret the results of dust mitigation technology experiments utilizing textured surfaces, work-function-matching surfaces, and brushing. They have also been used to reinterpret the results of the Apollo 14 Thermal Degradation Samples experiment.

  5. Macroscopic to microscopic studies of flue gas desulfurization byproducts for acid mine drainage mitigation

    SciTech Connect

    Robbins, E.I.; Kalyoncu, R.S.; Finkelman, R.B.; Matos, G.R.; Barsotti, A.F.; Haefner, R.J.; Rowe, G.L. Jr.; Savela, C.E.; Eddy, J.I.

    1996-12-31

    The use of flue gas desulfurization (FGD) systems to reduce SO2 emissions has resulted in the generation of large quantities of byproducts. These and other byproducts are being stockpiled at the very time that alkaline materials having high neutralization potential are needed to mitigate acid mine drainage (AMD). FGD byproducts are highly alkaline materials composed primarily of unreacted sorbents (lime or limestone and sulfates and sulfites of Ca). The American Coal Ash Association estimated that approximately 20 million tons of FGD material were generated by electric power utilities equipped with wet lime-limestone FGD systems in 1993. Less than 5% of this material has been put to beneficial use for agricultural soil amendments and for the production of wallboard and cement. Four USGS projects are examining FGD byproduct use to address these concerns. These projects involve (1) calculating the volume of FGD byproduct generation and its geographic locations in relation to AMD, (2) determining byproduct chemistry and mineralogy, (3) evaluating the hydrology and geochemistry of atmospheric fluidized bed combustion byproduct as a soil amendment in Ohio, and (4) analyzing microbial degradation of gypsum in anoxic limestone drains in West Virginia.

  6. Numerical study of potential heat flux mitigation effects in the TCV snowflake divertor

    NASA Astrophysics Data System (ADS)

    Lunt, T.; Canal, G. P.; Duval, B. P.; Feng, Y.; Labit, B.; McCarthy, P.; Reimerdes, H.; Vijvers, W. A. J.; Wischmeier, M.

    2016-04-01

    We report on EMC3-Eirene simulations of the plasma and neutral particle transport in the TCV boundary layer for a series of snowflake (SF) equilibria characterized by the normalized poloidal flux coordinate ρ_x2 of the secondary X-point x2. We refer to a snowflake plus (SF+) for ρ_x2 < 1, a snowflake minus (SF-) for ρ_x2 > 1, and a single-null (SN) for |ρ_x2 - 1| >> 0. Four effects are identified that have the potential to mitigate the heat flux density at the outer strike point in an LFS SF-, where x2 is located on the low-field side of the primary X-point x1: (1) scrape-off layer heat flux splitting, (2) an impurity radiation cloud forming at x2, (3) the increased connection length to the outer target, and (4) increased transport between x1 and x2. The LFS SF- is thus expected to tolerate a larger power flux P_sep over the separatrix than a comparable SN configuration.

  7. Viscoelastic Materials Study for the Mitigation of Blast-Related Brain Injury

    NASA Astrophysics Data System (ADS)

    Bartyczak, Susan; Mock, Willis, Jr.

    2011-06-01

    Recent preliminary research into the causes of blast-related brain injury indicates that exposure to blast pressures, such as from IED detonation or multiple firings of a weapon, causes damage to brain tissue resulting in Traumatic Brain Injury (TBI) and Post Traumatic Stress Disorder (PTSD). Current combat helmets are not sufficient to protect the warfighter from this danger, and the effects are debilitating, costly, and long-lasting. Commercially available viscoelastic materials, designed to dampen vibration caused by shock waves, might be useful as helmet liners to dampen blast waves. The objective of this research is to develop an experimental technique to test these commercially available materials when subjected to blast waves and to evaluate their blast-mitigating behavior. A 40-mm-bore gas gun is being used as a shock tube to generate blast waves (ranging from 1 to 500 psi) in a test fixture at the gun muzzle. A fast-opening valve is used to release nitrogen gas from the breech to impact instrumented targets. The targets consist of aluminum/viscoelastic polymer/aluminum materials. Blast attenuation is determined through the measurement of pressure and accelerometer data in front of and behind the target. The experimental technique, calibration and checkout procedures, and results will be presented.

  8. Study of cover source mismatch in steganalysis and ways to mitigate its impact

    NASA Astrophysics Data System (ADS)

    Kodovský, Jan; Sedighi, Vahid; Fridrich, Jessica

    2014-02-01

    When a steganalysis detector trained on one cover source is applied to images from a different source, the detection error generally increases due to the mismatch between both sources. In steganography, this situation is recognized as the so-called cover source mismatch (CSM). The drop in detection accuracy depends on many factors, including the properties of both sources, the detector construction, the feature space used to represent the covers, and the steganographic algorithm. Although well recognized as the single most important factor negatively affecting the performance of steganalyzers in practice, the CSM has received surprisingly little attention from researchers. One of the reasons for this is the diversity with which the CSM can manifest. In a series of experiments in the spatial and JPEG domains, we refute some of the common misconceptions that the severity of the CSM is tied to the feature dimensionality or their "fragility." The CSM impact on detection appears too difficult to predict due to the effect of complex dependencies among the features. We also investigate ways to mitigate the negative effect of the CSM using simple measures, such as by enlarging the diversity of the training set (training on a mixture of sources) and by employing a bank of detectors trained on multiple different sources and testing on a detector trained on the closest source.
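
    The two mitigation measures mentioned above (training on a mixture of sources, and a bank of per-source detectors with closest-source selection) can be illustrated with a toy sketch on synthetic data. The ridge-regression detector and the mean-feature distance used below are illustrative choices, not the detectors or features used in the paper.

    ```python
    # Toy sketch on synthetic data; the classifier and distance measure are
    # illustrative assumptions, not the steganalysis features of the paper.
    import numpy as np

    rng = np.random.default_rng(0)

    def make_source(center, n=200, dim=8):
        """Synthetic 'cover source': features plus a label (0=cover, 1=stego)."""
        X = rng.normal(center, 1.0, size=(n, dim))
        y = rng.integers(0, 2, size=n)
        X[y == 1] += 0.8            # a crude 'embedding signature'
        return X, y

    def train_ridge(X, y, lam=1e-2):
        """Linear detector w via ridge regression on +/-1 labels."""
        t = 2.0 * y - 1.0
        A = X.T @ X + lam * np.eye(X.shape[1])
        return np.linalg.solve(A, X.T @ t)

    def predict(w, X):
        return (X @ w > 0).astype(int)

    sources = {name: make_source(c) for name, c in
               {"sourceA": 0.0, "sourceB": 2.0, "sourceC": -2.0}.items()}

    # (a) one detector trained on a mixture of sources
    X_mix = np.vstack([X for X, _ in sources.values()])
    y_mix = np.concatenate([y for _, y in sources.values()])
    w_mix = train_ridge(X_mix, y_mix)

    # (b) bank of per-source detectors + closest-source selection
    bank = {n: (train_ridge(X, y), X.mean(axis=0)) for n, (X, y) in sources.items()}
    X_test, y_test = make_source(1.9)        # test source resembling sourceB
    closest = min(bank, key=lambda n: np.linalg.norm(X_test.mean(axis=0) - bank[n][1]))

    acc_mix = (predict(w_mix, X_test) == y_test).mean()
    acc_bank = (predict(bank[closest][0], X_test) == y_test).mean()
    print(f"mixture detector: {acc_mix:.2f}, bank ({closest}): {acc_bank:.2f}")
    ```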

  9. Conforth Ranch Wildlife Mitigation Feasibility Study, McNary, Oregon : Annual Report.

    SciTech Connect

    Rasmussen, Larry; Wright, Patrick; Giger, Richard

    1991-03-01

    The 2,860-acre Conforth Ranch near Umatilla, Oregon is being considered for acquisition and management to partially mitigate wildlife losses associated with McNary Hydroelectric Project. The Habitat Evaluation Procedures (HEP) estimated that management for wildlife would result in habitat unit gains of 519 for meadowlark, 420 for quail, 431 for mallard, 466 for Canada goose, 405 for mink, 49 for downy woodpecker, 172 for yellow warbler, and 34 for spotted sandpiper. This amounts to a total combined gain of 2,495 habitat units -- a 110 percent increase over the existing values for these species combined of 2,274 habitat units. Current water delivery costs, estimated at $50,000 per year, are expected to increase to $125,000 per year. A survey of local interest indicated a majority of respondents favored the concept with a minority opposed. No contaminants that would preclude the Fish and Wildlife Service from agreeing to accept the property were identified. 21 refs., 3 figs., 5 tabs.

  10. First Production of C60 Nanoparticle Plasma Jet for Study of Disruption Mitigation for ITER

    NASA Astrophysics Data System (ADS)

    Bogatu, I. N.; Thompson, J. R.; Galkin, S. A.; Kim, J. S.; Brockington, S.; Case, A.; Messer, S. J.; Witherspoon, F. D.

    2012-10-01

    Unique fast response and large mass-velocity delivery of nanoparticle plasma jets (NPPJs) provide a novel application for ITER disruption mitigation, runaway electron diagnostics, and deep fueling. NPPJs carry a much larger mass than usual gases. An electromagnetic plasma gun provides a very high injection velocity (many km/s). An NPPJ has a much higher ram pressure than any standard gas injection method and penetrates the tokamak confining magnetic field. Assimilation is enhanced due to the large surface-to-volume ratio of the nanoparticles. Radially expanding NPPJs help achieve toroidal uniformity of radiation power. FAR-TECH's NPPJ system was successfully tested: a coaxial plasma gun prototype (~35 cm length, 96 kJ energy) using a solid-state TiH2/C60 pulsed power cartridge injector produced a hyper-velocity (>4 km/s), high-density (>10^23 m^-3) C60 plasma jet in ~0.5 ms, with ~1-2 ms overall response-delivery time. We present the TiH2/C60 cartridge injector output characterization (~180 mg of sublimated C60 gas) and first production results of a high-momentum C60 plasma jet (~0.6 g·km/s).

  11. Climate change and mitigation.

    PubMed

    Nibleus, Kerstin; Lundin, Rickard

    2010-01-01

    Planet Earth has experienced repeated changes of its climate throughout time. Periods warmer than today as well as much colder, during glacial episodes, have alternated. In our time, rapid population growth, with increased demand for natural resources and energy, has made society increasingly vulnerable to environmental changes, both natural and those caused by man; human activity is clearly affecting the radiation balance of the Earth. In the session "Climate Change and Mitigation" the speakers offered four different views on coal and CO2: the basis for life, but also a major hazard with impact on Earth's climate. A common denominator in the presentations was that, more than ever, science and technology are required. We need not only to understand the mechanisms of climate change and climate variability, but also to identify means to remedy the anthropogenic influence on Earth's climate. PMID:20873680

  12. Awareness of occupational hazards and use of safety measures among welders: a cross-sectional study from eastern Nepal

    PubMed Central

    Budhathoki, Shyam Sundar; Singh, Suman Bahadur; Sagtani, Reshu Agrawal; Niraula, Surya Raj; Pokharel, Paras Kumar

    2014-01-01

    Objective The proper use of safety measures by welders is an important way of preventing and/or reducing a variety of health hazards that they are exposed to during welding. Knowledge about hazards and personal protective equipment (PPE) is lacking, and the use of PPE among the welders in Nepal is limited. We designed a study to assess welders' awareness of hazards and PPE, and the use of PPE, among the welders of eastern Nepal, and to find a possible correlation between awareness and use of PPE among them. Materials and methods A cross-sectional study of 300 welders selected by simple random sampling from three districts of eastern Nepal was conducted using a semistructured questionnaire. Data regarding age, education level, duration of employment, awareness of hazards, safety measures and the actual use of safety measures were recorded. Results Overall, 272 (90.7%) welders were aware of at least one hazard of welding and a similar proportion of welders were aware of at least one PPE. However, only 47.7% used one or more types of PPE. Education and duration of employment were significantly associated with the awareness of hazards and of PPE and its use. The welders who reported using PPE during welding were two times more likely to have been aware of hazards (OR=2.52, 95% CI 1.09 to 5.81) and five times more likely to have been aware of PPE compared with the welders who did not report the use of PPE (OR=5.13, 95% CI 2.34 to 11.26). Conclusions The welders using PPE were those who were aware of hazards and PPE. There is a gap between being aware of hazards and PPE (90%) and use of PPE (47%) at work. Further research is needed to identify the underlying factors leading to low utilisation of PPE despite the welders of eastern Nepal being knowledgeable of it. PMID:24889850

  13. Hazard Ranking Methodology for Assessing Health Impacts of Unconventional Natural Gas Development and Production: The Maryland Case Study

    PubMed Central

    Sangaramoorthy, Thurka; Wilson, Sacoby; Nachman, Keeve E.; Babik, Kelsey; Jenkins, Christian C.; Trowell, Joshua; Milton, Donald K.; Sapkota, Amir

    2016-01-01

    The recent growth of unconventional natural gas development and production (UNGDP) has outpaced research on the potential health impacts associated with the process. The Maryland Marcellus Shale Public Health Study was conducted to inform the Maryland Marcellus Shale Safe Drilling Initiative Advisory Commission, State legislators and the Governor about potential public health impacts associated with UNGDP so they could make an informed decision that considers the health and well-being of Marylanders. In this paper, we describe an impact assessment and hazard ranking methodology we used to assess the potential public health impacts for eight hazards associated with the UNGDP process. The hazard ranking included seven metrics: 1) presence of vulnerable populations (e.g. children under the age of 5, individuals over the age of 65, surface owners), 2) duration of exposure, 3) frequency of exposure, 4) likelihood of health effects, 5) magnitude/severity of health effects, 6) geographic extent, and 7) effectiveness of setbacks. Overall public health concern was determined by a color-coded ranking system (low, moderately high, and high) that was generated based on the overall sum of the scores for each hazard. We provide three illustrative examples of applying our methodology for air quality and health care infrastructure which were ranked as high concern and for water quality which was ranked moderately high concern. The hazard ranking was a valuable tool that allowed us to systematically evaluate each of the hazards and provide recommendations to minimize the hazards. PMID:26726918
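
    A minimal sketch of the scoring logic described above: each hazard receives a score for the seven metrics, the scores are summed, and the sum is mapped to a color-coded concern level. The 1-3 metric scales and the cut-points separating low, moderately high, and high are illustrative assumptions, not the thresholds used in the Maryland study.

    ```python
    # Illustrative only: metric scales and cut-points are assumptions.
    METRICS = ["vulnerable_populations", "duration", "frequency",
               "likelihood", "severity", "geographic_extent",
               "setback_effectiveness"]

    def overall_concern(scores, low_cut=11, high_cut=16):
        """Sum the seven metric scores (assumed 1-3 each) and classify."""
        missing = [m for m in METRICS if m not in scores]
        if missing:
            raise ValueError(f"missing metric scores: {missing}")
        total = sum(scores[m] for m in METRICS)
        if total <= low_cut:
            level = "low"
        elif total <= high_cut:
            level = "moderately high"
        else:
            level = "high"
        return total, level

    # Hypothetical scores for one hazard (e.g. air quality):
    example = dict(vulnerable_populations=3, duration=3, frequency=2,
                   likelihood=3, severity=3, geographic_extent=2,
                   setback_effectiveness=3)
    print(overall_concern(example))   # -> (19, 'high')
    ```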

  14. Hazard Ranking Methodology for Assessing Health Impacts of Unconventional Natural Gas Development and Production: The Maryland Case Study.

    PubMed

    Boyle, Meleah D; Payne-Sturges, Devon C; Sangaramoorthy, Thurka; Wilson, Sacoby; Nachman, Keeve E; Babik, Kelsey; Jenkins, Christian C; Trowell, Joshua; Milton, Donald K; Sapkota, Amir

    2016-01-01

    The recent growth of unconventional natural gas development and production (UNGDP) has outpaced research on the potential health impacts associated with the process. The Maryland Marcellus Shale Public Health Study was conducted to inform the Maryland Marcellus Shale Safe Drilling Initiative Advisory Commission, State legislators and the Governor about potential public health impacts associated with UNGDP so they could make an informed decision that considers the health and well-being of Marylanders. In this paper, we describe an impact assessment and hazard ranking methodology we used to assess the potential public health impacts for eight hazards associated with the UNGDP process. The hazard ranking included seven metrics: 1) presence of vulnerable populations (e.g. children under the age of 5, individuals over the age of 65, surface owners), 2) duration of exposure, 3) frequency of exposure, 4) likelihood of health effects, 5) magnitude/severity of health effects, 6) geographic extent, and 7) effectiveness of setbacks. Overall public health concern was determined by a color-coded ranking system (low, moderately high, and high) that was generated based on the overall sum of the scores for each hazard. We provide three illustrative examples of applying our methodology for air quality and health care infrastructure which were ranked as high concern and for water quality which was ranked moderately high concern. The hazard ranking was a valuable tool that allowed us to systematically evaluate each of the hazards and provide recommendations to minimize the hazards. PMID:26726918

  15. Mapping Europe's Seismic Hazard

    NASA Astrophysics Data System (ADS)

    Giardini, Domenico; Wössner, Jochen; Danciu, Laurentiu

    2014-07-01

    From the rift that cuts through the heart of Iceland to the complex tectonic convergence that causes frequent and often deadly earthquakes in Italy, Greece, and Turkey to the volcanic tremors that rattle the Mediterranean, seismic activity is a prevalent and often life-threatening reality across Europe. Any attempt to mitigate the seismic risk faced by society requires an accurate estimate of the seismic hazard.

  16. Vertical Field of View Reference Point Study for Flight Path Control and Hazard Avoidance

    NASA Technical Reports Server (NTRS)

    Comstock, J. Raymond, Jr.; Rudisill, Marianne; Kramer, Lynda J.; Busquets, Anthony M.

    2002-01-01

    Researchers within the eXternal Visibility System (XVS) element of the High-Speed Research (HSR) program developed and evaluated display concepts that will provide the flight crew of the proposed High-Speed Civil Transport (HSCT) with integrated imagery and symbology to permit path control and hazard avoidance functions while maintaining required situation awareness. The challenge of the XVS program is to develop concepts that would permit a no-nose-droop configuration of an HSCT and expanded low-visibility HSCT operational capabilities. This study was one of a series of experiments exploring the 'design space' restrictions for physical placement of an XVS display. The primary experimental issue here was 'conformality' of the forward display vertical position with respect to the side window in simulated flight. 'Conformality' refers to the case in which the horizon and objects appear in the same relative positions when viewed through the forward windows or display and the side windows. This study quantified the effects of visual conformality on pilot flight path control and hazard avoidance performance. Here, conformality related to the positioning and relationship of the artificial horizon line and associated symbology presented on the forward display, and the horizon and associated ground, horizon, and sky textures as they would appear in the real view through a window presented in the side window display. No significant performance consequences were found for the non-conformal conditions.

  17. Delayed geochemical hazard: a tool for risk assessment of heavy metal polluted sites and case study.

    PubMed

    Zheng, Mingxia; Feng, Liu; He, Juanni; Chen, Ming; Zhang, Jiawen; Zhang, Minying; Wang, Jing

    2015-04-28

    A concept of delayed geochemical hazard (DGH) was proposed, instead of "chemical time bomb," to represent an ecological and environmental hazard caused by the sudden reactivation and release of long-term accumulated pollutants in a soil/sediment system due to a change of physicochemical conditions or a decrease of environmental capacity. A DGH model was also established to provide a quantitative tool to assess and predict the potential environmental risk caused by heavy metals and especially its dynamic evolution. A case study of DGH was carried out for a mercury-polluted area in southern China. Results of a soil column experiment showed that DGH resulted directly from the transformation and release of the pollutant from the releasable species to the active ones through a chain-reaction mechanism. The most probable chain reaction was summarized as HgE+C+F+O+R→HgE+C+F+O→HgE+C+F→HgE+C→HgE. Although in 8.3% of the studied area the total releasable content of mercury (TRCPHg) exceeded the DGH critical point value of 16.667 mg/kg, implying the possibility of a DGH burst, the area was classified as low risk of DGH. This confirmed that the DGH model could contribute to the risk assessment and early warning of soil/sediment pollution. PMID:25661167

  18. Hydrogen Hazards Assessment Protocol (HHAP): Approach and Methodology

    NASA Technical Reports Server (NTRS)

    Woods, Stephen

    2009-01-01

    This viewgraph presentation reviews the approach and methodology used to develop an assessment protocol for hydrogen hazards. Included in the presentation are the reasons to perform hazard assessment, the types of hazard assessments that exist, an analysis of hydrogen hazards, and specific information about the Hydrogen Hazards Assessment Protocol (HHAP). The assessment is specifically tailored for hydrogen behavior. The end product of the assessment is a compilation of hazards, mitigations, and associated factors to facilitate decision making and achieve best practice.

  19. Probabilistic Tsunami Hazard Analysis

    NASA Astrophysics Data System (ADS)

    Thio, H. K.; Ichinose, G. A.; Somerville, P. G.; Polet, J.

    2006-12-01

    The recent tsunami disaster caused by the 2004 Sumatra-Andaman earthquake has focused our attention on the hazard posed by large earthquakes that occur under water, in particular subduction zone earthquakes, and the tsunamis that they generate. Even though these kinds of events are rare, the very large loss of life and material destruction caused by this earthquake warrant a significant effort towards the mitigation of the tsunami hazard. For ground motion hazard, Probabilistic Seismic Hazard Analysis (PSHA) has become a standard practice in the evaluation and mitigation of seismic hazard to populations, in particular with respect to structures, infrastructure, and lifelines. Its ability to condense the complexities and variability of seismic activity into a manageable set of parameters greatly facilitates not only the design of effective seismic-resistant buildings but also the planning of infrastructure projects. Probabilistic Tsunami Hazard Analysis (PTHA) achieves the same goal for hazards posed by tsunami. There are great advantages to implementing such a method to evaluate the total risk (seismic and tsunami) to coastal communities. The method that we have developed is based on the traditional PSHA and is therefore completely consistent with standard seismic practice. Because of the strong dependence of tsunami wave heights on bathymetry, we use full tsunami waveform computations in lieu of the attenuation relations that are common in PSHA. By pre-computing and storing the tsunami waveforms at points along the coast generated for sets of subfaults that comprise larger earthquake faults, we can efficiently synthesize tsunami waveforms for any slip distribution on those faults by summing the individual subfault tsunami waveforms (weighted by their slip). This efficiency makes it feasible to use Green's function summation in lieu of attenuation relations to provide very accurate estimates of tsunami height for probabilistic calculations, where one typically computes
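
    A minimal sketch of the Green's function summation described above: tsunami waveforms precomputed at a coastal point for unit slip on each subfault are scaled by the scenario slip and summed. The waveforms and slip values below are synthetic placeholders, not real precomputed waveforms.

    ```python
    # Synthetic placeholders stand in for precomputed unit-slip waveforms.
    import numpy as np

    def scenario_waveform(unit_waveforms, slip):
        """
        unit_waveforms: (n_subfaults, n_time) tsunami time series at one coastal
                        point, each computed for 1 m of slip on that subfault.
        slip:           (n_subfaults,) slip (m) of the earthquake scenario.
        Returns the synthesized waveform at that point (linear superposition).
        """
        return np.asarray(slip) @ np.asarray(unit_waveforms)

    # Synthetic example: 3 subfaults, 600 time samples.
    t = np.linspace(0, 3600, 600)                      # seconds
    unit = np.array([np.exp(-((t - t0) / 300.0) ** 2)  # toy unit-slip pulses
                     for t0 in (1200, 1500, 1800)])
    slip = [2.0, 5.0, 1.0]                             # scenario slip (m)
    eta = scenario_waveform(unit, slip)
    print(f"peak amplitude at this point: {eta.max():.2f} (arbitrary units)")
    ```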

  20. Concerns About Climate Change Mitigation Projects: Summary of Findings from Case Studies in Brazil, India, Mexico, and South Africa

    SciTech Connect

    Sathaye, Jayant A.; Andrasko, Kenneth; Makundi, Willy; La Rovere, Emilio Lebre; Ravinandranath, N.H.; Melli, Anandi; Rangachari, Anita; Amaz, Mireya; Gay, Carlos; Friedmann, Rafael; Goldberg, Beth; van Horen, Clive; Simmonds, Gillina; Parker, Gretchen

    1998-11-01

    The concept of joint implementation as a way to implement climate change mitigation projects in another country has been controversial ever since its inception. Developing countries have raised numerous issues at the project-specific technical level, and broader concerns having to do with equity and burden sharing. This paper summarizes the findings of studies for Brazil, India, Mexico and South Africa, four countries that have large greenhouse gas emissions and are heavily engaged in the debate on climate change projects under the Kyoto Protocol. The studies examine potential or current projects/programs to determine whether eight technical concerns about joint implementation can be adequately addressed. They conclude that about half the concerns were minor or well managed by project developers, but concerns about additionality of funds, host country institutions and guarantees of performance (including the issues of baselines and possible leakage) need much more effort to be adequately addressed. All the papers agree on the need to develop institutional arrangements for approving and monitoring such projects in each of the countries represented. The case studies illustrate that these projects have the potential to bring new technology, investment, employment and ancillary socioeconomic and environmental benefits to developing countries. These benefits are consistent with the goal of sustainable development in the four study countries. At a policy level, the studies' authors note that in their view, the Annex I countries should consider limits on the use of jointly implemented projects as a way to get credits against their own emissions at home, and stress the importance of industrialized countries developing new technologies that will benefit all countries. The authors also observe that if all countries accepted caps on their emissions (with a longer time period allowed for developing countries to do so) project-based GHG mitigation would be significantly

  1. Application of a Data Mining Model and It's Cross Application for Landslide Hazard Analysis: a Case Study in Malaysia

    NASA Astrophysics Data System (ADS)

    Pradhan, Biswajeet; Lee, Saro; Shattri, Mansor

    This paper deals with landslide hazard analysis and cross-application using Geographic Information System (GIS) and remote sensing data for Cameron Highland, Penang Island, and Selangor in Malaysia. The aim of this study was to cross-apply and verify a spatial probabilistic model for landslide hazard analysis. Landslide locations were identified in the study area from interpretation of aerial photographs and field surveys. Topographical/geological data and satellite images were collected and processed using GIS and image processing tools. Ten landslide-inducing parameters were considered for the landslide hazard analysis. These parameters are topographic slope, aspect, curvature, and distance from drainage, all derived from the topographic database; geology and distance from lineament, derived from the geologic database; land use from Landsat satellite images; soil from the soil database; precipitation amount, derived from the rainfall database; and the vegetation index value from SPOT satellite images. These factors were analyzed using an artificial neural network model to generate the landslide hazard map. Each factor's weight was determined by the back-propagation training method. Then the landslide hazard indices were calculated using the trained back-propagation weights, and finally the landslide hazard map was generated using GIS tools. Landslide hazard maps were drawn for these three areas using the artificial neural network model derived not only from the data for that area but also using the parameter weights calculated from each of the other two areas (nine maps in all), as a cross-check of the validity of the method. For verification, the results of the analyses were compared, in each study area, with actual landslide locations. The verification results showed sufficient agreement between the presumptive hazard map and the existing data on landslide areas.
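
    A minimal sketch of the cross-application step described above, assuming the back-propagation training has already produced a set of relative factor weights: the landslide hazard index of each grid cell is computed as a weighted sum of its normalized factor ratings, and the same cells can be re-scored with the weight set derived from another study area. All weights and ratings below are hypothetical.

    ```python
    # Hypothetical weights and ratings; not the trained values from the study.
    import numpy as np

    FACTORS = ["slope", "aspect", "curvature", "dist_drainage", "geology",
               "dist_lineament", "landuse", "soil", "precipitation", "ndvi"]

    def hazard_index(ratings, weights):
        """ratings: (n_cells, 10) factor ratings in [0, 1]; weights: (10,)."""
        w = np.asarray(weights) / np.sum(weights)   # normalize to sum to 1
        return np.asarray(ratings) @ w

    rng = np.random.default_rng(1)
    cells = rng.random((5, len(FACTORS)))           # 5 example grid cells

    w_local = rng.random(len(FACTORS))              # weights trained on this area
    w_other = rng.random(len(FACTORS))              # weights from another area

    lhi_own   = hazard_index(cells, w_local)        # map built with local weights
    lhi_cross = hazard_index(cells, w_other)        # cross-applied weights
    print(np.corrcoef(lhi_own, lhi_cross)[0, 1])    # crude agreement check
    ```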

  2. Evaluation of MEDALUS model for desertification hazard zonation using GIS; study area: Iyzad Khast plain, Iran.

    PubMed

    Farajzadeh, Manuchehr; Egbal, Mahbobeh Nik

    2007-08-15

    In this study, the MEDALUS model, along with GIS mapping techniques, is used to determine the desertification hazard for a province of Iran. After creating a desertification database including 20 parameters, the first step consisted of preparing maps of the four indices of the MEDALUS model: climate, soil, vegetation, and land use. Since these parameters have mostly been presented for the Mediterranean region in the past, the next step included the addition of other indicators such as groundwater and wind erosion. Then all of the layers, weighted by the environmental conditions present in the area, were combined (following the same MEDALUS framework) to prepare a desertification map. The comparison of two maps based on the original and modified MEDALUS models indicates that the addition of more regionally specific parameters into the model allows for a more accurate representation of desertification processes across the Iyzad Khast plain. The major factors affecting desertification in the area are climate, wind erosion and low land quality management, vegetation degradation, and the salinization of soil and water resources. PMID:19070073
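
    A minimal sketch of how the index layers described above can be combined. The geometric-mean combination is the form commonly used in MEDALUS-type studies, and extending it to the added indicators (groundwater, wind erosion) is an assumption mirroring the modified model; all raster values are synthetic.

    ```python
    # Geometric-mean combination of quality-index rasters; values are synthetic.
    import numpy as np

    def sensitivity_index(*quality_layers):
        """Geometric mean of n quality-index rasters (each typically in [1, 2])."""
        stack = np.stack(quality_layers)
        return np.prod(stack, axis=0) ** (1.0 / stack.shape[0])

    shape = (3, 3)
    climate    = np.full(shape, 1.6)
    soil       = np.full(shape, 1.4)
    vegetation = np.full(shape, 1.5)
    management = np.full(shape, 1.3)

    original = sensitivity_index(climate, soil, vegetation, management)

    groundwater = np.full(shape, 1.7)      # added indicators (modified model)
    wind        = np.full(shape, 1.8)
    modified = sensitivity_index(climate, soil, vegetation, management,
                                 groundwater, wind)
    print(original[0, 0].round(3), modified[0, 0].round(3))
    ```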

  3. [A study on the vibration hazards due to using bush cutters (author's transl)].

    PubMed

    Futatsuka, M

    1979-05-01

    Bush cutters were introduced into Japanese forestry for cutting bush in 1957 or thereabout. Thereafter, they have been used widely not only in forestry but also in agriculture, public engineering, etc. In spite of the fact that the vibration acceleration levels of bush cutters are high, vibration hazards due to using these tools have not been studied, mainly because few professions use bush cutters exclusively. Moreover, the tool is only used seasonally, from June to September. The author examined 641 bush cutter users in the State Forestry in southern Kyushu and compared the results with a sample of chain saw users matched by age, duration of use, and district. The prevalence rate of Raynaud's phenomenon is 0.4% at 4 years after beginning the use of bush cutters, 4.0% after 10 years, and 9.2% after 18 years. The prevalence of Raynaud's phenomenon in chain saw users is twice as great as that in bush cutter users. Bush cutter users are attacked by Raynaud's phenomenon after about 1,000 hours' use of the tool. The difference is regarded to depend on the difference in the density of use and the difference in the levels of vibration acceleration. The vibration hazards due to using bush cutters were characterized by relatively mild complaints in the upper extremities, together with disturbances of peripheral circulation but not always with peripheral neuropathy. PMID:529563

  4. Photobiomodulation Mitigates Diabetes-Induced Retinopathy by Direct and Indirect Mechanisms: Evidence from Intervention Studies in Pigmented Mice

    PubMed Central

    Liu, Haitao; Patel, Shyam; Roberts, Robin; Berkowitz, Bruce A.; Kern, Timothy S.

    2015-01-01

    Objective Daily application of far-red light from the onset of diabetes mitigated diabetes-induced abnormalities in retinas of albino rats. Here, we test the hypothesis that photobiomodulation (PBM) is effective in diabetic, pigmented mice, even when delayed until weeks after onset of diabetes. Direct and indirect effects of PBM on the retina also were studied. Methods Diabetes was induced in C57Bl/6J mice using streptozotocin. Some diabetics were exposed to PBM therapy (4 min/day; 670 nm) daily. In one study, mice were diabetic for 4 weeks before initiation of PBM for an additional 10 weeks. Retinal oxidative stress, inflammation, and retinal function were measured. In some mice, heads were covered with a lead shield during PBM to prevent direct illumination of the eye, or animals were treated with an inhibitor of heme oxygenase-1. In a second study, PBM was initiated immediately after onset of diabetes, and administered daily for 2 months. These mice were examined using manganese-enhanced MRI to assess effects of PBM on transretinal calcium channel function in vivo. Results PBM intervention improved diabetes-induced changes in superoxide generation, leukostasis, expression of ICAM-1, and visual performance. PBM acted in part remotely from the retina because the beneficial effects were achieved even with the head shielded from the light therapy, and because leukocyte-mediated cytotoxicity of retinal endothelial cells was less in diabetics treated with PBM. SnPP+PBM significantly reduced iNOS expression compared to PBM alone, but significantly exacerbated leukostasis. In study 2, PBM largely mitigated diabetes-induced retinal calcium channel dysfunction in all retinal layers. Conclusions PBM induces retinal protection against abnormalities induced by diabetes in pigmented animals, and even as an intervention. Beneficial effects on the retina likely are mediated by both direct and indirect mechanisms. PBM is a novel non-pharmacologic treatment strategy to inhibit

  5. 77 FR 40627 - Proposed Flood Hazard Determinations

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-10

    ... SECURITY Federal Emergency Management Agency Proposed Flood Hazard Determinations AGENCY: Federal Emergency... Administrator for Mitigation, Department of Homeland Security, Federal Emergency Management Agency. BILLING CODE... Development Building, 25 Dorrance Street, Providence, RI 02903. Big Horn County, Wyoming, and...

  6. The Study on Ecological Treatment of Saline Lands to Mitigate the Effects of Climate Change

    NASA Astrophysics Data System (ADS)

    Xie, Jiancang; Zhu, Jiwei; Wang, Tao

    2010-05-01

    Soil water and salt movement is strongly influenced by frequent droughts, floods, and climate change. Additionally, with continued population growth, large-scale reclamation of arable land, and long-term unreasonable irrigation, saline land is increasing at a rate of 1,000,000~15,000,000 mu each year worldwide. In traditional management, the "drainage as the main measure" approach has a series of problems: larger engineering works, more occupation of land, poor water saving, and downstream pollution. In response to global climate change, it has become a common understanding to promote energy saving and environmental protection, to rethink the current model, and to explore an ecological management model. In this paper, we take a severely saline land, Lubotan in Shaanxi Province, as an example. Through nearly 10 years of harnessing practice and observation of meteorological, hydrological, and soil indicators, we analyze the influence of climate change on soil salinity movement in different seasons and years, and then put forward and apply a new model of saline land harnessing to mitigate the effects of climate change and allow the environment to self-rehabilitate. This model changes "drainage" to "storage": through the establishment of "storage as the main measure" engineering and comprehensive "project - biology - agriculture" measures, we are changing saline land into arable land. Adapted to natural changes in climate, rainfall, irrigation return water, and groundwater level, human intervention is reduced to achieve a dynamic system equilibrium. During the ten years, the salt content of the plough horizon was reduced from 0.74% to 0.20%, organic matter increased from 0.7% to 0.92%, and various soil indicators began to improve. At the same time, water for irrigation, drainage pollution, and investment costs were reduced. Through the model, 18,900 mu of severely saline land was reformed and 16,500 mu of new cultivated land was added, with significant comprehensive efficiency, ensuring the coordinated

  7. Economic optimization of natural hazard protection - conceptual study of existing approaches

    NASA Astrophysics Data System (ADS)

    Spackova, Olga; Straub, Daniel

    2013-04-01

    Risk-based planning of protection measures against natural hazards has become a common practice in many countries. The selection procedure aims at identifying an economically efficient strategy with regard to the estimated costs and risk (i.e. expected damage). A correct setting of the evaluation methodology and decision criteria should ensure an optimal selection of the portfolio of risk protection measures under a limited state budget. To demonstrate the efficiency of investments, indicators such as Benefit-Cost Ratio (BCR), Marginal Costs (MC) or Net Present Value (NPV) are commonly used. However, the methodologies for efficiency evaluation differ amongst different countries and different hazard types (floods, earthquakes etc.). Additionally, several inconsistencies can be found in the applications of the indicators in practice. This is likely to lead to a suboptimal selection of the protection strategies. This study provides a general formulation for optimization of the natural hazard protection measures from a socio-economic perspective. It assumes that all costs and risks can be expressed in monetary values. The study regards the problem as a discrete hierarchical optimization, where the state level sets the criteria and constraints, while the actual optimization is made on the regional level (towns, catchments) when designing particular protection measures and selecting the optimal protection level. The study shows that in case of an unlimited budget, the task is quite trivial, as it is sufficient to optimize the protection measures in individual regions independently (by minimizing the sum of risk and cost). However, if the budget is limited, the need for an optimal allocation of resources amongst the regions arises. To ensure this, minimum values of BCR or MC can be required by the state, which must be achieved in each region. The study investigates the meaning of these indicators in the optimization task at the conceptual level and compares their
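
    As a companion to the indicators discussed above, the sketch below evaluates a single protection measure: risk is treated as expected annual damage, the benefit is the risk reduction, and costs and benefits are discounted over a planning horizon to form NPV and BCR. The discount rate, horizon, and cash-flow pattern are illustrative assumptions, not values from the study.

    ```python
    # Illustrative discounting assumptions: 3% rate, 50-year horizon,
    # constant annual risk reduction and upkeep.
    def present_value(annual_amount, rate, years):
        return sum(annual_amount / (1.0 + rate) ** t for t in range(1, years + 1))

    def evaluate_measure(invest_cost, annual_upkeep, risk_before, risk_after,
                         rate=0.03, years=50):
        pv_cost    = invest_cost + present_value(annual_upkeep, rate, years)
        pv_benefit = present_value(risk_before - risk_after, rate, years)
        return {"NPV": pv_benefit - pv_cost,
                "BCR": pv_benefit / pv_cost}

    # Hypothetical example: a dike upgrade reducing expected annual damage
    # from 2.0 to 0.5 million EUR.
    print(evaluate_measure(invest_cost=15.0, annual_upkeep=0.1,
                           risk_before=2.0, risk_after=0.5))
    ```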

  8. Cost-benefit analysis of alternative LNG vapor-mitigation measures. Topical report, September 14, 1987-January 15, 1991

    SciTech Connect

    Atallah, S.

    1992-06-25

    A generalized methodology is presented for comparing the costs and safety benefits of alternative hazard mitigation measures for a large LNG vapor release. The procedure involves the quantification of the risk to the public before and after the application of LNG vapor mitigation measures. In the study, risk was defined as the product of the annual accident frequency, estimated from a fault tree analysis, and the severity of the accident. Severity was measured in terms of the number of people who may be exposed to a 2.5% or higher concentration. The ratios of the annual costs of the various mitigation measures to their safety benefits (as determined by the differences between the risk before and after mitigation measure implementation) were then used to identify the most cost-effective approaches to vapor cloud mitigation.
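
    A minimal sketch of the ranking logic described above: risk is the product of annual accident frequency (from a fault tree analysis) and severity (people exposed to a 2.5% or higher concentration), and each mitigation option is ranked by the ratio of its annual cost to the risk reduction it achieves. All option names and numbers are illustrative placeholders.

    ```python
    # Option names, frequencies, and exposure counts are placeholders.
    def risk(freq_per_year, people_exposed):
        return freq_per_year * people_exposed

    def cost_benefit_ratio(annual_cost, risk_before, risk_after):
        return annual_cost / (risk_before - risk_after)

    baseline = risk(1e-4, 5000)                  # unmitigated release scenario
    options = {
        "high vapor fence": (120_000, risk(1e-4, 1500)),
        "foam suppression": (300_000, risk(1e-4, 800)),
    }
    # Rank options by cost per unit of risk reduction (lower is better).
    for name, (cost, mitigated) in sorted(
            options.items(),
            key=lambda kv: cost_benefit_ratio(kv[1][0], baseline, kv[1][1])):
        print(name, round(cost_benefit_ratio(cost, baseline, mitigated)))
    ```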

  9. Application of multi-agent coordination methods to the design of space debris mitigation tours

    NASA Astrophysics Data System (ADS)

    Stuart, Jeffrey; Howell, Kathleen; Wilson, Roby

    2016-04-01

    The growth in the number of defunct and fragmented objects near the Earth poses a growing hazard to launch operations as well as to existing on-orbit assets. Numerous studies have demonstrated the positive impact of active debris mitigation campaigns on the growth of debris populations, but comparatively few investigations incorporate specific mission scenarios. Furthermore, while many active mitigation methods have been proposed, certain classes of debris objects are amenable to mitigation campaigns employing chaser spacecraft with existing chemical and low-thrust propulsive technologies. This investigation incorporates an ant colony optimization routing algorithm and multi-agent coordination via auctions into a debris mitigation tour scheme suitable for preliminary mission design and analysis as well as spacecraft flight operations.
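
    A highly simplified sketch of the ant-colony routing idea mentioned above: given a hypothetical, symmetric delta-v cost matrix between debris objects, ants build visit orders probabilistically from pheromone and inverse cost, and pheromone is reinforced along the best tour found. Real tours are time-dependent and coupled with the auction-based coordination, which is omitted here; this only shows the combinatorial skeleton.

    ```python
    # Combinatorial skeleton only; the cost matrix and parameters are hypothetical.
    import numpy as np

    rng = np.random.default_rng(2)
    n = 6                                            # debris objects
    cost = rng.uniform(0.1, 1.0, (n, n))             # hypothetical delta-v costs
    cost = (cost + cost.T) / 2
    np.fill_diagonal(cost, np.inf)

    tau = np.ones((n, n))                            # pheromone levels
    alpha, beta, rho, n_ants, n_iters = 1.0, 2.0, 0.1, 20, 50

    def build_tour():
        tour = [0]                                   # assume chaser starts at object 0
        while len(tour) < n:
            i = tour[-1]
            mask = np.array([j not in tour for j in range(n)])
            weights = (tau[i] ** alpha) * (cost[i] ** -beta) * mask
            tour.append(rng.choice(n, p=weights / weights.sum()))
        return tour

    def tour_cost(tour):
        return sum(cost[a, b] for a, b in zip(tour, tour[1:]))

    best, best_cost = None, np.inf
    for _ in range(n_iters):
        for t in (build_tour() for _ in range(n_ants)):
            c = tour_cost(t)
            if c < best_cost:
                best, best_cost = t, c
        tau *= (1.0 - rho)                           # evaporation
        for a, b in zip(best, best[1:]):             # reinforce best tour found
            tau[a, b] += 1.0 / best_cost
            tau[b, a] += 1.0 / best_cost
    print(best, round(best_cost, 3))
    ```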

  10. Case study: Mapping tsunami hazards associated with debris flow into a reservoir

    USGS Publications Warehouse

    Walder, J.S.; Watts, P.; Waythomas, C.F.

    2006-01-01

    Debris-flow generated impulse waves (tsunamis) pose hazards in lakes, especially those used for hydropower or recreation. We describe a method for assessing tsunami-related hazards for the case in which inundation by coherent water waves, rather than chaotic splashing, is of primary concern. The method involves an experimentally based initial condition (tsunami source) and a Boussinesq model for tsunami propagation and inundation. Model results are used to create hazard maps that offer guidance for emergency planners and responders. An example application explores tsunami hazards associated with potential debris flows entering Baker Lake, a reservoir on the flanks of the Mount Baker volcano in the northwestern United States. ?? 2006 ASCE.

  11. Model-Predictive Cascade Mitigation in Electric Power Systems With Storage and Renewables-Part II: Case-Study

    SciTech Connect

    Almassalkhi, MR; Hiskens, IA

    2015-01-01

    The novel cascade-mitigation scheme developed in Part I of this paper is implemented within a receding-horizon model predictive control (MPC) scheme with a linear controller model. This present paper illustrates the MPC strategy with a case-study that is based on the IEEE RTS-96 network, though with energy storage and renewable generation added. It is shown that the MPC strategy alleviates temperature overloads on transmission lines by rescheduling generation, energy storage, and other network elements, while taking into account ramp-rate limits and network limitations. Resilient performance is achieved despite the use of a simplified linear controller model. The MPC scheme is compared against a base-case that seeks to emulate human operator behavior.

  12. Monitoring and modelling for landslide risk mitigation and reduction. The case study of San Benedetto Ullano (Northern Calabria - Italy)

    NASA Astrophysics Data System (ADS)

    Terranova, Oreste G.; Greco, Venanzio R.; Gariano, Stefano L.; Pascale, Stefania; Rago, Valeria; Caloiero, Paola; Iovine, Giulio G. R.

    2016-04-01

    movements caused severe damage to roads and infrastructure. The second crisis ended in late June: the hydrological model FLaIR was then successfully tested against the known dates of activation of the slope movement, by using a local rain series [3]. Meanwhile, technical support was provided to the Municipality to optimize geological prospections, monitoring, and the design of remedial works in a master plan. A third activation occurred during the night of 15 March 2013, when the planned remedial works had not yet been completed. By applying the hydrological model SAKe [4, 5], this activation could be predicted, again permitting the prompt adoption of mitigation measures. This activation was triggered by rains smaller and shorter than those that caused the previous activations, perhaps indicating an increasing fragility of the slope. Changes in slope stability conditions before and after the construction of the remedial works are being investigated. Critical rain conditions and groundwater levels for landslide activation are in fact expected to change, depending on the combined effects of natural weakening vs. artificial strengthening. Monitoring will allow the new relationships between rainfall, groundwater, and slope stability to be quantitatively verified. References [1] Iovine G., Iaquinta P. & Terranova O. (2009). In Anderssen, Braddock & Newham (Eds.), Proc. 18th World IMACS Congr. and MODSIM09 Int. Congr. on Modelling and Simulation, pp. 2686-2693. [2] Iovine G., Lollino P., Gariano S.L. & Terranova O.G. (2010). NHESS, 10, 2341-2354. [3] Capparelli G., Iaquinta P., Iovine G., Terranova O.G. & Versace P. (2012). Natural Hazards, 61(1), pp. 247-256. [4] Terranova O.G., Iaquinta P., Gariano S.L., Greco R. & Iovine G. (2013). In: Landslide Science and Practice, Margottini, Canuti, Sassa (Eds.), Vol. 3, pp. 73-79. [5] Terranova O.G., Gariano S.L., Iaquinta P. & Iovine G.G.R. (2015). Geosci. Model Dev., 8, 1955-1978.

  13. Eco-efficiency for greenhouse gas emissions mitigation of municipal solid waste management: a case study of Tianjin, China.

    PubMed

    Zhao, Wei; Huppes, Gjalt; van der Voet, Ester

    2011-06-01

    The issue of municipal solid waste (MSW) management has been highlighted in China due to the continually increasing MSW volumes being generated and the limited capacity of waste treatment facilities. This article presents a quantitative eco-efficiency (E/E) analysis on MSW management in terms of greenhouse gas (GHG) mitigation. A methodology for E/E analysis has been proposed, with an emphasis on the consistent integration of life cycle assessment (LCA) and life cycle costing (LCC). The environmental and economic impacts derived from LCA and LCC have been normalized and defined as a quantitative E/E indicator. The proposed method was applied in a case study of Tianjin, China. The study assessed the current MSW management system, as well as a set of alternative scenarios, to investigate trade-offs between economy and GHG emissions mitigation. Additionally, contribution analysis was conducted on both LCA and LCC to identify key issues driving environmental and economic impacts. The results show that the current Tianjin's MSW management system emits the highest GHG and costs the least, whereas the situation reverses in the integrated scenario. The key issues identified by the contribution analysis show no linear relationship between the global warming impact and the cost impact in MSW management system. The landfill gas utilization scenario is indicated as a potential optimum scenario by the proposed E/E analysis, given the characteristics of MSW, technology levels, and chosen methodologies. The E/E analysis provides an attractive direction towards sustainable waste management, though some questions with respect to uncertainty need to be discussed further. PMID:21316937

  14. Eco-efficiency for greenhouse gas emissions mitigation of municipal solid waste management: A case study of Tianjin, China

    SciTech Connect

    Zhao Wei; Huppes, Gjalt; Voet, Ester van der

    2011-06-15

    The issue of municipal solid waste (MSW) management has been highlighted in China due to the continually increasing MSW volumes being generated and the limited capacity of waste treatment facilities. This article presents a quantitative eco-efficiency (E/E) analysis on MSW management in terms of greenhouse gas (GHG) mitigation. A methodology for E/E analysis has been proposed, with an emphasis on the consistent integration of life cycle assessment (LCA) and life cycle costing (LCC). The environmental and economic impacts derived from LCA and LCC have been normalized and defined as a quantitative E/E indicator. The proposed method was applied in a case study of Tianjin, China. The study assessed the current MSW management system, as well as a set of alternative scenarios, to investigate trade-offs between economy and GHG emissions mitigation. Additionally, contribution analysis was conducted on both LCA and LCC to identify key issues driving environmental and economic impacts. The results show that the current Tianjin's MSW management system emits the highest GHG and costs the least, whereas the situation reverses in the integrated scenario. The key issues identified by the contribution analysis show no linear relationship between the global warming impact and the cost impact in MSW management system. The landfill gas utilization scenario is indicated as a potential optimum scenario by the proposed E/E analysis, given the characteristics of MSW, technology levels, and chosen methodologies. The E/E analysis provides an attractive direction towards sustainable waste management, though some questions with respect to uncertainty need to be discussed further.

  15. Status of volcanic hazard studies for the Nevada Nuclear Waste Storage Investigations. Volume II

    SciTech Connect

    Crowe, B.M.; Wohletz, K.H.; Vaniman, D.T.; Gladney, E.; Bower, N.

    1986-01-01

    Volcanic hazard investigations during FY 1984 focused on five topics: the emplacement mechanism of shallow basalt intrusions, geochemical trends through time for volcanic fields of the Death Valley-Pancake Range volcanic zone, the possibility of bimodal basalt-rhyolite volcanism, the age and process of enrichment for incompatible elements in young basalts of the Nevada Test Site (NTS) region, and the possibility of hydrovolcanic activity. The stress regime of Yucca Mountain may favor formation of shallow basalt intrusions. However, combined field and drill-hole studies suggest shallow basalt intrusions are rare in the geologic record of the southern Great Basin. The geochemical patterns of basaltic volcanism through time in the NTS region provide no evidence for evolution toward a large-volume volcanic field or increases in future rates of volcanism. Existing data are consistent with a declining volcanic system comparable to the late stages of the southern Death Valley volcanic field. The hazards of bimodal volcanism in this area are judged to be low. The source of a 6-Myr pumice discovered in alluvial deposits of Crater Flat has not been found. Geochemical studies show that the enrichment of trace elements in the younger rift basalts must be related to an enrichment of their mantle source rocks. This geochemical enrichment event, which may have been metasomatic alteration, predates the basalts of the silicic episode and is, therefore, not a young event. Studies of crater dimensions of hydrovolcanic landforms indicate that the worst case scenario (exhumation of a repository at Yucca Mountain by hydrovolcanic explosions) is unlikely. Theoretical models of melt-water vapor explosions, particularly the thermal detonation model, suggest hydrovolcanic explosions are possible at Yucca Mountain. 80 refs., 21 figs., 5 tabs.

  16. Property-close source separation of hazardous waste and waste electrical and electronic equipment - A Swedish case study

    SciTech Connect

    Bernstad, Anna; Cour Jansen, Jes la; Aspegren, Henrik

    2011-03-15

    Through an agreement with EEE producers, Swedish municipalities are responsible for the collection of hazardous waste and waste electrical and electronic equipment (WEEE). In most Swedish municipalities, collection of these waste fractions is concentrated at waste recycling centres where households can source-separate and deposit hazardous waste and WEEE free of charge. However, the centres are often located on the outskirts of city centres, and in most cases cars are needed in order to use the facilities. A full-scale experiment was performed in a residential area in southern Sweden to evaluate the effects of a system for property-close source separation of hazardous waste and WEEE. After the system was introduced, results show a clear reduction in the amount of hazardous waste and WEEE disposed of incorrectly amongst residual waste or dry recyclables. The system resulted in a source separation ratio of 70 wt% for hazardous waste and 76 wt% in the case of WEEE. Results show that households in the study area were willing to increase source separation of hazardous waste and WEEE when accessibility was improved, and that this and similar collection systems can play an important role in building up increasingly sustainable solid waste management systems.

  17. Study in landslide hazard zonation based on factor weighting-rating in Wan County, Three Gorges Reservoir area

    NASA Astrophysics Data System (ADS)

    Liu, Zhengjun; Wang, Jian; Chi, Changyan

    2008-11-01

Multi-source earth observation data are highly desirable in current landslide hazard prediction modelling, and Landslide Hazard Zonation (LHZ) is an important component of such modelling. In this paper, taking Wan County as an example, we investigate the potential of multi-source data sets for landslide hazard zonation based on an ordinal-scale relative weighting-rating technique. LHZ is performed with the chosen factor layers: a buffer map of thrusts, lithology, slope angle and relative relief calculated from a DEM, NDVI, and buffer maps of drainage and lineaments extracted from digital satellite imagery (TM). A Landslide Hazard Index (LHI) value is then calculated, and landslide hazard zones are delineated by slicing the LHI histogram. The statistics show that the high stable slope zone, stable slope zone, quasi-stable slope zone, relatively unstable slope zone, unstable slope zone and defended slope zone account for 2.20%, 14.02%, 39.88%, 28.27%, 12.17% and 3.47% of the area, respectively. GPS deformation control points on the landslide bodies are then used to verify the validity of the LHZ technique.
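    The weighting-rating calculation described above reduces to a weighted sum of ordinal factor ratings per map cell, followed by slicing of the resulting histogram. The sketch below illustrates that arithmetic; the factor names mirror the layers listed in the abstract, but all weights, ratings and class breaks are made up for illustration and are not the study's values.

```python
# Minimal sketch of an ordinal weighting-rating Landslide Hazard Index (LHI),
# using illustrative weights and per-cell factor ratings (not the paper's values).
import numpy as np

# Hypothetical factor weights (higher = more influence on landsliding).
weights = {
    "lithology": 3,
    "slope_angle": 3,
    "relative_relief": 2,
    "thrust_buffer": 2,
    "drainage_buffer": 1,
    "lineament_buffer": 1,
    "ndvi": 1,
}

# Per-cell ordinal ratings (1 = favourable ... 4 = most hazardous), one small
# toy raster per factor; real inputs would come from GIS layers.
rng = np.random.default_rng(0)
ratings = {name: rng.integers(1, 5, size=(4, 4)) for name in weights}

# LHI = sum over factors of weight * rating, computed cell by cell.
lhi = sum(w * ratings[name] for name, w in weights.items())

# Slice the LHI range into hazard zones (breaks are illustrative only).
zones = np.digitize(lhi, bins=[20, 28, 36, 44])
print(lhi)
print(zones)  # 0 = most stable ... 4 = least stable
```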

  18. GIS-based pollution hazard mapping and assessment framework of shallow lakes: southeastern Pampean lakes (Argentina) as a case study.

    PubMed

    Romanelli, A; Esquius, K S; Massone, H E; Escalante, A H

    2013-08-01

The assessment of water vulnerability and pollution hazard traditionally places particular emphasis on the study of groundwater rather than of surface waters. Consequently, a GIS-based Lake Pollution Hazard Index (LPHI) was proposed for assessing and mapping the potential pollution hazard of shallow lakes due to the interaction between the Potential Pollutant Load and the Lake Vulnerability. It includes easily measurable and commonly used parameters: land cover, terrain slope and direction, and soil media. Three shallow lake ecosystems of the southeastern Pampa Plain (Argentina) were chosen to test the usefulness and applicability of the suggested index. Moreover, anthropogenic and natural influences on biophysical parameters in these three ecosystems were examined. The evaluation of the LPHI map shows the highest pollution hazard for La Brava and Los Padres lakes (≈30 % in the high to very high category), while Nahuel Rucá Lake seems to be the least hazardous water body (just 9.33 % with high LPHI). The increase in LPHI value is attributed to a different loading of pollutants governed by land cover category and/or the exposure to high slopes and the influence of slope direction. Dissolved oxygen and biochemical oxygen demand values indicate a moderately polluted and eutrophized condition of the shallow lake waters, mainly related to moderate agricultural activities and/or cattle production. The information obtained from the LPHI calculation is useful for a local diagnosis of the potential pollution hazard to a freshwater ecosystem, in order to implement basic guidelines to improve lake sustainability. PMID:23355019
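    The index above combines a pollutant-load score with a vulnerability score over the catchment. The following sketch shows one plausible way such a raster overlay could be computed; the scoring, the combination by simple multiplication and the "high" threshold are assumptions for illustration, not the published scheme.

```python
# Hedged sketch of a GIS-style Lake Pollution Hazard Index (LPHI): a Potential
# Pollutant Load raster is combined with a Lake Vulnerability raster. All
# values and the combination rule below are illustrative assumptions only.
import numpy as np

# Hypothetical rasters over a lake catchment (cells already scored 1-5):
# pollutant load driven by land cover, vulnerability by slope/soil factors.
pollutant_load = np.array([[5, 4, 2],
                           [3, 3, 1],
                           [2, 1, 1]], dtype=float)
vulnerability = np.array([[4, 4, 3],
                          [3, 2, 2],
                          [2, 2, 1]], dtype=float)

# Combine and rescale to 0-1 so categories are comparable across lakes.
lphi = pollutant_load * vulnerability
lphi_norm = (lphi - lphi.min()) / (lphi.max() - lphi.min())

# Report the share of the catchment in the "high / very high" categories.
high_share = np.mean(lphi_norm >= 0.6) * 100
print(f"Area with high to very high LPHI: {high_share:.1f} %")
```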

  19. Distinguishing Realistic Military Blasts from Firecrackers in Mitigation Studies of Blast Induced Traumatic Brain Injury

    SciTech Connect

    Moss, W C; King, M J; Blackman, E G

    2011-01-21

In their Contributed Article, Nyein et al. (1,2) present numerical simulations of blast waves interacting with a helmeted head and conclude that a face shield may significantly mitigate blast induced traumatic brain injury (TBI). A face shield may indeed be important for future military helmets, but the authors derive their conclusions from a much smaller explosion than typically experienced on the battlefield. The blast from the 3.16 g TNT charge of (1) has the following approximate peak overpressures, positive phase durations, and incident impulses (3): 10 atm, 0.25 ms, and 3.9 psi-ms at the front of the head (14 cm from charge), and 1.4 atm, 0.32 ms, and 1.7 psi-ms at the back of a typical 20 cm head (34 cm from charge). The peak pressure of the wave decreases by a factor of 7 as it traverses the head. The blast conditions are at the threshold for injury at the front of the head, but well below threshold at the back of the head (4). The blast traverses the head in 0.3 ms, roughly equal to the positive phase duration of the blast. Therefore, when the blast reaches the back of the head, near ambient conditions exist at the front. Because the headform is so close to the charge, it experiences a wave with significant curvature. By contrast, a realistic blast from a 2.2 kg TNT charge (approximately an uncased 105 mm artillery round) is fatal at an overpressure of 10 atm (4). For an injury level (4) similar to (1), a 2.2 kg charge has the following approximate peak overpressures, positive phase durations, and incident impulses (3): 2.1 atm, 2.3 ms, and 18 psi-ms at the front of the head (250 cm from charge), and 1.8 atm, 2.5 ms, and 16.8 psi-ms at the back of the head (270 cm from charge). The peak pressure decreases by only a factor of 1.2 as it traverses the head. Because the 0.36 ms traversal time is much smaller than the positive phase duration, pressures on the head become relatively uniform when the blast reaches the back of the head. The larger standoff implies
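    The core of the comparison is arithmetic on the figures quoted above: the ratio of front-to-back peak overpressure and the ratio of head-traversal time to positive phase duration for the two charges. The snippet below simply recomputes those ratios from the values given in the abstract.

```python
# Arithmetic check of the comparison made in the abstract: peak-overpressure
# decay across a ~20 cm head, and traversal time relative to the positive
# phase duration, for the small laboratory charge and the realistic charge.
cases = {
    # charge / standoff: (p_front_atm, p_back_atm, positive_phase_ms, traversal_ms)
    "3.16 g TNT, 14 cm standoff": (10.0, 1.4, 0.25, 0.30),
    "2.2 kg TNT, 250 cm standoff": (2.1, 1.8, 2.30, 0.36),
}
for name, (p_front, p_back, t_pos, t_trav) in cases.items():
    print(f"{name}:")
    print(f"  peak-pressure drop across head : x{p_front / p_back:.1f}")
    print(f"  traversal time / positive phase: {t_trav / t_pos:.2f}")
```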

  20. Balancing Mitigation Against Impact: A Case Study From the 2005 Chicxulub Seismic Survey

    NASA Astrophysics Data System (ADS)

    Barton, P.; Diebold, J.; Gulick, S.

    2006-05-01

    investigation. Both datasets indicate significantly lower levels than reported by Tolstoy et al. (2004). There was no evidence of environmental damage created by this survey. It can be concluded that the mitigation measures were extremely successful, but there is also a concern that the overhead cost of the environmental protection made this one of the most costly academic surveys ever undertaken, and that not all of this protection was necessary. In particular, the predicted 180 dB safety radius appeared to be overly conservative, even though based on calibrated measurements in very similar physical circumstances, and we suggest that these differences were a result of local seismic velocity structure in the water column and/or shallow seabed, which resulted in different partitioning of the energy. These results suggest that real time monitoring of hydrophone array data may provide a method of determining the safety radius dynamically, in response to local conditions.

  1. Public perception of flood risks, flood forecasting and mitigation

    NASA Astrophysics Data System (ADS)

    Brilly, M.; Polic, M.

    2005-04-01

A multidisciplinary and integrated approach to the flood mitigation decision making process should provide the best response of society in a flood hazard situation, including preparation works and post-hazard mitigation. In Slovenia, there is a great lack of data on social aspects and public response to flood mitigation measures and information management. In this paper, two studies of flood perception in the Slovenian town of Celje are presented. During its history, Celje was often exposed to floods, the most recent serious floods being in 1990 and in 1998, with return periods of about one hundred and fifty years and more than ten years, respectively. Two surveys were conducted in 1997 and 2003, with 157 participants from different areas of the town in the first study and 208 in the second, aiming to establish the general attitude toward floods. The surveys revealed that floods present a serious threat in the eyes of the inhabitants, and that the perception of threat depends, to a certain degree, on the place of residence. The surveys also highlighted, among other measures, solidarity and the importance of insurance against floods.

  2. Hazard function theory for nonstationary natural hazards

    NASA Astrophysics Data System (ADS)

    Read, L.; Vogel, R. M.

    2015-12-01

    Studies from the natural hazards literature indicate that many natural processes, including wind speeds, landslides, wildfires, precipitation, streamflow and earthquakes, show evidence of nonstationary behavior such as trends in magnitudes through time. Traditional probabilistic analysis of natural hazards based on partial duration series (PDS) generally assumes stationarity in the magnitudes and arrivals of events, i.e. that the probability of exceedance is constant through time. Given evidence of trends and the consequent expected growth in devastating impacts from natural hazards across the world, new methods are needed to characterize their probabilistic behavior. The field of hazard function analysis (HFA) is ideally suited to this problem because its primary goal is to describe changes in the exceedance probability of an event over time. HFA is widely used in medicine, manufacturing, actuarial statistics, reliability engineering, economics, and elsewhere. HFA provides a rich theory to relate the natural hazard event series (x) with its failure time series (t), enabling computation of corresponding average return periods and reliabilities associated with nonstationary event series. This work investigates the suitability of HFA to characterize nonstationary natural hazards whose PDS magnitudes are assumed to follow the widely applied Poisson-GP model. We derive a 2-parameter Generalized Pareto hazard model and demonstrate how metrics such as reliability and average return period are impacted by nonstationarity and discuss the implications for planning and design. Our theoretical analysis linking hazard event series x, with corresponding failure time series t, should have application to a wide class of natural hazards.
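    As a reminder of the machinery involved (a textbook sketch under stationarity, not the paper's nonstationary derivation), the exceedance distribution, hazard function and average return period for a Generalized Pareto magnitude model with scale σ, shape ξ ≠ 0 and Poisson arrival rate λ take the form:

```latex
% Textbook relations for a stationary Poisson-GP exceedance model (shape \xi \neq 0):
S(x) \;=\; \Pr(X > x) \;=\; \Bigl(1 + \tfrac{\xi x}{\sigma}\Bigr)^{-1/\xi},
\qquad
h(x) \;=\; \frac{f(x)}{S(x)} \;=\; \frac{1}{\sigma + \xi x},
\qquad
\bar{T}(x) \;=\; \frac{1}{\lambda\, S(x)}
```

    In the nonstationary setting the paper addresses, such quantities vary with time, which is what the hazard-function treatment is designed to capture.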

  3. Classification of residential areas according to physical vulnerability to natural hazards: a case study of Çanakkale, Turkey.

    PubMed

    Başaran-Uysal, Arzu; Sezen, Funda; Ozden, Süha; Karaca, Oznur

    2014-01-01

    The selection of new settlement areas and the construction of safe buildings, as well as rendering built-up areas safe, are of great importance in mitigating the damage caused by natural disasters. Most cities in Turkey are unprepared for natural hazards. In this paper, Çanakkale, located in a first-degree seismic zone and sprawled around the Sartçay Delta, is examined in terms of its physical vulnerability to natural hazards. Residential areas are analysed using GIS (geographic information system) and remote-sensing technologies in relation to selected indicators. Residential areas of the city are divided into zones according to an evaluation of geological characteristics, the built-up area's features, and urban infrastructure, and four risk zones are determined. The results of the analysis show that the areas of the city suitable for housing are very limited. In addition, the historical centre and the housing areas near Sartçay stream are shown to be most problematic in terms of natural disasters and sustainability. PMID:24325245

  4. UAV-based Natural Hazard Management in High-Alpine Terrain - Case Studies from Austria

    NASA Astrophysics Data System (ADS)

    Sotier, Bernadette; Adams, Marc; Lechner, Veronika

    2015-04-01

Unmanned Aerial Vehicles (UAV) have become a standard tool for geodata collection, as they allow conducting on-demand mapping missions in a flexible, cost-effective manner at an unprecedented level of detail. Easy-to-use, high-performance image matching software makes it possible to process the collected aerial images into orthophotos and 3D terrain models. Such up-to-date geodata have proven to be an important asset in natural hazard management: processes like debris flows, avalanches, landslides, fluvial erosion and rock-fall can be detected and quantified, and damages can be documented and evaluated. In the Alps, these processes mostly originate in remote areas, which are difficult and hazardous to access, thus presenting a challenging task for RPAS data collection. In particular, the problems include finding suitable landing and piloting places, dealing with poor or absent GPS signals, and the installation of ground control points (GCP) for georeferencing. At the BFW, RPAS have been used since 2012 to aid natural hazard management of various processes, of which three case studies are presented below. The first case study deals with the results of an attempt to employ UAV-based multi-spectral remote sensing to monitor the state of natural hazard protection forests. Images in the visible and near-infrared (NIR) bands were collected using modified low-cost cameras combined with different optical filters. Several UAV flights were performed in 2014 in the 72 ha study site, which lies in the Wattental, Tyrol (Austria), between 1700 and 2050 m a.s.l., where the main tree species are stone pine and mountain pine. The matched aerial images were analysed using different UAV-specific vitality indices, evaluating both single- and dual-camera UAV missions. To calculate the mass balance of a debris flow in the Tyrolean Halltal (Austria), an RPAS flight was conducted in autumn 2012. The extreme alpine environment was challenging for both the mission and the evaluation of the aerial

  5. A Study for Health Hazard Evaluation of Methylene Chloride Evaporated from the Tear Gas Mixture

    PubMed Central

    Chung, Eun-Kyo; Yi, Gwang-Yong; Chung, Kwang-Jae; Shin, Jung-Ah; Lee, In-Seop

    2010-01-01

    This study explored the health hazard of those exposed to methylene chloride by assessing its atmospheric concentration when a tear gas mixture was aerially dispersed. The concentration of methylene chloride ranged from 311.1-980.3 ppm (geometric mean, 555.8 ppm), 30 seconds after the dispersion started. However, the concentration fell rapidly to below 10 ppm after dispersion was completed. The concentration during the dispersion did not surpass the National Institute for Occupational Safety and Health 'immediately dangerous to life or health' value of 2,300 ppm, but did exceed the American Conference of Governmental Industrial Hygienists excursion limit of 250 ppm. Since methylene chloride is highly volatile (vapor pressure, 349 mmHg at 20℃), the postdispersion atmospheric concentration can rise instantaneously. Moreover, the o-chlorobenzylidenemalononitrile formulation of tear gas (CS gas) is an acute upper respiratory tract irritant. Therefore, tear gas mixtures should be handled with delicate care. PMID:22953168

  6. CMMAD Usability Case Study in Support of Countermine and Hazard Sensing

    SciTech Connect

    Victor G. Walker; David I. Gertman

    2010-04-01

During field trials, operator usability data were collected in support of lane clearing missions and hazard sensing for two robot platforms with Robot Intelligence Kernel (RIK) software and sensor scanning payloads onboard. The tests featured autonomous and shared robot autonomy levels, where tasking of the robot used a graphical interface featuring mine locations and sensor readings. The goal of this work was to provide insights that could be used to further technology development. The efficacy of countermine systems in terms of mobility, search, path planning, detection, and localization was assessed. Findings from objective and subjective operator interaction measures are reviewed, along with commentary from soldiers who took part in the study and strongly endorsed the system.

  7. Experimental study on thermal hazard of tributyl phosphate-nitric acid mixtures using micro calorimeter technique.

    PubMed

    Sun, Qi; Jiang, Lin; Gong, Liang; Sun, Jin-Hua

    2016-08-15

During PUREX spent nuclear fuel reprocessing, a mixture of tributyl phosphate (TBP) and a hydrocarbon solvent is employed as the organic solvent to extract uranium, in consideration of radiation safety and resource recycling, while nitric acid is used to dissolve the spent fuel. However, once TBP contacts nitric acid or nitrates above 130°C, a heavy "red oil" layer can form, accompanied by thermal runaway reactions, which have caused several nuclear safety accidents. Considering the volatility of nitric acid and the weak exotherms to be detected, the C80 micro calorimeter technique was used in this study to investigate the thermal decomposition of TBP mixed with nitric acid. Results show that the concentration of nitric acid greatly influences the thermal hazard of the system through direct reactions. Even at a low heating rate, if the concentration of nitric acid increases due to evaporation of water or improper operations, thermal runaway in the closed system can start at a low temperature. PMID:27136728

  8. AMERICAN HEALTHY HOMES SURVEY: A NATIONAL STUDY OF RESIDENTIAL RELATED HAZARDS

    EPA Science Inventory

    The US Environmental Protection Agency's (EPA) National Exposure Research Laboratory (NERL) and the US Department of Housing and Urban Development's (HUD) Office of Healthy Homes and Lead Hazard Control conducted a national survey of housing related hazards in US residences. The...

  9. Impact and effectiveness of risk mitigation strategies on the insurability of nanomaterial production: evidences from industrial case studies.

    PubMed

    Bergamaschi, Enrico; Murphy, Finbarr; Poland, Craig A; Mullins, Martin; Costa, Anna L; McAlea, Eamonn; Tran, Lang; Tofail, Syed A M

    2015-01-01

Workers involved in producing nanomaterials or using nanomaterials in manufacturing plants are likely to have earlier and higher exposure to manufactured/engineered nanomaterials (ENM) than the general population. This is because both the volume handled and the probability of the effluence of 'free' nanoparticles from the handled volume are much higher during a production process than at any other stage in the lifecycle of nanomaterials and nanotechnology-enabled products. Risk assessment (RA) techniques using control banding (CB) as a framework for risk transfer represent a robust theory, but further progress on implementing the model is required so that risk can be transferred to insurance companies. Following a review of RA in general and hazard measurement in particular, we subject a Structural Alert Scheme methodology to three industrial case studies using ZrO2, TiO2, and multi-walled carbon nanotubes (MWCNT). The materials are tested in a pristine state and in a remediated (coated) state, and the respective emission and hazard rates are tested alongside the material performance as originally designed. To our knowledge, this is the first such implementation of a CB RA in conjunction with an ENM performance test, and it offers both manufacturers and underwriters an insight into future applications. PMID:25808636

  10. Geomorphological surveys and software simulations for rock fall hazard assessment: a case study in the Italian Alps

    NASA Astrophysics Data System (ADS)

    Devoto, S.; Boccali, C.; Podda, F.

    2014-12-01

In northern Italy, fast-moving landslides represent a significant threat to the population and human facilities. In the eastern portion of the Italian Alps, rock falls are recurrent and are often responsible for casualties or severe damage to roads and buildings. This type of landslide is frequent in mountain ranges characterised by strong relief energy and is triggered by earthquakes or copious rainfall, which often exceeds 2000 mm yr-1. These factors cause morphological dynamics with intense slope erosion and degradation processes. This work investigates the appraisal of the rock fall hazard related to the presence of several large unstable blocks located at the top of a limestone peak, approximately 500 m NW of the village of Cimolais. Field surveys recognised a limestone block exceeding a volume of 400 m3 and identified this block as the most hazardous for Cimolais Village because of its proximity to the rocky cliff. A first assessment of the possible transit and stop areas was carried out through in-depth traditional activities, such as geomorphological mapping and aerial photo analysis. The output of the field surveys was a detailed land use map, which provided a fundamental starting point for the rock fall software analysis. The geomorphological observations were correlated with DTMs derived from regional topography and Airborne Laser Scanning (ALS) surveys to recognise possible rock fall routes. To simulate rock fall trajectories properly with a hybrid computer program, particular attention was devoted to the correct quantification of input parameters, such as restitution coefficients and the horizontal acceleration associated with earthquakes, which historically occur in this portion of Italy. The simulation outputs regarding the distribution of rock fall end points and kinetic energy along rock fall paths highlight the hazardous situation for Cimolais Village. Because of this, mitigation works have been suggested to
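    Restitution coefficients of the kind mentioned above control how much velocity a falling block keeps at each impact in lumped-mass trajectory models. The sketch below shows that rebound step in its simplest generic form; the coefficient values and impact geometry are illustrative assumptions, not those calibrated for the Cimolais simulations.

```python
# Minimal lumped-mass rock fall rebound sketch: velocity after impact is scaled
# by normal (Rn) and tangential (Rt) restitution coefficients. Values below are
# illustrative assumptions, not site-specific calibrated parameters.
import math

def rebound(v_normal, v_tangential, rn=0.35, rt=0.85):
    """Return post-impact velocity components and the kinetic-energy ratio."""
    v_n_out = -rn * v_normal          # normal component reversed and damped
    v_t_out = rt * v_tangential       # tangential component damped
    e_ratio = (v_n_out**2 + v_t_out**2) / (v_normal**2 + v_tangential**2)
    return v_n_out, v_t_out, e_ratio

# Example: a block hitting the slope at 20 m/s, 30 degrees from the surface.
speed, angle = 20.0, math.radians(30)
v_n, v_t = speed * math.sin(angle), speed * math.cos(angle)
print(rebound(v_n, v_t))
```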

  11. Incorporating natural hazard assessments into municipal master-plans; case-studies from Israel

    NASA Astrophysics Data System (ADS)

    Katz, Oded

    2010-05-01

The active Dead Sea Rift (DSR) runs along the length of Israel, making the entire state susceptible to earthquake-related hazards. Current building codes generally acknowledge seismic hazards and direct engineers towards earthquake-resistant structures. However, hazard mapping at a scale fit for municipal/governmental planning is subject to local initiative and is currently not mandatory, as seems necessary. In the following, a few cases of seismic-hazard evaluation made by the Geological Survey of Israel are presented, emphasizing the reasons for their initiation and the way the results were incorporated (or not). The first case is a qualitative seismic-hazard micro-zonation requested by the municipality of Jerusalem as part of a new master plan. This work resulted in maps (1:50,000; GIS format) identifying areas prone to (1) amplification of seismic shaking due to site characteristics (outcrops of soft rocks or steep topography) and (2) sites with earthquake-induced landslide (EILS) hazard. Results were validated using reports from the 1927 M=6.2 earthquake that originated along the DSR about 30 km east of Jerusalem. Although the hazard maps were accepted by the municipal authorities, practical use by geotechnical engineers working within the frame of the new master plan was not significant. The main reason for that is apparently a difference of opinion between the city engineers responsible for implementing the new master plan and the geologists responsible for the hazard evaluation. The second case involves evaluation of EILS hazard for two towns located further north along the DSR, Zefat and Tiberias. Both were heavily damaged more than once by strong earthquakes in past centuries. Work was carried out as part of a governmental seismic-hazard reduction program. The results include maps (1:10,000 scale) of sites with high EILS hazard identified within the city limits. Maps (in GIS format) were sent to city engineers with reports explaining the methods and results. As far as

  12. A Procedure for Rapid Localized Earthquake Hazard Analysis

    NASA Astrophysics Data System (ADS)

    Holliday, J. R.; Rundle, J. B.

    2010-12-01

In this presentation, we introduce various ground shaking and building response models. We then discuss the forecasting capabilities of different models submitted to the Collaboratory for the Study of Earthquake Predictability (CSEP) and show how they can be used as inputs for these models. Finally, we discuss how outputs from such multi-tiered calculations would prove invaluable for real-time and scenario-based hazard assessment and for cost-benefit analysis of possible mitigation actions.
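    A multi-tiered calculation of this kind typically chains an earthquake-rate forecast, a ground-motion model and a building-response (fragility) model by total probability. The schematic form below is a generic assumption about how such tiers compose, not the authors' specific equations:

```latex
% Schematic composition of forecast, ground-motion and building-response tiers
% by total probability (an assumed generic form, not the authors' formulation):
\lambda(D > d) \;=\; \sum_{i} \nu_i \iiint
  P(D > d \mid s)\; f(s \mid m, r)\; f_i(m, r)\; \mathrm{d}s\, \mathrm{d}m\, \mathrm{d}r
```

    Here ν_i is the forecast rate of source i from a CSEP-style model, f(s | m, r) is the ground-shaking model for magnitude m and distance r, and P(D > d | s) is the building-response model.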

  13. Risk assessment of debris flow hazards in natural slope

    NASA Astrophysics Data System (ADS)

    Choi, Junghae; Chae, Byung-gon; Liu, Kofei; Wu, Yinghsin

    2016-04-01

The study area is located in the north-eastern part of South Korea. According to the landslide susceptibility map (KIGAM, 2009) from the Korea Institute of Geoscience and Mineral Resources (KIGAM), large areas of high landslide potential exist on the mountain slopes near the study area. In addition, several severe landslide-induced debris flow hazards have recently occurred in this area, so the site is considered prone to debris flow hazards. In order to mitigate the influence of such hazards, assessment of potential debris flow hazards is essential. In this assessment, we use Debris-2D, a debris flow numerical program, to assess the potential debris flow hazards. The worst-case scenario is considered for simulation. The input mass sources are determined using the landslide susceptibility map, and the water input is based on the daily accumulated rainfall of a past debris flow event in the study area. The only input material property, the yield stress, is obtained from a calibration test. The simulation results show that the study area has the potential to be impacted by debris flow. Therefore, based on the simulation results, we can propose countermeasures to mitigate debris flow hazards, including building check dams, constructing a protection wall in the study area, and installing instruments for active monitoring of debris flows. Acknowledgements: This research was supported by the Public Welfare & Safety Research Program through the National Research Foundation of Korea (NRF) funded by the Ministry of Science, ICT & Future Planning (NRF-2012M3A2A1050983).
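    Yield stress is the single rheological input mentioned above: in yield-stress (Bingham-type) debris flow models, material keeps moving only where the gravitational driving stress exceeds it. The sketch below is a back-of-envelope mobilization check under assumed values, not a reproduction of the Debris-2D calculation.

```python
# Back-of-envelope mobilization check for a yield-stress (Bingham-type) debris
# mixture: flow continues where the gravitational driving stress exceeds the
# yield stress. All numbers below are illustrative assumptions.
import math

def driving_stress(depth_m, slope_deg, density=2000.0, g=9.81):
    """Basal shear stress (Pa) of a uniform layer on an infinite slope."""
    return density * g * depth_m * math.sin(math.radians(slope_deg))

tau_yield = 2000.0  # Pa, e.g. from a calibration test (assumed value)
for depth, slope in [(0.5, 5), (1.0, 10), (2.0, 15)]:
    tau = driving_stress(depth, slope)
    state = "flows" if tau > tau_yield else "stops"
    print(f"h={depth} m, slope={slope} deg: tau={tau:.0f} Pa -> {state}")
```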

  14. Reproductive Hazards

    MedlinePlus

... such as lead and mercury, chemicals such as pesticides, cigarettes, some viruses, and alcohol. For men, a reproductive hazard can affect the sperm. For a woman, a reproductive hazard can cause different effects during pregnancy, depending on when she is exposed. ...

  15. Seismic Hazard Analysis in EL Paso/juarez Area from Study of Young Fault Scarps

    NASA Astrophysics Data System (ADS)

Ashenfelter, K. R.

    2012-12-01

The El Paso-Juarez metropolitan area contains a known record of active faulting, but also has one of the most poorly known paleoseismic records. The scarcity of data means that nearly 2 million people are exposed to a seismic hazard with little known about the actual risk. Active faults are known along the eastern side of the Franklin Mountains, as well as young ruptures within the Hueco Bolson in East El Paso, yet the only fault with paleoseismic studies is the East Franklins fault. Recent population increases in the El Paso region have led to a construction boom in east El Paso, and many of these construction sites cross known Quaternary fault ruptures. This research project contains two potential components: 1) An exploratory component: students can use existing fault maps and high resolution aerial photography to seek out sites where active construction might be unearthing exposures of young fault ruptures. The study is exploratory, and involves careful mapping using field GIS systems to document areas for potential study, map possible faults, etc. 2) An active fault study in an urbanized environment: the East Franklins fault is a known active fault. The scarp is exposed near trans-mountain road, and along some side streets in NE El Paso. Potential studies include careful mapping of fault scarp topographic profiles and mapping of surface traces.

  16. Using Significant Geologic Hazards and Disasters to Focus Geoethics Case Studies

    NASA Astrophysics Data System (ADS)

    Cronin, V. S.

    2015-12-01

    Ethics education since classical times has involved the consideration of stories, parables, myths, fables, allegories and histories. These are the ancient equivalents of case studies. Modern case studies are used in applied-ethics courses in law, engineering, business, and science. When used in a geoscience course, geoethical case studies can enrich a student's understanding of the relationships between issues of geoscience, engineering, sociology, business, public policy and law - all with an ethical dimension. Perhaps more importantly, real cases affected real people. Students develop a strong empathetic connection to the people involved, enhancing students' drive to understand the interconnected layers of the cases. Students might begin to appreciate that geoscientists can help to avoid or alleviate human suffering -- that their careers can have meaning and purpose beyond simply earning a paycheck. Geologic disasters in which losses could have been predicted, avoided or minimized are quite effective as cases. Coupling a "disaster" case with a comparable "hazard" case is particularly effective. For example, there are many places along the San Andreas Fault in California where [1] significant coseismic displacement has occurred during historical times, [2] structures that are still inhabited were built along or across active traces prior to the Alquist-Priolo Earthquake Fault Zoning Act in 1971, and [3] inhabited structures have been built legally since 1971 within a few tens of feet of active traces. The question students confront is whether society ought to allow habitable structures to be built very near to a major active fault. This topic allows students to work with issues of law, history, seismology, seismic site response, crustal deformation adjacent to active faults, building codes and, ultimately, ethics. Similar progressions can be developed for other major geologic hazards, both natural and man-made, such as floods, landslides, erosion along rivers and

  17. Remedial Action Assessment System (RAAS): Evaluation of selected feasibility studies of CERCLA (Comprehensive Environmental Response, Compensation, and Liability Act) hazardous waste sites

    SciTech Connect

Whelan, G.; Hartz, K.E.; Hilliard, N.D. and Associates, Seattle, WA

    1990-04-01

Congress and the public have mandated much closer scrutiny of the management of chemically hazardous and radioactive mixed wastes. Legislative language, regulatory intent, and prudent technical judgment call for using scientifically based studies to assess current conditions and to evaluate and select cost-effective strategies for mitigating unacceptable situations. The NCP requires that a Remedial Investigation (RI) and a Feasibility Study (FS) be conducted at each site targeted for remedial response action. The goal of the RI is to obtain the site data needed so that the potential impacts on public health or welfare or on the environment can be evaluated and so that remedial alternatives can be identified and selected. The goal of the FS is to identify and evaluate alternative remedial actions (including a no-action alternative) in terms of their cost, effectiveness, and engineering feasibility. The NCP also requires the analysis of impacts on public health and welfare and on the environment; this analysis is the endangerment assessment (EA). In summary, the RI, EA, and FS processes require assessment of the contamination at a site, of the potential impacts on public health or the environment from that contamination, and of alternative RAs that could address those potential impacts. 35 refs., 7 figs., 1 tab.

  18. Review: Assessment of completeness of reporting in intervention studies using livestock: an example from pain mitigation interventions in neonatal piglets.

    PubMed

    O'Connor, A; Anthony, R; Bergamasco, L; Coetzee, J F; Dzikamunhenga, R S; Johnson, A K; Karriker, L A; Marchant-Forde, J N; Martineau, G P; Millman, S T; Pajor, E A; Rutherford, K; Sprague, M; Sutherland, M A; von Borell, E; Webb, S R

    2016-04-01

    Accurate and complete reporting of study methods, results and interpretation are essential components for any scientific process, allowing end-users to evaluate the internal and external validity of a study. When animals are used in research, excellence in reporting is expected as a matter of continued ethical acceptability of animal use in the sciences. Our primary objective was to assess completeness of reporting for a series of studies relevant to mitigation of pain in neonatal piglets undergoing routine management procedures. Our second objective was to illustrate how authors can report the items in the Reporting guidElines For randomized controLled trials for livEstoCk and food safety (REFLECT) statement using examples from the animal welfare science literature. A total of 52 studies from 40 articles were evaluated using a modified REFLECT statement. No single study reported all REFLECT checklist items. Seven studies reported specific objectives with testable hypotheses. Six studies identified primary or secondary outcomes. Randomization and blinding were considered to be partially reported in 21 and 18 studies, respectively. No studies reported the rationale for sample sizes. Several studies failed to report key design features such as units for measurement, means, standard deviations, standard errors for continuous outcomes or comparative characteristics for categorical outcomes expressed as either rates or proportions. In the discipline of animal welfare science, authors, reviewers and editors are encouraged to use available reporting guidelines to ensure that scientific methods and results are adequately described and free of misrepresentations and inaccuracies. Complete and accurate reporting increases the ability to apply the results of studies to the decision-making process and prevent wastage of financial and animal resources. PMID:26556522

  19. A Competence-Based Science Learning Framework Illustrated through the Study of Natural Hazards and Disaster Risk Reduction

    ERIC Educational Resources Information Center

    Oyao, Sheila G.; Holbrook, Jack; Rannikmäe, Miia; Pagunsan, Marmon M.

    2015-01-01

    This article proposes a competence-based learning framework for science teaching, applied to the study of "big ideas", in this case to the study of natural hazards and disaster risk reduction (NH&DRR). The framework focuses on new visions of competence, placing emphasis on nurturing connectedness and behavioral actions toward…

  20. Natural phenomena hazards, Hanford Site, Washington

    SciTech Connect

    Conrads, T.J.

    1998-09-29

    This document presents the natural phenomena hazard loads for use in implementing DOE Order 5480.28, Natural Phenomena Hazards Mitigation, and supports development of double-shell tank systems specifications at the Hanford Site in south-central Washington State. The natural phenomena covered are seismic, flood, wind, volcanic ash, lightning, snow, temperature, solar radiation, suspended sediment, and relative humidity.

  1. 14 CFR 437.55 - Hazard analysis.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... (iii) Analysis. (b) A permittee must carry out the risk elimination and mitigation measures derived... 14 Aeronautics and Space 4 2014-01-01 2014-01-01 false Hazard analysis. 437.55 Section 437.55... TRANSPORTATION LICENSING EXPERIMENTAL PERMITS Safety Requirements § 437.55 Hazard analysis. (a) A permittee...

  2. 14 CFR 437.55 - Hazard analysis.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... (iii) Analysis. (b) A permittee must carry out the risk elimination and mitigation measures derived... 14 Aeronautics and Space 4 2013-01-01 2013-01-01 false Hazard analysis. 437.55 Section 437.55... TRANSPORTATION LICENSING EXPERIMENTAL PERMITS Safety Requirements § 437.55 Hazard analysis. (a) A permittee...

  3. 14 CFR 437.55 - Hazard analysis.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... (iii) Analysis. (b) A permittee must carry out the risk elimination and mitigation measures derived... 14 Aeronautics and Space 4 2012-01-01 2012-01-01 false Hazard analysis. 437.55 Section 437.55... TRANSPORTATION LICENSING EXPERIMENTAL PERMITS Safety Requirements § 437.55 Hazard analysis. (a) A permittee...

  4. Safety in earth orbit study. Volume 2: Analysis of hazardous payloads, docking, on-board survivability

    NASA Technical Reports Server (NTRS)

    1972-01-01

Detailed and supporting analyses are presented of the hazardous payloads, docking, and on-board survivability aspects connected with earth orbital operations of the space shuttle program. The hazards resulting from delivery, deployment, and retrieval of hazardous payloads, and from handling and transport of cargo between the orbiter, sortie modules, and space station are identified and analyzed. The safety aspects of shuttle orbiter to modular space station docking include docking for assembly of the space station, normal resupply docking, and emergency docking. Personnel traffic patterns, escape routes, and on-board survivability are analyzed for the orbiter with crew and passengers, sortie modules, and modular space station, under normal, emergency, and EVA and IVA operations.

  5. Environmental mitigation at hydroelectric projects

    SciTech Connect

Sale, M.J.; Cada, G.F.; Chang, L.H.; Christensen, S.W.; Railsback, S.F.; Francfort, J.E.; Rinehart, B.N.; Sommers, G.L.

    1991-12-01

    Current environmental mitigation practices at nonfederal hydropower projects were analyzed. Information about instream flows, dissolved oxygen (DO) mitigation, and upstream and downstream fish passage facilities was obtained from project operators, regulatory and resource agencies, and literature reviews. Information provided by the operators includes the specific mitigation requirements imposed on each project, specific objectives or purposes of mitigation, mitigation measures chosen to meet the requirement, the kinds of post-project monitoring conducted, and the costs of mitigation. Costs are examined for each of the four mitigation methods, segmented by capital, study, operations and maintenance, and annual reporting costs. Major findings of the study include: the dominant role of the Instream Flow Incremental Methodology, in conjunction with professional judgment by agency biologists, to set instream flow requirements; reliance on spill flows for DO enhancement; and the widespread use of angled bar racks for downstream fish protection. All of these measures can have high costs and, with few exceptions, there are few data available from nonfederal hydropower projects with which to judge their effectiveness. 100 refs.

  6. Natural Hazards, Second Edition

    NASA Astrophysics Data System (ADS)

    Rouhban, Badaoui

Natural disaster loss is on the rise, and the vulnerability of the human and physical environment to the violent forces of nature is increasing. In many parts of the world, disasters caused by natural hazards such as earthquakes, floods, landslides, drought, wildfires, intense windstorms, tsunami, and volcanic eruptions have caused the loss of human lives, injury, homelessness, and the destruction of economic and social infrastructure. Over the last few years, there has been an increase in the occurrence, severity, and intensity of disasters, culminating with the devastating tsunami of 26 December 2004 in South East Asia. Natural hazards are often unexpected or uncontrollable natural events of varying magnitude. Understanding their mechanisms and assessing their distribution in time and space are necessary for refining risk mitigation measures. This second edition of Natural Hazards (following a first edition published in 1991 by Cambridge University Press), written by Edward Bryant, associate dean of science at Wollongong University, Australia, grapples with this crucial issue, with aspects of hazard prediction, and with other related topics. The book presents a comprehensive analysis of different categories of hazards of climatic and geological origin.

  7. Vulnerability studies and integrated assessments for hazard risk reduction in Pittsburgh, PA (Invited)

    NASA Astrophysics Data System (ADS)

    Klima, K.

    2013-12-01

Today's environmental problems stretch beyond the bounds of most academic disciplines, and thus solutions require an interdisciplinary approach. For instance, the scientific consensus is that the frequency and severity of many types of extreme weather events are increasing (IPCC 2012). Yet despite our efforts to reduce greenhouse gases, we continue to experience severe weather events such as Superstorm Sandy, record heat and blizzards, and droughts. These natural hazards, combined with increased vulnerability and exposure, result in longer-lasting disruptions to critical infrastructure and business continuity throughout the world. In order to protect both our lives and the economy, we must think beyond the bounds of any one discipline to include an integrated assessment of relevant work. In the wake of recent events, New York City, Washington, DC, Chicago, and a myriad of other cities have turned to their academic powerhouses for assistance in better understanding their vulnerabilities. This talk will share a case study of the state of integrated assessments and vulnerability studies of energy, transportation, water, real estate, and other main sectors in Pittsburgh, PA. The talk will then use integrated assessment models and other vulnerability studies to create coordinated sets of climate projections for use by the many public agencies and private-sector organizations in the region.

  8. Household hazardous wastes as a potential source of pollution: a generation study.

    PubMed

    Ojeda-Benítez, Sara; Aguilar-Virgen, Quetzalli; Taboada-González, Paul; Cruz-Sotelo, Samantha E

    2013-12-01

Certain domestic wastes exhibit characteristics that render them dangerous, such as explosiveness, flammability, spontaneous combustion, reactivity, toxicity and corrosiveness. The lack of information about their generation and composition hinders the creation of special programs for their collection and treatment, making these wastes a potential threat to human health and the environment. We attempted to quantify the levels of hazardous household waste (HHW) generated in Mexicali, Mexico. The analysis considered three socioeconomic strata and eight categories. The sampling was undertaken on a house-by-house basis, and hypothesis testing was based on differences between two proportions for each of the eight categories. In this study, HHW comprised 3.49% of the total generated waste, which exceeded that reported in previous studies in Mexico. The greatest quantity of HHW was generated by the middle stratum; in the upper stratum, most packages were discarded with their contents remaining. Cleaning products represent 45.86% of the HHW generated. Only two of the eight categories showed no statistical differences among the three social strata. The scarcity of studies on HHW generation limits direct comparisons. Any decrease in waste generation within the middle social stratum will have a large effect on the total amount of waste generated and decrease its impact on environmental and human health. PMID:24293231
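    The category-by-category comparison mentioned above is a standard test of the difference between two proportions. A minimal pooled-variance version is sketched below; the counts in the example are invented for illustration and are not the study's data.

```python
# Minimal two-proportion z-test (pooled standard error) of the kind described;
# the example counts are made up for illustration, not the study's data.
import math

def two_proportion_ztest(x1, n1, x2, n2):
    """Return the z statistic and two-sided p-value for H0: p1 == p2."""
    p1, p2 = x1 / n1, x2 / n2
    p_pool = (x1 + x2) / (n1 + n2)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))  # equals 2 * (1 - Phi(|z|))
    return z, p_value

# Example: share of cleaning products in HHW for two hypothetical strata.
print(two_proportion_ztest(46, 100, 38, 100))
```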

  9. Climate change adaptation & mitigation strategies for Water-Energy-Land Nexus management in Mediterranean region: Case study of Catalunya (Spain).

    NASA Astrophysics Data System (ADS)

    Kumar, Vikas; Schuhmacher, Marta

    2016-04-01

Water-Energy-Land (WEL) nexus management is a complex decision problem in which a holistic approach to supply-demand management considering different criteria is valuable. However, multi-criteria decision making with diverse indicators measured on different scales and with different uncertainty levels is difficult. At the same time, climate adaptation and mitigation need to be integrated, and resource-sensitive regions such as the Mediterranean provide ample opportunities towards that end. While the water sector plays a key role in climate adaptation, mitigation focuses on the energy and agriculture sectors. Recent studies on the so-called WEL nexus confirm the potential synergies to be derived from mainstreaming climate adaptation in the water sector, while simultaneously addressing opportunities for co-management with energy (and also land use). The objective of this paper is to develop scenarios for future imbalances in water and energy supply and demand for a water-stressed Mediterranean area of northern Spain (Catalonia) and to test scenario-based climate adaptation and mitigation strategies for WEL management policies. This resource-sensitive area presents an interesting nexus problem: a highly stressed water demand (representing all major demand sectors), very heterogeneous land use ranging from intensive agriculture to diversified urban and industrial uses, and a mixed energy supply including hydro, wind, gas turbine and nuclear energy. Different energy sectors have different water and land requirements. Inter-basin water transfer is another factor considered for this area. The water-energy link is multifaceted. Energy production can affect water quality, while energy is used in water treatment and to reduce pollution. Similarly, hydropower - producing energy from water - and desalination - producing freshwater using energy - both play important roles in economic growth by supplying large and secure amounts of 'green' energy or

  10. Mitigating Challenges of Using Virtual Reality in Online Courses: A Case Study

    ERIC Educational Resources Information Center

    Stewart, Barbara; Hutchins, Holly M.; Ezell, Shirley; De Martino, Darrell; Bobba, Anil

    2010-01-01

    Case study methodology was used to describe the challenges experienced in the development of a virtual component for a freshman-level undergraduate course. The purpose of the project was to use a virtual environment component to provide an interactive and engaging learning environment. While some student and faculty feedback was positive, this…

  11. State-of-the Art of hazard mapping in a mountain village: a case study in South Tyrol

    NASA Astrophysics Data System (ADS)

    Zanotti, Fabrizio; Marini, Matteo; Simoni, Silvia; Casagranda, Elisabetta; Begnudelli, Lorenzo; Dall'Amico, Matteo

    2010-05-01

The recent catastrophes that have occurred in Italy have raised concern about the delicate equilibrium between human settlements and natural hazards. Numerous laws and regional directives have been put in place in order to establish new rules for future urbanization, especially in mountain areas, where numerous hazards threaten the territory: snow avalanches, rockfall, landslides, debris flow and floods. The objective of this work is to illustrate a procedure for the elaboration of a hazard map for a mountain village in South Tyrol and to describe the results obtained. Snow avalanches, rockfall, debris flow, landslides and floods have been considered as the potential hazards. This work represents an inter-sectorial effort encompassing different capabilities and expertise to evaluate natural hazards in mountain regions; in particular, geological studies, forestry analyses and engineering calculations proved to be essential in this context. The most advanced techniques and software available from research in the field have been used, such as the hydrological model GEOtop (www.geotop.org) for the precipitation analysis and landslide susceptibility, the model TRENT-2D for the propagation of debris flows, and other advanced models for flood forecasting and rockfall simulation.

  12. Social Uptake of Scientific Understanding of Seismic Hazard in Sumatra and Cascadia

    NASA Astrophysics Data System (ADS)

    Shannon, R.; McCloskey, J.; Guyer, C.; McDowell, S.; Steacy, S.

    2007-12-01

The importance of science within hazard mitigation cannot be overstated. Robust mitigation policies rely strongly on a sound understanding of the science underlying potential natural disasters and the transfer of that knowledge from the scientific community to the general public via governments and policy makers. We aim to investigate how and why the public's knowledge, perceptions, response, adjustments and values towards science have changed throughout two decades of research conducted in areas along and adjacent to the Sumatran and Cascadia subduction zones. We will focus on two countries subject to the same potential hazard but encompassing starkly contrasting political, economic, social and environmental settings. The transfer of scientific knowledge into the public/social arena is a complex process, the success of which is reflected in a community's ability to withstand large-scale devastating events. Although no one could have foreseen the magnitude of the 2004 Boxing Day tsunami, the social devastation generated underscored the stark absence of mitigation measures in the nations most heavily affected. It furthermore emphasized the need for the design and implementation of disaster preparedness measures. A survey of existing literature has already established timelines for major events and public policy changes in the case study areas. Clear evidence exists of the link between scientific knowledge and its subsequent translation into public policy, particularly in the Cascadia context. The initiation of the National Tsunami Hazard Mitigation Program following the Cape Mendocino earthquake in 1992 embodies this link. Despite a series of environmental disasters with recorded widespread fatalities dating back to the mid 1900s and a heightened impetus for scientific research into tsunami/earthquake hazard following the 2004 Boxing Day tsunami, the translation of science into the public realm is not widely obvious in the Sumatran context. This research

  13. Inactivation of RNA Viruses by Gamma Irradiation: A Study on Mitigating Factors.

    PubMed

    Hume, Adam J; Ames, Joshua; Rennick, Linda J; Duprex, W Paul; Marzi, Andrea; Tonkiss, John; Mühlberger, Elke

    2016-01-01

    Effective inactivation of biosafety level 4 (BSL-4) pathogens is vital in order to study these agents safely. Gamma irradiation is a commonly used method for the inactivation of BSL-4 viruses, which among other advantages, facilitates the study of inactivated yet morphologically intact virions. The reported values for susceptibility of viruses to inactivation by gamma irradiation are sometimes inconsistent, likely due to differences in experimental protocols. We analyzed the effects of common sample attributes on the inactivation of a recombinant vesicular stomatitis virus expressing the Zaire ebolavirus glycoprotein and green fluorescent protein. Using this surrogate virus, we found that sample volume and protein content of the sample modulated viral inactivation by gamma irradiation but that air volume within the sample container and the addition of external disinfectant surrounding the sample did not. These data identify several factors which alter viral susceptibility to inactivation and highlight the usefulness of lower biosafety level surrogate viruses for such studies. Our results underscore the need to validate inactivation protocols of BSL-4 pathogens using "worst-case scenario" procedures to ensure complete sample inactivation. PMID:27455307

  14. Inactivation of RNA Viruses by Gamma Irradiation: A Study on Mitigating Factors

    PubMed Central

    Hume, Adam J.; Ames, Joshua; Rennick, Linda J.; Duprex, W. Paul; Marzi, Andrea; Tonkiss, John; Mühlberger, Elke

    2016-01-01

    Effective inactivation of biosafety level 4 (BSL-4) pathogens is vital in order to study these agents safely. Gamma irradiation is a commonly used method for the inactivation of BSL-4 viruses, which among other advantages, facilitates the study of inactivated yet morphologically intact virions. The reported values for susceptibility of viruses to inactivation by gamma irradiation are sometimes inconsistent, likely due to differences in experimental protocols. We analyzed the effects of common sample attributes on the inactivation of a recombinant vesicular stomatitis virus expressing the Zaire ebolavirus glycoprotein and green fluorescent protein. Using this surrogate virus, we found that sample volume and protein content of the sample modulated viral inactivation by gamma irradiation but that air volume within the sample container and the addition of external disinfectant surrounding the sample did not. These data identify several factors which alter viral susceptibility to inactivation and highlight the usefulness of lower biosafety level surrogate viruses for such studies. Our results underscore the need to validate inactivation protocols of BSL-4 pathogens using “worst-case scenario” procedures to ensure complete sample inactivation. PMID:27455307

  15. Methane emission from ruminants and solid waste: A critical analysis of baseline and mitigation projections for climate and policy studies

    NASA Astrophysics Data System (ADS)

    Matthews, E.

    2012-12-01

Current and projected estimates of methane (CH4) emission from anthropogenic sources are numerous but have rarely been examined or compared. Presented here is a critical appraisal of CH4 projections used in climate-chemistry and policy studies. We compare emissions for major CH4 sources from several groups, including our own new data and the RCP projections developed for climate-chemistry models for the next IPCC Assessment Report (AR5). We focus on current and projected baseline and mitigation emissions from ruminant animals and solid waste, both of which are predicted to rise dramatically in coming decades, driven primarily by developing countries. For waste, drivers include increasing urban populations, higher per capita waste generation due to economic growth, and increasing landfilling rates. Analysis of a new global data base detailing waste composition, collection and disposal indicates that IPCC-based methodologies and default data overestimate CH4 emission for the current period, which cascades into substantial overestimates in future projections. CH4 emission from solid waste is estimated to be ~10-15 Tg CH4/yr currently, rather than the ~35 Tg/yr often reported in the literature. Moreover, emissions from developing countries are unlikely to rise rapidly in coming decades because new management approaches, such as sanitary landfills, that would increase emissions are maladapted to infrastructures in these countries and therefore unlikely to be implemented. The low current emission associated with solid waste (~10 Tg), together with future modest growth, implies that mitigation of waste-related CH4 emission is a poor candidate for slowing global warming. In the case of ruminant animals (~90 Tg CH4/yr currently), the dominant assumption driving future trajectories of CH4 emission is a substantial increase in meat and dairy consumption in developing countries, to be satisfied by growing animal populations. Unlike solid waste, current ruminant emissions among studies exhibit a

  16. The Relative Severity of Single Hazards within a Multi-Hazard Framework

    NASA Astrophysics Data System (ADS)

    Gill, Joel C.; Malamud, Bruce D.

    2013-04-01

Here we present a description of the relative severity of single hazards within a multi-hazard framework, compiled by examining, quantifying and ranking the extent to which individual hazards trigger or increase the probability of other hazards. Hazards are broken into six major groupings (geophysical, hydrological, shallow earth processes, atmospheric, biophysical and space), with the interactions for 21 different hazard types examined. These interactions include both one primary hazard triggering a secondary hazard, and one primary hazard increasing the probability of a secondary hazard occurring. Through a wide-ranging review of grey and peer-reviewed literature, we identify >90 interactions. The numbers of hazard-type linkages are then summed for each hazard in terms of its influence (the number of times one hazard type triggers another type of hazard, or itself) and its sensitivity (the number of times one hazard type is triggered by other hazard types, or itself). The 21 hazards are then ranked based on (i) influence and (ii) sensitivity. We found, by quantification and ranking of these hazards, that: (i) the strongest influencers (those triggering the most secondary hazards) are volcanic eruptions, earthquakes and storms, which when taken together trigger almost a third of the possible hazard interactions identified; (ii) the most sensitive hazards (those being triggered by the most primary hazards) are landslides, volcanic eruptions and floods; (iii) when sensitivity rankings are adjusted to take into account the differential likelihoods of different secondary hazards being triggered, the most sensitive hazards are found to be landslides, floods, earthquakes and ground heave. We believe that by determining the strongest influencing and the most sensitive hazards for specific spatial areas, the allocation of resources for mitigation measures might be done more effectively.
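    The influence and sensitivity scores described above are simply row and column sums of a hazard interaction matrix. The toy example below shows that bookkeeping for a handful of hazard types; the matrix entries are illustrative and do not reproduce the >90 interactions identified in the study.

```python
# Sketch of the influence/sensitivity ranking described: a toy binary hazard
# interaction matrix where entry [i, j] = 1 if hazard i triggers or raises the
# probability of hazard j. The matrix below is illustrative, not the paper's.
import numpy as np

hazards = ["earthquake", "volcanic eruption", "storm", "landslide", "flood"]
trigger = np.array([
    # eq  volc  storm  lslide  flood
    [0,   0,    0,     1,      1],   # earthquake
    [1,   0,    0,     1,      1],   # volcanic eruption
    [0,   0,    0,     1,      1],   # storm
    [0,   0,    0,     1,      1],   # landslide (can trigger further landslides, floods)
    [0,   0,    0,     1,      0],   # flood
])

influence = trigger.sum(axis=1)    # how many hazard types each hazard triggers
sensitivity = trigger.sum(axis=0)  # how many hazard types trigger each hazard

for name, inf, sen in sorted(zip(hazards, influence, sensitivity),
                             key=lambda t: -t[1]):
    print(f"{name:18s} influence={inf}  sensitivity={sen}")
```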

  17. Role of human- and animal-sperm studies in the evaluation of male reproductive hazards

    SciTech Connect

    Wyrobek, A.J.; Gordon, L.; Watchmaker, G.

    1982-04-07

    Human sperm tests provide a direct means of assessing chemically induced spermatogenic dysfunction in man. Available tests include sperm count, motility, morphology (seminal cytology), and Y-body analyses. Over 70 different human exposures have been monitored in various groups of exposed men. The majority of exposures studied showed a significant change from control in one or more sperm tests. When carefully controlled, the sperm morphology test is statistically the most sensitive of these human sperm tests. Several sperm tests have been developed in nonhuman mammals for the study of chemical spermatotoxins. The sperm morphology test in mice has been the most widely used. Results with this test seem to be related to germ-cell mutagenicity. In general, animal sperm tests should play an important role in the identification and assessment of potential human reproductive hazards. Exposure to spermatotoxins may lead to infertility, and more importantly, to heritable genetic damage. While there are considerable animal and human data suggesting that sperm tests may be used to detect agents causing infertility, the extent to which these tests detect heritable genetic damage remains unclear. (ERB)

  18. Exercise as an intervention for sedentary hazardous drinking college students: A pilot study

    PubMed Central

    Weinstock, Jeremiah; Capizzi, Jeffrey; Weber, Stefanie M.; Pescatello, Linda S.; Petry, Nancy M.

    2014-01-01

    Young adults aged 18–24 years have the highest rates of problems associated with alcohol use among all age groups, and substance use is inversely related to engagement in substance-free activities. This pilot study investigated the promotion of one specific substance-free activity, exercise, on alcohol use in college students. Thirty-one sedentary college students who engaged in hazardous drinking (Alcohol Use Disorders Identification Test scores ≥ 8) were randomized to one of two conditions: (a) one 50-minute session of motivational enhancement therapy (MET) focused on increasing exercise, or (b) one 50-minute session of MET focused on increasing exercise plus 8 weeks of contingency management (CM) for adhering to specific exercise activities. All participants completed evaluations at baseline and post-treatment (2 months later) assessing exercise participation and alcohol use. Results of the pilot study suggest that the interventions were well received by participants and that the MET+CM condition increased self-reported exercise frequency relative to the MET-alone condition, but other indices of exercise, physical fitness, and alcohol use did not differ between the interventions over time. These results suggest that a larger scale trial could better assess the efficacy of this well-received combined intervention. Investigation in other clinically relevant populations is also warranted. PMID:24949085

  19. Are some "safer alternatives" hazardous as PBTs? The case study of new flame retardants.

    PubMed

    Gramatica, Paola; Cassani, Stefano; Sangion, Alessandro

    2016-04-01

    Some brominated flame retardants (BFRs), such as PBDEs, are persistent, bioaccumulative and toxic (PBT) and are restricted or prohibited under various legislations. They are being replaced by "safer" flame retardants (FRs), such as new BFRs or organophosphorus compounds. However, information on the PBT behaviour of these substitutes is often lacking. PBT assessment is required by the REACH regulation, and PBT chemicals should be subject to authorization. Several new FRs, proposed and already used as safer alternatives to PBDEs, are screened here by the cumulative PBT Index model, implemented in QSARINS (QSAR-Insubria), new software for the development and validation of QSAR models. The results, obtained directly from the chemical structure for the three studied characteristics together, were compared with those from the US-EPA PBT Profiler: the two approaches are in good agreement, supporting the utility of a consensus approach in these screenings. A priority list of the most harmful FRs, predicted in agreement by the two modelling tools, is proposed, highlighting that some supposed "safer alternatives" are intrinsically hazardous because of their PBT properties. This study also shows that the PBT Index could be a valid tool to evaluate appropriate and safer substitutes a priori, from the chemical design stage, in a benign-by-design approach, avoiding unnecessary synthesis and tests. PMID:26742016

  20. An Optimal Mitigation Strategy Against the Asteroid Impact Threat with Short Warning Time

    NASA Technical Reports Server (NTRS)

    Wie, Bong; Barbee, Brent W.

    2015-01-01

    This paper presents the results of a NASA Innovative Advanced Concepts (NIAC) Phase 2 study entitled "An Innovative Solution to NASA's Near-Earth Object (NEO) Impact Threat Mitigation Grand Challenge and Flight Validation Mission Architecture Development." This NIAC Phase 2 study was conducted at the Asteroid Deflection Research Center (ADRC) of Iowa State University in 2012-2014. The study objective was to develop an innovative yet practically implementable mitigation strategy for the most probable impact threat: an asteroid or comet with short warning time (less than 5 years). The mitigation strategy described in this paper is intended to optimally reduce the severity and catastrophic damage of an NEO impact event, especially when there is insufficient warning time for non-disruptive deflection of a hazardous NEO. This paper provides an executive summary of the NIAC Phase 2 study results.

  1. Laboratory Study of Polychlorinated Biphenyl (PCB) Contamination and Mitigation in Buildings -- Part 4. Evaluation of the Activated Metal Treatment System (AMTS) for On-site Destruction of PCBs

    EPA Science Inventory

    This is the fourth, also the last, report of the report series entitled “Laboratory Study of Polychlorinated Biphenyl (PCB) Contamination and Mitigation in Buildings.” This report evaluates the performance of an on-site PCB destruction method, known as the AMTS method, developed ...

  2. Laboratory Study of Polychlorinated Biphenyl Contamination and Mitigation in Buildings -- Part 4. Evaluation of the Activated Metal Treatment System (AMTS) for On-site Destruction of PCBs

    EPA Science Inventory

    This is the fourth, also the last, report of the report series entitled “Laboratory Study of Polychlorinated Biphenyl (PCB) Contamination and Mitigation in Buildings.” This report evaluates the performance of an on-site PCB destruction method, known as the AMTS method...

  3. Legal and institutional tools to mitigate plastic pollution affecting marine species: Argentina as a case study.

    PubMed

    González Carman, Victoria; Machain, Natalia; Campagna, Claudio

    2015-03-15

    Plastics are the most common form of debris found along the Argentine coastline. The Río de la Plata estuarine area is a relevant case study for describing a situation where ample policy exists against a backdrop of plastics disposed of by populated coastal areas, industries, and vessels, with resultant high impacts of plastic pollution on marine turtles and mammals. Policy and institutions are in place but the impact remains due to ineffective waste management, limited public education and awareness, and weaknesses in enforcement of regulations. This context is frequently repeated all over the world. We list possible interventions to increase the effectiveness of policy that require integrating efforts among governments, the private sector, non-governmental organizations and the inhabitants of coastal cities to reduce the amount of plastics reaching the Río de la Plata and protect threatened marine species. What has been identified for Argentina applies to the region and globally. PMID:25627195

  4. Dietary Flaxseed Mitigates Impaired Skeletal Muscle Regeneration: in Vivo, in Vitro and in Silico Studies

    PubMed Central

    Carotenuto, Felicia; Costa, Alessandra; Albertini, Maria Cristina; Rocchi, Marco Bruno Luigi; Rudov, Alexander; Coletti, Dario; Minieri, Marilena; Di Nardo, Paolo; Teodori, Laura

    2016-01-01

    Background: Diets enriched with n-3 polyunsaturated fatty acids (n-3 PUFAs) have been shown to exert a positive impact on muscle diseases. Flaxseed is one of the richest sources of the n-3 PUFA α-linolenic acid (ALA). The aim of this study was to assess the effects of flaxseed and ALA in models of skeletal muscle degeneration characterized by high levels of Tumor Necrosis Factor-α (TNF). Methods: The in vivo studies were carried out on dystrophic hamsters affected by muscle damage associated with high TNF plasma levels and fed with a long-term 30% flaxseed-supplemented diet. Differentiating C2C12 myoblasts treated with TNF and challenged with ALA represented the in vitro model. Skeletal muscle morphology was scrutinized by applying the Principal Component Analysis statistical method. Apoptosis, inflammation and myogenesis were analyzed by immunofluorescence. Finally, an in silico analysis was carried out to predict the possible pathways underlying the effects of n-3 PUFAs. Results: The flaxseed-enriched diet protected the dystrophic muscle from apoptosis and preserved muscle myogenesis by increasing myogenin and alpha myosin heavy chain expression. Moreover, it restored the normal expression pattern of caveolin-3, thereby allowing protein retention at the sarcolemma. ALA reduced TNF-induced apoptosis in differentiating myoblasts and prevented the TNF-induced inhibition of myogenesis, as demonstrated by the increased expression of myogenin, myosin heavy chain and caveolin-3, while promoting myotube fusion. The in silico investigation revealed that FAK pathways may play a central role in the protective effects of ALA on myogenesis. Conclusions: These findings indicate that flaxseed may exert potent beneficial effects by preserving skeletal muscle regeneration and homeostasis partly through an ALA-mediated action. Thus, dietary flaxseed and ALA may serve as a useful strategy for treating patients with muscle dystrophies. PMID:26941581

  5. Post mitigation impact risk analysis for asteroid deflection demonstration missions

    NASA Astrophysics Data System (ADS)

    Eggl, Siegfried; Hestroffer, Daniel; Thuillot, William; Bancelin, David; Cano, Juan L.; Cichocki, Filippo

    2015-08-01

    Even though mankind believes it has the capability to avert potentially disastrous asteroid impacts, only the realization of mitigation demonstration missions can validate this claim. Such a deflection demonstration attempt has to be cost effective, easy to validate, and safe in the sense that harmless asteroids must not be turned into potentially hazardous objects. Uncertainties in an asteroid's orbital and physical parameters, as well as those additionally introduced during a mitigation attempt, necessitate an in-depth analysis of deflection mission designs in order to dispel planetary safety concerns. We present a post-mitigation impact risk analysis of a list of potential kinetic-impactor-based deflection demonstration missions proposed in the framework of the NEOShield project. Our results confirm that mitigation-induced uncertainties have a significant influence on the deflection outcome and cannot be neglected in post-deflection impact risk studies. We show, furthermore, that deflection missions have to be assessed on an individual basis in order to ensure that asteroids are not inadvertently transported closer to the Earth at a later date. Finally, we present viable targets and mission designs for a kinetic impactor test to be launched between the years 2025 and 2032.

  6. Early Intervention Following Trauma May Mitigate Genetic Risk for PTSD in Civilians: A Pilot Prospective Emergency Department Study

    PubMed Central

    Rothbaum, Barbara O.; Kearns, Megan C.; Reiser, Emily; Davis, Jennifer S.; Kerley, Kimberly A.; Rothbaum, Alex O.; Mercer, Kristina B.; Price, Matthew; Houry, Debra; Ressler, Kerry J.

    2015-01-01

    intervention group, even after controlling for age, sex, race, education, income, and childhood trauma. Using logistic regression, the number of risk alleles was significantly associated with likelihood of PTSD diagnosis at week 12 (P < .05). Conclusions This pilot prospective study suggests that combined genetic variants may serve to predict those most at risk for developing PTSD following trauma. A psychotherapeutic intervention initiated in the emergency department within hours of the trauma may mitigate this risk. The role of genetic predictors of risk and resilience should be further evaluated in larger, prospective intervention and prevention trials. Trial Registration ClinicalTrials.gov identifier: NCT00895518 PMID:25188543

  7. Study and mitigation of calibration error sources in a water vapour Raman lidar

    NASA Astrophysics Data System (ADS)

    David, Leslie; Bock, Olivier; Bosser, Pierre; Thom, Christian; Pelon, Jacques

    2014-05-01

    The monitoring of water vapour throughout the atmosphere is important for many scientific applications (weather forecasting, climate research, calibration of GNSS altimetry measurements). Measuring water vapour remains a technical challenge because of its high variability in space and time. The major issues are achieving long-term stability (e.g. for climate trend monitoring) and high accuracy (e.g. for calibration/validation applications). LAREG and LOEMI at the Institut National de l'Information Géographique et Forestière (IGN) have developed a mobile scanning water vapour Raman lidar in collaboration with LATMOS at CNRS. This system aims at providing high-accuracy water vapour measurements throughout the troposphere for calibrating GNSS wet delay signals and thus improving vertical positioning. Current developments aim at improving the calibration method and long-term stability of the system to allow the Raman lidar to be used as a reference instrument. The IGN-LATMOS lidar was deployed in the DEMEVAP (Development of Methodologies for Water Vapour Measurement) campaign that took place in 2011 at the Observatoire de Haute Provence. The goals of DEMEVAP were to inter-compare different water vapour sounding techniques (lidars, operational and research radiosondes, GPS,…) and to study various calibration methods for the Raman lidar. A significant decrease of the signals and of the calibration constants of the IGN-LATMOS Raman lidar was noticed over the course of the campaign. This led us to study the likely sources of uncertainty and drift in each part of the instrument: emission, reception and detection. We inventoried several error sources as well as instability sources; examples include the temperature dependence of the Raman line transmission through the filters and fluorescence in the fibre. We investigated each error source and each instability source (uncontrolled laser beam jitter, temporal fluctuations of the photomultiplier

  8. Feasibility study of tank leakage mitigation using subsurface barriers. Revision 1

    SciTech Connect

    Treat, R.L.; Peters, B.B.; Cameron, R.J.

    1995-01-01

    This document reflects the evaluations and analyses performed in response to Tri-Party Agreement Milestone M-45-07A - "Complete Evaluation of Subsurface Barrier Feasibility" (September 1994). In addition, this feasibility study was revised to reflect ongoing work supporting a pending decision by the DOE Richland Operations Office, the Washington State Department of Ecology, and the US Environmental Protection Agency regarding further development of subsurface barrier options for SSTs and whether to proceed with demonstration plans at the Hanford Site (Tri-Party Agreement Milestone M-45-07B). Analyses of 14 integrated SST tank farm remediation alternatives were conducted in response to the three stated objectives of Tri-Party Agreement Milestone M-45-07A. The alternatives include eight with subsurface barriers and six without. Technologies used in the alternatives include three types of tank waste retrieval, seven types of subsurface barriers, a method of stabilizing the void space of emptied tanks, two types of in situ soil flushing, one type of surface barrier, and a clean-closure method. A no-action alternative and a surface-barrier-only alternative were included as nonviable alternatives for comparison. All other alternatives were designed to result in closure of SST tank farms as landfills or in clean-closure. Revision 1 incorporates additional analyses of worker safety, large leak scenarios, and sensitivity to the leach rates of risk-controlling constituents. The additional analyses were conducted to support TPA Milestone M-45-07B.

  9. Estimation-based mitigation of dynamic optical turbulence: an experimental study

    NASA Astrophysics Data System (ADS)

    Khandekar, Rahul M.; Nikulin, Vladimir V.

    2008-02-01

    A laser beam propagating through the atmosphere encounters dynamic turbulence, which creates spatially and temporally varying refractive index fields. The resulting wavefront distortions lead to severe performance degradation in the form of reduced signal power and increased BER, even for short-range links. To alleviate this problem, an electrically addressed liquid crystal spatial light modulator (SLM) can be used to correct the wavefront by dynamically changing the optical path delays. Application of the Zernike formalism reduces the complexity of calculating the SLM control signals by approximating the required phase profile. A real-time wavefront correction procedure utilizing the simplex optimization of Nelder and Mead was previously demonstrated. The performance of such a procedure can be improved by proper re-initialization to avoid sub-optimal solutions. Interference-based phase estimation is proposed for this task and its potential was demonstrated in a proof-of-concept theoretical study. This paper presents the modifications to the previously developed system and the corresponding experimental results, which show dynamic correction of the phase distortions.
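
    The simplex-based correction loop described above can be sketched as follows. This is a hedged, self-contained illustration, not the authors' system: the measured quality metric is simulated by a synthetic Strehl-like function, and the five Zernike coefficients, tolerances and starting point are assumptions chosen for the example.

    ```python
    # Hedged sketch: Nelder-Mead simplex search over a few Zernike coefficients
    # applied to an SLM, maximising a measured quality metric. The hardware
    # readout is replaced here by a synthetic Strehl-like metric.
    import numpy as np
    from scipy.optimize import minimize

    rng = np.random.default_rng(0)
    true_aberration = rng.normal(0.0, 0.3, size=5)   # unknown distortion (Zernike coeffs)

    def measure_metric(slm_coeffs):
        # In a real system this would write slm_coeffs to the SLM and read a detector.
        residual = true_aberration + slm_coeffs
        return np.exp(-np.sum(residual ** 2))        # Strehl-like metric in (0, 1]

    result = minimize(lambda c: -measure_metric(c), x0=np.zeros(5),
                      method="Nelder-Mead", options={"xatol": 1e-4, "fatol": 1e-4})
    print("corrected metric:", measure_metric(result.x))
    ```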

  10. Effect of anxiolytic and hypnotic drug prescriptions on mortality hazards: retrospective cohort study

    PubMed Central

    Pearce, Hannah Louise; Croft, Peter; Singh, Swaran; Crome, Ilana; Bashford, James; Frisher, Martin

    2014-01-01

    Objective To test the hypothesis that people taking anxiolytic and hypnotic drugs are at increased risk of premature mortality, using primary care prescription records and after adjusting for a wide range of potential confounders. Design Retrospective cohort study. Setting 273 UK primary care practices contributing data to the General Practice Research Database. Participants 34 727 patients aged 16 years and older first prescribed anxiolytic or hypnotic drugs, or both, between 1998 and 2001, and 69 418 patients with no prescriptions for such drugs (controls) matched by age, sex, and practice. Patients were followed up for a mean of 7.6 years (range 0.1-13.4 years). Main outcome All-cause mortality ascertained from practice records. Results Physical and psychiatric comorbidities and prescribing of non-study drugs were significantly more prevalent among those prescribed study drugs than among controls. The age adjusted hazard ratio for mortality during the whole follow-up period for use of any study drug in the first year after recruitment was 3.46 (95% confidence interval 3.34 to 3.59) and 3.32 (3.19 to 3.45) after adjusting for other potential confounders. Dose-response associations were found for all three classes of study drugs (benzodiazepines, Z drugs (zaleplon, zolpidem, and zopiclone), and other drugs). After excluding deaths in the first year, there were approximately four excess deaths linked to drug use per 100 people followed for an average of 7.6 years after their first prescription. Conclusions In this large cohort of patients attending UK primary care, anxiolytic and hypnotic drugs were associated with significantly increased risk of mortality over a seven-year period, after adjusting for a range of potential confounders. As with all observational findings, however, these results are prone to bias arising from unmeasured and residual confounding. PMID:24647164
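
    The covariate-adjusted hazard ratios reported above come from survival (Cox proportional hazards) modelling. The sketch below shows the general form of such an analysis on synthetic data using the lifelines package; the column names, effect sizes and follow-up length are illustrative assumptions, not the General Practice Research Database.

    ```python
    # Hedged sketch of a covariate-adjusted Cox proportional hazards fit on
    # synthetic data (not the GPRD cohort) using the lifelines package.
    import numpy as np
    import pandas as pd
    from lifelines import CoxPHFitter

    rng = np.random.default_rng(1)
    n = 5000
    exposed = rng.integers(0, 2, n)                  # 1 = prescribed study drug
    age = rng.normal(55, 15, n)
    raw_time = rng.exponential(10, n) / np.exp(0.8 * exposed + 0.02 * (age - 55))
    event = (raw_time < 7.6).astype(int)             # death observed within follow-up
    time = np.minimum(raw_time, 7.6)                 # administrative censoring at 7.6 years

    df = pd.DataFrame({"time": time, "event": event, "exposed": exposed, "age": age})
    cph = CoxPHFitter().fit(df, duration_col="time", event_col="event")
    cph.print_summary()                              # exp(coef) column gives hazard ratios
    ```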

  11. Climate engineering of vegetated land for hot extremes mitigation: An Earth system model sensitivity study

    NASA Astrophysics Data System (ADS)

    Wilhelm, Micah; Davin, Edouard L.; Seneviratne, Sonia I.

    2015-04-01

    Various climate engineering schemes have been proposed as a way to curb anthropogenic climate change. Land climate engineering schemes aiming to reduce the amount of solar radiation absorbed at the surface through changes in land surface albedo have been considered in a limited number of investigations. However, global studies on this topic have generally focused on the impacts on mean climate rather than extremes. Here we present the results of a series of transient global climate engineering sensitivity experiments performed with the Community Earth System Model over the period 1950-2100 under historical and Representative Concentration Pathway 8.5 scenarios. Four sets of experiments are performed in which the surface albedo over snow-free vegetated grid points is increased by 0.05, 0.10, 0.15, and 0.20, respectively. The simulations show a preferential cooling of hot extremes relative to mean temperatures throughout the Northern midlatitudes during boreal summer under late twentieth century conditions. Two main mechanisms drive this response: on the one hand, a stronger efficacy of the albedo-induced radiative forcing on days with high incoming shortwave radiation; on the other hand, enhanced soil-moisture-induced evaporative cooling during the warmest days relative to the control simulation, due to accumulated soil moisture storage and reduced drying. The latter effect is dominant in summer in midlatitude regions and also implies a reduction of summer drought conditions. It thus constitutes another important benefit of surface albedo modifications in reducing climate change impacts. The simulated response under end-of-21st-century conditions has the same sign as that under end-of-twentieth-century conditions but indicates an increasing absolute impact of land surface albedo increases in reducing mean and extreme temperatures under enhanced greenhouse gas forcing.

  12. Mitigating Infectious Disease Outbreaks

    NASA Astrophysics Data System (ADS)

    Davey, Victoria

    The emergence of new, transmissible infections poses a significant threat to human populations. As the 2009 novel influenza A/H1N1 pandemic and the 2014-2015 Ebola epidemic demonstrate, we have observed the effects of rapid spread of illness in non-immune populations and experienced disturbing uncertainty about the future potential for human suffering and societal disruption. Clinical and epidemiologic characteristics of a newly emerged infectious organism are usually gathered in retrospect as the outbreak evolves and affects populations. Knowledge of the potential effects of outbreaks and epidemics and, most importantly, of mitigation at community, regional, national and global levels is needed to inform policy that will prepare and protect people. Study of the possible outcomes of evolving epidemics and of the application of mitigation strategies is not possible with observational or experimental research designs, but computational modeling allows the conduct of 'virtual' experiments. Results of well-designed computer simulations can aid in the selection and implementation of strategies that limit illness and death, and maintain systems of healthcare and other critical resources that are vital to public protection.
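
    The kind of 'virtual' experiment referred to above can be illustrated with a deliberately simple compartmental model. The sketch below compares an unmitigated epidemic with one in which an intervention reduces transmission; the SIR structure and all parameter values are illustrative assumptions, not a calibrated model of any real outbreak.

    ```python
    # Minimal SIR "virtual experiment": unmitigated epidemic vs. an intervention
    # that lowers the transmission rate. Parameters are illustrative only.
    from scipy.integrate import solve_ivp

    def sir(t, y, beta, gamma):
        s, i, r = y
        return [-beta * s * i, beta * s * i - gamma * i, gamma * i]

    gamma = 1 / 7                                   # ~7-day infectious period
    for label, beta in [("no mitigation", 2.5 * gamma), ("mitigation", 1.5 * gamma)]:
        sol = solve_ivp(sir, (0, 365), [0.999, 0.001, 0.0], args=(beta, gamma))
        print(f"{label:13s}  attack rate = {sol.y[2, -1]:.2f}, "
              f"peak prevalence = {sol.y[1].max():.3f}")
    ```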

  13. Flood hazard studies in Central Texas using orbital and suborbital remote sensing imagery

    NASA Technical Reports Server (NTRS)

    Baker, V. R.; Holz, R. K.; Patton, P. C.

    1975-01-01

    Central Texas is subject to infrequent, unusually intense rainstorms which cause extremely rapid runoff from drainage basins developed on the deeply dissected limestone and marl bedrock of the Edwards Plateau. One approach to flood hazard evaluation in this area is a parametric model relating flood hydrograph characteristics to quantitative geomorphic properties of the drainage basins. The preliminary model uses multiple regression techniques to predict potential peak flood discharge from basin magnitude, drainage density, and ruggedness number. After mapping small catchment networks from remote sensing imagery, input data for the model are generated by network digitization and analysis by a computer assisted routine of watershed analysis. The study evaluated the network resolution capabilities of the following data formats: (1) large-scale (1:24,000) topographic maps, employing Strahler's "method of v's," (2) standard low altitude black and white aerial photography (1:13,000 and 1:20,000 scales), (3) NASA - generated aerial infrared photography at scales ranging from 1:48,000 to 1:123,000, and (4) Skylab Earth Resources Experiment Package S-190A and S-190B sensors (1:750,000 and 1:500,000 respectively).
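
    The regression step described above can be sketched as a log-linear (power-law) fit of peak discharge on the three basin parameters. The values below are synthetic placeholders, not the Edwards Plateau measurements, and the power-law form is a common choice in geomorphic regressions rather than the study's exact model.

    ```python
    # Hedged sketch: peak discharge regressed on basin magnitude, drainage density
    # and ruggedness number. Data are synthetic placeholders.
    import numpy as np

    # columns: basin magnitude, drainage density (km/km^2), ruggedness number
    X = np.array([[12, 3.1, 0.45],
                  [30, 4.2, 0.80],
                  [55, 5.0, 1.10],
                  [21, 3.6, 0.62],
                  [40, 4.6, 0.95]], dtype=float)
    q_peak = np.array([180.0, 520.0, 1100.0, 330.0, 760.0])   # peak discharge (m^3/s)

    A = np.column_stack([np.ones(len(X)), np.log(X)])          # log-linear design matrix
    coef, *_ = np.linalg.lstsq(A, np.log(q_peak), rcond=None)

    def predict_peak(magnitude, density, ruggedness):
        x = np.array([1.0, np.log(magnitude), np.log(density), np.log(ruggedness)])
        return float(np.exp(x @ coef))

    print(f"predicted peak discharge: {predict_peak(25, 3.8, 0.7):.0f} m^3/s")
    ```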

  14. Natural Hazards

    NASA Astrophysics Data System (ADS)

    Bryant, Edward

    2005-02-01

    This updated new edition presents a comprehensive, inter-disciplinary analysis of the complete range of natural hazards. Edward Bryant describes and explains how hazards occur, examines prediction methods, considers recent and historical hazard events and explores the social impact of such disasters. Supported by over 180 maps, diagrams and photographs, this standard text is an invaluable guide for students and professionals in the field. First Edition Hb (1991): 0-521-37295-X First Edition Pb (1991): 0-521-37889-3

  15. Inland Flood Hazards

    NASA Astrophysics Data System (ADS)

    Wohl, Ellen E.

    2000-07-01

    A comprehensive, interdisciplinary review of issues related to inland flood hazards, this important work addresses physical controls on flooding, flood processes and effects, and responses to flooding, from the perspectives of human, aquatic, and riparian communities. The contributors, recognized experts in their fields, draw on examples and case studies of inland flood hazards from around the world. The volume is unique in that it addresses how the nonoccurrence of floods, in association with flow regulation and other human manipulation of river systems, may create hazards for aquatic and riparian communities. This book will be a valuable resource for all professionals concerned with inland flood hazards.

  16. Land use and management change under climate change adaptation and mitigation strategies: a U.S. case study

    USGS Publications Warehouse

    Mu, Jianhong E.; Wein, Anne; McCarl, Bruce

    2015-01-01

    We examine the effects of crop management adaptation and climate mitigation strategies on land use and land management, as well as on related environmental and economic outcomes. We find that crop management adaptation (e.g. crop mix, new species) increases greenhouse gas (GHG) emissions by 1.7% under a more severe climate projection, while a carbon price reduces total forest and agriculture GHG annual flux by 15% and 9%, respectively. This shows that trade-offs are likely between mitigation and adaptation. Climate change coupled with crop management adaptation has small and mostly negative effects on welfare; mitigation, which is implemented as a carbon price starting at $15 per metric ton of carbon dioxide (CO2) equivalent with a 5% annual increase rate, bolsters welfare through carbon payments. When both crop management adaptation and a carbon price are implemented, the effects of the latter dominate.

  17. Evaluation of impacts and mitigation assessments for the UMTRA Project: Gunnison and Durango pilot studies. Final report

    SciTech Connect

    Beranich, S.J.

    1994-08-24

    This report evaluates the impact assessments and proposed mitigations provided in environmental documents concerning the US Department of Energy's (DOE) Uranium Mill Tailings Remedial Action (UMTRA) Project. The projected impacts and proposed mitigations identified in UMTRA Project environmental documents were evaluated for two UMTRA Project sites. These sites are Gunnison and Durango, which are representative of currently active and inactive UMTRA Project sites, respectively. National Environmental Policy Act (NEPA) documentation was prepared for the remedial action at Durango and Gunnison as well as for the provision of an alternate water supply system at Gunnison. Additionally, environmental analysis was completed for mill site demolition at Gunnison and for a new road related to the Durango remedial action. The results in this report pertain only to the impact assessments prepared by the Regulatory Compliance staff as a part of the NEPA compliance requirements. Similarly, the mitigative measures documented are those that were identified during the NEPA process.

  18. 78 FR 13844 - Change in Submission Requirements for State Mitigation Plans

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-03-01

    ... Federal Register on March 24, 2005 (70 FR 15086). B. Submission of Sensitive Information Do not submit...), entitled ``Hazard Mitigation Planning and Hazard Mitigation Grant Program,'' 67 FR 8844, implemented... November 1, 2003 to November 1, 2004. 67 FR 61512. A subsequent revision on September 13, 2004 provided...

  19. 49 CFR 195.579 - What must I do to mitigate internal corrosion?

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 49 Transportation 3 2010-10-01 2010-10-01 false What must I do to mitigate internal corrosion? 195... SAFETY TRANSPORTATION OF HAZARDOUS LIQUIDS BY PIPELINE Corrosion Control § 195.579 What must I do to mitigate internal corrosion? (a) General. If you transport any hazardous liquid or carbon dioxide...

  20. Flood frequency analysis with uncertainty estimation and its application for hazard assessment - a case study in the Mekong Delta

    NASA Astrophysics Data System (ADS)

    Nguyen Viet, Dung; Apel, Heiko; Merz, Bruno; Bárdossy, András

    2015-04-01

    In many flood-prone regions on Earth, the nature of the floods calls for a multivariate approach to flood frequency analysis, which provides a basis for sound flood hazard and risk assessment. That is because flood severity is determined not only by the peak flood discharge, as usually considered, but also by other aspects such as the volume and even the hydrograph shape of the flood. However, multivariate flood frequency analysis that takes its associated uncertainty sources into account has rarely been studied. The Mekong Delta is one of the largest and most densely populated deltas on Earth. It witnesses annual large-scale inundations associated with the SE-Asian monsoon. These floods are the basis for the livelihoods of the population of the Delta, but they are also the major hazard. This hazard has, however, not been studied within the frame of a probabilistic flood hazard analysis. Thus this study focuses on the identification of a suitable statistical model for the estimation of flood frequencies considering two important flood aspects, peak discharge Q and volume V, and exemplifies its applicability for a sound flood hazard assessment in the Mekong Delta case study. A copula-based bivariate statistical model with bootstrapping-based uncertainty estimation is hence developed for a flood frequency analysis of peak flow and volume. The analysis reveals that even with the available - in a hydrological context - quite long data series (e.g. 88 years in the Mekong Delta), large uncertainties are associated with the bivariate quantiles (Q, V), even for rather frequent events. The main uncertainty source is the sampling uncertainty, a direct consequence of the limited length of the data series. However, we still advocate applying the proposed bivariate frequency method for flood frequency estimation in the Mekong Delta because a) it reflects the essential aspects of floods in this region, b) the uncertainties are inherent for every multivariate
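
    The copula-based approach can be sketched as follows: extreme value margins are fitted to peak discharge Q and volume V, their dependence is captured by a Gumbel copula parameterised from Kendall's tau, and a joint return period is computed. Everything here is an illustrative assumption on synthetic data (the margins, the copula family and the record length), and the bootstrap uncertainty step the study emphasises is omitted for brevity.

    ```python
    # Hedged sketch of a bivariate (peak Q, volume V) flood frequency analysis with
    # GEV margins and a Gumbel copula. Data are synthetic; bootstrapping omitted.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(2)
    n = 88                                            # record length, as in the Mekong series
    q = stats.genextreme.rvs(-0.1, loc=20000, scale=4000, size=n, random_state=rng)
    v = 0.8 * q + rng.normal(0, 2500, n)              # correlated flood volume (synthetic)

    q_params = stats.genextreme.fit(q)                # marginal GEV fits
    v_params = stats.genextreme.fit(v)

    tau, _ = stats.kendalltau(q, v)                   # Gumbel copula parameter from tau
    theta = 1.0 / (1.0 - tau)

    def gumbel_copula(u, w):
        return np.exp(-(((-np.log(u)) ** theta + (-np.log(w)) ** theta) ** (1.0 / theta)))

    # "OR" return period of the event that either variable exceeds its
    # 100-year marginal quantile (non-exceedance probability 0.99).
    p_or = 1.0 - gumbel_copula(0.99, 0.99)
    print(f"joint OR return period: {1.0 / p_or:.0f} years")
    ```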

  1. Probabilistic Hazard Curves for Tornadic Winds, Wind Gusts, and Extreme Rainfall Events

    SciTech Connect

    Weber, A.H.

    1999-07-29

    This paper summarizes a study carried out at the Savannah River Site (SRS) to determine probabilistic hazard curves for tornadic winds, wind gusts, and extreme rainfall events. DOE Order 420.1, Facility Safety, outlines the requirements for Natural Phenomena Hazards (NPH) mitigation for new and existing DOE facilities. Specifically, NPH include tornadic winds, maximum wind gusts, and extreme rainfall events. Probabilistic hazard curves for each phenomenon indicate the recurrence frequency, and these hazard curves must be updated at least every 10 years to account for recent data, improved methodologies, or criteria changes. Also, emergency response exercises often use hypothetical weather data to initiate accident scenarios. The hazard curves in these reports provide a means to use extreme weather events based on models and measurements rather than scenarios created ad hoc, as is often the case.
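
    A hazard curve of the kind described can be sketched by fitting an extreme value distribution to annual maxima and inverting it at selected annual exceedance frequencies. The Gumbel choice, the synthetic gust record and the frequency levels below are assumptions for illustration, not the SRS analysis itself.

    ```python
    # Hedged sketch of a probabilistic hazard curve: Gumbel fit to synthetic
    # annual-maximum wind gusts, inverted at selected annual exceedance frequencies.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(3)
    annual_max_gust = stats.gumbel_r.rvs(loc=30, scale=6, size=50, random_state=rng)  # m/s

    loc, scale = stats.gumbel_r.fit(annual_max_gust)

    for aef in (1e-1, 1e-2, 1e-3, 1e-4):              # annual exceedance frequency (1/yr)
        gust = stats.gumbel_r.ppf(1.0 - aef, loc=loc, scale=scale)
        print(f"exceedance {aef:.0e}/yr  ->  gust ~ {gust:5.1f} m/s")
    ```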

  2. Trade study of leakage detection, monitoring, and mitigation technologies to support Hanford single-shell waste retrieval

    SciTech Connect

    Hertzel, J.S.

    1996-03-01

    The U.S. Department of Energy has established the Tank Waste Remediation System to safely manage and dispose of low-level, high-level, and transuranic wastes currently stored in underground storage tanks at the Hanford Site in Eastern Washington. This report supports the Hanford Federal Facility Agreement and Consent Order (Tri-Party Agreement) Milestone No. M-45-08-T01, addresses additional issues regarding single-shell tank leakage detection, monitoring, and mitigation technologies, and provides an indication of the scope of leakage detection, monitoring, and mitigation activities necessary to support the Tank Waste Remediation System Initial Single-Shell Tank Retrieval System project.

  3. Assessment of Nearshore Hazard due to Tsunami-Induced Currents

    NASA Astrophysics Data System (ADS)

    Lynett, P. J.; Ayca, A.; Borrero, J. C.; Eskijian, M.; Miller, K.; Wilson, R. I.

    2014-12-01

    The California Tsunami Program, in cooperation with NOAA and FEMA, has begun implementing a plan to increase tsunami hazard preparedness and mitigation in maritime communities (both ships and harbor infrastructure) through the development of in-harbor hazard maps, offshore safety zones for boater evacuation, and associated guidance for harbors and marinas before, during and following tsunamis. The hope is that the maritime guidance and associated education program will help save lives and reduce damage to boats and harbor infrastructure. Findings will be used to develop maps, guidance documents, and consistent policy recommendations for emergency managers and port authorities and to provide information critical to real-time decisions required when responding to tsunami alert notifications. The initial goals of the study are to (1) evaluate the effectiveness and sensitivity of existing numerical models for assessing maritime tsunami hazards, (2) find a relationship between current speeds and expected damage levels, (3) evaluate California ports and harbors in terms of tsunami-induced hazards by identifying regions that are prone to higher current speeds and damage and identifying regions of relatively lower impact that may be used for evacuation of maritime assets, and (4) determine 'safe depths' for evacuation of vessels from ports and harbors during a tsunami event. We will present details of a new initiative to evaluate the future likelihood of failure for different structural components of a harbor, leading to the identification of high priority areas for mitigation. This presentation will focus on results from California ports and harbors across the State, and will include feedback we have received from discussions with local harbor masters and port authorities. To help promote accurate and consistent products, the authors are also working through the National Tsunami Hazard Mitigation Program to organize a tsunami current model benchmark workshop.

  4. The Impact Hazard in the Context of Other Natural Hazards and Predictive Science

    NASA Astrophysics Data System (ADS)

    Chapman, C. R.

    1998-09-01

    The hazard due to impact of asteroids and comets has been recognized as analogous, in some ways, to other infrequent but consequential natural hazards (e.g. floods and earthquakes). Yet, until recently, astronomers and space agencies have felt no need to do what their colleagues and analogous agencies must do in order to assess, quantify, and communicate predictions to those with a practical interest in the predictions (e.g. public officials who must assess the threats, prepare for mitigation, etc.). Recent heightened public interest in the impact hazard, combined with increasing numbers of "near misses" (certain to increase as Spaceguard is implemented), requires that astronomers accept the responsibility to place their predictions and assessments in terms that may be appropriately considered. I will report on preliminary results of a multi-year GSA/NCAR study of "Prediction in the Earth Sciences: Use and Misuse in Policy Making" in which I have represented the impact hazard, while others have treated earthquakes, floods, weather, global climate change, nuclear waste disposal, acid rain, etc. The impact hazard presents an end-member example of a natural hazard, helping those dealing with more prosaic issues to learn from an extreme. On the other hand, I bring to the astronomical community some lessons long adopted in other cases: the need to understand the policy purposes of impact predictions, the need to assess potential societal impacts, the requirement to very carefully assess prediction uncertainties, considerations of potential public uses of the predictions, awareness of ethical considerations (e.g. conflicts of interest) that affect predictions and acceptance of predictions, awareness of appropriate means for publicly communicating predictions, and considerations of the international context (especially for a hazard that knows no national boundaries).

  5. Experimental and Numerical Studies of the Effects of Water Sprinkling on Urban Pavement on Heat Island Mitigation

    NASA Astrophysics Data System (ADS)

    Yoshioka, M.; Tosaka, H.; Nakagawa, K.

    2007-12-01

    One of the main causes of the 'heat island' phenomenon is thought to be the artificial covering of the ground surface with asphalt or concrete, which greatly reduces the inherent cooling effect of water evaporation from the soil surface. In this study, as a candidate method for mitigating the heat island, the effects of water sprinkling on pavements are discussed on the basis of field experiments and numerical studies. Three field experiments of water sprinkling on asphalt/concrete pavements were performed on hot summer days in 2004-2006. For detecting the change in temperatures, the authors developed and used a 3-D measurement system which consists of two vertical planes, 6 m high and 16 m wide, with network arrays of 102 thermistors distributed spatially in the planes. The temperatures measured in and around the water-sprinkled area indicated that the ground surface temperature decreased uniformly by 5 to 15 degrees in the sprinkled area compared with the un-sprinkled area, while the relative decrease of atmospheric temperature was approximately up to 1 degree. The subsurface temperature at a depth of 14 cm under the pavement decreased significantly and remained lower than that at the same depth in the un-sprinkled area through the next morning. A numerical model was developed and applied to interpret the experimental results. It deals with the heat balance of radiation, sensible/latent heat transfer at the ground surface, and heat conduction through the artificial and natural soil layers underground. Temperature and vapor changes at and near the ground surface were modeled using bulk formulas. Good agreement between the calculated time-temperature profiles and the experimental ones was obtained by assuming adequate physical parameters and meteorological conditions. The model could be improved in order to evaluate the changes of temperature and vapor content in the atmosphere near the ground surface caused by aerodynamic turbulent diffusion.
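
    The bulk formulas mentioned above for surface sensible and latent heat flux can be written down compactly. The sketch below is a generic illustration of that parameterisation, not the study's model: the transfer coefficients, air properties and the wet/dry surface states are assumed values chosen to show why a sprinkled (wet, cooler) pavement sheds more latent and less sensible heat.

    ```python
    # Hedged sketch of bulk aerodynamic formulas for surface heat fluxes.
    # Coefficients and state values are illustrative assumptions.
    rho_air = 1.2        # air density (kg/m^3)
    cp = 1005.0          # specific heat of air (J/kg/K)
    L_v = 2.45e6         # latent heat of vaporisation (J/kg)
    C_h = 2.0e-3         # bulk transfer coefficient for heat
    C_e = 2.0e-3         # bulk transfer coefficient for moisture

    def sensible_heat_flux(wind, t_surface, t_air):
        """H = rho * cp * C_h * U * (Ts - Ta), in W/m^2."""
        return rho_air * cp * C_h * wind * (t_surface - t_air)

    def latent_heat_flux(wind, q_surface, q_air):
        """LE = rho * Lv * C_e * U * (qs - qa), specific humidities in kg/kg."""
        return rho_air * L_v * C_e * wind * (q_surface - q_air)

    # Dry vs. sprinkled pavement on a light-wind summer afternoon (assumed states).
    for label, ts, qs in [("dry", 55.0, 0.010), ("sprinkled", 42.0, 0.028)]:
        print(f"{label:9s} H = {sensible_heat_flux(2.0, ts, 33.0):6.1f} W/m^2, "
              f"LE = {latent_heat_flux(2.0, qs, 0.010):6.1f} W/m^2")
    ```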