Hydrothermal Liquefaction Treatment Preliminary Hazard Analysis Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lowry, Peter P.; Wagner, Katie A.
A preliminary hazard assessment was completed during February 2015 to evaluate the conceptual design of the modular hydrothermal liquefaction treatment system. The hazard assessment was performed in two stages. An initial assessment utilizing Hazard Identification and Preliminary Hazards Analysis (PHA) techniques identified areas with significant or unique hazards (process safety-related hazards) that fall outside of the normal operating envelope of PNNL and warranted additional analysis. The subsequent assessment was based on a qualitative What-If analysis. This analysis was augmented, as necessary, by additional quantitative analysis for scenarios involving a release of hazardous material or energy with the potential for affecting the public.
Renjith, V R; Madhu, G; Nayagam, V Lakshmana Gomathi; Bhasi, A B
2010-11-15
The hazards associated with major accident hazard (MAH) industries are fire, explosion and toxic gas releases. Of these, toxic gas release is the worst as it has the potential to cause extensive fatalities. Qualitative and quantitative hazard analyses are essential for the identification and quantification of these hazards related to chemical industries. Fault tree analysis (FTA) is an established technique in hazard identification. This technique has the advantage of being both qualitative and quantitative, if the probabilities and frequencies of the basic events are known. This paper outlines the estimation of the probability of release of chlorine from storage and filling facility of chlor-alkali industry using FTA. An attempt has also been made to arrive at the probability of chlorine release using expert elicitation and proven fuzzy logic technique for Indian conditions. Sensitivity analysis has been done to evaluate the percentage contribution of each basic event that could lead to chlorine release. Two-dimensional fuzzy fault tree analysis (TDFFTA) has been proposed for balancing the hesitation factor involved in expert elicitation. Copyright © 2010 Elsevier B.V. All rights reserved.
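To make the fault-tree arithmetic concrete, the sketch below evaluates a toy top event through independent OR and AND gates; the gate structure and basic-event probabilities are hypothetical placeholders, not the values elicited in the chlor-alkali study.

```python
# Minimal fault tree evaluation sketch (not the paper's actual tree).
# Basic-event probabilities below are hypothetical placeholders; in the
# study they would come from plant data or fuzzy expert elicitation.

def and_gate(probs):
    """Probability that all independent input events occur."""
    p = 1.0
    for x in probs:
        p *= x
    return p

def or_gate(probs):
    """Probability that at least one independent input event occurs."""
    p_none = 1.0
    for x in probs:
        p_none *= (1.0 - x)
    return 1.0 - p_none

# Hypothetical basic events for a chlorine release top event
p_valve_leak   = 1e-3   # storage valve passing
p_hose_rupture = 5e-4   # filling hose failure
p_overfill     = 2e-4   # tank overfill
p_relief_fails = 1e-2   # mitigation barrier fails on demand

# Top event: (any initiating leak path) AND (mitigation barrier fails)
p_initiator = or_gate([p_valve_leak, p_hose_rupture, p_overfill])
p_release   = and_gate([p_initiator, p_relief_fails])
print(f"P(chlorine release) = {p_release:.2e}")
```

In a fuzzy or two-dimensional extension of this calculation, the crisp basic-event probabilities would be replaced by membership functions obtained from expert elicitation, as the abstract describes.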
Modeling and Hazard Analysis Using STPA
NASA Astrophysics Data System (ADS)
Ishimatsu, Takuto; Leveson, Nancy; Thomas, John; Katahira, Masa; Miyamoto, Yuko; Nakao, Haruka
2010-09-01
A joint research project between MIT and JAXA/JAMSS is investigating the application of a new hazard analysis to the system and software in the HTV. Traditional hazard analysis focuses on component failures, but software does not fail in this way. Software most often contributes to accidents by commanding the spacecraft into an unsafe state (e.g., turning off the descent engines prematurely) or by not issuing required commands. That makes the standard hazard analysis techniques of limited usefulness on software-intensive systems, which describes most spacecraft built today. STPA is a new hazard analysis technique based on systems theory rather than reliability theory. It treats safety as a control problem rather than a failure problem. The goal of STPA, which is to create a set of scenarios that can lead to a hazard, is the same as FTA, but STPA includes a broader set of potential scenarios, including those in which no failures occur but problems arise due to unsafe and unintended interactions among the system components. STPA also provides more guidance to the analysts than traditional fault tree analysis. Functional control diagrams are used to guide the analysis. In addition, JAXA uses a model-based system engineering development environment (created originally by Leveson and called SpecTRM) which also assists in the hazard analysis. One of the advantages of STPA is that it can be applied early in the system engineering and development process in a safety-driven design process where hazard analysis drives the design decisions, rather than waiting until reviews identify problems that are then costly or difficult to fix. It can also be applied in an after-the-fact analysis and hazard assessment, which is what we did in this case study. This paper describes the experimental application of STPA to the JAXA HTV in order to determine the feasibility and usefulness of the new hazard analysis technique. Because the HTV was originally developed using fault tree analysis and following the NASA standards for safety-critical systems, the results of our experimental application of STPA can be compared with these more traditional safety engineering approaches in terms of the problems identified and the resources required to use it.
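As a rough sketch of the STPA step the abstract describes, the snippet below enumerates candidate unsafe control actions for one control action using the four standard STPA guide phrases; the controller, action, and context strings are hypothetical and are not taken from the HTV analysis.

```python
# Toy enumeration of STPA unsafe-control-action (UCA) candidates for one
# control action, using the four standard STPA guide phrases. The
# controller, control action, and hazard context below are hypothetical
# examples, not items from the actual HTV study.
GUIDE_PHRASES = (
    "not provided when needed",
    "provided when it causes a hazard",
    "provided too early, too late, or out of order",
    "stopped too soon or applied too long",
)

def enumerate_uca_candidates(controller, action, context):
    """Return candidate UCAs for analysts to accept, refine, or discard."""
    return [f"{controller}: '{action}' {phrase} [{context}]"
            for phrase in GUIDE_PHRASES]

for uca in enumerate_uca_candidates(
        controller="onboard guidance software",
        action="issue engine shutdown command",
        context="during final approach to the station"):
    print(uca)
```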
Connor, Thomas H; Smith, Jerome P
2016-09-01
At the present time, the method of choice to determine surface contamination of the workplace with antineoplastic and other hazardous drugs is surface wipe sampling and subsequent sample analysis with a variety of analytical techniques. The purpose of this article is to review current methodology for determining the level of surface contamination with hazardous drugs in healthcare settings and to discuss recent advances in this area. In addition, it provides some guidance for conducting surface wipe sampling and sample analysis for these drugs in healthcare settings. Published studies on the use of wipe sampling to measure hazardous drugs on surfaces in healthcare settings were reviewed. These studies include the use of well-documented chromatographic techniques for sample analysis in addition to newly evolving technology that provides rapid analysis of specific antineoplastic drugs. Methodology for the analysis of surface wipe samples for hazardous drugs is reviewed, including the purposes, technical factors, sampling strategy, materials required, and limitations. The use of lateral flow immunoassay (LFIA) and fluorescence covalent microbead immunosorbent assay (FCMIA) for surface wipe sample evaluation is also discussed. Current recommendations are that all healthcare settings where antineoplastic and other hazardous drugs are handled include surface wipe sampling as part of a comprehensive hazardous drug-safe handling program. Surface wipe sampling may be used as a method to characterize potential occupational dermal exposure risk and to evaluate the effectiveness of implemented controls and the overall safety program. New technology, although currently limited in scope, may make wipe sampling for hazardous drugs more routine, less costly, and provide a shorter response time than classical analytical techniques now in use.
Gianassi, S; Bisin, S; Bindi, B; Spitaleri, I; Bambi, F
2010-01-01
The collection and handling of hematopoietic stem cells (HSCs) must meet high quality requirements. An integrated Quality Risk Management can help to identify and contain potential risks related to HSC production. Risk analysis techniques allow one to "weigh" identified hazards, considering the seriousness of their effects, frequency, and detectability, seeking to prevent the most harmful hazards. The Hazard Analysis and Critical Control Point method, recognized as the most appropriate technique to identify risks associated with physical, chemical, and biological hazards for cellular products, consists of classifying finished product specifications and limits of acceptability, identifying all off-specifications, defining activities that can cause them, and finally establishing both a monitoring system for each Critical Control Point and corrective actions for deviations. The severity of possible effects on patients, as well as the occurrence and detectability of critical parameters, are measured on quantitative scales (Risk Priority Number [RPN]). Risk analysis was performed with this technique on the HPC manipulation process performed at our blood center. The data analysis showed that the hazards with higher RPN values and greater impact on the process are loss of dose and tracking; technical skills of operators and manual transcription of data were the most critical parameters. Problems related to operator skills are handled by defining targeted training programs, while other critical parameters can be mitigated with the use of continuous control systems. The blood center management software was completed by a labeling system with forms designed to be in compliance with standards in force and by starting implementation of a cryopreservation management module. Copyright 2010 Elsevier Inc. All rights reserved.
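A minimal illustration of the Risk Priority Number arithmetic mentioned above, with hypothetical hazards, scores, and a 1-10 scale that are assumptions for demonstration rather than the blood center's actual ratings:

```python
# Illustrative RPN calculation for hypothetical HSC-manipulation hazards.
# Scores and the 1-10 scale are assumptions for demonstration, not the
# blood center's actual ratings.
hazards = {
    #  hazard                        (severity, occurrence, detectability)
    "loss of dose":                  (9, 4, 6),
    "loss of tracking":              (8, 3, 7),
    "manual transcription error":    (7, 5, 5),
    "labeling mix-up":               (9, 2, 4),
}

ranked = sorted(
    ((name, s * o * d) for name, (s, o, d) in hazards.items()),
    key=lambda item: item[1],
    reverse=True,
)
for name, rpn in ranked:
    print(f"{name:30s} RPN = {rpn}")
```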
Hydrothermal Liquefaction Treatment Hazard Analysis Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lowry, Peter P.; Wagner, Katie A.
Hazard analyses were performed to evaluate the modular hydrothermal liquefaction treatment system. The hazard assessment process was performed in two stages. An initial assessment utilizing Hazard Identification and Preliminary Hazards Analysis (PHA) techniques identified areas with significant or unique hazards (process safety-related hazards) that fall outside of the normal operating envelope of PNNL and warranted additional analysis. The subsequent assessment was based on a qualitative What-If analysis. The analysis was augmented, as necessary, by additional quantitative analysis for scenarios involving a release of hazardous material or energy with the potential for affecting the public. The following selected hazardous scenarios received increased attention: • For scenarios involving a release of hazardous material or energy, controls were identified in the What-If analysis table that prevent the occurrence or mitigate the effects of the release. • For scenarios with significant consequences that could impact personnel outside the immediate operations area, quantitative analyses were performed to determine the potential magnitude of the scenario. The set of “critical controls” was identified for these scenarios (see Section 4) that prevent the occurrence or mitigate the effects of events with significant consequences.
Forensic and homeland security applications of modern portable Raman spectroscopy.
Izake, Emad L
2010-10-10
Modern detection and identification of chemical and biological hazards within the forensic and homeland security contexts may well require conducting the analysis in the field while adopting a non-contact approach to the hazard. Technological achievements in both surface-enhanced and resonance-enhanced Raman scattering have re-established Raman spectroscopy as the most adaptable spectroscopic technique for stand-off and non-contact analysis of hazards. On the other hand, spatially offset Raman spectroscopy has proved to be very valuable for non-invasive chemical analysis of hazards concealed within non-transparent containers and packaging. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.
Risk Management Technique for design and operation of facilities and equipment
NASA Technical Reports Server (NTRS)
Fedor, O. H.; Parsons, W. N.; Coutinho, J. De S.
1975-01-01
The Risk Management System collects information from engineering, operating, and management personnel to identify potentially hazardous conditions. This information is used in risk analysis, problem resolution, and contingency planning. The resulting hazard accountability system enables management to monitor all identified hazards. Data from this system are examined in project reviews so that management can decide to eliminate or accept these risks. This technique is particularly effective in improving the management of risks in large, complex, high-energy facilities. These improvements are needed for increased cooperation among industry, regulatory agencies, and the public.
Earthquake Hazard Analysis Methods: A Review
NASA Astrophysics Data System (ADS)
Sari, A. M.; Fakhrurrozi, A.
2018-02-01
Earthquakes are among the natural disasters with the most significant impacts in terms of risk and damage. Countries such as China, Japan, and Indonesia are located on active plate boundaries and experience earthquakes more frequently than other countries. Several methods of earthquake hazard analysis have been applied, for example analysis of seismic zones and earthquake hazard micro-zonation, the Neo-Deterministic Seismic Hazard Analysis (N-DSHA) method, and remote sensing. In application, it is necessary to review the effectiveness of each technique in advance. Considering the efficiency of time and the accuracy of data, remote sensing is used as a reference to assess earthquake hazard accurately and quickly, as it requires only limited time and supports sound decision-making shortly after a disaster. Exposed areas and potentially vulnerable areas due to earthquake hazards can be easily analyzed using remote sensing. Technological developments in remote sensing such as GeoEye-1 provide added value and excellence in the use of remote sensing as one of the methods in the assessment of earthquake risk and damage. Furthermore, the use of this technique is expected to be considered in designing policies for disaster management in particular and can reduce the risk of natural disasters such as earthquakes in Indonesia.
Seismic hazard assessment of Syria using seismicity, DEM, slope, active tectonic and GIS
NASA Astrophysics Data System (ADS)
Ahmad, Raed; Adris, Ahmad; Singh, Ramesh
2016-07-01
In the present work, we discuss the use of integrated remote sensing and Geographical Information System (GIS) techniques for evaluation of seismic hazard areas in Syria. The present study is the first effort to create a seismic hazard map of Syria with the help of GIS. In the proposed approach, we have used Aster satellite data, digital elevation data (30 m resolution), earthquake data, and active tectonic maps. Many important factors for evaluation of seismic hazard were identified and corresponding thematic data layers (past earthquake epicenters, active faults, digital elevation model, and slope) were generated. A numerical rating scheme has been developed for spatial data analysis using GIS to identify the ranking of parameters to be included in the evaluation of seismic hazard. The resulting earthquake potential map delineates the area into different relative susceptibility classes: high, moderate, low and very low. The potential earthquake map was validated by correlating the obtained classes with the local probability produced using conventional analysis of observed earthquakes. Earthquake data of Syria and peak ground acceleration (PGA) data were introduced into the model to develop the final seismic hazard map based on Gutenberg-Richter (a and b values) parameters and using the concepts of local probability and recurrence time. The application of the proposed technique in the Syrian region indicates that this method provides a good estimate of seismic hazard compared to maps developed from traditional techniques (deterministic seismic hazard analysis (DSHA) and probabilistic seismic hazard analysis (PSHA)). For the first time we have used numerous parameters derived from remote sensing and GIS in the preparation of a seismic hazard map, which is found to be very realistic.
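A minimal sketch of the kind of numerical rating scheme described, implemented as a weighted overlay of reclassified thematic layers; the layer ranks, weights, and class breaks are illustrative assumptions, not the values used for Syria.

```python
# Sketch of a GIS-style numerical rating scheme: each thematic layer is
# reclassified to a rank (1 = low .. 4 = high) and combined by a weighted
# sum. Layer weights and class breaks here are hypothetical, not the
# values used in the Syria study.
import numpy as np

# Toy 3x3 rasters already reclassified to ranks 1..4
epicenter_density = np.array([[4, 3, 2], [3, 2, 1], [2, 1, 1]])
fault_proximity   = np.array([[4, 4, 3], [3, 2, 2], [2, 1, 1]])
slope_rank        = np.array([[1, 2, 3], [2, 3, 3], [3, 4, 4]])

weights = {"epicenters": 0.4, "faults": 0.4, "slope": 0.2}  # assumed

score = (weights["epicenters"] * epicenter_density
         + weights["faults"] * fault_proximity
         + weights["slope"] * slope_rank)

# Slice the continuous score into relative susceptibility classes
classes = np.digitize(score, bins=[1.5, 2.5, 3.5])  # 0=very low .. 3=high
print(classes)
```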
Thunderstorm monitoring and lightning warning, operational applications of the Safir system
NASA Technical Reports Server (NTRS)
Richard, Philippe
1991-01-01
During the past years a new range of studies has been opened by the application of electromagnetic localization techniques to the field of thunderstorm remote sensing. VHF localization techniques were used in particular for the analysis of lightning discharges and gave access to time-resolved 3-D images of lightning discharges within thunderclouds. The detection and localization techniques developed have been applied to the design of the SAFIR system. This development's main objective was the design of an operational system capable of assessing and warning in real time for lightning hazards and potential thunderstorm hazards. The SAFIR system's main detection technique is the long-range interferometric localization of thunderstorm electromagnetic activity; the system performs the localization of intracloud and cloud-to-ground lightning discharges and the analysis of the characteristics of the activity.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tang, A; Samost, A; Viswanathan, A
Purpose: To investigate the hazards in cervical-cancer HDR brachytherapy using a novel hazard-analysis technique, System Theoretic Process Analysis (STPA). The applicability and benefit of STPA to the field of radiation oncology is demonstrated. Methods: We analyzed the tandem and ring HDR procedure through observations, discussions with physicists and physicians, and the use of a previously developed process map. Controllers and their respective control actions were identified and arranged into a hierarchical control model of the system, modeling the workflow from applicator insertion through initiating treatment delivery. We then used the STPA process to identify potentially unsafe control actions. Scenarios were then generated from the identified unsafe control actions and used to develop recommendations for system safety constraints. Results: 10 controllers were identified and included in the final model. From these controllers 32 potentially unsafe control actions were identified, leading to more than 120 potential accident scenarios, including both clinical errors (e.g., using outdated imaging studies for planning) and managerial-based incidents (e.g., unsafe equipment, budget, or staffing decisions). Constraints identified from those scenarios include common themes, such as the need for appropriate feedback to give the controllers an adequate mental model to maintain safe boundaries of operations. As an example, one finding was that the likelihood of the potential accident scenario of the applicator breaking during insertion might be reduced by establishing a feedback loop of equipment-usage metrics and equipment-failure reports to the management controller. Conclusion: The utility of STPA in analyzing system hazards in a clinical brachytherapy system was demonstrated. This technique, rooted in system theory, identified scenarios both technical/clinical and managerial in nature. These results suggest that STPA can be successfully used to analyze safety in brachytherapy and may prove to be an alternative to other hazard analysis techniques.
Rasmussen's legacy: A paradigm change in engineering for safety.
Leveson, Nancy G
2017-03-01
This paper describes three applications of Rasmussen's ideas to systems engineering practice. The first is the application of the abstraction hierarchy to engineering specifications, particularly requirements specification. The second is the use of Rasmussen's ideas in safety modeling and analysis to create a new, more powerful type of accident causation model that extends traditional models to better handle human-operated, software-intensive, sociotechnical systems. Because this new model has a formal, mathematical foundation built on systems theory (as was Rasmussen's original model), new modeling and analysis tools become possible. The third application is to engineering hazard analysis. Engineers have traditionally either omitted humans from consideration in system hazard analysis or have treated them rather superficially, for example, by assuming that they behave randomly. Applying Rasmussen's model of human error to a powerful new hazard analysis technique allows human behavior to be included in engineering hazard analysis. Copyright © 2016 Elsevier Ltd. All rights reserved.
Safety Guided Design of Crew Return Vehicle in Concept Design Phase Using STAMP/STPA
NASA Astrophysics Data System (ADS)
Nakao, H.; Katahira, M.; Miyamoto, Y.; Leveson, N.
2012-01-01
In the concept development and design phase of a new space system, such as a Crew Return Vehicle, designers tend to focus on how to implement new technology. Designers also consider the difficulty of using the new technology and trade off several system design candidates. Then they choose an optimal design from the candidates. Safety should be a key aspect driving optimal concept design. However, in past concept design activities, safety analysis such as FTA has not been used to drive the design, because such analysis techniques focus on component failure and component failure cannot be considered in the concept design phase. The solution to these problems is to apply a new hazard analysis technique, called STAMP/STPA. STAMP/STPA defines safety as a control problem rather than a failure problem and identifies hazardous scenarios and their causes. Defining control flow is essential in the concept design phase; therefore STAMP/STPA can be a useful tool to assess the safety of system candidates and to be part of the rationale for choosing a design as the baseline of the system. In this paper, we explain our case study of safety-guided concept design using STPA, the new hazard analysis technique, and a model-based specification technique on a Crew Return Vehicle design, and evaluate the benefits of using STAMP/STPA in the concept development phase.
Site specific probabilistic seismic hazard analysis at Dubai Creek on the west coast of UAE
NASA Astrophysics Data System (ADS)
Shama, Ayman A.
2011-03-01
A probabilistic seismic hazard analysis (PSHA) was conducted to establish the hazard spectra for a site located at Dubai Creek on the west coast of the United Arab Emirates (UAE). The PSHA considered all the seismogenic sources that affect the site, including plate boundaries such as the Makran subduction zone, the Zagros fold-thrust region and the transition fault system between them; and local crustal faults in UAE. PSHA indicated that local faults dominate the hazard. The peak ground acceleration (PGA) for the 475-year return period spectrum is 0.17 g and 0.33 g for the 2,475-year return period spectrum. The hazard spectra are then employed to establish rock ground motions using the spectral matching technique.
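For context, the 475-year and 2,475-year return periods quoted above correspond, under the usual Poisson assumption, to roughly 10% and 2% probabilities of exceedance in a 50-year exposure time; a worked conversion is sketched below (the relation is standard and not specific to this study).

```python
# Poisson relation between return period T and probability of exceedance
# P over an exposure time t: P = 1 - exp(-t / T). This is the standard
# convention; the site-specific PSHA inputs for Dubai Creek are not
# reproduced here.
import math

def prob_of_exceedance(return_period_yr, exposure_yr=50.0):
    return 1.0 - math.exp(-exposure_yr / return_period_yr)

for T in (475.0, 2475.0):
    print(f"T = {T:6.0f} yr -> P(exceedance in 50 yr) = "
          f"{prob_of_exceedance(T):.1%}")
# about 10% for 475 yr and about 2% for 2,475 yr
```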
Demonstration of a Safety Analysis on a Complex System
NASA Technical Reports Server (NTRS)
Leveson, Nancy; Alfaro, Liliana; Alvarado, Christine; Brown, Molly; Hunt, Earl B.; Jaffe, Matt; Joslyn, Susan; Pinnell, Denise; Reese, Jon; Samarziya, Jeffrey;
1997-01-01
For the past 17 years, Professor Leveson and her graduate students have been developing a theoretical foundation for safety in complex systems and building a methodology upon that foundation. The methodology includes special management structures and procedures, system hazard analyses, software hazard analysis, requirements modeling and analysis for completeness and safety, special software design techniques including the design of human-machine interaction, verification, operational feedback, and change analysis. The Safeware methodology is based on system safety techniques that are extended to deal with software and human error. Automation is used to enhance our ability to cope with complex systems. Identification, classification, and evaluation of hazards are done using modeling and analysis. To be effective, the models and analysis tools must consider the hardware, software, and human components in these systems. They also need to include a variety of analysis techniques and orthogonal approaches: There exists no single safety analysis or evaluation technique that can handle all aspects of complex systems. Applying only one or two may make us feel satisfied, but will produce limited results. We report here on a demonstration, performed as part of a contract with NASA Langley Research Center, of the Safeware methodology on the Center-TRACON Automation System (CTAS) portion of the air traffic control (ATC) system and procedures currently employed at the Dallas/Fort Worth (DFW) TRACON (Terminal Radar Approach CONtrol). CTAS is an automated system to assist controllers in handling arrival traffic in the DFW area. Safety is a system property, not a component property, so our safety analysis considers the entire system and not simply the automated components. Because safety analysis of a complex system is an interdisciplinary effort, our team included system engineers, software engineers, human factors experts, and cognitive psychologists.
Accident analysis and control options in support of the sludge water system safety analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
HEY, B.E.
A hazards analysis was initiated for the SWS in July 2001 (SNF-8626, K Basin Sludge and Water System Preliminary Hazard Analysis) and updated in December 2001 (SNF-10020 Rev. 0, Hazard Evaluation for KE Sludge and Water System - Project A16) based on conceptual design information for the Sludge Retrieval System (SRS) and 60% design information for the cask and container. SNF-10020 was again revised in September 2002 to incorporate new hazards identified from final design information and from a What-if/Checklist evaluation of operational steps. The process hazards, controls, and qualitative consequence and frequency estimates taken from these efforts have been incorporated into Revision 5 of HNF-3960, K Basins Hazards Analysis. The hazards identification process documented in the above referenced reports utilized standard industrial safety techniques (AIChE 1992, Guidelines for Hazard Evaluation Procedures) to systematically guide several interdisciplinary teams through the system using a pre-established set of process parameters (e.g., flow, temperature, pressure) and guide words (e.g., high, low, more, less). The teams generally included representation from the U.S. Department of Energy (DOE), K Basins Nuclear Safety, T Plant Nuclear Safety, K Basin Industrial Safety, fire protection, project engineering, operations, and facility engineering.
Safety analysis in test facility design
NASA Astrophysics Data System (ADS)
Valk, A.; Jonker, R. J.
1990-09-01
The application of safety analysis techniques as developed in, for example, the nuclear and petrochemical industries can be very beneficial in coping with the increasing complexity of modern test facility installations and their operations. To illustrate the various techniques available and their phasing in a project, an overview of the most commonly used techniques is presented. Two case studies are described: the hazard and operability study technique, and safety zoning in relation to the possible presence of asphyxiating atmospheres.
Nadeau-Fredette, Annie-Claire; Hawley, Carmel M.; Pascoe, Elaine M.; Chan, Christopher T.; Clayton, Philip A.; Polkinghorne, Kevan R.; Boudville, Neil; Leblanc, Martine
2015-01-01
Background and objectives Home dialysis is often recognized as a first-choice therapy for patients initiating dialysis. However, studies comparing clinical outcomes between peritoneal dialysis and home hemodialysis have been very limited. Design, setting, participants, & measurements This Australia and New Zealand Dialysis and Transplantation Registry study assessed all Australian and New Zealand adult patients receiving home dialysis on day 90 after initiation of RRT between 2000 and 2012. The primary outcome was overall survival. The secondary outcomes were on-treatment survival, patient and technique survival, and death-censored technique survival. All results were adjusted with three prespecified models: multivariable Cox proportional hazards model (main model), propensity score quintile–stratified model, and propensity score–matched model. Results The study included 10,710 patients on incident peritoneal dialysis and 706 patients on incident home hemodialysis. Treatment with home hemodialysis was associated with better patient survival than treatment with peritoneal dialysis (5-year survival: 85% versus 44%, respectively; log-rank P<0.001). Using multivariable Cox proportional hazards analysis, home hemodialysis was associated with superior patient survival (hazard ratio for overall death, 0.47; 95% confidence interval, 0.38 to 0.59) as well as better on-treatment survival (hazard ratio for on-treatment death, 0.34; 95% confidence interval, 0.26 to 0.45), composite patient and technique survival (hazard ratio for death or technique failure, 0.34; 95% confidence interval, 0.29 to 0.40), and death-censored technique survival (hazard ratio for technique failure, 0.34; 95% confidence interval, 0.28 to 0.41). Similar results were obtained with the propensity score models as well as sensitivity analyses using competing risks models and different definitions for technique failure and lag period after modality switch, during which events were attributed to the initial modality. Conclusions Home hemodialysis was associated with superior patient and technique survival compared with peritoneal dialysis. PMID:26068181
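A minimal sketch of the kind of multivariable Cox proportional hazards fit described, using the lifelines Python package; the DataFrame columns and toy values are hypothetical stand-ins for the registry variables, not an extract of ANZDATA.

```python
# Minimal Cox proportional hazards sketch in the spirit of the registry
# analysis. Uses the `lifelines` package; the columns (time_yr, died,
# home_hd, age) and their values are hypothetical stand-ins, kept tiny
# only so the example runs.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.DataFrame({
    "time_yr": [1.2, 4.5, 3.1, 5.0, 0.8, 2.7, 3.9, 1.5],
    "died":    [1,   0,   1,   0,   1,   1,   0,   0  ],
    "home_hd": [0,   1,   0,   1,   0,   1,   0,   1  ],  # 1 = home hemodialysis
    "age":     [62,  48,  70,  55,  66,  59,  63,  51 ],
})

cph = CoxPHFitter()
cph.fit(df, duration_col="time_yr", event_col="died")
cph.print_summary()  # hazard ratio for home_hd adjusted for age
```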
Data gap filling techniques are commonly used to predict hazard in the absence of empirical data. The most established techniques are read-across, trend analysis and quantitative structure-activity relationships (QSARs). Toxic equivalency factors (TEFs) are less frequently used d...
Assessing the validity of prospective hazard analysis methods: a comparison of two techniques
2014-01-01
Background Prospective Hazard Analysis techniques such as Healthcare Failure Modes and Effects Analysis (HFMEA) and Structured What If Technique (SWIFT) have the potential to increase safety by identifying risks before an adverse event occurs. Published accounts of their application in healthcare have identified benefits, but the reliability of some methods has been found to be low. The aim of this study was to examine the validity of SWIFT and HFMEA by comparing their outputs in the process of risk assessment, and comparing the results with risks identified by retrospective methods. Methods The setting was a community-based anticoagulation clinic, in which risk assessment activities had been previously performed and were available. A SWIFT and an HFMEA workshop were conducted consecutively on the same day by experienced experts. Participants were a mixture of pharmacists, administrative staff and software developers. Both methods produced lists of risks scored according to the method’s procedure. Participants’ views about the value of the workshops were elicited with a questionnaire. Results SWIFT identified 61 risks and HFMEA identified 72 risks. For both methods less than half the hazards were identified by the other method. There was also little overlap between the results of the workshops and risks identified by prior root cause analysis, staff interviews or clinical governance board discussions. Participants’ feedback indicated that the workshops were viewed as useful. Conclusions Although there was limited overlap, both methods raised important hazards. Scoping the problem area had a considerable influence on the outputs. The opportunity for teams to discuss their work from a risk perspective is valuable, but these methods cannot be relied upon in isolation to provide a comprehensive description. Multiple methods for identifying hazards should be used and data from different sources should be integrated to give a comprehensive view of risk in a system. PMID:24467813
Application of hazard and effects management tools and links to the HSE case
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gower-Jones, A.D.; Graaf, G.C. van der; Milne, D.J.
1996-12-31
Many tools and techniques are promoted for the analysis and management of hazards and their effects. The proliferation of these tools in the last 5-6 years has resulted in an overload on designers, engineers and operators of E&P activities and assets, to the extent that they are unsure what to do when and how it all fits together. This paper starts from the basic E&P business (a business model) and the basic structure of any accidental event (bow tie), and maps the tools and techniques used to analyze the hazards and effects for both asset and activity HSE management. The links to developing an HSE case within the HSE-MS for assets and activities are given.
NASA Astrophysics Data System (ADS)
Schoessow, F. S.; Li, Y.; Howe, P. D.
2016-12-01
Extreme heat events are the deadliest natural hazard in the United States and are expected to increase in both severity and frequency in the coming years due to the effects of climate change. The risks of climate change and weather-related events such as heat waves to a population can be more comprehensively assessed by coupling the traditional examination of natural hazards using remote sensing and geospatial analysis techniques with human vulnerability factors and individual perceptions of hazards. By analyzing remotely sensed and empirical survey data alongside national hazards advisories, this study endeavors to establish a nationally representative baseline quantifying the spatiotemporal variation of individual heat vulnerabilities at multiple scales and between disparate population groups affected by their unique socioenvironmental factors. This is of immediate academic interest because the study of heat wave risk perceptions remains relatively unexplored, despite the intensification of extreme heat events. The use of "human sensors", georeferenced and timestamped individual response data, provides invaluable contextualized data at a high spatial resolution, which will enable policy-makers to more effectively implement targeted strategies for risk prevention, mitigation, and communication. As climate change risks are further defined, this cognizance will help identify vulnerable populations and enhance national hazard preparedness and recovery frameworks.
NASA Astrophysics Data System (ADS)
Moreira, Francisco; Silva, Nuno
2016-08-01
Safety systems require accident avoidance. This is covered by application standards, processes, techniques and tools that support the identification, analysis, and elimination or reduction to an acceptable level of system risks and hazards. Ideally, a safety system should be free of hazards. However, both industry and academia have been struggling to ensure appropriate risk and hazard analysis, especially in what concerns completeness of the hazards, formalization, and timely analysis in order to influence the specifications and the implementation. Such analysis is also important when considering a change to an existing system. The Common Safety Method for Risk Evaluation and Assessment (CSM-RA) is a mandatory procedure whenever any significant change is proposed to the railway system in a European Member State. This paper provides insights on the fundamentals of CSM-RA, based on and complemented with Hazard Analysis. When and how to apply them, and the relation and similarities of these processes with industry standards and the system life cycles, are highlighted. Finally, the paper shows how CSM-RA can be the basis of a change management process, guiding the identification and management of the hazards and helping to ensure a safety level similar to that of the initial system. This paper also shows how the CSM-RA principles can be used in other domains, particularly for space system evolution.
NASA Technical Reports Server (NTRS)
McLeod, Ken; Stoltzfus, Joel
2006-01-01
Oxygen relief systems present a serious fire hazard risk with often severe consequences. This presentation offers a risk management solution strategy which encourages minimizing ignition hazards, maximizing best materials, and utilizing good practices. Additionally, the relief system should be designed for cleanability and ballistic flow. The use of the right metals, softgoods, and lubricants, along with the best assembly techniques, is stressed. Materials should also be tested if data is not available and a full hazard analysis should be conducted in an effort to minimize risk and harm.
NASA Astrophysics Data System (ADS)
Wilson, R. I.; Eble, M. C.
2013-12-01
The U.S. National Tsunami Hazard Mitigation Program (NTHMP) is comprised of representatives from coastal states and federal agencies who, under the guidance of NOAA, work together to develop protocols and products to help communities prepare for and mitigate tsunami hazards. Within the NTHMP are several subcommittees responsible for complementary aspects of tsunami assessment, mitigation, education, warning, and response. The Mapping and Modeling Subcommittee (MMS) is comprised of state and federal scientists who specialize in tsunami source characterization, numerical tsunami modeling, inundation map production, and warning forecasting. Until September 2012, much of the work of the MMS was authorized through the Tsunami Warning and Education Act, an Act that has since expired but the spirit of which is being adhered to in parallel with reauthorization efforts. Over the past several years, the MMS has developed guidance and best practices for states and territories to produce accurate and consistent tsunami inundation maps for community level evacuation planning, and has conducted benchmarking of numerical inundation models. Recent tsunami events have highlighted the need for other types of tsunami hazard analyses and products for improving evacuation planning, vertical evacuation, maritime planning, land-use planning, building construction, and warning forecasts. As the program responsible for producing accurate and consistent tsunami products nationally, the NTHMP-MMS is initiating a multi-year plan to accomplish the following: 1) Create and build on existing demonstration projects that explore new tsunami hazard analysis techniques and products, such as maps identifying areas of strong currents and potential damage within harbors as well as probabilistic tsunami hazard analysis for land-use planning. 2) Develop benchmarks for validating new numerical modeling techniques related to current velocities and landslide sources. 3) Generate guidance and protocols for the production and use of new tsunami hazard analysis products. 4) Identify multistate collaborations and funding partners interested in these new products. Application of these new products will improve the overall safety and resilience of coastal communities exposed to tsunami hazards.
Identification and assessment of hazardous compounds in drinking water.
Fawell, J K; Fielding, M
1985-12-01
The identification of organic chemicals in drinking water and their assessment in terms of potential hazardous effects are two very different but closely associated tasks. In relation to both continuous low-level background contamination and specific, often high-level, contamination due to pollution incidents, the identification of contaminants is a pre-requisite to evaluation of significant hazards. Even in the case of the rapidly developing short-term bio-assays which are applied to water to indicate a potential genotoxic hazard (for example Ames tests), identification of the active chemicals is becoming a major factor in the further assessment of the response. Techniques for the identification of low concentrations of organic chemicals in drinking water have developed remarkably since the early 1970s and methods based upon gas chromatography-mass spectrometry (GC-MS) have revolutionised qualitative analysis of water. Such techniques are limited to "volatile" chemicals and these usually constitute a small fraction of the total organic material in water. However, in recent years there have been promising developments in techniques for "non-volatile" chemicals in water. Such techniques include combined high-performance liquid chromatography-mass spectrometry (HPLC-MS) and a variety of MS methods, involving, for example, field desorption, fast atom bombardment and thermospray ionisation techniques. In the paper identification techniques in general are reviewed and likely future developments outlined. The assessment of hazards associated with chemicals identified in drinking and related waters usually centres upon toxicology - an applied science which involves numerous disciplines. The paper examines the toxicological information needed, the quality and deployment of such information and discusses future research needs. Application of short-term bio-assays to drinking water is a developing area and one which is closely involved with, and to some extent dependent on, powerful methods of identification. Recent developments are discussed.
Man-rating of a launch vehicle
NASA Astrophysics Data System (ADS)
Soeffker, D.
Analysis techniques for hazard identification, classification, and control, developed for Spacelab, are presented. Hazards were classified as catastrophic (leading to crew or vehicle loss), critical (could lead to serious injury or damage), and controlled (counteracted by design). All nonmetallic materials were rated for flammability in oxygen-enriched atmospheres, toxic offgassing, and odor. Any element with less than 200-mission capability was rated life limited.
The British Geological Survey and the petroleum industry
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chesher, J.A.
1995-08-01
The British Geological Survey is the UK's national centre for earth science information with a parallel remit to operate internationally. The Survey's work covers the full geoscience spectrum in energy, mineral and groundwater resources and associated implications for land use, geological hazards and environmental impact. Much of the work is conducted in collaboration with industry and academia, including joint funding opportunities. Activities relating directly to hydrocarbons include basin analysis, offshore geoscience mapping, hazard assessment, fracture characterization, biostratigraphy, sedimentology, seismology, geomagnetism and frontier data acquisition techniques, offshore. The BGS poster presentation illustrates the value of the collaborative approach through consortia support for regional offshore surveys, geotechnical hazard assessments and state-of-the-art R & D into multicomponent seismic imaging techniques, among others.
Logistic regression applied to natural hazards: rare event logistic regression with replications
NASA Astrophysics Data System (ADS)
Guns, M.; Vanacker, V.
2012-06-01
Statistical analysis of natural hazards needs particular attention, as most of these phenomena are rare events. This study shows that ordinary rare event logistic regression, as it is now commonly used in geomorphologic studies, does not always lead to a robust detection of controlling factors, as the results can be strongly sample-dependent. In this paper, we introduce some concepts of Monte Carlo simulation into rare event logistic regression. This technique, so-called rare event logistic regression with replications, combines the strength of probabilistic and statistical methods, and allows overcoming some of the limitations of previous developments through robust variable selection. The technique was developed here for the analysis of landslide controlling factors, but the concept is widely applicable to statistical analyses of natural hazards.
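A minimal sketch of the replication idea, assuming a scheme in which the rare event cells are kept and the abundant non-event cells are repeatedly resampled, with coefficient stability across replications used for variable selection; the variables, sampling ratio, and replication count are illustrative assumptions rather than the authors' exact procedure.

```python
# Sketch of rare-event logistic regression with replications: the rare
# landslide cells are kept and matched against repeated random samples of
# non-landslide cells, and only predictors whose coefficients stay stable
# across replications are retained. Variable names, the 1:5 sampling
# ratio, and the replication count are illustrative assumptions.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Toy data: X columns = [slope, distance_to_road], y = landslide (rare)
n = 5000
X = rng.normal(size=(n, 2))
logit = -4.0 + 1.2 * X[:, 0] + 0.1 * X[:, 1]
y = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))

events, non_events = np.where(y)[0], np.where(~y)[0]
coefs = []
for _ in range(200):                          # replications
    sampled = rng.choice(non_events, size=5 * len(events), replace=False)
    idx = np.concatenate([events, sampled])
    model = LogisticRegression().fit(X[idx], y[idx])
    coefs.append(model.coef_[0])

coefs = np.array(coefs)
print("mean coefficients:", coefs.mean(axis=0))
print("std across replications:", coefs.std(axis=0))
# A predictor whose sign flips across replications is not a robust factor.
```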
NASA Astrophysics Data System (ADS)
Lazzari, M.; Loperte, A.; Perrone, A.
2010-03-01
This work, carried out with an integrated methodological approach, focuses on the use of near-surface geophysical techniques, such as ground penetrating radar and electrical resistivity tomography (ERT), and geomorphological analysis, in order to reconstruct the cave distribution and geometry in an urban context and, in particular, in historical centres. The interaction during recent centuries between human activity (cave excavation, birth and growth of an urban area) and the characters of the natural environment has been the reason for a progressive increase in hazard and vulnerability levels of several sites. The reconstruction of a detailed cave distribution map is the first step in defining the anthropic and geomorphological hazard in urban areas, and a fundamental basis for planning and assessing the risk.
Feizizadeh, Bakhtiar; Jankowski, Piotr; Blaschke, Thomas
2014-03-01
GIS multicriteria decision analysis (MCDA) techniques are increasingly used in landslide susceptibility mapping for the prediction of future hazards, land use planning, as well as for hazard preparedness. However, the uncertainties associated with MCDA techniques are inevitable and model outcomes are open to multiple types of uncertainty. In this paper, we present a systematic approach to uncertainty and sensitivity analysis. We assess the uncertainty of landslide susceptibility maps produced with GIS-MCDA techniques. A new spatially-explicit approach and Dempster-Shafer Theory (DST) are employed to assess the uncertainties associated with two MCDA techniques, namely Analytical Hierarchical Process (AHP) and Ordered Weighted Averaging (OWA) implemented in GIS. The methodology is composed of three different phases. First, weights are computed to express the relative importance of factors (criteria) for landslide susceptibility. Next, the uncertainty and sensitivity of landslide susceptibility is analyzed as a function of weights using Monte Carlo Simulation and Global Sensitivity Analysis. Finally, the results are validated using a landslide inventory database and by applying DST. The comparisons of the obtained landslide susceptibility maps of both MCDA techniques with known landslides show that the AHP outperforms OWA. However, the OWA-generated landslide susceptibility map shows lower uncertainty than the AHP-generated map. The results demonstrate that further improvement in the accuracy of GIS-based MCDA can be achieved by employing an integrated uncertainty-sensitivity analysis approach, in which the uncertainty of landslide susceptibility model is decomposed and attributed to model's criteria weights.
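A minimal sketch of the Monte Carlo weight-uncertainty step for an AHP-style weighted linear combination; the criteria, base weights, and perturbation range are illustrative assumptions, not values from the study area.

```python
# Sketch of the Monte Carlo step for weight uncertainty in an AHP-style
# weighted linear combination. Criteria names, base weights, and the
# +/-20% perturbation range are illustrative assumptions, not the values
# derived for the study area.
import numpy as np

rng = np.random.default_rng(42)

# Toy susceptibility criteria for 4 map cells, already scaled to [0, 1]
criteria = np.array([
    # slope, lithology, land_use, precipitation
    [0.9, 0.7, 0.4, 0.6],
    [0.2, 0.3, 0.1, 0.2],
    [0.6, 0.8, 0.7, 0.5],
    [0.4, 0.2, 0.9, 0.3],
])
base_w = np.array([0.4, 0.3, 0.2, 0.1])   # AHP-derived weights (assumed)

scores = []
for _ in range(1000):
    w = base_w * rng.uniform(0.8, 1.2, size=base_w.size)  # perturb +/-20%
    w /= w.sum()                                           # re-normalize
    scores.append(criteria @ w)

scores = np.array(scores)
print("mean susceptibility per cell:", scores.mean(axis=0).round(3))
print("std (uncertainty) per cell:  ", scores.std(axis=0).round(3))
```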
Seismic Hazard Analysis as a Controlling Technique of Induced Seismicity in Geothermal Systems
NASA Astrophysics Data System (ADS)
Convertito, V.; Sharma, N.; Maercklin, N.; Emolo, A.; Zollo, A.
2011-12-01
The effects of induced seismicity in geothermal systems during stimulation and fluid circulation can cover a wide range, from light and unfelt to severe and damaging. If a modern geothermal system is to achieve the greatest efficiency from the social point of view, the system must be manageable in a way that reduces possible impacts in advance. In this framework, automatic control of the seismic response of the stimulated reservoir is nowadays mandatory, particularly in proximity to densely populated areas. Recently, techniques have been proposed for this purpose, mainly based on the concept of the traffic light. This system provides a tool to decide the level of the stimulation rate based on real-time analysis of the induced seismicity and the ongoing ground-motion values. However, in some cases the induced effect can be delayed with respect to the time when the reservoir is stimulated. Thus, a controlling technique able to estimate ground-motion levels on different time scales can help to better control the geothermal system. Here we present an adaptation of classical probabilistic seismic hazard analysis to the case where the seismicity rate as well as the propagation medium properties are not constant with time. We use a non-homogeneous seismicity model for modeling purposes, in which the seismicity rate and the b-value of the recurrence relationship change with time. Additionally, as a further controlling procedure, we propose a moving time-window analysis of the recorded peak ground-motion values aimed at monitoring changes in the propagation medium. In fact, for the same set of magnitude values recorded at the same stations, we expect that on average peak ground-motion values attenuate in the same way. As a consequence, the residual differences can be reasonably ascribed to changes in medium properties. These changes can be modeled and directly introduced into the hazard integral. We applied the proposed technique to a training dataset of induced earthquakes recorded by the Berkeley-Geysers network, which is installed in The Geysers geothermal area in Northern California. The reliability of the technique is then tested using a different dataset, performing seismic hazard analysis in a time-evolving approach that provides ground-motion values having fixed probabilities of exceedance. Those values can finally be compared with the observations by using appropriate statistical tests.
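A minimal sketch of the time-dependent idea, assuming a non-homogeneous Poisson occurrence model in which the event rate and the Gutenberg-Richter b-value are re-estimated per monitoring window; the toy attenuation relation and all parameter values are placeholders, not those calibrated for The Geysers.

```python
# Sketch of a time-dependent hazard estimate: for each monitoring window
# the rate of events above a minimum magnitude and the Gutenberg-Richter
# b-value are re-estimated, and the probability of exceeding a ground-
# motion threshold is recomputed. The attenuation law and all parameter
# values are placeholder assumptions.
import math

def p_exceed(rate_per_yr, b_value, m_min, pga_threshold_g, window_yr=0.25):
    """P(at least one event exceeding the PGA threshold in the window)."""
    # Magnitude needed to exceed the threshold under a toy attenuation law:
    # log10(PGA) = -2.0 + 0.3 * M  (placeholder coefficients)
    m_needed = (math.log10(pga_threshold_g) + 2.0) / 0.3
    # Gutenberg-Richter: fraction of events with M >= m_needed
    frac = 10 ** (-b_value * max(m_needed - m_min, 0.0))
    lam = rate_per_yr * frac                     # rate of "damaging" events
    return 1.0 - math.exp(-lam * window_yr)      # Poisson occurrence

# Two hypothetical monitoring windows with different rates and b-values
for label, rate, b in [("low injection", 200.0, 1.3),
                       ("high injection", 800.0, 1.0)]:
    p = p_exceed(rate, b, m_min=1.0, pga_threshold_g=0.1)
    print(f"{label:15s} P(PGA > 0.1 g in 3 months) = {p:.2%}")
```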
Airborne Forward-Looking Interferometer for the Detection of Terminal-Area Hazards
NASA Technical Reports Server (NTRS)
West, Leanne; Gimmestad, Gary; Lane, Sarah; Smith, Bill L.; Kireev, Stanislav; Daniels, Taumi S.; Cornman, Larry; Sharman, Bob
2014-01-01
The Forward Looking Interferometer (FLI) program was a multi-year cooperative research effort to investigate the use of imaging radiometers with high spectral resolution, using both modeling/simulation and field experiments, along with sophisticated data analysis techniques that were originally developed for analysis of data from space-based radiometers and hyperspectral imagers. This investigation has advanced the state of knowledge in this technical area, and the FLI program developed a greatly improved understanding of the radiometric signal strength of aviation hazards in a wide range of scenarios, in addition to a much better understanding of the real-world functionality requirements for hazard detection instruments. The project conducted field experiments on three hazards (turbulence, runway conditions, and wake vortices) and analytical studies on several others, including volcanic ash, reduced visibility conditions, and in-flight icing conditions.
Hazard analysis of Clostridium perfringens in the Skylab Food System
NASA Technical Reports Server (NTRS)
Bourland, C. T.; Huber, C. S.; Kiser, P. R.; Heidelbaugh, N. D.; Rowley, D. B.
1974-01-01
The Skylab Food System presented unique microbiological problems because food was warmed in null-gravity and because the heat source was limited to 69.4 C (to prevent boiling in null-gravity). For these reasons, the foods were manufactured using critical control point techniques of quality control coupled with appropriate hazard analyses. One of these hazard analyses evaluated the threat from Clostridium perfringens. Samples of food were inoculated with C. perfringens and incubated for 2 h at temperatures ranging from 25 to 55 C. Generation times were determined for the foods at various temperatures. Results of these tests were evaluated taking into consideration: food-borne disease epidemiology, the Skylab food manufacturing procedures, and the performance requirements of the Skylab Food System. Based on this hazard analysis, a limit for C. perfringens of 100/g was established for Skylab foods.
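The doubling-time arithmetic behind such an analysis can be sketched as follows; the generation times used are assumed round numbers for illustration, not the measured Skylab values.

```python
# Doubling-time arithmetic behind this kind of hazard analysis: starting
# from the 100 CFU/g limit, how much could C. perfringens grow during a
# warming period? The generation times below are assumed round numbers,
# not the values measured for the Skylab foods.
def population_after(n0_per_g, hold_time_min, generation_time_min):
    return n0_per_g * 2 ** (hold_time_min / generation_time_min)

initial = 100          # CFU/g, the limit set for Skylab foods
hold = 120             # minutes warmed (2 h, as in the incubation tests)
for temp_c, gen_min in [(25, 60), (45, 15), (55, 40)]:   # assumed values
    final = population_after(initial, hold, gen_min)
    print(f"{temp_c} °C: ~{final:,.0f} CFU/g after {hold} min")
```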
NASA Astrophysics Data System (ADS)
Tierz, Pablo; Sandri, Laura; Ramona Stefanescu, Elena; Patra, Abani; Marzocchi, Warner; Costa, Antonio; Sulpizio, Roberto
2014-05-01
Explosive volcanoes and, especially, Pyroclastic Density Currents (PDCs) pose an enormous threat to populations living in the surroundings of volcanic areas. Difficulties in the modeling of PDCs are related to (i) very complex and stochastic physical processes, intrinsic to their occurrence, and (ii) a lack of knowledge about how these processes actually form and evolve. This means that there are deep uncertainties (namely, of aleatory nature due to point (i) above, and of epistemic nature due to point (ii) above) associated with the study and forecast of PDCs. Consequently, the assessment of their hazard is better described in terms of probabilistic approaches rather than deterministic ones. What is actually done to assess probabilistic hazard from PDCs is to couple deterministic simulators with statistical techniques that can, eventually, supply probabilities and inform about the uncertainties involved. In this work, some examples of both PDC numerical simulators (Energy Cone and TITAN2D) and uncertainty quantification techniques (Monte Carlo sampling -MC-, Polynomial Chaos Quadrature -PCQ- and Bayesian Linear Emulation -BLE-) are presented, and their advantages, limitations and future potential are underlined. The key point in choosing a specific method lies in the balance between its computational cost, the physical reliability of the simulator and the pursued target of the hazard analysis (type of PDCs considered, time scale selected for the analysis, particular guidelines received from decision-making agencies, etc.). Although current numerical and statistical techniques have brought important advances in probabilistic volcanic hazard assessment from PDCs, some of them may be further applicable to more sophisticated simulators. In addition, forthcoming improvements could be focused on three main multidisciplinary directions: 1) validate the simulators frequently used (through comparison with PDC deposits and other simulators), 2) decrease simulator runtimes (whether by increasing the knowledge about the physical processes or by more efficient programming, parallelization, ...), and 3) improve uncertainty quantification techniques.
Feizizadeh, Bakhtiar; Jankowski, Piotr; Blaschke, Thomas
2014-01-01
GIS multicriteria decision analysis (MCDA) techniques are increasingly used in landslide susceptibility mapping for the prediction of future hazards, land use planning, as well as for hazard preparedness. However, the uncertainties associated with MCDA techniques are inevitable and model outcomes are open to multiple types of uncertainty. In this paper, we present a systematic approach to uncertainty and sensitivity analysis. We assess the uncertainty of landslide susceptibility maps produced with GIS-MCDA techniques. A new spatially-explicit approach and Dempster–Shafer Theory (DST) are employed to assess the uncertainties associated with two MCDA techniques, namely Analytical Hierarchical Process (AHP) and Ordered Weighted Averaging (OWA) implemented in GIS. The methodology is composed of three different phases. First, weights are computed to express the relative importance of factors (criteria) for landslide susceptibility. Next, the uncertainty and sensitivity of landslide susceptibility is analyzed as a function of weights using Monte Carlo Simulation and Global Sensitivity Analysis. Finally, the results are validated using a landslide inventory database and by applying DST. The comparisons of the obtained landslide susceptibility maps of both MCDA techniques with known landslides show that the AHP outperforms OWA. However, the OWA-generated landslide susceptibility map shows lower uncertainty than the AHP-generated map. The results demonstrate that further improvement in the accuracy of GIS-based MCDA can be achieved by employing an integrated uncertainty–sensitivity analysis approach, in which the uncertainty of landslide susceptibility model is decomposed and attributed to model's criteria weights. PMID:25843987
An Integrated GIS-Expert System Framework for Live Hazard Monitoring and Detection.
McCarthy, James D; Graniero, Phil A; Rozic, Steven M
2008-02-08
In the context of hazard monitoring, using sensor web technology to monitor and detect hazardous conditions in near-real-time can result in large amounts of spatial data that can be used to drive analysis at an instrumented site. These data can be used for decision making and problem solving; however, as with any analysis problem, the success of analyzing hazard potential is governed by many factors such as: the quality of the sensor data used as input; the meaning that can be derived from those data; the reliability of the model used to describe the problem; the strength of the analysis methods; and the ability to effectively communicate the end results of the analysis. For decision makers to make use of sensor web data these issues must be dealt with to some degree. The work described in this paper addresses all of these areas by showing how raw sensor data can be automatically transformed into a representation which matches a predefined model of the problem context. This model can be understood by analysis software that leverages rule-based logic and inference techniques to reason with, and draw conclusions about, spatial data. These tools are integrated with a well known Geographic Information System (GIS) and existing geospatial and sensor web infrastructure standards, providing expert users with the tools needed to thoroughly explore a problem site and investigate hazards in any domain.
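A minimal sketch of rule-based reasoning over sensor observations that have been mapped to a simple problem model; the sensor names, thresholds, and rules are hypothetical and not part of the cited framework.

```python
# Sketch of rule-based reasoning over sensor-web observations after they
# have been mapped to a simple problem model. Sensor names, thresholds,
# and rules are hypothetical, not those of the cited framework.
from dataclasses import dataclass

@dataclass
class Observation:
    sensor_id: str
    quantity: str        # e.g. "water_level_m", "rainfall_mm_per_h"
    value: float
    quality_ok: bool     # simple data-quality flag from the transformation

def infer_hazard(obs_list):
    """Apply simple rules and return (hazard_level, reasons)."""
    reasons = []
    values = {o.quantity: o.value for o in obs_list if o.quality_ok}
    if values.get("rainfall_mm_per_h", 0) > 30 and \
       values.get("water_level_m", 0) > 2.5:
        reasons.append("heavy rain with high river stage")
        return "HIGH", reasons
    if values.get("water_level_m", 0) > 2.0:
        reasons.append("river stage above watch level")
        return "MODERATE", reasons
    return "LOW", reasons

obs = [Observation("g1", "water_level_m", 2.7, True),
       Observation("r4", "rainfall_mm_per_h", 42.0, True)]
print(infer_hazard(obs))   # -> ('HIGH', ['heavy rain with high river stage'])
```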
Spatial earthquake hazard assessment of Evansville, Indiana
Rockaway, T.D.; Frost, J.D.; Eggert, D.L.; Luna, R.
1997-01-01
The earthquake hazard has been evaluated for a 150-square-kilometer area around Evansville, Indiana. GIS-QUAKE, a system that combines liquefaction and ground motion analysis routines with site-specific geological, geotechnical, and seismological information, was used for the analysis. The hazard potential was determined by using 586 SPT borings, 27 CPT soundings, 39 shear-wave velocity profiles, and synthesized acceleration records for body-wave magnitude 6.5 and 7.3 mid-continental earthquakes, occurring at distances of 50 km and 250 km, respectively. The results of the GIS-QUAKE hazard analyses for Evansville identify areas with a high hazard potential that had not previously been identified in earthquake zonation studies. The Pigeon Creek area specifically is identified as having significant potential for liquefaction-induced damage. Damage as a result of ground motion amplification is determined to be a moderate concern throughout the area. Differences in the findings of this zonation study and previous work are attributed to the size and range of the database, the hazard evaluation methodologies, and the geostatistical interpolation techniques used to estimate the hazard potential. Further, assumptions regarding the groundwater elevations made in previous studies are also considered to have had a significant effect on the results.
NASA Technical Reports Server (NTRS)
Uber, James G.
1988-01-01
Software itself is not hazardous, but since software and hardware share common interfaces there is an opportunity for software to create hazards. Further, these software systems are complex, and proven methods for the design, analysis, and measurement of software safety are not yet available. Some past software failures, future NASA software trends, software engineering methods, and tools and techniques for various software safety analyses are reviewed. Recommendations to NASA are made based on this review.
Pacific Basin conference on hazardous waste: Proceedings
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
This conference was held November 4--8, 1996 in Kuala Lumpur, Malaysia. The purpose of this conference was to provide a multidisciplinary forum for exchange of state-of-the-art information on the problems of hazardous waste. Topics of discussion deal with pollution prevention, waste treatment technology, health and ecosystem effects research, analysis and assessment, and regulatory management techniques. Individual papers have been processed separately for inclusion in the appropriate data bases.
NASA Astrophysics Data System (ADS)
García-Rodríguez, M. J.; Malpica, J. A.; Benito, B.
2009-04-01
In recent years, interest in landslide hazard assessment studies has increased substantially. Such studies are appropriate for evaluation and mitigation plan development in landslide-prone areas. Several techniques are available for landslide hazard research at a regional scale. Generally, they can be classified in two groups: qualitative and quantitative methods. Most qualitative methods tend to be subjective, since they depend on expert opinion and represent hazard levels in descriptive terms. On the other hand, quantitative methods are objective and are commonly used due to the correlation between the instability factors and the location of the landslides. Within this group, statistical approaches and new heuristic techniques based on artificial intelligence (artificial neural networks (ANN), fuzzy logic, etc.) provide rigorous analysis to assess landslide hazard over large regions. However, they depend on the qualitative and quantitative data, scale, types of movements and characteristic factors used. We analysed and compared an approach for assessing earthquake-triggered landslide hazard using logistic regression (LR) and artificial neural networks (ANN) with a back-propagation learning algorithm. An application was developed in El Salvador, a country of Central America where earthquake-triggered landslides are common phenomena. In a first phase, we analysed the susceptibility and hazard associated with the seismic scenario of the 2001 January 13th earthquake. We calibrated the models using data from the landslide inventory for this scenario. These analyses require input variables representing physical parameters that contribute to the initiation of slope instability, for example, slope gradient, elevation, aspect, mean annual precipitation, lithology, land use, and terrain roughness, while the occurrence or non-occurrence of landslides is considered as the dependent variable. The results of the landslide susceptibility analysis are checked using landslide location data. These results show a high concordance between the landslide inventory and the estimated high-susceptibility zones, with an adjustment of 95.1% for the ANN model and 89.4% for the LR model. In addition, we make a comparative analysis of both techniques using the Receiver Operating Characteristic (ROC) curve, a graphical plot of sensitivity vs. (1 - specificity) for a binary classifier as a function of its discrimination threshold, and calculate the Area Under the ROC (AUROC) value for each model. Finally, the previous models are used to develop a new probabilistic landslide hazard map for future events, obtained by combining the expected triggering factor (calculated earthquake ground motion) for a return period of 475 years with the susceptibility map.
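A minimal sketch of the logistic-regression half of this comparison, evaluated with the same AUROC metric the authors use, might look like the following. The feature matrix, labels, and coefficients are synthetic stand-ins for the real instability factors and landslide inventory (slope gradient, elevation, precipitation, and so on), so the numbers carry no physical meaning.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)

# Synthetic stand-ins for instability factors, one row per terrain cell.
n_cells = 5_000
X = rng.normal(size=(n_cells, 7))
# Hypothetical "true" relation used only to generate example labels:
logits = 1.5 * X[:, 0] + 0.8 * X[:, 3] - 0.5 * X[:, 5] - 1.0
y = rng.random(n_cells) < 1.0 / (1.0 + np.exp(-logits))   # landslide / no landslide

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
susceptibility = model.predict_proba(X_test)[:, 1]

# ROC / AUROC comparison metric described in the abstract
print("AUROC:", round(roc_auc_score(y_test, susceptibility), 3))
```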
Analyzing Distributed Functions in an Integrated Hazard Analysis
NASA Technical Reports Server (NTRS)
Morris, A. Terry; Massie, Michael J.
2010-01-01
Large scale integration of today's aerospace systems is achievable through the use of distributed systems. Validating the safety of distributed systems is significantly more difficult as compared to centralized systems because of the complexity of the interactions between simultaneously active components. Integrated hazard analysis (IHA), a process used to identify unacceptable risks and to provide a means of controlling them, can be applied to either centralized or distributed systems. IHA, though, must be tailored to fit the particular system being analyzed. Distributed systems, for instance, must be analyzed for hazards in terms of the functions that rely on them. This paper will describe systems-oriented IHA techniques (as opposed to traditional failure-event or reliability techniques) that should be employed for distributed systems in aerospace environments. Special considerations will be addressed when dealing with specific distributed systems such as active thermal control, electrical power, command and data handling, and software systems (including the interaction with fault management systems). Because of the significance of second-order effects in large scale distributed systems, the paper will also describe how to analyze secondary functions to secondary functions through the use of channelization.
NASA Astrophysics Data System (ADS)
Rahamana, S. Abdul; Aruchamy, S.; Jegankumar, R.
2014-12-01
Landslides are one of the critical natural phenomena that frequently lead to serious problems in hilly areas, resulting in loss of human life and property, as well as causing severe damage to natural resources. The local geology with a high degree of slope, coupled with high-intensity rainfall and unplanned human activities, causes many landslides in this region. The study area attracts many tourists throughout the year, so it must be considered for preventive measures. Geospatial multicriteria decision analysis (MCDA) techniques are increasingly used for landslide vulnerability and hazard zonation mapping. They enable the integration of different data layers with different levels of uncertainty. In the present study, the analytic hierarchy process (AHP) method is used to prepare landslide hazard zones of Coonoor and Ooty, part of the Kallar watershed, The Nilgiris, Tamil Nadu. The study was carried out using remote sensing data, field surveys and geographic information system (GIS) tools. Ten factors that influence landslide occurrence were considered: elevation, slope aspect, slope angle, drainage density, lineament density, soil, precipitation, land use/land cover (LULC), distance from road and NDVI. These factor layers were extracted from various related spatial datasets. The factors were evaluated, and then individual factor weights and class weights were assigned to each of the related factors. The Landslide Hazard Zone Index (LHZI) was calculated using the MCDA technique based on the weights and ratings given by the AHP method. The final cumulative map of the study area was categorized into four hazard zones, classified as zone I to IV: 3.56% of the area falls in hazard zone IV, followed by 48.19% in zone III, 43.63% in zone II and 4.61% in hazard zone I. The resulting hazard zone map and the land use/land cover map were then overlaid to check the hazard status, and the existing inventory of known landslides within the study area was compared with the resulting vulnerability and hazard zone maps. The landslide hazard zonation map is useful for landslide hazard prevention, mitigation, and improvement to society, and for proper planning of land use and construction in the future.
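The AHP weighting step used here can be illustrated with a small pairwise comparison matrix whose principal eigenvector gives the factor weights, together with Saaty's consistency ratio. The four factors and comparison scores below are hypothetical; the study itself uses ten factors and its own expert judgments.

```python
import numpy as np

# Hypothetical pairwise comparison matrix for four factors
# (e.g. slope angle, drainage density, land use, precipitation) on Saaty's 1-9 scale.
A = np.array([
    [1,     3,     5,     7],
    [1 / 3, 1,     3,     5],
    [1 / 5, 1 / 3, 1,     3],
    [1 / 7, 1 / 5, 1 / 3, 1],
], dtype=float)

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()                       # priority vector (factor weights)

# Consistency check (Saaty): CI = (lambda_max - n) / (n - 1), CR = CI / RI
n = A.shape[0]
lambda_max = eigvals.real[k]
ci = (lambda_max - n) / (n - 1)
ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]            # random index table (partial)
cr = ci / ri
print("weights:", weights.round(3), "CR:", round(cr, 3))   # CR < 0.1 is acceptable
```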
Hazard Analysis of Pollution Abatement Techniques. Volume I
1974-06-01
present hazards during startup/shutdown operations when filling or emptying the diatomaceous earth filters, or when filling or emptying the carbon... columns. Frictional initiation modes can occur in the filter due to movement of diatomaceous earth over a sufficiently dry TNT layer. ...said operation. In addition, contaminated diatomaceous earth should not be handled in a dry condition. Spent carbon, as well as spent earth, should be
Expert systems in civil engineering
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kostem, C.N.; Maher, M.L.
1986-01-01
This book presents the papers given at a symposium on expert systems in civil engineering. Topics considered at the symposium included problem solving using expert system techniques, construction schedule analysis, decision making and risk analysis, seismic risk analysis systems, an expert system for inactive hazardous waste site characterization, an expert system for site selection, knowledge engineering, and knowledge-based expert systems in seismic analysis.
Feasibility Study of Radiometry for Airborne Detection of Aviation Hazards
NASA Technical Reports Server (NTRS)
Gimmestad, Gary G.; Papanicolopoulos, Chris D.; Richards, Mark A.; Sherman, Donald L.; West, Leanne L.; Johnson, James W. (Technical Monitor)
2001-01-01
Radiometric sensors for aviation hazards have the potential for widespread and inexpensive deployment on aircraft. This report contains discussions of three aviation hazards - icing, turbulence, and volcanic ash - as well as candidate radiometric detection techniques for each hazard. Dual-polarization microwave radiometry is the only viable radiometric technique for detection of icing conditions, but more research will be required to assess its usefulness to the aviation community. Passive infrared techniques are being developed for detection of turbulence and volcanic ash by researchers in this country and also in Australia. Further investigation of the infrared airborne radiometric hazard detection approaches will also be required in order to develop reliable detection/discrimination techniques. This report includes a description of a commercial hyperspectral imager for investigating the infrared detection techniques for turbulence and volcanic ash.
Use of quantified risk assessment techniques in relation to major hazard installations
NASA Astrophysics Data System (ADS)
Elliott, M. J.
Over the past decade, industry and regulatory authorities have expressed interest in the development and use of hazard assessment techniques, particularly in relation to the control of major hazards. However, misconceptions about the methodology and role of quantified hazard assessment techniques in decision-making have hindered productive dialogue on the use and value of these techniques, both within industry and between industry and regulatory authorities. This paper outlines the nature, role and current uses of hazard assessment as perceived by the author, and identifies and differentiates between those areas and types of decisions where quantification should prove beneficial, and those where it is unwarranted and should be discouraged.
Application of systems and control theory-based hazard analysis to radiation oncology.
Pawlicki, Todd; Samost, Aubrey; Brown, Derek W; Manger, Ryan P; Kim, Gwe-Ya; Leveson, Nancy G
2016-03-01
Both humans and software are notoriously challenging to account for in traditional hazard analysis models. The purpose of this work is to investigate and demonstrate the application of a new, extended accident causality model, called the systems theoretic accident model and processes (STAMP), to radiation oncology. Specifically, a hazard analysis technique based on STAMP, system-theoretic process analysis (STPA), is used to perform a hazard analysis. The STPA procedure starts with the definition of high-level accidents for radiation oncology at the medical center and the hazards leading to those accidents. From there, the hierarchical safety control structure of the radiation oncology clinic is modeled, i.e., the controls that are used to prevent accidents and provide effective treatment. Using STPA, unsafe control actions (behaviors) are identified that can lead to the hazards, as well as causal scenarios that can lead to the identified unsafe control actions. This information can be used to eliminate or mitigate potential hazards. The STPA procedure is demonstrated on a new online adaptive cranial radiosurgery procedure that omits the CT simulation step and uses CBCT for localization and planning, with a surface imaging system during treatment. The STPA procedure generated a comprehensive set of causal scenarios that are traced back to system hazards and accidents. Ten control loops were created for the new SRS procedure, which covered the areas of hospital and department management, treatment design and delivery, and vendor service. Eighty-three unsafe control actions were identified, as well as 472 causal scenarios that could lead to those unsafe control actions. STPA provides a method for understanding the role of management decisions and hospital operations on system safety and generating process design requirements to prevent hazards and accidents. The interaction of people, hardware, and software is highlighted. The method of STPA produces results that can be used to improve safety and prevent accidents, and warrants further investigation.
Infusing Reliability Techniques into Software Safety Analysis
NASA Technical Reports Server (NTRS)
Shi, Ying
2015-01-01
Software safety analysis for a large software intensive system is always a challenge. Software safety practitioners need to ensure that software related hazards are completely identified, controlled, and tracked. This paper discusses in detail how to incorporate the traditional reliability techniques into the entire software safety analysis process. In addition, this paper addresses how information can be effectively shared between the various practitioners involved in the software safety analyses. The author has successfully applied the approach to several aerospace applications. Examples are provided to illustrate the key steps of the proposed approach.
Ensemble of ground subsidence hazard maps using fuzzy logic
NASA Astrophysics Data System (ADS)
Park, Inhye; Lee, Jiyeong; Saro, Lee
2014-06-01
Hazard maps of ground subsidence around abandoned underground coal mines (AUCMs) in Samcheok, Korea, were constructed using fuzzy ensemble techniques and a geographical information system (GIS). To evaluate the factors related to ground subsidence, a spatial database was constructed from topographic, geologic, mine tunnel, land use, groundwater, and ground subsidence maps. Spatial data, topography, geology, and various ground-engineering data for the subsidence area were collected and compiled in a database for mapping ground-subsidence hazard (GSH). The subsidence area was randomly split 70/30 for training and validation of the models. The relationships between the detected ground-subsidence area and the factors were identified and quantified by frequency ratio (FR), logistic regression (LR) and artificial neural network (ANN) models. The relationships were used as factor ratings in the overlay analysis to create ground-subsidence hazard indexes and maps. The three GSH maps were then used as new input factors and integrated using fuzzy-ensemble methods to make better hazard maps. All of the hazard maps were validated by comparison with known subsidence areas that were not used directly in the analysis. As a result, the ensemble model was found to be more effective in terms of prediction accuracy than the individual models.
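One common way to realize the fuzzy ensemble step described here is the fuzzy gamma operator, which blends the fuzzy algebraic sum and product of the individual hazard-index maps. The three input maps and the gamma value in the sketch below are synthetic placeholders; the study derives its maps from FR, LR and ANN models and chooses its own operator settings.

```python
import numpy as np

rng = np.random.default_rng(2)

# Three hazard-index maps rescaled to [0, 1], standing in for the FR, LR and ANN
# ground-subsidence hazard maps described in the abstract.
fr_map = rng.random((50, 50))
lr_map = rng.random((50, 50))
ann_map = rng.random((50, 50))
maps = np.stack([fr_map, lr_map, ann_map])

def fuzzy_gamma(memberships, gamma=0.9):
    """Fuzzy gamma operator: (fuzzy algebraic sum)^gamma * (fuzzy algebraic product)^(1 - gamma)."""
    fuzzy_sum = 1.0 - np.prod(1.0 - memberships, axis=0)
    fuzzy_product = np.prod(memberships, axis=0)
    return fuzzy_sum**gamma * fuzzy_product**(1.0 - gamma)

ensemble_map = fuzzy_gamma(maps, gamma=0.9)
print("ensemble hazard index range:",
      round(float(ensemble_map.min()), 3), "-", round(float(ensemble_map.max()), 3))
```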
Tinoco, Maria Auxiliadora Cannarozzo; Nodari, Christine Tessele; Pereira, Kimberllyn Rosa da Silva
2016-09-19
This study aimed to assess the environmental and social vulnerability and identify critical highway stretches for accidents involving transportation of hazardous products on the BR-101 highway between the cities of Osório and Torres in Rio Grande do Sul State, Brazil. The study's approach consisted of a multiple-criteria analysis combining highway safety analysis and environmental and social vulnerability analysis in the occurrence of accidents with hazardous products, plus cartographic analysis techniques. Thirty-eight kilometers of the highway showed high vulnerability, including 8 kilometers with critical vulnerability, associated with bridges over rivers, water uptake points, a tunnel, environmental preservation areas, and an urban area. These stretches should be prioritized when developing action plans for accident mitigation and development of public policies for this highway. This proved to be an unprecedented approach when compared to existing studies and a potentially useful tool for decision-making in emergency operations.
Determination of Sulfate by Conductometric Titration: An Undergraduate Laboratory Experiment
ERIC Educational Resources Information Center
Garcia, Jennifer; Schultz, Linda D.
2016-01-01
The classic technique for sulfate analysis in an undergraduate quantitative analysis lab involves precipitation as the barium salt with barium chloride, collection of the precipitate by gravity filtration using ashless filter paper, and removal of the filter paper by charring over a Bunsen burner. The entire process is time-consuming, hazardous,…
NASA Astrophysics Data System (ADS)
Al-Akad, S.; Akensous, Y.; Hakdaoui, M.
2017-11-01
This research article summarizes the application of remote sensing and GIS to study urban flood risk in Al Mukalla. Satellite acquisition of a flood event in October 2015 in Al Mukalla (Yemen), combined with flood risk mapping techniques, illustrates the potential risk present in this city. Satellite images (Landsat and DEM data, atmospherically and radiometrically corrected, with geometric and topographic distortions rectified) are used for flood risk mapping to produce a hazard (vulnerability) map. This map is obtained by applying image-processing techniques within a geographic information system (GIS) environment, together with the NDVI and NDWI indices, and a method to estimate flood-hazard areas. Five factors were considered in order to estimate the spatial distribution of the hazardous areas: flow accumulation, slope, land use, geology and elevation. The multi-criteria analysis makes it possible to address vulnerability to flooding and to map areas at risk of flooding in the city of Al Mukalla. The main objective of this research is to provide a simple and rapid method to reduce and manage the risks caused by floods in Yemen, taking the city of Al Mukalla as an example.
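The NDVI and NDWI indices mentioned above are simple band ratios; a minimal sketch, assuming already-calibrated Landsat-style reflectance bands (the synthetic arrays below stand in for real imagery), is:

```python
import numpy as np

rng = np.random.default_rng(5)

# Synthetic Landsat-like band reflectances (stand-ins for calibrated imagery)
green = rng.uniform(0.02, 0.3, (100, 100))
red = rng.uniform(0.02, 0.3, (100, 100))
nir = rng.uniform(0.02, 0.5, (100, 100))

eps = 1e-9                                      # guard against division by zero
ndvi = (nir - red) / (nir + red + eps)          # vegetation index
ndwi = (green - nir) / (green + nir + eps)      # McFeeters water index

water_mask = ndwi > 0.0                         # simple water / wet-surface indicator
print("fraction flagged as water:", round(float(water_mask.mean()), 3))
```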
NASA Astrophysics Data System (ADS)
Khan, K. M.; Rashid, S.; Yaseen, M.; Ikram, M.
2016-12-01
The Karakoram Highway (KKH), the 'eighth wonder of the world', was constructed and completed by agreement between Pakistan and China in 1979 as a Friendship Highway. It connects Gilgit-Baltistan, a strategically prominent region of Pakistan, with the Xinjiang region in China. Due to its varied geology/geomorphology, soil formation, steep slopes, and climate change, as well as unsustainable anthropogenic activities, the KKH remains remarkably vulnerable to natural hazards, i.e. land subsidence, landslides, erosion, rock fall, floods, debris flows, cyclical torrential rainfall and snowfall, lake outbursts, etc. Frequently, the damaging effects of these geohazards jeopardize life in the region. To ascertain the nature and frequency of the disasters and to carry out vulnerability zoning, a rating analysis and a logistic (regression) analysis were made to investigate the spatiotemporal distribution of the natural hazards. The substantial dynamics of the physiography, geology, geomorphology, soils and climate were carefully studied, while slope, aspect, elevation, profile curvature and rock hardness were calculated by different techniques. To assess the nature and intensity of the hazards, geospatial analyses were conducted and the magnitude of every factor was gauged using logistic regression, with every relevant variable integrated in the evaluation process. Logistic regression and geospatial techniques were used to map geohazard vulnerability zoning (GVZ). The GVZ model findings were endorsed by reviews of documented hazards in recent years, and a precision of more than 88.1% was achieved. The study validated the model by highlighting the good agreement between the vulnerability mapping and past documented hazards. Based on the receiver operating characteristic curve, the logistic regression model produced satisfactory results. The outcomes will be useful for sustainable land use and infrastructure planning, mainly in high-risk zones, to reduce economic damage and improve community wellbeing.
Reviewing and visualizing the interactions of natural hazards
NASA Astrophysics Data System (ADS)
Gill, Joel C.; Malamud, Bruce D.
2014-12-01
This paper presents a broad overview, characterization, and visualization of the interaction relationships between 21 natural hazards, drawn from six hazard groups (geophysical, hydrological, shallow Earth, atmospheric, biophysical, and space hazards). A synthesis is presented of the identified interaction relationships between these hazards, using an accessible visual format particularly suited to end users. Interactions considered are primarily those where a primary hazard triggers or increases the probability of secondary hazards occurring. In this paper we do the following: (i) identify, through a wide-ranging review of grey- and peer-review literature, 90 interactions; (ii) subdivide the interactions into three levels, based on how well we can characterize secondary hazards, given information about the primary hazard; (iii) determine the spatial overlap and temporal likelihood of the triggering relationships occurring; and (iv) examine the relationship between primary and secondary hazard intensities for each identified hazard interaction and group these into five possible categories. In this study we have synthesized, using accessible visualization techniques, large amounts of information drawn from many scientific disciplines. We outline the importance of constraining hazard interactions and reinforce the importance of a holistic (or multihazard) approach to natural hazard assessment. This approach allows those undertaking research into single hazards to place their work within the context of other hazards. It also communicates important aspects of hazard interactions, facilitating an effective analysis by those working on reducing and managing disaster risk within both the policy and practitioner communities.
Consumer product safety: A systems problem
NASA Technical Reports Server (NTRS)
Clark, C. C.
1971-01-01
The manufacturer, tester, retailer, consumer, repairer, disposer, trade and professional associations, national and international standards bodies, and governments in several roles are all involved in consumer product safety. A preliminary analysis, drawing on system safety techniques, is utilized to distinguish the inter-relations of these many groups and the responsibilities that they take, or could take, for product safety, including slow accident hazards as well as the more commonly discussed fast accident hazards. The importance of interactive computer-aided information flow among these groups is particularly stressed.
NASA Astrophysics Data System (ADS)
De Agostini, A.; Floris, M.; Pasquali, P.; Barbieri, M.; Cantone, A.; Riccardi, P.; Stevan, G.; Genevois, R.
2012-04-01
In the last twenty years, Differential Synthetic Aperture Radar Interferometry (DInSAR) techniques have been widely used to investigate geological processes, such as subsidence, earthquakes and landslides, through the evaluation of earth surface displacements caused by these processes. In the study of mass movements, the contribution of interferometry can be limited due to the acquisition geometry of RADAR images and the rough morphology of mountain and hilly regions, which represent typical landslide-prone areas. In this study, the advanced DInSAR techniques (i.e. Small Baseline Subset and Persistent Scatterers techniques), available in SARscape software, are used. These methods involve the use of multiple acquisition stacks (large SAR temporal series), allowing improvements and refinements in landslide identification, characterization and hazard evaluation at the basin scale. The potential and limits of the above-mentioned techniques are outlined and discussed. The study area is the Agno Valley, located in the north-eastern sector of the Italian Alps and included in the Vicenza Province (Veneto Region, Italy). This area and the entire Vicenza Province were hit by an exceptional rainfall event in November 2010 that triggered more than 500 slope instabilities. The main aim of the work is to verify whether spatial information available before the rainfall event, including ERS and ENVISAT RADAR data from 1992 to 2010, was able to predict the landslides that occurred in the study area, in order to implement an effective forecasting model. In the first step of the work a susceptibility analysis is carried out using the landslide dataset from the IFFI project (Inventario Fenomeni Franosi in Italia, Landslide Italian Inventory) and related predisposing factors, which consist of morphometric (elevation, slope, aspect and curvature) and non-morphometric (land use, distance from roads and distance from rivers) factors available from the Veneto Region spatial database. Then, to test the prediction, the results of the susceptibility analysis are compared with the locations of landslides that occurred in the study area during the November 2010 rainfall event. In the second step, results of the DInSAR analysis (displacement maps over time) are added to the prediction analysis to build up a map containing both spatial and temporal information on landslides and, as in the previous case, the prediction is tested using the November 2010 instabilities dataset. Comparison of the two tests allows evaluation of the contribution of the interferometric techniques. Finally, morphometric factors and interferometric RADAR data are combined to design a preliminary analysis scheme that provides information on the possible use of DInSAR techniques in the landslide hazard evaluation of a given area.
Sampling methods for microbiological analysis of red meat and poultry carcasses.
Capita, Rosa; Prieto, Miguel; Alonso-Calleja, Carlos
2004-06-01
Microbiological analysis of carcasses at slaughterhouses is required in the European Union for evaluating the hygienic performance of carcass production processes as required for effective hazard analysis critical control point implementation. The European Union microbial performance standards refer exclusively to the excision method, even though swabbing using the wet/dry technique is also permitted when correlation between both destructive and nondestructive methods can be established. For practical and economic reasons, the swab technique is the most extensively used carcass surface-sampling method. The main characteristics, advantages, and limitations of the common excision and swabbing methods are described here.
Tracking Hazard Analysis Data in a Jungle of Changing Design
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sullivan, Robin S.; Young, Jonathan
2006-05-14
The biggest fear of the hazard analyst is the loss of data in the middle of the design jungle. When project schedules are demanding and design is changing rapidly, it is essential that the hazard analysis data be tracked and kept current in order to provide the required project design, development, and regulatory support. Being able to identify the current information, as well as the past archived information, as the design progresses and to be able to show how the project is designing in safety through modifications based on hazard analysis results is imperative. At the DOE Hanford site in Washington State, Fluor Hanford Inc. is in the process of the removal and disposition of sludge from the 100 Area K Basins. The K Basins were used to store spent fuel from the operating reactors at the Hanford Site. The sludge is a by-product from the corrosion of the fuel and fuel storage canisters. The sludge removal project has been very dynamic involving the design, procurement and, more recently, the operation of processes at two basins, K East and K West. The project has an ambitious schedule with a large number of changes to design concepts. In order to support the complex K Basins project a technique to track the status of the hazard analysis data was developed. This paper will identify the most important elements of the tracking system and how it was used to assist the project in ensuring that current design data was reflected in a specific version of the hazard analysis and to show how the project was keeping up with the design and ensuring compliance with the requirements to design in safety. While the specifics of the data tracking strategy for the K Basins sludge removal project will be described in the paper, the general concepts of the strategy are applicable to similar projects requiring iteration of hazard analysis and design.
Fire Safety of Passenger Trains : Phase II : Application of Fire Hazard Analysis Techniques
DOT National Transportation Integrated Search
2001-12-01
On May 12, 1999, the Federal Railroad Administration (FRA) issued regulations for passenger rail equipment safety standards that included small-scale fire tests and performance criteria to evaluate the flammability and smoke characteristics of indivi...
The Use of Molecular Techniques at Hazardous Waste Sites
It is clear that typical protocols used for soil analysis would certainly fail to adequately interrogate ground-water treatment systems unless they were substantially modified. The modifications found necessary to compensate for the low biomass include molecular tools and techniq...
Safety Analysis of Soybean Processing for Advanced Life Support
NASA Technical Reports Server (NTRS)
Hentges, Dawn L.
1999-01-01
Soybean (cv. Hoyt) is one of the crops planned for food production within the Advanced Life Support System Integration Testbed (ALSSIT), a proposed habitat simulation for long-duration lunar/Mars missions. Soybeans may be processed into a variety of food products, including soymilk, tofu, and tempeh. Due to the closed environmental system and importance of crew health maintenance, food safety is a primary concern on long-duration space missions. Identification of the food safety hazards and critical control points associated with the closed ALSSIT system is essential for the development of safe food processing techniques and equipment. A Hazard Analysis Critical Control Point (HACCP) model was developed to reflect proposed production and processing protocols for ALSSIT soybeans. Soybean processing was placed in the type III risk category. During the processing of ALSSIT-grown soybeans, critical control points were identified to control microbiological hazards, particularly mycotoxins, and chemical hazards from antinutrients. Critical limits were suggested at each CCP. Food safety recommendations regarding the hazards and risks associated with growing, harvesting, and processing soybeans; biomass management; and use of multifunctional equipment were made in consideration of the limitations and constraints of the closed ALSSIT.
Data analysis and detection methods for on-line health monitoring of bridge structures
DOT National Transportation Integrated Search
2002-06-01
Developing an efficient structural health monitoring (SHM) technique is important for reducing potential hazards posed to the public by damaged civil structures. The ultimate goal of applying SHM is to real-time detect, localize, and quantify the...
WE-G-BRA-06: Application of Systems and Control Theory-Based Hazard Analysis to Radiotherapy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pawlicki, T; Samost, A; Leveson, N
Purpose: The process of delivering radiation occurs in a complex socio-technical system heavily reliant on human operators. Furthermore, both humans and software are notoriously challenging to account for in traditional hazard analysis models. High reliability industries such as aviation have approached this problem by using hazard analysis techniques grounded in systems and control theory. The purpose of this work is to apply the Systems Theoretic Accident Model Processes (STAMP) hazard model to radiotherapy. In particular, the System-Theoretic Process Analysis (STPA) approach is used to perform a hazard analysis of a proposed on-line adaptive cranial radiosurgery procedure that omits the CT Simulation step and uses only CBCT for planning, localization, and treatment. Methods: The STPA procedure first requires the definition of high-level accidents and hazards leading to those accidents. From there, hierarchical control structures were created followed by the identification and description of control actions for each control structure. Utilizing these control structures, unsafe states of each control action were created. Scenarios contributing to unsafe control action states were then identified and translated into system requirements to constrain process behavior within safe boundaries. Results: Ten control structures were created for this new CBCT-only process which covered the areas of hospital and department management, treatment design and delivery, and vendor service. Twenty-three control actions were identified that contributed to over 80 unsafe states of those control actions resulting in over 220 failure scenarios. Conclusion: The interaction of people, hardware, and software is highlighted through the STPA approach. STPA provides a hierarchical model for understanding the role of management decisions in impacting system safety so that a process design requirement can be traced back to the hazard and accident that it is intended to mitigate. Varian Medical Systems, Inc.
Rail-highway crossing hazard prediction : research results
DOT National Transportation Integrated Search
1979-12-01
This document presents techniques for constructing and evaluating railroad grade crossing hazard indexes. Hazard indexes are objective formulas for comparing or ranking crossings according to relative hazard or for calculating absolute hazard (co...
Probabilistic Approaches for Multi-Hazard Risk Assessment of Structures and Systems
NASA Astrophysics Data System (ADS)
Kwag, Shinyoung
Performance assessment of structures, systems, and components for multi-hazard scenarios has received significant attention in recent years. However, the concept of multi-hazard analysis is quite broad in nature and the focus of existing literature varies across a wide range of problems. In some cases, such studies focus on hazards that either occur simultaneously or are closely correlated with each other. For example, seismically induced flooding or seismically induced fires. In other cases, multi-hazard studies relate to hazards that are not dependent or correlated but have a strong likelihood of occurrence at different times during the lifetime of a structure. The current approaches for risk assessment need enhancement to account for multi-hazard risks. They must be able to account for uncertainty propagation in a systems-level analysis, consider correlation among events or failure modes, and allow integration of newly available information from continually evolving simulation models, experimental observations, and field measurements. This dissertation presents a detailed study that proposes enhancements by incorporating Bayesian networks and Bayesian updating within a performance-based probabilistic framework. The performance-based framework allows propagation of risk as well as uncertainties in the risk estimates within a systems analysis. Unlike conventional risk assessment techniques such as a fault-tree analysis, a Bayesian network can account for statistical dependencies and correlations among events/hazards. The proposed approach is extended to develop a risk-informed framework for quantitative validation and verification of high fidelity system-level simulation tools. Validation of such simulations can be quite formidable within the context of a multi-hazard risk assessment in nuclear power plants. The efficiency of this approach lies in identification of critical events, components, and systems that contribute to the overall risk. Validation of any event or component on the critical path is relatively more important in a risk-informed environment. The significance of multi-hazard risk is also illustrated for the uncorrelated hazards of earthquakes and high winds, which may result in competing design objectives. It is also illustrated that the number of computationally intensive nonlinear simulations needed in performance-based risk assessment for external hazards can be significantly reduced by using the power of Bayesian updating in conjunction with the concept of equivalent limit-state.
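To illustrate why a Bayesian network can capture dependencies that a fault tree with independent basic events cannot, the toy example below encodes one hazard that is conditionally dependent on another (flooding made more likely by an earthquake) and computes a system failure probability by exact enumeration. All probabilities are invented for illustration and are not drawn from the dissertation.

```python
import itertools

# Minimal illustrative Bayesian network (all values are made up):
#   Earthquake -> Flooding -> SystemFailure, and Earthquake -> SystemFailure
p_eq = 0.01                                     # P(earthquake)
p_flood_given = {True: 0.30, False: 0.02}       # P(flood | earthquake)
p_fail_given = {                                # P(failure | earthquake, flood)
    (True, True): 0.40, (True, False): 0.10,
    (False, True): 0.05, (False, False): 0.001,
}

# Exact inference by enumeration over the joint distribution
p_fail = 0.0
for eq, fl in itertools.product([True, False], repeat=2):
    p = p_eq if eq else 1.0 - p_eq
    p *= p_flood_given[eq] if fl else 1.0 - p_flood_given[eq]
    p_fail += p * p_fail_given[(eq, fl)]

print(f"P(system failure) = {p_fail:.5f}")
# Unlike a fault tree with independent basic events, the dependence of flooding
# on the earthquake is represented explicitly here.
```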
20170913 - Systematic Approaches to Biological/Chemical Read-Across for Hazard Identification (EMGS)
Read-across is a well-established data gap filling technique used within chemical category and analogue approaches for regulatory purposes. The category/analogue workflow comprises a number of steps starting from decision context, data gap analysis through to analogue identificat...
Tools for Material Design and Selection
NASA Astrophysics Data System (ADS)
Wehage, Kristopher
The present thesis focuses on applications of numerical methods to create tools for material characterization, design and selection. The tools generated in this work incorporate a variety of programming concepts, from digital image analysis, geometry, optimization, and parallel programming to data-mining, databases and web design. The first portion of the thesis focuses on methods for characterizing clustering in bimodal 5083 Aluminum alloys created by cryomilling and powder metallurgy. The bimodal samples analyzed in the present work contain a mixture of a coarse grain phase, with a grain size on the order of several microns, and an ultra-fine grain phase, with a grain size on the order of 200 nm. The mixing of the two phases is not homogeneous and clustering is observed. To investigate clustering in these bimodal materials, various microstructures were created experimentally by conventional cryomilling, Hot Isostatic Pressing (HIP), Extrusion, Dual-Mode Dynamic Forging (DMDF) and a new 'Gradient' cryomilling process. Two techniques for quantitative clustering analysis are presented, formulated and implemented. The first technique, the Area Disorder function, provides a metric of the quality of coarse grain dispersion in an ultra-fine grain matrix and the second technique, the Two-Point Correlation function, provides a metric of long and short range spatial arrangements of the two phases, as well as an indication of the mean feature size in any direction. The two techniques are implemented on digital images created by Scanning Electron Microscopy (SEM) and Electron Backscatter Detection (EBSD) of the microstructures. To investigate structure--property relationships through modeling and simulation, strategies for generating synthetic microstructures are discussed and a computer program that generates randomized microstructures with desired configurations of clustering described by the Area Disorder Function is formulated and presented. In the computer program, two-dimensional microstructures are generated by Random Sequential Adsorption (RSA) of voxelized ellipses representing the coarse grain phase. A simulated annealing algorithm is used to geometrically optimize the placement of the ellipses in the model to achieve varying user-defined configurations of spatial arrangement of the coarse grains. During the simulated annealing process, the ellipses are allowed to overlap up to a specified threshold, allowing triple junctions to form in the model. Once the simulated annealing process is complete, the remaining space is populated by smaller ellipses representing the ultra-fine grain phase. Uniform random orientations are assigned to the grains. The program generates text files that can be imported in to Crystal Plasticity Finite Element Analysis Software for stress analysis. Finally, numerical methods and programming are applied to current issues in green engineering and hazard assessment. To understand hazards associated with materials and select safer alternatives, engineers and designers need access to up-to-date hazard information. However, hazard information comes from many disparate sources and aggregating, interpreting and taking action on the wealth of data is not trivial. In light of these challenges, a Framework for Automated Hazard Assessment based on the GreenScreen list translator is presented. 
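A compact way to compute the two-point correlation function described above for a binary (two-phase) image is via an FFT-based autocorrelation; the sketch below, using a synthetic thresholded image in place of real SEM/EBSD data, assumes periodic boundary conditions and is not the thesis implementation.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic binary microstructure: 1 = coarse-grain phase, 0 = ultra-fine-grain matrix
# (a stand-in for a thresholded SEM/EBSD image).
img = (rng.random((256, 256)) < 0.3).astype(float)

def two_point_correlation(phase_img):
    """Two-point correlation S2 via FFT autocorrelation (periodic boundaries assumed).

    S2(r) is the probability that two points separated by r both lie in the phase.
    """
    f = np.fft.fft2(phase_img)
    s2 = np.fft.ifft2(f * np.conj(f)).real / phase_img.size
    return np.fft.fftshift(s2)                  # put zero separation at the image centre

s2 = two_point_correlation(img)
centre = tuple(d // 2 for d in s2.shape)
print("S2(0) =", round(float(s2[centre]), 3), "~ volume fraction", round(float(img.mean()), 3))
# S2 decays from the volume fraction toward its square at large separations;
# its decay length gives an estimate of the mean feature size in any direction.
```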
The framework consists of a computer program that automatically extracts data from the GHS-Japan hazard database, loads the data into a machine-readable JSON format, transforms the JSON document in to a GreenScreen JSON document using the GreenScreen List Translator v1.2 and performs GreenScreen Benchmark scoring on the material. The GreenScreen JSON documents are then uploaded to a document storage system to allow human operators to search for, modify or add additional hazard information via a web interface.
Loss estimation and damage forecast using database provided
NASA Astrophysics Data System (ADS)
Pyrchenko, V.; Byrova, V.; Petrasov, A.
2009-04-01
A wide spectrum of natural hazards is observed across the territory of Russia. This creates the need to investigate numerous events of dangerous natural processes and to research the mechanisms of their development and interaction with each other (synergetic amplification or the emergence of new hazards), with the purpose of forecasting possible losses. Employees of the Laboratory of the Analysis of Geological Risk, IEG RAS, have created a database of natural hazard occurrences in the territory of Russia, which contains information on 1310 cases of their occurrence during 1991-2008. The wide range of sources used created certain difficulties in building the database and required the development of a special new technique for unifying information received at different times. One element of this technique is the classification of the negative consequences of natural hazards, considering the death toll, injured persons, other victims, and direct economic damage. The database has made it possible to track the dynamics of natural hazards and the emergency situations (ES) caused by them over the period considered, and also to define the laws of their development in the territory of Russia in time and space. This makes it possible to create theoretical, methodological and methodical bases for forecasting possible losses with a certain degree of probability for the territory of Russia and for its separate regions, which guarantees support for adequate, timely and efficient pre-emptive decision-making in the future.
Calculation of Hazard Category 2/3 Threshold Quantities Using Contemporary Dosimetric Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Walker, William C.
The purpose of this report is to describe the methodology and selection of input data utilized to calculate updated Hazard Category 2 and Hazard Category 3 Threshold Quantities (TQs) using contemporary dosimetric information. The calculation of the updated TQs will be considered for use in the revision to the Department of Energy (DOE) Technical Standard (STD-) 1027-92 Change Notice (CN)-1, “Hazard Categorization and Accident Analysis Techniques for Compliance with DOE Order 5480.23, Nuclear Safety Analysis Reports.” The updated TQs documented in this report complement an effort previously undertaken by the National Nuclear Security Administration (NNSA), which in 2014 issued revised Supplemental Guidance documenting the calculation of updated TQs for approximately 100 radionuclides listed in DOE-STD-1027-92, CN-1. The calculations documented in this report complement the NNSA effort by expanding the set of radionuclides to more than 1,250 radionuclides with a published TQ. The development of this report was sponsored by the Department of Energy’s Office of Nuclear Safety (AU-30) within the Associate Under Secretary for Environment, Health, Safety, and Security organization.
Tankiewicz, Maciej; Fenik, Jolanta; Biziuk, Marek
2011-10-30
The intensification of agriculture means that increasing amounts of toxic organic and inorganic compounds are entering the environment. The pesticides generally applied nowadays are regarded as some of the most dangerous contaminants of the environment. Their presence in the environment, especially in water, is hazardous because they cause human beings to become more susceptible to disease. For these reasons, it is essential to monitor pesticide residues in the environment with the aid of all accessible analytical methods. The analysis of samples for the presence of pesticides is problematic because of the laborious and time-consuming operations involved in preparing samples for analysis, which themselves may be a source of additional contamination and errors. To date, it has been standard practice to use large quantities of organic solvents in the sample preparation process; but as these solvents are themselves hazardous, solventless and solvent-minimized techniques are coming into use. This paper discusses the sample preparation techniques most commonly used over the last 15 years for monitoring organophosphorus and organonitrogen pesticide residues in water samples. Furthermore, a significant trend in sample preparation, in accordance with the principles of 'Green Chemistry', is the simplification, miniaturization and automation of analytical techniques. In view of this, several novel techniques are being developed in order to reduce the number of analysis steps, increase sample throughput and improve the quality and sensitivity of analytical methods. The paper describes extraction techniques requiring the use of solvents - liquid-liquid extraction (LLE) and its modifications, membrane extraction techniques, hollow fibre-protected two-phase solvent microextraction, liquid phase microextraction based on the solidification of a floating organic drop (LPME-SFO), solid-phase extraction (SPE) and single-drop microextraction (SDME) - as well as solvent-free techniques - solid phase microextraction (SPME) and stir bar sorptive extraction (SBSE). The advantages and drawbacks of these techniques are also discussed, and some solutions to their limitations are proposed. Copyright © 2011 Elsevier B.V. All rights reserved.
Monitoring genetic damage to ecosystems from hazardous waste
DOE Office of Scientific and Technical Information (OSTI.GOV)
Anderson, S.L.
1992-03-01
Applications of ecological toxicity testing to hazardous waste management have increased dramatically over the last few years, resulting in a greater awareness of the need for improved biomonitoring techniques. Our laboratory is developing advanced techniques to assess the genotoxic effects of environmental contamination on ecosystems. We have developed a novel mutagenesis assay using the nematode Caenorhabditis elegans, which is potentially applicable for multimedia studies in soil, sediment, and water. In addition, we are conducting validation studies of a previously developed anaphase aberration test that utilizes sea urchin embryos. Other related efforts include field validation studies of the new tests, evaluation of their potential ecological relevance, and analysis of their sensitivity relative to that of existing toxicity tests that assess only lethal effects, rather than genetic damage.
Trajectory-Based Performance Assessment for Aviation Weather Information
NASA Technical Reports Server (NTRS)
Vigeant-Langlois, Laurence; Hansman, R. John, Jr.
2003-01-01
Based on an analysis of aviation decision-makers' time-related weather information needs, an abstraction of the aviation weather decision task was developed that involves 4-D intersection testing between aircraft trajectory hypertubes and hazardous weather hypervolumes. The framework builds on the hypothesis that hazardous meteorological fields can be simplified using discrete boundaries of surrogate threat attributes. The abstractions developed in the framework may be useful in studying how to improve the performance of weather forecasts from the trajectory-centric perspective, as well as for developing useful visualization techniques for weather information.
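One way to read the 4-D intersection abstraction concretely: represent the hazardous-weather hypervolume as bounds in time and space, sample the aircraft trajectory, and test each sample against each volume. The sketch below is a deliberately simplified illustration (axis-aligned boxes, trajectory collapsed to its centreline); none of the class names, units, or numbers come from the report.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class HazardHypervolume:
    """Axis-aligned 4-D box (time plus space) standing in for a hazardous-weather volume."""
    t_min: float
    t_max: float            # seconds from departure
    x_min: float
    x_max: float            # km east
    y_min: float
    y_max: float            # km north
    z_min: float
    z_max: float            # flight level (hundreds of feet)

    def contains(self, t: float, x: float, y: float, z: float) -> bool:
        return (self.t_min <= t <= self.t_max and self.x_min <= x <= self.x_max
                and self.y_min <= y <= self.y_max and self.z_min <= z <= self.z_max)

def trajectory_intersects(samples: List[Tuple[float, float, float, float]],
                          hazards: List[HazardHypervolume]) -> bool:
    """Discrete 4-D intersection test between a sampled trajectory and hazard volumes.

    A true trajectory hypertube would thicken each sample by navigation and timing
    uncertainty; here the tube is collapsed to its centreline for simplicity.
    """
    return any(h.contains(*s) for s in samples for h in hazards)

# Illustrative data (not from any real forecast or flight plan)
storm = HazardHypervolume(600, 1800, 40, 70, 10, 35, 280, 400)
route = [(t, 0.1 * t, 0.02 * t, 330.0) for t in range(0, 2400, 60)]
print("route intersects hazard:", trajectory_intersects(route, [storm]))
```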
40 CFR Appendix I to Part 265 - Recordkeeping Instructions
Code of Federal Regulations, 2010 CFR
2010-07-01
... physical form, i.e., liquid, sludge, solid, or contained gas. If the waste is not listed in part 261..., solid filter cake from production of ___, EPA Hazardous Waste Number W051). Each hazardous waste listed... technique(s) used at the facility to treat, store or dispose of each quantity of hazardous waste received. 1...
NASA Astrophysics Data System (ADS)
McCrea, Terry
The Shuttle Processing Contract (SPC) workforce consists of Lockheed Space Operations Co. as prime contractor, with Grumman, Thiokol Corporation, and Johnson Controls World Services as subcontractors. During the design phase, reliability engineering is instrumental in influencing the development of systems that meet the Shuttle fail-safe program requirements. Reliability engineers accomplish this objective by performing FMEA (failure modes and effects analysis) to identify potential single failure points. When technology, time, or resources do not permit a redesign to eliminate a single failure point, the single failure point information is formatted into a change request and presented to senior management of SPC and NASA for risk acceptance. In parallel with the FMEA, safety engineering conducts a hazard analysis to assure that potential hazards to personnel are assessed. The combined effort (FMEA and hazard analysis) is published as a system assurance analysis. Special ground rules and techniques are developed to perform and present the analysis. The reliability program at KSC is vigorously pursued, and has been extremely successful. The ground support equipment and facilities used to launch and land the Space Shuttle maintain an excellent reliability record.
A uniform technique for flood frequency analysis.
Thomas, W.O.
1985-01-01
This uniform technique consisted of fitting the logarithms of annual peak discharges to a Pearson Type III distribution using the method of moments. The objective was to adopt a consistent approach for the estimation of floodflow frequencies that could be used in computing average annual flood losses for project evaluation. In addition, a consistent approach was needed for defining equitable flood-hazard zones as part of the National Flood Insurance Program. -from ASCE Publications Information
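The procedure summarized here, fitting a Pearson Type III distribution to the logarithms of annual peak discharges by the method of moments (log-Pearson Type III), can be sketched as follows. The peak-discharge record is fabricated for illustration, and `scipy.stats.pearson3` is used as the distribution implementation; an agency analysis would also apply skew weighting and low-outlier adjustments not shown here.

```python
import numpy as np
from scipy import stats

# Hypothetical annual peak discharges (m^3/s); a real analysis would use a
# systematic gauge record.
peaks = np.array([120, 340, 210, 95, 480, 150, 270, 610, 180, 330,
                  140, 260, 400, 220, 510, 170, 300, 230, 190, 360], dtype=float)

# Method of moments on the logarithms of the annual peaks (log-Pearson Type III)
logq = np.log10(peaks)
mean, std = logq.mean(), logq.std(ddof=1)
skew = stats.skew(logq, bias=False)

# Discharge with a given annual exceedance probability, e.g. the 100-year flood
aep = 0.01
q100 = 10 ** stats.pearson3.ppf(1 - aep, skew, loc=mean, scale=std)
print(f"estimated 100-year flood: {q100:.0f} m^3/s")
```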
NASA Astrophysics Data System (ADS)
Meyer, F. J.; Webley, P. W.; Dehn, J.; Arko, S. A.; McAlpin, D. B.; Gong, W.
2016-12-01
Volcanic eruptions are among the most significant hazards to human society, capable of triggering natural disasters on regional to global scales. In the last decade, remote sensing has become established in operational volcano monitoring. Centers like the Alaska Volcano Observatory rely heavily on remote sensing data from optical and thermal sensors to provide time-critical hazard information. Despite this high use of remote sensing data, the presence of clouds and a dependence on solar illumination often limit their impact on decision making. Synthetic Aperture Radar (SAR) systems are widely considered superior to optical sensors in operational monitoring situations, due to their weather and illumination independence. Still, the contribution of SAR to operational volcano monitoring has been limited in the past due to high data costs, long processing times, and low temporal sampling rates of most SAR systems. In this study, we introduce the automatic SAR processing system SARVIEWS, whose advanced data analysis and data integration techniques allow, for the first time, a meaningful integration of SAR into operational monitoring systems. We will introduce the SARVIEWS database interface that allows for automatic, rapid, and seamless access to the data holdings of the Alaska Satellite Facility. We will also present a set of processing techniques designed to automatically generate a set of SAR-based hazard products (e.g. change detection maps, interferograms, geocoded images). The techniques take advantage of modern signal processing and radiometric normalization schemes, enabling the combination of data from different geometries. Finally, we will show how SAR-based hazard information is integrated in existing multi-sensor decision support tools to enable joint hazard analysis with data from optical and thermal sensors. We will showcase the SAR processing system using a set of recent natural disasters (both earthquakes and volcanic eruptions) to demonstrate its robustness. We will also show the benefit of integrating SAR with data from other sensors to support volcano monitoring. For historic eruptions at Okmok and Augustine volcano, both located in the North Pacific, we will demonstrate that the addition of SAR can lead to a significant improvement in activity detection and eruption forecasting.
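Among the SAR-based hazard products mentioned, change detection maps are often produced with a log-ratio of co-registered backscatter images followed by thresholding. A minimal sketch on synthetic data follows; the images, the simulated change, and the simple global threshold are all assumptions, not the SARVIEWS processing chain.

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic co-registered SAR backscatter images before and after an eruption;
# multiplicative gamma noise stands in for speckle between acquisitions.
before = rng.gamma(shape=4.0, scale=0.05, size=(512, 512))
after = before * rng.gamma(shape=16.0, scale=1.0 / 16.0, size=before.shape)
after[200:260, 300:380] *= 3.0                 # simulated fresh deposit / lava flow

# Log-ratio change detection with a simple global threshold;
# operational systems use calibrated or locally adaptive thresholds.
log_ratio = np.log(after / before)
change_mask = np.abs(log_ratio) > 3.0 * log_ratio.std()

print("changed pixels flagged:", int(change_mask.sum()))
```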
Ecological Effects of Weather Modification: A Problem Analysis.
ERIC Educational Resources Information Center
Cooper, Charles F.; Jolly, William C.
This publication reviews the potential hazards to the environment of weather modification techniques as they eventually become capable of producing large scale weather pattern modifications. Such weather modifications could result in ecological changes which would generally require several years to be fully evident, including the alteration of…
APPLICATIONS ANALYSIS REPORT: TOXIC TREATMENTS, IN-SITU STEAM/HOT-AIR STRIPPING TECHNOLOGY
This document is an evaluation of the performance of the Toxic Treatments (USA), Inc., (TTUSA) in situ steam/hot-air stripping technology and its applicability as an on-site treatment technique for hazardous waste site soil cleanup of volatile and semivolatile contaminants. Both ...
NASA Astrophysics Data System (ADS)
CHEN, Huali; Tokunaga, Tomochika; Ito, Yuka; Sawamukai, Marie
2014-05-01
Floods, the most common natural disaster in the world, cause serious loss of life and economic damage. Flooding is one of the main hazards in the coastal lowland of the Kujukuri Plain, Chiba Prefecture, Japan. Many natural and human activities have changed the surface environment of the Plain. These include agricultural development, urban and industrial development, changes to the drainage patterns of the land surface, deposition and/or erosion of the river valleys, and so on. In addition, widespread land subsidence has been caused by the abstraction of natural gas dissolved in groundwater. The locations of groundwater extraction include areas near the coast, and this may increase the flood risk. Hence, it is very important to evaluate flood hazard by taking into account the temporal change of land elevation caused by land subsidence, and to develop hazard maps for protecting the surface environment and for land-use planning. Multicriteria decision analysis (MCDA) provides methodology and techniques for analyzing complex decision problems, which often involve incommensurable data or criteria. Geographical Information Systems (GIS) are a powerful complementary tool, since they manage the large amounts of spatial data involved in MCDA. The purpose of this study is to present a flood hazard model using MCDA techniques with GIS support in a region where primary data are scarce. The model incorporates six parameters: river system, topography, land use, flood control projects, flooding passing from the coast, and precipitation. The main data sources used are 10-meter-resolution topography data, airborne laser scanning data, leveling data, Landsat-TM data, two 1:30,000-scale river watershed maps, and precipitation data from observation stations around the study area. The river system map was created by merging the river order, line density, and river sink point density layers. Land-use data were derived from Landsat-TM images. A final hazard map for 2004, as an example, was obtained using an algorithm that combines factors in weighted linear combinations. The assignment of the weight/rank values and their analysis were realized by applying the Analytic Hierarchy Process (AHP) method. This study is preliminary work to investigate the flood hazard of the Kujukuri Plain. Flood hazard maps for other years will be analyzed to investigate the temporal change of the flood-hazard area, and more data will be collected and added to improve the assessment.
Hazards on Hazards, Ensuring Spacecraft Safety While Sampling Asteroid Surface Materials
NASA Astrophysics Data System (ADS)
Johnson, C. A.; DellaGiustina, D. N.
2016-12-01
The near-Earth object Bennu is a carbonaceous asteroid that is a remnant from the earliest stages of solar-system formation. It is also a potentially hazardous asteroid with a relatively high probability of impacting Earth late in the 22nd century. While the primary focus of the NASA-funded OSIRIS-REx mission is the return of pristine organic material from the asteroid's surface, information about Bennu's physical and chemical properties gleaned throughout operations will be critical for a possible future impact mitigation mission. In order to ensure a regolith sample can be successfully acquired, the sample site and surrounding area must be thoroughly assessed for any potential hazards to the spacecraft. The OSIRIS-REx Image Processing Working Group has been tasked with generating global and site-specific hazard maps using mosaics and a trio of feature identification techniques. These techniques include expert-led manual classification, internet-based amateur classification using the citizen science platform CosmoQuest, and automated classification using machine learning and computer vision tools. Because proximity operations around Bennu do not begin until the end of 2018, we have an opportunity to test the performance of our software on analogue surfaces of other asteroids imaged by previous NASA and other space agency missions. The entire pipeline, from image processing and mosaicking to hazard identification, analysis, and mapping, will be exercised on asteroids of varying size, shape, and surface morphology. As a result, upon arrival at Bennu, we will have the software and processes in place to quickly and confidently produce the hazard maps needed to ensure the success of our mission.
Integrated survival analysis using an event-time approach in a Bayesian framework
Walsh, Daniel P.; Dreitz, VJ; Heisey, Dennis M.
2015-01-01
Event-time or continuous-time statistical approaches have been applied throughout the biostatistical literature and have led to numerous scientific advances. However, these techniques have traditionally relied on knowing failure times. This has limited the application of these analyses, particularly within the ecological field, where the fates of marked animals may be unknown. To address these limitations, we developed an integrated approach within a Bayesian framework to estimate hazard rates in the face of unknown fates. We combine failure/survival times from individuals whose fates are known and whose times are interval-censored with information from those whose fates are unknown, and model the process of detecting animals with unknown fates. This provides the foundation for our integrated model and permits the necessary parameter estimation. We provide the Bayesian model and its derivation, and use simulation techniques to investigate the properties and performance of our approach under several scenarios. Lastly, we apply our estimation technique using a piecewise-constant hazard function to investigate the effects of year, age, chick size and sex, sex of the tending adult, and nesting habitat on mortality hazard rates of endangered mountain plover (Charadrius montanus) chicks. Traditional models were inappropriate for this analysis because the fates of some individual chicks were unknown due to failed radio transmitters. Simulations revealed that biases of posterior mean estimates were minimal (≤ 4.95%), and posterior distributions behaved as expected, with the RMSE of the estimates decreasing as sample sizes, detection probability, and survival increased. We determined that mortality hazard rates for plover chicks were highest at <5 days old and were lower for chicks with larger birth weights and/or whose nest was within agricultural habitats. Based on its performance, our approach greatly expands the range of problems for which event-time analyses can be used by eliminating the need for completely known fate data.
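A minimal numerical sketch of the piecewise-constant hazard building block used in such an integrated model is shown below; the age breakpoints, rates, and likelihood pieces are illustrative assumptions, not the fitted plover-chick values.

```python
import numpy as np

# Piecewise-constant hazard: rate lambda_k applies on [cut[k], cut[k+1]).
cuts = np.array([0.0, 5.0, 15.0, np.inf])        # ages in days (illustrative)
rates = np.array([0.08, 0.03, 0.01])             # hazard per day in each piece

def cumulative_hazard(t, cuts=cuts, rates=rates):
    """H(t) = sum_k lambda_k * (time spent in piece k before t)."""
    t = np.atleast_1d(np.asarray(t, dtype=float))
    exposure = np.clip(t[:, None] - cuts[:-1], 0.0, cuts[1:] - cuts[:-1])
    return exposure @ rates

def survival(t):
    return np.exp(-cumulative_hazard(t))

# Likelihood pieces used in an integrated model of this kind:
#   exact death at t:                   lambda(t) * S(t)
#   interval-censored death in (a, b]:  S(a) - S(b)
#   fate unknown, last seen alive at a: S(a), weighted by detection probability
a, b = 3.0, 8.0
print("S(3) =", survival(3.0)[0].round(3))
print("P(death in (3, 8]) =", (survival(a) - survival(b))[0].round(3))
```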
A Model of Risk Analysis in Analytical Methodology for Biopharmaceutical Quality Control.
Andrade, Cleyton Lage; Herrera, Miguel Angel De La O; Lemes, Elezer Monte Blanco
2018-01-01
One key quality control parameter for biopharmaceutical products is the analysis of residual cellular DNA. To determine the small amounts of DNA (around 100 pg) that may be present in a biologically derived drug substance, an analytical method should be sensitive, robust, reliable, and accurate. In principle, three techniques have the ability to measure residual cellular DNA: radioactive dot-blot, a type of hybridization; threshold analysis; and quantitative polymerase chain reaction. Quality risk management is a systematic process for evaluating, controlling, and reporting risks that may affect method capabilities, and it supports a scientific and practical approach to decision making. This paper evaluates, by quality risk management, an alternative approach to assessing the performance risks associated with quality control methods used with biopharmaceuticals, using the tool hazard analysis and critical control points. This tool provides the possibility of finding the steps in an analytical procedure with the highest impact on method performance. By applying these principles to DNA analysis methods, we conclude that the radioactive dot-blot assay has the largest number of critical control points, followed by quantitative polymerase chain reaction and threshold analysis. From the analysis of hazards (i.e., points of method failure) and the associated method procedure critical control points, we conclude that the analytical methodology with the lowest risk of performance failure for residual cellular DNA testing is quantitative polymerase chain reaction. LAY ABSTRACT: In order to mitigate the risk of adverse events from residual cellular DNA that is not completely cleared from downstream production processes, regulatory agencies have required the industry to guarantee a very low level of DNA in biologically derived pharmaceutical products. The technique historically used was radioactive blot hybridization. However, this technique is challenging to implement in a quality control laboratory: it is laborious, time consuming, semi-quantitative, and requires a radioisotope. Along with dot-blot hybridization, two alternative techniques were evaluated: threshold analysis and quantitative polymerase chain reaction. Quality risk management tools were applied to compare the techniques, taking into account the uncertainties, the possibility of circumstances or future events, and their effects upon method performance. By illustrating the application of these tools with DNA methods, we provide an example of how they can be used to support a scientific and practical approach to decision making and to assess and manage method performance risk. This paper discusses, considering the principles of quality risk management, an additional approach to the development and selection of analytical quality control methods using the risk analysis tool hazard analysis and critical control points. This tool provides the possibility of finding the method procedural steps with the highest impact on method reliability (called critical control points). Our model concluded that the radioactive dot-blot assay has the largest number of critical control points, followed by quantitative polymerase chain reaction and threshold analysis. Quantitative polymerase chain reaction is shown to be the better alternative analytical methodology for residual cellular DNA analysis. © PDA, Inc. 2018.
One of the approaches for reducing uncertainties in the assessment of human exposure is to better characterize the hazardous wastes that contaminate our environment. A significant limitation to this approach, however, is that sampling and laboratory analysis of contaminated envi...
A hazards-model analysis of the covariates of infant and child mortality in Sri Lanka.
Trussell, J; Hammerslough, C
1983-02-01
The purpose of this paper is twofold: (a) to provide a complete self-contained exposition of estimating life tables with covariates through the use of hazards models, and (b) to illustrate this technique with a substantive analysis of child mortality in Sri Lanka, thereby demonstrating that World Fertility Survey data are a valuable source for the study of child mortality. We show that life tables with covariates can be easily estimated with standard computer packages designed for analysis of contingency tables. The substantive analysis confirms and supplements an earlier study of infant and child mortality in Sri Lanka by Meegama. Those factors found to be strongly associated with mortality are mother's and father's education, time period of birth, urban/rural/estate residence, ethnicity, sex, birth order, age of the mother at the birth, and type of toilet facility.
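One common way to estimate such life tables with covariates, in the spirit of the contingency-table estimation mentioned above, is a discrete-time (person-period) hazard model fitted by logistic regression; the sketch below uses synthetic data and a single education covariate as assumptions, not the Sri Lankan World Fertility Survey data.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Hypothetical child-level data: age at death (months) or censoring age,
# death indicator, and a single covariate (mother's education in years).
rng = np.random.default_rng(42)
n = 500
educ = rng.integers(0, 13, size=n)
base = 0.03 * np.exp(-0.08 * educ)                      # true monthly hazard
age_at_event = rng.geometric(p=np.clip(base, 1e-4, 1))  # discrete survival time
died = age_at_event <= 60
age = np.minimum(age_at_event, 60)                      # censor at 5 years

# Expand to a person-period (child-month) file: one row per month lived.
rows = []
for i in range(n):
    for m in range(1, age[i] + 1):
        rows.append((m, educ[i], int(died[i] and m == age[i])))
pp = pd.DataFrame(rows, columns=["month", "educ", "death"])

# Discrete-time hazard model: logit of dying in a month given survival so far,
# with a log-age term standing in for the life-table age pattern.
X = sm.add_constant(pd.DataFrame({"log_month": np.log(pp["month"]),
                                  "educ": pp["educ"]}))
fit = sm.GLM(pp["death"], X, family=sm.families.Binomial()).fit()
print(fit.summary().tables[1])
```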
Rapid scanning system for fuel drawers
Caldwell, John T.; Fehlau, Paul E.; France, Stephen W.
1981-01-01
A nondestructive method for uniquely distinguishing among and quantifying the mass of individual fuel plates in situ in fuel drawers utilized in nuclear reactors is described. The method is both rapid and passive, eliminating the personnel hazard of the commonly used irradiation techniques which require that the analysis be performed in proximity to an intense neutron source such as a reactor. In the present technique, only normally decaying nuclei are observed. This allows the analysis to be performed anywhere. This feature, combined with rapid scanning of a given fuel drawer (in approximately 30 s), and the computer data analysis allows the processing of large numbers of fuel drawers efficiently in the event of a loss alert.
Experimental Fuels Facility Re-categorization Based on Facility Segmentation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Reiss, Troy P.; Andrus, Jason
The Experimental Fuels Facility (EFF) (MFC-794) at the Materials and Fuels Complex (MFC) located on the Idaho National Laboratory (INL) Site was originally constructed to provide controlled-access, indoor storage for radiologically contaminated equipment. Use of the facility was expanded to provide a controlled environment for repairing contaminated equipment and characterizing, repackaging, and treating waste. The facility is also used for research and development services, including fuel fabrication. EFF was originally categorized as a LTHC-3 radiological facility based on facility operations and facility radiological inventories. Newly planned program activities identified the need to receive quantities of fissionable materials in excess of the single parameter subcritical limit in ANSI/ANS-8.1, "Nuclear Criticality Safety in Operations with Fissionable Materials Outside Reactors" (identified as "criticality list" quantities in DOE-STD-1027-92, "Hazard Categorization and Accident Analysis Techniques for Compliance with DOE Order 5480.23, Nuclear Safety Analysis Reports," Attachment 1, Table A.1). Since the proposed inventory of fissionable materials inside EFF may be greater than the single parameter subcritical limit of 700 g of U-235 equivalent, the initial re-categorization is Hazard Category (HC) 2 based upon a potential criticality hazard. This paper details the facility hazard categorization performed for the EFF. The categorization was necessary to determine (a) the need for further safety analysis in accordance with LWP-10802, "INL Facility Categorization," and (b) compliance with 10 Code of Federal Regulations (CFR) 830, Subpart B, "Safety Basis Requirements." Based on the segmentation argument presented in this paper, the final hazard categorization for the facility is LTHC-3. Department of Energy Idaho (DOE-ID) approval of the final hazard categorization determined by this hazard assessment document (HAD) was required per the DOE-ID Supplemental Guidance for DOE-STD-1027-92 based on the proposed downgrade of the initial facility categorization of Hazard Category 2.
Flood Hazard Mapping by Applying Fuzzy TOPSIS Method
NASA Astrophysics Data System (ADS)
Han, K. Y.; Lee, J. Y.; Keum, H.; Kim, B. J.; Kim, T. H.
2017-12-01
There are many technical methods for integrating various factors into flood hazard mapping. The purpose of this study is to suggest a methodology for integrated flood hazard mapping using MCDM (Multi Criteria Decision Making). MCDM problems involve a set of alternatives that are evaluated on the basis of conflicting and incommensurate criteria. In this study, to apply MCDM to assessing flood risk, maximum flood depth, maximum velocity, and maximum travel time are considered as criteria, and each applied element is considered as an alternative. A scheme that finds the efficient alternative closest to an ideal value is an appropriate way to assess the flood risk of many element units (alternatives) based on various flood indices. Therefore, TOPSIS, the most commonly used MCDM scheme, is adopted to create the flood hazard map. The indices for flood hazard mapping (maximum flood depth, maximum velocity, and maximum travel time) carry uncertainty in the simulation results, with values varying according to the flood scenario and topographical conditions. This kind of ambiguity in the criteria can cause uncertainty in the flood hazard map. To account for the ambiguity and uncertainty of the criteria, fuzzy logic, which is able to handle ambiguous expressions, is introduced. In this paper, we produced a flood hazard map for levee-breach overflow using the fuzzy TOPSIS technique. We identified the areas with the highest hazard grade on the resulting integrated flood hazard map, and the produced flood hazard map was compared with existing flood risk maps. We also expect that if the flood hazard mapping methodology suggested in this paper is applied to producing current flood risk maps, it will be possible to create new flood hazard maps that consider the priorities of hazard areas and include more varied and important information than before. Keywords: Flood hazard map; levee break analysis; 2D analysis; MCDM; Fuzzy TOPSIS. Acknowledgement: This research was supported by a grant (17AWMP-B079625-04) from the Water Management Research Program funded by the Ministry of Land, Infrastructure and Transport of the Korean government.
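The core TOPSIS ranking step described above can be sketched as follows; the decision matrix, weights, and criterion directions are invented for illustration, and the fuzzy extension would replace the crisp scores with triangular fuzzy numbers before computing distances to the ideal and anti-ideal points.

```python
import numpy as np

# Decision matrix: rows = grid cells (alternatives), columns = criteria
# (max flood depth [m], max velocity [m/s], max travel time [h]).
X = np.array([[2.1, 1.5, 0.5],
              [0.4, 0.3, 3.0],
              [1.2, 0.8, 1.2],
              [3.0, 2.2, 0.2]])
w = np.array([0.5, 0.3, 0.2])                   # criterion weights (sum to 1)
more_is_worse = np.array([True, True, False])   # longer travel time = less hazard

# 1. Vector-normalise and weight.
V = w * X / np.linalg.norm(X, axis=0)

# 2. Ideal (most hazardous) and anti-ideal points, per criterion direction.
ideal = np.where(more_is_worse, V.max(axis=0), V.min(axis=0))
anti  = np.where(more_is_worse, V.min(axis=0), V.max(axis=0))

# 3. Closeness coefficient: 1 = most hazardous cell, 0 = least hazardous.
d_plus  = np.linalg.norm(V - ideal, axis=1)
d_minus = np.linalg.norm(V - anti, axis=1)
closeness = d_minus / (d_plus + d_minus)
print(np.round(closeness, 3))
```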
Federal Register 2010, 2011, 2012, 2013, 2014
2011-05-04
... describes: Standard operating procedures for using hazardous chemicals; hazard-control techniques; equipment...] Occupational Exposure to Hazardous Chemicals in Laboratories Standard; Extension of the Office of Management... collection requirements specified in the Standard on Occupational Exposure to Hazardous Chemicals in...
Food Safety Evaluation Based on Near Infrared Spectroscopy and Imaging: A Review.
Fu, Xiaping; Ying, Yibin
2016-08-17
In recent years, due to the increasing consciousness of food safety and human health, much progress has been made in developing rapid and nondestructive techniques for the evaluation of food hazards, food authentication, and traceability. Near infrared (NIR) spectroscopy and imaging techniques have gained wide acceptance in many fields because of their advantages over other analytical techniques. Following a brief introduction of NIR spectroscopy and imaging basics, this review mainly focuses on recent NIR spectroscopy and imaging applications for food safety evaluation, including (1) chemical hazards detection; (2) microbiological hazards detection; (3) physical hazards detection; (4) new technology-induced food safety concerns; and (5) food traceability. The review shows NIR spectroscopy and imaging to be effective tools that will play indispensable roles for food safety evaluation. In addition, on-line/real-time applications of these techniques promise to be a huge growth field in the near future.
Visual Attention for Solving Multiple-Choice Science Problem: An Eye-Tracking Analysis
ERIC Educational Resources Information Center
Tsai, Meng-Jung; Hou, Huei-Tse; Lai, Meng-Lung; Liu, Wan-Yi; Yang, Fang-Ying
2012-01-01
This study employed an eye-tracking technique to examine students' visual attention when solving a multiple-choice science problem. Six university students participated in a problem-solving task to predict occurrences of landslide hazards from four images representing four combinations of four factors. Participants' responses and visual attention…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bowyer, Theodore W.; Gesh, Christopher J.; Haas, Daniel A.
This report details efforts to develop a technique which is able to detect and quantify the mass of 240Pu in waste storage tanks and other enclosed spaces. If the isotopic ratios of the plutonium contained in the enclosed space are also known, then this technique is capable of estimating the total mass of the plutonium without physical sample retrieval and radiochemical analysis of hazardous material. Results utilizing this technique are reported for a Hanford Site waste tank (TX-118) and a well-characterized plutonium sample in a laboratory environment.
Safety considerations in the design and operation of large wind turbines
NASA Technical Reports Server (NTRS)
Reilly, D. H.
1979-01-01
The engineering and safety techniques used to assure the reliable and safe operation of large wind turbine generators are described, using the Mod 2 Wind Turbine System Program as an example. The techniques involve a careful definition of the wind turbine's natural and operating environments, use of proven structural design criteria and analysis techniques, an evaluation of potential failure modes and hazards, and use of a fail-safe and redundant component engineering philosophy. The role of an effective quality assurance program, tailored to specific hardware criticality, and the checkout and validation program developed to assure system integrity are also described.
Revealing the underlying drivers of disaster risk: a global analysis
NASA Astrophysics Data System (ADS)
Peduzzi, Pascal
2017-04-01
Disaster events are perfect examples of compound events. Disaster risk lies at the intersection of several independent components such as hazard, exposure and vulnerability. Understanding the weight of each component requires extensive standardisation. Here, I show how footprints of past disastrous events were generated using GIS modelling techniques and used for extracting population and economic exposures based on distribution models. Using past event losses, it was possible to identify and quantify a wide range of socio-politico-economic drivers associated with human vulnerability. The analysis was applied to about nine thousand individual past disastrous events covering earthquakes, floods and tropical cyclones. Using a multiple regression analysis on these individual events it was possible to quantify each risk component and assess how vulnerability is influenced by various hazard intensities. The results show that hazard intensity, exposure, poverty, governance as well as other underlying factors (e.g. remoteness) can explain the magnitude of past disasters. Analysis was also performed to highlight the role of future trends in population and climate change and how these may impact exposure to tropical cyclones in the future. GIS models combined with statistical multiple regression analysis provided a powerful methodology to identify, quantify and model disaster risk taking into account its various components. The same methodology can be applied to various types of risk at local to global scales. This method was applied and developed for the Global Risk Analysis of the Global Assessment Report on Disaster Risk Reduction (GAR). It was first applied to mortality risk in GAR 2009 and GAR 2011. New models ranging from global assets exposure to global flood hazard models were also recently developed to improve the resolution of the risk analysis and were applied through the CAPRA software to provide probabilistic economic risk assessments such as Average Annual Losses (AAL) and Probable Maximum Losses (PML) in GAR 2013 and GAR 2015. In parallel, similar methodologies were developed to highlight the role of ecosystems for Climate Change Adaptation (CCA) and Disaster Risk Reduction (DRR). New developments may include slow hazards (such as soil degradation and droughts) and natech hazards (by intersecting with georeferenced critical infrastructures). The various global hazard, exposure and risk models can be visualized and downloaded through the PREVIEW Global Risk Data Platform.
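A hedged sketch of the kind of multiplicative (log-log) multiple regression used to separate hazard intensity, exposure, and vulnerability drivers might look like the following; the variables, coefficients, and synthetic data are illustrative assumptions, not the GAR model.

```python
import numpy as np
import statsmodels.api as sm

# Synthetic stand-ins for per-event records: hazard intensity, exposed
# population, GDP per capita (poverty proxy) and a governance index.
rng = np.random.default_rng(7)
n = 2000
intensity = rng.lognormal(0.0, 0.5, n)      # e.g. normalised wind speed
exposure = rng.lognormal(10.0, 1.0, n)      # people in the event footprint
gdp_pc = rng.lognormal(8.0, 0.8, n)
governance = rng.normal(0.0, 1.0, n)

# Multiplicative risk model: losses ~ intensity^a * exposure^b * gdp^c * ...
log_killed = (2.0 * np.log(intensity) + 0.9 * np.log(exposure)
              - 0.6 * np.log(gdp_pc) - 0.3 * governance
              + rng.normal(0.0, 1.0, n))

X = sm.add_constant(np.column_stack([np.log(intensity), np.log(exposure),
                                     np.log(gdp_pc), governance]))
fit = sm.OLS(log_killed, X).fit()
print(fit.params.round(2))   # recovers the elasticities of each risk driver
```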
29 CFR 1910.307 - Hazardous (classified) locations.
Code of Federal Regulations, 2013 CFR
2013-07-01
...; conductor insulation, flexible cords, sealing and drainage, transformers, capacitors, switches, circuit... following are acceptable protection techniques for electric and electronic equipment in hazardous...) Nonincendive circuit. This protection technique is permitted for equipment in Class I, Division 2; Class II...
29 CFR 1910.307 - Hazardous (classified) locations.
Code of Federal Regulations, 2012 CFR
2012-07-01
...; conductor insulation, flexible cords, sealing and drainage, transformers, capacitors, switches, circuit... following are acceptable protection techniques for electric and electronic equipment in hazardous...) Nonincendive circuit. This protection technique is permitted for equipment in Class I, Division 2; Class II...
29 CFR 1910.307 - Hazardous (classified) locations.
Code of Federal Regulations, 2014 CFR
2014-07-01
...; conductor insulation, flexible cords, sealing and drainage, transformers, capacitors, switches, circuit... following are acceptable protection techniques for electric and electronic equipment in hazardous...) Nonincendive circuit. This protection technique is permitted for equipment in Class I, Division 2; Class II...
Aga, Cathrine; Kartus, Jüri-Tomas; Lind, Martin; Lygre, Stein Håkon Låstad; Granan, Lars-Petter; Engebretsen, Lars
2017-10-01
Double-bundle anterior cruciate ligament (ACL) reconstruction has demonstrated improved biomechanical properties and moderately better objective outcomes compared with single-bundle reconstructions. This could make an impact on the rerupture rate and reduce the risk of revisions in patients undergoing double-bundle ACL reconstruction compared with patients reconstructed with a traditional single-bundle technique. The National Knee Ligament Registers in Scandinavia provide information that can be used to evaluate the revision outcome after ACL reconstructions. The purposes of the study were (1) to compare the risk of revision between double-bundle and single-bundle reconstructions, reconstructed with autologous hamstring tendon grafts; (2) to compare the risk of revision between double-bundle hamstring tendon and single-bundle bone-patellar tendon-bone autografts; and (3) to compare the hazard ratios for the same two research questions after Cox regression analysis was performed. Data collection of primary ACL reconstructions from the National Knee Ligament Registers in Denmark, Norway, and Sweden from July 1, 2005, to December 31, 2014, was retrospectively analyzed. A total of 60,775 patients were included in the study; 994 patients were reconstructed with double-bundle hamstring tendon grafts, 51,991 with single-bundle hamstring tendon grafts, and 7790 with single-bundle bone-patellar tendon-bone grafts. The double-bundle ACL-reconstructed patients were compared with the two other groups. The risk of revision for each research question was detected by the risk ratio, hazard ratio, and the corresponding 95% confidence intervals. Kaplan-Meier analysis was used to estimate survival at 1, 2, and 5 years for the three different groups. Furthermore, a Cox proportional hazard regression model was applied and the hazard ratios were adjusted for country, age, sex, meniscal or chondral injury, and utilized fixation devices on the femoral and tibial sides. There were no differences in the crude risk of revision between the patients undergoing the double-bundle technique and the two other groups. A total of 3.7% patients were revised in the double-bundle group (37 of 994 patients) versus 3.8% in the single-bundle hamstring tendon group (1952 of 51,991; risk ratio, 1.01; 95% confidence interval (CI), 0.73-1.39; p = 0.96), and 2.8% of the patients were revised in the bone-patellar tendon-bone group (219 of the 7790 bone-patellar tendon-bone patients; risk ratio, 0.76; 95% CI, 0.54-1.06; p = 0.11). Cox regression analysis with adjustment for country, age, sex, menisci or cartilage injury, and utilized fixation device on the femoral and tibial sides, did not reveal any further difference in the risk of revision between the single-bundle hamstring tendon and double-bundle hamstring tendon groups (hazard ratio, 1.18; 95% CI, 0.85-1.62; p = 0.33), but the adjusted hazard ratio showed a lower risk of revision in the single-bundle bone-patellar tendon-bone group compared with the double-bundle group (hazard ratio, 0.62; 95% CI, 0.43-0.90; p = 0.01). Comparisons of the graft revision rates reported separately for each country revealed that double-bundle hamstring tendon reconstructions in Sweden had a lower hazard ratio compared with the single-bundle hamstring tendon reconstructions (hazard ratio, 1.00 versus 1.89; 95% CI, 1.09-3.29; p = 0.02). 
Survival at 5 years after index surgery was 96.0% for the double-bundle group, 95.4% for the single-bundle hamstring tendon group, and 97.0% for the single-bundle bone-patellar tendon-bone group. Based on the data from all three national registers, the risk of revision was not influenced by the reconstruction technique in terms of using single- or double-bundle hamstring tendons, although national differences in survival existed. Using bone-patellar tendon-bone grafts lowered the risk of revision compared with double-bundle hamstring tendon grafts. These findings should be considered when deciding what reconstruction technique to use in ACL-deficient knees. Future studies identifying the reasons for graft rerupture in single- and double-bundle reconstructions would be of interest to understand the findings of the present study. Level III, therapeutic study.
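A minimal sketch of the Cox proportional hazards adjustment described in the study, written with the lifelines Python package on synthetic stand-in data, is shown below; the column names and covariates are hypothetical and the register data are not reproduced.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

# Hypothetical register extract: time to revision (or censoring) in years,
# revision indicator, graft technique and adjustment covariates.
rng = np.random.default_rng(3)
n = 5000
df = pd.DataFrame({
    "years":           rng.exponential(4.0, n).clip(max=10.0),
    "revised":         rng.binomial(1, 0.035, n),
    "double_bundle":   rng.binomial(1, 0.02, n),
    "age":             rng.normal(27, 9, n),
    "female":          rng.binomial(1, 0.45, n),
    "meniscus_injury": rng.binomial(1, 0.4, n),
})

# Fit the proportional hazards model; hazard ratios are exp(coef).
cph = CoxPHFitter()
cph.fit(df, duration_col="years", event_col="revised")
print(cph.summary[["exp(coef)", "exp(coef) lower 95%", "exp(coef) upper 95%"]])
```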
NASA Astrophysics Data System (ADS)
Salvini, Riccardo; Mastrorocco, Giovanni; Esposito, Giuseppe; Di Bartolo, Silvia; Coggan, John; Vanneschi, Claudio
2018-01-01
The use of remote sensing techniques is now common practice in different working environments, including engineering geology. Moreover, in recent years the development of structure from motion (SfM) methods, together with rapid technological improvement, has allowed the widespread use of cost-effective remotely piloted aircraft systems (RPAS) for acquiring detailed and accurate geometrical information even in evolving environments, such as mining contexts. Indeed, the acquisition of remotely sensed data from hazardous areas provides accurate 3-D models and high-resolution orthophotos while minimizing the risk for operators. The quality and quantity of the data obtainable from RPAS surveys can then be used for inspection of mining areas, audit of mining design, rock mass characterizations, stability analysis investigations and monitoring activities. Despite the widespread use of RPAS, its potential and limitations still have to be fully understood. In this paper, a case study is shown in which an RPAS was used for the engineering geological investigation of a closed marble mine area in Italy; direct ground-based techniques could not be applied for safety reasons. In view of the re-activation of mining operations, high-resolution images taken from different positions and heights were acquired and processed using SfM techniques to obtain an accurate and detailed 3-D model of the area. The geometrical and radiometrical information was subsequently used for a deterministic rock mass characterization, which led to the identification of two large marble blocks that pose a potential significant hazard issue for the future workforce. A preliminary stability analysis, with a focus on investigating the contribution of potential rock bridges, was then performed in order to demonstrate the potential use of RPAS information in engineering geological contexts for geohazard identification, awareness and reduction.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
The purpose of this report is to summarize the activities of the Analytical Chemistry Laboratory (ACL) at Argonne National Laboratory (ANL) for Fiscal Year 1990. The ACL has four technical groups -- Chemical Analysis, Instrumental Analysis, Organic Analysis, and Environmental Analysis. The Chemical Analysis Group uses wet-chemical and instrumental methods for elemental, compositional, and isotopic analyses of solid, liquid, and gaseous samples and provides specialized analytical services. The Instrumental Analysis Group uses nuclear counting techniques in radiochemical analyses over a wide range of sample types, from low-level environmental samples to samples of high radioactivity. The Organic Analysis Group uses a number of complementary techniques to separate and to quantitatively and qualitatively analyze complex organic mixtures and compounds at the trace level, including synthetic fuels, toxic substances, fossil-fuel residues and emissions, pollutants, biologically active compounds, pesticides, and drugs. The Environmental Analysis Group performs analyses of inorganic environmental and hazardous waste and coal samples.
Risk Presentation Using the Three Dimensions of Likelihood, Severity, and Level of Control
NASA Technical Reports Server (NTRS)
Watson, Clifford
2010-01-01
Traditional hazard analysis techniques utilize a two-dimensional representation of the results determined by relative likelihood and severity of the residual risk. These matrices present a quick-look at the Likelihood (Y-axis) and Severity (X-axis) of the probable outcome of a hazardous event. A three-dimensional method, described herein, utilizes the traditional X and Y axes, while adding a new, third dimension, shown as the Z-axis, and referred to as the Level of Control. The elements of the Z-axis are modifications of the Hazard Elimination and Control steps (also known as the Hazard Reduction Precedence Sequence). These steps are: 1. Eliminate risk through design. 2. Substitute less risky materials for more hazardous materials. 3. Install safety devices. 4. Install caution and warning devices. 5. Develop administrative controls (to include special procedures and training). 6. Provide protective clothing and equipment. When added to the two-dimensional models, the level of control adds a visual representation of the risk associated with the hazardous condition, creating a tall-pole for the least-well-controlled failure while establishing the relative likelihood and severity of all causes and effects for an identified hazard. Computer modeling of the analytical results, using spreadsheets and three-dimensional charting, gives a visual confirmation of the relationship between causes and their controls.
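A small sketch of how the three axes could be combined computationally is given below; the causes, scores, and the simple additive composite are illustrative assumptions, since the method as described is a visual three-dimensional presentation rather than a prescribed formula.

```python
# Hazard causes scored on the three axes described above:
# likelihood (1-5), severity (1-5) and level of control (1-6, where
# 1 = eliminated by design and 6 = protective equipment only).
causes = {
    "valve fails open":         (2, 5, 3),   # safety device installed
    "operator skips checklist": (3, 4, 5),   # administrative control only
    "seal material degrades":   (1, 3, 1),   # designed out
}

def risk_index(likelihood, severity, control, weights=(1.0, 1.0, 1.0)):
    """Illustrative composite score; the least-well-controlled, most likely
    and most severe cause produces the tallest 'pole' in a 3-D plot."""
    wl, ws, wc = weights
    return wl * likelihood + ws * severity + wc * control

ranked = sorted(causes.items(), key=lambda kv: risk_index(*kv[1]), reverse=True)
for name, (l, s, c) in ranked:
    print(f"{name:26s} L={l} S={s} C={c}  score={risk_index(l, s, c):.1f}")
```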
A comparative critical study between FMEA and FTA risk analysis methods
NASA Astrophysics Data System (ADS)
Cristea, G.; Constantinescu, DM
2017-10-01
An overwhelming number of different risk analysis techniques are in use today, with acronyms such as: FMEA (Failure Modes and Effects Analysis) and its extension FMECA (Failure Mode, Effects, and Criticality Analysis), DRBFM (Design Review by Failure Mode), FTA (Fault Tree Analysis) and its extension ETA (Event Tree Analysis), HAZOP (Hazard & Operability Studies), HACCP (Hazard Analysis and Critical Control Points), and What-if/Checklist. However, the most used analysis techniques in the mechanical and electrical industries are FMEA and FTA. In FMEA, which is an inductive method, information about the consequences and effects of failures is usually collected through interviews with experienced people with different knowledge, i.e., cross-functional groups. FMEA is used to capture potential failures/risks and their impacts and to prioritize them on a numeric scale called the Risk Priority Number (RPN), which ranges from 1 to 1000. FTA is a deductive method, i.e., a general system state is decomposed into chains of more basic events of components. The logical interrelationship of how such basic events depend on and affect each other is often described analytically in a reliability structure which can be visualized as a tree. Both methods are very time-consuming to apply thoroughly, which is why this is often not done. As a consequence, possible failure modes may not be identified. To address these shortcomings, it is proposed to use a combination of FTA and FMEA.
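The RPN prioritization step of FMEA mentioned above reduces to a simple product of three ordinal scores, as in this minimal sketch; the failure modes and ratings are invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class FailureMode:
    name: str
    severity: int     # 1-10
    occurrence: int   # 1-10
    detection: int    # 1-10 (10 = hardest to detect)

    @property
    def rpn(self) -> int:
        # Risk Priority Number as used in FMEA: ranges from 1 to 1000.
        return self.severity * self.occurrence * self.detection

modes = [
    FailureMode("relay contact welds shut",  severity=9, occurrence=3, detection=4),
    FailureMode("sensor drift out of range", severity=6, occurrence=5, detection=7),
    FailureMode("connector corrosion",       severity=4, occurrence=6, detection=3),
]

for fm in sorted(modes, key=lambda m: m.rpn, reverse=True):
    print(f"RPN {fm.rpn:4d}  {fm.name}")
```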
NASA Astrophysics Data System (ADS)
Lukman, Iing; Ibrahim, Noor A.; Daud, Isa B.; Maarof, Fauziah; Hassan, Mohd N.
2002-03-01
Survival analysis algorithms are often applied in the data mining process. Cox regression is one of the survival analysis tools that has been used in many areas, and it can be used to analyze the failure times of aircraft crashes. Another survival analysis tool is competing risks analysis, where more than one cause of failure acts simultaneously. Lunn and McNeil analyzed competing risks in the survival model using Cox regression with censored data. The modified Lunn-McNeil technique is a simplification of the Lunn-McNeil technique. The Kalbfleisch-Prentice technique involves fitting models separately for each type of failure, treating other failure types as censored. To compare the two techniques (the modified Lunn-McNeil and the Kalbfleisch-Prentice), a simulation study was performed. Samples with various sizes and censoring percentages were generated and fitted using both techniques. The study was conducted by comparing the inference of the models using the Root Mean Square Error (RMSE), power tests, and Schoenfeld residual analysis. The power tests used in this study were the likelihood ratio test, the Rao score test, and the Wald statistic. The Schoenfeld residual analysis was conducted to check the proportionality of the model through its covariates. The estimated parameters were computed for the cause-specific hazard situation. Results showed that the modified Lunn-McNeil technique was better than the Kalbfleisch-Prentice technique based on the RMSE measurement and the Schoenfeld residual analysis. However, the Kalbfleisch-Prentice technique was better than the modified Lunn-McNeil technique based on the power test measurements.
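The Kalbfleisch-Prentice approach described above, fitting a separate model for each failure cause while censoring the other causes, can be sketched as follows using the lifelines package on simulated competing-risks data; the data-generating values are assumptions for illustration.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

# Synthetic competing-risks data: two failure causes (1 and 2), 0 = censored.
rng = np.random.default_rng(11)
n = 1000
x = rng.normal(size=n)
t1 = rng.exponential(1.0 / (0.02 * np.exp(0.5 * x)))    # cause 1 latent time
t2 = rng.exponential(1.0 / 0.015, size=n)                # cause 2 latent time
c = rng.uniform(10, 60, size=n)                          # censoring time
time = np.minimum.reduce([t1, t2, c])
cause = np.select([time == t1, time == t2], [1, 2], default=0)
df = pd.DataFrame({"time": time, "cause": cause, "x": x})

# Kalbfleisch-Prentice approach: fit one Cox model per cause, treating
# failures from the other cause as censored observations.
for k in (1, 2):
    dk = df.assign(event=(df["cause"] == k).astype(int))[["time", "event", "x"]]
    fit = CoxPHFitter().fit(dk, duration_col="time", event_col="event")
    print(f"cause {k}: hazard ratio for x =",
          round(float(fit.summary.loc["x", "exp(coef)"]), 2))
```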
Fundamentals of Digital Engineering: Designing for Reliability
NASA Technical Reports Server (NTRS)
Katz, R.; Day, John H. (Technical Monitor)
2001-01-01
The concept of designing for reliability will be introduced, along with a brief overview of reliability, redundancy, and traditional methods of fault tolerance as applied to current logic devices. The fundamentals of advanced circuit design and analysis techniques will be the primary focus. The introduction will cover the definitions of key device parameters and how analysis is used to prove circuit correctness. Basic design techniques such as synchronous vs. asynchronous design, metastable state resolution time/arbiter design, and finite state machine structure/implementation will be reviewed. Advanced topics will be explored, such as skew-tolerant circuit design, the use of triple-modular redundancy, circuit hazards, device transients and preventative circuit design, lock-up states in finite state machines generated by logic synthesizers, device transient characteristics, radiation mitigation techniques, worst-case analysis, the use of timing analyzers and simulators, and others. Case studies and lessons learned from spaceflight designs will be given as examples.
NASA Astrophysics Data System (ADS)
mouloud, Hamidatou
2016-04-01
The objective of this paper is to analyze the seismic activity and perform a statistical treatment of the seismicity catalog of the Constantine region between 1357 and 2014, comprising 7007 seismic events. Our research is a contribution to improving seismic risk management by evaluating the seismic hazard in north-east Algeria. In the present study, earthquake hazard maps for the Constantine region are calculated. Probabilistic seismic hazard analysis (PSHA) is classically performed through the Cornell approach by using a uniform earthquake distribution over the source area and a given magnitude range. This study aims at extending the PSHA approach to the case of a characteristic earthquake scenario associated with an active fault. The approach integrates PSHA with a high-frequency deterministic technique for the prediction of peak and spectral ground motion parameters in a characteristic earthquake. The method is based on the site-dependent evaluation of the probability of exceedance for the chosen strong-motion parameter. We proposed five seismotectonic zones. Five steps are necessary: (i) identification of potential sources of future earthquakes, (ii) assessment of their geological, geophysical and geometric characteristics, (iii) identification of the attenuation pattern of seismic motion, (iv) calculation of the hazard at a site, and finally (v) hazard mapping for a region. In this study, the procedure for earthquake hazard evaluation developed by Kijko and Sellevoll (1992) is used to estimate seismic hazard parameters in the northern part of Algeria.
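A minimal sketch of the classical Cornell-type PSHA integration mentioned above is given below; the source geometry, Gutenberg-Richter parameters, and ground-motion relation are invented illustrative numbers, not the values used for the Constantine region.

```python
import numpy as np
from scipy import stats

# Minimal Cornell-type PSHA for one area source (all numbers illustrative).
nu = 0.2                      # annual rate of events with M >= m_min
m_min, m_max, b = 4.5, 7.0, 1.0
beta = b * np.log(10.0)

# Truncated Gutenberg-Richter magnitude pdf on a discrete grid.
m = np.linspace(m_min, m_max, 60)
fm = beta * np.exp(-beta * (m - m_min)) / (1 - np.exp(-beta * (m_max - m_min)))
pm = fm / fm.sum()

# Source-to-site distances assumed uniform between 5 and 60 km.
r = np.linspace(5, 60, 60)
pr = np.full_like(r, 1.0 / r.size)

# Toy ground-motion model: ln PGA(g) = -3.5 + 0.9*M - 1.1*ln(R), sigma = 0.6.
def ln_pga(mag, dist):
    return -3.5 + 0.9 * mag - 1.1 * np.log(dist)

pga_levels = np.logspace(-2, 0, 30)           # 0.01 g to 1 g
MM, RR = np.meshgrid(m, r, indexing="ij")
annual_rate = np.empty_like(pga_levels)
for i, a in enumerate(pga_levels):
    # P(PGA > a | m, r) from the lognormal ground-motion variability.
    p_exceed = 1.0 - stats.norm.cdf((np.log(a) - ln_pga(MM, RR)) / 0.6)
    annual_rate[i] = nu * np.sum(p_exceed * pm[:, None] * pr[None, :])

# Probability of exceedance in 50 years (Poisson assumption).
p50 = 1.0 - np.exp(-annual_rate * 50)
print(np.column_stack([pga_levels, annual_rate, p50])[::6].round(4))
```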
40 CFR 63.1210 - What are the notification requirements?
Code of Federal Regulations, 2012 CFR
2012-07-01
...; (E) Waste minimization and emission control technique(s) effectiveness; (F) A description of the... National Emission Standards for Hazardous Air Pollutants from Hazardous Waste Combustors Notification...)(v) Notification of excessive particulate matter detection system exceedances. 63.1207(e), 63.9(e) 63...
Hazards and benefits of in-vivo Raman spectroscopy of human skin
NASA Astrophysics Data System (ADS)
Carter, Elizabeth A.; Williams, Adrian C.; Barry, Brian W.; Edwards, Howell G.
1999-04-01
The resurgence of Raman spectroscopy in the late 1980s has led to an increase in the use of the technique for the analysis of biological tissues. Consequently, Raman spectroscopy is now regarded as a well-established, non-invasive, non-destructive technique that is used to obtain good quality spectra from biological tissues with minimal fluorescence. What is presently of interest to our group is to further develop and establish the technique for in vivo investigations of healthy and diseased skin. This presentation discusses some potentially valuable clinical applications of the technique, and also highlights some of the experimental difficulties that were encountered when examining patients who were receiving treatment for psoriasis.
Gotham, Kevin Fox; Campanella, Richard; Lauve-Moon, Katie; Powers, Bradford
2018-02-01
This article investigates the determinants of flood risk perceptions in New Orleans, Louisiana (United States), a deltaic coastal city highly vulnerable to seasonal nuisance flooding and hurricane-induced deluges and storm surges. Few studies have investigated the influence of hazard experience, geophysical vulnerability (hazard proximity), and risk perceptions in cities undergoing postdisaster recovery and rebuilding. We use ordinal logistic regression techniques to analyze experiential, geophysical, and sociodemographic variables derived from a survey of 384 residents in seven neighborhoods. We find that residents living in neighborhoods that flooded during Hurricane Katrina exhibit higher levels of perceived risk than those residents living in neighborhoods that did not flood. In addition, findings suggest that flood risk perception is positively associated with female gender, lower income, and direct flood experiences. In conclusion, we discuss the implications of these findings for theoretical and empirical research on environmental risk, flood risk communication strategies, and flood hazards planning. © 2017 Society for Risk Analysis.
Bates, Matthew E; Keisler, Jeffrey M; Zussblatt, Niels P; Plourde, Kenton J; Wender, Ben A; Linkov, Igor
2016-02-01
Risk research for nanomaterials is currently prioritized by means of expert workshops and other deliberative processes. However, analytical techniques that quantify and compare alternative research investments are increasingly recommended. Here, we apply value of information and portfolio decision analysis, methods commonly applied in financial and operations management, to prioritize risk research for multiwalled carbon nanotubes and nanoparticulate silver and titanium dioxide. We modify the widely accepted CB Nanotool hazard evaluation framework, which combines nano- and bulk-material properties into a hazard score, to operate probabilistically with uncertain inputs. The literature is reviewed to develop uncertain estimates for each input parameter, and a Monte Carlo simulation is applied to assess how different research strategies can improve hazard classification. The relative cost of each research experiment is elicited from experts, which enables identification of efficient research portfolios: combinations of experiments that lead to the greatest improvement in hazard classification at the lowest cost. Nanoparticle shape, diameter, solubility, and surface reactivity were most frequently identified within efficient portfolios in our results.
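A hedged sketch of the Monte Carlo hazard-banding idea, propagating uncertain sub-scores through a simplified additive hazard score and asking how one hypothetical measurement would sharpen the classification, is shown below; the score structure and band cut-offs are illustrative and do not reproduce the actual CB Nanotool scoring.

```python
import numpy as np

rng = np.random.default_rng(5)
N = 100_000

# Uncertain inputs to a simplified hazard score (0-4 sub-scores, illustrative,
# loosely following the factor structure of a CB Nanotool-style evaluation).
draws = {
    "surface_reactivity": rng.integers(0, 5, N),          # no data: wide prior
    "shape":              rng.choice([2, 3], N),           # mostly fibrous
    "diameter":           rng.choice([3, 4], N, p=[0.3, 0.7]),
    "solubility":         rng.integers(1, 5, N),
}
score = sum(draws.values())            # total hazard score per Monte Carlo draw
band = np.digitize(score, bins=[6, 10, 14])   # four illustrative control bands

print("band probabilities:", np.bincount(band, minlength=4) / N)

# Value of a candidate experiment: if solubility were measured exactly
# (say it turned out to be 2), how much does classification uncertainty drop?
score_after = score - draws["solubility"] + 2
band_after = np.digitize(score_after, bins=[6, 10, 14])
print("after solubility study:", np.bincount(band_after, minlength=4) / N)
```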
NASA Technical Reports Server (NTRS)
Deckert, George
2010-01-01
This viewgraph presentation reviews The NASA Hazard Analysis process. The contents include: 1) Significant Incidents and Close Calls in Human Spaceflight; 2) Subsystem Safety Engineering Through the Project Life Cycle; 3) The Risk Informed Design Process; 4) Types of NASA Hazard Analysis; 5) Preliminary Hazard Analysis (PHA); 6) Hazard Analysis Process; 7) Identify Hazardous Conditions; 8) Consider All Interfaces; 9) Work a Preliminary Hazard List; 10) NASA Generic Hazards List; and 11) Final Thoughts
NASA Astrophysics Data System (ADS)
Joyce, Malcolm J.; Aspinall, Michael D.; Cave, Francis D.; Lavietes, Anthony D.
2012-08-01
Pulse-shape discrimination (PSD) in fast, organic scintillation detectors is a long-established technique used to separate neutrons and γ rays in mixed radiation fields. In the analogue domain the method can achieve separation in real time, but all knowledge of the pulses themselves is lost thereby preventing the possibility of any post- or repeated analysis. Also, it is typically reliant on electronic systems that are largely obsolete and which require significant experience to set up. In the digital domain, PSD is often more flexible but significant post-processing has usually been necessary to obtain neutron/γ-ray separation. Moreover, the scintillation media on which the technique relies usually have a low flashpoint and are thus deemed hazardous. This complicates the ease with which they are used in industrial applications. In this paper, results obtained with a new portable digital pulse-shape discrimination instrument are described. This instrument provides real-time, digital neutron/γ-ray separation whilst preserving the synchronization with the time-of-arrival for each event, and realizing throughputs of 3 × 10⁶ events per second. Furthermore, this system has been tested with a scintillation medium that is non-flammable and not hazardous.
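A minimal sketch of the charge-comparison (tail-to-total) figure of merit that digital PSD systems of this kind typically compute is shown below; the pulse shapes, window lengths, and decay constants are synthetic assumptions, not the instrument's actual algorithm.

```python
import numpy as np

def tail_to_total(pulse, baseline_samples=20, tail_start=25):
    """Charge-comparison PSD figure for one digitised scintillation pulse:
    neutrons deposit a larger fraction of light in the slow (tail)
    component than gamma rays do."""
    p = pulse - pulse[:baseline_samples].mean()        # baseline subtraction
    peak = int(np.argmax(p))
    total = p[peak - 5 : peak + 200].sum()
    tail = p[peak + tail_start : peak + 200].sum()
    return tail / total if total > 0 else np.nan

# Toy pulses built from fast + slow exponentials (times in samples).
t = np.arange(300)
def make_pulse(slow_fraction, rng):
    shape = (1 - slow_fraction) * np.exp(-t / 7.0) + slow_fraction * np.exp(-t / 60.0)
    pulse = np.concatenate([np.zeros(30), 100 * shape])[:300]
    return pulse + rng.normal(0, 0.3, 300)

rng = np.random.default_rng(2)
gammas   = [tail_to_total(make_pulse(0.05, rng)) for _ in range(5)]
neutrons = [tail_to_total(make_pulse(0.25, rng)) for _ in range(5)]
print("gamma-like:", np.round(gammas, 3), "neutron-like:", np.round(neutrons, 3))
```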
Techniques for hazard analysis and their use at CERN.
Nuttall, C; Schönbacher, H
2001-01-01
CERN, the European Organisation for Nuclear Research, is situated near Geneva and has its accelerators and experimental facilities astride the Swiss and French frontiers, attracting physicists from all over the world to this unique laboratory. The main accelerator is situated in a 27 km underground ring and the experiments take place in huge underground caverns in order to detect the fragments resulting from the collision of subatomic particles at speeds approaching that of light. These detectors contain many hundreds of tons of flammable materials, mainly plastics in cables and structural components, flammable gases in the detectors themselves, and cryogenic fluids such as helium and argon. The experiments consume high amounts of electrical power, thus the dangers involved have necessitated the use of analytical techniques to identify the hazards and quantify the risks to personnel and the infrastructure. The techniques described in the paper have been developed in the process industries, where they have proved to be of great value. They have been successfully applied to CERN industrial and experimental installations and, in some cases, have been instrumental in changing the philosophy of the experimentalists and their detectors.
Nowcast model for hazardous material spill prevention and response, San Francisco Bay, California
Cheng, Ralph T.; Wilmot, Wayne L.; Galt, Jerry A.
1997-01-01
The National Oceanic and Atmospheric Administration (NOAA) installed the Physical Oceanographic Real-time System (PORTS) in San Francisco Bay, California, to provide real-time observations of tides, tidal currents, and meteorological conditions to, among other purposes, guide hazardous material spill prevention and response. The emerging technologies used in PORTS for real-time data collection, integrated with nowcast modeling techniques and with dissemination of the real-time data and nowcast results through the Internet on the World Wide Web, form a nowcast modeling system. Users can download tide and tidal current distributions in San Francisco Bay for their specific applications and/or for further analysis.
Detecting Spatial Patterns of Natural Hazards from the Wikipedia Knowledge Base
NASA Astrophysics Data System (ADS)
Fan, J.; Stewart, K.
2015-07-01
The Wikipedia database is a data source of immense richness and variety. Included in this database are thousands of geotagged articles, including, for example, almost real-time updates on current and historic natural hazards. This includes user-contributed information about the location of natural hazards, the extent of the disasters, and many details relating to response, impact, and recovery. In this research, a computational framework is proposed to detect spatial patterns of natural hazards from the Wikipedia database by combining topic modeling methods with spatial analysis techniques. The computation is performed on the Neon Cluster, a high-performance computing cluster at the University of Iowa. This work uses wildfires as the exemplar hazard, but the framework is easily generalizable to other types of hazards, such as hurricanes or flooding. Latent Dirichlet Allocation (LDA) modeling is first employed to train on the entire English Wikipedia dump, transforming the database dump into a 500-dimension topic model. Over 230,000 geotagged articles are then extracted from the Wikipedia database, spatially covering the contiguous United States. The geotagged articles are converted into the LDA topic space based on the topic model, with each article being represented as a weighted multidimensional topic vector. By treating each article's topic vector as an observed point in geographic space, a probability surface is calculated for each of the topics. In this work, Wikipedia articles about wildfires are extracted from the Wikipedia database, forming a wildfire corpus and creating a basis for the topic vector analysis. The spatial distribution of wildfire outbreaks in the US is estimated by calculating the weighted sum of the topic probability surfaces using a map algebra approach, and mapped using GIS. To provide an evaluation of the approach, the estimate is compared to wildfire hazard potential maps created by the USDA Forest Service.
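The pipeline of topic modeling, hazard-weighted article vectors, and a weighted probability surface can be sketched on a toy corpus as follows; the documents, coordinates, topic count, and the use of a weighted kernel density in place of the paper's map-algebra surface summation are all illustrative assumptions.

```python
import numpy as np
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation
from scipy.stats import gaussian_kde

# Tiny stand-in corpus of geotagged article texts with (lon, lat) coordinates.
docs = [
    "wildfire burned forest acres evacuation smoke",
    "earthquake fault magnitude aftershock damage",
    "wildfire drought chaparral containment firefighters",
    "museum art exhibition painting gallery",
]
coords = np.array([[-120.5, 38.2], [-118.2, 34.1], [-121.0, 39.0], [-73.9, 40.7]])

# 1. Fit a (very small) LDA topic model; the real system used 500 topics
#    trained on the full English Wikipedia dump.
tf = CountVectorizer().fit_transform(docs)
lda = LatentDirichletAllocation(n_components=3, random_state=0).fit(tf)
theta = lda.transform(tf)                      # article-by-topic weights

# 2. Weight each geotagged article by its similarity to a hazard-corpus
#    topic vector (here: the mean topic vector of the wildfire articles).
hazard_vec = theta[[0, 2]].mean(axis=0)
weights = theta @ hazard_vec

# 3. A weighted kernel density over article locations stands in for the
#    map-algebra sum of topic probability surfaces.
kde = gaussian_kde(coords.T, weights=weights / weights.sum())
grid = np.array([[-121.0, -118.0, -74.0], [38.5, 34.0, 40.7]])
print(np.round(kde(grid), 4))
```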
Software Safety Analysis of a Flight Guidance System
NASA Technical Reports Server (NTRS)
Butler, Ricky W. (Technical Monitor); Tribble, Alan C.; Miller, Steven P.; Lempia, David L.
2004-01-01
This document summarizes the safety analysis performed on a Flight Guidance System (FGS) requirements model. In particular, the safety properties desired of the FGS model are identified and the presence of the safety properties in the model is formally verified. Chapter 1 provides an introduction to the entire project, while Chapter 2 gives a brief overview of the problem domain, the nature of accidents, model based development, and the four-variable model. Chapter 3 outlines the approach. Chapter 4 presents the results of the traditional safety analysis techniques and illustrates how the hazardous conditions associated with the system trace into specific safety properties. Chapter 5 presents the results of the formal methods analysis technique model checking that was used to verify the presence of the safety properties in the requirements model. Finally, Chapter 6 summarizes the main conclusions of the study, first and foremost that model checking is a very effective verification technique to use on discrete models with reasonable state spaces. Additional supporting details are provided in the appendices.
Site characterization and analysis penetrometer system
NASA Astrophysics Data System (ADS)
Heath, Jeff
1995-04-01
The site characterization and analysis penetrometer system (SCAPS) with laser-induced fluorescence (LIF) sensors is being demonstrated as a quick field screening technique to determine the physical and chemical characteristics of subsurface soil and contaminants at hazardous waste sites. SCAPS is a collaborative development effort of the Navy, Army, and Air Force under the Tri-Service SCAPS Program. The current SCAPS configuration is designed to quickly and cost-effectively distinguish areas contaminated with petroleum products (hydrocarbons) from unaffected areas.
Remote Sensing of Chaparral Fire Potential: Case Study in Topanga Canyon, California.
Remote sensing techniques, especially the use of color infrared aerial photography, provide a useful tool for fire hazard analysis, including interpretive information about fuel volumes, physiognomic plant groupings, the relationships of buildings to both natural and planted vegetation, and the fire vulnerability of roofing materials. In addition, the behavior of the September 1970 Wright Fire in the Topanga study area suggested the validity of the fire potential analysis which had been made prior to that conflagration.
Techniques for development of safety-related software for surgical robots.
Varley, P
1999-12-01
Regulatory bodies require evidence that software controlling potentially hazardous devices is developed to good manufacturing practices. Effective techniques used in other industries assume long timescales and high staffing levels and can be unsuitable for use without adaptation in developing electronic healthcare devices. This paper discusses a set of techniques used in practice to develop software for a particular innovative medical product, an endoscopic camera manipulator. These techniques include identification of potential hazards and tracing their mitigating factors through the project lifecycle.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dionne, B.J.; Morris, S.C. III; Baum, J.W.
1998-01-01
The Department of Energy's (DOE) Office of Environment, Safety, and Health (EH) sought examples of risk-based approaches to environmental restoration to include in their guidance for DOE nuclear facilities. Extensive measurements of radiological contamination in soil and ground water have been made at Brookhaven National Laboratory's Hazardous Waste Management Facility (HWMF) as part of a Comprehensive Environmental Response, Compensation and Liability Act (CERCLA) remediation process. This provided an ideal opportunity for a case study. This report provides a risk assessment and an "As Low as Reasonably Achievable" (ALARA) analysis for use at other DOE nuclear facilities as an example of a risk-based decision technique. This document contains the Appendices for the report.
NASA Astrophysics Data System (ADS)
Maruthi, Y. A.; Das, N. Lakshmana; Ramprasad, S.; Ram, S. S.; Sudarshan, M.
2015-08-01
The present study focuses on the quantitative analysis of elements in school chalk to ensure the safety of its use. Elements such as calcium (Ca), aluminum (Al), iron (Fe), silicon (Si), and chromium (Cr) were analyzed in settled chalk dust samples collected from five classrooms (CD-1) and in a set of unused chalk samples collected from the local market (CD-2) using Energy Dispersive X-Ray Fluorescence (ED-XRF) spectroscopy. The presence of these elements in significant concentrations in school chalk confirms that it is an irritant and an occupational hazard. It is suggested to use protective equipment such as filtered masks for the mouth and nose and chalk holders. This study also suggests using alternative techniques such as digital boards, marker boards, and PowerPoint presentations to mitigate the occupational hazard of classroom chalk.
Detection, identification, and quantification techniques for spills of hazardous chemicals
NASA Technical Reports Server (NTRS)
Washburn, J. F.; Sandness, G. A.
1977-01-01
The first 400 chemicals listed in the Coast Guard's Chemical Hazards Response Information System were evaluated with respect to their detectability, identifiability, and quantifiability by 12 generalized remote and in situ sensing techniques. Identification was also attempted for some key areas in water pollution sensing technology.
21 CFR 120.7 - Hazard analysis.
Code of Federal Regulations, 2012 CFR
2012-04-01
... 21 Food and Drugs 2 2012-04-01 2012-04-01 false Hazard analysis. 120.7 Section 120.7 Food and... hazards. The written hazard analysis shall consist of at least the following: (1) Identification of food..., including food hazards that can occur before, during, and after harvest. The hazard analysis shall be...
21 CFR 120.7 - Hazard analysis.
Code of Federal Regulations, 2013 CFR
2013-04-01
... 21 Food and Drugs 2 2013-04-01 2013-04-01 false Hazard analysis. 120.7 Section 120.7 Food and... hazards. The written hazard analysis shall consist of at least the following: (1) Identification of food..., including food hazards that can occur before, during, and after harvest. The hazard analysis shall be...
21 CFR 120.7 - Hazard analysis.
Code of Federal Regulations, 2011 CFR
2011-04-01
... 21 Food and Drugs 2 2011-04-01 2011-04-01 false Hazard analysis. 120.7 Section 120.7 Food and... hazards. The written hazard analysis shall consist of at least the following: (1) Identification of food..., including food hazards that can occur before, during, and after harvest. The hazard analysis shall be...
21 CFR 120.7 - Hazard analysis.
Code of Federal Regulations, 2014 CFR
2014-04-01
... 21 Food and Drugs 2 2014-04-01 2014-04-01 false Hazard analysis. 120.7 Section 120.7 Food and... hazards. The written hazard analysis shall consist of at least the following: (1) Identification of food..., including food hazards that can occur before, during, and after harvest. The hazard analysis shall be...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Prooijen, Monique van; Breen, Stephen
Purpose: Our treatment for choroidal melanoma utilizes the GTC frame. The patient looks at a small LED to stabilize target position. The LED is attached to a metal arm attached to the GTC frame. A camera on the arm allows therapists to monitor patient compliance. To move to mask-based immobilization we need a new LED/camera attachment mechanism. We used a Hazard-Risk Analysis (HRA) to guide the design of the new tool. Method: A pre-clinical model was built with input from therapy and machine shop personnel. It consisted of an aluminum frame placed in aluminum guide posts attached to the couch top. Further development was guided by the Department of Defense Standard Practice - System Safety hazard risk analysis technique. Results: An Orfit mask was selected because it allowed access to indexes on the couch top which assist with setup reproducibility. The first HRA table was created considering mechanical failure modes of the device. Discussions with operators and manufacturers identified other failure modes and solutions. HRA directed the design towards a safe clinical device. Conclusion: A new immobilization tool has been designed using hazard-risk analysis which resulted in an easier-to-use and safer tool compared to the initial design. The remaining risks are all low probability events and not dissimilar from those currently faced with the GTC setup. Given the gains in ease of use for therapists and patients as well as the lower costs for the hospital, we will implement this new tool.
Failure Modes and Effects Analysis (FMEA): A Bibliography
NASA Technical Reports Server (NTRS)
2000-01-01
Failure modes and effects analysis (FMEA) is a bottom-up analytical process that identifies process hazards, which helps managers understand vulnerabilities of systems, as well as assess and mitigate risk. It is one of several engineering tools and techniques available to program and project managers aimed at increasing the likelihood of safe and successful NASA programs and missions. This bibliography references 465 documents in the NASA STI Database that contain the major concepts, failure modes or failure analysis, in either the basic index or the major subject terms.
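FMEA worksheets are commonly prioritized with a Risk Priority Number (severity times occurrence times detection). The sketch below shows that calculation on invented failure modes; it is a generic illustration of the technique, not drawn from the bibliography itself.

```python
# Illustrative only: a common way FMEA worksheets are prioritized is the Risk
# Priority Number RPN = severity x occurrence x detection (each rated 1-10).
# The failure modes below are invented examples.
from dataclasses import dataclass

@dataclass
class FailureMode:
    item: str
    mode: str
    severity: int    # 1 (negligible) .. 10 (catastrophic)
    occurrence: int  # 1 (remote) .. 10 (frequent)
    detection: int   # 1 (almost certain to detect) .. 10 (undetectable)

    @property
    def rpn(self) -> int:
        return self.severity * self.occurrence * self.detection

worksheet = [
    FailureMode("valve", "fails closed", severity=7, occurrence=3, detection=4),
    FailureMode("sensor", "drifts high", severity=5, occurrence=6, detection=7),
    FailureMode("seal", "leaks", severity=9, occurrence=2, detection=3),
]

# Rank failure modes from highest to lowest RPN for mitigation planning.
for fm in sorted(worksheet, key=lambda f: f.rpn, reverse=True):
    print(f"{fm.item:8s} {fm.mode:12s} RPN={fm.rpn}")
```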
Geological risk assessment for the rapid development area of the Erhai Basin
NASA Astrophysics Data System (ADS)
Yang, Liu; Wang, Zhanqi; Jin, Gui; Chen, Dongdong; Wang, Zhan
Developing low-slope hilly land to create new land space in a watershed makes it particularly important to coordinate the sharply increasing conflicts between mountainous and urban land use in a city. However, development of low-slope hilly land can easily induce geologic hazards such as landslides and slumps, which may lead to further environmental losses in the watershed. It is therefore necessary to study the potential geo-hazard risks of low-slope hilly land development in urban areas. Using GIS spatial analysis techniques, we selected Dali City in the Erhai Basin, located in the watershed belt of the Jinsha, Lancang, and Red Rivers in Yunnan Province, China, as the study area. By studying relevant key indexes and parameters for monitoring potential geo-hazard risks, we established a composite index model for zoning areas with potential geo-hazard risk in low-slope hilly land development in the study area. Our findings indicate that the potential geo-hazard risk in eastern Dali City is relatively low, while the risk on the gently sloping hills of the western area is relatively high. The resulting zoning maps present geological information on potential geo-hazard risks of low-slope hilly land, providing important guidance for guarding against natural geo-hazards and potential environmental losses in the watershed.
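A composite index model of this kind is often implemented as a weighted overlay of normalized factor rasters. The sketch below illustrates that pattern with invented factors, weights, and class breaks; it is not the index actually used in the Erhai Basin study.

```python
# A minimal weighted-overlay sketch of a composite hazard index, assuming each
# factor (slope, lithology score, rainfall) is already rasterized and
# normalized to 0-1. Factors, weights, and class breaks are invented.
import numpy as np

rng = np.random.default_rng(0)
slope     = rng.random((100, 100))   # stand-ins for normalized factor rasters
lithology = rng.random((100, 100))
rainfall  = rng.random((100, 100))

weights = {"slope": 0.5, "lithology": 0.3, "rainfall": 0.2}
index = (weights["slope"] * slope
         + weights["lithology"] * lithology
         + weights["rainfall"] * rainfall)

# Zone the composite index into low / medium / high risk classes.
zones = np.digitize(index, bins=[0.4, 0.6])   # 0 = low, 1 = medium, 2 = high
print(np.bincount(zones.ravel()))             # cell count per risk zone
```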
Canister Storage Building (CSB) Hazard Analysis Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
POWERS, T.B.
2000-03-16
This report describes the methodology used in conducting the Canister Storage Building (CSB) hazard analysis to support the CSB final safety analysis report (FSAR) and documents the results. The hazard analysis process identified hazardous conditions and material-at-risk, determined causes for potential accidents, identified preventive and mitigative features, and qualitatively estimated the frequencies and consequences of specific occurrences. The hazard analysis was performed by a team of cognizant CSB operations and design personnel, safety analysts familiar with the CSB, and technical experts in specialty areas. The material included in this report documents the final state of a nearly two-year long process involving formal facilitated group sessions and independent hazard and accident analysis work. Attachment A provides two lists of hazard analysis team members and describes the background and experience of each. The first list is a complete list of the hazard analysis team members that have been involved over the two-year long process. The second list is a subset of the first list and consists of those hazard analysis team members that reviewed and agreed to the final hazard analysis documentation. The hazard analysis process led to the selection of candidate accidents for further quantitative analysis. New information relative to the hazards, discovered during the accident analysis, was incorporated into the hazard analysis data in order to compile a complete profile of facility hazards. Through this process, the results of the hazard and accident analyses led directly to the identification of safety structures, systems, and components, technical safety requirements, and other controls required to protect the public, workers, and environment.
An assessment of gas emanation hazard using a geographic information system and geostatistics.
Astorri, F; Beaubien, S E; Ciotoli, G; Lombardi, S
2002-03-01
This paper describes the use of geostatistical analysis and GIS techniques to assess gas emanation hazards. The Mt. Vulsini volcanic district was selected for this study because of the wide range of natural phenomena locally present that affect gas migration in the near surface. In addition, soil gas samples that were collected in this area should allow for a calibration between the generated risk/hazard models and the measured distribution of toxic gas species at surface. The approach used during this study consisted of three general stages. First, data were digitally organized into thematic layers; then, software functions in the GIS program "ArcView" were used to compare and correlate these various layers; finally, the produced "potential-risk" map was compared with radon soil gas data in order to validate the model and/or to select zones for further, more-detailed soil gas investigations.
Slonecker, E. Terrence; Fisher, Gary B.
2011-01-01
This report represents a remote sensing research effort conducted by the U.S. Geological Survey in cooperation with the U.S. Environmental Protection Agency (EPA) for the EPA Office of Inspector General. The objective of this investigation was to explore the efficacy of remote sensing as a technology for postclosure monitoring of hazardous waste sites as defined under the Comprehensive Environmental Response Compensation and Liability Act of 1980 (Public Law 96-510, 42 U.S.C. §9601 et seq.), also known as "Superfund." Five delisted Superfund sites in Maryland and Virginia were imaged with a hyperspectral sensor and visited for collection of soil, water, and spectral samples and inspection of general site conditions. This report evaluates traditional and hyperspectral imagery and field spectroscopic measurement techniques in the characterization and analysis of fugitive (anthropogenic, uncontrolled) contamination at previously remediated hazardous waste disposal sites.
Complete hazard ranking to analyze right-censored data: An ALS survival study.
Huang, Zhengnan; Zhang, Hongjiu; Boss, Jonathan; Goutman, Stephen A; Mukherjee, Bhramar; Dinov, Ivo D; Guan, Yuanfang
2017-12-01
Survival analysis represents an important outcome measure in clinical research and clinical trials; further, survival ranking may offer additional advantages in clinical trials. In this study, we developed GuanRank, a non-parametric ranking-based technique to transform patients' survival data into a linear space of hazard ranks. The transformation enables the utilization of machine learning base-learners including Gaussian process regression, Lasso, and random forest on survival data. The method was submitted to the DREAM Amyotrophic Lateral Sclerosis (ALS) Stratification Challenge. Ranked first place, the model gave more accurate ranking predictions on the PRO-ACT ALS dataset than the Cox proportional hazards model. By utilizing right-censored data in its training process, the method demonstrated its state-of-the-art predictive power in ALS survival ranking. Its feature selection identified multiple important factors, some of which conflict with previous studies.
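The exact GuanRank transformation is not reproduced here, but the general pattern it describes (replace right-censored survival targets with hazard ranks, then fit an ordinary machine learning regressor on those ranks) can be sketched as follows, using a naive ranking rule and synthetic data.

```python
# Conceptual sketch only -- not the GuanRank algorithm itself. It illustrates
# turning right-censored survival data into hazard ranks (here a naive rank on
# time, with censored cases broken toward lower hazard) and then fitting an
# ordinary regressor such as a random forest on those ranks.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 5))                    # patient features
time = rng.exponential(scale=12.0, size=200)     # follow-up time (months)
event = rng.random(200) < 0.7                    # True = event observed, False = censored

# Naive ranking: earlier observed events get higher hazard ranks; at equal
# times, observed events rank ahead of censored cases (a crude stand-in for
# GuanRank's non-parametric treatment of censoring).
order = np.lexsort((~event, time))               # sort by time, events first at ties
hazard_rank = np.empty(len(time))
hazard_rank[order] = np.linspace(1.0, 0.0, len(time))

model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X, hazard_rank)
print(model.predict(X[:5]))                      # predicted hazard ranks
```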
Combining heuristic and statistical techniques in landslide hazard assessments
NASA Astrophysics Data System (ADS)
Cepeda, Jose; Schwendtner, Barbara; Quan, Byron; Nadim, Farrokh; Diaz, Manuel; Molina, Giovanni
2014-05-01
As a contribution to the Global Assessment Report 2013 - GAR2013, coordinated by the United Nations International Strategy for Disaster Reduction - UNISDR, a drill-down exercise for landslide hazard assessment was carried out by entering the results of both heuristic and statistical techniques into a new but simple combination rule. The data available for this evaluation included landslide inventories, both historical and event-based. In addition to the application of a heuristic method used in the previous editions of GAR, the availability of inventories motivated the use of statistical methods. The heuristic technique is largely based on the Mora & Vahrson method, which estimates hazard as the product of susceptibility and triggering factors, where classes are weighted based on expert judgment and experience. Two statistical methods were also applied: the landslide index method, which estimates weights of the classes for the susceptibility and triggering factors based on the evidence provided by the density of landslides in each class of the factors; and the weights of evidence method, which extends the previous technique to include both positive and negative evidence of landslide occurrence in the estimation of weights for the classes. One key aspect during the hazard evaluation was the decision on the methodology to be chosen for the final assessment. Instead of opting for a single methodology, it was decided to combine the results of the three implemented techniques using a combination rule based on a normalization of the results of each method. The hazard evaluation was performed for both earthquake- and rainfall-induced landslides. The country chosen for the drill-down exercise was El Salvador. The results indicate that highest hazard levels are concentrated along the central volcanic chain and at the centre of the northern mountains.
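A minimal version of such a combination rule (rescale each method's hazard raster to a common range, then aggregate the normalized layers) might look like the following. The exact GAR2013 rule and the real input layers are not reproduced; the rasters and the simple averaging below are illustrative only.

```python
# Illustrative sketch of combining heuristic and statistical hazard layers:
# each method's raster is rescaled to [0, 1] and the normalized layers are
# averaged. The input rasters are random stand-ins.
import numpy as np

def normalize(raster: np.ndarray) -> np.ndarray:
    lo, hi = np.nanmin(raster), np.nanmax(raster)
    return (raster - lo) / (hi - lo)

rng = np.random.default_rng(2)
heuristic     = rng.random((50, 50)) * 100   # e.g., Mora & Vahrson scores
landslide_idx = rng.normal(0, 1, (50, 50))   # e.g., landslide index weights
weights_of_ev = rng.normal(0, 2, (50, 50))   # e.g., weights-of-evidence contrasts

combined = np.mean([normalize(r) for r in (heuristic, landslide_idx, weights_of_ev)], axis=0)
print(combined.min(), combined.max())        # combined hazard index in [0, 1]
```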
40 CFR Appendix I to Part 264 - Recordkeeping Instructions
Code of Federal Regulations, 2010 CFR
2010-07-01
...., liquid, sludge, solid, or contained gas. If the waste is not listed in part 261, subpart D, of this... from production of ----, EPA Hazardous Waste Number W051). Each hazardous waste listed in part 261... Methods Enter the handling code(s) listed below that most closely represents the technique(s) used at the...
INTERNAL HAZARDS ANALYSIS FOR LICENSE APPLICATION
DOE Office of Scientific and Technical Information (OSTI.GOV)
R.J. Garrett
2005-02-17
The purpose of this internal hazards analysis is to identify and document the internal hazards and potential initiating events associated with preclosure operations of the repository at Yucca Mountain. Internal hazards are those hazards presented by the operation of the facility and by its associated processes that can potentially lead to a radioactive release or cause a radiological hazard. In contrast to external hazards, internal hazards do not involve natural phenomena and external man-made hazards. This internal hazards analysis was performed in support of the preclosure safety analysis and the License Application for the Yucca Mountain Project. The methodology for this analysis provides a systematic means to identify internal hazards and potential initiating events that may result in a radiological hazard or radiological release during the repository preclosure period. These hazards are documented in tables of potential internal hazards and potential initiating events (Section 6.6) for input to the repository event sequence categorization process. The results of this analysis will undergo further screening and analysis based on the criteria that apply to the performance of event sequence analyses for the repository preclosure period. The evolving design of the repository will be re-evaluated periodically to ensure that internal hazards that have not been previously evaluated are identified.
Chicken barn climate and hazardous volatile compounds control using simple linear regression and PID
NASA Astrophysics Data System (ADS)
Abdullah, A. H.; Bakar, M. A. A.; Shukor, S. A. A.; Saad, F. S. A.; Kamis, M. S.; Mustafa, M. H.; Khalid, N. S.
2016-07-01
The hazardous volatile compounds from chicken manure in a chicken barn are potentially a health threat to the farm animals and workers. Ammonia (NH3) and hydrogen sulphide (H2S) produced in the chicken barn are influenced by climate changes. An Electronic Nose (e-nose) is used for sampling the barn's air, temperature, and humidity data. Simple linear regression is used to identify the correlations between temperature and humidity, humidity and ammonia, and ammonia and hydrogen sulphide. MATLAB Simulink software was used for the sample data analysis using a PID controller. Results show that a PID controller tuned using the Ziegler-Nichols technique can improve climate control in the chicken barn.
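For reference, a discrete PID loop with classic Ziegler-Nichols gains can be sketched as below. The ultimate gain and period, the first-order "barn ventilation" plant, and all rates are invented stand-ins; this is not the paper's MATLAB Simulink model or its measured dynamics.

```python
# Illustrative discrete PID loop with Ziegler-Nichols gains (Kp = 0.6*Ku,
# Ki = 1.2*Ku/Pu, Kd = 0.075*Ku*Pu). Ku, Pu, the plant, and all rates are
# invented for illustration.
Ku, Pu, dt = 4.0, 20.0, 0.5                      # assumed ultimate gain/period, time step (s)
Kp, Ki, Kd = 0.6 * Ku, 1.2 * Ku / Pu, 0.075 * Ku * Pu

setpoint, nh3 = 5.0, 25.0                        # target and initial NH3 (ppm)
integral, prev_err = 0.0, nh3 - setpoint

for _ in range(600):
    err = nh3 - setpoint                         # positive error -> more ventilation needed
    integral += err * dt
    derivative = (err - prev_err) / dt
    fan = max(0.0, Kp * err + Ki * integral + Kd * derivative)  # actuator, clipped at zero
    prev_err = err
    # Toy plant: NH3 rises from manure emission and is removed by ventilation.
    emission, removal = 0.8, 0.02 * fan
    nh3 = max(0.0, nh3 + (emission - removal * nh3) * dt)

print(f"NH3 after {600 * dt:.0f} s: {nh3:.2f} ppm (setpoint {setpoint} ppm)")
```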
NASA Technical Reports Server (NTRS)
1996-01-01
Under a Small Business Innovation Research (SBIR) contract to Kennedy Space Center, EIC Laboratories invented a Raman Spectrograph with fiber optic sampling for space applications such as sensing hazardous fuel vapors and making on-board rapid analyses of chemicals and minerals. Raman spectroscopy is a laser-based measurement technique that provides a molecular 'fingerprint' through a unique vibrational spectrum, and it can function in aqueous environments. EIC combined optical fiber technology with Raman methods to develop sensors that can be operated at a distance from the spectrographic analysis instruments and the laser excitation source. EIC refined and commercialized the technology to create the Fiber Optic Raman Spectrograph and the RamanProbe. Commercial applications range from process control to monitoring hazardous materials.
Initiating Event Analysis of a Lithium Fluoride Thorium Reactor
NASA Astrophysics Data System (ADS)
Geraci, Nicholas Charles
The primary purpose of this study is to perform an Initiating Event Analysis for a Lithium Fluoride Thorium Reactor (LFTR) as the first step of a Probabilistic Safety Assessment (PSA). The major objective of the research is to compile a list of key initiating events capable of resulting in failure of safety systems and release of radioactive material from the LFTR. Due to the complex interactions between engineering design, component reliability and human reliability, probabilistic safety assessments are most useful when the scope is limited to a single reactor plant. Thus, this thesis will study the LFTR design proposed by Flibe Energy. An October 2015 Electric Power Research Institute report on the Flibe Energy LFTR asked "what-if?" questions of subject matter experts and compiled a list of key hazards with the most significant consequences to the safety or integrity of the LFTR. The potential exists for unforeseen hazards to pose additional risk for the LFTR, but the scope of this thesis is limited to evaluation of those key hazards already identified by Flibe Energy. These key hazards are the starting point for the Initiating Event Analysis performed in this thesis. Engineering evaluation and technical study of the plant using a literature review and comparison to reference technology revealed four hazards with high potential to cause reactor core damage. To determine the initiating events resulting in realization of these four hazards, reference was made to previous PSAs and existing NRC and EPRI initiating event lists. Finally, fault tree and event tree analyses were conducted, completing the logical classification of initiating events. Results are qualitative as opposed to quantitative due to the early stages of system design descriptions and lack of operating experience or data for the LFTR. In summary, this thesis analyzes initiating events using previous research and inductive and deductive reasoning through traditional risk management techniques to arrive at a list of key initiating events that can be used to address vulnerabilities during the design phases of LFTR development.
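Although the thesis' fault tree and event tree results are qualitative, a fault tree of this kind is quantified, when basic-event probabilities are available, by combining AND/OR gates under an independence assumption. The sketch below uses an invented tree and invented probabilities purely for illustration.

```python
# Illustrative fault-tree quantification, assuming independent basic events.
# The tree and probabilities below are invented; the thesis' LFTR analysis
# was qualitative.
from math import prod

def or_gate(probs):   # P(at least one input event occurs)
    return 1.0 - prod(1.0 - p for p in probs)

def and_gate(probs):  # P(all input events occur)
    return prod(probs)

p_pump_fails   = 1e-3
p_valve_stuck  = 5e-4
p_power_loss   = 2e-4
p_backup_fails = 1e-2

loss_of_cooling = or_gate([p_pump_fails, p_valve_stuck])      # OR of component faults
loss_of_power   = and_gate([p_power_loss, p_backup_fails])    # AND: main and backup both fail
top_event       = or_gate([loss_of_cooling, loss_of_power])   # top event: loss of heat removal
print(f"P(top event) = {top_event:.2e}")
```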
Hazard Analysis Guidelines for Transit Projects
DOT National Transportation Integrated Search
2000-01-01
These hazard analysis guidelines discuss safety critical systems and subsystems, types of hazard analyses, when hazard analyses should be performed, and the hazard analysis philosophy. These guidelines are published by FTA to assist the transit indus...
Nagaosa, Ryuichi S
2014-04-30
This study proposes a new numerical formulation of the spread of a flammable gas leakage. A new numerical approach has been applied to establish fundamental data for a hazard assessment of flammable gas spread in an enclosed residential space. The approach employs an extended version of a two-compartment concept, and determines the leakage concentration of gas using a mass-balance based formulation. The study also introduces a computational fluid dynamics (CFD) technique for calculating three-dimensional details of the gas spread by resolving all the essential scales of fluid motions without a turbulent model. The present numerical technique promises numerical solutions with fewer uncertainties produced by the model equations while maintaining high accuracy. The study examines the effect of gas density on the concentration profiles of flammable gas spread. It also discusses the effect of gas leakage rate on gas concentration profiles. Copyright © 2014 The Authors. Published by Elsevier B.V. All rights reserved.
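A bare-bones two-compartment mass balance (leak into a lower compartment, exchange with an upper compartment, ventilation to outside) can be written as below. This is not the paper's extended formulation or its CFD component; the volumes and flow rates are invented.

```python
# Minimal two-compartment mass-balance sketch of gas concentration build-up in
# an enclosed space. All volumes and rates are invented stand-ins.
import numpy as np

V_low, V_up = 15.0, 15.0        # compartment volumes (m^3)
Q_leak = 2.0e-4                 # leak rate of pure gas (m^3/s)
Q_exch = 5.0e-3                 # inter-compartment exchange flow (m^3/s)
Q_vent = 2.0e-3                 # ventilation flow out of the upper compartment (m^3/s)
dt, t_end = 1.0, 3600.0

c_low = c_up = 0.0              # gas volume fractions
for _ in np.arange(0.0, t_end, dt):
    dc_low = (Q_leak * (1.0 - c_low) + Q_exch * (c_up - c_low)) / V_low
    dc_up  = (Q_exch * (c_low - c_up) - Q_vent * c_up) / V_up
    c_low += dc_low * dt
    c_up  += dc_up * dt

print(f"after 1 h: lower {100 * c_low:.2f} vol%, upper {100 * c_up:.2f} vol%")
```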
14 CFR 437.29 - Hazard analysis.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 14 Aeronautics and Space 4 2010-01-01 2010-01-01 false Hazard analysis. 437.29 Section 437.29... Documentation § 437.29 Hazard analysis. (a) An applicant must perform a hazard analysis that complies with § 437.55(a). (b) An applicant must provide to the FAA all the results of each step of the hazard analysis...
14 CFR 437.29 - Hazard analysis.
Code of Federal Regulations, 2011 CFR
2011-01-01
... 14 Aeronautics and Space 4 2011-01-01 2011-01-01 false Hazard analysis. 437.29 Section 437.29... Documentation § 437.29 Hazard analysis. (a) An applicant must perform a hazard analysis that complies with § 437.55(a). (b) An applicant must provide to the FAA all the results of each step of the hazard analysis...
NASA Astrophysics Data System (ADS)
Munirwansyah; Irsyam, Masyhur; Munirwan, Reza P.; Yunita, Halida; Zulfan Usrina, M.
2018-05-01
Occupational safety and health (OSH) is a planned effort to prevent accidents and diseases caused by work. Work accidents caused by unsafe field conditions often occur during mining activities. In open-pit mine areas, slumps often occur due to unstable slopes, which can disrupt the activities and productivity of mining companies. Research on the stability of open-pit slopes conducted by Febrianti [8] indicated unsafe slope conditions at the Meureubo coal mine in Aceh Barat district; the present research continues that work on OSH for landslides, aiming to understand the stability of the excavation slope and the shape of the slope collapse. Plaxis software was used for this research. After analyzing the slope stability and the effect of landslides on OSH with the Job Safety Analysis (JSA) method to identify hazards to work safety, a risk management analysis is conducted to classify hazard levels and their handling techniques. The aim of this research is to determine the level of work-accident risk at the company and the corresponding prevention efforts. The risk analysis yields a very high risk value (> 350), at which the activity must be stopped until the risk can be reduced to the allowed or accepted limit (< 20).
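The quoted thresholds (a risk value above 350 requiring a stop, below 20 acceptable) are consistent with a Fine-Kinney-style score of probability times exposure times consequence; whether the study used exactly that scheme is an assumption here. The sketch below illustrates such a scoring, with invented hazard entries and illustrative intermediate bands.

```python
# Sketch of a Fine-Kinney-style risk score (probability x exposure x consequence).
# The top and bottom bands match the thresholds quoted in the abstract; the
# intermediate bands and the hazard entries are illustrative assumptions.
def risk_level(score: float) -> str:
    if score > 350:
        return "very high - stop activity until reduced"
    if score > 70:
        return "substantial - correction needed"   # illustrative intermediate band
    if score >= 20:
        return "moderate - attention required"     # illustrative intermediate band
    return "acceptable (< 20)"

hazards = {
    # name: (probability, exposure, consequence) ratings
    "slope failure above haul road": (6, 6, 15),
    "dust exposure at crusher":      (3, 6, 3),
}
for name, (p, e, c) in hazards.items():
    score = p * e * c
    print(f"{name:32s} score={score:4d}  {risk_level(score)}")
```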
Development/Modernization of an Advanced Non-Light Water Reactor Probabilistic Risk Assessment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Henneke, Dennis W.; Robinson, James
In 2015, GE Hitachi Nuclear Energy (GEH) teamed with Argonne National Laboratory (Argonne) to perform Research and Development (R&D) of next-generation Probabilistic Risk Assessment (PRA) methodologies for the modernization of an advanced non-Light Water Reactor (non-LWR) PRA. This effort built upon a PRA developed in the early 1990s for GEH’s Power Reactor Inherently Safe Module (PRISM) Sodium Fast Reactor (SFR). The work had four main tasks: internal events development modeling the risk from the reactor for hazards occurring at-power internal to the plant; an all hazards scoping review to analyze the risk at a high level from external hazards such as earthquakes and high winds; an all modes scoping review to understand the risk at a high level from operating modes other than at-power; and risk insights to integrate the results from each of the three phases above. To achieve these objectives, GEH and Argonne used and adapted proven PRA methodologies and techniques to build a modern non-LWR all hazards/all modes PRA. The teams also advanced non-LWR PRA methodologies, which is an important outcome from this work. This report summarizes the project outcomes in two major phases. The first phase presents the methodologies developed for non-LWR PRAs. The methodologies are grouped by scope, from Internal Events At-Power (IEAP) to hazards analysis to modes analysis. The second phase presents details of the PRISM PRA model which was developed as a validation of the non-LWR methodologies. The PRISM PRA was performed in detail for IEAP, and at a broader level for hazards and modes. In addition to contributing methodologies, this project developed risk insights applicable to non-LWR PRA, including focus-areas for future R&D, and conclusions about the PRISM design.
Detection of hazardous chemicals using field-portable Raman spectroscopy
NASA Astrophysics Data System (ADS)
Wright, Cherylyn W.; Harvey, Scott D.; Wright, Bob W.
2003-07-01
A major challenge confronting emergency response, border control, and other security-related functions is the accurate, rapid, and safe identification of potentially hazardous chemicals outside a laboratory environment. Raman spectroscopy is a rapid, non-intrusive technique that can be used to confidently identify many classes of hazardous and potentially explosive compounds based on molecular vibration information. Advances in instrumentation now allow reliable field-portable measurements to be made. Before the Raman technique can be effectively applied and be accepted within the scientific community, realistic studies must be performed to develop methods, define limitations, and rigorously evaluate its effectiveness. Examples of a variety of chemicals (including neat and diluted chemical warfare [CW] agents, a CW agent precursor, a biological warfare (BW)-related compound, an illicit drug, and explosives) identified using Raman spectroscopy in various types of containers and on surfaces are given, as well as results from a blind field test of 29 unknown samples which included CW agent precursors and/or degradation products, solvents associated with CW agent production, pesticides, explosives, and BW toxins (mostly mycotoxins). Additionally, results of experimental studies to evaluate the analysis of flammable organic solvents, propellants, military explosives, mixtures containing military explosives, shock-sensitive explosives, and gun powders are described with safety guidelines. Spectral masks for screening unknown samples for explosives and nerve agents are given.
NASA Astrophysics Data System (ADS)
Themistocleous, K.
2017-09-01
PROTHEGO (PROTection of European Cultural HEritage from GeO-hazards) is a collaborative research project funded in the framework of the Joint Programming Initiative on Cultural Heritage and Global Change (JPICH) - Heritage Plus in 2015-2018 (www.prothego.eu). PROTHEGO aims to make an innovative contribution towards the analysis of geo-hazards in areas of cultural heritage, and uses novel space technology for the management of sites and world heritage monuments located throughout Europe, using specialized remote sensing techniques. The methodology will include the 395 monuments of UNESCO in Europe, with case studies conducted in 4 UNESCO sites in England, Spain, Italy and Cyprus. For the Cyprus case study in Choirokoitia, Unmanned Aerial Vehicles (UAVs) are used to monitor and assess the risk from natural hazards on the archaeological site and to evaluate cultural heritage site deformation. The UAVs were flown over the study area to produce time-series data, including orthoimages, 3D models and digital elevation models of the Choirokoitia site in order to identify changes in the area caused by natural hazards.
Task distribution, work environment, and perceived health discomforts among Indian ceramic workers.
Majumder, Joydeep; Shah, Priyanka; Bagepally, Bhavani S
2016-12-01
The study examined the work environment of Indian ceramics workers and assessed associations between work hazards, work category, and self-reported symptoms. The multi-method ergonomic review technique (MMERT) checkpoints were used for work analysis and the prevalence of self-reported symptoms among 329 male workers. Ambient temperature and relative humidity in the ceramic industries were 39.9°C and 17.4%, respectively. Musculoskeletal discomfort was observed as a primary complaint, especially lower extremity pain (45%). Load handlers and machine operators had the highest levels of work hazards, including high skill requirements, strenuous work postures, and poor commitment by the organization. Poor job autonomy, task clarity, hot workplace, inappropriate workplace design, inadequate auxiliary support, and mental overload were significantly associated with self-reported symptoms (P < 0.001). Work categories are associated with work hazards and may lead to various health symptoms among ceramic workers. Control of workplace hazards may lower rates of symptoms and thus may lead to improved health, productivity, and well-being. Am. J. Ind. Med. 59:1145-1155, 2016. © 2016 Wiley Periodicals, Inc.
Lunar mission safety and rescue: Hazards analysis and safety requirements
NASA Technical Reports Server (NTRS)
1971-01-01
The results are presented of the hazards analysis which was concerned only with hazards to personnel and not with loss of equipment or property. Hazards characterization includes the definition of a hazard, the hazard levels, and the hazard groups. The analysis methodology is described in detail. The methodology was used to prepare the top level functional flow diagrams, to perform the first level hazards assessment, and to develop a list of conditions and situations requiring individual hazard studies. The 39 individual hazard study results are presented in total.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Maruthi, Y. A., E-mail: ymjournal2014@gmail.com; Das, N. Lakshmana, E-mail: nldas9@gmail.com; Ramprasad, S., E-mail: ramprasadsurakala@gmail.com
The present study focuses on the quantitative analysis of elements in school chalk to ensure the safety of its use. Elements such as calcium (Ca), aluminum (Al), iron (Fe), silicon (Si), and chromium (Cr) were analyzed in settled chalk dust samples collected from five classrooms (CD-1) and in a set of unused chalk samples collected from the local market (CD-2) using Energy Dispersive X-Ray Fluorescence (ED-XRF) spectroscopy. The presence of these elements in significant concentrations in school chalk confirms that it is an irritant and an occupational hazard. It is suggested to use protective equipment such as filtered masks for the mouth and nose and chalk holders. This study also suggests using alternative techniques such as digital boards, marker boards, and PowerPoint presentations to mitigate the occupational hazard of classroom chalk.
NASA Astrophysics Data System (ADS)
Jebur, M. N.; Pradhan, B.; Shafri, H. Z. M.; Yusof, Z.; Tehrany, M. S.
2014-10-01
Modeling and classification difficulties are fundamental issues in natural hazard assessment. A geographic information system (GIS) is a domain that requires users to use various tools to perform different types of spatial modeling. Bivariate statistical analysis (BSA) assists in hazard modeling. To perform this analysis, several calculations are required and the user has to transfer data from one format to another. Most researchers perform these calculations manually by using Microsoft Excel or other programs. This process is time consuming and carries a degree of uncertainty. The lack of proper tools to implement BSA in a GIS environment prompted this study. In this paper, a user-friendly tool, BSM (bivariate statistical modeler), for BSA technique is proposed. Three popular BSA techniques such as frequency ratio, weights-of-evidence, and evidential belief function models are applied in the newly proposed ArcMAP tool. This tool is programmed in Python and is created by a simple graphical user interface, which facilitates the improvement of model performance. The proposed tool implements BSA automatically, thus allowing numerous variables to be examined. To validate the capability and accuracy of this program, a pilot test area in Malaysia is selected and all three models are tested by using the proposed program. Area under curve is used to measure the success rate and prediction rate. Results demonstrate that the proposed program executes BSA with reasonable accuracy. The proposed BSA tool can be used in numerous applications, such as natural hazard, mineral potential, hydrological, and other engineering and environmental applications.
NASA Astrophysics Data System (ADS)
Jebur, M. N.; Pradhan, B.; Shafri, H. Z. M.; Yusoff, Z. M.; Tehrany, M. S.
2015-03-01
Modelling and classification difficulties are fundamental issues in natural hazard assessment. A geographic information system (GIS) is a domain that requires users to use various tools to perform different types of spatial modelling. Bivariate statistical analysis (BSA) assists in hazard modelling. To perform this analysis, several calculations are required and the user has to transfer data from one format to another. Most researchers perform these calculations manually by using Microsoft Excel or other programs. This process is time-consuming and carries a degree of uncertainty. The lack of proper tools to implement BSA in a GIS environment prompted this study. In this paper, a user-friendly tool, bivariate statistical modeler (BSM), for BSA technique is proposed. Three popular BSA techniques, such as frequency ratio, weight-of-evidence (WoE), and evidential belief function (EBF) models, are applied in the newly proposed ArcMAP tool. This tool is programmed in Python and created by a simple graphical user interface (GUI), which facilitates the improvement of model performance. The proposed tool implements BSA automatically, thus allowing numerous variables to be examined. To validate the capability and accuracy of this program, a pilot test area in Malaysia is selected and all three models are tested by using the proposed program. Area under curve (AUC) is used to measure the success rate and prediction rate. Results demonstrate that the proposed program executes BSA with reasonable accuracy. The proposed BSA tool can be used in numerous applications, such as natural hazard, mineral potential, hydrological, and other engineering and environmental applications.
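Of the three techniques named above, the frequency ratio is the simplest to express: for each class of a conditioning factor, it is the share of hazard occurrences in that class divided by the share of total area in that class. The sketch below computes it on random stand-in rasters; it is not the Malaysian test data or the BSM tool's code.

```python
# Minimal frequency-ratio sketch for one conditioning factor (e.g., slope class):
# FR(class) = (share of hazard cells in the class) / (share of area in the class).
# Input rasters are random stand-ins.
import numpy as np

rng = np.random.default_rng(3)
factor_class = rng.integers(1, 5, size=(200, 200))      # e.g., slope classes 1-4
hazard = rng.random((200, 200)) < 0.01 * factor_class    # toy inventory, denser at high class

fr = {}
for c in np.unique(factor_class):
    in_class = factor_class == c
    pct_hazard = hazard[in_class].sum() / hazard.sum()
    pct_area = in_class.sum() / in_class.size
    fr[c] = pct_hazard / pct_area

# An FR above 1 means the class is over-represented among hazard occurrences.
fr_raster = np.vectorize(fr.get)(factor_class)           # reclassify to an FR map
print(fr)
```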
21 CFR 123.6 - Hazard analysis and Hazard Analysis Critical Control Point (HACCP) plan.
Code of Federal Regulations, 2012 CFR
2012-04-01
... 21 Food and Drugs 2 2012-04-01 2012-04-01 false Hazard analysis and Hazard Analysis Critical Control Point (HACCP) plan. 123.6 Section 123.6 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF... processor shall have and implement a written HACCP plan whenever a hazard analysis reveals one or more food...
21 CFR 123.6 - Hazard analysis and Hazard Analysis Critical Control Point (HACCP) plan.
Code of Federal Regulations, 2010 CFR
2010-04-01
... 21 Food and Drugs 2 2010-04-01 2010-04-01 false Hazard analysis and Hazard Analysis Critical Control Point (HACCP) plan. 123.6 Section 123.6 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF... processor shall have and implement a written HACCP plan whenever a hazard analysis reveals one or more food...
ERIC Educational Resources Information Center
Moore, Andrea Lisa
2013-01-01
Toxic Release Inventory facilities are among the many environmental hazards shown to create environmental inequities in the United States. This project examined four factors associated with Toxic Release Inventory, specifically, manufacturing facility location at multiple spatial scales using spatial analysis techniques (i.e., O-ring statistic and…
ADVANCES IN ENCAPSULATION TECHNOLOGIES FOR THE MANAGEMENT OF MERCURY-CONTAMINATED HAZARDOUS WASTES
Although industrial and commercial uses of mercury have been curtailed in recent times, there is a demonstrated need for the development of reliable hazardous waste management techniques because of ongoing hazardous waste generation and historic operations that have led to signif...
Empirical Data Fusion for Convective Weather Hazard Nowcasting
NASA Astrophysics Data System (ADS)
Williams, J.; Ahijevych, D.; Steiner, M.; Dettling, S.
2009-09-01
This paper describes a statistical analysis approach to developing an automated convective weather hazard nowcast system suitable for use by aviation users in strategic route planning and air traffic management. The analysis makes use of numerical weather prediction model fields and radar, satellite, and lightning observations and derived features along with observed thunderstorm evolution data, which are aligned using radar-derived motion vectors. Using a dataset collected during the summers of 2007 and 2008 over the eastern U.S., the predictive contributions of the various potential predictor fields are analyzed for various spatial scales, lead-times and scenarios using a technique called random forests (RFs). A minimal, skillful set of predictors is selected for each scenario requiring distinct forecast logic, and RFs are used to construct an empirical probabilistic model for each. The resulting data fusion system, which ran in real-time at the National Center for Atmospheric Research during the summer of 2009, produces probabilistic and deterministic nowcasts of the convective weather hazard and assessments of the prediction uncertainty. The nowcasts' performance and results for several case studies are presented to demonstrate the value of this approach. This research has been funded by the U.S. Federal Aviation Administration to support the development of the Consolidated Storm Prediction for Aviation (CoSPA) system, which is intended to provide convective hazard nowcasts and forecasts for the U.S. Next Generation Air Transportation System (NextGen).
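The random-forest data-fusion step can be sketched as follows: each row is a grid cell at a given lead time, the columns are fused predictor fields, and the label is whether a convective hazard was later observed there. The feature names and data below are invented; this is not the CoSPA implementation.

```python
# Conceptual sketch of a random-forest data-fusion nowcast, not the CoSPA system.
# Features and labels are synthetic stand-ins for fused NWP/radar/satellite/
# lightning predictors and observed storm evolution.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(4)
n = 5000
X = np.column_stack([
    rng.normal(size=n),      # e.g., NWP instability index
    rng.normal(size=n),      # e.g., radar reflectivity advected along motion vectors
    rng.normal(size=n),      # e.g., satellite cloud-top cooling rate
    rng.poisson(1.0, n),     # e.g., lightning flash density
])
y = (X[:, 1] + 0.5 * X[:, 3] + rng.normal(scale=1.0, size=n)) > 1.5

rf = RandomForestClassifier(n_estimators=300, oob_score=True, random_state=0)
rf.fit(X, y)
prob_hazard = rf.predict_proba(X[:5])[:, 1]   # probabilistic nowcast for 5 cells
print(rf.oob_score_, rf.feature_importances_, prob_hazard)
```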
NASA Astrophysics Data System (ADS)
Wilson, B.; Paradise, T. R.
2016-12-01
The influx of millions of Syrian refugees into Turkey has rapidly changed the population distribution along the Dead Sea Rift and East Anatolian Fault zones. In contrast to other countries in the Middle East where refugees are accommodated in camp environments, the majority of displaced individuals in Turkey are integrated into cities, towns, and villages—placing stress on urban settings and increasing potential exposure to strong shaking. Yet, displaced populations are not traditionally captured in data sources used in earthquake risk analysis or loss estimations. Accordingly, we present a district-level analysis assessing the spatial overlap of earthquake hazards and refugee locations in southeastern Turkey to determine how migration patterns are altering seismic risk in the region. Using migration estimates from the U.S. Humanitarian Information Unit, we create three district-level population scenarios that combine official population statistics, refugee camp populations, and low, median, and high bounds for integrated refugee populations. We perform probabilistic seismic hazard analysis alongside these population scenarios to map spatial variations in seismic risk between 2011 and late 2015. Our results show a significant relative southward increase of seismic risk for this period due to refugee migration. Additionally, we calculate earthquake fatalities for simulated earthquakes using a semi-empirical loss estimation technique to determine degree of under-estimation resulting from forgoing migration data in loss modeling. We find that including refugee populations increased casualties by 11-12% using median population estimates, and upwards of 20% using high population estimates. These results communicate the ongoing importance of placing environmental hazards in their appropriate regional and temporal context which unites physical, political, cultural, and socio-economic landscapes. Keywords: Earthquakes, Hazards, Loss-Estimation, Syrian Crisis, Migration, Refugees
78 FR 28892 - Hazardous Fire Risk Reduction, East Bay Hills, CA
Federal Register 2010, 2011, 2012, 2013, 2014
2013-05-16
... DEPARTMENT OF HOMELAND SECURITY Federal Emergency Management Agency [Docket ID FEMA 2010-0037] Hazardous Fire Risk Reduction, East Bay Hills, CA AGENCY: Federal Emergency Management Agency, DHS. ACTION..., limbing and mowing, thinning, and grazing techniques as appropriate to reduce the risk of fire hazard...
NASA Technical Reports Server (NTRS)
1974-01-01
Future operational concepts for the space transportation system were studied in terms of space shuttle upper stage failure contingencies possible during deployment, retrieval, or space servicing of automated satellite programs. Problems anticipated during mission planning were isolated using a modified 'fault tree' technique, normally used in safety analyses. A comprehensive space servicing hazard analysis is presented which classifies possible failure modes under the categories of catastrophic collision, failure to rendezvous and dock, servicing failure, and failure to undock. The failure contingencies defined are to be taken into account during design of the upper stage.
NASA Astrophysics Data System (ADS)
Moser, M.
2009-04-01
The Gadeinerbach catchment in the District of Lungau, Salzburg, Austria, is prone to debris flows. Large debris flow events date back to 1934 and 1953. In the upper catchment, large mass movements represent debris sources. A field study shows the debris potential, and the catchment looks like a "sleeping torrential giant". To carry out mitigation measures, a detailed risk management concept was developed, based on a risk assessment combining historical analysis, field study, and numerical modeling on the alluvial fan. Human activities have partly altered the surface of the Gadeinerbach alluvial fan, but nevertheless some important hazard indicators could be found. With the hazard indicators and photo analysis from the large 1934 debris flow event, the character of the catchment could be pointed out. With the help of these historical data sets (hazard indicators, sediment and debris amounts, ...), it is possible to calibrate the numerical models and to gain useful knowledge about their pros and cons and their application. The results were used to simulate the design event and, furthermore, to derive mitigation measures. The most effective protection against debris was therefore implemented: reducing the flow's high energy level to a lower level through controlled energy dissipation, combined with a debris/bedload deposition area. Expert opinion, the study of historical data, and field work are, in addition to numerical simulation techniques, essential for work in the field of natural hazard management.
Geotechnical applications of LiDAR pertaining to geomechanical evaluation and hazard identification
NASA Astrophysics Data System (ADS)
Lato, Matthew J.
Natural hazards related to ground movement that directly affect the safety of motorists and highway infrastructure include, but are not limited to, rockfalls, rockslides, debris flows, and landslides. This thesis specifically deals with the evaluation of rockfall hazards through the analysis of LiDAR data. Light Detection And Ranging (LiDAR) is an imaging technology that can be used to delineate and evaluate geomechanically-controlled hazards. LiDAR has been adopted to conduct hazard evaluations pertaining to rockfall, rock-avalanches, debris flows, and landslides. Characteristics of LiDAR surveying, such as rapid data acquisition rates, mobile data collection, and high data densities, pose problems to traditional CAD or GIS-based mapping methods. New analysis methods, including tools specifically oriented to geomechanical analyses, are needed. The research completed in this thesis supports development of new methods, including improved survey techniques, innovative software workflows, and processing algorithms to aid in the detection and evaluation of geomechanically controlled rockfall hazards. The scientific research conducted between 2006 and 2010, as presented in this thesis, is divided into five chapters, each of which has been published by or is under review by an international journal. The five research foci are: (i) geomechanical feature extraction and analysis using LiDAR data in active mining environments; (ii) engineered monitoring of rockfall hazards along transportation corridors using mobile terrestrial LiDAR; (iii) optimization of LiDAR scanning and processing for automated structural evaluation of discontinuities in rockmasses; (iv) location orientation bias when using static LiDAR data for geomechanical analysis; and (v) evaluating roadside rockmasses for rockfall hazards from LiDAR data: optimizing data collection and processing protocols. The research conducted pertaining to this thesis has direct and significant implications with respect to numerous engineering projects that are affected by geomechanical stability issues. The ability to efficiently and accurately map discontinuities, detect changes, and standardize roadside geomechanical stability analyses from remote locations will fundamentally change the state-of-practice of geotechnical investigation workflows and repeatable monitoring. This, in turn, will lead to earlier detection and definition of potential zones of instability, will allow for progressive monitoring and risk analysis, and will indicate the need for pro-active slope improvement and stabilization.
Wake-Vortex Hazards During Cruise
NASA Technical Reports Server (NTRS)
Rossow, Vernon J.; James, Kevin D.; Nixon, David (Technical Monitor)
1998-01-01
Even though the hazard posed by lift-generated wakes of subsonic transport aircraft has been studied extensively for approach and departure at airports, only a small amount of effort has gone into the potential hazard at cruise altitude. This paper reports on a study of the wake-vortex hazard during cruise because encounters may become more prevalent when free-flight becomes available and each aircraft is free to choose its own route between destinations. In order to address the problem, the various fluid-dynamic stages that vortex wakes usually go through as they age will be described along with estimates of the potential hazard that each stage poses. It appears that a rolling-moment hazard can be just as severe at cruise as for approach at airports, but it only persists for several minutes. However, the hazard posed by the downwash in the wake due to the lift on the generator aircraft persists for tens of minutes in a long narrow region behind the generating aircraft. The hazard consists of severe vertical loads when an encountering aircraft crosses the wake. A technique for avoiding vortex wakes at cruise altitude will be described. To date, the hazard posed by lift-generated vortex wakes and their persistence at cruise altitudes has been identified and subdivided into several tasks. Analyses of the loads to be encountered are underway and should be completed shortly. A review of published literature on the subject has been nearly completed (see text) and photographs of vortex wakes at cruise altitudes have been taken and the various stages of decay have been identified. It remains to study and sort the photographs for those that best illustrate the various stages of decay after they are shed by subsonic transport aircraft at cruise altitudes. The present status of the analysis and the paper are described.
Wind shear and turbulence around airports
NASA Technical Reports Server (NTRS)
Lewellen, W. S.; Williamson, G. G.
1976-01-01
A two-part study was conducted to determine the feasibility of predicting the conditions under which wind/turbulence environments hazardous to aviation operations exist. The computer model used to solve the velocity, temperature, and turbulence distributions in the atmospheric boundary layer is described, and the results of a parametric analysis to determine the expected range of wind shear and turbulence to be encountered in the vicinity of airports are given. The second part describes the delineation of an ensemble of aircraft accidents in which low level wind shear and/or turbulence appeared to be causative factors. This set of accidents, encompassing a wide range of meteorological conditions, should prove useful in developing techniques for reconstructing hazardous wind environments for aircraft safety investigation purposes.
Safety assessment for EPS electron-proton spectrometer
NASA Technical Reports Server (NTRS)
Gleeson, P.
1971-01-01
A safety analysis was conducted to identify the efforts required to assure relatively hazard-free operation of the EPS and to meet the safety requirements of the program. Safety engineering criteria, principles, and techniques in applicable disciplines are stressed in the performance of the system and subsystem studies; in test planning; in the design, development, test, evaluation, and checkout of the equipment; and in the operating procedures for the EPS program.
[Hygienic and ergonomic analysis of the technology for sinking main and subsidiary mine shafts].
Meniaĭlo, N I; Tyshlek, E G; Gritsenko, V S; Shemiakin, G M
1989-01-01
The labour conditions in mine shafts do not correspond to the existing ergonomic and hygienic norms. Drilling and blasting techniques are most hazardous as to the gravity and duration of the factors involved. Working conditions normalization should be based on the elaboration of specifically innovative technologies which should envisage the workers' periodic staying in the mine shaft area during the work shift.
Yan, Fang; Xu, Kaili; Li, Deshun; Cui, Zhikai
2017-01-01
Biomass gasification stations face many hazard factors; therefore, it is necessary to perform hazard assessments for them. In this study, a novel hazard assessment method called extended set pair analysis (ESPA) is proposed based on set pair analysis (SPA). However, calculating the connection degree (CD) with SPA requires classifying hazard grades and their corresponding thresholds for the hazard assessment. For hazard assessment using ESPA, a novel calculation algorithm for the CD is worked out for the case where hazard grades and their corresponding thresholds are unknown. The CD can then be converted into a Euclidean distance (ED) by a simple and concise calculation, and the hazard of each sample is ranked based on the value of ED. In this paper, six biomass gasification stations are introduced for hazard assessment using ESPA and general set pair analysis (GSPA), respectively. Comparison of the hazard assessment results obtained from ESPA and GSPA demonstrates the availability and validity of ESPA in hazard assessment for biomass gasification stations. Meanwhile, the reasonableness of ESPA is also justified by a sensitivity analysis of the hazard assessment results obtained by ESPA and GSPA. PMID:28938011
NASA Astrophysics Data System (ADS)
Pinar, Anthony; Havens, Timothy C.; Rice, Joseph; Masarik, Matthew; Burns, Joseph; Thelen, Brian
2016-05-01
Explosive hazards are a deadly threat in modern conflicts; hence, detecting them before they cause injury or death is of paramount importance. One method of buried explosive hazard discovery relies on data collected from ground penetrating radar (GPR) sensors. Threat detection with downward looking GPR is challenging due to large returns from non-target objects and clutter. This leads to a large number of false alarms (FAs), and since the responses of clutter and targets can form very similar signatures, classifier design is not trivial. One approach to combat these issues uses robust principal component analysis (RPCA) to enhance target signatures while suppressing clutter and background responses, though there are many versions of RPCA. This work applies some of these RPCA techniques to GPR sensor data and evaluates their merit using the peak signal-to-clutter ratio (SCR) of the RPCA-processed B-scans. Experimental results on government furnished data show that while some of the RPCA methods yield similar results, there are indeed some methods that outperform others. Furthermore, we show that the computation time required by the different RPCA methods varies widely, and the selection of tuning parameters in the RPCA algorithms has a major effect on the peak SCR.
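One widely used RPCA formulation, principal component pursuit solved with an inexact augmented Lagrangian scheme, separates a B-scan into a low-rank clutter/background component and a sparse target component. The sketch below implements that generic formulation on a synthetic scan and computes a crude peak signal-to-clutter ratio; it is not one of the specific variants, nor the government-furnished data, evaluated in the paper.

```python
# Compact robust PCA (principal component pursuit) via an inexact augmented
# Lagrangian scheme, applied to a synthetic "B-scan". Generic illustration only.
import numpy as np

def rpca(M, lam=None, mu=None, tol=1e-7, max_iter=500):
    """Decompose M into low-rank L (background/clutter) + sparse S (targets)."""
    m, n = M.shape
    lam = lam or 1.0 / np.sqrt(max(m, n))
    mu = mu or 0.25 * m * n / np.abs(M).sum()
    Y = M / max(np.linalg.norm(M, 2), np.abs(M).max() / lam)
    L = np.zeros_like(M)
    S = np.zeros_like(M)
    for _ in range(max_iter):
        # Singular-value thresholding step for the low-rank component.
        U, sig, Vt = np.linalg.svd(M - S + Y / mu, full_matrices=False)
        L = (U * np.maximum(sig - 1.0 / mu, 0.0)) @ Vt
        # Soft-thresholding (shrinkage) step for the sparse component.
        R = M - L + Y / mu
        S = np.sign(R) * np.maximum(np.abs(R) - lam / mu, 0.0)
        Y = Y + mu * (M - L - S)
        if np.linalg.norm(M - L - S, "fro") < tol * np.linalg.norm(M, "fro"):
            break
    return L, S

# Synthetic B-scan: smooth background "clutter" plus one localized "target".
x = np.linspace(0, 1, 120)
bscan = np.outer(np.sin(4 * x), np.cos(3 * x))
bscan += 0.02 * np.random.default_rng(5).normal(size=(120, 120))
bscan[60:64, 70:74] += 1.5

L_bg, S_target = rpca(bscan)
scr = np.abs(S_target).max() / (np.abs(L_bg).mean() + 1e-12)   # crude peak SCR
print(f"peak signal-to-clutter ratio of sparse component: {scr:.1f}")
```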
Reliability studies of Integrated Modular Engine system designs
NASA Technical Reports Server (NTRS)
Hardy, Terry L.; Rapp, Douglas C.
1993-01-01
A study was performed to evaluate the reliability of Integrated Modular Engine (IME) concepts. Comparisons were made between networked IME systems and non-networked discrete systems using expander cycle configurations. Both redundant and non-redundant systems were analyzed. Binomial approximation and Markov analysis techniques were employed to evaluate total system reliability. In addition, Failure Modes and Effects Analyses (FMEA), Preliminary Hazard Analyses (PHA), and Fault Tree Analysis (FTA) were performed to allow detailed evaluation of the IME concept. A discussion of these system reliability concepts is also presented.
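The binomial approximation mentioned above treats a redundant cluster as succeeding when at least k of n identical, independent modules work. A sketch of that calculation is below; the module reliability and cluster sizes are invented, not values from the IME study.

```python
# Binomial-approximation sketch of redundant-system reliability: the cluster
# succeeds if at least k of n independent modules work. Numbers are invented.
from math import comb

def k_of_n_reliability(k: int, n: int, r: float) -> float:
    """P(at least k of n independent modules succeed), each with reliability r."""
    return sum(comb(n, i) * r**i * (1 - r)**(n - i) for i in range(k, n + 1))

r_module = 0.98
print("non-redundant (4 of 4):", round(k_of_n_reliability(4, 4, r_module), 5))
print("one-engine-out (4 of 5):", round(k_of_n_reliability(4, 5, r_module), 5))
```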
System safety in Stirling engine development
NASA Technical Reports Server (NTRS)
Bankaitis, H.
1981-01-01
The DOE/NASA Stirling Engine Project Office has required that contractors make safety considerations an integral part of all phases of the Stirling engine development program. As an integral part of each engine design subtask, analyses are evolved to determine possible modes of failure. The accepted system safety analysis techniques (Fault Tree, FMEA, Hazards Analysis, etc.) are applied in various degrees of extent at the system, subsystem and component levels. The primary objectives are to identify critical failure areas, to enable removal of susceptibility to such failures or their effects from the system and to minimize risk.
14 CFR 437.55 - Hazard analysis.
Code of Federal Regulations, 2013 CFR
2013-01-01
... 14 Aeronautics and Space 4 2013-01-01 2013-01-01 false Hazard analysis. 437.55 Section 437.55... TRANSPORTATION LICENSING EXPERIMENTAL PERMITS Safety Requirements § 437.55 Hazard analysis. (a) A permittee must... safety of property resulting from each permitted flight. This hazard analysis must— (1) Identify and...
14 CFR 437.55 - Hazard analysis.
Code of Federal Regulations, 2012 CFR
2012-01-01
... 14 Aeronautics and Space 4 2012-01-01 2012-01-01 false Hazard analysis. 437.55 Section 437.55... TRANSPORTATION LICENSING EXPERIMENTAL PERMITS Safety Requirements § 437.55 Hazard analysis. (a) A permittee must... safety of property resulting from each permitted flight. This hazard analysis must— (1) Identify and...
Hydrotreater/Distillation Column Hazard Analysis Report Rev. 2
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lowry, Peter P.; Wagner, Katie A.
This project Hazard and Risk Analysis Report contains the results of several hazard analyses and risk assessments. An initial assessment was conducted in 2012, which included a multi-step approach ranging from design reviews to a formal What-If hazard analysis. A second What-If hazard analysis was completed during February 2013 to evaluate the operation of the hydrotreater/distillation column processes to be installed in a process enclosure within the Process Development Laboratory West (PDL-West) facility located on the PNNL campus. The qualitative analysis included participation of project and operations personnel and applicable subject matter experts. The analysis identified potential hazardous scenarios, each based on an initiating event coupled with a postulated upset condition. The unmitigated consequences of each hazardous scenario were generally characterized as a process upset; the exposure of personnel to steam, vapors or hazardous material; a spray or spill of hazardous material; the creation of a flammable atmosphere; or an energetic release from a pressure boundary.
Compiler-assisted multiple instruction rollback recovery using a read buffer
NASA Technical Reports Server (NTRS)
Alewine, N. J.; Chen, S.-K.; Fuchs, W. K.; Hwu, W.-M.
1993-01-01
Multiple instruction rollback (MIR) is a technique that has been implemented in mainframe computers to provide rapid recovery from transient processor failures. Hardware-based MIR designs eliminate rollback data hazards by providing data redundancy implemented in hardware. Compiler-based MIR designs have also been developed which remove rollback data hazards directly with data-flow transformations. This paper focuses on compiler-assisted techniques to achieve multiple instruction rollback recovery. We observe that some data hazards resulting from instruction rollback can be resolved efficiently by providing an operand read buffer while others are resolved more efficiently with compiler transformations. A compiler-assisted multiple instruction rollback scheme is developed which combines hardware-implemented data redundancy with compiler-driven hazard removal transformations. Experimental performance evaluations indicate improved efficiency over previous hardware-based and compiler-based schemes.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Riddle, F. J.
2003-06-26
The Automated Hazard Analysis (AHA) application is a software tool used to conduct job hazard screening and analysis of tasks to be performed in Savannah River Site facilities. The AHA application provides a systematic approach to the assessment of safety and environmental hazards associated with specific tasks, and the identification of controls, regulations, and other requirements needed to perform those tasks safely. AHA is to be integrated into existing Savannah River Site work control and job hazard analysis processes. Utilization of AHA will improve the consistency and completeness of hazard screening and analysis, and increase the effectiveness of the work planning process.
Hazard Analysis for Building 34 Vacuum Glove Box Assembly
NASA Technical Reports Server (NTRS)
Meginnis, Ian
2014-01-01
One of the characteristics of an effective safety program is the recognition and control of hazards before mishaps or failures occur. Conducting potentially hazardous tests necessitates a thorough hazard analysis in order to prevent injury to personnel, and to prevent damage to facilities and equipment. The primary purpose of this hazard analysis is to define and address the potential hazards and controls associated with the Building 34 Vacuum Glove Box Assembly, and to provide the applicable team of personnel with the documented results. It is imperative that each member of the team be familiar with the hazards and controls associated with his/her particular tasks, assignments and activities while interfacing with facility test systems, equipment and hardware. In fulfillment of the stated purposes, the goal of this hazard analysis is to identify all hazards that have the potential to harm personnel, damage the facility or its test systems or equipment, test articles, Government or personal property, or the environment. This analysis may also assess the significance and risk, when applicable, of lost test objectives when substantial monetary value is involved. The hazards, causes, controls, verifications, and risk assessment codes have been documented on the hazard analysis work sheets in Appendix A of this document. The preparation and development of this report is in accordance with JPR 1700.1, "JSC Safety and Health Handbook" and JSC 17773 Rev D "Instructions for Preparation of Hazard Analysis for JSC Ground Operations".
Landslide hazard assessment: recent trends and techniques.
Pardeshi, Sudhakar D; Autade, Sumant E; Pardeshi, Suchitra S
2013-01-01
Landslide hazard assessment is an important step towards landslide hazard and risk management. There are several methods of Landslide Hazard Zonation (LHZ), viz. heuristic, semi-quantitative, quantitative, probabilistic and multi-criteria decision-making processes. However, no single method is universally accepted for effective assessment of landslide hazards. In recent years, several attempts have been made to apply different methods of LHZ and to compare results in order to find the best-suited model. This paper presents a review of research on landslide hazard mapping published in recent years. Advanced multivariate techniques have proved effective in the spatial prediction of landslides with a high degree of accuracy. Physical process-based models also perform well in LHZ mapping, even in areas with a poor database. Multi-criteria decision-making approaches also play a significant role in determining the relative importance of landslide causative factors in the slope instability process. Remote Sensing and Geographical Information Systems (GIS) are powerful tools for assessing landslide hazards and have been used extensively in landslide research over the last decade. Aerial photographs and high-resolution satellite data are useful in detecting, mapping and monitoring landslide processes. GIS-based LHZ models help not only to map and monitor landslides but also to predict future slope failures. Advancements in geospatial technologies have opened the door to detailed and accurate assessment of landslide hazards.
Cytometric analysis of shape and DNA content in mammalian sperm
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gledhill, B.L.
1983-10-10
Male germ cells respond dramatically to a variety of insults and are important reproductive dosimeters. Semen analyses are very useful in studies on the effects of drugs, chemicals, and environmental hazards on testicular function, male fertility and heritable germinal mutations. Sperm were analyzed by flow cytometry and slit-scan flow analysis for injury following the exposure of testes to mutagens. The utility of flow cytometry in genotoxin screening and monitoring of occupational exposure was evaluated. The technique proved valuable in the separation of X- and Y-chromosome-bearing sperm, and its potential applicability to artificial insemination and to accurately assessing the DNA content of sperm was evaluated, with reference to the determination of X- and Y-chromosome-bearing sperm.
Mitigation of EMU Cut Glove Hazard from Micrometeoroid and Orbital Debris Impacts on ISS Handrails
NASA Technical Reports Server (NTRS)
Ryan, Shannon; Christiansen, Eric L.; Davis, Bruce A.; Ordonez, Erick
2009-01-01
Recent cut damage sustained on crewmember gloves during extravehicular activity (EVA) onboard the International Space Station (ISS) has been caused by contact with sharp edges or a pinch point, according to analysis of the damage. One potential source is protruding, sharp-edged crater lips from micrometeoroid and orbital debris (MMOD) impacts on metallic handrails along EVA translation paths. A number of hypervelocity impact tests performed on ISS handrails found that mm-sized projectiles were capable of inducing crater lip heights two orders of magnitude above the minimum value of concern for glove abrasion. Two techniques were evaluated for mitigating the cut glove hazard of MMOD impacts on ISS handrails: flexible overwraps, which act to limit contact between crewmember gloves and impact sites; and alternate materials, which form less hazardous impact crater profiles. In parallel with redesign efforts to increase the cut resilience of EMU gloves, the modifications to ISS handrails evaluated in this study provide the means to significantly reduce the cut glove risk from MMOD impact craters.
Polynuclear aromatic hydrocarbon analysis using the synchronous scanning luminoscope
NASA Astrophysics Data System (ADS)
Hyfantis, George J., Jr.; Teglas, Matthew S.; Wilbourn, Robert G.
2001-02-01
The Synchronous Scanning Luminoscope (SSL) is a field-portable, synchronous luminescence spectrofluorometer that was developed for on-site analysis of contaminated soil and ground water. The SSL is capable of quantitative analysis of total polynuclear aromatic hydrocarbons (PAHs) using phosphorescence and fluorescence techniques with a high correlation to laboratory data as illustrated by this study. The SSL is also capable of generating benzo(a)pyrene equivalency results, based on seven carcinogenic PAHs and Navy risk numbers, with a high correlation to laboratory data as illustrated by this study. These techniques allow rapid field assessments of total PAHs and benzo(a)pyrene equivalent concentrations. The Luminoscope is capable of detecting total PAHs to the parts-per-billion range. This paper describes standard field methods for using the SSL and describes the results of field/laboratory testing of PAHs. SSL results from two different hazardous waste sites are discussed.
NASA Astrophysics Data System (ADS)
Odbert, Henry; Aspinall, Willy
2014-05-01
Evidence-based hazard assessment at volcanoes assimilates knowledge about the physical processes of hazardous phenomena and observations that indicate the current state of a volcano. Incorporating both these lines of evidence can inform our belief about the likelihood (probability) and consequences (impact) of possible hazardous scenarios, forming a basis for formal quantitative hazard assessment. However, such evidence is often uncertain, indirect or incomplete. Approaches to volcano monitoring have advanced substantially in recent decades, increasing the variety and resolution of multi-parameter timeseries data recorded at volcanoes. Interpreting these multiple strands of parallel, partial evidence thus becomes increasingly complex. In practice, interpreting many timeseries requires an individual to be familiar with the idiosyncrasies of the volcano, monitoring techniques, configuration of recording instruments, observations from other datasets, and so on. In making such interpretations, an individual must consider how different volcanic processes may manifest as measurable observations, and then infer from the available data what can or cannot be deduced about those processes. We examine how parts of this process may be synthesised algorithmically using Bayesian inference. Bayesian Belief Networks (BBNs) use probability theory to treat and evaluate uncertainties in a rational and auditable scientific manner, but only to the extent warranted by the strength of the available evidence. The concept is a suitable framework for marshalling multiple strands of evidence (e.g. observations, model results and interpretations) and their associated uncertainties in a methodical manner. BBNs are usually implemented in graphical form and could be developed as a tool for near real-time, ongoing use in a volcano observatory, for example. We explore the application of BBNs in analysing volcanic data from the long-lived eruption at Soufriere Hills Volcano, Montserrat. We discuss the uncertainty of inferences, and how our method provides a route to formal propagation of uncertainties in hazard models. Such approaches provide an attractive route to developing an interface between volcano monitoring analyses and probabilistic hazard scenario analysis. We discuss the use of BBNs in hazard analysis as a tractable and traceable tool for fast, rational assimilation of complex, multi-parameter data sets in the context of timely volcanic crisis decision support.
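A minimal numerical sketch of the kind of inference a BBN performs is given below: a single latent "escalating unrest" node with two monitoring observables, evaluated by direct enumeration. The node names and all conditional probabilities are invented for illustration; an operational network for Soufriere Hills would have many more nodes and empirically elicited probability tables.

# Hypothetical two-observation belief network: latent unrest node with
# seismicity-anomaly and deformation-anomaly children.
p_unrest = 0.10                       # prior belief in escalating unrest
p_seis = {True: 0.80, False: 0.15}    # P(seismicity anomaly | unrest state)
p_defm = {True: 0.60, False: 0.05}    # P(deformation anomaly | unrest state)

def posterior_unrest(seis_obs, defm_obs):
    # P(unrest | evidence) by enumerating the two states of the latent node.
    def likelihood(unrest):
        ps = p_seis[unrest] if seis_obs else 1.0 - p_seis[unrest]
        pd = p_defm[unrest] if defm_obs else 1.0 - p_defm[unrest]
        return ps * pd
    num = p_unrest * likelihood(True)
    den = num + (1.0 - p_unrest) * likelihood(False)
    return num / den

print(posterior_unrest(True, True))    # concordant evidence: belief rises sharply
print(posterior_unrest(True, False))   # conflicting evidence: much weaker update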
NASA Astrophysics Data System (ADS)
Odbert, Henry; Hincks, Thea; Aspinall, Willy
2015-04-01
Volcanic hazard assessments must combine information about the physical processes of hazardous phenomena with observations that indicate the current state of a volcano. Incorporating both these lines of evidence can inform our belief about the likelihood (probability) and consequences (impact) of possible hazardous scenarios, forming a basis for formal quantitative hazard assessment. However, such evidence is often uncertain, indirect or incomplete. Approaches to volcano monitoring have advanced substantially in recent decades, increasing the variety and resolution of multi-parameter timeseries data recorded at volcanoes. Interpreting these multiple strands of parallel, partial evidence thus becomes increasingly complex. In practice, interpreting many timeseries requires an individual to be familiar with the idiosyncrasies of the volcano, monitoring techniques, configuration of recording instruments, observations from other datasets, and so on. In making such interpretations, an individual must consider how different volcanic processes may manifest as measurable observations, and then infer from the available data what can or cannot be deduced about those processes. We examine how parts of this process may be synthesised algorithmically using Bayesian inference. Bayesian Belief Networks (BBNs) use probability theory to treat and evaluate uncertainties in a rational and auditable scientific manner, but only to the extent warranted by the strength of the available evidence. The concept is a suitable framework for marshalling multiple strands of evidence (e.g. observations, model results and interpretations) and their associated uncertainties in a methodical manner. BBNs are usually implemented in graphical form and could be developed as a tool for near real-time, ongoing use in a volcano observatory, for example. We explore the application of BBNs in analysing volcanic data from the long-lived eruption at Soufriere Hills Volcano, Montserrat. We show how our method provides a route to formal propagation of uncertainties in hazard models. Such approaches provide an attractive route to developing an interface between volcano monitoring analyses and probabilistic hazard scenario analysis. We discuss the use of BBNs in hazard analysis as a tractable and traceable tool for fast, rational assimilation of complex, multi-parameter data sets in the context of timely volcanic crisis decision support.
NASA Astrophysics Data System (ADS)
Lan, Hengxing; Derek Martin, C.; Lim, C. H.
2007-02-01
Geographic information system (GIS) modeling is used in combination with three-dimensional (3D) rockfall process modeling to assess rockfall hazards. A GIS extension, RockFall Analyst (RA), which is capable of effectively handling large amounts of geospatial information relative to rockfall behaviors, has been developed in ArcGIS using ArcObjects and C#. The 3D rockfall model considers dynamic processes on a cell plane basis. It uses inputs of distributed parameters in terms of raster and polygon features created in GIS. Two major components are included in RA: particle-based rockfall process modeling and geostatistics-based rockfall raster modeling. Rockfall process simulation results, 3D rockfall trajectories and their velocity features either for point seeders or polyline seeders are stored in 3D shape files. Distributed raster modeling, based on 3D rockfall trajectories and a spatial geostatistical technique, represents the distribution of spatial frequency, the flying and/or bouncing height, and the kinetic energy of falling rocks. A distribution of rockfall hazard can be created by taking these rockfall characteristics into account. A barrier analysis tool is also provided in RA to aid barrier design. An application of these modeling techniques to a case study is provided. The RA has been tested in ArcGIS 8.2, 8.3, 9.0 and 9.1.
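The particle-based rockfall process component can be pictured with the toy 2D lumped-mass simulation below: ballistic flight between impacts, with normal and tangential restitution coefficients applied at each contact with the slope. This is only a sketch of the general technique; RockFall Analyst works on 3D raster terrain with spatially distributed parameters, and the slope geometry, restitution values and stopping criterion used here are arbitrary assumptions.

import numpy as np

def simulate_rockfall(x0, z0, slope, rn=0.35, rt=0.85, dt=0.01, g=9.81, vmin=0.5):
    # Toy 2D trajectory of a falling block on a surface z = slope(x).
    # rn, rt: normal/tangential restitution coefficients (terrain dependent).
    x, z, vx, vz = x0, z0, 0.0, 0.0
    path = [(x, z)]
    for _ in range(50000):
        vz -= g * dt                            # ballistic flight step
        x, z = x + vx * dt, z + vz * dt
        zs = slope(x)
        if z <= zs:                             # impact with the slope surface
            z = zs
            dzdx = (slope(x + 1e-3) - slope(x - 1e-3)) / 2e-3
            t = np.array([1.0, dzdx]) / np.hypot(1.0, dzdx)   # tangent unit vector
            n = np.array([-t[1], t[0]])                        # normal unit vector
            v = np.array([vx, vz])
            v = rt * (v @ t) * t - rn * (v @ n) * n            # damped rebound
            vx, vz = v
            if np.hypot(vx, vz) < vmin:         # block comes to rest
                break
        path.append((x, z))
    return np.array(path)

# Block released 2 m above a planar 30-degree slope (illustrative geometry only)
traj = simulate_rockfall(0.0, 2.0, slope=lambda x: -np.tan(np.radians(30.0)) * x)
print("approximate stopping point:", traj[-1])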
Directional analysis and filtering for dust storm detection in NOAA-AVHRR imagery
NASA Astrophysics Data System (ADS)
Janugani, S.; Jayaram, V.; Cabrera, S. D.; Rosiles, J. G.; Gill, T. E.; Rivera Rivera, N.
2009-05-01
In this paper, we propose spatio-spectral processing techniques for the detection of dust storms and for automatically finding their transport direction in 5-band NOAA-AVHRR imagery. Previous methods that use simple band math analysis have produced promising results but have drawbacks in producing consistent results when low signal-to-noise ratio (SNR) images are used. Moreover, in seeking to automate dust storm detection, the presence of clouds in the vicinity of the dust storm creates a challenge in being able to distinguish these two types of image texture. This paper not only addresses the detection of the dust storm in the imagery, it also attempts to find the transport direction and the location of the sources of the dust storm. We propose a spatio-spectral processing approach with two components: visualization and automation. Both approaches are based on digital image processing techniques, including directional analysis and filtering. The visualization technique is intended to enhance the image in order to locate the dust sources. The automation technique is proposed to detect the transport direction of the dust storm. These techniques can be used in a system to provide timely warnings of dust storms or hazard assessments for transportation, aviation, environmental safety, and public health.
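One simple way to picture the directional-analysis step is the gradient-orientation sketch below: a magnitude-weighted orientation histogram over the scene (or a dust mask) whose peak indicates the dominant streak orientation, a proxy for the plume axis. This is only an illustrative stand-in for the paper's filtering scheme; the synthetic streak and the 36-bin histogram are assumptions.

import numpy as np

def dominant_orientation(band, mask=None):
    # Gradient-magnitude-weighted orientation histogram; returns the modal
    # edge orientation in degrees (0-180), i.e. perpendicular to the gradients.
    gy, gx = np.gradient(band.astype(float))
    mag = np.hypot(gx, gy)
    ang = (np.degrees(np.arctan2(gy, gx)) + 90.0) % 180.0
    if mask is not None:
        mag, ang = mag[mask], ang[mask]
    hist, edges = np.histogram(ang, bins=36, range=(0.0, 180.0), weights=mag)
    k = int(np.argmax(hist))
    return 0.5 * (edges[k] + edges[k + 1])

# Toy scene: an oblique bright streak standing in for a dust front
xx, yy = np.meshgrid(np.arange(200), np.arange(200))
img = np.where(np.abs(yy - 0.6 * xx - 20.0) < 4.0, 1.0, 0.0)
print(dominant_orientation(img), "degrees (about 31 expected for this streak)")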
Federal Register 2010, 2011, 2012, 2013, 2014
2013-11-20
... Manufacturing Practice and Hazard Analysis and Risk- Based Preventive Controls for Human Food; Extension of... Hazard Analysis and Risk- Based Preventive Controls for Human Food'' and its information collection... Manufacturing Practice and Hazard Analysis and Risk-Based Preventive Controls for Human Food.'' IV. How To...
NASA Astrophysics Data System (ADS)
Owen, S. E.; Yun, S. H.; Hua, H.; Agram, P. S.; Liu, Z.; Sacco, G. F.; Manipon, G.; Linick, J. P.; Fielding, E. J.; Lundgren, P.; Farr, T. G.; Webb, F.; Rosen, P. A.; Simons, M.
2017-12-01
The Advanced Rapid Imaging and Analysis (ARIA) project for Natural Hazards is focused on rapidly generating high-level geodetic imaging products and placing them in the hands of the solid earth science and local, national, and international natural hazard communities by providing science product generation, exploration, and delivery capabilities at an operational level. Space-based geodetic measurement techniques including Interferometric Synthetic Aperture Radar (InSAR), differential Global Positioning System, and SAR-based change detection have become critical additions to our toolset for understanding and mapping the damage and deformation caused by earthquakes, volcanic eruptions, floods, landslides, and groundwater extraction. Up until recently, processing of these data sets has been handcrafted for each study or event and has not generated products rapidly and reliably enough for response to natural disasters or for timely analysis of large data sets. The ARIA project, a joint venture co-sponsored by the California Institute of Technology and by NASA through the Jet Propulsion Laboratory, has been capturing the knowledge applied to these responses and building it into an automated infrastructure to generate imaging products in near real-time that can improve situational awareness for disaster response. In addition to supporting the growing science and hazard response communities, the ARIA project has developed the capabilities to provide automated imaging and analysis capabilities necessary to keep up with the influx of raw SAR data from geodetic imaging missions such as ESA's Sentinel-1A/B, now operating with repeat intervals as short as 6 days, and the upcoming NASA NISAR mission. We will present the progress and results we have made on automating the analysis of Sentinel-1A/B SAR data for hazard monitoring and response, with emphasis on recent developments and end user engagement in flood extent mapping and deformation time series for both volcano monitoring and mapping of groundwater-related subsidence.
Developing a safe on-orbit cryogenic depot
NASA Technical Reports Server (NTRS)
Bahr, Nicholas J.
1992-01-01
New U.S. space initiatives will require technology to realize planned programs such as piloted lunar and Mars missions. Key to the optimal execution of such missions are high performance orbit transfer vehicles and propellant storage facilities. Large amounts of liquid hydrogen and oxygen demand a uniquely designed on-orbit cryogenic propellant depot. Because of the inherent dangers in propellant storage and handling, a comprehensive system safety program must be established. This paper shows how the myriad and complex hazards demonstrate the need for an integrated safety effort to be applied from program conception through operational use. Even though the cryogenic depot is still in the conceptual stage, many of the hazards have been identified, including fatigue due to heavy thermal loading from environmental and operating temperature extremes, micrometeoroid and/or depot ancillary equipment impact (this is an important problem due to the large surface area needed to house the large quantities of propellant), docking and maintenance hazards, and hazards associated with extended extravehicular activity. Various safety analysis techniques were presented for each program phase. Specific system safety implementation steps were also listed. Enhanced risk assessment was demonstrated through the incorporation of these methods.
Spatial analysis of hazardous waste data using geostatistics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zirschky, J.H.
1984-01-01
The objective of this investigation was to determine if geostatistics could be a useful tool for evaluating hazardous waste sites. Three sites contaminated by dioxin (2,3,7,8-tetrachlorodibenzo-p-dioxin (TCDD)) were investigated. The first site evaluated was a creek into which TCDD-contaminated soil had eroded. The second site was a town in which TCDD-contaminated wastes had been sprayed onto the streets. Finally, the third site was a highway of which the shoulders were contaminated by dust deposition from a nearby hazardous waste site. The distribution of TCDD at the first and third sites was investigated using kriging, an optimal estimation technique. By using kriging, the areas of both sites requiring cleanup were successfully identified. At the second site, the town, satisfactory results were not obtained. The distribution of contamination in this town is believed to be very heterogeneous; thus, reasonable estimates could not be obtained. Additional sampling was therefore recommended at this site. Based upon this research, geostatistics appears to be a very useful tool for evaluating a hazardous waste site if the distribution of contaminants at the site is homogeneous, or can be divided into homogeneous areas.
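For readers unfamiliar with kriging, the sketch below shows the core of an ordinary-kriging estimate at one unsampled location: a variogram model, the kriging system, and the resulting weights. The exponential variogram, its parameters and the sample values are invented for illustration; in the study the variogram would be fitted to the measured TCDD data.

import numpy as np

def exp_variogram(h, nugget=0.0, sill=1.0, a=50.0):
    # Exponential variogram model gamma(h); a is the practical range.
    return nugget + (sill - nugget) * (1.0 - np.exp(-3.0 * h / a))

def ordinary_kriging(xy, z, xy0, **vparams):
    # Ordinary kriging estimate and kriging variance at location xy0.
    n = len(z)
    d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = exp_variogram(d, **vparams)     # gamma between sample pairs
    A[n, n] = 0.0                               # Lagrange-multiplier row/column
    b = np.ones(n + 1)
    b[:n] = exp_variogram(np.linalg.norm(xy - xy0, axis=1), **vparams)
    w = np.linalg.solve(A, b)                   # weights plus multiplier
    return w[:n] @ z, w @ b                     # estimate, kriging variance

# Made-up sample coordinates (m) and TCDD concentrations (ppb)
xy = np.array([[0, 0], [40, 10], [10, 60], [70, 70], [90, 20]], dtype=float)
z = np.array([1.2, 0.8, 2.5, 0.3, 0.6])
print(ordinary_kriging(xy, z, np.array([30.0, 30.0]), sill=1.0, a=60.0))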
Identification and delineation of areas flood hazard using high accuracy of DEM data
NASA Astrophysics Data System (ADS)
Riadi, B.; Barus, B.; Widiatmaka; Yanuar, M. J. P.; Pramudya, B.
2018-05-01
Flood incidents that often occur in Karawang regency need to be mitigated. Expectations rest on technologies that can predict, anticipate and reduce disaster risks. Flood modeling techniques using Digital Elevation Model (DEM) data can be applied in mitigation activities. High-accuracy DEM data used in the modeling will result in better flood models. Processing of high-accuracy DEM data yields information about surface morphology, which can be used to identify indications of flood hazard areas. The purpose of this study was to identify and describe flood hazard areas by identifying wetland areas using DEM data and Landsat-8 images. High-resolution TerraSAR-X data are used to detect wetlands in the landscape, while land cover is identified from Landsat image data. The Topographic Wetness Index (TWI) method is used to detect and identify wetland areas from the DEM data, while land cover is analysed using the Tasseled Cap Transformation (TCT) method. TWI modeling yields information about land with flood potential. Overlaying the TWI map with the land cover map shows that, in Karawang regency, the areas most vulnerable to flooding are rice fields. The spatial accuracy of the flood hazard area mapping in this study was 87%.
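The TWI step can be summarised by the usual formula TWI = ln(a / tan(beta)), with a the specific catchment area and beta the local slope; the short sketch below applies it to toy rasters. The grid values, cell size and wetness threshold are assumptions; in the study both inputs would be derived from the high-accuracy DEM.

import numpy as np

def topographic_wetness_index(flow_acc, slope_deg, cell_size=1.0):
    # TWI = ln(a / tan(beta)); flow_acc is an upslope cell count from a prior
    # flow-routing step, converted to specific catchment area per unit width.
    a = (flow_acc + 1.0) * cell_size
    tan_b = np.tan(np.radians(np.maximum(slope_deg, 0.1)))   # avoid tan(0)
    return np.log(a / tan_b)

flow_acc = np.array([[0, 1, 3], [0, 2, 9], [0, 1, 20]], dtype=float)
slope_deg = np.array([[8, 5, 2], [9, 4, 1], [10, 3, 0.5]], dtype=float)
twi = topographic_wetness_index(flow_acc, slope_deg, cell_size=5.0)
wet = twi > np.percentile(twi, 75)     # illustrative wetness threshold
print(np.round(twi, 2), wet, sep="\n")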
NASA Astrophysics Data System (ADS)
Genzano, Nicola; Filizzola, Carolina; Hattori, Katsumi; Lisi, Mariano; Paciello, Rossana; Pergola, Nicola; Tramutoli, Valerio
2017-04-01
In order to increase the reliability and precision of short-term seismic hazard assessment (and possibly earthquake forecasting), the integration of different kinds of observations (chemical, physical, biological, etc.) in a multi-parametric approach could be a useful strategy. Among the different observational methodologies, fluctuations of the Earth's thermally emitted radiation, measured by satellite sensors operating in the thermal infrared (TIR) spectral range, have been proposed since the eighties as a potential earthquake precursor. Since 2001, the general change detection approach Robust Satellite Techniques (RST), used in combination with the RETIRA (Robust Estimator of TIR Anomalies) index, has shown good ability to discriminate anomalous TIR signals possibly associated with seismic activity from the normal variability of the TIR signal due to other causes (e.g. meteorological). In this paper, the RST data analysis approach has been implemented on TIR satellite records collected over Japan by the geostationary satellite sensor MTSAT (Multifunctional Transport SATellites) in the period June 2005 - December 2015, in order to evaluate its possible contribution to an improved multi-parametric system for a time-Dependent Assessment of Seismic Hazard (t-DASH). For the first time, thermal anomalies have been identified by comparing the daily TIR radiation of each location of the considered satellite image portions with its historical expected value and variation range (i.e. RST reference fields) computed using a 30-day moving window (i.e. 15 days before and 15 days after the considered day of the year) instead of a fixed monthly window. Preliminary results of a correlation analysis between the appearance of Significant Sequences of TIR Anomalies (SSTAs) and the time, location and magnitude of earthquakes (M≥5), performed by applying predefined space-temporal and magnitude constraints, show that 80% of SSTAs were in an apparent space-time relation with earthquakes, with a false alarm rate of 20%.
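The anomaly index at the heart of the RST/RETIRA approach can be sketched as a pixel-wise standardised excess of the current TIR scene over its historical reference fields, as below. The toy data, the 2.5 threshold and the exact normalisation are assumptions made for illustration; the operational index is computed per calendar window on cloud-screened MTSAT imagery.

import numpy as np

def retira_index(tir_stack, current):
    # tir_stack: (N, H, W) historical co-located TIR scenes for the same
    # calendar window; current: (H, W) scene under test. Each scene is first
    # expressed as an excess over its own spatial mean, then standardised
    # pixel-wise by the historical mean and standard deviation of that excess.
    d_hist = tir_stack - tir_stack.mean(axis=(1, 2), keepdims=True)
    d_now = current - current.mean()
    mu = d_hist.mean(axis=0)
    sigma = d_hist.std(axis=0) + 1e-6
    return (d_now - mu) / sigma

rng = np.random.default_rng(0)
hist = rng.normal(290.0, 2.0, size=(30, 100, 100))   # synthetic reference scenes
test = rng.normal(290.0, 2.0, size=(100, 100))
test[40:43, 60:63] += 6.0                            # synthetic thermal anomaly
idx = retira_index(hist, test)
print(int((idx > 2.5).sum()), "pixels flagged (index > 2.5)")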
Assessing Surface Fuel Hazard in Coastal Conifer Forests through the Use of LiDAR Remote Sensing
NASA Astrophysics Data System (ADS)
Koulas, Christos
The research problem that this thesis seeks to examine is a method of predicting conventional fire hazards using data drawn from specific regions, namely the Sooke and Goldstream watershed regions in coastal British Columbia. This thesis investigates whether LiDAR data can be used to describe conventional forest stand fire hazard classes. Three objectives guided this thesis: to discuss the variables associated with fire hazard, specifically the distribution and makeup of fuel; to examine the relationship between derived LiDAR biometrics and forest attributes related to hazard assessment factors defined by the Capitol Regional District (CRD); and to assess the viability of the LiDAR biometric decision tree in the CRD based on current frameworks for use. The research method uses quantitative datasets to assess the optimal generalization of these types of fire hazard data through discriminant analysis. Findings illustrate significant LiDAR-derived data limitations, and reflect the literature in that flawed field application of data modelling techniques has led to a disconnect between the ways in which fire hazard models have been intended to be used by scholars and the ways in which they are used by those tasked with prevention of forest fires. It can be concluded that a significant trade-off exists between computational requirements for wildfire simulation models and the algorithms commonly used by field teams to apply these models with remote sensing data, and that CRD forest management practices would need to change to incorporate a decision tree model in order to decrease risk.
NASA Technical Reports Server (NTRS)
2012-01-01
One of the characteristics of an effective safety program is the recognition and control of hazards before mishaps or failures occur. Conducting potentially hazardous tests necessitates a thorough hazard analysis in order to protect our personnel from injury and our equipment from damage. The purpose of this hazard analysis is to define and address the potential hazards and controls associated with the Z1 Suit Port Test in Chamber B located in building 32, and to provide the applicable team of personnel with the documented results. It is imperative that each member of the team be familiar with the hazards and controls associated with his/her particular tasks, assignments, and activities while interfacing with facility test systems, equipment, and hardware. The goal of this hazard analysis is to identify all hazards that have the potential to harm personnel and/or damage facility equipment, flight hardware, property, or harm the environment. This analysis may also assess the significance and risk, when applicable, of lost test objectives when substantial monetary value is involved. The hazards, causes, controls, verifications, and risk assessment codes have been documented on the hazard analysis work sheets in appendix A of this document. The preparation and development of this report is in accordance with JPR 1700.1, JSC Safety and Health Handbook.
NASA Astrophysics Data System (ADS)
Chiessi, Vittorio; D'Orefice, Maurizio; Scarascia Mugnozza, Gabriele; Vitale, Valerio; Cannese, Christian
2010-07-01
This paper describes the results of a rockfall hazard assessment for the village of San Quirico (Abruzzo region, Italy) based on an engineering-geological model. After the collection of geological, geomechanical, and geomorphological data, the rockfall hazard assessment was performed based on two separate approaches: i) simulation of detachment of rock blocks and their downhill movement using a GIS; and ii) application of geostatistical techniques to the analysis of georeferenced observations of previously fallen blocks, in order to assess the probability of arrival of blocks due to potential future collapses. The results show that the trajectographic analysis is significantly influenced by the input parameters, with particular reference to the coefficients of restitution values. In order to solve this problem, the model was calibrated based on repeated field observations. The geostatistical approach is useful because it gives the best estimation of point-source phenomena such as rockfalls; however, the sensitivity of results to basic assumptions, e.g. assessment of variograms and choice of a threshold value, may be problematic. Consequently, interpolations derived from different variograms have been used and compared among them; hence, those showing the lowest errors were adopted. The data sets which were statistically analysed are relevant to both kinetic energy and surveyed rock blocks in the accumulation area. The obtained maps highlight areas susceptible to rock block arrivals, and show that the area accommodating the new settlement of S. Quirico Village has the highest level of hazard according to both probabilistic and deterministic methods.
Lee, Saro; Park, Inhye
2013-09-30
Subsidence of ground caused by underground mines poses hazards to human life and property. This study analyzed ground-subsidence hazard using factors that can affect ground subsidence and a decision tree approach in a geographic information system (GIS). The study area was Taebaek, Gangwon-do, Korea, where many abandoned underground coal mines exist. Spatial data, topography, geology, and various ground-engineering data for the subsidence area were collected and compiled in a database for mapping ground-subsidence hazard (GSH). The subsidence area was randomly split 50/50 for training and validation of the models. A data-mining classification technique was applied to the GSH mapping, and decision trees were constructed using the chi-squared automatic interaction detector (CHAID) and the quick, unbiased, and efficient statistical tree (QUEST) algorithms. The frequency ratio model, a probabilistic model, was also applied to the GSH mapping for comparison. The resulting GSH maps were validated using area-under-the-curve (AUC) analysis with the subsidence area data that had not been used for training the model. The highest accuracy was achieved by the decision tree model using the CHAID algorithm (94.01%), compared with the QUEST algorithm (90.37%) and the frequency ratio model (86.70%). These accuracies are higher than previously reported results for decision trees. Decision tree methods can therefore be used efficiently for GSH analysis and might be widely used for prediction of various spatial events. Copyright © 2013. Published by Elsevier Ltd.
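The workflow can be pictured with the sketch below, where a generic CART decision tree stands in for CHAID/QUEST (neither ships with scikit-learn) and the conditioning factors, synthetic labels and 50/50 split are invented; only the overall pattern of training a tree on half the inventory, scoring the rest, and validating with AUC mirrors the abstract.

import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)
n = 2000
X = np.column_stack([
    rng.uniform(0, 300, n),    # hypothetical factor: depth of mined seam (m)
    rng.uniform(0, 45, n),     # hypothetical factor: ground slope (degrees)
    rng.uniform(0, 1, n),      # hypothetical factor: distance to workings (scaled)
])
# Synthetic truth: shallow workings and short distances raise subsidence risk
p = 1.0 / (1.0 + np.exp(0.02 * X[:, 0] + 4.0 * X[:, 2] - 3.0))
y = rng.binomial(1, p)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.5, random_state=0)
tree = DecisionTreeClassifier(max_depth=4, min_samples_leaf=50).fit(X_tr, y_tr)
hazard_index = tree.predict_proba(X_te)[:, 1]          # ground-subsidence hazard score
print("AUC:", round(roc_auc_score(y_te, hazard_index), 3))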
SAMCO: Society Adaptation for coping with Mountain risks in a global change COntext
NASA Astrophysics Data System (ADS)
Grandjean, Gilles; Bernardie, Severine; Malet, Jean-Philippe; Puissant, Anne; Houet, Thomas; Berger, Frederic; Fort, Monique; Pierre, Daniel
2013-04-01
The SAMCO project aims to develop a proactive resilience framework enhancing the overall resilience of societies to the impacts of mountain risks. The project aims to elaborate methodological tools to characterize and measure ecosystem and societal resilience from an operative perspective on three representative mountain case studies. To achieve this objective, the methodology is split into several steps: (1) the definition of the potential impacts of global environmental changes (climate system, ecosystem e.g. land use, socio-economic system) on landslide hazards, (2) the analysis of these consequences in terms of vulnerability (e.g. changes in the location and characteristics of the impacted areas and the level of their perturbation), and (3) the implementation of a methodology for quantitatively investigating and mapping indicators of mountain slope vulnerability exposed to several hazard types, and the development of a GIS-based demonstration platform. The strength and originality of the SAMCO project will be to combine different techniques, methodologies and models (multi-hazard assessment, risk evolution in time, vulnerability functional analysis, and governance strategies) and to gather interdisciplinary expertise in earth sciences, environmental sciences, and social sciences. The multidisciplinary background of the members could potentially lead to the development of new concepts and emerging strategies for mountain hazard/risk adaptation. The research areas, characterized by a variety of environmental, economic and social settings, are severely affected by landslides, and have experienced significant land use modifications (reforestation, abandonment of traditional agricultural practices) and human interferences (urban expansion, ski resort construction) over the last century.
NASA Astrophysics Data System (ADS)
Iribarren Anacona, P.; Norton, K. P.; Mackintosh, A.
2014-07-01
Glacier retreat since the Little Ice Age has resulted in the development or expansion of hundreds of glacial lakes in Patagonia. Some of these lakes have produced large (≥10^6 m^3) Glacial Lake Outburst Floods (GLOFs) damaging inhabited areas. GLOF hazard studies in Patagonia have been mainly based on the analysis of short-term series (≤50 years) of flood data and until now no attempt has been made to identify the relative susceptibility of lakes to failure. Power schemes and associated infrastructure are planned for Patagonian basins that have historically been affected by GLOFs, and we now require a thorough understanding of the characteristics of dangerous lakes in order to assist with hazard assessment and planning. In this paper, the conditioning factors of 16 outbursts from moraine-dammed lakes in Patagonia were analysed. These data were used to develop a classification scheme designed to assess outburst susceptibility, based on image classification techniques, flow routing algorithms and the Analytical Hierarchy Process. This scheme was applied to the Baker Basin, Chile, where at least 7 moraine-dammed lakes have failed in historic time. We identified 386 moraine-dammed lakes in the Baker Basin of which 28 were classified with high or very high outburst susceptibility. Commonly, lakes with high outburst susceptibility are in contact with glaciers and have moderate (>8°) to steep (>15°) dam outlet slopes, akin to failed lakes in Patagonia. The proposed classification scheme is suitable for first-order GLOF hazard assessments in this region. However, rapidly changing glaciers in Patagonia make detailed analysis and monitoring of hazardous lakes and glaciated areas upstream from inhabited areas or critical infrastructure necessary, in order to better prepare for hazards emerging from an evolving cryosphere.
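The Analytical Hierarchy Process step can be illustrated with the short sketch below, which turns a pairwise-comparison matrix of susceptibility factors into priority weights via the principal eigenvector and reports Saaty's consistency ratio. The factor list and comparison intensities are assumptions, not the weights actually used for the Baker Basin.

import numpy as np

def ahp_weights(pairwise):
    # Priority weights from an AHP pairwise-comparison matrix (principal
    # eigenvector), plus the consistency ratio CR = CI / RI.
    A = np.asarray(pairwise, dtype=float)
    n = A.shape[0]
    eigvals, eigvecs = np.linalg.eig(A)
    k = int(np.argmax(eigvals.real))
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()
    ci = (eigvals[k].real - n) / (n - 1)
    ri = {3: 0.58, 4: 0.90, 5: 1.12}.get(n, 1.0)   # Saaty random consistency index
    return w, ci / ri

# Hypothetical factors: glacier contact, dam outlet slope, lake growth, dam geometry
A = [[1.0, 3.0, 5.0, 7.0],
     [1/3, 1.0, 3.0, 5.0],
     [1/5, 1/3, 1.0, 3.0],
     [1/7, 1/5, 1/3, 1.0]]
weights, cr = ahp_weights(A)
print(np.round(weights, 3), "CR =", round(cr, 3))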
Research on preventive technologies for bed-separation water hazard in China coal mines
NASA Astrophysics Data System (ADS)
Gui, Herong; Tong, Shijie; Qiu, Weizhong; Lin, Manli
2018-03-01
Bed-separation water is one of the major water hazards in coal mines. Targeted research on preventive technologies is of paramount importance to safe mining. This article studied the restrictive effect of geological and mining factors, such as the lithological properties of roof strata, coal seam inclination, the water source supplying bed separations, the roof management method, the dimensions of the mining working face, and mining progress, on the formation of bed-separation water hazards. The key techniques to prevent bed-separation water-related accidents include interception, diversion, destructing the buffer layer, grouting and backfilling, etc. The operation and efficiency of each technique are corroborated in field engineering cases. The results of this study will offer a reference for countries with similar mining conditions in research on bed-separation water bursts and hazard control in coal mines.
Anderson, Carl A; McRae, Allan F; Visscher, Peter M
2006-07-01
Standard quantitative trait loci (QTL) mapping techniques commonly assume that the trait is both fully observed and normally distributed. When considering survival or age-at-onset traits these assumptions are often incorrect. Methods have been developed to map QTL for survival traits; however, they are both computationally intensive and not available in standard genome analysis software packages. We propose a grouped linear regression method for the analysis of continuous survival data. Using simulation we compare this method to both the Cox and Weibull proportional hazards models and a standard linear regression method that ignores censoring. The grouped linear regression method is of equivalent power to both the Cox and Weibull proportional hazards methods and is significantly better than the standard linear regression method when censored observations are present. The method is also robust to the proportion of censored individuals and the underlying distribution of the trait. On the basis of linear regression methodology, the grouped linear regression model is computationally simple and fast and can be implemented readily in freely available statistical software.
[Hazard function and life table: an introduction to the failure time analysis].
Matsushita, K; Inaba, H
1987-04-01
Failure time analysis has become popular in demographic studies. It can be viewed as a part of regression analysis with limited dependent variables as well as a special case of event history analysis and multistate demography. The idea of hazard function and failure time analysis, however, has not been properly introduced to nor commonly discussed by demographers in Japan. The concept of hazard function in comparison with life tables is briefly described, where the force of mortality is interchangeable with the hazard rate. The basic idea of failure time analysis is summarized for the cases of exponential distribution, normal distribution, and proportional hazard models. The multiple decrement life table is also introduced as an example of lifetime data analysis with cause-specific hazard rates.
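The link between the force of mortality and the hazard rate can be shown in a few lines: under a constant hazard the survival function is exponential, and an abridged life table follows from piecewise-constant hazards over age intervals. The rates below are invented for illustration, not real mortality data.

import numpy as np

lam = 0.02                                  # constant hazard rate per year
t = np.arange(0, 81, 10)
S = np.exp(-lam * t)                        # exponential survival S(t) = exp(-lambda*t)

# Abridged life table from piecewise-constant hazards m_x over 10-year age intervals
m_x = np.array([0.004, 0.001, 0.001, 0.002, 0.004, 0.009, 0.020, 0.050])
l_x = 100000 * np.exp(-np.cumsum(np.r_[0.0, m_x * 10.0]))   # survivors at interval starts
q_x = 1.0 - np.exp(-m_x * 10.0)                             # interval death probabilities
print(np.round(S, 3))
print(l_x.astype(int))
print(np.round(q_x, 3))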
2004-02-01
UNCLASSIFIED − Conducted experiments to determine the usability of general-purpose anomaly detection algorithms to monitor a large, complex military...reaction and detection modules to perform tailored analysis sequences to monitor environmental conditions, health hazards and physiological states...scalability of lab-proven anomaly detection techniques for intrusion detection in real-world, high-volume environments.
Application of Non-destructive Methods of Stress-strain State at Hazardous Production Facilities
NASA Astrophysics Data System (ADS)
Shram, V.; Kravtsova, Ye; Selsky, A.; Bezborodov, Yu; Lysyannikova, N.; Lysyannikov, A.
2016-06-01
The paper deals with the sources of accidents in distillation columns, on the basis of which the most dangerous defects are identified. An analysis of the currently existing methods of non-destructive testing of the stress-strain state is performed. It is proposed to apply strain measurement and acoustic emission techniques to continuously monitor hazardous objects, which helps prevent accidents and reduces the associated work.
Anneken, David; Striebich, Richard; DeWitt, Matthew J; Klingshirn, Christopher; Corporan, Edwin
2015-03-01
Aircraft turbine engines are a significant source of particulate matter (PM) and gaseous emissions in the vicinity of airports and military installations. Hazardous air pollutants (HAPs) (e.g., formaldehyde, benzene, naphthalene and other compounds) associated with aircraft emissions are an environmental concern both in flight and at ground level. Therefore, effective sampling, identification, and accurate measurement of these trace species are important to assess their environmental impact. This effort evaluates two established ambient air sampling and analysis methods, U.S. Environmental Protection Agency (EPA) Method TO-11A and National Institute for Occupational Safety and Health (NIOSH) Method 1501, for potential use to quantify HAPs from aircraft turbine engines. The techniques were used to perform analysis of the exhaust from a T63 turboshaft engine, and were examined using certified gas standards transferred through the heated sampling systems used for engine exhaust gaseous emissions measurements. Test results show that the EPA Method TO-11A (for aldehydes) and NIOSH Method 1501 (for semivolatile hydrocarbons) were effective techniques for the sampling and analysis of most HAPs of interest. Both methods showed reasonable extraction efficiencies of HAP species from the sorbent tubes, with the exception of acrolein, styrene, and phenol, which were not well quantified. Formaldehyde measurements using dinitrophenylhydrazine (DNPH) tubes (EPA method TO-11A) were accurate for gas-phase standards, and compared favorably to measurements using gas-phase Fourier-transform infrared (FTIR) spectroscopy. In general, these two standard methodologies proved to be suitable techniques for field measurement of turbine engine HAPs within a reasonable (5-10 minutes) sampling period. Details of the tests, the analysis methods, calibration procedures, and results from the gas standards and T63 engine tested using a conventional JP-8 jet fuel are provided. HAPs from aviation-related sources are important because of their adverse health and environmental impacts in and around airports and flight lines. Simpler, more convenient techniques to measure the important HAPs, especially aldehydes and volatile organic HAPs, are needed to provide information about their occurrence and assist in the development of engines that emit fewer harmful emissions.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-04-26
... Manufacturing Practice and Hazard Analysis and Risk- Based Preventive Controls for Human Food; Extension of... Manufacturing Practice and Hazard Analysis and Risk-Based Preventive Controls for Human Food'' that appeared in... Manufacturing Practice and Hazard Analysis and Risk-Based Preventive Controls for Human Food'' with a 120-day...
NASA Technical Reports Server (NTRS)
Nguyen, Truong X.; Koppen, Sandra V.; Ely, Jay J.; Williams, Reuben A.; Smith, Laura J.; Salud, Maria Theresa P.
2004-01-01
This document summarizes the safety analysis performed on a Flight Guidance System (FGS) requirements model. In particular, the safety properties desired of the FGS model are identified and the presence of the safety properties in the model is formally verified. Chapter 1 provides an introduction to the entire project, while Chapter 2 gives a brief overview of the problem domain, the nature of accidents, model based development, and the four-variable model. Chapter 3 outlines the approach. Chapter 4 presents the results of the traditional safety analysis techniques and illustrates how the hazardous conditions associated with the system trace into specific safety properties. Chapter 5 presents the results of the formal methods analysis technique model checking that was used to verify the presence of the safety properties in the requirements model. Finally, Chapter 6 summarizes the main conclusions of the study, first and foremost that model checking is a very effective verification technique to use on discrete models with reasonable state spaces. Additional supporting details are provided in the appendices.
Hazardous Chemicals on the Job: A Workers Guide to Reducing Exposure.
ERIC Educational Resources Information Center
Conrad, Fran; Kayman, Lindsey
This guide is intended to assist workers in reducing their exposure to hazardous chemicals on the job. It describes a systematic preventive approach to hazardous chemicals that is based on the following steps: determining which chemicals are in use at a particular worksite (techniques for asking the company and steps to take if the company is…
Using expert opinion surveys to rank threats to endangered species: a case study with sea turtles.
Donlan, C Josh; Wingfield, Dana K; Crowder, Larry B; Wilcox, Chris
2010-12-01
Little is known about how specific anthropogenic hazards affect the biology of organisms. Quantifying the effect of regional hazards is particularly challenging for species such as sea turtles because they are migratory, difficult to study, long lived, and face multiple anthropogenic threats. Expert elicitation, a technique used to synthesize opinions of experts while assessing uncertainty around those views, has been in use for several decades in the social science and risk assessment sectors. We conducted an internet-based survey to quantify expert opinion on the relative magnitude of anthropogenic hazards to sea turtle populations at the regional level. Fisheries bycatch and coastal development were most often ranked as the top hazards to sea turtle species in a geographic region. Nest predation and direct take followed as the second and third greatest threats, respectively. Survey results suggest most experts believe sea turtles are threatened by multiple factors, including substantial at-sea threats such as fisheries bycatch. Resources invested by the sea turtle community, however, appear biased toward terrestrial-based impacts. Results from the survey are useful for conservation planning because they provide estimates of relative impacts of hazards on sea turtles and a measure of consensus on the magnitude of those impacts among researchers and practitioners. Our survey results also revealed patterns of expert bias, which we controlled for in our analysis. Respondents with no experience with respect to a sea turtle species tended to rank hazards affecting that sea turtle species higher than respondents with experience. A more-striking pattern was with hazard-based expertise: the more experience a respondent had with a specific hazard, the higher the respondent scored the impact of that hazard on sea turtle populations. Bias-controlled expert opinion surveys focused on threatened species and their hazards can help guide and expedite species recovery plans. © 2010 Society for Conservation Biology.
Non-Intrusive Measurement Techniques Applied to the Hybrid Solid Fuel Degradation
NASA Astrophysics Data System (ADS)
Cauty, F.
2004-10-01
Knowledge of the solid fuel regression rate and the time evolution of the grain geometry is required for hybrid motor design and control of its operating conditions. Two non-intrusive techniques (NDT) have been applied to hybrid propulsion: both are based on wave propagation through the materials, using X-rays and ultrasound. X-ray techniques allow local thickness measurements (attenuated signal level) using small probes or 2D images (Real-Time Radiography), with a link between the size of the field of view and accuracy. Besides the safety hazards associated with high-intensity X-ray systems, the image analysis requires the use of quite complex post-processing techniques. The ultrasound technique is more widely used in energetic material applications, including hybrid fuels. Depending upon the transducer size and the associated equipment, the application domain is large, from tiny samples to the quad-port wagon-wheel grain of the 1.1 MN thrust HPDP motor. The effect of the physical quantities has to be taken into account in the wave propagation analysis. With respect to the various applications, there is no unique and perfect experimental method to measure the fuel regression rate. The best solution could be obtained by combining two techniques at the same time, each technique enhancing the quality of the global data.
NASA Astrophysics Data System (ADS)
Yugsi Molina, F. X.; Oppikofer, T.; Fischer, L.; Hermanns, R. L.; Taurisano, A.
2012-04-01
Traditional techniques to assess rockfall hazard are partially based on probabilistic analysis. Stochastic methods have been used for run-out analysis of rock blocks to estimate the trajectories that a detached block will follow during its fall until it stops due to kinetic energy loss. However, the selection of rockfall source areas is usually defined either by multivariate analysis or by field observations; in either case, a physically based approach is not used for source area detection. We present an example of rockfall hazard assessment that integrates a probabilistic rockfall run-out analysis with a stochastic assessment of the rockfall source areas using kinematic stability analysis in a GIS environment. The method has been tested for a steep, more than 200 m high rock wall located in the municipality of Norddal (Møre og Romsdal county, Norway), where a large number of people are exposed to snow avalanches, rockfalls, or debris flows. The area was selected following the recently published hazard mapping plan of Norway. The cliff is formed by medium- to coarse-grained quartz-dioritic to granitic gneisses of Proterozoic age. Scree deposits, the product of recent rockfall activity, are found at the bottom of the rock wall. Large blocks can be found several tens of meters away from the cliff in Sylte, the main locality in the Norddal municipality. Structural characterization of the rock wall was done using terrestrial laser scanning (TLS) point clouds in the software Coltop3D (www.terranum.ch), and the results were validated with field data. Orientation data sets from the structural characterization were analyzed separately to assess best-fit probability density functions (PDF) for both the dip angle and the dip direction of each discontinuity set. A GIS-based stochastic kinematic analysis was then carried out using the discontinuity set orientations and the friction angle as random variables. An airborne laser scanning digital elevation model (ALS-DEM) with 1 m resolution was used for the analysis. Three failure mechanisms were analyzed: planar and wedge sliding, as well as toppling. Based on this kinematic analysis, areas where failure is feasible were used as source areas for run-out analysis using Rockyfor3D v. 4.1 (www.ecorisq.org). The software calculates trajectories of single falling blocks in three dimensions using physically based algorithms developed under a stochastic approach. The ALS-DEM was down-scaled to 5 m resolution to optimize processing time. Results were compared with run-out simulations using Rockyfor3D with the whole rock wall as source area, and with maps of deposits generated from field observations and aerial photo interpretation. The results of our implementation show a better correlation with field observations and help to produce more accurate rockfall hazard assessment maps through a better definition of the source areas. They also reduce the processing time for the analysis. The findings presented in this contribution are part of an effort to produce guidelines for natural hazard mapping in Norway. The guidelines will be used in upcoming years for hazard mapping in areas where larger groups of population are exposed to mass movements from steep slopes.
Pötschger, Ulrike; Heinzl, Harald; Valsecchi, Maria Grazia; Mittlböck, Martina
2018-01-19
Investigating the impact of a time-dependent intervention on the probability of long-term survival is statistically challenging. A typical example is stem-cell transplantation performed after successful donor identification from registered donors. Here, a suggested simple analysis based on the exogenous donor availability status according to registered donors would allow the estimation and comparison of survival probabilities. As donor search is usually ceased after a patient's event, donor availability status is incompletely observed, so this simple comparison is not possible and the waiting time to donor identification needs to be addressed in the analysis to avoid bias. It is methodologically unclear how to directly address cumulative long-term treatment effects without relying on proportional hazards while avoiding waiting time bias. The pseudo-value regression technique is able to handle the first two issues; a novel generalisation of this technique also avoids waiting time bias. Inverse-probability-of-censoring weighting is used to account for the partly unobserved exogenous covariate donor availability. Simulation studies demonstrate unbiasedness and satisfactory coverage probabilities of the new method. A real data example demonstrates that study results based on generalised pseudo-values have a clear medical interpretation which supports the clinical decision-making process. The proposed generalisation of the pseudo-value regression technique makes it possible to compare survival probabilities between two independent groups where group membership becomes known over time and remains partly unknown. Hence, cumulative long-term treatment effects are directly addressed without relying on proportional hazards while avoiding waiting time bias.
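A minimal sketch of the basic pseudo-value construction the abstract builds on: jackknife pseudo-observations of the survival probability at a fixed time point, computed from the Kaplan-Meier estimator (here with the lifelines package). It does not include the inverse-probability-of-censoring-weighted generalisation proposed in the paper; the toy data are invented.

```python
import numpy as np
from lifelines import KaplanMeierFitter

def pseudo_values(time, event, t0):
    """Jackknife pseudo-values of S(t0): n*S_hat(t0) - (n-1)*S_hat_{-i}(t0)."""
    time = np.asarray(time, float)
    event = np.asarray(event, int)
    n = len(time)
    s_full = KaplanMeierFitter().fit(time, event).predict(t0)
    pseudo = np.empty(n)
    for i in range(n):
        mask = np.arange(n) != i
        s_loo = KaplanMeierFitter().fit(time[mask], event[mask]).predict(t0)
        pseudo[i] = n * s_full - (n - 1) * s_loo
    return pseudo

# Toy data: follow-up in months, event = 1 (death), 0 (censored)
t = [5, 8, 12, 12, 20, 24, 30, 36]
e = [1, 0, 1, 1, 0, 1, 0, 0]
print(pseudo_values(t, e, t0=24).round(3))
```

The pseudo-values can then serve as responses in a generalized-estimating-equation regression on covariates, which is what allows survival probabilities, rather than hazards, to be modelled directly.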
NASA Technical Reports Server (NTRS)
Hwu, Shian U.; Kelley, James S.; Panneton, Robert B.; Arndt, G. Dickey
1995-01-01
In order to estimate the RF radiation hazards to astronauts and electronics equipment due to various Space Station transmitters, the electric fields around the various Space Station antennas are computed using rigorous computational electromagnetics (CEM) techniques. The Method of Moments (MoM) was applied to the UHF and S-band low-gain antennas. The Aperture Integration (AI) method and the Geometrical Theory of Diffraction (GTD) method were used to compute the electric field intensities for the S- and Ku-band high-gain antennas. As a result of this study, the regions in which the electric fields exceed the specified exposure levels for the Extravehicular Mobility Unit (EMU) electronics equipment and the Extravehicular Activity (EVA) astronaut are identified for the various Space Station transmitters.
Chi, Feng; Zhou, Jun; Zhang, Qi; Wang, Yong; Huang, Panling
2017-03-08
The vibration control of a construction vehicle must be carried out in order to meet the aims of sustainable environmental development and to avoid potential human health hazards. In this paper, based on market feedback, the left-right driver seat vibration of a type of wheel loader is found to be significant over a certain speed range. In order to find the abnormal vibration components, the order tracking technique (OTT) and transmission path analysis (TPA) were used to analyze the vibration sources of the wheel loader. This analysis shows that the abnormal vibration originates from the interaction between the tire tread and the road and is amplified by the cab mount, through which it is eventually transmitted to the cab seat. Finally, the seat vibration amplitudes were decreased by up to 50.8% after implementing the vibration reduction strategy.
Assessment and prediction of debris-flow hazards
Wieczorek, Gerald F.; ,
1993-01-01
Study of debris-flow geomorphology and initiation mechanism has led to better understanding of debris-flow processes. This paper reviews how this understanding is used in current techniques for assessment and prediction of debris-flow hazards.
Risk Assessment Methodology for Hazardous Waste Management (1998)
A methodology is described for systematically assessing and comparing the risks to human health and the environment of hazardous waste management alternatives. The methodology selects and links appropriate models and techniques for performing the process.
Safety management of a complex R&D ground operating system
NASA Technical Reports Server (NTRS)
Connors, J.; Mauer, R. A.
1975-01-01
Report discusses safety program implementation for large R&D operating system. Analytical techniques are defined and suggested as tools for identifying potential hazards and determining means to effectively control or eliminate hazards.
Rapid SAR and GPS Measurements and Models for Hazard Science and Situational Awareness
NASA Astrophysics Data System (ADS)
Owen, S. E.; Yun, S. H.; Hua, H.; Agram, P. S.; Liu, Z.; Moore, A. W.; Rosen, P. A.; Simons, M.; Webb, F.; Linick, J.; Fielding, E. J.; Lundgren, P.; Sacco, G. F.; Polet, J.; Manipon, G.
2016-12-01
The Advanced Rapid Imaging and Analysis (ARIA) project for Natural Hazards is focused on rapidly generating higher level geodetic imaging products and placing them in the hands of the solid earth science and local, national, and international natural hazard communities by providing science product generation, exploration, and delivery capabilities at an operational level. Space-based geodetic measurement techniques such as Interferometric Synthetic Aperture Radar (InSAR), Differential Global Positioning System (DGPS), SAR-based change detection, and image pixel tracking have recently become critical additions to our toolset for understanding and mapping the damage caused by earthquakes, volcanic eruptions, landslides, and floods. Analyses of these data sets are still largely handcrafted following each event and are not generated rapidly and reliably enough for response to natural disasters or for timely analysis of large data sets. The ARIA project, a joint venture co-sponsored by California Institute of Technology (Caltech) and by NASA through the Jet Propulsion Laboratory (JPL), has been capturing the knowledge applied to these responses and building it into an automated infrastructure to generate imaging products in near real-time that can improve situational awareness for disaster response. In addition, the ARIA project is developing the capabilities to provide automated imaging and analysis capabilities necessary to keep up with the imminent increase in raw data from geodetic imaging missions planned for launch by NASA, as well as international space agencies. We will present the progress we have made on automating the analysis of SAR data for hazard monitoring and response using data from Sentinel 1a/b as well as continuous GPS stations. Since the beginning of our project, our team has imaged events and generated response products for events around the world. These response products have enabled many conversations with those in the disaster response community about the potential usefulness of rapid SAR and GPS-based information. We will present progress on our data system technology that enables rapid and reliable production of imagery, as well as lessons learned from our engagement with FEMA and others in the hazard response community on the important actionable information that they need.
Regional Frequency Analysis of Ocean Hazard
NASA Astrophysics Data System (ADS)
Bernardara, Pietro; Weiss, Jerome; Benoit, Michel; Andreewsky, Marc
2015-04-01
The estimation of extreme return levels (down to an annual probability of exceedance of 10⁻⁴) of natural phenomena is a very uncertain exercise when extrapolating from the information and measurements collected at a single site. The aim of Regional Frequency Analysis (RFA) is to benefit from the information contained in observations and data collected not only at the site of interest but over a larger set of sites, located in the same region as the site of interest or sharing similar characteristics with it. This technique was introduced in the 1960s and is widely used in various domains, including hydrology and meteorology. RFA was recently acknowledged as a potential choice for the estimation of flooding hazard in the methodological guide for flooding hazard estimation [1], published in 2013 by the French Nuclear Safety Authority. The aim of this presentation is to introduce the main concepts of RFA and to illustrate the latest innovations in its application delivered by EDF R&D. These concern the statistical definition of storms, the formation of homogeneous regions, and a new approach for filtering the redundant information linked to the spatial correlation of natural phenomena. Applications to skew surges and waves will be shown. 1. ASN, Guide pour la Protection des installations nucléaires de base contre les inondations externes. 2013, ASN. p. 44.
NASA Astrophysics Data System (ADS)
England, John F.; Julien, Pierre Y.; Velleux, Mark L.
2014-03-01
Traditionally, deterministic flood procedures such as the Probable Maximum Flood have been used for critical infrastructure design. Some Federal agencies now use hydrologic risk analysis to assess potential impacts of extreme events on existing structures such as large dams. Extreme flood hazard estimates and distributions are needed for these efforts, with very low annual exceedance probabilities (≤10⁻⁴, return periods >10,000 years). An integrated data-modeling hydrologic hazard framework for physically-based extreme flood hazard estimation is presented. Key elements include: (1) a physically-based runoff model (TREX) coupled with a stochastic storm transposition technique; (2) hydrometeorological information from radar and an extreme storm catalog; and (3) streamflow and paleoflood data for independently testing and refining runoff model predictions at internal locations. This new approach requires full integration of collaborative work in hydrometeorology, flood hydrology and paleoflood hydrology. An application on the 12,000 km² Arkansas River watershed in Colorado demonstrates that the size and location of extreme storms are critical factors in the analysis of basin-average rainfall frequency and flood peak distributions. Runoff model results are substantially improved by the availability and use of paleoflood nonexceedance data spanning the past 1000 years at critical watershed locations.
1998-03-01
benefit estimation techniques used to monetize the value of flood hazard reduction in the City of Roanoke. Each method was then used to estimate...behavior. This framework justifies interpreting people’s choices to infer and then monetize their preferences. If individuals have well-ordered and...Journal of Agricultural Economics. 68 (1986) 2: 280-290. Soule, Don M. and Claude M. Vaughn, "Flood Protection Benefits as Reflected in Property
Beyond FMEA: the structured what-if technique (SWIFT).
Card, Alan J; Ward, James R; Clarkson, P John
2012-01-01
Although it is probably the best-known prospective hazard analysis (PHA) tool, failure mode and effects analysis (FMEA) is far from the only option available. This article introduces one of the alternatives: the structured what-if technique (SWIFT). SWIFT is a flexible, high-level risk identification technique that can be used on a stand-alone basis, or as part of a staged approach to make more efficient use of bottom-up methods like FMEA. In this article we describe the method, assess the evidence related to its use in healthcare by means of a systematic literature review, and suggest ways in which it could be better adapted for use in the healthcare industry. Based on the limited evidence available, it appears that healthcare workers find it easy to learn, easy to use, and credible. Especially when used as part of a staged approach, SWIFT appears capable of playing a useful role as a component of the PHA armamentarium. © 2012 American Society for Healthcare Risk Management of the American Hospital Association.
Probabilistic tsunami hazard assessment at Seaside, Oregon, for near-and far-field seismic sources
Gonzalez, F.I.; Geist, E.L.; Jaffe, B.; Kanoglu, U.; Mofjeld, H.; Synolakis, C.E.; Titov, V.V.; Areas, D.; Bellomo, D.; Carlton, D.; Horning, T.; Johnson, J.; Newman, J.; Parsons, T.; Peters, R.; Peterson, C.; Priest, G.; Venturato, A.; Weber, J.; Wong, F.; Yalciner, A.
2009-01-01
The first probabilistic tsunami flooding maps have been developed. The methodology, called probabilistic tsunami hazard assessment (PTHA), integrates tsunami inundation modeling with methods of probabilistic seismic hazard assessment (PSHA). Application of the methodology to Seaside, Oregon, has yielded estimates of the spatial distribution of 100- and 500-year maximum tsunami amplitudes, i.e., amplitudes with 1% and 0.2% annual probability of exceedance. The 100-year tsunami is generated most frequently by far-field sources in the Alaska-Aleutian Subduction Zone and is characterized by maximum amplitudes that do not exceed 4 m, with an inland extent of less than 500 m. In contrast, the 500-year tsunami is dominated by local sources in the Cascadia Subduction Zone and is characterized by maximum amplitudes in excess of 10 m and an inland extent of more than 1 km. The primary sources of uncertainty in these results include those associated with interevent time estimates, modeling of background sea level, and accounting for temporal changes in bathymetry and topography. Nonetheless, PTHA represents an important contribution to tsunami hazard assessment techniques; viewed in the broader context of risk analysis, PTHA provides a method for quantifying estimates of the likelihood and severity of the tsunami hazard, which can then be combined with vulnerability and exposure to yield estimates of tsunami risk. Copyright 2009 by the American Geophysical Union.
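A minimal sketch of how site-specific exceedance probabilities of the kind quoted above (1% and 0.2% per year) can be assembled from per-source recurrence rates and simulated maximum amplitudes, assuming Poissonian source occurrence; the rates and amplitude samples are invented and the actual PTHA methodology is considerably more elaborate.

```python
import numpy as np

def annual_exceedance(rates, amp_samples, thresholds):
    """rates: annual rate per source (1/yr); amp_samples: simulated maximum
    amplitudes per source (m); thresholds: evaluation amplitudes (m)."""
    thresholds = np.asarray(thresholds, float)
    lam = np.zeros_like(thresholds)
    for rate, amps in zip(rates, amp_samples):
        amps = np.asarray(amps, float)
        lam += rate * np.array([(amps > a).mean() for a in thresholds])
    return 1.0 - np.exp(-lam)  # annual probability of exceedance

rng = np.random.default_rng(1)
rates = [1 / 50.0, 1 / 500.0]                           # far-field, local (hypothetical)
amps = [rng.lognormal(0.5, 0.4, 2000), rng.lognormal(2.0, 0.3, 2000)]
grid = np.linspace(0.5, 12.0, 24)
aep = annual_exceedance(rates, amps, grid)
print(grid[np.argmin(np.abs(aep - 0.01))])              # amplitude of the "100-year" tsunami
```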
A Reference Model for Software and System Inspections. White Paper
NASA Technical Reports Server (NTRS)
He, Lulu; Shull, Forrest
2009-01-01
Software Quality Assurance (SQA) is an important component of the software development process. SQA processes provide assurance that the software products and processes in the project life cycle conform to their specified requirements by planning, enacting, and performing a set of activities to provide adequate confidence that quality is being built into the software. Typical techniques include: (1) testing, (2) simulation, (3) model checking, (4) symbolic execution, (5) management reviews, (6) technical reviews, (7) inspections, (8) walk-throughs, (9) audits, (10) analysis (complexity analysis, control flow analysis, algorithmic analysis), and (11) formal methods. Our work over the last few years has resulted in substantial knowledge about SQA techniques, especially in the areas of technical reviews and inspections. But can we apply the same QA techniques to the system development process? If yes, what kind of tailoring do we need before applying them in the system engineering context? If not, what types of QA techniques are actually used at the system level? And is there any room for improvement? After a brief examination of the system engineering literature (especially focused on NASA and DoD guidance) we found that: (1) the system and software development processes interact with each other at different phases through the development life cycle; (2) reviews are emphasized in both system and software development (Fig. 1.3), and for some reviews (e.g., SRR, PDR, CDR) there are both system versions and software versions; (3) analysis techniques are emphasized (e.g., Fault Tree Analysis, Preliminary Hazard Analysis) and some details are given about how to apply them; (4) reviews are expected to use the outputs of the analysis techniques; in other words, these particular analyses are usually conducted in preparation for (before) reviews. The goal of our work is to explore the interaction between the Quality Assurance (QA) techniques at the system level and the software level.
77 FR 55371 - System Safety Program
Federal Register 2010, 2011, 2012, 2013, 2014
2012-09-07
...-based rule and FRA seeks comments on all aspects of the proposed rule. An SSP would be implemented by a... SSP would be the risk-based hazard management program and risk-based hazard analysis. A properly implemented risk-based hazard management program and risk-based hazard analysis would identify the hazards and...
Joseph J. O’Brien; Kathryn A. Mordecai; Leslie Wolcott
2010-01-01
This publication is a field guide to tactics and techniques for dealing with hazardous fuels in subtropical pine flatwoods and tropical pine rocklands. The guide covers prescribed fire, mechanical, chemical, and other means for reducing and managing wildland fuels in these systems. Also, a list of exotic plants that contribute to hazardous fuel problems is included...
9 CFR 417.2 - Hazard Analysis and HACCP Plan.
Code of Federal Regulations, 2012 CFR
2012-01-01
... more food safety hazards that are reasonably likely to occur, based on the hazard analysis conducted in... 9 Animals and Animal Products 2 2012-01-01 2012-01-01 false Hazard Analysis and HACCP Plan. 417.2 Section 417.2 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE...
9 CFR 417.2 - Hazard Analysis and HACCP Plan.
Code of Federal Regulations, 2014 CFR
2014-01-01
... more food safety hazards that are reasonably likely to occur, based on the hazard analysis conducted in... 9 Animals and Animal Products 2 2014-01-01 2014-01-01 false Hazard Analysis and HACCP Plan. 417.2 Section 417.2 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE...
9 CFR 417.2 - Hazard Analysis and HACCP Plan.
Code of Federal Regulations, 2011 CFR
2011-01-01
... more food safety hazards that are reasonably likely to occur, based on the hazard analysis conducted in... 9 Animals and Animal Products 2 2011-01-01 2011-01-01 false Hazard Analysis and HACCP Plan. 417.2 Section 417.2 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE...
9 CFR 417.2 - Hazard Analysis and HACCP Plan.
Code of Federal Regulations, 2013 CFR
2013-01-01
... more food safety hazards that are reasonably likely to occur, based on the hazard analysis conducted in... 9 Animals and Animal Products 2 2013-01-01 2013-01-01 false Hazard Analysis and HACCP Plan. 417.2 Section 417.2 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE...
9 CFR 417.2 - Hazard Analysis and HACCP Plan.
Code of Federal Regulations, 2010 CFR
2010-01-01
... more food safety hazards that are reasonably likely to occur, based on the hazard analysis conducted in... 9 Animals and Animal Products 2 2010-01-01 2010-01-01 false Hazard Analysis and HACCP Plan. 417.2 Section 417.2 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE...
NASA Technical Reports Server (NTRS)
Jamnejad, Vahraz; Statman, Joseph
2013-01-01
This work includes a simplified analysis of the radiated near to mid-field from JPL/NASA Deep Space Network (DSN) reflector antennas and uses an averaging technique over the main beam region and beyond for complying with FAA regulations in specific aviation environments. The work identifies areas that require special attention, including the implications of the very narrow beam of the DSN transmitters. The paper derives the maximum averaged power densities allowed and identifies zones where mitigation measures are required.
NASA Technical Reports Server (NTRS)
Tai, H.; Wilson, J. W.; Maiden, D. L.
2003-01-01
The atmospheric ionizing radiation (AIR) ER-2 preflight analysis, one of the first attempts to obtain a relatively complete measurement set of the high-altitude radiation level environment, is described in this paper. The primary thrust is to characterize the atmospheric radiation and to define dose levels at high-altitude flight. A secondary thrust is to develop and validate dosimetric techniques and monitoring devices for protecting aircrews. With a few chosen routes, we can measure the experimental results and validate the AIR model predictions. Eventually, as more measurements are made, we gain more understanding about the hazardous radiation environment and acquire more confidence in the prediction models.
NASA Astrophysics Data System (ADS)
Raghuvanshi, Tarun Kumar; Ibrahim, Jemal; Ayalew, Dereje
2014-11-01
In this paper a new slope susceptibility evaluation parameter (SSEP) rating scheme is presented, developed as an expert evaluation approach for landslide hazard zonation. The SSEP rating scheme is developed by considering the intrinsic and external triggering parameters that are responsible for slope instability. The intrinsic parameters considered are slope geometry, slope material (rock or soil type), structural discontinuities, land use and land cover, and groundwater. In addition, external triggering parameters such as seismicity, rainfall, and man-made activities are considered. In the SSEP empirical technique, numerical ratings are assigned to each of the intrinsic and triggering parameters on the basis of logical judgments acquired from experience in studying intrinsic and external triggering factors and their relative impact in inducing slope instability. Further, the distribution of maximum SSEP ratings is based on the parameters' relative order of importance in contributing to slope instability. Finally, the summation of all ratings for the intrinsic and triggering parameters, based on actual observation, provides the expected degree of landslide hazard in a given land unit. This information may be utilized to develop a landslide hazard zonation map. The SSEP technique was applied in the area around Wurgessa Kebelle of North Wollo Zonal Administration, Amhara National Regional State, in northern Ethiopia, some 490 km from Addis Ababa. The results obtained indicate that 8.33% of the area falls under moderate hazard and 83.33% falls within high hazard, whereas 8.34% of the area falls under very high hazard. Further, in order to validate the landslide hazard zonation (LHZ) map prepared during the study, active landslide areas and potential instability areas delineated through inventory mapping were overlaid on it. All active landslide areas and potential instability areas fall within the very high and high hazard zones. Thus, the satisfactory agreement confirms the rationality of the considered governing parameters and of the adopted SSEP technique, tools, and procedures in developing the landslide hazard map of the study area.
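A minimal sketch of the summation step the abstract describes: ratings assigned to intrinsic and triggering parameters for a land unit are added and the total is mapped to a hazard class. The individual ratings and class boundaries below are placeholders, not the published SSEP values.

```python
def ssep_hazard_class(intrinsic, triggering):
    """intrinsic, triggering: dicts of parameter -> assigned rating for one land unit."""
    total = sum(intrinsic.values()) + sum(triggering.values())
    if total < 3.5:
        return total, "low"
    if total < 5.0:
        return total, "moderate"
    if total < 7.0:
        return total, "high"
    return total, "very high"

unit = {"slope_geometry": 1.2, "slope_material": 1.0, "discontinuities": 0.9,
        "landuse_landcover": 0.6, "groundwater": 0.8}
triggers = {"seismicity": 0.7, "rainfall": 0.9, "manmade_activity": 0.4}
print(ssep_hazard_class(unit, triggers))  # (6.5, 'high')
```

Repeating this for every mapped land unit yields the hazard zonation layer that is then validated against the landslide inventory.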
Site-specific seismic probabilistic tsunami hazard analysis: performances and potential applications
NASA Astrophysics Data System (ADS)
Tonini, Roberto; Volpe, Manuela; Lorito, Stefano; Selva, Jacopo; Orefice, Simone; Graziani, Laura; Brizuela, Beatriz; Smedile, Alessandra; Romano, Fabrizio; De Martini, Paolo Marco; Maramai, Alessandra; Piatanesi, Alessio; Pantosti, Daniela
2017-04-01
Seismic Probabilistic Tsunami Hazard Analysis (SPTHA) provides probabilities of exceeding different thresholds of tsunami hazard intensity, at a specific site or region and in a given time span, for tsunamis caused by seismic sources. Results obtained by SPTHA (i.e., probabilistic hazard curves and inundation maps) represent a very important input to risk analyses and land use planning. However, the large variability of source parameters implies the definition of a huge number of potential tsunami scenarios, whose omission could lead to a biased analysis. Moreover, tsunami propagation from source to target requires the use of very expensive numerical simulations. At regional scale, the computational cost can be reduced using assumptions on the tsunami modeling (i.e., neglecting non-linear effects, using coarse topo-bathymetric meshes, empirically extrapolating maximum wave heights on the coast). On the other hand, moving to local scale, a much higher resolution is required and such assumptions no longer hold, since detailed inundation maps require significantly greater computational resources. In this work we apply a multi-step method to perform a site-specific SPTHA, which can be summarized in the following steps: i) perform a regional hazard assessment to account for both the aleatory and epistemic uncertainties of the seismic source, by combining the use of an event tree and an ensemble modeling technique; ii) apply a filtering procedure which uses cluster analysis to define a significantly reduced number of representative scenarios contributing to the hazard of a specific target site; iii) perform high resolution numerical simulations only for these representative scenarios and for a subset of near-field sources placed in very shallow waters and/or whose coseismic displacements induce ground uplift or subsidence at the target. The method is applied to three target areas in the Mediterranean located around the cities of Milazzo (Italy), Thessaloniki (Greece) and Siracusa (Italy). The latter target analysis is enriched by the use of locally observed tsunami data, both geological and historical. Indeed, the tsunami data sets available for Siracusa are particularly rich with respect to the scarce and heterogeneous data sets usually available elsewhere. Therefore, they can represent a further valuable source of information to benchmark and strengthen the results of such studies. The work is funded by the Italian Flagship Project RITMARE, the two EC FP7 ASTARTE (Grant agreement 603839) and STREST (Grant agreement 603389) projects, the TSUMAPS-NEAM (Grant agreement ECHO/SUB/2015/718568/PREV26) project and the INGV-DPC Agreement.
An Online Resource for Flight Test Safety Planning
NASA Technical Reports Server (NTRS)
Lewis, Greg
2007-01-01
A viewgraph presentation describing an online database for flight test safety techniques is shown. The topics include: 1) Goal; 2) Test Hazard Analyses; 3) Online Database Background; 4) Data Gathering; 5) NTPS Role; 6) Organizations; 7) Hazard Titles; 8) FAR Paragraphs; 9) Maneuver Name; 10) Identified Hazard; 11) Matured Hazard Titles; 12) Loss of Control Causes; 13) Mitigations; 14) Database Now Open to the Public; 15) FAR Reference Search; 16) Record Field Search; 17) Keyword Search; and 18) Results of FAR Reference Search.
14 CFR 417.227 - Toxic release hazard analysis.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 14 Aeronautics and Space 4 2010-01-01 2010-01-01 false Toxic release hazard analysis. 417.227..., DEPARTMENT OF TRANSPORTATION LICENSING LAUNCH SAFETY Flight Safety Analysis § 417.227 Toxic release hazard analysis. A flight safety analysis must establish flight commit criteria that protect the public from any...
Compiler-Assisted Multiple Instruction Rollback Recovery Using a Read Buffer. Ph.D. Thesis
NASA Technical Reports Server (NTRS)
Alewine, Neal Jon
1993-01-01
Multiple instruction rollback (MIR) is a technique for providing rapid recovery from transient processor failures and has been implemented in hardware by researchers and in mainframe computers. Hardware-based MIR designs eliminate rollback data hazards by providing data redundancy implemented in hardware. Compiler-based MIR designs were also developed which remove rollback data hazards directly with data-flow manipulations, thus eliminating the need for most data redundancy hardware. Compiler-assisted techniques to achieve multiple instruction rollback recovery are addressed. It is observed that some data hazards resulting from instruction rollback can be resolved more efficiently by providing hardware redundancy, while others are resolved more efficiently with compiler transformations. A compiler-assisted multiple instruction rollback scheme is developed which combines hardware-implemented data redundancy with compiler-driven hazard removal transformations. Experimental performance evaluations were conducted which indicate improved efficiency over previous hardware-based and compiler-based schemes. Various enhancements to the compiler transformations and to the data redundancy hardware developed for the compiler-assisted MIR scheme are described and evaluated. The final topic deals with the application of compiler-assisted MIR techniques to aid in exception repair and branch repair in a speculative execution architecture.
Ground Water Issue: Phytoremediation of Contaminated Soil and Ground Water at Hazardous Waste Sites
2001-02-01
Development Ground Water Issue Phytoremediation of Contaminated Soil and Ground Water at Hazardous Waste Sites National Risk Management Research... Phytoremediation , the use of plants in remediation, is one such technology. This issue paper focuses on the processes and applications of phytoremediation ...of phytoremediation as a cleanup or containment technique for remediation of hazardous waste sites. Introductory material on plant processes is
Remedial Investigation/Feasibility Study (RI/FS) process, elements and techniques guidance
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
This manual provides detailed guidance on Remedial Investigation/Feasibility Studies (RI/FSs) conducted pursuant to the Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA) at Department of Energy (DOE) facilities. The purpose of the RI/FS, to assess the risk posed by a hazardous waste site and to determine the best way to reduce that risk, and its structure (site characterization, risk assessment, screening and detailed analysis of alternatives, etc.) is defined in the National Oil and Hazardous Substances Pollution Contingency Plan (NCP) and further explained in the Environmental Protection Agency's (EPA's) Guidance for Conducting Remedial Investigations and Feasibility Studies Under CERCLA (Interim Final) 540/G-89/004, OSWER Directive 9355.3-01, October 1988. Though issued in 1988, the EPA guidance remains an excellent source of information on the conduct and structure of an RI/FS. This document makes use of supplemental RI/FS-related guidance that EPA has developed since its initial document was issued in 1988, incorporates practical lessons learned in more than 12 years of experience in CERCLA hazardous site remediation, and drawing on those lessons, introduces the Streamlined Approach For Environmental Restoration (SAFER), developed by DOE as a way to proceed quickly and efficiently through the RI/FS process at DOE facilities. Thus as its title implies, this guidance is intended to describe in detail the process and component elements of an RI/FS, as well as techniques to manage the RI/FS effectively.
Regression analysis of current-status data: an application to breast-feeding.
Grummer-strawn, L M
1993-09-01
"Although techniques for calculating mean survival time from current-status data are well known, their use in multiple regression models is somewhat troublesome. Using data on current breast-feeding behavior, this article considers a number of techniques that have been suggested in the literature, including parametric, nonparametric, and semiparametric models as well as the application of standard schedules. Models are tested in both proportional-odds and proportional-hazards frameworks....I fit [the] models to current status data on breast-feeding from the Demographic and Health Survey (DHS) in six countries: two African (Mali and Ondo State, Nigeria), two Asian (Indonesia and Sri Lanka), and two Latin American (Colombia and Peru)." excerpt
Hybrid 3D reconstruction and image-based rendering techniques for reality modeling
NASA Astrophysics Data System (ADS)
Sequeira, Vitor; Wolfart, Erik; Bovisio, Emanuele; Biotti, Ester; Goncalves, Joao G. M.
2000-12-01
This paper presents a component approach that combines in a seamless way the strong features of laser range acquisition with the visual quality of purely photographic approaches. The relevant components of the system are: (i) Panoramic images for distant background scenery where parallax is insignificant; (ii) Photogrammetry for background buildings and (iii) High detailed laser based models for the primary environment, structure of exteriors of buildings and interiors of rooms. These techniques have a wide range of applications in visualization, virtual reality, cost effective as-built analysis of architectural and industrial environments, building facilities management, real-estate, E-commerce, remote inspection of hazardous environments, TV production and many others.
NASA Astrophysics Data System (ADS)
Anderson, E. R.; Griffin, R.; Irwin, D.
2013-12-01
Heavy rains and steep, volcanic slopes in El Salvador cause numerous landslides every year, posing a persistent threat to the population, economy and environment. Although potential debris inundation hazard zones have been delineated using digital elevation models (DEMs), some disparities exist between the simulated zones and actual affected areas. Moreover, these hazard zones have only been identified for volcanic lahars and not the shallow landslides that occur nearly every year. This is despite the availability of tools to delineate a variety of landslide types (e.g., the USGS-developed LAHARZ software). Limitations in DEM spatial resolution, age of the data, and hydrological preprocessing techniques can contribute to inaccurate hazard zone definitions. This study investigates the impacts of using different elevation models and pit filling techniques in the final debris hazard zone delineations, in an effort to determine which combination of methods most closely agrees with observed landslide events. In particular, a national DEM digitized from topographic sheets from the 1970s and 1980s provide an elevation product at a 10 meter resolution. Both natural and anthropogenic modifications of the terrain limit the accuracy of current landslide hazard assessments derived from this source. Global products from the Shuttle Radar Topography Mission (SRTM) and the Advanced Spaceborne Thermal Emission and Reflection Radiometer Global DEM (ASTER GDEM) offer more recent data but at the cost of spatial resolution. New data derived from the NASA Uninhabited Aerial Vehicle Synthetic Aperture Radar (UAVSAR) in 2013 provides the opportunity to update hazard zones at a higher spatial resolution (approximately 6 meters). Hydrological filling of sinks or pits for current hazard zone simulation has previously been achieved through ArcInfo spatial analyst. Such hydrological processing typically only fills pits and can lead to drastic modifications of original elevation values. Optimized pit filling techniques use both cut and fill operations to minimize modifications of the original DEM. Satellite image interpretation and field surveying provide the baseline upon which to test the accuracy of each model simulation. By outlining areas that could potentially be inundated by debris flows, these efforts can be used to more accurately identify the places and assets immediately exposed to landslide hazards. We contextualize the results of the previous and ongoing efforts into how they may be incorporated into decision support systems. We also discuss if and how these analyses would have provided additional knowledge in the past, and identify specific recommendations as to how they could contribute to a more robust decision support system in the future.
CyberShake: Running Seismic Hazard Workflows on Distributed HPC Resources
NASA Astrophysics Data System (ADS)
Callaghan, S.; Maechling, P. J.; Graves, R. W.; Gill, D.; Olsen, K. B.; Milner, K. R.; Yu, J.; Jordan, T. H.
2013-12-01
As part of its program of earthquake system science research, the Southern California Earthquake Center (SCEC) has developed a simulation platform, CyberShake, to perform physics-based probabilistic seismic hazard analysis (PSHA) using 3D deterministic wave propagation simulations. CyberShake performs PSHA by simulating a tensor-valued wavefield of Strain Green Tensors, and then using seismic reciprocity to calculate synthetic seismograms for about 415,000 events per site of interest. These seismograms are processed to compute ground motion intensity measures, which are then combined with probabilities from an earthquake rupture forecast to produce a site-specific hazard curve. Seismic hazard curves for hundreds of sites in a region can be used to calculate a seismic hazard map, representing the seismic hazard for a region. We present a recently completed PSHA study in which we calculated four CyberShake seismic hazard maps for the Southern California area to compare how CyberShake hazard results are affected by different SGT computational codes (AWP-ODC and AWP-RWG) and different community velocity models (Community Velocity Model - SCEC (CVM-S4) v11.11 and Community Velocity Model - Harvard (CVM-H) v11.9). We present our approach to running workflow applications on distributed HPC resources, including systems without support for remote job submission. We show how our approach extends the benefits of scientific workflows, such as job and data management, to large-scale applications on Track 1 and Leadership class open-science HPC resources. We used our distributed workflow approach to perform CyberShake Study 13.4 on two new NSF open-science HPC computing resources, Blue Waters and Stampede, executing over 470 million tasks to calculate physics-based hazard curves for 286 locations in the Southern California region. For each location, we calculated seismic hazard curves with two different community velocity models and two different SGT codes, resulting in over 1100 hazard curves. We will report on the performance of this CyberShake study, four times larger than previous studies. Additionally, we will examine the challenges we face applying these workflow techniques to additional open-science HPC systems and discuss whether our workflow solutions continue to provide value to our large-scale PSHA calculations.
The development of an audit technique to assess the quality of safety barrier management.
Guldenmund, Frank; Hale, Andrew; Goossens, Louis; Betten, Jeroen; Duijm, Nijs Jan
2006-03-31
This paper describes the development of a management model to control barriers devised to prevent major hazard scenarios. Additionally, an audit technique is explained that assesses the quality of such a management system. The final purpose of the audit technique is to quantify those aspects of the management system that have a direct impact on the reliability and effectiveness of the barriers and, hence, the probability of the scenarios involved. First, an outline of the management model is given and its elements are explained. Then, the development of the audit technique is described. Because the audit technique uses actual major hazard scenarios and barriers within these as its focus, the technique achieves a concreteness and clarity that many other techniques often lack. However, this strength is also its limitation, since the full safety management system is not covered with the technique. Finally, some preliminary experiences obtained from several test sites are compiled and discussed.
Wehage, Kristopher; Chenhansa, Panan; Schoenung, Julie M
2017-01-01
GreenScreen® for Safer Chemicals is a framework for comparative chemical hazard assessment. It is the first transparent, open and publicly accessible framework of its kind, allowing manufacturers and governmental agencies to make informed decisions about the chemicals and substances used in consumer products and buildings. In the GreenScreen® benchmarking process, chemical hazards are assessed and classified based on 18 hazard endpoints from up to 30 different sources. The result is a simple numerical benchmark score and accompanying assessment report that allows users to flag chemicals of concern and identify safer alternatives. Although the screening process is straightforward, aggregating and sorting hazard data is tedious, time-consuming, and prone to human error. In light of these challenges, the present work demonstrates the usage of automation to cull chemical hazard data from publicly available internet resources, assign metadata, and perform a GreenScreen® hazard assessment using the GreenScreen® "List Translator." The automated technique, written as a module in the Python programming language, generates GreenScreen® List Translation data for over 3000 chemicals in approximately 30 s. Discussion of the potential benefits and limitations of automated techniques is provided. By embedding the library into a web-based graphical user interface, the extensibility of the library is demonstrated. The accompanying source code is made available to the hazard assessment community. Integr Environ Assess Manag 2017;13:167-176. © 2016 SETAC.
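A minimal sketch of what a list-translation step can look like: hits on authoritative hazard lists are mapped to endpoint classifications and an overall screening flag. The list names, codes, and mapping here are illustrative placeholders, not the actual GreenScreen® List Translator tables or the paper's module.

```python
# (source list, classification) -> (hazard endpoint, level, screening flag)
# These entries are invented placeholders for illustration only.
LIST_MAP = {
    ("IARC", "Group 1"):  ("Carcinogenicity", "High", "LT-1"),
    ("IARC", "Group 2B"): ("Carcinogenicity", "Moderate", "LT-P1"),
    ("EU_CLP", "H360"):   ("Reproductive/Developmental", "High", "LT-1"),
}

def list_translate(list_hits):
    """list_hits: iterable of (source_list, classification) tuples gathered
    automatically, e.g. culled from publicly available internet resources."""
    records, flags = [], set()
    for source, classification in list_hits:
        hit = LIST_MAP.get((source, classification))
        if hit:
            endpoint, level, flag = hit
            records.append({"source": source, "endpoint": endpoint, "level": level})
            flags.add(flag)
    overall = "LT-1" if "LT-1" in flags else ("LT-P1" if "LT-P1" in flags else "LT-UNK")
    return overall, records

print(list_translate([("IARC", "Group 2B"), ("EU_CLP", "H360")]))
```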
Advantages and limitations of potential methods for the analysis of bacteria in milk: a review.
Tabit, Frederick Tawi
2016-01-01
Contamination concerns in the dairy industry are motivated by outbreaks of disease in humans and the inability of thermal processes to eliminate bacteria completely in processed products. HACCP principles are an important tool used in the food industry to identify and control potential food safety hazards in order to meet customer demands and regulatory requirements. Milk testing is of importance to the milk industry regarding quality assurance and monitoring of processed products by researchers, manufacturers and regulatory agencies. Due to the availability of numerous methods used for analysing the microbial quality of milk in literature and differences in priorities of stakeholders, it is sometimes confusing to choose an appropriate method for a particular analysis. The objective of this paper is to review the advantages and disadvantages of selected techniques that can be used in the analysis of bacteria in milk. SSC, HRMA, REP, and RAPD are the top four techniques which are quick and cost-effective and possess adequate discriminatory power for the detection and profiling of bacteria. The following conclusions were arrived at during this review: HRMA, REP and RFLP are the techniques with the most reproducible results, and the techniques with the most discriminatory power are AFLP, PFGE and Raman Spectroscopy.
Kourgialas, Nektarios N; Karatzas, George P
2017-12-01
The present work introduces a national-scale flood hazard assessment methodology using multi-criteria analysis and artificial neural network (ANN) techniques in a GIS environment. The proposed methodology was applied in Greece, where flash floods are a relatively frequent phenomenon that has become more intense over the last decades, causing significant damage in rural and urban sectors. In order to identify the areas most prone to flooding, seven factor maps (that are directly related to flood generation) were combined in a GIS environment. These factor maps are: a) the Flow accumulation (F), b) the Land use (L), c) the Altitude (A), d) the Slope (S), e) the soil Erodibility (E), f) the Rainfall intensity (R), and g) the available water Capacity (C); the proposed method is accordingly named "FLASERC". The flood hazard for each one of these factors is classified into five categories: very low, low, moderate, high, and very high. The factors are combined and processed using the appropriate ANN algorithm tool. For the ANN training process, the spatial distributions of historical flooded points in Greece within the five flood hazard categories of the aforementioned seven factor maps were combined. In this way, the overall flood hazard map for Greece was determined. The final results are verified using additional historical flood events that have occurred in Greece over the last 100 years. In addition, an overview of flood protection measures and adaptation policy approaches is proposed for agricultural and urban areas located in very high flood hazard areas. Copyright © 2017 Elsevier B.V. All rights reserved.
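A minimal sketch of the training-and-classification step under stated assumptions: an ANN (here scikit-learn's MLPClassifier) is trained on the seven factor values sampled at points of known flood hazard class and then applied to every cell of the co-registered factor stack. The data shapes, network size, and random data are placeholders, not the paper's configuration.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

def train_flood_ann(factor_samples, hazard_class):
    """factor_samples: (n_points, 7) values of F, L, A, S, E, R, C at known points;
    hazard_class: labels 0..4 (very low .. very high)."""
    ann = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
    return ann.fit(factor_samples, hazard_class)

def classify_raster(ann, factor_stack):
    """factor_stack: (7, rows, cols) stack of co-registered factor maps."""
    bands, rows, cols = factor_stack.shape
    flat = factor_stack.reshape(bands, -1).T        # one row of 7 factors per cell
    return ann.predict(flat).reshape(rows, cols)    # hazard class per cell

rng = np.random.default_rng(2)
X = rng.random((200, 7))                 # toy training samples
y = rng.integers(0, 5, 200)              # toy hazard classes
stack = rng.random((7, 50, 60))          # toy factor rasters
print(classify_raster(train_flood_ann(X, y), stack).shape)  # (50, 60)
```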
Association of Modality with Mortality among Canadian Aboriginals
Hemmelgarn, Brenda; Rigatto, Claudio; Komenda, Paul; Yeates, Karen; Promislow, Steven; Mojica, Julie; Tangri, Navdeep
2012-01-01
Background and objectives: Previous studies have shown that Aboriginals and Caucasians experience similar outcome on dialysis in Canada. Using the Canadian Organ Replacement Registry, this study examined whether dialysis modality (peritoneal or hemodialysis) impacted mortality in Aboriginal patients. Design, setting, participants, & measurements: This study identified 31,576 adult patients (hemodialysis: Aboriginal=1839, Caucasian=21,430; peritoneal dialysis: Aboriginal=554, Caucasian=6769) who initiated dialysis between January of 2000 and December of 2009. Aboriginal status was identified by self-report. Dialysis modality was determined 90 days after dialysis initiation. Multivariate Cox proportional hazards and competing risk models were constructed to determine the association between race and mortality by dialysis modality. Results: During the study period, 939 (51.1%) Aboriginals and 12,798 (53.3%) Caucasians initiating hemodialysis died, whereas 166 (30.0%) and 2037 (30.1%), respectively, initiating peritoneal dialysis died. Compared with Caucasians, Aboriginals on hemodialysis had a comparable risk of mortality (adjusted hazards ratio=1.04, 95% confidence interval=0.96–1.11, P=0.37). However, on peritoneal dialysis, Aboriginals experienced a higher risk of mortality (adjusted hazards ratio=1.36, 95% confidence interval=1.13–1.62, P=0.001) and technique failure (adjusted hazards ratio=1.29, 95% confidence interval=1.03–1.60, P=0.03) than Caucasians. The risk of technique failure varied by patient age, with younger Aboriginals (<50 years old) more likely to develop technique failure than Caucasians (adjusted hazards ratio=1.76, 95% confidence interval=1.23–2.52, P=0.002). Conclusions: Aboriginals on peritoneal dialysis experience higher mortality and technique failure relative to Caucasians. Reasons for this race disparity in peritoneal dialysis outcomes are unclear. PMID: 22997343
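A minimal sketch of the modelling step reported above: a Cox proportional hazards model of mortality with an Aboriginal-status indicator, fitted separately within each dialysis modality (here with the lifelines package). Column names and the toy data are assumptions; the study adjusted for many more covariates and also used competing-risk models.

```python
import pandas as pd
from lifelines import CoxPHFitter

def modality_specific_hr(df, modality):
    """Hazard ratio for Aboriginal status within one dialysis modality."""
    sub = df[df["modality"] == modality]
    cph = CoxPHFitter()
    cph.fit(sub[["time_months", "died", "aboriginal", "age"]],
            duration_col="time_months", event_col="died")
    return cph.hazard_ratios_["aboriginal"]

# Toy data, invented for illustration only
df = pd.DataFrame({
    "time_months": [12, 30, 45, 8, 60, 24, 36, 50, 22, 18, 40, 5, 55, 33, 14, 47],
    "died":        [1, 0, 1, 1, 0, 1, 0, 1, 1, 1, 0, 1, 0, 1, 1, 0],
    "aboriginal":  [1, 0, 1, 0, 0, 1, 1, 0, 1, 0, 1, 1, 0, 0, 1, 0],
    "age":         [55, 62, 48, 70, 59, 51, 66, 44, 58, 63, 49, 52, 61, 53, 48, 45],
    "modality":    ["HD"] * 8 + ["PD"] * 8,
})
print(modality_specific_hr(df, "PD"))
```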
Development of a Probabilistic Tsunami Hazard Analysis in Japan
DOE Office of Scientific and Technical Information (OSTI.GOV)
Toshiaki Sakai; Tomoyoshi Takeda; Hiroshi Soraoka
2006-07-01
It is meaningful for tsunami assessment, as for seismic design, to evaluate phenomena beyond the design basis. Once the design-basis tsunami height has been set, there remains a possibility that the actual tsunami height will exceed it because of uncertainties regarding tsunami phenomena. Probabilistic tsunami risk assessment consists of estimating the tsunami hazard and the fragility of structures and executing a system analysis. In this report, we apply a method for probabilistic tsunami hazard analysis (PTHA). We introduce a logic tree approach to estimate tsunami hazard curves (relationships between tsunami height and probability of exceedance) and present an example for Japan. Examples of tsunami hazard curves are illustrated, and uncertainty in the tsunami hazard is displayed by the 5-, 16-, 50-, 84- and 95-percentile and mean hazard curves. The results of the PTHA will be used for quantitative assessment of the tsunami risk for important facilities located in coastal areas. Tsunami hazard curves are reasonable input data for structure and system analysis. However, the evaluation method for estimating the fragility of structures and the procedure for system analysis are still being developed. (authors)
SUBSTITUTION REACTIONS FOR THE DETOXIFICATION OF HAZARDOUS CHEMICALS
Chemical Treatment is one of several treatment techniques used for the remediation of toxic and hazardous chemicals. Chemical treatment in this report is defined as substitution of halogens by hydrogens for the conversion of halogenated organic toxicant into its native hydrocarb...
NASA Astrophysics Data System (ADS)
Paprotny, Dominik; Morales-Nápoles, Oswaldo; Jonkman, Sebastiaan N.
2018-03-01
The influence of social and economic change on the consequences of natural hazards has been a matter of much interest recently. However, there is a lack of comprehensive, high-resolution data on historical changes in land use, population, or assets available to study this topic. Here, we present the Historical Analysis of Natural Hazards in Europe (HANZE) database, which contains two parts: (1) HANZE-Exposure with maps for 37 countries and territories from 1870 to 2020 in 100 m resolution and (2) HANZE-Events, a compilation of past disasters with information on dates, locations, and losses, currently limited to floods only. The database was constructed using high-resolution maps of present land use and population, a large compilation of historical statistics, and relatively simple disaggregation techniques and rule-based land use reallocation schemes. Data encompassed in HANZE allow one to "normalize" information on losses due to natural hazards by taking into account inflation as well as changes in population, production, and wealth. This database of past events currently contains 1564 records (1870-2016) of flash, river, coastal, and compound floods. The HANZE database is freely available at https://data.4tu.nl/repository/collection:HANZE.
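As a minimal illustration of the kind of loss "normalization" that HANZE-style exposure layers enable (not code from the database itself), a historical loss can be re-expressed in reference-year terms by scaling for inflation and for growth in exposed population and per-capita wealth; all index values below are placeholders.

```python
def normalize_loss(nominal_loss, year, ref_year, cpi, population, wealth_pc):
    """Scale a historical loss to reference-year conditions.
    cpi, population, wealth_pc: dicts mapping year -> index value (placeholders)."""
    inflation = cpi[ref_year] / cpi[year]
    pop_growth = population[ref_year] / population[year]
    wealth_growth = wealth_pc[ref_year] / wealth_pc[year]
    return nominal_loss * inflation * pop_growth * wealth_growth

cpi = {1970: 18.0, 2020: 105.0}            # consumer price index (placeholder)
population = {1970: 1.2e6, 2020: 1.8e6}    # population of the affected region
wealth_pc = {1970: 9000.0, 2020: 31000.0}  # per-capita wealth or GDP proxy
print(normalize_loss(2.0e6, year=1970, ref_year=2020,
                     cpi=cpi, population=population, wealth_pc=wealth_pc))  # ~6.0e7
```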
SU-F-T-246: Evaluation of Healthcare Failure Mode And Effect Analysis For Risk Assessment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harry, T; University of California, San Diego, La Jolla, CA; Manger, R
Purpose: To compare the Veterans Affairs Healthcare Failure Modes and Effects Analysis (HFMEA) and the AAPM Task Group 100 failure modes and effects analysis (FMEA) risk assessment techniques in the setting of a stereotactic radiosurgery (SRS) procedure. Understanding the differences in the techniques' methodologies and outcomes will provide further insight into the applicability and utility of risk assessment exercises in radiation therapy. Methods: An HFMEA risk assessment was performed on a stereotactic radiosurgery procedure. A previous study from our institution completed an FMEA of our SRS procedure, and the process map generated from this work was used for the HFMEA. The process of performing the HFMEA scoring was analyzed, and the results from both analyses were compared. Results: The key differences between the two risk assessments are the scoring criteria for failure modes and the identification of critical failure modes for potential hazards. The general consensus among the team performing the analyses was that scoring for the HFMEA was simpler and more intuitive than for the FMEA. The FMEA identified 25 critical failure modes while the HFMEA identified 39. Seven of the FMEA critical failure modes were not identified by the HFMEA, and 21 of the HFMEA critical failure modes were not identified by the FMEA. HFMEA, as described by the Veterans Affairs, provides guidelines on which failure modes to address first. Conclusion: HFMEA is a more efficient model for identifying gross risks in a process than FMEA. Clinics with minimal staff, time, and resources can benefit from this type of risk assessment to eliminate or mitigate high-risk hazards with nominal effort. FMEA can provide more in-depth detail, but at the cost of elevated effort.
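For readers unfamiliar with the two scoring schemes compared above, the sketch below contrasts a TG-100-style risk priority number (severity x occurrence x detectability, each factor typically rated 1-10) with an HFMEA-style hazard score (severity x probability, each rated 1-4); the example failure mode and ratings are invented for illustration.

```python
def fmea_rpn(severity, occurrence, detectability):
    """TG-100-style FMEA risk priority number (each factor typically 1-10)."""
    return severity * occurrence * detectability

def hfmea_hazard_score(severity, probability):
    """HFMEA-style hazard score (severity and probability each rated 1-4)."""
    return severity * probability

# Hypothetical failure mode: wrong isocenter coordinates sent to the console.
print(fmea_rpn(severity=9, occurrence=3, detectability=4))   # 108
print(hfmea_hazard_score(severity=4, probability=2))         # 8
```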
NASA Technical Reports Server (NTRS)
Driver, E. T.
1971-01-01
Safety design features in the motor vehicle and highway construction fields result from a systems analysis approach intended to prevent or lessen death, injury, and property damage. Systems analysis considers the prevention of crashes, increased survivability in crashes, and prompt medical attention to injuries, as well as other postcrash salvage measures. The interface of these system elements with the driver, the vehicle, and the environment shows that action on the vehicle system produces the greatest safety payoff through design modifications. New and amended safety standards developed through hazard analysis techniques improved accident statistics in the 1970s; these regulations include driver qualifications and countermeasures to identify the chronic drunken driver, who is involved in more than two-thirds of all auto deaths.
Eder Puestow dilatation of benign rectal stricture following anterior resection.
Woodward, A; Tydeman, G; Lewis, M H
1990-01-01
Benign anastomotic stricture following anterior resection can be difficult to manage when the stricture is proximal. The acceptable surgical options are either a redo low resection with its accompanying hazards or, alternatively, the formation of a permanent colostomy. Although dilatation of such strictures is possible by blind passage of metal bougies, the authors believe that this technique must be regarded as hazardous. A technique of dilatation is described that is usually reserved for esophageal stricture, namely, Eder Puestow dilatation over a guide wire inserted under direct vision. Although this technique may not be without risk, this readily available equipment may be valuable in making a further resection unnecessary.
Modelling Multi Hazard Mapping in Semarang City Using GIS-Fuzzy Method
NASA Astrophysics Data System (ADS)
Nugraha, A. L.; Awaluddin, M.; Sasmito, B.
2018-02-01
One important aspect of disaster mitigation planning is hazard mapping. Hazard mapping can provide spatial information on the distribution of locations that are threatened by disaster. Semarang City, the capital of Central Java Province, is one of the cities with a high intensity of natural disasters. Frequent natural disasters in Semarang City include tidal floods, floods, landslides, and droughts. Therefore, Semarang City needs spatial information from multi-hazard mapping to support its disaster mitigation planning. A multi-hazard map can be modelled from parameters such as slope, rainfall, land use, and soil type. This modelling is done using a GIS method with a scoring and overlay technique. However, the accuracy of the modelling is better if the GIS method is combined with fuzzy logic techniques, which provide a good classification when determining disaster threats. The fuzzy-GIS method builds a multi-hazard map of Semarang City that delivers results with good accuracy and an appropriate spread of threat classes, so as to provide disaster information for the city's disaster mitigation planning. From the multi-hazard modelling using GIS-fuzzy, the membership type with good accuracy is the Gaussian membership, with the smallest RMSE (0.404) and the largest VAF (72.909%) among the membership types tested.
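A minimal sketch of the fuzzy-overlay idea with the Gaussian membership function singled out above: each normalized factor raster gets a membership value in the "hazardous" set and the memberships are combined by a weighted sum. The centers, widths, weights, and two-factor setup are illustrative only, not the paper's calibration.

```python
import numpy as np

def gauss_membership(x, center, sigma):
    """Gaussian fuzzy membership function."""
    return np.exp(-0.5 * ((x - center) / sigma) ** 2)

def fuzzy_hazard(slope, rainfall, weights=(0.5, 0.5)):
    """Toy two-factor fuzzy overlay on factor rasters normalized to 0..1."""
    mu_slope = gauss_membership(slope, center=1.0, sigma=0.30)    # steeper = more hazardous
    mu_rain = gauss_membership(rainfall, center=1.0, sigma=0.25)  # wetter = more hazardous
    return weights[0] * mu_slope + weights[1] * mu_rain

rng = np.random.default_rng(3)
slope = rng.random((40, 40))
rainfall = rng.random((40, 40))
hazard = fuzzy_hazard(slope, rainfall)
print(np.quantile(hazard, [0.5, 0.9]).round(3))  # e.g. median and 90th-percentile hazard
```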
NASA Astrophysics Data System (ADS)
Hart, Brian K.; Griffiths, Peter R.
1998-06-01
Partial least squares (PLS) regression has been evaluated as a robust calibration technique for over 100 hazardous air pollutants (HAPs) measured by open-path Fourier transform infrared (OP/FT-IR) spectrometry. PLS has the advantage over the currently recommended calibration method of classical least squares (CLS) in that it can use the whole usable spectrum (700-1300 cm⁻¹, 2000-2150 cm⁻¹, and 2400-3000 cm⁻¹) and detect several analytes simultaneously. Up to one hundred HAPs synthetically added to OP/FT-IR backgrounds have been simultaneously calibrated and detected using PLS. PLS also has the advantage of requiring less preprocessing of spectra than is required in CLS calibration schemes, allowing PLS to provide user-independent, real-time analysis of OP/FT-IR spectra.
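A minimal sketch of a multi-analyte PLS calibration in the spirit described above, using scikit-learn on synthetic mixture spectra: one concentration column per analyte lets a single model predict several species at once. The spectra, concentrations, and number of latent variables are synthetic assumptions, not the OP/FT-IR data of the study.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(4)
n_train, n_points, n_analytes = 120, 600, 5
pure = rng.random((n_analytes, n_points))              # pure-component "spectra"
C_train = rng.random((n_train, n_analytes))            # path-averaged concentrations
X_train = C_train @ pure + 0.01 * rng.standard_normal((n_train, n_points))

pls = PLSRegression(n_components=8)                    # number of latent variables (assumed)
pls.fit(X_train, C_train)

C_new = rng.random((3, n_analytes))
X_new = C_new @ pure + 0.01 * rng.standard_normal((3, n_points))
print(np.abs(pls.predict(X_new) - C_new).max())        # small residuals on clean synthetic data
```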
A critical analysis of hazard resilience measures within sustainability assessment frameworks
DOE Office of Scientific and Technical Information (OSTI.GOV)
Matthews, Elizabeth C., E-mail: echiso1@lsu.edu; Sattler, Meredith, E-mail: msattler@lsu.edu; Friedland, Carol J., E-mail: friedland@lsu.edu
Today, numerous sustainability assessment frameworks (SAFs) exist to guide designers in achieving sustainable performance in the design of structures and communities. SAFs are beneficial in educating users and are useful tools for incorporating sustainability strategies into planning, design, and construction; however, there is currently a substantial gap in the ability of existing SAFs to incorporate hazard resistance and hazard mitigation in the broader context of sustainable design. This paper analyzes the incorporation of hazard resistant design and hazard mitigation strategies within SAFs via a multi-level analysis of eleven SAFs. The SAFs analyzed range in scale of application (i.e. building, site, community). Three levels of analysis are presented: (1) macro-level analysis comparing the number of measures strictly addressing resilience versus sustainability, (2) meso-level analysis of the coverage of types of hazards within SAFs (e.g. flood, fire), and (3) micro-level analysis of SAF measures connected to flood-related hazard resilience. The results demonstrate that hazard resistance and hazard mitigation do not figure prominently in the intent of SAFs and that weaknesses in resilience coverage exist that have the potential to lead to the design of structures and communities that are still highly vulnerable to the impacts of extreme events. - Highlights: • Sustainability assessment frameworks (SAFs) were analyzed for resilience coverage • Hazard resistance and mitigation do not figure prominently in the intent of SAFs • Approximately 75% of SAFs analyzed address three or fewer hazards • Lack of economic measures within SAFs could impact resilience and sustainability • Resilience measures for flood hazards are not consistently included in SAFs.
DOE Office of Scientific and Technical Information (OSTI.GOV)
PECH, S.H.
This report describes the methodology used in conducting the K Basins Hazard Analysis, which provides the foundation for the K Basins Final Safety Analysis Report. This hazard analysis was performed in accordance with guidance provided by DOE-STD-3009-94, Preparation Guide for U. S. Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports and implements the requirements of DOE Order 5480.23, Nuclear Safety Analysis Report.
NONDESTRUCTIVE TESTING (NDT) TECHNIQUES TO DETECT CONTAINED SUBSURFACE HAZARDOUS WASTE
The project involves the detection of buried containers with NDT (remote-sensing) techniques. Seventeen techniques were considered and four were ultimately decided upon. They were: electromagnetic induction (EMI); metal detection (MD); magnetometer (MAG); and ground penetrating r...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Carr, F.H.
1990-02-01
Because of the increasingly strict enforcement of the Comprehensive Environmental Response, Compensation and Liability Act (CERCLA) and the Resource Conservation and Recovery Act (RCRA), generators of hazardous waste are compelled to investigate the hazardous waste treatment, storage and disposal facility (TSDF) they use. This investigation must include an environmental and a financial audit. Simple audits may be performed by the hazardous waste generator, while more thorough ones such as those performed for groups of generators are more likely to be conducted by environmental consultants familiar with treatment, storage, and disposal techniques and the regulatory framework that guides them.
Threat and error management for anesthesiologists: a predictive risk taxonomy
Ruskin, Keith J.; Stiegler, Marjorie P.; Park, Kellie; Guffey, Patrick; Kurup, Viji; Chidester, Thomas
2015-01-01
Purpose of review Patient care in the operating room is a dynamic interaction that requires cooperation among team members and reliance upon sophisticated technology. Most human factors research in medicine has been focused on analyzing errors and implementing system-wide changes to prevent them from recurring. We describe a set of techniques that has been used successfully by the aviation industry to analyze errors and adverse events and explain how these techniques can be applied to patient care. Recent findings Threat and error management (TEM) describes adverse events in terms of risks or challenges that are present in an operational environment (threats) and the actions of specific personnel that potentiate or exacerbate those threats (errors). TEM is a technique widely used in aviation, and can be adapted for use in a medical setting to predict high-risk situations and prevent errors in the perioperative period. A threat taxonomy is a novel way of classifying and predicting the hazards that can occur in the operating room. TEM can be used to identify error-producing situations, analyze adverse events, and design training scenarios. Summary TEM offers a multifaceted strategy for identifying hazards, reducing errors, and training physicians. A threat taxonomy may improve analysis of critical events with subsequent development of specific interventions, and may also serve as a framework for training programs in risk mitigation. PMID:24113268
Stocka, Jolanta; Tankiewicz, Maciej; Biziuk, Marek; Namieśnik, Jacek
2011-01-01
Pesticides are among the most dangerous environmental pollutants because of their stability, mobility and long-term effects on living organisms. Their presence in the environment is a particular danger. It is therefore crucial to monitor pesticide residues using all available analytical methods. The analysis of environmental samples for the presence of pesticides is very difficult: the processes involved in sample preparation are labor-intensive and time-consuming. To date, it has been standard practice to use large quantities of organic solvents in the sample preparation process; but as these solvents are themselves hazardous, solvent-less and solvent-minimized techniques are becoming popular. The application of Green Chemistry principles to sample preparation is primarily leading to the miniaturization of procedures and the use of solvent-less techniques, and these are discussed in the paper. PMID:22174632
NASA Technical Reports Server (NTRS)
Curtis, S. B.; Wilkinson, M. C.
1971-01-01
The secondary dose contribution expected from the heavy primaries of the galactic cosmic rays was evaluated by a calculational technique developed in this study. Improvements in the solar and galactic cosmic ray environments made possible by recent experimental and theoretical work are discussed and presented. The recommendations of the National Academy of Sciences' space radiation study panel are used, in conjunction with a shielding analysis, to evaluate the radiation status of an astronaut during the triple solar particle event of 10, 14, 16 July 1959.
29 CFR 1908.9 - Monitoring and evaluation.
Code of Federal Regulations, 2010 CFR
2010-07-01
... ability to identify hazards in the workplaces which they have visited; their ability to determine employee... application of applicable Federal or State statutes, regulations or standards; their knowledge and application of appropriate hazard correction techniques and approaches; their knowledge and application of the...
HANDBOOK: CONTROL TECHNIQUES FOR HAZARDOUS AIR POLLUTANTS
This manual is a revision of the first (1986) edition of the Handbook: Control Technologies for Hazardous Air Pollutants, which incorporated information from numerous sources into a single, self-contained reference source focusing on the design and cost of VOC and partic...
Phytoremediation of Contaminated Soil and Ground Water at Hazardous Waste Sites
The purpose of this issue paper is to provide a concise discussion of the processes associated with the use of phytoremediation as a cleanup or containment technique for remediation of hazardous waste sites. Introductory material on plant processes is ...
Chemistry laboratory safety manual available
NASA Technical Reports Server (NTRS)
Elsbrock, R. G.
1968-01-01
Chemistry laboratory safety manual outlines safe practices for handling hazardous chemicals and chemistry laboratory equipment. Included are discussions of chemical hazards relating to fire, health, explosion, safety equipment and procedures for certain laboratory techniques and manipulations involving glassware, vacuum equipment, acids, bases, and volatile solvents.
Reduction of Fire Hazard in Materials for Irrigators and Water Collectors in Cooling Towers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smirnov, N. V.; Konstantinova, N. I., E-mail: konstantinova-n@inbox.ru; Gordon, E. P.
A way of reducing the fire hazard of PVC film used to make cooling-tower irrigators and water collectors is examined. A new generation of fire retardant, nanostructured magnesium hydroxide, is used to impart fire retardant properties. The fabrication technology is optimized with a roller-calendering manufacturing technique, and the permissible ranges of fire hazard indicators for materials in irrigators and water collectors are determined.
Processing LiDAR Data to Predict Natural Hazards
NASA Technical Reports Server (NTRS)
Fairweather, Ian; Crabtree, Robert; Hager, Stacey
2008-01-01
ELF-Base and ELF-Hazards (wherein 'ELF' signifies 'Extract LiDAR Features' and 'LiDAR' signifies 'light detection and ranging') are developmental software modules for processing remote-sensing LiDAR data to identify past natural hazards (principally, landslides) and predict future ones. ELF-Base processes raw LiDAR data, including LiDAR intensity data that are often ignored in other software, to create digital terrain models (DTMs) and digital feature models (DFMs) with sub-meter accuracy. ELF-Hazards fuses raw LiDAR data, data from multispectral and hyperspectral optical images, and DTMs and DFMs generated by ELF-Base to generate hazard risk maps. Advanced algorithms in these software modules include line-enhancement and edge-detection algorithms, surface-characterization algorithms, and algorithms that implement innovative data-fusion techniques. The line-extraction and edge-detection algorithms enable users to locate such features as faults and landslide headwall scarps. Also implemented in this software are improved methodologies for identification and mapping of past landslide events by use of (1) accurate, ELF-derived surface characterizations and (2) three LiDAR/optical-data-fusion techniques: post-classification data fusion, maximum-likelihood estimation modeling, and hierarchical within-class discrimination. This software is expected to enable faster, more accurate forecasting of natural hazards than has previously been possible.
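The sketch below illustrates, under simplified assumptions, the kind of surface-characterization and edge-detection step described above: slope and Sobel edge strength computed on a toy DTM to flag a scarp-like break in slope. The grid, noise level, and percentile threshold are invented for the example; the actual ELF-Base and ELF-Hazards algorithms are not reproduced here.

```python
import numpy as np
from scipy import ndimage

# Illustrative DTM: a 100x100 grid (1 m cells) with a step that mimics a
# headwall scarp; a real DTM would be gridded from the LiDAR ground returns.
dtm = np.fromfunction(lambda r, c: 50.0 - 0.1 * r, (100, 100))
dtm[40:, :] -= 5.0                                   # abrupt 5 m drop at row 40
dtm += np.random.default_rng(1).normal(0, 0.05, dtm.shape)

# Slope magnitude from finite differences (cell size = 1 m).
dz_dy, dz_dx = np.gradient(dtm, 1.0, 1.0)
slope_deg = np.degrees(np.arctan(np.hypot(dz_dx, dz_dy)))

# Edge detection (Sobel) to enhance linear features such as scarps or faults.
edges = np.hypot(ndimage.sobel(dtm, axis=0), ndimage.sobel(dtm, axis=1))
scarp_candidates = edges > np.percentile(edges, 99)

print("cells flagged as possible scarp edges:", int(scarp_candidates.sum()))
print("max slope (deg):", round(float(slope_deg.max()), 1))
```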
Guo, Xuezhen; Claassen, G D H; Oude Lansink, A G J M; Saatkamp, H W
2014-06-01
Economic analysis of hazard surveillance in livestock production chains is essential for surveillance organizations (such as food safety authorities) when making scientifically based decisions on optimization of resource allocation. To enable this, quantitative decision support tools are required at two levels of analysis: (1) single-hazard surveillance system and (2) surveillance portfolio. This paper addresses the first level by presenting a conceptual approach for the economic analysis of single-hazard surveillance systems. The concept includes objective and subjective aspects of single-hazard surveillance system analysis: (1) a simulation part to derive an efficient set of surveillance setups based on the technical surveillance performance parameters (TSPPs) and the corresponding surveillance costs, i.e., objective analysis, and (2) a multi-criteria decision making model to evaluate the impacts of the hazard surveillance, i.e., subjective analysis. The conceptual approach was checked for (1) conceptual validity and (2) data validity. Issues regarding the practical use of the approach, particularly the data requirement, were discussed. We concluded that the conceptual approach is scientifically credible for economic analysis of single-hazard surveillance systems and that the practicability of the approach depends on data availability. Copyright © 2014 Elsevier B.V. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Joyce, M. J.; Aspinall, M. D.; Cave, F. D.
Pulse-shape discrimination (PSD) in fast, organic scintillation detectors is a long-established technique used to separate neutrons and γ rays in mixed radiation fields. In the analogue domain the method can achieve separation in real time, but all knowledge of the pulses themselves is lost, thereby preventing the possibility of any post- or repeated analysis. Also, it is typically reliant on electronic systems that are largely obsolete and which require significant experience to set up. In the digital domain, PSD is often more flexible but significant post-processing has usually been necessary to obtain neutron/γ-ray separation. Moreover, the scintillation media on which the technique relies usually have a low flash point and are thus deemed hazardous. This complicates the ease with which they are used in industrial applications. In this paper, results obtained with a new portable digital pulse-shape discrimination instrument are described. This instrument provides real-time, digital neutron/γ separation whilst preserving the synchronization with the time-of-arrival for each event, and realizing throughputs of 3 × 10^6 events per second. Furthermore, this system has been tested with a scintillation medium that is non-flammable and not hazardous. (authors)
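Digital PSD is often implemented as a charge-comparison (tail-to-total integral) measure; the following sketch shows that general idea on toy pulses and is not the algorithm of the instrument described above. The decay constants, gate position, and noise level are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.arange(0, 200, 1.0)  # sample times in ns

def pulse(tail_fraction):
    """Toy scintillation pulse: fast plus slow decay components."""
    fast = (1 - tail_fraction) * np.exp(-t / 5.0)
    slow = tail_fraction * np.exp(-t / 50.0)
    return fast + slow + rng.normal(0, 0.002, t.size)

def tail_total_ratio(p, gate_start=20):
    """Charge-comparison PSD figure: tail integral over total integral."""
    total = np.trapz(p, t)
    tail = np.trapz(p[gate_start:], t[gate_start:])
    return tail / total

# Neutrons deposit more light in the slow component than gamma rays.
gammas   = [tail_total_ratio(pulse(0.05)) for _ in range(500)]
neutrons = [tail_total_ratio(pulse(0.20)) for _ in range(500)]

threshold = (np.mean(gammas) + np.mean(neutrons)) / 2
print(f"discrimination threshold ~ {threshold:.3f}")
```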
Federal Register 2010, 2011, 2012, 2013, 2014
2013-02-19
... Manufacturing Practice and Hazard Analysis and Risk- Based Preventive Controls for Human Food; Extension of... Analysis and Risk-Based Preventive Controls for Human Food.'' FOR FURTHER INFORMATION CONTACT: Domini Bean... Practice and Hazard Analysis and Risk-Based Preventive Controls for Human Food'' with a 120-day comment...
Protection of agriculture against drought in Slovenia based on vulnerability and risk assessment
NASA Astrophysics Data System (ADS)
Dovžak, M.; Stanič, S.; Bergant, K.; Gregorič, G.
2012-04-01
Past and recent extreme events, like earthquakes, extreme droughts, heat waves, flash floods and volcanic eruptions, continuously remind us that natural hazards are an integral component of the global environment. Despite rapid improvement of detection techniques, many of these events evade long-term or even mid-term prediction and can thus have disastrous impacts on affected communities and the environment. Effective mitigation and preparedness strategies can be developed only after gaining an understanding of how and where such hazards may occur, what causes them, what circumstances increase their severity, and what their impacts may be; in recent years their study has emerged under the common title of natural hazard management. The first step in natural risk management is risk identification, which includes hazard analysis and monitoring, vulnerability analysis and determination of the risk level. The presented research focuses on drought, which is at present the most widespread as well as still unpredictable natural hazard. Its primary aim was to assess the frequency and the consequences of droughts in Slovenia based on drought events in the past, to develop a methodology for drought vulnerability and risk assessment that can be applied in Slovenia and more widely in South-Eastern Europe, to prepare maps of drought risk and crop vulnerability, and to provide guidelines to reduce the vulnerability of crops. Using the amounts of plant-available water in the soil, slope inclination, solar radiation, land use and irrigation infrastructure data sets as inputs, we obtained vulnerability maps for Slovenia using GIS-based multi-criteria decision analysis with a weighted linear combination of the input parameters. The weight configuration was optimized by comparing the modelled crop damage to the assessed actual damage, which was available for the extensive drought case in 2006. Drought risk was obtained quantitatively as a function of hazard and vulnerability and presented in the same way as the vulnerability, as a GIS-based map. Risk maps show geographic regions in Slovenia where droughts pose a major threat to agriculture and, together with the vulnerability maps, provide the basis for drought management, in particular for the appropriate mitigation and response actions in specific regions. The developed methodology is expected to be applied to the entire region of South-Eastern Europe within the initiative of the Drought Management Centre for Southeastern Europe.
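A minimal sketch of the weighted linear combination step, assuming a few illustrative raster layers and weights; in the study the weights were tuned against assessed crop damage for the 2006 drought, which is only hinted at in a comment below.

```python
import numpy as np

rng = np.random.default_rng(3)

# Illustrative input layers on a common grid (already co-registered):
available_water = rng.uniform(50, 250, (4, 4))   # plant-available water, mm
slope           = rng.uniform(0, 30, (4, 4))     # slope inclination, degrees
irrigated       = rng.integers(0, 2, (4, 4))     # 1 = irrigation available

def normalize(layer, invert=False):
    """Rescale a layer to 0..1; invert when high values reduce vulnerability."""
    z = (layer - layer.min()) / (layer.max() - layer.min())
    return 1 - z if invert else z

# Weighted linear combination; in practice the weights would be optimized
# against observed crop damage (e.g. the 2006 drought) rather than fixed here.
weights = {"water": 0.5, "slope": 0.3, "irrigation": 0.2}
vulnerability = (weights["water"] * normalize(available_water, invert=True)
                 + weights["slope"] * normalize(slope)
                 + weights["irrigation"] * (1 - irrigated))

hazard = rng.uniform(0, 1, (4, 4))               # e.g. a drought frequency index
risk = hazard * vulnerability                    # risk as a function of hazard and vulnerability
print(np.round(risk, 2))
```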
TECHNICAL GUIDANCE DOCUMENT: INSPECTION TECHNIQUES FOR THE FABRICATION OF GEOMEMBRANE FIELD SEAMS
Subtitle C of the Resource Conservation and Recovery Act (RCRA) requires the U.S. Environmental Protection Agency (EPA) to establish a Federal hazardous waste management program. This program must ensure that hazardous wastes are handled safely from generation until final dispos...
GROUTING TECHNIQUES IN BOTTOM SEALING OF HAZARDOUS WASTE SITES
Bottom sealing of hazardous waste sites involves the injection or insertion of an inert impermeable and continuous horizontal barrier in soil below the source of contamination. This type of containment strategy could be used in conjunction with other technology such as slurry wal...
ADVANCES IN ENCAPSULATION TECHNOLOGIES FOR THE MANAGEMENT OF MERCURY CONTAMINATED HAZARDOUS WASTES
Although industrial and commercial uses of mercury have been curtailed in recent times, there is a demonstrated need for the development of reliable hazardous waste management techniques because of historic operations that have led to significant contamination and ongoing hazardo...
Volcanic hazards and their mitigation: progress and problems
Tilling, R.I.
1989-01-01
A review of hazards mitigation approaches and techniques indicates that significant advances have been made in hazards assessment, volcano monitoring, and eruption forecasting. For example, the remarkable accuracy of the predictions of dome-building events at Mount St. Helens since June 1980 is unprecedented. Yet a predictive capability for more voluminous and explosive eruptions still has not been achieved. Studies of magma-induced seismicity and ground deformation continue to provide the most systematic and reliable data for early detection of precursors to eruptions and shallow intrusions. In addition, some other geophysical monitoring techniques and geochemical methods have been refined and are being more widely applied and tested. Comparison of the four major volcanic disasters of the 1980s (Mount St. Helens, U.S.A. (1980); El Chichon, Mexico (1982); Galunggung, Indonesia (1982); and Nevado del Ruiz, Colombia (1985)) illustrates the importance of predisaster geoscience studies, volcanic hazards assessments, volcano monitoring, contingency planning, and effective communications between scientists and authorities. -from Author
Trace element analysis of coal by neutron activation.
NASA Technical Reports Server (NTRS)
Sheibley, D. W.
1973-01-01
The irradiation, counting, and data reduction scheme is described for an analysis capability of 1000 samples per year. Up to 56 elements are reported on each sample. The precision and accuracy of the method are shown for 25 elements designated as hazardous by the Environmental Protection Agency (EPA). The interference corrections for selenium and ytterbium on mercury and ytterbium on selenium are described. The effect of bromine and antimony on the determination of arsenic is also mentioned. The use of factorial design techniques to evaluate interferences in the determination of mercury, selenium, and arsenic is shown. Some typical trace element results for coal, fly ash, and bottom ash are given.
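A linear interference correction of the kind described might look like the following sketch; the sensitivity coefficients are hypothetical and would in practice be derived from single-element standards.

```python
# Minimal sketch of a linear interference correction of the kind described for
# neutron activation analysis; the coefficients k are hypothetical sensitivity
# factors that would be determined experimentally from single-element standards.
def corrected_mercury(hg_apparent, se_conc, yb_conc, k_se=0.012, k_yb=0.030):
    """Subtract the apparent Hg signal contributed by Se and Yb interferences."""
    return hg_apparent - k_se * se_conc - k_yb * yb_conc

# Example: apparent Hg of 0.50 ppm in a coal sample with 2.0 ppm Se and 0.4 ppm Yb.
print(corrected_mercury(0.50, se_conc=2.0, yb_conc=0.4))  # -> 0.464 ppm
```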
Measurement methods for human exposure analysis.
Lioy, P J
1995-01-01
The general methods used to complete measurements of human exposures are identified and illustrations are provided for the cases of indirect and direct methods used for exposure analysis. The application of the techniques for external measurements of exposure, microenvironmental and personal monitors, are placed in the context of the need to test hypotheses concerning the biological effects of concern. The linkage of external measurements to measurements made in biological fluids is explored for a suite of contaminants. This information is placed in the context of the scientific framework used to conduct exposure assessment. Examples are taken from research on volatile organics and for a large scale problem: hazardous waste sites. PMID:7635110
ESSAA: Embedded system safety analysis assistant
NASA Technical Reports Server (NTRS)
Wallace, Peter; Holzer, Joseph; Guarro, Sergio; Hyatt, Larry
1987-01-01
The Embedded System Safety Analysis Assistant (ESSAA) is a knowledge-based tool that can assist in identifying disaster scenarios. Embedded software can issue hazardous control commands to the surrounding hardware. ESSAA is intended to work from outputs to inputs, as a complement to simulation and verification methods. Rather than treating the software in isolation, it examines the context in which the software is to be deployed. Given a specified disastrous outcome, ESSAA works from a qualitative, abstract model of the complete system to infer sets of environmental conditions and/or failures that could cause a disastrous outcome. The scenarios can then be examined in depth for plausibility using existing techniques.
Lumia, Margaret E.; Gentile, Charles; Gochfeld, Michael; Efthimion, Philip; Robson, Mark
2015-01-01
This study evaluates a new decontamination technique for the mitigation and abatement of hazardous particulates. The traditional decontamination methods used to clean facilities and equipment are time-consuming, prolonging workers' exposure time, may generate airborne hazards, and can be expensive. The use of a removable thin film coating as a decontamination technique for surface contamination proved to be a more efficient method of decontamination. This method was tested at three different sites on different hazardous metals. One application of the coating reduced the levels of these metals by 90% and achieved an average reduction of one order of magnitude. The paired t-tests that were performed for each metal demonstrated that there was a statistically significant reduction of the metal after the use of the coating: lead (p = 0.03), beryllium (p = 0.05), aluminum (p = 0.006), iron (p = 0.0001), and copper (p = 0.004). The Kendall tau-b correlation coefficient demonstrates that there was a positive correlation between the initial levels of contamination and the removal efficiency for all the samples taken from different locations on the floor for each of the three sites. This new decontamination technique worked efficiently, requiring only one application, which decreased exposure time and did not generate any airborne dust. PMID:19437305
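The statistical comparison reported above can be reproduced in outline with a paired t-test and Kendall's tau-b; the wipe-sample values below are invented to illustrate the calculation, not the study's data.

```python
import numpy as np
from scipy import stats

# Illustrative paired surface-contamination measurements (µg per wipe sample)
# before and after one application of the strippable coating.
before = np.array([120.0, 85.0, 240.0, 60.0, 310.0, 150.0])
after  = np.array([ 11.0,  9.0,  20.0,  7.0,  25.0,  16.0])

t_stat, p_value = stats.ttest_rel(before, after)                      # paired t-test
efficiency = (before - after) / before                                # removal efficiency
tau, tau_p = stats.kendalltau(before, efficiency)                     # initial level vs. efficiency

print(f"paired t-test: t={t_stat:.2f}, p={p_value:.4f}")
print(f"Kendall tau-b: tau={tau:.2f}, p={tau_p:.3f}")
print(f"mean reduction: {100 * (1 - after.mean() / before.mean()):.0f}%")
```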
Compiler-assisted multiple instruction rollback recovery using a read buffer
NASA Technical Reports Server (NTRS)
Alewine, Neal J.; Chen, Shyh-Kwei; Fuchs, W. Kent; Hwu, Wen-Mei W.
1995-01-01
Multiple instruction rollback (MIR) is a technique that has been implemented in mainframe computers to provide rapid recovery from transient processor failures. Hardware-based MIR designs eliminate rollback data hazards by providing data redundancy implemented in hardware. Compiler-based MIR designs have also been developed which remove rollback data hazards directly with data-flow transformations. This paper describes compiler-assisted techniques to achieve multiple instruction rollback recovery. We observe that some data hazards resulting from instruction rollback can be resolved efficiently by providing an operand read buffer while others are resolved more efficiently with compiler transformations. The compiler-assisted scheme presented consists of hardware that is less complex than shadow files, history files, history buffers, or delayed write buffers, while experimental evaluation indicates performance improvement over compiler-based schemes.
Risk analysis based on hazards interactions
NASA Astrophysics Data System (ADS)
Rossi, Lauro; Rudari, Roberto; Trasforini, Eva; De Angeli, Silvia; Becker, Joost
2017-04-01
Despite an increasing need for open, transparent, and credible multi-hazard risk assessment methods, models, and tools, the availability of comprehensive risk information needed to inform disaster risk reduction is limited, and the level of interaction across hazards is not systematically analysed. Risk assessment methodologies for different hazards often produce risk metrics that are not comparable. Hazard interactions (the consecutive occurrence of two or more different events) are generally neglected, resulting in strongly underestimated risk assessments in the most exposed areas. This study presents cases of interaction between different hazards, showing how subsidence can affect coastal and river flood risk (Jakarta and Bandung, Indonesia) or how flood risk is modified after a seismic event (Italy). The analysis of well documented real study cases, based on a combination of Earth Observation and in-situ data, serves as a basis for the formalisation of a multi-hazard methodology, identifying gaps and research frontiers. Multi-hazard risk analysis is performed through the RASOR platform (Rapid Analysis and Spatialisation Of Risk). A scenario-driven query system allows users to simulate future scenarios based on existing and assumed conditions, to compare them with historical scenarios, and to model multi-hazard risk both before and during an event (www.rasor.eu).
Specification of the ISS Plasma Environment Variability
NASA Technical Reports Server (NTRS)
Minow, Joseph I.; Neergaard, Linda F.; Bui, Them H.; Mikatarian, Ronald R.; Barsamian, H.; Koontz, Steven L.
2002-01-01
Quantifying the spacecraft charging risks and corresponding hazards for the International Space Station (ISS) requires a plasma environment specification describing the natural variability of ionospheric temperature (Te) and density (Ne). Empirical ionospheric specification and forecast models such as the International Reference Ionosphere (IRI) model typically only provide estimates of long term (seasonal) mean Te and Ne values for the low Earth orbit environment. Knowledge of the Te and Ne variability as well as the likelihood of extreme deviations from the mean values is required to estimate both the magnitude and frequency of occurrence of potentially hazardous spacecraft charging environments for a given ISS construction stage and flight configuration. This paper describes the statistical analysis of historical ionospheric low Earth orbit plasma measurements used to estimate Ne, Te variability in the ISS flight environment. The statistical variability analysis of Ne and Te enables calculation of the expected frequency of occurrence of any particular values of Ne and Te, especially those that correspond to possibly hazardous spacecraft charging environments. The database used in the original analysis included measurements from the AE-C, AE-D, and DE-2 satellites. Recent work has added additional satellites and ground-based incoherent scatter radar observations to the database. Deviations of the data values from the IRI estimated Ne, Te parameters for each data point provide a statistical basis for modeling the deviations of the plasma environment from the IRI model output. This technique, while developed specifically for the Space Station analysis, can also be generalized to provide ionospheric plasma environment risk specification models for low Earth orbit over an altitude range of 200 km through approximately 1000 km.
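In outline, turning deviations from a climatological model into an empirical frequency of occurrence could look like the sketch below; the model values, lognormal scatter, and thresholds are placeholders, not the AE-C/AE-D/DE-2 statistics.

```python
import numpy as np

rng = np.random.default_rng(7)

# Illustrative stand-ins for along-orbit measurements and climatological model
# values (e.g. IRI-like means); the numbers are invented for this sketch.
ne_model, te_model = 2.0e11, 1500.0                      # m^-3, K
ne_meas = ne_model * rng.lognormal(0.0, 0.5, 10_000)
te_meas = te_model * rng.lognormal(0.0, 0.2, 10_000)

# Deviations from the model form the statistical basis for variability.
ne_dev = ne_meas / ne_model
te_dev = te_meas / te_model

# Empirical quantiles and the frequency of exceeding arbitrary thresholds.
print("Ne deviation 5th/95th percentiles:", np.round(np.percentile(ne_dev, [5, 95]), 2))
print("P(Ne < 20% of model value):", float(np.mean(ne_dev < 0.2)))
print("P(Te > 1.5 x model value):", float(np.mean(te_dev > 1.5)))
```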
DOT National Transportation Integrated Search
1985-10-01
This report summarizes the findings from the second phase of a two-part analysis of hazardous materials truck routes in the Dallas-Fort Worth area. Phase II of this study analyzes the risk of transporting hazardous materials on freeways and arterial ...
21 CFR 123.6 - Hazard analysis and Hazard Analysis Critical Control Point (HACCP) plan.
Code of Federal Regulations, 2013 CFR
2013-04-01
... identified food safety hazards, including as appropriate: (i) Critical control points designed to control... control points designed to control food safety hazards introduced outside the processing plant environment... Control Point (HACCP) plan. 123.6 Section 123.6 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF...
21 CFR 123.6 - Hazard analysis and Hazard Analysis Critical Control Point (HACCP) plan.
Code of Federal Regulations, 2011 CFR
2011-04-01
... identified food safety hazards, including as appropriate: (i) Critical control points designed to control... control points designed to control food safety hazards introduced outside the processing plant environment... Control Point (HACCP) plan. 123.6 Section 123.6 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF...
21 CFR 123.6 - Hazard analysis and Hazard Analysis Critical Control Point (HACCP) plan.
Code of Federal Regulations, 2014 CFR
2014-04-01
... identified food safety hazards, including as appropriate: (i) Critical control points designed to control... control points designed to control food safety hazards introduced outside the processing plant environment... Control Point (HACCP) plan. 123.6 Section 123.6 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF...
Novel Flood Detection and Analysis Method Using Recurrence Property
NASA Astrophysics Data System (ADS)
Wendi, Dadiyorto; Merz, Bruno; Marwan, Norbert
2016-04-01
Temporal changes in flood hazard are known to be difficult to detect and attribute because of the multiple drivers involved, which include processes that are non-stationary and highly variable. These drivers, such as human-induced climate change, natural climate variability, implementation of flood defences, river training, or land use change, can act on a range of space-time scales and influence or mask each other. Flood time series may show complex behavior that varies at a range of time scales and may cluster in time. This study focuses on the application of recurrence-based data analysis techniques (recurrence plots) for understanding and quantifying spatio-temporal changes in flood hazard in Germany. The recurrence plot is known as an effective tool to visualize the dynamics of phase-space trajectories, i.e. constructed from a time series by using an embedding dimension and a time delay, and it is known to be effective in analyzing non-stationary and non-linear time series. The emphasis is on the identification of characteristic recurrence properties that could associate typical dynamic behavior with certain flood situations.
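A recurrence plot can be built directly from a time-delay embedding; the sketch below shows a minimal construction and the simplest recurrence quantification measure (recurrence rate) on a synthetic discharge-like series. The embedding dimension, delay, and threshold choice are illustrative assumptions.

```python
import numpy as np

def recurrence_matrix(x, dim=3, delay=1, eps=None):
    """Binary recurrence matrix of a time series after time-delay embedding."""
    n = len(x) - (dim - 1) * delay
    # Phase-space vectors built with embedding dimension `dim` and delay `delay`.
    vectors = np.column_stack([x[i * delay: i * delay + n] for i in range(dim)])
    dist = np.linalg.norm(vectors[:, None, :] - vectors[None, :, :], axis=-1)
    if eps is None:
        eps = 0.1 * dist.max()          # recurrence threshold (illustrative choice)
    return (dist <= eps).astype(int)

# Illustrative "discharge" series: a seasonal cycle plus noise and two flood-like spikes.
t = np.arange(365 * 3)
series = 100 + 30 * np.sin(2 * np.pi * t / 365) + np.random.default_rng(5).normal(0, 5, t.size)
series[[400, 800]] += 200

R = recurrence_matrix(series, dim=3, delay=7)
recurrence_rate = R.mean()              # simplest recurrence quantification measure
print(f"recurrence rate: {recurrence_rate:.3f}")
```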
NASA Astrophysics Data System (ADS)
Kuznetsov, Andrey; Evsenin, Alexey; Gorshkov, Igor; Osetrov, Oleg; Vakhtin, Dmitry
2009-12-01
A device for the detection of explosives, radioactive and heavily shielded nuclear materials in luggage and cargo containers, based on the Nanosecond Neutron Analysis/Associated Particles Technique (NNA/APT), is under construction. The detection module consists of a small neutron generator with a built-in position-sensitive detector of associated alpha-particles, and several scintillator-based gamma-ray detectors. Explosives and other hazardous chemicals are detected by analyzing secondary high-energy gamma-rays from reactions of fast neutrons with materials inside a container. The same gamma-ray detectors are used to detect unshielded radioactive and nuclear materials. An array of several neutron detectors is used to detect fast neutrons from induced fission of nuclear materials. Coincidence and timing analysis allows one to discriminate between fission neutrons and scattered probing neutrons. Mathematical modeling with the MCNP5 and MCNP-PoliMi codes was used to estimate the sensitivity of the device and its optimal configuration. A comparison of the features of three gamma-ray detector types, based on BGO, NaI and LaBr3 crystals, is presented.
PHYTOREMEDIATION OF CONTAMINATED SOIL AND GROUND WATER AT HAZARDOUS WASTE SITES
The purpose of this issue paper is to provide a concise discussion of the processes associated with the use of phytoremediation as a cleanup or containment technique for remediation of hazardous waste sites. Introductory material on plant processes is provided. The different fo...
Risk and Risk Assessment in Environmental Education.
ERIC Educational Resources Information Center
Chiras, Daniel D.
1982-01-01
Risk assessment (the identification of hazards, the determination of the probability of a hazardous event occurring, and an estimation of the severity of such an event's occurrence) is suggested as a technique to be used to analyze current issues in environmental education in an objective manner. (PEB)
SEPARATION OF HAZARDOUS ORGANICS BY LOW PRESSURE REVERSE OSMOSIS MEMBRANES - PHASE II FINAL REPORT
Extensive experimental studies showed that thin-film, composite membranes can be used effectively for the separation of selected hazardous organic compounds. This waste treatment technique offers definite advantages in terms of high solute separations at low pressures (<2MPa) and...
NASA Astrophysics Data System (ADS)
Meyer, F. J.; Webley, P.; Dehn, J.; Arko, S. A.; McAlpin, D. B.
2013-12-01
Volcanic eruptions are among the most significant hazards to human society, capable of triggering natural disasters on regional to global scales. In the last decade, remote sensing techniques have become established in operational forecasting, monitoring, and managing of volcanic hazards. Monitoring organizations, like the Alaska Volcano Observatory (AVO), now rely heavily on remote sensing data from a variety of optical and thermal sensors to provide time-critical hazard information. Despite the high utilization of these remote sensing data to detect and monitor volcanic eruptions, the presence of clouds and a dependence on solar illumination often limit their impact on decision making processes. Synthetic Aperture Radar (SAR) systems are widely believed to be superior to optical sensors in operational monitoring situations, due to the weather and illumination independence of their observations and the sensitivity of SAR to surface changes and deformation. Despite these benefits, the contributions of SAR to operational volcano monitoring have been limited in the past due to (1) high SAR data costs, (2) traditionally long data processing times, and (3) the low temporal sampling frequencies inherent to most SAR systems. In this study, we present improved data access, data processing, and data integration techniques that mitigate some of the above mentioned limitations and allow, for the first time, a meaningful integration of SAR into operational volcano monitoring systems. We will introduce a new database interface that was developed in cooperation with the Alaska Satellite Facility (ASF) and allows for rapid and seamless access to all of ASF's SAR data holdings. We will also present processing techniques that improve the temporal frequency with which hazard-related products can be produced. These techniques take advantage of modern signal processing technology as well as new radiometric normalization schemes, both enabling the combination of multiple observation geometries in change detection procedures. Additionally, it will be shown how SAR-based hazard information can be integrated with data from optical satellites, thermal sensors, webcams and models to create near-real time volcano hazard information. We will introduce a prototype monitoring system that integrates SAR-based hazard information into the near real-time volcano hazard monitoring system of the Alaska Volcano Observatory. This prototype system was applied to historic eruptions of the volcanoes Okmok and Augustine, both located in the North Pacific. We will show that for these historic eruptions, the addition of SAR data led to a significant improvement in activity detection and eruption monitoring, and improved the accuracy and timeliness of eruption alerts.
A Bernoulli Formulation of the Land-Use Portfolio Model
Champion, Richard A.
2008-01-01
Decision making for natural-hazards mitigation can be sketched as knowledge available in advance (a priori), knowledge available later (a posteriori), and how consequences of the mitigation decision might be viewed once future outcomes are known. Two outcomes - mitigating for a hazard event that will occur, and not mitigating for a hazard event that will not occur - can be considered narrowly correct. Two alternative outcomes - mitigating for a hazard event that will not occur, and not mitigating for a hazard event that will occur - can be considered narrowly incorrect. The dilemma facing the decision maker is that mitigation choices must be made before the event, and often must be made with imperfect statistical techniques and imperfect data.
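The dilemma can be made concrete with a simple expected-loss comparison under an assumed event probability; all figures below are hypothetical, and the Land-Use Portfolio Model itself is richer than this sketch.

```python
# Minimal expected-loss sketch of the mitigation dilemma: the decision must be
# made before the event, using an (imperfect) estimate of the event probability.
# All numbers are hypothetical and serve only to illustrate the comparison.
p_event = 0.10          # estimated annual probability of the hazard event
loss_unmitigated = 5.0  # loss (in $M) if the event occurs with no mitigation
loss_mitigated = 1.0    # residual loss if the event occurs after mitigation
mitigation_cost = 0.3   # up-front cost of mitigating

expected_no_action = p_event * loss_unmitigated
expected_mitigate = mitigation_cost + p_event * loss_mitigated

print(f"expected cost, no mitigation: {expected_no_action:.2f} $M")
print(f"expected cost, mitigation:    {expected_mitigate:.2f} $M")
print("mitigate" if expected_mitigate < expected_no_action else "do not mitigate")
```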
Multi-criteria analysis for the detection of the most critical European UNESCO Heritage sites
NASA Astrophysics Data System (ADS)
Valagussa, Andrea; Frattini, Paolo; Berta, Nadia; Spizzichino, Daniele; Leoni, Gabriele; Margottini, Claudio; Battista Crosta, Giovanni
2017-04-01
A GIS-based multi-criteria analysis has been implemented to identify and rank the most critical UNESCO Heritage sites at the European scale in the context of the PROTHEGO JPI project. Two multi-criteria methods have been tested and applied to more than 300 European UNESCO sites. First, the Analytic Hierarchy Process (AHP) was applied to the data of the UNESCO Periodic Reports, in relation to 13 natural hazards that have affected or can potentially affect the Heritage sites. According to these reports, 22% of sites are without any documented hazard and 70% of the sites have at least one hazard affecting the site. The most important hazards at the European scale are: fire (wildfire), storm, flooding, earthquake and erosion. For each UNESCO site, the potential risk was calculated as a weighted sum of the hazards that affect the site. The weights of the 13 hazards were obtained by the AHP procedure, which is a technique for multi-attribute decision making that enables the decomposition of a problem into a hierarchy, based on the opinions of different experts about the dominance of risks. The weights are obtained by rescaling between 0 and 1 the eigenvector relative to the maximum eigenvalue of the matrix of coefficients. The internal coherence of the experts' attributions is assessed through the calculation of the consistency ratio (Saaty, 1990). The result of the AHP method is a map of the UNESCO sites ranked according to potential risk, in which the site most at risk is the Geirangerfjord and Nærøyfjord in Norway. However, the quality of these results depends on the reliability of the Periodic Reports, which are produced by different experts with unknown levels of scientific background. To test the reliability of these results, a comparison of the information in the Periodic Reports with available high-quality datasets (earthquake, volcano and landslide) at the Italian scale has been performed. Sites properly classified by the Periodic Reports range from 65% (earthquake hazard) to 98% (volcano hazard), with a strong underestimation of landslide hazard. Owing to this high level of uncertainty, we developed a new methodology to identify and rank the most critical UNESCO Heritage sites on the basis of three natural hazards (landslide, earthquake, and volcano) for which reliable European-scale hazard maps are available. For each UNESCO site, a potential risk was calculated as the product of hazard (from the available maps) and potential vulnerability. The latter is obtained considering the typology of the site (e.g. monument, cultural landscape, cultural road), the presence or absence of residents and/or tourists, and the position of the site (underground/over-ground). Through this methodology, a new ranking of the European UNESCO sites has been obtained. In this ranking, the historic centre of Naples is the site most at risk on the European continent.
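The AHP weighting and consistency check described above can be sketched as follows for a reduced, hypothetical pairwise comparison matrix; the 13-hazard matrix and the experts' judgements from the study are not reproduced.

```python
import numpy as np

# Pairwise comparison matrix for three illustrative hazards on Saaty's 1-9 scale:
# A[i, j] = how much more important hazard i is judged to be than hazard j.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()                      # rescale the principal eigenvector to sum to 1

# Consistency ratio (Saaty): CI = (lambda_max - n) / (n - 1); RI for n = 3 is 0.58.
n = A.shape[0]
lambda_max = eigvals.real[k]
ci = (lambda_max - n) / (n - 1)
cr = ci / 0.58
print("weights:", np.round(weights, 3))
print("consistency ratio:", round(cr, 3))     # < 0.10 is conventionally acceptable
```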
NASA Technical Reports Server (NTRS)
1972-01-01
System studies, equipment simulation, hardware development and flight tests which were conducted during the development of aircraft collision hazard warning system are discussed. The system uses a cooperative, continuous wave Doppler radar principle with pseudo-random frequency modulation. The report presents a description of the system operation and deals at length with the use of pseudo-random coding techniques. In addition, the use of mathematical modeling and computer simulation to determine the alarm statistics and system saturation characteristics in terminal area traffic of variable density is discussed.
NASA Astrophysics Data System (ADS)
Chen, Wei; Pourghasemi, Hamid Reza; Panahi, Mahdi; Kornejady, Aiding; Wang, Jiale; Xie, Xiaoshen; Cao, Shubo
2017-11-01
The spatial prediction of landslide susceptibility is an important prerequisite for the analysis of landslide hazards and risks in any area. This research uses three data mining techniques, namely an adaptive neuro-fuzzy inference system combined with frequency ratio (ANFIS-FR), a generalized additive model (GAM), and a support vector machine (SVM), for landslide susceptibility mapping in Hanyuan County, China. In the first step, in accordance with a review of the previous literature, twelve conditioning factors, including slope aspect, altitude, slope angle, topographic wetness index (TWI), plan curvature, profile curvature, distance to rivers, distance to faults, distance to roads, land use, normalized difference vegetation index (NDVI), and lithology, were selected. In the second step, a collinearity test and correlation analysis between the conditioning factors and landslides were applied. In the third step, we used the three methods, ANFIS-FR, GAM, and SVM, for landslide susceptibility modeling. Subsequently, their predictive accuracy was validated using a receiver operating characteristic curve. The results showed that all three models have good prediction capabilities, while the SVM model has the highest prediction rate of 0.875, followed by the ANFIS-FR and GAM models with prediction rates of 0.851 and 0.846, respectively. Thus, the landslide susceptibility maps produced for the study area can be applied to the management of hazards and risks in landslide-prone Hanyuan County.
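A minimal sketch of the SVM-based susceptibility modelling and ROC validation step, using scikit-learn on synthetic conditioning factors; the factors, inventory, and kernel settings are assumptions for illustration and do not reproduce the Hanyuan County data.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(11)

# Synthetic conditioning factors for 1000 mapping units: slope angle, distance
# to rivers, NDVI (stand-ins for the twelve factors used in the study).
n = 1000
X = np.column_stack([rng.uniform(0, 45, n),      # slope angle (deg)
                     rng.uniform(0, 2000, n),    # distance to rivers (m)
                     rng.uniform(0, 1, n)])      # NDVI
# Synthetic inventory: steeper, wetter, sparsely vegetated units fail more often.
logit = 0.12 * X[:, 0] - 0.002 * X[:, 1] - 3.0 * X[:, 2]
y = (rng.uniform(size=n) < 1 / (1 + np.exp(-(logit - 1.0)))).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
model = make_pipeline(StandardScaler(), SVC(kernel="rbf", probability=True, random_state=0))
model.fit(X_tr, y_tr)

# Susceptibility = predicted probability of the landslide class; validate with ROC AUC.
susceptibility = model.predict_proba(X_te)[:, 1]
print("ROC AUC:", round(roc_auc_score(y_te, susceptibility), 3))
```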
Rapid detection of transition metals in welding fumes using paper-based analytical devices.
Cate, David M; Nanthasurasak, Pavisara; Riwkulkajorn, Pornpak; L'Orange, Christian; Henry, Charles S; Volckens, John
2014-05-01
Metals in particulate matter (PM) are considered a driving factor for many pathologies. Despite the hazards associated with particulate metals, personal exposures for at-risk workers are rarely assessed due to the cost and effort associated with monitoring. As a result, routine exposure assessments are performed for only a small fraction of the exposed workforce. The objective of this research was to evaluate a relatively new technology, microfluidic paper-based analytical devices (µPADs), for measuring the metals content in welding fumes. Fumes from three common welding techniques (shielded metal arc, metal inert gas, and tungsten inert gas welding) were sampled in two welding shops. Concentrations of acid-extractable Fe, Cu, Ni, and Cr were measured and independently verified using inductively coupled plasma-optical emission spectroscopy (ICP-OES). Results from the µPAD sensors agreed well with ICP-OES analysis; the two methods gave statistically similar results in >80% of the samples analyzed. Analytical costs for the µPAD technique were ~50 times lower than market-rate costs with ICP-OES. Further, the µPAD method was capable of providing same-day results (as opposed to several weeks for ICP laboratory analysis). Results of this work suggest that µPAD sensors are a viable, yet inexpensive alternative to traditional analytic methods for transition metals in welding fume PM. These sensors have potential to enable substantially higher levels of hazard surveillance for a given resource cost, especially in resource-limited environments.
NASA Astrophysics Data System (ADS)
Hua, H.; Owen, S. E.; Yun, S.; Lundgren, P.; Fielding, E. J.; Agram, P.; Manipon, G.; Stough, T. M.; Simons, M.; Rosen, P. A.; Wilson, B. D.; Poland, M. P.; Cervelli, P. F.; Cruz, J.
2013-12-01
Space-based geodetic measurement techniques such as Interferometric Synthetic Aperture Radar (InSAR) and Continuous Global Positioning System (CGPS) are now important elements in our toolset for monitoring earthquake-generating faults, volcanic eruptions, hurricane damage, landslides, reservoir subsidence, and other natural and man-made hazards. Geodetic imaging's unique ability to capture surface deformation with high spatial and temporal resolution has revolutionized both earthquake science and volcanology. Continuous monitoring of surface deformation and surface change before, during, and after natural hazards improves decision-making from better forecasts, increased situational awareness, and more informed recovery. However, analyses of InSAR and GPS data sets are currently handcrafted following events and are not generated rapidly and reliably enough for use in operational response to natural disasters. Additionally, the sheer data volumes needed to handle a continuous stream of InSAR data sets also presents a bottleneck. It has been estimated that continuous processing of InSAR coverage of California alone over 3-years would reach PB-scale data volumes. Our Advanced Rapid Imaging and Analysis for Monitoring Hazards (ARIA-MH) science data system enables both science and decision-making communities to monitor areas of interest with derived geodetic data products via seamless data preparation, processing, discovery, and access. We will present our findings on the use of hybrid-cloud computing to improve the timely processing and delivery of geodetic data products, integrating event notifications from USGS to improve the timely processing for response, as well as providing browse results for quick looks with other tools for integrative analysis.
A framework for the probabilistic analysis of meteotsunamis
Geist, Eric L.; ten Brink, Uri S.; Gove, Matthew D.
2014-01-01
A probabilistic technique is developed to assess the hazard from meteotsunamis. Meteotsunamis are unusual sea-level events, generated when the speed of an atmospheric pressure or wind disturbance is comparable to the phase speed of long waves in the ocean. A general aggregation equation is proposed for the probabilistic analysis, based on previous frameworks established for both tsunamis and storm surges, incorporating different sources and source parameters of meteotsunamis. Parameterization of atmospheric disturbances and numerical modeling is performed for the computation of maximum meteotsunami wave amplitudes near the coast. A historical record of pressure disturbances is used to establish a continuous analytic distribution of each parameter as well as the overall Poisson rate of occurrence. A demonstration study is presented for the northeast U.S. in which only isolated atmospheric pressure disturbances from squall lines and derechos are considered. For this study, Automated Surface Observing System stations are used to determine the historical parameters of squall lines from 2000 to 2013. The probabilistic equations are implemented using a Monte Carlo scheme, where a synthetic catalog of squall lines is compiled by sampling the parameter distributions. For each entry in the catalog, ocean wave amplitudes are computed using a numerical hydrodynamic model. Aggregation of the results from the Monte Carlo scheme results in a meteotsunami hazard curve that plots the annualized rate of exceedance with respect to maximum event amplitude for a particular location along the coast. Results from using multiple synthetic catalogs, resampled from the parent parameter distributions, yield mean and quantile hazard curves. Further refinements and improvements for probabilistic analysis of meteotsunamis are discussed.
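The Monte Carlo aggregation into a hazard curve can be sketched as follows; the Poisson rate, parameter distribution, and the toy amplitude response below stand in for the historical squall-line catalog and the numerical hydrodynamic model.

```python
import numpy as np

rng = np.random.default_rng(2024)

# Monte Carlo sketch of a hazard curve: annualized rate of exceedance versus
# maximum wave amplitude. All numbers below are placeholders for illustration.
rate_per_year = 4.0                         # Poisson rate of qualifying pressure disturbances
n_years = 10_000                            # length of the synthetic catalog
n_events = rng.poisson(rate_per_year * n_years)

# Sample a disturbance parameter (e.g. pressure jump, hPa) and map it to a
# maximum nearshore amplitude with a toy response plus lognormal scatter.
pressure_jump = rng.gamma(shape=2.0, scale=1.5, size=n_events)
amplitude = 0.05 * pressure_jump * rng.lognormal(0.0, 0.4, size=n_events)

# Aggregate into an exceedance (hazard) curve.
levels = np.linspace(0.05, 1.0, 20)         # amplitude thresholds (m)
annual_rate = [(amplitude > a).sum() / n_years for a in levels]
for a, r in zip(levels[::5], annual_rate[::5]):
    print(f"amplitude > {a:.2f} m: {r:.4f} events/yr")
```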
Addressee Errors in ATC Communications: The Call Sign Problem
NASA Technical Reports Server (NTRS)
Monan, W. P.
1983-01-01
Communication errors involving aircraft call signs were portrayed in reports of 462 hazardous incidents voluntarily submitted to the ASRS during an approximate four-year period. These errors resulted in confusion, disorder, and uncoordinated traffic conditions and produced the following types of operational anomalies: altitude deviations, wrong-way headings, aborted takeoffs, go arounds, runway incursions, missed crossing altitude restrictions, descents toward high terrain, and traffic conflicts in flight and on the ground. Analysis of the report set resulted in identification of five categories of errors involving call signs: (1) faulty radio usage techniques, (2) call sign loss or smearing due to frequency congestion, (3) confusion resulting from similar sounding call signs, (4) airmen misses of call signs leading to failures to acknowledge or readback, and (5) controller failures regarding confirmation of acknowledgements or readbacks. These error categories are described in detail and several associated hazard mitigating measures that might be taken are considered.
Coulon, Frédéric; Al Awadi, Mohammed; Cowie, William; Mardlin, David; Pollard, Simon; Cunningham, Colin; Risdon, Graeme; Arthur, Paul; Semple, Kirk T; Paton, Graeme I
2010-10-01
A six-month field scale study was carried out to compare windrow turning and biopile techniques for the remediation of soil contaminated with bunker C fuel oil. End-point clean-up targets were defined by human risk assessment and ecotoxicological hazard assessment approaches. Replicate windrows and biopiles were amended with either nutrients and inocula, nutrients alone, or no amendment. In addition to fractionated hydrocarbon analysis, culturable microbial characterisation and soil ecotoxicological assays were performed. This particular soil, heavy in texture and historically contaminated with bunker fuel, was more effectively remediated by windrowing, but coarser textures may be more amenable to biopiling. This trial reveals the benefit of developing risk- and hazard-based approaches for defining the end point of bioremediation of heavy hydrocarbons when engineered biopiles or windrows are proposed as treatment options. Copyright (c) 2010 Elsevier Ltd. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Babendreier, Justin E.; Castleton, Karl J.
2005-08-01
Elucidating uncertainty and sensitivity structures in environmental models can be a difficult task, even for low-order, single-medium constructs driven by a unique set of site-specific data. Quantitative assessment of integrated, multimedia models that simulate hundreds of sites, spanning multiple geographical and ecological regions, will ultimately require a comparative approach using several techniques, coupled with sufficient computational power. The Framework for Risk Analysis in Multimedia Environmental Systems - Multimedia, Multipathway, and Multireceptor Risk Assessment (FRAMES-3MRA) is an important software model being developed by the United States Environmental Protection Agency for use in risk assessment of hazardous waste management facilities. The 3MRA modeling system includes a set of 17 science modules that collectively simulate release, fate and transport, exposure, and risk associated with hazardous contaminants disposed of in land-based waste management units (WMUs).
[When and where motorcyclists have accidents and die in Belo Horizonte, Minas Gerais State, Brazil].
Diniz, Eugênio Paceli Hatem; Pinheiro, Letícia Cavalari; Proietti, Fernando Augusto
2015-12-01
The objective of this study was to analyze traffic accidents involving motorcycles in Belo Horizonte, Minas Gerais State, Brazil, from 2007 to 2011 and to identify clusters of high-risk and hazardous intersections in and around the city. Data were provided by the Military Police Brigade and the Emergency Medical Service (SAMU). Accident severity rates were used to identify critical intersections. Two techniques were used: kernel analysis and scan statistics (continuous Poisson model). High-risk clusters were located in the downtown area and on major thoroughfares. Surprisingly, the highest risk of accidents and death occurred not at intersections, but between them. Hazardous intersections are part of routes used to access regions around Greater Metropolitan Belo Horizonte. Two distinct trends in mortality rates and accidents were identified. Most motorcycle deaths occurred after 7:00 PM. The study concludes that there is an urgent need to improve motorcycle and public transportation routes.
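The kernel-density step of such a hotspot analysis might be sketched as below; the point pattern, bandwidth, and hotspot threshold are invented for illustration (the scan-statistic step is not shown).

```python
import numpy as np
from sklearn.neighbors import KernelDensity

rng = np.random.default_rng(9)

# Illustrative accident locations (projected x/y in metres): a diffuse background
# plus a dense downtown-like cluster; real inputs would be geocoded crash records.
background = rng.uniform(0, 10_000, size=(400, 2))
cluster = rng.normal(loc=[5_000, 5_000], scale=250, size=(150, 2))
points = np.vstack([background, cluster])

# Kernel density estimation over a grid; the bandwidth is an assumed value.
kde = KernelDensity(kernel="gaussian", bandwidth=400.0).fit(points)
gx, gy = np.meshgrid(np.linspace(0, 10_000, 50), np.linspace(0, 10_000, 50))
grid = np.column_stack([gx.ravel(), gy.ravel()])
density = np.exp(kde.score_samples(grid)).reshape(gx.shape)

# Flag the top 5% of grid cells as candidate high-risk clusters.
hotspots = density > np.percentile(density, 95)
print("candidate hotspot cells:", int(hotspots.sum()))
```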
New Techniques to Evaluate the Incendiary Behavior of Insulators
NASA Technical Reports Server (NTRS)
Buhler, Charles; Calle, Carlos; Clements, Sid; Trigwell, Steve; Ritz, Mindy
2008-01-01
New techniques for evaluating the incendiary behavior of insulators are presented. The onset of incendive brush discharges in air is evaluated using standard spark probe techniques for the case simulating approaches of an electrically grounded sphere to a charged insulator in the presence of a flammable atmosphere. However, this standard technique is unsuitable for the case of brush discharges that may occur during the charging-separation process for two insulator materials. We present experimental techniques to evaluate this hazard in the presence of a flammable atmosphere, ideally suited to measuring the incendiary nature of micro-discharges upon separation, a measurement never before performed. Other measurement techniques unique to this study include: surface potential measurements of insulators before, during and after contact and separation, as well as methods to verify fieldmeter calibrations using a charged insulator surface as opposed to standard high-voltage plates. Key words: Kapton polyimide film, incendiary discharges, brush discharges, contact and frictional electrification, ignition hazards, insulators, contact angle, surface potential measurements.
The SAMCO Web-platform for resilience assessment in mountainous valleys impacted by landslide risks.
NASA Astrophysics Data System (ADS)
Grandjean, Gilles; Thomas, Loic; Bernardie, Severine
2016-04-01
The ANR-SAMCO project aims to develop a proactive resilience framework enhancing the overall resilience of societies to the impacts of mountain risks. The project elaborates methodological tools to characterize and measure ecosystem and societal resilience from an operative perspective on three representative mountain case studies. To achieve this objective, the methodology is split into several parts: (1) the definition of the potential impacts of global environmental changes (climate system, ecosystem e.g. land use, socio-economic system) on landslide hazards, (2) the analysis of these consequences in terms of vulnerability (e.g. changes in the location and characteristics of the impacted areas and the level of their perturbation), and (3) the implementation of a methodology for quantitatively investigating and mapping indicators of mountain slope vulnerability exposed to several hazard types, and the development of a GIS-based demonstration platform available on the web. The strength and originality of the SAMCO project lie in the combination of different techniques, methodologies and models (multi-hazard assessment, risk evolution in time, vulnerability functional analysis, and governance strategies) that are implemented in a user-oriented web platform, currently in development. We present the first results of this development task: the architecture and functions of the web tools, and the case-study database showing the multi-hazard maps and the stakes at risk. Risk assessments over several areas of interest in Alpine and Pyrenean valleys are still in progress, but the first analyses are presented for current and future periods, for which climate change and land-use (economic, geographical and social aspects) scenarios are taken into account. This tool, dedicated to stakeholders, should ultimately be used to evaluate the resilience of mountainous regions, since multiple scenarios can be tested and compared.
On the global geodetic observing system: Africa's preparedness and challenges
NASA Astrophysics Data System (ADS)
Botai, O. J.; Combrinck, Ludwig; Rautenbach, C. J. Hannes
2013-02-01
Space geodetic techniques and satellite missions play a crucial role in the determination and monitoring of geo-kinematics, Earth's rotation and gravity fields. These three pillars of geodesy provide the basis for determining the geodetic reference frames with high accuracy, spatial resolution and temporal stability. Space geodetic techniques have been used for the assessment of geo-hazards and anthropogenic hazards, and in the design of early warning systems for hazards and disasters. In general, space geodesy provides products for Earth observation and science, and influences many activities (e.g., building and management) in a modern society. In order to further promote the application of space geodetic methods to solving Earth science problems, the Global Geodetic Observing System (GGOS) of the International Association of Geodesy (IAG) was commissioned as an important geodetic infrastructure that integrates different geodetic techniques (such as Global Navigation Satellite Systems, Very Long Baseline Interferometry, Satellite Laser Ranging, Interferometric Synthetic Aperture Radar and Doppler Orbitography and Radio-positioning Integrated by Satellite), models and analysis techniques for the purpose of ensuring long-term, precise monitoring of geodetic observables vital for monitoring Earth system processes. Since its inception, considerable progress has been made towards setting up the infrastructure necessary for the establishment of the GGOS database. While the challenges that beleaguer the GGOS are acknowledged (at least at the global level), an assessment of an attuned GGOS infrastructure in the African context is necessary, yet lacking. In the present contribution, (a) the African preparedness and response to the observing system is assessed, and (b) the specific scientific and technological challenges of establishing a regional GGOS hub for Africa are reviewed. Currently only South Africa has a fundamental geodetic observatory, located at Hartebeesthoek, Pretoria. Other countries in Africa have shown interest in participating in global geodetic activities, in particular through interest in the development of a unified African geodetic reference frame (AFREF). Interest has also been shown in the proposed African VLBI Network (AVN), which will be partially based on existing ex-telecommunication radio antennas. Several countries are investigating their participation in the AVN, including Kenya, Nigeria and Ghana.
Using Qualitative Hazard Analysis to Guide Quantitative Safety Analysis
NASA Technical Reports Server (NTRS)
Shortle, J. F.; Allocco, M.
2005-01-01
Quantitative methods can be beneficial in many types of safety investigations. However, there are many difficulties in using quantitative methods. For example, there may be little relevant data available. This paper proposes a framework for using qualitative hazard analysis to prioritize the hazard scenarios most suitable for quantitative analysis. The framework first categorizes hazard scenarios by severity and likelihood. We then propose another metric, "modeling difficulty," that describes the complexity of modeling a given hazard scenario quantitatively. The combined metrics of severity, likelihood, and modeling difficulty help to prioritize hazard scenarios for which quantitative analysis should be applied. We have applied this methodology to proposed concepts of operations for reduced wake separation for airplane operations at closely spaced parallel runways.
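The severity/likelihood/modeling-difficulty prioritization can be illustrated with a small sketch; the scenario names, scores, and the combining rule below are hypothetical and are not taken from the paper.

```python
# Illustrative ranking of hazard scenarios by severity, likelihood and modeling
# difficulty; all scores and the scoring rule are hypothetical assumptions.
scenarios = [
    # (name, severity 1-5, likelihood 1-5, modeling difficulty 1-5)
    ("wake encounter on parallel approach", 5, 2, 2),
    ("runway incursion", 4, 3, 4),
    ("missed crossing restriction", 2, 4, 1),
]

def priority(severity, likelihood, difficulty):
    # Favour scenarios that are severe, likely, and tractable to model quantitatively
    return severity * likelihood / difficulty

for name, s, l, d in sorted(scenarios, key=lambda r: -priority(*r[1:])):
    print(f"{name}: priority = {priority(s, l, d):.1f}")
```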
Some Open Issues on Rockfall Hazard Analysis in Fractured Rock Mass: Problems and Prospects
NASA Astrophysics Data System (ADS)
Ferrero, Anna Maria; Migliazza, Maria Rita; Pirulli, Marina; Umili, Gessica
2016-09-01
Risk is part of every sector of engineering design. It is a consequence of the uncertainties connected with cognitive boundaries and with the natural variability of the relevant variables. In soil and rock engineering, in particular, uncertainties are linked to geometrical and mechanical aspects and to the model used for problem schematization. While the uncertainties due to cognitive gaps could be filled by improving the quality of numerical codes and measuring instruments, nothing can be done to remove the randomness of natural variables, except defining their variability with stochastic approaches. Probabilistic analyses represent a useful tool to run parametric analyses and to identify the more significant aspects of a given phenomenon: they can be used for a rational quantification and mitigation of risk. The connection between the cognitive level and the probability of failure is at the base of the determination of hazard, which is often quantified through the assignment of safety factors. But these factors suffer from conceptual limits, which can only be overcome by adopting mathematical techniques with sound bases, not widely used up to now (Einstein et al. in Rock Mechanics in Civil and Environmental Engineering, CRC Press, London, 3-13, 2010; Brown in J Rock Mech Geotech Eng 4(3):193-204, 2012). The present paper describes the problems and the more reliable techniques used to quantify the uncertainties that characterize the large number of parameters involved in rock slope hazard assessment, through a real case specifically related to rockfall. Limits of the existing approaches and future developments of the research are also provided.
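As a minimal illustration of treating variable parameters stochastically rather than through a single safety factor, the sketch below runs a Monte Carlo check on a simple sliding-block model; the distributions, geometry, and values are hypothetical and far simpler than a real rockfall analysis.

```python
# Minimal Monte Carlo sketch of a probabilistic stability check for a sliding block,
# treating friction angle and cohesion as random variables (all values hypothetical).
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
phi = np.radians(rng.normal(35.0, 4.0, n))      # friction angle [deg -> rad]
cohesion = rng.lognormal(np.log(20.0), 0.3, n)  # cohesion [kPa]
slope = np.radians(55.0)                        # joint dip
weight, area = 500.0, 10.0                      # block weight [kN], sliding area [m^2]

driving = weight * np.sin(slope)
resisting = cohesion * area + weight * np.cos(slope) * np.tan(phi)
fs = resisting / driving

# Probability of failure instead of a single deterministic factor of safety
print("P(FS < 1) =", np.mean(fs < 1.0))
```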
An international workshop was held in 2006 to evaluate experimental techniques for hazard identification and hazard characterization of sensitizing agents in terms of their ability to produce data, including dose–response information, to inform risk assessment. Human testing to i...
Training Medical Students about Hazardous Drinking Using Simple Assessment Techniques
ERIC Educational Resources Information Center
Hidalgo, Jesús López-Torres; Pretel, Fernando Andrés; Bravo, Beatriz Navarro; Rabadan, Francisco Escobar; Serrano Selva, Juan Pedro; Latorre Postigo, Jose Miguel; Martínez, Ignacio Párraga
2014-01-01
Objective: To examine the ability of medical students to identify hazardous drinkers using screening tools recommended in clinical practice. Design: Observational cross-sectional study. Setting: Faculty of Medicine of Castilla-La Mancha, Spain. Method: The medical students learnt to use Alcohol Use Disorders Identification Test (AUDIT) and…
PHYTOREMEDIATION OF CONTAMINATED SOIL AND GROUND WATER AT HAZARDOUS WASTE SITES (EPA/540/S-01/500)
The purpose of this issue paper is to provide a concise discussion of the processes associated with the use of phytoremediation as a cleanup or containment technique for remediation of hazardous waste sites. Introductory material on plant processes is provided. The different form...
Rocky Mountain Research Station USDA Forest Service
2004-01-01
Effective public education and communication campaigns about wildland fire and fuels management should have clear objectives, and use the right techniques to achieve these objectives. This fact sheet lists seven important considerations for planning or implementing a hazard communication effort.
RESTORING HAZARDOUS SPILL-DAMAGED AREAS: TECHNIQUE IDENTIFICATION/ASSESSMENT
The goal of this study was to identify and assess methods that could be used to accelerate the restoration of lands damaged by spills of hazardous materials. The literature was reviewed to determine what response methods had been used in the past to clean up spills on land and id...
The Hazards Posed by the Global Development of Nuclear Power
ERIC Educational Resources Information Center
O'Reilly, S. A.
1976-01-01
Outlines the growth in the demand for energy on a world-wide basis. Reviews the development of nuclear power and points out the many hazards in the nuclear fuel cycle. Describes the nature of nuclear wastes and explains the quantities involved and the current techniques for waste disposal. (GS)
New efficient optimizing techniques for Kalman filters and numerical weather prediction models
NASA Astrophysics Data System (ADS)
Famelis, Ioannis; Galanis, George; Liakatas, Aristotelis
2016-06-01
The need for accurate local environmental predictions and simulations beyond classical meteorological forecasts has been increasing in recent years due to the great number of applications that are directly or indirectly affected: renewable energy resource assessment, natural hazards early warning systems, global warming and questions on climate change can be listed among them. Within this framework, the use of numerical weather and wave prediction systems in conjunction with advanced statistical techniques that support the elimination of model bias and the reduction of error variability may successfully address the above issues. In the present work, new optimization methods are studied and tested in selected areas of Greece where the use of renewable energy sources is of critical importance. The added value of the proposed work lies in the solid mathematical background adopted, making use of information geometry and statistical techniques, new versions of Kalman filters, and state-of-the-art numerical analysis tools.
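A minimal sketch of the kind of Kalman-filter bias correction referred to above is given below; it assumes a scalar random-walk bias model, and the noise variances and forecast/observation values are hypothetical tuning inputs, not those used in the study.

```python
# Rough sketch of a scalar Kalman filter that tracks the systematic bias of a
# forecast series from observed forecast-minus-measurement errors.
import numpy as np

def kalman_bias(forecasts, observations, q=0.01, r=1.0):
    bias, p = 0.0, 1.0                         # initial bias estimate and its variance
    corrected = []
    for f, y in zip(forecasts, observations):
        corrected.append(f - bias)             # apply the current bias estimate
        p += q                                 # prediction step (random-walk bias model)
        k = p / (p + r)                        # Kalman gain
        bias += k * ((f - y) - bias)           # update with the latest observed error
        p *= (1.0 - k)
    return np.array(corrected)

fc = np.array([10.2, 11.0, 9.8, 12.5, 11.1])   # raw model forecasts (hypothetical)
obs = np.array([9.0, 10.1, 8.9, 11.2, 10.0])   # matching measurements (hypothetical)
print(kalman_bias(fc, obs))
```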
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pope, D.L.; Wagner, J.B.
1988-09-01
Before siting oil and gas platforms on the sea floor as artificial reefs offshore Louisiana, potentially hazardous and unstable geologic conditions must be identified and evaluated to assess their possible impacts on platform stability. Geologic and man-made features can be identified and assessed from high-resolution geophysical techniques (3.5-7.0 kHz echograms, single-channel seismic, and side-scan sonar). Such features include faults, diapirs, scarps, channels, gas seeps, irregular sea floor topography, mass wasting deposits (slumps, slides, and debris flows), pipelines, and other subsea marine equipment. Geotechnical techniques are utilized to determine lithologic and physical properties of the sediments for correlation with the geophysical data. These techniques are used to develop a series of geologic maps, cross sections, and pipeline and platform-location maps. Construction of echo-character maps from 3.5-kHz data provides an analysis of near-bottom sedimentation processes (turbidity currents and debris flows).
Supplemental Hazard Analysis and Risk Assessment - Hydrotreater
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lowry, Peter P.; Wagner, Katie A.
A supplemental hazard analysis was conducted and quantitative risk assessment performed in response to an independent review comment received by the Pacific Northwest National Laboratory (PNNL) from the U.S. Department of Energy Pacific Northwest Field Office (PNSO) against the Hydrotreater/Distillation Column Hazard Analysis Report issued in April 2013. The supplemental analysis used the hazardous conditions documented by the previous April 2013 report as a basis. The conditions were screened and grouped for the purpose of identifying whether additional prudent, practical hazard controls could be identified, using a quantitative risk evaluation to assess the adequacy of the controls and establish a lower level of concern for the likelihood of potential serious accidents. Calculations were performed to support conclusions where necessary.
Tracking Hazard Analysis Data in a Jungle of Changing Design
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sullivan, Robin S.; Young, Jonathan
2006-05-16
Tracking hazard analysis data during the 'life cycle' of a project can be an extremely complicated task. However, a few simple rules, used consistently, can give you the edge that will save countless headaches and provide the information that will help integrate the hazard analysis and design activities even if performed in parallel.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-02-24
... 0584-AD65 School Food Safety Program Based on Hazard Analysis and Critical Control Point Principles... Safety Program Based on Hazard Analysis and Critical Control Point Principles (HACCP) was published on... of Management and Budget (OMB) cleared the associated information collection requirements (ICR) on...
An extended stochastic method for seismic hazard estimation
NASA Astrophysics Data System (ADS)
Abd el-aal, A. K.; El-Eraki, M. A.; Mostafa, S. I.
2015-12-01
In this contribution, we developed an extended stochastic technique for seismic hazard assessment purposes. The technique builds on the stochastic method of Boore (2003, "Simulation of ground motion using the stochastic method," Appl. Geophys. 160:635-676). The essential aim of the extended stochastic technique is to obtain and simulate ground motion in order to minimize future earthquake consequences. The first step of this technique is defining the seismic sources that most affect the study area. Then, the maximum expected magnitude is defined for each of these seismic sources. This is followed by estimating the ground motion using an empirical attenuation relationship. Finally, site amplification is implemented in calculating the peak ground acceleration (PGA) at each site of interest. We tested and applied this technique at Cairo, Suez, Port Said, Ismailia, Zagazig and Damietta cities to predict the ground motion. It was also applied at Cairo, Zagazig and Damietta cities to estimate the maximum peak ground acceleration at actual soil conditions. In addition, 0.5, 1, 5, 10 and 20 % damping median response spectra are estimated using the extended stochastic simulation technique. The highest calculated acceleration values at bedrock conditions are found at Suez city, with a value of 44 cm s-2. These acceleration values decrease towards the north of the study area, reaching 14.1 cm s-2 at Damietta city. This agrees with, and is comparable to, the results of previous studies of seismic hazards in northern Egypt. This work can be used for seismic risk mitigation and earthquake engineering purposes.
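The attenuation and site-amplification steps can be sketched as follows; the functional form and coefficients are generic placeholders and are not the relationships used in the study.

```python
# Illustrative sketch of the last two steps described above: a generic empirical
# attenuation (ground-motion prediction) relation followed by a site amplification
# factor. The coefficients are hypothetical, not those used in the study.
import numpy as np

def pga_bedrock(magnitude, distance_km, a=0.5, b=0.9, c=-1.3):
    """ln(PGA[cm/s^2]) = a + b*M + c*ln(R); coefficients are placeholders."""
    return np.exp(a + b * magnitude + c * np.log(distance_km))

def pga_surface(magnitude, distance_km, site_amplification=1.8):
    # Site amplification applied to the bedrock estimate (factor is assumed)
    return site_amplification * pga_bedrock(magnitude, distance_km)

# Maximum expected magnitude of a nearby source and distance to the site (hypothetical)
print("Bedrock PGA:", pga_bedrock(6.5, 40.0), "cm/s^2")
print("Soil PGA:   ", pga_surface(6.5, 40.0), "cm/s^2")
```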
Hazard Analysis for the Mark III Space Suit Assembly (SSA) Used in One-g Operations
NASA Technical Reports Server (NTRS)
Mitchell, Kate; Ross, Amy; Blanco, Raul; Wood, Art
2012-01-01
This Hazard Analysis document encompasses the Mark III Space Suit Assembly (SSA) and associated ancillary equipment. It has been prepared using JSC17773, "Preparing Hazard Analyses for JSC Ground Operation", as a guide. The purpose of this document is to present the potential hazards involved in ground (23 % maximum O2, One-g) operations of the Mark III and associated ancillary support equipment system. The hazards listed in this document are specific to suit operations only; each supporting facility (Bldg. 9, etc.) is responsible for test specific Hazard Analyses. A "hazard" is defined as any condition that has the potential for harming personnel or equipment. This analysis was performed to document the safety aspects associated with manned use of the Mark III for pressurized and unpressurized ambient, ground-based, One-g human testing. The hazards identified herein represent generic hazards inherent to all standard JSC test venues for nominal ground test configurations. Non-standard test venues or test specific configurations may warrant consideration of additional hazards analysis prior to test. The cognizant suit engineer is responsible for the safety of the astronaut/test subject, space suit, and suit support personnel. The test requester, for the test supported by the suit test engineer and suited subject, is responsible for overall safety and any necessary Test Readiness Reviews (TRR).
NASA Astrophysics Data System (ADS)
Thomas, D. M.; Bevens, D.
2015-12-01
The Center for the Study of Active Volcanoes, in cooperation with the USGS Volcano Hazards Program at HVO and CVO, offers a broadly based volcano hazards training program targeted toward scientists and technicians from developing nations. The program has been offered for 25 years and provides a hands-on introduction to a broad suite of volcano monitoring techniques, rather than detailed training with just one. The course content has evolved over the life of the program as the needs of the trainees have changed: initially it emphasized very basic monitoring techniques (e.g. precise leveling, interpretation of seismic drum records, etc.) but, as the level of sophistication of the trainees has increased, training in more advanced technologies has been added. Currently, topics of primary emphasis include volcano seismology and seismic networks; acquisition and modeling of geodetic data; methods of analysis and monitoring of gas geochemistry; interpretation of volcanic deposits and landforms; training in LAHARZ, GIS mapping of lahar risks; and response to and management of volcanic crises. The course also provides training on public outreach, based on CSAV's Hawaii-specific hazards outreach programs, and on volcano preparedness and interactions with the media during volcanic crises. It is an intensive eight-week course with instruction and field activities underway six days per week; it is now offered in two locations, Hawaii Island, for six weeks, and the Cascades volcanoes of the Pacific Northwest, for two weeks, to enable trainees to experience field conditions in both basaltic and continental volcanic environments. The survival of the program for more than two decades demonstrates that a need for such training exists, and there has been interaction with and contribution to the program by the research community; however, broader engagement with the latter continues to present challenges. Some of the reasons for this will be discussed.
Vulnerabilities, Influences and Interaction Paths: Failure Data for Integrated System Risk Analysis
NASA Technical Reports Server (NTRS)
Malin, Jane T.; Fleming, Land
2006-01-01
We describe graph-based analysis methods for identifying and analyzing cross-subsystem interaction risks from subsystem connectivity information. By discovering external and remote influences that would otherwise be unexpected, these methods can support better communication among subsystem designers at points of potential conflict and support the design of more dependable and diagnosable systems. These methods identify hazard causes that can impact vulnerable functions or entities if propagated across interaction paths from the hazard source to the vulnerable target. The analysis can also assess the combined impacts of And-Or trees of disabling influences. The analysis can use ratings of hazards and vulnerabilities to calculate cumulative measures of severity and importance. Identification of cross-subsystem hazard-vulnerability pairs and propagation paths across subsystems will increase coverage of hazard and risk analysis and can indicate risk control and protection strategies.
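A small sketch of the propagation idea, assuming a hypothetical subsystem graph and ratings (none of which come from the paper), might look like this:

```python
# Sketch of propagating a hazard source across subsystem interaction paths to find
# vulnerable targets it can reach; the graph and ratings are hypothetical.
from collections import deque

edges = {                     # directed interaction paths between subsystems (assumed)
    "power_bus": ["avionics", "pump"],
    "avionics": ["thruster_ctrl"],
    "pump": ["coolant_loop"],
}
vulnerability = {"thruster_ctrl": 5, "coolant_loop": 3}   # rated vulnerable entities
hazard_severity = {"power_bus": 4}                        # rated hazard sources

def reachable(source):
    # Breadth-first search along interaction paths from the hazard source
    seen, queue = {source}, deque([source])
    while queue:
        node = queue.popleft()
        for nxt in edges.get(node, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen - {source}

for src, sev in hazard_severity.items():
    for target in reachable(src) & set(vulnerability):
        print(f"{src} -> {target}: importance = {sev * vulnerability[target]}")
```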
Application of seismic interpretation in the development of Jerneh Field, Malay Basin
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yusoff, Z.
1994-07-01
Development of the Jerneh gas field has been significantly aided by the use of 3-D and site survey seismic interpretations. The two aspects that have been of particular importance are identification of sea-floor and near-surface safety hazards for safe platform installation/development drilling and mapping of reservoirs/hydrocarbons within gas-productive sands of the Miocene groups B, D, and E. Choice of platform location as well as casing design require detailed analysis of sea-floor and near-surface safety hazards. At Jerneh, sea-floor pockmarks, near-surface high amplitudes, distributary channels, and minor faults were recognized as potential operational safety hazards. The integration of conventional 3-D and site survey seismic data enabled comprehensive understanding of the occurrence and distribution of potential hazards to platform installation and development well drilling. Three-dimensional seismic interpretation has been instrumental not only in the field structural definition but also in recognition of reservoir trends and hydrocarbon distribution. Additional gas reservoirs were identified by their DHI characteristics and subsequently confirmed by development wells. The innovative use of seismic attribute mapping techniques has been very important in defining both fluid and reservoir distribution in groups B and D. Integration of 3-D seismic data and well-log interpretations has helped in optimal field development, including the planning of well locations and drilling sequence.
Toxic release consequence analysis tool (TORCAT) for inherently safer design plant.
Shariff, Azmi Mohd; Zaini, Dzulkarnain
2010-10-15
Many major accidents due to toxic releases have caused numerous fatalities, such as the tragedy of the MIC release in Bhopal, India (1984). One approach is to use the inherently safer design technique, which applies inherent safety principles to eliminate or minimize accidents rather than to control the hazard. This technique is best implemented in the preliminary design stage, where the consequences of a toxic release can be evaluated and the necessary design improvements implemented to reduce accidents to as low as reasonably practicable (ALARP) without resorting to costly protective systems. However, currently there is no commercial tool available that has such capability. This paper reports preliminary findings on the development of a prototype tool for consequence analysis and design improvement via inherent safety principles, which integrates a process design simulator with a toxic release consequence analysis model. Consequence analyses based on worst-case scenarios during the process flowsheeting stage were conducted as case studies. The preliminary findings show that the toxic release consequence analysis tool (TORCAT) has the capability to eliminate or minimize potential toxic release accidents by adopting the inherent safety principle early in the preliminary design stage. 2010 Elsevier B.V. All rights reserved.
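The abstract does not describe TORCAT's internal dispersion models; as a generic illustration of the kind of worst-case estimate a consequence-analysis tool performs, the sketch below evaluates a standard ground-level Gaussian plume with hypothetical release rate, wind speed, and dispersion coefficients.

```python
# Minimal ground-level Gaussian plume sketch of a worst-case dispersion estimate.
# Release rate, wind speed, and dispersion coefficients are hypothetical placeholders.
import numpy as np

def ground_concentration(q_kg_s, u_m_s, x_m, release_height_m=2.0):
    """Centreline ground-level concentration [kg/m^3] at downwind distance x."""
    # Crude power-law dispersion coefficients for a neutral stability class (assumed)
    sigma_y, sigma_z = 0.08 * x_m, 0.06 * x_m
    return (q_kg_s / (np.pi * u_m_s * sigma_y * sigma_z)
            * np.exp(-release_height_m**2 / (2.0 * sigma_z**2)))

# Hypothetical continuous release: 2 kg/s, 3 m/s wind, receptor 500 m downwind
print(ground_concentration(q_kg_s=2.0, u_m_s=3.0, x_m=500.0), "kg/m^3")
```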
Integrated Safety Analysis Tiers
NASA Technical Reports Server (NTRS)
Shackelford, Carla; McNairy, Lisa; Wetherholt, Jon
2009-01-01
Commercial partnerships and organizational constraints, combined with complex systems, may lead to division of hazard analysis across organizations. This division could cause important hazards to be overlooked, causes to be missed, controls for a hazard to be incomplete, or verifications to be inefficient. Each organization's team must understand at least one level beyond the interface sufficiently to comprehend integrated hazards. This paper discusses various ways to properly divide analysis among organizations. The Ares I launch vehicle integrated safety analysis effort is used to illustrate an approach that addresses the key issues and concerns arising from multiple analysis responsibilities.
Seismic hazard assessment: Issues and alternatives
Wang, Z.
2011-01-01
Seismic hazard and risk are two very important concepts in engineering design and other policy considerations. Although seismic hazard and risk have often been used interchangeably, they are fundamentally different. Furthermore, seismic risk is more important in engineering design and other policy considerations. Seismic hazard assessment is an effort by earth scientists to quantify seismic hazard and its associated uncertainty in time and space and to provide seismic hazard estimates for seismic risk assessment and other applications. Although seismic hazard assessment is more a scientific issue, it deserves special attention because of its significant implication to society. Two approaches, probabilistic seismic hazard analysis (PSHA) and deterministic seismic hazard analysis (DSHA), are commonly used for seismic hazard assessment. Although PSHA has been proclaimed as the best approach for seismic hazard assessment, it is scientifically flawed (i.e., the physics and mathematics that PSHA is based on are not valid). Use of PSHA could lead to either unsafe or overly conservative engineering design or public policy, each of which has dire consequences to society. On the other hand, DSHA is a viable approach for seismic hazard assessment even though it has been labeled as unreliable. The biggest drawback of DSHA is that the temporal characteristics (i.e., earthquake frequency of occurrence and the associated uncertainty) are often neglected. An alternative, seismic hazard analysis (SHA), utilizes earthquake science and statistics directly and provides a seismic hazard estimate that can be readily used for seismic risk assessment and other applications. © 2010 Springer Basel AG.
A Bimodal Hybrid Model for Time-Dependent Probabilistic Seismic Hazard Analysis
NASA Astrophysics Data System (ADS)
Yaghmaei-Sabegh, Saman; Shoaeifar, Nasser; Shoaeifar, Parva
2018-03-01
The evaluation of evidence provided by geological studies and historical catalogs indicates that in some seismic regions and faults, multiple large earthquakes occur in clusters. The occurrence of large earthquakes is then followed by quiescence, during which only small-to-moderate earthquakes take place. Clustering of large earthquakes is the most distinguishable departure from the assumption of constant hazard of random earthquake occurrence in conventional seismic hazard analysis. In the present study, a time-dependent recurrence model is proposed to consider a series of large earthquakes that occur in clusters. The model is flexible enough to better reflect the quasi-periodic behavior of large earthquakes with long-term clustering, and it can be used in time-dependent probabilistic seismic hazard analysis for engineering purposes. In this model, the time-dependent hazard results are estimated by a hazard function comprising three parts: a decreasing hazard from the last large-earthquake cluster, an increasing hazard toward the next large-earthquake cluster, and a constant hazard from the random occurrence of small-to-moderate earthquakes. In the final part of the paper, the time-dependent seismic hazard of the New Madrid Seismic Zone at different time intervals is calculated for illustrative purposes.
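The three-part hazard function can be sketched as follows; the functional forms and parameter values are illustrative assumptions, not those proposed in the paper.

```python
# Sketch of a hazard function combining the three parts described above: a decaying
# term for the last large-earthquake cluster, a rising term toward the next cluster,
# and a constant background rate. Parameter values are hypothetical.
import numpy as np

def hazard_rate(t_years, last_decay=0.05, next_rise=0.002, background=0.01):
    decreasing = last_decay * np.exp(-0.1 * t_years)   # fading hazard of the last cluster
    increasing = next_rise * t_years                   # build-up toward the next cluster
    return decreasing + increasing + background        # plus constant small-event hazard

t = np.arange(0, 200, 10.0)
for ti, h in zip(t, hazard_rate(t)):
    print(f"t = {ti:5.0f} yr  hazard = {h:.4f} /yr")
```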
Probabilistic analysis of tsunami hazards
Geist, E.L.; Parsons, T.
2006-01-01
Determining the likelihood of a disaster is a key component of any comprehensive hazard assessment. This is particularly true for tsunamis, even though most tsunami hazard assessments have in the past relied on scenario or deterministic type models. We discuss probabilistic tsunami hazard analysis (PTHA) from the standpoint of integrating computational methods with empirical analysis of past tsunami runup. PTHA is derived from probabilistic seismic hazard analysis (PSHA), with the main difference being that PTHA must account for far-field sources. The computational methods rely on numerical tsunami propagation models rather than empirical attenuation relationships as in PSHA in determining ground motions. Because a number of source parameters affect local tsunami runup height, PTHA can become complex and computationally intensive. Empirical analysis can function in one of two ways, depending on the length and completeness of the tsunami catalog. For site-specific studies where there is sufficient tsunami runup data available, hazard curves can primarily be derived from empirical analysis, with computational methods used to highlight deficiencies in the tsunami catalog. For region-wide analyses and sites where there are little to no tsunami data, a computationally based method such as Monte Carlo simulation is the primary method to establish tsunami hazards. Two case studies that describe how computational and empirical methods can be integrated are presented for Acapulco, Mexico (site-specific) and the U.S. Pacific Northwest coastline (region-wide analysis).
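A minimal sketch of the Monte Carlo route to a hazard (exceedance) curve is shown below; the surrogate runup distribution and event rate are hypothetical stand-ins for a numerical propagation model, not the method used in the paper.

```python
# Minimal Monte Carlo sketch of building an exceedance curve from simulated runup
# heights, the approach suggested for sites with little or no tsunami data.
import numpy as np

rng = np.random.default_rng(1)
n_years, rate_per_year = 10_000, 0.02          # simulated catalogue length and event rate
n_events = rng.poisson(rate_per_year * n_years)
runup_m = rng.lognormal(mean=0.5, sigma=0.8, size=n_events)  # surrogate for modelled runup

thresholds = np.array([1.0, 2.0, 4.0, 8.0])
annual_exceedance = np.array([(runup_m > h).sum() for h in thresholds]) / n_years
for h, p in zip(thresholds, annual_exceedance):
    print(f"P(runup > {h:.0f} m) ~ {p:.4f} per year")
```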
An information diffusion technique to assess integrated hazard risks.
Huang, Chongfu; Huang, Yundong
2018-02-01
An integrated risk is a future scenario associated with an adverse incident caused by multiple hazards. An integrated probability risk is the expected value of the disaster. Because it is difficult to assess an integrated probability risk with a small sample, weighting methods and copulas are usually employed to avoid this obstacle. To resolve the problem, in this paper we develop the information diffusion technique to construct a joint probability distribution and a vulnerability surface. An integrated risk can then be assessed directly by using a small sample. A case of an integrated risk caused by flood and earthquake is given to show how the suggested technique is used to assess the integrated risk of annual property loss. Copyright © 2017 Elsevier Inc. All rights reserved.
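The core of the one-dimensional information diffusion idea can be sketched as follows; the sample, monitoring points, and diffusion coefficient are hypothetical, and the full method in the paper additionally builds a joint distribution and a vulnerability surface.

```python
# Sketch of normal information diffusion: each sample spreads its unit of information
# over a set of monitoring points, giving a usable probability distribution from a
# small sample. Bandwidth and data are hypothetical.
import numpy as np

losses = np.array([2.1, 3.4, 5.0, 7.8, 12.0])    # small sample of annual losses
points = np.linspace(0.0, 15.0, 31)              # monitoring points (discrete universe)
h = 2.0                                          # diffusion coefficient (assumed)

diffused = np.exp(-(losses[:, None] - points[None, :])**2 / (2 * h**2))
diffused /= diffused.sum(axis=1, keepdims=True)  # each sample carries one unit of information
prob = diffused.sum(axis=0) / len(losses)        # estimated probability distribution

print("Expected annual loss ~", float(np.sum(points * prob)))
```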
Analysis of On-board Hazard Detection and Avoidance for Safe Lunar Landing
NASA Technical Reports Server (NTRS)
Johnson, Andrew E.; Huertas, Andres; Werner, Robert A.; Montgomery, James F.
2008-01-01
Landing hazard detection and avoidance technology is being pursued within NASA to improve landing safety and increase access to sites of interest on the lunar surface. The performance of a hazard detection and avoidance system depends on properties of the terrain, sensor performance, algorithm design, vehicle characteristics, and the overall guidance, navigation, and control architecture. This paper analyzes the size of the region that must be imaged, sensor performance parameters, and the impact of trajectory angle on hazard detection performance. The analysis shows that vehicle hazard tolerance is the driving parameter for hazard detection system design.
Kim, Hyo Jin; Lee, Joongyub; Park, Miseon; Kim, Yuri; Lee, Hajeong; Kim, Dong Ki; Joo, Kwon Wook; Kim, Yon Su; Cho, Eun Jin; Ahn, Curie
2017-01-01
Background Lower education level could be a risk factor for higher peritoneal dialysis (PD)-associated peritonitis, potentially resulting in technique failure. This study evaluated the influence of lower education level on the development of peritonitis, technique failure, and overall mortality. Methods Patients over 18 years of age who started PD at Seoul National University Hospital between 2000 and 2012 with information on the academic background were enrolled. Patients were divided into three groups: middle school or lower (academic year≤9, n = 102), high school (9
Kim, Hyo Jin; Lee, Joongyub; Park, Miseon; Kim, Yuri; Lee, Hajeong; Kim, Dong Ki; Joo, Kwon Wook; Kim, Yon Su; Cho, Eun Jin; Ahn, Curie; Oh, Kook-Hwan
2017-01-01
Lower education level could be a risk factor for higher peritoneal dialysis (PD)-associated peritonitis, potentially resulting in technique failure. This study evaluated the influence of lower education level on the development of peritonitis, technique failure, and overall mortality. Patients over 18 years of age who started PD at Seoul National University Hospital between 2000 and 2012 with information on the academic background were enrolled. Patients were divided into three groups: middle school or lower (academic year≤9, n = 102), high school (9
Cha, DongHwan; Wang, Xin; Kim, Jeong Woo
2017-01-01
Hotspot analysis was implemented to find regions in the province of Alberta (Canada) with high frequency Cloud to Ground (CG) lightning strikes clustered together. Generally, hotspot regions are located in the central, central east, and south central regions of the study region. About 94% of annual lightning occurred during warm months (June to August) and the daily lightning frequency was influenced by the diurnal heating cycle. The association rule mining technique was used to investigate frequent CG lightning patterns, which were verified by similarity measurement to check the patterns’ consistency. The similarity coefficient values indicated that there were high correlations throughout the entire study period. Most wildfires (about 93%) in Alberta occurred in forests, wetland forests, and wetland shrub areas. It was also found that lightning and wildfires occur in two distinct areas: frequent wildfire regions with a high frequency of lightning, and frequent wildfire regions with a low frequency of lightning. Further, the preference index (PI) revealed locations where the wildfires occurred more frequently than in other class regions. The wildfire hazard area was estimated with the CG lightning hazard map and specific land use types. PMID:29065564
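The support/confidence computation underlying association rule mining can be illustrated with a toy sketch; the per-cell records below are hypothetical and do not reproduce the study's data.

```python
# Toy sketch of the support/confidence calculation behind association rule mining,
# applied to per-cell flags for lightning, land cover and wildfire occurrence.
records = [
    {"lightning", "forest", "wildfire"},
    {"lightning", "wetland_forest"},
    {"lightning", "forest", "wildfire"},
    {"forest"},
    {"lightning", "forest", "wildfire"},
]

def support(itemset):
    # Fraction of records that contain every item in the itemset
    return sum(itemset <= r for r in records) / len(records)

antecedent, consequent = {"lightning", "forest"}, {"wildfire"}
sup = support(antecedent | consequent)
conf = sup / support(antecedent)
print(f"support = {sup:.2f}, confidence = {conf:.2f}")
```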
NASA Astrophysics Data System (ADS)
Rosas-Carbajal, Marina; Komorowski, Jean-Christophe; Nicollin, Florence; Gibert, Dominique
2016-07-01
Catastrophic collapses of the flanks of stratovolcanoes constitute a major hazard threatening numerous lives in many countries. Although many such collapses occurred following the ascent of magma to the surface, many are not associated with magmatic reawakening but are triggered by a combination of forcing agents such as pore-fluid pressurization and/or mechanical weakening of the volcanic edifice often located above a low-strength detachment plane. The volume of altered rock available for collapse, the dynamics of the hydrothermal fluid reservoir and the geometry of incipient collapse failure planes are key parameters for edifice stability analysis and modelling that remain essentially hidden to current volcano monitoring techniques. Here we derive a high-resolution, three-dimensional electrical conductivity model of the La Soufrière de Guadeloupe volcano from extensive electrical tomography data. We identify several highly conductive regions in the lava dome that are associated to fluid saturated host-rock and preferential flow of highly acid hot fluids within the dome. We interpret this model together with the existing wealth of geological and geochemical data on the volcano to demonstrate the influence of the hydrothermal system dynamics on the hazards associated to collapse-prone altered volcanic edifices.
Rosas-Carbajal, Marina; Komorowski, Jean-Christophe; Nicollin, Florence; Gibert, Dominique
2016-01-01
Catastrophic collapses of the flanks of stratovolcanoes constitute a major hazard threatening numerous lives in many countries. Although many such collapses occurred following the ascent of magma to the surface, many are not associated with magmatic reawakening but are triggered by a combination of forcing agents such as pore-fluid pressurization and/or mechanical weakening of the volcanic edifice often located above a low-strength detachment plane. The volume of altered rock available for collapse, the dynamics of the hydrothermal fluid reservoir and the geometry of incipient collapse failure planes are key parameters for edifice stability analysis and modelling that remain essentially hidden to current volcano monitoring techniques. Here we derive a high-resolution, three-dimensional electrical conductivity model of the La Soufrière de Guadeloupe volcano from extensive electrical tomography data. We identify several highly conductive regions in the lava dome that are associated to fluid saturated host-rock and preferential flow of highly acid hot fluids within the dome. We interpret this model together with the existing wealth of geological and geochemical data on the volcano to demonstrate the influence of the hydrothermal system dynamics on the hazards associated to collapse-prone altered volcanic edifices. PMID:27457494
Cha, DongHwan; Wang, Xin; Kim, Jeong Woo
2017-10-23
Hotspot analysis was implemented to find regions in the province of Alberta (Canada) with high frequency Cloud to Ground (CG) lightning strikes clustered together. Generally, hotspot regions are located in the central, central east, and south central regions of the study region. About 94% of annual lightning occurred during warm months (June to August) and the daily lightning frequency was influenced by the diurnal heating cycle. The association rule mining technique was used to investigate frequent CG lightning patterns, which were verified by similarity measurement to check the patterns' consistency. The similarity coefficient values indicated that there were high correlations throughout the entire study period. Most wildfires (about 93%) in Alberta occurred in forests, wetland forests, and wetland shrub areas. It was also found that lightning and wildfires occur in two distinct areas: frequent wildfire regions with a high frequency of lightning, and frequent wildfire regions with a low frequency of lightning. Further, the preference index (PI) revealed locations where the wildfires occurred more frequently than in other class regions. The wildfire hazard area was estimated with the CG lightning hazard map and specific land use types.
Rosas-Carbajal, Marina; Komorowski, Jean-Christophe; Nicollin, Florence; Gibert, Dominique
2016-07-26
Catastrophic collapses of the flanks of stratovolcanoes constitute a major hazard threatening numerous lives in many countries. Although many such collapses occurred following the ascent of magma to the surface, many are not associated with magmatic reawakening but are triggered by a combination of forcing agents such as pore-fluid pressurization and/or mechanical weakening of the volcanic edifice often located above a low-strength detachment plane. The volume of altered rock available for collapse, the dynamics of the hydrothermal fluid reservoir and the geometry of incipient collapse failure planes are key parameters for edifice stability analysis and modelling that remain essentially hidden to current volcano monitoring techniques. Here we derive a high-resolution, three-dimensional electrical conductivity model of the La Soufrière de Guadeloupe volcano from extensive electrical tomography data. We identify several highly conductive regions in the lava dome that are associated to fluid saturated host-rock and preferential flow of highly acid hot fluids within the dome. We interpret this model together with the existing wealth of geological and geochemical data on the volcano to demonstrate the influence of the hydrothermal system dynamics on the hazards associated to collapse-prone altered volcanic edifices.
2016-01-01
The literature of environmental microbiology broadly discusses issues associated with microbial hazards in archives, but these publications are mainly devoted to paper documents. There are few articles on historical parchment documents, which used to be very important for the development of literature and the art of writing. These studies present a broad spectrum of methods for the assessment of biodeterioration hazards of the parchment document in question. They are based on both conventional microbiological methods and advanced techniques of molecular biology. Here, a qualitative analysis was conducted, based on genetic identification of bacteria and fungi present on the document as well as denaturing gradient gel electrophoresis profiling and examining the destructive potential of isolated microbes. Moreover, the study involved a quantitative and qualitative microbiological assessment of the indoor air in the room where the parchment was kept. The microbes with the highest destructive potential that were isolated from the investigated item were Bacillus cereus and Acinetobacter lwoffii bacteria and Penicillium chrysogenum, Chaetomium globosum, and Trichoderma longibrachiatum fungi. The presence of the B. cereus strain was particularly interesting since, under appropriate conditions, it leads to complete parchment degradation within several days. PMID:26896133
Lee, D; Porter, J; Hertel, N; Hatswell, A J; Briggs, A
2016-08-01
In the absence of head-to-head data, a common method for modelling comparative survival for cost-effectiveness analysis is estimating hazard ratios from trial publications. This assumes that the hazards of mortality are proportional between treatments and that outcomes are not polluted by subsequent therapy use. Newer techniques that compare treatments where the proportional hazards assumption is violated and adjust for the use of subsequent therapies often require patient-level data, which are rarely available for all treatments. The objective of this study was to provide a comparison of overall survival data for ipilimumab, vemurafenib and dacarbazine using data from three trials lacking a common comparator arm and confounded by the use of subsequent treatment. We estimated three overall survival curves for vemurafenib and compared the differences against ipilimumab and dacarbazine. We performed a naïve comparison and then adjusted it for heterogeneity between the ipilimumab and vemurafenib trials, including differences in prognostic characteristics and subsequent therapy, using a published hazard function for the impact of prognostic characteristics in melanoma and trial data on the impact of second-line use of ipilimumab. The mean incremental life-years gained for patients receiving ipilimumab compared with vemurafenib were 0.34 (95 % confidence interval [CI] -0.24 to 0.84) using the naïve comparison and 0.51 (95 % CI -0.08 to 0.99) using the covariate-adjusted survival curve. The analyses estimated the comparative efficacy of ipilimumab and vemurafenib in the absence of head-to-head patient-level data for all trials and of proportional hazards in overall survival.
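The standard survival-analysis building blocks named above (Kaplan-Meier curves and proportional hazards regression) can be sketched with the lifelines package; the data frame below is made up, and this is not the adjustment model used by the authors.

```python
# Rough sketch of a Kaplan-Meier curve and a Cox proportional hazards fit with a
# covariate, using the lifelines package on hypothetical data.
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter

df = pd.DataFrame({
    "months": [3, 8, 12, 24, 30, 6, 15, 27],     # observed survival time (hypothetical)
    "event":  [1, 1, 0, 1, 0, 1, 1, 0],          # 1 = death observed, 0 = censored
    "ldh_elevated": [1, 1, 0, 0, 0, 1, 0, 1],    # prognostic covariate (hypothetical)
})

kmf = KaplanMeierFitter()
kmf.fit(df["months"], event_observed=df["event"])
print(kmf.median_survival_time_)

cph = CoxPHFitter()
cph.fit(df, duration_col="months", event_col="event")
cph.print_summary()                               # hazard ratio for the covariate
```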
Flood risk management in the Souss watershed
NASA Astrophysics Data System (ADS)
Bouaakkaz, Brahim; El Abidine El Morjani, Zine; Bouchaou, Lhoussaine; Elhimri, Hamza
2018-05-01
Flooding is the most devastating natural hazard, causing more damage worldwide than any other. In 2016, for the fourth year in a row, it was the most costly natural disaster in terms of global economic losses: 62 billion US dollars, according to Aon Benfield's 2016 annual report on climate and natural disasters [1]. The semi-arid to arid Souss watershed is vulnerable to floods, whose intensity is becoming increasingly alarming, and this area does not escape the effects of this extreme event. Indeed, the susceptibility of this region to this type of hazard is accentuated by its rapid evolution in terms of demography, uncontrolled land use, anthropogenic actions (uncontrolled urbanization, encroachment of the hydraulic public domain, overgrazing, clearing and deforestation), and the physical behavior of the environment (steep slopes, impermeable rocks, etc.). It is in this context that we have developed a strategic plan of action to manage this risk in the Souss basin in order to reduce human, economic and environmental losses, after modeling the flood hazard in the study area using geographic information systems (GIS), satellite remote sensing and multi-criteria analysis techniques, as well as the history of major floods. This study, which generated a high-resolution (30 m) flood hazard spatial distribution map with an accuracy of 85%, provides a decision tool to identify and prioritize areas with a high probability of hazard occurrence. It can also serve as a basis for urban evacuation plans, anticipating and preventing flood risk in the region in order to avoid any dramatic disaster.
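The GIS multi-criteria step can be illustrated with a minimal weighted-overlay sketch; the factor rasters, weights, and class breaks are hypothetical placeholders for the real 30 m layers and the study's actual criteria.

```python
# Minimal sketch of a GIS-style weighted overlay for a multi-criteria flood hazard map:
# each normalised factor raster is combined with an assumed weight.
import numpy as np

slope    = np.array([[0.9, 0.4], [0.2, 0.1]])   # inverse-slope score (flat = high hazard)
permeab  = np.array([[0.8, 0.7], [0.3, 0.2]])   # impermeable-rock score
land_use = np.array([[0.6, 0.9], [0.5, 0.1]])   # urbanisation / bare-soil score
weights  = {"slope": 0.4, "permeab": 0.35, "land_use": 0.25}   # assumed weights

hazard = (weights["slope"] * slope
          + weights["permeab"] * permeab
          + weights["land_use"] * land_use)

classes = np.digitize(hazard, bins=[0.3, 0.6])  # 0 = low, 1 = moderate, 2 = high
print(hazard, classes, sep="\n")
```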
Identification, prediction, and mitigation of sinkhole hazards in evaporite karst areas
Gutierrez, F.; Cooper, A.H.; Johnson, K.S.
2008-01-01
Sinkholes usually have a higher probability of occurrence and a greater genetic diversity in evaporite terrains than in carbonate karst areas. This is because evaporites have a higher solubility and, commonly, a lower mechanical strength. Subsidence damage resulting from evaporite dissolution generates substantial losses throughout the world, but the causes are only well understood in a few areas. To deal with these hazards, a phased approach is needed for sinkhole identification, investigation, prediction, and mitigation. Identification techniques include field surveys and geomorphological mapping combined with accounts from local people and historical sources. Detailed sinkhole maps can be constructed from sequential historical maps, recent topographical maps, and digital elevation models (DEMs) complemented with building-damage surveying, remote sensing, and high-resolution geodetic surveys. On a more detailed level, information from exposed paleosubsidence features (paleokarst), speleological explorations, geophysical investigations, trenching, dating techniques, and boreholes may help in investigating dissolution and subsidence features. Information on the hydrogeological pathways including caves, springs, and swallow holes are particularly important especially when corroborated by tracer tests. These diverse data sources make a valuable database: the karst inventory. From this dataset, sinkhole susceptibility zonations (relative probability) may be produced based on the spatial distribution of the features and good knowledge of the local geology. Sinkhole distribution can be investigated by spatial distribution analysis techniques including studies of preferential elongation, alignment, and nearest neighbor analysis. More objective susceptibility models may be obtained by analyzing the statistical relationships between the known sinkholes and the conditioning factors. Chronological information on sinkhole formation is required to estimate the probability of occurrence of sinkholes (number of sinkholes/km2 year). Such spatial and temporal predictions, frequently derived from limited records and based on the assumption that past sinkhole activity may be extrapolated to the future, are non-corroborated hypotheses. Validation methods allow us to assess the predictive capability of the susceptibility maps and to transform them into probability maps. Avoiding the most hazardous areas by preventive planning is the safest strategy for development in sinkhole-prone areas. Corrective measures could be applied to reduce the dissolution activity and subsidence processes. A more practical solution for safe development is to reduce the vulnerability of the structures by using subsidence-proof designs. © 2007 Springer-Verlag.
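The nearest neighbor analysis mentioned above can be sketched with a Clark-Evans ratio; the sinkhole coordinates and study area below are hypothetical.

```python
# Sketch of a nearest-neighbour analysis of mapped sinkhole centres (coordinates are
# hypothetical): a Clark-Evans ratio R < 1 suggests clustering, R close to 1 randomness.
import numpy as np
from scipy.spatial import cKDTree

sinkholes = np.array([[10.0, 12.0], [11.0, 13.0], [10.5, 12.4],
                      [40.0, 45.0], [41.2, 44.8], [80.0, 15.0]])   # map units
study_area = 100.0 * 100.0                                          # same units squared

tree = cKDTree(sinkholes)
dist, _ = tree.query(sinkholes, k=2)          # column 0 is the point itself
mean_nn = dist[:, 1].mean()
expected = 0.5 / np.sqrt(len(sinkholes) / study_area)   # expected mean distance under CSR
print("Clark-Evans R =", mean_nn / expected)
```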
Guide for Hydrogen Hazards Analysis on Components and Systems
NASA Technical Reports Server (NTRS)
Beeson, Harold; Woods, Stephen
2003-01-01
The physical and combustion properties of hydrogen give rise to hazards that must be considered when designing and operating a hydrogen system. One of the major concerns in the use of hydrogen is fire or detonation because of hydrogen's wide flammability range, low ignition energy, and high flame speed. Other concerns include the contact and interaction of hydrogen with materials, such as the hydrogen embrittlement of materials and the formation of hydrogen hydrides. The low temperatures of liquid and slush hydrogen bring other concerns related to material compatibility and pressure control; this is especially important when dissimilar, adjoining materials are involved. The potential hazards arising from these properties and design features necessitate a proper hydrogen hazards analysis before introducing a material, component, or system into hydrogen service. The objective of this guide is to describe the NASA Johnson Space Center White Sands Test Facility hydrogen hazards analysis method that should be performed before hydrogen is used in components and/or systems. The method is consistent with standard practices for analyzing hazards. It is recommended that this analysis be made before implementing a hydrogen component qualification procedure. A hydrogen hazards analysis is a useful tool for hydrogen-system designers, system and safety engineers, and facility managers. A hydrogen hazards analysis can identify problem areas before hydrogen is introduced into a system, preventing damage to hardware, delay or loss of mission or objective, and possible injury or loss of life.
Integrated Risk Assessment to Natural Hazards in Motozintla, Chiapas, Mexico
NASA Astrophysics Data System (ADS)
Novelo-Casanova, D. A.
2012-12-01
An integrated risk assessment includes the analysis of all components of the individual constituents of risk, such as a baseline study, hazard identification and categorization, hazard exposure, and vulnerability. Vulnerability refers to the inability of people, organizations, and societies to withstand adverse impacts from the multiple stressors to which they are exposed. These impacts are due to characteristics inherent in social interactions, institutions, and systems of cultural values. Thus, social vulnerability is a pre-existing condition that affects a society's ability to prepare for and recover from a disruptive event. Risk is the probability of a loss, and this loss depends on three elements: hazard, exposure, and vulnerability. Thus, risk is the estimated impact that a hazard event would have on people, services, facilities, structures and assets in a community. In this work we assess the risk to natural hazards in the community of Motozintla, located in southern Mexico in the state of Chiapas (15.37N, 92.25W), with a population of about 20,000 inhabitants. Due to its geographical and geological location, this community is continuously exposed to many different natural hazards (earthquakes, landslides, volcanic eruptions, and floods). To determine the level of exposure of the community to natural hazards, we developed integrated studies and analyses of seismic microzonation, landslide and flood susceptibility, and volcanic impact using standard methodologies. Social vulnerability was quantified from data obtained from interviews with local families. Five variables were considered: household structure quality and design, availability of basic public services, family economic conditions, existing family plans for disaster preparedness, and risk perception. The number of families surveyed was determined by considering a statistically significant sample. The families interviewed were selected using the simple random sampling technique with replacement. With this procedure, each household was chosen randomly and entirely by chance, with the same probability of being chosen at any stage during the sampling process. To facilitate interpretation, all results were spatially analyzed using a geographic information system (GIS). Our results indicate that the community of Motozintla is highly exposed to floods, landslides and earthquakes and, to a lesser extent, to the impact of a volcanic eruption. The locality has a high level of structural vulnerability to the main identified hazards (floods and landslides). About 70% of the families have a daily income below 11 USD. Approximately 66% of the population does not know of any existing civil protection plan. Another major observation is that community organization for disaster prevention is practically nonexistent. These natural and social conditions indicate that the community of Motozintla has a very high level of risk from natural hazards. This research will support decision makers in Mexico, and particularly in the state of Chiapas, in the development of an integrated, comprehensive natural hazards mitigation and prevention program in this region.
Yearly report, Yucca Mountain project
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brune, J.N.
1992-09-30
We proposed to (1) develop our data logging and analysis equipment and techniques for analyzing seismic data from the Southern Great Basin Seismic Network (SGBSN); (2) investigate the SGBSN data for evidence of seismicity patterns, depth distribution patterns, and correlations with geologic features; (3) repair and maintain our three broadband downhole digital seismograph stations at Nelson, Nevada, Troy Canyon, Nevada, and Deep Springs, California; (4) install, operate, and log data from a super-sensitive microearthquake array at Yucca Mountain; and (5) analyze data from microearthquakes relative to seismic hazard at Yucca Mountain.
NASA Astrophysics Data System (ADS)
Arend, Mark; Campmier, Mark; Fernandez, Aris; Moshary, Fred
2018-04-01
The complexity of urban boundary layer dynamics poses challenges to those responsible for the design and regulation of buildings and structures in the urban environment. Lidar systems in the New York City metropolitan region have been used extensively to study urban boundary layer dynamics. These systems, in conjunction with other sensing platforms, can provide an observatory to perform research and analysis of turbulent and inclement weather patterns of interest to developers and agencies.
Biodeterioration of materials in water reclamation systems
NASA Technical Reports Server (NTRS)
Ford, Tim; Maki, James S.; Mitchell, Ralph
1992-01-01
The chemicals produced by the microbial processes involved in the 'biofilms' which form on the surfaces of manned spacecraft water reclamation systems encompass both metals and organic poisons; both are potential hazards to astronaut health and the growth of the plants envisioned for closed-cycle life support systems. Image analysis is here shown to be a very useful technique for the study of biofilm formation on candidate water-processor materials for Space Station Freedom. The biodeterioration of materials exposed to biofilms can be swiftly evaluated by means of electrochemical impedance spectroscopy.
Hazard identification by methods of animal-based toxicology.
Barlow, S M; Greig, J B; Bridges, J W; Carere, A; Carpy, A J M; Galli, C L; Kleiner, J; Knudsen, I; Koëter, H B W M; Levy, L S; Madsen, C; Mayer, S; Narbonne, J-F; Pfannkuch, F; Prodanchuk, M G; Smith, M R; Steinberg, P
2002-01-01
This paper is one of several prepared under the project "Food Safety In Europe: Risk Assessment of Chemicals in Food and Diet" (FOSIE), a European Commission Concerted Action Programme, organised by the International Life Sciences Institute, Europe (ILSI). The aim of the FOSIE project is to review the current state of the science of risk assessment of chemicals in food and diet, by consideration of the four stages of risk assessment, that is, hazard identification, hazard characterisation, exposure assessment and risk characterisation. The contribution of animal-based methods in toxicology to hazard identification of chemicals in food and diet is discussed. The importance of first applying existing technical and chemical knowledge to the design of safety testing programs for food chemicals is emphasised. There is consideration of the presently available and commonly used toxicity testing approaches and methodologies, including acute and repeated dose toxicity, reproductive and developmental toxicity, neurotoxicity, genotoxicity, carcinogenicity, immunotoxicity and food allergy. They are considered from the perspective of whether they are appropriate for assessing food chemicals and whether they are adequate to detect currently known or anticipated hazards from food. Gaps in knowledge and future research needs are identified; research on these could lead to improvements in the methods of hazard identification for food chemicals. The potential impact of some emerging techniques and toxicological issues on hazard identification for food chemicals, such as new measurement techniques, the use of transgenic animals, assessment of hormone balance and the possibilities for conducting studies in which common human diseases have been modelled, is also considered.
An evaluation of treatment strategies for head and neck cancer in an African American population.
Ignacio, D N; Griffin, J J; Daniel, M G; Serlemitsos-Day, M T; Lombardo, F A; Alleyne, T A
2013-07-01
This study evaluated treatment strategies for head and neck cancers in a predominantly African American population. Data were collected utilizing medical records and the tumour registry at the Howard University Hospital. Kaplan-Meier method was used for survival analysis and Cox proportional hazards regression analysis predicted the hazard of death. Analysis revealed that the main treatment strategy was radiation combined with platinum for all stages except stage I. Cetuximab was employed in only 1% of cases. Kaplan-Meier analysis revealed stage II patients had poorer outcome than stage IV while Cox proportional hazard regression analysis (p = 0.4662) showed that stage I had a significantly lower hazard of death than stage IV (HR = 0.314; p = 0.0272). Contributory factors included tobacco and alcohol but body mass index (BMI) was inversely related to hazard of death. There was no difference in survival using any treatment modality for African Americans.
Recording and cataloging hazards information, revision A
NASA Technical Reports Server (NTRS)
Stein, R. J.
1974-01-01
A data collection process is described for the purpose of discerning causation factors of accidents and establishing boundaries or controls aimed at mitigating and eliminating accidents. A procedure is proposed that suggests a disciplined approach to hazard identification based on energy interrelationships, together with an integrated control technique that takes the form of checklists.
Plioutsias, Anastasios; Karanikas, Nektarios; Chatzimihailidou, Maria Mikela
2018-03-01
Currently, published risk analyses for drones refer mainly to commercial systems, use data from civil aviation, and are based on probabilistic approaches without suggesting an inclusive list of hazards and respective requirements. Within this context, this article presents: (1) a set of safety requirements generated from the application of the systems theoretic process analysis (STPA) technique on a generic small drone system; (2) a gap analysis between the set of safety requirements and the ones met by 19 popular drone models; (3) the extent of the differences between those models, their manufacturers, and the countries of origin; and (4) the association of drone prices with the extent they meet the requirements derived by STPA. The application of STPA resulted in 70 safety requirements distributed across the authority, manufacturer, end user, or drone automation levels. A gap analysis showed high dissimilarities regarding the extent to which the 19 drones meet the same safety requirements. Statistical results suggested a positive correlation between drone prices and the extent that the 19 drones studied herein met the safety requirements generated by STPA, and significant differences were identified among the manufacturers. This work complements the existing risk assessment frameworks for small drones, and contributes to the establishment of a commonly endorsed international risk analysis framework. Such a framework will support the development of a holistic and methodologically justified standardization scheme for small drone flights. © 2017 Society for Risk Analysis.
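As an aside on the price-versus-requirements association reported above, one common way to test such a monotonic relationship is a rank correlation. The sketch below uses hypothetical prices and counts of the 70 STPA-derived requirements met; it is not the authors' dataset or their exact statistical procedure.

```python
from scipy.stats import spearmanr

prices = [399, 499, 799, 999, 1299, 1499, 1899, 2999]   # hypothetical drone prices (USD)
requirements_met = [18, 22, 30, 35, 41, 44, 50, 58]     # hypothetical counts out of 70

rho, p_value = spearmanr(prices, requirements_met)
print(f"Spearman rho = {rho:.2f}, p = {p_value:.4f}")
```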
Safety assessment of discharge chute isolation barrier preparation and installation. Revision 1
DOE Office of Scientific and Technical Information (OSTI.GOV)
Meichle, R.H.
1994-10-10
This revision responds to RL comments and expands the discussion of the "effective hazard categorization" and the readiness review basis. The safety assessment is made for the activities for the preparation and installation of the discharge chute isolation barriers. The safety assessment includes a hazard assessment and a comparison of potential accidents/events to those addressed by the current safety basis documentation. No significant hazards were identified. An evaluation against the USQ evaluation questions was made, and it was determined that the activities do not represent a USQ. Hazard categorization techniques were used to provide a basis for readiness review classification.
Detection of indoor biological hazards using the man-portable laser induced breakdown spectrometer
DOE Office of Scientific and Technical Information (OSTI.GOV)
Munson, Chase A.; Gottfried, Jennifer L.; Snyder, Emily Gibb
2008-11-01
The performance of a man-portable laser induced breakdown spectrometer was evaluated for the detection of biological powders on indoor office surfaces and wipe materials. Identification of pure unknown powders was performed by comparing against a library of spectra containing biological agent surrogates and confusant materials, such as dusts, diesel soot, natural and artificial sweeteners, and drink powders, using linear correlation analysis. Simple models constructed using a second technique, partial least squares discriminant analysis, successfully identified Bacillus subtilis (BG) spores on wipe materials and office surfaces. Furthermore, these models were able to identify BG on materials not used in the training of the model.
Mayega, R W; Wafula, M R; Musenero, M; Omale, A; Kiguli, J; Orach, G C; Kabagambe, G; Bazeyo, W
2013-06-01
Most countries in sub-Saharan Africa have not conducted a disaster risk analysis. Hazards and vulnerability analyses provide vital information that can be used for development of risk reduction and disaster response plans. The purpose of this study was to rank disaster hazards for Uganda, as a basis for identifying the priority hazards to guide disaster management planning. The study was conducted in Uganda, as part of a multi-country assessment. A hazard, vulnerability and capacity analysis was conducted in a focus group discussion of 7 experts representing key stakeholder agencies in disaster management in Uganda. A simple ranking method was used to rank the probability of occurrence of the 11 top hazards, their potential impact and the level of vulnerability of people and infrastructure. In terms of likelihood of occurrence and potential impact, the top ranked disaster hazards in Uganda are: 1) epidemics of infectious diseases, 2) drought/famine, and 3) conflict and environmental degradation, in that order. In terms of vulnerability, the top priority hazards to which people and infrastructure were vulnerable were: 1) conflicts, 2) epidemics, 3) drought/famine and 4) environmental degradation, in that order. Poverty, gender, lack of information, and lack of resilience measures were some of the factors promoting vulnerability to disasters. As Uganda develops a disaster risk reduction and response plan, it ought to prioritize epidemics of infectious diseases, drought/famine, conflicts and environmental degradation as the priority disaster hazards.
Space geodetic tools provide early warnings for earthquakes and volcanic eruptions
NASA Astrophysics Data System (ADS)
Aoki, Yosuke
2017-04-01
Development of space geodetic techniques such as the Global Navigation Satellite System and Synthetic Aperture Radar in the last few decades allows us to monitor deformation of the Earth's surface at unprecedented spatial and temporal resolution. These observations, combined with fast data transmission and quick data processing, enable us to quickly detect and locate earthquakes and volcanic eruptions and to assess potential hazards such as strong earthquake shaking, tsunamis, and volcanic eruptions. These techniques are thus key parts of early warning systems, help identify some hazards before a cataclysmic event, and improve the response to the consequent damage.
Preliminary hazards analysis -- vitrification process
DOE Office of Scientific and Technical Information (OSTI.GOV)
Coordes, D.; Ruggieri, M.; Russell, J.
1994-06-01
This paper presents a Preliminary Hazards Analysis (PHA) for mixed waste vitrification by joule heating. The purpose of performing a PHA is to establish an initial hazard categorization for a DOE nuclear facility and to identify those processes and structures which may have an impact on or be important to safety. The PHA is typically performed during, and provides input to, project conceptual design. The PHA is then followed by a Preliminary Safety Analysis Report (PSAR) performed during Title 1 and 2 design. The PSAR then leads to the Final Safety Analysis Report, performed during the facility's construction and testing; it should be completed before routine operation of the facility commences. This PHA addresses the first four chapters of the safety analysis process, in accordance with the requirements of DOE Safety Guidelines in SG 830.110. The hazards associated with vitrification processes are evaluated using standard safety analysis methods which include: identification of credible potential hazardous energy sources; identification of preventative features of the facility or system; identification of mitigative features; and analyses of credible hazards. Maximal facility inventories of radioactive and hazardous materials are postulated to evaluate worst case accident consequences. These inventories were based on DOE-STD-1027-92 guidance and the surrogate waste streams defined by Mayberry et al. Radiological assessments indicate that a facility, depending on the radioactive material inventory, may be an exempt, Category 3, or Category 2 facility. The calculated impacts would result in no significant impact to offsite personnel or the environment. Hazardous materials assessment indicates that a Mixed Waste Vitrification facility will be a Low Hazard facility having minimal impacts to offsite personnel and the environment.
NASA Technical Reports Server (NTRS)
Campbell, R.; Dyer, M. K.; Hoard, E. G.; Little, D. G.; Taylor, A. C.
1972-01-01
Constructive recommendations are suggested for pollution problems from offshore energy resource industries on the outer continental shelf. Technical management techniques for pollution identification and control offer possible applications to space engineering and management.
The HSE management system in practice-implementation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Primrose, M.J.; Bentley, P.D.; Sykes, R.M.
1996-11-01
This paper sets out the necessary strategic issues that must be dealt with when setting up a management system for HSE. It touches on the setting of objectives using a form of risk matrix and the establishment of corporate risk tolerability levels. Such issue management is vital but can be seen as yet another corporate HQ initiative. It must therefore be linked, and made relevant, to those in middle management tasked with implementing the system and also to those at risk "at the sharp end" of the business. Setting acceptance criteria is aimed at demonstrating a necessary and sufficient level of control or coverage for those hazards considered as being within the objective setting of the Safety or HSE Case. Critical risk areas addressed via the Safety Case, within Shell companies at least, must show how this coverage is extended to critical health and environmental issues. Methods of achieving this are various, ranging from specific Case deliverables (like the Hazard Register and Accountability Matrices) through to the incorporation of topics from the hazard analysis in toolbox talks and meetings. Risk analysis techniques are increasingly seen as complementary rather than separate, with environmental assessments, health risk assessments and safety risk analyses taking place together and results being considered jointly. The paper ends with some views on the way ahead regarding the linking of risk decisions to target setting at the workplace and views on how Case information may be retrieved and used on a daily basis.
NASA Astrophysics Data System (ADS)
Sari, D. A. P.; Innaqa, S.; Safrilah
2017-06-01
This research analyzed the levels of disaster risk in the Citeureup sub-District, Bogor Regency, West Java, based on its potential hazard, vulnerability and capacity, using maps to represent the results; Miles and Huberman analytical techniques were then used to analyze the qualitative interviews. The analysis conducted in this study is based on the concept of disaster risk by Wisner. The results show that the Citeureup sub-District has a medium-low risk of landslides. Of the 14 villages, three villages have a moderate risk level, namely Hambalang, Tajur, and Tangkil, or 49.58% of the total land area. Eleven villages have a low level of risk, namely Pasir Mukti, Sanja, Tarikolot, Gunung Sari, Puspasari, East Karang Asem, Citeureup, Leuwinutug, Sukahati, West Karang Asem, and Puspanegara, or 48.68% of the total land area, while high-risk areas cover only around 1.74%, within Hambalang village. The analysis using a Geographic Information System (GIS) proves that areas with a high risk potential do not necessarily have a high level of risk. The capacity of the community plays an important role in minimizing the risk of a region. The disaster risk reduction strategy is to create safe conditions, which intensifies the disaster risk reduction movement.
Fire hazard after prescribed burning in a gorse shrubland: implications for fuel management.
Marino, Eva; Guijarro, Mercedes; Hernando, Carmen; Madrigal, Javier; Díez, Carmen
2011-03-01
Prescribed burning is commonly used to prevent accumulation of biomass in fire-prone shrubland in NW Spain. However, there is a lack of knowledge about the efficacy of the technique in reducing fire hazard in these ecosystems. Fire hazard in burned shrubland areas will depend on the initial capacity of woody vegetation to recover and on the fine ground fuels existing after fire. To explore the effect that time since burning has on fire hazard, experimental tests were performed with two fuel complexes (fine ground fuels and regenerated shrubs) resulting from previous prescribed burnings conducted in a gorse shrubland (Ulex europaeus L.) one, three and five years earlier. A point-ignition source was used in burning experiments to assess ignition and initial propagation success separately for each fuel complex. The effect of wind speed was also studied for shrub fuels, and several flammability parameters were measured. Results showed that both ignition and initial propagation success of fine ground fuels mainly depended on fuel depth and were independent of time since burning, although flammability parameters indicated higher fire hazard three years after burning. In contrast, time since burning increased ignition and initial propagation success of regenerated shrub fuels, as well as the flammability parameters assessed, but wind speed had no significant effect. The combination of results of fire hazard for fine ground fuels and regenerated shrubs according to the variation in relative coverage of each fuel type after prescribed burning enabled an assessment of integrated fire hazard in treated areas. The present results suggest that prescribed burning is a very effective technique to reduce fire hazard in the study area, but that fire hazard will be significantly increased by the third year after burning. These results are valuable for fire prevention and fuel management planning in gorse shrubland areas. Copyright © 2010 Elsevier Ltd. All rights reserved.
Evaluation of Landslide Mapping Techniques and LiDAR-based Conditioning Factors
NASA Astrophysics Data System (ADS)
Mahalingam, R.; Olsen, M. J.
2014-12-01
Landslides are a major geohazard, which result in significant human, infrastructure, and economic losses. Landslide susceptibility mapping can help communities to plan and prepare for these damaging events. Mapping landslide-susceptible locations using GIS and remote sensing techniques has gained popularity over the past three decades. These efforts use a wide variety of procedures and consider a wide range of factors. Unfortunately, each study is often completed differently and independently of others. Further, the quality of the datasets used varies in terms of source, data collection, and generation, which can propagate errors or inconsistencies into the resulting output maps. Light detection and ranging (LiDAR) has proved to have higher accuracy in representing the continuous topographic surface, which can help minimize this uncertainty. The primary objectives of this paper are to investigate the applicability and performance of terrain factors in landslide hazard mapping, determine whether LiDAR-derived datasets (slope, slope roughness, terrain roughness, stream power index and compound topographic index) can be used for predictive mapping without data representing other common landslide conditioning factors, and evaluate the differences in landslide susceptibility mapping using widely used statistical approaches. The aforementioned factors were used to produce landslide susceptibility maps for a 140 km2 study area in northwest Oregon using six representative techniques: frequency ratio, weights of evidence, logistic regression, discriminant analysis, artificial neural network, and support vector machine. Most notably, the research showed an advantage in selecting fewer critical conditioning factors. The most reliable factors could all be derived from a single LiDAR DEM, reducing the need for laborious and costly data gathering. Most of the six techniques showed similar statistical results; however, the artificial neural network showed less accuracy for predictive mapping. Keywords: LiDAR, Landslides, Oregon, Inventory, Hazard
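One of the six susceptibility-mapping techniques named above, logistic regression on LiDAR-derived terrain factors, can be sketched as follows. The factor values and landslide labels are hypothetical; a real study would extract them per map cell from the DEM derivatives and a landslide inventory.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Rows = map cells; columns = slope, slope roughness, terrain roughness, SPI, CTI.
X = np.array([
    [32.0, 4.1, 5.2, 120.0, 6.3],
    [ 8.0, 1.2, 1.9,  15.0, 9.8],
    [27.5, 3.5, 4.4,  95.0, 7.1],
    [ 5.0, 0.8, 1.1,  10.0, 10.4],
    [22.0, 2.9, 3.8,  70.0, 7.6],
    [11.0, 1.5, 2.2,  25.0, 9.1],
])
y = np.array([1, 0, 1, 0, 1, 0])   # 1 = inventoried landslide cell, 0 = stable cell

model = LogisticRegression().fit(X, y)
susceptibility = model.predict_proba(X)[:, 1]   # landslide probability per cell
```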
Safety analysis, risk assessment, and risk acceptance criteria
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jamali, K.; Stack, D.W.; Sullivan, L.H.
1997-08-01
This paper discusses a number of topics that relate safety analysis as documented in the Department of Energy (DOE) safety analysis reports (SARs), probabilistic risk assessments (PRA) as characterized primarily in the context of the techniques that have assumed some level of formality in commercial nuclear power plant applications, and risk acceptance criteria as an outgrowth of PRA applications. DOE SARs of interest are those that are prepared for DOE facilities under DOE Order 5480.23 and the implementing guidance in DOE STD-3009-94. It must be noted that the primary area of application for DOE STD-3009 is existing DOE facilities and that certain modifications of the STD-3009 approach are necessary in SARs for new facilities. Moreover, it is the hazard analysis (HA) and accident analysis (AA) portions of these SARs that are relevant to the present discussions. Although PRAs can be qualitative in nature, PRA as used in this paper refers more generally to all quantitative risk assessments and their underlying methods. HA as used in this paper refers more generally to all qualitative risk assessments and their underlying methods that have been in use in hazardous facilities other than nuclear power plants. This discussion includes both quantitative and qualitative risk assessment methods. PRA has been used, improved, developed, and refined since the Reactor Safety Study (WASH-1400) was published in 1975 by the Nuclear Regulatory Commission (NRC). Much debate has ensued since WASH-1400 on exactly what the role of PRA should be in plant design, reactor licensing, 'ensuring' plant and process safety, and a large number of other decisions that must be made for potentially hazardous activities. Of particular interest in this area is whether the risks quantified using PRA should be compared with numerical risk acceptance criteria (RACs) to determine whether a facility is 'safe.' Use of RACs requires quantitative estimates of consequence frequency and magnitude.
Rauma, Matias; Isaksson, Tina S; Johanson, Gunnar
2006-10-01
Potential health hazards of dermal exposure, variability in reported dermal absorption rates and potential losses from the skin by evaporation indicate a need for a simple, inexpensive and standardized procedure to measure dermal absorption and desorption of chemical substances. The aim of this study was to explore the possibility to measure dermal absorption and desorption of volatile chemicals using a new gravimetric technique, namely thermal gravimetric analysis (TGA), and trypsinated stratum corneum from pig. Changes in skin weight were readily detected before, during and after exposure to vapours of water, 2-propanol, methanol and toluene. The shape and height of the weight curves differed between the four chemicals, reflecting differences in diffusivity and partial pressure and skin:air partitioning, respectively. As the skin weight is highly sensitive to the partial pressure of volatile chemicals, including water, this technique requires carefully controlled conditions with respect to air flow, temperature, chemical vapour generation and humidity. This new technique may help in the assessment of dermal uptake of volatile chemicals. Only a small piece of skin is needed and skin integrity is not necessary, facilitating the use of human samples. The high resolution weight-time curves obtained may also help to elucidate the characteristics of absorption, desorption and diffusion of chemicals in skin.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cronin, Kelly; Whyte, Catherine; Reiner, Tom
2008-07-08
Throughout the world there are hundreds of historic monuments and structures considered to be invaluable and irreplaceable. They are symbols of cultural identity and a means of educating people about history. Preservation of historic monuments and structures is therefore an important part of safeguarding these cultural heritage sites so that they retain their value for future generations. This report discusses a procedure for the investigation of seismic hazards in existing buildings and possible steps that can be taken to avoid damage caused by these hazards. The Augusta Airship Hangar, located in Sicily, is used as a case study; however, the topics addressed in this paper can be applied to other structures of historic value around the world. First, state-of-the-art scanning procedures were used to create scale digital models that were imported into a structural analysis program. Within this program, dynamic analyses were performed on the model based on actual ground motions recorded close to the site. These data were used to determine the period and mode shapes of the structure. Then a nonlinear analysis, including a static pushover analysis, was implemented on a two-dimensional model of the structural frame. From this analysis the failure mechanisms of the structure were revealed in relation to an allowable roof displacement. The structural integrity of the structure was evaluated based on pre-defined performance goals. Finally, multiple suggestions were made as to how the Augusta Airship Hangar might be repaired and strengthened so that this structure will not be destroyed should an earthquake occur. The results of our study show that historic structures, despite their age, can still be strong and ductile. There are also a multitude of effective preservation and retrofit techniques that can be used to strengthen these historic structures should an earthquake occur. Through this study, the Augusta Airship Hangar has proven to be not only a historic symbol for Sicily but also an example for the rehabilitation of other historic structures. The techniques and processes discussed in this paper can be applied to other historic reinforced concrete structures and can be expanded upon in future investigations.
New methods to monitor emerging chemicals in the drinking water production chain.
van Wezel, Annemarie; Mons, Margreet; van Delft, Wouter
2010-01-01
New techniques enable a shift in monitoring chemicals that affect water quality from mainly the end product, tap water, towards monitoring during the whole process along the production chain. This is congruent with the 'HACCP' system (hazard analysis and critical control points), which is fairly well integrated into food production but less well into drinking water production. This shift brings about more information about source quality, the efficiency of treatment and distribution, and understanding of processes within the production chain, and therefore can lead to a more pro-active management of drinking water production. At present, monitoring is focused neither on emerging chemicals, nor on detection of compounds with chronic toxicity. We discuss techniques to be used, detection limits compared to quality criteria, data interpretation and possible interventions in production.
Management of Combined Natural Risks - A New Approach: Keynote Address
NASA Astrophysics Data System (ADS)
Hanisch, Jörg
A new attempt is made to illustrate and quantify the relationships between individual natural hazards, their combinations, and human vulnerability to natural hazards. In many catastrophic events, combinations of different natural events substantially aggravate the consequences. Earthquakes are frequently associated with heavy landsliding (El Salvador 2001), and heavy rainstorms can trigger fast-running debris flows and not only floods (as during the Mitch disaster in Central America in 1998). This means that natural hazard maps should show the combinations of different hazards and their genetic relationships. To put this into effect, the individual hazards first have to be assessed and presented as hazard zones (0 to 3). These hazard zones are then overlain using GIS techniques. In this way, for example, an earthquake-prone area which coincides with an area susceptible to landslides (ranked 0 to 3 as well) can show hazard concentrations of up to a value of 6, simply by adding the individual hazard zones. To obtain the corresponding risk zones, the vulnerability maps of human settlements and infrastructure have to be overlain on the maps of these combinations of natural hazards.
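The additive overlay described above, summing hazard zones ranked 0-3 to a combined value of up to 6 and then bringing in vulnerability, reduces to simple raster arithmetic. The arrays below are hypothetical, and the final multiplication is only one possible way to express risk from combined hazard and vulnerability.

```python
import numpy as np

earthquake_zone = np.array([[3, 2], [1, 0]])   # hazard ranking 0-3 per cell
landslide_zone  = np.array([[3, 1], [2, 0]])   # hazard ranking 0-3 per cell

combined_hazard = earthquake_zone + landslide_zone   # hazard concentrations up to 6
vulnerability   = np.array([[2, 3], [1, 1]])         # hypothetical vulnerability ranking

risk = combined_hazard * vulnerability   # one simple combination into risk zones
```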
Analysis and design of randomised clinical trials involving competing risks endpoints.
Tai, Bee-Choo; Wee, Joseph; Machin, David
2011-05-19
In randomised clinical trials involving time-to-event outcomes, the failures concerned may be events of an entirely different nature and as such define a classical competing risks framework. In designing and analysing clinical trials involving such endpoints, it is important to account for the competing events, and evaluate how each contributes to the overall failure. An appropriate choice of statistical model is important for adequate determination of sample size. We describe how competing events may be summarised in such trials using cumulative incidence functions and Gray's test. The statistical modelling of competing events using proportional cause-specific and subdistribution hazard functions, and the corresponding procedures for sample size estimation are outlined. These are illustrated using data from a randomised clinical trial (SQNP01) of patients with advanced (non-metastatic) nasopharyngeal cancer. In this trial, treatment has no effect on the competing event of loco-regional recurrence. Thus the effects of treatment on the hazard of distant metastasis were similar via both the cause-specific (unadjusted csHR = 0.43, 95% CI 0.25 - 0.72) and subdistribution (unadjusted subHR 0.43; 95% CI 0.25 - 0.76) hazard analyses, in favour of concurrent chemo-radiotherapy followed by adjuvant chemotherapy. Adjusting for nodal status and tumour size did not alter the results. The results of the logrank test (p = 0.002) comparing the cause-specific hazards and the Gray's test (p = 0.003) comparing the cumulative incidences also led to the same conclusion. However, the subdistribution hazard analysis requires many more subjects than the cause-specific hazard analysis to detect the same magnitude of effect. The cause-specific hazard analysis is appropriate for analysing competing risks outcomes when treatment has no effect on the cause-specific hazard of the competing event. It requires fewer subjects than the subdistribution hazard analysis for a similar effect size. However, if the main and competing events are influenced in opposing directions by an intervention, a subdistribution hazard analysis may be warranted.
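A cause-specific hazard analysis of the kind compared above can be sketched with lifelines: the competing event is censored when fitting the Cox model, and the cumulative incidence of the event of interest is estimated with the Aalen-Johansen estimator. The data, coding and column names below are hypothetical, not the SQNP01 trial data.

```python
import pandas as pd
from lifelines import CoxPHFitter, AalenJohansenFitter

# Hypothetical data: event 1 = distant metastasis, 2 = loco-regional recurrence, 0 = censored.
df = pd.DataFrame({
    "time":      [14, 30, 8, 55, 21, 60, 11, 47],
    "event":     [1,  0,  2, 0,  1,  0,  2,  1],
    "treatment": [1,  1,  0, 1,  0,  1,  0,  0],   # 1 = chemo-radiotherapy + adjuvant chemo
})

# Cause-specific Cox model: the competing event is treated as censoring.
cs = df.assign(metastasis=(df["event"] == 1).astype(int))
cox = CoxPHFitter().fit(cs[["time", "metastasis", "treatment"]],
                        duration_col="time", event_col="metastasis")

# Cumulative incidence of metastasis, accounting for the competing event.
aj = AalenJohansenFitter().fit(df["time"], df["event"], event_of_interest=1)
```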
Using hazard functions to assess changes in processing capacity in an attentional cuing paradigm.
Wenger, Michael J; Gibson, Bradley S
2004-08-01
Processing capacity--defined as the relative ability to perform mental work in a unit of time--is a critical construct in cognitive psychology and is central to theories of visual attention. The unambiguous use of the construct, experimentally and theoretically, has been hindered by both conceptual confusions and the use of measures that are at best only coarsely mapped to the construct. However, more than 25 years ago, J. T. Townsend and F. G. Ashby (1978) suggested that the hazard function on the response time (RT) distribution offered a number of conceptual advantages as a measure of capacity. The present study suggests that a set of statistical techniques, well-known outside the cognitive and perceptual literatures, offers the ability to perform hypothesis tests on RT-distribution hazard functions. These techniques are introduced, and their use is illustrated in application to data from the contingent attentional capture paradigm.
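The hazard function referred to above is h(t) = f(t)/S(t) on the response-time distribution. A crude histogram-based estimate, using a hypothetical gamma-distributed RT sample rather than any experimental data, looks like this.

```python
import numpy as np

rng = np.random.default_rng(0)
rts = rng.gamma(shape=2.0, scale=150.0, size=1000)   # hypothetical response times (ms)

counts, edges = np.histogram(rts, bins=40)
width = np.diff(edges)
n = counts.sum()

density = counts / (n * width)                                        # f(t) per bin
survivor = 1.0 - np.concatenate(([0.0], np.cumsum(counts)[:-1])) / n  # S(t) at bin start
hazard = density / np.clip(survivor, 1e-9, None)                      # h(t) = f(t) / S(t)
```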
Mapping flood hazards under uncertainty through probabilistic flood inundation maps
NASA Astrophysics Data System (ADS)
Stephens, T.; Bledsoe, B. P.; Miller, A. J.; Lee, G.
2017-12-01
Changing precipitation, rapid urbanization, and population growth interact to create unprecedented challenges for flood mitigation and management. Standard methods for estimating risk from flood inundation maps generally involve simulations of floodplain hydraulics for an established regulatory discharge of specified frequency. Hydraulic model results are then geospatially mapped and depicted as a discrete boundary of flood extents and a binary representation of the probability of inundation (in or out) that is assumed constant over a project's lifetime. Consequently, existing methods utilized to define flood hazards and assess risk management are hindered by deterministic approaches that assume stationarity in a nonstationary world, failing to account for spatio-temporal variability of climate and land use as they translate to hydraulic models. This presentation outlines novel techniques for portraying flood hazards and the results of multiple flood inundation maps spanning hydroclimatic regions. Flood inundation maps generated through modeling of floodplain hydraulics are probabilistic reflecting uncertainty quantified through Monte-Carlo analyses of model inputs and parameters under current and future scenarios. The likelihood of inundation and range of variability in flood extents resulting from Monte-Carlo simulations are then compared with deterministic evaluations of flood hazards from current regulatory flood hazard maps. By facilitating alternative approaches of portraying flood hazards, the novel techniques described in this presentation can contribute to a shifting paradigm in flood management that acknowledges the inherent uncertainty in model estimates and the nonstationary behavior of land use and climate.
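The probabilistic mapping step described above amounts to summarising an ensemble of Monte-Carlo inundation runs cell by cell. The sketch below uses a synthetic ensemble of depth grids in place of real hydraulic-model output and treats any depth above a nominal threshold as inundated.

```python
import numpy as np

n_sims, rows, cols = 500, 100, 100
rng = np.random.default_rng(42)
depth = rng.gamma(shape=1.5, scale=0.3, size=(n_sims, rows, cols))   # synthetic depths (m)

threshold = 0.1   # nominal depth regarded as "inundated"
inundation_probability = (depth > threshold).mean(axis=0)   # fraction of runs per cell

median_depth = np.median(depth, axis=0)            # central hazard estimate
p5, p95 = np.percentile(depth, [5, 95], axis=0)    # uncertainty band per cell
```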
Development of Inspection and Investigation Techniques to Prepare Debris Flow in Urban Areas
NASA Astrophysics Data System (ADS)
Seong, Joo-Hyun; Jung, Min-Hyeong; Park, Kyung-Han; An, Jai-Wook; Kim, Jiseong
2017-04-01
Due to urban development, various facilities are located in mountainous areas near cities, and damage from debris flow occurrences is increasing in urban areas. However, quantitative inspection and investigation techniques for preparing for debris flows in urban areas are not sufficient around the world. Therefore, in this study we developed debris flow inspection and investigation techniques suitable for urban characteristics, covering soil hazard prevention and restoration in urban areas. First, the inspection and investigation system is divided into day-to-day conditions and the occurrence of a soil hazard event, and inspection/investigation flow charts were developed based on the kind of inspection and response required for each situation. The types of inspection applied in this study were daily inspection, regular inspection, special emergency inspection, damage emergency inspection, and in-depth safety inspection. The management agency, term of inspection, objects to be inspected, and contents of the inspection work were presented for each inspection type. The daily inspection routinely checks for signs of collapse and the condition of facilities in urban areas that are vulnerable to soil hazards, and is conducted by the management agency. In the case of regular inspection, a soil hazard expert regularly conducts detailed visual surveys of mountainous areas, steep slopes, prevention facilities, and adjacent facilities in vulnerable areas. Emergency inspections, on the other hand, are carried out in the event of the appearance of a vulnerable element or the occurrence of a soil hazard. Acknowledgement: This study was conducted with research fund support from the construction technology research project of the Ministry of Land, Infrastructure and Transport (project number 16SCIP-B069989-04).
NASA Astrophysics Data System (ADS)
Weatherill, Graeme; Garcia, Julio; Poggi, Valerio; Chen, Yen-Shin; Pagani, Marco
2016-04-01
The Global Earthquake Model (GEM) has, since its inception in 2009, made many contributions to the practice of seismic hazard modeling in different regions of the globe. The OpenQuake-engine (hereafter referred to simply as OpenQuake), GEM's open-source software for calculation of earthquake hazard and risk, has found application in many countries, spanning a diversity of tectonic environments. GEM itself has produced a database of national and regional seismic hazard models, harmonizing into OpenQuake's own definition the varied seismogenic sources found therein. The characterization of active faults in probabilistic seismic hazard analysis (PSHA) is at the centre of this process, motivating many of the developments in OpenQuake and presenting hazard modellers with the challenge of reconciling seismological, geological and geodetic information for the different regions of the world. Faced with these challenges, and from the experience gained in the process of harmonizing existing models of seismic hazard, four critical issues are addressed. The challenge GEM has faced in the development of software is how to define a representation of an active fault (both in terms of geometry and earthquake behaviour) that is sufficiently flexible to adapt to different tectonic conditions and levels of data completeness. By exploring the different fault typologies supported by OpenQuake we illustrate how seismic hazard calculations can, and do, take into account complexities such as geometrical irregularity of faults in the prediction of ground motion, highlighting some of the potential pitfalls and inconsistencies that can arise. This exploration leads to the second main challenge in active fault modeling, what elements of the fault source model impact most upon the hazard at a site, and when does this matter? Through a series of sensitivity studies we show how different configurations of fault geometry, and the corresponding characterisation of near-fault phenomena (including hanging wall and directivity effects) within modern ground motion prediction equations, can have an influence on the seismic hazard at a site. Yet we also illustrate the conditions under which these effects may be partially tempered when considering the full uncertainty in rupture behaviour within the fault system. The third challenge is the development of efficient means for representing both aleatory and epistemic uncertainties from active fault models in PSHA. In implementing state-of-the-art seismic hazard models into OpenQuake, such as those recently undertaken in California and Japan, new modeling techniques are needed that redefine how we treat interdependence of ruptures within the model (such as mutual exclusivity), and the propagation of uncertainties emerging from geology. Finally, we illustrate how OpenQuake, and GEM's additional toolkits for model preparation, can be applied to address long-standing issues in active fault modeling in PSHA. These include constraining the seismogenic coupling of a fault and the partitioning of seismic moment between the active fault surfaces and the surrounding seismogenic crust. We illustrate some of the possible roles that geodesy can play in the process, but highlight where this may introduce new uncertainties and potential biases into the seismic hazard process, and how these can be addressed.
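This is not OpenQuake code, but the Poisson relation underlying any PSHA hazard curve is worth recalling: the probability of exceeding a ground-motion level within an exposure time t follows from its annual rate of exceedance. The rates below are hypothetical.

```python
import numpy as np

annual_rate = np.array([0.01, 0.002, 0.0004])   # hypothetical exceedance rates per IM level
t_years = 50.0

prob_exceedance = 1.0 - np.exp(-annual_rate * t_years)
# e.g. a rate of 0.002/yr gives roughly a 10% probability of exceedance in 50 years
```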
14 CFR 417.223 - Flight hazard area analysis.
Code of Federal Regulations, 2012 CFR
2012-01-01
... 14 Aeronautics and Space 4 2012-01-01 2012-01-01 false Flight hazard area analysis. 417.223 Section 417.223 Aeronautics and Space COMMERCIAL SPACE TRANSPORTATION, FEDERAL AVIATION ADMINISTRATION... to control the risk to the public from debris impact hazards. The risk management requirements of...
14 CFR 417.223 - Flight hazard area analysis.
Code of Federal Regulations, 2014 CFR
2014-01-01
... 14 Aeronautics and Space 4 2014-01-01 2014-01-01 false Flight hazard area analysis. 417.223 Section 417.223 Aeronautics and Space COMMERCIAL SPACE TRANSPORTATION, FEDERAL AVIATION ADMINISTRATION... to control the risk to the public from debris impact hazards. The risk management requirements of...
14 CFR 417.223 - Flight hazard area analysis.
Code of Federal Regulations, 2011 CFR
2011-01-01
... 14 Aeronautics and Space 4 2011-01-01 2011-01-01 false Flight hazard area analysis. 417.223 Section 417.223 Aeronautics and Space COMMERCIAL SPACE TRANSPORTATION, FEDERAL AVIATION ADMINISTRATION... to control the risk to the public from debris impact hazards. The risk management requirements of...
14 CFR 417.223 - Flight hazard area analysis.
Code of Federal Regulations, 2013 CFR
2013-01-01
... 14 Aeronautics and Space 4 2013-01-01 2013-01-01 false Flight hazard area analysis. 417.223 Section 417.223 Aeronautics and Space COMMERCIAL SPACE TRANSPORTATION, FEDERAL AVIATION ADMINISTRATION... to control the risk to the public from debris impact hazards. The risk management requirements of...
DOT National Transportation Integrated Search
1988-05-01
The report is devoted to the review and discussion of generic hazards associated with the ground, launch, orbital and re-entry phases of space operations. Since the DOT Office of Commercial Space Transportation (OCST) has been charged with protecting...
9 CFR 417.4 - Validation, Verification, Reassessment.
Code of Federal Regulations, 2012 CFR
2012-01-01
... not have a HACCP plan because a hazard analysis has revealed no food safety hazards that are.... 417.4 Section 417.4 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF... ACT HAZARD ANALYSIS AND CRITICAL CONTROL POINT (HACCP) SYSTEMS § 417.4 Validation, Verification...
9 CFR 417.4 - Validation, Verification, Reassessment.
Code of Federal Regulations, 2010 CFR
2010-01-01
... not have a HACCP plan because a hazard analysis has revealed no food safety hazards that are.... 417.4 Section 417.4 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF... ACT HAZARD ANALYSIS AND CRITICAL CONTROL POINT (HACCP) SYSTEMS § 417.4 Validation, Verification...
9 CFR 417.4 - Validation, Verification, Reassessment.
Code of Federal Regulations, 2011 CFR
2011-01-01
... not have a HACCP plan because a hazard analysis has revealed no food safety hazards that are.... 417.4 Section 417.4 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF... ACT HAZARD ANALYSIS AND CRITICAL CONTROL POINT (HACCP) SYSTEMS § 417.4 Validation, Verification...
14 CFR 417.223 - Flight hazard area analysis.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 14 Aeronautics and Space 4 2010-01-01 2010-01-01 false Flight hazard area analysis. 417.223 Section 417.223 Aeronautics and Space COMMERCIAL SPACE TRANSPORTATION, FEDERAL AVIATION ADMINISTRATION... to control the risk to the public from debris impact hazards. The risk management requirements of...
PRO-Elicere: A Study for Create a New Process of Dependability Analysis of Space Computer Systems
NASA Astrophysics Data System (ADS)
da Silva, Glauco; Netto Lahoz, Carlos Henrique
2013-09-01
This paper presents a new approach to computer system dependability analysis, called PRO-ELICERE, which introduces data mining concepts and intelligent decision-support mechanisms to analyze the potential hazards and failures of a critical computer system. It also presents some techniques and tools that support traditional dependability analysis and briefly discusses the concept of knowledge discovery and intelligent databases for critical computer systems. It then introduces the PRO-ELICERE process, an intelligent approach to automating ELICERE, a process created to extract non-functional requirements for critical computer systems. PRO-ELICERE can be used in V&V activities in projects of the Institute of Aeronautics and Space, such as the Brazilian Satellite Launcher (VLS-1).
Sarkar, Gunjan; Saha, Nayan Ranjan; Roy, Indranil; Bhattacharyya, Amartya; Bose, Madhura; Mishra, Roshnara; Rana, Dipak; Bhattacharjee, Debashis; Chattopadhyay, Dipankar
2014-05-01
The aim of this work is to examine the effectiveness of a mucilage/hydroxypropylmethylcellulose (HPMC) based transdermal patch (matrix type) as a drug delivery device. We successfully extracted mucilage from Colocasia esculenta (Taro) corms and prepared diltiazem hydrochloride-incorporated mucilage/HPMC based transdermal patches with various wt% of mucilage by the solvent evaporation technique. Characterization of both the mucilage and the transdermal patches was done by several techniques, such as Molisch's test and organoleptic evaluation of the mucilage, and mechanical, morphological and thermal analysis of the transdermal patches. A skin irritation test on hairless albino rat skin showed that the transdermal patches are apparently free of potentially hazardous skin irritation. Fourier transform infrared analysis shows that there is no interaction between the drug, mucilage and HPMC, while scanning electron microscopy shows the surface morphology of the transdermal patches. The in vitro drug release time of the mucilage-HPMC based transdermal patches is prolonged with increasing mucilage concentration in the formulation. Copyright © 2014 Elsevier B.V. All rights reserved.
Abbas, Aamir; Ihsanullah; Al-Baghli, Nadhir A. H.
2017-01-01
Multiwall carbon nanotubes (CNTs) and iron oxide impregnated carbon nanotubes (CNTs-iron oxide) were investigated for the adsorption of hazardous toluene and paraxylene (p-xylene) from aqueous solution. Pure CNTs were impregnated with iron oxide nanoparticles using a wet impregnation technique. Various characterization techniques, including thermogravimetric analysis, scanning electron microscopy, elemental dispersion spectroscopy, X-ray diffraction, and nitrogen adsorption analysis, were used to study the thermal degradation, surface morphology, purity, and surface area of the materials. Batch adsorption experiments show that iron oxide impregnated CNTs achieve a higher degree of removal of p-xylene (90%) than of toluene (70%) for a soaking time of 2 h, an initial pollutant concentration of 100 ppm, pH 6, and a shaking speed of 200 rpm at 25°C. The pseudo-second-order model provides a better fit for the toluene and p-xylene adsorption. The Langmuir and Freundlich isotherm models demonstrate good fits to the adsorption data for toluene and p-xylene. PMID:28386208
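One of the two isotherm models named above, the Langmuir isotherm qe = qmax·KL·Ce/(1 + KL·Ce), can be fitted to equilibrium data by nonlinear least squares. The concentration and uptake values below are hypothetical, not the measured toluene or p-xylene data.

```python
import numpy as np
from scipy.optimize import curve_fit

def langmuir(ce, q_max, k_l):
    """Langmuir isotherm: qe = q_max * K_L * Ce / (1 + K_L * Ce)."""
    return q_max * k_l * ce / (1.0 + k_l * ce)

ce = np.array([5.0, 10.0, 25.0, 50.0, 100.0])   # equilibrium concentration (mg/L)
qe = np.array([8.2, 14.5, 24.9, 33.1, 39.6])    # adsorbed amount (mg/g)

(q_max, k_l), _ = curve_fit(langmuir, ce, qe, p0=[40.0, 0.05])
print(f"q_max = {q_max:.1f} mg/g, K_L = {k_l:.3f} L/mg")
```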
Khan, F I; Abbasi, S A
2000-07-10
Fault tree analysis (FTA) is based on constructing a hypothetical tree of base events (initiating events) branching into numerous other sub-events, propagating the fault and eventually leading to the top event (accident). It has been a powerful technique used traditionally in identifying hazards in nuclear installations and power industries. As the systematic articulation of the fault tree is associated with assigning probabilities to each fault, the exercise is also sometimes called probabilistic risk assessment. But powerful as this technique is, it is also very cumbersome and costly, limiting its area of application. We have developed a new algorithm based on analytical simulation (named AS-II), which makes the application of FTA simpler, quicker, and cheaper, thus opening up the possibility of its wider use in risk assessment in chemical process industries. Based on this methodology, we have developed a computer-automated tool. The details are presented in this paper.
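The quantitative side of FTA mentioned above, propagating basic-event probabilities up through AND/OR gates to the top event, reduces to simple probability algebra when the events are independent. The gate structure and probabilities below are hypothetical and are not the AS-II algorithm itself.

```python
def or_gate(*probs):
    """Probability that at least one independent input event occurs."""
    q = 1.0
    for p in probs:
        q *= (1.0 - p)
    return 1.0 - q

def and_gate(*probs):
    """Probability that all independent input events occur."""
    q = 1.0
    for p in probs:
        q *= p
    return q

# Hypothetical tree: top event = relief valve failure OR (sensor fault AND operator error).
p_top = or_gate(1e-3, and_gate(5e-3, 2e-2))
print(f"Top-event probability per demand: {p_top:.2e}")
```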
NASA Technical Reports Server (NTRS)
Shortle, John F.; Allocco, Michael
2005-01-01
This paper describes a scenario-driven hazard analysis process to identify, eliminate, and control safety-related risks. Within this process, we develop selective criteria to determine the applicability of applying engineering modeling to hypothesized hazard scenarios. This provides a basis for evaluating and prioritizing the scenarios as candidates for further quantitative analysis. We have applied this methodology to proposed concepts of operations for reduced wake separation for closely spaced parallel runways. For arrivals, the process identified 43 core hazard scenarios. Of these, we classified 12 as appropriate for further quantitative modeling, 24 that should be mitigated through controls, recommendations, and / or procedures (that is, scenarios not appropriate for quantitative modeling), and 7 that have the lowest priority for further analysis.
Asilomar Decision: Unprecedented Guidelines for Gene-Transplant Research
ERIC Educational Resources Information Center
Science News, 1975
1975-01-01
The hazards posed by new techniques of genetic manipulation have prompted scientists to regulate and in some cases restrict their own basic investigations. Describes some possible applications of the new techniques and outlines the established research guidelines. (GS)
FIELD STUDIES OF GEOMEMBRANE INSTALLATION TECHNIQUES
Fourteen construction sites where geomembranes were being installed were visited to observe subgrade preparation and liner installation techniques. These sites were visited during a study conducted for the U.S. EPA, Solid and Hazardous Waste Research Division. The sites included ...
A fluvial and pluvial probabilistic flood hazard analysis for Can Tho city, Vietnam
NASA Astrophysics Data System (ADS)
Apel, Heiko; Martinez, Oriol; Thi Chinh, Do; Viet Dung, Nguyen
2014-05-01
Can Tho city is the largest city and the economic heart of the Mekong Delta, Vietnam. Due to its economic importance and envisaged development goals, the city grew rapidly in population size and extent over the last two decades. Large parts of the city are located in flood-prone areas, and the central parts of the city have recently experienced an increasing number of flood events, both of fluvial and pluvial nature. As the economic power and asset values are constantly increasing, this poses a considerable risk for the city. The aim of this study is to perform a flood hazard analysis considering both fluvial and pluvial floods and to derive probabilistic flood hazard maps. This requires, as a first step, an understanding of the typical flood mechanisms. Fluvial floods are triggered by a coincidence of high water levels during the annual flood period in the Mekong Delta with high tidal levels, which in combination cause short-term inundations in Can Tho. Pluvial floods are triggered by typical tropical convective rain storms during the monsoon season. These two flood pathways are essentially independent in their sources and can thus be treated in the hazard analysis accordingly. For the fluvial hazard analysis we propose a bivariate frequency analysis of the Mekong flood characteristics, the annual maximum flood discharge Q and the annual flood volume V at the upper boundary of the Mekong Delta, the gauging station Kratie. This defines probabilities of exceedance of different Q-V pairs, which are transferred into synthetic flood hydrographs. The synthetic hydrographs are routed through a quasi-2D hydrodynamic model of the entire Mekong Delta in order to provide boundary conditions for a detailed hazard mapping of Can Tho. This downscaling step is necessary because the huge complexity of the river and channel network does not allow for a proper definition of boundary conditions for Can Tho city by gauge data alone. In addition, the available gauge data around Can Tho are too short for a meaningful frequency analysis. The detailed hazard mapping is performed with a 2D hydrodynamic model of Can Tho city. As the scenarios are derived in a Monte-Carlo framework, the final flood hazard maps are probabilistic, i.e. they show the median flood hazard along with uncertainty estimates for each defined level of probability of exceedance. For the pluvial flood hazard, a frequency analysis of the hourly rain gauge data of Can Tho is performed implementing a peak-over-threshold procedure. Based on this frequency analysis, synthetic rain storms are generated in a Monte-Carlo framework for the same probabilities of exceedance as in the fluvial flood hazard analysis. Probabilistic flood hazard maps were then generated with the same 2D hydrodynamic model of the city. In a last step, the fluvial and pluvial scenarios are combined assuming independence of the events. These scenarios were also transferred into hazard maps by the 2D hydrodynamic model, finally yielding combined fluvial-pluvial probabilistic flood hazard maps for Can Tho. The derived set of maps may be used for improved city planning or flood risk analysis.
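The peak-over-threshold step described for the pluvial hazard can be sketched by fitting a generalized Pareto distribution to exceedances of a high threshold. The hourly rainfall series below is synthetic; a real analysis would also decluster the series and choose the threshold more carefully.

```python
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(1)
hourly_rain = rng.exponential(scale=2.0, size=20 * 365 * 24)   # synthetic series (mm/h, ~20 yr)

threshold = np.quantile(hourly_rain, 0.99)
excesses = hourly_rain[hourly_rain > threshold] - threshold

shape, loc, scale = genpareto.fit(excesses, floc=0.0)
# Return levels for the target exceedance probabilities can then be read from the fitted GPD.
```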
DOE Office of Scientific and Technical Information (OSTI.GOV)
Perry, F.V.; Valentine, G.A.; Crowe, B.M.
This is the final report of a one-year, Laboratory Directed Research and Development (LDRD) project at Los Alamos National Laboratory (LANL). The objective of this project was to determine whether isotopic techniques can be used to assess the eruption potential and eruption volume of continental stratovolcanoes. Large-volume eruptions from stratovolcanoes pose significant hazards to population and infrastructure in many parts of the world. We are testing whether this technique will allow a short- to medium-term (decades to millennia) probabilistic hazard assessment of large-volume eruptions. If successful, the technique will be useful to countries or regions that must consider medium- to long-term volcanic hazards (e.g., for nuclear waste facilities). We have begun sample acquisition and isotopic measurements at two stratovolcanoes, Pico de Orizaba in eastern Mexico and Daisen in western Japan.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tucker, Brian J.; Diaz, Aaron A.; Eckenrode, Brian A.
2006-03-16
The Hazardous Materials Response Unit (HMRU) and the Counterterrorism and Forensic Science Research Unit (CTFSRU), Laboratory Division, Federal Bureau of Investigation (FBI) have been mandated to develop and establish a wide range of unprecedented capabilities for providing scientific and technical forensic services to investigations involving hazardous chemical, biological, and radiological materials, including extremely dangerous chemical and biological warfare agents. Pacific Northwest National Laboratory (PNNL) has developed a portable, hand-held, hazardous materials acoustic inspection device (HAZAID) that provides noninvasive container interrogation and material identification capabilities using nondestructive ultrasonic velocity and attenuation measurements. Due to the wide variety of fluids as well as container sizes and materials, the need for high measurement sensitivity and advanced ultrasonic measurement techniques was identified. The HAZAID prototype was developed using a versatile electronics platform, advanced ultrasonic wave propagation methods, and advanced signal processing techniques. This paper primarily focuses on the ultrasonic measurement methods and signal processing techniques incorporated into the HAZAID prototype. High bandwidth ultrasonic transducers combined with the advanced pulse compression technique allowed researchers to 1) impart large amounts of energy, 2) obtain high signal-to-noise ratios, and 3) obtain accurate and consistent time-of-flight (TOF) measurements through a variety of highly attenuative containers and fluid media. Results of this feasibility study demonstrated that the HAZAID experimental measurement technique also provided information regarding container properties, which will be utilized in future container-independent measurements of hidden liquids.
Large-bowel surgery, 1979: self-assessment.
Matheson, N A; Valerio, D
1980-01-01
Evidence of wide variability in the immediate results of large-bowel surgery stimulated self-assessment during 1979. The hazards of large-bowel surgery can usually be avoided by good bowel preparation, sound anastomotic technique, primary resection in large bowel emergencies, avoidance of anastomosis when hazardous, and antibiotic lavage for extant or potential peritoneal and wound contamination. PMID:7427416
Meta-analysis of individual registry results enhances international registry collaboration.
Paxton, Elizabeth W; Mohaddes, Maziar; Laaksonen, Inari; Lorimer, Michelle; Graves, Stephen E; Malchau, Henrik; Namba, Robert S; Kärrholm, John; Rolfson, Ola; Cafri, Guy
2018-03-28
Background and purpose - Although common in medical research, meta-analysis has not been widely adopted in registry collaborations. A meta-analytic approach in which each registry conducts a standardized analysis on its own data followed by a meta-analysis to calculate a weighted average of the estimates allows collaboration without sharing patient-level data. The value of meta-analysis as an alternative to individual patient data analysis is illustrated in this study by comparing the risk of revision of porous tantalum cups versus other uncemented cups in primary total hip arthroplasties from Sweden, Australia, and a US registry (2003-2015). Patients and methods - For both individual patient data analysis and meta-analysis approaches a Cox proportional hazard model was fit for time to revision, comparing porous tantalum (n = 23,201) with other uncemented cups (n = 128,321). Covariates included age, sex, diagnosis, head size, and stem fixation. In the meta-analysis approach, treatment effect size (i.e., Cox model hazard ratio) was calculated within each registry and a weighted average for the individual registries' estimates was calculated. Results - Patient-level data analysis and meta-analytic approaches yielded the same results with the porous tantalum cups having a higher risk of revision than other uncemented cups (HR (95% CI) 1.6 (1.4-1.7) and HR (95% CI) 1.5 (1.4-1.7), respectively). Adding the US cohort to the meta-analysis led to greater generalizability, increased precision of the treatment effect, and similar findings (HR (95% CI) 1.6 (1.4-1.7)) with increased risk of porous tantalum cups. Interpretation - The meta-analytic technique is a viable option to address privacy, security, and data ownership concerns allowing more expansive registry collaboration, greater generalizability, and increased precision of treatment effects.
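The weighted-average step described above is typically an inverse-variance (fixed-effect) combination of the registry-level log hazard ratios. The sketch below uses hypothetical per-registry estimates, not the published registry results.

```python
import numpy as np

hr      = np.array([1.55, 1.62, 1.48])   # hypothetical registry-level hazard ratios
ci_low  = np.array([1.35, 1.38, 1.20])
ci_high = np.array([1.78, 1.90, 1.83])

log_hr = np.log(hr)
se = (np.log(ci_high) - np.log(ci_low)) / (2 * 1.96)   # SE recovered from the 95% CI
w = 1.0 / se**2                                        # inverse-variance weights

pooled_log_hr = np.sum(w * log_hr) / np.sum(w)
pooled_se = np.sqrt(1.0 / np.sum(w))
pooled_hr = np.exp(pooled_log_hr)
pooled_ci = np.exp(pooled_log_hr + np.array([-1.96, 1.96]) * pooled_se)
```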
NASA Astrophysics Data System (ADS)
Hung, Hung-Chih; Wu, Ju-Yu; Hung, Chih-Hsuan
2017-04-01
1. Background The Asia-Pacific region is one of the areas of the world most vulnerable to climate-related hazards and extremes due to rapid urbanization and over-development in hazard-prone areas. It is thus increasingly recognized that the management of land use and the reduction of hazard risk are inextricably linked. This is especially critical from the perspective of integrated river basin management. A range of studies has targeted existing vulnerability assessments. However, limited attention has been paid to the cumulative effects of multiple vulnerability factors and their dynamics faced by local communities. This study proposes a novel methodology to assess changing cumulative vulnerability to climate-related hazards, and to examine the relationship between the attraction factors relevant to the general process of urbanization and vulnerability change, with a focus on a river basin management unit. 2. Methods and data The methods applied in this study comprise three steps. First, using the Intergovernmental Panel on Climate Change (IPCC) approach, a Cumulative Vulnerability Assessment Framework (CVAF) is built with the goal of characterizing and comparing vulnerability to climate-related hazards within river basin regions based on a composite of multiple indicators. We organize these indicator metrics into three categories: (1) hazard exposure; (2) socioeconomic sensitivity; and (3) adaptive capacity. Second, the CVAF is applied by combining a geographic information system (GIS)-based spatial statistics technique with a multicriteria decision analysis (MCDA) to assess and map the changing cumulative vulnerability, comparing conditions in 1996 and 2006 in the Danshui River Basin, Taiwan. Third, to examine the factors affecting vulnerability change, we develop a Vulnerability Changing Model (VCM) using four attraction factors to reflect how the process of urban development leads to changes in vulnerability. The factors are transport networks, land uses, production values of industries, and infrastructures. We then conduct a regression analysis to test the VCM. To illustrate the proposed methodology, the data are collected from the National Science and Technology Center for Disaster Reduction, Taiwan, as well as the National Land Use Investigation and official census statistics. 3. Results and policy implications Results of the CVAF analysis demonstrate heterogeneous patterns of vulnerability in the region and highlight trends of long-term change. The vulnerable areas unfold as clustered patterns and spatial analogues across regions, rather than being randomly distributed. The highest cumulative vulnerability is concentrated in densely populated and downstream reaches (such as Taipei City) of the Danshui River in both time periods. The VCM indicates that upstream and more remote areas generally show low vulnerability, although increases are observed in some areas between 1996 and 2006 due to land use intensification and industrial and infrastructure expansion. These findings suggest that land use planning should consider the socioeconomic progression and infrastructure investment factors that contribute to urban sprawl and should address current as well as future urban developments vulnerable to hazard risk transmission. The cumulative vulnerability assessment, mapping methods and modelling presented here can be applied to other climate change and hazard risks to highlight priority areas for further investigation and to contribute towards improving river basin management.
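A common MCDA building block for the kind of cumulative vulnerability index described above is a weighted sum of min-max normalised indicators. The indicator values and weights below are hypothetical and stand in for the exposure, sensitivity and (inverted) adaptive-capacity metrics.

```python
import numpy as np

# Rows = spatial units; columns = exposure, sensitivity, inverted adaptive capacity.
indicators = np.array([
    [0.8, 0.6, 0.7],
    [0.3, 0.2, 0.4],
    [0.5, 0.9, 0.6],
])
weights = np.array([0.4, 0.3, 0.3])   # hypothetical weights summing to 1

mins, maxs = indicators.min(axis=0), indicators.max(axis=0)
normalised = (indicators - mins) / (maxs - mins)

cumulative_vulnerability = normalised @ weights   # higher value = more vulnerable unit
```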
Federal Register 2010, 2011, 2012, 2013, 2014
2010-03-25
... establishment's process control plans, that is, its Hazard Analysis and Critical Control Point plans. DATES... control plans, i.e., its Hazard Analysis and Critical Control Point (HACCP) plans; and (3) make the recall... systematic prevention of biological, chemical, and physical hazards. HACCP plans are establishment-developed...
21 CFR 120.7 - Hazard analysis.
Code of Federal Regulations, 2010 CFR
2010-04-01
... 21 Food and Drugs 2 2010-04-01 2010-04-01 false Hazard analysis. 120.7 Section 120.7 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) FOOD FOR HUMAN... determine whether there are food hazards that are reasonably likely to occur for each type of juice...
Zahabi, Maryam; Kaber, David B; Swangnetr, Manida
2015-08-01
The objectives of this study were to (a) review electronic medical record (EMR) and related electronic health record (EHR) interface usability issues, (b) review how EMRs have been evaluated with safety analysis techniques along with any hazard recognition, and (c) formulate design guidelines and a concept for enhanced EMR interfaces with a focus on diagnosis and documentation processes. A major impact of information technology in health care has been the introduction of EMRs. Although numerous studies indicate that use of EMRs increases health care quality, concerns remain about usability and safety. A literature search was conducted using the Compendex, PubMed, CINAHL, and Web of Science databases to find EMR research published since 2000. Inclusion criteria included relevant English-language papers with subsets of keywords and any studies (manually) identified with a focus on EMR usability. Fifty studies met the inclusion criteria. Results revealed that EMR and EHR usability problems include violations of natural dialog, control consistency, effective use of language, effective information presentation, and customization principles, as well as a lack of error prevention, minimization of cognitive load, and feedback. Studies focusing on EMR system safety made no objective assessments and applied only inductive reasoning methods for hazard recognition. On the basis of the identified usability problems and the structure of safety analysis techniques, we provide EMR design guidelines and a design concept focused on the diagnosis process and documentation. The design guidelines and new interface concept can be used for prototyping and testing enhanced EMRs. © 2015, Human Factors and Ergonomics Society.
Agyei-Baffour, Peter; Sekyere, Kofi Boateng; Addy, Ernestine Akosua
2013-11-04
Foodborne diseases claim many lives and are a growing public health concern. Simple preventive techniques such as adoption of and adherence to hazard analysis and critical control point (HACCP) policy can significantly reduce this disease burden. Though food screening and inspection are done, the ultimate regulation, Hazard Analysis and Critical Control Point, which is known and accepted worldwide, appears not to be popular among food operators in Ghana. This paper examines the level of awareness of the existence of a policy on hazard analysis and critical control point (HACCP) and adherence to food preparation guidelines among food service providers in Ghana. The results revealed the mean age of food providers as 33.1 years with a standard deviation of 7.5 and a range of 18-55 years; most were female, in full-time employment, and with basic education. Of the fifty institutional managers, 42 (84%) were senior officers and had worked for more than five years. Education and type of food operator had a strong, statistically significant relationship with the implementation of HACCP policy and adherence to food preparation guidelines. The enforcement of HACCP policy and adherence to food safety guidelines was led by the Ghana Tourist Board, Public Health officers, and KMA, respectively. While a majority of food operators, 373/450 (83.3%), did not know that HACCP policy is part of food safety guidelines, staff of food safety law enforcement, 44/50 (88%), confirmed knowing that food operators were not aware of the HACCP policy. The study documents evidence on the practice of food safety principles, HACCP policy, and adherence to food preparation guidelines. Existing food safety guidelines incorporate varying principles of HACCP; however, awareness is low among food operators. The implication is that food production is likely to fall short of acceptable standards and not be wholesome, putting consumers at health risk. Repeating this study in rural and urban areas in Ghana is necessary to provide much more evidence to inform food safety guidelines. Further studies on the chemical analysis of food and the implementation of training modules on HACCP policy for food producers and law enforcement agencies may help improve the existing situation.
A comparison of water quality criteria for the Great Lakes based on human and wildlife health
Ludwig, James P.; Giesy, John P.; Summer, Cheryl L.; Bowerman, William; Aulerich, Richard J.; Bursian, Steven J.; Auman, Heidi J.; Jones, Paul D.; Williams, Lisa L.; Tillitt, Donald E.; Gilbertson, Michael
1993-01-01
Water quality criteria (WQC) can be derived in several ways. The usual techniques involve hazard and risk assessment procedures. For non-persistent, non-biomagnified compounds and elements, WQC are experimentally derived from their acute and chronic toxicity to aquatic organisms. For those persistent chlorinated hydrocarbons (PCHs) that are bioaccumulated and biomagnified, these traditional techniques have not been effective, partly because effects higher in the food web were not considered. Polychlorinated biphenyls (PCBs) are the bioaccumulative synthetic chemicals of primary toxicological significance to the Great Lakes biota and have caused widespread injury to wildlife. In the Laurentian Great Lakes, the primary emphasis of hazard assessments has been on the potential for adverse effects in humans who eat fish. The primary regulatory endpoint of traditional hazard and risk assessments underlying current WQC is the probability of additional cancers occurring in the human population. The analysis presented here indicates that this is not adequate to restore sensitive wildlife species that are highly exposed to PCBs, especially those that have suffered serious population declines. Because WQC are legal instruments, the methods of deriving WQC have large implications for remediation, litigation, and damage assessments. Here WQC are derived for six species based on the responses of wildlife in the field or produced by feeding fish to surrogate species, rather than projecting a potential for increased cancer rates in humans. If the most sensitive wildlife species are restored and protected for very sensitive reproductive endpoints, then all components of the ecosystem, including human health, should be more adequately protected. The management of Great Lakes wildlife requires an understanding of the injury and causal relationships to persistent toxic substances.
NASA Astrophysics Data System (ADS)
Morris, Phillip A.
The prevalence of low-cost side scanning sonar systems mounted on small recreational vessels has created improved opportunities to identify and map submerged navigational hazards in freshwater impoundments. However, these economical sensors also present unique challenges for automated techniques. This research explores related literature in automated sonar imagery processing and mapping technology, proposes and implements a framework derived from these sources, and evaluates the approach with video collected from a recreational grade sonar system. Image analysis techniques including optical character recognition and an unsupervised computer automated detection (CAD) algorithm are employed to extract the transducer GPS coordinates and slant range distance of objects protruding from the lake bottom. The retrieved information is formatted for inclusion into a spatial mapping model. Specific attributes of the sonar sensors are modeled such that probability profiles may be projected onto a three dimensional gridded map. These profiles are computed from multiple points of view as sonar traces crisscross or come near each other. As lake levels fluctuate over time so do the elevation points of view. With each sonar record, the probability of a hazard existing at certain elevations at the respective grid points is updated with Bayesian mechanics. As reinforcing data is collected, the confidence of the map improves. Given a lake's current elevation and a vessel draft, a final generated map can identify areas of the lake that have a high probability of containing hazards that threaten navigation. The approach is implemented in C/C++ utilizing OpenCV, Tesseract OCR, and QGIS open source software and evaluated in a designated test area at Lake Lavon, Collin County, Texas.
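A minimal sketch of the Bayesian grid update described above, assuming hypothetical detection and false-alarm probabilities for the sonar (the dissertation's calibrated sensor model is not reproduced here):

    import numpy as np

    # Hypothetical sensor model (illustrative values only)
    P_DETECT_GIVEN_HAZARD = 0.7   # probability the sonar flags a cell that holds a hazard
    P_DETECT_GIVEN_CLEAR  = 0.1   # false-alarm probability for a clear cell

    def bayes_update(prior, detected):
        """Update P(hazard) for one grid cell after a single sonar observation."""
        if detected:
            like_h, like_c = P_DETECT_GIVEN_HAZARD, P_DETECT_GIVEN_CLEAR
        else:
            like_h, like_c = 1 - P_DETECT_GIVEN_HAZARD, 1 - P_DETECT_GIVEN_CLEAR
        return like_h * prior / (like_h * prior + like_c * (1 - prior))

    # Reinforcing detections from crisscrossing passes raise confidence in the cell
    p = 0.05                      # uninformative prior for the cell
    for obs in [True, True, False, True]:
        p = bayes_update(p, obs)
        print(round(p, 3))

Each sonar trace that crosses the same cell at a different elevation point of view contributes one more update of this form, which is how repeated passes build confidence in the final hazard map.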
DOE Office of Scientific and Technical Information (OSTI.GOV)
Robertson, J.D.; Blanchard, L.J.; Srikantapura, S.
1996-12-31
The minor- and trace-element content of coal is of great interest because of the potentially hazardous impact on human health and the environment resulting from their release during coal combustion. Of the one billion tons of coal mined annually in the United States, 85-90% is consumed by coal-fired power plants. Potentially toxic elements present at concentrations as low as a few μg/g can be released in large quantities from combustion of this magnitude. Of special concern are those trace elements that occur naturally in coal which have been designated as potential hazardous air pollutants (HAPs) in the 1990 Amendments to the Clean Air Act. The principal objective of this work was to investigate a combination of physical and chemical coal cleaning techniques to remove 90 percent of HAP trace elements at 90 percent combustibles recovery from Kentucky No. 9 coal. Samples of this coal were first subjected to physical separation by flotation in a Denver cell. The float fraction from the Denver cell was then used as feed material for hydrothermal leaching tests in which the efficacy of dilute alkali (NaOH) and acid (HNO3) solutions at various temperatures and pressures was investigated. The combined column flotation and mild chemical cleaning strategy removed 60-80% of trace elements with greater than 85% recovery of combustibles from very finely ground (-325 mesh) coal. The elemental composition of the samples generated at each stage was determined using particle induced X-ray emission (PIXE) analysis. PIXE is a rapid, instrumental technique that, in principle, is capable of analyzing all elements from sodium through uranium with sensitivities as low as 1 μg/g.
Flood hazard studies in Central Texas using orbital and suborbital remote sensing imagery
NASA Technical Reports Server (NTRS)
Baker, V. R.; Holz, R. K.; Patton, P. C.
1975-01-01
Central Texas is subject to infrequent, unusually intense rainstorms which cause extremely rapid runoff from drainage basins developed on the deeply dissected limestone and marl bedrock of the Edwards Plateau. One approach to flood hazard evaluation in this area is a parametric model relating flood hydrograph characteristics to quantitative geomorphic properties of the drainage basins. The preliminary model uses multiple regression techniques to predict potential peak flood discharge from basin magnitude, drainage density, and ruggedness number. After mapping small catchment networks from remote sensing imagery, input data for the model are generated by network digitization and analysis by a computer assisted routine of watershed analysis. The study evaluated the network resolution capabilities of the following data formats: (1) large-scale (1:24,000) topographic maps, employing Strahler's "method of v's," (2) standard low altitude black and white aerial photography (1:13,000 and 1:20,000 scales), (3) NASA - generated aerial infrared photography at scales ranging from 1:48,000 to 1:123,000, and (4) Skylab Earth Resources Experiment Package S-190A and S-190B sensors (1:750,000 and 1:500,000 respectively).
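To make the parametric model concrete, the sketch below fits a regression of peak discharge on basin magnitude, drainage density and ruggedness number in log space; the power-law form and all data values are assumptions for illustration, not the study's actual model or measurements.

    import numpy as np

    # Hypothetical basin data: columns are basin magnitude, drainage density (km/km2),
    # and ruggedness number (not the study's measurements)
    X = np.array([[12, 3.1, 0.42],
                  [30, 4.0, 0.55],
                  [55, 4.8, 0.71],
                  [80, 5.5, 0.88],
                  [120, 6.2, 1.05]], dtype=float)
    q_peak = np.array([14., 38., 85., 150., 260.])   # peak discharge, m3/s

    # Fit Q = a * M^b1 * Dd^b2 * R^b3 by ordinary least squares in log space,
    # a common form for geomorphic flood models (an assumption, not the paper's exact model)
    A = np.column_stack([np.ones(len(X)), np.log(X)])
    coef, *_ = np.linalg.lstsq(A, np.log(q_peak), rcond=None)
    a, b = np.exp(coef[0]), coef[1:]
    print("a =", round(a, 3), "exponents =", np.round(b, 3))

    # Predict potential peak discharge for a new, ungauged basin
    new_basin = np.array([60, 5.0, 0.8])
    print("predicted Q_peak:", round(a * np.prod(new_basin ** b), 1), "m3/s")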
Tang, Quan; Sheng, Wanqi; Li, Liyuan; Zheng, Liugen; Miao, Chunhui; Sun, Ruoyu
2018-08-01
The alteration behavior of minerals and hazardous elements during simulated combustion (100-1200 °C) of a raw coal collected from a power plant was studied. Thermogravimetric analysis indicated that there were mainly four alteration stages during coal combustion. The transformation behavior of the mineral phases of the raw coal, detected by the X-ray polycrystalline diffraction (XRD) technique, depended mainly on the combustion temperature. A series of changes in the intensities of mineral (e.g. clay) diffraction peaks occurred when the temperature surpassed 600 °C. Mineral phases tended to become simpler and collapsed to amorphous glass when the temperature reached 1200 °C. The characteristics of the functional groups of the raw coal and the high-temperature (1200 °C) ash studied by Fourier transform infrared spectroscopy (FTIR) were in accordance with the results obtained from the XRD analysis. The volatilization ratios of Co, Cr, Ni and V increased consistently with combustion temperature, suggesting that these elements were gradually released from the organic matter and inorganic minerals of the coal. Copyright © 2018 Elsevier Ltd. All rights reserved.
Chemical Waste Management and Disposal.
ERIC Educational Resources Information Center
Armour, Margaret-Ann
1988-01-01
Describes simple, efficient techniques for treating hazardous chemicals so that nontoxic and nonhazardous residues are formed. Discusses general rules for management of waste chemicals from school laboratories and general techniques for the disposal of waste or surplus chemicals. Lists specific disposal reactions. (CW)
Schedule-controlled learning and memory in a regulatory context
Control of behavior by the manipulation of contingencies provides powerful techniques for assessing the hazard of chemical toxicants on the nervous system. When applied to evaluate the consequences of developmental exposure, these techniques are well suited for characterizing per...
A Case Study of Measuring Process Risk for Early Insights into Software Safety
NASA Technical Reports Server (NTRS)
Layman, Lucas; Basili, Victor; Zelkowitz, Marvin V.; Fisher, Karen L.
2011-01-01
In this case study, we examine software safety risk in three flight hardware systems in NASA's Constellation spaceflight program. We applied our Technical and Process Risk Measurement (TPRM) methodology to the Constellation hazard analysis process to quantify the technical and process risks involving software safety in the early design phase of these projects. We analyzed 154 hazard reports and collected metrics to measure the prevalence of software in hazards and the specificity of descriptions of software causes of hazardous conditions. We found that for 49-70% of the 154 hazardous conditions, software could be a cause or was involved in the prevention of the hazardous condition. We also found that 12-17% of the 2,013 hazard causes involved software, and that 23-29% of all causes had a software control. The application of the TPRM methodology identified process risks in the application of the hazard analysis process itself that may lead to software safety risk.
Hazards and occupational risk in hard coal mines - a critical analysis of legal requirements
NASA Astrophysics Data System (ADS)
Krause, Marcin
2017-11-01
This publication concerns the problems of occupational safety and health in hard coal mines, the basic elements of which are the mining hazards and the occupational risk. The work includes a comparative analysis of selected provisions of general and industry-specific law regarding the analysis of hazards and occupational risk assessment. Based on a critical analysis of legal requirements, basic assumptions regarding the practical guidelines for occupational risk assessment in underground coal mines have been proposed.
NASA Astrophysics Data System (ADS)
Gălbău, Ionela
2015-04-01
Information techniques such as Geographic Information Systems (GIS) applied to spatial analysis offer numerous possibilities for characterizing the study area spatially and delineating hazard risk areas (especially landslides). Although more advanced tools have emerged, GIS remains the most important technique used in spatial planning. Moreover, the GIS maps obtained are more objective than maps drawn by hand using the same data and the same conceptual model. The study area, Vorniceni commune, is situated in the north of Romania in the Ibaneasa River basin, a tributary of the Jijia, and occupies an area of 63 km2. Over the past 50 years the area has changed not only territorially but also morphologically and morphometrically. This study relates the evolution of the territorial distribution of the population of Vorniceni commune to its influence on the environment. The construction of the dam on the Ibaneasa River using poor-quality borrow pits was a starting point for the development of landslides. Abrupt anthropic intervention in the environment, through dam construction and the clogging of the two reservoirs (ponds), increased the possibility of negative phenomena in the area. These phenomena directly affect the village population, as territorial evolution involved the construction of settlements in areas with potential landslide risk. The analysis of the factors that have influenced territorial evolution and produced negative phenomena, together with the construction of a GIS database, is followed by the production of maps of hypsometry, slope inclination and land use. All this highlights the relationship between the anthropic and natural environments, and the risk map obtained offers the population an opportunity to use the land in a more beneficial way. Although not without shortcomings, the method proved to be a feasible and cost-effective approach for assessing and mapping landslide susceptibility. ACKNOWLEDGMENT: This paper has been financially supported within the project entitled "SOCERT. Knowledge society, dynamism through research", contract number POSDRU/159/1.5/S/132406. This project is co-financed by the European Social Fund through the Sectoral Operational Programme for Human Resources Development 2007-2013. Investing in people!
Assessing natural hazards in NE Colombia using Sentinel-1 interferometry
NASA Astrophysics Data System (ADS)
Olen, Stephanie; Bookhagen, Bodo
2017-04-01
The DIGENTI project (Digitaler Entscheidertisch für das Naturgefahrenmanagement auf Basis von Satellitendaten und VGI (Volunteered Geographic Information)) aims to assess the natural hazard threat to the Cesar and La Guajira departments of northeast Colombia as guidance for decision makers and disaster relief workers. As members of the DIGENTI project, we use Sentinel-1 synthetic aperture radar (SAR) interferometry to detect hillslope movements, delineate settlements, and monitor damage to urban areas. Our study area, located in the remote Serranía del Perijá mountain range on the border of Colombia and Venezuela, is mountainous, highly vegetated, and experiences high and spatially variable rainfall (between 1 and 4 m a-1). The remote nature of the region, coupled with the favorable conditions for mass movements and other hillslope instabilities, makes it an ideal location to employ remote sensing techniques to monitor potential natural hazards. In the highly vegetated Serranía del Perijá mountain range, traditional damage proxy mapping is complicated by vegetation-related coherence loss between SAR scenes. Cross-referencing existing maps, we define regions of consistently high coherence as settled or urban areas. Using the spatial extent of settled or urban areas as a mask, we establish an algorithm to use coherence loss only in these regions as a damage proxy in urban areas where the local population will be most affected. Outside of settlements, hillslope instabilities and movements are quantified and mapped using a two-pronged approach: (1) Horizontal ground displacement is calculated by dense amplitude cross-correlation using the topsOffsetApp in the InSAR Scientific Computing Environment (ISCE). This allows the location, direction, and magnitude of mass movements and hillslope instabilities to be identified and mapped; (2) We use a timeseries of interferograms to quantify vertical ground deformation (e.g., as caused by landsliding) during the Sentinel-1 time window. To do this we employ the small baseline subset (SBAS) technique to create a timeseries of potential landslides in our study area. This technique has the added advantage of overcoming poor coherence between individual InSAR scenes (e.g., caused by vegetation cover). The output of the SBAS analysis will be used to designate natural hazard "hot spots" that will enhance static estimates of factor of safety and landslide risk (e.g., based on digital elevation models). Both the timeseries of horizontal and vertical surface movements and their spatial extent are compared to regional rainfall and vegetation patterns to aid in future natural hazard assessment. Preliminary work is being done to apply these algorithms to other regions with markedly different climate and tectonic settings (NW Argentine Andes and the Arun Valley, Nepal).
Chang, Yen-Hou; Li, Wai-Hou; Chang, Yi; Peng, Chia-Wen; Cheng, Ching-Hsuan; Chang, Wei-Pin; Chuang, Chi-Mu
2016-03-17
In the analysis of survival data for cancer patients, the problem of competing risks is often ignored. Competing risks have been recognized as a special case of time-to-event analysis. The conventional techniques for time-to-event analysis applied in the presence of competing risks often give biased or uninterpretable results. Using a prospectively collected administrative health care database in a single institution, we identified patients diagnosed with stage III or IV primary epithelial ovarian, tubal, and peritoneal cancers with minimal residual disease after primary cytoreductive surgery between 1995 and 2012. Here, we sought to evaluate whether intraperitoneal chemotherapy outperforms intravenous chemotherapy in the presence of competing risks. Unadjusted and multivariable subdistribution hazards models were applied to this database with two types of competing risks (cancer-specific mortality and other-cause mortality) coded to measure the relative effects of intraperitoneal chemotherapy. A total of 1263 patients were recruited as the initial cohort. After propensity score matching, 381 patients in each arm entered into the final competing risk analysis. Cumulative incidence estimates for cancer-specific mortality were statistically significantly lower (p = 0.017, Gray test) in patients receiving intraperitoneal chemotherapy (5-year estimates, 34.5%; 95% confidence interval [CI], 29.5-39.6%, and 10-year estimates, 60.7%; 95% CI, 52.2-68.0%) versus intravenous chemotherapy (5-year estimates, 41.3%; 95% CI, 36.2-46.3%, and 10-year estimates, 67.5%; 95% CI, 61.6-72.7%). In subdistribution hazards analysis, for cancer-specific mortality, intraperitoneal chemotherapy outperforms intravenous chemotherapy (subdistribution hazard ratio, 0.82; 95% CI, 0.70-0.96) after adjusting for other covariates. In conclusion, results from this comparative effectiveness study provide supportive evidence for previously published randomized trials that intraperitoneal chemotherapy outperforms intravenous chemotherapy, even after eliminating the confounding of competing risks. We suggest that implementation of competing risk analysis should be strongly considered when investigating cancer patients with medium- to long-term follow-up.
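The cumulative incidence estimates quoted above come from the standard nonparametric competing-risks estimator; a minimal Python sketch of that estimator, applied to a tiny synthetic cohort rather than the study's data, is given below.

    import numpy as np

    def cumulative_incidence(time, event, cause=1):
        """Nonparametric cumulative incidence function (Aalen-Johansen form):
        CIF_k(t) = sum over event times <= t of S(t-) * d_k(t) / n(t),
        where S is the overall (all-cause) Kaplan-Meier survival."""
        time, event = np.asarray(time, float), np.asarray(event, int)
        times = np.unique(time[event > 0])
        surv, cif, out = 1.0, 0.0, []
        for t in times:
            at_risk = np.sum(time >= t)
            d_all   = np.sum((time == t) & (event > 0))
            d_cause = np.sum((time == t) & (event == cause))
            cif  += surv * d_cause / at_risk     # increment uses survival just before t
            surv *= 1.0 - d_all / at_risk        # then update all-cause survival
            out.append((t, cif))
        return out

    # Synthetic example: event=1 cancer-specific death, event=2 other-cause death, 0 censored
    t = [5, 8, 12, 12, 20, 24, 30, 31, 40, 45]
    e = [1, 0,  2,  1,  1,  0,  2,  1,  0,  1]
    for ti, ci in cumulative_incidence(t, e, cause=1):
        print(f"t={ti:>4.0f}  CIF={ci:.3f}")

Unlike 1 minus the cause-specific Kaplan-Meier curve, this estimator does not overstate risk when the competing event removes patients from observation.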
NASA Astrophysics Data System (ADS)
Nath, Sankar Kumar; Srivastava, Nishtha; Ghatak, Chitralekha; Adhikari, Manik Das; Ghosh, Ambarish; Sinha Ray, S. P.
2018-01-01
Liquefaction-induced ground failure is among the leading causes of infrastructure damage from large earthquakes in unconsolidated, non-cohesive, water-saturated alluvial terrains. The city of Kolkata is located on the potentially liquefiable alluvial fan deposits of the Ganga-Brahmaputra-Meghna Delta system, with a subsurface litho-stratigraphic sequence comprising varying percentages of clay, cohesionless silt, sand, and gravel interbedded with decomposed wood and peat. Additionally, the region has moderately shallow groundwater conditions, especially in the post-monsoon seasons. In view of the burgeoning population, there has been unplanned expansion of settlements under hazardous geological, geomorphological, and hydrological conditions, exposing the city to severe liquefaction hazard. The 1897 Shillong and 1934 Bihar-Nepal earthquakes, both of Mw 8.1, reportedly induced Modified Mercalli Intensities of IV-V and VI-VII, respectively, in the city, triggering widespread to sporadic liquefaction with surface manifestations of sand boils, lateral spreading, ground subsidence, etc., thus posing a strong case for liquefaction potential analysis in the terrain. Motivated by the assessment of seismic hazard, vulnerability, and risk of the city of Kolkata under concerted federal funding stipulated for all the metros and fast-growing urban centers in India located in BIS seismic zones III, IV, and V with populations of more than one million, an attempt has been made here to understand the liquefaction susceptibility of Kolkata under earthquake loading employing modern multivariate techniques, and also to predict a deterministic liquefaction scenario of the city for a probabilistic seismic hazard condition with 10% probability of exceedance in 50 years and a return period of 475 years. We conducted in-depth geophysical and geotechnical investigations in the city encompassing a 435 km2 area. The stochastically synthesized bedrock ground motions for the 1897 and 1934 earthquakes, propagated through non-linear analysis of local site conditions with the DEEPSOIL geotechnical analysis package, yield surface-level peak ground accelerations of the order of 0.05-0.14 g for the 1934 Bihar-Nepal earthquake, while for the 1897 Shillong earthquake they are in the range of 0.03-0.11 g. The factor of safety (FOS) against liquefaction, the probability of liquefaction (PL), the liquefaction potential index (LPI), and the liquefaction risk index are estimated under the influence of these two earthquakes, wherein the city is classified into severe (LPI > 15), high (5 < LPI ≤ 15), moderate (0 < LPI ≤ 5), and non-liquefiable (LPI = 0) susceptibility zones. While the 1934 Bihar-Nepal earthquake induced moderate to severe liquefaction hazard conditions in the city, mostly in the deltaic plain and interdistributary marsh geomorphologic units, with 13.5% of sites exhibiting moderate hazard (median LPI of 1.8), 8.5% of sites exhibiting high hazard (median LPI of 9.1), and 4% of sites exhibiting severe hazard (median LPI of 18.9), the 1897 Shillong earthquake induced mostly non-liquefiable conditions, with very few sites showing moderate or high liquefaction hazard.
A conservative liquefaction hazard scenario of the city, on the other hand, estimated through a deterministic approach for 10% probability of exceedance in 50 years, predicts a high hazard zone in the 3.5-19 m depth region with FOS < 1 and PL > 65%, comprising coarse-grained sediments of sand, silty sand, and clayey silty sand, mostly in the deltaic plain geomorphologic unit, with 39.1% of sites depicting severe liquefaction hazard with a median LPI of 28.3. A non-linear regression analysis on both the historical and deterministic liquefaction scenarios in the PL versus LPI domain with a ±1 standard deviation confidence bound generated a cubic polynomial relationship between the two liquefaction hazard proxies. This study, considered a benchmark for other cities in the country and elsewhere, forms an integral part of the mega-seismic microzonation endeavors undertaken in earthquake-prone countries around the world.
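For readers unfamiliar with the LPI values quoted above, the sketch below computes an LPI from a factor-of-safety profile using the commonly cited Iwasaki-type weighting w(z) = 10 - 0.5z over the top 20 m and the same severity classes as the abstract; the weighting choice and the borehole profile are assumptions for illustration, not the study's exact procedure or data.

    import numpy as np

    def liquefaction_potential_index(depth_m, fos):
        """LPI = integral over 0-20 m of F(z) * w(z) dz, with
        F(z) = 1 - FOS where FOS < 1 (else 0) and w(z) = 10 - 0.5 z (Iwasaki-type weighting)."""
        z = np.asarray(depth_m, float)
        fos = np.asarray(fos, float)
        f = np.where(fos < 1.0, 1.0 - fos, 0.0)
        w = np.clip(10.0 - 0.5 * z, 0.0, None)
        integrand = f * w
        # trapezoidal integration over depth
        return float(np.sum(0.5 * (integrand[1:] + integrand[:-1]) * np.diff(z)))

    def classify(lpi):
        """Severity classes as used in the abstract."""
        if lpi == 0:  return "non-liquefiable"
        if lpi <= 5:  return "moderate"
        if lpi <= 15: return "high"
        return "severe"

    # Hypothetical FOS profile for one borehole (liquefiable layer between 3.5 and 12 m)
    depth = np.arange(0.5, 20.5, 0.5)
    fos   = np.where((depth > 3.5) & (depth < 12.0), 0.7, 1.3)
    lpi = liquefaction_potential_index(depth, fos)
    print(round(lpi, 1), classify(lpi))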
Prospects and pitfalls of occupational hazard mapping: 'between these lines there be dragons'.
Koehler, Kirsten A; Volckens, John
2011-10-01
Hazard data mapping is a promising new technique that can enhance the process of occupational exposure assessment and risk communication. Hazard maps have the potential to improve worker health by providing key input for the design of hazard intervention and control strategies. Hazard maps are developed with aid from direct-reading instruments, which can collect highly spatially and temporally resolved data in a relatively short period of time. However, quantifying spatial-temporal variability in the occupational environment is not a straightforward process, and our lack of understanding of how to ascertain and model spatial and temporal variability is a limiting factor in the use and interpretation of workplace hazard maps. We provide an example of how sources of and exposures to workplace hazards may be mischaracterized in a hazard map due to a lack of completeness and representativeness of collected measurement data. Based on this example, we believe that a major priority for research in this emerging area should focus on the development of a statistical framework to quantify uncertainty in spatially and temporally varying data. In conjunction with this need is one for the development of guidelines and procedures for the proper sampling, generation, and evaluation of workplace hazard maps.
Hazard function theory for nonstationary natural hazards
NASA Astrophysics Data System (ADS)
Read, L.; Vogel, R. M.
2015-12-01
Studies from the natural hazards literature indicate that many natural processes, including wind speeds, landslides, wildfires, precipitation, streamflow and earthquakes, show evidence of nonstationary behavior such as trends in magnitudes through time. Traditional probabilistic analysis of natural hazards based on partial duration series (PDS) generally assumes stationarity in the magnitudes and arrivals of events, i.e. that the probability of exceedance is constant through time. Given evidence of trends and the consequent expected growth in devastating impacts from natural hazards across the world, new methods are needed to characterize their probabilistic behavior. The field of hazard function analysis (HFA) is ideally suited to this problem because its primary goal is to describe changes in the exceedance probability of an event over time. HFA is widely used in medicine, manufacturing, actuarial statistics, reliability engineering, economics, and elsewhere. HFA provides a rich theory to relate the natural hazard event series (x) with its failure time series (t), enabling computation of corresponding average return periods and reliabilities associated with nonstationary event series. This work investigates the suitability of HFA to characterize nonstationary natural hazards whose PDS magnitudes are assumed to follow the widely applied Poisson-GP model. We derive a 2-parameter Generalized Pareto hazard model and demonstrate how metrics such as reliability and average return period are impacted by nonstationarity and discuss the implications for planning and design. Our theoretical analysis linking hazard event series x, with corresponding failure time series t, should have application to a wide class of natural hazards.
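For the generalized Pareto exceedance model mentioned above, the hazard function takes the closed form h(x) = 1/(σ + ξx); the small sketch below, with illustrative parameter values, checks this against scipy's density and survival function.

    import numpy as np
    from scipy.stats import genpareto

    sigma, xi = 2.0, 0.1          # illustrative GP scale and shape (not from the study)
    x = np.linspace(0.0, 10.0, 6)

    # Closed-form hazard of the generalized Pareto: h(x) = f(x)/S(x) = 1/(sigma + xi*x)
    h_closed = 1.0 / (sigma + xi * x)

    # Numerical check: pdf divided by survival function
    dist = genpareto(c=xi, scale=sigma)
    h_numeric = dist.pdf(x) / dist.sf(x)

    print(np.allclose(h_closed, h_numeric))   # True

A decreasing, constant, or increasing hazard in x (governed by the sign of ξ) then maps directly onto how exceedance probability, reliability, and average return period evolve for the nonstationary event series.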
Timely detection and monitoring of oil leakage by satellite optical data.
NASA Astrophysics Data System (ADS)
Grimaldi, C. S. L.; Coviello, I.; Lacava, T.; Pergola, N.; Tramutoli, V.
2009-04-01
Sea oil pollution can derive from different sources. Accidental releases of oil into the oceans caused by human error (tanker collisions and/or shipwrecks) or natural hazards (hurricanes, landslides, earthquakes) have a remarkable ecological impact on maritime and coastal environments. Hurricane Katrina, for example, hitting oil and gas infrastructure off the US coast, caused the destruction of more than 100 platforms and the release into the sea of more than 10,000 gallons of crude oil. In order to reduce the environmental impact of such technological hazards, timely detection and continuously updated information are fundamental. Satellite remote sensing can give a significant contribution in this direction. Nowadays, SAR (Synthetic Aperture Radar) technology is recognized as the most efficient for oil spill detection and mapping, thanks to the high spatial resolution and all-time/all-weather capability of the present operational sensors. However, due to their current revisit cycles, SAR systems cannot be profitably used for rapid detection and for continuous, near real-time monitoring of these phenomena. Until the COSMO-SkyMed SAR constellation, which will improve SAR observational frequency, becomes fully operational, passive optical sensors on board meteorological satellites, thanks to their high temporal resolution, may represent a suitable alternative for early detection and continuous monitoring of oil spills, provided that adequate and reliable data analysis techniques exist. Recently, an innovative technique for oil spill detection and monitoring, based on the general Robust Satellite Techniques (RST) approach, has been proposed. It exploits the multi-temporal analysis of optical data acquired by both AVHRR (Advanced Very High Resolution Radiometer) and MODIS (Moderate Resolution Imaging Spectroradiometer) sensors in order to detect, automatically and in a timely way, the presence of oil spills over the sea surface, trying to minimize the false detections possibly caused by spurious effects (e.g. clouds). In this paper, preliminary results obtained by applying the proposed methodology to different test cases are shown and discussed.
San Mateo County Geographic Information Systems (GIS) project
Brabb, E.E.
1986-01-01
Earthquakes and ground failures in the United States cause billions of dollars of damage each year, but techniques for predicting and reducing these hazardous geologic processes remain elusive. Geologists, geophysicists, hydrologists, engineers, cartographers, and computer specialists from the U.S. Geological Survey in Menlo Park, California, are working together on a project involving GIS techniques to determine how to predict the consequences of earthquakes and landslides, using San Mateo County as a subject area. Together with members of the Planning and Emergency Services Departments of San Mateo County and the Association of Bay Area Governments, they are also determining how to reduce the losses caused by hazards.
NASA Astrophysics Data System (ADS)
Sadegh, M.; Moftakhari, H.; AghaKouchak, A.
2017-12-01
Many natural hazards are driven by multiple forcing variables, and concurrent or consecutive extreme events significantly increase the risk of infrastructure/system failure. It is common practice to use univariate analysis based upon a perceived ruling driver to estimate design quantiles and/or return periods of extreme events. A multivariate analysis, however, permits modeling the simultaneous occurrence of multiple forcing variables. In this presentation, we introduce the Multi-hazard Assessment and Scenario Toolbox (MhAST), which comprehensively analyzes marginal and joint probability distributions of natural hazards. MhAST also offers a wide range of return period and design level scenarios and their likelihoods. The contribution of this study is four-fold: 1. comprehensive analysis of the marginal and joint probability of multiple drivers through 17 continuous distributions and 26 copulas, 2. multiple scenario analysis of concurrent extremes based upon the most likely joint occurrence, one ruling variable, and weighted random sampling of joint occurrences with similar exceedance probabilities, 3. weighted average scenario analysis based on an expected event, and 4. uncertainty analysis of the most likely joint occurrence scenario using a Bayesian framework.
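One simple instance of the joint-occurrence ("AND") return period that such a toolbox evaluates can be written with a copula; the sketch below uses a Gaussian copula with assumed marginal probabilities and correlation, which is just one of the many copula families a tool like MhAST would consider.

    import numpy as np
    from scipy.stats import norm, multivariate_normal

    def and_return_period(p1, p2, rho, events_per_year=1.0):
        """Return period (years) of both drivers exceeding their design levels, where
        p1, p2 are marginal non-exceedance probabilities linked by a Gaussian copula."""
        z1, z2 = norm.ppf(p1), norm.ppf(p2)
        mvn = multivariate_normal(mean=[0.0, 0.0], cov=[[1.0, rho], [rho, 1.0]])
        # Joint exceedance P(Z1 > z1, Z2 > z2) equals the CDF at (-z1, -z2) by symmetry
        p_joint_exceed = mvn.cdf([-z1, -z2])
        return 1.0 / (events_per_year * p_joint_exceed)

    # Illustrative: two drivers each at their 99th percentile, moderately dependent
    print(round(and_return_period(0.99, 0.99, rho=0.6), 1), "years")

With rho = 0 the joint return period would be 1/0.01^2 = 10,000 years; positive dependence shortens it considerably, which is exactly why ignoring compound events understates risk.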
21 CFR 120.8 - Hazard Analysis and Critical Control Point (HACCP) plan.
Code of Federal Regulations, 2010 CFR
2010-04-01
...: (i) Critical control points designed to control food hazards that are reasonably likely to occur and could be introduced inside the processing plant environment; and (ii) Critical control points designed... 21 Food and Drugs 2 2010-04-01 2010-04-01 false Hazard Analysis and Critical Control Point (HACCP...
21 CFR 120.8 - Hazard Analysis and Critical Control Point (HACCP) plan.
Code of Federal Regulations, 2012 CFR
2012-04-01
...: (i) Critical control points designed to control food hazards that are reasonably likely to occur and could be introduced inside the processing plant environment; and (ii) Critical control points designed... 21 Food and Drugs 2 2012-04-01 2012-04-01 false Hazard Analysis and Critical Control Point (HACCP...
21 CFR 120.8 - Hazard Analysis and Critical Control Point (HACCP) plan.
Code of Federal Regulations, 2013 CFR
2013-04-01
...: (i) Critical control points designed to control food hazards that are reasonably likely to occur and could be introduced inside the processing plant environment; and (ii) Critical control points designed... 21 Food and Drugs 2 2013-04-01 2013-04-01 false Hazard Analysis and Critical Control Point (HACCP...
21 CFR 120.8 - Hazard Analysis and Critical Control Point (HACCP) plan.
Code of Federal Regulations, 2014 CFR
2014-04-01
...: (i) Critical control points designed to control food hazards that are reasonably likely to occur and could be introduced inside the processing plant environment; and (ii) Critical control points designed... 21 Food and Drugs 2 2014-04-01 2014-04-01 false Hazard Analysis and Critical Control Point (HACCP...
21 CFR 120.8 - Hazard Analysis and Critical Control Point (HACCP) plan.
Code of Federal Regulations, 2011 CFR
2011-04-01
...: (i) Critical control points designed to control food hazards that are reasonably likely to occur and could be introduced inside the processing plant environment; and (ii) Critical control points designed... 21 Food and Drugs 2 2011-04-01 2011-04-01 false Hazard Analysis and Critical Control Point (HACCP...
ERIC Educational Resources Information Center
Alexander, Melody W.; Arp, Larry W.
1997-01-01
A survey of 260 secondary and 251 postsecondary business educators found the former more likely to think computer ergonomic techniques should be taught in elementary school and to address the hazards of improper use. Both groups stated that over half of the students they observe do not use good techniques and agreed that students need continual…
NASA Technical Reports Server (NTRS)
Bowles, Roland L.; Buck, Bill K.
2009-01-01
The objective of the research developed and presented in this document was to statistically assess turbulence hazard detection performance employing airborne pulse Doppler radar systems. The FAA certification methodology for forward looking airborne turbulence radars will require estimating the probabilities of missed and false hazard indications under operational conditions. Analytical approaches must be used due to the near impossibility of obtaining sufficient statistics experimentally. This report describes an end-to-end analytical technique for estimating these probabilities for Enhanced Turbulence (E-Turb) Radar systems under noise-limited conditions, for a variety of aircraft types, as defined in FAA TSO-C134. This technique provides for one means, but not the only means, by which an applicant can demonstrate compliance to the FAA directed ATDS Working Group performance requirements. Turbulence hazard algorithms were developed that derived predictive estimates of aircraft hazards from basic radar observables. These algorithms were designed to prevent false turbulence indications while accurately predicting areas of elevated turbulence risks to aircraft, passengers, and crew; and were successfully flight tested on a NASA B757-200 and a Delta Air Lines B737-800. Application of this defined methodology for calculating the probability of missed and false hazard indications taking into account the effect of the various algorithms used, is demonstrated for representative transport aircraft and radar performance characteristics.
NASA Astrophysics Data System (ADS)
Fugazza, Davide; Scaioni, Marco; Corti, Manuel; D'Agata, Carlo; Azzoni, Roberto Sergio; Cernuschi, Massimo; Smiraglia, Claudio; Diolaiuti, Guglielmina Adele
2018-04-01
Tourists and hikers visiting glaciers all year round face hazards such as sudden terminus collapses, typical of such a dynamically evolving environment. In this study, we analyzed the potential of different survey techniques to analyze hazards of the Forni Glacier, an important geosite located in Stelvio Park (Italian Alps). We carried out surveys in the 2016 ablation season and compared point clouds generated from an unmanned aerial vehicle (UAV) survey, close-range photogrammetry and terrestrial laser scanning (TLS). To investigate the evolution of glacier hazards and evaluate the glacier thinning rate, we also used UAV data collected in 2014 and a digital elevation model (DEM) created from an aerial photogrammetric survey of 2007. We found that the integration between terrestrial and UAV photogrammetry is ideal for mapping hazards related to the glacier collapse, while TLS is affected by occlusions and is logistically complex in glacial terrain. Photogrammetric techniques can therefore replace TLS for glacier studies and UAV-based DEMs hold potential for becoming a standard tool in the investigation of glacier thickness changes. Based on our data sets, an increase in the size of collapses was found over the study period, and the glacier thinning rates went from 4.55 ± 0.24 m a-1 between 2007 and 2014 to 5.20 ± 1.11 m a-1 between 2014 and 2016.
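The thinning rates reported above follow from differencing co-registered DEMs and dividing by the time span; a minimal sketch with synthetic grids standing in for the 2014 and 2016 surfaces (not the survey data) is shown below.

    import numpy as np

    # Synthetic stand-ins for two co-registered glacier DEMs (metres) and their time separation
    rng = np.random.default_rng(0)
    dem_2014 = 2600.0 + rng.normal(0.0, 0.5, size=(100, 100))
    dem_2016 = dem_2014 - 10.4 + rng.normal(0.0, 0.5, size=(100, 100))   # roughly 5.2 m/a of thinning
    years = 2.0

    dh = dem_2016 - dem_2014          # elevation change per cell
    rate = dh / years                 # metres per year (negative = thinning)
    print(f"mean thinning rate: {rate.mean():.2f} +/- {rate.std():.2f} m/a")

In practice the uncertainty term would also fold in co-registration and photogrammetric errors, which is why the two epochs above carry different error bars.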
Social vulnerability assessment: a growing practice in Europe?
NASA Astrophysics Data System (ADS)
Tapsell, S.; McCarthy, S.
2012-04-01
This paper builds upon work on social vulnerability from the CapHaz-Net consortium, an ongoing research project funded by the European Commission in its 7th Framework Programme. The project focuses on the social dimensions of natural hazards, as well as on regional practices of risk prevention and management, and aims at improving the resilience of European societies to natural hazards, paying particular attention to social capacity building. The topic of social vulnerability is one of seven themes being addressed in the project. There are various rationales for examining the relevance of social vulnerability to natural hazards. Vulnerability assessment has now been accepted as a requirement for the effective development of emergency management capability, and assessment of social vulnerability has been recognised as being integral to understanding the risk from natural hazards. The aim of our research was to examine social vulnerability, how it might be understood in the context of natural hazards in Europe, and how social vulnerability can be addressed to increase social capacity. The work comprised a review of research on social vulnerability to different natural hazards within Europe and included concepts and definitions of social vulnerability (and related concepts), the purpose of vulnerability assessment and who decides who is vulnerable, different approaches to assessing or measuring social vulnerability (such as the use of 'classical' quantitative vulnerability indicators and qualitative community-based approaches, along with the advantages and disadvantages of both), conceptual frameworks for assessing social vulnerability and three case studies of social vulnerability studies within Europe: flash floods in the Italian Alps, fluvial flooding in Germany and heat waves in Spain. The review reveals variable application of social vulnerability analysis across Europe and there are indications why this might be the case. Reasons could range from the scale of country policy and the particular risk management focus to smaller-scale risk management perceptions of the analysis techniques employed being too resource-expensive, difficult to interpret or difficult to operationalise. This paper will provide a context with some empirical examples to perhaps explain the growing popularity of concepts such as resilience and capacity building, which lie more comfortably with policy makers and risk managers as concepts which focus on the solution rather than identifying a problem by assessing social vulnerability.
Potential hazards in smoke-flavored fish
NASA Astrophysics Data System (ADS)
Lin, Hong; Jiang, Jie; Li, Donghua
2008-08-01
Smoking is widely used in fish processing for color and flavor. Smoke flavorings have evolved as a successful alternative to traditional smoking. The hazards of fish products treated by the liquid-smoking process are discussed in this review. The smoke flavoring is one important ingredient in smoke-flavored fish. This paper gives the definition of smoke flavorings and the hazard of polycyclic aromatic hydrocarbon (PAH) residues in the smoke flavorings on the market. It also gives an assessment of chemical hazards such as carcinogenic PAHs, especially benzo[a]pyrene, as well as biological hazards such as Listeria monocytogenes, Clostridium botulinum, histamine and parasites in smoke-flavored fish. The limitations of current regulations and standards are discussed. Smoke-flavored fish have a lower content of PAHs than traditionally smoked fish if the PAH residues in smoke flavorings are controlled by regulations or standards.
Management of toxic substances and hazardous wastes.
Niemela, V E
1984-08-01
This paper describes the extent of the hazardous and toxic chemical waste problems in Canada and discusses the management, treatment, and disposal methods commonly used in North America and Europe. The treatment and disposal techniques covered are biological, physical-chemical, incineration technologies, and secure land disposal. Some of the available and emerging technologies for destruction of polychlorinated biphenyls are also described.
Little, C L; Lock, D; Barnes, J; Mitchell, R T
2003-09-01
A meta-analysis of eight UK food studies was carried out to determine the microbiological quality of food and its relationship with the presence in food businesses of hazard analysis systems and food hygiene training. Of the 19,022 premises visited to collect food samples in these studies between 1997 and 2002, two thirds (66%) were catering premises and one third (34%) were retail premises. Comparison with PHLS Microbiological Guidelines revealed that significantly more ready-to-eat food samples from catering premises (20%; 2,511/12,703) were of unsatisfactory or unacceptable microbiological quality compared to samples from retail premises (12%; 1,039/8,462) (p < 0.00001). Three quarters (76%) of retail premises had hazard analysis systems in place compared with 59% of catering premises (p < 0.00001). In 87% of retail premises the manager had received some form of food hygiene training compared with 80% of catering premises (p < 0.00001). From premises where the manager had received no food hygiene training a greater proportion of samples were of unsatisfactory and unacceptable microbiological quality (20% retail, 27% catering) compared with premises where the manager had received food hygiene training (11% retail, 19% catering) (p < 0.00001). Where the manager of the premises had received food hygiene training, documented hazard analysis systems were more likely to be in place (p < 0.00001). Higher proportions of samples of unsatisfactory and unacceptable microbiological quality (17% retail, 22% catering) were from premises where there was no hazard analysis system in place compared to premises that had a documented hazard analysis system in place (10% retail, 18% catering) (p < 0.00001). Our meta-analysis suggests that the lower microbiological quality of ready-to-eat foods from catering premises compared with those collected from retail premises may reflect differences in management food hygiene training and the presence of a hazard analysis system. The importance of adequate training for food handlers and their managers as a pre-requisite for effective hazard analysis and critical control point (HACCP) based controls is therefore emphasised.
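The premises-type comparisons reported above are tests of differences in proportions; for example, the catering versus retail contrast (2,511/12,703 vs. 1,039/8,462 unsatisfactory samples) can be checked with a chi-square test as sketched below, using the counts from the abstract (the original analysis may have used a different test).

    from scipy.stats import chi2_contingency

    # Unsatisfactory/acceptable ready-to-eat samples by premises type (counts from the abstract)
    catering = [2511, 12703 - 2511]
    retail   = [1039, 8462 - 1039]

    chi2, p, dof, expected = chi2_contingency([catering, retail])
    print(f"chi2 = {chi2:.1f}, p = {p:.2e}")   # p far below 0.00001, consistent with the reported result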
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bloom, R.R.
1996-04-01
The objective of this report is to demonstrate that a thorough assessment of the risks associated with the operation of the Rust Geotech patented VAC*TRAX mobile treatment unit (MTU) has been performed and documented. The MTU was developed to treat mixed wastes at the US Department of Energy (DOE) Albuquerque Operations Office sites. The MTU uses an indirectly heated, batch vacuum dryer to thermally desorb organic compounds from mixed wastes. This process hazards analysis evaluated 102 potential hazards. The three significant hazards identified involved the inclusion of oxygen in a process that also included an ignition source and fuel. Changes to the design of the MTU were made concurrent with the hazard identification and analysis; all hazards with initial risk rankings of 1 or 2 were reduced to acceptable risk rankings of 3 or 4. The overall risk to any population group from operation of the MTU was determined to be very low; the MTU is classified as a Radiological Facility with low hazards.
NASA Astrophysics Data System (ADS)
Hadley, Brian Christopher
This dissertation assessed remotely sensed data and geospatial modeling technique(s) to map the spatial distribution of total above-ground biomass present on the surface of the Savannah River National Laboratory's (SRNL) Mixed Waste Management Facility (MWMF) hazardous waste landfill. Ordinary least squares (OLS) regression, regression kriging, and tree-structured regression were employed to model the empirical relationship between in-situ measured Bahia (Paspalum notatum Flugge) and Centipede [Eremochloa ophiuroides (Munro) Hack.] grass biomass against an assortment of explanatory variables extracted from fine spatial resolution passive optical and LIDAR remotely sensed data. Explanatory variables included: (1) discrete channels of visible, near-infrared (NIR), and short-wave infrared (SWIR) reflectance, (2) spectral vegetation indices (SVI), (3) spectral mixture analysis (SMA) modeled fractions, (4) narrow-band derivative-based vegetation indices, and (5) LIDAR derived topographic variables (i.e. elevation, slope, and aspect). Results showed that a linear combination of the first- (1DZ_DGVI), second- (2DZ_DGVI), and third-derivative of green vegetation indices (3DZ_DGVI) calculated from hyperspectral data recorded over the 400--960 nm wavelengths of the electromagnetic spectrum explained the largest percentage of statistical variation (R2 = 0.5184) in the total above-ground biomass measurements. In general, the topographic variables did not correlate well with the MWMF biomass data, accounting for less than five percent of the statistical variation. It was concluded that tree-structured regression represented the optimum geospatial modeling technique due to a combination of model performance and efficiency/flexibility factors.
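As a rough illustration of tree-structured regression of biomass on spectral predictors, the sketch below fits a decision tree to synthetic stand-ins for the derivative-based vegetation indices; the data, tree depth and split settings are assumptions for illustration, not the dissertation's.

    import numpy as np
    from sklearn.tree import DecisionTreeRegressor

    # Synthetic stand-ins for derivative-based vegetation indices and measured grass biomass
    rng = np.random.default_rng(42)
    n = 200
    X = rng.normal(size=(n, 3))                     # e.g. 1DZ_DGVI, 2DZ_DGVI, 3DZ_DGVI surrogates
    biomass = 500 + 120 * X[:, 0] - 60 * X[:, 1] + rng.normal(0, 80, n)   # g/m2, synthetic

    # Tree-structured regression: recursive binary splits of the predictor space
    tree = DecisionTreeRegressor(max_depth=3, min_samples_leaf=10)
    tree.fit(X[:150], biomass[:150])
    r2 = tree.score(X[150:], biomass[150:])         # holdout R^2
    print(f"holdout R^2 = {r2:.2f}")

Unlike the OLS fit, the tree partitions the predictor space into locally homogeneous biomass classes, which is why the dissertation weighed model performance against flexibility when picking it.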
DOE Office of Scientific and Technical Information (OSTI.GOV)
Levander, Alan Richard; Zelt, Colin A.
2015-03-17
The work plan for this project was to develop and apply advanced seismic reflection and wide-angle processing and inversion techniques to high resolution seismic data for the shallow subsurface to seismically characterize the shallow subsurface at hazardous waste sites as an aid to containment and cleanup activities. We proposed to continue work on seismic data that we had already acquired under a previous DoE grant, as well as to acquire additional new datasets for analysis. The project successfully developed and/or implemented the use of 3D reflection seismology algorithms, waveform tomography and finite-frequency tomography using compressional and shear waves for high resolution characterization of the shallow subsurface at two waste sites. These two sites have markedly different near-surface structures, groundwater flow patterns, and hazardous waste problems. This is documented in the list of refereed documents, conference proceedings, and Rice graduate theses, listed below.
Detection of hazardous cavities with combined geophysical methods
NASA Astrophysics Data System (ADS)
Hegymegi, Cs.; Nyari, Zs.; Pattantyus-Abraham, M.
2003-04-01
Unknown near-surface cavities often cause problems for municipal communities all over the world. This is the situation in many towns and villages in Hungary, too. Inhabitants and owners of real estate (houses, cottages, land) are responsible for the safety and stability of their properties, while the safety of public sites is the responsibility of the local municipal community. Both the owner and the community are interested in preventing accidents. Near-surface cavities (unknown caves or earlier built and forgotten cellars) can usually be detected easily by surface geophysical methods. Traditional and recently developed measuring techniques in seismics, geoelectrics and georadar are suitable for the economical investigation of hazardous, potentially collapsing cavities prior to excavation and reinforcement. This poster shows some examples of the detection of cellars and caves that are dangerous for the civil population because of possible collapse under public sites (roads, yards, playgrounds, agricultural land, etc.). The applied methods are ground penetrating radar, seismic surface tomography and single-trace analysis, and geoelectric 2D and 3D resistivity profiling. The technology and processing procedures are also presented.
NASA Technical Reports Server (NTRS)
Getty, S. A.; Brinckerhoff, W. B.; Arevalo, R. D.; Floyd, M. M.; Li, X.; Cornish, T.; Ecelberger, S. A.
2012-01-01
Future landed missions to Mars will be guided by two strategic directions: (1) sample return to Earth, for comprehensive compositional analyses, as recommended by the 2011 NRC Planetary Decadal Survey; and (2) preparation for human exploration in the 2030s and beyond, as laid out by US space policy. The resultant mission architecture will likely require high-fidelity in situ chemical/organic sample analyses within an extremely constrained resource envelope. Both science goals (e.g., MEPAG Goal 1, return sample selection, etc.) as well as identification of any potential toxic and biological hazards to humans, must be addressed. Over the past several years of instrument development, we have found that the adaptable, compact, and highly capable technique of laser desorption/ionization time-of-flight mass spectrometry (LD-TOF-MS) has significant potential to contribute substantially to these dual objectives. This concept thus addresses Challenge Area 1: instrumentation and Investigation Approaches.
The Negative Impact of Early Peritonitis on Continuous Ambulatory Peritoneal Dialysis Patients
Hsieh, Yao-Peng; Wang, Shu-Chuan; Chang, Chia-Chu; Wen, Yao-Ko; Chiu, Ping-Fang; Yang, Yu
2014-01-01
♦ Background: Peritonitis rate has been reported to be associated with technique failure and overall mortality in the previous literature. However, information on the impact of the timing of the first peritonitis episode on continuous ambulatory peritoneal dialysis (CAPD) patients is sparse. The aim of this research is to study the influence of time to first peritonitis on clinical outcomes, including technique failure, patient mortality and dropout from peritoneal dialysis (PD). ♦ Methods: A retrospective observational cohort study was conducted over 10 years at a single PD unit in Taiwan. A total of 124 patients on CAPD with at least one peritonitis episode comprised the study subjects, who were dichotomized by the median of time to first peritonitis into either early peritonitis patients or late peritonitis patients. A Cox proportional hazards model was used to analyze the correlation of the timing of first peritonitis with clinical outcomes. ♦ Results: Early peritonitis patients were older, more often diabetic and had lower serum levels of creatinine than the late peritonitis patients. Early peritonitis patients were associated with worse technique survival, patient survival and stay on PD than late peritonitis patients, as indicated by Kaplan-Meier analysis (log-rank test, p = 0.04, p < 0.001, p < 0.001, respectively). In the multivariate Cox regression model, early peritonitis was still a significant predictor for technique failure (hazard ratio (HR), 0.54; 95% confidence interval (CI), 0.30 - 0.98), patient mortality (HR, 0.34; 95% CI, 0.13 - 0.92) and dropout from PD (HR, 0.50; 95% CI, 0.30 - 0.82). In continuous analyses, a 1-month increase in the time to the first peritonitis episode was associated with a 2% decreased risk of technique failure (HR, 0.98; 95% CI, 0.97 - 0.99), a 3% decreased risk of patient mortality (HR, 0.97; 95% CI, 0.95 - 0.99), and a 2% decreased risk of dropout from PD (HR, 0.98; 95% CI, 0.97 - 0.99). Peritonitis rate was inversely correlated with time to first peritonitis according to the Spearman analysis (r = -0.64, p < 0.001). ♦ Conclusions: Time to first peritonitis is significantly correlated with clinical outcomes of peritonitis patients, with early peritonitis patients having a poorer prognosis. Patients with a shorter time to first peritonitis were prone to having a higher peritonitis rate. PMID:24497590
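As an illustration of the Cox proportional hazards analysis summarized above, the following is a minimal sketch using the Python lifelines package; the column names and the tiny toy data set are hypothetical placeholders, not the study's variables or data.

```python
# Minimal sketch of a Cox proportional hazards fit; toy data and column names
# are hypothetical, chosen only to mirror the kind of analysis described above.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.DataFrame({
    "follow_up_months":          [12, 30, 48, 7, 60, 24, 36, 18],  # time on PD until event/censoring
    "technique_failure":         [1, 0, 0, 1, 0, 1, 0, 1],         # 1 = event, 0 = censored
    "time_to_first_peritonitis": [3, 20, 35, 12, 50, 28, 8, 15],   # months from PD start
    "age":                       [68, 52, 45, 71, 39, 60, 57, 49],
})

cph = CoxPHFitter()
cph.fit(df, duration_col="follow_up_months", event_col="technique_failure")
cph.print_summary()  # hazard ratios exp(coef) with 95% confidence intervals

# A hazard ratio below 1 for time_to_first_peritonitis would correspond to the
# reported finding that each extra month to the first episode lowers the risk.
```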
Technical Guidance for Hazardous Analysis, Emergency Planning for Extremely Hazardous Substances
This current guide supplements NRT-1 by providing technical assistance to LEPCs to assess the lethal hazards related to potential airborne releases of extremely hazardous substances (EHSs) as designated under Section 302 of Title III of SARA.
NASA Technical Reports Server (NTRS)
Windley, P.
1992-01-01
We present a state property called congruence and show how it can be used to demonstrate commutativity of instructions in a modern load-store architecture. Our analysis is particularly important in pipelined microprocessors, where instructions are frequently reordered to avoid costly delays in execution caused by hazards. Our work has significant implications for safety- and security-critical applications, since reordering can easily change the meaning of an instruction sequence and current techniques are largely ad hoc. Our work is done in a mechanical theorem prover and results in a set of trustworthy rules for instruction reordering. The mechanization makes it practical to analyze the entire instruction set.
NASA Astrophysics Data System (ADS)
Tarolli, Paolo; Prosdocimi, Massimo; Sofia, Giulia; Dalla Fontana, Giancarlo
2015-04-01
A real opportunity and challenge for hazard mapping is offered by the use of smartphones and the low-cost and flexible photogrammetric technique 'Structure-from-Motion' (SfM). Differently from other traditional photogrammetric methods, SfM allows three-dimensional geometries (Digital Surface Models, DSMs) to be reconstructed from randomly acquired images. The images can be acquired by standalone digital cameras (compact or reflex), or even by smartphones' built-in cameras. This represents a "revolutionary" advance compared with more expensive technologies and applications (e.g. Terrestrial Laser Scanner TLS, airborne lidar) (Tarolli, 2014). Through fast, simple and consecutive field surveys, anyone with a smartphone can take a lot of pictures of the same study area. This way, high-resolution and multi-temporal DSMs may be obtained and used to better monitor and understand erosion and deposition processes. Furthermore, these topographic data can also help quantify volumes of eroded material due to landslides and recognize the major critical issues that usually occur during a natural hazard (e.g. river bank erosion and/or collapse due to floods). In this work we considered different case studies located in different environmental contexts of Italy, where extensive photosets were obtained using smartphones. TLS data were also considered in the analysis as a benchmark for comparison with the SfM data. Digital Surface Models (DSMs) derived from SfM at centimeter grid-cell resolution proved effective, for example, to automatically recognize areas subject to surface instabilities and to quantitatively estimate erosion and deposition volumes. Morphometric indexes such as landform curvature and surface roughness, and statistical thresholds (e.g. standard deviation) of these indices, served as the basis for the proposed analyses. The results indicate that the SfM technique through smartphones really offers a fast, simple and affordable alternative to lidar technology. Anyone (including farmers, technicians, or Civil Protection staff) who has a good smartphone can take photographs and, from these photographs, easily obtain high-resolution DSMs. Therefore, the SfM technique accomplished with smartphones can be a very strategic tool for post-event field surveys, to increase the existing knowledge on such events, and to provide fast technical solutions for risk mitigation (e.g. landslide and flood risk management). The future challenge consists of using only a smartphone for local-scale post-event analyses. This can be further enhanced by the development of specific apps that are able to quickly build a 3D view of the case study and arrange a preliminary quantitative analysis of the process involved, ready to be sent to Civil Protection for further elaboration. Tarolli, P. (2014). High-resolution topography for understanding Earth surface processes: opportunities and challenges. Geomorphology, 216, 295-312, doi:10.1016/j.geomorph.2014.03.008.
Modelling tsunami inundation for risk analysis at the Andaman Sea Coast of Thailand
NASA Astrophysics Data System (ADS)
Kaiser, G.; Kortenhaus, A.
2009-04-01
The mega-tsunami of Dec. 26, 2004 strongly impacted the Andaman Sea coast of Thailand and devastated coastal ecosystems as well as towns, settlements and tourism resorts. In addition to the tragic loss of many lives, the destruction or damage of life-supporting infrastructure, such as buildings, roads, water & power supply etc., caused high economic losses in the region. To mitigate future tsunami impacts there is a need to assess the tsunami hazard and vulnerability in flood-prone areas at the Andaman Sea coast in order to determine the spatial distribution of risk and to develop risk management strategies. In the bilateral German-Thai project TRAIT, research is performed on integrated risk assessment for the Provinces Phang Nga and Phuket in southern Thailand, including a hazard analysis, i.e. modelling tsunami propagation to the coast, tsunami wave breaking and inundation characteristics, as well as vulnerability analysis of the socio-economic and the ecological system, in order to determine the scenario-based, specific risk for the region. In this presentation results of the hazard analysis and the inundation simulation are presented and discussed. Numerical modelling of tsunami propagation and inundation simulation is an indispensable tool for risk analysis, risk management and evacuation planning. While numerous investigations have been made to model tsunami wave generation and propagation in the Indian Ocean, detailed inundation patterns, i.e. water depth and flow dynamics, are still poorly determined. However, for risk management and evacuation planning this knowledge is essential. As the accuracy of the inundation simulation strongly depends on the available bathymetric and topographic data, a multi-scale approach is chosen in this work. The ETOPO Global Relief Model as a bathymetric basis and the Shuttle Radar Topography Mission (SRTM90) have been widely applied in tsunami modelling approaches as these data are free and available almost worldwide. However, to model tsunami-induced inundation for risk analysis and management purposes the accuracy of these data is not sufficient, as the processes in the near-shore zone cannot be modelled accurately enough and the spatial resolution of the topography is too coarse. Moreover, the SRTM data provide a digital surface model which includes vegetation and buildings in the surface description. To improve the data basis, additional bathymetric data were used in the near-shore zone of the Phang Nga and Phuket coastlines, and various remote sensing techniques as well as additional GPS measurements were applied to derive a high-resolution topography from satellite and airborne data. Land use classifications and filter methods were developed to correct the digital surface models to digital elevation models. Simulations were then performed with a non-linear shallow water model to model the 2004 Asian Tsunami and to simulate possible future ones. Results of water elevation near the coast were compared with field measurements and observations, and the influence of the resolution of the topography on inundation patterns such as water depth, velocity, dispersion and duration of the flood was analysed. The inundation simulation provides detailed hazard maps and is considered a reliable basis for risk assessment and risk zone mapping. Results are regarded as vital for the estimation of tsunami-induced damage and for evacuation planning. Results of the aforementioned simulations will be discussed during the conference.
Differences of the numerical results using topographic data of different scales and modified by different post processing techniques will be analysed and explained. Further use of the results with respect to tsunami risk analysis and management will also be demonstrated.
Photo-vibrational spectroscopy using quantum cascade laser and laser Doppler vibrometer
NASA Astrophysics Data System (ADS)
Liu, Huan; Hu, Qi; Xie, Jiecheng; Fu, Yu
2017-06-01
Photoacoustic/photothermal spectroscopy is an established technique for the detection of chemicals and explosives. However, prior sample preparation is required and the analysis is conducted in a sealed space with a high-sensitivity sensor coupled with a lock-in amplifier, limiting the technique to applications in a controllable laboratory environment. Hence, this technique may not be suitable for defense and security applications where the detection of explosives or hazardous chemicals is required in an open environment at a safe standoff distance. In this study, chemicals in various forms were excited by an intensity-modulated quantum cascade laser (QCL), while a laser Doppler vibrometer (LDV) was applied to detect the vibration signal resulting from the photoacoustic/photothermal effect. The photo-vibrational spectrum, obtained by scanning the QCL's wavelength in the MIR range, coincides well with the corresponding spectrum obtained using typical FTIR equipment. Experiments at short and long standoff distances demonstrated that the LDV is a capable sensor for chemical detection in an open environment.
A FMEA clinical laboratory case study: how to make problems and improvements measurable.
Capunzo, Mario; Cavallo, Pierpaolo; Boccia, Giovanni; Brunetti, Luigi; Pizzuti, Sante
2004-01-01
The authors have experimented with the application of the Failure Mode and Effect Analysis (FMEA) technique in a clinical laboratory. The FMEA technique allows one to: a) evaluate and measure the hazards of a process malfunction, b) decide where to execute improvement actions, and c) measure the outcome of those actions. A small sample of analytes was studied: the causes of the possible malfunctions of the analytical process were determined and the risk probability index (RPI) was calculated, with a value between 1 and 1,000. Improvement actions were implemented only for cases with RPI > 400; these allowed a reduction of RPI values of between 25% and 70% with a cost increase of < 1%. The FMEA technique can be applied to the processes of a clinical laboratory, even one of small dimensions, and offers a high potential for improvement. Nevertheless, such activity needs thorough planning because it is complex, even if the laboratory already operates an ISO 9000 Quality Management System.
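A minimal sketch of the RPI bookkeeping described above, assuming the index is computed as the product of severity, occurrence, and detectability ratings (each scored 1 to 10, which matches the 1 to 1,000 range); the failure modes listed are hypothetical examples, not those of the cited laboratory.

```python
# Hypothetical FMEA worksheet for a clinical laboratory process; the RPI is
# assumed here to be severity x occurrence x detectability, each rated 1-10.
failure_modes = [
    # (description,                      severity, occurrence, detectability)
    ("sample mislabelled at reception",         9,          4,             6),
    ("calibrator lot expired",                  7,          3,             2),
    ("pipetting volume drift",                  6,          7,             8),
]

ACTION_THRESHOLD = 400  # improvement actions only where the RPI exceeds this value

for description, sev, occ, det in failure_modes:
    rpi = sev * occ * det
    flag = "ACTION REQUIRED" if rpi > ACTION_THRESHOLD else "monitor"
    print(f"{description:35s} RPI = {rpi:4d}  {flag}")
```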
NASA Astrophysics Data System (ADS)
Park, S.; Ishii, M.
2017-12-01
Various seismic imaging methods have been developed, such as traveltime, waveform, and noise tomography, improving our knowledge of the subsurface structure and evolution. Near-surface structure, in particular, is crucial in understanding earthquake and volcano hazards. Seismic speed is directly related to the level of ground shaking, and monitoring its temporal change is valuable in volcanic hazard assessment. Here, we introduce a novel technique to constrain the seismic wave speed of the very upper crust based upon polarization measurements of teleseismic body-wave arrivals. The technique relates the orientation of recorded body waves to the wave speed immediately beneath a seismic instrument. We develop a counter-intuitive relationship that the P-wave polarization direction is only sensitive to subsurface shear wave speed but not to compressional wave speed, while the S-wave polarization direction is sensitive to both wave speeds. This approach is applied to the High-Sensitivity Seismograph Network in Japan, where the results are benchmarked against the borehole well data available at most stations. There is good agreement between polarization-based estimates and the well measurements at depths as shallow as 100 m, confirming the efficacy of the new method in resolving the shallow structure. The lateral variation of wave speeds shows that sedimentary basins and mountainous regions are characterized by low and high wave speeds, respectively. It also correlates with volcano locations and geological units of different ages. Moreover, the analysis is expanded into 3D by examining the frequency dependence, where some preliminary results using broadband data are presented. These 2D and 3D wave speed estimates can be used to identify zones of high seismic risk by comparison with population distribution. This technique requires minimal computational resources and can be applied to any single three-component seismograph. It opens a new path to a reliable, non-invasive, and inexpensive earthquake hazard assessment in any environment where drilling or a field experiment using vibro-trucks or explosives is not a practical option for measuring the near-surface seismic wave speeds. It can also provide a means of monitoring changes that occur within the very upper crust, such as from volcanic or hydrological phenomena.
Seddigi, Zaki S.; Baig, Umair; Ahmed, Saleh A.; Abdulaziz, M. A.; Danish, Ekram Y.; Khaled, Mazen M.; Lais, Abul
2017-01-01
In the present work, bismuth oxychloride nanoparticles–a light harvesting semiconductor photocatalyst–were synthesized by a facile hydrolysis route, with sodium bismuthate and hydroxylammonium chloride as the precursor materials. The as-synthesized semiconductor photocatalysts were characterized using X-ray diffraction analysis, Fourier transform infra-red spectroscopy, Raman spectroscopy, Field emission scanning electron microscopy, X-ray photoelectron spectroscopy and Photoluminescence spectroscopy techniques. The crystal structure, morphology, composition, and optical properties of these facile synthesized bismuth oxychloride nanoparticles (BiOCl NPs) were compared to those of traditional bismuth oxychloride. In addition, the photocatalytic performance of facile-synthesized BiOCl NPs and traditional BiOCl, as applied to the removal of hazardous organic dyes under visible light illumination, is thoroughly investigated. Our results reveal that facile-synthesized BiOCl NPs display strong UV-Vis light adsorption, improved charge carrier mobility and an inhibited rate of charge carrier recombination, when compared to traditional BiOCl. These enhancements result in an improved photocatalytic degradation rate of hazardous organic dyes under UV-Vis irradiance. For instance, the facile-synthesized BiOCl NPs attained 100% degradation of methylene blue and methyl orange dyes in approximately 30 mins under UV-Vis irradiation, against 55% degradation for traditional BiOCl under similar experimental conditions. PMID:28245225
Murphy, S.; Scala, A.; Herrero, A.; Lorito, S.; Festa, G.; Trasatti, E.; Tonini, R.; Romano, F.; Molinari, I.; Nielsen, S.
2016-01-01
The 2011 Tohoku earthquake produced an unexpected large amount of shallow slip greatly contributing to the ensuing tsunami. How frequent are such events? How can they be efficiently modelled for tsunami hazard? Stochastic slip models, which can be computed rapidly, are used to explore the natural slip variability; however, they generally do not deal specifically with shallow slip features. We study the systematic depth-dependence of slip along a thrust fault with a number of 2D dynamic simulations using stochastic shear stress distributions and a geometry based on the cross section of the Tohoku fault. We obtain a probability density for the slip distribution, which varies both with depth, earthquake size and whether the rupture breaks the surface. We propose a method to modify stochastic slip distributions according to this dynamically-derived probability distribution. This method may be efficiently applied to produce large numbers of heterogeneous slip distributions for probabilistic tsunami hazard analysis. Using numerous M9 earthquake scenarios, we demonstrate that incorporating the dynamically-derived probability distribution does enhance the conditional probability of exceedance of maximum estimated tsunami wave heights along the Japanese coast. This technique for integrating dynamic features in stochastic models can be extended to any subduction zone and faulting style. PMID:27725733
Lech, Tomasz
2016-05-01
The literature of environmental microbiology broadly discusses issues associated with microbial hazards in archives, but these publications are mainly devoted to paper documents. There are few articles on historical parchment documents, which used to be very important for the development of literature and the art of writing. These studies present a broad spectrum of methods for the assessment of biodeterioration hazards of the parchment document in question. They are based on both conventional microbiological methods and advanced techniques of molecular biology. Here, a qualitative analysis was conducted, based on genetic identification of bacteria and fungi present on the document as well as denaturing gradient gel electrophoresis profiling and examination of the destructive potential of isolated microbes. Moreover, the study involved a quantitative and qualitative microbiological assessment of the indoor air in the room where the parchment was kept. The microbes with the highest destructive potential that were isolated from the investigated item were the bacteria Bacillus cereus and Acinetobacter lwoffii and the fungi Penicillium chrysogenum, Chaetomium globosum, and Trichoderma longibrachiatum. The presence of the B. cereus strain was particularly interesting since, under appropriate conditions, it leads to complete parchment degradation within several days. Copyright © 2016, American Society for Microbiology. All Rights Reserved.
Maxim, L D; Allshouse, J N; Chen, S H; Treadway, J; Venturin, D
1998-04-01
The traditional hierarchy of measures for control of potential respiratory hazards in the workplace includes (in order of preference) engineering controls, workplace practices, and use of respiratory protection. Although third in this hierarchy, respirators can be an important component of the control mix, particularly for difficult-to-control jobs, as an interim measure (pending implementation of other controls), and in cases where exposure is intermittent. One of the problems associated with the use of respirators as a control measure is that valid and adequate data on respirator usage are often not available. Absent these data, it is difficult to determine the practical effectiveness of respirators, and exposure calculations that include the protective effect of respirators are speculative. This paper presents models (and appropriate statistical fitting techniques) suitable for quantification of respirator usage and defines three potentially useful measures of effectiveness for a respirator program. These models are illustrated with monitoring data on refractory ceramic fiber (RCF) developed as part of a Consent Agreement between the RCF industry and the U.S. Environmental Protection Agency. For this substance there are extensive and comprehensive monitoring data available. The models and methods of analysis may prove applicable for other potential respiratory hazards in the workplace. Copyright 1998 Academic Press.
Space Propulsion Hazards Analysis Manual (SPHAM). Volume 2. Appendices
1988-10-01
Report documentation (approved for public release). Contents include: Volume I, Chapter 2, Requirements and the Hazards Analysis Process; Volume I, Chapter 3, Accident Scenarios; a list of the hazardous materials discussed; descriptions of the failure scenarios; and the types of post-accident environment discussed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Klein, F.W.
1994-03-28
This bibliography is divided into the following four sections: Seismicity of Hawaii and Kilauea Volcano; Occurrence, locations and accelerations from large historical Hawaiian earthquakes; Seismic hazards of Hawaii; and Methods of seismic hazard analysis. It contains 62 references, most of which are accompanied by short abstracts.
NASA Astrophysics Data System (ADS)
Zosseder, K.; Post, J.; Steinmetz, T.; Wegscheider, S.; Strunz, G.
2009-04-01
Indonesia is located at one of the most active geological subduction zones in the world. Following the most recent seaquakes and their subsequent tsunamis in December 2004 and July 2006, it is expected that tsunamis are likely to occur in the near future as well, due to increased tectonic tensions leading to abrupt vertical seafloor alterations after a century of relative tectonic silence. To face this devastating threat, tsunami hazard maps are very important as a basis for evacuation planning and mitigation strategies. In terms of a tsunami impact, the hazard assessment is mostly covered by numerical modelling because the model results normally offer the most precise database for a hazard analysis, as they include spatially distributed data and their influence on the hydraulic dynamics. Generally a model result gives a probability for the intensity distribution of a tsunami at the coast (or run-up) and the spatial distribution of the maximum inundation area, depending on the location and magnitude of the tsunami source used. The boundary condition of the source used for the model is mostly chosen by a worst-case approach. But for a tsunami hazard assessment covering a large coastal area, as is demanded in the GITEWS (German Indonesian Tsunami Early Warning System) project in which the present work is embedded, this approach is not practicable because many tsunami sources can cause an impact at the coast and must be considered. Thus a multi-scenario tsunami model approach is developed to provide a reliable hazard assessment covering large areas. For the Indonesian Early Warning System many tsunami scenarios were modelled by the Alfred Wegener Institute (AWI) for different probable tsunami sources and with different magnitudes along the Sunda Trench. Every modelled scenario delivers the spatial distribution of the inundation for a specific area, the wave height at the coast in this area, and the estimated times of arrival (ETAs) of the waves, caused by one tsunamigenic source with a specific magnitude. These parameters from the several scenarios can overlap each other along the coast and must be combined to obtain one comprehensive hazard assessment for all possible future tsunamis in the region under observation. The simplest way to derive the inundation probability along the coast using the multi-scenario approach is to overlay all scenario inundation results and to determine how often a point on land is significantly inundated across the various scenarios. But this does not take into account that the tsunamigenic sources used for the modelled scenarios have different likelihoods of causing a tsunami. Hence a statistical analysis of historical data and geophysical investigation results based on numerical modelling is added to the hazard assessment, which clearly improves its significance. For this purpose the present method is developed; it contains a logical combination of the diverse probabilities assessed, such as the probability of occurrence of different earthquake magnitudes at different localities, the probability of occurrence of a specific wave height at the coast, and the probability of every point on land being hit by a tsunami. The values are combined by a logical tree technique and quantified by statistical analysis of historical data and of the tsunami modelling results, as mentioned before.
This results in a tsunami inundation probability map covering the South West Coast of Indonesia which nevertheless shows a significant spatial diversity offering a good base for evacuation planning and mitigation strategies. Keywords: tsunami hazard assessment, tsunami modelling, probabilistic analysis, early warning
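A minimal sketch of the scenario-combination step outlined above: per-point inundation flags from individual scenarios are first simply overlaid and then weighted by assumed source probabilities. The flags and weights below are illustrative placeholders, not GITEWS results.

```python
import numpy as np

# Hypothetical inundation flags for 4 land points from 3 modelled scenarios
# (1 = point significantly inundated in that scenario, 0 = stays dry).
inundated = np.array([
    [1, 1, 0, 0],   # scenario A
    [1, 0, 0, 0],   # scenario B
    [1, 1, 1, 0],   # scenario C
])

# Simple overlay: fraction of scenarios that flood each point (equal weights).
p_equal = inundated.mean(axis=0)

# Weighted combination: each tsunamigenic source is assigned a probability of
# occurrence (e.g. from magnitude-frequency statistics of historical data).
source_prob = np.array([0.05, 0.30, 0.65])
p_weighted = source_prob @ inundated / source_prob.sum()

print("equal-weight inundation probability:   ", p_equal)
print("source-weighted inundation probability:", p_weighted)
```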
Choi, Jeongan; Yang, Byeong Joon; Bae, Gwi-Nam; Jung, Jae Hee
2015-11-18
Recently, with the increased attention to indoor air quality, antimicrobial air filtration techniques have been studied widely to inactivate hazardous airborne microorganisms effectively. In this study, we demonstrate herbal extract incorporated (HEI) nanofibers synthesized by an electrospinning technique and their application to antimicrobial air filtration. As an antimicrobial herbal material, an ethanolic extract of Sophora flavescens, which exhibits great antibacterial activity against pathogens, was mixed with the polymer solution for the electrospinning process. We measured various characteristics of the synthesized HEI nanofibers, such as fiber morphology, fiber size distribution, and thermal stability. For application of the electrospun HEI nanofibers, we made highly effective air filters with 99.99% filtration efficiency and 99.98% antimicrobial activity against Staphylococcus epidermidis. The pressure drop across the HEI nanofiber air filter was 4.75 mmH2O at a face air velocity of 1.79 cm/s. These results will facilitate the implementation of electrospun HEI nanofiber techniques to control air quality and protect against hazardous airborne microorganisms.
NASA Astrophysics Data System (ADS)
Weatherill, Graeme; Burton, Paul W.
2010-09-01
The Aegean is the most seismically active and tectonically complex region in Europe. Damaging earthquakes have occurred here throughout recorded history, often resulting in considerable loss of life. The Monte Carlo method of probabilistic seismic hazard analysis (PSHA) is used to determine the level of ground motion likely to be exceeded in a given time period. Multiple random simulations of seismicity are generated to calculate, directly, the ground motion for a given site. Within the seismic hazard analysis we explore the impact of different seismic source models, incorporating both uniform zones and distributed seismicity. A new, simplified, seismic source model, derived from seismotectonic interpretation, is presented for the Aegean region. This is combined into the epistemic uncertainty analysis alongside existing source models for the region, and models derived by a K-means cluster analysis approach. Seismic source models derived using the K-means approach offer a degree of objectivity and reproducibility to the otherwise subjective approach of delineating seismic sources using expert judgment. Similar review and analysis is undertaken for the selection of peak ground acceleration (PGA) attenuation models, incorporating into the epistemic analysis Greek-specific models, European models and a Next Generation Attenuation model. Hazard maps for PGA on a "rock" site with a 10% probability of being exceeded in 50 years are produced and different source and attenuation models are compared. These indicate that Greek-specific attenuation models, with their smaller aleatory variability terms, produce lower PGA hazard, whilst recent European models and the Next Generation Attenuation (NGA) model produce similar results. The Monte Carlo method is extended further to assimilate epistemic uncertainty into the hazard calculation, thus integrating across several appropriate source and PGA attenuation models. Site condition and fault type are also integrated into the hazard mapping calculations. These hazard maps are in general agreement with previous maps for the Aegean, recognising the highest hazard in the Ionian Islands, Gulf of Corinth and Hellenic Arc. Peak ground accelerations for some sites in these regions reach as high as 500-600 cm s^-2 using European/NGA attenuation models, and 400-500 cm s^-2 using Greek attenuation models.
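A minimal sketch of the Monte Carlo PSHA idea described above: synthetic earthquake catalogues are drawn from a truncated Gutenberg-Richter recurrence model, site ground motion is predicted with a placeholder attenuation relation plus lognormal scatter, and the PGA with a 10% exceedance probability in 50 years is read off empirically. All coefficients are illustrative assumptions, not the paper's source or attenuation models.

```python
import numpy as np

rng = np.random.default_rng(42)

YEARS, N_SIMS = 50, 2000           # exposure window and number of synthetic catalogues
ANNUAL_RATE, B_VALUE = 4.0, 1.0    # events/yr with M >= 4.5, Gutenberg-Richter b-value
M_MIN, M_MAX = 4.5, 7.5

def sample_magnitudes(n):
    """Truncated Gutenberg-Richter (exponential) magnitudes."""
    u = rng.random(n)
    beta = B_VALUE * np.log(10.0)
    return M_MIN - np.log(1 - u * (1 - np.exp(-beta * (M_MAX - M_MIN)))) / beta

def pga(mag, dist_km):
    """Placeholder attenuation relation with lognormal scatter (illustrative only)."""
    ln_mean = -1.0 + 0.9 * mag - 1.3 * np.log(dist_km + 10.0)
    return np.exp(ln_mean + 0.6 * rng.standard_normal(mag.shape))  # PGA in g

max_pga = np.empty(N_SIMS)
for i in range(N_SIMS):
    n_events = rng.poisson(ANNUAL_RATE * YEARS)
    mags = sample_magnitudes(n_events)
    dists = rng.uniform(10.0, 150.0, n_events)       # site-to-source distances (km)
    max_pga[i] = pga(mags, dists).max() if n_events else 0.0

# PGA with 10% probability of exceedance in 50 years = 90th percentile of the maxima.
print("PGA(10% in 50 yr) ~", round(np.quantile(max_pga, 0.90), 3), "g")
```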
Hazard function theory for nonstationary natural hazards
NASA Astrophysics Data System (ADS)
Read, L. K.; Vogel, R. M.
2015-11-01
Impact from natural hazards is a shared global problem that causes tremendous loss of life and property, economic cost, and damage to the environment. Increasingly, many natural processes show evidence of nonstationary behavior including wind speeds, landslides, wildfires, precipitation, streamflow, sea levels, and earthquakes. Traditional probabilistic analysis of natural hazards based on peaks over threshold (POT) generally assumes stationarity in the magnitudes and arrivals of events, i.e. that the probability of exceedance of some critical event is constant through time. Given increasing evidence of trends in natural hazards, new methods are needed to characterize their probabilistic behavior. The well-developed field of hazard function analysis (HFA) is ideally suited to this problem because its primary goal is to describe changes in the exceedance probability of an event over time. HFA is widely used in medicine, manufacturing, actuarial statistics, reliability engineering, economics, and elsewhere. HFA provides a rich theory to relate the natural hazard event series (X) with its failure time series (T), enabling computation of corresponding average return periods, risk and reliabilities associated with nonstationary event series. This work investigates the suitability of HFA to characterize nonstationary natural hazards whose POT magnitudes are assumed to follow the widely applied Generalized Pareto (GP) model. We derive the hazard function for this case and demonstrate how metrics such as reliability and average return period are impacted by nonstationarity and discuss the implications for planning and design. Our theoretical analysis linking hazard event series X, with corresponding failure time series T, should have application to a wide class of natural hazards with rich opportunities for future extensions.
Hazard function theory for nonstationary natural hazards
NASA Astrophysics Data System (ADS)
Read, Laura K.; Vogel, Richard M.
2016-04-01
Impact from natural hazards is a shared global problem that causes tremendous loss of life and property, economic cost, and damage to the environment. Increasingly, many natural processes show evidence of nonstationary behavior including wind speeds, landslides, wildfires, precipitation, streamflow, sea levels, and earthquakes. Traditional probabilistic analysis of natural hazards based on peaks over threshold (POT) generally assumes stationarity in the magnitudes and arrivals of events, i.e., that the probability of exceedance of some critical event is constant through time. Given increasing evidence of trends in natural hazards, new methods are needed to characterize their probabilistic behavior. The well-developed field of hazard function analysis (HFA) is ideally suited to this problem because its primary goal is to describe changes in the exceedance probability of an event over time. HFA is widely used in medicine, manufacturing, actuarial statistics, reliability engineering, economics, and elsewhere. HFA provides a rich theory to relate the natural hazard event series (X) with its failure time series (T), enabling computation of corresponding average return periods, risk, and reliabilities associated with nonstationary event series. This work investigates the suitability of HFA to characterize nonstationary natural hazards whose POT magnitudes are assumed to follow the widely applied generalized Pareto model. We derive the hazard function for this case and demonstrate how metrics such as reliability and average return period are impacted by nonstationarity and discuss the implications for planning and design. Our theoretical analysis linking hazard random variable X with corresponding failure time series T should have application to a wide class of natural hazards with opportunities for future extensions.
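For concreteness, a stationary sketch of the hazard rate implied by a generalized Pareto magnitude model is given below, using standard survival-analysis definitions; the authors' object of interest is the hazard function of the failure-time series T under nonstationarity, which is not reproduced here.

```latex
% Hazard rate of a generalized Pareto exceedance X with scale \sigma > 0 and shape \xi:
%   survival S(x) = (1 + \xi x/\sigma)^{-1/\xi}, density f(x) = \sigma^{-1}(1 + \xi x/\sigma)^{-1/\xi - 1}.
\[
  h(x) \;=\; \frac{f(x)}{S(x)}
        \;=\; \frac{1}{\sigma}\left(1 + \frac{\xi x}{\sigma}\right)^{-1},
  \qquad 1 + \frac{\xi x}{\sigma} > 0,
\]
\[
  \lim_{\xi \to 0} h(x) \;=\; \frac{1}{\sigma}
  \quad \text{(the constant hazard of the exponential special case).}
\]
```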
The simulation of air recirculation and fire/explosion phenomena within a semiconductor factory.
I, Yet-Pole; Chiu, Yi-Long; Wu, Shi-Jen
2009-04-30
The semiconductor industry is a collection of capital-intensive firms that employ a variety of hazardous chemicals and engage in the design and fabrication of semiconductor devices. Owing to its processing characteristics, the fully confined structure of the fabrication area (fab) and the vertical airflow ventilation design restrict the application of traditional consequence analysis techniques that are commonly used in other industries. This adverse situation also limits the advancement of fire/explosion prevention design for the industry. In this research, a realistic model of a semiconductor factory with a fab, sub-fabrication area, supply air plenum, and return air plenum structures was constructed and a computational fluid dynamics algorithm was employed to simulate the possible fire/explosion range and its severity. The semiconductor factory has fan module units with high-efficiency particulate air filters that keep the airflow uniform within the cleanroom. This condition was modeled by 25 fans, three layers of porous ceiling, and one layer of porous floor. The obtained results predicted the real airflow pattern in the semiconductor factory very well. Different released gases, leak locations, and leak rates were applied to investigate their influence on the hazard range and severity. Common mitigation measures such as a water spray system and a pressure relief panel were also included to study their potential effectiveness in relieving thermal radiation and overpressure hazards within a fab. The semiconductor industry can use this simulation procedure as a reference on how to implement a consequence analysis for a flammable gas release accident within an air recirculation cleanroom.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stroh, K.R.
1980-01-01
The Composite HTGR Analysis Program (CHAP) consists of a model-independent systems analysis mainframe named LASAN and model-dependent linked code modules, each representing a component, subsystem, or phenomenon of an HTGR plant. The Fort St. Vrain (FSV) version (CHAP-2) includes 21 coded modules that model the neutron kinetics and thermal response of the core; the thermal-hydraulics of the reactor primary coolant system, secondary steam supply system, and balance-of-plant; the actions of the control system and plant protection system; the response of the reactor building; and the relative hazard resulting from fuel particle failure. FSV steady-state and transient plant data are being used to partially verify the component modeling and dynamic simulation techniques used to predict plant response to postulated accident sequences.
Statistical analysis of the uncertainty related to flood hazard appraisal
NASA Astrophysics Data System (ADS)
Notaro, Vincenza; Freni, Gabriele
2015-12-01
The estimation of flood hazard frequency statistics for an urban catchment is of great interest in practice. It provides an evaluation of potential flood risk and related damage and supports decision making for flood risk management. Flood risk is usually defined as a function of the probability that a system deficiency will cause flooding (hazard) and the expected damage due to the flooding magnitude (damage), taking into account both the exposure and the vulnerability of the goods at risk. The expected flood damage can be evaluated by an a priori estimation of potential damage caused by flooding or by interpolating real damage data. With regard to flood hazard appraisal, several procedures propose to identify a hazard indicator (HI), such as flood depth or the combination of flood depth and velocity, and to assess the flood hazard for the analyzed area by comparing the HI variables with user-defined threshold values or curves (penalty curves or matrixes). However, flooding data are usually unavailable or piecemeal, preventing a reliable flood hazard analysis; therefore, hazard analysis is often performed by means of mathematical simulations aimed at evaluating water levels and flow velocities over the catchment surface. As a result, a great part of the uncertainty intrinsic to flood risk appraisal can be related to the hazard evaluation, due to the uncertainty inherent in modelling results and to the subjectivity of the user-defined hazard thresholds applied to link flood depth to a hazard level. In the present work, a statistical methodology is proposed for evaluating and reducing the uncertainties connected with hazard level estimation. The methodology has been applied to a real urban watershed as a case study.
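A minimal sketch of the user-defined threshold step mentioned above, mapping a flood-depth hazard indicator to a hazard class; the depth thresholds are illustrative placeholders, and their subjectivity is precisely what the proposed methodology seeks to quantify.

```python
# Map a flood-depth hazard indicator (HI) to a hazard class using user-defined
# thresholds (illustrative values in metres; real penalty curves may also use velocity).
DEPTH_THRESHOLDS = [(0.10, "negligible"), (0.30, "low"), (0.60, "medium"), (1.00, "high")]

def hazard_class(depth_m: float) -> str:
    for threshold, label in DEPTH_THRESHOLDS:
        if depth_m <= threshold:
            return label
    return "very high"

simulated_depths = [0.05, 0.25, 0.45, 0.80, 1.60]   # e.g. from a flooding model run
print([hazard_class(d) for d in simulated_depths])
```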
Geological hazard monitoring system in Georgia
NASA Astrophysics Data System (ADS)
Gaprindashvili, George
2017-04-01
Georgia belongs to one of the world's most complex mountainous regions in terms of the scale and frequency of geological processes and the damage caused to population, farmland, and infrastructure facilities. Geological hazards (landslides, debris flows/mudflows, rockfalls, erosion, etc.) affect many populated areas, agricultural fields, roads, oil and gas pipes, high-voltage electric power transmission towers, hydraulic structures, and tourist complexes. Landslides occur in almost all geomorphological zones, resulting in wide differentiation in failure types and mechanisms and in the size-frequency distribution. In Georgia, geological hazards are triggered by: 1. activation of highly intense earthquakes; 2. meteorological events provoking disaster processes against the background of global climatic change; 3. large-scale human impact on the environment. The prediction and monitoring of geological hazards is a very broad theme, which involves researchers from different fields. Geological hazard monitoring is essential to prevent and mitigate these hazards. In past years, several monitoring systems, such as ground-based geodetic techniques and a debris-flow Early Warning System (EWS), were installed in Georgia at highly sensitive landslide and debris-flow areas. This work presents a description of the geological hazard monitoring system in Georgia.
A rapid method for the sampling of atmospheric water vapour for isotopic analysis.
Peters, Leon I; Yakir, Dan
2010-01-01
Analysis of the stable isotopic composition of atmospheric moisture is widely applied in the environmental sciences. Traditional methods for obtaining isotopic compositional data from ambient moisture have required complicated sampling procedures, expensive and sophisticated distillation lines, hazardous consumables, and lengthy treatments prior to analysis. Newer laser-based techniques are expensive and usually not suitable for large-scale field campaigns, especially in cases where access to mains power is not feasible or high spatial coverage is required. Here we outline the construction and usage of a novel vapour-sampling system based on a battery-operated Stirling cycle cooler, which is simple to operate, does not require any consumables, or post-collection distillation, and is light-weight and highly portable. We demonstrate the ability of this system to reproduce delta(18)O isotopic compositions of ambient water vapour, with samples taken simultaneously by a traditional cryogenic collection technique. Samples were collected over 1 h directly into autosampler vials and were analysed by mass spectrometry after pyrolysis of 1 microL aliquots to CO. This yielded an average error of < +/-0.5 per thousand, approximately equal to the signal-to-noise ratio of traditional approaches. This new system provides a rapid and reliable alternative to conventional cryogenic techniques, particularly in cases requiring high sample throughput or where access to distillation lines, slurry maintenance or mains power is not feasible. Copyright 2009 John Wiley & Sons, Ltd.
Monte Carlo simulation for slip rate sensitivity analysis in Cimandiri fault area
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pratama, Cecep, E-mail: great.pratama@gmail.com; Meilano, Irwan; Nugraha, Andri Dian
Slip rate is used to estimate the earthquake recurrence relationship, which has the greatest influence on the hazard level. We examine the slip rate contribution to Peak Ground Acceleration (PGA) in probabilistic seismic hazard maps (10% probability of exceedance in 50 years, or a 500-year return period). The hazard curve of PGA has been investigated for Sukabumi using a PSHA (Probabilistic Seismic Hazard Analysis). We observe that the crustal fault has the greatest influence on the hazard estimate. A Monte Carlo approach has been developed to assess the sensitivity. Then, the properties of the Monte Carlo simulations have been assessed. The uncertainty and coefficient of variation of the slip rate for the Cimandiri Fault area have been calculated. We observe that the seismic hazard estimates are sensitive to the fault slip rate, with a seismic hazard uncertainty of about 0.25 g. For the specific site, we found the seismic hazard estimate for Sukabumi to be between 0.4904 - 0.8465 g, with uncertainty between 0.0847 - 0.2389 g and COV between 17.7% - 29.8%.
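A minimal sketch of the sensitivity bookkeeping described above: a fault slip rate is sampled repeatedly, propagated through a placeholder hazard relation, and the uncertainty and coefficient of variation (COV) of the resulting PGA are computed. The relation and the numbers are illustrative assumptions, not the study's PSHA model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical slip rate for the fault: mean 4 mm/yr with roughly 30% relative uncertainty.
slip_rate = rng.normal(loc=4.0, scale=1.2, size=10_000)      # mm/yr
slip_rate = slip_rate[slip_rate > 0]                          # keep physical values

def pga_from_slip_rate(s):
    """Placeholder monotonic hazard relation: a faster-slipping fault gives higher 500-yr PGA."""
    return 0.35 + 0.08 * np.log1p(s)                          # PGA in g (illustrative)

pga = pga_from_slip_rate(slip_rate)
mean, std = pga.mean(), pga.std(ddof=1)
print(f"PGA mean = {mean:.3f} g, uncertainty (1 sigma) = {std:.3f} g, COV = {100 * std / mean:.1f}%")
```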
Hydrogen Hazards Assessment Protocol (HHAP): Approach and Methodology
NASA Technical Reports Server (NTRS)
Woods, Stephen
2009-01-01
This viewgraph presentation reviews the approach and methodology used to develop an assessment protocol for hydrogen hazards. Included in the presentation are the reasons to perform hazard assessments, the types of hazard assessments that exist, an analysis of hydrogen hazards, and specific information about the Hydrogen Hazards Assessment Protocol (HHAP). The assessment is specifically tailored to hydrogen behavior. The end product of the assessment is a compilation of hazards, mitigations and associated factors to facilitate decision making and achieve best practice.
Aeromagnetic investigations of hazardous waste sites
,
1995-01-01
Aeromagnetic survey data collected by helicopter over hazardous waste sites can be used to map the distribution of buried metallic (ferrous) objects at these sites, including drums and scrap metal. Thorough knowledge of the locations and nature of hazardous waste containers and contaminated objects is needed prior to the start of remediation efforts. Non-invasive geophysical techniques such as the aeromagnetic method provide the best way to obtain this knowledge. The U.S. Geological Survey (USGS) not only has experience in processing and interpreting aeromagnetic surveys of this type but also offers aid in the design and monitoring of contracts for such surveys.
NASA Technical Reports Server (NTRS)
Kessler, D. J. (Compiler); Su, S. Y. (Compiler)
1985-01-01
Earth orbital debris issues and recommended future activities are discussed. The workshop addressed the areas of environment definition, hazards to spacecraft, and space object management. It concluded that orbital debris is a potential problem for future space operations. However, before recommending any major efforts to control the environment, more data are required. The most significant required data are on the population of debris smaller than 4 cm in diameter. New damage criteria are also required. When these data are obtained, they can be combined with hypervelocity data to evaluate the hazards to future spacecraft. After these hazards are understood, then techniques to control the environment can be evaluated.
Innovative Solidification Techniques for Hazardous Wastes at Army Installations.
1985-11-01
Excerpts: "Fixed Hazardous Industrial Wastes and Flue Gas Desulfurization Sludges," Interim Report, EPA-600/2-76-182, US Environmental Protection Agency ... flue gas. Flyash from coal-fired power plants is an almost entirely inorganic product having a glassy nature; consequently, flyash by itself has little ... effectiveness of alternative control strategies for reducing environmental impacts.
Yan, Fang; Xu, Kaili
2017-01-01
Because a biomass gasification station includes various hazard factors, hazard assessment is needed and significant. In this article, the cloud model (CM) is employed to improve set pair analysis (SPA), and a novel hazard assessment method for a biomass gasification station is proposed based on the cloud model-set pair analysis (CM-SPA). In this method, cloud weight is proposed to be the weight of index. In contrast to the index weight of other methods, cloud weight is shown by cloud descriptors; hence, the randomness and fuzziness of cloud weight will make it effective to reflect the linguistic variables of experts. Then, the cloud connection degree (CCD) is proposed to replace the connection degree (CD); the calculation algorithm of CCD is also worked out. By utilizing the CCD, the hazard assessment results are shown by some normal clouds, and the normal clouds are reflected by cloud descriptors; meanwhile, the hazard grade is confirmed by analyzing the cloud descriptors. After that, two biomass gasification stations undergo hazard assessment via CM-SPA and AHP based SPA, respectively. The comparison of assessment results illustrates that the CM-SPA is suitable and effective for the hazard assessment of a biomass gasification station and that CM-SPA will make the assessment results more reasonable and scientific. PMID:28076440
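The cloud descriptors referred to above are conventionally the expectation Ex, entropy En, and hyper-entropy He; under that assumption, a minimal sketch of the standard forward normal cloud generator is given below (the index weights and grading scheme of the paper's CM-SPA procedure are not reproduced).

```python
import numpy as np

def forward_normal_cloud(ex, en, he, n_drops=1000, seed=0):
    """Generate cloud drops (x_i, mu_i) for a normal cloud with descriptors (Ex, En, He)."""
    rng = np.random.default_rng(seed)
    en_prime = rng.normal(en, he, n_drops)                 # per-drop entropy, N(En, He^2)
    x = rng.normal(ex, np.abs(en_prime))                   # drop position, N(Ex, En'^2)
    mu = np.exp(-(x - ex) ** 2 / (2 * en_prime ** 2))      # certainty degree of each drop
    return x, mu

# Example: a hypothetical "moderate hazard" grade described by (Ex=0.5, En=0.1, He=0.01).
x, mu = forward_normal_cloud(0.5, 0.1, 0.01)
print(x[:5], mu[:5])
```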
Hazard perception of Dutch farmers and veterinarians related to dairy young stock rearing.
Boersema, J S C; Noordhuizen, J P T M; Lievaart, J J
2013-08-01
A group of 110 dairy farmers and 26 bovine veterinarians participated in a web-based questionnaire using the adaptive conjoint analysis technique to rank their perception regarding several hazards during 6 subsequent periods of the process of dairy young stock rearing. The method applied only involved selected respondents with a high consistency in their answering (correlation >30%). For the ranking, answers were first transformed into a utility score (US) for each hazard. The final ranking for each of the 6 periods was based on the US per hazard separately for farmers and veterinarians. Besides the ranking, the absolute values and the US itself were also compared between farmers and veterinarians to determine any statistically significant differences between the levels of the score despite the ranking. The overall conclusion is that, for almost every designated period, the ranking of the hazards differed between farmers and veterinarians. Only 1 period was observed (period IV, Pregnancy period until 4 weeks before calving) where veterinarians and farmers had the same top 3 ranking of the hazards, namely "Mastitis," "Abortion," and "Poor growth rate of the pregnant heifer." Major differences between farmers and veterinarians were seen during period II (feeding milk until weaning) for the hazard "Diarrhea in older calf," which was considered less important by farmers compared to veterinarians, and period number III (weaning until insemination) for "Over-condition," which, again, was seen as the most important hazard by veterinarians, but only ranked as number 5 by farmers. Besides the ranking, significant differences in absolute US values between veterinarians and farmers were seen in "Infection with Johne's disease" (14.5 vs. 7.8), "Diarrhea in newborn calf" (18.2 vs. 12.2), and "Insufficient feed intake" (16.2 vs. 8.4) in period I (colostrum until transition to milk replacer). Lameness represented the most important significant difference in absolute values in period III (weaning until insemination; 6.3 vs. 14.3), which was again significant in period V (4 wks before calving until calving; 7.4 vs. 12.1). The outcome of this study shows that hazard perception of veterinarians and farmers differs for most rearing periods (in ranking and absolute values). The outcome of this study can be used for 2 purposes: first, to improve communication between farmers and their consulting veterinarian about hazards and hazard perception in young stock rearing; and second, the US scores can be used to select top priority hazards which should at least be integrated into management advisory programs to improve dairy young stock rearing. Copyright © 2013 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
Identifying Changes of Complex Flood Dynamics with Recurrence Analysis
NASA Astrophysics Data System (ADS)
Wendi, D.; Merz, B.; Marwan, N.
2016-12-01
Temporal changes in the flood hazard system are known to be difficult to detect and attribute due to multiple drivers that include complex processes that are non-stationary and highly variable. These drivers, such as human-induced climate change, natural climate variability, implementation of flood defenses, river training, or land use change, can act at various space-time scales and influence or mask each other. Flood time series may show complex behavior that varies at a range of time scales and may cluster in time. Moreover, hydrological time series (e.g., discharge) are often subject to measurement errors, such as rating-curve errors, especially in the case of extremes, where observations are actually derived through extrapolation. This study focuses on the application of recurrence-based data analysis techniques (recurrence plots) for understanding and quantifying spatio-temporal changes in flood hazard in Germany. The recurrence plot is known as an effective tool to visualize the dynamics of phase space trajectories, i.e., trajectories constructed from a time series by using an embedding dimension and a time delay, and it is known to be effective in analyzing non-stationary and non-linear time series. The sensitivity of recurrence analysis to common measurement errors and noise will also be analyzed and evaluated against conventional methods. The emphasis will be on the identification of characteristic recurrence properties that could associate typical dynamics with certain flood events.
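A minimal sketch of the recurrence plot construction described above (time-delay embedding followed by thresholding of pairwise distances); the embedding dimension, delay, and threshold are illustrative choices, not those used in the study.

```python
import numpy as np

def recurrence_matrix(x, dim=3, delay=2, eps=0.2):
    """Binary recurrence matrix of a scalar series x using time-delay embedding."""
    n = len(x) - (dim - 1) * delay
    # Embedded trajectory: each row is (x_t, x_{t+delay}, ..., x_{t+(dim-1)*delay}).
    emb = np.column_stack([x[i * delay : i * delay + n] for i in range(dim)])
    dist = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=-1)  # pairwise distances
    return (dist <= eps).astype(int)   # 1 where the trajectory recurs within eps

# Example with a synthetic "discharge-like" series (seasonal cycle plus noise).
t = np.arange(600)
series = np.sin(2 * np.pi * t / 50) + 0.1 * np.random.default_rng(1).standard_normal(t.size)
R = recurrence_matrix(series)
print("recurrence rate:", R.mean())
```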
Agent-based simulation for human-induced hazard analysis.
Bulleit, William M; Drewek, Matthew W
2011-02-01
Terrorism could be treated as a hazard for design purposes. For instance, the terrorist hazard could be analyzed in a manner similar to the way that seismic hazard is handled. No matter how terrorism is dealt with in the design of systems, the need for predictions of the frequency and magnitude of the hazard will be required. And, if the human-induced hazard is to be designed for in a manner analogous to natural hazards, then the predictions should be probabilistic in nature. The model described in this article is a prototype model that used agent-based modeling (ABM) to analyze terrorist attacks. The basic approach in this article of using ABM to model human-induced hazards has been preliminarily validated in the sense that the attack magnitudes seem to be power-law distributed and attacks occur mostly in regions where high levels of wealth pass through, such as transit routes and markets. The model developed in this study indicates that ABM is a viable approach to modeling socioeconomic-based infrastructure systems for engineering design to deal with human-induced hazards. © 2010 Society for Risk Analysis.
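A toy sketch consistent with the two behaviours reported above (attack locations concentrating where wealth flows and power-law distributed magnitudes); it is a statistical caricature for illustration only, not the authors' agent-based prototype.

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical wealth flow through 5 regions (e.g. transit routes and markets).
regions = ["port", "market", "transit hub", "suburb", "farmland"]
wealth_flow = np.array([8.0, 6.0, 5.0, 1.5, 0.5])
p_target = wealth_flow / wealth_flow.sum()          # attack location ~ wealth passing through

def powerlaw_magnitude(n, alpha=2.5, m_min=1.0):
    """Pareto (power-law) distributed attack magnitudes with exponent alpha."""
    return m_min * (1 - rng.random(n)) ** (-1 / (alpha - 1))

N_ATTACKS = 5000
targets = rng.choice(len(regions), size=N_ATTACKS, p=p_target)
magnitudes = powerlaw_magnitude(N_ATTACKS)

for i, name in enumerate(regions):
    mask = targets == i
    biggest = magnitudes[mask].max() if mask.any() else 0.0
    print(f"{name:12s} attacks={mask.sum():5d}  largest magnitude={biggest:8.1f}")
```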
NASA Astrophysics Data System (ADS)
Benkert, B.; Perrin, A.; Calmels, F.
2015-12-01
Together with its partners, the Northern Climate ExChange (NCE, part of the Yukon Research Centre at Yukon College) has been mapping permafrost-related hazard risk in northern communities since 2010. By integrating geoscience and climate project data, we have developed a series of community-scale hazard risk maps. The maps depict hazard risk in stoplight colours for easy interpretation, and support community-based, future-focused adaptation planning. Communities, First Nations, consultants and local regulatory agencies have used the hazard risk maps to site small-scale infrastructure projects, guide land planning processes, and assess suitability of land development applications. However, we know that assessing risk is only one step in integrating the implications of permafrost degradation in societal responses to environmental change. To build on our permafrost hazard risk maps, we are integrating economic principles and traditional land use elements. To assess economic implications of adaptation to permafrost change, we are working with geotechnical engineers to identify adaptation options (e.g., modified building techniques, permafrost thaw mitigation approaches) that suit the risks captured by our existing hazard risk maps. We layer this with an economic analysis of the costs associated with identified adaptation options, providing end-users with a more comprehensive basis upon which to make decisions related to infrastructure. NCE researchers have also integrated traditional land use activities in assessments of permafrost thaw risk, in a project led by Jean Marie River First Nation in the Northwest Territories. Here, the implications of permafrost degradation on food security and land use priorities were assessed by layering key game and gathering areas on permafrost thaw vulnerability maps. Results indicated that close to one quarter of big and small game habitats, and close to twenty percent of key furbearer and gathering areas within the First Nation's traditional territory, are situated on highly thaw sensitive permafrost. These projects demonstrate how physical and socio-economic factors can be integrated in assessments of permafrost vulnerability to thaw, thus providing tangible, useable results that reflect community priorities and support local decision making.
Cho, Herman M.; Washton, Nancy M.; Mueller, Karl T.; Sears, Jr., Jesse A.; Townsend, Mark R.; Ewing, James R.
2016-06-14
A magic-angle-spinning (MAS) nuclear magnetic resonance (NMR) probe is described that includes double containment enclosures configured to seal and contain hazardous samples for analysis. The probe is of a modular design that ensures containment of hazardous samples during sample analysis while preserving spin speeds for superior NMR performance and convenience of operation.
Flood Hazard Management: British and International Perspectives
NASA Astrophysics Data System (ADS)
James, L. Douglas
This proceedings of an international workshop at the Flood Hazard Research Centre (Queensway, Enfield, Middlesex, U.K.) begins by noting how past British research on flood problems concentrated on refining techniques to implement established policy. In contrast, research covered in North American and Australian publications involved normative issues on policy alternatives and administrative implementation. The workshop's participants included 16 widely recognized scientists, whose origins were about equally divided between Britain and overseas; from this group the workshop's organizers expertly drew ideas for refining British urban riverine flood hazard management and for cultivating links among researchers everywhere. Such intellectual exchange should be of keen interest to flood hazard program managers around the world, to students of comparative institutional performance, to those who make policy on protecting people from hazards, and to hydrologists and other geophysicists who must communicate descriptive information for bureaucratic, political, and public decision-making.
40 CFR 68.67 - Process hazard analysis.
Code of Federal Regulations, 2014 CFR
2014-07-01
...) CHEMICAL ACCIDENT PREVENTION PROVISIONS Program 3 Prevention Program § 68.67 Process hazard analysis. (a... instrumentation with alarms, and detection hardware such as hydrocarbon sensors.); (4) Consequences of failure of...
40 CFR 68.67 - Process hazard analysis.
Code of Federal Regulations, 2013 CFR
2013-07-01
...) CHEMICAL ACCIDENT PREVENTION PROVISIONS Program 3 Prevention Program § 68.67 Process hazard analysis. (a... instrumentation with alarms, and detection hardware such as hydrocarbon sensors.); (4) Consequences of failure of...
NASA Astrophysics Data System (ADS)
Han, Xiuzhen; Ma, Jianwen; Bao, Yuhai
2006-12-01
Current operational locust monitoring systems focus mainly on post-hazard monitoring and assessment; finding an effective way to perform early warning and prediction is of greater practical value. Based on two years (2001-2002) of continuous field sampling and statistics covering the three phases of locust development (egg hatching, nymph growth, and adults), together with spectral measurements and synchronous remote sensing data processing, we propose a three-stage remote sensing approach to monitoring locust hazards. Based on this viewpoint, remote sensing monitoring is designed in three stages: (1) during the egg hatching phase, remote sensing can retrieve land surface temperature (LST) and soil moisture; (2) during the nymph growth phase, when locust appetite increases greatly, remote sensing can calculate the vegetation index, leaf area index, and vegetation cover, and analyze their changes; (3) during the adult phase, when locusts move and assemble toward ponds, water ditches, and areas with less than 75% vegetation cover, remote sensing combined with field data can monitor and predict potential assembly areas for adult locusts. In this way the strengths of remote sensing technology are used effectively, and technical support is provided for the locust monitoring system. The ideas and techniques used in this study can also serve as a reference for other plant diseases and insect pests.
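As an illustration of the vegetation-index step in the nymph-growth stage, the following sketch computes NDVI from red and near-infrared reflectance arrays. The bands here are synthetic and the threshold is only illustrative, not part of the authors' system.

```python
# Minimal sketch: NDVI from red and NIR reflectance (synthetic band arrays).
import numpy as np

red = np.random.default_rng(1).uniform(0.05, 0.3, size=(100, 100))  # hypothetical red band
nir = np.random.default_rng(2).uniform(0.2, 0.6, size=(100, 100))   # hypothetical NIR band

ndvi = (nir - red) / (nir + red + 1e-9)   # small epsilon avoids division by zero
dense_vegetation = ndvi > 0.5             # illustrative threshold for dense cover
print("fraction of pixels flagged as dense vegetation:", dense_vegetation.mean())
```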
Deep learning algorithms for detecting explosive hazards in ground penetrating radar data
NASA Astrophysics Data System (ADS)
Besaw, Lance E.; Stimac, Philip J.
2014-05-01
Buried explosive hazards (BEHs) have been, and continue to be, one of the most deadly threats in modern conflicts. Current handheld sensors rely on a highly trained operator for them to be effective in detecting BEHs. New algorithms are needed to reduce the burden on the operator and improve the performance of handheld BEH detectors. Traditional anomaly detection and discrimination algorithms use "hand-engineered" feature extraction techniques to characterize and classify threats. In this work we use a Deep Belief Network (DBN) to transcend the traditional approaches of BEH detection (e.g., principal component analysis and real-time novelty detection techniques). DBNs are pretrained using an unsupervised learning algorithm to generate compressed representations of unlabeled input data and form feature detectors. They are then fine-tuned using a supervised learning algorithm to form a predictive model. Using ground penetrating radar (GPR) data collected by a robotic cart swinging a handheld detector, our research demonstrates that relatively small DBNs can learn to model GPR background signals and detect BEHs with an acceptable false alarm rate (FAR). In this work, our DBNs achieved 91% probability of detection (Pd) with 1.4 false alarms per square meter when evaluated on anti-tank and anti-personnel targets at temperate and arid test sites. This research demonstrates that DBNs are a viable approach to detect and classify BEHs.
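The pretrain-then-fine-tune idea behind the DBN can be sketched with scikit-learn by stacking a single BernoulliRBM feature layer in front of a logistic-regression classifier. The synthetic vectors below stand in for the GPR data, and a one-layer RBM stands in for a deeper belief network.

```python
# Minimal sketch: unsupervised RBM feature learning followed by a supervised
# classifier, illustrating the DBN pretrain/fine-tune idea on synthetic data.
import numpy as np
from sklearn.neural_network import BernoulliRBM
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import Pipeline

rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, size=(500, 64))   # hypothetical scaled GPR feature vectors
y = rng.integers(0, 2, size=500)            # hypothetical target / background labels

model = Pipeline([
    ("rbm", BernoulliRBM(n_components=32, learning_rate=0.05, n_iter=20, random_state=0)),
    ("clf", LogisticRegression(max_iter=1000)),
])
model.fit(X, y)                             # RBM features feed the classifier
print("training accuracy:", model.score(X, y))
```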
NASA Astrophysics Data System (ADS)
Thitimakorn, Thanop
2013-08-01
For site amplification studies and seismic hazard mapping, the shear-wave velocity (Vs) profile to a depth of 30 m (Vs(30)) is an important parameter and can be used to calculate ground motion for specific site conditions. In this study, near-surface Vs profiles of soils were collected at 44 sites in Chiang Mai city using the multi-channel analysis of surface waves technique. The Vs profile at each tested location was averaged over the upper 30 m to obtain Vs(30) and classified according to the National Earthquake Hazards Reduction Program (NEHRP) criteria. The average Vs(30) value of the alluvium soils was about 362 m s-1, which falls between NEHRP site classes C and D. The average Vs(30) values of flood plain, fluvial clay and natural levee soils (300, 299 and 311 m s-1, respectively) all equated to NEHRP class D. The colluvial deposits in the north-western part of the city, composed mainly of gravel, coarse sand and rock fragments, were assigned to class C (average Vs(30) of 412 m s-1). Soils with lower Vs values will experience stronger earthquake ground shaking than the underlying bedrock. Accordingly, a major part of Chiang Mai city may experience substantial ground shaking due to amplification in the soft soils.
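Vs(30) is conventionally computed as the travel-time average of the layered velocity profile over the top 30 m, Vs(30) = 30 / Σ(h_i / Vs_i). The sketch below applies that standard formula to an invented profile, not to the Chiang Mai measurements.

```python
# Minimal sketch: travel-time-averaged shear-wave velocity over the top 30 m.
def vs30(thicknesses_m, velocities_ms):
    """Time-averaged shear-wave velocity over the upper 30 m of a layered profile."""
    remaining = 30.0
    travel_time = 0.0
    for h, v in zip(thicknesses_m, velocities_ms):
        use = min(h, remaining)       # only the part of the layer within 30 m counts
        travel_time += use / v
        remaining -= use
        if remaining <= 0:
            break
    return 30.0 / travel_time

# Illustrative profile: 5 m at 180 m/s, 10 m at 300 m/s, 15 m at 450 m/s
print(f"Vs(30) ~ {vs30([5, 10, 15], [180, 300, 450]):.0f} m/s")  # falls in NEHRP class D
```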
Analysis of deformation patterns through advanced DINSAR techniques in Istanbul megacity
NASA Astrophysics Data System (ADS)
Balik Sanli, F.; Calò, F.; Abdikan, S.; Pepe, A.; Gorum, T.
2014-09-01
As a result of Turkey's economic growth and heavy migration from rural areas, Istanbul has experienced a high urbanization rate, with severe impacts on the environment in terms of pressure on natural resources, land-cover changes and uncontrolled sprawl. As a consequence, the city has become extremely vulnerable to natural and man-made hazards that induce ground deformation phenomena, threatening buildings and infrastructure and often causing significant socio-economic losses. The detection and monitoring of such deformation patterns is therefore of primary importance for hazard and risk assessment as well as for the design and implementation of effective mitigation strategies. The aim of this work is to analyze the spatial distribution and temporal evolution of deformation affecting the Istanbul metropolitan area by exploiting advanced Differential SAR Interferometry (DInSAR) techniques. In particular, we apply the Small BAseline Subset (SBAS) approach to a dataset of 43 TerraSAR-X images acquired between November 2010 and June 2012 along descending orbits, with an 11-day revisit time and a 3 m × 3 m spatial resolution. The SBAS processing allowed us to remotely detect and monitor subsidence patterns over the entire urban area and to provide detailed information at the scale of individual buildings. Such SBAS measurements, effectively integrated with ground-based monitoring data and thematic maps, allow us to explore the relationship between the detected deformation phenomena and urbanization, contributing to improved urban planning and management.
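The small-baseline pairing idea that underlies SBAS can be illustrated in a few lines: acquisitions are combined into interferogram pairs only when both the temporal and perpendicular baselines stay below chosen thresholds. The dates, baselines and thresholds below are invented, not taken from the TerraSAR-X dataset.

```python
# Minimal sketch: select interferogram pairs with small temporal and
# perpendicular baselines (the core constraint of the SBAS approach).
from datetime import date
from itertools import combinations

# (acquisition date, perpendicular baseline in metres relative to a reference)
acquisitions = [
    (date(2010, 11, 5), 0.0),
    (date(2010, 11, 16), 45.0),
    (date(2010, 12, 8), -120.0),
    (date(2011, 1, 10), 30.0),
]

MAX_TEMPORAL_DAYS = 60        # illustrative thresholds
MAX_PERP_BASELINE_M = 100.0

pairs = [
    (a, b)
    for (a, bperp_a), (b, bperp_b) in combinations(acquisitions, 2)
    if (b - a).days <= MAX_TEMPORAL_DAYS
    and abs(bperp_b - bperp_a) <= MAX_PERP_BASELINE_M
]
for a, b in pairs:
    print(f"interferogram pair: {a} / {b}")
```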
NASA Technical Reports Server (NTRS)
West, Leanne; Gimmestad, Gary; Smith, William; Kireev, Stanislav; Cornman, Larry B.; Schaffner, Philip R.; Tsoucalas, George
2008-01-01
The Forward-Looking Interferometer (FLI) is a new instrument concept for obtaining measurements of potential weather hazards to alert flight crews. The FLI concept is based on high-resolution Infrared (IR) Fourier Transform Spectrometry (FTS) technologies that have been developed for satellite remote sensing, and which have also been applied to the detection of aerosols and gases for other purposes. It is being evaluated for multiple hazards including clear air turbulence (CAT), volcanic ash, wake vortices, low slant range visibility, dry wind shear, and icing, during all phases of flight. Previous sensitivity and characterization studies addressed the phenomenology that supports detection and mitigation by the FLI. Techniques for determining the range, and hence warning time, were demonstrated for several of the hazards, and a table of research instrument parameters was developed for investigating all of the hazards discussed above. This work supports the feasibility of detecting multiple hazards with an FLI multi-hazard airborne sensor, and for producing enhanced IR images in reduced visibility conditions; however, further research must be performed to develop a means to estimate the intensities of the hazards posed to an aircraft and to develop robust algorithms to relate sensor measurables to hazard levels. In addition, validation tests need to be performed with a prototype system.
Regional United States electric field and GIC hazard impacts (Invited)
NASA Astrophysics Data System (ADS)
Gannon, J. L.; Balch, C. C.; Trichtchenko, L.
2013-12-01
Geomagnetically Induced Currents (GICs) are primarily driven by impulsive geomagnetic disturbances created by the interaction between the Earth's magnetosphere and sharp velocity, density, and magnetic field enhancements in the solar wind. However, the magnitude of the induced electric field response at the ground level, and therefore the resulting hazard to the bulk power system, is determined not only by magnetic drivers, but also by the underlying geology. Convolution techniques are used to calculate surface electric fields beginning from the spectral characteristics of magnetic field drivers and the frequency response of the local geology. Using these techniques, we describe historical scenarios for regions across the United States, and the potential impact of large events on electric power infrastructure.
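A minimal sketch of the plane-wave version of this convolution approach follows: the spectrum of a synthetic horizontal magnetic field is multiplied by the surface impedance of a uniform conducting half-space and inverse transformed to give the surface electric field. The conductivity and disturbance signal are assumptions for illustration, not the regional models used by the authors.

```python
# Minimal sketch: surface electric field from a magnetic disturbance via
# frequency-domain multiplication with a half-space surface impedance.
import numpy as np

MU0 = 4e-7 * np.pi
sigma = 1e-3                     # hypothetical ground conductivity, S/m
dt = 10.0                        # sample interval, s
n = 4096

t = np.arange(n) * dt
# Synthetic geomagnetic disturbance in the y-direction, in tesla
B_y = 500e-9 * np.sin(2 * np.pi * t / 600.0) * np.exp(-t / 5000.0)

freqs = np.fft.rfftfreq(n, d=dt)
omega = 2 * np.pi * freqs
# Surface impedance of a uniform half-space: Z = sqrt(i*omega*mu0/sigma)
Z = np.sqrt(1j * omega * MU0 / sigma)
Z[0] = 0.0                       # no DC response

E_x = np.fft.irfft(Z * np.fft.rfft(B_y / MU0), n=n)   # E = Z * H, in V/m
print("peak |E_x| ~ %.2f V/km" % (1000.0 * np.abs(E_x).max()))
```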
Probabilistic wind/tornado/missile analyses for hazard and fragility evaluations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Park, Y.J.; Reich, M.
Detailed analysis procedures and examples are presented for the probabilistic evaluation of hazard and fragility against high wind, tornado, and tornado-generated missiles. In the tornado hazard analysis, existing risk models are modified to incorporate various uncertainties including modeling errors. A significant feature of this paper is the detailed description of the Monte-Carlo simulation analyses of tornado-generated missiles. A simulation procedure, which includes the wind field modeling, missile injection, solution of flight equations, and missile impact analysis, is described with application examples.
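A toy version of that Monte Carlo chain is sketched below: tornado wind speeds are sampled from an assumed distribution, a drag-driven flight equation is integrated for each sample, and the exceedance probability of a damaging impact speed is tallied. All distributions and parameters are placeholders, not values from the paper.

```python
# Minimal sketch: Monte Carlo estimate of missile impact-speed exceedance,
# with a one-dimensional drag-driven flight equation. Parameters are invented.
import numpy as np

rng = np.random.default_rng(42)
N = 2_000

RHO_AIR = 1.225                     # air density, kg/m^3
CD, AREA, MASS = 1.2, 0.03, 15.0    # hypothetical plank-like missile
FLIGHT_TIME, DT = 3.0, 0.01         # seconds

wind_speeds = rng.weibull(2.0, N) * 60.0   # hypothetical tornado wind speeds, m/s

impact_speeds = np.zeros(N)
for i, V in enumerate(wind_speeds):
    v = 0.0                                # missile starts at rest
    for _ in range(int(FLIGHT_TIME / DT)):
        rel = V - v                        # wind speed relative to missile
        accel = 0.5 * RHO_AIR * CD * AREA * rel * abs(rel) / MASS
        v += accel * DT
    impact_speeds[i] = v

# Exceedance probability of a damaging impact speed (illustrative threshold)
print("P(impact speed > 30 m/s) ~", (impact_speeds > 30.0).mean())
```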
Critical asset and portfolio risk analysis: an all-hazards framework.
Ayyub, Bilal M; McGill, William L; Kaminskiy, Mark
2007-08-01
This article develops a quantitative all-hazards framework for critical asset and portfolio risk analysis (CAPRA) that considers both natural and human-caused hazards. Following a discussion on the nature of security threats, the need for actionable risk assessments, and the distinction between asset and portfolio-level analysis, a general formula for all-hazards risk analysis is obtained that resembles the traditional model based on the notional product of consequence, vulnerability, and threat, though with clear meanings assigned to each parameter. Furthermore, a simple portfolio consequence model is presented that yields first-order estimates of interdependency effects following a successful attack on an asset. Moreover, depending on the needs of the decisions being made and available analytical resources, values for the parameters in this model can be obtained at a high level or through detailed systems analysis. Several illustrative examples of the CAPRA methodology are provided.
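The notional product the framework starts from can be written in a few lines; the sketch below uses invented asset parameters purely to show the arithmetic, not CAPRA's detailed treatment of threat, vulnerability and consequence.

```python
# Minimal sketch: expected annual loss under the notional
# threat x vulnerability x consequence risk model.
def annual_risk(threat_rate, p_success_given_attempt, consequence):
    """Expected annual loss under the notional product model."""
    return threat_rate * p_success_given_attempt * consequence

portfolio = {
    "substation": (0.02, 0.4, 5e6),       # hypothetical (attempts/yr, vulnerability, $ loss)
    "control_center": (0.005, 0.2, 2e7),
}
for asset, params in portfolio.items():
    print(asset, "expected annual loss ~ $%.0f" % annual_risk(*params))
```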
An Independent Evaluation of the FMEA/CIL Hazard Analysis Alternative Study
NASA Technical Reports Server (NTRS)
Ray, Paul S.
1996-01-01
The present instruments of safety and reliability risk control for a majority of the National Aeronautics and Space Administration (NASA) programs/projects consist of Failure Mode and Effects Analysis (FMEA), Hazard Analysis (HA), Critical Items List (CIL), and Hazard Report (HR). This extensive analytical approach was introduced in the early 1970s and was implemented for the Space Shuttle Program by NHB 5300.4 (1D-2). Since the Challenger accident in 1986, the process has been expanded considerably, resulting in the introduction of similar and/or duplicated activities in safety/reliability risk analysis. A study initiated in 1995 to search for an alternative to the current FMEA/CIL Hazard Analysis methodology generated a proposed method on April 30, 1996. The objective of this Summer Faculty Study was to participate in and conduct an independent evaluation of the proposed alternative to simplify the present safety and reliability risk control procedure.
Multivariate time series clustering on geophysical data recorded at Mt. Etna from 1996 to 2003
NASA Astrophysics Data System (ADS)
Di Salvo, Roberto; Montalto, Placido; Nunnari, Giuseppe; Neri, Marco; Puglisi, Giuseppe
2013-02-01
Time series clustering is an important task in data analysis, used to extract implicit, previously unknown, and potentially useful information from large collections of data. Finding useful similar trends in multivariate time series is a challenge in several areas, including geophysical and environmental research. While traditional time series analysis methods deal only with univariate time series, multivariate time series analysis is a more suitable approach in fields where different kinds of data are available. Moreover, conventional time series clustering techniques do not provide the desired results for geophysical datasets because of the huge volume of data and sampling rates that differ according to the nature of each signal. In this paper, a novel approach to geophysical multivariate time series clustering is proposed, using dynamic time series segmentation and Self-Organizing Map techniques. This method allows couplings to be found among the trends of different geophysical data recorded by monitoring networks at Mt. Etna from 1996 to 2003, when the transition from summit eruptions to flank eruptions occurred. This information can be used to carry out a more careful evaluation of the state of the volcano and to support potential hazard assessment at Mt. Etna.
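The segment-then-map idea can be sketched as follows, assuming the third-party minisom package: windows cut from multivariate signals (fixed-length here, in place of the paper's dynamic segmentation) are mapped onto a small self-organizing map, and windows sharing a best-matching unit are treated as a cluster. The signals are synthetic stand-ins for the Mt. Etna records.

```python
# Minimal sketch: cluster multivariate time-series windows with a small SOM.
import numpy as np
from minisom import MiniSom   # third-party package, assumed available

rng = np.random.default_rng(0)
# Three synthetic geophysical channels sampled on a common time base
signals = np.vstack([
    np.sin(np.linspace(0, 40, 4000)) + 0.1 * rng.standard_normal(4000),
    np.cos(np.linspace(0, 40, 4000)) + 0.1 * rng.standard_normal(4000),
    rng.standard_normal(4000).cumsum() / 50.0,
])

# Dynamic segmentation is simplified here to fixed, non-overlapping windows
window = 100
segments = np.array([
    signals[:, i:i + window].ravel()
    for i in range(0, signals.shape[1] - window + 1, window)
])

som = MiniSom(4, 4, segments.shape[1], sigma=1.0, learning_rate=0.5, random_seed=0)
som.train_random(segments, 500)

clusters = [som.winner(s) for s in segments]   # best-matching unit per segment
print("distinct clusters found:", len(set(clusters)))
```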
Tian, Jun-Ping; Wang, Hong; Du, Feng-He; Wang, Tao
2016-09-01
The mortality rate of peritoneal dialysis (PD) patients is still high, and the predictive factors for PD patient mortality remain to be determined. This study aimed to explore the relationship between the standard deviation (SD) of the extracellular water/intracellular water ratio (E/I) and all-cause mortality and technique failure in continuous ambulatory PD (CAPD) patients. All 152 patients came from the PD Center between January 1, 2006 and December 31, 2007. Clinical data and E/I ratios from at least five visits, determined by bioelectrical impedance analysis, were collected. The patients were followed up until December 31, 2010. The primary outcomes were death from any cause and technique failure. Kaplan-Meier analysis and Cox proportional hazards models were used to identify risk factors for mortality and technique failure in CAPD patients. All patients were followed up for 59.6 ± 23.0 months. The patients were divided into two groups according to the SD of their E/I values: a lower SD of E/I group (≤0.126) and a higher SD of E/I group (>0.126). The patients with higher SD of E/I showed higher all-cause mortality (log-rank χ2 = 10.719, P = 0.001) and technique failure (log-rank χ2 = 9.724, P = 0.002) than those with lower SD of E/I. Cox regression analysis found that the SD of E/I independently predicted all-cause mortality (HR 3.551, 95% CI 1.442-8.746, P = 0.006) and technique failure (HR 2.487, 95% CI 1.093-5.659, P = 0.030) in CAPD patients after adjustment for confounders, except when sensitive C-reactive protein was added to the model. The SD of E/I was a strong independent predictor of all-cause mortality and technique failure in CAPD patients.
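The Cox proportional-hazards step can be sketched with the third-party lifelines package; the data frame below is synthetic and the covariates are placeholders, so the output illustrates the workflow rather than the study's results.

```python
# Minimal sketch: Cox proportional-hazards fit with a dichotomized
# SD-of-E/I covariate, on synthetic data (lifelines assumed available).
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(7)
n = 152
high_sd_ei = rng.integers(0, 2, size=n)                     # 1 if SD of E/I > 0.126
followup_months = rng.exponential(60, size=n) * np.where(high_sd_ei, 0.7, 1.0)
death = rng.integers(0, 2, size=n)                          # all-cause mortality indicator

df = pd.DataFrame({
    "time": followup_months,
    "event": death,
    "high_sd_ei": high_sd_ei,
    "age": rng.normal(60, 10, size=n),                      # example confounder
})

cph = CoxPHFitter()
cph.fit(df, duration_col="time", event_col="event")
print(cph.summary[["exp(coef)", "p"]])                      # hazard ratios and p-values
```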
Discusses methodological issues in conducting benefit-cost analysis and provides guidance for selecting and applying the most appropriate and useful mechanisms in benefit-cost analysis of toxic substances, hazardous materials, and solid waste control
40 CFR 148.5 - Waste analysis.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 40 Protection of Environment 22 2010-07-01 2010-07-01 false Waste analysis. 148.5 Section 148.5 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) WATER PROGRAMS (CONTINUED) HAZARDOUS WASTE INJECTION RESTRICTIONS General § 148.5 Waste analysis. Generators of hazardous wastes that are...
40 CFR 148.5 - Waste analysis.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 40 Protection of Environment 23 2011-07-01 2011-07-01 false Waste analysis. 148.5 Section 148.5 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) WATER PROGRAMS (CONTINUED) HAZARDOUS WASTE INJECTION RESTRICTIONS General § 148.5 Waste analysis. Generators of hazardous wastes that are...