NASA Technical Reports Server (NTRS)
Deckert, George
2010-01-01
This viewgraph presentation reviews the NASA Hazard Analysis process. The contents include: 1) Significant Incidents and Close Calls in Human Spaceflight; 2) Subsystem Safety Engineering Through the Project Life Cycle; 3) The Risk Informed Design Process; 4) Types of NASA Hazard Analysis; 5) Preliminary Hazard Analysis (PHA); 6) Hazard Analysis Process; 7) Identify Hazardous Conditions; 8) Consider All Interfaces; 9) Work a Preliminary Hazard List; 10) NASA Generic Hazards List; and 11) Final Thoughts.
Canister Storage Building (CSB) Hazard Analysis Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
POWERS, T.B.
2000-03-16
This report describes the methodology used in conducting the Canister Storage Building (CSB) hazard analysis to support the CSB final safety analysis report (FSAR) and documents the results. The hazard analysis process identified hazardous conditions and material-at-risk, determined causes for potential accidents, identified preventive and mitigative features, and qualitatively estimated the frequencies and consequences of specific occurrences. The hazard analysis was performed by a team of cognizant CSB operations and design personnel, safety analysts familiar with the CSB, and technical experts in specialty areas. Attachment A provides two lists of hazard analysis team members and describes the background and experience of each: the first is a complete list of the team members involved over the nearly two-year process; the second is the subset of those members who reviewed and agreed to the final hazard analysis documentation. The material included in this report documents the final state of that nearly two-year process, which involved formal facilitated group sessions and independent hazard and accident analysis work. The hazard analysis process led to the selection of candidate accidents for further quantitative analysis. New information relative to the hazards, discovered during the accident analysis, was incorporated into the hazard analysis data in order to compile a complete profile of facility hazards. Through this process, the results of the hazard and accident analyses led directly to the identification of safety structures, systems, and components; technical safety requirements; and other controls required to protect the public, workers, and the environment.
Hydrotreater/Distillation Column Hazard Analysis Report Rev. 2
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lowry, Peter P.; Wagner, Katie A.
This project Hazard and Risk Analysis Report contains the results of several hazard analyses and risk assessments. An initial assessment was conducted in 2012, which included a multi-step approach ranging from design reviews to a formal What-If hazard analysis. A second What-If hazard analysis was completed during February 2013 to evaluate the operation of the hydrotreater/distillation column processes to be installed in a process enclosure within the Process Development Laboratory West (PDL-West) facility located on the PNNL campus. The qualitative analysis included participation of project and operations personnel and applicable subject matter experts. The analysis identified potential hazardous scenarios, each based on an initiating event coupled with a postulated upset condition. The unmitigated consequences of each hazardous scenario were generally characterized as a process upset; the exposure of personnel to steam, vapors or hazardous material; a spray or spill of hazardous material; the creation of a flammable atmosphere; or an energetic release from a pressure boundary.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Riddle, F. J.
2003-06-26
The Automated Hazard Analysis (AHA) application is a software tool used to conduct job hazard screening and analysis of tasks to be performed in Savannah River Site facilities. The AHA application provides a systematic approach to the assessment of safety and environmental hazards associated with specific tasks, and the identification of controls, regulations, and other requirements needed to perform those tasks safely. AHA is to be integrated into existing Savannah River Site work control and job hazard analysis processes. Utilization of AHA will improve the consistency and completeness of hazard screening and analysis, and increase the effectiveness of the work planning process.
A Case Study of Measuring Process Risk for Early Insights into Software Safety
NASA Technical Reports Server (NTRS)
Layman, Lucas; Basili, Victor; Zelkowitz, Marvin V.; Fisher, Karen L.
2011-01-01
In this case study, we examine software safety risk in three flight hardware systems in NASA's Constellation spaceflight program. We applied our Technical and Process Risk Measurement (TPRM) methodology to the Constellation hazard analysis process to quantify the technical and process risks involving software safety in the early design phase of these projects. We analyzed 154 hazard reports and collected metrics to measure the prevalence of software in hazards and the specificity of descriptions of software causes of hazardous conditions. We found that for 49-70% of the 154 hazardous conditions, software either could be a cause or was involved in the prevention of the hazardous condition. We also found that 12-17% of the 2,013 hazard causes involved software, and that 23-29% of all causes had a software control. The application of the TPRM methodology identified process risks in the application of the hazard analysis process itself that may lead to software safety risk.
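Illustratively, prevalence metrics of this kind can be computed from coded hazard-report data. The sketch below is a hypothetical reconstruction, not the authors' TPRM tooling: the per-cause coding with boolean software_cause/software_control flags and the sample records are assumptions.

```python
from dataclasses import dataclass

@dataclass
class Cause:
    hazard_id: str
    software_cause: bool    # software can cause or contribute to the condition
    software_control: bool  # a software control prevents or mitigates it

# Hypothetical coded data; real input would come from the hazard reports.
causes = [
    Cause("HR-001", True, False),
    Cause("HR-001", False, True),
    Cause("HR-002", False, False),
]

def pct(n: int, d: int) -> float:
    return 100.0 * n / d if d else 0.0

# Cause-level metrics.
sw_causes = sum(c.software_cause for c in causes)
sw_controls = sum(c.software_control for c in causes)

# Hazard-level metric: a hazard "involves software" if any of its causes
# is software-caused or software-controlled.
involves = {}
for c in causes:
    involves[c.hazard_id] = (involves.get(c.hazard_id, False)
                             or c.software_cause or c.software_control)

print(f"causes involving software:      {pct(sw_causes, len(causes)):.1f}%")
print(f"causes with a software control: {pct(sw_controls, len(causes)):.1f}%")
print(f"hazards involving software:     {pct(sum(involves.values()), len(involves)):.1f}%")
```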
INTERNAL HAZARDS ANALYSIS FOR LICENSE APPLICATION
DOE Office of Scientific and Technical Information (OSTI.GOV)
R.J. Garrett
2005-02-17
The purpose of this internal hazards analysis is to identify and document the internal hazards and potential initiating events associated with preclosure operations of the repository at Yucca Mountain. Internal hazards are those hazards presented by the operation of the facility and by its associated processes that can potentially lead to a radioactive release or cause a radiological hazard. In contrast to external hazards, internal hazards do not involve natural phenomena and external man-made hazards. This internal hazards analysis was performed in support of the preclosure safety analysis and the License Application for the Yucca Mountain Project. The methodology for this analysis provides a systematic means to identify internal hazards and potential initiating events that may result in a radiological hazard or radiological release during the repository preclosure period. These hazards are documented in tables of potential internal hazards and potential initiating events (Section 6.6) for input to the repository event sequence categorization process. The results of this analysis will undergo further screening and analysis based on the criteria that apply to the performance of event sequence analyses for the repository preclosure period. The evolving design of the repository will be re-evaluated periodically to ensure that internal hazards that have not been previously evaluated are identified.
Hydrothermal Liquefaction Treatment Hazard Analysis Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lowry, Peter P.; Wagner, Katie A.
Hazard analyses were performed to evaluate the modular hydrothermal liquefaction treatment system. The hazard assessment process was performed in two stages. An initial assessment utilizing Hazard Identification and Preliminary Hazards Analysis (PHA) techniques identified areas with significant or unique hazards (process safety-related hazards) that fall outside of the normal operating envelope of PNNL and warranted additional analysis. The subsequent assessment was based on a qualitative What-If analysis. The analysis was augmented, as necessary, by additional quantitative analysis for scenarios involving a release of hazardous material or energy with the potential for affecting the public. The following selected hazardous scenarios received increased attention: •For scenarios involving a release of hazardous material or energy, controls were identified in the What-If analysis table that prevent the occurrence or mitigate the effects of the release. •For scenarios with significant consequences that could impact personnel outside the immediate operations area, quantitative analyses were performed to determine the potential magnitude of the scenario. A set of "critical controls" was identified for these scenarios (see Section 4) that prevent the occurrence or mitigate the effects of the release of events with significant consequences.
NASA Technical Reports Server (NTRS)
Shortle, John F.; Allocco, Michael
2005-01-01
This paper describes a scenario-driven hazard analysis process to identify, eliminate, and control safety-related risks. Within this process, we develop selective criteria to determine the applicability of applying engineering modeling to hypothesized hazard scenarios. This provides a basis for evaluating and prioritizing the scenarios as candidates for further quantitative analysis. We have applied this methodology to proposed concepts of operations for reduced wake separation for closely spaced parallel runways. For arrivals, the process identified 43 core hazard scenarios. Of these, we classified 12 as appropriate for further quantitative modeling, 24 that should be mitigated through controls, recommendations, and/or procedures (that is, scenarios not appropriate for quantitative modeling), and 7 that have the lowest priority for further analysis.
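The triage into quantitative modeling, control-based mitigation, and low priority suggests a simple classification rule. A minimal sketch follows; the criteria names (severity, data_available, modelable) and the sample scenarios are invented for illustration, since the paper's selective criteria are not reproduced here.

```python
# Hedged sketch of scenario triage; criteria and thresholds are hypothetical.
def triage(scenario: dict) -> str:
    """Classify a hazard scenario into one of three dispositions."""
    if scenario["severity"] == "low":
        return "low-priority"
    # Quantitative modeling pays off only when the scenario is well enough
    # characterized for a model to discriminate between alternatives.
    if scenario["data_available"] and scenario["modelable"]:
        return "quantitative-model"
    return "mitigate-with-controls"

scenarios = [
    {"id": "S-01", "severity": "high", "data_available": True,  "modelable": True},
    {"id": "S-02", "severity": "high", "data_available": False, "modelable": False},
    {"id": "S-03", "severity": "low",  "data_available": True,  "modelable": True},
]
for s in scenarios:
    print(s["id"], "->", triage(s))
```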
Hydrothermal Liquefaction Treatment Preliminary Hazard Analysis Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lowry, Peter P.; Wagner, Katie A.
A preliminary hazard assessment was completed during February 2015 to evaluate the conceptual design of the modular hydrothermal liquefaction treatment system. The hazard assessment was performed in two stages. An initial assessment utilizing Hazard Identification and Preliminary Hazards Analysis (PHA) techniques identified areas with significant or unique hazards (process safety-related hazards) that fall outside of the normal operating envelope of PNNL and warranted additional analysis. The subsequent assessment was based on a qualitative What-If analysis. This analysis was augmented, as necessary, by additional quantitative analysis for scenarios involving a release of hazardous material or energy with the potential for affecting the public.
NASA Astrophysics Data System (ADS)
Citraresmi, A. D. P.; Wahyuni, E. E.
2018-03-01
The aim of this study was to inspect the implementation of Hazard Analysis and Critical Control Point (HACCP) for identification and prevention of potential hazards in the production process of dried anchovy at PT. Kelola Mina Laut (KML), Lobuk unit, Sumenep. Cold storage is needed at each anchovy processing step in order to maintain the product's physical and chemical condition. In addition, a quality assurance system should be implemented to maintain product quality. The research was conducted using a survey method, following the whole process of making anchovy from the receipt of raw materials to the packaging of the final product. Data were analyzed using a descriptive method. Implementation of HACCP at PT. KML, Lobuk unit, Sumenep was conducted by applying Pre-Requisite Programs (PRP) and a preparation stage consisting of the 5 initial stages and 7 principles of HACCP. The results showed that CCPs were found in the boiling step, with a significant hazard from Listeria monocytogenes, and in the final sorting step, with a significant hazard from foreign material contamination in the product. Actions taken were controlling the boiling temperature at 100-105 °C for 3-5 minutes and training the sorting process employees.
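The boiling CCP lends itself to a simple critical-limit check. The following sketch encodes the limits reported above (100-105 °C for 3-5 minutes); the function name and the batch records are hypothetical.

```python
def boiling_within_limits(temp_c: float, minutes: float) -> bool:
    """Critical limits for the boiling CCP as reported in the study."""
    return 100.0 <= temp_c <= 105.0 and 3.0 <= minutes <= 5.0

# Hypothetical batch log: (batch id, temperature in deg C, boiling time in min).
batches = [("B1", 102.0, 4.0), ("B2", 98.5, 4.0), ("B3", 103.0, 6.0)]
for batch_id, temp, t in batches:
    status = "OK" if boiling_within_limits(temp, t) else "DEVIATION -> corrective action"
    print(batch_id, status)
```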
40 CFR 68.67 - Process hazard analysis.
Code of Federal Regulations, 2014 CFR
2014-07-01
...) CHEMICAL ACCIDENT PREVENTION PROVISIONS Program 3 Prevention Program § 68.67 Process hazard analysis. (a... instrumentation with alarms, and detection hardware such as hydrocarbon sensors.); (4) Consequences of failure of...
40 CFR 68.67 - Process hazard analysis.
Code of Federal Regulations, 2013 CFR
2013-07-01
...) CHEMICAL ACCIDENT PREVENTION PROVISIONS Program 3 Prevention Program § 68.67 Process hazard analysis. (a... instrumentation with alarms, and detection hardware such as hydrocarbon sensors.); (4) Consequences of failure of...
Accident analysis and control options in support of the sludge water system safety analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
HEY, B.E.
A hazards analysis was initiated for the SWS in July 2001 (SNF-8626, K Basin Sludge and Water System Preliminary Hazard Analysis) and updated in December 2001 (SNF-10020 Rev. 0, Hazard Evaluation for KE Sludge and Water System - Project A16) based on conceptual design information for the Sludge Retrieval System (SRS) and 60% design information for the cask and container. SNF-10020 was again revised in September 2002 to incorporate new hazards identified from final design information and from a What-if/Checklist evaluation of operational steps. The process hazards, controls, and qualitative consequence and frequency estimates taken from these efforts have been incorporated into Revision 5 of HNF-3960, K Basins Hazards Analysis. The hazards identification process documented in the above referenced reports utilized standard industrial safety techniques (AIChE 1992, Guidelines for Hazard Evaluation Procedures) to systematically guide several interdisciplinary teams through the system using a pre-established set of process parameters (e.g., flow, temperature, pressure) and guide words (e.g., high, low, more, less). The teams generally included representation from the U.S. Department of Energy (DOE), K Basins Nuclear Safety, T Plant Nuclear Safety, K Basin Industrial Safety, fire protection, project engineering, operations, and facility engineering.
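The guide-word pairing described above can be sketched as a simple deviation matrix. The word lists below are only the examples given in the abstract, not the teams' full worksheets.

```python
# Process parameters and guide words named in the abstract (AIChE 1992 style).
parameters = ["flow", "temperature", "pressure"]
guide_words = ["high", "low", "more", "less"]

# Each (guide word, parameter) pair is a candidate deviation for the
# interdisciplinary team to discuss and disposition with causes and safeguards.
for p in parameters:
    for g in guide_words:
        print(f"deviation: {g} {p}")
```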
Preliminary hazards analysis -- vitrification process
DOE Office of Scientific and Technical Information (OSTI.GOV)
Coordes, D.; Ruggieri, M.; Russell, J.
1994-06-01
This paper presents a Preliminary Hazards Analysis (PHA) for mixed waste vitrification by joule heating. The purpose of performing a PHA is to establish an initial hazard categorization for a DOE nuclear facility and to identify those processes and structures which may have an impact on or be important to safety. The PHA is typically performed during and provides input to project conceptual design. The PHA is then followed by a Preliminary Safety Analysis Report (PSAR) performed during Title 1 and 2 design. The PSAR then leads to performance of the Final Safety Analysis Report performed during the facility's construction and testing. It should be completed before routine operation of the facility commences. This PHA addresses the first four chapters of the safety analysis process, in accordance with the requirements of DOE Safety Guidelines in SG 830.110. The hazards associated with vitrification processes are evaluated using standard safety analysis methods which include: identification of credible potential hazardous energy sources; identification of preventative features of the facility or system; identification of mitigative features; and analyses of credible hazards. Maximal facility inventories of radioactive and hazardous materials are postulated to evaluate worst case accident consequences. These inventories were based on DOE-STD-1027-92 guidance and the surrogate waste streams defined by Mayberry et al. Radiological assessments indicate that a facility, depending on the radioactive material inventory, may be an exempt, Category 3, or Category 2 facility. The calculated impacts would result in no significant impact to offsite personnel or the environment. Hazardous materials assessment indicates that a Mixed Waste Vitrification facility will be a Low Hazard facility having minimal impacts to offsite personnel and the environment.
NASA Technical Reports Server (NTRS)
Smiles, Michael D.; Blythe, Michael P.; Bejmuk, Bohdan; Currie, Nancy J.; Doremus, Robert C.; Franzo, Jennifer C.; Gordon, Mark W.; Johnson, Tracy D.; Kowaleski, Mark M.; Laube, Jeffrey R.
2015-01-01
The Chief Engineer of the Exploration Systems Development (ESD) Office requested that the NASA Engineering and Safety Center (NESC) perform an independent assessment of the ESD's integrated hazard development process. The focus of the assessment was to review the integrated hazard analysis (IHA) process and identify any gaps/improvements in the process (e.g., missed causes, cause tree completeness, missed hazards). This document contains the outcome of the NESC assessment.
29 CFR 1910.119 - Process safety management of highly hazardous chemicals.
Code of Federal Regulations, 2011 CFR
2011-07-01
... complexity of the process will influence the decision as to the appropriate PHA methodology to use. All PHA... process hazard analysis in sufficient detail to support the analysis. (3) Information pertaining to the...) Relief system design and design basis; (E) Ventilation system design; (F) Design codes and standards...
29 CFR 1910.119 - Process safety management of highly hazardous chemicals.
Code of Federal Regulations, 2010 CFR
2010-07-01
... complexity of the process will influence the decision as to the appropriate PHA methodology to use. All PHA... process hazard analysis in sufficient detail to support the analysis. (3) Information pertaining to the...) Relief system design and design basis; (E) Ventilation system design; (F) Design codes and standards...
Paukatong, K V; Kunawasen, S
2001-01-01
Nham is a traditional Thai fermented pork sausage. The major ingredients of Nham are ground pork meat and shredded pork rind. Nham has been reported to be contaminated with Salmonella spp., Staphylococcus aureus, and Listeria monocytogenes; therefore, it is a potential cause of foodborne disease for consumers. A Hazard Analysis and Critical Control Points (HACCP) generic model has been developed for the Nham process. Nham processing plants were observed and a generic flow diagram of Nham processes was constructed. Hazard analysis was then conducted. In addition to the microbial hazards from the pathogens previously found in Nham, sodium nitrite and metal were identified as the chemical and physical hazards in this product, respectively. Four steps in the Nham process have been identified as critical control points: weighing of the nitrite compound, stuffing, fermentation, and labeling. The chemical hazard of nitrite must be controlled during the weighing step; the critical limit for nitrite levels in the Nham mixture has been set at 100-200 ppm. This level is high enough to control Clostridium botulinum but does not cause a chemical hazard to the consumer. The physical hazard from metal clips can be prevented by visual inspection of every Nham product during stuffing. The microbiological hazard in Nham can be reduced in the fermentation process; the critical limit for the pH of Nham was set at lower than 4.6. Finally, since this product is not cooked during processing, educating the consumer by providing information on the label, such as "safe if cooked before consumption", could be an alternative way to prevent the microbiological hazards of this product.
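The stated critical limits translate directly into monitoring checks. A minimal sketch, assuming hypothetical function names and lot records:

```python
def nitrite_ok(ppm: float) -> bool:
    # 100-200 ppm: high enough to control Clostridium botulinum, low enough
    # to avoid a chemical hazard to the consumer (critical limit from the model).
    return 100.0 <= ppm <= 200.0

def fermentation_ok(ph: float) -> bool:
    # Critical limit: pH below 4.6 after fermentation.
    return ph < 4.6

# Hypothetical lot log: (lot id, nitrite ppm at weighing, pH after fermentation).
lots = [("L1", 150.0, 4.4), ("L2", 90.0, 4.4), ("L3", 160.0, 4.8)]
for lot, ppm, ph in lots:
    problems = []
    if not nitrite_ok(ppm):
        problems.append("nitrite outside 100-200 ppm")
    if not fermentation_ok(ph):
        problems.append("pH not below 4.6")
    print(lot, "OK" if not problems else "; ".join(problems))
```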
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bloom, R.R.
1996-04-01
The objective of this report is to demonstrate that a thorough assessment of the risks associated with the operation of the Rust Geotech patented VAC*TRAX mobile treatment unit (MTU) has been performed and documented. The MTU was developed to treat mixed wastes at the US Department of Energy (DOE) Albuquerque Operations Office sites. The MTU uses an indirectly heated, batch vacuum dryer to thermally desorb organic compounds from mixed wastes. This process hazards analysis evaluated 102 potential hazards. The three significant hazards identified involved the inclusion of oxygen in a process that also included an ignition source and fuel. Changes to the design of the MTU were made concurrent with the hazard identification and analysis; all hazards with initial risk rankings of 1 or 2 were reduced to acceptable risk rankings of 3 or 4. The overall risk to any population group from operation of the MTU was determined to be very low; the MTU is classified as a Radiological Facility with low hazards.
40 CFR 68.67 - Process hazard analysis.
Code of Federal Regulations, 2011 CFR
2011-07-01
...) CHEMICAL ACCIDENT PREVENTION PROVISIONS Program 3 Prevention Program § 68.67 Process hazard analysis. (a... potential for catastrophic consequences. (3) Engineering and administrative controls applicable to the... engineering and administrative controls; (5) Stationary source siting; (6) Human factors; and (7) A...
40 CFR 68.67 - Process hazard analysis.
Code of Federal Regulations, 2012 CFR
2012-07-01
...) CHEMICAL ACCIDENT PREVENTION PROVISIONS Program 3 Prevention Program § 68.67 Process hazard analysis. (a... potential for catastrophic consequences. (3) Engineering and administrative controls applicable to the... engineering and administrative controls; (5) Stationary source siting; (6) Human factors; and (7) A...
40 CFR 68.67 - Process hazard analysis.
Code of Federal Regulations, 2010 CFR
2010-07-01
...) CHEMICAL ACCIDENT PREVENTION PROVISIONS Program 3 Prevention Program § 68.67 Process hazard analysis. (a... potential for catastrophic consequences. (3) Engineering and administrative controls applicable to the... engineering and administrative controls; (5) Stationary source siting; (6) Human factors; and (7) A...
21 CFR 123.6 - Hazard analysis and Hazard Analysis Critical Control Point (HACCP) plan.
Code of Federal Regulations, 2013 CFR
2013-04-01
... identified food safety hazards, including as appropriate: (i) Critical control points designed to control... control points designed to control food safety hazards introduced outside the processing plant environment... Control Point (HACCP) plan. 123.6 Section 123.6 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF...
21 CFR 123.6 - Hazard analysis and Hazard Analysis Critical Control Point (HACCP) plan.
Code of Federal Regulations, 2011 CFR
2011-04-01
... identified food safety hazards, including as appropriate: (i) Critical control points designed to control... control points designed to control food safety hazards introduced outside the processing plant environment... Control Point (HACCP) plan. 123.6 Section 123.6 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF...
21 CFR 123.6 - Hazard analysis and Hazard Analysis Critical Control Point (HACCP) plan.
Code of Federal Regulations, 2014 CFR
2014-04-01
... identified food safety hazards, including as appropriate: (i) Critical control points designed to control... control points designed to control food safety hazards introduced outside the processing plant environment... Control Point (HACCP) plan. 123.6 Section 123.6 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF...
10 CFR 70.62 - Safety program and integrated safety analysis.
Code of Federal Regulations, 2011 CFR
2011-01-01
...) Radiological hazards related to possessing or processing licensed material at its facility; (ii) Chemical hazards of licensed material and hazardous chemicals produced from licensed material; (iii) Facility... performed by a team with expertise in engineering and process operations. The team shall include at least...
Bonan, Brigitte; Martelli, Nicolas; Berhoune, Malik; Maestroni, Marie-Laure; Havard, Laurent; Prognon, Patrice
2009-02-01
To apply the Hazard Analysis and Critical Control Points method to the preparation of anti-cancer drugs; to identify critical control points in our cancer chemotherapy process; and to propose control measures and corrective actions to manage these processes. The Hazard Analysis and Critical Control Points application began in January 2004 in our centralized chemotherapy compounding unit. From October 2004 to August 2005, monitoring of process nonconformities was performed to assess the method. According to the Hazard Analysis and Critical Control Points method, a multidisciplinary team was formed to describe and assess the cancer chemotherapy process. This team listed all of the critical points and calculated their risk indexes according to their frequency of occurrence, severity, and detectability. The team defined monitoring, control measures, and corrective actions for each identified risk. Finally, over a 10-month period, pharmacists reported each nonconformity of the process in a follow-up document. Our team described 11 steps in the cancer chemotherapy process. The team identified 39 critical control points, including 11 of higher importance with a high risk index. Over 10 months, 16,647 preparations were performed; 1,225 nonconformities were reported during this same period. The Hazard Analysis and Critical Control Points method is relevant when used to target a specific process such as the preparation of anti-cancer drugs. This method helped us to focus on the production steps that can have a critical influence on product quality, and led us to improve our process.
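The risk index described above is a classic Risk Priority Number: occurrence x severity x detectability, each on a quantitative scale. A minimal sketch, assuming 1-10 scales, an illustrative high-risk cut-off, and invented critical points (the unit's actual scales, threshold, and items are not given here):

```python
def rpn(occurrence: int, severity: int, detectability: int) -> int:
    """Risk Priority Number: higher means more urgent to control."""
    return occurrence * severity * detectability

# Hypothetical critical points: (name, occurrence, severity, detectability).
critical_points = [
    ("dose calculation transcription", 4, 9, 6),
    ("label mix-up", 2, 9, 3),
    ("hood airflow failure", 2, 5, 2),
]
HIGH_RISK = 100  # assumed cut-off for "higher importance" points
for name, o, s, d in sorted(critical_points, key=lambda x: -rpn(*x[1:])):
    score = rpn(o, s, d)
    flag = " <- high-risk" if score >= HIGH_RISK else ""
    print(f"{name}: RPN={score}{flag}")
```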
NASA Astrophysics Data System (ADS)
Moreira, Francisco; Silva, Nuno
2016-08-01
Safety systems require accident avoidance. This is covered by application standards, processes, techniques, and tools that support the identification, analysis, and elimination or reduction to an acceptable level of system risks and hazards. Ideally, a safety system should be free of hazards. However, both industry and academia have been struggling to ensure appropriate risk and hazard analysis, especially with respect to completeness of the hazards, formalization, and timely analysis that can influence the specifications and the implementation. Such analysis is also important when considering a change to an existing system. The Common Safety Method for Risk Evaluation and Assessment (CSM-RA) is a mandatory procedure whenever any significant change is proposed to the railway system in a European Member State. This paper provides insights into the fundamentals of CSM-RA, complemented with hazard analysis. It highlights when and how to apply these processes, and their relations and similarities with industry standards and system life cycles. Finally, the paper shows how CSM-RA can be the basis of a change management process, guiding the identification and management of hazards and helping to ensure a safety level similar to that of the initial system. The paper also shows how the CSM-RA principles can be used in other domains, particularly for space system evolution.
29 CFR 1926.64 - Process safety management of highly hazardous chemicals.
Code of Federal Regulations, 2011 CFR
2011-07-01
... analysis methodology being used. (5) The employer shall establish a system to promptly address the team's... the decision as to the appropriate PHA methodology to use. All PHA methodologies are subject to... be developed in conjunction with the process hazard analysis in sufficient detail to support the...
29 CFR 1926.64 - Process safety management of highly hazardous chemicals.
Code of Federal Regulations, 2010 CFR
2010-07-01
... analysis methodology being used. (5) The employer shall establish a system to promptly address the team's... the decision as to the appropriate PHA methodology to use. All PHA methodologies are subject to... be developed in conjunction with the process hazard analysis in sufficient detail to support the...
WE-G-BRA-06: Application of Systems and Control Theory-Based Hazard Analysis to Radiotherapy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pawlicki, T; Samost, A; Leveson, N
Purpose: The process of delivering radiation occurs in a complex socio-technical system heavily reliant on human operators. Furthermore, both humans and software are notoriously challenging to account for in traditional hazard analysis models. High-reliability industries such as aviation have approached this problem by using hazard analysis techniques grounded in systems and control theory. The purpose of this work is to apply the Systems-Theoretic Accident Model and Processes (STAMP) hazard model to radiotherapy. In particular, the System-Theoretic Process Analysis (STPA) approach is used to perform a hazard analysis of a proposed on-line adaptive cranial radiosurgery procedure that omits the CT simulation step and uses only CBCT for planning, localization, and treatment. Methods: The STPA procedure first requires the definition of high-level accidents and the hazards leading to those accidents. From there, hierarchical control structures were created, followed by the identification and description of control actions for each control structure. Utilizing these control structures, unsafe states of each control action were created. Scenarios contributing to unsafe control action states were then identified and translated into system requirements to constrain process behavior within safe boundaries. Results: Ten control structures were created for this new CBCT-only process, covering the areas of hospital and department management, treatment design and delivery, and vendor service. Twenty-three control actions were identified that contributed to over 80 unsafe states of those control actions, resulting in over 220 failure scenarios. Conclusion: The interaction of people, hardware, and software is highlighted through the STPA approach. STPA provides a hierarchical model for understanding the role of management decisions in impacting system safety, so that a process design requirement can be traced back to the hazard and accident that it is intended to mitigate. Varian Medical Systems, Inc.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-03-25
... establishment's process control plans, that is, its Hazard Analysis and Critical Control Point plans. DATES... control plans, i.e., its Hazard Analysis and Critical Control Point (HACCP) plans; and (3) make the recall... systematic prevention of biological, chemical, and physical hazards. HACCP plans are establishment-developed...
21 CFR 120.8 - Hazard Analysis and Critical Control Point (HACCP) plan.
Code of Federal Regulations, 2010 CFR
2010-04-01
...: (i) Critical control points designed to control food hazards that are reasonably likely to occur and could be introduced inside the processing plant environment; and (ii) Critical control points designed... 21 Food and Drugs 2 2010-04-01 2010-04-01 false Hazard Analysis and Critical Control Point (HACCP...
21 CFR 120.8 - Hazard Analysis and Critical Control Point (HACCP) plan.
Code of Federal Regulations, 2012 CFR
2012-04-01
...: (i) Critical control points designed to control food hazards that are reasonably likely to occur and could be introduced inside the processing plant environment; and (ii) Critical control points designed... 21 Food and Drugs 2 2012-04-01 2012-04-01 false Hazard Analysis and Critical Control Point (HACCP...
21 CFR 120.8 - Hazard Analysis and Critical Control Point (HACCP) plan.
Code of Federal Regulations, 2013 CFR
2013-04-01
...: (i) Critical control points designed to control food hazards that are reasonably likely to occur and could be introduced inside the processing plant environment; and (ii) Critical control points designed... 21 Food and Drugs 2 2013-04-01 2013-04-01 false Hazard Analysis and Critical Control Point (HACCP...
21 CFR 120.8 - Hazard Analysis and Critical Control Point (HACCP) plan.
Code of Federal Regulations, 2014 CFR
2014-04-01
...: (i) Critical control points designed to control food hazards that are reasonably likely to occur and could be introduced inside the processing plant environment; and (ii) Critical control points designed... 21 Food and Drugs 2 2014-04-01 2014-04-01 false Hazard Analysis and Critical Control Point (HACCP...
21 CFR 120.8 - Hazard Analysis and Critical Control Point (HACCP) plan.
Code of Federal Regulations, 2011 CFR
2011-04-01
...: (i) Critical control points designed to control food hazards that are reasonably likely to occur and could be introduced inside the processing plant environment; and (ii) Critical control points designed... 21 Food and Drugs 2 2011-04-01 2011-04-01 false Hazard Analysis and Critical Control Point (HACCP...
Safety Analysis of Soybean Processing for Advanced Life Support
NASA Technical Reports Server (NTRS)
Hentges, Dawn L.
1999-01-01
Soybean (cv. Hoyt) is one of the crops planned for food production within the Advanced Life Support System Integration Testbed (ALSSIT), a proposed habitat simulation for long-duration lunar/Mars missions. Soybeans may be processed into a variety of food products, including soymilk, tofu, and tempeh. Due to the closed environmental system and the importance of crew health maintenance, food safety is a primary concern on long-duration space missions. Identification of the food safety hazards and critical control points associated with the closed ALSSIT system is essential for the development of safe food processing techniques and equipment. A Hazard Analysis Critical Control Point (HACCP) model was developed to reflect proposed production and processing protocols for ALSSIT soybeans. Soybean processing was placed in the type III risk category. During the processing of ALSSIT-grown soybeans, critical control points were identified to control microbiological hazards, particularly mycotoxins, and chemical hazards from antinutrients. Critical limits were suggested at each CCP. Food safety recommendations regarding the hazards and risks associated with growing, harvesting, and processing soybeans; biomass management; and the use of multifunctional equipment were made in consideration of the limitations and restraints of the closed ALSSIT.
Comparative hazard analysis of processes leading to remarkable flash floods (France, 1930-1999)
NASA Astrophysics Data System (ADS)
Boudou, M.; Lang, M.; Vinet, F.; Cœur, D.
2016-10-01
Flash flood events are responsible for large economic losses and lead to fatalities every year in France. This is especially the case in the Mediterranean region and the overseas territories/departments of France, characterized by extreme hydro-climatological features and with a large part of the population exposed to flood risks. The recurrence of remarkable flash flood events, associated with high hazard intensity, significant damage, and socio-political consequences, therefore raises several issues for authorities and risk management policies. This study aims to improve our understanding of the hazard analysis process in the case of four remarkable flood events: March 1930, October 1940, January 1980, and November 1999. Firstly, we present the methodology used to define the remarkability score of a flood event. Then, to identify the factors leading to a remarkable flood event, we explore the main parameters of the hazard analysis process, such as the meteorological triggering conditions and the return periods of the rainfall and peak discharge, as well as some additional factors (initial catchment state, flood chronology, cascade effects, etc.). The results contribute to understanding the complexity of the processes leading to flood hazard and highlight the importance for risk managers of taking additional factors into account.
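A remarkability score of this kind can be sketched as a weighted sum over normalized criteria. The criteria names, weights, and values below are invented for illustration; the paper defines its own scoring grid.

```python
# Hedged sketch of a composite "remarkability" score; all numbers hypothetical.
weights = {
    "rainfall_return_period": 0.30,
    "peak_discharge_return_period": 0.30,
    "fatalities": 0.25,
    "sociopolitical_impact": 0.15,
}

def remarkability(scores: dict) -> float:
    """Weighted sum of criteria, each normalized to [0, 1]."""
    return sum(weights[k] * scores[k] for k in weights)

# Illustrative scoring of one event (values are not from the paper).
nov_1999 = {"rainfall_return_period": 0.9, "peak_discharge_return_period": 0.8,
            "fatalities": 0.7, "sociopolitical_impact": 0.9}
print(f"remarkability = {remarkability(nov_1999):.2f}")
```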
Space Propulsion Hazards Analysis Manual (SPHAM). Volume 2. Appendices
1988-10-01
Distribution/availability: approved for public release. Contents include: Chapter 2 - Requirements and the Hazards Analysis Process (Volume I); Chapter 3 - Accident Scenarios (Volume I). The excerpted text notes a list of the hazardous materials discussed, descriptions of the failure scenarios, and the type of post-accident environment discussed.
Safety Hazards During Intrahospital Transport: A Prospective Observational Study.
Bergman, Lina M; Pettersson, Monica E; Chaboyer, Wendy P; Carlström, Eric D; Ringdal, Mona L
2017-10-01
To identify, classify, and describe safety hazards during the process of intrahospital transport of critically ill patients. A prospective observational study. Data from participant observations of the intrahospital transport process were collected over a period of 3 months. The study was undertaken at two ICUs in one university hospital. Critically ill patients transported within the hospital by critical care nurses, unlicensed nurses, and physicians. None. Content analysis was performed using deductive and inductive approaches. We detected a total of 365 safety hazards (median, 7; interquartile range, 4-10) during 51 intrahospital transports of critically ill patients, 80% of whom were mechanically ventilated. The majority of detected safety hazards were assessed as increasing the risk of harm, compromising patient safety (n = 204). Using the System Engineering Initiative for Patient Safety, we identified safety hazards related to the work system, as follows: team (n = 61), tasks (n = 83), tools and technologies (n = 124), environment (n = 48), and organization (n = 49). Inductive analysis provided an in-depth description of those safety hazards, contributing factors, and process-related outcomes. Findings suggest that intrahospital transport is a hazardous process for critically ill patients. We have identified several factors that may contribute to transport-related adverse events, which will provide the opportunity for the redesign of systems to enhance patient safety.
Transportation systems safety hazard analysis tool (SafetyHAT) user guide (version 1.0)
DOT National Transportation Integrated Search
2014-03-24
This is a user guide for the transportation system Safety Hazard Analysis Tool (SafetyHAT) Version 1.0. SafetyHAT is a software tool that facilitates System Theoretic Process Analysis (STPA). This user guide provides instructions on how to download, ...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-11-06
... analyses and the development of other elements of the standard; developing a written action plan for..., revalidating and retaining the process hazard analysis; developing and implementing written operating
RMP Guidance for Warehouses - Chapter 7: Prevention Program (Program 3)
If you are already complying with the OSHA Process Safety Management standard for on-site consequences, your process hazard analysis (PHA) team may have to assess new hazards that could affect the public or the environment offsite.
Application of systems and control theory-based hazard analysis to radiation oncology.
Pawlicki, Todd; Samost, Aubrey; Brown, Derek W; Manger, Ryan P; Kim, Gwe-Ya; Leveson, Nancy G
2016-03-01
Both humans and software are notoriously challenging to account for in traditional hazard analysis models. The purpose of this work is to investigate and demonstrate the application of a new, extended accident causality model, called the systems-theoretic accident model and processes (STAMP), to radiation oncology. Specifically, a hazard analysis technique based on STAMP, system-theoretic process analysis (STPA), is used to perform a hazard analysis. The STPA procedure starts with the definition of high-level accidents for radiation oncology at the medical center and the hazards leading to those accidents. From there, the hierarchical safety control structure of the radiation oncology clinic is modeled, i.e., the controls that are used to prevent accidents and provide effective treatment. Using STPA, unsafe control actions (behaviors) are identified that can lead to the hazards, as well as causal scenarios that can lead to the identified unsafe control. This information can be used to eliminate or mitigate potential hazards. The STPA procedure is demonstrated on a new online adaptive cranial radiosurgery procedure that omits the CT simulation step and uses CBCT for localization and planning, with a surface imaging system used during treatment. The STPA procedure generated a comprehensive set of causal scenarios that are traced back to system hazards and accidents. Ten control loops were created for the new SRS procedure, covering the areas of hospital and department management, treatment design and delivery, and vendor service. Eighty-three unsafe control actions were identified, as well as 472 causal scenarios that could lead to those unsafe control actions. STPA provides a method for understanding the role of management decisions and hospital operations in system safety and for generating process design requirements to prevent hazards and accidents. The interaction of people, hardware, and software is highlighted. The method of STPA produces results that can be used to improve safety and prevent accidents and warrants further investigation.
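The enumeration of unsafe control actions follows STPA's four standard UCA types. A minimal sketch, with illustrative control actions (the paper's own list of 83 UCAs is not reproduced here):

```python
# The four standard unsafe-control-action types from STPA (Leveson).
UCA_TYPES = [
    "not provided when needed",
    "provided when unsafe",
    "provided too early, too late, or out of order",
    "stopped too soon or applied too long",
]

# Illustrative control actions for a radiosurgery workflow (hypothetical).
control_actions = ["start treatment beam", "apply couch shift", "approve plan"]

for action in control_actions:
    for uca in UCA_TYPES:
        # Each candidate is assessed against the system hazards; hazardous
        # ones are traced to causal scenarios and design requirements.
        print(f"UCA candidate: '{action}' -- {uca}")
```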
Modeling and Hazard Analysis Using STPA
NASA Astrophysics Data System (ADS)
Ishimatsu, Takuto; Leveson, Nancy; Thomas, John; Katahira, Masa; Miyamoto, Yuko; Nakao, Haruka
2010-09-01
A joint research project between MIT and JAXA/JAMSS is investigating the application of a new hazard analysis to the system and software in the HTV. Traditional hazard analysis focuses on component failures, but software does not fail in this way. Software most often contributes to accidents by commanding the spacecraft into an unsafe state (e.g., turning off the descent engines prematurely) or by not issuing required commands. That makes the standard hazard analysis techniques of limited usefulness on software-intensive systems, which describes most spacecraft built today. STPA is a new hazard analysis technique based on systems theory rather than reliability theory. It treats safety as a control problem rather than a failure problem. The goal of STPA, which is to create a set of scenarios that can lead to a hazard, is the same as that of FTA, but STPA includes a broader set of potential scenarios, including those in which no failures occur but problems arise due to unsafe and unintended interactions among the system components. STPA also provides more guidance to the analyst than traditional fault tree analysis. Functional control diagrams are used to guide the analysis. In addition, JAXA uses a model-based system engineering development environment (created originally by Leveson and called SpecTRM) which also assists in the hazard analysis. One of the advantages of STPA is that it can be applied early in the system engineering and development process, in a safety-driven design process where hazard analysis drives the design decisions rather than waiting until reviews identify problems that are then costly or difficult to fix. It can also be applied in an after-the-fact analysis and hazard assessment, which is what we did in this case study. This paper describes the experimental application of STPA to the JAXA HTV in order to determine the feasibility and usefulness of the new hazard analysis technique. Because the HTV was originally developed using fault tree analysis and following the NASA standards for safety-critical systems, the results of our experimental application of STPA can be compared with these more traditional safety engineering approaches in terms of the problems identified and the resources required to use it.
Defining hazards of supplemental oxygen therapy in neonatology using the FMEA tool.
van der Eijk, Anne Catherine; Rook, Denise; Dankelman, Jenny; Smit, Bert Johan
2013-01-01
To prospectively evaluate hazards in the process of supplemental oxygen therapy in very preterm infants hospitalized in a Dutch NICU. A Failure Mode and Effects Analysis (FMEA) was conducted by a multidisciplinary team. This team identified, evaluated, and prioritized hazards of supplemental oxygen therapy in preterm infants. After assigning "hazard scores" to each step in this process, recommendations were formulated for the main hazards. Performing the FMEA took seven 2-hour meetings. The top 10 hazards could all be categorized into three main topics: incorrect adjustment of the fraction of inspired oxygen (FiO2), incorrect alarm limits for SpO2, and incorrect pulse-oximetry alarm limits on patient monitors for temporary use. The FMEA culminated in recommendations in both educational and technical directions. These included suggestions for (changes in) protocols on alarm limits and manual FiO2 adjustments, education of NICU staff on the hazards of supplemental oxygen, and technical improvements in respiratory devices and patient monitors. The FMEA prioritized flaws in the process of supplemental oxygen therapy in very preterm infants. Thanks to the structured approach of the analysis by a multidisciplinary team, several recommendations were made. These recommendations are currently being implemented in the study center.
Multi-hazard risk analysis for management strategies
NASA Astrophysics Data System (ADS)
Kappes, M.; Keiler, M.; Bell, R.; Glade, T.
2009-04-01
Risk management very often operates in a reactive way, responding to an event, instead of proactively starting with risk analysis and building up the whole process of risk evaluation, prevention, event management, and regeneration. Since damage and losses from natural hazards rise continuously, more and more studies, concepts (e.g., in Switzerland or South Tyrol/Bolzano), and software packages (e.g., ARMAGEDOM, HAZUS, or RiskScape) are developed to guide, standardize, and facilitate risk analysis. But these approaches focus on different aspects and are mostly closely adapted to the situation (legislation, organization of the administration, specific processes, etc.) of the specific country or region. In this study we propose the development of a flexible methodology for multi-hazard risk analysis, identifying the stakeholders and their needs, the processes and their characteristics, and modeling approaches, as well as incoherencies that occur when combining all these different aspects. Based on this concept, a flexible software package will be established, consisting of ArcGIS as the central base complemented by various modules for hazard modeling, vulnerability assessment, and risk calculation. Not all modules will be developed newly; some will be taken from the current state of the art and connected to or integrated into ArcGIS. For this purpose two study sites, Valtellina in Italy and Barcelonnette in France, were chosen, and the hazard types debris flows, rockfalls, landslides, avalanches, and floods are planned to be included in the tool for a regional multi-hazard risk analysis. Since the central idea of this tool is its flexibility, this will only be a first step; in the future further processes and scales can be included and the instrument thus adapted to any study site.
Cho, Herman M.; Washton, Nancy M.; Mueller, Karl T.; Sears, Jr., Jesse A.; Townsend, Mark R.; Ewing, James R.
2016-06-14
A magic-angle-spinning (MAS) nuclear magnetic resonance (NMR) probe is described that includes double containment enclosures configured to seal and contain hazardous samples for analysis. The probe is of a modular design that ensures containment of hazardous samples during sample analysis while preserving spin speeds for superior NMR performance and convenience of operation.
RMP Guidance for Chemical Distributors - Chapter 7: Prevention Program (Program 3)
The OSHA Process Safety Management standard has legal authority for on-site consequences, while EPA's Prevention Program covers offsite consequences, so your process hazard analysis (PHA) team may have to assess new hazards to the public and the offsite environment.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sharifi, Mozafar; Hadidi, Mosslem; Vessali, Elahe
2009-10-15
The evaluation of a hazardous waste disposal site is a complicated process because it requires data from diverse social and environmental fields. These data often involve processing of a significant amount of spatial information which can be used by GIS as an important tool for land use suitability analysis. This paper presents a multi-criteria decision analysis alongside with a geospatial analysis for the selection of hazardous waste landfill sites in Kurdistan Province, western Iran. The study employs a two-stage analysis to provide a spatial decision support system for hazardous waste management in a typically under developed region. The purpose of GIS was to perform an initial screening process to eliminate unsuitable land followed by utilization of a multi-criteria decision analysis (MCDA) to identify the most suitable sites using the information provided by the regional experts with reference to new chosen criteria. Using 21 exclusionary criteria, as input layers, masked maps were prepared. Creating various intermediate or analysis map layers a final overlay map was obtained representing areas for hazardous waste landfill sites. In order to evaluate different landfill sites produced by the overlaying a landfill suitability index system was developed representing cumulative effects of relative importance (weights) and suitability values of 14 non-exclusionary criteria including several criteria resulting from field observation. Using this suitability index 15 different sites were visited and based on the numerical evaluation provided by MCDA most suitable sites were determined.
Sharifi, Mozafar; Hadidi, Mosslem; Vessali, Elahe; Mosstafakhani, Parasto; Taheri, Kamal; Shahoie, Saber; Khodamoradpour, Mehran
2009-10-01
The evaluation of a hazardous waste disposal site is a complicated process because it requires data from diverse social and environmental fields. These data often involve processing of a significant amount of spatial information which can be used by GIS as an important tool for land use suitability analysis. This paper presents a multi-criteria decision analysis alongside with a geospatial analysis for the selection of hazardous waste landfill sites in Kurdistan Province, western Iran. The study employs a two-stage analysis to provide a spatial decision support system for hazardous waste management in a typically under developed region. The purpose of GIS was to perform an initial screening process to eliminate unsuitable land followed by utilization of a multi-criteria decision analysis (MCDA) to identify the most suitable sites using the information provided by the regional experts with reference to new chosen criteria. Using 21 exclusionary criteria, as input layers, masked maps were prepared. Creating various intermediate or analysis map layers a final overlay map was obtained representing areas for hazardous waste landfill sites. In order to evaluate different landfill sites produced by the overlaying a landfill suitability index system was developed representing cumulative effects of relative importance (weights) and suitability values of 14 non-exclusionary criteria including several criteria resulting from field observation. Using this suitability index 15 different sites were visited and based on the numerical evaluation provided by MCDA most suitable sites were determined.
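The two-stage screening reads as an exclusionary mask followed by a weighted suitability index. A minimal sketch with hypothetical criteria, weights, and sites (the study itself used 21 exclusionary and 14 weighted criteria):

```python
# Hedged sketch of exclusionary masking plus weighted suitability scoring.
def suitability(cell: dict, weights: dict) -> float | None:
    """Return a suitability index, or None if any exclusionary test fails."""
    if not all(cell["passes_exclusion"]):
        return None  # masked out in the GIS screening stage
    return sum(weights[k] * cell["scores"][k] for k in weights)

# Hypothetical non-exclusionary criteria and expert-derived weights.
weights = {"distance_to_groundwater": 0.40, "distance_to_settlements": 0.35,
           "road_access": 0.25}

candidate_sites = {
    "site_A": {"passes_exclusion": [True, True],
               "scores": {"distance_to_groundwater": 0.8,
                          "distance_to_settlements": 0.7, "road_access": 0.5}},
    "site_B": {"passes_exclusion": [True, False],
               "scores": {"distance_to_groundwater": 0.9,
                          "distance_to_settlements": 0.9, "road_access": 0.9}},
}
for name, cell in candidate_sites.items():
    s = suitability(cell, weights)
    print(name, "excluded" if s is None else f"index={s:.2f}")
```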
Gianassi, S; Bisin, S; Bindi, B; Spitaleri, I; Bambi, F
2010-01-01
The collection and handling of hematopoietic stem cells (HSCs) must meet high quality requirements. Integrated Quality Risk Management can help to identify and contain potential risks related to HSC production. Risk analysis techniques allow one to "weigh" identified hazards, considering the seriousness of their effects, frequency, and detectability, seeking to prevent the most harmful hazards. The Hazard Analysis and Critical Control Point method, recognized as the most appropriate technique to identify risks associated with physical, chemical, and biological hazards for cellular products, consists of classifying finished product specifications and limits of acceptability, identifying all off-specifications, defining activities that can cause them, and finally establishing both a monitoring system for each Critical Control Point and corrective actions for deviations. The severity of possible effects on patients, as well as the occurrence and detectability of critical parameters, are measured on quantitative scales (Risk Priority Number [RPN]). Risk analysis was performed with this technique on the manipulation process of HPCs performed at our blood center. The data analysis showed that the hazards with higher RPN values and greater impact on the process are loss of dose and loss of tracking; the technical skills of operators and manual transcription of data were the most critical parameters. Problems related to operator skills are handled by defining targeted training programs, while other critical parameters can be mitigated with the use of continuous control systems. The blood center management software was completed by a labeling system with forms designed to comply with the standards in force and by starting implementation of a cryopreservation management module.
Annalaura, Carducci; Giulia, Davini; Stefano, Ceccanti
2013-01-01
Risk analysis is widely used in the pharmaceutical industry to manage production processes, validation activities, training, and other activities. Several methods of risk analysis are available (for example, failure mode and effects analysis, fault tree analysis), and one or more should be chosen and adapted to the specific field where they will be applied. Among the methods available, hazard analysis and critical control points (HACCP) is a methodology that has been applied since the 1960s, and whose areas of application have expanded over time from food to the pharmaceutical industry. It can be easily and successfully applied to several processes because its main feature is the identification, assessment, and control of hazards. It can be also integrated with other tools, such as fishbone diagram and flowcharting. The aim of this article is to show how HACCP can be used to manage an analytical process, propose how to conduct the necessary steps, and provide data templates necessary to document and useful to follow current good manufacturing practices. In the quality control process, risk analysis is a useful tool for enhancing the uniformity of technical choices and their documented rationale. Accordingly, it allows for more effective and economical laboratory management, is capable of increasing the reliability of analytical results, and enables auditors and authorities to better understand choices that have been made. The aim of this article is to show how hazard analysis and critical control points can be used to manage bacterial endotoxins testing and other analytical processes in a formal, clear, and detailed manner.
NASA Technical Reports Server (NTRS)
1971-01-01
A comprehensive test plan for determining the hazards associated with pyrotechnic manufacturing processes is presented. The rationale for each test is based on a systematic analysis of historical accounts of accidents and a detailed study of the characteristics of each manufacturing process. The most hazardous manufacturing operations have been determined to be pressing, mixing, reaming, and filling. The hazard potential of a given situation is evaluated in terms of the probabilities of initiation, communication, and transition to detonation (ICT). The characteristics which affect the ICT probabilities include the ignition mechanisms which are present either in normal or abnormal operation, the condition and properties of the pyrotechnic material, and the configuration of the processing equipment. Analytic expressions are derived which describe the physical conditions of the system, thus permitting a variety of processes to be evaluated in terms of a small number of experiments.
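Treating initiation, communication, and transition to detonation as a chain of conditional probabilities gives one way to compare operations. The sketch below makes that multiplicative assumption for illustration; the probability values and operation list are invented, not the plan's derived expressions.

```python
# Hedged sketch of the ICT framing: hazard potential via chained probabilities.
def ict_probability(p_initiation: float,
                    p_communication_given_i: float,
                    p_transition_given_c: float) -> float:
    """P(detonation per operation) under a simple conditional-chain assumption."""
    return p_initiation * p_communication_given_i * p_transition_given_c

# Hypothetical per-operation probabilities: (P(I), P(C|I), P(T|C)).
operations = {
    "pressing": (1e-3, 0.5, 0.10),
    "mixing":   (5e-4, 0.4, 0.05),
    "reaming":  (2e-3, 0.3, 0.02),
}
for op, probs in sorted(operations.items(),
                        key=lambda kv: -ict_probability(*kv[1])):
    print(f"{op}: P(detonation per operation) ~ {ict_probability(*probs):.2e}")
```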
Software System Architecture Modeling Methodology for Naval Gun Weapon Systems
2010-12-01
[Acronym-list fragment from the report's front matter: HAR, Hazard Action Report; HERO, Hazards of Electromagnetic Radiation to Ordnance; IOC, Initial Operational Capability; NDI, Non-Development Item; OPEVAL, Operational Evaluation; ORDALTS, Ordnance Alterations; O&SHA, Operating and Support Hazard Analysis; PDA. A surviving narrative fragment concerns equipment, systems, or procedures and processes whose malfunction would hazard safe manufacturing, and hazards of electromagnetic radiation to ordnance.]
21 CFR 120.11 - Verification and validation.
Code of Federal Regulations, 2013 CFR
2013-04-01
... analysis. Whenever a juice processor has no HACCP plan because a hazard analysis has revealed no food... analysis whenever there are any changes in the process that could reasonably affect whether a food hazard... 21 Food and Drugs 2 2013-04-01 2013-04-01 false Verification and validation. 120.11 Section 120.11...
21 CFR 120.11 - Verification and validation.
Code of Federal Regulations, 2012 CFR
2012-04-01
... analysis. Whenever a juice processor has no HACCP plan because a hazard analysis has revealed no food... analysis whenever there are any changes in the process that could reasonably affect whether a food hazard... 21 Food and Drugs 2 2012-04-01 2012-04-01 false Verification and validation. 120.11 Section 120.11...
21 CFR 120.11 - Verification and validation.
Code of Federal Regulations, 2011 CFR
2011-04-01
... analysis. Whenever a juice processor has no HACCP plan because a hazard analysis has revealed no food... analysis whenever there are any changes in the process that could reasonably affect whether a food hazard... 21 Food and Drugs 2 2011-04-01 2011-04-01 false Verification and validation. 120.11 Section 120.11...
21 CFR 120.11 - Verification and validation.
Code of Federal Regulations, 2014 CFR
2014-04-01
... analysis. Whenever a juice processor has no HACCP plan because a hazard analysis has revealed no food... analysis whenever there are any changes in the process that could reasonably affect whether a food hazard... 21 Food and Drugs 2 2014-04-01 2014-04-01 false Verification and validation. 120.11 Section 120.11...
21 CFR 120.11 - Verification and validation.
Code of Federal Regulations, 2010 CFR
2010-04-01
... analysis. Whenever a juice processor has no HACCP plan because a hazard analysis has revealed no food... analysis whenever there are any changes in the process that could reasonably affect whether a food hazard... 21 Food and Drugs 2 2010-04-01 2010-04-01 false Verification and validation. 120.11 Section 120.11...
Using a 3D CAD plant model to simplify process hazard reviews
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tolpa, G.
A Hazard and Operability (HAZOP) review is a formal predictive procedure used to identify potential hazard and operability problems associated with certain processes and facilities. The HAZOP procedure takes place several times during the life cycle of the facility. Replacing plastic models and layout and detail drawings with a 3D CAD electronic model provides access to process safety information and a level of plant topology detail that approaches the visualization capability of the imagination. This paper describes the process used to add a 3D CAD model to flowsheets and proven computer programs for the conduct of hazard and operability reviews. Using flowsheets and study nodes as a road map for the review, the need for layout and other detail drawings is all but eliminated. Using the 3D CAD model again for a post-P&ID HAZOP supports conformance to layout and safety requirements, provides superior visualization of the plant configuration, and preserves the owner's equity in the design. The response from the review teams is overwhelmingly in favor of this type of review over one that uses only drawings. Over the long term, the plant model serves more than just process hazards analysis. Ongoing use of the model can satisfy the required access to process safety information, OSHA documentation, and other legal requirements. In this paper, extensive instructions address the logic for the process hazards analysis and the preparation required to assist anyone who wishes to add the use of a 3D model to their review.
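A minimal sketch of how a HAZOP review enumerates deviations: each study node (taken from the flowsheet, as described above) is crossed with process parameters and standard guidewords. The node and parameter names are illustrative; the paper pairs this enumeration with the 3D CAD model for visualization.

```python
# Enumerate HAZOP deviations as node x parameter x guideword combinations.
from itertools import product

nodes = ["feed pump P-101", "reactor R-201 inlet"]      # illustrative study nodes
parameters = ["flow", "pressure", "temperature"]
guidewords = ["no", "more", "less", "reverse", "as well as"]

for node, param, guide in product(nodes, parameters, guidewords):
    deviation = f"{guide} {param}"
    # In a review session each deviation is examined for causes,
    # consequences, and safeguards at the given node.
    print(f"{node}: {deviation}")
```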
NASA Astrophysics Data System (ADS)
Sturm, Michael; Gems, Bernhard; Fuchs, Sven; Mazzorana, Bruno; Papathoma-Köhle, Maria; Aufleger, Markus
2016-04-01
In European mountain regions, losses due to torrential hazards are still considerably high, despite the ongoing debate on an overall increasing or decreasing trend. Recent events in Austria clearly revealed that, for technical and economic reasons, an overall protection of settlements in the alpine environment against torrential hazards is not feasible. On the side of the hazard process, events with unpredictable intensities may represent overload scenarios for the existing protection structures in the torrent catchments. They bear a particular risk of significant losses in the living space. Although the importance of vulnerability is widely recognised, there is still a research gap concerning its assessment. Currently, potential losses at buildings due to torrential hazards, and their comparison with reinstatement costs, are determined by the use of empirical functions: relations between process intensities and the extent of losses, gathered from the analysis of historic hazard events and from object-specific restoration values. This approach is not a physics-based and integral concept, since relevant and often crucial processes, such as the intrusion of the fluid-sediment mixture into elements at risk, are not considered. Our work is therefore targeted at extending the findings and models of present risk research towards an integral, more physics-based vulnerability analysis concept. Fluviatile torrential hazard processes and their impacts on the building envelope are experimentally modelled. Material intrusion processes are thereby explicitly considered. Dynamic impacts are recorded quantitatively, and spatially resolved, with a large set of force transducers. The experimental tests are accomplished with artificial vertical and skewed plates, including openings for material intrusion. Further, the impacts on specific buildings within the test site of the work, the fan apex of the Schnannerbach torrent in Tyrol (Austria), are analysed in detail. Several buildings are entirely reconstructed within the physical model at a scale of 1:30; they include the basement and first floor and thereby all relevant openings in the building envelopes. The results from the experimental modelling represent the data basis for further physics-based vulnerability analysis. Hence, the applied vulnerability analysis concept significantly extends the methods presently used in flood risk assessment. The results of the study are of basic importance for practical application, as they provide extensive information to support hazard zone mapping and management, as well as the planning of local technical protection measures.
Assessment and Control of Spacecraft Charging Risks on the International Space Station
NASA Technical Reports Server (NTRS)
Koontz, Steve; Valentine, Mark; Keeping, Thomas; Edeen, Marybeth; Spetch, William; Dalton, Penni
2004-01-01
The International Space Station (ISS) operates in the F2 region of Earth's ionosphere, orbiting at altitudes ranging from 350 to 450 km at an inclination of 51.6 degrees. The relatively dense, cool F2 ionospheric plasma suppresses surface charging processes much of the time, and the flux of relativistic electrons is low enough to preclude deep dielectric charging processes. The most important spacecraft charging processes in the ISS orbital environment are: 1) ISS electrical power system interactions with the F2 plasma, 2) magnetic induction processes resulting from flight through the geomagnetic field and, 3) charging processes that result from interaction with auroral electrons at high latitude. Recently, the continuing review and evaluation of putative ISS charging hazards required by the ISS Program Office revealed that ISS charging could produce an electrical shock hazard to the ISS crew during extravehicular activity (EVA). ISS charging risks are being evaluated in an ongoing measurement and analysis campaign. The results of ISS charging measurements are combined with a recently developed model of ISS charging (the Plasma Interaction Model) and an exhaustive analysis of historical ionospheric variability data (ISS Ionospheric Specification) to evaluate ISS charging risks using Probabilistic Risk Assessment (PRA) methods. The PRA combines estimates of the frequency of occurrence and severity of the charging hazards with estimates of the reliability of various hazard controls systems, as required by NASA's safety and risk management programs, to enable design and selection of a hazard control approach that minimizes overall programmatic and personnel risk. The PRA provides a quantitative methodology for incorporating the results of the ISS charging measurement and analysis campaigns into the necessary hazard reports, EVA procedures, and ISS flight rules required for operating ISS in a safe and productive manner.
Natural hazard modeling and uncertainty analysis [Chapter 2]
Matthew Thompson; Jord J. Warmink
2017-01-01
Modeling can play a critical role in assessing and mitigating risks posed by natural hazards. These modeling efforts generally aim to characterize the occurrence, intensity, and potential consequences of natural hazards. Uncertainties surrounding the modeling process can have important implications for the development, application, evaluation, and interpretation of...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-07-14
... DEPARTMENT OF HEALTH AND HUMAN SERVICES Food and Drug Administration [Docket No. FDA-2010-N-0357] Agency Information Collection Activities; Proposed Collection; Comment Request; Hazard Analysis and Critical Control Point Procedures for the Safe and Sanitary Processing and Importing of Juice AGENCY: Food...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-03-20
... Manufacturing Practice and Hazard Analysis and Risk- Based Preventive Controls for Human Food; Correction AGENCY... manufacturing, packing, or holding human food (CGMPs) to modernize it and to add requirements for domestic and... ``food-production purposes (i.e., manufacturing, processing, packing, and holding) to consistently use...
An Analysis of U.S. Army Health Hazard Assessments During the Acquisition of Military Materiel
2010-06-03
Engineering controls can eliminate hazards through system design or the substitution of hazardous materials, and can serve to minimize hazards where they cannot be eliminated, in preference to personal protective equipment (PPE) (Milz, Conrad, & Soule, 2003). The study also examines health hazard assessments during the materiel acquisition process and (c) evaluates a sample of the database for accuracy by comparing the data entries to the original reports.
Fielding, L M; Ellis, L; Beveridge, C; Peters, A C
2005-04-01
To reduce foodborne illnesses, hazard- and risk-based quality management systems are essential. Small and medium-sized enterprises (SMEs) tend to have a poor understanding of such systems and limited adoption of the Hazard Analysis Critical Control Point system (HACCP). The requirement for full HACCP implementation by 2006 will place an even greater burden on these businesses. The aim of this project is to assess the current levels of understanding of hazards and risks in SMEs in the manufacturing sector. A questionnaire survey of 850 SMEs, including microbusinesses, was conducted. This determined the industry sector and processes carried out, whether the company operated hazard-based quality management, and the knowledge of the technical manager regarding the associated hazards and risks. Follow-up visits to the manufacturing plants observed the processes and the operatives to determine their level of understanding. A benchmarking audit was carried out and each company was rated. The results show that the majority of respondents stated that they operated hazard analysis-based quality management. The ability of the respondents to correctly define a hazard or risk, or to identify different types of hazard, was, however, poor. There was no correlation between business type and audit score. The microbusinesses did, however, perform significantly less well than the larger SMEs.
Historical analysis of US pipeline accidents triggered by natural hazards
NASA Astrophysics Data System (ADS)
Girgin, Serkan; Krausmann, Elisabeth
2015-04-01
Natural hazards, such as earthquakes, floods, landslides, or lightning, can initiate accidents in oil and gas pipelines with potentially major consequences for the population or the environment due to toxic releases, fires, and explosions. Accidents of this type are also referred to as Natech events. Many major accidents highlight the risk associated with natural-hazard impact on pipelines transporting dangerous substances. For instance, in the USA in 1994, flooding of the San Jacinto River caused the rupture of 8 pipelines and the undermining of 29 others by the floodwaters. About 5.5 million litres of petroleum and related products were spilled into the river and ignited. As a result, 547 people were injured and significant environmental damage occurred. Post-incident analysis is a valuable tool for better understanding the causes, dynamics, and impacts of pipeline Natech accidents in support of future accident prevention and mitigation. Therefore, data on onshore hazardous-liquid pipeline accidents collected by the US Pipeline and Hazardous Materials Safety Administration (PHMSA) were analysed. For this purpose, a database-driven incident data analysis system was developed to aid the rapid review and categorization of PHMSA incident reports. Using an automated data-mining process followed by a peer review of the incident records, supported by natural hazard databases and external information sources, the pipeline Natechs were identified. As a by-product of the data-collection process, the database now includes over 800,000 incidents from all causes in industrial and transportation activities, which are automatically classified in the same way as the PHMSA records. This presentation describes the data collection and reviewing steps conducted during the study, provides information on the developed database and data analysis tools, and reports the findings of a statistical analysis of the identified hazardous-liquid pipeline incidents in terms of accident dynamics and consequences.
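A sketch of the kind of automated first-pass classification described above: scanning free-text incident narratives for natural-hazard keywords to flag candidate Natech records for subsequent peer review. The keyword list, categories, and example narrative are illustrative; the actual study also drew on structured PHMSA database fields and external hazard databases.

```python
# Flag candidate Natech records by keyword matching on incident narratives.
NATURAL_HAZARD_KEYWORDS = {
    "earthquake": "geological",
    "landslide":  "geological",
    "flood":      "hydrological",
    "lightning":  "meteorological",
    "hurricane":  "meteorological",
}

def flag_natech(narrative: str) -> dict:
    """Return matched keyword -> hazard category; empty dict means no match."""
    text = narrative.lower()
    return {kw: cat for kw, cat in NATURAL_HAZARD_KEYWORDS.items() if kw in text}

print(flag_natech("Pipeline ruptured after flood waters undermined the crossing."))
# {'flood': 'hydrological'}  -> record goes to the peer-review queue
```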
Plasma Hazards and Acceptance for International Space Station Extravehicular Activities
NASA Astrophysics Data System (ADS)
Patton, Thomas
2010-09-01
Extravehicular activity (EVA) is accepted by NASA and other spacefaring agencies as a necessary risk in order to build and maintain a safe and efficient laboratory in space. EVAs are used for standard construction and as contingency operations to repair critical equipment for vehicle sustainability and the safety of the entire crew in the habitable volume. Many hazards are assessed for even the most mundane EVA, and the vast majority of these are adequately controlled per the rules of the International Space Station Program. The need for EVA repair and construction has driven acceptance of a possibly catastrophic hazard to the EVA crewmember which cannot currently be controlled adequately: electrical shock from the very environment in which they work. This paper describes the environment, causes, and contributors to the shock of EVA crewmembers attributed to the ionospheric plasma environment in low Earth orbit. It details the hazard history and the risk acceptance process that provides assurance of a safe EVA. In addition to the hazard acceptance process, this paper explores other factors that go into the decision to accept a risk, including criticality of task, hardware design and capability, and the probability of hazard occurrence. Also included is the required interaction between organizations at NASA (EVA Office, Environments, Engineering, Mission Operations, Safety) in order to build and eventually gain adequate acceptance rationale for a hazard of this kind. During the course of the discussion, all current methods of mitigating the hazard are identified. This paper captures the history of the plasma hazard analysis and the processes used by the International Space Station Program to formally assess and qualify the risk, and discusses the steps taken to identify and perform the required analysis of the floating-potential shock hazard from the ISS environment, which eventually led to its status as an accepted risk for ISS EVAs.
[Design of a HACCP Plan for the Gouda-type cheesemaking process in a milk processing plant].
Dávila, Jacqueline; Reyes, Genara; Corzo, Otoniel
2006-03-01
The Hazard Analysis and Critical Control Point (HACCP) system is a preventive and systematic method used to identify, assess, and control the hazards related to raw materials, ingredients, processing, marketing, and the intended consumer, in order to assure the safety of the food. The aim of this study was to design a HACCP plan for implementation in a Gouda-type cheese-making process at a dairy processing plant. The methodology was based on the application of the seven HACCP principles, information from the plant about compliance with the prerequisite programs (70-80%), the experience of the HACCP team, and the sequence of stages established by COVENIN standard 3802 for implementing the HACCP system. A HACCP plan was proposed covering the scope, the selection of the HACCP team, the description of the product and its intended use, the flow diagram of the process, the hazard analysis, and the control table of the plan with the critical control points (CCPs). The following CCPs were identified in the process: pasteurization, coagulation, and ripening.
Lunar Landing Trajectory Design for Onboard Hazard Detection and Avoidance
NASA Technical Reports Server (NTRS)
Paschall, Steve; Brady, Tye; Sostaric, Ron
2009-01-01
The Autonomous Landing and Hazard Avoidance Technology (ALHAT) Project is developing the software and hardware technology needed to support a safe and precise landing for the next generation of lunar missions. ALHAT provides this capability through terrain-relative navigation measurements to enhance global-scale precision, an onboard hazard detection system to select safe landing locations, and an Autonomous Guidance, Navigation, and Control (AGNC) capability to process these measurements and safely direct the vehicle to a landing location. This paper focuses on the key trajectory design issues relevant to providing an onboard Hazard Detection and Avoidance (HDA) capability for the lander. Hazard detection can be accomplished by the crew visually scanning the terrain through a window, a sensor system imaging the terrain, or some combination of both. For ALHAT, this hazard detection activity is provided by a sensor system, which either augments the crew's perception or entirely replaces the crew in the case of a robotic landing. Detecting hazards influences the trajectory design by requiring the proper perspective, range to the landing site, and sufficient time to view the terrain. Following this, the trajectory design must provide additional time to process this information and make a decision about where to safely land. During the final part of the HDA process, the trajectory design must provide sufficient margin to enable a hazard avoidance maneuver. In order to demonstrate the effects of these constraints on the landing trajectory, a tradespace of trajectory designs was created for the initial ALHAT Design Analysis Cycle (ALDAC-1) and each case evaluated with these HDA constraints active. The ALHAT analysis process, described in this paper, narrows down this tradespace and subsequently better defines the trajectory design needed to support onboard HDA. Future ALDACs will enhance this trajectory design by balancing these issues and others in an overall system design process.
General RMP Guidance - Appendix D: OSHA Guidance on PSM
OSHA's Process Safety Management (PSM) Guidance on providing complete and accurate written information concerning process chemicals, process technology, and process equipment; including process hazard analysis and material safety data sheets.
Natural Hazard Susceptibility Assessment for Road Planning Using Spatial Multi-Criteria Analysis
NASA Astrophysics Data System (ADS)
Karlsson, Caroline S. J.; Kalantari, Zahra; Mörtberg, Ulla; Olofsson, Bo; Lyon, Steve W.
2017-11-01
Inadequate infrastructural networks can be detrimental to society if transport between locations becomes hindered or delayed, especially due to natural hazards, which are difficult to control. Determining natural-hazard-susceptible areas and incorporating them in the initial planning process may therefore reduce infrastructural damage in the long run. The objective of this study was to evaluate the usefulness of expert judgments for assessing natural hazard susceptibility through a spatial multi-criteria analysis approach using hydrological, geological, and land use factors. To utilize spatial multi-criteria analysis for decision support, an analytic hierarchy process was adopted in which expert judgments were evaluated individually and in an aggregated manner. The estimates of susceptible areas were then compared with those from weighted linear combination using equal weights and from the factor interaction method. Results showed that inundation received the highest susceptibility. Expert judgment performed almost the same as equal weighting; the difference in susceptibility between the two for inundation was around 4%. The results also showed that downscaling can negatively affect the susceptibility assessment and be highly misleading. Susceptibility assessment through spatial multi-criteria analysis is useful for decision support in early road planning, despite limitations in the selection and use of decision rules and criteria. A natural hazard spatial multi-criteria analysis could be used to indicate areas where more investigations need to be undertaken from a natural hazard point of view, and to identify areas thought to have higher susceptibility along existing roads where mitigation measures could be targeted after in-situ investigations.
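A minimal sketch of the analytic hierarchy process step referenced above: factor weights are derived from a pairwise comparison matrix via its principal eigenvector, and judgment consistency is checked. The 3x3 matrix (say, comparing three susceptibility factors) and its values are illustrative, not the study's expert judgments.

```python
# AHP weight derivation: principal eigenvector of a pairwise comparison matrix.
import numpy as np

A = np.array([[1.0, 3.0, 5.0],      # factor 1 vs factors 1..3
              [1/3, 1.0, 2.0],      # factor 2
              [1/5, 1/2, 1.0]])     # factor 3

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = eigvecs[:, k].real
weights /= weights.sum()            # normalized priority vector

n = A.shape[0]
ci = (eigvals[k].real - n) / (n - 1)   # consistency index
cr = ci / 0.58                          # Saaty's random index RI = 0.58 for n = 3
print("weights:", np.round(weights, 3), " CR:", round(cr, 3))  # CR < 0.1 is acceptable
```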
Hung, Yu-Ting; Liu, Chi-Te; Peng, I-Chen; Hsu, Chin; Yu, Roch-Chui; Cheng, Kuan-Chen
2015-09-01
To ensure the safety of peanut butter ice cream manufacture, a Hazard Analysis and Critical Control Point (HACCP) plan was designed and applied to the production process. Potential biological, chemical, and physical hazards in each manufacturing procedure were identified. The critical control points for the peanut butter ice cream were then determined to be the pasteurization and freezing processes. The establishment of a monitoring system, corrective actions, verification procedures, and documentation and record keeping followed to complete the HACCP program. The results of this study indicate that implementing the HACCP system in food industries can effectively enhance food safety and quality while improving production management. Copyright © 2015. Published by Elsevier B.V.
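A sketch of monitoring one of the CCPs identified above (pasteurization). The critical limits used here (79.4 °C for 25 s, a commonly cited HTST benchmark for ice cream mix) are illustrative placeholders; a real plan would use the plant's validated limits.

```python
# CCP monitoring check: compare a measured time-temperature pair against
# critical limits and trigger the plan's corrective action on deviation.
def check_pasteurization(temp_c: float, hold_s: float) -> str:
    if temp_c >= 79.4 and hold_s >= 25:
        return "within critical limits - record and release"
    # Deviation from a critical limit triggers the plan's corrective action.
    return "DEVIATION - hold batch, reprocess or discard, log corrective action"

print(check_pasteurization(80.0, 26))  # compliant
print(check_pasteurization(78.0, 25))  # deviation
```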
[Design of a HACCP plan for the industrial process of frozen sardines].
Rosas, Patricia; Reyes, Genara
2009-09-01
The Hazard Analysis and Critical Control Point (HACCP) system is used to identify, assess, and control the hazards related to production, processing, distribution, and consumption in order to obtain safe food. The aim of this study was to design a HACCP plan for implementation in a processing line for frozen whole sardines (Sardinella aurita). The methodology was based on evaluating compliance with the prerequisite programs (GMP/SSOP, assessed in a previous study), applying the HACCP principles, and following the sequence of stages established by Venezuelan COVENIN standard No. 3802. Time and temperature were recorded at each processing step. Histamine was determined with the Veratox test (Neogen). Results showed that some sardine batches arrived at the plant with high time-temperature records, with up to 5 ppm of histamine found due to temperature abuse during transportation. A HACCP plan is proposed covering the scope, the selection of the team, the description of the product and its intended use, the flow diagram of the process, the hazard analysis and identification of CCPs, the monitoring system, corrective actions, and records. The potential hazards identified were pathogen growth, the presence of histamine, and physical objects in the sardines. The control measures at the CCPs concern the control of time and temperature during transportation and processing, the monitoring of ice supplies, and sanitary conditions in the process.
An Independent Evaluation of the FMEA/CIL Hazard Analysis Alternative Study
NASA Technical Reports Server (NTRS)
Ray, Paul S.
1996-01-01
The present instruments of safety and reliability risk control for a majority of the National Aeronautics and Space Administration (NASA) programs/projects consist of Failure Mode and Effects Analysis (FMEA), Hazard Analysis (HA), the Critical Items List (CIL), and the Hazard Report (HR). This extensive analytical approach was introduced in the early 1970s and was implemented for the Space Shuttle Program by NHB 5300.4 (1D-2). Since the Challenger accident in 1986, the process has been expanded considerably, resulting in the introduction of similar and/or duplicated activities in safety/reliability risk analysis. A study initiated in 1995 to search for an alternative to the current FMEA/CIL Hazard Analysis methodology generated a proposed method on April 30, 1996. The objective of this Summer Faculty Study was to participate in and conduct an independent evaluation of the proposed alternative to simplify the present safety and reliability risk control procedure.
NASA Astrophysics Data System (ADS)
Weinke, Elisabeth; Hölbling, Daniel; Albrecht, Florian; Friedl, Barbara
2017-04-01
Geo-hazards and their effects are distributed geographically over wide regions. Effective mapping and monitoring are essential for hazard assessment and mitigation, and are often best achieved using satellite imagery and new object-based image analysis approaches to identify and delineate geo-hazard objects (landslides, floods, forest fires, storm damages, etc.). At the moment, several local and national databases and platforms provide and publish data on different types of geo-hazards, as well as web-based risk maps and decision support systems. The European Commission also implemented the Copernicus Emergency Management Service (EMS) in 2015, which publishes information about natural and man-made disasters and risks. Currently, no platform for landslides or geo-hazards as such exists that enables the integration of the user in the mapping and monitoring process. In this study we introduce the concept of a spatial data infrastructure for object delineation, web processing, and service provision of landslide information, with a focus on user interaction in all processes. A first prototype for the processing and mapping of landslides in Austria and Italy has been developed within the project Land@Slide, funded by the Austrian Research Promotion Agency FFG in the Austrian Space Applications Program ASAP. The spatial data infrastructure and its services for the mapping, processing, and analysis of landslides can be extended to other regions and to all types of geo-hazards for analysis and delineation based on Earth Observation (EO) data. The architecture of the first prototypical spatial data infrastructure includes four main areas of technical components. The data tier consists of a file storage system and the spatial data catalogue for the management of EO data and other geospatial data on geo-hazards, as well as descriptions and protocols for the data processing and analysis. An interface to extend the data integration from external sources (e.g., Sentinel-2 data) is planned to enable rapid mapping. The server tier consists of a Java-based web server and GIS server. Sub-services and main services form the service tier. Sub-services include map services, feature editing services, geometry services, geoprocessing services, and metadata services. For (meta)data provision and to support data interoperability, OGC web standards and a REST interface are used. Four central main services were designed and developed: (1) a mapping service (including image segmentation and classification approaches), (2) a monitoring service to monitor changes over time, (3) a validation service to analyze landslide delineations from different sources, and (4) an infrastructure service to identify infrastructure affected by landslides. The main services use and combine parts of the sub-services. Furthermore, a series of client applications based on new technology standards make use of the data and services offered by the spatial data infrastructure. Next steps include extending the current spatial data infrastructure to other areas and geo-hazard types, towards a spatial data infrastructure that can assist targeted mapping and monitoring of geo-hazards in a global context.
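A hypothetical sketch of a client calling the mapping service described above through its REST interface. The service root, parameters, and response shape are invented for illustration; the prototype's actual API is not published in the abstract.

```python
# Query a (hypothetical) landslide-mapping endpoint for delineations in a bbox.
import requests

BASE = "https://example.org/landslide-sdi"   # hypothetical service root

resp = requests.get(
    f"{BASE}/mapping",
    params={
        "bbox": "10.3,47.1,10.6,47.3",   # area of interest (lon/lat, illustrative)
        "scene": "S2A_20160412",          # EO scene identifier (illustrative)
        "method": "obia-segmentation",    # object-based image analysis
    },
    timeout=60,
)
resp.raise_for_status()
print(resp.json())   # e.g., GeoJSON landslide delineations
```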
[Failure modes and effects analysis in the prescription, validation and dispensing process].
Delgado Silveira, E; Alvarez Díaz, A; Pérez Menéndez-Conde, C; Serna Pérez, J; Rodríguez Sagrado, M A; Bermejo Vicedo, T
2012-01-01
To apply a failure modes and effects analysis to the prescription, validation, and dispensing process for hospitalised patients. A work group analysed all of the stages included in the process from prescription to dispensing, identifying the most critical errors and establishing the potential failure modes which could produce a mistake. The possible causes, their potential effects, and the existing control systems were analysed to try to stop them from developing. The Hazard Score was calculated for each failure mode; those with a score ≥ 8 were chosen, and those with a Severity Index = 4 were selected regardless of their Hazard Score. Corrective measures and an implementation plan were proposed. A flow diagram describing the whole process was obtained. A risk analysis was conducted of the chosen critical points, indicating: failure mode, cause, effect, severity, probability, Hazard Score, suggested preventive measure, and the strategy to achieve it. The failure modes chosen were: prescription on the nurse's form or on the (paper) progress or treatment order; prescription to an incorrect patient; transcription error by nursing staff and pharmacist; and error preparing the trolley. By applying a failure modes and effects analysis to the prescription, validation, and dispensing process, we have been able to identify critical aspects, the stages in which errors may occur, and their causes. It has allowed us to analyse the effects on the safety of the process and to establish measures to prevent or reduce them. Copyright © 2010 SEFH. Published by Elsevier España. All rights reserved.
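A sketch of the triage rule described above: retain failure modes with a Hazard Score (severity x probability) ≥ 8, plus any with Severity Index = 4 regardless of score. The example modes echo the abstract; the ratings are illustrative.

```python
# FMEA triage: hazard_score = severity * probability; keep score >= 8 or severity == 4.
failure_modes = [
    {"mode": "prescription to incorrect patient", "severity": 4, "probability": 1},
    {"mode": "transcription error",               "severity": 3, "probability": 3},
    {"mode": "error preparing the trolley",       "severity": 2, "probability": 3},
    {"mode": "prescription on nurse's form",      "severity": 2, "probability": 2},
]

selected = []
for fm in failure_modes:
    fm["hazard_score"] = fm["severity"] * fm["probability"]
    if fm["hazard_score"] >= 8 or fm["severity"] == 4:
        selected.append(fm)

for fm in selected:
    print(f'{fm["mode"]:40s} hazard score = {fm["hazard_score"]}')
```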
NASA Technical Reports Server (NTRS)
Butler, David R.; Walsh, Stephen J.; Brown, Daniel G.
1991-01-01
Methods are described for using Landsat Thematic Mapper digital data and digital elevation models for the display of natural hazard sites in a mountainous region of northwestern Montana, USA. Hazard zones can be easily identified on the three-dimensional images. Proximity of facilities such as highways and building locations to hazard sites can also be easily displayed. A temporal sequence of Landsat TM (or similar) satellite data sets could also be used to display landscape changes associated with dynamic natural hazard processes.
Tracking Hazard Analysis Data in a Jungle of Changing Design
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sullivan, Robin S.; Young, Jonathan
2006-05-14
The biggest fear of the hazard analyst is the loss of data in the middle of the design jungle. When project schedules are demanding and design is changing rapidly, it is essential that the hazard analysis data be tracked and kept current in order to provide the required project design, development, and regulatory support. Being able to identify the current information, as well as the archived information, as the design progresses, and to show how the project is designing in safety through modifications based on hazard analysis results, is imperative. At the DOE Hanford site in Washington State, Fluor Hanford Inc. is in the process of removing and dispositioning sludge from the 100 Area K Basins. The K Basins were used to store spent fuel from the operating reactors at the Hanford Site; the sludge is a by-product of the corrosion of the fuel and fuel storage canisters. The sludge removal project has been very dynamic, involving the design, procurement and, more recently, the operation of processes at two basins, K East and K West. The project has an ambitious schedule with a large number of changes to design concepts. In order to support the complex K Basins project, a technique to track the status of the hazard analysis data was developed. This paper identifies the most important elements of the tracking system and shows how it was used to assist the project in ensuring that current design data were reflected in a specific version of the hazard analysis, and to show how the project was keeping up with the design and ensuring compliance with the requirements to design in safety. While the specifics of the data tracking strategy for the K Basins sludge removal project are described in the paper, the general concepts of the strategy are applicable to similar projects requiring iteration of hazard analysis and design.
NASA Astrophysics Data System (ADS)
Huttenlau, Matthias; Stötter, Johann
2010-05-01
Reinsurance companies report a strong increase in natural-hazard-related losses, both insured and economic, within the last decades on a global scale. This ongoing trend can be described as a product of the dynamics in both the natural sphere and the anthroposphere. To analyze the potential impact of natural hazard processes on a certain insurance portfolio, or on society in general, reinsurance companies and risk management consultants have developed loss models. However, those models generally do not fit the scale-dependent demands of regional analyses, as appropriate (i) for analyses at the scale of a specific province or (ii) for portfolio analyses of regional insurance companies. Moreover, the scientific basis of most of the models is not transparently documented, so scientific evaluation of the methodological concepts is not possible (black box). This is contrary to the scientific principles of transparency and traceability. Analyses must consider these circumstances adequately, especially in mountain regions like the European Alps, with their inherent (i) specific characteristics at small scales, (ii) relatively high process dynamics in general, (iii) occurrence of gravitational mass movements, which are related to high relief energy and thus exist only in mountain regions, (iv) small proportion of the area of permanent settlement in the overall area, (v) high value concentration in the valley floors, and (vi) exposure of important infrastructure and lifelines. Risk-based analyses therefore methodically estimate the potential consequences of hazard processes on the built environment in a standardized way using the risk components (i) hazard, (ii) elements at risk, and (iii) vulnerability. However, most research and progress have been made in the field of hazard analysis, whereas the other two components are not developed to the same degree. Since these three general components are influencing factors without any weighting within the risk concept, this has considerable implications for the results of risk analyses. Thus, an equal and scale-appropriate balance of these risk components is a fundamental key factor for effective natural hazard risk analyses. The results of such analyses inform decision makers, especially in the insurance industry, the administration, and politics, about potential consequences, and are the basis for appropriate risk management strategies. Thereby, results based on (i) an annual or probabilistic understanding of risk have to be distinguished from (ii) scenario-based analyses. The former are based on statistics of periodically or episodically occurring events, whereas the latter approach is especially applied to extreme, non-linear, stochastic events. Focusing on the needs of insurance companies in particular, the former approaches are appropriate for premium pricing and reinsurance strategies with an annual perspective, whereas the latter focus on events with extreme loss burdens under worst-case criteria to guarantee adequate reinsurance coverage. Moreover, the demand for adequate loss model approaches and methods is strengthened by the risk-based requirements of the upcoming capital requirement directive Solvency II. The present study estimates the potential elements at risk, their corresponding damage potentials, and the Probable Maximum Losses (PMLs) of extreme natural hazard events in Tyrol (Austria), and adequately considers the scale dependency and balanced application of the introduced risk components.
In addition to the analyses introduced above, a portfolio analysis of a regional insurance company was carried out. The geocoded insurance contracts of this portfolio analysis were the basis for estimating spatially, socio-economically, and functionally differentiated mean insurance values for the different risk categories of (i) buildings, (ii) contents or inventory, (iii) vehicles, and (iv) persons in the study area. The estimated mean insurance values were combined with additional GIS and statistical data into a comprehensive property-by-property geodatabase of the existing elements and values. This elements-and-values geodatabase is furthermore the consistent basis for all natural hazard analyses and enables comparison of the results. The study follows the generally accepted modules of (i) hazard analysis, (ii) exposure analysis, and (iii) consequence analysis, where the exposure analysis estimates the elements at risk with their corresponding damage potentials and the consequence analysis estimates the PMLs. This multi-hazard analysis focuses on process types with a high to extreme potential for negative consequences on a regional scale. In this context, (i) floods, (ii) rockslides with the potential for corresponding consequence effects (backwater ponding and outburst flood), (iii) earthquakes, (iv) hail events, and (v) winter storms were considered as hazard processes. Based on general hazard analyses (hazard maps), concrete scenarios and their spatial extents were determined. For the different hazard processes, different vulnerability approaches were considered to demonstrate their sensitivity and their implications for the results. Thus, no absolute loss values but probable loss ranges were estimated. It can be shown that the most serious losses would arise from extreme earthquake events, with loss burdens of more than € 7 bn. on buildings and inventory alone. Possible extreme flood events could lead to losses between € 2 bn. and € 2.5 bn., whereas a severe hail swath affecting the central Inn valley could result in losses of ca. € 455 mill. (of which € 285 mill. on vehicles). The most serious potential rockslide, with additional consequence effects, would result in losses of up to ca. € 185 mill., and extreme winter storms can induce losses between € 100 mill. and € 150 mill.
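A minimal sketch of the scenario-based consequence step described above: the scenario loss is aggregated over the exposed elements as value x vulnerability(intensity). The element values, intensities, and the toy vulnerability curve are illustrative placeholders, not the study's data.

```python
# Scenario loss = sum over elements at risk of (insured value x degree of loss).
def vulnerability(intensity: float) -> float:
    """Toy degree-of-loss curve in [0, 1], increasing with process intensity."""
    return min(1.0, max(0.0, intensity / 10.0))

elements_at_risk = [
    {"category": "building",  "value_eur": 450_000, "intensity": 6.0},
    {"category": "inventory", "value_eur": 120_000, "intensity": 6.0},
    {"category": "vehicle",   "value_eur":  25_000, "intensity": 3.0},
]

scenario_loss = sum(e["value_eur"] * vulnerability(e["intensity"])
                    for e in elements_at_risk)
print(f"scenario loss: EUR {scenario_loss:,.0f}")
```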
Research on the spatial analysis method of seismic hazard for island
NASA Astrophysics Data System (ADS)
Jia, Jing; Jiang, Jitong; Zheng, Qiuhong; Gao, Huiying
2017-05-01
Seismic hazard analysis (SHA) is a key component of earthquake disaster prevention for island engineering: its results provide parameters for seismic design at the micro scale, and it is prerequisite work for the earthquake and comprehensive disaster prevention planning of island conservation at the macro scale, in the exploitation and construction of both inhabited and uninhabited islands. The existing seismic hazard analysis methods are compared in terms of their application, and their applicability and limitations for islands are analysed. A specialized spatial analysis method of seismic hazard for islands (SAMSHI) is then presented to support further work on earthquake disaster prevention planning, based on spatial analysis tools in GIS and a fuzzy comprehensive evaluation model. The basic spatial database of SAMSHI includes fault data, historical earthquake records, geological data, and Bouguer gravity anomaly data, which are the data sources for the 11 indices of the fuzzy comprehensive evaluation model; these indices are calculated by the spatial analysis model constructed on ArcGIS's ModelBuilder platform.
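A sketch of the fuzzy comprehensive evaluation step referenced above: a weight vector over the hazard indices is combined with a membership matrix that grades each index against hazard classes. Three indices and three classes stand in for the paper's eleven indices; all numbers are invented for illustration.

```python
# Fuzzy comprehensive evaluation: b = w . R, then pick the class with max membership.
import numpy as np

w = np.array([0.5, 0.3, 0.2])             # index weights (sum to 1)

# R[i, j]: degree to which index i supports hazard class j (low/medium/high).
R = np.array([[0.1, 0.3, 0.6],
              [0.4, 0.4, 0.2],
              [0.2, 0.5, 0.3]])

b = w @ R                                  # fuzzy evaluation vector
classes = ["low", "medium", "high"]
print("memberships:", dict(zip(classes, np.round(b, 3))))
print("assessed class:", classes[int(np.argmax(b))])
```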
HACCP-Based Programs for Preventing Disease and Injury from Premise Plumbing: A Building Consensus
McCoy, William F.; Rosenblatt, Aaron A.
2015-01-01
Thousands of preventable injuries and deaths are annually caused by microbial, chemical and physical hazards from building water systems. Water is processed in buildings before use; this can degrade the quality of the water. Processing steps undertaken on-site in buildings often include conditioning, filtering, storing, heating, cooling, pressure regulation and distribution through fixtures that restrict flow and temperature. Therefore, prevention of disease and injury requires process management. A process management framework for buildings is the hazard analysis and critical control point (HACCP) adaptation of failure mode effects analysis (FMEA). It has been proven effective for building water system management. Validation is proof that hazards have been controlled under operating conditions and may include many kinds of evidence including cultures of building water samples to detect and enumerate potentially pathogenic microorganisms. However, results from culture tests are often inappropriately used because the accuracy and precision are not sufficient to support specifications for control limit or action triggers. A reliable negative screen is based on genus-level Polymerase Chain Reaction (PCR) for Legionella in building water systems; however, building water samples with positive results from this test require further analysis by culture methods. PMID:26184325
Timing of Formal Phase Safety Reviews for Large-Scale Integrated Hazard Analysis
NASA Technical Reports Server (NTRS)
Massie, Michael J.; Morris, A. Terry
2010-01-01
Integrated hazard analysis (IHA) is a process used to identify and control unacceptable risk. As such, it does not occur in a vacuum. IHA approaches must be tailored to fit the system being analyzed. Physical, resource, organizational and temporal constraints on large-scale integrated systems impose additional direct or derived requirements on the IHA. The timing and interaction between engineering and safety organizations can provide either benefits or hindrances to the overall end product. The traditional approach for formal phase safety review timing and content, which generally works well for small- to moderate-scale systems, does not work well for very large-scale integrated systems. This paper proposes a modified approach to timing and content of formal phase safety reviews for IHA. Details of the tailoring process for IHA will describe how to avoid temporary disconnects in major milestone reviews and how to maintain a cohesive end-to-end integration story particularly for systems where the integrator inherently has little to no insight into lower level systems. The proposal has the advantage of allowing the hazard analysis development process to occur as technical data normally matures.
1994-11-01
[List-of-tables fragment; recoverable entries: 10, Solid Rocket Motor Combustion Products; 11, Core Vehicle First Stage Combustion Products; 12, Health Hazard Qualities of Hazardous Launch Emissions; 13, Atlas II Combustion Products.]
PO*WW*ER mobile treatment unit process hazards analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Richardson, R.B.
1996-06-01
The objective of this report is to demonstrate that a thorough assessment of the risks associated with the operation of the Rust Geotech patented PO*WW*ER mobile treatment unit (MTU) has been performed and documented. The MTU was developed to treat aqueous mixed wastes at the US Department of Energy (DOE) Albuquerque Operations Office sites. The MTU uses evaporation to separate organics and water from radionuclides and solids, and catalytic oxidation to convert the hazardous components into byproducts. This process hazards analysis evaluated a number of accident scenarios not directly related to the operation of the MTU, such as natural phenomena damage and mishandling of chemical containers. Worst-case accident scenarios were further evaluated to determine the risk potential to the MTU and to workers, the public, and the environment. The overall risk to any group from operation of the MTU was determined to be very low; the MTU is classified as a Radiological Facility with low hazards.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tang, A; Samost, A; Viswanathan, A
Purpose: To investigate the hazards in cervical-cancer HDR brachytherapy using a novel hazard-analysis technique, System Theoretic Process Analysis (STPA), and to demonstrate the applicability and benefit of STPA to the field of radiation oncology. Methods: We analyzed the tandem-and-ring HDR procedure through observations, discussions with physicists and physicians, and the use of a previously developed process map. Controllers and their respective control actions were identified and arranged into a hierarchical control model of the system, modeling the workflow from applicator insertion through initiation of treatment delivery. We then used the STPA process to identify potentially unsafe control actions. Scenarios were then generated from the identified unsafe control actions and used to develop recommendations for system safety constraints. Results: 10 controllers were identified and included in the final model. From these controllers, 32 potentially unsafe control actions were identified, leading to more than 120 potential accident scenarios, including both clinical errors (e.g., using outdated imaging studies for planning) and managerial-based incidents (e.g., unsafe equipment, budget, or staffing decisions). Constraints identified from those scenarios include common themes, such as the need for appropriate feedback to give the controllers an adequate mental model to maintain safe boundaries of operations. As an example, one finding was that the likelihood of the potential accident scenario of the applicator breaking during insertion might be reduced by establishing a feedback loop of equipment-usage metrics and equipment-failure reports to the management controller. Conclusion: The utility of STPA in analyzing system hazards in a clinical brachytherapy system was demonstrated. This technique, rooted in system theory, identified scenarios both technical/clinical and managerial in nature. These results suggest that STPA can be successfully used to analyze safety in brachytherapy and may prove to be an alternative to other hazard analysis techniques.
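A sketch of the STPA bookkeeping described above: controllers, their control actions, and unsafe control actions (UCAs) classified under the four standard STPA guidewords. The controller, action, and UCA contexts below are illustrative of the tandem-and-ring workflow, not the study's actual model.

```python
# Record STPA controllers, control actions, and unsafe control actions (UCAs).
from dataclasses import dataclass, field

GUIDEWORDS = ("not provided", "provided unsafely",
              "too early/too late", "stopped too soon/applied too long")

@dataclass
class ControlAction:
    name: str
    ucas: dict = field(default_factory=dict)  # guideword -> hazardous context

physicist = {
    "controller": "physicist",
    "actions": [ControlAction("initiate treatment delivery")],
}

ca = physicist["actions"][0]
ca.ucas["provided unsafely"] = "delivery initiated with an outdated planning image"
ca.ucas["too early/too late"] = "delivery initiated before applicator position check"

# Each UCA then seeds causal scenarios and, from those, safety constraints.
for guide, context in ca.ucas.items():
    print(f"UCA [{guide}]: {ca.name} -- {context}")
```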
KSC VAB Aeroacoustic Hazard Assessment
NASA Technical Reports Server (NTRS)
Oliveira, Justin M.; Yedo, Sabrina; Campbell, Michael D.; Atkinson, Joseph P.
2010-01-01
NASA Kennedy Space Center (KSC) carried out an analysis of the effects of aeroacoustics produced by stationary solid rocket motors in processing areas at KSC. In the current paper, attention is directed toward the acoustic effects of a motor burning within the Vehicle Assembly Building (VAB). The analysis was carried out with support from ASRC Aerospace who modeled transmission effects into surrounding facilities. Calculations were done using semi-analytical models for both aeroacoustics and transmission. From the results it was concluded that acoustic hazards in proximity to the source of ignition and plume can be severe; acoustic hazards in the far-field are significantly lower.
ERIC Educational Resources Information Center
Hilz, Christoph; Ehrenfeld, John R.
1991-01-01
Several policy frameworks for managing hazardous waste import/export are examined with respect to economic issues, environmental sustainability, and administrative feasibility and effectiveness. Several recommendations for improving the present instrument and implementing process are offered. (Author/CW)
Hazard function theory for nonstationary natural hazards
NASA Astrophysics Data System (ADS)
Read, L.; Vogel, R. M.
2015-12-01
Studies from the natural hazards literature indicate that many natural processes, including wind speeds, landslides, wildfires, precipitation, streamflow and earthquakes, show evidence of nonstationary behavior such as trends in magnitudes through time. Traditional probabilistic analysis of natural hazards based on partial duration series (PDS) generally assumes stationarity in the magnitudes and arrivals of events, i.e. that the probability of exceedance is constant through time. Given evidence of trends and the consequent expected growth in devastating impacts from natural hazards across the world, new methods are needed to characterize their probabilistic behavior. The field of hazard function analysis (HFA) is ideally suited to this problem because its primary goal is to describe changes in the exceedance probability of an event over time. HFA is widely used in medicine, manufacturing, actuarial statistics, reliability engineering, economics, and elsewhere. HFA provides a rich theory to relate the natural hazard event series (x) with its failure time series (t), enabling computation of corresponding average return periods and reliabilities associated with nonstationary event series. This work investigates the suitability of HFA to characterize nonstationary natural hazards whose PDS magnitudes are assumed to follow the widely applied Poisson-GP model. We derive a 2-parameter Generalized Pareto hazard model and demonstrate how metrics such as reliability and average return period are impacted by nonstationarity and discuss the implications for planning and design. Our theoretical analysis linking hazard event series x, with corresponding failure time series t, should have application to a wide class of natural hazards.
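A sketch of the Poisson-GP setup referenced above: partial-duration-series exceedance magnitudes follow a generalized Pareto (GP) distribution and events arrive as a Poisson process with rate lam per year, so the average return period of level x is T(x) = 1 / (lam * P(X > x)). A linear trend in the GP scale parameter is one simple, assumed way to introduce nonstationarity; all parameter values are illustrative.

```python
# Return period of a fixed design level under a (non)stationary Poisson-GP model.
from math import exp

def gp_survival(x: float, xi: float, sigma: float, mu: float = 0.0) -> float:
    """P(X > x) for a generalized Pareto distribution with shape xi, scale sigma."""
    z = (x - mu) / sigma
    if xi == 0.0:
        return exp(-z)
    return max(0.0, 1.0 + xi * z) ** (-1.0 / xi)

lam = 2.0            # mean PDS exceedances per year (assumed)
xi, sigma0 = 0.1, 5.0

for year in (0, 25, 50):
    sigma_t = sigma0 * (1.0 + 0.01 * year)   # assumed 1%/yr trend in scale
    p_exc = gp_survival(30.0, xi, sigma_t)
    print(f"year {year:2d}: T(30) = {1.0 / (lam * p_exc):8.1f} yr")
```

Under the assumed upward trend, the return period of the same design level shrinks over time, which is the planning implication the abstract points to.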
Hazard Analysis of Japanese Boxed Lunches (Bento).
ERIC Educational Resources Information Center
Bryan, Frank L.; And Others
1991-01-01
For the purposes of identifying contaminants, of assessing risks, and of determining critical food processing control points, hazard analyses were conducted at two "bento" (oriental boxed meals) catering operations. Time and temperature abuses during the holding period, after cooking and prior to consumption, were found to be the primary…
General RMP Guidance - Chapter 7: Prevention Program (Program 3)
Many Program 3 processes are already addressed by the OSHA Process Safety Management Program, which covers on-site consequences. So for compliance with the risk management program, process hazard analysis teams must consider potential offsite consequences.
Thepaksorn, Phayong; Thongjerm, Supawan; Incharoen, Salee; Siriwong, Wattasit; Harada, Kouji; Koizumi, Akio
2017-11-25
We utilized job safety analysis (JSA) and hazard identification for work accident prevention in Para rubber wood sawmills, which aimed to investigate occupational health risk exposures and assess the health hazards at sawmills in the Trang Province, located in southern Thailand. We conducted a cross-sectional study which included a walk-through survey, JSA, occupational risk assessment, and environmental samplings from March through September 2015 at four Para rubber wood sawmills. We identified potential occupational safety and health hazards associated with six main processes, including: 1) logging and cutting, 2) sawing the lumber into sheets, 3) planing and re-arranging, 4) vacuuming and wood preservation, 5) drying and planks re-arranging, and 6) grading, packing, and storing. Working in sawmills was associated with high risk of wood dust and noise exposure, occupational accidents injuring hands and feet, chemicals and fungicide exposure, and injury due to poor ergonomics or repetitive work. Several high-risk areas were identified from JSA and hazard identification of the working processes, especially high wood dust and noise exposure when sawing lumber into sheets and risk of occupational accidents of the hands and feet when struck by lumber. All workers were strongly recommended to use personal protective equipment in any working processes. Exposures should be controlled using local ventilation systems and reducing noise transmission. We recommend that the results from the risk assessment performed in this study be used to create an action plan for reducing occupational health hazards in Para rubber sawmills.
Thepaksorn, Phayong; Thongjerm, Supawan; Incharoen, Salee; Siriwong, Wattasit; Harada, Kouji; Koizumi, Akio
2017-01-01
Objective: We utilized job safety analysis (JSA) and hazard identification for work accident prevention in Para rubber wood sawmills, with the aim of investigating occupational health risk exposures and assessing the health hazards at sawmills in the Trang Province, located in southern Thailand. Methods: We conducted a cross-sectional study which included a walk-through survey, JSA, occupational risk assessment, and environmental samplings from March through September 2015 at four Para rubber wood sawmills. Results: We identified potential occupational safety and health hazards associated with six main processes: 1) logging and cutting, 2) sawing the lumber into sheets, 3) planing and re-arranging, 4) vacuuming and wood preservation, 5) drying and plank re-arranging, and 6) grading, packing, and storing. Working in sawmills was associated with a high risk of wood dust and noise exposure, occupational accidents injuring hands and feet, chemical and fungicide exposure, and injury due to poor ergonomics or repetitive work. Discussion: Several high-risk areas were identified from the JSA and hazard identification of the working processes, especially high wood dust and noise exposure when sawing lumber into sheets and the risk of occupational accidents to the hands and feet when struck by lumber. All workers were strongly advised to use personal protective equipment in all work processes. Exposures should be controlled by installing local ventilation systems and reducing noise transmission. We recommend that the results of the risk assessment performed in this study be used to create an action plan for reducing occupational health hazards in Para rubber sawmills. PMID:28993571
NASA Technical Reports Server (NTRS)
Zelkin, Natalie; Henriksen, Stephen
2011-01-01
This document is being provided as part of ITT's NASA Glenn Research Center Aerospace Communication Systems Technical Support (ACSTS) contract: "New ATM Requirements--Future Communications, C-Band and L-Band Communications Standard Development." ITT has completed a safety hazard analysis providing a preliminary safety assessment for the proposed C-band (5091 to 5150 MHz) airport surface communication system. The assessment was performed following the guidelines outlined in the Federal Aviation Administration Safety Risk Management Guidance for System Acquisitions document. The safety analysis did not identify any hazards with an unacceptable risk, though a number of hazards with a medium risk were documented. This effort represents an initial high-level safety hazard analysis and notes the triggers for risk reassessment. A detailed safety hazards analysis is recommended as a follow-on activity to assess particular components of the C-band communication system after the profile is finalized and system rollout timing is determined. A security risk assessment has been performed by NASA as a parallel activity. While safety analysis is concerned with the prevention of accidental errors and failures, the security threat analysis focuses on deliberate attacks. Both processes identify the events that affect operation of the system, and from a safety perspective the security threats may present safety risks.
NASA Technical Reports Server (NTRS)
Zelkin, Natalie; Henriksen, Stephen
2011-01-01
This document is being provided as part of ITT's NASA Glenn Research Center Aerospace Communication Systems Technical Support (ACSTS) contract NNC05CA85C, Task 7: "New ATM Requirements--Future Communications, C-Band and L-Band Communications Standard Development." ITT has completed a safety hazard analysis providing a preliminary safety assessment for the proposed L-band (960 to 1164 MHz) terrestrial en route communications system. The assessment was performed following the guidelines outlined in the Federal Aviation Administration Safety Risk Management Guidance for System Acquisitions document. The safety analysis did not identify any hazards with an unacceptable risk, though a number of hazards with a medium risk were documented. This effort represents a preliminary safety hazard analysis and notes the triggers for risk reassessment. A detailed safety hazards analysis is recommended as a follow-on activity to assess particular components of the L-band communication system after the technology is chosen and system rollout timing is determined. The security risk analysis resulted in identifying main security threats to the proposed system as well as noting additional threats recommended for a future security analysis conducted at a later stage in the system development process. The document discusses various security controls, including those suggested in the COCR Version 2.0.
Comparative risk analysis of technological hazards (a review).
Kates, R W; Kasperson, J X
1983-01-01
Hazards are threats to people and what they value and risks are measures of hazards. Comparative analyses of the risks and hazards of technology can be dated to Starr's 1969 paper [Starr, C. (1969) Science 165, 1232-1238] but are rooted in recent trends in the evolution of technology, the identification of hazard, the perception of risk, and the activities of society. These trends have spawned an interdisciplinary quasi profession with new terminology, methodology, and literature. A review of 54 English-language monographs and book-length collections, published between 1970 and 1983, identified seven recurring themes: (i) overviews of the field of risk assessment, (ii) efforts to estimate and quantify risk, (iii) discussions of risk acceptability, (iv) perception, (v) analyses of regulation, (vi) case studies of specific technological hazards, and (vii) agenda for research. Within this field, science occupies a unique niche, for many technological hazards transcend the realm of ordinary experience and require expert study. Scientists can make unique contributions to each area of hazard management but their primary contribution is the practice of basic science. Beyond that, science needs to further risk assessment by understanding the more subtle processes of hazard creation and by establishing conventions for estimating risk and for presenting and handling uncertainty. Scientists can enlighten the discussion of tolerable risk by setting risks into comparative contexts, by studying the process of evaluation, and by participating as knowledgeable individuals, but they cannot decide the issue. Science can inform the hazard management process by broadening the range of alternative control actions and modes of implementation and by devising methods to evaluate their effectiveness. PMID:6580625
Ropkins, K; Beck, A J
2002-08-01
Hazard analysis by critical control points (HACCP) is a systematic approach to the identification, assessment and control of hazards. Effective HACCP requires the consideration of all hazards, i.e., chemical, microbiological and physical. However, to date most 'in-place' HACCP procedures have tended to focus on the control of microbiological and physical food hazards. In general, the chemical component of HACCP procedures is either ignored or limited to applied chemicals, e.g., food additives and pesticides. In this paper we discuss the application of HACCP to a broader range of chemical hazards, using organic chemical contaminants as examples, and the problems that are likely to arise in the food manufacturing sector. Chemical HACCP procedures are likely to deliver many of the advantages previously identified for microbiological HACCP procedures: they are more effective, efficient and economical than conventional end-point-testing methods. However, the high costs of analytical monitoring of chemical contaminants and a limited understanding of formulation and process optimisation as means of controlling chemical contamination of foods are likely to prevent chemical HACCP from becoming as effective as microbiological HACCP.
Simulation-Based Probabilistic Tsunami Hazard Analysis: Empirical and Robust Hazard Predictions
NASA Astrophysics Data System (ADS)
De Risi, Raffaele; Goda, Katsuichiro
2017-08-01
Probabilistic tsunami hazard analysis (PTHA) is the prerequisite for rigorous risk assessment and thus for decision-making regarding risk mitigation strategies. This paper proposes a new simulation-based methodology for tsunami hazard assessment for a specific site of an engineering project along the coast, or, more broadly, for a wider tsunami-prone region. The methodology incorporates numerous uncertain parameters that are related to geophysical processes by adopting new scaling relationships for tsunamigenic seismic regions. Through the proposed methodology it is possible to obtain either a tsunami hazard curve for a single location, that is the representation of a tsunami intensity measure (such as inundation depth) versus its mean annual rate of occurrence, or tsunami hazard maps, representing the expected tsunami intensity measures within a geographical area, for a specific probability of occurrence in a given time window. In addition to the conventional tsunami hazard curve that is based on an empirical statistical representation of the simulation-based PTHA results, this study presents a robust tsunami hazard curve, which is based on a Bayesian fitting methodology. The robust approach allows a significant reduction of the number of simulations and, therefore, a reduction of the computational effort. Both methods produce a central estimate of the hazard as well as a confidence interval, facilitating the rigorous quantification of the hazard uncertainties.
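As a concrete illustration of the empirical route described above, the mean annual rate of exceedance can be estimated directly from the simulated intensity measures. A minimal Python sketch with invented inputs (the paper's actual simulation machinery and its Bayesian fitting are not reproduced here):

    import numpy as np

    def empirical_hazard_curve(im_samples, annual_rate, im_grid):
        """Mean annual rate of exceedance for each intensity level in im_grid.

        im_samples  : simulated intensity measures (e.g. inundation depth, m),
                      one value per stochastic event realization
        annual_rate : mean annual rate of occurrence of the simulated events
        """
        im_samples = np.asarray(im_samples)
        # P(IM > im) estimated as the fraction of simulations exceeding im
        exceed_prob = np.array([(im_samples > im).mean() for im in im_grid])
        return annual_rate * exceed_prob

    # Illustrative use: 10,000 simulated inundation depths, events at 0.1/yr
    rng = np.random.default_rng(1)
    depths = rng.lognormal(mean=0.0, sigma=1.0, size=10_000)
    grid = np.linspace(0.1, 10.0, 50)
    rates = empirical_hazard_curve(depths, annual_rate=0.1, im_grid=grid)

The robust (Bayesian) variant would presumably fit a parametric curve to such empirical points, which is what allows the reported reduction in the number of simulations.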
Hazard function theory for nonstationary natural hazards
NASA Astrophysics Data System (ADS)
Read, L. K.; Vogel, R. M.
2015-11-01
Impact from natural hazards is a shared global problem that causes tremendous loss of life and property, economic cost, and damage to the environment. Increasingly, many natural processes show evidence of nonstationary behavior including wind speeds, landslides, wildfires, precipitation, streamflow, sea levels, and earthquakes. Traditional probabilistic analysis of natural hazards based on peaks over threshold (POT) generally assumes stationarity in the magnitudes and arrivals of events, i.e. that the probability of exceedance of some critical event is constant through time. Given increasing evidence of trends in natural hazards, new methods are needed to characterize their probabilistic behavior. The well-developed field of hazard function analysis (HFA) is ideally suited to this problem because its primary goal is to describe changes in the exceedance probability of an event over time. HFA is widely used in medicine, manufacturing, actuarial statistics, reliability engineering, economics, and elsewhere. HFA provides a rich theory to relate the natural hazard event series (X) with its failure time series (T), enabling computation of corresponding average return periods, risk and reliabilities associated with nonstationary event series. This work investigates the suitability of HFA to characterize nonstationary natural hazards whose POT magnitudes are assumed to follow the widely applied Generalized Pareto (GP) model. We derive the hazard function for this case and demonstrate how metrics such as reliability and average return period are impacted by nonstationarity and discuss the implications for planning and design. Our theoretical analysis linking hazard event series X, with corresponding failure time series T, should have application to a wide class of natural hazards with rich opportunities for future extensions.
Hazard function theory for nonstationary natural hazards
NASA Astrophysics Data System (ADS)
Read, Laura K.; Vogel, Richard M.
2016-04-01
Impact from natural hazards is a shared global problem that causes tremendous loss of life and property, economic cost, and damage to the environment. Increasingly, many natural processes show evidence of nonstationary behavior including wind speeds, landslides, wildfires, precipitation, streamflow, sea levels, and earthquakes. Traditional probabilistic analysis of natural hazards based on peaks over threshold (POT) generally assumes stationarity in the magnitudes and arrivals of events, i.e., that the probability of exceedance of some critical event is constant through time. Given increasing evidence of trends in natural hazards, new methods are needed to characterize their probabilistic behavior. The well-developed field of hazard function analysis (HFA) is ideally suited to this problem because its primary goal is to describe changes in the exceedance probability of an event over time. HFA is widely used in medicine, manufacturing, actuarial statistics, reliability engineering, economics, and elsewhere. HFA provides a rich theory to relate the natural hazard event series (X) with its failure time series (T), enabling computation of corresponding average return periods, risk, and reliabilities associated with nonstationary event series. This work investigates the suitability of HFA to characterize nonstationary natural hazards whose POT magnitudes are assumed to follow the widely applied generalized Pareto model. We derive the hazard function for this case and demonstrate how metrics such as reliability and average return period are impacted by nonstationarity and discuss the implications for planning and design. Our theoretical analysis linking hazard random variable X with corresponding failure time series T should have application to a wide class of natural hazards with opportunities for future extensions.
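Both versions of this abstract turn on the link between the event series X and the failure time T; under a simple discrete-time convention (one trial per year, with year-t exceedance probability p_t, notation assumed for illustration) that link can be written compactly:

    P(T > n) = \prod_{t=1}^{n} (1 - p_t)    \quad \text{(reliability over an } n\text{-year horizon)}
    E[T] = \sum_{n=0}^{\infty} P(T > n)     \quad \text{(average return period)}

When p_t \equiv p this collapses to the familiar stationary result E[T] = 1/p; any trend in p_t moves both reliability and average return period away from their stationary values, which is the effect quantified here for the generalized Pareto case.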
NASA Astrophysics Data System (ADS)
Odbert, Henry; Aspinall, Willy
2014-05-01
Evidence-based hazard assessment at volcanoes assimilates knowledge about the physical processes of hazardous phenomena and observations that indicate the current state of a volcano. Incorporating both these lines of evidence can inform our belief about the likelihood (probability) and consequences (impact) of possible hazardous scenarios, forming a basis for formal quantitative hazard assessment. However, such evidence is often uncertain, indirect or incomplete. Approaches to volcano monitoring have advanced substantially in recent decades, increasing the variety and resolution of multi-parameter timeseries data recorded at volcanoes. Interpreting these multiple strands of parallel, partial evidence thus becomes increasingly complex. In practice, interpreting many timeseries requires an individual to be familiar with the idiosyncrasies of the volcano, monitoring techniques, configuration of recording instruments, observations from other datasets, and so on. In making such interpretations, an individual must consider how different volcanic processes may manifest as measurable observations, and then infer from the available data what can or cannot be deduced about those processes. We examine how parts of this process may be synthesised algorithmically using Bayesian inference. Bayesian Belief Networks (BBNs) use probability theory to treat and evaluate uncertainties in a rational and auditable scientific manner, but only to the extent warranted by the strength of the available evidence. The concept is a suitable framework for marshalling multiple strands of evidence (e.g. observations, model results and interpretations) and their associated uncertainties in a methodical manner. BBNs are usually implemented in graphical form and could be developed as a tool for near real-time, ongoing use in a volcano observatory, for example. We explore the application of BBNs in analysing volcanic data from the long-lived eruption at Soufriere Hills Volcano, Montserrat. We discuss the uncertainty of inferences, and how our method provides a route to formal propagation of uncertainties in hazard models. Such approaches provide an attractive route to developing an interface between volcano monitoring analyses and probabilistic hazard scenario analysis. We discuss the use of BBNs in hazard analysis as a tractable and traceable tool for fast, rational assimilation of complex, multi-parameter data sets in the context of timely volcanic crisis decision support.
NASA Astrophysics Data System (ADS)
Odbert, Henry; Hincks, Thea; Aspinall, Willy
2015-04-01
Volcanic hazard assessments must combine information about the physical processes of hazardous phenomena with observations that indicate the current state of a volcano. Incorporating both these lines of evidence can inform our belief about the likelihood (probability) and consequences (impact) of possible hazardous scenarios, forming a basis for formal quantitative hazard assessment. However, such evidence is often uncertain, indirect or incomplete. Approaches to volcano monitoring have advanced substantially in recent decades, increasing the variety and resolution of multi-parameter timeseries data recorded at volcanoes. Interpreting these multiple strands of parallel, partial evidence thus becomes increasingly complex. In practice, interpreting many timeseries requires an individual to be familiar with the idiosyncrasies of the volcano, monitoring techniques, configuration of recording instruments, observations from other datasets, and so on. In making such interpretations, an individual must consider how different volcanic processes may manifest as measurable observations, and then infer from the available data what can or cannot be deduced about those processes. We examine how parts of this process may be synthesised algorithmically using Bayesian inference. Bayesian Belief Networks (BBNs) use probability theory to treat and evaluate uncertainties in a rational and auditable scientific manner, but only to the extent warranted by the strength of the available evidence. The concept is a suitable framework for marshalling multiple strands of evidence (e.g. observations, model results and interpretations) and their associated uncertainties in a methodical manner. BBNs are usually implemented in graphical form and could be developed as a tool for near real-time, ongoing use in a volcano observatory, for example. We explore the application of BBNs in analysing volcanic data from the long-lived eruption at Soufriere Hills Volcano, Montserrat. We show how our method provides a route to formal propagation of uncertainties in hazard models. Such approaches provide an attractive route to developing an interface between volcano monitoring analyses and probabilistic hazard scenario analysis. We discuss the use of BBNs in hazard analysis as a tractable and traceable tool for fast, rational assimilation of complex, multi-parameter data sets in the context of timely volcanic crisis decision support.
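The inferential core that a BBN automates is Bayes' rule over discrete states. A deliberately tiny two-node sketch in Python; the states, prior, and likelihoods are invented for illustration and are not taken from the Soufriere Hills analysis:

    # Two-node belief network: hidden volcanic state -> observed seismicity level.
    # All probabilities below are illustrative placeholders.
    prior = {"quiescent": 0.8, "unrest": 0.2}

    # P(observation | state): likelihood of each seismicity level under each state
    likelihood = {
        "quiescent": {"low": 0.70, "moderate": 0.25, "high": 0.05},
        "unrest":    {"low": 0.10, "moderate": 0.40, "high": 0.50},
    }

    def posterior(observation):
        """Update the belief about the volcanic state given one observation."""
        unnorm = {s: prior[s] * likelihood[s][observation] for s in prior}
        z = sum(unnorm.values())          # normalizing constant
        return {s: p / z for s, p in unnorm.items()}

    print(posterior("high"))              # belief shifts strongly toward 'unrest'

A full network chains many such updates across parallel monitoring datasets, which is what makes the graphical form attractive for near real-time observatory use.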
Federal Register 2010, 2011, 2012, 2013, 2014
2013-08-06
... Fishery Products--21 CFR Part 123 (OMB Control Number 0910-0354)-- Extension FDA regulations in part 123 (21 CFR part 123) mandate the application of hazard analysis and critical control point (HACCP) principles to the processing of seafood. HACCP is a preventive system of hazard control designed to help...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-04-09
... Importing of Fish and Fishery Products--21 CFR Part 123 (OMB Control Number 0910-0354)-- Extension FDA regulations in part 123 (21 CFR part 123) mandate the application of hazard analysis and critical control point (HACCP) principles to the processing of seafood. HACCP is a preventive system of hazard control...
Making the Hubble Space Telescope servicing mission safe
NASA Technical Reports Server (NTRS)
Bahr, N. J.; Depalo, S. V.
1992-01-01
The implementation of the HST system safety program is detailed. Numerous safety analyses are conducted through various phases of design, test, and fabrication, and results are presented to NASA management for discussion during dedicated safety reviews. Attention is given to the system safety assessment and risk analysis methodologies used, i.e., hazard analysis, fault tree analysis, and failure modes and effects analysis, and to how they are coupled with engineering and test analysis for a 'synergistic picture' of the system. Some preliminary safety analysis results, showing the relationship between hazard identification, control or abatement, and finally control verification, are presented as examples of this safety process.
NASA Astrophysics Data System (ADS)
McCrea, Terry
The Shuttle Processing Contract (SPC) workforce consists of Lockheed Space Operations Co. as prime contractor, with Grumman, Thiokol Corporation, and Johnson Controls World Services as subcontractors. During the design phase, reliability engineering is instrumental in influencing the development of systems that meet the Shuttle fail-safe program requirements. Reliability engineers accomplish this objective by performing FMEA (failure modes and effects analysis) to identify potential single failure points. When technology, time, or resources do not permit a redesign to eliminate a single failure point, the single failure point information is formatted into a change request and presented to senior management of SPC and NASA for risk acceptance. In parallel with the FMEA, safety engineering conducts a hazard analysis to assure that potential hazards to personnel are assessed. The combined effort (FMEA and hazard analysis) is published as a system assurance analysis. Special ground rules and techniques are developed to perform and present the analysis. The reliability program at KSC is vigorously pursued, and has been extremely successful. The ground support equipment and facilities used to launch and land the Space Shuttle maintain an excellent reliability record.
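The FMEA half of the system assurance analysis is, at bottom, a screening and ranking exercise. Below is a generic sketch using the common risk priority number convention (RPN = severity x occurrence x detection); the items and 1-10 ratings are invented, and the SPC analysis is not claimed to have used this exact scheme:

    # Generic FMEA screen: rank failure modes by RPN = S * O * D.
    # Items and ratings are illustrative placeholders only.
    failure_modes = [
        # (description, severity, occurrence, detection), each rated 1-10
        ("GSE hydraulic pump seal leak",        7, 4, 3),
        ("Single-point valve actuator failure", 9, 2, 5),
        ("Umbilical connector corrosion",       5, 5, 4),
    ]

    ranked = sorted(failure_modes, key=lambda m: m[1] * m[2] * m[3], reverse=True)
    for desc, s, o, d in ranked:
        print(f"RPN={s * o * d:3d}  {desc}")

Failure modes surfacing at the top of such a ranking are natural candidates for the change requests presented to management for risk acceptance.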
Singh, Ranjit; Hickner, John; Mold, Jim; Singh, Gurdev
2014-03-01
Testing plays a vital role in primary care. Failures in the testing process are common and can be harmful. As the great 19th-century microbiologist Louis Pasteur put it, "chance favors only the prepared mind." Our objective is to prepare minds in primary care practices to improve safety in the testing process. Various principles from safety science can be applied. A prospective methodology is proposed that uses an anonymous practice survey based on concepts from failure modes and effects analysis. Responses are used to rank perceived hazards in the testing process, leading to prioritization of areas for intervention. Secondary data analysis (using data from a study of medication safety) was used to explore the value of this approach in the context of assessing the testing process. At 3 primary care practice sites, a total of 61 staff members completed 4 survey items examining the testing process. Comparison across practices shows that each has a distinct profile of hazards, which would lead each on a different path toward improvement. The proposed approach treats each practice as a unique complex adaptive system, aiming to help it thrive by inculcating trust, mutual respect, and collaboration. Implications for patient safety research and practice are discussed.
Improving food safety within the dairy chain: an application of conjoint analysis.
Valeeva, N I; Meuwissen, M P M; Lansink, A G J M Oude; Huirne, R B M
2005-04-01
This study determined the relative importance of attributes of food safety improvement in the production chain of fluid pasteurized milk. The chain was divided into 4 blocks: "feed" (compound feed production and its transport), "farm" (dairy farm), "dairy processing" (transport and processing of raw milk, delivery of pasteurized milk), and "consumer" (retailer/catering establishment and pasteurized milk consumption). The concept of food safety improvement focused on 2 main groups of hazards: chemical (antibiotics and dioxin) and microbiological (Salmonella, Escherichia coli, Mycobacterium paratuberculosis, and Staphylococcus aureus). Adaptive conjoint analysis was used to investigate food safety experts' perceptions of the attributes' importance. Preference data from individual experts (n = 24) on 101 attributes along the chain were collected in a computer-interactive mode. Experts perceived the attributes from the "feed" and "farm" blocks as being more vital for controlling the chemical hazards; whereas the attributes from the "farm" and "dairy processing" were considered more vital for controlling the microbiological hazards. For the chemical hazards, "identification of treated cows" and "quality assurance system of compound feed manufacturers" were considered the most important attributes. For the microbiological hazards, these were "manure supply source" and "action in salmonellosis and M. paratuberculosis cases". The rather high importance of attributes relating to quality assurance and traceability systems of the chain participants indicates that participants look for food safety assurance from the preceding participants. This information has substantial decision-making implications for private businesses along the chain and for the government regarding the food safety improvement of fluid pasteurized milk.
SLUDGE TREATMENT PROJECT KOP CONCEPTUAL DESIGN CONTROL DECISION REPORT
DOE Office of Scientific and Technical Information (OSTI.GOV)
CARRO CA
2010-03-09
This control decision addresses the Knock-Out Pot (KOP) Disposition KOP Processing System (KPS) conceptual design. The KPS functions to (1) retrieve KOP material from canisters, (2) remove particles less than 600 µm in size and low density materials from the KOP material, (3) load the KOP material into Multi-Canister Overpack (MCO) baskets, and (4) stage the MCO baskets for subsequent loading into MCOs. Hazard and accident analyses of the KPS conceptual design have been performed to incorporate safety into the design process. The hazard analysis is documented in PRC-STP-00098, Knock-Out Pot Disposition Project Conceptual Design Hazard Analysis. The accident analysis is documented in PRC-STP-CN-N-00167, Knock-Out Pot Disposition Sub-Project Canister Over Lift Accident Analysis. Based on the results of these analyses, and analyses performed in support of MCO transportation and MCO processing and storage activities at the Cold Vacuum Drying Facility (CVDF) and Canister Storage Building (CSB), control decision meetings were held to determine the controls required to protect onsite and offsite receptors and facility workers. At the conceptual design stage, these controls are primarily defined by their safety functions. Safety significant structures, systems, and components (SSCs) that could provide the identified safety functions have been selected for the conceptual design. It is anticipated that some safety SSCs identified herein will be reclassified based on hazard and accident analyses performed in support of preliminary and detailed design.
Framework for Automation of Hazard Log Management on Large Critical Projects
NASA Astrophysics Data System (ADS)
Vinerbi, Lorenzo; Babu, Arun P.
2016-08-01
A hazard log is a database of all risk management activities in a project. Maintaining its correctness and consistency on large safety/mission critical projects involving multiple vendors, suppliers, and partners is critical and challenging. IBM DOORS is one of the popular tools used for hazard management in space applications. However, not all stakeholders are familiar with it, and it is not always feasible to expect all stakeholders to provide correct and consistent hazard data. The current work describes the process and tools to simplify hazard data collection on large projects. It demonstrates how the collected data from all stakeholders are merged to form the hazard log while ensuring data consistency and correctness. The data provided by all parties are collected using a template containing scripts. The scripts check for mistakes based on the internal standards of the company in charge of hazard management. The collected data are then merged in DOORS, which also contains scripts to check and import data to form the hazard log. The proposed tool has been applied to a mission critical project, and has been found to save time and reduce the number of mistakes while creating the hazard log. The use of automatic checks paves the way for correct tracking of risk and hazard analysis activities on large critical projects.
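A sketch of the kind of consistency check such template scripts could perform before the merge; the schema, field names, and ID convention are hypothetical, not the project's actual internal standards:

    # Validate one hazard-log entry before merging stakeholder data.
    # REQUIRED fields and allowed severities are hypothetical examples.
    REQUIRED = {"hazard_id", "description", "severity", "likelihood", "owner"}
    SEVERITIES = {"catastrophic", "critical", "marginal", "negligible"}

    def validate_entry(entry: dict) -> list[str]:
        """Return a list of problems found in one hazard-log entry."""
        problems = [f"missing field: {f}" for f in REQUIRED - entry.keys()]
        if entry.get("severity") not in SEVERITIES:
            problems.append(f"unknown severity: {entry.get('severity')!r}")
        if not str(entry.get("hazard_id", "")).startswith("HAZ-"):
            problems.append("hazard_id must follow the HAZ-nnnn convention")
        return problems

    entry = {"hazard_id": "HAZ-0042", "description": "LOX line overpressure",
             "severity": "critical", "likelihood": "remote", "owner": "supplier-A"}
    assert validate_entry(entry) == []    # a clean entry passes every check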
Mars Exploration Rovers Landing Dispersion Analysis
NASA Technical Reports Server (NTRS)
Knocke, Philip C.; Wawrzyniak, Geoffrey G.; Kennedy, Brian M.; Desai, Prasun N.; Parker, Timothy J.; Golombek, Matthew P.; Duxbury, Thomas C.; Kass, David M.
2004-01-01
Landing dispersion estimates for the Mars Exploration Rover missions were key elements in the site targeting process and in the evaluation of landing risk. This paper addresses the process and results of the landing dispersion analyses performed for both Spirit and Opportunity. The several contributors to landing dispersions (navigation and atmospheric uncertainties, spacecraft modeling, winds, and margins) are discussed, as are the analysis tools used. JPL's MarsLS program, a MATLAB-based landing dispersion visualization and statistical analysis tool, was used to calculate the probability of landing within hazardous areas. By convolving this with the probability of landing within flight system limits (in-spec landing) for each hazard area, a single overall measure of landing risk was calculated for each landing ellipse. In-spec probability contours were also generated, allowing a more synoptic view of site risks, illustrating the sensitivity to changes in landing location, and quantifying the possible consequences of anomalies such as incomplete maneuvers. Data and products required to support these analyses are described, including the landing footprints calculated by NASA Langley's POST program and JPL's AEPL program, cartographically registered base maps and hazard maps, and flight system estimates of in-spec landing probabilities for each hazard terrain type. Various factors encountered during operations, including evolving navigation estimates and changing atmospheric models, are discussed and final landing points are compared with approach estimates.
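The convolution step described above reduces, in its simplest form, to a weighted sum over terrain types. A simplified Python sketch with invented numbers (MarsLS itself is a MATLAB tool; this illustrates the calculation, not its code):

    # P(successful landing) = sum over terrain types of
    #   P(land on that terrain) * P(in-spec landing | that terrain).
    # Terrain fractions and in-spec probabilities are illustrative only.
    terrain = {
        #  type:        (P(land there), P(in-spec | type))
        "smooth plain": (0.70, 0.995),
        "cratered":     (0.20, 0.970),
        "rocky/hazard": (0.10, 0.900),
    }

    p_success = sum(p_land * p_spec for p_land, p_spec in terrain.values())
    print(f"P(in-spec landing) = {p_success:.4f}, landing risk = {1 - p_success:.4f}")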
Shin, Sun Kyoung; Kim, Woo-Il; Jeon, Tae-Wan; Kang, Young-Yeul; Jeong, Seong-Kyeong; Yeon, Jin-Mo; Somasundaram, Swarnalatha
2013-09-15
The Ministry of Environment, Republic of Korea (South Korea) is in the process of converting its current hazardous waste classification system to harmonize it with the international standard and to set up regulatory standards for toxic substances present in hazardous waste. In the present work, the concentrations and trends of 13 heavy metals, F(-), CN(-) and 19 PAHs present in the hazardous waste generated by various thermal processes (11 processes) in South Korea were analyzed, along with their leaching characteristics. In all thermal processes, the median concentrations of Cu (3.58-209,000 mg/kg), Ni (BDL-1560 mg/kg), Pb (7.22-5132.25 mg/kg) and Zn (83.02-31419 mg/kg) were comparatively higher than those of the other heavy metals. The iron and steel thermal process showed the highest median values of the heavy metals Cd (14.76 mg/kg), Cr (166.15 mg/kg) and Hg (2.38 mg/kg). Low molecular weight PAHs (BDL-37.59 mg/kg) were predominant in sludge and filter cake samples from most of the thermal processes. Flue gas dust present in most of the thermal processing units showed comparatively higher leaching of the heavy metals.
2008-03-01
irregular struggle, and, finally, a protracted struggle that will last decades rather than years. How will this war evolve? It is hazardous to... There is no downside to engagement. It is not an act of paranoia or pessimism to engage Americans in the very real hazards that confront us. It... making process about what to do next. Because it is an undisciplined process, they work through about 100 options when there are only two: duck or
[Quality control in herbal supplements].
Oelker, Luisa
2005-01-01
Quality and safety of food and herbal supplements are the result of a range of different elements, such as good manufacturing practice and process control. The process control must be active and able to identify and correct all possible hazards. The main and most widely used instrument is the hazard analysis and critical control point (HACCP) system, whose correct application can guarantee the safety of the product. Herbal supplements need, in addition to standard quality control, a set of checks to assure the harmlessness and safety of the plants used.
NASA Astrophysics Data System (ADS)
Ding, R.; He, T.
2017-12-01
With the increased popularity of mobile applications and services, there has been a growing demand for more advanced mobile technologies that utilize real-time Location Based Services (LBS) data to support natural hazard response efforts. Compared to traditional sources like the census bureau, which often can only provide historical and static data, an LBS service can provide more current data to drive a real-time natural hazard response system to more accurately process and assess issues such as population density in areas impacted by a hazard. However, manually preparing or preprocessing the data to suit the needs of a particular application would be time-consuming. This research aims to implement a population heatmap visual analytics system based on real-time data for natural disaster emergency management. The system comprises a three-layered architecture, consisting of data collection, data processing, and visual analysis layers. Real-time, location-based data meeting certain aggregation conditions are collected from multiple sources across the Internet, then processed and stored in a cloud-based data store. Parallel computing is utilized to provide fast and accurate access to the pre-processed population data based on criteria such as the disaster event, and to generate a location-based population heatmap as well as other types of visual digital outputs using auxiliary analysis tools. At present, a prototype system, which geographically covers the entire region of China and combines population heatmaps with data from the Earthquake Catalogs database, has been developed. Preliminary results indicate that the generation of dynamic population density heatmaps based on the prototype system has effectively supported rapid earthquake emergency rescue and evacuation efforts, as well as helped responders and decision makers evaluate and assess earthquake damage. Correlation analyses revealed that the aggregation and movement of people depend on various factors, including the earthquake occurrence time and the location of the epicenter. We hope to build upon the success of the prototype system in order to improve and extend it to support the analysis of earthquakes and other types of natural hazard events.
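At its core, the heatmap layer is a 2-D density binning of LBS point records. A minimal numpy sketch with synthetic coordinates (the prototype's data stores and parallel pipeline are not shown):

    import numpy as np

    # Synthetic LBS point records: (longitude, latitude) in decimal degrees
    rng = np.random.default_rng(0)
    lon = rng.normal(104.06, 0.05, size=50_000)   # invented city-center cluster
    lat = rng.normal(30.66, 0.05, size=50_000)

    # Bin the points into a regular grid; cell counts drive the heatmap colors
    counts, lon_edges, lat_edges = np.histogram2d(lon, lat, bins=200)
    density = counts / counts.max()               # normalized to [0, 1] for rendering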
Cascading disaster models in postburn flash flood
Fred May
2007-01-01
A useful method of modeling threats from hazards and documenting their disaster causation sequences is called "cascading threat modeling." This type of modeling enables emergency planners to address hazard and risk assessments systematically. This paper describes a cascading threat modeling and analysis process. Wildfire and an associated postburn flash flood disaster...
Hazard function theory for nonstationary natural hazards
Read, Laura K.; Vogel, Richard M.
2016-04-11
Impact from natural hazards is a shared global problem that causes tremendous loss of life and property, economic cost, and damage to the environment. Increasingly, many natural processes show evidence of nonstationary behavior including wind speeds, landslides, wildfires, precipitation, streamflow, sea levels, and earthquakes. Traditional probabilistic analysis of natural hazards based on peaks over threshold (POT) generally assumes stationarity in the magnitudes and arrivals of events, i.e., that the probability of exceedance of some critical event is constant through time. Given increasing evidence of trends in natural hazards, new methods are needed to characterize their probabilistic behavior. The well-developed field of hazard function analysis (HFA) is ideally suited to this problem because its primary goal is to describe changes in the exceedance probability of an event over time. HFA is widely used in medicine, manufacturing, actuarial statistics, reliability engineering, economics, and elsewhere. HFA provides a rich theory to relate the natural hazard event series (X) with its failure time series (T), enabling computation of corresponding average return periods, risk, and reliabilities associated with nonstationary event series. This work investigates the suitability of HFA to characterize nonstationary natural hazards whose POT magnitudes are assumed to follow the widely applied generalized Pareto model. We derive the hazard function for this case and demonstrate how metrics such as reliability and average return period are impacted by nonstationarity and discuss the implications for planning and design. As a result, our theoretical analysis linking hazard random variable X with corresponding failure time series T should have application to a wide class of natural hazards with opportunities for future extensions.
Hazard function theory for nonstationary natural hazards
DOE Office of Scientific and Technical Information (OSTI.GOV)
Read, Laura K.; Vogel, Richard M.
Impact from natural hazards is a shared global problem that causes tremendous loss of life and property, economic cost, and damage to the environment. Increasingly, many natural processes show evidence of nonstationary behavior including wind speeds, landslides, wildfires, precipitation, streamflow, sea levels, and earthquakes. Traditional probabilistic analysis of natural hazards based on peaks over threshold (POT) generally assumes stationarity in the magnitudes and arrivals of events, i.e., that the probability of exceedance of some critical event is constant through time. Given increasing evidence of trends in natural hazards, new methods are needed to characterize their probabilistic behavior. The well-developed field of hazard function analysis (HFA) is ideally suited to this problem because its primary goal is to describe changes in the exceedance probability of an event over time. HFA is widely used in medicine, manufacturing, actuarial statistics, reliability engineering, economics, and elsewhere. HFA provides a rich theory to relate the natural hazard event series (X) with its failure time series (T), enabling computation of corresponding average return periods, risk, and reliabilities associated with nonstationary event series. This work investigates the suitability of HFA to characterize nonstationary natural hazards whose POT magnitudes are assumed to follow the widely applied generalized Pareto model. We derive the hazard function for this case and demonstrate how metrics such as reliability and average return period are impacted by nonstationarity and discuss the implications for planning and design. As a result, our theoretical analysis linking hazard random variable X with corresponding failure time series T should have application to a wide class of natural hazards with opportunities for future extensions.
The role of models in estimating consequences as part of the risk assessment process.
Forde-Folle, K; Mitchell, D; Zepeda, C
2011-08-01
The degree of disease risk represented by the introduction, spread, or establishment of one or several diseases through the importation of animals and animal products is assessed by importing countries through an analysis of risk. The components of a risk analysis include hazard identification, risk assessment, risk management, and risk communication. A risk assessment starts with identification of the hazard(s) and then continues with four interrelated steps: release assessment, exposure assessment, consequence assessment, and risk estimation. Risk assessments may be either qualitative or quantitative. This paper describes how, through the integration of epidemiological and economic models, the potential adverse biological and economic consequences of exposure can be quantified.
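In the simplest quantitative reading, the four interrelated steps compose multiplicatively; a schematic form, with notation assumed for illustration:

    \text{Risk} = P(\text{release}) \times P(\text{exposure} \mid \text{release}) \times C(\text{consequence} \mid \text{exposure})

Here the release and exposure assessments supply the probabilities, the consequence assessment supplies C (in biological or monetary units, often from the epidemiological and economic models the paper integrates), and risk estimation aggregates the product over pathways.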
Geotechnical applications of LiDAR pertaining to geomechanical evaluation and hazard identification
NASA Astrophysics Data System (ADS)
Lato, Matthew J.
Natural hazards related to ground movement that directly affect the safety of motorists and highway infrastructure include, but are not limited to, rockfalls, rockslides, debris flows, and landslides. This thesis specifically deals with the evaluation of rockfall hazards through the evaluation of LiDAR data. Light Detection And Ranging (LiDAR) is an imaging technology that can be used to delineate and evaluate geomechanically-controlled hazards. LiDAR has been adopted to conduct hazard evaluations pertaining to rockfall, rock-avalanches, debris flows, and landslides. Characteristics of LiDAR surveying, such as rapid data acquisition rates, mobile data collection, and high data densities, pose problems to traditional CAD or GIS-based mapping methods. New analysis methods, including tools specifically oriented to geomechanical analyses, are needed. The research completed in this thesis supports development of new methods, including improved survey techniques, innovative software workflows, and processing algorithms to aid in the detection and evaluation of geomechanically controlled rockfall hazards. The scientific research conducted between 2006 and 2010, as presented in this thesis, is divided into five chapters, each of which has been published by or is under review by an international journal. The five research foci are: (i) geomechanical feature extraction and analysis using LiDAR data in active mining environments; (ii) engineered monitoring of rockfall hazards along transportation corridors: using mobile terrestrial LiDAR; (iii) optimization of LiDAR scanning and processing for automated structural evaluation of discontinuities in rockmasses; (iv) location orientation bias when using static LiDAR data for geomechanical analysis; and (v) evaluating roadside rockmasses for rockfall hazards from LiDAR data: optimizing data collection and processing protocols. The research conducted pertaining to this thesis has direct and significant implications with respect to numerous engineering projects that are affected by geomechanical stability issues. The ability to efficiently and accurately map discontinuities, detect changes, and standardize roadside geomechanical stability analyses from remote locations will fundamentally change the state-of-practice of geotechnical investigation workflows and repeatable monitoring. This, in turn, will lead to earlier detection and definition of potential zones of instability, will allow for progressive monitoring and risk analysis, and will indicate the need for pro-active slope improvement and stabilization.
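One building block of the automated structural evaluation described above is estimating a discontinuity plane's orientation from a patch of points: fit a plane (here via SVD) and convert its upward normal to dip and dip direction. A sketch of that standard technique, not the thesis's own code:

    import numpy as np

    def plane_orientation(points):
        """Dip and dip direction (degrees) of a best-fit plane through an
        (n, 3) array of points in a local x=East, y=North, z=Up frame."""
        centered = points - points.mean(axis=0)
        # Plane normal = right singular vector of the smallest singular value
        _, _, vt = np.linalg.svd(centered, full_matrices=False)
        n = vt[-1]
        if n[2] < 0:                                   # force the normal upward
            n = -n
        dip = np.degrees(np.arccos(n[2]))              # 0 deg = horizontal plane
        dip_dir = np.degrees(np.arctan2(n[0], n[1])) % 360.0
        return dip, dip_dir

    # Synthetic patch on a plane dipping 45 degrees toward the east (azimuth 090)
    rng = np.random.default_rng(2)
    xy = rng.uniform(-1, 1, size=(500, 2))
    z = -xy[:, 0] + rng.normal(0, 0.01, 500)           # z = -x: 45 deg dip to +x
    print(plane_orientation(np.column_stack([xy, z]))) # approx (45.0, 90.0)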
Integration of Aquifer Storage Transfer and Recovery and HACCP for Ensuring Drinking Water Quality
NASA Astrophysics Data System (ADS)
Lee, S. I.; Ji, H. W.
2015-12-01
The integration of ASTR (Aquifer Storage Transfer and Recovery) and HACCP (Hazard Analysis and Critical Control Point) is being attempted to ensure drinking water quality in a delta area. ASTR is a water supply system in which surface water is injected into a well for storage and recovered from a different well. During the process, natural water treatment is achieved in the aquifer. ASTR has advantages over surface reservoirs in that the water is protected from external contaminants and free from water loss by evaporation. HACCP, which originated in the food industry, can efficiently manage hazards and reduce risks when introduced to drinking water production. The study area is located in the Nakdong River Delta, South Korea. Water quality in this region has deteriorated due to increased pollution loads from upstream cities and industrial complexes. ASTR equipped with a HACCP system is suggested as a means to heighten public trust in drinking water. After the drinking water supply system using ASTR was decomposed into ten processes, the principles of HACCP were applied. Hazardous event analysis was conducted for 114 hazardous events, and nine major hazardous events were identified based on the likelihood and severity assessment. The potential risk of chemical hazards, as a function of amounts, travel distance and toxicity, was evaluated, and the result shows the relative threat a city poses to the drinking water supply facility. Next, critical control points were determined using decision tree analysis. Critical limits, the maximum and/or minimum values to which biological, chemical or physical parameters must be controlled, were established. Other procedures, such as monitoring and corrective actions, will also be presented.
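Screening 114 hazardous events down to nine major ones follows the usual likelihood-severity matrix logic. A sketch of such a filter in Python; the events, 1-5 ratings, and threshold are invented, since the abstract does not give the study's actual scales:

    # Rank hazardous events by risk score = likelihood * severity and keep
    # those at or above a screening threshold. All entries are invented.
    events = [
        # (description, likelihood 1-5, severity 1-5)
        ("upstream chemical spill reaches intake", 3, 5),
        ("injection well casing failure",          2, 4),
        ("pump power outage",                      4, 2),
        ("sensor drift at recovery well",          5, 1),
    ]

    THRESHOLD = 10
    major = [(name, l * s) for name, l, s in events if l * s >= THRESHOLD]
    print(major)   # events whose score reaches the 'major hazard' cutoff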
SUPERFUND REMOTE SENSING SUPPORT
This task provides remote sensing technical support to the Superfund program. Support includes the collection, processing, and analysis of remote sensing data to characterize hazardous waste disposal sites and their history. Image analysis reports, aerial photographs, and assoc...
Integrated risk management and communication: case study of Canton Vaud (Switzerland)
NASA Astrophysics Data System (ADS)
Artigue, Veronica; Aye, Zar Chi; Gerber, Christian; Derron, Marc-Henri; Jaboyedoff, Michel
2017-04-01
Canton Vaud's history is marked by events that remind us that any territory may have to cope with natural hazards, such as the devastating floods of the Baye and Veraye rivers in Montreux (1927), the overflowing of the Rhône caused by dam failure (1935), the Pissot mud flow (1995) and avalanches in the Prealps (1999). All of these examples have caused significant damage, and sometimes even fatalities, in the regions of Canton Vaud. In response to these issues, the Swiss Confederation and the local authorities of the Canton decided to implement an integrated management policy for natural risks. The production of natural hazard maps was the first step of the integrated management process. This work resulted in more than 10'000 maps and related documents for 94% of the municipalities of the Canton, covering 17% of its total surface. Given this significant amount of data, the main issue is to propose relevant communication and to build an integrated risk management structure. To make the available information relevant for end users, the teams involved worked to produce documents and tools that make these data easier for all stakeholders to understand. The first step of this process was to carry out a statistical and geographical analysis of the hazard maps, which allows the areas most exposed to natural hazards to be identified. An atlas could thus be created. Then, within this framework, several topics were discussed for each identified risk. The results show that 88 of 318 municipalities in Canton Vaud have at least a high hazard level on their territory, 108 a moderate hazard level, 41 a low level and 8 a residual level. Only 73 of 318 municipalities remain with a minimal or zero hazard level. Concerning the type of hazard considered, 16% of the building zones are exposed to floods, 18% to mud flows, 16% to deep landslides, 14% to spontaneous surface landslides, 6% to rockfall, 55% to rock collapses and less than 5% to avalanches. As national policies require taking risk into account at the building scale, further analysis of the buildings has been made. 1'154 buildings are exposed to a high hazard level, while 8'409, 21'130 and 14'980 buildings are exposed to a moderate, low and residual hazard level, respectively. This paper addresses the complexity of producing the hazard map products of Canton Vaud, particularly through the statistical analysis and the difficulties encountered with data availability and quality at the building scale. The authors highlight the processes necessary to build robust communication for all the stakeholders involved in risk management in a dynamic and changing area, through the example of Canton Vaud.
USEPA EXAMPLE EXIT LEVEL ANALYSIS RESULTS
Developed by NERL/ERD for the Office of Solid Waste, the enclosed product provides an example uncertainty analysis (UA) and initial process-based sensitivity analysis (SA) of hazardous waste "exit" concentrations for 7 chemicals and metals using the 3MRA Version 1.0 Modeling Syst...
NASA Astrophysics Data System (ADS)
Li, Deying; Yin, Kunlong; Gao, Huaxi; Liu, Changchun
2009-10-01
Although the Three Gorges Dam project across the Yangtze River in China can utilize a huge potential source of hydroelectric power and reduce the loss of life and damage caused by floods, it also causes environmental problems, such as geo-hazards, due to the large rise and fluctuation of the water level. In order to prevent and predict geo-hazards, the establishment of a geo-hazard prediction system is necessary. To implement the functions of regional and urban geo-hazard prediction, single geo-hazard prediction, landslide surge prediction and risk evaluation, the logical layers of the system consist of a data capturing layer, a data manipulation and processing layer, an analysis and application layer, and an information publication layer. Because of the existence of multi-source spatial data, research on the transformation and fusion of multi-source data was carried out in this paper. The applicability of the system was tested on the spatial prediction of landslide hazard through GIS spatial analysis, in which the information value method was applied to identify areas susceptible to future landslides on the basis of the historical record of past landslides, terrain parameters, geology, rainfall and anthropogenic activity. A detailed discussion is given of the spatial distribution characteristics of landslide hazard in the new town of Badong. These results can be used for risk evaluation. The system can be implemented as an early-warning and emergency management tool by the relevant authorities of the Three Gorges Reservoir in the future.
Subsystem Hazard Analysis Methodology for the Ares I Upper Stage Source Controlled Items
NASA Technical Reports Server (NTRS)
Mitchell, Michael S.; Winner, David R.
2010-01-01
This article describes processes involved in developing subsystem hazard analyses for Source Controlled Items (SCI), specific components, sub-assemblies, and/or piece parts, of the NASA ARES I Upper Stage (US) project. SCIs will be designed, developed and/or procured by Boeing as an end item or an off-the-shelf item. Objectives include explaining the methodology, tools, stakeholders and products involved in the development of these hazard analyses. Progress made and further challenges in identifying potential subsystem hazards are also provided in an effort to assist the System Safety community in understanding one part of the ARES I Upper Stage project.
Sensemaking of patient safety risks and hazards.
Battles, James B; Dixon, Nancy M; Borotkanics, Robert J; Rabin-Fastmen, Barbara; Kaplan, Harold S
2006-08-01
In order for organizations to become learning organizations, they must make sense of their environment and learn from safety events. Sensemaking, as described by Weick (1995), literally means making sense of events. The ultimate goal of sensemaking is to build the understanding that can inform and direct actions to eliminate risks and hazards that are a threat to patient safety. True sensemaking in patient safety must use both retrospective and prospective approaches to learning. Sensemaking is an essential part of the design process leading to risk-informed design. Sensemaking serves as a conceptual framework to bring together well-established approaches to the assessment of risks and hazards: (1) at the single event level using root cause analysis (RCA), (2) at the process level using failure modes effects analysis (FMEA) and (3) at the system level using probabilistic risk assessment (PRA). The results of these separate or combined approaches are most effective when end users in conversation-based meetings add their expertise and knowledge to the data produced by the RCA, FMEA, and/or PRA in order to make sense of the risks and hazards. Without the ownership engendered by such conversations, the possibility of effective action to eliminate or minimize them is greatly reduced.
Sensemaking of Patient Safety Risks and Hazards
Battles, James B; Dixon, Nancy M; Borotkanics, Robert J; Rabin-Fastmen, Barbara; Kaplan, Harold S
2006-01-01
In order for organizations to become learning organizations, they must make sense of their environment and learn from safety events. Sensemaking, as described by Weick (1995), literally means making sense of events. The ultimate goal of sensemaking is to build the understanding that can inform and direct actions to eliminate risks and hazards that are a threat to patient safety. True sensemaking in patient safety must use both retrospective and prospective approaches to learning. Sensemaking is an essential part of the design process leading to risk-informed design. Sensemaking serves as a conceptual framework to bring together well-established approaches to the assessment of risks and hazards: (1) at the single event level using root cause analysis (RCA), (2) at the process level using failure modes effects analysis (FMEA) and (3) at the system level using probabilistic risk assessment (PRA). The results of these separate or combined approaches are most effective when end users in conversation-based meetings add their expertise and knowledge to the data produced by the RCA, FMEA, and/or PRA in order to make sense of the risks and hazards. Without the ownership engendered by such conversations, the possibility of effective action to eliminate or minimize them is greatly reduced. PMID:16898979
An overview of safety assessment, regulation, and control of hazardous material use at NREL
NASA Astrophysics Data System (ADS)
Nelson, B. P.; Crandall, R. S.; Moskowitz, P. D.; Fthenakis, V. M.
1992-12-01
This paper summarizes the methodology we use to ensure the safe use of hazardous materials at the National Renewable Energy Laboratory (NREL). First, we analyze the processes and the materials used in those processes to identify the hazards presented. Then we study federal, state, and local regulations and apply the relevant requirements to our operations. When necessary, we generate internal safety documents to consolidate this information. We design research operations and support systems to conform to these requirements. Before we construct the systems, we perform a semiquantitative risk analysis on likely accident scenarios. All scenarios presenting an unacceptable risk require system or procedural modifications to reduce the risk. Following these modifications, we repeat the risk analysis to ensure that the respective accident scenarios present an acceptable risk. Once all risks are acceptable, we conduct an operational readiness review (ORR). A management-appointed panel performs the ORR ensuring compliance with all relevant requirements. After successful completion of the ORR, operations can begin.
Thermal reactive hazards of HMX with contaminants.
Peng, Deng-Jr; Chang, Cheng-Ming; Chiu, Miin
2004-10-18
In the past, many unexpected runaway accidents have occurred in manufacturing processes involving volatile chemical and explosive storage and transportation. Incompatible reactions of high explosives must be carefully considered in loss prevention strategies for thermal hazards risk analysis. High explosive reactions vary with environmental changes, contact materials, or process situations, such as shifts in onset temperature and reaction type when high explosives are in contact with contaminants. Therefore, the manufacture and handling of high explosives require the utmost in safety and loss prevention. HMX (cyclotetramethylene tetranitramine) is a high explosive widely used around the world that is stable and has high detonation strength. In this study, the influences of contaminants on HMX are investigated. The studied contaminants include ferrous chloride tetrahydrate, ferric chloride hexahydrate, acetone solution, acetic acid, and nitric acid. DSC thermal curves and incompatible reaction kinetic evaluations were performed for iron, chlorine and acid. Organic acetone solution has a lesser effect on HMX. Hopefully, this study will lead to improved thermal hazards risk analysis and reduce accidents.
Popova, A Yu; Trukhina, G M; Mikailova, O M
The article considers the quality control and safety system implemented in one of the largest flight catering food production plants serving airline passengers and flight crews. The control system was based on the Hazard Analysis and Critical Control Points (HACCP) principles and developed hygienic and antiepidemic measures. The identification of hazard factors at the stages of the technological process is considered. Results are presented from the analysis of monitoring data for 6 critical control points over a five-year period. The quality control and safety system reduces the risk of food contamination during the acceptance, preparation and supply of in-flight meals. The efficiency of the implemented system was demonstrated. Further ways of harmonizing and implementing HACCP principles in the plant are identified.
NASA Technical Reports Server (NTRS)
Lee, Jonggil
1990-01-01
High-resolution windspeed profile measurements are needed to provide reliable detection of hazardous low-altitude windshear with an airborne pulse Doppler radar. System phase noise in a Doppler weather radar may degrade the quality of spectrum moment estimation and the clutter cancellation capability, both of which are important in windshear detection. In addition, bias due to skewness of the weather return Doppler spectrum may cause large errors in pulse-pair spectral parameter estimates. These effects are analyzed to improve the signal processing design of an airborne Doppler weather radar. A method is presented for the direct measurement of the windspeed gradient using a low pulse repetition frequency (PRF) radar. This spatial gradient is essential in obtaining the windshear hazard index. As an alternative, the modified Prony method is suggested as a spectrum mode estimator for both the clutter and the weather signal. Estimation of Doppler spectrum modes may provide the desired windshear hazard information without the need for preliminary processing such as clutter filtering. The results obtained by processing a NASA simulation model output support consideration of mode identification as one component of a windshear detection algorithm.
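As an aside on the pulse-pair spectral parameter estimates discussed above, a minimal sketch of the conventional lag-one autocorrelation estimator is shown below; the wavelength, PRF, and sign convention are assumptions for illustration, not the paper's processing chain.

    import numpy as np

    def pulse_pair_moments(iq: np.ndarray, prf: float, wavelength: float):
        """Mean radial velocity and spectrum width from complex I/Q samples (one range gate)."""
        t = 1.0 / prf
        r0 = np.mean(np.abs(iq) ** 2)                  # lag-0 power
        r1 = np.mean(iq[1:] * np.conj(iq[:-1]))        # lag-1 autocorrelation
        velocity = -wavelength / (4 * np.pi * t) * np.angle(r1)
        width = (wavelength / (2 * np.pi * t * np.sqrt(2))) * np.sqrt(
            abs(np.log(abs(r1) / r0)))                 # Gaussian-spectrum approximation
        return velocity, width

    rng = np.random.default_rng(0)
    iq = rng.normal(size=64) + 1j * rng.normal(size=64)  # stand-in for radar samples
    print(pulse_pair_moments(iq, prf=3000.0, wavelength=0.0535))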
NASA Astrophysics Data System (ADS)
Rahamana, S. Abdul; Aruchamy, S.; Jegankumar, R.
2014-12-01
Landslides are one of the critical natural phenomena that frequently lead to serious problems in hilly areas, resulting in loss of human life and property as well as severe damage to natural resources. The local geology with steep slopes, coupled with high-intensity rainfall and unplanned human activities in the study area, causes many landslides in this region. The study area attracts tourists throughout the year, so it must be considered for preventive measures. Geospatial multicriteria decision analysis (MCDA) techniques are increasingly used for landslide vulnerability and hazard zonation mapping, as they enable the integration of different data layers with different levels of uncertainty. In this study, the analytic hierarchy process (AHP) method is used to prepare landslide hazard zones for Coonoor and Ooty, part of the Kallar watershed, The Nilgiris, Tamil Nadu. The study was carried out using remote sensing data, field surveys and geographic information system (GIS) tools. Ten factors that influence landslide occurrence were considered: elevation, slope aspect, slope angle, drainage density, lineament density, soil, precipitation, land use/land cover (LULC), distance from road, and NDVI. These factor layers were extracted from various related spatial data sets, evaluated, and then assigned individual factor weights and class weights. The Landslide Hazard Zone Index (LHZI) was calculated using the MCDA technique based on the weights and ratings given by the AHP method. The final cumulative map of the study area was categorized into four hazard zones, classified as zones I to IV: 3.56% of the area falls in hazard zone IV, followed by 48.19% in zone III, 43.63% in zone II, and 4.61% in hazard zone I. The resulting hazard zone map and the land use/land cover map were then overlaid to check the hazard status, and the existing inventory of known landslides within the study area was compared with the resulting vulnerability and hazard zone maps. The landslide hazard zonation map is useful for landslide hazard prevention, mitigation, and improvement to society, and for proper planning of land use and construction in the future.
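The AHP weighting step described above can be sketched as follows: weights are taken from the principal eigenvector of a pairwise comparison matrix and checked with Saaty's consistency ratio. The comparison values here are hypothetical, not the study's.

    import numpy as np

    def ahp_weights(pairwise: np.ndarray):
        """Priority weights from a pairwise comparison matrix (principal eigenvector)."""
        eigvals, eigvecs = np.linalg.eig(pairwise)
        k = np.argmax(eigvals.real)
        w = np.abs(eigvecs[:, k].real)
        w /= w.sum()
        n = pairwise.shape[0]
        ci = (eigvals[k].real - n) / (n - 1)           # consistency index
        ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]            # random index (Saaty)
        return w, ci / ri                              # weights, consistency ratio

    # Hypothetical 3x3 comparison of slope angle, precipitation, and land use.
    m = np.array([[1, 3, 5], [1/3, 1, 2], [1/5, 1/2, 1]], float)
    weights, cr = ahp_weights(m)
    print(weights, cr)   # CR < 0.1 is conventionally taken as acceptably consistent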
NASA Technical Reports Server (NTRS)
1997-01-01
When NASA started planning for manned space travel in 1959, the myriad challenges of sustaining life in space included a seemingly mundane but vitally important problem: How and what do you feed an astronaut? There were two main concerns: preventing food crumbs from contaminating the spacecraft's atmosphere or floating into sensitive instruments, and ensuring complete freedom from potentially catastrophic disease-producing bacteria, viruses, and toxins. To address these concerns, NASA enlisted the help of the Pillsbury Company. Pillsbury quickly solved the first problem by coating bite-size foods to prevent crumbling. To guard against bacterial contamination, they developed the hazard analysis and critical control point (HACCP) concept. Hazard analysis is a systematic study of a product, its ingredients, processing conditions, handling, storage, packing, distribution, and directions for consumer use to identify sensitive areas that might prove hazardous. Hazard analysis provides a basis for blueprinting the Critical Control Points (CCPs) to be monitored. CCPs are points in the chain from raw materials to the finished product where loss of control could result in unacceptable food safety risks. In early 1970, Pillsbury plants were following HACCP in the production of food for Earthbound consumers. Pillsbury's subsequent training courses for Food and Drug Administration (FDA) personnel led to the incorporation of HACCP into the FDA's Low Acid Canned Foods Regulations, set down in the mid-1970s to ensure the safety of all canned food products in the U.S.
Impact of the new handling recommendations for hazardous drugs in a hospital pharmacy service.
García-Alcántara, Beatriz G; Perelló Alomar, Catalina; Moreno Centeno, Elena; Modamio, Pilar; Mariño, Eduardo L; Delgado Sánchez, Olga
2017-03-01
To describe the actions taken by the Pharmacy Unit in a tertiary hospital in order to adapt to the recommendations established by NIOSH 2014 for handling hazardous drugs. Method: A retrospective observational study. A list was prepared including all hazardous drugs according to NIOSH 2014 that were available at the hospital as marketed or foreign drugs, or used in clinical trials, and the processes of acquisition, repackaging, preparation, circuits, organization, dispensing and identification were reviewed. Results: After the analysis, a report including all needs was prepared and sent to Hospital Management. Any relevant information about the handling and administration of hazardous drugs was included in the prescription computer program. There were changes in the acquisition process for two drugs, in order to avoid splitting and multi-dose formulations. An alternative or improvement was found for 35,253 of the 75,779 units of hazardous drugs repackaged in one year. The Pharmacy Unit took over the preparation of four non-sterile medications, as well as the preparation of all sterile parenteral medications included in Lists 1 and 2 that were not previously prepared there, and one from List 3. Information was also included about the preparation processes of magistral formulations that involve hazardous drugs from Lists 2 or 3. The adaptation to the NIOSH 2014 recommendations has represented a change, but also a significant reduction in the handling of hazardous drugs by the healthcare staff, thereby reducing the risk of occupational exposure.
21 CFR 120.24 - Process controls.
Code of Federal Regulations, 2010 CFR
2010-04-01
... 21 Food and Drugs 2 2010-04-01 2010-04-01 false Process controls. 120.24 Section 120.24 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) FOOD FOR HUMAN CONSUMPTION HAZARD ANALYSIS AND CRITICAL CONTROL POINT (HACCP) SYSTEMS Pathogen Reduction § 120.24 Process...
Landslide hazard assessment of the Black sea coastline (Caucasus, Russia) via drones
NASA Astrophysics Data System (ADS)
Kazeev, Andrey; Postoev, German; Fedotova, Ksenia
2017-04-01
Landslide hazard assessment of the slopes of Sochi was performed along the railway between the cities of Tuapse and Adler (total length 103 km). The railway passes through territory with active development of hazardous geological processes such as landslides, rock falls and debris flows. By the beginning of 2016, 36 landslide sites had been discovered along the railway (total length 34 km), 48 rock-fall sites (length 31 km), and 5 debris-flow sites (length 0.14 km). In recent years an intensification of deformations has been observed. For instance, during the previous decade (1996-2005) 28 sudden deformations occurred due to slope processes, causing interruptions in traffic, while in the present decade (2006-2015) 72 deformations were recorded. High landslide activity and economic loss determined the necessity of complex investigations of the engineering-geological conditions of landslide development and the causes of its intensification. A protection strategy needed to be developed to minimize negative consequences. Thus, the investigation of the landslide situation along the Tuapse - Adler railway included the categorization of landslide sites by level of hazard, with risk assessment based on numerical criteria. A preliminary evaluation of landslide hazard for the railway was conducted via analysis of archived engineering-geological documents. 13 of the 36 landslide sites (total length 13 km) were selected, reflecting the variety and peculiarities of landslide displacements on slopes (both active and inactive sites). Visual field observations of landslide slopes using a DJI Phantom 4 drone were completed during the second stage of this investigation. High-resolution photographs of landslide cirques, cracks, scarp walls and vegetation features were obtained via drone, which would have been impossible to obtain from the ground given the dense subtropical vegetation cover. Possible approaches to landslide activity and hazard assessment were evaluated: slope stability analysis, geophysical monitoring methods, analysis of critical deformations and critical velocities of displacement, analysis of changes in the conditions of landslide development during displacement, as well as scoring approaches to landslide hazard and risk assessment. As a result, a method of probabilistic estimation of landslide activity and hazard has been proposed, based on the selection and analysis of the main factors influencing landslide displacements. Slope steepness, landslide thickness, slope length, bedrock dip, slope relief, cracks, vegetation patterns and other factors were used for the assessment of the activity of landslide sites. The considered landslide sites were ranked by rate of activity as inactive, potentially active and active. The most active sites were used to identify the potentially most hazardous sites. Furthermore, the following factors were additionally considered: damage to railroad facilities due to landslides, landslide activity, thickness of the landslide at the toe of the slope, bedrock stratification, the conditions for cirque development, the position of the sliding surface relative to the railway, and the involvement of bedrock in the displaced mass. As a result, the investigated railroad sites were divided into three categories: non-hazardous, potentially hazardous and hazardous. The research was supported by the Russian Scientific Foundation (Project № 16-17-00125).
RMP Guidance for Warehouses - Appendix D: OSHA Guidance on PSM
This text is taken directly from OSHA's appendix C to the Process Safety Management standard (29 CFR 1910.119). Compiled information required by this standard, including material safety data sheets (MSDS), is essential to process hazards analysis (PHA).
NASA Astrophysics Data System (ADS)
A'diat, Arkan Addien Al; Liquiddanu, Eko; Laksono, Pringgo Widyo; Sutopo, Wahyudi; Suletra, I. Wayan
2018-02-01
Along with the increasing number of modern retail businesses in Indonesia comes an opportunity for small and medium enterprises (SMEs) to sell their products through modern retailers. SMEs face several obstacles, one of which concerns product standards: the standards an SME must hold are the GMP standard and the halal standard. This research was conducted to determine how a beef floss enterprise in Jagalan fulfills the GMP and halal standards. In addition, a Hazard Analysis and Critical Control Points (HACCP) system was applied to analyze the process. The HACCP used in this research was based on the seven principles in SNI (Indonesian National Standard) 01-4852-1998. The seven principles include hazard analysis, critical control point (CCP) determination, critical limit establishment, CCP monitoring system establishment, corrective action establishment, verification, and documentation establishment, all of which must be applied in preparing the HACCP plan. Based on this case study, it is concluded that there were 5 CCPs: the boiling process, the roasting process, the frying process, the beef floss draining process, and the packaging process.
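The CCP determination principle above is commonly operationalized with the Codex four-question decision tree; the sketch below is a much-simplified rendering of that logic, not the text of SNI 01-4852-1998.

    def is_ccp(control_exists: bool, step_eliminates: bool,
               contamination_possible: bool, later_step_controls: bool) -> bool:
        """Simplified Codex CCP decision tree (Q1-Q4); a sketch, not the standard's wording."""
        if not control_exists:
            return False          # Q1 (in practice: modify the step, then re-ask)
        if step_eliminates:
            return True           # Q2: step designed to eliminate/reduce the hazard
        if not contamination_possible:
            return False          # Q3: hazard cannot reach unacceptable levels here
        return not later_step_controls  # Q4: CCP only if no later step controls it

    # The frying step: designed to kill vegetative pathogens, hence a CCP.
    print(is_ccp(True, True, True, False))   # True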
NASA Astrophysics Data System (ADS)
Coca-Domínguez, Oswaldo; Ricaurte-Villota, Constanza; Morales-Giraldo, David; Rangel-Buitrago, Nelson
2014-05-01
Analysis of hazards and vulnerability associated with coastal erosion along coastlines is a key first step in establishing plans for adaptation to climate change in coastal areas. La Barra Town, Buenaventura (Pacific coast of Colombia) and the Providence - Santa Catalina Islands (Colombian Caribbean) were selected for a detailed analysis of coastal erosion hazard and vulnerability from different perspectives: i) physical (hazard), ii) social, iii) conservation, and iv) cultural heritage (Raizal). The analysis was made using a semi-quantitative approximation method, applying variables associated with the intrinsic coastal zone properties (i.e. type of beach, exposure of the coast to waves, etc.). Coastal erosion data and associated variables, as well as land use, conservation, and heritage data, were used to carry out a further detailed analysis of the human and structural vulnerability and exposure to hazards. The data show erosion rates close to -17 m/yr in La Barra Town (highlighting its critical condition and the urgency of a relocation process), while in some sectors of Providence Island, such as Old Town, the erosion rate was -5 m/yr. The observed erosion process directly affects land use and the local and regional economy. The differences between the indexes and the structural and physical vulnerability, as well as the use of methodological variables, are presented in the context of each region. All information was handled in a GIS environment, since this allows the information to be edited and updated continuously. The application of this methodology generates useful information to promote risk management as well as prevention, mitigation and reduction plans. In both areas adaptation must be a priority strategy, including relocation alternatives and sustainable protection supported by studies of coastal uses and future outlooks. The methodology is framed by the use of GIS tools and highlights their benefits in the analysis of information.
Yang, Yu; Jiang, Yong-Hai; Lian, Xin-Ying; Xi, Bei-Dou; Ma, Zhi-Fei; Xu, Xiang-Jian; An, Da
2016-12-01
Hazardous waste landfill sites are a significant source of groundwater pollution. To ensure that landfills with a significantly high risk of groundwater contamination are properly managed, a risk-based ranking method related to groundwater contamination is needed. In this research, a risk-based prioritization method for the classification of groundwater pollution from hazardous waste landfills was established. The method encompasses five phases: risk pre-screening, indicator selection, characterization, classification and, lastly, validation. In the risk ranking index system employed here, 14 indicators involving the hazardous waste landfills and migration through the vadose zone as well as the aquifer were selected. The boundaries of each indicator were determined by K-means cluster analysis, and the weight of each indicator was calculated by principal component analysis. These methods were applied to 37 hazardous waste landfills in China. The results showed that the risk of groundwater contamination from hazardous waste landfills could be ranked into three classes from low to high risk. In all, 62.2% of the hazardous waste landfill sites were classified in the low and medium risk classes. The process simulation method and standardized anomalies were used to validate the risk ranking; the results were consistent with the simulated results related to the characteristics of contamination. The risk ranking method is feasible and valid, and can provide reference data for the risk management of groundwater contamination at hazardous waste landfill sites.
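A minimal sketch of the clustering and weighting machinery named above, using randomly generated stand-in data; the paper's actual indicator boundaries and weights come from its 37 real sites, and the normalization scheme here is an assumption.

    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.decomposition import PCA
    from sklearn.preprocessing import StandardScaler

    # Hypothetical indicator matrix: rows are landfill sites, columns are indicators.
    rng = np.random.default_rng(1)
    x = rng.random((37, 14))

    z = StandardScaler().fit_transform(x)

    # PCA-derived indicator weights: variance-weighted loadings, normalized to sum to 1.
    pca = PCA().fit(z)
    weights = np.abs(pca.components_.T @ pca.explained_variance_ratio_)
    weights /= weights.sum()

    scores = z @ weights                                # composite risk score per site

    # K-means on the scores splits sites into low / medium / high risk classes.
    labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(
        scores.reshape(-1, 1))
    print(np.bincount(labels))                          # sites per risk class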
Dewettinck, T; Van Houtte, E; Geenens, D; Van Hege, K; Verstraete, W
2001-01-01
To obtain a sustainable water catchment in the dune area of the Flemish west coast, the integration of treated domestic wastewater in the existing potable water production process is planned. The hygienic hazards associated with the introduction of treated domestic wastewater into the water cycle are well recognised. Therefore, the concept of HACCP (Hazard Analysis and Critical Control Points) was used to guarantee hygienically safe drinking water production. Taking into account the literature data on the removal efficiencies of the proposed advanced treatment steps with regard to enteric viruses and protozoa and after setting high quality limits based on the recent progress in quantitative risk assessment, the critical control points (CCPs) and points of attention (POAs) were identified. Based on the HACCP analysis a specific monitoring strategy was developed which focused on the control of these CCPs and POAs.
RELEASE OF DRIED RADIOACTIVE WASTE MATERIALS TECHNICAL BASIS DOCUMENT
DOE Office of Scientific and Technical Information (OSTI.GOV)
KOZLOWSKI, S.D.
2007-05-30
This technical basis document was developed to support RPP-23429, Preliminary Documented Safety Analysis for the Demonstration Bulk Vitrification System (PDSA), and RPP-23479, Preliminary Documented Safety Analysis for the Contact-Handled Transuranic Mixed (CH-TRUM) Waste Facility. The main document describes the risk binning process and the technical basis for assigning risk bins to the representative accidents involving the release of dried radioactive waste materials from the Demonstration Bulk Vitrification System (DBVS) and to the associated represented hazardous conditions. Appendices D through F provide the technical basis for assigning risk bins to the representative dried waste release accident and associated represented hazardous conditions for the Contact-Handled Transuranic Mixed (CH-TRUM) Waste Packaging Unit (WPU). The risk binning process uses an evaluation of the frequency and consequence of a given representative accident or represented hazardous condition to determine the need for safety structures, systems, and components (SSC) and technical safety requirement (TSR)-level controls. A representative accident or a represented hazardous condition is assigned to a risk bin based on the potential radiological and toxicological consequences to the public and the collocated worker. Note that the risk binning process is not applied to facility workers because credible hazardous conditions with the potential for significant facility worker consequences are considered for safety-significant SSCs and/or TSR-level controls regardless of their estimated frequency. The controls for protection of the facility workers are described in RPP-23429 and RPP-23479. Determination of the need for safety-class SSCs was performed in accordance with DOE-STD-3009-94, Preparation Guide for U.S. Department of Energy Nonreactor Nuclear Facility Documented Safety Analyses, as described below.
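The frequency-consequence risk binning logic described above can be sketched as a simple lookup; the category names, matrix entries, and bin labels below are illustrative assumptions, not the values in the safety analysis.

    # Qualitative risk binning: bin labels and category cutpoints are illustrative
    # assumptions, not the DOE safety analysis values.
    FREQUENCY = ["beyond-extremely-unlikely", "extremely-unlikely", "unlikely", "anticipated"]
    CONSEQUENCE = ["negligible", "low", "moderate", "high"]

    # Rows: frequency category; columns: consequence category; entries: risk bin I-IV
    # (bin I drawing the strongest controls).
    MATRIX = [
        ["IV", "IV", "III", "III"],
        ["IV", "III", "III", "II"],
        ["III", "III", "II", "I"],
        ["III", "II", "I", "I"],
    ]

    def risk_bin(frequency: str, consequence: str) -> str:
        return MATRIX[FREQUENCY.index(frequency)][CONSEQUENCE.index(consequence)]

    print(risk_bin("unlikely", "high"))   # -> "I": candidate for TSR-level controls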
NASA Astrophysics Data System (ADS)
Meyer, F. J.; Webley, P. W.; Dehn, J.; Arko, S. A.; McAlpin, D. B.; Gong, W.
2016-12-01
Volcanic eruptions are among the most significant hazards to human society, capable of triggering natural disasters on regional to global scales. In the last decade, remote sensing has become established in operational volcano monitoring. Centers like the Alaska Volcano Observatory rely heavily on remote sensing data from optical and thermal sensors to provide time-critical hazard information. Despite this high use of remote sensing data, the presence of clouds and a dependence on solar illumination often limit their impact on decision making. Synthetic Aperture Radar (SAR) systems are widely considered superior to optical sensors in operational monitoring situations, due to their weather and illumination independence. Still, the contribution of SAR to operational volcano monitoring has been limited in the past due to high data costs, long processing times, and low temporal sampling rates of most SAR systems. In this study, we introduce the automatic SAR processing system SARVIEWS, whose advanced data analysis and data integration techniques allow, for the first time, a meaningful integration of SAR into operational monitoring systems. We will introduce the SARVIEWS database interface that allows for automatic, rapid, and seamless access to the data holdings of the Alaska Satellite Facility. We will also present a set of processing techniques designed to automatically generate a set of SAR-based hazard products (e.g. change detection maps, interferograms, geocoded images). The techniques take advantage of modern signal processing and radiometric normalization schemes, enabling the combination of data from different geometries. Finally, we will show how SAR-based hazard information is integrated in existing multi-sensor decision support tools to enable joint hazard analysis with data from optical and thermal sensors. We will showcase the SAR processing system using a set of recent natural disasters (both earthquakes and volcanic eruptions) to demonstrate its robustness. We will also show the benefit of integrating SAR with data from other sensors to support volcano monitoring. For historic eruptions at Okmok and Augustine volcano, both located in the North Pacific, we will demonstrate that the addition of SAR can lead to a significant improvement in activity detection and eruption forecasting.
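As one illustration of an automatically generated SAR hazard product, the sketch below implements a generic log-ratio change detection between two co-registered amplitude images; it is a textbook approach under assumed data, not SARVIEWS' actual algorithm.

    import numpy as np

    def log_ratio_change(before: np.ndarray, after: np.ndarray, k: float = 2.5):
        """Flag pixels whose backscatter changed between two co-registered amplitude images."""
        lr = np.log(after + 1e-6) - np.log(before + 1e-6)   # log-ratio suppresses
        z = (lr - lr.mean()) / lr.std()                     # multiplicative speckle
        return np.abs(z) > k                                # boolean change mask

    rng = np.random.default_rng(2)
    pre = rng.gamma(4.0, 1.0, (512, 512))     # speckle-like stand-in for SAR amplitude
    post = pre.copy()
    post[200:260, 200:260] *= 4.0             # simulated fresh deposit / new flow
    print(log_ratio_change(pre, post).sum())  # pixels flagged as changed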
21 CFR 120.24 - Process controls.
Code of Federal Regulations, 2011 CFR
2011-04-01
... 21 Food and Drugs 2 2011-04-01 2011-04-01 false Process controls. 120.24 Section 120.24 Food and... CONSUMPTION HAZARD ANALYSIS AND CRITICAL CONTROL POINT (HACCP) SYSTEMS Pathogen Reduction § 120.24 Process controls. (a) In order to meet the requirements of subpart A of this part, processors of juice products...
21 CFR 120.24 - Process controls.
Code of Federal Regulations, 2014 CFR
2014-04-01
... 21 Food and Drugs 2 2014-04-01 2014-04-01 false Process controls. 120.24 Section 120.24 Food and... CONSUMPTION HAZARD ANALYSIS AND CRITICAL CONTROL POINT (HACCP) SYSTEMS Pathogen Reduction § 120.24 Process controls. (a) In order to meet the requirements of subpart A of this part, processors of juice products...
21 CFR 120.24 - Process controls.
Code of Federal Regulations, 2012 CFR
2012-04-01
... 21 Food and Drugs 2 2012-04-01 2012-04-01 false Process controls. 120.24 Section 120.24 Food and... CONSUMPTION HAZARD ANALYSIS AND CRITICAL CONTROL POINT (HACCP) SYSTEMS Pathogen Reduction § 120.24 Process controls. (a) In order to meet the requirements of subpart A of this part, processors of juice products...
21 CFR 120.24 - Process controls.
Code of Federal Regulations, 2013 CFR
2013-04-01
... 21 Food and Drugs 2 2013-04-01 2013-04-01 false Process controls. 120.24 Section 120.24 Food and... CONSUMPTION HAZARD ANALYSIS AND CRITICAL CONTROL POINT (HACCP) SYSTEMS Pathogen Reduction § 120.24 Process controls. (a) In order to meet the requirements of subpart A of this part, processors of juice products...
NASA Technical Reports Server (NTRS)
Steele, Jimmy; Smith, Robert E.
1991-01-01
The ability to identify contaminants associated with experiments and facilities is directly related to the safety of the Space Station. A means of identifying these contaminants has been developed through this contracting effort. The delivered system provides a listing of the materials and/or chemicals associated with each facility, information on each contaminant's physical state, a list of the quantity and/or volume of each suspected contaminant, a database of the toxicological hazards associated with each contaminant, a recommended means of rapid identification of the contaminants under operational conditions, a failure modes and effects analysis method for each facility, and a fault tree-type analysis that provides a means of identifying potential hazardous conditions related to future planned missions.
Automation for System Safety Analysis
NASA Technical Reports Server (NTRS)
Malin, Jane T.; Fleming, Land; Throop, David; Thronesbery, Carroll; Flores, Joshua; Bennett, Ted; Wennberg, Paul
2009-01-01
This presentation describes work to integrate a set of tools to support early model-based analysis of failures and hazards due to system-software interactions. The tools perform and assist analysts in the following tasks: 1) extract model parts from text for architecture and safety/hazard models; 2) combine the parts with library information to develop the models for visualization and analysis; 3) perform graph analysis and simulation to identify and evaluate possible paths from hazard sources to vulnerable entities and functions, in nominal and anomalous system-software configurations and scenarios; and 4) identify resulting candidate scenarios for software integration testing. There has been significant technical progress in model extraction from Orion program text sources, architecture model derivation (components and connections) and documentation of extraction sources. Models have been derived from Internal Interface Requirements Documents (IIRDs) and FMEA documents. Linguistic text processing is used to extract model parts and relationships, and the Aerospace Ontology also aids automated model development from the extracted information. Visualizations of these models assist analysts in requirements overview and in checking consistency and completeness.
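Step 3 above, finding paths from hazard sources to vulnerable entities, amounts to path enumeration over the architecture graph; a minimal sketch with a hypothetical component graph (the names are invented for illustration, not taken from the Orion models):

    import networkx as nx

    # Hypothetical architecture graph: components and connections as in an extracted model.
    g = nx.DiGraph()
    g.add_edges_from([
        ("battery", "power_bus"), ("power_bus", "flight_computer"),
        ("power_bus", "heater"), ("flight_computer", "actuator"),
    ])

    hazard_sources = ["battery"]          # e.g. a thermal runaway source
    vulnerable = ["actuator"]             # function that must be protected

    # Enumerate propagation paths from each hazard source to each vulnerable entity;
    # each path is a candidate scenario for software integration testing.
    for src in hazard_sources:
        for dst in vulnerable:
            for path in nx.all_simple_paths(g, src, dst):
                print(" -> ".join(path))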
Pacific Basin conference on hazardous waste: Proceedings
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
This conference was held November 4-8, 1996 in Kuala Lumpur, Malaysia. The purpose of this conference was to provide a multidisciplinary forum for exchange of state-of-the-art information on the problems of hazardous waste. Topics of discussion deal with pollution prevention, waste treatment technology, health and ecosystem effects research, analysis and assessment, and regulatory management techniques. Individual papers have been processed separately for inclusion in the appropriate databases.
40 CFR 63.11502 - What definitions apply to this subpart?
Code of Federal Regulations, 2013 CFR
2013-07-01
...: process knowledge, an engineering assessment, or test data. Byproduct means a chemical (liquid, gas, or... limit applicable to the process vent. (4) Design analysis based on accepted chemical engineering... (CONTINUED) National Emission Standards for Hazardous Air Pollutants for Chemical Manufacturing Area Sources...
40 CFR 63.11502 - What definitions apply to this subpart?
Code of Federal Regulations, 2014 CFR
2014-07-01
...: process knowledge, an engineering assessment, or test data. Byproduct means a chemical (liquid, gas, or... limit applicable to the process vent. (4) Design analysis based on accepted chemical engineering... (CONTINUED) National Emission Standards for Hazardous Air Pollutants for Chemical Manufacturing Area Sources...
Bou Rached, Lizet; Ascanio, Norelis; Hernández, Pilar
2004-03-01
The Hazard Analysis and Critical Control Point (HACCP) system is a systematic, integral program used to identify and estimate the hazards (microbiological, chemical, and physical) and the risks generated during the primary production, processing, storage, distribution, sale, and consumption of foods. Establishing an HACCP program has several advantages, among them: emphasizing prevention over detection, diminishing costs, minimizing the risk of manufacturing faulty products, allowing greater management confidence, and strengthening national and international competitiveness. The present work is a proposal based on the design of an HACCP program to guarantee the safety of the Special-Type Bologna elaborated by a meat products plant, through the determination of hazards (microbiological, chemical, or physical), the identification of critical control points (CCPs), the establishment of critical limits, the planning of corrective actions, and the establishment of documentation and verification procedures. The methodology was based on the application of the seven basic principles set down by the Codex Alimentarius, yielding the design of this program. In view of the fact that meat products have recently been linked with pathogens such as E. coli O157:H7 and Listeria monocytogenes, these were considered as microbiological hazards in establishing the HACCP plan, whose application will guarantee a safe product.
Web processing service for landslide hazard assessment
NASA Astrophysics Data System (ADS)
Sandric, I.; Ursaru, P.; Chitu, D.; Mihai, B.; Savulescu, I.
2012-04-01
Hazard analysis requires heavy computation and specialized software. Web processing services can offer complex solutions that can be accessed through a light client (web or desktop). This paper presents a web processing service (both WPS and Esri Geoprocessing Service) for landslide hazard assessment. The web processing service was built with the Esri ArcGIS Server solution and Python, developed using ArcPy, GDAL Python and NumPy. A complex model for landslide hazard analysis, using both predisposing and triggering factors combined into a Bayesian temporal network with uncertainty propagation, was built and published as a WPS and Geoprocessing service using ArcGIS Standard Enterprise 10.1. The model uses as predisposing factors the first and second derivatives from the DEM, effective precipitation, runoff, lithology and land use. All these parameters can be served to the client from other WFS services or by uploading and processing the data on the server. The user can select the option of creating the first and second derivatives from the DEM automatically on the server or of uploading the data already calculated. One of the main dynamic factors in the landslide analysis model is the leaf area index (LAI). The LAI offers the advantage of modelling not just the changes between different time periods expressed in years, but also the seasonal changes in land use throughout a year. The LAI can be derived from various satellite images or downloaded as a product. The upload of such data (time series) is possible using the NetCDF file format. The model is run at a monthly time step, and for each time step all the parameter values and the a priori, conditional and posterior probabilities are obtained and stored in a log file. The validation process uses landslides that have occurred during the period up to the active time step and checks the records of the probabilities and parameter values for those time steps against the values of the active time step. Each time a landslide is positively identified, new a priori probabilities are recorded for each parameter. A complete log for the entire model is saved and used for statistical analysis, and a NetCDF file is created that can be downloaded from the server along with the log file.
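The monthly update of a priori, conditional, and posterior probabilities described above can be illustrated with a much-simplified naive-Bayes stand-in for the paper's Bayesian temporal network; the factor names and likelihood values are assumptions made for illustration.

    import numpy as np

    def posterior_landslide(prior: float, likelihoods: dict, observed: dict) -> float:
        """One naive-Bayes time step: P(slide | factors) from per-factor likelihoods.

        likelihoods[f][value] = (P(value | slide), P(value | no slide)); all assumed.
        """
        num = prior
        den = 1.0 - prior
        for f, value in observed.items():
            p_s, p_ns = likelihoods[f][value]
            num *= p_s
            den *= p_ns
        return num / (num + den)

    lik = {"rain": {"heavy": (0.7, 0.2), "light": (0.3, 0.8)},
           "lai":  {"low":   (0.6, 0.3), "high":  (0.4, 0.7)}}
    p = 0.01                                        # a priori monthly probability
    for month_obs in [{"rain": "heavy", "lai": "low"}] * 3:
        p = posterior_landslide(p, lik, month_obs)  # posterior feeds the next step
        print(round(p, 4))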
Arvanitoyannis, Ioannis S; Varzakas, Theodoros H
2009-08-01
Failure Mode and Effect Analysis (FMEA) has been applied to the risk assessment of snail manufacturing. A tentative approach to FMEA application in the snail industry was attempted in conjunction with ISO 22000. Preliminary Hazard Analysis was used to analyze and predict the occurring failure modes in a food chain system (snail processing plant), based on the functions, characteristics, and/or interactions of the ingredients or the processes upon which the system depends. Critical Control Points have been identified and implemented in the cause-and-effect diagram (also known as Ishikawa, tree diagram, and fishbone diagram). In this work a comparison of ISO 22000 analysis with HACCP is carried out for snail processing and packaging. The main emphasis, however, was put on the quantification of risk assessment by determining the Risk Priority Number (RPN) per identified processing hazard. Sterilization of tins, bioaccumulation of heavy metals, packaging of shells, and poisonous mushrooms were the processes identified as the ones with the highest RPN (280, 240, 147, and 144, respectively), and corrective actions were undertaken. Following the application of corrective actions, a second calculation of RPN values was carried out, leading to considerably lower values (below the upper acceptable limit of 130). It is noteworthy that the application of Ishikawa (cause-and-effect or tree) diagrams led to converging results, thus corroborating the validity of conclusions derived from the risk assessment and FMEA. Therefore, the incorporation of FMEA analysis within the ISO 22000 system of a snail processing industry is considered imperative.
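The RPN quantification above is the standard severity × occurrence × detection product; the sketch below uses illustrative scale values chosen to reproduce the quoted RPN of 280 and a post-correction value under the 130 limit.

    def rpn(severity: int, occurrence: int, detection: int) -> int:
        """Risk Priority Number on the usual 1-10 scales; 130 is the paper's acceptance limit."""
        return severity * occurrence * detection

    # Illustrative before/after values for the tin sterilization failure mode.
    before = rpn(8, 7, 5)    # 280, above the limit: corrective action required
    after = rpn(8, 4, 3)     # 96, below 130 after process controls are added
    print(before, after)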
Seismic risk assessment and application in the central United States
Wang, Z.
2011-01-01
Seismic risk is a somewhat subjective, but important, concept in earthquake engineering and other related decision-making. Another important concept closely related to seismic risk is seismic hazard. Although seismic hazard and seismic risk have often been used interchangeably, they are fundamentally different: seismic hazard describes the natural phenomenon or physical property of an earthquake, whereas seismic risk describes the probability of loss or damage that could be caused by a seismic hazard. The distinction between seismic hazard and seismic risk is of practical significance because measures for seismic hazard mitigation may differ from those for seismic risk reduction. Seismic risk assessment is a complicated process and starts with seismic hazard assessment. Although probabilistic seismic hazard analysis (PSHA) is the most widely used method for seismic hazard assessment, recent studies have found that PSHA is not scientifically valid. Use of PSHA will lead to (1) artifact estimates of seismic risk, (2) misleading use of the annual probability of exceedance (i.e., the probability of exceedance in one year) as a frequency (per year), and (3) numerical creation of extremely high ground motion. An alternative approach, similar to those used for flood and wind hazard assessments, has been proposed.
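The distinction the author draws between an annual rate and the annual probability of exceedance can be made concrete under the usual Poisson assumption; the sketch below shows where the two nearly coincide and where they diverge.

    import math

    def poisson_poe(annual_rate: float, years: float) -> float:
        """Probability of at least one exceedance in `years`, assuming a Poisson process."""
        return 1.0 - math.exp(-annual_rate * years)

    # An annual rate of 0.01/yr is often misread as "1% probability in one year";
    # the two only coincide approximately when the rate is small.
    print(poisson_poe(0.01, 1))    # ~0.00995, close to 0.01
    print(poisson_poe(0.01, 50))   # ~0.393 over a 50-year design life
    print(poisson_poe(2.0, 1))     # ~0.865, nothing like "2 per year" as a probability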
Risk assessment of major hazards and its application in urban planning: a case study.
Zhou, Yafei; Liu, Mao
2012-03-01
With the rapid development of industry in China, the number of establishments that are proposed or under construction is increasing year by year, and many are industries that handle flammable, explosive, toxic, harmful, and dangerous substances. Accidents such as fire, explosion, and toxic dispersion inevitably happen. Accidents resulting from these major hazards in cities cause large numbers of casualties and property losses. It is increasingly important to analyze the risk of major hazards in cities realistically and to suitably plan and utilize the surrounding land based on the risk analysis results, thereby reducing the hazards. A theoretical system for risk assessment of major hazards in cities is proposed in this article, and the major hazard risk for an entire city is analyzed quantitatively. Risks of various major accidents are considered together, the superposition effect is analyzed, individual risk contours of the entire city are drawn, and the level of risk in the city is assessed using "as low as reasonably practicable" (ALARP) guidelines. After the entire city's individual risk distribution is obtained, risk zones are divided according to the corresponding HSE individual risk criteria, and land-use planning suggestions are proposed. Finally, a city in China is used as an example to illustrate the risk assessment process for the city's major hazards and its application in urban land-use planning. The proposed method has theoretical and practical significance in establishing and improving risk analysis of major hazards and urban land-use planning. On the one hand, major urban public risk is avoided; on the other, the land is utilized in the best possible way to obtain the maximum benefit from its use.
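A minimal sketch of the superposition step, summing individual-risk contributions from several major hazard installations on a city grid and zoning the result; the decay model and the zone cutoffs are illustrative assumptions, not the HSE criteria applied in the paper.

    import numpy as np

    # Hypothetical per-hazard individual-risk grids (annual fatality frequency per cell).
    ny, nx = 100, 100
    yy, xx = np.mgrid[0:ny, 0:nx]

    def source_risk(cx, cy, peak, scale):
        """Distance-decaying individual-risk field around one major hazard installation."""
        d2 = (xx - cx) ** 2 + (yy - cy) ** 2
        return peak * np.exp(-d2 / (2 * scale ** 2))

    # Superposition: for rare events, individual-risk contributions are summed.
    total = source_risk(30, 40, 1e-4, 8) + source_risk(70, 60, 5e-5, 12)

    # Zone the city with ALARP-style cutoffs (values are illustrative only).
    zones = np.digitize(total, [1e-6, 1e-5, 1e-4])
    print(np.bincount(zones.ravel()))    # cells per risk zone, low to high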
49 CFR 659.31 - Hazard management process.
Code of Federal Regulations, 2014 CFR
2014-10-01
... 49 Transportation 7 2014-10-01 2014-10-01 false Hazard management process. 659.31 Section 659.31... Agency § 659.31 Hazard management process. (a) The oversight agency must require the rail transit agency..., operational changes, or other changes within the rail transit environment. (b) The hazard management process...
49 CFR 659.31 - Hazard management process.
Code of Federal Regulations, 2013 CFR
2013-10-01
... 49 Transportation 7 2013-10-01 2013-10-01 false Hazard management process. 659.31 Section 659.31... Agency § 659.31 Hazard management process. (a) The oversight agency must require the rail transit agency..., operational changes, or other changes within the rail transit environment. (b) The hazard management process...
49 CFR 659.31 - Hazard management process.
Code of Federal Regulations, 2012 CFR
2012-10-01
... 49 Transportation 7 2012-10-01 2012-10-01 false Hazard management process. 659.31 Section 659.31... Agency § 659.31 Hazard management process. (a) The oversight agency must require the rail transit agency..., operational changes, or other changes within the rail transit environment. (b) The hazard management process...
This Applications Analysis Report evaluates the solidification/stabilization treatment process of Silicate Technology Corporation (STC) for the on-site treatment of hazardous waste. The STC immobilization technology utilizes a proprietary product (FMS Silicate) to chemically stab...
14 CFR 417.405 - Ground safety analysis.
Code of Federal Regulations, 2011 CFR
2011-01-01
... hazard from affecting the public. A launch operator must incorporate the launch site operator's systems... personnel who are knowledgeable of launch vehicle systems, launch processing, ground systems, operations...) Begin a ground safety analysis by identifying the systems and operations to be analyzed; (2) Define the...
14 CFR 417.405 - Ground safety analysis.
Code of Federal Regulations, 2010 CFR
2010-01-01
... hazard from affecting the public. A launch operator must incorporate the launch site operator's systems... personnel who are knowledgeable of launch vehicle systems, launch processing, ground systems, operations...) Begin a ground safety analysis by identifying the systems and operations to be analyzed; (2) Define the...
NASA Technical Reports Server (NTRS)
Belcastro, Christine M.; Newman, Richard L.; Crider, Dennis A.; Klyde, David H.; Foster, John V.; Groff, Loren
2016-01-01
Aircraft loss of control (LOC) is a leading cause of fatal accidents across all transport airplane and operational classes. LOC can result from a wide spectrum of precursors (or hazards), often occurring in combination. Technologies developed for LOC prevention and recovery must therefore be effective under a wide variety of conditions and uncertainties, including multiple hazards, and the validation process must provide a means of assessing system effectiveness and coverage of these hazards. This paper provides a detailed description of a methodology for analyzing LOC as a dynamics and control problem for the purpose of developing effective technology solutions. The paper includes a definition of LOC based on several recent publications, a detailed description of a refined LOC accident analysis process that is illustrated via selected example cases, and a description of planned follow-on activities for identifying future potential LOC risks and the development of LOC test scenarios. Some preliminary considerations for LOC of Unmanned Aircraft Systems (UAS) and for their safe integration into the National Airspace System (NAS) are also discussed.
21 CFR 120.7 - Hazard analysis.
Code of Federal Regulations, 2012 CFR
2012-04-01
... 21 Food and Drugs 2 2012-04-01 2012-04-01 false Hazard analysis. 120.7 Section 120.7 Food and... hazards. The written hazard analysis shall consist of at least the following: (1) Identification of food..., including food hazards that can occur before, during, and after harvest. The hazard analysis shall be...
21 CFR 120.7 - Hazard analysis.
Code of Federal Regulations, 2013 CFR
2013-04-01
... 21 Food and Drugs 2 2013-04-01 2013-04-01 false Hazard analysis. 120.7 Section 120.7 Food and... hazards. The written hazard analysis shall consist of at least the following: (1) Identification of food..., including food hazards that can occur before, during, and after harvest. The hazard analysis shall be...
21 CFR 120.7 - Hazard analysis.
Code of Federal Regulations, 2011 CFR
2011-04-01
... 21 Food and Drugs 2 2011-04-01 2011-04-01 false Hazard analysis. 120.7 Section 120.7 Food and... hazards. The written hazard analysis shall consist of at least the following: (1) Identification of food..., including food hazards that can occur before, during, and after harvest. The hazard analysis shall be...
21 CFR 120.7 - Hazard analysis.
Code of Federal Regulations, 2014 CFR
2014-04-01
... 21 Food and Drugs 2 2014-04-01 2014-04-01 false Hazard analysis. 120.7 Section 120.7 Food and... hazards. The written hazard analysis shall consist of at least the following: (1) Identification of food..., including food hazards that can occur before, during, and after harvest. The hazard analysis shall be...
Relocating San Miguel Volcanic Seismic Events for Receiver Functions and Tomographic Models
NASA Astrophysics Data System (ADS)
Patlan, E.; Velasco, A. A.; Konter, J.
2009-12-01
The San Miguel volcano lies near the city of San Miguel, El Salvador (13.43N and -88.26W). San Miguel volcano, an active stratovolcano, presents a significant natural hazard for the city of San Miguel. Furthermore, the internal state and activity of volcanoes remains an important component to understanding volcanic hazard. The main technology for addressing volcanic hazards and processes is through the analysis of data collected from the deployment of seismic sensors that record ground motion. Six UTEP seismic stations were deployed around San Miguel volcano from 2007-2008 to define the magma chamber and assess the seismic and volcanic hazard. We utilize these data to develop images of the earth structure beneath the volcano, studying the volcanic processes by identifying different sources, and investigating the role of earthquakes and faults in controlling the volcanic processes. We will calculate receiver functions to determine the thickness of San Miguel volcano internal structure, within the Caribbean plate. Crustal thicknesses will be modeled using calculated receiver functions from both theoretical and hand-picked P-wave arrivals. We will use this information derived from receiver functions, along with P-wave delay times, to map the location of the magma chamber.
FLAMMABLE GAS TECHNICAL BASIS DOCUMENT
DOE Office of Scientific and Technical Information (OSTI.GOV)
KRIPPS, L.J.
2005-02-18
This document describes the qualitative evaluation of frequency and consequences for double-shell tank (DST) and single-shell tank (SST) representative flammable gas accidents and associated hazardous conditions without controls. The evaluation indicated that safety-significant SSCs and/or TSRs were required to prevent or mitigate flammable gas accidents. Discussion of the resulting control decisions is included. This technical basis document was developed to support the Tank Farms Documented Safety Analysis (DSA) and describes the risk binning process for the flammable gas representative accidents and associated represented hazardous conditions. The purpose of the risk binning process is to determine the need for safety-significant structures, systems, and components (SSC) and technical safety requirement (TSR)-level controls for a given representative accident or represented hazardous condition based on an evaluation of the event frequency and consequence.
NASA Astrophysics Data System (ADS)
Mencin, David; Hodgkinson, Kathleen; Sievers, Charlie; Phillips, David; Meertens, Charles; Mattioli, Glen
2017-04-01
UNAVCO has been providing infrastructure and support for solid-earth sciences and earthquake natural hazards for the past two decades. Recent advances in GNSS technology and data processing are now providing position solutions with centimeter-level precision at high rate (>1 Hz) and low latency (i.e. the time required for data to arrive for analysis, in this case less than 1 second). These data have the potential to improve our understanding in diverse areas of geophysics, including properties of seismic, volcanic, magmatic and tsunami sources, and thus profoundly transform rapid event characterization and warning. Scientific and operational applications also include glacier and ice sheet motions, tropospheric modeling, and space weather. These areas of geophysics represent a spectrum of research fields, including geodesy, seismology, tropospheric weather, space weather and natural hazards. Processed real-time GNSS (RT-GNSS) data will require formats and standards that allow this broad and diverse community to use these data and associated metadata in existing research infrastructure. These advances have critically highlighted the difficulties associated with merging data and metadata between scientific disciplines. Even seemingly closely related fields such as geodesy and seismology, which both have rich histories of handling large volumes of data and metadata, do not mesh in any automated way. Community analysis strategies, or the lack thereof, such as the treatment of error, prove difficult to address and are reflected in the data and metadata. In addition, these communities have differing security, accessibility and reliability requirements. We propose some solutions to the particular problem of making RT-GNSS processed solution data and metadata accessible to multiple scientific and natural hazard communities. Importantly, we discuss the roadblocks encountered and solved, and those that remain to be addressed.
Sensitivity to volcanic field boundary
NASA Astrophysics Data System (ADS)
Runge, Melody; Bebbington, Mark; Cronin, Shane; Lindsay, Jan; Rashad Moufti, Mohammed
2016-04-01
Volcanic hazard analyses are desirable where there is potential for future volcanic activity to affect a proximal population. This is frequently the case for volcanic fields (regions of distributed volcanism), where low eruption rates, fertile soil, and attractive landscapes draw populations to live close by. Forecasting future activity in volcanic fields almost invariably uses spatial or spatio-temporal point processes, with model selection and development based on exploratory analyses of previous eruption data. For identifiability reasons, spatio-temporal processes (and, practically, spatial processes as well) require the definition of a spatial region to which volcanism is confined. However, due to the complex and predominantly unknown sub-surface processes driving volcanic eruptions, defining such a region solely from geological information is currently impossible. Thus, the current approach is to fit a shape to the known previous eruption sites. The class of boundary shape is an unavoidable subjective decision taken by the forecaster that is often overlooked during subsequent analysis of results. This study shows the substantial effect that this choice may have on even the simplest exploratory methods for hazard forecasting, illustrated using four commonly used exploratory statistical methods and two very different regions: the Auckland Volcanic Field, New Zealand, and Harrat Rahat, Kingdom of Saudi Arabia. For Harrat Rahat, the sensitivity of results to the boundary definition is substantial. For the Auckland Volcanic Field, the range of options resulted in similar shapes; nevertheless, some of the statistical tests still showed substantial variation in results. This work highlights that when carrying out any hazard analysis on volcanic fields, it is vital to specify how the volcanic field boundary has been defined, to assess the sensitivity to the boundary choice, and to carry these assumptions and related uncertainties through to estimates of future activity and hazard analyses.
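The boundary sensitivity discussed above can be illustrated numerically: the assumed field area directly scales the spatial intensity that enters a hazard model. The sketch below compares a convex hull with an arbitrarily dilated hull for hypothetical vent locations; both the vent data and the 30% dilation are assumptions for illustration.

    import numpy as np
    from scipy.spatial import ConvexHull

    rng = np.random.default_rng(3)
    vents = rng.normal(size=(50, 2)) * [12.0, 5.0]   # hypothetical vent locations (km)

    hull = ConvexHull(vents)
    area_hull = hull.volume                          # in 2-D, .volume is the area

    # A common alternative boundary: dilate the hull outward about the centroid,
    # which inflates the admissible region substantially.
    centroid = vents.mean(axis=0)
    pts = vents[hull.vertices]
    scaled = centroid + (pts - centroid) * 1.3       # crude 30% dilation of the hull
    area_dilated = ConvexHull(scaled).volume

    # The spatial intensity (vents per km^2) scales inversely with the assumed area,
    # so the boundary choice propagates directly into the hazard estimate.
    print(len(vents) / area_hull, len(vents) / area_dilated)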
NASA Astrophysics Data System (ADS)
Weatherill, Graeme; Garcia, Julio; Poggi, Valerio; Chen, Yen-Shin; Pagani, Marco
2016-04-01
The Global Earthquake Model (GEM) has, since its inception in 2009, made many contributions to the practice of seismic hazard modeling in different regions of the globe. The OpenQuake-engine (hereafter referred to simply as OpenQuake), GEM's open-source software for calculation of earthquake hazard and risk, has found application in many countries, spanning a diversity of tectonic environments. GEM itself has produced a database of national and regional seismic hazard models, harmonizing into OpenQuake's own definition the varied seismogenic sources found therein. The characterization of active faults in probabilistic seismic hazard analysis (PSHA) is at the centre of this process, motivating many of the developments in OpenQuake and presenting hazard modellers with the challenge of reconciling seismological, geological and geodetic information for the different regions of the world. Faced with these challenges, and from the experience gained in the process of harmonizing existing models of seismic hazard, four critical issues are addressed. The challenge GEM has faced in the development of software is how to define a representation of an active fault (both in terms of geometry and earthquake behaviour) that is sufficiently flexible to adapt to different tectonic conditions and levels of data completeness. By exploring the different fault typologies supported by OpenQuake we illustrate how seismic hazard calculations can, and do, take into account complexities such as geometrical irregularity of faults in the prediction of ground motion, highlighting some of the potential pitfalls and inconsistencies that can arise. This exploration leads to the second main challenge in active fault modeling, what elements of the fault source model impact most upon the hazard at a site, and when does this matter? Through a series of sensitivity studies we show how different configurations of fault geometry, and the corresponding characterisation of near-fault phenomena (including hanging wall and directivity effects) within modern ground motion prediction equations, can have an influence on the seismic hazard at a site. Yet we also illustrate the conditions under which these effects may be partially tempered when considering the full uncertainty in rupture behaviour within the fault system. The third challenge is the development of efficient means for representing both aleatory and epistemic uncertainties from active fault models in PSHA. In implementing state-of-the-art seismic hazard models into OpenQuake, such as those recently undertaken in California and Japan, new modeling techniques are needed that redefine how we treat interdependence of ruptures within the model (such as mutual exclusivity), and the propagation of uncertainties emerging from geology. Finally, we illustrate how OpenQuake, and GEM's additional toolkits for model preparation, can be applied to address long-standing issues in active fault modeling in PSHA. These include constraining the seismogenic coupling of a fault and the partitioning of seismic moment between the active fault surfaces and the surrounding seismogenic crust. We illustrate some of the possible roles that geodesy can play in the process, but highlight where this may introduce new uncertainties and potential biases into the seismic hazard process, and how these can be addressed.
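For readers unfamiliar with the hazard integral at the core of PSHA, the sketch below computes a toy hazard curve for a single source; every coefficient is an assumption made for illustration and is neither OpenQuake's implementation nor a published ground-motion model.

    import numpy as np
    from scipy.stats import norm

    # Toy hazard integral for a single source.
    mags = np.arange(5.0, 7.6, 0.1)
    cum = 10.0 ** (4.0 - 1.0 * mags)                 # Gutenberg-Richter cumulative rates
    rates = cum - np.append(cum[1:], 0.0)            # incremental rate per magnitude bin

    def p_exceed(pga_g, m, r_km=10.0, sigma=0.6):
        """P(PGA > pga_g | m, r) from a toy log-linear attenuation with lognormal scatter."""
        ln_median = -1.0 + 0.9 * m - 1.3 * np.log(r_km + 10.0)
        return norm.sf((np.log(pga_g) - ln_median) / sigma)

    pga_grid = np.logspace(-2, 0.5, 30)              # 0.01 g to ~3 g
    hazard = np.array([(rates * p_exceed(x, mags)).sum() for x in pga_grid])
    print(hazard[[0, -1]])                           # annual exceedance rates, high to low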
NASA Astrophysics Data System (ADS)
Hua, H.; Owen, S. E.; Yun, S.; Lundgren, P.; Fielding, E. J.; Agram, P.; Manipon, G.; Stough, T. M.; Simons, M.; Rosen, P. A.; Wilson, B. D.; Poland, M. P.; Cervelli, P. F.; Cruz, J.
2013-12-01
Space-based geodetic measurement techniques such as Interferometric Synthetic Aperture Radar (InSAR) and Continuous Global Positioning System (CGPS) are now important elements in our toolset for monitoring earthquake-generating faults, volcanic eruptions, hurricane damage, landslides, reservoir subsidence, and other natural and man-made hazards. Geodetic imaging's unique ability to capture surface deformation with high spatial and temporal resolution has revolutionized both earthquake science and volcanology. Continuous monitoring of surface deformation and surface change before, during, and after natural hazards improves decision-making from better forecasts, increased situational awareness, and more informed recovery. However, analyses of InSAR and GPS data sets are currently handcrafted following events and are not generated rapidly and reliably enough for use in operational response to natural disasters. Additionally, the sheer data volumes needed to handle a continuous stream of InSAR data sets also presents a bottleneck. It has been estimated that continuous processing of InSAR coverage of California alone over 3-years would reach PB-scale data volumes. Our Advanced Rapid Imaging and Analysis for Monitoring Hazards (ARIA-MH) science data system enables both science and decision-making communities to monitor areas of interest with derived geodetic data products via seamless data preparation, processing, discovery, and access. We will present our findings on the use of hybrid-cloud computing to improve the timely processing and delivery of geodetic data products, integrating event notifications from USGS to improve the timely processing for response, as well as providing browse results for quick looks with other tools for integrative analysis.
The price of safety: costs for mitigating and coping with Alpine hazards
NASA Astrophysics Data System (ADS)
Pfurtscheller, C.; Thieken, A. H.
2013-10-01
Due to limited public budgets and the need to economize, the analysis of the costs of hazard mitigation and emergency management becomes increasingly important for public natural hazard and risk management. In recent years there has been a growing body of literature on the estimation of losses, which has helped to determine the benefits of measures in terms of prevented losses. By contrast, the costs of mitigation are hardly addressed. This paper thus aims to shed some light on expenses for mitigation and emergency services. For this, we analysed the annual costs of mitigation efforts in four regions/countries of the Alpine Arc: Bavaria (Germany), Tyrol (Austria), South Tyrol (Italy) and Switzerland. On the basis of PPP values (purchasing power parities), annual expenses on public safety ranged from EUR 44 per capita in the Free State of Bavaria to EUR 216 in the Autonomous Province of South Tyrol. To analyse the (variable) costs of emergency services in case of an event, we used detailed data from the 2005 floods in the Federal State of Tyrol (Austria) as well as aggregated data from the 2002 floods in Germany. The analysis revealed that multi-hazards, i.e. the occurrence and intermixture of different natural hazard processes, contribute to increasing emergency costs. Based on these findings, research gaps and recommendations for costing Alpine natural hazards are discussed.
An Approach for Rapid Assessment of Seismic Hazards in Turkey by Continuous GPS Data
Ozener, Haluk; Dogru, Asli; Unlutepe, Ahmet
2009-01-01
The Earth is being monitored every day by all kinds of sensors, which nowadays leads to an overflow of data in all branches of science, especially the Earth sciences. Data storage and data processing are problems to be solved by current technologies, as is access to and analysis of these large data sources. Once solutions have been created for collecting, storing and accessing data, the challenge becomes how to effectively share data, applications and processing resources across many locations. Global Positioning System (GPS) sensors are used as geodetic instruments to precisely detect crustal motion at the Earth's surface. Rapid access to data provided by GPS sensors is becoming increasingly important for deformation monitoring and rapid hazard assessments. Today, reliable and fast collection and distribution of data is a challenge, and advances in Internet technologies have made it easier to provide the needed data. This study describes a system that will be able to generate strain maps from continuous GPS station data for seismic hazard analysis. Strain rates are a key factor in seismic hazard analyses. Turkey is a country prone to earthquakes, with a long history of seismic hazards and disasters. This situation has resulted in studies by Earth scientists focusing on Turkey in order to improve understanding of the Earth's crustal structure and seismic hazards. Nevertheless, the construction of models, data access and analysis are often not as fast as expected, but the combination of Internet technologies with continuous GPS sensors can be a solution to this problem. The system has the potential to answer important questions for assessing seismic hazards, such as how much stretching, squashing and shearing is taking place in different parts of Turkey, and how velocities change from place to place. Seismic hazard estimation is the most effective way to reduce earthquake losses. It is clear that reliable data and on-line services will support the preparation of strategies for disaster management and planning to cope with hazards.
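The strain-map step above reduces, in its simplest uniform-gradient form, to a least-squares fit of a velocity gradient tensor to station velocities; the sketch below uses hypothetical station data, not the Turkish network.

    import numpy as np

    def strain_rate(xy_km: np.ndarray, v_mm_yr: np.ndarray):
        """Least-squares uniform velocity-gradient fit to station velocities.

        xy_km: (n, 2) station positions; v_mm_yr: (n, 2) east/north velocities.
        Returns the 2x2 strain-rate tensor in (mm/yr)/km.
        """
        n = len(xy_km)
        g = np.zeros((2 * n, 6))
        g[0::2, 0] = 1; g[0::2, 2] = xy_km[:, 0]; g[0::2, 3] = xy_km[:, 1]
        g[1::2, 1] = 1; g[1::2, 4] = xy_km[:, 0]; g[1::2, 5] = xy_km[:, 1]
        m, *_ = np.linalg.lstsq(g, v_mm_yr.ravel(), rcond=None)
        l = np.array([[m[2], m[3]], [m[4], m[5]]])   # velocity gradient tensor
        return 0.5 * (l + l.T)                       # symmetric part = strain rate

    xy = np.array([[0, 0], [50, 0], [0, 50], [50, 50]], float)
    v = np.array([[0, 0], [2, 0], [0, -1], [2, -1]], float)  # hypothetical velocities
    print(strain_rate(xy, v))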
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brown, A.
2014-04-27
One method of remediating legacy liquid radioactive waste produced during the Cold War is aggressive in-tank chemical cleaning. Chemical cleaning has successfully reduced the curie content of residual waste heels in large underground storage tanks; however, this process generates significant chemical hazards. Mercury is often the bounding hazard due to its extensive use in the separations process that produced the waste. This paper explores how variations in controllable process factors, tank level and temperature, may be manipulated to reduce the hazard potential related to mercury vapor generation. When compared using a multivariate regression analysis, findings indicated that there was a significant relationship between both tank level (p value of 1.65x10^-23) and temperature (p value of 6.39x10^-6) and the mercury vapor concentration in the tank ventilation system. Tank temperature showed the most promise as a controllable parameter for future tank cleaning endeavors. Despite the statistically significant relationships, there may not be confidence in the ability to control accident scenarios to below mercury's IDLH or PAC-III levels for future cleaning initiatives.
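[Editorial illustration] A minimal sketch of the kind of multivariate regression described above, using the statsmodels library on synthetic stand-in data; the variable names, units and values are hypothetical, not the study's measurements.

    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    # Synthetic stand-ins for the measured variables (illustrative only)
    level = rng.uniform(100, 300, 200)   # tank level, inches
    temp = rng.uniform(20, 60, 200)      # tank temperature, deg C
    hg = 0.02 * temp - 0.005 * level + rng.normal(0, 0.3, 200)  # Hg vapor conc.

    X = sm.add_constant(np.column_stack([level, temp]))
    fit = sm.OLS(hg, X).fit()
    print(fit.params)   # intercept, level and temperature coefficients
    print(fit.pvalues)  # significance of each term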
Cascading processes and interactions in torrent catchments and their influence on the damage pattern
NASA Astrophysics Data System (ADS)
Keiler, Margreth; Gebbers, David
2014-05-01
Research on single geomorphological processes during damaging events has a long history; however, comprehensive documentation and analysis of such events was not conducted until the late 1980s. Since then, insights have been gained into the triggering, evolution and impacts of processes during highly damaging events and into the resulting damage. In the majority of cases, though, the processes were studied within the well-defined procedure of a single disciplinary focus. Such focused studies neglect mutual influences that may alter the sequence of the processes or of the event. During damaging events, multiple geomorphological processes are active, which leads to the assumption that they have a certain impact on each other and on the course of the damage. Consequently, for a comprehensive hazard and risk analysis, all processes of a catchment have to be analysed and evaluated quantitatively and qualitatively (MARZOCCHI, 2007). Although the demand for sophisticated risk management is increasing, research on interactions, as well as on physical vulnerability to multiple hazards including the impact effects of the different processes, is still very limited (KAPPES et al., 2010, 2011). The challenges in this field are the quantity of data needed and the complexity of conducting this kind of analysis (KAPPES et al., 2012). Yet knowledge about possible interactions and the resulting impact effects could contribute significantly to the reduction of risk in a region. The objective of this study is to analyse i) how geomorphological processes interact with each other and with other factors of their surroundings during a damaging event, ii) what influence those interactions have on the resulting damage of the event, and iii) whether or not different events are comparable in terms of those interactions and their impacts. To meet these objectives, 15 damaging torrent events, which occurred between 2000 and 2011 in the Bernese Oberland and the Pennine Alps, Switzerland, were analysed on the basis of event reports and general catchment parameters. The interactions were classified into categories according to the process and its interacting counterpart (another process, structures, or disposition) and the temporal and spatial extent in which these interactions occurred. Additionally, positive and negative feedbacks of the processes were considered. First results highlight that some types of interaction can be extracted in several events and that their temporal and spatial extents are comparable. However, the analysis indicates that single interactions exhibit multi-path consequences, which is a challenge for general propositions about interactions influencing damage patterns. In the next step of this study, clusters of interactions which could occur in different events in similar ways are analysed in more detail. REFERENCES Kappes, M.S., Papathoma-Köhle, M. & Keiler, M. 2011: Assessing physical vulnerability for multi-hazards using an indicator-based methodology. Applied Geography, 32, 577-590. Kappes, M.S., Keiler, M. & Glade, T. 2010: From single- to multi-hazard risk analyses: a concept addressing emerging challenges. In: Malet, J.-P., Glade, T. & Casagli, N. (eds.) Mountain Risks: bringing science to society. Proceedings of the 'Mountain Risks' International Conference, Firenze, Italy. Strasbourg, CERG Editions, 351-356. Kappes, M.S., Keiler, M., von Elverfeldt, K. & Glade, T. 2012: Challenges of analyzing multi-hazard risk: a review. Natural Hazards, 64, 1925-1958.
Marzocchi, W., Mastellone, M.L. & Ruocco, A. 2009: Principles of multi-risk assessment: Interaction amongst natural and man-induced risks. European Commission, Brussels.
Hazards on Hazards, Ensuring Spacecraft Safety While Sampling Asteroid Surface Materials
NASA Astrophysics Data System (ADS)
Johnson, C. A.; DellaGiustina, D. N.
2016-12-01
The near-Earth object Bennu is a carbonaceous asteroid that is a remnant from the earliest stages of solar-system formation. It is also a potentially hazardous asteroid with a relatively high probability of impacting Earth late in the 22nd century. While the primary focus of the NASA-funded OSIRIS-REx mission is the return of pristine organic material from the asteroid's surface, information about Bennu's physical and chemical properties gleaned throughout operations will be critical for a possible future impact mitigation mission. In order to ensure a regolith sample can be successfully acquired, the sample site and surrounding area must be thoroughly assessed for any potential hazards to the spacecraft. The OSIRIS-REx Image Processing Working Group has been tasked with generating global and site-specific hazard maps using mosaics and a trio of feature identification techniques: expert-led manual classification, internet-based amateur classification using the citizen science platform CosmoQuest, and automated classification using machine learning and computer vision tools. Because proximity operations around Bennu do not begin until the end of 2018, we have an opportunity to test the performance of our software on analogue surfaces of other asteroids from previous NASA and other space agency missions. The entire pipeline, from image processing and mosaicking to hazard identification, analysis and mapping, will be performed on asteroids of varying size, shape and surface morphology. As a result, upon arrival at Bennu, we will have the software and processes in place to quickly and confidently produce the hazard maps needed to ensure the success of our mission.
40 CFR 63.404 - Compliance demonstrations.
Code of Federal Regulations, 2010 CFR
2010-07-01
... Standards for Hazardous Air Pollutants for Industrial Process Cooling Towers § 63.404 Compliance... Act, the Administrator or delegated authority can require cooling water sample analysis of an IPCT if... accordance with paragraph (c) of this section in lieu of a water sample analysis. If cooling water sample...
Optimizing Web-Based Instruction: A Case Study Using Poultry Processing Unit Operations
ERIC Educational Resources Information Center
O' Bryan, Corliss A.; Crandall, Philip G.; Shores-Ellis, Katrina; Johnson, Donald M.; Ricke, Steven C.; Marcy, John
2009-01-01
Food companies and supporting industries need inexpensive, revisable training methods for large numbers of hourly employees due to continuing improvements in Hazard Analysis Critical Control Point (HACCP) programs, new processing equipment, and high employee turnover. HACCP-based food safety programs have demonstrated their value by reducing the…
ELI ECO Logic International, Inc.'s Thermal Desorption Unit (TDU) is specifically designed for use with Eco Logic's Gas Phase Chemical Reduction Process. The technology uses an externally heated bath of molten tin in a hydrogen atmosphere to desorb hazardous organic compounds fro...
21 CFR 120.25 - Process verification for certain processors.
Code of Federal Regulations, 2011 CFR
2011-04-01
... 21 Food and Drugs 2 2011-04-01 2011-04-01 false Process verification for certain processors. 120.25 Section 120.25 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) FOOD FOR HUMAN CONSUMPTION HAZARD ANALYSIS AND CRITICAL CONTROL POINT (HACCP) SYSTEMS...
21 CFR 120.25 - Process verification for certain processors.
Code of Federal Regulations, 2012 CFR
2012-04-01
... 21 Food and Drugs 2 2012-04-01 2012-04-01 false Process verification for certain processors. 120.25 Section 120.25 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) FOOD FOR HUMAN CONSUMPTION HAZARD ANALYSIS AND CRITICAL CONTROL POINT (HACCP) SYSTEMS...
21 CFR 120.25 - Process verification for certain processors.
Code of Federal Regulations, 2014 CFR
2014-04-01
... 21 Food and Drugs 2 2014-04-01 2014-04-01 false Process verification for certain processors. 120.25 Section 120.25 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) FOOD FOR HUMAN CONSUMPTION HAZARD ANALYSIS AND CRITICAL CONTROL POINT (HACCP) SYSTEMS...
Arvanitoyannis, Ioannis S; Traikou, Athina
2005-01-01
The production of flour and semolina and their ensuing products, such as bread, cake, spaghetti, noodles, and corn flakes, is of major importance, because these products constitute some of the main ingredients of the human diet. The Hazard Analysis Critical Control Point (HACCP) system aims at ensuring the safety of these products. HACCP has been implemented within the frame of this study on various products of both Asian and European origin; the hazards, critical control limits (CCLs), observation practices, and corrective actions have been summarized in comprehensive tables. Furthermore, the various production steps, packaging included, were thoroughly analyzed, and reference was made to both the traditional and new methodologies in an attempt to pinpoint the occurring differences (advantages and disadvantages) per process.
Hazard function analysis for flood planning under nonstationarity
NASA Astrophysics Data System (ADS)
Read, Laura K.; Vogel, Richard M.
2016-05-01
The field of hazard function analysis (HFA) involves a probabilistic assessment of the "time to failure" or "return period," T, of an event of interest. HFA is used in epidemiology, manufacturing, medicine, actuarial statistics, reliability engineering, economics, and elsewhere. For a stationary process, the probability distribution function (pdf) of the return period always follows an exponential distribution; the same is not true for nonstationary processes. When the process of interest, X, exhibits nonstationary behavior, HFA can provide a complementary approach to risk analysis with analytical tools particularly useful for hydrological applications. After a general introduction to HFA, we describe a new mathematical linkage between the magnitude of the flood event, X, and its return period, T, for nonstationary processes. We derive the probabilistic properties of T for a nonstationary one-parameter exponential model of X, and then use both Monte Carlo simulation and HFA to generalize the behavior of T when X arises from a nonstationary two-parameter lognormal distribution. For this case, our findings suggest that a two-parameter Weibull distribution provides a reasonable approximation for the pdf of T. We document how HFA can provide an alternative approach to characterize the probabilistic properties of both nonstationary flood series and the resulting pdf of T.
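[Editorial illustration] A sketch of the Monte Carlo idea described above: simulate the waiting time T until a fixed design flood is first exceeded when X is lognormal with a drifting median, then fit a two-parameter Weibull distribution to the simulated waiting times. All parameter values are hypothetical, not taken from the paper.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)
    n_sims, horizon = 5000, 500

    # Design event: the 100-year flood under the time-zero distribution
    design_flood = stats.lognorm.ppf(0.99, s=0.5, scale=1.0)

    # Nonstationary lognormal: median drifts upward over time (assumed trend)
    t = np.arange(1, horizon + 1)
    scale_t = np.exp(0.002 * t)

    waits = np.full(n_sims, float(horizon))
    for i in range(n_sims):
        x = stats.lognorm.rvs(s=0.5, scale=scale_t, size=horizon, random_state=rng)
        hits = np.nonzero(x > design_flood)[0]
        if hits.size:
            waits[i] = hits[0] + 1  # first exceedance year = realization of T

    # Two-parameter Weibull approximation for the pdf of T
    shape, loc, scale = stats.weibull_min.fit(waits, floc=0)
    print(shape, scale)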
For the regulatory process, EPA is required to develop a regulatory impact analysis (RIA). This August 2010 RIA includes an economic impact analysis (EIA) and a small entity impacts analysis and documents the RIA methods and results for the 2010 rules
Arvanitoyannis, Ioannis S; Varzakas, Theodoros H
2008-05-01
The Failure Mode and Effect Analysis (FMEA) model was applied for risk assessment of salmon manufacturing. A tentative approach of FMEA application to the salmon industry was attempted in conjunction with ISO 22000. Preliminary Hazard Analysis was used to analyze and predict the occurring failure modes in a food chain system (salmon processing plant), based on the functions, characteristics, and/or interactions of the ingredients or the processes, upon which the system depends. Critical Control Points were identified and implemented in the cause and effect diagram (also known as Ishikawa, tree diagram and fishbone diagram). In this work, a comparison of ISO 22000 analysis with HACCP is carried out over salmon processing and packaging. However, the main emphasis was put on the quantification of risk assessment by determining the Risk Priority Number (RPN) per identified processing hazard. Fish receiving, casing/marking, blood removal, evisceration, filet-making, cooling/freezing, and distribution were the processes identified as the ones with the highest RPN (252, 240, 210, 210, 210, 210, and 200, respectively), and corrective actions were undertaken. After the application of corrective actions, a second calculation of RPN values was carried out, resulting in substantially lower values (below the upper acceptable limit of 130). It is noteworthy that the application of Ishikawa (Cause and Effect or Tree diagram) led to converging results, thus corroborating the validity of conclusions derived from risk assessment and FMEA. Therefore, the incorporation of FMEA analysis within the ISO 22000 system of a salmon processing industry is anticipated to prove advantageous to industrialists, state food inspectors, and consumers.
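[Editorial illustration] For readers unfamiliar with the metric, the Risk Priority Number used above is conventionally the product of severity, occurrence and detection ratings. A minimal sketch with hypothetical ratings (not the study's values), applying the 130 acceptability threshold mentioned in the abstract:

    # Illustrative severity (S), occurrence (O) and detection (D) ratings (1-10);
    # the values are invented, not taken from the study
    steps = {
        "fish receiving": (7, 6, 6),
        "casing/marking": (6, 8, 5),
        "blood removal":  (7, 6, 5),
    }
    for step, (s, o, d) in steps.items():
        rpn = s * o * d  # Risk Priority Number
        flag = "corrective action" if rpn > 130 else "acceptable"
        print(f"{step}: RPN={rpn} -> {flag}")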
Economic Impact Analyses of Interdisciplinary Multi-hazard Scenarios: ShakeOut and ARkStorm
NASA Astrophysics Data System (ADS)
Wein, A. M.; Rose, A.; Sue Wing, I.; Wei, D.
2011-12-01
U.S. Geological Survey (USGS) scientists are using an interdisciplinary strategy to develop and analyze multi-hazard scenarios to help communities enhance resilience to natural hazard disasters. Two such scenarios are the southern California ShakeOut earthquake and the California ARkStorm winter storm. Both scenarios are multi-hazard: ShakeOut ground motions trigger landslides and liquefaction, and ARkStorm involves wind, flood, landslide, and coastal hazards. A collaborative scenario process engages partners and stakeholders throughout the development and use of the scenarios. In doing so, community resilience is enhanced by educating communities about hazards and hazard interdependencies, building networks from scientists to decision makers, exercising emergency management strategies, identifying emergency management issues, and motivating solutions prior to an event. In addition, interdisciplinary scenarios stimulate research on the various steps of analysis (e.g., natural hazard processes, physical damages, societal consequences, and policy connections). In particular, USGS scientists have collaborated with economists to advance methods to estimate the economic impacts (business interruption losses) of disasters. Our economic impact analyses evolved from the economic module in the Federal Emergency Management Agency's loss-estimation tool, HAZUS-MH, to a more encompassing input-output analysis for ShakeOut, and to a more sophisticated Computable General Equilibrium model for ARkStorm. The analyses depend on physical damage and restoration time estimates from engineers and geographic analyses of economic assets in hazard zones. Economic resilience strategies are incorporated to represent the resourcefulness and ingenuity that avoid potential losses during and after an event. Such strategies operate at three levels of the economy: micro (e.g., ability to catch up on lost production time), meso (e.g., coordination within a sector to share resources), and macro (e.g., price adjustments to redistribute scarce resources). A sensitivity analysis of the ARkStorm economic impact model explores the effects of 1) the magnitude of the shocks (e.g., flood damages to buildings and infrastructure, agricultural productivity, and lifeline service disruptions), 2) the sustainability of the economic resilience strategies, and 3) the amount, timing, and source of reconstruction funds. The inclusion of an economic analysis in ShakeOut and ARkStorm broadens the range of interest in the scenario results. For example, the relative contribution of ShakeOut economic shocks to business interruption losses emphasized the need to reduce the impacts of fire following earthquake and of water service disruption. Based on the magnitude and duration of the economic impacts for the ARkStorm scenario, policy experts surmised that business interruption policy time elements would be exceeded and business interruptions would be largely unfunded, calling attention to the need for innovative funding solutions. Finally, economic impact analyses inform the question of paying now to mitigate or paying more later to recover.
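[Editorial illustration] A hedged sketch of the input-output style of business-interruption analysis mentioned above (not the HAZUS-MH or CGE implementations used in the scenarios): direct output shocks are propagated through a Leontief inverse to obtain total (direct plus ripple) losses. The sector data below are invented.

    import numpy as np

    # Hypothetical 3-sector technical coefficients matrix A (illustrative values)
    A = np.array([[0.1, 0.2, 0.0],
                  [0.3, 0.1, 0.2],
                  [0.1, 0.3, 0.1]])
    L = np.linalg.inv(np.eye(3) - A)          # Leontief inverse

    # Direct output losses from business interruption (hypothetical, in $M)
    direct_shock = np.array([50.0, 20.0, 10.0])
    total_loss = L @ direct_shock             # direct + indirect losses
    print(total_loss, total_loss.sum())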
NASA Astrophysics Data System (ADS)
Schaub, Y.; Huggel, C.; Serraino, M.; Haeberli, W.
2012-04-01
The changes in high-mountain environments are increasingly fast and complex. GIS-based models of the Swiss Alps show that numerous topographic overdeepenings are likely to appear on progressively exposed glacier beds, which are considered potential sites of future lake formation. In many cases these newly forming lakes will be situated in an over-steepened and destabilized high-mountain environment and are, therefore, prone to impact waves from landslides. The risk of glacier lake outburst floods, endangering infrastructure, residential areas and persons further downvalley, is increasing with further lake formation and glacier recession. This risk may persist for many decades, if not centuries. Future-oriented hazard assessments have to be integrative and must deal with all possible process chains. Reference studies and methodologies are still scarce, however. We present an approach to compare risks resulting from high-mountain lakes in the Swiss Alps amongst each other. Already existing lakes are thereby included in the analysis as much as future ones. The presented risk assessment approach integrates the envisaged high-mountain hazard process chain with present and future socio-economic conditions. Applying the concept of integral risk management, the hazard and damage potentials have to be analyzed. The areas that feature the topographic potential for rock/ice avalanches to reach a lake were analyzed regarding their susceptibility to slope failure, including the factors slope inclination, permafrost occurrence, glacier recession and bedrock lithology. Together with the analysis of the lakes (volume and runout path of potential outburst floods), the hazard analysis of the process chain was completed. As an example, high long-term hazard potentials in the Swiss Alps have to be expected in the area of the Great Aletsch glacier. A methodology for the assessment of the damage potential was elaborated and will be presented. In order to estimate the location of the largest damage potentials, driving forces of different spatial development scenarios will be implemented in a land allocation model for the Swiss Alps. By bringing together hazard, exposure and vulnerability analyses, a risk assessment for the entire Swiss Alps regarding lake outburst floods triggered by impacts of rock/ice avalanches can be conducted for today, the middle of the century and even beyond.
49 CFR 659.31 - Hazard management process.
Code of Federal Regulations, 2010 CFR
2010-10-01
..., operational changes, or other changes within the rail transit environment. (b) The hazard management process must, at a minimum: (1) Define the rail transit agency's approach to hazard management and the... Agency § 659.31 Hazard management process. (a) The oversight agency must require the rail transit agency...
49 CFR 659.31 - Hazard management process.
Code of Federal Regulations, 2011 CFR
2011-10-01
..., operational changes, or other changes within the rail transit environment. (b) The hazard management process must, at a minimum: (1) Define the rail transit agency's approach to hazard management and the... Agency § 659.31 Hazard management process. (a) The oversight agency must require the rail transit agency...
Hazard Analysis Guidelines for Transit Projects
DOT National Transportation Integrated Search
2000-01-01
These hazard analysis guidelines discuss safety critical systems and subsystems, types of hazard analyses, when hazard analyses should be performed, and the hazard analysis philosophy. These guidelines are published by FTA to assist the transit indus...
Investigation of hazards associated with plastic bonded starter mix manufacturing processes
NASA Technical Reports Server (NTRS)
1971-01-01
An investigation to determine the hazards potential evaluation of plastic bonded starter mix (PBSM) production processes and the application to the M18 and M7A3 grenades is reported. The investigation indicated: (1) the materials with the greatest hazards characteristics, (2) process operating stations most likely to initiate hazardous conditions, (3) the test program required to examine ignition characteristics and process hazards, and (4) the method of handling the accumulated information from testing and safety analyses.
14 CFR 437.29 - Hazard analysis.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 14 Aeronautics and Space 4 2010-01-01 2010-01-01 false Hazard analysis. 437.29 Section 437.29... Documentation § 437.29 Hazard analysis. (a) An applicant must perform a hazard analysis that complies with § 437.55(a). (b) An applicant must provide to the FAA all the results of each step of the hazard analysis...
14 CFR 437.29 - Hazard analysis.
Code of Federal Regulations, 2011 CFR
2011-01-01
... 14 Aeronautics and Space 4 2011-01-01 2011-01-01 false Hazard analysis. 437.29 Section 437.29... Documentation § 437.29 Hazard analysis. (a) An applicant must perform a hazard analysis that complies with § 437.55(a). (b) An applicant must provide to the FAA all the results of each step of the hazard analysis...
NASA Astrophysics Data System (ADS)
Patlan, E.; Velasco, A.; Konter, J. G.
2010-12-01
The San Miguel volcano lies near the city of San Miguel, El Salvador (13.43° N, 88.26° W). San Miguel, an active stratovolcano, presents a significant natural hazard for the city of San Miguel. In general, the internal state and activity of a volcano remain important components of understanding volcanic hazard. The main technology for addressing volcanic hazards and processes is the analysis of data collected from deployments of seismic sensors that record ground motion. Six UTEP seismic stations were deployed around San Miguel volcano from 2007 to 2008 to define the magma chamber and assess the seismic and volcanic hazard. We utilize these data to develop images of the earth structure beneath the volcano, to study the volcanic processes by identifying different sources, and to investigate the role of earthquakes and faults in controlling the volcanic processes. We initially locate events using automated routines and focus on analyzing local events. We then relocate each seismic event by hand-picking P-wave arrivals, and later refine these picks using waveform cross-correlation. Using a double-difference earthquake location algorithm (HypoDD), we identify a set of earthquakes that align vertically beneath the edifice of the volcano, suggesting that we have identified a magma conduit feeding the volcano. We also apply a double-difference earthquake tomography approach (tomoDD) to investigate the volcano's plumbing system. Our preliminary results show the extent of the magma chamber, which also aligns with some horizontal seismicity. Overall, this volcano is very active and presents a significant hazard to the region.
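[Editorial illustration] A minimal numpy sketch of the pick-refinement step named above: shift a preliminary P-wave pick by the lag that maximizes normalized cross-correlation against a reference template. The template, window length and synthetic traces are assumptions, not the study's data.

    import numpy as np

    def refine_pick(ref, target, pick_idx, max_lag=50):
        """Return (refined pick index, best correlation) by sliding a
        reference waveform around a preliminary pick."""
        best_lag, best_cc = 0, -np.inf
        for lag in range(-max_lag, max_lag + 1):
            start = pick_idx + lag
            if start < 0 or start + len(ref) > len(target):
                continue
            seg = target[start:start + len(ref)]
            cc = np.dot(ref, seg) / (np.linalg.norm(ref) * np.linalg.norm(seg) + 1e-12)
            if cc > best_cc:
                best_cc, best_lag = cc, lag
        return pick_idx + best_lag, best_cc

    # Synthetic check: a noisy copy of the template shifted by 7 samples
    rng = np.random.default_rng(5)
    ref = np.sin(np.linspace(0, 6 * np.pi, 100))
    target = np.concatenate([np.zeros(207), ref, np.zeros(100)])
    target += rng.normal(0, 0.05, target.size)
    print(refine_pick(ref, target, pick_idx=200, max_lag=20))  # -> (207, ~1.0)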
NASA Astrophysics Data System (ADS)
Melzner, Sandra; Mölk, Michael; Schiffer, Michael; Gasperl, Wolfgang
2015-04-01
In times of decreasing financial resources, the demand for investment in protection measures with a positive return on investment is of high importance. Hazard and risk assessments are essential tools for ensuring an economically justifiable use of money in the implementation of preventive measures. Many areas in the Eastern Alps are recurrently affected by rockfall processes which pose a significant hazard to settlements and infrastructure. Complex tectonic, lithological and geomorphological settings require considerable effort to map and collect high-quality data in order to perform a reliable hazard and risk analysis. The present work summarizes the results of a detailed hazard and risk assessment performed in a community in the Northern Calcareous Alps (Upper Austroalpine Unit). The community of Hallstatt is exposed to very steep limestone cliffs that are highly susceptible to future rock failures, in many parts of high magnitude. Analysis of the record of former events shows that since 1652 several rockfall events have damaged or destroyed houses and killed or injured people. Hallstatt, a UNESCO World Heritage Site, represents a very vulnerable settlement, the risk being elevated by high-frequency tourism with more than one million visitors per year. Discussion will focus on the applied methods to identify and map the rockfall hazard and risk, including a magnitude-frequency analysis of past events and its extrapolation into the future, as well as a vulnerability analysis for the existing infrastructure under the assumed events for the determined magnitude-frequency scenarios. Furthermore, challenges for decision making in terms of sustainable land use planning and the implementation of preventive measures will be discussed.
Kernel Smoothing Methods for Non-Poissonian Seismic Hazard Analysis
NASA Astrophysics Data System (ADS)
Woo, Gordon
2017-04-01
For almost fifty years, the mainstay of probabilistic seismic hazard analysis has been the methodology developed by Cornell, which assumes that earthquake occurrence is a Poisson process and that the spatial distribution of epicentres can be represented by a set of polygonal source zones within which seismicity is uniform. Based on Vere-Jones' use of kernel smoothing methods for earthquake forecasting, these methods were adapted in 1994 by the author for application to probabilistic seismic hazard analysis. There is no need for ambiguous boundaries of polygonal source zones, nor for the hypothesis of time independence of earthquake sequences. In Europe, there are many regions where seismotectonic zones are not well delineated and where there is a dynamic stress interaction between events, so that they cannot be described as independent. The damaging earthquakes in Central Italy in the months following the Amatrice earthquake of 24 August 2016 were not independent events. Removing foreshocks and aftershocks is not only an ill-defined task; it also has a material effect on seismic hazard computation. Because of the spatial dispersion of epicentres, and the clustering of magnitudes for the largest events in a sequence, which might all be around magnitude 6, the specific event causing the highest ground motion can vary from one site location to another. Where significant active faults have been clearly identified geologically, they should be modelled as individual seismic sources. The remaining background seismicity should be modelled as non-Poissonian using statistical kernel smoothing methods. This approach was first applied for seismic hazard analysis at a UK nuclear power plant two decades ago, and should be included within logic trees for future probabilistic seismic hazard analyses at critical installations within Europe. In this paper, various salient European applications are given.
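[Editorial illustration] The core idea above is to replace polygonal source zones with a kernel-smoothed activity-rate density over past epicentres. The sketch below is a toy version using a single fixed Gaussian bandwidth (published implementations are more elaborate, e.g. with magnitude-dependent bandwidths); the catalogue and sites are invented.

    import numpy as np

    def smoothed_rate(grid_xy, epicentres, bandwidth_km, years):
        """Annual activity-rate density (events/km^2/yr) at grid points via a
        2-D Gaussian kernel over observed epicentres; coordinates in km."""
        diff = grid_xy[:, None, :] - epicentres[None, :, :]
        d2 = (diff ** 2).sum(axis=-1)
        k = np.exp(-d2 / (2 * bandwidth_km ** 2)) / (2 * np.pi * bandwidth_km ** 2)
        return k.sum(axis=1) / years

    # Toy catalogue: 3 epicentres over 50 years, rate sampled at 2 sites
    eq = np.array([[10.0, 12.0], [11.0, 13.5], [40.0, 45.0]])
    sites = np.array([[10.5, 12.5], [30.0, 30.0]])
    print(smoothed_rate(sites, eq, bandwidth_km=5.0, years=50.0))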
Complete hazard ranking to analyze right-censored data: An ALS survival study.
Huang, Zhengnan; Zhang, Hongjiu; Boss, Jonathan; Goutman, Stephen A; Mukherjee, Bhramar; Dinov, Ivo D; Guan, Yuanfang
2017-12-01
Survival analysis represents an important outcome measure in clinical research and clinical trials; further, survival ranking may offer additional advantages in clinical trials. In this study, we developed GuanRank, a non-parametric ranking-based technique to transform patients' survival data into a linear space of hazard ranks. The transformation enables the utilization of machine learning base-learners, including Gaussian process regression, Lasso, and random forest, on survival data. The method was submitted to the DREAM Amyotrophic Lateral Sclerosis (ALS) Stratification Challenge. Ranked in first place, the model gave more accurate ranking predictions on the PRO-ACT ALS dataset than the Cox proportional hazards model. By utilizing right-censored data in its training process, the method demonstrated state-of-the-art predictive power in ALS survival ranking. Its feature selection identified multiple important factors, some of which conflict with previous studies.
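[Editorial illustration] To convey the flavor of ranking-based transformation of right-censored data, here is a toy pairwise scheme; it is explicitly not the published GuanRank algorithm, just an illustration of ordering subjects by observed risk.

    import numpy as np

    def toy_hazard_rank(time, event):
        """Score each subject by the fraction of comparable pairs in which
        it is known to fail no later than the other subject; higher score
        means higher hazard. Illustrative only, not GuanRank."""
        n = len(time)
        wins = np.zeros(n)
        comparable = np.zeros(n)
        for i in range(n):
            for j in range(n):
                if i == j:
                    continue
                if event[i] and time[i] <= time[j]:
                    # i observed to fail no later than j's observed time
                    wins[i] += 1
                    comparable[i] += 1
                elif event[j] and time[j] < time[i]:
                    # j failed first; the pair is ordered against i
                    comparable[i] += 1
        return np.divide(wins, comparable, out=np.zeros(n), where=comparable > 0)

    t = np.array([5.0, 8.0, 3.0, 10.0, 7.0])
    e = np.array([1, 0, 1, 1, 0])  # 1 = event observed, 0 = censored
    print(toy_hazard_rank(t, e))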
Lunar mission safety and rescue: Hazards analysis and safety requirements
NASA Technical Reports Server (NTRS)
1971-01-01
The results are presented of the hazards analysis which was concerned only with hazards to personnel and not with loss of equipment or property. Hazards characterization includes the definition of a hazard, the hazard levels, and the hazard groups. The analysis methodology is described in detail. The methodology was used to prepare the top level functional flow diagrams, to perform the first level hazards assessment, and to develop a list of conditions and situations requiring individual hazard studies. The 39 individual hazard study results are presented in total.
SSHAC Level 1 Probabilistic Seismic Hazard Analysis for the Idaho National Laboratory
DOE Office of Scientific and Technical Information (OSTI.GOV)
Payne, Suzette Jackson; Coppersmith, Ryan; Coppersmith, Kevin
A Probabilistic Seismic Hazard Analysis (PSHA) was completed for the Materials and Fuels Complex (MFC), Advanced Test Reactor (ATR), and Naval Reactors Facility (NRF) at the Idaho National Laboratory (INL). The PSHA followed the approaches and procedures for a Senior Seismic Hazard Analysis Committee (SSHAC) Level 1 study and included a Participatory Peer Review Panel (PPRP) to provide the confident technical basis and mean-centered estimates of the ground motions. A new risk-informed methodology for evaluating the need for an update of an existing PSHA was developed as part of the Seismic Risk Assessment (SRA) project. To develop and implement the new methodology, the SRA project elected to perform two SSHAC Level 1 PSHAs. The first was for the Fuel Manufacturing Facility (FMF), which is classified as a Seismic Design Category (SDC) 3 nuclear facility. The second was for the ATR Complex, which has facilities classified as SDC-4. The new methodology requires defensible estimates of ground motion levels (mean and full distribution of uncertainty) for its criteria and evaluation process. The INL SSHAC Level 1 PSHA demonstrates the use of the PPRP, evaluation and integration through utilization of a small team with multiple roles and responsibilities (four team members and one specialty contractor), and the feasibility of a short duration schedule (10 months). Additionally, a SSHAC Level 1 PSHA was conducted for NRF to provide guidance on the potential use of a design margin above rock hazard levels for the Spent Fuel Handling Recapitalization Project (SFHP) process facility.
21 CFR 123.6 - Hazard analysis and Hazard Analysis Critical Control Point (HACCP) plan.
Code of Federal Regulations, 2012 CFR
2012-04-01
... 21 Food and Drugs 2 2012-04-01 2012-04-01 false Hazard analysis and Hazard Analysis Critical Control Point (HACCP) plan. 123.6 Section 123.6 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF... processor shall have and implement a written HACCP plan whenever a hazard analysis reveals one or more food...
21 CFR 123.6 - Hazard analysis and Hazard Analysis Critical Control Point (HACCP) plan.
Code of Federal Regulations, 2010 CFR
2010-04-01
... 21 Food and Drugs 2 2010-04-01 2010-04-01 false Hazard analysis and Hazard Analysis Critical Control Point (HACCP) plan. 123.6 Section 123.6 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF... processor shall have and implement a written HACCP plan whenever a hazard analysis reveals one or more food...
Information Extraction for System-Software Safety Analysis: Calendar Year 2008 Year-End Report
NASA Technical Reports Server (NTRS)
Malin, Jane T.
2009-01-01
This annual report describes work to integrate a set of tools to support early model-based analysis of failures and hazards due to system-software interactions. The tools perform and assist analysts in the following tasks: 1) extract model parts from text for architecture and safety/hazard models; 2) combine the parts with library information to develop the models for visualization and analysis; 3) perform graph analysis and simulation to identify and evaluate possible paths from hazard sources to vulnerable entities and functions, in nominal and anomalous system-software configurations and scenarios; and 4) identify resulting candidate scenarios for software integration testing. There has been significant technical progress in model extraction from Orion program text sources, architecture model derivation (components and connections) and documentation of extraction sources. Models have been derived from Internal Interface Requirements Documents (IIRDs) and FMEA documents. Linguistic text processing is used to extract model parts and relationships, and the Aerospace Ontology also aids automated model development from the extracted information. Visualizations of these models assist analysts in requirements overview and in checking consistency and completeness.
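[Editorial illustration] A minimal sketch of the graph-analysis step described above, using the networkx library on an invented component graph; node names are hypothetical, not Orion architecture.

    import networkx as nx

    # Hypothetical system-software model: nodes are components/functions,
    # edges are interactions extracted from requirements text
    g = nx.DiGraph()
    g.add_edges_from([
        ("sensor_sw", "valve_controller_sw"),
        ("valve_controller_sw", "prop_valve"),
        ("prop_valve", "engine"),
        ("engine", "crew_module"),
    ])

    # Enumerate candidate propagation paths from a hazard source to a
    # vulnerable entity; each path suggests an integration-test scenario
    for path in nx.all_simple_paths(g, "sensor_sw", "crew_module"):
        print(" -> ".join(path))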
Varzakas, Theodoros H
2011-09-01
The Failure Mode and Effect Analysis (FMEA) model has been applied for the risk assessment of pastry processing. A tentative approach of FMEA application to the pastry industry was attempted in conjunction with ISO 22000. Preliminary Hazard Analysis was used to analyze and predict the occurring failure modes in a food chain system (pastry processing plant), based on the functions, characteristics, and/or interactions of the ingredients or the processes, upon which the system depends. Critical Control Points have been identified and implemented in the cause and effect diagram (also known as Ishikawa, tree diagram, and fishbone diagram). In this work a comparison of ISO 22000 analysis with HACCP is carried out over pastry processing and packaging. However, the main emphasis was put on the quantification of risk assessment by determining the Risk Priority Number (RPN) per identified processing hazard. Storage of raw materials, storage of final products at -18°C, and freezing were the processes identified as the ones with the highest RPN (225, 225, and 144, respectively), and corrective actions were undertaken. Following the application of corrective actions, a second calculation of RPN values was carried out, leading to considerably lower values (below the upper acceptable limit of 130). It is noteworthy that the application of Ishikawa (Cause and Effect or Tree diagram) led to converging results, thus corroborating the validity of the conclusions derived from risk assessment and FMEA. Therefore, the incorporation of FMEA analysis within the ISO 22000 system of a pastry processing industry is considered imperative.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-02-01
... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration Office of Hazardous Materials Safety; Notice of Delays In Processing of Special Permits Applications AGENCY: Pipeline..., Office of Hazardous Materials Special Permits and Approvals, Pipeline and Hazardous Materials Safety...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-12-16
... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration Office of Hazardous Materials Safety; Notice of Delays in Processing of Special Permits Applications AGENCY: Pipeline..., Office of Hazardous Materials Special Permits and Approvals, Pipeline and Hazardous Materials Safety...
Vargas-Leguás, H; Rodríguez Garrido, V; Lorite Cuenca, R; Pérez-Portabella, C; Redecillas Ferreiro, S; Campins Martí, M
2009-06-01
This guide for the preparation of powdered infant formulae in hospital environments is a collaborative work between several hospital services and is based on national and European regulations, international experts meetings and the recommendations of scientific societies. This guide also uses the Hazard Analysis and Critical Control Point principles proposed by Codex Alimentarius and emphasises effective verifying measures, microbiological controls of the process and the corrective actions when monitoring indicates that a critical control point is not under control. It is a dynamic guide and specifies the evaluation procedures that allow it to be constantly adapted.
NASA Astrophysics Data System (ADS)
Li, P.
2016-12-01
In this study, on the basis of 3,200 km of shallow stratigraphic sections and sidescan sonar data from the coastal area of the Yellow River Delta, we delineated and interpreted a total of seven types of typical hazardous geology, including hazardous geology in the shallow strata (buried ancient channels and strata disturbance) and hazardous geology in the seabed surface strata (pits, erosive residual bodies, sand patches, sand waves and scour channels). We selected eight parameters representing the development scale of the hazardous geology as zoning indexes: the number of hazardous geology types, pit depth, height of erosive residual bodies, length of scour channels, area of sand patches, length of sand waves, width of buried ancient channels and depth of strata disturbance. We implemented grid processing of the research area to calculate the arithmetic sum of the zoning indexes of each unit grid one by one, and then adopted the clustering analysis method to divide the near-shore waters of the Yellow River Delta into five hazardous geology areas: the serious erosion disaster area controlled by Diaokou lobe waves, the multi-disaster hazardous geology area under the combined action of the Shenxiangou lobe river-wave flow, the accumulation-type hazardous geology area controlled by the current estuary river, the single-disaster hazardous geology area in the deep water area, and the potential hazardous geology area of the Chengdao Oilfield. All four of the main factors affecting the development of hazardous geology, namely the diffusion and movement of the sediment flux of the Yellow River water entering the sea, seabed stability, bottom sediment type and distribution, and the marine hydrodynamic characteristics, show significant regional differentiation. These patterns are consistent with the above-mentioned zoning results, in which the distribution, scale and genetic mechanism of the hazardous geology are considered comprehensively. This indicates that hazardous geology zoning based on cluster analysis is a new and reasonable approach to zoning the hazardous geology of the near-shore waters of the modern Yellow River Delta.
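[Editorial illustration] A hedged sketch of grid-cell zoning by cluster analysis in the spirit of the method described above; the eight index values are synthetic, and the choice of Ward linkage and five clusters merely mirrors the five zones reported.

    import numpy as np
    from scipy.cluster.hierarchy import linkage, fcluster

    rng = np.random.default_rng(2)
    # Each row: one grid cell; columns: the eight zoning indexes
    # (synthetic stand-ins for the surveyed values)
    cells = rng.random((60, 8))

    # Standardize so no single index dominates the distance metric
    z = (cells - cells.mean(axis=0)) / cells.std(axis=0)
    labels = fcluster(linkage(z, method="ward"), t=5, criterion="maxclust")
    print(np.bincount(labels)[1:])  # cell counts per hazard-geology zone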
Antal, Borbála; Kuki, Ákos; Nagy, Lajos; Nagy, Tibor; Zsuga, Miklós; Kéki, Sándor
2016-07-01
Residues of chemicals on clothing products were examined by direct analysis in real-time (DART) mass spectrometry. Our experiments revealed the presence of more than 40 chemicals in 15 different clothing items. The identification was confirmed by DART tandem mass spectrometry (MS/MS) experiments for 14 compounds. The most commonly detected hazardous substances were nonylphenol ethoxylates (NPEs), phthalic acid esters (phthalates), amines released by azo dyes, and quinoline derivatives. DART-MS was able to detect NPEs on the skin of a person wearing a clothing item contaminated by NPE residues. An automated data acquisition and processing method was developed and tested for the recognition of NPE residues, thereby reducing the analysis time.
The Role and Quality of Software Safety in the NASA Constellation Program
NASA Technical Reports Server (NTRS)
Layman, Lucas; Basili, Victor R.; Zelkowitz, Marvin V.
2010-01-01
In this study, we examine software safety risk in the early design phase of the NASA Constellation spaceflight program. Obtaining an accurate, program-wide picture of software safety risk is difficult across multiple, independently developing systems. We leverage one source of safety information, hazard analysis, to provide NASA quality assurance managers with information regarding the ongoing state of software safety across the program. The goal of this research is two-fold: 1) to quantify the relative importance of software with respect to system safety; and 2) to quantify the level of risk presented by software in the hazard analysis. We examined 154 hazard reports created during the preliminary design phase of three major flight hardware systems within the Constellation program. To quantify the importance of software, we collected metrics based on the number of software-related causes and controls of hazardous conditions. To quantify the level of risk presented by software, we created a metric scheme to measure the specificity of these software causes. We found that 49-70% of hazardous conditions in the three systems could be caused by software, or software was involved in the prevention of the hazardous condition. We also found that 12-17% of the 2,013 hazard causes involved software, and that 23-29% of all causes had a software control. Furthermore, 10-12% of all controls were software-based. There is potential for inaccuracy in these counts, however, as software causes are not consistently scoped, and the presence of software in a cause or control is not always clear. The application of our software specificity metrics also identified risks in the hazard reporting process. In particular, we found a number of traceability risks in the hazard reports that may impede verification of software and system safety.
NASA Astrophysics Data System (ADS)
Yugsi Molina, F. X.; Oppikofer, T.; Fischer, L.; Hermanns, R. L.; Taurisano, A.
2012-04-01
Traditional techniques to assess rockfall hazard are partially based on probabilistic analysis. Stochastic methods have been used for run-out analysis of rock blocks to estimate the trajectories that a detached block will follow during its fall until it stops due to kinetic energy loss. However, the selection of rockfall source areas is usually defined either by multivariate analysis or by field observations; in either case, a physically based approach is not used for source area detection. We present an example of rockfall hazard assessment that integrates a probabilistic rockfall run-out analysis with a stochastic assessment of the rockfall source areas using kinematic stability analysis in a GIS environment. The method has been tested for a steep, more than 200 m high rock wall located in the municipality of Norddal (Møre og Romsdal county, Norway), where a large number of people are exposed to snow avalanches, rockfalls, or debris flows. The area was selected following the recently published hazard mapping plan of Norway. The cliff is formed by medium- to coarse-grained quartz-dioritic to granitic gneisses of Proterozoic age. Scree deposits, the product of recent rockfall activity, are found at the bottom of the rock wall. Large blocks can be found several tens of meters away from the cliff in Sylte, the main locality in the Norddal municipality. Structural characterization of the rock wall was done using terrestrial laser scanning (TLS) point clouds in the software Coltop3D (www.terranum.ch), and results were validated with field data. Orientation data sets from the structural characterization were analyzed separately to assess best-fit probability density functions (PDFs) for both the dip angle and dip direction of each discontinuity set. A GIS-based stochastic kinematic analysis was then carried out using the discontinuity set orientations and the friction angle as random variables. An airborne laser scanning digital elevation model (ALS-DEM) with 1 m resolution was used for the analysis. Three failure mechanisms were analyzed: planar and wedge sliding, as well as toppling. Based on this kinematic analysis, areas where failure is feasible were used as source areas for run-out analysis using Rockyfor3D v. 4.1 (www.ecorisq.org). The software calculates trajectories of single falling blocks in three dimensions using physically based algorithms developed under a stochastic approach. The ALS-DEM was down-scaled to 5 m resolution to optimize processing time. Results were compared with run-out simulations using Rockyfor3D with the whole rock wall as the source area, and with maps of deposits generated from field observations and aerial photo interpretation. The results of our implementation show a better correlation with field observations and help to produce more accurate rockfall hazard assessment maps through a better definition of the source areas; they also reduce processing time for the analysis. The findings presented in this contribution are part of an effort to produce guidelines for natural hazard mapping in Norway. The guidelines will be used in upcoming years for hazard mapping in areas where larger groups of the population are exposed to mass movements from steep slopes.
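[Editorial illustration] A minimal sketch of a cell-level stochastic kinematic test for planar sliding of the kind described above, with orientation distributions and thresholds chosen purely for illustration (the study's actual PDFs were fitted to TLS data).

    import numpy as np

    rng = np.random.default_rng(3)
    n = 10000
    # Sample discontinuity orientation and friction angle (degrees) from
    # assumed distributions standing in for the fitted PDFs
    dip = rng.normal(55, 8, n)
    dip_dir = rng.normal(120, 15, n)
    phi = rng.normal(30, 3, n)

    slope_dip, slope_aspect = 70.0, 115.0  # cell values from a DEM (assumed)

    # Markland-style test for planar sliding: the plane must daylight in
    # the slope face and dip more steeply than the friction angle
    ddiff = np.abs((dip_dir - slope_aspect + 180) % 360 - 180)
    daylights = (ddiff < 20) & (dip < slope_dip)
    feasible = daylights & (dip > phi)
    print(feasible.mean())  # probability of kinematic feasibility for this cell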
Risk analysis for biological hazards: What we need to know about invasive species
Stohlgren, T.J.; Schnase, J.L.
2006-01-01
Risk analysis for biological invasions is similar to that for other types of natural and human hazards. For example, risk analysis for chemical spills requires the evaluation of basic information on where a spill occurs; the exposure level and toxicity of the chemical agent; knowledge of the physical processes involved in its rate and direction of spread; and potential impacts to the environment, economy, and human health relative to containment costs. Unlike typical chemical spills, biological invasions can have long lag times from introduction and establishment to successful invasion, they reproduce, and they can spread rapidly by physical and biological processes. We use a risk analysis framework to suggest a general strategy for risk analysis for invasive species and invaded habitats. It requires: (1) problem formulation (scoping the problem, defining assessment endpoints); (2) analysis (information on species traits, matching species traits to suitable habitats, estimating exposure, surveys of current distribution and abundance); (3) risk characterization (understanding of data completeness, estimates of the "potential" distribution and abundance, estimates of the potential rate of spread, and probable risks, impacts, and costs); and (4) risk management (containment potential, costs, and opportunity costs; legal mandates and social considerations; and information science and technology needs).
NASA Astrophysics Data System (ADS)
Tierz, Pablo; Sandri, Laura; Ramona Stefanescu, Elena; Patra, Abani; Marzocchi, Warner; Costa, Antonio; Sulpizio, Roberto
2014-05-01
Explosive volcanoes and, especially, Pyroclastic Density Currents (PDCs) pose an enormous threat to populations living in the surroundings of volcanic areas. Difficulties in the modeling of PDCs are related (i) to the very complex and stochastic physical processes intrinsic to their occurrence, and (ii) to a lack of knowledge about how these processes actually form and evolve. This means that there are deep uncertainties (namely, of aleatory nature due to point (i), and of epistemic nature due to point (ii)) associated with the study and forecasting of PDCs. Consequently, the assessment of their hazard is better described by probabilistic approaches than by deterministic ones. What is actually done to assess probabilistic hazard from PDCs is to couple deterministic simulators with statistical techniques that can, eventually, supply probabilities and inform about the uncertainties involved. In this work, some examples of both PDC numerical simulators (Energy Cone and TITAN2D) and uncertainty quantification techniques (Monte Carlo sampling (MC), Polynomial Chaos Quadrature (PCQ) and Bayesian Linear Emulation (BLE)) are presented, and their advantages, limitations and future potential are underlined. The key point in choosing a specific method lies in the balance between its computational cost, the physical reliability of the simulator and the target of the hazard analysis (type of PDCs considered, time scale selected for the analysis, particular guidelines received from decision-making agencies, etc.). Although current numerical and statistical techniques have brought important advances in probabilistic volcanic hazard assessment for PDCs, some of them may be further applicable to more sophisticated simulators. In addition, forthcoming improvements could focus on three main multidisciplinary directions: 1) validating the simulators frequently used (through comparison with PDC deposits and other simulators), 2) decreasing simulator runtimes (whether by increasing knowledge about the physical processes or by more efficient programming and parallelization) and 3) improving uncertainty quantification techniques.
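[Editorial illustration] As a hedged illustration of coupling the simplest of the simulators named above (the energy cone) with Monte Carlo sampling: runout follows from collapse height H and energy-line slope phi via L = H / tan(phi), so sampling uncertain inputs yields an exceedance probability at a site. All values below are hypothetical.

    import numpy as np

    rng = np.random.default_rng(4)
    n = 100000
    # Energy-cone model: runout L = H / tan(phi), with H the collapse height
    # and phi the energy-line angle; both treated as uncertain inputs
    H = rng.uniform(500, 1500, n)             # m, assumed range
    phi = np.radians(rng.uniform(10, 25, n))  # degrees -> radians

    runout = H / np.tan(phi)
    site_distance = 6000.0                    # m from the vent (assumed)
    print((runout >= site_distance).mean())   # P(PDC reaches the site)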
NASA Astrophysics Data System (ADS)
Kolokolov, Yury; Monovskaya, Anna
2016-06-01
The paper continues the application of bifurcation analysis to research on local climate dynamics based on processing historically observed data on the daily average land-surface air temperature. Since the analyzed data come from instrumental measurements, we perform an experimental bifurcation analysis. In particular, we focus on the question: where is the joint between the normal dynamics of local climate systems (norms) and situations with the potential to create damage (hazards)? We illustrate that the criteria for hazards (violent and unfavorable weather factors) perhaps relate mainly to empirical considerations of human opinion, not to natural qualitative changes of climate dynamics. To build the bifurcation diagrams, we rely on an unconventional conceptual model (the HDS-model), which originates from the hysteresis regulator with double synchronization. The HDS-model is characterized by a variable structure with competition between amplitude quantization and time quantization. Intermittency between three periodic processes is then considered the typical behavior of local climate systems, instead of chaos or quasi-periodicity, in order to account for the variety of local climate dynamics. From the known regularities of HDS-model dynamics, we seek a way to decompose local behaviors into homogeneous units within time sections of homogeneous dynamics. Here, we present the first results of such a decomposition, where quasi-homogeneous sections (QHS) are determined on the basis of modified bifurcation diagrams, and the units are reconstructed within limits connected with the problem of shape defects. The proposed analysis of local climate dynamics (QHS-analysis) shows how comparatively modest temperature differences between the mentioned units at an annual scale can expand, step by step, into great temperature differences of daily variability at a centennial scale. The norms and the hazards then relate to fundamentally different viewpoints, where time sections of months and, especially, seasons distort the causal effects of natural dynamical processes. The specific circumstances under which qualitative changes of local climate dynamics are realized are summarized by the notion of a likely periodicity. That, in particular, helps explain why 30-year averaging remains the most common rule so far, while decadal averaging is beginning to substitute for that rule. We believe that the QHS-analysis can be considered the joint between the norms and the hazards from a bifurcation-analysis viewpoint, where the causal effects of local climate dynamics are projected onto the customary timescale only at the last step, and that the results could be of interest for fields connected with climate change and risk assessment.
Determination of Sulfate by Conductometric Titration: An Undergraduate Laboratory Experiment
ERIC Educational Resources Information Center
Garcia, Jennifer; Schultz, Linda D.
2016-01-01
The classic technique for sulfate analysis in an undergraduate quantitative analysis lab involves precipitation as the barium salt with barium chloride, collection of the precipitate by gravity filtration using ashless filter paper, and removal of the filter paper by charring over a Bunsen burner. The entire process is time-consuming, hazardous,…
Learning lessons from Natech accidents - the eNATECH accident database
NASA Astrophysics Data System (ADS)
Krausmann, Elisabeth; Girgin, Serkan
2016-04-01
When natural hazards impact industrial facilities that house or process hazardous materials, fires, explosions and toxic releases can occur. This type of accident is commonly referred to as Natech accident. In order to prevent the recurrence of accidents or to better mitigate their consequences, lessons-learned type studies using available accident data are usually carried out. Through post-accident analysis, conclusions can be drawn on the most common damage and failure modes and hazmat release paths, particularly vulnerable storage and process equipment, and the hazardous materials most commonly involved in these types of accidents. These analyses also lend themselves to identifying technical and organisational risk-reduction measures that require improvement or are missing. Industrial accident databases are commonly used for retrieving sets of Natech accident case histories for further analysis. These databases contain accident data from the open literature, government authorities or in-company sources. The quality of reported information is not uniform and exhibits different levels of detail and accuracy. This is due to the difficulty of finding qualified information sources, especially in situations where accident reporting by the industry or by authorities is not compulsory, e.g. when spill quantities are below the reporting threshold. Data collection has then to rely on voluntary record keeping often by non-experts. The level of detail is particularly non-uniform for Natech accident data depending on whether the consequences of the Natech event were major or minor, and whether comprehensive information was available for reporting. In addition to the reporting bias towards high-consequence events, industrial accident databases frequently lack information on the severity of the triggering natural hazard, as well as on failure modes that led to the hazmat release. This makes it difficult to reconstruct the dynamics of the accident and renders the development of equipment vulnerability models linking the natural-hazard severity to the observed damage almost impossible. As a consequence, the European Commission has set up the eNATECH database for the systematic collection of Natech accident data and near misses. The database exhibits the more sophisticated accident representation required to capture the characteristics of Natech events and is publicly accessible at http://enatech.jrc.ec.europa.eu. This presentation outlines the general lessons-learning process, introduces the eNATECH database and its specific structure, and discusses natural-hazard specific lessons learned and features common to Natech accidents triggered by different natural hazards.
Integrated Safety Analysis Teams
NASA Technical Reports Server (NTRS)
Wetherholt, Jonathan C.
2008-01-01
Today's complex systems require understanding beyond one person's capability to comprehend. Each system requires a team to divide the system into understandable subsystems, which can then be analyzed with an Integrated Hazard Analysis. The team must have both specific experience and diversity of experience; safety experience and system understanding are not always manifested in one individual. Group dynamics make the difference between success and failure, as well as the difference between a difficult task and a rewarding experience. There are examples in the news which demonstrate the need to connect the pieces of a system into a complete picture. The Columbia disaster is now a standard example: a low-consequence hazard in one part of the system, the External Tank, was a catastrophic hazard cause for a companion subsystem, the Space Shuttle Orbiter. The interaction between the hardware, the manufacturing process, the handling, and the operations contributed to the problem. Each of these had analysis performed, but who constituted the team which integrated this analysis together? This paper will explore some of the methods used for dividing up a complex system, and how one integration team has analyzed the parts. How this analysis has been documented in one particular space launch vehicle case will also be discussed.
NASA Astrophysics Data System (ADS)
Wing-Yuen Chow, Candace; Bründl, Michael; Keiler, Margreth
2017-04-01
In mountain regions, economic losses have increased significantly in the past decades due to severe hazard processes, in spite of notable investments in hazard management. Assessing the vulnerability of built structures to high-magnitude torrent events is a part of consequence analysis, in which hazard intensity is related to the degree of loss sustained. While vulnerability curves have been developed for different countries, the work presented here contributes new data from Swiss case studies that address a known gap associated with the consequences of high-magnitude events. Data for this stage of the investigation communicate the degree of loss associated with affected structures and have been provided by local authorities dealing with natural hazards (e.g. the Amt für Wald des Kantons Bern (KAWA) and cantonal insurance providers). Information used for the empirical quantification of vulnerability to torrent processes is derived from detailed post-event documentation and the loss database, and is verified with field visits. Building the initial database supports data sharing and the systematic inclusion of additional case studies as they become available. The collection of this new data is fundamental to the development of a local vulnerability curve based on observed sediment deposition heights, a proxy for hazard intensity. The result will then be compared to curves derived from Austrian and Italian datasets.
APPLICATIONS ANALYSIS REPORT: CHEMFIX TECHNOLOGIES, INC. - SOLIDIFICATION/STABILIZATION PROCESS
In support of the U.S. Environmental Protection Agency's (EPA) Superfund Innovative Technology Evaluation (SITE) Program, this report evaluates the Chemfix Technologies, Inc. (Chemfix), solidification/stabilization technology for on-site treatment of hazardous waste. The Chemfix ...
Bowie, Paul; Price, Julie; Hepworth, Neil; Dinwoodie, Mark; McKay, John
2015-11-27
To analyse a medical protection organisation's database to identify hazards related to general practice systems for ordering laboratory tests, managing test results and communicating test result outcomes to patients, and to integrate these data with other published evidence sources to inform the design of a systems-based conceptual model of related hazards. A retrospective database analysis. General practices in the UK and Ireland. 778 UK and Ireland general practices participating in a medical protection organisation's clinical risk self-assessment (CRSA) programme from January 2008 to December 2014. Proportion of practices with system risks; categorisation of identified hazards; most frequently occurring hazards; development of a conceptual model of hazards; and potential impacts on health, well-being and organisational performance. CRSA visits were undertaken to 778 UK and Ireland general practices, and a range of systems hazards was recorded across the laboratory test ordering and results management systems in 647 practices (83.2%). A total of 45 discrete hazard categories were identified, with a mean of 3.6 per practice (SD=1.94). The most frequently occurring hazard was an inadequate process for matching test requests and results received (n=350, 54.1%). Of the 1604 instances where hazards were recorded, the most frequent was at the 'postanalytical test stage' (n=702, 43.8%), followed closely by 'communication outcomes issues' (n=628, 39.1%). Based on arguably the largest data set currently available on the subject matter, our study findings shed new light on the scale and nature of hazards related to test results handling systems, which can inform future efforts to research and improve the design and reliability of these systems.
Lindley frailty model for a class of compound Poisson processes
NASA Astrophysics Data System (ADS)
Kadilar, Gamze Özel; Ata, Nihal
2013-10-01
The Lindley distribution has gained importance in survival analysis owing to its similarity to the exponential distribution and its allowance for different shapes of the hazard function. Frailty models provide an alternative to the proportional hazards model in which misspecified or omitted covariates are described by an unobservable random variable. Although the distribution of the frailty is generally assumed to be continuous, in some circumstances it is appropriate to consider discrete frailty distributions. In this paper, frailty models with a discrete compound Poisson process for Lindley-distributed failure times are introduced. Survival functions are derived, and maximum likelihood estimation procedures for the parameters are studied. The fit of the models to an earthquake data set from Turkey is then examined.
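For readers unfamiliar with the Lindley baseline, the sketch below implements its closed-form survival and hazard functions; the frailty-mixture construction of the paper is not reproduced, and the parameter value is illustrative only.

```python
import numpy as np

def lindley_survival(x, theta):
    """S(x) = (theta + 1 + theta*x) / (theta + 1) * exp(-theta*x) for Lindley(theta)."""
    return (theta + 1 + theta * x) / (theta + 1) * np.exp(-theta * x)

def lindley_hazard(x, theta):
    """h(x) = f(x)/S(x) = theta^2 * (1 + x) / (theta + 1 + theta*x);
    rises from theta^2/(theta+1) at x = 0 toward theta as x grows."""
    return theta**2 * (1 + x) / (theta + 1 + theta * x)

x = np.linspace(0.0, 10.0, 6)
print(np.round(lindley_survival(x, theta=0.8), 4))
print(np.round(lindley_hazard(x, theta=0.8), 4))
```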
Failure Modes and Effects Analysis (FMEA): A Bibliography
NASA Technical Reports Server (NTRS)
2000-01-01
Failure modes and effects analysis (FMEA) is a bottom-up analytical process that identifies process hazards, helping managers understand vulnerabilities of systems as well as assess and mitigate risk. It is one of several engineering tools and techniques available to program and project managers aimed at increasing the likelihood of safe and successful NASA programs and missions. This bibliography references 465 documents in the NASA STI Database that contain the major concepts, failure modes or failure analysis, in either the basic index or the major subject terms.
Application of Risk Assessment Tools in the Continuous Risk Management (CRM) Process
NASA Technical Reports Server (NTRS)
Ray, Paul S.
2002-01-01
Marshall Space Flight Center (MSFC) of the National Aeronautics and Space Administration (NASA) is currently implementing the Continuous Risk Management (CRM) Program developed by Carnegie Mellon University and recommended by NASA as the Risk Management (RM) implementation approach. The four most frequently used risk assessment tools at the center are Failure Modes and Effects Analysis (FMEA), Hazard Analysis (HA), Fault Tree Analysis (FTA), and Probabilistic Risk Analysis (PRA). There are some guidelines for selecting the type of risk assessment tool during the formulation phase of a project, but there is not enough guidance on how to apply these tools in the Continuous Risk Management (CRM) process. Yet the way safety and risk assessment tools are used makes a significant difference in the effectiveness of the risk management function. Decisions regarding which events are to be included in the analysis and to what level of detail the analysis should be carried make a significant difference in the effectiveness of the risk management program. The choice of risk analysis tool also depends on the phase of a project; for example, at the initial phase of a project, when not much data are available on hardware, a standard FMEA cannot be applied and a functional FMEA may be appropriate instead. This study attempted to provide some directives to alleviate the difficulty of applying FTA, PRA, and FMEA in the CRM process. Hazard Analysis was not included in the scope of the study due to the short duration of the summer research project.
Yan, Fang; Xu, Kaili; Li, Deshun; Cui, Zhikai
2017-01-01
Biomass gasification stations face many hazard factors, and it is therefore necessary to carry out hazard assessments for them. In this study, a novel hazard assessment method called extended set pair analysis (ESPA) is proposed based on set pair analysis (SPA). In SPA, calculation of the connection degree (CD) requires the classification of hazard grades and their corresponding thresholds. For hazard assessment using ESPA, a novel algorithm is worked out to calculate the CD when hazard grades and their corresponding thresholds are unknown. The CD can then be converted into a Euclidean distance (ED) by a simple and concise calculation, and the hazard of each sample is ranked based on the value of ED. In this paper, six biomass gasification stations are assessed using ESPA and general set pair analysis (GSPA), respectively. Comparison of the hazard assessment results obtained from ESPA and GSPA demonstrates the availability and validity of ESPA for hazard assessment of biomass gasification stations. The reasonability of ESPA is also justified by a sensitivity analysis of the hazard assessment results obtained by ESPA and GSPA. PMID:28938011
Jacobs, Molly M.; Malloy, Timothy F.; Tickner, Joel A.; Edwards, Sally
2015-01-01
Background: Given increasing pressures for hazardous chemical replacement, there is growing interest in alternatives assessment to avoid substituting a toxic chemical with another of equal or greater concern. Alternatives assessment is a process for identifying, comparing, and selecting safer alternatives to chemicals of concern (including those used in materials, processes, or technologies) on the basis of their hazards, performance, and economic viability. Objectives: The purposes of this substantive review of alternatives assessment frameworks are to identify consistencies and differences in methods and to outline needs for research and collaboration to advance science policy practice. Methods: This review compares methods used in six core components of these frameworks: hazard assessment, exposure characterization, life-cycle impacts, technical feasibility evaluation, economic feasibility assessment, and decision making. Alternatives assessment frameworks published from 1990 to 2014 were included. Results: Twenty frameworks were reviewed. The frameworks were consistent in terms of general process steps, but some differences were identified in the end points addressed. Methodological gaps were identified in the exposure characterization, life-cycle assessment, and decision-analysis components. Methods for addressing data gaps remain an issue. Discussion: Greater consistency in methods and evaluation metrics is needed, but with sufficient flexibility to allow the process to be adapted to different decision contexts. Conclusion: Although alternatives assessment is becoming an important science policy field, there is a need for increased cross-disciplinary collaboration to refine methodologies in support of the informed substitution and design of safer chemicals, materials, and products. Case studies can provide concrete lessons to improve alternatives assessment. Citation: Jacobs MM, Malloy TF, Tickner JA, Edwards S. 2016. Alternatives assessment frameworks: research needs for the informed substitution of hazardous chemicals. Environ Health Perspect 124:265–280; http://dx.doi.org/10.1289/ehp.1409581 PMID:26339778
14 CFR 437.55 - Hazard analysis.
Code of Federal Regulations, 2013 CFR
2013-01-01
... 14 Aeronautics and Space 4 2013-01-01 2013-01-01 false Hazard analysis. 437.55 Section 437.55... TRANSPORTATION LICENSING EXPERIMENTAL PERMITS Safety Requirements § 437.55 Hazard analysis. (a) A permittee must... safety of property resulting from each permitted flight. This hazard analysis must— (1) Identify and...
14 CFR 437.55 - Hazard analysis.
Code of Federal Regulations, 2012 CFR
2012-01-01
... 14 Aeronautics and Space 4 2012-01-01 2012-01-01 false Hazard analysis. 437.55 Section 437.55... TRANSPORTATION LICENSING EXPERIMENTAL PERMITS Safety Requirements § 437.55 Hazard analysis. (a) A permittee must... safety of property resulting from each permitted flight. This hazard analysis must— (1) Identify and...
NASA Astrophysics Data System (ADS)
Samphutthanon, R.; Tripathi, N. K.; Ninsawat, S.; Duboz, R.
2014-12-01
The main objective of this research was the development of an HFMD hazard zonation (HFMD-HZ) model by applying AHP and Fuzzy Logic AHP (FAHP) methodologies to weight spatial factors such as disease incidence, socio-economic and physical factors. The outputs of AHP and FAHP were input into a Geographic Information Systems (GIS) process for spatial analysis. Fourteen criteria were selected as important factors: disease incidence over the 10 years from 2003 to 2012, population density, road density, land use and physical features. The results showed a consistency ratio (CR) for these main criteria of 0.075427 for AHP, while the CR for the FAHP results was 0.092436; as both remained below the threshold of 0.1, the CR values were acceptable. After linking to actual geospatial data (disease incidence 2013) through spatial analysis in GIS for validation, the results of the FAHP approach were found to match more accurately than those of the AHP approach. The zones with the highest hazard of HFMD outbreaks were located in two main areas: central Muang Chiang Mai district including its suburbs, and Muang Chiang Rai district including its vicinity. The resulting hazard maps may be useful for organizing HFMD protection plans.
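The abstract does not reproduce the pairwise comparison matrices, so the sketch below demonstrates the standard AHP consistency check on a small hypothetical matrix; applied to the full 14-criteria matrix, the same computation would yield the CR values quoted above.

```python
import numpy as np

# Hypothetical 3x3 reciprocal pairwise comparison matrix on Saaty's 1-9 scale.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

n = A.shape[0]
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
lam_max = eigvals.real[k]                    # principal eigenvalue
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()                     # criterion priority vector

RI = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12}[n]  # Saaty's random index
CI = (lam_max - n) / (n - 1)                 # consistency index
CR = CI / RI                                 # consistency ratio; acceptable if < 0.1
print(np.round(weights, 3), round(CR, 4))
```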
Dynamic frailty models based on compound birth-death processes.
Putter, Hein; van Houwelingen, Hans C
2015-07-01
Frailty models are used in survival analysis to model unobserved heterogeneity. They accommodate such heterogeneity by the inclusion of a random term, the frailty, which is assumed to multiply the hazard of a subject (individual frailty) or the hazards of all subjects in a cluster (shared frailty). Typically, the frailty term is assumed to be constant over time. This is a restrictive assumption and extensions to allow for time-varying or dynamic frailties are of interest. In this paper, we extend the auto-correlated frailty models of Henderson and Shimakura and of Fiocco, Putter and van Houwelingen, developed for longitudinal count data and discrete survival data, to continuous survival data. We present a rigorous construction of the frailty processes in continuous time based on compound birth-death processes. When the frailty processes are used as mixtures in models for survival data, we derive the marginal hazards and survival functions and the marginal bivariate survival functions and cross-ratio function. We derive distributional properties of the processes, conditional on observed data, and show how to obtain the maximum likelihood estimators of the parameters of the model using a (stochastic) expectation-maximization algorithm. The methods are applied to a publicly available data set.
Assessing the seismic risk potential of South America
Jaiswal, Kishor; Petersen, Mark D.; Harmsen, Stephen; Smoczyk, Gregory M.
2016-01-01
We present here a simplified approach to quantifying regional seismic risk. The seismic risk for a given region can be expressed in terms of average annual loss (AAL), which represents the long-term average value of earthquake losses in any one year caused by the long-term seismic hazard. AAL is commonly measured in the form of earthquake shaking-induced deaths, direct economic impacts, or indirect losses caused by loss of functionality. For the South American subcontinent, the analysis makes use of readily available public data on seismicity, population exposure, and the hazard and vulnerability models for the region. The seismic hazard model was derived using available seismic catalogs, fault databases, and hazard methodologies analogous to the U.S. Geological Survey's national seismic hazard mapping process. The Prompt Assessment of Global Earthquakes for Response (PAGER) system's direct empirical vulnerability functions in terms of fatality and economic impact were used for performing exposure and risk analyses. The broad findings presented and the risk maps produced herein are preliminary, yet they offer important insights into the underlying zones of high and low seismic risk in the South American subcontinent. A more detailed analysis of risk may be warranted by engaging local experts, especially in some of the high-risk zones identified through the present investigation.
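As a hedged illustration of the AAL concept (all numbers invented, not taken from the PAGER model), expected annual loss can be obtained by integrating a hazard curve against a vulnerability function:

```python
import numpy as np

# Illustrative hazard curve: annual exceedance rate vs. shaking intensity (PGA, g).
pga  = np.array([0.05, 0.1, 0.2, 0.4, 0.8])          # intensity levels
rate = np.array([0.20, 0.08, 0.02, 0.004, 0.0005])   # annual exceedance rates

# Hypothetical vulnerability: mean loss ratio of exposed value at each level.
loss_ratio = np.array([0.00, 0.01, 0.05, 0.20, 0.60])
exposed_value = 1.0e9   # total exposed value, currency units (assumed)

# Occurrence rate of each intensity bin = difference of exceedance rates.
occ_rate = -np.diff(np.append(rate, 0.0))

# AAL = sum over bins of (occurrence rate x loss at that intensity).
aal = np.sum(occ_rate * loss_ratio * exposed_value)
print(f"Average annual loss: {aal:,.0f}")
```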
Geospatial Data Integration for Assessing Landslide Hazard on Engineered Slopes
NASA Astrophysics Data System (ADS)
Miller, P. E.; Mills, J. P.; Barr, S. L.; Birkinshaw, S. J.
2012-07-01
Road and rail networks are essential components of national infrastructures, underpinning the economy, and facilitating the mobility of goods and the human workforce. Earthwork slopes such as cuttings and embankments are primary components, and their reliability is of fundamental importance. However, instability and failure can occur, through processes such as landslides. Monitoring the condition of earthworks is a costly and continuous process for network operators, and currently, geospatial data is largely underutilised. The research presented here addresses this by combining airborne laser scanning and multispectral aerial imagery to develop a methodology for assessing landslide hazard. This is based on the extraction of key slope stability variables from the remotely sensed data. The methodology is implemented through numerical modelling, which is parameterised with the slope stability information, simulated climate conditions, and geotechnical properties. This allows determination of slope stability (expressed through the factor of safety) for a range of simulated scenarios. Regression analysis is then performed in order to develop a functional model relating slope stability to the input variables. The remotely sensed raster datasets are robustly re-sampled to two-dimensional cross-sections to facilitate meaningful interpretation of slope behaviour and mapping of landslide hazard. Results are stored in a geodatabase for spatial analysis within a GIS environment. For a test site located in England, UK, results have shown the utility of the approach in deriving practical hazard assessment information. Outcomes were compared to the network operator's hazard grading data, and show general agreement. The utility of the slope information was also assessed with respect to auto-population of slope geometry, and found to deliver significant improvements over the network operator's existing field-based approaches.
POHCS AND PICS SCREENING PROTOCOL
The report describes risk-driven analysis strategies and a tiered survey approach of analyses that should be useful for building data bases related to other waste combustion processes. NOTE: The need to characterize hazardous waste incinerator emissions for multiple organic compo...
Impacts of The Future Changes in Extreme Events on Migration in The Middle East
NASA Astrophysics Data System (ADS)
An, Nazan; Turp, M. Tufan; Ozturk, Tugba; Kurnaz, M. Levent
2016-04-01
Natural hazards are defined as extreme events that threaten people, their homes, and their neighborhoods. They damage housing, food production systems, and other infrastructure. The frequency of natural hazards such as droughts and floods can influence residential decision-making and can cause substantial residential mobility by affecting relatively large numbers of people in a region. Developing countries are more vulnerable to the impacts of natural hazards; environmental migration can therefore be associated with natural hazards especially in developing countries. Limited water resources and the demographic dynamics of the Middle East make the region one of the domains most affected by the impacts of natural hazards. In this study, we consider the relationship between migration as a demographic process and the frequency of natural hazards in the Middle East for the period 2020-2045 with respect to 1980-2005, performing projections under the IPCC RCP8.5 scenario with the regional climate model RegCM4.4 and combining them with an econometric analysis. This research has been supported by Boǧaziçi University Research Fund Grant Number 10421.
Hazard Analysis for Building 34 Vacuum Glove Box Assembly
NASA Technical Reports Server (NTRS)
Meginnis, Ian
2014-01-01
One of the characteristics of an effective safety program is the recognition and control of hazards before mishaps or failures occur. Conducting potentially hazardous tests necessitates a thorough hazard analysis in order to prevent injury to personnel, and to prevent damage to facilities and equipment. The primary purpose of this hazard analysis is to define and address the potential hazards and controls associated with the Building 34 Vacuum Glove Box Assembly, and to provide the applicable team of personnel with the documented results. It is imperative that each member of the team be familiar with the hazards and controls associated with his/her particular tasks, assignments and activities while interfacing with facility test systems, equipment and hardware. In fulfillment of the stated purposes, the goal of this hazard analysis is to identify all hazards that have the potential to harm personnel, damage the facility or its test systems or equipment, test articles, Government or personal property, or the environment. This analysis may also assess the significance and risk, when applicable, of lost test objectives when substantial monetary value is involved. The hazards, causes, controls, verifications, and risk assessment codes have been documented on the hazard analysis work sheets in Appendix A of this document. The preparation and development of this report is in accordance with JPR 1700.1, "JSC Safety and Health Handbook" and JSC 17773 Rev D "Instructions for Preparation of Hazard Analysis for JSC Ground Operations".
Introducing a change in hospital policy using FMEA methodology as a tool to reduce patient hazards.
Ofek, Fanny; Magnezi, Racheli; Kurzweil, Yaffa; Gazit, Inbal; Berkovitch, Sofia; Tal, Orna
2016-01-01
Intravenous potassium chloride (IV KCl) solutions are widely used in hospitals for the treatment of hypokalemia. As ampoules of concentrated KCl must be diluted before use, critical incidents have been associated with its preparation and administration. We have recently introduced ready-to-use diluted KCl infusion solutions to minimize the use of high-alert concentrated KCl. Since this process may be associated with considerable risks, we embraced proactive hazard analysis as a tool to implement a change in high-alert drug usage in a hospital setting. Failure mode and effect analysis (FMEA) is a systematic tool to analyze and identify risks in system operations. We used FMEA to examine the hazards associated with the implementation of the ready-to-use solutions. A multidisciplinary team analyzed the risks by identifying failure modes, conducting a hazard analysis and calculating the criticality index (CI) for each failure mode. A 1-day survey was performed as an evaluation step after a trial run period of approximately 4 months. Six major possible risks were identified; the most severe risks were prioritized and specific recommendations were formulated. Of 28 patients receiving IV KCl on the day of the survey, 22 received the ready-to-use solutions and 6 received the concentrated solutions as instructed. Only 1 patient received inappropriate ready-to-use KCl. Using the FMEA tool in our study has proven once again that by creating a gradient of severity among potentially vulnerable elements, we are able to proactively promote safer and more efficient processes in health care systems. This article presents a utilization of this method for implementing a change in hospital policy regarding the routine use of IV KCl.
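The paper does not state the team's CI formula; a common convention, assumed in the sketch below, scores each failure mode on severity, occurrence, and detectability and multiplies them (the classic risk priority number). The failure modes listed are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class FailureMode:
    description: str
    severity: int     # 1 (negligible) .. 10 (catastrophic); assumed scale
    occurrence: int   # 1 (rare) .. 10 (frequent)
    detection: int    # 1 (always detected) .. 10 (undetectable)

    @property
    def criticality(self) -> int:
        # RPN-style index; the paper's own CI definition may differ.
        return self.severity * self.occurrence * self.detection

# Hypothetical failure modes for a ready-to-use KCl rollout.
modes = [
    FailureMode("Concentrated ampoule used instead of diluted bag", 9, 3, 4),
    FailureMode("Wrong infusion rate programmed", 7, 4, 3),
    FailureMode("Ready-to-use bag unavailable on ward", 4, 5, 2),
]

# Prioritize the most critical failure modes for mitigation.
for fm in sorted(modes, key=lambda m: m.criticality, reverse=True):
    print(f"{fm.criticality:4d}  {fm.description}")
```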
NASA Astrophysics Data System (ADS)
Spence, C. M.; Brown, C.; Doss-Gollin, J.
2016-12-01
Climate model projections are commonly used for water resources management and planning under nonstationarity, but they do not reliably reproduce intense short-term precipitation and are instead more skilled at broader spatial scales. To provide a credible estimate of flood trend that reflects climate uncertainty, we present a framework that exploits the connections between synoptic-scale oceanic and atmospheric patterns and local-scale flood-producing meteorological events to develop long-term flood hazard projections. We demonstrate the method for the Iowa River, where high-flow episodes have been found to correlate with tropical moisture exports that are associated with a pressure dipole across the eastern continental United States. We characterize the relationship between flooding on the Iowa River and this pressure dipole through a nonstationary Pareto-Poisson peaks-over-threshold probability distribution estimated from the historic record. We then combine the results of a trend analysis of the dipole index in the historic record with the results of a trend analysis of the dipole index as simulated by General Circulation Models (GCMs) under climate change conditions through a Bayesian framework. The resulting nonstationary posterior distribution of the dipole index, combined with the dipole-conditioned peaks-over-threshold flood frequency model, connects local flood hazard to changes in large-scale atmospheric pressure and circulation patterns that are related to flooding in a process-driven framework. The Iowa River example demonstrates that the resulting nonstationary, probabilistic flood hazard projection may be used to inform risk-based flood adaptation decisions.
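As a minimal stationary analogue of the peaks-over-threshold step (the paper's model is nonstationary and dipole-conditioned, which is not reproduced here), one can fit a generalized Pareto distribution to threshold exceedances and a Poisson rate to their occurrence; the data below are synthetic.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Synthetic 40-year daily discharge series standing in for the Iowa River record.
flow = rng.gamma(shape=2.0, scale=300.0, size=40 * 365)

threshold = np.quantile(flow, 0.99)           # peaks-over-threshold level
exceed = flow[flow > threshold] - threshold   # exceedance magnitudes
# (declustering of dependent exceedances is omitted for brevity)

lam = exceed.size / 40.0                                   # Poisson rate, crossings/yr
shape, loc, scale = stats.genpareto.fit(exceed, floc=0.0)  # GPD fit to magnitudes

def return_level(T):
    """T-year return level: level exceeded on average once per T years."""
    return threshold + stats.genpareto.ppf(1 - 1/(lam*T), shape, loc=0, scale=scale)

print(f"rate = {lam:.1f}/yr, 100-yr level = {return_level(100):,.0f}")
```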
Uncertainty on shallow landslide hazard assessment: from field data to hazard mapping
NASA Astrophysics Data System (ADS)
Trefolini, Emanuele; Tolo, Silvia; Patelli, Eduardo; Broggi, Matteo; Disperati, Leonardo; Le Tuan, Hai
2015-04-01
Shallow landsliding that involves Hillslope Deposits (HD), the surficial soil that covers the bedrock, is an important process of erosion, transport and deposition of sediment along hillslopes. Although shallow landslides generally mobilize relatively small volumes of material, they represent the most hazardous factor in mountain regions due to their high velocity and the common absence of warning signs. Moreover, increasing urbanization and likely climate change make shallow landslides a source of widespread risk, and the interest of the scientific community in this process has therefore grown over the last three decades. One of the main aims of research projects on this topic is to perform robust shallow landslide hazard assessments for wide areas (regional assessment) in order to support sustainable spatial planning. Currently, three main methodologies may be implemented to assess regional shallow landslide hazard: expert evaluation, probabilistic (or data mining) methods, and methods based on physical models. The aim of this work is to evaluate the uncertainty of shallow landslide hazard assessment based on physical models, taking into account spatial variables such as geotechnical and hydrogeological parameters as well as hillslope morphometry. To achieve this goal, a wide dataset of geotechnical properties (shear strength, permeability, depth and unit weight) of HD was gathered by integrating field surveys with in situ and laboratory tests. This spatial database was collected from a study area of about 350 km2 including different bedrock lithotypes and geomorphological features. The uncertainty associated with each step of the hazard assessment process (e.g. field data collection, regionalization of site-specific information, and numerical modelling of hillslope stability) was carefully characterized. The most appropriate probability density function (PDF) was chosen for each numerical variable, and we assessed the propagation of uncertainty onto HD strength parameters obtained by empirical relations with geotechnical index properties. Site-specific information was regionalized at map scale by (hard and fuzzy) clustering analysis taking into account spatial variables such as geology, geomorphology and hillslope morphometric variables (longitudinal and transverse curvature, flow accumulation and slope), the latter derived from a DEM with 10 m cell size. In order to map shallow landslide hazard, Monte Carlo simulation was performed for some common physically based models available in the literature (e.g. SINMAP, SHALSTAB, TRIGRS). Furthermore, a new approach based on the use of Bayesian Networks was proposed and validated. Different models, such as intervals, convex models and fuzzy sets, were adopted for the modelling of input parameters. Finally, an accuracy assessment was carried out on the resulting maps, and the propagation of uncertainty in the input parameters into the final shallow landslide hazard estimate was evaluated. The outcomes of the analysis are compared and discussed in terms of discrepancy among map pixel values and the related estimated error. The novelty of the proposed method lies in estimating the confidence of shallow landslide hazard mapping at the regional level. This allows (i) discrimination of regions where the hazard assessment is robust from areas where more data are necessary to increase the confidence level, and (ii) assessment of the reliability of the procedure used for hazard assessment.
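None of the cited codes (SINMAP, SHALSTAB, TRIGRS) is reproduced here; the sketch below only illustrates the core Monte Carlo idea of propagating parameter PDFs through a physically based stability model, using the textbook infinite-slope factor of safety and invented distributions for a single map cell.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000  # Monte Carlo draws

# Assumed parameter distributions for one cell (illustrative values only).
c     = rng.lognormal(mean=np.log(5.0), sigma=0.4, size=n)  # cohesion, kPa
phi   = np.radians(rng.normal(32.0, 3.0, size=n))           # friction angle
gamma = rng.normal(19.0, 1.0, size=n)                       # unit weight, kN/m3
z     = rng.uniform(0.8, 2.0, size=n)                       # soil depth, m
m     = rng.uniform(0.0, 1.0, size=n)                       # saturated fraction of z
beta  = np.radians(30.0)                                    # slope angle
gw    = 9.81                                                # water unit weight, kN/m3

# Simplified infinite-slope factor of safety with pore pressure
# u = m * gw * z * cos^2(beta) on the failure plane.
fs = (c + (gamma - m * gw) * z * np.cos(beta)**2 * np.tan(phi)) \
     / (gamma * z * np.sin(beta) * np.cos(beta))

print(f"P(FS < 1) = {np.mean(fs < 1.0):.3f}")  # cell-level failure probability
```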
Qiao, Yuanhua; Keren, Nir; Mannan, M Sam
2009-08-15
Risk assessment and management of transportation of hazardous materials (HazMat) require the estimation of accident frequency. This paper presents a methodology to estimate hazardous materials transportation accident frequency by utilizing publicly available databases and expert knowledge. The estimation process addresses route-dependent and route-independent variables. Negative binomial regression is applied to an analysis of the Department of Public Safety (DPS) accident database to derive basic accident frequency as a function of route-dependent variables, while the effects of route-independent variables are modeled by fuzzy logic. The integrated methodology provides the basis for an overall transportation risk analysis, which can be used later to develop a decision support system.
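The DPS database itself is not public in this form, so the sketch below fits a negative binomial regression to synthetic route-segment accident counts using statsmodels; the variable names and coefficients are hypothetical stand-ins for the paper's route-dependent variables.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 500

# Synthetic route-dependent covariates (hypothetical).
length_km = rng.uniform(1, 50, n)        # segment length
traffic   = rng.lognormal(8, 0.5, n)     # annual truck volume
rural     = rng.integers(0, 2, n)        # rural vs. urban indicator

# Simulate overdispersed accident counts with a negative binomial model.
mu_true = np.exp(-9 + 1.0*np.log(length_km) + 0.8*np.log(traffic) - 0.3*rural)
y = rng.negative_binomial(n=2.0, p=2.0 / (2.0 + mu_true))

X = sm.add_constant(np.column_stack([np.log(length_km), np.log(traffic), rural]))
model = sm.GLM(y, X, family=sm.families.NegativeBinomial(alpha=0.5)).fit()
print(model.summary())

# Predicted accident frequency for a new 10 km rural segment (illustrative).
x_new = sm.add_constant(np.array([[np.log(10.0), np.log(3000.0), 1]]),
                        has_constant='add')
print(model.predict(x_new))
```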
NASA Astrophysics Data System (ADS)
Geertsema, Marten
2016-04-01
The association of landslides with thinning glaciers and mapped, or measured, mountain permafrost is increasing. Glacier thinning debuttresses slopes and promotes joint expansion. It is relatively easy to map. Permafrost, a thermal condition, is generally not visually detectible, and is difficult to map. Much mountain permafrost may have been overlooked in hazard analysis. Identifying, and characterizing mountain permafrost, and its influence on slope instability is crucial for hazard and risk analysis in mountainous terrain. Rock falls in mountains can be the initial event in process chains. They can transform into rock avalanches, debris flows or dam burst floods, travelling many kilometres, placing infrastructure and settlements at risk.
NASA Astrophysics Data System (ADS)
Hales, T. C.; Cashman, K. V.
2006-12-01
Geological hazard mitigation is a complicated process that involves both detailed scientific research and negotiations between community members with competing interests in the solution. Geological hazards classes based around traditional lecture methods have difficulty conveying the decision-making processes that go into these negotiations. To address this deficiency, we have spent five years developing and testing a role-playing exercise based on mitigation of a dam outburst hazard on Ruapehu volcano, New Zealand. In our exercise, students are asked to undertake one of five different roles and decide the best way to mitigate the hazard. Over the course of their discussion students are challenged to reach a consensus decision despite the presence of strongly opposed positions. Key to the success of the exercise are (1) the presence of a facilitator and recorder for each meeting, (2) the provision of unique information for each interested party, and (3) the division of the class into multiple meeting groups, such that everyone is required to participate and individual groups can evolve to different conclusions. The exercise can be completed in a single hour and twenty minute classroom session that is divided into four parts: an introduction, a meeting between members of the same interested party to discuss strategy, a meeting between different interested parties, and a debriefing session. This framework can be readily translated to any classroom hazard problem. In our experience, students have responded positively to the use of role-playing to supplement lectures.
NASA Astrophysics Data System (ADS)
Tutak, Magdalena; Brodny, Jarosław
2018-01-01
The hazard of endogenous fires is one of the basic and most commonly encountered occupational safety hazards in coal mines in Poland and worldwide. This hazard denotes the possibility of coal self-ignition as the result of a self-heating process in a mine heading or its surroundings. In underground coal mining, during ventilation of operating longwalls, part of the airflow migrates into the goaf with caving. Where this goaf contains coal susceptible to self-ignition, the airflow through the goaf can create favorable conditions for coal oxidation and, subsequently, for its self-heating and self-ignition. An endogenous fire formed under such conditions can pose a serious hazard to the crew and to the continuity of operation of the mining plant. From a practical point of view, it is very significant to determine the zone within the goaf with caving in which the necessary conditions for the occurrence of an endogenous fire are fulfilled. Under real conditions, determination of such a zone is practically impossible. The main aim of the analysis was therefore to determine the impact of the type of roof rocks forming the goaf on the location and extent of the zone of particular endogenous fire hazard in the goaf. For given mining and geological conditions, the critical values of airflow velocity and oxygen concentration in the goaf that condition the initiation of the coal oxidation process were determined.
The Global Tsunami Model (GTM)
NASA Astrophysics Data System (ADS)
Thio, H. K.; Løvholt, F.; Harbitz, C. B.; Polet, J.; Lorito, S.; Basili, R.; Volpe, M.; Romano, F.; Selva, J.; Piatanesi, A.; Davies, G.; Griffin, J.; Baptista, M. A.; Omira, R.; Babeyko, A. Y.; Power, W. L.; Salgado Gálvez, M.; Behrens, J.; Yalciner, A. C.; Kanoglu, U.; Pekcan, O.; Ross, S.; Parsons, T.; LeVeque, R. J.; Gonzalez, F. I.; Paris, R.; Shäfer, A.; Canals, M.; Fraser, S. A.; Wei, Y.; Weiss, R.; Zaniboni, F.; Papadopoulos, G. A.; Didenkulova, I.; Necmioglu, O.; Suppasri, A.; Lynett, P. J.; Mokhtari, M.; Sørensen, M.; von Hillebrandt-Andrade, C.; Aguirre Ayerbe, I.; Aniel-Quiroga, Í.; Guillas, S.; Macias, J.
2016-12-01
The large tsunami disasters of the last two decades have highlighted the need for a thorough understanding of the risk posed by relatively infrequent but disastrous tsunamis and the importance of a comprehensive and consistent methodology for quantifying the hazard. In the last few years, several methods for probabilistic tsunami hazard analysis have been developed and applied to different parts of the world. In an effort to coordinate and streamline these activities and make progress towards implementing the Sendai Framework of Disaster Risk Reduction (SFDRR) we have initiated a Global Tsunami Model (GTM) working group with the aim of i) enhancing our understanding of tsunami hazard and risk on a global scale and developing standards and guidelines for it, ii) providing a portfolio of validated tools for probabilistic tsunami hazard and risk assessment at a range of scales, and iii) developing a global tsunami hazard reference model. This GTM initiative has grown out of the tsunami component of the Global Assessment of Risk (GAR15), which has resulted in an initial global model of probabilistic tsunami hazard and risk. Started as an informal gathering of scientists interested in advancing tsunami hazard analysis, the GTM is currently in the process of being formalized through letters of interest from participating institutions. The initiative has now been endorsed by the United Nations International Strategy for Disaster Reduction (UNISDR) and the World Bank's Global Facility for Disaster Reduction and Recovery (GFDRR). We will provide an update on the state of the project and the overall technical framework, and discuss the technical issues that are currently being addressed, including earthquake source recurrence models, the use of aleatory variability and epistemic uncertainty, and preliminary results for a probabilistic global hazard assessment, which is an update of the model included in UNISDR GAR15.
The Global Tsunami Model (GTM)
NASA Astrophysics Data System (ADS)
Lorito, S.; Basili, R.; Harbitz, C. B.; Løvholt, F.; Polet, J.; Thio, H. K.
2017-12-01
The tsunamis that occurred worldwide in the last two decades have highlighted the need for a thorough understanding of the risk posed by relatively infrequent but often disastrous tsunamis and the importance of a comprehensive and consistent methodology for quantifying the hazard. In the last few years, several methods for probabilistic tsunami hazard analysis have been developed and applied to different parts of the world. In an effort to coordinate and streamline these activities and make progress towards implementing the Sendai Framework of Disaster Risk Reduction (SFDRR) we have initiated a Global Tsunami Model (GTM) working group with the aim of i) enhancing our understanding of tsunami hazard and risk on a global scale and developing standards and guidelines for it, ii) providing a portfolio of validated tools for probabilistic tsunami hazard and risk assessment at a range of scales, and iii) developing a global tsunami hazard reference model. This GTM initiative has grown out of the tsunami component of the Global Assessment of Risk (GAR15), which has resulted in an initial global model of probabilistic tsunami hazard and risk. Started as an informal gathering of scientists interested in advancing tsunami hazard analysis, the GTM is currently in the process of being formalized through letters of interest from participating institutions. The initiative has now been endorsed by the United Nations International Strategy for Disaster Reduction (UNISDR) and the World Bank's Global Facility for Disaster Reduction and Recovery (GFDRR). We will provide an update on the state of the project and the overall technical framework, and discuss the technical issues that are currently being addressed, including earthquake source recurrence models, the use of aleatory variability and epistemic uncertainty, and preliminary results for a probabilistic global hazard assessment, which is an update of the model included in UNISDR GAR15.
The Global Tsunami Model (GTM)
NASA Astrophysics Data System (ADS)
Løvholt, Finn
2017-04-01
The large tsunami disasters of the last two decades have highlighted the need for a thorough understanding of the risk posed by relatively infrequent but disastrous tsunamis and the importance of a comprehensive and consistent methodology for quantifying the hazard. In the last few years, several methods for probabilistic tsunami hazard analysis have been developed and applied to different parts of the world. In an effort to coordinate and streamline these activities and make progress towards implementing the Sendai Framework of Disaster Risk Reduction (SFDRR) we have initiated a Global Tsunami Model (GTM) working group with the aim of i) enhancing our understanding of tsunami hazard and risk on a global scale and developing standards and guidelines for it, ii) providing a portfolio of validated tools for probabilistic tsunami hazard and risk assessment at a range of scales, and iii) developing a global tsunami hazard reference model. This GTM initiative has grown out of the tsunami component of the Global Assessment of Risk (GAR15), which has resulted in an initial global model of probabilistic tsunami hazard and risk. Started as an informal gathering of scientists interested in advancing tsunami hazard analysis, the GTM is currently in the process of being formalized through letters of interest from participating institutions. The initiative has now been endorsed by the United Nations International Strategy for Disaster Reduction (UNISDR) and the World Bank's Global Facility for Disaster Reduction and Recovery (GFDRR). We will provide an update on the state of the project and the overall technical framework, and discuss the technical issues that are currently being addressed, including earthquake source recurrence models, the use of aleatory variability and epistemic uncertainty, and preliminary results for a probabilistic global hazard assessment, which is an update of the model included in UNISDR GAR15.
Monitoring of microbial hazards at farms, slaughterhouses, and processing lines of swine in Korea.
Rho, M J; Chung, M S; Lee, J H; Park, J
2001-09-01
This study was conducted to investigate microbiological hazards at swine farms, slaughterhouses, dressing operations, and local markets, in support of applying the hazard analysis critical control point system in Korea, by analyzing total aerobic plate counts (APCs) and the presence of pathogens. Six integrated pig farms and meat packers were selected from six different provinces, and samples were collected from pig carcasses by swabbing and excision methods at the slaughterhouses, processing rooms, and local markets. APCs of water in water tanks were relatively low, 1.9 to 3.1 log10 CFU/ml; however, they increased to 4.6 to 6.9 log10 CFU/ml when sampled from water nipples in the pigpens. APCs of feeds in the feed bins and in the pigpens were 4.4 to 5.4 and 5.2 to 6.7 log10 CFU/g, respectively. Salmonella spp., Staphylococcus aureus, and Clostridium perfringens were detected in water and feed sampled from pigpens and pigpen floors. S. aureus was the most frequently detected pathogenic bacterium in slaughterhouses and processing rooms. Listeria monocytogenes and Yersinia enterocolitica were also detected in the processing rooms of the Kyonggi, Kyongsang, and Cheju provinces. Even though APCs were maintained at the low level of 3.0 log10 CFU/g during the slaughtering and processing steps, those of the final pork products produced by the same companies were relatively high when the products were purchased from local markets. These results indicate that the cold chain system for transporting and merchandising pork products was deficient in Korea. Water supplies and feed bins in swine farms and individual operations can be identified as critical control points to reduce microbiological hazards in swine farms, slaughterhouses, and processing plants.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-11-20
... Manufacturing Practice and Hazard Analysis and Risk- Based Preventive Controls for Human Food; Extension of... Hazard Analysis and Risk- Based Preventive Controls for Human Food'' and its information collection... Manufacturing Practice and Hazard Analysis and Risk-Based Preventive Controls for Human Food.'' IV. How To...
Lidar-Based Navigation Algorithm for Safe Lunar Landing
NASA Technical Reports Server (NTRS)
Myers, David M.; Johnson, Andrew E.; Werner, Robert A.
2011-01-01
The purpose of Hazard Relative Navigation (HRN) is to provide measurements to the Navigation Filter so that it can limit errors on the position estimate after hazards have been detected. The hazards are detected by processing a hazard digital elevation map (HDEM). The HRN process takes lidar images as the spacecraft descends to the surface and matches these to the HDEM to compute relative position measurements. Since the HDEM has the hazards embedded in it, the position measurements are relative to the hazards, hence the name Hazard Relative Navigation.
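The flight matcher itself is not described here; a common approach to this kind of map matching, assumed in the sketch below, is normalized cross-correlation of a lidar-derived elevation patch against the HDEM over a small search window around the nominal position.

```python
import numpy as np

def best_offset(dem, patch, nominal_rc, search=8):
    """Slide the lidar elevation patch around its nominal DEM position and
    return the (dy, dx) offset maximizing normalized cross-correlation,
    a stand-in for the real HRN matching step."""
    ph, pw = patch.shape
    p = (patch - patch.mean()) / patch.std()
    r0, c0 = nominal_rc
    best, best_score = (0, 0), -np.inf
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            win = dem[r0 + dy : r0 + dy + ph, c0 + dx : c0 + dx + pw]
            w = (win - win.mean()) / win.std()
            score = float(np.mean(w * p))
            if score > best_score:
                best_score, best = score, (dy, dx)
    return best, best_score

rng = np.random.default_rng(3)
dem = rng.normal(0, 1, (120, 120)).cumsum(0).cumsum(1)   # smooth synthetic terrain
truth = (34, 27)                                         # actual patch location
patch = dem[truth[0]:truth[0]+32, truth[1]:truth[1]+32] + rng.normal(0, 0.05, (32, 32))
print(best_offset(dem, patch, nominal_rc=(30, 30)))      # recovers ~(4, -3)
```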
Rockfall hazard analysis using LiDAR and spatial modeling
NASA Astrophysics Data System (ADS)
Lan, Hengxing; Martin, C. Derek; Zhou, Chenghu; Lim, Chang Ho
2010-05-01
Rockfalls have been significant geohazards along the Canadian Class 1 Railways (CN Rail and CP Rail) since their construction in the late 1800s. These rockfalls cause damage to infrastructure, interruption of business, and environmental impacts, and their occurrence varies both spatially and temporally. The proactive management of these rockfall hazards requires enabling technologies. This paper discusses a hazard assessment strategy for rockfalls along a section of a Canadian railway using LiDAR and spatial modeling. LiDAR provides accurate topographical information of the source area of rockfalls and along their paths. Spatial modeling was conducted using Rockfall Analyst, a three dimensional extension to GIS, to determine the characteristics of the rockfalls in terms of travel distance, velocity and energy. Historical rockfall records were used to calibrate the physical characteristics of the rockfall processes. The results based on a high-resolution digital elevation model from a LiDAR dataset were compared with those based on a coarse digital elevation model. A comprehensive methodology for rockfall hazard assessment is proposed which takes into account the characteristics of source areas, the physical processes of rockfalls and the spatial attribution of their frequency and energy.
Landslide risk models for decision making.
Bonachea, Jaime; Remondo, Juan; de Terán, José Ramón Díaz; González-Díez, Alberto; Cendrero, Antonio
2009-11-01
This contribution presents a quantitative procedure for landslide risk analysis and zoning considering hazard, exposure (or value of elements at risk), and vulnerability. The method provides the means to obtain landslide risk models (expressing expected damage due to landslides on material elements and economic activities in monetary terms, according to different scenarios and periods) useful to identify areas where mitigation efforts will be most cost effective. It allows identifying priority areas for the implementation of actions to reduce vulnerability (elements) or hazard (processes). The procedure proposed can also be used as a preventive tool, through its application to strategic environmental impact analysis (SEIA) of land-use plans. The underlying hypothesis is that reliable predictions about hazard and risk can be made using models based on a detailed analysis of past landslide occurrences in connection with conditioning factors and data on past damage. The results show that the approach proposed and the hypothesis formulated are essentially correct, providing estimates of the order of magnitude of expected losses for a given time period. Uncertainties, strengths, and shortcomings of the procedure and results obtained are discussed and potential lines of research to improve the models are indicated. Finally, comments and suggestions are provided to generalize this type of analysis.
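As a toy illustration of the quantitative zoning idea (not the authors' calibrated model), expected annual loss per map cell can be composed from hazard, exposure, and vulnerability layers and thresholded to flag priority areas:

```python
import numpy as np

rng = np.random.default_rng(7)
shape = (4, 4)  # a tiny raster standing in for the study-area grid

hazard        = rng.uniform(0.0, 0.05, shape)   # annual landslide probability per cell
exposed_value = rng.uniform(0.0, 2.0e6, shape)  # value of elements at risk per cell
vulnerability = rng.uniform(0.1, 0.9, shape)    # expected damage fraction given occurrence

risk = hazard * exposed_value * vulnerability   # expected annual loss per cell
print(f"total expected annual loss: {risk.sum():,.0f}")

# Zoning: flag cells in the top decile of expected loss as mitigation priorities.
priority = risk >= np.quantile(risk, 0.9)
print(priority.astype(int))
```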
Predicting System Accidents with Model Analysis During Hybrid Simulation
NASA Technical Reports Server (NTRS)
Malin, Jane T.; Fleming, Land D.; Throop, David R.
2002-01-01
Standard discrete event simulation is commonly used to identify system bottlenecks and starving and blocking conditions in resources and services. The CONFIG hybrid discrete/continuous simulation tool can simulate such conditions in combination with inputs external to the simulation. This provides a means for evaluating the vulnerability to system accidents of a system's design, operating procedures, and control software. System accidents are brought about by complex unexpected interactions among multiple system failures, faulty or misleading sensor data, and inappropriate responses of human operators or software. The flows of resource and product materials play a central role in the hazardous situations that may arise in fluid transport and processing systems. We describe the capabilities of CONFIG for simulation-time linear circuit analysis of fluid flows in the context of model-based hazard analysis. We focus on how CONFIG simulates the static stresses in systems of flow. Unlike other flow-related properties, static stresses (or static potentials) cannot be represented by a set of state equations. The distribution of static stresses is dependent on the specific history of operations performed on a system. We discuss the use of this type of information in hazard analysis of system designs.
Multi-hazard risk analysis related to hurricanes
NASA Astrophysics Data System (ADS)
Lin, Ning
Hurricanes present major hazards to the United States. Associated with extreme winds, heavy rainfall, and storm surge, landfalling hurricanes often cause enormous structural damage to coastal regions. Hurricane damage risk assessment provides the basis for loss mitigation and related policy-making. Current hurricane risk models, however, often oversimplify the complex processes of hurricane damage. This dissertation aims to improve existing hurricane risk assessment methodology by coherently modeling the spatial-temporal processes of storm landfall, hazards, and damage. Numerical modeling technologies are used to investigate the multiplicity of hazards associated with landfalling hurricanes. The application and effectiveness of current weather forecasting technologies to predict hurricane hazards is investigated. In particular, the Weather Research and Forecasting model (WRF), with Geophysical Fluid Dynamics Laboratory (GFDL)'s hurricane initialization scheme, is applied to the simulation of the wind and rainfall environment during hurricane landfall. The WRF model is further coupled with the Advanced Circulation (AD-CIRC) model to simulate storm surge in coastal regions. A case study examines the multiple hazards associated with Hurricane Isabel (2003). Also, a risk assessment methodology is developed to estimate the probability distribution of hurricane storm surge heights along the coast, particularly for data-scarce regions, such as New York City. This methodology makes use of relatively simple models, specifically a statistical/deterministic hurricane model and the Sea, Lake and Overland Surges from Hurricanes (SLOSH) model, to simulate large numbers of synthetic surge events, and conducts statistical analysis. The estimation of hurricane landfall probability and hazards are combined with structural vulnerability models to estimate hurricane damage risk. Wind-induced damage mechanisms are extensively studied. An innovative windborne debris risk model is developed based on the theory of Poisson random measure, substantiated by a large amount of empirical data. An advanced vulnerability assessment methodology is then developed, by integrating this debris risk model and a component-based pressure damage model, to predict storm-specific or annual damage to coastal residential neighborhoods. The uniqueness of this vulnerability model lies in its detailed description of the interaction between wind pressure and windborne debris effects over periods of strong winds, which is a major mechanism leading to structural failures during hurricanes.
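The dissertation's debris model is not reproduced here; the sketch below only illustrates the elementary Poisson-random-measure consequence it builds on: if impacts arrive with a given intensity per unit area, the probability that a building component is struck at least once follows in closed form. The numbers are invented.

```python
import math

def p_at_least_one_impact(intensity_per_m2: float, area_m2: float) -> float:
    """Poisson model: impacts on a component are Poisson with mean
    lambda * A, so P(N >= 1) = 1 - exp(-lambda * A). Illustrative only."""
    mean_impacts = intensity_per_m2 * area_m2
    return 1.0 - math.exp(-mean_impacts)

# Hypothetical storm: 0.02 impacts/m^2 over the event, 12 m^2 of glazing.
print(f"{p_at_least_one_impact(0.02, 12.0):.3f}")  # ~0.213
```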
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bolton, P.
The purpose of this task was to support ESH-3 in providing Airborne Release Fraction and Respirable Fraction training to safety analysts at LANL who perform accident analysis, hazard analysis, safety analysis, and/or risk assessments at nuclear facilities. The task included preparation of materials for, and the conduct of, two 3-day training courses covering the following topics: the safety analysis process; the calculation model; aerosol physics concepts for safety analysis; and an overview of empirically derived airborne release fractions and respirable fractions.
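ARF/RF-based accident analysis typically feeds the standard five-factor source term equation from DOE-HDBK-3010; a minimal sketch follows, with illustrative parameter values (the bounding ARF and RF shown are the handbook's values for a free-fall spill of powder, assumed applicable to this example).

```python
def source_term(mar, dr, arf, rf, lpf):
    """Respirable source term ST = MAR * DR * ARF * RF * LPF
    (material-at-risk, damage ratio, airborne release fraction,
    respirable fraction, leak path factor)."""
    return mar * dr * arf * rf * lpf

# Illustrative spill of 100 g of powder; ARF = 2e-3 and RF = 0.3 are the
# DOE-HDBK-3010 bounding values for a free-fall spill of powder (assumed).
st = source_term(mar=100.0, dr=1.0, arf=2e-3, rf=0.3, lpf=1.0)
print(f"respirable release: {st:.3f} g")
```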
NASA Astrophysics Data System (ADS)
Xu, Z.; Gannon, J. L.; Peek, T. A.; Lin, D.
2017-12-01
One space weather hazard is Geomagnetically Induced Currents (GICs) in electric power transmission systems, driven by the geoelectric field naturally induced during geomagnetic disturbances (GMDs). GICs are a potentially catastrophic threat to bulk power systems; for instance, the blackout in Quebec in March 1989 was triggered by a GMD during a significant magnetic storm. To monitor GMDs, an autonomous Space Hazard Monitor (SHM) system has recently been developed. The system includes magnetic field measurements from magnetometers and geoelectric field measurements from electrodes. In this presentation, we introduce the six SHM sites that have been deployed across the continental US. The data from the magnetometers are processed with the Multiple Observatory Geomagnetic Data Analysis Software (MOGDAS), and the statistical results are presented here. They reveal not only the impacts of space weather over the continental US but also the potential for improved instrumentation to provide better space weather monitoring.
NASA Technical Reports Server (NTRS)
2012-01-01
One of the characteristics of an effective safety program is the recognition and control of hazards before mishaps or failures occur. Conducting potentially hazardous tests necessitates a thorough hazard analysis in order to protect our personnel from injury and our equipment from damage. The purpose of this hazard analysis is to define and address the potential hazards and controls associated with the Z1 Suit Port Test in Chamber B located in building 32, and to provide the applicable team of personnel with the documented results. It is imperative that each member of the team be familiar with the hazards and controls associated with his/her particular tasks, assignments, and activities while interfacing with facility test systems, equipment, and hardware. The goal of this hazard analysis is to identify all hazards that have the potential to harm personnel and/or damage facility equipment, flight hardware, property, or harm the environment. This analysis may also assess the significance and risk, when applicable, of lost test objectives when substantial monetary value is involved. The hazards, causes, controls, verifications, and risk assessment codes have been documented on the hazard analysis work sheets in appendix A of this document. The preparation and development of this report is in accordance with JPR 1700.1, JSC Safety and Health Handbook.
Consolidation process for producing ceramic waste forms
Hash, Harry C.; Hash, Mark C.
2000-01-01
A process for the consolidation and containment of solid or semisolid hazardous waste, which process comprises closing an end of a circular hollow cylinder, filling the cylinder with the hazardous waste, and then cold working the cylinder to reduce its diameter while simultaneously compacting the waste. The open end of the cylinder can be sealed prior to or after the cold working process. The preferred method of cold working is to draw the sealed cylinder containing the hazardous waste through a plurality of dies to simultaneously reduce the diameter of the tube while compacting the waste. This process provides a quick continuous process for consolidating hazardous waste, including radioactive waste.
Materials, processes, and environmental engineering network
NASA Technical Reports Server (NTRS)
White, Margo M.
1993-01-01
The Materials, Processes, and Environmental Engineering Network (MPEEN) was developed as a central holding facility for materials testing information generated by the Materials and Processes Laboratory. It contains information from other NASA centers and outside agencies, and also includes the NASA Environmental Information System (NEIS) and Failure Analysis Information System (FAIS) data. Environmental replacement materials information is a newly developed focus of MPEEN. This database is the NASA Environmental Information System, NEIS, which is accessible through MPEEN. Environmental concerns are addressed regarding materials identified by the NASA Operational Environment Team, NOET, to be hazardous to the environment. An environmental replacement technology database is contained within NEIS. Environmental concerns about materials are identified by NOET, and control or replacement strategies are formed. This database also contains the usage and performance characteristics of these hazardous materials. In addition to addressing environmental concerns, MPEEN contains one of the largest materials databases in the world. Over 600 users access this network on a daily basis. There is information available on failure analysis, metals and nonmetals testing, materials properties, standard and commercial parts, foreign alloy cross-reference, Long Duration Exposure Facility (LDEF) data, and Materials and Processes Selection List data.
[Hazard function and life table: an introduction to the failure time analysis].
Matsushita, K; Inaba, H
1987-04-01
Failure time analysis has become popular in demographic studies. It can be viewed as a part of regression analysis with limited dependent variables as well as a special case of event history analysis and multistate demography. The idea of hazard function and failure time analysis, however, has not been properly introduced to nor commonly discussed by demographers in Japan. The concept of hazard function in comparison with life tables is briefly described, where the force of mortality is interchangeable with the hazard rate. The basic idea of failure time analysis is summarized for the cases of exponential distribution, normal distribution, and proportional hazard models. The multiple decrement life table is also introduced as an example of lifetime data analysis with cause-specific hazard rates.
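As a hedged numerical illustration of the life-table link described above (all values invented): the force of mortality over an age interval can be approximated by the central death rate, and a piecewise-constant hazard then yields survivorship directly.

```python
import numpy as np

# Hypothetical abridged life-table columns: deaths d_x and person-years L_x.
ages = np.array([0, 5, 10, 15, 20])                   # start of each 5-year interval
d_x  = np.array([120, 15, 10, 25, 40])                # deaths in interval
L_x  = np.array([49000, 48500, 48100, 47600, 46900])  # person-years lived

m_x = d_x / L_x   # central death rate ~ force of mortality (hazard rate)

# Under a constant hazard within each 5-year interval (exponential assumption),
# the probability of surviving the interval is exp(-m_x * 5).
p_x = np.exp(-m_x * 5)
survivorship = np.cumprod(np.concatenate(([1.0], p_x)))  # l_x normalized to 1
print(np.round(m_x, 5))
print(np.round(survivorship, 4))
```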
Federal Register 2010, 2011, 2012, 2013, 2014
2013-04-26
... Manufacturing Practice and Hazard Analysis and Risk-Based Preventive Controls for Human Food; Extension of... Manufacturing Practice and Hazard Analysis and Risk-Based Preventive Controls for Human Food'' that appeared in... Manufacturing Practice and Hazard Analysis and Risk-Based Preventive Controls for Human Food'' with a 120-day...
Volcanic hazard assessment for the Canary Islands (Spain) using extreme value theory
NASA Astrophysics Data System (ADS)
Sobradelo, R.; Martí, J.; Mendoza-Rosas, A. T.; Gómez, G.
2011-10-01
The Canary Islands are an active volcanic region, densely populated and visited by several million tourists every year. Nearly twenty eruptions have been reported in written chronicles in the last 600 yr, suggesting that the probability of a new eruption in the near future is far from zero. This shows the importance of assessing and monitoring the volcanic hazard of the region in order to reduce and manage its potential volcanic risk, and ultimately contribute to the design of appropriate preparedness plans. Hence, the probabilistic analysis of the volcanic eruption time series for the Canary Islands is an essential step for the assessment of volcanic hazard and risk in the area. Such a series describes complex processes involving different types of eruptions over different time scales. Here we propose a statistical method for calculating the probabilities of future eruptions which is most appropriate given the nature of the documented historical eruptive data. We first characterize the eruptions by their magnitudes, and then carry out a preliminary analysis of the data to establish the requirements for the statistical method. Past studies of eruptive time series used conventional statistics and treated the series as a homogeneous process. In this paper, we use a method that accounts for the time dependence of the series and includes rare or extreme events, in the form of a few large eruptions, since such data require special methods of analysis. Hence, we use a statistical method from extreme value theory. In particular, we apply a non-homogeneous Poisson process to the historical eruptive data of the Canary Islands to estimate the probability of having at least one volcanic event of a magnitude greater than one in the upcoming years. This is done in three steps: first, we analyze the historical eruptive series to assess independence and homogeneity of the process; second, we perform a Weibull analysis of the distribution of repose times between successive eruptions; third, we analyze the non-homogeneous Poisson process with a generalized Pareto distribution as the intensity function.
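A minimal peaks-over-threshold sketch of the final step, in Python with SciPy; the catalogue values, the threshold, and the simplifying assumption of a constant event rate are all illustrative placeholders, not the authors' data or code:

import numpy as np
from scipy import stats

# Illustrative placeholder catalogue, NOT the actual Canary Islands record.
years = np.array([1585, 1646, 1677, 1706, 1730, 1793, 1824, 1909, 1949, 1971])
mags  = np.array([2.0, 2.5, 2.0, 3.0, 3.5, 2.0, 2.5, 2.0, 2.5, 3.0])

u = 1.0                                             # magnitude threshold
exceed = mags[mags > u] - u                         # exceedances over threshold
lam_u = len(exceed) / (years.max() - years.min())   # events per year above u

# Fit a generalized Pareto distribution to the exceedances (location fixed at 0).
shape, loc, scale = stats.genpareto.fit(exceed, floc=0)

def prob_at_least_one(m, t):
    """P(at least one event of magnitude > m in the next t years),
    constant-rate sketch of the Poisson/GPD calculation."""
    lam_m = lam_u * stats.genpareto.sf(m - u, shape, loc=0, scale=scale)
    return 1.0 - np.exp(-lam_m * t)

print(prob_at_least_one(m=3.0, t=20))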
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
This volume contains appendices for the following: Rocky Flats Plant and Idaho National Engineering Laboratory waste process information; TRUPACT-II content codes (TRUCON); TRUPACT-II chemical list; chemical compatibility analysis for Rocky Flats Plant waste forms; chemical compatibility analysis for waste forms across all sites; TRU mixed waste characterization database; hazardous constituents of Rocky Flats Transuranic waste; summary of waste components in TRU waste sampling program at INEL; TRU waste sampling program; and waste analysis data.
Gregg, H.R.; Meltzer, M.P.
1996-05-28
The portable Contamination Analysis Unit (CAU) measures trace quantities of surface contamination in real time. The detector head of the portable contamination analysis unit has an opening with an O-ring seal, one or more vacuum valves and a small mass spectrometer. With the valve closed, the mass spectrometer is evacuated with one or more pumps. The O-ring seal is placed against a surface to be tested and the vacuum valve is opened. Data is collected from the mass spectrometer and a portable computer provides contamination analysis. The CAU can be used to decontaminate and decommission hazardous and radioactive surfaces by measuring residual hazardous surface contamination, such as tritium and trace organics. It provides surface contamination data for research and development applications as well as real-time process control feedback for industrial cleaning operations and can be used to determine the readiness of a surface to accept bonding or coatings. 1 fig.
Gregg, Hugh R.; Meltzer, Michael P.
1996-01-01
The portable Contamination Analysis Unit (CAU) measures trace quantities of surface contamination in real time. The detector head of the portable contamination analysis unit has an opening with an O-ring seal, one or more vacuum valves and a small mass spectrometer. With the valve closed, the mass spectrometer is evacuated with one or more pumps. The O-ring seal is placed against a surface to be tested and the vacuum valve is opened. Data is collected from the mass spectrometer and a portable computer provides contamination analysis. The CAU can be used to decontaminate and decommission hazardous and radioactive surfaces by measuring residual hazardous surface contamination, such as tritium and trace organics. It provides surface contamination data for research and development applications as well as real-time process control feedback for industrial cleaning operations and can be used to determine the readiness of a surface to accept bonding or coatings.
Performance Analysis: Work Control Events Identified January - August 2010
DOE Office of Scientific and Technical Information (OSTI.GOV)
De Grange, C E; Freeman, J W; Kerr, C E
2011-01-14
This performance analysis evaluated 24 events that occurred at LLNL from January through August 2010. The analysis identified areas of potential work control process and/or implementation weaknesses and several common underlying causes. Human performance improvement and safety culture factors were analyzed as part of the causal analysis of each event. The collective significance of all events in 2010, as measured by the occurrence reporting significance category and by the proportion of events reported to the DOE ORPS under the ''management concerns'' reporting criteria, does not appear to have increased in 2010. The frequency of reporting in each of the significance categories has not changed in 2010 compared to the previous four years. There is no change indicating a trend in the significance category and there has been no increase in the proportion of occurrences reported in the higher significance category. Also, the frequency of events, 42 events reported through August 2010, is not greater than in previous years and is below the average of 63 occurrences per year at LLNL since 2006. Over the previous four years, an average of 43% of LLNL's reported occurrences have been reported as either ''management concerns'' or ''near misses.'' In 2010, 29% of the occurrences have been reported as ''management concerns'' or ''near misses.'' This rate indicates that LLNL is now reporting fewer ''management concern'' and ''near miss'' occurrences compared to the previous four years. From 2008 to the present, LLNL senior management has undertaken a series of initiatives to strengthen the work planning and control system with the primary objective of improving worker safety. In 2008, the LLNL Deputy Director established the Work Control Integrated Project Team to develop the core requirements and graded elements of an institutional work planning and control system. By the end of that year this system was documented and implementation had begun. In 2009, training of the workforce began and, as of the time of this report, more than 50% of authorized Integration Work Sheets (IWS) use the activity-based planning process. In 2010, LSO independently reviewed the work planning and control process and confirmed to the Laboratory that the Integrated Safety Management (ISM) System was implemented. LLNL conducted a cross-directorate management self-assessment of work planning and control and is developing actions to respond to the issues identified. Ongoing efforts to strengthen the work planning and control process and to improve the quality of LLNL work packages are in progress: completion of remaining actions in response to the 2009 DOE Office of Health, Safety, and Security (HSS) evaluation of LLNL's ISM System; scheduling more than 14 work planning and control self-assessments in FY11; continuing to align subcontractor work control with the institutional work planning and control system; and continuing to maintain the electronic IWS application. The 24 events included in this analysis were caused by errors in the first four of the five ISMS functions. The most frequent cause was errors in analyzing the hazards (Function 2). The second most frequent cause was errors occurring when defining the work (Function 1), followed by errors during the performance of work (Function 4). Interestingly, very few errors in developing controls (Function 3) resulted in events.
This leads one to conclude that if improvements are made to defining the scope of work and analyzing the potential hazards, LLNL may reduce the frequency or severity of events. Analysis of the 24 events resulted in the identification of ten common causes. Some events had multiple causes, so 39 causes were identified across the 24 events. The most frequent cause was workers, supervisors, or experts believing they understood the work and the hazards when their understanding was incomplete. The second most frequent cause was unclear, incomplete or confusing documents directing the work. Together, these two causes were mentioned 17 times and contributed to 13 of the events. All of the events with the cause of ''workers, supervisors, or experts believing they understood the work and the hazards but their understanding was incomplete'' had this error in the first two ISMS functions: define the work and analyze the hazards. This means that these causes result in the scope of work being ill-defined or the hazard(s) improperly analyzed. Incomplete implementation of these functional steps leads to the hazards not being controlled. The causes are then manifested in events when the work is conducted. The process of operating safely relies on accurately defining the scope of work. This review has identified a number of examples of latent organizational weakness in the execution of work control processes.
Quantitative risk analysis of oil storage facilities in seismic areas.
Fabbrocino, Giovanni; Iervolino, Iunio; Orlando, Francesca; Salzano, Ernesto
2005-08-31
Quantitative risk analysis (QRA) of industrial facilities has to take into account multiple hazards threatening critical equipment. Nevertheless, engineering procedures able to evaluate quantitatively the effect of seismic action are not well established. Indeed, relevant industrial accidents may be triggered by loss of containment following ground shaking or other relevant natural hazards, either directly or through cascade effects ('domino effects'). The issue of integrating structural seismic risk into quantitative probabilistic seismic risk analysis (QpsRA) is addressed in this paper by a representative case study of an oil storage plant with a number of atmospheric steel tanks containing flammable substances. Empirical seismic fragility curves and probit functions, properly defined for both building-like and non-building-like industrial components, have been crossed with outcomes of probabilistic seismic hazard analysis (PSHA) for a test site located in south Italy. Once the seismic failure probabilities had been quantified, consequence analysis was performed for those events which may be triggered by the loss of containment following seismic action. Results are combined by means of a specifically developed code in terms of local risk contour plots, i.e. the contour line for the probability of fatal injuries at any point (x, y) in the analysed area. Finally, a comparison with QRA obtained by considering only process-related top events is reported for reference.
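The step of "crossing" fragility curves with PSHA output is conventionally the hazard-fragility convolution; a schematic form (standard practice, not quoted from the paper) is

\[
P_f \;=\; \int_0^{\infty} F(a)\,\left|\frac{d\lambda(a)}{da}\right| da,
\]

where \(\lambda(a)\) is the annual rate at which ground-motion intensity \(a\) is exceeded at the site (the PSHA hazard curve) and \(F(a)\) is the fragility, i.e. the probability of loss of containment given shaking of intensity \(a\); \(P_f\) is then the annual failure rate that feeds the consequence analysis.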
A generic multi-hazard and multi-risk framework and its application illustrated in a virtual city
NASA Astrophysics Data System (ADS)
Mignan, Arnaud; Euchner, Fabian; Wiemer, Stefan
2013-04-01
We present a generic framework to implement hazard correlations in multi-risk assessment strategies. We consider hazard interactions (process I), time-dependent vulnerability (process II) and time-dependent exposure (process III). Our approach is based on the Monte Carlo method to simulate a complex system, which is defined from assets exposed to a hazardous region. We generate 1-year time series, sampling from a stochastic set of events. Each time series corresponds to one risk scenario, and the analysis of multiple time series allows for the probabilistic assessment of losses and for the recognition of more or less probable risk paths. Each sampled event is associated with a time of occurrence, a damage footprint and a loss footprint. The occurrence of an event depends on its rate, which is conditional on the occurrence of past events (process I, concept of correlation matrix). Damage depends on the hazard intensity and on the vulnerability of the asset, which is conditional on previous damage to that asset (process II). Losses are the product of damage and exposure value, this value being the original exposure minus previous losses (process III, no reconstruction considered). The Monte Carlo method allows for a straightforward implementation of uncertainties and of numerous interactions, which is otherwise challenging in an analytical multi-risk approach. We apply our framework to a synthetic data set, defined by a virtual city within a virtual region. This approach gives the opportunity to perform multi-risk analyses in a controlled environment while not requiring real data, which may be difficult to access or simply unavailable to the public. Based on the heuristic approach, we define a 100 by 100 km region where earthquakes, volcanic eruptions, fluvial floods, hurricanes and coastal floods can occur. All hazards are harmonized to a common format. We define a 20 by 20 km city, composed of 50,000 identical buildings with a fixed economic value. Vulnerability curves are defined in terms of mean damage ratio as a function of hazard intensity. All data are based on simple equations found in the literature and on other simplifications. We show the impact of earthquake-earthquake interaction and hurricane-storm surge coupling, as well as of time-dependent vulnerability and exposure, on aggregated loss curves. One main result is the emergence of low-probability, high-consequence (extreme) events when correlations are implemented. While the concept of a virtual city can suggest the theoretical benefits of multi-risk assessment for decision support, identifying their real-world practicality will require the study of real test sites.
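A compressed Python sketch of the simulation loop described above; the hazard set, rates, interaction factors and damage model are invented for illustration and carry none of the paper's calibrated values:

import numpy as np

rng = np.random.default_rng(0)

hazards = ["quake", "flood"]
base_rate = {"quake": 0.2, "flood": 0.5}      # events per year (illustrative)
# Process I: occurrence of hazard src multiplies the rate of hazard dst.
interaction = {("quake", "flood"): 2.0}        # e.g. a quake doubles the flood rate
exposure0 = 1e9                                # total exposed value (illustrative)

def simulate_year():
    rate = dict(base_rate)
    exposure, losses, t = exposure0, 0.0, 0.0
    damage_state = 0.0                         # process II memory
    while True:
        # Next event across all hazards (competing Poisson processes).
        total = sum(rate.values())
        t += rng.exponential(1.0 / total)
        if t > 1.0:
            break
        h = rng.choice(hazards, p=[rate[k] / total for k in hazards])
        # Process II: prior damage raises the mean damage ratio.
        mdr = min(1.0, rng.beta(2, 8) * (1.0 + damage_state))
        damage_state = min(1.0, damage_state + mdr)
        # Process III: losses come out of the remaining exposure.
        loss = mdr * exposure
        exposure -= loss
        losses += loss
        for (src, dst), f in interaction.items():
            if src == h:
                rate[dst] = base_rate[dst] * f
    return losses

annual = np.array([simulate_year() for _ in range(10000)])
print("mean annual loss:", annual.mean())
print("99th percentile loss:", np.quantile(annual, 0.99))

Aggregating many such 1-year runs yields the loss exceedance curves the abstract describes; switching the interaction factors off and on reproduces, qualitatively, the emergence of the extreme tail.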
Pardo, José E; de Figueirêdo, Vinícius Reis; Alvarez-Ortí, Manuel; Zied, Diego C; Peñaranda, Jesús A; Dias, Eustáquio Souza; Pardo-Giménez, Arturo
2013-09-01
Hazard analysis and critical control points (HACCP) is a preventive system which seeks to ensure food safety. It allows product protection and correction of errors, reduces the costs derived from quality defects and lessens reliance on final product over-control. In this paper, the system is applied to the cultivation line of mushrooms and other edible cultivated fungi. Of all the stages of the process, only the reception of covering materials (stage 1) and compost (stage 3), pre-fruiting and induction (stage 6) and the harvest (stage 7) have been identified as critical control points (CCPs). The main hazards found were the presence of unauthorized phytosanitary products or doses above permitted levels (stages 6 and 7), and the presence of pathogenic bacteria (stages 1 and 3) and/or heavy metals (stage 3). The implementation of this knowledge will allow any plant dedicated to the cultivation of mushrooms or other edible fungi to self-monitor its production based on the HACCP system.
Safety Guided Design Based on Stamp/STPA for Manned Vehicle in Concept Design Phase
NASA Astrophysics Data System (ADS)
Ujiie, Ryo; Katahira, Masafumi; Miyamoto, Yuko; Umeda, Hiroki; Leveson, Nancy; Hoshino, Nobuyuki
2013-09-01
In manned vehicles such as the Soyuz and the Space Shuttle, the crew and the computer system must cooperate to return safely to Earth. While computers increase the functionality of the system, they also increase the complexity of the interaction between the controllers (human and computer) and the target dynamics. In some cases this complexity can produce a serious accident. To prevent such losses, traditional hazard analysis such as FTA has been applied to system development; however, it can only be applied once a detailed design exists, because it focuses on detailed component failures. As a result, it is more difficult to eliminate hazard causes early in the process, when doing so is most feasible. STAMP/STPA is a new hazard analysis technique that can be applied from the early development phase, with the analysis being refined as more detailed decisions are made. In essence, the analysis and design decisions are intertwined and go hand-in-hand. We have applied STAMP/STPA to a concept design of a new JAXA manned vehicle and attempted safety-guided design of the vehicle. As a result of this trial, it has been shown that STAMP/STPA can be accepted easily by system engineers, and the design has been made more sophisticated from a safety viewpoint. The result also shows that the consequences of human errors on system safety can be analysed in the early development phase and the system designed to prevent them. Finally, the paper discusses an effective way to harmonize this safety-guided design approach with the system engineering process, based on the experience gained in this project.
Preventing blood transfusion failures: FMEA, an effective assessment method.
Najafpour, Zhila; Hasoumi, Mojtaba; Behzadi, Faranak; Mohamadi, Efat; Jafary, Mohamadreza; Saeedi, Morteza
2017-06-30
Failure Mode and Effect Analysis (FMEA) is a method used to assess the risk of failures and harms to patients during the medical process and to identify the associated clinical issues. The aim of this study was to conduct an assessment of the blood transfusion process in a teaching general hospital, using FMEA as the method. A structured FMEA was employed in our study, performed in 2014, and corrective actions were implemented and re-evaluated after 6 months. Sixteen 2-h sessions were held to perform FMEA on the blood transfusion process, covering five steps: establishing the context, selecting team members, analysing the process, analysing hazards, and developing a risk-reduction protocol for blood transfusion. Failure modes with the highest risk priority numbers (RPNs) were identified. The overall RPN scores ranged from 5 to 100, and four failure modes were associated with RPNs of 75 or above. The data analysis indicated that the failures with the highest RPNs were: labelling (RPN: 100), transfusion of blood or the component (RPN: 100), patient identification (RPN: 80) and sampling (RPN: 75). The results demonstrated that mis-transfusion of blood or a blood component is the most important error, which can lead to serious morbidity or mortality. Provision of training to the personnel on blood transfusion, raising awareness of hazards and appropriate preventative measures, as well as developing standard safety guidelines are essential, and must be implemented during all steps of blood and blood component transfusion.
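The RPN arithmetic behind those rankings is simple to reproduce; in the Python sketch below the severity/occurrence/detection ratings are back-calculated illustrations chosen to match the reported RPNs, not the study team's actual scores:

# Minimal FMEA scoring sketch: RPN = severity x occurrence x detection,
# each rated 1-10. Ratings here are illustrative only.
failure_modes = [
    ("labelling",                10, 5, 2),
    ("transfusion of component", 10, 2, 5),
    ("patient identification",    8, 2, 5),
    ("sampling",                  5, 5, 3),
]

for name, severity, occurrence, detection in failure_modes:
    rpn = severity * occurrence * detection
    flag = "corrective action" if rpn >= 75 else "monitor"
    print(f"{name:26s} RPN={rpn:3d}  {flag}")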
Implementing Extreme Value Analysis in a Geospatial Workflow for Storm Surge Hazard Assessment
NASA Astrophysics Data System (ADS)
Catelli, J.; Nong, S.
2014-12-01
Gridded data of 100-yr (1%) and 500-yr (0.2%) storm surge flood elevations for the United States Gulf of Mexico and East Coast are critical to understanding this natural hazard. Storm surge heights were calculated across the study area utilizing SLOSH (Sea, Lake, and Overland Surges from Hurricanes) model data for thousands of synthetic US landfalling hurricanes. Based on the results derived from SLOSH, a series of interpolations were performed using spatial analysis in a geographic information system (GIS) at both the SLOSH basin and the synthetic event levels. The result was a single grid of maximum flood elevations for each synthetic event. This project addresses the need to utilize extreme value theory in a geospatial environment to analyze coincident cells across multiple synthetic events. The results are 100-yr (1%) and 500-yr (0.2%) values for each grid cell in the study area. This talk details a geospatial approach that moves raster data into NumPy array structures using the Python programming language. The data are then connected through a Python library to an outside statistical package such as R to fit cell values to extreme value theory distributions and return values for specified recurrence intervals. While this is not a new process, the value of this work is the ability to keep the entire workflow in a single geospatial environment and to replicate it easily for other natural hazard applications and extreme event modeling.
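The abstract routes the fitting out to R; the same per-cell computation can be sketched entirely in SciPy. The data below are synthetic, and framing the fit as a GEV on annual maxima is one defensible choice rather than necessarily the authors' exact procedure:

import numpy as np
from scipy import stats

# (n_years, ny, nx) grid of annual maximum surge elevations; synthetic
# placeholder data standing in for the SLOSH-derived event grids.
annual_max = np.random.default_rng(1).gumbel(1.5, 0.5, size=(500, 20, 20))

def return_level(maxima, T):
    """Fit a GEV to one cell's annual maxima and return the T-year level,
    i.e. the (1 - 1/T) quantile of the fitted distribution."""
    c, loc, scale = stats.genextreme.fit(maxima)
    return stats.genextreme.ppf(1.0 - 1.0 / T, c, loc=loc, scale=scale)

# 100-yr (1%) and 500-yr (0.2%) grids, cell by cell.
ny, nx = annual_max.shape[1:]
z100 = np.array([[return_level(annual_max[:, i, j], 100) for j in range(nx)]
                 for i in range(ny)])
z500 = np.array([[return_level(annual_max[:, i, j], 500) for j in range(nx)]
                 for i in range(ny)])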
NASA Astrophysics Data System (ADS)
Apel, Heiko; Martínez Trepat, Oriol; Nghia Hung, Nguyen; Thi Chinh, Do; Merz, Bruno; Viet Dung, Nguyen
2016-04-01
Many urban areas experience both fluvial and pluvial floods, because locations next to rivers are preferred settlement areas and the predominantly sealed urban surface prevents infiltration and facilitates surface inundation. The latter problem is exacerbated in cities with insufficient or non-existent sewer systems. While there are a number of approaches to analyse either fluvial or pluvial flood hazard, studies of combined fluvial and pluvial flood hazard are hardly available. Thus, this study aims to analyse fluvial and pluvial flood hazard individually, but also to develop a method for the analysis of combined pluvial and fluvial flood hazard. This combined fluvial-pluvial flood hazard analysis is performed taking Can Tho city, the largest city in the Vietnamese part of the Mekong Delta, as an example. In this tropical environment the annual monsoon-triggered floods of the Mekong River can coincide with heavy local convective precipitation events, causing both fluvial and pluvial flooding at the same time. The fluvial flood hazard was estimated with a copula-based bivariate extreme value statistic for the gauge Kratie at the upper boundary of the Mekong Delta and a large-scale hydrodynamic model of the Mekong Delta. This provided the boundaries for 2-dimensional hydrodynamic inundation simulation for Can Tho city. The pluvial hazard was estimated by a peak-over-threshold frequency estimation based on local rain gauge data and a stochastic rainstorm generator. Inundation for all flood scenarios was simulated by a 2-dimensional hydrodynamic model implemented on a Graphics Processing Unit (GPU) for time-efficient flood propagation modelling. The combined fluvial-pluvial flood scenarios were derived by adding rainstorms to the fluvial flood events during the highest fluvial water levels. The probabilities of occurrence of the combined events were determined assuming independence of the two flood types and taking the seasonality and probability of coincidence into account. All hazards - fluvial, pluvial and combined - were accompanied by an uncertainty estimation taking into account the natural variability of the flood events. This resulted in probabilistic flood hazard maps showing the maximum inundation depths for a selected set of probabilities of occurrence, with maps showing the expectation (median) and percentile maps showing the uncertainty. The results are critically discussed and their use in flood risk management is outlined.
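Under the paper's independence assumption, the annual probability of the combined scenario can be written schematically (the symbols here are shorthand for exposition, not the authors' notation):

\[
p_{c} \;=\; p_{f}\, \cdot\, p_{p \mid \text{flood season}},
\]

i.e. the probability of the fluvial event times the conditional probability that a rainstorm of the given return period coincides with the high-water window. The seasonality term is what keeps \(p_c\) from being the naive product of two unconditional annual probabilities.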
Petersen, Mark D.; Mueller, Charles S.; Moschetti, Morgan P.; Hoover, Susan M.; Rubinstein, Justin L.; Llenos, Andrea L.; Michael, Andrew J.; Ellsworth, William L.; McGarr, Arthur F.; Holland, Austin A.; Anderson, John G.
2015-01-01
The U.S. Geological Survey National Seismic Hazard Model for the conterminous United States was updated in 2014 to account for new methods, input models, and data necessary for assessing the seismic ground shaking hazard from natural (tectonic) earthquakes. The U.S. Geological Survey National Seismic Hazard Model project uses probabilistic seismic hazard analysis to quantify the rate of exceedance for earthquake ground shaking (ground motion). For the 2014 National Seismic Hazard Model assessment, the seismic hazard from potentially induced earthquakes was intentionally not considered because we had not determined how to properly treat these earthquakes in the seismic hazard analysis. The phrases “potentially induced” and “induced” are used interchangeably in this report; however, it is acknowledged that this classification is based on circumstantial evidence and scientific judgment. For the 2014 National Seismic Hazard Model update, the potentially induced earthquakes were removed from the NSHM’s earthquake catalog, and the documentation states that we would consider alternative models for including induced seismicity in a future version of the National Seismic Hazard Model. As part of the process of incorporating induced seismicity into the seismic hazard model, we evaluate the sensitivity of the seismic hazard from induced seismicity to five parts of the hazard model: (1) the earthquake catalog, (2) earthquake rates, (3) earthquake locations, (4) earthquake Mmax (maximum magnitude), and (5) earthquake ground motions. We describe alternative input models for each of the five parts that represent differences in scientific opinion on induced seismicity characteristics. In this report, however, we do not weight these input models to arrive at a preferred final model. Instead, we present a sensitivity study showing uniform seismic hazard maps obtained by applying the alternative input models for induced seismicity. The final model will be released after further consideration of the reliability and scientific acceptability of each alternative input model. Forecasting the seismic hazard from induced earthquakes is fundamentally different from forecasting the seismic hazard from natural, tectonic earthquakes, because the spatio-temporal patterns of induced earthquakes depend on economic forces and public policy decisions regarding extraction and injection of fluids. As such, the rates of induced earthquakes are inherently variable and nonstationary. Therefore, we only make maps based on an annual rate of exceedance rather than the 50-year rates calculated for previous U.S. Geological Survey hazard maps.
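The report's choice of annual rather than 50-year maps is just a change of exposure time in the standard Poisson conversion between exceedance rate and exceedance probability (stated here for context; it is the textbook relationship, not a result of this report):

\[
P(\text{at least one exceedance in } t \text{ years}) \;=\; 1 - e^{-\lambda t},
\]

with \(\lambda\) the annual rate of exceedance of the chosen ground-motion level; setting \(t = 1\) gives the annual-rate maps, while conventional hazard maps use \(t = 50\).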
Yellow River Icicle Hazard Dynamic Monitoring Using UAV Aerial Remote Sensing Technology
NASA Astrophysics Data System (ADS)
Wang, H. B.; Wang, G. H.; Tang, X. M.; Li, C. H.
2014-02-01
Monitoring the response of the Yellow River icicle hazard to change requires accurate and repeatable topographic surveys. A new method based on unmanned aerial vehicle (UAV) aerial remote sensing technology is proposed for real-time data processing in Yellow River icicle hazard dynamic monitoring. The monitoring area is located in the Yellow River ice intensive care area in southern BaoTou, Inner Mongolia autonomous region. The monitoring period ran from 20 February to 30 March 2013. Using the proposed video data processing method, automatic extraction of 1832 video key frames covering an area of 7.8 km2 took 34.786 seconds. The stitching and correction time was 122.34 seconds and the accuracy was better than 0.5 m. Through comparison of the precisely processed stitched video imagery, the method determines changes in the Yellow River ice and accurately positions the ice bar, improving on the traditional visual method by more than 100 times. The results provide accurate decision-aid information for the Yellow River ice prevention headquarters. Finally, the effect of the dam break was repeatedly monitored, and an ice-break accuracy of five meters was achieved through monitoring and evaluation analysis.
77 FR 55371 - System Safety Program
Federal Register 2010, 2011, 2012, 2013, 2014
2012-09-07
...-based rule and FRA seeks comments on all aspects of the proposed rule. An SSP would be implemented by a... SSP would be the risk-based hazard management program and risk-based hazard analysis. A properly implemented risk-based hazard management program and risk-based hazard analysis would identify the hazards and...
Natural hazard fatalities in Switzerland from 1946 to 2015
NASA Astrophysics Data System (ADS)
Andres, Norina; Badoux, Alexandre; Techel, Frank
2017-04-01
Switzerland, located in the middle of the Alps, is prone to several different natural hazards which regularly cause fatalities. To explore temporal trends as well as demographic and spatial patterns in the number of natural hazard fatalities, a database comprising all natural hazard events causing fatalities was compiled for the years 1946 to 2015. The new database includes avalanche, flood, lightning, windstorm, landslide, debris flow, rockfall, earthquake and ice avalanche processes. Two existing databases were incorporated and the resulting dataset was extended by a comprehensive newspaper search. In total the database contains 635 natural hazard events causing 1023 fatalities. The database does not include victims who deliberately exposed themselves to significant danger (e.g. in high-risk sports). The most common causes of death were snow avalanches (37 %), followed by lightning (16 %), floods (12 %), windstorms (10 %), rockfall (8 %), landslides (7 %) and other processes (9 %). On average, around 14.6 fatalities occurred each year. A distinct decrease in natural hazard fatalities is evident over the last 70 years, mostly due to the decline in the number of avalanche and lightning fatalities. Thus, nearly three times as many people were killed by natural hazard processes from 1946 to 1980 as from 1981 to 2015. Normalisation of the fatality data by population results in a clearly declining annual crude mortality rate: 3.9 deaths per million persons for the first 35 years and 1.1 deaths per million persons for the second 35 years of the study period. The average age of the victims was approximately 36 years and about 75 % were males. Most people were killed in summer (JJA, 42 %) and winter (DJF, 32 %). Furthermore, almost two-thirds of the fatalities took place in the afternoon and evening. The spatial distribution of the natural hazard fatalities over Switzerland was quite homogeneous, although mountainous parts of the country (Prealps, Alps) were somewhat more prone to fatal events than the Swiss Plateau and the Jura. Overall, the natural hazard mortality rate in Switzerland over the past 70 years has been relatively low in comparison to rates in other countries or to rates of other types of fatal accidents in Switzerland. Nevertheless, the collected data provide a valuable basis for analysis and help authorities to better identify higher-risk demographic groups and regions, and accordingly target these to reduce the number of victims.
1997-01-01
When NASA started planning for manned space travel in 1959, the myriad challenges of sustaining life in space included a seemingly mundane but vitally important problem: How and what do you feed an astronaut? There were two main concerns: preventing food crumbs from contaminating the spacecraft's atmosphere or floating into sensitive instruments, and ensuring complete freedom from potentially catastrophic disease-producing bacteria, viruses, and toxins. To solve these concerns, NASA enlisted the help of the Pillsbury Company. Pillsbury quickly solved the first problem by coating bite-size foods to prevent crumbling. To ensure against bacterial contamination, they developed the hazard analysis and critical control point (HACCP) concept. Hazard analysis is a systematic study of a product, its ingredients, processing conditions, handling, storage, packing, distribution, and directions for consumer use to identify sensitive areas that might prove hazardous. Hazard analysis provides a basis for blueprinting the Critical Control Points (CCPs) to be monitored. CCPs are points in the chain from raw materials to the finished product where loss of control could result in unacceptable food safety risks. In early 1970, Pillsbury plants were following HACCP in the production of food for Earthbound consumers. Pillsbury's subsequent training courses for Food and Drug Administration (FDA) personnel led to the incorporation of HACCP into the FDA's Low Acid Canned Foods Regulations, set down in the mid-1970s to ensure the safety of all canned food products in the U.S.
9 CFR 417.2 - Hazard Analysis and HACCP Plan.
Code of Federal Regulations, 2012 CFR
2012-01-01
... more food safety hazards that are reasonably likely to occur, based on the hazard analysis conducted in... 9 Animals and Animal Products 2 2012-01-01 2012-01-01 false Hazard Analysis and HACCP Plan. 417.2 Section 417.2 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE...
9 CFR 417.2 - Hazard Analysis and HACCP Plan.
Code of Federal Regulations, 2014 CFR
2014-01-01
... more food safety hazards that are reasonably likely to occur, based on the hazard analysis conducted in... 9 Animals and Animal Products 2 2014-01-01 2014-01-01 false Hazard Analysis and HACCP Plan. 417.2 Section 417.2 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE...
9 CFR 417.2 - Hazard Analysis and HACCP Plan.
Code of Federal Regulations, 2011 CFR
2011-01-01
... more food safety hazards that are reasonably likely to occur, based on the hazard analysis conducted in... 9 Animals and Animal Products 2 2011-01-01 2011-01-01 false Hazard Analysis and HACCP Plan. 417.2 Section 417.2 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE...
9 CFR 417.2 - Hazard Analysis and HACCP Plan.
Code of Federal Regulations, 2013 CFR
2013-01-01
... more food safety hazards that are reasonably likely to occur, based on the hazard analysis conducted in... 9 Animals and Animal Products 2 2013-01-01 2013-01-01 false Hazard Analysis and HACCP Plan. 417.2 Section 417.2 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE...
9 CFR 417.2 - Hazard Analysis and HACCP Plan.
Code of Federal Regulations, 2010 CFR
2010-01-01
... more food safety hazards that are reasonably likely to occur, based on the hazard analysis conducted in... 9 Animals and Animal Products 2 2010-01-01 2010-01-01 false Hazard Analysis and HACCP Plan. 417.2 Section 417.2 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE...
Kourgialas, Nektarios N; Karatzas, George P
2017-12-01
The present work introduces a national-scale flood hazard assessment methodology, using multi-criteria analysis and artificial neural network (ANN) techniques in a GIS environment. The proposed methodology was applied in Greece, where flash floods are a relatively frequent phenomenon that has become more intense over recent decades, causing significant damage in rural and urban sectors. In order to identify the areas most prone to flooding, seven factor-maps (each directly related to flood generation) were combined in a GIS environment. These factor-maps are: a) the Flow accumulation (F), b) the Land use (L), c) the Altitude (A), d) the Slope (S), e) the soil Erodibility (E), f) the Rainfall intensity (R), and g) the available water Capacity (C). The proposed method is named "FLASERC" after these factors. The flood hazard for each factor is classified into five categories: very low, low, moderate, high, and very high. The factors are combined and processed using an appropriate ANN algorithm. For the ANN training process, the spatial distribution of historically flooded points in Greece across the five flood hazard categories of the seven factor-maps was used. In this way, the overall flood hazard map for Greece was determined. The final results are verified using additional historical flood events that have occurred in Greece over the last 100 years. In addition, an overview of flood protection measures and adaptation policy approaches is proposed for agricultural and urban areas located in very-high-hazard areas. Copyright © 2017 Elsevier B.V. All rights reserved.
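A minimal Python sketch of the classification stage, assuming (as the abstract implies but does not fully specify) that each cell is described by its seven reclassified factor scores and labelled by historical flood occurrence; the network size and all data below are invented for illustration:

import numpy as np
from sklearn.neural_network import MLPClassifier

# Rows = map cells; columns = the seven FLASERC factors (F, L, A, S, E, R, C),
# each already reclassified to hazard scores 1-5. Synthetic stand-in data.
rng = np.random.default_rng(3)
X = rng.integers(1, 6, size=(2000, 7)).astype(float)
# Labels: 1 where a historical flood was recorded (synthetic here).
y = (X.mean(axis=1) + rng.normal(0, 0.5, 2000) > 3.2).astype(int)

clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=1000, random_state=0)
clf.fit(X, y)

# Hazard index per cell = predicted flood probability, binned to five classes.
hazard = clf.predict_proba(X)[:, 1]
classes = np.digitize(hazard, [0.2, 0.4, 0.6, 0.8]) + 1  # 1 = very low ... 5 = very high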
The distinction between risk and hazard: understanding and use in stakeholder communication.
Scheer, Dirk; Benighaus, Christina; Benighaus, Ludger; Renn, Ortwin; Gold, Stefan; Röder, Bettina; Böl, Gaby-Fleur
2014-07-01
A major issue in all risk communication efforts is the distinction between the terms "risk" and "hazard." The potential to harm a target such as human health or the environment is normally defined as a hazard, whereas risk also encompasses the probability of exposure and the extent of damage. What can be observed again and again in risk communication processes are misunderstandings and communication gaps related to these crucial terms. We asked a sample of 53 experts from public authorities, business and industry, and environmental and consumer organizations in Germany to outline their understanding and use of these terms, using both expert interviews and focus groups. The empirical study made clear that the terms risk and hazard are perceived and used very differently in risk communication depending on the perspective of the stakeholders. Several factors can be identified, such as responsibility for hazard avoidance, economic interest, or a watchdog role. Thus, communication gaps can be reduced to a four-fold problem matrix comprising a semantic, conceptual, strategic, and control problem. The stakeholders' own worldviews played a major role in their specific use of the two terms hazard and risk in communication. © 2014 Society for Risk Analysis.
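The distinction the respondents wrestled with is often condensed into a textbook shorthand (a schematic, not a formula from this study):

\[
\text{hazard} = \text{potential to cause harm}, \qquad \text{risk} \approx P(\text{exposure}) \times \text{extent of damage},
\]

so a substance can be highly hazardous yet low-risk when exposure is effectively prevented.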
14 CFR 417.111 - Launch plans.
Code of Federal Regulations, 2010 CFR
2010-01-01
... controls identified by a launch operator's ground safety analysis and implementation of the ground safety.... (ii) For each toxic propellant, any hazard controls and process constraints determined under the... classification and compatibility group as defined by part 420 of this chapter. (3) A graphic depiction of the...
Improving Food Safety in Meat and Poultry: Will New Regulations Benefit Consumers?
ERIC Educational Resources Information Center
Unnevehr, Laurian J.; Roberts, Tanya; Jensen, Helen H.
1997-01-01
The U.S. Department of Agriculture's Hazard Analysis and Critical Control Point System for meat and poultry processing will benefit consumers by reducing food-borne illnesses. The benefits are likely to exceed the additional costs from implementing the regulations. (SK)
ERIC Educational Resources Information Center
Wanke, Randall; Stauffer, Jennifer
2007-01-01
An advanced undergraduate chemistry laboratory experiment to study the advantages and hazards of the coupling of NIR spectroscopy and chemometrics is described. The combination is commonly used for analysis and process control of various ingredients used in agriculture, petroleum and food products.
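For readers unfamiliar with the chemometrics side, a toy Python example of the usual workflow (partial least squares regression on NIR spectra) is sketched below. The spectra are synthetic, and the "hazard" illustrated, overfitting with too many latent variables that cross-validation exposes, is a generic pitfall of the technique rather than the specific exercise in this experiment:

import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_score

# Synthetic NIR spectra: 100 samples x 500 wavelengths, with the analyte
# concentration buried in two absorption bands plus noise (illustrative only).
rng = np.random.default_rng(5)
conc = rng.uniform(0, 1, 100)
wl = np.linspace(1100, 2500, 500)
band = np.exp(-((wl - 1700) / 40) ** 2) + 0.5 * np.exp(-((wl - 2100) / 60) ** 2)
spectra = conc[:, None] * band[None, :] + rng.normal(0, 0.02, (100, 500))

pls = PLSRegression(n_components=3)
# Cross-validated R^2; raising n_components too far fits noise and the
# apparent (training) accuracy diverges from these honest scores.
print(cross_val_score(pls, spectra, conc, cv=5, scoring="r2"))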
Long term volcanic hazard analysis in the Canary Islands
NASA Astrophysics Data System (ADS)
Becerril, L.; Galindo, I.; Laín, L.; Llorente, M.; Mancebo, M. J.
2009-04-01
Historic volcanism in Spain is restricted to the Canary Islands, a volcanic archipelago formed by seven volcanic islands. Several historic eruptions have been registered in the last five hundred years. However, despite the large number of residents and tourists in the archipelago, only a few volcanic hazard studies have been carried out. These studies have mainly focused on developing hazard maps for the islands of Lanzarote and Tenerife, especially for land-use planning. The main handicap for such studies in the Canary Islands is the lack of well-reported historical eruptions, as well as the lack of geochronological, geochemical and structural data. In recent years, the use of Geographical Information Systems (GIS) and improvements in the modelling of volcanic processes have provided important tools for volcanic hazard assessment. Although these sophisticated programs are very useful, they need to be fed with large amounts of data that sometimes, as in the case of the Canary Islands, are not available. For this reason, the Spanish Geological Survey (IGME) is developing a complete geo-referenced database for long-term volcanic analysis in the Canary Islands. The Canarian Volcanic Hazard Database (HADA) is based on a GIS, helping to organize and manage volcanic information efficiently. HADA includes the following groups of information: (1) 1:25.000 scale geologic maps, (2) 1:25.000 topographic maps, (3) geochronologic data, (4) geochemical data, (5) structural information, (6) climatic data. Data must pass a quality control before they are included in the database, and new data are easily integrated. With the HADA database the IGME has started a systematic organization of the existing data. In the near future, the IGME will generate new information to be included in HADA, such as volcanological maps of the islands, structural information, geochronological data and other information needed for long-term volcanic hazard analysis. HADA will provide enough quality information to map volcanic hazards and to run more reliable models of volcanic hazards; in addition, it aims to become a sharing system, improving communication between researchers, reducing redundant work, and serving as the reference for geological research in the Canary Islands.
Gallina, Valentina; Torresan, Silvia; Critto, Andrea; Sperotto, Anna; Glade, Thomas; Marcomini, Antonio
2016-03-01
This paper presents a review of existing multi-risk assessment concepts and tools applied by organisations and projects, providing the basis for the development of a multi-risk methodology in a climate change perspective. Relevant initiatives were developed for the assessment of multiple natural hazards (e.g. floods, storm surges, droughts) affecting the same area in a defined timeframe (e.g. year, season, decade). Major research efforts have focused on the identification and aggregation of multiple hazard types (e.g. independent, correlated, cascading hazards) by means of quantitative and semi-quantitative approaches. Moreover, several methodologies aim to assess the vulnerability of multiple targets to specific natural hazards by means of vulnerability functions and indicators at the regional and local scale. The overall results of the review show that multi-risk approaches do not consider the effects of climate change and mostly rely on the analysis of static vulnerability (i.e. no time-dependent vulnerabilities, no changes among exposed elements). A relevant challenge is therefore to develop comprehensive formal approaches for the assessment of different climate-induced hazards and risks, including dynamic exposure and vulnerability. This requires the selection and aggregation of suitable hazard and vulnerability metrics to synthesize information about multiple climate impacts, along with the spatial analysis and ranking of risks, including their visualization and communication to end-users. To face these issues, climate impact assessors should develop cross-sectorial collaborations among different areas of expertise (e.g. modellers, natural scientists, economists), integrating information on climate change scenarios with sectorial climate impact assessment, towards the development of a comprehensive multi-risk assessment process. Copyright © 2015 Elsevier Ltd. All rights reserved.
Environmental Assessment of Remove Objects Along Flightline at Davis-Monthan Air Force Base, Arizona
2005-08-01
of Defense DRMO Defense Reutilization and Marketing Office EA Environmental Assessment EIAP Environmental Impact Analysis Process EIS...the ADA to advertise that protected species might be available for salvage, facilitating the salvage process for the project area. 3.5 Hazardous...are stored at the Defense Reutilization and Marketing Office (DRMO). Asbestos. AFI 32-1052, Facilities Asbestos Management, provides direction for
AFRL Solid Propellant Laboratory Explosive Siting and Renovation Lessons Learned
2010-07-01
Area 1-30A explosive facility and provide consultation/support during the review process for each of the site plans. • Applied Engineering Services...provided consultation/support during the siting review process. • Applied Engineering Services (AES) Inc. performed a detailed structural, blast, thermal... Applied Engineering Services (AES) Inc. structural, blast, thermal and fragment hazard analysis to determine the appropriate siting values based on
Global quantitative indices reflecting provider process-of-care: data-base derivation.
Moran, John L; Solomon, Patricia J
2010-04-19
Controversy has attended the relationship between risk-adjusted mortality and process-of-care. There would be advantage in the establishment, at the data-base level, of global quantitative indices subsuming the diversity of process-of-care. A retrospective, cohort study of patients identified in the Australian and New Zealand Intensive Care Society Adult Patient Database, 1993-2003, at the level of geographic and ICU-level descriptors (n = 35), for both hospital survivors and non-survivors. Process-of-care indices were established by analysis of: (i) the smoothed time-hazard curve of individual patient discharge and determined by pharmaco-kinetic methods as area under the hazard-curve (AUC), reflecting the integrated experience of the discharge process, and time-to-peak-hazard (TMAX, in days), reflecting the time to maximum rate of hospital discharge; and (ii) individual patient ability to optimize output (as length-of-stay) for recorded data-base physiological inputs; estimated as a technical production-efficiency (TE, scaled [0,(maximum)1]), via the econometric technique of stochastic frontier analysis. For each descriptor, multivariate correlation-relationships between indices and summed mortality probability were determined. The data-set consisted of 223129 patients from 99 ICUs with mean (SD) age and APACHE III score of 59.2(18.9) years and 52.7(30.6) respectively; 41.7% were female and 45.7% were mechanically ventilated within the first 24 hours post-admission. For survivors, AUC was maximal in rural and for-profit ICUs, whereas TMAX (>or= 7.8 days) and TE (>or= 0.74) were maximal in tertiary-ICUs. For non-survivors, AUC was maximal in tertiary-ICUs, but TMAX (>or= 4.2 days) and TE (>or= 0.69) were maximal in for-profit ICUs. Across descriptors, significant differences in indices were demonstrated (analysis-of-variance, P
NASA Astrophysics Data System (ADS)
Anthony, Edward J.; Julian, Maurice
1999-12-01
Steep coastal margins are potentially subject to mass wasting processes involving notable landslide activity and sediment evacuation downstream by steep-gradient streams. Sediment transfer from short source-to-sink segments, coupled with mountain hydrological regimes, regulates patterns of river channel aggradation and coastal sediment supply in such geomorphic settings. On the steep French Riviera margin, sediment transfers from existing landslides or from various minor mass wasting processes to stream channels may result following bursts of heavy, concentrated rainfall. High-magnitude flooding and massive sediment transport downstream are generally related to unpredictable extreme rainfalls. Both mass movements and channel sediment storage pose serious hazards to downvalley settlements and infrastructure. A consideration of channel sediment storage patterns in the Var River catchment, the most important catchment in this area, highlights two important shortcomings relative to environmental engineering and hazard mitigation practices. In the first place, the appreciation of geomorphic processes is rather poor. This is illustrated by the undersized nature of engineering works constructed to mitigate hazards in the upstream bedload-dominated channels, and by the unforeseen effects that ten rock dams, constructed in the early 1970s, have had on downstream and coastal sediment storage and on sediment dispersal patterns and, consequently, valley flooding. Secondly, planners and environmental engineers have lacked foresight in valley and coastal management issues in this steep setting, notably as regards the reclaimed areas of the lower Var channel and delta liable to flooding. Urbanization and transport and environmental engineering works have progressively affected patterns of storage and transport of fine-grained sediments in the lower Var channel and delta. Meanwhile, the problems raised by these changes have not been adequately addressed in terms of scientific research. A necessary future step in improving the engineering solutions implemented to contain natural hazards or to harness water and sediment resources is fine-scale analysis of source-to-sink sediment transfer processes, of sediment budgets, of time-scales of storage in stream channels, and, finally, of high-magnitude hydrometeorological forcing events in this area. The way all these aspects have been modulated by engineering practices and socioeconomic development should also be an important part of such an analysis.
Rockfall Hazard Process Assessment : Final Project Report
DOT National Transportation Integrated Search
2017-10-01
After a decade of using the Rockfall Hazard Rating System (RHRS), the Montana Department of Transportation (MDT) sought a reassessment of their rockfall hazard evaluation process. Their prior system was a slightly modified version of the RHRS and was...
Fu, Huijian; Qiu, Wenwei; Ma, Haiying; Ma, Qingguo
2017-01-01
Deceptive behavior is common in human social interactions. Researchers have been trying to uncover the cognitive process and neural basis underlying deception due to its theoretical and practical significance. We used event-related potentials (ERPs) to investigate the neural correlates of deception when the participants completed a hazard judgment task. Pictures conveying or not conveying hazard information were presented to the participants, who were then requested to discriminate the hazard content (safe or hazardous) and make a response corresponding to the cues (truthful or deceptive). Behavioral and electrophysiological data were recorded during the entire experiment. Results showed that deceptive responses, compared to truthful responses, were associated with longer reaction time (RT), lower accuracy, increased N2 and reduced late positive potential (LPP), suggesting a cognitively more demanding process to respond deceptively. The decrement in LPP correlated negatively with the increment in RT for deceptive relative to truthful responses, regardless of hazard content. In addition, hazardous information evoked larger N1 and P300 than safe information, reflecting an early processing bias and a later evaluative categorization process based on motivational significance, respectively. Finally, the interaction between honesty (truthful/deceptive) and safety (safe/hazardous) on accuracy and LPP indicated that deceptive responses towards safe information required more effort than deceptive responses towards hazardous information. Overall, these results demonstrate the neurocognitive substrates underlying deception about hazard information.
A Cascading Storm-Flood-Landslide Guidance System: Development and Application in China
NASA Astrophysics Data System (ADS)
Zeng, Ziyue; Tang, Guoqiang; Long, Di; Ma, Meihong; Hong, Yang
2016-04-01
Flash floods and landslides, triggered by storms, often interact and cause cascading effects on human lives and property. Satellite remote sensing data have significant potential for use in the analysis of these natural hazards. As one of the regions continuously affected by severe flash floods and landslides, Yunnan Province, located in Southwest China, has a complex mountainous hydrometeorology and suffers frequent heavy rainfall from May through late September. Taking Yunnan as a test-bed, this study proposed a Cascading Storm-Flood-Landslide Guidance System to progressively analyse and evaluate the risk of these multi-hazards based on multisource satellite remote sensing data. First, three standardized rainfall amounts (average daily amount in flood seasons, maximum 1 h amount and maximum 6 h amount) from the products of the Tropical Rainfall Measuring Mission (TRMM) Multi-satellite Precipitation Analysis (TMPA) were used as rainfall indicators to derive the StorM Hazard Index (SMHI). In this process, an integrated approach combining the Analytic Hierarchy Process (AHP) and information-entropy theory was adopted to determine the weight of each indicator. Then, land cover and vegetation cover data from the Moderate Resolution Imaging Spectroradiometer (MODIS) products, soil type from the Harmonized World Soil Database (HWSD) soil map, and slope from the Shuttle Radar Topography Mission (SRTM) data were added as semi-static geo-topographical indicators to derive the Flash Flood Hazard Index (FFHI). Furthermore, three more landslide-controlling indicators, namely elevation, slope angle and soil texture, were incorporated to derive the LandSlide Hazard Index (LSHI). Further inclusion of GDP, population and prevention measures as vulnerability indicators enabled consecutive prediction of the risks of storms, flash floods and landslides, respectively. Consequently, the spatial patterns of the hazard indices show that the southeast of Yunnan is more likely to encounter storms than other parts of the province, while the northeast of Yunnan is most susceptible to floods and landslides, which agrees with the distribution of observed flood and landslide events. Moreover, risks for the multi-hazards were classified into four categories. Results show a strong correlation between the distributions of flood-prone and landslide-prone regions, and also highlight the counties with high risk of storms (e.g., Funing and Malipo), flash floods (e.g., Gongshan and Yanjing) and landslides (e.g., Zhaotong and Luxi). Compared to other approaches, the Cascading Storm-Flood-Landslide Guidance System uses a straightforward yet useful indicator-based weighted linear combination method and could be a useful prototype for mapping the characteristics of storm-triggered hazards for users at different administrative levels (e.g., catchment, town, county, province and even nation) in China.
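A Python sketch of the AHP-plus-entropy weighting step; the rainfall matrix, the pairwise comparison matrix and the fifty-fifty blend of the two weight vectors are illustrative assumptions, since the abstract does not state how the two weightings were integrated:

import numpy as np

# Entropy weighting of the three TMPA rainfall indicators (illustrative data):
# columns = [mean daily (flood season), max 1 h, max 6 h]; rows = grid cells.
rng = np.random.default_rng(7)
R = rng.gamma(2.0, 5.0, size=(1000, 3))        # strictly positive, so log is safe

P = R / R.sum(axis=0)                           # normalize each indicator column
k = 1.0 / np.log(R.shape[0])
entropy = -k * (P * np.log(P)).sum(axis=0)
w_entropy = (1 - entropy) / (1 - entropy).sum() # information-entropy weights

# AHP weights from an (illustrative) pairwise comparison matrix:
# principal-eigenvector method.
A = np.array([[1, 1/2, 1/3],
              [2, 1,   1/2],
              [3, 2,   1  ]], dtype=float)
eigvals, eigvecs = np.linalg.eig(A)
w_ahp = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
w_ahp /= w_ahp.sum()

w = 0.5 * w_ahp + 0.5 * w_entropy               # one simple way to integrate the two
smhi = (R / R.max(axis=0)) @ w                  # StorM Hazard Index per grid cell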
2002 Hyperspectral Analysis of Hazardous Waste Sites on the Savannah River Site
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gladden, J.B.
2003-08-28
Hazardous waste site inspection is a labor intensive, time consuming job, performed primarily on the ground using visual inspection and instrumentation. It is an expensive process to continually monitor hazardous waste and/or landfill sites to determine if they are maintaining their integrity. In certain instances, it may be possible to monitor aspects of the hazardous waste sites and landfills remotely. The utilization of multispectral data was suggested for the mapping of clays and iron oxides associated with contaminated groundwater, vegetation stress, and methane gas emissions (which require longer wavelength detectors). The Savannah River Site (SRS) near Aiken, S.C. is a United States Department of Energy facility operated by the Westinghouse Savannah River Company. For decades the SRS was responsible for developing weapons-grade plutonium and other materials for the nation's nuclear defense. Hazardous waste was generated during this process. Waste storage site inspection is a particularly important issue at the SRS because there are over 100 hazardous waste sites scattered throughout the 300-square-mile complex, making it difficult to continually monitor all of the facilities. The goal is to use remote sensing technology to identify surface anomalies on the hazardous waste sites as early as possible so that remedial work can take place rapidly to maintain the integrity of the storage sites. The anomalous areas are then targeted for intensive in situ human examination and measurement. During the 1990s, many of the hazardous waste sites were capped with protective layers of polyethylene sheeting and soil, and planted with bahia grass and/or centipede grass. This research investigated hyperspectral remote sensing technology to determine if it can be used to measure accurately and monitor possible indicators of change on vegetated hazardous waste sites. Specifically, it evaluated the usefulness of hyperspectral remote sensing to assess the condition of vegetation on clay caps on the Mixed Waste Management Facility (MWMF). This report first describes the principles of hyperspectral remote sensing. In situ measurement and hyperspectral remote sensing methods used to analyze hazardous waste sites on the Savannah River Site are then presented.
GO/NO-GO - When is medical hazard mitigation acceptable for launch?
NASA Technical Reports Server (NTRS)
Hamilton, Douglas R.; Polk, James D.
2005-01-01
Medical support of spaceflight missions is composed of complex tasks and decisions dedicated to maintaining the health and performance of the crew and the completion of mission objectives. Spacecraft are among the most complex vehicles built by humans, and are built to very rigorous design specifications. In the course of a Flight Readiness Review (FRR) or a mission itself, the flight surgeon must be able to understand the impact of hazards and risks that may not be completely mitigated by design alone. Some hazards are not mitigated because they are never actually identified. When a hazard is identified, it must be reduced or waived. Hazards that cannot be designed out of the vehicle or mission are usually mitigated through other means to bring the residual risk to an acceptable level. This is possible in most engineered systems because failure modes are usually predictable and analysis can include taking these systems to failure. Medical support of space missions is complicated by the inability of flight surgeons to provide "exact" hazard and risk numbers to the NASA engineering community. Taking humans to failure is not an option. Furthermore, medical dogma is mostly comprised of "medical prevention" strategies that mitigate risk by examining the behavior of a cohort of humans similar to astronauts. Unfortunately, this approach does not lend itself well to predicting the effect of a hazard in the unique environment of space. This presentation will discuss how Medical Operations uses an evidence-based approach to decide if hazard mitigation strategies are adequate to reduce mission risk to acceptable levels. Case studies to be discussed will include: 1. Risk of electrocution during EVA; 2. Risk of a cardiac event during long- and short-duration missions; 3. Degraded cabin environmental monitoring on the ISS. Learning Objectives: 1) The audience will understand the challenges of mitigating medical risk caused by nominal and off-nominal mission events. 2) The audience will understand the process by which medical hazards are identified and mitigated before launch. 3) The audience will understand the roles and responsibilities of all the other flight control positions in participating in the process of reducing hazards and reducing medical risk to an acceptable level.
Visible and infrared remote imaging of hazardous waste: A review
Slonecker, Terrence; Fisher, Gary B.; Aiello, Danielle P.; Haack, Barry
2010-01-01
One of the critical global environmental problems is human and ecological exposure to hazardous wastes from agricultural, industrial, military and mining activities. These wastes often include heavy metals, hydrocarbons and other organic chemicals. Traditional field and laboratory detection and monitoring of these wastes are generally expensive and time consuming. The synoptic perspective of overhead remote imaging can be very useful for the detection and remediation of hazardous wastes. Aerial photography has a long and effective record in waste site evaluations. Aerial photographic archives allow temporal evaluation and change detection by visual interpretation. Multispectral aircraft and satellite systems have been successfully employed in both spectral and morphological analysis of hazardous wastes on the landscape and emerging hyperspectral sensors have permitted determination of the specific contaminants by processing strategies using the tens or hundreds of acquired wavelengths in the solar reflected and/or thermal infrared parts of the electromagnetic spectrum. This paper reviews the literature of remote sensing and overhead imaging in the context of hazardous waste and discusses future monitoring needs and emerging scientific research areas.
14 CFR 417.227 - Toxic release hazard analysis.
Code of Federal Regulations, 2010 CFR
2010-01-01
... Title 14, Aeronautics and Space; Department of Transportation; Licensing; Launch Safety; Flight Safety Analysis; § 417.227 Toxic release hazard analysis. A flight safety analysis must establish flight commit criteria that protect the public from any...
Herzer, Kurt R; Mirrer, Meredith; Xie, Yanjun; Steppan, Jochen; Li, Matthew; Jung, Clinton; Cover, Renee; Doyle, Peter A; Mark, Lynette J
2012-08-01
Since 1999, hospitals have made substantial commitments to health care quality and patient safety through individual initiatives of executive leadership involvement in quality, investments in safety culture, education and training for medical students and residents in quality and safety, the creation of patient safety committees, and implementation of patient safety reporting systems. At the Weinberg Surgical Suite at The Johns Hopkins Hospital (Baltimore), a 16-operating-room inpatient/outpatient cancer center, a patient safety reporting process was developed to maximize the usefulness of the reports and the long-term sustainability of quality improvements arising from them. A six-phase framework was created incorporating UHC's Patient Safety Net (PSN): Identify, report, analyze, mitigate, reward, and follow up. Unique features of this process included a multidisciplinary team to review reports, mitigate hazards, educate and empower providers, recognize the identifying/reporting individuals or groups with "Good Catch" awards, and follow up to determine if quality improvements were sustained over time. Good Catch awards have been given in recognition of 29 patient safety hazards identified since 2008; in each of these cases, an initiative was developed to mitigate the original hazard. Twenty-five (86%) of the associated quality improvements have been sustained. Two Good Catch award-winning projects--vials of heparin with an unusually high concentration of the drug that posed a potential overdose hazard and a rapid infusion device that resisted practitioner control--are described in detail. A multidisciplinary team's analysis and mitigation of hazards identified in a patient safety reporting process entailed positive recognition with a Good Catch award, education of practitioners, and long-term follow-up.
Wang, Yuanjia; Chen, Tianle; Zeng, Donglin
2016-01-01
Learning risk scores to predict dichotomous or continuous outcomes using machine learning approaches has been studied extensively. However, how to learn risk scores for time-to-event outcomes subject to right censoring has received little attention until recently. Existing approaches rely on inverse probability weighting or rank-based regression, which may be inefficient. In this paper, we develop a new support vector hazards machine (SVHM) approach to predict censored outcomes. Our method is based on predicting the counting process associated with the time-to-event outcomes among subjects at risk via a series of support vector machines. Introducing counting processes to represent time-to-event data leads to a connection between support vector machines in supervised learning and hazards regression in standard survival analysis. To account for different at-risk populations at observed event times, a time-varying offset is used in estimating risk scores. The resulting optimization is a convex quadratic programming problem that can easily incorporate non-linearity using the kernel trick. We demonstrate an interesting link from the profiled empirical risk function of SVHM to the Cox partial likelihood. We then formally show that SVHM is optimal in discriminating the covariate-specific hazard function from the population average hazard function, and establish the consistency and learning rate of the predicted risk using the estimated risk scores. Simulation studies show improved prediction accuracy of the event times using SVHM compared to existing machine learning methods and conventional approaches. Finally, we analyze data from two real-world biomedical studies where we use clinical markers and neuroimaging biomarkers to predict age-at-onset of a disease, and demonstrate the superiority of SVHM in distinguishing high-risk versus low-risk subjects.
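The link the abstract draws between SVHM's profiled empirical risk and the Cox partial likelihood can be made concrete with the latter. Below is a minimal numpy sketch of the Cox negative log partial likelihood for a linear risk score, assuming no tied event times; all data are synthetic.

```python
import numpy as np

def neg_log_partial_likelihood(beta, X, time, event):
    """X: (n, p) covariates; time: event/censoring times; event: True if observed."""
    score = X @ beta
    order = np.argsort(-time)                    # descending time
    score, event = score[order], event[order]
    # running log-sum-exp gives log sum_{j: t_j >= t_i} exp(score_j)
    log_cumsum = np.logaddexp.accumulate(score)
    return -np.sum((score - log_cumsum)[event == 1])

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
time = rng.exponential(scale=np.exp(-X @ [0.5, -0.3, 0.0]))
event = rng.random(50) < 0.8                     # ~80% of outcomes observed
print(neg_log_partial_likelihood(np.array([0.5, -0.3, 0.0]), X, time, event))
```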
Development of an SPE/CE method for analyzing HAAs
Zhang, L.; Capel, P.D.; Hozalski, R.M.
2007-01-01
The haloacetic acid (HAA) analysis methods approved by the US Environmental Protection Agency involve extraction and derivatization of HAAs (typically to their methyl ester form) and analysis by gas chromatography (GC) with electron capture detection (ECD). Concerns associated with these methods include the time and effort of the derivatization process, use of potentially hazardous chemicals or conditions during methylation, poor recoveries because of low extraction efficiencies for some HAAs or matrix effects from sulfate, and loss of tribromoacetic acid because of decarboxylation. The HAA analysis method introduced here uses solid-phase extraction (SPE) followed by capillary electrophoresis (CE) analysis. The method is accurate, reproducible, sensitive, relatively safe, and easy to perform, and avoids the use of large amounts of solvent for liquid-liquid extraction and the potential hazards and hassles of derivatization. The cost of analyzing HAAs using this method should be lower than the currently approved methods, and utilities with a GC/ECD can perform the analysis in-house.
Herzer, Kurt R.; Mirrer, Meredith; Xie, Yanjun; Steppan, Jochen; Li, Matthew; Jung, Clinton; Cover, Renee; Doyle, Peter A.; Mark, Lynette J.
2014-01-01
Background Since 1999, hospitals have made substantial commitments to healthcare quality and patient safety through individual initiatives of executive leadership involvement in quality, investments in safety culture, education and training for medical students and residents in quality and safety, the creation of patient safety committees, and implementation of patient safety reporting systems. Cohesive quality and safety approaches have become comprehensive programs to identify and mitigate hazards that could harm patients. This article moves to the next level with an intense refocusing of attention on one of the individual components of a comprehensive program--the patient safety reporting system--with a goal of maximized usefulness of the reports and long-term sustainability of quality improvements arising from them. Methods A six-phase framework was developed to deal with patient safety hazards: identify, report, analyze, mitigate, reward, and follow up. Unique features of this process included a multidisciplinary team to review reports, mitigate hazards, educate and empower providers, recognize the identifying/reporting individuals or groups with "Good Catch" awards, and follow up to determine if quality improvements were sustained over time. Results To date, 29 patient safety hazards have gone through this process with "Good Catch" awards being granted at our institution. These awards were presented at various times over the past 4 years since the process began in 2008. Follow-up revealed that 86% of the associated quality improvements have been sustained over time since the awards were given. We present the details of two of these "Good Catch" awards: vials of heparin with an unusually high concentration of the drug that posed a potential overdose hazard and a rapid infusion device that resisted practitioner control. Conclusion A multidisciplinary team's analysis and mitigation of hazards identified in a patient safety reporting system, positive recognition with a "Good Catch" award, education of practitioners, and long-term follow-up resulted in an outcome of sustained quality improvement initiatives. PMID:22946251
77 FR 58217 - Notice of Delays in Processing of Special Permits Applications
Federal Register 2010, 2011, 2012, 2013, 2014
2012-09-19
... Department of Transportation, Pipeline and Hazardous Materials Safety Administration. Notice of Delays in Processing of Special Permits Applications. AGENCY: Pipeline and Hazardous Materials Safety.... FOR FURTHER INFORMATION CONTACT: Ryan Paquet, Director, Office of Hazardous Materials Special Permits...
77 FR 64846 - Notice of Delays in Processing of Special Permits Applications
Federal Register 2010, 2011, 2012, 2013, 2014
2012-10-23
... Department of Transportation, Pipeline and Hazardous Materials Safety Administration. Notice of Delays in Processing of Special Permits Applications. AGENCY: Pipeline and Hazardous Materials Safety.... FOR FURTHER INFORMATION CONTACT: Ryan Paquet, Director, Office of Hazardous Materials Special Permits...
77 FR 15455 - Notice of Delays in Processing of Special Permits Applications
Federal Register 2010, 2011, 2012, 2013, 2014
2012-03-15
... Department of Transportation, Pipeline and Hazardous Materials Safety Administration. Notice of Delays in Processing of Special Permits Applications. AGENCY: Pipeline and Hazardous Materials Safety... and Approvals, Pipeline and Hazardous Materials Safety Administration, U.S. Department of...
36 CFR 218.6 - Reviewing officer.
Code of Federal Regulations, 2010 CFR
2010-07-01
... Administrative Review Processes: Predecisional Administrative Review Process for Hazardous Fuel Reduction Projects... the administrative review processes of other Federal agencies, for authorized hazardous fuel reduction... administrative review. ...
Hazard interactions and interaction networks (cascades) within multi-hazard methodologies
NASA Astrophysics Data System (ADS)
Gill, Joel C.; Malamud, Bruce D.
2016-08-01
This paper combines research and commentary to reinforce the importance of integrating hazard interactions and interaction networks (cascades) into multi-hazard methodologies. We present a synthesis of the differences between multi-layer single-hazard approaches and multi-hazard approaches that integrate such interactions. This synthesis suggests that ignoring interactions between important environmental and anthropogenic processes could distort management priorities, increase vulnerability to other spatially relevant hazards or underestimate disaster risk. In this paper we proceed to present an enhanced multi-hazard framework through the following steps: (i) description and definition of three groups (natural hazards, anthropogenic processes and technological hazards/disasters) as relevant components of a multi-hazard environment, (ii) outlining of three types of interaction relationship (triggering, increased probability, and catalysis/impedance), and (iii) assessment of the importance of networks of interactions (cascades) through case study examples (based on the literature, field observations and semi-structured interviews). We further propose two visualisation frameworks to represent these networks of interactions: hazard interaction matrices and hazard/process flow diagrams. Our approach reinforces the importance of integrating interactions between different aspects of the Earth system, together with human activity, into enhanced multi-hazard methodologies. Multi-hazard approaches support the holistic assessment of hazard potential and consequently disaster risk. We conclude by describing three ways by which understanding networks of interactions contributes to the theoretical and practical understanding of hazards, disaster risk reduction and Earth system management. Understanding interactions and interaction networks helps us to better (i) model the observed reality of disaster events, (ii) constrain potential changes in physical and social vulnerability between successive hazards, and (iii) prioritise resource allocation for mitigation and disaster risk reduction.
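One of the proposed visualisation frameworks, the hazard interaction matrix, is straightforward to encode. The sketch below uses toy hazards and relationships (illustrative examples, not the paper's case-study data) and also enumerates the interaction networks (cascades) implied by the matrix.

```python
# Entry interactions[i][j] is the relationship by which hazard i influences hazard j.
hazards = ["storm", "flood", "landslide", "wildfire"]
TRIGGERS, INCREASES_PROB, NONE = "triggering", "increased probability", ""

interactions = {h: {k: NONE for k in hazards} for h in hazards}
interactions["storm"]["flood"] = TRIGGERS
interactions["storm"]["landslide"] = TRIGGERS
interactions["flood"]["landslide"] = INCREASES_PROB
interactions["wildfire"]["landslide"] = INCREASES_PROB  # burned slopes fail more easily

def cascades(source, depth=3):
    """Enumerate interaction chains (cascades) starting from a source hazard."""
    out, stack = [], [[source]]
    while stack:
        chain = stack.pop()
        if len(chain) > 1:
            out.append(" -> ".join(chain))
        if len(chain) <= depth:
            for nxt, rel in interactions[chain[-1]].items():
                if rel and nxt not in chain:
                    stack.append(chain + [nxt])
    return out

print(cascades("storm"))   # e.g. storm -> flood, storm -> flood -> landslide, ...
```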
A stochastic automata network for earthquake simulation and hazard estimation
NASA Astrophysics Data System (ADS)
Belubekian, Maya Ernest
1998-11-01
This research develops a model for simulation of earthquakes on seismic faults with available earthquake catalog data. The model allows estimation of the seismic hazard at a site of interest and assessment of the potential damage and loss in a region. There are two approaches for studying earthquakes: mechanistic and stochastic. In the mechanistic approach, seismic processes, such as changes in stress or slip on faults, are studied in detail. In the stochastic approach, earthquake occurrences are simulated as realizations of a certain stochastic process. In this dissertation, a stochastic earthquake occurrence model is developed that uses the results from dislocation theory for the estimation of slip released in earthquakes. The slip accumulation and release laws and the event scheduling mechanism adopted in the model result in a memoryless Poisson process for the small and moderate events and in a time- and space-dependent process for large events. The minimum and maximum of the hazard are estimated by the model when the initial conditions along the faults correspond to a situation right after the largest event and after a long seismic gap, respectively. These estimates are compared with the ones obtained from a Poisson model. The Poisson model overestimates the hazard after the maximum event and underestimates it in the period of a long seismic quiescence. The earthquake occurrence model is formulated as a stochastic automata network. Each fault is divided into cells, or automata, that interact by means of information exchange. The model uses a statistical method called the bootstrap for the evaluation of the confidence bounds on its results. The parameters of the model are adjusted to the target magnitude patterns obtained from the catalog. A case study is presented for the city of Palo Alto, where the hazard is controlled by the San Andreas, Hayward and Calaveras faults. The results of the model are used to evaluate the damage and loss distribution in Palo Alto. The sensitivity analysis of the model results to the variation in basic parameters shows that the maximum magnitude has the most significant impact on the hazard, especially for long forecast periods.
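The Poisson benchmark that the dissertation compares against reduces to a one-line formula: with a constant annual event rate lam, the probability of at least one event in t years is 1 - exp(-lam * t). A minimal sketch with a hypothetical rate:

```python
import math

def poisson_exceedance(lam, t_years):
    """P(at least one event in t_years) for a homogeneous Poisson process."""
    return 1.0 - math.exp(-lam * t_years)

lam = 0.02           # hypothetical: one large event per 50 years on average
for t in (1, 10, 50):
    print(t, "yr:", round(poisson_exceedance(lam, t), 3))
```

The dissertation's point is that this time-independent estimate brackets reality poorly: it overestimates the hazard just after the maximum event and underestimates it after a long quiescence.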
NASA Astrophysics Data System (ADS)
Lasocki, Stanislaw; Urban, Pawel; Kwiatek, Grzegorz; Martinez-Garzón, Patricia
2017-04-01
Injection induced seismicity (IIS) is an undesired dynamic rockmass response to massive fluid injections. This includes reactions, among others, to hydro-fracturing for shale gas exploitation. The complexity and changeability of the technological factors that induce IIS may result in significant deviations of the observed distributions of seismic process parameters from the models that perform well for natural, tectonic seismic processes. Classic formulations of probabilistic seismic hazard analysis in natural seismicity assume the seismic marked point process to be a stationary Poisson process, whose marks (magnitudes) are governed by a Gutenberg-Richter-born exponential distribution. It is well known that the use of an inappropriate earthquake occurrence model and/or an inappropriate magnitude distribution model leads to significant systematic errors in hazard estimates. It is therefore of paramount importance to check whether the aforementioned assumptions on the seismic process, commonly used in natural seismicity, can be safely used in IIS hazard problems or not. Seismicity accompanying shale gas operations is widely studied in the framework of the project "Shale Gas Exploration and Exploitation Induced Risks" (SHEER). Here we present results of SHEER project investigations of such seismicity from Oklahoma and of a proxy of such seismicity: IIS data from The Geysers geothermal field. We attempt to answer the following questions: • Do IIS earthquakes follow the Gutenberg-Richter distribution law, so that the magnitude distribution can be modelled by an exponential distribution? • Is the occurrence process of IIS earthquakes Poissonian? Is it segmentally Poissonian? If yes, how are these segments linked to cycles of technological operations? Statistical tests indicate that the Gutenberg-Richter-born exponential distribution model for magnitude is, in general, inappropriate. The magnitude distribution can be complex, multimodal, with no ready-to-use functional model. In this connection, we recommend using non-parametric kernel estimators of the magnitude distribution in hazard analyses. The earthquake occurrence process of IIS is not a Poisson process. When earthquake occurrences are influenced by a multitude of inducing factors, the interevent time distribution can be modelled by the Weibull distribution, supporting a negative ageing property of the process. When earthquake occurrences are due to a specific injection activity, the earthquake rate directly depends on the injection rate and responds immediately to changes of the injection rate. Furthermore, this response is not limited to correlated variations of the seismic activity but also concerns significant changes of the shape of the interevent time distribution. Unlike the event rate, the shape of the magnitude distribution does not exhibit correlation with the injection rate. This work was supported within SHEER: "Shale Gas Exploration and Exploitation Induced Risks" project funded from Horizon 2020 - R&I Framework Programme, call H2020-LCE 16-2014-1 and within statutory activities No3841/E-41/S/2016 of Ministry of Science and Higher Education of Poland.
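The authors' recommendation of non-parametric kernel estimators for the magnitude distribution can be sketched with a standard Gaussian kernel density estimate. The catalog below is synthetic and bimodal, the kind of complexity the abstract describes; this is an illustration, not the SHEER analysis itself.

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(1)
# synthetic bimodal magnitude catalog (no exponential model would fit this well)
mags = np.concatenate([rng.normal(1.2, 0.3, 400), rng.normal(2.4, 0.2, 100)])

kde = gaussian_kde(mags)          # non-parametric estimate of the magnitude density
grid = np.linspace(0.5, 3.5, 7)
for m, d in zip(grid, kde(grid)):
    print(f"M={m:.1f}  density={d:.3f}")
```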
Code of Federal Regulations, 2011 CFR
2011-01-01
... Animals and Animal Products, Food Safety and Inspection Service, Department of Agriculture, Regulatory Requirements Under the Federal Meat Inspection Act and the Poultry Products Inspection Act, Hazard Analysis and... control point. A point, step, or procedure in a food process at which control can be applied and, as a...
Code of Federal Regulations, 2012 CFR
2012-01-01
... Animals and Animal Products, Food Safety and Inspection Service, Department of Agriculture, Regulatory Requirements Under the Federal Meat Inspection Act and the Poultry Products Inspection Act, Hazard Analysis and... control point. A point, step, or procedure in a food process at which control can be applied and, as a...
Code of Federal Regulations, 2010 CFR
2010-01-01
... Animals and Animal Products, Food Safety and Inspection Service, Department of Agriculture, Regulatory Requirements Under the Federal Meat Inspection Act and the Poultry Products Inspection Act, Hazard Analysis and... control point. A point, step, or procedure in a food process at which control can be applied and, as a...
Code of Federal Regulations, 2013 CFR
2013-01-01
... Animals and Animal Products, Food Safety and Inspection Service, Department of Agriculture, Regulatory Requirements Under the Federal Meat Inspection Act and the Poultry Products Inspection Act, Hazard Analysis and... control point. A point, step, or procedure in a food process at which control can be applied and, as a...
Code of Federal Regulations, 2014 CFR
2014-01-01
... Animals and Animal Products, Food Safety and Inspection Service, Department of Agriculture, Regulatory Requirements Under the Federal Meat Inspection Act and the Poultry Products Inspection Act, Hazard Analysis and... control point. A point, step, or procedure in a food process at which control can be applied and, as a...
Numerical and Probabilistic Analysis of Asteroid and Comet Impact Hazard Mitigation
2010-09-01
...object on Jupiter are reminders and warning signals that we should take seriously. The extinction of the dinosaurs has been attributed to the impact of a... experimentally determined absorption patterns. These energy deposition processes are independent, so a piecemeal approach is physically reasonable. We...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
This document contains information about the research programs being conducted at the Savannah River Plant. Topics of discussion include: thermal cycling absorption process, development of new alloys, ion exchange, oxalate precipitation, calcination, environmental research, remedial action, ecological risk assessments, chemical analysis of salt cakes, natural phenomena hazards assessment, and sampling of soils and groundwater.
A HW-SW Co-Designed System for the Lunar Lander Hazard Detection and Avoidance Breadboarding
NASA Astrophysics Data System (ADS)
Palomo, Pedro; Latorre, Antonio; Valle, Carlos; Gomez de Aguero, Sergio; Hagenfeldt, Miguel; Parreira, Baltazar; Lindoso, Almudena; Portela, Marta; Garcia, Mario; San Millan, Enrique; Zharikov, Yuri; Entrena, Luis
2014-08-01
This paper presents the HW-SW co-design approach followed to tackle the design of the Hazard Detection and Avoidance (HDA) system breadboarding for the ESA Lunar Lander mission, undertaken because the novel GNC technologies used to enable autonomous systems demand processing capabilities that current (and forthcoming) space processors are not able to satisfy. The paper shows how the current system design was produced by partitioning the original, functionally validated HDA design between SW algorithms (intended for execution on a microprocessor) and HW algorithms (to be executed in an FPGA), considering the performance requirements and drawing on a deep analysis of the algorithms' adequacy for HW or SW implementation.
Assessment and Control of Spacecraft Charging Risks on the International Space Station
NASA Technical Reports Server (NTRS)
Koontz, Steve; Edeen, Marybeth; Spetch, William; Dalton, Penni; Keening, Thomas
2003-01-01
Electrical interactions between the F2 region ionospheric plasma and the 160V photovoltaic (PV) electrical power system on the International Space Station (ISS) can produce floating potentials (FP) on the ISS conducting structure of greater magnitude than are usually observed on spacecraft in low-Earth orbit. Flight through the geomagnetic field also causes magnetic induction charging of ISS conducting structure. Charging processes resulting from interaction of ISS with auroral electrons may also contribute to charging, albeit rarely. The magnitude and frequency of occurrence of possibly hazardous charging events depend on the ISS assembly stage (six more 160V PV arrays will be added to ISS), ISS flight configuration, ISS position (latitude and longitude), and the natural variability in the ionospheric flight environment. At present, ISS is equipped with two plasma contactors designed to control ISS FP to within 40 volts of the ambient F2 plasma. The negative-polarity grounding scheme utilized in the ISS 160V power system leads, naturally, to negative values of ISS FP. A negative ISS structural FP leads to application of electrostatic fields across the dielectrics that separate conducting structure from the ambient F2 plasma, thereby enabling dielectric breakdown and arcing. Degradation of some thermal control coatings and noise in electrical systems can result. Continued review and evaluation of the putative charging hazards, as required by the ISS Program Office, revealed that ISS charging could produce a risk of electric shock to the ISS crew during extravehicular activity. ISS charging risks are being evaluated in ongoing ISS charging measurements and analysis campaigns. The results of ISS charging measurements are combined with a recently developed detailed model of the ISS charging process and an extensive analysis of historical ionospheric variability data, to assess ISS charging risks using Probabilistic Risk Assessment (PRA) methods. The PRA results (estimated frequency of occurrence and severity of the charging hazards) are then used to select the hazard control strategy that provides the best overall safety and mission success environment for ISS and the ISS crew. This paper presents: 1) a summary of ISS spacecraft charging analysis, measurements, and observations made to date; 2) plans for future ISS spacecraft charging measurement campaigns; and 3) a detailed discussion of the PRA strategy used to assess ISS spacecraft charging risks and select charging hazard control strategies.
Assessment and Control of International Space Station Spacecraft Charging Risks
NASA Astrophysics Data System (ADS)
Koontz, S.; Edeen, M.; Spetch, W.; Dalton, P.; Keeping, T.; Minow, J.
2003-12-01
Electrical interactions between the F2 region ionospheric plasma and the 160V photovoltaic (PV) electrical power system on the International Space Station (ISS) can produce floating potentials (FP) on ISS conducting structure of greater magnitude than are usually observed on spacecraft in low-Earth orbit. Flight through the geomagnetic field also causes magnetic induction charging of ISS conducting structure. Charging processes resulting from interaction of ISS with auroral electrons may also contribute to charging, albeit rarely. The magnitude and frequency of occurrence of possibly hazardous charging events depend on the ISS assembly stage (six more 160V PV arrays will be added to ISS), ISS flight configuration, ISS position (latitude and longitude), and the natural variability in the ionospheric flight environment. At present, ISS is equipped with two plasma contactors designed to control ISS FP to within 40 volts of the ambient F2 plasma. The negative-polarity grounding scheme utilized in the ISS 160V power system leads, naturally, to negative values of ISS FP. A negative ISS structural FP leads to application of electrostatic fields across the dielectrics that separate conducting structure from the ambient F2 plasma, thereby enabling dielectric breakdown and arcing. Degradation of some thermal control coatings and noise in electrical systems can result. Continued review and evaluation of the putative charging hazards, as required by the ISS Program Office, revealed that ISS charging could produce a risk of electric shock to the ISS crew during extravehicular activity. ISS charging risks are being evaluated in ongoing ISS charging measurements and analysis campaigns. The results of ISS charging measurements are combined with a recently developed detailed model of the ISS charging process and an extensive analysis of historical ionospheric variability data, to assess ISS charging risks using Probabilistic Risk Assessment (PRA) methods. The PRA results (estimated frequency of occurrence and severity of the charging hazards) are then used to select the hazard control strategy that provides the best overall safety and mission success environment for ISS and the ISS crew. This paper presents: 1) a summary of ISS spacecraft charging analysis, measurements, and observations made to date; 2) plans for future ISS spacecraft charging measurement campaigns; and 3) a detailed discussion of the PRA strategy used to assess ISS spacecraft charging risks and select charging hazard control strategies.
Assessing Natural Hazard Vulnerability Through Marmara Region Using GIS
NASA Astrophysics Data System (ADS)
Sabuncu, A.; Garagon Dogru, A.; Ozener, H.
2013-12-01
Natural hazards are natural phenomena occurring in the Earth's system that include geological and meteorological events such as earthquakes, floods, landslides, droughts, fires and tsunamis. Metropolitan cities are vulnerable to natural hazards due to their population densities, industrial facilities and properties. The urban layout of megacities is complex, since industrial facilities are interspersed with residential areas. The Marmara region, in north-western Turkey, has suffered from natural hazards (earthquakes, floods, etc.) for years. After the 1999 Kocaeli and Duzce earthquakes and the 2009 Istanbul flash floods, dramatic numbers of casualties and economic losses were reported by the authorities. Geographic information systems (GIS) have substantial capacity to support natural disaster management, as these systems provide more efficient and reliable analysis and evaluation of the data, as well as convenient and better solutions for decision making before, during and after natural hazards. Earth science data and socio-economic data can be integrated into a GIS as different layers. Additionally, satellite data are used to understand the changes before and after natural hazards. GIS is powerful software for the combination of different types of digital data. A natural hazard database for the Marmara region provides all the different types of digital data to users. Proper data collection, processing and analysis are critical to evaluate and identify hazards. The natural hazard database allows users to monitor, analyze and query past and recent disasters in the Marmara Region. The long-term aim of this study is to develop a geodatabase and identify the natural hazard vulnerabilities of the metropolitan cities.
Development of a Probabilistic Tsunami Hazard Analysis in Japan
DOE Office of Scientific and Technical Information (OSTI.GOV)
Toshiaki Sakai; Tomoyoshi Takeda; Hiroshi Soraoka
2006-07-01
It is meaningful for tsunami assessment to evaluate phenomena beyond the design basis, as is done in seismic design, because once a design basis tsunami height is set, the actual tsunami height may still exceed it due to uncertainties regarding tsunami phenomena. Probabilistic tsunami risk assessment consists of estimating the tsunami hazard and the fragility of structures and executing system analysis. In this report, we apply a method for probabilistic tsunami hazard analysis (PTHA). We introduce a logic tree approach to estimate tsunami hazard curves (relationships between tsunami height and probability of exceedance) and present an example for Japan. Examples of tsunami hazard curves are illustrated, and uncertainty in the tsunami hazard is displayed by 5-, 16-, 50-, 84- and 95-percentile and mean hazard curves. The results of PTHA will be used for quantitative assessment of the tsunami risk for important facilities located in coastal areas. Tsunami hazard curves are reasonable input data for structure and system analysis. However, the evaluation method for estimating the fragility of structures and the procedure for system analysis are still being developed. (authors)
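The logic-tree aggregation step can be sketched in a few lines: each branch contributes a hazard curve (annual probability of exceedance versus tsunami height) and a weight, from which mean and percentile curves follow. All curves and weights below are hypothetical, not values from the study.

```python
import numpy as np

heights = np.array([2.0, 4.0, 6.0, 8.0])             # tsunami height (m)
branch_curves = np.array([                            # P(exceedance) per branch
    [1e-2, 2e-3, 3e-4, 4e-5],
    [2e-2, 5e-3, 8e-4, 1e-4],
    [5e-3, 8e-4, 9e-5, 1e-5],
])
weights = np.array([0.5, 0.3, 0.2])                   # branch weights, sum to 1

mean_curve = weights @ branch_curves                  # weighted mean hazard curve

def weighted_percentile(values, weights, q):
    """q-th weighted percentile (q in [0, 100]) over logic-tree branches."""
    order = np.argsort(values)
    cum = np.cumsum(weights[order])
    return values[order][np.searchsorted(cum, q / 100.0)]

p16 = [weighted_percentile(branch_curves[:, i], weights, 16) for i in range(len(heights))]
p84 = [weighted_percentile(branch_curves[:, i], weights, 84) for i in range(len(heights))]
print("mean:", mean_curve)
print("16th:", p16)
print("84th:", p84)
```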
A critical analysis of hazard resilience measures within sustainability assessment frameworks
DOE Office of Scientific and Technical Information (OSTI.GOV)
Matthews, Elizabeth C., E-mail: echiso1@lsu.edu; Sattler, Meredith, E-mail: msattler@lsu.edu; Friedland, Carol J., E-mail: friedland@lsu.edu
Today, numerous sustainability assessment frameworks (SAFs) exist to guide designers in achieving sustainable performance in the design of structures and communities. SAFs are beneficial in educating users and are useful tools for incorporating sustainability strategies into planning, design, and construction; however, there is currently a substantial gap in the ability of existing SAFs to incorporate hazard resistance and hazard mitigation in the broader context of sustainable design. This paper analyzes the incorporation of hazard resistant design and hazard mitigation strategies within SAFs via a multi-level analysis of eleven SAFs. The SAFs analyzed range in scale of application (i.e., building, site, community). Three levels of analysis are presented: (1) macro-level analysis comparing the number of measures strictly addressing resilience versus sustainability, (2) meso-level analysis of the coverage of types of hazards within SAFs (e.g., flood, fire), and (3) micro-level analysis of SAF measures connected to flood-related hazard resilience. The results demonstrate that hazard resistance and hazard mitigation do not figure prominently in the intent of SAFs and that weaknesses in resilience coverage exist that have the potential to lead to the design of structures and communities that are still highly vulnerable to the impacts of extreme events. Highlights: • Sustainability assessment frameworks (SAFs) were analyzed for resilience coverage. • Hazard resistance and mitigation do not figure prominently in the intent of SAFs. • Approximately 75% of SAFs analyzed address three or fewer hazards. • Lack of economic measures within SAFs could impact resilience and sustainability. • Resilience measures for flood hazards are not consistently included in SAFs.
NASA Astrophysics Data System (ADS)
Ospennikov, E. N.; Hilimonjuk, V. Z.
2009-04-01
Economic development of northern oil- and gas-bearing regions, even using the shift method, is accompanied by construction of linear transport systems, including automobile roads and railways. Construction of such roads involves risks caused by a whole complex of hazards defined by the environmental features of the region, including a flat surface with extensive marshiness, development of peat and fine-grained, easily eroded loose sediments, as well as complicated geocryological conditions. The geocryological conditions of the Western Siberia area are highly heterogeneous. This implies strong variability in the distribution of permafrost soils, their thickness and continuity, the depths of seasonal thawing and frost penetration, and also in the development of geocryological processes and phenomena. Thermokarst, thermal erosion and thermal abrasion develop under natural conditions. These processes are caused by partial degradation of permafrost. Frost heave also occurs during seasonal or long-term freezing of soils. Disturbance of the environment, which always accompanies road construction, causes reorganization of geocryological systems and is accompanied by the occurrence of dangerous geocryological processes, such as technogenic thermokarst (with the formation of various negative relief forms, from fine subsidence up to small and medium-sized lakes), frost heave (with the formation of frost mounds up to 0.5-1.5 meters high and more), and thermal erosion (gullies and ravines with volumes of removed material up to several thousand cubic meters). Development of these destructive processes within road strips leads to emergencies owing to deformation and destruction of the earthen roadbed, and to failure of natural tundra and forest-tundra ecosystems. Methodical approaches based on typification and zoning of the area by its environmental complex have been developed for the estimation of geocryological hazards in linear construction. The estimation was carried out on the basis of an analysis including the features of geocryological process development under natural conditions and for certain types of geocryological conditions; the character of the disturbances caused by construction and operation of roads; and the hazard severity of destructive processes for particular geotechnical road systems. As a result, three categories of territory have been specified on the basis of hazard severity: very complex, complex and simple. Very complex territories are characterized by mean annual ground temperatures close to 0 °C, the presence of massive ground ice and ice wedges, widespread ice-rich ground, and active modern development of thermokarst, thermal erosion and frost heave processes. Simple territories are distinguished by low mean annual ground temperatures (below -4 °C), the absence of massive ground ice and weak development of geocryological processes. All other territories, representing a potential hazard under adverse environmental change, are classified as complex territories.
Human error mitigation initiative (HEMI) : summary report.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stevens, Susan M.; Ramos, M. Victoria; Wenner, Caren A.
2004-11-01
Despite continuing efforts to apply existing hazard analysis methods and comply with requirements, human errors persist across the nuclear weapons complex. Due to a number of factors, current retroactive and proactive methods to understand and minimize human error are highly subjective, inconsistent in numerous dimensions, and difficult to characterize as thorough. An alternative, proposed method begins by leveraging historical data to understand what the systemic issues are and where resources need to be brought to bear proactively to minimize the risk of future occurrences. An illustrative analysis was performed using existing incident databases specific to Pantex weapons operations, indicating systemic issues associated with operating procedures that undergo notably less development rigor relative to other task elements such as tooling and process flow. Future recommended steps to improve the objectivity, consistency, and thoroughness of hazard analysis and mitigation were delineated.
DOE Office of Scientific and Technical Information (OSTI.GOV)
MITCHELL,GERRY W.; LONGLEY,SUSAN W.; PHILBIN,JEFFREY S.
This Safety Analysis Report (SAR) is prepared in compliance with the requirements of DOE Order 5480.23, Nuclear Safety Analysis Reports, and has been written to the format and content guide of DOE-STD-3009-94, Preparation Guide for U.S. Department of Energy Nonreactor Nuclear Safety Analysis Reports. The Hot Cell Facility (HCF) is a Hazard Category 2 nonreactor nuclear facility and is operated by Sandia National Laboratories (SNL) for the Department of Energy. This SAR provides a description of the HCF and its operations, and an assessment of the hazards and potential accidents which may occur in the facility. The potential consequences and likelihood of these accidents are analyzed and described. Using the process and criteria described in DOE-STD-3009-94, safety-related structures, systems and components (SSCs) are identified, and the important safety functions of each SSC are described. Additionally, information describing the safety management programs at SNL is presented in ancillary chapters of the SAR.
Khan, F I; Iqbal, A; Ramesh, N; Abbasi, S A
2001-10-12
As is conventionally done, strategies for incorporating accident-prevention measures in any hazardous chemical process industry are developed on the basis of input from risk assessment. However, the two steps, risk assessment and hazard reduction (or safety) measures, are not linked interactively in the existing methodologies. This prevents a quantitative assessment of the impacts of safety measures on risk control. We have made an attempt to develop a methodology in which risk assessment steps are interactively linked with the implementation of safety measures. The resultant system tells us the extent to which risk is reduced by each successive safety measure. It also tells us, based on sophisticated maximum credible accident analysis (MCAA) and probabilistic fault tree analysis (PFTA), whether a given unit can ever be made 'safe'. The application of the methodology has been illustrated with a case study.
Experimental demonstration of a two-phase population extinction hazard
Drake, John M.; Shapiro, Jeff; Griffen, Blaine D.
2011-01-01
Population extinction is a fundamental biological process with applications to ecology, epidemiology, immunology, conservation biology and genetics. Although a monotonic relationship between initial population size and mean extinction time is predicted by virtually all theoretical models, attempts at empirical demonstration have been equivocal. We suggest that this anomaly is best explained with reference to the transient properties of ensembles of populations. Specifically, we submit that under experimental conditions, many populations escape their initially vulnerable state to reach quasi-stationarity, where the effects of initial conditions are erased. Thus, populations initialized far from quasi-stationarity may be exposed to a two-phase extinction hazard. An empirical prediction of this theory is that a Cox proportional hazards regression model fitted to the observed survival time distribution of a group of populations will be shown to violate the proportional hazards assumption early in the experiment, but not at later times. We report results of two experiments with the cladoceran zooplankton Daphnia magna designed to exhibit this phenomenon. In one experiment, habitat size was also varied. Statistical analysis showed that in one of these experiments a transition occurred, so that very early in the experiment there existed a transient phase during which the extinction hazard was primarily owing to the initial population size, and that this was gradually replaced by a more stable quasi-stationary phase. In the second experiment, only habitat size unambiguously displayed an effect. Analysis of data pooled from both experiments suggests that the overall extinction time distribution in this system results from the mixture of extinctions during the initial rapid phase, during which the effects of initial population size can be considerable, and a longer quasi-stationary phase, during which only habitat size has an effect. These are, to our knowledge, the first results demonstrating a two-phase population extinction process. PMID:21429907
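The paper's empirical prediction, that a fitted Cox model violates the proportional hazards assumption early but not late, corresponds to a standard diagnostic. Below is a sketch using the lifelines library on synthetic data (assuming lifelines is available; the covariates and effect sizes are made up, not the Daphnia data).

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter
from lifelines.statistics import proportional_hazard_test

rng = np.random.default_rng(2)
n = 120
df = pd.DataFrame({
    "init_size": rng.integers(2, 20, n),      # hypothetical initial population size
    "habitat": rng.choice([0.5, 1.0], n),     # hypothetical habitat size (L)
})
# hypothetical survival times; extinction hazard depends on both covariates
df["T"] = rng.exponential(scale=5 + 0.8 * df["init_size"] * df["habitat"])
df["E"] = 1                                   # all extinctions observed in this toy set

cph = CoxPHFitter().fit(df, duration_col="T", event_col="E")
result = proportional_hazard_test(cph, df, time_transform="rank")
result.print_summary()   # a low p-value flags a proportional-hazards violation
```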
DOE Office of Scientific and Technical Information (OSTI.GOV)
PECH, S.H.
This report describes the methodology used in conducting the K Basins Hazard Analysis, which provides the foundation for the K Basins Final Safety Analysis Report. This hazard analysis was performed in accordance with guidance provided by DOE-STD-3009-94, Preparation Guide for U. S. Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports and implements the requirements of DOE Order 5480.23, Nuclear Safety Analysis Report.
Bates, Matthew E; Keisler, Jeffrey M; Zussblatt, Niels P; Plourde, Kenton J; Wender, Ben A; Linkov, Igor
2016-02-01
Risk research for nanomaterials is currently prioritized by means of expert workshops and other deliberative processes. However, analytical techniques that quantify and compare alternative research investments are increasingly recommended. Here, we apply value of information and portfolio decision analysis, methods commonly applied in financial and operations management, to prioritize risk research for multiwalled carbon nanotubes and nanoparticulate silver and titanium dioxide. We modify the widely accepted CB Nanotool hazard evaluation framework, which combines nano- and bulk-material properties into a hazard score, to operate probabilistically with uncertain inputs. The literature is reviewed to develop uncertain estimates for each input parameter, and a Monte Carlo simulation is applied to assess how different research strategies can improve hazard classification. The relative cost of each research experiment is elicited from experts, which enables identification of efficient research portfolios: combinations of experiments that lead to the greatest improvement in hazard classification at the lowest cost. Nanoparticle shape, diameter, solubility and surface reactivity were most frequently identified within efficient portfolios in our results.
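The probabilistic scoring idea is easy to sketch: sample the uncertain inputs, combine them into a score, and read off the distribution over hazard bands; a research investment that narrows an input's range re-runs the simulation and shifts that distribution. The weights, distributions and band thresholds below are hypothetical stand-ins, not the actual CB Nanotool scheme.

```python
import numpy as np

rng = np.random.default_rng(3)
N = 10_000

# uncertain inputs on a 0-10 severity scale (triangular: low, mode, high)
solubility = rng.triangular(2, 4, 8, N)
reactivity = rng.triangular(1, 5, 9, N)
diameter   = rng.triangular(3, 6, 10, N)

# hypothetical weighted combination into a single hazard score
score = 0.4 * solubility + 0.35 * reactivity + 0.25 * diameter
bands = np.digitize(score, bins=[3.0, 5.0, 7.0])   # four hazard bands

for b, label in enumerate(["low", "medium", "high", "very high"]):
    print(f"{label:>9}: {np.mean(bands == b):.1%}")
```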
NASA Technical Reports Server (NTRS)
Xu, Xidong; Ulrey, Mike L.; Brown, John A.; Mast, James; Lapis, Mary B.
2013-01-01
NextGen is a complex socio-technical system and, in many ways, it is expected to be more complex than the current system. It is vital to assess the safety impact of the NextGen elements (technologies, systems, and procedures) in a rigorous and systematic way and to ensure that they do not compromise safety. In this study, the NextGen elements in the form of Operational Improvements (OIs), Enablers, Research Activities, Development Activities, and Policy Issues were identified. The overall hazard situation in NextGen was outlined; a high-level hazard analysis was conducted with respect to multiple elements in a representative NextGen OI known as OI-0349 (Automation Support for Separation Management); and the hazards resulting from the highly dynamic complexity involved in an OI-0349 scenario were illustrated. A selected but representative set of the existing safety methods, tools, processes, and regulations was then reviewed and analyzed regarding whether they are sufficient to assess safety in the elements of that OI and ensure that safety will not be compromised and whether they might incur intolerably high costs.
Fault Tree Analysis: A Bibliography
NASA Technical Reports Server (NTRS)
2000-01-01
Fault tree analysis is a top-down approach to the identification of process hazards. It is one of the best methods for systematically identifying and graphically displaying the many ways something can go wrong. This bibliography references 266 documents in the NASA STI Database that contain the major concepts, fault tree analysis, risk and probability theory, in the basic index or major subject terms. An abstract is included with most citations, followed by the applicable subject terms.
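Numerically, a fault tree bottoms out in simple gate algebra: with independent basic events, an AND gate multiplies probabilities and an OR gate combines them as 1 - prod(1 - p). A minimal sketch with hypothetical event probabilities:

```python
from math import prod

def and_gate(ps):
    """All inputs must fail (independent events)."""
    return prod(ps)

def or_gate(ps):
    """At least one input fails (independent events)."""
    return 1 - prod(1 - p for p in ps)

# hypothetical basic-event probabilities per demand
pump_fails   = 1e-3
valve_sticks = 5e-4
sensor_fails = 2e-3
operator_err = 1e-2

# top event: loss of cooling = (pump fails OR valve sticks)
#            AND (sensor fails OR operator misses the alarm)
loss_of_cooling = and_gate([
    or_gate([pump_fails, valve_sticks]),
    or_gate([sensor_fails, operator_err]),
])
print(f"P(top event) ~= {loss_of_cooling:.2e}")
```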
DOE Office of Scientific and Technical Information (OSTI.GOV)
Coppersmith , Kevin J.; Bommer, Julian J.; Bryce, Robert W.
Under the sponsorship of the US Department of Energy (DOE) and the electric utility Energy Northwest, the Pacific Northwest National Laboratory (PNNL) is conducting a probabilistic seismic hazard analysis (PSHA) within the framework of a SSHAC Level 3 procedure (Senior Seismic Hazard Analysis Committee; Budnitz et al., 1997). Specifically, the project is being conducted following the guidelines and requirements specified in NUREG-2117 (USNRC, 2012b) and consistent with the approach given in the American Nuclear Standard ANSI/ANS-2.29-2008, Probabilistic Seismic Hazard Analysis. The collaboration between DOE and Energy Northwest arises from the needs of both organizations for an accepted PSHA with high levels of regulatory assurance that can be used for the design and safety evaluation of nuclear facilities. DOE committed to this study after performing a ten-year review of the existing PSHA, as required by DOE Order 420.1C. The study will also be used by Energy Northwest as a basis for fulfilling the NRC's 10CFR50.54(f) requirement that the western US nuclear power plants conduct PSHAs in conformance with SSHAC Level 3 procedures. The study was planned and is being carried out in conjunction with a project Work Plan, which identifies the purpose of the study, the roles and responsibilities of all participants, tasks and their associated schedules, Quality Assurance (QA) requirements, and project deliverables. New data collection and analysis activities are being conducted as a means of reducing the uncertainties in key inputs to the PSHA. It is anticipated that the results of the study will provide inputs to the site response analyses at multiple nuclear facility sites within the Hanford Site and at the Columbia Generating Station.
Analysis of hazardous substances released during CFRP laser processing
NASA Astrophysics Data System (ADS)
Hustedt, Michael; Walter, Juergen; Bluemel, Sven; Jaeschke, Peter; Kaierle, Stefan
2017-02-01
Due to their outstanding mechanical properties, in particular their high specific strength parallel to the carbon fibers, carbon fiber reinforced plastics (CFRP) have a high potential regarding resource-efficient lightweight construction. Consequently, these composite materials are increasingly finding application in important industrial branches such as the aircraft, automotive and wind energy industries. However, the processing of these materials is highly demanding. On the one hand, mechanical processing methods such as milling or drilling are sometimes rather slow, and they are connected with notable tool wear. On the other hand, thermal processing methods are critical as the two components, matrix and reinforcement, have widely differing thermophysical properties, possibly leading to damage of the composite structure in terms of pores or delamination. An emerging innovative method for processing of CFRP materials is laser technology. As a principally thermal method, laser processing is connected with the release of potentially hazardous gaseous and particulate substances. Detailed knowledge of these process emissions is the basis for ensuring the protection of man and the environment, according to the existing legal regulations. This knowledge will help to realize adequate protective measures and thus strengthen the development of CFRP laser processing. In this work, selected measurement methods and results of the analysis of the exhaust air and the air at the workplace during different laser processes with CFRP materials are presented. The investigations have been performed in the course of different cooperative projects, funded by the German Federal Ministry of Education and Research (BMBF) under the funding initiative "Photonic Processes and Tools for Resource-Efficient Lightweight Structures".
Screening Methodologies to Support Risk and Technology Reviews (RTR): A Case Study Analysis
The Clean Air Act establishes a two-stage regulatory process for addressing emissions of hazardous air pollutants (HAPs) from stationary sources. In the first stage, the Act requires the EPA to develop technology-based standards for categories of industrial sources. We have lar...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-12-03
... reduction and Hazard Analysis and Critical Control Point (HACCP) Systems requirements because OMB approval... February 28, 2014. FSIS has established requirements applicable to meat and poultry establishments designed... E. coli by slaughter establishments to verify the adequacy of the establishment's process controls for the...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-11-20
... Department of Health and Human Services, Food and Drug Administration [Docket No. FDA-2013-N-1427...] Critical Control Point Procedures for the Safe and Sanitary Processing and Importing of Juice. AGENCY: Food and Drug Administration, HHS. ACTION: Notice. SUMMARY: The Food and Drug Administration (FDA) is...
Analysis of North Atlantic Tropical Cyclone Intensity Change Using Data Mining
ERIC Educational Resources Information Center
Tang, Jiang
2010-01-01
Tropical cyclones (TC), especially when their intensity reaches hurricane scale, can become a costly natural hazard. Accurate prediction of tropical cyclone intensity is very difficult because of inadequate observations on TC structures, poor understanding of physical processes, coarse model resolution and inaccurate initial conditions, etc. This…
Hazard assessment of substances produced from the accidental heating of chemical compounds.
Lunghi, A; Gigante, L; Cardillo, P; Stefanoni, V; Pulga, G; Rota, R
2004-12-10
Accidental events in the process industries can affect not only the staff working inside, but also the environment and people living next to the factory. For this reason, regulation is imposed by the European Community to prevent accidents that could represent a risk for the population and the environment. In particular, Directive 96/82/CE, the so-called 'Seveso II' directive, requires a risk analysis that also covers the hazardous materials generated in accidental events. It is therefore necessary to develop simple and economical procedures to foresee the hazardous materials that can be produced in major accidents, among which the accidental heating of a chemical due to a fire or a runaway reaction is one of the most frequent. The procedure proposed in this work is based on the evolved gas analysis methodology, which consists of coupling two instruments: a thermogravimetric analyzer or a flash pyrolyzer, employed to simulate accident conditions, and an FTIR spectrometer, used to detect the evolved gas composition. More than 40 materials have been examined under various accident scenarios and the obtained data have been statistically analyzed in order to identify meaningful correlations between the presence of a chemical group in the molecule and the presence of a given hazardous species in the fumes produced.
The Role of Deposition in Limiting the Hazard Extent of Dense-Gas Plumes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dillon, M B
2008-01-29
Accidents involving release of large (multi-ton) quantities of toxic industrial chemicals often yield far fewer fatalities and casualties than standard, widely-used assessment and emergency response models predict. While recent work has suggested that models should incorporate the protection provided by buildings, more refined health effect methodologies, and more detailed consideration of the release process, investigations into the role of deposition onto outdoor surfaces have been lacking. In this paper, we examine the conditions under which dry deposition may significantly reduce the extent of the downwind hazard zone. We provide theoretical arguments that in congested environments (e.g., suburbs, forests), deposition to vertical surfaces (such as building walls) may play a significant role in reducing the hazard zone extent, particularly under low-wind, stable atmospheric conditions, which are often considered to be the worst-case scenario for these types of releases. Our analysis suggests that in these urban or suburban environments, the amount of toxic chemical lost to the earth's surface is typically a small fraction of overall depositional losses. For isothermal gases such as chlorine, the degree to which the chemicals stick to (or react with) surfaces (i.e., surface resistance) is demonstrated to be a key parameter controlling hazard extent (the maximum distance from the release at which hazards to human health are expected). This analysis does not consider the depositional effects associated with particulate matter or gases that undergo significant thermal change in the atmosphere. While no controlled experiments were available to validate our hypothesis, our analysis results are qualitatively consistent with the observed downwind extent of vegetation damage in two chlorine accidents.
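The size of the effect can be gauged with a back-of-the-envelope box model (a simplification, not the paper's analysis, and with hypothetical parameter values): the airborne fraction remaining after travel distance x is exp(-vd * x / (u * H)), where the deposition velocity vd is set by the surface resistance the paper identifies as the key parameter, u is the wind speed, and H the mixing height.

```python
import math

def fraction_remaining(x_m, vd=0.005, u=2.0, H=50.0):
    """Airborne fraction left after x_m meters of travel in a simple
    source-depletion box model. vd (m/s), u (m/s) and H (m) are
    hypothetical illustrative values."""
    return math.exp(-vd * x_m / (u * H))

for x in (500, 2_000, 10_000):
    print(f"{x:>6} m: {fraction_remaining(x):.2f} of release still airborne")
```

In a congested environment, deposition to walls and vegetation can be folded into a larger effective vd, which is one way to read the paper's argument that such settings shrink the hazard zone.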
Oliveira, Marcos L S; da Boit, Kátia; Pacheco, Fernanda; Teixeira, Elba C; Schneider, Ismael L; Crissien, Tito J; Pinto, Diana C; Oyaga, Rafael M; Silva, Luis F O
2018-01-01
Pollution from the hazardous elements and persistent organic compounds associated with coal fires is a major environmental concern because of its toxicity, persistence, and potential risk to human health. Coal mining activities are growing in the state of Santa Catarina, Brazil, yet their collateral impacts on health and the economy remain to be analyzed. The environment also endures collateral damage, as the waste materials directly affect the coal by-products used in civil construction. This study aimed to establish the relationships between the composition, morphology, and structural characteristics of ultrafine particles emitted by coal mine fires. In Brazil, the Al-Ca-Fe-Mg-Si coal spheres produced by self-combustion are rich in chalcophile elements (As, Cd, Cu, Hg, Pb, Sb, Se, Sn, and Zn), lithophile elements (Ce, Hf, In, La, Th, and U), and siderophile elements (Co, Cr, Mo, Fe, Ni, and V). The relationship between nanomineralogy and the production of hazardous elements, as analyzed by advanced methods for the geochemical analysis of different materials, was also delineated. The information obtained by the mineral-substance analysis may improve understanding of coal-fire development and help assess the behavior of particular coals in different combustion processes. Copyright © 2017 Elsevier Inc. All rights reserved.
Age at introduction of ultra-processed food among preschool children attending day-care centers.
Longo-Silva, Giovana; Silveira, Jonas Augusto C; Menezes, Rísia Cristina Egito de; Toloni, Maysa Helena de Aguiar
To identify the age of introduction of ultra-processed food and its associated factors among preschool children. Cross-sectional study carried out from March to June 2014 with 359 preschool children aged 17 to 63 months attending day-care centers. Time until ultra-processed food introduction (outcome variable) was described by Kaplan-Meier analysis, and the log-rank test was used to compare the survival functions of independent variables. Factors associated with ultra-processed food introduction were investigated using the multivariate Cox proportional hazards model. The results are shown as hazard ratios with their respective 95% confidence intervals. The median time until ultra-processed food introduction was six months. Between the 3rd and 6th months there is a significant increase in the probability of introducing ultra-processed food into the children's diet; while the probability in the 3rd month varies from 0.15 to 0.25, at six months the variation ranges from 0.6 to 1.0. The final Cox proportional hazards model showed that unplanned pregnancy (1.32 [1.05-1.65]), absence of prenatal care (2.50 [1.02-6.16]), and income >2 minimum wages (1.50 [1.09-2.06]) were independent risk factors for the introduction of ultra-processed food. Up to the 6th month of life, approximately 75% of preschool children had received one or more ultra-processed foods in their diet. In addition, it was observed that the poorest families, as well as unfavorable prenatal factors, were associated with early introduction of ultra-processed food. Copyright © 2017. Published by Elsevier Editora Ltda.
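The survival-analysis pipeline this abstract describes (Kaplan-Meier curve, log-rank test, Cox model) can be sketched in a few lines with the lifelines Python package. The toy data frame below is invented to mirror the study design; the column names and values are assumptions, not the study's data.

```python
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter
from lifelines.statistics import logrank_test

# Hypothetical toy data: 'months' is age at ultra-processed food introduction
# (or age at last follow-up if censored), 'introduced' flags an observed
# introduction, 'unplanned' is one candidate covariate.
df = pd.DataFrame({
    "months":     [3, 4, 5, 6, 6, 7, 9, 12, 18, 24],
    "introduced": [1, 1, 1, 1, 1, 1, 1,  0,  1,  0],
    "unplanned":  [1, 1, 0, 1, 0, 1, 0,  0,  1,  0],
})

# Kaplan-Meier estimate of time until introduction
kmf = KaplanMeierFitter().fit(df["months"], df["introduced"])
print("median age at introduction:", kmf.median_survival_time_)

# Log-rank test comparing the survival functions of the two exposure groups
g = df["unplanned"] == 1
res = logrank_test(df.loc[g, "months"], df.loc[~g, "months"],
                   event_observed_A=df.loc[g, "introduced"],
                   event_observed_B=df.loc[~g, "introduced"])
print("log-rank p =", res.p_value)

# Cox model: exp(coef) is the hazard ratio of the kind reported in the abstract
cph = CoxPHFitter().fit(df, duration_col="months", event_col="introduced")
cph.print_summary()
```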
Landslide hazard assessment: recent trends and techniques.
Pardeshi, Sudhakar D; Autade, Sumant E; Pardeshi, Suchitra S
2013-01-01
Landslide hazard assessment is an important step towards landslide hazard and risk management. There are several methods of Landslide Hazard Zonation (LHZ), viz. heuristic, semi-quantitative, quantitative, probabilistic and multi-criteria decision-making processes. However, no single method is universally accepted for effective assessment of landslide hazards. In recent years, several attempts have been made to apply different LHZ methods and to compare the results in order to find the best-suited model. This paper reviews research on landslide hazard mapping published in recent years. Advanced multivariate techniques have proved effective in the spatial prediction of landslides with a high degree of accuracy. Physical process-based models also perform well in LHZ mapping, even in areas with a poor database. Multi-criteria decision-making approaches likewise play a significant role in determining the relative importance of landslide causative factors in the slope instability process. Remote sensing and Geographical Information Systems (GIS) are powerful tools for assessing landslide hazards and have been used extensively in landslide research over the last decade. Aerial photographs and high-resolution satellite data are useful in detecting, mapping and monitoring landslide processes. GIS-based LHZ models help not only to map and monitor landslides but also to predict future slope failures. Advancements in geospatial technologies have opened the doors for detailed and accurate assessment of landslide hazards.
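As one concrete instance of the quantitative multivariate techniques the review covers, the sketch below fits a logistic-regression susceptibility model to per-cell causative factors. The factor names, coefficients, and landslide inventory are synthetic; a real application would use a mapped inventory and terrain derivatives.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Synthetic per-cell causative factors (values are assumptions for illustration)
rng = np.random.default_rng(0)
n = 500
slope    = rng.uniform(0, 60, n)       # slope angle (degrees)
wetness  = rng.uniform(0, 20, n)       # topographic wetness index
road_dst = rng.uniform(0, 2000, n)     # distance to roads (m)
X = np.column_stack([slope, wetness, road_dst])

# Synthetic landslide inventory: steeper, wetter cells fail more often
p = 1 / (1 + np.exp(-(0.08 * slope + 0.1 * wetness - 0.002 * road_dst - 4)))
y = rng.random(n) < p

model = LogisticRegression(max_iter=1000).fit(X, y)
susceptibility = model.predict_proba(X)[:, 1]   # per-cell hazard score in [0, 1]
print("mean susceptibility:", susceptibility.mean().round(3))
```

The per-cell scores would then be reclassified into zonation categories (e.g. low to very high) on a GIS raster.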
44 CFR 201.4 - Standard State Mitigation Plans.
Code of Federal Regulations, 2011 CFR
2011-10-01
... reduce risks from natural hazards and serves as a guide for State decision makers as they commit resources to reducing the effects of natural hazards. (b) Planning process. An effective planning process is... risk assessments must characterize and analyze natural hazards and risks to provide a statewide...
The Development of a Tri-Service Notification System for Type 1 Medical Materiel Complaints.
1992-09-01
Hazardous Food and Nonprescription Drug Recall System ... Chapter Summary ... III. Methodology ... examination of an existing DOD notification process for hazardous food and nonprescription drugs. It must be emphasized that the process being investigated in ... notification process for defective medical materiel has not been accomplished. Hazardous Food and Nonprescription Drug Recall System: In examining the DoD ...
Risk analysis of a biomass combustion process using MOSAR and FMEA methods.
Thivel, P-X; Bultel, Y; Delpech, F
2008-02-28
Thermal and chemical conversion processes that convert sewage sludge, pasty waste and other pre-processed waste into energy are increasingly common, for economic and ecological reasons. Fluidized bed combustion is currently one of the most promising methods of energy conversion, since it burns biomass very efficiently and produces only very small quantities of sulphur and nitrogen oxides. The hazards associated with biomass combustion processes are fire, explosion and poisoning from the combustion gases (CO, etc.). The risk analysis presented in this paper uses the MADS-MOSAR methodology, applied to a semi-industrial pilot scheme comprising a fluidization column, a conventional cyclone, two natural gas burners and a continuous supply of biomass. The methodology uses a generic approach, with an initial macroscopic stage in which hazard sources are identified, scenarios for undesired events are recognized and ranked using a Severity x Probability grid, and safety barriers are suggested. A microscopic stage then analyzes in detail the major risks identified during the first stage. This analysis may use various tools, such as HAZOP or FMEA; our analysis is based on FMEA. Using MOSAR, we were able to identify five subsystems: the reactor (fluidized bed and centrifuge), the fuel and biomass supply lines, the operator and the environment. When we drew up scenarios based on these subsystems, we found that malfunction of the gas supply burners was a common trigger in many scenarios. Our subsequent microscopic analysis therefore focused on the burners, looking at the ways they failed and at the effects and criticality of those failures (FMEA). We were thus able to identify a number of critical factors, such as the incoming gas lines and the ignition electrode.
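A minimal sketch of the kind of Severity x Probability ranking used in the macroscopic MOSAR stage follows; the scenario names, 1-5 scales, and criticality threshold are assumptions for illustration, not values from the paper.

```python
# Illustrative Severity x Probability grid for ranking undesired-event
# scenarios; all labels, scales and the threshold below are assumed.
scenarios = {
    "burner gas supply malfunction": (4, 3),   # (severity 1-5, probability 1-5)
    "biomass feed line blockage":    (2, 4),
    "CO release from combustion":    (5, 2),
}

# Rank scenarios by criticality = severity * probability
ranked = sorted(scenarios.items(), key=lambda kv: kv[1][0] * kv[1][1], reverse=True)
for name, (sev, prob) in ranked:
    crit = sev * prob
    flag = "needs safety barrier" if crit >= 10 else "acceptable"
    print(f"{name}: S={sev}, P={prob}, criticality={crit} -> {flag}")
```

Scenarios above the threshold are the ones carried forward to the detailed microscopic stage (here, FMEA).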
Guo, Xuezhen; Claassen, G D H; Oude Lansink, A G J M; Saatkamp, H W
2014-06-01
Economic analysis of hazard surveillance in livestock production chains is essential for surveillance organizations (such as food safety authorities) when making scientifically based decisions on optimization of resource allocation. To enable this, quantitative decision support tools are required at two levels of analysis: (1) single-hazard surveillance system and (2) surveillance portfolio. This paper addresses the first level by presenting a conceptual approach for the economic analysis of single-hazard surveillance systems. The concept includes objective and subjective aspects of single-hazard surveillance system analysis: (1) a simulation part to derive an efficient set of surveillance setups based on the technical surveillance performance parameters (TSPPs) and the corresponding surveillance costs, i.e., objective analysis, and (2) a multi-criteria decision making model to evaluate the impacts of the hazard surveillance, i.e., subjective analysis. The conceptual approach was checked for (1) conceptual validity and (2) data validity. Issues regarding the practical use of the approach, particularly the data requirement, were discussed. We concluded that the conceptual approach is scientifically credible for economic analysis of single-hazard surveillance systems and that the practicability of the approach depends on data availability. Copyright © 2014 Elsevier B.V. All rights reserved.
Di Renzo, Laura; Colica, Carmen; Carraro, Alberto; Cenci Goga, Beniamino; Marsella, Luigi Tonino; Botta, Roberto; Colombo, Maria Laura; Gratteri, Santo; Chang, Ting Fa Margherita; Droli, Maurizio; Sarlo, Francesca; De Lorenzo, Antonino
2015-04-23
The important role of food and nutrition in public health is being increasingly recognized as crucial for its potential impact on health-related quality of life and the economy, both at the societal and individual levels. The prevalence of non-communicable diseases calls for a reformulation of our view of food. The Hazard Analysis and Critical Control Point (HACCP) system, first implemented in the EU with Directive 43/93/CEE, later replaced by Regulation CE 178/2002 and Regulation CE 852/2004, is the internationally agreed approach to food safety control. Our aim is to develop a new procedure for the assessment of the Nutrient, Hazard Analysis and Critical Control Point (NACCP) process, for total quality management (TQM), and to optimize nutritional levels. NACCP is based on four general principles: i) guarantee of health maintenance; ii) evaluation and assurance of the nutritional quality of food and TQM; iii) correct information to consumers; iv) an ethical profit. There are three stages in the application of the NACCP process: 1) application of NACCP for quality principles; 2) application of NACCP for health principles; 3) implementation of the NACCP process. The actions are: 1) identification of nutritional markers, which must remain intact throughout the food supply chain; 2) identification of critical control points, which must be monitored in order to minimize the likelihood of a reduction in quality; 3) establishment of critical limits to maintain adequate levels of nutrients; 4) establishment and implementation of effective procedures for monitoring critical control points; 5) establishment of corrective actions; 6) identification of metabolic biomarkers; 7) evaluation of the effects of food intake through the application of specific clinical trials; 8) establishment of procedures for consumer information; 9) implementation of the Health Claim Regulation EU 1924/2006; 10) start of a training program. We calculate the risk assessment as follows: Risk (R) = probability (P) × damage (D). The NACCP process considers the entire food supply chain "from farm to consumer"; at each point of the chain it is necessary to implement tight monitoring in order to guarantee optimal nutritional quality.
NASA Astrophysics Data System (ADS)
Lan, Hengxing; Derek Martin, C.; Lim, C. H.
2007-02-01
Geographic information system (GIS) modeling is used in combination with three-dimensional (3D) rockfall process modeling to assess rockfall hazards. A GIS extension, RockFall Analyst (RA), which is capable of effectively handling large amounts of geospatial information relative to rockfall behavior, has been developed in ArcGIS using ArcObjects and C#. The 3D rockfall model considers dynamic processes on a cell-plane basis. It takes as inputs distributed parameters in terms of raster and polygon features created in GIS. Two major components are included in RA: particle-based rockfall process modeling and geostatistics-based rockfall raster modeling. Rockfall process simulation results (3D rockfall trajectories and their velocity features, for either point or polyline seeders) are stored in 3D shape files. Distributed raster modeling, based on the 3D rockfall trajectories and a spatial geostatistical technique, represents the distribution of spatial frequency, the flying and/or bouncing height, and the kinetic energy of falling rocks. A distribution of rockfall hazard can be created by taking these rockfall characteristics into account. A barrier analysis tool is also provided in RA to aid barrier design. An application of these modeling techniques to a case study is provided. RA has been tested in ArcGIS 8.2, 8.3, 9.0 and 9.1.
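To illustrate the kind of particle-based process model RA implements in 3D, here is a minimal 2D lumped-mass bounce sketch on a planar slope. The slope angle, restitution coefficients, and stopping rule are assumptions for illustration and are not RA's actual algorithm.

```python
import numpy as np

# Minimal 2D lumped-mass rockfall bounce sketch in slope-aligned coordinates:
# t = downslope direction, n = slope-normal direction. All parameters assumed.
g = 9.81
slope_angle = np.radians(35.0)       # planar slope (assumed)
Rn, Rt = 0.35, 0.85                  # normal/tangential restitution (assumed)

vt, vn = 2.0, 1.0                    # initial downslope and normal velocity (m/s)
s = 0.0                              # distance travelled along the slope
positions = [0.0]                    # bounce points along the slope

for bounce in range(10):
    at = g * np.sin(slope_angle)     # gravity component along the slope
    an = -g * np.cos(slope_angle)    # gravity component normal to the slope
    # parabolic flight: time until the particle returns to the slope surface
    t_flight = 2 * vn / (g * np.cos(slope_angle))
    s += vt * t_flight + 0.5 * at * t_flight**2
    vt = vt + at * t_flight
    vn = -(vn + an * t_flight)       # normal velocity at impact, flipped upward
    vt, vn = Rt * vt, Rn * vn        # restitution damping at impact
    positions.append(s)
    if vn < 0.05:                    # bounce energy exhausted; rock rolls/stops
        break

print("bounce points along slope (m):", [round(p, 1) for p in positions])
```

RA repeats this kind of trajectory integration over many seeded particles on a 3D terrain, then rasterizes frequency, height and energy geostatistically.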
NASA Astrophysics Data System (ADS)
Weigel, A. M.; Griffin, R.; Sever, T.
2014-12-01
The extent of the Maya civilization spanned portions of modern-day Mexico, Belize, Guatemala, El Salvador and Honduras. Paleoclimatic studies suggest this region has been affected by strong hurricanes for the past six thousand years, reinforced by archeological evidence from Mayan records indicating that they experienced strong storms. It is theorized that hurricanes contributed to the collapse of the Maya, damaging buildings and agriculture and halting industry. Today, this region is known for its active tropical climatology, having been hit by numerous strong storms including Hurricanes Dean, Iris, Keith, and Mitch. This research uses a geographic information system (GIS) to model hurricane hazards and assess the risk posed to the Maya civilization. GIS can handle various layer components, making it well suited to combining the parameters necessary for assessing the risk of experiencing hurricane-related hazards. For this analysis, high winds, storm surge flooding, non-storm-surge-related flooding, and rainfall-triggered landslides were selected as the primary hurricane hazards. Data sets used in this analysis include the National Climatic Data Center International Best Track Archive for Climate Stewardship (IBTrACS) hurricane tracks, the Shuttle Radar Topography Mission Digital Elevation Model, WorldClim monthly accumulated precipitation, USGS HydroSHEDS river locations, Harmonized World Soil Database soil types, and known Maya site locations from the Electronic Atlas of Ancient Maya Sites. ArcGIS and ENVI software were utilized to process data and model hurricane hazards. To assess locations at risk of experiencing high winds, a model was created using ArcGIS Model Builder to map each storm's temporal wind profile, and adapted to simulate forward storm velocity and storm frequency. Modeled results were then combined with physical land characteristics and meteorological and hydrologic data to identify areas likely to be affected. Certain areas along the eastern edge of the Yucatan peninsula were found to be more prone to wind- and flood-related hurricane hazards. The methodologies developed in this analysis can be adapted for further hurricane risk assessment at archeological sites.
NASA Astrophysics Data System (ADS)
Chelidze, Tamaz; Eppelbaum, Lev
2013-04-01
The Alpine-Himalayan convergence zone (AHCZ) underwent recent transverse shortening under the effect of collisional compression, a process accompanied by the rotation of separate microplates. The Caucasian and Eastern Mediterranean regions are segments of the AHCZ and are characterized by intensive endogenous and exogenous geodynamic processes, which manifest themselves in the occurrence of powerful (magnitude 8-9) earthquakes accompanied by the development of secondary catastrophic processes. Large landslides, rock falls, avalanches, mud flows, etc. cause human deaths and great material losses. The development of these endogenous processes is governed by peculiarities of the deep structure of the region and by the impact of deep geological processes. The Caucasus is divided into several main tectonic terranes: platform (sub-platform, quasi-platform) and fold-thrust units. Existing data enable a division of the Caucasian region into two large-scale geological provinces: southern Tethyan and northern Tethyan, located to the south and to the north of the Lesser Caucasian ophiolite suture, respectively. Recent investigations show that the assessments of seismic hazard in these regions are not quite correct: in the West Caucasus, for example, the seismic hazard can be significantly underestimated, which affects the corresponding risk assessments. Integrated analysis of gravity, magnetic, seismic and thermal data enables refinement of the seismic hazard assessment of the region, taking into account real rates of geodynamic movement. The latest rheological constructions also play an important role. According to the tectonic scheme of Reilinger et al. (2006), the western flank of the Arabian Plate exhibits strike-slip motion, while the East Caucasian block is converging and shortening. The Eastern Mediterranean is a tectonically complex region located in the midst of the progressive Afro-Eurasian collision. The recent increase in geotectonic activity in this region highlights the need for a combined analysis of seismo-neotectonic signatures. For this purpose, this article presents the key features of the tectonic zonation of the Eastern Mediterranean. A map of gravity-field derivatives retracked from the Geosat satellite and a novel map of the Moho discontinuity illustrate the most important tectonic features of the region. A map of post-Jurassic deformation of the leveled surface reflects the modern tectonic stage of Eastern Mediterranean evolution. The developed tectono-geophysical zonation map integrates potential geophysical field analysis and seismic section utilization, as well as tectonic-structural, paleogeographical and facies analyses. Tectonically, the map agrees with the earlier model of continental accretion (Ben-Avraham and Ginzburg, 1990). Overlaying the seismicity map of the Eastern Mediterranean tectonic region (for the period 1900-2012) on the tectonic zonation chart reveals the key features of the seismo-neotectonic pattern of the Eastern Mediterranean. The results have important implications for tectonic-seismological analysis in this region (Eppelbaum and Katz, 2012). The difference in geotectonic patterns makes a comparison of the geodynamic activity and seismic hazard of the Caucasian and Eastern Mediterranean segments of the AHCZ instructive.
Research and Evaluations of the Health Aspects of Disasters, Part IX: Risk-Reduction Framework.
Birnbaum, Marvin L; Daily, Elaine K; O'Rourke, Ann P; Loretti, Alessandro
2016-06-01
A disaster is a failure of resilience to an event. Mitigating the risks that a hazard will progress into a destructive event, or increasing the resilience of a society-at-risk, requires careful analysis, planning, and execution. The Disaster Logic Model (DLM) is used to define the value (effects, costs, and outcome(s)), impacts, and benefits of interventions directed at risk reduction. A Risk-Reduction Framework, based on the DLM, details the processes involved in hazard mitigation and/or capacity-building interventions to augment the resilience of a community or to decrease the risk that a secondary event will develop. This Framework provides the structure to systematically undertake and evaluate risk-reduction interventions. It applies to all interventions aimed at hazard mitigation and/or increasing the absorbing, buffering, or response capacities of a community-at-risk for a primary or secondary event that could result in a disaster. The Framework utilizes the structure provided by the DLM and consists of 14 steps: (1) hazards and risks identification; (2) historical perspectives and predictions; (3) selection of hazard(s) to address; (4) selection of appropriate indicators; (5) identification of current resilience standards and benchmarks; (6) assessment of the current resilience status; (7) identification of resilience needs; (8) strategic planning; (9) selection of an appropriate intervention; (10) operational planning; (11) implementation; (12) assessments of outputs; (13) synthesis; and (14) feedback. Each of these steps is a transformation process that is described in detail. Emphasis is placed on the role of Coordination and Control during planning, implementation of risk-reduction/capacity building interventions, and evaluation. Birnbaum ML , Daily EK , O'Rourke AP , Loretti A . Research and evaluations of the health aspects of disasters, part IX: Risk-Reduction Framework. Prehosp Disaster Med. 2016;31(3):309-325.
Red and processed meat consumption and breast cancer: UK Biobank cohort study and meta-analysis.
Anderson, Jana J; Darwis, Narisa D M; Mackay, Daniel F; Celis-Morales, Carlos A; Lyall, Donald M; Sattar, Naveed; Gill, Jason M R; Pell, Jill P
2018-02-01
Red and processed meat may be risk factors for breast cancer because of their iron content, the administration of oestrogens to cattle, or mutagens created during cooking. We studied the associations in UK Biobank and then included the results in a meta-analysis of published cohort studies. UK Biobank, a general population cohort study, recruited participants aged 40-69 years. Incident breast cancer was ascertained via linkage to routine hospital admission, cancer registry and death certificate data. Univariate and multivariable Cox proportional hazards models were used to explore the associations between red and processed meat consumption and breast cancer. Previously published cohort studies were identified from a systematic review using PubMed and Ovid, and a meta-analysis was conducted using a random effects model. Over a median of 7 years of follow-up, 4819 of the 262,195 women developed breast cancer. The risk was increased in the highest tertile (>9 g/day) of processed meat consumption (adjusted hazard ratio [HR] 1.21, 95% confidence interval [CI] 1.08-1.35, p = 0.001). Collation with 10 previous cohort studies provided data on 40,257 incident breast cancers in 1.65 million women. On meta-analysis, processed meat consumption was associated with overall (relative risk [RR] 1.06, 95% CI 1.01-1.11) and post-menopausal (RR 1.09, 95% CI 1.03-1.15), but not pre-menopausal (RR 0.99, 95% CI 0.88-1.10), breast cancer. In UK Biobank and the meta-analysis, red meat consumption was not associated with breast cancer (adjusted HR 0.99, 95% CI 0.88-1.12 and RR 1.03, 95% CI 0.99-1.08, respectively). Consumption of processed meat, but not red meat, may increase the risk of breast cancer. Crown Copyright © 2017. Published by Elsevier Ltd. All rights reserved.
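The random-effects pooling behind results like these is commonly done with the DerSimonian-Laird estimator on the log relative-risk scale. The sketch below shows that computation end to end; the five study RRs and confidence intervals are invented for illustration, not the data pooled in this paper.

```python
import numpy as np

# Invented per-study relative risks and 95% CIs (placeholders, not the paper's data)
rr  = np.array([1.21, 1.04, 0.98, 1.12, 1.08])
lcl = np.array([1.08, 0.92, 0.85, 0.99, 0.95])
ucl = np.array([1.35, 1.18, 1.13, 1.27, 1.23])

y  = np.log(rr)
se = (np.log(ucl) - np.log(lcl)) / (2 * 1.96)   # SE recovered from the 95% CI
w  = 1 / se**2                                   # fixed-effect (inverse-variance) weights

# Cochran's Q and the DerSimonian-Laird between-study variance tau^2
q    = np.sum(w * (y - np.sum(w * y) / np.sum(w))**2)
tau2 = max(0.0, (q - (len(y) - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))

w_re   = 1 / (se**2 + tau2)                      # random-effects weights
pooled = np.sum(w_re * y) / np.sum(w_re)
se_re  = np.sqrt(1 / np.sum(w_re))
print(f"pooled RR = {np.exp(pooled):.2f} "
      f"(95% CI {np.exp(pooled - 1.96 * se_re):.2f}-{np.exp(pooled + 1.96 * se_re):.2f})")
```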
Environmental, technical and technological aspects of hazardous waste management in Poland
NASA Astrophysics Data System (ADS)
Pyssa, Justyna
2017-10-01
The issue of recovery and disposal of hazardous waste is not a new concern. The waste comes from various processes and technologies, and therefore greater emphasis should be placed on reducing the quantity of hazardous waste generated (often requiring changes in the technology used to manufacture a given product) and on limiting its negative influence on the natural environment. Plants specializing in waste processing should meet the so-called cardinal triad of conditions that decide the full success of an investment: economic effectiveness, ecological efficiency and social acceptance. The structure of hazardous waste generation in the EU-28 is presented in the paper. Methods of hazardous waste disposal in Poland are discussed, and economic and ecological criteria for selecting a disposal technology are analyzed. The influence of hazardous waste on the environment is also presented. For four groups of waste that are currently stored, alternative disposal methods are proposed.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-02-19
... Manufacturing Practice and Hazard Analysis and Risk- Based Preventive Controls for Human Food; Extension of... Analysis and Risk-Based Preventive Controls for Human Food.'' FOR FURTHER INFORMATION CONTACT: Domini Bean... Practice and Hazard Analysis and Risk-Based Preventive Controls for Human Food'' with a 120-day comment...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kienhuis, Anne S., E-mail: anne.kienhuis@rivm.nl; RIKILT, Institute of Food Safety, Wageningen UR, PO Box 230, 6700 AE, Wageningen; Netherlands Toxicogenomics Centre
Hepatic systems toxicology is the integrative analysis of toxicogenomic technologies, e.g., transcriptomics, proteomics, and metabolomics, in combination with traditional toxicology measures, to improve the understanding of mechanisms of hepatotoxic action. Hepatic toxicology studies that have employed toxicogenomic technologies to date have already provided a proof of principle for the value of hepatic systems toxicology in hazard identification. In the present review, acetaminophen is used as a model compound to discuss the application of toxicogenomics in hepatic systems toxicology for its potential role in the risk assessment process, progressing from hazard identification towards hazard characterization. The toxicogenomics-based parallelogram is used to identify current achievements and limitations of acetaminophen toxicogenomic studies in vivo and in vitro for in vitro-to-in vivo and interspecies comparisons, with the ultimate aim of extrapolating animal studies to humans in vivo. This article provides a model for comparison of more species and more in vitro models, enhancing the robustness of common toxicogenomic responses and their relevance to human risk assessment. To progress to the quantitative dose-response analysis needed for hazard characterization, hepatic systems toxicology studies must generate toxicogenomic data for multiple doses/concentrations and time points. Newly developed bioinformatics tools for quantitative analysis of toxicogenomic data can aid in the elucidation of dose-responsive effects. The challenge herein is to assess which toxicogenomic responses are relevant for the induction of the apical effect and whether perturbations are sufficient for the induction of downstream events eventually causing toxicity.
Comparing capacity coefficient and dual task assessment of visual multitasking workload
DOE Office of Scientific and Technical Information (OSTI.GOV)
Blaha, Leslie M.
Capacity coefficient analysis could offer a theoretically grounded alternative to subjective measures and dual-task assessment of cognitive workload. Workload capacity, or workload efficiency, is a human information processing modeling construct defined as the amount of information that can be processed by the visual cognitive system in a specified amount of time. In this paper, I explore the relationship between capacity coefficient analysis of workload efficiency and dual-task response time measures. To capture multitasking performance, I examine how the relatively simple assumptions underlying the capacity construct generalize beyond single visual decision-making tasks. The fundamental tools for measuring workload efficiency are the integrated hazard and reverse hazard functions of response times, which are defined by log transforms of the response time distribution. These functions are used in capacity coefficient analysis to provide a functional assessment of the amount of work completed by the cognitive system over the entire range of response times. For the study of visual multitasking, capacity coefficient analysis enables a comparison of visual information throughput as the number of tasks increases from one to two to any number of simultaneous tasks. I illustrate the use of capacity coefficients for visual multitasking on sample data from dynamic multitasking in the modified Multi-attribute Task Battery.
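As a concrete instance of the construct, the sketch below estimates an OR-form capacity coefficient C(t) = H_dual(t) / (H_A(t) + H_B(t)), where H(t) = -log S(t) is the integrated hazard computed from a condition's empirical response-time survivor function S(t). The gamma-distributed RT samples are synthetic stand-ins for single- and dual-task data, not the paper's measurements.

```python
import numpy as np

# Synthetic response times (ms) for two single tasks and the dual task
rng = np.random.default_rng(1)
rt_A    = rng.gamma(shape=5.0, scale=80.0, size=400)
rt_B    = rng.gamma(shape=5.0, scale=90.0, size=400)
rt_dual = rng.gamma(shape=5.0, scale=60.0, size=400)

def integrated_hazard(rts, t):
    """H(t) = -log(empirical survivor function) evaluated at times t."""
    surv = np.array([(rts > ti).mean() for ti in t])
    surv = np.clip(surv, 1e-6, 1.0)    # avoid log(0) in the upper tail
    return -np.log(surv)

t = np.linspace(200, 800, 61)
C = integrated_hazard(rt_dual, t) / (integrated_hazard(rt_A, t) + integrated_hazard(rt_B, t))
# Interpretation: C(t) > 1 super-capacity, C(t) = 1 unlimited, C(t) < 1 limited
print(C.round(2))
```

Because the whole function C(t) is returned, efficiency can be compared across the entire response-time range rather than at a single summary statistic.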
NASA Astrophysics Data System (ADS)
Ciurean, R. L.; Glade, T.
2012-04-01
Decision under uncertainty is a constant of everyday life and an important component of risk management and governance. Recently, experts have emphasized the importance of quantifying uncertainty in all phases of landslide risk analysis. Due to its multi-dimensional and dynamic nature, (physical) vulnerability is inherently complex, and 'degree of loss' estimates are imprecise and to some extent even subjective. Uncertainty analysis introduces quantitative modeling approaches that allow for a more explicitly objective output, improving the risk management process as well as enhancing communication between various stakeholders for better risk governance. This study presents a review of concepts for uncertainty analysis in the vulnerability of elements at risk to landslides. Different semi-quantitative and quantitative methods are compared based on their feasibility in real-world situations, hazard dependency, process stage in vulnerability assessment (i.e., input data, model, output), and applicability within an integrated landslide hazard and risk framework. The resulting observations will help to identify current gaps and future needs in vulnerability assessment, including estimation of uncertainty propagation, transferability of the methods, and development of visualization tools, but also address basic questions such as what uncertainty is and how it can be quantified or treated in a reliable and reproducible way.
Concerns related to Safety Management of Engineered Nanomaterials in research environment
NASA Astrophysics Data System (ADS)
Groso, A.; Meyer, Th
2013-04-01
Since the rise of occupational safety and health research on nanomaterials, a lot of progress has been made in generating health-effects and exposure data. However, where detailed quantitative risk analysis is concerned, more research is needed, especially quantitative measures of workers' exposure and standards to categorize toxicity/hazardousness data. In the absence of dose-response relationships and quantitative exposure measurements, control banding (CB) has been widely adopted by the OHS community as a pragmatic tool for implementing a risk management strategy based on a precautionary approach. Being in charge of health and safety in a Swiss university where nanomaterials are widely used and produced, we also face the challenges of nanomaterials' occupational safety. In this work, we discuss the field application of an in-house risk management methodology similar to CB, as well as some other methodologies, and the challenges and issues related to the process. Since exact data on nanomaterials' hazardousness are missing in most situations, we deduce that the outcome of the analysis for a particular process is essentially the same whether one uses a simple methodology that determines only exposure potential or one that also takes into account the hazardousness of ENPs. It is evident that when reliable data on hazardousness factors (such as surface chemistry, solubility, carcinogenicity, and toxicity) become available, more differentiation will be possible in determining the risk for different materials. On the protective-measures side, all CB methodologies lean toward overprotection, though some suggest comprehensive protective/preventive measures while others offer only basic advice. The implementation and control of protective measures in a research environment are also discussed.
The paper discusses measurement issues relating to the characterization of organic emissions from hazardous waste incineration processes under EPA's new risk burn guidance. The recently published draft guidance recommends that hazardous waste combustion facilities complete a mass...
Overview of Energy Systems' safety analysis report programs. Safety Analysis Report Update Program
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1992-03-01
The primary purpose of a Safety Analysis Report (SAR) is to provide a basis for judging the adequacy of a facility's safety. The SAR documents the safety analyses that systematically identify the hazards posed by the facility, analyze the consequences and risk of potential accidents, and describe hazard control measures that protect the health and safety of the public and employees. In addition, some SARs document, as Technical Safety Requirements (TSRs, which include Technical Specifications and Operational Safety Requirements), technical and administrative requirements that ensure the facility is operated within prescribed safety limits. SARs also provide conveniently summarized information that may be used to support procedure development, training, inspections, and other activities necessary to facility operation. This "Overview of Energy Systems Safety Analysis Report Programs" provides an introduction to the programs and processes used in the development and maintenance of the SARs. It also summarizes some of the uses of the SARs within Energy Systems and DOE.
Determination of heavy metals and halogens in plastics from electric and electronic waste.
Dimitrakakis, Emmanouil; Janz, Alexander; Bilitewski, Bernd; Gidarakos, Evangelos
2009-10-01
The presence of hazardous substances and preparations in small waste electrical and electronic equipment (sWEEE) found in the residual household waste stream of the city of Dresden, Germany, has been investigated. The heavy metal and halogen content of sWEEE plastics is determined using handheld X-ray fluorescence analysis (HXRF), elemental analysis by atomic absorption spectrometry (AAS), and ion exchange chromatography (IEC). Mean values for heavy metals in the samples (n=51) by AAS are 17.4 mg/kg for Pb, 5.7 mg/kg for Cd, and 8.4 mg/kg for Cr. The mass fraction of an additive as shown by HXRF (n=161) can vary over a wide range, so precise conclusions regarding the content of hazardous substances and preparations in sWEEE plastics cannot be made. Additional research would be worthwhile regarding the influence of hazardous substances on recycling processes, in particular regarding the contamination of clean fractions in the exit streams of a WEEE treatment plant. Suitable standards for calibrating HXRF for use on EEE plastics or complex electr(on)ic components do not exist and should be developed.
Analyzing Distributed Functions in an Integrated Hazard Analysis
NASA Technical Reports Server (NTRS)
Morris, A. Terry; Massie, Michael J.
2010-01-01
Large scale integration of today's aerospace systems is achievable through the use of distributed systems. Validating the safety of distributed systems is significantly more difficult as compared to centralized systems because of the complexity of the interactions between simultaneously active components. Integrated hazard analysis (IHA), a process used to identify unacceptable risks and to provide a means of controlling them, can be applied to either centralized or distributed systems. IHA, though, must be tailored to fit the particular system being analyzed. Distributed systems, for instance, must be analyzed for hazards in terms of the functions that rely on them. This paper will describe systems-oriented IHA techniques (as opposed to traditional failure-event or reliability techniques) that should be employed for distributed systems in aerospace environments. Special considerations will be addressed when dealing with specific distributed systems such as active thermal control, electrical power, command and data handling, and software systems (including the interaction with fault management systems). Because of the significance of second-order effects in large scale distributed systems, the paper will also describe how to analyze secondary functions to secondary functions through the use of channelization.
Lorz, C; Fürst, C; Galic, Z; Matijasic, D; Podrazky, V; Potocic, N; Simoncic, P; Strauch, M; Vacik, H; Makeschin, F
2010-12-01
We assessed the probability of three major natural hazards (windthrow, drought, and forest fire) for Central and South-Eastern European forests; these hazards are major threats to the provision of forest goods and ecosystem services. In addition, we analyzed their spatial distribution and the implications for a future-oriented management of forested landscapes. For estimating the probability of windthrow, we used rooting depth and average wind speed. Probabilities of drought and fire were calculated from the climatic and total water balance during the growing season. As an approximation to climate change scenarios, we used a simplified approach with a general increase of pET by 20%. Monitoring data from the pan-European forest crown condition program and observed burnt areas and hot spots from the European Forest Fire Information System were used to test the plausibility of the probability maps. Regions with high probabilities of natural hazards are identified, and management strategies to minimize these probabilities are discussed. We suggest future research should focus on (i) estimating probabilities using process-based models (including sensitivity analysis), (ii) defining probability in terms of economic loss, (iii) including biotic hazards, (iv) using more detailed data sets on natural hazards, forest inventories and climate change scenarios, and (v) developing a framework for adaptive risk management.
NASA Astrophysics Data System (ADS)
Moustafa, Sayed, Sr.; Alarifi, Nassir S.; Lashin, Aref A.
2016-04-01
Urban areas along the western coast of Saudi Arabia are susceptible to natural disasters and environmental damage due to a lack of planning. To produce a site-specific microzonation map of the rapidly growing industrial city of Yanbu, the spatial distributions of different hazard entities are assessed using the Analytical Hierarchy Process (AHP) together with a Geographical Information System (GIS). For this purpose, six hazard parameter layers are considered, namely: fundamental frequency, site amplification, soil strength in terms of effective shear-wave velocity, overburden sediment thickness, seismic vulnerability index, and peak ground acceleration. The weight and rank values determined during the AHP are assigned to each layer and its corresponding classes, respectively. An integrated seismic microzonation map was derived on a GIS platform. Based on the derived map, the study area is classified into five hazard categories: very low, low, moderate, high, and very high. The western and central parts of the study area, as indicated by the derived microzonation map, are categorized as a high-hazard zone compared with the surrounding areas. The microzonation map produced in the current study is envisaged as a first-level assessment of site-specific hazards in the Yanbu city area, which can be used as a platform by different stakeholders in future land-use planning and environmental hazard management.
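A minimal sketch of the AHP weighting step follows: layer weights are derived as the principal eigenvector of a pairwise comparison matrix, with a consistency check, in the manner of Saaty's method. The six layer names echo the abstract, but the comparison values are assumptions, not those used for Yanbu.

```python
import numpy as np

# Pairwise comparison matrix over the six hazard layers (values assumed);
# entry A[i, j] says how much more important layer i is than layer j.
layers = ["f0", "amplification", "Vs", "thickness", "Kg", "PGA"]
A = np.array([
    [1,   2,   3,   4,   3,   1/2],
    [1/2, 1,   2,   3,   2,   1/3],
    [1/3, 1/2, 1,   2,   1,   1/4],
    [1/4, 1/3, 1/2, 1,   1/2, 1/5],
    [1/3, 1/2, 1,   2,   1,   1/4],
    [2,   3,   4,   5,   4,   1  ],
], dtype=float)

# Layer weights = normalized principal eigenvector of A
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()

# Consistency ratio (random index RI = 1.24 for n = 6); CR < 0.1 is acceptable
n = len(layers)
CI = (eigvals.real[k] - n) / (n - 1)
print(dict(zip(layers, w.round(3))), "CR =", round(CI / 1.24, 3))
```

The resulting weights are then applied to the ranked raster layers in GIS and summed to give the integrated microzonation score per cell.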
The hazards of hazard identification in environmental epidemiology.
Saracci, Rodolfo
2017-08-09
Hazard identification is a major scientific challenge, notably for environmental epidemiology, and is often surrounded, as the recent case of glyphosate shows, by debate arising in the first place from the inherently problematic nature of many components of the identification process. Particularly relevant in this respect are components less amenable to logical or mathematical formalization and essentially dependent on scientists' judgment. Four such potentially hazardous components, each capable of distorting the correct process of hazard identification, are reviewed and discussed from an epidemiologist's perspective: (1) lexical mix-up of hazard and risk; (2) scientific questions as distinct from testable hypotheses, and the implications for the hierarchy of strength of evidence obtainable from different types of study designs; (3) assumptions in prior beliefs and model choices; and (4) conflicts of interest. Four suggestions are put forward to strengthen a process that remains in several respects judgmental, but not arbitrary, in nature.
Tools for Material Design and Selection
NASA Astrophysics Data System (ADS)
Wehage, Kristopher
The present thesis focuses on applications of numerical methods to create tools for material characterization, design and selection. The tools generated in this work incorporate a variety of programming concepts, from digital image analysis, geometry, optimization, and parallel programming to data mining, databases and web design. The first portion of the thesis focuses on methods for characterizing clustering in bimodal 5083 aluminum alloys created by cryomilling and powder metallurgy. The bimodal samples analyzed in the present work contain a mixture of a coarse-grain phase, with a grain size on the order of several microns, and an ultra-fine-grain phase, with a grain size on the order of 200 nm. The mixing of the two phases is not homogeneous, and clustering is observed. To investigate clustering in these bimodal materials, various microstructures were created experimentally by conventional cryomilling, Hot Isostatic Pressing (HIP), Extrusion, Dual-Mode Dynamic Forging (DMDF) and a new 'Gradient' cryomilling process. Two techniques for quantitative clustering analysis are presented, formulated and implemented. The first technique, the Area Disorder function, provides a metric of the quality of coarse-grain dispersion in an ultra-fine-grain matrix, and the second technique, the Two-Point Correlation function, provides a metric of long- and short-range spatial arrangements of the two phases, as well as an indication of the mean feature size in any direction. The two techniques are implemented on digital images created by Scanning Electron Microscopy (SEM) and Electron Backscatter Diffraction (EBSD) of the microstructures. To investigate structure-property relationships through modeling and simulation, strategies for generating synthetic microstructures are discussed, and a computer program that generates randomized microstructures with desired configurations of clustering described by the Area Disorder function is formulated and presented. In the computer program, two-dimensional microstructures are generated by Random Sequential Adsorption (RSA) of voxelized ellipses representing the coarse-grain phase. A simulated annealing algorithm is used to geometrically optimize the placement of the ellipses in the model to achieve varying user-defined configurations of spatial arrangement of the coarse grains. During the simulated annealing process, the ellipses are allowed to overlap up to a specified threshold, allowing triple junctions to form in the model. Once the simulated annealing process is complete, the remaining space is populated by smaller ellipses representing the ultra-fine-grain phase. Uniform random orientations are assigned to the grains. The program generates text files that can be imported into Crystal Plasticity Finite Element Analysis software for stress analysis. Finally, numerical methods and programming are applied to current issues in green engineering and hazard assessment. To understand the hazards associated with materials and select safer alternatives, engineers and designers need access to up-to-date hazard information. However, hazard information comes from many disparate sources, and aggregating, interpreting and acting on the wealth of data is not trivial. In light of these challenges, a Framework for Automated Hazard Assessment based on the GreenScreen List Translator is presented.
The framework consists of a computer program that automatically extracts data from the GHS-Japan hazard database, loads the data into a machine-readable JSON format, transforms the JSON document into a GreenScreen JSON document using the GreenScreen List Translator v1.2, and performs GreenScreen Benchmark scoring on the material. The GreenScreen JSON documents are then uploaded to a document storage system that allows human operators to search for, modify or add hazard information via a web interface.
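The extract-transform step described above can be sketched as follows. The GHS record fields, the translation table, and the single benchmark rule here are invented placeholders: the real pipeline uses the GHS-Japan database schema and the full GreenScreen List Translator v1.2 rule set.

```python
import json

# Toy GHS-style record (field names are assumed, not the GHS-Japan schema)
ghs_record = json.loads("""{
  "cas": "50-00-0",
  "classifications": [
    {"hazard_class": "Carcinogenicity", "category": "1A"},
    {"hazard_class": "Acute toxicity (oral)", "category": "3"}
  ]
}""")

# Toy translation table: GHS class/category -> GreenScreen endpoint and level
TRANSLATOR = {
    ("Carcinogenicity", "1A"): ("Carcinogenicity", "High"),
    ("Acute toxicity (oral)", "3"): ("Acute Mammalian Toxicity", "Moderate"),
}

greenscreen = {
    "cas": ghs_record["cas"],
    "hazards": [
        dict(zip(("endpoint", "level"),
                 TRANSLATOR[(c["hazard_class"], c["category"])]))
        for c in ghs_record["classifications"]
        if (c["hazard_class"], c["category"]) in TRANSLATOR
    ],
}

# Simplified scoring stand-in: a single "High" human-health endpoint
# is treated here as Benchmark 1 (the real rules are more involved).
benchmark = 1 if any(h["level"] == "High" for h in greenscreen["hazards"]) else None
print(json.dumps(greenscreen, indent=2), "benchmark:", benchmark)
```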
Assessing the validity of prospective hazard analysis methods: a comparison of two techniques
2014-01-01
Background: Prospective Hazard Analysis techniques such as Healthcare Failure Modes and Effects Analysis (HFMEA) and Structured What-If Technique (SWIFT) have the potential to increase safety by identifying risks before an adverse event occurs. Published accounts of their application in healthcare have identified benefits, but the reliability of some methods has been found to be low. The aim of this study was to examine the validity of SWIFT and HFMEA by comparing their outputs in the process of risk assessment, and by comparing the results with risks identified by retrospective methods. Methods: The setting was a community-based anticoagulation clinic in which risk assessment activities had previously been performed and were available. A SWIFT and an HFMEA workshop were conducted consecutively on the same day by experienced experts. Participants were a mixture of pharmacists, administrative staff and software developers. Both methods produced lists of risks scored according to the method's procedure. Participants' views about the value of the workshops were elicited with a questionnaire. Results: SWIFT identified 61 risks and HFMEA identified 72 risks. For each method, fewer than half of the hazards it identified were also found by the other. There was also little overlap between the results of the workshops and the risks identified by prior root cause analysis, staff interviews or clinical governance board discussions. Participants' feedback indicated that the workshops were viewed as useful. Conclusions: Although there was limited overlap, both methods raised important hazards. Scoping of the problem area had a considerable influence on the outputs. The opportunity for teams to discuss their work from a risk perspective is valuable, but these methods cannot be relied upon in isolation to provide a comprehensive description. Multiple methods for identifying hazards should be used, and data from different sources should be integrated to give a comprehensive view of risk in a system. PMID:24467813
Voigt, Kristina; Scherb, Hagen; Bruggemann, Rainer; Schramm, Karl-Werner
2013-06-01
Sustainable/Green Chemistry is a chemical philosophy encouraging the design of products and processes that reduce or eliminate the use and generation of hazardous substances. In this respect, metrical scientific disciplines like chemometrics are important because they provide criteria for whether chemicals are hazardous or not. We demonstrated that sustainability principles in the disciplines of Green Chemistry, Green Engineering, and Sustainability in Information Technology have main aspects in common; the use of non-hazardous chemicals, or the more efficient use of chemical substances, is one of these aspects. Here we take a closer look at the hazards of chemical substances. Our research focuses on data analyses concerning environmental chemicals named Persistent Organic Pollutants (POPs), which are found all over the world and pose a large risk to the environment as well as to humans. The evaluation of the data is a major step in elucidating the danger of these chemicals. The data analysis method demonstrated here is based on the theory of partially ordered sets and provides a generalized ranking. In our approach, we investigate data sets of breast milk samples from women in Denmark, Finland, and Turkey which contained measurable levels of 20 POPs. The goal is twofold: on the one hand, the hazardous chemicals are to be identified; on the other hand, possible differences among the three nations should be detected, because such differences would suggest different uptake mechanisms. The data analysis is performed with the freely available software package PyHasse, written by the third author. We conclude that the data analysis method is well suited to distinguishing between more and less dangerous existing chemicals. Furthermore, it could be used in sustainable chemistry in the same manner to detect more and less sustainable chemicals. Copyright © 2013 Elsevier B.V. All rights reserved.
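The core of such a partial-order ranking can be sketched in a few lines: chemical x dominates chemical y only if x is at least as high on every indicator and strictly higher on at least one; pairs with conflicting indicators remain incomparable. The chemical names and concentration matrix below are synthetic, and this toy dominance check is not the PyHasse package itself.

```python
import numpy as np

# Synthetic indicator matrix: rows = chemicals, columns = indicators
# (e.g., median breast-milk level per country); all values are invented.
chems = ["PCB-153", "DDE", "HCB", "PBDE-47"]
X = np.array([
    [120, 95, 80],
    [150, 60, 90],
    [ 30, 20, 25],
    [ 40, 70, 10],
], dtype=float)

def dominates(i, j):
    """True if chemical i is >= chemical j on every indicator, > on at least one."""
    return bool(np.all(X[i] >= X[j]) and np.any(X[i] > X[j]))

for i in range(len(chems)):
    for j in range(len(chems)):
        if i != j and dominates(i, j):
            print(f"{chems[i]} > {chems[j]}")   # comparable pair

# Pairs never printed in either direction (e.g., chemicals that trade off
# across indicators) are incomparable: the hallmark of a partial order.
```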
SSHAC Level 1 Probabilistic Seismic Hazard Analysis for the Idaho National Laboratory
DOE Office of Scientific and Technical Information (OSTI.GOV)
Payne, Suzette; Coppersmith, Ryan; Coppersmith, Kevin
A Probabilistic Seismic Hazard Analysis (PSHA) was completed for the Materials and Fuels Complex (MFC), Naval Reactors Facility (NRF), and the Advanced Test Reactor (ATR) at Idaho National Laboratory (INL) (Figure 1-1). The PSHA followed the approaches and procedures appropriate for a Study Level 1 provided in the guidance advanced by the Senior Seismic Hazard Analysis Committee (SSHAC) in U.S. Nuclear Regulatory Commission (NRC) NUREG/CR-6372 and NUREG-2117 (NRC, 1997; 2012a). The SSHAC Level 1 PSHAs for MFC and ATR were conducted as part of the Seismic Risk Assessment (SRA) project (INL Project number 31287) to develop and apply a new risk-informed methodology. The SSHAC Level 1 PSHA was conducted for NRF to provide guidance on the potential use of a design margin above rock hazard levels. The SRA project is developing a new risk-informed methodology that will provide a systematic approach for evaluating the need for an update of an existing PSHA. The new methodology proposes criteria to be employed at specific analysis, decision, or comparison points in its evaluation process. The first four of seven criteria address changes in the inputs and results of the PSHA and are given in U.S. Department of Energy (DOE) Standard DOE-STD-1020-2012 (DOE, 2012a) and American National Standards Institute/American Nuclear Society (ANSI/ANS) 2.29 (ANS, 2008a). The last three criteria address the evaluation of quantitative hazard- and risk-focused information for an existing nuclear facility. The seven criteria and decision points are applied to Seismic Design Categories (SDC) 3, 4, and 5, which are defined in American Society of Civil Engineers/Structural Engineers Institute (ASCE/SEI) 43-05 (ASCE, 2005). The application of the criteria and decision points could lead to an update or could determine that such an update is not necessary.
Consumption of ultra-processed foods and cancer risk: results from NutriNet-Santé prospective cohort
Fiolet, Thibault; Sellem, Laury; Kesse-Guyot, Emmanuelle; Allès, Benjamin; Méjean, Caroline; Deschasaux, Mélanie; Fassier, Philippine; Latino-Martel, Paule; Beslay, Marie; Hercberg, Serge; Lavalette, Céline; Monteiro, Carlos A; Julia, Chantal; Touvier, Mathilde
2018-01-01
Abstract Objective To assess the prospective associations between consumption of ultra-processed food and risk of cancer. Design Population based cohort study. Setting and participants 104 980 participants aged at least 18 years (median age 42.8 years) from the French NutriNet-Santé cohort (2009-17). Dietary intakes were collected using repeated 24 hour dietary records, designed to register participants’ usual consumption for 3300 different food items. These were categorised according to their degree of processing by the NOVA classification. Main outcome measures Associations between ultra-processed food intake and risk of overall, breast, prostate, and colorectal cancer assessed by multivariable Cox proportional hazard models adjusted for known risk factors. Results Ultra-processed food intake was associated with higher overall cancer risk (n=2228 cases; hazard ratio for a 10% increment in the proportion of ultra-processed food in the diet 1.12 (95% confidence interval 1.06 to 1.18); P for trend<0.001) and breast cancer risk (n=739 cases; hazard ratio 1.11 (1.02 to 1.22); P for trend=0.02). These results remained statistically significant after adjustment for several markers of the nutritional quality of the diet (lipid, sodium, and carbohydrate intakes and/or a Western pattern derived by principal component analysis). Conclusions In this large prospective study, a 10% increase in the proportion of ultra-processed foods in the diet was associated with a significant increase of greater than 10% in risks of overall and breast cancer. Further studies are needed to better understand the relative effect of the various dimensions of processing (nutritional composition, food additives, contact materials, and neoformed contaminants) in these associations. Study registration Clinicaltrials.gov NCT03335644. PMID:29444771
NASA Astrophysics Data System (ADS)
Heckmann, Tobias; Haas, Florian; Trappe, Martin; Cyffka, Bernd; Becht, Michael
2010-05-01
Natural hazards are processes occurring in the natural environment that negatively affect human society. In most instances, the definition of natural hazards implies sudden events as different as earthquakes, floods or landslides. In addition, there are other phenomena that occur more subtly or slowly and nevertheless may have serious adverse effects on the human environment. Hence, a comprehensive study programme in natural hazards has to cover not only the conspicuous causes and effects of natural catastrophes, but environmental processes in general. Geography as a discipline is located at the interface of the natural, social and economic sciences; the physical geography programme described here is designed to include the social and economic dimensions as well as management issues. Modules strengthening the theoretical background of geomorphic, geological, hydrological and meteorological processes and hazards are complemented by practical work in the field and the laboratory dealing with measuring and monitoring environmental processes. On this basis, modeling and managing skills are developed. Another thread in the transdisciplinary programme deals with sustainability and environmental policy issues, and with environmental psychology (e.g., the perception of and reaction to hazards). This will improve the communication and team-working skills of students wherever they are part of an interdisciplinary working group. Through involvement in research programmes, students are confronted 'hands-on' with the different aspects of environmental processes and their consequences; thus, they will be excellently, but not exclusively, qualified for positions in the 'natural hazards' sector.
Risk analysis based on hazards interactions
NASA Astrophysics Data System (ADS)
Rossi, Lauro; Rudari, Roberto; Trasforini, Eva; De Angeli, Silvia; Becker, Joost
2017-04-01
Despite an increasing need for open, transparent, and credible multi-hazard risk assessment methods, models, and tools, the availability of the comprehensive risk information needed to inform disaster risk reduction is limited, and the level of interaction across hazards is not systematically analysed. Risk assessment methodologies for different hazards often produce risk metrics that are not comparable. Hazard interactions (the consecutive occurrence of two or more different events) are generally neglected, resulting in strongly underestimated risk assessments in the most exposed areas. This study presents cases of interaction between different hazards, showing how subsidence can affect coastal and river flood risk (Jakarta and Bandung, Indonesia) or how flood risk is modified after a seismic event (Italy). The analysis of well-documented real study cases, based on a combination of Earth Observation and in-situ data, serves as a basis for the formalisation of a multi-hazard methodology, identifying gaps and research frontiers. Multi-hazard risk analysis is performed through the RASOR platform (Rapid Analysis and Spatialisation Of Risk). A scenario-driven query system allows users to simulate future scenarios based on existing and assumed conditions, to compare them with historical scenarios, and to model multi-hazard risk both before and during an event (www.rasor.eu).
NASA Astrophysics Data System (ADS)
Chapman, Martin Colby
1998-12-01
The design earthquake selection problem is fundamentally probabilistic. Disaggregation of a probabilistic model of the seismic hazard offers a rational and objective approach that can identify the most likely earthquake scenario(s) contributing to hazard. An ensemble of time series can be selected on the basis of the modal earthquakes derived from the disaggregation. This gives a useful time-domain realization of the seismic hazard, to the extent that a single motion parameter captures the important time-domain characteristics. A possible limitation to this approach arises because most currently available motion prediction models for peak ground motion or oscillator response are essentially independent of duration, and modal events derived using the peak motions for the analysis may not represent the optimal characterization of the hazard. The elastic input energy spectrum is an alternative to the elastic response spectrum for these types of analyses. The input energy combines the elements of amplitude and duration into a single parameter description of the ground motion that can be readily incorporated into standard probabilistic seismic hazard analysis methodology. This use of the elastic input energy spectrum is examined. Regression analysis is performed using strong motion data from Western North America and consistent data processing procedures for both the absolute input energy equivalent velocity (V_ea) and the elastic pseudo-relative velocity response (PSV) in the frequency range 0.5 to 10 Hz. The results show that the two parameters can be successfully fit with identical functional forms. The dependence of V_ea and PSV upon (NEHRP) site classification is virtually identical. The variance of V_ea is uniformly less than that of PSV, indicating that V_ea can be predicted with slightly less uncertainty as a function of magnitude, distance and site classification. The effects of site class are important at frequencies less than a few Hertz. The regression modeling does not resolve significant effects due to site class at frequencies greater than approximately 5 Hz. Disaggregation of general seismic hazard models using V_ea indicates that the modal magnitudes for the higher frequency oscillators tend to be larger, and vary less with oscillator frequency, than those derived using PSV. Insofar as the elastic input energy may be a better parameter for quantifying the damage potential of ground motion, its use in probabilistic seismic hazard analysis could provide an improved means for selecting earthquake scenarios and establishing design earthquakes for many types of engineering analyses.
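To make the disaggregation idea concrete, the sketch below computes the contribution of magnitude-distance bins to the annual rate of exceeding a target ground-motion level and reports the modal bin. The Gutenberg-Richter parameters and the log-linear attenuation model are invented placeholders, not the dissertation's regression results.

```python
import numpy as np
from scipy.stats import norm

# Magnitude-distance grid (bin edges are assumptions for illustration)
mags  = np.arange(5.0, 8.0, 0.25)
dists = np.arange(5.0, 105.0, 10.0)            # km
target = np.log(0.2)                            # ln of a 0.2 g target level

# Toy Gutenberg-Richter recurrence: annual rate of M >= m, then per-bin rates
b, rate_m0 = 1.0, 0.05
rates = rate_m0 * 10 ** (-b * (mags - 5.0))
rates -= np.append(rates[1:], 0.0)              # incremental rate per magnitude bin

# Invented attenuation model: ln(median motion) as a function of M and R
contrib = np.zeros((len(mags), len(dists)))
for i, m in enumerate(mags):
    for j, r in enumerate(dists):
        ln_med = -3.5 + 1.0 * m - 1.5 * np.log(r + 10.0)
        p_exceed = norm.sf(target, loc=ln_med, scale=0.6)   # aleatory sigma assumed
        contrib[i, j] = rates[i] * p_exceed / len(dists)    # uniform distance weighting

# The modal bin is the scenario that contributes most to the hazard
i, j = np.unravel_index(np.argmax(contrib), contrib.shape)
print(f"modal event: M {mags[i]:.2f} at {dists[j]:.0f} km; "
      f"total exceedance rate = {contrib.sum():.2e} /yr")
```

Running the same disaggregation with an energy-based motion parameter in place of the peak-motion model is, in essence, the comparison the dissertation performs.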
DEMONSTRATION BULLETIN: THE DEHYDRO-TECH CORP. CARVER-GREENFIELD PROCESS
The Carver-Greenfield (C-G) Process (Figure 1) is a physical process that separates hazardous oil-soluble organic contaminants from sludges, soils, and non-aqueous liquids. The process involves adding to the waste a "carrier" oil, which removes hazardous organics from contamin...
AN ADVANCED TOOL FOR APPLIED INTEGRATED SAFETY MANAGEMENT
DOE Office of Scientific and Technical Information (OSTI.GOV)
Potts, T. Todd; Hylko, James M.; Douglas, Terence A.
2003-02-27
WESKEM, LLC's Environmental, Safety and Health (ES&H) Department had previously assessed that a lack of consistency, poor communication and antiquated communication tools could result in varying operating practices, as well as a failure to capture and disseminate appropriate Integrated Safety Management (ISM) information. To address these issues, the ES&H Department established an Activity Hazard Review (AHR)/Activity Hazard Analysis (AHA) process for systematically identifying, assessing, and controlling hazards associated with project work activities during work planning and execution. Depending on the scope of a project, information from field walkdowns and table-top meetings is collected on an AHR form. The AHA then documents the potential failure and consequence scenarios for a particular hazard. The AHA also recommends whether the type of mitigation appears appropriate or whether additional controls should be implemented. Since the application is web-based, the information is captured in a single system and organized according to the more than 200 work activities already recorded in the database. Using the streamlined AHA method improved cycle time from over four hours to an average of one hour, allowing more time to analyze unique hazards and develop appropriate controls. The enhanced configuration control also created a readily available AHA library to research and utilize, along with standardizing hazard analysis and control selection across four separate work sites located in Kentucky and Tennessee. The AHR/AHA system provides an applied example of how the ISM concept evolved into a standardized field-deployed tool yielding considerable efficiency gains in project planning and resource utilization. Employee safety is preserved through detailed planning that now requires only a portion of the time previously necessary. The available resources can then be applied to implementing appropriate engineering, administrative and personal protective equipment controls in the field.
Oakman, Jodi; Neupane, Subas; Nygård, Clas-Håkan
2016-10-01
Musculoskeletal disorders (MSDs) are a major workplace issue. With increasing pressure to extend working lives, predictors of MSD risk across all age groups require accurate identification to inform risk reduction strategies. In 2005 and 2009, a survey was conducted in a Finnish food processing company (N = 734). Data on workplace physical and psychosocial hazards, work ability, job satisfaction and lifestyle-related variables were collected, and MSD risk was measured through assessment of work-related strain in four body areas. Predictors of MSD risk across three age groups (20-35, 36-49, 50+) were assessed with linear regression analysis. Physical hazards and MSD risk were related differently for each age group, while the relationship between psychosocial hazards and MSD risk was less clear. For younger workers, physical hazards were not associated with MSD risk. In contrast, for those aged 36-49, repetitive movements (B = 1.76, p < 0.001) and awkward postures (B = 1.30, p = 0.02) were associated with increased MSD risk. For older workers, environmental hazards were positively associated with MSD risk (B = 0.37, p = 0.04). Predictors of MSD risk also changed differently for each age group during the 4 years of follow-up: for younger workers, changes in environmental hazards and repetitive movements were significant predictors of MSD risk; for middle-aged workers, team support; and for older workers, changes in awkward postures. These results support the need for workplace-specific hazard surveillance data. This will ensure that all factors contributing to MSD risk can be accurately identified and controlled, independent of age.
System safety management lessons learned from the US Army acquisition process
DOE Office of Scientific and Technical Information (OSTI.GOV)
Piatt, J.A.
1989-05-01
The Assistant Secretary of the Army for Research, Development and Acquisition directed the Army Safety Center to provide an audit of the causes of accidents and safety-of-use restrictions on recently fielded systems by tracking residual hazards back through the acquisition process. The objective was to develop "lessons learned" that could be applied to the acquisition process to minimize mishaps in fielded systems. System safety management lessons learned are defined as Army practices or policies, derived from past successes and failures, that are expected to be effective in eliminating or reducing specific systemic causes of residual hazards. They are broadly applicable and supportive of the Army structure and acquisition objectives. Pacific Northwest Laboratory (PNL) was given the task of conducting an independent, objective appraisal of the Army's system safety program in the context of the Army materiel acquisition process by focusing on four fielded systems which are products of that process. These systems included the Apache helicopter, the Bradley Fighting Vehicle (BFV), the Tube Launched, Optically Tracked, Wire Guided (TOW) Missile and the High Mobility Multipurpose Wheeled Vehicle (HMMWV). The objective of this study was to develop system safety management lessons learned associated with the acquisition process. The first step was to identify residual hazards associated with the selected systems. Since it was impossible to track all residual hazards through the acquisition process, certain well-known, high-visibility hazards were selected for detailed tracking. These residual hazards illustrate a variety of systemic problems. Systemic or process causes were identified for each residual hazard and analyzed to determine why they exist. System safety management lessons learned were developed to address related systemic causal factors. 29 refs., 5 figs.
NASA Astrophysics Data System (ADS)
Owen, S. E.; Simons, M.; Hua, H.; Yun, S. H.; Agram, P. S.; Milillo, P.; Sacco, G. F.; Webb, F.; Rosen, P. A.; Lundgren, P.; Milillo, G.; Manipon, G. J. M.; Moore, A. W.; Liu, Z.; Polet, J.; Cruz, J.
2014-12-01
ARIA is a joint JPL/Caltech project to automate synthetic aperture radar (SAR) and GPS imaging capabilities for scientific understanding, hazard response, and societal benefit. We have built a prototype SAR and GPS data system that forms the foundation for hazard monitoring and response capability, as well as providing imaging capabilities important for science studies. Together, InSAR and GPS have the ability to capture surface deformation in high spatial and temporal resolution. For earthquakes, this deformation provides information that is complementary to seismic data on location, geometry and magnitude of earthquakes. Accurate location information is critical for understanding the regions affected by damaging shaking. Regular surface deformation measurements from SAR and GPS are useful for monitoring changes related to many processes that are important for hazard and resource management such as volcanic deformation, groundwater withdrawal, and landsliding. Observations of SAR coherence change have a demonstrated use for damage assessment for hazards such as earthquakes, tsunamis, hurricanes, and volcanic eruptions. These damage assessment maps can be made from imagery taken day or night and are not affected by clouds, making them valuable complements to optical imagery. The coherence change caused by the damage from hazards (building collapse, flooding, ash fall) is also detectable with intelligent algorithms, allowing for rapid generation of damage assessment maps over large areas at fine resolution, down to the spatial scale of single family homes. We will present the progress and results we have made on automating the analysis of SAR data for hazard monitoring and response using data from the Italian Space Agency's (ASI) COSMO-SkyMed constellation of X-band SAR satellites. Since the beginning of our project with ASI, our team has imaged deformation and coherence change caused by many natural hazard events around the world. We will present progress on our data system technology that enables rapid and reliable production of imagery. Lastly, we participated in the March 2014 FEMA exercise based on a repeat of the 1964 M9.2 Alaska earthquake, providing simulated data products for use in this hazard response exercise. We will present lessons learned from this and other simulation exercises.
DOT National Transportation Integrated Search
1985-10-01
This report summarizes the findings from the second phase of a two-part analysis of hazardous materials truck routes in the Dallas-Fort Worth area. Phase II of this study analyzes the risk of transporting hazardous materials on freeways and arterial ...
Hazard analysis of Arid and semi-Arid (ASAL) regions of Kenya.
Tabu, J S; Otwelo, J A; Koskei, P; Makokha, P
2013-06-01
This paper describes a situation analysis of hazards in the arid and semi-arid lands (ASAL) of Kenya. The leading hazards affecting these lands are mainly natural and include, among others, drought, floods and landslides. Other hazards of importance were found to be war and conflict, HIV/AIDS and fires. Over 80% of these are weather related. The overall objective of this study was to prioritize hazards in the ASAL region. Specifically, the study identified the top ten hazards in the ASAL Districts of Kenya, determined their probability of occurrence, analyzed their potential impact and, applying a multiplier effect, prioritized the hazards using a hypothetical model. This was a descriptive study conducted in over half of Kenya's ASAL Districts in four regions: Lower Eastern, Upper Eastern, North Eastern and part of the Coast region. Six Districts were purposively selected per region, with six officers from each District, totaling one hundred and forty-four. Respondents were sourced from the Agriculture, Health, Local Government, Provincial Administration, Environment and NGO sectors. Working in groups for their respective districts, the members analyzed hazards through a consensus process using a tool that had been developed, on whose use the respondents were trained. One hundred and forty-four (144) officers from twenty-four Districts in the four regions were recruited. One hundred and twenty-seven (81%) were male and only 27 (19%) were female. The representation of participants per sector was Governance 25%, followed by civil society organizations 21%, Health 16%, Agriculture and arid lands 15%, and research and scientific institutions 13%. The top priority hazards identified using the mean score were drought and famine (5.4), epidemics and epizootics (3.8), HIV/AIDS (3.6), war and conflict (2.5), and floods (2.5). CONCLUSIONS: The exercise confirmed the priority hazards in the arid and semi-arid regions of Kenya and described vulnerability factors that included water scarcity, poverty and low educational levels. The region suffers from a variety of hazards, in particular drought and famine, epidemics including HIV/AIDS, and war and conflict. Environmental degradation, though given a low score, may be more a matter of perception. There is a need to undertake a comprehensive hazard and vulnerability analysis at the regional and country levels to inform interventions and other developmental activities. Women should be targeted at the community and leadership levels, and efforts to empower them should be stepped up.
NASA Technical Reports Server (NTRS)
Starr, David O. (Technical Monitor); Smith, Eric A.
2002-01-01
Comprehensive understanding of the microphysical nature of Mediterranean storms can be accomplished by a combination of in situ meteorological data analysis and radar-passive microwave data analysis, effectively integrated with numerical modeling studies at various scales, from the synoptic scale down through the mesoscale, the cloud macrophysical scale, and ultimately the cloud microphysical scale. The microphysical properties of severe storms, and their controls on those storms, are intrinsically related to the meteorological processes under which the storms have evolved, processes which eventually select and control the dominant microphysical properties themselves. For the hazardous Mediterranean storms highlighted in this study (three mountain storms producing damaging floods in northern Italy between 1992 and 2000), developing a comprehensive microphysical interpretation requires an understanding of the multiple phases of storm evolution and the heterogeneous nature of precipitation fields within a storm domain. This involves intense convective development, stratiform transition and decay, orographic lifting, and sloped frontal lifting processes, as well as the associated vertical motions and thermodynamical instabilities governing the physical processes that determine details of the liquid/ice water contents, size distributions, and fall rates of the various modes of hydrometeors found within hazardous storm environments.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gary Mecham
2010-08-01
This report is a companion to the Facilities Condition and Hazard Assessment for Materials and Fuel Complex Sodium Processing Facilities MFC-799/799A and Nuclear Calibration Laboratory MFC-770C (referred to as the Facilities Condition and Hazards Assessment). This report specifically responds to the requirement of Section 9.2, Item 6, of the Facilities Condition and Hazards Assessment to provide an updated assessment and verification of the residual hazardous materials remaining in the Sodium Processing Facilities processing system. The hazardous materials of concern are sodium and sodium hydroxide (caustic). The information supplied in this report supports the end-point objectives identified in the Transition Plan for Multiple Facilities at the Materials and Fuels Complex, Advanced Test Reactor, Central Facilities Area, and Power Burst Facility, as well as the deactivation and decommissioning critical decision milestone 1, as specified in U.S. Department of Energy Guide 413.3-8, "Environmental Management Cleanup Projects." Using a tailored approach based on information obtained through a combination of process knowledge, emergency management hazardous assessment documentation, and visual inspection, this report provides sufficient detail regarding the quantity of hazardous materials for the purposes of facility transfer; it also concludes that further characterization/verification of these materials is unnecessary.
PRO-Elicere: A Study for Creating a New Process of Dependability Analysis of Space Computer Systems
NASA Astrophysics Data System (ADS)
da Silva, Glauco; Netto Lahoz, Carlos Henrique
2013-09-01
This paper presents a new approach to computer system dependability analysis, called PRO-ELICERE, which introduces data mining concepts and intelligent decision-support mechanisms to analyze the potential hazards and failures of a critical computer system. It also presents some of the techniques and tools that support traditional dependability analysis, and briefly discusses the concept of knowledge discovery and intelligent databases for critical computer systems. It then introduces the PRO-ELICERE process, an intelligent approach to automating ELICERE, a process created to extract non-functional requirements for critical computer systems. PRO-ELICERE can be used in the V&V activities of Institute of Aeronautics and Space projects, such as the Brazilian Satellite Launcher (VLS-1).
Federal Register 2010, 2011, 2012, 2013, 2014
2011-01-27
... DEPARTMENT OF HEALTH AND HUMAN SERVICES Food and Drug Administration [Docket No. FDA-2010-N-0357...; Hazard Analysis and Critical Control Point Procedures for the Safe and Sanitary Processing and Importing of Juice AGENCY: Food and Drug Administration, HHS. ACTION: Notice. SUMMARY: The Food and Drug...
49 CFR 107.117 - Emergency processing.
Code of Federal Regulations, 2014 CFR
2014-10-01
... Transportation Other Regulations Relating to Transportation PIPELINE AND HAZARDOUS MATERIALS SAFETY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION HAZARDOUS MATERIALS AND OIL TRANSPORTATION HAZARDOUS MATERIALS... than the hazardous material to be transported) that could not be prevented if the application were...
49 CFR 107.117 - Emergency processing.
Code of Federal Regulations, 2011 CFR
2011-10-01
... Transportation Other Regulations Relating to Transportation PIPELINE AND HAZARDOUS MATERIALS SAFETY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION HAZARDOUS MATERIALS AND OIL TRANSPORTATION HAZARDOUS MATERIALS... than the hazardous material to be transported) that could not be prevented if the application were...
NASA Astrophysics Data System (ADS)
Hung, Hung-Chih; Wu, Ju-Yu; Hung, Chih-Hsuan
2017-04-01
1. Background. The Asia-Pacific region is one of the most vulnerable areas of the world to climate-related hazards and extremes due to rapid urbanization and over-development in hazard-prone areas. It is thus increasingly recognized that the management of land use and the reduction of hazard risk are inextricably linked. This is especially critical from the perspective of integrated river basin management. A range of studies has targeted existing vulnerability assessments. However, limited attention has been paid to the cumulative effects of multiple vulnerability factors and their dynamics faced by local communities. This study proposes a novel methodology to assess changing cumulative vulnerability to climate-related hazards, and to examine the relationship between the attraction factors relevant to the general process of urbanization and vulnerability change, with a focus on a river basin management unit. 2. Methods and data. The methods applied in this study comprise three steps. First, using the Intergovernmental Panel on Climate Change (IPCC) approach, a Cumulative Vulnerability Assessment Framework (CVAF) is built with the goal of characterizing and comparing vulnerability to climate-related hazards within river basin regions based on a composition of multiple indicators. We organize these indicator metrics into three categories: (1) hazard exposure, (2) socioeconomic sensitivity, and (3) adaptive capacity. Second, the CVAF is applied by combining a geographical information system (GIS)-based spatial statistics technique with a multicriteria decision analysis (MCDA) to assess and map the changing cumulative vulnerability, comparing conditions in 1996 and 2006 in the Danshui River Basin, Taiwan. Third, to examine the factors affecting vulnerability change, we develop a Vulnerability Changing Model (VCM) using four attraction factors to reflect how the process of urban development leads to vulnerability change: transport networks, land uses, production values of industries, and infrastructures. We then conduct a regression analysis to test the VCM. To illustrate the proposed methodology, data are collected from the National Science and Technology Center for Disaster Reduction, Taiwan, as well as the National Land Use Investigation and official census statistics. 3. Results and policy implications. Results of the CVAF analysis demonstrate heterogeneous patterns of vulnerability in the region and highlight trends of long-term change. The vulnerable areas unfold as clustered patterns and spatial analogues across regions, rather than being randomly distributed. The highest cumulative vulnerability is concentrated in the densely populated, downstream reaches of the Danshui River (such as Taipei City) in both time periods. The VCM indicates that although upstream and more remote areas generally show low vulnerability, increases are observed in some areas between 1996 and 2006 due to land-use intensification and industrial and infrastructure expansion. These findings suggest that land use planning should consider the socioeconomic progression and infrastructure investment factors that contribute to urban sprawl, and address current as well as future urban developments vulnerable to hazard risk transmission. The cumulative vulnerability assessment, mapping methods and modelling presented here can be applied to other climate change and hazard risks to highlight priority areas for further investigation and contribute towards improving river basin management.
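A minimal sketch of how indicator metrics in the three CVAF categories might be combined into a cumulative vulnerability score; the indicator names, equal weights, and min-max normalization below are illustrative assumptions, not the study's actual framework:

```python
import numpy as np

def normalize(x):
    """Min-max normalize an indicator array to [0, 1]."""
    x = np.asarray(x, dtype=float)
    return (x - x.min()) / (x.max() - x.min())

def cumulative_vulnerability(exposure, sensitivity, adaptive_capacity,
                             weights=(1/3, 1/3, 1/3)):
    """Weighted composite: higher exposure and sensitivity raise vulnerability,
    higher adaptive capacity lowers it. Equal weights are an assumption."""
    w_e, w_s, w_a = weights
    return (w_e * normalize(exposure)
            + w_s * normalize(sensitivity)
            - w_a * normalize(adaptive_capacity))

# Toy data for five districts (hypothetical values).
exposure = [0.8, 0.3, 0.6, 0.9, 0.2]       # e.g. share of area in floodplain
sensitivity = [0.5, 0.4, 0.7, 0.8, 0.1]    # e.g. population density index
adaptive = [0.2, 0.9, 0.4, 0.3, 0.8]       # e.g. infrastructure/income index
print(cumulative_vulnerability(exposure, sensitivity, adaptive))
```

In a real CVAF the scores would be computed per map unit and fed into the GIS-based spatial statistics and MCDA steps described above.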
Food safety issues affecting the dairy beef industry.
Stefan, G
1997-12-01
The ability of dairy farmers to market cull cows and veal calves may be affected by the final rule on Pathogen Reduction and HACCP (Hazard Analysis Critical Control Points) Systems, a sweeping reform of USDA food safety regulations that was published on July 25, 1996. Although the regulations apply only to slaughter and processing plants handling meat and poultry, the rule will have an impact on food animal producers, including dairy farmers. Under this regulation, plant operators are required to evaluate potential hazards and to devise and implement controls that are appropriate for each product and plant to prevent or reduce those hazards. Processing plants may need to consider the potential hazards associated with incoming animals, such as illegal drug residues, which may result in marked changes in the relationships among some producers, livestock markets, and slaughter plants. Such information may actually improve the marketability of some animal classes because documentation will help the packer ensure the safety of products for sale to domestic and foreign markets. Dairy scientists are in an excellent position to explain the food safety issues to dairy farmers and to help develop the appropriate strategies that are necessary to guide the changes needed. These scientists can be conduits for information, the research leaders for practical solutions to reduce public health risks, and valuable resources to help farmers adjust to the impact of these new in-plant regulatory systems.
NASA Astrophysics Data System (ADS)
Su, Weizhong
2017-03-01
There is growing interest in using the urban landscape for stormwater management studies, where land patterns and processes can be important controls on the sustainability of urban development and planning. This paper proposes an original index, the Major Hazard Oriented Level (MHOL), and investigates the structure distribution, driving factors, and control suggestions for urban-rural land growth in flood-prone areas of the Taihu Lake watershed, China. The MHOL of incremental urban-rural land increased from M 31.51 during the years 1985-1995 to M 38.37 during the years 1995-2010 (where M denotes a medium structure distribution and the number denotes the high-hazard value). The index shows that urban-rural land was distributed uniformly across flood hazard levels and tended to move rapidly into high-hazard areas, where 72.68% of incremental urban-rural land was aggregated, mostly in new urban districts along the Huning traffic line and the Yangtze River. Thus, the current accelerating growth of new urban districts could account for the expanding exposure in high-hazard areas. New districts are driven by the powerful link between land financial benefits and political achievements for local governments, and by the past unsustainable process of "single objective" oriented planning. A categorical correlation analysis of current development intensity against the carrying capacity of hydrological ecosystems for sub-basins was used to determine four types of development areas and provide decision makers with indications for the future watershed-scale subdivision of the Major Function Oriented Zoning implemented by the Chinese government.
New Directions in Software Quality Assurance Automation
2009-06-01
... generation process. 4.1 Parameterized Safety Analysis: a qualitative analysis can be done as well, asking questions like "what has contributed to this ..." Given the probability of interception p1 in the previous example, we can determine what impact those parameters have on the probability of hazardous ... It is assumed that the AEG is traversed top-down and left-to-right, and only once, to produce a particular event trace. Randomized decisions about what ...
Chen, Ting; Jin, Yiying; Qiu, Xiaopeng; Chen, Xin
2015-03-01
Using laboratory experiments, the authors investigated the impact of dry-heat and moist-heat treatment processes on hazardous trace elements (As, Hg, Cd, Cr, and Pb) in food waste and explored their distribution patterns across three waste components: oil, aqueous, and solid. The results indicated an insignificant reduction of hazardous trace elements in heat-treated waste (0.61-14.29% after moist-heat treatment and 4.53-12.25% after dry-heat treatment) and a significant reduction of hazardous trace elements (except for Hg without external addition) after centrifugal dehydration (P < 0.5). Moreover, after heat treatment, over 90% of the hazardous trace elements in the waste were detected in the aqueous and solid components, whereas only a trace amount was detected in the oil component (<0.01%). Heat treatment alone therefore did not significantly reduce the concentration of hazardous trace elements in food waste, but a solid-aqueous separation process such as centrifugal dehydration could reduce the risk considerably. Finally, combined with solid-liquid separation technology, dry-heat treatment is superior to moist-heat treatment for the removal of external water-soluble ionic hazardous trace elements.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dionne, B.J.; Morris, S.C. III; Baum, J.W.
1998-01-01
The Department of Energy's (DOE) Office of Environment, Safety, and Health (EH) sought examples of risk-based approaches to environmental restoration to include in their guidance for DOE nuclear facilities. Extensive measurements of radiological contamination in soil and ground water have been made at Brookhaven National Laboratory's Hazardous Waste Management Facility (HWMF) as part of a Comprehensive Environmental Response, Compensation and Liability Act (CERCLA) remediation process. This provided an ideal opportunity for a case study. This report provides a risk assessment and an "As Low As Reasonably Achievable" (ALARA) analysis for use at other DOE nuclear facilities as an example of a risk-based decision technique. This document contains the appendices for the report.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-10-29
...The Food and Drug Administration (FDA) is proposing regulations for domestic and foreign facilities that are required to register under the Federal Food, Drug, and Cosmetic Act (the FD&C Act) to establish requirements for current good manufacturing practice in manufacturing, processing, packing, and holding of animal food. FDA also is proposing regulations to require that certain facilities establish and implement hazard analysis and risk-based preventive controls for food for animals. FDA is taking this action to provide greater assurance that animal food is safe and will not cause illness or injury to animals or humans and is intended to build an animal food safety system for the future that makes modern, science and risk-based preventive controls the norm across all sectors of the animal food system.
Using Qualitative Hazard Analysis to Guide Quantitative Safety Analysis
NASA Technical Reports Server (NTRS)
Shortle, J. F.; Allocco, M.
2005-01-01
Quantitative methods can be beneficial in many types of safety investigations. However, there are many difficulties in using quantitative methods. For example, there may be little relevant data available. This paper proposes a framework for using qualitative hazard analysis to prioritize the hazard scenarios most suitable for quantitative analysis. The framework first categorizes hazard scenarios by severity and likelihood. We then propose another metric, "modeling difficulty", that describes the complexity of modeling a given hazard scenario quantitatively. The combined metrics of severity, likelihood, and modeling difficulty help to prioritize the hazard scenarios for which quantitative analysis should be applied. We have applied this methodology to proposed concepts of operations for reduced wake separation for airplane operations at closely spaced parallel runways.
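A minimal sketch of the kind of scoring such a framework implies; the ordinal scales, the scenario names, and the rule for combining severity, likelihood, and modeling difficulty are assumptions for illustration, not the authors' actual scheme:

```python
from dataclasses import dataclass

@dataclass
class HazardScenario:
    name: str
    severity: int             # ordinal 1 (minor) .. 4 (catastrophic)
    likelihood: int           # ordinal 1 (improbable) .. 4 (frequent)
    modeling_difficulty: int  # ordinal 1 (easy to model) .. 4 (intractable)

def priority(s: HazardScenario) -> float:
    """Rank high-risk, easy-to-model scenarios first.
    Risk is severity x likelihood; dividing by modeling difficulty
    pushes hard-to-quantify scenarios down the list (assumed rule)."""
    return s.severity * s.likelihood / s.modeling_difficulty

scenarios = [
    HazardScenario("wake encounter on parallel approach", 4, 2, 2),
    HazardScenario("blunder into adjacent approach path", 4, 1, 4),
    HazardScenario("missed-approach conflict", 3, 2, 1),
]
for s in sorted(scenarios, key=priority, reverse=True):
    print(f"{priority(s):5.2f}  {s.name}")
```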
Large Scale Flood Risk Analysis using a New Hyper-resolution Population Dataset
NASA Astrophysics Data System (ADS)
Smith, A.; Neal, J. C.; Bates, P. D.; Quinn, N.; Wing, O.
2017-12-01
Here we present the first national-scale flood risk analyses using high-resolution Facebook Connectivity Lab population data and data from a hyper-resolution flood hazard model. In recent years the field of large-scale hydraulic modelling has been transformed by new remotely sensed datasets, improved process representation, highly efficient flow algorithms and increases in computational power. These developments have allowed flood risk analysis to be undertaken in previously unmodelled territories and from continental to global scales. Flood risk analyses are typically conducted via the integration of modelled water depths with an exposure dataset. Over large scales and in data-poor areas, these exposure data typically take the form of a gridded population dataset, estimating population density using remotely sensed data and/or locally available census data. The local nature of flooding dictates that, for robust flood risk analysis to be undertaken, both hazard and exposure data should sufficiently resolve local-scale features. Global flood frameworks now enable flood hazard data to be produced at 90 m resolution, resulting in a mismatch with available population datasets, which are typically more coarsely resolved. Moreover, these exposure data are typically focused on urban areas and struggle to represent rural populations. In this study we integrate a new population dataset with a global flood hazard model. The population dataset was produced by the Connectivity Lab at Facebook, providing gridded population data at 5 m resolution, a resolution increase over previous countrywide datasets of multiple orders of magnitude. Flood risk analyses undertaken over a number of developing countries are presented, along with a comparison of flood risk analyses undertaken using pre-existing population datasets.
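A minimal sketch of the hazard-exposure integration step described above; the depth values, population counts, and flood threshold are toy placeholders, and both grids are assumed already co-registered (real analyses must first align a ~90 m hazard grid with a finer population grid):

```python
import numpy as np

# Toy grids: water depth (m) from a hazard model and gridded population counts.
depth = np.array([[0.0, 0.2, 1.5],
                  [0.0, 0.8, 2.1],
                  [0.1, 0.0, 0.4]])
population = np.array([[120,  80,  15],
                       [200,  60,   5],
                       [300, 150,  90]])

FLOOD_THRESHOLD_M = 0.3   # depth above which a cell counts as flooded (assumed)
exposed = population[depth > FLOOD_THRESHOLD_M].sum()
print(f"Population exposed: {exposed} of {population.sum()}")
```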
1985-12-01
[Report documentation page garbled in source extraction; recoverable content follows.] Industrial Processes to Reduce Generation of Hazardous Waste: a report prepared for the Department of Defense (DOD) by CH2M HILL and PEER Consultants, Inc., for the purpose of reducing hazardous waste generation from DOD industrial processes. It is not ...
Hazard Interactions and Interaction Networks (Cascades) within Multi-Hazard Methodologies
NASA Astrophysics Data System (ADS)
Gill, Joel; Malamud, Bruce D.
2016-04-01
Here we combine research and commentary to reinforce the importance of integrating hazard interactions and interaction networks (cascades) into multi-hazard methodologies. We present a synthesis of the differences between 'multi-layer single hazard' approaches and 'multi-hazard' approaches that integrate such interactions. This synthesis suggests that ignoring interactions could distort management priorities, increase vulnerability to other spatially relevant hazards or underestimate disaster risk. We proceed to present an enhanced multi-hazard framework, through the following steps: (i) describe and define three groups (natural hazards, anthropogenic processes and technological hazards/disasters) as relevant components of a multi-hazard environment; (ii) outline three types of interaction relationship (triggering, increased probability, and catalysis/impedance); and (iii) assess the importance of networks of interactions (cascades) through case-study examples (based on literature, field observations and semi-structured interviews). We further propose visualisation frameworks to represent these networks of interactions. Our approach reinforces the importance of integrating interactions between natural hazards, anthropogenic processes and technological hazards/disasters into enhanced multi-hazard methodologies. Multi-hazard approaches support the holistic assessment of hazard potential, and consequently disaster risk. We conclude by describing three ways by which understanding networks of interactions contributes to the theoretical and practical understanding of hazards, disaster risk reduction and Earth system management. Understanding interactions and interaction networks helps us to better (i) model the observed reality of disaster events, (ii) constrain potential changes in physical and social vulnerability between successive hazards, and (iii) prioritise resource allocation for mitigation and disaster risk reduction.
Supplemental Hazard Analysis and Risk Assessment - Hydrotreater
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lowry, Peter P.; Wagner, Katie A.
A supplemental hazard analysis was conducted and a quantitative risk assessment performed in response to an independent review comment received by the Pacific Northwest National Laboratory (PNNL) from the U.S. Department of Energy Pacific Northwest Field Office (PNSO) against the Hydrotreater/Distillation Column Hazard Analysis Report issued in April 2013. The supplemental analysis used the hazardous conditions documented by the previous April 2013 report as a basis. The conditions were screened and grouped for the purpose of identifying whether additional prudent, practical hazard controls could be identified, using a quantitative risk evaluation to assess the adequacy of the controls and establish a lower level of concern for the likelihood of potential serious accidents. Calculations were performed to support conclusions where necessary.
NASA Astrophysics Data System (ADS)
Sobradelo, R.; Martí, J.; Mendoza-Rosas, A. T.; Gómez, G.
2012-04-01
The Canary Islands are a densely populated active volcanic region, visited by several million tourists every year. Nearly twenty eruptions have been reported through written chronicles in the last 600 years, suggesting that the probability of a new eruption in the near future is far from zero. This shows the importance of assessing and monitoring the volcanic hazard of the region in order to reduce and manage its potential volcanic risk, and ultimately contribute to the design of appropriate preparedness plans. Hence, the probabilistic analysis of the volcanic eruption time series for the Canary Islands is an essential step for the assessment of volcanic hazard and risk in the area. Such a series describes complex processes involving different types of eruptions over different time scales. Here we propose a statistical method for calculating the probabilities of future eruptions which is most appropriate given the nature of the documented historical eruptive data. We first characterise the eruptions by their magnitudes, and then carry out a preliminary analysis of the data to establish the requirements for the statistical method. Past studies of eruptive time series used conventional statistics and treated the series as a homogeneous process. In this paper, we use a method that accounts for the time-dependence of the series and includes rare or extreme events, in the form of sparse data on large eruptions, since such data require special methods of analysis. Hence, we use a statistical method from extreme value theory. In particular, we apply a non-homogeneous Poisson process to the historical eruptive data of the Canary Islands to estimate the probability of having at least one volcanic event of a magnitude greater than one in the upcoming years. Shortly after the publication of this method, an eruption took place on the island of El Hierro for the first time in historical times, supporting our method and contributing towards the validation of our results.
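For a non-homogeneous Poisson process with time-varying intensity lambda(t), the probability the abstract refers to (at least one qualifying event in a forecast window) takes the standard textbook form below; this is the generic NHPP result, not the paper's specific parameterisation:

```latex
P\{N(t,\, t+\Delta t) \ge 1\} \;=\; 1 - \exp\!\left(-\int_{t}^{t+\Delta t} \lambda(u)\, du\right)
```

For a homogeneous process, lambda(u) is constant and the integral reduces to lambda times Delta t; the time-dependence of lambda(u) is precisely what distinguishes the method described above from conventional constant-rate analyses.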
Tracking Hazard Analysis Data in a Jungle of Changing Design
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sullivan, Robin S.; Young, Jonathan
2006-05-16
Tracking hazard analysis data during the 'life cycle' of a project can be an extremely complicated task. However, a few simple rules, used consistently, can give you the edge that will save countless headaches and provide the information that will help integrate the hazard analysis and design activities even if performed in parallel.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-02-24
... 0584-AD65 School Food Safety Program Based on Hazard Analysis and Critical Control Point Principles... Safety Program Based on Hazard Analysis and Critical Control Point Principles (HACCP) was published on... of Management and Budget (OMB) cleared the associated information collection requirements (ICR) on...
Hazard Analysis for the Mark III Space Suit Assembly (SSA) Used in One-g Operations
NASA Technical Reports Server (NTRS)
Mitchell, Kate; Ross, Amy; Blanco, Raul; Wood, Art
2012-01-01
This Hazard Analysis document encompasses the Mark III Space Suit Assembly (SSA) and associated ancillary equipment. It has been prepared using JSC17773, "Preparing Hazard Analyses for JSC Ground Operation", as a guide. The purpose of this document is to present the potential hazards involved in ground (23% maximum O2, One-g) operations of the Mark III and its associated ancillary support equipment. The hazards listed in this document are specific to suit operations only; each supporting facility (Bldg. 9, etc.) is responsible for test-specific Hazard Analyses. A "hazard" is defined as any condition that has the potential for harming personnel or equipment. This analysis was performed to document the safety aspects associated with manned use of the Mark III for pressurized and unpressurized ambient, ground-based, One-g human testing. The hazards identified herein represent generic hazards inherent to all standard JSC test venues for nominal ground test configurations. Non-standard test venues or test-specific configurations may warrant consideration of additional hazards analysis prior to test. The cognizant suit engineer is responsible for the safety of the astronaut/test subject, space suit, and suit support personnel. The test requester, for the test supported by the suit test engineer and suited subject, is responsible for overall safety and any necessary Test Readiness Reviews (TRR).
Vulnerabilities, Influences and Interaction Paths: Failure Data for Integrated System Risk Analysis
NASA Technical Reports Server (NTRS)
Malin, Jane T.; Fleming, Land
2006-01-01
We describe graph-based analysis methods for identifying and analyzing cross-subsystem interaction risks from subsystem connectivity information. By discovering external and remote influences that would otherwise be unexpected, these methods can support better communication among subsystem designers at points of potential conflict and support the design of more dependable and diagnosable systems. These methods identify hazard causes that can impact vulnerable functions or entities if propagated across interaction paths from the hazard source to the vulnerable target. The analysis can also assess the combined impacts of And-Or trees of disabling influences. The analysis can use ratings of hazards and vulnerabilities to calculate cumulative measures of severity and importance. Identification of cross-subsystem hazard-vulnerability pairs and propagation paths across subsystems will increase the coverage of hazard and risk analysis and can indicate risk control and protection strategies.
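A minimal sketch of the propagation idea; the subsystem names and edge list are invented for illustration, and the paper's actual graph model and rating arithmetic may differ:

```python
from collections import deque

# Directed interaction graph: an edge A -> B means a condition in A
# can propagate an influence to B (hypothetical subsystems).
edges = {
    "power_bus":    ["avionics", "thermal_ctrl"],
    "thermal_ctrl": ["avionics"],
    "avionics":     ["comm"],
    "comm":         [],
}

def reachable_targets(source):
    """BFS from a hazard source to find all subsystems its influence
    can reach across interaction paths."""
    seen, queue = {source}, deque([source])
    while queue:
        node = queue.popleft()
        for nxt in edges.get(node, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen - {source}

# A hazard originating in the power bus can reach vulnerable targets in:
print(reachable_targets("power_bus"))  # {'avionics', 'thermal_ctrl', 'comm'}
```

Pairing each reachable target with hazard and vulnerability ratings along the path would give the cumulative severity measures the abstract describes.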
36 CFR 218.12 - Timing of authorized hazardous fuel reduction project decision.
Code of Federal Regulations, 2010 CFR
2010-07-01
... SERVICE, DEPARTMENT OF AGRICULTURE PREDECISIONAL ADMINISTRATIVE REVIEW PROCESSES Predecisional Administrative Review Process for Hazardous Fuel Reduction Projects Authorized by the Healthy Forests Restoration...
Quandt, Sara A; Arcury-Quandt, Alice E; Lawlor, Emma J; Carrillo, Lourdes; Marín, Antonio J; Grzywacz, Joseph G; Arcury, Thomas A
2013-02-01
This study uses qualitative data to describe the tasks performed by chicken catchers, their organization of work, and possible health and safety hazards encountered. Twenty-one Latino immigrant chicken catchers for North Carolina poultry-processing plants were interviewed to obtain their perceptions of the job and its hazards. Interviews were recorded and transcribed (n = 10) or detailed notes recorded (n = 11). Transcripts and notes were subjected to qualitative analysis. Chicken catching takes place in a highly contaminated and hazardous work environment. The fast pace of work, low level of control over work intensity, and piece rate compensation all result in high potential for work-related injury and illness, including trauma, electrical shock, respiratory effects, musculoskeletal injuries, and drug use. Workers receive little safety or job training. Chicken catching is characterized by a work environment and organization of work that promote injury and illness. Copyright © 2012 Wiley Periodicals, Inc.
NASA Astrophysics Data System (ADS)
Giordan, Daniele; Hayakawa, Yuichi; Nex, Francesco; Remondino, Fabio; Tarolli, Paolo
2018-04-01
The number of scientific studies considering possible applications of remotely piloted aircraft systems (RPASs) for the management of natural hazard effects and the identification of resulting damage has increased strongly in the last decade. The use of these systems is no longer a novelty in the scientific community, but a deeper analysis of the literature shows a lack of codified, complex methodologies that can be used not only for scientific experiments but also for routine, codified emergency operations. RPASs can acquire on-demand ultra-high-resolution images that can be used to identify active processes such as landslides or volcanic activity, and also to define the effects of earthquakes, wildfires and floods. In this paper, we present a review of the published literature describing experimental methodologies developed for the study and monitoring of natural hazards.
DTREEv2, a computer-based support system for the risk assessment of genetically modified plants.
Pertry, Ine; Nothegger, Clemens; Sweet, Jeremy; Kuiper, Harry; Davies, Howard; Iserentant, Dirk; Hull, Roger; Mezzetti, Bruno; Messens, Kathy; De Loose, Marc; de Oliveira, Dulce; Burssens, Sylvia; Gheysen, Godelieve; Tzotzos, George
2014-03-25
Risk assessment of genetically modified organisms (GMOs) remains a contentious area and a major factor influencing the adoption of agricultural biotechnology. Methodologically, in many countries risk assessment is conducted by expert committees with little or no recourse to databases and expert systems that could facilitate the risk assessment process. In this paper we describe DTREEv2, a computer-based decision support system for the identification of hazards related to the introduction of GM crops into the environment. DTREEv2 structures hazard identification and evaluation by means of an event-tree type of analysis. The system produces an output flagging identified hazards and potential risks. It is intended to be used for the preparation and evaluation of biosafety dossiers and, as such, its usefulness extends to researchers, risk assessors and regulators in government and industry. Copyright © 2013 Elsevier B.V. All rights reserved.
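A minimal sketch of event-tree-style hazard flagging of the kind such a system structures; the questions, branch structure, and flag texts are hypothetical and do not reproduce DTREEv2's actual trees:

```python
# Each node asks a yes/no question; leaves carry the hazard flag to report.
tree = {
    "q": "Does the trait confer herbicide tolerance?",
    "yes": {
        "q": "Are sexually compatible wild relatives present?",
        "yes": {"flag": "potential gene flow -> weediness hazard"},
        "no":  {"flag": "monitor for volunteer persistence"},
    },
    "no": {"flag": "no hazard flagged on this branch"},
}

def evaluate(node, answers):
    """Walk the event tree using a dict of pre-supplied answers."""
    while "flag" not in node:
        node = node["yes"] if answers[node["q"]] else node["no"]
    return node["flag"]

answers = {
    "Does the trait confer herbicide tolerance?": True,
    "Are sexually compatible wild relatives present?": True,
}
print(evaluate(tree, answers))  # potential gene flow -> weediness hazard
```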
1996-01-01
... failure as due to an adhesive layer between the foil and inner polypropylene layers. Under subcontract, NFPA provided HACCP draft manuals for the ... parameters of the production process and to ensure that they are within their target values. In addition, a HACCP program was used to assure product ... played an important part in implementing Hazard Analysis Critical Control Points (HACCP) as part of the Process and Quality Control manual. The National ...
Integrated Safety Analysis Tiers
NASA Technical Reports Server (NTRS)
Shackelford, Carla; McNairy, Lisa; Wetherholt, Jon
2009-01-01
Commercial partnerships and organizational constraints, combined with complex systems, may lead to the division of hazard analysis across organizations. This division could cause important hazards to be overlooked, causes to be missed, controls for a hazard to be incomplete, or verifications to be inefficient. Each organization's team must understand at least one level beyond the interface well enough to comprehend integrated hazards. This paper discusses various ways to properly divide analysis among organizations. The Ares I launch vehicle integrated safety analysis effort is used to illustrate an approach that addresses the key issues and concerns arising from multiple analysis responsibilities.
NASA Astrophysics Data System (ADS)
Nicolet, P.; Choffet, M.; Jaboyedoff, M.; Lauraux, B.; Lance, J.-. L.; Champod, E.
2012-04-01
Natural hazards are being mapped in the Vaud County, and sinkhole hazard is part of this process. A preliminary hazard map has already been produced based on sinkhole occurrences and on several other parameters such as rock types, proximity to major faults and closed basins. The detailed hazard map, essentially based on the occurrence of sinkholes, is in progress. This map will influence land-use planning. At present, the public building insurance covers the damage costs due to all mapped natural hazards except sinkholes. For the sake of consistency, and due to political pressure, the insurance company wants to integrate this phenomenon into its coverage. This study aims to assess the potential damage costs to buildings induced by this decision. Karstic processes are active in two regions of the County. The first, the Folded Jura, is composed of Jurassic and Cretaceous carbonate rocks. The second, the Prealpine region, is composed of both carbonate and evaporitic rocks. Even if cavities in carbonate rocks can cause difficulties during building construction, karst development in these rocks is relatively slow and, as a result, the expected damage cost to buildings is relatively low. In contrast, the evaporitic rocks are likely to cause significant subsidence or sudden collapses on a human time scale and thus damage buildings. Therefore, our study focuses on the region where this rock type occurs. According to the preliminary hazard map, more than 10'000 buildings are concerned, although some of them are in zones with very low potential. Based on estimated frequencies of collapse events and subsidence rates, the potential damage cost is estimated. Furthermore, the number of potential claims is also considered, in order to anticipate the additional workload for the insurance company. Careful attention is also given to the potential development of building zones. In addition, the project provides guidelines for the insurance company, focusing on structural measures that reduce building vulnerability.
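A minimal sketch of the kind of expected-cost estimate described above; the zone names, event frequencies, damage ratios, and building values are invented placeholders, not the study's figures:

```python
# Expected annual loss: sum over hazard zones of
#   event frequency x mean damage ratio x exposed building value.
zones = [
    # (zone, annual collapse/subsidence frequency, mean damage ratio, value CHF)
    ("high potential",     1e-3, 0.30, 2.0e9),
    ("moderate potential", 3e-4, 0.15, 5.0e9),
    ("very low potential", 1e-5, 0.05, 8.0e9),
]

expected_annual_loss = sum(freq * ratio * value
                           for _, freq, ratio, value in zones)
print(f"Expected annual loss: CHF {expected_annual_loss:,.0f}")
```

Dividing each zone's frequency by an assumed mean claim size would similarly give the expected number of claims per year, the other quantity the study considers.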
2013 Los Alamos National Laboratory Hazardous Waste Minimization Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Salzman, Sonja L.; English, Charles J.
2015-08-24
Waste minimization and pollution prevention are inherent goals within the operating procedures of Los Alamos National Security, LLC (LANS). The US Department of Energy (DOE) and LANS are required to submit an annual hazardous waste minimization report to the New Mexico Environment Department (NMED) in accordance with the Los Alamos National Laboratory (LANL or the Laboratory) Hazardous Waste Facility Permit. The report was prepared pursuant to the requirements of Section 2.9 of the LANL Hazardous Waste Facility Permit. This report describes the hazardous waste minimization program (a component of the overall Waste Minimization/Pollution Prevention [WMin/PP] Program) administered by the Environmental Stewardship Group (ENV-ES). This report also supports the waste minimization and pollution prevention goals of the Environmental Programs Directorate (EP) organizations that are responsible for implementing remediation activities, and describes its programs to incorporate waste reduction practices into remediation activities and procedures. LANS was very successful in fiscal year (FY) 2013 (October 1-September 30) in WMin/PP efforts. Staff funded four projects specifically related to the reduction of waste with hazardous constituents, and LANS won four national awards for pollution prevention efforts from the National Nuclear Security Administration (NNSA). In FY13, there was no hazardous, mixed transuranic (MTRU), or mixed low-level (MLLW) remediation waste generated at the Laboratory. More hazardous waste, MTRU waste, and MLLW were generated in FY13 than in FY12, and the majority of the increase was related to MTRU processing or lab cleanouts. These accomplishments and an analysis of the waste streams are discussed in much more detail within this report.
Seismic hazard assessment: Issues and alternatives
Wang, Z.
2011-01-01
Seismic hazard and risk are two very important concepts in engineering design and other policy considerations. Although seismic hazard and risk have often been used interchangeably, they are fundamentally different. Furthermore, seismic risk is the more important in engineering design and other policy considerations. Seismic hazard assessment is an effort by earth scientists to quantify seismic hazard and its associated uncertainty in time and space and to provide seismic hazard estimates for seismic risk assessment and other applications. Although seismic hazard assessment is more a scientific issue, it deserves special attention because of its significant implications for society. Two approaches, probabilistic seismic hazard analysis (PSHA) and deterministic seismic hazard analysis (DSHA), are commonly used for seismic hazard assessment. Although PSHA has been proclaimed as the best approach for seismic hazard assessment, it is scientifically flawed (i.e., the physics and mathematics that PSHA is based on are not valid). Use of PSHA could lead to either unsafe or overly conservative engineering design or public policy, each of which has dire consequences for society. On the other hand, DSHA is a viable approach for seismic hazard assessment even though it has been labeled as unreliable. The biggest drawback of DSHA is that the temporal characteristics (i.e., earthquake frequency of occurrence and the associated uncertainty) are often neglected. An alternative, seismic hazard analysis (SHA), utilizes earthquake science and statistics directly and provides a seismic hazard estimate that can be readily used for seismic risk assessment and other applications. © 2010 Springer Basel AG.
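For reference, the standard PSHA rate-of-exceedance integral that this debate concerns is the textbook Cornell-McGuire form below, not a formula from this particular paper:

```latex
\lambda(Y > y) \;=\; \sum_{i} \nu_i \int_{m} \int_{r}
  P(Y > y \mid m, r)\; f_{M_i}(m)\; f_{R_i}(r)\; dm\, dr
```

Here nu_i is the activity rate of source i, f_M and f_R are the magnitude and distance probability densities for that source, and P(Y > y | m, r) comes from a ground-motion model. The criticisms summarized above are directed at the assumptions embedded in this aggregation, not at the algebra itself.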
A Bimodal Hybrid Model for Time-Dependent Probabilistic Seismic Hazard Analysis
NASA Astrophysics Data System (ADS)
Yaghmaei-Sabegh, Saman; Shoaeifar, Nasser; Shoaeifar, Parva
2018-03-01
The evaluation of evidence provided by geological studies and historical catalogs indicates that in some seismic regions and faults, multiple large earthquakes occur in clusters. The occurrence of large earthquakes is then followed by quiescence, during which only small-to-moderate earthquakes take place. Clustering of large earthquakes is the most distinguishable departure from the assumption of constant hazard of random earthquake occurrence made in conventional seismic hazard analysis. In the present study, a time-dependent recurrence model is proposed to consider a series of large earthquakes that occur in clusters. The model is flexible enough to reflect the quasi-periodic behavior of large earthquakes with long-term clustering, and can be used in time-dependent probabilistic seismic hazard analysis for engineering purposes. In this model, the time-dependent hazard results are estimated by a hazard function comprising three parts: a decreasing hazard associated with the last large-earthquake cluster, an increasing hazard associated with the next large-earthquake cluster, and a constant hazard of random occurrence of small-to-moderate earthquakes. In the final part of the paper, the time-dependent seismic hazard of the New Madrid Seismic Zone at different time intervals is calculated for illustrative purposes.
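One way to write such a three-part hazard function; the specific decay and growth shapes below are an illustrative assumption, not the authors' fitted model:

```latex
h(t) \;=\; \underbrace{a\, e^{-t/\tau_1}}_{\text{decaying hazard of the last cluster}}
\;+\; \underbrace{b\,(t/\tau_2)^{k}}_{\text{rising hazard of the next cluster}}
\;+\; \underbrace{\lambda_0}_{\text{constant background rate}}
```

The first term dominates shortly after a cluster, the second grows as the next cluster approaches, and lambda_0 carries the random small-to-moderate seismicity throughout, giving the bimodal time-dependence the title refers to.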
36 CFR 218.4 - Authorized hazardous fuel reduction projects not subject to objection.
Code of Federal Regulations, 2010 CFR
2010-07-01
... SERVICE, DEPARTMENT OF AGRICULTURE PREDECISIONAL ADMINISTRATIVE REVIEW PROCESSES Predecisional Administrative Review Process for Hazardous Fuel Reduction Projects Authorized by the Healthy Forests Restoration...
NASA Astrophysics Data System (ADS)
Wang, X.; Liu, H.; Yao, K.; Wei, Y.
2018-04-01
Analyzing the causes of geological hazards is a complicated process. Using the analysis functions of GIS software, 250 landslides were randomly selected from the 395 landslide hazards in the study area and superimposed with landform types, annual rainfall and vegetation coverage respectively. The box-counting method from fractal dimension theory was used to study the fractal characteristics of the spatial distribution of landslide disasters in the Dachuan district and to analyse the statistical results. The findings show that the fractal dimension of the landslides in the Dachuan area is 0.9114 with a correlation coefficient of 0.9627, indicating high autocorrelation. Zoning statistics for the various natural factors show that the clustering of landslide hazard points is strongest in deep-hill and middle-hill areas, in areas whose average annual rainfall is 1050 mm-1250 mm, and where vegetation coverage is 30%-60%. Superimposing the potential hazard distribution maps of the single influence factors yields the potential hazard zoning of landslides in the area. The zoning map was verified against the 145 remaining disaster points: 74 landslide hazard points fall in the high-risk area, accounting for 51.03% of the total; 59 in the medium-risk area, accounting for 40.69%; and 12 in the low-risk area, accounting for 8.28%. The verification results match the potential hazard zoning well. Therefore, susceptibility classes derived from the fractal dimension value can describe the influence of each factor on geological disaster points more intuitively, delineate potential disaster risk areas, and provide visual data support for the effective management of geological disasters.
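A minimal sketch of the box-counting estimate of fractal dimension for a set of hazard points; the coordinates are random stand-ins, and the grid sizes and log-log regression follow the generic textbook procedure rather than this study's exact settings:

```python
import numpy as np

rng = np.random.default_rng(0)
points = rng.random((250, 2))  # stand-in landslide coordinates in a unit square

def box_count(pts, n_boxes):
    """Count occupied boxes on an n_boxes x n_boxes grid."""
    idx = np.floor(pts * n_boxes).astype(int)
    idx = np.clip(idx, 0, n_boxes - 1)
    return len({(i, j) for i, j in idx})

sizes = np.array([2, 4, 8, 16, 32])
counts = np.array([box_count(points, n) for n in sizes])

# Box dimension = slope of log N(eps) vs log(1/eps), with eps = 1/n_boxes.
slope, _ = np.polyfit(np.log(sizes), np.log(counts), 1)
print(f"Estimated box-counting dimension: {slope:.3f}")
```

For uniformly scattered points the estimate approaches 2; a value well below 2, like the 0.9114 reported above, indicates strong spatial clustering of the hazard points.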
Probabilistic analysis of tsunami hazards
Geist, E.L.; Parsons, T.
2006-01-01
Determining the likelihood of a disaster is a key component of any comprehensive hazard assessment. This is particularly true for tsunamis, even though most tsunami hazard assessments have in the past relied on scenario or deterministic type models. We discuss probabilistic tsunami hazard analysis (PTHA) from the standpoint of integrating computational methods with empirical analysis of past tsunami runup. PTHA is derived from probabilistic seismic hazard analysis (PSHA), with the main difference being that PTHA must account for far-field sources. The computational methods rely on numerical tsunami propagation models rather than empirical attenuation relationships as in PSHA in determining ground motions. Because a number of source parameters affect local tsunami runup height, PTHA can become complex and computationally intensive. Empirical analysis can function in one of two ways, depending on the length and completeness of the tsunami catalog. For site-specific studies where there is sufficient tsunami runup data available, hazard curves can primarily be derived from empirical analysis, with computational methods used to highlight deficiencies in the tsunami catalog. For region-wide analyses and sites where there are little to no tsunami data, a computationally based method such as Monte Carlo simulation is the primary method to establish tsunami hazards. Two case studies that describe how computational and empirical methods can be integrated are presented for Acapulco, Mexico (site-specific) and the U.S. Pacific Northwest coastline (region-wide analysis).
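A minimal sketch of the Monte Carlo route to a tsunami hazard curve mentioned above; the source rate, the magnitude-to-runup scaling, and the lognormal scatter are invented placeholders, since real PTHA drives numerical propagation models rather than a closed-form scaling:

```python
import numpy as np

rng = np.random.default_rng(42)
N_YEARS = 100_000          # length of the synthetic catalog
ANNUAL_RATE = 0.02         # combined source rate, events/yr (assumed)

# Simulate event count and magnitudes, then map magnitude to runup with scatter.
n_events = rng.poisson(ANNUAL_RATE * N_YEARS)
magnitudes = 7.0 + rng.exponential(0.4, n_events)   # Gutenberg-Richter stand-in
median_runup = 10 ** (0.5 * (magnitudes - 7.5))     # assumed scaling, metres
runup = median_runup * rng.lognormal(0.0, 0.5, n_events)

# Hazard curve: annual rate of exceeding each runup level.
for level in [0.5, 1.0, 2.0, 5.0]:
    rate = np.sum(runup > level) / N_YEARS
    print(f"runup > {level:4.1f} m: {rate:.2e} /yr")
```

In a site-specific study, the same exceedance counting would be done against empirical runup records, with simulation filling the catalog's gaps.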
Savage, W.U.; Nishenko, S.P.; Honegger, D.G.; Kempner, L.
2006-01-01
Electric power utilities are familiar with and skilled in preparing for and responding to almost-routine natural hazard events such as strong wind and ice storms and seasonal floods, as well as intentional human acts such as vandalism. Recent extreme weather (hurricanes Katrina and Rita), extremely destructive international earthquakes (in Sumatra and Pakistan), and nation-wide concerns regarding future terrorist attacks have increased the pressure on utilities to take appropriate steps to avoid being overwhelmed by such infrequent and exceedingly severe events. Determining what constitutes the appropriate steps to take requires various levels of understanding of the specific hazards and the risks faced by the utility. The American Lifelines Alliance (www.americanlifelinesalliance.org) has prepared a Guideline that provides clear, concise, and nationally applicable guidance on determining the scope and level of effort necessary to assess power system performance in the wide range of natural hazard or human threat events. Included in this Guideline are specific procedures to follow and information to consider in performing standardized assessments. With the results of such assessments, utility owners can effectively establish and carry out risk management programs that will lead to achieving appropriate levels of performance in future events. The Guideline incorporates an inquiry-driven process with a two-phase performance assessment that can be applied to power systems of any size. The screening phase enables systems or components that are clearly not at risk to be screened out early. The subsequent analysis phase uses results from the screening phase to prioritize and allocate resources for more detailed assessments of hazard, vulnerability, and system performance. This process helps assure that the scope of the assessment meets the specific performance objectives of the inquiry. A case history is presented to illustrate the type of experience with an inquiry-driven process that was considered in developing the Guideline to meet the diverse needs of utility personnel in engineering, operations, and management. Copyright ASCE 2007.
van de Vis, J W; Poelman, M; Lambooij, E; Bégout, M-L; Pilarczyk, M
2012-02-01
The objective was to take a first step in the development of a process-oriented quality assurance (QA) system for monitoring and safeguarding of fish welfare at a company level. A process-oriented approach is focused on preventing hazards and involves establishment of critical steps in a process that requires careful control. The seven principles of the Hazard Analysis Critical Control Points (HACCP) concept were used as a framework to establish the QA system. HACCP is an internationally agreed approach for management of food safety, which was adapted for the purpose of safeguarding and monitoring the welfare of farmed fish. As the main focus of this QA system is farmed fish welfare assurance at a company level, it was named Fish Welfare Assurance System (FWAS). In this paper we present the initial steps of setting up FWAS for the on-growing of sea bass (Dicentrarchus labrax), carp (Cyprinus carpio) and European eel (Anguilla anguilla). Four major hazards were selected, which were fish species dependent. Critical Control Points (CCPs) that need to be controlled to minimize or avoid the four hazards are presented. For FWAS, monitoring of CCPs at a farm level is essential. For monitoring purposes, Operational Welfare Indicators (OWIs) are needed to establish whether critical biotic, abiotic, managerial and environmental factors are controlled. For the OWIs we present critical limits/target values. A critical limit is the maximum or minimum value to which a factor must be controlled at a critical control point to prevent, eliminate or reduce a hazard to an acceptable level. For managerial factors target levels are more appropriate than critical limits. Regarding the international trade of farmed fish products, we propose that FWAS needs to be standardized in aquaculture chains. For this standardization a consensus on the concept of fish welfare, methods to assess welfare objectively and knowledge on the needs of farmed fish are required.
Analysis of On-board Hazard Detection and Avoidance for Safe Lunar Landing
NASA Technical Reports Server (NTRS)
Johnson, Andrew E.; Huertas, Andres; Werner, Robert A.; Montgomery, James F.
2008-01-01
Landing hazard detection and avoidance technology is being pursued within NASA to improve landing safety and increase access to sites of interest on the lunar surface. The performance of a hazard detection and avoidance system depends on properties of the terrain, sensor performance, algorithm design, vehicle characteristics and the overall guidance, navigation and control architecture. This paper analyzes the size of the region that must be imaged, sensor performance parameters and the impact of trajectory angle on hazard detection performance. The analysis shows that vehicle hazard tolerance is the driving parameter for hazard detection system design.
Updated earthquake catalogue for seismic hazard analysis in Pakistan
NASA Astrophysics Data System (ADS)
Khan, Sarfraz; Waseem, Muhammad; Khan, Muhammad Asif; Ahmed, Waqas
2018-03-01
A reliable and homogenized earthquake catalogue is essential for seismic hazard assessment in any area. This article describes the compilation and processing of an updated earthquake catalogue for Pakistan. The earthquake catalogue compiled in this study for the region (quadrangle bounded by the geographical limits 40-83° E and 20-40° N) includes 36,563 earthquake events, which are reported as moment magnitude (Mw) 4.0-8.3 and span from 25 AD to 2016. Relationships are developed between moment magnitude and the body- and surface-wave magnitude scales to unify the catalogue in terms of magnitude Mw. The catalogue includes earthquakes from Pakistan and neighbouring countries to minimize the effects of geopolitical boundaries in seismic hazard assessment studies. Earthquakes reported by local and international agencies as well as individual catalogues are included. The proposed catalogue is further used to obtain the magnitude of completeness after removal of dependent events by using four different algorithms. Finally, seismicity parameters of the seismic sources are reported, and recommendations are made for seismic hazard assessment studies in Pakistan.
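The seismicity parameters mentioned above typically include the magnitude of completeness and the Gutenberg-Richter b-value. Below is a minimal Python sketch of one common combination (maximum-curvature Mc plus Aki's maximum-likelihood b-value), applied to synthetic magnitudes rather than the compiled catalogue and omitting the declustering step:

import numpy as np

def completeness_and_b_value(mags, bin_width=0.1):
    # Maximum-curvature estimate of Mc: the modal magnitude bin.
    mags = np.asarray(mags, dtype=float)
    bins = np.arange(mags.min(), mags.max() + bin_width, bin_width)
    counts, edges = np.histogram(mags, bins=bins)
    mc = edges[np.argmax(counts)]
    complete = mags[mags >= mc]
    # Aki (1965) maximum-likelihood b-value for continuous magnitudes;
    # binned catalogues would need a half-bin-width correction.
    b = np.log10(np.e) / (complete.mean() - mc)
    return mc, b

# Synthetic Gutenberg-Richter sample (true b = 1) complete above M 4.0
rng = np.random.default_rng(1)
mags = 4.0 + rng.exponential(scale=1.0 / np.log(10), size=5000)
mc, b = completeness_and_b_value(mags)
print(f"Mc = {mc:.1f}, b = {b:.2f}")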
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mansfield, N.J.
1992-01-01
The increasing number of hazardous materials accidents in the United States has resulted in new federal regulations addressing the emergency response activities associated with chemical releases. A significant part of these new federal standards (29 CFR 1910.120 and 40 CFR Part 311) requires compliance with specific criteria by all personnel involved in a hazardous material emergency. This study investigated alternative lesson design models applicable to instruction for hazardous material emergencies. A specialized design checklist was created based on the work of Gagne, Briggs, and Wager (1988), Merrill (1987), and Clark (1989). This checklist was used in the development of lesson plan templates for the hazardous materials incident commander course. Qualitative data for establishing learning objectives were collected by conducting a needs assessment and a job analysis of the incident commander position. Incident commanders from 14 public and private organizations participated in the needs assessment process. Technical information for the lessons was collected from appropriate governmental agencies. The implementation of the checklist and lesson plans can contribute to assuring quality training for incident commanders throughout the United States.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ely, Geoffrey P.
2013-10-31
This project uses dynamic rupture simulations to investigate high-frequency seismic energy generation. The relevant phenomena (frictional breakdown, shear heating, effective normal-stress fluctuations, material damage, etc.) controlling rupture are strongly interacting and span many orders of magnitude in spatial scale, requiring high-resolution simulations that couple disparate physical processes (e.g., elastodynamics, thermal weakening, pore-fluid transport, and heat conduction). Compounding the computational challenge, we know that natural faults are not planar, but instead have roughness that can be approximated by power laws, potentially leading to large, multiscale fluctuations in normal stress. The capacity to perform 3D rupture simulations that couple these processes will provide guidance for constructing appropriate source models for high-frequency ground motion simulations. The improved rupture models from our multi-scale dynamic rupture simulations will be used to conduct physics-based (3D waveform-modeling-based) probabilistic seismic hazard analysis (PSHA) for California. These calculations will provide numerous important seismic hazard results, including a state-wide extended earthquake rupture forecast with rupture variations for all significant events, a synthetic seismogram catalog for thousands of scenario events and more than 5000 physics-based seismic hazard curves for California.
Risk Assessment and Scaling for the SLS LH2 ET
NASA Technical Reports Server (NTRS)
Hafiychuk, Halyna; Ponizovskaya-Devine, Ekaterina; Luchinsky, Dmitry; Khasin, Michael; Osipov, Viatcheslav V.; Smelyanskiy, Vadim N.
2012-01-01
In this report the main physics processes in the LH2 tank during prepress and rocket flight are studied. The goal of this investigation is to analyze possible hazards and to make a risk assessment of proposed LH2 tank designs for SLS with 5 engines (the situation with 4 engines is less critical). For the analysis we use the multinode model (MNM) developed by us and presented in a separate report, as well as 3D ANSYS simulations. We carry out simulation and theoretical analysis of physics processes such as (i) accumulation of bubbles in LH2 during the replenish stage and their collapse in the liquid during prepress; (ii) condensation-evaporation at the liquid-vapor interface and tank wall; (iii) heating of the liquid near the interface and wall due to condensation and environmental heat; (iv) injection of hot He during prepress and of hot GH2 during flight; (v) mixing and cooling of the injected gases due to heat transfer between the gases, liquid and the tank wall. We analyze the effects of these physical processes on the thermo- and fluid gas dynamics in the ullage and on the stratification of temperature in the liquid, and assess the associated hazards. Special emphasis is placed on the scaling predictions for the larger SLS LH2 tank.
Dormedy, E S; Brashears, M M; Cutter, C N; Burson, D E
2000-12-01
A 2% lactic acid wash used in a large meat-processing facility was validated as an effective critical control point (CCP) in a hazard analysis and critical control point (HACCP) plan. We examined the microbial profiles of beef carcasses before the acid wash, beef carcasses immediately after the acid wash, beef carcasses 24 h after the acid wash, beef subprimal cuts from the acid-washed carcasses, and on ground beef made from acid-washed carcasses. Total mesophilic, psychrotrophic, coliforms, generic Escherichia coli, lactic acid bacteria, pseudomonads, and acid-tolerant microorganisms were enumerated on all samples. The presence of Salmonella spp. was also determined. Acid washing significantly reduced all counts except for pseudomonads that were present at very low numbers before acid washing. All other counts continued to stay significantly lower (P < 0.05) than those on pre-acid-washed carcasses throughout all processing steps. Total bacteria, coliforms, and generic E. coli enumerated on ground beef samples were more than 1 log cycle lower than those reported in the U.S. Department of Agriculture Baseline data. This study suggests that acid washes may be effective CCPs in HACCP plans and can significantly reduce the total number of microorganisms present on the carcass and during further processing.
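The "log cycle" comparisons in such studies are base-10 logarithm differences of plate counts. A minimal sketch with invented counts, not the study's data:

import math

# Hypothetical aerobic plate counts (CFU/cm^2) before and after a 2%
# lactic acid carcass wash; real validation data would come from the plant.
before_cfu = 1.5e4
after_cfu = 9.0e2

log_reduction = math.log10(before_cfu) - math.log10(after_cfu)
print(f"log10 reduction: {log_reduction:.2f}")  # here, more than 1 log cycle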
Total Diet Studies as a Tool for Ensuring Food Safety
Lee, Joon-Goo; Kim, Sheen-Hee; Kim, Hae-Jung
2015-01-01
With the diversification and internationalization of the food industry and the increased focus on health from a majority of consumers, food safety policies are being implemented based on scientific evidence. Risk analysis represents the most useful scientific approach for making food safety decisions. Total diet study (TDS) is often used as a risk assessment tool to evaluate exposure to hazardous elements. Many countries perform TDSs to screen for chemicals in foods and analyze exposure trends to hazardous elements. TDSs differ from traditional food monitoring in two major aspects: chemicals are analyzed in food in the form in which it will be consumed and it is cost-effective in analyzing composite samples after processing multiple ingredients together. In Korea, TDSs have been conducted to estimate dietary intakes of heavy metals, pesticides, mycotoxins, persistent organic pollutants, and processing contaminants. TDSs need to be carried out periodically to ensure food safety. PMID:26483881
36 CFR 218.3 - Authorized hazardous fuel reduction projects subject to objection.
Code of Federal Regulations, 2010 CFR
2010-07-01
... SERVICE, DEPARTMENT OF AGRICULTURE PREDECISIONAL ADMINISTRATIVE REVIEW PROCESSES Predecisional Administrative Review Process for Hazardous Fuel Reduction Projects Authorized by the Healthy Forests Restoration..., the objection process of this part applies to both the plan amendment and the project. ...
Hazardous Waste: Cleanup and Prevention.
ERIC Educational Resources Information Center
Vandas, Steve; Cronin, Nancy L.
1996-01-01
Discusses hazardous waste, waste disposal, unsafe exposure, movement of hazardous waste, and the Superfund clean-up process that consists of site discovery, site assessment, clean-up method selection, site clean up, and site maintenance. Argues that proper disposal of hazardous waste is everybody's responsibility. (JRH)
40 CFR 266.112 - Regulation of residues.
Code of Federal Regulations, 2012 CFR
2012-07-01
... constituents (toxic constituents) that could reasonably be attributable to the hazardous waste at... processing the hazardous waste (constituents of concern) include toxic constituents in the hazardous waste... shall not be excluded from the definition of a hazardous waste. Concentrations of toxic constituents of...
40 CFR 266.112 - Regulation of residues.
Code of Federal Regulations, 2014 CFR
2014-07-01
... constituents (toxic constituents) that could reasonably be attributable to the hazardous waste at... processing the hazardous waste (constituents of concern) include toxic constituents in the hazardous waste... shall not be excluded from the definition of a hazardous waste. Concentrations of toxic constituents of...
40 CFR 266.112 - Regulation of residues.
Code of Federal Regulations, 2013 CFR
2013-07-01
... constituents (toxic constituents) that could reasonably be attributable to the hazardous waste at... processing the hazardous waste (constituents of concern) include toxic constituents in the hazardous waste... shall not be excluded from the definition of a hazardous waste. Concentrations of toxic constituents of...
Guide for Hydrogen Hazards Analysis on Components and Systems
NASA Technical Reports Server (NTRS)
Beeson, Harold; Woods, Stephen
2003-01-01
The physical and combustion properties of hydrogen give rise to hazards that must be considered when designing and operating a hydrogen system. One of the major concerns in the use of hydrogen is that of fire or detonation because of hydrogen's wide flammability range, low ignition energy, and flame speed. Other concerns include the contact and interaction of hydrogen with materials, such as the hydrogen embrittlement of materials and the formation of hydrogen hydrides. The low temperatures of liquid and slush hydrogen bring other concerns related to material compatibility and pressure control; this is especially important when dissimilar, adjoining materials are involved. The potential hazards arising from these properties and design features necessitate a proper hydrogen hazards analysis before introducing a material, component, or system into hydrogen service. The objective of this guide is to describe the NASA Johnson Space Center White Sands Test Facility hydrogen hazards analysis method that should be performed before hydrogen is used in components and/or systems. The method is consistent with standard practices for analyzing hazards. It is recommended that this analysis be made before implementing a hydrogen component qualification procedure. A hydrogen hazards analysis is a useful tool for hydrogen-system designers, system and safety engineers, and facility managers. A hydrogen hazards analysis can identify problem areas before hydrogen is introduced into a system, preventing damage to hardware, delay or loss of mission or objective, and possible injury or loss of life.
Purification process for 153Gd produced in natural europium targets
Johnsen, Amanda M; Soderquist, Chuck Z; McNamara, Bruce K; Risher, Darrell R
2013-04-23
An alteration of the traditional zinc/zinc-amalgam reduction procedure is described that eliminates both the hazardous mercury and the dangerous hydrogen gas generation. In order to avoid the presence of water and hydrated protons in the working solution, which can oxidize Eu2+ and cause hydrogen gas production, a process utilizing methanol as the process solvent is described. While methanol presents some flammability hazard in a radiological hot cell, it can be better managed and is less of a flammability hazard than hydrogen gas generation.
Wang, Zhi; Liang, Jiabin; Rong, Xing; Zhou, Hao; Duan, Chuanwei; Du, Weijia; Liu, Yimin
2015-12-01
To investigate noise hazards and their influence on hearing loss among workers in the automotive component manufacturing industry, noise levels in the workplaces of automotive component manufacturing enterprises were measured and hearing examinations were performed for workers, in order to analyze the features and exposure levels of noise in each process as well as the influence on hearing loss. Among the manufacturing processes for different products in this industry, the manufacturing of automobile hubs and of suspension and steering systems showed the highest degrees of noise hazard, with standard-exceedance rates of 79.8% and 57.1%, respectively. Among the different technical processes for automotive component manufacturing, punching and casting had the highest degrees of noise hazard, with standard-exceedance rates of 65.0% and 50%, respectively. Workers engaged in automotive air conditioning systems had the highest rate of abnormal hearing (up to 3.1%). In the automotive component manufacturing industry, noise levels seriously exceed the standard. Although the rate of abnormal hearing is lower than the average for the automobile manufacturing industry in China, this rate tends to increase gradually. Sufficient emphasis should be placed on the noise hazard in this industry.
Natural hazard fatalities in Switzerland from 1946 to 2015
NASA Astrophysics Data System (ADS)
Badoux, Alexandre; Andres, Norina; Techel, Frank; Hegg, Christoph
2016-12-01
A database of fatalities caused by natural hazard processes in Switzerland was compiled for the period between 1946 and 2015. Using information from the Swiss flood and landslide damage database and the Swiss destructive avalanche database, the data set was extended back in time and more hazard processes were added by conducting an in-depth search of newspaper reports. The new database now covers all natural hazards common in Switzerland, categorised into seven process types: flood, landslide, rockfall, lightning, windstorm, avalanche and other processes (e.g. ice avalanches, earthquakes). Included were all fatal accidents associated with natural hazard processes in which victims did not expose themselves to an important danger on purpose. The database contains information on 635 natural hazard events causing 1023 fatalities, which corresponds to a mean of 14.6 victims per year. The most common causes of death were snow avalanches (37 %), followed by lightning (16 %), floods (12 %), windstorms (10 %), rockfall (8 %), landslides (7 %) and other processes (9 %). About 50 % of all victims died in one of the 507 single-fatality events; the other half were killed in the 128 multi-fatality events. The number of natural hazard fatalities that occurred annually during our 70-year study period ranged from 2 to 112 and exhibited a distinct decrease over time. While the number of victims in the first three decades (until 1975) ranged from 191 to 269 per decade, it ranged from 47 to 109 in the four following decades. This overall decrease was mainly driven by a considerable decline in the number of avalanche and lightning fatalities. About 75 % of victims were males in all natural hazard events considered together, and this ratio was roughly maintained in all individual process categories except landslides (lower) and other processes (higher). The ratio of male to female victims was most likely to be balanced when deaths occurred at home (in or near a building), a situation that mainly occurred in association with landslides and avalanches. The average age of victims of natural hazards was 35.9 years and, accordingly, the age groups with the largest number of victims were the 20-29 and 30-39 year-old groups, which in combination represented 34 % of all fatalities. It appears that the overall natural hazard mortality rate in Switzerland over the past 70 years has been relatively low in comparison to rates in other countries or rates of other types of fatal accidents in Switzerland. However, a large variability in mortality rates was observed within the country with considerably higher rates in Alpine environments.
Overview of Energy Systems' safety analysis report programs
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1992-03-01
The primary purpose of a Safety Analysis Report (SAR) is to provide a basis for judging the adequacy of a facility's safety. The SAR documents the safety analyses that systematically identify the hazards posed by the facility, analyze the consequences and risk of potential accidents, and describe hazard control measures that protect the health and safety of the public and employees. In addition, some SARs document, as Technical Safety Requirements (TSRs, which include Technical Specifications and Operational Safety Requirements), technical and administrative requirements that ensure the facility is operated within prescribed safety limits. SARs also provide conveniently summarized information that may be used to support procedure development, training, inspections, and other activities necessary to facility operation. This "Overview of Energy Systems Safety Analysis Report Programs" provides an introduction to the programs and processes used in the development and maintenance of the SARs. It also summarizes some of the uses of the SARs within Energy Systems and DOE.
The Use of Experiments and Modeling to Evaluate ...
Symposium Paper This paper reports on a study to examine the thermal decomposition of surrogate CWAs (in this case, Malathion) in a laboratory reactor, analysis of the results using reactor design theory, and subsequent scale-up of the results to a computer simulation of a full-scale commercial hazardous waste incinerator processing ceiling tile contaminated with residual Malathion.
NASA Astrophysics Data System (ADS)
Jones, L. M.; Bawden, G. W.; Bowers, J.; Cannon, S.; Cox, D. A.; Fisher, R.; Keeley, J.; Perry, S. C.; Plumlee, G. S.; Wood, N. J.
2009-12-01
The “Station” fire, the largest fire in the history of Los Angeles County in southern California, began on August 26, 2009 and as of the abstract deadline had burned over 150,000 acres of the Angeles National Forest. This fire creates both a demand and an opportunity for hazards science to be used by the communities directly hit by the fire, as well as those downstream of possible postfire impacts. The Multi Hazards Demonstration Project of the USGS is deploying several types of scientific response, including 1) evaluation of potential debris-flow hazards and associated risk, 2) monitoring physical conditions in burned areas and the hydrologic response to rainstorms, 3) increased streamflow monitoring, 4) ash analysis and ground water contamination, 5) ecosystem response and endangered species rescue, 6) lidar data acquisition for evaluations of biomass loss, detailed mapping of the physical processes that lead to debris-flow generation, and other geologic investigations. The Multi Hazards Demonstration Project is working with the southern California community to use the resulting information to better manage the social consequences of the fire and its secondary hazards. In particular, we are working with Los Angeles County to determine what information they need to prioritize recovery efforts. For instance, maps of hazards specific to debris flow potential can help identify the highest priority areas for debris flow mitigation efforts. These same maps together with ecosystem studies will help land managers determine whether individuals from endangered species should be removed to zoos or other refuges during the rainy months. The ash analysis will help water managers prevent contamination to water supplies. Plans are just beginning for a public information campaign with Los Angeles County about the risk posed by potential debris flows that should be underway in December. Activities from the fire response will support the development of the Wildfire Scenario in 2011, which will examine implications of land-use decisions in the frequency of fires in southern California.
An evaluation of treatment strategies for head and neck cancer in an African American population.
Ignacio, D N; Griffin, J J; Daniel, M G; Serlemitsos-Day, M T; Lombardo, F A; Alleyne, T A
2013-07-01
This study evaluated treatment strategies for head and neck cancers in a predominantly African American population. Data were collected utilizing medical records and the tumour registry at the Howard University Hospital. The Kaplan-Meier method was used for survival analysis, and Cox proportional hazards regression analysis predicted the hazard of death. Analysis revealed that the main treatment strategy was radiation combined with platinum for all stages except stage I. Cetuximab was employed in only 1% of cases. Kaplan-Meier analysis revealed that stage II patients had a poorer outcome than stage IV, while Cox proportional hazards regression analysis showed no significant difference for stage II (p = 0.4662) but a significantly lower hazard of death for stage I relative to stage IV (HR = 0.314; p = 0.0272). Contributory factors included tobacco and alcohol, but body mass index (BMI) was inversely related to the hazard of death. There was no difference in survival using any treatment modality for African Americans.
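As an illustration of the survival methods named above, the lifelines Python package provides both Kaplan-Meier estimation and Cox proportional hazards regression. The ten-patient data frame below is entirely hypothetical, not the Howard University registry data:

import pandas as pd
from lifelines import CoxPHFitter, KaplanMeierFitter

# Hypothetical extract: survival time (months), death indicator,
# stage I dummy (reference: stage IV) and BMI as covariates.
df = pd.DataFrame({
    "months":  [12, 30, 45, 7, 22, 60, 15, 38, 5, 52],
    "died":    [1, 0, 1, 1, 1, 0, 1, 1, 1, 0],
    "stage_I": [0, 1, 0, 0, 1, 1, 0, 1, 0, 1],
    "bmi":     [21.0, 27.5, 19.8, 23.1, 30.2, 26.4, 18.9, 28.8, 20.5, 25.0],
})

km = KaplanMeierFitter().fit(df["months"], df["died"])  # overall survival

# An HR < 1 on stage_I would echo the paper's lower hazard of death for
# stage I versus stage IV; BMI enters as a continuous covariate.
cph = CoxPHFitter().fit(df, duration_col="months", event_col="died")
cph.print_summary()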
Establishing a proactive safety and health risk management system in the fire service.
Poplin, Gerald S; Pollack, Keshia M; Griffin, Stephanie; Day-Nash, Virginia; Peate, Wayne F; Nied, Ed; Gulotta, John; Burgess, Jefferey L
2015-04-19
Formalized risk management (RM) is an internationally accepted process for reducing hazards in the workplace, with defined steps including hazard scoping, risk assessment, and implementation of controls, all within an iterative process. While required for all industry in the European Union and widely used elsewhere, the United States maintains a compliance-based regulatory structure, rather than one based on systematic, risk-based methodologies. Firefighting is a hazardous profession, with high injury, illness, and fatality rates compared with other occupations, and implementation of RM programs has the potential to greatly improve firefighter safety and health; however, no descriptions of RM implementation are in the peer-reviewed literature for the North American fire service. In this paper we describe the steps used to design and implement the RM process in a moderately sized fire department, with particular focus on prioritizing and managing injury hazards during patient transport, fireground, and physical exercise procedures. Hazard scoping and formalized risk assessments are described, in addition to the identification of participatory-led injury control strategies. Process evaluation methods were conducted to primarily assess the feasibility of voluntarily instituting the RM approach within the fire service setting. The RM process was well accepted by the fire department and led to development of 45 hazard-specific interventions. Qualitative data documenting the implementation of the RM process revealed that participants emphasized the value of the RM process, especially the participatory bottom-up approach; the usefulness of the RM process for breaking down tasks to identify potential risks; and the potential of RM for reducing firefighter injury. As implemented, this risk-based approach used to identify and manage occupational hazards and risks was successful and is deemed feasible for U.S. (and other) fire services. While several barriers and challenges do exist in the implementation of any intervention such as this, recommendations for adopting the process are provided. Additional work will be performed to determine the effectiveness of selected control strategies that were implemented; however, participants throughout the organizational structure perceived the RM process to be of high utility, while researchers also found the process improved awareness of and engagement in actively enhancing worker safety and health.
Singh, Umesh Kumar; Ramanathan, A L; Subramanian, V
2018-08-01
The groundwater chemistry of the mining region of East Singhbhum district, which has complex contaminant sources, was investigated based on heavy metal loads and other hydrochemical constituents. This study aimed to identify the degree of heavy metal exposure and the potential health risk to the local population. The results of the hydrochemical analysis showed that Na+, K+, and Ca2+ are the dominant cations in the groundwater, while HCO3-, F- and Cl- dominate the anionic part. The weathering process was considered the dominant factor determining the major ionic composition in the study area. Compositional analysis for heavy metals identified that groundwater of the study area is contaminated by Cd, Pb and Cr. The source of these metals has been identified as anthropogenic inputs from mining activities and mineral processing units. Health risk analysis of individual heavy metals in terms of chronic daily intake (CDI) and hazard quotient (HQ) followed the order Cr > As > Cd > Pb, indicating high health risk for the population. In addition, the Hazard Index (HI) for heavy metals was significantly high (>1), which is considered a threat to the human population because these metals tend to accumulate in the body and cause a variety of diseases, such as kidney problems, dysfunction of the liver and renal cortex, and cancer. Copyright © 2018 Elsevier Ltd. All rights reserved.
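The CDI/HQ/HI chain reported above follows the standard US EPA formulation: CDI = (C x IR x EF x ED) / (BW x AT), HQ = CDI / RfD, and HI is the sum of the individual HQs. A short Python sketch with illustrative concentrations and reference doses, not the study's measured values:

# Chronic daily intake, hazard quotient and hazard index for oral
# groundwater exposure. All concentrations and RfDs are illustrative only.
conc_mg_per_L = {"Cr": 0.20, "As": 0.015, "Cd": 0.006, "Pb": 0.02}
rfd_mg_per_kg_day = {"Cr": 0.003, "As": 0.0003, "Cd": 0.0005, "Pb": 0.0035}

IR, EF, ED = 2.0, 365, 30        # intake L/day, exposure days/yr, years
BW, AT = 60.0, 365 * 30          # body weight kg, averaging time in days

hq = {}
for metal, c in conc_mg_per_L.items():
    cdi = (c * IR * EF * ED) / (BW * AT)   # mg per kg body weight per day
    hq[metal] = cdi / rfd_mg_per_kg_day[metal]

hi = sum(hq.values())                      # HI > 1 flags potential concern
print({m: round(v, 2) for m, v in hq.items()}, "HI =", round(hi, 2))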
40 CFR 63.9580 - What is the purpose of this subpart?
Code of Federal Regulations, 2011 CFR
2011-07-01
... (CONTINUED) National Emission Standards for Hazardous Air Pollutants: Taconite Iron Ore Processing What This... standards for hazardous air pollutants (NESHAP) for taconite iron ore processing. This subpart also...
40 CFR 63.9580 - What is the purpose of this subpart?
Code of Federal Regulations, 2013 CFR
2013-07-01
... (CONTINUED) National Emission Standards for Hazardous Air Pollutants: Taconite Iron Ore Processing What This... standards for hazardous air pollutants (NESHAP) for taconite iron ore processing. This subpart also...
40 CFR 63.9580 - What is the purpose of this subpart?
Code of Federal Regulations, 2012 CFR
2012-07-01
... (CONTINUED) National Emission Standards for Hazardous Air Pollutants: Taconite Iron Ore Processing What This... standards for hazardous air pollutants (NESHAP) for taconite iron ore processing. This subpart also...
40 CFR 63.9580 - What is the purpose of this subpart?
Code of Federal Regulations, 2010 CFR
2010-07-01
... (CONTINUED) National Emission Standards for Hazardous Air Pollutants: Taconite Iron Ore Processing What This... standards for hazardous air pollutants (NESHAP) for taconite iron ore processing. This subpart also...
40 CFR 63.9580 - What is the purpose of this subpart?
Code of Federal Regulations, 2014 CFR
2014-07-01
... (CONTINUED) National Emission Standards for Hazardous Air Pollutants: Taconite Iron Ore Processing What This... standards for hazardous air pollutants (NESHAP) for taconite iron ore processing. This subpart also...
Mayega, R W; Wafula, M R; Musenero, M; Omale, A; Kiguli, J; Orach, G C; Kabagambe, G; Bazeyo, W
2013-06-01
Most countries in sub-Saharan Africa have not conducted a disaster risk analysis. Hazard and vulnerability analyses provide vital information that can be used for the development of risk reduction and disaster response plans. The purpose of this study was to rank disaster hazards for Uganda, as a basis for identifying the priority hazards to guide disaster management planning. The study was conducted in Uganda as part of a multi-country assessment. A hazard, vulnerability and capacity analysis was conducted in a focus group discussion of 7 experts representing key stakeholder agencies in disaster management in Uganda. A simple ranking method was used to rank the probability of occurrence of the 11 top hazards, their potential impact and the level of vulnerability of people and infrastructure. In terms of likelihood of occurrence and potential impact, the top-ranked disaster hazards in Uganda are: 1) Epidemics of infectious diseases, 2) Drought/famine, 3) Conflict and environmental degradation, in that order. In terms of vulnerability, the top priority hazards to which people and infrastructure were vulnerable were: 1) Conflicts, 2) Epidemics, 3) Drought/famine and 4) Environmental degradation, in that order. Poverty, gender, lack of information, and lack of resilience measures were some of the factors promoting vulnerability to disasters. As Uganda develops a disaster risk reduction and response plan, it ought to prioritize epidemics of infectious diseases, drought/famine, conflicts and environmental degradation as the priority disaster hazards.
Expert study to select indicators of the occurrence of emerging mycotoxin hazards.
Kandhai, M C; Booij, C J H; Van der Fels-Klerx, H J
2011-01-01
This article describes a Delphi-based expert judgment study aimed at the selection of indicators to identify the occurrence of emerging mycotoxin hazards related to Fusarium spp. in wheat supply chains. A panel of 29 experts from 12 European countries followed a holistic approach to evaluate the most important indicators for different chain stages (growth, transport and storage, and processing) and their relative importance. After three e-mailing rounds, the experts reached consensus on the most important indicators for each of the three stages: wheat growth, transport and storage, and processing. For wheat growth, these indicators include: relative humidity/rainfall, crop rotation, temperature, tillage practice, water activity of the kernels, and crop variety/cultivar. For the transport and storage stage, they include water activity in the kernels, relative humidity, ventilation, temperature, storage capacity, and logistics. For wheat processing, indicators include quality data, fraction of the cereal used, water activity in the kernels, quality management and traceability systems, and carryover of contamination. The indicators selected in this study can be used in an identification system for the occurrence of emerging mycotoxin hazards in wheat supply chains. Such a system can be used by risk managers within governmental (related) organizations and/or the food and feed industry in order to react proactively to the occurrence of these emerging mycotoxins. © 2010 Society for Risk Analysis.
To provide RCRA hazardous waste permitting regulatory information and resources to permitted facilities, hazardous waste generators, and permit writers. To provide the public with information on how they can be involved in the permitting process.
Assessment of documentation requirements under DOE 5481.1, Safety Analysis and Review System (SARS)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Browne, E.T.
1981-03-01
This report assesses the requirements of DOE Order 5481.1, Safety Analysis and Review System for DOE Operations (SARS) in regard to maintaining SARS documentation. Under SARS, all pertinent details of the entire safety analysis and review process for each DOE operation are to be traceable from the initial identification of a hazard. This report is intended to provide assistance in identifying the points in the SARS cycle at which documentation is required, what type of documentation is most appropriate, and where it ultimately should be maintained.
Infusing Reliability Techniques into Software Safety Analysis
NASA Technical Reports Server (NTRS)
Shi, Ying
2015-01-01
Software safety analysis for a large software intensive system is always a challenge. Software safety practitioners need to ensure that software related hazards are completely identified, controlled, and tracked. This paper discusses in detail how to incorporate the traditional reliability techniques into the entire software safety analysis process. In addition, this paper addresses how information can be effectively shared between the various practitioners involved in the software safety analyses. The author has successfully applied the approach to several aerospace applications. Examples are provided to illustrate the key steps of the proposed approach.
NASA Astrophysics Data System (ADS)
Al-Akad, S.; Akensous, Y.; Hakdaoui, M.
2017-11-01
This research article summarizes the application of remote sensing and GIS to the study of urban flood risk in Al Mukalla. Satellite acquisitions of a flood event in October 2015 in Al Mukalla (Yemen), combined with flood risk mapping techniques, illustrate the potential risk present in this city. Satellite images (the Landsat and DEM data were atmospherically and radiometrically corrected, and geometric and topographic distortions were rectified) are used for flood risk mapping to produce a hazard (vulnerability) map. This map is produced by applying image-processing techniques within a geographic information system (GIS) environment, together with the NDVI and NDWI indices and a method to estimate the flood-hazard areas. The following factors were considered in order to estimate the spatial distribution of the hazardous areas: flow accumulation, slope, land use, geology and elevation. The multi-criteria analysis makes it possible to address vulnerability to flooding and to map areas at risk of flooding in the city of Al Mukalla. The main objective of this research is to provide a simple and rapid method to reduce and manage the risks caused by floods in Yemen, taking the city of Al Mukalla as an example.
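The NDVI and NDWI indices used in such mappings are simple normalized band differences. A minimal Python sketch with toy reflectance arrays standing in for the corrected Landsat bands:

import numpy as np

def ndvi(nir, red):
    # Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red)
    nir, red = np.asarray(nir, float), np.asarray(red, float)
    return (nir - red) / (nir + red + 1e-10)  # epsilon avoids divide-by-zero

def ndwi(green, nir):
    # Normalized Difference Water Index (McFeeters): (G - NIR) / (G + NIR)
    green, nir = np.asarray(green, float), np.asarray(nir, float)
    return (green - nir) / (green + nir + 1e-10)

# Toy 2x2 reflectance patches standing in for Landsat bands
nir   = np.array([[0.40, 0.05], [0.35, 0.04]])
red   = np.array([[0.10, 0.03], [0.12, 0.02]])
green = np.array([[0.12, 0.10], [0.11, 0.09]])

print("NDVI:\n", ndvi(nir, red))    # high values indicate vegetation
print("NDWI:\n", ndwi(green, nir))  # positive values indicate open water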
Earthquake Intensity and Strong Motion Analysis Within SEISCOMP3
NASA Astrophysics Data System (ADS)
Becker, J.; Weber, B.; Ghasemi, H.; Cummins, P. R.; Murjaya, J.; Rudyanto, A.; Rößler, D.
2017-12-01
Measuring and predicting ground motion parameters, including seismic intensities, for earthquakes is crucial and the subject of recent research in engineering seismology. gempa has developed the new SIGMA module for Seismic Intensity and Ground Motion Analysis. The module is based on the SeisComP3 framework, extending it in the field of seismic hazard assessment and engineering seismology. SIGMA may work with or independently of SeisComP3 by supporting FDSN web services for importing earthquake or station information and waveforms. It provides a user-friendly and modern graphical interface for semi-automatic and interactive strong motion data processing. SIGMA provides intensity and (P)SA maps based on GMPEs or recorded data. It calculates the most common strong motion parameters, e.g. PGA/PGV/PGD, Arias intensity and duration, Tp, Tm, CAV, SED and Fourier, power and response spectra. GMPEs are configurable. Supporting C++ and Python plug-ins, standard and customized GMPEs, including the OpenQuake Hazard Library, can be easily integrated and compared. Originally tailored to specifications by Geoscience Australia and BMKG (Indonesia), SIGMA has become a popular tool among SeisComP3 users concerned with seismic hazard and strong motion seismology.
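Several of the parameters SIGMA reports can be computed directly from an acceleration record. The Python sketch below shows PGA, PGV and Arias intensity (Ia = pi/(2g) * integral of a(t)^2 dt) on a synthetic record; it is an independent illustration, not SIGMA's own code:

import numpy as np

def strong_motion_parameters(acc, dt, g=9.81):
    # PGA, PGV and Arias intensity from an evenly sampled acceleration
    # record in m/s^2; velocity is obtained by crude rectangle integration.
    acc = np.asarray(acc, dtype=float)
    pga = np.abs(acc).max()
    vel = np.cumsum(acc) * dt
    pgv = np.abs(vel).max()
    ia = np.pi / (2.0 * g) * np.trapz(acc**2, dx=dt)
    return pga, pgv, ia

# Synthetic decaying-sinusoid "record" for illustration
dt = 0.01
t = np.arange(0, 20, dt)
acc = 2.0 * np.exp(-0.2 * t) * np.sin(2 * np.pi * 1.5 * t)
pga, pgv, ia = strong_motion_parameters(acc, dt)
print(f"PGA = {pga:.2f} m/s^2, PGV = {pgv:.3f} m/s, Ia = {ia:.3f} m/s")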
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1996-07-01
The module introduces a specific hazardous waste identification process, which involves asking and analyzing a series of questions about any waste being evaluated. It analyzes in detail the Resource Conservation and Recovery Act (RCRA) definition of "hazardous waste." It explains concepts that are essential to identifying a RCRA hazardous waste: hazardous waste listing, hazardous waste characteristics, the "mixture" and "derived-from" rules, the "contained-in" policy, and the hazardous waste identification rules (HWIR).
Uncertainties in predicting debris flow hazards following wildfire [Chapter 19
Kevin D. Hyde; Karin Riley; Cathelijne Stoof
2017-01-01
Wildfire increases the probability of debris flows posing hazardous conditions where values-at-risk exist downstream of burned areas. Conditions and processes leading to postfire debris flows usually follow a general sequence defined here as the postfire debris flow hazard cascade: biophysical setting, fire processes, fire effects, rainfall, debris flow, and values-at-...
Ground Water Issue: Phytoremediation of Contaminated Soil and Ground Water at Hazardous Waste Sites
2001-02-01
Ground Water Issue paper (National Risk Management Research...): Phytoremediation, the use of plants in remediation, is one such technology. This issue paper focuses on the processes and applications of phytoremediation as a cleanup or containment technique for the remediation of hazardous waste sites, and includes introductory material on plant processes.
230Th/U ages Supporting Hanford Site‐Wide Probabilistic Seismic Hazard Analysis
Paces, James B.
2014-01-01
This product represents a USGS Administrative Report that discusses samples and methods used to conduct uranium-series isotope analyses and resulting ages and initial 234U/238U activity ratios of pedogenic cements developed in several different surfaces in the Hanford area middle to late Pleistocene. Samples were collected and dated to provide calibration of soil development in surface deposits that are being used in the Hanford Site-Wide probabilistic seismic hazard analysis conducted by AMEC. The report includes description of sample locations and physical characteristics, sample preparation, chemical processing and mass spectrometry, analytical results, and calculated ages for individual sites. Ages of innermost rinds on a number of samples from five sites in eastern Washington are consistent with a range of minimum depositional ages from 17 ka for cataclysmic flood deposits to greater than 500 ka for alluvium at several sites.
NASA Astrophysics Data System (ADS)
Poluyan, L. V.; Syutkina, E. V.; Guryev, E. S.
2017-11-01
The comparative analysis of key features of the software systems TOXI+Risk and ALOHA is presented. The authors compared a domestic (TOXI+Risk) and a foreign (ALOHA) software system, both of which provide quantitative assessment of impact areas (pressure, thermal, toxic) for hypothetical emergencies at potentially hazardous facilities in the oil, gas, chemical, petrochemical and oil-processing industries. Both software systems use different mathematical models for assessing the release rate of a chemically hazardous substance from a storage tank and its evaporation. A comparison of the accuracy with which both software systems define impact areas on verification examples shows good agreement between the two products. The analysis results showed that the ALOHA software can be actively used for forecasting and immediate assessment of emergency situations, and for assessing damage resulting from emergencies within municipal territories.
Earthquake hypocenter relocation using double difference method in East Java and surrounding areas
DOE Office of Scientific and Technical Information (OSTI.GOV)
C, Aprilia Puspita; Meteorological, Climatological, and Geophysical Agency; Nugraha, Andri Dian, E-mail: nugraha@gf.itb.ac.id
Determination of precise hypocenter locations is very important in order to provide information about the subsurface fault plane and for seismic hazard analysis. In this study, we have relocated earthquake hypocenters in the eastern part of Java and surrounding areas from the local earthquake data catalog compiled by the Meteorological, Climatological, and Geophysical Agency of Indonesia (MCGA) for the time period 2009-2012 by using the double-difference method. The results show that after the relocation process there are significant changes in the position and orientation of earthquake hypocenters, which correlate with the geological setting in this region. We observed an indication of a double seismic zone at depths of 70-120 km within the subducting slab south of the eastern part of Java. Our results will provide useful information for advanced seismological studies and seismic hazard analysis in this region.
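For reference, the core relation of the double-difference method (Waldhauser and Ellsworth, 2000) can be stated compactly; the notation below is ours, not the study's. For events i and j observed at a common station k, the double-difference residual is the observed minus calculated travel-time difference,

\[
  dr_k^{ij} = \left(t_k^{i} - t_k^{j}\right)^{\mathrm{obs}}
            - \left(t_k^{i} - t_k^{j}\right)^{\mathrm{calc}},
\]

which is linearly related to perturbations of the two hypocenters (location and origin time, \(\mathbf{m} = (x, y, z, \tau)\)):

\[
  \frac{\partial t_k^{i}}{\partial \mathbf{m}^{i}}\,\Delta\mathbf{m}^{i}
  - \frac{\partial t_k^{j}}{\partial \mathbf{m}^{j}}\,\Delta\mathbf{m}^{j}
  = dr_k^{ij}.
\]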
Electromagnetic Spectrum Test and Evaluation Process
2010-01-01
HERO, hazards of electromagnetic radiation to ordnance; HERP, hazards of electromagnetic radiation to personnel; HERF, hazards of electromagnetic radiation to fuel; electromagnetic pulse (EMP); electronic protection; electrostatic discharge (ESD); ... including ordnance containing electrically initiated devices, to be mutually compatible in their intended
NASA Technical Reports Server (NTRS)
1996-01-01
Under a Small Business Innovation Research (SBIR) contract to Kennedy Space Center, EIC Laboratories invented a Raman spectrograph with fiber optic sampling for space applications such as sensing hazardous fuel vapors and making on-board rapid analyses of chemicals and minerals. Raman spectroscopy is a laser-based measurement technique that provides a molecular 'fingerprint' through a unique vibrational spectrum and can function in aqueous environments. EIC combined optical fiber technology with Raman methods to develop sensors that can be operated at a distance from the spectrographic analysis instruments and the laser excitation source. EIC refined and commercialized the technology to create the Fiber Optic Raman Spectrograph and the RamanProbe. Commercial applications range from process control to monitoring hazardous materials.
NASA Astrophysics Data System (ADS)
Duro, Javier; Iglesias, Rubén; Blanco, Pablo; Albiol, David; Koudogbo, Fifamè
2015-04-01
The Wide Area Product (WAP) is a new interferometric product developed to provide measurements over large regions. Persistent Scatterer Interferometry (PSI) has largely proven its robust and precise performance in measuring ground surface deformation in different application domains. In this context, however, accurate displacement estimation over large-scale areas (more than 10,000 km2) characterized by low-magnitude motion gradients (3-5 mm/year), such as those induced by inter-seismic or Earth tidal effects, still remains an open issue. The main reason for this is the inclusion of low-quality and more distant persistent scatterers in order to bridge low-quality areas, such as water bodies, crop areas and forested regions. This leads to spatial propagation errors in the PSI integration process, poor estimation and compensation of the Atmospheric Phase Screen (APS), and difficulty in handling residual long-wavelength phase patterns originating from inaccuracies in the orbit state vectors. Research work on generating a Wide Area Product of ground motion in preparation for the Sentinel-1 mission has been conducted in the last stages of Terrafirma as well as in other research programs. These developments propose technological updates for keeping precision over large-scale PSI analyses. Some of the updates are based on the use of external information, such as meteorological models, and the employment of GNSS data for improved calibration of large-scale measurements. Covering wide regions usually implies processing over areas whose land use is chiefly livestock, horticulture, urbanization and forest. This represents an important challenge for providing continuous InSAR measurements and requires the application of advanced phase filtering strategies to enhance coherence. The advanced PSI processing has been carried out over several areas, allowing a large-scale analysis of tectonic patterns and of motion caused by multiple hazards such as volcanic activity, landslides and floods. Several examples of the application of the PSI WAP to wide regions for measuring ground displacements related to different types of hazards, natural and human-induced, will be presented. The InSAR processing approach for measuring accurate movements at local and large scales to enable multi-hazard interpretation studies will also be discussed. The test areas will show deformation related to active fault systems, landslides on mountain slopes, ground compaction over underlying aquifers and movements in volcanic areas.
NASA Astrophysics Data System (ADS)
Kargel, J. S.; Leonard, G. J.
2012-12-01
Recent deadly glacier-related disasters in the Himalayan-Karakoram region—the Attabad landslide and formation of glacier meltwater-fed Lake Gojal, the Gayari ice avalanche/landslide and burial of a Pakistani Army base, and the Seti River outburst disaster—beg the question of whether disasters may be on the rise. Science is not yet ready to offer a full answer, but it is an important one to resolve, because future land-use planning and mitigative measures may be affected. Natural disasters have been commonplace throughout the long human history of the Himalaya-Karakoram region. The broad outlines of the changing natural process, natural hazard, and risk environment may be established. The risk is rising rapidly primarily due to increased human presence in these once-forbidding mountains. Risk is shifting also because climate change is modifying the land surface process system. Rapidly changing glaciers cause a destabilization of the landscape. Glaciers are fundamentally a metastable phenomenon put in motion by the high gravitational potential energies of the components of glacial systems: snow, ice, water, and debris. Any change in the climate-land-glacier system must result in a change in the land process system, with hazards and risks rising or falling or changing location or type. Most commonly, glacier-related disasters include a natural process cascade; as the factors affecting land surface processes and the frequency or magnitude of any one of the elements of the process cascade change, the net hazard and risk to people changes. Otherwise similar glaciers and glacierized basins have differing sets of hazardous conditions and processes depending on whether the glacier is stable, advancing or retreating. The consequences for the overall risk to people will depend on the details of a specific glacier near a particular village or bridge or railroad. One size does not fit all. Generalizations about trends in natural hazards as related to climate change impacts on glaciers are possible, but any particular locality may buck the general trends. Hence, climate change is affecting the natural process, natural hazard, and human risk environment. However, changing glaciers exhibit a montage of different response behaviors, so the natural hazards and shifting hazards are also a montage. Overwhelmingly, changing land use has the largest impact on the natural hazard and risk environment. We will take recent examples of natural disasters, using both remote sensing data and field data, and discuss how changing climate, the changing cryosphere, and changing human relationships to the land in Himalayan realms may have contributed to or altered those events.
NASA Astrophysics Data System (ADS)
Owen, S. E.; Yun, S. H.; Hua, H.; Agram, P. S.; Liu, Z.; Sacco, G. F.; Manipon, G.; Linick, J. P.; Fielding, E. J.; Lundgren, P.; Farr, T. G.; Webb, F.; Rosen, P. A.; Simons, M.
2017-12-01
The Advanced Rapid Imaging and Analysis (ARIA) project for Natural Hazards is focused on rapidly generating high-level geodetic imaging products and placing them in the hands of the solid earth science and local, national, and international natural hazard communities by providing science product generation, exploration, and delivery capabilities at an operational level. Space-based geodetic measurement techniques including Interferometric Synthetic Aperture Radar (InSAR), differential Global Positioning System, and SAR-based change detection have become critical additions to our toolset for understanding and mapping the damage and deformation caused by earthquakes, volcanic eruptions, floods, landslides, and groundwater extraction. Up until recently, processing of these data sets has been handcrafted for each study or event and has not generated products rapidly and reliably enough for response to natural disasters or for timely analysis of large data sets. The ARIA project, a joint venture co-sponsored by the California Institute of Technology and by NASA through the Jet Propulsion Laboratory, has been capturing the knowledge applied to these responses and building it into an automated infrastructure to generate imaging products in near real-time that can improve situational awareness for disaster response. In addition to supporting the growing science and hazard response communities, the ARIA project has developed the capabilities to provide automated imaging and analysis capabilities necessary to keep up with the influx of raw SAR data from geodetic imaging missions such as ESA's Sentinel-1A/B, now operating with repeat intervals as short as 6 days, and the upcoming NASA NISAR mission. We will present the progress and results we have made on automating the analysis of Sentinel-1A/B SAR data for hazard monitoring and response, with emphasis on recent developments and end user engagement in flood extent mapping and deformation time series for both volcano monitoring and mapping of groundwater-related subsidence.
Structural Health Monitoring Analysis for the Orbiter Wing Leading Edge
NASA Technical Reports Server (NTRS)
Yap, Keng C.
2010-01-01
This viewgraph presentation reviews Structural Health Monitoring Analysis for the Orbiter Wing Leading Edge. The Wing Leading Edge Impact Detection System (WLE IDS) and the Impact Analysis Process are also described to monitor WLE debris threats. The contents include: 1) Risk Management via SHM; 2) Hardware Overview; 3) Instrumentation; 4) Sensor Configuration; 5) Debris Hazard Monitoring; 6) Ascent Response Summary; 7) Response Signal; 8) Distribution of Flight Indications; 9) Probabilistic Risk Analysis (PRA); 10) Model Correlation; 11) Impact Tests; 12) Wing Leading Edge Modeling; 13) Ascent Debris PRA Results; and 14) MM/OD PRA Results.
Identification of Potential Hazard using Hazard Identification and Risk Assessment
NASA Astrophysics Data System (ADS)
Sari, R. M.; Syahputri, K.; Rizkya, I.; Siregar, I.
2017-03-01
This research was conducted at a paper manufacturing company whose products are used as cigarette paper. Throughout the production process the company's machines and equipment are operated by workers, and all workers may potentially be injured; this is known as a potential hazard. Hazard identification and risk assessment is one part of a safety and health program within the risk management stage, and it is very important for preventing occupational injuries and diseases arising from work. The problem addressed by this research is that the potential hazards and risks faced by workers during the production process had not been identified. The purpose of this study was therefore to identify potential hazards using hazard identification and risk assessment methods. The risk assessment uses severity criteria and the probability of an accident. The study found 23 potential hazards of varying severity and probability. A Risk Assessment Code (RAC) was then determined for each potential hazard, yielding 3 extreme risks, 10 high risks, 6 medium risks and 3 low risks. The potential hazards were thus successfully identified using the RAC.
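As a concrete illustration of the RAC step described in this abstract, the sketch below assigns a qualitative risk level from severity and probability ratings using a MIL-STD-882-style matrix. The category labels, matrix cells, and cut-offs here are illustrative assumptions, not the ones used in the study.

```python
# Hypothetical MIL-STD-882-style risk matrix; cell assignments are assumptions.
SEVERITY = {"catastrophic": 1, "critical": 2, "marginal": 3, "negligible": 4}
PROBABILITY = {"frequent": "A", "probable": "B", "occasional": "C",
               "remote": "D", "improbable": "E"}

RISK_MATRIX = {
    ("1A", "1B", "1C", "2A", "2B", "3A"): "extreme",
    ("1D", "2C", "2D", "3B", "3C"): "high",
    ("1E", "2E", "3D", "4A", "4B"): "medium",
    ("3E", "4C", "4D", "4E"): "low",
}

def rac(severity: str, probability: str) -> str:
    """Return the qualitative risk level for one hazard."""
    cell = f"{SEVERITY[severity]}{PROBABILITY[probability]}"
    for cells, level in RISK_MATRIX.items():
        if cell in cells:
            return level
    raise ValueError(f"unmapped matrix cell: {cell}")

print(rac("marginal", "occasional"))  # -> 'high' under this illustrative matrix
```

In practice each of the 23 hazards would be scored this way and the counts per level (extreme/high/medium/low) tallied, as reported in the abstract.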
Murray, Jessica R.; Svarc, Jerry L.
2017-01-01
The U.S. Geological Survey Earthquake Science Center collects and processes Global Positioning System (GPS) data throughout the western United States to measure crustal deformation related to earthquakes and tectonic processes as part of a long‐term program of research and monitoring. Here, we outline data collection procedures and present the GPS dataset built through repeated temporary deployments since 1992. This dataset consists of observations at ∼1950 locations. In addition, this article details our data processing and analysis procedures, which consist of the following. We process the raw data collected through temporary deployments, in addition to data from continuously operating western U.S. GPS stations operated by multiple agencies, using the GIPSY software package to obtain position time series. Subsequently, we align the positions to a common reference frame, determine the optimal parameters for a temporally correlated noise model, and apply this noise model when carrying out time‐series analysis to derive deformation measures, including constant interseismic velocities, coseismic offsets, and transient postseismic motion.
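The time-series modelling step described above can be illustrated with a minimal sketch (not the USGS/GIPSY workflow itself): fit one GPS position component with an intercept, a constant interseismic velocity, a coseismic offset at a known earthquake epoch, and a logarithmic postseismic term. The variable names, decay constant, and synthetic data are assumptions.

```python
import numpy as np

def design_matrix(t, t_eq, tau=0.1):
    """Columns: intercept, velocity, coseismic step, postseismic log decay."""
    step = (t >= t_eq).astype(float)
    post = step * np.log1p(np.maximum(t - t_eq, 0.0) / tau)
    return np.column_stack([np.ones_like(t), t, step, post])

rng = np.random.default_rng(0)
t = np.linspace(0.0, 10.0, 520)           # years since first epoch (~weekly)
t_eq = 4.5                                 # known earthquake epoch, years
truth = np.array([5.0, 2.0, 15.0, 3.0])    # mm, mm/yr, mm, mm
y = design_matrix(t, t_eq) @ truth + rng.normal(0.0, 1.5, t.size)

G = design_matrix(t, t_eq)
m, *_ = np.linalg.lstsq(G, y, rcond=None)  # ordinary least squares
print("velocity %.2f mm/yr, coseismic offset %.2f mm" % (m[1], m[2]))
```

A real analysis would additionally apply a temporally correlated (e.g., flicker-plus-white) noise model, as the abstract notes, rather than the white-noise assumption implicit in ordinary least squares.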
Analysis and design of randomised clinical trials involving competing risks endpoints.
Tai, Bee-Choo; Wee, Joseph; Machin, David
2011-05-19
In randomised clinical trials involving time-to-event outcomes, the failures concerned may be events of an entirely different nature and as such define a classical competing risks framework. In designing and analysing clinical trials involving such endpoints, it is important to account for the competing events and evaluate how each contributes to the overall failure. An appropriate choice of statistical model is important for adequate determination of sample size. We describe how competing events may be summarised in such trials using cumulative incidence functions and Gray's test. The statistical modelling of competing events using proportional cause-specific and subdistribution hazard functions, and the corresponding procedures for sample size estimation, are outlined. These are illustrated using data from a randomised clinical trial (SQNP01) of patients with advanced (non-metastatic) nasopharyngeal cancer. In this trial, treatment has no effect on the competing event of loco-regional recurrence. Thus the effects of treatment on the hazard of distant metastasis were similar via both the cause-specific (unadjusted csHR = 0.43, 95% CI 0.25-0.72) and subdistribution (unadjusted subHR = 0.43, 95% CI 0.25-0.76) hazard analyses, in favour of concurrent chemo-radiotherapy followed by adjuvant chemotherapy. Adjusting for nodal status and tumour size did not alter the results. The results of the log-rank test (p = 0.002) comparing the cause-specific hazards and Gray's test (p = 0.003) comparing the cumulative incidences also led to the same conclusion. However, the subdistribution hazard analysis requires many more subjects than the cause-specific hazard analysis to detect the same magnitude of effect. The cause-specific hazard analysis is appropriate for analysing competing risks outcomes when treatment has no effect on the cause-specific hazard of the competing event, and it requires fewer subjects than the subdistribution hazard analysis for a similar effect size. However, if the main and competing events are influenced in opposing directions by an intervention, a subdistribution hazard analysis may be warranted.
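To make the cumulative incidence summary concrete, here is a minimal self-contained sketch of a nonparametric cumulative incidence function (Aalen-Johansen type) for competing risks; the synthetic data and event coding are invented for illustration and do not reproduce the SQNP01 analysis.

```python
import numpy as np

def cumulative_incidence(time, event, cause=1):
    """Aalen-Johansen-type CIF; event: 0 = censored, 1 = event of interest,
    2 = competing event."""
    order = np.argsort(time)
    time, event = time[order], event[order]
    at_risk = len(time)
    surv = 1.0                      # all-cause survival just before t
    cif = 0.0
    times, values = [], []
    for t in np.unique(time):
        mask = time == t
        d_cause = np.sum(event[mask] == cause)
        d_all = np.sum(event[mask] > 0)
        if at_risk > 0:
            cif += surv * d_cause / at_risk
            surv *= 1.0 - d_all / at_risk
        at_risk -= np.sum(mask)
        times.append(t)
        values.append(cif)
    return np.array(times), np.array(values)

rng = np.random.default_rng(1)
t_obs = rng.exponential(5.0, 200)                       # invented follow-up times
ev = rng.choice([0, 1, 2], size=200, p=[0.3, 0.4, 0.3]) # invented event codes
times, cif = cumulative_incidence(t_obs, ev, cause=1)
print("CIF at last observed time: %.3f" % cif[-1])
```

For regression, cause-specific hazards can be fitted with a standard Cox model that treats competing events as censored, whereas the Fine-Gray subdistribution model requires dedicated routines.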
14 CFR 417.223 - Flight hazard area analysis.
Code of Federal Regulations, 2012 CFR
2012-01-01
... 14 Aeronautics and Space 4 2012-01-01 2012-01-01 false Flight hazard area analysis. 417.223 Section 417.223 Aeronautics and Space COMMERCIAL SPACE TRANSPORTATION, FEDERAL AVIATION ADMINISTRATION... to control the risk to the public from debris impact hazards. The risk management requirements of...
14 CFR 417.223 - Flight hazard area analysis.
Code of Federal Regulations, 2014 CFR
2014-01-01
... 14 Aeronautics and Space 4 2014-01-01 2014-01-01 false Flight hazard area analysis. 417.223 Section 417.223 Aeronautics and Space COMMERCIAL SPACE TRANSPORTATION, FEDERAL AVIATION ADMINISTRATION... to control the risk to the public from debris impact hazards. The risk management requirements of...
14 CFR 417.223 - Flight hazard area analysis.
Code of Federal Regulations, 2011 CFR
2011-01-01
... 14 Aeronautics and Space 4 2011-01-01 2011-01-01 false Flight hazard area analysis. 417.223 Section 417.223 Aeronautics and Space COMMERCIAL SPACE TRANSPORTATION, FEDERAL AVIATION ADMINISTRATION... to control the risk to the public from debris impact hazards. The risk management requirements of...
14 CFR 417.223 - Flight hazard area analysis.
Code of Federal Regulations, 2013 CFR
2013-01-01
... 14 Aeronautics and Space 4 2013-01-01 2013-01-01 false Flight hazard area analysis. 417.223 Section 417.223 Aeronautics and Space COMMERCIAL SPACE TRANSPORTATION, FEDERAL AVIATION ADMINISTRATION... to control the risk to the public from debris impact hazards. The risk management requirements of...
DOT National Transportation Integrated Search
1988-05-01
The report is devoted to the review and discussion of generic hazards associated with the ground, launch, orbital and re-entry phases of space operations. Since the DOT Office of Commercial Space Transportation (OCST) has been charged with protecting...
9 CFR 417.4 - Validation, Verification, Reassessment.
Code of Federal Regulations, 2012 CFR
2012-01-01
... not have a HACCP plan because a hazard analysis has revealed no food safety hazards that are.... 417.4 Section 417.4 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF... ACT HAZARD ANALYSIS AND CRITICAL CONTROL POINT (HACCP) SYSTEMS § 417.4 Validation, Verification...
9 CFR 417.4 - Validation, Verification, Reassessment.
Code of Federal Regulations, 2010 CFR
2010-01-01
... not have a HACCP plan because a hazard analysis has revealed no food safety hazards that are.... 417.4 Section 417.4 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF... ACT HAZARD ANALYSIS AND CRITICAL CONTROL POINT (HACCP) SYSTEMS § 417.4 Validation, Verification...
9 CFR 417.4 - Validation, Verification, Reassessment.
Code of Federal Regulations, 2011 CFR
2011-01-01
... not have a HACCP plan because a hazard analysis has revealed no food safety hazards that are.... 417.4 Section 417.4 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF... ACT HAZARD ANALYSIS AND CRITICAL CONTROL POINT (HACCP) SYSTEMS § 417.4 Validation, Verification...
14 CFR 417.223 - Flight hazard area analysis.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 14 Aeronautics and Space 4 2010-01-01 2010-01-01 false Flight hazard area analysis. 417.223 Section 417.223 Aeronautics and Space COMMERCIAL SPACE TRANSPORTATION, FEDERAL AVIATION ADMINISTRATION... to control the risk to the public from debris impact hazards. The risk management requirements of...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-10-24
... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration Office of... processing of, special permits from the Department of Transportation's Hazardous Material Regulations (49 CFR... special permit is published in accordance with Part 107 of the Federal hazardous materials transportation...
Xiao, Yangfan; Yi, Shanzhen; Tang, Zhongqian
2017-12-01
Flood is the most common natural hazard in the world and has caused serious loss of life and property. Assessment of flood-prone areas is of great importance for watershed management and for reducing potential losses. In this study, a framework of multi-criteria analysis (MCA) incorporating a geographic information system (GIS), the fuzzy analytic hierarchy process (AHP) and a spatial ordered weighted averaging (OWA) method was developed for flood hazard assessment. Factors associated with the geographical, hydrological and flood-resistance characteristics of the basin were selected as evaluation criteria. The relative importance of the criteria was estimated with the fuzzy AHP method, and the OWA method was used to analyze how different risk attitudes of the decision maker affect the assessment result. The spatial OWA method, with spatially variable risk preference, was implemented in the GIS environment to integrate the criteria; the advantage of the proposed method is that it considers spatial heterogeneity when assigning risk preference in the decision-making process. The methodology was applied to an area comprising Hanyang, Caidian and Hannan in Wuhan, China, where flood events occur frequently. The resulting flood hazard distribution shows a tendency toward high risk in populated and developed areas, especially the northeastern part of Hanyang, which has suffered frequent floods in history; the result indicates where enhancement projects should be carried out first when resources are limited. Finally, the sensitivity of the criteria weights was analyzed to measure the stability of the results with respect to variation of the weights. The flood hazard assessment method presented here is adaptable to similar basins and is of great significance for establishing countermeasures that mitigate losses of life and property.
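To illustrate the OWA aggregation step, the sketch below combines standardized hazard criteria at a single map cell under different order-weight vectors encoding the decision maker's risk attitude; the criterion values and weights are invented, and the paper's fuzzy-AHP criterion weighting is not reproduced.

```python
import numpy as np

def owa(values, order_weights):
    """Ordered weighted averaging: sort criteria descending, apply order weights."""
    v = np.sort(np.asarray(values, dtype=float))[::-1]
    w = np.asarray(order_weights, dtype=float)
    return float(np.dot(v, w / w.sum()))

cell = [0.8, 0.4, 0.6]             # standardized hazard criteria in [0, 1]
print(owa(cell, [0.6, 0.3, 0.1]))  # risk-averse: emphasizes the worst criteria
print(owa(cell, [1/3, 1/3, 1/3]))  # neutral: ordinary average
print(owa(cell, [0.1, 0.3, 0.6]))  # risk-tolerant: downweights the worst
```

Applying such an operator cell by cell, with order weights varying in space, yields the spatially variable risk preference the abstract describes.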
Physical applications of GPS geodesy: a review.
Bock, Yehuda; Melgar, Diego
2016-10-01
Geodesy, the oldest science, has become an important discipline in the geosciences, in large part by enhancing Global Positioning System (GPS) capabilities over the last 35 years well beyond the satellite constellation's original design. The ability of GPS geodesy to estimate 3D positions with millimeter-level precision with respect to a global terrestrial reference frame has contributed to significant advances in geophysics, seismology, atmospheric science, hydrology, and natural hazard science. Monitoring the changes in the positions or trajectories of GPS instruments on the Earth's land and water surfaces, in the atmosphere, or in space, is important for both theory and applications, from an improved understanding of tectonic and magmatic processes to developing systems for mitigating the impact of natural hazards on society and the environment. Besides accurate positioning, all disturbances in the propagation of the transmitted GPS radio signals from satellite to receiver are mined for information, from troposphere and ionosphere delays for weather, climate, and natural hazard applications, to disturbances in the signals due to multipath reflections from the solid ground, water, and ice for environmental applications. We review the relevant concepts of geodetic theory, data analysis, and physical modeling for a myriad of processes at multiple spatial and temporal scales, and discuss the extensive global infrastructure that has been built to support GPS geodesy consisting of thousands of continuously operating stations. We also discuss the integration of heterogeneous and complementary data sets from geodesy, seismology, and geology, focusing on crustal deformation applications and early warning systems for natural hazards.
NASA Astrophysics Data System (ADS)
Auermuller, L. M.; Gatto, J.; Huch, C.
2015-12-01
The highly developed nature of New Jersey's coastline makes its barrier island and lagoon communities particularly vulnerable to storm surge, sea level rise and flooding, realities that the impacts of Hurricane Sandy made plain to coastal communities. Recognizing these vulnerabilities, the Jacques Cousteau National Estuarine Research Reserve (JC NERR), the Rutgers Center for Remote Sensing and Spatial Analysis (CRSSA), the Rutgers Bloustein School and the Barnegat Bay Partnership (BBP) have developed web-based tools to assist New Jersey's coastal communities in visualizing and planning for future local impacts. NJFloodMapper and NJAdapt are two complementary interactive mapping websites that visualize current and future flood hazards; these hazard layers can be combined with additional data, including critical facilities, evacuation routes, and socioeconomic and environmental data. Getting to Resilience is an online self-assessment tool developed to help communities reduce vulnerability and increase preparedness by linking planning, mitigation and adaptation. Through this interactive process, communities learn how their preparedness can earn valuable points through voluntary programs such as FEMA's Community Rating System and Sustainable Jersey. The assessment process can also increase a community's understanding of where future vulnerabilities should be addressed through hazard mitigation planning. Since Superstorm Sandy, more than thirty communities in New Jersey have received technical assistance in assessing their risks and vulnerabilities to coastal hazards and have begun to understand how to better plan and prepare for short- and long-term changes along their shorelines.
Analysis of a Hypergolic Propellant Explosion During Processing of Launch Vehicles in the VAB
NASA Technical Reports Server (NTRS)
Chrostowski, Jon D.; Gan, Wenshui; Campbell, Michael D.
2010-01-01
NASA is developing launch vehicles to support missions to Low Earth Orbit (LEO), the Moon and deep space. Whether manned or unmanned, the vehicle components will likely be integrated in the Vehicle Assembly Building (VAB) at Kennedy Space Center (KSC) and typically include a fueled spacecraft (SC) that sits on top of one or more stages. The processing of a fueled SC involves hazardous operations when it is brought into the VAB Transfer Aisle and lifted a significant height for mating with the lower stages. Accidents resulting from these hazardous operations could affect unrelated personnel working in buildings adjacent to the VAB. Safe separation distances based on the DoD Explosives Standards quantity-distance (Q-D) approach result in large inhabited building distance (IBD) arcs. This paper presents site-specific air blast and fragmentation hazard analyses for comparison with the Q-D arcs, as well as consequence and risk analyses that provide added information for the decision maker. A new physics-based fragmentation model is presented that includes: a) the development of a primary fragment list (which defines the fragment characteristics) associated with a hypergolic propellant explosion; b) a description of a 3D fragment bounce model; and c) the results of probabilistic Monte-Carlo simulations (which include uncertainties in the fragment characteristics) to determine i) the hazardous fragment density distance, ii) the expected number of wall/roof impacts and penetrations for over 40 buildings adjacent to the VAB, and iii) the risk to building occupants.
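The flavor of the probabilistic fragment analysis can be conveyed by a much-simplified Monte-Carlo sketch (not the paper's physics-based model): sample fragment launch speeds and angles, compute drag-free ballistic ranges, and find the distance beyond which the fragment areal density falls below a hazardous-fragment criterion. All distributions here are assumptions; the one-fragment-per-56-m² (600 ft²) threshold is a commonly cited explosives-safety criterion.

```python
import numpy as np

rng = np.random.default_rng(42)
N = 5000
g = 9.81
speed = rng.lognormal(mean=np.log(120.0), sigma=0.4, size=N)  # m/s (assumed)
angle = rng.uniform(0.0, np.pi / 2, size=N)                   # rad (assumed)

ranges = speed**2 * np.sin(2.0 * angle) / g   # vacuum ballistic range, m

# Fragment areal density in annular bins, assuming azimuthal symmetry.
edges = np.linspace(0.0, ranges.max(), 200)
counts, _ = np.histogram(ranges, bins=edges)
ring_area = np.pi * (edges[1:]**2 - edges[:-1]**2)
density = counts / ring_area                  # fragments per m^2

THRESHOLD = 1.0 / 56.0                        # one fragment per 600 ft^2
hazard_bins = np.nonzero(density > THRESHOLD)[0]
hfd = edges[hazard_bins.max() + 1] if hazard_bins.size else 0.0
print(f"hazardous fragment distance ~ {hfd:.0f} m (under these assumptions)")
```

A realistic model would add drag, fragment mass and shape distributions, and the bounce behavior the paper describes; the structure of the calculation, however, is the same: many sampled trajectories reduced to a density-versus-distance curve.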
A fluvial and pluvial probabilistic flood hazard analysis for Can Tho city, Vietnam
NASA Astrophysics Data System (ADS)
Apel, Heiko; Martinez, Oriol; Thi Chinh, Do; Viet Dung, Nguyen
2014-05-01
Can Tho city is the largest city and the economic heart of the Mekong Delta, Vietnam. Because of its economic importance and envisaged development goals, the city grew rapidly in population and extent over the last two decades. Large parts of the city are located in flood-prone areas, and even the central parts have recently experienced an increasing number of flood events of both fluvial and pluvial nature. As economic power and asset values constantly increase, this poses a considerable risk for the city. The aim of this study is to perform a flood hazard analysis that considers both fluvial and pluvial floods and to derive probabilistic flood hazard maps. The first step is to understand the typical flood mechanisms. Fluvial floods are triggered by the coincidence of high water levels during the annual flood period in the Mekong Delta with high tidal levels, which in combination cause short-term inundations in Can Tho. Pluvial floods are triggered by typical tropical convective rainstorms during the monsoon season. These two flood pathways are essentially independent in their sources and can thus be treated accordingly in the hazard analysis. For the fluvial hazard analysis we propose a bivariate frequency analysis of the Mekong flood characteristics: the annual maximum flood discharge Q and the annual flood volume V at the upper boundary of the Mekong Delta, the gauging station Kratie. This defines probabilities of exceedance for different Q-V pairs, which are transferred into synthetic flood hydrographs. The synthetic hydrographs are routed through a quasi-2D hydrodynamic model of the entire Mekong Delta to provide boundary conditions for a detailed hazard mapping of Can Tho. This downscaling step is necessary because the huge complexity of the river and channel network does not allow a proper definition of boundary conditions for Can Tho city from gauge data alone; in addition, the available gauge records around Can Tho are too short for a meaningful frequency analysis. The detailed hazard mapping is performed with a 2D hydrodynamic model of Can Tho city. Because the scenarios are derived in a Monte-Carlo framework, the final flood hazard maps are probabilistic, i.e., they show the median flood hazard along with uncertainty estimates for each defined probability of exceedance. For the pluvial flood hazard, a frequency analysis of the hourly rain gauge data of Can Tho is performed using a peak-over-threshold procedure. Based on this frequency analysis, synthetic rainstorms are generated in a Monte-Carlo framework for the same probabilities of exceedance as in the fluvial flood hazard analysis, and probabilistic flood hazard maps are generated with the same 2D hydrodynamic model of the city. In a last step, the fluvial and pluvial scenarios are combined assuming independence of the events; these combined scenarios are also transferred into hazard maps by the 2D hydrodynamic model, finally yielding combined fluvial-pluvial probabilistic flood hazard maps for Can Tho. The derived set of maps may be used for improved city planning or a flood risk analysis.
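The peak-over-threshold step of the pluvial analysis can be sketched as follows: fit a generalized Pareto distribution to hourly rainfall excesses over a high threshold and read off return levels. The synthetic data, threshold choice, and record length are assumptions for illustration.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
rain = rng.gamma(shape=0.3, scale=8.0, size=20 * 365 * 24)  # 20 yr of hourly rain, mm

u = np.quantile(rain[rain > 0], 0.99)          # threshold choice (assumption)
excess = rain[rain > u] - u
c, loc, scale = stats.genpareto.fit(excess, floc=0.0)

lam = excess.size / 20.0                       # mean exceedances per year
for T in (2, 10, 50, 100):                     # return periods in years
    p = 1.0 - 1.0 / (lam * T)                  # GPD quantile for the T-yr level
    x_T = u + stats.genpareto.ppf(p, c, loc=0.0, scale=scale)
    print(f"{T:>3}-yr hourly rainfall ~ {x_T:.1f} mm")
```

In the study, return levels like these would seed synthetic rainstorms for the 2D hydrodynamic model; the same probabilities of exceedance tie the pluvial maps to the fluvial ones.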
Code of Federal Regulations, 2010 CFR
2010-07-01
... Public Property FOREST SERVICE, DEPARTMENT OF AGRICULTURE PREDECISIONAL ADMINISTRATIVE REVIEW PROCESSES Predecisional Administrative Review Process for Hazardous Fuel Reduction Projects Authorized by the Healthy... subject to the objection process pursuant to 36 CFR part 218, subpart A, and include the following: (i...
Kalb, Paul D.; Colombo, Peter
1999-07-20
The present invention provides a composition and process for disposal of radioactive, hazardous and mixed wastes. The present invention preferably includes a process for multibarrier encapsulation of radioactive, hazardous and mixed wastes by combining, substantially simultaneously, dry waste powder, a non-biodegradable thermoplastic polymer and an anhydrous additive in an extruder to form a homogeneous molten matrix. The molten matrix may be directed into a "clean" polyethylene liner and allowed to cool, thus forming a monolithic waste form that provides a multibarrier to the dispersion of wastes into the environment.
Kalb, Paul D.; Colombo, Peter
1998-03-24
The present invention provides a composition and process for disposal of radioactive, hazardous and mixed wastes. The present invention preferably includes a process for multibarrier encapsulation of radioactive, hazardous and mixed wastes by combining, substantially simultaneously, dry waste powder, a non-biodegradable thermoplastic polymer and an anhydrous additive in an extruder to form a homogeneous molten matrix. The molten matrix may be directed into a "clean" polyethylene liner and allowed to cool, thus forming a monolithic waste form that provides a multibarrier to the dispersion of wastes into the environment.
Kalb, Paul D.; Colombo, Peter
1997-01-01
The present invention provides a composition and process for disposal of radioactive, hazardous and mixed wastes. The present invention preferably includes a process for multibarrier encapsulation of radioactive, hazardous and mixed wastes by combining, substantially simultaneously, dry waste powder, a non-biodegradable thermoplastic polymer and an anhydrous additive in an extruder to form a homogeneous molten matrix. The molten matrix may be directed into a "clean" polyethylene liner and allowed to cool, thus forming a monolithic waste form that provides a multibarrier to the dispersion of wastes into the environment.
Laboratory plant study on the melting process of asbestos waste
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sakai, Shinichi; Terazono, Atsushi; Takatsuki, Hiroshi
The melting process was studied as a method of changing asbestos into non-hazardous waste and recovering it as a reusable resource. In an initial effort, the thermal behavior of asbestos waste in terms of physical and chemical structure was studied; then, 10 kg/h-scale laboratory plant experiments were carried out. X-ray diffraction analysis of the thermal behavior of sprayed-on asbestos waste revealed that chrysotile asbestos changes in crystal structure at around 800 C and becomes melted slag, composed mainly of magnesium silicate, at around 1,500 C. Laboratory plant experiments on the melting process of sprayed-on asbestos showed that melted slag can be obtained. X-ray diffraction analysis of the melted slag revealed the change in crystal structure, and SEM analysis showed the slag to have a non-fibrous form. Furthermore, TEM analysis demonstrated the very high treatment efficiency of the process, namely a reduction of the asbestos content to 1/10^6 on a weight basis. These analytical results indicate the effectiveness of the melting process for asbestos waste treatment.
Ravisankar, R; Vanasundari, K; Suganya, M; Raghu, Y; Rajalakshmi, A; Chandrasekaran, A; Sivakumar, S; Chandramohan, J; Vijayagopal, P; Venkatraman, B
2014-02-01
Using γ-ray spectrometry, the concentrations of the naturally occurring radionuclides ²²⁶Ra, ²³²Th and ⁴⁰K have been measured in soil, sand, cement, clay and bricks used as building materials in Tiruvannamalai, Tamil Nadu, India. The radium equivalent activity (Raeq), criterion formula (CF), indoor gamma absorbed dose rate (DR), annual effective dose (HR), activity utilization index (AUI), alpha index (Iα), gamma index (Iγ), external radiation hazard index (Hex), internal radiation hazard index (Hin), representative level index (RLI), excess lifetime cancer risk (ELCR) and annual gonadal dose equivalent (AGDE) associated with the natural radionuclides were calculated to assess the radiation hazard of the natural radioactivity in the building materials. The analysis shows that the materials used for the construction of dwellings are safe for the inhabitants. The radiological data were processed using multivariate statistical methods to determine the similarities and correlations among the various samples, and the frequency distributions of all radionuclides were analyzed; the data set consisted of 15 measured variables. The Pearson correlation coefficient reveals that the ²²⁶Ra distribution in the building materials is controlled by the variation of the ⁴⁰K concentration. Principal component analysis (PCA) yields a two-component representation of the data, in which 94.9% of the total variance is explained. The resulting dendrogram of a hierarchical cluster analysis (HCA) classified the 30 building materials into four major groups using the 15 variables.
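Two of the hazard indices named in this abstract follow standard closed-form expressions; the sketch below evaluates the radium equivalent activity and the external hazard index for a hypothetical sample (the activity values are invented).

```python
# Standard radiological hazard indices from activity concentrations in Bq/kg:
#   Raeq = A_Ra + 1.43*A_Th + 0.077*A_K   (recommended limit 370 Bq/kg)
#   Hex  = A_Ra/370 + A_Th/259 + A_K/4810 (should not exceed unity)

def radium_equivalent(a_ra: float, a_th: float, a_k: float) -> float:
    return a_ra + 1.43 * a_th + 0.077 * a_k

def external_hazard_index(a_ra: float, a_th: float, a_k: float) -> float:
    return a_ra / 370.0 + a_th / 259.0 + a_k / 4810.0

# Hypothetical brick sample (invented values):
a_ra, a_th, a_k = 35.0, 30.0, 400.0
print(f"Raeq = {radium_equivalent(a_ra, a_th, a_k):.1f} Bq/kg")
print(f"Hex  = {external_hazard_index(a_ra, a_th, a_k):.2f}")
```

Values of Raeq below 370 Bq/kg and Hex below 1 are consistent with the abstract's conclusion that the surveyed materials are safe for dwelling construction.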