NASA Astrophysics Data System (ADS)
Bezawada, Rajesh; Uijt de Haag, Maarten
2010-04-01
This paper discusses the results of an initial evaluation study of hazard and integrity monitor functions for use with integrated alerting and notification. The Hazard and Integrity Monitor (HIM) (i) allocates information sources within the Integrated Intelligent Flight Deck (IIFD) to required functionality (such as conflict detection and avoidance) and determines the required performance of these information sources as part of that function; (ii) monitors or evaluates the required performance of the individual information sources and performs consistency checks among the various information sources; (iii) integrates the information to establish tracks of potential hazards that can be used for conflict probes or conflict prediction over various time horizons, including the 10, 5, 3, and <3 minutes used in our scenario; (iv) detects and assesses the class of the hazard and provides possible resolutions. The HIM monitors the operation-dependent performance parameters related to the potential hazards in a manner similar to Required Navigation Performance (RNP). Various HIM concepts have been implemented and evaluated using a previously developed sensor simulator/synthesizer. Within the simulation framework, various inputs to the IIFD and its subsystems are simulated, synthesized from actual collected data, or played back from actual flight test sensor data. The framework and HIM functions are implemented in Simulink®, a modeling language developed by The MathWorks™. This modeling language allows for the test and evaluation of various sensor and communication link configurations as well as the inclusion of feedback from the pilot on the performance of the aircraft.
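To make two of these HIM tasks concrete, the short sketch below (in Python) illustrates a cross-source consistency check and the bucketing of a predicted conflict into the 10, 5, 3, and <3 minute horizons; the sources, tolerance, and numbers are hypothetical and not taken from the paper.

    # Hypothetical illustration of a HIM-style consistency check; values are not from the paper.
    positions_nm = {"ADS-B": 12.4, "TCAS": 12.6, "onboard_radar": 14.1}  # range to same traffic, NM
    TOLERANCE_NM = 1.0  # assumed agreement tolerance between information sources

    def consistent(estimates, tolerance):
        values = list(estimates.values())
        return max(values) - min(values) <= tolerance  # flag sources that disagree

    def alert_horizon(time_to_conflict_min):
        # Buckets matching the 10, 5, 3 and <3 minute horizons mentioned in the scenario.
        for horizon in (10, 5, 3):
            if time_to_conflict_min >= horizon:
                return horizon
        return "<3"

    print(consistent(positions_nm, TOLERANCE_NM), alert_horizon(4.2))  # -> False 3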
Step 1: Human System Integration Pilot-Technology Interface Requirements for Weather Management
NASA Technical Reports Server (NTRS)
2005-01-01
This document involves the definition of technology interface requirements for Hazardous Weather Avoidance. Technology concepts in use by the Access 5 Weather Management Work Package were considered. Beginning with the Human System Integration (HSI) high-level functional requirement for Hazardous Weather Avoidance, and the Hazardous Weather Avoidance technology elements, HSI requirements for the interface to the pilot were identified. Results of the analysis describe (1) the information required by the pilot to have knowledge of hazardous weather, and (2) the control capability needed by the pilot to obtain hazardous weather information. Fundamentally, these requirements provide the candidate Hazardous Weather Avoidance technology concepts with the necessary human-related elements to make them compatible with human capabilities and limitations. The results of the analysis describe how Hazardous Weather Avoidance operations and functions should interface with the pilot to provide the necessary Weather Management functionality to the UA-pilot system. Requirements and guidelines for Hazardous Weather Avoidance are partitioned into four categories: (1) Planning En Route, (2) Encountering Hazardous Weather En Route, (3) Planning to Destination, and (4) Diversion Planning to an Alternate Airport. Each requirement is stated and is supported with a rationale and associated reference(s).
Analyzing Distributed Functions in an Integrated Hazard Analysis
NASA Technical Reports Server (NTRS)
Morris, A. Terry; Massie, Michael J.
2010-01-01
Large scale integration of today's aerospace systems is achievable through the use of distributed systems. Validating the safety of distributed systems is significantly more difficult as compared to centralized systems because of the complexity of the interactions between simultaneously active components. Integrated hazard analysis (IHA), a process used to identify unacceptable risks and to provide a means of controlling them, can be applied to either centralized or distributed systems. IHA, though, must be tailored to fit the particular system being analyzed. Distributed systems, for instance, must be analyzed for hazards in terms of the functions that rely on them. This paper will describe systems-oriented IHA techniques (as opposed to traditional failure-event or reliability techniques) that should be employed for distributed systems in aerospace environments. Special considerations will be addressed when dealing with specific distributed systems such as active thermal control, electrical power, command and data handling, and software systems (including the interaction with fault management systems). Because of the significance of second-order effects in large scale distributed systems, the paper will also describe how to analyze secondary functions to secondary functions through the use of channelization.
Rocky Mountain Research Station USDA Forest Service
2004-01-01
Fire hazard reflects the potential fire behavior and magnitude of effects as a function of fuel conditions. This fact sheet discusses crown fuels, surface fuels, and ground fuels and their contribution and involvement in wildland fire. Other publications in this series...
Hazard Function Estimation with Cause-of-Death Data Missing at Random.
Wang, Qihua; Dinse, Gregg E; Liu, Chunling
2012-04-01
Hazard function estimation is an important part of survival analysis. Interest often centers on estimating the hazard function associated with a particular cause of death. We propose three nonparametric kernel estimators for the hazard function, all of which are appropriate when death times are subject to random censorship and censoring indicators can be missing at random. Specifically, we present a regression surrogate estimator, an imputation estimator, and an inverse probability weighted estimator. All three estimators are uniformly strongly consistent and asymptotically normal. We derive asymptotic representations of the mean squared error and the mean integrated squared error for these estimators and we discuss a data-driven bandwidth selection method. A simulation study, conducted to assess finite sample behavior, demonstrates that the proposed hazard estimators perform relatively well. We illustrate our methods with an analysis of some vascular disease data.
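To make the inverse probability weighted idea concrete in schematic form (notation ours, not necessarily the authors'): with observed times T_i, censoring indicators delta_i available only when xi_i = 1, pi(t) = P(xi = 1 | T = t), kernel K and bandwidth h, the cause-specific hazard can be estimated by smoothing the weighted Nelson-Aalen increments,

    \[
      \hat{\lambda}_h(t) \;=\; \frac{1}{n h}\sum_{i=1}^{n}
      K\!\left(\frac{t - T_i}{h}\right)
      \frac{\xi_i\,\delta_i/\hat{\pi}(T_i)}{1 - \hat{F}_n(T_i^{-})},
    \]

where \hat{F}_n is the empirical distribution of the observed times; the regression surrogate and imputation estimators replace the weighted indicator by an estimate of E(delta | T).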
Interval Estimation of Seismic Hazard Parameters
NASA Astrophysics Data System (ADS)
Orlecka-Sikora, Beata; Lasocki, Stanislaw
2017-03-01
The paper considers Poisson temporal occurrence of earthquakes and presents a way to integrate the uncertainties of the estimates of the mean activity rate and of the magnitude cumulative distribution function into the interval estimation of the most widely used seismic hazard functions, such as the exceedance probability and the mean return period. The proposed algorithm can be used either when the Gutenberg-Richter model of magnitude distribution is accepted or when nonparametric estimation is in use. When the Gutenberg-Richter model of magnitude distribution is used, the interval estimation of its parameters is based on the asymptotic normality of the maximum likelihood estimator. When the nonparametric kernel estimation of magnitude distribution is used, we propose the iterated bias-corrected and accelerated method for interval estimation based on the smoothed bootstrap and second-order bootstrap samples. The changes resulting from the integrated approach to the interval estimation of the seismic hazard functions, relative to an approach that neglects the uncertainty of the mean activity rate estimates, have been studied using Monte Carlo simulations and two real-data examples. The results indicate that the uncertainty of the mean activity rate significantly affects the interval estimates of the hazard functions only when the product of the activity rate and the time period for which the hazard is estimated is no more than 5.0. When this product becomes greater than 5.0, the impact of the uncertainty of the cumulative distribution function of magnitude dominates the impact of the uncertainty of the mean activity rate in the aggregated uncertainty of the hazard functions, and the interval estimates with and without inclusion of the uncertainty of the mean activity rate converge. The presented algorithm is generic and can also be applied to capture the propagation of the uncertainty of estimates that are parameters of a multiparameter function onto this function.
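For reference, under the Poisson occurrence model assumed here the two hazard functions named above take the familiar forms (with mean activity rate lambda, magnitude cumulative distribution function F_M, target magnitude m, and time period t):

    \[
      R(m, t) \;=\; 1 - \exp\!\bigl\{-\lambda\, t\,[1 - F_M(m)]\bigr\},
      \qquad
      T(m) \;=\; \frac{1}{\lambda\,[1 - F_M(m)]},
    \]

so the product lambda*t that governs the 5.0 threshold discussed in the results enters the exceedance probability directly.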
Hazard Function Estimation with Cause-of-Death Data Missing at Random
Wang, Qihua; Dinse, Gregg E.; Liu, Chunling
2010-01-01
Hazard function estimation is an important part of survival analysis. Interest often centers on estimating the hazard function associated with a particular cause of death. We propose three nonparametric kernel estimators for the hazard function, all of which are appropriate when death times are subject to random censorship and censoring indicators can be missing at random. Specifically, we present a regression surrogate estimator, an imputation estimator, and an inverse probability weighted estimator. All three estimators are uniformly strongly consistent and asymptotically normal. We derive asymptotic representations of the mean squared error and the mean integrated squared error for these estimators and we discuss a data-driven bandwidth selection method. A simulation study, conducted to assess finite sample behavior, demonstrates that the proposed hazard estimators perform relatively well. We illustrate our methods with an analysis of some vascular disease data. PMID:22267874
ASIL determination for motorbike's Electronics Throttle Control System (ETCS) malfunction
NASA Astrophysics Data System (ADS)
Zaman Rokhani, Fakhrul; Rahman, Muhammad Taqiuddin Abdul; Ain Kamsani, Noor; Sidek, Roslina Mohd; Saripan, M. Iqbal; Samsudin, Khairulmizam; Khair Hassan, Mohd
2017-11-01
The Electronics Throttle Control System (ETCS) is the principal electronic unit in all fuel-injection engine motorbikes, augmenting engine performance efficiency in comparison to the conventional carburetor-based engine. The ETCS is regarded as a safety-critical component, as an ETCS malfunction can cause an unintended acceleration or deceleration event, which can be hazardous to riders. In this study, Hazard Analysis and Risk Assessment, an analysis defined by the ISO 26262 functional safety standard, has been applied to the motorbike's ETCS to determine the required automotive safety integrity level. Based on the analysis, the established automotive safety integrity level can help to derive technical and functional safety measures for ETCS development.
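For readers unfamiliar with the classification step, the sketch below encodes the usual ISO 26262 risk-graph shortcut (sum the S, E, and C class indices; sums of 7-10 map to ASIL A-D, lower sums to QM); the hazardous event in the example is hypothetical and not taken from the paper.

    # Common shortcut consistent with the ISO 26262 HARA classification table (assumption:
    # S1..S3, E1..E4, C1..C3 encoded as integers); the example event is hypothetical.
    def asil(severity, exposure, controllability):
        total = severity + exposure + controllability
        return {7: "ASIL A", 8: "ASIL B", 9: "ASIL C", 10: "ASIL D"}.get(total, "QM")

    # Example: unintended acceleration in dense traffic that is hard for the rider to control.
    print(asil(severity=3, exposure=4, controllability=3))  # -> ASIL D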
Eini C. Lowell; Dennis R. Becker; Robert Rummer; Debra Larson; Linda Wadleigh
2008-01-01
This research provides an important step in the conceptualization and development of an integrated wildfire fuels reduction system from silvicultural prescription, through stem selection, harvesting, in-woods processing, transport, and market selection. Decisions made at each functional step are informed by knowledge about subsequent functions. Data on the resource...
NASA Technical Reports Server (NTRS)
Epp, Chirold D.; Robertson, Edward A.; Rutishauser, David K.
2013-01-01
The Autonomous Landing and Hazard Avoidance Technology (ALHAT) Project was chartered to develop and mature to a Technology Readiness Level (TRL) of six an autonomous system combining guidance, navigation and control with real-time terrain sensing and recognition functions for crewed, cargo, and robotic planetary landing vehicles. The ALHAT System must be capable of identifying and avoiding surface hazards to enable a safe and accurate landing to within tens of meters of designated and certified landing sites anywhere on a planetary surface under any lighting conditions. This is accomplished with the core sensing functions of the ALHAT system: Terrain Relative Navigation (TRN), Hazard Detection and Avoidance (HDA), and Hazard Relative Navigation (HRN). The NASA plan for the ALHAT technology is to perform the TRL6 closed loop demonstration on the Morpheus Vertical Test Bed (VTB). The first Morpheus vehicle was lost in August of 2012 during free-flight testing at Kennedy Space Center (KSC), so the decision was made to perform a helicopter test of the integrated ALHAT System with the Morpheus avionics over the ALHAT planetary hazard field at KSC. The KSC helicopter tests included flight profiles approximating planetary approaches, with the entire ALHAT system interfaced with all appropriate Morpheus subsystems and operated in real-time. During these helicopter flights, the ALHAT system imaged the simulated lunar terrain constructed in FY2012 to support ALHAT/Morpheus testing at KSC. To the best of our knowledge, this represents the highest fidelity testing of a system of this kind to date. During this helicopter testing, two new Morpheus landers were under construction at the Johnson Space Center to support the objective of an integrated ALHAT/Morpheus free-flight demonstration. This paper provides an overview of this helicopter flight test activity, including results and lessons learned, and also provides an overview of recent integrated testing of ALHAT on the second Morpheus vehicle.
NASA Technical Reports Server (NTRS)
Rutishauser, David; Epp, Chirold; Robertson, Edward
2013-01-01
The Autonomous Landing Hazard Avoidance Technology (ALHAT) Project was chartered to develop and mature to a Technology Readiness Level (TRL) of six an autonomous system combining guidance, navigation and control with real-time terrain sensing and recognition functions for crewed, cargo, and robotic planetary landing vehicles. The ALHAT System must be capable of identifying and avoiding surface hazards to enable a safe and accurate landing to within tens of meters of designated and certified landing sites anywhere on a planetary surface under any lighting conditions. This is accomplished with the core sensing functions of the ALHAT system: Terrain Relative Navigation (TRN), Hazard Detection and Avoidance (HDA), and Hazard Relative Navigation (HRN). The NASA plan for the ALHAT technology is to perform the TRL6 closed loop demonstration on the Morpheus Vertical Test Bed (VTB). The first Morpheus vehicle was lost in August of 2012 during free-flight testing at Kennedy Space Center (KSC), so the decision was made to perform a helicopter test of the integrated ALHAT System with the Morpheus avionics over the ALHAT planetary hazard field at KSC. The KSC helicopter tests included flight profiles approximating planetary approaches, with the entire ALHAT system interfaced with all appropriate Morpheus subsystems and operated in real-time. During these helicopter flights, the ALHAT system imaged the simulated lunar terrain constructed in FY2012 to support ALHAT/Morpheus testing at KSC. To the best of our knowledge, this represents the highest fidelity testing of a system of this kind to date. During this helicopter testing, two new Morpheus landers were under construction at the Johnson Space Center to support the objective of an integrated ALHAT/Morpheus free-flight demonstration. This paper provides an overview of this helicopter flight test activity, including results and lessons learned, and also provides an overview of recent integrated testing of ALHAT on the second Morpheus vehicle.
Precision Landing and Hazard Avoidance (PL&HA) Domain
NASA Technical Reports Server (NTRS)
Robertson, Edward A.; Carson, John M., III
2016-01-01
The Precision Landing and Hazard Avoidance (PL&HA) domain addresses the development, integration, testing, and spaceflight infusion of sensing, processing, and GN&C (Guidance, Navigation and Control) functions critical to the success and safety of future human and robotic exploration missions. PL&HA sensors also have applications to other mission events, such as rendezvous and docking.
Information Extraction for System-Software Safety Analysis: Calendar Year 2007 Year-End Report
NASA Technical Reports Server (NTRS)
Malin, Jane T.
2008-01-01
This annual report describes work to integrate a set of tools to support early model-based analysis of failures and hazards due to system-software interactions. The tools perform and assist analysts in the following tasks: 1) extract model parts from text for architecture and safety/hazard models; 2) combine the parts with library information to develop the models for visualization and analysis; 3) perform graph analysis on the models to identify possible paths from hazard sources to vulnerable entities and functions, in nominal and anomalous system-software configurations; 4) perform discrete-time-based simulation on the models to investigate scenarios where these paths may play a role in failures and mishaps; and 5) identify resulting candidate scenarios for software integration testing. This paper describes new challenges in a NASA abort system case, and enhancements made to develop the integrated tool set.
Childers, A B; Walsh, B
1996-07-23
Preharvest food safety is essential for the protection of our food supply. The production and transport of livestock and poultry play an integral part in the safety of these food products. The goals of this safety assurance include freedom from pathogenic microorganisms, disease, and parasites, and from potentially harmful residues and physical hazards. Its functions should be based on hazard analysis and critical control points from producer to slaughter plant, with emphasis on prevention of identifiable hazards rather than on removal of contaminated products. The production goal is to minimize infection and ensure freedom from potentially harmful residues and physical hazards. The marketing goal is control of exposure to pathogens and stress. Both groups should have functional hazard analysis and critical control point management programs that include personnel training and certification of producers. These programs must cover production procedures, chemical usage, feeding, treatment practices, drug usage, assembly and transportation, and animal identification. Plans must use risk assessment principles, and the procedures must be defined. Other elements would include preslaughter certification, environmental protection, control of chemical hazards, live-animal drug-testing procedures, and identification of physical hazards.
Konrad, Christopher P.
2015-01-01
Ecological functions and flood-related risks were assessed for floodplains along the 17 major rivers flowing into the Puget Sound Basin, Washington. The assessment addresses five ecological functions and five components of flood-related risk at two spatial resolutions, fine and coarse. The fine-resolution assessment compiled spatial attributes of floodplains from existing, publicly available sources and integrated the attributes into 10-meter rasters for each function, hazard, or exposure. The raster values generally represent different types of floodplains with regard to each function, hazard, or exposure rather than the degree of function, hazard, or exposure. The coarse-resolution assessment tabulates attributes from the fine-resolution assessment for larger floodplain units, which are floodplains associated with 0.1- to 21-kilometer-long segments of major rivers. The coarse-resolution assessment also derives indices that can be used to compare function or risk among different floodplain units and to develop normative (based on observed distributions) standards. The products of the assessment are available online as geospatial datasets (Konrad, 2015; http://dx.doi.org/10.5066/F7DR2SJC).
Probabilistic analysis of tsunami hazards
Geist, E.L.; Parsons, T.
2006-01-01
Determining the likelihood of a disaster is a key component of any comprehensive hazard assessment. This is particularly true for tsunamis, even though most tsunami hazard assessments have in the past relied on scenario-based or deterministic models. We discuss probabilistic tsunami hazard analysis (PTHA) from the standpoint of integrating computational methods with empirical analysis of past tsunami runup. PTHA is derived from probabilistic seismic hazard analysis (PSHA), with the main difference being that PTHA must account for far-field sources. The computational methods rely on numerical tsunami propagation models, rather than the empirical attenuation relationships used in PSHA to determine ground motions. Because a number of source parameters affect local tsunami runup height, PTHA can become complex and computationally intensive. Empirical analysis can function in one of two ways, depending on the length and completeness of the tsunami catalog. For site-specific studies where there is sufficient tsunami runup data available, hazard curves can primarily be derived from empirical analysis, with computational methods used to highlight deficiencies in the tsunami catalog. For region-wide analyses and sites where there are little to no tsunami data, a computationally based method such as Monte Carlo simulation is the primary method to establish tsunami hazards. Two case studies that describe how computational and empirical methods can be integrated are presented for Acapulco, Mexico (site-specific) and the U.S. Pacific Northwest coastline (region-wide analysis).
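In the PSHA-derived formulation outlined above, the tsunami hazard curve at a site aggregates local and far-field sources (schematic notation, not the paper's):

    \[
      \nu(r) \;=\; \sum_{i} \lambda_i \, P(R > r \mid \text{source } i),
      \qquad
      P(R > r \ \text{in time } T) \;=\; 1 - e^{-\nu(r)\,T},
    \]

where lambda_i is the occurrence rate of source i and the conditional runup exceedance probabilities come from numerical propagation modeling or, where the catalog allows, from empirical runup data.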
Vulnerabilities, Influences and Interaction Paths: Failure Data for Integrated System Risk Analysis
NASA Technical Reports Server (NTRS)
Malin, Jane T.; Fleming, Land
2006-01-01
We describe graph-based analysis methods for identifying and analyzing cross-subsystem interaction risks from subsystem connectivity information. By discovering external and remote influences that would be otherwise unexpected, these methods can support better communication among subsystem designers at points of potential conflict and support the design of more dependable and diagnosable systems. These methods identify hazard causes that can impact vulnerable functions or entities if propagated across interaction paths from the hazard source to the vulnerable target. The analysis can also assess combined impacts of And-Or trees of disabling influences. The analysis can use ratings of hazards and vulnerabilities to calculate cumulative measures of severity and importance. Identification of cross-subsystem hazard-vulnerability pairs and propagation paths across subsystems will increase coverage of hazard and risk analysis and can indicate risk control and protection strategies.
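A minimal sketch of the graph-based idea, using the networkx library; the subsystem names, connectivity, and ratings are hypothetical, not drawn from the paper.

    # Hypothetical subsystem connectivity graph; edges are possible propagation paths.
    import networkx as nx

    g = nx.DiGraph()
    g.add_edges_from([
        ("power_bus_fault", "avionics_box"),
        ("avionics_box", "thermal_controller"),
        ("thermal_controller", "payload_function"),
        ("power_bus_fault", "payload_function"),
    ])

    hazard_severity = {"power_bus_fault": 3}   # rating of the hazard source
    vulnerability = {"payload_function": 2}    # rating of the vulnerable target

    # Enumerate propagation paths from each hazard source to each vulnerable target and
    # combine the ratings into a simple cumulative importance measure for that pair.
    for src, sev in hazard_severity.items():
        for tgt, vul in vulnerability.items():
            for path in nx.all_simple_paths(g, src, tgt):
                print(path, "importance:", sev * vul)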
A general framework for parametric survival analysis.
Crowther, Michael J; Lambert, Paul C
2014-12-30
Parametric survival models are being increasingly used as an alternative to the Cox model in biomedical research. Through direct modelling of the baseline hazard function, we can gain greater understanding of the risk profile of patients over time, obtaining absolute measures of risk. Commonly used parametric survival models, such as the Weibull, make restrictive assumptions about the baseline hazard function, such as monotonicity, which is often violated in clinical datasets. In this article, we extend the general framework of parametric survival models proposed by Crowther and Lambert (Journal of Statistical Software 53:12, 2013) to incorporate relative survival, and robust and cluster-robust standard errors. We describe the general framework through three applications to clinical datasets, in particular illustrating the use of restricted cubic splines, modelled on the log hazard scale, to provide a highly flexible survival modelling framework. Through the use of restricted cubic splines, we can derive the cumulative hazard function analytically beyond the boundary knots, resulting in a combined analytic/numerical approach, which substantially improves the estimation process compared with only using numerical integration. User-friendly Stata software is provided, which significantly extends the parametric survival models available in standard software. Copyright © 2014 John Wiley & Sons, Ltd.
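As a minimal sketch of the modelling idea (not the authors' Stata implementation, and with a simple polynomial in log time standing in for restricted cubic splines), the hazard is specified on the log scale and the cumulative hazard obtained by numerical integration:

    # Minimal sketch: log hazard as a smooth function of log time; coefficients are assumed.
    import numpy as np
    from scipy.integrate import quad

    coef = np.array([-2.0, 0.4, -0.05])  # assumed coefficients for 1, log(t), log(t)^2

    def log_hazard(t):
        x = np.log(np.maximum(t, 1e-8))  # restricted cubic splines would replace this basis
        return coef[0] + coef[1] * x + coef[2] * x ** 2

    def hazard(t):
        return np.exp(log_hazard(t))

    def cumulative_hazard(t):
        value, _ = quad(hazard, 0.0, t)  # numerical integration of the hazard
        return value

    def survival(t):
        return np.exp(-cumulative_hazard(t))

    print(survival(5.0))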
Automation for System Safety Analysis
NASA Technical Reports Server (NTRS)
Malin, Jane T.; Fleming, Land; Throop, David; Thronesbery, Carroll; Flores, Joshua; Bennett, Ted; Wennberg, Paul
2009-01-01
This presentation describes work to integrate a set of tools to support early model-based analysis of failures and hazards due to system-software interactions. The tools perform and assist analysts in the following tasks: 1) extract model parts from text for architecture and safety/hazard models; 2) combine the parts with library information to develop the models for visualization and analysis; 3) perform graph analysis and simulation to identify and evaluate possible paths from hazard sources to vulnerable entities and functions, in nominal and anomalous system-software configurations and scenarios; and 4) identify resulting candidate scenarios for software integration testing. There has been significant technical progress in model extraction from Orion program text sources, architecture model derivation (components and connections) and documentation of extraction sources. Models have been derived from Internal Interface Requirements Documents (IIRDs) and FMEA documents. Linguistic text processing is used to extract model parts and relationships, and the Aerospace Ontology also aids automated model development from the extracted information. Visualizations of these models assist analysts in requirements overview and in checking consistency and completeness.
NASA Astrophysics Data System (ADS)
Lindsay, Jan M.; Robertson, Richard E. A.
2018-04-01
We report on the process of generating the first suite of integrated volcanic hazard zonation maps for the islands of Dominica, Grenada (including Kick 'em Jenny and Ronde/Caille), Nevis, Saba, St. Eustatius, St. Kitts, Saint Lucia and St Vincent in the Lesser Antilles. We developed a systematic approach that accommodated the range in prior knowledge of the volcanoes in the region. A first-order hazard assessment for each island was used to develop one or more scenario(s) of likely future activity, for which scenario-based hazard maps were generated. For the most-likely scenario on each island we also produced a poster-sized integrated volcanic hazard zonation map, which combined the individual hazardous phenomena depicted in the scenario-based hazard maps into integrated hazard zones. We document the philosophy behind the generation of this suite of maps, and the method by which hazard information was combined to create integrated hazard zonation maps, and illustrate our approach through a case study of St. Vincent. We also outline some of the challenges we faced using this approach, and the lessons we have learned by observing how stakeholders have interacted with the maps over the past 10 years. Based on our experience, we recommend that future map makers involve stakeholders in the entire map generation process, especially when making design choices such as type of base map, use of colour and gradational boundaries, and indeed what to depict on the map. We also recommend careful consideration of how to evaluate and depict offshore hazard of island volcanoes, and recommend computer-assisted modelling of all phenomena to generate more realistic hazard footprints. Finally, although our systematic approach to integrating individual hazard data into zones generally worked well, we suggest that a better approach might be to treat the integration of hazards on a case-by-case basis to ensure the final product meets map users' needs. We hope that the documentation of our experience might be useful for other map makers to take into account when creating new or updating existing maps.
Bas, Esra
2014-07-01
In this paper, an integrated methodology combining Quality Function Deployment (QFD) and a 0-1 knapsack model is proposed for occupational safety and health as a systems thinking approach. The House of Quality (HoQ) in QFD methodology is a systematic tool to consider the inter-relationships between two sets of factors. In this paper, three HoQs are used to consider the interrelationships between tasks and hazards, hazards and events, and events and preventive/protective measures. The final priority weights of events are defined by considering their project-specific preliminary weights, probability of occurrence, and effects on the victim and the company. The priority weights of the preventive/protective measures obtained in the last HoQ are fed into a 0-1 knapsack model for the investment decision. Then, the selected preventive/protective measures can be adapted to the task design. The proposed step-by-step methodology can be applied to any stage of a project to design the workplace for occupational safety and health, and continuous improvement for safety is endorsed by the closed-loop characteristic of the integrated methodology. Copyright © 2013 Elsevier Ltd. All rights reserved.
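The investment step can be written in the usual 0-1 knapsack form (notation ours): with priority weights w_j from the last HoQ, implementation costs c_j for the n candidate preventive/protective measures, and budget B,

    \[
      \max_{x \in \{0,1\}^{n}} \; \sum_{j=1}^{n} w_j\, x_j
      \quad \text{subject to} \quad \sum_{j=1}^{n} c_j\, x_j \le B,
    \]

where x_j = 1 selects measure j for adoption into the task design.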
A Cloud-Based System for Automatic Hazard Monitoring from Sentinel-1 SAR Data
NASA Astrophysics Data System (ADS)
Meyer, F. J.; Arko, S. A.; Hogenson, K.; McAlpin, D. B.; Whitley, M. A.
2017-12-01
Despite the all-weather capabilities of Synthetic Aperture Radar (SAR), and its high performance in change detection, the application of SAR for operational hazard monitoring was limited in the past. This has largely been due to high data costs, slow product delivery, and limited temporal sampling associated with legacy SAR systems. Only since the launch of ESA's Sentinel-1 sensors have routinely acquired and free-of-charge SAR data become available, allowing—for the first time—for a meaningful contribution of SAR to disaster monitoring. In this paper, we present recent technical advances of the Sentinel-1-based SAR processing system SARVIEWS, which was originally built to generate hazard products for volcano monitoring centers. We outline the main functionalities of SARVIEWS including its automatic database interface to Sentinel-1 holdings of the Alaska Satellite Facility (ASF), and its set of automatic processing techniques. Subsequently, we present recent system improvements that were added to SARVIEWS and allowed for a vast expansion of its hazard services; specifically: (1) In early 2017, the SARVIEWS system was migrated into the Amazon Cloud, providing access to cloud capabilities such as elastic scaling of compute resources and cloud-based storage; (2) we co-located SARVIEWS with ASF's cloud-based Sentinel-1 archive, enabling the efficient and cost effective processing of large data volumes; (3) we integrated SARVIEWS with ASF's HyP3 system (http://hyp3.asf.alaska.edu/), providing functionality such as subscription creation via API or map interface as well as automatic email notification; (4) we automated the production chains for seismic and volcanic hazards by integrating SARVIEWS with the USGS earthquake notification service (ENS) and the USGS eruption alert system. Email notifications from both services are parsed and subscriptions are automatically created when certain event criteria are met; (5) finally, SARVIEWS-generated hazard products are now being made available to the public via the SARVIEWS hazard portal. These improvements have led to the expansion of SARVIEWS toward a broader set of hazard situations, now including volcanoes, earthquakes, and severe weather. We provide details on newly developed techniques and show examples of disasters for which SARVIEWS was invoked.
NASA Technical Reports Server (NTRS)
Smiles, Michael D.; Blythe, Michael P.; Bejmuk, Bohdan; Currie, Nancy J.; Doremus, Robert C.; Franzo, Jennifer C.; Gordon, Mark W.; Johnson, Tracy D.; Kowaleski, Mark M.; Laube, Jeffrey R.
2015-01-01
The Chief Engineer of the Exploration Systems Development (ESD) Office requested that the NASA Engineering and Safety Center (NESC) perform an independent assessment of the ESD's integrated hazard development process. The focus of the assessment was to review the integrated hazard analysis (IHA) process and identify any gaps/improvements in the process (e.g., missed causes, cause tree completeness, missed hazards). This document contains the outcome of the NESC assessment.
NASA Astrophysics Data System (ADS)
Akiyanova, F. Zh; Arykbayeva, Z. K.; Atalikhova, A. M.; Dauilbayev, B. A.; Zinabdin, N. B.; Kubeyev, A. B.; Tkach, K. A.
2018-01-01
The article outlines research results on assessing the risk that natural hazards pose to the functioning of the Kazakhstan section of the international transport corridors (from the Khorgas and Dostyk dry ports to the seaport of Aktau). Based on a component-by-stage analysis of physical and geographical conditions using a qualimetric approach, areas with different risk levels of natural disasters were identified. To minimize the risk of exposure to natural hazards, a set of environmental recommendations has been developed.
EPOS Thematic Core Service Anthropogenic Hazards: Implementation Plan
NASA Astrophysics Data System (ADS)
Orlecka-Sikora, Beata; Lasocki, Stanislaw; Grasso, Jean Robert; Schmittbuhl, Jean; Styles, Peter; Kwiatek, Grzegorz; Sterzel, Mariusz; Garcia, Alexander
2015-04-01
EPOS Thematic Core Service ANTHROPOGENIC HAZARDS (TCS AH) aims to integrate distributed research infrastructures (RI) to facilitate and stimulate research on anthropogenic hazards (AH), especially those associated with the exploration and exploitation of geo-resources. The innovative element is the uniqueness of the integrated RI, which comprises two main deliverables: (1) Exceptional datasets, called "episodes", which comprehensively describe a geophysical process induced or triggered by human technological activity and posing a hazard to populations, infrastructure and the environment; (2) Problem-oriented, bespoke services uniquely designed for the discrimination and analysis of correlations between technology, geophysical response and resulting hazard. These objectives will be achieved through the Science-Industry Synergy (SIS) built by EPOS WG10, ensuring bi-directional information exchange, including unique and previously unavailable data furnished by industrial partners. The Episodes and services to be integrated have been selected using strict criteria during the EPOS PP. The data are related to a wide spectrum of inducing technologies, with seismic/aseismic deformation and production history as a minimum data set requirement, and the quality of the software services is confirmed and referenced in the literature. Implementation of TCS AH is planned for four years and requires five major activities: (1) Strategic Activities and Governance: will define and establish the governance structure to ensure the long-term sustainability of these research infrastructures for data provision through EPOS. (2) Coordination and Interaction with the Community: will establish robust communication channels within the whole TCS AH community while supporting the global EPOS communication strategy. (3) Interoperability with the EPOS Integrated Core Service (ICS) and Testing Activities: will coordinate and ensure interoperability between the RIs and the ICS. Within this modality, a functional e-research environment with access to High-Performance Computing will be built. A prototype for such an environment is already under construction and will become operational in mid-2015 (is-epos.eu). (4) Integration of AH Episodes: will address at least 20 global episodes related to conventional hydrocarbon extraction, reservoir treatment, underground mining and geothermal energy production, which will be integrated into the e-environment of TCS AH. All the multi-disciplinary heterogeneous data from these particular episodes will be transformed to unified structures to form integrated data sets articulated with the defined standards of the ICS and other TCSs. (5) Implementation of services for analyzing Episodes: will deliver the protocols and methodologies for analysis of the seismic/deformation response to time-varying georesource exploitation technologies on long and short time scales and the related time- and technology-dependent seismic hazard issues.
Pass-transistor very large scale integration
NASA Technical Reports Server (NTRS)
Maki, Gary K. (Inventor); Bhatia, Prakash R. (Inventor)
2004-01-01
Logic elements are provided that permit reductions in layout size and avoidance of hazards. Such logic elements may be included in libraries of logic cells. A logical function to be implemented by the logic element is decomposed about logical variables to identify factors corresponding to combinations of the logical variables and their complements. A pass transistor network is provided for implementing the pass network function in accordance with this decomposition. The pass transistor network includes ordered arrangements of pass transistors that correspond to the combinations of variables and complements resulting from the logical decomposition. The logic elements may act as selection circuits and be integrated with memory and buffer elements.
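In its simplest form, the decomposition referred to here is a Shannon expansion about each selected variable (our illustration of the general idea, not necessarily the patent's exact formulation):

    \[
      f(x_1,\dots,x_n) \;=\; x_i \cdot f\big|_{x_i = 1} \;+\; \overline{x}_i \cdot f\big|_{x_i = 0},
    \]

with each cofactor driving one branch of the ordered pass-transistor network and x_i and its complement acting as the pass (select) signals.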
Weather Avoidance Using Route Optimization as a Decision Aid: An AWIN Topical Study. Phase 1
NASA Technical Reports Server (NTRS)
1998-01-01
The aviation community is faced with reducing the fatal aircraft accident rate by 80 percent within 10 years. This must be achieved even with ever-increasing traffic and a changing National Airspace System. This is not just an altruistic goal, but a real necessity, if our growing level of commerce is to continue. Honeywell Technology Center's topical study, "Weather Avoidance Using Route Optimization as a Decision Aid", addresses these pressing needs. The goal of this program is to use route optimization and user interface technologies to develop a prototype decision aid for dispatchers and pilots. This decision aid will suggest possible diversions through single or multiple weather hazards and present weather information with a human-centered design. At the conclusion of the program, we will have a laptop prototype decision aid that will be used to demonstrate concepts to industry for integration into commercialized products for dispatchers and/or pilots. With weather a factor in 30% of aircraft accidents, our program will prevent accidents by strategically avoiding weather hazards in flight. By supplying more relevant weather information in a human-centered format along with the tools to generate flight plans around weather, aircraft exposure to weather hazards can be reduced. Our program directly addresses NASA's five-year investment areas of Strategic Weather Information and Weather Operations (simulation/hazard characterization and crew/dispatch/ATC hazard monitoring, display, and decision support) (NASA Aeronautics Safety Investment Strategy: Weather Investment Recommendations, April 15, 1997). This program comprises two phases; Phase I concluded December 31, 1998. This first phase defined weather data requirements, lateral routing algorithms, and conceptual displays for a user-centered design. Phase II runs from January 1999 through September 1999. The second phase integrates vertical routing into the lateral optimizer and combines the user interface into a prototype software testbed. Phase II concludes with a dispatcher and pilot evaluation of the route optimizer decision aid. This document describes work completed in Phase I under contract with NASA Langley, August 1998 - December 1998. This report includes: (1) Discussion of how weather hazards were identified in partnership with experts, and how weather hazards were prioritized; (2) Static representations of display layouts for the integrated planning function; (3) Cost function for the 2D route optimizer; (4) Discussion of the method, access to the raw data, and the results of the flight deck user information requirements definition; (5) Itemized display format requirements identified for representing weather hazards in a route planning aid.
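As a toy illustration of the kind of cost function item (3) refers to, the sketch below plans a 2D grid route that trades path length against weather-hazard exposure; the grid, weight, and hazard field are hypothetical and not Honeywell's actual cost function.

    # Hypothetical 2D route optimization: cost = distance + HAZARD_WEIGHT * cell severity.
    import heapq

    hazard = [            # 0 = clear air, higher values = more severe weather cell
        [0, 0, 2, 0],
        [0, 3, 3, 0],
        [0, 0, 0, 0],
    ]
    ROWS, COLS = len(hazard), len(hazard[0])
    HAZARD_WEIGHT = 10.0  # assumed trade-off between distance and hazard exposure

    def neighbours(r, c):
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < ROWS and 0 <= nc < COLS:
                yield nr, nc

    def plan(start, goal):
        best = {start: 0.0}
        queue = [(0.0, start)]
        while queue:
            cost, node = heapq.heappop(queue)
            if node == goal:
                return cost
            if cost > best.get(node, float("inf")):
                continue
            for nxt in neighbours(*node):
                new_cost = cost + 1.0 + HAZARD_WEIGHT * hazard[nxt[0]][nxt[1]]
                if new_cost < best.get(nxt, float("inf")):
                    best[nxt] = new_cost
                    heapq.heappush(queue, (new_cost, nxt))
        return float("inf")

    print(plan((0, 0), (2, 3)))  # lowest-cost diversion around the severe cells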
Commercial Firm Training Practices versus Aerial Port Hazardous Cargo Frustration
2007-03-01
...locations. HAZMAT, even though highly regulated, is an integral piece of the success of the war fighters. These items can be as simple as cleaning supplies or as vital as a bullet. Every function within the military relies on HAZMAT to complete its mission. As long as conflicts are being waged the
Information Extraction for System-Software Safety Analysis: Calendar Year 2008 Year-End Report
NASA Technical Reports Server (NTRS)
Malin, Jane T.
2009-01-01
This annual report describes work to integrate a set of tools to support early model-based analysis of failures and hazards due to system-software interactions. The tools perform and assist analysts in the following tasks: 1) extract model parts from text for architecture and safety/hazard models; 2) combine the parts with library information to develop the models for visualization and analysis; 3) perform graph analysis and simulation to identify and evaluate possible paths from hazard sources to vulnerable entities and functions, in nominal and anomalous system-software configurations and scenarios; and 4) identify resulting candidate scenarios for software integration testing. There has been significant technical progress in model extraction from Orion program text sources, architecture model derivation (components and connections) and documentation of extraction sources. Models have been derived from Internal Interface Requirements Documents (IIRDs) and FMEA documents. Linguistic text processing is used to extract model parts and relationships, and the Aerospace Ontology also aids automated model development from the extracted information. Visualizations of these models assist analysts in requirements overview and in checking consistency and completeness.
Hazard Interactions and Interaction Networks (Cascades) within Multi-Hazard Methodologies
NASA Astrophysics Data System (ADS)
Gill, Joel; Malamud, Bruce D.
2016-04-01
Here we combine research and commentary to reinforce the importance of integrating hazard interactions and interaction networks (cascades) into multi-hazard methodologies. We present a synthesis of the differences between 'multi-layer single hazard' approaches and 'multi-hazard' approaches that integrate such interactions. This synthesis suggests that ignoring interactions could distort management priorities, increase vulnerability to other spatially relevant hazards or underestimate disaster risk. We proceed to present an enhanced multi-hazard framework, through the following steps: (i) describe and define three groups (natural hazards, anthropogenic processes and technological hazards/disasters) as relevant components of a multi-hazard environment; (ii) outline three types of interaction relationship (triggering, increased probability, and catalysis/impedance); and (iii) assess the importance of networks of interactions (cascades) through case-study examples (based on literature, field observations and semi-structured interviews). We further propose visualisation frameworks to represent these networks of interactions. Our approach reinforces the importance of integrating interactions between natural hazards, anthropogenic processes and technological hazards/disasters into enhanced multi-hazard methodologies. Multi-hazard approaches support the holistic assessment of hazard potential, and consequently disaster risk. We conclude by describing three ways by which understanding networks of interactions contributes to the theoretical and practical understanding of hazards, disaster risk reduction and Earth system management. Understanding interactions and interaction networks helps us to better (i) model the observed reality of disaster events, (ii) constrain potential changes in physical and social vulnerability between successive hazards, and (iii) prioritise resource allocation for mitigation and disaster risk reduction.
NASA Astrophysics Data System (ADS)
Medina, José M.; Díaz, José A.
2006-05-01
Simple visual-reaction times (VRT) were measured for a variety of stimuli selected along red-green (L-M axis) and blue-yellow [S-(L+M) axis] directions in the isoluminant plane under different adaptation stimuli. Data were plotted in terms of the RMS cone contrast in contrast-threshold units. For each opponent system, a modified Piéron function was fitted in each experimental configuration and on all adaptation stimuli. A single function did not account for all the data, confirming the existence of separate postreceptoral adaptation mechanisms in each opponent system under suprathreshold conditions. The analysis of the VRT-hazard functions suggested that both color-opponent mechanisms present a well-defined, transient-sustained structure at marked suprathreshold conditions. The influence of signal polarity and chromatic adaptation on each color axis proves the existence of asymmetries in the integrated hazard functions, suggesting separate detection mechanisms for each pole (red, green, blue, and yellow detectors).
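For context, the modified Piéron function fitted here has the general form (parameter names ours):

    \[
      \mathrm{VRT}(C) \;=\; t_0 + \beta\, C^{-\alpha},
    \]

where C is the RMS cone contrast in threshold units, t_0 is an asymptotic residual latency, and alpha and beta are free parameters; the integrated hazard function analysed is H(t) = -\ln[1 - F_{\mathrm{VRT}}(t)], whose shape carries the transient-sustained structure discussed above.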
NASA Technical Reports Server (NTRS)
Elfes, Alberto; Hall, Jeffery L.; Kulczycki, Eric A.; Cameron, Jonathan M.; Morfopoulos, Arin C.; Clouse, Daniel S.; Montgomery, James F.; Ansar, Adnan I.; Machuzak, Richard J.
2009-01-01
An architecture for autonomous operation of an aerobot (i.e., a robotic blimp) to be used in scientific exploration of planets and moons in the Solar system with an atmosphere (such as Titan and Venus) is undergoing development. This architecture is also applicable to autonomous airships that could be flown in the terrestrial atmosphere for scientific exploration, military reconnaissance and surveillance, and as radio-communication relay stations in disaster areas. The architecture was conceived to satisfy requirements to perform the following functions: a) Vehicle safing, that is, ensuring the integrity of the aerobot during its entire mission, including during extended communication blackouts. b) Accurate and robust autonomous flight control during operation in diverse modes, including launch, deployment of scientific instruments, long traverses, hovering or station-keeping, and maneuvers for touch-and-go surface sampling. c) Mapping and self-localization in the absence of a global positioning system. d) Advanced recognition of hazards and targets in conjunction with tracking of, and visual servoing toward, targets, all to enable the aerobot to detect and avoid atmospheric and topographic hazards and to identify, home in on, and hover over predefined terrain features or other targets of scientific interest. The architecture is an integrated combination of systems for accurate and robust vehicle and flight trajectory control; estimation of the state of the aerobot; perception-based detection and avoidance of hazards; monitoring of the integrity and functionality ("health") of the aerobot; reflexive safing actions; multi-modal localization and mapping; autonomous planning and execution of scientific observations; and long-range planning and monitoring of the mission of the aerobot. The prototype JPL aerobot (see figure) has been tested extensively in various areas in the California Mojave desert.
Al Shami, A; Harik, G; Alameddine, I; Bruschi, D; Garcia, D Astiaso; El-Fadel, M
2017-01-01
Oil pollution in the Mediterranean represents a serious threat to the coastal environment. Quantifying the risks associated with a potential spill is often based on results generated from oil spill models. In this study, MEDSLIK-II, an EU-funded and endorsed oil spill model, is used to assess potential oil spill scenarios at four pilot areas located along the northern, eastern, and southern Mediterranean shoreline, providing a wide range of spill conditions and coastal geomorphological characteristics. Oil spill risk assessment at the four pilot areas was quantified as a function of three oil pollution metrics that include the susceptibility of oiling per beach segment, the average volume of oiling expected in the event of beaching, and the average oil beaching time. The results show that while the three pollution metrics tend to agree in their hazard characterization when the shoreline morphology is simple, considerable differences in the quantification of the associated hazard are possible under complex coastal morphologies. These differences proved to greatly alter the evaluation of environmental risks. An integrative hazard index is proposed that encompasses the three simulated pollution metrics. The index promises to shed light on oil spill hazards and can be applied universally across the Mediterranean basin by integrating it with the unified oil spill risk assessment tool developed by the Regional Marine Pollution Emergency Response Centre for the Mediterranean (REMPEC). Copyright © 2016 Elsevier B.V. All rights reserved.
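One plausible form for such an integrative index, stated only as an illustration since the abstract does not reproduce the formula, is a weighted combination of the three normalised metrics for each beach segment i:

    \[
      H_i \;=\; w_1\,\tilde{S}_i + w_2\,\tilde{V}_i + w_3\,\tilde{T}_i,
      \qquad w_1 + w_2 + w_3 = 1,
    \]

where \tilde{S}_i, \tilde{V}_i, and \tilde{T}_i are the normalised oiling susceptibility, expected beached volume, and (inverted) beaching time.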
NASA Astrophysics Data System (ADS)
Uijt de Haag, Maarten; Venable, Kyle; Bezawada, Rajesh; Adami, Tony; Vadlamani, Ananth K.
2009-05-01
This paper discusses a sensor simulator/synthesizer framework that can be used to test and evaluate various sensor integration strategies for the implementation of an External Hazard Monitor (EHM) and Integrated Alerting and Notification (IAN) function as part of NASA's Integrated Intelligent Flight Deck (IIFD) project. The IIFD project under NASA's Aviation Safety program "pursues technologies related to the flight deck that ensure crew workload and situational awareness are both safely optimized and adapted to the future operational environment as envisioned by NextGen." Within the simulation framework, various inputs to the IIFD and its subsystems, the EHM and IAN, are simulated, synthesized from actual collected data, or played back from actual flight test sensor data. Sensors and avionics included in this framework are TCAS, ADS-B, Forward-Looking Infrared, vision cameras, GPS, inertial navigators, EGPWS, Laser Detection and Ranging sensors, altimeters, communication links with ATC, and weather radar. The framework is implemented in Simulink, a modeling language developed by The MathWorks. This modeling language allows for the test and evaluation of various sensor and communication link configurations as well as the inclusion of feedback from the pilot on the performance of the aircraft. Specifically, this paper addresses the architecture of the simulator, the sensor model interfaces, the timing and database (environment) aspects of the sensor models, the user interface of the modeling environment, and the various avionics implementations.
Integrated Safety Analysis Tiers
NASA Technical Reports Server (NTRS)
Shackelford, Carla; McNairy, Lisa; Wetherholt, Jon
2009-01-01
Commercial partnerships and organizational constraints, combined with complex systems, may lead to division of hazard analysis across organizations. This division could cause important hazards to be overlooked, causes to be missed, controls for a hazard to be incomplete, or verifications to be inefficient. Each organization's team must understand at least one level beyond the interface sufficiently to comprehend integrated hazards. This paper will discuss various ways to properly divide analysis among organizations. The Ares I launch vehicle integrated safety analysis effort will be utilized to illustrate an approach that addresses the key issues and concerns arising from multiple analysis responsibilities.
Integration of functional safety systems on the Daniel K. Inouye Solar Telescope
NASA Astrophysics Data System (ADS)
Williams, Timothy R.; Hubbard, Robert P.; Shimko, Steve
2016-07-01
The Daniel K. Inouye Solar Telescope (DKIST) was envisioned from an early stage to incorporate a functional safety system to ensure the safety of personnel and equipment within the facility. Early hazard analysis showed the need for a functional safety system. The design used a distributed approach in which each major subsystem contains a PLC-based safety controller. This PLC-based system complies with the latest international standards for functional safety. The use of a programmable controller also allows for flexibility to incorporate changes in the design of subsystems without adversely impacting safety. Various subsystems were built by different contractors and project partners but had to function as pieces of the overall control system. Using distributed controllers allows project contractors and partners to build components as standalone subsystems that then need to be integrated into the overall functional safety system. Recently, factory testing was concluded on the major subsystems of the facility. Final integration of these subsystems is currently underway on site. Building on lessons learned in early factory tests, changes to the interface between subsystems were made to improve the speed and ease of integration of the entire system. Because of the distributed design, each subsystem can be brought online as it is delivered and assembled rather than waiting until the entire facility is finished. This enhances safety during the risky period of integration and testing. The DKIST has implemented a functional safety system that allows subsystems to be built in geographically diverse locations yet function cohesively once they are integrated into the facility currently under construction.
NASA Astrophysics Data System (ADS)
Huong, Do Thi Viet; Nagasawa, Ryota
2014-01-01
The potential flood hazard was assessed for the Hoa Chau commune in central Vietnam in order to identify the high flood hazard zones for the decision makers who will execute future rural planning. A new approach for deriving the potential flood hazard based on the integration of inundation and flow direction maps is described. Areas inundated in the historical flood event of 2007 were extracted from Advanced Land Observing Satellite (ALOS) Phased Array type L-band Synthetic Aperture Radar (PALSAR) images, while flow direction characteristics were derived from the ASTER GDEM to extract the depressed surfaces. Past flood experience and the flow direction were then integrated to analyze and rank the potential flood hazard zones. The land use/cover map extracted from LANDSAT TM and flood depth point records from field surveys were utilized to verify the susceptible inundated areas extracted from ALOS PALSAR data and the ranking of the potential flood hazard. The estimation of potential flood hazard areas revealed that 17.43% and 17.36% of Hoa Chau had high and medium potential flood hazards, respectively. The flow direction and ALOS PALSAR data were effectively integrated for determining the potential flood hazard when hydrological and meteorological data were inadequate and remote sensing images taken during flood times were not available or were insufficient.
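A toy raster overlay in the spirit of the described integration (values and ranking rule are assumed, not the paper's): past inundation from PALSAR combined with depressed surfaces derived from the flow direction analysis.

    # Toy example: 1 = cell flooded in the 2007 event (PALSAR), 1 = depressed surface (ASTER GDEM).
    import numpy as np

    inundated_2007 = np.array([[1, 1, 0],
                               [0, 1, 0],
                               [0, 0, 0]])
    depressed      = np.array([[1, 0, 0],
                               [1, 1, 0],
                               [0, 1, 0]])

    # Simple ranking: both factors -> high (2), one factor -> medium (1), none -> low (0).
    hazard_rank = inundated_2007 + depressed
    print(hazard_rank)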
Human System Integration: Regulatory Analysis
NASA Technical Reports Server (NTRS)
2005-01-01
This document was intended as an input to the Access 5 Policy Integrated Product Team. Using a Human System Integration (HSI) perspective, a regulatory analysis of the FARs (specifically Part 91), the Airman's Information Manual (AIM), and the FAA Controllers Handbook (7110.65) was conducted as part of a front-end approach needed to derive HSI requirements for Unmanned Aircraft Systems (UAS) operations in the National Airspace System above FL430. The review of the above aviation reference materials yielded eighty-four functions determined to be necessary or highly desirable for flight within the Air Traffic Management System. They include categories for Flight, Communications, Navigation, Surveillance, and Hazard Avoidance.
Integration and Validation of Avian Radars (IVAR)
2011-08-01
hazards of electromagnetic radiation to fuel; HERO: hazards of electromagnetic radiation to ordnance; HERP: hazards of electromagnetic radiation to personnel ... Radiation hazard to humans, fuels, and ordnance can be easily managed. Demonstration of how operation of radars can meet hazards of electromagnetic radiation to personnel (HERP), hazards of electromagnetic radiation to
Integration of Aquifer Storage Transfer and Recovery and HACCP for Ensuring Drinking Water Quality
NASA Astrophysics Data System (ADS)
Lee, S. I.; Ji, H. W.
2015-12-01
The integration of ASTR (Aquifer Storage Transfer and Recovery) and HACCP (Hazard Analysis and Critical Control Point) is being attempted to ensure drinking water quality in a delta area. ASTR is a water supply system in which surface water is injected into a well for storage and recovered from a different well. During the process natural water treatment is achieved in the aquifer. ASTR has advantages over surface reservoirs in that the water is protected from external contaminants and free from water loss by evaporation. HACCP, which originated in the food industry, can efficiently manage hazards and reduce risks when it is introduced to drinking water production. The study area is located in the Nakdong River Delta, South Korea. Water quality of this region has deteriorated due to the increased pollution loads from the upstream cities and industrial complexes. ASTR equipped with a HACCP system is suggested as a means to heighten the public trust in drinking water. After the drinking water supply system using ASTR was decomposed into ten processes, principles of HACCP were applied. Hazardous event analysis was conducted for 114 hazardous events and nine major hazardous events were identified based on the likelihood and the severity assessment. Potential risk of chemical hazards, as a function of amounts, travel distance and toxicity, was evaluated and the result shows the relative threat a city poses to the drinking water supply facility. Next, critical control points were determined using decision tree analysis. Critical limits, maximum and/or minimum values to which biological, chemical or physical parameters must be controlled, were established. Other procedures, such as monitoring and corrective actions, will also be presented.
Hazard interactions and interaction networks (cascades) within multi-hazard methodologies
NASA Astrophysics Data System (ADS)
Gill, Joel C.; Malamud, Bruce D.
2016-08-01
This paper combines research and commentary to reinforce the importance of integrating hazard interactions and interaction networks (cascades) into multi-hazard methodologies. We present a synthesis of the differences between multi-layer single-hazard approaches and multi-hazard approaches that integrate such interactions. This synthesis suggests that ignoring interactions between important environmental and anthropogenic processes could distort management priorities, increase vulnerability to other spatially relevant hazards or underestimate disaster risk. In this paper we proceed to present an enhanced multi-hazard framework through the following steps: (i) description and definition of three groups (natural hazards, anthropogenic processes and technological hazards/disasters) as relevant components of a multi-hazard environment, (ii) outlining of three types of interaction relationship (triggering, increased probability, and catalysis/impedance), and (iii) assessment of the importance of networks of interactions (cascades) through case study examples (based on the literature, field observations and semi-structured interviews). We further propose two visualisation frameworks to represent these networks of interactions: hazard interaction matrices and hazard/process flow diagrams. Our approach reinforces the importance of integrating interactions between different aspects of the Earth system, together with human activity, into enhanced multi-hazard methodologies. Multi-hazard approaches support the holistic assessment of hazard potential and consequently disaster risk. We conclude by describing three ways by which understanding networks of interactions contributes to the theoretical and practical understanding of hazards, disaster risk reduction and Earth system management. Understanding interactions and interaction networks helps us to better (i) model the observed reality of disaster events, (ii) constrain potential changes in physical and social vulnerability between successive hazards, and (iii) prioritise resource allocation for mitigation and disaster risk reduction.
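To illustrate the interaction-network (cascade) idea in code, the sketch below encodes a few hypothetical triggering links and walks the resulting cascade from a primary hazard; the hazards and links are invented examples, not the paper's case studies.

    from collections import deque

    # Hypothetical triggering links: hazard -> hazards it can trigger
    interactions = {
        "earthquake": ["landslide", "tsunami"],
        "landslide": ["flood"],          # e.g. via river damming
        "storm": ["flood", "landslide"],
        "flood": [],
        "tsunami": [],
    }

    def cascade(primary):
        """Breadth-first walk of the interaction network starting from a primary hazard."""
        seen, queue = {primary}, deque([primary])
        order = []
        while queue:
            h = queue.popleft()
            order.append(h)
            for nxt in interactions.get(h, []):
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append(nxt)
        return order

    print(cascade("earthquake"))  # ['earthquake', 'landslide', 'tsunami', 'flood']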
Integrated Satellite Control in REIMEI (INDEX) Satellite
NASA Astrophysics Data System (ADS)
Fukuda, Seisuke; Mizuno, Takahide; Sakai, Shin-Ichiro; Fukushima, Yousuke; Saito, Hirobumi
REIMEI/INDEX (INnovative-technology Demonstration EXperiment) is a 70kg class small satellite which the Institute of Space and Astronautical Science, Japan Aerospace Exploration Agency (ISAS/JAXA), has developed for observation of auroral small-scale dynamics as well as demonstration of advanced satellite technologies. An important engineering mission of REIMEI is integrated satellite control using commercial RISC CPUs with a triple voting system in order to ensure fault-tolerance against radiation hazards. Software modules concerning every satellite function, such as attitude control, data handling, and mission applications, work cooperatively so that highly sophisticated satellite control can be performed. In this paper, after a concept of the integrated satellite control is introduced, the Integrated Controller Unit (ICU) is described in detail. Also unique topics in developing the integrated control system are shown.
Chen, Keping; Blong, Russell; Jacobson, Carol
2003-04-01
This paper develops a GIS-based integrated approach to risk assessment in natural hazards, with reference to bushfires. The challenges for undertaking this approach have three components: data integration, risk assessment tasks, and risk decision-making. First, data integration in GIS is a fundamental step for subsequent risk assessment tasks and risk decision-making. A series of spatial data integration issues within GIS such as geographical scales and data models are addressed. Particularly, the integration of both physical environmental data and socioeconomic data is examined with an example linking remotely sensed data and areal census data in GIS. Second, specific risk assessment tasks, such as hazard behavior simulation and vulnerability assessment, should be undertaken in order to understand complex hazard risks and provide support for risk decision-making. For risk assessment tasks involving heterogeneous data sources, the selection of spatial analysis units is important. Third, risk decision-making concerns spatial preferences and/or patterns, and a multicriteria evaluation (MCE)-GIS typology for risk decision-making is presented that incorporates three perspectives: spatial data types, data models, and methods development. Both conventional MCE methods and artificial intelligence-based methods with GIS are identified to facilitate spatial risk decision-making in a rational and interpretable way. Finally, the paper concludes that the integrated approach can be used to assist risk management of natural hazards, in theory and in practice.
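One common MCE operator in such GIS-based risk decision-making is a weighted linear combination of normalised criterion layers; the sketch below applies hypothetical layers and weights and stands in for only the simplest of the conventional MCE methods discussed.

    import numpy as np

    # Hypothetical criterion layers (already normalised to 0-1), e.g. fuel hazard,
    # slope, and proximity to assets, each on the same raster grid.
    criteria = np.stack([
        np.array([[0.2, 0.8], [0.5, 0.9]]),   # fuel hazard
        np.array([[0.1, 0.4], [0.7, 0.6]]),   # slope
        np.array([[0.9, 0.3], [0.2, 0.8]]),   # proximity to assets
    ])
    weights = np.array([0.5, 0.2, 0.3])       # stakeholder-derived weights (sum to 1)

    risk_surface = np.tensordot(weights, criteria, axes=1)  # weighted linear combination
    print(risk_surface)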
The Integrated Hazard Analysis Integrator
NASA Technical Reports Server (NTRS)
Morris, A. Terry; Massie, Michael J.
2009-01-01
Hazard analysis addresses hazards that arise in the design, development, manufacturing, construction, facilities, transportation, operations and disposal activities associated with hardware, software, maintenance, operations and environments. An integrated hazard is an event or condition that is caused by or controlled by multiple systems, elements, or subsystems. Integrated hazard analysis (IHA) is especially daunting and ambitious for large, complex systems such as NASA's Constellation program which incorporates program, systems and element components that impact others (International Space Station, public, International Partners, etc.). An appropriate IHA should identify all hazards, causes, controls and verifications used to mitigate the risk of catastrophic loss of crew, vehicle and/or mission. Unfortunately, in the current age of increased technology dependence, there is the tendency to sometimes overlook the necessary and sufficient qualifications of the integrator, that is, the person/team that identifies the parts, analyzes the architectural structure, aligns the analysis with the program plan and then communicates/coordinates with large and small components, each contributing necessary hardware, software and/or information to prevent catastrophic loss. As seen in both the Challenger and Columbia accidents, lack of appropriate communication, management errors and lack of resources dedicated to safety were cited as major contributors to these fatalities. From the accident reports, it would appear that the organizational impact of managers, integrators and safety personnel contributes more significantly to mission success and mission failure than purely technological components. If this is so, then organizations that sincerely desire mission success must put as much effort into selecting managers and integrators as they do when designing the hardware, writing the software code and analyzing competitive proposals. This paper will discuss the necessary and sufficient requirements of one of the significant contributors to mission success, the IHA integrator. Discussions will be provided to describe both the mindset required as well as deleterious assumptions/behaviors to avoid when integrating within a large scale system.
NASA Astrophysics Data System (ADS)
Armstrong, Michael James
Increases in power demands and changes in the design practices of overall equipment manufacturers have led to a new paradigm in vehicle systems definition. The development of unique power systems architectures is of increasing importance to overall platform feasibility and must be pursued early in the aircraft design process. Many vehicle systems architecture trades must be conducted concurrent to platform definition. With an increased complexity introduced during conceptual design, accurate predictions of unit level sizing requirements must be made. Architecture specific emergent requirements must be identified which arise due to the complex integrated effect of unit behaviors. Off-nominal operating scenarios present sizing critical requirements to the aircraft vehicle systems. These requirements are architecture specific and emergent. Standard heuristically defined failure mitigation is sufficient for sizing traditional and evolutionary architectures. However, architecture concepts which vary significantly in terms of structure and composition require that unique failure mitigation strategies be defined for accurate estimations of unit level requirements. Identifying these off-nominal emergent operational requirements requires extensions to traditional safety and reliability tools and the systematic identification of optimal performance degradation strategies. Discrete operational constraints posed by traditional Functional Hazard Assessment (FHA) are replaced by continuous relationships between function loss and operational hazard. These relationships pose the objective function for hazard minimization. Load shedding optimization is performed for all statistically significant failures by varying the allocation of functional capability throughout the vehicle systems architecture. Expressing hazards, and thereby, reliability requirements as continuous relationships with the magnitude and duration of functional failure requires augmentations to the traditional means for system safety assessment (SSA). The traditional two state and discrete system reliability assessment proves insufficient. Reliability is, therefore, handled in an analog fashion: as a function of magnitude of failure and failure duration. A series of metrics are introduced which characterize system performance in terms of analog hazard probabilities. These include analog and cumulative system and functional risk, hazard correlation, and extensions to the traditional component importance metrics. Continuous FHA, load shedding optimization, and analog SSA constitute the SONOMA process (Systematic Off-Nominal Requirements Analysis). Analog system safety metrics inform both architecture optimization (changes in unit level capability and reliability) and architecture augmentation (changes in architecture structure and composition). This process was applied for two vehicle systems concepts (conventional and 'more-electric') in terms of loss/hazard relationships with varying degrees of fidelity. Application of this process shows that the traditional assumptions regarding the structure of the function loss vs. hazard relationship apply undue design bias to functions and components during exploratory design. This bias is illustrated in terms of inaccurate estimations of the system and function level risk and unit level importance. It was also shown that off-nominal emergent requirements must be defined specific to each architecture concept.
Quantitative comparisons of architecture specific off-nominal performance were obtained which provide evidence of the need for accurate definition of load shedding strategies during architecture exploratory design. Formally expressing performance degradation strategies in terms of the minimization of a continuous hazard space enhances the system architect's ability to accurately predict sizing critical emergent requirements concurrent to architecture definition. Furthermore, the methods and frameworks generated here provide a structured and flexible means for eliciting these architecture specific requirements during the performance of architecture trades.
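A minimal sketch of the load-shedding optimisation idea under a continuous loss-versus-hazard relationship, assuming a single bus whose remaining power must be allocated among three functions; the hazard curve, demands, and scipy-based minimisation are illustrative stand-ins, not the SONOMA implementation.

    import numpy as np
    from scipy.optimize import minimize

    # Continuous hazard as a function of the fraction of each function that is lost
    # (0 = fully served, 1 = fully shed).  Shapes and coefficients are hypothetical.
    def hazard(loss_fractions):
        flight_controls, deice, cabin = loss_fractions
        return 10.0 * flight_controls**2 + 3.0 * deice + 0.5 * cabin

    demand = np.array([40.0, 25.0, 15.0])   # kW demanded by each function
    available = 55.0                        # kW remaining after the failure

    cons = ({"type": "ineq",
             "fun": lambda x: available - np.dot(demand, 1.0 - x)},)  # served load <= available
    bounds = [(0.0, 1.0)] * 3

    res = minimize(hazard, x0=[0.3, 0.3, 0.3], bounds=bounds, constraints=cons)
    print(res.x)  # load-shedding allocation that minimises the continuous hazard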
Detecting Traversable Area and Water Hazards for the Visually Impaired with a pRGB-D Sensor
Yang, Kailun; Wang, Kaiwei; Cheng, Ruiqi; Hu, Weijian; Huang, Xiao; Bai, Jian
2017-01-01
The use of RGB-Depth (RGB-D) sensors for assisting visually impaired people (VIP) has been widely reported as they offer portability, function-diversity and cost-effectiveness. However, conventional RGB-D sensing provides only weak cues for traversability awareness and no precautions against stepping into water areas. In this paper, a polarized RGB-Depth (pRGB-D) framework is proposed to detect traversable area and water hazards simultaneously with polarization-color-depth-attitude information to enhance safety during navigation. The approach has been tested on a pRGB-D dataset, which is built for tuning parameters and evaluating the performance. Moreover, the approach has been integrated into a wearable prototype which generates stereo sound feedback to guide visually impaired people (VIP) to follow the prioritized direction to avoid obstacles and water hazards. Furthermore, a preliminary study with ten blindfolded participants suggests its effectiveness and reliability. PMID:28817069
Choi, Subin; Park, Kyeonghwan; Lee, Seungwook; Lim, Yeongjin; Oh, Byungjoo; Chae, Hee Young; Park, Chan Sam; Shin, Heugjoo; Kim, Jae Joon
2018-03-02
This paper presents a resolution-reconfigurable wide-range resistive sensor readout interface for wireless multi-gas monitoring applications that displays results on a smartphone. Three types of sensing resolutions were selected to minimize processing power consumption, and a dual-mode front-end structure was proposed to support the detection of a variety of hazardous gases with wide range of characteristic resistance. The readout integrated circuit (ROIC) was fabricated in a 0.18 μm CMOS process to provide three reconfigurable data conversions that correspond to a low-power resistance-to-digital converter (RDC), a 12-bit successive approximation register (SAR) analog-to-digital converter (ADC), and a 16-bit delta-sigma modulator. For functional feasibility, a wireless sensor system prototype that included in-house microelectromechanical (MEMS) sensing devices and commercial device products was manufactured and experimentally verified to detect a variety of hazardous gases.
Qiao, Yuanhua; Keren, Nir; Mannan, M Sam
2009-08-15
Risk assessment and management of transportation of hazardous materials (HazMat) require the estimation of accident frequency. This paper presents a methodology to estimate hazardous materials transportation accident frequency by utilizing publicly available databases and expert knowledge. The estimation process addresses route-dependent and route-independent variables. Negative binomial regression is applied to an analysis of the Department of Public Safety (DPS) accident database to derive basic accident frequency as a function of route-dependent variables, while the effects of route-independent variables are modeled by fuzzy logic. The integrated methodology provides the basis for an overall transportation risk analysis, which can be used later to develop a decision support system.
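For the route-dependent part, a negative binomial regression of segment-level accident counts on route attributes with an exposure offset might look like the following sketch; the column names and data are hypothetical, and the fuzzy-logic adjustment for route-independent variables is omitted.

    import pandas as pd
    import statsmodels.api as sm

    # Hypothetical segment-level data: accident counts, traffic exposure, and route attributes
    df = pd.DataFrame({
        "accidents": [2, 0, 5, 1, 3, 0, 4, 2],
        "log_vehicle_miles": [6.1, 5.2, 7.0, 5.8, 6.5, 5.0, 6.8, 6.0],  # offset (exposure)
        "lanes": [2, 2, 4, 3, 4, 2, 4, 3],
        "rural": [1, 1, 0, 1, 0, 1, 0, 0],
    })

    X = sm.add_constant(df[["lanes", "rural"]])
    model = sm.GLM(df["accidents"], X,
                   family=sm.families.NegativeBinomial(alpha=1.0),
                   offset=df["log_vehicle_miles"])
    fit = model.fit()
    print(fit.params)   # basic accident frequency as a function of route-dependent variables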
HSI Guidelines Outline for the Air Vehicle Control Station. Version 2
NASA Technical Reports Server (NTRS)
2006-01-01
This document provides guidance to the FAA and manufacturers on how to develop UAS Pilot Vehicle Interfaces to safely and effectively integrate UASs into the NAS. Preliminary guidelines are provided for Aviate, Communicate, Navigate and Avoid Hazard functions. The pilot shall have information and control capability so that pilot-UA interactions are not adverse or unfavorable and do not compromise safety. Unfavorable interactions include anomalous aircraft-pilot coupling (APC) interactions (closed loop), pilot-involved oscillations (categories I, II or III), and non-oscillatory APC events (e.g., divergence).
Human Systems Integration Pilot-Technology Interface Requirements for Command, Control, and Communications (C3)
Developing an online tool for identifying at-risk populations to wildfire smoke hazards.
Vaidyanathan, Ambarish; Yip, Fuyuen; Garbe, Paul
2018-04-01
Wildfire episodes pose a significant public health threat in the United States. Adverse health impacts associated with wildfires occur near the burn area as well as in places far downwind due to wildfire smoke exposures. Health effects associated with exposure to particulate matter arising from wildfires can range from mild eye and respiratory tract irritation to more serious outcomes such as asthma exacerbation, bronchitis, and decreased lung function. Real-time operational forecasts of wildfire smoke concentrations are available but they are not readily integrated with information on vulnerable populations necessary to identify at-risk communities during wildfire smoke episodes. Efforts are currently underway at the Centers for Disease Control and Prevention (CDC) to develop an online tool that utilizes short-term predictions and forecasts of smoke concentrations and integrates them with measures of population-level vulnerability for identifying at-risk populations to wildfire smoke hazards. The tool will be operationalized on a national scale, seeking input and assistance from several academic, federal, state, local, Tribal, and Territorial partners. The final product will then be incorporated into CDC's National Environmental Public Health Tracking Network (http://ephtracking.cdc.gov), providing users with access to a suite of mapping and display functionalities. A real-time vulnerability assessment tool incorporating standardized health and exposure datasets, and prevention guidelines related to wildfire smoke hazards is currently unavailable for public health practitioners and emergency responders. This tool could strengthen existing situational awareness competencies, and expedite future response and recovery efforts during wildfire episodes. Published by Elsevier B.V.
Method for detecting and avoiding flight hazards
NASA Astrophysics Data System (ADS)
von Viebahn, Harro; Schiefele, Jens
1997-06-01
Today's aircraft equipment comprises several independent warning and hazard avoidance systems like GPWS, TCAS or weather radar. It is the pilot's task to monitor all these systems and take the appropriate action in case of an emerging hazardous situation. The developed method for detecting and avoiding flight hazards combines all potential external threats for an aircraft into a single system. It is based on an aircraft surrounding airspace model consisting of discrete volume elements. For each element of the volume the threat probability is derived or computed from sensor output, databases, or information provided via datalink. The position of the own aircraft is predicted by utilizing a probability distribution. This approach ensures that all potential positions of the aircraft within the near future are considered while weighting the most likely flight path. A conflict detection algorithm initiates an alarm in case the threat probability exceeds a threshold. An escape manoeuvre is generated taking into account all potential hazards in the vicinity, not only the one which caused the alarm. The pilot gets visual information about the type, the location, and the severity of the threat. The algorithm was implemented and tested in a flight simulator environment. The current version comprises traffic, terrain and obstacle hazard avoidance functions. Its general formulation allows an easy integration of e.g. weather information or airspace restrictions.
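A minimal sketch of the volume-element formulation, assuming the surrounding airspace is discretised into a 3-D grid of threat probabilities and the predicted own-ship position is a probability distribution over the same grid; the combination rule and alerting threshold are hypothetical.

    import numpy as np

    rng = np.random.default_rng(1)
    shape = (20, 20, 10)                      # discretised airspace volume elements

    threat_prob = rng.random(shape) * 0.05    # per-element threat probability (terrain, traffic, ...)
    threat_prob[10, 10, 5] = 0.9              # an intruder / obstacle element

    ownship_prob = np.zeros(shape)            # predicted own-ship position distribution
    ownship_prob[9:12, 9:12, 4:7] = 1.0
    ownship_prob /= ownship_prob.sum()

    conflict_risk = float(np.sum(threat_prob * ownship_prob))  # expected threat along predicted path
    ALERT_THRESHOLD = 0.02                    # hypothetical alerting threshold
    if conflict_risk > ALERT_THRESHOLD:
        print(f"conflict alert: risk={conflict_risk:.3f}")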
Total Risk Integrated Methodology (TRIM) - TRIM.Risk
TRIM.Risk is used to integrate the information on exposure received from TRIM.FaTE or TRIM.Expo with that on dose-response or hazard assessment and to provide quantitative descriptions of risk or hazard and some of the attendant uncertainties.
Flight Testing ALHAT Precision Landing Technologies Integrated Onboard the Morpheus Rocket Vehicle
NASA Technical Reports Server (NTRS)
Carson, John M. III; Robertson, Edward A.; Trawny, Nikolas; Amzajerdian, Farzin
2015-01-01
A suite of prototype sensors, software, and avionics developed within the NASA Autonomous precision Landing and Hazard Avoidance Technology (ALHAT) project were terrestrially demonstrated onboard the NASA Morpheus rocket-propelled Vertical Testbed (VTB) in 2014. The sensors included a LIDAR-based Hazard Detection System (HDS), a Navigation Doppler LIDAR (NDL) velocimeter, and a long-range Laser Altimeter (LAlt) that enable autonomous and safe precision landing of robotic or human vehicles on solid solar system bodies under varying terrain lighting conditions. The flight test campaign with the Morpheus vehicle involved a detailed integration and functional verification process, followed by tether testing and six successful free flights, including one night flight. The ALHAT sensor measurements were integrated into a common navigation solution through a specialized ALHAT Navigation filter that was employed in closed-loop flight testing within the Morpheus Guidance, Navigation and Control (GN&C) subsystem. Flight testing on Morpheus utilized ALHAT for safe landing site identification and ranking, followed by precise surface-relative navigation to the selected landing site. The successful autonomous, closed-loop flight demonstrations of the prototype ALHAT system have laid the foundation for the infusion of safe, precision landing capabilities into future planetary exploration missions.
NASA Astrophysics Data System (ADS)
Hammitzsch, Martin; Spazier, Johannes; Reißland, Sven
2016-04-01
The TRIDEC Cloud is a platform that merges several complementary cloud-based services for instant tsunami propagation calculations and automated background computation with graphics processing units (GPU), for web-mapping of hazard specific geospatial data, and for serving relevant functionality to handle, share, and communicate threat specific information in a collaborative and distributed environment. The platform offers a modern web-based graphical user interface so that operators in warning centres and stakeholders of other involved parties (e.g. CPAs, ministries) just need a standard web browser to access a full-fledged early warning and information system with unique interactive features such as Cloud Messages and Shared Maps. Furthermore, the TRIDEC Cloud can be accessed in different modes, e.g. the monitoring mode, which provides important functionality required to act in a real event, and the exercise-and-training mode, which enables training and exercises with virtual scenarios re-played by a scenario player. The software system architecture and open interfaces facilitate global coverage so that the system is applicable for any region in the world and allow the integration of different sensor systems as well as the integration of other hazard types and use cases different to tsunami early warning. Current advances of the TRIDEC Cloud platform will be summarized in this presentation.
The risk concept and its application in natural hazard risk management in Switzerland
NASA Astrophysics Data System (ADS)
Bründl, M.; Romang, H. E.; Bischof, N.; Rheinberger, C. M.
2009-05-01
Over the last ten years, a risk-based approach to manage natural hazards - termed the risk concept - has been introduced to the management of natural hazards in Switzerland. Large natural hazard events, new political initiatives and limited financial resources have led to the development and introduction of new planning instruments and software tools that should support natural hazard engineers and planners to effectively and efficiently deal with natural hazards. Our experience with these new instruments suggests an improved integration of the risk concept into the community of natural hazard engineers and planners. Important factors for the acceptance of these new instruments are the integration of end-users during the development process, the knowledge exchange between science, developers and end-users as well as training and education courses for users. Further improvements require the maintenance of this knowledge exchange and a mindful adaptation of the instruments to case-specific circumstances.
Performance Analysis: Work Control Events Identified January - August 2010
DOE Office of Scientific and Technical Information (OSTI.GOV)
De Grange, C E; Freeman, J W; Kerr, C E
2011-01-14
This performance analysis evaluated 24 events that occurred at LLNL from January through August 2010. The analysis identified areas of potential work control process and/or implementation weaknesses and several common underlying causes. Human performance improvement and safety culture factors were part of the causal analysis of each event and were analyzed. The collective significance of all events in 2010, as measured by the occurrence reporting significance category and by the proportion of events that have been reported to the DOE ORPS under the "management concerns" reporting criteria, does not appear to have increased in 2010. The frequency of reporting in each of the significance categories has not changed in 2010 compared to the previous four years. There is no change indicating a trend in the significance category and there has been no increase in the proportion of occurrences reported in the higher significance category. Also, the frequency of events, 42 events reported through August 2010, is not greater than in previous years and is below the average of 63 occurrences per year at LLNL since 2006. Over the previous four years, an average of 43% of LLNL's reported occurrences have been reported as either "management concerns" or "near misses." In 2010, 29% of the occurrences have been reported as "management concerns" or "near misses." This rate indicates that LLNL is now reporting fewer "management concern" and "near miss" occurrences compared to the previous four years. From 2008 to the present, LLNL senior management has undertaken a series of initiatives to strengthen the work planning and control system with the primary objective to improve worker safety. In 2008, the LLNL Deputy Director established the Work Control Integrated Project Team to develop the core requirements and graded elements of an institutional work planning and control system. By the end of that year this system was documented and implementation had begun. In 2009, training of the workforce began and as of the time of this report more than 50% of authorized Integration Work Sheets (IWS) use the activity-based planning process. In 2010, LSO independently reviewed the work planning and control process and confirmed to the Laboratory that the Integrated Safety Management (ISM) System was implemented. LLNL conducted a cross-directorate management self-assessment of work planning and control and is developing actions to respond to the issues identified. Ongoing efforts to strengthen the work planning and control process and to improve the quality of LLNL work packages are in progress: completion of remaining actions in response to the 2009 DOE Office of Health, Safety, and Security (HSS) evaluation of LLNL's ISM System; scheduling more than 14 work planning and control self-assessments in FY11; continuing to align subcontractor work control with the Institutional work planning and control system; and continuing to maintain the electronic IWS application. The 24 events included in this analysis were caused by errors in the first four of the five ISMS functions. The most frequent cause was errors in analyzing the hazards (Function 2). The second most frequent cause was errors occurring when defining the work (Function 1), followed by errors during the performance of work (Function 4). Interestingly, very few errors in developing controls (Function 3) resulted in events.
This leads one to conclude that if improvements are made to defining the scope of work and analyzing the potential hazards, LLNL may reduce the frequency or severity of events. Analysis of the 24 events resulted in the identification of ten common causes. Some events had multiple causes, resulting in 39 causes being identified across the 24 events. The most frequent cause was workers, supervisors, or experts believing they understood the work and the hazards but their understanding was incomplete. The second most frequent cause was unclear, incomplete or confusing documents directing the work. Together, these two causes were mentioned 17 times and contributed to 13 of the events. All of the events with the cause of "workers, supervisors, or experts believing they understood the work and the hazards but their understanding was incomplete" had this error in the first two ISMS functions: define the work and analyze the hazard. This means that these causes result in the scope of work being ill-defined or the hazard(s) improperly analyzed. Incomplete implementation of these functional steps leads to the hazards not being controlled. The causes are then manifested in events when the work is conducted. The process to operate safely relies on accurately defining the scope of work. This review has identified a number of examples of latent organizational weakness in the execution of work control processes.
75 FR 15485 - Pipeline Safety: Workshop on Guidelines for Integrity Assessment of Cased Pipe
Federal Register 2010, 2011, 2012, 2013, 2014
2010-03-29
... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration [Docket ID...: Pipeline and Hazardous Materials Safety Administration (PHMSA), DOT. ACTION: Notice of workshop. SUMMARY... ``Guidelines for Integrity Assessment of Cased Pipe in Gas Transmission Pipelines'' and related Frequently...
Often, human health risk assessments have relied on qualitative approaches for hazard identification to integrate evidence across multiple studies to conclude whether particular hazards exist. However, quantitative approaches for evidence integration, including the application o...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-02-02
... Management Program for Gas Distribution Pipelines; Correction AGENCY: Pipeline and Hazardous Materials Safety... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration 49 CFR Part... Regulations to require operators of gas distribution pipelines to develop and implement integrity management...
77 FR 34123 - Pipeline Safety: Public Meeting on Integrity Management of Gas Distribution Pipelines
Federal Register 2010, 2011, 2012, 2013, 2014
2012-06-08
... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration [Docket No. PHMSA-2012-0100] Pipeline Safety: Public Meeting on Integrity Management of Gas Distribution Pipelines AGENCY: Office of Pipeline Safety, Pipeline and Hazardous Materials Safety Administration, DOT. ACTION...
Felker, G Michael; Fiuzat, Mona; Thompson, Vivian; Shaw, Linda K; Neely, Megan L; Adams, Kirkwood F; Whellan, David J; Donahue, Mark P; Ahmad, Tariq; Kitzman, Dalane W; Piña, Ileana L; Zannad, Faiez; Kraus, William E; O'Connor, Christopher M
2013-11-01
ST2 is involved in cardioprotective signaling in the myocardium and has been identified as a potentially promising biomarker in heart failure (HF). We evaluated ST2 levels and their association with functional capacity and long-term clinical outcomes in a cohort of ambulatory patients with HF enrolled in the Heart Failure: A Controlled Trial Investigating Outcomes of Exercise Training (HF-ACTION) study-a multicenter, randomized study of exercise training in HF. HF-ACTION randomized 2331 patients with left ventricular ejection fraction <0.35 and New York Heart Association class II to IV HF to either exercise training or usual care. ST2 was analyzed in a subset of 910 patients with evaluable plasma samples. Correlations and Cox models were used to assess the relationship among ST2, functional capacity, and long-term outcomes. The median baseline ST2 level was 23.7 ng/mL (interquartile range, 18.6-31.8). ST2 was modestly associated with measures of functional capacity. In univariable analysis, ST2 was significantly associated with death or hospitalization (hazard ratio, 1.48; P<0.0001), cardiovascular death or HF hospitalization (hazard ratio, 2.14; P<0.0001), and all-cause mortality (hazard ratio, 2.33; P<0.0001; all hazard ratios for log2 ng/mL). In multivariable models, ST2 remained independently associated with outcomes after adjustment for clinical variables and amino-terminal pro-B-type natriuretic peptide. However, ST2 did not add significantly to reclassification of risk as assessed by changes in the C statistic, net reclassification improvement, and integrated discrimination improvement. ST2 was modestly associated with functional capacity and was significantly associated with outcomes in a well-treated cohort of ambulatory patients with HF although it did not significantly affect reclassification of risk. URL: http://www.clinicaltrials.gov. Unique identifier: NCT00047437.
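For readers unfamiliar with hazard ratios reported per log2 ng/mL, the sketch below fits a Cox proportional hazards model to hypothetical data with the lifelines package; it illustrates the modelling step only and is not the HF-ACTION analysis.

    import numpy as np
    import pandas as pd
    from lifelines import CoxPHFitter

    rng = np.random.default_rng(2)
    n = 200
    st2 = rng.lognormal(mean=3.2, sigma=0.4, size=n)          # hypothetical ST2 levels, ng/mL
    df = pd.DataFrame({
        "log2_st2": np.log2(st2),
        "age": rng.normal(60, 10, n),
        "time": rng.exponential(3.0, n),                       # years to event or censoring
        "event": rng.integers(0, 2, n),                        # 1 = death or hospitalization
    })

    cph = CoxPHFitter()
    cph.fit(df, duration_col="time", event_col="event")
    print(cph.hazard_ratios_["log2_st2"])   # hazard ratio per doubling of ST2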
Memory Hazard Functions: A Vehicle for Theory Development and Test
ERIC Educational Resources Information Center
Chechile, Richard A.
2006-01-01
A framework is developed to rigorously test an entire class of memory retention functions by examining hazard properties. Evidence is provided that the memory hazard function is not monotonically decreasing. Yet most of the proposals for retention functions, which have emerged from the psychological literature, imply that memory hazard is…
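As a reminder of the construction under test, a retention function S(t) implies the memory hazard h(t) = -d/dt ln S(t); the sketch below evaluates this numerically for a hypothetical power-law retention function, whose hazard is monotonically decreasing.

    import numpy as np

    def retention_power(t, a=1.0, b=0.5):
        """Hypothetical power-law retention function S(t) = (1 + a*t)**(-b)."""
        return (1.0 + a * t) ** (-b)

    def hazard(S, t, eps=1e-4):
        """Numerical memory hazard h(t) = -d/dt ln S(t)."""
        return -(np.log(S(t + eps)) - np.log(S(t - eps))) / (2 * eps)

    ts = np.array([0.5, 1.0, 2.0, 5.0, 10.0])
    print(hazard(retention_power, ts))   # strictly decreasing -> a monotone-hazard retention model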
ERIC Educational Resources Information Center
Cross, John A.
1988-01-01
Emphasizes the use of geophysical hazard maps and illustrates how they can be used in the classroom from kindergarten to college level. Depicts ways that hazard maps of floods, landslides, earthquakes, volcanoes, and multi-hazards can be integrated into classroom instruction. Tells how maps may be obtained. (SLM)
Time-dependent resilience assessment and improvement of urban infrastructure systems
NASA Astrophysics Data System (ADS)
Ouyang, Min; Dueñas-Osorio, Leonardo
2012-09-01
This paper introduces an approach to assess and improve the time-dependent resilience of urban infrastructure systems, where resilience is defined as the systems' ability to resist various possible hazards, absorb the initial damage from hazards, and recover to normal operation one or multiple times during a time period T. For different values of T and its position relative to current time, there are three forms of resilience: previous resilience, current potential resilience, and future potential resilience. This paper mainly discusses the third form that takes into account the systems' future evolving processes. Taking the power transmission grid in Harris County, Texas, USA as an example, the time-dependent features of resilience and the effectiveness of some resilience-inspired strategies, including enhancement of situational awareness, management of consumer demand, and integration of distributed generators, are all simulated and discussed. Results show a nonlinear nature of resilience as a function of T, which may exhibit a transition from an increasing function to a decreasing function at either a threshold of post-blackout improvement rate, a threshold of load profile with consumer demand management, or a threshold number of integrated distributed generators. These results are further confirmed by studying a typical benchmark system such as the IEEE RTS-96. Such common trends indicate that some resilience strategies may enhance infrastructure system resilience in the short term, but if not managed well, they may compromise practical utility system resilience in the long run.
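One concrete way to express such time-dependent resilience is the ratio of integrated actual performance to integrated target performance over the horizon T; the sketch below computes that ratio for a hypothetical damage-and-recovery trace and is only an illustrative stand-in for the paper's metric.

    import numpy as np

    def resilience(performance, target, dt=1.0):
        """Ratio of integrated actual to integrated target performance over [0, T]."""
        return np.trapz(performance, dx=dt) / np.trapz(target, dx=dt)

    # Hypothetical trace: a hazard at t=3 drops performance, which then recovers
    t = np.arange(0, 20, 1.0)
    target = np.ones_like(t)
    actual = np.ones_like(t)
    actual[3:8] = np.linspace(0.4, 1.0, 5)    # damage and recovery

    print(resilience(actual, target))          # value in (0, 1]; 1 = fully resilient over T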
Time-dependent resilience assessment and improvement of urban infrastructure systems.
Ouyang, Min; Dueñas-Osorio, Leonardo
2012-09-01
This paper introduces an approach to assess and improve the time-dependent resilience of urban infrastructure systems, where resilience is defined as the systems' ability to resist various possible hazards, absorb the initial damage from hazards, and recover to normal operation one or multiple times during a time period T. For different values of T and its position relative to current time, there are three forms of resilience: previous resilience, current potential resilience, and future potential resilience. This paper mainly discusses the third form that takes into account the systems' future evolving processes. Taking the power transmission grid in Harris County, Texas, USA as an example, the time-dependent features of resilience and the effectiveness of some resilience-inspired strategies, including enhancement of situational awareness, management of consumer demand, and integration of distributed generators, are all simulated and discussed. Results show a nonlinear nature of resilience as a function of T, which may exhibit a transition from an increasing function to a decreasing function at either a threshold of post-blackout improvement rate, a threshold of load profile with consumer demand management, or a threshold number of integrated distributed generators. These results are further confirmed by studying a typical benchmark system such as the IEEE RTS-96. Such common trends indicate that some resilience strategies may enhance infrastructure system resilience in the short term, but if not managed well, they may compromise practical utility system resilience in the long run.
Rocky Mountain Research Station USDA Forest Service
2004-01-01
Other fact sheets identified considerations for communicating about hazards, talked about the importance of working locally, and discussed the seven laws of effective hazard communication. This fact sheet introduces the "Golden Rule" of hazard communication and shares some final lessons from hazard educators.
AEGIS: a wildfire prevention and management information system
NASA Astrophysics Data System (ADS)
Kalabokidis, Kostas; Ager, Alan; Finney, Mark; Athanasis, Nikos; Palaiologou, Palaiologos; Vasilakos, Christos
2016-03-01
We describe a Web-GIS wildfire prevention and management platform (AEGIS) developed as an integrated and easy-to-use decision support tool to manage wildland fire hazards in Greece (http://aegis.aegean.gr). The AEGIS platform assists with early fire warning, fire planning, fire control and coordination of firefighting forces by providing online access to information that is essential for wildfire management. The system uses a number of spatial and non-spatial data sources to support key system functionalities. Land use/land cover maps were produced by combining field inventory data with high-resolution multispectral satellite images (RapidEye). These data support wildfire simulation tools that allow the users to examine potential fire behavior and hazard with the Minimum Travel Time fire spread algorithm. End-users provide a minimum number of inputs such as fire duration, ignition point and weather information to conduct a fire simulation. AEGIS offers three types of simulations, i.e., single-fire propagation, point-scale calculation of potential fire behavior, and burn probability analysis, similar to the FlamMap fire behavior modeling software. Artificial neural networks (ANNs) were utilized for wildfire ignition risk assessment based on various parameters, training methods, activation functions, pre-processing methods and network structures. The combination of ANNs and expected burned area maps is used to generate an integrated output map of fire hazard prediction. The system also incorporates weather information obtained from remote automatic weather stations and weather forecast maps. The system and associated computation algorithms leverage parallel processing techniques (i.e., High Performance Computing and Cloud Computing) that ensure the computational power required for real-time application. All AEGIS functionalities are accessible to authorized end-users through a web-based graphical user interface. An innovative smartphone application, AEGIS App, also provides mobile access to the web-based version of the system.
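For a sense of the ANN-based ignition-risk component, the sketch below trains a small multilayer perceptron on hypothetical ignition records with scikit-learn; the predictors, labelling rule, and network structure are placeholders rather than the AEGIS configuration.

    import numpy as np
    from sklearn.neural_network import MLPClassifier
    from sklearn.preprocessing import StandardScaler
    from sklearn.pipeline import make_pipeline

    rng = np.random.default_rng(3)
    n = 500
    X = np.column_stack([
        rng.uniform(5, 40, n),      # air temperature (C)
        rng.uniform(10, 90, n),     # relative humidity (%)
        rng.uniform(0, 15, n),      # wind speed (m/s)
        rng.uniform(0, 5, n),       # distance to roads (km)
    ])
    # Hypothetical labelling rule: hot, dry cells near roads ignite more often
    y = ((X[:, 0] > 25) & (X[:, 1] < 40) & (X[:, 3] < 2)).astype(int)

    clf = make_pipeline(StandardScaler(),
                        MLPClassifier(hidden_layer_sizes=(16, 8), max_iter=2000, random_state=0))
    clf.fit(X, y)
    print(clf.predict_proba([[35, 20, 10, 0.5]])[0, 1])   # ignition risk for one cell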
Integrated approach for coastal hazards and risks in Sri Lanka
NASA Astrophysics Data System (ADS)
Garcin, M.; Desprats, J. F.; Fontaine, M.; Pedreros, R.; Attanayake, N.; Fernando, S.; Siriwardana, C. H. E. R.; de Silva, U.; Poisson, B.
2008-06-01
The devastating impact of the tsunami of 26 December 2004 on the shores of the Indian Ocean recalled the importance of understanding and accounting for coastal hazards. Sri Lanka was one of the countries most affected by this tsunami (e.g. 30 000 dead, 1 million people homeless and 70% of the fishing fleet destroyed). Following this tsunami, as part of the French post-tsunami aid, a project to establish a Geographical Information System (GIS) on coastal hazards and risks was funded. This project aims to define, at a pilot site, a methodology for multiple coastal hazards assessment that might be useful for the post-tsunami reconstruction and for development planning. This methodology could be applied to the whole coastline of Sri Lanka. The multi-hazard approach deals with very different coastal processes in terms of dynamics as well as in terms of return period. The first elements of this study are presented here. We used a set of tools integrating a GIS, numerical simulations and risk scenario modelling. While this action occurred in response to the crisis caused by the tsunami, it was decided to integrate other coastal hazards into the study. Although less dramatic than the tsunami, these remain responsible for loss of life and damage. Furthermore, the establishment of such a system could not ignore the longer-term effects of climate change on coastal hazards in Sri Lanka. This GIS integrates the physical and demographic data available in Sri Lanka that are useful for assessing the coastal hazards and risks. In addition, these data have been used in numerical modelling of the waves generated during periods of monsoon as well as for the December 2004 tsunami. Risk scenarios have also been assessed for test areas and validated by field data acquired during the project. The results obtained from the models can be further integrated into the GIS and contribute to its enrichment and to help in better assessment and mitigation of these risks. The coastal-hazards-and-risks GIS coupled with modelling thus appears to be a very useful tool that can constitute the skeleton of a coastal zone management system. Decision makers will be able to make informed choices with regard to hazards during reconstruction and urban planning projects.
NASA Technical Reports Server (NTRS)
Kelly, Michael J.
2013-01-01
The Alternative Fuel Effects on Contrails & Cruise Emissions (ACCESS) Project Integration Manager requested in July 2012 that the NASA Engineering and Safety Center (NESC) form a team to independently assess aircraft structural failure hazards associated with the ACCESS experiment and to identify potential flight test hazard mitigations to ensure flight safety. The ACCESS Project Integration Manager subsequently requested that the assessment scope be focused predominantly on structural failure risks to the aircraft empennage.
NASA Technical Reports Server (NTRS)
Quattrochi, Dale A.; Estes, Sue
2011-01-01
The NASA Applied Sciences Program's public health initiative began in 2004 to illustrate the potential benefits for using remote sensing in public health applications. Objectives/Purpose: The CDC initiated a study with NASA through the National Center for Environmental Health (NCEH) to establish a pilot effort to use remote sensing data as part of its Environmental Public Health Tracking Network (EPHTN). As a consequence, the NCEH and NASA developed a project called HELIX-Atlanta (Health and Environment Linkage for Information Exchange) to demonstrate a process for developing a local environmental public health tracking and surveillance network that integrates non-infectious health and environment systems for the Atlanta metropolitan area. Methods: As an ongoing, systematic integration, analysis and interpretation of data, an EPHTN focuses on: 1 -- environmental hazards; 2 -- human exposure to environmental hazards; and 3 -- health effects potentially related to exposure to environmental hazards. To satisfy the definition of a surveillance system the data must be disseminated to plan, implement, and evaluate environmental public health action. Results: A close working relationship developed with NCEH where information was exchanged to assist in the development of an EPHTN that incorporated NASA remote sensing data into a surveillance network for disseminating public health tracking information to users. This project's success provided NASA with the opportunity to work with other public health entities such as the University of Mississippi Medical Center, the University of New Mexico and the University of Arizona. Conclusions: HELIX-Atlanta became a functioning part of the national EPHTN for tracking environmental hazards and exposure, particularly as related to air quality over Atlanta. Learning Objectives: 1 -- remote sensing data can be integral to an EPHTN; 2 -- public tracking objectives can be enhanced through remote sensing data; 3 -- NASA's involvement in public health applications can have wider benefits in the future.
Integrated survival analysis using an event-time approach in a Bayesian framework
Walsh, Daniel P.; Dreitz, VJ; Heisey, Dennis M.
2015-01-01
Event-time or continuous-time statistical approaches have been applied throughout the biostatistical literature and have led to numerous scientific advances. However, these techniques have traditionally relied on knowing failure times. This has limited application of these analyses, particularly, within the ecological field where fates of marked animals may be unknown. To address these limitations, we developed an integrated approach within a Bayesian framework to estimate hazard rates in the face of unknown fates. We combine failure/survival times from individuals whose fates are known and times of which are interval-censored with information from those whose fates are unknown, and model the process of detecting animals with unknown fates. This provides the foundation for our integrated model and permits necessary parameter estimation. We provide the Bayesian model, its derivation, and use simulation techniques to investigate the properties and performance of our approach under several scenarios. Lastly, we apply our estimation technique using a piece-wise constant hazard function to investigate the effects of year, age, chick size and sex, sex of the tending adult, and nesting habitat on mortality hazard rates of the endangered mountain plover (Charadrius montanus) chicks. Traditional models were inappropriate for this analysis because fates of some individual chicks were unknown due to failed radio transmitters. Simulations revealed biases of posterior mean estimates were minimal (≤ 4.95%), and posterior distributions behaved as expected with RMSE of the estimates decreasing as sample sizes, detection probability, and survival increased. We determined mortality hazard rates for plover chicks were highest at <5 days old and were lower for chicks with larger birth weights and/or whose nest was within agricultural habitats. Based on its performance, our approach greatly expands the range of problems for which event-time analyses can be used by eliminating the need for having completely known fate data.
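For readers unfamiliar with piece-wise constant hazard functions, the sketch below shows how such a hazard maps to survival probabilities; the age breakpoints and rates are hypothetical, and the paper's actual model is estimated within a Bayesian framework rather than evaluated directly like this.

    import numpy as np

    # Hypothetical piece-wise constant daily mortality hazard for chicks
    breaks = np.array([0.0, 5.0, 15.0, 30.0])     # age intervals (days)
    rates = np.array([0.08, 0.03, 0.01])          # hazard within each interval

    def cumulative_hazard(t):
        """Integrate the step hazard from age 0 to age t."""
        t = np.minimum(t, breaks[-1])
        widths = np.clip(t - breaks[:-1], 0.0, np.diff(breaks))
        return np.sum(rates * widths)

    def survival(t):
        return np.exp(-cumulative_hazard(t))

    for age in (3, 10, 25):
        print(age, round(survival(age), 3))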
Integrated survival analysis using an event-time approach in a Bayesian framework.
Walsh, Daniel P; Dreitz, Victoria J; Heisey, Dennis M
2015-02-01
Event-time or continuous-time statistical approaches have been applied throughout the biostatistical literature and have led to numerous scientific advances. However, these techniques have traditionally relied on knowing failure times. This has limited application of these analyses, particularly, within the ecological field where fates of marked animals may be unknown. To address these limitations, we developed an integrated approach within a Bayesian framework to estimate hazard rates in the face of unknown fates. We combine failure/survival times from individuals whose fates are known and times of which are interval-censored with information from those whose fates are unknown, and model the process of detecting animals with unknown fates. This provides the foundation for our integrated model and permits necessary parameter estimation. We provide the Bayesian model, its derivation, and use simulation techniques to investigate the properties and performance of our approach under several scenarios. Lastly, we apply our estimation technique using a piece-wise constant hazard function to investigate the effects of year, age, chick size and sex, sex of the tending adult, and nesting habitat on mortality hazard rates of the endangered mountain plover (Charadrius montanus) chicks. Traditional models were inappropriate for this analysis because fates of some individual chicks were unknown due to failed radio transmitters. Simulations revealed biases of posterior mean estimates were minimal (≤ 4.95%), and posterior distributions behaved as expected with RMSE of the estimates decreasing as sample sizes, detection probability, and survival increased. We determined mortality hazard rates for plover chicks were highest at <5 days old and were lower for chicks with larger birth weights and/or whose nest was within agricultural habitats. Based on its performance, our approach greatly expands the range of problems for which event-time analyses can be used by eliminating the need for having completely known fate data.
Architecting the Safety Assessment of Large-scale Systems Integration
2009-12-01
Electromagnetic Radiation to Ordnance (HERO), Hazards of Electromagnetic Radiation to Fuel (HERF). The main reason that this particular safety study... radiation, high voltage electric shocks and explosives safety. 1. Radiation Hazards (RADHAZ): RADHAZ describes the hazards of electromagnetic radiation ... OP3565/NAVAIR 16-1-529 [19 and 20], these hazards are segregated as follows: Hazards of Electromagnetic
NASA Astrophysics Data System (ADS)
Masure, P.
2003-04-01
The GEMITIS method has been implemented since 1995 into a global and integrated Risk Reduction Strategy for improving the seismic risk-assessment effectiveness in urban areas, including the generation of crisis scenarios and mid- to long term- seismic impact assessment. GEMITIS required us to provide more precise definitions of notions in common use by natural-hazard specialists, such as elements at risk and vulnerability. Until then, only the physical and human elements had been considered, and analysis of their vulnerability referred to their fragility in the face of aggression by nature. We have completed this approach by also characterizing the social and cultural vulnerability of a city and its inhabitants, and, with a wider scope, the functional vulnerability of the "urban system". This functional vulnerability depends upon the relations between the system elements (weak links in chains, functional relays, and defense systems) and upon the city's relations with the outside world (interdependence). Though well developed in methods for evaluating industrial risk (fault-tree analysis, event-tree analysis, multiple defense barriers, etc.), this aspect had until now been ignored by the "hard-science" specialists working on natural hazards. Based on the implementation of an Urban System Exposure methodology, we were able to identify specific human, institutional, or functional vulnerability factors for each urban system, which until then had been very little discussed by risk-analysis and civil-protection specialists. In addition, we have defined the new concept of "main stakes" of the urban system, ranked by order of social value (or collective utility). Obviously, vital or strategic issues must be more resistant to, or better protected against, natural hazards than issues of secondary importance. The ranking of exposed elements of a city in terms of "main stakes" provides a very useful guide for adapting vulnerability studies and for orienting preventive actions. For this, GEMITIS is based on a systemic approach of the city and on value analysis of exposed elements. It facilitates a collective expertise for the definition of a preventive action plan based on the participation of the main urban actors (crisis preparedness, construction, land-use, etc.).
Murnyak, George R; Spencer, Clark O; Chaney, Ann E; Roberts, Welford C
2002-04-01
During the 1970s, the Army health hazard assessment (HHA) process developed as a medical program to minimize hazards in military materiel during the development process. The HHA Program characterizes health hazards that soldiers and civilians may encounter as they interact with military weapons and equipment. Thus, it is a resource for medical planners and advisors to use that can identify and estimate potential hazards that soldiers may encounter as they train and conduct missions. The U.S. Army Center for Health Promotion and Preventive Medicine administers the program, which is integrated with the Army's Manpower and Personnel Integration program. As the HHA Program has matured, an electronic database has been developed to record and monitor the health hazards associated with military equipment and systems. The current database tracks the results of HHAs and provides reporting designed to assist the HHA Program manager in daily activities.
Uncertainty Modeling of Pollutant Transport in Atmosphere and Aquatic Route Using Soft Computing
NASA Astrophysics Data System (ADS)
Datta, D.
2010-10-01
Hazardous radionuclides are released as pollutants in the atmospheric and aquatic environment (ATAQE) during the normal operation of nuclear power plants. Atmospheric and aquatic dispersion models are routinely used to assess the impact of release of radionuclides from any nuclear facility or hazardous chemicals from any chemical plant on the ATAQE. The effect of exposure to the hazardous nuclides or chemicals is measured in terms of risk. Uncertainty modeling is an integral part of the risk assessment. The paper focuses on the uncertainty modeling of the pollutant transport in atmospheric and aquatic environment using soft computing. Soft computing is addressed due to the lack of information on the parameters that represent the corresponding models. Soft computing in this domain basically addresses the usage of fuzzy set theory to explore the uncertainty of the model parameters, and such type of uncertainty is called epistemic uncertainty. Each uncertain input parameter of the model is described by a triangular membership function.
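A minimal sketch of the epistemic-uncertainty representation described above: a triangular membership function for an uncertain dispersion-model parameter and its alpha-cuts; the parameter values are hypothetical.

    import numpy as np

    def triangular(x, a, m, b):
        """Membership of x in a triangular fuzzy number with support [a, b] and mode m."""
        x = np.asarray(x, dtype=float)
        left = (x - a) / (m - a)
        right = (b - x) / (b - m)
        return np.clip(np.minimum(left, right), 0.0, 1.0)

    def alpha_cut(alpha, a, m, b):
        """Interval of parameter values whose membership is at least alpha."""
        return a + alpha * (m - a), b - alpha * (b - m)

    # Hypothetical uncertain wind-speed parameter (m/s) for a dispersion model
    a, m, b = 2.0, 4.0, 7.0
    print(triangular([2.0, 4.0, 5.5], a, m, b))   # [0.0, 1.0, 0.5]
    print(alpha_cut(0.5, a, m, b))                # (3.0, 5.5)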
An FTIR point sensor for identifying chemical WMD and hazardous materials
NASA Astrophysics Data System (ADS)
Norman, Mark L.; Gagnon, Aaron M.; Reffner, John A.; Schiering, David W.; Allen, Jeffrey D.
2004-03-01
A new point sensor for identifying chemical weapons of mass destruction and other hazardous materials based on Fourier transform infrared (FT-IR) spectroscopy is presented. The sensor is a portable, fully functional FT-IR system that features a miniaturized Michelson interferometer, an integrated diamond attenuated total reflection (ATR) sample interface, and an embedded on-board computer. Samples are identified by an automated search algorithm that compares their infrared spectra to digitized databases that include reference spectra of nerve and blister agents, toxic industrial chemicals, and other hazardous materials. The hardware and software are designed for use by technicians with no background in infrared spectroscopy. The unit, which is fully self-contained, can be hand-carried and used in a hot zone by personnel in Level A protective gear, and subsequently decontaminated by spraying or immersion. Wireless control by a remote computer is also possible. Details of the system design and performance, including results of field validation tests, are discussed.
Preparation and Integration of ALHAT Precision Landing Technology for Morpheus Flight Testing
NASA Technical Reports Server (NTRS)
Carson, John M., III; Robertson, Edward A.; Pierrottet, Diego F.; Roback, Vincent E.; Trawny, Nikolas; Devolites, Jennifer L.; Hart, Jeremy J.; Estes, Jay N.; Gaddis, Gregory S.
2014-01-01
The Autonomous precision Landing and Hazard Avoidance Technology (ALHAT) project has developed a suite of prototype sensors for enabling autonomous and safe precision landing of robotic or crewed vehicles on solid solar bodies under varying terrain lighting conditions. The sensors include a Lidar-based Hazard Detection System (HDS), a multipurpose Navigation Doppler Lidar (NDL), and a long-range Laser Altimeter (LAlt). Preparation for terrestrial flight testing of ALHAT onboard the Morpheus free-flying, rocket-propelled flight test vehicle has been in progress since 2012, with flight tests over a lunar-like terrain field occurring in Spring 2014. Significant work efforts within both the ALHAT and Morpheus projects have been required in the preparation of the sensors, vehicle, and test facilities for interfacing, integrating and verifying overall system performance to ensure readiness for flight testing. The ALHAT sensors have undergone numerous stand-alone sensor tests, simulations, and calibrations, along with integrated-system tests in specialized gantries, trucks, helicopters and fixed-wing aircraft. A lunar-like terrain environment was constructed for ALHAT system testing during Morpheus flights, and vibration and thermal testing of the ALHAT sensors was performed based on Morpheus flights prior to ALHAT integration. High-fidelity simulations were implemented to gain insight into integrated ALHAT sensor and Morpheus GN&C system performance, and command and telemetry interfacing and functional testing was conducted once the ALHAT sensors and electronics were integrated onto Morpheus. This paper captures some of the details and lessons learned in the planning, preparation and integration of the individual ALHAT sensors, the vehicle, and the test environment that led up to the joint flight tests.
Precision Landing and Hazard Avoidance Domain
NASA Technical Reports Server (NTRS)
Robertson, Edward A.; Carson, John M., III
2016-01-01
The Precision Landing and Hazard Avoidance (PL&HA) domain addresses the development, integration, testing, and spaceflight infusion of sensing, processing, and GN&C functions critical to the success and safety of future human and robotic exploration missions. PL&HA sensors also have applications to other mission events, such as rendezvous and docking. Autonomous PL&HA builds upon the core GN&C capabilities developed to enable soft, controlled landings on the Moon, Mars, and other solar system bodies. Through the addition of a Terrain Relative Navigation (TRN) function, precision landing within tens of meters of a map-based target is possible. The addition of a 3-D terrain mapping lidar sensor improves the probability of a safe landing via autonomous, real-time Hazard Detection and Avoidance (HDA). PL&HA significantly improves the probability of mission success and enhances access to sites of scientific interest located in challenging terrain. PL&HA can also utilize external navigation aids, such as navigation satellites and surface beacons. Key elements include: advanced lidar sensors, which provide high-precision ranging, velocimetry, and 3-D terrain mapping; Terrain Relative Navigation (TRN), which compares onboard reconnaissance data with real-time terrain imaging data to update the spacecraft position estimate; Hazard Detection and Avoidance (HDA), which generates a high-resolution, 3-D terrain map in real time during the approach trajectory to identify safe landing targets; and inertial navigation during terminal descent, in which high-precision surface-relative sensors enable accurate inertial navigation and a tightly controlled touchdown within meters of the selected safe landing target.
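As a purely illustrative sketch of the map-matching idea behind TRN, the toy example below locates a sensed terrain patch within an onboard reference map by brute-force template matching and uses the match as a position fix. It is not the flight algorithm or its sensor processing; the terrain array, patch size, and noise level are all invented.

```python
import numpy as np

def trn_position_fix(reference_map, sensed_patch):
    """Return the (row, col) in the reference map that best matches the sensed patch
    (minimum sum of squared differences); a real TRN filter would fuse this fix with IMU data."""
    R, C = reference_map.shape
    r, c = sensed_patch.shape
    best_score, best_rc = np.inf, (0, 0)
    for i in range(R - r + 1):
        for j in range(C - c + 1):
            score = np.sum((reference_map[i:i + r, j:j + c] - sensed_patch) ** 2)
            if score < best_score:
                best_score, best_rc = score, (i, j)
    return best_rc

# Toy terrain: the patch was actually imaged at map location (12, 7)
rng = np.random.default_rng(0)
terrain = rng.normal(size=(40, 40))
patch = terrain[12:20, 7:15] + rng.normal(scale=0.05, size=(8, 8))   # noisy sensing
print(trn_position_fix(terrain, patch))                              # -> (12, 7)
```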
A Hybrid FPGA/Tilera Compute Element for Autonomous Hazard Detection and Navigation
NASA Technical Reports Server (NTRS)
Villalpando, Carlos Y.; Werner, Robert A.; Carson, John M., III; Khanoyan, Garen; Stern, Ryan A.; Trawny, Nikolas
2013-01-01
To increase safety for future missions landing on other planetary or lunar bodies, the Autonomous Landing and Hazard Avoidance Technology (ALHAT) program is developing an integrated sensor for autonomous surface analysis and hazard determination. The ALHAT Hazard Detection System (HDS) consists of a Flash LIDAR for measuring the topography of the landing site, a gimbal to scan across the terrain, and an Inertial Measurement Unit (IMU), along with terrain analysis algorithms to identify the landing site and the local hazards. An FPGA and Manycore processor system was developed to interface all the devices in the HDS, to provide high-resolution timing to accurately measure system state, and to run the surface analysis algorithms quickly and efficiently. In this paper, we will describe how we integrated COTS components such as an FPGA evaluation board, a TILExpress64, and multi-threaded/multi-core aware software to build the HDS Compute Element (HDSCE). The ALHAT program is also working with the NASA Morpheus Project and has integrated the HDS as a sensor on the Morpheus Lander. This paper will also describe how the HDS is integrated with the Morpheus lander and the results of the initial test flights with the HDS installed. We will also describe future improvements to the HDSCE.
NASA Astrophysics Data System (ADS)
Brand, B. D.; McMullin-Messier, P. A.; Schlegel, M. E.
2014-12-01
'Map your Hazards' is an educational module developed within the NSF Interdisciplinary Teaching about Earth for a Sustainable Future program (InTeGrate). The module engages students in place-based explorations of natural hazards, social vulnerability, and the perception of natural hazards and risk. Students integrate geoscience and social science methodologies to (1) identify and assess hazards, vulnerability and risk within their communities; (2) distribute, collect and evaluate survey data (designed by the authors) on the knowledge, risk perception and preparedness within their social networks; and (3) deliver a PPT presentation to local stakeholders detailing their findings and recommendations for development of a prepared, resilient community. 'Map your Hazards' underwent four rigorous assessments by a team of geoscience educators and external review before being piloted in our classrooms. The module was piloted in a 300-level 'Volcanoes and Society' course at Boise State University, a 300-level 'Environmental Sociology' course at Central Washington University, and a 100-level 'Natural Disasters and Environmental Geology' course at the College of Western Idaho. In all courses students reported a fascination with learning about the hazards around them and identifying the high-risk areas in their communities. They were also surprised at the low level of knowledge, inaccurate risk perception and lack of preparedness of their social networks. This successful approach to engaging students in an interdisciplinary, place-based learning environment also has the broader implication of raising awareness of natural hazards (survey participants are provided links to local hazard and preparedness information). The data and preparedness suggestions can be shared with local emergency managers, who are encouraged to attend the students' final presentations. All module materials are published at serc.carleton.edu/integrate/ and are appropriate to a wide range of classrooms.
NASA Astrophysics Data System (ADS)
Chang, W.; Tsai, W.; Lin, F.; Lin, S.; Lien, H.; Chung, T.; Huang, L.; Lee, K.; Chang, C.
2008-12-01
During a typhoon or heavy storm event, various forecasting models can technically demonstrate their capability to predict rainfall intensity, water-level variation in rivers, and flood conditions in urban areas. In practice, however, two issues tend to restrain the further application of these models within a decision support system (DSS) for hazard mitigation. The first is the difficulty of integrating heterogeneous models: one has to account for their different formats, such as input files, output files, computational requirements, and so on. The second is that, owing to this heterogeneity of models and systems, developing a DSS requires a friendly user interface or platform that hides the complexity of the various tools from users; since users may be governmental officials rather than professional experts, a complicated DSS interface is not acceptable. Based on these considerations, in the present study we develop an open system that integrates several simulation models for flood forecasting by adopting the FEWS (Flood Early Warning System) platform developed by WL | Delft Hydraulics. It allows us to link heterogeneous models effectively and provides suitable display modules. In addition, FEWS has been adopted by the Water Resources Agency (WRA), Taiwan, as the standard operational system for river flooding management, which means this work can be integrated much more easily with practical cases. In the present study, the basin rainfall-runoff model, the SOBEK channel-routing model, and an estuary tide forecasting model are linked and integrated on the FEWS platform through the physical connection of model initial and boundary definitions; the work flow of the integrated models is shown in Fig. 1. This differs from the typical single-model linking used in FEWS, which aims only at data exchange without much physical consideration, and it therefore enables tighter collaboration among these hydrological models. In addition, to make communication between system users and decision makers efficient and effective, a real-time, multi-user communication platform, designated Co-life, is incorporated in the present study. Through its application-sharing function, the flood forecasting results can be displayed for all attendees at different locations to support decision making for hazard mitigation. Fig. 2 shows a cyber-conference of WRA officials using the Co-life system for hazard mitigation during a typhoon event.
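The sketch below is a toy illustration of the kind of physically linked chaining described above, where one model's output becomes the next model's boundary condition: a rainfall-runoff model feeds a channel-routing step, with a tide series as the downstream boundary. The equations, coefficients, and time series are invented stand-ins, not the SOBEK or FEWS implementations.

```python
import numpy as np

def rainfall_runoff(rain_mm, k=0.3):
    """Toy linear-reservoir runoff: discharge responds to rainfall with memory."""
    q = np.zeros_like(rain_mm, dtype=float)
    for t in range(1, len(rain_mm)):
        q[t] = (1 - k) * q[t - 1] + k * rain_mm[t]
    return q

def route_channel(upstream_q, downstream_tide, c=0.6):
    """Toy routing: water level blends routed upstream inflow with the tidal boundary."""
    routed = np.convolve(upstream_q, [0.2, 0.5, 0.3], mode="same")
    return c * routed + (1 - c) * downstream_tide

hours = np.arange(48)
rain = np.where((hours > 6) & (hours < 18), 12.0, 0.0)   # storm burst (mm/h, illustrative)
tide = 0.8 * np.sin(2 * np.pi * hours / 12.4)            # semi-diurnal tide boundary
level = route_channel(rainfall_runoff(rain), tide)        # chained boundary forcing
print(f"peak water level (arbitrary units): {level.max():.2f}")
```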
Integrated Geo Hazard Management System in Cloud Computing Technology
NASA Astrophysics Data System (ADS)
Hanifah, M. I. M.; Omar, R. C.; Khalid, N. H. N.; Ismail, A.; Mustapha, I. S.; Baharuddin, I. N. Z.; Roslan, R.; Zalam, W. M. Z.
2016-11-01
Geo-hazards can reduce environmental health and cause huge economic losses, especially in mountainous areas. In order to mitigate geo-hazards effectively, cloud computing technology is introduced for managing the geo-hazard database. Cloud computing technology and its services are capable of providing stakeholders with geo-hazard information in near real time for effective environmental management and decision-making. The UNITEN Integrated Geo Hazard Management System consists of the network management and operation needed to monitor geo-hazard disasters, especially landslides, in our study area at the Kelantan River Basin and the boundary between Hulu Kelantan and Hulu Terengganu. The system provides an easily managed, flexible measuring system whose data management operates autonomously and can be commanded remotely to collect and control data using the “cloud” computing system. This paper aims to document the above relationship by identifying the special features and needs associated with effective geo-hazard database management using the “cloud” system. The system will later be used as part of development activities, helping to minimize the frequency of geo-hazards and the risk in the research area.
Safer Schools: Achieving a Healthy Learning Environment through Integrated Pest Management.
ERIC Educational Resources Information Center
2003
Integrated pest management (IPM) is a program of prevention, monitoring, and control that offers the opportunity to eliminate or drastically reduce hazardous pesticide use. IPM is intended to establish a program that uses cultural, mechanical, biological, and other non-toxic practices, and only introduces least-hazardous chemicals as a last…
The Gars Programme And The Integrated Global Observing Strategy For Geohazards
NASA Astrophysics Data System (ADS)
Marsh, S.; Paganini, M.; Missotten, R.; Palazzo, F.
UNESCO and the IUGS have funded the Geological Applications of Remote Sensing Programme (GARS) since 1984. Its aim is to assess the value and utility of remotely sensed data for geoscience, whilst at the same time building capacity in developing countries. It has run projects in Africa on geological mapping, in Latin America on landslide hazards and in Asia on volcanic hazards. It is a main sponsor of the Integrated Global Observing Strategy (IGOS) for Geohazards. The societal impact of geological and related geophysical hazards is enormous. Every year volcanoes, earthquakes, landslides and subsidence claim thousands of lives, injure thousands more, devastate homes and destroy livelihoods. Damaged infrastructure and insurance premiums increase these costs. As population increases, more people live in hazardous areas and the impact grows. The World Summit on Sustainable Development recognised that systematic, joint international observations under initiatives like the Integrated Global Observing Strategy form the basis for an integrated approach to hazard mitigation and preparedness. In this context, the IGOS Partners developed this geohazards theme. Its goal is to integrate disparate, multidisciplinary, applied research into global, operational systems by filling gaps in organisation, observation and knowledge. It has four strategic objectives: building global capacity to mitigate geohazards; improving mapping, monitoring and forecasting, based on satellite and ground-based observations; increasing preparedness, using integrated geohazards information products and improved geohazards models; and promoting global take-up of local best practice in geohazards management. Gaps remain between what is known and the knowledge required to answer citizens' questions; between what is observed and what must be observed to provide the necessary information for hazard mitigation; and between current data integration and the integration needed to make useful geohazard information products. An action plan is proposed that is designed to close these gaps. Priority actions are to: begin networking within the geohazards community; improve topographic data provision using existing observations and secure continuity of C- and L-band radar interferometry with the space agencies; assess the potential for existing data to be integrated into geohazard products and services; evaluate ways to improve databases with their managing agencies; and initiate research that increases geohazards knowledge. This paper presents the strategy and describes the action plan that will implement it over the next decade, as a key part of the GARS Programme.
E-research platform of EPOS Thematic Core Service "ANTHROPOGENIC HAZARDS"
NASA Astrophysics Data System (ADS)
Orlecka-Sikora, Beata; Lasocki, Stanisław; Grasso, Jean Robert; Schmittbuhl, Jean; Kwiatek, Grzegorz; Garcia, Alexander; Cassidy, Nigel; Sterzel, Mariusz; Szepieniec, Tomasz; Dineva, Savka; Biggare, Pascal; Saccorotti, Gilberto; Sileny, Jan; Fischer, Tomas
2016-04-01
EPOS Thematic Core Service ANTHROPOGENIC HAZARDS (TCS AH) aims to create new research opportunities in the field of anthropogenic hazards evoked by exploitation of georesources. TCS AH, based on the prototype built in the framework of the IS-EPOS project (https://tcs.ah-epos.eu/), financed from Polish structural funds (POIG.02.03.00-14-090/13-00), is being further developed within the EPOS IP project (H2020-INFRADEV-1-2015-1, INFRADEV-3-2015). TCS AH is designed as a functional e-research environment that gives a researcher the maximum possible freedom for in silico experimentation by providing a virtual laboratory in which the researcher can create an own workspace with own processing streams. The unique integrated RI comprises: (i) data gathered in so-called "episodes", comprehensively describing a geophysical process, induced or triggered by human technological activity, which under certain circumstances can become hazardous for people, infrastructure and the environment, and (ii) problem-oriented, specific high-level services, with particular attention devoted to methods analyzing correlations between technology, geophysical response and the resulting hazard. The services to be implemented are grouped within six blocks: (1) basic services for data integration and handling; (2) services for physical models of stress/strain changes over time and space as driven by geo-resource production; (3) services for analysing geophysical signals; (4) services to extract the relation between technological operations and the observed induced seismicity/deformation; (5) services for quantitative probabilistic assessment of anthropogenic seismic hazard: statistical properties of anthropogenic seismic series and their dependence on time-varying anthropogenesis; ground motion prediction equations; and stationary and time-dependent probabilistic seismic hazard estimates, related to time-changeable technological factors inducing the seismic process; and (6) the Simulator for Multi-hazard/multi-risk assessment in ExploRation/exploitation of GEoResources (MERGER), a numerical estimate of the occurrence probability of chains of events or processes impacting the environment. TCS AH will also provide the public sector with expert knowledge and background information; to fulfill this aim, services for outreach, dissemination and communication will be implemented. From the technical point of view, the implementation of services will proceed according to the methods developed within the previously mentioned IS-EPOS project. Detailed workflows for the implementation of the aforementioned services and for the interaction between the user and TCS AH have already been prepared.
1993-04-01
IPS: Ensure that system safety, health hazards, and environmental ... of hazardous materials is controlled in a manner which protects human health and the environment at the least cost. Hazardous Material Control and Management ...
Flood Hazard Mapping by Applying Fuzzy TOPSIS Method
NASA Astrophysics Data System (ADS)
Han, K. Y.; Lee, J. Y.; Keum, H.; Kim, B. J.; Kim, T. H.
2017-12-01
There are many technical methods for integrating the various factors involved in flood hazard mapping. The purpose of this study is to suggest a methodology for integrated flood hazard mapping using MCDM (Multi-Criteria Decision Making). MCDM problems involve a set of alternatives that are evaluated on the basis of conflicting and incommensurate criteria. In this study, to apply MCDM to assessing flood risk, maximum flood depth, maximum velocity, and maximum travel time are considered as criteria, and each spatial element is considered as an alternative. A scheme that finds the efficient alternative closest to an ideal value is an appropriate way to assess the flood risk of a large number of element units (alternatives) based on various flood indices. Therefore TOPSIS, the most commonly used MCDM scheme, is adopted to create the flood hazard map. The indices for flood hazard mapping (maximum flood depth, maximum velocity, and maximum travel time) carry uncertainty, since the simulation results vary with the flood scenario and topographical conditions. This ambiguity in the criteria can cause uncertainty in the flood hazard map. To account for the ambiguity and uncertainty of the criteria, fuzzy logic, which can handle ambiguous expressions, is introduced. In this paper, we produced a flood hazard map for levee-breach overflow using the fuzzy TOPSIS technique. Through the resulting integrated flood hazard map we identified the areas with the highest hazard grade, and the produced flood hazard map can be compared with the existing flood risk maps. We also expect that applying the flood hazard mapping methodology suggested in this paper to the production of current flood risk maps would yield a new flood hazard map that also considers the priorities among hazard areas and contains more varied and important information than before. Keywords: flood hazard map; levee break analysis; 2D analysis; MCDM; fuzzy TOPSIS. Acknowledgement: This research was supported by a grant (17AWMP-B079625-04) from the Water Management Research Program funded by the Ministry of Land, Infrastructure and Transport of the Korean government.
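For readers unfamiliar with TOPSIS, the sketch below implements the classical crisp version for three hypothetical grid cells scored on the three criteria named above; the paper's fuzzy extension additionally represents the criterion values with fuzzy memberships to handle their ambiguity. The weights, cell values, and the choice to treat a short travel time as increasing hazard are assumptions made only for illustration.

```python
import numpy as np

def topsis(matrix, weights, benefit):
    """Classical (crisp) TOPSIS closeness coefficients.
    matrix: alternatives x criteria; benefit[j] is True if larger values mean more hazard."""
    m = matrix / np.linalg.norm(matrix, axis=0)          # vector normalisation per criterion
    v = m * weights                                      # weighted normalised matrix
    ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
    anti = np.where(benefit, v.min(axis=0), v.max(axis=0))
    d_pos = np.linalg.norm(v - ideal, axis=1)
    d_neg = np.linalg.norm(v - anti, axis=1)
    return d_neg / (d_pos + d_neg)                       # close to 1 = closest to "most hazardous"

# Three grid cells scored on (max depth [m], max velocity [m/s], travel time [min])
cells = np.array([[1.8, 1.2, 20.0],
                  [0.4, 0.3, 90.0],
                  [2.5, 0.8, 45.0]])
weights = np.array([0.5, 0.3, 0.2])
benefit = np.array([True, True, False])   # short travel time assumed to mean higher hazard
print(topsis(cells, weights, benefit))
```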
An information diffusion technique to assess integrated hazard risks.
Huang, Chongfu; Huang, Yundong
2018-02-01
An integrated risk is a scene in the future associated with some adverse incident caused by multiple hazards. An integrated probability risk is the expected value of the disaster. Because assessing an integrated probability risk from a small sample is difficult, weighting methods and copulas are employed to avoid this obstacle. To resolve the problem, in this paper we develop the information diffusion technique to construct a joint probability distribution and a vulnerability surface. An integrated risk can then be assessed directly from a small sample. A case of an integrated risk caused by flood and earthquake is given to show how the suggested technique is used to assess the integrated risk of annual property loss. Copyright © 2017 Elsevier Inc. All rights reserved.
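A minimal one-dimensional sketch of normal information diffusion follows: each observation in a small sample spreads its unit of information over a monitoring grid through a Gaussian kernel, yielding a smoother probability estimate than a raw histogram. The bandwidth rule, the loss values, and the grid are illustrative assumptions; the joint (two-hazard) distribution and vulnerability surface in the paper extend the same idea to two dimensions.

```python
import numpy as np

def diffusion_estimate(sample, grid, h=None):
    """Normal information diffusion: spread each observation over the monitoring
    grid with a Gaussian kernel, then normalise to a probability distribution."""
    sample = np.asarray(sample, dtype=float)
    if h is None:                                    # simple bandwidth guess (an assumption)
        h = (sample.max() - sample.min()) / max(len(sample) - 1, 1)
    q = np.exp(-((sample[:, None] - grid[None, :]) ** 2) / (2 * h ** 2))
    q /= q.sum(axis=1, keepdims=True)                # each observation carries unit mass
    return q.sum(axis=0) / len(sample)               # estimated probabilities on the grid

# Toy small sample of annual losses and a coarse monitoring grid
losses = [2.1, 3.4, 3.8, 6.0, 9.5]
grid = np.linspace(0, 12, 13)
p = diffusion_estimate(losses, grid)
print(np.round(p, 3), p.sum())                       # probabilities sum to 1
```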
Timing of Formal Phase Safety Reviews for Large-Scale Integrated Hazard Analysis
NASA Technical Reports Server (NTRS)
Massie, Michael J.; Morris, A. Terry
2010-01-01
Integrated hazard analysis (IHA) is a process used to identify and control unacceptable risk. As such, it does not occur in a vacuum. IHA approaches must be tailored to fit the system being analyzed. Physical, resource, organizational and temporal constraints on large-scale integrated systems impose additional direct or derived requirements on the IHA. The timing and interaction between engineering and safety organizations can provide either benefits or hindrances to the overall end product. The traditional approach for formal phase safety review timing and content, which generally works well for small- to moderate-scale systems, does not work well for very large-scale integrated systems. This paper proposes a modified approach to timing and content of formal phase safety reviews for IHA. Details of the tailoring process for IHA will describe how to avoid temporary disconnects in major milestone reviews and how to maintain a cohesive end-to-end integration story particularly for systems where the integrator inherently has little to no insight into lower level systems. The proposal has the advantage of allowing the hazard analysis development process to occur as technical data normally matures.
A new web-based course: dealing with glaciers and permafrost hazards
NASA Astrophysics Data System (ADS)
Oswald, S.; Kaeaeb, A.; Haeberli, W.
2003-04-01
The intensive human use of high mountains intersects more and more with the hazard zones of these environments. Because of the complexity of the processes and impacts involved, dealing with such risks requires a broad education in many sub-domains of the earth sciences and the socio-economic field. Inter- and trans-disciplinary training and education of professionals is therefore essential. Thus the goal of the Swiss Virtual Campus project "Dealing with Natural Hazards" is to provide a course program covering the basics of dealing with natural hazards, including technical, environmental and social aspects. In the field of natural hazards and risk management, education at the Swiss universities is mostly structured in narrow sectors. Using the advantages of the internet, the Virtual Campus provides teachers and students an interdisciplinary discussion platform on the integral approach to, and handling of, natural hazards. The course content is organised in 5 modules: 1 basic knowledge and tools, 2 hydrological/meteorological hazards, 3 geological hazards, 4 vulnerability of property and of socio-economic systems, and 5 integral natural risk management. To ensure national and international access the courses are designed in English and published on the internet. Within the scope of this project we are developing lessons in the subject area of natural hazards related to glaciers and permafrost: ice avalanches, glacier floods, glacier length variations and permafrost. The content is divided into chapters, which are consistent over the entire module: (1) processes: characterisation of the different processes; (2) triggering: initiating events; (3) data acquisition, mapping and monitoring: appropriate methods; (4) estimation models: application of the adequate model; (5) combinations and interactions: interrelation and impacts of different hazards; (6) long-term effects: global change effects; (7) integral hazard recognition and assessment: integral proceedings; (8) measures: appropriate protection measures; (9) examples: different cases from throughout the world. It is our goal to design the e-lessons in an interactive way, to utilise the benefits of computer-based learning. The course will replace the classical "ex-cathedra" way of teaching with problem-based learning. After working out the basics individually, the students shall have the opportunity to discuss and apply their acquired knowledge by working through case studies. It is also planned to use the course for capacity building in developing countries.
Toxics Release Inventory Chemical Hazard Information Profiles (TRI-CHIP) Dataset
The Toxics Release Inventory (TRI) Chemical Hazard Information Profiles (TRI-CHIP) dataset contains hazard information about the chemicals reported in TRI. Users can use this XML-format dataset to create their own databases and hazard analyses of TRI chemicals. The hazard information is compiled from a series of authoritative sources including the Integrated Risk Information System (IRIS). The dataset is provided as a downloadable .zip file that when extracted provides XML files and schemas for the hazard information tables.
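A minimal sketch of how one might begin working with the extracted XML files in Python is shown below. The element names used here (Chemical, Name, CASNumber, HazardEndpoint, Source) are placeholders, not the actual TRI-CHIP schema; the real tag names should be taken from the schemas included in the downloaded .zip file.

```python
import xml.etree.ElementTree as ET

def load_hazard_profiles(xml_path):
    """Collect per-chemical hazard records from one extracted TRI-CHIP XML file.
    All element names below are hypothetical; adapt them to the shipped schemas."""
    records = []
    root = ET.parse(xml_path).getroot()
    for chem in root.iter("Chemical"):
        records.append({
            "name": chem.findtext("Name"),
            "cas": chem.findtext("CASNumber"),
            "endpoints": [e.text for e in chem.iter("HazardEndpoint")],
            "sources": [s.text for s in chem.iter("Source")],   # e.g. IRIS
        })
    return records

# profiles = load_hazard_profiles("tri_chip_extracted/chemicals.xml")  # hypothetical path
```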
Chen, Yuhuan; Dennis, Sherri B; Hartnett, Emma; Paoli, Greg; Pouillot, Régis; Ruthman, Todd; Wilson, Margaret
2013-03-01
Stakeholders in the system of food safety, in particular federal agencies, need evidence-based, transparent, and rigorous approaches to estimate and compare the risk of foodborne illness from microbial and chemical hazards and the public health impact of interventions. FDA-iRISK (referred to here as iRISK), a Web-based quantitative risk assessment system, was developed to meet this need. The modeling tool enables users to assess, compare, and rank the risks posed by multiple food-hazard pairs at all stages of the food supply system, from primary production, through manufacturing and processing, to retail distribution and, ultimately, to the consumer. Using standard data entry templates, built-in mathematical functions, and Monte Carlo simulation techniques, iRISK integrates data and assumptions from seven components: the food, the hazard, the population of consumers, process models describing the introduction and fate of the hazard up to the point of consumption, consumption patterns, dose-response curves, and health effects. Beyond risk ranking, iRISK enables users to estimate and compare the impact of interventions and control measures on public health risk. iRISK provides estimates of the impact of proposed interventions in various ways, including changes in the mean risk of illness and burden of disease metrics, such as losses in disability-adjusted life years. Case studies for Listeria monocytogenes and Salmonella were developed to demonstrate the application of iRISK for the estimation of risks and the impact of interventions for microbial hazards. iRISK was made available to the public at http://irisk.foodrisk.org in October 2012.
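The sketch below illustrates, in a highly simplified form, the kind of Monte Carlo calculation the abstract describes for one food-hazard pair: sample contamination at consumption, apply a dose-response model, and compare the mean risk per serving with and without an intervention. All distributions, parameter values, and the exponential dose-response form are invented for illustration and are not iRISK's built-in models.

```python
import numpy as np

rng = np.random.default_rng(42)
N = 100_000                                           # simulated servings

# Illustrative inputs (all values are invented):
prevalence = 0.02                                     # fraction of contaminated servings
log10_dose = rng.normal(2.0, 1.0, N)                  # contamination level at consumption
contaminated = rng.random(N) < prevalence

# Exponential dose-response: P(illness | dose) = 1 - exp(-r * dose)
r = 1e-4
p_ill = np.where(contaminated, 1 - np.exp(-r * 10 ** log10_dose), 0.0)
baseline = p_ill.mean()

# A hypothetical processing intervention achieving a 1-log reduction in dose
p_ill_intervention = np.where(contaminated, 1 - np.exp(-r * 10 ** (log10_dose - 1)), 0.0)
print(f"mean risk per serving: {baseline:.2e} -> {p_ill_intervention.mean():.2e}")
```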
A comparative evaluation of five hazard screening tools.
Panko, J M; Hitchcock, K; Fung, M; Spencer, P J; Kingsbury, T; Mason, A M
2017-01-01
An increasing number of hazard assessment tools and approaches are being used in the marketplace as a means to differentiate products and ingredients with lower versus higher hazards or to certify what some call greener chemical ingredients in consumer products. Some leading retailers have established policies for product manufacturers and their suppliers to disclose chemical ingredients and their related hazard characteristics often specifying what tools to use. To date, no data exists that show a tool's reliability to provide consistent, credible screening-level hazard scores that can inform greener product selection. We conducted a small pilot study to understand and compare the hazard scoring of several hazard screening tools to determine if hazard and toxicity profiles for chemicals differ. Seven chemicals were selected that represent both natural and man-made chemistries as well as a range of toxicological activity. We conducted the assessments according to each tool provider's guidelines, which included factors such as endpoints, weighting preferences, sources of information, and treatment of data gaps. The results indicate the tools varied in the level of discrimination seen in the scores for these 7 chemicals and that tool classifications of the same chemical varied widely between the tools, ranging from little or no hazard or toxicity to very high hazard or toxicity. The results also highlight the need for transparency in describing the basis for the tool's hazard scores and suggest possible enhancements. Based on this pilot study, tools should not be generalized to fit all situations because their evaluations are context-specific. Before choosing a tool or approach, it is critical that the assessment rationale be clearly defined and matches the selected tool or approach. Integr Environ Assess Manag 2017;13:139-154. © 2016 The Authors. Integrated Environmental Assessment and Management published by Wiley Periodicals, Inc. on behalf of SETAC.
Patterns of Risk Using an Integrated Spatial Multi-Hazard Model (PRISM Model)
Multi-hazard risk assessment has long centered on small scale needs, whereby a single community or group of communities’ exposures are assessed to determine potential mitigation strategies. While this approach has advanced the understanding of hazard interactions, it is li...
Integrated Risk Research. Case of Study: Motozintla, Chiapas, Mexico
NASA Astrophysics Data System (ADS)
Novelo-Casanova, D. A.; Jaimes, M.
2015-12-01
This integrated risk research includes the analysis of all components of the individual constituents of risk, such as hazard identification, hazard exposure, and vulnerability. We determined the risk from natural hazards in the community of Motozintla, located in southern Mexico in the state of Chiapas (15.37ºN, 92.25ºW). Due to its geographical and geological location, this community is continuously exposed mainly to earthquakes, landslides and floods. We developed integrated studies and analyses of seismic zonation and landslide and flood susceptibility using standard methodologies. Vulnerability was quantified from data collected in interviews with local families, considering five social variables: characteristics of housing construction, availability of basic public services, family economic conditions, existing community plans for disaster preparedness, and risk perception. The local families surveyed were randomly selected to form a statistically significant sample. Our results were spatially represented using a Geographical Information System (GIS). Structural vulnerability curves were generated for typical housing constructions. Our integrated risk analysis demonstrates that the community of Motozintla has a high level of structural and socio-economic risk from floods and earthquakes. More than half of the population does not know of any existing Civil Protection Plan and perceives that it is at high risk from landslides and floods. Although the community is located in a high seismic risk zone, most of the local people believe that it cannot be impacted by a large earthquake. These natural and social conditions indicate that the community of Motozintla has a very high level of risk from natural hazards. This research will support local decision makers in developing an integrated, comprehensive natural hazards mitigation and prevention program.
An integrated knowledge system for the Space Shuttle hazardous gas detection system
NASA Technical Reports Server (NTRS)
Lo, Ching F.; Shi, George Z.; Bangasser, Carl; Fensky, Connie; Cegielski, Eric; Overbey, Glenn
1993-01-01
A computer-based integrated Knowledge-Based System, the Intelligent Hypertext Manual (IHM), was developed for the Space Shuttle Hazardous Gas Detection System (HGDS) at NASA Marshall Space Flight Center (MSFC). The IHM stores HGDS-related knowledge and presents it in an interactive and intuitive manner. This manual is a combination of hypertext and an expert system, which together store experts' knowledge and experience in hazardous gas detection and analysis. The IHM's purpose is to provide HGDS personnel with the capabilities of: locating applicable documentation related to procedures, constraints, and previous fault histories; assisting in the training of personnel; enhancing the interpretation of real-time data; and recognizing and identifying possible faults in the Space Shuttle subsystems related to hazardous gas detection.
Semi-parametric regression model for survival data: graphical visualization with R
2016-01-01
The Cox proportional hazards model is a semi-parametric model that leaves its baseline hazard function unspecified. The rationale for using the Cox proportional hazards model is that (I) specifying the underlying form of the hazard function would be stringent and unrealistic, and (II) researchers are often only interested in estimating how the hazard changes with covariates (the relative hazard). A Cox regression model can be easily fit with the coxph() function in the survival package. A stratified Cox model may be used for a covariate that violates the proportional hazards assumption. The relative importance of covariates in a population can be examined with the rankhazard package in R. Hazard ratio curves for continuous covariates can be visualized using the smoothHR package. Such a curve helps to better understand the effect that each continuous covariate has on the outcome. The population attributable fraction is a classic quantity in epidemiology for evaluating the impact of a risk factor on the occurrence of an event in the population. In survival analysis, the adjusted/unadjusted attributable fraction can be plotted against survival time to obtain the attributable fraction function. PMID:28090517
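The workflow above is in R. For readers working in Python, a roughly analogous sketch using the lifelines package (assuming it is installed) is shown below: it fits a Cox model on the package's bundled Rossi recidivism data and then refits with stratification on a covariate suspected of violating proportional hazards. It mirrors only the coxph()/stratified-Cox part of the abstract, not the rankhazard or smoothHR functionality.

```python
from lifelines import CoxPHFitter
from lifelines.datasets import load_rossi

df = load_rossi()                       # 'week' = follow-up time, 'arrest' = event indicator
cph = CoxPHFitter()
cph.fit(df, duration_col="week", event_col="arrest")
cph.print_summary()                     # exp(coef) column gives hazard ratios with CIs

# Stratify on work experience if it appears to violate proportional hazards
cph_strat = CoxPHFitter()
cph_strat.fit(df, duration_col="week", event_col="arrest", strata=["wexp"])
```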
10 CFR 70.62 - Safety program and integrated safety analysis.
Code of Federal Regulations, 2011 CFR
2011-01-01
...) Radiological hazards related to possessing or processing licensed material at its facility; (ii) Chemical hazards of licensed material and hazardous chemicals produced from licensed material; (iii) Facility... performed by a team with expertise in engineering and process operations. The team shall include at least...
NASA Technical Reports Server (NTRS)
Kelly, Michael J.
2013-01-01
The Alternative Fuel Effects on Contrails and Cruise Emissions (ACCESS) Project Integration Manager requested in July 2012 that the NASA Engineering and Safety Center (NESC) form a team to independently assess aircraft structural failure hazards associated with the ACCESS experiment and to identify potential flight test hazard mitigations to ensure flight safety. The ACCESS Project Integration Manager subsequently requested that the assessment scope be focused predominantly on structural failure risks to the aircraft empennage (horizontal and vertical tail). This report contains the Appendices to Volume I.
The Necessity of Functional Analysis for Space Exploration Programs
NASA Technical Reports Server (NTRS)
Morris, A. Terry; Breidenthal, Julian C.
2011-01-01
As NASA moves toward expanded commercial spaceflight within its human exploration capability, there is increased emphasis on how to allocate responsibilities between government and commercial organizations to achieve coordinated program objectives. The practice of program-level functional analysis offers an opportunity for improved understanding of collaborative functions among heterogeneous partners. Functional analysis is contrasted with the physical analysis more commonly done at the program level, and is shown to provide theoretical performance, risk, and safety advantages beneficial to a government-commercial partnership. Performance advantages include faster convergence to acceptable system solutions; discovery of superior solutions with higher commonality, greater simplicity and greater parallelism by substituting functional for physical redundancy to achieve robustness and safety goals; and greater organizational cohesion around program objectives. Risk advantages include avoidance of rework by revelation of some kinds of architectural and contractual mismatches before systems are specified, designed, constructed, or integrated; avoidance of cost and schedule growth by more complete and precise specifications of cost and schedule estimates; and higher likelihood of successful integration on the first try. Safety advantages include effective delineation of must-work and must-not-work functions for integrated hazard analysis, the ability to formally demonstrate completeness of safety analyses, and provably correct logic for certification of flight readiness. The key mechanism for realizing these benefits is the development of an inter-functional architecture at the program level, which reveals relationships between top-level system requirements that would otherwise be invisible using only a physical architecture. This paper describes the advantages and pitfalls of functional analysis as a means of coordinating the actions of large heterogeneous organizations for space exploration programs.
Atmospheric Monitoring Strategy for Ground Testing of Closed Ecological Life Support Systems
NASA Technical Reports Server (NTRS)
Feighery, John; Cavenall, Ivan; Knight, Amanda
2004-01-01
This paper reviews the evolution and current state of atmospheric monitoring on the International Space Station to provide context from which we can imagine a more advanced and integrated system. The unique environmental hazards of human space flight are identified and categorized into groups, taking into consideration the time required for the hazard to become a threat to human health or performance. The key functions of a comprehensive monitoring strategy for a closed ecological life support system are derived from past experience and a survey of currently available technologies for monitoring air quality. Finally, a system architecture is developed incorporating the lessons learned from ISS and other analogous closed life support systems. The paper concludes by presenting recommendations on how to proceed with requirements definition and conceptual design of an air monitoring system for exploration missions.
NASA Astrophysics Data System (ADS)
Benkert, B.; Perrin, A.; Calmels, F.
2015-12-01
Together with its partners, the Northern Climate ExChange (NCE, part of the Yukon Research Centre at Yukon College) has been mapping permafrost-related hazard risk in northern communities since 2010. By integrating geoscience and climate project data, we have developed a series of community-scale hazard risk maps. The maps depict hazard risk in stoplight colours for easy interpretation, and support community-based, future-focused adaptation planning. Communities, First Nations, consultants and local regulatory agencies have used the hazard risk maps to site small-scale infrastructure projects, guide land planning processes, and assess suitability of land development applications. However, we know that assessing risk is only one step in integrating the implications of permafrost degradation in societal responses to environmental change. To build on our permafrost hazard risk maps, we are integrating economic principles and traditional land use elements. To assess economic implications of adaptation to permafrost change, we are working with geotechnical engineers to identify adaptation options (e.g., modified building techniques, permafrost thaw mitigation approaches) that suit the risks captured by our existing hazard risk maps. We layer this with an economic analysis of the costs associated with identified adaptation options, providing end-users with a more comprehensive basis upon which to make decisions related to infrastructure. NCE researchers have also integrated traditional land use activities in assessments of permafrost thaw risk, in a project led by Jean Marie River First Nation in the Northwest Territories. Here, the implications of permafrost degradation on food security and land use priorities were assessed by layering key game and gathering areas on permafrost thaw vulnerability maps. Results indicated that close to one quarter of big and small game habitats, and close to twenty percent of key furbearer and gathering areas within the First Nation's traditional territory, are situated on highly thaw sensitive permafrost. These projects demonstrate how physical and socio-economic factors can be integrated in assessments of permafrost vulnerability to thaw, thus providing tangible, useable results that reflect community priorities and support local decision making.
Hazardous drinking and military community functioning: identifying mediating risk factors.
Foran, Heather M; Heyman, Richard E; Slep, Amy M Smith
2011-08-01
Hazardous drinking is a serious societal concern in military populations. Efforts to reduce hazardous drinking among military personnel have been limited in effectiveness. There is a need for a deeper understanding of how community-based prevention models apply to hazardous drinking in the military. Community-wide prevention efforts may be most effective in targeting community functioning (e.g., support from formal agencies, community cohesion) that impacts hazardous drinking via other proximal risk factors. The goal of the current study is to inform community-wide prevention efforts by testing a model of community functioning and mediating risk factors of hazardous drinking among active duty U.S. Air Force personnel. A large, representative survey sample of U.S. Air Force active duty members (N = 52,780) was collected at 82 bases worldwide. Hazardous drinking was assessed with the widely used Alcohol Use Disorders Identification Test (Saunders, Aasland, Babor, de la Fuente, & Grant, 1993). A variety of individual, family, and community measures were also assessed. Structural equation modeling was used to test a hypothesized model of community functioning, mediating risk factors and hazardous drinking. Depressive symptoms, perceived financial stress, and satisfaction with the U.S. Air Force were identified as significant mediators of the link between community functioning and hazardous drinking for men and women. Relationship satisfaction was also identified as a mediator for men. These results provide a framework for further community prevention research and suggest that prevention efforts geared at increasing aspects of community functioning (e.g., the U.S. Air Force Community Capacity model) may indirectly lead to reductions in hazardous drinking through other proximal risk factors.
IRIS Toxicological Review of Ethylene Glycol Mono-Butyl ...
EPA has conducted a peer review of the scientific basis supporting the human health hazard and dose-response assessment of ethylene glycol monobutyl ether that will appear on the Integrated Risk Information System (IRIS) database. EPA is conducting a peer review of the scientific basis supporting the human health hazard and dose-response assessment of propionaldehyde that will appear on the Integrated Risk Information System (IRIS) database.
Karimli, Leyla; Rost, Lucia; Ismayilova, Leyla
2018-01-01
This is the first randomized controlled trial in Burkina Faso testing the effect of economic strengthening alone and in combination with family coaching on child's hazardous work and work-related health outcomes. The study also tests the association between different forms of hazardous work and child's health outcomes. A total of 360 households from 12 villages participated in the study. Villages were randomly assigned to three study arms: economic intervention alone, economic intervention integrated with family coaching, and control. In each household, one female caregiver and one child aged 10-15 years were interviewed. Data were collected at baseline, 12 months, and 24 months. We ran multilevel mixed-effects models that account for both within-individual correlation over time and clustering of subjects within villages. Compared with the control group, at 24 months, children in the integrated arm experienced significant reduction in exposure to hazardous work and some forms of hazards and abuse. Results for children in the economic strengthening-only arm were more modest. In most cases, child's health was significantly associated not with specific forms of work per se, but with child's exposure to hazards and abuse while doing this form of work. We found no significant effect of intervention on child's work-related health. Economic strengthening combined with family coaching on child protection issues, rather than implemented alone, may be more effective in reducing child's exposure to hazardous work. Additional research is needed to understand gender differences and causal links between different forms of child work and health hazards. Copyright © 2017. Published by Elsevier Inc.
Separating spatial search and efficiency rates as components of predation risk
DeCesare, Nicholas J.
2012-01-01
Predation risk is an important driver of ecosystems, and local spatial variation in risk can have population-level consequences by affecting multiple components of the predation process. I use resource selection and proportional hazard time-to-event modelling to assess the spatial drivers of two key components of risk—the search rate (i.e. aggregative response) and predation efficiency rate (i.e. functional response)—imposed by wolves (Canis lupus) in a multi-prey system. In my study area, both components of risk increased according to topographic variation, but anthropogenic features affected only the search rate. Predicted models of the cumulative hazard, or risk of a kill, underlying wolf search paths validated well with broad-scale variation in kill rates, suggesting that spatial hazard models provide a means of scaling up from local heterogeneity in predation risk to population-level dynamics in predator–prey systems. Additionally, I estimated an integrated model of relative spatial predation risk as the product of the search and efficiency rates, combining the distinct contributions of spatial heterogeneity to each component of risk. PMID:22977145
Cox, Trevor F; Czanner, Gabriela
2016-06-30
This paper introduces a new simple divergence measure between two survival distributions. For two groups of patients, the divergence measure between their associated survival distributions is based on the integral of the absolute difference in probabilities that a patient from one group dies at time t and a patient from the other group survives beyond time t and vice versa. In the case of non-crossing hazard functions, the divergence measure is closely linked to the Harrell concordance index, C, the Mann-Whitney test statistic and the area under a receiver operating characteristic curve. The measure can be used in a dynamic way where the divergence between two survival distributions from time zero up to time t is calculated enabling real-time monitoring of treatment differences. The divergence can be found for theoretical survival distributions or can be estimated non-parametrically from survival data using Kaplan-Meier estimates of the survivor functions. The estimator of the divergence is shown to be generally unbiased and approximately normally distributed. For the case of proportional hazards, the constituent parts of the divergence measure can be used to assess the proportional hazards assumption. The use of the divergence measure is illustrated on the survival of pancreatic cancer patients. Copyright © 2016 John Wiley & Sons, Ltd.
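Under one plausible reading of the definition above, the divergence up to time t is the integral of |f1(u)S2(u) - f2(u)S1(u)| du, which can be approximated from step-function (for example Kaplan-Meier) survival estimates on a common grid, as in the sketch below. The paper's exact estimator, its bias properties, and its confidence procedures are not reproduced here, and the exponential curves are illustrative only.

```python
import numpy as np

def survival_divergence(s1, s2):
    """Approximate the integral of |f1*S2 - f2*S1| dt from two survival curves sampled on
    a common time grid, using per-interval death masses (-delta S) as stand-ins for f*dt."""
    d1, d2 = -np.diff(s1), -np.diff(s2)          # probability of dying in each interval
    return np.sum(np.abs(d1 * s2[1:] - d2 * s1[1:]))

# Illustrative example: exponential survival with hazards 0.10 vs 0.15 per month
t = np.linspace(0, 60, 601)
s1, s2 = np.exp(-0.10 * t), np.exp(-0.15 * t)
print(f"divergence up to t=60: {survival_divergence(s1, s2):.3f}")
```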
Increasing impacts of climate extremes on critical infrastructures in Europe
NASA Astrophysics Data System (ADS)
Forzieri, Giovanni; Bianchi, Alessandra; Feyen, Luc; Silva, Filipe Batista e.; Marin, Mario; Lavalle, Carlo; Leblois, Antoine
2016-04-01
The projected increases in exposure to multiple climate hazards in many regions of Europe emphasize the relevance of a multi-hazard risk assessment to comprehensively quantify the potential impacts of climate change and develop suitable adaptation strategies. In this context, quantifying the future impacts of climatic extremes on critical infrastructures is crucial due to their key role for human wellbeing and their effects on the overall economy. Critical infrastructures are the existing assets and systems that are essential for the maintenance of vital societal functions, health, safety, security, and economic or social well-being of people, and whose disruption or destruction would have a significant impact as a result of the failure to maintain those functions. We assess the direct damages of heat and cold waves, river and coastal flooding, droughts, wildfires and windstorms to energy, transport, industry and social infrastructures in Europe along the 21st century. The methodology integrates climate hazard, exposure and vulnerability components in a coherent framework. Overall damage is expected to rise to 38 billion €/yr, tenfold the current climate damage, with drastic variations across risk scenarios. For example, drought and heat-related damages could represent 70% of the overall climate damage in the 2080s, versus the current 12%. Many regions, most prominently Southern Europe, will likely suffer multiple stresses and systematic infrastructure failures due to climate extremes if no suitable adaptation measures are taken.
Contribution of physical modelling to climate-driven landslide hazard mapping: an alpine test site
NASA Astrophysics Data System (ADS)
Vandromme, R.; Desramaut, N.; Baills, A.; Hohmann, A.; Grandjean, G.; Sedan, O.; Mallet, J. P.
2012-04-01
The aim of this work is to develop a methodology for integrating climate change scenarios, and especially their precipitation component, into quantitative hazard assessment. The effects of climate change will differ depending on both the location of the site and the type of landslide considered; indeed, mass movements can be triggered by different factors. This paper describes a methodology to address this issue and shows an application at an alpine test site. Mechanical approaches represent a solution for quantitative landslide susceptibility and hazard modeling. However, as the quantity and quality of data are generally very heterogeneous at a regional scale, it is necessary to take the uncertainty into account in the analysis. In this perspective, a new hazard modeling method has been developed and integrated in a program named ALICE. The program integrates mechanical stability analysis in GIS software, taking data uncertainty into account. This method proposes a quantitative classification of landslide hazard and offers a useful tool to gain time and efficiency in hazard mapping. However, an expert approach is still necessary to finalize the maps, since it is the only way to take into account some influential factors in slope stability, such as the heterogeneity of the geological formations or the effects of anthropic interventions. To go further, the alpine test site (Barcelonnette area, France) is being used to integrate climate change scenarios, and especially their precipitation component, into the ALICE program with the help of a hydrological model (GARDENIA) and the regional climate model REMO (Jacob, 2001). From a DEM, land-cover map, geology, geotechnical data and so forth, the program classifies hazard zones depending on the geotechnics and on different hydrological contexts varying in time. This communication, realized within the framework of the SafeLand project, is supported by the European Commission under the 7th Framework Programme for Research and Technological Development, Area "Environment", Activity 1.3.3.1 "Prediction of triggering and risk assessment for landslides".
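As a hedged illustration of how a mechanical stability analysis can be evaluated under parameter uncertainty, the sketch below uses the classical infinite-slope factor of safety with Monte Carlo sampling of cohesion, friction angle, and saturation. ALICE's actual formulation, its treatment of uncertainty, and its coupling to GARDENIA and REMO may differ, and all numeric values here are invented.

```python
import numpy as np

def factor_of_safety(c, phi_deg, gamma, z, beta_deg, m, gamma_w=9.81):
    """Infinite-slope factor of safety with saturation ratio m (0 = dry, 1 = fully saturated).
    c [kPa], gamma and gamma_w [kN/m^3], z [m], slope angle beta and friction angle phi [deg]."""
    beta, phi = np.radians(beta_deg), np.radians(phi_deg)
    resisting = c + (gamma - m * gamma_w) * z * np.cos(beta) ** 2 * np.tan(phi)
    driving = gamma * z * np.sin(beta) * np.cos(beta)
    return resisting / driving

# Propagate parameter uncertainty for one 30-degree slope cell (values are illustrative)
rng = np.random.default_rng(1)
n = 50_000
c = rng.uniform(2.0, 8.0, n)            # effective cohesion [kPa]
phi = rng.uniform(28.0, 36.0, n)        # friction angle [deg]
m = rng.uniform(0.2, 1.0, n)            # saturation driven by the hydrological scenario
fs = factor_of_safety(c, phi, gamma=19.0, z=2.0, beta_deg=30.0, m=m)
print(f"P(FS < 1) = {np.mean(fs < 1):.2%}")
```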
Confidence intervals for the first crossing point of two hazard functions.
Cheng, Ming-Yen; Qiu, Peihua; Tan, Xianming; Tu, Dongsheng
2009-12-01
The phenomenon of crossing hazard rates is common in clinical trials with time-to-event endpoints. Many methods have been proposed for testing the equality of hazard functions against a crossing-hazards alternative. However, relatively few approaches are available in the literature for point or interval estimation of the crossing time point. The problem of constructing confidence intervals for the first crossing time point of two hazard functions is considered in this paper. After reviewing a recent procedure based on Cox proportional hazards modeling with a Box-Cox transformation of the time to event, a nonparametric procedure using a kernel smoothing estimate of the hazard ratio is proposed. The proposed procedure and the one based on Cox proportional hazards modeling with a Box-Cox transformation of the time to event are both evaluated by Monte Carlo simulations and applied to two clinical trial datasets.
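A simplified sketch of the nonparametric idea is shown below: kernel smoothing of the Nelson-Aalen hazard increments for each group, followed by locating the first grid point where the smoothed hazards cross. It ignores ties and the confidence-interval construction and is not the paper's exact estimator; the bandwidth choice and the made-up data are assumptions.

```python
import numpy as np

def kernel_hazard(t_grid, times, events, bandwidth):
    """Kernel-smoothed hazard: Gaussian smoothing of Nelson-Aalen increments d_i / Y(t_i)."""
    order = np.argsort(times)
    times = np.asarray(times, dtype=float)[order]
    events = np.asarray(events, dtype=float)[order]
    at_risk = len(times) - np.arange(len(times))            # number at risk at each sorted time
    increments = events / at_risk                            # Nelson-Aalen jump sizes
    kernel = np.exp(-0.5 * ((t_grid[:, None] - times[None, :]) / bandwidth) ** 2)
    kernel /= bandwidth * np.sqrt(2 * np.pi)
    return kernel @ increments

def first_crossing(t_grid, hazard_a, hazard_b):
    """First grid time at which the sign of (hazard_a - hazard_b) changes, or None."""
    sign = np.sign(hazard_a - hazard_b)
    change = np.where(sign[:-1] * sign[1:] < 0)[0]
    return t_grid[change[0] + 1] if change.size else None

# Illustrative use with made-up event times (1 = event, 0 = censored)
t_grid = np.linspace(0.5, 24, 200)
h1 = kernel_hazard(t_grid, times=[2, 3, 5, 8, 13, 20], events=[1, 1, 1, 0, 1, 1], bandwidth=3.0)
h2 = kernel_hazard(t_grid, times=[1, 6, 9, 11, 15, 22], events=[1, 0, 1, 1, 1, 1], bandwidth=3.0)
print(first_crossing(t_grid, h1, h2))
```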
Federal Register 2010, 2011, 2012, 2013, 2014
2011-01-19
... Compatibility Group S indicates that hazardous effects from accidental functioning are limited to the extent the... package is capable of containing any hazardous effects in the event of an accidental functioning of its... demonstrate that any hazardous effects are confined within a package. In the ANPRM, we invited commenters to...
Lee, Mi Jung; Park, Jung Tak; Park, Kyoung Sook; Kwon, Young Eun; Oh, Hyung Jung; Yoo, Tae-Hyun; Kim, Yong-Lim; Kim, Yon Su; Yang, Chul Woo; Kim, Nam-Ho; Kang, Shin-Wook; Han, Seung Hyeok
2017-03-07
Residual kidney function can be assessed by simply measuring urine volume, calculating GFR using 24-hour urine collection, or estimating GFR using the proposed equation (eGFR). We aimed to investigate the relative prognostic value of these residual kidney function parameters in patients on dialysis. Using the database from a nationwide prospective cohort study, we compared differential implications of the residual kidney function indices in 1946 patients on dialysis at 36 dialysis centers in Korea between August 1, 2008 and December 31, 2014. Residual GFR calculated using 24-hour urine collection was determined by an average of renal urea and creatinine clearance on the basis of 24-hour urine collection. eGFR-urea, creatinine and eGFR-β2-microglobulin were calculated from the equations using serum urea and creatinine and β2-microglobulin, respectively. The primary outcome was all-cause death. During a mean follow-up of 42 months, 385 (19.8%) patients died. In multivariable Cox analyses, residual urine volume (hazard ratio, 0.96 per 0.1-L/d higher volume; 95% confidence interval, 0.94 to 0.98) and GFR calculated using 24-hour urine collection (hazard ratio, 0.98; 95% confidence interval, 0.95 to 0.99) were independently associated with all-cause mortality. In 1640 patients who had eGFR-β2-microglobulin data, eGFR-β2-microglobulin (hazard ratio, 0.98; 95% confidence interval, 0.96 to 0.99) was also significantly associated with all-cause mortality as well as residual urine volume (hazard ratio, 0.96 per 0.1-L/d higher volume; 95% confidence interval, 0.94 to 0.98) and GFR calculated using 24-hour urine collection (hazard ratio, 0.97; 95% confidence interval, 0.95 to 0.99). When each residual kidney function index was added to the base model, only urine volume improved the predictability for all-cause mortality (net reclassification index=0.11, P=0.01; integrated discrimination improvement=0.01, P=0.01). Higher residual urine volume was significantly associated with a lower risk of death and exhibited a stronger association with mortality than GFR calculated using 24-hour urine collection and eGFR-urea, creatinine. These results suggest that determining residual urine volume may be beneficial to predict patient survival in patients on dialysis. Copyright © 2017 by the American Society of Nephrology.
Gallina, Valentina; Torresan, Silvia; Critto, Andrea; Sperotto, Anna; Glade, Thomas; Marcomini, Antonio
2016-03-01
This paper presents a review of existing multi-risk assessment concepts and tools applied by organisations and projects providing the basis for the development of a multi-risk methodology in a climate change perspective. Relevant initiatives were developed for the assessment of multiple natural hazards (e.g. floods, storm surges, droughts) affecting the same area in a defined timeframe (e.g. year, season, decade). Major research efforts were focused on the identification and aggregation of multiple hazard types (e.g. independent, correlated, cascading hazards) by means of quantitative and semi-quantitative approaches. Moreover, several methodologies aim to assess the vulnerability of multiple targets to specific natural hazards by means of vulnerability functions and indicators at the regional and local scale. The overall results of the review show that multi-risk approaches do not consider the effects of climate change and mostly rely on the analysis of static vulnerability (i.e. no time-dependent vulnerabilities, no changes among exposed elements). A relevant challenge is therefore to develop comprehensive formal approaches for the assessment of different climate-induced hazards and risks, including dynamic exposure and vulnerability. This requires the selection and aggregation of suitable hazard and vulnerability metrics to make a synthesis of information about multiple climate impacts, the spatial analysis and ranking of risks, including their visualization and communication to end-users. To face these issues, climate impact assessors should develop cross-sectorial collaborations among different expertise (e.g. modellers, natural scientists, economists) integrating information on climate change scenarios with sectorial climate impact assessment, towards the development of a comprehensive multi-risk assessment process. Copyright © 2015 Elsevier Ltd. All rights reserved.
Rocky Mountain Research Station USDA Forest Service
2004-01-01
Many managers and policymakers guided by the National Environmental Policy Act process want to understand the scientific principles on which they can base fuel treatments for reducing the size and severity of wildfires. These Forest Structure and Fire Hazard fact sheets discuss how to estimate fire hazard, how to visualize fuel treatments, and how the role of...
NASA Astrophysics Data System (ADS)
Mihlan, G. I.; Mitchell, R. I.; Smith, R. K.
1984-07-01
A survey to assess control technology for integrated circuit fabrication was conducted. Engineering controls included local and general exhaust ventilation, shielding, and personal protective equipment. Devices or work stations containing potentially dangerous toxic materials were controlled by local exhaust ventilation. Less hazardous areas were controlled by general exhaust ventilation. Process isolation was used in the plasma etching, low pressure chemical vapor deposition, and metallization operations. Shielding was used in ion implantation units to control X-ray emissions, in contact mask aligners to limit ultraviolet (UV) emissions, and in plasma etching units to control radiofrequency and UV emissions. Most operations were automated. Use of personal protective equipment varied by job function.
Opportunities for Launch Site Integrated System Health Engineering and Management
NASA Technical Reports Server (NTRS)
Waterman, Robert D.; Langwost, Patricia E.; Waterman, Susan J.
2005-01-01
The launch site processing flow involves operations such as functional verification, preflight servicing and launch. These operations often include hazards that must be controlled to protect human life and critical space hardware assets. Existing command and control capabilities are limited to simple limit checking during automated monitoring. Contingency actions are highly dependent on human recognition, decision making, and execution. Many opportunities for Integrated System Health Engineering and Management (ISHEM) exist throughout the processing flow. This paper will present the current human-centered approach to health management as performed today for the shuttle and space station programs. In addition, it will address some of the more critical ISHEM needs, and provide recommendations for future implementation of ISHEM at the launch site.
Tracking Hazard Analysis Data in a Jungle of Changing Design
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sullivan, Robin S.; Young, Jonathan
2006-05-16
Tracking hazard analysis data during the 'life cycle' of a project can be an extremely complicated task. However, a few simple rules, used consistently, can give you the edge that will save countless headaches and provide the information that will help integrate the hazard analysis and design activities even if performed in parallel.
40 CFR 265.340 - Applicability.
Code of Federal Regulations, 2013 CFR
2013-07-01
.... (b) Integration of the MACT standards. (1) Except as provided by paragraphs (b)(2) and (b)(3) of this... 261, subpart D, of this chapter solely because it is ignitable (Hazard Code I), corrosive (Hazard Code... because it is reactive (Hazard Code R) for characteristics other than those listed in § 261.23(a) (4) and...
40 CFR 265.340 - Applicability.
Code of Federal Regulations, 2012 CFR
2012-07-01
.... (b) Integration of the MACT standards. (1) Except as provided by paragraphs (b)(2) and (b)(3) of this... 261, subpart D, of this chapter solely because it is ignitable (Hazard Code I), corrosive (Hazard Code... because it is reactive (Hazard Code R) for characteristics other than those listed in § 261.23(a) (4) and...
40 CFR 264.340 - Applicability.
Code of Federal Regulations, 2012 CFR
2012-07-01
...) Integration of the MACT standards. (1) Except as provided by paragraphs (b)(2) through (b)(4) of this section... hazardous waste in part 261, subpart D, of this chapter solely because it is ignitable (Hazard Code I... chapter solely because it is reactive (Hazard Code R) for characteristics other than those listed in § 261...
40 CFR 265.340 - Applicability.
Code of Federal Regulations, 2014 CFR
2014-07-01
.... (b) Integration of the MACT standards. (1) Except as provided by paragraphs (b)(2) and (b)(3) of this... 261, subpart D, of this chapter solely because it is ignitable (Hazard Code I), corrosive (Hazard Code... because it is reactive (Hazard Code R) for characteristics other than those listed in § 261.23(a) (4) and...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-01-28
... Property Act for airport purposes (``Subject Airports''), to conduct Wildlife Hazard Site Visits (WHSVs) or... of land under the Surplus Property Act for airport purposes to identify and mitigate wildlife hazards.... These airports are typically smaller and have less air traffic, more piston-powered aircraft, and...
Rocky Mountain Research Station USDA Forest Service
2004-01-01
The amount of science applicable to the management of wildfire hazards is increasing daily. In addition, the attitudes of landowners and policymakers about fire and fuels management are changing. This fact sheet discusses three critical keys to communicating about wildfire hazards.
Pursuing the Delta -- Maximizing Opportunities to Integrate Sustainability in the Funding Processes
2011-03-03
that may contain safety and health hazards. This is not an all-inclusive list: a. Fire protection issues b. Toxic fumes (i.e., engine exhaust...hazards shall be reported as part of the SAR. A.6 Hazardous Materials. The contractor shall not use cadmium, hexavalent chromium, or other
The federal government has established a system of labeling hazardous materials to help identify the type of material and threat posed. Summaries of information on over 300 chemicals are maintained in the Envirofacts Master Chemical Integrator.
Landslide hazards and systems analysis: A Central European perspective
NASA Astrophysics Data System (ADS)
Klose, Martin; Damm, Bodo; Kreuzer, Thomas
2016-04-01
Part of the problem with assessing landslide hazards is to understand the variable settings in which they occur. There is growing consensus that hazard assessments require integrated approaches that take account of the coupled human-environment system. Here we provide a synthesis of societal exposure and vulnerability to landslide hazards, review innovative approaches to hazard identification, and lay a focus on hazard assessment, while presenting the results of historical case studies and a landslide time series for Germany. The findings add to a growing body of literature that recognizes societal exposure and vulnerability as a complex system of hazard interactions that evolves over time as a function of social change and development. We therefore propose to expand hazard assessments by the framework and concepts of systems analysis (e.g., Liu et al., 2007). Results so far have been promising in ways that illustrate the importance of feedbacks, thresholds, surprises, and time lags in the evolution of landslide hazard and risk. In densely populated areas of Central Europe, landslides often occur in urbanized landscapes or on engineered slopes that had been transformed or created intentionally by human activity, sometimes even centuries ago. The example of Germany makes it possible to correlate the causes and effects of recent landslides with the historical transition of urbanization to urban sprawl, ongoing demographic change, and some chronic problems of industrialized countries today, including ageing infrastructures or rising government debts. In large parts of rural Germany, the combination of ageing infrastructures, population loss, and increasing budget deficits starts to erode historical resilience gains, which brings especially small communities to a tipping point in their risk reduction efforts. While struggling with budget deficits and demographic change, these communities are required to maintain ageing infrastructures that are particularly vulnerable to landslides. Along with a large number of small, but costly landslide events and widespread insidious damages, the interplay of these societal trends determines landslide hazard and risk in Germany or elsewhere in Central Europe (e.g., Houlihan, 1994; Klose et al., 2015). The case studies presented here help to better understand human-environment interactions in the hazard context. Although there has been substantial progress in assessing landslide hazards, integrated approaches with an interdisciplinary focus are still exceptional. The scope of historical datasets available for hazard assessments, however, covers the whole range of natural and social systems interacting with hazards, their influences on overall system vulnerability, and the feedbacks, time lags, and couplings among these systems. In combination with methods from the natural and social sciences, systems analysis supports hazard assessments across disciplinary boundaries to take a broader look at landslide hazards than is usually done. References Houlihan, B., 1994. Europe's ageing infrastructure: Politics, finance and the environment. Utilities Policy 4, 243-252. Liu, J., Dietz, T., Carpenter, S.R., Alberti, M., Folke, C., Moran, E., Pell, A.N., Deadman, P., Kratz, T., Lubchenco, J., Ostrom, E., Ouyang, Z., Provencher, W., Redman, C.L., Schneider, S.H., Taylor, W.W., 2007. Complexity of Coupled Human and Natural Systems. Science 317, 1513-1516. Klose, M., Damm, B., Maurischat, P., 2015. Landslide impacts in Germany: A historical and socioeconomic perspective. 
Landslides, doi:10.1007/s10346-015-0643-9.
DISTANT EARLY WARNING SYSTEM for Tsunamis - A wide-area and multi-hazard approach
NASA Astrophysics Data System (ADS)
Hammitzsch, Martin; Lendholt, Matthias; Wächter, Joachim
2010-05-01
The DEWS (Distant Early Warning System) [1] project, funded under the 6th Framework Programme of the European Union, has the objective to create a new generation of interoperable early warning systems based on an open sensor platform. This platform integrates OGC [2] SWE [3] compliant sensor systems for the rapid detection of hazardous events, like earthquakes, sea level anomalies, ocean floor occurrences, and ground displacements in the case of tsunami early warning. Based on the upstream information flow, DEWS focuses on the improvement of downstream capacities of warning centres, especially by improving information logistics for effective and targeted warning message aggregation in a multilingual environment. Multiple telecommunication channels will be used for the dissemination of warning messages. Wherever possible, existing standards have been integrated. The Command and Control User Interface (CCUI), a rich client application based on Eclipse RCP (Rich Client Platform) [4] and the open source GIS uDig [5], integrates various OGC services. Using WMS (Web Map Service) [6] and WFS (Web Feature Service) [7], spatial data are utilized to depict the situation picture and to integrate a simulation system via WPS (Web Processing Service) [8] to identify affected areas. Warning messages are compiled and transmitted in the OASIS [9] CAP (Common Alerting Protocol) [10] standard together with addressing information defined via EDXL-DE (Emergency Data Exchange Language - Distribution Element) [11]. Internal interfaces are realized with SOAP [12] web services. Based on results of GITEWS [13] - in particular the GITEWS Tsunami Service Bus [14] - the DEWS approach provides an implementation for tsunami early warning systems, but other geological paradigms are going to follow, e.g. volcanic eruptions or landslides. Therefore, multi-hazard functionality is also conceivable in the future. The specific software architecture of DEWS makes it possible to dock varying sensors to the system and to extend the CCUI with hazard specific functionality. The presentation covers the DEWS project, the system architecture and the CCUI in conjunction with details of information logistics. The DEWS Wide Area Centre, connecting national centres to allow international communication and warning exchange, is also presented. REFERENCES: [1] DEWS, www.dews-online.org [2] OGC, www.opengeospatial.org [3] SWE, www.opengeospatial.org/projects/groups/sensorweb [4] Eclipse RCP, www.eclipse.org/home/categories/rcp.php [5] uDig, udig.refractions.net [6] WMS, www.opengeospatial.org/standards/wms [7] WFS, www.opengeospatial.org/standards/wfs [8] WPS, www.opengeospatial.org/standards/wps [9] OASIS, www.oasis-open.org [10] CAP, www.oasis-open.org/specs/#capv1.1 [11] EDXL-DE, www.oasis-open.org/specs/#edxlde-v1.0 [12] SOAP, www.w3.org/TR/soap [13] GITEWS (German Indonesian Tsunami Early Warning System) is a project of the German Federal Government to aid the reconstruction of the tsunami-prone Indian Ocean region, www.gitews.org [14] The Tsunami Service Bus is the GITEWS sensor system integration platform offering standardised services for the detection and monitoring of tsunamis
Alert generation and cockpit presentation for an integrated microburst alerting system
NASA Technical Reports Server (NTRS)
Wanke, Craig; Hansman, R. John, Jr.
1991-01-01
Alert generation and cockpit presentation issues for low level wind shear (microburst) alerts are investigated. Alert generation issues center on the development of a hazard criterion which allows integration of both ground based and airborne wind shear detection systems to form an accurate picture of the aviation hazard posed by a particular wind shear situation. A methodology for the testing of a hazard criteria through flight simulation has been developed, and has been used to examine the effectiveness and feasibility of several possible criteria. Also, an experiment to evaluate candidate graphical cockpit displays for microburst alerts using a piloted simulator has been designed.
NASA Astrophysics Data System (ADS)
Sturm, Michael; Gems, Bernhard; Fuchs, Sven; Mazzorana, Bruno; Papathoma-Köhle, Maria; Aufleger, Markus
2016-04-01
In European mountain regions, losses due to torrential hazards are still considerably high despite the ongoing debate on an overall increasing or decreasing trend. Recent events in Austria clearly revealed that, for technical and economic reasons, an overall protection of settlements in the alpine environment against torrential hazards is not feasible. On the side of the hazard process, events with unpredictable intensities may represent overload scenarios for existing protection structures in the torrent catchments. They bear a particular risk of significant losses in the living space. Although the importance of vulnerability is widely recognised, there is still a research gap concerning its assessment. Currently, potential losses at buildings due to torrential hazards and their comparison with reinstatement costs are determined by the use of empirical functions. Hence, relations of process intensities and the extent of losses, gathered by the analysis of historic hazard events and the information of object-specific restoration values, are used. This approach does not represent a physics-based and integral concept since relevant and often crucial processes, such as the intrusion of the fluid-sediment mixture into elements at risk, are not considered. Our work is therefore targeted at extending the findings and models of present risk research in the context of an integral, more physics-based vulnerability analysis concept. Fluviatile torrential hazard processes and their impacts on the building envelope are experimentally modelled. Material intrusion processes are thereby explicitly considered. Dynamic impacts are gathered quantitatively and spatially distributed by the use of a large set of force transducers. The experimental tests are accomplished with artificial, vertical and skewed plates, including also openings for material intrusion. Further, the impacts on specific buildings within the test site of the work, the fan apex of the Schnannerbach torrent in Tyrol (Austria), are analysed in detail. A couple of buildings are entirely reconstructed within the physical scale model at the scale 1:30. They include basement and first floor and thereby all relevant openings on the building envelopes. The results from experimental modelling represent the data basis for further physics-based vulnerability analysis. Hence, the applied vulnerability analysis concept significantly extends the methods presently used in flood risk assessment. The results of the study are of basic importance for practical application, as they provide extensive information to support hazard zone mapping and management, as well as the planning of local technical protection measures.
EPA Facility Registry Service (FRS): RCRA
This web feature service contains location and facility identification information from EPA's Facility Registry Service (FRS) for the subset of hazardous waste facilities that link to the Resource Conservation and Recovery Act Information System (RCRAInfo). EPA's comprehensive information system in support of the Resource Conservation and Recovery Act (RCRA) of 1976 and the Hazardous and Solid Waste Amendments (HSWA) of 1984, RCRAInfo tracks many types of information about generators, transporters, treaters, storers, and disposers of hazardous waste. FRS identifies and geospatially locates facilities, sites or places subject to environmental regulations or of environmental interest. Using vigorous verification and data management procedures, FRS integrates facility data from EPA's national program systems, other federal agencies, and State and tribal master facility records and provides EPA with a centrally managed, single source of comprehensive and authoritative information on facilities. This data set contains the subset of FRS integrated facilities that link to RCRAInfo hazardous waste facilities once the RCRAInfo data has been integrated into the FRS database. Additional information on FRS is available at the EPA website https://www.epa.gov/enviro/facility-registry-service-frs
Natural-technological risk assessment and management
NASA Astrophysics Data System (ADS)
Burova, Valentina; Frolova, Nina
2016-04-01
EM-DAT statistical data on human impact and economic damages in the first half of 2015 are the highest since 2011: 41% of disasters were floods, responsible for 39% of economic damage, and 7% of events were earthquakes, responsible for 59% of the total death toll. This suggests that disaster risk assessment and management still need to be improved and remain the principal issue in related national and international programs. The paper investigates risk assessment and management practice in the Russian Federation at different levels. A method is proposed to identify territories characterized by integrated natural-technological hazard. Maps of the Russian Federation zoned according to the integrated natural-technological hazard level are presented, as well as the procedure for updating the integrated hazard level taking into account the activity of individual processes. Special attention is paid to databases on the consequences of past natural and technological processes, which are used for verification of current hazard estimates. Examples of natural-technological risk zoning for the country and for some regional territories are presented. Different output risk indexes, both social and economic, are estimated taking into account the requirements of end users. In order to increase the safety of the population of the Russian Federation, trans-boundary hazards are also taken into account.
Advanced integrated enhanced vision systems
NASA Astrophysics Data System (ADS)
Kerr, J. R.; Luk, Chiu H.; Hammerstrom, Dan; Pavel, Misha
2003-09-01
In anticipation of its ultimate role in transport, business and rotary wing aircraft, we clarify the role of Enhanced Vision Systems (EVS): how the output data will be utilized, appropriate architecture for total avionics integration, pilot and control interfaces, and operational utilization. Ground-map (database) correlation is critical, and we suggest that "synthetic vision" is simply a subset of the monitor/guidance interface issue. The core of integrated EVS is its sensor processor. In order to approximate optimal, Bayesian multi-sensor fusion and ground correlation functionality in real time, we are developing a neural net approach utilizing human visual pathway and self-organizing, associative-engine processing. In addition to EVS/SVS imagery, outputs will include sensor-based navigation and attitude signals as well as hazard detection. A system architecture is described, encompassing an all-weather sensor suite; advanced processing technology; inertial, GPS and other avionics inputs; and pilot and machine interfaces. Issues of total-system accuracy and integrity are addressed, as well as flight operational aspects relating to both civil certification and military applications in IMC.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Riddle, F. J.
2003-06-26
The Automated Hazard Analysis (AHA) application is a software tool used to conduct job hazard screening and analysis of tasks to be performed in Savannah River Site facilities. The AHA application provides a systematic approach to the assessment of safety and environmental hazards associated with specific tasks, and the identification of controls, regulations, and other requirements needed to perform those tasks safely. AHA is to be integrated into existing Savannah River Site work control and job hazard analysis processes. Utilization of AHA will improve the consistency and completeness of hazard screening and analysis, and increase the effectiveness of the work planning process.
Lambeek, Ludeke C; van Mechelen, Willem; Knol, Dirk L; Loisel, Patrick; Anema, Johannes R
2010-03-16
To evaluate the effectiveness of an integrated care programme, combining a patient directed and a workplace directed intervention, for patients with chronic low back pain. Population based randomised controlled trial. Primary care (10 physiotherapy practices, one occupational health service, one occupational therapy practice) and secondary care (five hospitals). 134 adults aged 18-65 sick listed for at least 12 weeks owing to low back pain. Patients were randomly assigned to usual care (n=68) or integrated care (n=66). Integrated care consisted of a workplace intervention based on participatory ergonomics, involving a supervisor, and a graded activity programme based on cognitive behavioural principles. The primary outcome was the duration of time off work (work disability) due to low back pain until full sustainable return to work. Secondary outcome measures were intensity of pain and functional status. The median duration until sustainable return to work was 88 days in the integrated care group compared with 208 days in the usual care group (P=0.003). Integrated care was effective on return to work (hazard ratio 1.9, 95% confidence interval 1.2 to 2.8, P=0.004). After 12 months, patients in the integrated care group improved significantly more on functional status compared with patients in the usual care group (P=0.01). Improvement of pain between the groups did not differ significantly. The integrated care programme substantially reduced disability due to chronic low back pain in private and working life. Trial registration Current Controlled Trials ISRCTN28478651.
Prototype development of a web-based participative decision support platform in risk management
NASA Astrophysics Data System (ADS)
Aye, Zar Chi; Olyazadeh, Roya; Jaboyedoff, Michel; Derron, Marc-Henri
2014-05-01
This paper discusses the proposed background architecture and prototype development of an internet-based decision support system (DSS) in the field of natural hazards and risk management using open-source geospatial software and web technologies. It is based on a three-tier, client-server architecture with the support of boundless (opengeo) framework and its client side SDK application environment using customized gxp components and data utility classes. The main purpose of the system is to integrate the workflow of risk management systematically with the diverse involvement of stakeholders from different organizations dealing with natural hazards and risk for evaluation of management measures through the active online participation approach. It aims to develop an adaptive user friendly, web-based environment that allows the users to set up risk management strategies based on actual context and data by integrating web-GIS and DSS functionality associated with process flow and other visualization tools. Web-GIS interface has been integrated within the DSS to deliver maps and provide certain geo-processing capabilities on the web, which can be easily accessible and shared by different organizations located in case study sites of the project. This platform could be envisaged not only as a common web-based platform for the centralized sharing of data such as hazard maps, elements at risk maps and additional information but also to ensure an integrated platform of risk management where the users could upload data, analyze risk and identify possible alternative scenarios for risk reduction especially for floods and landslides, either quantitatively or qualitatively depending on the risk information provided by the stakeholders in case study regions. The level of involvement, access to and interaction with the provided functionality of the system varies depending on the roles and responsibilities of the stakeholders, for example, only the experts (planners, geological services, etc.) can have access to the alternative definition component to formulate the risk reduction measures. The development of such a participative platform would finally lead to an integrated risk management approach highlighting the needs to deal with involved experts and civil society in the decision-making process for evaluation of risk management measures through the active participation approach. The system will be applied and evaluated in four case study areas of the CHANGES project in Europe: Romania, North Eastern Italy, French Alps and Poland. However, the framework of the system is designed in a generic way so as to be applicable in other regions to achieve the high adaptability and flexibility of the system. The research has been undertaken as a part of the CHANGES project funded by the European Commission's 7th framework program.
Toward an Application Guide for Safety Integrity Level Allocation in Railway Systems.
Ouedraogo, Kiswendsida Abel; Beugin, Julie; El-Koursi, El-Miloudi; Clarhaut, Joffrey; Renaux, Dominique; Lisiecki, Frederic
2018-02-02
The work in the article presents the development of an application guide based on feedback and comments stemming from various railway actors on their practices of SIL allocation to railway safety-related functions. The initial generic methodology for SIL allocation has been updated to be applied to railway rolling stock safety-related functions in order to solve the SIL concept application issues. Various actors dealing with railway SIL allocation problems are the intended target of the methodology; its principles will be summarized in this article with a focus on modifications and precisions made in order to establish a practical guide for railway safety authorities. The methodology is based on the flowchart formalism used in CSM (common safety method) European regulation. It starts with the use of quantitative safety requirements, particularly tolerable hazard rates (THR). THR apportioning rules are applied. On the one hand, the rules are related to classical logical combinations of safety-related functions preventing hazard occurrence. On the other hand, to take into account technical conditions (last safety weak link, functional dependencies, technological complexity, etc.), specific rules implicitly used in existing practices are defined for readjusting some THR values. SIL allocation process based on apportioned and validated THR values is finally illustrated through the example of "emergency brake" subsystems. Some specific SIL allocation rules are also defined and illustrated. © 2018 Society for Risk Analysis.
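To make the THR-to-SIL step concrete, the sketch below encodes two generic ingredients of such a methodology: an equal apportionment of a tolerable hazard rate across sub-functions combined in an OR structure, and a lookup of the SIL band for the apportioned rate using the usual IEC 61508 / EN 50129 continuous-mode table. The numbers and the "emergency brake" example are illustrative only and do not reproduce the article's specific readjustment rules.

```python
# Sketch of (i) equal THR apportionment over an OR combination of sub-functions
# and (ii) SIL lookup from the apportioned THR, using the generic
# IEC 61508 / EN 50129 continuous-mode bands (dangerous failures per hour).

def apportion_or(thr_total: float, n_subfunctions: int) -> float:
    """OR combination: any sub-function failure leads to the hazard, so the
    sub-function rates must sum to the total THR (equal split assumed here)."""
    return thr_total / n_subfunctions

def sil_from_thr(thr: float) -> int:
    """Map a THR (failures per hour) to a SIL band."""
    if 1e-9 <= thr < 1e-8:
        return 4
    if 1e-8 <= thr < 1e-7:
        return 3
    if 1e-7 <= thr < 1e-6:
        return 2
    if 1e-6 <= thr < 1e-5:
        return 1
    raise ValueError("THR outside the tabulated SIL bands")

# Illustrative 'emergency brake' hazard: THR = 1e-8 /h shared by two
# sub-functions whose individual failures each cause the hazard.
thr_sub = apportion_or(1e-8, 2)
print(f"sub-function THR = {thr_sub:.1e} /h -> SIL {sil_from_thr(thr_sub)}")
```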
A candidate concept for display of forward-looking wind shear information
NASA Technical Reports Server (NTRS)
Hinton, David A.
1989-01-01
A concept is proposed which integrates forward-look wind shear information with airplane performance capabilities to predict future airplane energy state as a function of range. The information could be displayed to a crew either in terms of energy height or airspeed deviations. The anticipated benefits of the proposed display information concept are: (1) a wind shear hazard product that scales directly to the performance impact on the airplane and that has intuitive meaning to flight crews; (2) a reduction in flight crew workload by automatic processing of relevant hazard parameters; and (3) a continuous display of predicted airplane energy state if the approach is continued. Such a display may be used to improve pilot situational awareness or improve pilot confidence in wind shear alerts generated by other systems. The display is described and the algorithms necessary for implementation in a simulation system are provided.
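A very crude sketch of the underlying energy bookkeeping is given below: specific energy height is h + V^2/(2g), and the predicted airspeed along range is obtained under the rule-of-thumb assumption that a loss of headwind appears, in the short term, as an equal loss of airspeed. The headwind profile, speeds, and altitudes are hypothetical, and this is not the display algorithm proposed in the paper.

```python
import numpy as np

G = 9.81  # m/s^2

def energy_height(altitude_m, airspeed_ms):
    """Specific energy height: h + V^2 / (2 g)."""
    return altitude_m + airspeed_ms**2 / (2.0 * G)

def predicted_airspeed(v_now_ms, headwind_profile_ms):
    """Rule-of-thumb assumption: a headwind loss shows up, in the short term,
    as an equal airspeed loss before the airplane can re-accelerate."""
    return v_now_ms + (headwind_profile_ms - headwind_profile_ms[0])

# hypothetical forward-looking headwind profile over the next 5 km of approach
range_m = np.linspace(0.0, 5000.0, 51)
headwind = 10.0 - 12.0 * np.clip((range_m - 2000.0) / 1000.0, 0.0, 1.0)  # 12 m/s loss

v_pred = predicted_airspeed(70.0, headwind)         # 70 m/s approach speed
e_h = energy_height(300.0, v_pred)                  # 300 m above threshold
print("worst predicted airspeed deviation [m/s]:", round((v_pred - 70.0).min(), 1))
print("predicted energy-height loss [m]:",
      round(energy_height(300.0, 70.0) - e_h.min(), 1))
```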
Emergency Operations Center at Johnson Space Center
NASA Technical Reports Server (NTRS)
Caylor, Gary C.
1997-01-01
In June 1996, at the start of the Gulf Coast hurricane season, the Johnson Space Center (JSC) celebrated the opening of its new 4,000-square-foot, state-of-the-art Emergency Operations Center (EOC). The new EOC has been upgraded and enhanced to support a wide spectrum of emergencies affecting JSC and neighboring communities. One of the main features of the EOC is its premier computerized dispatch center. The new system unites many of JSC's critical emergency functions into one integrated network. It automatically monitors fire alarms, security entrances, and external cameras. It contains the JSC inventory of hazardous materials, by building and room, and can call up Material Safety Data Sheets for most of the generic hazardous materials used on-site. The EOC is available for community use during area emergencies such as hurricanes and is a welcome addition to the Clear Lake/Galveston Bay Area communities' emergency response resources.
NASA Astrophysics Data System (ADS)
Sajjad, Muhammad; Li, Yangfan; Tang, Zhenghong; Cao, Ling; Liu, Xiaoping
2018-03-01
Worldwide, humans are facing high risks from natural hazards, especially in coastal regions with high population densities. Rising sea levels due to global warming are making coastal communities' infrastructure vulnerable to natural disasters. The present study aims to provide a coupling approach of vulnerability and resilience through restoration and conservation of lost or degraded coastal natural habitats to reclamation under different climate change scenarios. The integrated valuation of ecosystems and tradeoffs model is used to assess the current and future vulnerability of coastal communities. The model employed is based on seven different biogeophysical variables to calculate a natural hazard index and to highlight the criticality of the restoration of natural habitats. The results show that roughly 25% of the coastline and more than 5 million residents are in highly vulnerable coastal areas of mainland China, and these numbers are expected to double by 2100. Our study suggests that restoration and conservation in recently reclaimed areas have the potential to reduce this vulnerability by 45%. Hence, natural habitats have proved to be a great defense against coastal hazards and should be prioritized in coastal planning and development. The findings confirm that natural habitats are critical for coastal resilience and can act as a recovery force of coastal functionality loss. Therefore, we recommend that the Chinese government prioritizes restoration (where possible) and conservation of the remaining habitats for the sake of coastal resilience to prevent natural hazards from escalating into disasters.
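The aggregation of several ranked biogeophysical variables into a single hazard (exposure) index is often done with a geometric mean of the ranks; the sketch below shows that style of calculation for one hypothetical shoreline segment with and without its natural habitat, purely to illustrate how habitat loss raises the index. The variable list, ranks, and aggregation rule are assumptions, not the study's exact model configuration.

```python
import numpy as np

def hazard_index(ranks):
    """Geometric mean of per-variable ranks (1 = very low ... 5 = very high)."""
    r = np.asarray(ranks, dtype=float)
    return r.prod(axis=-1) ** (1.0 / r.shape[-1])

# Hypothetical ranks for one shoreline segment, seven biogeophysical variables:
# geomorphology, relief, natural habitats, sea-level change, wind, waves, surge.
with_habitat    = [4, 3, 2, 4, 3, 3, 4]
without_habitat = [4, 3, 5, 4, 3, 3, 4]   # habitat lost or degraded -> worst rank
print("index with habitats   :", round(hazard_index(with_habitat), 2))
print("index without habitats:", round(hazard_index(without_habitat), 2))
```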
Comparing capacity coefficient and dual task assessment of visual multitasking workload
DOE Office of Scientific and Technical Information (OSTI.GOV)
Blaha, Leslie M.
Capacity coefficient analysis could offer a theoretically grounded alternative approach to subjective measures and dual task assessment of cognitive workload. Workload capacity or workload efficiency is a human information processing modeling construct defined as the amount of information that can be processed by the visual cognitive system given a specified amount of time. In this paper, I explore the relationship between capacity coefficient analysis of workload efficiency and dual task response time measures. To capture multitasking performance, I examine how the relatively simple assumptions underlying the capacity construct generalize beyond single visual decision making tasks. The fundamental tools for measuring workload efficiency are the integrated hazard and reverse hazard functions of response times, which are defined by log transforms of the response time distribution. These functions are used in the capacity coefficient analysis to provide a functional assessment of the amount of work completed by the cognitive system over the entire range of response times. For the study of visual multitasking, capacity coefficient analysis enables a comparison of visual information throughput as the number of tasks increases from one to two to any number of simultaneous tasks. I illustrate the use of capacity coefficients for visual multitasking on sample data from dynamic multitasking in the modified Multi-attribute Task Battery.
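The integrated hazard mentioned here is H(t) = -log S(t), and the OR-task capacity coefficient compares the dual-task integrated hazard with the sum of the single-task ones, C(t) = H_dual(t) / (H_A(t) + H_B(t)). The sketch below computes that ratio from toy response-time samples; the data and grid are hypothetical, and the dual task is simulated as a race between the two single-task distributions so the result should hover near the unlimited-capacity baseline of 1.

```python
import numpy as np

def integrated_hazard(rts, t_grid):
    """H(t) = -log S(t), with S(t) the empirical survivor function of the RTs."""
    rts = np.asarray(rts, dtype=float)
    surv = np.array([(rts > t).mean() for t in t_grid])
    return -np.log(np.clip(surv, 1e-12, 1.0))       # avoid log(0) in the tail

def capacity_or(rt_dual, rt_a, rt_b, t_grid):
    """OR-task capacity coefficient C(t) = H_dual(t) / (H_A(t) + H_B(t))."""
    h_dual = integrated_hazard(rt_dual, t_grid)
    h_sum = integrated_hazard(rt_a, t_grid) + integrated_hazard(rt_b, t_grid)
    return np.divide(h_dual, h_sum, out=np.full_like(h_dual, np.nan), where=h_sum > 0)

# toy response-time samples (seconds); the dual task is simulated as a race
# between two independent single-task channels (unlimited-capacity baseline)
rng = np.random.default_rng(2)
rt_a = rng.gamma(4.0, 0.1, 400)
rt_b = rng.gamma(4.0, 0.1, 400)
rt_dual = np.minimum(rng.gamma(4.0, 0.1, 400), rng.gamma(4.0, 0.1, 400))
grid = np.linspace(0.2, 1.0, 40)
c_t = capacity_or(rt_dual, rt_a, rt_b, grid)
print("C(t) near 0.5 s:", round(float(np.interp(0.5, grid, c_t)), 2))  # ~1 expected
```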
IRIS Toxicological Review of Methanol (Non-Cancer) ...
EPA is conducting a peer review and public comment of the scientific basis supporting the human health hazard and dose-response assessment of methanol (non-cancer) that, when finalized, will appear in the Integrated Risk Information System (IRIS) database.
Tracking Temporal Hazard in the Human Electroencephalogram Using a Forward Encoding Model
2018-01-01
Abstract Human observers automatically extract temporal contingencies from the environment and predict the onset of future events. Temporal predictions are modeled by the hazard function, which describes the instantaneous probability for an event to occur given it has not occurred yet. Here, we tackle the question of whether and how the human brain tracks continuous temporal hazard on a moment-to-moment basis, and how flexibly it adjusts to strictly implicit variations in the hazard function. We applied an encoding-model approach to human electroencephalographic data recorded during a pitch-discrimination task, in which we implicitly manipulated temporal predictability of the target tones by varying the interval between cue and target tone (i.e. the foreperiod). Critically, temporal predictability either was driven solely by the passage of time (resulting in a monotonic hazard function) or was modulated to increase at intermediate foreperiods (resulting in a modulated hazard function with a peak at the intermediate foreperiod). Forward-encoding models trained to predict the recorded EEG signal from different temporal hazard functions were able to distinguish between experimental conditions, showing that implicit variations of temporal hazard bear tractable signatures in the human electroencephalogram. Notably, this tracking signal was reconstructed best from the supplementary motor area, underlining this area’s link to cognitive processing of time. Our results underline the relevance of temporal hazard to cognitive processing and show that the predictive accuracy of the encoding-model approach can be utilized to track abstract time-resolved stimuli. PMID:29740594
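For a discrete set of possible onset times, the hazard is simply the probability of the event at each time given that it has not yet occurred, h(t) = P(T = t) / P(T >= t). The short sketch below computes this for two hypothetical foreperiod distributions: a uniform one, whose hazard rises monotonically with elapsed time, and one with extra mass at the second foreperiod, which produces a local hazard peak there (the specific numbers are illustrative, not the study's design).

```python
import numpy as np

def discrete_hazard(probabilities):
    """Hazard at each possible onset time: P(event at t | not occurred before t)."""
    p = np.asarray(probabilities, dtype=float)
    p = p / p.sum()
    survival_before = 1.0 - np.concatenate(([0.0], np.cumsum(p)[:-1]))
    return p / survival_before

# uniform foreperiod distribution -> hazard rises monotonically with elapsed time
print(np.round(discrete_hazard([0.2, 0.2, 0.2, 0.2, 0.2]), 3))
# extra probability mass at the second foreperiod -> local hazard peak there
print(np.round(discrete_hazard([0.1, 0.5, 0.15, 0.15, 0.1]), 3))
```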
1983-09-01
[Garbled OCR front matter; recoverable content: the report was reviewed and approved for publication; RADC Project Engineer: Mark W. Levi (RBRP); the surviving abstract fragment concerns a fault masking the presence of another fault that posed a functional or reliability hazard.]
[Hazard function and life table: an introduction to the failure time analysis].
Matsushita, K; Inaba, H
1987-04-01
Failure time analysis has become popular in demographic studies. It can be viewed as a part of regression analysis with limited dependent variables as well as a special case of event history analysis and multistate demography. The idea of hazard function and failure time analysis, however, has not been properly introduced to nor commonly discussed by demographers in Japan. The concept of hazard function in comparison with life tables is briefly described, where the force of mortality is interchangeable with the hazard rate. The basic idea of failure time analysis is summarized for the cases of exponential distribution, normal distribution, and proportional hazard models. The multiple decrement life table is also introduced as an example of lifetime data analysis with cause-specific hazard rates.
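A minimal sketch of the correspondence described here: the discrete hazard implicit in a life table is the conditional probability of dying in each age interval, q_x = (l_x - l_{x+1}) / l_x, while in continuous failure-time models the exponential distribution has a constant hazard that a proportional hazards model rescales by exp(beta'x). The survivorship column and parameter values below are hypothetical.

```python
import numpy as np

def life_table_hazard(l_x):
    """Discrete hazard from the survivorship column l_x of a life table:
    q_x = (l_x - l_{x+1}) / l_x, the probability of dying in the interval
    given survival to its start (discrete analogue of the force of mortality)."""
    l_x = np.asarray(l_x, dtype=float)
    return 1.0 - l_x[1:] / l_x[:-1]

# hypothetical survivorship column (radix 100000)
l_x = [100000, 99000, 98500, 97000, 90000, 70000]
print("q_x:", np.round(life_table_hazard(l_x), 4))

# continuous analogue: an exponential failure time has constant hazard lambda,
# and a proportional hazards model rescales a baseline hazard by exp(beta'x)
lam, beta, x = 0.02, np.array([0.7]), np.array([1.0])
print("baseline hazard:", lam,
      "| PH-adjusted hazard:", round(float(lam * np.exp(beta @ x)), 4))
```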
NASA Astrophysics Data System (ADS)
Meyer, F. J.; Webley, P.; Dehn, J.; Arko, S. A.; McAlpin, D. B.
2013-12-01
Volcanic eruptions are among the most significant hazards to human society, capable of triggering natural disasters on regional to global scales. In the last decade, remote sensing techniques have become established in operational forecasting, monitoring, and managing of volcanic hazards. Monitoring organizations, like the Alaska Volcano Observatory (AVO), are nowadays heavily relying on remote sensing data from a variety of optical and thermal sensors to provide time-critical hazard information. Despite the high utilization of these remote sensing data to detect and monitor volcanic eruptions, the presence of clouds and a dependence on solar illumination often limit their impact on decision making processes. Synthetic Aperture Radar (SAR) systems are widely believed to be superior to optical sensors in operational monitoring situations, due to the weather and illumination independence of their observations and the sensitivity of SAR to surface changes and deformation. Despite these benefits, the contributions of SAR to operational volcano monitoring have been limited in the past due to (1) high SAR data costs, (2) traditionally long data processing times, and (3) the low temporal sampling frequencies inherent to most SAR systems. In this study, we present improved data access, data processing, and data integration techniques that mitigate some of the above-mentioned limitations and allow, for the first time, a meaningful integration of SAR into operational volcano monitoring systems. We will introduce a new database interface that was developed in cooperation with the Alaska Satellite Facility (ASF) and allows for rapid and seamless data access to all of ASF's SAR data holdings. We will also present processing techniques that improve the temporal frequency with which hazard-related products can be produced. These techniques take advantage of modern signal processing technology as well as new radiometric normalization schemes, both enabling the combination of multiple observation geometries in change detection procedures. Additionally, it will be shown how SAR-based hazard information can be integrated with data from optical satellites, thermal sensors, webcams and models to create near-real time volcano hazard information. We will introduce a prototype monitoring system that integrates SAR-based hazard information into the near real-time volcano hazard monitoring system of the Alaska Volcano Observatory. This prototype system was applied to historic eruptions of the volcanoes Okmok and Augustine, both located in the North Pacific. We will show that for these historic eruptions, the addition of SAR data led to a significant improvement in activity detection and eruption monitoring, and improved the accuracy and timeliness of eruption alerts.
NASA Astrophysics Data System (ADS)
Leonard, Graham S.; Stewart, Carol; Wilson, Thomas M.; Procter, Jonathan N.; Scott, Bradley J.; Keys, Harry J.; Jolly, Gill E.; Wardman, Johnny B.; Cronin, Shane J.; McBride, Sara K.
2014-10-01
New Zealand's Tongariro National Park volcanoes produce hazardous eruptions every few years to decades. On 6 August 2012 the Te Maari vent of Tongariro Volcano erupted, producing a series of explosions and a fine ash of minor volume which was dispersed rapidly to the east. This manuscript presents a summary of the eruption impacts and the way these supported science communication during the crisis, particularly in terms of hazard map development. The most significant proximal impact was damage from pyroclastic surges and ballistics to the popular and economically-important Tongariro Alpine Crossing track. The only hazard to affect the medial impact zone was a few mms of ashfall with minor impacts. Field testing indicated that the Te Maari ash had extremely low resistivity when wetted, implying a very high potential to cause disruption to nationally-important power transmission networks via the mechanism of insulator flashover. This was not observed, presumably due to insufficient ash accumulation on insulators. Virtually no impacts from distal ashfall were reported. Post-event analysis of PM10 data demonstrates the additional value of regional air quality monitoring networks in quantifying population exposure to airborne respirable ash. While the eruption was minor, it generated a high level of public interest and a demand for information on volcanic hazards and impacts from emergency managers, the public, critical infrastructure managers, health officials, and the agriculture sector. Meeting this demand fully taxed available resources. We present here aspects of the New Zealand experience which may have wider applicability in moving towards improved integration of hazard impact information, mapping, and communication. These include wide use of a wiki technical clearinghouse and email listservs, a focus on multi-agency consistent messages, and a recently developed environment of collaboration and alignment of both research funding and technical science advice. Hazard maps were integral to science communication during the crisis, but there is limited international best practice information available on hazard maps as communication devices, as most volcanic hazard mapping literature is concerned with defining hazard zones. We propose that hazard maps are only as good as the communications framework and inter-agency relationships in which they are embedded, and we document in detail the crisis hazard map development process. We distinguish crisis hazard maps from background hazard maps and ashfall prediction maps, illustrating the complementary nature of these three distinct communication mechanisms. We highlight issues that arose and implications for the development of future maps.
Integrated risk management and communication: case study of Canton Vaud (Switzerland)
NASA Astrophysics Data System (ADS)
Artigue, Veronica; Aye, Zar Chi; Gerber, Christian; Derron, Marc-Henri; Jaboyedoff, Michel
2017-04-01
Canton Vaud's history is marked by events that remind us that any territory may have to cope with natural hazards such as the devastating floods of the Baye and the Veraye rivers in Montreux (1927), the overflowing of the Rhône by dam failure (1935), the mud flow of Pissot (1995) and avalanches in the Prealps (1999). All of these examples have caused significant damage, and sometimes even fatalities, in the regions of Canton Vaud. In response to these issues, the Swiss Confederation and the local authorities of the Canton decided to implement an integrated natural risk management policy. The production of natural hazard maps was the first step of the integrated management process. This work resulted in more than 10'000 maps and related documents for 94% of the municipalities of the Canton, covering 17% of its total surface. Given this significant amount of data, the main challenge is to communicate it effectively and to build an integrated risk management structure. To make this information relevant for end users, the teams involved produced documents and tools to help all stakeholders understand these data. The first step of this process was to carry out a statistical and geographical analysis of hazard maps that allows identification of the areas most exposed to natural hazards. An atlas could thus be created. Within this framework, several topics were then discussed for each identified risk. The results show that 88 of 318 municipalities in Canton Vaud have at least a high hazard level on their territory, 108 have a moderate hazard level, 41 a low level and 8 a residual level. Only 73 of 318 municipalities remain with a minimum or zero hazard level. Concerning the type of hazard considered, 16% of the building zones are exposed to floods, 18% to mud flow, 16% to deep landslides, 14% to spontaneous surface landslides, 6% to rockfall, 55% to rock collapses and less than 5% to avalanches. As national policies require risk to be taken into account at the building scale, further analyses of the buildings were carried out. 1'154 buildings are exposed to a high hazard level, while 8409, 21'130 and 14'980 buildings are exposed to a moderate, low and residual hazard level respectively. This paper addresses the complexity of producing the hazard map products of Canton Vaud, particularly through the statistical analysis and the difficulties encountered with data availability and quality at the building scale. Using the example of Canton Vaud, the authors highlight the processes needed to build robust communication for all stakeholders involved in risk management in a dynamic and changing area.
TRIM.Risk is used to integrate the information on exposure received from TRIM.FaTE or TRIM.Expo with that on dose-response or hazard assessment and to provide quantitative descriptions of risk or hazard and some of the attendant uncertainties.
Meris, Ronald G; Barbera, Joseph A
2014-01-01
In a large-scale outdoor, airborne, hazardous materials (HAZMAT) incident, such as ruptured chlorine rail cars during a train derailment, the local Incident Commanders and HAZMAT emergency responders must obtain accurate information quickly to assess the situation and act promptly and appropriately. HAZMAT responders must have a clear understanding of key information and how to integrate it into timely and effective decisions for action planning. This study examined the use of HAZMAT plume modeling as a decision support tool during incident action planning in this type of extreme HAZMAT incident. The concept of situation awareness as presented by Endsley's dynamic situation awareness model contains three levels: perception, comprehension, and projection. It was used to examine the actions of incident managers related to adequate data acquisition, current situational understanding, and accurate situation projection. Scientists and engineers have created software to simulate and predict HAZMAT plume behavior, the projected hazard impact areas, and the associated health effects. Incorporating the use of HAZMAT plume projection modeling into an incident action plan may be a complex process. The present analysis used a mixed qualitative and quantitative methodological approach and examined the use and limitations of a "HAZMAT Plume Modeling Cycle" process that can be integrated into the incident action planning cycle. HAZMAT response experts were interviewed using a computer-based simulation. One of the research conclusions indicated the "HAZMAT Plume Modeling Cycle" is a critical function so that an individual/team can be tasked with continually updating the hazard plume model with evolving data, promoting more accurate situation awareness.
Analysis of Compound Water Hazard in Coastal Urbanized Areas under the Future Climate
NASA Astrophysics Data System (ADS)
Shibuo, Y.; Taniguchi, K.; Sanuki, H.; Yoshimura, K.; Lee, S.; Tajima, Y.; Koike, T.; Furumai, H.; Sato, S.
2017-12-01
Several studies indicate the increased frequency and magnitude of heavy rainfalls as well as sea level rise under the future climate, which implies that coastal low-lying urbanized areas may experience an increased risk of flooding. In such areas, where river discharge, tidal fluctuation, and city drainage networks altogether influence urban inundation, it is necessary to consider their potential interference to understand the effect of compound water hazard. For instance, pump stations cannot pump out storm water when the river water level is high, and in the meantime the river water level will increase when it receives pumped water from cities. Further downstream, as the tidal fluctuation regulates the water levels in the river, it will also affect the functionality of pump stations and possible inundation from rivers. In this study, we estimate compound water hazard in the coastal low-lying urbanized areas of the Tsurumi river basin under the future climate. We developed a seamlessly integrated river, sewerage, and coastal hydraulic model that can simulate river water levels, water flow in the sewerage network, and inundation from the rivers and/or the coast to address the potential interference issue. As a forcing, the pseudo global warming method, which applies the changes in GCM anomaly to re-analysis data, is employed to produce ensemble typhoons to drive the seamlessly integrated model. The results show that heavy rainfalls caused by the observed typhoon generally become stronger under the pseudo global warming condition. It also suggests that the coastal low-lying areas become extensively inundated if the onsets of river flooding and storm surge coincide.
Nadal, Martí; Kumar, Vikas; Schuhmacher, Marta; Domingo, José L
2006-08-01
A risk map of the chemical/petrochemical industrial area of Tarragona (Catalonia, Spain) was designed following a two-stage procedure. The first step was the creation of a ranking system (Hazard Index) for a number of different inorganic and organic pollutants: heavy metals, polychlorinated dibenzo-p-dioxins and dibenzofurans (PCDD/Fs), polychlorinated biphenyls (PCBs) and polycyclic aromatic hydrocarbons (PAHs), by applying self-organizing maps (SOM) to persistence, bioaccumulation and toxicity properties of the chemicals. PCBs seemed to be the most hazardous compounds, while the light PAHs showed the minimum values. Subsequently, an Integral Risk Index was developed taking into account the Hazard Index and the concentrations of all pollutants in soil samples collected in the assessed area of Tarragona. Finally, a risk map was elaborated by representing the spatial distribution of the Integral Risk Index with a geographic information system (GIS). The results of the present study seem to indicate that the development of an integral risk map might be useful to help in decision-making processes concerning environmental pollutants.
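The combination of per-pollutant hazard scores with measured soil concentrations can be sketched as a concentration-weighted sum; the toy calculation below normalises each pollutant's concentrations across samples and weights them by a hazard score. The scores, concentrations, and the simple linear aggregation are assumptions for illustration and do not reproduce the paper's SOM-based Hazard Index or its exact Integral Risk Index formula.

```python
import numpy as np

def integral_risk_index(hazard_scores, concentrations):
    """Concentration-weighted sum of per-pollutant hazard scores, with each
    pollutant's concentrations normalised to its maximum across all samples."""
    conc = np.asarray(concentrations, dtype=float)
    conc_norm = conc / conc.max(axis=0)
    return conc_norm @ np.asarray(hazard_scores, dtype=float)

# hypothetical hazard scores (higher = more hazardous) and soil concentrations
# at three sampling points for three pollutant groups (PCBs, PCDD/Fs, PAHs)
scores = [0.9, 0.7, 0.3]
conc = [[1.0, 0.2, 5.0],
        [0.1, 0.8, 1.0],
        [0.5, 0.5, 2.0]]
print(np.round(integral_risk_index(scores, conc), 2))   # one index per sample point
```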
A methodology for post-mainshock probabilistic assessment of building collapse risk
Luco, N.; Gerstenberger, M.C.; Uma, S.R.; Ryu, H.; Liel, A.B.; Raghunandan, M.
2011-01-01
This paper presents a methodology for post-earthquake probabilistic risk (of damage) assessment that we propose in order to develop a computational tool for automatic or semi-automatic assessment. The methodology utilizes the same so-called risk integral which can be used for pre-earthquake probabilistic assessment. The risk integral couples (i) ground motion hazard information for the location of a structure of interest with (ii) knowledge of the fragility of the structure with respect to potential ground motion intensities. In the proposed post-mainshock methodology, the ground motion hazard component of the risk integral is adapted to account for aftershocks which are deliberately excluded from typical pre-earthquake hazard assessments and which decrease in frequency with the time elapsed since the mainshock. Correspondingly, the structural fragility component is adapted to account for any damage caused by the mainshock, as well as any uncertainty in the extent of this damage. The result of the adapted risk integral is a fully-probabilistic quantification of post-mainshock seismic risk that can inform emergency response mobilization, inspection prioritization, and re-occupancy decisions.
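A minimal numerical version of the risk integral is sketched below: a lognormal fragility curve is integrated against the (negative) slope of a ground-motion hazard curve to give an annual collapse rate. The hazard curve shape, fragility parameters, and intensity measure are hypothetical; in the post-mainshock setting described above, the hazard curve would be replaced by a time-dependent aftershock rate and the fragility by its damage-adjusted counterpart.

```python
import numpy as np
from scipy.stats import lognorm

def risk_integral(im_grid, exceedance_rate, fragility_median, fragility_beta):
    """Annual rate of collapse: integrate the fragility curve against the
    (negative) derivative of the ground-motion hazard curve over intensity."""
    p_collapse = lognorm.cdf(im_grid, s=fragility_beta, scale=fragility_median)
    rate_density = -np.gradient(exceedance_rate, im_grid)   # -d(lambda)/d(IM)
    return np.trapz(p_collapse * rate_density, im_grid)

# hypothetical hazard curve lambda(IM) = k0 * IM^-k and lognormal fragility
im = np.linspace(0.01, 3.0, 300)                 # e.g. spectral acceleration in g
lam = 1e-4 * im**-2.5                            # annual exceedance rate
rate = risk_integral(im, lam, fragility_median=1.0, fragility_beta=0.5)
print(f"annual collapse rate ~ {rate:.2e} "
      f"(probability in 50 yr ~ {1.0 - np.exp(-50.0 * rate):.3%})")
```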
Eaton, A D; Zimmermann, C; Delaney, B; Hurley, B P
2017-08-01
An experimental platform employing human-derived intestinal epithelial cell (IEC) line monolayers grown on permeable Transwell® filters was previously investigated to differentiate between hazardous and innocuous proteins. This approach was effective at distinguishing these types of proteins, and perturbation of monolayer integrity, particularly transepithelial electrical resistance (TEER), was the most sensitive indicator. In the current report, in vitro indicators of monolayer integrity, cytotoxicity, and inflammation were evaluated using primary (non-transformed) human polarized small intestinal epithelial barriers cultured on Transwell® filters to compare effects of a hazardous protein (Clostridium difficile Toxin A [ToxA]) and an innocuous protein (bovine serum albumin [BSA]). ToxA exerted a reproducible decrease on barrier integrity at doses comparable to those producing effects observed from cell line-derived IEC monolayers, with TEER being the most sensitive indicator. In contrast, BSA, tested at concentrations substantially higher than ToxA, did not cause changes in any of the tested variables. These results demonstrate a similarity in response to certain proteins between cell line-derived polarized IEC models and a primary human polarized small intestinal epithelial barrier model, thereby reinforcing the potential usefulness of cell line-derived polarized IECs as a valid experimental platform to differentiate between hazardous and non-hazardous proteins. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.
J-SHIS - an integrated system for knowing seismic hazard information in Japan
NASA Astrophysics Data System (ADS)
Azuma, H.; Fujiwara, H.; Kawai, S.; Hao, K. X.; Morikawa, N.
2015-12-01
The Japan Seismic Hazard Information Station (J-SHIS) was established in 2005 as an integrated system for issuing and exchanging information from the National Seismic Hazard Maps for Japan, which are based on seismic hazard assessment (SHA). A simplified smartphone app, also named J-SHIS, is widely used in Japan and builds on the integrated system at http://www.j-shis.bosai.go.jp/map/?lang=en. The "smartphone tells hazard" concept is realized on a cellphone, a tablet and/or a PC. For a given spot, comprehensive SHA map information can be easily obtained: 1) the SHA probability of a given intensity (JMA 5-, 5+, 6-, 6+) within 30 years; 2) a site amplification factor ranging from about 0.5 to 3.0, with an expected value of 1, based on surface geology map information; 3) the depth of the seismic basement, down to roughly 3,000 m, based on deep borehole and geological structure data; 4) scenario earthquake maps: choosing an active fault gives the average case for different modeling parameters, and choosing a case gives the colour-scaled shaking intensity map. "Seismic Hazard Karte tells more hazard" is another app, based on the website http://www.j-shis.bosai.go.jp/labs/karte/: (1) for every 250 m x 250 m mesh, professional SHA information is provided nationwide; (2) comprehensive SHA information is delivered as five ranks for eight items; (3) a site amplification factor with an average index is given; (4) deeper geologic structure modeling is provided with borehole profiling; (5) the SHA probability is assessed within 30 and/or 50 years for the given site; (6) seismic hazard curves are given for earthquake sources from inland active faults, subduction zones, undetermined sources, and their combination; (7) JMA seismic intensities are assessed for long-term average return periods of 500 to ~100,000 years. The J-SHIS app can be downloaded freely from http://www.j-shis.bosai.go.jp/app-jshis.
Developing strategies for maintaining tank car integrity during train accidents
DOT National Transportation Integrated Search
2007-09-11
Accidents that lead to rupture of tank cars carrying hazardous materials can cause serious public safety hazards and substantial economic losses. The desirability of improved tank car designs that are better equipped to keep the commodity con...
46 CFR 111.105-5 - System integrity.
Code of Federal Regulations, 2011 CFR
2011-10-01
... 46 Shipping 4 2011-10-01 2011-10-01 false System integrity. 111.105-5 Section 111.105-5 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) ELECTRICAL ENGINEERING ELECTRIC SYSTEMS-GENERAL REQUIREMENTS Hazardous Locations § 111.105-5 System integrity. In order to maintain system integrity, each...
46 CFR 111.105-5 - System integrity.
Code of Federal Regulations, 2012 CFR
2012-10-01
... 46 Shipping 4 2012-10-01 2012-10-01 false System integrity. 111.105-5 Section 111.105-5 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) ELECTRICAL ENGINEERING ELECTRIC SYSTEMS-GENERAL REQUIREMENTS Hazardous Locations § 111.105-5 System integrity. In order to maintain system integrity, each...
46 CFR 111.105-5 - System integrity.
Code of Federal Regulations, 2013 CFR
2013-10-01
... 46 Shipping 4 2013-10-01 2013-10-01 false System integrity. 111.105-5 Section 111.105-5 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) ELECTRICAL ENGINEERING ELECTRIC SYSTEMS-GENERAL REQUIREMENTS Hazardous Locations § 111.105-5 System integrity. In order to maintain system integrity, each...
Uncertainties in Earthquake Loss Analysis: A Case Study From Southern California
NASA Astrophysics Data System (ADS)
Mahdyiar, M.; Guin, J.
2005-12-01
Probabilistic earthquake hazard and loss analyses play important roles in many areas of risk management, including earthquake related public policy and insurance ratemaking. Rigorous loss estimation for portfolios of properties is difficult since there are various types of uncertainties in all aspects of modeling and analysis. It is the objective of this study to investigate the sensitivity of earthquake loss estimation to uncertainties in regional seismicity, earthquake source parameters, ground motions, and sites' spatial correlation on typical property portfolios in Southern California. Southern California is an attractive region for such a study because it has a large population concentration exposed to significant levels of seismic hazard. During the last decade, there have been several comprehensive studies of most regional faults and seismogenic sources. There have also been detailed studies on regional ground motion attenuations and regional and local site responses to ground motions. This information has been used by engineering seismologists to conduct regional seismic hazard and risk analysis on a routine basis. However, one of the more difficult tasks in such studies is the proper incorporation of uncertainties in the analysis. From the hazard side, there are uncertainties in the magnitudes, rates and mechanisms of the seismic sources and local site conditions and ground motion site amplifications. From the vulnerability side, there are considerable uncertainties in estimating the state of damage of buildings under different earthquake ground motions. From an analytical side, there are challenges in capturing the spatial correlation of ground motions and building damage, and integrating thousands of loss distribution curves with different degrees of correlation. In this paper we propose to address some of these issues by conducting loss analyses of a typical small portfolio in southern California, taking into consideration various source and ground motion uncertainties. The approach is designed to integrate loss distribution functions with different degrees of correlation for portfolio analysis. The analysis is based on USGS 2002 regional seismicity model.
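The abstract describes integrating spatially correlated ground motions with vulnerability relationships to build portfolio loss distributions. The following is a minimal Monte Carlo sketch of that idea, not the authors' implementation; the site coordinates, insured values, correlation length, and vulnerability curve are all hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical portfolio: site coordinates (km) and insured values (USD)
sites = np.array([[0.0, 0.0], [5.0, 2.0], [20.0, 1.0], [40.0, 15.0]])
values = np.array([2e6, 1.5e6, 3e6, 1e6])

def correlation_matrix(xy, corr_length_km=10.0):
    """Exponential spatial correlation of ground-motion residuals."""
    d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
    return np.exp(-d / corr_length_km)

def simulate_event_losses(median_ln_sa, sigma=0.6, n_sims=10_000):
    """Monte Carlo portfolio losses for one event with correlated residuals."""
    L = np.linalg.cholesky(correlation_matrix(sites))
    eps = rng.standard_normal((n_sims, len(sites))) @ L.T   # correlated N(0,1)
    sa = np.exp(median_ln_sa + sigma * eps)                 # shaking intensity
    # Toy vulnerability: mean damage ratio rises smoothly with shaking
    mdr = 1.0 - np.exp(-1.5 * np.maximum(sa - 0.1, 0.0))
    return (mdr * values).sum(axis=1)                       # portfolio loss per sim

losses = simulate_event_losses(median_ln_sa=np.log(0.3))
print("mean loss:", losses.mean(), "99th percentile:", np.quantile(losses, 0.99))
```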
Functional safety for the Advanced Technology Solar Telescope
NASA Astrophysics Data System (ADS)
Bulau, Scott; Williams, Timothy R.
2012-09-01
Since inception, the Advanced Technology Solar Telescope (ATST) has planned to implement a facility-wide functional safety system to protect personnel from harm and prevent damage to the facility or environment. The ATST will deploy an integrated safety-related control system (SRCS) to achieve functional safety throughout the facility rather than relying on individual facility subsystems to provide safety functions on an ad hoc basis. The Global Interlock System (GIS) is an independent, distributed, facility-wide, safety-related control system, comprised of commercial off-the-shelf (COTS) programmable controllers that monitor, evaluate, and control hazardous energy and conditions throughout the facility that arise during operation and maintenance. The GIS has been designed to utilize recent advances in technology for functional safety plus revised national and international standards that allow for a distributed architecture using programmable controllers over a local area network instead of traditional hard-wired safety functions, while providing an equivalent or even greater level of safety. Programmable controllers provide an ideal platform for controlling the often complex interrelationships between subsystems in a modern astronomical facility, such as the ATST. A large, complex hard-wired relay control system is no longer needed. This type of system also offers greater flexibility during development and integration in addition to providing for expanded capability into the future. The GIS features fault detection, self-diagnostics, and redundant communications that will lead to decreased maintenance time and increased availability of the facility.
Stress and the HPA Axis: Balancing Homeostasis and Fertility
Whirledge, Shannon
2017-01-01
An organism’s reproductive fitness is sensitive to the environment, integrating cues of resource availability, ecological factors, and hazards within its habitat. Events that challenge the environment of an organism activate the central stress response system, which is primarily mediated by the hypothalamic–pituitary–adrenal (HPA) axis. The regulatory functions of the HPA axis govern the cardiovascular and metabolic system, immune functions, behavior, and reproduction. Activation of the HPA axis by various stressors primarily inhibits reproductive function and is able to alter fetal development, imparting a biological record of stress experienced in utero. Clinical studies and experimental data indicate that stress signaling can mediate these effects through direct actions in the brain, gonads, and embryonic tissues. This review focuses on the mechanisms by which stress activation of the HPA axis impacts fertility and fetal development. PMID:29064426
Integrated risk reduction framework to improve railway hazardous materials transportation safety.
Liu, Xiang; Saat, M Rapik; Barkan, Christopher P L
2013-09-15
Rail transportation plays a critical role in the safe and efficient movement of hazardous materials. A number of strategies have been implemented or are being developed to reduce the risk of hazardous materials release from train accidents. Each of these risk reduction strategies has a safety benefit and a corresponding implementation cost. However, the cost effectiveness of integrating different risk reduction strategies is not well understood. Meanwhile, there has been growing interest in the U.S. rail industry and government in how best to allocate resources for improving hazardous materials transportation safety. This paper presents an optimization model that considers the combination of two types of risk reduction strategies, broken rail prevention and tank car safety design enhancement. A Pareto-optimality technique is used to maximize risk reduction at a given level of investment. The framework presented in this paper can be adapted to address a broader set of risk reduction strategies and is intended to assist decision makers with local, regional and system-wide risk management of rail hazardous materials transportation. Copyright © 2013 Elsevier B.V. All rights reserved.
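As a rough illustration of the Pareto-optimality idea described above, the sketch below enumerates combinations of two hypothetical strategy sets (broken rail prevention and tank car design) and keeps the options for which no alternative is both cheaper and lower in residual risk. The cost and risk-reduction figures are invented for illustration and are not from the paper.

```python
from itertools import product

# Hypothetical option sets: (name, cost in $M, fraction of baseline risk remaining)
broken_rail_options = [("none", 0, 1.00), ("basic detection", 20, 0.80), ("intensive", 60, 0.65)]
tank_car_options    = [("legacy car", 0, 1.00), ("thicker shell", 35, 0.75), ("full retrofit", 90, 0.55)]

BASELINE_RISK = 100.0  # expected annual release consequence, arbitrary units

candidates = []
for (rn, rc, rf), (tn, tc, tf) in product(broken_rail_options, tank_car_options):
    cost = rc + tc
    risk = BASELINE_RISK * rf * tf   # assume multiplicative, independent effects
    candidates.append((cost, risk, f"{rn} + {tn}"))

# Keep Pareto-optimal points: no other option is both at least as cheap and lower-risk
pareto = [c for c in candidates
          if not any(o[0] <= c[0] and o[1] < c[1] for o in candidates)]
for cost, risk, label in sorted(pareto):
    print(f"${cost:>3}M -> residual risk {risk:5.1f}  ({label})")
```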
Integrated pest management policies in America's schools: is federal legislation needed?
Taylor, Andrea Kidd; Esdaille, Kyle
2010-01-01
America's school children are at risk of developing asthma and other respiratory illnesses as a result of exposure to hazardous pesticides. Integrated pest management (IPM) policies are being implemented in states and school districts across the country; however, the content and regulation of these policies vary. Standardizing such policies through a federal IPM law is the only way to ensure that children in America's schools are adequately protected from exposure to the hazardous pesticides used to control pests.
NASA Astrophysics Data System (ADS)
Carby, B. E.
2015-12-01
Latin American and Caribbean (LAC) countries face multiple hazards such as earthquakes, volcanoes, accelerated erosion, landslides, drought, flooding, windstorms, and the effects of climate variability and change. World Bank (2005) data indicate that seventeen of the top thirty-five countries with relatively high mortality risk from three or more hazards are located in LAC; El Salvador has the second highest percentage of its population at risk (77.7%); and 7 of the top 10 countries for population exposure to multiple hazards are in LAC. All LAC countries have half or more of GDP exposed to at least one hazard. The report underscores the need for better data and information on hazards and disasters to inform disaster risk reduction (DRR) and supports the view that reduction of disaster risk is essential for achieving Sustainable Development (SD). This suggests that DRR must be integrated into countries' development planning. However, the Global Assessment Report notes that, globally, there has been little progress in mainstreaming DRR in national development (UNISDR 2009). Without this, countries will not realise development goals. DRR efforts in LAC require an integrated approach, including societal input in deciding priority DRR research themes and interdisciplinary, multi-hazard research informing DRR policy and practice. Jiminez (2015), from a study of countries across LAC, reports that efforts are being made to link research to national planning through the inclusion of policy makers in some university-led research projects. Research by the author in Jamaica reveals that the public sector has started to apply research on hazards to inform DRR policy, programmes and plans. As most research is done by universities, there is collaboration between the public sector and academia. Despite differences in scale among countries across the region, similarities in exposure to multiple hazards and potential hazard impacts suggest that collaboration among researchers in LAC could be beneficial. It is proposed here that this collaboration should go beyond the scientific community and should include the sharing of experiences in linking DRR research to national development needs, the inclusion of policy makers in research design and implementation, and the integration of research results in policy and programme development.
NASA Astrophysics Data System (ADS)
Veitinger, Jochen; Purves, Ross Stuart; Sovilla, Betty
2016-10-01
Avalanche hazard assessment requires a very precise estimation of the release area, which still depends, to a large extent, on expert judgement of avalanche specialists. Therefore, a new algorithm for automated identification of potential avalanche release areas was developed. It overcomes some of the limitations of previous tools, which are currently not often applied in hazard mitigation practice. By introducing a multi-scale roughness parameter, fine-scale topography and its attenuation under snow influence is captured. This allows the assessment of snow influence on terrain morphology and, consequently, potential release area size and location. The integration of a wind shelter index enables the user to define release area scenarios as a function of the prevailing wind direction or single storm events. A case study illustrates the practical usefulness of this approach for the definition of release area scenarios under varying snow cover and wind conditions. A validation with historical data demonstrated an improved estimation of avalanche release areas. Our method outperforms a slope-based approach, in particular for more frequent avalanches; however, the application of the algorithm as a forecasting tool remains limited, as snowpack stability is not integrated. Future research activity should therefore focus on the coupling of the algorithm with snowpack conditions.
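A simplified sketch of two of the ingredients named above, multi-scale terrain roughness and its attenuation under a snow cover, is given below. The attenuation law, window sizes, and roughness threshold are placeholders chosen for illustration, not the published parameterisation, and no wind shelter index is included.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def roughness(dem, window):
    """Standard deviation of residual topography within a square window."""
    mean = uniform_filter(dem, size=window)
    sq_mean = uniform_filter(dem**2, size=window)
    return np.sqrt(np.maximum(sq_mean - mean**2, 0.0))

def snow_smoothed_roughness(dem, snow_depth_m, windows=(3, 9, 27)):
    """Multi-scale roughness, attenuated as snow fills fine-scale terrain features.
    The attenuation law here is a placeholder, not the published parameterisation."""
    scores = []
    for w in windows:
        r = roughness(dem, w)
        attenuation = np.exp(-snow_depth_m / (r + 0.1))  # deeper snow -> smoother terrain
        scores.append(r * attenuation)
    return np.mean(scores, axis=0)

# Hypothetical 100 x 100 cell DEM and a uniform 1.5 m snowpack
dem = np.cumsum(np.random.default_rng(1).normal(size=(100, 100)), axis=0)
smooth_rough = snow_smoothed_roughness(dem, snow_depth_m=1.5)
release_candidate = smooth_rough < 0.5   # low residual roughness favours release areas
```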
77 FR 31815 - Hazardous Materials Regulations: Combustible Liquids
Federal Register 2010, 2011, 2012, 2013, 2014
2012-05-30
... are: Safety (hazard communication and packaging integrity); International commerce (frustration/delay... exempt seasonal workers from the Federal Motor Carrier Safety Administration's Commercial Driver's...: Anyone is able to search the electronic form of any written communications and comments received into any...
Probabilistic, Seismically-Induced Landslide Hazard Mapping of Western Oregon
NASA Astrophysics Data System (ADS)
Olsen, M. J.; Sharifi Mood, M.; Gillins, D. T.; Mahalingam, R.
2015-12-01
Earthquake-induced landslides can generate significant damage within urban communities by damaging structures, obstructing lifeline connection routes and utilities, generating various environmental impacts, and possibly resulting in loss of life. Reliable hazard and risk maps are important to assist agencies in efficiently allocating and managing limited resources to prepare for such events. This research presents a new methodology in order to communicate site-specific landslide hazard assessments in a large-scale, regional map. Implementation of the proposed methodology results in seismic-induced landslide hazard maps that depict the probabilities of exceeding landslide displacement thresholds (e.g. 0.1, 0.3, 1.0 and 10 meters). These maps integrate a variety of data sources including: recent landslide inventories, LIDAR and photogrammetric topographic data, geology map, mapped NEHRP site classifications based on available shear wave velocity data in each geologic unit, and USGS probabilistic seismic hazard curves. Soil strength estimates were obtained by evaluating slopes present along landslide scarps and deposits for major geologic units. Code was then developed to integrate these layers to perform a rigid, sliding block analysis to determine the amount and associated probabilities of displacement based on each bin of peak ground acceleration in the seismic hazard curve at each pixel. The methodology was applied to western Oregon, which contains weak, weathered, and often wet soils at steep slopes. Such conditions have a high landslide hazard even without seismic events. A series of landslide hazard maps highlighting the probabilities of exceeding the aforementioned thresholds were generated for the study area. These output maps were then utilized in a performance based design framework enabling them to be analyzed in conjunction with other hazards for fully probabilistic-based hazard evaluation and risk assessment.
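A stripped-down sketch of the pixel-level computation described above (rigid sliding block displacement evaluated against each peak ground acceleration bin of a hazard curve, then converted to exceedance probabilities) is shown below. The hazard curve, slope parameters, and especially the displacement model are placeholders; a published empirical Newmark-style regression would be used in practice.

```python
import numpy as np

# Hypothetical hazard curve at one pixel: PGA bins (g) and annual occurrence rates
pga_bins    = np.array([0.1, 0.2, 0.3, 0.4, 0.6, 0.8])
annual_rate = np.array([2e-2, 8e-3, 3e-3, 1e-3, 3e-4, 8e-5])

def critical_acceleration(factor_of_safety, slope_deg):
    """Newmark critical acceleration (g) of a rigid sliding block."""
    return (factor_of_safety - 1.0) * np.sin(np.radians(slope_deg))

def displacement_cm(pga, ac):
    """Placeholder displacement model: grows with the ratio pga/ac.
    A published empirical regression would be substituted in practice."""
    ratio = np.maximum(pga / ac - 1.0, 0.0)
    return 15.0 * ratio**2

ac = critical_acceleration(factor_of_safety=1.3, slope_deg=25.0)
disp = displacement_cm(pga_bins, ac)

thresholds_m = [0.1, 0.3, 1.0, 10.0]
exposure_years = 50
for t in thresholds_m:
    rate = annual_rate[disp >= t * 100.0].sum()      # annual rate of exceedance
    prob = 1.0 - np.exp(-rate * exposure_years)      # Poisson, 50-year window
    print(f"P(displacement > {t} m in {exposure_years} yr) = {prob:.3f}")
```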
United States-Chile binational exchange for volcanic risk reduction, 2015—Activities and benefits
Pierson, Thomas C.; Mangan, Margaret T.; Lara Pulgar, Luis E.; Ramos Amigo, Álvaro
2017-07-25
In 2015, representatives from the United States and Chile exchanged visits to discuss and share their expertise and experiences dealing with volcano hazards. Communities in both countries are at risk from various volcano hazards. Risks to lives and property posed by these hazards are a function not only of the type and size of future eruptions but also of distances from volcanoes, structural integrity of volcanic edifices, landscape changes imposed by recent past eruptions, exposure of people and resources to harm, and any mitigative measures taken (or not taken) to reduce risk. Thus, effective risk-reduction efforts require the knowledge and consideration of many factors, and firsthand experience with past volcano crises provides a tremendous advantage for this work. However, most scientists monitoring volcanoes and most officials delegated with the responsibility for emergency response and management in volcanic areas have little or no firsthand experience with eruptions or volcano hazards. The reality is that eruptions are infrequent in most regions, and individual volcanoes may have dormant periods lasting hundreds to thousands of years. Knowledge may be lacking about how to best plan for and manage future volcanic crises, and much can be learned from the sharing of insights and experiences among counterpart specialists who have had direct, recent, or different experiences in dealing with restless volcanoes and threatened populations. The sharing of information and best practices can help all volcano scientists and officials to better prepare for future eruptions or noneruptive volcano hazards, such as large volcanic mudflows (lahars), which could affect their communities.
Morpheus: Advancing Technologies for Human Exploration
NASA Technical Reports Server (NTRS)
Olansen, Jon B.; Munday, Stephen R.; Mitchell, Jennifer D.; Baine, Michael
2012-01-01
NASA's Morpheus Project has developed and tested a prototype planetary lander capable of vertical takeoff and landing. Designed to serve as a vertical testbed (VTB) for advanced spacecraft technologies, the vehicle provides a platform for bringing technologies from the laboratory into an integrated flight system at relatively low cost. This allows individual technologies to mature into capabilities that can be incorporated into human exploration missions. The Morpheus vehicle is propelled by a LOX/Methane engine and sized to carry a payload of 1100 lb to the lunar surface. In addition to VTB vehicles, the Project's major elements include ground support systems and an operations facility. Initial testing will demonstrate technologies used to perform autonomous hazard avoidance and precision landing on a lunar or other planetary surface. The Morpheus vehicle successfully performed a set of integrated vehicle test flights including hot-fire and tethered hover tests, leading up to un-tethered free-flights. The initial phase of this development and testing campaign is being conducted on-site at the Johnson Space Center (JSC), with the first fully integrated vehicle firing its engine less than one year after project initiation. Designed, developed, manufactured and operated in-house by engineers at JSC, the Morpheus Project represents an unprecedented departure from recent NASA programs that traditionally require longer, more expensive development lifecycles and testing at remote, dedicated testing facilities. Morpheus testing includes three major types of integrated tests. A hot-fire (HF) is a static vehicle test of the LOX/Methane propulsion system. Tether tests (TT) have the vehicle suspended above the ground using a crane, which allows testing of the propulsion and integrated Guidance, Navigation, and Control (GN&C) in hovering flight without the risk of a vehicle departure or crash. Morpheus free-flights (FF) test the complete Morpheus system without the additional safeguards provided during tether. A variety of free-flight trajectories are planned to incrementally build up to a fully functional Morpheus lander capable of flying planetary landing trajectories. In FY12, these tests will culminate with autonomous flights simulating a 1 km lunar approach trajectory, hazard avoidance maneuvers and precision landing in a prepared hazard field at the Kennedy Space Center (KSC). This paper describes the Morpheus integrated testing campaign, infrastructure, and facilities, and the payloads being incorporated on the vehicle. The Project's fast pace, rapid prototyping, frequent testing, and lessons learned depart from traditional engineering development at JSC. The Morpheus team employs lean, agile development with a guiding belief that technologies offer promise, but capabilities offer solutions, achievable without astronomical costs and timelines.
Evaluation of Prototype Head Shield for Hazardous Material Tank Car
DOT National Transportation Integrated Search
1976-12-01
The structural integrity of a prototype tank car head shield for hazardous material railroad tank cars was evaluated under conditions of freight car coupling at moderate to high speeds. This is one of the most severe environments encountered in norma...
Using the Triad Approach to Improve the Cost-effectiveness of Hazardous Waste Site Cleanups
U.S. EPA's Office of Solid Waste and Emergency Response is promoting more effective strategies for characterizing, monitoring, and cleaning up hazardous waste sites. In particular, a paradigm based on using an integrated triad of systematic planning...
Integrated Risk Assessment to Natural Hazards in Motozintla, Chiapas, Mexico
NASA Astrophysics Data System (ADS)
Novelo-Casanova, D. A.
2012-12-01
An integrated risk assessment includes the analysis of all components of individual constituents of risk such as baseline study, hazard identification and categorization, hazard exposure, and vulnerability. Vulnerability refers to the inability of people, organizations, and societies to withstand adverse impacts from multiple stressors to which they are exposed. These impacts are due to characteristics inherent in social interactions, institutions, and systems of cultural values. Thus, social vulnerability is a pre-existing condition that affects a society's ability to prepare for and recover from a disruptive event. Risk is the probability of a loss, and this loss depends on three elements: hazard, exposure, and vulnerability. Thus, risk is the estimated impact that a hazard event would have on people, services, facilities, structures and assets in a community. In this work we assess the risk to natural hazards in the community of Motozintla, located in southern Mexico in the state of Chiapas (15.37N, 92.25W), with a population of about 20,000 inhabitants. Due to its geographical and geological location, this community is continuously exposed to many different natural hazards (earthquakes, landslides, volcanic eruptions, and floods). To determine the level of exposure of the community to natural hazards, we developed integrated studies and analyses of seismic microzonation, landslide and flood susceptibility, as well as volcanic impact, using standard methodologies. Social vulnerability was quantified from data obtained from interviews with local families. Five variables were considered: household structure quality and design, availability of basic public services, family economic conditions, existing family plans for disaster preparedness, and risk perception. The number of families surveyed was determined so as to obtain a statistically significant sample. The families interviewed were selected using the simple random sampling technique with replacement. With this procedure, each household was chosen randomly and entirely by chance, with the same probability of being chosen at any stage of the sampling process. To facilitate interpretation, all results were spatially analyzed using a Geographical Information System (GIS). Our results indicate that the community of Motozintla is highly exposed to floods, landslides and earthquakes, and to a lesser extent to the impact of a volcanic eruption. The locality has a high level of structural vulnerability to the main identified hazards (floods and landslides). About 70% of the families have a daily income below 11 USD. Approximately 66% of the population does not know of any existing Civil Protection Plan. Another major observation is that community organization for disaster prevention is practically nonexistent. These natural and social conditions indicate that the community of Motozintla has a very high level of risk to natural hazards. This research will support decision makers in Mexico, and particularly in the state of Chiapas, in the development of an integrated, comprehensive natural hazards mitigation and prevention program in this region.
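The survey design described above hinges on drawing a statistically significant sample of households. As one hedged illustration of how such a sample size is commonly determined, the snippet below applies the standard formula for estimating a proportion with a finite-population correction; the population size, confidence level, and margin of error are hypothetical, not values from the study.

```python
import math

def sample_size_proportion(population, confidence_z=1.96, margin=0.05, p=0.5):
    """Sample size for estimating a proportion, with finite-population correction."""
    n0 = (confidence_z**2 * p * (1.0 - p)) / margin**2
    return math.ceil(n0 / (1.0 + (n0 - 1.0) / population))

# Hypothetical: ~4,000 households, 95% confidence, 5% margin of error
print(sample_size_proportion(4000))   # about 351 households
```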
Targeted gene insertion for molecular medicine.
Voigt, Katrin; Izsvák, Zsuzsanna; Ivics, Zoltán
2008-11-01
Genomic insertion of a functional gene together with suitable transcriptional regulatory elements is often required for long-term therapeutical benefit in gene therapy for several genetic diseases. A variety of integrating vectors for gene delivery exist. Some of them exhibit random genomic integration, whereas others have integration preferences based on attributes of the targeted site, such as primary DNA sequence and physical structure of the DNA, or through tethering to certain DNA sequences by host-encoded cellular factors. Uncontrolled genomic insertion bears the risk of the transgene being silenced due to chromosomal position effects, and can lead to genotoxic effects due to mutagenesis of cellular genes. None of the vector systems currently used in either preclinical experiments or clinical trials displays sufficient preferences for target DNA sequences that would ensure appropriate and reliable expression of the transgene and simultaneously prevent hazardous side effects. We review in this paper the advantages and disadvantages of both viral and non-viral gene delivery technologies, discuss mechanisms of target site selection of integrating genetic elements (viruses and transposons), and suggest distinct molecular strategies for targeted gene delivery.
Addressing Unison and Uniqueness of Reliability and Safety for Better Integration
NASA Technical Reports Server (NTRS)
Huang, Zhaofeng; Safie, Fayssal
2015-01-01
For a long time, both in theory and in practice, safety and reliability have not been clearly differentiated, which leads to confusion, inefficiency, and sometimes counter-productive practices in executing each of these two disciplines. It is imperative to address the uniqueness and the unison of these two disciplines to help both disciplines become more effective and to promote a better integration of the two for enhancing safety and reliability in our products as an overall objective. There are two purposes of this paper. First, it will investigate the uniqueness and unison of each discipline and discuss the interrelationship between the two for awareness and clarification. Second, after clearly understanding the unique roles and interrelationship between the two in a product design and development life cycle, we offer suggestions to enhance the disciplines with distinguished and focused roles, to better integrate the two, and to improve unique sets of skills and tools of reliability and safety processes. From the uniqueness aspect, the paper identifies and discusses the respective uniqueness of reliability and safety from their roles, accountability, nature of requirements, technical scopes, detailed technical approaches, and analysis boundaries. It is misleading to equate unreliable to unsafe, since a safety hazard may or may not be related to the component, sub-system, or system functions, which are primarily what reliability addresses. Similarly, failing-to-function may or may not lead to hazard events. Examples will be given in the paper from aerospace, defense, and consumer products to illustrate the uniqueness and differences between reliability and safety. From the unison aspect, the paper discusses what the commonalities between reliability and safety are, and how these two disciplines are linked, integrated, and supplemented with each other to accomplish the customer requirements and product goals. In addition to understanding the uniqueness in reliability and safety, a better understanding of unison and commonalities will further help in understanding the interaction between reliability and safety. This paper discusses the unison and uniqueness of reliability and safety. It presents some suggestions for better integration of the two disciplines in terms of technical approaches, tools, techniques, and skills to enhance the role of reliability and safety in supporting a product design and development life cycle. The paper also discusses eliminating the redundant effort and minimizing the overlap of reliability and safety analyses for an efficient implementation of the two disciplines.
Ground subsidence information as a valuable layer in GIS analysis
NASA Astrophysics Data System (ADS)
Murdzek, Radosław; Malik, Hubert; Leśniak, Andrzej
2018-04-01
Among the technologies used to improve the functioning of local governments, geographic information systems (GIS) are widely used. GIS tools make it possible to simultaneously integrate spatial data resources, analyse them, process them, and use them to make strategic decisions. Nowadays, GIS analysis is widely used in spatial planning and environmental protection. These applications draw on a large amount of spatial information, but rarely on information about environmental hazards. This paper incorporates information about ground subsidence that occurred in the USCB mining area into a GIS analysis. Monitoring of this phenomenon can be carried out using the differential radar interferometry (DInSAR) method.
Integrated Safety Analysis Teams
NASA Technical Reports Server (NTRS)
Wetherholt, Jonathan C.
2008-01-01
Today's complex systems require understanding beyond one person's capability to comprehend. Each system requires a team to divide the system into understandable subsystems which can then be analyzed with an Integrated Hazard Analysis. The team must have both specific experiences and diversity of experience. Safety experience and system understanding are not always manifested in one individual. Group dynamics make the difference between success and failure as well as the difference between a difficult task and a rewarding experience. There are examples in the news which demonstrate the need to connect the pieces of a system into a complete picture. The Columbia disaster is now a standard example of how a low-consequence hazard in one part of the system, the External Tank, can be a catastrophic hazard cause for a companion subsystem, the Space Shuttle Orbiter. The interaction between the hardware, the manufacturing process, the handling, and the operations contributed to the problem. Each of these had analysis performed, but who constituted the team which integrated this analysis together? This paper will explore some of the methods used for dividing up a complex system and how one integration team has analyzed the parts. How this analysis has been documented in one particular launch space vehicle case will also be discussed.
Neuropsychological Correlates of Hazard Perception in Older Adults.
McInerney, Katalina; Suhr, Julie
2016-03-01
Hazard perception, the ability to identify and react to hazards while driving, is of growing importance in driving research, given its strong relationship to real word driving variables. Furthermore, although poor hazard perception is associated with novice drivers, recent research suggests that it declines with advanced age. In the present study, we examined the neuropsychological correlates of hazard perception in a healthy older adult sample. A total of 68 adults age 60 and older who showed no signs of dementia and were active drivers completed a battery of neuropsychological tests as well as a hazard perception task. Tests included the Repeatable Battery for the Assessment of Neuropsychological Status, Wechsler Test of Adult Reading, Trail Making Test, Block Design, Useful Field of View, and the Delis-Kaplan Executive Function System Color Word Interference Test. Hazard perception errors were related to visuospatial/constructional skills, processing speed, memory, and executive functioning skills, with a battery of tests across these domains accounting for 36.7% of the variance in hazard perception errors. Executive functioning, particularly Trail Making Test part B, emerged as a strong predictor of hazard perception ability. Consistent with prior work showing the relationship of neuropsychological performance to other measures of driving ability, neuropsychological performance was associated with hazard perception skill. Future studies should examine the relationship of neuropsychological changes in adults who are showing driving impairment and/or cognitive changes associated with Mild Cognitive Impairment or dementia.
NASA Technical Reports Server (NTRS)
Bishop, Robert H.; DeMars, Kyle; Trawny, Nikolas; Crain, Tim; Hanak, Chad; Carson, John M.; Christian, John
2016-01-01
The navigation filter architecture successfully deployed on the Morpheus flight vehicle is presented. The filter was developed as a key element of the NASA Autonomous Landing and Hazard Avoidance Technology (ALHAT) project and over the course of 15 free flights was integrated into the Morpheus vehicle, operations, and flight control loop. Flight testing concluded with demonstrations of autonomous hazard detection and avoidance, integration of altimeter, surface relative velocity (velocimeter) and hazard relative navigation (HRN) measurements into the onboard dual-state inertial estimator Kalman filter software, and landing within 2 meters of the vertical testbed GPS-based navigation solution at the safe landing site target. Morpheus followed a trajectory that included an ascent phase followed by a partial descent-to-landing, although the proposed filter architecture is applicable to more general planetary precision entry, descent, and landings. The main new contribution is the incorporation of a sophisticated hazard relative navigation sensor (originally intended to locate safe landing sites) into the navigation system and its employment as a navigation sensor. The formulation of a dual-state inertial extended Kalman filter was designed to address the precision planetary landing problem when viewed as a rendezvous problem with an intended landing site. For the required precision navigation system that is capable of navigating along a descent-to-landing trajectory to a precise landing, the impact of attitude errors on the translational state estimation is included in a fully integrated navigation structure in which translation state estimation is combined with attitude state estimation. The map tie errors are estimated as part of the process, thereby creating a dual-state filter implementation. Also, the filter is implemented using inertial states rather than states relative to the target. External measurements include altimeter, velocimeter, star camera, terrain relative navigation sensor, and a hazard relative navigation sensor providing information regarding hazards on a map generated on-the-fly.
Optogenetic Random Mutagenesis Using Histone-miniSOG in C. elegans.
Noma, Kentaro; Jin, Yishi
2016-11-14
Forward genetic screening in model organisms is the workhorse to discover functionally important genes and pathways in many biological processes. In most mutagenesis-based screens, researchers have relied on the use of toxic chemicals, carcinogens, or irradiation, which requires designated equipment, safety setup, and/or disposal of hazardous materials. We have developed a simple approach to induce heritable mutations in C. elegans using germline-expressed histone-miniSOG, a light-inducible potent generator of reactive oxygen species. This mutagenesis method is free of toxic chemicals and requires minimal laboratory safety and waste management. The induced DNA modifications include single-nucleotide changes and small deletions, and complement those caused by classical chemical mutagenesis. This methodology can also be used to induce integration of extrachromosomal transgenes. Here, we provide the details of the LED setup and protocols for standard mutagenesis and transgene integration.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-04-25
... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration [Docket ID... Management Programs AGENCY: Pipeline and Hazardous Materials Safety Administration (PHMSA), DOT. ACTION... Nation's gas distribution pipeline systems through development of inspection methods and guidance for the...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-12-05
... DEPARTMENT OF TRANSPORTATION Pipeline and Hazardous Materials Safety Administration [Docket No... Evaluations AGENCY: Pipeline and Hazardous Materials Safety Administration (PHMSA), DOT. ACTION: Notice... improve performance. For gas transmission pipelines, Sec. Sec. 192.911(i) and 192.945 define the...
40 CFR 63.7800 - What are my operation and maintenance requirements?
Code of Federal Regulations, 2010 CFR
2010-07-01
... prepare and operate at all times according to a written operation and maintenance plan for each capture... PROGRAMS (CONTINUED) NATIONAL EMISSION STANDARDS FOR HAZARDOUS AIR POLLUTANTS FOR SOURCE CATEGORIES (CONTINUED) National Emission Standards for Hazardous Air Pollutants for Integrated Iron and Steel...
40 CFR 63.7800 - What are my operation and maintenance requirements?
Code of Federal Regulations, 2011 CFR
2011-07-01
... prepare and operate at all times according to a written operation and maintenance plan for each capture... PROGRAMS (CONTINUED) NATIONAL EMISSION STANDARDS FOR HAZARDOUS AIR POLLUTANTS FOR SOURCE CATEGORIES (CONTINUED) National Emission Standards for Hazardous Air Pollutants for Integrated Iron and Steel...
40 CFR 63.7800 - What are my operation and maintenance requirements?
Code of Federal Regulations, 2012 CFR
2012-07-01
... prepare and operate at all times according to a written operation and maintenance plan for each capture... PROGRAMS (CONTINUED) NATIONAL EMISSION STANDARDS FOR HAZARDOUS AIR POLLUTANTS FOR SOURCE CATEGORIES (CONTINUED) National Emission Standards for Hazardous Air Pollutants for Integrated Iron and Steel...
40 CFR 63.7800 - What are my operation and maintenance requirements?
Code of Federal Regulations, 2013 CFR
2013-07-01
... prepare and operate at all times according to a written operation and maintenance plan for each capture... PROGRAMS (CONTINUED) NATIONAL EMISSION STANDARDS FOR HAZARDOUS AIR POLLUTANTS FOR SOURCE CATEGORIES (CONTINUED) National Emission Standards for Hazardous Air Pollutants for Integrated Iron and Steel...
40 CFR 63.7800 - What are my operation and maintenance requirements?
Code of Federal Regulations, 2014 CFR
2014-07-01
... prepare and operate at all times according to a written operation and maintenance plan for each capture... PROGRAMS (CONTINUED) NATIONAL EMISSION STANDARDS FOR HAZARDOUS AIR POLLUTANTS FOR SOURCE CATEGORIES (CONTINUED) National Emission Standards for Hazardous Air Pollutants for Integrated Iron and Steel...
Traditionally, human health risk assessments have relied on qualitative approaches for hazard identification, often using the Hill criteria and weight of evidence determinations to integrate data from multiple studies. Recently, the National Research Council has recommended the ...
DOT National Transportation Integrated Search
2008-12-31
Integrity, robustness, reliability, and resiliency of infrastructure networks are vital to the economy, security and well-being of any country. Faced with threats caused by natural and man-made hazards, transportation infrastructure network manag...
Cheung, Weng-Fong; Lin, Tzu-Hsuan; Lin, Yu-Cheng
2018-02-02
In recent years, many studies have focused on the application of advanced technology to improve construction safety management. A Wireless Sensor Network (WSN), one of the key technologies in Internet of Things (IoT) development, enables objects and devices to sense and communicate environmental conditions; Building Information Modeling (BIM), a revolutionary technology in construction, integrates a database and geometry into a digital model that provides a visualized means of management across the whole construction lifecycle. This paper integrates BIM and WSN into a unified system which enables a construction site to visually monitor safety status via a spatial, colored interface and to remove hazardous gas automatically. Wireless sensor nodes were placed throughout an underground construction site to collect hazardous gas levels and environmental conditions (temperature and humidity); for any region where an abnormal status is detected, the BIM model highlights the region, and an on-site alarm and ventilator start automatically to warn workers and remove the hazard. The proposed system can greatly enhance the efficiency of construction safety management and provide important reference information for rescue tasks. Finally, a case study demonstrates the applicability of the proposed system, and the practical benefits, limitations, conclusions, and suggestions are summarized for further applications.
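A minimal sketch of the monitoring loop implied above (read a sensor node, classify the zone, and trigger the alarm, ventilator, and BIM highlighting) is given below. The class names, thresholds, and zone identifiers are hypothetical and stand in for the authors' system, which would call real device and BIM interfaces instead of printing.

```python
from dataclasses import dataclass

@dataclass
class SensorReading:
    zone_id: str          # maps to a BIM region/room element
    co_ppm: float         # carbon monoxide level
    temperature_c: float
    humidity_pct: float

# Hypothetical alert thresholds for an underground construction site
CO_ALERT_PPM = 35.0
TEMP_ALERT_C = 38.0

def classify(reading: SensorReading) -> str:
    if reading.co_ppm >= CO_ALERT_PPM:
        return "danger"
    if reading.temperature_c >= TEMP_ALERT_C or reading.humidity_pct >= 95.0:
        return "caution"
    return "normal"

def handle(reading: SensorReading) -> None:
    status = classify(reading)
    if status == "danger":
        # In a deployed system these would be calls to the alarm, the
        # ventilator controller, and the BIM viewer colouring the zone red.
        print(f"[ALARM] zone {reading.zone_id}: CO {reading.co_ppm} ppm -> start ventilator")
    elif status == "caution":
        print(f"[WARN] zone {reading.zone_id}: check environmental conditions")

handle(SensorReading("B2-corridor-07", co_ppm=48.0, temperature_c=29.5, humidity_pct=80.0))
```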
Hazard Detection Software for Lunar Landing
NASA Technical Reports Server (NTRS)
Huertas, Andres; Johnson, Andrew E.; Werner, Robert A.; Montgomery, James F.
2011-01-01
The Autonomous Landing and Hazard Avoidance Technology (ALHAT) Project is developing a system for safe and precise manned lunar landing that involves novel sensors, but also specific algorithms. ALHAT has selected imaging LIDAR (light detection and ranging) as the sensing modality for onboard hazard detection because imaging LIDARs can rapidly generate direct measurements of the lunar surface elevation from high altitude. Then, starting with the LIDAR-based Hazard Detection and Avoidance (HDA) algorithm developed for Mars Landing, JPL has developed a mature set of HDA software for the manned lunar landing problem. Landing hazards exist everywhere on the Moon, and many of the more desirable landing sites are near the most hazardous terrain, so HDA is needed to autonomously and safely land payloads over much of the lunar surface. The HDA requirements used in the ALHAT project are to detect hazards that are 0.3 m tall or higher and slopes that are 5° or greater. Steep slopes, rocks, cliffs, and gullies are all hazards for landing and, by computing the local slope and roughness in an elevation map, all of these hazards can be detected. The algorithm in this innovation is used to measure slope and roughness hazards. In addition to detecting these hazards, the HDA capability also is able to find a safe landing site free of these hazards for a lunar lander with a diameter of 15 m over most of the lunar surface. This software includes an implementation of the HDA algorithm, software for generating simulated lunar terrain maps for testing, hazard detection performance analysis tools, and associated documentation. The HDA software has been deployed to Langley Research Center and integrated into the POST II Monte Carlo simulation environment. The high-fidelity Monte Carlo simulations determine the required ground spacing between LIDAR samples (ground sample distances) and the noise on the LIDAR range measurement. This simulation has also been used to determine the effect of viewing on hazard detection performance. The software has also been deployed to Johnson Space Center and integrated into the ALHAT real-time Hardware-in-the-Loop testbed.
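A simplified version of the slope-and-roughness screening described above is sketched below: a plane is fit to each local window of the elevation map, the plane's tilt gives the slope, and the largest residual gives the roughness, which are then compared against the stated 5° and 0.3 m thresholds. This is an illustration under simplifying assumptions, not the ALHAT HDA code; the window size, cell size, and test terrain are hypothetical.

```python
import numpy as np

def slope_and_roughness(elev, cell_size, window=5):
    """Fit a plane to each window of a DEM; return slope (deg) and roughness (m).
    Roughness is the maximum residual from the fitted plane, so a 0.3 m rock or
    hole shows up directly against the 0.3 m hazard threshold."""
    half = window // 2
    ny, nx = elev.shape
    yy, xx = np.mgrid[-half:half + 1, -half:half + 1]
    A = np.column_stack([xx.ravel() * cell_size, yy.ravel() * cell_size,
                         np.ones(window * window)])
    slope = np.full(elev.shape, np.nan)
    rough = np.full(elev.shape, np.nan)
    for i in range(half, ny - half):
        for j in range(half, nx - half):
            z = elev[i - half:i + half + 1, j - half:j + half + 1].ravel()
            coeff, *_ = np.linalg.lstsq(A, z, rcond=None)
            slope[i, j] = np.degrees(np.arctan(np.hypot(coeff[0], coeff[1])))
            rough[i, j] = np.max(np.abs(z - A @ coeff))
    return slope, rough

# Hypothetical 0.1 m/pixel elevation map containing a small boulder
dem = np.zeros((40, 40))
dem[20:22, 20:22] = 0.4
slope, rough = slope_and_roughness(dem, cell_size=0.1)
hazard_map = (slope > 5.0) | (rough > 0.3)   # cells flagged as landing hazards
```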
An Integrative Research Framework to Unravel the Interplay of Natural Hazards and Vulnerabilities
NASA Astrophysics Data System (ADS)
Di Baldassarre, Giuliano; Nohrstedt, Daniel; Mârd, Johanna; Burchardt, Steffi; Albin, Cecilia; Bondesson, Sara; Breinl, Korbinian; Deegan, Frances M.; Fuentes, Diana; Lopez, Marc Girons; Granberg, Mikael; Nyberg, Lars; Nyman, Monika Rydstedt; Rhodes, Emma; Troll, Valentin; Young, Stephanie; Walch, Colin; Parker, Charles F.
2018-03-01
Climate change, globalization, urbanization, social isolation, and increased interconnectedness between physical, human, and technological systems pose major challenges to disaster risk reduction (DRR). Subsequently, economic losses caused by natural hazards are increasing in many regions of the world, despite scientific progress, persistent policy action, and international cooperation. We argue that these dramatic figures call for novel scientific approaches and new types of data collection to integrate the two main approaches that still dominate the science underpinning DRR: the hazard paradigm and the vulnerability paradigm. Building from these two approaches, here we propose a research framework that specifies the scope of enquiry, concepts, and general relations among phenomena. We then discuss the essential steps to advance systematic empirical research and evidence-based DRR policy action.
Bruce, Martha L; Lohman, Matthew C; Greenberg, Rebecca L; Bao, Yuhua; Raue, Patrick J
2016-11-01
To determine whether a depression care management intervention in Medicare home health recipients decreases risk of hospitalization. Cluster-randomized trial. Nurse teams were randomized to intervention (12 teams) or enhanced usual care (EUC; 9 teams). Six home health agencies from distinct geographic regions. Home health recipients were interviewed at home and over the telephone. Individuals aged 65 and older who screened positive for depression on nurse assessments (N = 755) and a subset who consented to interviews (n = 306). The Depression CARE for PATients at Home (CAREPATH) guides nurses in managing depression during routine home visits. Clinical functions include weekly symptom assessment, medication management, care coordination, patient education, and goal setting. Researchers conducted telephone conferences with team supervisors every 2 weeks. Hospitalization while receiving home health services was assessed using data from the home health record. Hospitalization within 30 days of starting home health, regardless of how long recipients received home health services, was assessed using data from the home care record and research assessments. The relative hazard of being admitted to the hospital directly from home health was 35% lower within 30 days of starting home health care (hazard ratio (HR) = 0.65, P = .01) and 28% lower within 60 days (HR = 0.72, P = .03) for CAREPATH participants than for participants receiving EUC. In participants referred to home health directly from the hospital, the relative hazard of being rehospitalized was approximately 55% lower (HR = 0.45, P = .001) for CAREPATH participants. Integrating CAREPATH depression care management into routine nursing practice reduces hospitalization and rehospitalization risk in older adults receiving Medicare home health nursing services. © 2016, Copyright the Authors Journal compilation © 2016, The American Geriatrics Society.
A model for assessing the systemic vulnerability in landslide prone areas
NASA Astrophysics Data System (ADS)
Pascale, S.; Sdao, F.; Sole, A.
2010-07-01
The objectives of spatial planning should include the definition and assessment of possible mitigation strategies regarding the effects of natural hazards on the surrounding territory. Unfortunately, however, there is often a lack of adequate tools to provide necessary support to the local bodies responsible for land management. This paper deals with the conception, development and validation of an integrated numerical model for assessing systemic vulnerability in complex and urbanized landslide-prone areas. The proposed model considers this vulnerability not as a characteristic of a particular element at risk, but as a peculiarity of a complex territorial system in which the elements are reciprocally linked in a functional way. It is an index of the tendency of a given territorial element to suffer damage (usually of a functional kind) due to its interconnections with other elements of the same territorial system. The innovative nature of this work also lies in the formalization of a procedure based on a network of influences for an adequate assessment of such "systemic" vulnerability. This approach can be used to obtain information which is useful, in any given situation of a territory hit by a landslide event, for identifying the element which has suffered the most functional damage, i.e. the most "critical" element, and the element which has the greatest repercussions on other elements of the system, and thus a "decisive" role in the management of the emergency. This model was developed within a GIS system through the following phases: 1. the topological characterization of the territorial system studied and the assessment of the scenarios in terms of spatial landslide hazard, for which a statistical method based on neural networks was proposed; 2. the analysis of the direct consequences of a scenario event on the system; 3. the definition of the assessment model of systemic vulnerability in landslide-prone areas. To highlight the potentialities of the proposed approach, we have described a specific case study of landslide hazard in the local council area of Potenza.
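The "network of influences" idea can be pictured with a small directed graph in which functional damage propagates along weighted dependency edges, so that the most "critical" and most "decisive" elements can be ranked. The sketch below is a minimal illustration of that notion, not the authors' formalization; the elements, weights, and propagation rule are invented for the example.

```python
import networkx as nx

# Hypothetical territorial system: a directed edge means "damage to A degrades B",
# with weights giving the strength of the functional dependency (0-1).
G = nx.DiGraph()
G.add_weighted_edges_from([
    ("bridge", "hospital", 0.8),
    ("bridge", "school", 0.5),
    ("power substation", "hospital", 0.9),
    ("power substation", "water plant", 0.7),
    ("water plant", "hospital", 0.4),
])

def propagate(direct_damage, graph, rounds=3):
    """Spread functional damage along influence edges (saturating at 1.0)."""
    total = dict(direct_damage)
    for _ in range(rounds):
        for u, v, w in graph.edges(data="weight"):
            induced = total.get(u, 0.0) * w
            total[v] = min(1.0, max(total.get(v, 0.0), induced))
    return total

# Scenario: a landslide directly damages the bridge and the substation
damage = propagate({"bridge": 0.9, "power substation": 0.6}, G)
most_critical = max(damage, key=damage.get)   # element with the highest damage level
most_decisive = max(G.nodes, key=lambda n: G.out_degree(n, weight="weight"))
print(damage, most_critical, most_decisive)
```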
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brown, Nathanael J. K.; Gearhart, Jared Lee; Jones, Dean A.
Currently, much of protection planning is conducted separately for each infrastructure and hazard. Limited funding requires a balance of expenditures between terrorism and natural hazards based on potential impacts. This report documents the results of a Laboratory Directed Research & Development (LDRD) project that created a modeling framework for investment planning in interdependent infrastructures focused on multiple hazards, including terrorism. To develop this framework, three modeling elements were integrated: natural hazards, terrorism, and interdependent infrastructures. For natural hazards, a methodology was created for specifying events consistent with regional hazards. For terrorism, we modeled the terrorists' actions based on assumptions regarding their knowledge, goals, and target identification strategy. For infrastructures, we focused on predicting post-event performance due to specific terrorist attacks and natural hazard events, tempered by appropriate infrastructure investments. We demonstrate the utility of this framework with various examples, including protection of electric power, roadway, and hospital networks.
ROBOTICS IN HAZARDOUS ENVIRONMENTS - REAL DEPLOYMENTS BY THE SAVANNAH RIVER NATIONAL LABORATORY
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kriikku, E.; Tibrea, S.; Nance, T.
The Research & Development Engineering (R&DE) section in the Savannah River National Laboratory (SRNL) engineers, integrates, tests, and supports deployment of custom robotics, systems, and tools for use in radioactive, hazardous, or inaccessible environments. Mechanical and electrical engineers, computer control professionals, specialists, machinists, welders, electricians, and mechanics adapt and integrate commercially available technology with in-house designs, to meet the needs of Savannah River Site (SRS), Department of Energy (DOE), and other governmental agency customers. This paper discusses five R&DE robotic and remote system projects.
75 FR 27504 - Substantial Product Hazard List: Hand-Held Hair Dryers
Federal Register 2010, 2011, 2012, 2013, 2014
2010-05-17
...The Consumer Product Safety Improvement Act of 2008 (``CPSIA''), authorizes the United States Consumer Product Safety Commission (``Commission'') to specify, by rule, for any consumer product or class of consumer products, characteristics whose existence or absence shall be deemed a substantial product hazard under certain circumstances. In this document, the Commission is proposing a rule to determine that any hand-held hair dryer without integral immersion protection presents a substantial product hazard.
Lunar mission safety and rescue: Hazards analysis and safety requirements
NASA Technical Reports Server (NTRS)
1971-01-01
The results are presented of the hazards analysis which was concerned only with hazards to personnel and not with loss of equipment or property. Hazards characterization includes the definition of a hazard, the hazard levels, and the hazard groups. The analysis methodology is described in detail. The methodology was used to prepare the top level functional flow diagrams, to perform the first level hazards assessment, and to develop a list of conditions and situations requiring individual hazard studies. The 39 individual hazard study results are presented in total.
Code of Federal Regulations, 2010 CFR
2010-07-01
... Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) NATIONAL EMISSION STANDARDS FOR HAZARDOUS AIR POLLUTANTS FOR SOURCE CATEGORIES (CONTINUED) National Emission Standards for Hazardous Air Pollutants for Integrated Iron and Steel Manufacturing Facilities Initial...
Code of Federal Regulations, 2011 CFR
2011-07-01
... Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) NATIONAL EMISSION STANDARDS FOR HAZARDOUS AIR POLLUTANTS FOR SOURCE CATEGORIES (CONTINUED) National Emission Standards for Hazardous Air Pollutants for Integrated Iron and Steel Manufacturing Facilities Initial...
Klise, Katherine A.; Bynum, Michael; Moriarty, Dylan; ...
2017-07-07
Water utilities are vulnerable to a wide variety of human-caused and natural disasters. The Water Network Tool for Resilience (WNTR) is a new open source Python™ package designed to help water utilities investigate the resilience of water distribution systems to hazards and evaluate resilience-enhancing actions. In this paper, the WNTR modeling framework is presented and a case study is described that uses WNTR to simulate the effects of an earthquake on a water distribution system. The case study illustrates that the severity of damage is not only a function of system integrity and earthquake magnitude, but also of the available resources and repair strategies used to return the system to normal operating conditions. While earthquakes are particularly concerning since buried water distribution pipelines are highly susceptible to damage, the software framework can be applied to other types of hazards, including power outages and contamination incidents.
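WNTR itself is an open-source Python package, so a minimal usage sketch can make the workflow above concrete: load a network model, impose a crude damage scenario, run a hydraulic simulation, and summarize a service metric. The file name and pipe identifiers below are hypothetical, and the damage-scenario attribute (`initial_status`) is an assumption to be checked against the WNTR documentation for the installed version.

```python
import wntr

# Hypothetical network file; any EPANET INP model of the distribution system works
wn = wntr.network.WaterNetworkModel("Net3.inp")

# Crude damage scenario: close a few pipes to mimic earthquake breaks
# (attribute name per WNTR docs; verify against the installed version)
for pipe_name in ["123", "245"]:              # hypothetical pipe IDs
    wn.get_link(pipe_name).initial_status = wntr.network.LinkStatus.Closed

# Hydraulic simulation of the damaged system
sim = wntr.sim.WNTRSimulator(wn)
results = sim.run_sim()
pressure = results.node["pressure"]

# Simple service metric: fraction of junctions above a 20 m pressure head per timestep
served = (pressure[wn.junction_name_list] > 20.0).mean(axis=1)
print(served.describe())
```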
Road landslide information management and forecasting system base on GIS.
Wang, Wei Dong; Du, Xiang Gang; Xie, Cui Ming
2009-09-01
Given the characteristics of road geological hazards and the need to supervise them, it is very important to develop a Road Landslide Information Management and Forecasting System based on a Geographic Information System (GIS). The paper presents the system objectives, functions, component modules and key techniques in the system development procedure. The system, based on the spatial and attribute information of road geological hazards, was developed and applied in Guizhou, a province of China where there are numerous and typical landslides. Using the system, transportation managers can visually query all road landslide information, either through the regional road network or through the monitoring network of an individual landslide. Furthermore, the system, which integrates mathematical prediction models with the spatial analysis strengths of GIS, can assess and predict the landslide development process from field monitoring data. Thus, it can efficiently assist road construction and management units in making decisions to control landslides and reduce human vulnerability.
49 CFR 192.911 - What are the elements of an integrity management program?
Code of Federal Regulations, 2010 CFR
2010-10-01
...) PIPELINE AND HAZARDOUS MATERIALS SAFETY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION (CONTINUED) PIPELINE SAFETY TRANSPORTATION OF NATURAL AND OTHER GAS BY PIPELINE: MINIMUM FEDERAL SAFETY STANDARDS Gas Transmission Pipeline Integrity Management § 192.911 What are the elements of an integrity management program...
49 CFR 192.933 - What actions must be taken to address integrity issues?
Code of Federal Regulations, 2010 CFR
2010-10-01
...) PIPELINE AND HAZARDOUS MATERIALS SAFETY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION (CONTINUED) PIPELINE SAFETY TRANSPORTATION OF NATURAL AND OTHER GAS BY PIPELINE: MINIMUM FEDERAL SAFETY STANDARDS Gas Transmission Pipeline Integrity Management § 192.933 What actions must be taken to address integrity issues? (a...
49 CFR 192.909 - How can an operator change its integrity management program?
Code of Federal Regulations, 2010 CFR
2010-10-01
... (Continued) PIPELINE AND HAZARDOUS MATERIALS SAFETY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION (CONTINUED) PIPELINE SAFETY TRANSPORTATION OF NATURAL AND OTHER GAS BY PIPELINE: MINIMUM FEDERAL SAFETY STANDARDS Gas Transmission Pipeline Integrity Management § 192.909 How can an operator change its integrity management...
NASA Astrophysics Data System (ADS)
Carlier, Benoit; Puissant, Anne; Dujarric, Constance
2017-04-01
Vulnerability assessment and hazard exposure are generally accepted as the two main steps of risk analysis. While quantitative methods to estimate hazard exposure are now well defined, this is not the case for vulnerability assessment. Vulnerability is a complex concept involving a variety of disciplines from the physical and socio-economic sciences (e.g. engineering, economics, social and health sciences). Currently, two opposite trends exist: the 'physical' approach, in which vulnerability is analysed as potential impacts (i.e. structural and functional) on the elements at risk (buildings, networks, land cover); and the 'social' approach, in which vulnerability is a combination of socio-economic variables determining people's ability to anticipate a catastrophic event, to react during it, and to recover after it. For a complete analysis of vulnerability it is essential to combine these two approaches, but in reality few such works exist. The objective of this research is to improve the Potential Damage Index (PDI), detailed in Puissant et al. (2013) and originally developed to assess the physical injury, structural and functional consequences of landslide hazard, by including socio-economic characteristics of the population. Data from the French Census (INSEE, 2012) and a survey on risk perception (100 questionnaires obtained between 2014 and 2015/16) were used to propose an overall index taking into account the three main phases of risk management: preparedness, crisis management and recovery. This new index, called the Global Potential Damage Index (GPDI), is applied to the Upper Guil catchment to assess potential torrential flood hazard in the context of the French-funded project SAMCO (Society Adaptation for coping with Mountain risks in a global change Context). Results of the PDI are compared with the GPDI and show significant differences. Mapped GPDI scores are lower than PDI scores, indicating that population resilience may moderate the results obtained for physical consequences. In the GPDI, the social and institutional component is expressed by a single value applied to all the stakes of a given community. Consequently, socio-economic differences between communities of the Upper Guil catchment are highlighted, making the results easily understandable for local managers.
AGU:Comments Requested on Natural Hazards Position Statement
NASA Astrophysics Data System (ADS)
2004-11-01
Natural hazards (earthquakes, floods, hurricanes, landslides, meteors, space weather, tornadoes, volcanoes, and other geophysical phenomena) are an integral component of our dynamic planet. These can have disastrous effects on vulnerable communities and ecosystems. By understanding how and where hazards occur, what causes them, and what circumstances increase their severity, we can develop effective strategies to reduce their impact. In practice, mitigating hazards requires addressing issues such as real-time monitoring and prediction, emergency preparedness, public education and awareness, post-disaster recovery, engineering, construction practices, land use, and building codes. Coordinated approaches involving scientists, engineers, policy makers, builders, lenders, insurers, news media, educators, relief organizations, and the public are therefore essential to reducing the adverse effects of natural hazards.
Tyler, Nicholas J C; Gregorini, Pablo; Forchhammer, Mads C; Stokkan, Karl-Arne; van Oort, Bob E H; Hazlerigg, David G
2016-10-01
Occurrence of 24-h rhythms in species apparently lacking functional molecular clockwork indicates that strong circadian mechanisms are not essential prerequisites of robust timing, and that rhythmical patterns may arise instead as passive responses to periodically changing environmental stimuli. Thus, in a new synthesis of grazing in a ruminant (MINDY), crepuscular peaks of activity emerge from interactions between internal and external stimuli that influence motivation to feed, and the influence of the light/dark cycle is mediated through the effect of low nocturnal levels of food intake on gastric function. Drawing on risk allocation theory, we hypothesized that the timing of behavior in ruminants is influenced by the independent effects of light on motivation to feed and perceived risk of predation. We predicted that the antithetical relationship between these 2 drivers would vary with photoperiod, resulting in a systematic shift in the phase of activity relative to the solar cycle across the year. This prediction was formalized in a model in which phase of activity emerges from a photoperiod-dependent trade-off between food and safety. We tested this model using data on the temporal pattern of activity in reindeer/caribou Rangifer tarandus free-living at natural mountain pasture in sub-Arctic Norway. The resulting nonlinear relationship between the phasing of crepuscular activity and photoperiod, consistent with the model, suggests a mechanism for behavioral timing that is independent of the core circadian system. We anticipate that such timing depends on integration of metabolic feedback from the digestive system and the activity of the glucocorticoid axis which modulates the behavioral responses of the animal to environmental hazard. The hypothalamus is the obvious neural substrate to achieve this integration. © 2016 The Author(s).
2009-01-01
Background: During the last part of the 1990s the chance of surviving breast cancer increased. Changes in survival functions reflect a mixture of effects. Both the introduction of adjuvant treatments and early screening with mammography played a role in the decline in mortality. Evaluating the contribution of these interventions using mathematical models requires survival functions before and after their introduction. Furthermore, the required survival functions may differ by age group and are related to disease stage at diagnosis. Sometimes detailed information is not available, as was the case for the region of Catalonia (Spain). Then one may derive the functions using information from other geographical areas. This work presents the methodology used to estimate age- and stage-specific Catalan breast cancer survival functions from scarce Catalan survival data by adapting the age- and stage-specific US functions. Methods: Cubic splines were used to smooth the data and obtain continuous hazard rate functions. We then fitted a Poisson model to derive hazard ratios. The model included time as a covariate. The hazard ratios were then applied to US survival functions detailed by age and stage to obtain Catalan estimations. Results: We started by estimating the hazard ratios for Catalonia versus the USA before and after the introduction of screening. The hazard ratios were then multiplied by the age- and stage-specific breast cancer hazard rates from the USA to obtain the Catalan hazard rates. We also compared breast cancer survival in Catalonia and the USA in two time periods, before cancer control interventions (USA 1975–79, Catalonia 1980–89) and after (USA and Catalonia 1990–2001). Survival in Catalonia in the 1980–89 period was worse than in the USA during 1975–79, but the differences disappeared in 1990–2001. Conclusion: Our results suggest that access to better treatments and quality of care contributed to large improvements in survival in Catalonia. In addition, we obtained detailed breast cancer survival functions that will be used for modeling the effect of screening and adjuvant treatments in Catalonia. PMID:19331670
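The hazard-ratio transfer the authors describe can be illustrated with a small, hedged sketch: fit a Poisson rate model with a region indicator, take the exponentiated coefficient as the Catalonia-versus-US hazard ratio, and scale a US hazard curve to rebuild a Catalan survival function. All numbers below are invented placeholders, not data from the study.

```python
import numpy as np
import statsmodels.api as sm

# Toy event counts and person-years by follow-up year and region (0 = US, 1 = Catalonia).
years  = np.array([1, 2, 3, 1, 2, 3])
region = np.array([0, 0, 0, 1, 1, 1])
deaths = np.array([30, 25, 20, 40, 33, 27])
pyears = np.array([1000, 950, 900, 1000, 940, 880])

X = sm.add_constant(np.column_stack([years, region]))
fit = sm.GLM(deaths, X, family=sm.families.Poisson(),
             offset=np.log(pyears)).fit()
hr_catalonia = np.exp(fit.params[2])      # hazard ratio, Catalonia vs US

# Scale a placeholder US yearly hazard curve for one age/stage group and rebuild survival.
h_us  = np.array([0.030, 0.026, 0.022])
h_cat = hr_catalonia * h_us
S_cat = np.exp(-np.cumsum(h_cat))         # survival under the scaled hazard
print(hr_catalonia, S_cat)
```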
While there is a high potential for exposure of humans and ecosystems to chemicals released from hazardous waste sites, the degree to which this potential is realized is often uncertain. Conceptually divided among parameter, model, and modeler uncertainties imparted during simula...
Rocky Mountain Research Station USDA Forest Service
2004-01-01
Effective public education and communication campaigns about wildland fire and fuels management should have clear objectives, and use the right techniques to achieve these objectives. This fact sheet lists seven important considerations for planning or implementing a hazard communication effort.
Hazard function theory for nonstationary natural hazards
NASA Astrophysics Data System (ADS)
Read, L. K.; Vogel, R. M.
2015-11-01
Impact from natural hazards is a shared global problem that causes tremendous loss of life and property, economic cost, and damage to the environment. Increasingly, many natural processes show evidence of nonstationary behavior including wind speeds, landslides, wildfires, precipitation, streamflow, sea levels, and earthquakes. Traditional probabilistic analysis of natural hazards based on peaks over threshold (POT) generally assumes stationarity in the magnitudes and arrivals of events, i.e. that the probability of exceedance of some critical event is constant through time. Given increasing evidence of trends in natural hazards, new methods are needed to characterize their probabilistic behavior. The well-developed field of hazard function analysis (HFA) is ideally suited to this problem because its primary goal is to describe changes in the exceedance probability of an event over time. HFA is widely used in medicine, manufacturing, actuarial statistics, reliability engineering, economics, and elsewhere. HFA provides a rich theory to relate the natural hazard event series (X) with its failure time series (T), enabling computation of corresponding average return periods, risk and reliabilities associated with nonstationary event series. This work investigates the suitability of HFA to characterize nonstationary natural hazards whose POT magnitudes are assumed to follow the widely applied Generalized Pareto (GP) model. We derive the hazard function for this case and demonstrate how metrics such as reliability and average return period are impacted by nonstationarity and discuss the implications for planning and design. Our theoretical analysis linking hazard event series X, with corresponding failure time series T, should have application to a wide class of natural hazards with rich opportunities for future extensions.
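The central idea, exceedance probabilities that drift in time under a Generalized Pareto magnitude model and the reliability and return-period metrics that follow from them, can be sketched numerically. The shape, scale, trend, and design-event values below are arbitrary illustrations, not results from the paper, and the expected waiting time is truncated at the planning horizon.

```python
import numpy as np

xi, x_star = 0.1, 50.0                 # assumed GP shape and design exceedance level
t = np.arange(1, 201)                  # years in the planning horizon
sigma_t = 20.0 * (1.0 + 0.005 * t)     # assumed upward trend in the GP scale parameter

# Generalized Pareto survival function per year: P(X_t > x*)
p_t = (1.0 + xi * x_star / sigma_t) ** (-1.0 / xi)

# Reliability: probability of no exceedance through year n
reliability = np.cumprod(1.0 - p_t)

# Average return period of the failure time T (expected waiting time, truncated at the horizon)
avg_return_period = 1.0 + np.sum(reliability)
print(p_t[0], reliability[49], avg_return_period)
```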
Hazard function theory for nonstationary natural hazards
NASA Astrophysics Data System (ADS)
Read, Laura K.; Vogel, Richard M.
2016-04-01
Impact from natural hazards is a shared global problem that causes tremendous loss of life and property, economic cost, and damage to the environment. Increasingly, many natural processes show evidence of nonstationary behavior including wind speeds, landslides, wildfires, precipitation, streamflow, sea levels, and earthquakes. Traditional probabilistic analysis of natural hazards based on peaks over threshold (POT) generally assumes stationarity in the magnitudes and arrivals of events, i.e., that the probability of exceedance of some critical event is constant through time. Given increasing evidence of trends in natural hazards, new methods are needed to characterize their probabilistic behavior. The well-developed field of hazard function analysis (HFA) is ideally suited to this problem because its primary goal is to describe changes in the exceedance probability of an event over time. HFA is widely used in medicine, manufacturing, actuarial statistics, reliability engineering, economics, and elsewhere. HFA provides a rich theory to relate the natural hazard event series (X) with its failure time series (T), enabling computation of corresponding average return periods, risk, and reliabilities associated with nonstationary event series. This work investigates the suitability of HFA to characterize nonstationary natural hazards whose POT magnitudes are assumed to follow the widely applied generalized Pareto model. We derive the hazard function for this case and demonstrate how metrics such as reliability and average return period are impacted by nonstationarity and discuss the implications for planning and design. Our theoretical analysis linking hazard random variable X with corresponding failure time series T should have application to a wide class of natural hazards with opportunities for future extensions.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Keating, Gordon N.; Schultz-Fellenz, Emily S.; Miller, Elizabeth D.
2010-09-01
The integration of available information on the volcanic history of the region surrounding Los Alamos National Laboratory indicates that the Laboratory is at risk from volcanic hazards. Volcanism in the vicinity of the Laboratory is unlikely within the lifetime of the facility (ca. 50–100 years) but cannot be ruled out. This evaluation provides a preliminary estimate of recurrence rates for volcanic activity. If further assessment of the hazard is deemed beneficial to reduce risk uncertainty, the next step would be to convene a formal probabilistic volcanic hazards assessment.
2014-01-01
Background Home hazards are associated with toddlers receiving unintentional home injuries (UHI). These result in not only physical and psychological difficulties for children, but also economic losses and additional stress for their families. Few researchers pay attention to predictors of home hazards among toddlers in a systematic way. The purpose of this study is firstly to describe the characteristics of homes with hazards and secondly to explore the predicted relationship of children, parents and family factors to home hazards among toddlers aged 24–47 months in Wenzhou, China. Methods A random cluster sampling was employed to select 366 parents having children aged 24 – 47 months from 13 kindergartens between March and April of 2012. Four instruments assessed home hazards, demographics, parent’s awareness of UHI, as well as family functioning. Results Descriptive statistics showed that the mean of home hazards was 12.29 (SD = 6.39). The nine kinds of home hazards that were identified in over 50% of households were: plastic bags (74.3%), coin buttons (69.1%), and toys with small components (66.7%) etc. Multivariate linear regression revealed that the predictors of home hazards were the child’s age, the child’s residential status and family functioning (b = .19, 2.02, - .07, p < .01, < .05 and < .01, respectively). Conclusions The results showed that a higher number of home hazards were significantly attributed to older toddlers, migrant toddlers and poorer family functioning. This result suggested that heath care providers should focus on the vulnerable family and help the parents assess home hazards. Further study is needed to find interventions on how to manage home hazards for toddlers in China. PMID:24953678
NASA Astrophysics Data System (ADS)
Câmara, F.; Oliveira, J.; Hormigo, T.; Araújo, J.; Ribeiro, R.; Falcão, A.; Gomes, M.; Dubois-Matra, O.; Vijendran, S.
2015-06-01
This paper discusses the design and evaluation of data fusion strategies to perform tiered fusion of several heterogeneous sensors and a priori data. The aim is to increase robustness and performance of hazard detection and avoidance systems, while enabling safe planetary and small body landings anytime, anywhere. The focus is on Mars and asteroid landing mission scenarios and three distinct data fusion algorithms are introduced and compared. The first algorithm consists of a hybrid camera-LIDAR hazard detection and avoidance system, the H2DAS, in which data fusion is performed at both sensor-level data (reconstruction of the point cloud obtained with a scanning LIDAR using the navigation motion states and correcting the image for motion compensation using IMU data), feature-level data (concatenation of multiple digital elevation maps, obtained from consecutive LIDAR images, to achieve higher accuracy and resolution maps while enabling relative positioning) as well as decision-level data (fusing hazard maps from multiple sensors onto a single image space, with a single grid orientation and spacing). The second method presented is a hybrid reasoning fusion, the HRF, in which innovative algorithms replace the decision-level functions of the previous method, by combining three different reasoning engines—a fuzzy reasoning engine, a probabilistic reasoning engine and an evidential reasoning engine—to produce safety maps. Finally, the third method presented is called Intelligent Planetary Site Selection, the IPSIS, an innovative multi-criteria, dynamic decision-level data fusion algorithm that takes into account historical information for the selection of landing sites and a piloting function with a non-exhaustive landing site search capability, i.e., capable of finding local optima by searching a reduced set of global maps. All the discussed data fusion strategies and algorithms have been integrated, verified and validated in a closed-loop simulation environment. Monte Carlo simulation campaigns were performed for the algorithms performance assessment and benchmarking. The simulations results comprise the landing phases of Mars and Phobos landing mission scenarios.
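As a rough illustration of the decision-level fusion step, the sketch below combines two per-cell hazard-probability maps (one camera-derived, one LIDAR-derived) that have already been resampled to a common grid, and picks the safest cell as a candidate landing site. The grid sizes, weights, and max-based fusion rule are illustrative choices, not the H2DAS, HRF, or IPSIS algorithms themselves.

```python
import numpy as np

rng = np.random.default_rng(1)
hazard_cam   = rng.random((64, 64))    # per-cell hazard probability from imagery
hazard_lidar = rng.random((64, 64))    # per-cell hazard probability from a LIDAR DEM

# Conservative fusion: weight the higher-resolution LIDAR map more, then keep the worst case.
fused_hazard = np.maximum(0.4 * hazard_cam, 0.6 * hazard_lidar)
safety_map   = 1.0 - fused_hazard

# Candidate landing site: the safest cell in the fused map.
ij = np.unravel_index(np.argmax(safety_map), safety_map.shape)
print(ij, safety_map[ij])
```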
The Demonstrator for the European Plate Observing System (EPOS)
NASA Astrophysics Data System (ADS)
Hoffmann, T. L.; Euteneuer, F.; Ulbricht, D.; Lauterjung, J.; Bailo, D.; Jeffery, K. G.
2014-12-01
An important outcome of the 4-year Preparatory Phase of the ESFRI project European Plate Observing System (EPOS) was the development and first implementation of the EPOS Demonstrator by the project's ICT Working Group 7. The Demonstrator implements the vertical integration of the three-layer architectural scheme for EPOS, connecting the Integrated Core Services (ICS), Thematic Core Services (TCS) and the National Research Infrastructures (NRI). The demonstrator provides a single GUI with central key discovery and query functionalities, based on already existing services by the seismic, geologic and geodetic communities. More specifically, the seismic services of the Demonstrator utilize webservices and APIs for data and discovery of raw seismic data (FDSN webservices by the EIDA Network), events (Geoportal by EMSC) and analytical data products (e.g., hazard maps by EFEHR via OGC WMS). For geologic services, the EPOS Demonstrator accesses OneGeology Europe which serves the community with geologic maps and point information via OGC webservices. The Demonstrator also provides access to raw geodetic data via a newly developed universal tool called GSAC. The Demonstrator itself resembles the future Integrated Core Service (ICS) and provides direct access to the end user. Its core functionality lies in a metadata catalogue, which serves as the central information hub and stores information about all RIs, related persons, projects, financial background and technical access information. The database schema of the catalogue is based on CERIF, which has been slightly adapted. Currently, the portal provides basic query functions as well as cross domain search (www.epos.cineca.it).
A New Seismic Hazard Model for Mainland China
NASA Astrophysics Data System (ADS)
Rong, Y.; Xu, X.; Chen, G.; Cheng, J.; Magistrale, H.; Shen, Z. K.
2017-12-01
We are developing a new seismic hazard model for Mainland China by integrating historical earthquake catalogs, geological faults, geodetic GPS data, and geology maps. To build the model, we construct an Mw-based homogeneous historical earthquake catalog spanning from 780 B.C. to present, create fault models from active fault data, and derive a strain rate model based on the most complete GPS measurements and a new strain derivation algorithm. We divide China and the surrounding regions into about 20 large seismic source zones. For each zone, a tapered Gutenberg-Richter (TGR) magnitude-frequency distribution is used to model the seismic activity rates. The a- and b-values of the TGR distribution are calculated using observed earthquake data, while the corner magnitude is constrained independently using the seismic moment rate inferred from the geodetically-based strain rate model. Small and medium sized earthquakes are distributed within the source zones following the location and magnitude patterns of historical earthquakes. Some of the larger earthquakes are distributed onto active faults, based on their geological characteristics such as slip rate, fault length, down-dip width, and various paleoseismic data. The remaining larger earthquakes are then placed into the background. A new set of magnitude-rupture scaling relationships is developed based on earthquake data from China and vicinity. We evaluate and select appropriate ground motion prediction equations by comparing them with observed ground motion data and performing residual analysis. To implement the modeling workflow, we develop a tool that builds upon the functionalities of GEM's Hazard Modeler's Toolkit. The GEM OpenQuake software is used to calculate seismic hazard at various ground motion periods and various return periods. To account for site amplification, we construct a site condition map based on geology. The resulting new seismic hazard maps can be used for seismic risk analysis and management.
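A tapered Gutenberg-Richter (TGR) rate model of the kind described can be sketched as follows; the a- and b-values, threshold, and corner magnitude are placeholders, and the moment conversion uses the standard Hanks and Kanamori scaling rather than anything specific to this model.

```python
import numpy as np

def moment(mw):
    # Seismic moment in N*m from moment magnitude (Hanks & Kanamori scaling).
    return 10.0 ** (1.5 * mw + 9.05)

a, b = 4.0, 1.0                  # assumed GR activity and slope for one source zone
mw_min, mw_corner = 5.0, 8.0     # assumed threshold and corner magnitudes
beta = 2.0 * b / 3.0             # TGR exponent in moment space

mw = np.arange(5.0, 9.01, 0.1)
n_min = 10.0 ** (a - b * mw_min)     # annual rate of events at or above mw_min

# Kagan-style tapered GR survival in moment space, rescaled to annual exceedance rates.
surv = (moment(mw_min) / moment(mw)) ** beta * np.exp(
    (moment(mw_min) - moment(mw)) / moment(mw_corner))
annual_rate = n_min * surv
print(dict(zip(np.round(mw[::10], 1), annual_rate[::10])))
```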
Geospatial Data Integration for Assessing Landslide Hazard on Engineered Slopes
NASA Astrophysics Data System (ADS)
Miller, P. E.; Mills, J. P.; Barr, S. L.; Birkinshaw, S. J.
2012-07-01
Road and rail networks are essential components of national infrastructures, underpinning the economy, and facilitating the mobility of goods and the human workforce. Earthwork slopes such as cuttings and embankments are primary components, and their reliability is of fundamental importance. However, instability and failure can occur, through processes such as landslides. Monitoring the condition of earthworks is a costly and continuous process for network operators, and currently, geospatial data is largely underutilised. The research presented here addresses this by combining airborne laser scanning and multispectral aerial imagery to develop a methodology for assessing landslide hazard. This is based on the extraction of key slope stability variables from the remotely sensed data. The methodology is implemented through numerical modelling, which is parameterised with the slope stability information, simulated climate conditions, and geotechnical properties. This allows determination of slope stability (expressed through the factor of safety) for a range of simulated scenarios. Regression analysis is then performed in order to develop a functional model relating slope stability to the input variables. The remotely sensed raster datasets are robustly re-sampled to two-dimensional cross-sections to facilitate meaningful interpretation of slope behaviour and mapping of landslide hazard. Results are stored in a geodatabase for spatial analysis within a GIS environment. For a test site located in England, UK, results have shown the utility of the approach in deriving practical hazard assessment information. Outcomes were compared to the network operator's hazard grading data, and show general agreement. The utility of the slope information was also assessed with respect to auto-population of slope geometry, and found to deliver significant improvements over the network operator's existing field-based approaches.
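The surrogate-modelling step (regressing simulated factor of safety against slope variables extracted from remote sensing) can be sketched as below. The synthetic data stand in for the numerical-model runs, and the variables, coefficients, and the 1.3 factor-of-safety flag are illustrative assumptions rather than values from the study.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
slope_deg = rng.uniform(15, 45, 200)     # slope angle from the laser-scanning DEM
veg_cover = rng.uniform(0.0, 1.0, 200)   # vegetation cover from multispectral imagery
wetness   = rng.uniform(0.2, 1.0, 200)   # simulated wetness condition

# Synthetic "simulated" factor of safety: steeper and wetter slopes are less stable.
fos = 2.2 - 0.03 * slope_deg - 0.6 * wetness + 0.4 * veg_cover + rng.normal(0, 0.05, 200)

X = np.column_stack([slope_deg, veg_cover, wetness])
model = LinearRegression().fit(X, fos)

# Apply the functional model to a remotely sensed cross-section and flag low factors of safety.
fos_hat = model.predict([[38.0, 0.2, 0.9]])
print(model.coef_, fos_hat, fos_hat < 1.3)
```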
Sustainable and safe design of footwear integrating ecological footprint and risk criteria.
Herva, Marta; Álvarez, Antonio; Roca, Enrique
2011-09-15
The ecodesign of a product implies that different potential environmental impacts of diverse nature must be taken into account considering its whole life cycle, apart from the general design criteria (i.e. technical, functional, ergonomic, aesthetic or economic). In this sense, a sustainability assessment methodology, ecological footprint (EF), and environmental risk assessment (ERA) were combined for the first time to derive complementary criteria for the ecodesign of footwear. Four models of children's shoes were analyzed and compared. The synthetic shoes obtained a smaller EF (6.5 gm²) when compared to the leather shoes (11.1 gm²). However, high concentrations of hazardous substances were detected in the former, even making the Hazard Quotient (HQ) and the Cancer Risk (CR) exceed the recommended safety limits for one of the synthetic models analyzed. Risk criteria were prioritized in this case and, consequently, the design proposal was discarded. For the other cases, the perspectives provided by the indicators of different nature were balanced to accomplish a fairer evaluation. The selection of fibers produced under sustainable criteria and the reduction of material consumption were recommended, since the area requirements would be minimized and the absence of hazardous compounds would ensure safe conditions during the use stage. Copyright © 2011 Elsevier B.V. All rights reserved.
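For readers unfamiliar with the risk side of the comparison, the screening quantities mentioned (Hazard Quotient and Cancer Risk) follow the standard intake-based formulas, sketched below with entirely made-up exposure parameters rather than values from the footwear study.

```python
# Chronic daily intake (CDI) and the derived screening indices; all values are placeholders.
conc        = 0.5       # mg of contaminant per kg of material
intake      = 0.001     # kg of material contacted or ingested per day (assumed)
exp_freq    = 350       # exposure days per year
exp_dur     = 6         # exposure duration in years (child)
body_w      = 20        # body weight, kg
avg_time_nc = exp_dur * 365   # averaging time for non-carcinogens, days
avg_time_c  = 70 * 365        # averaging time for carcinogens (lifetime), days

cdi_nc = conc * intake * exp_freq * exp_dur / (body_w * avg_time_nc)
cdi_c  = conc * intake * exp_freq * exp_dur / (body_w * avg_time_c)

rfd = 5e-4   # reference dose, mg/kg-day (assumed)
sf  = 0.1    # cancer slope factor, (mg/kg-day)^-1 (assumed)

hq = cdi_nc / rfd   # Hazard Quotient; values above 1 flag a potential non-cancer concern
cr = cdi_c * sf     # incremental Cancer Risk; often screened against 1e-6 to 1e-4
print(hq, cr)
```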
Quantitative risk analysis of oil storage facilities in seismic areas.
Fabbrocino, Giovanni; Iervolino, Iunio; Orlando, Francesca; Salzano, Ernesto
2005-08-31
Quantitative risk analysis (QRA) of industrial facilities has to take into account multiple hazards threatening critical equipment. Nevertheless, engineering procedures able to evaluate quantitatively the effect of seismic action are not well established. Indeed, relevant industrial accidents may be triggered by loss of containment following ground shaking or other relevant natural hazards, either directly or through cascade effects ('domino effects'). The issue of integrating structural seismic risk into quantitative probabilistic seismic risk analysis (QpsRA) is addressed in this paper by a representative case study regarding an oil storage plant with a number of atmospheric steel tanks containing flammable substances. Empirical seismic fragility curves and probit functions, properly defined both for building-like and non-building-like industrial components, have been crossed with outcomes of probabilistic seismic hazard analysis (PSHA) for a test site located in south Italy. Once the seismic failure probabilities have been quantified, consequence analysis has been performed for those events which may be triggered by the loss of containment following seismic action. Results are combined by means of a specifically developed code in terms of local risk contour plots, i.e. the contour line for the probability of fatal injuries at any point (x, y) in the analysed area. Finally, a comparison with QRA obtained by considering only process-related top events is reported for reference.
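The core numerical step, crossing a component fragility curve with the site hazard curve to obtain an annual failure rate, can be sketched as follows. The hazard curve, the lognormal fragility median and dispersion, and the binning scheme are placeholders, not values from the paper.

```python
import numpy as np
from scipy.stats import norm

pga = np.array([0.05, 0.1, 0.2, 0.3, 0.5, 0.8])                 # peak ground acceleration, g
annual_exceed = np.array([2e-2, 8e-3, 2e-3, 8e-4, 2e-4, 4e-5])  # PSHA curve: lambda(PGA > pga)

median, beta = 0.4, 0.5                               # assumed tank fragility parameters
p_fail = norm.cdf(np.log(pga / median) / beta)        # P(loss of containment | PGA = pga)

# Occurrence rate of shaking in each PGA bin (discrete derivative of the hazard curve).
d_lambda = -np.diff(np.append(annual_exceed, 0.0))
annual_failure_rate = np.sum(p_fail * d_lambda)
print(f"annual seismic failure rate ~ {annual_failure_rate:.1e}")
```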
Damage assessment of bridge infrastructure subjected to flood-related hazards
NASA Astrophysics Data System (ADS)
Michalis, Panagiotis; Cahill, Paul; Bekić, Damir; Kerin, Igor; Pakrashi, Vikram; Lapthorne, John; Morais, João Gonçalo Martins Paulo; McKeogh, Eamon
2017-04-01
Transportation assets represent a critical component of society's infrastructure systems. Flood-related hazards are considered one of the main climate change impacts on highway and railway infrastructure, threatening the security and functionality of transportation systems. Of such hazards, flood-induced scour is a primary cause of bridge collapses worldwide and one of the most complex and challenging water flow and erosion phenomena, leading to structural instability and ultimately catastrophic failures. Evaluation of scour risk under severe flood events is a particularly challenging issue considering that the depth of foundations is very difficult to evaluate in a water environment. The continual inspection, assessment and maintenance of bridges and other hydraulic structures under extreme flood events requires a multidisciplinary approach, including knowledge and expertise of hydraulics, hydrology, structural engineering, geotechnics and infrastructure management. The large number of bridges under a single management unit also highlights the need for efficient management, information sharing and self-informing systems to provide reliable, cost-effective flood and scour risk management. The "Intelligent Bridge Assessment Maintenance and Management System" (BRIDGE SMS) is an EU/FP7 funded project which aims to couple state-of-the-art scientific expertise in multidisciplinary engineering sectors with industrial knowledge in infrastructure management. This involves the application of integrated low-cost structural health monitoring systems to provide real-time information towards the development of an intelligent decision support tool and a web-based platform to assess and efficiently manage bridge assets. This study documents the technological experience and presents results obtained from the application of sensing systems focusing on the damage assessment of water hazards at bridges over watercourses in Ireland. The applied instrumentation is interfaced with an open-source platform that can offer a more economical remote monitoring solution. The results presented in this investigation provide an important guide for a multidisciplinary approach to bridge monitoring and can be used as a benchmark for the field application of cost-effective and robust sensing methods. This will deliver key information regarding the impact of water-related hazards at bridge structures through an integrated structural health monitoring and management system. Acknowledgement: The authors wish to acknowledge the financial support of the European Commission, through the Marie Curie action Industry-Academia Partnership and Pathways Network BRIDGE SMS (Intelligent Bridge Assessment Maintenance and Management System) - FP7-People-2013-IAPP-612517.
Water Induced Hazard Mapping in Nepal: A Case Study of East Rapti River Basin
NASA Astrophysics Data System (ADS)
Neupane, N.
2010-12-01
This paper presents an illustration of typical water-induced hazard mapping of the East Rapti River Basin under the DWIDP, GON. The basin covers an area of 2398 sq km. The methodology includes making a base map of water-induced disasters in the basin. Landslide hazard maps were prepared by the SINMAP approach. Debris flow hazard maps were prepared by considering geology, slope, and saturation. Flood hazard maps were prepared by using two approaches: HEC-RAS and satellite imagery interpretation. The composite water-induced hazard maps were produced by compiling the hazards rendered by landslides, debris flows, and floods. The monsoon average rainfall in the basin is 1907 mm, whereas the maximum 24-hour precipitation is 456.8 mm. The peak discharge of the Rapti River in the year 1993 at the station was 1220 cu m/sec. This discharge nearly corresponds to the discharge of a 100-year return period. The landslides, floods, and debris flows triggered by the heavy rain of July 1993 claimed 265 lives, affected 148,516 people, and damaged 1500 houses in the basin. The field investigation and integrated GIS interpretation showed that the very high and high landslide hazard zones collectively cover 38.38% of the watershed and the debris flow hazard zone constitutes 6.58%. The high flood hazard zone occupies 4.28% of the watershed area. Mitigation measures are recommended according to the Integrated Watershed Management Approach, under which non-structural and structural measures are proposed. The non-structural measures include: disaster management training, formulation of an evacuation system (arrangement of an information plan about disasters), agriculture management practices, protection of water sources, slope protection, and removal of excessive bed load from the river channel. Similarly, structural measures such as dikes, spurs, rehabilitation of existing preventive measures, and river training at some locations are recommended. The major factors that have contributed to the high incidence of various types of mass movements and inundation in the basin are rock and soil properties, prolonged and high-intensity rainfall, steep topography, and various anthropogenic factors.
Transportation of Hazardous Evidentiary Material.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Osborn, Douglas.
2005-06-01
This document describes the specimen and transportation containers currently available for use with hazardous and infectious materials. A detailed comparison of advantages, disadvantages, and costs of the different technologies is included. Short- and long-term recommendations are also provided. Executive Summary: The Federal Bureau of Investigation's Hazardous Materials Response Unit currently has hazardous material transport containers for shipping 1-quart paint cans and small amounts of contaminated forensic evidence, but the containers may not be able to maintain their integrity under accident conditions or for some types of hazardous materials. This report provides guidance and recommendations on the availability of packages for the safe and secure transport of evidence consisting of or contaminated with hazardous chemicals or infectious materials. Only non-bulk containers were considered because these are appropriate for transport on small aircraft. This report addresses packaging and transportation concerns for Hazardous Classes 3, 4, 5, 6, 8, and 9 materials. If the evidence is known or suspected of belonging to one of these Hazardous Classes, it must be packaged in accordance with the provisions of 49 CFR Part 173. The anthrax scare of several years ago, and less well publicized incidents involving unknown and uncharacterized substances, have required that suspicious substances be sent to appropriate analytical laboratories for analysis and characterization. Transportation of potentially hazardous or infectious material to an appropriate analytical laboratory requires transport containers that maintain both the biological and chemical integrity of the substance in question. As a rule, only relatively small quantities will be available for analysis. Appropriate transportation packaging is needed that will maintain the integrity of the substance, will not allow biological alteration, will not react chemically with the substance being shipped, and will otherwise maintain it as nearly as possible in its original condition. The recommendations provided are short-term solutions to the problems of shipping evidence and consider only currently commercially available containers. These containers may not be appropriate for all cases.
Design, testing, and certification of new transportation containers would be necessary to provide a container appropriate for all cases. Table 1 provides a summary of the recommendations for each class of hazardous material.
Table 1: Summary of Recommendations
- Hazard Class 3, 4, 5, 8, or 9, small containers: 1-quart paint can with Armlock seal ring (LabelMaster, $2.90 each); TC Hazardous Material Transport Container (currently in use).
- Hazard Class 3, 4, 5, 8, or 9, large containers: 55-gallon open- or closed-head steel drums (All-Pak, Inc., $58.28-$73.62 each); 95-gallon poly overpack (LabelMaster, $194.50 each).
- Hazard Class 6, Division 6.1 Poisonous by Inhalation (PIH), small containers: 1-liter glass container with plastic coating (LabelMaster, $3.35-$3.70 each); TC Hazardous Material Transport Container (currently in use).
- Hazard Class 6, Division 6.1 PIH, large containers: 20- to 55-gallon PIH overpacks (LabelMaster, $142.50-$170.50 each); 65- to 95-gallon poly overpacks (LabelMaster, $163.30-$194.50 each).
- Hazard Class 6, Division 6.2 Infectious Material, small containers: 1-liter transparent container (currently in use); Infectious Substance Shipper (Source Packaging of NE, Inc., $336.00 each).
- Hazard Class 6, Division 6.2 Infectious Material, large containers: none commercially available (N/A).
Cheung, Weng-Fong; Lin, Tzu-Hsuan; Lin, Yu-Cheng
2018-01-01
In recent years, many studies have focused on the application of advanced technology as a way to improve construction safety management. A Wireless Sensor Network (WSN), one of the key technologies in Internet of Things (IoT) development, enables objects and devices to sense and communicate environmental conditions; Building Information Modeling (BIM), a revolutionary technology in construction, integrates a database and geometry into a digital model which provides a visualized way to manage the whole construction lifecycle. This paper integrates BIM and WSN into a unique system which enables the construction site to visually monitor the safety status via a spatial, colored interface and remove any hazardous gas automatically. Many wireless sensor nodes were placed on an underground construction site to collect hazardous gas levels and environmental conditions (temperature and humidity); in any region where an abnormal status is detected, the BIM model will highlight the region and an alarm and ventilator on site will start automatically to warn workers and remove the hazard. The proposed system can greatly enhance the efficiency of construction safety management and provide important reference information for rescue tasks. Finally, a case study demonstrates the applicability of the proposed system, and the practical benefits, limitations, conclusions, and suggestions are summarized for further applications. PMID:29393887
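The alerting logic described (threshold checks on gas readings, a colour change in the BIM model, and automatic ventilation) can be outlined with a short, hedged sketch. The gas names, thresholds, zone identifiers, and the stubbed BIM and actuator calls are invented for illustration and are not the authors' implementation.

```python
from dataclasses import dataclass

THRESHOLDS = {"CO": 35.0, "CH4": 1000.0}   # ppm, assumed alert levels

@dataclass
class Reading:
    zone: str
    gas: str
    ppm: float

def assess(readings):
    """Return the set of zones in which any gas reading exceeds its threshold."""
    return {r.zone for r in readings if r.ppm > THRESHOLDS.get(r.gas, float("inf"))}

def act(hazard_zones):
    for zone in hazard_zones:
        print(f"[BIM] colour zone {zone} red")                  # stand-in for a BIM API call
        print(f"[IoT] start ventilator and alarm in {zone}")    # stand-in for an actuator command

act(assess([Reading("B2-ramp", "CO", 12.0), Reading("B2-shaft", "CH4", 1800.0)]))
```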
76 FR 28336 - Domestic Licensing of Source Material-Amendments/Integrated Safety Analysis
Federal Register 2010, 2011, 2012, 2013, 2014
2011-05-17
... considered. The HF gas (and uranyl fluoride) is quickly produced from the chemical reaction that occurs when... worker's death was the inhalation of HF gas, which was produced from the chemical reaction of UF6 and..., would address both the radiological and chemical hazards from licensed material and hazardous chemicals...
Ecosystem processes at the watershed scale: mapping and modeling ecohydrological controls
Lawrence E. Band; T. Hwang; T.C. Hales; James Vose; Chelcy Ford
2012-01-01
Mountain watersheds are sources of a set of valuable ecosystem services as well as potential hazards. The former include high quality freshwater, carbon sequestration, nutrient retention, and biodiversity, whereas the latter include flash floods, landslides and forest fires. Each of these ecosystem services and hazards represents different elements of the integrated...
Recording and cataloging hazards information, revision A
NASA Technical Reports Server (NTRS)
Stein, R. J.
1974-01-01
A data collection process is described for the purpose of discerning causation factors of accidents and establishing boundaries or controls aimed at mitigating and eliminating accidents. A procedure is proposed that suggests a disciplined approach to hazard identification based on energy interrelationships, together with an integrated control technique which takes the form of checklists.
Rocky Mountain Research Station USDA Forest Service
2004-01-01
Within the past 10 years, breakthrough research has identified factors that are most important for effectively communicating about wildland fire hazards. This fact sheet discusses seven "Laws" of effective public communication that should be considered in any state-of-the-art education campaign.
Hazard function theory for nonstationary natural hazards
Read, Laura K.; Vogel, Richard M.
2016-04-11
Impact from natural hazards is a shared global problem that causes tremendous loss of life and property, economic cost, and damage to the environment. Increasingly, many natural processes show evidence of nonstationary behavior including wind speeds, landslides, wildfires, precipitation, streamflow, sea levels, and earthquakes. Traditional probabilistic analysis of natural hazards based on peaks over threshold (POT) generally assumes stationarity in the magnitudes and arrivals of events, i.e., that the probability of exceedance of some critical event is constant through time. Given increasing evidence of trends in natural hazards, new methods are needed to characterize their probabilistic behavior. The well-developed field of hazard function analysis (HFA) is ideally suited to this problem because its primary goal is to describe changes in the exceedance probability of an event over time. HFA is widely used in medicine, manufacturing, actuarial statistics, reliability engineering, economics, and elsewhere. HFA provides a rich theory to relate the natural hazard event series (X) with its failure time series (T), enabling computation of corresponding average return periods, risk, and reliabilities associated with nonstationary event series. This work investigates the suitability of HFA to characterize nonstationary natural hazards whose POT magnitudes are assumed to follow the widely applied generalized Pareto model. We derive the hazard function for this case and demonstrate how metrics such as reliability and average return period are impacted by nonstationarity and discuss the implications for planning and design. As a result, our theoretical analysis linking hazard random variable X with corresponding failure time series T should have application to a wide class of natural hazards with opportunities for future extensions.
Hazard function theory for nonstationary natural hazards
DOE Office of Scientific and Technical Information (OSTI.GOV)
Read, Laura K.; Vogel, Richard M.
Impact from natural hazards is a shared global problem that causes tremendous loss of life and property, economic cost, and damage to the environment. Increasingly, many natural processes show evidence of nonstationary behavior including wind speeds, landslides, wildfires, precipitation, streamflow, sea levels, and earthquakes. Traditional probabilistic analysis of natural hazards based on peaks over threshold (POT) generally assumes stationarity in the magnitudes and arrivals of events, i.e., that the probability of exceedance of some critical event is constant through time. Given increasing evidence of trends in natural hazards, new methods are needed to characterize their probabilistic behavior. The well-developed field of hazard function analysis (HFA) is ideally suited to this problem because its primary goal is to describe changes in the exceedance probability of an event over time. HFA is widely used in medicine, manufacturing, actuarial statistics, reliability engineering, economics, and elsewhere. HFA provides a rich theory to relate the natural hazard event series (X) with its failure time series (T), enabling computation of corresponding average return periods, risk, and reliabilities associated with nonstationary event series. This work investigates the suitability of HFA to characterize nonstationary natural hazards whose POT magnitudes are assumed to follow the widely applied generalized Pareto model. We derive the hazard function for this case and demonstrate how metrics such as reliability and average return period are impacted by nonstationarity and discuss the implications for planning and design. As a result, our theoretical analysis linking hazard random variable X with corresponding failure time series T should have application to a wide class of natural hazards with opportunities for future extensions.
Ramadan, Adham R; Kock, Per; Nadim, Amani
2005-04-01
A facility for the treatment and disposal of industrial hazardous waste has been established in Alexandria, Egypt. Phase I of the facility encompassing a secure landfill and solar evaporation ponds is ready to receive waste, and Phase II encompassing physico-chemical treatment, solidification, and interim storage is underway. The facility, the Nasreya Centre, is the first of its kind in Egypt, and represents the nucleus for the integration, improvement and further expansion of different hazardous waste management practices and services in Alexandria. It has been developed within the overall legal framework of the Egyptian Law for the Environment, and is expected to improve prospects for enforcement of the regulatory requirements specified in this law. It has been developed with the overall aim of promoting the establishment of an integrated industrial hazardous waste management system in Alexandria, serving as a demonstration to be replicated elsewhere in Egypt. For Phase I, the Centre only accepts inorganic industrial wastes. In this respect, a waste acceptance policy has been developed, which is expected to be reviewed during Phase II, with an expansion of the waste types accepted.
Next generation microbiological risk assessment-Potential of omics data for hazard characterisation.
Haddad, Nabila; Johnson, Nick; Kathariou, Sophia; Métris, Aline; Phister, Trevor; Pielaat, Annemarie; Tassou, Chrysoula; Wells-Bennik, Marjon H J; Zwietering, Marcel H
2018-04-12
According to World Health Organization estimates in 2015, 600 million people fall ill every year from contaminated food and 420,000 die. Microbial risk assessment (MRA) was developed as a tool to reduce and prevent risks presented by pathogens and/or their toxins. MRA is organized in four steps to analyse information and assist in both designing appropriate control options and implementing regulatory decisions and programs. Among the four steps, hazard characterisation is performed to establish the probability and severity of a disease outcome, which is determined as a function of the dose of toxin and/or pathogen ingested. This dose-response relationship is subject to both variability and uncertainty. The purpose of this review/opinion article is to discuss how Next Generation Omics can impact hazard characterisation and, more precisely, how it can improve our understanding of variability and limit the uncertainty in the dose-response relation. The expansion of omics tools (e.g. genomics, transcriptomics, proteomics and metabolomics) allows for a better understanding of pathogenicity mechanisms and virulence levels of bacterial strains. Detection and identification of virulence genes, comparative genomics, analyses of mRNA and protein levels and the development of biomarkers can help in building a mechanistic dose-response model to predict disease severity. In this respect, systems biology can help to identify critical system characteristics that confer virulence and explain variability between strains. Despite challenges in the integration of omics into risk assessment, some omics methods have already been used by regulatory agencies for hazard identification. Standardized methods, reproducibility and datasets obtained from realistic conditions remain a challenge, and are needed to improve accuracy of hazard characterisation. When these improvements are realized, they will allow the health authorities and government policy makers to prioritize hazards more accurately and thus refine surveillance programs with the collaboration of all stakeholders of the food chain. Copyright © 2018 The Authors. Published by Elsevier B.V. All rights reserved.
Space vehicle propulsion systems: Environmental space hazards
NASA Technical Reports Server (NTRS)
Disimile, P. J.; Bahr, G. K.
1990-01-01
The hazards in geolunar space that may degrade, disrupt, or terminate the performance of space-based LOX/LH2 rocket engines are evaluated. Accordingly, a summary of the open literature pertaining to geolunar space hazards is provided. Approximately 350 citations and about 200 documents and abstracts were reviewed; the documents selected give current and quantitative detail. The methodology was to categorize the various space hazards according to their importance in specified regions of geolunar space. Additionally, the effects of the various space hazards on spacecraft and their systems were investigated. It was found that further investigation of the literature would be required to assess the effects of these hazards on propulsion systems per se; in particular, possible degrading effects on exterior nozzle structure, directional gimbals, and internal combustion chamber integrity and geometry.
PERSONNEL PROTECTION THROUGH RECONNAISSANCE ROBOTICS AT SUPERFUND REMEDIAL SITES
Investigation, mitigation, and clean-up of hazardous materials at Superfund sites normally require on-site workers to perform hazardous and sometimes potentially dangerous functions. Such functions include site surveys and the reconnaissance for airborne and buried toxic environme...
DEMONSTRATION OF AUTONOMOUS AIR MONITORING THROUGH ROBOTICS
Hazardous and/or tedious functions are often performed by on-site workers during investigation, mitigation and clean-up of hazardous substances. These functions include site surveys, sampling and analysis, excavation, and treatment and preparation of wastes for shipment to chemic...
An Integrated GIS-Expert System Framework for Live Hazard Monitoring and Detection.
McCarthy, James D; Graniero, Phil A; Rozic, Steven M
2008-02-08
In the context of hazard monitoring, using sensor web technology to monitor and detect hazardous conditions in near-real-time can result in large amounts of spatial data that can be used to drive analysis at an instrumented site. These data can be used for decision making and problem solving; however, as with any analysis problem, the success of analyzing hazard potential is governed by many factors such as: the quality of the sensor data used as input; the meaning that can be derived from those data; the reliability of the model used to describe the problem; the strength of the analysis methods; and the ability to effectively communicate the end results of the analysis. For decision makers to make use of sensor web data these issues must be dealt with to some degree. The work described in this paper addresses all of these areas by showing how raw sensor data can be automatically transformed into a representation which matches a predefined model of the problem context. This model can be understood by analysis software that leverages rule-based logic and inference techniques to reason with, and draw conclusions about, spatial data. These tools are integrated with a well known Geographic Information System (GIS) and existing geospatial and sensor web infrastructure standards, providing expert users with the tools needed to thoroughly explore a problem site and investigate hazards in any domain.
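The transformation-plus-inference step can be caricatured in a few lines: raw observations are mapped onto a simple problem model, and declarative rules then infer a hazard level. The observation fields, thresholds, and rules below are invented for illustration; they are not the expert system's actual knowledge base.

```python
# Map raw sensor observations onto a minimal problem model.
observations = {"rain_24h_mm": 95, "pore_pressure_kpa": 48, "slope_movement_mm": 6}

# Declarative rules: (condition over the model, inferred hazard level), checked in order.
rules = [
    (lambda o: o["rain_24h_mm"] > 80 and o["pore_pressure_kpa"] > 40, "high"),
    (lambda o: o["slope_movement_mm"] > 3, "elevated"),
]

def infer(obs, rule_list, default="low"):
    for condition, level in rule_list:
        if condition(obs):
            return level
    return default

print(infer(observations, rules))   # -> 'high'
```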
Integrating Powered Descent Vehicle with Back Shell of Mars Spacecraft
2011-11-10
The powered descent vehicle of NASA Mars Science Laboratory spacecraft is being prepared for final integration into the spacecraft back shell in this photograph from inside the Payload Hazardous Servicing Facility at NASA Kennedy Space Center, Fla.
Mars Science Laboratory Heat Shield Integration for Flight
2011-11-10
During final stacking of NASA Mars Science Laboratory spacecraft, the heat shield is positioned for integration with the rest of the spacecraft in this photograph from inside the Payload Hazardous Servicing Facility at NASA Kennedy Space Center, Fla.
Anaesthesia machine: checklist, hazards, scavenging.
Goneppanavar, Umesh; Prabhu, Manjunath
2013-09-01
From a simple pneumatic device of the early 20th century, the anaesthesia machine has evolved to incorporate various mechanical, electrical and electronic components to be more appropriately called anaesthesia workstation. Modern machines have overcome many drawbacks associated with the older machines. However, addition of several mechanical, electronic and electric components has contributed to recurrence of some of the older problems such as leak or obstruction attributable to newer gadgets and development of newer problems. No single checklist can satisfactorily test the integrity and safety of all existing anaesthesia machines due to their complex nature as well as variations in design among manufacturers. Human factors have contributed to greater complications than machine faults. Therefore, better understanding of the basics of anaesthesia machine and checking each component of the machine for proper functioning prior to use is essential to minimise these hazards. Clear documentation of regular and appropriate servicing of the anaesthesia machine, its components and their satisfactory functioning following servicing and repair is also equally important. Trace anaesthetic gases polluting the theatre atmosphere can have several adverse effects on the health of theatre personnel. Therefore, safe disposal of these gases away from the workplace with efficiently functioning scavenging system is necessary. Other ways of minimising atmospheric pollution such as gas delivery equipment with negligible leaks, low flow anaesthesia, minimal leak around the airway equipment (facemask, tracheal tube, laryngeal mask airway, etc.) more than 15 air changes/hour and total intravenous anaesthesia should also be considered.
Anaesthesia Machine: Checklist, Hazards, Scavenging
Goneppanavar, Umesh; Prabhu, Manjunath
2013-01-01
From a simple pneumatic device of the early 20th century, the anaesthesia machine has evolved to incorporate various mechanical, electrical and electronic components to be more appropriately called anaesthesia workstation. Modern machines have overcome many drawbacks associated with the older machines. However, addition of several mechanical, electronic and electric components has contributed to recurrence of some of the older problems such as leak or obstruction attributable to newer gadgets and development of newer problems. No single checklist can satisfactorily test the integrity and safety of all existing anaesthesia machines due to their complex nature as well as variations in design among manufacturers. Human factors have contributed to greater complications than machine faults. Therefore, better understanding of the basics of anaesthesia machine and checking each component of the machine for proper functioning prior to use is essential to minimise these hazards. Clear documentation of regular and appropriate servicing of the anaesthesia machine, its components and their satisfactory functioning following servicing and repair is also equally important. Trace anaesthetic gases polluting the theatre atmosphere can have several adverse effects on the health of theatre personnel. Therefore, safe disposal of these gases away from the workplace with efficiently functioning scavenging system is necessary. Other ways of minimising atmospheric pollution such as gas delivery equipment with negligible leaks, low flow anaesthesia, minimal leak around the airway equipment (facemask, tracheal tube, laryngeal mask airway, etc.) more than 15 air changes/hour and total intravenous anaesthesia should also be considered. PMID:24249887
Sansoë-Bourget, Emmanuelle
2006-01-01
The use of biological indicators is integral to the validation of isolator decontamination cycles. The difficulty in setting up the initial qualification of the decontamination cycle and especially the successive requalifications may vary as a function of not only the installation to be qualified and the sterilizing agent and generator used, but also as a function of the type of biological indicators used. In this article the manufacture and control of biological indicators are analyzed using the hazard analysis and critical control point (HACCP) approach. The HACCP risk analysis, which must take into account the application of the isolator being qualified or requalified, is an efficient simplification tool for performing a decontamination cycle using either hydrogen peroxide gas or peracetic acid in a reliable, economical, and reproducible way.
Satellite-driven modeling approach for monitoring lava flow hazards during the 2017 Etna eruption
NASA Astrophysics Data System (ADS)
Del Negro, C.; Bilotta, G.; Cappello, A.; Ganci, G.; Herault, A.; Zago, V.
2017-12-01
The integration of satellite data and modeling represents an efficient strategy that may provide immediate answers to the main issues raised at the onset of a new effusive eruption. Satellite-based thermal remote sensing of hotspots related to effusive activity can effectively provide a variety of products suited to timing, locating, and tracking the radiant character of lava flows. Hotspots show the location and occurrence of eruptive events (vents). Discharge rate estimates may indicate the current intensity (effusion rate) and potential magnitude (volume). High-spatial resolution multispectral satellite data can complement field observations for monitoring the front position (length) and extension of flows (area). Physics-based models driven, or validated, by satellite-derived parameters are now capable of fast and accurate forecasts of lava flow inundation scenarios (hazard). Here, we demonstrate the potential of the integrated application of satellite remote-sensing techniques and lava flow models during the 2017 effusive eruption at Mount Etna in Italy. This combined approach provided insights into lava flow field evolution by supplying detailed views of flow field construction (e.g., the opening of ephemeral vents) that were useful for more accurate and reliable forecasts of eruptive activity. Moreover, we gave a detailed chronology of the lava flow activity based on field observations and satellite images, assessed the potential extent of impacted areas, mapped the evolution of the lava flow field, and executed hazard projections. The downside of this combination is the high sensitivity of lava flow inundation scenarios to uncertainties in vent location, discharge rate, and other parameters, which can make interpreting hazard forecasts difficult during an effusive crisis. However, such integration at last makes timely forecasts of lava flow hazards during effusive crises possible at the great majority of volcanoes for which no monitoring exists.
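One way the satellite-to-model link is commonly closed is by converting the radiative power of the hotspots into a time-averaged discharge rate through an empirically calibrated coefficient, which then drives the flow model. The sketch below assumes that approach with placeholder numbers; the coefficient, power values, and time steps are illustrative, not measurements from the 2017 eruption.

```python
import numpy as np

vrp_watts = np.array([1.2e9, 2.8e9, 1.9e9, 0.7e9])       # volcanic radiative power per satellite scene
dt_s      = np.array([3600.0, 3600.0, 3600.0, 3600.0])   # time between scenes, seconds

c_rad = 2.0e8   # J/m^3, assumed "radiant density" coefficient for an Etna-like basalt

tadr   = vrp_watts / c_rad       # time-averaged discharge rate per scene, m^3/s
volume = np.sum(tadr * dt_s)     # erupted lava volume over the observation window, m^3
print(tadr, volume)
```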
DOE Office of Scientific and Technical Information (OSTI.GOV)
Goodman, Julie, E-mail: jgoodman@gradientcorp.com
Background: The International Agency for Research on Cancer (IARC) recently developed a framework for evaluating mechanistic evidence that includes a list of 10 key characteristics of carcinogens. This framework is useful for identifying and organizing large bodies of literature on carcinogenic mechanisms, but it lacks sufficient guidance for conducting evaluations that fully integrate mechanistic evidence into hazard assessments. Objectives: We summarize the framework, and suggest approaches to strengthen the evaluation of mechanistic evidence using this framework. Discussion: While the framework is useful for organizing mechanistic evidence, its lack of guidance for implementation limits its utility for understanding human carcinogenic potential. Specifically, it does not include explicit guidance for evaluating the biological significance of mechanistic endpoints, inter- and intra-individual variability, or study quality and relevance. It also does not explicitly address how mechanistic evidence should be integrated with other realms of evidence. Because mechanistic evidence is critical to understanding human cancer hazards, we recommend that IARC develop transparent and systematic guidelines for the use of this framework so that mechanistic evidence will be evaluated and integrated in a robust manner, and concurrently with other realms of evidence, to reach a final human cancer hazard conclusion. Conclusions: IARC does not currently provide a standardized approach to evaluating mechanistic evidence. Incorporating the recommendations discussed here will make IARC analyses of mechanistic evidence more transparent, and lead to assessments of cancer hazards that reflect the weight of the scientific evidence and allow for scientifically defensible decision-making. - Highlights: • IARC has a revised framework for evaluating literature on carcinogenic mechanisms. • The framework is based on 10 key characteristics of carcinogens. • IARC should develop transparent and systematic guidelines for using the framework. • It should better address biological significance, study quality, and relevance. • It should better address integrating mechanistic evidence with other evidence.
NASA Technical Reports Server (NTRS)
Crain, Timothy P.; Bishop, Robert H.; Carson, John M., III; Trawny, Nikolas; Hanak, Chad; Sullivan, Jacob; Christian, John; DeMars, Kyle; Campbell, Tom; Getchius, Joel
2016-01-01
The Morpheus Project began in late 2009 as an ambitious effort code-named Project M to integrate three ongoing multi-center NASA technology developments: humanoid robotics, liquid oxygen/liquid methane (LOX/LCH4) propulsion and Autonomous Precision Landing and Hazard Avoidance Technology (ALHAT) into a single engineering demonstration mission to be flown to the Moon by 2013. The humanoid robot effort was redirected to a deployment of Robonaut 2 on the International Space Station in February of 2011, while Morpheus continued as a terrestrial field test project integrating the existing ALHAT Project's technologies into a sub-orbital flight system using the world's first LOX/LCH4 main propulsion and reaction control system fed from the same blowdown tanks. A series of 33 tethered tests with the Morpheus 1.0 vehicle and Morpheus 1.5 vehicle were conducted from April 2011 - December 2013 before successful, sustained free flights with the primary Vertical Testbed (VTB) navigation configuration began with Free Flight 3 on December 10, 2013. Over the course of the following 12 free flights and 3 tethered flights, components of the ALHAT navigation system were integrated into the Morpheus vehicle, operations, and flight control loop. The ALHAT navigation system was integrated and run concurrently with the VTB navigation system as a reference and fail-safe option in flight (see touchdown position estimate comparisons in Fig. 1). Flight testing completed with Free Flight 15 on December 15, 2014 with a completely autonomous Hazard Detection and Avoidance (HDA), integration of surface relative and Hazard Relative Navigation (HRN) measurements into the onboard dual-state inertial estimator Kalman filter software, and landing within 2 meters of the VTB GPS-based navigation solution at the safe landing site target. This paper describes the Morpheus joint VTB/ALHAT navigation architecture, the sensors utilized during the terrestrial flight campaign, issues resolved during testing, and the navigation results from the flight tests.
Smoothing spline ANOVA frailty model for recurrent event data.
Du, Pang; Jiang, Yihua; Wang, Yuedong
2011-12-01
Gap time hazard estimation is of particular interest in recurrent event data. This article proposes a fully nonparametric approach for estimating the gap time hazard. Smoothing spline analysis of variance (ANOVA) decompositions are used to model the log gap time hazard as a joint function of gap time and covariates, and general frailty is introduced to account for between-subject heterogeneity and within-subject correlation. We estimate the nonparametric gap time hazard function and parameters in the frailty distribution using a combination of the Newton-Raphson procedure, the stochastic approximation algorithm (SAA), and the Markov chain Monte Carlo (MCMC) method. The convergence of the algorithm is guaranteed by decreasing the step size of parameter update and/or increasing the MCMC sample size along iterations. Model selection procedure is also developed to identify negligible components in a functional ANOVA decomposition of the log gap time hazard. We evaluate the proposed methods with simulation studies and illustrate its use through the analysis of bladder tumor data. © 2011, The International Biometric Society.
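The estimator above operates on gap times between successive events. As a hedged illustration only, and not the paper's smoothing spline ANOVA frailty estimator, the sketch below shows how gap times can be derived from recurrent-event records and how a nonparametric cumulative hazard of those gap times can be obtained; the column names, toy data, and the use of the lifelines Nelson-Aalen estimator are assumptions made for illustration.

```python
# Sketch: extract gap times from recurrent-event records and estimate a
# nonparametric cumulative hazard for them. This is NOT the smoothing spline
# ANOVA frailty estimator described above -- only an illustration of the
# gap-time data structure it operates on. Column names are hypothetical.
import pandas as pd
from lifelines import NelsonAalenFitter

# Hypothetical recurrent-event data: one row per observed event (or censoring)
events = pd.DataFrame({
    "subject":    [1, 1, 1, 2, 2, 3],
    "event_time": [5.0, 12.0, 20.0, 8.0, 15.0, 9.0],   # time since study entry
    "observed":   [1, 1, 0, 1, 0, 0],                  # 0 = censored
})

# Gap time = time since the subject's previous event (study entry counts as time 0)
events = events.sort_values(["subject", "event_time"])
events["gap"] = events.groupby("subject")["event_time"].diff().fillna(events["event_time"])

# Nonparametric (Nelson-Aalen) cumulative hazard of the gap times,
# ignoring frailty and covariates for simplicity
naf = NelsonAalenFitter()
naf.fit(events["gap"], event_observed=events["observed"])
print(naf.cumulative_hazard_.head())
```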
Special Issue "Natural Hazards' Impact on Urban Areas and Infrastructure" in Natural Hazards
NASA Astrophysics Data System (ADS)
Bostenaru Dan, M.
2009-04-01
In 2006 and 2007, at the 3rd and 4th General Assembly of the European Geosciences Union respectively, the session on "Natural Hazards' Impact on Urban Areas and Infrastructure" was convened by Maria Bostenaru Dan, then at the Istituto Universitario di Studi Superiori di Pavia, ROSE School, Italy, who conducts research on earthquake management, and Heidi Kreibich from the GFZ Potsdam, Germany, who conducts research on flood hazards; in 2007 the session was also co-convened by Agostino Goretti from the Civil Protection in Rome, Italy. The session initially started from an idea of Friedemann Wenzel from the Universität Karlsruhe (TH), Germany, the former speaker of the SFB 461 "Strong earthquakes", the university where Maria Bostenaru also graduated and worked and which runs, together with the GFZ Potsdam, CEDIM, the Center for Disaster Management and Risk Reduction Technology. Selected papers from these two sessions as well as invited papers from other specialists were gathered for a special issue to be published in the journal "Natural Hazards" under the guest editorship of Heidi Kreibich and Maria Bostenaru Dan. Unlike the former special issue, this one contains a well-balanced mixture of many hazards: climate change, floods, mountain hazards like avalanches, volcanoes, and earthquakes. The aim of the issue was to enlarge the prospects for co-operation between geosciences and other professions in the field of natural hazards. Earthquake engineering and engineering seismology co-operate more and more frequently, but in the field of natural hazards there is also a need to co-operate with urban planners and, looking to the future, in the field of integrated conservation, which implies co-operation between architecture and urban planning for the preservation of our environment. Integrated conservation has been stipulated since the 1970s, the years when participatory approaches, and with them the involvement of the social sciences, started.
Hazardous Drinking and Military Community Functioning: Identifying Mediating Risk Factors
ERIC Educational Resources Information Center
Foran, Heather M.; Heyman, Richard E.; Slep, Amy M. Smith
2011-01-01
Objective: Hazardous drinking is a serious societal concern in military populations. Efforts to reduce hazardous drinking among military personnel have been limited in effectiveness. There is a need for a deeper understanding of how community-based prevention models apply to hazardous drinking in the military. Community-wide prevention efforts may…
49 CFR 171.1 - Applicability of Hazardous Materials Regulations (HMR) to persons and functions.
Code of Federal Regulations, 2013 CFR
2013-10-01
... package or container or on a package or container containing a residue of a hazardous material. (5... bracing a hazardous materials package in a freight container or transport vehicle. (13) Segregating a hazardous materials package in a freight container or transport vehicle from incompatible cargo. (14...
49 CFR 171.1 - Applicability of Hazardous Materials Regulations (HMR) to persons and functions.
Code of Federal Regulations, 2011 CFR
2011-10-01
... package or container or on a package or container containing a residue of a hazardous material. (5... bracing a hazardous materials package in a freight container or transport vehicle. (13) Segregating a hazardous materials package in a freight container or transport vehicle from incompatible cargo. (14...
49 CFR 171.1 - Applicability of Hazardous Materials Regulations (HMR) to persons and functions.
Code of Federal Regulations, 2010 CFR
2010-10-01
... package or container or on a package or container containing a residue of a hazardous material. (5... bracing a hazardous materials package in a freight container or transport vehicle. (13) Segregating a hazardous materials package in a freight container or transport vehicle from incompatible cargo. (14...
49 CFR 171.1 - Applicability of Hazardous Materials Regulations (HMR) to persons and functions.
Code of Federal Regulations, 2012 CFR
2012-10-01
... package or container or on a package or container containing a residue of a hazardous material. (5... bracing a hazardous materials package in a freight container or transport vehicle. (13) Segregating a hazardous materials package in a freight container or transport vehicle from incompatible cargo. (14...
Hazard recognition in mining: A psychological perspective. Information circular/1995
DOE Office of Scientific and Technical Information (OSTI.GOV)
Perdue, C.W.; Kowalski, K.M.; Barrett, E.A.
1995-07-01
This U.S. Bureau of Mines report considers, from a psychological perspective, the perceptual process by which miners recognize and respond to mining hazards. It proposes that if the hazard recognition skills of miners can be improved, mining accidents may be reduced to a significant degree. Prior studies of hazard perception in mining are considered, as are relevant studies from investigations of military target identification, pilot and gunnery officer training, transportation safety, automobile operator behavior, as well as research into sensory functioning and visual information processing. A general model of hazard perception is introduced, and selected concepts from the psychology of perception that are applicable to the detection of mining hazards are reviewed. Hazard recognition is discussed as a function of the perceptual cues available to the miner as well as the cognitive resources and strategies employed by the miner. The development of expertise in responding to hazards is related to individual differences in the experience, aptitude, and personality of the worker. Potential applications to miner safety and training are presented.
Integrating Entropy-Based Naïve Bayes and GIS for Spatial Evaluation of Flood Hazard.
Liu, Rui; Chen, Yun; Wu, Jianping; Gao, Lei; Barrett, Damian; Xu, Tingbao; Li, Xiaojuan; Li, Linyi; Huang, Chang; Yu, Jia
2017-04-01
Regional flood risk caused by intensive rainfall under extreme climate conditions has increasingly attracted global attention. Mapping and evaluation of flood hazard are vital parts in flood risk assessment. This study develops an integrated framework for estimating spatial likelihood of flood hazard by coupling weighted naïve Bayes (WNB), geographic information system, and remote sensing. The north part of Fitzroy River Basin in Queensland, Australia, was selected as a case study site. The environmental indices, including extreme rainfall, evapotranspiration, net-water index, soil water retention, elevation, slope, drainage proximity, and density, were generated from spatial data representing climate, soil, vegetation, hydrology, and topography. These indices were weighted using the statistics-based entropy method. The weighted indices were input into the WNB-based model to delineate a regional flood risk map that indicates the likelihood of flood occurrence. The resultant map was validated by the maximum inundation extent extracted from moderate resolution imaging spectroradiometer (MODIS) imagery. The evaluation results, including mapping and evaluation of the distribution of flood hazard, are helpful in guiding flood inundation disaster responses for the region. The novel approach presented consists of weighted grid data, image-based sampling and validation, cell-by-cell probability inferring and spatial mapping. It is superior to an existing spatial naive Bayes (NB) method for regional flood hazard assessment. It can also be extended to other likelihood-related environmental hazard studies. © 2016 Society for Risk Analysis.
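The entropy weighting step described above can be illustrated with a minimal sketch. The following fragment computes statistics-based entropy weights for a handful of normalized environmental indices; the index values are invented and the snippet does not reproduce the full WNB-GIS-remote sensing workflow of the study.

```python
# Minimal sketch of the entropy weight method used to weight environmental
# indices before the naive Bayes step. Index names and values are illustrative.
import numpy as np

# Rows = grid cells (samples), columns = indices (e.g. rainfall, slope, NDWI)
X = np.array([
    [120.0, 3.1, 0.42],
    [ 95.0, 1.8, 0.55],
    [140.0, 4.0, 0.31],
    [110.0, 2.2, 0.47],
])

# 1. Min-max normalise each index to [0, 1], then convert to proportions
Xn = (X - X.min(axis=0)) / (X.max(axis=0) - X.min(axis=0))
P = Xn / Xn.sum(axis=0)

# 2. Shannon entropy of each index (0 * log 0 treated as 0)
n = X.shape[0]
logP = np.where(P > 0, np.log(P, where=(P > 0)), 0.0)
E = -np.sum(P * logP, axis=0) / np.log(n)

# 3. Degree of diversification -> normalised weights
d = 1.0 - E
weights = d / d.sum()
print(weights)  # higher weight = index discriminates more between cells
```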
Akazawa, K; Nakamura, T; Moriguchi, S; Shimada, M; Nose, Y
1991-07-01
Small sample properties of the maximum partial likelihood estimates for Cox's proportional hazards model depend on the sample size, the true values of regression coefficients, covariate structure, censoring pattern and possibly baseline hazard functions. Therefore, it would be difficult to construct a formula or table to calculate the exact power of a statistical test for the treatment effect in any specific clinical trial. The simulation program, written in SAS/IML, described in this paper uses Monte-Carlo methods to provide estimates of the exact power for Cox's proportional hazards model. For illustrative purposes, the program was applied to real data obtained from a clinical trial performed in Japan. Since the program does not assume any specific function for the baseline hazard, it is, in principle, applicable to any censored survival data as long as they follow Cox's proportional hazards model.
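The program described above is written in SAS/IML; as a hedged analogue only, the sketch below estimates the empirical power of the Cox partial-likelihood test by Monte-Carlo simulation in Python, assuming an exponential baseline hazard, a single binary treatment covariate, administrative censoring, and the lifelines CoxPHFitter. Sample size, effect size, and rates are illustrative.

```python
# Hedged Python analogue of the Monte-Carlo power calculation described above
# (the original program is in SAS/IML). Exponential baseline hazard, a single
# binary treatment covariate, and administrative censoring are assumed.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(42)

def simulate_power(n=100, beta=0.5, base_rate=0.1, follow_up=20.0,
                   alpha=0.05, n_sim=500):
    rejections = 0
    for _ in range(n_sim):
        treat = rng.integers(0, 2, size=n)
        # Proportional hazards: rate_i = base_rate * exp(beta * treat_i)
        t = rng.exponential(1.0 / (base_rate * np.exp(beta * treat)))
        event = (t <= follow_up).astype(int)      # administrative censoring
        time = np.minimum(t, follow_up)
        df = pd.DataFrame({"time": time, "event": event, "treat": treat})
        cph = CoxPHFitter().fit(df, duration_col="time", event_col="event")
        if cph.summary.loc["treat", "p"] < alpha:
            rejections += 1
    return rejections / n_sim

print(f"Estimated power: {simulate_power():.2f}")
```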
Landslide susceptibility and risk assessment: specificities for road networks
NASA Astrophysics Data System (ADS)
Pellicani, Roberta; Argentiero, Ilenia; Parisi, Alessandro; Spilotro, Giuseppe
2017-04-01
A regional-scale assessment of landslide susceptibility and risk along the main road corridors crossing the provincial territory of Matera (Basilicata Region, Southern Italy) was carried out. The provincial road network extends for about 1,320 km and represents the main connection infrastructure among thirty-one municipalities, owing to the lack of an efficient integrated transportation system across the whole regional territory. For this reason, the strategic importance of these roads lies in their uniqueness in connecting every urban center with the surrounding socio-economic context. These roads and their vehicular traffic are continuously exposed to instability processes (about 40% of the total length is disrupted by landslides), characterized both by high intensity and low frequency and by low intensity and high frequency. The latter typology, consisting of small shallow landslides, is particularly hazardous for the roads since it is widespread along the road network, its occurrence is connected to rainfall, and it determines high vulnerability conditions for the road in terms of interruption of vehicular traffic. A GIS-based heuristic-bivariate statistical predictive model was used to assess and map the landslide susceptibility in the study area, using a polynomial function of eight predisposing factors weighted according to their influence on the landslide phenomena recognized and collected in an inventory. Susceptibility associated with small shallow phenomena was assessed using a polynomial function of specific factors, such as slope angle and aspect, lithological outcrops, rainfall, etc. In the absence of detailed input data, the spatial distribution of landslide risk along the road corridors was assessed and mapped using a qualitative hazard-consequence matrix approach, in which risk is obtained by combining hazard categories with consequence classes pairwise in a two-dimensional table or matrix (see the sketch below). Landslide hazard, which is a function of the return period, was evaluated, due to the lack of temporal data, as a function of landslide intensity (velocity and areal extent) and susceptibility. The direct consequences of instability on the roads were defined by combining exposure and vulnerability in a matrix. Exposure was evaluated in terms of the amount of traffic, calculated along each road stretch connecting two or more urban areas as a function of the average population of each center. Vulnerability, which expresses the degree of damage, was assessed as a function of the presence of criticalities along roads, ranked according to the severity of damages and the type of reparation works performed. The consequences, combined with the hazard levels, allowed the landslide risk to be assessed and classified into low, medium and high levels. The risk map highlighted that about 30% (392 km) of the examined road corridors is affected by high risk levels. The comparison between the risk map and the landslide inventory along the roads also revealed that 49.5% of landslides affect sections where the risk was evaluated as high. The obtained risk classification of the roads represents a support for decision making and allows the identification of priorities for designing appropriate landslide mitigation plans.
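The qualitative hazard-consequence matrix step can be sketched as a simple two-dimensional lookup. The class labels and matrix entries below are illustrative assumptions, not the ones used for the Matera road network.

```python
# Minimal sketch of the qualitative hazard-consequence matrix approach:
# risk is read from a two-dimensional table indexed by hazard and consequence
# classes. The class labels and the matrix entries are illustrative only.
HAZARD_CLASSES = ["low", "medium", "high"]
CONSEQUENCE_CLASSES = ["minor", "moderate", "severe"]

RISK_MATRIX = {
    ("low",    "minor"):    "low",    ("low",    "moderate"): "low",
    ("low",    "severe"):   "medium", ("medium", "minor"):    "low",
    ("medium", "moderate"): "medium", ("medium", "severe"):   "high",
    ("high",   "minor"):    "medium", ("high",   "moderate"): "high",
    ("high",   "severe"):   "high",
}

def landslide_risk(hazard: str, consequence: str) -> str:
    """Combine a hazard class with a consequence class pairwise."""
    return RISK_MATRIX[(hazard, consequence)]

# Example: a road stretch with high susceptibility/intensity but low traffic
print(landslide_risk("high", "minor"))  # -> "medium"
```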
NASA Astrophysics Data System (ADS)
Wang, Rui; Zhang, Jiquan; Guo, Enliang; Alu, Si; Li, Danjun; Ha, Si; Dong, Zhenhua
2018-02-01
Along with global warming, drought disasters are occurring more frequently and are seriously affecting normal life and food security in China. Drought risk assessments are necessary to provide support for local governments. This study aimed to establish an integrated drought risk model based on the relation curve of drought joint probabilities and drought losses of multi-hazard-affected bodies. First, drought characteristics, including duration and severity, were classified using the 1953-2010 precipitation anomaly in the Taoerhe Basin based on run theory, and their marginal distributions were identified by exponential and Gamma distributions, respectively. Then, drought duration and severity were related to construct a joint probability distribution based on the copula function. We used the EPIC (Environmental Policy Integrated Climate) model to simulate maize yield and historical data to calculate the loss rates of agriculture, industry, and animal husbandry in the study area. Next, we constructed vulnerability curves. Finally, the spatial distributions of drought risk for 10-, 20-, and 50-year return periods were expressed using inverse distance weighting. Our results indicate that the spatial distributions of the three return periods are consistent. The highest drought risk is in Ulanhot, and the duration and severity there were both highest. This means that higher drought risk corresponds to longer drought duration and larger drought severity, thus providing useful information for drought and water resource management. For 10-, 20-, and 50-year return periods, the drought risk values ranged from 0.41 to 0.53, 0.45 to 0.59, and 0.50 to 0.67, respectively. Therefore, when the return period increases, the drought risk increases.
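A minimal sketch of the joint-probability step follows: exponential and gamma marginals for drought duration and severity are fitted and coupled through a copula, from which a joint return period can be read. The Clayton family, its parameter, the event frequency, and the toy data are assumptions made for illustration; the study does not specify these values here.

```python
# Hedged sketch of the joint-probability step: exponential and gamma marginals
# for drought duration and severity, coupled with a Clayton copula. The copula
# family and its parameter are illustrative assumptions.
import numpy as np
from scipy import stats

# Hypothetical drought events extracted by run theory (duration in months,
# severity as an accumulated precipitation-anomaly deficit)
duration = np.array([2, 3, 1, 5, 4, 2, 6, 3, 1, 4], dtype=float)
severity = np.array([1.1, 2.0, 0.4, 3.5, 2.8, 0.9, 4.2, 1.7, 0.3, 2.5])

# Marginal distributions as in the abstract: exponential and gamma
loc_d, scale_d = stats.expon.fit(duration, floc=0)
a_s, loc_s, scale_s = stats.gamma.fit(severity, floc=0)

def clayton_cdf(u, v, theta=2.0):
    """Clayton copula C(u, v) = (u^-theta + v^-theta - 1)^(-1/theta)."""
    return (u ** -theta + v ** -theta - 1.0) ** (-1.0 / theta)

def joint_return_period(d, s, events_per_year=0.8, theta=2.0):
    """Return period (years) of a drought with duration >= d AND severity >= s."""
    u = stats.expon.cdf(d, loc=0, scale=scale_d)
    v = stats.gamma.cdf(s, a_s, loc=0, scale=scale_s)
    # P(D > d, S > s) via the survival copula: 1 - u - v + C(u, v)
    p_exceed = 1.0 - u - v + clayton_cdf(u, v, theta)
    return 1.0 / (events_per_year * p_exceed)

print(f"T(d>=4, s>=2.5) ~ {joint_return_period(4, 2.5):.1f} years")
```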
Capturing Essential Information to Achieve Safe Interoperability
Weininger, Sandy; Jaffe, Michael B.; Rausch, Tracy; Goldman, Julian M.
2016-01-01
In this article we describe the role of “clinical scenario” information to assure the safety of interoperable systems, as well as the system’s ability to deliver the requisite clinical functionality to improve clinical care. Described are methods and rationale for capturing the clinical needs, workflow, hazards, and device interactions in the clinical environment. Key user (clinician and clinical engineer) needs and system requirements can be derived from this information, therefore improving the communication from clinicians to medical device and information technology system developers. This methodology is intended to assist the health care community, including researchers, standards developers, regulators, and manufacturers, by providing clinical definition to support requirements in the systems engineering process, particularly those focusing on development of Integrated Clinical Environments described in standard ASTM F2761. Our focus is on identifying and documenting relevant interactions and medical device capabilities within the system using a documentation tool called medical device interface data sheets (MDIDS) and mitigating hazardous situations related to workflow, product usability, data integration, and the lack of effective medical device-health information technology system integration to achieve safe interoperability. Portions of the analysis of a clinical scenario for a “Patient-controlled analgesia safety interlock” are provided to illustrate the method. Collecting better clinical adverse event information and proposed solutions can help identify opportunities to improve current device capabilities and interoperability and support a Learning Health System to improve health care delivery. Developing and analyzing clinical scenarios are the first steps in creating solutions to address vexing patient safety problems and enable clinical innovation. A web-based research tool for implementing a means of acquiring and managing this information, the Clinical Scenario Repository™, is described. PMID:27387840
Capturing Essential Information to Achieve Safe Interoperability.
Weininger, Sandy; Jaffe, Michael B; Rausch, Tracy; Goldman, Julian M
2017-01-01
In this article, we describe the role of "clinical scenario" information to assure the safety of interoperable systems, as well as the system's ability to deliver the requisite clinical functionality to improve clinical care. Described are methods and rationale for capturing the clinical needs, workflow, hazards, and device interactions in the clinical environment. Key user (clinician and clinical engineer) needs and system requirements can be derived from this information, therefore, improving the communication from clinicians to medical device and information technology system developers. This methodology is intended to assist the health care community, including researchers, standards developers, regulators, and manufacturers, by providing clinical definition to support requirements in the systems engineering process, particularly those focusing on development of Integrated Clinical Environments described in standard ASTM F2761. Our focus is on identifying and documenting relevant interactions and medical device capabilities within the system using a documentation tool called medical device interface data sheets and mitigating hazardous situations related to workflow, product usability, data integration, and the lack of effective medical device-health information technology system integration to achieve safe interoperability. Portions of the analysis of a clinical scenario for a "patient-controlled analgesia safety interlock" are provided to illustrate the method. Collecting better clinical adverse event information and proposed solutions can help identify opportunities to improve current device capabilities and interoperability and support a learning health system to improve health care delivery. Developing and analyzing clinical scenarios are the first steps in creating solutions to address vexing patient safety problems and enable clinical innovation. A Web-based research tool for implementing a means of acquiring and managing this information, the Clinical Scenario Repository™ (MD PnP Program), is described.
IRIS Toxicological Review of Dichloromethane (Methylene ...
EPA is conducting a peer review and public comment of the scientific basis supporting the human health hazard and dose-response assessment of Dichloromethane that, when finalized, will appear on the Integrated Risk Information System (IRIS) database. The draft Toxicological Review of Dichloromethane provides scientific support and rationale for the hazard and dose-response assessment pertaining to chronic exposure to dichloromethane.
HAZARDS SUMMARY REPORT FOR A TWO WATT PROMETHIUM-147 FUELED THERMOELECTRIC GENERATOR
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
1959-06-01
Discussions are included of the APU design, vehicle integration, Pm-147 properties, shielding requirements, hazards design criteria, statistical analysis for impact, and radiation protection. The use of Pm-147 makes possible the fabrication of an auxiliary power unit which has applications for low power space missions of <10 watts (electrical). (B.O.G.)
Audits are an important and integral part of the EPA Hazardous Waste Engineering Research Laboratory (HWERL) Quality Assurance (QA) Program. As part of the overall QA program, audits are used to determine contractor compliance with quality assurance plans and to assess the overal...
The Mediterranean Supersite Volcanoes (MED-SUV) Project: an overview
NASA Astrophysics Data System (ADS)
Puglisi, Giuseppe
2013-04-01
In response to the EC call ENV.2012.6.4-2 (Long-term monitoring experiments in geologically active regions of Europe prone to natural hazards: the Supersite concept - FP7-ENV-2012-two-stage) a wide community of volcanological institutions proposed the project Mediterranean Supersite Volcanoes (MED-SUV), which is in the negotiation phase at the time of writing. The Consortium is composed of 18 European universities and research institutes, four Small or Medium Enterprises (SME) and two non-European universities and research institutes. MED-SUV will improve the consortium's capacity for assessment of volcanic hazards in Supersites of Southern Italy by optimising and integrating existing and new observation/monitoring systems, by a breakthrough in understanding of volcanic processes and by increasing the effectiveness of the coordination between the scientific and end-user communities. More than 3 million people are exposed to potential volcanic hazards in a large region in the Mediterranean Sea, where two of the largest European volcanic areas are located: Mt. Etna and Campi Flegrei/Vesuvius. This project will fully exploit the unique detailed long-term in-situ monitoring data sets available for these volcanoes and integrate them with Earth Observation (EO) data, setting the basic tools for a significant step ahead in the discrimination of pre-, syn- and post-eruptive phases. The wide range of styles and intensities of volcanic phenomena observed on these volcanoes, which can be assumed as archetypes of 'closed conduit' and 'open conduit' volcanoes, together with the long-term multidisciplinary data sets, gives an exceptional opportunity to improve the understanding of a very wide spectrum of geo-hazards, as well as to implement and test a large variety of innovative models of ground deformation and motion. Important impacts on the European industrial sector are expected, arising from a partnership integrating the scientific community and SMEs to implement together new observation/monitoring sensors/systems. Specific experiments and studies will be carried out to improve our understanding of the volcanic internal structure and dynamics, as well as to recognise signals related to impending unrest or eruption. Quantitative hazard assessment will benefit from the outcomes of these studies and from their integration into cutting-edge monitoring approaches, thus leading to a step-change in hazard awareness and preparedness and leveraging the close relationship between scientists, SMEs, and end-users.
Modeling landslide recurrence in Seattle, Washington, USA
Salciarini, Diana; Godt, Jonathan W.; Savage, William Z.; Baum, Rex L.; Conversini, Pietro
2008-01-01
To manage the hazard associated with shallow landslides, decision makers need an understanding of where and when landslides may occur. A variety of approaches have been used to estimate the hazard from shallow, rainfall-triggered landslides, such as empirical rainfall threshold methods or probabilistic methods based on historical records. The wide availability of Geographic Information Systems (GIS) and digital topographic data has led to the development of analytic methods for landslide hazard estimation that couple steady-state hydrological models with slope stability calculations. Because these methods typically neglect the transient effects of infiltration on slope stability, results cannot be linked with historical or forecasted rainfall sequences. Estimates of the frequency of conditions likely to cause landslides are critical for quantitative risk and hazard assessments. We present results to demonstrate how a transient infiltration model coupled with an infinite slope stability calculation may be used to assess shallow landslide frequency in the City of Seattle, Washington, USA. A module called CRF (Critical RainFall) for estimating deterministic rainfall thresholds has been integrated in the TRIGRS (Transient Rainfall Infiltration and Grid-based Slope-Stability) model that combines a transient, one-dimensional analytic solution for pore-pressure response to rainfall infiltration with an infinite slope stability calculation. Input data for the extended model include topographic slope, colluvial thickness, initial water-table depth, material properties, and rainfall durations. This approach is combined with a statistical treatment of rainfall using a GEV (General Extreme Value) probabilistic distribution to produce maps showing the shallow landslide recurrence induced, on a spatially distributed basis, as a function of rainfall duration and hillslope characteristics.
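The statistical treatment of rainfall can be sketched as follows: a GEV distribution is fitted to annual maximum rainfall and the deterministic critical rainfall from the stability analysis is converted into a recurrence interval. The rainfall values and the 24-hour duration are illustrative assumptions, not values from the Seattle study.

```python
# Hedged sketch of the statistical step: fit a GEV distribution to annual
# maximum rainfall and convert a deterministic critical rainfall (from the
# slope-stability analysis) into a recurrence interval. Values are illustrative.
import numpy as np
from scipy.stats import genextreme

# Hypothetical annual maximum 24-h rainfall totals (mm) for one grid cell
annual_max = np.array([48, 62, 55, 71, 80, 66, 59, 90, 74, 68,
                       52, 85, 63, 77, 95, 58, 70, 82, 61, 73], dtype=float)

shape, loc, scale = genextreme.fit(annual_max)

def recurrence_interval(critical_rainfall_mm: float) -> float:
    """Average recurrence interval (years) of rainfall exceeding the threshold."""
    p_exceed = genextreme.sf(critical_rainfall_mm, shape, loc=loc, scale=scale)
    return 1.0 / p_exceed

# Critical rainfall for instability of this cell, e.g. from the CRF/TRIGRS output
print(f"Recurrence of >85 mm/24h: {recurrence_interval(85.0):.1f} years")
```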
[Assessment of eco-environmental vulnerability of Hainan Island, China].
Huang, Bao-rong; Ouyang, Zhi-yun; Zhang, Hui-zhi; Zhang, Li-hua; Zheng, Hua
2009-03-01
Based on the assessment method of environmental vulnerability developed by SOPAC and UNEP, this paper constructed an indicator system from three sub-themes, hazard, resistance, and damage, to assess the eco-environmental vulnerability of Hainan Island. The results showed that Hainan Island was suffering a middling level of eco-environmental hazard, and the main hazards came from intensive human activities such as intensive agriculture, mass tourism, mining, and the large amount of solid waste discarded by islanders and tourists. Geographical characteristics such as a larger land area, a larger altitude range, an integrated geographical form, and abundant habitat types endowed Hainan Island with a higher resistance to environmental hazards. However, disturbed by historically accumulated man-made and natural hazards, the island ecosystem showed serious ecological damage, such as soil degradation and biodiversity loss. Considering hazard, resistance, damage, and degradation together, the comprehensive environmental vulnerability of the island was at a middling level. Some indicators showed lower vulnerability, but some showed higher vulnerability.
Haas, Jessica R.; Thompson, Matthew P.; Tillery, Anne C.; Scott, Joe H.
2017-01-01
Wildfires can increase the frequency and magnitude of catastrophic debris flows. Integrated, proactive natural hazard assessment would therefore characterize landscapes based on the potential for the occurrence and interactions of wildfires and postwildfire debris flows. This chapter presents a new modeling effort that can quantify the variability surrounding a key input to postwildfire debris-flow modeling, the amount of watershed burned at moderate to high severity, in a prewildfire context. The use of stochastic wildfire simulation captures variability surrounding the timing and location of ignitions, fire weather patterns, and ultimately the spatial patterns of watershed area burned. Model results provide for enhanced estimates of postwildfire debris-flow hazard in a prewildfire context, and multiple hazard metrics are generated to characterize and contrast hazards across watersheds. Results can guide mitigation efforts by allowing planners to identify which factors may be contributing the most to the hazard rankings of watersheds.
Assessing qualitative long-term volcanic hazards at Lanzarote Island (Canary Islands)
NASA Astrophysics Data System (ADS)
Becerril, Laura; Martí, Joan; Bartolini, Stefania; Geyer, Adelina
2017-07-01
Conducting long-term hazard assessment in active volcanic areas is of primary importance for land-use planning and defining emergency plans able to be applied in case of a crisis. A definition of scenario hazard maps helps to mitigate the consequences of future eruptions by anticipating the events that may occur. Lanzarote is an active volcanic island that has hosted the largest (> 1.5 km3 DRE) and longest (6 years) eruption, the Timanfaya eruption (1730-1736), on the Canary Islands in historical times (last 600 years). This eruption brought severe economic losses and forced local people to migrate. In spite of all these facts, no comprehensive hazard assessment or hazard maps have been developed for the island. In this work, we present an integrated long-term volcanic hazard evaluation using a systematic methodology that includes spatial analysis and simulations of the most probable eruptive scenarios.
DOE Office of Scientific and Technical Information (OSTI.GOV)
R. Camp
Over the past four years, the Electrical Safety Program at PPPL has evolved in addressing changing regulatory requirements and lessons learned from accident events, particularly in regards to arc flash hazards and implementing NFPA 70E requirements. This presentation will discuss PPPL's approaches to the areas of electrical hazards evaluation, both shock and arc flash; engineered solutions for hazards mitigation such as remote racking of medium voltage breakers, operational changes for hazards avoidance, targeted personnel training and hazard appropriate personal protective equipment. Practical solutions for nominal voltage identification and zero voltage checks for lockout/tagout will also be covered. Finally, we will review the value of a comprehensive electrical drawing program, employee attitudes expressed as a personal safety work ethic, integrated safety management, and sustained management support for continuous safety improvement.
NASA Astrophysics Data System (ADS)
Khabarov, Nikolay; Huggel, Christian; Obersteiner, Michael; Ramírez, Juan Manuel
2010-05-01
Mountain regions are typically characterized by rugged terrain which is susceptible to different types of landslides during high-intensity precipitation. Landslides account for billions of dollars of damage and many casualties, and are expected to increase in frequency in the future due to a projected increase of precipitation intensity. Early warning systems (EWS) are thought to be a primary tool for related disaster risk reduction and climate change adaptation to extreme climatic events and hydro-meteorological hazards, including landslides. An EWS for hazards such as landslides consists of different components, including environmental monitoring instruments (e.g. rainfall or flow sensors), physical or empirical process models to support decision-making (warnings, evacuation), data and voice communication, organization and logistics-related procedures, and population response. Considering this broad range, EWS are highly complex systems, and it is therefore difficult to understand the effect of the different components and changing conditions on the overall performance, ultimately being expressed as human lives saved or structural damage reduced. In this contribution we present a further development of our approach to assess a landslide EWS in an integral way, both at the system and component level. We utilize a numerical model using 6 hour rainfall data as basic input. A threshold function based on a rainfall-intensity/duration relation was applied as a decision criterion for evacuation. Damage to infrastructure and human lives was defined as a linear function of landslide magnitude, with the magnitude modelled using a power function of landslide frequency. Correct evacuation was assessed with a 'true' reference rainfall dataset versus a dataset of artificially reduced quality imitating the observation system component. Performance of the EWS using these rainfall datasets was expressed in monetary terms (i.e. damage related to false and correct evacuation). We applied this model to a landslide EWS in Colombia that is currently being implemented within a disaster prevention project. We evaluated the EWS against rainfall data with artificially introduced error and computed with multiple model runs the probabilistic damage functions depending on rainfall error. Then we modified the original precipitation pattern to reflect possible climatic changes, e.g. a change in annual precipitation as well as a change in precipitation intensity with annual values remaining constant. We let the EWS model adapt to the changed conditions to function optimally. Our results show that for the same errors in rainfall measurements the system's performance degrades with expected changing climatic conditions. The obtained results suggest that EWS cannot internally adapt to climate change and require exogenous adaptive measures to avoid an increase in overall damage. The model represents a first attempt to integrally simulate and evaluate EWS under future possible climatic pressures. Future work will concentrate on refining model components and spatially explicit climate scenarios.
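A minimal sketch of the evacuation decision criterion follows, assuming a power-law rainfall intensity-duration threshold of the form I = a * D^(-b) evaluated over growing 6-hour accumulation windows; the coefficients and the rainfall series are invented, not the values calibrated for the Colombian site.

```python
# Minimal sketch of the evacuation decision criterion: a rainfall
# intensity-duration threshold of power-law form I = a * D^(-b). The
# coefficients and the 6-hour accumulation series are illustrative only.
import numpy as np

def exceeds_threshold(intensity_mm_h, duration_h, a=15.0, b=0.6):
    """True if mean intensity over the given duration exceeds I = a * D^-b."""
    return intensity_mm_h > a * duration_h ** (-b)

def warn(rain_6h_mm):
    """Issue a warning if any accumulation window crosses the I-D threshold."""
    rain = np.asarray(rain_6h_mm, dtype=float)
    for n_steps in range(1, len(rain) + 1):           # windows of 6, 12, 18, ... hours
        duration = 6.0 * n_steps
        intensity = rain[-n_steps:].sum() / duration  # mean intensity (mm/h)
        if exceeds_threshold(intensity, duration):
            return True
    return False

# Example: last four 6-hour rainfall totals (mm) at the gauge
print(warn([5.0, 22.0, 35.0, 40.0]))   # True/False drives the evacuation decision
```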
Seismic hazard exposure for the Trans-Alaska Pipeline
Cluff, L.S.; Page, R.A.; Slemmons, D.B.; Grouse, C.B.; ,
2003-01-01
The discovery of oil on Alaska's North Slope and the construction of a pipeline to transport that oil across Alaska coincided with the National Environmental Policy Act of 1969 and a destructive Southern California earthquake in 1971 to cause stringent stipulations, state-of-the-art investigations, and innovative design for the pipeline. The magnitude 7.9 earthquake on the Denali fault in November 2002 was remarkably consistent with the design earthquake and fault displacement postulated for the Denali crossing of the Trans-Alaska Pipeline route. The pipeline maintained its integrity, and disaster was averted. Recent probabilistic studies to update previous hazard exposure conclusions suggest continuing pipeline integrity.
Improving education and resources for health care providers.
Paul, M; Welch, L
1993-01-01
Workers and citizens are turning increasingly to the health care system for information about occupational and environmental reproductive hazards, yet most primary care providers and specialists know little about the effects of occupational/environmental toxicants on the reproductive system or how to evaluate and manage patients at potential risk. Although it is unrealistic to expect all clinicians to become experts in this area, practitioners should know how to take a basic screening history, identify patients at potential risk, and make appropriate referrals. At present, occupational and environmental health issues are not well integrated into health professional education in the United States, and clinical information and referral resources pertaining to reproductive hazards are inadequate. In addressing these problems, the conference "Working Group on Health Provider Education and Resources" made several recommendations that are detailed in this report. Short-term goals include enhancement of existing expertise and resources at a regional level and better integration of information on occupational/environmental reproductive hazards into curricula, meetings, and publications of medical and nursing organizations. Longer term goals include development of a comprehensive, single-access information and referral system for clinicians and integration of occupational and environmental medicine into formal health professional education curricula at all levels. PMID:8243391
Screening and Assessment of Young Children.
ERIC Educational Resources Information Center
Friedlander, Bernard Z.
Most language development hazards in infancy and early childhood fall into the categories of auditory impairment, central integrative dysfunction, inadequate environmental support, and peripheral expressive impairment. Existing knowledge and techniques are inadequate to meet the screening and assessment problems of central integrative dysfunction,…
Wood, Nathan J.; Good, James W.
2004-01-01
Earthquakes and tsunamis pose significant threats to Pacific Northwest coastal port and harbor communities. Developing holistic mitigation and preparedness strategies to reduce the potential for loss of life and property damage requires community-wide vulnerability assessments that transcend traditional site-specific analyses. The ability of a geographic information system (GIS) to integrate natural, socioeconomic, and hazards information makes it an ideal assessment tool to support community hazard planning efforts. This article summarizes how GIS was used to assess the vulnerability of an Oregon port and harbor community to earthquake and tsunami hazards, as part of a larger risk-reduction planning initiative. The primary purposes of the GIS were to highlight community vulnerability issues and to identify areas that both are susceptible to hazards and contain valued port and harbor community resources. Results of the GIS analyses can help decision makers with limited mitigation resources set priorities for increasing community resiliency to natural hazards.
Time prediction of failure a type of lamps by using general composite hazard rate model
NASA Astrophysics Data System (ADS)
Riaman; Lesmana, E.; Subartini, B.; Supian, S.
2018-03-01
This paper discusses estimation of a basic survival model in order to obtain the average predicted lamp failure time. The estimate is for a parametric model, the general composite hazard rate model. The random time variable model used as the basis is the exponential distribution model, which has a constant hazard function. In this case, we discuss an example of survival model estimation for a composite hazard function, using an exponential model as its basis. The model is estimated by estimating its parameters through the construction of the survival function and the empirical cumulative distribution function. The model obtained is then used to predict the average failure time for the type of lamp. The data are grouped into several intervals with the average failure value at each interval, and the average failure time of the model is then calculated on each interval; the p-value obtained from the test result is 0.3296.
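As a hedged sketch of the exponential component only, and not the full general composite hazard rate model, the fragment below estimates the constant hazard rate from failure times by maximum likelihood, derives the mean failure time, and compares the empirical and fitted survival functions; the failure times are invented.

```python
# Hedged sketch of the exponential baseline step: estimate the constant hazard
# rate from failure data and obtain the mean failure time (1/lambda).
# The failure times below are illustrative, e.g. interval midpoints of grouped data.
import numpy as np

failure_times = np.array([310, 450, 520, 610, 700, 820, 940, 1100, 1250, 1400], float)

# MLE for an exponential model: lambda_hat = n / sum(t_i)
lam_hat = len(failure_times) / failure_times.sum()
mean_failure_time = 1.0 / lam_hat

# Empirical vs fitted survival function at a few times
t_grid = np.array([300.0, 600.0, 900.0, 1200.0])
S_emp = np.array([(failure_times > t).mean() for t in t_grid])
S_fit = np.exp(-lam_hat * t_grid)

print(f"lambda = {lam_hat:.5f} /h, mean failure time = {mean_failure_time:.0f} h")
print("empirical S(t):", S_emp)
print("fitted    S(t):", S_fit)
```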
1999-01-01
The past decade has seen rapid expansion in aquaculture production. In the fisheries sector, as in animal production, farming is replacing hunting as the primary food production strategy. In the future, farmed fish will be an even more important source of protein foods than they are today, and the safety for human consumption of products from aquaculture is of public health significance. This is the report of a Study Group that considered food safety issues associated with farmed finfish and crustaceans. The principal conclusion was that an integrated approach--involving close collaboration between the aquaculture, agriculture, food safety, health and education sectors--is needed to identify and control hazards associated with products from aquaculture. Food safety assurance should be included in fish farm management and form an integral part of the farm-to-table food safety continuum. Where appropriate, measures should be based on Hazard Analysis and Critical Control Point (HACCP) methods; however, difficulties in applying HACCP principles to small-scale farming systems were recognized. Food safety hazards associated with products from aquaculture differ according to region, habitat and environmental conditions, as well as methods of production and management. Lack of awareness of hazards can hinder risk assessment and the application of risk management strategies to aquaculture production, and education is therefore needed. Chemical and biological hazards that should be taken into account in public health policies concerning products from aquaculture are discussed in this report, which should be of use to policy-makers and public health officials. The report will also assist fish farmers to identify hazards and develop appropriate hazard-control strategies.
Integrate urban‐scale seismic hazard analyses with the U.S. National Seismic Hazard Model
Moschetti, Morgan P.; Luco, Nicolas; Frankel, Arthur; Petersen, Mark D.; Aagaard, Brad T.; Baltay, Annemarie S.; Blanpied, Michael; Boyd, Oliver; Briggs, Richard; Gold, Ryan D.; Graves, Robert; Hartzell, Stephen; Rezaeian, Sanaz; Stephenson, William J.; Wald, David J.; Williams, Robert A.; Withers, Kyle
2018-01-01
For more than 20 yrs, damage patterns and instrumental recordings have highlighted the influence of the local 3D geologic structure on earthquake ground motions (e.g., M 6.7 Northridge, California, Gao et al., 1996; M 6.9 Kobe, Japan, Kawase, 1996; M 6.8 Nisqually, Washington, Frankel, Carver, and Williams, 2002). Although this and other local-scale features are critical to improving seismic hazard forecasts, historically they have not been explicitly incorporated into the U.S. National Seismic Hazard Model (NSHM, national model and maps), primarily because the necessary basin maps and methodologies were not available at the national scale. Instead,...
Computation of nonparametric convex hazard estimators via profile methods.
Jankowski, Hanna K; Wellner, Jon A
2009-05-01
This paper proposes a profile likelihood algorithm to compute the nonparametric maximum likelihood estimator of a convex hazard function. The maximisation is performed in two steps: First the support reduction algorithm is used to maximise the likelihood over all hazard functions with a given point of minimum (or antimode). Then it is shown that the profile (or partially maximised) likelihood is quasi-concave as a function of the antimode, so that a bisection algorithm can be applied to find the maximum of the profile likelihood, and hence also the global maximum. The new algorithm is illustrated using both artificial and real data, including lifetime data for Canadian males and females.
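The outer step of the algorithm can be sketched generically: because the profile likelihood is quasi-concave in the antimode, its maximiser can be located with a bisection-style (ternary) search rather than an exhaustive scan. In the sketch below, profile_loglik is a hypothetical stand-in for the inner support-reduction maximisation, which is not reproduced here.

```python
# Minimal sketch of the outer step only: the profile (partially maximised)
# likelihood is quasi-concave in the antimode, so its maximiser can be found
# by a bisection/ternary search instead of evaluating every candidate.
import numpy as np

def ternary_search_max(f, lo, hi, tol=1e-4):
    """Locate the maximiser of a quasi-concave function f on [lo, hi]."""
    while hi - lo > tol:
        m1 = lo + (hi - lo) / 3.0
        m2 = hi - (hi - lo) / 3.0
        if f(m1) < f(m2):
            lo = m1
        else:
            hi = m2
    return 0.5 * (lo + hi)

# Illustrative stand-in for the profile log-likelihood as a function of the
# antimode a (in reality each evaluation solves a constrained MLE problem)
def profile_loglik(a):
    return -(a - 2.3) ** 2   # quasi-concave, maximised at a = 2.3

a_hat = ternary_search_max(profile_loglik, 0.0, 10.0)
print(f"estimated antimode: {a_hat:.3f}")
```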
NASA Astrophysics Data System (ADS)
Tomas, Robert; Harrison, Matthew; Barredo, José I.; Thomas, Florian; Llorente Isidro, Miguel; Cerba, Otakar; Pfeiffer, Manuela
2014-05-01
The vast amount of information and data necessary for comprehensive hazard and risk assessment presents many challenges regarding the lack of accessibility, comparability, quality, organisation and dissemination of natural hazards spatial data. In order to mitigate these limitations, an interoperable framework has been developed as part of the legally binding Implementing Rules of the EU INSPIRE Directive1*, which aims at the establishment of the European Spatial Data Infrastructure. The interoperability framework is described in the Data Specification on Natural risk zones - Technical Guidelines (DS) document2*, which was finalized and published on 10.12.2013. This framework provides means for facilitating access, integration, harmonisation and dissemination of natural hazard data from different domains and sources. The objective of this paper is twofold. Firstly, the paper demonstrates the applicability of the interoperable framework developed in the DS and highlights the key aspects of the interoperability to the various natural hazards communities. Secondly, the paper "translates" into common language the main features and potential of the interoperable framework of the DS for a wider audience of scientists and practitioners in the natural hazards domain. Further in this paper, the five main aspects of the interoperable framework will be presented. First, the issue of a common terminology for the natural hazards domain will be addressed. Second, a common data model to facilitate cross-domain data integration will follow. Third, the common methodology developed to provide qualitative or quantitative assessments of natural hazards will be presented. Fourth, the extensible classification schema for natural hazards, developed from a literature review and key reference documents from the contributing community of practice, will be shown. Finally, the applicability of the interoperable framework to the various stakeholder groups will also be presented. This paper closes by discussing open issues and next steps regarding the sustainability and evolution of the interoperable framework and missing aspects such as multi-hazard and multi-risk. --------------- 1*INSPIRE - Infrastructure for spatial information in Europe, http://inspire.ec.europa.eu 2*http://inspire.jrc.ec.europa.eu/documents/Data_Specifications/INSPIRE_DataSpecification_NZ_v3.0.pdf
2002 Hyperspectral Analysis of Hazardous Waste Sites on the Savannah River Site
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gladden, J.B.
2003-08-28
Hazardous waste site inspection is a labor intensive, time consuming job, performed primarily on the ground using visual inspection and instrumentation. It is an expensive process to continually monitor hazardous waste and/or landfill sites to determine if they are maintaining their integrity. In certain instances, it may be possible to monitor aspects of the hazardous waste sites and landfills remotely. The utilization of multispectral data was suggested for the mapping of clays and iron oxides associated with contaminated groundwater, vegetation stress, and methane gas emissions (which require longer wavelength detectors). The Savannah River Site (SRS) near Aiken, S.C. is a United States Department of Energy facility operated by the Westinghouse Savannah River Company. For decades the SRS was responsible for developing weapons grade plutonium and other materials for the nation's nuclear defense. Hazardous waste was generated during this process. Waste storage site inspection is a particularly important issue at the SRS because there are over 100 hazardous waste sites scattered throughout the 300 mile complex making it difficult to continually monitor all of the facilities. The goal is to use remote sensing technology to identify surface anomalies on the hazardous waste sites as early as possible so that remedial work can take place rapidly to maintain the integrity of the storage sites. The anomalous areas are then targeted for intensive in situ human examination and measurement. During the 1990s, many of the hazardous waste sites were capped with protective layers of polyethylene sheeting and soil, and planted with bahia grass and/or centipede grass. This research investigated hyperspectral remote sensing technology to determine if it can be used to measure accurately and monitor possible indicators of change on vegetated hazardous waste sites. Specifically, it evaluated the usefulness of hyperspectral remote sensing to assess the condition of vegetation on clay caps on the Mixed Waste Management Facility (MWMF). This report first describes the principles of hyperspectral remote sensing. In situ measurement and hyperspectral remote sensing methods used to analyze hazardous waste sites on the Savannah River Site are then presented.
Establishment and function of tissue-resident innate lymphoid cells in the skin.
Yang, Jie; Zhao, Luming; Xu, Ming; Xiong, Na
2017-07-01
Innate lymphoid cells (ILCs) are a newly classified family of immune cells of the lymphoid lineage. While they can be found in both lymphoid organs and non-lymphoid tissues, ILCs are preferentially enriched in barrier tissues such as the skin, intestine, and lung, where they play important roles in the maintenance of tissue integrity and function and in protection against assaults by foreign agents. On the other hand, dysregulated activation of ILCs could contribute to tissue inflammatory diseases. In spite of recent progress towards understanding the roles of ILCs in health and disease, the mechanisms regulating the specific establishment, activation, and function of ILCs in barrier tissues are still poorly understood. We herein review the up-to-date understanding of the tissue-specific relevance of ILCs. In particular, we focus on resident ILCs of the skin, the outermost barrier tissue critical for protection against various foreign hazardous agents and for maintenance of thermal and water balance. In addition, we discuss remaining outstanding questions yet to be addressed.
Vertical Field of View Reference Point Study for Flight Path Control and Hazard Avoidance
NASA Technical Reports Server (NTRS)
Comstock, J. Raymond, Jr.; Rudisill, Marianne; Kramer, Lynda J.; Busquets, Anthony M.
2002-01-01
Researchers within the eXternal Visibility System (XVS) element of the High-Speed Research (HSR) program developed and evaluated display concepts that will provide the flight crew of the proposed High-Speed Civil Transport (HSCT) with integrated imagery and symbology to permit path control and hazard avoidance functions while maintaining required situation awareness. The challenge of the XVS program is to develop concepts that would permit a no-nose-droop configuration of an HSCT and expanded low visibility HSCT operational capabilities. This study was one of a series of experiments exploring the 'design space' restrictions for physical placement of an XVS display. The primary experimental issue here was 'conformality' of the forward display vertical position with respect to the side window in simulated flight. 'Conformality' refers to the case in which the horizon and objects appear in the same relative positions when viewed through the forward windows or display and the side windows. This study quantified the effects of visual conformality on pilot flight path control and hazard avoidance performance. Here, conformality related to the positioning and relationship of the artificial horizon line and associated symbology presented on the forward display and the horizon and associated ground, horizon, and sky textures as they would appear in the real view through a window presented in the side window display. No significant performance consequences were found for the non-conformal conditions.
43 CFR 2.51 - Assuring integrity of records.
Code of Federal Regulations, 2011 CFR
2011-10-01
... on those recommended in the National Bureau of Standard's booklet “Computer Security Guidelines for..., technical and physical safeguards to insure the security and confidentiality of records and to protect against any anticipated threats or hazards to their security or integrity which could result in...
43 CFR 2.51 - Assuring integrity of records.
Code of Federal Regulations, 2010 CFR
2010-10-01
... on those recommended in the National Bureau of Standard's booklet “Computer Security Guidelines for..., technical and physical safeguards to insure the security and confidentiality of records and to protect against any anticipated threats or hazards to their security or integrity which could result in...
43 CFR 2.51 - Assuring integrity of records.
Code of Federal Regulations, 2012 CFR
2012-10-01
... on those recommended in the National Bureau of Standard's booklet “Computer Security Guidelines for..., technical and physical safeguards to insure the security and confidentiality of records and to protect against any anticipated threats or hazards to their security or integrity which could result in...
43 CFR 2.226 - Assuring integrity of records.
Code of Federal Regulations, 2014 CFR
2014-10-01
... on those recommended in the National Bureau of Standard's booklet “Computer Security Guidelines for..., technical and physical safeguards to insure the security and confidentiality of records and to protect against any anticipated threats or hazards to their security or integrity which could result in...
43 CFR 2.226 - Assuring integrity of records.
Code of Federal Regulations, 2013 CFR
2013-10-01
... on those recommended in the National Bureau of Standard's booklet “Computer Security Guidelines for..., technical and physical safeguards to insure the security and confidentiality of records and to protect against any anticipated threats or hazards to their security or integrity which could result in...
The importance of vegetation change in the prediction of future tropical cyclone flood statistics
NASA Astrophysics Data System (ADS)
Irish, J. L.; Resio, D.; Bilskie, M. V.; Hagen, S. C.; Weiss, R.
2015-12-01
Global sea level rise is a near certainty over the next century (e.g., Stocker et al. 2013 [IPCC] and references therein). With sea level rise, coastal topography and land cover (hereafter "landscape") is expected to change and tropical cyclone flood hazard is expected to accelerate (e.g., Irish et al. 2010 [Ocean Eng], Woodruff et al. 2013 [Nature], Bilskie et al. 2014 [Geophys Res Lett], Ferreira et al. 2014 [Coast Eng], Passeri et al. 2015 [Nat Hazards]). Yet, the relative importance of sea-level rise induced landscape change on future tropical cyclone flood hazard assessment is not known. In this paper, idealized scenarios are used to evaluate the relative impact of one class of landscape change on future tropical cyclone extreme-value statistics in back-barrier regions: sea level rise induced vegetation migration and loss. The joint probability method with optimal sampling (JPM-OS) (Resio et al. 2009 [Nat Hazards]) with idealized surge response functions (e.g., Irish et al. 2009 [Nat Hazards]) is used to quantify the present-day and future flood hazard under various sea level rise scenarios. Results are evaluated in terms of their impact on the flood statistics (a) when projected flood elevations are included directly in the JPM analysis (Figure 1) and (b) when represented as additional uncertainty within the JPM integral (Resio et al. 2013 [Nat Hazards]), i.e., as random error. Findings are expected to aid in determining the level of effort required to reasonably account for future landscape change in hazard assessments, namely in determining when such processes are sufficiently captured by added uncertainty and when sea level rise induced vegetation changes must be considered dynamically, via detailed modeling initiatives. Acknowledgements: This material is based upon work supported by the National Science Foundation under Grant No. CMMI-1206271 and by the National Sea Grant College Program of the U.S. Department of Commerce's National Oceanic and Atmospheric Administration under Grant No. NA10OAR4170099. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of these organizations. The STOKES ARCC at the University of Central Florida provided computational resources for storm surge simulations.
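The joint probability method can be conveyed with a small numerical sketch. The storm parameters, annual rates, and the surge response function below are hypothetical placeholders (not the values used in the study); the sketch only shows how an annual exceedance rate for a flood elevation is assembled from a discretized storm parameter space, with a random error term standing in for the "additional uncertainty" treatment.

```python
import numpy as np
from math import erf, sqrt

# Hypothetical discretization of two storm parameters: central pressure deficit
# (hPa) and radius to maximum winds (km). All rates and coefficients are
# illustrative placeholders, not values from the study.
dp = np.array([40.0, 60.0, 80.0, 100.0])      # pressure deficit bins
rmw = np.array([20.0, 35.0, 50.0])            # radius-to-maximum-winds bins
rate = np.full((dp.size, rmw.size), 1e-3)     # annual rate of each parameter combination

def surge_response(dp_i, rmw_j, slr=0.0):
    """Hypothetical surge response function (m); slr crudely shifts the response
    to mimic a sea level rise / landscape change scenario."""
    return 0.03 * dp_i * (rmw_j / 35.0) ** 0.5 + slr

def annual_exceedance_rate(threshold_m, slr=0.0, sigma=0.3):
    """JPM-style sum over the storm parameter space of rate * P(surge > threshold),
    with a normal error term (sigma, in m) standing in for added uncertainty."""
    lam = 0.0
    for i, dp_i in enumerate(dp):
        for j, rmw_j in enumerate(rmw):
            mean_surge = surge_response(dp_i, rmw_j, slr)
            p_exceed = 0.5 * (1.0 - erf((threshold_m - mean_surge) / (sigma * sqrt(2.0))))
            lam += rate[i, j] * p_exceed
    return lam

for eta in (2.0, 3.0, 4.0):
    lam = annual_exceedance_rate(eta, slr=0.5)
    print(f"flood elevation {eta:.1f} m: rate {lam:.2e}/yr, return period ~{1.0 / lam:.0f} yr")
```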
Markell, Lauren K; Wezalis, Stephanie M; Roper, Jason M; Zimmermann, Cindi; Delaney, Bryan
2017-10-01
Relatively few proteins in nature produce adverse effects following oral exposure. Of those that do, effects are often observed in the gut, particularly on intestinal epithelial cells (IEC). Previous studies reported that addition of protein toxins to IEC lines disrupted monolayer integrity but innocuous dietary proteins did not. Studies presented here investigated the effects of innocuous (bovine serum albumin, β-lactoglobulin, RuBisCO, fibronectin) or hazardous (phytohaemagglutinin-E, concanavalin A, wheat germ agglutinin, melittin) proteins that either were untreated or exposed to digestive enzymes prior to addition to Caco-2 human IEC line monolayers. At high concentrations, intact fibronectin caused an increase in monolayer permeability, but the other innocuous proteins did not, whether or not they were exposed to digestive enzymes. In contrast, all untreated hazardous proteins and those that were resistant to digestion (e.g., wheat germ agglutinin) disrupted monolayer integrity. However, proteins sensitive to degradation by digestive enzymes (e.g., melittin) did not adversely affect monolayers when exposed to these enzymes prior to addition to IEC line monolayers. These results indicate that in vitro exposure of proteins to digestive enzymes can assist in differentiating between innocuous and hazardous proteins as another component to consider in the overall weight of evidence approach in protein hazard assessment. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.
Baron, Sherry L; Beard, Sharon; Davis, Letitia K.; Delp, Linda; Forst, Linda; Kidd-Taylor, Andrea; Liebman, Amy K.; Linnan, Laura; Punnett, Laura; Welch, Laura S.
2013-01-01
Nearly one of every three workers in the United States is low-income. Low-income populations have a lower life expectancy and greater rates of chronic diseases compared to those with higher incomes. Low-income workers face hazards in their workplaces as well as in their communities. Developing integrated public health programs that address these combined health hazards, especially the interaction of occupational and non-occupational risk factors, can promote greater health equity. We apply a social-ecological perspective in considering ways to improve the health of the low-income working population through integrated health protection and health promotion programs initiated in four different settings: the worksite, state and local health departments, community health centers, and community-based organizations. An example of successful approaches to developing integrated programs in each of these settings is described. Recommendations for improved research, training, and coordination among health departments, health practitioners, worksites and community organizations are proposed. PMID:23532780
Managing forest structure and fire hazard--a tool for planners.
M.C. Johnson; D.L. Peterson; C.L. Raymond
2006-01-01
Fire planners and other resource managers need to examine a range of potential fuel and vegetation treatments to select options that will lead to desired outcomes for fire hazard and natural resource conditions. A new approach to this issue integrates concepts and tools from silviculture and fuel science to quantify outcomes for a large number of treatment options in...
Rocky Mountain Research Station USDA Forest Service
2004-01-01
Appropriate types of thinning and surface fuel treatments are clearly useful in reducing surface and crown fire hazards under a wide range of fuels and topographic situations. This paper provides well-established scientific principles and simulation tools that can be used to adjust fuel treatments to attain specific risk levels.
46 CFR 129.520 - Hazardous areas.
Code of Federal Regulations, 2011 CFR
2011-10-01
... liquid with a flashpoint of below 140 °F (60 °C), or carries hazardous cargoes on deck or in integral...-storage spaces, or within 3 meters (10 feet) of a source of vapor on a weather deck unless the equipment... liquid unless the equipment is explosion-proof or intrinsically safe under § 111.105-9 or § 111.105-11 of...
46 CFR 129.520 - Hazardous areas.
Code of Federal Regulations, 2014 CFR
2014-10-01
... liquid with a flashpoint of below 140 °F (60 °C), or carries hazardous cargoes on deck or in integral...-storage spaces, or within 3 meters (10 feet) of a source of vapor on a weather deck unless the equipment... liquid unless the equipment is explosion-proof or intrinsically safe under § 111.105-9 or § 111.105-11 of...
46 CFR 129.520 - Hazardous areas.
Code of Federal Regulations, 2012 CFR
2012-10-01
... liquid with a flashpoint of below 140 °F (60 °C), or carries hazardous cargoes on deck or in integral...-storage spaces, or within 3 meters (10 feet) of a source of vapor on a weather deck unless the equipment... liquid unless the equipment is explosion-proof or intrinsically safe under § 111.105-9 or § 111.105-11 of...
NASA Astrophysics Data System (ADS)
Meyer, F. J.; Webley, P. W.; Dehn, J.; Arko, S. A.; McAlpin, D. B.; Gong, W.
2016-12-01
Volcanic eruptions are among the most significant hazards to human society, capable of triggering natural disasters on regional to global scales. In the last decade, remote sensing has become established in operational volcano monitoring. Centers like the Alaska Volcano Observatory rely heavily on remote sensing data from optical and thermal sensors to provide time-critical hazard information. Despite this high use of remote sensing data, the presence of clouds and a dependence on solar illumination often limit their impact on decision making. Synthetic Aperture Radar (SAR) systems are widely considered superior to optical sensors in operational monitoring situations, due to their weather and illumination independence. Still, the contribution of SAR to operational volcano monitoring has been limited in the past due to high data costs, long processing times, and low temporal sampling rates of most SAR systems. In this study, we introduce the automatic SAR processing system SARVIEWS, whose advanced data analysis and data integration techniques allow, for the first time, a meaningful integration of SAR into operational monitoring systems. We will introduce the SARVIEWS database interface that allows for automatic, rapid, and seamless access to the data holdings of the Alaska Satellite Facility. We will also present a set of processing techniques designed to automatically generate a set of SAR-based hazard products (e.g. change detection maps, interferograms, geocoded images). The techniques take advantage of modern signal processing and radiometric normalization schemes, enabling the combination of data from different geometries. Finally, we will show how SAR-based hazard information is integrated in existing multi-sensor decision support tools to enable joint hazard analysis with data from optical and thermal sensors. We will showcase the SAR processing system using a set of recent natural disasters (both earthquakes and volcanic eruptions) to demonstrate its robustness. We will also show the benefit of integrating SAR with data from other sensors to support volcano monitoring. For historic eruptions at Okmok and Augustine volcano, both located in the North Pacific, we will demonstrate that the addition of SAR can lead to a significant improvement in activity detection and eruption forecasting.
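As an illustration of one common SAR change-detection technique (the log-ratio of pre- and post-event amplitude images followed by thresholding), the sketch below uses purely synthetic arrays. It is not the SARVIEWS implementation; the actual processing chain, radiometric normalization, and geometry handling are those described by the authors.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic pre- and post-event amplitude images with multiplicative speckle;
# a square patch brightens in the post-event image to mimic a new deposit.
pre = rng.gamma(shape=4.0, scale=0.25, size=(200, 200))
post = pre * rng.gamma(shape=4.0, scale=0.25, size=pre.shape)
post[80:120, 80:120] *= 3.0

# Log-ratio change indicator, insensitive to multiplicative calibration offsets.
eps = 1e-6
log_ratio = np.log((post + eps) / (pre + eps))

# Simple global threshold (mean + 2 std); an operational system would add robust
# statistics, spatial filtering, radiometric terrain correction, and geocoding.
threshold = log_ratio.mean() + 2.0 * log_ratio.std()
change_mask = log_ratio > threshold
print(f"flagged pixels: {change_mask.sum()} ({100.0 * change_mask.mean():.1f}% of scene)")
```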
Surface Support Systems for Co-Operative and Integrated Human/Robotic Lunar Exploration
NASA Technical Reports Server (NTRS)
Mueller, Robert P.
2006-01-01
Human and robotic partnerships to realize space goals can enhance space missions and provide increases in human productivity while decreasing the hazards to which humans are exposed. For lunar exploration, the harsh environment of the moon and the repetitive nature of the tasks involved with lunar outpost construction, maintenance, and operation, as well as production tasks associated with in-situ resource utilization, make it highly desirable to use robotic systems in co-operation with human activity. A human lunar outpost is functionally examined and concepts for selected human/robotic tasks are discussed in the context of a lunar outpost which will enable the presence of humans on the moon for extended periods of time.
Multi-hazards risk assessment at different levels
NASA Astrophysics Data System (ADS)
Frolova, N.; Larionov, V.; Bonnin, J.
2012-04-01
Natural and technological disasters are becoming more frequent and devastating. Social and economic losses due to those events increase annually, which is clearly related to the evolution of society. Natural hazard identification and analysis, as well as natural risk assessment taking into account secondary technological accidents, are the first steps in a prevention strategy aimed at saving lives and protecting property against future events. The paper addresses methodological issues of natural and technological integrated risk assessment and mapping at different levels [1, 2]. At the country level the most hazardous natural processes, which may result in fatalities, injuries and economic loss in the Russian Federation, are considered. They are earthquakes, landslides, mud flows, floods, storms, and avalanches. A special GIS environment for the country's territory was developed, which includes information about hazard levels and recurrence, an impact database for the last 20 years, as well as models for estimating damage and casualties caused by these hazards. Federal maps of seismic individual and collective risk, as well as multi-hazard natural risk maps, are presented. Examples of regional seismic risk assessment taking into account secondary accidents at fire, explosion and chemical hazardous facilities, and of regional integrated risk assessment, are given for the earthquake-prone areas of the Russian Federation. The paper also gives examples of loss computations due to scenario earthquakes taking into account accidents triggered by strong events at critical facilities: fire and chemical hazardous facilities, including oil pipeline routes located in earthquake-prone areas. The estimates of individual seismic risk obtained are used by EMERCOM of the Russian Federation, as well as by other federal and local authorities, for planning and implementing preventive measures aimed at saving lives and protecting property against future disastrous events. The results also allow effective emergency response plans to be developed, taking into account possible scenario events. Taking into consideration the size of the oil pipeline systems located in highly active seismic zones, the results of seismic risk computation are used by TRANSNEFT JSC.
Engineered Nanomaterials, Sexy New Technology and Potential Hazards
DOE Office of Scientific and Technical Information (OSTI.GOV)
Beaulieu, R A
Engineered nanomaterials enhance exciting new applications that can greatly benefit society in areas of cancer treatments, solar energy, energy storage, and water purification. While nanotechnology shows incredible promise in these and other areas by exploiting nanomaterials' unique properties, these same properties can potentially cause adverse health effects to workers who may be exposed during work. Dispersed nanoparticles in air can cause adverse health effects to animals not merely due to their chemical properties but due to their size, structure, shape, surface chemistry, solubility, carcinogenicity, reproductive toxicity, mutagenicity, dermal toxicity, and parent material toxicity. Nanoparticles have a greater likelihood of lung deposition and blood absorption than larger particles due to their size. Nanomaterials can also pose physical hazards due to their unusually high reactivity, which makes them useful as catalysts, but has the potential to cause fires and explosions. Characterization of the hazards (and potential for exposures) associated with nanomaterial development and incorporation in other products is an essential step in the development of nanotechnologies. Developing controls for these hazards is equally important. Engineered controls should be integrated into nanomaterial manufacturing process design according to 10CFR851, DOE Policy 456.1, and DOE Notice 456.1 as safety-related hardware or administrative controls for worker safety. Nanomaterial hazards in a nuclear facility must also meet control requirements per DOE standards 3009, 1189, and 1186. Integration of safe designs into manufacturing processes for new applications concurrent with the developing technology is essential for worker safety. This paper presents a discussion of nanotechnology, nanomaterial properties/hazards and controls.
16 CFR 1511.5 - Structural integrity tests.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 16 Commercial Practices 2 2010-01-01 2010-01-01 false Structural integrity tests. 1511.5 Section 1511.5 Commercial Practices CONSUMER PRODUCT SAFETY COMMISSION FEDERAL HAZARDOUS SUBSTANCES ACT... lowest position in the cylinder. If the uppermost edge of the component or fragment is below the plane of...
25 CFR 700.263 - Assuring integrity of records.
Code of Federal Regulations, 2013 CFR
2013-04-01
... safeguards to insure the security and confidentiality of records and to protect against any anticipated threats or hazards to their security or integrity which could result in substantial harm, embarrassment..., subject to safeguards based on those recommended in the National Bureau of Standards booklet “Computer...
25 CFR 700.263 - Assuring integrity of records.
Code of Federal Regulations, 2014 CFR
2014-04-01
... safeguards to insure the security and confidentiality of records and to protect against any anticipated threats or hazards to their security or integrity which could result in substantial harm, embarrassment..., subject to safeguards based on those recommended in the National Bureau of Standards booklet “Computer...
25 CFR 700.263 - Assuring integrity of records.
Code of Federal Regulations, 2010 CFR
2010-04-01
... safeguards to insure the security and confidentiality of records and to protect against any anticipated threats or hazards to their security or integrity which could result in substantial harm, embarrassment..., subject to safeguards based on those recommended in the National Bureau of Standards booklet “Computer...
25 CFR 700.263 - Assuring integrity of records.
Code of Federal Regulations, 2011 CFR
2011-04-01
... safeguards to insure the security and confidentiality of records and to protect against any anticipated threats or hazards to their security or integrity which could result in substantial harm, embarrassment..., subject to safeguards based on those recommended in the National Bureau of Standards booklet “Computer...
25 CFR 700.263 - Assuring integrity of records.
Code of Federal Regulations, 2012 CFR
2012-04-01
... safeguards to insure the security and confidentiality of records and to protect against any anticipated threats or hazards to their security or integrity which could result in substantial harm, embarrassment..., subject to safeguards based on those recommended in the National Bureau of Standards booklet “Computer...
25 CFR 43.22 - Assuring integrity of records.
Code of Federal Regulations, 2012 CFR
2012-04-01
..., “Computer Security Guidelines for Implementing the Privacy Act of 1974” (May 30, 1975), and any supplements... appropriate administrative, technical and physical safeguards to insure the security and confidentiality of records and to protect against any anticipated threats or hazards to their security or integrity which...
46 CFR 111.105-5 - System integrity.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 46 Shipping 4 2010-10-01 2010-10-01 false System integrity. 111.105-5 Section 111.105-5 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) ELECTRICAL ENGINEERING ELECTRIC SYSTEMS-GENERAL... individual electrical installation in a hazardous location must comply specifically with Articles 500-505 of...
Real-Time Hazard Detection and Avoidance Demonstration for a Planetary Lander
NASA Technical Reports Server (NTRS)
Epp, Chirold D.; Robertson, Edward A.; Carson, John M., III
2014-01-01
The Autonomous Landing Hazard Avoidance Technology (ALHAT) Project is chartered to develop, and mature to a Technology Readiness Level (TRL) of six, an autonomous system combining guidance, navigation and control with terrain sensing and recognition functions for crewed, cargo, and robotic planetary landing vehicles. In addition to precision landing close to a pre-mission defined landing location, the ALHAT System must be capable of autonomously identifying and avoiding surface hazards in real-time to enable a safe landing under any lighting conditions. This paper provides an overview of the recent ALHAT closed-loop hazard detection and avoidance flight demonstrations on the Morpheus Vertical Testbed (VTB) at the Kennedy Space Center, including results and lessons learned. This effort is also described in the context of a technology path in support of future crewed and robotic planetary exploration missions based upon the core sensing functions of the ALHAT system: Terrain Relative Navigation (TRN), Hazard Detection and Avoidance (HDA), and Hazard Relative Navigation (HRN).
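A greatly simplified illustration of the hazard-detection idea (flagging cells of a terrain elevation grid whose local slope or roughness exceeds lander tolerances) is sketched below with synthetic data and made-up thresholds, assuming NumPy and SciPy are available. The flight ALHAT system uses flash-lidar DEMs and far more sophisticated, real-time algorithms; this is only a conceptual sketch.

```python
import numpy as np
from scipy.ndimage import generic_filter

rng = np.random.default_rng(1)

# Synthetic 1 m-resolution elevation grid (m): a gentle tilt plus a rough rock field.
cell = 1.0
dem = 0.05 * np.add.outer(np.arange(100.0), np.arange(100.0)) * cell
dem += rng.normal(0.0, 0.02, dem.shape)
dem[40:55, 60:75] += rng.normal(0.0, 0.25, (15, 15))   # simulated rock field

# Local slope (degrees) from central differences, and roughness as 3x3 standard deviation.
gy, gx = np.gradient(dem, cell)
slope_deg = np.degrees(np.arctan(np.hypot(gx, gy)))
roughness = generic_filter(dem, np.std, size=3)

# Made-up lander tolerances: slope under 10 degrees, roughness under 0.1 m.
hazard = (slope_deg > 10.0) | (roughness > 0.10)
print(f"hazardous cells: {hazard.sum()} of {hazard.size}")
```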
Hazard function theory for nonstationary natural hazards
NASA Astrophysics Data System (ADS)
Read, L.; Vogel, R. M.
2015-12-01
Studies from the natural hazards literature indicate that many natural processes, including wind speeds, landslides, wildfires, precipitation, streamflow and earthquakes, show evidence of nonstationary behavior such as trends in magnitudes through time. Traditional probabilistic analysis of natural hazards based on partial duration series (PDS) generally assumes stationarity in the magnitudes and arrivals of events, i.e. that the probability of exceedance is constant through time. Given evidence of trends and the consequent expected growth in devastating impacts from natural hazards across the world, new methods are needed to characterize their probabilistic behavior. The field of hazard function analysis (HFA) is ideally suited to this problem because its primary goal is to describe changes in the exceedance probability of an event over time. HFA is widely used in medicine, manufacturing, actuarial statistics, reliability engineering, economics, and elsewhere. HFA provides a rich theory to relate the natural hazard event series (x) with its failure time series (t), enabling computation of corresponding average return periods and reliabilities associated with nonstationary event series. This work investigates the suitability of HFA to characterize nonstationary natural hazards whose PDS magnitudes are assumed to follow the widely applied Poisson-GP model. We derive a 2-parameter Generalized Pareto hazard model and demonstrate how metrics such as reliability and average return period are impacted by nonstationarity and discuss the implications for planning and design. Our theoretical analysis linking hazard event series x, with corresponding failure time series t, should have application to a wide class of natural hazards.
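A minimal numerical sketch of this hazard-function framing is given below, assuming (as a placeholder, not the paper's fitted model) that annual exceedances of a fixed design level follow a Generalized Pareto distribution whose scale parameter trends upward in time. The annual exceedance probability then grows each year, and the failure-time distribution, reliability, and average return period follow directly; all parameter values are invented for illustration.

```python
import numpy as np

def gp_survival(x, sigma, xi):
    """P(X > x) for a Generalized Pareto exceedance with scale sigma and shape xi."""
    return (1.0 + xi * x / sigma) ** (-1.0 / xi)

# Placeholder nonstationary model: GP scale grows 1% per year.
xi = 0.1
design_level = 5.0                     # fixed design magnitude (arbitrary units)
years = np.arange(1, 201)
sigma_t = 1.0 * 1.01 ** (years - 1)

p_t = gp_survival(design_level, sigma_t, xi)       # annual exceedance probability
surv = np.cumprod(1.0 - p_t)                        # reliability R(n) = P(no failure by year n)
f_t = p_t * np.concatenate(([1.0], surv[:-1]))      # failure-time pmf: first exceedance in year t
# Average return period, with a crude lower-bound correction for the truncated tail.
avg_return_period = np.sum(years * f_t) + years[-1] * surv[-1]

print(f"stationary return period (year 1): {1.0 / p_t[0]:.0f} yr")
print(f"average return period under the trend: ~{avg_return_period:.0f} yr")
print(f"50-year reliability: {surv[49]:.3f}")
```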
Risk analysis for roadways subjected to multiple landslide-related hazards
NASA Astrophysics Data System (ADS)
Corominas, Jordi; Mavrouli, Olga
2014-05-01
Roadways through mountainous terrain often involve cuts and landslide areas whose stability is precarious and which require protection and stabilization works. To optimize the allocation of resources, government and technical offices are increasingly interested in both risk analysis and risk assessment. Risk analysis has to consider the hazard occurrence and the consequences. The consequences can be both direct and indirect. The former include the costs of repairing the roadway, the damage to vehicles and the potential fatalities, while the latter refer to the costs related to the diversion of vehicles, the excess distance travelled, the time differences, and tolls. The types of slope instability that may affect a roadway vary, and so do their effects. Most current approaches either consider a single hazardous phenomenon at a time or, if applied at a small (for example national) scale, do not take into account local conditions at each section of the roadway. The objective of this work is the development of a simple and comprehensive methodology for the assessment of the risk due to multiple hazards along roadways, integrating different landslide types, including rockfalls and debris flows, and considering as well the potential failure of retaining walls. To quantify risk, all hazards are expressed with a common term: their probability of occurrence. The methodology takes into consideration the specific local conditions along the roadway. For rockfalls and debris flows a variety of methods for assessing the probability of occurrence exists. To assess the annual probability of failure of retaining walls, we use an indicator-based model that provides a hazard index. The model parameters consist of the design safety factor and further anchorage design and construction parameters. The probability of failure is evaluated as a function of the hazard index and then corrected (in terms of order of magnitude) according to in situ observations of increases in two dynamic factors: the service load and the wall deformation. The consequences are then calculated for each hazard type according to its characteristics (mechanism, magnitude, frequency). This method differs from other methodologies for landslide-related hazards in the hazard scenarios and consequence profiles that are investigated. The depth of analysis permits local conditions to be accounted for, concerning either the hazard or the consequences (the latter with respect to the very particular characteristics of the roadway such as traffic, number of lanes, velocity…). Furthermore, it provides an extensive list of quantitative risk descriptors, including both individual and collective ones. The methodology was automated using Microsoft Excel spreadsheets. The results can be used to support decision-making for the planning of protection measures. Gaps in knowledge and restrictions are discussed as well.
Estimating piecewise exponential frailty model with changing prior for baseline hazard function
NASA Astrophysics Data System (ADS)
Thamrin, Sri Astuti; Lawi, Armin
2016-02-01
Piecewise exponential models provide a very flexible framework for modelling univariate survival data. They can be used to estimate the effects of different covariates on the survival data. Although in a strict sense it is a parametric model, a piecewise exponential hazard can approximate any shape of parametric baseline hazard. In the parametric baseline hazard, the hazard function for each individual may depend on a set of risk factors or explanatory variables. However, the known or measurable variables usually do not explain all of the variation, and the remaining, unobserved variation is interesting to consider. This unknown and unobservable risk factor of the hazard function is often termed the individual's heterogeneity or frailty. This paper analyses the effects of unobserved population heterogeneity on patients' survival times. The issue of model choice through variable selection is also considered. A sensitivity analysis is conducted to assess the influence of the prior for each parameter. We used the Markov chain Monte Carlo method to compute the Bayesian estimator on kidney infection data. The results show that sex and frailty are substantially associated with survival in this study, and that the models are quite sensitive to the choice between the two priors.
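A brief sketch of the piecewise exponential idea is given below, with made-up interval cut points, hazard rates, and frailty values (not those estimated from the kidney infection data): the hazard is constant within each interval, the cumulative hazard accumulates across intervals, and a multiplicative frailty scales the baseline hazard.

```python
import numpy as np

# Hypothetical interval boundaries (months) and constant hazard rate within each interval.
cuts = np.array([0.0, 6.0, 12.0, 24.0])        # interval boundaries; last interval is open-ended
rates = np.array([0.02, 0.04, 0.01, 0.03])     # hazard per month in each interval

def cumulative_hazard(t, cuts=cuts, rates=rates):
    """H(t) for a piecewise constant hazard; survival is S(t) = exp(-H(t))."""
    H = 0.0
    for k, rate in enumerate(rates):
        lo = cuts[k]
        hi = cuts[k + 1] if k + 1 < len(cuts) else np.inf
        H += rate * max(0.0, min(t, hi) - lo)
    return H

def survival(t, frailty=1.0):
    """Frailty multiplies the baseline hazard: S(t | z) = exp(-z * H0(t))."""
    return np.exp(-frailty * cumulative_hazard(t))

for t in (3.0, 12.0, 30.0):
    print(f"t={t:5.1f}  S(t)={survival(t):.3f}  S(t | z=2)={survival(t, frailty=2.0):.3f}")
```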
NASA Astrophysics Data System (ADS)
Gallina, Valentina; Torressan, Silvia; Zabeo, Alex; Critto, Andrea; Glade, Thomas; Marcomini, Antonio
2015-04-01
Climate change is expected to have a wide range of impacts on natural and human systems worldwide, increasing risks from long-term climate trends and disasters triggered by weather extremes. Accordingly, in the future, one region could be potentially affected by interactions, synergies and trade-offs of multiple hazards and impacts. A multi-risk approach is needed to effectively address multiple threats posed by climate change across regions and targets, supporting decision-makers toward a new paradigm of multi-hazard and risk management. Relevant initiatives have already been developed for the assessment of multiple hazards and risks affecting the same area in a defined timeframe by means of quantitative and semi-quantitative approaches. Most of them address the relations between different natural hazards; however, the effect of future climate change is usually not considered. In order to fill this gap, an advanced multi-risk methodology was developed at the Euro-Mediterranean Centre on Climate Change (CMCC) for estimating cumulative impacts related to climate change at the regional (i.e. sub-national) scale. This methodology was implemented in an assessment tool which allows natural systems and human assets at risk from different interacting hazards to be scanned and classified quickly. A multi-hazard index is proposed to evaluate the relationships of different climate-related hazards (e.g. sea-level rise, coastal erosion, storm surge) occurring in the same spatial and temporal area, by means of an influence matrix and the disjoint probability function. Future hazard scenarios provided by regional climate models are used as input for this step in order to consider possible effects of future climate change scenarios. Then, the multi-vulnerability of different exposed receptors (e.g. natural systems, beaches, agricultural and urban areas) is estimated through a variety of vulnerability indicators (e.g. vegetation cover, sediment budget, % of urbanization), tailored case by case to different sets of natural hazards and elements at risk. Finally, the multi-risk assessment integrates the multi-hazard with the multi-vulnerability index of exposed receptors, providing a relative ranking of areas and targets potentially affected by multiple risks in the considered region. The methodology was applied to the North Adriatic coast (Italy), producing a range of GIS-based multi-hazard, exposure, multi-vulnerability and multi-risk maps that can be used by policy-makers to define risk management and adaptation strategies. Results show that areas affected by higher multi-hazard scores are located close to the coastline where all the investigated hazards are present. Multi-vulnerability assumes relatively high scores in the whole case study, showing that beaches, wetlands, protected areas and river mouths are the most sensitive targets. The final estimate of multi-risk for coastal municipalities provides useful information for local public authorities to set future priorities for adaptation and define future plans for shoreline and coastal management in view of climate change.
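The disjoint-probability step of such a multi-hazard index can be illustrated with a toy example for a single map cell; the hazard probabilities and interaction weights below are invented for illustration only and are not taken from the CMCC methodology.

```python
import numpy as np

# Hypothetical per-cell annual probabilities of three climate-related hazards.
p = {"sea_level_rise_inundation": 0.10, "coastal_erosion": 0.25, "storm_surge": 0.05}

# Probability that at least one hazard occurs, assuming independence
# (the disjoint probability of the union of events).
probs = np.array(list(p.values()))
p_any = 1.0 - np.prod(1.0 - probs)

# A simple influence matrix can re-weight hazards that amplify each other;
# the weights here are placeholders, not calibrated values.
influence = np.array([
    [1.0, 0.2, 0.3],   # sea level rise amplifies erosion and surge
    [0.0, 1.0, 0.1],
    [0.0, 0.0, 1.0],
])
weighted = np.clip(influence @ probs, 0.0, 1.0)
multi_hazard_index = 1.0 - np.prod(1.0 - weighted)

print(f"P(any hazard, independent): {p_any:.3f}")
print(f"multi-hazard index with interactions: {multi_hazard_index:.3f}")
```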
Pal, Parimal; Das, Pallabi; Chakrabortty, Sankha; Thakura, Ritwik
2016-11-01
Dynamic modelling and simulation of a nanofiltration-forward osmosis integrated complete system were carried out, along with an economic evaluation, to pave the way for scale up of such a system for treating hazardous pharmaceutical wastes. The system operated in a closed loop not only protects surface water from the onslaught of hazardous industrial wastewater but also saves on the cost of fresh water by turning wastewater recyclable at affordable price. The success of the dynamic modelling in capturing the relevant transport phenomena is well reflected in the high overall correlation coefficient value (R² > 0.98), low relative error (<0.1) and Willmott d-index (<0.95). The system could remove more than 97.5 % chemical oxygen demand (COD) from real pharmaceutical wastewater having an initial COD value as high as 3500 mg/L while ensuring operation of the forward osmosis loop at a reasonably high flux of 56-58 l per square meter per hour.
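For reference, the agreement statistics quoted above can be computed as in the following sketch; the observed and predicted series here are synthetic placeholders, not the study's COD data.

```python
import numpy as np

def willmott_d(obs, pred):
    """Willmott's index of agreement d (0 = no agreement, 1 = perfect agreement)."""
    obs, pred = np.asarray(obs, float), np.asarray(pred, float)
    obar = obs.mean()
    num = np.sum((obs - pred) ** 2)
    den = np.sum((np.abs(pred - obar) + np.abs(obs - obar)) ** 2)
    return 1.0 - num / den

def relative_error(obs, pred):
    """Mean absolute relative error between observations and model predictions."""
    obs, pred = np.asarray(obs, float), np.asarray(pred, float)
    return np.mean(np.abs(obs - pred) / np.abs(obs))

# Synthetic COD time series (mg/L) and a model prediction with small errors.
obs = np.array([3500, 2800, 2100, 1500, 900, 400, 150, 90])
pred = obs * (1.0 + np.array([0.02, -0.03, 0.04, -0.02, 0.05, -0.04, 0.03, -0.01]))

r = np.corrcoef(obs, pred)[0, 1]
print(f"R^2 = {r**2:.3f}, relative error = {relative_error(obs, pred):.3f}, d = {willmott_d(obs, pred):.3f}")
```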
IRIS Toxicological Review of Urea (External Review Draft) ...
EPA conducted a peer review and public comment of the scientific basis of a draft report supporting the human health hazard and dose-response assessment of Urea that when finalized will appear on the Integrated Risk Information System (IRIS) database. The draft Toxicological Review of Urea provides scientific support and rationale for the hazard and dose-response assessment pertaining to chronic exposure to Urea.
IRIS Toxicological Review of Trichloroacetic Acid (TCA) ...
EPA is conducting a peer review and public comment of the scientific basis supporting the human health hazard and dose-response assessment of Trichloroacetic acid (TCA) that when finalized will appear on the Integrated Risk Information System (IRIS) database. The draft Toxicological Review of trichloroacetic acid provides scientific support and rationale for the hazard and dose-response assessment pertaining to chronic exposure to trichloroacetic acid.
Jessica E. Halofsky; Stephanie K. Hart; Miles A. Hemstrom; Joshua S. Halofsky; Morris C. Johnson
2014-01-01
Information on the effects of management activities such as fuel reduction treatments and of processes such as vegetation growth and disturbance on fire hazard can help land managers prioritize treatments across a landscape to best meet management goals. State-and-transition models (STMs) allow landscape-scale simulations that incorporate effects of succession,...
Nanoporous Silicon Ignition of JA2 Propellant
2014-06-01
signals that would satisfy the hazard of electromagnetic radiation to ordnance (HERO) requirements of modern munitions. Such integrated circuits can ... fabricated as an integral element of a silicon chip. Integrated circuits that filter the firing command signal could remove extraneous electromagnetic...
An integrative model of organizational safety behavior.
Cui, Lin; Fan, Di; Fu, Gui; Zhu, Cherrie Jiuhua
2013-06-01
This study develops an integrative model of safety management based on social cognitive theory and the total safety culture triadic framework. The purpose of the model is to reveal the causal linkages between a hazardous environment, safety climate, and individual safety behaviors. Based on primary survey data from 209 front-line workers in one of the largest state-owned coal mining corporations in China, the model is tested using structural equation modeling techniques. An employee's perception of a hazardous environment is found to have a statistically significant impact on employee safety behaviors through a psychological process mediated by the perception of management commitment to safety and individual beliefs about safety. The integrative model developed here leads to a comprehensive solution that takes into consideration the environmental, organizational and employees' psychological and behavioral aspects of safety management. Copyright © 2013 National Safety Council and Elsevier Ltd. All rights reserved.
NASA Technical Reports Server (NTRS)
Davis, V. Leon; Nordeen, Ross
1988-01-01
A laboratory for developing robotics technology for hazardous and repetitive Shuttle and payload processing activities is discussed. An overview of the computer hardware and software responsible for integrating the laboratory systems is given. The center's anthropomorphic robot is placed on a track allowing it to be moved to different stations. Various aspects of the laboratory equipment are described, including industrial robot arm control, smart systems integration, the supervisory computer, programmable process controller, real-time tracking controller, image processing hardware, and control display graphics. Topics of research include: automated loading and unloading of hypergolics for space vehicles and payloads; the use of mobile robotics for security, fire fighting, and hazardous spill operations; nondestructive testing for SRB joint and seal verification; Shuttle Orbiter radiator damage inspection; and Orbiter contour measurements. The possibility of expanding the laboratory in the future is examined.
NASA Astrophysics Data System (ADS)
Bernard, E. N.; Behn, R. R.; Hebenstreit, G. T.; Gonzalez, F. I.; Krumpe, P.; Lander, J. F.; Lorca, E.; McManamon, P. M.; Milburn, H. B.
Rapid onset natural hazards have claimed more than 2.8 million lives worldwide in the past 20 years. This category includes such events as earthquakes, landslides, hurricanes, tornados, floods, volcanic eruptions, wildfires, and tsunamis. Effective hazard mitigation is particularly difficult in such cases, since the time available to issue warnings can be very short or even nonexistent. This paper presents the concept of a local warning system that exploits and integrates the existing technologies of risk evaluation, environmental measurement, and telecommunications. We describe Project THRUST, a successful implementation of this general, systematic approach to tsunamis. The general approach includes pre-event emergency planning, real-time hazard assessment, and rapid warning via satellite communication links.
The role of health and safety experts in the management of hazardous and toxic wastes in Indonesia
NASA Astrophysics Data System (ADS)
Supriyadi; Hadiyanto
2018-02-01
Occupational safety and health experts in Indonesia have an important role in integrating environmental health and safety factors, including, in this regard, serving as the human resources assigned to undertake hazardous waste management. Comprehensive knowledge and competent skills must be applied responsibly, as inherent elements of the occupational safety and health profession. Management should continue to provide training through external agencies with scientific expertise in toxic waste management, so that occupational safety and health experts can improve their performance in the hierarchy of controls over the presence of hazardous materials. This paper provides an overview of the strategies and competencies that occupational safety and health experts need in taking up hazardous waste management practices.
49 CFR 171.1 - Applicability of Hazardous Materials Regulations (HMR) to persons and functions.
Code of Federal Regulations, 2014 CFR
2014-10-01
... transportation of hazardous materials in commerce and to pre-transportation and transportation functions. (a..., reconditions, repairs, or tests a packaging or a component of a packaging that is represented, marked..., reconditions, repairs, or tests a packaging or a component of a packaging that is represented, marked...
42 CFR 93.217 - Office of Research Integrity or ORI.
Code of Federal Regulations, 2011 CFR
2011-10-01
... 42 Public Health 1 2011-10-01 2011-10-01 false Office of Research Integrity or ORI. 93.217 Section 93.217 Public Health PUBLIC HEALTH SERVICE, DEPARTMENT OF HEALTH AND HUMAN SERVICES HEALTH ASSESSMENTS AND HEALTH EFFECTS STUDIES OF HAZARDOUS SUBSTANCES RELEASES AND FACILITIES PUBLIC HEALTH SERVICE...
Code of Federal Regulations, 2010 CFR
2010-10-01
... Transportation Other Regulations Relating to Transportation (Continued) PIPELINE AND HAZARDOUS MATERIALS SAFETY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION (CONTINUED) PIPELINE SAFETY TRANSPORTATION OF NATURAL AND OTHER GAS BY PIPELINE: MINIMUM FEDERAL SAFETY STANDARDS Gas Transmission Pipeline Integrity Management § 192...
49 CFR Appendix C to Part 195 - Guidance for Implementation of an Integrity Management Program
Code of Federal Regulations, 2010 CFR
2010-10-01
... (Continued) PIPELINE AND HAZARDOUS MATERIALS SAFETY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION (CONTINUED... understanding and analysis of the failure mechanisms or threats to integrity of each pipeline segment. (2) An... pipeline, information and data used for the information analysis; (13) results of the information analyses...
Code of Federal Regulations, 2010 CFR
2010-10-01
... Relating to Transportation (Continued) PIPELINE AND HAZARDOUS MATERIALS SAFETY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION (CONTINUED) PIPELINE SAFETY TRANSPORTATION OF NATURAL AND OTHER GAS BY PIPELINE: MINIMUM FEDERAL SAFETY STANDARDS Gas Transmission Pipeline Integrity Management § 192.937 What is a...
Code of Federal Regulations, 2010 CFR
2010-10-01
... Relating to Transportation (Continued) PIPELINE AND HAZARDOUS MATERIALS SAFETY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION (CONTINUED) PIPELINE SAFETY TRANSPORTATION OF NATURAL AND OTHER GAS BY PIPELINE: MINIMUM FEDERAL SAFETY STANDARDS Gas Transmission Pipeline Integrity Management § 192.915 What knowledge...
Litovskaia, A V; Sadovskiĭ, V V; Vifleemskiĭ, A B
1995-01-01
Clinical and immunologic examination, including level 1 and level 2 tests, covered 429 staff members of chemical enterprises and 1122 workers engaged in the microbiological synthesis of proteins, both groups exposed to irritating gases and isocyanates. Using Kullback's criterion, the studies selected informative parameters to diagnose immune disturbances caused by occupational hazards. For an integral evaluation of the immune state, the authors applied a general immunologic parameter whose values can serve as criteria for early diagnosis of various immune disorders and for defining risk groups among industrial workers exposed to occupational biologic and chemical hazards.
Hydrology Analysis and Modelling for Klang River Basin Flood Hazard Map
NASA Astrophysics Data System (ADS)
Sidek, L. M.; Rostam, N. E.; Hidayah, B.; Roseli, ZA; Majid, W. H. A. W. A.; Zahari, N. Z.; Salleh, S. H. M.; Ahmad, R. D. R.; Ahmad, M. N.
2016-03-01
Flooding, a common environmental hazard worldwide, has in recent times increased as a result of climate change and urbanization, with the effects felt most strongly in developing countries. As a result, the exposure of Tenaga Nasional Berhad (TNB) substations to flooding has increased rapidly, because many existing substations are located in flood-prone areas. By understanding the impact of flooding on its substations, TNB has provided non-structural mitigation by integrating the Flood Hazard Map with its substation locations. Hydrological analysis is an important part of this work, providing runoff as the input for the hydraulic modelling.
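As a simple illustration of the rainfall-runoff step that feeds a hydraulic model, the rational-method sketch below uses invented catchment parameters; it is not the model actually applied to the Klang River basin, which would typically rely on full hydrological modelling and design storms.

```python
# Rational method: peak discharge Q = C * i * A / 360, with C a runoff coefficient,
# i the rainfall intensity in mm/h, A the catchment area in hectares, and Q in m^3/s.
def rational_peak_discharge(c_runoff, intensity_mm_per_h, area_ha):
    return c_runoff * intensity_mm_per_h * area_ha / 360.0

# Hypothetical urbanized sub-catchment upstream of a substation site.
c = 0.75            # largely impervious urban surface
i_100yr = 120.0     # design rainfall intensity (mm/h) for a 100-year storm
area = 850.0        # contributing area (ha)

q_peak = rational_peak_discharge(c, i_100yr, area)
print(f"estimated 100-year peak discharge: {q_peak:.0f} m^3/s")
```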
Remote vacuum compaction of compressible hazardous waste
Coyne, M.J.; Fiscus, G.M.; Sammel, A.G.
1998-10-06
A system is described for remote vacuum compaction and containment of low-level radioactive or hazardous waste comprising a vacuum source, a sealable first flexible container, and a sealable outer flexible container for receiving one or more first flexible containers. A method for compacting low level radioactive or hazardous waste materials at the point of generation comprising the steps of sealing the waste in a first flexible container, sealing one or more first containers within an outer flexible container, breaching the integrity of the first containers, evacuating the air from the inner and outer containers, and sealing the outer container shut. 8 figs.
Remote vacuum compaction of compressible hazardous waste
Coyne, Martin J.; Fiscus, Gregory M.; Sammel, Alfred G.
1998-01-01
A system for remote vacuum compaction and containment of low-level radioactive or hazardous waste comprising a vacuum source, a sealable first flexible container, and a sealable outer flexible container for receiving one or more first flexible containers. A method for compacting low level radioactive or hazardous waste materials at the point of generation comprising the steps of sealing the waste in a first flexible container, sealing one or more first containers within an outer flexible container, breaching the integrity of the first containers, evacuating the air from the inner and outer containers, and sealing the outer container shut.
Remote vacuum compaction of compressible hazardous waste
DOE Office of Scientific and Technical Information (OSTI.GOV)
Coyne, M.J.; Fiscus, G.M.; Sammel, A.G.
1996-12-31
A system is described for remote vacuum compaction and containment of low-level radioactive or hazardous waste comprising a vacuum source, a sealable first flexible container, and a sealable outer flexible container for receiving one or more first flexible containers. A method for compacting low level radioactive or hazardous waste materials at the point of generation comprising the steps of sealing the waste in a first flexible container, sealing one or more first containers within an outer flexible container, breaching the integrity of the first containers, evacuating the air from the inner and outer containers, and sealing the outer container shut.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sezen, Halil; Aldemir, Tunc; Denning, R.
Probabilistic risk assessment of nuclear power plants initially focused on events initiated by internal faults at the plant, rather than external hazards including earthquakes and flooding. Although the importance of external hazards risk analysis is now well recognized, the methods for analyzing low probability external hazards rely heavily on subjective judgment of specialists, often resulting in substantial conservatism. This research developed a framework to integrate the risk of seismic and flooding events using realistic structural models and simulation of response of nuclear structures. The results of four application case studies are presented.
Glovebox and Experiment Safety
NASA Astrophysics Data System (ADS)
Maas, Gerard
2005-12-01
Human spaceflight hardware and operations must comply with NSTS 1700.7. This paper discusses how a glovebox can help. A short layout is given of the process according to NSTS/ISS 13830, explaining the responsibility of the payload organization, the approval authority of the PSRP and the defined review phases (0 through III). Amongst others, the following requirement has to be met: "200.1 Design to Tolerate Failures. Failure tolerance is the basic safety requirement that shall be used to control most payload hazards. The payload must tolerate a minimum number of credible failures and/or operator errors determined by the hazard level. This criterion applies when the loss of a function or the inadvertent occurrence of a function results in a hazardous event. 200.1a Critical Hazards. Critical hazards shall be controlled such that no single failure or operator error can result in damage to STS/ISS equipment, a nondisabling personnel injury, or the use of unscheduled safing procedures that affect operations of the Orbiter/ISS or another payload. 200.1b Catastrophic Hazards. Catastrophic hazards shall be controlled such that no combination of two failures or operator errors can result in the potential for a disabling or fatal personnel injury or loss of the Orbiter/ISS, ground facilities or STS/ISS equipment." For experiments in material science, biological science and life science that require real time operator manipulation, the above requirement may be hard or impossible to meet, especially if the experiment contains substances that are considered hazardous when released into the habitable environment. In this case operation of the experiment in a glovebox can help to comply. A glovebox provides containment of the experiment and at the same time allows manipulation of and visibility into the experiment. The containment inside the glovebox provides failure tolerance because the glovebox uses a negative pressure inside the working volume (WV). The level of failure tolerance is dependent on the identified failure case and the hazardous substance being released (chemical, biological or other). The principle of glovebox operation is explained, including: mechanical enclosure, air circulation, air filtration and operational modes. Limitations of the glovebox are presented: inability of an experiment fire to be detected by the ASDA, containment only with respect to specified substances, etc. There are requirements induced by the glovebox that the experiment must comply with: compatibility with the glovebox filter system, thermal limitations, safety without glovebox services, parameter monitoring when a fire hazard is credible, sufficient containment when entering the glovebox and after the experiment, etc. Experiments that are to be operated in a glovebox shall assess this integrated set-up and the associated operations for compliance with the safety requirements. During this assessment the PSRP shall determine if the provided failure tolerance is sufficient. The gloveboxes that Bradford Engineering (co-)built for human space flight are: USML-1 and 2, MGBX (STS and MIR), MSG, PGBX, LSG-WVA, BGB and PGB. Some of the evolutions are pointed out (experiment services added without compromising safety levels). The major differences of the gloveboxes are presented. For the gloveboxes that are in operation at this time (MSG) or in the near future (BGB, LSG-WVA and PGB) the specific applications are presented.
NASA Astrophysics Data System (ADS)
Weatherill, Graeme; Burton, Paul W.
2010-09-01
The Aegean is the most seismically active and tectonically complex region in Europe. Damaging earthquakes have occurred here throughout recorded history, often resulting in considerable loss of life. The Monte Carlo method of probabilistic seismic hazard analysis (PSHA) is used to determine the level of ground motion likely to be exceeded in a given time period. Multiple random simulations of seismicity are generated to calculate, directly, the ground motion for a given site. Within the seismic hazard analysis we explore the impact of different seismic source models, incorporating both uniform zones and distributed seismicity. A new, simplified, seismic source model, derived from seismotectonic interpretation, is presented for the Aegean region. This is combined into the epistemic uncertainty analysis alongside existing source models for the region, and models derived by a K-means cluster analysis approach. Seismic source models derived using the K-means approach bring a degree of objectivity and reproducibility to the otherwise subjective approach of delineating seismic sources using expert judgment. Similar review and analysis are undertaken for the selection of peak ground acceleration (PGA) attenuation models, incorporating into the epistemic analysis Greek-specific models, European models and a Next Generation Attenuation model. Hazard maps for PGA on a "rock" site with a 10% probability of being exceeded in 50 years are produced and different source and attenuation models are compared. These indicate that Greek-specific attenuation models, with their smaller aleatory variability terms, produce lower PGA hazard, whilst recent European models and the Next Generation Attenuation (NGA) model produce similar results. The Monte Carlo method is extended further to assimilate epistemic uncertainty into the hazard calculation, thus integrating across several appropriate source and PGA attenuation models. Site condition and fault-type are also integrated into the hazard mapping calculations. These hazard maps are in general agreement with previous maps for the Aegean, recognising the highest hazard in the Ionian Islands, Gulf of Corinth and Hellenic Arc. Peak Ground Accelerations for some sites in these regions reach as high as 500-600 cm s⁻² using European/NGA attenuation models, and 400-500 cm s⁻² using Greek attenuation models.
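The Monte Carlo PSHA idea can be conveyed with a heavily simplified sketch: synthetic earthquake catalogues are drawn from an assumed source model, ground motion at the site is predicted with an attenuation relation plus aleatory scatter, and the exceedance level is read off the simulated annual maxima. The recurrence parameters, attenuation coefficients, and site geometry below are placeholders, not those of the Aegean source or attenuation models discussed in the paper.

```python
import numpy as np

rng = np.random.default_rng(42)

# Placeholder source model: one areal source with truncated Gutenberg-Richter recurrence.
years = 50_000           # simulated catalogue length (yr)
annual_rate_m4 = 2.0     # mean annual rate of M >= 4.0 events
b_value = 1.0
m_min, m_max = 4.0, 7.5

def sample_magnitudes(n):
    """Truncated Gutenberg-Richter magnitudes via inverse-transform sampling."""
    beta = b_value * np.log(10.0)
    u = rng.random(n)
    return m_min - np.log(1.0 - u * (1.0 - np.exp(-beta * (m_max - m_min)))) / beta

def pga_cm_s2(mag, dist_km):
    """Placeholder attenuation relation with lognormal aleatory scatter (sigma_lnPGA = 0.6)."""
    ln_pga = 1.2 + 1.3 * mag - 1.3 * np.log(dist_km + 10.0)
    return np.exp(ln_pga + rng.normal(0.0, 0.6, size=np.shape(mag)))

annual_max = np.zeros(years)
n_events = rng.poisson(annual_rate_m4, years)
for y in range(years):
    if n_events[y] == 0:
        continue
    mags = sample_magnitudes(n_events[y])
    dists = rng.uniform(5.0, 150.0, n_events[y])   # epicentral distances to the site (km)
    annual_max[y] = pga_cm_s2(mags, dists).max()

# PGA with a 10% probability of exceedance in 50 years (~475-year return period).
target_annual_p = 1.0 - (1.0 - 0.10) ** (1.0 / 50.0)
pga_10in50 = np.quantile(annual_max, 1.0 - target_annual_p)
print(f"10%-in-50-year PGA from the placeholder model: {pga_10in50:.0f} cm/s^2")
```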
Mapping mountain torrent hazards in the Hexi Corridor using an evidential reasoning approach
NASA Astrophysics Data System (ADS)
Ran, Youhua; Liu, Jinpeng; Tian, Feng; Wang, Dekai
2017-02-01
The Hexi Corridor is an important part of the Silk Road Economic Belt and a crucial channel for westward development in China. Many important national engineering projects pass through the corridor, such as highways, railways, and the West-to-East Gas Pipeline. The frequent torrent disasters greatly impact the security of infrastructure and human safety. In this study, an evidential reasoning approach based on Dempster-Shafer theory is proposed for mapping mountain torrent hazards in the Hexi Corridor. A torrent hazard map for the Hexi Corridor was generated by integrating the driving factors of mountain torrent disasters including precipitation, terrain, flow concentration processes, and the vegetation fraction. The results show that the capability of the proposed method is satisfactory. The torrent hazard map shows that there is high potential torrent hazard in the central and southeastern Hexi Corridor. The results are useful for engineering planning support and resource protection in the Hexi Corridor. Further efforts are discussed for improving torrent hazard mapping and prediction.
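Dempster's rule of combination, the core of the evidential reasoning approach, can be illustrated for a single map cell with two hypotheses, hazard and safe; the basic probability assignments below (nominally from a precipitation layer and a terrain layer) are invented for illustration and are not the weights used in the study.

```python
from itertools import product

# Frame of discernment: {"H"} = hazard, {"S"} = safe, {"H","S"} = uncertainty.
HAZ, SAFE, THETA = frozenset("H"), frozenset("S"), frozenset("HS")

def combine(m1, m2):
    """Dempster's rule: combine two basic probability assignments (dicts over focal sets)."""
    combined = {}
    conflict = 0.0
    for (a, w1), (b, w2) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + w1 * w2
        else:
            conflict += w1 * w2
    # Normalize by the non-conflicting mass.
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

# Hypothetical evidence from a precipitation layer and a terrain/flow-concentration layer.
m_precip = {HAZ: 0.6, SAFE: 0.1, THETA: 0.3}
m_terrain = {HAZ: 0.5, SAFE: 0.2, THETA: 0.3}

m = combine(m_precip, m_terrain)
belief_hazard = m.get(HAZ, 0.0)
plausibility_hazard = sum(v for k, v in m.items() if HAZ & k)
print(f"belief(hazard) = {belief_hazard:.3f}, plausibility(hazard) = {plausibility_hazard:.3f}")
```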
NASA Astrophysics Data System (ADS)
Baruffini, Mirko
2010-05-01
Due to the topographical conditions in Switzerland, the highways and the railway lines are frequently exposed to natural hazards such as rockfalls, debris flows, landslides, avalanches and others. With the rising incidence of those natural hazards, protection measures become an important political issue. However, they are costly, and maximal protection is most probably not economically feasible. Furthermore, risks are distributed in space and time. Consequently, important decision problems arise for public sector decision makers. This asks for a high level of surveillance and preservation along the transalpine lines. Efficient protection alternatives can consequently be obtained by considering the concept of integral risk management. Risk analysis, as the central part of risk management, has gradually become a generally accepted approach for the assessment of current and future scenarios (Loat & Zimmermann 2004). The procedure aims at risk reduction, which can be reached by conventional mitigation on the one hand and the implementation of land-use planning on the other: a combination of active and passive mitigation measures is applied to prevent damage to buildings, people and infrastructures. With a Geographical Information System coupled to a tool developed for risk analysis, it is possible to survey the data in time and space, obtaining an important system for managing natural risks. As a framework, we adopt the Swiss system for risk analysis of gravitational natural hazards (BUWAL 1999). It offers a complete framework for the analysis and assessment of risks due to natural hazards, ranging from hazard assessment for gravitational natural hazards, such as landslides, collapses, rockfalls, floods, debris flows and avalanches, to vulnerability assessment and risk analysis, and the integration into land use planning at the cantonal and municipality level. The scheme is limited to the direct consequences of natural hazards. Thus, we develop a system which integrates the procedures for a complete risk analysis in a Geographic Information System (GIS) toolbox and apply it to our testbed, the Alps-crossing corridor of St. Gotthard. The simulation environment is developed within ArcObjects, the development platform for ArcGIS. The topic of ArcObjects usually emerges when users realize that programming ArcObjects can actually reduce the amount of repetitive work, streamline the workflow, and even produce functionalities that are not easily available in ArcGIS. We have adopted Visual Basic for Applications (VBA) for programming ArcObjects. Because VBA is already embedded within ArcMap and ArcCatalog, it is convenient for ArcGIS users to program ArcObjects in VBA. Our tool visualises data obtained from an analysis of historical records (aerial photo imagery, field surveys, documentation of past events) or from environmental modelling (estimates of the area affected by a given event), together with event attributes such as route number and route position, and thematic maps. As a result of this step the record appears in WebGIS. The user can select a specific area to overview previous hazards in the region. After performing the analysis, a double click on the visualised infrastructures opens the corresponding results. The constantly updated risk maps show all sites that require more protection against natural hazards. The final goal of our work is to offer a versatile tool for risk analysis which can be applied to different situations.
Today our GIS application mainly centralises the documentation of natural hazards. Additionally, the system offers information about natural hazards along the Gotthard line. It is very flexible and can be used as a simple program to model the expansion of natural hazards, as a program to quantitatively estimate risks, or as a tool for detailed analysis at the municipality level. The tool is extensible and can be expanded with additional modules. The initial results of the experimental case study show how useful a GIS-based system can be for effective and efficient disaster response management. In the coming years our GIS application will become a database containing all information needed for the evaluation of risk sites along the Gotthard line. Our GIS application can help the technical management decide about protection measures because, in addition to the visualisation, tools for spatial data analysis will be available.
REFERENCES
Bründl, M. (Ed.) 2009: Risikokonzept für Naturgefahren - Leitfaden. Nationale Plattform für Naturgefahren PLANAT, Bern. 416 pp.
BUWAL 1999: Risikoanalyse bei gravitativen Naturgefahren - Methode, Fallbeispiele und Daten (Risk analyses for gravitational natural hazards). Bundesamt für Umwelt, Wald und Landschaft (BUWAL), Umwelt-Materialien Nr. 107, 1-244.
Loat, R. & Zimmermann, M. 2004: La gestion des risques en Suisse (Risk management in Switzerland). In: Veyret, Y., Garry, G. & Meschinet de Richemont, N. (eds): Risques naturels et aménagement en Europe, Colloque Arche de la Défense, 22-24 octobre 2002, Armand Colin, 108-120.
Maggi, R. et al. 2009: Evaluation of the optimal resilience for vulnerable infrastructure networks. An interdisciplinary pilot study on the transalpine transportation corridors, NRP 54 "Sustainable Development of the Built Environment", Projekt Nr. 405 440, Final Scientific Report, Lugano.
NASA Astrophysics Data System (ADS)
Bevilacqua, Andrea; Neri, Augusto; Esposti Ongaro, Tomaso; Isaia, Roberto; Flandoli, Franco; Bisson, Marina
2016-04-01
Today hundreds of thousands of people live inside the Campi Flegrei caldera (Italy) and in the adjacent part of the city of Naples, making a future eruption of this volcano an event with huge consequences. Very high risks are associated with the occurrence of pyroclastic density currents (PDCs). Mapping the background or long-term PDC hazard in the area is a great challenge due to the unknown eruption time, scale and vent location of the next event, as well as the complex dynamics of the flow over the caldera topography. This is additionally complicated by the remarkable epistemic uncertainty affecting the eruptive record, i.e. the timing of past events, the location of vents and the estimates of PDC areal extent. The first probability maps of PDC invasion were produced by combining a vent-opening probability map, statistical estimates concerning the eruptive scales and a Cox-type temporal model including self-excitement effects, based on the eruptive record of the last 15 kyr. Maps were produced using a Monte Carlo approach and adopting a simplified inundation model based on the "box model" integral approximation, tested against 2D transient numerical simulations of flow dynamics. In this presentation we illustrate the independent effects of eruption scale, vent location and time of forecast of the next event. Specific focus was given to the remarkable differences between the eastern and western sectors of the caldera and their effects on the hazard maps. The analysis made it possible to identify areas with elevated probabilities of flow invasion as a function of the diverse assumptions made. With the quantification of some sources of uncertainty in relation to the system, we were also able to provide mean and percentile maps of PDC hazard levels.
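The Monte Carlo mapping workflow described above (sample a vent location, sample an eruption scale, propagate a simplified invasion footprint, accumulate per-cell invasion frequencies) can be sketched roughly as follows. The vent-opening distribution, the scale-to-runout relation standing in for the box-model integral, and the grid extent are all invented for illustration and are not the authors' calibrated model.

    # Illustrative Monte Carlo sketch of a PDC invasion-probability map,
    # assuming a circular runout whose radius grows with eruption scale.
    import numpy as np

    rng = np.random.default_rng(0)
    nx = ny = 100                      # hypothetical 100 x 100 grid over the caldera
    x, y = np.meshgrid(np.linspace(-6, 6, nx), np.linspace(-6, 6, ny))  # km
    invaded_count = np.zeros((ny, nx))

    n_sim = 5000
    for _ in range(n_sim):
        # vent location sampled from a bivariate normal, a stand-in for the vent-opening map
        vx, vy = rng.normal(0.0, 2.0, size=2)
        # eruption scale mapped to a runout distance by a hypothetical power law
        volume = 10 ** rng.uniform(-2, 0)      # DRE km^3
        runout = 3.0 * volume ** 0.4           # km, stand-in for the box-model result
        invaded_count += (np.hypot(x - vx, y - vy) <= runout)

    invasion_probability = invaded_count / n_sim   # per-cell probability of PDC invasion
    print(invasion_probability.max())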
Synopsis of Precision Landing and Hazard Avoidance (PL&HA) Capabilities for Space Exploration
NASA Technical Reports Server (NTRS)
Robertson, Edward A.
2017-01-01
Until recently, robotic exploration missions to the Moon, Mars, and other solar system bodies relied upon controlled blind landings. Because terrestrial techniques for terrain relative navigation (TRN) had not yet been evolved to support space exploration, landing dispersions were driven by the capabilities of inertial navigation systems combined with surface relative altimetry and velocimetry. Lacking tight control over the actual landing location, mission success depended on the statistical vetting of candidate landing areas within the predicted landing dispersion ellipse based on orbital reconnaissance data, combined with the ability of the spacecraft to execute a controlled landing in terms of touchdown attitude, attitude rates, and velocity. In addition, the sensors, algorithms, and processing technologies required to perform autonomous hazard detection and avoidance in real time during the landing sequence were not yet available. Over the past decade, NASA has invested substantial resources on the development, integration, and testing of autonomous precision landing and hazard avoidance (PL&HA) capabilities. In addition to substantially improving landing accuracy and safety, these autonomous PL&HA functions also offer access to targets of interest located within more rugged and hazardous terrain. Optical TRN systems are baselined on upcoming robotic landing missions to the Moon and Mars, and NASA JPL is investigating the development of a comprehensive PL&HA system for a Europa lander. These robotic missions will demonstrate and mature PL&HA technologies that are considered essential for future human exploration missions. PL&HA technologies also have applications to rendezvous and docking/berthing with other spacecraft, as well as proximity navigation, contact, and retrieval missions to smaller bodies with microgravity environments, such as asteroids.
NASA Technical Reports Server (NTRS)
1971-01-01
The findings, conclusions, and recommendations relative to the investigations conducted to evaluate tests for classifying pyrotechnic materials and end items as to their hazard potential are presented. Information required to establish an applicable means of determining the potential hazards of pyrotechnics is described. Hazard evaluations are based on the peak overpressure or impulse resulting from the explosion as a function of distance from the source. Other hazard classification tests include dust ignition sensitivity, impact ignition sensitivity, spark ignition sensitivity, and differential thermal analysis.
Armenti, Karla; Moure-Eraso, Rafael; Slatin, Craig; Geiser, Ken
2003-01-01
Occupational and environmental health issues are not always considered simultaneously when attempting to reduce or eliminate hazardous materials from our environment. Methods used to decrease exposure to hazardous chemicals in the workplace often lead to increased exposure in the environment and to the community outside the workplace. Conversely, efforts to control emissions of hazardous chemicals into the environment often lead to increased exposure to the workers inside the plant. There are government regulations in place that ensure a safe work environment or a safe outside environment; however, there is little integration of both approaches when considering the public's health as a whole. This article examines some of the reasons behind this dichotomy, focusing on the regulatory and policy frameworks with respect to workplace and environment that have resulted in the inability of the Occupational Safety and Health Administration (OSHA) and the Environmental Protection Agency (EPA) to coordinate their efforts to protect public health. The components of the Pollution Prevention Act and its potential to serve as a model for integrating occupational and environmental health are discussed. Limitations regarding enforcement of pollution prevention, as well as its disconnection from the work environment are equally highlighted. The article finishes by examining the barriers to integrating the occupational and environmental health paradigms and the promotion of primary prevention in public health.
NASA Astrophysics Data System (ADS)
Hammitzsch, M.; Spazier, J.; Reißland, S.
2014-12-01
Usually, tsunami early warning and mitigation systems (TWS or TEWS) are based on several software components deployed in a client-server based infrastructure. The vast majority of systems importantly include desktop-based clients with a graphical user interface (GUI) for the operators in early warning centers. However, in times of cloud computing and ubiquitous computing the use of concepts and paradigms, introduced by continuously evolving approaches in information and communications technology (ICT), has to be considered even for early warning systems (EWS). Based on the experiences and the knowledge gained in three research projects - 'German Indonesian Tsunami Early Warning System' (GITEWS), 'Distant Early Warning System' (DEWS), and 'Collaborative, Complex, and Critical Decision-Support in Evolving Crises' (TRIDEC) - new technologies are exploited to implement a cloud-based and web-based prototype to open up new prospects for EWS. This prototype, named 'TRIDEC Cloud', merges several complementary external and in-house cloud-based services into one platform for automated background computation with graphics processing units (GPU), for web-mapping of hazard-specific geospatial data, and for serving relevant functionality to handle, share, and communicate threat-specific information in a collaborative and distributed environment. The prototype in its current version addresses tsunami early warning and mitigation. The integration of GPU-accelerated tsunami simulation computations has been an integral part of this prototype to foster early warning with on-demand tsunami predictions based on actual source parameters. However, the platform is meant for researchers around the world to make use of the cloud-based GPU computation to analyze other types of geohazards and natural hazards and to react upon the computed situation picture with a web-based GUI in a web browser at remote sites. The current website is an early alpha version for demonstration purposes to give the concept a whirl and to shape science's future. Further functionality, improvements and possibly profound changes have to be implemented successively based on the users' evolving needs.
An assessment of the crash fire hazard of liquid hydrogen fueled aircraft
NASA Technical Reports Server (NTRS)
1982-01-01
The crash fire hazards of liquid hydrogen fueled aircraft relative to those of mission-equivalent aircraft fueled either with conventional fuel or with liquefied methane were evaluated. The aircraft evaluated were based on a Lockheed Corporation design for a 400-passenger, Mach 0.85, 5500-n. mile aircraft. Four crash scenarios were considered, ranging from a minor incident causing some loss of fuel system integrity to a catastrophic crash. Major tasks included a review of the hazardous properties of the alternate fuels and of historic crash fire data; a comparative hazard evaluation for each of the three fuels under the four crash scenarios; a comprehensive review and analysis; and an identification of areas requiring further development work. The conclusion was that the crash fire hazards are not significantly different when compared in general for the three fuels, although some fuels showed minor advantages in one respect or another.
The influence of hazard models on GIS-based regional risk assessments and mitigation policies
Bernknopf, R.L.; Rabinovici, S.J.M.; Wood, N.J.; Dinitz, L.B.
2006-01-01
Geographic information systems (GIS) are important tools for understanding and communicating the spatial distribution of risks associated with natural hazards in regional economies. We present a GIS-based decision support system (DSS) for assessing community vulnerability to natural hazards and evaluating potential mitigation policy outcomes. The Land Use Portfolio Modeler (LUPM) integrates earth science and socioeconomic information to predict the economic impacts of loss-reduction strategies. However, the potential use of such systems in decision making may be limited when multiple but conflicting interpretations of the hazard are available. To explore this problem, we conduct a policy comparison using the LUPM to test the sensitivity of three available assessments of earthquake-induced lateral-spread ground failure susceptibility in a coastal California community. We find that the uncertainty regarding the interpretation of the science inputs can influence the development and implementation of natural hazard management policies. Copyright © 2006 Inderscience Enterprises Ltd.
Applying the Land Use Portfolio Model with Hazus to analyse risk from natural hazard events
Dinitz, Laura B.; Taketa, Richard A.
2013-01-01
This paper describes and demonstrates the integration of two geospatial decision-support systems for natural-hazard risk assessment and management. Hazus is a risk-assessment tool developed by the Federal Emergency Management Agency to identify risks and estimate the severity of risk from natural hazards. The Land Use Portfolio Model (LUPM) is a risk-management tool developed by the U.S. Geological Survey to evaluate plans or actions intended to reduce risk from natural hazards. We analysed three mitigation policies for one earthquake scenario in the San Francisco Bay area to demonstrate the added value of using Hazus and the LUPM together. The demonstration showed that Hazus loss estimates can be input to the LUPM to obtain estimates of losses avoided through mitigation, rates of return on mitigation investment, and measures of uncertainty. Together, they offer a more comprehensive approach to help with decisions for reducing risk from natural hazards.
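A toy example of the kind of quantity obtained by feeding Hazus-style loss estimates into a LUPM-style comparison, namely losses avoided and the return on mitigation investment; all figures are hypothetical and not taken from the San Francisco Bay area demonstration.

    # Hypothetical scenario loss estimates (of the sort a Hazus run provides)
    # combined into LUPM-style mitigation metrics.
    baseline_loss = 120.0e6       # expected scenario loss without mitigation
    mitigated_loss = 85.0e6       # expected scenario loss with the mitigation policy
    mitigation_cost = 20.0e6      # cost of implementing the policy

    losses_avoided = baseline_loss - mitigated_loss
    return_on_investment = losses_avoided / mitigation_cost

    print(f"Losses avoided: ${losses_avoided / 1e6:.1f} M")
    print(f"Return per dollar of mitigation: {return_on_investment:.2f}")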
NASA Astrophysics Data System (ADS)
Chang, N. B.
2016-12-01
Many countries are concerned with development and redevelopment efforts in urban regions to reduce flood risk, considering hazards such as high-tide events, storm surge, flash floods, stormwater runoff, and the impacts of sea level rise. Combining these present and future hazards with the vulnerable characteristics found throughout coastal communities, such as predominantly low-lying areas and increasing urban development, creates scenarios of increasing exposure to flood hazard. As such, the most vulnerable areas require adaptation strategies and mitigation actions for flood hazard management. In addition, in the U.S., Numeric Nutrient Criteria (NNC) are a critical tool for protecting and restoring the designated uses of a waterbody with regard to nitrogen and phosphorus pollution. Strategies such as low impact development (LID) have been promoted in recent years as an alternative to traditional stormwater management and drainage to control both flooding and water quality impacts. LID utilizes decentralized multifunctional site designs and incorporates on-site stormwater management practices rather than conventional stormwater management approaches that divert flow toward centralized facilities. How to integrate hydrologic and water quality models to achieve such decision support becomes a challenge. The Cross Bayou Watershed of Pinellas County in Tampa Bay, a highly urbanized coastal watershed, is utilized as a case study due to its sensitivity to flood hazards and water quality management within the watershed. This study will aid the County, as a decision maker, in implementing its stormwater management policy and honoring recent state NNC policy by demonstrating an integrated hydrologic and water quality model, comprising the Interconnected Channel and Pond Routing Model v.4 (ICPR4) and the BMPTRAIN model, as a decision support tool. The ICPR4 can be further coupled with the ADCIRC/SWAN model to reflect storm surge and sea level rise in coastal regions.
Integrated Research on Disaster Risk - A Review
NASA Astrophysics Data System (ADS)
Beer, T.
2016-12-01
Integrated Research on Disaster Risk, generally known as IRDR, is a decade-long research programme co-sponsored by the International Council for Science (ICSU), the International Social Science Council (ISSC), and the United Nations International Strategy for Disaster Reduction (UNISDR). It is a global, multi-disciplinary approach to dealing with the challenges brought by natural disasters, mitigating their impacts, and improving related policy-making mechanisms. The home page is at http://www.irdrinternational.org/. The research programme was named Integrated Research on Disaster Risk to indicate that it addresses the challenge of natural and human-induced environmental hazards. In November 2008 and May 2009 respectively, the ISSC and the UNISDR agreed to join the ICSU in co-sponsoring the IRDR programme. Although the approaches in the sciences vary, the IRDR programme approaches the issues of natural and human-induced hazards and disasters from several perspectives: from the hazards to the disasters, and from the human exposures and vulnerabilities back to the hazards. This coordinated and multi-dimensional approach takes the IRDR programme beyond approaches that have traditionally been undertaken. To meet its research objectives the IRDR established four core projects, comprising working groups of experts from diverse disciplines, to formulate new methods for addressing the shortcomings of current disaster risk research: Assessment of Integrated Research on Disaster Risk (AIRDR); Disaster Loss Data (DATA); Forensic Investigations of Disasters (FORIN); and Risk Interpretation and Action (RIA). Dr Tom Beer was a member of both the scoping and planning groups and of the committee undertaking a mid-term review of IRDR, with the terms of reference being to examine and to report by November 2016 on: (1) strategic planning and implementation; (2) governance; (3) secretariat, funding and operations; (4) stakeholders and partnerships; (5) communication, visibility and influence; and (6) future development. His talk will give an overview of the history and science of IRDR and some of the outcomes of the mid-term review.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Williams, Joshua M.
Manufacturing tasks that are deemed too hazardous for workers require the use of automation, robotics, and/or other remote handling tools. The associated hazards may be radiological or nonradiological, and based on the characteristics of the environment and processing, a design may necessitate robotic labor, human labor, or both. There are other factors, such as cost, ergonomics, maintenance, and efficiency, that also affect task allocation and other design choices. Handling the tradeoffs of these factors can be complex, and lack of experience can be an issue when trying to determine whether and what feasible automation/robotics options exist. To address this problem, we utilize common engineering design approaches adapted for manufacturing system design in hazardous environments. We limit our scope to the conceptual and embodiment design stages, specifically a computational algorithm for concept generation and early design evaluation. With regard to concept generation, we first develop the functional model or function structure for the process, using the common 'verb-noun' format for describing function. A common language or functional basis for manufacturing was developed and utilized to formalize function descriptions and guide rules for function decomposition. Potential components for embodiment are also grouped in terms of this functional language and are stored in a database. The properties of each component are given as quantitative and qualitative criteria. Operators are also rated for task-relevant criteria, which are used to address task compatibility. Through the gathering of process requirements/constraints, construction of the component database, and development of the manufacturing basis and rule set, design knowledge is stored and available for computer use. Thus, once the higher-level process functions are defined, the computer can automate the synthesis of new design concepts through alternating steps of embodiment and function structure updates/decomposition. In the process, criteria guide the function allocation of components/operators and help ensure compatibility and feasibility. Through multiple function assignment options and varied function structures, multiple design concepts are created. All of the generated designs are then evaluated based on a number of relevant evaluation criteria: cost, dose, ergonomics, hazards, efficiency, etc. These criteria are computed using physical properties/parameters of each system based on the qualities an engineer would use to make evaluations. Nuclear processes such as oxide conversion and electrorefining are utilized to aid algorithm development and provide test cases for the completed program. Through our approach, we capture design knowledge related to manufacturing and other operations in hazardous environments to enable a computational program to automatically generate and evaluate system design concepts.
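A highly simplified sketch of the concept-generation idea described above: process functions expressed as verb-noun pairs are matched against a small component database, filtered by a task-relevant criterion, and the cross-product of feasible assignments yields candidate design concepts. Component names, costs and dose values are invented, and the real algorithm's function-structure decomposition and updates are omitted here.

    # Toy function-to-component matching for concept generation.
    component_db = {
        ("transfer", "canister"): [
            {"name": "overhead manipulator", "cost": 450_000, "dose_to_worker": 0.0},
            {"name": "manual cart + operator", "cost": 40_000, "dose_to_worker": 1.2},
        ],
        ("heat", "crucible"): [
            {"name": "induction furnace", "cost": 300_000, "dose_to_worker": 0.1},
        ],
    }

    def generate_concepts(function_structure, max_dose_per_task):
        """Enumerate feasible component assignments for a list of (verb, noun) functions."""
        concepts = [[]]
        for func in function_structure:
            options = [c for c in component_db.get(func, [])
                       if c["dose_to_worker"] <= max_dose_per_task]
            concepts = [concept + [(func, opt)] for concept in concepts for opt in options]
        return concepts

    for concept in generate_concepts([("transfer", "canister"), ("heat", "crucible")],
                                     max_dose_per_task=2.0):
        total_cost = sum(opt["cost"] for _, opt in concept)
        print([opt["name"] for _, opt in concept], f"total cost: {total_cost}")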
AN ADVANCED TOOL FOR APPLIED INTEGRATED SAFETY MANAGEMENT
DOE Office of Scientific and Technical Information (OSTI.GOV)
Potts, T. Todd; Hylko, James M.; Douglas, Terence A.
2003-02-27
WESKEM, LLC's Environmental, Safety and Health (ES&H) Department had previously assessed that a lack of consistency, poor communication and the use of antiquated communication tools could result in varying operating practices, as well as a failure to capture and disseminate appropriate Integrated Safety Management (ISM) information. To address these issues, the ES&H Department established an Activity Hazard Review (AHR)/Activity Hazard Analysis (AHA) process for systematically identifying, assessing, and controlling hazards associated with project work activities during work planning and execution. Depending on the scope of a project, information from field walkdowns and table-top meetings is collected on an AHR form. The AHA then documents the potential failure and consequence scenarios for a particular hazard. Also, the AHA recommends whether the type of mitigation appears appropriate or whether additional controls should be implemented. Since the application is web based, the information is captured in a single system and organized according to the >200 work activities already recorded in the database. Using the streamlined AHA method improved cycle time from over four hours to an average of one hour, allowing more time to analyze unique hazards and develop appropriate controls. Also, the enhanced configuration control created a readily available AHA library to research and utilize, along with standardizing hazard analysis and control selection across four separate work sites located in Kentucky and Tennessee. The AHR/AHA system provides an applied example of how the ISM concept evolved into a standardized field-deployed tool yielding considerable efficiency gains in project planning and resource utilization. Employee safety is preserved through detailed planning that now requires only a portion of the time previously necessary. The available resources can then be applied to implementing appropriate engineering, administrative and personal protective equipment controls in the field.
Gianassi, S; Bisin, S; Bindi, B; Spitaleri, I; Bambi, F
2010-01-01
The collection and handling of hematopoietic stem cells (HSCs) must meet high quality requirements. An integrated Quality Risk Management approach can help to identify and contain potential risks related to HSC production. Risk analysis techniques allow one to "weigh" identified hazards, considering the seriousness of their effects, frequency, and detectability, seeking to prevent the most harmful hazards. The Hazard Analysis Critical Control Point approach, recognized as the most appropriate technique to identify risks associated with physical, chemical, and biological hazards for cellular products, consists of classifying finished product specifications and limits of acceptability, identifying all off-specifications, defining the activities that can cause them, and finally establishing both a monitoring system for each Critical Control Point and corrective actions for deviations. The severity of possible effects on patients, as well as the occurrence and detectability of critical parameters, are measured on quantitative scales (Risk Priority Number [RPN]). Risk analysis with this technique was performed on the HPC manipulation process at our blood center. The data analysis showed that the hazards with the highest RPN values and the greatest impact on the process are loss of dose and loss of tracking; the technical skills of operators and the manual transcription of data were the most critical parameters. Problems related to operator skills are handled by defining targeted training programs, while other critical parameters can be mitigated with the use of continuous control systems. The blood center management software was complemented by a labeling system with forms designed to comply with the standards in force, and by starting the implementation of a cryopreservation management module. Copyright 2010 Elsevier Inc. All rights reserved.
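The Risk Priority Number referred to above is the product of severity, occurrence and detectability scores on quantitative scales; a minimal illustration with invented hazards and scores (1-10 scales assumed):

    # RPN = severity x occurrence x detectability; hazards and scores are hypothetical.
    hazards = {
        "loss of dose during transfer": {"severity": 9, "occurrence": 3, "detectability": 6},
        "loss of tracking (mislabeling)": {"severity": 10, "occurrence": 2, "detectability": 7},
        "manual transcription error": {"severity": 7, "occurrence": 5, "detectability": 5},
    }

    def rpn(score):
        return score["severity"] * score["occurrence"] * score["detectability"]

    # Rank hazards by RPN, highest priority first.
    for name, score in sorted(hazards.items(), key=lambda kv: rpn(kv[1]), reverse=True):
        print(f"{name}: RPN = {rpn(score)}")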
This regulation prescribes Chemical Data Quality Management (CDQM) responsibilities and procedures for projects involving hazardous, toxic and/or radioactive waste (HTRW) materials. Its purpose is to assure that the analytical data meet project data quality objectives. This is the umbrella regulation that defines CDQM activities and integrates all of the other U.S. Army Corps of Engineers (USACE) guidance on environmental data quality management .
Evacuation Planning in the TMI Accident
1980-01-01
their scientific counterparts (for a particular hazard) can arrange planned and systematic procedures for handling the "translation" of a technical hazard...conditions is combined with a limited number of jurisdictions which can be integrated in a state-wide or large area disaster-response operation. These...should include not only the expansion of communications available locally but also their augmentation by mobile radio units and other additions which can
Safety issues of high-concentrated hydrogen peroxide production used as rocket propellant
NASA Astrophysics Data System (ADS)
Romantsova, O. V.; Ulybin, V. B.
2015-04-01
The article considers the possibility of producing highly concentrated hydrogen peroxide with the Russian technology of isopropyl alcohol autoxidation. An analysis of fire/explosion hazards and of the reasons for insufficient product quality is conducted for this technology, and a modified technology is presented. The non-standard fire/explosion characteristics required for an integrated fire/explosion hazard rating of the modified hydrogen peroxide production based on the autoxidation of isopropyl alcohol are defined.
Simulating Scenario Floods for Hazard Assessment on the Lower Bicol Floodplain, the Philippines
NASA Astrophysics Data System (ADS)
Usamah, Muhibuddin Bin; Alkema, Dinand
This paper describes the first results from a study of the behavior of floods in the lower Bicol area, the Philippines. A 1D2D dynamic hydraulic model was applied to simulate a set of scenario floods over the complex topography of the city of Naga and the surrounding area. The simulation results are integrated into a multi-parameter hazard zonation for the five scenario floods.
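A multi-parameter hazard zonation step of this kind can be pictured as combining, cell by cell, the simulated depth and flow velocity into a hazard class. The sketch below uses a hypothetical depth-velocity proxy and class thresholds that are not those of the Bicol study.

    # Toy flood hazard classification from simulated depth and velocity grids.
    import numpy as np

    def hazard_class(depth_m, velocity_ms):
        """Return class 0 (none) to 3 (high) based on a simple depth x velocity proxy."""
        intensity = depth_m * np.maximum(velocity_ms, 0.5)
        return np.digitize(intensity, bins=[0.1, 0.5, 1.5])  # hypothetical class edges

    depth = np.random.default_rng(1).uniform(0, 3, size=(50, 50))      # m, stand-in for a 1D2D result
    velocity = np.random.default_rng(2).uniform(0, 2, size=(50, 50))   # m/s
    zonation = hazard_class(depth, velocity)
    print(np.bincount(zonation.ravel(), minlength=4))  # number of cells per hazard class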
Integrating population dynamics into mapping human exposure to seismic hazard
NASA Astrophysics Data System (ADS)
Freire, S.; Aubrecht, C.
2012-11-01
Disaster risk is not fully characterized without taking into account vulnerability and population exposure. Assessment of earthquake risk in urban areas would benefit from considering the variation of population distribution at more detailed spatial and temporal scales, and from a more explicit integration of this improved demographic data with existing seismic hazard maps. In the present work, "intelligent" dasymetric mapping is used to model population dynamics at high spatial resolution in order to benefit the analysis of spatio-temporal exposure to earthquake hazard in a metropolitan area. These night- and daytime-specific population densities are then classified and combined with seismic intensity levels to derive new spatially-explicit four-class-composite maps of human exposure. The presented approach enables a more thorough assessment of population exposure to earthquake hazard. Results show that there are significantly more people potentially at risk in the daytime period, demonstrating the shifting nature of population exposure in the daily cycle and the need to move beyond conventional residence-based demographic data sources to improve risk analyses. The proposed fine-scale maps of human exposure to seismic intensity are mainly aimed at benefiting visualization and communication of earthquake risk, but can be valuable in all phases of the disaster management process where knowledge of population densities is relevant for decision-making.
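The classification-and-combination step described above can be sketched schematically: a daytime population-density surface and a seismic-intensity surface are each classified and then merged into a four-class composite exposure map. Class edges and the combination rule below are hypothetical, not those of the study.

    # Schematic composite exposure map from classified density and intensity rasters.
    import numpy as np

    rng = np.random.default_rng(0)
    density = rng.lognormal(mean=5, sigma=1, size=(200, 200))     # persons / km^2 (daytime)
    intensity = rng.uniform(5, 9, size=(200, 200))                # macroseismic intensity

    density_class = np.digitize(density, bins=[50, 500, 5000])    # 0..3: low to very high
    intensity_class = np.digitize(intensity, bins=[6, 7, 8])      # 0..3

    # Composite exposure: combine the two class levels and collapse to four classes.
    composite = np.minimum((density_class + intensity_class) // 2, 3)
    print(np.bincount(composite.ravel(), minlength=4))            # cells per exposure class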
Scholz, Stefan; Sela, Erika; Blaha, Ludek; Braunbeck, Thomas; Galay-Burgos, Malyka; García-Franco, Mauricio; Guinea, Joaquin; Klüver, Nils; Schirmer, Kristin; Tanneberger, Katrin; Tobor-Kapłon, Marysia; Witters, Hilda; Belanger, Scott; Benfenati, Emilio; Creton, Stuart; Cronin, Mark T D; Eggen, Rik I L; Embry, Michelle; Ekman, Drew; Gourmelon, Anne; Halder, Marlies; Hardy, Barry; Hartung, Thomas; Hubesch, Bruno; Jungmann, Dirk; Lampi, Mark A; Lee, Lucy; Léonard, Marc; Küster, Eberhard; Lillicrap, Adam; Luckenbach, Till; Murk, Albertinka J; Navas, José M; Peijnenburg, Willie; Repetto, Guillermo; Salinas, Edward; Schüürmann, Gerrit; Spielmann, Horst; Tollefsen, Knut Erik; Walter-Rohde, Susanne; Whale, Graham; Wheeler, James R; Winter, Matthew J
2013-12-01
Tests with vertebrates are an integral part of environmental hazard identification and risk assessment of chemicals, plant protection products, pharmaceuticals, biocides, feed additives and effluents. These tests raise ethical and economic concerns and are considered as inappropriate for assessing all of the substances and effluents that require regulatory testing. Hence, there is a strong demand for replacement, reduction and refinement strategies and methods. However, until now alternative approaches have only rarely been used in regulatory settings. This review provides an overview on current regulations of chemicals and the requirements for animal tests in environmental hazard and risk assessment. It aims to highlight the potential areas for alternative approaches in environmental hazard identification and risk assessment. Perspectives and limitations of alternative approaches to animal tests using vertebrates in environmental toxicology, i.e. mainly fish and amphibians, are discussed. Free access to existing (proprietary) animal test data, availability of validated alternative methods and a practical implementation of conceptual approaches such as the Adverse Outcome Pathways and Integrated Testing Strategies were identified as major requirements towards the successful development and implementation of alternative approaches. Although this article focusses on European regulations, its considerations and conclusions are of global relevance. Copyright © 2013 Elsevier Inc. All rights reserved.
Unmanned aircraft system sense and avoid integrity and continuity
NASA Astrophysics Data System (ADS)
Jamoom, Michael B.
This thesis describes new methods to guarantee safety of sense and avoid (SAA) functions for Unmanned Aircraft Systems (UAS) by evaluating integrity and continuity risks. Previous SAA efforts focused on relative safety metrics, such as risk ratios, comparing the risk of using an SAA system versus not using it. The methods in this thesis evaluate integrity and continuity risks as absolute measures of safety, as is the established practice in commercial aircraft terminal area navigation applications. The main contribution of this thesis is a derivation of a new method, based on a standard intruder relative constant velocity assumption, that uses hazard state estimates and estimate error covariances to establish (1) the integrity risk of the SAA system not detecting imminent loss of '"well clear," which is the time and distance required to maintain safe separation from intruder aircraft, and (2) the probability of false alert, the continuity risk. Another contribution is applying these integrity and continuity risk evaluation methods to set quantifiable and certifiable safety requirements on sensors. A sensitivity analysis uses this methodology to evaluate the impact of sensor errors on integrity and continuity risks. The penultimate contribution is an integrity and continuity risk evaluation where the estimation model is refined to address realistic intruder relative linear accelerations, which goes beyond the current constant velocity standard. The final contribution is an integrity and continuity risk evaluation addressing multiple intruders. This evaluation is a new innovation-based method to determine the risk of mis-associating intruder measurements. A mis-association occurs when the SAA system incorrectly associates a measurement to the wrong intruder, causing large errors in the estimated intruder trajectories. The new methods described in this thesis can help ensure safe encounters between aircraft and enable SAA sensor certification for UAS integration into the National Airspace System.
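The integrity and continuity risks discussed above can be pictured, in a deliberately simplified one-dimensional form, as tail probabilities of a Gaussian estimate of the miss distance: the chance that no alert is raised although well clear is actually lost, and the chance that an alert is raised although the encounter stays well clear. The sketch assumes a Gaussian estimate error with invented thresholds; the thesis itself works with full hazard-state vectors and covariances.

    # One-dimensional Gaussian illustration of missed-detection and false-alert probabilities.
    from scipy.stats import norm

    R_wc = 1.2          # well-clear radius (nmi): below this, separation is lost (hypothetical)
    T_alert = 1.6       # alert if the estimated miss distance falls below this (nmi)
    sigma = 0.25        # 1-sigma error of the estimated miss distance (nmi)

    def missed_detection_probability(true_miss):
        """P(no alert) although the true miss distance violates well clear."""
        return norm.sf(T_alert, loc=true_miss, scale=sigma) if true_miss <= R_wc else 0.0

    def false_alert_probability(true_miss):
        """P(alert) although the encounter actually stays well clear."""
        return norm.cdf(T_alert, loc=true_miss, scale=sigma) if true_miss > R_wc else 0.0

    print(f"Missed detection at true miss 1.0 nmi: {missed_detection_probability(1.0):.2e}")
    print(f"False alert at true miss 2.0 nmi:      {false_alert_probability(2.0):.2e}")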
Extended cox regression model: The choice of timefunction
NASA Astrophysics Data System (ADS)
Isik, Hatice; Tutkun, Nihal Ata; Karasoy, Durdu
2017-07-01
The Cox regression model (CRM), which takes into account the effect of censored observations, is one of the most widely applied models in survival analysis for evaluating the effects of covariates. Proportional hazards (PH), which requires a constant hazard ratio over time, is the key assumption of the CRM. Using the extended CRM provides a test of the PH assumption by including a time-dependent covariate, and offers an alternative model in the case of non-proportional hazards. In this study, different types of real data sets are used to choose the time function, and the differences between time functions are analyzed and discussed.
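In its simplest form, the extended Cox model referred to above augments the linear predictor with an interaction between a covariate and a chosen time function g(t), so that gamma = 0 recovers the proportional-hazards case; a common notation, with t, log t or similar as candidate time functions, is:

    h(t \mid x) = h_0(t) \exp\{\beta x + \gamma \, x \, g(t)\}, \qquad g(t) \in \{\, t,\ \log t,\ \sqrt{t},\ \ldots \,\}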
Castillo, Maria Isabel; Larsen, Emily; Cooke, Marie; Marsh, Nicole M; Wallis, Marianne C; Finucane, Julie; Brown, Peter; Mihala, Gabor; Byrnes, Joshua; Walker, Rachel; Cable, Prudence; Zhang, Li; Sear, Candi; Jackson, Gavin; Rowsome, Anna; Ryan, Alison; Humphries, Julie C; Sivyer, Susan; Flanigan, Kathy; Rickard, Claire M
2018-01-01
Introduction Peripheral intravenous catheters (PIVCs) are frequently used in hospitals. However, PIVC complications are common, with failures leading to treatment delays, additional procedures, patient pain and discomfort, increased clinician workload and substantially increased healthcare costs. Recent evidence suggests integrated PIVC systems may be more effective than traditional non-integrated PIVC systems in reducing phlebitis, infiltration and costs and increasing functional dwell time. The study aim is to determine the efficacy, cost–utility and acceptability to patients and professionals of an integrated PIVC system compared with a non-integrated PIVC system. Methods and analysis Two-arm, multicentre, randomised controlled superiority trial of integrated versus non-integrated PIVC systems to compare effectiveness on clinical and economic outcomes. Recruitment of 1560 patients over 2 years, with randomisation by a centralised service ensuring allocation concealment. Primary outcomes: catheter failure (composite endpoint) for reasons of: occlusion, infiltration/extravasation, phlebitis/thrombophlebitis, dislodgement, localised or catheter-associated bloodstream infections. Secondary outcomes: first time insertion success, types of PIVC failure, device colonisation, insertion pain, functional dwell time, adverse events, mortality, cost–utility and consumer acceptability. One PIVC per patient will be included, with intention-to-treat analysis. Baseline group comparisons will be made for potentially clinically important confounders. The proportional hazards assumption will be checked, and Cox regression will test the effect of group, patient, device and clinical variables on failure. An as-treated analysis will assess the effect of protocol violations. Kaplan-Meier survival curves with log-rank tests will compare failure by group over time. Secondary endpoints will be compared between groups using parametric/non-parametric techniques. Ethics and dissemination Ethical approval from the Royal Brisbane and Women’s Hospital Human Research Ethics Committee (HREC/16/QRBW/527), Griffith University Human Research Ethics Committee (Ref No. 2017/002) and the South Metropolitan Health Services Human Research Ethics Committee (Ref No. 2016–239). Results will be published in peer-reviewed journals. Trial registration number ACTRN12617000089336. PMID:29764876
Castillo, Maria Isabel; Larsen, Emily; Cooke, Marie; Marsh, Nicole M; Wallis, Marianne C; Finucane, Julie; Brown, Peter; Mihala, Gabor; Carr, Peter J; Byrnes, Joshua; Walker, Rachel; Cable, Prudence; Zhang, Li; Sear, Candi; Jackson, Gavin; Rowsome, Anna; Ryan, Alison; Humphries, Julie C; Sivyer, Susan; Flanigan, Kathy; Rickard, Claire M
2018-05-14
Peripheral intravenous catheters (PIVCs) are frequently used in hospitals. However, PIVC complications are common, with failures leading to treatment delays, additional procedures, patient pain and discomfort, increased clinician workload and substantially increased healthcare costs. Recent evidence suggests integrated PIVC systems may be more effective than traditional non-integrated PIVC systems in reducing phlebitis, infiltration and costs and increasing functional dwell time. The study aim is to determine the efficacy, cost-utility and acceptability to patients and professionals of an integrated PIVC system compared with a non-integrated PIVC system. Two-arm, multicentre, randomised controlled superiority trial of integrated versus non-integrated PIVC systems to compare effectiveness on clinical and economic outcomes. Recruitment of 1560 patients over 2 years, with randomisation by a centralised service ensuring allocation concealment. Primary outcomes: catheter failure (composite endpoint) for reasons of: occlusion, infiltration/extravasation, phlebitis/thrombophlebitis, dislodgement, localised or catheter-associated bloodstream infections. Secondary outcomes: first time insertion success, types of PIVC failure, device colonisation, insertion pain, functional dwell time, adverse events, mortality, cost-utility and consumer acceptability. One PIVC per patient will be included, with intention-to-treat analysis. Baseline group comparisons will be made for potentially clinically important confounders. The proportional hazards assumption will be checked, and Cox regression will test the effect of group, patient, device and clinical variables on failure. An as-treated analysis will assess the effect of protocol violations. Kaplan-Meier survival curves with log-rank tests will compare failure by group over time. Secondary endpoints will be compared between groups using parametric/non-parametric techniques. Ethical approval from the Royal Brisbane and Women's Hospital Human Research Ethics Committee (HREC/16/QRBW/527), Griffith University Human Research Ethics Committee (Ref No. 2017/002) and the South Metropolitan Health Services Human Research Ethics Committee (Ref No. 2016-239). Results will be published in peer-reviewed journals. ACTRN12617000089336. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
IRIS Toxicological Review of Ammonia (External Review Draft ...
EPA is conducting a peer review of the scientific basis supporting the human health hazard and dose-response assessment of ammonia that will appear in the Integrated Risk Information System (IRIS) database. EPA is undertaking an Integrated Risk Information System (IRIS) health assessment for ammonia. IRIS is an EPA database containing Agency scientific positions on potential adverse human health effects that may result from chronic (or lifetime) exposure to chemicals in the environment. IRIS contains chemical-specific summaries of qualitative and quantitative health information in support of two steps of the risk assessment paradigm, i.e., hazard identification and dose-response evaluation. IRIS assessments are used in combination with specific situational exposure assessment information to evaluate potential public health risk associated with environmental contaminants.
T/R Multi-Chip MMIC Modules for 150 GHz
NASA Technical Reports Server (NTRS)
Samoska, Lorene A.; Pukala, David M.; Soria, Mary M.; Sadowy, Gregory A.
2009-01-01
Modules containing multiple monolithic microwave integrated-circuit (MMIC) chips have been built as prototypes of transmitting/receiving (T/R) modules for millimeter-wavelength radar systems, including phased-array radar systems to be used for diverse purposes that could include guidance and avoidance of hazards for landing spacecraft, imaging systems for detecting hidden weapons, and hazard-avoidance systems for automobiles. Whereas prior landing radar systems have operated at frequencies around 35 GHz, the integrated circuits in this module operate in a frequency band centered at about 150 GHz. The higher frequency (and, hence, shorter wavelength), is expected to make it possible to obtain finer spatial resolution while also using smaller antennas and thereby reducing the sizes and masses of the affected systems.
IRIS Toxicological Review of n-Butanol (External Review Draft ...
EPA is conducting a peer review of the scientific basis supporting the human health hazard and dose-response assessment of n-butanol that will appear in the Integrated Risk Information System (IRIS) database. EPA is undertaking an Integrated Risk Information System (IRIS) health assessment for n-butanol. IRIS is an EPA database containing Agency scientific positions on potential adverse human health effects that may result from chronic (or lifetime) exposure to chemicals in the environment. IRIS contains chemical-specific summaries of qualitative and quantitative health information in support of two steps of the risk assessment paradigm, i.e., hazard identification and dose-response evaluation. IRIS assessments are used in combination with specific situational exposure assessment information to evaluate potential public health risk associated with environmental contaminants.
Integrated assessment of urban vulnerability and resilience. Case study: Targu Ocna town, Romania
NASA Astrophysics Data System (ADS)
Grozavu, Adrian; Bănică, Alexandru
2015-04-01
Vulnerability assessment frequently emphasizes the internal fragility of a system in relation to a given hazard, when compared to similar systems or to a reference standard. This internal fragility, either biophysical or structural, may affect the ability to predict, to prepare for, to cope with or to recover from the manifestation of a risk phenomenon. Thus, vulnerability is closely related to resilience and adaptability. There is no single methodology for vulnerability and resilience analysis; their assessment can only be made by identifying and integrating indicators that are compatible with the analysis level and the geographic, economic and social features of a certain area. An integrated model for evaluating vulnerability and resilience capacity is proposed in this paper for Targu Ocna, a small mining settlement in the Eastern Carpathians of Romania that has in recent years become a tourist town and acts as a dynamic local pole within the surrounding territory. Methodologically, the following steps and operations were considered: identifying potential hazards, identifying elements at risk, identifying proper indicators and integrating them in order to evaluate the general vulnerability and resilience. The inventory of elements at risk (the number of people potentially affected, residential and other buildings, roads and other infrastructure elements, etc.) was based on the General Urban Plan, topographic maps (scale 1:5000), orthophotos from 2003 and 2008, and field mapping and surveys. Further on, several vulnerability indicators were identified and included within the analytical approach: dependency ratio, income, quality of the habitat and technical urban facilities, and environmental quality, showing differentiated sensitivity. Issues such as preparedness and preventive measures (priority areas within the risk prevention plans), coping ability (networks' geometry and connectivity, accessibility of emergency utilities and services) and recovery capacity (the time needed to re-establish functions after a disastrous event) were also taken into account. The selected indicators were mathematically processed (standardized and normalized) in order to maximize their relevance and to express the results uniformly on a 0-1 scale. Then a grid with a cell size of 100 x 100 m was created in order to spatialize the vulnerability indicators, which were calculated as the average vulnerability of the exposed elements in each cell. All identified indicators have been processed within a cluster analysis that permitted the identification of similar areas in terms of vulnerabilities. Finally, a general index was obtained by the integration of all vulnerability factors in an equation based on the geometric mean. The results of the study could provide a reference basis to substantiate correctly prioritized local decisions for reducing vulnerability through mitigation and adaptation measures, in order to avoid significant damage when risks materialise.
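A compact sketch of the normalization and aggregation step described above: indicators are min-max normalized to the 0-1 range and combined per grid cell with a geometric mean. Indicator names and values are invented, and the inversion of the income column is only one possible convention.

    # Min-max normalization and geometric-mean aggregation of vulnerability indicators.
    import numpy as np

    # rows = grid cells, columns = indicators
    # (dependency ratio, income, habitat-quality deficit, access time to services)
    raw = np.array([
        [0.45, 1200.0, 0.30, 12.0],
        [0.60,  800.0, 0.55, 25.0],
        [0.35, 1900.0, 0.20,  8.0],
    ])
    # higher income means lower vulnerability, so invert that column first
    raw[:, 1] = raw[:, 1].max() - raw[:, 1]

    # min-max normalization to the 0-1 spread
    normalized = (raw - raw.min(axis=0)) / (raw.max(axis=0) - raw.min(axis=0))

    # geometric mean per cell; a small epsilon avoids zeros collapsing the index
    eps = 1e-3
    vulnerability_index = np.exp(np.log(normalized + eps).mean(axis=1))
    print(vulnerability_index.round(3))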
Environment, Safety, and Health Self-Assessment Report, Fiscal Year 2008
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chernowski, John
2009-02-27
Lawrence Berkeley National Laboratory's Environment, Safety, and Health (ES&H) Self-Assessment Program ensures that Integrated Safety Management (ISM) is implemented institutionally and by all divisions. The Self-Assessment Program, managed by the Office of Contract Assurance (OCA), provides for an internal evaluation of all ES&H programs and systems at LBNL. The functions of the program are to ensure that work is conducted safely, and with minimal negative impact to workers, the public, and the environment. The Self-Assessment Program is also the mechanism used to institute continuous improvements to the Laboratory's ES&H programs. The program is described in LBNL/PUB 5344, Environment, Safety, and Health Self-Assessment Program and is composed of four distinct assessments: the Division Self-Assessment, the Management of Environment, Safety, and Health (MESH) review, ES&H Technical Assurance, and the Appendix B Self-Assessment. The Division Self-Assessment uses the five core functions and seven guiding principles of ISM as the basis of evaluation. Metrics are created to measure performance in fulfilling ISM core functions and guiding principles, as well as promoting compliance with applicable regulations. The five core functions of ISM are as follows: (1) Define the Scope of Work; (2) Identify and Analyze Hazards; (3) Control the Hazards; (4) Perform the Work; and (5) Feedback and Improvement. The seven guiding principles of ISM are as follows: (1) Line Management Responsibility for ES&H; (2) Clear Roles and Responsibilities; (3) Competence Commensurate with Responsibilities; (4) Balanced Priorities; (5) Identification of ES&H Standards and Requirements; (6) Hazard Controls Tailored to the Work Performed; and (7) Operations Authorization. Performance indicators are developed by consensus with OCA, representatives from each division, and Environment, Health, and Safety (EH&S) Division program managers. Line management of each division performs the Division Self-Assessment annually. The primary focus of the review is workplace safety. The MESH review is an evaluation of division management of ES&H in its research and operations, focusing on implementation and effectiveness of the division's ISM plan. It is a peer review performed by members of the LBNL Safety Review Committee (SRC), with staff support from OCA. Each division receives a MESH review every two to four years, depending on the results of the previous review. The ES&H Technical Assurance Program (TAP) provides the framework for systematic reviews of ES&H programs and processes. The intent of ES&H Technical Assurance assessments is to provide assurance that ES&H programs and processes comply with their guiding regulations, are effective, and are properly implemented by LBNL divisions. The Appendix B Performance Evaluation and Measurement Plan (PEMP) requires that LBNL sustain and enhance the effectiveness of integrated safety, health, and environmental protection through a strong and well-deployed system. Information required for Appendix B is provided by EH&S Division functional managers. The annual Appendix B report is submitted at the close of the fiscal year. This assessment is the Department of Energy's (DOE) primary mechanism for evaluating LBNL's contract performance in ISM.
Survivorship analysis when cure is a possibility: a Monte Carlo study.
Goldman, A I
1984-01-01
Parametric survivorship analyses of clinical trials commonly involve the assumption of a hazard function that is constant with time. When the empirical curve obviously levels off, one can modify the hazard function model by using a Gompertz or Weibull distribution with hazard decreasing over time. Some cancer treatments are thought to cure some patients within a short time of initiation. Then, instead of all patients having the same hazard decreasing over time, a biologically more appropriate model assumes that an unknown proportion (1 - pi) has a constant high risk whereas the remaining proportion (pi) has essentially no risk. This paper discusses the maximum likelihood estimation of pi and the power curves of the likelihood ratio test. Monte Carlo studies provide results for a variety of simulated trials; empirical data illustrate the methods.
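A toy version of the mixture ("cure") model discussed above, in which a proportion pi of patients carries essentially no risk and the remainder has a constant hazard: data are simulated and pi and the hazard rate are recovered by maximum likelihood. This is only a sketch of the model structure, not the paper's simulation design.

    # Simulate and fit a two-component cure model: S(t) = pi + (1 - pi) * exp(-lambda * t).
    import numpy as np
    from scipy.optimize import minimize

    rng = np.random.default_rng(42)
    n, true_pi, true_lam, followup = 500, 0.35, 0.4, 6.0
    cured = rng.random(n) < true_pi
    latent = np.where(cured, np.inf, rng.exponential(1 / true_lam, n))
    time = np.minimum(latent, followup)          # administrative censoring
    event = (latent <= followup).astype(float)   # 1 = event observed, 0 = censored

    def neg_log_lik(params):
        logit_pi, log_lam = params
        pi, lam = 1 / (1 + np.exp(-logit_pi)), np.exp(log_lam)
        dens = (1 - pi) * lam * np.exp(-lam * time)    # contribution of observed events
        surv = pi + (1 - pi) * np.exp(-lam * time)     # contribution of censored observations
        return -np.sum(event * np.log(dens) + (1 - event) * np.log(surv))

    fit = minimize(neg_log_lik, x0=[0.0, 0.0], method="Nelder-Mead")
    pi_hat, lam_hat = 1 / (1 + np.exp(-fit.x[0])), np.exp(fit.x[1])
    print(f"pi_hat = {pi_hat:.3f}, lambda_hat = {lam_hat:.3f}")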
MEditerranean Supersite Volcanoes (MED-SUV) project: from objectives to results
NASA Astrophysics Data System (ADS)
Puglisi, Giuseppe; Spampinato, Letizia
2017-04-01
The MEditerranean Supersite Volcanoes (MED-SUV) was a 3-year FP7 project aimed at improving the assessment of volcanic hazards at two of the most active European volcanic areas - Campi Flegrei/Vesuvius and Mt. Etna. More than 3 million people are exposed to potential hazards in the two areas, and the geographic location of the volcanoes increases the number of people exposed, extending the impact to a wider region. MED-SUV worked on (1) the optimisation and integration of existing and new monitoring systems, (2) the understanding of volcanic processes, and (3) the relationship between the scientific and end-user communities. MED-SUV fully exploited the unique multidisciplinary long-term in-situ datasets available for these volcanoes and integrated them with Earth observations. Technological developments and implemented algorithms allowed better constraint of pre-, syn- and post-eruptive phases. The wide range of styles and intensities of the volcanic phenomena observed at the targeted volcanoes - archetypes of 'closed' and 'open' conduit systems - studied using the long-term multidisciplinary datasets, exceptionally upgraded the understanding of a variety of geo-hazards. Proper experiments and studies were carried out to advance the understanding of the volcanoes' internal structure and processes, and to recognise signals related to impending unrest/eruptive phases. Indeed, the quantitative hazard assessment benefitted from the outcomes of these studies and from their integration with cutting-edge monitoring approaches, thus leading to step-changes in hazard awareness and preparedness, and leveraging the close relationship between scientists, SMEs, and end-users. Among the MED-SUV achievements we can list: (i) the implementation of a data policy compliant with the GEO Open Data Principles for ruling the exploitation and shared use of the project outcomes; (ii) the creation of the MED-SUV e-infrastructure as a test bed for designing an interoperable infrastructure to manage different data sources, applying the data policy, and envisaging sustainability strategies after the project in a coherent national and international framework; (iii) the improvement of the SAR capability in detecting and monitoring ground deformation; (iv) the development/implementation and testing of prototypes and software for measuring and retrieving more accurate/novel parameters; (v) the integration of satellite and in-situ data; and (vi) novel methods of data analysis increasing the knowledge of volcanic process dynamics and improving alert systems. The project has fostered the assessment of short-term volcanic hazard in the Italian Supersites, and the exploitation of the information provided by the monitoring. The main breakthroughs in hazard assessment focused on fine-tuning the Bayesian approach for the probabilistic evaluation of the occurrence of eruptive events at Campi Flegrei and its effects in the area, and on the preliminary application to assess the occurrence of flank eruptions and the effects of volcanic plume fallout at Mt. Etna. Indeed, MED-SUV also worked on the communication between scientists and decision makers by evaluating the suitability of scientific outcomes (e.g. hazard maps) to be informative for this goal. Dissemination of the outcomes aimed at spreading new volcanological knowledge among the scientific community, as well as among decision-making bodies and the public, and at allowing the end-user community access to the two Italian Supersites' data through a properly implemented e-infrastructure.
Is Directivity Still Effective in a PSHA Framework?
NASA Astrophysics Data System (ADS)
Spagnuolo, E.; Herrero, A.; Cultrera, G.
2008-12-01
Source rupture parameters, like directivity, modulate the energy release, causing variations in the radiated signal amplitude. They thus affect the empirical predictive equations and, as a consequence, the seismic hazard assessment. Classical probabilistic hazard evaluations, e.g. Cornell (1968), use very simple predictive equations based only on magnitude and distance, which do not account for variables concerning the rupture process. However, nowadays a few predictive equations (e.g. Somerville 1997, Spudich and Chiou 2008) take rupture directivity into account, and a few implementations have been made in a PSHA framework (e.g. Convertito et al. 2006, Rowshandel 2006). In practice, these new empirical predictive models incorporate the rupture propagation effects quantitatively through the introduction of variables like rake, azimuth, rupture velocity and laterality. The contribution of all these variables is summarized in corrective factors derived from measuring the differences between real data and predicted data. It is therefore possible to keep the older computation, making use of a simple predictive model, and in addition to incorporate the directivity effect through the corrective factors. Each supplementary variable implies a new integral over the parametric space; the difficulty, however, lies in constraining the parameter distribution functions. We present preliminary results for ad hoc distributions (Gaussian and uniform) in order to test the impact of incorporating directivity into PSHA models. We demonstrate that incorporating directivity in PSHA by means of the new predictive equations may lead to strong percentage variations in the hazard assessment.
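A schematic sketch of how a directivity corrective factor can be folded into a hazard computation: the exceedance probability from a simple predictive equation is averaged over an assumed azimuth distribution. The toy ground-motion relation, its sigma, and the corrective-factor shape are invented for illustration and are not those of the cited models.

    # Toy scenario hazard with and without a directivity corrective factor on ln(GM).
    import numpy as np
    from scipy.stats import norm

    def log_median_gm(mag, dist_km):
        """Invented GMPE: ln(PGA in g) as a function of magnitude and distance."""
        return -3.5 + 0.9 * mag - 1.1 * np.log(dist_km + 10.0)

    def directivity_factor(azimuth_rad):
        """Hypothetical corrective factor on ln(GM): forward-directivity amplification."""
        return 0.3 * np.cos(azimuth_rad)

    sigma_ln = 0.6
    target_pga = 0.2          # g
    mag, dist = 6.5, 20.0     # scenario event

    azimuths = np.linspace(0, 2 * np.pi, 360, endpoint=False)  # uniform azimuth distribution
    ln_med = log_median_gm(mag, dist) + directivity_factor(azimuths)
    p_exceed = norm.sf(np.log(target_pga), loc=ln_med, scale=sigma_ln)

    p_plain = norm.sf(np.log(target_pga), loc=log_median_gm(mag, dist), scale=sigma_ln)
    print(f"P(PGA > {target_pga} g) without directivity: {p_plain:.3f}")
    print(f"P(PGA > {target_pga} g) averaged over azimuth: {p_exceed.mean():.3f}")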
NASA Technical Reports Server (NTRS)
Klein, Karl E. (Editor); Contant, Jean-Michel (Editor)
1992-01-01
The present symposium on living and working in space encompasses the physiological responses of humans in space and biomedical support for the conditions associated with space travel. Specific physiological issues addressed include cerebral and sensorimotor functions, effects on the cardiovascular and respiratory system, musculoskeletal system, body fluid, hormones and electrolytes, and some orthostatic hypotension mechanisms as countermeasures. The biomedical support techniques examined include selection training, and care, teleoperation and artificial intelligence, robotic automation, bioregenerative life support, and toxic hazard risks in space habitats. Also addressed are determinants of orientation in microgravity, the hormonal control of body fluid metabolism, integrated human-machine intelligence in space machines, and material flow estimation in CELSS.
The Detroit River, Michigan: an ecological profile
Manny, Bruce A.; Edsall, Thomas A.; Jaworski, Eugene
1988-01-01
A part of the connecting channel system between Lake Huron and Lake Erie, the Detroit River forms an integral link between the two lakes for both humans and biological resources such as fish, nutrients, and plant detritus. This profile summarizes existing scientific information on the ecological structure and functioning of this ecosystem. Topics include the geological history of the region, climatic influences, river hydrology, lower trophic-level biotic components, native and introduced fishes, waterfowl use, ecological interrelationships, commercial and recreational uses of the river, and current management issues. Despite urbanization, the river still supports diverse fish, waterfowl, and benthic populations. Management issues include sewer overflows; maintenance dredging for navigation and port activities; industrial discharges of potentially hazardous materials; and wetland, fishery, and waterfowl protection and enhancement.
NASA Technical Reports Server (NTRS)
Grigor'ev, A. I. (Editor); Klein, K. E. (Editor); Nicogossian, A. (Editor)
1991-01-01
The present conference on findings from space life science investigations relevant to long-term earth orbit and planetary exploration missions, as well as considerations for future research projects on these issues, discusses the cardiovascular system and countermeasures against its deterioration in the microgravity environment, cerebral and sensorimotor functions, findings to date in endocrinology and immunology, the musculoskeletal system, and health maintenance and medical care. Also discussed are radiation hazards and protective systems, life-support and habitability factors, and such methodologies and equipment for long space mission research as the use of animal models, novel noninvasive techniques for space crew health monitoring, and an integrated international aerospace medical information system.
Prokopenko, L V; Afanas'eva, R F; Bessonova, N A; Burmistrova, O V; Losik, T K; Konstantinov, E I
2013-01-01
Studies of the thermal state of humans performing physical work in a heating environment while wearing various types of protective clothing demonstrated the role of the protective clothing in modifying the thermal load on the body, and the possibility of decreasing this load through correction of air temperature and humidity and shorter stays at the workplace. The authors present hygienic requirements for the air temperature range in accordance with the allowable degree of body heating, and suggest a mathematical model to forecast an integral parameter of human functional state depending on the type of protective clothing applied. The article also covers the necessity of an upper air temperature limit during the hot season when protective clothing made of materials with low air permeability and hydraulic conductivity is applied.
Susceptibility to mountain hazards in Austria - paradigms of vulnerability revisited
NASA Astrophysics Data System (ADS)
Fuchs, Sven
2010-05-01
The concept of vulnerability rests on multiple disciplinary theories underpinning either a technical or a social origin of the concept and resulting in a range of paradigms for either a qualitative or quantitative assessment of vulnerability. However, efforts to reduce susceptibility to hazards and to create disaster-resilient communities require intersections among these theories, since human activity cannot be seen independently from the environmental setting. Acknowledging the different roots of disciplinary paradigms, issues determining structural, economic, institutional and social vulnerability are discussed with respect to mountain hazards in Austria. The underlying idea of taking such an integrative viewpoint was the recognition that human action in mountain environments affects the state of vulnerability, and the state of vulnerability in turn shapes the possibilities of human action. It is argued that structural vulnerability as originator results in considerable economic vulnerability, generated by the institutional settings of dealing with natural hazards and shaped by the overall societal framework. Hence, the vulnerability of a specific location at a given point in time is triggered by the hazardous event and the related physical susceptibility of structures, such as buildings located on a torrent fan. Depending on the specific institutional settings, economic vulnerability of individuals or of society results, above all with respect to imperfect loss compensation mechanisms in the areas under investigation. While this potential for harm can be addressed as social vulnerability, the concept of institutional vulnerability has been developed with respect to the overall political settings of governmental risk management. As a result, the concept of vulnerability, as used in the natural sciences, can be extended by integrating possible reasons why such physical susceptibility of structures exists, and by integrating compensation mechanisms and coping strategies developed within the social sciences. If vulnerability and its counterpart, resilience, are analysed and evaluated using a comprehensive approach, a better understanding of the vulnerability-influencing parameters could be achieved, taking into account the interdependencies and interactions between the disciplinary foci. The overall aim of this work is thus not to develop another integrative approach for vulnerability assessment; rather, different approaches are applied using a vulnerability-of-place criterion, and key issues of vulnerability are reconsidered, aiming at a general illustration of the situation in a densely populated mountain region of Europe.
NASA Astrophysics Data System (ADS)
Kapitan, Loginn
This research created a new model which provides an integrated approach to planning the effective selection and employment of airborne sensor systems in response to accidental or intentional chemical vapor releases. The approach taken was to use systems engineering and decision analysis methods to construct a model architecture which produced a modular structure for integrating both new and existing components into a logical procedure to assess the application of airborne sensor systems to address chemical vapor hazards. The resulting integrated process model includes an internal aggregation model which allowed differentiation among alternative airborne sensor systems. Both models were developed and validated by experts and demonstrated using appropriate hazardous chemical release scenarios. The resultant prototype integrated process model or system fills a current gap in capability allowing improved planning, training and exercise for HAZMAT teams and first responders when considering the selection and employment of airborne sensor systems. Through the research process, insights into the current response structure and how current airborne capability may be most effectively used were generated. Furthermore, the resultant prototype system is tailorable for local, state, and federal application, and can potentially be modified to help evaluate investments in new airborne sensor technology and systems. Better planning, training and preparedness exercising holds the prospect for the effective application of airborne assets for improved response to large scale chemical release incidents. Improved response will result in fewer casualties and lives lost, reduced economic impact, and increased protection of critical infrastructure when faced with accidental and intentional terrorist release of hazardous industrial chemicals. With the prospect of more airborne sensor systems becoming available, this prototype system integrates existing and new tools into an effective process for the selection and employment of airborne sensors to better plan, train and exercise ahead of potential chemical release events.
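The abstract above describes an internal aggregation model that differentiates among alternative airborne sensor systems. A common way to realize such an aggregation in decision analysis is an additive weighted-sum multi-attribute value model; the sketch below illustrates that idea only. The alternative names, attributes, scores, and weights are hypothetical placeholders, not values from the dissertation.

```python
# Minimal sketch of a weighted-sum aggregation model for ranking airborne
# sensor alternatives. Attribute names, scores, and weights are hypothetical
# placeholders, not values from the dissertation.

ALTERNATIVES = {
    # scores on a 0-1 value scale for each (hypothetical) attribute
    "standoff_spectrometer_uav": {"detection_range": 0.9, "chemical_coverage": 0.6,
                                  "deployment_time": 0.7, "cost": 0.4},
    "point_sensor_helicopter":   {"detection_range": 0.3, "chemical_coverage": 0.8,
                                  "deployment_time": 0.5, "cost": 0.6},
    "mapping_lidar_fixed_wing":  {"detection_range": 0.7, "chemical_coverage": 0.5,
                                  "deployment_time": 0.8, "cost": 0.5},
}

WEIGHTS = {"detection_range": 0.35, "chemical_coverage": 0.30,
           "deployment_time": 0.20, "cost": 0.15}   # weights sum to 1


def aggregate(scores: dict, weights: dict) -> float:
    """Additive multi-attribute value: sum of weight * score."""
    return sum(weights[a] * scores[a] for a in weights)


if __name__ == "__main__":
    ranked = sorted(ALTERNATIVES.items(),
                    key=lambda kv: aggregate(kv[1], WEIGHTS), reverse=True)
    for name, scores in ranked:
        print(f"{name:28s} value = {aggregate(scores, WEIGHTS):.3f}")
```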
Geomorphological hazards and environmental impact: Assessment and mapping
NASA Astrophysics Data System (ADS)
Panizza, Mario
In five sections the author develops the methods for the integration of geomorphological concepts into Environmental Impact and Mapping. The first section introduces the concepts of Impact and Risk through the relationships between Geomorphological Environment and Anthropical Element. The second section proposes a methodology for the determination of Geomorphological Hazard and the identification of Geomorphological Risk. The third section synthesizes the procedure for the compilation of a Geomorphological Hazards Map. The fourth section outlines the concepts of Geomorphological Resource Assessment for the analysis of the Environmental Impact. The fifth section considers the contribution of geomorphological studies and mapping in the procedure for Environmental Impact Assessment.
Camera Image Transformation and Registration for Safe Spacecraft Landing and Hazard Avoidance
NASA Technical Reports Server (NTRS)
Jones, Brandon M.
2005-01-01
Inherent geographical hazards of Martian terrain may impede a safe landing for science exploration spacecraft. Surface visualization software for hazard detection and avoidance may accordingly be applied in vehicles such as the Mars Exploration Rover (MER) to induce an autonomous and intelligent descent upon entering the planetary atmosphere. The focus of this project is to develop an image transformation algorithm for coordinate system matching between consecutive frames of terrain imagery taken throughout descent. The methodology involves integrating computer vision and graphics techniques, including affine transformation and projective geometry of an object, with the intrinsic parameters governing spacecraft dynamic motion and camera calibration.
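The abstract above describes coordinate-system matching between consecutive descent frames via an affine transformation. The following is a minimal sketch of that general idea using feature matching and robust affine estimation in OpenCV; it is not the project's algorithm, and the image file names are placeholders.

```python
# Minimal sketch of frame-to-frame registration via an affine transform,
# in the spirit of the coordinate-system matching described above.
# File names are placeholders; this is not the project's algorithm.
import cv2
import numpy as np

prev = cv2.imread("descent_frame_000.png", cv2.IMREAD_GRAYSCALE)
curr = cv2.imread("descent_frame_001.png", cv2.IMREAD_GRAYSCALE)

orb = cv2.ORB_create(nfeatures=1000)
kp1, des1 = orb.detectAndCompute(prev, None)
kp2, des2 = orb.detectAndCompute(curr, None)

# Match descriptors and keep the strongest correspondences
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)[:200]

src = np.float32([kp1[m.queryIdx].pt for m in matches])
dst = np.float32([kp2[m.trainIdx].pt for m in matches])

# Robustly estimate a 2x3 affine transform (rotation, scale, translation)
A, inliers = cv2.estimateAffinePartial2D(src, dst, method=cv2.RANSAC,
                                         ransacReprojThreshold=3.0)
print("Estimated affine transform:\n", A)

# Warp the previous frame into the current frame's coordinate system
registered = cv2.warpAffine(prev, A, (curr.shape[1], curr.shape[0]))
```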
Using the USGS Seismic Risk Web Application to estimate aftershock damage
McGowan, Sean M.; Luco, Nicolas
2014-01-01
The U.S. Geological Survey (USGS) Engineering Risk Assessment Project has developed the Seismic Risk Web Application to combine earthquake hazard and structural fragility information in order to calculate the risk of earthquake damage to structures. Enabling users to incorporate their own hazard and fragility information into the calculations will make it possible to quantify (in near real-time) the risk of additional damage to structures caused by aftershocks following significant earthquakes. Results can quickly be shared with stakeholders to illustrate the impact of elevated ground motion hazard and earthquake-compromised structural integrity on the risk of damage during a short-term, post-earthquake time horizon.
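The application described above combines a ground-motion hazard curve with a structural fragility curve to obtain a damage risk estimate. A minimal sketch of that standard risk integral, P(damage) ≈ Σ P(damage | IM) · |Δλ(IM)|, is shown below; the hazard-curve shape and fragility parameters are illustrative placeholders, not values used by the USGS application.

```python
# Minimal sketch of combining a seismic hazard curve with a lognormal
# fragility curve to get an annual rate of damage exceedance.
# The hazard-curve shape and fragility parameters are illustrative only.
import numpy as np
from scipy.stats import norm

# Intensity measure grid (e.g., peak ground acceleration in g)
im = np.linspace(0.01, 2.0, 400)

# Illustrative hazard curve: annual rate of exceeding each IM level
k0, k1 = 1e-4, 2.5                      # placeholder coefficients
hazard = k0 * im ** (-k1)

# Lognormal fragility: probability of damage given IM (placeholder median/beta)
median, beta = 0.45, 0.5
fragility = norm.cdf(np.log(im / median) / beta)

# Risk integral: sum fragility times the absolute decrement of the hazard curve
d_lambda = -np.diff(hazard)
annual_damage_rate = np.sum(0.5 * (fragility[:-1] + fragility[1:]) * d_lambda)
print(f"Annual rate of damage: {annual_damage_rate:.2e}")
```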
NASA Astrophysics Data System (ADS)
Yugsi Molina, F. X.; Oppikofer, T.; Fischer, L.; Hermanns, R. L.; Taurisano, A.
2012-04-01
Traditional techniques to assess rockfall hazard are partially based on probabilistic analysis. Stochastic methods have been used for run-out analysis of rock blocks to estimate the trajectories that a detached block will follow during its fall until it stops due to kinetic energy loss. However, the selection of rockfall source areas is usually defined either by multivariate analysis or by field observations; in either case, a physically based approach is not used for source area detection. We present an example of rockfall hazard assessment that integrates a probabilistic rockfall run-out analysis with a stochastic assessment of the rockfall source areas using kinematic stability analysis in a GIS environment. The method has been tested for a steep, more than 200 m high rock wall located in the municipality of Norddal (Møre og Romsdal county, Norway), where a large number of people are exposed to snow avalanches, rockfalls, or debris flows. The area was selected following the recently published hazard mapping plan of Norway. The cliff is formed by medium- to coarse-grained quartz-dioritic to granitic gneisses of Proterozoic age. Scree deposits, the product of recent rockfall activity, are found at the bottom of the rock wall. Large blocks can be found several tens of meters away from the cliff in Sylte, the main locality in the Norddal municipality. Structural characterization of the rock wall was done using terrestrial laser scanning (TLS) point clouds in the software Coltop3D (www.terranum.ch), and the results were validated with field data. Orientation data sets from the structural characterization were analyzed separately to assess best-fit probability density functions (PDF) for both the dip angle and the dip direction of each discontinuity set. A GIS-based stochastic kinematic analysis was then carried out using the discontinuity set orientations and the friction angle as random variables. An airborne laser scanning digital elevation model (ALS-DEM) with 1 m resolution was used for the analysis. Three failure mechanisms were analyzed: planar sliding, wedge sliding, and toppling. Based on this kinematic analysis, areas where failure is feasible were used as source areas for run-out analysis using Rockyfor3D v. 4.1 (www.ecorisq.org). The software calculates trajectories of single falling blocks in three dimensions using physically based algorithms developed under a stochastic approach. The ALS-DEM was down-scaled to 5 m resolution to optimize processing time. Results were compared with run-out simulations using Rockyfor3D with the whole rock wall as source area, and with maps of deposits generated from field observations and aerial photo interpretation. The results of our implementation show a better correlation with field observations and help to produce more accurate rockfall hazard assessment maps through a better definition of the source areas. The approach also reduces the processing time for the analysis. The findings presented in this contribution are part of an effort to produce guidelines for natural hazard mapping in Norway. The guidelines will be used in upcoming years for hazard mapping in areas where larger groups of the population are exposed to mass movements from steep slopes.
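The stochastic kinematic analysis described above treats discontinuity orientations and friction angle as random variables. The sketch below illustrates the idea for the planar-sliding mechanism only, using a Markland-type feasibility test with Monte Carlo sampling; the slope geometry, distribution parameters, and the ±20° lateral limit are hypothetical assumptions, not values from the study or the GIS implementation.

```python
# Minimal Monte Carlo sketch of a kinematic test for planar sliding,
# in the spirit of the GIS-based stochastic analysis described above.
# Distribution parameters and the slope geometry are hypothetical.
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# Slope cell geometry (taken from a DEM in the real workflow)
slope_dip, slope_dip_dir = 65.0, 140.0          # degrees

# Discontinuity set: sample dip and dip direction from assumed normal PDFs
dip     = rng.normal(55.0, 6.0, n)              # mean, std in degrees
dip_dir = rng.normal(150.0, 12.0, n)
phi     = rng.uniform(28.0, 35.0, n)            # friction angle range

# Markland-type criteria for planar sliding:
#  (1) discontinuity daylights (dips less steeply than the slope face),
#  (2) dip exceeds the friction angle,
#  (3) dip direction within +/- 20 degrees of the slope dip direction.
dir_diff = (dip_dir - slope_dip_dir + 180.0) % 360.0 - 180.0
feasible = (dip < slope_dip) & (dip > phi) & (np.abs(dir_diff) <= 20.0)

print(f"Probability of kinematic feasibility: {feasible.mean():.2%}")
```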
A New Lifetime Distribution with Bathtube and Unimodal Hazard Function
NASA Astrophysics Data System (ADS)
Barriga, Gladys D. C.; Louzada-Neto, Francisco; Cancho, Vicente G.
2008-11-01
In this paper we propose a new lifetime distribution which accommodates bathtub-shaped, unimodal, increasing and decreasing hazard functions. Some particular special cases are derived, including the standard Weibull distribution. Maximum likelihood estimation is considered to estimate the three parameters present in the model. The methodology is illustrated on a real data set from a life test on industrial devices.
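The proposed three-parameter distribution is not specified in this abstract, so the sketch below only illustrates the general machinery the paper relies on: the hazard function h(t) = f(t)/S(t) and maximum-likelihood fitting, shown here for the Weibull special case mentioned above with synthetic data.

```python
# Minimal sketch: hazard function h(t) = f(t) / S(t) and maximum-likelihood
# fitting of the Weibull special case mentioned above (synthetic data,
# not the paper's proposed three-parameter model).
import numpy as np
from scipy.stats import weibull_min
from scipy.optimize import minimize

rng = np.random.default_rng(0)
data = weibull_min.rvs(c=1.8, scale=100.0, size=200, random_state=rng)

def neg_log_lik(params):
    shape, scale = params
    if shape <= 0 or scale <= 0:
        return np.inf
    return -np.sum(weibull_min.logpdf(data, c=shape, scale=scale))

fit = minimize(neg_log_lik, x0=[1.0, np.mean(data)], method="Nelder-Mead")
shape_hat, scale_hat = fit.x
print(f"MLE: shape = {shape_hat:.3f}, scale = {scale_hat:.1f}")

# Hazard function of the fitted model: h(t) = f(t) / S(t)
t = np.linspace(1.0, 300.0, 50)
hazard = weibull_min.pdf(t, c=shape_hat, scale=scale_hat) / \
         weibull_min.sf(t, c=shape_hat, scale=scale_hat)
```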
Relocating San Miguel Volcanic Seismic Events for Receiver Functions and Tomographic Models
NASA Astrophysics Data System (ADS)
Patlan, E.; Velasco, A. A.; Konter, J.
2009-12-01
San Miguel volcano lies near the city of San Miguel, El Salvador (13.43N, -88.26W). An active stratovolcano, it presents a significant natural hazard to the city of San Miguel. The internal state and activity of volcanoes remain important components in understanding volcanic hazard. The primary means of addressing volcanic hazards and processes is the analysis of data collected from deployed seismic sensors that record ground motion. Six UTEP seismic stations were deployed around San Miguel volcano from 2007-2008 to define the magma chamber and assess the seismic and volcanic hazard. We utilize these data to develop images of the earth structure beneath the volcano, study the volcanic processes by identifying different sources, and investigate the role of earthquakes and faults in controlling the volcanic processes. We will calculate receiver functions to determine the internal structure and crustal thickness beneath San Miguel volcano within the Caribbean plate. Crustal thicknesses will be modeled using receiver functions calculated from both theoretical and hand-picked P-wave arrivals. We will use this information derived from receiver functions, along with P-wave delay times, to map the location of the magma chamber.
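For context on the receiver-function step described above, the sketch below shows the standard conversion of a Ps-minus-P delay time into a crustal thickness estimate (as in H-kappa style analysis). The velocities, ray parameter, and delay value are illustrative placeholders, not measurements from this deployment.

```python
# Minimal sketch of converting a receiver-function Ps - P delay time into a
# crustal (Moho) thickness estimate. Velocities, ray parameter, and delay
# are illustrative placeholders.
import numpy as np

def thickness_from_ps_delay(t_ps, vp, vs, p):
    """H = t_Ps / (sqrt(1/Vs^2 - p^2) - sqrt(1/Vp^2 - p^2))."""
    return t_ps / (np.sqrt(1.0 / vs**2 - p**2) - np.sqrt(1.0 / vp**2 - p**2))

t_ps = 4.2          # s, Ps-minus-P delay picked from the receiver function
vp, vs = 6.3, 3.6   # km/s, assumed average crustal velocities
p = 0.06            # s/km, ray parameter of the incoming teleseismic P wave

print(f"Estimated crustal thickness: {thickness_from_ps_delay(t_ps, vp, vs, p):.1f} km")
```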
A knowledge integration approach to flood vulnerability
NASA Astrophysics Data System (ADS)
Mazzorana, Bruno; Fuchs, Sven
2014-05-01
Understanding, qualifying and quantifying vulnerability is essential for implementing effective and efficient flood risk mitigation strategies, particularly if synergies between different mitigation alternatives, such as active and passive measures, are to be achieved. In order to combine different risk management options it is necessary to take an interdisciplinary approach to vulnerability reduction, and as a result the affected society may be willing to accept a certain degree of self-responsibility. However, due to differing mono-disciplinary approaches and regional foci undertaken until now, different aspects of vulnerability to natural hazards in general and to floods in particular remain uncovered, and as a result the developed management options remain sub-optimal. Taking an even more fundamental viewpoint, the empirical vulnerability functions used in risk assessment fail to capture the physical principles of the damage-generating mechanisms affecting the built environment. The aim of this paper is to partially close this gap by discussing a balanced knowledge integration approach which can be used to resolve the multidisciplinary disorder in flood vulnerability research. Modelling techniques such as mathematical-physical modelling of the flood hazard impact on, and response of, the affected building envelope, and formative scenario analyses of possible consequences in terms of damage and loss, are used in synergy to provide an enhanced understanding of vulnerability and to render the derived knowledge into interdisciplinary mitigation strategies. The outlined formal procedure allows for a convincing knowledge alignment of quantified, but partial, information about vulnerability resulting from the application of physical and engineering notions, and valuable, but often underspecified, qualitative argumentation strings emerging from the adopted socio-economic viewpoint.
HMPT: Hazardous Waste Transportation Live 27928, Test 27929
DOE Office of Scientific and Technical Information (OSTI.GOV)
Simpson, Lewis Edward
2016-03-17
HMPT: Hazardous Waste Transportation (Live 27928, suggested one time and associated Test 27929, required initially and every 36 months) addresses the Department of Transportation (DOT) function-specific training requirements of the hazardous materials packagings and transportation (HMPT) Los Alamos National Laboratory (LANL) lab-wide training. This course addresses the requirements of the DOT that are unique to hazardous waste shipments. Appendix B provides the Title 40 Code of Federal Regulations (CFR) reference material needed for this course.
NASA Astrophysics Data System (ADS)
Lam, Carl
Due to technology proliferation, the environmental burden attributed to the production, use, and disposal of hazardous materials in electronics has become a worldwide concern. The major theme of this dissertation is to develop and apply hazardous materials assessment tools to systematically guide pollution prevention opportunities in the context of electronic product design, manufacturing and end-of-life waste management. To this end, a comprehensive review is first provided describing hazard traits and current assessment methods for evaluating hazardous materials. As a case study at the manufacturing level, life cycle impact assessment (LCIA)-based and risk-based screening methods are used to quantify chemical and geographic environmental impacts in the U.S. printed wiring board (PWB) industry. Results from this industrial assessment clarify priority waste streams and States to most effectively mitigate impact. With further knowledge of PWB manufacturing processes, select alternative chemical processes (e.g., spent copper etchant recovery) and material options (e.g., lead-free etch resist) are discussed. In addition, an investigation on technology transition effects for computers and televisions in the U.S. market is performed by linking dynamic materials flow and environmental assessment models. The analysis forecasts quantities of waste units generated and maps shifts in environmental impact potentials associated with metal composition changes due to product substitutions. This insight is important to understand the timing and waste quantities expected and the emerging toxic elements that need to be addressed as a consequence of technology transition. At the product level, electronic utility meter devices are evaluated to eliminate hazardous materials within product components. Development and application of a component Toxic Potential Indicator (TPI) assessment methodology highlights priority components requiring material alternatives. Alternative recommendations are provided, and substitute materials such as aluminum alloys for stainless steel and high-density polyethylene for polyvinyl chloride and acrylonitrile-based polymers show promise in meeting toxicity reduction, cost, and material functionality requirements. Furthermore, the TPI method, a European Union-focused screening tool, is customized to reflect regulated U.S. toxicity parameters. Results show that, although it is possible to adopt U.S. parameters into the TPI method, harmonization of toxicity regulation and standards in various nations and regions is necessary to eliminate inconsistencies during hazard screening of substances used globally. As a whole, the present work helps to assimilate material hazard assessment methods into the larger framework of design-for-environment strategies so that toxics use reduction can be achieved for the development and management of electronics and other consumer goods.
The Active Fault Parameters for Time-Dependent Earthquake Hazard Assessment in Taiwan
NASA Astrophysics Data System (ADS)
Lee, Y.; Cheng, C.; Lin, P.; Shao, K.; Wu, Y.; Shih, C.
2011-12-01
Taiwan is located at the boundary between the Philippine Sea Plate and the Eurasian Plate, with a convergence rate of ~80 mm/yr in a ~N118E direction. The plate motion is so active that earthquakes are very frequent. In the Taiwan area, disaster-inducing earthquakes often result from active faults. For this reason, it is important to understand the activity and hazard of active faults. The active faults in Taiwan are mainly located in the Western Foothills and the eastern Longitudinal Valley. The active fault distribution map published by the Central Geological Survey (CGS) in 2010 shows 31 active faults on the island of Taiwan, some of which are related to earthquakes. Many researchers have investigated these active faults and continuously update new data and results, but few have integrated them for time-dependent earthquake hazard assessment. In this study, we gather previous research and field work results and integrate these data into an active fault parameter table for time-dependent earthquake hazard assessment. We gather the seismic profiles or earthquake relocations for a fault and combine them with the fault trace on land to establish a 3D fault geometry model in a GIS system. We collect studies of fault source scaling in Taiwan and estimate the maximum magnitude from fault length or fault area. We use the characteristic earthquake model to evaluate the active fault earthquake recurrence interval. For the other parameters, we collect previous studies and historical references to complete our parameter table of active faults in Taiwan. The Working Group on California Earthquake Probabilities (WG08) performed a time-dependent earthquake hazard assessment of active faults in California by establishing fault models, deformation models, earthquake rate models, and probability models, and then computing the rupture probabilities of faults in California. Following these steps, we have preliminarily evaluated the probability of earthquake-related hazards on certain faults in Taiwan. By completing the active fault parameter table for Taiwan, we can apply it in time-dependent earthquake hazard assessment. The result can also give engineers a reference for design. Furthermore, it can be applied in seismic hazard maps to mitigate disasters.
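The WG08-style, time-dependent assessment referenced above rests on a renewal model: given the characteristic recurrence interval and the time elapsed since the last event, one computes the conditional probability of rupture in a forecast window. The sketch below illustrates that calculation with a lognormal recurrence model; the recurrence interval, aperiodicity, and elapsed time are hypothetical values, not parameters for any specific Taiwanese fault.

```python
# Minimal sketch of a time-dependent (renewal-model) rupture probability,
# in the spirit of the WG08-style assessment mentioned above. A lognormal
# recurrence model is assumed here; all fault parameters are hypothetical.
import numpy as np
from scipy.stats import lognorm

mean_recurrence = 300.0   # years (hypothetical characteristic recurrence)
aperiodicity    = 0.5     # coefficient of variation of recurrence times
elapsed         = 150.0   # years since the last characteristic earthquake
window          = 50.0    # forecast window, years

# Parameterize the lognormal so its mean and CoV match the values above
sigma = np.sqrt(np.log(1.0 + aperiodicity**2))
mu    = np.log(mean_recurrence) - 0.5 * sigma**2
recur = lognorm(s=sigma, scale=np.exp(mu))

# Conditional probability of rupture in the next `window` years,
# given that `elapsed` years have passed without an event:
# P = [F(elapsed + window) - F(elapsed)] / [1 - F(elapsed)]
p_cond = (recur.cdf(elapsed + window) - recur.cdf(elapsed)) / recur.sf(elapsed)
print(f"Conditional 50-yr rupture probability: {p_cond:.1%}")
```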
Operation Cartwheel, 1943-1944: Integrated Force Projection to Overcome Limited Access
2014-06-13
aircraft and crews came more personnel exposed to the hazardous local flying conditions. Thunderstorms frequently covered the high terrain throughout the...Group operating in New Guinea. Lack of current charts, weather, and hazard information coupled with negligible radio coverage combined to create...amazing fact about this big operation ‘up the valley’ is that every single item of food and equipment, from the inevitable tin of bully beef to the bulky
Rocky Mountain Research Station USDA Forest Service
2005-01-01
The Guide to Fuel Treatments analyzes a range of potential silvicultural thinnings and surface fuel treatments for 25 representative dry-forest stands in the Western United States. The guide provides quantitative guidelines and visualization for treatment based on scientific principles identified for reducing potential crown fires. This fact sheet identifies the...
NASA Astrophysics Data System (ADS)
Bindi, D.; Cotton, F.; Kotha, S. R.; Bosse, C.; Stromeyer, D.; Grünthal, G.
2017-09-01
We present a ground motion prediction equation (GMPE) for probabilistic seismic hazard assessments (PSHA) in low-to-moderate seismicity areas, such as Germany. Starting from the NGA-West2 flat-file (Ancheta et al. in Earthquake Spectra 30:989-1005, 2014), we develop a model tailored to the hazard application in terms of data selection and implemented functional form. In light of this hazard application, the GMPE is derived for hypocentral distance (along with the Joyner-Boore one), selecting recordings at sites with vs30 ≥ 360 m/s, distances within 300 km, and magnitudes in the range 3 to 8 (7.4 being the maximum magnitude for the PSHA in the target area). Moreover, the complexity of the considered functional form reflects the availability of information in the target area. The median predictions are compared with those from the NGA-West2 models and with one recent European model, using Sammon's maps constructed for different scenarios. Despite the simplification in the functional form, the assessed epistemic uncertainty in the GMPE median is of the order of that affecting the NGA-West2 models for the magnitude range of interest for the hazard application. On the other hand, the simplification of the functional form led to an increase in the apparent aleatory variability. In conclusion, the GMPE developed in this study is tailored to the needs of applications in low-to-moderate seismicity areas and for short return periods (e.g., 475 years); its application in studies where the hazard involves magnitudes above 7.4 and long return periods is not advised.
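To make the idea of a GMPE functional form concrete, the sketch below evaluates a generic, simplified form with magnitude scaling and hypocentral-distance attenuation. The functional form and all coefficients are illustrative placeholders chosen for this example, not the published model or its coefficients.

```python
# Minimal sketch of evaluating a simplified GMPE of the general form
#   ln(Y) = a + b1*(M - Mref) + b2*(M - Mref)**2
#           + (c1 + c2*(M - Mref)) * ln(sqrt(R**2 + h**2))
# All coefficients below are illustrative placeholders, not the published model.
import numpy as np

COEFF = dict(a=-1.5, b1=1.2, b2=-0.08, c1=-1.3, c2=0.15, h=6.0, Mref=5.5)

def ln_median_gm(mag: float, r_hypo: np.ndarray, c: dict = COEFF) -> np.ndarray:
    """Natural log of median ground motion (arbitrary units) vs. distance."""
    dm = mag - c["Mref"]
    r_eff = np.sqrt(r_hypo**2 + c["h"]**2)   # effective distance with depth term
    return c["a"] + c["b1"]*dm + c["b2"]*dm**2 + (c["c1"] + c["c2"]*dm)*np.log(r_eff)

distances = np.array([10.0, 30.0, 100.0, 300.0])   # km
for m in (4.5, 6.0, 7.4):
    print(m, np.exp(ln_median_gm(m, distances)).round(4))
```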
Fluid Dynamics of Small, Rugged Vacuum Pumps of Viscous-Drag Type
NASA Technical Reports Server (NTRS)
Russell, John M.
2002-01-01
The need to identify spikes in the concentration of hazardous gases during countdowns to space shuttle launches has led Kennedy Space Center to acquire considerable expertise in the design, construction, and operation of special-purpose gas analyzers of mass-spectrometer type. If such devices could be miniaturized so as to fit in a small airborne package or backpack, then their potential applications would include integrated vehicle health monitoring in later-generation space shuttles and hazardous-material detection in airports, to name two examples. The bulkiest components of such devices are vacuum pumps, particularly those that function in the low vacuum range. Some pumps that operate in the high vacuum range (e.g. molecular-drag and turbomolecular pumps) are already small and rugged. The present work aims to determine whether, on physical grounds, one may or may not adapt the molecular-drag principle to the low-vacuum range (in which case viscous-drag principle is the appropriate term). The deliverable of the present effort is the derivation and justification of some key formulas and calculation methods for the preliminary design of a single-spool, spiral-channel viscous-drag pump.
Chen, Guangchao; Peijnenburg, Willie; Xiao, Yinlong; Vijver, Martina G
2017-07-12
As listed by the European Chemicals Agency, the three elements in evaluating the hazards of engineered nanomaterials (ENMs) are the integration and evaluation of toxicity data, the categorization and labeling of ENMs, and the derivation of hazard threshold levels for human health and the environment. Assessing the hazards of ENMs solely based on laboratory tests is time-consuming, resource intensive, and constrained by ethical considerations. The adoption of computational toxicology into this task has recently become a priority. Alternative approaches such as (quantitative) structure-activity relationships ((Q)SAR) and read-across are of significant help in predicting nanotoxicity and filling data gaps, and in classifying the hazards of ENMs to individual species. In turn, the species sensitivity distribution (SSD) approach can serve to establish ENM hazard thresholds that sufficiently protect the ecosystem. This article critically reviews the current knowledge on the development of in silico models for predicting and classifying the hazard of metallic ENMs, and on the development of SSDs for metallic ENMs. Further discussion includes the significance of well-curated experimental datasets and the interpretation of toxicity mechanisms of metallic ENMs based on reported models. An outlook is also given on future directions of research in this frontier.
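The SSD approach mentioned above derives an ecosystem-level threshold from per-species toxicity endpoints. A minimal sketch of the standard log-normal SSD fit and the resulting HC5 (the concentration hazardous to 5% of species) is given below; the endpoint values are hypothetical placeholders, not measured ENM toxicity data.

```python
# Minimal sketch of a species sensitivity distribution (SSD): fit a
# log-normal distribution to per-species toxicity endpoints and derive the
# HC5 (concentration hazardous to 5% of species). Endpoint values are
# hypothetical placeholders, not measured ENM toxicity data.
import numpy as np
from scipy.stats import norm

# Hypothetical EC50 values (mg/L) for different test species
ec50 = np.array([0.8, 1.5, 2.3, 3.1, 4.7, 6.0, 9.5, 12.0, 20.0, 35.0])

log_ec50 = np.log10(ec50)
mu, sigma = log_ec50.mean(), log_ec50.std(ddof=1)

# HC5: the 5th percentile of the fitted log-normal SSD
hc5 = 10 ** norm.ppf(0.05, loc=mu, scale=sigma)
print(f"HC5 = {hc5:.3f} mg/L")

# Fraction of species predicted to be affected at a given exposure level
exposure = 2.0
affected = norm.cdf(np.log10(exposure), loc=mu, scale=sigma)
print(f"Potentially affected fraction at {exposure} mg/L: {affected:.1%}")
```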
Ghermandi, Luciana; Beletzky, Natacha A; de Torres Curth, Mónica I; Oddi, Facundo J
2016-12-01
The overlapping zone between urbanization and wildland vegetation, known as the wildland urban interface (WUI), is often at high risk of wildfire. Human activities increase the likelihood of wildfires, which can have disastrous consequences for property and land use, and can pose a serious threat to lives. Fire hazard assessments depend strongly on the spatial scale of analysis. We assessed the fire hazard in a WUI area of a Patagonian city by working at three scales: landscape, community and species. Fire is a complex phenomenon, so we used a large number of variables that correlate a priori with the fire hazard. Consequently, we analyzed environmental variables together with fuel load and leaf flammability variables and integrated all the information in a fire hazard map with four fire hazard categories. The Nothofagus dombeyi forest had the highest fire hazard while grasslands had the lowest. Our work highlights the vulnerability of the wildland-urban interface to fire in this region and our suggested methodology could be applied in other wildland-urban interface areas. Particularly in high hazard areas, our work could help in spatial delimitation policies, urban planning and development of plans for the protection of human lives and assets. Copyright © 2016 Elsevier Ltd. All rights reserved.
[Management of hazardous waste in a hospital].
Neveu C, Alejandra; Matus C, Patricia
2007-07-01
Inadequate management of hospital waste, which includes toxic, infectious, and chemical wastes, is a risk factor for humans and the environment. The aim was to identify, quantify, and assess the risks associated with the management of hospital waste. A cross-sectional assessment of the generation of hazardous waste in a hospital was performed between June and August 2005. The environmental risk associated with the management of non-radioactive hospital waste was assessed and the main problems related to solid waste were identified. The rate of generation of hazardous non-radioactive waste was 1.35 tons per month, or 0.7 kg/bed/day. Twenty-five percent of hazardous liquid waste was drained directly into the sewage system. The drug preparation unit of the pharmacy had the highest environmental risk associated with the generation of hazardous waste. The internal transport of hazardous waste carried a high risk due to the lack of trip planning. The lack of training of personnel handling these wastes was another risk factor. Considering that adequate management of hospital waste should minimize risks for patients, the hospital that was evaluated lacks an integrated management system for its waste.
Wet Work and Barrier Function.
Fartasch, Manigé
2016-01-01
Wet work, defined as unprotected exposure to humid environments or water, high frequencies of hand washing, or prolonged glove occlusion, is believed to cause irritant contact dermatitis in a variety of occupations. This review considers recent studies on wet-work exposure and focuses on its influence on barrier function. There are different methods to study the effect of wet work on barrier function. On the one hand, occupational cohorts at risk can be monitored prospectively by skin bioengineering technology and clinical visual scoring systems; on the other hand, experimental test procedures with defined application of water, occlusion and detergents are performed in healthy volunteers. Both epidemiological studies and the results of experimental procedures are compared and discussed. A variety of epidemiological studies analyze occupational cohorts at risk. The measurement of transepidermal water loss, an indicator of the integrity of the epidermal barrier, and clinical inspection of the skin have shown that especially the frequencies of hand washing and water contact/contact with aqueous mixtures seem to be the main factors in the occurrence of barrier alterations. On the other hand, in a single cross-sectional study, prolonged glove wearing (e.g. occlusion for 6 h per shift in clean-room workers) without exposure to additional hazardous substances seemed not to affect the skin negatively. But regarding the effect of occlusion, there is experimental evidence that previously occluded skin challenged with sodium lauryl sulfate shows an increased susceptibility to the irritant with an aggravation of the irritant reaction. These findings might have relevance for the real-life situation in so far as, after occupational glove wearing, the skin is more susceptible to potential hazards even during leisure hours. © 2016 S. Karger AG, Basel.
Bioluminescent bioreporter integrated circuit
Simpson, Michael L.; Sayler, Gary S.; Paulus, Michael J.
2000-01-01
Disclosed are monolithic bioelectronic devices comprising a bioreporter and an OASIC. These bioluminescent bioreporter integrated circuits are useful in detecting substances such as pollutants, explosives, and heavy metals residing in inhospitable areas such as groundwater, industrial process vessels, and battlefields. Also disclosed are methods and apparatus for environmental pollutant detection, oil exploration, drug discovery, industrial process control, and hazardous chemical monitoring.
42 CFR 93.217 - Office of Research Integrity or ORI.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 42 Public Health 1 2010-10-01 2010-10-01 false Office of Research Integrity or ORI. 93.217 Section 93.217 Public Health PUBLIC HEALTH SERVICE, DEPARTMENT OF HEALTH AND HUMAN SERVICES HEALTH ASSESSMENTS AND HEALTH EFFECTS STUDIES OF HAZARDOUS SUBSTANCES RELEASES AND FACILITIES PUBLIC HEALTH SERVICE POLICIES ON RESEARCH MISCONDUCT Definitions §...
Engaging academia to advance the science and practice of environmental public health tracking.
Strosnider, Heather; Zhou, Ying; Balluz, Lina; Qualters, Judith
2014-10-01
Public health agencies at the federal, state, and local level are responsible for implementing actions and policies that address health problems related to environmental hazards. These actions and policies can be informed by integrating or linking data on health, exposure, hazards, and population. The mission of the Centers for Disease Control and Prevention's National Environmental Public Health Tracking Program (Tracking Program) is to provide information from a nationwide network of integrated health, environmental hazard, and exposure data that drives actions to improve the health of communities. The Tracking Program and federal, state, and local partners collect, integrate, analyze, and disseminate data and information to inform environmental public health actions. However, many challenges exist regarding the availability and quality of data, the application of appropriate methods and tools to link data, and the state of the science needed to link and analyze health and environmental data. The Tracking Program has collaborated with academia to address key challenges in these areas. The collaboration has improved our understanding of the uses and limitations of available data and methods, expanded the use of existing data and methods, and increased our knowledge about the connections between health and environment. Valuable working relationships have been forged in this process, and together we have identified opportunities and improvements for future collaborations to further advance the science and practice of environmental public health tracking. Published by Elsevier Inc.
The effect of natural disturbances on the risk from hydrogeomorphic hazards under climate change
NASA Astrophysics Data System (ADS)
Scheidl, Christian; Thaler, Thomas; Seidl, Rupert; Rammer, Werner; Kohl, Bernhard; Markart, Gerhard
2017-04-01
Recent storm events in Austria show once more how floods, sediment transport processes and debris flows constitute a major threat in alpine regions with a high population density and increasing spatial development. As protection forests exert a major control on runoff and erosion, they directly affect the risk from such hydrogeomorphic processes. However, research on future climate conditions, with an expected increase of the global average surface temperature of 3-5°C by 2100 compared to the first decade of the 20th century, raises a number of open questions for sustainable and improved hazard management in mountain forests. For Europe, for instance, a climate-induced increase in forest disturbances like wildfire, wind, and insect outbreaks is highly likely for the coming decades. Especially in protection forests, future scenarios of such climate-induced natural disturbances and their impact on the protective effect remain an unresolved issue. Combining methods from forestry, hydrology and geotechnical engineering, our project uses an integral approach to simulate possible effects of natural disturbances on hydrogeomorphic hazards in the perspective of future protection forest developments. With the individual-based forest landscape and disturbance model (iLand) we conduct an ensemble of forest landscape simulations, assessing the impact of future changes in natural disturbance regimes in four selected torrential catchments. These catchments are situated in two different forest growth areas. Drainage rate simulations are based on the conceptual hydrological model ZEMOKOST, whereas simulations of the effect of forest disturbances on hillslope erosion processes are conducted with the Distributed Hydrology Soil Vegetation Model (DHSVM). Besides the process-based simulations, we also aim to identify the risk perception and adaptive capacity needed to mitigate a probable loss of protection functions in forests. For this reason, a postal survey among forestry actors will be performed to assess forest managers' concerns and willingness to engage in natural hazards management, in contrast to the roles of their social network and the roles of political/administrative representatives. We will compare these perceived roles along the dimensions of efficacy, attribution of responsibility, and trust. This theory-driven approach highlights the motivational structure underlying the willingness to participate in natural hazards initiatives, and allows policy implications to be tailored to the needs and capacities of distinct target groups. The outcomes of the investigations shall contribute to the development of adaptive management strategies for forestry administrations at all political levels to mitigate negative effects of climate change in protection forests.
Integrated hazard assessment of Cirenmaco glacial lake in Zhangzangbo valley, Central Himalayas
NASA Astrophysics Data System (ADS)
Wang, Weicai; Gao, Yang; Iribarren Anacona, Pablo; Lei, Yanbin; Xiang, Yang; Zhang, Guoqing; Li, Shenghai; Lu, Anxin
2018-04-01
Glacial lake outburst floods (GLOFs) have recently become one of the primary natural hazards in the Himalayas. There is therefore an urgent need to assess GLOF hazards in the region. Cirenmaco, a moraine-dammed lake located in the upstream portion of Zhangzangbo valley, Central Himalayas, has received public attention since its damaging 1981 outburst flood. Here, by combining remote sensing methods, bathymetric survey and 2D hydraulic modeling, we assessed the hazard posed by Cirenmaco in its current state. Inter-annual variation of Cirenmaco lake area indicates a rapid lake expansion from 0.10 ± 0.08 km² in 1988 to 0.39 ± 0.04 km² in 2013. The bathymetric survey shows the maximum water depth of the lake in 2012 was 115 ± 2 m, and the lake volume was calculated to be 1.8 × 10⁷ m³. Field geomorphic analysis shows that Cirenmaco glacial lake is prone to GLOFs, as mass movements and ice and snow avalanches can impact the lake and the melting of dead ice in the moraine can lower the dam level. The HEC-RAS 2D model was then used to simulate moraine dam failure of Cirenmaco and assess GLOF impacts downstream. Reconstruction of the 1981 Cirenmaco GLOF shows that HEC-RAS can produce reasonable flood extent and water depth, thus demonstrating its ability to effectively model complex GLOFs. The GLOF modeling results presented can be used as a basis for the implementation of disaster prevention and mitigation measures. As a case study, this work shows how different methods can be integrated into GLOF hazard assessment.
Kīlauea June 27th Lava Flow Hazard Mapping and Disaster Response with UAS
NASA Astrophysics Data System (ADS)
Turner, N.; Perroy, R. L.; Hon, K. A.; Rasgado, V.
2015-12-01
In June of 2014, pāhoehoe lava flows from the Puʻu ʻŌʻō eruption began threatening communities and infrastructure on eastern Hawaii Island. During the subsequent declared state of emergency by Hawaii Civil Defense and temporary flight restriction by the Federal Aviation Administration (FAA), we used a small fixed-wing Unmanned Aircraft System (UAS) to collect high spatial and temporal resolution imagery over the active flow in support of natural hazard assessment by emergency managers. Integration of our UAS into busy airspace, populated by emergency aircraft and tour helicopters, required close operational coordination with the FAA and local operators. We logged >80 hours of UAS flight operations between October 2014 and March 2015, generating a dense time-series of 4-5 cm resolution imagery and derived topographic datasets using structure from motion. These data were used to monitor flow activity, document pre- and post- lava flow damage, identify hazardous areas for first responders, and model lava flow paths in complex topography ahead of the active flow front. Turnaround times for delivered spatial data products improved from 24-48 hours at the beginning of the study to ~2-4 hours by the end. Data from this project are being incorporated into cloud computing applications to shorten delivery time and extract useful analytics regarding lava flow hazards in near real-time. The lessons learned from this event have advanced UAS integration in disaster operations in U.S. airspace and show the high potential UAS hold for natural hazards assessment and real-time emergency management.
NASA Astrophysics Data System (ADS)
Keiler, M.
2003-04-01
Reports on catastrophes with high damage caused by natural hazards seem to have increased in number recently. A new trend in dealing with these natural processes leads to the integration of risk into natural hazards evaluations and approaches of integral risk management. The risk resulting from natural hazards can be derived from the combination of parameters of physical processes (intensity and recurrence probability) and damage potential (probability of presence and expected damage value). Natural hazard research focuses mainly on the examination, modelling and estimation of individual geomorphological processes as well as on future developments caused by climate change. Even though damage potential has been taken into account more frequently, quantifying statements are still missing. Due to the changes of the socio-economic structures in mountain regions (urban sprawl, population growth, increased mobility and tourism) these studies are mandatory. This study presents a conceptual method that records the damage potential (probability of physical presence, evaluation of buildings) and shows the development of the damage potential resulting from avalanches since 1950. The study area is the community of Galtür, Austria. 36 percent of the existing buildings are found in officially declared avalanche hazard zones. The majority of these buildings are either agricultural or accommodation facilities. Additionally, the effects of physical planning and/or technical measures on the spatial development of the potential damage are illustrated. The results serve to improve risk determination and point out an unnoticed increase of damage potential and risk in apparently safe settlement areas.
Large Scale System Safety Integration for Human Rated Space Vehicles
NASA Astrophysics Data System (ADS)
Massie, Michael J.
2005-12-01
Since the 1960s man has searched for ways to establish a human presence in space. Unfortunately, the development and operation of human spaceflight vehicles carry significant safety risks that are not always well understood. As a result, the countries with human space programs have felt the pain of loss of lives in the attempt to develop human space travel systems. Integrated System Safety is a process developed through years of experience (since before Apollo and Soyuz) as a way to assess risks involved in space travel and prevent such losses. The intent of Integrated System Safety is to take a look at an entire program and put together all the pieces in such a way that the risks can be identified, understood and dispositioned by program management. This process has many inherent challenges, and they need to be explored, understood and addressed. In order to prepare a truly integrated analysis, safety professionals must gain a level of technical understanding of all of the project's pieces and how they interact. Next, they must find a way to present the analysis so the customer can understand the risks and make decisions about managing them. However, every organization in a large-scale project can have different ideas about what is or is not a hazard, what is or is not an appropriate hazard control, and what is or is not adequate hazard control verification. NASA provides some direction on these topics, but interpretations of those instructions can vary widely. Even more challenging is the fact that every individual and organization involved in a project has different levels of risk tolerance. When the discrete hazard controls of the contracts and agreements cannot be met, additional risk must be accepted. However, when one has left the arena of compliance with the known rules, there can no longer be specific ground rules on which to base a decision as to what is acceptable and what is not. The integrator must find common ground among all parties to achieve concurrence on these non-compliant conditions. Another area of challenge lies in determining the credibility of a proposed hazard. For example, NASA's definition of a credible hazard is accurate but does not provide specific guidance about contractors declaring a hazard "not credible" and ceasing work on that item. Unfortunately, this has the side effect of taking valuable resources from high-risk areas and using them to investigate whether these extremely low risk items have the potential to become worse than they appear. In order to deal with these types of issues, there must exist the concept of a "Safe State", and it must be used as a building block to help address many of the technical and social challenges in working safety and risk management. This "Safe State" must serve as the foundation for building the cultural modifications needed to assure that safety issues are properly identified, heard, and dispositioned by space program management. As the space program and the countries involved in it move forward in the development of human rated spacecraft, they must learn from the recent Columbia accident and establish a new or modified basis for safety risk decisions. Those involved must also become more cognizant of the diversity in safety approaches and agree on how to deal with them. Most of all, those involved must never forget that while the System Safety duty may be difficult, their efforts help to preserve the lives of space crews and their families.
32 CFR 172.2 - Applicability and scope.
Code of Federal Regulations, 2010 CFR
2010-07-01
... chemical processing. The recycling of hazardous materials or hazardous waste shall be accomplished with due...-bearing scrap and those items that may be used again for their original purposes or functions without any...
32 CFR 172.2 - Applicability and scope.
Code of Federal Regulations, 2013 CFR
2013-07-01
... chemical processing. The recycling of hazardous materials or hazardous waste shall be accomplished with due...-bearing scrap and those items that may be used again for their original purposes or functions without any...
32 CFR 172.2 - Applicability and scope.
Code of Federal Regulations, 2012 CFR
2012-07-01
... chemical processing. The recycling of hazardous materials or hazardous waste shall be accomplished with due...-bearing scrap and those items that may be used again for their original purposes or functions without any...
32 CFR 172.2 - Applicability and scope.
Code of Federal Regulations, 2014 CFR
2014-07-01
... chemical processing. The recycling of hazardous materials or hazardous waste shall be accomplished with due...-bearing scrap and those items that may be used again for their original purposes or functions without any...
32 CFR 172.2 - Applicability and scope.
Code of Federal Regulations, 2011 CFR
2011-07-01
... chemical processing. The recycling of hazardous materials or hazardous waste shall be accomplished with due...-bearing scrap and those items that may be used again for their original purposes or functions without any...
40 CFR 63.7830 - What are my monitoring requirements?
Code of Federal Regulations, 2011 CFR
2011-07-01
... (CONTINUED) National Emission Standards for Hazardous Air Pollutants for Integrated Iron and Steel... leaks. (viii) Inspect fans for wear, material buildup, and corrosion through quarterly visual...
Two solar proton fluence models based on ground level enhancement observations
NASA Astrophysics Data System (ADS)
Raukunen, Osku; Vainio, Rami; Tylka, Allan J.; Dietrich, William F.; Jiggens, Piers; Heynderickx, Daniel; Dierckxsens, Mark; Crosby, Norma; Ganse, Urs; Siipola, Robert
2018-01-01
Solar energetic particles (SEPs) constitute an important component of the radiation environment in interplanetary space. Accurate modeling of SEP events is crucial for the mitigation of radiation hazards in spacecraft design. In this study we present two new statistical models of high-energy solar proton fluences based on ground level enhancement (GLE) observations during solar cycles 19-24. As the basis of our modeling, we utilize fits of a four-parameter double power law function (known as the Band function) to integral GLE fluence spectra in rigidity. In the first model, the integral and differential fluences for protons with energies between 10 MeV and 1 GeV are calculated using the fits, and the distributions of the fluences at certain energies are modeled with an exponentially cut-off power law function. In the second model, we use a more advanced methodology: by investigating the distributions and relationships of the spectral fit parameters we find that they can be modeled as two independent and two dependent variables. Therefore, instead of modeling the fluences separately at different energies, we can model the shape of the fluence spectrum. We present examples of modeling results and show that the two methodologies agree well except for a short mission duration (1 year) at a low confidence level. We also show that there is reasonable agreement between our models and three well-known solar proton models (JPL, ESP and SEPEM), despite the differences in both the modeling methodologies and the data used to construct the models.
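For reference, the sketch below evaluates a four-parameter double power law ("Band function") in rigidity of the kind used for the GLE fluence fits described above: a power law with an exponential rollover at low rigidity joined smoothly to a second power law above a break rigidity. The functional form follows the standard Band parameterization, and the parameter values in the example are illustrative placeholders, not fitted GLE values.

```python
# Minimal sketch of the four-parameter double power law ("Band function")
# in rigidity. Parameter values below are illustrative placeholders,
# not fitted GLE values.
import numpy as np

def band_fluence(R, J0, gamma1, gamma2, R0):
    """Integral fluence J(>R) as a double power law in rigidity R (GV)."""
    R = np.asarray(R, dtype=float)
    Rb = (gamma2 - gamma1) * R0                      # break rigidity
    low  = J0 * R**(-gamma1) * np.exp(-R / R0)       # below the break
    high = J0 * R**(-gamma2) * Rb**(gamma2 - gamma1) * np.exp(gamma1 - gamma2)
    return np.where(R <= Rb, low, high)

# Illustrative parameters (placeholders)
rigidity = np.logspace(-1, 1, 5)                     # 0.1 to 10 GV
print(band_fluence(rigidity, J0=1e8, gamma1=1.5, gamma2=4.5, R0=0.8))
```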
Leist, Marcel; Lidbury, Brett A; Yang, Chihae; Hayden, Patrick J; Kelm, Jens M; Ringeissen, Stephanie; Detroyer, Ann; Meunier, Jean R; Rathman, James F; Jackson, George R; Stolper, Gina; Hasiwa, Nina
2012-01-01
Several alternative methods to replace animal experiments have been accepted by legal bodies. An even larger number of tests are under development or already in use for non-regulatory applications or for the generation of information stored in proprietary knowledge bases. The next step for the use of the different in vitro methods is their combination into integrated testing strategies (ITS) to get closer to the overall goal of predictive "in vitro-based risk evaluation processes." We introduce here a conceptual framework as the basis for future ITS and their use for risk evaluation without animal experiments. The framework allows incorporation of both individual tests and already integrated approaches. Illustrative examples for elements to be incorporated are drawn from the session "Innovative technologies" at the 8th World Congress on Alternatives and Animal Use in the Life Sciences, held in Montreal, 2011. For instance, LUHMES cells (conditionally immortalized human neurons) were presented as an example for a 2D cell system. The novel 3D platform developed by InSphero was chosen as an example for the design and use of scaffold-free, organotypic microtissues. The identification of critical pathways of toxicity (PoT) may be facilitated by approaches exemplified by the MatTek 3D model for human epithelial tissues with engineered toxicological reporter functions. The important role of in silico methods and of modeling based on various pre-existing data is demonstrated by Altamira's comprehensive approach to predicting a molecule's potential for skin irritancy. A final example demonstrates how natural variation in human genetics may be overcome using data analytic (pattern recognition) techniques borrowed from computer science and statistics. The overall hazard and risk assessment strategy integrating these different examples has been compiled in a graphical work flow.
The role of the health physicist in nuclear security.
Waller, Edward J; van Maanen, Jim
2015-04-01
Health physics is a recognized safety function in the holistic context of the protection of workers, members of the public, and the environment against the hazardous effects of ionizing radiation, often generically designated as radiation protection. The role of the health physicist as protector dates back to the Manhattan Project. Nuclear security is the prevention and detection of, and response to, criminal or intentional unauthorized acts involving or directed at nuclear material, other radioactive material, associated facilities, or associated activities. Its importance has become more visible and pronounced in the post 9/11 environment, and it has a shared purpose with health physics in the context of protection of workers, members of the public, and the environment. However, the duties and responsibilities of the health physicist in the nuclear security domain are neither clearly defined nor recognized, while a fundamental understanding of nuclear phenomena in general, nuclear or other radioactive material specifically, and the potential hazards related to them is required for threat assessment, protection, and risk management. Furthermore, given the unique skills and attributes of professional health physicists, it is argued that the role of the health physicist should encompass all aspects of nuclear security, ranging from input in the development to implementation and execution of an efficient and effective nuclear security regime. As such, health physicists should transcend their current typical role as consultants in nuclear security issues and become fully integrated and recognized experts in the nuclear security domain and decision making process. Issues regarding the security clearances of health physics personnel and the possibility of insider threats must be addressed in the same manner as for other trusted individuals; however, the net gain from recognizing and integrating health physics expertise in all levels of a nuclear security regime far outweighs any negative aspects. In fact, it can be argued that health physics is essential in achieving an integrated approach toward nuclear safety, security, and safeguards.
Hazardous and toxic waste management in Botswana: practices and challenges.
Mmereki, Daniel; Li, Baizhan; Meng, Liu
2014-12-01
Hazardous and toxic waste is a complex waste category because of its inherent chemical and physical characteristics. It demands environmentally sound technologies and know-how, as well as clean technologies, that can manage and dispose of it in an environmentally friendly way. Nevertheless, Botswana lacks a system covering all the critical steps from importation to final disposal or processing of hazardous and toxic waste, owing to limited follow-up of the sources and types of hazardous and toxic waste, and to a lack of modern and specialised treatment/disposal facilities, technical know-how, technically skilled manpower, funds, and capacity of local institutions to take the lead in waste management. Because of the lack of an integrated system, there are challenges such as a lack of cooperation among stakeholders regarding the safe management of hazardous and toxic waste. Furthermore, Botswana does not have a systematic regulatory framework for monitoring and managing hazardous and toxic waste. In addition to the absence of a systematic regulatory framework, inadequate public awareness and dissemination of information about hazardous and toxic waste management, slow progress in phasing out persistent and bio-accumulative waste, and a lack of reliable and accurate information on hazardous and toxic waste generation, sources and composition have created critical challenges to effective hazardous and toxic waste management. It is, therefore, important to examine the status of hazardous and toxic waste as a waste stream in Botswana. Accordingly, this mini-review article presents an overview of the current status of hazardous and toxic waste management and introduces the main challenges in hazardous and toxic waste management. Moreover, the article proposes the best applicable strategies to achieve effective hazardous and toxic waste management in the future. © The Author(s) 2014.
EFEHR - the European Facilities for Earthquake Hazard and Risk: beyond the web-platform
NASA Astrophysics Data System (ADS)
Danciu, Laurentiu; Wiemer, Stefan; Haslinger, Florian; Kastli, Philipp; Giardini, Domenico
2017-04-01
European Facilities for Earthquake Hazard and Risk (EFEHR) represents the sustainable community resource for seismic hazard and risk in Europe. The EFEHR web platform is the main gateway to access data, models and tools, as well as to expertise relevant for the assessment of seismic hazard and risk. The main services (databases and web-platform) are hosted at ETH Zurich and operated by the Swiss Seismological Service (Schweizerischer Erdbebendienst SED). The EFEHR web-portal (www.efehr.org) collects and displays (i) harmonized datasets necessary for hazard and risk modeling, e.g. seismic catalogues, fault compilations, site amplifications, vulnerabilities, inventories; (ii) extensive seismic hazard products, namely hazard curves, uniform hazard spectra and maps for national and regional assessments; (iii) standardized configuration files for re-computing the regional seismic hazard models; (iv) relevant documentation of harmonized datasets, models and web-services. Today, EFEHR distributes the full output of the 2013 European Seismic Hazard Model, ESHM13, as developed within the SHARE project (http://www.share-eu.org/); the latest results of the 2014 Earthquake Model of the Middle East (EMME14), derived within the EMME Project (www.emme-gem.org); the 2001 Global Seismic Hazard Assessment Project (GSHAP) results; and the 2015 updates of the Swiss Seismic Hazard. New datasets related to either seismic hazard or risk will be incorporated as they become available. We present the current status of the EFEHR platform, with focus on the challenges, summaries of the up-to-date datasets, user experience and feedback, as well as the roadmap to future technological innovation beyond the web-platform development. We also show the new services foreseen to fully integrate with the seismological core services of the European Plate Observing System (EPOS).
Rauch, Geraldine; Brannath, Werner; Brückner, Matthias; Kieser, Meinhard
2018-05-01
In many clinical trial applications, the endpoint of interest corresponds to a time-to-event endpoint. In this case, group differences are usually expressed by the hazard ratio. Group differences are commonly assessed by the logrank test, which is optimal under the proportional hazard assumption. However, there are many situations in which this assumption is violated. Especially in applications where a full population and several subgroups or a composite time-to-first-event endpoint and several components are considered, the proportional hazard assumption usually does not simultaneously hold true for all test problems under investigation. As an alternative effect measure, Kalbfleisch and Prentice proposed the so-called 'average hazard ratio'. The average hazard ratio is based on a flexible weighting function to modify the influence of time and has a meaningful interpretation even in the case of non-proportional hazards. Despite this favorable property, it is hardly ever used in practice, whereas the standard hazard ratio is commonly reported in clinical trials regardless of whether the proportional hazard assumption holds true or not. There exist two main approaches to construct corresponding estimators and tests for the average hazard ratio, where the first relies on weighted Cox regression and the second on a simple plug-in estimator. The aim of this work is to give a systematic comparison of these two approaches and the standard logrank test for different time-to-event settings with proportional and non-proportional hazards and to illustrate the pros and cons in application. We conduct a systematic comparative study based on Monte-Carlo simulations and a real clinical trial example. Our results suggest that the properties of the average hazard ratio depend on the underlying weighting function. The two approaches to construct estimators and related tests show very similar performance for adequately chosen weights. In general, the average hazard ratio defines a more valid effect measure than the standard hazard ratio under non-proportional hazards, and the corresponding tests provide a power advantage over the common logrank test. As non-proportional hazards are often met in clinical practice and the average hazard ratio tests often outperform the common logrank test, this approach should be used more routinely in applications. Schattauer GmbH.
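As an editorial illustration of the effect measure discussed above, the following sketch gives one common formulation of the average hazard ratio in the spirit of Kalbfleisch and Prentice; the weighting function w(t) is a generic placeholder, and the exact weight choices compared in the paper may differ.

```latex
% Sketch of the average hazard ratio (AHR) for two groups with hazards
% h_1(t) and h_0(t); w(t) is a weighting function chosen by the analyst.
\[
\mathrm{AHR}(w) \;=\;
\frac{\displaystyle\int_0^\infty \frac{h_1(t)}{h_0(t)+h_1(t)}\, w(t)\, \mathrm{d}t}
     {\displaystyle\int_0^\infty \frac{h_0(t)}{h_0(t)+h_1(t)}\, w(t)\, \mathrm{d}t}
\]
% Under proportional hazards, h_1(t) = \theta\, h_0(t) for all t, and the
% expression reduces to the usual hazard ratio \theta regardless of w(t);
% under non-proportional hazards the value depends on the chosen weight.
```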
Compute Element and Interface Box for the Hazard Detection System
NASA Technical Reports Server (NTRS)
Villalpando, Carlos Y.; Khanoyan, Garen; Stern, Ryan A.; Some, Raphael R.; Bailey, Erik S.; Carson, John M.; Vaughan, Geoffrey M.; Werner, Robert A.; Salomon, Phil M.; Martin, Keith E.;
2013-01-01
The Autonomous Landing and Hazard Avoidance Technology (ALHAT) program is building a sensor that enables a spacecraft to evaluate autonomously a potential landing area to generate a list of hazardous and safe landing sites. It will also provide navigation inputs relative to those safe sites. The Hazard Detection System Compute Element (HDS-CE) box combines a field-programmable gate array (FPGA) board for sensor integration and timing, with a multicore computer board for processing. The FPGA does system-level timing and data aggregation, and acts as a go-between, removing the real-time requirements from the processor and labeling events with a high resolution time. The processor manages the behavior of the system, controls the instruments connected to the HDS-CE, and services the "heavy lifting" computational requirements for analyzing the potential landing spots.
NASA Astrophysics Data System (ADS)
Schaub, Y.; Huggel, C.; Serraino, M.; Haeberli, W.
2012-04-01
The changes in high-mountain environments are increasingly fast and complex. GIS-based models of the Swiss Alps show that numerous topographic overdeepenings are likely to appear on progressively exposed glacier beds, which are considered as potential sites of future lake formation. In many cases these newly forming lakes will be situated in an over-steepened and destabilized high-mountain environment and are, therefore, prone to impact waves from landslides. The risk of glacier lake outburst floods, endangering infrastructure, residential areas and persons further downvalley, is increasing with further lake formation and glacier recession. This risk may persist for many decades if not centuries. Future-oriented hazard assessments have to be integrative and must deal with all possible process chains. Reference studies and methodologies are still scarce, however. We present an approach to compare risks resulting from high-mountain lakes in the Swiss Alps amongst each other. Already existing lakes are included in the analysis alongside future ones. The presented risk assessment approach integrates the envisaged high-mountain hazard process chain with present and future socio-economic conditions. Applying the concept of integral risk management, the hazard and damage potentials have to be analyzed. The areas that feature the topographic potential for rock/ice avalanches to reach a lake were analyzed regarding their susceptibility to slope failure, considering slope inclination, permafrost occurrence, glacier recession and bedrock lithology. Together with the analysis of the lakes (volume and runout path of potential outburst floods), the hazard analysis of the process chain was completed. As an example, high long-term hazard potentials in the Swiss Alps have to be expected in the area of the Great Aletsch glacier. A methodology for the assessment of the damage potential was elaborated and will be presented. In order to estimate the location of the largest damage potentials, driving forces of different spatial development scenarios for the Swiss Alps will be implemented in a land allocation model for the Swiss Alps. By bringing together hazard, exposure and vulnerability analyses, a risk assessment for the entire Swiss Alps regarding lake-outburst floods triggered by impacts of rock/ice avalanches can be conducted for today, the middle of the century and even beyond.
Detection of Hazardous Cavities Below a Road Using Combined Geophysical Methods
NASA Astrophysics Data System (ADS)
De Giorgi, L.; Leucci, G.
2014-07-01
Assessment of the risk arising from near-surface natural hazards is a crucial step in safeguarding the security of roads in karst areas. It helps authorities and other related parties to apply suitable procedures for ground treatment, mitigate potential natural hazards and minimize human and economic losses. Karstic terrain in the Salento Peninsula (Apulia region, southern Italy) poses a major challenge to engineering construction and roads due to the extensive occurrence of cavities and/or sinkholes that cause ground subsidence and the collapse of both roads and buildings. Cavities are air- or sediment-filled underground voids, commonly developed in calcarenite sedimentary rocks by the infiltration of rainwater into the ground, opening up, over a long period of time, holes and tunnels. Mitigation of natural hazards can best be achieved through careful geoscientific studies. Traditionally, engineers use destructive probing techniques for the detection of cavities across regular grids or at random distances. Such probing is insufficient on its own to provide confidence that cavities will not be encountered, and increasing the frequency of probing and the depth of investigation quickly becomes expensive. Besides, probing is intrusive, non-continuous, slow, expensive and cannot provide a complete lateral picture of the subsurface geology. Near-surface cavities can usually be detected easily by surface geophysical methods. Traditional and recently developed measuring techniques in seismics, geoelectrics and georadar are suitable for economical investigation of hazardous, potentially collapsing cavities. The presented research focused on an integrated geophysical survey carried out on a near-coast road located at Porto Cesareo, a small village a few kilometers southwest of Lecce (southern Italy). The roads in this area are extensively affected by dangerous surface cracks that cause structural instability. The survey aimed to image the shallow subsurface structures, including karstic features, and evaluate their extent, as they may cause rock instability and lead to cracking of the road. Seismic refraction tomography and ground-penetrating radar surveys were carried out along several parallel traverses extending about 100 m on the cracked road. The acquired data were processed and interpreted in an integrated manner to elucidate the shallow structural setting of the site. Integrated interpretation led to the delineation of hazard zones rich in karstic features in the area. Most of these karstic features are associated with vertical and subvertical linear features and cavities. These features are the main cause of the rock instability that resulted in the potentially dangerous cracking of the road.
Hall, A Tilghman; Belanger, Scott E; Guiney, Pat D; Galay-Burgos, Malyka; Maack, Gerd; Stubblefield, William; Martin, Olwenn
2017-07-01
Ecological risk assessments and risk management decisions are only as sound as the underlying information and processes to integrate them. It is important to develop transparent and reproducible procedures a priori to integrate often-heterogeneous evidence. Current weight-of-evidence (WoE) approaches for effects or hazard assessment tend to conflate aspects of the assessment of the quality of the data with the strength of the body of evidence as a whole. We take forward recent developments in the critical appraisal of the reliability and relevance of individual ecotoxicological studies as part of the effect or hazard assessment of prospective risk assessments and propose a streamlined WoE approach. The aim is to avoid overlap and double accounting of criteria used in reliability and relevance with that used in current WoE methods. The protection goals, problem formulation, and evaluation process need to be clarified at the outset. The data are first integrated according to lines of evidence (LoEs), typically mechanistic insights (e.g., cellular, subcellular, genomic), in vivo experiments, and higher-tiered field or observational studies. Data are then plotted on the basis of both relevance and reliability scores or categories. This graphical approach provides a means to visually assess and communicate the credibility (reliability and relevance of available individual studies), quantity, diversity, and consistency of the evidence. In addition, the external coherence of the body of evidence needs to be considered. The final step in the process is to derive an expression of the confidence in the conclusions of integrating the information considering these 5 aspects in the context of remaining uncertainties. We suggest that this streamlined approach to WoE for the effects or hazard characterization should facilitate reproducible and transparent assessments of data across different regulatory requirements. Integr Environ Assess Manag 2017;13:573-579. © 2017 The Authors. Integrated Environmental Assessment and Management published by Wiley Periodicals, Inc. on behalf of Society of Environmental Toxicology & Chemistry (SETAC).
Cerri, Rodrigo I; Reis, Fábio A G V; Gramani, Marcelo F; Giordano, Lucilia C; Zaine, José Eduardo
2017-01-01
This paper presents a new approach to landslide hazard zonation studies, based on an integrated study of structural data along with geomorphological and external factors, in a hilly region of Brazil covered by a tropical humid rainforest, called the Serra do Mar. The Serra do Mar consists of a hilly region along the east coast of Brazil, with steep slopes and many geological structures in a gneissic-migmatitic terrain. In contrast to traditional approaches, this method proposes that structural data (foliation, fractures and bedding planes) and their relation to the slope geometry should be considered in landslide hazard zonation, along with declivity, relative relief, soil and rock properties, land use and vegetation cover, and hydrogeological and climatic factors. Results show that slopes with high hazard have the same dip direction as the geological structures. Landslide hazard zonation using structural data contributes to a better understanding of how these structures, preserved in tropical residual soils, influence slope stability and generate landslides.
Wang, Wei; Albert, Jeffrey M
2017-08-01
An important problem within the social, behavioral, and health sciences is how to partition an exposure effect (e.g. treatment or risk factor) among specific pathway effects and to quantify the importance of each pathway. Mediation analysis based on the potential outcomes framework is an important tool to address this problem and we consider the estimation of mediation effects for the proportional hazards model in this paper. We give precise definitions of the total effect, natural indirect effect, and natural direct effect in terms of the survival probability, hazard function, and restricted mean survival time within the standard two-stage mediation framework. To estimate the mediation effects on different scales, we propose a mediation formula approach in which simple parametric models (fractional polynomials or restricted cubic splines) are utilized to approximate the baseline log cumulative hazard function. Simulation study results demonstrate low bias of the mediation effect estimators and close-to-nominal coverage probability of the confidence intervals for a wide range of complex hazard shapes. We apply this method to the Jackson Heart Study data and conduct sensitivity analysis to assess the impact on the mediation effects inference when the no unmeasured mediator-outcome confounding assumption is violated.
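For orientation, here is a hedged sketch of the standard two-stage potential-outcomes decomposition on the survival-probability scale; the notation (exposure A, mediator M, counterfactual survival S_{a,M_{a*}}) is generic, conditioning on baseline confounders is suppressed, and it should not be read as the paper's exact formulation.

```latex
% Counterfactual survival when the exposure is set to a while the mediator
% takes the value it would have had under exposure a*:
\[
S_{a,\,M_{a^*}}(t) \;=\; \int S\!\left(t \mid A=a,\; M=m\right)\, \mathrm{d}F_{M \mid A=a^*}(m)
\]
% Decomposition of the total effect (TE) into natural indirect (NIE) and
% natural direct (NDE) effects on the survival-probability scale:
\[
\mathrm{TE}(t) = S_{1,M_1}(t) - S_{0,M_0}(t), \qquad
\mathrm{NIE}(t) = S_{1,M_1}(t) - S_{1,M_0}(t), \qquad
\mathrm{NDE}(t) = S_{1,M_0}(t) - S_{0,M_0}(t),
\]
% so that TE(t) = NIE(t) + NDE(t). Analogous decompositions can be written
% on the hazard and restricted-mean-survival-time scales.
```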
Addressing Uniqueness and Unison of Reliability and Safety for a Better Integration
NASA Technical Reports Server (NTRS)
Huang, Zhaofeng; Safie, Fayssal
2016-01-01
Over time, it has been observed that Safety and Reliability have not been clearly differentiated, which leads to confusion, inefficiency, and, sometimes, counter-productive practices in executing each of these two disciplines. It is imperative to address this situation to help the Reliability and Safety disciplines improve their effectiveness and efficiency. The paper poses an important question to address: "Safety and Reliability - are they unique or unisonous?" To answer the question, the paper reviewed several of the most commonly used analyses from each of the disciplines, namely, FMEA, reliability allocation and prediction, reliability design involvement, system safety hazard analysis, Fault Tree Analysis, and Probabilistic Risk Assessment. The paper pointed out the uniqueness and unison of Safety and Reliability in their respective roles, requirements, approaches, and tools, and presented some suggestions for enhancing and improving the individual disciplines, as well as promoting the integration of the two. The paper concludes that Safety and Reliability are unique but complement each other in many aspects, and need to be integrated. In particular, the individual roles of Safety and Reliability need to be differentiated; that is, Safety is to ensure and assure that the product meets safety requirements, goals, or desires, and Reliability is to ensure and assure maximum achievability of the intended design functions. With the integration of Safety and Reliability, personnel can be shared, tools and analyses have to be integrated, and skill sets can be possessed by the same person, with the purpose of providing the best value to product development.
Embedding Scientific Integrity and Ethics into the Scientific Process and Research Data Lifecycle
NASA Astrophysics Data System (ADS)
Gundersen, L. C.
2016-12-01
Predicting climate change, developing resources sustainably, and mitigating natural hazard risk are complex interdisciplinary challenges in the geosciences that require the integration of data and knowledge from disparate disciplines and scales. This kind of interdisciplinary science can only thrive if scientific communities work together and adhere to common standards of scientific integrity, ethics, data management, curation, and sharing. Science and data without integrity and ethics can erode the very fabric of the scientific enterprise and potentially harm society and the planet. Inaccurate risk analyses of natural hazards can lead to poor choices in construction, insurance, and emergency response. Incorrect assessment of mineral resources can bankrupt a company, destroy a local economy, and contaminate an ecosystem. This paper presents key ethics and integrity questions paired with the major components of the research data life cycle. The questions can be used by the researcher during the scientific process to help ensure the integrity and ethics of their research and adherence to sound data management practice. Questions include considerations for open, collaborative science, which is fundamentally changing the responsibility of scientists regarding data sharing and reproducibility. The publication of primary data, methods, models, software, and workflows must become a norm of science. There are also questions that prompt the scientist to think about the benefit of their work to society; ensuring equity, respect, and fairness in working with others; and always striving for honesty, excellence, and transparency.
IRIS Toxicological Review of Tetrahydrofuran (THF) (External ...
EPA is conducting a peer review and public comment of the scientific basis supporting the human health hazard and dose-response assessment of tetrahydrofuran (THF) that when finalized will appear on the Integrated Risk Information System (IRIS) database. EPA is undertaking an Integrated Risk Information System (IRIS) health assessment for tetrahydrofuran. IRIS is an EPA database containing Agency scientific positions on potential adverse human health effects that may result from chronic (or lifetime) exposure to chemicals in the environment. IRIS contains chemical-specific summaries of qualitative and quantitative health information in support of two steps of the risk assessment paradigm, i.e., hazard identification and dose-response evaluation. IRIS assessments are used in combination with specific situational exposure assessment information to evaluate potential public health risk associated with environmental contaminants.
Application of seismic interpretation in the development of Jerneh Field, Malay Basin
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yusoff, Z.
1994-07-01
Development of the Jerneh gas field has been significantly aided by the use of 3-D and site survey seismic interpretations. The two aspects that have been of particular importance are identification of sea-floor and near-surface safety hazards for safe platform installation/development drilling and mapping of reservoirs/hydrocarbons within gas-productive sands of the Miocene groups B, D, and E. Choice of platform location as well as casing design require detailed analysis of sea-floor and near-surface safety hazards. At Jerneh, sea-floor pockmarks, near-surface high amplitudes, distributary channels, and minor faults were recognized as potential operational safety hazards. The integration of conventional 3-D and site survey seismic data enabled comprehensive understanding of the occurrence and distribution of potential hazards to platform installation and development well drilling. Three-dimensional seismic interpretation has been instrumental not only in the field structural definition but also in recognition of reservoir trends and hydrocarbon distribution. Additional gas reservoirs were identified by their DHI characteristics and subsequently confirmed by development wells. The innovative use of seismic attribute mapping techniques has been very important in defining both fluid and reservoir distribution in groups B and D. Integration of 3-D seismic data and well-log interpretations has helped in optimal field development, including the planning of well locations and drilling sequence.
Uncertainty on shallow landslide hazard assessment: from field data to hazard mapping
NASA Astrophysics Data System (ADS)
Trefolini, Emanuele; Tolo, Silvia; Patelli, Eduardo; Broggi, Matteo; Disperati, Leonardo; Le Tuan, Hai
2015-04-01
Shallow landsliding that involves Hillslope Deposits (HD), the surficial soil that covers the bedrock, is an important process of erosion, transport and deposition of sediment along hillslopes. Although shallow landslides generally mobilize relatively small volumes of material, they represent the most hazardous factor in mountain regions due to their high velocity and the common absence of warning signs. Moreover, increasing urbanization and likely climate change make shallow landslides a source of widespread risk; the interest of the scientific community in this process has therefore grown over the last three decades. One of the main aims of research projects on this topic is to perform robust shallow landslide hazard assessment for wide areas (regional assessment), in order to support sustainable spatial planning. Currently, three main methodologies may be implemented to assess regional shallow landslide hazard: expert evaluation, probabilistic (or data mining) methods and methods based on physical models. The aim of this work is to evaluate the uncertainty of shallow landslide hazard assessment based on physical models, taking into account spatial variables such as geotechnical and hydrogeologic parameters as well as hillslope morphometry. To achieve this goal a wide dataset of geotechnical properties (shear strength, permeability, depth and unit weight) of HD was gathered by integrating field survey, in situ and laboratory tests. This spatial database was collected from a study area of about 350 km2 including different bedrock lithotypes and geomorphological features. The uncertainty associated with each step of the hazard assessment process (e.g. field data collection, regionalization of site-specific information and numerical modelling of hillslope stability) was carefully characterized. The most appropriate probability density function (PDF) was chosen for each numerical variable, and we assessed the uncertainty propagation on HD strength parameters obtained by empirical relations with geotechnical index properties. Site-specific information was regionalized at map scale by (hard and fuzzy) clustering analysis taking into account spatial variables such as geology, geomorphology and hillslope morphometric variables (longitudinal and transverse curvature, flow accumulation and slope), the latter derived from a DEM with 10 m cell size. In order to map shallow landslide hazard, Monte Carlo simulation was performed for some common physically based models available in the literature (e.g. SINMAP, SHALSTAB, TRIGRS). Furthermore, a new approach based on the use of Bayesian Networks was proposed and validated. Different models, such as Intervals, Convex Models and Fuzzy Sets, were adopted for the modelling of input parameters. Finally, an accuracy assessment was carried out on the resulting maps, and the propagation of the uncertainty of input parameters into the final shallow landslide hazard estimation was quantified. The outcomes of the analysis are compared and discussed in terms of discrepancy among map pixel values and the related estimated error. The novelty of the proposed method lies in the estimation of the confidence of shallow landslide hazard mapping at the regional level. This allows (i) discrimination of regions where the hazard assessment is robust from areas where more data are necessary to increase the confidence level, and (ii) assessment of the reliability of the procedure used for hazard assessment.
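To make the Monte Carlo step concrete, the following is a minimal sketch of uncertainty propagation through an infinite-slope stability model, the formulation underlying physically based tools such as SINMAP and SHALSTAB; the parameter distributions are purely illustrative and are not the values derived for the study area.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000  # Monte Carlo samples for a single map pixel

# Illustrative parameter distributions (NOT the study-area values):
phi   = np.radians(rng.normal(32.0, 3.0, n))   # friction angle [rad]
c     = rng.lognormal(np.log(4.0), 0.5, n)     # effective cohesion [kPa]
gamma = rng.normal(18.0, 1.0, n)               # soil unit weight [kN/m^3]
depth = rng.uniform(0.5, 2.0, n)               # hillslope deposit depth [m]
m     = rng.uniform(0.0, 1.0, n)               # relative wetness (0 = dry, 1 = saturated)
slope = np.radians(30.0)                       # slope angle from the DEM [rad]
gamma_w = 9.81                                 # unit weight of water [kN/m^3]

# Infinite-slope factor of safety (common simplified form):
fs = (c / (gamma * depth * np.sin(slope) * np.cos(slope))
      + (1.0 - m * gamma_w / gamma) * np.tan(phi) / np.tan(slope))

p_failure = np.mean(fs < 1.0)
print(f"P(FS < 1) = {p_failure:.3f}")  # pixel-level probability of failure
```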
Design for Connecting Spatial Data Infrastructures with Sensor Web (sensdi)
NASA Astrophysics Data System (ADS)
Bhattacharya, D.; M., M.
2016-06-01
Integrating Sensor Web With Spatial Data Infrastructures (SENSDI) aims to extend SDIs with sensor web enablement, converging geospatial and built infrastructure, and to implement test cases with sensor data and SDI. It is about research to harness the sensed environment by utilizing domain-specific sensor data to create a generalized sensor web framework. The challenges are semantic enablement for Spatial Data Infrastructures and connecting the interfaces of SDI with the interfaces of the Sensor Web. The proposed research plan is to identify sensor data sources, set up an open-source SDI, match the APIs and functions between Sensor Web and SDI, and carry out case studies such as hazard applications and urban applications. We take up co-operative development of SDI best practices to enable a new realm of a location-enabled and semantically enriched World Wide Web - the "Geospatial Web" or "Geosemantic Web" - by setting up a one-to-one correspondence between WMS, WFS, WCS and Metadata services and the 'Sensor Observation Service' (SOS), 'Sensor Planning Service' (SPS), 'Sensor Alert Service' (SAS), and the 'Web Notification Service' (WNS), a service that facilitates asynchronous message interchange between users and services, and between two OGC-SWE services. Hence, in conclusion, it is of importance to geospatial studies to integrate SDI with the Sensor Web. The integration can be done by merging the common OGC interfaces of SDI and Sensor Web. Multi-usability studies to validate the integration have to be undertaken as future research.
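As a minimal, hedged illustration of addressing both families of interfaces named above, the sketch below issues standard OGC GetCapabilities requests to a WMS endpoint (SDI side) and an SOS endpoint (Sensor Web side); the URLs are placeholders, not services operated by SENSDI.

```python
import requests

# Placeholder endpoints -- substitute real SDI / Sensor Web service URLs.
WMS_URL = "https://example.org/geoserver/wms"
SOS_URL = "https://example.org/sos/service"

# Classic SDI interface: Web Map Service capabilities document.
wms_caps = requests.get(
    WMS_URL,
    params={"service": "WMS", "request": "GetCapabilities", "version": "1.3.0"},
    timeout=30,
)

# Sensor Web Enablement interface: Sensor Observation Service capabilities.
sos_caps = requests.get(
    SOS_URL,
    params={"service": "SOS", "request": "GetCapabilities", "acceptVersions": "2.0.0"},
    timeout=30,
)

# Both responses are XML capabilities documents; matching their layer and
# observation-offering metadata is one way to build the one-to-one
# correspondence between SDI and Sensor Web services discussed above.
print(wms_caps.status_code, sos_caps.status_code)
```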
NASA Astrophysics Data System (ADS)
List-Kratochvil, Emil J. W.
2016-09-01
Comfortable, wearable sensors and computers will enhance every person's awareness of his or her health condition, environment, chemical pollutants, potential hazards, and information of interest. In agriculture and in the food industry there is a need for a constant control of the condition and needs of plants, animals, and farm products. Yet many of these applications depend upon the development of novel, cheap devices and sensors that are easy to implement and to integrate. Organic semiconductors as well as several inorganic materials and hybrid material systems have proven to combine a number of intriguing optical and electronic properties with simple processing methods. As it will be reviewed in this contribution, these materials are believed to find their application in printed electronic devices allowing for the development of smart disposable devices in food-, health-, and environmental monitoring, diagnostics and control, possibly integrated into arrays of sensor elements for multi-parameter detection. In this contribution we review past and recent achievements in the field. Followed by a brief introduction, we will focus on two topics being on the agenda recently: a) the use of electrolyte-gated organic field-effect transistor (EGOFET) and ion-selective membrane based sensors for in-situ sensing of ions and biological substances and b) the development of hybrid material based resistive switches and their integration into fully functional, printed hybrid crossbar sensor array structures.
NASA Astrophysics Data System (ADS)
Kwiatek, Grzegorz; Blanke, Aglaja; Olszewska, Dorota; Orlecka-Sikora, Beata; Lasocki, Stanisław; Kozlovskaya, Elena; Nevalainen, Jouni; Schmittbuhl, Jean; Grasso, Jean-Robert; Schaming, Marc; Bigarre, Pascal; Kinscher, Jannes-Lennart; Saccorotti, Gilberto; Garcia, Alexander; Cassidy, Nigel; Toon, Sam; Mutke, Grzegorz; Sterzel, Mariusz; Szepieniec, Tomasz
2017-04-01
The Thematic Core Service "Anthropogenic Hazards" (TCS AH) integrates data and provides various data services in the form of a complete e-research infrastructure for advanced analysis and geophysical modelling of anthropogenic hazards due to georesources exploitation. TCS AH is based on the prototype built in the framework of the IS-EPOS project POIG.02.03.00-14-090/13-00 (https://tcs.ah-epos.eu/). The TCS AH is currently being further developed within the EPOS Implementation Phase (H2020-INFRADEV-1-2015-1, INFRADEV-3-2015). The TCS AH aims to have a measurable impact on innovative research and development by providing a comprehensive, wide-scale and high-quality research infrastructure available to the scientific community, industrial partners and the public. One of the main deliverables of TCS AH is access to numerous induced seismicity datasets called "episodes". An episode is defined as a comprehensive set of data describing a geophysical process induced or triggered by technological activity, which under certain circumstances can become hazardous for people, infrastructure and the environment. The episode is a time-correlated, standardized collection of geophysical, technological and other relevant geodata forming complete documentation of the seismogenic process. In addition to the 6 episodes already implemented during the previous phase of integration, and 3 episodes integrated within the SHEER project, at least 18 new episodes related to conventional hydrocarbon extraction, reservoir treatment, underground mining and geothermal energy production are currently being integrated into the TCS AH. The heterogeneous multi-disciplinary data from different episodes are subjected to an extensive quality control (QC) procedure composed of five steps and involving the collaborative work of data providers, the quality control team and the IT team, supervised by the quality control manager with the aid of the Redmine platform. The first three steps of QC are performed at the local data center and include (1) transfer of episode data to the local data center, (2) data standardization and validation of formats, and (3) metadata preparation according to the TCS AH metadata scheme. The final two steps of QC are performed at the level of the TCS AH website and include (4) contextual analysis of data quality, followed by the appearance of the episode in the TCS AH maintenance area, and finally (5) episode publication on the TCS AH website.
An estimator of the survival function based on the semi-Markov model under dependent censorship.
Lee, Seung-Yeoun; Tsai, Wei-Yann
2005-06-01
Lee and Wolfe (Biometrics vol. 54 pp. 1176-1178, 1998) proposed the two-stage sampling design for testing the assumption of independent censoring, which involves further follow-up of a subset of lost-to-follow-up censored subjects. They also proposed an adjusted estimator for the survivor function for a proportional hazards model under the dependent censoring model. In this paper, a new estimator for the survivor function is proposed for the semi-Markov model under the dependent censorship on the basis of the two-stage sampling data. The consistency and the asymptotic distribution of the proposed estimator are derived. The estimation procedure is illustrated with an example of lung cancer clinical trial and simulation results are reported of the mean squared errors of estimators under a proportional hazards and two different nonproportional hazards models.
National-Level Multi-Hazard Risk Assessments in Sub-Saharan Africa
NASA Astrophysics Data System (ADS)
Murnane, R. J.; Balog, S.; Fraser, S. A.; Jongman, B.; Van Ledden, M.; Phillips, E.; Simpson, A.
2017-12-01
National-level risk assessments can provide important baseline information for decision-making on risk management and risk financing strategies. In this study, multi-hazard risk assessments were undertaken for 9 countries in Sub-Saharan Africa: Cape Verde, Ethiopia, Kenya, Niger, Malawi, Mali, Mozambique, Senegal and Uganda. The assessment was part of the Building Disaster Resilience in Sub-Saharan Africa Program and aimed at supporting the development of multi-risk financing strategies to help African countries make informed decisions to mitigate the socio-economic, fiscal and financial impacts of disasters. The assessments considered hazards and exposures consistent with the years 2010 and 2050. We worked with multiple firms to develop the hazard, exposure and vulnerability data and the risk results. The hazards include: coastal flood, drought, earthquake, landslide, riverine flood, tropical cyclone wind and storm surge, and volcanoes. For hazards expected to vary with climate, the 2050 hazard is based on the IPCC RCP 6.0. Geolocated exposure data for 2010 and 2050 at a 15 arc-second (approximately 0.5 km) resolution include: structures as a function of seven development patterns; transportation networks including roads, bridges, tunnels and rail; critical facilities such as schools, hospitals, energy facilities and government buildings; crops; population; and gross domestic product (GDP). The 2050 exposure values for population are based on the IPCC SSP 2. Values for other exposure data are a function of population change. Vulnerability was based on openly available vulnerability functions. Losses were based on replacement values (e.g., cost/m2 or cost/km). Risk results are provided in terms of annual average loss and a variety of return periods at the national and Admin 1 levels. Assessments of recent historical events are used to validate the model results. In the future, it would be useful to use hazard footprints of historical events for validation purposes. The results will be visualized in a set of national risk profile documents intended to form the basis for conversations with governments on risk reduction and risk financing strategies.
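As a hedged sketch of how the reported annual average loss relates to return-period losses, the snippet below integrates an illustrative loss exceedance curve; the numbers are invented and are not results from the assessment.

```python
import numpy as np

# Illustrative loss exceedance curve (NOT results from the assessment):
return_periods = np.array([5, 10, 25, 50, 100, 250, 500, 1000])   # years
losses         = np.array([2, 10, 40, 90, 160, 300, 420, 550])    # e.g. USD million

annual_exceedance = 1.0 / return_periods  # annual probability of exceedance

# Annual average loss ~ area under the loss vs. exceedance-probability curve.
# Sort by increasing exceedance probability and integrate with the trapezoid rule;
# the result is truncated to the probability range actually sampled.
order = np.argsort(annual_exceedance)
aal = np.trapz(losses[order], annual_exceedance[order])

print(f"Annual average loss ~ {aal:.1f} (same units as the loss values)")
```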
The Mediterranean Supersite Volcanoes (MED-SUV) Project: an overview
NASA Astrophysics Data System (ADS)
Puglisi, G.
2013-12-01
The EC-FP7 MEDiterranean SUpersite Volcanoes (MED-SUV) Project, which started in June 2013, aims to improve the capacity of the scientific institutions, end users and SMEs forming the project consortium to assess the volcanic hazards at Italian Supersites, i.e. Mt. Etna and Campi Flegrei/Vesuvius. The Project activities will focus on the optimisation and integration of ground and space monitoring systems, the breakthrough in understanding of volcanic processes, and on the increase of the effectiveness of the coordination between the scientific and end-user communities in the hazard management. The overall goal of the project is to apply the rationale of the Supersites GEO initiative to Mt. Etna and Campi Flegrei/Vesuvius, considered as a cluster of Supersites. For this purpose, MED-SUV will integrate long-term observations of ground-based multidisciplinary data available for these volcanoes, i.e. geophysical, geochemical, and volcanological datasets, with Earth Observation (EO) data. Merging of different parameters over a long period will provide better understanding of the volcanic processes. In particular, given the variety of styles and intensities of the volcanic activity observed at these volcanoes, which make them sort of archetypes for 'closed conduit' and 'open conduit' volcanic systems, the combination of different data will allow discrimination between peculiar volcano behaviours associated with pre-, syn- and post-eruptive phases. Indeed, recognition of specific volcano patterns will allow broadening of the spectrum of knowledge of geo-hazards, as well as better parameterisation and modelling of the eruptive phenomena and of the processes occurring in the volcano supply system; thus improving the capability of carrying out volcano surveillance activities. Important impacts on the European industrial sector, arising from a partnership integrating the scientific community and SMEs to implement together new observation/monitoring sensors/systems, are also expected. MED-SUV proposes the development and implementation of a state-of-the-art e-infrastructure for data integration and sharing and for the volcanic risk management life-cycle, from observation to people preparedness. Experiments and studies will be devoted to better understanding of the internal structures and related dynamics of the case-study volcanoes, as well as to recognition of signals associated with impending unrest or eruptive phases. Hazard quantitative assessment will benefit from the outcomes of these studies and from their integration into cutting-edge monitoring approaches, thus leading to a step-change in hazard awareness and preparedness, and leveraging the close relationship between scientists, SMEs, and end-users. The applicability of the project outcomes will be tested on the cluster of Supersites itself during a Pilot phase, as well as on other volcanic systems with similar behaviours like Piton de la Fournaise (Reunion Island) and the Azores.
NASA Technical Reports Server (NTRS)
Serke, David J.; Politovich, Marcia K.; Reehorst, Andrew L.; Gaydos, Andrew
2009-01-01
The Alliance Icing Research Study-II (AIRS-II) field program was conducted near Montreal, Canada during the winter of 2003. The NASA Icing Remote Detection System (NIRSS) was deployed to detect in-flight icing hazards and consisted of a vertically pointing multichannel radiometer, a ceilometer and an x-band cloud radar. The radiometer was used to derive atmospheric temperature soundings and integrated liquid water, while the ceilometer and radar were used only to define cloud boundaries. The purpose of this study is to show that the radar reflectivity profiles from AIRS-II case studies could be used to provide a qualitative icing hazard.
The QSPR-THESAURUS: the online platform of the CADASTER project.
Brandmaier, Stefan; Peijnenburg, Willie; Durjava, Mojca K; Kolar, Boris; Gramatica, Paola; Papa, Ester; Bhhatarai, Barun; Kovarich, Simona; Cassani, Stefano; Roy, Partha Pratim; Rahmberg, Magnus; Öberg, Tomas; Jeliazkova, Nina; Golsteijn, Laura; Comber, Mike; Charochkina, Larisa; Novotarskyi, Sergii; Sushko, Iurii; Abdelaziz, Ahmed; D'Onofrio, Elisa; Kunwar, Prakash; Ruggiu, Fiorella; Tetko, Igor V
2014-03-01
The aim of the CADASTER project (CAse Studies on the Development and Application of in Silico Techniques for Environmental Hazard and Risk Assessment) was to exemplify REACH-related hazard assessments for four classes of chemical compounds, namely polybrominated diphenylethers, per- and polyfluorinated compounds, (benzo)triazoles, and musks and fragrances. The QSPR-THESAURUS website (http://qspr-thesaurus.eu) was established as the project's online platform to upload, store, apply, and also create models within the project. We overview the main features of the website, such as model upload, experimental design and hazard assessment to support risk assessment, and integration with other web tools, all of which are essential parts of the QSPR-THESAURUS. 2014 FRAME.
The role of models in estimating consequences as part of the risk assessment process.
Forde-Folle, K; Mitchell, D; Zepeda, C
2011-08-01
The degree of disease risk represented by the introduction, spread, or establishment of one or several diseases through the importation of animals and animal products is assessed by importing countries through an analysis of risk. The components of a risk analysis include hazard identification, risk assessment, risk management, and risk communication. A risk assessment starts with identification of the hazard(s) and then continues with four interrelated steps: release assessment, exposure assessment, consequence assessment, and risk estimation. Risk assessments may be either qualitative or quantitative. This paper describes how, through the integration of epidemiological and economic models, the potential adverse biological and economic consequences of exposure can be quantified.
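A purely illustrative sketch of the kind of quantitative risk estimation described above, combining release, exposure and consequence assessments into an expected loss, is given below; the scenario probabilities and consequence values are invented and do not come from any specific model.

```python
# Purely illustrative quantitative risk estimation for an import pathway.
# All probabilities and consequence values are invented for this sketch.
scenarios = [
    # (name, annual P(release), P(exposure | release), consequence in USD million)
    ("small localized outbreak", 0.020, 0.50,   5.0),
    ("regional spread",          0.020, 0.30,  60.0),
    ("national epidemic",        0.020, 0.05, 400.0),
]

expected_loss = 0.0
for name, p_release, p_exposure_given_release, consequence in scenarios:
    p_scenario = p_release * p_exposure_given_release   # release x exposure assessment
    contribution = p_scenario * consequence             # consequence assessment
    expected_loss += contribution                       # risk estimation
    print(f"{name:26s}  P = {p_scenario:.4f}  contribution = {contribution:6.2f}")

print(f"Expected annual loss ~ {expected_loss:.2f} USD million")
```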
MetaPathways v2.5: quantitative functional, taxonomic and usability improvements.
Konwar, Kishori M; Hanson, Niels W; Bhatia, Maya P; Kim, Dongjae; Wu, Shang-Ju; Hahn, Aria S; Morgan-Lang, Connor; Cheung, Hiu Kan; Hallam, Steven J
2015-10-15
Next-generation sequencing is producing vast amounts of sequence information from natural and engineered ecosystems. Although this data deluge has an enormous potential to transform our lives, knowledge creation and translation need software applications that scale with increasing data processing and analysis requirements. Here, we present improvements to MetaPathways, an annotation and analysis pipeline for environmental sequence information that expedites this transformation. We specifically address pathway prediction hazards through integration of a weighted taxonomic distance and enable quantitative comparison of assembled annotations through a normalized read-mapping measure. Additionally, we improve LAST homology searches through BLAST-equivalent E-values and output formats that are natively compatible with prevailing software applications. Finally, an updated graphical user interface allows for keyword annotation query and projection onto user-defined functional gene hierarchies, including the Carbohydrate-Active Enzyme database. Availability: MetaPathways v2.5 is available on GitHub: http://github.com/hallamlab/metapathways2. Contact: shallam@mail.ubc.ca. Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press.
OpenQuake, a platform for collaborative seismic hazard and risk assessment
NASA Astrophysics Data System (ADS)
Henshaw, Paul; Burton, Christopher; Butler, Lars; Crowley, Helen; Danciu, Laurentiu; Nastasi, Matteo; Monelli, Damiano; Pagani, Marco; Panzeri, Luigi; Simionato, Michele; Silva, Vitor; Vallarelli, Giuseppe; Weatherill, Graeme; Wyss, Ben
2013-04-01
Sharing of data and risk information, best practices, and approaches across the globe is key to assessing risk more effectively. Through global projects, open-source IT development and collaborations with more than 10 regions, leading experts are collaboratively developing unique global datasets, best practice, tools and models for global seismic hazard and risk assessment, within the context of the Global Earthquake Model (GEM). Guided by the needs and experiences of governments, companies and international organisations, all contributions are being integrated into OpenQuake: a web-based platform that - together with other resources - will become accessible in 2014. With OpenQuake, stakeholders worldwide will be able to calculate, visualize and investigate earthquake hazard and risk, capture new data and share findings for joint learning. The platform is envisaged as a collaborative hub for earthquake risk assessment, used at global and local scales, around which an active network of users has formed. OpenQuake will comprise both online and offline tools, many of which can also be used independently. One of the first steps in OpenQuake development was the creation of open-source software for advanced seismic hazard and risk calculations at any scale, the OpenQuake Engine. Although in continuous development, a command-line version of the software is already being test-driven and used by hundreds worldwide: from non-profits in Central Asia, seismologists in sub-Saharan Africa and companies in South Asia to the European seismic hazard harmonization programme (SHARE). In addition, several technical trainings were organized with scientists from different regions of the world (sub-Saharan Africa, Central Asia, Asia-Pacific) to introduce the engine and other OpenQuake tools to the community, something that will continue to happen over the coming years. Other tools being developed that are of direct interest to the hazard community are: • OpenQuake Modeller: fundamental instruments for the creation of seismogenic input models for seismic hazard assessment, a critical input to the OpenQuake Engine. OpenQuake Modeller will consist of a suite of tools (Hazard Modellers Toolkit) for characterizing the seismogenic sources of earthquakes and their models of earthquake recurrence. An earthquake catalogue homogenization tool, for integration, statistical comparison and user-defined harmonization of multiple earthquake catalogues, is also included in the OpenQuake modeling tools. • A data capture tool for active faults: a tool that allows geologists to draw (new) fault discoveries on a map in an intuitive GIS environment and add details on the fault through the tool. These data, once quality checked, can then be integrated with the global active faults database, which will increase in value with every new fault insertion. Building on many ongoing efforts and the knowledge of scientists worldwide, GEM will for the first time integrate state-of-the-art data, models, results and open-source tools into a single platform. The platform will continue to increase in value, in particular for use in local contexts, through contributions from and collaborations with scientists and organisations worldwide. This presentation will showcase the OpenQuake Platform, focusing on the IT solutions that have been adopted as well as the added value that the platform will bring to scientists worldwide.
The Nature of Natural Hazards Communication (Invited)
NASA Astrophysics Data System (ADS)
Kontar, Y. Y.
2013-12-01
Some of the many issues of interest to natural hazards professionals include the analysis of proactive approaches to the governance of risk from natural hazards and approaches to broaden the scope of public policies related to the management of risks from natural hazards, as well as emergency and environmental management, community development and spatial planning related to natural hazards. During the talk we will present results of scientific review, analysis and synthesis, which emphasize some new trends in communication of natural hazards theories and practices within an up-to-the-minute context of new environmental and climate change issues, new technologies, and a new focus on resiliency. The presentation is divided into sections that focus on natural hazards communication in terms of education, risk management, public discourse, engaging the public, theoretical perspectives, and new media. It includes results of case studies and best practices. It delves into natural hazards communication theories, including diffusion, argumentation, and constructivism, to name a few. The presentation will provide information about: (1) a manual of natural hazards communication for scientists, policymakers, and media; (2) an up-to-the-minute context of environmental hazards, new technologies and the political landscape; (3) a work by natural hazards scientists for geoscientists working with social scientists and communication principles; (4) a work underpinned by key natural hazards communication theories and interspersed with pragmatic solutions; (5) a work that crosses traditional natural hazards boundaries: international, interdisciplinary, theoretical/applied. We will further explore how spatial planning can contribute to risk governance by influencing the occupation of natural hazard-prone areas, and review the central role of emergency management in risk policy. The goal of this presentation is to contribute to the augmentation of the conceptual framework of risk governance and increase the awareness of practitioners and decision-makers of the need to adopt proactive policies, leading to a more integrated, participative, and adaptive governance that can respond more efficiently to the increasing uncertainty resulting from escalating natural hazards risk exposure.
Defense Acquisitions Acronyms and Terms
2012-12-01
Excerpt from the acronym list (fragmentary in the source): ...Computer-Aided Design; CADD - Computer-Aided Design and Drafting; CAE - Component Acquisition Executive; Computer-Aided Engineering; CAIV - Cost As an... ...Radiation to Ordnance; HFE - Human Factors Engineering; HHA - Health Hazard Assessment; HNA - Host-Nation Approval; HNS - Host-Nation Support; HOL - High-Order... ...Engineering Change Proposal; VHSIC - Very High Speed Integrated Circuit; VLSI - Very Large Scale Integration; VOC - Volatile Organic Compound; W: WAN - Wide...
Assessing natural hazards in forestry for risk management: a review
Marc Hanewinkel; Susan Hummel; Axel Albrecht
2011-01-01
We address the problem of how to integrate risk assessment into forest management and therefore provide a comprehensive review of recent and past literature on risk analysis and modeling, together with an evaluation and summary of these papers. We provide a general scheme on how to integrate concepts of risk into forest management decisions. After an overview of the...
Unmanned Aerial Vehicle (UAV) associated DTM quality evaluation and hazard assessment
NASA Astrophysics Data System (ADS)
Huang, Mei-Jen; Chen, Shao-Der; Chao, Yu-Jui; Chiang, Yi-Lin; Chang, Kuo-Jen
2014-05-01
Taiwan, due to its high seismicity and high annual rainfall, experiences numerous landslides every year, and severe impacts affect the island. For catastrophic landslides, key information, including the landslide extent, volume estimate and subsequent evolution, is important when analyzing the triggering mechanism and for hazard assessment and mitigation. Thus, morphological analysis gives a general overview of a landslide and is considered one of the most fundamental types of information. We try to integrate several technologies, especially the Unmanned Aerial Vehicle (UAV) and a multi-spectral camera, to decipher the consequences, the potential hazard, and the social impact. In recent years, remote sensing technology has improved rapidly, providing a wide range of images and essential, valuable information. Benefiting from advances in informatics, remote-sensing and electronic technologies, Unmanned Aerial Vehicle (UAV) photogrammetry has improved significantly. The study tries to integrate several methods, including: 1) remote-sensing images gathered by the Unmanned Aerial Vehicle (UAV) and by aerial photos taken in different periods; 2) field in-situ geologic investigation; 3) differential GPS, RTK GPS and ground LiDAR field in-situ geoinformatics measurements; 4) construction of the DTMs before and after the landslide, as well as for subsequent periods, using UAV and aerial photos; 5) the discrete element method, applied to understand the geomaterial composing the slope failure and to predict earthquake-induced and rainfall-induced landslide displacement. First of all, we evaluate the Digital Terrain Model (DTM) derived from Microdrones MD4-1000 UAV airphotos. The ground resolution of the DSM point cloud could be as high as 10 cm. By integrating 4 ground control points within an area of 56 hectares and comparing with the LiDAR DSM and field RTK-GPS surveying, the mean error is as low as 6 cm with a standard deviation of 17 cm. The quality of the UAV DSM can be as good as LiDAR data, and it is ready for other applications. The dataset provides not only geoinformatics and GIS data on the hazards, but also essential geomorphologic information for other studies and for hazard mitigation and planning.
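The accuracy figures quoted above (mean error and standard deviation against RTK-GPS check points) can be reproduced for any DSM with a short script along the following lines; the elevation values are placeholders, and sampling the DSM at the check-point coordinates (e.g. in a GIS or with a raster library) is assumed to have been done beforehand.

```python
import numpy as np

# Elevations at the RTK-GPS check points (placeholder values, in metres):
z_gps = np.array([101.32, 98.75, 104.10, 99.88, 102.47])

# DSM elevations sampled at the same points (placeholder values here):
z_dsm = np.array([101.41, 98.63, 104.02, 100.05, 102.55])

err = z_dsm - z_gps                      # vertical error at each check point
mean_error = err.mean()                  # systematic bias
std_error  = err.std(ddof=1)             # precision (sample standard deviation)
rmse       = np.sqrt(np.mean(err ** 2))  # overall accuracy

print(f"mean error = {mean_error:+.3f} m, std = {std_error:.3f} m, RMSE = {rmse:.3f} m")
```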
Mace Firebaugh, Casey; Moyes, Simon; Jatrana, Santosh; Rolleston, Anna; Kerse, Ngaire
2018-01-18
The relationship between physical activity, function, and mortality is not established in advanced age. Physical activity, function, and mortality were followed in a cohort of Māori and non-Māori adults of advanced age for a period of six years. Generalised linear regression models were used to analyse the association between physical activity and NEADL, while Kaplan-Meier survival analysis and Cox proportional hazards models were used to assess the association between physical activity and mortality. The hazard ratio for mortality for those in the least active physical activity quartile was 4.1 for Māori and 1.8 for non-Māori compared with the most active physical activity quartile. There was an inverse relationship between physical activity and mortality, with lower hazard ratios for mortality at all levels of physical activity. Higher levels of physical activity were associated with lower mortality and higher functional status in adults of advanced age.
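For readers who want to see the flavour of the survival models used here, a minimal sketch with the Python lifelines package follows; the data are synthetic and the column names (years, died, pa_quartile, neadl) are placeholders rather than the study's variables.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter, KaplanMeierFitter

rng = np.random.default_rng(0)
n = 400

# Synthetic cohort: physical activity quartile (1 = least active, 4 = most active)
# and an illustrative functional status score.
df = pd.DataFrame({
    "pa_quartile": rng.integers(1, 5, n),
    "neadl": rng.normal(15, 4, n),
})

# Simulate survival times so that lower activity implies a higher hazard.
baseline_hazard = 0.10
df["years"] = rng.exponential(1.0 / (baseline_hazard * np.exp(0.4 * (4 - df["pa_quartile"]))))
df["years"] = df["years"].clip(upper=6.0)        # administrative censoring at 6 years
df["died"] = (df["years"] < 6.0).astype(int)

# Cox proportional hazards model: hazard ratios by activity and function.
cph = CoxPHFitter()
cph.fit(df, duration_col="years", event_col="died")
cph.print_summary()

# Kaplan-Meier survival curve for the least active quartile.
least_active = df[df["pa_quartile"] == 1]
km = KaplanMeierFitter()
km.fit(least_active["years"], event_observed=least_active["died"], label="least active quartile")
print(km.survival_function_.tail())
```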
Mars Science Laboratory Rover Closeout
2011-11-10
The Mars Science Laboratory mission rover, Curiosity, is prepared for final integration into the complete NASA spacecraft in this photograph taken inside the Payload Hazardous Servicing Facility at NASA Kennedy Space Center, Fla.
Construction Management Training in the Navy Seabees
1992-01-01
classroom training in developing a variety of skills. Skills attained are recorded under the Personnel Readiness Capability Program (PRCP) and...Functional Skill 090.2) - Hands-on safety course required for all crew leaders and project supervisors. e- Hazard Communication (094.1) - Federal...Hazard Communication Training Program required by 29 CFR 1910.1200. This course is required for all personnel. Those exposed to hazardous chemicals
EPOS Thematic Core Service ANTHROPOGENIC HAZARDS (TCS AH) - development of e-research platform
NASA Astrophysics Data System (ADS)
Orlecka-Sikora, Beata
2017-04-01
TCS AH is based on the IS-EPOS Platform. The Platform facilitates research on anthropogenic hazards and is available online, free of charge, at https://tcs.ah-epos.eu/. The Platform is the final product of the IS-EPOS project, funded by the national POIG programme and implemented in 2013-2015 (POIG.02.03.00-14-090/13-00). It is the result of joint work by the scientific community and industrial partners. Currently, the development of TCS AH is carried out under the EPOS IP project (H2020-INFRADEV-1-2015-1, INFRADEV-3-2015). The Platform is an open virtual access point for researchers and Ph.D. students interested in anthropogenic seismicity and related hazards. The environment is designed to give researchers the maximum possible freedom to experiment by providing a virtual laboratory in which they can design their own processing streams and process the data integrated on the platform. TCS AH integrates data and specific high-level services. Data are gathered in so-called "episodes" that comprehensively describe a geophysical process, induced or triggered by human technological activity, which under certain circumstances can become hazardous for people, infrastructure and the environment. Seven sets of seismic, geological and technological data have been made available on the Platform. The data come from Poland, Germany, the UK and Vietnam, and refer to underground mining, reservoir impoundment, shale gas exploitation and geothermal energy production. At least 19 further episodes related to conventional hydrocarbon extraction, reservoir treatment, underground mining and geothermal energy production are being integrated within the framework of the EPOS IP project. The heterogeneous multi-disciplinary data (seismic, displacement, geomechanical, production data, etc.) are transformed into unified structures to form integrated and validated datasets. To deal with these various data, problem-oriented services were designed and implemented; particular attention was devoted in service preparation to methods analyzing correlations between technology, geophysical response and the resulting hazard. TCS AH contains a number of computing and data-visualization services, which make it possible to produce graphical presentations of the available data. Further development of the Platform, in addition to the integration of new episodes covering all types of anthropogenic hazards, will gradually cover the implementation of new services. The TCS AH platform is open to the whole research community. It is also intended for use in research projects; for example, it serves the "Shale gas exploration and exploitation induced risks (SHEER)" project (Horizon 2020, call LCE 16-2014). In addition, it is meant to serve the public sector with expert knowledge and background information; to fulfil this aim, services for outreach, dissemination and communication will be implemented. TCS AH has been used as a teaching tool in Ph.D. education within the IG PAS seismology course for Ph.D. candidates and Interdisciplinary Polar Studies, as well as in several workshops for Polish and international students. Additionally, the platform is used within the educational project ERIS (Exploitation of Research results In School practice), aimed at junior high and high schools and funded with support from the European Commission within the ERASMUS+ Programme.
RiskScape: a new tool for comparing risk from natural hazards (Invited)
NASA Astrophysics Data System (ADS)
Stirling, M. W.; King, A.
2010-12-01
RiskScape is a joint venture between New Zealand's GNS Science and NIWA, and represents a comprehensive and easy-to-use tool for multi-hazard risk and impact analysis. It has basic GIS functionality, with import/export functions for use with GIS software. Five natural hazards have been implemented in RiskScape to date: flood (river), earthquake, volcanic ash, tsunami and wind storm. The software converts hazard exposure information into the likely impacts for a region, for example damage and replacement costs, casualties, economic losses, disruption, and the number of people affected. It can therefore be used to assist with risk management, land-use planning, building codes and design, risk identification, prioritization of risk reduction and mitigation, determination of "best use" risk-reduction investment, evacuation and contingency planning, awareness raising, public information, realistic scenarios for exercises, and hazard event response. Three geographically disparate pilot regions, each exposed to a different mix of natural hazards, have been used to develop and trial RiskScape in New Zealand. Future (phase II) development of RiskScape will include the following hazards: landslides (both rainfall- and earthquake-triggered), storm surge, pyroclastic flows and lahars, and climate-change effects. While RiskScape development has thus far focused on scenario-based risk, future work will advance the software towards providing probabilistic solutions.
NASA Astrophysics Data System (ADS)
mouloud, Hamidatou
2016-04-01
The objective of this paper is to analyze the seismic activity and carry out a statistical treatment of the seismicity catalog of the Constantine region between 1357 and 2014, comprising 7007 seismic events. Our research contributes to improving seismic risk management by evaluating the seismic hazard in north-east Algeria. In the present study, earthquake hazard maps for the Constantine region are calculated. Probabilistic seismic hazard analysis (PSHA) is classically performed through the Cornell approach, using a uniform earthquake distribution over the source area and a given magnitude range. This study aims at extending the PSHA approach to the case of a characteristic earthquake scenario associated with an active fault. The approach integrates PSHA with a high-frequency deterministic technique for predicting peak and spectral ground-motion parameters in a characteristic earthquake. The method is based on the site-dependent evaluation of the probability of exceedance for the chosen strong-motion parameter. We propose five seismotectonic zones. Five steps are necessary: (i) identification of potential sources of future earthquakes, (ii) assessment of their geological, geophysical and geometric characteristics, (iii) identification of the attenuation pattern of seismic motion, (iv) calculation of the hazard at a site, and finally (v) hazard mapping for the region. In this study, the earthquake hazard evaluation procedure developed by Kijko and Sellevoll (1992) is used to estimate seismic hazard parameters in the northern part of Algeria.
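As a hedged illustration of the classical Cornell-type PSHA calculation mentioned above (the source rate, magnitude distribution, distances and ground-motion model below are invented placeholders, not values from this study), the annual rate of exceedance of a peak ground acceleration level can be sketched as:

import numpy as np
from scipy.stats import norm

# Hypothetical single area source: annual rate of M >= 5 events,
# truncated Gutenberg-Richter magnitudes, and a small set of distances.
nu, b, m_min, m_max = 0.2, 1.0, 5.0, 7.0
mags = np.linspace(m_min, m_max, 21)
beta = b * np.log(10.0)
p_m = np.exp(-beta * (mags - m_min))
p_m /= p_m.sum()                          # discretised magnitude probabilities

dists_km = np.array([10.0, 20.0, 40.0])   # equally likely source-site distances
p_r = np.full(dists_km.shape, 1.0 / len(dists_km))

def gmpe_ln_pga(m, r_km, sigma=0.6):
    # Toy ground-motion model: mean ln(PGA in g) and its standard deviation.
    return -3.5 + 0.8 * m - 1.1 * np.log(r_km + 10.0), sigma

def annual_rate_exceed(a_g):
    # lambda(PGA > a) = nu * sum_m sum_r P(m) P(r) P(PGA > a | m, r)
    rate = 0.0
    for m, pm in zip(mags, p_m):
        for r, pr in zip(dists_km, p_r):
            mu, sigma = gmpe_ln_pga(m, r)
            rate += pm * pr * (1.0 - norm.cdf(np.log(a_g), mu, sigma))
    return nu * rate

for a in (0.05, 0.1, 0.2, 0.4):
    print(f"PGA > {a:.2f} g: annual rate = {annual_rate_exceed(a):.4e}")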
A rainfall risk analysis through a GIS-based estimation of urban vulnerability
NASA Astrophysics Data System (ADS)
Renard, Florent; Pierre-Marie, Chapon
2010-05-01
The urban community of Lyon, situated in France in the north of the Rhône valley, comprises 1.2 million inhabitants within 515 km². With such a concentration of assets, policy makers and local elected officials attach great importance to the management of hydrological risks, particularly because of the inherent characteristics of the territory. While the hazards associated with these risks in the territory of Lyon have been the subject of numerous analyses, studies on the vulnerability of Greater Lyon are rare and share shortcomings that impair their validity. We recall that risk is classically seen as the relationship between the probability of occurrence of hazards and vulnerability. In this article, vulnerability is composed of two parts. The first is the sensitivity of the stakes exposed to hydrological hazards such as urban runoff, that is to say their propensity to suffer damage during a flood (Gleize and Reghezza, 2007). The second is their relative importance in the functioning of the community: not all stakes play the same role or make the same contribution to Greater Lyon. For example, damage to urban furniture such as bus shelters is less harmful to the activities of the urban area than damage to transport infrastructure (Renard and Chapon, 2010). This communication proposes to assess the vulnerability of the Lyon urban area to hydrological hazards. The territory is composed of human, environmental and material stakes. The first part of this work is to identify all these stakes as exhaustively as possible. It is then necessary to build a "vulnerability index" (Tixier et al., 2006), using multicriteria decision-aid methods to evaluate the two components of vulnerability: the sensitivity and the contribution to the functioning of the community. Finally, the results of the overall vulnerability are presented and then coupled with various water-related hazards, such as runoff associated with heavy rains, to locate areas of risk in the urban area. Targets that share the same rank of this vulnerability index do not necessarily have the same importance or the same sensitivity to the flood hazard. Therefore, the second part of this work is to define the priorities and sensitivities of the different targets based on expert judgment. Multicriteria decision methods are used to prioritize elements and are therefore suited to modelling the sensitivity of the stakes of Greater Lyon (Griot, 2008). The purpose of these methods is the assessment of priorities between the different components of the situation; Thomas Saaty's analytic hierarchy process (1980) is the most frequently used because of its many advantages. On this basis, formal calculations of the priorities and sensitivities of the elements have been conducted, based on the judgments of experts. During semi-structured interviews, the 38 experts in our sample delivered, by pairwise comparison, a verdict on which stakes seem relatively more important than others; they proceeded in the same manner to determine the sensitivity of the stakes to the flood hazard. Finally, the consistency of the answers given by the experts is validated by calculating a consistency ratio, and their results are aggregated to provide priority functions (based on the relative importance of each stake) and sensitivity functions (based on the relative sensitivity of each stake). From these priority and sensitivity functions the general vulnerability function is obtained.
The vulnerability functions allow the importance of the stakes of Greater Lyon and their sensitivity to hydrological hazards to be defined. The global vulnerability function, obtained from the sensitivity and priority functions, shows the great weight of human stakes (75%). The vulnerability of environmental targets represents 12% of the global vulnerability function, as does that of material stakes. However, the environmental and material stakes do not carry the same weight in the priority and sensitivity functions: the environmental stakes appear more important than the material ones (17% versus 5% in the priority function) but less sensitive to a hydrological hazard (6% versus 20% in the sensitivity function). Similarly, priority and sensitivity functions are established for all stakes at all levels. The stakes are then converted to a 100-metre grid, which standardizes the collection framework and the heterogeneous nature of the data to allow comparison. The result is a detailed, consistent and objective vulnerability assessment of the territory of Greater Lyon. Finally, to obtain a direct reading of risk, as the combination of hazard and vulnerability, the two maps are overlaid.
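As a minimal sketch of how expert pairwise comparisons can be turned into priority weights in Saaty's analytic hierarchy process (the comparison matrix below is a made-up example, not the study's expert data), the principal-eigenvector weights and the consistency ratio can be computed as:

import numpy as np

# Hypothetical pairwise-comparison matrix for three stake categories
# (human, environmental, material) on Saaty's 1-9 scale.
A = np.array([
    [1.0, 5.0, 7.0],
    [1/5, 1.0, 3.0],
    [1/7, 1/3, 1.0],
])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)                  # index of the principal eigenvalue
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()                     # normalized priority weights

n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)         # consistency index
ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]          # random index from Saaty's table
cr = ci / ri                                 # consistency ratio (< 0.1 is acceptable)

print("weights:", np.round(weights, 3), "CR:", round(cr, 3))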
Engineering risk reduction in satellite programs
NASA Technical Reports Server (NTRS)
Dean, E. S., Jr.
1979-01-01
Methods developed in planning and executing system safety engineering programs for Lockheed satellite integration contracts are presented. These procedures establish the applicable safety design criteria, document design compliance and assess the residual risks where non-compliant design is proposed, and provide for hazard analysis of system-level test, handling and launch preparations. Operations hazard analysis identifies product protection and product liability hazards prior to the preparation of operational procedures and provides safety requirements for inclusion in them. The method developed for documenting all residual hazards for the attention of program management assures an acceptable minimum level of risk prior to program deployment. The results are significant for persons responsible for managing or engineering the deployment and production of complex, high-cost equipment who, under current product liability law and cost and schedule constraints, have a responsibility to minimize the possibility of an accident and should have documentation to provide a defense in a product liability suit.
Unmanned Aircraft Hazards and their Implications for Regulation
NASA Technical Reports Server (NTRS)
Hayhurst, Kelly J.; Maddalon, Jeffrey M.; Miner, Paul S.; DeWalt, Michael P.; McCormick, G. Frank
2006-01-01
Use of unmanned aircraft systems (UASs) has been characterized as the next great step forward in the evolution of civil aviation. Indeed, UASs are in limited civil use in the United States today, and many believe that the time is rapidly approaching when they will move into the commercial marketplace, too. To make this a reality, a number of challenges must be overcome to develop the necessary regulatory framework for assuring safe operation of this special class of aircraft. This paper discusses some of what must be done to establish that framework. In particular, we examine hazards specific to the design, operation, and flight crew of UASs, and discuss implications of these hazards for existing policy and guidance. Understanding unique characteristics of UASs that pose new hazards is essential to developing a cogent argument, and the corresponding regulatory framework, for safely integrating these aircraft into civil airspace.
75 FR 27273 - Hazardous Materials; Packages Intended for Transport by Aircraft
Federal Register 2010, 2011, 2012, 2013, 2014
2010-05-14
...PHMSA proposes to amend requirements in the Hazardous Materials Regulations to enhance the integrity of inner packagings or receptacles of combination packagings containing liquid hazardous material by ensuring they remain intact when subjected to the reduced pressure and other forces encountered in air transportation. In order to substantially decrease the likelihood of a hazardous materials release, the proposed amendments: prescribe specific test protocols and standards for determining whether an inner packaging or receptacle is capable of meeting the pressure differential requirements specified in the regulations and, consistent with the 2011-2012 edition of the International Civil Aviation Organization Technical Instructions for the Safe Transport of Dangerous Goods by Aircraft (ICAO Technical Instructions), require the closures on all inner packagings containing liquids within a combination packaging to be secured by a secondary means or, under certain circumstances, permit the use of a liner.
Volcanic ash hazards and aviation risk: Chapter 4
Guffanti, Marianne C.; Tupper, Andrew C.
2015-01-01
The risks to safe and efficient air travel from volcanic-ash hazards are well documented and widely recognized. Under the aegis of the International Civil Aviation Organization, globally coordinated mitigation procedures are in place to report explosive eruptions, detect airborne ash clouds and forecast their expected movement, and issue specialized messages to warn aircraft away from hazardous airspace. This mitigation framework is based on the integration of scientific and technical capabilities worldwide in volcanology, meteorology, and atmospheric physics and chemistry. The 2010 eruption of Eyjafjallajökull volcano in Iceland, which led to a nearly week-long shutdown of air travel into and out of Europe, has prompted the aviation industry, regulators, and scientists to work more closely together to improve how hazardous airspace is defined and communicated. Volcanic ash will continue to threaten aviation and scientific research will continue to influence the risk-mitigation framework.
Extended GTST-MLD for aerospace system safety analysis.
Guo, Chiming; Gong, Shiyu; Tan, Lin; Guo, Bo
2012-06-01
The hazards caused by complex interactions in aerospace systems have become a problem that urgently needs to be addressed. This article introduces a method for identifying aerospace system hazard interactions during the design stage, based on an extended GTST-MLD (goal tree-success tree-master logic diagram). GTST-MLD is a functional modeling framework with a simple architecture. Ontology is used to extend the ability of GTST-MLD to describe system interactions by adding system design knowledge and past accident experience. At the levels of functionality and equipment, respectively, this approach can help the technician detect potential hazard interactions. Finally, a case study is used to demonstrate the method. © 2011 Society for Risk Analysis.
The investigation of tethered satellite system dynamics
NASA Technical Reports Server (NTRS)
Lorenzini, E.
1985-01-01
Progress in tethered satellite system dynamics research is reported. A retrieval rate control law with no angular feedback was studied to investigate the system's dynamic response. The initial conditions for the computer code which simulates the satellite's rotational dynamics were extended to a generic orbit. The model of the satellite thrusters was modified to simulate a pulsed thrust by making the SKYHOOK integrator suitable for dealing with delta functions without losing computational efficiency. Tether breaks were simulated with the high-resolution computer code SLACK3. Shuttle maneuvers were tested. The electric potential around a severed conductive tether with insulator, in the case of a tether breakage at 20 km from the Shuttle, was computed. The electrodynamic hazards due to the breakage of the TSS electrodynamic tether in a plasma are evaluated.
Reliability, Safety and Error Recovery for Advanced Control Software
NASA Technical Reports Server (NTRS)
Malin, Jane T.
2003-01-01
For long-duration automated operation of regenerative life support systems in space environments, there is a need for advanced integration and control systems that are significantly more reliable and safe, and that support error recovery and minimization of operational failures. This presentation outlines some challenges of hazardous space environments and complex system interactions that can lead to system accidents. It discusses approaches to hazard analysis and error recovery for control software and challenges of supporting effective intervention by safety software and the crew.
Combining System Safety and Reliability to Ensure NASA CoNNeCT's Success
NASA Technical Reports Server (NTRS)
Havenhill, Maria; Fernandez, Rene; Zampino, Edward
2012-01-01
Hazard Analysis, Failure Modes and Effects Analysis (FMEA), the Limited-Life Items List (LLIL), and the Single Point Failure (SPF) List were applied by System Safety and Reliability engineers on NASA's Communications, Navigation, and Networking reConfigurable Testbed (CoNNeCT) Project. The integrated approach, involving cross reviews of these reports by System Safety, Reliability, and Design engineers, resulted in the mitigation of all identified hazards. The outcome was that the system met all applicable safety requirements.
2008-03-01
irregular struggle, and, finally, a protracted struggle that will last decades rather than years. how will this war Evolve? It is hazardous to...There is no downside to engagement. It is not an act of para- noia or pessimism to engage Americans in the very real hazards that confront us. It...making process about what to do next. Because it is an undisciplined process, they work through about 100 options when there are only two: duck or
2005-08-31
to the launch complex is considered a hazardous operation. Transportation of fueled payloads will comply with AFSPCMAN 91-710, Range Safety User...April. 45th Space Wing (SW). 1996b. Hazardous Materials Response Plan 32-3, Volume I, March. 45th Space Wing (SW). 2001. Integrated Natural...Final Environmental
IRIS Toxicological Review of Trichloroethylene (TCE) ...
EPA is conducting a peer review and public comment process on the scientific basis supporting the human health hazard and dose-response assessment of trichloroethylene (TCE) that, when finalized, will appear on the Integrated Risk Information System (IRIS) database. The purpose of this Toxicological Review is to provide scientific support and rationale for the hazard and dose-response assessment in IRIS pertaining to chronic exposure to trichloroethylene. It is not intended to be a comprehensive treatise on the chemical or toxicological nature of trichloroethylene.
IRIS Toxicological Review of 1,2,3-Trichloropropane (External ...
EPA conducted a peer review of the scientific basis supporting the human health hazard and dose-response assessment of 1,2,3-trichloropropane (TCP) that, once finalized, will appear on the Integrated Risk Information System (IRIS) database. Peer review is meant to ensure that science is used credibly and appropriately in the derivation of the dose-response assessments and toxicological characterization. This Toxicological Review provides scientific support and rationale for the hazard and dose-response assessment pertaining to chronic exposure to 1,2,3-trichloropropane.
Comín-Colet, Josep; Verdú-Rotellar, José María; Vela, Emili; Clèries, Montse; Bustins, Montserrat; Mendoza, Lola; Badosa, Neus; Cladellas, Mercè; Ferré, Sofía; Bruguera, Jordi
2014-04-01
The efficacy of heart failure programs has been demonstrated in clinical trials but their applicability in the real world practice setting is more controversial. This study evaluates the feasibility and efficacy of an integrated hospital-primary care program for the management of patients with heart failure in an integrated health area covering a population of 309,345. For the analysis, we included all patients consecutively admitted with heart failure as the principal diagnosis who had been discharged alive from all of the hospitals in Catalonia, Spain, from 2005 to 2011, the period when the program was implemented, and compared mortality and readmissions among patients exposed to the program with the rates in the patients of all the remaining integrated health areas of the Servei Català de la Salut (Catalan Health Service). We included 56,742 patients in the study. There were 181,204 hospital admissions and 30,712 deaths during the study period. In the adjusted analyses, when compared to the 54,659 patients from the other health areas, the 2083 patients exposed to the program had a lower risk of death (hazard ratio=0.92 [95% confidence interval, 0.86-0.97]; P=.005), a lower risk of clinically-related readmission (hazard ratio=0.71 [95% confidence interval, 0.66-0.76]; P<.001), and a lower risk of readmission for heart failure (hazard ratio=0.86 [95% confidence interval, 0.80-0.94]; P<.001). The positive impact on the morbidity and mortality rates was more marked once the program had become well established. The implementation of multidisciplinary heart failure management programs that integrate the hospital and the community is feasible and is associated with a significant reduction in patient morbidity and mortality. Copyright © 2013 Sociedad Española de Cardiología. Published by Elsevier Espana. All rights reserved.
NASA Astrophysics Data System (ADS)
Harrison, M.; Cocco, M.
2017-12-01
EPOS (European Plate Observing System) has been designed with the vision of creating a pan-European infrastructure for solid Earth science to support a safe and sustainable society. In accordance with this scientific vision, the EPOS mission is to integrate the diverse and advanced European Research Infrastructures for solid Earth science relying on new e-science opportunities to monitor and unravel the dynamic and complex Earth System. EPOS will enable innovative multidisciplinary research for a better understanding of the Earth's physical and chemical processes that control earthquakes, volcanic eruptions, ground instability and tsunami as well as the processes driving tectonics and Earth's surface dynamics. To accomplish its mission, EPOS is engaging different stakeholders, to allow the Earth sciences to open new horizons in our understanding of the planet. EPOS also aims at contributing to prepare society for geo-hazards and to responsibly manage the exploitation of geo-resources. Through integration of data, models and facilities, EPOS will allow the Earth science community to make a step change in developing new concepts and tools for key answers to scientific and socio-economic questions concerning geo-hazards and geo-resources as well as Earth sciences applications to the environment and human welfare. The research infrastructures (RIs) that EPOS is coordinating include: i) distributed geophysical observing systems (seismological and geodetic networks); ii) local observatories (including geomagnetic, near-fault and volcano observatories); iii) analytical and experimental laboratories; iv) integrated satellite data and geological information services; v) new services for natural and anthropogenic hazards; vi) access to geo-energy test beds. Here we present the activities planned for the implementation phase focusing on the TCS, the ICS and on their interoperability. We will discuss the data, data-products, software and services (DDSS) presently under implementation, which will be validated and tested during 2018. Particular attention in this talk will be given to connecting EPOS with similar global initiatives and identifying common best practice and approaches.
IRIS Toxicological Review of Propionaldehyde (External Review Draft)
EPA conducted a peer review of the scientific basis supporting the human health hazard and dose-response assessment of propionaldehyde that will appear on the Integrated Risk Information System (IRIS) database.
Xiao, Yangfan; Yi, Shanzhen; Tang, Zhongqian
2017-12-01
Flood is the most common natural hazard in the world and has caused serious loss of life and property. Assessment of flood-prone areas is of great importance for watershed management and the reduction of potential loss of life and property. In this study, a framework of multi-criteria analysis (MCA) incorporating a geographic information system (GIS), the fuzzy analytic hierarchy process (AHP) and the spatial ordered weighted averaging (OWA) method was developed for flood hazard assessment. The factors associated with the geographical, hydrological and flood-resistance characteristics of the basin were selected as evaluation criteria. The relative importance of the criteria was estimated through the fuzzy AHP method. The OWA method was utilized to analyze the effects of different risk attitudes of the decision maker on the assessment result. The spatial ordered weighted averaging method with spatially variable risk preference was implemented in the GIS environment to integrate the criteria; the advantage of the proposed method is that it considers spatial heterogeneity in assigning risk preference in the decision-making process. The presented methodology has been applied to the area including Hanyang, Caidian and Hannan of Wuhan, China, where flood events occur frequently. The resulting flood hazard distribution shows a tendency toward high risk in populated and developed areas, especially the northeast part of Hanyang city, which has suffered frequent floods in history. The result indicates where enhancement projects should be carried out first under the condition of limited resources. Finally, the sensitivity of the criteria weights was analyzed to measure the stability of the results with respect to variation of the criteria weights. The flood hazard assessment method presented in this paper is adaptable to hazard assessment of similar basins and is of great significance for establishing countermeasures to mitigate losses of life and property. Copyright © 2017 Elsevier B.V. All rights reserved.
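As a hedged sketch of the ordered weighted averaging step described above (the criterion scores, criterion weights and order weights are invented for illustration and simplified relative to the study's spatially variable formulation), OWA aggregation at a single grid cell can be written as:

import numpy as np

def owa(values, criterion_weights, order_weights):
    # Ordered weighted average of standardized criterion scores at one cell.
    # criterion_weights capture criterion importance (e.g. from fuzzy AHP);
    # order_weights encode the decision maker's attitude and are applied to
    # the weighted scores after sorting them in descending order.
    v = np.asarray(values, dtype=float) * np.asarray(criterion_weights, dtype=float)
    v_sorted = np.sort(v)[::-1]                     # descending order
    w = np.asarray(order_weights, dtype=float)
    return float(np.dot(v_sorted, w / w.sum()))

# Hypothetical standardized scores for elevation, slope, distance to river,
# and land use at one cell, with AHP criterion weights and order weights that
# emphasize the largest weighted scores (a cautious, hazard-emphasizing attitude).
scores = [0.8, 0.3, 0.9, 0.5]
c_wts  = [0.35, 0.15, 0.30, 0.20]
o_wts  = [0.5, 0.3, 0.15, 0.05]
print(f"OWA flood-hazard score: {owa(scores, c_wts, o_wts):.3f}")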
Protection of agriculture against drought in Slovenia based on vulnerability and risk assessment
NASA Astrophysics Data System (ADS)
Dovžak, M.; Stanič, S.; Bergant, K.; Gregorič, G.
2012-04-01
Past and recent extreme events, like earthquakes, extreme droughts, heat waves, flash floods and volcanic eruptions, continuously remind us that natural hazards are an integral component of the global environment. Despite rapid improvements in detection techniques, many of these events evade long-term or even mid-term prediction and can thus have disastrous impacts on affected communities and the environment. Effective mitigation and preparedness strategies can be developed only after gaining an understanding of how and where such hazards may occur, what causes them, what circumstances increase their severity, and what their impacts may be; in recent years their study has emerged under the common title of natural hazard management. The first step in natural risk management is risk identification, which includes hazard analysis and monitoring, vulnerability analysis and determination of the risk level. The presented research focuses on drought, which is at present the most widespread and still largely unpredictable natural hazard. Its primary aims were to assess the frequency and the consequences of droughts in Slovenia based on past drought events, to develop a methodology for drought vulnerability and risk assessment that can be applied in Slovenia and more widely in South-Eastern Europe, to prepare maps of drought risk and crop vulnerability, and to provide guidelines to reduce the vulnerability of the crops. Using the amount of plant-available water in the soil, slope inclination, solar radiation, land use and irrigation infrastructure data sets as inputs, we obtained vulnerability maps for Slovenia using GIS-based multi-criteria decision analysis with a weighted linear combination of the input parameters. The weight configuration was optimized by comparing the modelled crop damage to the assessed actual damage, which was available for the extensive drought of 2006. Drought risk was obtained quantitatively as a function of hazard and vulnerability and presented, like the vulnerability, as a GIS-based map. Risk maps show the geographic regions in Slovenia where droughts pose a major threat to agriculture and, together with the vulnerability maps, provide the basis for drought management, in particular for appropriate mitigation and response actions in specific regions. The developed methodology is expected to be applied to the entire region of South-Eastern Europe within the initiative of the Drought Management Centre for Southeastern Europe.
Ahmadi, Azadeh; Roudbari, Masoud; Gohari, Mahmood Reza; Hosseini, Bistoon
2012-01-01
Increases in mortality rates from gastric cancer in Iran and worldwide in recent years reveal the necessity of studies on this disease. Here, the hazard function for gastric cancer patients was estimated using wavelet and kernel methods and some related factors were assessed. Ninety-five gastric cancer patients treated in Fayazbakhsh Hospital between 1996 and 2003 were studied. The effects of patient age, gender, stage of disease and treatment method on patient lifetime were assessed. For data analysis, survival analysis using the wavelet method and the log-rank test in R software was used. Nearly 25.3% of patients were female. Fourteen percent had surgical treatment and the rest had treatment without surgery. Three fourths died and the rest were censored. Almost 9.5% of patients were in early stages of the disease, 53.7% in locally advanced stage and 36.8% in metastatic stage. Hazard function estimation with the wavelet method showed a significant difference between stages of disease (P<0.001) and did not reveal any significant difference for age, gender or treatment method. Only stage of disease had an effect on hazard, and most patients were diagnosed in late stages of the disease, which is possibly one of the main reasons for the high hazard rate and low survival. Therefore, public education about the symptoms of the disease through the media, together with regular tests and screening for early diagnosis, seems necessary.
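A minimal sketch of one common nonparametric route to a smoothed hazard estimate, kernel smoothing of the Nelson-Aalen cumulative hazard with the lifelines library (the durations and event flags below are hypothetical, not the study's data, and the study itself used wavelet and kernel estimators in R):

import numpy as np
from lifelines import NelsonAalenFitter

# Hypothetical survival times (months) and death indicators (1 = died, 0 = censored).
durations = np.array([4, 7, 9, 12, 15, 18, 22, 30, 36, 48])
events    = np.array([1, 1, 0, 1, 1, 0, 1, 1, 0, 1])

naf = NelsonAalenFitter()
naf.fit(durations, event_observed=events)

# Kernel-smoothed hazard rate; the bandwidth (in months) controls smoothness.
smoothed = naf.smoothed_hazard_(bandwidth=6.0)
print(smoothed.head())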
NASA Astrophysics Data System (ADS)
Huang, H. H.; Hsu, Y. J.; Kuo, C. Y.; Chen, C. C.; Kuo, L. W.; Chen, R. F.; Lin, C. R.; Lin, P. P.; Lin, C. W.; Lin, M. L.; Wang, K. L.
2017-12-01
A unique landslide monitoring project integrating multidisciplinary geophysical experiments such as GPS, inclinometer, piezometer, and spontaneous potential logging has been established at Lantai, in the Ilan area, to investigate the possible detachment depth range and the physical mechanism of a slowly creeping landslide. In parallel, a recently deployed local seismic network offers an opportunity to employ passive seismic imaging to detect time-lapse changes of seismic velocity in and around the landslide area. This technique, which retrieves Green's functions by cross-correlation of continuous ambient noise, has in recent years opened new opportunities for seismological monitoring of environmental and tectonic processes such as groundwater variation, magma intrusion under volcanoes, and co-seismic medium damage. Integrating these geophysical observations, we explore the primary controls on the derived seismic velocity changes and especially the hydrological response of the landslide to the passage of Typhoon Megi in September 2016, which could further our understanding of the dynamic system of landslides and in turn help hazard mitigation.
Fiber-Optic Surface Temperature Sensor Based on Modal Interference.
Musin, Frédéric; Mégret, Patrice; Wuilpart, Marc
2016-07-28
Spatially-integrated surface temperature sensing is highly useful when it comes to controlling processes, detecting hazardous conditions or monitoring the health and safety of equipment and people. Fiber-optic sensing based on modal interference has shown great sensitivity to temperature variation, by means of cost-effective image-processing of few-mode interference patterns. New developments in the field of sensor configuration, as described in this paper, include an innovative cooling and heating phase discrimination functionality and more precise measurements, based entirely on the image processing of interference patterns. The proposed technique was applied to the measurement of the integrated surface temperature of a hollow cylinder and compared with a conventional measurement system, consisting of an infrared camera and precision temperature probe. As a result, the optical technique is in line with the reference system. Compared with conventional surface temperature probes, the optical technique has the following advantages: low heat capacity temperature measurement errors, easier spatial deployment, and replacement of multiple angle infrared camera shooting and the continuous monitoring of surfaces that are not visually accessible.
2012-01-01
Background We explore the benefits of applying a new proportional hazard model to analyze survival of breast cancer patients. As a parametric model, the hypertabastic survival model offers a closer fit to experimental data than Cox regression, and furthermore provides explicit survival and hazard functions which can be used as additional tools in the survival analysis. In addition, one of our main concerns is utilization of multiple gene expression variables. Our analysis treats the important issue of interaction of different gene signatures in the survival analysis. Methods The hypertabastic proportional hazards model was applied in survival analysis of breast cancer patients. This model was compared, using statistical measures of goodness of fit, with models based on the semi-parametric Cox proportional hazards model and the parametric log-logistic and Weibull models. The explicit functions for hazard and survival were then used to analyze the dynamic behavior of hazard and survival functions. Results The hypertabastic model provided the best fit among all the models considered. Use of multiple gene expression variables also provided a considerable improvement in the goodness of fit of the model, as compared to use of only one. By utilizing the explicit survival and hazard functions provided by the model, we were able to determine the magnitude of the maximum rate of increase in hazard, and the maximum rate of decrease in survival, as well as the times when these occurred. We explore the influence of each gene expression variable on these extrema. Furthermore, in the cases of continuous gene expression variables, represented by a measure of correlation, we were able to investigate the dynamics with respect to changes in gene expression. Conclusions We observed that use of three different gene signatures in the model provided a greater combined effect and allowed us to assess the relative importance of each in determination of outcome in this data set. These results point to the potential to combine gene signatures to a greater effect in cases where each gene signature represents some distinct aspect of the cancer biology. Furthermore we conclude that the hypertabastic survival models can be an effective survival analysis tool for breast cancer patients. PMID:23241496
Assessing Natural Hazard Vulnerability Through Marmara Region Using GIS
NASA Astrophysics Data System (ADS)
Sabuncu, A.; Garagon Dogru, A.; Ozener, H.
2013-12-01
Natural hazards are natural phenomena occurring in the Earth system that include geological and meteorological events such as earthquakes, floods, landslides, droughts, fires and tsunamis. Metropolitan cities are vulnerable to natural hazards due to their population densities, industrial facilities and properties. The urban layout of megacities is complex, since industrial facilities are intermixed with residential areas. The Marmara region, located in north-western Turkey, has suffered from natural hazards (earthquakes, floods, etc.) for years. After the 1999 Kocaeli and Duzce earthquakes and the 2009 Istanbul flash floods, dramatic numbers of casualties and economic losses were reported by the authorities. Geographic information systems (GIS) have substantial capacity to support natural disaster management, as they provide more efficient and reliable analysis and evaluation of the data involved, as well as more convenient and better solutions for decision making before, during and after natural hazards. Earth science data and socio-economic data can be integrated into a GIS as different layers, and satellite data are used to understand the changes before and after natural hazards. GIS is powerful software for combining different types of digital data. A natural hazard database for the Marmara region provides all these different types of digital data to users. Proper data collection, processing and analysis are critical to evaluating and identifying hazards. The natural hazard database allows users to monitor, analyze and query past and recent disasters in the Marmara region. The long-term aim of this study is to develop the geodatabase and identify the natural hazard vulnerabilities of the metropolitan cities.
Risk factors for hazardous events in olfactory-impaired patients.
Pence, Taylor S; Reiter, Evan R; DiNardo, Laurence J; Costanzo, Richard M
2014-10-01
Normal olfaction provides essential cues to allow early detection and avoidance of potentially hazardous situations. Thus, patients with impaired olfaction may be at increased risk of experiencing certain hazardous events such as cooking or house fires, delayed detection of gas leaks, and exposure to or ingestion of toxic substances. To identify risk factors and potential trends over time in olfactory-related hazardous events in patients with impaired olfactory function. Retrospective cohort study of 1047 patients presenting to a university smell and taste clinic between 1983 and 2013. A total of 704 patients had both clinical olfactory testing and a hazard interview and were studied. On the basis of olfactory function testing results, patients were categorized as normosmic (n = 161), mildly hyposmic (n = 99), moderately hyposmic (n = 93), severely hyposmic (n = 142), and anosmic (n = 209). Patient evaluation including interview, examination, and olfactory testing. Incidence of specific olfaction-related hazardous events (ie, burning pots and/or pans, starting a fire while cooking, inability to detect gas leaks, inability to detect smoke, and ingestion of toxic substances or spoiled foods) by degree of olfactory impairment. The incidence of having experienced any hazardous event progressively increased with degree of impairment: normosmic (18.0%), mildly hyposmic (22.2%), moderately hyposmic (31.2%), severely hyposmic (32.4%), and anosmic (39.2%). Over 3 decades there was no significant change in the overall incidence of hazardous events. Analysis of demographic data (age, sex, race, smoking status, and etiology) revealed significant differences in the incidence of hazardous events based on age (among 397 patients <65 years, 148 [37.3%] with hazardous event, vs 31 of 146 patients ≥65 years [21.3%]; P < .001), sex (among 278 women, 106 [38.1%] with hazardous event, vs 73 of 265 men [27.6%]; P = .009), and race (among 98 African Americans, 41 [41.8%] with hazardous event, vs 134 of 434 whites [30.9%]; P = .04). Increased level of olfactory impairment portends an increased risk of experiencing a hazardous event. Risk is further impacted by individuals' age, sex, and race. These results may assist health care practitioners in counseling patients on the risks associated with olfactory impairment.
NASA Astrophysics Data System (ADS)
Ghosh, Kapil; De, Sunil Kumar
2017-04-01
Successful landslide management plans and policies depend on in-depth knowledge of the hazard and the associated risk. Thus, the present research presents an integrated approach using geospatial technologies for landslide hazard and risk assessment at different scales (site-specific to regional level). The landslide hazard map at the regional scale (district level) is prepared using a weight-rating based method. To analyze landslide manifestation in the Dhalai district of Tripura, different causative factor maps (lithology, road buffer, slope, relative relief, rainfall, fault buffer, land use/land cover and drainage density) are derived. The analysis revealed that geological structure and human interference have more influence than the other considered factors on landslide occurrences. The landslide susceptibility zonation map shows that about 1.64 and 16.68% of the total study area fall under very high and high susceptibility zones, respectively. The landslide risk assessment at the district level is generated by integrating hazard scoring and resource damage potential scoring (fuzzy membership values) maps. The values of the landslide risk matrix vary within the range of 0.001 to 0.18, and the risk assessment map shows that only 0.45% (10.80 km2) of the district is under the very high risk zone, whereas about 50% of the pixels of the existing road sections are under very high to high levels of landslide risk. The major part (94.06%) of the district is under very low to low risk. Landslide hazard and risk assessment at the site-specific level has been carried out through intensive field investigation, which found that the Ambassa landslide is located within the 150 m buffer zone of a fault line. Variation of geo-electrical resistivity (2.2 Ωm to 31.4 Ωm) indicates the complex geological character of this area. Based on the geo-technical results, which help identify the degree of risk to the existing resources, it is appropriate to implement management measures such as construction of sub-surface drainage, extension of retaining walls, and cutting/filling of slopes in a scientific manner. Keywords: landslide, hazard, risk, fuzzy set theory
NASA Astrophysics Data System (ADS)
Torres Morales, G. F.; Dávalos Sotelo, R.; Castillo Aguilar, S.; Mora González, I.; Lermo Samaniego, J. F.; Rodriguez, M.; García Martínez, J.; Suárez, M. Leonardo; Hernández Juan, F.
2013-05-01
This paper presents the results of microzonation of natural hazards for different metropolitan areas and highlights the importance of integrating these results into urban planning. The cities covered for the definition of hazard in the state of Veracruz are Orizaba, Veracruz and Xalapa, as part of the production of a Geological and Hydrometeorological Hazards Atlas for the state of Veracruz, financed by the Fund for the Prevention of Natural Disasters (FOPREDEN) and CONACYT. The general data for each metropolitan area were integrated in a geographic information system (GIS), producing various thematic maps and maps of the dynamic characteristics of soils in each metropolitan area. For the planning of an urban area that aspires to promote sustainable development, it is essential to have detailed, pertinent information, the most important of which concerns the degree of exposure to natural phenomena. In general, microzonation investigations consider all natural phenomena that could potentially affect an area of interest, and hazard maps for each of the potential hazards are prepared. With all the data collected, generated and fed into the GIS, models were generated which define the areas most threatened by earthquakes, floods and slope landslides. These results were compared with maps of the main features of the urban zones, and a qualitative classification of areas from high to low hazard was established. This provides the basic elements of information for urban planning and land use. This information will be made available to the authorities and the general public through an Internet portal where people can download and view maps using free software available online.
Thompson, David A; Marsteller, Jill A; Pronovost, Peter J; Gurses, Ayse; Lubomski, Lisa H; Goeschel, Christine A; Gosbee, John W; Wahr, Joyce; Martinez, Elizabeth A
2015-09-01
The objectives were to develop a scientifically sound and feasible peer-to-peer assessment model that allows health-care organizations to evaluate patient safety in cardiovascular operating rooms, and to establish safety priorities for improvement. The Locating Errors through Networked Surveillance study was conducted to identify hazards in cardiac surgical care. A multidisciplinary team, composed of organizational sociology, organizational psychology, applied social psychology, clinical medicine, human factors engineering, and health services researchers, conducted the study. We used a transdisciplinary approach, which integrated the theories, concepts, and methods from each discipline, to develop comprehensive research methods. Multiple data collection methods were involved: a focused literature review of cardiac surgery-related adverse events, retrospective analysis of cardiovascular events from a national database in the United Kingdom, and prospective peer assessment at 5 sites, involving survey assessments, structured interviews, direct observations, and contextual inquiries. A nominal group methodology, in which a single group acts to problem-solve and make decisions, was used to review the data and develop a list of the top-priority hazards. The top 6 priority hazard themes were as follows: safety culture, teamwork and communication, infection prevention, transitions of care, failure to adhere to practices or policies, and operating room layout and equipment. We integrated the theories and methods of a diverse group of researchers to identify a broad range of hazards and good clinical practices within the cardiovascular surgical operating room. Our findings were the basis for a plan to prioritize improvements in cardiac surgical care. These study methods allowed for the comprehensive assessment of a high-risk clinical setting and may translate to other clinical settings.
NASA Astrophysics Data System (ADS)
Jebur, M. N.; Pradhan, B.; Shafri, H. Z. M.; Yusof, Z.; Tehrany, M. S.
2014-10-01
Modeling and classification difficulties are fundamental issues in natural hazard assessment. A geographic information system (GIS) is a domain that requires users to use various tools to perform different types of spatial modeling. Bivariate statistical analysis (BSA) assists in hazard modeling. To perform this analysis, several calculations are required and the user has to transfer data from one format to another. Most researchers perform these calculations manually by using Microsoft Excel or other programs. This process is time consuming and carries a degree of uncertainty. The lack of proper tools to implement BSA in a GIS environment prompted this study. In this paper, a user-friendly tool, BSM (bivariate statistical modeler), for BSA technique is proposed. Three popular BSA techniques such as frequency ratio, weights-of-evidence, and evidential belief function models are applied in the newly proposed ArcMAP tool. This tool is programmed in Python and is created by a simple graphical user interface, which facilitates the improvement of model performance. The proposed tool implements BSA automatically, thus allowing numerous variables to be examined. To validate the capability and accuracy of this program, a pilot test area in Malaysia is selected and all three models are tested by using the proposed program. Area under curve is used to measure the success rate and prediction rate. Results demonstrate that the proposed program executes BSA with reasonable accuracy. The proposed BSA tool can be used in numerous applications, such as natural hazard, mineral potential, hydrological, and other engineering and environmental applications.
NASA Astrophysics Data System (ADS)
Jebur, M. N.; Pradhan, B.; Shafri, H. Z. M.; Yusoff, Z. M.; Tehrany, M. S.
2015-03-01
Modelling and classification difficulties are fundamental issues in natural hazard assessment. A geographic information system (GIS) is a domain that requires users to use various tools to perform different types of spatial modelling. Bivariate statistical analysis (BSA) assists in hazard modelling. To perform this analysis, several calculations are required and the user has to transfer data from one format to another. Most researchers perform these calculations manually by using Microsoft Excel or other programs. This process is time-consuming and carries a degree of uncertainty. The lack of proper tools to implement BSA in a GIS environment prompted this study. In this paper, a user-friendly tool, bivariate statistical modeler (BSM), for BSA technique is proposed. Three popular BSA techniques, such as frequency ratio, weight-of-evidence (WoE), and evidential belief function (EBF) models, are applied in the newly proposed ArcMAP tool. This tool is programmed in Python and created by a simple graphical user interface (GUI), which facilitates the improvement of model performance. The proposed tool implements BSA automatically, thus allowing numerous variables to be examined. To validate the capability and accuracy of this program, a pilot test area in Malaysia is selected and all three models are tested by using the proposed program. Area under curve (AUC) is used to measure the success rate and prediction rate. Results demonstrate that the proposed program executes BSA with reasonable accuracy. The proposed BSA tool can be used in numerous applications, such as natural hazard, mineral potential, hydrological, and other engineering and environmental applications.
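As a hedged sketch of the frequency ratio model that such a tool automates (the factor classes, pixel counts and column names below are invented for illustration), the FR of each factor class is the ratio of the event-occurrence percentage to the class-area percentage:

import pandas as pd

# Hypothetical cross-tabulation of one causative factor (slope class)
# against mapped hazard occurrences, in pixel counts.
tbl = pd.DataFrame({
    "slope_class":  ["0-10", "10-20", "20-30", ">30"],
    "class_pixels": [50000, 30000, 15000, 5000],
    "event_pixels": [100,   300,   450,   150],
})

pct_events = tbl["event_pixels"] / tbl["event_pixels"].sum()
pct_area   = tbl["class_pixels"] / tbl["class_pixels"].sum()
tbl["frequency_ratio"] = pct_events / pct_area   # FR > 1 indicates higher susceptibility

print(tbl)
# The susceptibility index of a cell is then the sum of the FR values of the
# classes it falls into, one per causative factor.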
Ensemble of ground subsidence hazard maps using fuzzy logic
NASA Astrophysics Data System (ADS)
Park, Inhye; Lee, Jiyeong; Saro, Lee
2014-06-01
Hazard maps of ground subsidence around abandoned underground coal mines (AUCMs) in Samcheok, Korea, were constructed using fuzzy ensemble techniques and a geographical information system (GIS). To evaluate the factors related to ground subsidence, a spatial database was constructed from topographic, geologic, mine tunnel, land use, groundwater, and ground subsidence maps. Spatial data, topography, geology, and various ground-engineering data for the subsidence area were collected and compiled in a database for mapping ground-subsidence hazard (GSH). The subsidence area was randomly split 70/30 for training and validation of the models. The relationships between the detected ground-subsidence area and the factors were identified and quantified by frequency ratio (FR), logistic regression (LR) and artificial neural network (ANN) models. The relationships were used as factor ratings in the overlay analysis to create ground-subsidence hazard indexes and maps. The three GSH maps were then used as new input factors and integrated using fuzzy-ensemble methods to make better hazard maps. All of the hazard maps were validated by comparison with known subsidence areas that were not used directly in the analysis. As a result, the ensemble model was found to be more effective in terms of prediction accuracy than the individual models.
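A minimal sketch of the kind of fuzzy combination used to ensemble several hazard-index maps (the fuzzy gamma operator and the normalized index values below are illustrative assumptions, not necessarily the study's exact operators or data):

import numpy as np

def fuzzy_gamma(memberships, gamma=0.9):
    # Fuzzy gamma operator: a compromise between the fuzzy algebraic
    # product and the fuzzy algebraic sum of membership values in [0, 1].
    m = np.asarray(memberships, dtype=float)
    fuzzy_product = np.prod(m, axis=0)
    fuzzy_sum = 1.0 - np.prod(1.0 - m, axis=0)
    return (fuzzy_sum ** gamma) * (fuzzy_product ** (1.0 - gamma))

# Hypothetical normalized hazard indexes at the same three cells from the
# frequency-ratio, logistic-regression and neural-network models.
fr_map  = np.array([0.62, 0.15, 0.88])
lr_map  = np.array([0.55, 0.20, 0.91])
ann_map = np.array([0.70, 0.10, 0.80])

ensemble = fuzzy_gamma([fr_map, lr_map, ann_map], gamma=0.9)
print(np.round(ensemble, 3))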
Advanced Environmental Monitoring and Control Program: Strategic Plan
NASA Technical Reports Server (NTRS)
Schmidt, Gregory
1996-01-01
Human missions in space, from short-duration shuttle missions lasting no more than several days to the medium- and long-duration missions planned for the International Space Station, face a number of hazards that must be understood and mitigated for the mission to be carried out safely. Among these are the hazards posed by the internal environment of the spacecraft itself: outgassing of toxic vapors from plastics and other items, failures or off-nominal operation of spacecraft environmental control systems, and accidental exposure to hazardous compounds used in experiments all present potential hazards that, while individually small, may accumulate and pose a danger to crew health. The first step toward mitigating these hazards is understanding the internal environment of the spacecraft and the compounds contained within it. Future spacecraft will have integrated networks of redundant sensors which will not only inform the crew of hazards, but will pinpoint the problem location and, through analysis by intelligent systems, recommend and even implement a course of action to stop the problem. This strategic plan details strategies to determine NASA's requirements for environmental monitoring and control systems for future spacecraft, along with goals and objectives for a program to answer these needs.
NASA Astrophysics Data System (ADS)
Arnaud, G.; Krien, Y.; Zahibo, N.; Dudon, B.
2017-12-01
Coastal hazards are among the most worrying threats of our time. In a context of climate change coupled with a large population increase, tropical areas could be the most exposed zones of the globe. In such circumstances, understanding the underlying processes can help to better predict storm surges and the associated global risks. Here we present partial preliminary results from a multidisciplinary project focused on the effects of climate change on coastal threats in the French West Indies, funded by the European Regional Development Fund. The study aims to provide a coastal hazard assessment based on hurricane surge and tsunami modeling, including several aspects of climate change that can affect hazards, such as sea level rise, crustal subsidence/uplift, and coastline changes. Several tsunami scenarios have been simulated, including tele-tsunamis, to cover a wide range of tsunami hazards. Hurricane surge levels have been calculated using a large number of synthetic hurricanes covering the present and projected climate over the tropical Atlantic Ocean. This hazard assessment will later be coupled with the stakes assessed over the territory to provide risk maps.
NASA Astrophysics Data System (ADS)
Harper, Bryan; Thomas, Dennis; Chikkagoudar, Satish; Baker, Nathan; Tang, Kaizhi; Heredia-Langner, Alejandro; Lins, Roberto; Harper, Stacey
2015-06-01
The integration of rapid assays, large datasets, informatics, and modeling can overcome current barriers in understanding nanomaterial structure-toxicity relationships by providing a weight-of-the-evidence mechanism to generate hazard rankings for nanomaterials. Here, we present the use of a rapid, low-cost assay to perform screening-level toxicity evaluations of nanomaterials in vivo. Calculated EZ Metric scores, a combined measure of morbidity and mortality in developing embryonic zebrafish, were established at realistic exposure levels and used to develop a hazard ranking of diverse nanomaterial toxicity. Hazard ranking and clustering analysis of 68 diverse nanomaterials revealed distinct patterns of toxicity related to both the core composition and outermost surface chemistry of nanomaterials. The resulting clusters guided the development of a surface chemistry-based model of gold nanoparticle toxicity. Our findings suggest that risk assessments based on the size and core composition of nanomaterials alone may be wholly inappropriate, especially when considering complex engineered nanomaterials. Research should continue to focus on methodologies for determining nanomaterial hazard based on multiple sub-lethal responses following realistic, low-dose exposures, thus increasing the availability of quantitative measures of nanomaterial hazard to support the development of nanoparticle structure-activity relationships.
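A minimal sketch of the ranking-and-clustering step described above, assuming each nanomaterial already has a set of EZ Metric-style endpoint scores. The material names, scores, Ward linkage, and number of clusters are all illustrative choices, not the study's actual analysis.

```python
# Sketch of hazard ranking plus hierarchical clustering of nanomaterials
# based on hypothetical per-endpoint toxicity scores.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

materials = ["Au-citrate", "Au-PEG", "Ag-bare", "CuO", "SiO2"]
scores = np.array([          # rows: materials, cols: morbidity/mortality endpoints
    [0.1, 0.2, 0.1],
    [0.0, 0.1, 0.1],
    [0.7, 0.8, 0.6],
    [0.6, 0.5, 0.7],
    [0.2, 0.1, 0.2],
])
hazard_rank = np.argsort(-scores.sum(axis=1))          # highest aggregate score first
clusters = fcluster(linkage(scores, method="ward"), t=2, criterion="maxclust")
for i in hazard_rank:
    print(f"{materials[i]:10s} total={scores[i].sum():.2f} cluster={clusters[i]}")
```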
HMPT: Basic Radioactive Material Transportation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hypes, Philip A.
2016-02-29
Hazardous Material Packaging and Transportation (HMPT): Basic Radioactive Material Transportation Live (#30462, suggested one time) and Test (#30463, required initially and every 36 months) address the Department of Transportation's (DOT's) function-specific training requirements [required for hazardous material (HAZMAT) handlers, packagers, and shippers] within the HMPT Los Alamos National Laboratory (LANL) Labwide training. This course meets the requirements of 49 CFR 172, Subpart H, Section 172.704(a)(ii), Function-Specific Training.
Using hazard functions to assess changes in processing capacity in an attentional cuing paradigm.
Wenger, Michael J; Gibson, Bradley S
2004-08-01
Processing capacity--defined as the relative ability to perform mental work in a unit of time--is a critical construct in cognitive psychology and is central to theories of visual attention. The unambiguous use of the construct, experimentally and theoretically, has been hindered by both conceptual confusions and the use of measures that are at best only coarsely mapped to the construct. However, more than 25 years ago, J. T. Townsend and F. G. Ashby (1978) suggested that the hazard function on the response time (RT) distribution offered a number of conceptual advantages as a measure of capacity. The present study suggests that a set of statistical techniques, well-known outside the cognitive and perceptual literatures, offers the ability to perform hypothesis tests on RT-distribution hazard functions. These techniques are introduced, and their use is illustrated in application to data from the contingent attentional capture paradigm.
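For readers unfamiliar with the construct, the hazard function of a response-time (RT) distribution is h(t) = f(t) / (1 - F(t)), the instantaneous likelihood of completing processing at time t given that processing has not yet finished. The sketch below computes a simple binned empirical estimate; the RTs are simulated, and the estimator is a generic one rather than the specific statistical techniques advocated in the paper.

```python
# Sketch of a binned empirical hazard estimate h(t) = f(t) / S(t) for RT data.
# The RT sample is simulated and purely illustrative.
import numpy as np

rng = np.random.default_rng(2)
rt = rng.gamma(shape=5.0, scale=80.0, size=2000)   # hypothetical RTs in ms

def empirical_hazard(samples, bins=40):
    counts, edges = np.histogram(samples, bins=bins)
    width = np.diff(edges)
    n = samples.size
    density = counts / (n * width)                     # f(t) per bin
    survival = 1.0 - np.cumsum(counts) / n             # S(t) at right bin edges
    at_risk = np.concatenate(([1.0], survival[:-1]))   # S(t) at left bin edges
    centers = (edges[:-1] + edges[1:]) / 2
    return centers, density / np.maximum(at_risk, 1e-12)

t, h = empirical_hazard(rt)
```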
Tahir, M Ramzan; Tran, Quang X; Nikulin, Mikhail S
2017-05-30
We studied the problem of testing a hypothesized distribution in survival regression models when the data are right censored and survival times are influenced by covariates. A modified chi-squared type test, known as the Nikulin-Rao-Robson statistic, is applied for the comparison of accelerated failure time models. This statistic is used to test the goodness-of-fit of the hypertabastic survival model and four other unimodal hazard rate functions. The results of the simulation study showed that the hypertabastic distribution can be used as an alternative to the log-logistic and log-normal distributions. In statistical modeling, because of its flexible hazard function shapes, this distribution can also be used as a competitor of the Birnbaum-Saunders and inverse Gaussian distributions. The results for the real data application are shown. Copyright © 2017 John Wiley & Sons, Ltd.
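As a rough illustration of comparing parametric survival fits on right-censored data, the sketch below fits the log-logistic and log-normal alternatives mentioned above with the lifelines library and compares them by AIC. lifelines does not provide the hypertabastic distribution or the Nikulin-Rao-Robson statistic, so this is only an analogous workflow on simulated data, not the paper's procedure.

```python
# Sketch: fit two parametric survival models to right-censored data and
# compare by AIC. Data are simulated; no hypertabastic fit is attempted here.
import numpy as np
from lifelines import LogLogisticFitter, LogNormalFitter

rng = np.random.default_rng(3)
true_t = rng.lognormal(mean=3.0, sigma=0.6, size=300)
censor_t = rng.uniform(10, 60, size=300)
durations = np.minimum(true_t, censor_t)
observed = true_t <= censor_t                      # right-censoring indicator

for fitter in (LogLogisticFitter(), LogNormalFitter()):
    fitter.fit(durations, event_observed=observed)
    print(type(fitter).__name__, "AIC =", round(fitter.AIC_, 1))
```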
NASA Astrophysics Data System (ADS)
Grose, Vernon L.
1985-12-01
The progress of technology is marked by fragmentation -- dividing research and development into ever narrower fields of specialization. Ultimately, specialists know everything about nothing. And hope for integrating those slender slivers of specialty into a whole fades. Without an integrated, all-encompassing perspective, technology becomes applied in a lopsided and often inefficient manner. A decisionary model, developed and applied for NASA's Chief Engineer toward establishment of commercial space operations, can be adapted to the identification, evaluation, and selection of optimum application of artificial intelligence for space station automation -- restoring wholeness to a situation that is otherwise chaotic due to increasing subdivision of effort. Issues such as functional assignments for space station task, domain, and symptom modules can be resolved in a manner understood by all parties rather than just the person with assigned responsibility -- and ranked by overall significance to mission accomplishment. Ranking is based on the three basic parameters of cost, performance, and schedule. This approach has successfully integrated many diverse specialties in situations like worldwide terrorism control, coal mining safety, medical malpractice risk, grain elevator explosion prevention, offshore drilling hazards, and criminal justice resource allocation -- all of which would have otherwise been subject to "squeaky wheel" emphasis and support of decision-makers.
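A toy sketch of the kind of weighted cost/performance/schedule ranking such a decisionary model produces is shown below; the candidate modules, scores, and weights are entirely hypothetical and do not reproduce the model developed for NASA.

```python
# Sketch of ranking candidate automation functions by weighted cost,
# performance, and schedule scores (higher is better). All values invented.
candidates = {
    "fault diagnosis module":    {"cost": 6, "performance": 9, "schedule": 5},
    "task scheduling module":    {"cost": 7, "performance": 6, "schedule": 8},
    "symptom monitoring module": {"cost": 8, "performance": 5, "schedule": 7},
}
weights = {"cost": 0.3, "performance": 0.5, "schedule": 0.2}

def weighted_score(scores):
    return sum(weights[k] * v for k, v in scores.items())

for name, scores in sorted(candidates.items(), key=lambda kv: weighted_score(kv[1]), reverse=True):
    print(f"{name}: {weighted_score(scores):.1f}")
```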
NASA Astrophysics Data System (ADS)
Lindquist, Eric; Pierce, Jen; Wuerzer, Thomas; Glenn, Nancy; Dialani, Jijay; Gibble, Katie; Frazier, Tim; Strand, Eva
2015-04-01
The stages of planning for and responding to natural hazards, such as wildfires and related events, are often conducted as discrete (and often unconnected) efforts. Disaster response often takes precedence, exhausting agency and stakeholder resources, and the planning stages are conducted by different agencies or entities with different and often competing agendas and jurisdictions. The result is that evaluation after a disaster can be minimal or even non-existent as resources are expended and interest moves on to the next event. Natural disasters and hazards, however, have a tendency to cascade and multiply: wildfires increase the vulnerability of hillslopes, for example, which may result in landslides, flooding and debris flows long after the initial event has occurred. Connections between decisions across multiple events and time scales are ignored, yet these connections could lead to better policy making at all stages of disaster risk reduction. Considering this situation, we present a Life Cycle Hazard Assessment (LCHA), an adaptation of the life cycle analysis (LCA) approach, to examine fire-related hazards at the Wildland-Urban Interface in the American West. The LCHA focuses on the temporal integration of: 1) the 'pre-fire' set of physical conditions (e.g. fuel loads) and human conditions (e.g. hazard awareness), 2) the 'fire event', focusing on computational analysis of the communication patterns and responsibility for response to the event, and 3) the 'post-event' analysis of the landscape susceptibility to fire-related debris flows. The LCHA follows other models used by governmental agencies to prepare for disasters through 1) preparation and prevention, 2) response, and 3) recovery. As an overlay are the diverse agencies and policies associated with these stages and their respective resource and management decisions over time. LCAs have evolved from a business-centric consideration of the environmental impact of a specific product over the product's life. The approach proceeds through several phases to arrive at an assessment of the product's impact on the environment over time, and it is now being considered beyond the business and logistics communities in areas such as biodiversity and ecosystem impacts. From our perspective, we consider wildfire as the "product" and want to understand how it impacts the environment (spatially, temporally, and across the bio-physical and social domains). Through development of this LCHA we adapt the LCA approach with a focus on the inputs (from fire and pre-fire efforts), the outputs (from post-fire conditions), and how they evolve and are responded to by the responsible agencies and stakeholders. The LCHA approach extends and integrates the understanding of hazards over much longer periods of time than previously considered. The LCHA also provides an integrated platform for the necessary interdisciplinary approach to understanding decision and environmental change across the life cycle of the fire event. This presentation will discuss our theoretical and empirical framework for developing a longitudinal LCHA and contribute to the overall goals of the NH7.1 session.
IRIS Toxicological Review of Chloroprene (External Review Draft)
EPA is conducting a peer review of the scientific basis supporting the human health hazard and dose-response assessment of chloroprene that will appear on the Integrated Risk Information System (IRIS) database.
IRIS Toxicological Review of Chloroprene (2000 External Review Draft)
EPA is conducting a peer review of the scientific basis supporting the human health hazard and dose-response assessment of chloroprene that will appear on the Integrated Risk Information System (IRIS) database.
IRIS Toxicological Review of Trimethylbenzenes (External Review Draft)
EPA is conducting a peer review of the scientific basis supporting the human health hazard and dose-response assessment of trimethylbenzenes that will appear in the Integrated Risk Information System (IRIS) database.
General RMP Guidance - Chapter 6: Prevention Program (Program 2)
Sound prevention practices are founded on safety information, hazard review, operating procedures, training, maintenance, compliance audits, and accident investigation. These must be integrated into a risk management system that you implement consistently.
IRIS Toxicological Review of Ammonia (External Review Draft)
EPA is conducting a peer review of the scientific basis supporting the human health hazard and dose-response assessment of ammonia that will appear in the Integrated Risk Information System (IRIS) database.
IRIS Toxicological Review of Pentachlorophenol (External Review Draft)
EPA is conducting a peer review of the scientific basis supporting the human health hazard and dose-response assessment of pentachlorophenol that will appear in the Integrated Risk Information System (IRIS) database.
Low temperature ablation models made by pressure/vacuum application
NASA Technical Reports Server (NTRS)
Fischer, M. C.; Heier, W. C.
1970-01-01
Method developed employs high pressure combined with strong vacuum force to compact ablation models into desired conical shape. Technique eliminates vapor hazard and results in high material density providing excellent structural integrity.
NASA Astrophysics Data System (ADS)
Werg, J.; Grothmann, T.; Schmidt, P.
2013-06-01
People are unequally affected by extreme weather events in terms of mortality, morbidity and financial losses; this is the case not only for developing, but also for industrialized countries. Previous research has established indicators for identifying who is particularly vulnerable and why, focusing on socio-demographic factors such as income, age, gender, health and minority status. However, these factors can only partly explain the large disparities in the extent to which people are affected by natural hazards. Moreover, these factors are usually not alterable in the short to medium term, which limits their usefulness for strategies of reducing social vulnerability and building social capacity. Based on a literature review and an expert survey, we propose an approach for refining assessments of social vulnerability and building social capacity by integrating psychological and governance factors.
Techniques for capturing expert knowledge - An expert systems/hypertext approach
NASA Technical Reports Server (NTRS)
Lafferty, Larry; Taylor, Greg; Schumann, Robin; Evans, Randy; Koller, Albert M., Jr.
1990-01-01
The knowledge-acquisition strategy developed for the Explosive Hazards Classification (EHC) Expert System is described, in which expert systems and hypertext are combined, and broad applications are proposed. The EHC expert system is based on rapid prototyping in which primary knowledge acquisition from experts is not emphasized; the explosive hazards technical bulletin, technical guidance, and minimal interviewing are used to develop the knowledge-based system. Hypertext is used to capture the technical information with respect to four issues: procedural, materials, test, and classification issues. The hypertext display allows the integration of multiple knowledge representations, such as clarifications or opinions, and thereby allows the performance of a broad range of tasks on a single machine. Among other recommendations, it is suggested that the integration of hypertext and expert systems makes the resulting synergistic system highly efficient.
NASA Astrophysics Data System (ADS)
Raef, Abdelmoneam; Gad, Sabreen; Tucker-Kulesza, Stacey
2015-10-01
Seismic site characteristics, as pertaining to earthquake hazard reduction, are a function of the subsurface elastic moduli and the geologic structures. This study explores how multiscale (surface, downhole, and laboratory) datasets can be utilized to improve "constrained" average Vs30 (the time-averaged shear-wave velocity over the upper 30 m). We integrate borehole, surface, and laboratory measurements for a seismic site classification based on the standards of the National Earthquake Hazards Reduction Program (NEHRP). The seismic shear-wave velocity (Vs30) was derived from a geophysical inversion workflow that utilized multichannel analysis of surface waves (MASW) and downhole acoustic televiewer (DATV) imaging. P-wave and S-wave velocities, based on laboratory measurements of arrival times of ultrasonic-frequency signals, supported the workflow by enabling us to calculate Poisson's ratio, which was incorporated in building an initial model for the geophysical inversion of MASW. Extraction of core samples from two boreholes provided lithology and thickness calibration of the amplitudes of the acoustic televiewer imaging for each layer. The MASW inversion, for calculating Vs sections, was constrained with both ultrasonic laboratory measurements (from first arrivals of Vs and Vp waveforms at simulated in situ overburden stress conditions) and the DATV amplitude logs. The Vs30 calculations enabled categorizing the studied site as NEHRP class "C" - very dense soil and soft rock. Unlike the shallow fractured carbonates in the studied area, S-wave and P-wave velocities at ultrasonic frequency for the deeper intact shale core samples from two boreholes were in better agreement with the corresponding velocities from both a zero-offset vertical seismic profiling (VSP) and inversion of Rayleigh-wave velocity dispersion curves.
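The Vs30 value itself follows the standard definition Vs30 = 30 / sum(h_i / Vs_i) over the layers in the upper 30 m, which the sketch below implements together with a simplified NEHRP class lookup. The layer model is hypothetical and is not the profile measured at the study site.

```python
# Sketch of the standard Vs30 calculation and a simplified NEHRP class lookup.
# The layer thicknesses and shear-wave velocities are illustrative.
def vs30(thicknesses_m, velocities_ms):
    """Time-averaged shear-wave velocity over the upper 30 m."""
    depth, travel_time = 0.0, 0.0
    for h, vs in zip(thicknesses_m, velocities_ms):
        h = min(h, 30.0 - depth)      # truncate the layer stack at 30 m depth
        if h <= 0:
            break
        travel_time += h / vs
        depth += h
    return 30.0 / travel_time

def nehrp_class(v):
    if v > 1500: return "A"
    if v > 760:  return "B"
    if v > 360:  return "C"           # very dense soil and soft rock
    if v > 180:  return "D"
    return "E"

v = vs30([5.0, 10.0, 20.0], [250.0, 450.0, 700.0])
print(f"Vs30 = {v:.0f} m/s, NEHRP class {nehrp_class(v)}")
```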
Bruce, Martha L.; Lohman, Matthew C.; Greenberg, Rebecca L.; Bao, Yuhua; Raue, Patrick J.
2016-01-01
OBJECTIVES To determine whether a depression care management intervention among Medicare home health recipients decreases the risk of hospitalization. DESIGN Cluster-randomized trial. Nurse teams were randomized to Intervention (12 teams) or Enhanced Usual Care (EUC; 9 teams). SETTING Six home health agencies from distinct geographic regions. Patients were interviewed at home and by telephone. PARTICIPANTS Patients aged >65 who screened positive for depression on nurse assessments (N=755), and a subset who consented to interviews (N=306). INTERVENTION The Depression CAREPATH (CARE for PATients at Home) guides nurses in managing depression during routine home visits. Clinical functions include weekly symptom assessment, medication management, care coordination, patient education, and goal setting. Researchers conducted biweekly telephone conferences with team supervisors. MEASUREMENTS The study examined acute-care hospitalization and days to hospitalization. H1 used data from the home health record to examine hospitalization over 30-day and 60-day periods while a home health patient. H2 used data from both the home care record and research assessments to examine 30-day hospitalization from any setting. RESULTS The adjusted hazard ratio (HR) of being admitted to hospital directly from home health within 30 days of the start of home health care was 0.65 (p=.013) for CAREPATH compared with EUC patients, and 0.72 (p=.027) within 60 days. In patients referred to home health directly from hospital, the relative hazard of being rehospitalized was approximately 55% lower (HR = 0.45, p=.001) among CAREPATH patients. CONCLUSION Integrating CAREPATH depression care management into routine nursing practice reduces hospitalization and rehospitalization risk among older adults receiving Medicare home health nursing services. PMID:27739067
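Adjusted hazard ratios like those reported above are typically obtained from a Cox proportional-hazards model with a treatment indicator and covariates. The sketch below shows that general workflow with the lifelines library on simulated data; the variable names and values are illustrative and are not the CAREPATH data or the study's exact model.

```python
# Sketch of estimating an adjusted hazard ratio with a Cox model on
# simulated (not CAREPATH) data.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(4)
n = 400
df = pd.DataFrame({
    "carepath": rng.integers(0, 2, n),             # 1 = intervention arm
    "age": rng.normal(75, 6, n),
    "days_to_hosp": rng.exponential(40, n).clip(1, 60),
    "hospitalized": rng.integers(0, 2, n),         # event indicator
})
cph = CoxPHFitter()
cph.fit(df, duration_col="days_to_hosp", event_col="hospitalized")
print(cph.hazard_ratios_)                          # exp(coef) per covariate
```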
Health Hazard Assessment and Toxicity Clearances in the Army Acquisition Process
NASA Technical Reports Server (NTRS)
Macko, Joseph A., Jr.
2000-01-01
The United States Army Materiel Command, Army Acquisition Pollution Prevention Support Office (AAPPSO) is responsible for creating and managing the U.S. Army-wide Acquisition Pollution Prevention Program. It has established Integrated Process Teams (IPTs) within each of the Major Subordinate Commands of the Army Materiel Command. AAPPSO provides centralized integration, coordination, and oversight of the Army Acquisition Pollution Prevention Program (AAPPP), and the IPTs provide the decentralized execution of the AAPPSO program. AAPPSO issues policy and guidance, provides resources, and prioritizes P2 efforts. It is the policy of the AAPPP to require United States Army Surgeon General approval of all materials or substances that will be used as an alternative to existing hazardous materials, toxic materials and substances, and ozone-depleting substances. The Army has a formal process established to address this effort. Army Regulation 40-10 requires a Health Hazard Assessment (HHA) during the acquisition milestones of a new Army system. Army Regulation 40-5 addresses the Toxicity Clearance (TC) process to evaluate new chemicals and materials prior to acceptance as an alternative. The U.S. Army Center for Health Promotion and Preventive Medicine is the Army's matrixed medical health organization that performs the HHA and TC missions.
Management of hazardous medical waste in Croatia.
Marinković, Natalija; Vitale, Ksenija; Janev Holcer, Natasa; Dzakula, Aleksandar; Pavić, Tomo
2008-01-01
This article provides a review of hazardous medical waste production and its management in Croatia. Even though Croatian regulations define all steps in the waste management chain, implementation of those steps is one of the country's greatest issues. Improper practice is evident from the point of waste production to final disposal. The biggest producers of hazardous medical waste are hospitals, which do not implement existing legislation due to a lack of education and funds. Information on the quantities, types, and flow of medical waste is inadequate, as is sanitary control. We propose an integrated approach to medical waste management based on a hierarchical structure from the point of generation to disposal. Priority is given to reducing the amounts of waste and its potential for harm. Where this is not possible, management includes reduction by sorting and separating, pretreatment on site, safe transportation, final treatment, and sanitary disposal. Preferred methods should be the least harmful for human health and the environment. Integrated medical waste management could greatly reduce quantities and, consequently, financial strains. Landfilling is the predominant route of disposal in Croatia, although the authors believe that incineration is the most appropriate method. In a country such as Croatia, a number of small incinerators would be the most economical solution.