Science.gov

Sample records for quantitative hazard analysis

  1. Using Qualitative Hazard Analysis to Guide Quantitative Safety Analysis

    NASA Technical Reports Server (NTRS)

    Shortle, J. F.; Allocco, M.

    2005-01-01

    Quantitative methods can be beneficial in many types of safety investigations. However, there are many difficulties in using quantitative methods. For example, there may be little relevant data available. This paper proposes a framework for using qualitative hazard analysis to prioritize the hazard scenarios most suitable for quantitative analysis. The framework first categorizes hazard scenarios by severity and likelihood. We then propose another metric, "modeling difficulty," that describes the complexity of modeling a given hazard scenario quantitatively. The combined metrics of severity, likelihood, and modeling difficulty help to prioritize hazard scenarios for which quantitative analysis should be applied. We have applied this methodology to proposed concepts of operations for reduced wake separation for airplane operations at closely spaced parallel runways.
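
    The record above describes combining severity, likelihood, and modeling difficulty, but publishes no scoring formula; the sketch below shows one plausible way to rank scenarios from these three metrics. The scales, weights, and scenario names are illustrative assumptions, not values from the paper.

    ```python
    # Minimal sketch of a severity / likelihood / modeling-difficulty ranking.
    # All scales and example scenarios are assumptions for illustration.

    def priority(severity: int, likelihood: int, modeling_difficulty: int) -> float:
        """Rank hazard scenarios as candidates for quantitative analysis.

        severity, likelihood: 1 (low) .. 5 (high)
        modeling_difficulty:  1 (easy to model) .. 5 (hard to model)
        Higher scores = better candidates for quantitative modeling.
        """
        risk = severity * likelihood          # classic risk-matrix product
        return risk / modeling_difficulty     # discount hard-to-model scenarios

    scenarios = {
        "wake encounter on parallel approach":  (5, 3, 2),
        "blunder during simultaneous approach": (5, 2, 5),
        "go-around spacing loss":               (3, 4, 1),
    }
    ranked = sorted(scenarios.items(), key=lambda kv: priority(*kv[1]), reverse=True)
    for name, scores in ranked:
        print(f"{priority(*scores):5.2f}  {name}")
    ```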

  2. Applying Qualitative Hazard Analysis to Support Quantitative Safety Analysis for Proposed Reduced Wake Separation Conops

    NASA Technical Reports Server (NTRS)

    Shortle, John F.; Allocco, Michael

    2005-01-01

    This paper describes a scenario-driven hazard analysis process to identify, eliminate, and control safety-related risks. Within this process, we develop selective criteria to determine the applicability of applying engineering modeling to hypothesized hazard scenarios. This provides a basis for evaluating and prioritizing the scenarios as candidates for further quantitative analysis. We have applied this methodology to proposed concepts of operations for reduced wake separation for closely spaced parallel runways. For arrivals, the process identified 43 core hazard scenarios. Of these, we classified 12 as appropriate for further quantitative modeling, 24 that should be mitigated through controls, recommendations, and/or procedures (that is, scenarios not appropriate for quantitative modeling), and 7 that have the lowest priority for further analysis.

  3. Quantitative electroencephalography analysis in university students with hazardous alcohol consumption, but not alcohol dependence.

    PubMed

    Núñez-Jaramillo, Luis; Vega-Perera, Paulo; Ramírez-Lugo, Leticia; Reyes-López, Julián V; Santiago-Rodríguez, Efraín; Herrera-Morales, Wendy V

    2015-07-08

    Hazardous alcohol consumption is a pattern of consumption that leads to a higher risk of harmful consequences either for the user or for others. This pattern of alcohol consumption has been linked to risky behaviors, accidents, and injuries. Individuals with hazardous alcohol consumption do not necessarily present alcohol dependence; thus, a study of the particular neurophysiological correlates of this alcohol consumption pattern needs to be carried out in nondependent individuals. Here, we carried out a quantitative electroencephalography analysis in health sciences university students with hazardous alcohol consumption, but not alcohol dependence (HAC), and control participants without hazardous alcohol consumption or alcohol dependence (NHAC). We analyzed absolute power (AP), relative power (RP), and mean frequency (MF) for beta and theta frequency bands under both eyes-closed and eyes-open conditions. We found that participants in the HAC group presented higher beta AP in the centroparietal region, as well as lower beta MF in the frontal and centroparietal regions, in the eyes-closed condition. Interestingly, participants did not present any change in theta activity (AP, RP, or MF), whereas previous reports indicate an increase in theta AP in alcohol-dependent individuals. Our results partially resemble those found in alcohol-dependent individuals, although they are not completely identical, suggesting a possible difference in the underlying neuronal mechanisms of alcohol dependence and hazardous alcohol consumption. The similarities could be explained by considering that both hazardous alcohol consumption and alcohol dependence are manifestations of behavioral disinhibition.
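
    As an illustration of the band measures named in this abstract (AP, RP, MF), the sketch below computes them for a synthetic single-channel signal from a Welch power spectral density. The signal and band edges are assumptions for illustration; the study's exact processing pipeline is not specified here.

    ```python
    # Hedged sketch: absolute power, relative power, and mean frequency of
    # EEG bands from a Welch PSD. The "EEG" here is a synthetic test signal.
    import numpy as np
    from scipy.signal import welch

    fs = 250.0                                    # assumed sampling rate (Hz)
    t = np.arange(0, 10, 1 / fs)
    eeg = np.sin(2 * np.pi * 6 * t) + 0.5 * np.sin(2 * np.pi * 20 * t)

    freqs, psd = welch(eeg, fs=fs, nperseg=1024)
    df = freqs[1] - freqs[0]                      # uniform frequency spacing

    def band_measures(lo, hi):
        m = (freqs >= lo) & (freqs < hi)
        ap = psd[m].sum() * df                    # absolute power in band
        rp = ap / (psd.sum() * df)                # relative power
        mf = (freqs[m] * psd[m]).sum() / psd[m].sum()  # mean frequency
        return ap, rp, mf

    for band, (lo, hi) in {"theta": (4, 8), "beta": (13, 30)}.items():
        ap, rp, mf = band_measures(lo, hi)
        print(f"{band}: AP={ap:.3f}, RP={rp:.3f}, MF={mf:.1f} Hz")
    ```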

  4. Use of Bayesian event trees in semi-quantitative volcano eruption forecasting and hazard analysis

    NASA Astrophysics Data System (ADS)

    Wright, Heather; Pallister, John; Newhall, Chris

    2015-04-01

    Use of Bayesian event trees to forecast eruptive activity during volcano crises is an increasingly common practice for the USGS-USAID Volcano Disaster Assistance Program (VDAP) in collaboration with foreign counterparts. This semi-quantitative approach combines conceptual models of volcanic processes with current monitoring data and patterns of occurrence to reach consensus probabilities. This approach allows a response team to draw upon global datasets, local observations, and expert judgment, where the relative influence of these data depends upon the availability and quality of monitoring data and the degree to which the volcanic history is known. The construction of such event trees additionally relies upon the existence and use of relevant global databases and documented past periods of unrest. Because relevant global databases may be underpopulated or nonexistent, uncertainty in probability estimations may be large. Our 'hybrid' approach of combining local and global monitoring data and expert judgment facilitates discussion and constructive debate between disciplines, including seismology, gas geochemistry, geodesy, petrology, physical volcanology, and technology/engineering, where differences of opinion between response team members contribute to defining the uncertainty in the probability estimations. In collaboration with foreign colleagues, we have created event trees for numerous areas experiencing volcanic unrest. Event trees are created for a specified time frame and are updated, revised, or replaced as the crisis proceeds. Creation of an initial tree is often prompted by a change in monitoring data, such that rapid assessment of probability is needed. These trees are intended as a vehicle for discussion and a way to document relevant data and models, where the target audience is the scientists themselves. However, the probabilities derived through the event-tree analysis can also be used to help inform communications with emergency managers and the
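
    To make the event-tree arithmetic concrete, the sketch below multiplies consensus conditional probabilities along one branch of a hypothetical tree. All node probabilities are invented placeholders, not VDAP values.

    ```python
    # Minimal sketch of event-tree branch arithmetic: the probability of an
    # outcome is the product of conditional probabilities along its branch.
    p_unrest = 1.0                   # unrest is ongoing by assumption
    p_magmatic_given_unrest = 0.6    # illustrative consensus value
    p_eruption_given_magmatic = 0.4  # illustrative consensus value
    p_vei3plus_given_eruption = 0.25 # illustrative consensus value

    p_vei3plus = (p_unrest * p_magmatic_given_unrest *
                  p_eruption_given_magmatic * p_vei3plus_given_eruption)
    print(f"P(VEI >= 3 eruption this window) = {p_vei3plus:.3f}")  # 0.060
    ```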

  5. NASA Hazard Analysis Process

    NASA Technical Reports Server (NTRS)

    Deckert, George

    2010-01-01

    This viewgraph presentation reviews the NASA Hazard Analysis process. The contents include: 1) Significant Incidents and Close Calls in Human Spaceflight; 2) Subsystem Safety Engineering Through the Project Life Cycle; 3) The Risk Informed Design Process; 4) Types of NASA Hazard Analysis; 5) Preliminary Hazard Analysis (PHA); 6) Hazard Analysis Process; 7) Identify Hazardous Conditions; 8) Consider All Interfaces; 9) Work a Preliminary Hazard List; 10) NASA Generic Hazards List; and 11) Final Thoughts.

  6. Hydrothermal Liquefaction Treatment Preliminary Hazard Analysis Report

    SciTech Connect

    Lowry, Peter P.; Wagner, Katie A.

    2015-08-31

    A preliminary hazard assessment was completed during February 2015 to evaluate the conceptual design of the modular hydrothermal liquefaction treatment system. The hazard assessment was performed in two stages. An initial assessment utilizing Hazard Identification and Preliminary Hazards Analysis (PHA) techniques identified areas with significant or unique hazards (process safety-related hazards) that fell outside of the normal operating envelope of PNNL and warranted additional analysis. The subsequent assessment was based on a qualitative What-If analysis. This analysis was augmented, as necessary, by additional quantitative analysis for scenarios involving a release of hazardous material or energy with the potential for affecting the public.

  7. Hazard Analysis Database Report

    SciTech Connect

    GRAMS, W.H.

    2000-12-28

    The Hazard Analysis Database was developed in conjunction with the hazard analysis activities conducted in accordance with DOE-STD-3009-94, Preparation Guide for U.S. Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports, for HNF-SD-WM-SAR-067, Tank Farms Final Safety Analysis Report (FSAR). The FSAR is part of the approved Authorization Basis (AB) for the River Protection Project (RPP). This document describes, identifies, and defines the contents and structure of the Tank Farms FSAR Hazard Analysis Database and documents the configuration control changes made to the database. The Hazard Analysis Database contains the collection of information generated during the initial hazard evaluations and the subsequent hazard and accident analysis activities. The Hazard Analysis Database supports the preparation of Chapters 3, 4, and 5 of the Tank Farms FSAR and the Unreviewed Safety Question (USQ) process and consists of two major, interrelated data sets: (1) Hazard Analysis Database: data from the results of the hazard evaluations, and (2) Hazard Topography Database: data from the system familiarization and hazard identification.

  8. Software safety hazard analysis

    SciTech Connect

    Lawrence, J.D.

    1996-02-01

    Techniques for analyzing the safety and reliability of analog-based electronic protection systems that serve to mitigate hazards in process control systems have been developed over many years, and are reasonably well understood. An example is the protection system in a nuclear power plant. The extension of these techniques to systems which include digital computers is not well developed, and there is little consensus among software engineering experts and safety experts on how to analyze such systems. One possible technique is to extend hazard analysis to include digital computer-based systems. Software is frequently overlooked during system hazard analyses, but this is unacceptable when the software is in control of a potentially hazardous operation. In such cases, hazard analysis should be extended to fully cover the software. A method for performing software hazard analysis is proposed in this paper.

  9. K Basins Hazard Analysis

    SciTech Connect

    WEBB, R.H.

    1999-12-29

    This report describes the methodology used in conducting the K Basins Hazard Analysis, which provides the foundation for the K Basins Safety Analysis Report (HNF-SD-WM-SAR-062, Rev.4). This hazard analysis was performed in accordance with guidance provided by DOE-STD-3009-94, Preparation Guide for U. S. Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports and implements the requirements of DOE Order 5480.23, Nuclear Safety Analysis Report.

  10. K Basin Hazard Analysis

    SciTech Connect

    PECH, S.H.

    2000-08-23

    This report describes the methodology used in conducting the K Basins Hazard Analysis, which provides the foundation for the K Basins Final Safety Analysis Report. This hazard analysis was performed in accordance with guidance provided by DOE-STD-3009-94, Preparation Guide for U. S. Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports and implements the requirements of DOE Order 5480.23, Nuclear Safety Analysis Report.

  11. HAZARD ANALYSIS SOFTWARE

    SciTech Connect

    Sommer, S; Tinh Tran, T

    2008-04-08

    Washington Safety Management Solutions, LLC developed web-based software to improve the efficiency and consistency of hazard identification and analysis, control selection and classification, and to standardize analysis reporting at the Savannah River Site. In the new nuclear age, information technology provides methods to improve the efficiency of the documented safety analysis development process, which includes hazard analysis activities. This software provides a web interface that interacts with a relational database to support analysis, record data, and ensure reporting consistency. A team of subject matter experts participated in a series of meetings to review the associated processes and procedures for requirements and standard practices. Through these meetings, a set of software requirements was developed and compiled into a requirements traceability matrix from which software could be developed. The software was tested to ensure compliance with the requirements. Training was provided to the hazard analysis leads. Hazard analysis teams using the software have verified its operability. The software has been classified as NQA-1, Level D, as it supports the analysis team but does not perform the analysis. The software can be transported to other sites with alternate risk schemes. The software is being used to support the development of 14 hazard analyses. User responses have been positive, with a number of suggestions for improvement which are being incorporated as time permits. The software has enforced a uniform implementation of the site procedures. The software has significantly improved the efficiency and standardization of the hazard analysis process.

  12. Identifying hazard parameter to develop quantitative and dynamic hazard map of an active volcano in Indonesia

    NASA Astrophysics Data System (ADS)

    Suminar, Wulan; Saepuloh, Asep; Meilano, Irwan

    2016-05-01

    Analysis of hazard assessment for active volcanoes is crucial for risk management. The hazard map of a volcano provides information to decision makers and communities before, during, and after a volcanic crisis. Rapid and accurate hazard assessment, especially for an active volcano, needs to be developed for better mitigation during volcanic crises in Indonesia. In this paper, we identified the hazard parameters needed to develop a quantitative and dynamic hazard map of an active volcano. Guntur volcano in the Garut Region, West Java, Indonesia was selected as the study area because the population resides adjacent to the active volcano, and infrastructure development, especially tourism-related development on the eastern flank of the summit, is growing rapidly. Remote sensing and field investigation approaches were used to obtain hazard parameters spatially. We developed a quantitative and dynamic algorithm to map the spatial hazard potential of the volcano based on an index overlay technique. Five volcano hazard parameters were identified from Landsat 8 and ASTER imagery: volcanic products, including pyroclastic fallout, pyroclastic flows, lava and lahar; slope topography; surface brightness temperature; and vegetation density. Following this proposed technique, the hazard parameters were extracted, indexed, and calculated to produce spatial hazard values at and around Guntur Volcano. Based on this method, the hazard potential of areas with low vegetation density is higher than that of areas with high vegetation density. Furthermore, slope topography, surface brightness temperature, and fragmental volcanic products such as pyroclastics significantly influenced the spatial hazard value. Further study of this proposed approach will aim at effective and efficient analyses for volcano risk assessment.
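
    A minimal sketch of the index overlay technique described above: each parameter raster is indexed to a common scale and combined as a weighted sum. The grid values, weights, and parameter indexing are illustrative assumptions, not values from the study.

    ```python
    # Hedged sketch of index overlay for volcano hazard mapping: weighted sum
    # of indexed parameter rasters. All values below are toy placeholders.
    import numpy as np

    shape = (4, 4)                                     # toy raster grid
    rng = np.random.default_rng(42)
    params = {
        "volcanic_products":    rng.integers(1, 6, shape),  # indexed 1..5
        "slope":                rng.integers(1, 6, shape),
        "surface_temperature":  rng.integers(1, 6, shape),
        # vegetation assumed indexed inversely: low density -> high index,
        # matching the abstract's finding on vegetation density
        "vegetation_density":   rng.integers(1, 6, shape),
    }
    weights = {"volcanic_products": 0.4, "slope": 0.2,
               "surface_temperature": 0.2, "vegetation_density": 0.2}

    hazard = sum(weights[k] * params[k] for k in params)  # index overlay
    print(np.round(hazard, 2))
    ```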

  13. Hidden Markov Model for quantitative prediction of snowfall and analysis of hazardous snowfall events over Indian Himalaya

    NASA Astrophysics Data System (ADS)

    Joshi, J. C.; Tankeshwar, K.; Srivastava, Sunita

    2017-04-01

    A Hidden Markov Model (HMM) has been developed for prediction of quantitative snowfall in the Pir-Panjal and Great Himalayan mountain ranges of the Indian Himalaya. The model predicts snowfall two days in advance using nine meteorological variables recorded daily over the past 20 winters (1992-2012). There are six observations and six states in the model. The most probable observation and state sequences have been computed using the Forward and Viterbi algorithms, respectively. The Baum-Welch algorithm has been used to optimize the model parameters. The model has been validated for two winters (2012-2013 and 2013-2014) by computing the root mean square error (RMSE) and accuracy measures such as percent correct (PC), critical success index (CSI) and Heidke skill score (HSS). The RMSE of the model has also been calculated using the leave-one-out cross-validation method. Snowfall predicted by the model during hazardous snowfall events in different parts of the Himalaya matches well with observations. The HSS of the model for all the stations implies that the optimized model has better forecasting skill than a random forecast for both days. The RMSE of the optimized model has also been found to be smaller than the persistence forecast and the standard deviation for both days.
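
    The abstract names the Viterbi algorithm for decoding the most probable state sequence; the sketch below is a compact log-space implementation for a six-state, six-symbol model. The transition and emission matrices are random placeholders, not the fitted parameters from the paper.

    ```python
    # Hedged sketch of Viterbi decoding for a 6-state, 6-symbol HMM.
    import numpy as np

    n_states, n_obs = 6, 6
    rng = np.random.default_rng(7)
    A = rng.dirichlet(np.ones(n_states), size=n_states)   # transition matrix
    B = rng.dirichlet(np.ones(n_obs), size=n_states)      # emission matrix
    pi = np.full(n_states, 1.0 / n_states)                # initial distribution

    def viterbi(obs):
        """Most probable state sequence for an observation sequence."""
        T = len(obs)
        logA, logB, logpi = np.log(A), np.log(B), np.log(pi)
        delta = np.zeros((T, n_states))                   # best log-probs
        psi = np.zeros((T, n_states), dtype=int)          # backpointers
        delta[0] = logpi + logB[:, obs[0]]
        for t in range(1, T):
            scores = delta[t - 1][:, None] + logA         # (from, to)
            psi[t] = scores.argmax(axis=0)
            delta[t] = scores.max(axis=0) + logB[:, obs[t]]
        states = [int(delta[-1].argmax())]
        for t in range(T - 1, 0, -1):                     # backtrack
            states.append(int(psi[t][states[-1]]))
        return states[::-1]

    print(viterbi([0, 3, 3, 5, 1]))
    ```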

  14. Canister Storage Building (CSB) Hazard Analysis Report

    SciTech Connect

    POWERS, T.B.

    2000-03-16

    This report describes the methodology used in conducting the Canister Storage Building (CSB) hazard analysis to support the CSB final safety analysis report (FSAR) and documents the results. The hazard analysis process identified hazardous conditions and material-at-risk, determined causes for potential accidents, identified preventive and mitigative features, and qualitatively estimated the frequencies and consequences of specific occurrences. The hazard analysis was performed by a team of cognizant CSB operations and design personnel, safety analysts familiar with the CSB, and technical experts in specialty areas. The material included in this report documents the final state of a nearly two-year-long process involving formal facilitated group sessions and independent hazard and accident analysis work. Attachment A provides two lists of hazard analysis team members and describes the background and experience of each. The first list is a complete list of the hazard analysis team members involved over the two-year process. The second list is a subset of the first and consists of those team members who reviewed and agreed to the final hazard analysis documentation. The hazard analysis process led to the selection of candidate accidents for further quantitative analysis. New information relative to the hazards, discovered during the accident analysis, was incorporated into the hazard analysis data in order to compile a complete profile of facility hazards. Through this process, the results of the hazard and accident analyses led directly to the identification of safety structures, systems, and components, technical safety requirements, and other

  15. Chemical process hazards analysis

    SciTech Connect

    1996-02-01

    The Office of Worker Health and Safety (EH-5) under the Assistant Secretary for the Environment, Safety and Health of the US Department of Energy (DOE) has published two handbooks for use by DOE contractors managing facilities and processes covered by the Occupational Safety and Health Administration (OSHA) Rule for Process Safety Management of Highly Hazardous Chemicals (29 CFR 1910.119), herein referred to as the PSM Rule. The PSM Rule contains an integrated set of chemical process safety management elements designed to prevent chemical releases that can lead to catastrophic fires, explosions, or toxic exposures. The purpose of the two handbooks, "Process Safety Management for Highly Hazardous Chemicals" and "Chemical Process Hazards Analysis," is to facilitate implementation of the provisions of the PSM Rule within the DOE. The purpose of this handbook, "Chemical Process Hazards Analysis," is to facilitate, within the DOE, the performance of chemical process hazards analyses (PrHAs) as required under the PSM Rule. It provides basic information for the performance of PrHAs, and should not be considered a complete resource on PrHA methods. Likewise, to determine if a facility is covered by the PSM Rule, the reader should refer to the handbook "Process Safety Management for Highly Hazardous Chemicals" (DOE-HDBK-1101-96). Promulgation of the PSM Rule has heightened the awareness of chemical safety management issues within the DOE. This handbook is intended for use by DOE facilities and processes covered by the PSM Rule to facilitate contractor implementation of the PrHA element of the PSM Rule. However, contractors whose facilities and processes are not covered by the PSM Rule may also use this handbook as a basis for conducting process hazards analyses as part of their good management practices. This handbook explains the minimum requirements for PrHAs outlined in the PSM Rule. Nowhere have requirements been added beyond what is specifically required by the rule.

  16. Job Hazard Analysis

    DTIC Science & Technology

    1998-01-01

    ...lifting heavy objects? Do environmental hazards such as welding rays, heat, or excessive... Job Hazard Analysis, U.S. Department of Labor, Occupational Safety and Health Administration, 200 Constitution Avenue, Washington, DC. ...does not itself alter or determine compliance responsibilities, which are set forth in OSHA standards themselves and the Occupational Safety and Health Act.

  17. Counterfactual Volcano Hazard Analysis

    NASA Astrophysics Data System (ADS)

    Woo, Gordon

    2013-04-01

    The historical database of past disasters is a cornerstone of catastrophe risk assessment. Whereas disasters are fortunately comparatively rare, near-misses are quite common for both natural and man-made hazards. The word disaster originally means 'an unfavourable aspect of a star'. Except for astrologists, disasters are no longer perceived fatalistically as pre-determined. Nevertheless, to this day, historical disasters are treated statistically as fixed events, although in reality there is a large luck element involved in converting a near-miss crisis situation into a disaster statistic. It is possible to conceive a stochastic simulation of the past to explore the implications of this chance factor. Counterfactual history is the exercise of hypothesizing alternative paths of history from what actually happened. Exploring history from a counterfactual perspective is instructive for a variety of reasons. First, it is easy to be fooled by randomness and see regularity in event patterns which are illusory. The past is just one realization of a variety of possible evolutions of history, which may be analyzed through a stochastic simulation of an array of counterfactual scenarios. In any hazard context, there is a random component equivalent to dice being rolled to decide whether a near-miss becomes an actual disaster. The fact that there may be no observed disaster over a period of time may belie the occurrence of numerous near-misses. This may be illustrated using the simple dice paradigm. Suppose a die is rolled every month for a year, and an event is recorded if a six is thrown. There is still an 11% chance of no events occurring during the year. A variety of perils may be used to illustrate the use of near-miss information within a counterfactual disaster analysis. In the domain of natural hazards, near-misses are a notable feature of the threat landscape. Storm surges are an obvious example. Sea defences may protect against most meteorological scenarios. However
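
    The 11% figure in the dice paradigm can be checked directly: with twelve independent monthly rolls, the probability of never rolling a six is (5/6)^12.

    ```python
    # Worked check of the dice paradigm above: a fair die rolled once a month
    # for a year, with a "six" counted as an event.
    p_no_event = (5.0 / 6.0) ** 12
    print(f"P(no event in 12 months) = {p_no_event:.3f}")  # ~0.112, i.e. ~11%
    ```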

  18. An evaluation of the effectiveness of the US Department of Energy Integrated Safety Process (SS-21) for Nuclear Explosive Operations using quantitative hazard analysis

    SciTech Connect

    Fischer, S.R.; Konkel, H.; Bott, T.; Eisenhawer, S.; Auflick, J.; Houghton, K.; Maloney, K.; DeYoung, L.; Wilson, M.

    1996-03-01

    This paper evaluates the effectiveness of the US Department of Energy Integrated Safety Process or "Seamless Safety (SS-21)" program for reducing risk associated with nuclear explosive operations. A key element in the Integrated Safety Process is the use of hazard assessment techniques to evaluate process design changes in parallel or concurrently with process design and development. This concurrent hazard assessment method recently was employed for the B61-0, 2 & 5 and W69 nuclear explosive dismantlement activities. This paper reviews the SS-21 hazard assessment process and summarizes the results of the concurrent hazard assessments performed for the B61 and W69 dismantlement programs. Comparisons of quantitative hazard assessment results before and after implementation of the SS-21 design process shed light on the effectiveness of the SS-21 program for achieving risk reduction.

  19. Supplemental Hazard Analysis and Risk Assessment - Hydrotreater

    SciTech Connect

    Lowry, Peter P.; Wagner, Katie A.

    2015-04-01

    A supplemental hazard analysis was conducted and a quantitative risk assessment performed in response to an independent review comment received by the Pacific Northwest National Laboratory (PNNL) from the U.S. Department of Energy Pacific Northwest Site Office (PNSO) against the Hydrotreater/Distillation Column Hazard Analysis Report issued in April 2013. The supplemental analysis used the hazardous conditions documented by the previous April 2013 report as a basis. The conditions were screened and grouped for the purpose of identifying whether additional prudent, practical hazard controls could be identified, using a quantitative risk evaluation to assess the adequacy of the controls and establish a lower level of concern for the likelihood of potential serious accidents. Calculations were performed to support conclusions where necessary.

  20. MGR External Events Hazards Analysis

    SciTech Connect

    L. Booth

    1999-11-06

    The purpose and objective of this analysis is to apply an external events Hazards Analysis (HA) to the License Application Design Selection Enhanced Design Alternative II (LADS EDA II design, Reference 8.32). The output of the HA is called a Hazards List (HL). This analysis supersedes the external hazards portion of Rev. 00 of the PHA (Reference 8.1). The PHA for internal events will also be updated to the LADS EDA II design, but under a separate analysis. Like the PHA methodology, the HA methodology provides a systematic method to identify potential hazards during the 100-year Monitored Geologic Repository (MGR) operating period, updated to reflect the EDA II design. The resulting events on the HL are candidates that may have potential radiological consequences as determined during Design Basis Events (DBEs) analyses. Therefore, the HL that results from this analysis will undergo further screening and analysis based on the criteria that apply during the performance of DBE analyses.

  21. Hazardous Materials Hazard Analysis, Portland, Oregon.

    DTIC Science & Technology

    1981-06-01

    ...regional equipment, crossings, and other aspects of overall system safety. For further information: Henry Jacobs, Hazardous Materials Inspector, U.S... cross-training and an increased inspection effort. In the last year, twelve Oregon State Police officers who patrol highways around the Portland area... "Response System," The International Fire Chief (date unknown). Lee, Myra T. and Roe, Penelope G. Hazardous Materials Management System: A Guide for Local...

  22. Probabilistic analysis of tsunami hazards

    USGS Publications Warehouse

    Geist, E.L.; Parsons, T.

    2006-01-01

    Determining the likelihood of a disaster is a key component of any comprehensive hazard assessment. This is particularly true for tsunamis, even though most tsunami hazard assessments have in the past relied on scenario or deterministic type models. We discuss probabilistic tsunami hazard analysis (PTHA) from the standpoint of integrating computational methods with empirical analysis of past tsunami runup. PTHA is derived from probabilistic seismic hazard analysis (PSHA), with the main difference being that PTHA must account for far-field sources. The computational methods rely on numerical tsunami propagation models rather than empirical attenuation relationships as in PSHA in determining ground motions. Because a number of source parameters affect local tsunami runup height, PTHA can become complex and computationally intensive. Empirical analysis can function in one of two ways, depending on the length and completeness of the tsunami catalog. For site-specific studies where there is sufficient tsunami runup data available, hazard curves can primarily be derived from empirical analysis, with computational methods used to highlight deficiencies in the tsunami catalog. For region-wide analyses and sites where there are little to no tsunami data, a computationally based method such as Monte Carlo simulation is the primary method to establish tsunami hazards. Two case studies that describe how computational and empirical methods can be integrated are presented for Acapulco, Mexico (site-specific) and the U.S. Pacific Northwest coastline (region-wide analysis).
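
    As a sketch of the computationally based approach mentioned above, the following Monte Carlo fragment builds a toy tsunami hazard curve from a synthetic event catalog. The event rate, magnitude range, and runup scaling are invented placeholders, not values from the study.

    ```python
    # Hedged sketch of Monte Carlo estimation of a tsunami hazard curve.
    import numpy as np

    rng = np.random.default_rng(0)
    years = 100_000                      # length of synthetic catalog
    rate = 0.05                          # assumed tsunamigenic events per year
    n_events = rng.poisson(rate * years)

    # Assumed magnitude-to-runup toy relation with lognormal scatter.
    magnitudes = rng.uniform(7.0, 9.0, n_events)
    median_runup = 10 ** (magnitudes - 8.0)             # metres (toy scaling)
    runups = median_runup * rng.lognormal(0.0, 0.5, n_events)

    # Hazard curve: annual rate at which runup exceeds each threshold.
    thresholds = np.linspace(0.5, 10.0, 20)
    for h in thresholds:
        annual_rate = (runups > h).sum() / years
        print(f"runup > {h:4.1f} m : annual rate ~ {annual_rate:.5f}")
    ```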

  23. Preliminary hazards analysis -- vitrification process

    SciTech Connect

    Coordes, D.; Ruggieri, M.; Russell, J.; TenBrook, W.; Yimbo, P.

    1994-06-01

    This paper presents a Preliminary Hazards Analysis (PHA) for mixed waste vitrification by joule heating. The purpose of performing a PHA is to establish an initial hazard categorization for a DOE nuclear facility and to identify those processes and structures which may have an impact on or be important to safety. The PHA is typically performed during and provides input to project conceptual design. The PHA is then followed by a Preliminary Safety Analysis Report (PSAR) performed during Title 1 and 2 design. The PSAR then leads to performance of the Final Safety Analysis Report performed during the facility's construction and testing. It should be completed before routine operation of the facility commences. This PHA addresses the first four chapters of the safety analysis process, in accordance with the requirements of DOE Safety Guidelines in SG 830.110. The hazards associated with vitrification processes are evaluated using standard safety analysis methods which include: identification of credible potential hazardous energy sources; identification of preventative features of the facility or system; identification of mitigative features; and analyses of credible hazards. Maximal facility inventories of radioactive and hazardous materials are postulated to evaluate worst case accident consequences. These inventories were based on DOE-STD-1027-92 guidance and the surrogate waste streams defined by Mayberry, et al. Radiological assessments indicate that a facility, depending on the radioactive material inventory, may be an exempt, Category 3, or Category 2 facility. The calculated impacts would result in no significant impact to offsite personnel or the environment. Hazardous materials assessment indicates that a Mixed Waste Vitrification facility will be a Low Hazard facility having minimal impacts to offsite personnel and the environment.

  24. FIRE HAZARDS ANALYSIS - BUSTED BUTTE

    SciTech Connect

    R. Longwell; J. Keifer; S. Goodin

    2001-01-22

    The purpose of this fire hazards analysis (FHA) is to assess the risk from fire within individual fire areas at the Busted Butte Test Facility and to ascertain whether the DOE fire safety objectives are met. The objective, identified in DOE Order 420.1, Section 4.2, is to establish requirements for a comprehensive fire and related hazards protection program for facilities sufficient to minimize the potential for: (1) the occurrence of a fire related event; (2) a fire that causes an unacceptable on-site or off-site release of hazardous or radiological material that will threaten the health and safety of employees; (3) vital DOE programs suffering unacceptable interruptions as a result of fire and related hazards; (4) property losses from a fire and related events exceeding limits established by DOE; and (5) critical process controls and safety class systems being damaged as a result of a fire and related events.

  25. 14 CFR 437.29 - Hazard analysis.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    § 437.29 Hazard analysis. (a) An applicant must perform a hazard analysis that complies with § 437.55(a). (b) An applicant must provide to the FAA all the results of each step of the hazard analysis...

  26. 14 CFR 437.29 - Hazard analysis.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    § 437.29 Hazard analysis. (a) An applicant must perform a hazard analysis that complies with § 437.55(a). (b) An applicant must provide to the FAA all the results of each step of the hazard analysis...

  27. 14 CFR 437.29 - Hazard analysis.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    § 437.29 Hazard analysis. (a) An applicant must perform a hazard analysis that complies with § 437.55(a). (b) An applicant must provide to the FAA all the results of each step of the hazard analysis...

  28. 14 CFR 437.29 - Hazard analysis.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    § 437.29 Hazard analysis. (a) An applicant must perform a hazard analysis that complies with § 437.55(a). (b) An applicant must provide to the FAA all the results of each step of the hazard analysis...

  29. 14 CFR 437.29 - Hazard analysis.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    § 437.29 Hazard analysis. (a) An applicant must perform a hazard analysis that complies with § 437.55(a). (b) An applicant must provide to the FAA all the results of each step of the hazard analysis...

  30. The Integrated Hazard Analysis Integrator

    NASA Technical Reports Server (NTRS)

    Morris, A. Terry; Massie, Michael J.

    2009-01-01

    Hazard analysis addresses hazards that arise in the design, development, manufacturing, construction, facilities, transportation, operations and disposal activities associated with hardware, software, maintenance, operations and environments. An integrated hazard is an event or condition that is caused by or controlled by multiple systems, elements, or subsystems. Integrated hazard analysis (IHA) is especially daunting and ambitious for large, complex systems such as NASA's Constellation program, which incorporates program, systems and element components that impact others (International Space Station, public, International Partners, etc.). An appropriate IHA should identify all hazards, causes, controls and verifications used to mitigate the risk of catastrophic loss of crew, vehicle and/or mission. Unfortunately, in the current age of increased technology dependence, there is a tendency to sometimes overlook the necessary and sufficient qualifications of the integrator, that is, the person or team that identifies the parts, analyzes the architectural structure, aligns the analysis with the program plan and then communicates and coordinates with large and small components, each contributing necessary hardware, software and/or information to prevent catastrophic loss. In both the Challenger and Columbia accidents, lack of appropriate communication, management errors and lack of resources dedicated to safety were cited as major contributors to the fatalities. From the accident reports, it would appear that the organizational impact of managers, integrators and safety personnel contributes more significantly to mission success and mission failure than purely technological components. If this is so, then organizations that sincerely desire mission success must put as much effort into selecting managers and integrators as they do when designing the hardware, writing the software code and analyzing competitive proposals. This paper will discuss the necessary and

  31. Multivariate Quantitative Chemical Analysis

    NASA Technical Reports Server (NTRS)

    Kinchen, David G.; Capezza, Mary

    1995-01-01

    Technique of multivariate quantitative chemical analysis devised for use in determining relative proportions of two components mixed and sprayed together onto object to form thermally insulating foam. Potentially adaptable to other materials, especially in process-monitoring applications in which it is necessary to know and control critical properties of products via quantitative chemical analyses of products. In addition to chemical composition, also used to determine such physical properties as densities and strengths.
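
    One common multivariate route to the two-component proportion problem described above is classical least squares against pure-component spectra. The brief does not specify its exact algorithm, so the sketch below, with synthetic spectra, is an assumption-laden illustration only.

    ```python
    # Hedged sketch: classical least squares estimation of two component
    # proportions from a measured spectrum. All spectra are synthetic.
    import numpy as np

    wavelengths = np.linspace(0, 1, 50)
    comp_a = np.exp(-((wavelengths - 0.3) ** 2) / 0.01)  # pure spectrum A
    comp_b = np.exp(-((wavelengths - 0.7) ** 2) / 0.02)  # pure spectrum B

    true_mix = 0.6 * comp_a + 0.4 * comp_b
    noise = np.random.default_rng(1).normal(0, 0.01, wavelengths.size)
    measured = true_mix + noise

    # Solve measured ~= K @ proportions in the least-squares sense.
    K = np.column_stack([comp_a, comp_b])
    proportions, *_ = np.linalg.lstsq(K, measured, rcond=None)
    print("estimated proportions:", proportions.round(3))  # ~[0.6, 0.4]
    ```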

  32. The hazard in using probabilistic seismic hazard analysis

    SciTech Connect

    Krinitzsky, E.L. (Geotechnical Lab.)

    1993-11-01

    Earthquake experts rely on probabilistic seismic hazard analysis for everything from emergency-response planning to development of building codes. Unfortunately, says the author, the analysis is defective for the large earthquakes that pose the greatest risks. Structures have short lifetimes, and the distances over which earthquakes cause damage are relatively small. Exceptions serve to prove the rule. To be useful in engineering, earthquake hazard assessment must focus narrowly in both time and space.

  33. 14 CFR 437.55 - Hazard analysis.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    § 437.55 Hazard analysis. (a) A permittee must identify and characterize each of the hazards and assess the risk to public health and safety and the safety of property resulting from each permitted flight. This hazard analysis must— (1) Identify and...

  34. 14 CFR 437.55 - Hazard analysis.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    § 437.55 Hazard analysis. (a) A permittee must identify and characterize each of the hazards and assess the risk to public health and safety and the safety of property resulting from each permitted flight. This hazard analysis must— (1) Identify and...

  35. 14 CFR 437.55 - Hazard analysis.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    § 437.55 Hazard analysis. (a) A permittee must identify and characterize each of the hazards and assess the risk to public health and safety and the safety of property resulting from each permitted flight. This hazard analysis must— (1) Identify and...

  36. 14 CFR 437.55 - Hazard analysis.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    § 437.55 Hazard analysis. (a) A permittee must identify and characterize each of the hazards and assess the risk to public health and safety and the safety of property resulting from each permitted flight. This hazard analysis must— (1) Identify and...

  37. INTERNAL HAZARDS ANALYSIS FOR LICENSE APPLICATION

    SciTech Connect

    R.J. Garrett

    2005-02-17

    The purpose of this internal hazards analysis is to identify and document the internal hazards and potential initiating events associated with preclosure operations of the repository at Yucca Mountain. Internal hazards are those hazards presented by the operation of the facility and by its associated processes that can potentially lead to a radioactive release or cause a radiological hazard. In contrast to external hazards, internal hazards do not involve natural phenomena and external man-made hazards. This internal hazards analysis was performed in support of the preclosure safety analysis and the License Application for the Yucca Mountain Project. The methodology for this analysis provides a systematic means to identify internal hazards and potential initiating events that may result in a radiological hazard or radiological release during the repository preclosure period. These hazards are documented in tables of potential internal hazards and potential initiating events (Section 6.6) for input to the repository event sequence categorization process. The results of this analysis will undergo further screening and analysis based on the criteria that apply to the performance of event sequence analyses for the repository preclosure period. The evolving design of the repository will be re-evaluated periodically to ensure that internal hazards that have not been previously evaluated are identified.

  38. Quantitative Hydrocarbon Surface Analysis

    NASA Technical Reports Server (NTRS)

    Douglas, Vonnie M.

    2000-01-01

    The elimination of ozone depleting substances, such as carbon tetrachloride, has resulted in the use of new analytical techniques for cleanliness verification and contamination sampling. The last remaining application at Rocketdyne which required a replacement technique was the quantitative analysis of hydrocarbons by infrared spectrometry. This application, which previously utilized carbon tetrachloride, was successfully modified using the SOC-400, a compact portable FTIR manufactured by Surface Optics Corporation. This instrument can quantitatively measure and identify hydrocarbons from solvent flush of hardware as well as directly analyze the surface of metallic components without the use of ozone depleting chemicals. Several sampling accessories are utilized to perform analysis for various applications.

  39. 14 CFR 437.55 - Hazard analysis.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    § 437.55 Hazard analysis. (a) A permittee must identify and characterize each of the hazards and assess the risk to public health and safety and the safety of property resulting from each permitted flight. This hazard analysis must— (1) Identify...

  40. Canister storage building hazard analysis report

    SciTech Connect

    POWERS, T.B.

    1999-05-11

    This report describes the methodology used in conducting the Canister Storage Building (CSB) hazard analysis to support the CSB final safety analysis report (FSAR) and documents the results. The hazard analysis was performed in accordance with DOE-STD-3009-94, "Preparation Guide for U.S. Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports", and meets the intent of HNF-PRO-704, "Hazard and Accident Analysis Process". This hazard analysis implements the requirements of DOE Order 5480.23, "Nuclear Safety Analysis Reports".

  41. Quantitative environmental risk analysis

    SciTech Connect

    Klovning, J.; Nilsen, E.F.

    1995-12-31

    According to regulations relating to the implementation and use of risk analysis in petroleum activities issued by the Norwegian Petroleum Directorate, it is mandatory for an operator on the Norwegian Continental Shelf to establish acceptance criteria for environmental risk in its activities and to carry out environmental risk analysis. This paper presents a "new" method for environmental risk analysis developed by the company. The objective has been to assist the company to meet rules and regulations and to assess and describe the environmental risk in a systematic manner. In the environmental risk analysis, the most sensitive biological resource in the affected area is used to assess the environmental damage. The analytical method is based on the methodology for quantitative risk analysis related to loss of life. In addition, it incorporates the effect of seasonal fluctuations in the environmental risk evaluations. The paper describes the function of the main analytical sequences, exemplified through an analysis of environmental risk related to exploration drilling in an environmentally sensitive area on the Norwegian Continental Shelf.

  42. MONITORED GEOLOGIC REPOSITORY INTERNAL HAZARDS ANALYSIS

    SciTech Connect

    S.E. Salzman

    1999-11-05

    This analysis was performed by the Management and Operating Contractor (M&O) Safety Assurance Department to identify and document the internal hazards and preliminary events associated with preclosure operations of the Monitored Geologic Repository (MGR). Internal hazards are those hazards presented by operation of the facility and associated processes. These are in contrast to external hazards which involve natural phenomena and external man-made hazards. The hazard analysis methodology used in this analysis provides a systematic means to identify facility hazards and associated events that may result in radiological consequences to the public and facility worker during the MGR preclosure period. The events are documented in a preliminary events list and are intended to be used as input to the MGR Design Basis Event (DBE) selection process. It is expected that the results from this analysis will undergo further screening and analysis based on the criteria that apply to the performance of DBE analyses for the preclosure period of repository operation. As the MGR design progresses, this analysis will be reviewed to ensure no new hazards are introduced and that previously evaluated hazards have not increased in severity.

  43. Genetic toxicology at the crossroads-from qualitative hazard evaluation to quantitative risk assessment.

    PubMed

    White, Paul A; Johnson, George E

    2016-05-01

    Applied genetic toxicology is undergoing a transition from qualitative hazard identification to quantitative dose-response analysis and risk assessment. To facilitate this change, the Health and Environmental Sciences Institute (HESI) Genetic Toxicology Technical Committee (GTTC) sponsored a workshop held in Lancaster, UK on July 10-11, 2014. The event included invited speakers from several institutions, and the content was divided into three themes: (1) point-of-departure metrics for quantitative dose-response analysis in genetic toxicology; (2) measurement and estimation of exposures for better extrapolation to humans; and (3) the use of quantitative approaches in genetic toxicology for human health risk assessment (HHRA). A host of pertinent issues were discussed relating to the use of in vitro and in vivo dose-response data, the development of methods for in vitro to in vivo extrapolation, and approaches to using in vivo dose-response data to determine human exposure limits for regulatory evaluations and decision-making. This Special Issue, which was inspired by the workshop, contains a series of papers that collectively address topics related to the aforementioned themes. The Issue includes contributions that collectively evaluate, describe and discuss in silico, in vitro, in vivo and statistical approaches that are facilitating the shift from qualitative hazard evaluation to quantitative risk assessment. The use and application of the benchmark dose approach was a central theme in many of the workshop presentations and discussions, and the Special Issue includes several contributions that outline novel applications for the analysis and interpretation of genetic toxicity data. Although the contents of the Special Issue constitute an important step towards the adoption of quantitative methods for regulatory assessment of genetic toxicity, formal acceptance of quantitative methods for HHRA and regulatory decision-making will require consensus regarding the
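
    Since the benchmark dose (BMD) approach is a central theme of the workshop, the sketch below illustrates the basic calculation: fit a dose-response model to toy genotoxicity data and solve for the dose producing a 10% increase over background (BMD10). The data and the linear model form are illustrative assumptions, not methods endorsed by the Special Issue.

    ```python
    # Hedged sketch of a benchmark-dose (BMD10) calculation on toy data.
    import numpy as np
    from scipy.optimize import curve_fit, brentq

    doses = np.array([0.0, 1.0, 3.0, 10.0, 30.0])
    response = np.array([2.0, 2.1, 2.6, 4.0, 8.5])   # e.g. mutant frequency

    def model(d, a, b):
        return a * (1.0 + d / b)        # background a, slope a/b (assumed form)

    (a, b), _ = curve_fit(model, doses, response, p0=[2.0, 10.0])
    bmr = 0.10 * a                      # benchmark response: 10% over background
    bmd10 = brentq(lambda d: model(d, a, b) - (a + bmr), 1e-6, 100.0)
    print(f"background={a:.2f}, BMD10={bmd10:.2f}")
    ```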

  44. Cold Vacuum Drying Facility hazard analysis report

    SciTech Connect

    Krahn, D.E.

    1998-02-23

    This report describes the methodology used in conducting the Cold Vacuum Drying Facility (CVDF) hazard analysis to support the CVDF phase 2 safety analysis report (SAR), and documents the results. The hazard analysis was performed in accordance with DOE-STD-3009-94, Preparation Guide for US Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports, and implements the requirements of US Department of Energy (DOE) Order 5480.23, Nuclear Safety Analysis Reports.

  45. Canister storage building hazard analysis report

    SciTech Connect

    Krahn, D.E.; Garvin, L.J.

    1997-07-01

    This report describes the methodology used in conducting the Canister Storage Building (CSB) hazard analysis to support the final CSB safety analysis report (SAR) and documents the results. The hazard analysis was performed in accordance with DOE-STD-3009-94, Preparation Guide for US Department of Energy Nonreactor Nuclear Facility Safety Analysis Report, and implements the requirements of DOE Order 5480.23, Nuclear Safety Analysis Report.

  46. Development of a Probabilistic Tsunami Hazard Analysis in Japan

    SciTech Connect

    Toshiaki Sakai; Tomoyoshi Takeda; Hiroshi Soraoka; Ken Yanagisawa; Tadashi Annaka

    2006-07-01

    It is meaningful for tsunami assessment to evaluate phenomena beyond the design basis, just as in seismic design. Once a design basis tsunami height is set, there remains the possibility that tsunami heights will exceed it, owing to uncertainties regarding tsunami phenomena. Probabilistic tsunami risk assessment consists of estimating the tsunami hazard and the fragility of structures and executing system analysis. In this report, we apply a method for probabilistic tsunami hazard analysis (PTHA). We introduce a logic tree approach to estimate tsunami hazard curves (relationships between tsunami height and probability of exceedance) and present an example for Japan. Examples of tsunami hazard curves are illustrated, and uncertainty in the tsunami hazard is displayed by 5-, 16-, 50-, 84- and 95-percentile and mean hazard curves. The result of PTHA will be used for quantitative assessment of the tsunami risk for important facilities located in coastal areas. Tsunami hazard curves are reasonable input data for structure and system analysis. However, the evaluation method for estimating the fragility of structures and the procedure for system analysis are now being developed. (authors)
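
    The percentile hazard curves described above can be illustrated with a toy logic tree: each branch carries a weighted exceedance curve, and percentile curves summarize branch-to-branch uncertainty. The curves and weights below are invented placeholders, not values from the study.

    ```python
    # Hedged sketch of logic-tree hazard curves: weighted mean and crude
    # weighted percentiles across branches. All numbers are toy placeholders.
    import numpy as np

    heights = np.array([1.0, 2.0, 4.0, 8.0])   # tsunami heights (m)
    branches = np.array([                       # annual exceedance rates
        [1e-2, 3e-3, 5e-4, 3e-5],
        [2e-2, 8e-3, 1e-3, 1e-4],
        [5e-3, 1e-3, 2e-4, 1e-5],
    ])
    weights = np.array([0.5, 0.3, 0.2])         # logic-tree branch weights

    mean_curve = weights @ branches
    # crude weighted percentiles: replicate each branch according to weight
    reps = np.repeat(branches, (weights * 10).astype(int), axis=0)
    for q in (5, 16, 50, 84, 95):
        print(f"{q:2d}th percentile:", np.percentile(reps, q, axis=0))
    print("mean hazard curve:", mean_curve)
    ```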

  47. Risk analysis based on hazards interactions

    NASA Astrophysics Data System (ADS)

    Rossi, Lauro; Rudari, Roberto; Trasforini, Eva; De Angeli, Silvia; Becker, Joost

    2017-04-01

    Despite an increasing need for open, transparent, and credible multi-hazard risk assessment methods, models, and tools, the availability of comprehensive risk information needed to inform disaster risk reduction is limited, and the level of interaction across hazards is not systematically analysed. Risk assessment methodologies for different hazards often produce risk metrics that are not comparable. Hazard interactions (the consecutive occurrence of two or more different events) are generally neglected, resulting in strongly underestimated risk assessments in the most exposed areas. This study presents cases of interaction between different hazards, showing how subsidence can affect coastal and river flood risk (Jakarta and Bandung, Indonesia) or how flood risk is modified after a seismic event (Italy). The analysis of well-documented real study cases, based on a combination of Earth Observation and in-situ data, serves as the basis for the formalisation of a multi-hazard methodology, identifying gaps and research frontiers. Multi-hazard risk analysis is performed through the RASOR platform (Rapid Analysis and Spatialisation Of Risk). A scenario-driven query system allows users to simulate future scenarios based on existing and assumed conditions, to compare with historical scenarios, and to model multi-hazard risk both before and during an event (www.rasor.eu).

  48. Quantitative Techniques in Volumetric Analysis

    NASA Astrophysics Data System (ADS)

    Zimmerman, John; Jacobsen, Jerrold J.

    1996-12-01

    Quantitative Techniques in Volumetric Analysis is a visual library of techniques used in making volumetric measurements. This 40-minute VHS videotape is designed as a resource for introducing students to proper volumetric methods and procedures. The entire tape, or relevant segments of the tape, can also be used to review procedures used in subsequent experiments that rely on the traditional art of quantitative analysis laboratory practice. The techniques included are: quantitative transfer of a solid with a weighing spoon; quantitative transfer of a solid with a finger-held weighing bottle; quantitative transfer of a solid with a paper-strap-held bottle; quantitative transfer of a solid with a spatula; examples of common quantitative weighing errors; quantitative transfer of a solid from dish to beaker to volumetric flask; quantitative transfer of a solid from dish to volumetric flask; volumetric transfer pipet; a complete acid-base titration; and hand technique variations. The conventional view of contemporary quantitative chemical measurement tends to focus on instrumental systems, computers, and robotics. In this view, the analyst is relegated to placing standards and samples on a tray. A robotic arm delivers a sample to the analysis center, while a computer controls the analysis conditions and records the results. In spite of this, it is rare to find an analysis process that does not rely on some aspect of more traditional quantitative analysis techniques, such as careful dilution to the mark of a volumetric flask. [Figure 2: Transfer of a solid with a spatula.] Clearly, errors in a classical step will affect the quality of the final analysis. Because of this, it is still important for students to master the key elements of the traditional art of quantitative chemical analysis laboratory practice. Some aspects of chemical analysis, like careful rinsing to insure quantitative transfer, are often an automated part of an instrumental process that must be understood by the

  12. Integrating expert opinion with modelling for quantitative multi-hazard risk assessment in the Eastern Italian Alps

    NASA Astrophysics Data System (ADS)

    Chen, Lixia; van Westen, Cees J.; Hussin, Haydar; Ciurean, Roxana L.; Turkington, Thea; Chavarro-Rincon, Diana; Shrestha, Dhruba P.

    2016-11-01

    Extreme rainfall events are the main triggers of hydro-meteorological hazards in mountainous areas, where development is often constrained by the limited space suitable for construction. In these areas, hazard and risk assessments are fundamental for risk mitigation, especially for preventive planning, risk communication and emergency preparedness. Multi-hazard risk assessment in mountainous areas at local and regional scales remains a major challenge because of the lack of data on past events and causal factors, and because of the interactions between different types of hazards. The lack of data leads to a high level of uncertainty in the application of quantitative methods for hazard and risk assessment. Therefore, a systematic approach is required to combine these quantitative methods with expert-based assumptions and decisions. In this study, a quantitative multi-hazard risk assessment was carried out in the Fella River valley in the north-eastern Italian Alps, an area prone to debris flows and floods. The main steps include data collection and development of inventory maps, definition of hazard scenarios, hazard assessment in terms of temporal and spatial probability calculation and intensity modelling, elements-at-risk mapping, estimation of asset values and the number of people, physical vulnerability assessment, generation of risk curves and annual risk calculation. To compare the risk for each type of hazard, risk curves were generated for debris flows, river floods and flash floods. Uncertainties were expressed as minimum, average and maximum values of temporal and spatial probability, replacement costs of assets, population numbers, and physical vulnerability, resulting in minimum, average and maximum risk curves. To validate this approach, a back analysis was conducted using the extreme hydro-meteorological event that occurred in August 2003 in the Fella River valley. The results show good performance when compared with historical damage reports.
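
    A minimal numerical sketch of the last step described above, deriving an annualized risk figure from a risk curve by integrating exceedance probability over loss (all loss and probability values below are invented placeholders, not the study's results):

      # Sketch: expected annual loss from a risk curve (invented numbers).
      # A risk curve pairs loss levels with annual exceedance probabilities;
      # the expected annual loss is the area under that curve.
      import numpy as np

      losses = np.array([0.1, 1.0, 5.0, 20.0])          # loss, million EUR (assumed)
      exceed_prob = np.array([0.2, 0.05, 0.01, 0.001])  # annual exceedance probability

      # E[L] = integral of P(L > l) dl, by the trapezoidal rule.
      expected_annual_loss = np.sum(0.5 * (exceed_prob[1:] + exceed_prob[:-1]) * np.diff(losses))
      print(f"Expected annual loss: {expected_annual_loss:.3f} million EUR")

    Repeating the integration for the minimum, average and maximum curves brackets the annual risk in the same way the study brackets its inputs.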

  13. Integrating waste management with Job Hazard analysis

    SciTech Connect

    2007-07-01

    The web-based Automated Job Hazard Analysis (AJHA) system is a tool designed to help capture and communicate the results of the hazard review and mitigation process for specific work activities. In Fluor Hanford's day-to-day work planning and execution process, AJHA has become the focal point for integrating Integrated Safety Management (ISM) through industrial health and safety principles; environmental safety measures; and involvement by workers, subject-matter experts and management. This paper illustrates how AJHA has become a key element in involving waste-management and environmental-control professionals in planning and executing work. To support implementing requirements for waste management and environmental compliance within the core functions and guiding principles of an integrated safety management system (ISMS), Fluor Hanford has developed a computer-based application, the Automated Job Hazard Analysis (AJHA), and integrated it into the work management process. This web-based software tool helps integrate the knowledge of site workers and subject-matter experts with the safety principles and requirements established in standards and regulations. AJHA facilitates a process of work-site review, hazard identification, analysis, and the determination of specific work controls. The AJHA application produces a well-organized job hazard analysis report, including training and staffing requirements, prerequisite actions, notifications, and specific work controls listed for each sub-task identified for the job. AJHA lists common hazards addressed in U.S. Occupational Safety and Health Administration (OSHA) federal codes and in state regulations such as those of the Washington Industrial Safety and Health Administration (WISHA). AJHA also lists extraordinary hazards that are unique to a particular industry sector, such as radiological hazards and waste management. The work-planning team evaluates the scope of work and reviews the work site to identify potential hazards. Hazards

  14. Preliminary hazards analysis conducted on FCCU complex

    SciTech Connect

    Rooney, J.J.; Turner, J.H.; Arendt, J.S.

    1988-08-08

    A preliminary hazards analysis (PHA) was conducted on an actual fluid catalytic cracking unit (FCCU) complex at a refinery. The PHA was conducted to improve the FCCU's availability and profitability. These types of analyses can be important because the turbulent economic conditions in the refining industry have complicated the decision-making process for refinery and process plant managers. To decide how to allocate their limited resources, managers need to know the hazards that present the greatest threat to refinery profitability (economic risks) and the most cost-effective options for dealing with these hazards. An economic-risk-based preliminary hazards analysis is an efficient way to identify areas in a refinery or process unit that are at unacceptably high risk. A PHA also allows for the formulation of practical, cost-effective recommendations for increasing unit on-stream time factors.

  15. 9 CFR 417.2 - Hazard Analysis and HACCP Plan.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 9 Animals and Animal Products 2 2010-01-01 2010-01-01 false Hazard Analysis and HACCP Plan. 417.2... ANALYSIS AND CRITICAL CONTROL POINT (HACCP) SYSTEMS § 417.2 Hazard Analysis and HACCP Plan. (a) Hazard analysis. (1) Every official establishment shall conduct, or have conducted for it, a hazard analysis...

  16. Exploratory Studies Facility Subsurface Fire Hazards Analysis

    SciTech Connect

    Richard C. Logan

    2002-03-28

    The primary objective of this Fire Hazard Analysis (FHA) is to confirm that the requirements for a comprehensive fire and related hazards protection program for the Exploratory Studies Facility (ESF) are sufficient to minimize the potential for: the occurrence of a fire or related event; a fire that causes an unacceptable on-site or off-site release of hazardous or radiological material that would threaten the health and safety of employees, the public or the environment; vital U.S. Department of Energy (DOE) programs suffering unacceptable interruptions as a result of fire and related hazards; property losses from fire and related events exceeding limits established by DOE; and critical process controls and safety class systems being damaged as a result of a fire and related events.

  17. Exploratory Studies Facility Subsurface Fire Hazards Analysis

    SciTech Connect

    J. L. Kubicek

    2001-09-07

    The primary objective of this Fire Hazard Analysis (FHA) is to confirm that the requirements for a comprehensive fire and related hazards protection program for the Exploratory Studies Facility (ESF) are sufficient to minimize the potential for: (1) the occurrence of a fire or related event; (2) a fire that causes an unacceptable on-site or off-site release of hazardous or radiological material that would threaten the health and safety of employees, the public or the environment; (3) vital U.S. Department of Energy (DOE) programs suffering unacceptable interruptions as a result of fire and related hazards; (4) property losses from fire and related events exceeding limits established by DOE; and (5) critical process controls and safety class systems being damaged as a result of a fire and related events.

  18. Quantitative analysis of PET studies.

    PubMed

    Weber, Wolfgang A

    2010-09-01

    Quantitative analysis can be included relatively easily in clinical PET imaging protocols, but in order to obtain meaningful quantitative results one needs to follow a standardized protocol for image acquisition and data analysis. Important factors to consider are the calibration of the PET scanner, the radiotracer uptake time, and the approach used to define regions of interest. Using such standardized acquisition protocols, quantitative parameters of tumor metabolism or receptor status can be derived from tracer kinetic analysis or from simplified approaches such as calculation of standardized uptake values (SUVs).
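
    As an illustration of the simplified approach mentioned above, the body-weight-normalized SUV is the tissue activity concentration divided by the injected dose per unit body weight; a minimal sketch with hypothetical inputs:

      # Sketch: body-weight-normalized SUV (hypothetical input values).
      def suv_bw(tissue_conc_bq_per_ml, injected_dose_bq, body_weight_g):
          """SUV = tissue concentration / (injected dose / body weight).
          Assumes the dose is decay-corrected to scan time and ~1 g tissue per ml."""
          return tissue_conc_bq_per_ml / (injected_dose_bq / body_weight_g)

      # Example: 15 kBq/ml in a lesion, 370 MBq injected, 70 kg patient.
      print(round(suv_bw(15_000.0, 370e6, 70_000.0), 2))  # ~2.84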

  19. Lunar mission safety and rescue: Hazards analysis and safety requirements

    NASA Technical Reports Server (NTRS)

    1971-01-01

    The results of the hazards analysis, which was concerned only with hazards to personnel and not with loss of equipment or property, are presented. Hazards characterization includes the definition of a hazard, the hazard levels, and the hazard groups. The analysis methodology is described in detail. The methodology was used to prepare the top-level functional flow diagrams, to perform the first-level hazards assessment, and to develop a list of conditions and situations requiring individual hazard studies. The results of all 39 individual hazard studies are presented.

  20. Quantitative analysis in megageomorphology

    NASA Technical Reports Server (NTRS)

    Mayer, L.

    1985-01-01

    Megageomorphology is the study of regional topographic features and their relations to independent geomorphic variables that operate at the regional scale. These independent variables can be classified as either tectonic or climatic in nature. Quantitative megageomorphology stresses the causal relations between plate tectonic factors and landscape features or correlations between climatic factors and geomorphic processes. In addition, the cumulative effects of tectonics and climate on landscape evolution, which operate simultaneously in a complex system of energy transfer, are of interest. Regional topographic differentiation, say between continents and ocean floors, is largely the result of the different densities and density contrasts within the oceanic and continental lithosphere and their isostatic consequences. Regional tectonic processes that alter these lithospheric characteristics include rifting, collision, subduction, transpression and transtension.

  1. Repository Subsurface Preliminary Fire Hazard Analysis

    SciTech Connect

    Richard C. Logan

    2001-07-30

    This fire hazard analysis identifies preliminary design and operations features and fire and explosion hazards, and provides a reasonable basis for establishing the design requirements of fire protection systems during the development and emplacement phases of the subsurface repository. This document follows the Technical Work Plan (TWP) (CRWMS M&O 2001c), which was prepared in accordance with AP-2.21Q, "Quality Determinations and Planning for Scientific, Engineering, and Regulatory Compliance Activities"; Attachment 4 of AP-ESH-008, "Hazards Analysis System"; and AP-3.11Q, "Technical Reports". The objective of this report is to establish the requirements that provide for facility nuclear safety and a proper level of personnel safety and property protection from the effects of fire and the adverse effects of fire-extinguishing agents.

  2. Environmental risk analysis of hazardous material rail transportation.

    PubMed

    Saat, Mohd Rapik; Werth, Charles J; Schaeffer, David; Yoon, Hongkyu; Barkan, Christopher P L

    2014-01-15

    An important aspect of railroad environmental risk management involves tank car transportation of hazardous materials. This paper describes a quantitative, environmental risk analysis of rail transportation of a group of light, non-aqueous-phase liquid (LNAPL) chemicals commonly transported by rail in North America. The Hazardous Materials Transportation Environmental Consequence Model (HMTECM) was used in conjunction with a geographic information system (GIS) analysis of environmental characteristics to develop probabilistic estimates of exposure to different spill scenarios along the North American rail network. The risk analysis incorporated the estimated clean-up cost developed using the HMTECM, route-specific probability distributions of soil type and depth to groundwater, annual traffic volume, railcar accident rate, and tank car safety features, to estimate the nationwide annual risk of transporting each product. The annual risk per car-mile (car-km) and per ton-mile (ton-km) was also calculated to enable comparison between chemicals and to provide information on the risk cost associated with shipments of these products. The analysis and the methodology provide a quantitative approach that will enable more effective management of the environmental risk of transporting hazardous materials.
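
    The per-segment aggregation described here can be illustrated with a toy calculation; the structure (traffic x accident rate x release probability x consequence, summed over route segments) follows the abstract, but all numbers and the simplified form are assumptions, not the HMTECM itself:

      # Sketch: route-level annual environmental risk for one commodity.
      segments = [
          # (car-miles/yr, accidents per car-mile, P(release | accident), cleanup cost USD)
          (2.0e6, 1.0e-7, 0.05, 4.0e5),
          (5.0e5, 3.0e-7, 0.05, 1.2e6),  # e.g. shallow groundwater, higher cleanup cost
      ]

      annual_risk = sum(cm * rate * p_rel * cost for cm, rate, p_rel, cost in segments)
      total_car_miles = sum(cm for cm, *_ in segments)
      print(f"Annual risk: ${annual_risk:,.0f}")
      print(f"Risk per car-mile: ${annual_risk / total_car_miles:.4f}")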

  3. Seismic Hazard Analysis — Quo vadis?

    NASA Astrophysics Data System (ADS)

    Klügel, Jens-Uwe

    2008-05-01

    The paper is dedicated to the review of methods of seismic hazard analysis currently in use, analyzing the strengths and weaknesses of different approaches. The review is performed from the perspective of a user of the results of seismic hazard analysis for different applications, such as the design of critical and general (non-critical) civil infrastructures and technical and financial risk analysis. A set of criteria is developed for, and applied to, an objective assessment of the capabilities of different analysis methods. It is demonstrated that traditional probabilistic seismic hazard analysis (PSHA) methods have significant deficiencies, thus limiting their practical applications. These deficiencies have their roots in the use of inadequate probabilistic models and an insufficient understanding of modern concepts of risk analysis, as has been revealed in some recent large-scale studies. They result in an inability to treat dependencies between physical parameters correctly and, finally, in an incorrect treatment of uncertainties. As a consequence, results of PSHA studies have been found to be unrealistic in comparison with empirical information from the real world. The attempt to compensate for these problems by a systematic use of expert elicitation has, so far, not resulted in any improvement of the situation. It is also shown that scenario earthquakes developed by disaggregation from the results of a traditional PSHA may not be conservative with respect to energy conservation and should not be used for the design of critical infrastructures without validation. Because the assessment of technical as well as financial risks associated with potential earthquake damage requires a risk analysis, the current method is based on a probabilistic approach with its unsolved deficiencies. Traditional deterministic or scenario-based seismic hazard analysis methods provide a reliable and in general robust design basis for applications such as the design

  4. 21 CFR 120.7 - Hazard analysis.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 2 2010-04-01 2010-04-01 false Hazard analysis. 120.7 Section 120.7 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) FOOD FOR HUMAN... employee hygiene, to determine the potential effect of each on the safety of the finished food for the...

  5. 21 CFR 120.7 - Hazard analysis.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 21 Food and Drugs 2 2011-04-01 2011-04-01 false Hazard analysis. 120.7 Section 120.7 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) FOOD FOR HUMAN... employee hygiene, to determine the potential effect of each on the safety of the finished food for the...

  6. 21 CFR 120.7 - Hazard analysis.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 21 Food and Drugs 2 2012-04-01 2012-04-01 false Hazard analysis. 120.7 Section 120.7 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) FOOD FOR HUMAN... employee hygiene, to determine the potential effect of each on the safety of the finished food for the...

  7. 21 CFR 120.7 - Hazard analysis.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 21 Food and Drugs 2 2013-04-01 2013-04-01 false Hazard analysis. 120.7 Section 120.7 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) FOOD FOR HUMAN... employee hygiene, to determine the potential effect of each on the safety of the finished food for the...

  8. 21 CFR 120.7 - Hazard analysis.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 21 Food and Drugs 2 2014-04-01 2014-04-01 false Hazard analysis. 120.7 Section 120.7 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) FOOD FOR HUMAN... employee hygiene, to determine the potential effect of each on the safety of the finished food for the...

  9. Probabilistic earthquake hazard analysis for Cairo, Egypt

    NASA Astrophysics Data System (ADS)

    Badawy, Ahmed; Korrat, Ibrahim; El-Hadidy, Mahmoud; Gaber, Hanan

    2016-04-01

    Cairo is the capital of Egypt, the largest city in the Arab world and Africa, and the sixteenth-largest metropolitan area in the world. It was founded in the tenth century (969 AD) and is 1046 years old. It has long been a center of the region's political and cultural life. Therefore, earthquake risk assessment for Cairo has great importance. The present work aims to analyze the earthquake hazard of Cairo as a key input element for the risk assessment. The regional seismotectonic setting shows that Cairo could be affected by both far- and near-field seismic sources. The seismic hazard of Cairo has been estimated using the probabilistic seismic hazard approach. A logic-tree framework was used during the calculations. Epistemic uncertainties were taken into account by using alternative seismotectonic models and alternative ground-motion prediction equations. Seismic hazard values have been estimated within a grid of 0.1° × 0.1° spacing for all of Cairo's districts at different spectral periods and four return periods (224, 615, 1230, and 4745 years). Moreover, the uniform hazard spectra have been calculated at the same return periods. The pattern of the contour maps shows that the highest values of peak ground acceleration are concentrated in the eastern districts (e.g., El Nozha) and the lowest values in the northern and western districts (e.g., El Sharabiya and El Khalifa).
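
    The hazard values described above ultimately come from the standard PSHA recurrence-times-ground-motion integration; a single-source sketch follows. The recurrence rate, magnitude weights, distance, and the toy ground-motion model are all invented for illustration and are not the parameters of the Cairo study:

      # Sketch: single-source PSHA hazard curve (all parameters invented).
      # lambda(PGA > a) = nu * sum_M P(M) * P(PGA > a | M, R)
      import numpy as np
      from scipy.stats import norm

      nu = 0.2                                  # annual rate of M >= 5 events
      mags = np.array([5.25, 5.75, 6.25, 6.75])
      w = 10.0 ** (-(mags - 5.0))               # Gutenberg-Richter weights, b = 1
      w /= w.sum()
      R = 30.0                                  # source-to-site distance, km

      def p_exceed(a_g, m, r, sigma=0.6):
          # Toy ground-motion model: ln PGA = -3.5 + 0.9*M - 1.1*ln(r)
          mu = -3.5 + 0.9 * m - 1.1 * np.log(r)
          return norm.sf((np.log(a_g) - mu) / sigma)

      for a in (0.05, 0.1, 0.2, 0.4):           # PGA levels, g
          lam = nu * np.sum(w * p_exceed(a, mags, R))
          print(f"PGA > {a:4.2f} g: {lam:.2e} /yr (~{1 / lam:,.0f}-yr return period)")

    In a full logic-tree analysis this calculation is repeated for each branch (seismotectonic model and ground-motion prediction equation) and the branch results are weighted.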

  10. Preliminary Hazards Analysis Plasma Hearth Process

    SciTech Connect

    Aycock, M.; Coordes, D.; Russell, J.; TenBrook, W.; Yimbo, P.

    1993-11-01

    This Preliminary Hazards Analysis (PHA) for the Plasma Hearth Process (PHP) follows the requirements of United States Department of Energy (DOE) Order 5480.23 (DOE, 1992a), DOE Order 5480.21 (DOE, 1991d), DOE Order 5480.22 (DOE, 1992c), DOE Order 5481.1B (DOE, 1986), and the guidance provided in DOE Standard DOE-STD-1027-92 (DOE, 1992b). Consideration is given to the proposed regulations published as 10 CFR 830 (DOE, 1993) and DOE Safety Guide SG 830.110 (DOE, 1992b). The purpose of performing a PHA is to establish an initial hazard categorization for a DOE nuclear facility and to identify those processes and structures which may have an impact on or be important to safety. The PHA is typically performed during, and provides input to, project conceptual design. The PHA is then followed by a Preliminary Safety Analysis Report (PSAR) performed during Title I and II design. The PSAR in turn leads to the Final Safety Analysis Report, performed during construction, testing, and acceptance and completed before routine operation. Radiological assessments indicate that a PHP facility, depending on the radioactive material inventory, may be an exempt, Category 3, or Category 2 facility. The calculated impacts would result in no significant impact to offsite personnel or the environment. Hazardous material assessments indicate that a PHP facility will be a Low Hazard facility having no significant impacts either onsite or offsite to personnel and the environment.

  11. A situational analysis of priority disaster hazards in Uganda: findings from a hazard and vulnerability analysis.

    PubMed

    Mayega, R W; Wafula, M R; Musenero, M; Omale, A; Kiguli, J; Orach, G C; Kabagambe, G; Bazeyo, W

    2013-06-01

    Most countries in sub-Saharan Africa have not conducted a disaster risk analysis. Hazard and vulnerability analyses provide vital information that can be used for the development of risk reduction and disaster response plans. The purpose of this study was to rank disaster hazards for Uganda, as a basis for identifying the priority hazards to guide disaster management planning. The study was conducted in Uganda as part of a multi-country assessment. A hazard, vulnerability and capacity analysis was conducted in a focus group discussion of 7 experts representing key stakeholder agencies in disaster management in Uganda. A simple ranking method was used to rank the probability of occurrence of the 11 top hazards, their potential impact, and the level of vulnerability of people and infrastructure, as sketched after this paragraph. In terms of likelihood of occurrence and potential impact, the top-ranked disaster hazards in Uganda are: 1) epidemics of infectious diseases, 2) drought/famine, and 3) conflict and environmental degradation, in that order. In terms of vulnerability, the top priority hazards to which people and infrastructure were vulnerable were: 1) conflicts, 2) epidemics, 3) drought/famine and 4) environmental degradation, in that order. Poverty, gender, lack of information, and lack of resilience measures were some of the factors promoting vulnerability to disasters. As Uganda develops a disaster risk reduction and response plan, it ought to prioritize epidemics of infectious diseases, drought/famine, conflicts and environmental degradation as the priority disaster hazards.
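
    A simple ranking of this kind reduces to a composite score per hazard; the sketch below multiplies ordinal likelihood, impact and vulnerability scores (the scores are illustrative placeholders, not the panel's actual ratings):

      # Sketch: composite hazard ranking from ordinal scores (1 = low ... 5 = high).
      hazards = {
          "Epidemics of infectious diseases": (5, 5, 4),  # likelihood, impact, vulnerability
          "Drought/famine":                   (4, 5, 4),
          "Conflict":                         (3, 5, 5),
          "Environmental degradation":        (4, 3, 3),
      }

      ranked = sorted(hazards.items(),
                      key=lambda kv: kv[1][0] * kv[1][1] * kv[1][2],
                      reverse=True)
      for name, (p, i, v) in ranked:
          print(f"{name:35s} score = {p * i * v}")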

  12. Software for quantitative trait analysis.

    PubMed

    Almasy, Laura; Warren, Diane M

    2005-09-01

    This paper provides a brief overview of software currently available for the genetic analysis of quantitative traits in humans. Programs that implement variance components, Markov Chain Monte Carlo (MCMC), Haseman-Elston (H-E) and penetrance model-based linkage analyses are discussed, as are programs for measured genotype association analyses and quantitative trait transmission disequilibrium tests. The software compared includes LINKAGE, FASTLINK, PAP, SOLAR, SEGPATH, ACT, Mx, MERLIN, GENEHUNTER, Loki, Mendel, SAGE, QTDT and FBAT. Where possible, the paper provides URLs for acquiring these programs through the internet, details of the platforms for which the software is available and the types of analyses performed.

  13. Software for quantitative trait analysis

    PubMed Central

    2005-01-01

    This paper provides a brief overview of software currently available for the genetic analysis of quantitative traits in humans. Programs that implement variance components, Markov Chain Monte Carlo (MCMC), Haseman-Elston (H-E) and penetrance model-based linkage analyses are discussed, as are programs for measured genotype association analyses and quantitative trait transmission disequilibrium tests. The software compared includes LINKAGE, FASTLINK, PAP, SOLAR, SEGPATH, ACT, Mx, MERLIN, GENEHUNTER, Loki, Mendel, SAGE, QTDT and FBAT. Where possible, the paper provides URLs for acquiring these programs through the internet, details of the platforms for which the software is available and the types of analyses performed. PMID:16197737

  14. A practical approach to fire hazard analysis for offshore structures.

    PubMed

    Krueger, Joel; Smith, Duncan

    2003-11-14

    Offshore quantitative risk assessments (QRA) have historically been complex and costly. For large offshore design projects, the level of detail required for a QRA is often not available until well into the detailed design phase of the project. In these cases, the QRA may be unable to provide timely hazard understanding. As a result, the risk reduction measures identified often come too late to allow cost-effective changes to be implemented. This forces project management to make a number of difficult or costly decisions. This paper demonstrates how a scenario-based approach to fire risk assessment can be effectively applied early in a project's development. The scenario, or design-basis fire, approach calculates the consequences of a select number of credible fire scenarios, determines the potential impact on the platform's process equipment, structural members, egress routes and safety systems, and determines the effectiveness of potential options for mitigation. The early provision of hazard data allows the project team to select an optimum design that is safe and will meet corporate or regulatory risk criteria later in the project cycle. The focus of this paper is on the application of the scenario-based approach to gas jet fires. This paper draws on recent experience in the Gulf of Mexico (GOM) and other areas to outline an approach to fire hazard analysis and fire hazard management for deep-water structures. The methods presented include discussions from the June 2002 International Workshop for Fire Loading and Response.

  15. Quantitative rock-fall hazard and risk assessment for Yosemite Valley, California

    NASA Astrophysics Data System (ADS)

    Stock, G. M.; Luco, N.; Collins, B. D.; Harp, E.; Reichenbach, P.; Frankel, K. L.

    2011-12-01

    Rock falls are a considerable hazard in Yosemite Valley, California, with more than 835 rock falls and other slope movements documented since 1857. Rock falls therefore pose potentially significant risk to the nearly four million annual visitors to Yosemite National Park. Building on earlier hazard assessment work by the U.S. Geological Survey, we performed a quantitative rock-fall hazard and risk assessment for Yosemite Valley. This work was aided by several new data sets, including precise Geographic Information System (GIS) maps of rock-fall deposits, airborne and terrestrial LiDAR-based point cloud data and digital elevation models, and numerical ages of talus deposits. Using the Global Positioning System (GPS), we mapped the positions of over 500 boulders on the valley floor and measured their distance relative to the mapped base of talus. Statistical analyses of these data yielded an initial hazard zone based on the 90th-percentile distance of rock-fall boulders beyond the talus edge. This distance was subsequently scaled (either inward or outward from the 90th-percentile line) based on rock-fall frequency information derived from a combination of cosmogenic beryllium-10 exposure dating of boulders beyond the edge of the talus and computer model simulations of rock-fall runout. The scaled distances provide the basis for a new hazard zone on the floor of Yosemite Valley. Once this zone was delineated, we assembled visitor, employee, and resident use data for each structure within the hazard zone to quantitatively assess risk exposure. Our results identify areas within the new hazard zone that may warrant more detailed study, for example of rock-fall susceptibility, which can be assessed through examination of high-resolution photographs, structural measurements on the cliffs, and empirical calculations derived from LiDAR point cloud data. This hazard and risk information is used to inform the placement of existing and potential future infrastructure in Yosemite Valley.
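
    The initial 90th-percentile hazard line is straightforward to reproduce once boulder distances beyond the talus edge are tabulated; a sketch with made-up distances follows (the frequency factor stands in for the cosmogenic-dating and runout-modelling scaling step):

      # Sketch: percentile-based rock-fall hazard line (made-up sample data).
      import numpy as np

      # Boulder distances beyond the mapped talus edge, metres.
      distances = np.array([2.0, 5.5, 8.0, 11.0, 14.5, 18.0, 23.0, 31.0, 42.0, 60.0])

      p90 = np.percentile(distances, 90)   # initial hazard-zone offset
      frequency_factor = 1.0               # >1 moves the line outward, <1 inward
      print(f"Hazard-zone offset beyond talus edge: {p90 * frequency_factor:.1f} m")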

  16. Decision analysis for INEL hazardous waste storage

    SciTech Connect

    Page, L.A.; Roach, J.A.

    1994-01-01

    In mid-November 1993, the Idaho National Engineering Laboratory (INEL) Waste Reduction Operations Complex (WROC) Manager requested that the INEL Hazardous Waste Type Manager perform a decision analysis to determine whether or not a new Hazardous Waste Storage Facility (HWSF) was needed to store INEL hazardous waste (HW). In response to this request, a team was formed to perform a decision analysis for recommending the best configuration for storage of INEL HW. Personnel who participated in the decision analysis are listed in Appendix B. The results of the analysis indicate that the existing HWSF is not the best configuration for storage of INEL HW. The analysis detailed in Appendix C concludes that the best HW storage configuration would be to modify and use a portion of the Waste Experimental Reduction Facility (WERF) Waste Storage Building (WWSB), PBF-623 (Alternative 3). This facility was constructed in 1991 to serve as a waste staging facility for WERF incineration. The modifications include an extension of the current Room 105 across the south end of the WWSB and installing heating, ventilation, and bay curbing, which would provide approximately 1,600 ft² of isolated HW storage area. Negotiations with the State to discuss aisle space requirements along with modifications to WWSB operating procedures are also necessary. The process to begin utilizing the WWSB for HW storage includes planned closure of the HWSF, modification to the WWSB, and relocation of the HW inventory. The cost to modify the WWSB can be funded by a reallocation of funding currently identified to correct HWSF deficiencies.

  17. Economic Analysis of Hazardous Waste Minimization Alternatives

    DTIC Science & Technology

    1992-08-01

    [Scanned report documentation page; only reference fragments are recoverable: (Consulting Associates, Inc., 15 June 1987), Chapter 7; T. Page, R. Harris, and J. Bruser, Removal of Carcinogens from Drinking Water: A Cost-Benefit Analysis; and Pretreatment of Hazardous Waste, EPA/600/D-87/047 (EPA, January 1987), pp. 58-70.]

  18. 14 CFR 417.223 - Flight hazard area analysis.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... to control the risk to the public from debris impact hazards. The risk management requirements of... debris hazards; (6) Debris identified by the debris analysis of § 417.211; and (7) All launch vehicle... dispersion area for each planned debris impact. A flight hazard areas analysis must establish the aircraft...

  19. 14 CFR 417.223 - Flight hazard area analysis.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... to control the risk to the public from debris impact hazards. The risk management requirements of... debris hazards; (6) Debris identified by the debris analysis of § 417.211; and (7) All launch vehicle... dispersion area for each planned debris impact. A flight hazard areas analysis must establish the aircraft...

  20. 14 CFR 417.223 - Flight hazard area analysis.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... to control the risk to the public from debris impact hazards. The risk management requirements of... debris hazards; (6) Debris identified by the debris analysis of § 417.211; and (7) All launch vehicle... dispersion area for each planned debris impact. A flight hazard areas analysis must establish the aircraft...

  1. 14 CFR 417.223 - Flight hazard area analysis.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... to control the risk to the public from debris impact hazards. The risk management requirements of... debris hazards; (6) Debris identified by the debris analysis of § 417.211; and (7) All launch vehicle... dispersion area for each planned debris impact. A flight hazard areas analysis must establish the aircraft...

  2. 14 CFR 417.223 - Flight hazard area analysis.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... to control the risk to the public from debris impact hazards. The risk management requirements of... debris hazards; (6) Debris identified by the debris analysis of § 417.211; and (7) All launch vehicle... dispersion area for each planned debris impact. A flight hazard areas analysis must establish the aircraft...

  3. 9 CFR 417.2 - Hazard Analysis and HACCP Plan.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... (a) of this section, which must be controlled for each process. (2) List the critical control points... ANALYSIS AND CRITICAL CONTROL POINT (HACCP) SYSTEMS § 417.2 Hazard Analysis and HACCP Plan. (a) Hazard... determine the food safety hazards reasonably likely to occur in the production process and identify...

  4. Quantitative analysis of glycated proteins.

    PubMed

    Priego-Capote, Feliciano; Ramírez-Boo, María; Finamore, Francesco; Gluck, Florent; Sanchez, Jean-Charles

    2014-02-07

    The proposed protocol presents a comprehensive approach for large-scale qualitative and quantitative analysis of glycated proteins (GP) in complex biological samples including biological fluids and cell lysates such as plasma and red blood cells. The method, named glycation isotopic labeling (GIL), is based on the differential labeling of proteins with isotopic [(13)C6]-glucose, which supports quantitation of the resulting glycated peptides after enzymatic digestion with endoproteinase Glu-C. The key principle of the GIL approach is the detection of doublet signals for each glycated peptide in MS precursor scanning (glycated peptide with in vivo [(12)C6]- and in vitro [(13)C6]-glucose). The mass shift of the doublet signals is +6, +3 or +2 Da depending on the peptide charge state and the number of glycation sites. The intensity ratio between doublet signals generates quantitative information of glycated proteins that can be related to the glycemic state of the studied samples. Tandem mass spectrometry with high-energy collisional dissociation (HCD-MS2) and data-dependent methods with collision-induced dissociation (CID-MS3 neutral loss scan) are used for qualitative analysis.
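
    The doublet spacing quoted above can be made concrete: on the m/z axis, the gap between the in vivo (12C6) and in vitro (13C6) forms is six 13C-12C mass differences per glycation site, divided by the charge. A small sketch (the isotope mass delta is the standard value; everything else follows from the abstract):

      # Sketch: expected m/z spacing of GIL doublets.
      C13_C12_DELTA = 1.003355  # Da per 13C-for-12C substitution

      def doublet_spacing(n_glycation_sites, charge):
          """m/z gap between light and heavy glycated forms."""
          return n_glycation_sites * 6 * C13_C12_DELTA / charge

      for z in (1, 2, 3):
          print(f"1 site, z={z}: {doublet_spacing(1, z):.3f} m/z")  # ~6, ~3, ~2 Da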

  5. Bioimaging for quantitative phenotype analysis.

    PubMed

    Chen, Weiyang; Xia, Xian; Huang, Yi; Chen, Xingwei; Han, Jing-Dong J

    2016-06-01

    With the development of bio-imaging techniques, an increasing number of studies apply these techniques to generate a myriad of image data. Its applications range from quantification of cellular, tissue, organismal and behavioral phenotypes of model organisms, to human facial phenotypes. The bio-imaging approaches to automatically detect, quantify, and profile phenotypic changes related to specific biological questions open new doors to studying phenotype-genotype associations and to precisely evaluating molecular changes associated with quantitative phenotypes. Here, we review major applications of bioimage-based quantitative phenotype analysis. Specifically, we describe the biological questions and experimental needs addressable by these analyses, computational techniques and tools that are available in these contexts, and the new perspectives on phenotype-genotype association uncovered by such analyses.

  6. Lost in translation? The hazards of applying social constructionism to quantitative research on sexual orientation development.

    PubMed

    Robboy, Caroline Alex

    2002-01-01

    This article explores the hazards faced by social constructionists who attempt to conduct quantitative research on sexual orientation development. By critically reviewing two quantitative research studies, this article explores the ways in which the very nature of social constructionist arguments may be incongruous with the methodological requirements of quantitative studies. I suggest this conflict is a result of the differing natures of these two modes of scholarly inquiry. While research requires the acceptance of certain analytical categories, the strength of social constructionism comes from its reflexive scrutiny and problematization of those very categories. Ultimately, social constructionists who try to apply their theories/perspectives must necessarily conform to the methodological constraints of quantitative research. The intent of this article is not to suggest that it is futile or self-contradictory for social constructionists to attempt empirical research, but that these are two distinct modes of scholarly inquiry which can, and should, co-exist in a dialectical relationship to each other.

  7. Quantitative physical models of volcanic phenomena for hazards assessment of critical infrastructures

    NASA Astrophysics Data System (ADS)

    Costa, Antonio

    2016-04-01

    Volcanic hazards may have destructive effects on economy, transport, and natural environments at both local and regional scale. Hazardous phenomena include pyroclastic density currents, tephra fall, gas emissions, lava flows, debris flows and avalanches, and lahars. Volcanic hazards assessment is based on available information to characterize potential volcanic sources in the region of interest and to determine whether specific volcanic phenomena might reach a given site. It is focussed on estimating the distances that volcanic phenomena could travel from potential sources and their intensity at the considered site. Epistemic and aleatory uncertainties strongly affect the resulting hazards assessment. Within the context of critical infrastructures, volcanic eruptions are rare natural events that can create severe hazards. In addition to being rare events, evidence of many past volcanic eruptions is poorly preserved in the geologic record. The models used for describing the impact of volcanic phenomena span a range of complexities, from simplified physics-based conceptual models to highly coupled thermo-fluid-dynamic approaches. Modelling approaches represent a hierarchy of complexity, which reflects increasing requirements for well-characterized data in order to produce a broader range of output information. In selecting models for the hazard analysis related to a specific phenomenon, the questions that need to be answered by the models must be carefully considered. Independently of the model, the final hazards assessment strongly depends on input derived from detailed volcanological investigations, such as mapping and stratigraphic correlations. For each phenomenon, an overview of currently available approaches for the evaluation of future hazards will be presented, with the aim of providing a foundation for future work in developing an international consensus on volcanic hazards assessment methods.

  8. Comparative analysis of hazardous household waste in two Mexican regions.

    PubMed

    Delgado, Otoniel Buenrostro; Ojeda-Benítez, Sara; Márquez-Benavides, Liliana

    2007-01-01

    Household hazardous waste (HHW) generation in two Mexican regions was examined: a northern region (bordering the USA) and a central region. The aim of this work was to determine the dynamics of solid waste generation and to compare the results of the two regions regarding consumption patterns and solid waste generation rates. In the northern region, household solid waste was analysed quantitatively. In order to perform this analysis, the population was categorized into three socioeconomic strata (lower, middle, upper). Waste characterization revealed the presence of products that give rise to household hazardous waste. In the northern region (Mexicali city), household hazardous waste comprised 3.7% of municipal solid waste; the largest categories in this fraction were home care products (29.2%), cleaning products (19.5%), and batteries and electronic equipment (15.7%). In the central region, HHW comprised 1.03% of municipal solid waste; the main categories in this fraction were cleaning products (39%), self-care products (27.3%), and insecticides (14.4%). In Mexicali, the socioeconomic study demonstrated that the production of HHW is independent of income level. Furthermore, the composition of the solid waste stream in both regions suggested the influence of another set of variables, such as local climate, migration patterns and marketing coverage. Further research is needed in order to establish the effect of low quantities of HHW upon the environment and public health.

  9. Preliminary hazards analysis for the National Ignition Facility

    SciTech Connect

    Brereton, S.J.

    1993-10-01

    This report documents the Preliminary Hazards Analysis (PHA) for the National Ignition Facility (NIF). In summary, it provides: a general description of the facility and its operation; identification of hazards at the facility; and details of the hazards analysis, including inventories, bounding releases, consequences, and conclusions. As part of the safety analysis procedure set forth by DOE, a PHA must be performed for the NIF. The PHA characterizes the level of intrinsic potential hazard associated with a facility, and provides the basis for hazard classification. The hazard classification determines the level of safety documentation required, and the DOE Order governing the safety analysis. The hazard classification also determines the level of review and approval required for the safety analysis report. The hazards of primary concern associated with NIF are radiological and toxicological in nature. The hazard classification is determined by comparing facility inventories of radionuclides and chemicals with threshold values for the various hazard classification levels and by examining postulated bounding accidents associated with the hazards of greatest significance. Such postulated bounding accidents cannot take into account active mitigative features; they must assume the unmitigated consequences of a release, taking into account only passive safety features. In this way, the intrinsic hazard level of the facility can be ascertained.

  10. Remote rainfall sensing for landslide hazard analysis

    USGS Publications Warehouse

    Wieczorek, Gerald F.; McWreath, Harry; Davenport, Clay

    2001-01-01

    Methods of assessing landslide hazards and providing warnings are becoming more advanced as remote sensing of rainfall provides more detailed temporal and spatial data on rainfall distribution. Two recent landslide disasters are examined noting the potential for using remotely sensed rainfall data for landslide hazard analysis. For the June 27, 1995, storm in Madison County, Virginia, USA, National Weather Service WSR-88D Doppler radar provided rainfall estimates based on a relation between cloud reflectivity and moisture content on a 1 sq. km. resolution every 6 minutes. Ground-based measurements of rainfall intensity and precipitation total, in addition to landslide timing and distribution, were compared with the radar-derived rainfall data. For the December 14-16, 1999, storm in Vargas State, Venezuela, infrared sensing from the GOES-8 satellite of cloud top temperatures provided the basis for NOAA/NESDIS rainfall estimates on a 16 sq. km. resolution every 30 minutes. These rainfall estimates were also compared with ground-based measurements of rainfall and landslide distribution. In both examples, the remotely sensed data either overestimated or underestimated ground-based values by up to a factor of 2. The factors that influenced the accuracy of rainfall data include spatial registration and map projection, as well as prevailing wind direction, cloud orientation, and topography.
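
    Radar rainfall estimates of the kind used for the Madison County storm are commonly derived from a reflectivity-rain-rate (Z-R) power law; the sketch below uses the conventional WSR-88D convective parameters (a = 300, b = 1.4), which are an assumption here since the abstract does not state the operational relation:

      # Sketch: rain rate from radar reflectivity via Z = a * R**b.
      def rain_rate_mm_per_hr(dbz, a=300.0, b=1.4):
          z_linear = 10.0 ** (dbz / 10.0)   # dBZ -> linear Z, mm^6/m^3
          return (z_linear / a) ** (1.0 / b)

      for dbz in (30, 40, 50):
          print(f"{dbz} dBZ -> {rain_rate_mm_per_hr(dbz):5.1f} mm/h")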

  11. Regional Frequency Analysis of Ocean Hazard

    NASA Astrophysics Data System (ADS)

    Bernardara, Pietro; Weiss, Jerome; Benoit, Michel; Andreewsky, Marc

    2015-04-01

    The estimation of extreme return levels (up to an annual exceedance probability of 10⁻⁴) of natural phenomena is a very uncertain exercise when extrapolating from the information and measurements collected at a single site. The aim of Regional Frequency Analysis (RFA) is to benefit from the information contained in observations and data collected not only at the site of interest but across a larger set of sites, located in the same region as the site of interest or sharing similar characteristics with it. This technique was introduced in the 1960s and is widely used in various domains including hydrology and meteorology. RFA was recently acknowledged as a potential choice for the estimation of flooding hazard in the methodological guide for flooding hazard estimation [1], published in 2013 by the French Nuclear Safety Authority. The aim of this presentation is to introduce the main concepts of RFA and illustrate the latest innovations in its application, delivered by EDF R&D. These concern the statistical definition of storms, the formation of homogeneous regions, and a new approach for filtering the redundant information linked to the spatial correlation of natural phenomena. Applications to skew surges and waves will be shown. [1] ASN, Guide pour la protection des installations nucléaires de base contre les inondations externes (ASN, 2013), p. 44.
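
    The core RFA step, pooling normalized observations from a homogeneous region, can be sketched in a few lines (index-flood-style normalization; the site data are invented):

      # Sketch: index-flood regional pooling (invented site data).
      # Each site's maxima are scaled by a local index (here the site mean),
      # pooled into one regional sample, and a regional growth factor is read off.
      import numpy as np

      site_maxima = {
          "A": np.array([1.2, 1.5, 1.1, 2.0, 1.7]),   # e.g. skew surge, m
          "B": np.array([0.8, 1.1, 0.9, 1.4]),
          "C": np.array([2.1, 2.6, 1.9, 3.0, 2.4]),
      }

      pooled = np.concatenate([x / x.mean() for x in site_maxima.values()])
      growth_90 = np.quantile(pooled, 0.9)            # regional growth factor
      # Back-transform to the site of interest via its own index value:
      print(f"Site A 90th-percentile estimate: {growth_90 * site_maxima['A'].mean():.2f} m")

    In practice the pooled sample would be fitted with an extreme-value distribution rather than read off empirically, and extremes at nearby sites must first be de-clustered to filter redundant, spatially correlated observations.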

  12. Hazard Analysis for Building 34 Vacuum Glove Box Assembly

    NASA Technical Reports Server (NTRS)

    Meginnis, Ian

    2014-01-01

    One of the characteristics of an effective safety program is the recognition and control of hazards before mishaps or failures occur. Conducting potentially hazardous tests necessitates a thorough hazard analysis in order to prevent injury to personnel, and to prevent damage to facilities and equipment. The primary purpose of this hazard analysis is to define and address the potential hazards and controls associated with the Building 34 Vacuum Glove Box Assembly, and to provide the applicable team of personnel with the documented results. It is imperative that each member of the team be familiar with the hazards and controls associated with his/her particular tasks, assignments and activities while interfacing with facility test systems, equipment and hardware. In fulfillment of the stated purposes, the goal of this hazard analysis is to identify all hazards that have the potential to harm personnel, damage the facility or its test systems or equipment, test articles, Government or personal property, or the environment. This analysis may also assess the significance and risk, when applicable, of lost test objectives when substantial monetary value is involved. The hazards, causes, controls, verifications, and risk assessment codes have been documented on the hazard analysis work sheets in Appendix A of this document. The preparation and development of this report is in accordance with JPR 1700.1, "JSC Safety and Health Handbook" and JSC 17773 Rev D "Instructions for Preparation of Hazard Analysis for JSC Ground Operations".
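
    Risk assessment codes of the kind recorded on the worksheets combine a severity rating with a likelihood rating through a lookup matrix; the sketch below shows the mechanics with an illustrative matrix, not the actual JPR 1700.1 table:

      # Sketch: generic risk assessment code (RAC) lookup (illustrative matrix only).
      SEVERITY = {"catastrophic": 0, "critical": 1, "marginal": 2, "negligible": 3}
      LIKELIHOOD = {"frequent": 0, "probable": 1, "occasional": 2, "remote": 3}

      # RAC 1 = highest risk ... 5 = lowest (rows: severity, cols: likelihood).
      RAC = [
          [1, 1, 2, 3],
          [1, 2, 3, 4],
          [2, 3, 4, 5],
          [3, 4, 5, 5],
      ]

      def rac(severity, likelihood):
          return RAC[SEVERITY[severity]][LIKELIHOOD[likelihood]]

      print(rac("critical", "occasional"))  # -> 3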

  13. 14 CFR 417.227 - Toxic release hazard analysis.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 14 Aeronautics and Space 4 2010-01-01 2010-01-01 false Toxic release hazard analysis. 417.227..., DEPARTMENT OF TRANSPORTATION LICENSING LAUNCH SAFETY Flight Safety Analysis § 417.227 Toxic release hazard analysis. A flight safety analysis must establish flight commit criteria that protect the public from...

  14. 327 Building fire hazards analysis implementation plan

    SciTech Connect

    BARILO, N.F.

    1999-05-10

    In March 1998, the 327 Building Fire Hazards Analysis (FHA) (Reference 1) was approved by the U.S. Department of Energy, Richland Operations Office (DOE-RL) for implementation by B&W Hanford Company (BWHC). The purpose of the FHA was to identify gaps in compliance with DOE Order 5480.7A (Reference 2) and Richland Operations Office Implementation Directive (RLID) 5480.7 (Reference 3), especially in regard to loss limitation. The FHA identified compliance gaps in five areas and provided nine recommendations (11 items) to bring the 327 Building into compliance. A status is provided for each recommendation in this document. BWHC will use this Implementation Plan to bring the 327 Building and its operation into compliance with DOE Order 5480.7A and RLID 5480.7.

  15. Uncertainty Analysis and Expert Judgment in Seismic Hazard Analysis

    NASA Astrophysics Data System (ADS)

    Klügel, Jens-Uwe

    2011-01-01

    The large uncertainty associated with the prediction of future earthquakes is usually regarded as the main reason for increased hazard estimates which have resulted from some recent large scale probabilistic seismic hazard analysis studies (e.g. the PEGASOS study in Switzerland and the Yucca Mountain study in the USA). It is frequently overlooked that such increased hazard estimates are characteristic for a single specific method of probabilistic seismic hazard analysis (PSHA): the traditional (Cornell-McGuire) PSHA method which has found its highest level of sophistication in the SSHAC probability method. Based on a review of the SSHAC probability model and its application in the PEGASOS project, it is shown that the surprising results of recent PSHA studies can be explained to a large extent by the uncertainty model used in traditional PSHA, which deviates from the state of the art in mathematics and risk analysis. This uncertainty model, the Ang-Tang uncertainty model, mixes concepts of decision theory with probabilistic hazard assessment methods leading to an overestimation of uncertainty in comparison to empirical evidence. Although expert knowledge can be a valuable source of scientific information, its incorporation into the SSHAC probability method does not resolve the issue of inflating uncertainties in PSHA results. Other, more data driven, PSHA approaches in use in some European countries are less vulnerable to this effect. The most valuable alternative to traditional PSHA is the direct probabilistic scenario-based approach, which is closely linked with emerging neo-deterministic methods based on waveform modelling.

  16. The use of hazards analysis in the development of training

    SciTech Connect

    Houghton, F.K.

    1998-03-01

    When training for a job in which human error has the potential to produce catastrophic results, an understanding of the hazards that may be encountered is of paramount importance. In high-consequence activities, it is important that the training program be conducted in a safe environment and yet emphasize the potential hazards. Because of the high consequence of a human error, the use of a high-fidelity simulation is of great importance in providing the safe environment the worker needs to learn and hone required skills. A hazards analysis identifies the operational hazards, potential human errors, and associated positive measures that aid in the mitigation or prevention of the hazard. The information gained from the hazards analysis should be used in the development of training. This paper will discuss the integration of information from the hazards analysis into the development of simulation components of a training program.

  17. Fire hazard analysis for the fuel supply shutdown storage buildings

    SciTech Connect

    REMAIZE, J.A.

    2000-09-27

    The purpose of a fire hazards analysis (FHA) is to comprehensively assess the risk from fire and other perils within individual fire areas in a DOE facility in relation to proposed fire protection so as to ascertain whether the objectives of DOE 5480.7A, Fire Protection, are met. This Fire Hazards Analysis was prepared as required by HNF-PRO-350, Fire Hazards Analysis Requirements, (Reference 7) for a portion of the 300 Area N Reactor Fuel Fabrication and Storage Facility.

  18. Fire hazards analysis of transuranic waste storage and assay facility

    SciTech Connect

    Busching, K.R., Westinghouse Hanford

    1996-07-31

    This document analyzes the fire hazards associated with operations at the Central Waste Complex. It provides the analysis and recommendations necessary to ensure compliance with applicable fire codes.

  19. Quantitative analysis of retinal OCT.

    PubMed

    Sonka, Milan; Abràmoff, Michael D

    2016-10-01

    Clinical acceptance of 3-D OCT retinal imaging brought rapid development of quantitative 3-D analysis of retinal layers, vasculature, retinal lesions as well as facilitated new research in retinal diseases. One of the cornerstones of many such analyses is segmentation and thickness quantification of retinal layers and the choroid, with an inherently 3-D simultaneous multi-layer LOGISMOS (Layered Optimal Graph Image Segmentation for Multiple Objects and Surfaces) segmentation approach being extremely well suited for the task. Once retinal layers are segmented, regional thickness, brightness, or texture-based indices of individual layers can be easily determined and thus contribute to our understanding of retinal or optic nerve head (ONH) disease processes and can be employed for determination of disease status, treatment responses, visual function, etc. Out of many applications, examples provided in this paper focus on image-guided therapy and outcome prediction in age-related macular degeneration and on assessing visual function from retinal layer structure in glaucoma.

  20. Quantitative risk analysis of oil storage facilities in seismic areas.

    PubMed

    Fabbrocino, Giovanni; Iervolino, Iunio; Orlando, Francesca; Salzano, Ernesto

    2005-08-31

    Quantitative risk analysis (QRA) of industrial facilities has to take into account multiple hazards threatening critical equipment. Nevertheless, engineering procedures able to quantitatively evaluate the effect of seismic action are not well established. Indeed, relevant industrial accidents may be triggered by loss of containment following ground shaking or other relevant natural hazards, either directly or through cascade effects ('domino effects'). The issue of integrating structural seismic risk into quantitative probabilistic seismic risk analysis (QpsRA) is addressed in this paper by a representative study case regarding an oil storage plant with a number of atmospheric steel tanks containing flammable substances. Empirical seismic fragility curves and probit functions, properly defined for both building-like and non-building-like industrial components, have been crossed with the outcomes of probabilistic seismic hazard analysis (PSHA) for a test site located in southern Italy. Once the seismic failure probabilities have been quantified, consequence analysis has been performed for those events which may be triggered by loss of containment following seismic action. Results are combined by means of a specifically developed code in terms of local risk contour plots, i.e. the contour line for the probability of fatal injuries at any point (x, y) in the analysed area. Finally, a comparison with a QRA obtained by considering only process-related top events is reported for reference.
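
    Crossing fragility curves with PSHA output, as described above, amounts to convolving the hazard curve with the component fragility; a minimal numerical sketch follows (the toy hazard curve and the lognormal fragility parameters are invented, not the paper's values):

      # Sketch: annual rate of seismic loss of containment for one tank.
      import numpy as np
      from scipy.stats import norm

      pga = np.linspace(0.01, 2.0, 400)                # PGA grid, g
      hazard = 1e-3 * (pga / 0.1) ** -2.0              # toy annual exceedance curve
      fragility = norm.cdf(np.log(pga / 0.45) / 0.5)   # median 0.45 g, beta 0.5 (assumed)

      # lambda_fail = -integral of fragility(a) d lambda(a), trapezoidal rule.
      integrand = fragility * np.gradient(hazard, pga)
      lambda_fail = -np.sum(0.5 * (integrand[1:] + integrand[:-1]) * np.diff(pga))
      print(f"Annual rate of seismic loss of containment: {lambda_fail:.2e} /yr")

    The resulting failure rates feed the consequence analysis, whose outcomes are then mapped into the local risk contours.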

  1. Identification of chemical hazards for security risk analysis activities.

    SciTech Connect

    Jaeger, Calvin Dell

    2005-01-01

    The presentation outline of this paper is: (1) How identification of chemical hazards fits into a security risk analysis approach; (2) Techniques for target identification; and (3) Identification of chemical hazards by different organizations. The summary is: (1) There are a number of different methodologies used within the chemical industry which identify chemical hazards: (a) Some develop a manual listing of potential targets based on published lists of hazardous chemicals or chemicals of concern, 'expert opinion' or known hazards. (b) Others develop a prioritized list based on chemicals found at a facility and consequence analysis (offsite release affecting population, theft of material, product tampering). (2) Identification of chemical hazards should include not only intrinsic properties of the chemicals but also potential reactive chemical hazards and potential use for activities off-site.

  2. Military hazardous wastes: an overview and analysis

    SciTech Connect

    Kawaoka, K.E.; Malloy, M.C.; Dever, G.L.; Weinberger, L.P.

    1981-12-01

    The report describes and analyzes the management activities and motivating factors of the military in dealing with its hazardous waste streams. Findings and conclusions in areas of concern are given to provide information that may be of value to the future management of military hazardous wastes.

  3. The use of hazards analysis in the development of training

    SciTech Connect

    Houghton, F.K.

    1998-12-01

    A hazards analysis identifies the operational hazards and the positive measures that aid in the mitigation or prevention of each hazard. If the tasks are human-intensive, the hazard analysis often credits personnel training as contributing to the mitigation of an accident's consequences or the prevention of an accident sequence. To be able to credit worker training, it is important to understand the role of the training in the hazard analysis. Systematic training, known as systematic training design (STD), performance-based training (PBT), or instructional system design (ISD), uses a five-phase model (analysis, design, development, implementation, and evaluation) for the development and implementation of the training. Both a hazards analysis and a training program begin with a task analysis that documents the roles and actions of the workers. Though the task analyses are different in nature, there is common ground, and both the hazard analysis and the training program can benefit from a cooperative effort. However, the cooperation should not end with the task analysis phase of either program. The information gained from the hazards analysis should be used in all five phases of the training development. The training evaluation, both of the individual worker and of the institutional training program, can provide valuable information to the hazards analysis effort. This paper will discuss the integration of information from the hazards analysis into a training program, using as an example the installation and removal of a piece of tooling used in a high-explosive operation. This example will be used to follow the systematic development of a training program and to demonstrate the interaction and cooperation between the hazards analysis and the training program.

  4. Quantitative analysis of endogenous compounds.

    PubMed

    Thakare, Rhishikesh; Chhonker, Yashpal S; Gautam, Nagsen; Alamoudi, Jawaher Abdullah; Alnouti, Yazen

    2016-09-05

    Accurate quantitative analysis of endogenous analytes is essential for several clinical and non-clinical applications. LC-MS/MS is the technique of choice for quantitative analyses. Absolute quantification by LC-MS/MS requires preparing standard curves in the same matrix as the study samples so that the matrix effect and the extraction efficiency for analytes are the same in both the standard and study samples. However, by definition, analyte-free biological matrices do not exist for endogenous compounds. To address the lack of blank matrices for the quantification of endogenous compounds by LC-MS/MS, four approaches are used: the standard addition, the background subtraction, the surrogate matrix, and the surrogate analyte methods. This review article presents an overview of these approaches, cites and summarizes their applications, and compares their advantages and disadvantages. In addition, we discuss in detail the validation requirements and compatibility with FDA guidelines needed to ensure method reliability in quantifying endogenous compounds. The standard addition, background subtraction, and surrogate analyte approaches allow the use of the same matrix for the calibration curve as the one to be analyzed in the test samples. In the surrogate matrix approach, by contrast, various matrices such as artificial, stripped, and neat matrices are used as surrogates for the actual matrix of the study samples. For the surrogate analyte approach, it is required to demonstrate similarity in matrix effect and recovery between surrogate and authentic endogenous analytes. Similarly, for the surrogate matrix approach, it is required to demonstrate similar matrix effect and extraction recovery in both the surrogate and original matrices. All these methods represent indirect approaches to quantifying endogenous compounds, and regardless of which approach is followed, it has to be shown that none of the validation criteria have been compromised by the indirect analysis.
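    The standard-addition method mentioned above lends itself to a short numerical illustration: known amounts of analyte are spiked into aliquots of the study matrix, the instrument response is regressed against the spiked amount, and the endogenous concentration is read off as the magnitude of the x-intercept. The numbers below are invented for illustration, not drawn from the review.

```python
import numpy as np

# Minimal sketch of the standard-addition approach (values are illustrative).
# Known analyte amounts spiked into equal aliquots of the study matrix:
added = np.array([0.0, 5.0, 10.0, 20.0, 40.0])       # ng/mL spiked
response = np.array([210., 305., 398., 590., 975.])  # LC-MS/MS peak areas

# Linear fit: response = slope * added + intercept.
slope, intercept = np.polyfit(added, response, 1)

# The endogenous concentration is the magnitude of the x-intercept.
endogenous = intercept / slope
print(f"Estimated endogenous concentration: {endogenous:.1f} ng/mL")
```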

  5. Simulation-Based Probabilistic Tsunami Hazard Analysis: Empirical and Robust Hazard Predictions

    NASA Astrophysics Data System (ADS)

    De Risi, Raffaele; Goda, Katsuichiro

    2017-08-01

    Probabilistic tsunami hazard analysis (PTHA) is the prerequisite for rigorous risk assessment and thus for decision-making regarding risk mitigation strategies. This paper proposes a new simulation-based methodology for tsunami hazard assessment for a specific site of an engineering project along the coast or, more broadly, for a wider tsunami-prone region. The methodology incorporates numerous uncertain parameters that are related to geophysical processes by adopting new scaling relationships for tsunamigenic seismic regions. Through the proposed methodology it is possible to obtain either a tsunami hazard curve for a single location, that is, the representation of a tsunami intensity measure (such as inundation depth) versus its mean annual rate of occurrence, or tsunami hazard maps, representing the expected tsunami intensity measures within a geographical area for a specific probability of occurrence in a given time window. In addition to the conventional tsunami hazard curve that is based on an empirical statistical representation of the simulation-based PTHA results, this study presents a robust tsunami hazard curve, which is based on a Bayesian fitting methodology. The robust approach allows a significant reduction of the number of simulations and, therefore, of the computational effort. Both methods produce a central estimate of the hazard as well as a confidence interval, facilitating the rigorous quantification of the hazard uncertainties.
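    To make the idea of an empirical, simulation-based hazard curve concrete, the sketch below converts peak inundation depths from a set of sampled events, each carrying an annual rate weight, into a mean-annual-rate-of-exceedance curve. The event set, rates, and depth distribution are invented placeholders, not the authors' data or method.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical stochastic-simulation output: peak inundation depth (m) at the
# site for N sampled tsunamigenic events, each with an annual rate weight.
n_events = 5000
depths = rng.lognormal(mean=0.0, sigma=0.8, size=n_events)  # illustrative
rates = np.full(n_events, 0.01 / n_events)  # total source rate: 0.01 / yr

# Empirical hazard curve: mean annual rate that depth d is exceeded.
grid = np.linspace(0.1, 10.0, 100)
exceedance = np.array([rates[depths > d].sum() for d in grid])

for d, lam in zip(grid[::20], exceedance[::20]):
    print(f"depth > {d:4.1f} m : {lam:.2e} per year")
```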

  6. Simulation-Based Probabilistic Tsunami Hazard Analysis: Empirical and Robust Hazard Predictions

    NASA Astrophysics Data System (ADS)

    De Risi, Raffaele; Goda, Katsuichiro

    2017-06-01

    Probabilistic tsunami hazard analysis (PTHA) is the prerequisite for rigorous risk assessment and thus for decision-making regarding risk mitigation strategies. This paper proposes a new simulation-based methodology for tsunami hazard assessment for a specific site of an engineering project along the coast or, more broadly, for a wider tsunami-prone region. The methodology incorporates numerous uncertain parameters that are related to geophysical processes by adopting new scaling relationships for tsunamigenic seismic regions. Through the proposed methodology it is possible to obtain either a tsunami hazard curve for a single location, that is, the representation of a tsunami intensity measure (such as inundation depth) versus its mean annual rate of occurrence, or tsunami hazard maps, representing the expected tsunami intensity measures within a geographical area for a specific probability of occurrence in a given time window. In addition to the conventional tsunami hazard curve that is based on an empirical statistical representation of the simulation-based PTHA results, this study presents a robust tsunami hazard curve, which is based on a Bayesian fitting methodology. The robust approach allows a significant reduction of the number of simulations and, therefore, of the computational effort. Both methods produce a central estimate of the hazard as well as a confidence interval, facilitating the rigorous quantification of the hazard uncertainties.

  7. Active polarimeter optical system laser hazard analysis.

    SciTech Connect

    Augustoni, Arnold L.

    2005-07-01

    A laser hazard analysis was performed for the SNL Active Polarimeter Optical System based on ANSI Standard Z136.1-2000, American National Standard for Safe Use of Lasers, and ANSI Standard Z136.6-2000, American National Standard for Safe Use of Lasers Outdoors. The Active Polarimeter Optical System (APOS) uses a pulsed, near-infrared, chromium-doped lithium strontium aluminum fluoride (Cr:LiSAF) crystal laser in conjunction with a holographic diffuser and lens to illuminate a scene of interest. The APOS is intended for outdoor operations. The system is mounted on a height-adjustable platform (6 feet to 40 feet) and sits atop a tripod that points the beam downward. The beam can be pointed from nadir to as much as 60 degrees off of nadir, producing an illuminating spot geometry that can vary from circular (at nadir) to elliptical (off of nadir). The JP Innovations Cr:LiSAF crystal laser parameters are presented in section II. The illuminating laser spot size is variable and can be adjusted by changing the separation distance between the lens and the holographic diffuser. The system is adjusted while the platform is at the lowest level. The laser spot is adjusted for a particular spot size at a particular distance (elevation) from the laser by setting the separation distance (d{sub diffuser}) to predetermined values. The downward pointing angle is also adjusted before the platform is raised to the selected operating elevation.

  8. A hazards analysis of a nuclear explosives dismantlement

    SciTech Connect

    Bott, T.F.; Eisenhawer, S.W.

    1995-07-01

    This paper describes the methodology used in a quantitative hazard assessment of a nuclear weapon disassembly process. Potential accident sequences were identified using an accident-sequence fault tree based on operational history, weapon safety studies, a hazard analysis team composed of weapons experts, and walkthroughs of the process. The experts provided an initial screening of the accident sequences to reduce the number of accident sequences that would be quantified. The accident sequences that survived the screening process were developed further using event trees. Spreadsheets were constructed for each event tree, the accident sequences associated with that event tree were entered as rows on the spreadsheet, and that spreadsheet was linked to spreadsheets with initiating-event frequencies, enabling event probabilities, and weapon response probabilities. The probability and frequency distribution estimates used in these spreadsheets were gathered from weapon process operational data, surrogate industrial data, expert judgment, and probability models. Frequency distributions were calculated for the sequences whose point-value frequency represented 99% of the total point-value frequency using a Monte Carlo simulation. Partial differential importances of events and distributions of accident frequency by weapon configuration, location, process, and other parameters were calculated.
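    A minimal sketch of the kind of spreadsheet calculation described above, assuming one accident sequence whose frequency is the product of an initiating-event frequency, an enabling-event probability, and a weapon-response probability, with uncertainty propagated by Monte Carlo. The distributions and parameters below are invented placeholders, not the study's data.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000  # Monte Carlo samples

# Hypothetical distributions for one accident sequence (values illustrative):
init_freq = rng.lognormal(np.log(1e-3), 0.7, n)   # initiating events per year
enable_p = rng.beta(2, 50, n)                     # enabling event probability
response_p = rng.beta(1, 200, n)                  # adverse weapon response prob.

seq_freq = init_freq * enable_p * response_p      # sequence frequency per year

print(f"mean    : {seq_freq.mean():.2e} /yr")
print(f"median  : {np.median(seq_freq):.2e} /yr")
print(f"95th pct: {np.percentile(seq_freq, 95):.2e} /yr")
```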

  9. 14 CFR 417.227 - Toxic release hazard analysis.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 14 Aeronautics and Space 4 2013-01-01 2013-01-01 false Toxic release hazard analysis. 417.227 Section 417.227 Aeronautics and Space COMMERCIAL SPACE TRANSPORTATION, FEDERAL AVIATION ADMINISTRATION, DEPARTMENT OF TRANSPORTATION LICENSING LAUNCH SAFETY Flight Safety Analysis § 417.227 Toxic release hazard...

  10. 14 CFR 417.227 - Toxic release hazard analysis.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 14 Aeronautics and Space 4 2011-01-01 2011-01-01 false Toxic release hazard analysis. 417.227 Section 417.227 Aeronautics and Space COMMERCIAL SPACE TRANSPORTATION, FEDERAL AVIATION ADMINISTRATION, DEPARTMENT OF TRANSPORTATION LICENSING LAUNCH SAFETY Flight Safety Analysis § 417.227 Toxic release hazard...

  11. 14 CFR 417.227 - Toxic release hazard analysis.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 14 Aeronautics and Space 4 2012-01-01 2012-01-01 false Toxic release hazard analysis. 417.227 Section 417.227 Aeronautics and Space COMMERCIAL SPACE TRANSPORTATION, FEDERAL AVIATION ADMINISTRATION, DEPARTMENT OF TRANSPORTATION LICENSING LAUNCH SAFETY Flight Safety Analysis § 417.227 Toxic release hazard...

  12. 14 CFR 417.227 - Toxic release hazard analysis.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 14 Aeronautics and Space 4 2014-01-01 2014-01-01 false Toxic release hazard analysis. 417.227 Section 417.227 Aeronautics and Space COMMERCIAL SPACE TRANSPORTATION, FEDERAL AVIATION ADMINISTRATION, DEPARTMENT OF TRANSPORTATION LICENSING LAUNCH SAFETY Flight Safety Analysis § 417.227 Toxic release hazard...

  13. Fire hazards analysis for solid waste burial grounds

    SciTech Connect

    McDonald, K.M.

    1995-09-28

    This document comprises the fire hazards analysis for the solid waste burial grounds, including TRU trenches, low-level burial grounds, radioactive mixed waste trenches, etc. It analyzes the fire potential and fire damage potential for these facilities. Fire scenarios may be utilized in future safety analysis work or to increase understanding of where hazards may exist in the present operation.

  14. Cold Vacuum Drying (CVD) Facility Hazards Analysis Report

    SciTech Connect

    CROWE, R.D.

    2000-08-07

    This report describes the methodology used in conducting the Cold Vacuum Drying Facility (CVDF) Hazard Analysis to support the CVDF Final Safety Analysis Report and documents the results. The hazard analysis was performed in accordance with DOE-STD-3009-94, ''Preparation Guide for U.S. Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports,'' and implements the requirements of DOE Order 5480.23, ''Nuclear Safety Analysis Reports.''

  15. Probabilistic Seismic Hazard Analysis for Southern California Coastal Facilities

    SciTech Connect

    Savy, J; Foxall, B

    2004-04-16

    The overall objective of this study was to develop probabilistic seismic hazard estimates for the coastal and offshore area of Ventura, Los Angeles and Orange counties for use as a basis for the University of Southern California (USC) to develop physical models of tsunami for the coastal regions and by the California State Lands Commission (SLC) to develop regulatory standards for seismic loading and liquefaction evaluation of marine oil terminals. The probabilistic seismic hazard analysis (PSHA) was carried out by the Lawrence Livermore National Laboratory (LLNL) in several phases over a period of two years, following the method developed by LLNL for the estimation of seismic hazards at Department of Energy (DOE) facilities and for 69 locations of nuclear plants in the Eastern United States for the Nuclear Regulatory Commission (NRC). This method consists of making maximum use of all physical data (qualitative and quantitative) and characterizing the uncertainties by using a set of alternate spatiotemporal models of occurrence of future earthquakes, as described in the SSHAC PSHA Guidance Document (Budnitz et al., 1997) and implemented for the NRC (Savy et al., 2002). In general, estimation of seismic hazard is based not only on our understanding of the regional tectonics and detailed characterization of the faults in the area but also on the analysis methods employed and the types of physical and empirical models that are deemed appropriate for the analysis. To develop this understanding, the body of knowledge in the scientific community is sampled in a series of workshops with a group of experts representative of the entire scientific community, including geologists and seismologists from the United States Geological Survey (USGS), members of the Southern California Earthquake Center (SCEC), and members of academic institutions (University of California Santa Cruz, Stanford, UC Santa Barbara, and University of Southern California), and members of

  16. Hydrotreater/Distillation Column Hazard Analysis Report Rev. 2

    SciTech Connect

    Lowry, Peter P.; Wagner, Katie A.

    2016-04-15

    This project Hazard and Risk Analysis Report contains the results of several hazard analyses and risk assessments. An initial assessment was conducted in 2012, which included a multi-step approach ranging from design reviews to a formal What-If hazard analysis. A second What-If hazard analysis was completed during February 2013 to evaluate the operation of the hydrotreater/distillation column processes to be installed in a process enclosure within the Process Development Laboratory West (PDL-West) facility located on the PNNL campus. The qualitative analysis included participation of project and operations personnel and applicable subject matter experts. The analysis identified potential hazardous scenarios, each based on an initiating event coupled with a postulated upset condition. The unmitigated consequences of each hazardous scenario were generally characterized as a process upset; the exposure of personnel to steam, vapors or hazardous material; a spray or spill of hazardous material; the creation of a flammable atmosphere; or an energetic release from a pressure boundary.

  17. SYNTHESIS OF SAFETY ANALYSIS AND FIRE HAZARD ANALYSIS METHODOLOGIES

    SciTech Connect

    Coutts, D

    2007-04-17

    Successful implementation of both the nuclear safety program and fire protection program is best accomplished using a coordinated process that relies on sound technical approaches. When systematically prepared, the documented safety analysis (DSA) and fire hazard analysis (FHA) can present a consistent technical basis that streamlines implementation. If not coordinated, the DSA and FHA can present inconsistent conclusions, which can create unnecessary confusion and can promulgate a negative safety perception. This paper will compare the scope, purpose, and analysis techniques for DSAs and FHAs. It will also consolidate several lessons-learned papers on this topic, which were prepared in the 1990s.

  18. Databases applicable to quantitative hazard/risk assessment-Towards a predictive systems toxicology

    SciTech Connect

    Waters, Michael; Jackson, Marcus

    2008-11-15

    The Workshop on The Power of Aggregated Toxicity Data addressed the requirement for distributed databases to support quantitative hazard and risk assessment. The authors have conceived and constructed with federal support several databases that have been used in hazard identification and risk assessment. The first of these databases, the EPA Gene-Tox Database was developed for the EPA Office of Toxic Substances by the Oak Ridge National Laboratory, and is currently hosted by the National Library of Medicine. This public resource is based on the collaborative evaluation, by government, academia, and industry, of short-term tests for the detection of mutagens and presumptive carcinogens. The two-phased evaluation process resulted in more than 50 peer-reviewed publications on test system performance and a qualitative database on thousands of chemicals. Subsequently, the graphic and quantitative EPA/IARC Genetic Activity Profile (GAP) Database was developed in collaboration with the International Agency for Research on Cancer (IARC). A chemical database driven by consideration of the lowest effective dose, GAP has served IARC for many years in support of hazard classification of potential human carcinogens. The Toxicological Activity Profile (TAP) prototype database was patterned after GAP and utilized acute, subchronic, and chronic data from the Office of Air Quality Planning and Standards. TAP demonstrated the flexibility of the GAP format for air toxics, water pollutants and other environmental agents. The GAP format was also applied to developmental toxicants and was modified to represent quantitative results from the rodent carcinogen bioassay. More recently, the authors have constructed: 1) the NIEHS Genetic Alterations in Cancer (GAC) Database which quantifies specific mutations found in cancers induced by environmental agents, and 2) the NIEHS Chemical Effects in Biological Systems (CEBS) Knowledgebase that integrates genomic and other biological data including

  19. Hazard Analysis of Japanese Boxed Lunches (Bento).

    ERIC Educational Resources Information Center

    Bryan, Frank L.; And Others

    1991-01-01

    For the purposes of identifying contaminants, of assessing risks, and of determining critical food processing control points, hazard analyses were conducted at two "bento" (oriental boxed meals) catering operations. Time and temperature abuses during the holding period, after cooking and prior to consumption, were found to be the primary…

  20. Hazard Analysis of Japanese Boxed Lunches (Bento).

    ERIC Educational Resources Information Center

    Bryan, Frank L.; And Others

    1991-01-01

    For the purposes of identifying contaminants, of assessing risks, and of determining critical food processing control points, hazard analyses were conducted at two "bento" (oriental boxed meals) catering operations. Time and temperature abuses during the holding period, after cooking and prior to consumption, were found to be the primary…

  1. User’s Guide - Seismic Hazard Analysis

    DTIC Science & Technology

    1993-02-01

    Earthquake Magnitude Cutoff 8.5 example 8.8 Enter Site Longitude (Degrees) 117 example 115.0 Enter Site Latitude (Degrees) 38 example 38.5 Any Changes? Y / N...the art for assessing earthquake hazards in the United States, catalogue of strong motion earthquake records, Waterways Experiment Station, Vicksburg

  2. Critical asset and portfolio risk analysis: an all-hazards framework.

    PubMed

    Ayyub, Bilal M; McGill, William L; Kaminskiy, Mark

    2007-08-01

    This article develops a quantitative all-hazards framework for critical asset and portfolio risk analysis (CAPRA) that considers both natural and human-caused hazards. Following a discussion on the nature of security threats, the need for actionable risk assessments, and the distinction between asset- and portfolio-level analysis, a general formula for all-hazards risk analysis is obtained that resembles the traditional model based on the notional product of consequence, vulnerability, and threat, though with clear meanings assigned to each parameter. Furthermore, a simple portfolio consequence model is presented that yields first-order estimates of interdependency effects following a successful attack on an asset. Moreover, depending on the needs of the decisions being made and the available analytical resources, values for the parameters in this model can be obtained at a high level or through detailed systems analysis. Several illustrative examples of the CAPRA methodology are provided.
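    The notional product model referenced above admits a very short numerical sketch: expected loss per asset as threat likelihood times vulnerability times consequence, rolled up over a portfolio. The asset names and all numbers below are invented for illustration and are not from the CAPRA article.

```python
# Sketch of the notional all-hazards risk model: Risk = Threat x Vulnerability
# x Consequence, summed over a small hypothetical portfolio.
assets = {
    # asset: (annual threat likelihood, P(success | attempt), consequence $M)
    "substation":   (0.02, 0.4, 150.0),
    "control room": (0.01, 0.2, 400.0),
    "tank farm":    (0.05, 0.3, 250.0),
}

portfolio_risk = 0.0
for name, (threat, vulnerability, consequence) in assets.items():
    risk = threat * vulnerability * consequence   # expected loss, $M / year
    portfolio_risk += risk
    print(f"{name:12s}: {risk:7.3f} $M/yr")

print(f"{'portfolio':12s}: {portfolio_risk:7.3f} $M/yr")
```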

  3. Guide for Hydrogen Hazards Analysis on Components and Systems

    NASA Technical Reports Server (NTRS)

    Beeson, Harold; Woods, Stephen

    2003-01-01

    The physical and combustion properties of hydrogen give rise to hazards that must be considered when designing and operating a hydrogen system. One of the major concerns in the use of hydrogen is that of fire or detonation because of hydrogen's wide flammability range, low ignition energy, and high flame speed. Other concerns include the contact and interaction of hydrogen with materials, such as the hydrogen embrittlement of materials and the formation of hydrides. The low temperature of liquid and slush hydrogen brings other concerns related to material compatibility and pressure control; this is especially important when dissimilar, adjoining materials are involved. The potential hazards arising from these properties and design features necessitate a proper hydrogen hazards analysis before introducing a material, component, or system into hydrogen service. The objective of this guide is to describe the NASA Johnson Space Center White Sands Test Facility hydrogen hazards analysis method that should be performed before hydrogen is used in components and/or systems. The method is consistent with standard practices for analyzing hazards. It is recommended that this analysis be made before implementing a hydrogen component qualification procedure. A hydrogen hazards analysis is a useful tool for hydrogen-system designers, system and safety engineers, and facility managers. A hydrogen hazards analysis can identify problem areas before hydrogen is introduced into a system, preventing damage to hardware, delay or loss of mission or objective, and possible injury or loss of life.

  4. Coherent and consistent decision making for mixed hazardous waste management: The application of quantitative assessment techniques

    SciTech Connect

    Smith, G.M.; Little, R.H.; Torres, C.

    1994-12-31

    This paper focuses on predictive modelling capacity for post-disposal safety assessments of land-based disposal facilities, illustrated by presentation of the development and application of a comprehensive, yet practicable, assessment framework. The issues addressed include: (1) land-based disposal practice, (2) the conceptual and mathematical representation of processes leading to release, migration and accumulation of contaminants, (3) the identification and evaluation of relevant assessment end-points, including human health, health of non-human biota and ecosystems, and property and resource effects, (4) the gap between data requirements and data availability, and (5) the application of results in decision making, given the uncertainties in assessment results and the difficulty of comparing qualitatively different impacts arising on different temporal and spatial scales. The paper illustrates the issues with examples based on disposal of metals and radionuclides to shallow facilities. The types of disposal facility considered include features consistent with facilities for radioactive wastes as well as other types of design more typical of hazardous wastes. The intention is to raise the question of whether radioactive and other hazardous wastes are being consistently managed, and to show that assessment methods are being developed which can provide quantitative information on the levels of environmental impact as well as a consistent approach for different types of waste; such methods can then be applied to mixed hazardous wastes containing radionuclides as well as other contaminants. The remaining question is whether the will exists to employ them. The discussion and worked illustrations are based on a methodology developed and being extended within the current European Atomic Energy Community's cost-sharing research program on radioactive waste management and disposal, with co-funding support from Empresa Nacional de Residuos Radiactivos SA, Spain.

  5. Statistical analysis of the uncertainty related to flood hazard appraisal

    NASA Astrophysics Data System (ADS)

    Notaro, Vincenza; Freni, Gabriele

    2015-12-01

    The estimation of flood hazard frequency statistics for an urban catchment is of great interest in practice. It provides an evaluation of potential flood risk and related damage and supports decision making for flood risk management. Flood risk is usually defined as a function of the probability that a system deficiency causes flooding (hazard) and the expected damage due to the flooding magnitude (damage), taking into account both the exposure and the vulnerability of the goods at risk. The expected flood damage can be evaluated by an a priori estimation of potential damage caused by flooding or by interpolating real damage data. With regard to flood hazard appraisal, several procedures propose to identify a hazard indicator (HI), such as flood depth or the combination of flood depth and velocity, and to assess the flood hazard for the analyzed area by comparing the HI variables with user-defined threshold values or curves (penalty curves or matrixes). However, flooding data are usually unavailable or too piecemeal to allow a reliable flood hazard analysis; therefore, hazard analysis is often performed by means of mathematical simulations aimed at evaluating water levels and flow velocities over the catchment surface. As a result, a great part of the uncertainties intrinsic to flood risk appraisal can be traced to the hazard evaluation, owing to the uncertainty inherent in modeling results and to the subjectivity of the user-defined hazard thresholds applied to link flood depth to a hazard level. In the present work, a statistical methodology is proposed for evaluating and reducing the uncertainties connected with hazard level estimation. The methodology has been applied to a real urban watershed as a case study.

  6. Department of Defense radiation hazards testing analysis

    NASA Astrophysics Data System (ADS)

    Fortinberry, Jarrod Douglas

    The United States Department of Defense tests its weapons systems in many different real as well as man-made environments prior to deploying the systems to the troops. The intent of testing is to ensure the systems function as intended without adverse reactions. One of the required tests is a Radiation Hazards test, to ensure that non-ionizing radio frequency waves generated by transmitters, such as radios and jammers, do not cause harm to personnel, ordnance or fuel. This test is typically performed at a DoD test lab, and data taken at the lab are used to determine safe operating parameters for a particular piece of equipment. This thesis presents measurements as well as mathematical models to demonstrate methods that can be employed to take more relevant Radiation Hazards data.

  7. Quantitative analysis of glycoprotein glycans.

    PubMed

    Orlando, Ron

    2013-01-01

    The ability to quantitatively determine changes in the N- and O-linked glycans is an essential component of comparative glycomics. Multiple strategies are available by which this can be accomplished, including both label-free approaches and isotopic labeling strategies. The focus of this chapter is to describe each of these approaches while providing insight into their strengths and weaknesses, so that glycomic investigators can make an educated choice of the strategy that is best suited for their particular application.

  8. Space Propulsion Hazards Analysis Manual (SPHAM), volume 1

    NASA Technical Reports Server (NTRS)

    Becker, Dorothy L. (Editor)

    1989-01-01

    The Space Propulsion Hazards Analysis Manual (SPHAM) is a compilation of methods and data directed at hazards analysis and safety for space propulsion and associated vehicles, but broadly applicable to other environments and systems. Methods are described for compiling relevant regulatory documentation, deriving design requirements and specifications, modeling accident scenarios in formal risk assessments, and correlating real-time data to risk probability modeling. Also, SPHAM provides methods for predicting post-accident blast, fragmentation, thermal, and environmental damage. Included in the appendices are an exhaustive bibliography, hazardous-properties information on selected space propulsion commodities, and system descriptions of various launch vehicles, upper stages, and spacecraft.

  9. Assessing the potential hazard of chemical substances for the terrestrial environment. Development of hazard classification criteria and quantitative environmental indicators.

    PubMed

    Tarazona, J V; Fresno, A; Aycard, S; Ramos, C; Vega, M M; Carbonell, G

    2000-03-20

    Hazard assessment constitutes an essential tool for evaluating the potential effects of chemical substances on organisms and ecosystems. It includes, as a first step, hazard identification, which must detect the potential dangers of the substance (i.e. the kinds of effects that the substance may produce), and, as a second step, the quantification of each danger and the setting of the expected dose/response relationships. Hazard assessment plays a key role in the regulation of chemical substances, including pollution control and sustainable development. However, the aquatic environment has received far more attention than terrestrial ecosystems. This paper presents the extrapolation of several basic concepts from the aquatic to the terrestrial compartment, and suggests possibilities for their regulatory use. Two specific proposals are discussed. The first focuses on the scientific basis of the hazard identification-classification criteria included in the EU regulations and their extrapolation to the terrestrial environment. The second focuses on the OECD programme for environmental indicators and the development of a soil pollution pressure indicator to quantify the potential hazards for the soil compartment and its associated terrestrial ecosystem related to the toxic chemicals applied deliberately (i.e. pesticides) or not (i.e. heavy metals in sludge-based fertilisers; industrial spills) to the soil.

  10. Fire Hazards Analysis for the 200 Area Interim Storage Area

    SciTech Connect

    JOHNSON, D.M.

    2000-01-06

    This document presents the Fire Hazards Analysis (FHA) for the 200 Area Interim Storage Area. The Interim Storage Cask, Rad-Vault, and NAC-1 Cask are analyzed for fire hazards, and the 200 Area Interim Storage Area is assessed according to HNF-PRO-350 and the objectives of DOE Order 5480.7A. This FHA addresses the potential fire hazards associated with the Interim Storage Area (ISA) facility in accordance with the requirements of DOE Order 5480.7A. It is intended to assess the risk from fire to ensure there are no undue fire hazards to site personnel and the public and to ensure the property damage potential from fire is within acceptable limits. This FHA takes a graded approach commensurate with the complexity of the structure or area and the associated fire hazards.

  11. Overview hazard analysis for the H2Fuel Bus Program

    SciTech Connect

    Hovis, G.L.

    1996-06-18

    The H2Fuel Bus project is a joint development effort to produce a safe, near-zero emission, 32-passenger bus that is propelled by electric power with continuous on-board hydrogen-powered battery recharging. A key initiative in the hydrogen bus development effort is a rigorous evaluation of operational safety. Westinghouse Savannah River Co., the prime contractor at the Department of Energy's Savannah River Site, has developed a hazard analysis methodology designed to provide a systematic, comprehensive identification and evaluation of hazards. Although originally developed to support nuclear/chemical facility safety basis documentation, the SRS Methodology has widespread applicability to operations and/or systems that utilize hazardous materials and energy. This methodology was used to perform an overview hazard analysis for the H2Fuel Bus project to focus attention on those hypothetical circumstances that pose the greatest threat to the populace and property. The hazard analysis yields a listing of all known H2Fuel Bus hazards, postulated accident scenarios describing possible hazardous releases or conditions, an assessment of the scenarios in terms of frequency of occurrence and consequence, and binning in frequency-consequence space to assess the relative severity of postulated scenarios.

  12. Hazard interaction analysis for multi-hazard risk assessment: a systematic classification based on hazard-forming environment

    NASA Astrophysics Data System (ADS)

    Liu, B.; Siu, Y. L.; Mitchell, G.

    2015-12-01

    This paper develops a systematic hazard interaction classification based on the geophysical environment that natural hazards arise from - the hazard-forming environment. According to their contribution to natural hazards, geophysical environmental factors in the hazard-forming environment were categorized into two types. The first are relatively stable factors which construct the precondition for the occurrence of natural hazards, whilst the second are trigger factors, which determine the frequency and magnitude of hazards. Different combinations of geophysical environmental factors induce different hazards. Based on these geophysical environmental factors for some major hazards, the stable factors are used to identify which kinds of natural hazards influence a given area, and trigger factors are used to classify the relationships between these hazards into four types: independent, mutex, parallel and series relationships. This classification helps to ensure all possible hazard interactions among different hazards are considered in multi-hazard risk assessment. This can effectively fill the gap in current multi-hazard risk assessment methods which to date only consider domino effects. In addition, based on this classification, the probability and magnitude of multiple interacting natural hazards occurring together can be calculated. Hence, the developed hazard interaction classification provides a useful tool to facilitate improved multi-hazard risk assessment.

  13. Hazard interaction analysis for multi-hazard risk assessment: a systematic classification based on hazard-forming environment

    NASA Astrophysics Data System (ADS)

    Liu, Baoyin; Siu, Yim Ling; Mitchell, Gordon

    2016-03-01

    This paper develops a systematic hazard interaction classification based on the geophysical environment that natural hazards arise from - the hazard-forming environment. According to their contribution to natural hazards, geophysical environmental factors in the hazard-forming environment were categorized into two types. The first are relatively stable factors which construct the precondition for the occurrence of natural hazards, whilst the second are trigger factors, which determine the frequency and magnitude of hazards. Different combinations of geophysical environmental factors induce different hazards. Based on these geophysical environmental factors for some major hazards, the stable factors are used to identify which kinds of natural hazards influence a given area, and trigger factors are used to classify the relationships between these hazards into four types: independent, mutex, parallel and series relationships. This classification helps to ensure all possible hazard interactions among different hazards are considered in multi-hazard risk assessment. This can effectively fill the gap in current multi-hazard risk assessment methods which to date only consider domino effects. In addition, based on this classification, the probability and magnitude of multiple interacting natural hazards occurring together can be calculated. Hence, the developed hazard interaction classification provides a useful tool to facilitate improved multi-hazard risk assessment.
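    The four relationship types named above can be given a rough numerical reading with generic probability identities; the sketch below is only that, an illustration under stated assumptions, not the paper's exact expressions, and the annual probabilities are invented.

```python
# Illustrative combination of two hazards A and B under the four relationship
# types: independent, mutex, parallel (shared trigger), and series (A triggers B).
p_a, p_b = 0.10, 0.05        # assumed annual occurrence probabilities
p_b_given_a = 0.40           # assumed P(B triggered | A occurred), series case

independent = p_a * p_b      # unrelated hazards happening to co-occur
mutex = 0.0                  # mutually exclusive hazards never co-occur
parallel = min(p_a, p_b)     # shared trigger: upper bound set by the rarer hazard
series = p_a * p_b_given_a   # A occurs, then triggers B

for label, p in [("independent", independent), ("mutex", mutex),
                 ("parallel", parallel), ("series", series)]:
    print(f"P(A and B), {label:12s}: {p:.4f}")
```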

  14. Analysis of Brazilian data for seismic hazard analysis

    NASA Astrophysics Data System (ADS)

    Drouet, S.; Assumpção, M.

    2013-05-01

    Seismic hazard analysis in Brazil is going to be re-assessed in the framework of the Global Earthquake Model (GEM) project. Since the last worldwide Global Seismic Hazard Assessment Program (GSHAP), there has been no specific study in this field in Brazil. Brazil is a stable continental region characterized by low seismic activity. In this type of region, seismic hazard assessment is a very hard task due to the limited amount of data available regarding seismic sources, the earthquake catalogue, or ground-motion amplitudes, and the associated uncertainties are very large. This study focuses on data recorded in southeastern Brazil, where broadband stations belonging to two networks are installed: the network managed by the seismology group at IAG-USP in São Paulo, which has existed for about 20 years, and the network managed by the Observatorio Nacional in Rio de Janeiro, which has just been set up. The two networks are now integrated into the national network RSB (Rede Sismográfica Brasileira), which will also include stations in the rest of Brazil currently being installed by the Universities of Brasilia and Natal. There are a couple of events with magnitude greater than 3 recorded at these very sensitive stations, usually at rather large distances. At first sight these data may appear meaningless in the context of seismic hazard, but they can help to improve different parts of the process. The analysis of the S-wave Fourier spectra can help to better resolve source, path, and site effects in Brazil. For instance, moment magnitudes can be computed from the flat part of the Fourier spectra. These magnitudes are of utmost importance in order to build a homogeneous catalogue in terms of moment magnitude. At the moment, only body-wave magnitudes (or some equivalent scale) are determined routinely for events in Brazil. Attenuation and site effects, especially the high-frequency attenuation known as the kappa effect, will also help to

  15. Quantitative Analysis of Face Symmetry.

    PubMed

    Tamir, Abraham

    2015-06-01

    The major objective of this article was to report quantitatively the degree of human face symmetry for images taken from the Internet. From the original image of a certain person, which appears in the center of each triplet, 2 symmetric combinations were constructed: one based on the left part of the image and its mirror image (left-left) and one based on the right part of the image and its mirror image (right-right). By applying computer software that enables determination of the length, surface area, and perimeter of any geometric shape, the following measurements were obtained for each triplet: face perimeter and area; distance between the pupils; mouth length, perimeter, and area; nose length and face length, usually below the ears; as well as the area and perimeter of the pupils. Then, for each of the above measurements, the value C, which characterizes the degree of symmetry of the real image with respect to the right-right and left-left combinations, was calculated. C appears on the right-hand side below each image. A high value of C indicates low symmetry, and as the value decreases, the symmetry increases. The magnitude on the left relates to the pupils and compares the difference between the area and perimeter of the 2 pupils. The major conclusion is that the human face is asymmetric to some degree; the degree of asymmetry is reported quantitatively under each portrait.

  16. Speed your hazard analysis with the focused what if?

    SciTech Connect

    Goodman, L.

    1996-07-01

    Chemical process industries (CPI) plants worldwide must diminish their associated hazards. CPI firms are driven by their need to protect the public, their plants' neighbors, and their operating staffs from the consequences of accidents, as well as to meet regulations. Assuring the company and its regulators of safe operation requires conducting hazard analyses to demonstrate that plants are designed and operated to be as safe as practicable. For all organizations, large and small, conducting hazard analyses can be daunting and resource-consuming. There are various acceptable methods of hazard analysis: the hazard and operability analysis (HAZOP), the what if?, failure modes and effects analysis (FMEA), and checklists, among others. Of all these methods, the HAZOP is on its way to becoming the de facto standard, because it is seemingly the most comprehensive and detailed method. Unfortunately, a HAZOP analysis in many cases may not be a good choice for a hazard analysis. This article explains the rationale for the focused what if (FWI), describes what it is, tells where it has been applied successfully, and provides an abridged example of its use.

  17. Seismic hazard analysis for Jayapura city, Papua

    SciTech Connect

    Robiana, R.; Cipta, A.

    2015-04-24

    The city of Jayapura experienced a destructive earthquake on June 25, 1976, with a maximum intensity of VII on the MMI scale. Probabilistic methods are used to determine the earthquake hazard by considering all possible earthquakes that can occur in this region. Earthquake sources are represented by three types of models: a subduction model, derived from the New Guinea Trench subduction zone (North Papuan Thrust); fault models, derived from the Yapen, Tarera-Aiduna, Wamena, Memberamo, Waipago, Jayapura, and Jayawijaya faults; and 7 background models to accommodate unknown earthquakes. Amplification factors derived from a geomorphological approach are corrected with measurement data related to rock type and depth of soft soil. Site classes in Jayapura city can be grouped into classes B, C, D and E, with amplification factors between 0.5 and 6. Hazard maps are presented for a 10% probability of earthquake occurrence within a period of 500 years for the dominant periods of 0.0, 0.2, and 1.0 seconds.
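    Under the Poisson occurrence model that underlies such hazard maps, a stated exceedance probability p over an exposure time t corresponds to a return period T = -t / ln(1 - p). The sketch below applies this identity to the 10%-in-500-years level quoted above and, for comparison, to the common building-code convention of 10% in 50 years; it is a generic conversion, not a calculation from the paper.

```python
import math

# Poisson-model conversion between exceedance probability over an exposure
# time and the corresponding return period.
p, t = 0.10, 500.0                      # 10% probability in 500 years (as above)
return_period = -t / math.log(1.0 - p)  # about 4746 years
print(f"Return period: {return_period:.0f} years")

# For comparison, 10% in 50 years, the common building-code convention:
print(f"10% in 50 yr -> {-50.0 / math.log(0.9):.0f}-year return period")
```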

  18. Quantitative analysis of qualitative images

    NASA Astrophysics Data System (ADS)

    Hockney, David; Falco, Charles M.

    2005-03-01

    We show optical evidence that demonstrates artists as early as Jan van Eyck and Robert Campin (c1425) used optical projections as aids for producing their paintings. We also have found optical evidence within works by later artists, including Bermejo (c1475), Lotto (c1525), Caravaggio (c1600), de la Tour (c1650), Chardin (c1750) and Ingres (c1825), demonstrating a continuum in the use of optical projections by artists, along with an evolution in the sophistication of that use. However, even for paintings where we have been able to extract unambiguous, quantitative evidence of the direct use of optical projections for producing certain of the features, this does not mean that paintings are effectively photographs. Because the hand and mind of the artist are intimately involved in the creation process, understanding these complex images requires more than can be obtained from only applying the equations of geometrical optics.

  19. Sensitivity analysis in quantitative microbial risk assessment.

    PubMed

    Zwietering, M H; van Gerwen, S J

    2000-07-15

    The occurrence of foodborne disease remains a widespread problem in both the developing and the developed world. A systematic and quantitative evaluation of food safety is important to control the risk of foodborne diseases. World-wide, many initiatives are being taken to develop quantitative risk analysis. However, the quantitative evaluation of food safety in all its aspects is very complex, especially since in many cases specific parameter values are not available. Often many variables have large statistical variability while the quantitative effect of various phenomena is unknown. Therefore, sensitivity analysis can be a useful tool to determine the main risk-determining phenomena, as well as the aspects that mainly determine the inaccuracy in the risk estimate. This paper presents three stages of sensitivity analysis. First, deterministic analysis selects the most relevant determinants of risk. The overlooking of exceptional but relevant cases is prevented by a second, worst-case analysis. This analysis finds relevant process steps in worst-case situations and shows how variations in factors matter for risk. The third, stochastic analysis studies the effects of variations in factors on the variability of risk estimates. Care must be taken that the assumptions made, as well as the results, are clearly communicated. Stochastic risk estimates are, like deterministic ones, just as good (or bad) as the available data, and the stochastic analysis must not be used to mask lack of information. Sensitivity analysis is a valuable tool in quantitative risk assessment for determining critical aspects and the effects of variations.
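    The first, deterministic stage described above can be sketched with a toy microbial risk chain: growth during storage followed by log-linear inactivation during cooking, feeding an exponential dose-response model, with each factor perturbed one at a time to rank its influence. The model form and every parameter value below are invented for illustration, not taken from the paper.

```python
import numpy as np

# Toy quantitative microbial risk chain (all values illustrative):
# growth during storage, then log-linear inactivation during cooking.
def risk(n0_log10, hours, growth_rate, cook_log_red):
    log_n = n0_log10 + growth_rate * hours - cook_log_red
    dose = 10 ** log_n
    return 1 - np.exp(-dose / 1e4)   # exponential dose-response: P(illness)

baseline = dict(n0_log10=1.0, hours=12.0, growth_rate=0.2, cook_log_red=5.0)
base = risk(**baseline)
print(f"baseline risk: {base:.3e}")

# Deterministic stage: perturb each factor +/- 20% and compare output swings.
for factor in baseline:
    vals = []
    for scale in (0.8, 1.2):
        p = dict(baseline)
        p[factor] *= scale
        vals.append(risk(**p))
    print(f"{factor:13s}: risk {min(vals):.2e} .. {max(vals):.2e}")
```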

  20. Qualitative and Quantitative Assessment of Naturals Hazards in the Caldera of Mount Bambouto (West Cameroon)

    NASA Astrophysics Data System (ADS)

    Zangmo Tefogoum, G.; Kagou Dongmo, A.; Nkouathio, D. G.; Wandji, P.

    2009-04-01

    Mount Bambouto is a polygenic stratovolcano of the Cameroon Volcanic Line, built between 21 Ma and 4.5 Ma (Nkouathio et al., 2008). It is situated about 200 km NE of Mount Cameroon, between 09°55' and 10°15' East and between 05°25' and 05°50' North. The volcano covers an area of 500 km2, culminates at 2740 m at Meletan hill, and bears a collapse caldera (13 x 8 km). Fissural, extrusive, and explosive activity built this volcano in three main stages, including the formation of a large summit rim caldera. The structure of Mount Bambouto gives rise to different natural hazards of volcanological and meteorological origin. In the past, landslides, floods, bush fires, and block collapses took place in this area, with catastrophic impacts on the population. A new research program has been carried out in the caldera concerning the qualitative and quantitative evaluation of natural risks and catastrophes. The main factors of instability are rain, structure of the basement, slopes, lithology, and anthropic activities; in particular, the occurrence of exceptional rainfall due to global change is relevant. This gives the opportunity to draw a landslide hazard zonation map of the Bambouto caldera, landslides being the main risk in this area. We evaluated the financial potential of the caldera based on the average income from breeding and farming, school fees, and the cost of houses and equipment for each family. The calculation revealed that the yearly economy of the Mount Bambouto caldera represents about 2 billion FCFA. Some recommendations have been made in order to prevent and reduce the potential losses and the number of victims, in particular through better land use planning. These help us to estimate the extent of destruction of the environment and biodiversity in case of catastrophe. We conclude that in the Bambouto caldera there is a moderate to high probability that destructive phenomena due to landslides will occur within the upcoming years, with enormous

  1. Fire hazards analysis for the uranium oxide (UO{sub 3}) facility

    SciTech Connect

    Wyatt, D.M.

    1994-12-06

    The Fire Hazards Analysis (FHA) documents the deactivation end-point status of the UO{sub 3} complex fire hazards, fire protection and life safety systems. This FHA has been prepared for the Uranium Oxide Facility by Westinghouse Hanford Company in accordance with the criteria established in DOE 5480.7A, Fire Protection and RLID 5480.7, Fire Protection. The purpose of the Fire Hazards Analysis is to comprehensively and quantitatively assess the risk from a fire within individual fire areas in a Department of Energy facility so as to ascertain whether the objectives stated in DOE Order 5480.7, paragraph 4 are met. Particular attention has been paid to RLID 5480.7, Section 8.3, which specifies the criteria for deactivating fire protection in decommission and demolition facilities.

  2. Causal Mediation Analysis for the Cox Proportional Hazards Model with a Smooth Baseline Hazard Estimator.

    PubMed

    Wang, Wei; Albert, Jeffrey M

    2017-08-01

    An important problem within the social, behavioral, and health sciences is how to partition an exposure effect (e.g. treatment or risk factor) among specific pathway effects and to quantify the importance of each pathway. Mediation analysis based on the potential outcomes framework is an important tool to address this problem and we consider the estimation of mediation effects for the proportional hazards model in this paper. We give precise definitions of the total effect, natural indirect effect, and natural direct effect in terms of the survival probability, hazard function, and restricted mean survival time within the standard two-stage mediation framework. To estimate the mediation effects on different scales, we propose a mediation formula approach in which simple parametric models (fractional polynomials or restricted cubic splines) are utilized to approximate the baseline log cumulative hazard function. Simulation study results demonstrate low bias of the mediation effect estimators and close-to-nominal coverage probability of the confidence intervals for a wide range of complex hazard shapes. We apply this method to the Jackson Heart Study data and conduct sensitivity analysis to assess the impact on the mediation effects inference when the no unmeasured mediator-outcome confounding assumption is violated.
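    The mediation-formula idea described above can be sketched with a fully parametric stand-in: a normally distributed mediator and a proportional hazards model with a Weibull baseline (the article itself approximates the baseline log cumulative hazard with fractional polynomials or splines; the Weibull and all coefficients below are assumptions for illustration). The natural effects on the survival-probability scale then follow from integrating the conditional survival function over the mediator distribution.

```python
import numpy as np
from scipy import integrate, stats

# Parametric stand-in for the two-stage mediation setup (all values assumed).
a0, a1, sigma = 0.0, 0.5, 1.0   # mediator model: M | A=a ~ N(a0 + a1*a, sigma^2)
b_a, b_m = 0.3, 0.4             # hazard model: h(t|A,M) = h0(t) exp(b_a*A + b_m*M)
shape, scale = 1.5, 10.0        # Weibull baseline hazard parameters

def cum_h0(t):
    return (t / scale) ** shape  # baseline cumulative hazard

def surv(t, a_exposure, a_mediator):
    """S(t) with exposure set to a_exposure but M drawn as if A = a_mediator."""
    f = lambda m: (np.exp(-cum_h0(t) * np.exp(b_a * a_exposure + b_m * m))
                   * stats.norm.pdf(m, a0 + a1 * a_mediator, sigma))
    return integrate.quad(f, -8, 8)[0]

t = 5.0
te = surv(t, 1, 1) - surv(t, 0, 0)    # total effect on the survival scale
nde = surv(t, 1, 0) - surv(t, 0, 0)   # natural direct effect
nie = surv(t, 1, 1) - surv(t, 1, 0)   # natural indirect effect
print(f"TE={te:+.4f}  NDE={nde:+.4f}  NIE={nie:+.4f}  (TE = NDE + NIE)")
```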

  3. 21 CFR 123.6 - Hazard analysis and Hazard Analysis Critical Control Point (HACCP) plan.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ...; (iv) Pesticides; (v) Drug residues; (vi) Decomposition in scombroid toxin-forming species or in any other species where a food safety hazard has been associated with decomposition; (vii) Parasites, where... HACCP plan need not list the food safety hazard associated with the formation of Clostridium...

  4. 21 CFR 123.6 - Hazard analysis and Hazard Analysis Critical Control Point (HACCP) plan.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ...; (iv) Pesticides; (v) Drug residues; (vi) Decomposition in scombroid toxin-forming species or in any other species where a food safety hazard has been associated with decomposition; (vii) Parasites, where... HACCP plan need not list the food safety hazard associated with the formation of Clostridium...

  5. 21 CFR 123.6 - Hazard analysis and Hazard Analysis Critical Control Point (HACCP) plan.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ...; (iv) Pesticides; (v) Drug residues; (vi) Decomposition in scombroid toxin-forming species or in any other species where a food safety hazard has been associated with decomposition; (vii) Parasites, where... HACCP plan need not list the food safety hazard associated with the formation of Clostridium...

  6. 21 CFR 123.6 - Hazard analysis and Hazard Analysis Critical Control Point (HACCP) plan.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ...; (iv) Pesticides; (v) Drug residues; (vi) Decomposition in scombroid toxin-forming species or in any other species where a food safety hazard has been associated with decomposition; (vii) Parasites, where... HACCP plan need not list the food safety hazard associated with the formation of Clostridium...

  7. Hazard screening application guide. Safety Analysis Report Update Program

    SciTech Connect

    1992-06-01

    The basic purpose of hazard screening is to group processes, facilities, and proposed modifications according to the magnitude of their hazards so as to determine the need for and extent of follow-on safety analysis. A hazard is defined as a material, energy source, or operation that has the potential to cause injury or illness in human beings. The purpose of this document is to give guidance and provide standard methods for performing hazard screening. Hazard screening is applied to new and existing facilities and processes as well as to proposed modifications to existing facilities and processes. The hazard screening process evaluates the identified hazards in terms of their effects on people, both on-site and off-site. The process uses bounding analyses with no credit given for mitigation of an accident, with the exception of certain containers meeting DOT specifications. The process is restricted to human safety issues only. Environmental effects are addressed by the environmental program. Interfaces with environmental organizations will be established in order to share information.

  8. Frequency Analysis of Aircraft hazards for License Application

    SciTech Connect

    K. Ashley

    2006-10-24

    The preclosure safety analysis for the monitored geologic repository at Yucca Mountain must consider the hazard that aircraft may pose to surface structures. Relevant surface structures are located beneath the restricted airspace of the Nevada Test Site (NTS) on the eastern slope of Yucca Mountain, near the North Portal of the Exploratory Studies Facility Tunnel (Figure 1). The North Portal is located several miles from the Nevada Test and Training Range (NTTR), which is used extensively by the U.S. Air Force (USAF) for training and test flights (Figure 1). The NTS airspace, which is controlled by the U.S. Department of Energy (DOE) for NTS activities, is not part of the NTTR. Agreements with the DOE allow USAF aircraft specific use of the airspace above the NTS (Reference 2.1.1 [DIRS 103472], Section 3.1.1 and Appendix A, Section 2.1; and Reference 2.1.2 [DIRS 157987], Sections 1.26 through 1.29). Commercial, military, and general aviation aircraft fly within several miles to the southwest of the repository site in the Beatty Corridor, which is a broad air corridor that runs approximately parallel to U.S. Highway 95 and the Nevada-California border (Figure 2). These aircraft and other aircraft operations are identified and described in ''Identification of Aircraft Hazards'' (Reference 2.1.3, Sections 6 and 8). The purpose of this analysis is to estimate crash frequencies for aircraft hazards identified for detailed analysis in ''Identification of Aircraft Hazards'' (Reference 2.1.3, Section 8). Reference 2.1.3, Section 8, also identifies a potential hazard associated with electronic jamming, which will be addressed in this analysis. This analysis will address only the repository and not the transportation routes to the site. The analysis is intended to provide the basis for: (1) Categorizing event sequences related to aircraft hazards; (2) Identifying design or operational requirements related to aircraft hazards.
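    Crash-frequency estimates of the kind described are commonly built from a point-estimate model of the form F = sum over flight categories of N x P x f(x, y) x A (the number of operations, the crash rate per operation, the crash location probability density at the site, and the effective target area, cf. DOE-STD-3014-style analyses). The sketch below shows that arithmetic; every category and number is an invented placeholder, not a value from the license application analysis.

```python
# Point-estimate crash-frequency roll-up, F = sum_i N_i * P_i * f_i(x,y) * A_i.
flight_categories = [
    # (category, flights/yr, crashes per flight, crash density /mi^2, area mi^2)
    ("military training", 30_000, 1e-6, 2e-3, 0.01),
    ("general aviation",  10_000, 4e-6, 1e-3, 0.01),
    ("commercial",        50_000, 5e-8, 5e-4, 0.01),
]

total = 0.0
for name, n, p_crash, f_xy, area in flight_categories:
    freq = n * p_crash * f_xy * area
    total += freq
    print(f"{name:18s}: {freq:.2e} crashes/yr")

print(f"{'total':18s}: {total:.2e} crashes/yr")
```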

  9. AN ENHANCED HAZARD ANALYSIS PROCESS FOR THE HANFORD TANK FARMS

    SciTech Connect

    SHULTZ MV

    2008-05-15

    CH2M HILL Hanford Group, Inc., has expanded the scope and increased the formality of the process hazards analyses performed on new or modified Tank Farm facilities, designs, and processes. The CH2M HILL process hazard analysis emphasis has been altered to reflect its use as a fundamental part of the engineering and change control process instead of simply a nuclear safety analysis tool. The scope has been expanded to include identification of accidents/events that impact the environment or require emergency response, in addition to those with significant impact to the facility worker, the offsite receptor, and the 100-meter receptor. Also, there is now an expectation that controls will be identified to address all types of consequences. To ensure that the process has an appropriate level of rigor and formality, a new engineering standard for process hazards analysis was created. This paper discusses the role of process hazards analysis as an information source not only for nuclear safety, but also for worker-safety management programs, emergency management, and environmental programs. This paper also discusses the role of process hazards analysis in the change control process, including identifying when and how it should be applied to changes in design or process.

  10. Quantitative evaluation of heavy metals' pollution hazards in liquefaction residues of sewage sludge.

    PubMed

    Huang, Huajun; Yuan, Xingzhong; Zeng, Guangming; Zhu, Huina; Li, Hui; Liu, Zhifeng; Jiang, Hongwei; Leng, Lijian; Bi, Wenkai

    2011-11-01

    Liquefaction residues (LR) are the main by-products of sewage sludge (SS) liquefaction. This study quantitatively evaluates the potential ecological risk and pollution degrees of heavy metals (Pb, Zn, Cu, Cd, Cr and Ni) in LR versus SS. The leaching rates (R1) of heavy metals in LR were much lower than those in SS, revealing that the mobility/leachability of heavy metals was well suppressed after liquefaction. The geo-accumulation index (Igeo) indicated that the liquefaction process significantly weakened the contamination degrees of heavy metals. The potential ecological risk index (RI) demonstrated that the overall risk posed by heavy metals was obviously lowered, from 1093.56 (very high risk) in SS to 4.72 and 1.51 (low risk) in LR1 and LR2, respectively. According to the risk assessment code (RAC), each tested heavy metal posed no or low risk to the environment after liquefaction. In short, the pollution hazards of heavy metals in LR were markedly mitigated.
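
    The two indices named in this abstract have standard closed forms: the Mueller geo-accumulation index, Igeo = log2(Cn / (1.5 Bn)), and the Hakanson risk index, RI = sum of Tr_i * C_i / B_i. The sketch below applies them to hypothetical concentrations; the toxic-response factors are the commonly tabulated Hakanson values, and none of the numbers come from the paper.

```python
import math

def igeo(c_sample, c_background):
    """Mueller geo-accumulation index: Igeo = log2(Cn / (1.5 * Bn))."""
    return math.log2(c_sample / (1.5 * c_background))

def risk_index(conc, background, toxic_factor):
    """Hakanson potential ecological risk index: RI = sum(Tr_i * C_i / B_i)."""
    return sum(t * c / b for c, b, t in zip(conc, background, toxic_factor))

# Hypothetical Cd, Pb, Zn concentrations (mg/kg), local backgrounds, and the
# commonly tabulated Hakanson toxic-response factors (Cd=30, Pb=5, Zn=1):
conc, bkg, tr = [3.2, 45.0, 180.0], [0.3, 20.0, 80.0], [30, 5, 1]
print([round(igeo(c, b), 2) for c, b in zip(conc, bkg)])  # per-metal Igeo
print(round(risk_index(conc, bkg, tr), 1))                # overall RI
```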

  11. Quantitative meta-analytic approaches for the systematic synthesis of data and hazard identification: A case study of decreased pain sensitivity due to trimethylbenzene exposure.

    PubMed

    Davis, J Allen; Kraft, Andrew

    2017-10-01

    Traditionally, human health risk assessments have relied on qualitative approaches for hazard identification, which involves weight of evidence determinations that integrate evidence across multiple studies. Recently, the National Research Council has recommended the development of quantitative approaches for evidence integration, including the application of meta-analyses, to help summarize and evaluate the results of a systematic review. In the meta-analytic approach, a pooled effect size is calculated after consideration of multiple potential confounding factors in order to determine whether the entire database under consideration indicates a chemical is a hazard. The following case study applies qualitative and quantitative approaches to determine whether trimethylbenzene (TMB) isomers represent a neurotoxic hazard, specifically focusing on pain sensitivity. Following a thorough literature search, the only pain sensitivity studies available for TMBs initially seem discordant in their results: effects on pain sensitivity are seen immediately after termination of exposure, appear to resolve 24 h after exposure, and then reappear 50 days later following foot-shock. Qualitative consideration of toxicological and toxicokinetic characteristics of the TMB isomers suggests that the observed differences between studies are likely due to testing time and the application of external stressors. Meta-analyses and meta-regressions support this conclusion: when all studies are included and possible confounders (isomer, testing time, laboratory, etc.) are accounted for, the pooled effect sizes are statistically significant, supporting the conclusion that TMBs are a possible neurotoxic hazard to human health. Ultimately, this case study demonstrates how qualitative and quantitative methods can be combined to provide a robust hazard identification analysis by incorporating more of the available information. Published by Elsevier Inc.
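
    As a concrete illustration of the pooled-effect-size step, the sketch below computes a DerSimonian-Laird random-effects estimate from hypothetical per-study effect sizes and variances. It is a generic meta-analysis calculation, not the authors' exact model (which also handled confounders via meta-regression).

```python
import numpy as np

def pooled_effect_dl(effects, variances):
    """DerSimonian-Laird random-effects pooled effect size with a 95% CI."""
    y, v = np.asarray(effects, float), np.asarray(variances, float)
    w = 1.0 / v                                   # fixed-effect weights
    mu_fe = np.sum(w * y) / np.sum(w)
    q = np.sum(w * (y - mu_fe) ** 2)              # Cochran's Q
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - (len(y) - 1)) / c)       # between-study variance
    w_re = 1.0 / (v + tau2)                       # random-effects weights
    pooled = np.sum(w_re * y) / np.sum(w_re)
    se = np.sqrt(1.0 / np.sum(w_re))
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se)

# Hypothetical per-study effect sizes (e.g., pain-sensitivity SMDs) and variances:
print(pooled_effect_dl([0.8, 0.5, 1.1, 0.3], [0.10, 0.08, 0.15, 0.12]))
```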

  12. FMEA, the alternative process hazard method. [Failure Mode Effects Analysis]

    SciTech Connect

    Goyal, R.K. )

    1993-05-01

    Failure mode effects analysis (FMEA) is an old reliability/assurance tool finding its way into the HPI. Although not yet popular, this hazard technique has some viable applications that can improve hazard assessment data. Notably, FMEA studies can identify possible areas for improvement that may not have been discovered using other methods. Also, FMEA is not as labor intensive and costly as other process hazard analysis (PHA) methods. OSHA 1910.119 set in place an informational structure whose main purpose is reducing potential accidents and minimizing risks in the event of an accident. Consequently, HPI operators must evaluate their process systems, identify potential major hazards, such as fires, explosions and accidental releases of toxic/hazardous chemicals, and protect their facilities, employees, the public and the environment. But which PHA method(s) apply to a particular plant or process remains a difficult question. This paper describes what FMEA is; types of FMEA; how to conduct an FMEA study; comparison with HAZOP (hazard and operability study); computer software; applicability of FMEA; and examples of its use.

  13. A Bayesian Seismic Hazard Analysis for the city of Naples

    NASA Astrophysics Data System (ADS)

    Faenza, Licia; Pierdominici, Simona; Hainzl, Sebastian; Cinti, Francesca R.; Sandri, Laura; Selva, Jacopo; Tonini, Roberto; Perfetti, Paolo

    2016-04-01

    In recent years, many studies have focused on determining and defining the seismic, volcanic, and tsunamigenic hazard in the city of Naples. The reason is that the town of Naples and its neighboring area is one of the most densely populated places in Italy. In addition, the risk is increased by the type and condition of buildings and monuments in the city. It is therefore crucial to assess which active faults in Naples and the surrounding area could trigger an earthquake able to shake and damage the urban area. We collect data from the most reliable and complete databases of macroseismic intensity records (from 79 AD to present). Each seismic event has been associated with an active tectonic structure. Furthermore, a set of active faults around the study area that are well known from geological investigations and could shake the city, but are not associated with any recorded earthquake, has been taken into account in our study. This geological framework is the starting point for our Bayesian seismic hazard analysis for the city of Naples. We show the feasibility of formulating the hazard assessment procedure to include the information of past earthquakes in the probabilistic seismic hazard analysis. This strategy allows us, on the one hand, to enlarge the information used in the evaluation of the hazard, from alternative models for the earthquake generation process to past shaking, and on the other hand, to explicitly account for all kinds of information and their uncertainties. The Bayesian scheme we propose is applied to evaluate the seismic hazard of Naples. We implement five different spatio-temporal models to parameterize the occurrence of earthquakes potentially dangerous for Naples. Subsequently we combine these hazard curves with ShakeMaps of past earthquakes that have been felt in Naples. The results are posterior hazard assessments for three exposure times, i.e., 50, 10 and 5 years, on a dense grid that covers the municipality of Naples, considering bedrock soil conditions.
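
    The paper's idea of folding observed shaking back into the hazard estimate can be illustrated with the simplest conjugate case: a Gamma prior on the annual exceedance rate updated by a Poisson count of historical exceedances. All numbers below are hypothetical, and the model is far simpler than the five spatio-temporal models used in the study.

```python
import math

# Gamma prior on the annual exceedance rate, updated by a Poisson count of
# historical exceedances; all parameters are hypothetical.
alpha0, beta0 = 2.0, 200.0        # prior mean rate = alpha0/beta0 = 0.01 / yr
k_obs, t_obs = 4, 300.0           # 4 exceedances seen in 300 yr of records

alpha1, beta1 = alpha0 + k_obs, beta0 + t_obs   # conjugate posterior
rate = alpha1 / beta1                           # posterior mean rate

for t in (5, 10, 50):             # the study's exposure times
    p = 1.0 - math.exp(-rate * t) # Poisson probability of >= 1 exceedance
    print(f"P(exceedance within {t:2d} yr) = {p:.3f}")
```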

  14. Quantitative histogram analysis of images

    NASA Astrophysics Data System (ADS)

    Holub, Oliver; Ferreira, Sérgio T.

    2006-11-01

    A routine for histogram analysis of images has been written in the object-oriented, graphical development environment LabVIEW. The program converts an RGB bitmap image into an intensity-linear greyscale image according to selectable conversion coefficients. This greyscale image is subsequently analysed by plots of the intensity histogram and probability distribution of brightness, and by calculation of various parameters, including average brightness, standard deviation, variance, minimal and maximal brightness, mode, skewness and kurtosis of the histogram and the median of the probability distribution. The program allows interactive selection of specific regions of interest (ROI) in the image and definition of lower and upper threshold levels (e.g., to permit the removal of a constant background signal). The results of the analysis of multiple images can be conveniently saved and exported for plotting in other programs, which allows fast analysis of relatively large sets of image data. The program file accompanies this manuscript together with a detailed description of two application examples: The analysis of fluorescence microscopy images, specifically of tau-immunofluorescence in primary cultures of rat cortical and hippocampal neurons, and the quantification of protein bands by Western blot. The possibilities and limitations of this kind of analysis are discussed.
    Program summary
    Title of program: HAWGC
    Catalogue identifier: ADXG_v1_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADXG_v1_0
    Program obtainable from: CPC Program Library, Queen's University of Belfast, N. Ireland
    Computers: Mobile Intel Pentium III, AMD Duron
    Installations: No installation necessary; executable file together with necessary files for LabVIEW Run-time engine
    Operating systems or monitors under which the program has been tested: Windows ME/2000/XP
    Programming language used: LabVIEW 7.0
    Memory required to execute with typical data: ~16 MB for starting and ~160 MB used for
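
    For readers who want the numeric core of such a routine outside LabVIEW, the sketch below reproduces the two steps the abstract describes: greyscale conversion with selectable coefficients and computation of the histogram descriptors. It is a generic re-implementation in Python (with NumPy/SciPy), not the published LabVIEW code.

```python
import numpy as np
from scipy import stats

def to_gray(rgb, coeffs=(0.299, 0.587, 0.114)):
    """RGB (H x W x 3) to greyscale with selectable conversion coefficients;
    (1/3, 1/3, 1/3) would give the plain average."""
    return rgb.astype(float) @ np.asarray(coeffs)

def histogram_stats(gray, lower=0, upper=255):
    """Descriptors of the brightness histogram inside a threshold window."""
    roi = gray[(gray >= lower) & (gray <= upper)]
    return {"mean": roi.mean(), "std": roi.std(), "var": roi.var(),
            "min": roi.min(), "max": roi.max(), "median": np.median(roi),
            "skewness": stats.skew(roi), "kurtosis": stats.kurtosis(roi)}

rgb = np.random.randint(0, 256, (64, 64, 3))   # stand-in for a loaded bitmap
print(histogram_stats(to_gray(rgb)))
```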

  15. K basin sludge & water system preliminary hazard analysis

    SciTech Connect

    MARTIN, J.B.

    2001-09-06

    This report provides the initial Preliminary Hazard Analysis (PHA) for the Spent Nuclear Fuel (SNF) Project, Sludge and Water System (SWS) subproject. The purpose of the K East (KE) basins SWS is to provide a system that will transfer the KE basins sludge from in-basin storage or source locations to containment and loadout for subsequent transport to T Plant for storage. The SWS will provide a system (Phase I) to retrieve and contain the identified as-settled sludge volume in the KE basin, while maintaining basin water clarity and associated radioactive control limits in accordance with basin operation requirements, and to support the Fuel Transfer System (FTS) and Debris Removal System (DRS). The SWS will also provide a system (Phase II) to retrieve and contain the remaining as-settled sludge volume in the K East basin, while maintaining basin water clarity and associated radioactive control limits in accordance with basin operation requirements. The SWS is in the conceptual design phase. A primary purpose of this PHA is to support the Conceptual Design Document (draft). As details of the SWS process and design are developed, this initial PHA will require revision. The K Basin Hazard Analysis (HNF-3960) documents the hazard analysis previously performed on the K basins including the multi-canister overpack (MCO) and support buildings. HNF-3960 will be updated later to reflect future SWS activities for sludge and water transfer to ensure that all hazards are included. All operational activities and energy sources associated with the SWS are evaluated in this hazard analysis. Using a systematic approach, this document identifies hazards created by abnormal operating conditions, external events (e.g., range fire), and natural phenomena hazards (e.g., earthquake) with the potential for causing undesirable consequences to facility workers, on-site individuals, the public, or the environment.

  16. Pedestrian Evacuation Analysis for Tsunami Hazards

    NASA Astrophysics Data System (ADS)

    Jones, J. M.; Ng, P.; Wood, N. J.

    2014-12-01

    Catastrophic tsunamis in the last decade, as well as the 50th anniversary of the 1964 Alaskan event, have heightened awareness of the threats these natural hazards present to large and increasing coastal populations. For communities located close to the earthquake epicenter that generated the tsunami, strong shaking may also cause significant infrastructure damage, impacting the road network and hampering evacuation. There may also be insufficient time between the earthquake and first wave arrival to rely on a coordinated evacuation, leaving at-risk populations to self-evacuate on foot and across the landscape. Emergency managers evaluating these coastal risks need tools to assess the evacuation potential of low-lying areas in order to discuss mitigation options, which may include vertical evacuation structures to provide local safe havens in vulnerable communities. The U.S. Geological Survey has developed the Pedestrian Evacuation Analyst software tool for use by researchers and emergency managers to assist in the assessment of a community's evacuation potential by modeling travel times across the landscape and producing both maps of travel times and charts of population counts with corresponding times. The tool uses an anisotropic (directionally dependent) least cost distance model to estimate evacuation potential and allows for the variation of travel speed to measure its effect on travel time. The effectiveness of vertical evacuation structures on evacuation time can also be evaluated and compared with metrics such as travel time maps showing each structure in place and graphs displaying the percentage change in population exposure for each structure against the baseline. Using the tool, travel time maps and at-risk population counts have been generated for some coastal communities of the U.S. Pacific Northwest and Alaska. The tool can also be used to provide valuable decision support for tsunami vertical evacuation siting.
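
    The core of such a tool is a least-cost travel-time surface. The sketch below shows an isotropic simplification (plain Dijkstra search over a grid of walking speeds); the actual USGS tool uses an anisotropic model in which cost depends on travel direction and slope, so treat this only as the general idea.

```python
import heapq
import numpy as np

def travel_time_map(speed, exits, cell_size=10.0):
    """Least-cost (Dijkstra) travel times to the nearest safe cell.

    speed     -- 2-D array of walking speeds (m/s); 0 marks impassable cells
    exits     -- list of (row, col) safe-zone cells assigned time zero
    cell_size -- grid resolution in metres
    """
    rows, cols = speed.shape
    time = np.full(speed.shape, np.inf)
    heap = [(0.0, r, c) for r, c in exits]
    heapq.heapify(heap)
    for _, r, c in heap:
        time[r, c] = 0.0
    while heap:
        t, r, c = heapq.heappop(heap)
        if t > time[r, c]:
            continue                      # stale queue entry
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1),
                       (1, 1), (1, -1), (-1, 1), (-1, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols and speed[nr, nc] > 0:
                step = cell_size * (2 ** 0.5 if dr and dc else 1.0)
                nt = t + step / speed[nr, nc]   # seconds to enter the cell
                if nt < time[nr, nc]:
                    time[nr, nc] = nt
                    heapq.heappush(heap, (nt, nr, nc))
    return time

speed = np.full((50, 50), 1.1)                  # ~1.1 m/s walking speed
times = travel_time_map(speed, exits=[(0, 0)])
print(f"farthest cell reached in {times.max():.0f} s")
```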

  17. Seismic hazard analysis of the Adelaide region, South Australia

    NASA Astrophysics Data System (ADS)

    Setiawan, Bambang

    2017-07-01

    Seismic activity in Australia is categorised as low to moderate. However, the rate of deformation of the Australian continent is faster than in other stable intraplate regions, such as Eastern North America and Africa. The Adelaide region is the most seismically active zone in the Australian continent. Therefore, a seismic hazard analysis of the Adelaide region is needed to improve the accuracy of seismic hazard predictions. Probabilistic seismic hazard analysis (PSHA) incorporating Monte Carlo simulation is selected in the present paper. This method has several advantages: it is simple, fast, flexible, and robust. Results of the analysis are comparable with previous studies. Furthermore, two main threats are identified in the de-aggregation for the city of Adelaide. Due to the scarcity of strong-magnitude events in the record, further work is suggested to improve the estimates by extending the seismic catalogue, e.g., by incorporating neo-tectonic and paleo-seismic studies.
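
    A Monte Carlo PSHA of the kind the abstract selects can be sketched in a few lines: simulate a synthetic catalogue from a truncated Gutenberg-Richter source, attach ground motions through an attenuation relation with aleatory scatter, and count exceedances. Every coefficient below (source parameters, toy GMPE, distances) is illustrative, not the study's parameterization.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical truncated Gutenberg-Richter source and toy attenuation relation
a_val, b_val, m0, m_max = 4.0, 1.0, 4.5, 7.0
n_years = 100_000                                  # synthetic catalogue length

annual_n = 10 ** (a_val - b_val * m0)              # annual rate of M >= m0
n_ev = rng.poisson(annual_n * n_years)

beta = b_val * np.log(10)                          # truncated-exponential mags
u = rng.random(n_ev)
mags = m0 - np.log(1 - u * (1 - np.exp(-beta * (m_max - m0)))) / beta

dist = rng.uniform(10, 60, n_ev)                   # site distances, km
ln_pga = -4.0 + 1.0 * mags - 1.3 * np.log(dist) + rng.normal(0, 0.6, n_ev)
pga = np.exp(ln_pga)                               # toy GMPE, PGA in g

rate = np.sum(pga > 0.2) / n_years                 # annual exceedance of 0.2 g
rp = 1 / rate if rate else float("inf")
print(f"lambda(PGA > 0.2 g) = {rate:.2e}/yr, return period ~ {rp:,.0f} yr")
```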

  18. Preliminary hazards analysis of thermal scrap stabilization system. Revision 1

    SciTech Connect

    Lewis, W.S.

    1994-08-23

    This preliminary analysis examined the HA-21I glovebox and its supporting systems for potential process hazards. Subsequent to that analysis, the thermal stabilization system was installed in gloveboxes HC-21A and HC-21C. The use of HC-21C and HC-21A simplified the initial safety analysis. In addition, these gloveboxes were cleaner and required less modification for operation than glovebox HA-21I. While this document refers to glovebox HA-21I for the hazards analysis performed, glovebox HC-21C is sufficiently similar that the following analysis is also valid for HC-21C. This hazards analysis document is being re-released as revision 1 to include the updated flowsheet document (Appendix C) and the updated design basis (Appendix D). The revised Process Flow Schematic has also been included (Appendix E). This current revision also incorporates the recommendations provided in the original hazards analysis. The System Design Description (SDD) has also been appended (Appendix H) to document the bases for safety classification of thermal stabilization equipment.

  19. Why is Probabilistic Seismic Hazard Analysis (PSHA) still used?

    NASA Astrophysics Data System (ADS)

    Mulargia, Francesco; Stark, Philip B.; Geller, Robert J.

    2017-03-01

    Even though it has never been validated by objective testing, Probabilistic Seismic Hazard Analysis (PSHA) has been widely used for almost 50 years by governments and industry in applications with lives and property hanging in the balance, such as deciding safety criteria for nuclear power plants, making official national hazard maps, developing building code requirements, and determining earthquake insurance rates. PSHA rests on assumptions now known to conflict with earthquake physics; many damaging earthquakes, including the 1988 Spitak, Armenia, event and the 2011 Tohoku, Japan, event, have occurred in regions rated relatively low-risk by PSHA hazard maps. No extant method, including PSHA, produces reliable estimates of seismic hazard. Earthquake hazard mitigation should be recognized to be inherently political, involving a tradeoff between uncertain costs and uncertain risks. Earthquake scientists, engineers, and risk managers can make important contributions to the hard problem of allocating limited resources wisely, but government officials and stakeholders must take responsibility for the risks of accidents due to natural events that exceed the adopted safety criteria. "Without an analysis of the physical causes of recorded floods, and of the whole geophysical, biophysical and anthropogenic context which circumscribes the potential for flood formation, results of flood frequency analysis as [now practiced], rather than providing information useful for coping with the flood hazard, themselves represent an additional hazard that can contribute to damages caused by floods. This danger is very real since decisions made on the basis of wrong numbers presented as good estimates of flood probabilities will generally be worse than decisions made with an awareness of an impossibility to make a good estimate and with the aid of merely qualitative information on the general flooding potential."

  20. Influence of analysis methods on interpretation of hazard maps.

    PubMed

    Koehler, Kirsten A; Peters, Thomas M

    2013-06-01

    Exposure or hazard mapping is becoming increasingly popular among industrial hygienists. Direct-reading instruments used for hazard-mapping data collection are steadily increasing in reliability and portability while decreasing in cost. Exposure measurements made with these instruments generally require no laboratory analysis, although hazard mapping can be a time-consuming process. To inform decision making by industrial hygienists and management, it is crucial that the maps generated from mapping data are as accurate and representative as possible. Currently, it is unclear how many sampling locations are necessary to produce a representative hazard map. As such, researchers typically collect as many points as can be sampled in several hours, and interpolation methods are used to produce higher-resolution maps. We have reanalyzed hazard-mapping data sets from three industrial settings to determine which interpolation methods yield the most accurate results. The goal is to provide practicing industrial hygienists with some practical guidelines to generate accurate hazard maps with 'off-the-shelf' mapping software. Visually verifying the fit of the variogram model is crucial for accurate interpolation. Exponential and spherical variogram models performed better than Gaussian models. It was also necessary to diverge from some of the default interpolation parameters, such as the number of bins used for the experimental variogram and whether or not to allow for a nugget effect, to achieve reasonable accuracy of the interpolation for some data sets.
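
    The three variogram models compared in the study have standard parametric forms; the sketch below writes them out so the difference in shape near the origin (the source of the Gaussian model's poorer performance here) is easy to plot. Parameter names follow common geostatistics usage and are not tied to the paper's data.

```python
import numpy as np

def spherical(h, nugget, sill, a):
    """Spherical model: linear near the origin, exactly flat beyond range a."""
    h = np.asarray(h, float)
    g = nugget + (sill - nugget) * (1.5 * h / a - 0.5 * (h / a) ** 3)
    return np.where(h < a, g, sill)

def exponential(h, nugget, sill, a):
    """Exponential model: approaches the sill asymptotically."""
    return nugget + (sill - nugget) * (1 - np.exp(-3 * np.asarray(h, float) / a))

def gaussian(h, nugget, sill, a):
    """Gaussian model: parabolic near the origin, yields very smooth maps."""
    h = np.asarray(h, float)
    return nugget + (sill - nugget) * (1 - np.exp(-3 * (h / a) ** 2))

h = np.linspace(0, 100, 6)
for model in (spherical, exponential, gaussian):
    print(model.__name__, np.round(model(h, 0.05, 1.0, 60.0), 3))
```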

  1. Tracking Hazard Analysis Data in a Jungle of Changing Design

    SciTech Connect

    Sullivan, Robin S.; Young, Jonathan

    2006-05-14

    The biggest fear of the hazard analyst is the loss of data in the middle of the design jungle. When project schedules are demanding and design is changing rapidly, it is essential that the hazard analysis data be tracked and kept current in order to provide the required project design, development, and regulatory support. Being able to identify the current information, as well as the past archived information, as the design progresses, and to be able to show how the project is designing in safety through modifications based on hazard analysis results, is imperative. At the DOE Hanford site in Washington State, Fluor Hanford, Inc. is in the process of removing and dispositioning sludge from the 100 Area K Basins. The K Basins were used to store spent fuel from the operating reactors at the Hanford Site. The sludge is a by-product of the corrosion of the fuel and fuel storage canisters. The sludge removal project has been very dynamic, involving the design, procurement and, more recently, the operation of processes at two basins, K East and K West. The project has an ambitious schedule with a large number of changes to design concepts. In order to support the complex K Basins project, a technique to track the status of the hazard analysis data was developed. This paper identifies the most important elements of the tracking system and describes how it was used to ensure that current design data was reflected in a specific version of the hazard analysis, and to show how the project was keeping up with the design and ensuring compliance with the requirement to design in safety. While the specifics of the data tracking strategy for the K Basins sludge removal project are described in the paper, the general concepts of the strategy are applicable to similar projects requiring iteration of hazard analysis and design.

  2. Landslide hazards and systems analysis: A Central European perspective

    NASA Astrophysics Data System (ADS)

    Klose, Martin; Damm, Bodo; Kreuzer, Thomas

    2016-04-01

    Part of the problem with assessing landslide hazards is to understand the variable settings in which they occur. There is growing consensus that hazard assessments require integrated approaches that take account of the coupled human-environment system. Here we provide a synthesis of societal exposure and vulnerability to landslide hazards, review innovative approaches to hazard identification, and lay a focus on hazard assessment, while presenting the results of historical case studies and a landslide time series for Germany. The findings add to a growing body of literature that recognizes societal exposure and vulnerability as a complex system of hazard interactions that evolves over time as a function of social change and development. We therefore propose to expand hazard assessments by the framework and concepts of systems analysis (e.g., Liu et al., 2007). Results so far have been promising in ways that illustrate the importance of feedbacks, thresholds, surprises, and time lags in the evolution of landslide hazard and risk. In densely populated areas of Central Europe, landslides often occur in urbanized landscapes or on engineered slopes that had been transformed or created intentionally by human activity, sometimes even centuries ago. The example of Germany makes it possible to correlate the causes and effects of recent landslides with the historical transition of urbanization to urban sprawl, ongoing demographic change, and some chronic problems of industrialized countries today, including ageing infrastructures or rising government debts. In large parts of rural Germany, the combination of ageing infrastructures, population loss, and increasing budget deficits starts to erode historical resilience gains, which brings especially small communities to a tipping point in their efforts at risk reduction. While struggling with budget deficits and demographic change, these communities are required to maintain ageing infrastructures that are particularly vulnerable to

  3. Hazardous-waste analysis plan for LLNL operations

    SciTech Connect

    Roberts, R.S.

    1982-02-12

    The Lawrence Livermore National Laboratory is involved in many facets of research, ranging from nuclear weapons research to advanced biomedical studies. Approximately 80% of all programs at LLNL generate hazardous waste in one form or another. Aside from waste produced by industrial-type operations (oils, solvents, bottom sludges, etc.), many unique and toxic wastes are generated, such as phosgene, dioxin (TCDD), radioactive wastes and high explosives. Any successful waste management program must address the following: proper identification of the waste, safe handling procedures, and proper storage containers and areas. This section of the Waste Management Plan addresses the methodologies used for the analysis of hazardous waste. In addition to the wastes defined in 40 CFR 261, LLNL and Site 300 also generate radioactive waste not specifically covered by RCRA. However, for completeness, the Waste Analysis Plan will address all hazardous waste.

  4. Historical analysis of US pipeline accidents triggered by natural hazards

    NASA Astrophysics Data System (ADS)

    Girgin, Serkan; Krausmann, Elisabeth

    2015-04-01

    Natural hazards, such as earthquakes, floods, landslides, or lightning, can initiate accidents in oil and gas pipelines with potentially major consequences for the population or the environment due to toxic releases, fires and explosions. Accidents of this type are also referred to as Natech events. Many major accidents highlight the risk associated with natural-hazard impact on pipelines transporting dangerous substances. For instance, in the USA in 1994, flooding of the San Jacinto River caused the rupture of 8 pipelines and the undermining of 29 others by the floodwaters. About 5.5 million litres of petroleum and related products were spilled into the river and ignited. As a result, 547 people were injured and significant environmental damage occurred. Post-incident analysis is a valuable tool for better understanding the causes, dynamics and impacts of pipeline Natech accidents in support of future accident prevention and mitigation. Therefore, data on onshore hazardous-liquid pipeline accidents collected by the US Pipeline and Hazardous Materials Safety Administration (PHMSA) was analysed. For this purpose, a database-driven incident data analysis system was developed to aid the rapid review and categorization of PHMSA incident reports. Using an automated data-mining process followed by a peer review of the incident records, supported by natural hazard databases and external information sources, the pipeline Natechs were identified. As a by-product of the data-collection process, the database now includes over 800,000 incidents from all causes in industrial and transportation activities, which are automatically classified in the same way as the PHMSA records. This presentation describes the data collection and reviewing steps conducted during the study, provides information on the developed database and data analysis tools, and reports the findings of a statistical analysis of the identified hazardous liquid pipeline incidents in terms of accident dynamics and

  5. Quantitative rock-fall hazard and risk assessment for Yosemite Valley, Yosemite National Park, California

    USGS Publications Warehouse

    Stock, Greg M.; Luco, Nicolas; Collins, Brian D.; Harp, Edwin L.; Reichenbach, Paola; Frankel, Kurt L.

    2014-01-01

    Rock falls are common in Yosemite Valley, California, posing substantial hazard and risk to the approximately four million annual visitors to Yosemite National Park. Rock falls in Yosemite Valley over the past few decades have damaged structures and caused injuries within developed regions located on or adjacent to talus slopes, highlighting the need for additional investigations into rock-fall hazard and risk. This assessment builds upon previous investigations of rock-fall hazard and risk in Yosemite Valley and focuses on hazard and risk to structures posed by relatively frequent fragmental-type rock falls as large as approximately 100,000 cubic meters in volume.

  6. Mobile app-based quantitative scanometric analysis.

    PubMed

    Wong, Jessica X H; Liu, Frank S F; Yu, Hua-Zhong

    2014-12-16

    The feasibility of using smartphones and other mobile devices as the detection platform for quantitative scanometric assays is demonstrated. The different scanning modes (color, grayscale, black/white) and grayscale converting protocols (average, weighted average/luminosity, and software specific) have been compared in determining the optical darkness ratio (ODR) values, a conventional quantitation measure for scanometric assays. A mobile app was developed to image and analyze scanometric assays, as demonstrated by paper-printed tests and a biotin-streptavidin assay on a plastic substrate. Primarily for ODR analysis, the app has been shown to perform as well as a traditional desktop scanner, demonstrating that smartphones (and other mobile devices) promise to be a practical platform for accurate, quantitative chemical analysis and medical diagnostics.
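
    To make the quantitation step concrete, the sketch below computes a spot-versus-background darkness measure after greyscale conversion in two of the modes the paper compares. The ODR formula used here (relative darkness of the spot against local background) is an assumption for illustration; the paper's own definition should be consulted before reuse.

```python
import numpy as np

def to_gray(rgb, mode="luminosity"):
    """Two of the greyscale conversions compared in the paper."""
    weights = {"average": (1 / 3, 1 / 3, 1 / 3),
               "luminosity": (0.21, 0.72, 0.07)}[mode]
    return rgb.astype(float) @ np.asarray(weights)

def optical_darkness_ratio(spot, background):
    """Assumed ODR form: darkness of the spot relative to local background,
    from mean greyscale intensities in [0, 255]."""
    return (background - spot) / background

rgb_spot = np.random.randint(60, 90, (20, 20, 3))    # stand-in spot pixels
rgb_bg = np.random.randint(200, 230, (20, 20, 3))    # stand-in background
odr = optical_darkness_ratio(to_gray(rgb_spot).mean(), to_gray(rgb_bg).mean())
print(f"ODR = {odr:.3f}")
```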

  7. Quantitative WDS analysis using electron probe microanalyzer

    SciTech Connect

    Ul-Hamid, Anwar . E-mail: anwar@kfupm.edu.sa; Tawancy, Hani M.; Mohammed, Abdul-Rashid I.; Al-Jaroudi, Said S.; Abbas, Nureddin M.

    2006-04-15

    In this paper, the procedure for conducting quantitative elemental analysis by the ZAF correction method using wavelength dispersive X-ray spectroscopy (WDS) in an electron probe microanalyzer (EPMA) is elaborated. Analysis of a thermal barrier coating (TBC) system formed on a Ni-based single-crystal superalloy is presented as an example to illustrate the analysis of samples consisting of a large number of major and minor elements. The analysis was performed using known standards and measured peak-to-background intensity ratios. The procedure for using separate sets of acquisition conditions for major and minor element analysis is explained and its importance is stressed.
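
    The first-approximation arithmetic behind ZAF quantitation is compact enough to show inline: a measured intensity k-ratio against a standard, multiplied by atomic-number (Z), absorption (A), and fluorescence (F) corrections. The sketch below is a single-pass illustration with made-up factors; real EPMA software iterates, because the corrections themselves depend on the composition being solved for.

```python
def zaf_concentration(c_std, i_unknown, i_std, z, a, f):
    """Single-pass ZAF quantitation: measured k-ratio times the atomic-number,
    absorption, and fluorescence corrections. Real EPMA software iterates,
    since Z, A, and F depend on the composition being solved for."""
    k = i_unknown / i_std          # intensity ratio against the standard
    return c_std * k * z * a * f   # estimated mass fraction in the unknown

# Hypothetical counts and correction factors for one element:
c = zaf_concentration(c_std=1.0, i_unknown=420.0, i_std=1000.0,
                      z=1.02, a=1.05, f=0.99)
print(f"~{c:.3f} mass fraction")   # ~0.445
```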

  8. Fire Hazard Analysis for the Cold Vacuum Drying (CVD) Facility

    SciTech Connect

    JOHNSON, B.H.

    1999-08-19

    This Fire Hazard Analysis assesses the risk from fire within individual fire areas in the Cold Vacuum Drying Facility at the Hanford Site in relation to existing or proposed fire protection features to ascertain whether the objectives of DOE Order 5480.7A Fire Protection are met.

  9. Appalachian Play Fairway Analysis Seismic Hazards Supporting Data

    SciTech Connect

    Frank Horowitz

    2016-07-20

    These are the data used in estimating the seismic hazards (both natural and induced) for candidate direct-use geothermal locations in the Appalachian Basin Play Fairway Analysis by Jordan et al. (2015). Bounding box: xMin, yMin = -83.1407, 36.7461; xMax, yMax = -71.5175, 45.1729.

  10. Phase 2 fire hazard analysis for the canister storage building

    SciTech Connect

    Sadanaga, C.T., Westinghouse Hanford

    1996-07-01

    The fire hazard analysis assesses the risk from fire in a facility to ascertain whether the fire protection policies are met. This document provides a preliminary FHA for the CSB facility. Open items have been noted in the document. A final FHA will be required at the completion of definitive design, prior to operation of the facility.

  11. Implementation of hazard analysis critical control point in jameed production.

    PubMed

    Al-Saed, A K; Al-Groum, R M; Al-Dabbas, M M

    2012-06-01

    The averages of standard plate count and of coliform, Staphylococcus aureus and Salmonella counts for three home-made jameed samples, a traditional fermented dairy product, before applying the hazard analysis critical control point system were 2.1 × 10³, 8.9 × 10¹, 4 × 10¹ and less than 10 cfu/g, respectively. The developed hazard analysis critical control point plan resulted in identifying ten critical control points in the flow chart of jameed production. The critical control points included fresh milk receiving, pasteurization, addition of starter, water and salt, straining, personnel hygiene, drying and packaging. After applying the hazard analysis critical control point system, there was significant improvement in the microbiological quality of the home-made jameed. The standard plate count was reduced to 3.1 × 10² cfu/g, whereas coliform and Staphylococcus aureus counts were less than 10 cfu/g and Salmonella was not detected. Sensory evaluation results of color and flavor of sauce prepared from jameed showed a significant increase in the average scores given after hazard analysis critical control point application.

  12. Fire Hazards Analysis for the Inactive Equipment Storage Sprung Structure

    SciTech Connect

    MYOTT, C.F.

    2000-02-03

    The purpose of the analysis is to comprehensively assess the risk from fire within individual fire areas in relation to proposed fire protection so as to ascertain whether the fire protection objectives of DOE Order 5480.1A are met. The Order acknowledges a graded approach commensurate with the hazards involved.

  13. Risk analysis for roadways subjected to multiple landslide-related hazards

    NASA Astrophysics Data System (ADS)

    Corominas, Jordi; Mavrouli, Olga

    2014-05-01

    dynamic factors: the service load and the wall deformation. The consequences are then calculated for each hazard type according to its characteristics (mechanism, magnitude, frequency). The difference between this method and other methodologies for landslide-related hazards lies in the hazard scenarios and consequence profiles that are investigated. The depth of analysis makes it possible to account for local conditions concerning either the hazard or the consequences (the latter with respect to the very particular characteristics of the roadway, such as traffic, number of lanes, velocity…). Furthermore, it provides an extensive list of quantitative risk descriptors, including both individual and collective ones. The methodology was automated using Microsoft Excel spreadsheets. The results can be used to support decision-making for the planning of protection measures. Gaps in knowledge and restrictions are discussed as well.

  14. Seniors' Online Communities: A Quantitative Content Analysis

    ERIC Educational Resources Information Center

    Nimrod, Galit

    2010-01-01

    Purpose: To examine the contents and characteristics of seniors' online communities and to explore their potential benefits to older adults. Design and Methods: Quantitative content analysis of a full year's data from 14 leading online communities using a novel computerized system. The overall database included 686,283 messages. Results: There was…

  16. Quantitative analysis of arm movement smoothness

    NASA Astrophysics Data System (ADS)

    Szczesna, Agnieszka; Błaszczyszyn, Monika

    2017-07-01

    The paper deals with the problem of quantitative smoothness analysis of motion data. We investigated values of movement unit, fluidity and jerk for the healthy and paralyzed arms of patients with hemiparesis after stroke. Patients performed a drinking task. To validate the approach, the movements of 24 patients were captured using an optical motion capture system.
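
    Of the three measures named, jerk is the most self-contained to compute. The sketch below evaluates one common dimensionless (duration- and amplitude-normalized) integrated-squared-jerk metric on a synthetic minimum-jerk-like reach; normalization conventions vary across the smoothness literature, so this is one plausible variant rather than the paper's exact formula.

```python
import numpy as np

def dimensionless_jerk(position, fs):
    """Integrated squared jerk scaled by duration^5 / amplitude^2 (one common
    convention; lower values indicate smoother movement).

    position -- (N, 3) marker coordinates in metres
    fs       -- sampling rate in Hz
    """
    dt = 1.0 / fs
    jerk = np.diff(position, n=3, axis=0) / dt ** 3   # third time derivative
    duration = (len(position) - 1) * dt
    amplitude = np.linalg.norm(position[-1] - position[0])
    return np.sum(jerk ** 2) * dt * duration ** 5 / amplitude ** 2

# Synthetic 2 s minimum-jerk-like reach sampled at 100 Hz:
tau = np.linspace(0, 1, 200)[:, None]
pos = 0.3 * np.hstack([10 * tau**3 - 15 * tau**4 + 6 * tau**5] * 3)
print(f"{dimensionless_jerk(pos, fs=100):.1f} (lower = smoother)")
```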

  17. A quantitative approach to scar analysis.

    PubMed

    Khorasani, Hooman; Zheng, Zhong; Nguyen, Calvin; Zara, Janette; Zhang, Xinli; Wang, Joyce; Ting, Kang; Soo, Chia

    2011-02-01

    Analysis of collagen architecture is essential to wound healing research. However, to date no consistent methodologies exist for quantitatively assessing dermal collagen architecture in scars. In this study, we developed a standardized approach for quantitative analysis of scar collagen morphology by confocal microscopy using fractal dimension and lacunarity analysis. Full-thickness wounds were created on adult mice, closed by primary intention, and harvested at 14 days after wounding for morphometrics and standard Fourier transform-based scar analysis as well as fractal dimension and lacunarity analysis. In addition, transmission electron microscopy was used to evaluate collagen ultrastructure. We demonstrated that fractal dimension and lacunarity analysis were superior to Fourier transform analysis in discriminating scar versus unwounded tissue in a wild-type mouse model. To fully test the robustness of this scar analysis approach, a fibromodulin-null mouse model that heals with increased scar was also used. Fractal dimension and lacunarity analysis effectively discriminated unwounded fibromodulin-null versus wild-type skin as well as healing fibromodulin-null versus wild-type wounds, whereas Fourier transform analysis failed to do so. Furthermore, fractal dimension and lacunarity data also correlated well with transmission electron microscopy collagen ultrastructure analysis, adding to their validity. These results demonstrate that fractal dimension and lacunarity are more sensitive than Fourier transform analysis for quantification of scar morphology.
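
    Box counting is the usual estimator behind the fractal dimension reported here. The sketch below is a minimal version for a binary (thresholded) image; the published pipeline additionally used confocal imaging, lacunarity (gliding-box) analysis, and careful preprocessing that this toy omits.

```python
import numpy as np

def box_counting_dimension(mask):
    """Box-counting fractal dimension of a binary image (e.g., a thresholded
    collagen-fiber micrograph). Crops to the largest power-of-two square."""
    n = 2 ** int(np.floor(np.log2(min(mask.shape))))
    img = mask[:n, :n]
    sizes, counts = [], []
    size = n
    while size >= 2:
        blocks = img.reshape(n // size, size, n // size, size)
        counts.append(blocks.any(axis=(1, 3)).sum())   # occupied boxes
        sizes.append(size)
        size //= 2
    # Dimension = slope of log(count) against log(1/size)
    slope, _ = np.polyfit(np.log(1.0 / np.asarray(sizes)), np.log(counts), 1)
    return slope

mask = np.random.random((256, 256)) > 0.6             # stand-in binary image
print(f"D ~ {box_counting_dimension(mask):.2f}")      # near 2.0 for dense fill
```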

  18. A Quantitative Approach to Scar Analysis

    PubMed Central

    Khorasani, Hooman; Zheng, Zhong; Nguyen, Calvin; Zara, Janette; Zhang, Xinli; Wang, Joyce; Ting, Kang; Soo, Chia

    2011-01-01

    Analysis of collagen architecture is essential to wound healing research. However, to date no consistent methodologies exist for quantitatively assessing dermal collagen architecture in scars. In this study, we developed a standardized approach for quantitative analysis of scar collagen morphology by confocal microscopy using fractal dimension and lacunarity analysis. Full-thickness wounds were created on adult mice, closed by primary intention, and harvested at 14 days after wounding for morphometrics and standard Fourier transform-based scar analysis as well as fractal dimension and lacunarity analysis. In addition, transmission electron microscopy was used to evaluate collagen ultrastructure. We demonstrated that fractal dimension and lacunarity analysis were superior to Fourier transform analysis in discriminating scar versus unwounded tissue in a wild-type mouse model. To fully test the robustness of this scar analysis approach, a fibromodulin-null mouse model that heals with increased scar was also used. Fractal dimension and lacunarity analysis effectively discriminated unwounded fibromodulin-null versus wild-type skin as well as healing fibromodulin-null versus wild-type wounds, whereas Fourier transform analysis failed to do so. Furthermore, fractal dimension and lacunarity data also correlated well with transmission electron microscopy collagen ultrastructure analysis, adding to their validity. These results demonstrate that fractal dimension and lacunarity are more sensitive than Fourier transform analysis for quantification of scar morphology. PMID:21281794

  19. Hazard analysis of Clostridium perfringens in the Skylab Food System

    NASA Technical Reports Server (NTRS)

    Bourland, C. T.; Huber, C. S.; Kiser, P. R.; Heidelbaugh, N. D.; Rowley, D. B.

    1974-01-01

    The Skylab Food System presented unique microbiological problems because food was warmed in null-gravity and because the heat source was limited to 69.4 C (to prevent boiling in null-gravity). For these reasons, the foods were manufactured using critical control point techniques of quality control coupled with appropriate hazard analyses. One of these hazard analyses evaluated the threat from Clostridium perfringens. Samples of food were inoculated with C. perfringens and incubated for 2 h at temperatures ranging from 25 to 55 C. Generation times were determined for the foods at various temperatures. Results of these tests were evaluated taking into consideration: food-borne disease epidemiology, the Skylab food manufacturing procedures, and the performance requirements of the Skylab Food System. Based on this hazard analysis, a limit for C. perfringens of 100/g was established for Skylab foods.
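
    The generation times at the heart of this hazard analysis follow from simple exponential-growth arithmetic, N = N0 · 2^(t/g). The sketch below inverts that relation for hypothetical plate counts and shows how a 100/g limit translates into an allowable holding time; the numbers are illustrative, not Skylab data.

```python
import math

def generation_time(n0, n_t, elapsed_h):
    """Doubling time from two counts, assuming N = N0 * 2^(t/g)."""
    return elapsed_h * math.log(2) / math.log(n_t / n0)

# Hypothetical counts for an inoculated item held 2 h at a warm temperature:
g = generation_time(n0=1.0e2, n_t=1.6e3, elapsed_h=2.0)
print(f"g = {g * 60:.0f} min per doubling")            # 30 min

# Time for 10/g of initial contamination to reach a 100/g limit:
print(f"~{g * math.log2(100 / 10):.1f} h to the limit")
```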

  20. PO*WW*ER mobile treatment unit process hazards analysis

    SciTech Connect

    Richardson, R.B.

    1996-06-01

    The objective of this report is to demonstrate that a thorough assessment of the risks associated with the operation of the Rust Geotech patented PO*WW*ER mobile treatment unit (MTU) has been performed and documented. The MTU was developed to treat aqueous mixed wastes at the US Department of Energy (DOE) Albuquerque Operations Office sites. The MTU uses evaporation to separate organics and water from radionuclides and solids, and catalytic oxidation to convert the hazardous organics into byproducts. This process hazards analysis evaluated a number of accident scenarios not directly related to the operation of the MTU, such as natural phenomena damage and mishandling of chemical containers. Worst case accident scenarios were further evaluated to determine the risk potential to the MTU and to workers, the public, and the environment. The overall risk to any group from operation of the MTU was determined to be very low; the MTU is classified as a Radiological Facility with low hazards.

  2. Challenges to Seismic Hazard Analysis of Critical Infrastructures

    NASA Astrophysics Data System (ADS)

    Klügel, J.

    2005-12-01

    Based on the background of the review of a large-scale probabilistic seismic hazard analysis (PSHA) performed in Switzerland for the sites of Swiss nuclear power plants, the PEGASOS project (2000-2004), challenges to seismic hazard analysis of critical infrastructures from the perspective of a professional safety analyst are discussed. The PEGASOS study was performed to provide a meaningful input for the update of the plant-specific PRAs (Probabilistic Risk Assessments) of Swiss nuclear power plants. Earlier experience had shown that the results of these studies to a large extent are driven by the results of the seismic hazard analysis. The PEGASOS study was performed in full compliance with the procedures developed by the Senior Seismic Hazard Analysis Committee (SSHAC) of the U.S.A. (SSHAC, 1997) for the treatment of uncertainties by the use of a structured expert elicitation process. The preliminary results derived from the project showed an unexpected amount of uncertainty and were regarded as not suitable for direct application. A detailed review of the SSHAC methodology revealed a number of critical issues with respect to the treatment of uncertainties and the mathematical models applied, which will be presented in the paper. The most important issues to be discussed are:
    * The ambiguous solution of PSHA logic trees
    * The inadequate mathematical treatment of the results of expert elicitations based on the assumption of bias-free expert estimates
    * The problems associated with the "think model" of the separation of epistemic and aleatory uncertainties
    * The consequences of the ergodic assumption used to justify the transfer of attenuation equations of other regions to the region of interest.
    Based on these observations, methodological questions with respect to the development of a risk-consistent design basis for new nuclear power plants, as required by the U.S. NRC RG 1.165, will be evaluated. As a principal alternative for the development of a

  3. Method and apparatus for chromatographic quantitative analysis

    DOEpatents

    Fritz, James S.; Gjerde, Douglas T.; Schmuckler, Gabriella

    1981-06-09

    An improved apparatus and method for the quantitative analysis of a solution containing a plurality of anion species by ion exchange chromatography, which utilizes a single eluent and a single ion exchange bed that does not require periodic regeneration. The solution containing the anions is added to an anion exchange resin bed, which is a low-capacity macroreticular polystyrene-divinylbenzene resin containing quaternary ammonium functional groups, and is eluted therefrom with a dilute solution of a low-electrical-conductance organic acid salt. As each anion species is eluted from the bed, it is quantitatively sensed by conventional detection means such as a conductivity cell.

  4. Quantitative ADF STEM: acquisition, analysis and interpretation

    NASA Astrophysics Data System (ADS)

    Jones, L.

    2016-01-01

    Quantitative annular dark-field imaging in the scanning transmission electron microscope (ADF STEM), where image intensities are used to provide composition and thickness measurements, has enjoyed a renaissance during the last decade. Now, in a post-aberration-correction era, many aspects of the technique are being revisited. Here the recent progress and emerging best practice for such aberration-corrected quantitative ADF STEM are discussed, including issues relating to proper acquisition of experimental data and its calibration, approaches for data analysis, the utility of such data, its interpretation and limitations.

  5. Evaluation and Application of Probabilistic Tsunami Hazard Analysis in California

    NASA Astrophysics Data System (ADS)

    Thio, H. K.; Wilson, R. I.; Miller, K.

    2014-12-01

    The California Geological Survey (CGS) and URS Corporation are in the process of generating tsunami hazard map products for land-use planning and construction through the California Seismic Hazard Mapping Act (Public Resources Code, sec 2690 et seq.). Similar to seismic hazard zonation, these products for land-use and development decision-making are typically based on a probabilistic analysis and require information on the frequency of occurrence through a probabilistic tsunami hazard analysis (PTHA). In Phase 1 of CGS's work, the California PTHA Work Group was established to evaluate the results of PTHA demonstration projects in Crescent City and Huntington Beach. The results of this Phase 1 review of the two independent analyses indicate PTHAs can be developed with recommended improvements in source characterization, PTHA methods, and numerical model use. An immediate significant improvement is to align the characterization of the Cascadia Subduction Zone PTHA with the seismic characterization of the National Seismic Hazard Map Program of the USGS. In addition to applying PTHA to land-use planning and the two demonstration projects, CGS and the CA-PTHA Work Group identified other potential applications for various PTHA risk levels (ARP = Average Return Period), including flood insurance (100- and 500-year ARP), building codes (2,500-year ARP), and emergency response planning (1,000-year ARP or larger). CGS is working with URS Corp., the California Office of Emergency Services, and FEMA on a Phase 2 plan to produce a single set of reliable and consistent PTHA maps for multiple risk levels and work with various end-users to determine how to use the maps. The California PTHA and the results of the Work Group review are also proposed to be used by the U.S. National Tsunami Hazard Mitigation Program to develop guidelines for production in other coastal states.

  6. Quantitative analysis of blood vessel geometry

    NASA Astrophysics Data System (ADS)

    Fuhrman, Michael G.; Abdul-Karim, Othman; Shah, Sujal; Gilbert, Steven G.; Van Bibber, Richard

    2001-07-01

    Re-narrowing or restenosis of a human coronary artery occurs within six months in one third of balloon angioplasty procedures. Accurate and repeatable quantitative analysis of vessel shape is important to characterize the progression and type of restenosis, and to evaluate effects new therapies might have. A combination of complicated geometry and image variability, and the need for high resolution and large image size, makes visual/manual analysis slow, difficult, and prone to error. The image processing and analysis described here was developed to automate feature extraction of the lumen, internal elastic lamina, neointima, external elastic lamina, and tunica adventitia and to enable an objective, quantitative definition of blood vessel geometry. The quantitative geometrical analysis enables the measurement of several features including perimeter, area, and other metrics of vessel damage. Automation of feature extraction creates a high throughput capability that enables analysis of serial sections for more accurate measurement of restenosis dimensions. Measurement results are input into a relational database where they can be statistically analyzed and compared across studies. As part of the integrated process, results are also imprinted on the images themselves to facilitate auditing of the results. The analysis is fast, repeatable and accurate while allowing the pathologist to control the measurement process.

  7. D0 Detector Collision Hall Oxygen Deficiency Hazard Analysis

    SciTech Connect

    Wu, J.; /Fermilab

    1992-08-06

    EN-258, D0 Platform ODH Analysis, provided the oxygen deficiency hazard analysis for the D0 detector in the Assembly Hall. This note covers the same analysis, but revised for the Collision Hall. Liquid cryogens, released and warming to atmospheric conditions, expand on average to seven hundred times their liquid volume, and displace vital atmospheric oxygen. An oxygen deficiency hazard analysis assesses the increased risk to personnel in areas containing cryogenic systems. The D0 detector Collision Hall ODH analysis has been approached five different ways using established methods. If the low beta quad magnets are powered and the exhaust rate is below 4220 scfm, the area is ODH class 1. In any other case, the analysis shows the area to be ODH class 0 as equipped (with ventilation fans) and requiring no special safety provisions. System designers have provided for a reduced-oxygen-level detection and warning system as well as emergency procedures to address fault conditions.
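
    ODH classification at Fermilab-style facilities is driven by an hourly fatality rate, phi = sum over postulated release events of P_i (event probability per hour) times F_i (fatality factor for the resulting oxygen concentration). The sketch below applies assumed class thresholds and made-up event data; both the thresholds and the numbers should be checked against the current ES&H manual rather than taken from here.

```python
def odh_class(phi):
    """Map an hourly fatality rate to an ODH class; thresholds as commonly
    tabulated in Fermilab ES&H guidance (assumed here, verify before use)."""
    for cls, limit in ((0, 1e-7), (1, 1e-5), (2, 1e-3), (3, 1e-1)):
        if phi < limit:
            return cls
    return 4

# (event probability per hour, fatality factor at the resulting O2 level);
# both columns are made-up numbers for illustration.
events = [
    (1e-6, 1e-3),   # small magnet-lead leak with ventilation fans running
    (3e-8, 1.0),    # transfer-line rupture coincident with fan failure
]
phi = sum(p * f for p, f in events)
print(f"phi = {phi:.1e}/hr -> ODH class {odh_class(phi)}")   # class 0
```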

  8. Comprehensive Quantitative Analysis on Privacy Leak Behavior

    PubMed Central

    Fan, Lejun; Wang, Yuanzhuo; Jin, Xiaolong; Li, Jingyuan; Cheng, Xueqi; Jin, Shuyuan

    2013-01-01

    Privacy information is prone to be leaked by illegal software providers with various motivations. Privacy leak behavior has thus become an important research issue of cyber security. However, existing approaches can only qualitatively analyze privacy leak behavior of software applications. No quantitative approach, to the best of our knowledge, has been developed in the open literature. To fill this gap, in this paper we propose for the first time four quantitative metrics, namely, possibility, severity, crypticity, and manipulability, for privacy leak behavior analysis based on Privacy Petri Net (PPN). In order to compare the privacy leak behavior among different software, we further propose a comprehensive metric, namely, overall leak degree, based on these four metrics. Finally, we validate the effectiveness of the proposed approach using real-world software applications. The experimental results demonstrate that our approach can quantitatively analyze the privacy leak behaviors of various software types and reveal their characteristics from different aspects. PMID:24066046

  9. Comprehensive quantitative analysis on privacy leak behavior.

    PubMed

    Fan, Lejun; Wang, Yuanzhuo; Jin, Xiaolong; Li, Jingyuan; Cheng, Xueqi; Jin, Shuyuan

    2013-01-01

    Privacy information is prone to be leaked by illegal software providers with various motivations. Privacy leak behavior has thus become an important research issue of cyber security. However, existing approaches can only qualitatively analyze privacy leak behavior of software applications. No quantitative approach, to the best of our knowledge, has been developed in the open literature. To fill this gap, in this paper we propose for the first time four quantitative metrics, namely, possibility, severity, crypticity, and manipulability, for privacy leak behavior analysis based on Privacy Petri Net (PPN). In order to compare the privacy leak behavior among different software, we further propose a comprehensive metric, namely, overall leak degree, based on these four metrics. Finally, we validate the effectiveness of the proposed approach using real-world software applications. The experimental results demonstrate that our approach can quantitatively analyze the privacy leak behaviors of various software types and reveal their characteristics from different aspects.

  10. Quantitative rock-fall hazard and risk assessment for Yosemite Valley, Yosemite National Park, California

    USGS Publications Warehouse

    Stock, Greg M.; Luco, Nicolas; Collins, Brian D.; Harp, Edwin L.; Reichenbach, Paola; Frankel, Kurt L.

    2012-01-01

    caused injuries within developed regions located on or adjacent to talus slopes, highlighting the need for additional investigations into rock-fall hazard and risk. This assessment builds upon previous investigations of rock fall hazard and risk in Yosemite Valley (Wieczorek et al., 1998, 1999; Guzzetti et al., 2003; Wieczorek et al., 2008), and focuses on hazard and risk to structures posed by relatively frequent fragmental-type rock falls (Evans and Hungr, 1999), up to approximately 100,000 m3 in volume.

  11. High-energy PIXE: quantitative analysis

    NASA Astrophysics Data System (ADS)

    Denker, A.; Opitz-Coutureau, J.; Campbell, J. L.; Maxwell, J. A.; Hopman, T.

    2004-06-01

    In recent years, high-energy PIXE has been applied successfully to qualitative analysis of art and archaeological objects, e.g. coins, bronzes, sculptures, brooches. However, in the absence of software for quantitative analysis, the full benefit inherent in the PIXE technique was not obtained. For example, a bronze could easily be distinguished from a brass, but the concentrations could not be rigorously compared within a set of bronzes. In this paper, the first quantitative analysis by high-energy PIXE is presented. The Guelph PIXE Software Package GUPIX has been extended to proton energies up to 100 MeV, so that high-energy PIXE spectra can be evaluated and concentrations derived. Measurements on metal and alloy standards at two different proton energies have been performed and the obtained compositions were compared to the certified values. The results will be presented and deviations discussed.

  12. Hydraulic fracturing in unconventional reservoirs - Identification of hazards and strategies for a quantitative risk assessment

    NASA Astrophysics Data System (ADS)

    Helmig, R.; Kissinger, A.; Class, H.; Ebigbo, A.

    2012-12-01

    The production of unconventional gas resources that require a hydraulic fracturing process to be released, such as shale gas, tight gas and coal bed methane, has become an economically attractive technology for a continued supply of fossil-fuel energy sources in many countries. Just recently, a major focus of interest has been directed to hydraulic fracturing in Germany. The technology is controversial since it involves severe risks. The main difference in risk with respect to other subsurface technologies, such as carbon sequestration, is that fracking is remunerative, and it is important to distinguish between economic and environmental issues. The hydrofracking process may pose a threat to groundwater resources if fracking fluid or brine can migrate through fault zones into shallow aquifers. Diffuse methane emissions from the gas reservoir may not only contaminate shallow groundwater aquifers but also escape into the atmosphere, where methane acts as a greenhouse gas. The working group "Risks in the Geological System", as part of ExxonMobil's hydrofracking dialogue and information dissemination process, was tasked with the assessment of possible hazards posed by migrating fluids as a result of hydrofracking activities. In this work several flow paths for fracking fluid, brine and methane are identified, and scenarios are set up to qualitatively estimate under what circumstances these fluids would leak into shallower layers. The parametrization for potential fracking sites in North Rhine-Westphalia and Lower Saxony (both in Germany) is derived from the literature using upper and lower bounds of hydraulic parameters. The results show that significant fluid migration is only possible if a combination of several conservative assumptions is met by a scenario. Another outcome of this work is the demand for further research, as many of the processes involved in hydrofracking have not yet been fully understood (e.g. quantification of source terms for methane in the

  13. Quantitative analysis of colony morphology in yeast.

    PubMed

    Ruusuvuori, Pekka; Lin, Jake; Scott, Adrian C; Tan, Zhihao; Sorsa, Saija; Kallio, Aleksi; Nykter, Matti; Yli-Harja, Olli; Shmulevich, Ilya; Dudley, Aimée M

    2014-01-01

    Microorganisms often form multicellular structures such as biofilms and structured colonies that can influence the organism's virulence, drug resistance, and adherence to medical devices. Phenotypic classification of these structures has traditionally relied on qualitative scoring systems that limit detailed phenotypic comparisons between strains. Automated imaging and quantitative analysis have the potential to improve the speed and accuracy of experiments designed to study the genetic and molecular networks underlying different morphological traits. For this reason, we have developed a platform that uses automated image analysis and pattern recognition to quantify phenotypic signatures of yeast colonies. Our strategy enables quantitative analysis of individual colonies, measured at a single time point or over a series of time-lapse images, as well as the classification of distinct colony shapes based on image-derived features. Phenotypic changes in colony morphology can be expressed as changes in feature space trajectories over time, thereby enabling the visualization and quantitative analysis of morphological development. To facilitate data exploration, results are plotted dynamically through an interactive Yeast Image Analysis web application (YIMAA; http://yimaa.cs.tut.fi) that integrates the raw and processed images across all time points, allowing exploration of the image-based features and principal components associated with morphological development.

  14. Uncertainty analysis for seismic hazard in Northern and Central Italy

    USGS Publications Warehouse

    Lombardi, A.M.; Akinci, A.; Malagnini, L.; Mueller, C.S.

    2005-01-01

    In this study we examine uncertainty and parametric sensitivity of Peak Ground Acceleration (PGA) and 1-Hz Spectral Acceleration (1-Hz SA) in probabilistic seismic hazard maps (10% probability of exceedance in 50 years) of Northern and Central Italy. The uncertainty in hazard is estimated using a Monte Carlo approach to randomly sample a logic tree that has three input-variable branch points representing alternative values for b-value, maximum magnitude (Mmax) and attenuation relationships. Uncertainty is expressed in terms of the 95% confidence band and the Coefficient Of Variation (COV). The overall variability of ground motions and their sensitivity to each parameter of the logic tree are investigated. The largest values of the overall 95% confidence band are around 0.15 g for PGA in the Friuli and Northern Apennines regions and around 0.35 g for 1-Hz SA in the Central Apennines. The sensitivity analysis shows that the largest contributor to seismic hazard variability is uncertainty in the choice of ground-motion attenuation relationships, especially in the Friuli Region (∼0.10 g) for PGA and in the Friuli and Central Apennines regions (∼0.15 g) for 1-Hz SA. This is followed by the variability of the b-value: its main contribution is evident in the Friuli and Central Apennines regions for both 1-Hz SA (∼0.15 g) and PGA (∼0.10 g). We observe that the contribution of Mmax to seismic hazard variability is negligible, at least for the 10%-in-50-years hazard. The overall COV map for PGA shows that the uncertainty in the hazard is larger in the Friuli and Northern Apennines regions (around 20-30%) than in the Central Apennines and Northwestern Italy (around 10-20%). The overall uncertainty is larger for the 1-Hz SA map and reaches 50-60% in the Central Apennines and Western Alps.
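
    The study's Monte Carlo logic-tree approach can be sketched in a few lines: sample one branch per input variable according to its weight, evaluate the hazard for that branch combination, and summarize the spread of the results. A minimal sketch in Python; the branch values, weights, and the placeholder hazard_pga function are illustrative assumptions, not the study's actual inputs.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    # Hypothetical branch values and weights for the three logic-tree nodes:
    # b-value, maximum magnitude, and a ground-motion model scale factor.
    b_branches  = ([0.9, 1.0, 1.1],  [0.2, 0.6, 0.2])
    mx_branches = ([6.5, 7.0, 7.5],  [0.3, 0.5, 0.2])
    gm_branches = ([0.8, 1.0, 1.25], [0.25, 0.5, 0.25])

    def hazard_pga(b, mmax, scale):
        """Placeholder hazard model returning a 475-yr PGA (g) for one branch
        combination; a real analysis integrates over sources and a GMPE."""
        return scale * 0.3 * np.exp(0.9 - b) * (mmax / 7.0)

    samples = np.array([
        hazard_pga(rng.choice(b_branches[0], p=b_branches[1]),
                   rng.choice(mx_branches[0], p=mx_branches[1]),
                   rng.choice(gm_branches[0], p=gm_branches[1]))
        for _ in range(10_000)
    ])

    lo, hi = np.percentile(samples, [2.5, 97.5])   # 95% confidence band
    cov = samples.std() / samples.mean()           # coefficient of variation
    print(f"95% band: [{lo:.3f}, {hi:.3f}] g, COV = {cov:.2f}")
    ```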

  15. Kernel Smoothing Methods for Non-Poissonian Seismic Hazard Analysis

    NASA Astrophysics Data System (ADS)

    Woo, Gordon

    2017-04-01

    For almost fifty years, the mainstay of probabilistic seismic hazard analysis has been the methodology developed by Cornell, which assumes that earthquake occurrence is a Poisson process and that the spatial distribution of epicentres can be represented by a set of polygonal source zones, within which seismicity is uniform. Building on Vere-Jones' use of kernel smoothing methods for earthquake forecasting, these methods were adapted in 1994 by the author for application to probabilistic seismic hazard analysis. There is no need for ambiguous boundaries of polygonal source zones, nor for the hypothesis of time independence of earthquake sequences. In Europe, there are many regions where seismotectonic zones are not well delineated, and where there is a dynamic stress interaction between events, so that they cannot be described as independent. Following the Amatrice earthquake of 24 August 2016, the subsequent damaging earthquakes in Central Italy over the following months were not independent events. Removing foreshocks and aftershocks is not only an ill-defined task; it has a material effect on seismic hazard computation. Because of the spatial dispersion of epicentres, and the clustering of magnitudes for the largest events in a sequence, which might all be around magnitude 6, the specific event causing the highest ground motion can vary from one site location to another. Where significant active faults have been clearly identified geologically, they should be modelled as individual seismic sources. The remaining background seismicity should be modelled as non-Poissonian using statistical kernel smoothing methods. This approach was first applied for seismic hazard analysis at a UK nuclear power plant two decades ago, and should be included within logic trees for future probabilistic seismic hazard analyses at critical installations within Europe. In this paper, various salient European applications are given.
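
    A minimal sketch of the kernel-smoothing idea: instead of polygonal source zones, the activity rate is a smoothed surface obtained by placing a Gaussian kernel on each epicentre. The 30 km bandwidth and the grid are assumptions for illustration; the author's method also makes the bandwidth magnitude-dependent, which is omitted here.

    ```python
    import numpy as np

    def kernel_activity_density(epicentres_km, grid_x, grid_y, bandwidth_km=30.0):
        """Smoothed event density (events per km^2) on a grid, from an
        isotropic Gaussian kernel centred on each epicentre; dividing by the
        catalogue duration would turn this into an annual activity rate."""
        xx, yy = np.meshgrid(grid_x, grid_y)
        density = np.zeros_like(xx, dtype=float)
        for ex, ey in epicentres_km:
            r2 = (xx - ex) ** 2 + (yy - ey) ** 2
            density += np.exp(-r2 / (2 * bandwidth_km ** 2))
        return density / (2 * np.pi * bandwidth_km ** 2)
    ```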

  16. Decreased pain sensitivity due to trimethylbenzene exposure: case study on quantitative approaches for hazard identification

    EPA Science Inventory

    Traditionally, human health risk assessments have relied on qualitative approaches for hazard identification, often using the Hill criteria and weight of evidence determinations to integrate data from multiple studies. Recently, the National Research Council has recommended the ...

  17. Graded approach for initiating event selection in a facility hazard analysis

    SciTech Connect

    Majumdar, K.; Altenbach, T.

    1998-04-01

    This paper describes a methodology for selecting initiating events or event scenarios for the hazard analysis of a new Department of Energy (DOE) facility at the Nevada Test Site for nuclear explosive operations called the Device Assembly Facility (DAF). The selection process is a very important first step in conducting the hazard analysis for the facility, which in turn may feed into a quantitative risk analysis. A comprehensive risk analysis is dependent on the identification and inclusion of a complete set of initiating events in the analysis model. A systematic and logical method of grading or screening all the potential initiating events satisfies the needs for completeness within the bounds of efficiency and practicality. By applying the graded approach to the selection of the initiating events, the task and hazard analysis was able to focus its attention on only those events having the potential to develop into credible accident scenarios. Resources were concentrated into the understanding of those scenarios, and assuring that adequate positive measures are in place to control the risk associated with them.

  18. Probabilistic Seismic Hazard Disaggregation Analysis for the South of Portugal

    NASA Astrophysics Data System (ADS)

    Rodrigues, I.; Sousa, M.; Teves-Costa, P.

    2010-12-01

    Probabilistic seismic hazard disaggregation analysis was performed and seismic scenarios were identified for Southern Mainland Portugal. This region’s seismicity is characterized by small and moderate magnitude events and by the sporadic occurrence of large earthquakes (e.g. the 1755 Lisbon earthquake). Thus, the Portuguese Civil Protection Agency (ANPC) sponsored a collaborative research project for the study of the seismic and tsunami risks in the Algarve (project ERSTA). In the framework of this project, a series of new developments were obtained, namely the revision of the seismic catalogue (IM, 2008), the delineation of new seismogenic zones affecting the Algarve region, which reflects the growing knowledge of this region's seismotectonic context, the derivation of new spectral attenuation laws (Carvalho and Campos Costa, 2008) and the revision of the probabilistic seismic hazard (Sousa et al. 2008). Seismic hazard was disaggregated considering different spaces of random variables, namely bivariate conditional hazard distributions of X-Y (seismic source latitude and longitude) and multivariate 4D conditional hazard distributions of M-(X-Y)-ɛ (ɛ - deviation of ground motion from the median value predicted by an attenuation model). These procedures were performed for the peak ground acceleration (PGA) and for the 5% damped 1.0 and 2.5 Hz spectral acceleration levels of three return periods: 95, 475 and 975 years. The seismic scenarios controlling the hazard at a given ground motion level were identified as the modal values of the 4D disaggregation analysis for each of the 84 parishes of the Algarve region. Those scenarios, based on a probabilistic analysis, are meant to be used in emergency planning as a complement to the historical scenarios that severely affected this region. Seismic scenarios share a small number of geographical locations for all return periods. Moreover, the seismic hazard of most of the Algarve’s parishes is dominated by the seismicity located

  19. Technical Guidance for Hazardous Analysis, Emergency Planning for Extremely Hazardous Substances

    EPA Pesticide Factsheets

    This current guide supplements NRT-1 by providing technical assistance to LEPCs to assess the lethal hazards related to potential airborne releases of extremely hazardous substances (EHSs) as designated under Section 302 of Title III of SARA.

  20. Good practices for quantitative bias analysis.

    PubMed

    Lash, Timothy L; Fox, Matthew P; MacLehose, Richard F; Maldonado, George; McCandless, Lawrence C; Greenland, Sander

    2014-12-01

    Quantitative bias analysis serves several objectives in epidemiological research. First, it provides a quantitative estimate of the direction, magnitude and uncertainty arising from systematic errors. Second, the acts of identifying sources of systematic error, writing down models to quantify them, assigning values to the bias parameters and interpreting the results combat the human tendency towards overconfidence in research results, syntheses and critiques and the inferences that rest upon them. Finally, by suggesting aspects that dominate uncertainty in a particular research result or topic area, bias analysis can guide efficient allocation of sparse research resources. The fundamental methods of bias analysis have been known for decades, and there have been calls for more widespread use for nearly as long. There was a time when some believed that bias analyses were rarely undertaken because the methods were not widely known and because automated computing tools were not readily available to implement the methods. These shortcomings have been largely resolved. We must, therefore, contemplate other barriers to implementation. One possibility is that practitioners avoid the analyses because they lack confidence in the practice of bias analysis. The purpose of this paper is therefore to describe what we view as good practices for applying quantitative bias analysis to epidemiological data, directed towards those familiar with the methods. We focus on answering questions often posed to those of us who advocate incorporation of bias analysis methods into teaching and research. These include the following. When is bias analysis practical and productive? How does one select the biases that ought to be addressed? How does one select a method to model biases? How does one assign values to the parameters of a bias model? How does one present and interpret a bias analysis? We hope that our guide to good practices for conducting and presenting bias analyses will encourage
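
    As an illustration of one practice the paper discusses, a simple bias analysis for nondifferential exposure misclassification in a 2x2 table back-calculates corrected cell counts from assumed sensitivity and specificity, then recomputes the odds ratio. All counts and bias parameters below are hypothetical.

    ```python
    # Illustrative 2x2 counts: exposed/unexposed cases, exposed/unexposed controls.
    a, b, c, d = 45, 94, 257, 945

    se, sp = 0.90, 0.95   # assumed sensitivity and specificity of exposure
                          # classification (nondifferential across groups)

    def corrected_exposed(observed_exposed, group_total, se, sp):
        """Back-calculate the true number exposed from the observed count."""
        return (observed_exposed - group_total * (1 - sp)) / (se + sp - 1)

    A = corrected_exposed(a, a + b, se, sp)
    C = corrected_exposed(c, c + d, se, sp)
    B, D = (a + b) - A, (c + d) - C

    print(f"observed OR  = {a * d / (b * c):.2f}")
    print(f"corrected OR = {A * D / (B * C):.2f}")
    ```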

  1. RASOR Project: Rapid Analysis and Spatialisation of Risk, from Hazard to Risk using EO data

    NASA Astrophysics Data System (ADS)

    Rossi, Lauro; Rudari, Roberto

    2016-04-01

    Over recent decades, there has been a dramatic rise in disasters and in their impact on human populations. Escalating complexity in our societies is making risks increasingly difficult to understand and is changing the ways in which hazards interact with each other. The Rapid Analysis and Spatialisation Of Risk (RASOR) project developed a multi-hazard risk analysis platform to support the full cycle of disaster management. RASOR provides up-to-date hazard information across floods and geohazards, up-to-date exposure data from known sources and newly generated EO-based data, and quantitatively characterises their vulnerabilities. RASOR also adapts the newly developed 12 m resolution global TanDEM-X Digital Elevation Model (DEM) to risk management applications, using it as a base layer to develop specific disaster scenarios. RASOR overlays archived and near-real-time very high resolution optical and radar satellite data, combined with in situ data, for both global and local applications. A scenario-driven query system allows users to project situations into the future and model multi-hazard risk both before and during an event. Applications to different case study sites are presented in order to illustrate the platform's potential.

  2. Probabilistic Tsunami Hazard Analysis - Results for the Western United States

    NASA Astrophysics Data System (ADS)

    Thio, H.; Polet, J.; Somerville, P.

    2007-12-01

    We have developed a series of probabilistic tsunami hazard maps for the coasts of western North America based on fault source characterizations of the circum-Pacific subduction zones as well as local offshore faults. The maps show the probabilistic offshore exceedance waveheights at 72, 475, 975 and 2475 year return periods, which are the return periods typically used in Probabilistic Seismic Hazard Analysis (PSHA). Our method follows similar lines to PSHA, which has become standard practice in the evaluation and mitigation of seismic hazard, in particular with respect to structures, infrastructure and lifelines. Its ability to condense the complexities, variability and uncertainties of seismic activity into a manageable set of ground motion parameters greatly facilitates the planning and design of effective seismic-resistant buildings and infrastructure. Because of the strong dependence of tsunami wave heights on bathymetry, we use full tsunami waveform computations in lieu of the attenuation relations that are common in PSHA. By pre-computing and storing the tsunami waveforms at points along the coast generated for sets of subfaults that comprise larger earthquake faults, we can rapidly synthesize tsunami waveforms for any slip distribution on those faults by summing the individual weighted subfault tsunami waveforms. This Green's function summation provides accurate estimates of tsunami height for probabilistic calculations, where one typically integrates over thousands of earthquake scenarios. We have carried out tsunami hazard calculations for western North America and Hawaii based on a comprehensive source model around the Pacific Ocean including both subduction zone sources and local offshore faults. We will present the tsunami hazard maps and discuss how these results are used for probabilistic inundation mapping, including a follow-up inundation study of the San Francisco Bay area that is based on disaggregation results of the
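
    The Green's function summation described here is linear superposition: pre-computed unit-slip subfault waveforms are weighted by the scenario's slip values and summed. A minimal sketch, with array shapes assumed for illustration:

    ```python
    import numpy as np

    def tsunami_waveform(unit_waveforms, slip):
        """Synthesize the coastal waveform for one earthquake scenario.

        unit_waveforms : (n_subfaults, n_time) pre-computed waveforms for
                         unit slip on each subfault (the Green's functions)
        slip           : (n_subfaults,) slip on each subfault for the scenario
        """
        return slip @ unit_waveforms   # weighted sum over subfaults

    # In a probabilistic calculation one loops over thousands of scenarios:
    # peak_heights = [tsunami_waveform(G, s).max() for s in scenario_slips]
    ```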

  3. Quantitative analysis to guide orphan drug development.

    PubMed

    Lesko, L J

    2012-08-01

    The development of orphan drugs for rare diseases has made impressive strides in the past 10 years. There has been a surge in orphan drug designations, but new drug approvals have not kept up. This article presents a three-pronged hierarchical strategy for quantitative analysis of data at the descriptive, mechanistic, and systems levels of the biological system that could represent a standardized and rational approach to orphan drug development. Examples are provided to illustrate the concept.

  4. Long term volcanic hazard analysis in the Canary Islands

    NASA Astrophysics Data System (ADS)

    Becerril, L.; Galindo, I.; Laín, L.; Llorente, M.; Mancebo, M. J.

    2009-04-01

    Historic volcanism in Spain is restricted to the Canary Islands, a volcanic archipelago formed by seven volcanic islands. Several historic eruptions have been registered in the last five hundred years. However, despite the large numbers of residents and tourists in the archipelago, only a few volcanic hazard studies have been carried out. These studies have mainly focused on developing hazard maps for the islands of Lanzarote and Tenerife, especially for land-use planning. The main handicap for such studies in the Canary Islands is the scarcity of well-reported historical eruptions, as well as the lack of geochronological, geochemical and structural data. In recent years, the use of Geographical Information Systems (GIS) and the improvement in volcanic process modelling have provided important tools for volcanic hazard assessment. Although these sophisticated programs are very useful, they must be fed with large amounts of data that, as in the case of the Canary Islands, are sometimes unavailable. For this reason, the Spanish Geological Survey (IGME) is developing a complete geo-referenced database for long term volcanic analysis in the Canary Islands. The Canarian Volcanic Hazard Database (HADA) is based on a GIS, helping to organize and manage volcanic information efficiently. HADA includes the following groups of information: (1) 1:25.000 scale geologic maps, (2) 1:25.000 topographic maps, (3) geochronologic data, (4) geochemical data, (5) structural information, (6) climatic data. Data must pass a quality control before they are included in the database. New data are easily integrated in the database. With the HADA database the IGME has started a systematic organization of the existing data. In the near future, the IGME will generate new information to be included in HADA, such as volcanological maps of the islands, structural information, geochronological data and other information to assess long term volcanic hazard analysis. HADA will permit

  5. Surface Fire Hazards Analysis Technical Report-Constructor Facilities

    SciTech Connect

    R.E. Flye

    2000-10-24

    The purpose of this Fire Hazards Analysis Technical Report (hereinafter referred to as Technical Report) is to assess the risk from fire within individual fire areas to ascertain whether the U.S. Department of Energy (DOE) fire safety objectives are met. The objectives identified in DOE Order 420.1, Change 2, Facility Safety, Section 4.2, establish requirements for a comprehensive fire and related hazards protection program for facilities sufficient to minimize the potential for: The occurrence of a fire or related event; A fire that causes an unacceptable on-site or off-site release of hazardous or radiological material that will threaten the health and safety of employees, the public, or the environment; Vital DOE programs suffering unacceptable interruptions as a result of fire and related hazards; Property losses from a fire and related events exceeding defined limits established by DOE; and Critical process controls and safety class systems being damaged as a result of a fire and related events.

  6. Influence analysis in quantitative trait loci detection

    PubMed Central

    Dou, Xiaoling; Kuriki, Satoshi; Maeno, Akiteru; Takada, Toyoyuki; Shiroishi, Toshihiko

    2014-01-01

    This paper presents systematic methods for the detection of influential individuals that affect the log odds (LOD) score curve. We derive general formulas of influence functions for profile likelihoods and introduce them into two standard quantitative trait locus detection methods—the interval mapping method and single marker analysis. Besides influence analysis on specific LOD scores, we also develop influence analysis methods on the shape of the LOD score curves. A simulation-based method is proposed to assess the significance of the influence of the individuals. These methods are shown useful in the influence analysis of a real dataset of an experimental population from an F2 mouse cross. By receiver operating characteristic analysis, we confirm that the proposed methods show better performance than existing diagnostics. PMID:24740424

  7. Influence analysis in quantitative trait loci detection.

    PubMed

    Dou, Xiaoling; Kuriki, Satoshi; Maeno, Akiteru; Takada, Toyoyuki; Shiroishi, Toshihiko

    2014-07-01

    This paper presents systematic methods for the detection of influential individuals that affect the log odds (LOD) score curve. We derive general formulas of influence functions for profile likelihoods and introduce them into two standard quantitative trait locus detection methods-the interval mapping method and single marker analysis. Besides influence analysis on specific LOD scores, we also develop influence analysis methods on the shape of the LOD score curves. A simulation-based method is proposed to assess the significance of the influence of the individuals. These methods are shown useful in the influence analysis of a real dataset of an experimental population from an F2 mouse cross. By receiver operating characteristic analysis, we confirm that the proposed methods show better performance than existing diagnostics. © 2014 The Author. Biometrical Journal published by WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  8. A methodology for probabilistic fault displacement hazard analysis (PFDHA)

    USGS Publications Warehouse

    Youngs, R.R.; Arabasz, W.J.; Anderson, R.E.; Ramelli, A.R.; Ake, J.P.; Slemmons, D.B.; McCalpin, J.P.; Doser, D.I.; Fridrich, C.J.; Swan, F. H.; Rogers, A.M.; Yount, J.C.; Anderson, L.W.; Smith, K.D.; Bruhn, R.L.; Knuepfer, P.L.K.; Smith, R.B.; DePolo, C.M.; O'Leary, D. W.; Coppersmith, K.J.; Pezzopane, S.K.; Schwartz, D.P.; Whitney, J.W.; Olig, S.S.; Toro, G.R.

    2003-01-01

    We present a methodology for conducting a site-specific probabilistic analysis of fault displacement hazard. Two approaches are outlined. The first relates the occurrence of fault displacement at or near the ground surface to the occurrence of earthquakes, in the same manner as is done in a standard probabilistic seismic hazard analysis (PSHA) for ground shaking. The methodology for this approach is taken directly from PSHA methodology, with the ground-motion attenuation function replaced by a fault displacement attenuation function. In the second approach, the rate of displacement events and the distribution for fault displacement are derived directly from the characteristics of the faults or geologic features at the site of interest. The methodology for probabilistic fault displacement hazard analysis (PFDHA) was developed for a normal faulting environment, and the probability distributions we present may have general application in similar tectonic regions. In addition, the general methodology is applicable to any region, and we indicate the type of data needed to apply the methodology elsewhere.
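
    The arithmetic of the first (earthquake-based) approach mirrors PSHA: the annual rate of exceeding a displacement level is the earthquake rate times the conditional probability of exceedance, and a Poisson model converts that rate to a probability over an exposure window. A sketch with hypothetical numbers:

    ```python
    import math

    def displacement_hazard(event_rate, p_exceed_given_event, years=50.0):
        """Annual exceedance rate and Poisson probability over an exposure
        window for a given displacement level; inputs are illustrative."""
        lam = event_rate * p_exceed_given_event
        return lam, 1.0 - math.exp(-lam * years)

    lam, p = displacement_hazard(event_rate=1e-3, p_exceed_given_event=0.2)
    print(f"annual rate = {lam:.1e}, P(exceed in 50 yr) = {p:.4f}")
    ```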

  9. Quantitative resilience analysis through control design.

    SciTech Connect

    Sunderland, Daniel; Vugrin, Eric D.; Camphouse, Russell Chris

    2009-09-01

    Critical infrastructure resilience has become a national priority for the U. S. Department of Homeland Security. System resilience has been studied for several decades in many different disciplines, but no standards or unifying methods exist for critical infrastructure resilience analysis. Few quantitative resilience methods exist, and those existing approaches tend to be rather simplistic and, hence, not capable of sufficiently assessing all aspects of critical infrastructure resilience. This report documents the results of a late-start Laboratory Directed Research and Development (LDRD) project that investigated the development of quantitative resilience through application of control design methods. Specifically, we conducted a survey of infrastructure models to assess what types of control design might be applicable for critical infrastructure resilience assessment. As a result of this survey, we developed a decision process that directs the resilience analyst to the control method that is most likely applicable to the system under consideration. Furthermore, we developed optimal control strategies for two sets of representative infrastructure systems to demonstrate how control methods could be used to assess the resilience of the systems to catastrophic disruptions. We present recommendations for future work to continue the development of quantitative resilience analysis methods.

  10. Ergonomics hazards analysis of linemen's power line fixing work in China.

    PubMed

    Yu, Ming; Sun, Linyan; Du, Jianhua; Wu, Fengge

    2009-01-01

    This study used qualitative and quantitative methods, such as OWAS (Ovako working posture analysis system) and behavior observation, to analyze musculoskeletal disorder (MSD) risk factors of power line fixing work in China. Video-based sampling was used to record and analyze the frequency and posture of on-pole activities. These key subtasks characterized the ergonomics of on-pole fixing tasks. Insulator-fixing was the longest subtask (33% of total working time); bar-installing was the second longest (26% of total working time). Bar-installing and insulator-fixing clearly involved hazardous risks: the action categories of these two subtasks were higher than those of the other subtasks, and both were time-consuming, difficult, and conducive to MSDs. Assistant linemen faced more hazardous factors than chief linemen.

  11. Flood Hazard and Risk Analysis in Urban Area

    NASA Astrophysics Data System (ADS)

    Huang, Chen-Jia; Hsu, Ming-hsi; Teng, Wei-Hsien; Lin, Tsung-Hsien

    2017-04-01

    Typhoons always induce heavy rainfall during the summer and autumn seasons in Taiwan, and extreme weather in recent years has often caused severe flooding resulting in serious losses of life and property. With rapid industrial and commercial development, people care not only about the quality of life but also about the safety of life and property, so the impact of disasters on life and property is the residents' most serious concern. For the mitigation of disaster impacts, flood hazard and risk analysis play an important role in disaster prevention and mitigation. In this study, the vulnerability of Kaohsiung City was evaluated using statistics on social development factors. The hazard factors of Kaohsiung City were calculated from the simulated flood depths of six different return periods and four typhoon events that resulted in serious flooding in Kaohsiung City. The flood risk can then be obtained by combining the flood hazard and social vulnerability. The analysis results provide the authorities with a basis to strengthen disaster preparedness and to allocate more resources to high-risk areas.
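
    The combination described, risk obtained from hazard and social vulnerability, reduces to an element-wise product of normalized indices. A toy sketch with made-up district values and an assumed depth-to-hazard normalization:

    ```python
    import numpy as np

    depth_m       = np.array([0.2, 0.8, 1.5, 0.1])   # simulated flood depths
    vulnerability = np.array([0.3, 0.7, 0.5, 0.9])   # social index, 0-1

    hazard = np.clip(depth_m / 2.0, 0.0, 1.0)        # assumed normalization
    risk   = hazard * vulnerability                  # risk = hazard x vulnerability
    print(np.round(risk, 2))                         # rank areas for preparedness
    ```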

  12. A Hazard Analysis for a Generic Insulin Infusion Pump

    PubMed Central

    Zhang, Yi; Jones, Paul L.; Jetley, Raoul

    2010-01-01

    Background Researchers at the Food and Drug Administration (FDA)/Center for Devices and Radiological Health/Office of Science and Engineering Laboratories have been exploring the concept of model-based engineering as a means for improving the quality of medical device software. Insulin pumps were chosen as a research subject because their design provides the desired degree of research complexity and these types of devices present an ongoing regulatory challenge. Methods Insulin pump hazards and their contributing factors are considered in the context of a highly abstract generic insulin infusion pump (GIIP) model. Hazards were identified by consulting with manufacturers, pump users, and clinicians; by reviewing national and international standards and adverse event reports collected by the FDA; and from workshops sponsored by the Diabetes Technology Society. This information has been consolidated in tabular form to facilitate further community analysis and discussion. Results A generic insulin infusion pump model architecture has been established. A fairly comprehensive hazard analysis document, corresponding to the GIIP model, is presented in this article. Conclusions We believe that this work represents the genesis of an insulin pump safety reference standard upon which future insulin pump designs can be based to help ensure a basic level of safety. More interaction with the diabetes community is needed to assure the quality of this safety modeling process. PMID:20307387

  13. Lithium-thionyl chloride cell system safety hazard analysis

    NASA Astrophysics Data System (ADS)

    Dampier, F. W.

    1985-03-01

    This system safety analysis for the lithium thionyl chloride cell is a critical review of the technical literature pertaining to cell safety and draws conclusions and makes recommendations based on this data. The thermodynamics and kinetics of the electrochemical reactions occurring during discharge are discussed with particular attention given to unstable SOCl2 reduction intermediates. Potentially hazardous reactions between the various cell components and discharge products or impurities that could occur during electrical or thermal abuse are described and the most hazardous conditions and reactions identified. Design factors influencing the safety of Li/SOCl2 cells, shipping and disposal methods and the toxicity of Li/SOCl2 battery components are additional safety issues that are also addressed.

  14. Deep Borehole Emplacement Mode Hazard Analysis Revision 0

    SciTech Connect

    Sevougian, S. David

    2015-08-07

    This letter report outlines a methodology and provides resource information for the Deep Borehole Emplacement Mode Hazard Analysis (DBEMHA). The main purpose is to identify the accident hazards and accident event sequences associated with the two emplacement mode options (wireline or drillstring), to outline a methodology for computing accident probabilities and frequencies, and to point to available databases on the nature and frequency of accidents typically associated with standard borehole drilling and nuclear handling operations. Risk mitigation and prevention measures, which have been incorporated into the two emplacement designs (see Cochran and Hardin 2015), are also discussed. A key intent of this report is to provide background information to brief subject matter experts involved in the Emplacement Mode Design Study. [Note: Revision 0 of this report concentrates more on the wireline emplacement mode. It is expected that Revision 1 will contain further development of the preliminary fault and event trees for the drill string emplacement mode.]

  15. 21 CFR 120.8 - Hazard Analysis and Critical Control Point (HACCP) plan.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 2 2010-04-01 2010-04-01 false Hazard Analysis and Critical Control Point (HACCP... SERVICES (CONTINUED) FOOD FOR HUMAN CONSUMPTION HAZARD ANALYSIS AND CRITICAL CONTROL POINT (HACCP) SYSTEMS General Provisions § 120.8 Hazard Analysis and Critical Control Point (HACCP) plan. (a) HACCP plan....

  16. 75 FR 40839 - Agency Information Collection Activities; Proposed Collection; Comment Request; Hazard Analysis...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-07-14

    ... Collection; Comment Request; Hazard Analysis and Critical Control Point Procedures for the Safe and Sanitary... requirements for applying hazard analysis and critical control point (HACCP) procedures for safe and sanitary... appropriate, and other forms of information technology. Hazard Analysis and Critical Control Point...

  17. Quantitative petrostructure analysis. Technical summary report

    SciTech Connect

    Warren, N.

    1980-09-01

    The establishment of quantitative techniques would lead to the development of predictive tools which would be of obvious importance in applied geophysics and engineering. In rock physics, it would help establish laws for averaging the effects of finite densities of real cracks and pores. It would also help in elucidating the relation between observed complex crack structures and various models for the mechanical properties of single cracks. The petrostructure study is addressed to this problem. The purpose of the effort is to quantitatively characterize the mineral and crack texture of granitic rock samples. The rock structures are to be characterized in such a way that the results can be used (1) to constrain the modelling of the effect of cracks on the physical properties of rocks, and (2) to test the possibility of establishing quantitative and predictive relations between petrographic observables and whole rock properties. Statistical techniques are being developed and being applied to the problem of parameterizing complex texture and crack patterns of rock, and of measuring correlation of these parameters to other measurable variables. The study is an application in factor analysis.

  18. Quantitative textural analysis of phenocryst zoning patterns

    NASA Astrophysics Data System (ADS)

    Niespolo, E.; Andrews, B. J.

    2011-12-01

    The textural complexity of phenocrysts has made quantitative analysis of large populations of crystals a challenging task. Because each phenocryst expresses a unique localized event in the volcanic interior, no single crystal necessarily records the complete pre-eruptive history of the magmatic system as a whole. Synthesizing the textural and compositional records of many crystals, however, should provide a more complete understanding of conditions prior to eruption. In this research, we present new techniques for quantitative analysis of individual crystals and across populations of crystals. We apply those techniques to back-scattered electron images of complexly zoned plagioclase from El Chichón volcano, Mexico. Analysis begins with Gaussian filtering to remove noise from the images and create more qualitatively distinct zoning patterns. Because pixel intensity is directly correlated with anorthite content, compositional anisotropy is then calculated throughout each image by determining the distance from a grid point at which the variation in pixel intensity exceeds a pre-determined standard deviation; both regular and adaptive grid spacings are used, and length scales are calculated in 8 directions. The resulting textural maps are analogous to a vector field and quantify 2-dimensional variation in texture. With both types of grid spacing, changes in the magnitude and orientation of textural anisotropy and length scale indicate different crystal zones. The adaptive grid spacing, however, describes non-uniform textural variation more completely and has a higher measurement density in regions of high-frequency variation. In general, textural regions commonly described as clean or smooth show longer length scales and aligned anisotropies, whereas shorter length scales with variable anisotropies identify areas commonly described as patchy, dusty, or rough. The comparison and correlation of textural and compositional zoning help determine how different crystals record the

  19. Multi-hazard risk analysis related to hurricanes

    NASA Astrophysics Data System (ADS)

    Lin, Ning

    Hurricanes present major hazards to the United States. Associated with extreme winds, heavy rainfall, and storm surge, landfalling hurricanes often cause enormous structural damage to coastal regions. Hurricane damage risk assessment provides the basis for loss mitigation and related policy-making. Current hurricane risk models, however, often oversimplify the complex processes of hurricane damage. This dissertation aims to improve existing hurricane risk assessment methodology by coherently modeling the spatial-temporal processes of storm landfall, hazards, and damage. Numerical modeling technologies are used to investigate the multiplicity of hazards associated with landfalling hurricanes. The application and effectiveness of current weather forecasting technologies to predict hurricane hazards is investigated. In particular, the Weather Research and Forecasting model (WRF), with Geophysical Fluid Dynamics Laboratory (GFDL)'s hurricane initialization scheme, is applied to the simulation of the wind and rainfall environment during hurricane landfall. The WRF model is further coupled with the Advanced Circulation (AD-CIRC) model to simulate storm surge in coastal regions. A case study examines the multiple hazards associated with Hurricane Isabel (2003). Also, a risk assessment methodology is developed to estimate the probability distribution of hurricane storm surge heights along the coast, particularly for data-scarce regions, such as New York City. This methodology makes use of relatively simple models, specifically a statistical/deterministic hurricane model and the Sea, Lake and Overland Surges from Hurricanes (SLOSH) model, to simulate large numbers of synthetic surge events, and conducts statistical analysis. The estimation of hurricane landfall probability and hazards are combined with structural vulnerability models to estimate hurricane damage risk. Wind-induced damage mechanisms are extensively studied. An innovative windborne debris risk model is

  20. Seismic Hazard analysis of Adjaria Region in Georgia

    NASA Astrophysics Data System (ADS)

    Jorjiashvili, Nato; Elashvili, Mikheil

    2014-05-01

    The most commonly used approach to determining seismic-design loads for engineering projects is probabilistic seismic-hazard analysis (PSHA). The primary output from a PSHA is a hazard curve showing the variation of a selected ground-motion parameter, such as peak ground acceleration (PGA) or spectral acceleration (SA), against the annual frequency of exceedance (or its reciprocal, return period). The design value is the ground-motion level that corresponds to a preselected design return period. For many engineering projects, such as standard buildings and typical bridges, the seismic loading is taken from the appropriate seismic-design code, the basis of which is usually a PSHA. For more important engineering projects, where the consequences of failure are more serious (such as dams and chemical plants), it is more usual to obtain the seismic-design loads from a site-specific PSHA, in general using much longer return periods than those governing code-based design. Calculation of the probabilistic seismic hazard was performed using the software CRISIS2007 by Ordaz, M., Aguilar, A., and Arboleda, J., Instituto de Ingeniería, UNAM, Mexico. CRISIS implements a classical probabilistic seismic hazard methodology where seismic sources can be modelled as points, lines and areas. In the case of area sources, the software offers an integration procedure that takes advantage of a triangulation algorithm used for seismic source discretization. This solution improves calculation efficiency while maintaining a reliable description of source geometry and seismicity. Additionally, supplementary filters (e.g. fixing a site-source distance that excludes from the calculation sources at great distance) allow the program to balance precision and efficiency during hazard calculation. Earthquake temporal occurrence is assumed to follow a Poisson process, and the code facilitates two types of MFDs: a truncated exponential Gutenberg-Richter [1944] magnitude distribution and a characteristic magnitude
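
    The link between a preselected design return period and the probability of exceedance follows directly from the Poisson occurrence model assumed here. A small sketch; the 10%-in-50-years and 2%-in-50-years conventions recover the familiar 475- and 2475-year return periods:

    ```python
    import math

    def prob_of_exceedance(annual_rate, years=50.0):
        """Poisson probability of at least one exceedance in the window."""
        return 1.0 - math.exp(-annual_rate * years)

    def return_period(poe_target, years=50.0):
        """Return period whose annual rate yields the target POE."""
        return years / -math.log(1.0 - poe_target)

    print(f"{return_period(0.10):.0f} yr")   # ~475 yr
    print(f"{return_period(0.02):.0f} yr")   # ~2475 yr
    ```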

  1. Quantitative NIR Raman analysis in liquid mixtures.

    PubMed

    Sato-Berrú, R Ysacc; Medina-Valtierra, Jorge; Medina-Gutiérrez, Cirilo; Frausto-Reyes, Claudio

    2004-08-01

    The capability to obtain quantitative information in a simple way from Raman spectra is a subject of considerable interest. In this work, this is demonstrated for mixtures of ethanol with water and of rhodamine-6G (R-6G) with methanol, which were analyzed directly in glass vessels. The Raman intensities and a simple mathematical model have been used and applied for the analysis of liquid samples. The starting point is to generate a general expression, from the experimental spectra, as the sum of the particular expressions for each pure compound; this yields an expression for the mixture that can be used to determine concentrations from its Raman spectrum.
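
    The "sum of the particular expressions for each pure compound" amounts to a linear mixture model, which can be inverted by least squares. A minimal sketch under the assumption, as in the paper's model, that Raman intensity scales linearly with concentration:

    ```python
    import numpy as np

    def component_fractions(mixture, pure_spectra):
        """Least-squares estimate of component fractions.

        mixture      : (n_wavenumbers,) measured mixture intensities
        pure_spectra : (n_wavenumbers, n_components) pure-compound spectra
        """
        coef, *_ = np.linalg.lstsq(pure_spectra, mixture, rcond=None)
        return coef / coef.sum()   # normalize coefficients to fractions
    ```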

  2. Quantitative analysis of non-Hodgkin's lymphoma.

    PubMed Central

    Abbott, C R; Blewitt, R W; Bird, C C

    1982-01-01

    A preliminary attempt has been made to characterise a small series of non-Hodgkin's lymphomas (NHL) by morphometric means using the Quantimet 720 Kontron MOP/AMO3 image analysis systems. In most cases it was found that the distribution of nuclear area and correlation between mean nuclear area and frequency per unit field, corresponded closely with tumour classification determined by light microscopy. These results suggest that it may be possible to devise an objective and reproducible grading system for NHL using quantitative morphometric techniques. PMID:7040479

  3. A Gis Model Application Supporting The Analysis of The Seismic Hazard For The Urban Area of Catania (italy)

    NASA Astrophysics Data System (ADS)

    Grasso, S.; Maugeri, M.

    rigorous complex methods of analysis or qualitative procedures. A semi-quantitative procedure based on the definition of a geotechnical hazard index has been applied for the zonation of the seismic geotechnical hazard of the city of Catania. In particular, this procedure has been applied to define the influence of the geotechnical properties of the soil in a central area of the city of Catania, where some historical buildings of great importance are sited. An investigation was also performed based on the inspection of more than one hundred historically important ecclesiastical buildings located in the city. Then, in order to identify the amplification effects due to site conditions, a geotechnical survey form was prepared to allow a semi-quantitative evaluation of the seismic geotechnical hazard for all these historical buildings. In addition, to evaluate the foundation soil time-history response, a 1-D dynamic soil model was employed for all these buildings, considering the non-linearity of soil behaviour. Using a GIS, a map of the seismic geotechnical hazard, a map of the liquefaction hazard and a preliminary map of the seismic hazard for the city of Catania have been obtained. From the analysis of the obtained results it may be noticed that the high hazard zones are mainly clayey sites

  4. Trichloroethylene and Cancer: Systematic and Quantitative Review of Epidemiologic Evidence for Identifying Hazards

    PubMed Central

    Scott, Cheryl Siegel; Jinot, Jennifer

    2011-01-01

    We conducted a meta-analysis focusing on studies with high potential for trichloroethylene (TCE) exposure to provide quantitative evaluations of the evidence for associations between TCE exposure and kidney, liver, and non-Hodgkin lymphoma (NHL) cancers. A systematic review documenting essential design features, exposure assessment approaches, statistical analyses, and potential sources of confounding and bias identified twenty-four cohort and case-control studies on TCE and the three cancers of interest with high potential for exposure, including five recently published case-control studies of kidney cancer or NHL. Fixed- and random-effects models were fitted to the data on overall exposure and on the highest exposure group. Sensitivity analyses examined the influence of individual studies and of alternative risk estimate selections. For overall TCE exposure and kidney cancer, the summary relative risk (RRm) estimate from the random effects model was 1.27 (95% CI: 1.13, 1.43), with a higher RRm for the highest exposure groups (1.58, 95% CI: 1.28, 1.96). The RRm estimates were not overly sensitive to alternative risk estimate selections or to removal of an individual study. There was no apparent heterogeneity or publication bias. For NHL, RRm estimates for overall exposure and for the highest exposure group, respectively, were 1.23 (95% CI: 1.07, 1.42) and 1.43 (95% CI: 1.13, 1.82) and, for liver cancer, 1.29 (95% CI: 1.07, 1.56) and 1.28 (95% CI: 0.93, 1.77). Our findings provide strong support for a causal association between TCE exposure and kidney cancer. The support is strong but less robust for NHL, where issues of study heterogeneity, potential publication bias, and weaker exposure-response results contribute uncertainty, and more limited for liver cancer, where only cohort studies with small numbers of cases were available. PMID:22163205
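
    The random-effects summaries quoted (e.g. RRm = 1.27, 95% CI: 1.13, 1.43) come from standard meta-analytic machinery. A generic DerSimonian-Laird sketch with hypothetical per-study inputs, not the authors' code:

    ```python
    import numpy as np

    def random_effects_rr(rr, ci_lo, ci_hi):
        """DerSimonian-Laird summary RR from per-study RRs and 95% CIs."""
        y  = np.log(rr)
        se = (np.log(ci_hi) - np.log(ci_lo)) / (2 * 1.96)
        w  = 1.0 / se**2                                  # fixed-effect weights
        q  = np.sum(w * (y - np.sum(w * y) / w.sum())**2) # Cochran's Q
        tau2 = max(0.0, (q - (len(y) - 1)) /
                   (w.sum() - np.sum(w**2) / w.sum()))    # between-study variance
        w_re = 1.0 / (se**2 + tau2)
        m    = np.sum(w_re * y) / w_re.sum()
        half = 1.96 / np.sqrt(w_re.sum())
        return np.exp(m), (np.exp(m - half), np.exp(m + half))

    rrm, ci = random_effects_rr(np.array([1.10, 1.40, 1.30]),
                                np.array([0.90, 1.00, 1.05]),
                                np.array([1.35, 1.95, 1.60]))
    print(f"RRm = {rrm:.2f}, 95% CI = ({ci[0]:.2f}, {ci[1]:.2f})")
    ```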

  5. Seismic hazards in Thailand: a compilation and updated probabilistic analysis

    NASA Astrophysics Data System (ADS)

    Pailoplee, Santi; Charusiri, Punya

    2016-06-01

    A probabilistic seismic hazard analysis (PSHA) for Thailand was performed and compared to those of previous works. This PSHA was based upon (1) the most up-to-date paleoseismological data (slip rates), (2) the seismic source zones, (3) the seismicity parameters (a and b values), and (4) the strong ground-motion attenuation models suggested as being suitable for Thailand. For the PSHA mapping, both the ground shaking and the probability of exceedance (POE) were analyzed and mapped using various methods of presentation. In addition, site-specific PSHAs were demonstrated for ten major provinces within Thailand. For instance, a 2% and 10% POE in the next 50 years of 0.1-0.4 g and 0.1-0.2 g ground shaking, respectively, was found for western Thailand, defining this area as the most earthquake-prone region evaluated in Thailand. Among the ten selected provinces, the Kanchanaburi and Tak provinces had comparatively high seismic hazards, and therefore effective mitigation plans for these areas should be made. Although Bangkok was assigned a low seismic hazard in this PSHA, a further study of seismic wave amplification due to the soft soil beneath Bangkok is required.

  6. Safety analysis of contained low-hazard biotechnology applications.

    PubMed

    Pettauer, D; Käppeli, O; van den Eede, G

    1998-06-01

    A technical safety analysis has been performed on a containment-level-2 pilot plant in order to assess an upgrading of the existing facility, which should comply with good manufacturing practices. The results were obtained by employing the hazard and operability (HAZOP) assessment method and are discussed in the light of the appropriateness of this procedural tool for low-hazard biotechnology applications. The potential release of micro-organisms accounts for only a minor part of the hazardous consequences. However, in certain cases the release of a large or moderate amount of micro-organisms would not be immediately identified. Most of the actions required to avoid these consequences fall into the realm of operational procedures. Since a major share of potential failures results from human error, standard operating procedures play a prominent role when establishing the concept of safety management. The HAZOP assessment method was found to be adequate for the type of process under investigation. The results may also be used for the generation of checklists which, in most cases, are sufficient for routine safety assurance.

  7. Quantitative analysis of retinal changes in hypertension

    NASA Astrophysics Data System (ADS)

    Giansanti, Roberto; Boemi, Massimo; Fumelli, Paolo; Passerini, Giorgio; Zingaretti, Primo

    1995-05-01

    Arterial hypertension is a high-prevalence disease in Western countries and is associated with increased risk of cardiovascular accidents. Retinal vessel changes are common findings in patients suffering from long-standing hypertensive disease. Morphological evaluation of the fundus oculi represents a fundamental tool for the clinical approach to the patient with hypertension. A qualitative analysis of the retinal lesions is usually performed, and this implies severe limitations both in the classification of the different degrees of the pathology and in the follow-up of the disease. A diagnostic system based on a quantitative analysis of the retinal changes could overcome these problems, and our computerized approach was developed for this purpose. The paper concentrates on the results and the implications of a computerized approach to the automatic extraction of numerical indexes describing morphological details of the fundus oculi. A previously developed image processing and recognition system, documented elsewhere and briefly described here, was successfully tested in pre-clinical experiments and applied to the evaluation of normal as well as pathological fundi. The software system was developed to extract indexes such as the caliber and path of vessels, the local tortuosity of arteries and arterioles, and the positions and angles of crossings between two vessels. The reliability of the results, justified by their low variability, makes feasible the standardization of quantitative parameters to be used both in the diagnosis and in the prognosis of hypertension, and also allows prospective studies based upon them.

  8. Quantitative architectural analysis of bronchial intraepithelial neoplasia

    NASA Astrophysics Data System (ADS)

    Guillaud, Martial; MacAulay, Calum E.; Le Riche, Jean C.; Dawe, Chris; Korbelik, Jagoda; Lam, Stephen

    2000-04-01

    Considerable variation exists among pathologists in the interpretation of intraepithelial neoplasia, making it difficult to determine the natural history of these lesions and to establish management guidelines for chemoprevention. The aim of the study is to evaluate architectural features of pre-neoplastic progression in lung cancer, and to search for a correlation between an architectural index and conventional pathology. Quantitative architectural analysis was performed on a series of normal lung biopsies and carcinoma in situ (CIS). The centers of gravity of the nuclei within a pre-defined region of interest were used as seeds to generate a Voronoi diagram. About 30 features derived from the Voronoi diagram, its dual the Delaunay tessellation, and the minimum spanning tree were extracted. A discriminant analysis was performed to separate the two groups. The architectural index was calculated for each of the bronchial biopsies that were interpreted as hyperplasia, metaplasia, or mild, moderate or severe dysplasia by conventional histopathology criteria. As a group, lesions classified as CIS by conventional histopathology criteria could be distinguished from dysplasia using the architectural index. Metaplasia was distinct from hyperplasia, and hyperplasia from normal. There was overlap between severe and moderate dysplasia, but mild dysplasia could be distinguished from moderate dysplasia. Bronchial intraepithelial neoplastic lesions can thus be graded objectively by architectural features. The combination of architectural features and nuclear morphometric features may improve the quantitation of the changes occurring during the intra-epithelial neoplastic process.
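
    The graph-based feature extraction (Voronoi diagram, Delaunay tessellation, minimum spanning tree over nuclear centroids) can be sketched with SciPy. The two features returned below are examples of the roughly thirty descriptors the study mentions; function and variable names are ours:

    ```python
    import numpy as np
    from scipy.spatial import Delaunay, distance_matrix
    from scipy.sparse.csgraph import minimum_spanning_tree

    def architecture_features(centroids):
        """Mean Delaunay edge length and mean MST edge length from an
        (n, 2) array of nuclear centroid coordinates."""
        tri = Delaunay(centroids)
        edges = set()
        for simplex in tri.simplices:
            for i in range(3):
                edges.add(tuple(sorted((simplex[i], simplex[(i + 1) % 3]))))
        delaunay_mean = np.mean([np.linalg.norm(centroids[i] - centroids[j])
                                 for i, j in edges])
        mst = minimum_spanning_tree(distance_matrix(centroids, centroids))
        return delaunay_mean, mst.data.mean()
    ```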

  9. Quantitative interactome analysis reveals a chemoresistant edgotype

    PubMed Central

    Chavez, Juan D.; Schweppe, Devin K.; Eng, Jimmy K.; Zheng, Chunxiang; Taipale, Alex; Zhang, Yiyi; Takara, Kohji; Bruce, James E.

    2015-01-01

    Chemoresistance is a common mode of therapy failure for many cancers. Tumours develop resistance to chemotherapeutics through a variety of mechanisms, with proteins serving pivotal roles. Changes in protein conformations and interactions affect the cellular response to environmental conditions contributing to the development of new phenotypes. The ability to understand how protein interaction networks adapt to yield new function or alter phenotype is limited by the inability to determine structural and protein interaction changes on a proteomic scale. Here, chemical crosslinking and mass spectrometry were employed to quantify changes in protein structures and interactions in multidrug-resistant human carcinoma cells. Quantitative analysis of the largest crosslinking-derived protein interaction network, comprising 1,391 crosslinked peptides, allows for 'edgotype' analysis in a cell model of chemoresistance. We detect consistent changes to protein interactions and structures, including those involving cytokeratins, topoisomerase-2-alpha, and post-translationally modified histones, which correlate with a chemoresistant phenotype. PMID:26235782

  10. The method of quantitative automatic metallographic analysis

    NASA Astrophysics Data System (ADS)

    Martyushev, N. V.; Skeeba, V. Yu

    2017-01-01

    A brief analysis of existing software for the computer processing of microstructure photographs is presented, followed by a description of the software package developed by the authors. This software product is intended for quantitative metallographic analysis of digital photographs of the microstructure of materials. It allows calculating the volume fraction and the average size of particles of the structure from several hundred secants (depending on the photograph's resolution) in one field of view. Besides, a special module is built into the software for assessing the degree of deviation of the shape of different particles and impurities from spherical. The article presents the main algorithms used during the creation of the software product, and the formulae according to which the software calculates the parameters of the microstructure. It is shown that the reliability of the calculations depends on the quality of preparation of the microstructure.
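
    The secant-based calculation can be approximated as line-intercept stereology on a thresholded image: the point fraction estimates the volume fraction, and the mean intercept length along row secants estimates the average particle size. A sketch under those assumptions; names are ours:

    ```python
    import numpy as np

    def run_lengths(row):
        """Lengths of consecutive True runs in a 1-D boolean array."""
        padded = np.r_[False, row, False]
        edges = np.flatnonzero(padded[1:] != padded[:-1])
        return edges[1::2] - edges[0::2]

    def volume_fraction_and_size(binary, um_per_px=1.0):
        """binary: 2-D boolean image, True = particle phase. Returns the
        point-fraction estimate of volume fraction and the mean intercept
        length (in micrometres) over all row secants."""
        intercepts = np.concatenate([run_lengths(r) for r in binary])
        return binary.mean(), intercepts.mean() * um_per_px
    ```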

  11. Probabilistic Seismic Hazard Analysis: Adaptation for CO2 Sequestration Sites

    NASA Astrophysics Data System (ADS)

    Vasudevan, K.; Eaton, D. W.

    2011-12-01

    Large-scale sequestration of CO2 in depleted oil and gas fields in sedimentary basins such as the Western Canada Sedimentary Basin (WCSB) and in particular, central Alberta, should consider, among other safety and risk issues, a seismic hazard analysis that would include potential ground motions induced by earthquakes. The region is juxtaposed to major tectonically active seismogenic zones such as the Cascadia Subduction Zone, the Queen Charlotte Fault Zone, and the northern Cordillera region. Hazards associated with large-scale storage from strong ground motions caused by large-magnitude earthquakes along the west coast of Canada, and/or medium-to-large magnitude earthquakes triggered by such earthquakes in the neighbourhood of the storage site, must be clearly understood. To this end, stochastic modeling of the accelerograms recorded during large magnitude earthquakes in western Canada has been undertaken. A lack of recorded accelerograms and the absence of a catalogue of ground-motion prediction equations similar to the Next Generation Attenuation (NGA) database, however, hamper such analysis for the WCSB. In order to generate our own database of ground-motions for probabilistic seismic hazard analysis, we employ a site-based stochastic simulation approach. We use it to simulate three-component ground-motion accelerograms recorded during the November 3, 2002 Denali earthquake to mimic the Queen Charlotte Fault earthquakes. To represent a Cascadia megathrust earthquake, we consider three-component strong-motion accelerograms recorded during the March 11, 2011 Tohoku earthquake in Japan. Finally, to simulate an event comparable to the thrust-style Kinbasket Lake earthquake of 1908, we use three-component ground-motion accelerograms recorded during the 1985 Nahanni earthquake and the 2004 Chuetsu earthquake. Here, we develop predictive equations for the stochastic model parameters that describe ground motions in terms of earthquake and site characteristics such as

  12. Fire hazards analysis for W030 tank farm ventilation upgrade

    SciTech Connect

    Huckfeldt, R.A.

    1996-07-17

    This Fire Hazard Analysis (FHA) was prepared according to the requirements of U.S. Department of Energy (DOE) Order 5480.7A, FIRE PROTECTION, 2-17-93. The purpose of this FHA is to ascertain whether the objectives of DOE 5480.7A are being met. This purpose is accomplished through a conservative, comprehensive assessment of the risk from fire and other perils within individual fire areas of a DOE facility in relation to proposed fire protection. This FHA is based on conditions set forth within this document and is valid only under these conditions.

  13. Quantitative laryngeal electromyography: turns and amplitude analysis.

    PubMed

    Statham, Melissa McCarty; Rosen, Clark A; Nandedkar, Sanjeev D; Munin, Michael C

    2010-10-01

    Laryngeal electromyography (LEMG) is primarily a qualitative examination, with no standardized approach to interpretation. The objectives of our study were to establish quantitative norms for motor unit recruitment in controls and to compare these with interference pattern analysis in patients with unilateral vocal fold paralysis (VFP). In a retrospective case-control study, we performed LEMG of the thyroarytenoid-lateral cricoarytenoid muscle complex (TA-LCA) in 21 controls and 16 patients with unilateral VFP. Our standardized protocol used a concentric needle electrode with subjects performing variable-force TA-LCA contraction. To quantify the interference pattern density, we measured turns and mean amplitude per turn for ≥10 epochs (each 500 milliseconds). Logarithmic regression analysis between amplitude and turns was used to calculate slope and intercept. Standard deviation was calculated to further define the confidence interval, enabling generation of a linear-scale graphical "cloud" of activity containing ≥90% of data points for controls and patients. Median age of controls and patients was similar (50.7 vs. 48.5 years). In controls, TA-LCA amplitude with variable contraction ranged from 145-1112 μV, and regression analysis comparing mean amplitude per turn to root-mean-square amplitude demonstrated high correlation (R = 0.82). In controls performing variable contraction, median turns per second was significantly higher compared to patients (450 vs. 290, P = .002). We first present interference pattern analysis of the TA-LCA in healthy adults and patients with unilateral VFP. Our findings indicate that motor unit recruitment can be quantitatively measured within the TA-LCA. Additionally, patients with unilateral VFP had significantly reduced turns when compared with controls.
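
    The turns-and-amplitude measurement lends itself to a compact illustration. The sketch below counts Willison-style turns (direction reversals of at least a threshold amplitude, conventionally 100 uV) and the mean amplitude per turn over consecutive 500-ms epochs; the synthetic signal and all names are stand-ins, not the study's protocol.

        import numpy as np

        def count_turns(x, threshold=100.0):
            """Amplitudes between successive turns (Willison-style counting)."""
            amps = []
            last = x[0]                      # value at the previous confirmed turn
            ext = x[0]                       # running extremum since that turn
            rising = x[1] > x[0] if len(x) > 1 else True
            for v in x[1:]:
                if rising:
                    if v > ext:
                        ext = v
                    elif ext - v >= threshold:   # reversal confirms a peak turn
                        amps.append(abs(ext - last))
                        last, ext, rising = ext, v, False
                else:
                    if v < ext:
                        ext = v
                    elif v - ext >= threshold:   # reversal confirms a trough turn
                        amps.append(abs(ext - last))
                        last, ext, rising = ext, v, True
            return amps

        def interference_pattern(emg_uv, fs_hz, epoch_s=0.5, threshold_uv=100.0):
            """(turns per second, mean amplitude per turn) for consecutive epochs."""
            n = int(epoch_s * fs_hz)
            out = []
            for i in range(0, len(emg_uv) - n + 1, n):
                amps = count_turns(emg_uv[i:i + n], threshold_uv)
                if amps:
                    out.append((len(amps) / epoch_s, float(np.mean(amps))))
            return out

        rng = np.random.default_rng(1)
        sig = np.cumsum(rng.standard_normal(50_000)) * 30.0   # stand-in for an EMG trace
        for rate, amp in interference_pattern(sig, fs_hz=10_000)[:3]:
            print(f"{rate:.0f} turns/s, mean {amp:.0f} uV per turn")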

  14. A LiDAR based analysis of hydraulic hazard mapping

    NASA Astrophysics Data System (ADS)

    Cazorzi, F.; De Luca, A.; Checchinato, A.; Segna, F.; Dalla Fontana, G.

    2012-04-01

    Mapping hydraulic hazard is a delicate procedure, as it involves both technical and socio-economic aspects. On the one hand, no dangerous area should be excluded; on the other, it is important not to extend the area subject to use limitations beyond what is necessary. The availability of a high-resolution topographic survey nowadays makes it possible to approach this task with innovative procedures, both in the planning (mapping) phase and in the map validation phase. The latter is the object of the present work. It should be stressed that the described procedure is proposed purely as a preliminary analysis based on topography only, and is therefore not intended in any way to replace more sophisticated analysis methods based on hydraulic modelling. The reference elevation model is a combination of the digital terrain model and the digital building model (DTM+DBM). The option of using the standard surface model (DSM) is not viable, as the DSM represents the vegetation canopy as a solid volume. This has the consequence of unrealistically treating the vegetation as a geometric obstacle to water flow. In some cases the construction of the topographic model requires the identification and digitization of the principal breaklines, such as river banks, ditches and similar natural or artificial structures. The geometrical and topological procedure for the validation of the hydraulic hazard maps consists of two steps. In the first step the whole area is subdivided into fluvial segments, with length chosen as a reasonable trade-off between the need to keep the hydrographical unit as complete as possible and the need to separate sections of the river bed with significantly different morphology. Each of these segments is made of a single elongated polygon, whose shape can be quite complex, especially for meandering river sections, where the flow direction (i.e. the potential energy gradient associated with the talweg) is often inverted. In the second step the segments are analysed
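
    The DTM+DBM combination can be illustrated in a few lines: building roof elevations override the bare-earth terrain where buildings exist, while vegetation, present only in the DSM, is never introduced. A minimal sketch with invented grids:

        import numpy as np

        dtm = np.random.rand(100, 100) * 5 + 200   # bare-earth elevations (m)
        dbm = np.full_like(dtm, np.nan)            # building roof elevations, NaN = none
        dbm[40:50, 40:60] = 208.0                  # one illustrative building block

        # Where a building exists, its roof elevation overrides the terrain.
        reference = np.where(np.isnan(dbm), dtm, np.maximum(dtm, dbm))
        print(reference[45, 50], dtm[45, 50])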

  15. Quantitative morphometric analysis for the tectonic characterisation of northern Tunisia.

    NASA Astrophysics Data System (ADS)

    Camafort, Miquel; Pérez-Peña, José Vicente; Booth-Rea, Guillermo; Ranero, César R.; Gràcia, Eulàlia; Azañón, José Miguel; Melki, Fetheddine; Ouadday, Mohamed

    2016-04-01

    Northern Tunisia is characterized by low deformation rates and low to moderate seismicity. Although instrumental seismicity reaches maximum magnitudes of Mw 5.5, some historical earthquakes have occurred with catastrophic consequences in this region. Aiming to improve our knowledge of active tectonics in Tunisia, we carried out both a quantitative morphometric analysis and a field study in the north-western region. We applied different morphometric tools, such as river profiles, knickpoint analysis, hypsometric curves and integrals, and drainage pattern anomalies, in order to differentiate between zones of high and low recent tectonic activity. This analysis helps identify uplift and subsidence zones, which we relate to fault activity. Several active faults with a sparse distribution were identified. A selected sector was studied in a field campaign to test the results obtained from the quantitative analysis. During the fieldwork we identified geological evidence of recent activity and considerable seismogenic potential along the El Alia-Teboursouk (ETF) and Dkhila (DF) faults. The ETF could be responsible for one of the most devastating historical earthquakes in northern Tunisia, which destroyed Utique in 412 A.D. Geological evidence includes fluvial terraces folded by faults, striated and cracked pebbles, clastic dikes, sand volcanoes, coseismic cracks, etc. Although not reflected in the instrumental seismicity, our results support an important seismic hazard, evidenced by the several active tectonic structures identified and the two seismogenic faults described. After obtaining the current active tectonic framework of Tunisia, we discuss our results within the western Mediterranean, trying to contribute to the understanding of the western Mediterranean tectonic context. With our results, we suggest that the main reason explaining the sparse and scarce seismicity of the area, in contrast with the adjacent parts of the Nubia-Eurasia boundary, is due to its extended
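
    One of the indices named above, the hypsometric integral, is easy to make concrete. The sketch below uses the elevation-relief ratio approximation on a stand-in DEM; the interpretation note in the comments is a common rule of thumb, not this study's calibration.

        import numpy as np

        def hypsometric_integral(dem):
            """Elevation-relief ratio approximation of the hypsometric integral."""
            z = np.asarray(dem, dtype=float).ravel()
            z = z[np.isfinite(z)]
            zmin, zmax = z.min(), z.max()
            return (z.mean() - zmin) / (zmax - zmin)

        dem = np.random.rand(200, 200) * 800 + 150   # stand-in for a catchment DEM
        hi = hypsometric_integral(dem)
        # High HI (roughly > 0.5) is often read as a youthful, recently uplifted
        # surface; low HI as a mature, eroded one.
        print(f"HI = {hi:.2f}")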

  16. SSHAC Level 1 Probabilistic Seismic Hazard Analysis for the Idaho National Laboratory

    SciTech Connect

    Payne, Suzette; Coppersmith, Ryan; Coppersmith, Kevin; Rodriguez-Marek, Adrian; Falero, Valentina Montaldo; Youngs, Robert

    2016-09-01

    A Probabilistic Seismic Hazard Analysis (PSHA) was completed for the Materials and Fuels Complex (MFC), Naval Reactors Facility (NRF), and the Advanced Test Reactor (ATR) at Idaho National Laboratory (INL) (Figure 1-1). The PSHA followed the approaches and procedures appropriate for a Study Level 1 provided in the guidance advanced by the Senior Seismic Hazard Analysis Committee (SSHAC) in U.S. Nuclear Regulatory Commission (NRC) NUREG/CR-6372 and NUREG-2117 (NRC, 1997; 2012a). The SSHAC Level 1 PSHAs for MFC and ATR were conducted as part of the Seismic Risk Assessment (SRA) project (INL Project number 31287) to develop and to apply, respectively, a new risk-informed methodology. The SSHAC Level 1 PSHA was conducted for NRF to provide guidance on the potential use of a design margin above rock hazard levels. The SRA project is developing a new risk-informed methodology that will provide a systematic approach for evaluating the need for an update of an existing PSHA. The new methodology proposes criteria to be employed at specific analysis, decision, or comparison points in its evaluation process. The first four of seven criteria address changes in inputs and results of the PSHA and are given in U.S. Department of Energy (DOE) Standard DOE-STD-1020-2012 (DOE, 2012a) and American National Standards Institute/American Nuclear Society (ANSI/ANS) 2.29 (ANS, 2008a). The last three criteria address evaluation of quantitative hazard and risk-focused information for an existing nuclear facility. The seven criteria and decision points are applied to Seismic Design Categories (SDC) 3, 4, and 5, which are defined in American Society of Civil Engineers/Structural Engineers Institute (ASCE/SEI) 43-05 (ASCE, 2005). The application of the criteria and decision points could lead to an update or could determine that such an update is not necessary.

  17. Automated quantitative image analysis of nanoparticle assembly

    NASA Astrophysics Data System (ADS)

    Murthy, Chaitanya R.; Gao, Bo; Tao, Andrea R.; Arya, Gaurav

    2015-05-01

    The ability to characterize higher-order structures formed by nanoparticle (NP) assembly is critical for predicting and engineering the properties of advanced nanocomposite materials. Here we develop a quantitative image analysis software to characterize key structural properties of NP clusters from experimental images of nanocomposites. This analysis can be carried out on images captured at intermittent times during assembly to monitor the time evolution of NP clusters in a highly automated manner. The software outputs averages and distributions in the size, radius of gyration, fractal dimension, backbone length, end-to-end distance, anisotropic ratio, and aspect ratio of NP clusters as a function of time along with bootstrapped error bounds for all calculated properties. The polydispersity in the NP building blocks and biases in the sampling of NP clusters are accounted for through the use of probabilistic weights. This software, named Particle Image Characterization Tool (PICT), has been made publicly available and could be an invaluable resource for researchers studying NP assembly. To demonstrate its practical utility, we used PICT to analyze scanning electron microscopy images taken during the assembly of surface-functionalized metal NPs of differing shapes and sizes within a polymer matrix. PICT is used to characterize and analyze the morphology of NP clusters, providing quantitative information that can be used to elucidate the physical mechanisms governing NP assembly.
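
    Two of the descriptors PICT reports can be illustrated independently of the published tool. The sketch below computes the radius of gyration and an eigenvalue-based aspect ratio for one labeled cluster in a synthetic binary image; it shows the general technique only, not PICT's code.

        import numpy as np
        from scipy import ndimage

        img = np.zeros((200, 200), dtype=bool)
        img[80:120, 60:150] = True                    # stand-in for one NP cluster

        labels, n = ndimage.label(img)
        ys, xs = np.nonzero(labels == 1)
        coords = np.column_stack([ys, xs]).astype(float)
        centered = coords - coords.mean(axis=0)

        rg = np.sqrt((centered ** 2).sum(axis=1).mean())   # radius of gyration (px)
        evals = np.linalg.eigvalsh(np.cov(centered.T))     # principal-axis variances
        aspect = np.sqrt(evals.max() / evals.min())        # aspect ratio of the cluster
        print(f"Rg = {rg:.1f} px, aspect ratio = {aspect:.2f}")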

  18. Fire hazard analysis for Plutonium Finishing Plant complex

    SciTech Connect

    MCKINNIS, D.L.

    1999-02-23

    A fire hazards analysis (FHA) was performed for the Plutonium Finishing Plant (PFP) Complex at the Department of Energy (DOE) Hanford site. The scope of the FHA focuses on the nuclear facilities/structures in the Complex. The analysis was conducted in accordance with RLID 5480.7 [DOE Directive RLID 5480.7, 1/17/94] and DOE Order 5480.7A, "Fire Protection" [DOE Order 5480.7A, 2/17/93], and addresses each of the sixteen principal elements outlined in paragraph 9.a(3) of the Order. The elements are addressed in terms of the fire protection objectives stated in paragraph 4 of DOE 5480.7A. In addition, the FHA also complies with WHC-CM-4-41, Fire Protection Program Manual, Section 3.4 [1994] and WHC-SD-GN-FHA-30001, Rev. 0 [WHC, 1994]. Objectives of the FHA are to determine: (1) the fire hazards that expose the PFP facilities, or that are inherent in the building operations, (2) the adequacy of the fire safety features currently located in the PFP Complex, and (3) the degree of compliance of the facility with specific fire safety provisions in DOE orders, related engineering codes, and standards.

  19. Materials characterization through quantitative digital image analysis

    SciTech Connect

    J. Philliber; B. Antoun; B. Somerday; N. Yang

    2000-07-01

    A digital image analysis system has been developed to allow advanced quantitative measurement of microstructural features. This capability is maintained as part of the microscopy facility at Sandia, Livermore. The system records images digitally, eliminating the use of film. Images obtained from other sources may also be imported into the system. Subsequent digital image processing enhances image appearance through contrast and brightness adjustments. The system measures a variety of user-defined microstructural features, including area fraction, particle size and spatial distributions, grain sizes, and orientations of elongated particles. These measurements are made in a semi-automatic mode through the use of macro programs and a computer-controlled translation stage. A routine has been developed to create large montages of 50+ separate images. Individual image frames are matched to the nearest pixel to create seamless montages. Results from three different studies are presented to illustrate the capabilities of the system.

  20. Near Real Time Quantitative Gas Analysis Techniques

    NASA Astrophysics Data System (ADS)

    Herget, William F.; Tromp, Marianne L.; Anderson, Charles R.

    1985-12-01

    A Fourier transform infrared (FT-IR)-based system has been developed and is undergoing evaluation for near real time multicomponent quantitative analysis of undiluted gaseous automotive exhaust emissions. The total system includes: (1) a gas conditioning system (GCS) for tracer gas injection, gas mixing, and temperature stabilization; and (2) an exhaust gas analyzer (EGA) consisting of a sample cell, an FT-IR system, and a computerized data processing system. Tests have shown that the system can monitor about 20 individual species (at concentrations down to the 1-20 ppm range) with a time resolution of one second. Tests have been conducted on a chassis dynamometer system utilizing different autos, different fuels, and different driving cycles. Results were compared with those obtained using a standard constant volume sampling (CVS) system.
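
    Multicomponent quantitation from FT-IR spectra is commonly done by classical least squares under the Beer-Lambert assumption that absorbances add linearly. The sketch below recovers three concentrations from a synthetic spectrum built from Gaussian stand-in reference bands; it illustrates the general approach, not the specific EGA processing described above.

        import numpy as np

        wn = np.linspace(1800, 2400, 600)             # wavenumber axis (cm^-1)

        def band(center, width=25.0):
            return np.exp(-0.5 * ((wn - center) / width) ** 2)

        refs = np.column_stack([band(1900), band(2100), band(2350)])  # 3 species, 1 ppm each
        true_ppm = np.array([12.0, 5.0, 18.0])
        measured = refs @ true_ppm + np.random.default_rng(0).normal(0, 0.01, wn.size)

        # Least-squares fit of the mixture model recovers the concentrations.
        conc, *_ = np.linalg.lstsq(refs, measured, rcond=None)
        print(np.round(conc, 1))                      # approximately [12.  5. 18.]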

  1. Resources Available for Hazards Analysis of Aerospace Fluids

    NASA Technical Reports Server (NTRS)

    Woods, S. S.; Stewart, W. F.; Baker, D. L.

    2001-01-01

    In recent years, the legislative and executive branches of the federal government have pushed to make government more efficient and responsive to the needs of the marketplace. One of these initiatives, Public Law 104-113, also known as the National Technology Transfer and Advancement Act of 1995 (NTTAA), is designed to accelerate technology transfer to industry and promote government-industry partnership. Summarized, NTTAA states that '... all Federal agencies and departments shall use technical standards that are developed or adopted by voluntary consensus standards bodies, using such technical standards as a means to carry out policy objectives or activities determined by the agencies and departments.' Government agencies must now determine whether their in-house requirement-setting activities are sufficiently unique that no public interest is served by having them adopted by a voluntary consensus organization (VCO), or, if not, whether to use or develop voluntary consensus standards. The Office of Management and Budget (OMB) is chartered by the law to monitor federal agency progress and report the results to Congress. In response to NTTAA, agency-wide oxygen and hydrogen safety standards sponsored by the NASA Headquarters (HQ) Office of Safety and Mission Assurance (OSMA) were obvious choices for early adoption by VCOs. In 1996, HQ sought assistance from the Johnson Space Center (JSC) White Sands Test Facility (WSTF), the technical lead for development of these safety standards, to evaluate their adoption by VCOs. At that time, WSTF-developed propellant hazards manuals were likewise identified for possible VCO adoption. Subsequently, WSTF was asked to represent NASA in the development of an international ISO safety standard for hydrogen use. Concurrent with these WSTF standards activities are related efforts to develop and publish propellant hazards analysis protocols and safety courses for the industrial, propellant use of oxygen, hydrogen, and hypergols. This paper reports on

  2. Student evaluations of teaching: teaching quantitative courses can be hazardous to one’s career

    PubMed Central

    Uttl, Bob; Smibert, Dylan

    2017-01-01

    Anonymous student evaluations of teaching (SETs) are used by colleges and universities to measure teaching effectiveness and to make decisions about faculty hiring, firing, re-appointment, promotion, tenure, and merit pay. Although numerous studies have found that SETs correlate with various teaching effectiveness irrelevant factors (TEIFs) such as subject, class size, and grading standards, it has been argued that such correlations are small and do not undermine the validity of SETs as measures of professors’ teaching effectiveness. However, previous research has generally used inappropriate parametric statistics and effect sizes to examine and to evaluate the significance of TEIFs on personnel decisions. Accordingly, we examined the influence of quantitative vs. non-quantitative courses on SET ratings and SET-based personnel decisions using 14,872 publicly posted class evaluations, where each evaluation represents a summary of SET ratings provided by individual students responding in each class. In total, 325,538 individual student evaluations from a US mid-size university contributed to these class evaluations. The results demonstrate that class subject (math vs. English) is strongly associated with SET ratings, has a substantial impact on professors being labeled satisfactory vs. unsatisfactory and excellent vs. non-excellent, and the impact varies substantially depending on the criteria used to classify professors as satisfactory vs. unsatisfactory. Professors teaching quantitative courses are far more likely not to receive tenure, promotion, and/or merit pay when their performance is evaluated against common standards. PMID:28503380

  3. Student evaluations of teaching: teaching quantitative courses can be hazardous to one's career.

    PubMed

    Uttl, Bob; Smibert, Dylan

    2017-01-01

    Anonymous student evaluations of teaching (SETs) are used by colleges and universities to measure teaching effectiveness and to make decisions about faculty hiring, firing, re-appointment, promotion, tenure, and merit pay. Although numerous studies have found that SETs correlate with various teaching effectiveness irrelevant factors (TEIFs) such as subject, class size, and grading standards, it has been argued that such correlations are small and do not undermine the validity of SETs as measures of professors' teaching effectiveness. However, previous research has generally used inappropriate parametric statistics and effect sizes to examine and to evaluate the significance of TEIFs on personnel decisions. Accordingly, we examined the influence of quantitative vs. non-quantitative courses on SET ratings and SET-based personnel decisions using 14,872 publicly posted class evaluations, where each evaluation represents a summary of SET ratings provided by individual students responding in each class. In total, 325,538 individual student evaluations from a US mid-size university contributed to these class evaluations. The results demonstrate that class subject (math vs. English) is strongly associated with SET ratings, has a substantial impact on professors being labeled satisfactory vs. unsatisfactory and excellent vs. non-excellent, and the impact varies substantially depending on the criteria used to classify professors as satisfactory vs. unsatisfactory. Professors teaching quantitative courses are far more likely not to receive tenure, promotion, and/or merit pay when their performance is evaluated against common standards.

  4. Quantitative Analysis of Tremors in Welders

    PubMed Central

    Sanchez-Ramos, Juan; Reimer, Dacy; Zesiewicz, Theresa; Sullivan, Kelly; Nausieda, Paul A.

    2011-01-01

    Background: Workers chronically exposed to manganese in welding fumes may develop an extra-pyramidal syndrome with postural and action tremors. Objectives: To determine the utility of tremor analysis in distinguishing tremors among workers exposed to welding fumes, patients with idiopathic Parkinson's disease (IPD), and patients with essential tremor (ET). Methods: Retrospective study of recorded tremor in welders and in subjects from academic movement disorders clinics. Quantitative tremor analysis was performed and associated with clinical status. Results: Postural tremor intensity was increased in welders and ET subjects and was associated with visibly greater amplitude of tremor with arms extended. Mean center frequencies (Cf) of welders and patients with ET were significantly higher than the mean Cf of IPD subjects. Although both the welders and the ET group exhibited a higher Cf with arms extended, welders could be distinguished from the ET subjects by a significantly lower Cf of the rest tremor than that measured in ET subjects. Conclusions: In the context of an appropriate exposure history and neurological examination, tremor analysis may be useful in the diagnosis of manganese-related extra-pyramidal manifestations. PMID:21655131
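
    A center frequency of the kind compared above can be computed as the power-weighted mean frequency of the recorded signal's spectrum. The sketch below does this with a Welch estimate on a synthetic 6.5-Hz tremor; the band limits and the signal itself are illustrative assumptions, not the study's recording setup.

        import numpy as np
        from scipy.signal import welch

        fs = 100.0                                   # sampling rate (Hz)
        t = np.arange(0, 30, 1 / fs)
        sig = (np.sin(2 * np.pi * 6.5 * t)
               + 0.3 * np.random.default_rng(2).standard_normal(t.size))

        f, pxx = welch(sig, fs=fs, nperseg=1024)
        band = (f >= 2) & (f <= 15)                  # plausible tremor band
        cf = np.sum(f[band] * pxx[band]) / np.sum(pxx[band])
        print(f"center frequency = {cf:.1f} Hz")     # ~6.5 Hz for this synthetic tremor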

  5. Nonlinear dynamics and quantitative EEG analysis.

    PubMed

    Jansen, B H

    1996-01-01

    Quantitative, computerized electroencephalogram (EEG) analysis appears to be based on a phenomenological approach to EEG interpretation, and is primarily rooted in linear systems theory. A fundamentally different approach to computerized EEG analysis, however, is making its way into the laboratories. The basic idea, inspired by recent advances in the area of nonlinear dynamics and chaos theory, is to view an EEG as the output of a deterministic system of relatively simple complexity, but containing nonlinearities. This suggests that studying the geometrical dynamics of EEGs, and the development of neurophysiologically realistic models of EEG generation, may produce more successful automated EEG analysis techniques than the classical, stochastic methods. A review of the fundamentals of chaos theory is provided. Evidence supporting the nonlinear dynamics paradigm for EEG interpretation is presented, and the kind of new information that can be extracted from the EEG is discussed. A case is made that a nonlinear dynamical systems viewpoint of EEG generation will profoundly affect the way EEG interpretation is currently done.

  6. D0 Detector Assemble Hall Platform Oxygen Deficiency Hazard Analysis

    SciTech Connect

    Clark, D.; Michael, J.; /Fermilab

    1991-01-29

    Liquid cryogens, released and warming to atmospheric conditions, expand to, on average, seven hundred times their liquid volume, displacing vital atmospheric oxygen. An oxygen deficiency hazard (ODH) analysis assesses the increased risk to personnel in areas containing cryogenic systems. The D0 detector platform area ODH analysis has been approached four different ways using established methods. In each case, the analysis shows the platform area to be ODH class 0 as equipped (with ventilation fans), requiring no special safety provisions. System designers have provided a reduced-oxygen-level detection and warning system as well as emergency procedures to address fault conditions. The oxygen deficiency hazard of any particular area is defined by these parameters: the nature of the accidental supply of inert gas (probability of occurrence and quantity released), the area's volume, the area's ventilation rate, and, to a small degree, the elevation of the area. Once this information is assembled, the ODH classification can be determined through standardized calculations. The platform area under the D0 detector contains much of the cryogenic and gas system piping necessary for the D0 experiment. Prior to moving the detector into the Collision Hall, the liquid argon calorimeters are cooled down and operated in the Assembly Hall. The first phase of this operation involved the cooldown of the Central Calorimeter, which was done in February 1991. This engineering note assesses the increased risk to personnel at the platform level from a reduced-oxygen atmosphere during the cooldown and subsequent operation of the calorimeters in the Assembly Hall. In addition, it outlines the steps taken to warn personnel of an emergency and to direct the subsequent evacuation. This note analyses only the Assembly Hall area. A similar engineering note, EN-332, covers the analysis of the Collision Hall area.
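
    The standardized dilution calculation mentioned above can be sketched from a single mass-balance ODE for the oxygen fraction in a ventilated volume during a steady inert-gas release, dC/dt = (0.21*Q - (Q + R)*C)/V. All numbers below are illustrative, not the D0 platform figures.

        import numpy as np

        V = 3000.0    # free air volume of the area (m^3)
        Q = 10.0      # fresh-air ventilation rate (m^3/min), e.g. with fans on
        R = 40.0      # inert gas release rate into the area (m^3/min)

        t = np.linspace(0, 120, 7)                               # minutes
        c_ss = 0.21 * Q / (Q + R)                                # steady-state O2 fraction
        c = c_ss + (0.21 - c_ss) * np.exp(-(Q + R) * t / V)      # transient solution
        for ti, ci in zip(t, c):
            print(f"t = {ti:5.0f} min  O2 = {100 * ci:.1f}%")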

  7. Analyzing Distributed Functions in an Integrated Hazard Analysis

    NASA Technical Reports Server (NTRS)

    Morris, A. Terry; Massie, Michael J.

    2010-01-01

    Large scale integration of today's aerospace systems is achievable through the use of distributed systems. Validating the safety of distributed systems is significantly more difficult as compared to centralized systems because of the complexity of the interactions between simultaneously active components. Integrated hazard analysis (IHA), a process used to identify unacceptable risks and to provide a means of controlling them, can be applied to either centralized or distributed systems. IHA, though, must be tailored to fit the particular system being analyzed. Distributed systems, for instance, must be analyzed for hazards in terms of the functions that rely on them. This paper will describe systems-oriented IHA techniques (as opposed to traditional failure-event or reliability techniques) that should be employed for distributed systems in aerospace environments. Special considerations will be addressed when dealing with specific distributed systems such as active thermal control, electrical power, command and data handling, and software systems (including the interaction with fault management systems). Because of the significance of second-order effects in large scale distributed systems, the paper will also describe how to analyze secondary functions to secondary functions through the use of channelization.

  8. A novel hazard assessment method for biomass gasification stations based on extended set pair analysis.

    PubMed

    Yan, Fang; Xu, Kaili; Li, Deshun; Cui, Zhikai

    2017-01-01

    Biomass gasification stations face many hazard factors, and it is therefore necessary to carry out hazard assessments for them. In this study, a novel hazard assessment method called extended set pair analysis (ESPA) is proposed based on set pair analysis (SPA). In SPA, however, calculation of the connection degree (CD) requires the classification of hazard grades and their corresponding thresholds. For hazard assessment using ESPA, a novel algorithm for calculating the CD is therefore worked out for the case where hazard grades and their corresponding thresholds are unknown. The CD is then converted into a Euclidean distance (ED) by a simple, concise calculation, and the hazard of each sample is ranked according to the value of its ED. In this paper, six biomass gasification stations are assessed using ESPA and general set pair analysis (GSPA), respectively. Comparison of the hazard assessment results obtained from ESPA and GSPA demonstrates the availability and validity of ESPA for the hazard assessment of biomass gasification stations. The reasonableness of ESPA is also supported by a sensitivity analysis of the hazard assessment results obtained by ESPA and GSPA.
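
    The CD-to-ED ranking step can be sketched generically. In set pair analysis the connection degree is often written as a triple (a, b, c) of identity, discrepancy and contrary degrees with a + b + c = 1; the sketch below ranks invented stations by Euclidean distance from the ideal state (1, 0, 0). The paper's ESPA derives its CD components differently, so this illustrates only the ranking idea.

        import numpy as np

        # Invented connection-degree triples (a, b, c) for six stations.
        stations = {"S1": (0.62, 0.25, 0.13), "S2": (0.48, 0.30, 0.22),
                    "S3": (0.71, 0.20, 0.09), "S4": (0.55, 0.28, 0.17),
                    "S5": (0.66, 0.22, 0.12), "S6": (0.51, 0.31, 0.18)}

        def euclid_from_ideal(cd):
            a, b, c = cd
            return float(np.sqrt((1 - a) ** 2 + b ** 2 + c ** 2))

        # Smaller distance = closer to the safe ideal; larger = more hazardous.
        for s in sorted(stations, key=lambda s: euclid_from_ideal(stations[s])):
            print(s, round(euclid_from_ideal(stations[s]), 3))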

  9. A novel hazard assessment method for biomass gasification stations based on extended set pair analysis

    PubMed Central

    Yan, Fang; Xu, Kaili; Li, Deshun; Cui, Zhikai

    2017-01-01

    Biomass gasification stations face many hazard factors, and it is therefore necessary to carry out hazard assessments for them. In this study, a novel hazard assessment method called extended set pair analysis (ESPA) is proposed based on set pair analysis (SPA). In SPA, however, calculation of the connection degree (CD) requires the classification of hazard grades and their corresponding thresholds. For hazard assessment using ESPA, a novel algorithm for calculating the CD is therefore worked out for the case where hazard grades and their corresponding thresholds are unknown. The CD is then converted into a Euclidean distance (ED) by a simple, concise calculation, and the hazard of each sample is ranked according to the value of its ED. In this paper, six biomass gasification stations are assessed using ESPA and general set pair analysis (GSPA), respectively. Comparison of the hazard assessment results obtained from ESPA and GSPA demonstrates the availability and validity of ESPA for the hazard assessment of biomass gasification stations. The reasonableness of ESPA is also supported by a sensitivity analysis of the hazard assessment results obtained by ESPA and GSPA. PMID:28938011

  10. 14 CFR Appendix B to Part 417 - Flight Hazard Area Analysis for Aircraft and Ship Protection

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... hazard area that encompasses the launch point from the surface of the Earth to an altitude of 100,000 ft... radial direction. A launch operator must calculate an aircraft hazard area in the vicinity of the launch...

  11. A quantitative risk analysis approach to port hydrocarbon logistics.

    PubMed

    Ronza, A; Carol, S; Espejo, V; Vílchez, J A; Arnaldos, J

    2006-01-16

    A method is presented that allows quantitative risk analysis to be performed on marine hydrocarbon terminals sited in ports. A significant gap was identified in the technical literature on QRA for the handling of hazardous materials in harbours published prior to this work. The analysis is extended to tanker navigation through port waters and to loading and unloading facilities. The steps of the method are discussed, beginning with data collection. As to accident scenario identification, an approach is proposed that takes into account minor and massive spills due to loading arm failures and tank rupture. Frequency estimation is thoroughly reviewed and a shortcut approach is proposed for frequency calculation. This allows for the two-fold possibility of a tanker colliding or grounding at or near the berth, or while navigating to or from the berth. A number of probability data defining the possibility of a cargo spill after an external impact on a tanker are discussed. As to consequence and vulnerability estimates, a scheme is proposed for the use of ratios between the numbers of fatal victims, injured, and evacuated people. Finally, an example application is given, based on a pilot study conducted in the Port of Barcelona, where the method was tested.

  12. Quantitative Analysis of Triple Mutant Genetic Interactions

    PubMed Central

    Braberg, Hannes; Alexander, Richard; Shales, Michael; Xu, Jiewei; Franks-Skiba, Kathleen E.; Wu, Qiuqin; Haber, James E.; Krogan, Nevan J.

    2014-01-01

    The quantitative analysis of genetic interactions between pairs of gene mutations has proven effective for characterizing cellular functions, but it can miss important interactions for functionally redundant genes. To address this limitation, we have developed an approach termed Triple Mutant Analysis (TMA). The procedure relies on a query strain that contains two deletions in a pair of redundant or otherwise related genes, which is crossed against a panel of candidate deletion strains to isolate triple mutants and measure their growth. A central feature of TMA is to interrogate mutants that are synthetically sick when two other genes are deleted but interact minimally with either single deletion. This approach has been valuable for discovering genes that restore critical functions when the principal actors are deleted. TMA has also uncovered double mutant combinations that produce severe defects because a third protein becomes deregulated and acts in a deleterious fashion, and it has revealed functional differences between proteins presumed to act together. The protocol is optimized for Singer ROTOR pinning robots, takes 3 weeks to complete, and measures interactions for up to 30 double mutants against a library of 1536 single mutants. PMID:25010907

  13. Seniors' online communities: a quantitative content analysis.

    PubMed

    Nimrod, Galit

    2010-06-01

    To examine the contents and characteristics of seniors' online communities and to explore their potential benefits to older adults. Quantitative content analysis of a full year's data from 14 leading online communities using a novel computerized system. The overall database included 686,283 messages. There was a constant increase in the daily activity level during the research period. Content analysis identified 13 main subjects discussed in the communities, including (in descending order) "Fun on line," "Retirement," "Family," "Health," "Work and Study," "Recreation," "Finance," "Religion and Spirituality," "Technology," "Aging," "Civic and Social," "Shopping," and "Travels." The overall tone was somewhat more positive than negative. The findings suggest that the utilities of Information and Communications Technologies for older adults that were identified in previous research are valid for seniors' online communities as well. However, the findings suggest several other possible benefits, which may be available only to online communities. The communities may provide social support, contribute to self-preservation, and serve as an opportunity for self-discovery and growth. Because they offer both leisure activity and an expanded social network, it is suggested that active participation in the communities may contribute to the well-being of older adults. Directions for future research and applied implications are further discussed.

  14. Quantitative analysis of protein turnover in plants.

    PubMed

    Nelson, Clark J; Li, Lei; Millar, A Harvey

    2014-03-01

    Proteins are constantly being synthesised and degraded as plant cells age and as plants grow, develop and adapt the proteome. Given that plants develop through a series of events from germination to fruiting and even undergo whole-organ senescence, an understanding of protein turnover as a fundamental part of this process in plants is essential. Both synthesis and degradation processes are spatially separated in a cell across its compartmented structure. The majority of protein synthesis occurs in the cytosol, while synthesis of specific components occurs inside plastids and mitochondria. Degradation of proteins occurs both in the cytosol, through the action of the plant proteasome, and in organelles and lytic structures through different protease classes. Tracking the specific synthesis and degradation rate of individual proteins can be undertaken using stable isotope feeding and the ability of peptide MS to track labelled peptide fractions over time. Mathematical modelling can be used to follow the isotope signature of newly synthesised protein as it accumulates and of natural-abundance proteins as they are lost through degradation. Different technical and biological constraints govern the potential for the use of (13)C, (15)N, (2)H and (18)O in these experiments in complete-labelling and partial-labelling strategies. Future development of quantitative protein turnover analysis will involve analysis of protein populations in complexes and subcellular compartments, assessing the effect of PTMs, and integrating turnover studies into wider systems biology study of plants.
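
    The modelling step described above can be sketched with the simplest one-pool turnover model, in which the labelled fraction of a protein follows f(t) = 1 - exp(-k*t). The sketch fits k to invented time-course data; it illustrates the general approach, not the authors' model.

        import numpy as np
        from scipy.optimize import curve_fit

        t_days = np.array([0.5, 1, 2, 4, 7])
        f_labelled = np.array([0.18, 0.31, 0.52, 0.77, 0.92])   # invented MS data

        model = lambda t, k: 1.0 - np.exp(-k * t)               # one-pool turnover
        (k_fit,), _ = curve_fit(model, t_days, f_labelled, p0=[0.3])
        print(f"turnover rate k = {k_fit:.2f} /day, "
              f"half-life = {np.log(2) / k_fit:.1f} days")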

  15. Application of disease burden to quantitative assessment of health hazards for a decentralized water reuse system.

    PubMed

    Gao, Tingting; Chen, Rong; Wang, Xiaochang; Ngo, Huu Hao; Li, Yu-You; Zhou, Jinhong; Zhang, Lu

    2016-05-01

    The aim of this article is to introduce the methodology of disease burden (DB) to quantify the health impact of microbial regrowth during wastewater reuse, using the case study of a decentralized water reuse system at Xi'an Si-yuan University, located in Xi'an, China. Based on field investigation findings, Escherichia coli (E. coli), Salmonella and rotavirus were selected as typical regrowth pathogens causing potential health hazards during the reuse of reclaimed water. Subsequently, major exposure routes including sprinkler irrigation, landscape fountains and toilet flushing were identified. Mathematical models were established to relate exposure dose to disease burden by calculating disability-adjusted life years (DALYs). Results of the disease burden for this case study show that DALYs attributed to E. coli were significantly greater than those caused by other pathogens, and DALYs associated with sprinkler irrigation were higher than those originating from other routes. A correlation between exposure dose and disease burden was obtained by introducing a modified calculation of morbidity, which extends the assessment endpoint of health risk from the conventional infection rate to disease burden.
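
    The dose-to-DALY chain can be sketched end to end: an exponential dose-response model gives infection risk, a morbidity factor converts infections to cases, and a per-case DALY weight scales the result. All parameter values below are illustrative, not those of the paper.

        import numpy as np

        def annual_dalys(dose_per_event, events_per_year, population,
                         r=0.0050, morbidity=0.3, daly_per_case=0.0046):
            # Exponential dose-response for a single exposure event.
            p_inf_event = 1.0 - np.exp(-r * dose_per_event)
            # Annual infection risk from repeated independent exposures.
            p_inf_year = 1.0 - (1.0 - p_inf_event) ** events_per_year
            cases = p_inf_year * morbidity * population
            return cases * daly_per_case

        # e.g. sprinkler-irrigation exposure for 1000 exposed people
        print(f"{annual_dalys(dose_per_event=20, events_per_year=50, population=1000):.3f} DALYs/yr")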

  16. Rapid microbiological methods with hazard analysis critical control point.

    PubMed

    Griffiths, M W

    1997-01-01

    The proactive approach to ensuring food safety termed hazard analysis critical control point (HACCP) was introduced in the 1960s by the Pillsbury Company, in collaboration with the U.S. Army Natick Laboratories and the National Aeronautics and Space Administration, to help guarantee that astronauts would not be incapacitated by the trauma of foodborne illness during space flights. The approach has subsequently been adopted as the standard food safety management system worldwide and is seen as forming the basis for harmonization of food inspection regulations necessitated by trade agreements such as the General Agreement on Tariffs and Trade and the North American Free Trade Agreement, as the move toward globalization of trade in food products gains momentum. The new U.S. Department of Agriculture Mega-Reg requires mandatory introduction of HACCP, and the Food Safety Enhancement Program of Agriculture and Agri-Food Canada, as well as the "due diligence" legislation of the European Union, is centered on HACCP principles.

  17. Systematic analysis of natural hazards along infrastructure networks using a GIS-tool for risk assessment

    NASA Astrophysics Data System (ADS)

    Baruffini, Mirko

    2010-05-01

    system which integrates the procedures for a complete risk analysis in a Geographic Information System (GIS) toolbox, in order to be applied to our testbed, the Alps-crossing corridor of St. Gotthard. The simulation environment is developed within ArcObjects, the development platform for ArcGIS. The topic of ArcObjects usually emerges when users realize that programming ArcObjects can actually reduce the amount of repetitive work, streamline the workflow, and even produce functionalities that are not easily available in ArcGIS. We have adopted Visual Basic for Applications (VBA) for programming ArcObjects. Because VBA is already embedded within ArcMap and ArcCatalog, it is convenient for ArcGIS users to program ArcObjects in VBA. Our tool visualises data obtained from the analysis of historical records (aerial photo imagery, field surveys, documentation of past events) or from environmental modelling (estimates of the area affected by a given event), together with event attributes such as route number and route position, and thematic maps. As a result of this step the record appears in the WebGIS. The user can select a specific area to get an overview of previous hazards in the region. After performing the analysis, a double click on the visualised infrastructure opens the corresponding results. The constantly updated risk maps show all sites that require more protection against natural hazards. The final goal of our work is to offer a versatile tool for risk analysis which can be applied to different situations. Today our GIS application mainly centralises the documentation of natural hazards. Additionally, the system offers information about natural hazards along the Gotthard line. It is very flexible and can be used as a simple program to model the expansion of natural hazards, as a program to quantitatively estimate risks, or as a detailed analysis tool at the municipality level. The tool is extensible and can be expanded with additional modules. The initial results of the experimental case study show how useful a

  18. Regional Analysis of the Hazard Level of Glacial Lakes in the Cordillera Blanca, Peru

    NASA Astrophysics Data System (ADS)

    Chisolm, Rachel E.; Jhon Sanchez Leon, Walter; McKinney, Daene C.; Cochachin Rapre, Alejo

    2016-04-01

    The Cordillera Blanca mountain range is the highest in Peru and contains many of the world's tropical glaciers. This region is severely impacted by climate change, causing accelerated glacier retreat. Secondary impacts of climate change on glacier retreat include stress on water resources and the risk of glacial lake outburst floods (GLOFs) from the many lakes that are forming and growing at the base of glaciers. A number of GLOFs originating from lakes in the Cordillera Blanca have occurred over the last century, several of which have had catastrophic impacts on cities and communities downstream. Glaciologists and engineers in Peru have been studying the lakes of the Cordillera Blanca for many years and have identified several lakes that are considered dangerous. However, a systematic analysis of all the lakes in the Cordillera Blanca has never before been attempted. Some methodologies for this type of systematic analysis have been proposed (e.g., Emmer and Vilimek 2014; Wang et al. 2011), but as yet they have only been applied to a few select lakes in the Cordillera Blanca. This study uses remotely sensed data to study all of the lakes of the Glacial Lake Inventory published by the Glaciology and Water Resources Unit of Peru's National Water Authority (UGRH 2011). The objective of this study is to assign a level of potential hazard to each glacial lake in the Cordillera Blanca and to ascertain whether any of the lakes beyond those that have already been studied might pose a danger to nearby populations. A number of parameters of analysis, both quantitative and qualitative, have been selected to assess the hazard level of each glacial lake in the Cordillera Blanca using digital elevation models, satellite imagery, and glacier outlines. These parameters are then combined to come up with a preliminary assessment of the hazard level of each lake; the equation weighting each parameter draws on previously published methodologies but is tailored to the regional characteristics
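
    The final weighting step can be illustrated generically: normalized hazard parameters for each lake are combined by a weighted sum and the lakes ranked by score. The parameters, weights and values below are invented; the study's own equation and weights are not reproduced.

        # Invented normalized parameters (0-1) for two hypothetical lakes.
        lakes = {
            "Lake A": {"lake_growth": 0.8, "slope_above": 0.6, "moraine_dam": 1.0},
            "Lake B": {"lake_growth": 0.3, "slope_above": 0.9, "moraine_dam": 0.0},
        }
        weights = {"lake_growth": 0.4, "slope_above": 0.35, "moraine_dam": 0.25}

        def hazard_score(params):
            # Weighted sum of normalized hazard parameters.
            return sum(weights[k] * params[k] for k in weights)

        for name, params in sorted(lakes.items(), key=lambda kv: -hazard_score(kv[1])):
            print(f"{name}: {hazard_score(params):.2f}")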

  19. Analysis of hazardous material releases due to natural hazards in the United States.

    PubMed

    Sengul, Hatice; Santella, Nicholas; Steinberg, Laura J; Cruz, Ana Maria

    2012-10-01

    Natural hazards were the cause of approximately 16,600 hazardous material (hazmat) releases reported to the National Response Center (NRC) between 1990 and 2008-three per cent of all reported hazmat releases. Rain-induced releases were most numerous (26 per cent of the total), followed by those associated with hurricanes (20 per cent), many of which resulted from major episodes in 2005 and 2008. Winds, storms or other weather-related phenomena were responsible for another 25 per cent of hazmat releases. Large releases were most frequently due to major natural disasters. For instance, hurricane-induced releases of petroleum from storage tanks account for a large fraction of the total volume of petroleum released during 'natechs' (understood here as a natural hazard and the hazardous materials release that results). Among the most commonly released chemicals were nitrogen oxides, benzene, and polychlorinated biphenyls. Three deaths, 52 injuries, and the evacuation of at least 5,000 persons were recorded as a consequence of natech events. Overall, results suggest that the number of natechs increased over the study period (1990-2008) with potential for serious human and environmental impacts.

  20. Applying Knowledge of Quantitative Design and Analysis

    ERIC Educational Resources Information Center

    Baskas, Richard S.

    2011-01-01

    This study compared and contrasted two quantitative scholarly articles in relation to their research designs. Their designs were analyzed by the comparison of research references and research specific vocabulary to describe how various research methods were used. When researching and analyzing quantitative scholarly articles, it is imperative to…

  1. UPDATE TO THE PROBABILISTIC VOLCANIC HAZARD ANALYSIS, YUCCA MOUNTAIN, NEVADA

    SciTech Connect

    K.J. Coppersmith

    2005-09-14

    A probabilistic volcanic hazard analysis (PVHA) was conducted in 1996 for the proposed repository at Yucca Mountain, Nevada. Based on data gathered by the Yucca Mountain Project over the course of about 15 years, the analysis integrated the judgments of a panel of ten volcanic experts using methods of formal expert elicitation. PVHA resulted in a probability distribution of the annual frequency of a dike intersecting the repository, which ranges from 10^-7 to 10^-10 (mean 1.6 x 10^-8). The analysis incorporates assessments of the future locations, rates, and types of volcanic dikes that could intersect the repository, which lies about 300 m below the surface. A particular focus of the analysis is the quantification of uncertainties. Since the 1996 PVHA, additional aeromagnetic data have been collected in the Yucca Mountain region, including a high-resolution low-altitude survey. A number of anomalies have been identified within alluvial areas and modeling suggests that some of these may represent buried eruptive centers (basaltic cinder cones). A program is currently underway to drill several of the anomalies to gain information on their origin and, if basalt, their age and composition. To update the PVHA in light of the new aeromagnetic and drilling data as well as other advancements in volcanic hazard modeling over the past decade, the expert panel has been reconvened and the expert elicitation process has been fully restarted. The analysis requires assessments of the spatial distribution of igneous events, temporal distributions, and geometries and characteristics of future events (both intrusive and extrusive). The assessments are for future time periods of 10,000 years and 1,000,000 years. Uncertainties are being quantified in both the conceptual models that define these elements as well as in the parameters for the models. The expert elicitation process is centered around a series of workshops that focus on the available data; alternative approaches to

  2. Quantitative color analysis for capillaroscopy image segmentation.

    PubMed

    Goffredo, Michela; Schmid, Maurizio; Conforto, Silvia; Amorosi, Beatrice; D'Alessio, Tommaso; Palma, Claudio

    2012-06-01

    This communication introduces a novel approach for quantitatively evaluating the role of color space decomposition in digital nailfold capillaroscopy analysis. It is clinically recognized that alterations of the capillary pattern at the periungual skin region are directly related to dermatologic and rheumatic diseases. The proposed algorithm for the segmentation of digital capillaroscopy images is optimized with respect to the choice of color space and to contrast variation. Since the color space is a critical factor in segmenting low-contrast images, an exhaustive comparison between different color channels is conducted and a novel color channel combination is presented. Results from images of 15 healthy subjects are compared with annotated data, i.e. selected images approved by clinicians. From this comparison, a set of figures of merit is extracted that highlights the algorithm's capability to correctly segment capillaries, their shape and their number. Experimental tests show that the optimized procedure for capillary segmentation, based on a novel color channel combination, achieves an average accuracy higher than 0.8 and extracts capillaries whose shape and granularity are acceptable. The obtained results are particularly encouraging for future developments on the classification of capillary patterns with respect to dermatologic and rheumatic diseases.
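
    A color-channel-combination segmentation of this general kind can be sketched briefly: a combined channel is formed from the RGB planes and thresholded with Otsu's method. The specific combination below is an illustrative choice, not the optimized combination reported by the authors.

        import numpy as np
        from skimage.filters import threshold_otsu

        rgb = np.random.rand(256, 256, 3)             # stand-in for a capillaroscopy image
        r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]

        # Capillaries appear dark in the green channel relative to red/blue.
        combined = g - 0.5 * (r + b)
        combined = (combined - combined.min()) / np.ptp(combined)

        mask = combined < threshold_otsu(combined)    # dark structures = vessels
        print(f"segmented fraction: {mask.mean():.2%}")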

  3. Statistical modeling of ground motion relations for seismic hazard analysis

    NASA Astrophysics Data System (ADS)

    Raschke, Mathias

    2013-10-01

    We introduce a new approach to ground motion relations (GMR) in probabilistic seismic hazard analysis (PSHA), influenced by the extreme value theory of mathematical statistics. Therein, we understand a GMR as a random function. We derive mathematically the principle of area equivalence, wherein two alternative GMRs have an equivalent influence on the hazard if these GMRs have equivalent area functions. This includes local biases. An interpretation of the difference between these GMRs (an actual and a modeled one) as a random component leads to a general overestimation of residual variance and hazard. Besides this, we discuss important aspects of classical approaches and discover discrepancies with the state of the art of stochastics and statistics (model selection and significance, tests of distribution assumptions, extreme value statistics). We criticize especially the assumption of logarithmically normally distributed residuals of maxima such as the peak ground acceleration (PGA). The natural distribution of its individual random component (equivalent to exp(ε0) of Joyner and Boore, Bull Seism Soc Am 83(2):469-487, 1993) is the generalized extreme value. We show by numerical experiments that the actual distribution can be hidden and that a wrong distribution assumption can influence the PSHA as negatively as neglect of area equivalence does. Finally, we suggest an estimation concept for GMRs in PSHA with a regression-free variance estimation of the individual random component. We demonstrate the advantages of event-specific GMRs by analyzing data sets from the PEER strong motion database and estimating event-specific GMRs. Therein, the majority of the best models are based on an anisotropic point source approach. The residual variance of logarithmized PGA is significantly smaller than in previous models. We validate the estimations for the event with the largest sample by empirical area functions, which indicate the appropriate modeling of the GMR by an anisotropic
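
    The distributional point about residuals can be made concrete by fitting a generalized extreme value (GEV) law to a sample and comparing it with the usual normal (lognormal in PGA) assumption. The "residuals" below are synthetic stand-ins, not PEER data.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(3)
        residuals = rng.gumbel(loc=0.0, scale=0.25, size=500)   # stand-in ln-PGA residuals

        # GEV fit (the Gumbel case is shape = 0 in scipy's convention).
        shape, loc, scale = stats.genextreme.fit(residuals)
        print(f"GEV fit: shape={shape:.2f}, loc={loc:.2f}, scale={scale:.2f}")
        # Normal fit of the same residuals, for the classical assumption:
        print("normal fit:", np.round(stats.norm.fit(residuals), 2))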

  4. NFPA 70E: Performing the electrical flash hazard analysis.

    PubMed

    Wallace, W Jon

    2005-08-01

    Arc flash hazards pose a serious risk to employee safety; proper safe work practices must be utilized. Electrical equipment operating at 50 volts or more must be deenergized and locked out/tagged out prior to servicing and maintenance unless doing so would increase hazards or is infeasible. Remember, convenience is not an acceptable reason for keeping equipment energized during servicing and maintenance. If electrical equipment must remain energized during servicing and maintenance, NFPA 70E should be consulted to determine flash hazard boundaries as well as required PPE. Finally, circuit breakers and electrical disconnects must be marked to warn qualified employees of potential arc flash hazards.

  5. Probabilistic Rockfall Hazard Analysis in the area affected by the Christchurch Earthquakes, New Zealand

    NASA Astrophysics Data System (ADS)

    Frattini, P.; Lari, S.; Agliardi, F.; Crosta, G. B.; Salzmann, H.

    2012-04-01

    To limit damage to human lives and property in case of natural disasters, land planning and zonation, as well as the design of countermeasures, are fundamental tools, requiring however a rigorous quantitative risk analysis. As a consequence of the 3rd September 2010 (Mw 7.1) Darfield Earthquake, and of the 22nd February 2011 (Mw 6.2), 16th April 2011 (Mw 5.3) and 13th June 2011 (Mw 6.2) aftershock events, about 6000 rockfalls were triggered in the Port Hills of Christchurch, New Zealand. Five people were killed by falling rocks in the area, and several hundred homes were damaged or evacuated. In this work, we present a probabilistic rockfall hazard analysis for a small area located on the south-eastern slope of Richmond Hill (0.6 km2, Sumner, Christchurch, NZ). For the analysis, we adopted a new methodology (Probabilistic Rockfall Hazard Analysis, PRHA) that quantifies the exceedance probability for a given slope location of being affected by a rockfall event with a specific level of kinetic energy, integrating the contributions of different rockfall magnitude (volume) scenarios. The methodology requires the calculation of onset annual frequency, rockfall runout, and spatially varying kinetic energy. Onset annual frequencies for the different magnitude scenarios were derived from a frequency-magnitude relationship adapted from the literature. The probability distribution of kinetic energy for a given slope location and volume scenario was obtained by rockfall runout modeling of non-interacting blocks through the 3D Hy-Stone simulation code. The reference simulation was calibrated by back-analysis of rockfall events that occurred during the earthquake. For each rockfall magnitude scenario, 20 rockfall trajectories were simulated from each source cell using stochastically variable values of the restitution parameters. Finally, a probabilistic analysis integrating over six rockfall magnitude scenarios (ranging from 0.001 m3 to 1000 m3) was carried out to produce
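
    The integration over magnitude scenarios at a single slope cell can be sketched as follows: each scenario contributes its onset frequency times the modeled probability that a block reaches the cell with kinetic energy above a threshold; the scenario rates add, and a Poisson model converts the total rate to an annual exceedance probability. All numbers below are invented, not outputs of the Hy-Stone runs.

        import numpy as np

        # Six invented magnitude scenarios (0.001 to 1000 m3, say).
        onset_freq = np.array([5e-1, 1e-1, 2e-2, 4e-3, 8e-4, 1e-4])  # events/yr
        p_exceed = np.array([0.00, 0.02, 0.10, 0.30, 0.55, 0.80])    # P(reach cell, E > 300 kJ)

        rate = np.sum(onset_freq * p_exceed)          # total exceedance rate (1/yr)
        p_annual = 1.0 - np.exp(-rate)                # Poisson annual probability
        print(f"annual P(E > 300 kJ) = {p_annual:.4f}")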

  6. A Descriptive Analysis of Prehospital Response to Hazardous Materials Events.

    PubMed

    Martin, Ashley J; Lohse, Christine M; Sztajnkrycer, Matthew D

    2015-10-01

    Little is known about the overall frequency of hazardous materials (HazMat) events in the United States and the nature of prehospital care for those exposed. The purpose of the current study was to perform a descriptive analysis of Emergency Medical Services (EMS) activations reported to a national EMS database. Analysis of the 2012 National EMS Information System (NEMSIS) Public Release Research Data Set v.2.2.1, containing EMS emergency response data submitted by 41 states, was conducted. Mandatory data elements E0207 (Type of Response Delay), E0208 (Type of Scene Delay), and E0209 (Type of Transport Delay) contained specific codes for HazMat events and were used to identify specific EMS activation records for subsequent analysis. Overlapping data elements were identified and combined in order to prevent duplicate entries. Descriptive analyses were generated from the NEMSIS Research Data Set. A total of 17,479,328 EMS activations were reported, of which 2,527 unique activations involved HazMat response. Mass-casualty incident was coded for 5.6% of activations. The most common level of prehospital care present on scene was Basic Life Support (BLS; 51.1%); 2.1% required aggressive Advanced Life Support (ALS) response. The most common locations for HazMat activations were homes (36.2%), streets or highways (26.3%), and health care facilities (11.6%). The primary symptoms observed by EMS personnel were pain (29.6%), breathing problems (12.2%), and change in responsiveness (9.6%). Two percent of HazMat activations involved cardiac arrest, with 21.7% occurring after EMS arrival. Delays in patient care included response delay, scene delay, and transport delay. Hazardous materials events are rare causes of EMS activation in the United States. The majority occur in non-industrial venues and involve two or fewer patients. Scene time frequently is delayed due to multiple barriers. Cardiac arrest is rare but occurred after EMS arrival in one-fifth of patients.

  7. Analysis of On-board Hazard Detection and Avoidance for Safe Lunar Landing

    NASA Technical Reports Server (NTRS)

    Johnson, Andrew E.; Huertas, Andres; Werner, Robert A.; Montgomery, James F.

    2008-01-01

    Landing hazard detection and avoidance technology is being pursued within NASA to improve landing safety and increase access to sites of interest on the lunar surface. The performance of a hazard detection and avoidance system depends on properties of the terrain, sensor performance, algorithm design, vehicle characteristics, and the overall guidance, navigation, and control architecture. This paper analyzes the size of the region that must be imaged, sensor performance parameters, and the impact of trajectory angle on hazard detection performance. The analysis shows that vehicle hazard tolerance is the driving parameter for hazard detection system design.

  8. Quantitative Analysis of Hypoperfusion in Acute Stroke

    PubMed Central

    Nael, Kambiz; Meshksar, Arash; Liebeskind, David S.; Coull, Bruce M.; Krupinski, Elizabeth A.; Villablanca, J. Pablo

    2014-01-01

    Background and Purpose This study compares the concordance between arterial spin labeling (ASL) and dynamic susceptibility contrast (DSC) for the identification of regional hypoperfusion and diffusion-perfusion mismatch tissue classification using a quantitative method. Methods The inclusion criteria for this retrospective study were as follows: patients with acute ischemic syndrome with symptom onset <24 hours and acquisition of both ASL and DSC MR perfusion. The volumes of infarction and hypoperfused lesions were calculated on ASL and DSC multi-parametric maps. Patients were classified into reperfused, matched, or mismatch groups using time to maximum >6 sec as the reference. In a subset of patients who were successfully recanalized, the identical analysis was performed and the infarction and hypoperfused lesion volumes were used for paired pre- and posttreatment comparisons. Results Forty-one patients met our inclusion criteria. Twenty patients underwent successful endovascular revascularization (TICI>2a), resulting in a total of 61 ASL-DSC data pairs for comparison. The hypoperfusion volume on ASL-cerebral blood flow best approximated the DSC-time to peak volume (r=0.83) in the pretreatment group and time to maximum (r=0.46) after recanalization. Both ASL-cerebral blood flow and DSC-TTP overestimated the hypoperfusion volume compared with time to maximum volume in pretreatment (F=27.41, P<0.0001) and recanalized patients (F=8.78, P<0.0001). Conclusions ASL-cerebral blood flow overestimates the DSC time to maximum hypoperfusion volume and mismatch classification in patients with acute ischemic syndrome. Continued overestimation of hypoperfused volume after recanalization suggests that flow pattern and velocity changes, in addition to arterial transit delay, can affect the performance of ASL. PMID:23988646

  9. Spatial prediction of landslide hazard using discriminant analysis and GIS

    Treesearch

    Peter V. Gorsevski; Paul Gessler; Randy B. Foltz

    2000-01-01

    Environmental attributes relevant for the spatial prediction of landslides triggered by rain and snowmelt events were derived from a digital elevation model (DEM). Those data, in conjunction with statistics and a geographic information system (GIS), provided a detailed basis for the spatial prediction of landslide hazard. The spatial prediction of landslide hazard in this paper is...

  10. Natural hazard modeling and uncertainty analysis [Chapter 2

    Treesearch

    Matthew Thompson; Jord J. Warmink

    2017-01-01

    Modeling can play a critical role in assessing and mitigating risks posed by natural hazards. These modeling efforts generally aim to characterize the occurrence, intensity, and potential consequences of natural hazards. Uncertainties surrounding the modeling process can have important implications for the development, application, evaluation, and interpretation of...

  11. Hazard function analysis for flood planning under nonstationarity

    NASA Astrophysics Data System (ADS)

    Read, Laura K.; Vogel, Richard M.

    2016-05-01

    The field of hazard function analysis (HFA) involves a probabilistic assessment of the "time to failure" or "return period," T, of an event of interest. HFA is used in epidemiology, manufacturing, medicine, actuarial statistics, reliability engineering, economics, and elsewhere. While for a stationary process the probability distribution function (pdf) of the return period always follows an exponential distribution, the same is not true for nonstationary processes. When the process of interest, X, exhibits nonstationary behavior, HFA can provide a complementary approach to risk analysis with analytical tools particularly useful for hydrological applications. After a general introduction to HFA, we describe a new mathematical linkage between the magnitude of the flood event, X, and its return period, T, for nonstationary processes. We derive the probabilistic properties of T for a nonstationary one-parameter exponential model of X, and then use both Monte-Carlo simulation and HFA to generalize the behavior of T when X arises from a nonstationary two-parameter lognormal distribution. For this case, our findings suggest that a two-parameter Weibull distribution provides a reasonable approximation for the pdf of T. We document how HFA can provide an alternative approach to characterize the probabilistic properties of both nonstationary flood series and the resulting pdf of T.
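
    The stationary/nonstationary contrast drawn above lends itself to a small simulation. The sketch below uses an invented lognormal model with a trend in the log-mean, records the waiting time T to the first exceedance of a design level, and fits a two-parameter Weibull to the simulated return periods; the model form and every parameter are assumptions for illustration, not the paper's actual model.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(1)

        def first_exceedance_times(n_series, n_years, design_level, mu0, trend, sigma):
            """Simulate annual maxima X_t ~ lognormal with a linear trend in the
            log-mean (a simple nonstationary model) and record the waiting time T
            until the design level is first exceeded."""
            times = []
            for _ in range(n_series):
                for t in range(1, n_years + 1):
                    if rng.lognormal(mu0 + trend * t, sigma) > design_level:
                        times.append(t)
                        break
            return np.array(times)

        T = first_exceedance_times(2000, 500, design_level=60.0, mu0=3.0, trend=0.002, sigma=0.5)

        # Stationary benchmark: T would be geometric (exponential in continuous
        # time). For the nonstationary case, check the Weibull approximation.
        shape, loc, scale = stats.weibull_min.fit(T, floc=0)
        print(f"Weibull shape={shape:.2f}, scale={scale:.1f}, mean T={T.mean():.1f} yr")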

  12. Quantitative assessment of volcanic ash hazards for health and infrastructure at Mt. Etna (Italy) by numerical simulation

    NASA Astrophysics Data System (ADS)

    Barsotti, S.; Andronico, D.; Neri, A.; Del Carlo, P.; Baxter, P. J.; Aspinall, W. P.; Hincks, T.

    2010-04-01

    We performed a quantitative hazard assessment to determine the potential impacts of volcanic tephra fall on human health and infrastructure in the vicinity of Mt. Etna (Italy). Using the numerical model VOL-CALPUFF, we explored the dynamics of long-lasting weak plume eruptions and their effects on the surrounding region. Input data are based on credible estimates of the main parameters characterising the expected events as derived from the historically observed and reconstructed explosive record of Mt. Etna. Monte Carlo techniques are used to capture the effects on estimates of finer ash concentration and total ground deposition due to volcanological uncertainties and meteorological variability. Numerical simulations compute the likelihoods of experiencing critical 10-μm volcanic particle (VP 10) concentrations in ambient air and tephra ground deposition at various populated locations around the volcano, including the city of Catania, and at key infrastructure, such as airports and main roads. Results show how the towns and infrastructure on the east side of the volcano are significantly more exposed to ash-related hazards than those on the west side, in accordance with wind statistics. Simulation outcomes also illustrate how, at the sites analysed, the amount of deposited particulate matter is proportional to the intensity (i.e. mass flow rate) of the event whereas predicted values of VP 10 concentrations are significantly larger for smaller events due to the reduced dispersal of low altitude plumes. The use of a simple re-mobilization model highlights the fact that particle re-suspension needs to be considered in the estimation of VP 10 values. Our findings can be used to inform civil protection agencies responsible for mitigating tephra fall impacts to human health, road transport and aviation safety.

  13. Skeleton-based cerebrovascular quantitative analysis.

    PubMed

    Wang, Xingce; Liu, Enhui; Wu, Zhongke; Zhai, Feifei; Zhu, Yi-Cheng; Shui, Wuyang; Zhou, Mingquan

    2016-12-20

    Cerebrovascular disease is the most common cause of death worldwide, with millions of deaths annually. Interest is increasing toward understanding the geometric factors that influence cerebrovascular diseases, such as stroke. Cerebrovascular shape analyses are essential for the diagnosis and pathological identification of these conditions. The current study aimed to provide a stable and consistent methodology for quantitative Circle of Willis (CoW) analysis and to identify geometric changes in this structure. An entire pipeline was designed with emphasis on automating each step. The stochastic segmentation was improved and volumetric data were obtained. The L1 medial axis method was applied to vessel volumetric data, which yielded a discrete skeleton dataset. A B-spline curve was used to fit the skeleton, and geometric values were proposed for a one-dimensional skeleton and radius. The calculations used to derive these values were illustrated in detail. In one example (No. 47 in the open dataset) all values for different branches of the CoW were calculated. The anterior communicating artery (ACo) was the shortest vessel, with a length of 2.6 mm. The range of the curvature of all vessels was (0.3, 0.9) ± (0.1, 1.4). The range of the torsion was (-12.4, 0.8) ± (0, 48.7). The mean radius value range was (3.1, 1.5) ± (0.1, 0.7) mm, and the mean angle value range was (2.2, 2.9) ± (0, 0.2) mm. In addition to the torsion variance values in a few vessels, the variance values of all vessel characteristics remained near 1. The distribution of the radii of the symmetrical posterior cerebral artery (PCA) and the angle values of the symmetrical posterior communicating arteries (PCo) demonstrated a certain correlation between the corresponding values of symmetrical vessels on the CoW. The data verified the stability of our methodology. Our method was appropriate for the analysis of large medical image datasets derived from the automated pipeline for populations. This method was applicable to

  14. Quantitative Data Analysis--In the Graduate Curriculum

    ERIC Educational Resources Information Center

    Albers, Michael J.

    2017-01-01

    A quantitative research study collects numerical data that must be analyzed to help draw the study's conclusions. Teaching quantitative data analysis is not teaching number crunching, but teaching a way of critical thinking for how to analyze the data. The goal of data analysis is to reveal the underlying patterns, trends, and relationships of a…

  15. Quantitative Auger analysis of Nb-Ge superconducting alloys

    SciTech Connect

    Buitrago, R.H.

    1980-01-01

    The feasibility of using Auger electron analysis for quantitative analysis was investigated by studying Nb3Ge thin-film Auger data with different approaches. A method based on elemental standards gave consistent quantitative values with reported Nb-Ge data. Alloy sputter yields were also calculated, and the results were consistent with those for pure elements.

  16. Comparative hazard analysis and toxicological modeling of diverse nanomaterials using the embryonic zebrafish (EZ) metric of toxicity.

    PubMed

    Harper, Bryan; Thomas, Dennis; Chikkagoudar, Satish; Baker, Nathan; Tang, Kaizhi; Heredia-Langner, Alejandro; Lins, Roberto; Harper, Stacey

    The integration of rapid assays, large datasets, informatics, and modeling can overcome current barriers in understanding nanomaterial structure-toxicity relationships by providing a weight-of-the-evidence mechanism to generate hazard rankings for nanomaterials. Here, we present the use of a rapid, low-cost assay to perform screening-level toxicity evaluations of nanomaterials in vivo. Calculated EZ Metric scores, a combined measure of morbidity and mortality in developing embryonic zebrafish, were established at realistic exposure levels and used to develop a hazard ranking of diverse nanomaterial toxicity. Hazard ranking and clustering analysis of 68 diverse nanomaterials revealed distinct patterns of toxicity related to both the core composition and outermost surface chemistry of nanomaterials. The resulting clusters guided the development of a surface chemistry-based model of gold nanoparticle toxicity. Our findings suggest that risk assessments based on the size and core composition of nanomaterials alone may be wholly inappropriate, especially when considering complex engineered nanomaterials. Research should continue to focus on methodologies for determining nanomaterial hazard based on multiple sub-lethal responses following realistic, low-dose exposures, thus increasing the availability of quantitative measures of nanomaterial hazard to support the development of nanoparticle structure-activity relationships.
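
    The hazard ranking and clustering step described above can be sketched with standard tools; the snippet below uses random placeholder response profiles in place of real EZ Metric data, Ward-linkage hierarchical clustering to group materials, and a summed score as a crude ranking. The matrix shape, the scoring, and the cluster count are all assumptions for illustration.

        import numpy as np
        from scipy.cluster.hierarchy import linkage, fcluster

        rng = np.random.default_rng(3)

        # Placeholder for EZ Metric response profiles: rows are nanomaterials,
        # columns are morbidity/mortality endpoints scored across exposures.
        ez_scores = rng.random((68, 10))

        # Ward linkage on Euclidean distances between response profiles.
        Z = linkage(ez_scores, method="ward")
        clusters = fcluster(Z, t=6, criterion="maxclust")

        # A simple hazard ranking: order materials by their summed score.
        ranking = np.argsort(ez_scores.sum(axis=1))[::-1]
        print("cluster sizes:", np.bincount(clusters)[1:])
        print("five highest-ranked materials (indices):", ranking[:5])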

  17. Risk-based consequences of extreme natural hazard processes in mountain regions - Multi-hazard analysis in Tyrol (Austria)

    NASA Astrophysics Data System (ADS)

    Huttenlau, Matthias; Stötter, Johann

    2010-05-01

    weighting within the risk concept, this has significant implications for the results of risk analyses. Thus, an equal and scale-appropriate balance of those risk components is a fundamental key factor for effective natural hazard risk analyses. The results of such analyses inform especially decision makers in the insurance industry, the administration, and politicians on potential consequences and are the basis for appropriate risk management strategies. Thereby, results (i) on an annual or probabilistic risk comprehension have to be distinguished from (ii) scenario-based analyses. The former are based on statistics of periodically or episodically occurring events, whereas the latter approach is especially applied for extreme, non-linear, stochastic events. Focusing on the needs especially of insurance companies, the former approaches are appropriate for premium pricing and reinsurance strategies with an annual perspective, whereas the latter focuses on events with extreme loss burdens under worst-case criteria to guarantee adequate reinsurance coverage. Moreover, the demand for adequate loss model approaches and methods is strengthened by the risk-based requirements of the upcoming capital requirement directive Solvency II. The present study estimates the potential elements at risk, their corresponding damage potentials, and the Probable Maximum Losses (PMLs) of extreme natural hazard events in Tyrol (Austria), and adequately considers the scale dependency and balanced application of the introduced risk components. Besides the introduced analysis, an additional portfolio analysis of a regional insurance company was carried out. The geocoded insurance contracts of this portfolio analysis formed the basis for estimating spatially, socio-economically, and functionally differentiated mean insurance values for the different risk categories of (i) buildings, (ii) contents or inventory, (iii) vehicles, and (iv) persons in the study area. The estimated mean insurance values were

  18. Hazard Analysis and Disaster Preparedness in the Fairbanks North Star Borough, Alaska using Hazard Simulations, GIS, and Network Analysis

    NASA Astrophysics Data System (ADS)

    Schaefer, K.; Prakash, A.; Witte, W.

    2011-12-01

    The Fairbanks North Star Borough (FNSB) lies in interior Alaska, an area that is dominated by a semiarid, boreal forest climate. FNSB frequently witnesses flooding events, wildland fires, earthquakes, extreme winter storms, and other natural and man-made hazards. Covering a large area of 19,065 km2 with a population of approximately 97,000 residents, the borough faces a challenge in providing emergency services to residents in a timely manner. With only four highways going in and out of the borough, and only two of those leading to another city, most residents do not have quick access to a main road. Should a major disaster occur and block one of the two highways, options for evacuating or getting supplies to the area quickly dwindle. We present the design of a Geographic Information System (GIS) and network analysis based decision support tool that we have created for planning and emergency response. This tool will be used by Emergency Services (Fire/EMS), Emergency Management, the Hazardous Materials Team, and Law Enforcement Agencies within FNSB to prepare for and respond to a variety of potential disasters. The GIS combines available road and address networks from different FNSB agencies with the 2010 census data. We used ESRI's ArcGIS and FEMA's HAZUS-MH software to run multiple disaster scenarios and create several evacuation and response plans. Network analysis resulted in determining response times and classifying the borough by response times to facilitate allocation of emergency resources. The resulting GIS database can be used by any responding agency in FNSB to determine possible evacuation routes, where to open evacuation centers, placement of resources, and emergency response times. We developed a specific emergency response plan for three common scenarios: (i) a major wildfire threatening Fairbanks, (ii) a major earthquake, and (iii) loss of power during flooding in a flood-prone area. We also combined the network analysis results with high resolution imagery and elevation data to determine

  19. Comparative risk analysis of technological hazards (a review).

    PubMed Central

    Kates, R W; Kasperson, J X

    1983-01-01

    Hazards are threats to people and what they value and risks are measures of hazards. Comparative analyses of the risks and hazards of technology can be dated to Starr's 1969 paper [Starr, C. (1969) Science 165, 1232-1238] but are rooted in recent trends in the evolution of technology, the identification of hazard, the perception of risk, and the activities of society. These trends have spawned an interdisciplinary quasi profession with new terminology, methodology, and literature. A review of 54 English-language monographs and book-length collections, published between 1970 and 1983, identified seven recurring themes: (i) overviews of the field of risk assessment, (ii) efforts to estimate and quantify risk, (iii) discussions of risk acceptability, (iv) perception, (v) analyses of regulation, (vi) case studies of specific technological hazards, and (vii) agenda for research. Within this field, science occupies a unique niche, for many technological hazards transcend the realm of ordinary experience and require expert study. Scientists can make unique contributions to each area of hazard management but their primary contribution is the practice of basic science. Beyond that, science needs to further risk assessment by understanding the more subtle processes of hazard creation and by establishing conventions for estimating risk and for presenting and handling uncertainty. Scientists can enlighten the discussion of tolerable risk by setting risks into comparative contexts, by studying the process of evaluation, and by participating as knowledgeable individuals, but they cannot decide the issue. Science can inform the hazard management process by broadening the range of alternative control actions and modes of implementation and by devising methods to evaluate their effectiveness. PMID:6580625

  20. Seismic Hazard characterization study using an earthquake source with Probabilistic Seismic Hazard Analysis (PSHA) method in the Northern of Sumatra

    NASA Astrophysics Data System (ADS)

    Yahya, A.; Palupi, M. I. R.; Suharsono

    2016-11-01

    The Sumatra region is one of the earthquake-prone areas in Indonesia because it lies on an active tectonic zone. In 2004, an earthquake with a moment magnitude of 9.2 occurred off the coast, about 160 km west of Nanggroe Aceh Darussalam, and triggered a tsunami. These events caused numerous casualties and material losses, especially in the Provinces of Nanggroe Aceh Darussalam and North Sumatra. To minimize the impact of earthquake disasters, a fundamental assessment of the earthquake hazard in the region is needed. The stages of the research include a literature study, the collection and processing of seismic data, seismic source characterization, and analysis of the earthquake hazard by the probabilistic method (PSHA) using an earthquake catalog from 1907 through 2014. The earthquake hazard is represented by the value of Peak Ground Acceleration (PGA) and Spectral Acceleration (SA) at periods of 0.2 and 1 second on bedrock, presented in the form of a map with a return period of 2475 years, together with earthquake hazard curves for the cities of Medan and Banda Aceh.
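
    The core of the PSHA computation described above is a hazard curve: the annual rate of exceeding a ground-motion level, obtained by combining source rates with an attenuation relation. The toy sketch below uses a Gutenberg-Richter source and a made-up attenuation relation with lognormal scatter; every coefficient is invented for illustration and none is from the study.

        import numpy as np
        from scipy.stats import norm

        # Illustrative Gutenberg-Richter source: cumulative annual rates N(>=m),
        # differenced into per-bin rates (a and b values are assumed).
        mags = np.arange(5.0, 9.1, 0.5)
        a_val, b_val = 4.0, 1.0
        cum_rates = 10 ** (a_val - b_val * mags)
        rates = cum_rates - np.append(cum_rates[1:], 0.0)

        def pga_exceedance_rate(pga_g, dist_km=160.0, sigma_ln=0.6):
            """Annual rate of exceeding pga_g using a toy attenuation relation
            ln(PGA) = -3.5 + 1.0*M - 1.2*ln(R), purely illustrative."""
            ln_median = -3.5 + 1.0 * mags - 1.2 * np.log(dist_km)
            p_exceed = 1.0 - norm.cdf(np.log(pga_g), loc=ln_median, scale=sigma_ln)
            return np.sum(rates * p_exceed)

        # Hazard curve; the 2475-year motion corresponds to a rate of ~1/2475,
        # i.e. roughly 2% probability of exceedance in 50 years.
        for pga in (0.05, 0.1, 0.2, 0.4):
            print(f"PGA {pga:.2f} g: {pga_exceedance_rate(pga):.2e} /yr")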

  1. Probabilistic Tsunami Hazard Analysis for Eastern Sicily (Italy)

    NASA Astrophysics Data System (ADS)

    Lorito, S.; Piatanesi, A.; Romano, F.; Basili, R.; Kastelic, V.; Tiberti, M. M.; Valensise, G.

    2009-12-01

    We applied the method of Probabilistic Tsunami Hazard Analysis (PTHA), first introduced by Rikitake and Aida (1988), to the coasts of eastern Sicily (Italy), focusing on important cities such as Messina, Catania, and Augusta. The method has very recently been improved by González et al. (2009) to explicitly include inundation. Our approach considers both far- and near-field sources. Based on the approach described by Lorito et al. (2008), we identified a few Source Zones (SZ), each of which is characterized by a Typical Fault (TF) floating within it and capable of generating the Maximum Credible Earthquake (MCE). Some of the MCEs recall historical events such as the 1908 Messina, the 1693 Catania, and the 365 AD Crete earthquakes. We calculated the combined probability for all considered SZs by assigning a recurrence interval to each of them and assuming a Poisson distribution of the inter-event times. We performed the PTHA for different damage metrics, such as runup, current speed, momentum, and Froude number, with the probability of exceedance of given thresholds being evaluated for different time intervals.
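
    Under the Poisson assumption stated above, combining source zones reduces to summing their exceedance rates and converting to a probability over an exposure time. A minimal sketch with invented recurrence intervals and exceedance probabilities (not the paper's values):

        import numpy as np

        # Illustrative source zones: mean recurrence interval (years) of the MCE
        # and the modeled probability that its tsunami exceeds a runup threshold
        # at the coastal site of interest. All numbers are placeholders.
        source_zones = [
            {"name": "SZ-1", "recurrence_yr": 400.0,  "p_exceed_runup": 0.7},
            {"name": "SZ-2", "recurrence_yr": 1000.0, "p_exceed_runup": 0.5},
            {"name": "SZ-3", "recurrence_yr": 800.0,  "p_exceed_runup": 0.2},
        ]

        def combined_poisson_probability(t_years):
            """Probability that at least one zone produces an exceedance within
            t_years, assuming independent Poissonian inter-event times."""
            total_rate = sum(z["p_exceed_runup"] / z["recurrence_yr"] for z in source_zones)
            return 1.0 - np.exp(-total_rate * t_years)

        for t in (50, 100, 500):
            print(f"{t:>4} yr: {combined_poisson_probability(t):.3f}")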

  2. Analysis of hazardous biological material by MALDI mass spectrometry

    SciTech Connect

    KL Wahl; KH Jarman; NB Valentine; MT Kingsley; CE Petersen; ST Cebula; AJ Saenz

    2000-03-21

    Matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-MS) has become a valuable tool for analyzing microorganisms. The speed with which data can be obtained from MALDI-MS makes this a potentially important tool for biological health hazard monitoring and forensic applications. The excitement in the mass spectrometry community over this potential field of application is evident from the expanding list of research laboratories pursuing development of MALDI-MS for bacterial identification. Numerous research groups have demonstrated the ability to obtain unique MALDI-MS spectra from intact bacterial cells and bacterial cell extracts. The ability to differentiate strains of the same species has been investigated. Reproducibility of MALDI-MS spectra from bacterial species under carefully controlled experimental conditions has also been demonstrated. Wang et al. have reported on the interlaboratory reproducibility of the MALDI-MS analysis of several bacterial species. However, there are still issues that need to be addressed, including the careful control of experimental parameters for reproducible spectra and the selection of optimal experimental parameters such as solvent and matrix.

  3. Hazardous Gas Leak Analysis in the Space Shuttle

    NASA Technical Reports Server (NTRS)

    Barile, Ronald G.

    1991-01-01

    Helium tests of the main propulsion system in the Space Shuttle are examined, along with hydrogen leaks. The hazardous gas detection system (HGDS) in the mobile launch pad uses mass spectrometers (MS) to monitor the shuttle environment for leaks. The mass spectrometers are fed by long tubes that sample gas from the payload bay, mid-body, aft engine compartment, and external tank. The purpose is to improve the HGDS, especially its potential for locating cryogen leaks. Pre-existing leak data were analyzed for transient information to determine whether the leak location could be pinpointed from test data. A rapid-response leak detection experiment was designed, built, and tested. Large eddies and vortices were observed with Schlieren imaging, and they were detected in the time plots of the various instruments. The response time of the MS was found to be in the range of 0.05 to 0.1 sec. Pulsed concentration waves were clearly detected at 25 cycles per sec by spectral analysis of MS data. One conclusion is that the backup HGDS sampling frequency should be increased above the present rate of 1 sample per second.
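
    The spectral-analysis step that revealed the pulsed concentration waves can be illustrated with a short synthetic example: a noisy 25 Hz sinusoid standing in for the sampled MS concentration trace, analyzed with an FFT. The sampling rate, signal amplitude, and noise level are all assumed for illustration.

        import numpy as np

        fs = 100.0                          # assumed sampling rate, samples/s
        t = np.arange(0.0, 20.0, 1.0 / fs)

        rng = np.random.default_rng(7)
        # Synthetic stand-in for the MS trace: 25 Hz pulses buried in noise.
        trace = 0.2 * np.sin(2 * np.pi * 25.0 * t) + rng.normal(0.0, 0.5, t.size)

        power = np.abs(np.fft.rfft(trace - trace.mean())) ** 2
        freqs = np.fft.rfftfreq(t.size, d=1.0 / fs)
        print(f"dominant frequency: {freqs[np.argmax(power)]:.1f} Hz")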

  4. Preparation of waste oil for analysis to determine hazardous metals

    SciTech Connect

    Essling, A.M.; Huff, D.R.; Huff, E.A.; Fox, I.M.; Graczyk, D.G.

    1995-07-01

    Two methods for preparing waste-oil samples to permit measurement of their metals content were evaluated. For this evaluation, metals-in-oil standard reference materials were prepared by each method and the resulting solutions were analyzed for 20 metals, including those (As, Ba, Cd, Cr, Pb, Hg, Se, and Ag) regulated as hazardous under the Resource Conservation and Recovery Act. One preparation method involved combustion of the waste oil under oxygen at 25 atm pressure, as described in the American Society for Testing and Materials test method E926-88. As we applied it, this method gave recoveries well under 90% for most of the metals that we examined and, hence, proved unsatisfactory for routine application to waste-oil analysis. With the other method, nitric acid decomposition in a sealed vessel heated with microwave energy (analogous to US Environmental Protection Agency Method 3051), recoveries of all 20 metal contaminants were within 90 to 110% of the certified values. This microwave digestion procedure was also more efficient since it allowed six samples to be prepared together, whereas the oxygen combustion approach allowed processing of only one sample at a time.

  5. 21 CFR 120.8 - Hazard Analysis and Critical Control Point (HACCP) plan.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... General Provisions § 120.8 Hazard Analysis and Critical Control Point (HACCP) plan. (a) HACCP plan. Each.... Sanitation controls may be included in the HACCP plan. However, to the extent that they are monitored in... 21 Food and Drugs 2 2011-04-01 2011-04-01 false Hazard Analysis and Critical Control Point...

  6. 78 FR 69689 - Agency Information Collection Activities; Proposed Collection; Comment Request; Hazard Analysis...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-20

    ... Collection; Comment Request; Hazard Analysis and Critical Control Point Procedures for the Safe and Sanitary... appropriate, and other forms of information technology. Hazard Analysis and Critical Control Point (HACCP... monitoring of the critical control points and other measurements as prescribed in the HACCP plan......

  7. Scout: orbit analysis and hazard assessment for NEOCP objects

    NASA Astrophysics Data System (ADS)

    Farnocchia, Davide; Chesley, Steven R.; Chamberlin, Alan B.

    2016-10-01

    It typically takes a few days for a newly discovered asteroid to be officially recognized as a real object. During this time, the tentative discovery is published on the Minor Planet Center's Near-Earth Object Confirmation Page (NEOCP) until additional observations confirm that the object is a real asteroid rather than an observational artifact or an artificial object. Also, NEOCP objects could have a limited observability window and yet be scientifically interesting, e.g., radar and lightcurve targets, mini-moons (temporary Earth captures), mission accessible targets, close approachers or even impactors. For instance, the only two asteroids discovered before an impact, 2008 TC3 and 2014 AA, both reached the Earth less than a day after discovery. For these reasons we developed Scout, an automated system that provides an orbital and hazard assessment for NEOCP objects within minutes after the observations are available. Scout's rapid analysis increases the chances of securing the trajectory of interesting NEOCP objects before the ephemeris uncertainty grows too large or the observing geometry becomes unfavorable. The generally short observation arcs, perhaps only a few hours or even less, lead to severe degeneracies in the orbit estimation process. To overcome these degeneracies Scout relies on systematic ranging, a technique that derives possible orbits by scanning a grid in the poorly constrained space of topocentric range and range rate, while the plane-of-sky position and motion are directly tied to the recorded observations. This scan allows us to derive a distribution of the possible orbits and in turn identify the NEOCP objects of most interest to prioritize follow-up efforts. In particular, Scout ranks objects according to the likelihood of an impact, estimates the close approach distance, the Earth-relative minimum orbit intersection distance and v-infinity, and computes scores to identify objects more likely to be an NEO, a km-sized NEO, a Potentially
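
    A stripped-down illustration of the systematic-ranging idea described above: scan a grid of topocentric range and range rate, convert each node into a heliocentric state using an assumed observation geometry, and classify the resulting orbits (here, simply bound versus unbound). The observation vectors and grid limits are invented, and Scout's actual orbit scoring is far more elaborate.

        import numpy as np

        AU_KM = 1.495978707e8
        MU_SUN = 1.32712440018e11   # km^3/s^2

        # Assumed observation geometry: topocentric unit line of sight, its
        # angular rate, and the observer's heliocentric state (km, km/s).
        u = np.array([0.5, 0.7, np.sqrt(1 - 0.5**2 - 0.7**2)])
        u_dot = np.array([1e-7, -2e-7, 0.0])
        r_obs = np.array([AU_KM, 0.0, 0.0])
        v_obs = np.array([0.0, 29.78, 0.0])

        # Systematic ranging: scan topocentric range (km) and range rate (km/s).
        ranges = np.geomspace(1e5, 5e8, 60)
        range_rates = np.linspace(-30.0, 30.0, 41)

        bound = 0
        for rho in ranges:
            for rho_dot in range_rates:
                r = r_obs + rho * u                      # heliocentric position
                v = v_obs + rho_dot * u + rho * u_dot    # heliocentric velocity
                if 0.5 * v @ v - MU_SUN / np.linalg.norm(r) < 0:
                    bound += 1                           # bound heliocentric orbit

        print(f"{bound} of {ranges.size * range_rates.size} grid nodes are bound")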

  8. Extending and automating a Systems-Theoretic hazard analysis for requirements generation and analysis.

    SciTech Connect

    Thomas, John

    2012-05-01

    Systems Theoretic Process Analysis (STPA) is a powerful new hazard analysis method designed to go beyond traditional safety techniques - such as Fault Tree Analysis (FTA) - that overlook important causes of accidents like flawed requirements, dysfunctional component interactions, and software errors. While proving to be very effective on real systems, no formal structure has been defined for STPA and its application has been ad-hoc with no rigorous procedures or model-based design tools. This report defines a formal mathematical structure underlying STPA and describes a procedure for systematically performing an STPA analysis based on that structure. A method for using the results of the hazard analysis to generate formal safety-critical, model-based system and software requirements is also presented. Techniques to automate both the analysis and the requirements generation are introduced, as well as a method to detect conflicts between the safety and other functional model-based requirements during early development of the system.

  9. Quantitative Analysis of Radar Returns from Insects

    NASA Technical Reports Server (NTRS)

    Riley, J. R.

    1979-01-01

    When the number of flying insects is low enough to permit their resolution as individual radar targets, quantitative estimates of their aerial density can be developed. Accurate measurements of heading distribution, using a rotating-polarization radar to enhance the wingbeat-frequency method of identification, are also presented.

  10. Some Epistemological Considerations Concerning Quantitative Analysis

    ERIC Educational Resources Information Center

    Dobrescu, Emilian

    2008-01-01

    This article presents the author's address at the 2007 "Journal of Applied Quantitative Methods" ("JAQM") prize awarding festivity. The festivity was included in the opening of the 4th International Conference on Applied Statistics, November 22, 2008, Bucharest, Romania. In the address, the author reflects on three theses that…

  11. A Chemical Plant Safety and Hazard Analysis Course.

    ERIC Educational Resources Information Center

    Gupta, J. P.

    1989-01-01

    Describes a course for teaching chemical engineering students about safety and hazards. Summarizes the course content including topics for term papers and disciplines related to this course. Lists 18 references. (YP)

  12. Quantitative analysis of planetary reflectance spectra with principal components analysis

    NASA Technical Reports Server (NTRS)

    Johnson, P. E.; Smith, M. O.; Adams, J. B.

    1985-01-01

    A technique is presented for quantitative analysis of planetary reflectance spectra as mixtures of particles on microscopic and macroscopic scales using principal components analysis. This technique allows for determination of the endmembers being mixed, their abundance, and the scale of mixing, as well as other physical parameters. Eighteen lunar telescopic reflectance spectra of the Copernicus crater region, from 600 nm to 1800 nm in wavelength, are modeled in terms of five likely endmembers: mare basalt, mature mare soil, anorthosite, mature highland soil, and clinopyroxene. These endmembers were chosen from a similar analysis of 92 lunar soil and rock samples. The models fit the data to within 2 percent rms. It is found that the goodness of fit is marginally better for intimate mixing over macroscopic mixing.

  13. Quantitative analysis of planetary reflectance spectra with principal components analysis

    NASA Astrophysics Data System (ADS)

    Johnson, P. E.; Smith, M. O.; Adams, J. B.

    1985-02-01

    A technique is presented for quantitative analysis of planetary reflectance spectra as mixtures of particles on microscopic and macroscopic scales using principal components analysis. This technique allows for determination of the endmembers being mixed, their abundance, and the scale of mixing, as well as other physical parameters. Eighteen lunar telescopic reflectance spectra of the Copernicus crater region, from 600 nm to 1800 nm in wavelength, are modeled in terms of five likely endmembers: mare basalt, mature mare soil, anorthosite, mature highland soil, and clinopyroxene. These endmembers were chosen from a similar analysis of 92 lunar soil and rock samples. The models fit the data to within 2 percent rms. It is found that the goodness of fit is marginally better for intimate mixing over macroscopic mixing.
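
    For macroscopic (checkerboard) mixing, reflectance is linear in the endmember spectra, so abundances follow from least squares; the principal-components step in the paper addresses endmember selection and the mixing scale, which are skipped here. The sketch below uses synthetic endmember spectra and a synthetic observed spectrum, all invented for illustration.

        import numpy as np

        rng = np.random.default_rng(0)

        # Illustrative endmember spectra (5 endmembers x 120 wavelength bands,
        # a stand-in for the 600-1800 nm sampling used in the paper).
        endmembers = rng.uniform(0.05, 0.6, size=(5, 120))

        # Synthetic "observed" spectrum: a linear mixture plus noise.
        true_abundances = np.array([0.4, 0.3, 0.1, 0.15, 0.05])
        observed = true_abundances @ endmembers + rng.normal(0, 0.005, 120)

        # Macroscopic mixing is linear in reflectance, so abundances come from
        # least squares; for a physical mixture they should sum to ~1.
        coeffs, *_ = np.linalg.lstsq(endmembers.T, observed, rcond=None)
        rms = np.sqrt(np.mean((coeffs @ endmembers - observed) ** 2))
        print("estimated abundances:", np.round(coeffs, 3), " RMS:", f"{rms:.4f}")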

  14. Neo-Deterministic and Probabilistic Seismic Hazard Assessments: a Comparative Analysis

    NASA Astrophysics Data System (ADS)

    Peresan, Antonella; Magrin, Andrea; Nekrasova, Anastasia; Kossobokov, Vladimir; Panza, Giuliano F.

    2016-04-01

    Objective testing is the key issue for any reliable seismic hazard assessment (SHA). Different earthquake hazard maps must demonstrate their capability in anticipating ground shaking from future strong earthquakes before an appropriate use for different purposes - such as engineering design, insurance, and emergency management. Quantitative assessment of map performances is an essential step also in the scientific process of their revision and possible improvement. Cross-checking of probabilistic models with available observations and independent physics-based models is recognized as a major validation procedure. The existing maps from the classical probabilistic seismic hazard analysis (PSHA), as well as those from the neo-deterministic analysis (NDSHA), which have already been developed for several regions worldwide (including Italy, India, and North Africa), are considered to exemplify the possibilities of cross-comparative analysis in spotting the limits and advantages of the different methods. Where the data permit, a comparative analysis against the documented seismic activity observed in reality is carried out, showing how available observations about past earthquakes can contribute to assessing the performances of the different methods. Neo-deterministic refers to a scenario-based approach, which allows for consideration of a wide range of possible earthquake sources as the starting point for scenarios constructed via full waveform modeling. The method does not make use of empirical attenuation models (i.e., Ground Motion Prediction Equations, GMPE) and naturally supplies realistic time series of ground shaking (i.e., complete synthetic seismograms), readily applicable to complete engineering analysis and other mitigation actions. The standard NDSHA maps provide reliable envelope estimates of maximum seismic ground motion from a wide set of possible scenario earthquakes, including the largest deterministically or historically defined credible earthquake. In addition

  15. SLUDGE TREATMENT PROJECT ENGINEERED CONTAINER RETRIEVAL AND TRANSFER SYSTEM PRELIMINARY DESIGN HAZARD ANALYSIS SUPPLEMENT 1

    SciTech Connect

    FRANZ GR; MEICHLE RH

    2011-07-18

    This 'What/If' Hazards Analysis addresses hazards affecting the Sludge Treatment Project Engineered Container Retrieval and Transfer System (ECRTS), including natural phenomena hazards (NPH) and external events, at the preliminary design stage. In addition, it addresses the hazards of the operational sequence steps for the mechanical handling operations: preparing the Sludge Transport and Storage Container (STSC), disconnecting the STSC, and preparing the STSC and Sludge Transport System (STS) for shipping.

  16. Analysis of Two Quantitative Ultrasound Approaches.

    PubMed

    Muleki-Seya, Pauline; Han, Aiguo; Andre, Michael P; Erdman, John W; O'Brien, William D

    2017-09-01

    There are two well-known ultrasonic approaches to extract sets of quantitative parameters: Lizzi-Feleppa (LF) parameters: slope, intercept, and midband; and quantitative ultrasound (QUS)-derived parameters: effective scatterer diameter (ESD) and effective acoustic concentration (EAC). In this study, the relation between the LF and QUS-derived parameters is studied theoretically and experimentally on ex vivo mouse livers. As expected from the theory, LF slope is correlated to ESD ([Formula: see text]), and from experimental data, LF midband is correlated to EAC ([Formula: see text]). However, LF intercept is not correlated to ESD ([Formula: see text]) nor EAC ([Formula: see text]). The unexpected correlation observed between LF slope and EAC ([Formula: see text]) likely results from the high correlation between ESD and EAC due to the inversion process. For liver fat percentage estimation, an important potential medical application, the parameters presenting the best correlation are EAC ([Formula: see text]) and LF midband ([Formula: see text]).

  17. Hazard Analysis for the Mark III Space Suit Assembly (SSA) Used in One-g Operations

    NASA Technical Reports Server (NTRS)

    Mitchell, Kate; Ross, Amy; Blanco, Raul; Wood, Art

    2012-01-01

    This Hazard Analysis document encompasses the Mark III Space Suit Assembly (SSA) and associated ancillary equipment. It has been prepared using JSC17773, "Preparing Hazard Analyses for JSC Ground Operation", as a guide. The purpose of this document is to present the potential hazards involved in ground (23 % maximum O2, One-g) operations of the Mark III and associated ancillary support equipment system. The hazards listed in this document are specific to suit operations only; each supporting facility (Bldg. 9, etc.) is responsible for test specific Hazard Analyses. A "hazard" is defined as any condition that has the potential for harming personnel or equipment. This analysis was performed to document the safety aspects associated with manned use of the Mark III for pressurized and unpressurized ambient, ground-based, One-g human testing. The hazards identified herein represent generic hazards inherent to all standard JSC test venues for nominal ground test configurations. Non-standard test venues or test specific configurations may warrant consideration of additional hazards analysis prior to test. The cognizant suit engineer is responsible for the safety of the astronaut/test subject, space suit, and suit support personnel. The test requester, for the test supported by the suit test engineer and suited subject, is responsible for overall safety and any necessary Test Readiness Reviews (TRR).

  18. Analysis of hazardous substances released during CFRP laser processing

    NASA Astrophysics Data System (ADS)

    Hustedt, Michael; Walter, Juergen; Bluemel, Sven; Jaeschke, Peter; Kaierle, Stefan

    2017-02-01

    Due to their outstanding mechanical properties, in particular their high specific strength parallel to the carbon fibers, carbon fiber reinforced plastics (CFRP) have a high potential regarding resource-efficient lightweight construction. Consequently, these composite materials are increasingly finding application in important industrial branches such as the aircraft, automotive and wind energy industries. However, the processing of these materials is highly demanding. On the one hand, mechanical processing methods such as milling or drilling are sometimes rather slow, and they are connected with notable tool wear. On the other hand, thermal processing methods are critical as the two components, matrix and reinforcement, have widely differing thermophysical properties, possibly leading to damage of the composite structure in terms of pores or delamination. An emerging innovative method for the processing of CFRP materials is laser technology. As a principally thermal method, laser processing is connected with the release of potentially hazardous, gaseous and particulate substances. Detailed knowledge of these process emissions is the basis for ensuring the protection of man and the environment, according to the existing legal regulations. This knowledge will help to realize adequate protective measures and thus strengthen the development of CFRP laser processing. In this work, selected measurement methods and results of the analysis of the exhaust air and the air at the workplace during different laser processes with CFRP materials are presented. The investigations have been performed in the course of different cooperative projects, funded by the German Federal Ministry of Education and Research (BMBF) in the course of the funding initiative "Photonic Processes and Tools for Resource-Efficient Lightweight Structures".

  19. Hazard Detection Analysis for a Forward-Looking Interferometer

    NASA Technical Reports Server (NTRS)

    West, Leanne; Gimmestad, Gary; Herkert, Ralph; Smith, William L.; Kireev, Stanislav; Schaffner, Philip R.; Daniels, Taumi S.; Cornman, Larry B.; Sharman, Robert; Weekley, Andrew; Perram, Glen; Gross, Kevin; Smith, Greg; Feltz, Wayne; Taylor, Joe; Olson, Erik

    2010-01-01

    The Forward-Looking Interferometer (FLI) is a new instrument concept for obtaining the measurements required to alert flight crews to potential weather hazards to safe flight. To meet the needs of the commercial fleet, such a sensor should address multiple hazards to warrant the costs of development, certification, installation, training, and maintenance. The FLI concept is based on high-resolution Infrared Fourier Transform Spectrometry (FTS) technologies that have been developed for satellite remote sensing. These technologies have also been applied to the detection of aerosols and gases for other purposes. The FLI concept is being evaluated for its potential to address multiple hazards including clear air turbulence (CAT), volcanic ash, wake vortices, low slant range visibility, dry wind shear, and icing during all phases of flight (takeoff, cruise, and landing). The research accomplished in this second phase of the FLI project was in three major areas: further sensitivity studies to better understand the potential capabilities and requirements for an airborne FLI instrument, field measurements that were conducted in an effort to provide empirical demonstrations of radiometric hazard detection, and theoretical work to support the development of algorithms to determine the severity of detected hazards.

  20. A critical analysis of hazard resilience measures within sustainability assessment frameworks

    SciTech Connect

    Matthews, Elizabeth C.; Sattler, Meredith; Friedland, Carol J.

    2014-11-15

    Today, numerous sustainability assessment frameworks (SAFs) exist to guide designers in achieving sustainable performance in the design of structures and communities. SAFs are beneficial in educating users and are useful tools for incorporating sustainability strategies into planning, design, and construction; however, there is currently a substantial gap in the ability of existing SAFs to incorporate hazard resistance and hazard mitigation in the broader context of sustainable design. This paper analyzes the incorporation of hazard resistant design and hazard mitigation strategies within SAFs via a multi-level analysis of eleven SAFs. The SAFs analyzed range in scale of application (i.e. building, site, community). Three levels of analysis are presented: (1) macro-level analysis comparing the number of measures strictly addressing resilience versus sustainability, (2) meso-level analysis of the coverage of types of hazards within SAFs (e.g. flood, fire), and (3) micro-level analysis of SAF measures connected to flood-related hazard resilience. The results demonstrate that hazard resistance and hazard mitigation do not figure prominently in the intent of SAFs and that weaknesses in resilience coverage exist that have the potential to lead to the design of structures and communities that are still highly vulnerable to the impacts of extreme events.

    Highlights:
    • Sustainability assessment frameworks (SAFs) were analyzed for resilience coverage
    • Hazard resistance and mitigation do not figure prominently in the intent of SAFs
    • Approximately 75% of SAFs analyzed address three or fewer hazards
    • Lack of economic measures within SAFs could impact resilience and sustainability
    • Resilience measures for flood hazards are not consistently included in SAFs

  1. Occupational hazards control of hazardous substances in clean room of semiconductor manufacturing plant using CFD analysis.

    PubMed

    Li, Jianfeng; Zhou, Ya-Fei

    2015-02-01

    The manufacturing processes in chip industries are complex, and many kinds of raw materials and solvents of different natures are used, most of which are highly toxic and dangerous. During machine preventive maintenance periods, these toxic and harmful substances escape from the sealed reaction chamber into the clean workshop environment and endanger the health of the workers on-site, resulting in occupational diseases. From the perspective of prevention, the spread and prediction of hydrochloric acid (HCl) escaping from the metal-etching chamber during maintenance were studied in this article. Computational fluid dynamics technology was used for a three-dimensional numerical simulation of the indoor air velocity field and the HCl concentration field, and the simulation results were then compared with on-site monitoring data to verify their correctness and feasibility. The occupational hazards and control measures were analyzed based on the numerical simulation, and the optimal control measure was obtained. Using ambient air analysis to assess occupational exposure, as done in this article, can provide a new idea for occupational health research in the integrated circuit industry and has theoretical and practical significance.
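
    As a drastically simplified stand-in for the 3-D CFD described above, the sketch below advances a 1-D explicit advection-diffusion model of a continuous HCl release in a ventilated room. Velocity, diffusivity, geometry, and source strength are all assumed for illustration.

        import numpy as np

        L, nx = 10.0, 200            # room length (m) and grid cells (assumed)
        dx = L / nx
        u = 0.25                     # bulk air velocity from ventilation (m/s)
        D = 0.05                     # effective turbulent diffusivity (m^2/s)
        dt = 0.4 * min(dx / u, dx**2 / (2 * D))   # stable explicit time step

        c = np.zeros(nx)
        source = nx // 5             # release location near the etch chamber

        for _ in range(5000):
            c[source] += 1e-3        # continuous release (arbitrary units)
            adv = -u * (c - np.roll(c, 1)) / dx          # upwind advection
            dif = D * (np.roll(c, 1) - 2 * c + np.roll(c, -1)) / dx**2
            c = c + dt * (adv + dif)
            c[0], c[-1] = c[1], c[-2]                    # zero-gradient walls

        print(f"peak concentration {c.max():.3f} at x = {c.argmax() * dx:.1f} m")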

  2. Overview of Mars Sample Hazard Analysis (Requirements Workshop Series)

    NASA Technical Reports Server (NTRS)

    Rummel, John D.

    2000-01-01

    Samples returned from Mars should be contained and treated as though potentially hazardous until proven otherwise. If sample containment cannot be verified en route to Earth, the sample and spacecraft should either be sterilized in space or not returned to Earth. Integrity of sample containment should be maintained through reentry and transfer to a receiving facility. Controlled distribution of unsterilized materials should only occur if analyses determine the sample not to contain a biological hazard. Planetary protection measures adopted for the first sample return should not be relaxed for subsequent missions without thorough scientific review and concurrence by an appropriate independent body.

  3. In silico analysis of nanomaterials hazard and risk.

    PubMed

    Cohen, Yoram; Rallo, Robert; Liu, Rong; Liu, Haoyang Haven

    2013-03-19

    false positive relative to false negative predictions and the reliability of toxicity data. To establish the environmental impact of ENMs identified as toxic, researchers will need to estimate the potential level of environmental exposure concentration of ENMs in the various media such as air, water, soil, and vegetation. When environmental monitoring data are not available, models of ENMs fate and transport (at various levels of complexity) serve as alternative approaches for estimating exposure concentrations. Risk management decisions regarding the manufacturing, use, and environmental regulations of ENMs would clearly benefit from both the assessment of potential ENMs exposure concentrations and suitable toxicity metrics. The decision process should consider the totality of available information: quantitative and qualitative data and the analysis of nanomaterials toxicity, and fate and transport behavior in the environment. Effective decision-making to address the potential impacts of nanomaterials will require considerations of the relevant environmental, ecological, technological, economic, and sociopolitical factors affecting the complete lifecycle of nanomaterials, while accounting for data and modeling uncertainties. Accordingly, researchers will need to establish standardized data management and analysis tools through nanoinformatics as a basis for the development of rational decision tools.

  4. Preliminary fire hazard analysis for the PUTDR and TRU trenches in the Solid Waste Burial Ground

    SciTech Connect

    Gaschott, L.J.

    1995-06-16

    This document represents the Preliminary Fire Hazards Analysis for the Pilot Unvented TRU Drum Retrieval effort and for the Transuranic drum trenches in the low level burial grounds. The FHA was developed in accordance with DOE Order 5480.7A to address major hazards inherent in the facility.

  5. Structural and quantitative analysis of Equisetum alkaloids.

    PubMed

    Cramer, Luise; Ernst, Ludger; Lubienski, Marcus; Papke, Uli; Schiebel, Hans-Martin; Jerz, Gerold; Beuerle, Till

    2015-08-01

    Equisetum palustre L. is known for its toxicity to livestock. Several studies in the past addressed the isolation and identification of the responsible alkaloids. So far, palustrine (1) and N(5)-formylpalustrine (2) are the known alkaloids of E. palustre. An HPLC-ESI-MS/MS method in combination with a simple sample work-up was developed to identify and quantitate Equisetum alkaloids. Besides the two known alkaloids, six related alkaloids were detected in different Equisetum samples. The structure of the alkaloid palustridiene (3) was derived by comprehensive 1D and 2D NMR experiments. N(5)-Acetylpalustrine (4) was also thoroughly characterized by NMR for the first time. The structure of N(5)-formylpalustridiene (5) is proposed based on mass spectrometry results. Twenty-two E. palustre samples were screened by the HPLC-ESI-MS/MS method, and in most cases the set of all eight alkaloids was detected in all parts of the plant. A high variability of the alkaloid content and distribution was found depending on plant organ, plant origin, and season, ranging from 88 to 597 mg/kg dried weight. However, palustrine (1) and the alkaloid palustridiene (3) always represented the main alkaloids. For the first time, a comprehensive identification, quantitation, and distribution analysis of Equisetum alkaloids was achieved.

  6. Energy Dispersive Spectrometry and Quantitative Analysis Short Course. Introduction to X-ray Energy Dispersive Spectrometry and Quantitative Analysis

    NASA Technical Reports Server (NTRS)

    Carpenter, Paul; Curreri, Peter A. (Technical Monitor)

    2002-01-01

    This course will cover practical applications of the energy-dispersive spectrometer (EDS) to x-ray microanalysis. Topics covered will include detector technology, advances in pulse processing, resolution and performance monitoring, detector modeling, peak deconvolution and fitting, qualitative and quantitative analysis, compositional mapping, and standards. An emphasis will be placed on use of the EDS for quantitative analysis, with discussion of typical problems encountered in the analysis of a wide range of materials and sample geometries.

  7. Energy Dispersive Spectrometry and Quantitative Analysis Short Course. Introduction to X-ray Energy Dispersive Spectrometry and Quantitative Analysis

    NASA Technical Reports Server (NTRS)

    Carpenter, Paul; Curreri, Peter A. (Technical Monitor)

    2002-01-01

    This course will cover practical applications of the energy-dispersive spectrometer (EDS) to x-ray microanalysis. Topics covered will include detector technology, advances in pulse processing, resolution and performance monitoring, detector modeling, peak deconvolution and fitting, qualitative and quantitative analysis, compositional mapping, and standards. An emphasis will be placed on use of the EDS for quantitative analysis, with discussion of typical problems encountered in the analysis of a wide range of materials and sample geometries.

  8. Job load and hazard analysis: a method for the analysis of workplace conditions for occupational health care.

    PubMed Central

    Mattila, M K

    1985-01-01

    One requirement for successful occupational health care is reliable information on occupational hazards. The aim of this study was to develop a simple, standardised method for workplace investigations for use in occupational health care. The theoretical framework of the method comprises the stress-strain model, the hazard-danger model, and risk behaviour theory. The new method, termed job load and hazard analysis, includes four stages: identification of hazards, their evaluation, conclusions and proposals, and follow up. Different methods are available for hazard identification. The identification starts with a rough analysis of five factors, chemical hazards, physical hazards, physical load, mental stress, and accident risk. Hazards and stress factors are assessed with an ordinal scale. Specialised methods are used if all hazards cannot otherwise be identified. The analytical procedure comprises: detection of hazards through observations and interviews at the workplace and with a questionnaire; assessment of findings as teamwork; and evaluation of the results of these assessments to yield conclusions and proposals made by occupational health care personnel. A data processing system has been developed for data storage and future use. The method has functioned in practice, improving the contents of the occupational health care programme and generating preventive measures. The method offers many new possibilities for controlling occupational hazards and studying relations between working conditions and workers' health. PMID:4041383

  9. 9 CFR 417.2 - Hazard Analysis and HACCP Plan.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... determine the food safety hazards reasonably likely to occur in the production process and identify the... chart describing the steps of each process and product flow in the establishment shall be prepared, and...) Decomposition; (viii) Parasites; (ix) Unapproved use of direct or indirect food or color additives; and...

  10. 9 CFR 417.2 - Hazard Analysis and HACCP Plan.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... determine the food safety hazards reasonably likely to occur in the production process and identify the... chart describing the steps of each process and product flow in the establishment shall be prepared, and...) Decomposition; (viii) Parasites; (ix) Unapproved use of direct or indirect food or color additives; and...

  11. 9 CFR 417.2 - Hazard Analysis and HACCP Plan.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... determine the food safety hazards reasonably likely to occur in the production process and identify the... chart describing the steps of each process and product flow in the establishment shall be prepared, and...) Decomposition; (viii) Parasites; (ix) Unapproved use of direct or indirect food or color additives; and...

  12. Reliability analysis of common hazardous waste treatment processes

    SciTech Connect

    Waters, Robert D.

    1993-05-01

    Five hazardous waste treatment processes are analyzed probabilistically using Monte Carlo simulation to elucidate the relationships between process safety factors and reliability levels. The treatment processes evaluated are packed tower aeration, reverse osmosis, activated sludge, upflow anaerobic sludge blanket, and activated carbon adsorption.
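
    The safety factor-reliability relationship explored above can be illustrated with a generic capacity-demand Monte Carlo: sample both from assumed distributions and count the fraction of trials in which capacity meets demand. The lognormal forms and all parameters below are illustrative, not the paper's process models.

        import numpy as np

        rng = np.random.default_rng(42)
        N = 100_000

        # Demand placed on the process (e.g. contaminant load), assumed lognormal.
        demand = rng.lognormal(mean=np.log(100.0), sigma=0.4, size=N)

        for safety_factor in (1.0, 1.5, 2.0, 3.0):
            # Capacity sized at safety_factor times the median demand, with its
            # own variability from operating conditions (assumed).
            capacity = rng.lognormal(mean=np.log(100.0 * safety_factor), sigma=0.2, size=N)
            reliability = np.mean(capacity >= demand)
            print(f"safety factor {safety_factor:.1f}: reliability {reliability:.4f}")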

  13. [Analysis and toxicological evaluation of hazardous gases in sealed cabin].

    PubMed

    He, Z; Shi, J; Yu, B; Liang, H; Yu, F

    1998-10-01

    82 volatile organic compounds (VOCs) from eight organic classes and 3 target inorganic compounds in a sealed cabin simulating a flying spaceship were identified and quantified over 5 d, the pattern of hazardous gas concentration variation was discussed, and the atmospheric toxicology was evaluated preliminarily. This provides a basis for detecting gas compounds and evaluating atmospheric toxicology in the spaceship.

  14. Princeton Plasma Physics Laboratory (PPPL) seismic hazard analysis

    SciTech Connect

    Savy, J.

    1989-10-01

    New design and evaluation guidelines for Department of Energy facilities subjected to natural phenomena hazards are being finalized. Although still in draft form at this time, the document describing those guidelines should be considered an update of previously available guidelines. The recommendations in the guidelines document mentioned above, simply referred to as "the guidelines" hereafter, are based on the best information available at the time of its development. In particular, the seismic hazard model for the Princeton site was based on a study performed in 1981 for Lawrence Livermore National Laboratory (LLNL), which relied heavily on the results of the NRC's Systematic Evaluation Program and was based on a methodology and data sets developed in 1977 and 1978. Considerable advances have been made in the last ten years in the domain of seismic hazard modeling. Thus, it is recommended to update the estimate of the seismic hazard at the DOE sites whenever possible. The major differences between previous estimates and the ones proposed in this study for the PPPL are in the modeling of the strong ground motion at the site, and in the treatment of the total uncertainty in the estimates to include knowledge uncertainty, random uncertainty, and expert opinion diversity as well. 28 refs.

  15. Hazardous Glaciers In Switzerland: A Statistical Analysis of Inventory Data

    NASA Astrophysics Data System (ADS)

    Raymond, M.; Funk, M.; Wegmann, M.

    Because of the recent increase in both occupation and economic activities in high mountain areas, a systematic overview of potential glacier hazard zones is needed to avoid the construction of settlements and infrastructure in endangered areas in the future. Historical information about glacier disasters shows that catastrophic events can happen repeatedly from the same causes and with the same dramatic consequences. Past catastrophic events are not only useful to identify potentially dangerous glaciers, but also indicate the kind of glacier hazards to expect for any given glacier. An inventory containing all known events having caused damage in the past has been compiled for Switzerland. Three different types of glacier hazards are distinguished, namely ice avalanches, glacier floods and glacier length changes. Hazardous glaciers have been identified in the alpine cantons of Bern, Grison, Uri, Vaud and Valais so far. The inventory data were analysed in terms of the periodicity of the different types of events as well as of the damage that occurred.

  16. Qualitative and quantitative analysis of endocytic recycling.

    PubMed

    Reineke, James B; Xie, Shuwei; Naslavsky, Naava; Caplan, Steve

    2015-01-01

    Endocytosis, which encompasses the internalization and sorting of plasma membrane (PM) lipids and proteins to distinct membrane-bound intracellular compartments, is a highly regulated and fundamental cellular process by which eukaryotic cells dynamically regulate their PM composition. Indeed, endocytosis is implicated in crucial cellular processes that include proliferation, migration, and cell division as well as maintenance of tissue homeostasis such as apical-basal polarity. Once PM constituents have been taken up into the cell, either via clathrin-dependent endocytosis (CDE) or clathrin-independent endocytosis (CIE), they typically have two fates: degradation through the late-endosomal/lysosomal pathway or returning to the PM via endocytic recycling pathways. In this review, we will detail experimental procedures that allow for both qualitative and quantitative assessment of endocytic recycling of transmembrane proteins internalized by CDE and CIE, using the HeLa cervical cancer cell line as a model system.

  17. Comparative hazard analysis and toxicological modeling of diverse nanomaterials using the embryonic zebrafish (EZ) metric of toxicity

    DOE PAGES

    Harper, Bryan; Thomas, Dennis G.; Chikkagoudar, Satish; ...

    2015-06-04

    The integration of rapid assays, large data sets, informatics and modeling can overcome current barriers in understanding nanomaterial structure-toxicity relationships by providing a weight-of-the-evidence mechanism to generate hazard rankings for nanomaterials. Here we present the use of a rapid, low-cost assay to perform screening-level toxicity evaluations of nanomaterials in vivo. Calculated EZ Metric scores, a combined measure of morbidity and mortality, were established at realistic exposure levels and used to develop a predictive model of nanomaterial toxicity. Hazard ranking and clustering analysis of 68 diverse nanomaterials revealed distinct patterns of toxicity related to both core composition and outermost surface chemistry of nanomaterials. The resulting clusters guided the development of a predictive model of gold nanoparticle toxicity to embryonic zebrafish. In addition, our findings suggest that risk assessments based on the size and core composition of nanomaterials alone may be wholly inappropriate, especially when considering complex engineered nanomaterials. These findings reveal the need to expeditiously increase the availability of quantitative measures of nanomaterial hazard and broaden the sharing of that data and knowledge to support predictive modeling. In addition, research should continue to focus on methodologies for developing predictive models of nanomaterial hazard based on sub-lethal responses to low dose exposures.

  18. Comparative hazard analysis and toxicological modeling of diverse nanomaterials using the embryonic zebrafish (EZ) metric of toxicity

    SciTech Connect

    Harper, Bryan; Thomas, Dennis G.; Chikkagoudar, Satish; Baker, Nathan A.; Tang, Kaizhi; Heredia-Langner, Alejandro; Lins, Roberto D.; Harper, Stacey

    2015-06-04

    The integration of rapid assays, large data sets, informatics and modeling can overcome current barriers in understanding nanomaterial structure-toxicity relationships by providing a weight-of-the-evidence mechanism to generate hazard rankings for nanomaterials. Here we present the use of a rapid, low-cost assay to perform screening-level toxicity evaluations of nanomaterials in vivo. Calculated EZ Metric scores, a combined measure of morbidity and mortality, were established at realistic exposure levels and used to develop a predictive model of nanomaterial toxicity. Hazard ranking and clustering analysis of 68 diverse nanomaterials revealed distinct patterns of toxicity related to both core composition and outermost surface chemistry of nanomaterials. The resulting clusters guided the development of a predictive model of gold nanoparticle toxicity to embryonic zebrafish. In addition, our findings suggest that risk assessments based on the size and core composition of nanomaterials alone may be wholly inappropriate, especially when considering complex engineered nanomaterials. These findings reveal the need to expeditiously increase the availability of quantitative measures of nanomaterial hazard and broaden the sharing of that data and knowledge to support predictive modeling. In addition, research should continue to focus on methodologies for developing predictive models of nanomaterial hazard based on sub-lethal responses to low dose exposures.

  19. The value of integrating information from multiple hazards for flood risk analysis and management

    NASA Astrophysics Data System (ADS)

    Castillo-Rodríguez, J. T.; Escuder-Bueno, I.; Altarejos-García, L.; Serrano-Lombillo, A.

    2014-02-01

    This article presents a methodology for estimating flood risk in urban areas by integrating pluvial flooding, river flooding and failure of both small and large dams. The first part includes a review of basic concepts on flood risk analysis, evaluation and management. Flood risk analyses may be developed at local, regional and national level; however, a general methodology to perform a quantitative flood risk analysis including different flood hazards is still required. The second part describes the proposed methodology, which presents an integrated approach combining pluvial flooding, river flooding and flooding from dam failure, as applied to a case study: an urban area located downstream of a dam under construction. The methodology enhances the approach developed within the SUFRI project ("Sustainable Strategies of Urban Flood Risk Management to cope with the residual risk", 2009-2011). This article also shows how outcomes from flood risk analysis provide better and more complete information to inform authorities, local entities and the stakeholders involved in decision-making with regard to flood risk management.

  20. Joint association analysis of bivariate quantitative and qualitative traits.

    PubMed

    Yuan, Mengdie; Diao, Guoqing

    2011-11-29

    Univariate genome-wide association analysis of quantitative and qualitative traits has been investigated extensively in the literature. In the presence of correlated phenotypes, it is more intuitive to analyze all phenotypes simultaneously. We describe an efficient likelihood-based approach for the joint association analysis of quantitative and qualitative traits in unrelated individuals. We assume a probit model for the qualitative trait, under which an unobserved latent variable and a prespecified threshold determine the value of the qualitative trait. To jointly model the quantitative and qualitative traits, we assume that the quantitative trait and the latent variable follow a bivariate normal distribution. The latent variable is allowed to be correlated with the quantitative phenotype. Simultaneous modeling of the quantitative and qualitative traits allows us to make more precise inference on the pleiotropic genetic effects. We derive likelihood ratio tests for the testing of genetic effects. An application to the Genetic Analysis Workshop 17 data is provided. The new method yields reasonable power and meaningful results for the joint association analysis of the quantitative trait Q1 and the qualitative trait of disease status at SNPs whose minor allele frequency (MAF) is not too small.

  1. Estimating Source Recurrence Rates for Probabilistic Tsunami Hazard Analysis (PTHA)

    NASA Astrophysics Data System (ADS)

    Geist, E. L.; Parsons, T.

    2004-12-01

    A critical factor in probabilistic tsunami hazard analysis (PTHA) is estimating the average recurrence rate for tsunamigenic sources. Computational PTHA involves aggregating runup values derived from numerical simulations for many far-field and local sources, primarily earthquakes, each with a specified probability of occurrence. Computational PTHA is the primary method used in the ongoing FEMA pilot study at Seaside, Oregon. For a Poissonian arrival time model, the probability for a given source is dependent on a single parameter: the mean inter-event time of the source. In other probability models, parameters such as aperiodicity are also included. In this study, we focus on methods to determine the recurrence rates for large, shallow subduction zone earthquakes. For earthquakes below about M=8, recurrence rates can be obtained from modified Gutenberg-Richter distributions that are constrained by the tectonic moment rate for individual subduction zones. However, significant runup from far-field sources is commonly associated with the largest magnitude earthquakes, for which the recurrence rates are poorly constrained by the tail of empirical frequency-magnitude relationships. For these earthquakes, paleoseismic evidence of great earthquakes can be used to establish recurrence rates. Because the number of geologic horizons representing great earthquakes along a particular subduction zone is limited, special techniques are needed to account for open intervals before the first and after the last observed events. Uncertainty in age dates for the horizons also has to be included in estimating recurrence rates and aperiodicity. A Monte Carlo simulation is performed in which a random sample of earthquake times is drawn from a specified probability distribution with varying average recurrence rates and aperiodicities. A recurrence rate can be determined from the mean rate of all random samples that fit the observations, or a range of rates can be carried through the
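
    The Monte Carlo idea described above, drawing earthquake histories from a renewal model and keeping the parameter draws that reproduce the observed paleoseismic record, can be sketched as follows; the gamma renewal model, window length and event count are assumptions for illustration, not values from the study.

```python
# Toy acceptance-sampling (ABC-style) estimate of a recurrence rate from a
# paleoseismic record, using an assumed gamma renewal model. The window length
# and observed event count are invented for illustration.
import numpy as np

rng = np.random.default_rng(1)
window = 3000.0        # years covered by the paleoseismic record (assumed)
observed_events = 4    # great-earthquake horizons in that window (assumed)

accepted_rates = []
for _ in range(20_000):
    mean_rt = rng.uniform(200.0, 2000.0)  # candidate mean recurrence time (yr)
    aperiodicity = rng.uniform(0.3, 0.7)  # coefficient of variation
    k = 1.0 / aperiodicity**2             # gamma shape from the CV
    theta = mean_rt / k                   # gamma scale

    # Simulate an event history across the window; the random offset of the
    # first event crudely accounts for the open interval before the record.
    t, n = rng.gamma(k, theta) * rng.uniform(), 0
    while t < window:
        n += 1
        t += rng.gamma(k, theta)
    if n == observed_events:              # keep draws that fit the observations
        accepted_rates.append(1.0 / mean_rt)

print(f"mean accepted rate = {np.mean(accepted_rates):.5f} events/yr "
      f"from {len(accepted_rates)} accepted samples")
```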

  2. Quantitative Risk Analysis of Obstacle Limitation Standards

    NASA Astrophysics Data System (ADS)

    Sandaradura, Amila Silva

    Obstacle limitation surfaces (OLS) are the main safeguard against objects that can pose a hazard to aircraft operations at and around airports. The standard dimensions of most of these surfaces were estimated using pilots' experience at the time when they were included into the standard documents. As a result, some of these standards may have been overestimated while others may not provide an adequate level of safety. With airports moving to the Safety Management System (SMS) approach to design and operations safety, proper evaluation of the level of safety provided by OLS at specific sites becomes of great importance to airport operators. There is, however, no published evidence for the estimation of the safety level provided by the existing OLS standards. Moreover, the rationale used by ICAO to establish the existing OLS standards is not readily available in the standard documents. Therefore this study attempts to collect actual flight path data using information provided by air traffic control radars and to construct a methodology to assess the probability of aircraft deviating from their intended/protected path. The extension of the developed methodology can be used to estimate the OLS dimensions that provide an acceptable safety level for aircraft operations. This will be helpful in estimating safe and efficient standard dimensions of the OLS and in assessing the risk level posed by objects to aircraft operations around airports. In order to assess the existing standards and show the applications of the methodology, three case studies were conducted using aircraft data collected from Ottawa (CYOW), Calgary (CYYC) and Edmonton (CYEG) International Airports.

  3. Hazard analysis for 300 Area N Reactor Fuel Fabrication and Storage Facility

    SciTech Connect

    Johnson, D.J.; Brehm, J.R.

    1994-01-25

    This hazard analysis (HA) has been prepared for the 300 Area N Reactor Fuel Fabrication and Storage Facility (Facility), in compliance with the requirements of the Westinghouse Hanford Company (Westinghouse Hanford) controlled manual WHC-CM-4-46, Nonreactor Facility Safety Analysis Manual, and the direction of WHC-IP-0690, Safety Analysis and Regulation Desk Instructions (WHC 1992). An HA identifies potentially hazardous conditions in a facility and the associated potential accident scenarios. Unlike the Facility hazard classification documented in WHC-SD-NR-HC-004, Hazard Classification for 300 Area N Reactor Fuel Fabrication and Storage Facility (Huang 1993), which is based on unmitigated consequences, credit is taken in an HA for administrative controls or engineered safety features planned or in place. The HA is the foundation for the accident analysis. The significant event scenarios identified by this HA will be further evaluated in a subsequent accident analysis.

  4. Taxonomic analysis of perceived risk: modeling individual and group perceptions within homogeneous hazard domains

    SciTech Connect

    Kraus, N.N.; Slovic, P.

    1988-09-01

    Previous studies of risk perception have typically focused on the mean judgments of a group of people regarding the riskiness (or safety) of a diverse set of hazardous activities, substances, and technologies. This paper reports the results of two studies that take a different path. Study 1 investigated whether models within a single technological domain were similar to previous models based on group means and diverse hazards. Study 2 created a group taxonomy of perceived risk for only one technological domain, railroads, and examined whether the structure of that taxonomy corresponded with taxonomies derived from prior studies of diverse hazards. Results from Study 1 indicated that the importance of various risk characteristics in determining perceived risk differed across individuals and across hazards, but not so much as to invalidate the results of earlier studies based on group means and diverse hazards. In Study 2, the detailed analysis of railroad hazards produced a structure that had both important similarities to, and dissimilarities from, the structure obtained in prior research with diverse hazard domains. The data also indicated that railroad hazards are really quite diverse, with some approaching nuclear reactors in their perceived seriousness. These results suggest that information about the diversity of perceptions within a single domain of hazards could provide valuable input to risk-management decisions.

  5. Sample size and power for a logrank test and Cox proportional hazards model with multiple groups and strata, or a quantitative covariate with multiple strata.

    PubMed

    Lachin, John M

    2013-11-10

    I describe general expressions for the evaluation of sample size and power for the K group Mantel-logrank test or the Cox proportional hazards (PH) model score test. Under an exponential model, the method of Lachin and Foulkes for the 2 group case is extended to the K ⩾2 group case using the non-centrality parameter of the K - 1 df chi-square test. I also show similar results to apply to the K group score test in a Cox PH model. Lachin and Foulkes employed a truncated exponential distribution to provide for a non-linear rate of enrollment. I present expressions for the mean time of enrollment and the expected follow-up time in the presence of exponential losses to follow-up. When used with the expression for the noncentrality parameter for the test, equations are derived for the evaluation of sample size and power under specific designs with r years of recruitment and T years total duration. I also describe sample size and power for a stratified-adjusted K group test and for the assessment of a group by stratum interaction. Similarly, I describe computations for a stratified-adjusted analysis of a quantitative covariate and a test of a stratum by covariate interaction in the Cox PH model. Copyright © 2013 John Wiley & Sons, Ltd.
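
    A generic version of the power computation, i.e. the power of a (K - 1)-df chi-square test at a given noncentrality, can be sketched as below; the Schoenfeld-type noncentrality used here is a standard approximation and stands in for the paper's exact expressions.

```python
# Generic power calculation for a (K-1)-df chi-square score test given a
# noncentrality parameter. The Schoenfeld-type noncentrality below is a
# standard approximation, not the paper's exact expressions.
import numpy as np
from scipy.stats import chi2, ncx2

def power_k_group(ncp, k_groups, alpha=0.05):
    df = k_groups - 1
    crit = chi2.ppf(1.0 - alpha, df)      # critical value under H0
    return ncx2.sf(crit, df, ncp)         # P(reject) under the alternative

# Example: 3 equal-sized groups and d expected events (assumed numbers).
log_hr = np.array([0.0, np.log(1.5), np.log(2.0)])  # log hazard ratios vs. group 1
p = np.full(3, 1 / 3)                               # allocation fractions
d = 150                                             # expected total events
ncp = d * np.sum(p * (log_hr - np.sum(p * log_hr)) ** 2)

print(f"power ≈ {power_k_group(ncp, k_groups=3):.3f}")
```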

  6. Development of hazard analysis by critical control points (HACCP) procedures to control organic chemical hazards in the agricultural production of raw food commodities.

    PubMed

    Ropkins, Karl; Ferguson, Andrew; Beck, Angus J

    2003-01-01

    Hazard Analysis by Critical Control Points (HACCP) is a systematic approach to the identification, assessment, and control of hazards in the food chain. Effective HACCP requires the consideration of all chemical, microbiological, and physical hazards. However, current procedures focus primarily on microbiological and physical hazards, while chemical aspects of HACCP have received relatively little attention. In this article we discuss the application of HACCP to organic chemical contaminants and the problems that are likely to be encountered in agriculture. We also present generic templates for the development of organic chemical contaminant HACCP procedures for selected raw food commodities, that is, cereal crops, raw meats, and milk.

  7. Large-scale experiments for the vulnerability analysis of buildings impacted and intruded by fluviatile torrential hazard processes

    NASA Astrophysics Data System (ADS)

    Sturm, Michael; Gems, Bernhard; Fuchs, Sven; Mazzorana, Bruno; Papathoma-Köhle, Maria; Aufleger, Markus

    2016-04-01

    In European mountain regions, losses due to torrential hazards are still considerably high, despite the ongoing debate on an overall increasing or decreasing trend. Recent events in Austria revealed that, for technical and economic reasons, an overall protection of settlements in the alpine environment against torrential hazards is not feasible. On the side of the hazard process, events with unpredictable intensities may represent overload scenarios for existing protection structures in the torrent catchments. They bear a particular risk of significant losses in the living space. Although the importance of vulnerability is widely recognised, there is still a research gap concerning its assessment. Currently, potential losses at buildings due to torrential hazards and their comparison with reinstatement costs are determined by the use of empirical functions. Hence, relations between process intensities and the extent of losses, gathered from the analysis of historic hazard events and object-specific restoration values, are used. This approach does not represent a physics-based and integral concept, since relevant and often crucial processes, such as the intrusion of the fluid-sediment mixture into elements at risk, are not considered. Based on these findings, our work is targeted at extending the findings and models of present risk research towards an integral, more physics-based vulnerability analysis concept. Fluviatile torrential hazard processes and their impacts on the building envelope are experimentally modelled. Material intrusion processes are thereby explicitly considered. Dynamic impacts are gathered quantitatively and spatially distributed by the use of a large set of force transducers. The experimental tests are accomplished with artificial, vertical and skewed plates, including openings for material intrusion. Further, the impacts on specific buildings within the test site of the work, the fan apex of the Schnannerbach

  8. Quantitative infrared analysis of hydrogen fluoride

    SciTech Connect

    Manuta, D.M.

    1997-04-01

    This work was performed at the Portsmouth Gaseous Diffusion Plant, where hydrogen fluoride is produced upon the hydrolysis of UF₆. This poses a problem in this setting, and a method for determining the mole percent concentration was desired. HF has been considered to be a non-ideal gas for many years. D. F. Smith utilized complex equations in his HF studies in the 1950s. We have evaluated HF behavior as a function of pressure from three different perspectives: (1) absorbance at 3877 cm⁻¹ as a function of pressure for 100% HF; (2) absorbance at 3877 cm⁻¹ as a function of increasing partial pressure of HF, with total pressure maintained at 300 mm HgA with nitrogen; (3) absorbance at 3877 cm⁻¹ for constant partial pressure of HF, with total pressure increased to greater than 800 mm HgA with nitrogen. These experiments have shown that at partial pressures up to 35 mm HgA, HF follows the ideal gas law. The absorbance at 3877 cm⁻¹ can be quantitatively analyzed via infrared methods.
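
    A sketch of the linear-range quantitation implied above: fit absorbance against HF partial pressure in the ideal-gas region and invert the fit for an unknown; all numbers are synthetic stand-ins, not plant data.

```python
# Linear-range quantitation sketch: fit absorbance at 3877 cm^-1 against HF
# partial pressure (ideal-gas region) and invert the fit for an unknown.
# All numbers are synthetic stand-ins, not plant data.
import numpy as np

pressure = np.array([5.0, 10.0, 15.0, 20.0, 25.0, 30.0, 35.0])   # mm HgA
absorbance = 0.012 * pressure + np.random.default_rng(0).normal(0, 0.003, 7)

slope, intercept = np.polyfit(pressure, absorbance, 1)           # linear calibration

a_unknown = 0.21                                # measured absorbance of a sample
p_hf = (a_unknown - intercept) / slope          # inferred HF partial pressure
mole_percent = 100.0 * p_hf / 300.0             # at 300 mm HgA total pressure

print(f"HF partial pressure ≈ {p_hf:.1f} mm HgA, i.e. {mole_percent:.1f} mol %")
```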

  9. Quantitative Analysis of HIV-1 Preintegration Complexes

    PubMed Central

    Engelman, Alan; Oztop, Ilker; Vandegraaff, Nick; Raghavendra, Nidhanapati K.

    2009-01-01

    Retroviral replication proceeds through the formation of a provirus, an integrated DNA copy of the viral RNA genome. The linear cDNA product of reverse transcription is the integration substrate and two different integrase activities, 3′ processing and DNA strand transfer, are required for provirus formation. Integrase nicks the cDNA ends adjacent to phylogenetically-conserved CA dinucleotides during 3′ processing. After nuclear entry and locating a suitable chromatin acceptor site, integrase joins the recessed 3′-OHs to the 5′-phosphates of a double-stranded staggered cut in the DNA target. Integrase functions in the context of a large nucleoprotein complex, called the preintegration complex (PIC), and PICs are analyzed to determine levels of integrase 3′ processing and DNA strand transfer activities that occur during acute virus infection. Denatured cDNA end regions are monitored by indirect end-labeling to measure the extent of 3′ processing. Native PICs can efficiently integrate their viral cDNA into exogenously added target DNA in vitro, and Southern blotting or nested PCR assays are used to quantify the resultant DNA strand transfer activity. This study details HIV-1 infection, PIC extraction, partial purification, and quantitative analyses of integrase 3′ processing and DNA strand transfer activities. PMID:19233280

  10. Quantitative multi-modal NDT data analysis

    SciTech Connect

    Heideklang, René; Shokouhi, Parisa

    2014-02-18

    A single NDT technique is often not adequate to provide assessments about the integrity of test objects with the required coverage or accuracy. In such situations, one often resorts to multi-modal testing, where complementary and overlapping information from different NDT techniques is combined for a more comprehensive evaluation. Multi-modal material and defect characterization is an interesting task which involves several diverse fields of research, including signal and image processing, statistics and data mining. The fusion of different modalities may improve quantitative nondestructive evaluation by effectively exploiting the augmented set of multi-sensor information about the material. It is the redundant information in particular, whose quantification is expected to lead to increased reliability and robustness of the inspection results. There are different systematic approaches to data fusion, each with its specific advantages and drawbacks. In our contribution, these will be discussed in the context of nondestructive materials testing. A practical study adopting a high-level scheme for the fusion of Eddy Current, GMR and Thermography measurements on a reference metallic specimen with built-in grooves will be presented. Results show that fusion is able to outperform the best single sensor regarding detection specificity, while retaining the same level of sensitivity.

  11. Hazard analysis for magnetic induction from electric transmission lines

    NASA Astrophysics Data System (ADS)

    Taylor, R. J.

    1983-06-01

    The potential hazard of magnetic induction from electric transmission lines was investigated. A literature search was conducted, measurements were made and compared with the theories found in the literature, and a new approach was developed for estimating the hazard potential of magnetically induced voltage in fences that could be grasped by barefoot children under wet conditions. Conditions under which representative transmission lines could induce such voltage were explored. For example, a standard three-phase horizontal 500 kV transmission line carrying 1000 A was estimated to induce 10.5 V in a 900 m fence 30 m from the center of the line. It was also found that third harmonic currents can magnetically induce significant voltages compared to those induced by 60 Hz currents under some circumstances.
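
    The order of magnitude of such induced voltages can be checked with a simple phasor superposition of the three phase currents and their ground images; the line geometry below is assumed for illustration and earth-return corrections are ignored, so this is a rough sketch rather than the report's method.

```python
# Rough phasor estimate of the 60 Hz EMF magnetically induced in a fence wire
# parallel to a three-phase line. Geometry values are assumed; earth-return
# corrections are ignored, so expect only an order-of-magnitude answer.
import numpy as np

MU0_2PI = 2e-7                          # mu0 / (2*pi), H/m
f, I, L = 60.0, 1000.0, 900.0           # Hz, A per phase, fence length (m)
phases_x, phase_h = [-10.0, 0.0, 10.0], 15.0   # conductor offsets and height (m)
fence_x, fence_h = 30.0, 1.5                   # fence offset and height (m)

omega = 2 * np.pi * f
currents = I * np.exp(1j * np.deg2rad([0.0, -120.0, 120.0]))  # balanced phasors

emf = 0.0 + 0.0j
for x, i_ph in zip(phases_x, currents):
    d_direct = np.hypot(fence_x - x, phase_h - fence_h)
    d_image = np.hypot(fence_x - x, phase_h + fence_h)  # image conductor in earth
    M = MU0_2PI * np.log(d_image / d_direct)            # mutual inductance, H/m
    emf += 1j * omega * M * i_ph * L

print(f"induced voltage ≈ {abs(emf):.1f} V")
```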

  12. Space Propulsion Hazards Analysis Manual (SPHAM). Volume 1

    DTIC Science & Technology

    1988-10-01

    dioxide, carbon monoxide, Freons, hydrazine, liquid hydrogen, hydrogen peroxide, liquid fluorine, nitrogen tetroxide, monomethylhydrazine, liquid methane ... why the requirement does not apply. Item 15: Column 8. Enter an exact description of the drawing, test method, operations procedure, etc., wherefrom ... environment - Hydrogen embrittlement. *Note: This discussion on hazard reports has been extracted for the most part from MCR 82-800, Rev. B, 29 Sept 82

  13. Chemical hazards analysis of resilient flooring for healthcare.

    PubMed

    Lent, Tom; Silas, Julie; Vallette, Jim

    2010-01-01

    This article addresses resilient flooring, evaluating the potential health effects of vinyl flooring and the leading alternatives (synthetic rubber, polyolefin, and linoleum) currently used in the healthcare marketplace. The study inventories chemicals incorporated as components of each of the four material types or involved in their life cycle as feedstocks, intermediary chemicals, or emissions. It then characterizes those chemicals using a chemical hazard-based framework that addresses persistence and bioaccumulation, human toxicity, and human exposures.

  14. Rockfall hazard analysis using LiDAR and spatial modeling

    NASA Astrophysics Data System (ADS)

    Lan, Hengxing; Martin, C. Derek; Zhou, Chenghu; Lim, Chang Ho

    2010-05-01

    Rockfalls have been significant geohazards along the Canadian Class 1 Railways (CN Rail and CP Rail) since their construction in the late 1800s. These rockfalls cause damage to infrastructure, interruption of business, and environmental impacts, and their occurrence varies both spatially and temporally. The proactive management of these rockfall hazards requires enabling technologies. This paper discusses a hazard assessment strategy for rockfalls along a section of a Canadian railway using LiDAR and spatial modeling. LiDAR provides accurate topographical information of the source area of rockfalls and along their paths. Spatial modeling was conducted using Rockfall Analyst, a three dimensional extension to GIS, to determine the characteristics of the rockfalls in terms of travel distance, velocity and energy. Historical rockfall records were used to calibrate the physical characteristics of the rockfall processes. The results based on a high-resolution digital elevation model from a LiDAR dataset were compared with those based on a coarse digital elevation model. A comprehensive methodology for rockfall hazard assessment is proposed which takes into account the characteristics of source areas, the physical processes of rockfalls and the spatial attribution of their frequency and energy.

  15. Spatial analysis of hazardous waste data using geostatistics

    SciTech Connect

    Zirschky, J.H.

    1984-01-01

    The objective of this investigation was to determine if geostatistics could be a useful tool for evaluating hazardous waste sites. Three sites contaminated by dioxin (2,3,7,8-tetrachlorodibenzo-p-dioxin (TCDD)) were investigated. The first site evaluated was a creek into which TCDD-contaminated soil had eroded. The second site was a town in which TCDD-contaminated wastes had been sprayed onto the streets. Finally, the third site was a highway whose shoulders were contaminated by dust deposition from a nearby hazardous waste site. The distributions of TCDD at the first and third sites were investigated using kriging, an optimal estimation technique. By using kriging, the areas of both sites requiring cleanup were successfully identified. At the second site, the town, satisfactory results were not obtained. The distribution of contamination in this town is believed to be very heterogeneous; thus, reasonable estimates could not be obtained. Additional sampling was therefore recommended at this site. Based upon this research, geostatistics appears to be a very useful tool for evaluating a hazardous waste site if the distribution of contaminants at the site is homogeneous, or can be divided into homogeneous areas.
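
    For readers unfamiliar with kriging, the following minimal ordinary-kriging sketch shows the estimation step at a single target location; the sample coordinates, values and exponential variogram parameters are synthetic, since in practice the variogram is fitted to site data first.

```python
# Bare-bones ordinary kriging at a single target location. The sample
# coordinates, values and exponential variogram parameters are synthetic;
# in practice the variogram is fitted to the site data first.
import numpy as np

def variogram(h, sill=1.0, rng=50.0, nugget=0.05):
    """Exponential semivariogram with effective range `rng`."""
    return nugget + (sill - nugget) * (1.0 - np.exp(-3.0 * h / rng))

pts = np.array([[0.0, 0.0], [40.0, 10.0], [15.0, 35.0], [60.0, 50.0]])  # x, y (m)
z = np.array([12.0, 3.5, 8.1, 0.9])          # e.g. TCDD concentrations (synthetic)
target = np.array([25.0, 20.0])

n = len(pts)
G = variogram(np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1))
np.fill_diagonal(G, 0.0)                     # gamma(0) = 0 for exact interpolation

A = np.ones((n + 1, n + 1)); A[:n, :n] = G; A[-1, -1] = 0.0
b = np.ones(n + 1); b[:n] = variogram(np.linalg.norm(pts - target, axis=1))

w = np.linalg.solve(A, b)                    # n weights plus a Lagrange multiplier
print(f"estimate = {w[:n] @ z:.2f}, kriging variance = {w @ b:.3f}")
```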

  16. Analysis of temporal and spatial overlapping of hazards interactions at different scales

    NASA Astrophysics Data System (ADS)

    De Angeli, Silvia; Trasforini, Eva; Taylor, Faith; Rudari, Roberto; Rossi, Lauro

    2017-04-01

    The aim of this work is to develop a methodological framework to analyse the impact of multiple hazards on complex territorial systems, not only focusing on multi-hazard interactions but also evaluating the multi-risk, i.e. considering the impact of multiple hazards in terms of exposure and vulnerability as well. Impacts generated by natural hazards have been growing in recent years, also because many regions of the world have become subject to multiple hazards and cascading effects. Modelling the multi-hazard dimension is a new challenge that allows stakeholders to address the chain effects between hazards and to model the risk in a truly holistic way. Despite the recognition of the importance of a multi-hazard approach in risk assessment, only a few multi-risk approaches have been developed up to now. The examination of multiple hazards, in contrast to single-hazard cases, poses a series of challenges in each step of the risk analysis, starting from the assessment of the hazard level, passing through the vulnerability evaluation, and arriving finally at the resultant risk level. Hazard interactions and hazard contemporaneity arising from their spatial and temporal overlap may not only influence the overall hazard level, but also the vulnerability of elements at risk. In the proposed approach, a series of possible interactions between hazards are identified and classified. These interactions are then analysed by looking at the temporal and spatial evolution of the hazards and the consequent impacts, and represented through an explicative graphical framework. Different temporal dimensions are identified. The time of the impact differs from the time of the damage because, even after the end of the impact, damages remain until recovery and restoration processes are completed. The discrepancy between the time of the impact and the time of the damage is very important for the modelling of multi-hazard damage. Whenever a certain interval of time occurs between two impacts

  17. A Guidance Manual: Waste Analysis at Facilities that Generate, Treat, Store, and Dispose of Hazardous Wastes

    EPA Pesticide Factsheets

    Discusses how a person can perform waste analyses and develop waste analysis plans (WAPs) in accordance with the federal hazardous waste regulations of the Resource Conservation and Recovery Act (RCRA)

  18. The quantitative failure of human reliability analysis

    SciTech Connect

    Bennett, C.T.

    1995-07-01

    This philosophical treatise argues the merits of Human Reliability Analysis (HRA) in the context of the nuclear power industry. In fact, the author attacks historic and current HRA as having failed to inform policy makers who make risk-based decisions about the contribution of humans to systems performance. He argues for an HRA based on Bayesian (fact-based) inferential statistics, which advocates a systems analysis process that employs cogent heuristics when using opinion, and tempers itself with a rational debate over the weight given to subjective and empirical probabilities.

  19. Quantitative analysis of myocardial tissue with digital autofluorescence microscopy.

    PubMed

    Jensen, Thomas; Holten-Rossing, Henrik; Svendsen, Ida M H; Jacobsen, Christina; Vainer, Ben

    2016-01-01

    The opportunity offered by whole slide scanners for automated histological analysis implies an ever increasing importance of digital pathology. To go beyond conventional pathology, however, digital pathology may need a basic histological starting point similar to that of hematoxylin and eosin staining in conventional pathology. This study presents an automated fluorescence-based microscopy approach providing highly detailed morphological data from unstained microsections. These data may provide a basic histological starting point from which further digital analysis, including staining, may benefit. The study explores the inherent tissue fluorescence, also known as autofluorescence, as a means to quantitate cardiac tissue components in histological microsections. Data acquisition using a commercially available whole slide scanner and an image-based quantitation algorithm are presented. It is shown that the autofluorescence intensity of unstained microsections at two different wavelengths is a suitable starting point for automated digital analysis of myocytes, fibrous tissue, lipofuscin, and the extracellular compartment. The output of the method is absolute quantitation along with accurate outlines of the above-mentioned components. The digital quantitations are verified by comparison to point grid quantitations performed on the microsections after Van Gieson staining. The presented method is aptly described as a prestain multicomponent quantitation and outlining tool for histological sections of cardiac tissue. The main perspective is the opportunity for combination with digital analysis of stained microsections, for which the method may provide an accurate digital framework.

  20. Laser hazard analysis for various candidate diode lasers associated with the high resolution pulsed scanner.

    SciTech Connect

    Augustoni, Arnold L.

    2004-10-01

    A laser hazard analysis and safety assessment was performed for each of the various candidate laser diodes associated with the High Resolution Pulse Scanner, based on ANSI Standard Z136.1-2000, American National Standard for the Safe Use of Lasers. A theoretical laser hazard analysis model for this system was derived, and an Excel® spreadsheet model was developed to answer the 'what if' questions associated with the various modes of operation for the various candidate diode lasers.

  1. Quantitative and Sociological Analysis of Blog Networks

    NASA Astrophysics Data System (ADS)

    Bachnik, W.; Szymczyk, S.; Leszczynski, S.; Podsiadlo, R.; Rymszewicz, E.; Kurylo, L.; Makowiec, D.; Bykowska, B.

    2005-10-01

    This paper examines the emerging phenomenon of blogging, using three different Polish blogging services as the base of the research. The authors show that blog networks share their characteristics with complex networks (gamma coefficients, small worlds, cliques, etc.). Elements of sociometric analysis were used to prove the existence of some social structures in the blog networks.

  2. RiskChanges Spatial Decision Support system for the analysis of changing multi-hazard risk

    NASA Astrophysics Data System (ADS)

    van Westen, Cees; Zhang, Kaixi; Bakker, Wim; Andrejchenko, Vera; Berlin, Julian; Olyazadeh, Roya; Cristal, Irina

    2015-04-01

    quantitative analysis (using different hazard types, temporal scenarios and vulnerability curves) resulting in risk curves. The platform does not include a component to calculate hazard maps, and existing hazard maps are used as input data for the risk component. The second component of the SDSS is a risk reduction planning component, which forms the core of the platform. This component includes the definition of risk reduction alternatives (related to disaster response planning, risk reduction measures and spatial planning) and links back to the risk assessment module to calculate the new level of risk if the measure is implemented, and a cost-benefit (or cost-effectiveness / Spatial Multi Criteria Evaluation) component to compare the alternatives and make a decision on the optimal one. The third component of the SDSS is a temporal scenario component, which allows the user to define future scenarios in terms of climate change, land use change and population change, and the time periods for which these scenarios will be made. The component does not generate these scenarios itself but uses input maps for the effect of the scenarios on the hazard and assets maps. The last component is a communication and visualization component, which can compare scenarios and alternatives, not only in the form of maps, but also in other forms (risk curves, tables, graphs)

  3. Earthquake Hazard Mitigation Using a Systems Analysis Approach to Risk Assessment

    NASA Astrophysics Data System (ADS)

    Legg, M.; Eguchi, R. T.

    2015-12-01

    The earthquake hazard mitigation goal is to reduce losses due to severe natural events. The first step is to conduct a Seismic Risk Assessment consisting of 1) hazard estimation, 2) vulnerability analysis, 3) exposure compilation. Seismic hazards include ground deformation, shaking, and inundation. The hazard estimation may be probabilistic or deterministic. Probabilistic Seismic Hazard Assessment (PSHA) is generally applied to site-specific risk assessments, but may involve large areas as in a National Seismic Hazard Mapping program. Deterministic hazard assessments are needed for geographically distributed exposure such as lifelines (infrastructure), but may be important for large communities. Vulnerability evaluation includes quantification of fragility for construction or components including personnel. Exposure represents the existing or planned construction, facilities, infrastructure, and population in the affected area. Risk (expected loss) is the product of the quantified hazard, vulnerability (damage algorithm), and exposure, which may be used to prepare emergency response plans, retrofit existing construction, or use community planning to avoid hazards. The risk estimate provides data needed to acquire earthquake insurance to assist with effective recovery following a severe event. Earthquake Scenarios used in Deterministic Risk Assessments provide detailed information on where hazards may be most severe, what system components are most susceptible to failure, and to evaluate the combined effects of a severe earthquake on the whole system or community. Casualties (injuries and deaths) have been the primary factor in defining building codes for seismic-resistant construction. Economic losses may be equally significant factors that can influence proactive hazard mitigation. Large urban earthquakes may produce catastrophic losses due to a cascading of effects often missed in PSHA. Economic collapse may ensue if damaged workplaces, disruption of utilities, and

  4. Quantitative analysis of Li by PIGE technique

    NASA Astrophysics Data System (ADS)

    Fonseca, M.; Mateus, R.; Santos, C.; Cruz, J.; Silva, H.; Luis, H.; Martins, L.; Jesus, A. P.

    2017-09-01

    In this work, the cross section of the reaction 7Li(p,pγ)7Li (γ: 478 keV) in the proton energy range 2.0-4.2 MeV was measured. The measurements were carried out at the 3 MV Tandem Accelerator at the CTN/IST Laboratory in Lisbon. To validate the obtained results, calculated gamma-ray yields were compared, at several proton energy values, with experimental yields for thick samples made of inorganic compounds containing lithium. In order to quantify the light elements present in the samples, we used a standard-free method for PIGE in thick samples, based on the Emitted Radiation Yield Analysis (ERYA) code, which integrates the nuclear reaction excitation function along the depth of the sample. We also demonstrated the capacity of the technique for the analysis of Li ores, such as Spodumene, Lithium Muscovite and Holmquistite, and of Li-alloys for plasma-facing materials, showing that this is a reliable and accurate method for PIGE analysis of Li in thick samples.

  5. Analysis of flood hazard under consideration of dike breaches

    NASA Astrophysics Data System (ADS)

    Vorogushyn, S.; Apel, H.; Lindenschmidt, K.-E.; Merz, B.

    2009-04-01

    The study focuses on the development and application of a new modelling system which allows a comprehensive flood hazard assessment along diked river reaches under consideration of dike failures. The proposed Inundation Hazard Assessment Model (IHAM) represents a hybrid probabilistic-deterministic model. It comprises three models interactively coupled at runtime. These are: (1) 1D unsteady hydrodynamic model of river channel and floodplain flow between dikes, (2) probabilistic dike breach model which determines possible dike breach locations, breach widths and breach outflow discharges, and (3) 2D raster-based diffusion wave storage cell model of the hinterland areas behind the dikes. Due to the unsteady nature of the 1D and 2D coupled models, the dependence between hydraulic load at various locations along the reach is explicitly considered. The probabilistic dike breach model describes dike failures due to three failure mechanisms: overtopping, piping and slope instability caused by the seepage flow through the dike core (micro-instability). Dike failures for each mechanism are simulated based on fragility functions. The probability of breach is conditioned by the uncertainty in geometrical and geotechnical dike parameters. The 2D storage cell model driven by the breach outflow boundary conditions computes an extended spectrum of flood intensity indicators such as water depth, flow velocity, impulse, inundation duration and rate of water rise. IHAM is embedded in a Monte Carlo simulation in order to account for the natural variability of the flood generation processes reflected in the form of input hydrographs and for the randomness of dike failures given by breach locations, times and widths. The scenario calculations for the developed synthetic input hydrographs for the main river and tributary were carried out for floods with return periods of T = 100; 200; 500; 1000 a. Based on the modelling results, probabilistic dike hazard maps could be generated that
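
    The fragility-function sampling at the heart of such a probabilistic breach model can be illustrated in a few lines; the lognormal fragility parameters and water-level distribution here are invented, and a real IHAM run would take the load from the coupled hydrodynamic models.

```python
# Sketch of sampling dike breaches from a lognormal fragility function inside
# a Monte Carlo loop. Fragility parameters and water levels are invented; a
# real run would take the load from the coupled hydrodynamic model.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(7)

def breach_probability(level, median=4.2, beta=0.25):
    """P(breach | peak water level), lognormal fragility, e.g. for overtopping."""
    return norm.cdf(np.log(level / median) / beta)

levels = np.clip(rng.normal(4.0, 0.5, size=10_000), 1e-6, None)  # peak levels (m)
breached = rng.random(levels.size) < breach_probability(levels)  # Bernoulli draws

print(f"simulated breach frequency ≈ {breached.mean():.3f}")
```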

  6. Assessing the long-term probabilistic volcanic hazard for tephra fallout in Reykjavik, Iceland: a preliminary multi-source analysis

    NASA Astrophysics Data System (ADS)

    Tonini, Roberto; Barsotti, Sara; Sandri, Laura; Tumi Guðmundsson, Magnús

    2015-04-01

    Icelandic volcanism is largely dominated by basaltic magma. Nevertheless, the presence of glaciers over many Icelandic volcanic systems results in frequent phreatomagmatic eruptions and associated tephra production, making explosive eruptions the most common type of volcanic activity. Jökulhlaups are commonly considered the major volcanic hazard in Iceland for their high frequency and potentially very devastating local impact. Tephra fallout is also frequent and can impact larger areas. It is driven by the wind direction, which can change with both altitude and season, making it impossible to predict a priori where the tephra will be deposited during the next eruptions. Most of the volcanic activity in Iceland occurs in the central eastern part, over 100 km to the east of the main population centre around the capital Reykjavík. Therefore, the hazard from tephra fallout in Reykjavík is expected to be smaller than for communities settled near the main volcanic systems. However, within the framework of quantitative hazard and risk analyses, less frequent and/or less intense phenomena should not be neglected, since their risk evaluation depends on the effects suffered by the selected target. This is particularly true if the target is highly vulnerable, such as large urban areas or important infrastructures. In this work we present a preliminary analysis aiming to perform a Probabilistic Volcanic Hazard Assessment (PVHA) for tephra fallout, focused on a target area which includes the municipality of Reykjavík and the Keflavík international airport. This approach reverts the more common perspective, where the hazard analysis is focused on the source (the volcanic system), and follows a multi-source approach: indeed, the idea is to quantify, homogeneously, the hazard due to the main hazardous volcanoes that could pose a tephra fallout threat to the municipality of Reykjavík and the Keflavík airport. PVHA for each volcanic system is calculated independently and the results

  7. Landslide hazard zonation assessment using GIS analysis at Golmakan Watershed, northeast of Iran

    NASA Astrophysics Data System (ADS)

    Mansouri Daneshvar, Mohammad Reza; Bagherzadeh, Ali

    2011-03-01

    Landslide hazard is one of the major environmental hazards in geomorphic studies in mountainous areas. To help planners select suitable locations for development projects, a landslide hazard zonation map has been produced for the Golmakan Watershed as part of the Binaloud northern hillsides (northeast of Iran). For this purpose, after preparation of a landslide inventory of the study area, some 15 major parameters were examined for integrated analysis of landslide hazard in the region. The analyses of the parameters were done by geo-referencing and lateral model making, satellite imaging of the study area, and spatial analyses using a geographical information system (GIS). The produced factor maps were weighted with the analytic hierarchy process (AHP) method and then classified. The study area was classified into four classes of relative landslide hazard: negligible, low, moderate, and high. The final map for landslide hazard zonation in the Golmakan Watershed revealed that: 1) the parameters of land slope and geologic formation have strong correlations (R² = 0.79 and 0.83, respectively) with the dependent variable landslide hazard (p < 0.05); and 2) about 18.8% of the study area has low or negligible hazard for future landslides, while 81.2% of the land area of the Golmakan Watershed falls into the high and moderate categories.
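
    As an illustration of the AHP weighting step, the sketch below derives factor weights from the principal eigenvector of a pairwise comparison matrix; the 3x3 matrix is a made-up example, not the study's 15-parameter comparison.

```python
# AHP weighting sketch: factor weights from the principal eigenvector of a
# Saaty-style pairwise comparison matrix. The 3x3 matrix is a made-up example,
# not the study's 15-parameter comparison.
import numpy as np

A = np.array([[1.0, 3.0, 5.0],    # e.g. slope vs. geology vs. land use
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                                    # normalized factor weights

ci = (eigvals.real[k] - len(A)) / (len(A) - 1)  # consistency index
print(f"weights = {np.round(w, 3)}, CI = {ci:.3f}")
```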

  8. Solar Glaze Hazard Analysis Tool v. 3.0

    SciTech Connect

    Ho, Clifford K.; Sims, Cianan A.

    2015-08-25

    SGHAT predicts the occurrence and intensity of glare caused by a user-specified solar panel array when viewed from one or more observation points. An interactive mapping interface is used to determine the latitude, longitude and elevation of the array and observation points. The presence and intensity of glare is then calculated along a given time interval throughout the year, based on the position of the sun. The potential ocular hazard is also reported. The maximum energy production of the solar array is also estimated so that alternative designs can be compared to determine the design that yields the most energy production while mitigating glare.

  9. Quantitative Analysis of Immunohistochemistry in Melanoma Tumors.

    PubMed

    Lilyquist, Jenna; White, Kirsten Anne Meyer; Lee, Rebecca J; Philips, Genevieve K; Hughes, Christopher R; Torres, Salina M

    2017-04-01

    Identification of positive staining is often qualitative and subjective. This is particularly troublesome in pigmented melanoma lesions, because melanin is difficult to distinguish from the brown stain resulting from immunohistochemistry (IHC) using horseradish peroxidase developed with 3,3'-Diaminobenzidine (HRP-DAB). We sought to identify and quantify positive staining, particularly in melanoma lesions. We visualized G-protein coupled estrogen receptor (GPER) expression developed with HRP-DAB and counterstained with Azure B (which stains melanin) in melanoma tissue sections (n = 3). Matched sections (n = 3), along with 22 unmatched sections, were stained only with Azure B as a control. Breast tissue (n = 1) was used as a positive HRP-DAB control. Images of the stained tissues were generated using a Nuance Spectral Imaging Camera. Analysis of the images was performed using the Nuance Spectral Imaging software and SlideBook. Data were analyzed using a Kruskal-Wallis one-way analysis of variance (ANOVA). We showed that a pigmented melanoma tissue doubly stained with anti-GPER HRP-DAB and Azure B can be unmixed using spectra derived from a matched, Azure B-only section and an anti-GPER HRP-DAB control. We unmixed each of the melanoma lesions using each of the Azure B spectra, evaluated the mean intensity of positive staining, and examined the distribution of the mean intensities (P = .73; Kruskal-Wallis). These results suggest that this method does not require a matched Azure B-only stained control tissue for every melanoma lesion, allowing precious tissues to be conserved for other studies. Importantly, this quantification method reduces the subjectivity of protein expression analysis, and provides a valuable tool for accurate evaluation, particularly for pigmented tissues.
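
    The spectral unmixing step can be illustrated as a non-negative linear decomposition of a pixel spectrum into endmember spectra taken from the controls; the Gaussian mock spectra below are placeholders, and the Nuance software's actual algorithm may differ.

```python
# Two-endmember linear unmixing sketch: express a pixel spectrum as a
# non-negative mix of an Azure B spectrum and an HRP-DAB spectrum taken from
# control slides. The Gaussian spectra are mock stand-ins for real controls.
import numpy as np
from scipy.optimize import nnls

wavelengths = np.linspace(420, 720, 31)
azure_b = np.exp(-0.5 * ((wavelengths - 650) / 40) ** 2)  # mock Azure B/melanin
dab = np.exp(-0.5 * ((wavelengths - 460) / 60) ** 2)      # mock HRP-DAB

E = np.column_stack([azure_b, dab])    # endmember matrix from the controls
pixel = 0.3 * azure_b + 0.9 * dab      # a doubly stained "pixel" spectrum

coeffs, residual = nnls(E, pixel)      # non-negative least-squares unmixing
print(f"Azure B ≈ {coeffs[0]:.2f}, DAB ≈ {coeffs[1]:.2f}, residual = {residual:.3f}")
```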

  10. Chromatic Image Analysis For Quantitative Thermal Mapping

    NASA Technical Reports Server (NTRS)

    Buck, Gregory M.

    1995-01-01

    Chromatic image analysis system (CIAS) developed for use in noncontact measurements of temperatures on aerothermodynamic models in hypersonic wind tunnels. Based on concept of temperature coupled to shift in color spectrum for optical measurement. Video camera images fluorescence emitted by phosphor-coated model at two wavelengths. Temperature map of model then computed from relative brightnesses in video images of model at those wavelengths. Eliminates need for intrusive, time-consuming, contact temperature measurements by gauges, making it possible to map temperatures on complex surfaces in timely manner and at reduced cost.

  11. Influence of corrosion layers on quantitative analysis

    NASA Astrophysics Data System (ADS)

    Denker, A.; Bohne, W.; Opitz-Coutureau, J.; Rauschenberg, J.; Röhrich, J.; Strub, E.

    2005-09-01

    Art historians and restorers in charge of ancient metal objects are often reluctant to remove the corrosion layer evolved over time, as this would change the appearance of the artefact dramatically. Therefore, when an elemental analysis of the objects is required, this has to be done by penetrating the corrosion layer. In this work the influence of corrosion was studied on Chinese and Roman coins, where removal of oxidized material was possible. Measurements on spots with and without corrosion are presented and the results discussed.

  12. Quantitative Analysis in Nuclear Medicine Imaging

    NASA Astrophysics Data System (ADS)

    Zaidi, Habib

    This book provides a review of image analysis techniques as they are applied in the field of diagnostic and therapeutic nuclear medicine. Driven in part by the remarkable increase in computing power and its ready and inexpensive availability, this is a relatively new yet rapidly expanding field. Likewise, although the use of radionuclides for diagnosis and therapy has origins dating back almost to the discovery of natural radioactivity itself, radionuclide therapy and, in particular, targeted radionuclide therapy has only recently emerged as a promising approach for therapy of cancer and, to a lesser extent, other diseases.

  13. Segmentation and Quantitative Analysis of Epithelial Tissues.

    PubMed

    Aigouy, Benoit; Umetsu, Daiki; Eaton, Suzanne

    2016-01-01

    Epithelia are tissues that regulate exchanges with the environment. They are very dynamic and can acquire virtually any shape; at the cellular level, they are composed of cells tightly connected by junctions. Most often epithelia are amenable to live imaging; however, the large number of cells composing an epithelium and the absence of informatics tools dedicated to epithelial analysis have largely prevented tissue-scale studies. Here we present Tissue Analyzer, a free tool that can be used to segment and analyze epithelial cells and monitor tissue dynamics.

  14. A Comparative Assessment of Greek Universities' Efficiency Using Quantitative Analysis

    ERIC Educational Resources Information Center

    Katharaki, Maria; Katharakis, George

    2010-01-01

    In part due to the increased demand for higher education, typical evaluation frameworks for universities often address the key issue of available resource utilisation. This study seeks to estimate the efficiency of 20 public universities in Greece through quantitative analysis (including performance indicators, data envelopment analysis (DEA) and…


  16. Hazard analysis and critical control point systems in the United States Department of Agriculture regulatory policy.

    PubMed

    Billy, T J; Wachsmuth, I K

    1997-08-01

    Recent outbreaks of foodborne illness and studies by expert groups have established the need for fundamental change in the United States meat and poultry inspection programme to reduce the risk of foodborne illness. The Food Safety and Inspection Service (FSIS) of the United States Department of Agriculture (USDA) has embarked on a broad effort to bring about such change, with particular emphasis on the reduction of pathogenic micro-organisms in raw meat and poultry products. The publication on 25 July 1996 of the Final Rule on pathogen reduction and hazard analysis and critical control point (HACCP) systems was a major milestone in the FSIS strategy for change. The Final Rule provides a framework for change and clarifies the respective roles of industry and government in ensuring the safety of meat and poultry products. With the implementation of this Final Rule underway, the FSIS has been exploring ways in which slaughter inspection carried out under an HACCP-based system can be changed so that food safety risks are addressed more adequately and the allocation of inspection resources is improved further. In addition, the FSIS is broadening the focus of food safety activities to extend beyond slaughter and processing plants by working with industry, academia and other government agencies. Such co-operation should lead to the development of measures to improve food safety before animals reach the slaughter plant and after products leave the inspected establishment for distribution to the retail level. For the future, the FSIS believes that quantitative risk assessments will be at the core of food safety activities. Risk assessments provide the most effective means of identifying how specific pathogens and other hazards may be encountered throughout the farm-to-table chain and of measuring the potential impact of various interventions. In addition, these assessments will be used in the development and evaluation of HACCP systems. The FSIS is currently conducting a

  17. Quantitative Analysis of Seismicity in Iran

    NASA Astrophysics Data System (ADS)

    Raeesi, Mohammad; Zarifi, Zoya; Nilfouroushan, Faramarz; Boroujeni, Samar Amini; Tiampo, Kristy

    2016-12-01

    We use historical and recent major earthquakes and GPS geodetic data to compute seismic strain rate, geodetic slip deficit, static stress drop, the parameters of the magnitude-frequency distribution and geodetic strain rate in the Iranian Plateau to identify seismically mature fault segments and regions. Our analysis suggests that 11 fault segments are in the mature stage of the earthquake cycle, with the possibility of generating major earthquakes. These faults primarily are located in the north and the east of Iran. Four seismically mature regions in southern Iran with the potential for damaging strong earthquakes are also identified. We also delineate four additional fault segments in Iran that can generate major earthquakes without robust clues to their maturity. The most important fault segment in this study is the strike-slip system near the capital city of Tehran, with the potential to cause more than one million fatalities.

  18. Quantitative Analysis of Seismicity in Iran

    NASA Astrophysics Data System (ADS)

    Raeesi, Mohammad; Zarifi, Zoya; Nilfouroushan, Faramarz; Boroujeni, Samar Amini; Tiampo, Kristy

    2017-03-01

    We use historical and recent major earthquakes and GPS geodetic data to compute seismic strain rate, geodetic slip deficit, static stress drop, the parameters of the magnitude-frequency distribution and geodetic strain rate in the Iranian Plateau to identify seismically mature fault segments and regions. Our analysis suggests that 11 fault segments are in the mature stage of the earthquake cycle, with the possibility of generating major earthquakes. These faults are located primarily in the north and the east of Iran. Four seismically mature regions in southern Iran with the potential for damaging strong earthquakes are also identified. We also delineate four additional fault segments in Iran that can generate major earthquakes without robust clues to their maturity. The most important fault segment in this study is the strike-slip system near the capital city of Tehran, with the potential to cause more than one million fatalities.

  19. Quantitative analysis of heart rate variability

    NASA Astrophysics Data System (ADS)

    Kurths, J.; Voss, A.; Saparin, P.; Witt, A.; Kleiner, H. J.; Wessel, N.

    1995-03-01

    In the modern industrialized countries every year several hundred thousand people die due to sudden cardiac death. The individual risk for sudden cardiac death cannot be defined precisely by commonly available, noninvasive diagnostic tools like Holter monitoring, highly amplified ECG and traditional linear analysis of heart rate variability (HRV). Therefore, we apply some rather unconventional methods of nonlinear dynamics to analyze the HRV. In particular, some complexity measures that are based on symbolic dynamics, as well as a new measure, the renormalized entropy, detect abnormalities in the HRV of several patients who had been classified in the low-risk group by traditional methods. A combination of these complexity measures with the parameters in the frequency domain seems to be a promising way to obtain a more precise definition of the individual risk. These findings have to be validated by a representative number of patients.
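    As an illustration of a symbolic-dynamics complexity measure of the kind described above, the sketch below quantises a synthetic RR-interval series into a small alphabet and computes the Shannon entropy of short symbol words. It is a generic textbook construction, not the authors' specific measures or their renormalized entropy.

```python
import numpy as np
from collections import Counter

def symbolic_word_entropy(rr_ms, n_symbols=4, word_len=3):
    """Shannon entropy (bits) of overlapping symbol words: RR intervals are
    quantised into n_symbols equal-width levels, then words of length
    word_len are counted."""
    rr = np.asarray(rr_ms, dtype=float)
    edges = np.linspace(rr.min(), rr.max(), n_symbols + 1)[1:-1]
    sym = np.digitize(rr, edges)
    words = [tuple(sym[i:i + word_len]) for i in range(len(sym) - word_len + 1)]
    p = np.array(list(Counter(words).values()), dtype=float)
    p /= p.sum()
    return -np.sum(p * np.log2(p))

rng = np.random.default_rng(1)
rr = 800 + 50 * rng.standard_normal(1000)   # synthetic RR series in ms
print(f"word entropy = {symbolic_word_entropy(rr):.2f} bits")
```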

  20. Application of a Cloud Model-Set Pair Analysis in Hazard Assessment for Biomass Gasification Stations

    PubMed Central

    Yan, Fang; Xu, Kaili

    2017-01-01

    Because a biomass gasification station involves various hazard factors, hazard assessment is both necessary and significant. In this article, the cloud model (CM) is employed to improve set pair analysis (SPA), and a novel hazard assessment method for a biomass gasification station is proposed based on the cloud model-set pair analysis (CM-SPA). In this method, a cloud weight is proposed as the index weight. In contrast to the index weights of other methods, the cloud weight is expressed by cloud descriptors; its randomness and fuzziness make it effective at reflecting the linguistic variables of experts. Then, the cloud connection degree (CCD) is proposed to replace the connection degree (CD), and the calculation algorithm of the CCD is worked out. By utilizing the CCD, the hazard assessment results are expressed as normal clouds defined by cloud descriptors, and the hazard grade is determined by analyzing those descriptors. After that, two biomass gasification stations undergo hazard assessment via CM-SPA and AHP-based SPA, respectively. The comparison of assessment results illustrates that CM-SPA is suitable and effective for the hazard assessment of a biomass gasification station and yields more reasonable and scientific assessment results. PMID:28076440
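    The normal clouds referred to above come from the standard forward normal cloud generator, which draws cloud drops from the three descriptors Ex (expectation), En (entropy) and He (hyper-entropy). The sketch below implements only that generator, with illustrative descriptor values; the cloud weights and the cloud connection degree of CM-SPA are beyond this sketch.

```python
import numpy as np

def normal_cloud(ex, en, he, n=1000, rng=None):
    """Forward normal cloud generator: for each drop, draw a randomised
    entropy En' ~ N(En, He^2), a drop x ~ N(Ex, En'^2), and its membership
    mu = exp(-(x - Ex)^2 / (2 En'^2))."""
    rng = rng or np.random.default_rng(0)
    en_prime = rng.normal(en, he, size=n)
    x = rng.normal(ex, np.abs(en_prime))
    mu = np.exp(-(x - ex) ** 2 / (2 * en_prime ** 2))
    return x, mu

drops, memberships = normal_cloud(ex=3.0, en=0.5, he=0.05)
print(drops[:5], memberships[:5])
```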

  1. Application of a Cloud Model-Set Pair Analysis in Hazard Assessment for Biomass Gasification Stations.

    PubMed

    Yan, Fang; Xu, Kaili

    2017-01-01

    Because a biomass gasification station involves various hazard factors, hazard assessment is both necessary and significant. In this article, the cloud model (CM) is employed to improve set pair analysis (SPA), and a novel hazard assessment method for a biomass gasification station is proposed based on the cloud model-set pair analysis (CM-SPA). In this method, a cloud weight is proposed as the index weight. In contrast to the index weights of other methods, the cloud weight is expressed by cloud descriptors; its randomness and fuzziness make it effective at reflecting the linguistic variables of experts. Then, the cloud connection degree (CCD) is proposed to replace the connection degree (CD), and the calculation algorithm of the CCD is worked out. By utilizing the CCD, the hazard assessment results are expressed as normal clouds defined by cloud descriptors, and the hazard grade is determined by analyzing those descriptors. After that, two biomass gasification stations undergo hazard assessment via CM-SPA and AHP-based SPA, respectively. The comparison of assessment results illustrates that CM-SPA is suitable and effective for the hazard assessment of a biomass gasification station and yields more reasonable and scientific assessment results.

  2. Hazard and consequence analysis for waste emplacement at the Waste Isolation Pilot Plant

    SciTech Connect

    Gerstner, D.M.; Clayton, S.G.; Farrell, R.F.; McCormick, J.A.; Ortiz, C.; Standiford, D.L.

    1996-05-01

    The Carlsbad Area Office established and analyzed the safety bases for the design and operations as documented in the WIPP Safety Analysis Report (SAR). Additional independent efforts are currently underway to assess the hazards associated with the long-term (10,000-year) isolation period as required by 40 CFR 191. The structure of the WIPP SAR is unique due to the hazards involved and the agreement between the State of New Mexico and the DOE regarding SAR content and format. However, the hazard and accident analysis philosophy contained in DOE-STD-3009-94 was followed as closely as possible while adhering to state agreements. Hazards associated with WIPP waste receipt, emplacement, and disposal operations were systematically identified using a modified Hazard and Operability Study (HAZOP) technique. The WIPP HAZOP assessed the potential internal, external, and natural phenomena events that can cause the identified hazards to develop into accidents. The hazard assessment identified deviations from the intended design and operation of the waste handling system, analyzed potential accident consequences to the public and workers, estimated likelihoods of occurrence, and evaluated associated preventative and mitigative features. It was concluded from the assessment that the proposed WIPP waste emplacement operations and design are sufficient to ensure the safety of the public, workers, and the environment over the 35-year disposal phase.

  3. Nanomaterials, and Occupational Health and Safety—A Literature Review About Control Banding and a Semi-Quantitative Method Proposed for Hazard Assessment.

    NASA Astrophysics Data System (ADS)

    Dimou, Kaotar; Emond, Claude

    2017-06-01

    In recent decades, the control banding (CB) approach has been recognised as a hazard assessment methodology because of its increased importance in the occupational safety, health and hygiene (OSHH) industry. According to the American Industrial Hygiene Association, this approach originates from the pharmaceutical industry in the United Kingdom. The aim of the CB approach is to protect more than 90% (or approximately 2.7 billion) of the world's workers who do not have access to OSHH professionals and traditional quantitative risk assessment methods. In other words, CB is a qualitative or semi-quantitative tool designed to prevent occupational accidents by controlling worker exposures to potentially hazardous chemicals in the absence of comprehensive toxicological and exposure data. These criteria correspond very precisely to the development and production of engineered nanomaterials (ENMs). Considering the significant lack of scientific knowledge about the work-related health risks posed by ENMs, CB is, in general, appropriate for these issues. Currently, CB can be adapted to the specificities of ENMs; hundreds of nanotechnology products containing ENMs are already on the market. In this context, this qualitative or semi-quantitative approach appears to be relevant for characterising and quantifying the degree of physico-chemical and biological reactivity of ENMs, leading towards better control of human health effects and the safe handling of ENMs in workplaces. A greater understanding of the CB approach is important for managing the risks of handling hazardous substances, such as ENMs, that lack established occupational exposure limits. In recent years, this topic has garnered much interest, including discussions in many technical papers. Several CB models have been developed, and many countries have created their own nano-specific CB instruments. The aims of this research were to perform a literature review about CBs, to classify the main

  4. Fire hazard analysis for Project W-320 Tank 241-C-106 waste retrieval

    SciTech Connect

    Conner, J.C.

    1995-09-12

    This Fire Hazards Analysis (FHA) for Project W-320, "Tank 241-C-106 Waste Retrieval", addresses fire hazards and fire-related concerns, in accordance with DOE 5480.7A (DOE 1998), arising from or related to the processes and equipment to be installed or modified under Project W-320. Its purpose is to ensure that there are no undue fire hazards to site personnel and the public; that the potential for the occurrence of a fire is minimized; that process control and safety systems are not damaged by fire or related perils; and that property damage from fire and related perils does not exceed an acceptable level.

  5. Uncertainty of quantitative microbiological methods of pharmaceutical analysis.

    PubMed

    Gunar, O V; Sakhno, N G

    2015-12-30

    The total uncertainty of quantitative microbiological methods used in pharmaceutical analysis consists of several components. Analysis of the most important sources of variability demonstrated no effect of culture media or plate-count technique on the estimated microbial count, while a highly significant effect of other factors (type of microorganism, pharmaceutical product, and individual reading and interpretation errors) was established. The most appropriate method for statistical analysis of such data was ANOVA, which enabled not only the effects of individual factors to be estimated but also their interactions. Considering all the elements of uncertainty and combining them mathematically, the combined relative uncertainty of the test results was estimated both for the method of quantitative examination of non-sterile pharmaceuticals and for the microbial count technique without any product. These values did not exceed 35%, which is acceptable for traditional plate count methods.
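    Combining independent relative uncertainty components "mathematically", as described above, is conventionally done by root-sum-of-squares. A minimal sketch with illustrative component values (not the paper's data):

```python
import numpy as np

# Relative standard uncertainties (%) of independent components; the labels
# follow the factors named in the abstract, the numbers are invented.
components = {"microorganism type": 20.0,
              "product matrix": 15.0,
              "reading/interpretation": 18.0}

# root-sum-of-squares combination of independent relative uncertainties
u_combined = np.sqrt(sum(u ** 2 for u in components.values()))
print(f"combined relative uncertainty ~ {u_combined:.0f}%")   # stays below 35%
```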

  6. Defining geologic Hazards for natural resources management using tree-ring analysis

    USGS Publications Warehouse

    DeGraff, J.V.; Agard, S.S.

    1984-01-01

    Landslides, avalanches, floods, and other geologic hazards impair natural resources management by jeopardizing public safety, damaging or restricting resource utilization, and necessitating expenditures for corrective measures The negative impact of geologic hazard events can be reduced by tailoring resources management to hazard potential of an area This requires assessment of where and how frequently the events occur National forests and other managed wildlands often lack monitoring or historical records to compute frequency of hazard occurrence Tree-ring analysis, based on internal growth response to external events such as tilting and abrasion, can provide frequency data Two examples of the use of tree-ring analysis to date landslide activity illustrate advantages and limitations of the technique An example from the Fishlake National Forest in central Utah illustrates assessment for planning purposes An example from the Sierra National Forest in east-central California shows assessment applied to project design Many geologic hazards in addition to landslides are suited to tree-ring analysis to establish frequency of occurrence Hazard reduction efforts in natural resources management could be enhanced by careful application of tree-ring analysis ?? 1984 Springer-Verlag New York Inc.

  7. Quantiprot - a Python package for quantitative analysis of protein sequences.

    PubMed

    Konopka, Bogumił M; Marciniak, Marta; Dyrka, Witold

    2017-07-17

    The field of protein sequence analysis is dominated by tools rooted in substitution matrices and alignments. A complementary approach is provided by methods of quantitative characterization. A major advantage of this approach is that quantitative properties define a multidimensional solution space in which sequences can be related to each other and differences can be meaningfully interpreted. Quantiprot is a software package in Python which provides a simple and consistent interface to multiple methods for quantitative characterization of protein sequences. The package can be used to calculate dozens of characteristics directly from sequences or using physico-chemical properties of amino acids. Besides basic measures, Quantiprot performs quantitative analysis of recurrence and determinism in the sequence, calculates the distribution of n-grams, and computes the Zipf's law coefficient. We propose three main fields of application of the Quantiprot package. First, quantitative characteristics can be used in alignment-free similarity searches and in clustering of large and/or divergent sequence sets. Second, a feature space defined by quantitative properties can be used in comparative studies of protein families and organisms. Third, the feature space can be used for evaluating generative models, where a large number of sequences generated by a model can be compared to actually observed sequences.
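    One of the characteristics named above, the Zipf's law coefficient, can be estimated as the negative slope of log frequency versus log rank for the sequence's n-grams. The sketch below is a generic implementation on a toy sequence, not a call into the Quantiprot API:

```python
import numpy as np
from collections import Counter

def zipf_coefficient(seq, n=2):
    """Fit log(frequency) vs log(rank) for the n-grams of a sequence and
    return the Zipf exponent (negative fitted slope)."""
    grams = Counter(seq[i:i + n] for i in range(len(seq) - n + 1))
    freqs = np.sort(np.array(list(grams.values()), dtype=float))[::-1]
    ranks = np.arange(1, len(freqs) + 1)
    slope, _ = np.polyfit(np.log(ranks), np.log(freqs), 1)
    return -slope

protein = "MKVLAAGIVALLAAGCSSHHHHAAKKMKVLAA"   # toy sequence, not real data
print(f"Zipf coefficient (bigrams) ~ {zipf_coefficient(protein):.2f}")
```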

  8. Preliminary Hazards Analysis of K-Basin Fuel Encapsulation and Storage

    SciTech Connect

    Strickland, G.C.

    1994-08-19

    This Preliminary Hazards Analysis (PHA) systematically examines the K-Basin facilities and their supporting systems for hazards created by abnormal operating conditions and external events (e.g., earthquakes) which have the potential for causing undesirable consequences to the facility worker, the onsite individual, or the public. The operational activities examined are fuel encapsulation, fuel storage and cooling. Encapsulation of sludges in the basins is not examined. A team of individuals from Westinghouse produced a set of Hazard and Operability (HAZOP) tables documenting their examination of abnormal process conditions in the systems and activities examined in K-Basins. The purpose of this report is to reevaluate and update the HAZOP in the original Preliminary Hazard Analysis of K-Basin Fuel Encapsulation and Storage developed in 1991.

  9. Sensitivity analysis of seismic hazard for Western Liguria (North Western Italy): A first attempt towards the understanding and quantification of hazard uncertainty

    NASA Astrophysics Data System (ADS)

    Barani, Simone; Spallarossa, Daniele; Bazzurro, Paolo; Eva, Claudio

    2007-05-01

    The use of logic trees in probabilistic seismic hazard analyses often involves a large number of branches that reflect the uncertainty in the selection of different models and in the selection of the parameter values of each model. The sensitivity analysis, as proposed by Rabinowitz and Steinberg [Rabinowitz, N., Steinberg, D.M., 1991. Seismic hazard sensitivity analysis: a multi-parameter approach. Bull. Seismol. Soc. Am. 81, 796-817], is an efficient tool that allows the construction of logic trees focusing attention on the parameters that have greater impact on the hazard. In this paper the sensitivity analysis is performed in order to identify the parameters that have the largest influence on the Western Liguria (North Western Italy) seismic hazard. The analysis is conducted for six strategic sites following the multi-parameter approach developed by Rabinowitz and Steinberg [Rabinowitz, N., Steinberg, D.M., 1991. Seismic hazard sensitivity analysis: a multi-parameter approach. Bull. Seismol. Soc. Am. 81, 796-817] and accounts for both mean hazard values and hazard values corresponding to different percentiles (e.g., 16%-ile and 84%-ile). The results are assessed in terms of the expected PGA with a 10% probability of exceedance in 50 years for rock conditions and account for both the contribution from specific source zones using the Cornell approach [Cornell, C.A., 1968. Engineering seismic risk analysis. Bull. Seismol. Soc. Am. 58, 1583-1606] and the spatially smoothed seismicity [Frankel, A., 1995. Mapping seismic hazard in the Central and Eastern United States. Seismol. Res. Lett. 66, 8-21]. The influence of different procedures for calculating seismic hazard, seismic catalogues (epicentral parameters), source zone models, frequency-magnitude parameters, maximum earthquake magnitude values and attenuation relationships is considered. As a result, the sensitivity analysis allows us to identify the parameters with higher influence on the hazard. Only these
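    The Cornell-type hazard computation referenced above combines a source's activity rate, a magnitude distribution and an attenuation relation into an annual rate of exceedance. The sketch below is a single-source toy version with an invented lognormal GMPE and assumed parameters, included only to make the calculation steps concrete:

```python
import numpy as np
from scipy import stats

# single source: activity rate nu, truncated G-R magnitudes, fixed distance
nu, b, m_min, m_max, r_km, sigma = 0.2, 1.0, 5.0, 7.5, 20.0, 0.6
mags = np.linspace(m_min, m_max, 60)
beta = b * np.log(10)
w = beta * np.exp(-beta * (mags - m_min))      # truncated exponential density
w /= w.sum()                                   # discrete magnitude weights

def gmpe_ln_pga(m, r):                         # toy attenuation relation
    return -3.5 + 0.9 * m - 1.2 * np.log(r + 10.0)

pga_grid = np.logspace(-2, 0.3, 40)            # PGA levels in g
lam = np.array([nu * np.sum(w * (1 - stats.norm.cdf(
        np.log(a), loc=gmpe_ln_pga(mags, r_km), scale=sigma)))
        for a in pga_grid])                    # annual exceedance rates

# PGA with 10% exceedance probability in 50 years: lam = -ln(0.9)/50
target = -np.log(1 - 0.10) / 50.0
print("PGA(475 yr) ~", pga_grid[np.argmin(np.abs(lam - target))], "g")
```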

  10. Control of separation and quantitative analysis by GC-FTIR

    NASA Astrophysics Data System (ADS)

    Semmoud, A.; Huvenne, Jean P.; Legrand, P.

    1992-03-01

    Software for 3-D representations of the 'Absorbance-Wavenumber-Retention time' surface is used to control the quality of the GC separation. Spectral information given by the FTIR detection allows the user to be sure that a chromatographic peak is 'pure.' The analysis of peppermint essential oil is presented as an example. This assurance is absolutely required for quantitative applications. Under these conditions, we have worked out a quantitative analysis of caffeine. Correlation coefficients between integrated absorbance measurements and the concentration of caffeine are discussed at two steps of the data treatment.
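    The correlation between integrated absorbance and concentration described above amounts to a linear calibration. A minimal sketch with hypothetical caffeine data (not the paper's measurements):

```python
import numpy as np

# hypothetical calibration points: concentration vs integrated absorbance
conc = np.array([0.5, 1.0, 2.0, 4.0, 8.0])        # mg/mL (assumed)
area = np.array([0.11, 0.21, 0.43, 0.84, 1.70])   # integrated absorbance

slope, intercept = np.polyfit(conc, area, 1)       # least-squares line
r = np.corrcoef(conc, area)[0, 1]                  # correlation coefficient
print(f"A = {slope:.3f} c + {intercept:.3f}, r = {r:.4f}")
print("unknown with A = 0.55 ->", (0.55 - intercept) / slope, "mg/mL")
```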

  11. Quantitative flow cytometric analysis of membrane antigen expression.

    PubMed

    D'hautcourt, Jean-Luc

    2002-11-01

    Immunological analysis for cell antigens has been performed by flow cytometry in a qualitative fashion for over thirty years. During that time it has become increasingly apparent that quantitative measurements such as number of antigens per cell provide unique and useful information. This unit on quantitative flow cytometry (QFCM) describes the most commonly used protocols, both direct and indirect, and the major methods of analysis for the number of antibody binding sites on a cell or particle. Practical applications include detection of antigen under- or overexpression in hematological malignancies, distinguishing between B cell lymphoproliferative disorders, and precise diagnosis of certain rare diseases.

  12. North Portal Fuel Storage System Fire Hazard Analysis-ESF Surface Design Package ID

    SciTech Connect

    N.M. Ruonavaara

    1995-01-18

    The purpose of the fire hazard analysis is to comprehensively assess the risk from fire within the individual fire areas. This document assesses the fire hazard only within the Exploratory Studies Facility (ESF) Design Package ID, which includes the fuel storage system area of the North Portal facility, and evaluates whether the following objectives are met: (1.1.1) this analysis, performed in accordance with the requirements of this document, satisfies the requirements for a fire hazard analysis in accordance with U.S. Department of Energy (DOE) Order 5480.7A; (1.1.2) property damage from fire and related perils does not exceed an acceptable level; (1.1.3) input is provided to the ESF Basis For Design (BFD) Document; and (1.1.4) input is provided to the facility Safety Analysis Report (SAR) (Paragraph 3.8).

  13. A quantitative analysis of the F18 flight control system

    NASA Technical Reports Server (NTRS)

    Doyle, Stacy A.; Dugan, Joanne B.; Patterson-Hine, Ann

    1993-01-01

    This paper presents an informal quantitative analysis of the F18 flight control system (FCS). The analysis technique combines a coverage model with a fault tree model. To demonstrate the method's extensive capabilities, we replace the fault tree with a digraph model of the F18 FCS, the only model available to us. The substitution shows that while digraphs have primarily been used for qualitative analysis, they can also be used for quantitative analysis. Based on our assumptions and the particular failure rates assigned to the F18 FCS components, we show that coverage does have a significant effect on the system's reliability and thus it is important to include coverage in the reliability analysis.
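    The effect of imperfect coverage on reliability, which the abstract reports as significant, can be illustrated with a toy Monte Carlo model: a two-channel redundant system in which the first failure is successfully masked only with probability c. This is a generic illustration, not the F18 FCS model or its actual failure rates:

```python
import numpy as np

rng = np.random.default_rng(2)
lam, c, t_mission, n = 1e-3, 0.99, 100.0, 200_000   # assumed toy values

# two redundant channels with exponential lifetimes
t1 = rng.exponential(1 / lam, n)
t2 = rng.exponential(1 / lam, n)
first, second = np.minimum(t1, t2), np.maximum(t1, t2)

# the first failure is "covered" (masked by reconfiguration) w.p. c;
# an uncovered first failure brings the whole system down immediately
covered = rng.random(n) < c
t_sys = np.where(covered, second, first)

print("unreliability with c = 0.99   :", np.mean(t_sys <= t_mission))
print("unreliability, perfect coverage:", np.mean(second <= t_mission))
```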

  14. A quantitative analysis of the F18 flight control system

    NASA Technical Reports Server (NTRS)

    Doyle, Stacy A.; Dugan, Joanne B.; Patterson-Hine, Ann

    1993-01-01

    This paper presents an informal quantitative analysis of the F18 flight control system (FCS). The analysis technique combines a coverage model with a fault tree model. To demonstrate the method's extensive capabilities, we replace the fault tree with a digraph model of the F18 FCS, the only model available to us. The substitution shows that while digraphs have primarily been used for qualitative analysis, they can also be used for quantitative analysis. Based on our assumptions and the particular failure rates assigned to the F18 FCS components, we show that coverage does have a significant effect on the system's reliability and thus it is important to include coverage in the reliability analysis.

  15. Issues in Quantitative Analysis of Ultraviolet Imager (UV) Data: Airglow

    NASA Technical Reports Server (NTRS)

    Germany, G. A.; Richards, P. G.; Spann, J. F.; Brittnacher, M. J.; Parks, G. K.

    1999-01-01

    The GGS Ultraviolet Imager (UVI) has proven to be especially valuable in correlative substorm, auroral morphology, and extended statistical studies of the auroral regions. Such studies are based on knowledge of the location, spatial, and temporal behavior of auroral emissions. More quantitative studies, based on absolute radiometric intensities from UVI images, require a more intimate knowledge of the instrument behavior and data processing requirements and are inherently more difficult than studies based on relative knowledge of the oval location. In this study, UVI airglow observations are analyzed and compared with model predictions to illustrate issues that arise in quantitative analysis of UVI images. These issues include instrument calibration, long term changes in sensitivity, and imager flat field response as well as proper background correction. Airglow emissions are chosen for this study because of their relatively straightforward modeling requirements and because of their implications for thermospheric compositional studies. The analysis issues discussed here, however, are identical to those faced in quantitative auroral studies.

  16. Issues in Quantitative Analysis of Ultraviolet Imager (UV) Data: Airglow

    NASA Technical Reports Server (NTRS)

    Germany, G. A.; Richards, P. G.; Spann, J. F.; Brittnacher, M. J.; Parks, G. K.

    1999-01-01

    The GGS Ultraviolet Imager (UVI) has proven to be especially valuable in correlative substorm, auroral morphology, and extended statistical studies of the auroral regions. Such studies are based on knowledge of the location, spatial, and temporal behavior of auroral emissions. More quantitative studies, based on absolute radiometric intensities from UVI images, require a more intimate knowledge of the instrument behavior and data processing requirements and are inherently more difficult than studies based on relative knowledge of the oval location. In this study, UVI airglow observations are analyzed and compared with model predictions to illustrate issues that arise in quantitative analysis of UVI images. These issues include instrument calibration, long term changes in sensitivity, and imager flat field response as well as proper background correction. Airglow emissions are chosen for this study because of their relatively straightforward modeling requirements and because of their implications for thermospheric compositional studies. The analysis issues discussed here, however, are identical to those faced in quantitative auroral studies.

  17. Evaluating the spatial distribution of quantitative risk and hazard level of arsenic exposure in groundwater, case study of Qorveh County, Kurdistan Iran.

    PubMed

    Nasrabadi, Touraj; Bidabadi, Niloufar Shirani

    2013-01-01

    The regional distribution of quantitative risk and hazard levels due to arsenic poisoning in some parts of Iran's Kurdistan province is considered. To investigate the potential risk and hazard level regarding arsenic-contaminated drinking water and its carcinogenic and non-carcinogenic effects on villagers, thirteen wells in rural areas of Qorveh County were considered for evaluation of arsenic concentration in water. The sampling campaign was performed in August 2010, and arsenic concentration was measured via the silver diethyldithiocarbamate method. The highest and lowest arsenic concentrations are reported in Guilaklu and Qezeljakand villages, with 420 and 67 μg/L, respectively. None of the thirteen water samples met the maximum contaminant level issued by the USEPA and the Institute of Standards and Industrial Research of Iran (10 ppb). The highest arsenic concentrations, and consequently risk and hazard levels, belong to villages situated along the eastern frontiers of the county. Volcanic activity during the upper Miocene and Pleistocene in this part of the study area may be the main geogenic source of the arsenic pollution. Quantitative risk values vary from 1.49E-03 in Qezeljakand to 8.92E-03 in Guilaklu and may be interpreted as very high when compared with similar studies in Iran. Regarding non-carcinogenic effects, all thirteen water samples are considered hazardous, as all calculated chronic daily intakes are greater than the arsenic reference dose. Such a drinking water source has the potential to impose adverse carcinogenic and non-carcinogenic effects on villagers. Accordingly, an urgent decision must be made to substitute the current drinking water source with a safer one.
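    The quantitative risk and hazard figures above follow the standard USEPA intake formulas: chronic daily intake CDI = (C × IR × EF × ED)/(BW × AT), cancer risk = CDI × SF, and hazard quotient HQ = CDI/RfD. The sketch below uses the Guilaklu concentration with assumed default exposure parameters; with an ingestion rate of 1 L/day it lands close to the reported 8.92E-03:

```python
# exposure parameters are illustrative defaults, not the study's values
C = 0.420                                  # arsenic concentration, mg/L (Guilaklu)
IR, EF, ED, BW = 1.0, 365, 70, 70          # L/day, days/yr, yr, kg
AT = 70 * 365                              # averaging time, days

cdi = C * IR * EF * ED / (BW * AT)         # chronic daily intake, mg/kg-day
risk = cdi * 1.5                           # oral slope factor 1.5 (mg/kg-day)^-1
hq = cdi / 3.0e-4                          # arsenic reference dose, mg/kg-day
print(f"CDI = {cdi:.2e} mg/kg-day, risk = {risk:.1e}, HQ = {hq:.0f}")
```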

  18. Evaluating the spatial distribution of quantitative risk and hazard level of arsenic exposure in groundwater, case study of Qorveh County, Kurdistan Iran

    PubMed Central

    2013-01-01

    The regional distribution of quantitative risk and hazard levels due to arsenic poisoning in some parts of Iran's Kurdistan province is considered. To investigate the potential risk and hazard level regarding arsenic-contaminated drinking water and its carcinogenic and non-carcinogenic effects on villagers, thirteen wells in rural areas of Qorveh County were considered for evaluation of arsenic concentration in water. The sampling campaign was performed in August 2010, and arsenic concentration was measured via the silver diethyldithiocarbamate method. The highest and lowest arsenic concentrations are reported in Guilaklu and Qezeljakand villages, with 420 and 67 μg/L, respectively. None of the thirteen water samples met the maximum contaminant level issued by the USEPA and the Institute of Standards and Industrial Research of Iran (10 ppb). The highest arsenic concentrations, and consequently risk and hazard levels, belong to villages situated along the eastern frontiers of the county. Volcanic activity during the upper Miocene and Pleistocene in this part of the study area may be the main geogenic source of the arsenic pollution. Quantitative risk values vary from 1.49E-03 in Qezeljakand to 8.92E-03 in Guilaklu and may be interpreted as very high when compared with similar studies in Iran. Regarding non-carcinogenic effects, all thirteen water samples are considered hazardous, as all calculated chronic daily intakes are greater than the arsenic reference dose. Such a drinking water source has the potential to impose adverse carcinogenic and non-carcinogenic effects on villagers. Accordingly, an urgent decision must be made to substitute the current drinking water source with a safer one. PMID:23574885

  19. Crossing thresholds: Analysis of hazardous tipping points in alpine catchments

    NASA Astrophysics Data System (ADS)

    Lutzmann, Silke; Sass, Oliver

    2016-04-01

    Steep mountain channels or torrents in small alpine catchments are characterized by high geomorphic activity, with sediment dynamics being inherently nonlinear and threshold-mediated. Localized, high-intensity rainstorms can drive torrential systems past a tipping point, resulting in a sudden onset of hazardous events like (flash-) flooding, heavy bedload transport or debris flows. Such responses exhibit an abrupt switch in the fluvial system's mode (e.g. transport / supply limited). Changes in functional connectivity may persist beyond the tipping point. Torrential hazards cause costly damage in the densely populated Alpine Region. Thus, there is rising interest in the potential effects of climate change on torrential sediment dynamics. Understanding critical conditions close to tipping points is important to reduce uncertainty in predicting sediment fluxes. In this study we aim at (i) establishing threshold precipitation characteristics for the Eastern Alps of Austria, precipitation being hypothesized to be the main forcing factor of torrential events; (ii) determining how thresholds vary in space and time; (iii) identifying which internal conditions are critical for susceptibility, since the effect of external triggers is strongly mediated by the internal disposition of catchments to respond; and (iv) establishing whether there has been a change in magnitude or frequency in the recent past and what can be expected for the future. The 71 km2 catchment of the river Schöttlbach in the East Alpine Region of Styria (Austria) has been monitored since a heavy precipitation event resulted in a catastrophic flood in July 2011. Sediment mobilization from slopes as well as within-channel storage and bedload transport are regularly measured using photogrammetric methods and sediment impact sensors. Thus, detailed knowledge exists on the magnitude and spatial propagation of sediment waves through the catchment. The associated hydro-meteorological (pre-)conditions can be inferred from a dense station network. Changing bedload transport rates and

  20. An Independent Evaluation of the FMEA/CIL Hazard Analysis Alternative Study

    NASA Technical Reports Server (NTRS)

    Ray, Paul S.

    1996-01-01

    The present instruments of safety and reliability risk control for a majority of National Aeronautics and Space Administration (NASA) programs/projects consist of Failure Mode and Effects Analysis (FMEA), Hazard Analysis (HA), the Critical Items List (CIL), and Hazard Reports (HR). This extensive analytical approach was introduced in the early 1970s and was implemented for the Space Shuttle Program by NHB 5300.4 (1D-2). Since the Challenger accident in 1986, the process has been expanded considerably and has resulted in the introduction of similar and/or duplicated activities in safety/reliability risk analysis. A study initiated in 1995 to search for an alternative to the current FMEA/CIL hazard analysis methodology generated a proposed method on April 30, 1996. The objective of this Summer Faculty Study was to participate in and conduct an independent evaluation of the proposed alternative to simplify the present safety and reliability risk control procedure.

  1. Using Probabilistic Seismic Hazard Analysis in Assessing Seismic Risk for Taipei City and New Taipei City

    NASA Astrophysics Data System (ADS)

    Hsu, Ming-Kai; Wang, Yu-Ju; Cheng, Chin-Tung; Ma, Kuo-Fong; Ke, Siao-Syun

    2016-04-01

    In this study, we evaluate the seismic hazard and risk for Taipei City and New Taipei City, which are important municipalities and the most populous cities in Taiwan. The evaluation of seismic risk involves the combination of three main components: a probabilistic seismic hazard model, an exposure model defining the spatial distribution of elements exposed to the hazard, and vulnerability functions capable of describing the distribution of percentage of loss for a set of intensity measure levels. The seismic hazard for Taipei City and New Taipei City is presented as hazard maps in terms of ground motion values expected to be exceeded at a 10% probability level in 50 years (return period 475 years) and at a 2% probability level in 50 years (return period 2475 years), according to the Taiwan Earthquake Model (TEM), which provides two seismic hazard models for Taiwan. The first model adopted the source parameters of 38 seismogenic structures identified by the TEM geologists. The other model considered 33 active faults and was published by the Central Geological Survey (CGS), Taiwan, in 2010. Grid-based building data at 500 m by 500 m resolution were selected for the evaluation, as they provide detailed information about the location, value and vulnerability classification of the exposed elements. The results from this study were evaluated with the OpenQuake engine, the open-source software for seismic risk and hazard assessment developed within the Global Earthquake Model (GEM) initiative. Our intention is to make a first attempt at modeling seismic risk from hazard on an open platform for Taiwan. An analysis through disaggregation of hazard components will also be made to prioritize the risk for further policy making.
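    The return periods quoted above follow from the exceedance probability under a Poisson assumption, T = -t / ln(1 - p). A minimal check:

```python
import math

def return_period(p_exceed, t_years):
    """Return period for exceedance probability p_exceed in t_years,
    assuming Poisson occurrences: T = -t / ln(1 - p)."""
    return -t_years / math.log(1.0 - p_exceed)

print(round(return_period(0.10, 50)))   # ~475 years
print(round(return_period(0.02, 50)))   # ~2475 years
```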

  2. Techniques for hazard analysis and their use at CERN.

    PubMed

    Nuttall, C; Schönbacher, H

    2001-01-01

    CERN, the European Organisation for Nuclear Research, is situated near Geneva and has its accelerators and experimental facilities astride the Swiss and French frontiers, attracting physicists from all over the world to this unique laboratory. The main accelerator is situated in a 27 km underground ring, and the experiments take place in huge underground caverns in order to detect the fragments resulting from the collision of subatomic particles at speeds approaching that of light. These detectors contain many hundreds of tons of flammable materials, mainly plastics in cables and structural components, flammable gases in the detectors themselves, and cryogenic fluids such as helium and argon. The experiments consume large amounts of electrical power; the dangers involved have thus necessitated the use of analytical techniques to identify the hazards and quantify the risks to personnel and the infrastructure. The techniques described in the paper were developed in the process industries, where they have proved to be of great value. They have been successfully applied to CERN industrial and experimental installations and, in some cases, have been instrumental in changing the philosophy of the experimentalists and their detectors.

  3. CUMULATIVE RISK ASSESSMENT: GETTING FROM TOXICOLOGY TO QUANTITATIVE ANALYSIS

    EPA Science Inventory

    INTRODUCTION: GETTING FROM TOXICOLOGY TO QUANTITATIVE ANALYSIS FOR CUMULATIVE RISK

    Hugh A. Barton [1] and Carey N. Pope [2]
    [1] US EPA, Office of Research and Development, National Health and Environmental Effects Research Laboratory, Research Triangle Park, NC
    [2] Department of...

  4. CUMULATIVE RISK ASSESSMENT: GETTING FROM TOXICOLOGY TO QUANTITATIVE ANALYSIS

    EPA Science Inventory

    INTRODUCTION: GETTING FROM TOXICOLOGY TO QUANTITATIVE ANALYSIS FOR CUMULATIVE RISK

    Hugh A. Barton [1] and Carey N. Pope [2]
    [1] US EPA, Office of Research and Development, National Health and Environmental Effects Research Laboratory, Research Triangle Park, NC
    [2] Department of...

  5. Site specific probabilistic seismic hazard analysis at Dubai Creek on the west coast of UAE

    NASA Astrophysics Data System (ADS)

    Shama, Ayman A.

    2011-03-01

    A probabilistic seismic hazard analysis (PSHA) was conducted to establish the hazard spectra for a site located at Dubai Creek on the west coast of the United Arab Emirates (UAE). The PSHA considered all the seismogenic sources that affect the site, including plate boundaries such as the Makran subduction zone, the Zagros fold-thrust region and the transition fault system between them; and local crustal faults in UAE. PSHA indicated that local faults dominate the hazard. The peak ground acceleration (PGA) for the 475-year return period spectrum is 0.17 g and 0.33 g for the 2,475-year return period spectrum. The hazard spectra are then employed to establish rock ground motions using the spectral matching technique.

  6. Analysis of error-prone survival data under additive hazards models: measurement error effects and adjustments.

    PubMed

    Yan, Ying; Yi, Grace Y

    2016-07-01

    Covariate measurement error occurs commonly in survival analysis. Under the proportional hazards model, measurement error effects have been well studied, and various inference methods have been developed to correct for error effects under such a model. In contrast, error-contaminated survival data under the additive hazards model have received relatively less attention. In this paper, we investigate this problem by exploring measurement error effects on parameter estimation and the change of the hazard function. New insights of measurement error effects are revealed, as opposed to well-documented results for the Cox proportional hazards model. We propose a class of bias correction estimators that embraces certain existing estimators as special cases. In addition, we exploit the regression calibration method to reduce measurement error effects. Theoretical results for the developed methods are established, and numerical assessments are conducted to illustrate the finite sample performance of our methods.
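    The regression calibration step mentioned above replaces the error-prone measurement W = X + U with E[X|W] before model fitting. The sketch below demonstrates the attenuation and its correction for a simple linear outcome, as a stand-in for the hazard regression; the variances are assumed known:

```python
import numpy as np

rng = np.random.default_rng(3)
n, beta, sx2, su2 = 50_000, 1.0, 1.0, 0.5
x = rng.normal(0, np.sqrt(sx2), n)             # true covariate
w = x + rng.normal(0, np.sqrt(su2), n)         # error-prone measurement
y = beta * x + rng.normal(0, 1.0, n)           # stand-in linear outcome

k = sx2 / (sx2 + su2)                          # attenuation factor
x_hat = w.mean() + k * (w - w.mean())          # E[X|W]: regression calibration

slope = lambda a, b: np.cov(a, b)[0, 1] / np.var(a)
print("naive slope     :", round(slope(w, y), 3))      # ~ beta * k = 0.67
print("calibrated slope:", round(slope(x_hat, y), 3))  # ~ beta = 1.0
```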

  7. [Hazard evaluation modeling of particulate matters emitted by coal-fired boilers and case analysis].

    PubMed

    Shi, Yan-Ting; Du, Qian; Gao, Jian-Min; Bian, Xin; Wang, Zhi-Pu; Dong, He-Ming; Han, Qiang; Cao, Yang

    2014-02-01

    In order to evaluate the hazard of PM2.5 emitted by various boilers, particulate matter smaller than 2.5 μm was segmented based on formation mechanism and hazard level to human beings and the environment. Taking into account the mass concentration, number concentration, Hg enrichment factor, and Hg content of different coal ashes, a comprehensive model for evaluating the hazard of PM2.5 emitted by coal-fired boilers was established. Finally, using field experimental data from the previous literature, a case analysis of the evaluation model was conducted, and the concept of a hazard reduction coefficient was proposed, which can be used to evaluate the performance of dust removers.

  8. A Quantitative Appraisal of Earthquake Hazard Parameters Evaluated from Bayesian Approach for Different Regions in Iranian Plateau

    NASA Astrophysics Data System (ADS)

    Mohammadi, Hiwa; Türker, Tügba; Bayrak, Yusuf

    2016-06-01

    In this study, we used the program for Bayesian estimation of seismic hazard elaborated by Alexey Lyubushin; our study is the next in a sequence of applications of this software to seismic hazard assessment in different regions of the world. The earthquake hazard parameters, namely the maximum regional magnitude (Mmax), the β value and the seismic activity rate or intensity (λ), together with their uncertainties, have been evaluated for 15 different source regions in the Iranian Plateau with the help of a complete and homogeneous earthquake catalogue covering the period 1900-2014 with Mw ≥ 4.0. The estimated Mmax values vary between 6.25 and 8.37; the lowest value is observed in the Zagros foredeep, whereas the highest value is observed in the Makran. It is also observed that there is a strong relationship between the maximum earthquake magnitudes estimated by the Bayesian approach and the maximum observed magnitudes. Moreover, quantiles of the distributions of true and apparent magnitudes for future time intervals of 5, 10, 20, 50 and 100 years are calculated, with confidence limits for probability levels of 50, 70 and 90%, in the 15 source regions. The computed earthquake hazard parameters identify the most seismically active regions of the Iranian Plateau. The Makran and East Iran show earthquake magnitudes greater than 8.0 within the next 100 years at the 90% probability level, compared with other regions, which indicates that these regions are more susceptible to the occurrence of large earthquakes. The outcomes obtained in this study may have useful implications for probabilistic seismic hazard studies of the Iranian Plateau.

  9. A seismic hazard uncertainty analysis for the New Madrid seismic zone

    USGS Publications Warehouse

    Cramer, C.H.

    2001-01-01

    A review of the scientific issues relevant to characterizing earthquake sources in the New Madrid seismic zone has led to the development of a logic tree of possible alternative parameters. A variability analysis, using Monte Carlo sampling of this consensus logic tree, is presented and discussed. The analysis shows that for 2%-exceedance-in-50-year hazard, the best-estimate seismic hazard map is similar to previously published seismic hazard maps for the area. For peak ground acceleration (PGA) and spectral acceleration at 0.2 and 1.0 s (0.2 and 1.0 s Sa), the coefficient of variation (COV) representing the knowledge-based uncertainty in seismic hazard can exceed 0.6 over the New Madrid seismic zone and diminishes to about 0.1 away from areas of seismic activity. Sensitivity analyses show that the largest contributor to PGA, 0.2 and 1.0 s Sa seismic hazard variability is the uncertainty in the location of future 1811-1812 New Madrid sized earthquakes. This is followed by the variability due to the choice of ground motion attenuation relation, the magnitude for the 1811-1812 New Madrid earthquakes, and the recurrence interval for M>6.5 events. Seismic hazard is not very sensitive to the variability in seismogenic width and length. Published by Elsevier Science B.V.
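    Monte Carlo sampling of a logic tree, as used above, repeatedly draws one branch per node and recomputes the hazard; the spread of the results gives the COV. The sketch below is a toy version with invented branch values and weights, evaluating the annual rate of exceeding 0.2 g at a fixed site:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
n, r, sigma, a0 = 20_000, 40.0, 0.6, np.log(0.2)    # all values illustrative

# branch draws: characteristic magnitude, recurrence rate, attenuation relation
m = rng.choice([7.6, 7.9, 8.1], n, p=[0.2, 0.6, 0.2])
rate = rng.choice([1 / 400, 1 / 500, 1 / 700], n, p=[0.3, 0.4, 0.3])
use_g1 = rng.random(n) < 0.5
ln_med = np.where(use_g1, -3.6 + 0.95 * m, -3.2 + 0.85 * m) \
         - 1.25 * np.log(r + 10)                    # two toy GMPEs

# annual rate of exceeding 0.2 g for each sampled branch combination
lam = rate * (1 - stats.norm.cdf(a0, loc=ln_med, scale=sigma))
print("COV of exceedance rate:", round(lam.std() / lam.mean(), 2))
```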

  10. Quantitating the subtleties of microglial morphology with fractal analysis

    PubMed Central

    Karperien, Audrey; Ahammer, Helmut; Jelinek, Herbert F.

    2013-01-01

    It is well established that microglial form and function are inextricably linked. In recent years, the traditional view that microglial form ranges between “ramified resting” and “activated amoeboid” has been emphasized through advancing imaging techniques that point to microglial form being highly dynamic even within the currently accepted morphological categories. Moreover, microglia adopt meaningful intermediate forms between categories, with considerable crossover in function and varying morphologies as they cycle, migrate, wave, phagocytose, and extend and retract fine and gross processes. From a quantitative perspective, it is problematic to measure such variability using traditional methods, but one way of quantitating such detail is through fractal analysis. The techniques of fractal analysis have been used for quantitating microglial morphology, to categorize gross differences but also to differentiate subtle differences (e.g., amongst ramified cells). Multifractal analysis in particular is one technique of fractal analysis that may be useful for identifying intermediate forms. Here we review current trends and methods of fractal analysis, focusing on box counting analysis, including lacunarity and multifractal analysis, as applied to microglial morphology. PMID:23386810
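    The core of the box-counting analysis mentioned above is fitting log N(s) against log(1/s), where N(s) is the number of boxes of side s that touch the shape. A minimal sketch on a synthetic binary outline (a stand-in for a segmented cell mask):

```python
import numpy as np

def box_counting_dimension(img, sizes=(2, 4, 8, 16, 32)):
    """Box-counting dimension of a binary image: count boxes of side s that
    contain any foreground pixel, then fit log N(s) vs log(1/s)."""
    counts = []
    for s in sizes:
        h, w = img.shape[0] // s * s, img.shape[1] // s * s
        blocks = img[:h, :w].reshape(h // s, s, w // s, s)
        counts.append(blocks.any(axis=(1, 3)).sum())
    slope, _ = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)
    return slope

# toy binary outline: a thin ring, whose dimension should be close to 1
yy, xx = np.mgrid[:128, :128]
ring = np.abs(np.hypot(yy - 64, xx - 64) - 40) < 1.5
print(f"D ~ {box_counting_dimension(ring):.2f}")
```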

  11. Quantitating the subtleties of microglial morphology with fractal analysis.

    PubMed

    Karperien, Audrey; Ahammer, Helmut; Jelinek, Herbert F

    2013-01-01

    It is well established that microglial form and function are inextricably linked. In recent years, the traditional view that microglial form ranges between "ramified resting" and "activated amoeboid" has been emphasized through advancing imaging techniques that point to microglial form being highly dynamic even within the currently accepted morphological categories. Moreover, microglia adopt meaningful intermediate forms between categories, with considerable crossover in function and varying morphologies as they cycle, migrate, wave, phagocytose, and extend and retract fine and gross processes. From a quantitative perspective, it is problematic to measure such variability using traditional methods, but one way of quantitating such detail is through fractal analysis. The techniques of fractal analysis have been used for quantitating microglial morphology, to categorize gross differences but also to differentiate subtle differences (e.g., amongst ramified cells). Multifractal analysis in particular is one technique of fractal analysis that may be useful for identifying intermediate forms. Here we review current trends and methods of fractal analysis, focusing on box counting analysis, including lacunarity and multifractal analysis, as applied to microglial morphology.

  12. Quantitative analysis of regional myocardial performance in coronary artery disease

    NASA Technical Reports Server (NTRS)

    Stewart, D. K.; Dodge, H. T.; Frimer, M.

    1975-01-01

    Findings are presented for a group of subjects with significant coronary artery stenosis and for a group of controls, determined by use of a quantitative method for the study of regional myocardial performance based on the frame-by-frame analysis of biplane left ventricular angiograms. Particular emphasis was placed upon the analysis of wall motion in terms of normalized segment dimensions and the timing and velocity of contraction. The results were compared with the method of subjective assessment used clinically.

  13. Quantitative analysis of regional myocardial performance in coronary artery disease

    NASA Technical Reports Server (NTRS)

    Stewart, D. K.; Dodge, H. T.; Frimer, M.

    1975-01-01

    Findings are presented for a group of subjects with significant coronary artery stenosis and for a group of controls, determined by use of a quantitative method for the study of regional myocardial performance based on the frame-by-frame analysis of biplane left ventricular angiograms. Particular emphasis was placed upon the analysis of wall motion in terms of normalized segment dimensions and the timing and velocity of contraction. The results were compared with the method of subjective assessment used clinically.

  14. Quantitative transverse flow assessment using OCT speckle decorrelation analysis

    NASA Astrophysics Data System (ADS)

    Liu, Xuan; Huang, Yong; Ramella-Roman, Jessica C.; Kang, Jin U.

    2013-03-01

    In this study, we demonstrate the use of inter-A-scan speckle decorrelation analysis of optical coherence tomography (OCT) to assess fluid flow. This method allows quantitative measurement of fluid flow in a plane normal to the scanning beam. To validate this method, OCT images were obtained from a microfluidic channel with bovine milk flowing at different speeds. We also imaged a blood vessel in an in vivo animal model and performed speckle analysis to assess blood flow.
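    Speckle decorrelation analysis of this kind rests on the normalised cross-correlation between successive A-scans: faster transverse flow means lower correlation. The sketch below computes a decorrelation value on synthetic intensity traces; mapping decorrelation to absolute flow speed requires a calibration such as the milk-channel experiments described above:

```python
import numpy as np

def decorrelation(a_scan1, a_scan2):
    """1 - normalised cross-correlation between two A-scan intensity
    profiles; larger values indicate faster speckle change (faster flow)."""
    a = (a_scan1 - a_scan1.mean()) / a_scan1.std()
    b = (a_scan2 - a_scan2.mean()) / a_scan2.std()
    return 1.0 - np.mean(a * b)

rng = np.random.default_rng(5)
static = rng.random(512)
slow = 0.9 * static + 0.1 * rng.random(512)    # small speckle change
fast = 0.5 * static + 0.5 * rng.random(512)    # large speckle change
print(decorrelation(static, slow), decorrelation(static, fast))
```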

  15. Hazard analysis of Arid and semi-Arid (ASAL) regions of Kenya.

    PubMed

    Tabu, J S; Otwelo, J A; Koskei, P; Makokha, P

    2013-06-01

    water scarcity, poverty and low educational levels. The region suffers from a variety of hazards, in particular drought and famine, epidemics including HIV/AIDS, and war and conflict. Environmental degradation, though given a low score, may be more of a perception. There is a need to undertake a comprehensive hazard and vulnerability analysis at the regional and country level to inform interventions and other developmental activities. Women should be targeted at the community and leadership levels, and efforts to empower them should be stepped up.

  16. Improved method and apparatus for chromatographic quantitative analysis

    DOEpatents

    Fritz, J.S.; Gjerde, D.T.; Schmuckler, G.

    An improved apparatus and method are described for the quantitative analysis of a solution containing a plurality of anion species by ion exchange chromatography, utilizing a single eluent and a single ion exchange bed which does not require periodic regeneration. The solution containing the anions is added to an anion exchange resin bed, a low-capacity macroreticular polystyrene-divinylbenzene resin containing quaternary ammonium functional groups, and is eluted therefrom with a dilute solution of a low-electrical-conductance organic acid salt. As each anion species is eluted from the bed, it is quantitatively sensed by conventional detection means such as a conductivity cell.

  17. Quantitative analysis of single-molecule superresolution images

    PubMed Central

    Coltharp, Carla; Yang, Xinxing; Xiao, Jie

    2014-01-01

    This review highlights the quantitative capabilities of single-molecule localization-based superresolution imaging methods. In addition to revealing fine structural details, the molecule coordinate lists generated by these methods provide the critical ability to quantify the number, clustering, and colocalization of molecules with 10 – 50 nm resolution. Here we describe typical workflows and precautions for quantitative analysis of single-molecule superresolution images. These guidelines include potential pitfalls and essential control experiments, allowing critical assessment and interpretation of superresolution images. PMID:25179006
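    Quantifying clustering from a localization coordinate list, as discussed above, is often done with density-based clustering. The sketch below applies scikit-learn's DBSCAN to a hypothetical coordinate table; the eps and min_samples values are illustrative, and DBSCAN is one common choice rather than the review's prescribed method:

```python
import numpy as np
from sklearn.cluster import DBSCAN

# hypothetical localisation table: x, y coordinates in nm
rng = np.random.default_rng(6)
cluster = rng.normal([500, 500], 15, size=(80, 2))   # one tight cluster
background = rng.uniform(0, 1000, size=(40, 2))      # sparse background
coords = np.vstack([cluster, background])

# eps chosen near the 10-50 nm resolution scale mentioned above
labels = DBSCAN(eps=30, min_samples=5).fit_predict(coords)
n_clusters = len(set(labels)) - (1 if -1 in labels else 0)
print(f"{n_clusters} cluster(s); {np.sum(labels == -1)} unclustered molecules")
```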

  18. Quantitative analysis of culture using millions of digitized books.

    PubMed

    Michel, Jean-Baptiste; Shen, Yuan Kui; Aiden, Aviva Presser; Veres, Adrian; Gray, Matthew K; Pickett, Joseph P; Hoiberg, Dale; Clancy, Dan; Norvig, Peter; Orwant, Jon; Pinker, Steven; Nowak, Martin A; Aiden, Erez Lieberman

    2011-01-14

    We constructed a corpus of digitized texts containing about 4% of all books ever printed. Analysis of this corpus enables us to investigate cultural trends quantitatively. We survey the vast terrain of 'culturomics,' focusing on linguistic and cultural phenomena that were reflected in the English language between 1800 and 2000. We show how this approach can provide insights about fields as diverse as lexicography, the evolution of grammar, collective memory, the adoption of technology, the pursuit of fame, censorship, and historical epidemiology. Culturomics extends the boundaries of rigorous quantitative inquiry to a wide array of new phenomena spanning the social sciences and the humanities.

  19. Quantitative analysis of culture using millions of digitized books

    PubMed Central

    Michel, Jean-Baptiste; Shen, Yuan Kui; Aiden, Aviva P.; Veres, Adrian; Gray, Matthew K.; Pickett, Joseph P.; Hoiberg, Dale; Clancy, Dan; Norvig, Peter; Orwant, Jon; Pinker, Steven; Nowak, Martin A.; Aiden, Erez Lieberman

    2011-01-01

    We constructed a corpus of digitized texts containing about 4% of all books ever printed. Analysis of this corpus enables us to investigate cultural trends quantitatively. We survey the vast terrain of ‘culturomics’, focusing on linguistic and cultural phenomena that were reflected in the English language between 1800 and 2000. We show how this approach can provide insights about fields as diverse as lexicography, the evolution of grammar, collective memory, the adoption of technology, the pursuit of fame, censorship, and historical epidemiology. ‘Culturomics’ extends the boundaries of rigorous quantitative inquiry to a wide array of new phenomena spanning the social sciences and the humanities. PMID:21163965

  20. Quantitative numerical analysis of transient IR-experiments on buildings

    NASA Astrophysics Data System (ADS)

    Maierhofer, Ch.; Wiggenhauser, H.; Brink, A.; Röllig, M.

    2004-12-01

    Impulse-thermography has been established as a fast and reliable tool in many areas of non-destructive testing. In recent years several investigations have been done to apply active thermography to civil engineering. For quantitative investigations in this area of application, finite difference calculations have been performed for systematic studies on the influence of environmental conditions, heating power and time, defect depth and size and thermal properties of the bulk material (concrete). The comparison of simulated and experimental data enables the quantitative analysis of defects.
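    Finite difference calculations of the kind described reduce, in their simplest 1-D form, to explicit time-stepping of the heat equation. The sketch below simulates the diffusion of a surface heat pulse through a slab in which a layer of contrasting diffusivity stands in for a defect; all material values are illustrative, not the paper's parameters:

```python
import numpy as np

nx, dx, dt, steps = 200, 1e-3, 0.02, 5000      # 20 cm depth, 100 s simulated
alpha = np.full(nx, 8e-7)                      # ~concrete diffusivity, m^2/s
alpha[80:90] = 2e-5                            # defect layer (assumed value)
assert (alpha * dt / dx**2).max() < 0.5        # explicit-scheme stability

T = np.zeros(nx)
T[:3] = 40.0                                   # short surface heat pulse, K
for step in range(steps):
    # explicit update of the interior nodes
    T[1:-1] += alpha[1:-1] * dt / dx**2 * (T[2:] - 2 * T[1:-1] + T[:-2])
    T[0], T[-1] = T[1], T[-2]                  # insulated boundaries
    if step % 1000 == 0:
        print(f"t = {step * dt:5.0f} s, surface T = {T[0]:.2f}")
```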

  1. Landslide Hazard Analysis with Multidisciplinary Approach: İstanbul example

    NASA Astrophysics Data System (ADS)

    Kılıç, Osman; Baş, Mahmut; Yahya Menteşe, Emin; Tarih, Ahmet; Duran, Kemal; Gümüş, Salim; Rıza Yapar, Evrens; Emin Karasu, Muhammed; Acar Kara, Sema; Karaman, Abdullah; Özalaybey, Serdar; Zor, Ekrem; Ediger, Vedat; Arpat, Esen; Özgül, Necdet; Polat, Feyzi; Doǧan, Uǧur; Çakır, Ziyadin

    2017-04-01

    There are several methods that can be utilized for describing landslide mechanisms. While some are commonly used, there are relatively new methods that have proven useful. Each method has its own limitations, and thus the integrated use of these methods contributes to obtaining a realistic landslide model. The slopes of the Küçükçekmece and Büyükçekmece Lagoons, located on the Marmara Sea coast of İstanbul, Turkey, are among the most distinctive examples of complex-type landslides. The landslides in the area started developing at low sea level and appear to have ceased, or at least slowed to a minimum, after the sea level rise, as opposed to the still-active landslides that continue to cause damage, especially on the valley slopes above the present sea level between the two lagoons. To clarify the characteristics of these slope movements and classify them in the most accurate way, the Directorate of Earthquake and Ground Research of Istanbul Metropolitan Municipality launched a project in cooperation with the Marmara Research Center of The Scientific and Technological Research Council of Turkey (TÜBİTAK). The project draws on the techniques of different disciplines such as geology, geophysics, geomorphology, hydrogeology, geotechnics, geodesy, remote sensing and meteorology. The observations include detailed mapping of topography by airborne LIDAR, deformation monitoring with more than 80 GPS stations, ground-based synthetic aperture radar measurements in 8 critical zones, 81 geological drillings and more than 20 km of geophysical measurements. Over three years of monitoring, the acquired data and the results, such as the landslide hazard map, were integrated into a GIS database to ease the tasks of urban planners and decision makers.

  2. A fluvial and pluvial probabilistic flood hazard analysis for Can Tho city, Vietnam

    NASA Astrophysics Data System (ADS)

    Apel, Heiko; Martinez, Oriol; Thi Chinh, Do; Viet Dung, Nguyen

    2014-05-01

    Can Tho city is the largest city and the economic heart of the Mekong Delta, Vietnam. Due to its economic importance and envisaged development goals, the city grew rapidly in population size and extent over the last two decades. Large parts of the city are located in flood-prone areas, and the central parts of the city have recently experienced an increasing number of flood events, both of fluvial and pluvial nature. As the economic power and asset values are constantly increasing, this poses a considerable risk for the city. The aim of this study is to perform a flood hazard analysis considering both fluvial and pluvial floods and to derive probabilistic flood hazard maps. This requires, in a first step, an understanding of the typical flood mechanisms. Fluvial floods are triggered by a coincidence of high water levels during the annual flood period in the Mekong Delta with high tidal levels, which in combination cause short-term inundations in Can Tho. Pluvial floods are triggered by typical tropical convective rainstorms during the monsoon season. These two flood pathways are essentially independent in their sources and can thus be treated accordingly in the hazard analysis. For the fluvial hazard analysis we propose a bivariate frequency analysis of the Mekong flood characteristics, the annual maximum flood discharge Q and the annual flood volume V, at the upper boundary of the Mekong Delta, the gauging station Kratie. This defines probabilities of exceedance of different Q-V pairs, which are transferred into synthetic flood hydrographs. The synthetic hydrographs are routed through a quasi-2D hydrodynamic model of the entire Mekong Delta in order to provide boundary conditions for a detailed hazard mapping of Can Tho. This downscaling step is necessary because the huge complexity of the river and channel network does not allow for a proper definition of boundary conditions for Can Tho city by gauge data alone. In addition the available gauge data around Can Tho
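    A rank-free empirical version of the bivariate frequency analysis proposed above simply estimates the joint exceedance probability of Q-V pairs from the sample. The sketch below does this on synthetic, correlated flood data; the study's records and any copula choice are not reproduced here:

```python
import numpy as np

rng = np.random.default_rng(7)
n = 80                                          # years of synthetic record
q = rng.gumbel(40_000, 8_000, n)                # annual peak discharge, m^3/s
v = 0.8 * q + rng.normal(0, 5_000, n)           # correlated flood volume index

def joint_exceedance(q0, v0):
    """P(Q > q0 and V > v0) estimated from the empirical sample."""
    return np.mean((q > q0) & (v > v0))

print("P(Q > 50000, V > 45000) ~", joint_exceedance(50_000, 45_000))
```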

  3. Information System Hazard Analysis: A Method for Identifying Technology-induced Latent Errors for Safety.

    PubMed

    Weber, Jens H; Mason-Blakley, Fieran; Price, Morgan

    2015-01-01

    Many health information and communication technologies (ICT) are safety-critical; moreover, reports of technology-induced adverse events related to them are plentiful in the literature. Despite repeated criticism and calls to action, recent data collected by the Institute of Medicine (IOM) and other organizations do not indicate significant improvements with respect to the safety of health ICT systems. A large part of the industry still operates on a reactive "break & patch" model; the application of pro-active, systematic hazard analysis methods for engineering ICT that produce "safe by design" products is sparse. This paper applies one such method: Information System Hazard Analysis (ISHA). ISHA adapts and combines hazard analysis techniques from other safety-critical domains and customizes them for ICT. We provide an overview of the steps involved in ISHA.

  4. Markov chain Monte Carlo linkage analysis of complex quantitative phenotypes.

    PubMed

    Hinrichs, A; Reich, T

    2001-01-01

    We report a Markov chain Monte Carlo analysis of the five simulated quantitative traits in Genetic Analysis Workshop 12 using the Loki software. Our objectives were to determine the efficacy of the Markov chain Monte Carlo method and to test a new scoring technique. Our initial blind analysis, on replicate 42 (the "best replicate"), successfully detected four of the five disease loci and found no false positives. A power analysis shows that the software could usually detect 4 of the 10 trait/gene combinations at an empirical point-wise p-value of 1.5 x 10^-4.
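
    An empirical point-wise p-value such as the 1.5 x 10^-4 quoted here is typically obtained by ranking the observed linkage score against scores from null replicates. A minimal sketch, with hypothetical score arrays rather than actual Loki output:

        import numpy as np

        def empirical_p(observed, null_scores):
            """Empirical point-wise p-value with the standard +1 correction,
            which avoids reporting p = 0 from a finite set of null replicates."""
            null_scores = np.asarray(null_scores)
            return (1 + np.sum(null_scores >= observed)) / (1 + null_scores.size)

        # Hypothetical example: one observed score against 20,000 null scores.
        rng = np.random.default_rng(1)
        null = rng.normal(0.0, 1.0, 20000)
        print(empirical_p(4.1, null))   # small p -> evidence for linkage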

  5. Spotsizer: High-throughput quantitative analysis of microbial growth.

    PubMed

    Bischof, Leanne; Převorovský, Martin; Rallis, Charalampos; Jeffares, Daniel C; Arzhaeva, Yulia; Bähler, Jürg

    2016-10-01

    Microbial colony growth can serve as a useful readout in assays for studying complex genetic interactions or the effects of chemical compounds. Although computational tools for acquiring quantitative measurements of microbial colonies have been developed, their utility can be compromised by inflexible input image requirements, non-trivial installation procedures, or complicated operation. Here, we present the Spotsizer software tool for automated colony size measurements in images of robotically arrayed microbial colonies. Spotsizer features a convenient graphical user interface (GUI), has both single-image and batch-processing capabilities, and works with multiple input image formats and different colony grid types. We demonstrate how Spotsizer can be used for high-throughput quantitative analysis of fission yeast growth. The user-friendly Spotsizer tool provides rapid, accurate, and robust quantitative analyses of microbial growth in a high-throughput format. Spotsizer is freely available at https://data.csiro.au/dap/landingpage?pid=csiro:15330 under a proprietary CSIRO license.

  6. Quantitative Rietveld analysis of CAC clinker phases using synchrotron radiation

    SciTech Connect

    Guirado, F. (E-mail: francesc.guirado@urv.cat); Gali, S.

    2006-11-15

    The quantitative Rietveld analyses of twenty samples of CAC from four different manufacturers around the world, one synthetic mixture and a NIST standard were performed using synchrotron radiation. Compared with conventional XRD, synchrotron powder diffraction made it possible to identify new minor phases, improve the characterization of solid solutions of iron-rich CAC phases, and reduce preferential-orientation and microabsorption effects. Diffraction data were complemented with XRF and TG/DT analyses. The synchrotron results were used as a reference test to improve the performance of conventional powder diffraction through an accurate selection of refinable profile and structural parameters, and several recommendations for conventional quantitative Rietveld procedures were extracted. It is shown that with these recommendations in mind, conventional XRD-based Rietveld analyses are comparable to those obtained from synchrotron data. In summary, quantitative XRD Rietveld analysis is confirmed as an excellent tool for the CAC cement industry.
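
    Quantitative Rietveld analysis converts refined phase scale factors into weight fractions via the Hill-Howard relation W_i = S_i(ZMV)_i / sum_j S_j(ZMV)_j, where Z is the number of formula units per cell, M the formula mass and V the unit-cell volume. A minimal sketch with hypothetical refined values, not the paper's data:

        import numpy as np

        def rietveld_weight_fractions(S, Z, M, V):
            """Hill-Howard quantification: W_i proportional to S_i * Z_i * M_i * V_i."""
            w = np.asarray(S) * np.asarray(Z) * np.asarray(M) * np.asarray(V)
            return w / w.sum()

        # Hypothetical refined values for three clinker phases.
        S = [1.2e-6, 4.0e-7, 9.0e-8]    # Rietveld scale factors
        Z = [12, 4, 2]                  # formula units per unit cell
        M = [158.0, 260.0, 1386.7]      # formula mass, g/mol
        V = [1069.0, 596.0, 2970.0]     # unit-cell volume, angstrom^3

        print(rietveld_weight_fractions(S, Z, M, V))   # fractions sum to 1.0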

  7. Spotsizer: High-throughput quantitative analysis of microbial growth

    PubMed Central

    Jeffares, Daniel C.; Arzhaeva, Yulia; Bähler, Jürg

    2017-01-01

    Microbial colony growth can serve as a useful readout in assays for studying complex genetic interactions or the effects of chemical compounds. Although computational tools for acquiring quantitative measurements of microbial colonies have been developed, their utility can be compromised by inflexible input image requirements, non-trivial installation procedures, or complicated operation. Here, we present the Spotsizer software tool for automated colony size measurements in images of robotically arrayed microbial colonies. Spotsizer features a convenient graphical user interface (GUI), has both single-image and batch-processing capabilities, and works with multiple input image formats and different colony grid types. We demonstrate how Spotsizer can be used for high-throughput quantitative analysis of fission yeast growth. The user-friendly Spotsizer tool provides rapid, accurate, and robust quantitative analyses of microbial growth in a high-throughput format. Spotsizer is freely available at https://data.csiro.au/dap/landingpage?pid=csiro:15330 under a proprietary CSIRO license. PMID:27712582

  8. Low hazard refractive index and density-matched fluid for quantitative imaging of concentrated suspensions of particles

    NASA Astrophysics Data System (ADS)

    Zhu, W.; Knapp, Y.; Deplano, V.

    2016-05-01

    A novel refractive-index- and density-matched liquid-solid suspension system taking chemical hazard and health concerns into account was developed and characterized. The solid phase is made of PMMA spheres, whose refractive index is matched with a mixture of 2,2'-thiodiethanol and phosphate-buffered saline (PBS), while the density is matched with a mixture of PBS and glycerol. The proposed chemicals present low hazard characteristics in comparison with previous solutions. Data collected from density and refractive index measurements of the solid phase and of the different fluid constituents are used to define a specific ternary mixture adapted to commercial-grade micron-size particles. The defined mixture is validated in a micron-sized granular flow experiment. The described method can be applied to other low-density solids.
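
    Under the common assumption that refractive index and density mix linearly with volume fraction, the composition of a ternary mixture matching a target particle reduces to a small linear system. A sketch with illustrative component properties, not the measured values from this study:

        import numpy as np

        # Columns: TDE (2,2'-thiodiethanol), PBS, glycerol -- illustrative values.
        # Rows: refractive index, density (g/mL), volume-fraction closure.
        A = np.array([[1.521, 1.335, 1.473],
                      [1.18,  1.01,  1.26],
                      [1.0,   1.0,   1.0]])

        # Target: PMMA spheres (again, illustrative property values).
        b = np.array([1.489, 1.19, 1.0])

        frac = np.linalg.solve(A, b)
        # A physically realizable match requires all fractions in [0, 1];
        # otherwise the component set must be changed.
        print(frac)   # volume fractions of TDE / PBS / glycerol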

  9. A strategy to apply quantitative epistasis analysis on developmental traits.

    PubMed

    Labocha, Marta K; Yuan, Wang; Aleman-Meza, Boanerges; Zhong, Weiwei

    2017-05-15

    Genetic interactions are key to understanding complex traits and evolution. Epistasis analysis is an effective method to map genetic interactions. Large-scale quantitative epistasis analysis has been well established for single cells. However, there is a substantial lack of such studies in multicellular organisms and their complex phenotypes such as development. Here we present a method to extend quantitative epistasis analysis to developmental traits. In the nematode Caenorhabditis elegans, we applied RNA interference on mutants to inactivate two genes, used an imaging system to quantitatively measure phenotypes, and developed a set of statistical methods to extract genetic interactions from the phenotypic measurements. Using two different C. elegans developmental phenotypes, body length and sex ratio, as examples, we showed that this method can accommodate various metazoan phenotypes with performance comparable to that of methods used in single-cell growth studies. Compared with qualitative observations, this quantitative epistasis method enabled detection of new interactions involving subtle phenotypes. For example, several sex-ratio genes were found to interact with brc-1 and brd-1, the orthologs of the human breast cancer genes BRCA1 and BARD1, respectively. We confirmed the brc-1 interactions with the following genes in the DNA damage response: C34F6.1, him-3 (ortholog of HORMAD1, HORMAD2), sdc-1, and set-2 (ortholog of SETD1A, SETD1B, KMT2C, KMT2D), validating the effectiveness of our method in detecting genetic interactions. We developed a reliable, high-throughput method for quantitative epistasis analysis of developmental phenotypes.
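
    Under the multiplicative model commonly used in quantitative epistasis analysis, the interaction score is the deviation of the double-perturbation phenotype from the product of the single-perturbation phenotypes, all normalized to wild type. A minimal sketch with bootstrap confidence intervals; the measurements are hypothetical, not data from this study:

        import numpy as np

        def epistasis_score(wt, a, b, ab):
            """Multiplicative model: eps = f_ab - f_a * f_b, with phenotypes
            normalized to the wild-type mean."""
            f_a, f_b = a.mean() / wt.mean(), b.mean() / wt.mean()
            f_ab = ab.mean() / wt.mean()
            return f_ab - f_a * f_b

        def bootstrap_ci(wt, a, b, ab, n_boot=5000, seed=2):
            """95% CI on eps by resampling each measurement group with replacement."""
            rng = np.random.default_rng(seed)
            eps = [epistasis_score(*(rng.choice(x, x.size) for x in (wt, a, b, ab)))
                   for _ in range(n_boot)]
            return np.percentile(eps, [2.5, 97.5])

        # Hypothetical body-length data (mm): wild type, single and double RNAi.
        rng = np.random.default_rng(3)
        wt = rng.normal(1.00, 0.05, 50)
        a, b = rng.normal(0.80, 0.05, 50), rng.normal(0.90, 0.05, 50)
        ab = rng.normal(0.60, 0.05, 50)   # below 0.8 * 0.9 -> aggravating interaction

        print(epistasis_score(wt, a, b, ab), bootstrap_ci(wt, a, b, ab))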

  10. Looking Before We Leap: Recent Results From An Ongoing Quantitative Investigation Of Asteroid And Comet Impact Hazard Mitigation.

    NASA Astrophysics Data System (ADS)

    Plesko, Catherine; Weaver, R. P.; Korycansky, D. G.; Huebner, W. F.

    2010-10-01

    The asteroid and comet impact hazard is now part of public consciousness, as demonstrated by movies, Super Bowl commercials, and popular news stories. However, there is a popular misconception that hazard mitigation is a solved problem. Many people think, "we'll just nuke it." There are, however, significant scientific questions remaining in the hazard mitigation problem. Before we can say with certainty that an explosive yield Y at height of burst h will produce a momentum change in or dispersion of a potentially hazardous object (PHO), we need to quantify how and where energy is deposited into the rubble pile or conglomerate that may make up the PHO. We then need to understand how shock waves propagate through the system, what causes them to disrupt, and how long gravitationally bound fragments take to recombine. Here we present numerical models of energy deposition from an energy source into various materials that are known PHO constituents, and rigid-body dynamics models of the recombination of disrupted objects. In the energy deposition models, we explore the effects of porosity and standoff distance as well as that of composition. In the dynamical models, we explore the effects of fragment size and velocity distributions on the time it takes for gravitationally bound fragments to recombine. Initial models indicate that this recombination time is relatively short, as little as 24 hours for a 1-km PHO composed of 1000 meter-scale self-gravitating fragments with an initial velocity field of v/r = 0.001 s^-1.

  11. Quantitative mass spectrometric analysis of glycoproteins combined with enrichment methods.

    PubMed

    Ahn, Yeong Hee; Kim, Jin Young; Yoo, Jong Shin

    2015-01-01

    Mass spectrometry (MS) has been a core technology for highly sensitive and high-throughput analysis of the enriched glycoproteome, both for quantitative assays and for qualitative profiling of glycoproteins. Because it is widely recognized that aberrant glycosylation of a glycoprotein may be involved in the progression of certain diseases, the development of efficient analysis tools for aberrant glycoproteins is very important for a deeper understanding of the pathological function of the glycoprotein and for new biomarker development. This review first describes the protein glycosylation-targeting enrichment technologies, mainly employing solid-phase extraction methods such as hydrazide capturing, lectin-specific capturing, and affinity separation techniques based on porous graphitized carbon, hydrophilic interaction chromatography, or immobilized boronic acid. Second, MS-based quantitative analysis strategies coupled with these enrichment technologies, using label-free MS, stable-isotope labeling, or targeted multiple reaction monitoring (MRM) MS, are summarized with recently published studies.

  12. FIRE HAZARDS ANALYSIS FOR THE FUEL SUPPLY SYSTEM - ESF PACKAGE 1E

    SciTech Connect

    N.M. Ruonavaara

    1995-04-12

    The purpose of the fire hazards analysis is to comprehensively assess the risk from fire within individual fire areas in accordance with U.S. Department of Energy (DOE) Order 5480.7A (Reference 4.4.7.4). This document assesses the fire hazard risk within the Exploratory Studies Facility (ESF) fuel supply system, Package 1E, and evaluates whether the following objectives are met: (1) ensure that property damage from fire and related perils does not exceed an acceptable level; (2) provide input to the facility Safety Analysis Report (SAR).

  13. 40 CFR 68.67 - Process hazard analysis.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... early warning of releases. (Acceptable detection methods might include process monitoring and control... analysis methodology being used. (e) The owner or operator shall establish a system to promptly address the...

  14. 40 CFR 68.67 - Process hazard analysis.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... early warning of releases. (Acceptable detection methods might include process monitoring and control... analysis methodology being used. (e) The owner or operator shall establish a system to promptly address the...

  15. 40 CFR 68.67 - Process hazard analysis.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... early warning of releases. (Acceptable detection methods might include process monitoring and control... analysis methodology being used. (e) The owner or operator shall establish a system to promptly address the...

  16. 40 CFR 68.67 - Process hazard analysis.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... early warning of releases. (Acceptable detection methods might include process monitoring and control... analysis methodology being used. (e) The owner or operator shall establish a system to promptly address the...

  17. 40 CFR 68.67 - Process hazard analysis.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... early warning of releases. (Acceptable detection methods might include process monitoring and control... analysis methodology being used. (e) The owner or operator shall establish a system to promptly address the...

  18. 75 FR 8239 - School Food Safety Program Based on Hazard Analysis and Critical Control Point Principles (HACCP...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-02-24

    ... Critical Control Point Principles (HACCP); Approval of Information Collection Request AGENCY: Food and... rule entitled School Food Safety Program Based on Hazard Analysis and Critical Control Point...

  19. Hazard analysis and critical control point (HACCP) history and conceptual overview.

    PubMed

    Hulebak, Karen L; Schlosser, Wayne

    2002-06-01

    Hazard Analysis and Critical Control Point (HACCP) is a system that enables the production of safe meat and poultry products through thorough analysis of production processes, identification of all hazards that are likely to occur in the production establishment, and identification of the critical points in the process at which these hazards may be introduced into product and therefore should be controlled. It further involves the establishment of critical limits for control at those points, the verification of these prescribed steps, and the methods by which the processing establishment and the regulatory authority can monitor how well process control through the HACCP plan is working. The history of the development of HACCP is reviewed, and examples of practical applications of HACCP are described.

  20. Probabilistic seismic hazard analysis for offshore structures in the Santa Barbara Channel phase 2 report

    SciTech Connect

    Foxall, W; Savy, J

    1999-08-06

    This report summarizes progress through Phase 2 of the probabilistic seismic hazard analysis (PSHA) for the Santa Barbara Channel being carried out by Lawrence Livermore National Laboratory (LLNL) for the Minerals Management Service (MMS) of the US Department of the Interior. The purpose of the PSHA is to provide a basis for MMS to develop regulations governing the evaluation of applications to re-license existing oil platforms in federal waters within the Channel with respect to seismic loading. The final product of the analysis will be hazard maps of ground motion parameters at specified probability levels of exceedance. This report summarizes the characterization of local earthquake sources within the Channel and onshore areas of the Western Transverse Ranges, describes the development of a ground motion attenuation model for the region, and presents preliminary hazard results at three selected sites.

  1. Data from quantitative label free proteomics analysis of rat spleen.

    PubMed

    Dudekula, Khadar; Le Bihan, Thierry

    2016-09-01

    The dataset presented in this work was obtained using a label-free quantitative proteomic analysis of rat spleen. A robust method for extraction of proteins from rat spleen tissue and LC-MS/MS analysis was developed using a urea- and SDS-based buffer. Different fractionation methods were compared. A total of 3484 different proteins were identified from the pool of all experiments run in this study (2460 proteins with at least two peptides). A total of 1822 proteins were identified from nine non-fractionated pulse gels, while 2288 and 2864 proteins were identified by SDS-PAGE fractionation into three and five fractions, respectively. The proteomics data are deposited in the ProteomeXchange Consortium via PRIDE (PXD003520); Progenesis and MaxQuant outputs are presented in the supporting information. The lists of proteins generated under the different fractionation regimes allow assessment of the nature of the identified proteins and of the variability in quantitative analysis associated with the different sampling strategies, and help define a proper number of replicates for future quantitative analyses.

  2. An improved quantitative analysis method for plant cortical microtubules.

    PubMed

    Lu, Yi; Huang, Chenyang; Wang, Jia; Shang, Peng

    2014-01-01

    The arrangement of plant cortical microtubules can reflect the physiological state of cells. However, little attention has so far been paid to quantitative image analysis of plant cortical microtubules. In this paper, the Bidimensional Empirical Mode Decomposition (BEMD) algorithm was applied in the preprocessing of the original microtubule image. The Intrinsic Mode Function 1 (IMF1) image obtained by the decomposition was then selected for texture analysis based on the Grey-Level Co-occurrence Matrix (GLCM) algorithm. In order to further verify its reliability, the proposed texture analysis method was used to distinguish different images of Arabidopsis microtubules. The results showed that the BEMD algorithm preserved edges well while reducing noise, and that the geometrical characteristics of the texture were clear. Four texture parameters extracted by GLCM reflected the different arrangements between the two images of cortical microtubules. In summary, the results indicate that this method is feasible and effective for quantitative image analysis of plant cortical microtubules. It not only provides a new quantitative approach for the comprehensive study of the role played by microtubules in cell life activities but also supplies a reference for other similar studies.
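
    The GLCM texture step can be reproduced in a few lines of numpy. The sketch below builds a grey-level co-occurrence matrix for one pixel offset and extracts four classic Haralick-style parameters (contrast, energy, homogeneity, correlation); it is a generic illustration, not the paper's exact pipeline, which first applies BEMD and analyzes the IMF1 image:

        import numpy as np

        def glcm(img, levels=8, dx=1, dy=0):
            """Symmetric, normalized grey-level co-occurrence matrix for a
            non-negative pixel offset (dx, dy); img must be non-negative."""
            q = (img / (img.max() + 1e-12) * levels).astype(int).clip(0, levels - 1)
            h, w = q.shape
            src, dst = q[:h - dy, :w - dx], q[dy:, dx:]
            P = np.zeros((levels, levels))
            np.add.at(P, (src.ravel(), dst.ravel()), 1)
            P = P + P.T                       # count both offset directions
            return P / P.sum()

        def texture_features(P):
            i, j = np.indices(P.shape)
            mu_i, mu_j = (i * P).sum(), (j * P).sum()
            sd_i = np.sqrt((((i - mu_i) ** 2) * P).sum())
            sd_j = np.sqrt((((j - mu_j) ** 2) * P).sum())
            return {"contrast": ((i - j) ** 2 * P).sum(),
                    "energy": (P ** 2).sum(),
                    "homogeneity": (P / (1 + np.abs(i - j))).sum(),
                    "correlation": ((i - mu_i) * (j - mu_j) * P).sum() / (sd_i * sd_j)}

        # Hypothetical microtubule-like test image: oriented stripes plus noise.
        rng = np.random.default_rng(4)
        y, x = np.mgrid[0:128, 0:128]
        img = np.sin(0.5 * x) + 0.3 * rng.standard_normal((128, 128))
        print(texture_features(glcm(img - img.min())))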

  3. Liquefaction hazard analysis for infrastructure development in gulf of Jakarta

    NASA Astrophysics Data System (ADS)

    Dinata, Indra A.; Darlan, Yudi; Sadisun, Imam A.; Pindratno, Haris; Saryanto, Agus

    2016-05-01

    The Gulf of Jakarta is an area of active sedimentation, with a wide sediment deposition zone on the north coast of Jakarta. These sediments have generally not been consolidated, so ground conditions in this area are an important factor in determining liquefaction potential. Liquefaction may occur during earthquakes that cause loss of strength and stiffness in soils. The analysis of liquefaction potential, based on SPT data taken in the Gulf of Jakarta, covers the susceptibility rating and the triggering factors. Liquefaction analysis methods were compared with each other to obtain the factor of safety against liquefaction according to the characteristics of the soil. Liquefaction analysis at the surface used the susceptibility rating factor (SRF). The SRF method is controlled by four factors: history, geology, composition, and groundwater. Each factor has parameters that determine the value of the SRF. From the analysis, the Gulf of Jakarta has liquefaction susceptibility ratings with SRF values of 12-35, showing that the area is dominated by medium to high susceptibility. High susceptibility to liquefaction is concentrated in the coastal area.
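
    A susceptibility rating factor of this kind is essentially a weighted checklist: each controlling factor group (history, geology, composition, groundwater) contributes parameter scores, and the total is binned into susceptibility classes. A schematic sketch in which the weights, scores, and class boundaries are illustrative, not the study's actual rating scheme:

        def srf_score(ratings):
            """Sum the parameter ratings over all four SRF factor groups."""
            return sum(sum(group.values()) for group in ratings.values())

        def srf_class(score):
            # Illustrative class boundaries spanning the reported 12-35 range.
            if score < 15:
                return "low"
            return "medium" if score < 25 else "high"

        # Hypothetical site in the Gulf of Jakarta coastal zone.
        site = {
            "history":     {"previous_liquefaction": 4, "seismicity": 5},
            "geology":     {"unconsolidated_deposit": 6, "deposit_age": 4},
            "composition": {"grain_size": 5, "fines_content": 3},
            "groundwater": {"water_table_depth": 5},
        }
        score = srf_score(site)
        print(score, srf_class(score))   # 32 -> "high"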

  4. Joint analysis of epistemic and aleatory uncertainty in stability analysis for geo-hazard assessments

    NASA Astrophysics Data System (ADS)

    Rohmer, Jeremy; Verdel, Thierry

    2017-04-01

    Uncertainty analysis is an unavoidable task in the stability analysis of any geotechnical system. Such analysis usually relies on the safety factor SF: if SF falls below some specified threshold, failure is possible. The objective of the stability analysis is then to estimate the failure probability P that SF is below the specified threshold. When dealing with uncertainties, two facets should be considered, as outlined by several authors in the domain of geotechnics: "aleatory uncertainty" (also named "randomness" or "intrinsic variability") and "epistemic uncertainty" (i.e., when facing "vague, incomplete or imprecise information" such as limited databases and observations or "imperfect" modelling). The benefits of separating the two facets of uncertainty can be seen from a risk management perspective: aleatory uncertainty, being a property of the system under study, cannot be reduced, although practical actions can be taken to circumvent the potentially dangerous effects of such variability; epistemic uncertainty, being due to the incomplete/imprecise nature of available information, can be reduced, e.g., by increasing the number of tests (laboratory or site surveys), improving the measurement methods, evaluating calculation procedures with model tests, or confronting more information sources (expert opinions, data from the literature, etc.). Uncertainty treatment in stability analysis is usually restricted to the probabilistic framework for representing both facets of uncertainty. Yet, in the domain of geo-hazard assessments (such as landslides, mine pillar collapse, or rockfalls), the validity of this approach can be debatable. In the present communication, we review the major criticisms in the literature against the systematic use of probability in situations with a high degree of uncertainty. On this basis, the feasibility of a more flexible uncertainty representation tool, namely possibility distributions, is investigated.
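
    The two facets can be illustrated numerically: aleatory variability enters as probability distributions sampled by Monte Carlo, while epistemic uncertainty about a poorly known parameter can be carried as an interval, yielding bounds on the failure probability instead of a single value. A schematic sketch with a toy infinite-slope safety-factor model, not the possibility-theoretic machinery discussed in the communication:

        import numpy as np

        def failure_probability(cohesion_kpa, n=200000, seed=5):
            """P(SF < 1) for a toy infinite-slope model; friction angle and unit
            weight are aleatory (sampled), cohesion is held at an epistemic value."""
            rng = np.random.default_rng(seed)
            phi = np.radians(rng.normal(30, 3, n))   # friction angle, degrees
            gamma = rng.normal(19, 1, n)             # unit weight, kN/m^3
            beta, z = np.radians(35), 4.0            # slope angle, slip depth (m)
            driving = gamma * z * np.sin(beta) * np.cos(beta)
            resisting = cohesion_kpa + gamma * z * np.cos(beta) ** 2 * np.tan(phi)
            return np.mean(resisting / driving < 1.0)

        # An epistemic interval on cohesion maps to an interval on P(SF < 1).
        print([failure_probability(c) for c in (5.0, 15.0)])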

  5. An Analysis of the Vulnerability of Global Drinking Water Access to Climate-related Hazards

    NASA Astrophysics Data System (ADS)

    Elliott, M.; Banerjee, O.; Christenson, E.; Holcomb, D.; Hamrick, L.; Bartram, J.

    2014-12-01

    Global drinking water access targets are formulated around "sustainable access." Global climate change (GCC) and associated hazards threaten the sustainability of drinking water supply. Extensive literature exists on the impacts of GCC on precipitation and water resources. However, the literature lacks a credible analysis of the vulnerability of global drinking water access. This research reports on an analysis of the current vulnerability of drinking water access to three climate-related hazardous events: cyclone, drought and flood. An ArcGIS database was built incorporating the following: population density, hazardous event frequency, drinking water technologies in use, and adaptive capacity. Two global grids were incorporated first: (1) the LandScanTM global population distribution; and (2) the frequency of cyclone, drought and flood from ~1980-2000 from the Columbia University Center for Hazards Risk Research (CHRR). Population density was used to characterize cells as urban or rural, and country-level urban/rural drinking water technologies in use were added based on the WHO/UNICEF Joint Monitoring Programme data. Expert assessments of the resilience of each technology to each hazardous event, based on WHO/DFID Vision 2030, were quantified and added to the database. Finally, country-level adaptive capacity was drawn from the "readiness" parameter of the Global Adaptation Index (GaIn). ArcGIS Model Builder and Python were used to automate the addition of datasets. This presentation reports on the results of this analysis, the first credible attempt to assess the vulnerability of global drinking water access to climate-related hazardous events. The analysis has yielded country-level scores and maps displaying the ranking of exposure scores (for flood, drought, cyclone, and all three in aggregate) and the corresponding country-level vulnerability scores and rankings incorporating the impact of drinking water technologies and adaptive capacity (Figure 1).

  6. Damage functions for climate-related hazards: unification and uncertainty analysis

    NASA Astrophysics Data System (ADS)

    Prahl, Boris F.; Rybski, Diego; Boettle, Markus; Kropp, Jürgen P.

    2016-05-01

    Most climate change impacts manifest in the form of natural hazards. Damage assessment typically relies on damage functions that translate the magnitude of extreme events to a quantifiable damage. In practice, the availability of damage functions is limited due to a lack of data sources and a lack of understanding of damage processes. The study of the characteristics of damage functions for different hazards could strengthen the theoretical foundation of damage functions and support their development and validation. Accordingly, we investigate analogies of damage functions for coastal flooding and for wind storms and identify a unified approach. This approach has general applicability for granular portfolios and may also be applied, for example, to heat-related mortality. Moreover, the unification enables the transfer of methodology between hazards and a consistent treatment of uncertainty. This is demonstrated by a sensitivity analysis on the basis of two simple case studies (for coastal flood and storm damage). The analysis reveals the relevance of the various uncertainty sources at varying hazard magnitude and on both the microscale and the macroscale level. Main findings are the dominance of uncertainty from the hazard magnitude and the persistent behaviour of intrinsic uncertainties on both scale levels. Our results shed light on the general role of uncertainties and provide useful insight for the application of the unified approach.
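
    A unified damage function of the kind discussed can be written as a single parametric curve mapping hazard magnitude to relative damage, with the parameters carrying the uncertainty. A purely illustrative sketch using a logistic form; the paper's actual functional form and parameter values are not reproduced here:

        import numpy as np

        def damage_function(x, x50, k):
            """Relative damage in [0, 1] as a logistic function of hazard
            magnitude x; x50 is the magnitude at 50% damage, k the steepness."""
            return 1.0 / (1.0 + np.exp(-k * (x - x50)))

        # The same form serves two hazards with hazard-specific parameters:
        flood_depth = np.linspace(0.0, 3.0, 7)     # coastal flood, metres
        gust_speed = np.linspace(20.0, 60.0, 7)    # wind storm, m/s
        print(damage_function(flood_depth, x50=1.2, k=3.0))
        print(damage_function(gust_speed, x50=42.0, k=0.25))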

  7. Quantitative analysis of synchrotron radiation intravenous angiographic images

    NASA Astrophysics Data System (ADS)

    Sarnelli, Anna; Nemoz, Christian; Elleaume, Hélène; Estève, François; Bertrand, Bernard; Bravin, Alberto

    2005-02-01

    A medical research protocol on clinical intravenous coronary angiography has been completed at the European Synchrotron Radiation Facility (ESRF) biomedical beamline. The aim was to investigate the accuracy of intravenous coronary angiography based on the K-edge digital subtraction technique for the detection of in-stent restenosis. For each patient, the diagnosis was performed on the synchrotron radiation images and checked against the conventional selective coronary angiography method, taken as the gold standard. In this paper, the methods of image processing and the results of the quantitative analysis are described. Image processing includes beam harmonic contamination correction, spatial deconvolution, and the extraction of a 'contrast' and a 'tissue' image from each pair of radiograms simultaneously acquired at energies bracketing the K-edge of iodine. Quantitative analysis includes the estimation of the vessel diameter, the calculation of absolute iodine concentration profiles along the coronary arteries, and the measurement of the degree of stenosis.
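
    K-edge digital subtraction exploits the jump in the iodine attenuation coefficient at 33.17 keV: log-attenuation images recorded just below and just above the edge give two linear equations per pixel, solved for the iodine and water-equivalent tissue areal densities. A per-pixel sketch with approximate mass attenuation coefficients (indicative values only):

        import numpy as np

        # Approximate mass attenuation coefficients (cm^2/g) just below and
        # above the iodine K-edge at 33.17 keV -- indicative values.
        MU = np.array([[6.6,  0.34],    # E-: [iodine, water]
                       [36.0, 0.33]])   # E+: [iodine, water]

        def kedge_decompose(log_minus, log_plus):
            """Solve, per pixel, -ln(I/I0)_E = mu_I(E)*a_I + mu_w(E)*a_w for
            the areal densities a_I, a_w in g/cm^2."""
            rhs = np.stack([log_minus, log_plus], axis=-1)[..., None]
            sol = np.linalg.solve(MU, rhs)[..., 0]   # broadcast 2x2 solves
            return sol[..., 0], sol[..., 1]          # iodine image, tissue image

        # Synthetic two-pixel check: 5 mg/cm^2 of iodine in a vessel pixel,
        # none in background, 20 cm of water-equivalent tissue in both.
        a_iodine, a_water = np.array([0.005, 0.0]), np.array([20.0, 20.0])
        log_minus = MU[0, 0] * a_iodine + MU[0, 1] * a_water
        log_plus = MU[1, 0] * a_iodine + MU[1, 1] * a_water
        print(kedge_decompose(log_minus, log_plus))   # recovers the inputs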

  8. Hazard Analysis for Pneumatic Flipper Suitport/Z-1 Manned Evaluation, Chamber B, Building 32. Revision: Basic

    NASA Technical Reports Server (NTRS)

    2012-01-01

    One of the characteristics of an effective safety program is the recognition and control of hazards before mishaps or failures occur. Conducting potentially hazardous tests necessitates a thorough hazard analysis in order to protect our personnel from injury and our equipment from damage. The purpose of this hazard analysis is to define and address the potential hazards and controls associated with the Z1 Suit Port Test in Chamber B located in building 32, and to provide the applicable team of personnel with the documented results. It is imperative that each member of the team be familiar with the hazards and controls associated with his/her particular tasks, assignments, and activities while interfacing with facility test systems, equipment, and hardware. The goal of this hazard analysis is to identify all hazards that have the potential to harm personnel and/or damage facility equipment, flight hardware, property, or harm the environment. This analysis may also assess the significance and risk, when applicable, of lost test objectives when substantial monetary value is involved. The hazards, causes, controls, verifications, and risk assessment codes have been documented on the hazard analysis work sheets in appendix A of this document. The preparation and development of this report is in accordance with JPR 1700.1, JSC Safety and Health Handbook.

  9. Hazards Analysis of Large Caliber Munitions Metal Parts Plants.

    DTIC Science & Technology

    1980-10-01

    Furnace types used: rotary hearth (4 plants); roller hearth (1 plant); induction (3 plants); hydraulic... Rotary hearth furnace with atmosphere, 1,000; roller hearth furnace with atmosphere, 1,700; induction heating, 500; batch furnace with atmosphere, 200; roller-type tempering... (3) the typical working environment in the plants. A brief analysis of each is given below. Rotary hearth furnace: the following diagram is

  10. Implementing a Quantitative Analysis Design Tool for Future Generation Interfaces

    DTIC Science & Technology

    2012-03-01

    future MAC-enabled systems. A human-computer interaction (HCI) Index, originally applied to multi-function displays, was applied to the prototype Vigilant Spirit interface. A modified version of the HCI Index was successfully applied to perform a quantitative analysis of the baseline VSCS interface and two modified interface designs. The modified HCI Index incorporates the Hick-Hyman decision time, Fitts' law time, and the physical actions

  11. Quantitative NMR Analysis of Partially Substituted Biodiesel Glycerols

    SciTech Connect

    Nagy, M.; Alleman, T. L.; Dyer, T.; Ragauskas, A. J.

    2009-01-01

    Phosphitylation of hydroxyl groups in biodiesel samples with 2-chloro-4,4,5,5-tetramethyl-1,3,2-dioxaphospholane followed by 31P-NMR analysis provides a rapid quantitative analytical technique for the determination of substitution patterns on partially esterified glycerols. The unique 31P-NMR chemical shift data were established with a series of mono- and di-substituted fatty acid esters of glycerol and then utilized to characterize an industrial sample of partially processed biodiesel.

  12. Example process hazard analysis of a Department of Energy water chlorination process

    SciTech Connect

    Not Available

    1993-09-01

    On February 24, 1992, the Occupational Safety and Health Administration (OSHA) released a revised version of Section 29 Code of Federal Regulations CFR Part 1910 that added Section 1910.119, entitled "Process Safety Management of Highly Hazardous Chemicals" (the PSM Rule). Because US Department of Energy (DOE) Orders 5480.4 and 5483.1A prescribe OSHA 29 CFR 1910 as a standard in DOE, the PSM Rule is mandatory in the DOE complex. A major element in the PSM Rule is the process hazard analysis (PrHA), which is required for all chemical processes covered by the PSM Rule. The PrHA element of the PSM Rule requires the selection and application of appropriate hazard analysis methods to systematically identify hazards and potential accident scenarios associated with processes involving highly hazardous chemicals (HHCs). The analysis in this report is an example PrHA performed to meet the requirements of the PSM Rule. The PrHA method used in this example is the hazard and operability (HAZOP) study, and the process studied is the new Hanford 300-Area Water Treatment Facility chlorination process, which is currently in the design stage. The HAZOP study was conducted on May 18-21, 1993, by a team from the Westinghouse Hanford Company (WHC), Battelle-Columbus, the DOE, and Pacific Northwest Laboratory (PNL). The chlorination process was chosen as the example process because it is common to many DOE sites, and because quantities of chlorine at those sites generally exceed the OSHA threshold quantities (TQs).

  13. 30 CFR 250.1911 - What hazards analysis criteria must my SEMS program meet?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... criminal penalty; (iii) Control technology applicable to the operation your hazards analysis is evaluating; and (iv) A qualitative evaluation of the possible safety and health effects on employees, and potential impacts to the human and marine environments, which may result if the control technology fails....

  14. 30 CFR 250.1911 - What hazards analysis criteria must my SEMS program meet?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... criminal penalty; (iii) Control technology applicable to the operation your hazards analysis is evaluating; and (iv) A qualitative evaluation of the possible safety and health effects on employees, and potential impacts to the human and marine environments, which may result if the control technology fails....

  15. Using bioluminescent biosensors for hazard analysis and critical control point (HACCP) in wastewater control.

    PubMed

    Valat, C; Champiat, D; Degorce-Dumas, J R; Thomas, O

    2004-01-01

    Starting from a new approach to water pollution control and wastewater treatment plant management, the hazard analysis and critical control point (HACCP) quality concept, the interest in developing new rapid and sensitive methods such as bioluminescence-based methods is evident. After an introduction to the HACCP procedure, a bibliographic study of the potential of bioluminescence-based methods is presented and discussed.

  16. Comprehensive Quantitative Analysis of SQ Injection Using Multiple Chromatographic Technologies.

    PubMed

    Chau, Siu-Leung; Huang, Zhi-Bing; Song, Yan-Gang; Yue, Rui-Qi; Ho, Alan; Lin, Chao-Zhan; Huang, Wen-Hua; Han, Quan-Bin

    2016-08-19

    Quality control of Chinese medicine injections remains a challenge due to our poor knowledge of their complex chemical profiles. This study aims to investigate the chemical composition of one of the best-selling injections, Shenqi Fuzheng (SQ) injection (SQI), via a full component quantitative analysis. A total of 15 representative small-molecule components of SQI were simultaneously determined using ultra-high performance liquid chromatography (UHPLC) coupled with quadrupole tandem time-of-flight mass spectrometry (Q-TOF-MS); the saccharide composition of SQI was also quantitatively determined by high performance liquid chromatography (HPLC) with an evaporative light scattering detector (ELSD) on an amino column before and after acid hydrolysis. The existence of polysaccharides was also examined on a gel permeation chromatography column. The method was well validated in terms of linearity, sensitivity, precision, accuracy and stability, and was successfully applied to analyze 13 SQI samples. The results demonstrate that up to 94.69% (w/w) of this injection product is quantitatively determined, in which small molecules and monosaccharides/sucrose account for 0.18%-0.21% and 53.49%-58.2%, respectively. The quantitative information contributes to the accumulation of scientific evidence for a better understanding of the efficacy and safety of complex Chinese medicine injections.

  17. Quantitative Proteomic Approaches for Analysis of Protein S-Nitrosylation.

    PubMed

    Qu, Zhe; Greenlief, C Michael; Gu, Zezong

    2016-01-04

    S-Nitrosylation is a redox-based post-translational modification of a protein in response to nitric oxide (NO) signaling, and it participates in a variety of processes in diverse biological systems. The significance of this type of protein modification in health and diseases is increasingly recognized. In the central nervous system, aberrant S-nitrosylation, due to excessive NO production, is known to cause protein misfolding, mitochondrial dysfunction, transcriptional dysregulation, and neuronal death. This leads to an altered physiological state and consequently contributes to pathogenesis of neurodegenerative disorders. To date, much effort has been made to understand the mechanisms underlying protein S-nitrosylation, and several approaches have been developed to unveil S-nitrosylated proteins from different organisms. Interest in determining the dynamic changes of protein S-nitrosylation under different physiological and pathophysiological conditions has underscored the need for the development of quantitative proteomic approaches. Currently, both gel-based and gel-free mass spectrometry-based quantitative methods are widely used, and they each have advantages and disadvantages but may also be used together to produce complementary data. This review evaluates current available quantitative proteomic techniques for the analysis of protein S-nitrosylation and highlights recent advances, with emphasis on applications in neurodegenerative diseases. An important goal is to provide a comprehensive guide of feasible quantitative proteomic methodologies for examining protein S-nitrosylation in research to yield insights into disease mechanisms, diagnostic biomarkers, and drug discovery.

  18. Quantitative analysis of the heterogeneous population of endocytic vesicles.

    PubMed

    Kozlov, Konstantin; Kosheverova, Vera; Kamentseva, Rimma; Kharchenko, Marianna; Sokolkova, Alena; Kornilova, Elena; Samsonova, Maria

    2017-03-07

    The quantitative characterization of endocytic vesicles in images acquired with a microscope is critically important for deciphering endocytosis mechanisms. Image segmentation is the most important step of quantitative image analysis. In spite of the availability of many segmentation methods, accurate segmentation is challenging when the images are heterogeneous with respect to object shapes and signal intensities, which is typical for images of endocytic vesicles. We present a Morphological reconstruction and Contrast mapping segmentation method (MrComas) for the segmentation of the endocytic vesicle population that copes with this heterogeneity in shape and intensity. The method uses morphological opening and closing by reconstruction in the vicinity of local minima and maxima, respectively, thus creating strong contrast between their basins of attraction. As a consequence, the intensity is flattened within the objects and their edges are enhanced. The method accurately recovered quantitative characteristics of synthetic images that preserve characteristic features of the endocytic vesicle population. In benchmarks and quantitative comparisons with two other popular segmentation methods, namely manual thresholding and the Squash plugin, MrComas showed the best segmentation results on real biological images of EGFR (epidermal growth factor receptor) endocytosis. As a proof of feasibility, the method was applied to quantify the dynamic behavior of Early Endosomal Autoantigen 1 (EEA1)-positive endosome subpopulations during EGF-stimulated endocytosis.

  19. Probabilistic seismic hazard analysis (PSHA) for Ethiopia and the neighboring region

    NASA Astrophysics Data System (ADS)

    Ayele, Atalay

    2017-10-01

    Seismic hazard calculation is carried out for the Horn of Africa region (0°-20°N and 30°-50°E) based on the probabilistic seismic hazard analysis (PSHA) method. The earthquake catalogue data obtained from different sources were compiled, homogenized to the Mw magnitude scale, and declustered to remove dependent events as required by the Poisson earthquake source model. The seismotectonic map of the study area available from recent studies is used for area source zonation. For assessing the seismic hazard, the study area was divided into small grids of size 0.5° × 0.5°, and the hazard parameters were calculated at the center of each grid cell by considering contributions from all seismic sources. Peak ground acceleration (PGA) corresponding to 10% and 2% probability of exceedance in 50 years was calculated for all grid points assuming a generic rock site with Vs = 760 m/s. The obtained values vary from 0.0 to 0.18 g and from 0.0 to 0.35 g for the 475- and 2475-year return periods, respectively. The corresponding contour maps showing the spatial variation of PGA values for the two return periods are presented here. A uniform hazard response spectrum (UHRS) for 10% and 2% probability of exceedance in 50 years, and hazard curves for PGA and 0.2 s spectral acceleration (Sa), all at rock sites, are developed for the city of Addis Ababa. The hazard map of this study corresponding to the 475-year return period has already been used to update and produce the 3rd-generation building code of Ethiopia.
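
    The per-cell calculation described here follows the classical Cornell-McGuire recipe: for each source, integrate the probability of exceeding a ground-motion level over the magnitude distribution, sum the resulting annual rates over sources, and convert the rates to 50-year probabilities of exceedance with the Poisson assumption. A compact sketch with one truncated Gutenberg-Richter source and a toy ground-motion model; all parameter values are hypothetical:

        import numpy as np
        from scipy import stats

        def hazard_curve(pga_levels, rate_m5=0.5, b=1.0, mmin=5.0, mmax=7.5, r_km=30.0):
            """Annual exceedance rate lambda(PGA > a) for one area source."""
            m = np.linspace(mmin, mmax, 400)
            dm = m[1] - m[0]
            # Truncated Gutenberg-Richter magnitude pdf.
            beta = b * np.log(10.0)
            pdf = beta * np.exp(-beta * (m - mmin)) / (1 - np.exp(-beta * (mmax - mmin)))
            # Toy GMPE: ln PGA(g) = -3.5 + 1.0*M - 1.2*ln(R + 10), sigma_ln = 0.6.
            ln_med = -3.5 + 1.0 * m - 1.2 * np.log(r_km + 10.0)
            lam = [rate_m5 * np.sum((1 - stats.norm.cdf(np.log(a), ln_med, 0.6)) * pdf) * dm
                   for a in pga_levels]
            return np.array(lam)

        levels = np.array([0.05, 0.10, 0.20, 0.35])
        lam = hazard_curve(levels)
        print(1 - np.exp(-lam * 50))   # Poisson 50-year probabilities of exceedance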

  20. ALARA Analysis of Radiological Control Criteria Associated with Alternatives for Disposal of Hazardous Wastes

    SciTech Connect

    Aaberg, Rosanne L.; Bilyard, Gordon R.; Branch, Kristi M.; Lavender, Jay C.; Miller, Peter L.

    2002-05-15

    This ALARA analysis of Radiological Control Criteria (RCC) considers alternatives to continued storage of certain DOE mixed wastes. It also considers the option of treating hazardous wastes generated by DOE facilities, which have a very low concentration of radionuclide contaminants, as purely hazardous waste. Alternative allowable contaminant levels examined correspond to doses to an individual ranging from 0.01 mrem/yr to 10 to 20 mrem/yr. Generic waste inventory data and radionuclide source terms are used in the assessment. Economic issues, potential health and safety issues, and qualitative factors relating to the use of RCCs are considered.

  1. Hazardous Waste Minimization through Life Cycle Cost Analysis at Federal Facilities.

    PubMed

    Ray, Chittaranjan; Jain, Ravi K; Donahue, Bernard A; Smith, E Dean

    1999-01-01

    In response to the Hazardous and Solid Waste Amendments, many federal facilities are carefully examining hazardous waste minimization issues. A hazardous waste minimization assessment was conducted recently at five military installations. Sources and types of waste somewhat varied among the installations. Major waste sources included motor pools and vehicle maintenance facilities; industrial maintenance and small arms shops; aviation maintenance facilities; paint shops; photography, arts and crafts shops; and hospitals, clinics, and laboratories. Typical waste streams included used engine oil, cleaning and degreasing solvents, paint thinners, antifreeze and coolants, batteries, inks, and pathological wastes. Source reduction, recycling, and treatment were considered as the three major modes of waste minimization. Through life-cycle cost analysis, feasible modes of waste minimization are presented.

  2. Spatial temporal analysis of urban heat hazard in Tangerang City

    NASA Astrophysics Data System (ADS)

    Wibowo, Adi; Kuswantoro; Ardiansyah; Rustanto, Andry; Putut Ash Shidiq, Iqbal

    2016-11-01

    Urban heat is a natural phenomenon that might be caused by human activities. Human activities are represented by various types of land use, such as urban and non-urban areas. The aim of this study is to identify urban heat behavior in Tangerang City, as it might threaten the urban environment. This study used three types of remote sensing data, namely Landsat TM, Landsat ETM+ and Landsat OLI-TIRS, to capture urban heat behavior and to analyze the urban heat signature of Tangerang City in 2001, 2012, 2013, 2014, 2015 and 2016. The results showed that the urban heat signature changes dynamically each month depending on solar radiation. The urban heat island covered only a small part of Tangerang City in 2001, but it increased significantly and reached 50% of the area in 2012. Based on the urban heat signature results, the threshold for a threatening condition is 30 °C, recognized from land surface temperature (LST). The effective temperature (ET) index describes this condition as warm and uncomfortable, increasing stress through sweating and blood flow and possibly causing cardiovascular disorders.

  3. Quantitative Phosphoproteomic Analysis of T-Cell Receptor Signaling.

    PubMed

    Ahsan, Nagib; Salomon, Arthur R

    2017-01-01

    TCR signaling critically depends on protein phosphorylation across many proteins. Localization of each phosphorylation event relative to the T-cell receptor (TCR) and canonical T-cell signaling proteins will provide clues about the structure of TCR signaling networks. Quantitative phosphoproteomic analysis by mass spectrometry provides a wide-scale view of cellular phosphorylation networks. However, analysis of phosphorylation by mass spectrometry is still challenging due to the low abundance of phosphorylated proteins relative to all proteins and the extraordinary diversity of phosphorylation sites across the proteome. Highly selective enrichment of phosphorylated peptides is essential to provide the most comprehensive view of the phosphoproteome. Optimization of phosphopeptide enrichment methods coupled with highly sensitive mass spectrometry workflows significantly improves the sequencing depth of the phosphoproteome to over 10,000 unique phosphorylation sites from complex cell lysates. Here we describe a step-by-step method for phosphoproteomic analysis that has achieved widespread success for identification of serine, threonine, and tyrosine phosphorylation. Reproducible quantification of relative phosphopeptide abundance is provided by intensity-based label-free quantitation. An ideal set of mass spectrometry analysis parameters is also provided that optimizes the yield of identified sites. We also provide guidelines for the bioinformatic analysis of this type of data to assess the quality of the data and to comply with proteomic data reporting requirements.

  4. A Quantitative Method for Microtubule Analysis in Fluorescence Images.

    PubMed

    Lan, Xiaodong; Li, Lingfei; Hu, Jiongyu; Zhang, Qiong; Dang, Yongming; Huang, Yuesheng

    2015-12-01

    Microtubule analysis is of significant value for a better understanding of normal and pathological cellular processes. Although immunofluorescence microscopic techniques have proven useful in the study of microtubules, comparative results commonly rely on a descriptive and subjective visual analysis. We developed an objective and quantitative method based on image processing and analysis of fluorescently labeled microtubular patterns in cultured cells. We used a multi-parameter approach by analyzing four quantifiable characteristics to compose our quantitative feature set. Then we interpreted specific changes in the parameters and revealed the contribution of each feature set using principal component analysis. In addition, we verified that different treatment groups could be clearly discriminated using principal components of the multi-parameter model. High predictive accuracy of four commonly used multi-classification methods confirmed our method. These results demonstrated the effectiveness and efficiency of our method in the analysis of microtubules in fluorescence images. Application of the analytical methods presented here provides information concerning the organization and modification of microtubules, and could aid in the further understanding of structural and functional aspects of microtubules under normal and pathological conditions.

  5. What Really Happens in Quantitative Group Research? Results of a Content Analysis of Recent Quantitative Research in "JSGW"

    ERIC Educational Resources Information Center

    Boyle, Lauren H.; Whittaker, Tiffany A.; Eyal, Maytal; McCarthy, Christopher J.

    2017-01-01

    The authors conducted a content analysis on quantitative studies published in "The Journal for Specialists in Group Work" ("JSGW") between 2012 and 2015. This brief report provides a general overview of the current practices of quantitative group research in counseling. The following study characteristics are reported and…

  6. CyberShake: Broadband Physics-Based Probabilistic Seismic Hazard Analysis in Southern California

    NASA Astrophysics Data System (ADS)

    Callaghan, S.; Maechling, P. J.; Milner, K.; Graves, R. W.; Donovan, J.; Wang, F.; Jordan, T. H.

    2012-12-01

    Researchers at the Southern California Earthquake Center (SCEC) have developed and used the CyberShake computational platform to perform probabilistic seismic hazard analysis (PSHA) in the Los Angeles region (Graves et al., 2010), using deterministic wave propagation simulations at frequencies up to 0.5 Hz combined with stochastic methods to produce broadband seismograms up to 10 Hz. CyberShake uses seismic reciprocity to calculate synthetic seismograms for a suite of more than 600,000 rupture realizations. From this set of seismograms we compute intensity measures, which are then combined into a PSHA hazard curve for the site of interest at various periods. With the CyberShake computational platform, we have computed broadband hazard curves for locations around Southern California, including precariously balanced rock sites and locations of Southern California Seismic Network stations. Additionally, for each location we calculated hazard curves with two different community velocity models, Community Velocity Model - Harvard (CVM-H) v11.2 and Community Velocity Model - SCEC (CVM-S) v11.2. At lower frequencies, hazard levels computed with CVM-H for sites within the deep LA basin are lower than those computed with CVM-S. On the other hand, sites within the Ventura basin show the opposite trend. We interpret these results to be related to the underlying nature of the velocity models, which we are continuing to investigate. At higher frequencies, the CyberShake results tend to be lower than hazard levels computed with traditional ground motion prediction equations (GMPEs), possibly due to the long tails of the GMPEs; we will report on these results. Additionally, we will describe ways these results are being used by the SCEC community, such as in earthquake early warning, precarious rock analysis, and directivity-basin coupling.

  7. Quantitative multivariate analysis of dynamic multicellular morphogenic trajectories.

    PubMed

    White, Douglas E; Sylvester, Jonathan B; Levario, Thomas J; Lu, Hang; Streelman, J Todd; McDevitt, Todd C; Kemp, Melissa L

    2015-07-01

    Interrogating fundamental cell biology principles that govern tissue morphogenesis is critical to better understanding of developmental biology and engineering novel multicellular systems. Recently, functional micro-tissues derived from pluripotent embryonic stem cell (ESC) aggregates have provided novel platforms for experimental investigation; however elucidating the factors directing emergent spatial phenotypic patterns remains a significant challenge. Computational modelling techniques offer a unique complementary approach to probe mechanisms regulating morphogenic processes and provide a wealth of spatio-temporal data, but quantitative analysis of simulations and comparison to experimental data is extremely difficult. Quantitative descriptions of spatial phenomena across multiple systems and scales would enable unprecedented comparisons of computational simulations with experimental systems, thereby leveraging the inherent power of computational methods to interrogate the mechanisms governing emergent properties of multicellular biology. To address these challenges, we developed a portable pattern recognition pipeline consisting of: the conversion of cellular images into networks, extraction of novel features via network analysis, and generation of morphogenic trajectories. This novel methodology enabled the quantitative description of morphogenic pattern trajectories that could be compared across diverse systems: computational modelling of multicellular structures, differentiation of stem cell aggregates, and gastrulation of cichlid fish. Moreover, this method identified novel spatio-temporal features associated with different stages of embryo gastrulation, and elucidated a complex paracrine mechanism capable of explaining spatiotemporal pattern kinetic differences in ESC aggregates of different sizes.

  8. Cell poking: quantitative analysis of indentation of thick viscoelastic layers.

    PubMed

    Duszyk, M; Schwab, B; Zahalak, G I; Qian, H; Elson, E L

    1989-04-01

    A recently introduced device, the cell poker, measures the force required to indent the exposed surface of a cell adherent to a rigid substratum. The cell poker has provided phenomenological information about the viscoelastic properties of several different types of cells, about mechanical changes triggered by external stimuli, and about the role of the cytoskeleton in these mechanical functions. Except in special cases, however, it has not been possible to extract quantitative estimates of viscosity and elasticity moduli from cell poker measurements. This paper presents cell poker measurements of well characterized viscoelastic polymeric materials, polydimethylsiloxanes of different degrees of polymerization, in a simple shape, a flat, thick layer, which for our purposes can be treated as a half space. Analysis of the measurements in terms of a linear viscoelasticity theory yields viscosity values for three polymer samples in agreement with those determined by measurements on a macroscopic scale. Theoretical analysis further indicates that the measured limiting static elasticity of the layers may result from the tension generated at the interface between the polymer and water. This work demonstrates the possibility of obtaining quantitative viscoelastic material properties from cell poker measurements and represents the first step in extending these quantitative studies to more complicated structures including cells.

  9. Cell poking: quantitative analysis of indentation of thick viscoelastic layers.

    PubMed Central

    Duszyk, M; Schwab, B; Zahalak, G I; Qian, H; Elson, E L

    1989-01-01

    A recently introduced device, the cell poker, measures the force required to indent the exposed surface of a cell adherent to a rigid substratum. The cell poker has provided phenomenological information about the viscoelastic properties of several different types of cells, about mechanical changes triggered by external stimuli, and about the role of the cytoskeleton in these mechanical functions. Except in special cases, however, it has not been possible to extract quantitative estimates of viscosity and elasticity moduli from cell poker measurements. This paper presents cell poker measurements of well characterized viscoelastic polymeric materials, polydimethylsiloxanes of different degrees of polymerization, in a simple shape, a flat, thick layer, which for our purposes can be treated as a half space. Analysis of the measurements in terms of a linear viscoelasticity theory yields viscosity values for three polymer samples in agreement with those determined by measurements on a macroscopic scale. Theoretical analysis further indicates that the measured limiting static elasticity of the layers may result from the tension generated at the interface between the polymer and water. This work demonstrates the possibility of obtaining quantitative viscoelastic material properties from cell poker measurements and represents the first step in extending these quantitative studies to more complicated structures including cells. PMID:2720066

  10. Mini-Column Ion-Exchange Separation and Atomic Absorption Quantitation of Nickel, Cobalt, and Iron: An Undergraduate Quantitative Analysis Experiment.

    ERIC Educational Resources Information Center

    Anderson, James L.; And Others

    1980-01-01

    Presents an undergraduate quantitative analysis experiment, describing an atomic absorption quantitation scheme that is fast, sensitive and comparatively simple relative to other titration experiments. (CS)

  11. Mini-Column Ion-Exchange Separation and Atomic Absorption Quantitation of Nickel, Cobalt, and Iron: An Undergraduate Quantitative Analysis Experiment.

    ERIC Educational Resources Information Center

    Anderson, James L.; And Others

    1980-01-01

    Presents an undergraduate quantitative analysis experiment, describing an atomic absorption quantitation scheme that is fast, sensitive, and simple compared with titration experiments. (CS)

  12. Quantitative analysis of endocytosis with cytoplasmic pHluorin chimeras.

    PubMed

    Prosser, Derek C; Whitworth, Karen; Wendland, Beverly

    2010-09-01

    The pH-sensitive green fluorescent protein (GFP) variant pHluorin is typically fused to the extracellular domain of transmembrane proteins to monitor endocytosis. Here, we have turned pHluorin inside-out, and show that cytoplasmic fusions of pHluorin are effective quantitative reporters for endocytosis and multivesicular body (MVB) sorting. In yeast in particular, fusion of GFP and its variants on the extracellular side of transmembrane proteins can result in perturbed trafficking. In contrast, cytoplasmic fusions are well tolerated, allowing for the quantitative assessment of trafficking of virtually any transmembrane protein. Quenching of degradation-resistant pHluorin in the acidic vacuole permits quantification of extravacuolar cargo proteins at steady-state levels and is compatible with kinetic analysis of endocytosis in live cells.

  13. Biomechanical cell analysis using quantitative phase imaging (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Wax, Adam; Park, Han Sang; Eldridge, William J.

    2016-03-01

    Quantitative phase imaging provides nanometer-scale sensitivity and has previously been used to study spectral and temporal characteristics of individual cells in vitro, especially red blood cells. Here we extend this work to study the mechanical responses of individual cells to external stimuli. Cell stiffness may be characterized by analyzing the inherent thermal fluctuations of cells, but applying external stimuli yields additional information. The time-dependent response of cells to external shear stress is examined with high-speed quantitative phase imaging and found to exhibit characteristics that relate to their stiffness. Analysis beyond the cellular scale also reveals the internal organization of the cell and its modulation by pathologic processes such as carcinogenesis. Further studies with microfluidic platforms point the way toward using this approach in high-throughput assays.

  14. [Simultaneous quantitative analysis of four lignanoids in Schisandra chinensis by quantitative analysis of multi-components by single marker].

    PubMed

    He, Feng-Cheng; Li, Shou-Xin; Zhao, Zhi-Quan; Dong, Jin-Ping; Liu, Wu-Zhan; Su, Rui-Qiang

    2012-07-01

    The aim of the study is to establish a new method of quality evaluation and validate its feasibility by the simultaneous quantitative assay of four lignanoids in Schisandra chinensis. A new quality evaluation method, quantitative analysis of multi-components by single marker (QAMS), was established and validated with Schisandra chinensis. Four main lignanoids, schisandrin, schisantherin A, deoxyschizandrin and gamma-schizandrin, were selected as analytes to evaluate the quality, with schisandrin as the internal reference substance. Their contents in 13 different batches of samples were determined by both the external standard method and QAMS. The method was evaluated by comparing the quantitative results of the two approaches. No significant differences were found in the quantitative results of the four lignanoids in the 13 batches of S. chinensis. QAMS is feasible for the simultaneous determination of the four lignanoids when some authentic standard substances are unavailable, and the developed method can be used for quality control of S. chinensis.

  15. Hazard analysis and critical control point systems applied to public health risks: the example of seafood.

    PubMed

    Williams, R A; Zorn, D J

    1997-08-01

    The authors describe the way in which the two components of risk analysis--risk assessment and risk management--can be used in conjunction with the hazard analysis and critical control points concept to determine the allocation of resources at potential critical control points. This approach is examined in the context of risks to human health associated with seafood, and in particular with regard to ciguatera poisoning.

  16. Hazardous Materials Management System Validation: A Comparison of the Hazard Analysis and Resource Inventories Conducted by Multnomah County and the City of Portland.

    DTIC Science & Technology

    1982-03-01

    Analysis and Resource Inventories Conducted by Multnomah County and the City of Portland Final Report for the Federal Emergency Management Agency Washington...analysis and resource inventory methodologies developed by the Multnomah County Office of Emergency Management. The hazard analysis and resource... emergency planning and response jurisdiction of the City of Portland. The Portland Office of Emergency Management is, therefore, in a unique posi

  17. 78 FR 64425 - Current Good Manufacturing Practice and Hazard Analysis and Risk-Based Preventive Controls for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-10-29

    ... Manufacturing Practice and Hazard Analysis and Risk- Based Preventive Controls for Food for Animals; Public... proposed rule to establish requirements for current good manufacturing practice and hazard analysis and... that will establish the foundation of, and central framework for, the modern food safety system...

  18. Quantitative 3D analysis of huge nanoparticle assemblies

    NASA Astrophysics Data System (ADS)

    Zanaga, Daniele; Bleichrodt, Folkert; Altantzis, Thomas; Winckelmans, Naomi; Palenstijn, Willem Jan; Sijbers, Jan; de Nijs, Bart; van Huis, Marijn A.; Sánchez-Iglesias, Ana; Liz-Marzán, Luis M.; van Blaaderen, Alfons; Joost Batenburg, K.; Bals, Sara; van Tendeloo, Gustaaf

    2015-12-01

    Nanoparticle assemblies can be investigated in 3 dimensions using electron tomography. However, it is not straightforward to obtain quantitative information such as the number of particles or their relative position. This becomes particularly difficult when the number of particles increases. We propose a novel approach in which prior information on the shape of the individual particles is exploited. It improves the quality of the reconstruction of these complex assemblies significantly. Moreover, this quantitative Sparse Sphere Reconstruction approach yields directly the number of particles and their position as an output of the reconstruction technique, enabling a detailed 3D analysis of assemblies with as many as 10 000 particles. The approach can also be used to reconstruct objects based on a very limited number of projections, which opens up possibilities to investigate beam sensitive assemblies where previous reconstructions with the available electron tomography techniques failed.

  19. Quantitative MRI for analysis of peritumoral edema in malignant gliomas

    PubMed Central

    Warntjes, J. B. Marcel; Smedby, Örjan; Lundberg, Peter

    2017-01-01

    Background and purpose Damage to the blood-brain barrier with subsequent contrast enhancement is a hallmark of glioblastoma. Non-enhancing tumor invasion into the peritumoral edema is, however, not usually visible on conventional magnetic resonance imaging. New quantitative techniques using relaxometry offer additional information about tissue properties. The aim of this study was to evaluate longitudinal relaxation R1, transverse relaxation R2, and proton density in the peritumoral edema in a group of patients with malignant glioma before surgery to assess whether relaxometry can detect changes not visible on conventional images. Methods In a prospective study, 24 patients with suspected malignant glioma were examined before surgery. A standard MRI protocol was used with the addition of a quantitative MR method (MAGIC), which measured R1, R2, and proton density. The diagnosis of malignant glioma was confirmed after biopsy/surgery. In 19 patients synthetic MR images were then created from the MAGIC scan, and ROIs were placed in the peritumoral edema to obtain the quantitative values. Dynamic susceptibility contrast perfusion was used to obtain cerebral blood volume (rCBV) data of the peritumoral edema. Voxel-based statistical analysis was performed using a mixed linear model. Results R1, R2, and rCBV decrease with increasing distance from the contrast-enhancing part of the tumor. There is a significant increase in R1 gradient after contrast agent injection (P < .0001). There is a heterogeneous pattern of relaxation values in the peritumoral edema adjacent to the contrast-enhancing part of the tumor. Conclusion Quantitative analysis with relaxometry of peritumoral edema in malignant gliomas detects tissue changes not visualized on conventional MR images. The finding of decreasing R1 and R2 means shorter relaxation times closer to the tumor, which could reflect tumor invasion into the peritumoral edema. However, these findings need to be validated in the future. PMID

  20. Natural Hazard Susceptibility Assessment for Road Planning Using Spatial Multi-Criteria Analysis.

    PubMed

    Karlsson, Caroline S J; Kalantari, Zahra; Mörtberg, Ulla; Olofsson, Bo; Lyon, Steve W

    2017-08-18

    Inadequate infrastructural networks can be detrimental to society if transport between locations becomes hindered or delayed, especially due to natural hazards, which are difficult to control. Determining areas susceptible to natural hazards and incorporating them in the initial planning process may therefore reduce infrastructural damage in the long run. The objective of this study was to evaluate the usefulness of expert judgments for assessing natural hazard susceptibility through a spatial multi-criteria analysis approach using hydrological, geological, and land use factors. To utilize spatial multi-criteria analysis for decision support, an analytic hierarchy process was adopted in which expert judgments were evaluated individually and in an aggregated manner. The susceptibility estimates were then compared with those from weighted linear combination using equal weights and from the factor interaction method. Results showed that inundation received the highest susceptibility. Expert judgment performed almost the same as equal weighting; the difference in susceptibility between the two for inundation was around 4%. The results also showed that downscaling could negatively affect the susceptibility assessment and be highly misleading. Susceptibility assessment through spatial multi-criteria analysis is useful for decision support in early road planning, despite its sensitivity to the selection and use of decision rules and criteria. A natural hazard spatial multi-criteria analysis could be used to indicate areas where more investigation is needed from a natural hazard point of view, and to identify areas of higher susceptibility along existing roads where mitigation measures could be targeted after in-situ investigations.
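
    As an illustration of the weighted linear combination step described above, the following Python sketch combines criterion rasters under expert-derived and equal weights and compares the resulting susceptibility maps. The rasters, the number of criteria, and the weight values are hypothetical stand-ins, not data from the study.

      import numpy as np

      rng = np.random.default_rng(0)
      # Four criterion rasters (e.g. hydrological, geological, land-use
      # factors), already scored on a common 0-1 susceptibility scale.
      criteria = rng.random((4, 200, 200))

      w_expert = np.array([0.4, 0.3, 0.2, 0.1])  # assumed aggregated expert weights
      w_equal = np.full(4, 0.25)                 # equal weighting, as in the comparison

      # Weighted linear combination: per-pixel weighted sum of the criteria.
      s_expert = np.tensordot(w_expert, criteria, axes=1)
      s_equal = np.tensordot(w_equal, criteria, axes=1)

      diff = np.abs(s_expert - s_equal).mean() / s_equal.mean()
      print(f"mean relative difference between the two maps: {diff:.1%}")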

  1. Quantitative analysis of in vivo confocal microscopy images: a review.

    PubMed

    Patel, Dipika V; McGhee, Charles N

    2013-01-01

    In vivo confocal microscopy (IVCM) is a non-invasive method of examining the living human cornea. The recent trend towards quantitative studies using IVCM has led to the development of a variety of methods for quantifying image parameters. When selecting IVCM images for quantitative analysis, it is important to be consistent regarding the location, depth, and quality of images. All images should be de-identified, randomized, and calibrated prior to analysis. Numerous image analysis software packages are available, each with its own advantages and disadvantages. Criteria for analyzing corneal epithelium, sub-basal nerves, keratocytes, endothelium, and immune/inflammatory cells have been developed, although there is inconsistency among research groups regarding parameter definitions. The quantification of stromal nerve parameters, however, remains a challenge. Most studies report lower inter-observer repeatability compared with intra-observer repeatability, and observer experience is known to be an important factor. Standardization of IVCM image analysis through the use of a reading center would be crucial for any future large, multi-centre clinical trials using IVCM.

  2. A GIS-based software for lifeline reliability analysis under seismic hazard

    NASA Astrophysics Data System (ADS)

    Sevtap Selcuk-Kestel, A.; Sebnem Duzgun, H.; Oduncuoglu, Lutfi

    2012-05-01

    Lifelines are vital networks, and it is important that those networks remain functional after major natural disasters such as earthquakes. Assessing the reliability of lifelines requires spatial analysis of the networks with respect to a given earthquake hazard map. In this paper, a GIS-based software for the spatial assessment of lifeline reliability, developed in the GeoTools environment, is presented. The software imports seismic hazard and lifeline network layers, creates a gridded network structure, and then applies a network reliability algorithm to calculate upper and lower bounds on the system reliability of the lifeline under seismic hazard. It enables the user to visualize the reliability values graphically as well as in a thematic lifeline reliability map whose colors indicate the reliability level of each link and of the overall network. It also provides functions for saving the analysis results in shapefile format. The software is tested and validated on an application taken from the literature, part of the water distribution system of Bursa in Turkey. The developed GIS-based software module that creates the lifeline reliability map under seismic hazard is user friendly, modifiable, fast in execution, illustrative, and consistent with existing studies in the literature.
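
    The reliability bounds themselves come from the algorithm adopted in the software; as a rough illustration of the underlying quantity, the Python sketch below estimates source-to-sink connectivity of a small lifeline graph by Monte Carlo simulation, with assumed per-link survival probabilities (the network and the probabilities are invented, and networkx is used for the graph).

      import numpy as np
      import networkx as nx

      # Illustrative lifeline: (node, node, probability the link survives
      # the seismic event; in a real analysis this comes from the hazard map).
      edges = [("src", "a", 0.95), ("src", "b", 0.90), ("a", "b", 0.99),
               ("a", "sink", 0.92), ("b", "sink", 0.88)]
      rng = np.random.default_rng(1)

      def connected(rng):
          g = nx.Graph()
          g.add_nodes_from(["src", "sink"])
          for u, v, p in edges:
              if rng.random() < p:          # link survives this trial
                  g.add_edge(u, v)
          return nx.has_path(g, "src", "sink")

      trials = 20000
      reliability = sum(connected(rng) for _ in range(trials)) / trials
      print(f"estimated system reliability: {reliability:.3f}")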

  3. Multi-hazard risk analysis using the FP7 RASOR Platform

    NASA Astrophysics Data System (ADS)

    Koudogbo, Fifamè N.; Duro, Javier; Rossi, Lauro; Rudari, Roberto; Eddy, Andrew

    2014-10-01

    Climate change challenges our understanding of risk by modifying hazards and their interactions. Sudden increases in population and rapid urbanization are changing exposure to risk around the globe, making impacts harder to predict. Despite the availability of operational mapping products, there is no single tool to integrate diverse data and products across hazards, update exposure data quickly and make scenario-based predictions to support both short and long-term risk-related decisions. RASOR (Rapid Analysis and Spatialization Of Risk) will develop a platform to perform multi-hazard risk analysis for the full cycle of disaster management, including targeted support to critical infrastructure monitoring and climate change impact assessment. A scenario-driven query system simulates future scenarios based on existing or assumed conditions and compares them with historical scenarios. RASOR will thus offer a single work environment that generates new risk information across hazards, across data types (satellite EO, in-situ), across user communities (global, local, climate, civil protection, insurance, etc.) and across the world. Five case study areas are considered within the project, located in Haiti, Indonesia, Netherlands, Italy and Greece. Initially available over those demonstration areas, RASOR will ultimately offer global services to support in-depth risk assessment and full-cycle risk management.

  4. Investigation of techniques for the development of seismic design basis using the probabilistic seismic hazard analysis

    SciTech Connect

    Bernreuter, D.L.; Boissonnade, A.C.; Short, C.M.

    1998-04-01

    The Nuclear Regulatory Commission asked Lawrence Livermore National Laboratory to form a group of experts to assist them in revising the seismic and geologic siting criteria for nuclear power plants, Appendix A to 10 CFR Part 100. That appendix describes a deterministic approach for determining a Safe Shutdown Earthquake (SSE) Ground Motion for a nuclear power plant site. One disadvantage of this approach is the difficulty of integrating differences of opinion and differing interpretations into the seismic hazard characterization. In response, probabilistic seismic hazard assessment methodologies incorporate differences of opinion and interpretation among earth science experts. For this reason, probabilistic hazard methods were selected for determining SSEs for the revised regulation, 10 CFR Part 100.23. However, because these methodologies provide a composite analysis of all possible earthquakes that may occur, they do not provide the familiar link between seismic design loading requirements and engineering design practice. Therefore, approaches for characterizing the seismic events (magnitude and distance) that best represent the ground motion level determined with the probabilistic hazard analysis were investigated. This report summarizes investigations conducted at 69 nuclear reactor sites in the central and eastern U.S. for determining SSEs using probabilistic analyses. Alternative techniques are presented along with justification for key choices. 16 refs., 32 figs., 60 tabs.
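
    For readers unfamiliar with the probabilistic side, the sketch below computes a single point on a toy hazard curve: the annual frequency of exceeding a ground-motion level, integrating a Gutenberg-Richter recurrence model over magnitude bins with a simple log-normal ground-motion model. All parameter values are illustrative assumptions, not those of the LLNL study.

      import numpy as np
      from scipy.stats import norm

      mags = np.arange(5.0, 7.6, 0.1)
      cum_rate = 10 ** (4.0 - 1.0 * mags)            # annual rate of M >= m (a=4, b=1)
      rate_in_bin = cum_rate - np.append(cum_rate[1:], 0.0)

      def p_exceed(pga_g, m, dist_km=30.0, sigma=0.6):
          # Assumed log-normal ground-motion model for a site 30 km away.
          ln_median = -3.5 + 0.8 * m - 1.2 * np.log(dist_km)
          return norm.sf(np.log(pga_g), loc=ln_median, scale=sigma)

      level = 0.2                                    # PGA of interest, in g
      annual_freq = np.sum(rate_in_bin * p_exceed(level, mags))
      print(f"lambda(PGA > {level} g) = {annual_freq:.2e} per year")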

  5. Comparative hazard analysis of processes leading to remarkable flash floods (France, 1930-1999)

    NASA Astrophysics Data System (ADS)

    Boudou, M.; Lang, M.; Vinet, F.; Cœur, D.

    2016-10-01

    Flash flood events are responsible for large economic losses and lead to fatalities every year in France. This is especially the case in the Mediterranean area and the overseas territories/departments of France, which are characterized by extreme hydro-climatological features and a large part of the population exposed to flood risks. The recurrence of remarkable flash flood events, associated with high hazard intensity, significant damage and socio-political consequences, therefore raises several issues for authorities and risk management policies. This study aims to improve our understanding of the hazard analysis process in the case of four remarkable flood events: March 1930, October 1940, January 1980 and November 1999. Firstly, we present the methodology used to define the remarkability score of a flood event. Then, to identify the factors leading to a remarkable flood event, we explore the main parameters of the hazard analysis process, such as the meteorological triggering conditions and the return periods of the rainfall and peak discharge, as well as some additional factors (initial catchment state, flood chronology, cascade effects, etc.). The results contribute to understanding the complexity of the processes leading to flood hazard and highlight the importance for risk managers of taking additional factors into account.

  6. Quantitative Remote Laser-Induced Breakdown Spectroscopy by Multivariate Analysis

    NASA Astrophysics Data System (ADS)

    Clegg, S. M.; Sklute, E. C.; Dyar, M. D.; Barefield, J. E.; Wiens, R. C.

    2007-12-01

    The ChemCam instrument selected for the Mars Science Laboratory (MSL) rover includes a remote Laser-Induced Breakdown Spectrometer (LIBS) that will quantitatively probe samples up to 9 m from the rover mast. LIBS is fundamentally an elemental analysis technique. It involves focusing a Nd:YAG laser operating at 1064 nm onto the surface of the sample. The laser ablates material from the surface, generating an expanding plasma containing electronically excited ions, atoms, and small molecules. As these electronically excited species relax back to the ground state, they emit light at wavelengths characteristic of the species present in the sample. Some of this emission is directed into one of three dispersive spectrometers. In this paper, we studied a suite of 18 igneous and highly metamorphosed samples from a wide variety of parageneses for which chemical analyses by XRF were already available. Rocks were chosen to represent a range of chemical compositions from basalt to rhyolite, thus providing significant variations in all of the major element contents (Si, Fe, Al, Ca, Na, K, O, Ti, Mg, and Mn). These samples were probed at a 9 m standoff distance under experimental conditions similar to ChemCam's. Extracting quantitative elemental concentrations from LIBS spectra is complicated by chemical matrix effects. Conventional methods for obtaining quantitative chemical data from LIBS analyses are compared with new multivariate analysis (MVA) techniques that appear to compensate for these matrix effects. The traditional analyses use specific elemental peak heights or areas, which are compared with calibration curves at one or more emission lines for a series of standard samples. Because of matrix effects, the calibration standards generally must have chemistries similar to the unknown samples, and thus this conventional approach imposes severe limitations on application of the technique to remote analyses. In this suite of samples, the use
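
    As a sketch of what such a multivariate calibration can look like in practice, the following Python example fits a partial least squares (PLS) model to synthetic "spectra" and cross-validates the predicted concentrations. PLS is one common MVA choice for LIBS data; the abstract does not specify this exact implementation, and all data below are invented.

      import numpy as np
      from sklearn.cross_decomposition import PLSRegression
      from sklearn.model_selection import cross_val_predict

      rng = np.random.default_rng(2)
      n_samples, n_channels = 18, 500
      conc = rng.uniform(40, 75, n_samples)            # e.g. SiO2 wt% from XRF
      basis = rng.random(n_channels)                   # stand-in emission pattern
      spectra = np.outer(conc, basis) + rng.normal(0, 2.0, (n_samples, n_channels))

      pls = PLSRegression(n_components=5)
      pred = cross_val_predict(pls, spectra, conc, cv=6).ravel()
      rmse = np.sqrt(np.mean((pred - conc) ** 2))
      print(f"cross-validated RMSE: {rmse:.2f} wt%")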

  7. Integration of environmental and human health risk assessment for industries using hazardous materials: a quantitative multi criteria approach for environmental decision makers.

    PubMed

    Topuz, E; Talinli, I; Aydin, E

    2011-02-01

    Environmental management, for which environmental and human health risk assessment is the first stage, is a requirement for industries both before construction and during operation in order to sustain improved quality of life in the ecosystem. The aim of this study is therefore to propose an approach that integrates environmental and human health risk assessment for industries using hazardous materials, in order to support environmental decision makers with quantitative and directive results. The analytic hierarchy process and fuzzy logic are used as tools to handle problems caused by the complexity of the environment and uncertain data. When the proposed approach was implemented in a scenario, it was concluded that it is possible to assign risk sources to risk classes, with membership degrees in those classes, which enables the decision maker to decide which risk source has priority. In addition, decision makers can easily identify and rank the factors contributing to those risk sources according to their priority weights. As a result, environmental decision makers can use this approach when developing management alternatives for planned and ongoing industrial plants that use hazardous materials.
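
    The analytic hierarchy process step can be made concrete in a few lines of Python: the priority weights are the normalised principal eigenvector of an expert pairwise-comparison matrix, checked by a consistency ratio. The comparison values below are hypothetical.

      import numpy as np

      # Pairwise comparisons of three risk factors on Saaty's 1-9 scale.
      A = np.array([[1.0, 3.0, 5.0],
                    [1/3, 1.0, 2.0],
                    [1/5, 1/2, 1.0]])

      vals, vecs = np.linalg.eig(A)
      k = np.argmax(vals.real)
      w = np.abs(vecs[:, k].real)
      w /= w.sum()                                   # priority weights

      n = A.shape[0]
      ci = (vals.real[k] - n) / (n - 1)              # consistency index
      cr = ci / 0.58                                 # 0.58: Saaty's random index for n = 3
      print("weights:", np.round(w, 3), " consistency ratio:", round(cr, 3))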

  8. A quantitative analysis of IRAS maps of molecular clouds

    NASA Technical Reports Server (NTRS)

    Wiseman, Jennifer J.; Adams, Fred C.

    1994-01-01

    We present an analysis of IRAS maps of five molecular clouds: Orion, Ophiuchus, Perseus, Taurus, and Lupus. For the classification and description of these astrophysical maps, we use a newly developed technique which considers all maps of a given type to be elements of a pseudometric space. For each physical characteristic of interest, this formal system assigns a distance function (a pseudometric) to the space of all maps: this procedure allows us to measure quantitatively the difference between any two maps and to order the space of all maps. We thus obtain a quantitative classification scheme for molecular clouds. In the present study we use the IRAS continuum maps at 100 and 60 micrometers to produce column density (or optical depth) maps for the five molecular cloud regions given above. For this sample of clouds, we compute the 'output' functions which measure the distribution of density, the distribution of topological components, the self-gravity, and the filamentary nature of the clouds. The results of this work provide a quantitative description of the structure in these molecular cloud regions. We then order the clouds according to the overall environmental 'complexity' of these star-forming regions. Finally, we compare our results with the observed populations of young stellar objects in these clouds and discuss the possible environmental effects on the star-formation process. Our results are consistent with the recently stated conjecture that more massive stars tend to form in more 'complex' environments.
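
    A minimal example of the idea, assuming the distance is taken between density distribution functions (one of the 'output' functions named above): the Python sketch below measures the L1 distance between the normalised histograms of two synthetic column-density maps. It vanishes for maps with identical density distributions even if the maps differ pixel-by-pixel, which makes it a pseudometric rather than a metric.

      import numpy as np

      def density_distance(map_a, map_b, bins=64):
          lo = min(map_a.min(), map_b.min())
          hi = max(map_a.max(), map_b.max())
          ha, edges = np.histogram(map_a, bins=bins, range=(lo, hi), density=True)
          hb, _ = np.histogram(map_b, bins=bins, range=(lo, hi), density=True)
          width = edges[1] - edges[0]
          return np.abs(ha - hb).sum() * width       # L1 distance between PDFs

      rng = np.random.default_rng(3)
      orion = rng.lognormal(0.0, 0.6, (128, 128))    # stand-in column-density maps
      taurus = rng.lognormal(0.2, 0.4, (128, 128))
      print(f"d(orion, taurus) = {density_distance(orion, taurus):.3f}")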

  9. Binary imaging analysis for comprehensive quantitative histomorphometry of peripheral nerve.

    PubMed

    Hunter, Daniel A; Moradzadeh, Arash; Whitlock, Elizabeth L; Brenner, Michael J; Myckatyn, Terence M; Wei, Cindy H; Tung, Thomas H H; Mackinnon, Susan E

    2007-10-15

    Quantitative histomorphometry is the current gold standard for objective measurement of nerve architecture and its components. Many methods still in use rely heavily upon manual techniques that are prohibitively time consuming, predisposing to operator fatigue, sampling error, and overall limited reproducibility. More recently, investigators have attempted to combine the speed of automated morphometry with the accuracy of manual and semi-automated methods. Systematic refinements in binary imaging analysis techniques combined with an algorithmic approach allow for more exhaustive characterization of nerve parameters in the surgically relevant injury paradigms of regeneration following crush, transection, and nerve gap injuries. The binary imaging method introduced here uses multiple bitplanes to achieve reproducible, high throughput quantitative assessment of peripheral nerve. Number of myelinated axons, myelinated fiber diameter, myelin thickness, fiber distributions, myelinated fiber density, and neural debris can be quantitatively evaluated with stratification of raw data by nerve component. Results of this semi-automated method are validated by comparing values against those obtained with manual techniques. The use of this approach results in more rapid, accurate, and complete assessment of myelinated axons than manual techniques.
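
    A compressed sketch of such a binary-imaging pass, using scikit-image in place of the authors' software: threshold the section into a bitplane, label connected components, and read off fiber counts and equivalent diameters. The synthetic image below stands in for a stained nerve cross-section.

      import numpy as np
      from skimage.filters import threshold_otsu
      from skimage.measure import label, regionprops

      rng = np.random.default_rng(4)
      img = rng.normal(0.2, 0.05, (256, 256))        # background
      for _ in range(40):                            # paint bright "axons"
          r, c = rng.integers(10, 246, size=2)
          img[r - 3:r + 3, c - 3:c + 3] = 0.9

      binary = img > threshold_otsu(img)             # one bitplane of the analysis
      props = regionprops(label(binary))

      diameters = [p.equivalent_diameter for p in props]
      density = len(props) / binary.size             # fibers per pixel^2
      print(f"{len(props)} fibers, mean diameter {np.mean(diameters):.1f} px, "
            f"density {density:.2e} per px^2")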

  10. Variability in quantitative cardiac magnetic resonance perfusion analysis

    PubMed Central

    Bratis, K.

    2013-01-01

    By taking advantage of its high spatial resolution and noninvasive, nontoxic nature, first-pass perfusion cardiovascular magnetic resonance (CMR) has become an indispensable tool for the noninvasive detection of reversible myocardial ischemia. A potential advantage of perfusion CMR is its ability to quantitatively assess perfusion reserve within a myocardial segment, expressed semi-quantitatively by the myocardial perfusion reserve index (MPRI) and fully quantitatively by absolute myocardial blood flow (MBF). In contrast to the high accuracy and reliability of CMR in evaluating cardiac function and volumes, perfusion CMR is adversely affected by multiple factors during data acquisition as well as post-processing. Different image acquisition techniques, different contrast agents and doses, variable blood flow at rest, and variable reactions to stress all influence the acquired data. The mechanisms underlying the variability in perfusion CMR post-processing, as well as their clinical significance, are yet to be fully elucidated. The development of a universal, reproducible, accurate and easily applicable tool for CMR perfusion analysis remains a challenge and would substantially reinforce the role of perfusion CMR in improving clinical care. PMID:23825774

  11. Simulating realistic predator signatures in quantitative fatty acid signature analysis

    USGS Publications Warehouse

    Bromaghin, Jeffrey F.

    2015-01-01

    Diet estimation is an important field within quantitative ecology, providing critical insights into many aspects of ecology and community dynamics. Quantitative fatty acid signature analysis (QFASA) is a prominent method of diet estimation, particularly for marine mammal and bird species. Investigators using QFASA commonly use computer simulation to evaluate statistical characteristics of diet estimators for the populations they study. Similar computer simulations have been used to explore and compare the performance of different variations of the original QFASA diet estimator. In both cases, computer simulations involve bootstrap sampling prey signature data to construct pseudo-predator signatures with known properties. However, bootstrap sample sizes have been selected arbitrarily and pseudo-predator signatures therefore may not have realistic properties. I develop an algorithm to objectively establish bootstrap sample sizes that generates pseudo-predator signatures with realistic properties, thereby enhancing the utility of computer simulation for assessing QFASA estimator performance. The algorithm also appears to be computationally efficient, resulting in bootstrap sample sizes that are smaller than those commonly used. I illustrate the algorithm with an example using data from Chukchi Sea polar bears (Ursus maritimus) and their marine mammal prey. The concepts underlying the approach may have value in other areas of quantitative ecology in which bootstrap samples are post-processed prior to their use.
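
    The construction at the heart of these simulations can be sketched in a few lines of Python: bootstrap-sample each prey species' fatty acid signatures, average them, and mix the averages under a known diet to form a pseudo-predator signature. The signature data, species, and diet proportions below are synthetic placeholders, not the Chukchi Sea data.

      import numpy as np

      rng = np.random.default_rng(5)
      n_fa = 10                                      # fatty acids per signature
      prey = {"seal": rng.dirichlet(np.ones(n_fa), 100),
              "whale": rng.dirichlet(np.ones(n_fa), 80)}
      diet = {"seal": 0.7, "whale": 0.3}             # known "true" diet

      def pseudo_predator(boot_n=30):
          sig = np.zeros(n_fa)
          for species, proportion in diet.items():
              idx = rng.integers(0, len(prey[species]), boot_n)  # bootstrap sample
              sig += proportion * prey[species][idx].mean(axis=0)
          return sig / sig.sum()                     # renormalise to proportions

      print(np.round(pseudo_predator(), 3))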

  12. A quantitative analysis of IRAS maps of molecular clouds

    NASA Technical Reports Server (NTRS)

    Wiseman, Jennifer J.; Adams, Fred C.

    1994-01-01

    We present an analysis of IRAS maps of five molecular clouds: Orion, Ophiuchus, Perseus, Taurus, and Lupus. For the classification and description of these astrophysical maps, we use a newly developed technique which considers all maps of a given type to be elements of a pseudometric space. For each physical characteristic of interest, this formal system assigns a distance function (a pseudometric) to the space of all maps: this procedure allows us to measure quantitatively the difference between any two maps and to order the space of all maps. We thus obtain a quantitative classification scheme for molecular clouds. In the present study we use the IRAS continuum maps at 100 and 60 micrometers to produce column density (or optical depth) maps for the five molecular cloud regions given above. For this sample of clouds, we compute the 'output' functions which measure the distribution of density, the distribution of topological components, the self-gravity, and the filamentary nature of the clouds. The results of this work provide a quantitative description of the structure in these molecular cloud regions. We then order the clouds according to the overall environmental 'complexity' of these star-forming regions. Finally, we compare our results with the observed populations of young stellar objects in these clouds and discuss the possible environmental effects on the star-formation process. Our results are consistent with the recently stated conjecture that more massive stars tend to form in more 'complex' environments.

  13. Quantitative option analysis for implementation and management of landfills.

    PubMed

    Kerestecioğlu, Merih

    2016-09-01

    The selection of the most feasible strategy for implementation of landfills is a challenging step. Potential implementation options for landfills cover a wide range, from conventional construction contracts to concessions. Montenegro, seeking to improve the efficiency of public services while maintaining affordability, was considering privatisation as a way to reduce public spending on service provision. In this study, to determine the most feasible model for construction and operation of a regional landfill, a quantitative risk analysis was implemented in five steps: (i) development of a global risk matrix; (ii) assignment of qualitative probabilities of occurrence and magnitudes of impact; (iii) determination of the risks to be mitigated, monitored, controlled or ignored; (iv) reduction of the main risk elements; and (v) incorporation of quantitative estimates of probability of occurrence and expected impact for each risk element in the reduced risk matrix. The evaluated scenarios were: (i) construction and operation of the regional landfill by the public sector; (ii) construction and operation of the landfill by the private sector, with transfer of ownership to the public sector after a pre-defined period; and (iii) operation of the landfill by the private sector, without ownership. The quantitative risk assessment concluded that introduction of a public-private partnership is not the most feasible option, contrary to the common belief in several public institutions in developing countries. A management contract for the first years of operation was advised, after which a long-term operating contract may follow. © The Author(s) 2016.

  14. Quantitative phosphoproteomic analysis using iTRAQ method.

    PubMed

    Asano, Tomoya; Nishiuchi, Takumi

    2014-01-01

    The MAPK (mitogen-activated protein kinase) cascade plays important roles in plant perception of and reaction to developmental and environmental cues. Phosphoproteomics is useful for identifying target proteins regulated by MAPK-dependent signaling pathways. Here, we introduce quantitative phosphoproteomic analysis using a chemical labeling method. The isobaric tag for relative and absolute quantitation (iTRAQ) method is an MS-based technique to quantify protein expression among up to eight different samples in one experiment. In this technique, peptides are labeled with stable isotope-coded covalent tags. We performed quantitative phosphoproteomics comparing Arabidopsis wild type and a stress-responsive mapkk mutant after phytotoxin treatment. To comprehensively identify the downstream phosphoproteins of MAPKK, total proteins were extracted from phytotoxin-treated wild-type and mapkk mutant plants. The phosphoproteins were purified with the Pro-Q(®) Diamond Phosphoprotein Enrichment Kit and digested with trypsin. The resulting peptides were labeled with iTRAQ reagents and quantified and identified on a MALDI TOF/TOF analyzer. We identified many phosphoproteins whose abundance was decreased in the mapkk mutant compared with wild type.

  15. Research on the spatial analysis method of seismic hazard for island

    NASA Astrophysics Data System (ADS)

    Jia, Jing; Jiang, Jitong; Zheng, Qiuhong; Gao, Huiying

    2017-05-01

    Seismic hazard analysis (SHA) is a key component of earthquake disaster prevention for island engineering. At the site scale, its results provide parameters for seismic design; at the planning scale, it is prerequisite work for the earthquake and comprehensive disaster prevention components of island conservation planning, in the exploitation and construction of both inhabited and uninhabited islands. The existing seismic hazard analysis methods are compared in terms of their application, and their applicability and limitations for islands are analysed. A specialized spatial analysis method of seismic hazard for islands (SAMSHI) is then given to support further work on earthquake disaster prevention planning, based on spatial analysis tools in GIS and a fuzzy comprehensive evaluation model. The basic spatial database of SAMSHI includes fault data, historical earthquake records, geological data and Bouguer gravity anomaly data, which are the data sources for the 11 indices of the fuzzy comprehensive evaluation model; these indices are calculated by the spatial analysis model constructed in ArcGIS's Model Builder platform.
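
    To make the fuzzy comprehensive evaluation step concrete, the toy Python example below composes a weight vector with a membership matrix by weighted averaging and picks the hazard grade with the highest composite membership. Only three indices and three grades are shown, and all values are invented, whereas SAMSHI uses 11 indices.

      import numpy as np

      # Rows: indices (e.g. fault density, historical seismicity, gravity
      # anomaly); columns: membership in (low, medium, high) hazard grades.
      R = np.array([[0.1, 0.3, 0.6],
                    [0.5, 0.4, 0.1],
                    [0.2, 0.5, 0.3]])
      W = np.array([0.5, 0.3, 0.2])                  # index weights, summing to 1

      B = W @ R                                      # composite membership vector
      grades = ["low", "medium", "high"]
      print(dict(zip(grades, np.round(B, 2))), "->", grades[int(np.argmax(B))])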

  16. Quantitative landslide risk analysis Examples from Bíldudalur, NW-Iceland

    NASA Astrophysics Data System (ADS)

    Bell, R.; Glade, T.

    2003-04-01

    Risk analysis, risk evaluation and risk management are integrated in the holistic concept of risk assessment. Internationally, various quantitative, semi-quantitative and qualitative approaches exist to analyse the risk to life and/or the economic risk caused by landslides. In Iceland, a method for snow avalanche risk analysis was developed in 1999, followed in 2002 by rough guidelines on how to integrate results from landslide hazard assessments into a comprehensive landslide and snow avalanche risk assessment. The Icelandic regulation on hazard zoning due to snow avalanches and landslides, issued by the Icelandic Ministry of the Environment in the year 2000, aims to prevent people from living or working within the areas most at risk until 2010. The regulation requires landslide and snow avalanche risk analyses to be carried out; however, an approach for calculating landslide risk in detail is still missing. The ultimate goal of this study is therefore to develop such a method and apply it in Bildudalur, NW-Iceland. Within this presentation, the risk analysis focuses on the risk to life. To calculate landslide risk, the spatial and temporal probability of occurrence of potentially damaging events must be determined, as well as the distribution of the elements at risk in space and time, taking changing vulnerabilities into consideration. Based on existing debris flow and rock fall run-out maps, hazard maps are derived and the respective risks are calculated. Already digitized elements at risk (people in houses) are verified and updated. The damage potential (the number of people living or working at a specific location), derived from official statistics and our own investigations, is attributed to each house. The vulnerability of the elements at risk is based mainly on literature studies. The probability of spatial impact (i.e., of the hazardous event impacting a building) is estimated using benchmarks given in the literature, results from field
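
    The quantity being assembled can be illustrated with a minimal Python calculation of annual individual risk at one house, multiplying the event frequency by the probabilities of spatial impact and of presence, and by the vulnerability. Every value below is an assumed placeholder, not a Bildudalur figure.

      # Minimal individual-risk calculation of the kind described above.
      p_event = 1 / 100        # annual probability of the debris flow scenario
      p_spatial = 0.3          # probability the event impacts the building
      p_temporal = 0.8         # fraction of time a person is in the building
      vulnerability = 0.5      # probability of death given impact

      individual_risk = p_event * p_spatial * p_temporal * vulnerability
      print(f"annual individual risk: {individual_risk:.1e}")   # 1.2e-03 per year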

  17. Lipid biomarker analysis for the quantitative analysis of airborne microorganisms

    SciTech Connect

    Macnaughton, S.J.; Jenkins, T.L.; Cormier, M.R.

    1997-08-01

    There is ever-increasing concern regarding the presence of airborne microbial contaminants in indoor air environments. Exposure to such biocontaminants can give rise to a large number of different health effects, including infectious diseases, allergenic responses and respiratory problems. Biocontaminants typically found in indoor air environments include bacteria, fungi, algae, protozoa and dust mites. Mycotoxins, endotoxins, pollens and residues of organisms are also known to cause adverse health effects. A quantitative detection/identification technique independent of culturability, which assays both culturable and non-culturable biomass including endotoxin, is critical in defining risks from indoor air biocontamination. Traditionally, methods employed for monitoring microorganism numbers in indoor air environments involve classical culture-based techniques and/or direct microscopic counting. It has been repeatedly documented that viable microorganism counts account for only 0.1-10% of the total community detectable by direct counting. The classic viable microbiological approach does not provide accurate estimates of microbial fragments or other indoor air components that can act as antigens and induce or potentiate allergic responses. Although bioaerosol samplers are designed to damage the microbes as little as possible, microbial stress has been shown to result from air sampling, aerosolization and microbial collection. Higher collection efficiency results in greater cell damage, while less cell damage often results in lower collection efficiency. Filtration can collect particulates at almost 100% efficiency, but captured microorganisms may become dehydrated and damaged, resulting in non-culturability. The lipid biomarker assays described herein, however, do not rely on cell culture: lipids are components universally distributed throughout cells, providing a means of assessment independent of culturability.

  18. Some selected quantitative methods of thermal image analysis in Matlab.

    PubMed

    Koprowski, Robert

    2016-05-01

    The paper presents a new algorithm based on selected automatic quantitative methods for analysing thermal images, and shows the practical implementation of these image analysis methods in Matlab. It enables fully automated and reproducible measurements of selected parameters in thermal images. The paper also shows two examples of the use of the proposed image analysis methods, for the skin of a human foot and face. The full source code of the developed application is provided as an attachment. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  19. 3D visualization and quantitative analysis of human erythrocyte phagocytosis.

    PubMed

    Stachurska, Anna; Król, Teodora; Trybus, Wojciech; Szary, Karol; Fabijańska-Mitek, Jadwiga

    2016-11-01

    Since the erythrophagocytosis of opsonized erythrocytes is investigated mainly by calculating the phagocytic index from subjective light microscopy evaluation, we present methods for the quantitative and qualitative analysis of human cell erythrophagocytosis. Erythrocytes from two storage periods were used. Using Imaris software, we were able to create a three-dimensional model of erythrophagocytosis. The use of microscopy instead of cytometry revealed a significantly higher number of monocytes and erythrocytes that appeared active in phagocytosis. Spatial reconstruction allowed for detailed analysis of the process by precisely locating erythrocytes in phagocytes. Additionally, a technique of sequential image registration using Nis Elements software allowed observation of the course of phagocytosis over a range of time intervals. This in vitro research may be helpful for understanding the cellular interactions between monocytes and erythrocytes. The cytometric method, being relatively rapid, sensitive, and specific, can serve as an alternative technique to microscopy in the quantitative analysis of erythrophagocytosis: it allows us to avoid counting erythrocytes nonspecifically attached to monocytes and gives objective results. © 2016 International Federation for Cell Biology.

  20. QUANTITATIVE MASS SPECTROMETRIC ANALYSIS OF GLYCOPROTEINS COMBINED WITH ENRICHMENT METHODS

    PubMed Central

    Ahn, Yeong Hee; Kim, Jin Young; Yoo, Jong Shin

    2015-01-01

    Mass spectrometry (MS) has been a core technology for highly sensitive and high-throughput analysis of the enriched glycoproteome, in terms of quantitative assays as well as qualitative profiling of glycoproteins. Because it is widely recognized that aberrant glycosylation of a glycoprotein may be involved in the progression of certain diseases, the development of efficient analysis tools for aberrant glycoproteins is very important for a deep understanding of the pathological function of the glycoprotein and for new biomarker development. This review first describes protein glycosylation-targeting enrichment technologies, mainly employing solid-phase extraction methods such as hydrazide capturing, lectin-specific capturing, and affinity separation techniques based on porous graphitized carbon, hydrophilic interaction chromatography, or immobilized boronic acid. Second, MS-based quantitative analysis strategies coupled with these enrichment technologies, using label-free MS, stable isotope labeling, or targeted multiple reaction monitoring (MRM) MS, are summarized with recently published studies. © 2014 The Authors. Mass Spectrometry Reviews published by Wiley Periodicals, Inc. Mass Spectrom. Rev. 34:148–165, 2015. PMID:24889823

  1. Quantitative analysis of motion control in long term microgravity.

    PubMed

    Baroni, G; Ferrigno, G; Anolli, A; Andreoni, G; Pedotti, A

    1998-01-01

    In the frame of the 179-day EUROMIR '95 space mission, two in-flight experiments included quantitative three-dimensional human movement analysis in microgravity. For this purpose, a space-qualified opto-electronic motion analyser based on passive markers was installed onboard the Russian space station MIR, and 8 in-flight sessions were performed. The technology and method for the collection of kinematic data are described, and the accuracy of three-dimensional marker localisation is evaluated. Results confirm the suitability of opto-electronic technology for quantitative human motion analysis on orbital modules and yield a set of "lessons learned" that led to improved motion analyser performance together with faster on-board operations. Within the experimental program of T4, results of three voluntary posture perturbation protocols are described. The analysis suggests that a short-term reinterpretation of proprioceptive information and re-calibration of sensorimotor mechanisms seem to be completed within the first weeks of flight, while a continuous long-term adaptation process allows the refinement of motor performance within never-abandoned terrestrial strategies.

  2. Correlative SEM SERS for quantitative analysis of dimer nanoparticles.

    PubMed

    Timmermans, F J; Lenferink, A T M; van Wolferen, H A G M; Otto, C

    2016-11-14

    A Raman microscope integrated with a scanning electron microscope was used to investigate plasmonic structures by correlative SEM-SERS analysis. The integrated Raman-SEM microscope combines high-resolution electron microscopy information with SERS signal enhancement from selected nanostructures with adsorbed Raman reporter molecules. Correlative analysis was performed for dimers of two gold nanospheres, selected on the basis of SEM images of multi-aggregate samples. The effects of the orientation of the dimer with respect to the polarization state of the laser light, and of the particle gap size, on the Raman signal intensity were observed. Additionally, calculations were performed to simulate the electric near-field enhancement, based on the morphologies observed by electron microscopy. In this way the experiments are compared with the enhancement factor calculated from near-field simulations, which is subsequently used to quantify the SERS enhancement factor. Large differences between experimentally observed and calculated enhancement factors are regularly detected, a phenomenon caused by nanoscale differences between the real and 'simplified' simulated structures. Quantitative SERS experiments reveal the structure-induced enhancement factor, ranging from ∼200 to ∼20 000, averaged over the full nanostructure surface. The results demonstrate correlative Raman-SEM microscopy for the quantitative analysis of plasmonic particles and structures, enabling a new analytical method in the field of SERS and plasmonics.

  3. Assessing the need for an update of a probabilistic seismic hazard analysis using a SSHAC Level 1 study and the Seismic Hazard Periodic Reevaluation Methodology

    DOE PAGES

    Payne, Suzette J.; Coppersmith, Kevin J.; Coppersmith, Ryan; ...

    2017-08-23

    A key decision for nuclear facilities is evaluating the need for an update of an existing seismic hazard analysis in light of new data and information that has become available since the analysis was completed. We introduce the newly developed risk-informed Seismic Hazard Periodic Review Methodology (referred to as the SHPRM) and present how a Senior Seismic Hazard Analysis Committee (SSHAC) Level 1 probabilistic seismic hazard analysis (PSHA) was performed in an implementation of this new methodology. The SHPRM offers a defensible and documented approach that considers both the changes in seismic hazard and the engineering-based risk information of an existing nuclear facility to assess the need for an update of an existing PSHA. The SHPRM has seven evaluation criteria that are employed at specific analysis, decision, and comparison points and are applied to the seismic design categories established for nuclear facilities in the United States. The SHPRM is implemented using a SSHAC Level 1 study performed for the Idaho National Laboratory, USA. The implementation focuses on the first six of the seven evaluation criteria of the SHPRM, all of which are provided by the SSHAC Level 1 PSHA. Finally, to illustrate outcomes of the SHPRM that do not lead to the need for an update and those that do, example implementations of the SHPRM are performed for nuclear facilities that have target performance goals expressed as a mean annual frequency of unacceptable performance of 1x10-4, 4x10-5 and 1x10-5.

  4. Computer compensation for NMR quantitative analysis of trace components

    SciTech Connect

    Nakayama, T.; Fujiwara, Y.

    1981-07-22

    A computer program has been written that determines trace components and separates overlapping components in multicomponent NMR spectra. The program uses the Lorentzian curve as the theoretical line shape of an NMR spectrum. The coefficients of the Lorentzian are determined by the method of least squares. Systematic errors such as baseline/phase distortion are compensated, and random errors are smoothed by taking moving averages; these processes contribute substantially to decreasing the accumulation time of spectral data. The accuracy of quantitative analysis of trace components has been improved by two significant figures. The program was applied to determining the abundance of 13C and the degree of saponification of PVA.
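
    The core fitting step translates directly into modern tools; a minimal Python version with scipy is sketched below, fitting a Lorentzian to a synthetic peak by least squares. The paper's baseline and phase compensation are not reproduced, and the peak parameters are invented.

      import numpy as np
      from scipy.optimize import curve_fit

      def lorentzian(x, a, x0, gamma):
          # Amplitude a, centre x0, half-width gamma.
          return a * gamma**2 / ((x - x0)**2 + gamma**2)

      x = np.linspace(-5, 5, 400)
      rng = np.random.default_rng(6)
      y = lorentzian(x, 1.0, 0.3, 0.5) + rng.normal(0, 0.02, x.size)

      popt, pcov = curve_fit(lorentzian, x, y, p0=[0.8, 0.0, 1.0])
      print("a, x0, gamma =", np.round(popt, 3))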

  5. Quantitative analysis of sideband coupling in photoinduced force microscopy

    NASA Astrophysics Data System (ADS)

    Jahng, Junghoon; Kim, Bongsu; Lee, Eun Seong; Potma, Eric Olaf

    2016-11-01

    We present a theoretical and experimental analysis of the cantilever motions detected in photoinduced force microscopy (PiFM) using the sideband coupling detection scheme. In sideband coupling, the cantilever dynamics are probed at a combination frequency of a fundamental mechanical eigenmode and the modulation frequency of the laser beam. Using this detection mode, we develop a method for reconstructing the modulated photoinduced force gradient from experimental parameters in a quantitative manner. We show evidence, both theoretically and experimentally, that the sideband coupling detection mode provides PiFM images with superior contrast compared to images obtained when detecting the cantilever motions directly at the laser modulation frequency.

  6. Flow quantitation by radio frequency analysis of contrast echocardiography.

    PubMed

    Rovai, D; Lombardi, M; Mazzarisi, A; Landini, L; Taddei, L; Distante, A; Benassi, A; L'Abbate, A

    1993-03-01

    Contrast echocardiography has the potential for measuring cardiac output and regional blood flow. However, accurate quantitation is limited both by the use of non-standard contrast agents and by the electronic signal distortion inherent in echocardiographic instruments. Thus, the aim of this study is to quantify flow by combining a stable contrast agent with modified echo equipment able to sample the radio frequency (RF) signal from a region of interest (ROI) in the echo image. The contrast agent SHU-454 (0.8 ml) was bolus-injected into an in vitro calf vein at 23 flow rates (ranging from 376 to 3620 ml/min) but constant volume and pressure. The ROI was placed in the centre of the vein, and the RF signal was processed in real time and transferred to a personal computer to generate time-intensity curves. In the absence of recirculation, the contrast washout slope and the mean transit time (MTT) of the curves (1.11-8.52 seconds) yielded excellent correlations with flow: r = 0.93 and 0.95, respectively. To compare the accuracy of RF analysis with that of conventional image processing for flow quantitation, conventional images were collected in the same flow model by two different scanners: a) the mechanical sector scanner used for RF analysis, and b) a conventional electronic sector scanner. These images were digitized off-line, mean videodensity inside an identical ROI was measured, and time-intensity curves were built. MTT by RF was shorter than by videodensitometric analysis of the images generated by the same scanner (p < 0.001). In contrast, MTT by RF was longer than by the conventional scanner (p < 0.001). Significant differences in MTT were also found with changes in the gain settings of the conventional scanner. To study the stability of the contrast effect, 6 contrast injections (20 ml) were performed at a constant flow rate during recirculation: the spontaneous decay in RF signal intensity (t1/2 = 64 +/- 8 seconds) was too long to affect MTT significantly
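
    The two curve parameters used for flow estimation are easy to state in code. The Python sketch below computes the mean transit time as the intensity-weighted mean time, and the washout slope from a log-linear fit to the decaying limb of a synthetic time-intensity curve; the curve shape is an assumption for illustration only.

      import numpy as np

      t = np.linspace(0, 20, 200)                    # seconds
      curve = t * np.exp(-t / 2.0)                   # gamma-variate-like bolus curve

      mtt = np.sum(t * curve) / np.sum(curve)        # mean transit time

      tail = t > t[np.argmax(curve)]                 # decaying limb only
      slope = np.polyfit(t[tail], np.log(curve[tail]), 1)[0]
      print(f"MTT = {mtt:.2f} s, washout slope = {slope:.2f} /s")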

  7. Simulating the focal volume effect: a quantitative analysis

    NASA Astrophysics Data System (ADS)

    Scarborough, Timothy D.; Uiterwaal, Cornelis J. G. J.

    2013-12-01

    We present quantitative simulations of the focal volume effect. Intensity distributions in detection volumes with two- and three-dimensional spatial resolution are calculated. Results include an analysis of translations of these volumes in the focus along the direction of laser propagation as well as discussion of varying sizes of the spatially resolved volumes. We find that detection volumes less than half the 1/e full-width beam waist and less than half the Rayleigh length along the propagation direction offer an optimal compromise of maintaining intensity resolution without sacrificing peak intensity.

  8. Neutron diffractometer INES for quantitative phase analysis of archaeological objects

    NASA Astrophysics Data System (ADS)

    Imberti, S.; Kockelmann, W.; Celli, M.; Grazzi, F.; Zoppi, M.; Botti, A.; Sodo, A.; Imperiale, M. Leo; de Vries-Melein, M.; Visser, D.; Postma, H.

    2008-03-01

    With the Italian Neutron Experimental Station (INES) a new general purpose neutron powder diffractometer is available at ISIS, characterized by a high resolution at low d-spacings, and particularly suited for the quantitative phase analysis of a wide range of archaeological materials. Time-of-flight neutron diffraction is notable for being a non-destructive technique, allowing a reliable determination of the phase compositions of multiphase artefacts, with or without superficial corrosion layers. A selection of archaeometric studies carried out during the first year of the INES user programme is presented here to demonstrate the capabilities of the instrument.

  9. Functional Regression Models for Epistasis Analysis of Multiple Quantitative Traits.

    PubMed

    Zhang, Futao; Xie, Dan; Liang, Meimei; Xiong, Momiao

    2016-04-01

    To date, most genetic analyses of phenotypes have focused on analyzing single traits or analyzing each phenotype independently. However, joint epistasis analysis of multiple complementary traits will increase statistical power and improve our understanding of the complicated genetic structure of complex diseases. Despite their importance in uncovering the genetic structure of complex traits, statistical methods for identifying epistasis in multiple phenotypes remain fundamentally unexplored. To fill this gap, we formulate a test for interaction between two genes in multiple quantitative trait analysis as a multiple functional regression (MFRG), in which the genotype functions (genetic variant profiles) are defined as a function of the genomic position of the genetic variants. We use large-scale simulations to calculate Type I error rates for testing interaction between two genes with multiple phenotypes and to compare the power with multivariate pairwise interaction analysis and with single-trait interaction analysis by a single-variate functional regression model. To further evaluate performance, the MFRG for epistasis analysis is applied to five phenotypes of exome sequence data from the NHLBI's Exome Sequencing Project (ESP) to detect pleiotropic epistasis. A total of 267 pairs of genes that formed a genetic interaction network showed significant evidence of epistasis influencing the five traits. The results demonstrate that joint interaction analysis of multiple phenotypes has much higher power to detect interaction than interaction analysis of a single trait, and may open a new direction toward fully uncovering the genetic structure of multiple phenotypes.

  10. Quantitative chemical analysis of ocular melanosomes in the TEM.

    PubMed

    Eibl, O; Schultheiss, S; Blitgen-Heinecke, P; Schraermeyer, U

    2006-01-01

    Melanosomes in retinal tissues of a human, monkey, and rat were analyzed by EDX in the TEM. Samples were prepared by ultramicrotomy at different thicknesses, mounted on Al grids, and analyzed in a Zeiss 912 TEM equipped with an Omega filter and an EDX detector with an ultrathin window. Melanosomes consist of C and O as main components, with mole fractions of about 90 and 3-10 at.%, respectively, and small mole fraction ratios, between 2 and 0.1 at.%, of Na, Mg, K, Si, P, S, Cl, and Ca. All elements were measured quantitatively by standardless EDX with high precision. Mole fractions of the transition metals Fe, Cu, and Zn were also measured. For Fe a mole fraction ratio of less than 0.1 at.% was found, which gives melanin its paramagnetic properties; its mole fraction is, however, close to or below the minimum detectable mass fraction of the equipment used. Only in the human eye, and only in the retinal pigment epithelium (RPE), were the mole fractions of Zn (0.1 at.%, or 5000 microg/g) and Cu clearly beyond the minimum detectable mass fraction. In the rat and monkey eyes the mole fraction of Zn was at or below the minimum detectable mass fraction and could not be measured quantitatively. The results yielded the chemical composition of the melanosomes in the choroidal tissue and the retinal pigment epithelium of the three species, and are discussed by means of mole fraction correlation diagrams. Similarities and differences between the species are outlined. Correlation behavior was found to hold across species, e.g. the Ca-O correlation, indicating that Ca is bound to oxygen-rich sites in the melanin. These are the first quantitative analyses of melanosomes by EDX reported so far. The quantitative chemical analysis should open a deeper understanding of the metabolic processes in the eye that are of central importance for the understanding of a large number of eye-related diseases. The chemical analysis also
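
    Thin-film EDX quantitation in the TEM is conventionally based on the Cliff-Lorimer relation, in which composition ratios are proportional to intensity ratios via sensitivity (k) factors. The sketch below is illustrative only: the intensities and k-factors are invented, and a standardless procedure like the paper's derives its sensitivity factors from first principles rather than from measured standards.

```python
import numpy as np

# Cliff-Lorimer thin-film quantitation: C_i is proportional to k_i * I_i,
# normalized so the mole fractions sum to 1. All numbers are illustrative.
def cliff_lorimer(intensities, k_factors):
    c = np.asarray(k_factors) * np.asarray(intensities)
    return c / c.sum()

elements = ['C', 'O', 'S', 'Ca']
I = [9000.0, 700.0, 60.0, 40.0]   # net peak intensities (invented)
k = [1.0, 1.1, 1.3, 1.4]          # k-factors relative to a reference line (invented)
for el, frac in zip(elements, cliff_lorimer(I, k)):
    print(f'{el}: {100 * frac:.1f} at.%')
```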

  11. Quantitative sonographic image analysis for hepatic nodules: a pilot study.

    PubMed

    Matsumoto, Naoki; Ogawa, Masahiro; Takayasu, Kentaro; Hirayama, Midori; Miura, Takao; Shiozawa, Katsuhiko; Abe, Masahisa; Nakagawara, Hiroshi; Moriyama, Mitsuhiko; Udagawa, Seiichi

    2015-10-01

    The aim of this study was to investigate the feasibility of quantitative image analysis to differentiate hepatic nodules on gray-scale sonographic images. We retrospectively evaluated 35 nodules from 31 patients with hepatocellular carcinoma (HCC), 60 nodules from 58 patients with liver hemangioma, and 22 nodules from 22 patients with liver metastasis. Gray-scale sonographic images were evaluated with subjective judgment and image analysis using ImageJ software. Reviewers classified the shape of nodules as irregular or round, and the surface of nodules as rough or smooth. Circularity values were lower in the irregular group than in the round group (median 0.823, 0.892; range 0.641-0.915, 0.784-0.932, respectively; P = 3.21 × 10^-10). Solidity values were lower in the rough group than in the smooth group (median 0.957, 0.968; range 0.894-0.986, 0.933-0.988, respectively; P = 1.53 × 10^-4). The HCC group had higher circularity and solidity values than the hemangioma group. The HCC and liver metastasis groups had lower median, mean, modal, and minimum gray values than the hemangioma group. Multivariate analysis showed circularity [standardized odds ratio (OR) 2.077; 95% confidence interval (CI) 1.295-3.331; P = 0.002] and minimum gray value (OR 0.482; 95% CI 0.956-0.990; P = 0.001) to be factors predictive of malignancy. The combination of subjective judgment and image analysis provided 58.3% sensitivity and 89.5% specificity with AUC = 0.739, an improvement over subjective judgment alone (68.4% sensitivity, 75.0% specificity, AUC = 0.701) (P = 0.008). Quantitative image analysis of sonographic images of hepatic nodules may complement subjective judgment in predicting malignancy.
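
    The two shape descriptors used here have simple definitions: circularity = 4π·area/perimeter² and solidity = area/(convex-hull area). A minimal sketch of computing both from a binary nodule mask, using scikit-image as a stand-in for ImageJ:

```python
import numpy as np
from skimage import measure

# Circularity and solidity of a segmented region (scikit-image stand-in
# for ImageJ's "Analyze Particles" descriptors).
def shape_descriptors(mask):
    props = measure.regionprops(measure.label(mask))[0]
    circularity = 4 * np.pi * props.area / props.perimeter**2  # 1.0 = circle
    return circularity, props.solidity  # solidity = area / convex-hull area

# A filled disk should score near 1.0 on both measures.
yy, xx = np.mgrid[:128, :128]
disk = (xx - 64)**2 + (yy - 64)**2 < 40**2
print(shape_descriptors(disk))
```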

  12. Quantitative Proteomic and Phosphoproteomic Analysis of Trypanosoma cruzi Amastigogenesis*

    PubMed Central

    Queiroz, Rayner M. L.; Charneau, Sébastien; Mandacaru, Samuel C.; Schwämmle, Veit; Lima, Beatriz D.; Roepstorff, Peter; Ricart, Carlos A. O.

    2014-01-01

    Chagas disease is a neglected tropical disease endemic in Latin America caused by the protozoan Trypanosoma cruzi. The parasite has four major life stages: epimastigote, metacyclic trypomastigote, bloodstream trypomastigote, and amastigote. The differentiation from infective trypomastigotes into replicative amastigotes, called amastigogenesis, takes place in vivo inside mammalian host cells after a period of incubation in an acidic phagolysosome. This differentiation process can be mimicked in vitro by incubating tissue-culture-derived trypomastigotes in acidic DMEM. Here we used this well-established differentiation protocol to perform a comprehensive quantitative proteomic and phosphoproteomic analysis of T. cruzi amastigogenesis. Samples from fully differentiated forms and two biologically relevant intermediate time points were Lys-C/trypsin digested, iTRAQ-labeled, and multiplexed. Subsequently, phosphopeptides were enriched using a TiO2 matrix. Non-phosphorylated peptides were fractionated via hydrophilic interaction liquid chromatography prior to LC-MS/MS analysis. LC-MS/MS and bioinformatics procedures were used for protein and phosphopeptide quantitation, identification, and phosphorylation site assignment. We were able to identify regulated proteins and pathways involved in coordinating amastigogenesis, and observed that a significant proportion of the regulated proteins were membrane proteins. Modulated phosphorylation events coordinated by the protein kinases and phosphatases that are part of the signaling cascade induced by incubation in acidic medium were also evident. To our knowledge, this work is the most comprehensive quantitative proteomics study of T. cruzi amastigogenesis; these data will serve as a reliable basis for future studies and possibly for the identification of new drug targets. PMID:25225356

  13. A sensitivity analysis of hazardous waste disposal site climatic and soil design parameters using HELP3

    SciTech Connect

    Adelman, D.D.; Stansbury, J.

    1997-12-31

    The Resource Conservation and Recovery Act (RCRA) Subtitle C, the Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA), and subsequent amendments have formed a comprehensive framework to deal with hazardous wastes on the national level. Key to this waste management is guidance on the design (e.g., cover and bottom leachate control systems) of hazardous waste landfills. The objective of this research was to investigate the sensitivity of leachate volume at hazardous waste disposal sites to climatic, soil cover, and vegetative cover (leaf area index) conditions. The computer model HELP3, which can simulate the double bottom liner systems called for at hazardous waste disposal sites, was used in the analysis. HELP3 was used to model 54 combinations of climatic conditions, disposal site soil surface curve numbers, and leaf area index values to investigate how sensitive disposal site leachate volume was to these three variables. Results showed that leachate volume from the bottom double liner system was not sensitive to these parameters. However, the cover liner system leachate volume was quite sensitive to climatic conditions and less sensitive to leaf area index and curve number values. Since humid locations had considerably more cover liner system leachate volume than arid locations, different design standards may be appropriate for humid conditions than for arid conditions.
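
    The 54-run design can be pictured as a simple factorial sweep. The 3 × 6 × 3 split of levels below is hypothetical (the abstract does not list the levels), and run_help3() is a placeholder wrapper, since HELP3 itself is an external model.

```python
import itertools

# Hypothetical factorial design: 3 climates x 6 curve numbers x 3 LAI = 54 runs.
climates = ['humid', 'temperate', 'arid']
curve_numbers = [60, 70, 80, 85, 90, 94]
leaf_area_indices = [1.0, 2.5, 5.0]

def run_help3(climate, cn, lai):
    """Placeholder for invoking the external HELP3 model with one parameter set."""
    raise NotImplementedError

runs = list(itertools.product(climates, curve_numbers, leaf_area_indices))
print(f'{len(runs)} parameter combinations')   # 54
# results = {combo: run_help3(*combo) for combo in runs}
```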

  14. Environmental justice implications of industrial hazardous waste generation in India: a national scale analysis

    NASA Astrophysics Data System (ADS)

    Basu, Pratyusha; Chakraborty, Jayajit

    2016-12-01

    While rising air and water pollution have become issues of widespread public concern in India, the relationship between spatial distribution of environmental pollution and social disadvantage has received less attention. This lack of attention becomes particularly relevant in the context of industrial pollution, as India continues to pursue industrial development policies without sufficient regard to its adverse social impacts. This letter examines industrial pollution in India from an environmental justice (EJ) perspective by presenting a national scale study of social inequities in the distribution of industrial hazardous waste generation. Our analysis connects district-level data from the 2009 National Inventory of Hazardous Waste Generating Industries with variables representing urbanization, social disadvantage, and socioeconomic status from the 2011 Census of India. Our results indicate that more urbanized and densely populated districts with a higher proportion of socially and economically disadvantaged residents are significantly more likely to generate hazardous waste. The quantity of hazardous waste generated is significantly higher in more urbanized but sparsely populated districts with a higher proportion of economically disadvantaged households, after accounting for other relevant explanatory factors such as literacy and social disadvantage. These findings underscore the growing need to incorporate EJ considerations in future industrial development and waste management in India.
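
    The letter's analysis pairs a model of whether a district generates hazardous waste with a model of how much it generates. A sketch of the first part using a logit model on synthetic district-level data; all variable names and values here are illustrative stand-ins, not the study's data.

```python
import numpy as np
import statsmodels.api as sm

# Synthetic district-level covariates (illustrative; columns stand in for
# urbanization, population density, social disadvantage, and literacy).
rng = np.random.default_rng(0)
n = 640
X = sm.add_constant(rng.normal(size=(n, 4)))
generates = rng.binomial(1, 0.5, size=n)       # synthetic binary outcome

# Logit model of whether a district generates hazardous waste.
logit = sm.Logit(generates, X).fit(disp=0)
print(logit.params)                            # coefficient estimates
```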

  15. Seismic Hazard Analysis of Aizawl, India with a Focus on Water System Fragilities

    NASA Astrophysics Data System (ADS)

    Belair, G. M.; Tran, A. J.; Dreger, D. S.; Rodgers, J. E.

    2015-12-01

    GeoHazards International (GHI) has partnered with the University of California, Berkeley in a joint Civil Engineering and Earth Science summer internship program to investigate geologic hazards. This year the focus was on Aizawl, the capital of India's Mizoram state, situated on a ridge in the Burma Ranges. Nearby sources have the potential for large (M > 7) earthquakes that would be devastating to the approximately 300,000 people living in the city. Earthquake-induced landslides also threaten the population as well as the city's lifelines. Fieldwork conducted in June 2015 identified hazards to vital water system components. The focus of this abstract is a review of the seismic hazards that affect Aizawl, with special attention paid to water system locations. To motivate action to reduce risk, GHI created an earthquake scenario describing the effects of an M7 right-lateral strike-slip intraplate earthquake occurring 30 km below the city. We extended this analysis by exploring additional mapped faults as well as hypothetical blind reverse faults in terms of PGA, PGV, and PSA. Ground motions with hanging wall and directivity effects were also examined. Several attenuation relationships were used in order to assess the uncertainty in the ground motion parameters. Results were used to determine the likely seismic performance of water system components, and will be applied in future PSHA studies.
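
    Attenuation relationships of the kind used here typically take the generic form ln(PGA) = a + b·M − c·ln(R + d). The sketch below evaluates such a form for the scenario event; the coefficients are purely illustrative and are not those of any published relationship or of this study.

```python
import numpy as np

# Generic ground-motion prediction equation (illustrative coefficients only).
def pga_g(M, R_km, a=-3.0, b=1.0, c=1.3, d=10.0, eps_sigma=0.0):
    """Median PGA in g; eps_sigma adds an aleatory term in log space."""
    return np.exp(a + b * M - c * np.log(R_km + d) + eps_sigma)

# M7 event ~30 km directly beneath a site: hypocentral distance 30 km.
print(f'median PGA ~ {pga_g(7.0, 30.0):.2f} g')
```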

  16. Patient-specific meta-analysis for risk assessment using multivariate proportional hazards regression.

    PubMed

    Crager, Michael R; Tang, Gong

    We propose a method for assessing an individual patient's risk of a future clinical event using clinical trial or cohort data and Cox proportional hazards regression, combining the information from several studies using meta-analysis techniques. The method combines patient-specific estimates of the log cumulative hazard across studies, weighting by the relative precision of the estimates, using either fixed- or random-effects meta-analysis calculations. Risk assessment can be done for any future patient using a few key summary statistics determined once and for all from each study. Generalizations of the method to logistic regression and linear models are immediate. We evaluate the methods using simulation studies and illustrate their application using real data.
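
    The precision weighting described here is the standard inverse-variance calculation. A minimal fixed-effects sketch with illustrative numbers; mapping the pooled log cumulative hazard back to an event risk uses S(t) = exp(−Λ).

```python
import numpy as np

# Fixed-effects (inverse-variance) pooling of patient-specific log cumulative
# hazard estimates across studies. All numbers below are illustrative.
def fixed_effects(estimates, variances):
    w = 1.0 / np.asarray(variances)
    est = np.sum(w * np.asarray(estimates)) / np.sum(w)
    return est, 1.0 / np.sum(w)            # pooled estimate and its variance

log_cum_hazard = [-2.1, -1.8, -2.4]        # per-study estimates for one patient
variances = [0.04, 0.09, 0.06]
pooled, pooled_var = fixed_effects(log_cum_hazard, variances)
risk = 1 - np.exp(-np.exp(pooled))         # event risk via S(t) = exp(-Lambda)
print(f'pooled log-hazard {pooled:.3f} (var {pooled_var:.4f}), risk {risk:.3f}')
```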

  17. Laser safety and hazard analysis for the temperature stabilized BSLT ARES laser system.

    SciTech Connect

    Augustoni, Arnold L.

    2003-08-01

    A laser safety and hazard analysis was performed for the temperature-stabilized Big Sky Laser Technology (BSLT) laser central to the ARES system, based on the 2000 versions of the American National Standards Institute's (ANSI) Standard Z136.1, Safe Use of Lasers, and Standard Z136.6, Safe Use of Lasers Outdoors. Temperature stabilization of the BSLT laser changed its operating parameters, requiring a hazard analysis based on the new operating conditions. The ARES laser system is a van/truck-based mobile platform used to perform laser interaction experiments and tests at various national test sites.
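
    A core quantity in a Z136.1-style analysis is the nominal ocular hazard distance (NOHD). The sketch below uses the standard CW relation NOHD = (sqrt(4Φ/(π·MPE)) − a)/φ; the power, MPE, divergence, and beam diameter are illustrative, not the ARES laser's parameters (pulsed systems use the per-pulse MPE instead).

```python
import math

# Nominal Ocular Hazard Distance for a circular beam (standard CW relation):
# Phi = output power [W], MPE in W/cm^2, a = emergent beam diameter [cm],
# phi = beam divergence [rad]. All values below are illustrative.
def nohd_m(power_W, mpe_W_cm2, divergence_rad, beam_diam_cm):
    nohd_cm = (math.sqrt(4 * power_W / (math.pi * mpe_W_cm2)) - beam_diam_cm) / divergence_rad
    return nohd_cm / 100.0

print(f'{nohd_m(1.0, 1e-3, 1e-3, 0.5):.0f} m')
```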

  18. Quantitative analysis of intermolecular interactions in orthorhombic rubrene

    PubMed Central

    Hathwar, Venkatesha R.; Sist, Mattia; Jørgensen, Mads R. V.; Mamakhel, Aref H.; Wang, Xiaoping; Hoffmann, Christina M.; Sugimoto, Kunihisa; Overgaard, Jacob; Iversen, Bo Brummerstedt

    2015-01-01

    Rubrene is one of the most studied organic semiconductors to date due to its high charge carrier mobility, which makes it a potentially applicable compound in modern electronic devices. Previous electronic device characterizations and first-principles theoretical calculations assigned the semiconducting properties of rubrene to the presence of a large overlap of the extended π-conjugated core between molecules. We present here the electron density distribution in rubrene at 20 K and at 100 K obtained using a combination of high-resolution X-ray and neutron diffraction data. The topology of the electron density and the energies of intermolecular interactions are studied quantitatively. Specifically, the presence of Cπ⋯Cπ interactions between neighbouring tetracene backbones of the rubrene molecules is experimentally confirmed from a topological analysis of the electron density, Non-Covalent Interaction (NCI) analysis, and the calculated interaction energy of molecular dimers. A significant contribution to the lattice energy of the crystal is provided by H—H interactions. The electron density features of H—H bonding and the interaction energy of molecular dimers connected by H—H interactions clearly demonstrate the importance of these weak interactions in the stabilization of the crystal structure. The quantitative nature of the intermolecular interactions is virtually unchanged between 20 K and 100 K, suggesting that any changes in carrier transport at these low temperatures would have a different origin. The obtained experimental results are further supported by theoretical calculations. PMID:26306198

  19. Multivariate calibration applied to the quantitative analysis of infrared spectra

    SciTech Connect

    Haaland, D.M.

    1991-01-01

    Multivariate calibration methods are very useful for improving the precision, accuracy, and reliability of quantitative spectral analyses. Spectroscopists can more effectively use these sophisticated statistical tools if they have a qualitative understanding of the techniques involved. A qualitative picture of the factor analysis multivariate calibration methods of partial least squares (PLS) and principal component regression (PCR) is presented using infrared calibrations based upon spectra of phosphosilicate glass thin films on silicon wafers. Comparisons of the relative prediction abilities of four different multivariate calibration methods are given based on Monte Carlo simulations of spectral calibration and prediction data. The success of multivariate spectral calibrations is demonstrated for several quantitative infrared studies. The infrared absorption and emission spectra of thin-film dielectrics used in the manufacture of microelectronic devices demonstrate rapid, nondestructive at-line and in-situ analyses using PLS calibrations. Finally, the application of multivariate spectral calibrations to reagentless analysis of blood is presented. We have found that the determination of glucose in whole blood taken from diabetics can be precisely monitored from the PLS calibration of either mid- or near-infrared spectra of the blood. Progress toward the non-invasive determination of glucose levels in diabetics is an ultimate goal of this research. 13 refs., 4 figs.
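
    A minimal sketch of the kind of PLS calibration described above, with synthetic stand-in spectra rather than the phosphosilicate glass data; the single Gaussian band, noise level, and component count are assumptions.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

# Synthetic "spectra": one concentration-dependent Gaussian band plus noise.
rng = np.random.default_rng(1)
n_samples, n_channels = 40, 400
conc = rng.uniform(0, 1, n_samples)                     # reference concentrations
band = np.exp(-0.5 * ((np.arange(n_channels) - 200) / 12.0)**2)
spectra = np.outer(conc, band) + 0.01 * rng.normal(size=(n_samples, n_channels))

# PLS calibration with cross-validated prediction error (RMSECV).
pls = PLSRegression(n_components=3)
pred = cross_val_predict(pls, spectra, conc, cv=5).ravel()
rmsecv = np.sqrt(np.mean((pred - conc)**2))
print(f'RMSECV = {rmsecv:.4f}')
```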

  20. Quantitative analysis of live cells using digital holographic microscopy

    NASA Astrophysics Data System (ADS)

    Lewis, Tan Rongwei; Qu, Weijuan; Chee, Oi Choo; Singh, Vijay Raj; Asundi, Anand

    2010-03-01

    During its lifetime, a cell undergoes changes to the plasma membrane as well as to its internal structures; these changes are especially distinctive during processes such as cell division and death. Different types of microscopes are used to observe these cellular changes. In our experiment, Vero cells were investigated using phase contrast microscopy and digital holographic microscopy (DHM). A comparison of the images obtained for cell division is presented here. The conventional phase contrast microscope provided a good imaging method for real-time analysis of cell division. The off-axis digital hologram recorded by the DHM system can be reconstructed to obtain both the intensity image and the phase contrast image of the test object. These can be used for live cell imaging to provide multiple results from a single equipment setup. The DHM system, besides being a qualitative tool, is able to provide quantitative results and 3D images of the cell division process. The ability of DHM to provide quantitative analysis makes it an ideal tool for life science applications.
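
    Off-axis DHM recovers the complex object field from a single hologram by isolating one carrier-frequency sideband in the Fourier domain, re-centering it, and inverse-transforming. A minimal sketch under assumed carrier frequency and filter radius, both of which would be instrument-specific in practice:

```python
import numpy as np

# Fourier-filtering reconstruction of an off-axis hologram: select the +1
# order sideband, shift it to the origin, and inverse-transform to recover
# the complex field (amplitude and quantitative phase).
def reconstruct(hologram, carrier, radius):
    F = np.fft.fftshift(np.fft.fft2(hologram))
    ny, nx = hologram.shape
    Y, X = np.ogrid[:ny, :nx]
    cy, cx = ny // 2 + carrier[0], nx // 2 + carrier[1]
    mask = (Y - cy)**2 + (X - cx)**2 < radius**2
    sideband = np.roll(F * mask, (-carrier[0], -carrier[1]), axis=(0, 1))
    field = np.fft.ifft2(np.fft.ifftshift(sideband))
    return np.abs(field), np.angle(field)   # intensity and phase maps

# Tiny demo: synthetic hologram of a smooth phase object with a tilted reference.
ny, nx = 256, 256
y, x = np.mgrid[:ny, :nx]
obj = np.exp(1j * 0.5 * np.exp(-((x - 128)**2 + (y - 128)**2) / 800.0))
ref = np.exp(1j * 2 * np.pi * 30 * y / ny)   # carrier: 30 cycles along y
holo = np.abs(obj + ref)**2
amp, phase = reconstruct(holo, carrier=(30, 0), radius=15)
```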