Science.gov

Sample records for quantitative hazard analysis

  1. Using Qualitative Hazard Analysis to Guide Quantitative Safety Analysis

    NASA Technical Reports Server (NTRS)

    Shortle, J. F.; Allocco, M.

    2005-01-01

    Quantitative methods can be beneficial in many types of safety investigations. However, there are many difficulties in using quantitative methods. For example, there may be little relevant data available. This paper proposes a framework for using qualitative hazard analysis to prioritize hazard scenarios most suitable for quantitative analysis. The framework first categorizes hazard scenarios by severity and likelihood. We then propose another metric, "modeling difficulty," that describes the complexity in modeling a given hazard scenario quantitatively. The combined metrics of severity, likelihood, and modeling difficulty help to prioritize hazard scenarios for which quantitative analysis should be applied. We have applied this methodology to proposed concepts of operations for reduced wake separation for airplane operations at closely spaced parallel runways.
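
    As a concrete, purely illustrative rendering of the framework sketched above, the snippet below combines ordinal severity and likelihood ratings with a modeling-difficulty rating to rank scenarios for quantitative analysis. The rating scales, the combination rule, and the example scenarios are assumptions made for illustration, not the authors' actual scheme.

    ```python
    # Illustrative sketch: rank hazard scenarios by severity, likelihood,
    # and modeling difficulty. Scales and the combination rule are
    # assumptions for illustration, not the authors' actual scheme.
    from dataclasses import dataclass

    @dataclass
    class Scenario:
        name: str
        severity: int      # 1 (negligible) .. 4 (catastrophic)
        likelihood: int    # 1 (improbable) .. 4 (frequent)
        difficulty: int    # 1 (easy to model) .. 4 (intractable)

    def priority(s: Scenario) -> float:
        """Higher risk and lower modeling difficulty make a scenario a
        better candidate for quantitative analysis."""
        return (s.severity * s.likelihood) / s.difficulty

    scenarios = [
        Scenario("wake encounter on parallel approach", 4, 2, 2),
        Scenario("blunder into adjacent approach path", 4, 1, 4),
        Scenario("missed breakout maneuver", 3, 2, 1),
    ]

    for s in sorted(scenarios, key=priority, reverse=True):
        print(f"{s.name}: priority {priority(s):.1f}")
    ```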

  2. Applying Qualitative Hazard Analysis to Support Quantitative Safety Analysis for Proposed Reduced Wake Separation Conops

    NASA Technical Reports Server (NTRS)

    Shortle, John F.; Allocco, Michael

    2005-01-01

    This paper describes a scenario-driven hazard analysis process to identify, eliminate, and control safety-related risks. Within this process, we develop selective criteria to determine the applicability of engineering modeling to hypothesized hazard scenarios. This provides a basis for evaluating and prioritizing the scenarios as candidates for further quantitative analysis. We have applied this methodology to proposed concepts of operations for reduced wake separation for closely spaced parallel runways. For arrivals, the process identified 43 core hazard scenarios. Of these, we classified 12 as appropriate for further quantitative modeling, 24 that should be mitigated through controls, recommendations, and/or procedures (that is, scenarios not appropriate for quantitative modeling), and 7 that have the lowest priority for further analysis.

  3. Quantitative electroencephalography analysis in university students with hazardous alcohol consumption, but not alcohol dependence.

    PubMed

    Núñez-Jaramillo, Luis; Vega-Perera, Paulo; Ramírez-Lugo, Leticia; Reyes-López, Julián V; Santiago-Rodríguez, Efraín; Herrera-Morales, Wendy V

    2015-07-01

    Hazardous alcohol consumption is a pattern of consumption that leads to a higher risk of harmful consequences either for the user or for others. This pattern of alcohol consumption has been linked to risky behaviors, accidents, and injuries. Individuals with hazardous alcohol consumption do not necessarily present alcohol dependence; thus, a study of the particular neurophysiological correlates of this alcohol consumption pattern needs to be carried out in nondependent individuals. Here, we carried out a quantitative electroencephalography analysis in health sciences university students with hazardous alcohol consumption, but not alcohol dependence (HAC), and control participants without hazardous alcohol consumption or alcohol dependence (NHAC). We analyzed Absolute Power (AP), Relative Power (RP), and Mean Frequency (MF) for beta and theta frequency bands under both eyes-closed and eyes-open conditions. We found that participants in the HAC group presented higher beta AP at the centroparietal region, as well as lower beta MF at the frontal and centroparietal regions, in the eyes-closed condition. Interestingly, participants did not present any change in theta activity (AP, RP, or MF), whereas previous reports indicate an increase in theta AP in alcohol-dependent individuals. Our results partially resemble those found in alcohol-dependent individuals, although they are not completely identical, suggesting a possible difference in the neuronal mechanisms underlying alcohol dependence and hazardous alcohol consumption. The similarities could be explained by considering that both hazardous alcohol consumption and alcohol dependence are manifestations of behavioral disinhibition. PMID:26035281

  5. Use of Bayesian event trees in semi-quantitative volcano eruption forecasting and hazard analysis

    NASA Astrophysics Data System (ADS)

    Wright, Heather; Pallister, John; Newhall, Chris

    2015-04-01

    Use of Bayesian event trees to forecast eruptive activity during volcano crises is an increasingly common practice for the USGS-USAID Volcano Disaster Assistance Program (VDAP) in collaboration with foreign counterparts. This semi-quantitative approach combines conceptual models of volcanic processes with current monitoring data and patterns of occurrence to reach consensus probabilities. This approach allows a response team to draw upon global datasets, local observations, and expert judgment, where the relative influence of these data depends upon the availability and quality of monitoring data and the degree to which the volcanic history is known. The construction of such event trees additionally relies upon the existence and use of relevant global databases and documented past periods of unrest. Because relevant global databases may be underpopulated or nonexistent, uncertainty in probability estimations may be large. Our 'hybrid' approach of combining local and global monitoring data and expert judgment facilitates discussion and constructive debate among disciplines, including seismology, gas geochemistry, geodesy, petrology, physical volcanology, and technology/engineering, where differences in opinion between response team members contribute to definition of the uncertainty in the probability estimations. In collaboration with foreign colleagues, we have created event trees for numerous areas experiencing volcanic unrest. Event trees are created for a specified time frame and are updated, revised, or replaced as the crisis proceeds. Creation of an initial tree is often prompted by a change in monitoring data, such that rapid assessment of probability is needed. These trees are intended as a vehicle for discussion and a way to document relevant data and models, where the target audience is the scientists themselves. However, the probabilities derived through the event-tree analysis can also be used to help inform communications with emergency managers and the
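
    The arithmetic at the core of an event tree is simple: the probability of reaching an outcome is the product of the conditional probabilities along its branch. A minimal sketch follows, using the generic unrest-to-eruption progression; every node value below is an invented placeholder, not a VDAP estimate.

    ```python
    # Minimal event-tree sketch: outcome probability is the product of
    # conditional probabilities along the branch. All node values are
    # invented placeholders, not VDAP estimates.
    p_unrest = 1.0       # unrest is observed, so the tree conditions on it
    p_magmatic = 0.6     # P(unrest is magmatic | unrest)
    p_eruption = 0.4     # P(eruption | magmatic unrest)
    p_vei_ge_3 = 0.25    # P(VEI >= 3 | eruption)

    p_large_eruption = p_unrest * p_magmatic * p_eruption * p_vei_ge_3
    print(f"P(VEI >= 3 eruption in time window) = {p_large_eruption:.3f}")  # 0.060
    ```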

  6. NASA Hazard Analysis Process

    NASA Technical Reports Server (NTRS)

    Deckert, George

    2010-01-01

    This viewgraph presentation reviews The NASA Hazard Analysis process. The contents include: 1) Significant Incidents and Close Calls in Human Spaceflight; 2) Subsystem Safety Engineering Through the Project Life Cycle; 3) The Risk Informed Design Process; 4) Types of NASA Hazard Analysis; 5) Preliminary Hazard Analysis (PHA); 6) Hazard Analysis Process; 7) Identify Hazardous Conditions; 8) Consider All Interfaces; 9) Work a Preliminary Hazard List; 10) NASA Generic Hazards List; and 11) Final Thoughts

  7. The RiskScape System - a tool for quantitative multi-risk analysis for natural hazards.

    NASA Astrophysics Data System (ADS)

    Schmidt, J.; Reese, S.; Matcham, I.; King, A.; Bell, R.

    2009-04-01

    This paper introduces a generic framework for multi-risk modelling developed in the project 'Regional RiskScape' at the Research Organization GNS Science and the National Institute of Water and Atmospheric Research Ltd. (NIWA) in New Zealand. Our goal was to develop a generic technology for modelling risks from multiple natural hazards and for multiple risk elements. The framework is independent of the specific nature of the individual hazard and individual risk element. A software prototype has been developed which is capable of 'plugging in' various natural hazards and risk elements without reconfiguring or adapting the generic software framework. To achieve that goal we developed a set of standards for treating the fundamental components of a risk model: hazards, assets (risk elements), and vulnerability models (or fragility functions). Thus, the developed prototype system is able to understand any hazard, asset, or fragility model that is provided to the system according to that standard. We tested the software prototype for modelling earthquake, volcanic, flood, wind, and tsunami risks for urban centres in New Zealand.
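
    The 'plug-in' standard described above can be pictured as a set of common interfaces that any hazard, asset, or fragility model must satisfy. A minimal sketch of that pattern follows; the interface names and toy models are assumptions for illustration, not the actual RiskScape API.

    ```python
    # Sketch of a pluggable risk framework: any hazard or fragility model
    # satisfying the common interface can be combined without changing
    # the framework. Names and toy models are illustrative assumptions.
    from typing import Protocol

    class Hazard(Protocol):
        def intensity(self, location: tuple[float, float]) -> float: ...

    class Fragility(Protocol):
        def damage_ratio(self, intensity: float) -> float: ...

    class FloodDepth:
        def intensity(self, location):
            x, y = location
            return max(0.0, 2.0 - 0.1 * (x + y))  # toy depth surface (m)

    class DepthDamage:
        def damage_ratio(self, depth):
            return min(1.0, depth / 3.0)          # toy depth-damage curve

    def expected_loss(hazard: Hazard, fragility: Fragility,
                      assets: list[tuple[tuple[float, float], float]]) -> float:
        """Assets are (location, replacement value) pairs."""
        return sum(value * fragility.damage_ratio(hazard.intensity(loc))
                   for loc, value in assets)

    assets = [((1.0, 2.0), 250_000.0), ((5.0, 5.0), 400_000.0)]
    print(f"expected loss: {expected_loss(FloodDepth(), DepthDamage(), assets):,.0f}")
    ```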

  8. Quantitative analysis

    PubMed Central

    Nevin, John A.

    1984-01-01

    Quantitative analysis permits the isolation of invariant relations in the study of behavior. The parameters of these relations can serve as higher-order dependent variables in more extensive analyses. These points are illustrated by reference to quantitative descriptions of performance maintained by concurrent schedules, multiple schedules, and signal-detection procedures. Such quantitative descriptions of empirical data may be derived from mathematical theories, which in turn can lead to novel empirical analyses so long as their terms refer to behavioral and environmental events. Thus, quantitative analysis is an integral aspect of the experimental analysis of behavior. PMID:16812400
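
    One canonical example of such an invariant relation for concurrent schedules (offered here as standard background, not necessarily the formulation the author intends) is the generalized matching law, whose fitted parameters, sensitivity a and bias b, are exactly the kind of higher-order dependent variables the abstract describes:

    ```latex
    % Generalized matching law: B_i are response rates on the two
    % alternatives, r_i the obtained reinforcement rates
    \log\frac{B_1}{B_2} = a\,\log\frac{r_1}{r_2} + \log b
    ```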

  9. Hydrothermal Liquefaction Treatment Preliminary Hazard Analysis Report

    SciTech Connect

    Lowry, Peter P.; Wagner, Katie A.

    2015-08-31

    A preliminary hazard assessment was completed during February 2015 to evaluate the conceptual design of the modular hydrothermal liquefaction treatment system. The hazard assessment was performed in two stages. An initial assessment utilizing Hazard Identification and Preliminary Hazards Analysis (PHA) techniques identified areas with significant or unique hazards (process safety-related hazards) that fall outside the normal operating envelope of PNNL and warranted additional analysis. The subsequent assessment was based on a qualitative What-If analysis. This analysis was augmented, as necessary, by additional quantitative analysis for scenarios involving a release of hazardous material or energy with the potential for affecting the public.

  10. Automated Hazard Analysis

    2003-06-26

    The Automated Hazard Analysis (AHA) application is a software tool used to conduct job hazard screening and analysis of tasks to be performed in Savannah River Site facilities. The AHA application provides a systematic approach to the assessment of safety and environmental hazards associated with specific tasks, and the identification of controls, regulations, and other requirements needed to perform those tasks safely. AHA is to be integrated into existing Savannah River Site work control and job hazard analysis processes. Utilization of AHA will improve the consistency and completeness of hazard screening and analysis, and increase the effectiveness of the work planning process.

  11. Hazard Analysis Database Report

    SciTech Connect

    GRAMS, W.H.

    2000-12-28

    The Hazard Analysis Database was developed in conjunction with the hazard analysis activities conducted in accordance with DOE-STD-3009-94, Preparation Guide for U.S. Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports, for HNF-SD-WM-SAR-067, Tank Farms Final Safety Analysis Report (FSAR). The FSAR is part of the approved Authorization Basis (AB) for the River Protection Project (RPP). This document describes, identifies, and defines the contents and structure of the Tank Farms FSAR Hazard Analysis Database and documents the configuration control changes made to the database. The Hazard Analysis Database contains the collection of information generated during the initial hazard evaluations and the subsequent hazard and accident analysis activities. The Hazard Analysis Database supports the preparation of Chapters 3, 4, and 5 of the Tank Farms FSAR and the Unreviewed Safety Question (USQ) process and consists of two major, interrelated data sets: (1) Hazard Analysis Database: Data from the results of the hazard evaluations, and (2) Hazard Topography Database: Data from the system familiarization and hazard identification.

  12. Hazard Analysis Database Report

    SciTech Connect

    GAULT, G.W.

    1999-10-13

    The Hazard Analysis Database was developed in conjunction with the hazard analysis activities conducted in accordance with DOE-STD-3009-94, Preparation Guide for US Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports, for the Tank Waste Remediation System (TWRS) Final Safety Analysis Report (FSAR). The FSAR is part of the approved TWRS Authorization Basis (AB). This document describes, identifies, and defines the contents and structure of the TWRS FSAR Hazard Analysis Database and documents the configuration control changes made to the database. The TWRS Hazard Analysis Database contains the collection of information generated during the initial hazard evaluations and the subsequent hazard and accident analysis activities. The database supports the preparation of Chapters 3, 4, and 5 of the TWRS FSAR and the USQ process and consists of two major, interrelated data sets: (1) Hazard Evaluation Database--Data from the results of the hazard evaluations; and (2) Hazard Topography Database--Data from the system familiarization and hazard identification.

  13. Software safety hazard analysis

    SciTech Connect

    Lawrence, J.D.

    1996-02-01

    Techniques for analyzing the safety and reliability of analog-based electronic protection systems that serve to mitigate hazards in process control systems have been developed over many years, and are reasonably well understood. An example is the protection system in a nuclear power plant. The extension of these techniques to systems which include digital computers is not well developed, and there is little consensus among software engineering experts and safety experts on how to analyze such systems. One possible technique is to extend hazard analysis to include digital computer-based systems. Software is frequently overlooked during system hazard analyses, but this is unacceptable when the software is in control of a potentially hazardous operation. In such cases, hazard analysis should be extended to fully cover the software. A method for performing software hazard analysis is proposed in this paper.

  14. K Basin Hazard Analysis

    SciTech Connect

    PECH, S.H.

    2000-08-23

    This report describes the methodology used in conducting the K Basins Hazard Analysis, which provides the foundation for the K Basins Final Safety Analysis Report. This hazard analysis was performed in accordance with guidance provided by DOE-STD-3009-94, Preparation Guide for U.S. Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports, and implements the requirements of DOE Order 5480.23, Nuclear Safety Analysis Report.

  15. K Basins Hazard Analysis

    SciTech Connect

    WEBB, R.H.

    1999-12-29

    This report describes the methodology used in conducting the K Basins Hazard Analysis, which provides the foundation for the K Basins Safety Analysis Report (HNF-SD-WM-SAR-062, Rev. 4). This hazard analysis was performed in accordance with guidance provided by DOE-STD-3009-94, Preparation Guide for U.S. Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports, and implements the requirements of DOE Order 5480.23, Nuclear Safety Analysis Report.

  16. Identifying hazard parameter to develop quantitative and dynamic hazard map of an active volcano in Indonesia

    NASA Astrophysics Data System (ADS)

    Suminar, Wulan; Saepuloh, Asep; Meilano, Irwan

    2016-05-01

    Hazard assessment for active volcanoes is crucial for risk management. A volcano hazard map provides information to decision makers and communities before, during, and after a volcanic crisis. Rapid and accurate hazard assessment, especially for an active volcano, needs to be developed for better mitigation during volcanic crises in Indonesia. In this paper, we identified hazard parameters for developing a quantitative and dynamic hazard map of an active volcano. Guntur volcano in the Garut Region, West Java, Indonesia was selected as the study area because the population resides adjacent to the active volcano. Infrastructure development, especially tourism-related development on the eastern flank of the summit, is growing rapidly. Remote sensing and field investigation approaches were used to obtain hazard parameters spatially. We developed a quantitative and dynamic algorithm to map the spatial hazard potential of the volcano based on an index overlay technique. Five volcano hazard parameters were identified from Landsat 8 and ASTER imagery: volcanic products (including pyroclastic fallout, pyroclastic flows, lava, and lahar), slope topography, surface brightness temperature, and vegetation density. Following the proposed technique, the hazard parameters were extracted, indexed, and calculated to produce spatial hazard values at and around Guntur Volcano. Based on this method, the hazard potential of areas with low vegetation density is higher than that of areas with high vegetation density. Furthermore, slope topography, surface brightness temperature, and fragmental volcanic products such as pyroclastics significantly influenced the spatial hazard value. Further study of this approach will aim at effective and efficient volcano risk assessment.
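
    A minimal sketch of an index-overlay calculation of the kind described: each parameter raster is indexed to a common scale and combined as a weighted sum, with vegetation density inverted so that sparse vegetation scores as higher hazard. The grids and weights are invented; the paper's actual indexing scheme may differ.

    ```python
    # Index-overlay sketch: weighted sum of parameter rasters indexed to
    # a common 0..1 scale (1 = more hazardous). Values are invented.
    import numpy as np

    volcanic_product = np.array([[1.0, 0.8, 0.3], [0.7, 0.5, 0.2], [0.3, 0.2, 0.1]])
    slope            = np.array([[0.9, 0.6, 0.4], [0.6, 0.5, 0.3], [0.4, 0.3, 0.2]])
    brightness_temp  = np.array([[0.8, 0.7, 0.2], [0.5, 0.4, 0.2], [0.2, 0.1, 0.1]])
    vegetation       = np.array([[0.2, 0.4, 0.8], [0.3, 0.5, 0.9], [0.6, 0.8, 1.0]])

    # Low vegetation density implies higher hazard, so invert the index
    veg_hazard = 1.0 - vegetation

    weights = {"product": 0.4, "slope": 0.2, "temp": 0.2, "veg": 0.2}
    hazard = (weights["product"] * volcanic_product + weights["slope"] * slope
              + weights["temp"] * brightness_temp + weights["veg"] * veg_hazard)
    print(np.round(hazard, 2))
    ```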

  17. Probabilistic Seismic Hazard Analysis

    SciTech Connect

    Not Available

    1988-01-01

    The purpose of Probabilistic Seismic Hazard Analysis (PSHA) is to evaluate the hazard of seismic ground motion at a site by considering all possible earthquakes in the area, estimating the associated shaking at the site, and calculating the probabilities of these occurrences. The Panel on Seismic Hazard Analysis is charged with assessment of the capabilities, limitations, and future trends of PSHA in the context of alternatives. The report identifies and discusses key issues of PSHA and is addressed to decision makers with a modest scientific and technical background and to the scientific and technical community. 37 refs., 19 figs.
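
    A minimal sketch of the PSHA computation the panel's report addresses: the annual rate at which a ground-motion level is exceeded is the sum, over all earthquake scenarios, of each scenario's occurrence rate times the probability that it produces shaking above that level. The recurrence and ground-motion coefficients below are invented placeholders, assuming a single point source.

    ```python
    # Toy single-source PSHA: hazard(a) = sum over magnitude bins of
    # (annual rate of bin) x P(PGA > a | magnitude, distance).
    import numpy as np
    from scipy.stats import norm

    mags = np.arange(5.0, 7.6, 0.1)            # magnitude bins
    n_ge = 10 ** (4.0 - 1.0 * mags)            # Gutenberg-Richter: rate of M >= m
    rates = -np.diff(n_ge, append=0.0)         # annual rate within each bin

    r_km = 20.0                                # source-to-site distance (km)
    sigma = 0.6                                # std dev of ln(PGA)

    def ln_median_pga(m, r):
        # Toy attenuation model with placeholder coefficients
        return -4.0 + 1.0 * m - 1.3 * np.log(r + 10.0)

    pga_levels = np.linspace(0.05, 1.0, 20)    # PGA levels (g)
    hazard = [float(np.sum(rates * norm.sf((np.log(a) - ln_median_pga(mags, r_km)) / sigma)))
              for a in pga_levels]

    for a, lam in zip(pga_levels[::5], hazard[::5]):
        print(f"annual rate of PGA > {a:.2f} g: {lam:.2e}")
    ```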

  18. QMRA (quantitative microbial risk assessment) and HACCP (hazard analysis and critical control points) for management of pathogens in wastewater and sewage sludge treatment and reuse.

    PubMed

    Westrell, T; Schönning, C; Stenström, T A; Ashbolt, N J

    2004-01-01

    Hazard Analysis and Critical Control Points (HACCP) was applied for identifying and controlling exposure to pathogenic microorganisms encountered during normal sludge and wastewater handling at a 12,500 m3/d treatment plant utilising tertiary wastewater treatment and mesophilic sludge digestion. The hazardous scenarios considered were human exposure during treatment, handling, soil application and crop consumption, and exposure via water at the wetland area and during recreational swimming. A quantitative microbial risk assessment (QMRA), including rotavirus, adenovirus, haemorrhagic E. coli, Salmonella, Giardia and Cryptosporidium, was performed in order to prioritise pathogen hazards for control purposes. Human exposures were treated as individual risks but were also related to the endemic situation in the general population. The highest individual health risk from a single exposure was via aerosols for workers at the belt press for sludge dewatering (virus infection risk = 1). The largest impact on the community would arise if children ingested sludge at the unprotected storage site, although in the worst-case situation the largest number of infections would arise through vegetables fertilised with sludge and eaten raw (not allowed in Sweden). Acceptable risk for various hazardous scenarios, treatment and/or reuse strategies could be tested in the model.
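
    The dose-response step of a QMRA is commonly an exponential or beta-Poisson model. A minimal sketch with the exponential form, where the probability of infection from a single exposure is 1 - exp(-r x dose); the parameter values are placeholders, not those used in the study.

    ```python
    # Exponential dose-response sketch for the QMRA dose-response step.
    # Parameter values are illustrative placeholders.
    import math

    def p_infection_exponential(dose: float, r: float) -> float:
        """Probability of infection from a single exposure."""
        return 1.0 - math.exp(-r * dose)

    # e.g., an ingested dose of 2 organisms with illustrative r = 0.5
    print(p_infection_exponential(dose=2.0, r=0.5))   # ~0.63
    ```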

  19. Probabilistic Tsunami Hazard Analysis

    NASA Astrophysics Data System (ADS)

    Thio, H. K.; Ichinose, G. A.; Somerville, P. G.; Polet, J.

    2006-12-01

    The recent tsunami disaster caused by the 2004 Sumatra-Andaman earthquake has focused our attention on the hazard posed by large earthquakes that occur under water, in particular subduction zone earthquakes, and the tsunamis that they generate. Even though these kinds of events are rare, the very large loss of life and material destruction caused by this earthquake warrant a significant effort towards the mitigation of the tsunami hazard. For ground motion hazard, Probabilistic Seismic Hazard Analysis (PSHA) has become a standard practice in the evaluation and mitigation of seismic hazard to populations, in particular with respect to structures, infrastructure, and lifelines. Its ability to condense the complexities and variability of seismic activity into a manageable set of parameters greatly facilitates the design of effective seismic-resistant buildings as well as the planning of infrastructure projects. Probabilistic Tsunami Hazard Analysis (PTHA) achieves the same goal for hazards posed by tsunami. There are great advantages to implementing such a method to evaluate the total risk (seismic and tsunami) to coastal communities. The method that we have developed is based on traditional PSHA and is therefore completely consistent with standard seismic practice. Because of the strong dependence of tsunami wave heights on bathymetry, we use full tsunami waveform computations in lieu of the attenuation relations that are common in PSHA. By pre-computing and storing the tsunami waveforms at points along the coast generated for sets of subfaults that comprise larger earthquake faults, we can efficiently synthesize tsunami waveforms for any slip distribution on those faults by summing the individual subfault tsunami waveforms (weighted by their slip). This efficiency makes it feasible to use Green's function summation in lieu of attenuation relations to provide very accurate estimates of tsunami height for probabilistic calculations, where one typically computes
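
    The Green's-function summation the abstract describes is linear superposition: precomputed unit-slip waveforms for each subfault are scaled by slip and summed. A minimal sketch follows; the shapes and values are invented for illustration.

    ```python
    # Green's-function summation sketch: waveform(t) = sum_k slip_k * G_k(t),
    # with G_k precomputed for unit slip on subfault k. Values are invented.
    import numpy as np

    n_subfaults, n_times = 4, 200
    t = np.linspace(0, 2000, n_times)          # time (s)

    # Precomputed unit-slip waveforms at one coastal point (toy pulses)
    greens = np.array([np.exp(-((t - 400 - 150 * k) / 120.0) ** 2)
                       for k in range(n_subfaults)])

    slip = np.array([2.0, 4.0, 3.0, 0.5])      # slip (m) on each subfault

    waveform = slip @ greens                   # linear superposition
    print(f"peak tsunami amplitude: {waveform.max():.2f} (arbitrary units)")
    ```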

  20. Chemical process hazards analysis

    SciTech Connect

    1996-02-01

    The Office of Worker Health and Safety (EH-5) under the Assistant Secretary for the Environment, Safety and Health of the U.S. Department of Energy (DOE) has published two handbooks for use by DOE contractors managing facilities and processes covered by the Occupational Safety and Health Administration (OSHA) Rule for Process Safety Management of Highly Hazardous Chemicals (29 CFR 1910.119), herein referred to as the PSM Rule. The PSM Rule contains an integrated set of chemical process safety management elements designed to prevent chemical releases that can lead to catastrophic fires, explosions, or toxic exposures. The purpose of the two handbooks, "Process Safety Management for Highly Hazardous Chemicals" and "Chemical Process Hazards Analysis," is to facilitate implementation of the provisions of the PSM Rule within the DOE. The purpose of this handbook, "Chemical Process Hazards Analysis," is to facilitate, within the DOE, the performance of chemical process hazards analyses (PrHAs) as required under the PSM Rule. It provides basic information for the performance of PrHAs, and should not be considered a complete resource on PrHA methods. Likewise, to determine if a facility is covered by the PSM Rule, the reader should refer to the handbook "Process Safety Management for Highly Hazardous Chemicals" (DOE-HDBK-1101-96). Promulgation of the PSM Rule has heightened the awareness of chemical safety management issues within the DOE. This handbook is intended for use by DOE facilities and processes covered by the PSM Rule to facilitate contractor implementation of the PrHA element of the PSM Rule. However, contractors whose facilities and processes are not covered by the PSM Rule may also use this handbook as a basis for conducting process hazards analyses as part of their good management practices. This handbook explains the minimum requirements for PrHAs outlined in the PSM Rule. Nowhere have requirements been added beyond what is specifically required by the rule.

  1. Counterfactual Volcano Hazard Analysis

    NASA Astrophysics Data System (ADS)

    Woo, Gordon

    2013-04-01

    The historical database of past disasters is a cornerstone of catastrophe risk assessment. Whereas disasters are fortunately comparatively rare, near-misses are quite common for both natural and man-made hazards. The word disaster originally means 'an unfavourable aspect of a star'. Except for astrologists, disasters are no longer perceived fatalistically as pre-determined. Nevertheless, to this day, historical disasters are treated statistically as fixed events, although in reality there is a large luck element involved in converting a near-miss crisis situation into a disaster statistic. It is possible to conceive of a stochastic simulation of the past to explore the implications of this chance factor. Counterfactual history is the exercise of hypothesizing alternative paths of history from what actually happened. Exploring history from a counterfactual perspective is instructive for a variety of reasons. First, it is easy to be fooled by randomness and to see regularity in event patterns that are illusory. The past is just one realization of a variety of possible evolutions of history, which may be analyzed through a stochastic simulation of an array of counterfactual scenarios. In any hazard context, there is a random component equivalent to dice being rolled to decide whether a near-miss becomes an actual disaster. The fact that there may be no observed disaster over a period of time may belie the occurrence of numerous near-misses. This may be illustrated using the simple dice paradigm. Suppose a die is rolled every month for a year, and an event is recorded if a six is thrown. There is still an 11% chance of no events occurring during the year. A variety of perils may be used to illustrate the use of near-miss information within a counterfactual disaster analysis. In the domain of natural hazards, near-misses are a notable feature of the threat landscape. Storm surges are an obvious example. Sea defences may protect against most meteorological scenarios. However
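
    The 11% figure follows directly from the stated setup, twelve independent rolls each with a 1/6 chance of producing an event:

    ```latex
    % Probability of recording no event (no six) in 12 monthly rolls
    P(\text{no event in a year}) = \left(\frac{5}{6}\right)^{12} \approx 0.112
    ```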

  2. Supplemental Hazard Analysis and Risk Assessment - Hydrotreater

    SciTech Connect

    Lowry, Peter P.; Wagner, Katie A.

    2015-04-01

    A supplemental hazard analysis was conducted and a quantitative risk assessment performed in response to an independent review comment received by the Pacific Northwest National Laboratory (PNNL) from the U.S. Department of Energy Pacific Northwest Site Office (PNSO) against the Hydrotreater/Distillation Column Hazard Analysis Report issued in April 2013. The supplemental analysis used the hazardous conditions documented by the previous April 2013 report as a basis. The conditions were screened and grouped for the purpose of identifying whether additional prudent, practical hazard controls could be identified, using a quantitative risk evaluation to assess the adequacy of the controls and establish a lower level of concern for the likelihood of potential serious accidents. Calculations were performed to support conclusions where necessary.

  3. MGR External Events Hazards Analysis

    SciTech Connect

    L. Booth

    1999-11-06

    The purpose and objective of this analysis is to apply an external events Hazards Analysis (HA) to the License Application Design Selection Enhanced Design Alternative II (LADS EDA II design, Reference 8.32). The output of the HA is called a Hazards List (HL). This analysis supersedes the external hazards portion of Rev. 00 of the PHA (Reference 8.1). The PHA for internal events will also be updated to the LADS EDA II design, but under a separate analysis. Like the PHA methodology, the HA methodology provides a systematic method to identify potential hazards during the 100-year Monitored Geologic Repository (MGR) operating period, updated to reflect the EDA II design. The resulting events on the HL are candidates that may have potential radiological consequences as determined during Design Basis Events (DBEs) analyses. Therefore, the HL that results from this analysis will undergo further screening and analysis based on the criteria that apply during the performance of DBE analyses.

  4. Probabilistic analysis of tsunami hazards

    USGS Publications Warehouse

    Geist, E.L.; Parsons, T.

    2006-01-01

    Determining the likelihood of a disaster is a key component of any comprehensive hazard assessment. This is particularly true for tsunamis, even though most tsunami hazard assessments have in the past relied on scenario or deterministic type models. We discuss probabilistic tsunami hazard analysis (PTHA) from the standpoint of integrating computational methods with empirical analysis of past tsunami runup. PTHA is derived from probabilistic seismic hazard analysis (PSHA), with the main difference being that PTHA must account for far-field sources. The computational methods rely on numerical tsunami propagation models rather than empirical attenuation relationships as in PSHA in determining ground motions. Because a number of source parameters affect local tsunami runup height, PTHA can become complex and computationally intensive. Empirical analysis can function in one of two ways, depending on the length and completeness of the tsunami catalog. For site-specific studies where there is sufficient tsunami runup data available, hazard curves can primarily be derived from empirical analysis, with computational methods used to highlight deficiencies in the tsunami catalog. For region-wide analyses and sites where there are little to no tsunami data, a computationally based method such as Monte Carlo simulation is the primary method to establish tsunami hazards. Two case studies that describe how computational and empirical methods can be integrated are presented for Acapulco, Mexico (site-specific) and the U.S. Pacific Northwest coastline (region-wide analysis).
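
    Where a sufficiently long runup catalog exists, the empirical side of the analysis reduces to counting exceedances per year of record. A minimal sketch with an invented catalog:

    ```python
    # Empirical tsunami hazard sketch: annual exceedance rates estimated
    # directly from a site runup catalog. The catalog is invented.
    import numpy as np

    catalog_years = 150
    runups_m = np.array([0.3, 0.5, 0.8, 1.2, 1.2, 2.0, 3.5, 6.0])  # observed runups (m)

    for h in (0.5, 1.0, 2.0, 4.0):
        rate = np.sum(runups_m >= h) / catalog_years
        print(f"runup >= {h} m: {rate:.3f} / yr")
    ```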

  5. Preliminary hazards analysis -- vitrification process

    SciTech Connect

    Coordes, D.; Ruggieri, M.; Russell, J.; TenBrook, W.; Yimbo, P.

    1994-06-01

    This paper presents a Preliminary Hazards Analysis (PHA) for mixed waste vitrification by joule heating. The purpose of performing a PHA is to establish an initial hazard categorization for a DOE nuclear facility and to identify those processes and structures which may have an impact on or be important to safety. The PHA is typically performed during, and provides input to, project conceptual design. The PHA is then followed by a Preliminary Safety Analysis Report (PSAR) performed during Title 1 and 2 design. The PSAR then leads to performance of the Final Safety Analysis Report, performed during the facility's construction and testing; it should be completed before routine operation of the facility commences. This PHA addresses the first four chapters of the safety analysis process, in accordance with the requirements of DOE Safety Guidelines in SG 830.110. The hazards associated with vitrification processes are evaluated using standard safety analysis methods, which include: identification of credible potential hazardous energy sources; identification of preventative features of the facility or system; identification of mitigative features; and analyses of credible hazards. Maximal facility inventories of radioactive and hazardous materials are postulated to evaluate worst-case accident consequences. These inventories were based on DOE-STD-1027-92 guidance and the surrogate waste streams defined by Mayberry et al. Radiological assessments indicate that a facility, depending on the radioactive material inventory, may be an exempt, Category 3, or Category 2 facility. The calculated impacts would result in no significant impact to offsite personnel or the environment. Hazardous materials assessment indicates that a Mixed Waste Vitrification facility will be a Low Hazard facility having minimal impacts to offsite personnel and the environment.

  6. Quantitative estimation of time-variable earthquake hazard by using fuzzy set theory

    NASA Astrophysics Data System (ADS)

    Deyi, Feng; Ichikawa, M.

    1989-11-01

    In this paper, various methods of fuzzy set theory, called fuzzy mathematics, have been applied to the quantitative estimation of time-variable earthquake hazard. The results obtained consist of the following. (1) Quantitative estimation of the earthquake hazard on the basis of seismicity data. By using methods of fuzzy mathematics, seismicity patterns before large earthquakes can be studied more clearly and more quantitatively, highly active periods in a given region and quiet periods of seismic activity before large earthquakes can be recognized, and similarities in the temporal variation of seismic activity and seismic gaps can be examined; the time-variable earthquake hazard can also be assessed directly on the basis of a series of statistical indices of seismicity. Two methods of fuzzy clustering analysis, the method of fuzzy similarity, and the direct method of fuzzy pattern recognition have been studied in particular. One method of fuzzy clustering analysis is based on fuzzy netting, and the other is based on the fuzzy equivalence relation. (2) Quantitative estimation of the earthquake hazard on the basis of observational data for different precursors. The direct method of fuzzy pattern recognition has been applied to research on earthquake precursors of different kinds. On the basis of the temporal and spatial characteristics of recognized precursors, earthquake hazards on different time scales can be estimated. This paper mainly deals with medium- and short-term precursors observed in Japan and China.
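
    The basic fuzzy-set device underlying these methods is the membership function, which replaces a crisp threshold with a degree of membership in [0, 1]. A minimal sketch for a seismicity index follows; the ramp endpoints are illustrative assumptions, not values from the paper.

    ```python
    # Fuzzy membership sketch: degree to which a region counts as
    # "highly active", instead of a crisp yes/no threshold.
    def membership_high_activity(events_per_year: float,
                                 low: float = 5.0, high: float = 20.0) -> float:
        """Linear ramp: 0 below `low`, 1 above `high` (illustrative endpoints)."""
        if events_per_year <= low:
            return 0.0
        if events_per_year >= high:
            return 1.0
        return (events_per_year - low) / (high - low)

    for rate in (3, 8, 15, 25):
        print(rate, "events/yr ->", round(membership_high_activity(rate), 2))
    ```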

  7. FIRE HAZARDS ANALYSIS - BUSTED BUTTE

    SciTech Connect

    R. Longwell; J. Keifer; S. Goodin

    2001-01-22

    The purpose of this fire hazards analysis (FHA) is to assess the risk from fire within individual fire areas at the Busted Butte Test Facility and to ascertain whether the DOE fire safety objectives are met. The objective, identified in DOE Order 420.1, Section 4.2, is to establish requirements for a comprehensive fire and related hazards protection program for facilities sufficient to minimize the potential for: (1) The occurrence of a fire related event. (2) A fire that causes an unacceptable on-site or off-site release of hazardous or radiological material that will threaten the health and safety of employees. (3) Vital DOE programs suffering unacceptable interruptions as a result of fire and related hazards. (4) Property losses from a fire and related events exceeding limits established by DOE. (5) Critical process controls and safety class systems being damaged as a result of a fire and related events.

  8. 14 CFR 437.29 - Hazard analysis.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    § 437.29 Hazard analysis. (a) An applicant must perform a hazard analysis that complies with § 437.55(a). (b) An applicant must provide to the FAA all the results of each step of the hazard...

  9. 14 CFR 437.29 - Hazard analysis.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    § 437.29 Hazard analysis. (a) An applicant must perform a hazard analysis that complies with § 437.55(a). (b) An applicant must provide to the FAA all the results of each step of the hazard...

  10. 14 CFR 437.29 - Hazard analysis.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    § 437.29 Hazard analysis. (a) An applicant must perform a hazard analysis that complies with § 437.55(a). (b) An applicant must provide to the FAA all the results of each step of the hazard...

  11. 14 CFR 437.29 - Hazard analysis.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    § 437.29 Hazard analysis. (a) An applicant must perform a hazard analysis that complies with § 437.55(a). (b) An applicant must provide to the FAA all the results of each step of the hazard...

  12. The Integrated Hazard Analysis Integrator

    NASA Technical Reports Server (NTRS)

    Morris, A. Terry; Massie, Michael J.

    2009-01-01

    Hazard analysis addresses hazards that arise in the design, development, manufacturing, construction, facilities, transportation, operations, and disposal activities associated with hardware, software, maintenance, operations, and environments. An integrated hazard is an event or condition that is caused by or controlled by multiple systems, elements, or subsystems. Integrated hazard analysis (IHA) is especially daunting and ambitious for large, complex systems such as NASA's Constellation program, which incorporates program, systems, and element components that impact others (International Space Station, public, International Partners, etc.). An appropriate IHA should identify all hazards, causes, controls, and verifications used to mitigate the risk of catastrophic loss of crew, vehicle, and/or mission. Unfortunately, in the current age of increased technology dependence, there is the tendency to sometimes overlook the necessary and sufficient qualifications of the integrator, that is, the person/team that identifies the parts, analyzes the architectural structure, aligns the analysis with the program plan, and then communicates/coordinates with large and small components, each contributing necessary hardware, software, and/or information to prevent catastrophic loss. As seen in both the Challenger and Columbia accidents, lack of appropriate communication, management errors, and lack of resources dedicated to safety were cited as major contributors to these fatalities. From the accident reports, it would appear that the organizational impact of managers, integrators, and safety personnel contributes more significantly to mission success and mission failure than purely technological components. If this is so, then organizations who sincerely desire mission success must put as much effort into selecting managers and integrators as they do when designing the hardware, writing the software code, and analyzing competitive proposals. This paper will discuss the necessary and

  13. 14 CFR 437.55 - Hazard analysis.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    § 437.55 Hazard analysis. (a) A permittee must... safety of property resulting from each permitted flight. This hazard analysis must— (1) Identify...

  14. 14 CFR 437.55 - Hazard analysis.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    § 437.55 Hazard analysis. (a) A permittee must... safety of property resulting from each permitted flight. This hazard analysis must— (1) Identify...

  15. 14 CFR 437.55 - Hazard analysis.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    § 437.55 Hazard analysis. (a) A permittee must... safety of property resulting from each permitted flight. This hazard analysis must— (1) Identify...

  16. 14 CFR 437.55 - Hazard analysis.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    § 437.55 Hazard analysis. (a) A permittee must... safety of property resulting from each permitted flight. This hazard analysis must— (1) Identify...

  17. 14 CFR 437.29 - Hazard analysis.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    § 437.29 Hazard analysis. (a) An applicant must perform a hazard analysis that complies with §...

  18. INTERNAL HAZARDS ANALYSIS FOR LICENSE APPLICATION

    SciTech Connect

    R.J. Garrett

    2005-02-17

    The purpose of this internal hazards analysis is to identify and document the internal hazards and potential initiating events associated with preclosure operations of the repository at Yucca Mountain. Internal hazards are those hazards presented by the operation of the facility and by its associated processes that can potentially lead to a radioactive release or cause a radiological hazard. In contrast to external hazards, internal hazards do not involve natural phenomena and external man-made hazards. This internal hazards analysis was performed in support of the preclosure safety analysis and the License Application for the Yucca Mountain Project. The methodology for this analysis provides a systematic means to identify internal hazards and potential initiating events that may result in a radiological hazard or radiological release during the repository preclosure period. These hazards are documented in tables of potential internal hazards and potential initiating events (Section 6.6) for input to the repository event sequence categorization process. The results of this analysis will undergo further screening and analysis based on the criteria that apply to the performance of event sequence analyses for the repository preclosure period. The evolving design of the repository will be re-evaluated periodically to ensure that internal hazards that have not been previously evaluated are identified.

  19. Quantitative environmental risk analysis

    SciTech Connect

    Klovning, J.; Nilsen, E.F.

    1995-12-31

    According to regulations relating to the implementation and use of risk analysis in petroleum activities issued by the Norwegian Petroleum Directorate, it is mandatory for an operator on the Norwegian Continental Shelf to establish acceptance criteria for environmental risk in its activities and to carry out environmental risk analysis. This paper presents a "new" method for environmental risk analysis developed by the company. The objective has been to assist the company in meeting rules and regulations and in assessing and describing the environmental risk in a systematic manner. In the environmental risk analysis, the most sensitive biological resource in the affected area is used to assess the environmental damage. The analytical method is based on the methodology for quantitative risk analysis related to loss of life. In addition, it incorporates the effect of seasonal fluctuations in the environmental risk evaluations. The paper describes the function of the main analytical sequences, exemplified through an analysis of environmental risk related to exploration drilling in an environmentally sensitive area on the Norwegian Continental Shelf.

  20. 21 CFR 123.6 - Hazard analysis and Hazard Analysis Critical Control Point (HACCP) plan.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... identified food safety hazards, including as appropriate: (i) Critical control points designed to control... control points designed to control food safety hazards introduced outside the processing plant environment...

  1. Genetic toxicology at the crossroads-from qualitative hazard evaluation to quantitative risk assessment.

    PubMed

    White, Paul A; Johnson, George E

    2016-05-01

    Applied genetic toxicology is undergoing a transition from qualitative hazard identification to quantitative dose-response analysis and risk assessment. To facilitate this change, the Health and Environmental Sciences Institute (HESI) Genetic Toxicology Technical Committee (GTTC) sponsored a workshop held in Lancaster, UK on July 10-11, 2014. The event included invited speakers from several institutions, and the content was divided into three themes: (1) Point-of-Departure Metrics for Quantitative Dose-Response Analysis in Genetic Toxicology; (2) Measurement and Estimation of Exposures for Better Extrapolation to Humans; and (3) The Use of Quantitative Approaches in Genetic Toxicology for Human Health Risk Assessment (HHRA). A host of pertinent issues were discussed relating to the use of in vitro and in vivo dose-response data, the development of methods for in vitro to in vivo extrapolation, and approaches to using in vivo dose-response data to determine human exposure limits for regulatory evaluations and decision-making. This Special Issue, which was inspired by the workshop, contains a series of papers that collectively address topics related to the aforementioned themes. The Issue includes contributions that collectively evaluate, describe, and discuss in silico, in vitro, in vivo, and statistical approaches that are facilitating the shift from qualitative hazard evaluation to quantitative risk assessment. The use and application of the benchmark dose approach was a central theme in many of the workshop presentations and discussions, and the Special Issue includes several contributions that outline novel applications for the analysis and interpretation of genetic toxicity data. Although the contents of the Special Issue constitute an important step towards the adoption of quantitative methods for regulatory assessment of genetic toxicity, formal acceptance of quantitative methods for HHRA and regulatory decision-making will require consensus regarding the

  2. Development of a Probabilistic Tsunami Hazard Analysis in Japan

    SciTech Connect

    Toshiaki Sakai; Tomoyoshi Takeda; Hiroshi Soraoka; Ken Yanagisawa; Tadashi Annaka

    2006-07-01

    As with seismic design, it is meaningful for tsunami assessment to evaluate phenomena beyond the design basis: even once a design-basis tsunami height is set, that height may still be exceeded because of uncertainties regarding tsunami phenomena. Probabilistic tsunami risk assessment consists of estimating the tsunami hazard and the fragility of structures and executing a system analysis. In this report, we apply a method for probabilistic tsunami hazard analysis (PTHA). We introduce a logic-tree approach to estimate tsunami hazard curves (relationships between tsunami height and probability of exceedance) and present an example for Japan. Examples of tsunami hazard curves are illustrated, and uncertainty in the tsunami hazard is displayed by 5-, 16-, 50-, 84- and 95-percentile and mean hazard curves. The results of PTHA will be used for quantitative assessment of the tsunami risk to important facilities located in coastal areas. Tsunami hazard curves are reasonable input data for structure and system analysis. However, the evaluation method for estimating the fragility of structures and the procedure for system analysis are still being developed. (authors)
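
    A minimal sketch of the logic-tree aggregation described above: each branch produces a hazard curve, and the branch weights define a discrete distribution of exceedance probability at each tsunami height, from which mean and fractile (percentile) curves are read off. The branch curves and weights below are invented for illustration.

    ```python
    # Logic-tree hazard-curve aggregation sketch: mean and fractile
    # curves across weighted branches. Curves and weights are invented.
    import numpy as np

    heights = np.linspace(1, 10, 10)                 # tsunami height (m)
    # Three toy branch curves: annual exceedance probability vs height
    branches = np.array([np.exp(-heights / s) * 1e-2 for s in (1.5, 2.5, 4.0)])
    weights = np.array([0.3, 0.5, 0.2])

    mean_curve = weights @ branches

    def fractile_curve(p: float) -> np.ndarray:
        """Weighted fractile across branches, height by height."""
        out = np.empty_like(heights)
        for j in range(len(heights)):
            order = np.argsort(branches[:, j])       # sort branch values
            cum = np.cumsum(weights[order])          # cumulative weight
            out[j] = branches[order][np.searchsorted(cum, p), j]
        return out

    for p in (0.05, 0.5, 0.95):
        print(f"{p:.0%} fractile at 5 m: {fractile_curve(p)[4]:.2e}")
    print(f"mean at 5 m: {mean_curve[4]:.2e}")
    ```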

  3. Canister storage building hazard analysis report

    SciTech Connect

    Krahn, D.E.; Garvin, L.J.

    1997-07-01

    This report describes the methodology used in conducting the Canister Storage Building (CSB) hazard analysis to support the final CSB safety analysis report (SAR) and documents the results. The hazard analysis was performed in accordance with DOE-STD-3009-94, Preparation Guide for US Department of Energy Nonreactor Nuclear Facility Safety Analysis Report, and implements the requirements of DOE Order 5480.23, Nuclear Safety Analysis Report.

  4. Cold Vacuum Drying Facility hazard analysis report

    SciTech Connect

    Krahn, D.E.

    1998-02-23

    This report describes the methodology used in conducting the Cold Vacuum Drying Facility (CVDF) hazard analysis to support the CVDF phase 2 safety analysis report (SAR), and documents the results. The hazard analysis was performed in accordance with DOE-STD-3009-94, Preparation Guide for US Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports, and implements the requirements of US Department of Energy (DOE) Order 5480.23, Nuclear Safety Analysis Reports.

  5. Integrating expert opinion with modelling for quantitative multi-hazard risk assessment in the Eastern Italian Alps

    NASA Astrophysics Data System (ADS)

    Chen, Lixia; van Westen, Cees J.; Hussin, Haydar; Ciurean, Roxana L.; Turkington, Thea; Chavarro-Rincon, Diana; Shrestha, Dhruba P.

    2016-11-01

    Extreme rainfall events are the main triggering causes of hydro-meteorological hazards in mountainous areas, where development is often constrained by the limited space suitable for construction. In these areas, hazard and risk assessments are fundamental for risk mitigation, especially for preventive planning, risk communication and emergency preparedness. Multi-hazard risk assessment in mountainous areas at local and regional scales remains a major challenge because of the lack of data related to past events and causal factors, and the interactions between different types of hazards. The lack of data leads to a high level of uncertainty in the application of quantitative methods for hazard and risk assessment. Therefore, a systematic approach is required to combine these quantitative methods with expert-based assumptions and decisions. In this study, a quantitative multi-hazard risk assessment was carried out in the Fella River valley, prone to debris flows and floods, in the north-eastern Italian Alps. The main steps include data collection and development of inventory maps, definition of hazard scenarios, hazard assessment in terms of temporal and spatial probability calculation and intensity modelling, elements-at-risk mapping, estimation of asset values and the number of people, physical vulnerability assessment, the generation of risk curves and annual risk calculation. To compare the risk for each type of hazard, risk curves were generated for debris flows, river floods and flash floods. Uncertainties were expressed as minimum, average and maximum values of temporal and spatial probability, replacement costs of assets, population numbers, and physical vulnerability. These result in minimum, average and maximum risk curves. To validate this approach, a back analysis was conducted using the extreme hydro-meteorological event that occurred in August 2003 in the Fella River valley. The results show a good performance when compared to the historical damage reports.
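
    The last two steps named above, generating risk curves and calculating annual risk, reduce to simple operations on scenario (annual probability, loss) pairs. A minimal sketch with invented numbers:

    ```python
    # Risk-curve sketch: loss-exceedance curve from scenario pairs plus
    # an average annual loss. All scenario numbers are invented.
    import numpy as np

    # (annual probability of occurrence, loss in EUR) per hazard scenario
    scenarios = [(0.1, 2e5), (0.02, 1.5e6), (0.005, 8e6)]

    probs = np.array([p for p, _ in scenarios])
    losses = np.array([l for _, l in scenarios])

    # Exceedance curve: annual probability that loss >= L (rare events,
    # so rates approximate probabilities)
    order = np.argsort(losses)
    exceed = np.cumsum(probs[order][::-1])[::-1]
    for L, p in zip(losses[order], exceed):
        print(f"P(loss >= {L:,.0f} EUR) ~ {p:.3f} / yr")

    # Average annual loss: sum of probability x loss over scenarios
    aal = float(np.sum(probs * losses))
    print(f"average annual loss ~ {aal:,.0f} EUR")
    ```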

  6. Hazard Analysis for In Tank Spray Leaks

    SciTech Connect

    GRAMS, W.H.

    2000-06-13

    The River Protection Project (RPP) Authorization Basis (AB) contains controls that address spray leaks in tanks. However, there are no hazardous conditions in the Hazards Database that specifically identify in-tank spray leak scenarios. The purpose of this Hazards Evaluation is to develop hazardous conditions related to in-tank spray leaks for the Hazards Database and to provide more complete coverage of Tank Farm facilities. Currently, the in-tank spray leak is part of the "Spray Leak in Structures or From Waste Transfer Lines" accidents in Section 3.4.2.9 of the Final Safety Analysis Report (FSAR) (CHG, 2000a). The accident analysis for the "Spray Leak in Structures or From Waste Transfer Lines" states the following regarding the location of a possible spray leak: Inside ventilated waste storage tanks (DSTs, DCRTs, and some SSTs). Aerosols could be generated inside a storage tank during a transfer because of a leak from the portion of the transfer pipe inside the tank. The tank ventilation system could help disperse the aerosols to the atmosphere should the vent system HEPA filters fail. This Hazards Evaluation also evaluates the controls currently assigned to the spray leak in structure accident and determines the applicability of the controls to the new hazardous conditions. This comparison reviews both the analysis in the FSAR and the controls found in the Technical Safety Requirements (TSRs) (CHG, 2000h). If the new hazardous conditions do not match the analyzed accident conditions and controls, then additional analysis may be required. This document is not intended to authorize the activity or determine the adequacy of controls; it is only intended to provide information about the hazardous conditions associated with this activity. The Control decision process as defined in the AB will be used to determine the adequacy of controls and whether the proposed activity is within the AB. This hazard evaluation does not constitute an accident analysis.

  7. Integrating waste management with Job Hazard analysis

    SciTech Connect

    2007-07-01

    The web-based Automated Job Hazard Analysis (AJHA) system is a tool designed to help capture and communicate the results of the hazard review and mitigation process for specific work activities. In Fluor Hanford's day-to-day work planning and execution process, AJHA has become the focal point for integrating Integrated Safety Management (ISM) through industrial health and safety principles; environmental safety measures; and involvement by workers, subject-matter experts, and management. This paper illustrates how AJHA has become a key element in involving waste-management and environmental-control professionals in planning and executing work. To support implementing requirements for waste management and environmental compliance within the core functions and guiding principles of an integrated safety management system (ISMS), Fluor Hanford has incorporated the Automated Job Hazard Analysis (AJHA), a computer-based application, into the work management process. This web-based software tool helps integrate the knowledge of site workers and subject-matter experts with the safety principles and requirements established in standards and regulations. AJHA facilitates a process of work-site review, hazard identification, analysis, and the determination of specific work controls. The AJHA application provides a well-organized job hazard analysis report, including training and staffing requirements, prerequisite actions, notifications, and specific work controls listed for each sub-task determined for the job. AJHA lists common hazards addressed in U.S. Occupational Safety and Health Administration (OSHA) federal codes and in state regulations such as those of the Washington Industrial Safety and Health Administration (WISHA). AJHA also lists extraordinary hazards that are unique to a particular industry sector, such as radiological hazards and waste management. The work-planning team evaluates the scope of work and reviews the work site to identify potential hazards. Hazards

  8. Initial guidelines for probabilistic seismic hazard analysis

    SciTech Connect

    Budnitz, R.J.

    1994-10-01

    In the late 1980s, the methodology for performing probabilistic seismic hazard analysis (PSHA) was exercised extensively for eastern-U.S. nuclear power plant sites by the Electric Power Research Institute (EPRI) and Lawrence Livermore National Laboratory (LLNL) under NRC sponsorship. Unfortunately, the seismic-hazard-curve results of these two studies differed substantially for many of the eastern reactor sites, which has motivated all concerned to revisit the approaches taken. This project is that revisitation.

  9. Modeling and Hazard Analysis Using STPA

    NASA Astrophysics Data System (ADS)

    Ishimatsu, Takuto; Leveson, Nancy; Thomas, John; Katahira, Masa; Miyamoto, Yuko; Nakao, Haruka

    2010-09-01

    A joint research project between MIT and JAXA/JAMSS is investigating the application of a new hazard analysis to the system and software in the HTV. Traditional hazard analysis focuses on component failures, but software does not fail in this way. Software most often contributes to accidents by commanding the spacecraft into an unsafe state (e.g., turning off the descent engines prematurely) or by not issuing required commands. That makes the standard hazard analysis techniques of limited usefulness for software-intensive systems, a category that includes most spacecraft built today. STPA is a new hazard analysis technique based on systems theory rather than reliability theory. It treats safety as a control problem rather than a failure problem. The goal of STPA, which is to create a set of scenarios that can lead to a hazard, is the same as FTA, but STPA includes a broader set of potential scenarios, including those in which no failures occur but problems arise due to unsafe and unintended interactions among the system components. STPA also provides more guidance to the analysts than traditional fault tree analysis. Functional control diagrams are used to guide the analysis. In addition, JAXA uses a model-based system engineering development environment (created originally by Leveson and called SpecTRM) which also assists in the hazard analysis. One of the advantages of STPA is that it can be applied early in the system engineering and development process in a safety-driven design process, where hazard analysis drives the design decisions rather than waiting until reviews identify problems that are then costly or difficult to fix. It can also be applied in an after-the-fact analysis and hazard assessment, which is what we did in this case study. This paper describes the experimental application of STPA to the JAXA HTV in order to determine the feasibility and usefulness of the new hazard analysis technique. Because the HTV was originally developed using fault tree analysis
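    A central STPA step is enumerating candidate unsafe control actions (UCAs) for each control action in the functional control diagram, using the four standard UCA types. The sketch below shows that enumeration mechanically; the HTV-flavored control actions and hazards are invented for illustration and are not taken from the JAXA study.

```python
from itertools import product

# A minimal sketch of the STPA step that enumerates candidate unsafe
# control actions (UCAs): every control action is crossed with the four
# standard UCA types and reviewed against each system-level hazard.
# The example control actions and hazards below are illustrative only.

control_actions = ["fire descent engines", "stop descent engines"]
hazards = ["H1: premature engine shutdown", "H2: unsafe approach trajectory"]
uca_types = [
    "not provided when required",
    "provided when it causes a hazard",
    "provided too early / too late / out of sequence",
    "stopped too soon / applied too long",
]

# Each (action, type, hazard) triple is a prompt for the analyst:
# "Could <action>, <type>, lead to <hazard>?"
for action, uca, hazard in product(control_actions, uca_types, hazards):
    print(f"Examine: '{action}' {uca} -> {hazard}")
```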

  10. Exploratory Studies Facility Subsurface Fire Hazards Analysis

    SciTech Connect

    J. L. Kubicek

    2001-09-07

    The primary objective of this Fire Hazard Analysis (FHA) is to confirm the requirements for a comprehensive fire and related hazards protection program for the Exploratory Studies Facility (ESF) are sufficient to minimize the potential for: (1) The occurrence of a fire or related event. (2) A fire that causes an unacceptable on-site or off-site release of hazardous or radiological material that will threaten the health and safety of employees, the public or the environment. (3) Vital U.S. Department of Energy (DOE) programs suffering unacceptable interruptions as a result of fire and related hazards. (4) Property losses from a fire and related events exceeding limits established by DOE. (5) Critical process controls and safety class systems being damaged as a result of a fire and related events.

  11. Exploratory Studies Facility Subsurface Fire Hazards Analysis

    SciTech Connect

    Richard C. Logan

    2002-03-28

    The primary objective of this Fire Hazard Analysis (FHA) is to confirm the requirements for a comprehensive fire and related hazards protection program for the Exploratory Studies Facility (ESF) are sufficient to minimize the potential for: The occurrence of a fire or related event; A fire that causes an unacceptable on-site or off-site release of hazardous or radiological material that will threaten the health and safety of employees, the public or the environment; Vital U.S. Department of Energy (DOE) programs suffering unacceptable interruptions as a result of fire and related hazards; Property losses from a fire and related events exceeding limits established by DOE; and Critical process controls and safety class systems being damaged as a result of a fire and related events.

  12. 21 CFR 123.6 - Hazard analysis and Hazard Analysis Critical Control Point (HACCP) plan.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... analysis. Every processor shall conduct, or have conducted for it, a hazard analysis to determine whether... product processed by that processor and to identify the preventive measures that the processor can apply.... A food safety hazard that is reasonably likely to occur is one for which a prudent processor...

  13. Quantitative landslide hazard and risk assessment from long-term space-borne InSAR measurements

    NASA Astrophysics Data System (ADS)

    Lu, P.; Catani, F.; Casagli, N.; Li, R.

    2011-12-01

    Preparing reliable landslide hazard and risk maps is an essential part of landslide studies, and many approaches have now been developed for quantitative hazard and risk assessment. However, very few of these hazard and risk maps have been reported as updated after their first generation. In this study, aiming at a continuous update of landslide hazard and risk maps, a novel approach for quantitative landslide hazard and risk assessment was developed, chiefly based on long-term satellite InSAR products - Persistent Scatterer Interferometry (PSI) point targets. The study was performed in the Arno river basin (central Italy), where most of the mass movements are slow-moving landslides whose velocities are well within the detection precision of PSI point targets. In the Arno river basin, the initial hazard and risk assessment was performed by Catani et al. (2005) using all datasets before 2001, whereas in this study the previous hazard and risk maps were updated using PSI point targets processed from 4 years (2003-2006) of RADARSAT images. Those PSI point targets were then used to generate a landslide hotspot map through PSI Hotspot and Clustering Analysis (PSI-HCA). Landslide hazard and risk maps for five temporal predictions of 2, 5, 10, 20 and 30 years were produced based on this landslide hotspot map, with the exposure of losses estimated in euro. In particular, the result indicates that a potential loss of approximately 3.22 billion euro is expected in the upcoming 30 years due to these slow-moving landslides detected by PSI point targets.
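    PSI-HCA itself is not spelled out in the abstract, but the general idea of turning PSI point targets into a hotspot surface can be sketched with a generic weighted kernel density estimate. The following is one plausible reading under that assumption, with synthetic coordinates and line-of-sight velocities; it is not the authors' exact algorithm.

```python
import numpy as np
from scipy.stats import gaussian_kde

# Hedged sketch: one way to turn PSI point targets into a hotspot surface.
# This is a generic kernel-density approach, not the authors' exact
# PSI-HCA algorithm; coordinates and velocities below are synthetic.

rng = np.random.default_rng(0)
x = rng.uniform(0, 10_000, 500)          # easting (m)
y = rng.uniform(0, 10_000, 500)          # northing (m)
vel = rng.normal(-4.0, 2.0, 500)         # LOS velocity (mm/yr); sign convention assumed

# Weight each point by its motion magnitude so faster movers count more.
weights = np.clip(-vel, 0, None)
kde = gaussian_kde(np.vstack([x, y]), weights=weights)

# Evaluate the density on a grid to obtain the hotspot map.
gx, gy = np.meshgrid(np.linspace(0, 10_000, 100), np.linspace(0, 10_000, 100))
hotspot = kde(np.vstack([gx.ravel(), gy.ravel()])).reshape(gx.shape)
print(hotspot.max())
```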

  14. Lunar mission safety and rescue: Hazards analysis and safety requirements

    NASA Technical Reports Server (NTRS)

    1971-01-01

    The results are presented of the hazards analysis which was concerned only with hazards to personnel and not with loss of equipment or property. Hazards characterization includes the definition of a hazard, the hazard levels, and the hazard groups. The analysis methodology is described in detail. The methodology was used to prepare the top level functional flow diagrams, to perform the first level hazards assessment, and to develop a list of conditions and situations requiring individual hazard studies. The 39 individual hazard study results are presented in total.

  15. Environmental risk analysis of hazardous material rail transportation.

    PubMed

    Saat, Mohd Rapik; Werth, Charles J; Schaeffer, David; Yoon, Hongkyu; Barkan, Christopher P L

    2014-01-15

    An important aspect of railroad environmental risk management involves tank car transportation of hazardous materials. This paper describes a quantitative, environmental risk analysis of rail transportation of a group of light, non-aqueous-phase liquid (LNAPL) chemicals commonly transported by rail in North America. The Hazardous Materials Transportation Environmental Consequence Model (HMTECM) was used in conjunction with a geographic information system (GIS) analysis of environmental characteristics to develop probabilistic estimates of exposure to different spill scenarios along the North American rail network. The risk analysis incorporated the estimated clean-up cost developed using the HMTECM, route-specific probability distributions of soil type and depth to groundwater, annual traffic volume, railcar accident rate, and tank car safety features, to estimate the nationwide annual risk of transporting each product. The annual risk per car-mile (car-km) and per ton-mile (ton-km) was also calculated to enable comparison between chemicals and to provide information on the risk cost associated with shipments of these products. The analysis and the methodology provide a quantitative approach that will enable more effective management of the environmental risk of transporting hazardous materials.
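    The risk aggregation described above reduces, in its simplest form, to an expected annual cost summed over route segments: traffic exposure times accident rate times conditional release probability times expected clean-up cost. The sketch below shows that arithmetic with invented placeholder numbers, not HMTECM values.

```python
# Hedged sketch of the risk aggregation the paper describes: expected
# annual cost = traffic exposure x accident rate x conditional release
# probability x expected clean-up cost, summed over route segments.
# All numbers are illustrative placeholders, not HMTECM values.

segments = [
    # (car-miles/yr, accidents per car-mile, P(release | accident), E[cost | release] $)
    (2.0e6, 1.5e-7, 0.05, 400_000),
    (5.0e5, 3.0e-7, 0.08, 1_200_000),  # e.g. shallow groundwater, higher clean-up cost
]

annual_risk = sum(cm * rate * p_rel * cost for cm, rate, p_rel, cost in segments)
total_car_miles = sum(cm for cm, *_ in segments)
print(f"Annual risk: ${annual_risk:,.0f}")
print(f"Risk per car-mile: ${annual_risk / total_car_miles:.4f}")
```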

  16. Repository Subsurface Preliminary Fire Hazard Analysis

    SciTech Connect

    Richard C. Logan

    2001-07-30

    This fire hazard analysis identifies preliminary design and operations features and fire and explosion hazards, and provides a reasonable basis to establish the design requirements of fire protection systems during development and emplacement phases of the subsurface repository. This document follows the Technical Work Plan (TWP) (CRWMS M&O 2001c) which was prepared in accordance with AP-2.21Q, ''Quality Determinations and Planning for Scientific, Engineering, and Regulatory Compliance Activities''; Attachment 4 of AP-ESH-008, ''Hazards Analysis System''; and AP-3.11Q, ''Technical Reports''. The objective of this report is to establish the requirements that provide for facility nuclear safety and a proper level of personnel safety and property protection from the effects of fire and the adverse effects of fire-extinguishing agents.

  17. Probabilistic earthquake hazard analysis for Cairo, Egypt

    NASA Astrophysics Data System (ADS)

    Badawy, Ahmed; Korrat, Ibrahim; El-Hadidy, Mahmoud; Gaber, Hanan

    2016-04-01

    Cairo is the capital of Egypt, the largest city in the Arab world and Africa, and the sixteenth largest metropolitan area in the world. It was founded in the tenth century (969 AD) and is 1046 years old. It has long been a center of the region's political and cultural life. Therefore, earthquake risk assessment for Cairo has great importance. The present work aims to analyze the earthquake hazard of Cairo as a key input element for the risk assessment. The regional seismotectonic setting shows that Cairo could be affected by both far- and near-field seismic sources. The seismic hazard of Cairo has been estimated using the probabilistic seismic hazard approach. A logic tree framework was used during the calculations. Epistemic uncertainties were taken into account by using alternative seismotectonic models and alternative ground motion prediction equations. Seismic hazard values have been estimated within a grid of 0.1° × 0.1° spacing for all of Cairo's districts at different spectral periods and four return periods (224, 615, 1230, and 4745 years). Moreover, the uniform hazard spectra have been calculated at the same return periods. The pattern of the contour maps shows that the highest values of peak ground acceleration are concentrated in the eastern districts (e.g., El Nozha) and the lowest values in the northern and western districts (e.g., El Sharabiya and El Khalifa).
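    The probabilistic calculation behind such a study can be illustrated with a toy hazard integration: for each source and magnitude, the annual rate is multiplied by the probability that a ground-motion prediction equation with lognormal scatter exceeds each acceleration level. Every number below (source rates, distances, GMPE coefficients) is invented for illustration and is unrelated to the Cairo model.

```python
import numpy as np
from scipy.stats import norm

# Toy probabilistic seismic hazard calculation in the spirit described
# above: annual exceedance rate = sum over sources and magnitudes of
# (activity rate) x P(PGA > a | m, r) from a ground-motion prediction
# equation with lognormal scatter. All coefficients are invented.

def gmpe_ln_pga(m, r_km):
    """Illustrative GMPE: mean ln(PGA in g) as a function of M and distance."""
    return -3.5 + 0.9 * m - 1.2 * np.log(r_km + 10.0)

sigma_ln = 0.6                       # aleatory scatter of the GMPE
sources = [                          # (annual rate of M>=5, distance km)
    (0.05, 25.0),                    # near-field source
    (0.20, 120.0),                   # far-field source
]
mags = np.arange(5.0, 7.6, 0.5)
b = 1.0                              # Gutenberg-Richter b-value
pmf = 10.0 ** (-b * (mags - 5.0))    # relative magnitude frequencies
pmf /= pmf.sum()

pga_grid = np.logspace(-2, 0, 50)    # 0.01 g to 1 g
rate = np.zeros_like(pga_grid)
for nu, r in sources:
    for m, p_m in zip(mags, pmf):
        # Probability the GMPE exceeds each PGA level for this (m, r)
        p_exc = norm.sf(np.log(pga_grid), gmpe_ln_pga(m, r), sigma_ln)
        rate += nu * p_m * p_exc

print("lambda(PGA > 0.2 g) =", np.interp(0.2, pga_grid, rate))
```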

  18. Preliminary Hazards Analysis Plasma Hearth Process

    SciTech Connect

    Aycock, M.; Coordes, D.; Russell, J.; TenBrook, W.; Yimbo, P.

    1993-11-01

    This Preliminary Hazards Analysis (PHA) for the Plasma Hearth Process (PHP) follows the requirements of United States Department of Energy (DOE) Order 5480.23 (DOE, 1992a), DOE Order 5480.21 (DOE, 1991d), DOE Order 5480.22 (DOE, 1992c), DOE Order 5481.1B (DOE, 1986), and the guidance provided in DOE Standard DOE-STD-1027-92 (DOE, 1992b). Consideration is given to the proposed regulations published as 10 CFR 830 (DOE, 1993) and DOE Safety Guide SG 830.110 (DOE, 1992b). The purpose of performing a PHA is to establish an initial hazard categorization for a DOE nuclear facility and to identify those processes and structures which may have an impact on or be important to safety. The PHA is typically performed during, and provides input to, project conceptual design. The PHA is then followed by a Preliminary Safety Analysis Report (PSAR) performed during Title I and II design. The PSAR in turn leads to the Final Safety Analysis Report, performed during construction, testing, and acceptance and completed before routine operation. Radiological assessments indicate that a PHP facility, depending on the radioactive material inventory, may be an exempt, Category 3, or Category 2 facility. The calculated impacts would result in no significant impact to offsite personnel or the environment. Hazardous material assessments indicate that a PHP facility will be a Low Hazard facility having no significant impacts either onsite or offsite to personnel and the environment.

  19. 14 CFR 417.223 - Flight hazard area analysis.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 14 Aeronautics and Space 4 2014-01-01 2014-01-01 false Flight hazard area analysis. 417.223..., DEPARTMENT OF TRANSPORTATION LICENSING LAUNCH SAFETY Flight Safety Analysis § 417.223 Flight hazard area analysis. (a) General. A flight safety analysis must include a flight hazard area analysis that...

  20. 14 CFR 417.223 - Flight hazard area analysis.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 14 Aeronautics and Space 4 2010-01-01 2010-01-01 false Flight hazard area analysis. 417.223..., DEPARTMENT OF TRANSPORTATION LICENSING LAUNCH SAFETY Flight Safety Analysis § 417.223 Flight hazard area analysis. (a) General. A flight safety analysis must include a flight hazard area analysis that...

  1. Separation and quantitation of hazardous wastes from abrasive blast media.

    PubMed

    Hwang, J Y; Jeong, M L

    2001-01-01

    A sample of glass bead abrasive blasting material (ABM) waste, received from Robins Air Force Base (Georgia), was examined to determine whether the waste could be rendered nonhazardous by separating paint contaminants from the ABM. The sample was analyzed for size distribution and by the toxicity characteristic leaching procedure. A Microtrac analyzer was used to measure the size of fine particles (-325 Tyler mesh), and scanning electron microscopy analysis was performed to identify the nature of the contaminants in the ABM waste. Tests using froth flotation, magnetic separation, desliming, and acid washing were conducted to develop a process for removing the contaminants. A pilot plant test using the developed process rendered 82.1% of the ABM waste material nonhazardous.

  2. Quantitative hazard assessment at Vulcano (Aeolian islands): integration of geology, event statistics and physical modelling

    NASA Astrophysics Data System (ADS)

    Dellino, Pierfrancesco; de Astis, Gianfilippo; La Volpe, Luigi; Mele, Daniela; Sulpizio, Roberto

    2010-05-01

    The analysis of stratigraphy and of the particle features of pyroclastic deposits allowed the reconstruction of the volcanic history of La Fossa di Vulcano. An eruptive scenario driven by superficial phreatomagmatic explosions emerged. A statistical analysis of the pyroclastic successions led to the definition of a repetitive sequence of dilute pyroclastic density currents as the most probable events in the short term, followed by fallout of dense ballistic blocks. The scale of such events is related to the amount of magma involved in each explosion. Events involving a million cubic meters of magma are probable in view of what happened in the most recent eruptions. They led to the formation of dilute pyroclastic density currents hundreds of meters thick, moving down the volcano slope at velocities exceeding 50 m/sec. The dispersion of density currents affected the whole Vulcano Porto area and the Vulcanello area, and also overrode the Fossa Caldera's rim, spreading over the Piano area. Similarly, older pyroclastic deposits erupted at different times (Piano Grotte dei Rossi formation, ~20-7.7 ka) from vents within La Fossa Caldera and before La Fossa Cone formation. They also were phreatomagmatic in origin and fed dilute pyroclastic density currents (PDC). They represent the eruptions with the highest magnitude on the island. Therefore, for the aim of hazard assessment, these deposits from La Fossa Cone and La Fossa Caldera were used to depict eruptive scenarios at short term and at long term. On the basis of physical models that make use of pyroclastic deposit particle features, the impact parameters for each scenario have been calculated. They are the dynamic pressure and particle volumetric concentration of the density currents, and the impact energy of the ballistic blocks. On this basis, a quantitative hazard map is presented, which could be of direct use for territory planning and for the calculation of the expected damage.
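    Of the impact parameters named above, the dynamic pressure of a dilute current is the most direct to compute: P_dyn = 0.5 * rho_mix * v^2, with the mixture density obtained from the particle volumetric concentration. A minimal sketch, with illustrative values rather than the paper's:

```python
# Hedged sketch of the dynamic-pressure impact parameter for a dilute
# pyroclastic density current: P_dyn = 0.5 * rho_mix * v**2, where the
# mixture density combines particles and gas via the particle volumetric
# concentration. The numbers are illustrative, not the paper's values.

rho_particle = 2500.0    # particle density (kg/m^3)
rho_gas = 0.6            # hot gas density (kg/m^3)
C = 1.0e-3               # particle volumetric concentration (dilute current)
v = 50.0                 # flow velocity (m/s), cf. "exceeding 50 m/sec" above

rho_mix = C * rho_particle + (1.0 - C) * rho_gas
p_dyn = 0.5 * rho_mix * v**2
print(f"mixture density: {rho_mix:.2f} kg/m^3, dynamic pressure: {p_dyn / 1000:.1f} kPa")
```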

  3. Seismic hazard analysis at Rocky Flats Plant

    SciTech Connect

    McGuire, R.K.

    1993-10-01

    A probabilistic seismic hazard analysis is being conducted for the DOE Rocky Flats Plant, Jefferson County, Colorado. This is part of the overall review of the seismic exposure to facilities being conducted by DOE. The study has four major elements. (1) The historical seismicity in Colorado is being reviewed and synthesized to estimate historical rates of earthquake activity in the region of the site. (2) The geologic and tectonic evidence in Colorado and along the Front Range is being reviewed to determine appropriate seismic zones, potentially active faults, and constraints on fault slip rates. (3) Earthquake ground motion equations are being derived based on seismological knowledge of the earth's crust. Site-specific soil amplification factors are also being developed using on-site shear wave velocity measurements. (4) The probability of exceedance of various seismic ground motion levels is being calculated based on the inputs developed on tectonic sources, faults, ground motion, and soil amplification. Deterministic ground motion estimates are also being made. This study is a state-of-the-art analysis of seismic hazard. It incorporates uncertainties in the major aspects governing seismic hazard, and has a documented basis founded on solid data interpretations for the ranges of inputs used. The results will be a valid basis on which to evaluate plant structures, equipment, and components for seismic effects.

  4. Automated quantitative analysis for pneumoconiosis

    NASA Astrophysics Data System (ADS)

    Kondo, Hiroshi; Zhao, Bin; Mino, Masako

    1998-09-01

    Automated quantitative analysis for pneumoconiosis is presented. In this paper, Japanese standard radiographs of pneumoconiosis are categorized by measuring the area density and the number density of small rounded opacities. Furthermore, the size and shape of the opacities are classified by measuring the equivalent radius of each opacity. The proposed method includes a bi-level unsharp masking filter with a 1D uniform impulse response in order to eliminate undesired parts such as the images of blood vessels and ribs in the chest x-ray photo. Fuzzy contrast enhancement is also introduced in this method for easy and exact detection of small rounded opacities. Many simulation examples show that the proposed method is more reliable than the former method.
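    The background-suppression step can be illustrated with a plain (single-level) unsharp mask using a 1-D uniform impulse response: subtracting a row-wise moving average attenuates slowly varying structures such as rib and vessel shadows while preserving small opacities. This is a simplification of the bi-level filter the paper proposes, and the kernel length is an assumed parameter.

```python
import numpy as np

# Simplified sketch of unsharp masking with a 1-D uniform impulse
# response: subtract a row-wise moving average (the "unsharp" image) so
# that slowly varying structures are attenuated while small rounded
# opacities survive. This is a plain, single-level variant of the
# bi-level filter described above; the kernel length is assumed.

def unsharp_1d(image: np.ndarray, kernel_len: int = 31) -> np.ndarray:
    kernel = np.ones(kernel_len) / kernel_len        # 1-D uniform response
    blurred = np.apply_along_axis(
        lambda row: np.convolve(row, kernel, mode="same"), 1, image)
    return image - blurred                           # high-pass residual

chest = np.random.rand(256, 256)                     # stand-in for a chest film
detail = unsharp_1d(chest)
print(detail.shape, detail.mean())
```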

  5. Decision analysis for INEL hazardous waste storage

    SciTech Connect

    Page, L.A.; Roach, J.A.

    1994-01-01

    In mid-November 1993, the Idaho National Engineering Laboratory (INEL) Waste Reduction Operations Complex (WROC) Manager requested that the INEL Hazardous Waste Type Manager perform a decision analysis to determine whether or not a new Hazardous Waste Storage Facility (HWSF) was needed to store INEL hazardous waste (HW). In response to this request, a team was formed to perform a decision analysis for recommending the best configuration for storage of INEL HW. Personnel who participated in the decision analysis are listed in Appendix B. The results of the analysis indicate that the existing HWSF is not the best configuration for storage of INEL HW. The analysis detailed in Appendix C concludes that the best HW storage configuration would be to modify and use a portion of the Waste Experimental Reduction Facility (WERF) Waste Storage Building (WWSB), PBF-623 (Alternative 3). This facility was constructed in 1991 to serve as a waste staging facility for WERF incineration. The modifications include an extension of the current Room 105 across the south end of the WWSB and installing heating, ventilation, and bay curbing, which would provide approximately 1,600 ft² of isolated HW storage area. Negotiations with the State to discuss aisle space requirements along with modifications to WWSB operating procedures are also necessary. The process to begin utilizing the WWSB for HW storage includes planned closure of the HWSF, modification to the WWSB, and relocation of the HW inventory. The cost to modify the WWSB can be funded by a reallocation of funding currently identified to correct HWSF deficiencies.

  6. Quantitative physical models of volcanic phenomena for hazards assessment of critical infrastructures

    NASA Astrophysics Data System (ADS)

    Costa, Antonio

    2016-04-01

    Volcanic hazards may have destructive effects on economy, transport, and natural environments at both local and regional scale. Hazardous phenomena include pyroclastic density currents, tephra fall, gas emissions, lava flows, debris flows and avalanches, and lahars. Volcanic hazards assessment is based on available information to characterize potential volcanic sources in the region of interest and to determine whether specific volcanic phenomena might reach a given site. Volcanic hazards assessment is focussed on estimating the distances that volcanic phenomena could travel from potential sources and their intensity at the considered site. Epistemic and aleatory uncertainties strongly affect the resulting hazards assessment. Within the context of critical infrastructures, volcanic eruptions are rare natural events that can create severe hazards. In addition to being rare events, evidence of many past volcanic eruptions is poorly preserved in the geologic record. The models used for describing the impact of volcanic phenomena generally represent a range of model complexities, from simplified physics-based conceptual models to highly coupled thermo-fluid-dynamical approaches. Modelling approaches represent a hierarchy of complexity, which reflects increasing requirements for well-characterized data in order to produce a broader range of output information. In selecting models for the hazard analysis related to a specific phenomenon, the questions that need to be answered by the models must be carefully considered. Independently of the model, the final hazards assessment strongly depends on input derived from detailed volcanological investigations, such as mapping and stratigraphic correlations. For each phenomenon, an overview of currently available approaches for the evaluation of future hazards will be presented with the aim to provide a foundation for future work in developing an international consensus on volcanic hazards assessment methods.

  7. Quantitative risk analysis of oil storage facilities in seismic areas.

    PubMed

    Fabbrocino, Giovanni; Iervolino, Iunio; Orlando, Francesca; Salzano, Ernesto

    2005-08-31

    Quantitative risk analysis (QRA) of industrial facilities has to take into account multiple hazards threatening critical equipment. Nevertheless, engineering procedures able to evaluate quantitatively the effect of seismic action are not well established. Indeed, relevant industrial accidents may be triggered by loss of containment following ground shaking or other relevant natural hazards, either directly or through cascade effects ('domino effects'). The issue of integrating structural seismic risk into quantitative probabilistic seismic risk analysis (QpsRA) is addressed in this paper by a representative case study regarding an oil storage plant with a number of atmospheric steel tanks containing flammable substances. Empirical seismic fragility curves and probit functions, properly defined both for building-like and non-building-like industrial components, have been crossed with the outcomes of probabilistic seismic hazard analysis (PSHA) for a test site located in south Italy. Once the seismic failure probabilities have been quantified, consequence analysis has been performed for those events which may be triggered by the loss of containment following seismic action. Results are combined by means of a specifically developed code in terms of local risk contour plots, i.e. the contour line for the probability of fatal injuries at any point (x, y) in the analysed area. Finally, a comparison with QRA obtained by considering only process-related top events is reported for reference. PMID:15908107
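    The core step of crossing fragility curves with PSHA output can be sketched as a one-dimensional convolution: the annual failure rate is the integral over ground motion of the fragility times the occurrence density obtained from the hazard curve. The fragility parameters and hazard curve below are invented, not the paper's.

```python
import numpy as np
from scipy.stats import norm

# Hedged sketch of crossing a seismic fragility curve with a hazard curve,
# as described above: annual failure rate = integral over PGA of the
# fragility P(fail | pga) times the occurrence density obtained from the
# annual exceedance curve. Fragility median/beta and the hazard curve are
# invented for illustration.

pga = np.linspace(0.01, 2.0, 400)                 # peak ground acceleration (g)
hazard_rate = 1e-3 * (0.1 / pga) ** 2             # toy annual exceedance curve
fragility = norm.cdf(np.log(pga / 0.7) / 0.45)    # lognormal, median 0.7 g, beta 0.45

# d(lambda)/d(pga) is negative; its magnitude is the occurrence density.
occurrence_density = -np.gradient(hazard_rate, pga)
annual_failure_rate = np.trapz(fragility * occurrence_density, pga)
print(f"annual probability of seismic failure ~ {annual_failure_rate:.2e}")
```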

  8. Exploring probabilistic tools for the development of a platform for Quantitative Risk Assessment (QRA) of hydro-meteorological hazards in Europe

    NASA Astrophysics Data System (ADS)

    Zumpano, V.; Hussin, H. Y.; Breinl, K.

    2012-04-01

    Mass movements and floods are hydro-meteorological hazards that can have catastrophic effects on communities living in mountainous areas prone to these disastrous events. Environmental, climate and socio-economic changes are expected to affect the tempo-spatial patterns of hydro-meteorological hazards and associated risks in Europe. These changes and their effects on the occurrence of future hazards need to be analyzed and modeled using probabilistic hazard and risk assessment methods in order to assist stakeholders in disaster management strategies and policy making. Quantitative Risk Assessment (QRA) using probabilistic methods can further calculate damage and losses due to multi-hazards and determine the uncertainties related to all the probabilistic components of the hazard and the vulnerability of the elements at risk. Therefore, in order to develop an effective platform that can quantitatively calculate the risk of mass movements and floods in several European test sites, an extensive inventory and analysis of the available tools and software related to the probabilistic risk assessment of single and multi-hazards has been carried out. The tools have been reviewed based on whether they are open source and freely available, their required input data, the availability and type of hazard and vulnerability modules, the transparency of the methods used, their validation and calibration techniques, the inclusion of uncertainties, and their state of the art. The analysis also focused especially on the applicability of the tools to European study areas. The findings showed that assumptions and simplifications are made when assessing and quantifying the hazards. The interaction between multiple hazards, such as cascading effects, is not assessed in most tools, and some consider the hazard and vulnerability as qualitative components rather than quantitative ones. This analysis of hazard and risk assessment tools and software will give future developers and experts a better overview of

  9. DSOD Procedures for Seismic Hazard Analysis

    NASA Astrophysics Data System (ADS)

    Howard, J. K.; Fraser, W. A.

    2005-12-01

    DSOD, which has jurisdiction over more than 1200 dams in California, routinely evaluates their dynamic stability using seismic shaking input ranging from simple pseudostatic coefficients to spectrally matched earthquake time histories. Our seismic hazard assessments assume maximum earthquake scenarios of nearest active and conditionally active seismic sources. Multiple earthquake scenarios may be evaluated depending on sensitivity of the design analysis (e.g., to certain spectral amplitudes, duration of shaking). Active sources are defined as those with evidence of movement within the last 35,000 years. Conditionally active sources are those with reasonable expectation of activity, which are treated as active until demonstrated otherwise. The Division's Geology Branch develops seismic hazard estimates using spectral attenuation formulas applicable to California. The formulas were selected, in part, to achieve a site response model similar to the 2000 IBC's for rock, soft rock, and stiff soil sites. The level of dynamic loading used in the stability analysis (50th, 67th, or 84th percentile ground shaking estimates) is determined using a matrix that considers consequence of dam failure and fault slip rate. We account for near-source directivity amplification along such faults by adjusting target response spectra and developing appropriate design earthquakes for analysis of structures sensitive to long-period motion. Based on in-house studies, the orientation of the dam analysis section relative to the fault-normal direction is considered for strike-slip earthquakes, but directivity amplification is assumed in any orientation for dip-slip earthquakes. We do not have probabilistic standards, but we evaluate the probability of our ground shaking estimates using hazard curves constructed from the USGS Interactive De-Aggregation website. Typically, return periods for our design loads exceed 1000 years. Excessive return periods may warrant a lower design load. Minimum
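    The return-period figures quoted above convert directly to exceedance probabilities: an annual exceedance probability p corresponds to a return period T = 1/p, and the chance of at least one exceedance in an n-year life is 1 - (1 - p)^n. A quick check with an assumed design life:

```python
# Quick check of the return-period arithmetic implied above: an annual
# exceedance probability p corresponds to a return period T = 1/p, and
# the chance of at least one exceedance over an n-year design life is
# 1 - (1 - p)^n. The 1000-year figure comes from the abstract; the
# design life is an assumed example.

p_annual = 1.0 / 1000.0               # design load exceeded on average once per 1000 yr
design_life = 100                     # assumed facility life (years)
p_life = 1.0 - (1.0 - p_annual) ** design_life
print(f"P(exceedance in {design_life} yr) = {p_life:.3f}")   # ~0.095
```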

  10. Hazard Analysis for Building 34 Vacuum Glove Box Assembly

    NASA Technical Reports Server (NTRS)

    Meginnis, Ian

    2014-01-01

    One of the characteristics of an effective safety program is the recognition and control of hazards before mishaps or failures occur. Conducting potentially hazardous tests necessitates a thorough hazard analysis in order to prevent injury to personnel, and to prevent damage to facilities and equipment. The primary purpose of this hazard analysis is to define and address the potential hazards and controls associated with the Building 34 Vacuum Glove Box Assembly, and to provide the applicable team of personnel with the documented results. It is imperative that each member of the team be familiar with the hazards and controls associated with his/her particular tasks, assignments and activities while interfacing with facility test systems, equipment and hardware. In fulfillment of the stated purposes, the goal of this hazard analysis is to identify all hazards that have the potential to harm personnel, damage the facility or its test systems or equipment, test articles, Government or personal property, or the environment. This analysis may also assess the significance and risk, when applicable, of lost test objectives when substantial monetary value is involved. The hazards, causes, controls, verifications, and risk assessment codes have been documented on the hazard analysis work sheets in Appendix A of this document. The preparation and development of this report is in accordance with JPR 1700.1, "JSC Safety and Health Handbook" and JSC 17773 Rev D "Instructions for Preparation of Hazard Analysis for JSC Ground Operations".

  11. Regional Frequency Analysis of Ocean Hazard

    NASA Astrophysics Data System (ADS)

    Bernardara, Pietro; Weiss, Jerome; Benoit, Michel; Andreewsky, Marc

    2015-04-01

    The estimation of extreme return levels (up to a 10^-4 annual probability of exceedance) of natural phenomena is a very uncertain exercise when extrapolating from the information and measurements collected at a single site. The aim of Regional Frequency Analysis (RFA) is to benefit from the information contained in observations and data collected not only at the site of interest but across a larger set of sites located in the same region as the site of interest or sharing similar characteristics with it. This technique was introduced in the 1960s and is widely used in various domains including hydrology and meteorology. RFA was recently acknowledged as a potential choice for the estimation of flooding hazard in the methodological guide for flooding hazard estimation [1], published in 2013 by the French Nuclear Safety Authority. The aim of this presentation is to introduce the main concepts of RFA and illustrate the latest innovations in its application, delivered by EDF R&D. These concern the statistical definition of storms, the formation of homogeneous regions, and a new approach for filtering the redundant information linked to the spatial correlation of natural phenomena. Applications to skew surges and waves will be shown. [1] ASN, Guide pour la Protection des installations nucléaires de base contre les inondations externes, ASN, 2013, p. 44.
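    The core RFA idea can be sketched in a few lines: rescale each site's annual maxima by a local index, pool the dimensionless data, and fit a single regional growth curve that is then extrapolated to rare probabilities. The synthetic data and the choice of the site mean as the index below are illustrative assumptions, not EDF's implementation.

```python
import numpy as np
from scipy.stats import genextreme

# Hedged sketch of the index-flood flavor of regional frequency analysis:
# annual maxima from several homogeneous sites are rescaled by a local
# index (here the site mean), pooled, and a single regional GEV growth
# curve is fitted, stabilizing the extrapolation to rare return levels.
# The synthetic data and choice of index are illustrative assumptions.

rng = np.random.default_rng(1)
sites = [genextreme.rvs(-0.1, loc=mu, scale=0.2 * mu, size=40, random_state=rng)
         for mu in (1.0, 1.8, 2.5)]          # skew-surge-like annual maxima (m)

pooled = np.concatenate([s / s.mean() for s in sites])   # dimensionless growth data
shape, loc, scale = genextreme.fit(pooled)

# 10^-4 annual exceedance level at a target site = index x regional quantile
regional_q = genextreme.ppf(1 - 1e-4, shape, loc, scale)
site_index = sites[0].mean()
print(f"estimated 10^-4/yr level at site 1: {site_index * regional_q:.2f} m")
```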

  12. Remote rainfall sensing for landslide hazard analysis

    USGS Publications Warehouse

    Wieczorek, Gerald F.; McWreath, Harry; Davenport, Clay

    2001-01-01

    Methods of assessing landslide hazards and providing warnings are becoming more advanced as remote sensing of rainfall provides more detailed temporal and spatial data on rainfall distribution. Two recent landslide disasters are examined noting the potential for using remotely sensed rainfall data for landslide hazard analysis. For the June 27, 1995, storm in Madison County, Virginia, USA, National Weather Service WSR-88D Doppler radar provided rainfall estimates based on a relation between cloud reflectivity and moisture content on a 1 sq. km. resolution every 6 minutes. Ground-based measurements of rainfall intensity and precipitation total, in addition to landslide timing and distribution, were compared with the radar-derived rainfall data. For the December 14-16, 1999, storm in Vargas State, Venezuela, infrared sensing from the GOES-8 satellite of cloud top temperatures provided the basis for NOAA/NESDIS rainfall estimates on a 16 sq. km. resolution every 30 minutes. These rainfall estimates were also compared with ground-based measurements of rainfall and landslide distribution. In both examples, the remotely sensed data either overestimated or underestimated ground-based values by up to a factor of 2. The factors that influenced the accuracy of rainfall data include spatial registration and map projection, as well as prevailing wind direction, cloud orientation, and topography.

  13. 14 CFR 417.227 - Toxic release hazard analysis.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 14 Aeronautics and Space 4 2011-01-01 2011-01-01 false Toxic release hazard analysis. 417.227..., DEPARTMENT OF TRANSPORTATION LICENSING LAUNCH SAFETY Flight Safety Analysis § 417.227 Toxic release hazard analysis. A flight safety analysis must establish flight commit criteria that protect the public from...

  14. 14 CFR 417.227 - Toxic release hazard analysis.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 14 Aeronautics and Space 4 2010-01-01 2010-01-01 false Toxic release hazard analysis. 417.227..., DEPARTMENT OF TRANSPORTATION LICENSING LAUNCH SAFETY Flight Safety Analysis § 417.227 Toxic release hazard analysis. A flight safety analysis must establish flight commit criteria that protect the public from...

  15. 14 CFR 417.227 - Toxic release hazard analysis.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 14 Aeronautics and Space 4 2014-01-01 2014-01-01 false Toxic release hazard analysis. 417.227..., DEPARTMENT OF TRANSPORTATION LICENSING LAUNCH SAFETY Flight Safety Analysis § 417.227 Toxic release hazard analysis. A flight safety analysis must establish flight commit criteria that protect the public from...

  16. Fire hazard analysis for the fuel supply shutdown storage buildings

    SciTech Connect

    REMAIZE, J.A.

    2000-09-27

    The purpose of a fire hazards analysis (FHA) is to comprehensively assess the risk from fire and other perils within individual fire areas in a DOE facility in relation to proposed fire protection so as to ascertain whether the objectives of DOE 5480.7A, Fire Protection, are met. This Fire Hazards Analysis was prepared as required by HNF-PRO-350, Fire Hazards Analysis Requirements, (Reference 7) for a portion of the 300 Area N Reactor Fuel Fabrication and Storage Facility.

  17. Fire hazards analysis of transuranic waste storage and assay facility

    SciTech Connect

    Busching, K.R., Westinghouse Hanford

    1996-07-31

    This document analyzes the fire hazards associated with operations at the Central Waste Complex. It provides the analysis and recommendations necessary to ensure compliance with applicable fire codes.

  18. Research on the quantitative diagnosis of drought hazard degree of winter wheat using multi-source remote sensing data

    NASA Astrophysics Data System (ADS)

    He, Haixia

    2015-12-01

    The purpose of this study is to perform quantitative diagnosis of drought hazard degree using multi-source remote sensing data. Hazard degree is a basic input for disaster risk assessment and loss assessment. Quantitative diagnosis of drought hazard degree is essential to decision-making for drought early warning and emergency relief in practice. The currently used diagnosis methods are based on disaster loss and drought indices. The response process and impacts of drought in different crop growth stages are ignored in these methods, so the guidance they provide is neither dynamic nor real-time. This study investigated drought hazard degree diagnosis for winter wheat based on continuous multi-source remote sensing imagery and comprehensive ground-based observations. The results indicated that the correlation is high and that drought hazard degree is a suitable and sensitive indicator of the evolution of the drought disaster-forming environment, the drought formation mechanism, and drought impacts.

  19. Fire hazards analysis for solid waste burial grounds

    SciTech Connect

    McDonald, K.M.

    1995-09-28

    This document comprises the fire hazards analysis for the solid waste burial grounds, including TRU trenches, low-level burial grounds, radioactive mixed waste trenches, etc. It analyzes fire potential, and fire damage potential for these facilities. Fire scenarios may be utilized in future safety analysis work, or for increasing the understanding of where hazards may exist in the present operation.

  20. Cold Vacuum Drying (CVD) Facility Hazards Analysis Report

    SciTech Connect

    PATTERSON, P.R.

    1999-10-20

    This report describes the methodology used in conducting the Cold Vacuum Drying Facility (CVDF) Hazards Analysis to support the CVDF Final Safety Analysis Report, and documents the results. The hazard analysis was performed in accordance with DOE-STD-3009-94, ''Preparation Guide for U.S. Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports,'' and implements the requirements of DOE Order 5480.23, ''Nuclear Safety Analysis Reports.''

  1. Quantitative analysis of digital microscope images.

    PubMed

    Wolf, David E; Samarasekera, Champika; Swedlow, Jason R

    2013-01-01

    This chapter discusses quantitative analysis of digital microscope images and presents several exercises to provide examples to explain the concept. It also presents the basic concepts of quantitative analysis for imaging; these concepts rest on a well-established foundation of signal theory and quantitative data analysis. The chapter presents several examples for understanding the imaging process as a transformation from sample to image, and the limits and considerations of quantitative analysis. It introduces the concept of digitally correcting images and also focuses on some of the more critical types of data transformation and some of the frequently encountered issues in quantization. Image processing represents a form of data processing. There are many examples of data processing, such as fitting data to a theoretical curve. In all these cases, it is critical that care is taken during all steps of transformation, processing, and quantization. PMID:23931513


  3. Active polarimeter optical system laser hazard analysis.

    SciTech Connect

    Augustoni, Arnold L.

    2005-07-01

    A laser hazard analysis was performed for the SNL Active Polarimeter Optical System based on the ANSI Standard Z136.1-2000, American National Standard for Safe Use of Lasers, and the ANSI Standard Z136.6-2000, American National Standard for Safe Use of Lasers Outdoors. The Active Polarimeter Optical System (APOS) uses a pulsed, near-infrared, chromium-doped lithium strontium aluminum fluoride (Cr:LiSAF) crystal laser in conjunction with a holographic diffuser and lens to illuminate a scene of interest. The APOS is intended for outdoor operations. The system is mounted on a height-adjustable platform (6 feet to 40 feet) and sits atop a tripod that points the beam downward. The beam can be pointed from nadir to as much as 60 degrees off of nadir, producing an illuminating spot geometry that can vary from circular (at nadir) to elliptical in shape (off of nadir). The JP Innovations crystal Cr:LiSAF laser parameters are presented in section II. The illuminating laser spot size is variable and can be adjusted by adjusting the separation distance between the lens and the holographic diffuser. The system is adjusted while the platform is at the lowest level. The laser spot is adjusted for a particular spot size at a particular distance (elevation) from the laser by adjusting the separation distance (d_diffuser) to predetermined values. The downward pointing angle is also adjusted before the platform is raised to the selected operating elevation.
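    For orientation, the kind of calculation an ANSI Z136.1 analysis rests on can be sketched with the standard nominal-ocular-hazard-distance estimate for a diverging beam: the range at which irradiance falls to the maximum permissible exposure (MPE), roughly NOHD = (1/phi) * sqrt(4*Phi/(pi*MPE)) when the exit aperture is neglected. The power, divergence, and MPE values below are illustrative, not the APOS parameters.

```python
import math

# Hedged sketch of a standard nominal-ocular-hazard-distance estimate for
# a diverging beam: find the range at which the beam irradiance falls to
# the MPE, NOHD ~ (1/phi) * sqrt(4*Phi / (pi*MPE)), neglecting the exit
# aperture. Power, divergence, and MPE are illustrative, not APOS values.

phi = 20e-3        # full-angle beam divergence after the diffuser (rad)
power = 0.5        # average power reaching the scene (W)
mpe = 1e-3         # assumed MPE for the wavelength/exposure (W/cm^2)

nohd_cm = (1.0 / phi) * math.sqrt(4.0 * power / (math.pi * mpe))
print(f"NOHD ~ {nohd_cm / 100:.1f} m")
```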

  4. Optical radiation hazards analysis of ultraviolet headlamps

    NASA Astrophysics Data System (ADS)

    Sliney, David H.; Fast, Peder; Ricksand, Anders

    1995-08-01

    The potential hazards to the eye and skin associated with the use of UV-emitting automotive headlamps are considered. Wide-scale use of high-beam, near-UV headlamps to permit viewing of fluorescence in clothes and fluorescent road markers at great distances can increase automotive safety; however, the potential hazards from exposure of persons to UV radiation must be evaluated. From the study we conclude that such headlamps can be safely designed to preclude human exposure to potentially hazardous levels of UV radiation.

  5. Databases applicable to quantitative hazard/risk assessment-Towards a predictive systems toxicology

    SciTech Connect

    Waters, Michael; Jackson, Marcus

    2008-11-15

    The Workshop on The Power of Aggregated Toxicity Data addressed the requirement for distributed databases to support quantitative hazard and risk assessment. The authors have conceived and constructed, with federal support, several databases that have been used in hazard identification and risk assessment. The first of these databases, the EPA Gene-Tox Database, was developed for the EPA Office of Toxic Substances by the Oak Ridge National Laboratory and is currently hosted by the National Library of Medicine. This public resource is based on the collaborative evaluation, by government, academia, and industry, of short-term tests for the detection of mutagens and presumptive carcinogens. The two-phased evaluation process resulted in more than 50 peer-reviewed publications on test system performance and a qualitative database on thousands of chemicals. Subsequently, the graphic and quantitative EPA/IARC Genetic Activity Profile (GAP) Database was developed in collaboration with the International Agency for Research on Cancer (IARC). A chemical database driven by consideration of the lowest effective dose, GAP has served IARC for many years in support of hazard classification of potential human carcinogens. The Toxicological Activity Profile (TAP) prototype database was patterned after GAP and utilized acute, subchronic, and chronic data from the Office of Air Quality Planning and Standards. TAP demonstrated the flexibility of the GAP format for air toxics, water pollutants and other environmental agents. The GAP format was also applied to developmental toxicants and was modified to represent quantitative results from the rodent carcinogen bioassay. More recently, the authors have constructed: 1) the NIEHS Genetic Alterations in Cancer (GAC) Database, which quantifies specific mutations found in cancers induced by environmental agents, and 2) the NIEHS Chemical Effects in Biological Systems (CEBS) Knowledgebase that integrates genomic and other biological data including

  6. Cancer detection by quantitative fluorescence image analysis.

    PubMed

    Parry, W L; Hemstreet, G P

    1988-02-01

    Quantitative fluorescence image analysis is a rapidly evolving biophysical cytochemical technology with the potential for multiple clinical and basic research applications. We report the application of this technique for bladder cancer detection and discuss its potential usefulness as an adjunct to methods used currently by urologists for the diagnosis and management of bladder cancer. Quantitative fluorescence image analysis is a cytological method that incorporates 2 diagnostic techniques, quantitation of nuclear deoxyribonucleic acid and morphometric analysis, in a single semiautomated system to facilitate the identification of rare events, that is individual cancer cells. When compared to routine cytopathology for detection of bladder cancer in symptomatic patients, quantitative fluorescence image analysis demonstrated greater sensitivity (76 versus 33 per cent) for the detection of low grade transitional cell carcinoma. The specificity of quantitative fluorescence image analysis in a small control group was 94 per cent and with the manual method for quantitation of absolute nuclear fluorescence intensity in the screening of high risk asymptomatic subjects the specificity was 96.7 per cent. The more familiar flow cytometry is another fluorescence technique for measurement of nuclear deoxyribonucleic acid. However, rather than identifying individual cancer cells, flow cytometry identifies cellular pattern distributions, that is the ratio of normal to abnormal cells. Numerous studies by others have shown that flow cytometry is a sensitive method to monitor patients with diagnosed urological disease. Based upon results in separate quantitative fluorescence image analysis and flow cytometry studies, it appears that these 2 fluorescence techniques may be complementary tools for urological screening, diagnosis and management, and that they also may be useful separately or in combination to elucidate the oncogenic process, determine the biological potential of tumors
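    The sensitivity and specificity figures quoted above are simple confusion-matrix ratios; the counts below are invented solely to show the arithmetic and are not taken from the study.

```python
# The sensitivity and specificity figures quoted above reduce to simple
# confusion-matrix arithmetic; the counts below are invented to show the
# computation, not taken from the study.

tp, fn = 38, 12          # low-grade cancers detected / missed (hypothetical)
tn, fp = 47, 3           # controls correctly cleared / falsely flagged (hypothetical)

sensitivity = tp / (tp + fn)       # 0.76, cf. the 76 per cent reported for QFIA
specificity = tn / (tn + fp)       # 0.94, cf. the 94 per cent control-group figure
print(f"sensitivity {sensitivity:.0%}, specificity {specificity:.0%}")
```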

  7. Probabilistic Seismic Hazard Analysis for Southern California Coastal Facilities

    SciTech Connect

    Savy, J; Foxall, B

    2004-04-16

    The overall objective of this study was to develop probabilistic seismic hazard estimates for the coastal and offshore area of Ventura, Los Angeles and Orange counties for use as a basis for the University of Southern California (USC) to develop physical models of tsunami for the coastal regions and by the California State Lands Commission (SLC) to develop regulatory standards for seismic loading and liquefaction evaluation of marine oil terminals. The probabilistic seismic hazard analysis (PSHA) was carried out by the Lawrence Livermore National Laboratory (LLNL), in several phases over a time period of two years, following the method developed by LLNL for the estimation of seismic hazards at Department of Energy (DOE) facilities, and for 69 locations of nuclear plants in the Eastern United States, for the Nuclear Regulatory Commission (NRC). This method consists of making maximum use of all physical data (qualitative and quantitative) and characterizing the uncertainties by using a set of alternate spatiotemporal models of occurrence of future earthquakes, as described in the SSHAC PSHA Guidance Document (Budnitz et al., 1997), and implemented for the NRC (Savy et al., 2002). In general, estimation of seismic hazard is based not only on our understanding of the regional tectonics and detailed characterization of the faults in the area but also on the analysis methods employed and the types of physical and empirical models that are deemed appropriate for the analysis. To develop this understanding, the body of knowledge in the scientific community is sampled in a series of workshops with a group of experts representative of the entire scientific community, including geologists and seismologists from the United States Geological Survey (USGS), members of the South California Earthquake Center (SCEC), and members of academic institutions (University of California Santa Cruz, Stanford, UC Santa Barbara, and University of Southern California), and members of

  8. Zurich's hazard analysis process: A systematic team approach

    SciTech Connect

    Frei, H.

    1997-06-01

    The Zurich method of hazard analysis (ZHA) is a process designed to facilitate the systematic identification, assessment, and reduction or elimination of hazard and risk in almost any product, system or process. It has been particularly successful as a front-end screening tool in the petrochemical, chemical, and pharmaceutical industries. The complexity and the regulation of these industries and the requirement for management of change have created a demand for highly efficient, yet thorough, hazard analysis techniques capable of capturing and managing the total risk perspective while clearly illuminating the risk priorities. Only when these priorities have been segregated and economically addressed has an organization fully leveraged the power of any hazard analysis tool. This paper will outline the Zurich Hazard Analysis process and will highlight the elements and strategies central to its success as an efficient, yet thorough methodology.

  9. SYNTHESIS OF SAFETY ANALYSIS AND FIRE HAZARD ANALYSIS METHODOLOGIES

    SciTech Connect

    Coutts, D

    2007-04-17

    Successful implementation of both the nuclear safety program and fire protection program is best accomplished using a coordinated process that relies on sound technical approaches. When systematically prepared, the documented safety analysis (DSA) and fire hazard analysis (FHA) can present a consistent technical basis that streamlines implementation. If not coordinated, the DSA and FHA can present inconsistent conclusions, which can create unnecessary confusion and can promulgate a negative safety perception. This paper will compare the scope, purpose, and analysis techniques for DSAs and FHAs. It will also consolidate several lessons-learned papers on this topic, which were prepared in the 1990s.

  10. Coherent and consistent decision making for mixed hazardous waste management: The application of quantitative assessment techniques

    SciTech Connect

    Smith, G.M.; Little, R.H.; Torres, C.

    1994-12-31

    This paper focuses on predictive modelling capacity for post-disposal safety assessments of land-based disposal facilities, illustrated by presentation of the development and application of a comprehensive, yet practicable, assessment framework. The issues addressed include: (1) land-based disposal practice, (2) the conceptual and mathematical representation of processes leading to release, migration and accumulation of contaminants, (3) the identification and evaluation of relevant assessment end-points, including human health, health of non-human biota and ecosystems, and property and resource effects, (4) the gap between data requirements and data availability, and (5) the application of results in decision making, given the uncertainties in assessment results and the difficulty of comparing qualitatively different impacts arising on different temporal and spatial scales. The paper illustrates the issues with examples based on disposal of metals and radionuclides to shallow facilities. The types of disposal facility considered include features consistent with facilities for radioactive wastes as well as other types of design more typical of hazardous wastes. The intention is to raise the question of whether radioactive and other hazardous wastes are being consistently managed, and to show that assessment methods are being developed which can provide quantitative information on the levels of environmental impact as well as a consistent approach for different types of waste; such methods can then be applied to mixed hazardous wastes containing radionuclides as well as other contaminants. The remaining question is whether the will exists to employ them. The discussion and worked illustrations are based on a methodology developed and being extended within the current European Atomic Energy Community's cost-sharing research program on radioactive waste management and disposal, with co-funding support from Empresa Nacional de Residuos Radiactivos SA, Spain.
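    The 'conceptual and mathematical representation of processes leading to release, migration and accumulation of contaminants' is typically realized as a compartment model with first-order transfers. The sketch below is a generic example under that assumption; the transfer rates are invented, and a radionuclide version would add a decay term in each compartment.

```python
# Hedged sketch of the kind of compartment model used in post-disposal
# assessments like the one described: first-order transfers move a
# contaminant inventory from the facility through the geosphere to the
# biosphere. Transfer rates are invented; a real assessment derives them
# from site data, and radionuclides would add a decay term per compartment.

k = {"facility->geosphere": 1e-3,    # leach rate (1/yr)
     "geosphere->biosphere": 1e-4}   # transport rate (1/yr)

inv = {"facility": 1.0, "geosphere": 0.0, "biosphere": 0.0}  # normalized inventory
dt, years = 1.0, 10_000
for _ in range(int(years / dt)):
    leach = k["facility->geosphere"] * inv["facility"] * dt
    transport = k["geosphere->biosphere"] * inv["geosphere"] * dt
    inv["facility"] -= leach
    inv["geosphere"] += leach - transport
    inv["biosphere"] += transport

print({c: round(v, 4) for c, v in inv.items()})
```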

  11. 40 CFR 68.67 - Process hazard analysis.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ...) CHEMICAL ACCIDENT PREVENTION PROVISIONS Program 3 Prevention Program § 68.67 Process hazard analysis. (a... completed to comply with 29 CFR 1910.119(e) are acceptable as initial process hazards analyses. These... potential for catastrophic consequences. (3) Engineering and administrative controls applicable to...

  12. 40 CFR 68.67 - Process hazard analysis.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ...) CHEMICAL ACCIDENT PREVENTION PROVISIONS Program 3 Prevention Program § 68.67 Process hazard analysis. (a... completed to comply with 29 CFR 1910.119(e) are acceptable as initial process hazards analyses. These... potential for catastrophic consequences. (3) Engineering and administrative controls applicable to...

  13. 9 CFR 417.2 - Hazard Analysis and HACCP Plan.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 9 Animals and Animal Products 2 2013-01-01 2013-01-01 false Hazard Analysis and HACCP Plan. 417.2 Section 417.2 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE... determine the food safety hazards reasonably likely to occur in the production process and identify...

  14. 9 CFR 417.2 - Hazard Analysis and HACCP Plan.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 9 Animals and Animal Products 2 2014-01-01 2014-01-01 false Hazard Analysis and HACCP Plan. 417.2 Section 417.2 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE... determine the food safety hazards reasonably likely to occur in the production process and identify...

  15. 40 CFR 68.67 - Process hazard analysis.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... completed to comply with 29 CFR 1910.119(e) are acceptable as initial process hazards analyses. These...) Hazard and Operability Study (HAZOP); (5) Failure Mode and Effects Analysis (FMEA); (6) Fault Tree... qualitative evaluation of a range of the possible safety and health effects of failure of controls. (d)...

  16. 40 CFR 68.67 - Process hazard analysis.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... completed to comply with 29 CFR 1910.119(e) are acceptable as initial process hazards analyses. These...) Hazard and Operability Study (HAZOP); (5) Failure Mode and Effects Analysis (FMEA); (6) Fault Tree... qualitative evaluation of a range of the possible safety and health effects of failure of controls. (d)...

  17. 40 CFR 68.67 - Process hazard analysis.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... completed to comply with 29 CFR 1910.119(e) are acceptable as initial process hazards analyses. These...) Hazard and Operability Study (HAZOP); (5) Failure Mode and Effects Analysis (FMEA); (6) Fault Tree... qualitative evaluation of a range of the possible safety and health effects of failure of controls. (d)...

  18. Quantitative histogram analysis of images

    NASA Astrophysics Data System (ADS)

    Holub, Oliver; Ferreira, Sérgio T.

    2006-11-01

    A routine for histogram analysis of images has been written in the object-oriented, graphical development environment LabVIEW. The program converts an RGB bitmap image into an intensity-linear greyscale image according to selectable conversion coefficients. This greyscale image is subsequently analysed by plots of the intensity histogram and probability distribution of brightness, and by calculation of various parameters, including average brightness, standard deviation, variance, minimal and maximal brightness, mode, skewness and kurtosis of the histogram and the median of the probability distribution. The program allows interactive selection of specific regions of interest (ROI) in the image and definition of lower and upper threshold levels (e.g., to permit the removal of a constant background signal). The results of the analysis of multiple images can be conveniently saved and exported for plotting in other programs, which allows fast analysis of relatively large sets of image data. The program file accompanies this manuscript together with a detailed description of two application examples: the analysis of fluorescence microscopy images, specifically of tau-immunofluorescence in primary cultures of rat cortical and hippocampal neurons, and the quantification of protein bands by Western blot. The possibilities and limitations of this kind of analysis are discussed. Program summary: Title of program: HAWGC. Catalogue identifier: ADXG_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADXG_v1_0. Program obtainable from: CPC Program Library, Queen's University of Belfast, N. Ireland. Computers: Mobile Intel Pentium III, AMD Duron. Installations: no installation necessary; executable file together with necessary files for LabVIEW Run-time engine. Operating systems under which the program has been tested: Windows ME/2000/XP. Programming language used: LabVIEW 7.0. Memory required to execute with typical data: ~16 MB for starting and ~160 MB used for
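
    For illustration, the descriptors such a routine reports can be reproduced in a few lines; the following Python/NumPy sketch (not the HAWGC program itself) assumes standard luminance weights and illustrative threshold values.

        import numpy as np
        from scipy.stats import skew, kurtosis

        def histogram_stats(rgb, weights=(0.299, 0.587, 0.114), lo=0, hi=255):
            # Convert RGB to an intensity-linear greyscale image.
            grey = np.tensordot(rgb.astype(float), np.asarray(weights), axes=([-1], [0]))
            pixels = grey[(grey >= lo) & (grey <= hi)].ravel()   # threshold window
            counts = np.bincount(pixels.astype(int), minlength=256)
            return {
                "mean": pixels.mean(), "std": pixels.std(), "variance": pixels.var(),
                "min": pixels.min(), "max": pixels.max(),
                "mode": int(counts.argmax()),                    # most frequent brightness
                "skewness": skew(pixels), "kurtosis": kurtosis(pixels),
                "median": np.median(pixels),
            }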

  19. Quantitative image analysis of synovial tissue.

    PubMed

    van der Hall, Pascal O; Kraan, Maarten C; Tak, Paul Peter

    2007-01-01

    Quantitative image analysis is a form of imaging that includes microscopic histological quantification, video microscopy, image analysis, and image processing. Hallmarks are the generation of reliable, reproducible, and efficient measurements via strict calibration and step-by-step control of the acquisition, storage and evaluation of images with dedicated hardware and software. Major advantages of quantitative image analysis over traditional techniques include sophisticated calibration systems, interaction, speed, and control of inter- and intraobserver variation. This results in a well-controlled environment, which is essential for quality control and reproducibility, and helps to optimize sensitivity and specificity. To achieve this, an optimal quantitative image analysis system combines solid software engineering with easy interactivity with the operator. Moreover, the system also needs to be as transparent as possible in generating the data, because a "black box design" will deliver uncontrollable results. In addition to these more general aspects, specifically for the analysis of synovial tissue the necessity of interactivity is highlighted by the added value of identification and quantification of information present in areas such as the intimal lining layer, blood vessels, and lymphocyte aggregates. Speed is another important aspect of digital cytometry. Currently, rapidly increasing numbers of samples, together with the accumulation of a variety of markers and detection techniques, have made the use of traditional analysis techniques such as manual quantification and semi-quantitative analysis impractical. It can be anticipated that the development of even more powerful computer systems with sophisticated software will further facilitate reliable analysis at high speed.

  20. Hazard Analysis of Japanese Boxed Lunches (Bento).

    ERIC Educational Resources Information Center

    Bryan, Frank L.; And Others

    1991-01-01

    For the purposes of identifying contaminants, of assessing risks, and of determining critical food processing control points, hazard analyses were conducted at two "bento" (oriental boxed meals) catering operations. Time and temperature abuses during the holding period, after cooking and prior to consumption, were found to be the primary reason…

  1. Assessing the potential hazard of chemical substances for the terrestrial environment. Development of hazard classification criteria and quantitative environmental indicators.

    PubMed

    Tarazona, J V; Fresno, A; Aycard, S; Ramos, C; Vega, M M; Carbonell, G

    2000-03-20

    Hazard assessment constitutes an essential tool for evaluating the potential effects of chemical substances on organisms and ecosystems. It includes, as a first step, hazard identification, which must detect the potential dangers of the substance (i.e. the kinds of effects that the substance may produce), and, as a second step, the quantification of each danger and the setting of the expected dose/response relationships. Hazard assessment plays a key role in the regulation of chemical substances, including pollution control and sustainable development. However, the aquatic environment has received far more attention than terrestrial ecosystems. This paper presents the extrapolation of several basic concepts from the aquatic to the terrestrial compartment and suggests possibilities for their regulatory use. Two specific proposals are discussed. The first focuses on the scientific basis of the hazard identification-classification criteria included in the EU regulations and their extrapolation to the terrestrial environment. The second focuses on the OECD programme for environmental indicators and the development of a soil pollution pressure indicator to quantify the potential hazards for the soil compartment and its associated terrestrial ecosystem from toxic chemicals applied to the soil deliberately (e.g. pesticides) or not (e.g. heavy metals in sludge-based fertilisers; industrial spills). PMID:10803544

  2. Guide for Hydrogen Hazards Analysis on Components and Systems

    NASA Technical Reports Server (NTRS)

    Beeson, Harold; Woods, Stephen

    2003-01-01

    The physical and combustion properties of hydrogen give rise to hazards that must be considered when designing and operating a hydrogen system. One of the major concerns in the use of hydrogen is that of fire or detonation because of hydrogen's wide flammability range, low ignition energy, and high flame speed. Other concerns include the contact and interaction of hydrogen with materials, such as the hydrogen embrittlement of materials and the formation of hydrogen hydrides. The low temperatures of liquid and slush hydrogen bring other concerns related to material compatibility and pressure control; this is especially important when dissimilar, adjoining materials are involved. The potential hazards arising from these properties and design features necessitate a proper hydrogen hazards analysis before introducing a material, component, or system into hydrogen service. The objective of this guide is to describe the NASA Johnson Space Center White Sands Test Facility hydrogen hazards analysis method that should be performed before hydrogen is used in components and/or systems. The method is consistent with standard practices for analyzing hazards. It is recommended that this analysis be made before implementing a hydrogen component qualification procedure. A hydrogen hazards analysis is a useful tool for hydrogen-system designers, system and safety engineers, and facility managers. A hydrogen hazards analysis can identify problem areas before hydrogen is introduced into a system, preventing damage to hardware, delay or loss of mission or objective, and possible injury or loss of life.

  3. 14 CFR 417.223 - Flight hazard area analysis.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... § 417.205(a) apply. The analysis must account for, at a minimum: (1) All trajectory times from liftoff... trajectory dispersion effects in the surface impact domain. (b) Public notices. A flight hazard...

  4. 14 CFR 417.223 - Flight hazard area analysis.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... § 417.205(a) apply. The analysis must account for, at a minimum: (1) All trajectory times from liftoff... trajectory dispersion effects in the surface impact domain. (b) Public notices. A flight hazard...

  5. 14 CFR 417.223 - Flight hazard area analysis.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... § 417.205(a) apply. The analysis must account for, at a minimum: (1) All trajectory times from liftoff... trajectory dispersion effects in the surface impact domain. (b) Public notices. A flight hazard...

  6. Seismic hazard assessment of the Province of Murcia (SE Spain): analysis of source contribution to hazard

    NASA Astrophysics Data System (ADS)

    García-Mayordomo, J.; Gaspar-Escribano, J. M.; Benito, B.

    2007-10-01

    A probabilistic seismic hazard assessment of the Province of Murcia in terms of peak ground acceleration (PGA) and spectral accelerations [SA(T)] is presented in this paper. In contrast to most of the previous studies in the region, which were performed for PGA making use of intensity-to-PGA relationships, hazard is here calculated in terms of magnitude and using European spectral ground-motion models. Moreover, we have considered the most important faults in the region as specific seismic sources, and also comprehensively reviewed the earthquake catalogue. Hazard calculations are performed following the Probabilistic Seismic Hazard Assessment (PSHA) methodology using a logic tree, which accounts for three different seismic source zonings and three different ground-motion models. Hazard maps in terms of PGA and SA(0.1, 0.2, 0.5, 1.0 and 2.0 s) and coefficient of variation (COV) for the 475-year return period are shown. Subsequent analysis is focused on three sites of the province, namely, the cities of Murcia, Lorca and Cartagena, which are important industrial and tourism centres. Results at these sites have been analysed to evaluate the influence of the different input options. The most important factor affecting the results is the choice of the attenuation relationship, whereas the influence of the selected seismic source zonings appears strongly site dependent. Finally, we have performed an analysis of source contribution to hazard at each of these cities to provide preliminary guidance in devising specific risk scenarios. We have found that local source zones control the hazard for PGA and SA(T ≤ 1.0 s), although the contribution from specific fault sources and long-distance north Algerian sources becomes significant from SA(0.5 s) onwards.
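
    As a schematic of the logic-tree mechanics only (weights and curve values below are invented for illustration, not taken from the study):

        import numpy as np

        pga = np.array([0.05, 0.10, 0.20, 0.40])   # ground-motion levels (g)
        zoning_w = np.array([0.4, 0.4, 0.2])       # weights of the 3 source zonings
        gmm_w = np.array([0.5, 0.3, 0.2])          # weights of the 3 ground-motion models

        # branch_curves[i, j, k]: annual frequency of exceeding pga[k] under
        # zoning i and ground-motion model j (placeholder numbers).
        branch_curves = np.tile([[2e-2, 8e-3, 2e-3, 4e-4],
                                 [3e-2, 1e-2, 3e-3, 6e-4],
                                 [1e-2, 5e-3, 1e-3, 2e-4]], (3, 1, 1))

        w = np.outer(zoning_w, gmm_w)                        # joint branch weights
        mean_curve = np.tensordot(w, branch_curves, axes=2)  # weighted mean hazard curve
        # PGA with a 475-year return period (10% probability in 50 years):
        pga_475 = np.interp(1.0 / 475.0, mean_curve[::-1], pga[::-1])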

  7. Qualitative and Quantitative Assessment of Natural Hazards in the Caldera of Mount Bambouto (West Cameroon)

    NASA Astrophysics Data System (ADS)

    Zangmo Tefogoum, G.; Kagou Dongmo, A.; Nkouathio, D. G.; Wandji, P.

    2009-04-01

    Mount Bambouto is a polygenic stratovolcano of the Cameroon Volcanic Line, built between 21 Ma and 4.5 Ma (Nkouathio et al., 2008). It is situated about 200 km NE of Mount Cameroon, between 09°55' and 10°15' East and 05°25' and 05°50' North. The volcano covers an area of 500 km2, culminates at 2740 m at Meletan Hill, and bears a collapse caldera (13 x 8 km). Fissural, extrusive and explosive dynamism are responsible for the construction of this volcano in three main stages, including the edification of a large summit rim caldera. The structure of Mount Bambouto gives rise to different natural hazards of volcanological and meteorological origin. In the past, landslides, floods, bush fires and block collapses took place in this area, with catastrophic impacts on the population. A new research program has been carried out in the caldera concerning the qualitative and quantitative evaluation of natural risks and catastrophes. The main factors of instability are rain, the structure of the basement, slopes, lithology and anthropic activities; in particular, the occurrence of exceptional rainfall due to global change is relevant. This makes it possible to draw a landslide hazard zonation map of the Bambouto caldera, landslides being the main risk in this area. We evaluated the financial potential of the caldera based on the average income from breeding and farming, school fees, and the cost of houses and equipment for each family. The calculation revealed that the yearly economy of the Mount Bambouto caldera represents about 2 billion FCFA. Some recommendations have been made in order to prevent and reduce potential losses and the number of victims, in particular through better land-use planning. These help us to estimate the importance of the destruction of the environment and biodiversity in case of catastrophe. We conclude that in the Bambouto caldera there is a moderate to high probability that destructive phenomena due to landslides will occur within the upcoming years with enormous

  8. Fire Hazards Analysis for the 200 Area Interim Storage Area

    SciTech Connect

    JOHNSON, D.M.

    2000-01-06

    This document presents the Fire Hazards Analysis (FHA) for the 200 Area Interim Storage Area. The Interim Storage Cask, Rad-Vault, and NAC-1 Cask are analyzed for fire hazards, and the 200 Area Interim Storage Area is assessed according to HNF-PRO-350 and the objectives of DOE Order 5480.7A. This FHA addresses the potential fire hazards associated with the Interim Storage Area (ISA) facility in accordance with the requirements of DOE Order 5480.7A. It is intended to assess the risk from fire to ensure there are no undue fire hazards to site personnel and the public, and to ensure the property damage potential from fire is within acceptable limits. This FHA takes a graded approach commensurate with the complexity of the structure or area and the associated fire hazards.

  9. Quantitative WDS analysis using electron probe microanalyzer

    SciTech Connect

    Ul-Hamid, Anwar. E-mail: anwar@kfupm.edu.sa; Tawancy, Hani M.; Mohammed, Abdul-Rashid I.; Al-Jaroudi, Said S.; Abbas, Nureddin M.

    2006-04-15

    In this paper, the procedure for conducting quantitative elemental analysis by the ZAF correction method using wavelength dispersive X-ray spectroscopy (WDS) in an electron probe microanalyzer (EPMA) is elaborated. Analysis of a thermal barrier coating (TBC) system formed on a Ni-based single-crystal superalloy is presented as an example to illustrate the analysis of samples consisting of a large number of major and minor elements. The analysis was performed using known standards and measured peak-to-background intensity ratios. The procedure for using separate sets of acquisition conditions for major and minor element analysis is explained and its importance is stressed.
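
    In outline, standards-based WDS quantification reduces each element to a measured k-ratio that is corrected for matrix effects. The sketch below uses hypothetical correction factors and is only a first-order illustration of the ZAF scheme; in practice the factors depend on the (unknown) composition and are recomputed iteratively.

        def zaf_concentration(k_ratio, Z, A, F, c_standard=1.0):
            # k_ratio: sample/standard intensity ratio for one element's line;
            # Z, A, F: atomic-number, absorption, and fluorescence corrections.
            return k_ratio * Z * A * F * c_standard

        # Hypothetical Ni K-alpha measurement against a pure-Ni standard:
        c_ni = zaf_concentration(k_ratio=0.58, Z=1.02, A=1.08, F=0.99)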

  10. Analysis of Brazilian data for seismic hazard analysis

    NASA Astrophysics Data System (ADS)

    Drouet, S.; Assumpção, M.

    2013-05-01

    Seismic hazard analysis in Brazil is going to be re-assessed in the framework of the Global Earthquake Model (GEM) project. Since the last worldwide Global Seismic Hazard Analysis Project (GSHAP) there has been no specific study in this field in Brazil. Brazil is a stable continental region and is characterized by low seismic activity. In this particular type of region, seismic hazard assessment is a very hard task due to the limited amount of data available regarding the seismic sources, earthquake catalogue, or ground-motion amplitudes, and the associated uncertainties are very large. This study focuses on data recorded in South-East Brazil, where broadband stations are installed, belonging to two networks: the network managed by the seismology group at the IAG-USP in São Paulo, which has existed for about 20 years, and the network managed by the Observatorio Nacional in Rio de Janeiro, which has just been set up. The two networks are now integrated into the national network RSB (Rede Sismográfica Brasileira), which will also include stations in the rest of Brazil currently being installed by the Universities of Brasilia and Natal. There are a couple of events with magnitude greater than 3 recorded at these very sensitive stations, usually at rather large distances. At first sight these data may appear meaningless in the context of seismic hazard, but they can help to improve different parts of the process. The analysis of the S-wave Fourier spectra can help to better resolve source, path, and site effects in Brazil. For instance, moment magnitudes can be computed from the flat part of the Fourier spectra. These magnitudes are of utmost importance in order to build a homogeneous catalogue in terms of moment magnitude. At the moment only body-wave magnitudes (or some equivalent scale) are determined routinely for events in Brazil. Attenuation and site effects, especially the high-frequency attenuation known as the kappa effect, will also help to
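
    The moment-magnitude step mentioned here is standard: the low-frequency plateau of the S-wave displacement spectrum gives the seismic moment, which converts to Mw. A sketch with typical (assumed) crustal parameters, not the authors' values:

        import numpy as np

        def moment_magnitude(plateau, distance_m, rho=2700.0, beta=3500.0,
                             rad_pattern=0.55, free_surface=2.0):
            # Brune-type estimate: seismic moment M0 (N*m) from the spectral
            # plateau (m*s) at hypocentral distance, then Hanks-Kanamori Mw.
            m0 = (4 * np.pi * rho * beta**3 * distance_m * plateau
                  / (rad_pattern * free_surface))
            return (2.0 / 3.0) * (np.log10(m0) - 9.1)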

  11. Seniors' Online Communities: A Quantitative Content Analysis

    ERIC Educational Resources Information Center

    Nimrod, Galit

    2010-01-01

    Purpose: To examine the contents and characteristics of seniors' online communities and to explore their potential benefits to older adults. Design and Methods: Quantitative content analysis of a full year's data from 14 leading online communities using a novel computerized system. The overall database included 686,283 messages. Results: There was…

  12. Fire hazards analysis for the uranium oxide (UO{sub 3}) facility

    SciTech Connect

    Wyatt, D.M.

    1994-12-06

    The Fire Hazards Analysis (FHA) documents the deactivation end-point status of the UO{sub 3} complex fire hazards, fire protection, and life safety systems. This FHA has been prepared for the Uranium Oxide Facility by Westinghouse Hanford Company in accordance with the criteria established in DOE 5480.7A, Fire Protection, and RLID 5480.7, Fire Protection. The purpose of the Fire Hazards Analysis is to comprehensively and quantitatively assess the risk from a fire within individual fire areas in a Department of Energy facility so as to ascertain whether the objectives stated in DOE Order 5480.7, paragraph 4, are met. Particular attention has been paid to RLID 5480.7, Section 8.3, which specifies the criteria for deactivating fire protection in facilities undergoing decommissioning and demolition.

  13. Quantitative evaluation of heavy metals' pollution hazards in liquefaction residues of sewage sludge.

    PubMed

    Huang, Huajun; Yuan, Xingzhong; Zeng, Guangming; Zhu, Huina; Li, Hui; Liu, Zhifeng; Jiang, Hongwei; Leng, Lijian; Bi, Wenkai

    2011-11-01

    Liquefaction residues (LR) are the main by-products of sewage sludge (SS) liquefaction. This study quantitatively evaluates the potential ecological risk and pollution degrees of heavy metals (Pb, Zn, Cu, Cd, Cr and Ni) in LR versus SS. The leaching rates (R1) of heavy metals in LR were much lower than those in SS, revealing that the mobility/leachability of heavy metals was well suppressed after liquefaction. The geo-accumulation index (Igeo) indicated that the liquefaction process significantly weakened the contamination degrees of heavy metals. The potential ecological risk index (RI) demonstrated that the overall risks caused by heavy metals were clearly lowered, from 1093.56 (very high risk) in SS to 4.72 and 1.51 (low risk) in LR1 and LR2, respectively. According to the risk assessment code (RAC), each tested heavy metal posed no or low risk to the environment after liquefaction. In summary, the pollution hazards of heavy metals in LR were markedly mitigated.
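
    The indices named here have simple closed forms; the sketch below follows the usual definitions (Mueller's Igeo; Hakanson's RI with the standard toxic-response factors), with made-up concentrations rather than the paper's data.

        import numpy as np

        def igeo(c_sample, c_background):
            # Geo-accumulation index: Igeo = log2(Cn / (1.5 * Bn)).
            return np.log2(np.asarray(c_sample) / (1.5 * np.asarray(c_background)))

        def risk_index(c_sample, c_background, toxic_response):
            # Hakanson RI: sum over metals of Er = Tr * (Cn / Bn).
            cf = np.asarray(c_sample) / np.asarray(c_background)
            return float(np.sum(np.asarray(toxic_response) * cf))

        # Pb, Zn, Cu, Cd, Cr, Ni: illustrative concentrations (mg/kg).
        tr = [5, 1, 5, 30, 2, 5]          # conventional toxic-response factors
        ri = risk_index([40, 120, 35, 0.4, 60, 30],
                        [20, 80, 30, 0.3, 60, 40], tr)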

  14. Hazard interaction analysis for multi-hazard risk assessment: a systematic classification based on hazard-forming environment

    NASA Astrophysics Data System (ADS)

    Liu, B.; Siu, Y. L.; Mitchell, G.

    2015-12-01

    This paper develops a systematic hazard interaction classification based on the geophysical environment that natural hazards arise from - the hazard-forming environment. According to their contribution to natural hazards, geophysical environmental factors in the hazard-forming environment were categorized into two types. The first are relatively stable factors which construct the precondition for the occurrence of natural hazards, whilst the second are trigger factors, which determine the frequency and magnitude of hazards. Different combinations of geophysical environmental factors induce different hazards. Based on these geophysical environmental factors for some major hazards, the stable factors are used to identify which kinds of natural hazards influence a given area, and trigger factors are used to classify the relationships between these hazards into four types: independent, mutex, parallel and series relationships. This classification helps to ensure all possible hazard interactions among different hazards are considered in multi-hazard risk assessment. This can effectively fill the gap in current multi-hazard risk assessment methods which to date only consider domino effects. In addition, based on this classification, the probability and magnitude of multiple interacting natural hazards occurring together can be calculated. Hence, the developed hazard interaction classification provides a useful tool to facilitate improved multi-hazard risk assessment.

  15. Hazard interaction analysis for multi-hazard risk assessment: a systematic classification based on hazard-forming environment

    NASA Astrophysics Data System (ADS)

    Liu, Baoyin; Siu, Yim Ling; Mitchell, Gordon

    2016-03-01

    This paper develops a systematic hazard interaction classification based on the geophysical environment that natural hazards arise from - the hazard-forming environment. According to their contribution to natural hazards, geophysical environmental factors in the hazard-forming environment were categorized into two types. The first are relatively stable factors which construct the precondition for the occurrence of natural hazards, whilst the second are trigger factors, which determine the frequency and magnitude of hazards. Different combinations of geophysical environmental factors induce different hazards. Based on these geophysical environmental factors for some major hazards, the stable factors are used to identify which kinds of natural hazards influence a given area, and trigger factors are used to classify the relationships between these hazards into four types: independent, mutex, parallel and series relationships. This classification helps to ensure all possible hazard interactions among different hazards are considered in multi-hazard risk assessment. This can effectively fill the gap in current multi-hazard risk assessment methods which to date only consider domino effects. In addition, based on this classification, the probability and magnitude of multiple interacting natural hazards occurring together can be calculated. Hence, the developed hazard interaction classification provides a useful tool to facilitate improved multi-hazard risk assessment.
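
    One way to read the four relationship types computationally; the numeric rules below are a hypothetical rendering for illustration, since the abstract gives no formulas:

        def joint_probability(pa, pb, relation):
            # pa, pb: per-period occurrence probabilities of hazards A and B.
            if relation == "independent":  # unrelated triggers
                return pa * pb
            if relation == "mutex":        # mutually exclusive triggers
                return 0.0
            if relation == "parallel":     # one trigger excites both together
                return min(pa, pb)
            if relation == "series":       # A occurs, then triggers B: pb = P(B|A)
                return pa * pb
            raise ValueError("unknown relation: " + relation)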

  16. Method and apparatus for chromatographic quantitative analysis

    DOEpatents

    Fritz, James S.; Gjerde, Douglas T.; Schmuckler, Gabriella

    1981-06-09

    An improved apparatus and method for the quantitative analysis of a solution containing a plurality of anion species by ion exchange chromatography, which utilizes a single eluent and a single ion exchange bed that does not require periodic regeneration. The solution containing the anions is added to an anion exchange resin bed, which is a low-capacity macroreticular polystyrene-divinylbenzene resin containing quaternary ammonium functional groups, and is eluted therefrom with a dilute solution of a low-electrical-conductance organic acid salt. As each anion species is eluted from the bed, it is quantitatively sensed by conventional detection means such as a conductivity cell.

  17. Quantitative ADF STEM: acquisition, analysis and interpretation

    NASA Astrophysics Data System (ADS)

    Jones, L.

    2016-01-01

    Quantitative annular dark-field imaging in the scanning transmission electron microscope (ADF STEM), where image intensities are used to provide composition and thickness measurements, has enjoyed a renaissance during the last decade. Now, in a post-aberration-correction era, many aspects of the technique are being revisited. Here the recent progress and emerging best practice for such aberration-corrected quantitative ADF STEM are discussed, including issues relating to the proper acquisition of experimental data and its calibration, approaches for data analysis, the utility of such data, its interpretation and limitations.

  18. Quantitative Proteomics Analysis of Leukemia Cells.

    PubMed

    Halbach, Sebastian; Dengjel, Jörn; Brummer, Tilman

    2016-01-01

    Chronic myeloid leukemia (CML) is driven by the oncogenic fusion kinase Bcr-Abl, which organizes its own signaling network with various proteins. These proteins, their interactions, and their role in relevant signaling pathways can be analyzed by quantitative mass spectrometry (MS) approaches in various model systems, e.g., in cell culture models. In this chapter, we describe in detail immunoprecipitations and quantitative proteomics analysis using stable isotope labeling by amino acids in cell culture (SILAC) of components of the Bcr-Abl signaling pathway in the human CML cell line K562. PMID:27581145

  19. Quantitative analysis of blood vessel geometry

    NASA Astrophysics Data System (ADS)

    Fuhrman, Michael G.; Abdul-Karim, Othman; Shah, Sujal; Gilbert, Steven G.; Van Bibber, Richard

    2001-07-01

    Re-narrowing or restenosis of a human coronary artery occurs within six months in one third of balloon angioplasty procedures. Accurate and repeatable quantitative analysis of vessel shape is important to characterize the progression and type of restenosis, and to evaluate effects new therapies might have. A combination of complicated geometry and image variability, together with the need for high resolution and large image size, makes visual/manual analysis slow, difficult, and prone to error. The image processing and analysis described here was developed to automate feature extraction of the lumen, internal elastic lamina, neointima, external elastic lamina, and tunica adventitia, and to enable an objective, quantitative definition of blood vessel geometry. The quantitative geometrical analysis enables the measurement of several features, including perimeter, area, and other metrics of vessel damage. Automation of feature extraction creates a high-throughput capability that enables analysis of serial sections for more accurate measurement of restenosis dimensions. Measurement results are input into a relational database where they can be statistically analyzed and compared across studies. As part of the integrated process, results are also imprinted on the images themselves to facilitate auditing of the results. The analysis is fast, repeatable and accurate while allowing the pathologist to control the measurement process.
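
    Once boundaries such as the lumen or the elastic laminae have been extracted, the reported metrics reduce to contour geometry. A minimal sketch, assuming the contour arrives as an (N, 2) array of boundary points:

        import numpy as np

        def contour_metrics(xy):
            # Perimeter: sum of segment lengths around the closed contour.
            d = np.diff(np.vstack([xy, xy[:1]]), axis=0)
            perimeter = np.hypot(d[:, 0], d[:, 1]).sum()
            # Enclosed area: shoelace formula.
            x, y = xy[:, 0], xy[:, 1]
            area = 0.5 * abs(np.dot(x, np.roll(y, -1)) - np.dot(y, np.roll(x, -1)))
            return perimeter, area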

  20. Seismic hazard analysis for Jayapura city, Papua

    SciTech Connect

    Robiana, R.; Cipta, A.

    2015-04-24

    Jayapura city experienced a destructive earthquake on June 25, 1976, with a maximum intensity of VII on the MMI scale. Probabilistic methods are used to determine the earthquake hazard by considering all possible earthquakes that can occur in this region. Three types of earthquake source model are used: a subduction model, from the New Guinea Trench subduction zone (North Papuan Thrust); fault models, derived from the Yapen, Tarera-Aiduna, Wamena, Memberamo, Waipago, Jayapura, and Jayawijaya faults; and 7 background models to accommodate unknown earthquakes. Amplification factors estimated using geomorphological approaches are corrected with measurement data related to rock type and depth of soft soil. Site classes in Jayapura city can be grouped into classes B, C, D and E, with amplification factors between 0.5 and 6. Hazard maps are presented for a 10% probability of earthquake occurrence within a period of 500 years for the dominant periods of 0.0, 0.2, and 1.0 seconds.

  1. Comprehensive quantitative analysis on privacy leak behavior.

    PubMed

    Fan, Lejun; Wang, Yuanzhuo; Jin, Xiaolong; Li, Jingyuan; Cheng, Xueqi; Jin, Shuyuan

    2013-01-01

    Privacy information is prone to be leaked by illegal software providers with various motivations. Privacy leak behavior has thus become an important research issue of cyber security. However, existing approaches can only qualitatively analyze privacy leak behavior of software applications. No quantitative approach, to the best of our knowledge, has been developed in the open literature. To fill this gap, in this paper we propose for the first time four quantitative metrics, namely, possibility, severity, crypticity, and manipulability, for privacy leak behavior analysis based on Privacy Petri Net (PPN). In order to compare the privacy leak behavior among different software, we further propose a comprehensive metric, namely, overall leak degree, based on these four metrics. Finally, we validate the effectiveness of the proposed approach using real-world software applications. The experimental results demonstrate that our approach can quantitatively analyze the privacy leak behaviors of various software types and reveal their characteristics from different aspects. PMID:24066046

  2. Comprehensive Quantitative Analysis on Privacy Leak Behavior

    PubMed Central

    Fan, Lejun; Wang, Yuanzhuo; Jin, Xiaolong; Li, Jingyuan; Cheng, Xueqi; Jin, Shuyuan

    2013-01-01

    Privacy information is prone to be leaked by illegal software providers with various motivations. Privacy leak behavior has thus become an important research issue of cyber security. However, existing approaches can only qualitatively analyze privacy leak behavior of software applications. No quantitative approach, to the best of our knowledge, has been developed in the open literature. To fill this gap, in this paper we propose for the first time four quantitative metrics, namely, possibility, severity, crypticity, and manipulability, for privacy leak behavior analysis based on Privacy Petri Net (PPN). In order to compare the privacy leak behavior among different software, we further propose a comprehensive metric, namely, overall leak degree, based on these four metrics. Finally, we validate the effectiveness of the proposed approach using real-world software applications. The experimental results demonstrate that our approach can quantitatively analyze the privacy leak behaviors of various software types and reveal their characteristics from different aspects. PMID:24066046
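
    A schematic of how four per-behavior metrics might roll up into an overall leak degree. The plain weighted sum below is a hypothetical rendering for illustration; the paper derives its metrics from a Privacy Petri Net rather than from this formula.

        def overall_leak_degree(possibility, severity, crypticity, manipulability,
                                weights=(0.25, 0.25, 0.25, 0.25)):
            # All four inputs assumed normalized to [0, 1]; weights are hypothetical.
            scores = (possibility, severity, crypticity, manipulability)
            return sum(w * s for w, s in zip(weights, scores))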

  3. Quantitative Proteomic Analysis of the Human Nucleolus.

    PubMed

    Bensaddek, Dalila; Nicolas, Armel; Lamond, Angus I

    2016-01-01

    Recent years have witnessed spectacular progress in the field of mass spectrometry (MS)-based quantitative proteomics, including advances in instrumentation, chromatography, sample preparation methods, and experimental design for multidimensional analyses. It is now possible not only to identify most of the protein components of a cell proteome in a single experiment, but also to describe additional proteome dimensions, such as protein turnover rates, posttranslational modifications, and subcellular localization. Furthermore, by comparing the proteome at different time points, it is possible to create a "time-lapse" view of proteome dynamics. By combining high-throughput quantitative proteomics with detailed subcellular fractionation protocols and data analysis techniques it is also now possible to characterize in detail the proteomes of specific subcellular organelles, providing important insights into cell regulatory mechanisms and physiological responses. In this chapter we present a reliable workflow and protocol for MS-based analysis and quantitation of the proteome of nucleoli isolated from human cells. The protocol presented is based on a SILAC analysis of human MCF10A-Src-ER cells with analysis performed on a Q-Exactive Plus Orbitrap MS instrument (Thermo Fisher Scientific). The subsequent chapter describes how to process the resulting raw MS files from this experiment using MaxQuant software and data analysis procedures to evaluate the nucleolar proteome using customized R scripts. PMID:27576725

  4. Frequency Analysis of Aircraft hazards for License Application

    SciTech Connect

    K. Ashley

    2006-10-24

    The preclosure safety analysis for the monitored geologic repository at Yucca Mountain must consider the hazard that aircraft may pose to surface structures. Relevant surface structures are located beneath the restricted airspace of the Nevada Test Site (NTS) on the eastern slope of Yucca Mountain, near the North Portal of the Exploratory Studies Facility Tunnel (Figure 1). The North Portal is located several miles from the Nevada Test and Training Range (NTTR), which is used extensively by the U.S. Air Force (USAF) for training and test flights (Figure 1). The NTS airspace, which is controlled by the U.S. Department of Energy (DOE) for NTS activities, is not part of the NTTR. Agreements with the DOE allow USAF aircraft specific use of the airspace above the NTS (Reference 2.1.1 [DIRS 103472], Section 3.1.1 and Appendix A, Section 2.1; and Reference 2.1.2 [DIRS 157987], Sections 1.26 through 1.29). Commercial, military, and general aviation aircraft fly within several miles to the southwest of the repository site in the Beatty Corridor, which is a broad air corridor that runs approximately parallel to U.S. Highway 95 and the Nevada-California border (Figure 2). These aircraft and other aircraft operations are identified and described in ''Identification of Aircraft Hazards'' (Reference 2.1.3, Sections 6 and 8). The purpose of this analysis is to estimate crash frequencies for aircraft hazards identified for detailed analysis in ''Identification of Aircraft Hazards'' (Reference 2.1.3, Section 8). Reference 2.1.3, Section 8, also identifies a potential hazard associated with electronic jamming, which will be addressed in this analysis. This analysis will address only the repository and not the transportation routes to the site. The analysis is intended to provide the basis for: (1) Categorizing event sequences related to aircraft hazards; (2) Identifying design or operational requirements related to aircraft hazards.
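
    Aircraft crash-frequency estimates of this kind are conventionally built from a "four-factor" formula (cf. DOE-STD-3014); it is shown here as general background with placeholder numbers, not as the exact model used in this analysis.

        def crash_frequency(n_ops, p_crash, f_xy, eff_area_sq_mi):
            # F = N * P * f(x, y) * A
            # n_ops: flights per year; p_crash: crashes per flight;
            # f_xy: crash-location probability per square mile at the site;
            # eff_area_sq_mi: effective target area of the facility.
            return n_ops * p_crash * f_xy * eff_area_sq_mi

        f = crash_frequency(n_ops=5.0e4, p_crash=3.0e-8, f_xy=1.0e-2,
                            eff_area_sq_mi=4.0e-3)   # illustrative inputs only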

  5. Hazard screening application guide. Safety Analysis Report Update Program

    SciTech Connect

    1992-06-01

    The basic purpose of hazard screening is to group processes, facilities, and proposed modifications according to the magnitude of their hazards so as to determine the need for and extent of follow-on safety analysis. A hazard is defined as a material, energy source, or operation that has the potential to cause injury or illness in human beings. The purpose of this document is to give guidance and provide standard methods for performing hazard screening. Hazard screening is applied to new and existing facilities and processes as well as to proposed modifications to existing facilities and processes. The hazard screening process evaluates all identified hazards in terms of their effects on people, both on-site and off-site. The process uses bounding analyses, with no credit given for mitigation of an accident, with the exception of certain containers meeting DOT specifications. The process is restricted to human safety issues only. Environmental effects are addressed by the environmental program. Interfaces with environmental organizations will be established in order to share information.

  6. FMEA, the alternative process hazard method. [Failure Mode Effects Analysis]

    SciTech Connect

    Goyal, R.K.

    1993-05-01

    Failure mode effects analysis (FMEA) is an old reliability/assurance tool finding its way into the HPI. Not yet popular, this hazard technique has some viable applications that can improve hazard assessment data. Notably, FMEA studies can identify possible areas for improvement that may not have been discovered using other methods. Also, FMEA is not as labor-intensive and costly as other process hazard analysis (PHA) methods. The PSM standard 29 CFR 1910.119 set in place an informational structure whose main purpose is the reduction of potential accidents and the minimization of risks in the event of an accident. Consequently, HPI operators must evaluate their process systems, identify potential major hazards, such as fires, explosions and accidental releases of toxic/hazardous chemicals, and protect their facilities, employees, the public and the environment. But which PHA method(s) apply to a particular plant or process remains a difficult question. This paper describes what FMEA is; types of FMEA; how to conduct a FMEA study; comparison with HAZOP (hazard and operability study); computer software; applicability of FMEA; and examples of its use.
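
    The usual quantitative core of an FMEA is the risk priority number; a generic sketch with hypothetical scores, not taken from the paper:

        def rpn(severity, occurrence, detection):
            # Each score is conventionally an integer from 1 to 10.
            return severity * occurrence * detection

        failure_modes = {
            "relief valve fails closed": rpn(9, 3, 4),
            "level transmitter drifts":  rpn(6, 5, 3),
        }
        ranked = sorted(failure_modes.items(), key=lambda kv: -kv[1])  # worst first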

  7. Quantitative image analysis of celiac disease

    PubMed Central

    Ciaccio, Edward J; Bhagat, Govind; Lewis, Suzanne K; Green, Peter H

    2015-01-01

    We outline the use of quantitative techniques that are currently used for analysis of celiac disease. Image processing techniques can be useful to statistically analyze the pixel data of endoscopic images acquired with standard or videocapsule endoscopy. It is shown how current techniques have evolved to become more useful for gastroenterologists who seek to understand celiac disease and to screen for it in suspected patients. New directions for focus in the development of methodology for diagnosis and treatment of this disease are suggested. It is evident that there are yet broad areas where there is potential to expand the use of quantitative techniques for improved analysis in suspected or known celiac disease patients. PMID:25759524

  8. Quantitative image analysis of celiac disease.

    PubMed

    Ciaccio, Edward J; Bhagat, Govind; Lewis, Suzanne K; Green, Peter H

    2015-03-01

    We outline the use of quantitative techniques that are currently used for analysis of celiac disease. Image processing techniques can be useful to statistically analyze the pixel data of endoscopic images acquired with standard or videocapsule endoscopy. It is shown how current techniques have evolved to become more useful for gastroenterologists who seek to understand celiac disease and to screen for it in suspected patients. New directions for focus in the development of methodology for diagnosis and treatment of this disease are suggested. It is evident that there are yet broad areas where there is potential to expand the use of quantitative techniques for improved analysis in suspected or known celiac disease patients.

  9. A hazard and probabilistic safety analysis of a high-level waste transfer process

    SciTech Connect

    Bott, T.F.; Sasser, M.K.

    1996-09-01

    This paper describes a safety analysis of a transfer process for high-level radioactive and toxic waste. The analysis began with a hazard assessment that used elements of What-If, Checklist, Failure Modes and Effects Analysis, and Hazard and Operability Study (HAZOP) techniques to identify and rough out accident sequences. Based on this preliminary analysis, the most significant accident sequences were developed further using event trees. Quantitative frequency estimates for the accident sequences were based on operational data taken from the historical record of the site where the process is performed. Several modeling challenges were encountered in the course of the study. These included linked initiating and accident progression events, fire propagation modeling, accounting for administrative control violations, and handling mission-phase effects.
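
    The event-tree quantification step works by propagating an initiating-event frequency through branch probabilities; a minimal sketch with invented numbers:

        init_freq = 1.0e-2                 # initiating events per year (assumed)
        branch_failure_probs = {
            "transfer-line isolation fails": 0.05,
            "operator misses alarm": 0.10,
        }
        seq_freq = init_freq
        for p in branch_failure_probs.values():
            seq_freq *= p                  # frequency of the full accident sequence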

  10. A Bayesian Seismic Hazard Analysis for the city of Naples

    NASA Astrophysics Data System (ADS)

    Faenza, Licia; Pierdominici, Simona; Hainzl, Sebastian; Cinti, Francesca R.; Sandri, Laura; Selva, Jacopo; Tonini, Roberto; Perfetti, Paolo

    2016-04-01

    In recent years many studies have focused on the determination and definition of the seismic, volcanic and tsunamigenic hazard in the city of Naples. The reason is that the town of Naples and its neighboring area is one of the most densely populated places in Italy. In addition, the risk is also increased by the type and condition of buildings and monuments in the city. It is crucial therefore to assess which active faults in Naples and the surrounding area could trigger an earthquake able to shake and damage the urban area. We collect data from the most reliable and complete databases of macroseismic intensity records (from 79 AD to present). Each seismic event has been associated with an active tectonic structure. Furthermore, a set of active faults around the study area that are well known from geological investigations and could shake the city, but are not associated with any earthquake, has been taken into account in our studies. This geological framework is the starting point for our Bayesian seismic hazard analysis for the city of Naples. We show the feasibility of formulating the hazard assessment procedure to include the information of past earthquakes in the probabilistic seismic hazard analysis. This strategy allows us, on the one hand, to enlarge the information used in the evaluation of the hazard, from alternative models for the earthquake generation process to past shaking, and, on the other hand, to explicitly account for all kinds of information and their uncertainties. The Bayesian scheme we propose is applied to evaluate the seismic hazard of Naples. We implement five different spatio-temporal models to parameterize the occurrence of earthquakes potentially dangerous for Naples. Subsequently we combine these hazard curves with ShakeMaps of past earthquakes that have been felt in Naples. The results are posterior hazard assessments for three exposure times, e.g., 50, 10 and 5 years, on a dense grid that covers the municipality of Naples, considering bedrock soil
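
    The flavour of the Bayesian step can be shown with a deliberately reduced example: a Beta prior on the per-period exceedance probability (standing in for the hazard model) updated with binomial evidence from historically felt shaking. The paper's scheme is far richer; all numbers here are invented.

        # Prior from a hazard model: mean exceedance probability 0.02 per period.
        a0, b0 = 2.0, 98.0
        # Historical record: 1 exceedance observed in 40 exposure periods.
        n_periods, n_exceed = 40, 1
        a, b = a0 + n_exceed, b0 + (n_periods - n_exceed)
        posterior_mean = a / (a + b)       # updated exceedance probability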

  11. Preliminary hazards analysis of thermal scrap stabilization system. Revision 1

    SciTech Connect

    Lewis, W.S.

    1994-08-23

    This preliminary analysis examined the HA-21I glovebox and its supporting systems for potential process hazards. Subsequently, the thermal stabilization system was installed in gloveboxes HC-21A and HC-21C. The use of HC-21C and HC-21A simplified the initial safety analysis. In addition, these gloveboxes were cleaner and required less modification for operation than glovebox HA-21I. While this document refers to glovebox HA-21I for the hazards analysis performed, glovebox HC-21C is sufficiently similar that the following analysis is also valid for HC-21C. This hazards analysis document is being re-released as revision 1 to include the updated flowsheet document (Appendix C) and the updated design basis (Appendix D). The revised Process Flow Schematic has also been included (Appendix E). This current revision incorporates the recommendations provided in the original hazards analysis as well. The System Design Description (SDD) has also been appended (Appendix H) to document the bases for Safety Classification of thermal stabilization equipment.

  12. Pedestrian Evacuation Analysis for Tsunami Hazards

    NASA Astrophysics Data System (ADS)

    Jones, J. M.; Ng, P.; Wood, N. J.

    2014-12-01

    Catastrophic tsunamis in the last decade, as well as the 50th anniversary of the 1964 Alaskan event, have heightened awareness of the threats these natural hazards present to large and increasing coastal populations. For communities located close to the earthquake epicenter that generated the tsunami, strong shaking may also cause significant infrastructure damage, impacting the road network and hampering evacuation. There may also be insufficient time between the earthquake and first wave arrival to rely on a coordinated evacuation, leaving at-risk populations to self-evacuate on foot and across the landscape. Emergency managers evaluating these coastal risks need tools to assess the evacuation potential of low-lying areas in order to discuss mitigation options, which may include vertical evacuation structures to provide local safe havens in vulnerable communities. The U.S. Geological Survey has developed the Pedestrian Evacuation Analyst software tool for use by researchers and emergency managers to assist in the assessment of a community's evacuation potential by modeling travel times across the landscape and producing both maps of travel times and charts of population counts with corresponding times. The tool uses an anisotropic (directionally dependent) least cost distance model to estimate evacuation potential and allows for the variation of travel speed to measure its effect on travel time. The effectiveness of vertical evacuation structures on evacuation time can also be evaluated and compared with metrics such as travel time maps showing each structure in place and graphs displaying the percentage change in population exposure for each structure against the baseline. Using the tool, travel time maps and at-risk population counts have been generated for some coastal communities of the U.S. Pacific Northwest and Alaska. The tool can also be used to provide valuable decision support for tsunami vertical evacuation siting.
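
    The least-cost travel-time idea can be sketched with an ordinary Dijkstra sweep over a walking-speed grid. The USGS tool is anisotropic (slope direction matters) and far more detailed; this isotropic toy version only shows the mechanics.

        import heapq

        def travel_times(speed, sources, cell_m=10.0):
            # speed[r][c]: walking speed (m/s) per grid cell, 0 = impassable;
            # sources: list of (row, col) safe-zone cells with travel time 0.
            rows, cols = len(speed), len(speed[0])
            t = [[float("inf")] * cols for _ in range(rows)]
            pq = [(0.0, r, c) for r, c in sources]
            for _, r, c in pq:
                t[r][c] = 0.0
            heapq.heapify(pq)
            while pq:
                d, r, c = heapq.heappop(pq)
                if d > t[r][c]:
                    continue
                for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    nr, nc = r + dr, c + dc
                    if 0 <= nr < rows and 0 <= nc < cols and speed[nr][nc] > 0:
                        nd = d + cell_m / speed[nr][nc]  # seconds to cross a cell
                        if nd < t[nr][nc]:
                            t[nr][nc] = nd
                            heapq.heappush(pq, (nd, nr, nc))
            return t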

  13. Influence of Analysis Methods on Interpretation of Hazard Maps

    PubMed Central

    Koehler, Kirsten A.

    2013-01-01

    Exposure or hazard mapping is becoming increasingly popular among industrial hygienists. Direct-reading instruments used for hazard-mapping data collection are steadily increasing in reliability and portability while decreasing in cost. Exposure measurements made with these instruments generally require no laboratory analysis, although hazard mapping can be a time-consuming process. To inform decision making by industrial hygienists and management, it is crucial that the maps generated from mapping data are as accurate and representative as possible. Currently, it is unclear how many sampling locations are necessary to produce a representative hazard map. As such, researchers typically collect as many points as can be sampled in several hours, and interpolation methods are used to produce higher-resolution maps. We have reanalyzed hazard-mapping data sets from three industrial settings to determine which interpolation methods yield the most accurate results. The goal is to provide practicing industrial hygienists with some practical guidelines to generate accurate hazard maps with 'off-the-shelf' mapping software. Visually verifying the fit of the variogram model is crucial for accurate interpolation. Exponential and spherical variogram models performed better than Gaussian models. It was also necessary to diverge from some of the default interpolation parameters, such as the number of bins used for the experimental variogram and whether or not to allow for a nugget effect, to achieve reasonable accuracy of the interpolation for some data sets. PMID:23258453
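
    The spherical and exponential variogram models referred to have standard closed forms; a sketch (the nugget, sill, and range values are whatever the fit to the experimental variogram produces, placeholders here):

        import numpy as np

        def spherical(h, nugget, sill, rng):
            h = np.asarray(h, dtype=float)
            g = nugget + (sill - nugget) * (1.5 * h / rng - 0.5 * (h / rng) ** 3)
            return np.where(h < rng, g, sill)

        def exponential(h, nugget, sill, rng):
            # Practical range convention: gamma reaches ~95% of the sill at rng.
            h = np.asarray(h, dtype=float)
            return nugget + (sill - nugget) * (1.0 - np.exp(-3.0 * h / rng))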

  14. Quantitative mass spectrometry methods for pharmaceutical analysis.

    PubMed

    Loos, Glenn; Van Schepdael, Ann; Cabooter, Deirdre

    2016-10-28

    Quantitative pharmaceutical analysis is nowadays frequently executed using mass spectrometry. Electrospray ionization coupled to a (hybrid) triple quadrupole mass spectrometer is generally used in combination with solid-phase extraction and liquid chromatography. Furthermore, isotopically labelled standards are often used to correct for ion suppression. The challenges in producing sensitive but reliable quantitative data depend on the instrumentation, sample preparation and hyphenated techniques. In this contribution, different approaches to enhance ionization efficiencies using modified source geometries and improved ion guidance are provided. Furthermore, possibilities to minimize, assess and correct for matrix interferences caused by co-eluting substances are described. With the focus on pharmaceuticals in the environment and bioanalysis, different separation techniques, trends in liquid chromatography and sample preparation methods to minimize matrix effects and increase sensitivity are discussed. Although highly sensitive methods are generally pursued to provide automated multi-residue analysis, (less sensitive) miniaturized set-ups have great potential owing to their suitability for in-field use. This article is part of the themed issue 'Quantitative mass spectrometry'.

  15. Quantitative resilience analysis through control design.

    SciTech Connect

    Sunderland, Daniel; Vugrin, Eric D.; Camphouse, Russell Chris

    2009-09-01

    Critical infrastructure resilience has become a national priority for the U. S. Department of Homeland Security. System resilience has been studied for several decades in many different disciplines, but no standards or unifying methods exist for critical infrastructure resilience analysis. Few quantitative resilience methods exist, and those existing approaches tend to be rather simplistic and, hence, not capable of sufficiently assessing all aspects of critical infrastructure resilience. This report documents the results of a late-start Laboratory Directed Research and Development (LDRD) project that investigated the development of quantitative resilience through application of control design methods. Specifically, we conducted a survey of infrastructure models to assess what types of control design might be applicable for critical infrastructure resilience assessment. As a result of this survey, we developed a decision process that directs the resilience analyst to the control method that is most likely applicable to the system under consideration. Furthermore, we developed optimal control strategies for two sets of representative infrastructure systems to demonstrate how control methods could be used to assess the resilience of the systems to catastrophic disruptions. We present recommendations for future work to continue the development of quantitative resilience analysis methods.

  16. Tracking Hazard Analysis Data in a Jungle of Changing Design

    SciTech Connect

    Sullivan, Robin S.; Young, Jonathan

    2006-05-14

    The biggest fear of the hazard analyst is the loss of data in the middle of the design jungle. When project schedules are demanding and design is changing rapidly, it is essential that the hazard analysis data be tracked and kept current in order to provide the required project design, development, and regulatory support. Being able to identify current as well as archived information as the design progresses, and being able to show how the project is designing in safety through modifications based on hazard analysis results, is imperative. At the DOE Hanford site in Washington State, Fluor Hanford Inc. is in the process of removing and dispositioning sludge from the 100 Area K Basins. The K Basins were used to store spent fuel from the operating reactors at the Hanford Site. The sludge is a by-product of the corrosion of the fuel and fuel storage canisters. The sludge removal project has been very dynamic, involving the design, procurement and, more recently, the operation of processes at two basins, K East and K West. The project has an ambitious schedule with a large number of changes to design concepts. In order to support the complex K Basins project, a technique to track the status of the hazard analysis data was developed. This paper will identify the most important elements of the tracking system and how it was used to assist the project in ensuring that current design data were reflected in a specific version of the hazard analysis, and to show how the project was keeping up with the design and ensuring compliance with the requirements to design in safety. While the specifics of the data tracking strategy for the K Basins sludge removal project are described in the paper, the general concepts of the strategy are applicable to similar projects requiring iteration of hazard analysis and design.

  17. Quantitative Bias Analysis in Regulatory Settings.

    PubMed

    Lash, Timothy L; Fox, Matthew P; Cooney, Darryl; Lu, Yun; Forshee, Richard A

    2016-07-01

    Nonrandomized studies are essential in the postmarket activities of the US Food and Drug Administration, which, however, must often act on the basis of imperfect data. Systematic errors can lead to inaccurate inferences, so it is critical to develop analytic methods that quantify uncertainty and bias and ensure that these methods are implemented when needed. "Quantitative bias analysis" is an overarching term for methods that estimate quantitatively the direction, magnitude, and uncertainty associated with systematic errors influencing measures of associations. The Food and Drug Administration sponsored a collaborative project to develop tools to better quantify the uncertainties associated with postmarket surveillance studies used in regulatory decision making. We have described the rationale, progress, and future directions of this project. PMID:27196652
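
    As a flavour of what such methods do, the textbook correction for nondifferential exposure misclassification back-calculates true counts from observed ones, given assumed sensitivity and specificity (cf. the bias-analysis literature by Lash and Fox; the numbers below are illustrative):

        def corrected_counts(observed_exposed, observed_unexposed, se, sp):
            # Observed exposed = true_exposed*se + true_unexposed*(1 - sp);
            # solve for the true counts.
            total = observed_exposed + observed_unexposed
            true_exposed = (observed_exposed - total * (1 - sp)) / (se - (1 - sp))
            return true_exposed, total - true_exposed

        cases = corrected_counts(45, 255, se=0.85, sp=0.95)
        controls = corrected_counts(30, 270, se=0.85, sp=0.95)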

  18. Hazardous-waste analysis plan for LLNL operations

    SciTech Connect

    Roberts, R.S.

    1982-02-12

    The Lawrence Livermore National Laboratory is involved in many facets of research, ranging from nuclear weapons research to advanced biomedical studies. Approximately 80% of all programs at LLNL generate hazardous waste in one form or another. Aside from waste from industrial-type operations (oils, solvents, bottom sludges, etc.), many unique and toxic wastes are generated, such as phosgene, dioxin (TCDD), radioactive wastes and high explosives. Any successful waste management program must address the following: proper identification of the waste, safe handling procedures, and proper storage containers and areas. This section of the Waste Management Plan addresses the methodologies used for the analysis of hazardous waste. In addition to the wastes defined in 40 CFR 261, LLNL and Site 300 also generate radioactive waste not specifically covered by RCRA. However, for completeness, the Waste Analysis Plan will address all hazardous waste.

  19. Landslide hazards and systems analysis: A Central European perspective

    NASA Astrophysics Data System (ADS)

    Klose, Martin; Damm, Bodo; Kreuzer, Thomas

    2016-04-01

    Part of the problem with assessing landslide hazards is understanding the variable settings in which they occur. There is growing consensus that hazard assessments require integrated approaches that take account of the coupled human-environment system. Here we provide a synthesis of societal exposure and vulnerability to landslide hazards, review innovative approaches to hazard identification, and lay a focus on hazard assessment, while presenting the results of historical case studies and a landslide time series for Germany. The findings add to a growing body of literature that recognizes societal exposure and vulnerability as a complex system of hazard interactions that evolves over time as a function of social change and development. We therefore propose to expand hazard assessments with the framework and concepts of systems analysis (e.g., Liu et al., 2007). Results so far have been promising in ways that illustrate the importance of feedbacks, thresholds, surprises, and time lags in the evolution of landslide hazard and risk. In densely populated areas of Central Europe, landslides often occur in urbanized landscapes or on engineered slopes that were transformed or created intentionally by human activity, sometimes even centuries ago. The example of Germany makes it possible to correlate the causes and effects of recent landslides with the historical transition of urbanization to urban sprawl, ongoing demographic change, and some chronic problems of industrialized countries today, including ageing infrastructures and rising government debts. In large parts of rural Germany, the combination of ageing infrastructures, population loss, and increasing budget deficits is starting to erode historical resilience gains, which brings especially small communities to a tipping point in their efforts at risk reduction. While struggling with budget deficits and demographic change, these communities are required to maintain ageing infrastructures that are particularly vulnerable to

  20. Historical analysis of US pipeline accidents triggered by natural hazards

    NASA Astrophysics Data System (ADS)

    Girgin, Serkan; Krausmann, Elisabeth

    2015-04-01

    Natural hazards, such as earthquakes, floods, landslides, or lightning, can initiate accidents in oil and gas pipelines with potentially major consequences for the population or the environment due to toxic releases, fires, and explosions. Accidents of this type are also referred to as Natech events. Many major accidents highlight the risk associated with natural-hazard impact on pipelines transporting dangerous substances. For instance, in the USA in 1994, flooding of the San Jacinto River caused the rupture of 8 pipelines and the undermining of 29 others by the floodwaters. About 5.5 million litres of petroleum and related products were spilled into the river and ignited. As a result, 547 people were injured and significant environmental damage occurred. Post-incident analysis is a valuable tool for better understanding the causes, dynamics, and impacts of pipeline Natech accidents in support of future accident prevention and mitigation. Therefore, data on onshore hazardous-liquid pipeline accidents collected by the US Pipeline and Hazardous Materials Safety Administration (PHMSA) were analysed. For this purpose, a database-driven incident data analysis system was developed to aid the rapid review and categorization of PHMSA incident reports. Using an automated data-mining process followed by a peer review of the incident records, supported by natural hazard databases and external information sources, the pipeline Natechs were identified. As a by-product of the data-collection process, the database now includes over 800,000 incidents from all causes in industrial and transportation activities, which are automatically classified in the same way as the PHMSA records. This presentation describes the data collection and reviewing steps conducted during the study, provides information on the developed database and data analysis tools, and reports the findings of a statistical analysis of the identified hazardous liquid pipeline incidents in terms of accident dynamics and

  1. Risk analysis for roadways subjected to multiple landslide-related hazards

    NASA Astrophysics Data System (ADS)

    Corominas, Jordi; Mavrouli, Olga

    2014-05-01

    dynamic factors: the service load and the wall deformation. The consequences are then calculated for each hazard type according to its characteristics (mechanism, magnitude, frequency). The difference between this method and other methodologies for landslide-related hazards lies in the hazard scenarios and consequence profiles that are investigated. The depth of analysis makes it possible to account for local conditions concerning either the hazard or the consequences (the latter with respect to the very particular characteristics of the roadway, such as traffic, number of lanes, velocity…). Furthermore, it provides an extensive list of quantitative risk descriptors, both individual and collective. The methodology was automated using Microsoft Excel spreadsheets. The results can be used to support decision-making in the planning of protection measures. Gaps in knowledge and restrictions are discussed as well.

  2. Hydraulic fracturing in unconventional reservoirs - Identification of hazards and strategies for a quantitative risk assessment

    NASA Astrophysics Data System (ADS)

    Helmig, R.; Kissinger, A.; Class, H.; Ebigbo, A.

    2012-12-01

    The production of unconventional gas resources that require hydraulic fracturing to be released, such as shale gas, tight gas, and coal-bed methane, has become an economically attractive technology for a continued supply of fossil-fuel energy sources in many countries. Recently, a major focus of interest has been directed to hydraulic fracturing in Germany. The technology is controversial since it involves severe risks. The main difference in risk with respect to other subsurface technologies, such as carbon sequestration, is that fracking is remunerative, and it is important to distinguish between economic and environmental issues. The hydrofracking process may pose a threat to groundwater resources if fracking fluid or brine can migrate through fault zones into shallow aquifers. Diffuse methane emissions from the gas reservoir may not only contaminate shallow groundwater aquifers but also escape into the atmosphere, where methane acts as a greenhouse gas. The working group "Risks in the Geological System", as part of ExxonMobil's hydrofracking dialogue and information dissemination process, was tasked with the assessment of possible hazards posed by migrating fluids as a result of hydrofracking activities. In this work several flow paths for fracking fluid, brine, and methane are identified, and scenarios are set up to qualitatively estimate under what circumstances these fluids would leak into shallower layers. The parametrization for potential fracking sites in North Rhine-Westphalia and Lower Saxony (both in Germany) is derived from the literature using upper and lower bounds of hydraulic parameters. The results show that significant fluid migration is only possible if a combination of several conservative assumptions is met by a scenario. Another outcome of this work is the demand for further research, as many of the processes involved in hydrofracking have not yet been fully understood (e.g. quantification of source terms for methane in the

  3. Phase 2 fire hazard analysis for the canister storage building

    SciTech Connect

    Sadanaga, C.T., Westinghouse Hanford

    1996-07-01

    The fire hazard analysis assesses the risk from fire in a facility to ascertain whether the fire protection policies are met. This document provides a preliminary FHA for the CSB facility. Open items have been noted in the document. A final FHA will be required at the completion of definitive design, prior to operation of the facility.

  4. Fire Hazard Analysis for the Cold Vacuum Drying (CVD) Facility

    SciTech Connect

    JOHNSON, B.H.

    1999-08-19

    This Fire Hazard Analysis assesses the risk from fire within individual fire areas in the Cold Vacuum Drying Facility at the Hanford Site in relation to existing or proposed fire protection features to ascertain whether the objectives of DOE Order 5480.7A Fire Protection are met.

  5. DEM processing for the analysis of hydraulic hazards

    NASA Astrophysics Data System (ADS)

    Dresen, M.

    2003-04-01

    The digital analysis of hydrological processes and hydraulic hazards requires high data accuracy, especially for topographic data, that cannot be ensured by standard digital elevation models (DEMs). For this reason, terrain analysis and the analysis of topographical factors are highly significant for the modelling of hydrological processes. Most common GIS packages do not fulfill these requirements and do not allow detailed process-oriented analysis. As a result, the estimation of hazard potential as well as the derivation of possible effects of catastrophic events are not possible. Improving DEM creation and expanding suitable methods and functionalities therefore have high priority in hydraulic hazard assessment. We can demonstrate that the quality of DEMs can be clearly improved with the help of different extensions and adaptations. The comparison of different flood events in Europe reveals the better accuracy of the topographical factors and the derived hydrological parameters. In this way the simulation of hydrological processes and hydraulic hazards can be improved.

  6. Fire Hazards Analysis for the Inactive Equipment Storage Sprung Structure

    SciTech Connect

    MYOTT, C.F.

    2000-02-03

    The purpose of the analysis is to comprehensively assess the risk from fire within individual fire areas in relation to proposed fire protection, so as to ascertain whether the fire protection objectives of DOE Order 5480.1A are met. The order acknowledges a graded approach commensurate with the hazards involved.

  7. Appalachian Play Fairway Analysis Seismic Hazards Supporting Data

    DOE Data Explorer

    Frank Horowitz

    2016-07-20

    These are the data used in estimating the seismic hazards (both natural and induced) for candidate direct use geothermal locations in the Appalachian Basin Play Fairway Analysis by Jordan et al. (2015). xMin,yMin -83.1407,36.7461 : xMax,yMax -71.5175,45.1729

  8. Quantitative NIR Raman analysis in liquid mixtures.

    PubMed

    Sato-Berrú, R Ysacc; Medina-Valtierra, Jorge; Medina-Gutiérrez, Cirilo; Frausto-Reyes, Claudio

    2004-08-01

    The capability to obtain quantitative information in a simple way from Raman spectra is a subject of considerable interest. In this work, this is demonstrated for mixtures of ethanol with water and rhodamine-6G (R-6G) with methanol, which were analyzed directly in glass vessels. The Raman intensities and a simple mathematical model were used for the analysis of the liquid samples. The starting point is a general expression, generated from the experimental spectra as the sum of the particular expressions for each pure compound; this yields an expression for the mixture that can be used to determine the concentrations of its components from the Raman spectrum of the mixture.
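
    If the mixture spectrum is, to a good approximation, a linear superposition of the pure-component spectra, the concentrations can be recovered by ordinary least squares. A self-contained sketch of this idea (synthetic Gaussian "spectra" stand in for measured ones; all peak positions and values are illustrative):

        import numpy as np

        x = np.linspace(0, 3500, 500)                       # wavenumber grid, cm^-1
        gauss = lambda mu, s: np.exp(-((x - mu) / s) ** 2)

        # Hypothetical pure-component spectra on a common grid
        spec_ethanol = gauss(880, 40) + 0.6 * gauss(2930, 60)
        spec_water = gauss(3400, 150)

        true_c = np.array([0.3, 0.7])
        mixture = true_c @ np.vstack([spec_ethanol, spec_water])
        mixture += np.random.default_rng(1).normal(0, 0.01, x.size)  # noise

        # Least-squares estimate of the component fractions
        S = np.column_stack([spec_ethanol, spec_water])
        c, *_ = np.linalg.lstsq(S, mixture, rcond=None)
        print(c / c.sum())  # recovered fractions, approximately [0.3, 0.7]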

  9. Quantitative proteomic analysis of intact plastids.

    PubMed

    Shiraya, Takeshi; Kaneko, Kentaro; Mitsui, Toshiaki

    2014-01-01

    Plastids are specialized cell organelles in plant cells that are differentiated into various forms including chloroplasts, chromoplasts, and amyloplasts, and fulfill important functions in maintaining the overall cell metabolism and sensing environmental factors such as sunlight. It is therefore important to grasp the mechanisms of differentiation and functional changes of plastids in order to enhance the understanding of vegetality. In this chapter, details of a method for the extraction of intact plastids that makes analysis possible while maintaining the plastid functions are provided; in addition, a quantitative shotgun method for analyzing the composition and changes in the content of proteins in plastids as a result of environmental impacts is described. PMID:24136541

  11. PO*WW*ER mobile treatment unit process hazards analysis

    SciTech Connect

    Richardson, R.B.

    1996-06-01

    The objective of this report is to demonstrate that a thorough assessment of the risks associated with the operation of the Rust Geotech patented PO*WW*ER mobile treatment unit (MTU) has been performed and documented. The MTU was developed to treat aqueous mixed wastes at the US Department of Energy (DOE) Albuquerque Operations Office sites. The MTU uses evaporation to separate organics and water from radionuclides and solids, and catalytic oxidation to convert the hazardous organics into byproducts. This process hazards analysis evaluated a number of accident scenarios not directly related to the operation of the MTU, such as natural phenomena damage and mishandling of chemical containers. Worst-case accident scenarios were further evaluated to determine the risk potential to the MTU and to workers, the public, and the environment. The overall risk to any group from operation of the MTU was determined to be very low; the MTU is classified as a Radiological Facility with low hazards.

  12. Hazard analysis of Clostridium perfringens in the Skylab Food System

    NASA Technical Reports Server (NTRS)

    Bourland, C. T.; Huber, C. S.; Kiser, P. R.; Heidelbaugh, N. D.; Rowley, D. B.

    1974-01-01

    The Skylab Food System presented unique microbiological problems because food was warmed in null-gravity and because the heat source was limited to 69.4 C (to prevent boiling in null-gravity). For these reasons, the foods were manufactured using critical control point techniques of quality control coupled with appropriate hazard analyses. One of these hazard analyses evaluated the threat from Clostridium perfringens. Samples of food were inoculated with C. perfringens and incubated for 2 h at temperatures ranging from 25 to 55 C. Generation times were determined for the foods at various temperatures. Results of these tests were evaluated taking into consideration: food-borne disease epidemiology, the Skylab food manufacturing procedures, and the performance requirements of the Skylab Food System. Based on this hazard analysis, a limit for C. perfringens of 100/g was established for Skylab foods.
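
    The generation-time data translate directly into a worst-case growth projection via binary-fission kinetics, N(t) = N0 * 2^(t/g). A small sketch under assumed numbers (the 0.5-h generation time and 1 CFU/g inoculum are illustrative, not values from the Skylab tests):

        import math

        limit = 100   # CFU/g, the Skylab limit cited above
        n0 = 1.0      # assumed initial contamination, CFU/g
        g = 0.5       # assumed generation time at abuse temperature, h

        # Time for N(t) = n0 * 2**(t / g) to reach the limit
        t = g * math.log2(limit / n0)
        print(f"~{t:.1f} h of temperature abuse to exceed {limit} CFU/g")  # ~3.3 h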

  13. Environmental hazard analysis and effective remediation of highway seepage.

    PubMed

    Yuan, Renmao; Yang, Y S; Qiu, X; Ma, F S

    2007-04-01

    Risk assessment and minimisation of environmental hazards are critical issues to consider in geotechnical engineering projects. A case of highway pavement seepage induced by groundwater, at a locality along the Hua-Qing Highway of Guangdong Province, China, is presented for environmental hazard analysis and effective remediation. The environmental hazard analyses were based on in situ hydrogeologic investigation, rock-soil testing, and integrated environmental understanding. The analyses indicate that the highway seepage was caused by elevation of groundwater hydraulic pressure in low-permeability strata near the highway pavement, which was controlled by landform, hydrology, weather, and road structure. The risk source of groundwater 'flooding' was the groundwater and surface water in the ring-like valley around Fenshui Village. A blind-ditch system for effective remediation of the pavement seepage hazard was proposed and successfully implemented by lowering the groundwater table near the highway, based on a comprehensive assessment of the various conditions. This geotechnical accident shows that the role of groundwater is an essential factor to consider in geotechnical and environmental engineering studies, and that a multidisciplinary effort for risk assessment of environmental hazards is important under current global climate change conditions.

  14. Challenges to Seismic Hazard Analysis of Critical Infrastructures

    NASA Astrophysics Data System (ADS)

    Klügel, J.

    2005-12-01

    Based on the background of the review of a large-scale probabilistic seismic hazard analysis (PSHA) performed in Switzerland for the sites of Swiss nuclear power plants - the PEGASOS project (2000-2004) - challenges to seismic hazard analysis of critical infrastructures are discussed from the perspective of a professional safety analyst. The PEGASOS study was performed to provide a meaningful input for the update of the plant-specific PRAs (Probabilistic Risk Assessments) of Swiss nuclear power plants. Earlier experience had shown that the results of these studies are, to a large extent, driven by the results of the seismic hazard analysis. The PEGASOS study was performed in full compliance with the procedures developed by the Senior Seismic Hazard Analysis Committee (SSHAC) of the U.S.A. (SSHAC, 1997) for the treatment of uncertainties by the use of a structured expert elicitation process. The preliminary results derived from the project showed an unexpected amount of uncertainty and were regarded as not suitable for direct application. A detailed review of the SSHAC methodology revealed a number of critical issues with respect to the treatment of uncertainties and the mathematical models applied, which will be presented in the paper. The most important issues to be discussed are: * The ambiguous solution of PSHA logic trees * The inadequate mathematical treatment of the results of expert elicitations based on the assumption of bias-free expert estimates * The problems associated with the "think model" of the separation of epistemic and aleatory uncertainties * The consequences of the ergodic assumption used to justify the transfer of attenuation equations of other regions to the region of interest. Based on these observations, methodological questions with respect to the development of a risk-consistent design basis for new nuclear power plants, as required by the U.S. NRC RG 1.165, will be evaluated. As a principal alternative for the development of a

  15. 21 CFR 123.6 - Hazard analysis and Hazard Analysis Critical Control Point (HACCP) plan.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... other species where a food safety hazard has been associated with decomposition; (vii) Parasites, where the processor has knowledge or has reason to know that the parasite-containing fish or fishery product will be consumed without a process sufficient to kill the parasites, or where the processor...

  16. 21 CFR 123.6 - Hazard analysis and Hazard Analysis Critical Control Point (HACCP) plan.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... other species where a food safety hazard has been associated with decomposition; (vii) Parasites, where the processor has knowledge or has reason to know that the parasite-containing fish or fishery product will be consumed without a process sufficient to kill the parasites, or where the processor...

  17. Evaluation and Application of Probabilistic Tsunami Hazard Analysis in California

    NASA Astrophysics Data System (ADS)

    Thio, H. K.; Wilson, R. I.; Miller, K.

    2014-12-01

    The California Geological Survey (CGS) and URS Corporation are in the process of generating tsunami hazard map products for land-use planning and construction through the California Seismic Hazard Mapping Act (Public Resources Code, sec 2690 et seq.). Similar to seismic hazard zonation, these products for land-use and development decision-making are typically based on a probabilistic analysis and require information on the frequency of occurrence through a probabilistic tsunami hazard analysis (PTHA). In Phase 1 of CGS's work, the California PTHA Work Group was established to evaluate the results of PTHA demonstration projects in Crescent City and Huntington Beach. The results of this Phase 1 review of the two independent analyses indicate that PTHAs can be developed with recommended improvements in source characterization, PTHA methods, and numerical model use. An immediate significant improvement is to align the characterization of the Cascadia Subduction Zone PTHA with the seismic characterization of the National Seismic Hazard Map Program of the USGS. In addition to applying PTHA to land-use planning and the two demonstration projects, CGS and the CA-PTHA Work Group identified other potential applications for various PTHA risk levels (ARP = Average Return Period), including flood insurance (100- and 500-year ARP), building codes (2,500-year ARP), and emergency response planning (1,000-year ARP or larger). CGS is working with URS Corp., the California Office of Emergency Services, and FEMA on a Phase 2 plan to produce a single set of reliable and consistent PTHA maps for multiple risk levels and to work with various end-users to determine how to use the maps. The California PTHA and the results of the Work Group review are also proposed to be used by the U.S. National Tsunami Hazard Mitigation Program to develop guidelines for production in other coastal states.

  18. Systematic analysis of natural hazards along infrastructure networks using a GIS-tool for risk assessment

    NASA Astrophysics Data System (ADS)

    Baruffini, Mirko

    2010-05-01

    system which integrates the procedures for a complete risk analysis in a Geographic Information System (GIS) toolbox, to be applied to our testbed, the Alps-crossing corridor of St. Gotthard. The simulation environment is developed within ArcObjects, the development platform for ArcGIS. The topic of ArcObjects usually emerges when users realize that programming ArcObjects can reduce the amount of repetitive work, streamline the workflow, and even produce functionalities that are not easily available in ArcGIS. We have adopted Visual Basic for Applications (VBA) for programming ArcObjects. Because VBA is already embedded within ArcMap and ArcCatalog, it is convenient for ArcGIS users to program ArcObjects in VBA. Our tool visualises the data obtained from an analysis of historical records (aerial photo imagery, field surveys, documentation of past events) or from environmental modelling (estimates of the area affected by a given event), together with event attributes such as route number and route position, and thematic maps. As a result of this step the record appears in the WebGIS. The user can select a specific area to review previous hazards in the region. After performing the analysis, a double click on the visualised infrastructure opens the corresponding results. The constantly updated risk maps show all sites that require more protection against natural hazards. The final goal of our work is to offer a versatile tool for risk analysis which can be applied to different situations. Today our GIS application mainly centralises the documentation of natural hazards. Additionally, the system offers information about natural hazards along the Gotthard line. It is very flexible and can be used as a simple program to model the expansion of natural hazards, as a program to quantitatively estimate risks, or as a detailed analysis at the municipality level. The tool is extensible and can be expanded with additional modules. The initial results of the experimental case study show how useful a

  19. Graded approach for initiating event selection in a facility hazard analysis

    SciTech Connect

    Majumdar, K.; Altenbach, T.

    1998-04-01

    This paper describes a methodology for selecting initiating events or event scenarios for the hazard analysis of a new Department of Energy (DOE) facility at the Nevada Test Site for nuclear explosive operations called the Device Assembly Facility (DAF). The selection process is a very important first step in conducting the hazard analysis for the facility, which in turn may feed into a quantitative risk analysis. A comprehensive risk analysis is dependent on the identification and inclusion of a complete set of initiating events in the analysis model. A systematic and logical method of grading or screening all the potential initiating events satisfies the needs for completeness within the bounds of efficiency and practicality. By applying the graded approach to the selection of the initiating events, the task and hazard analysis was able to focus its attention on only those events having the potential to develop into credible accident scenarios. Resources were concentrated into the understanding of those scenarios, and assuring that adequate positive measures are in place to control the risk associated with them.
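
    A graded screening step of this kind reduces, in essence, to ranking candidate initiating events on severity and likelihood and retaining only those above a threshold for detailed analysis. A minimal sketch, with categories, events, and the threshold chosen purely for illustration (not those used for the DAF analysis):

        SEVERITY = {"negligible": 0, "marginal": 1, "critical": 2, "catastrophic": 3}
        LIKELIHOOD = {"incredible": 0, "unlikely": 1, "possible": 2, "likely": 3}

        def retain_for_analysis(severity, likelihood, threshold=2):
            """Retain an initiating event if severity x likelihood meets the
            screening threshold; screened-out events are documented, not modelled."""
            return SEVERITY[severity] * LIKELIHOOD[likelihood] >= threshold

        # Hypothetical events, for illustration only
        events = [("enclosure deflagration", "catastrophic", "unlikely"),
                  ("office trash fire", "marginal", "unlikely")]
        for name, sev, lik in events:
            verdict = "analyze" if retain_for_analysis(sev, lik) else "screen out"
            print(f"{name}: {verdict}")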

  20. Multi-hazard risk analysis for management strategies

    NASA Astrophysics Data System (ADS)

    Kappes, M.; Keiler, M.; Bell, R.; Glade, T.

    2009-04-01

    Risk management very often operates in a reactive way, responding to an event, instead of proactively starting with risk analysis and building up the whole process of risk evaluation, prevention, event management, and regeneration. Since damage and losses from natural hazards rise continuously, more and more studies, concepts (e.g. in Switzerland or South Tyrol-Bolzano) and software packages (e.g. ARMAGEDOM, HAZUS or RiskScape) are being developed to guide, standardize, and facilitate risk analysis. But these approaches focus on different aspects and are mostly closely adapted to the situation (legislation, organization of the administration, specific processes, etc.) of the specific country or region. In this study we propose the development of a flexible methodology for multi-hazard risk analysis, identifying the stakeholders and their needs, the processes and their characteristics, and modeling approaches, as well as incoherencies that occur when combining all these different aspects. Based on this concept, a flexible software package will be established, consisting of ArcGIS as the central base complemented by various modules for hazard modeling, vulnerability assessment, and risk calculation. Not all modules will be developed from scratch; some will be taken from the current state of the art and connected to or integrated into ArcGIS. For this purpose two study sites, Valtellina in Italy and Barcelonnette in France, were chosen, and the hazard types debris flows, rockfalls, landslides, avalanches, and floods are planned to be included in the tool for a regional multi-hazard risk analysis. Since the central idea of this tool is its flexibility, this will only be a first step; in the future further processes and scales can be included and the instrument thus adapted to any study site.

  1. Quantitative interactome analysis reveals a chemoresistant edgotype

    PubMed Central

    Chavez, Juan D.; Schweppe, Devin K.; Eng, Jimmy K.; Zheng, Chunxiang; Taipale, Alex; Zhang, Yiyi; Takara, Kohji; Bruce, James E.

    2015-01-01

    Chemoresistance is a common mode of therapy failure for many cancers. Tumours develop resistance to chemotherapeutics through a variety of mechanisms, with proteins serving pivotal roles. Changes in protein conformations and interactions affect the cellular response to environmental conditions contributing to the development of new phenotypes. The ability to understand how protein interaction networks adapt to yield new function or alter phenotype is limited by the inability to determine structural and protein interaction changes on a proteomic scale. Here, chemical crosslinking and mass spectrometry were employed to quantify changes in protein structures and interactions in multidrug-resistant human carcinoma cells. Quantitative analysis of the largest crosslinking-derived, protein interaction network comprising 1,391 crosslinked peptides allows for ‘edgotype' analysis in a cell model of chemoresistance. We detect consistent changes to protein interactions and structures, including those involving cytokeratins, topoisomerase-2-alpha, and post-translationally modified histones, which correlate with a chemoresistant phenotype. PMID:26235782

  2. RASOR Project: Rapid Analysis and Spatialisation of Risk, from Hazard to Risk using EO data

    NASA Astrophysics Data System (ADS)

    Rossi, Lauro; Rudari, Roberto

    2016-04-01

    Over recent decades, there has been a dramatic rise in disasters and their impact on human populations. Escalating complexity in our societies is making risks increasingly difficult to understand and is changing the ways in which hazards interact with each other. The Rapid Analysis and Spatialisation Of Risk (RASOR) project developed a multi-hazard risk analysis platform to support the full cycle of disaster management. RASOR provides up-to-date hazard information across floods and geohazards, up-to-date exposure data from known sources and newly generated EO-based data, and quantitative characterisations of their vulnerabilities. RASOR also adapts the newly developed 12-m resolution global TanDEM-X Digital Elevation Model (DEM) to risk management applications, using it as a base layer to develop specific disaster scenarios. RASOR overlays archived and near-real-time very-high-resolution optical and radar satellite data, combined with in situ data, for both global and local applications. A scenario-driven query system allows users to project situations into the future and model multi-hazard risk both before and during an event. Applications to different case study sites are presented to illustrate the platform's potential.

  3. Uncertainty analysis for seismic hazard in Northern and Central Italy

    USGS Publications Warehouse

    Lombardi, A.M.; Akinci, A.; Malagnini, L.; Mueller, C.S.

    2005-01-01

    In this study we examine uncertainty and parametric sensitivity of Peak Ground Acceleration (PGA) and 1-Hz Spectral Acceleration (1-Hz SA) in probabilistic seismic hazard maps (10% probability of exceedance in 50 years) of Northern and Central Italy. The uncertainty in hazard is estimated using a Monte Carlo approach to randomly sample a logic tree that has three input-variable branch points representing alternative values for the b-value, maximum magnitude (Mmax), and attenuation relationships. Uncertainty is expressed in terms of the 95% confidence band and the Coefficient Of Variation (COV). The overall variability of ground motions and their sensitivity to each parameter of the logic tree are investigated. The largest values of the overall 95% confidence band are around 0.15 g for PGA in the Friuli and Northern Apennines regions and around 0.35 g for 1-Hz SA in the Central Apennines. The sensitivity analysis shows that the largest contributor to seismic hazard variability is uncertainty in the choice of ground-motion attenuation relationships, especially in the Friuli region (~0.10 g) for PGA and in the Friuli and Central Apennines regions (~0.15 g) for 1-Hz SA. This is followed by the variability of the b-value: its main contribution is evident in the Friuli and Central Apennines regions for both 1-Hz SA (~0.15 g) and PGA (~0.10 g). We observe that the contribution of Mmax to seismic hazard variability is negligible, at least for the 10%-in-50-years hazard. The overall COV map for PGA shows that the uncertainty in the hazard is larger in the Friuli and Northern Apennines regions, around 20-30%, than in the Central Apennines and Northwestern Italy, around 10-20%. The overall uncertainty is larger for the 1-Hz SA map and reaches 50-60% in the Central Apennines and Western Alps.
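
    The Monte Carlo treatment of the logic tree amounts to repeatedly drawing one branch per node according to the branch weights, recomputing the hazard, and summarising the resulting distribution. A schematic sketch (the branch values, weights, and the placeholder hazard function are invented for illustration; a real run would call the full hazard code):

        import numpy as np

        rng = np.random.default_rng(0)

        # Hypothetical three-node logic tree: (branch values, branch weights)
        branches = {
            "b":    ([0.9, 1.0, 1.1], [0.2, 0.6, 0.2]),
            "mmax": ([6.5, 7.0, 7.5], [0.3, 0.5, 0.2]),
            "gmpe": ([0.8, 1.0, 1.2], [0.25, 0.5, 0.25]),  # stand-in for attenuation models
        }

        def hazard_pga(b, mmax, gmpe):
            # Placeholder: returns a 10%-in-50-yr PGA (g); a real implementation
            # integrates over sources and ground-motion models.
            return gmpe * 0.30 * np.exp(1.0 - b) * (mmax / 7.0)

        draw = lambda vals, w: rng.choice(vals, p=w)
        pga = np.array([hazard_pga(draw(*branches["b"]),
                                   draw(*branches["mmax"]),
                                   draw(*branches["gmpe"]))
                        for _ in range(10000)])

        lo, hi = np.percentile(pga, [2.5, 97.5])  # 95% confidence band
        cov = pga.std() / pga.mean()              # coefficient of variation
        print(lo, hi, cov)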

  4. Probabilistic Seismic Hazard Disaggregation Analysis for the South of Portugal

    NASA Astrophysics Data System (ADS)

    Rodrigues, I.; Sousa, M.; Teves-Costa, P.

    2010-12-01

    Probabilistic seismic hazard disaggregation analysis was performed and seismic scenarios were identified for Southern Mainland Portugal. This region's seismicity is characterized by small- and moderate-magnitude events and by the sporadic occurrence of large earthquakes (e.g. the 1755 Lisbon earthquake). Thus, the Portuguese Civil Protection Agency (ANPC) sponsored a collaborative research project for the study of the seismic and tsunami risks in the Algarve (project ERSTA). In the framework of this project, a series of new developments were obtained, namely the revision of the seismic catalogue (IM, 2008), the delineation of new seismogenic zones affecting the Algarve region, which reflects the growing knowledge of this region's seismotectonic context, the derivation of new spectral attenuation laws (Carvalho and Campos Costa, 2008), and the revision of the probabilistic seismic hazard (Sousa et al. 2008). Seismic hazard was disaggregated considering different spaces of random variables, namely bivariate conditional hazard distributions of X-Y (seismic source latitude and longitude) and multivariate 4D conditional hazard distributions of M-(X-Y)-ɛ (ɛ being the deviation of ground motion from the median value predicted by an attenuation model). These procedures were performed for the peak ground acceleration (PGA) and for the 5% damped 1.0 and 2.5 Hz spectral acceleration levels for three return periods: 95, 475, and 975 years. The seismic scenarios controlling the hazard at a given ground motion level were identified as the modal values of the 4D disaggregation analysis for each of the 84 parishes of the Algarve region. Those scenarios, based on a probabilistic analysis, are meant to be used in emergency planning as a complement to the historical scenarios that severely affected this region. Seismic scenarios share a small number of geographical locations across all return periods. Moreover, the seismic hazard of most Algarve parishes is dominated by the seismicity located

  5. Quantitative morphometric analysis for the tectonic characterisation of northern Tunisia.

    NASA Astrophysics Data System (ADS)

    Camafort, Miquel; Pérez-Peña, José Vicente; Booth-Rea, Guillermo; Ranero, César R.; Gràcia, Eulàlia; Azañón, José Miguel; Melki, Fetheddine; Ouadday, Mohamed

    2016-04-01

    Northern Tunisia is characterized by low deformation rates and low to moderate seismicity. Although instrumental seismicity reaches maximum magnitudes of Mw 5.5, some historical earthquakes have occurred with catastrophic consequences in this region. Aiming to improve our knowledge of active tectonics in Tunisia, we carried out both a quantitative morphometric analysis and a field study in the north-western region. We applied different morphometric tools, such as river profiles, knickpoint analysis, hypsometric curves and integrals, and drainage pattern anomalies, in order to differentiate between zones with high and low recent tectonic activity. This analysis helps identify uplift and subsidence zones, which we relate to fault activity. Several active faults in a sparse distribution were identified. A selected sector was studied with a field campaign to test the results obtained with the quantitative analysis. During the fieldwork we identified geological evidence of recent activity and a considerable seismogenic potential along the El Alia-Teboursouk (ETF) and Dkhila (DF) faults. The ETF fault could be responsible for one of the most devastating historical earthquakes in northern Tunisia, which destroyed Utique in 412 A.D. Geological evidence includes fluvial terraces folded by faults, striated and cracked pebbles, clastic dikes, sand volcanoes, coseismic cracks, etc. Although not reflected in the instrumental seismicity, our results support an important seismic hazard, evidenced by the several active tectonic structures identified and the two seismogenic faults described. After obtaining the current active tectonic framework of Tunisia, we discuss our results within the western Mediterranean, aiming to contribute to the understanding of its tectonic context. With our results, we suggest that the main reason explaining the sparse and scarce seismicity of the area, in contrast with the adjacent parts of the Nubia-Eurasia boundary, is due to its extended
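
    Of the morphometric indices mentioned, the hypsometric integral is the simplest to compute: it summarises how much of a basin's mass sits high in its relief, with high values typically taken to indicate youthful, recently uplifted topography. A minimal sketch using the common elevation-statistics approximation (the sample elevations are invented):

        import numpy as np

        def hypsometric_integral(elev):
            """HI ~ (mean - min) / (max - min) over a basin's elevations."""
            elev = np.asarray(elev, dtype=float)
            return (elev.mean() - elev.min()) / (elev.max() - elev.min())

        # Hypothetical DEM sample of one drainage basin (m a.s.l.)
        basin = np.array([220, 260, 310, 420, 580, 730, 900])
        print(f"HI = {hypsometric_integral(basin):.2f}")  # ~0.40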

  6. Empirical Bayes Analysis of Quantitative Proteomics Experiments

    PubMed Central

    Margolin, Adam A.; Ong, Shao-En; Schenone, Monica; Gould, Robert; Schreiber, Stuart L.; Carr, Steven A.; Golub, Todd R.

    2009-01-01

    Background Advances in mass spectrometry-based proteomics have enabled the incorporation of proteomic data into systems approaches to biology. However, development of analytical methods has lagged behind. Here we describe an empirical Bayes framework for quantitative proteomics data analysis. The method provides a statistical description of each experiment, including the number of proteins that differ in abundance between 2 samples, the experiment's statistical power to detect them, and the false-positive probability of each protein. Methodology/Principal Findings We analyzed 2 types of mass spectrometric experiments. First, we showed that the method identified the protein targets of small-molecules in affinity purification experiments with high precision. Second, we re-analyzed a mass spectrometric data set designed to identify proteins regulated by microRNAs. Our results were supported by sequence analysis of the 3′ UTR regions of predicted target genes, and we found that the previously reported conclusion that a large fraction of the proteome is regulated by microRNAs was not supported by our statistical analysis of the data. Conclusions/Significance Our results highlight the importance of rigorous statistical analysis of proteomic data, and the method described here provides a statistical framework to robustly and reliably interpret such data. PMID:19829701

  7. Long term volcanic hazard analysis in the Canary Islands

    NASA Astrophysics Data System (ADS)

    Becerril, L.; Galindo, I.; Laín, L.; Llorente, M.; Mancebo, M. J.

    2009-04-01

    Historic volcanism in Spain is restricted to the Canary Islands, a volcanic archipelago formed by seven volcanic islands. Several historic eruptions have been registered in the last five hundred years. However, despite the large numbers of residents and tourists in the archipelago, only a few volcanic hazard studies have been carried out. These studies have mainly focused on developing hazard maps for Lanzarote and Tenerife, especially for land-use planning. The main handicap for such studies in the Canary Islands is the lack of well-reported historical eruptions, but also the lack of data such as geochronological, geochemical, or structural information. In recent years, the use of Geographical Information Systems (GIS) and improvements in the modelling of volcanic processes have provided important tools for volcanic hazard assessment. Although these sophisticated programs are really useful, they need to be fed with a large amount of data that sometimes, as in the case of the Canary Islands, is not available. For this reason, the Spanish Geological Survey (IGME) is developing a complete geo-referenced database for long-term volcanic hazard analysis in the Canary Islands. The Canarian Volcanic Hazard Database (HADA) is based on a GIS that helps organize and manage volcanic information efficiently. HADA includes the following groups of information: (1) 1:25,000-scale geologic maps, (2) 1:25,000 topographic maps, (3) geochronologic data, (4) geochemical data, (5) structural information, (6) climatic data. Data must pass a quality control before they are included in the database. New data are easily integrated into the database. With the HADA database the IGME has started a systematic organization of the existing data. In the near future, the IGME will generate new information to be included in HADA, such as volcanological maps of the islands, structural information, geochronological data, and other information to support long-term volcanic hazard analysis. HADA will permit

  8. Surface Fire Hazards Analysis Technical Report-Constructor Facilities

    SciTech Connect

    R.E. Flye

    2000-10-24

    The purpose of this Fire Hazards Analysis Technical Report (hereinafter referred to as Technical Report) is to assess the risk from fire within individual fire areas to ascertain whether the U.S. Department of Energy (DOE) fire safety objectives are met. The objectives identified in DOE Order 420.1, Change 2, Facility Safety, Section 4.2, establish requirements for a comprehensive fire and related hazards protection program for facilities sufficient to minimize the potential for: The occurrence of a fire or related event; A fire that causes an unacceptable on-site or off-site release of hazardous or radiological material that will threaten the health and safety of employees, the public, or the environment; Vital DOE programs suffering unacceptable interruptions as a result of fire and related hazards; Property losses from a fire and related events exceeding defined limits established by DOE; and Critical process controls and safety class systems being damaged as a result of a fire and related events.

  9. Ergonomics hazards analysis of linemen's power line fixing work in China.

    PubMed

    Yu, Ming; Sun, Linyan; Du, Jianhua; Wu, Fengge

    2009-01-01

    This study used qualitative and quantitative methods, such as OWAS (Ovako working posture analysis system) and behavior observation, to analyze musculoskeletal disorder (MSD) risk factors of power line fixing work in China. Video-based sampling was used to record and analyze the frequency and posture of on-pole activities. The key subtasks showed the ergonomic characteristics of on-pole fixing tasks. Insulator-fixing was the longest subtask (33% of total working time). Bar-installing was the second longest (26% of total working time). It was evident that bar-installing and insulator-fixing carried the greatest hazard: the action categories of these 2 subtasks were higher than those of the other subtasks, and the 2 subtasks were also time-consuming, difficult, and induced MSDs. Assistant linemen faced more hazardous factors than chief linemen.

  10. A methodology for probabilistic fault displacement hazard analysis (PFDHA)

    USGS Publications Warehouse

    Youngs, R.R.; Arabasz, W.J.; Anderson, R.E.; Ramelli, A.R.; Ake, J.P.; Slemmons, D.B.; McCalpin, J.P.; Doser, D.I.; Fridrich, C.J.; Swan, F. H.; Rogers, A.M.; Yount, J.C.; Anderson, L.W.; Smith, K.D.; Bruhn, R.L.; Knuepfer, P.L.K.; Smith, R.B.; DePolo, C.M.; O'Leary, D. W.; Coppersmith, K.J.; Pezzopane, S.K.; Schwartz, D.P.; Whitney, J.W.; Olig, S.S.; Toro, G.R.

    2003-01-01

    We present a methodology for conducting a site-specific probabilistic analysis of fault displacement hazard. Two approaches are outlined. The first relates the occurrence of fault displacement at or near the ground surface to the occurrence of earthquakes in the same manner as is done in a standard probabilistic seismic hazard analysis (PSHA) for ground shaking. The methodology for this approach is taken directly from PSHA methodology with the ground-motion attenuation function replaced by a fault displacement attenuation function. In the second approach, the rate of displacement events and the distribution for fault displacement are derived directly from the characteristics of the faults or geologic features at the site of interest. The methodology for probabilistic fault displacement hazard analysis (PFDHA) was developed for a normal faulting environment and the probability distributions we present may have general application in similar tectonic regions. In addition, the general methodology is applicable to any region and we indicate the type of data needed to apply the methodology elsewhere.
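
    For concreteness, the first (earthquake-based) approach can be written as the displacement analogue of the standard hazard-rate integral. The schematic form below uses symbols as commonly defined in PSHA and is not quoted from the paper:

        \nu(d) \;=\; \sum_{n} \alpha_n \int_{m} f_n(m) \int_{r} f_n(r \mid m)\, P(D > d \mid m, r)\; dr\, dm

    Here \alpha_n is the rate of earthquakes above a minimum magnitude on source n, f_n(m) and f_n(r|m) are the magnitude and distance densities, and P(D > d | m, r), the fault displacement attenuation function, takes the place of the ground-motion exceedance probability of ordinary PSHA.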

  11. Coupling induced seismic hazard analysis with reservoir design

    NASA Astrophysics Data System (ADS)

    Gischig, V.; Wiemer, S.; Alcolea, A. R.

    2013-12-01

    positive impact on seismic hazard. However, as smaller magnitudes contribute less to permeability enhancement, the efficiency of stimulation is degraded under high b-value conditions. Nevertheless, the target permeability enhancement can still be achieved under high b-value conditions without reaching an unacceptable seismic hazard level if either the initial permeability is already high or several fractures are stimulated. The proposed modelling approach is a first step towards including induced seismic hazard analysis in the design of reservoir stimulation.

  12. Quantitative analysis of protein turnover in plants.

    PubMed

    Nelson, Clark J; Li, Lei; Millar, A Harvey

    2014-03-01

    Proteins are constantly being synthesised and degraded as plant cells age and as plants grow, develop and adapt the proteome. Given that plants develop through a series of events from germination to fruiting and even undertake whole organ senescence, an understanding of protein turnover as a fundamental part of this process in plants is essential. Both synthesis and degradation processes are spatially separated in a cell across its compartmented structure. The majority of protein synthesis occurs in the cytosol, while synthesis of specific components occurs inside plastids and mitochondria. Degradation of proteins occurs in both the cytosol, through the action of the plant proteasome, and in organelles and lytic structures through different protease classes. Tracking the specific synthesis and degradation rate of individual proteins can be undertaken using stable isotope feeding and the ability of peptide MS to track labelled peptide fractions over time. Mathematical modelling can be used to follow the isotope signature of newly synthesised protein as it accumulates and natural abundance proteins as they are lost through degradation. Different technical and biological constraints govern the potential for the use of (13)C, (15)N, (2)H and (18)O for these experiments in complete labelling and partial labelling strategies. Future development of quantitative protein turnover analysis will involve analysis of protein populations in complexes and subcellular compartments, assessing the effect of PTMs and integrating turnover studies into wider system biology study of plants.
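
    At steady state, first-order turnover implies that the labelled fraction of a protein after a switch to heavy-isotope medium rises as f(t) = 1 - exp(-k*t), so the degradation rate constant can be fitted from a labelling time course. A self-contained sketch with invented data (the time points, fractions, and the choice of 15N are hypothetical, not from the chapter):

        import numpy as np
        from scipy.optimize import curve_fit

        # Hypothetical labelled-fraction time course for one peptide after
        # transferring plants to 15N medium
        t = np.array([0, 6, 12, 24, 48, 96], dtype=float)  # h of labelling
        f = np.array([0, 0.18, 0.33, 0.55, 0.80, 0.96])

        model = lambda t, k: 1 - np.exp(-k * t)
        (k,), _ = curve_fit(model, t, f, p0=[0.05])
        print(f"k_deg ~ {k:.3f}/h, protein half-life ~ {np.log(2) / k:.1f} h")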

  13. A Hazard Analysis for a Generic Insulin Infusion Pump

    PubMed Central

    Zhang, Yi; Jones, Paul L.; Jetley, Raoul

    2010-01-01

    Background Researchers at the Food and Drug Administration (FDA)/Center for Device and Radiological Health/Office of Science and Engineering Laboratories have been exploring the concept of model-based engineering as a means for improving the quality of medical device software. Insulin pumps were chosen as a research subject because their design provides the desired degree of research complexity and these types of devices present an ongoing regulatory challenge. Methods Insulin pump hazards and their contributing factors are considered in the context of a highly abstract generic insulin infusion pump (GIIP) model. Hazards were identified by consulting with manufacturers, pump users, and clinicians; by reviewing national and international standards and adverse event reports collected by the FDA; and from workshops sponsored by Diabetes Technology Society. This information has been consolidated in tabular form to facilitate further community analysis and discussion. Results A generic insulin infusion pump model architecture has been established. A fairly comprehensive hazard analysis document, corresponding to the GIIP model, is presented in this article. Conclusions We believe that this work represents the genesis of an insulin pump safety reference standard upon which future insulin pump designs can be based to help ensure a basic level of safety. More interaction with the diabetes community is needed to assure the quality of this safety modeling process. PMID:20307387

  14. Fire hazard analysis of the radioactive mixed waste trenches

    SciTech Connect

    McDonald, K.M.

    1995-04-27

    This Fire Hazards Analysis (FHA) is intended to comprehensively assess the risk from fire associated with the disposal of low-level radioactive mixed waste in trenches within the lined landfills provided by Project W-025, designated Trenches 31 and 34 of Burial Ground 218-W-5. Elements within the FHA make recommendations for minimizing risk to workers, the public, and the environment from fire during the course of the operation's activity. Transient flammables and combustibles present that support the operation's activity are considered and included in the analysis. The graded FHA contains the following elements: description of construction, protection of essential safety class equipment, fire protection features, description of fire hazards, life safety considerations, critical process equipment, high-value property, damage potential--maximum credible fire loss (MCFL) and maximum possible fire loss (MPFL), fire department/brigade response, recovery potential, potential for a toxic, biological and/or radiation incident due to a fire, emergency planning, security considerations related to fire protection, natural hazards (earthquake, flood, wind) impact on fire safety, and exposure fire potential, including the potential for fire spread between fire areas. Recommendations for limiting risk are made in the text of this report and printed in bold type. All recommendations are repeated in a list in Section 18.0.

  15. Deep Borehole Emplacement Mode Hazard Analysis Revision 0

    SciTech Connect

    Sevougian, S. David

    2015-08-07

    This letter report outlines a methodology and provides resource information for the Deep Borehole Emplacement Mode Hazard Analysis (DBEMHA). The main purpose is to identify the accident hazards and accident event sequences associated with the two emplacement mode options (wireline or drillstring), to outline a methodology for computing accident probabilities and frequencies, and to point to available databases on the nature and frequency of accidents typically associated with standard borehole drilling and nuclear handling operations. Risk mitigation and prevention measures, which have been incorporated into the two emplacement designs (see Cochran and Hardin 2015), are also discussed. A key intent of this report is to provide background information to brief subject matter experts involved in the Emplacement Mode Design Study. [Note: Revision 0 of this report concentrates more on the wireline emplacement mode. It is expected that Revision 1 will contain further development of the preliminary fault and event trees for the drill string emplacement mode.]

  16. Hazards Analysis Report Addendum Building 518/518A Industrial Gases & Chemtrack Receiving & Barcoding Facility

    SciTech Connect

    Hickman, R D

    2000-02-04

    This report documents the Hazards Analysis Report (HAR) Addendum for Buildings 518 and 518A. In summary, the description of the facility and the operations given in the 1995 PHA remains accurate as of 2000. The hazards description also remains the same. The hazards analysis in this HAR Addendum differs in that it is compared to operations routinely "performed" by the public. The HAR Addendum characterizes the level of intrinsic potential hazards associated with a facility and provides the basis for hazard classification. The hazard classification determines the level of safety documentation required and the DOE order governing the safety analysis. The hazard classification also determines the level of review and approval required for the safety analysis. This facility does not contain any safety class systems or systems important to safety as defined in Department of Energy standard DOE-STD-3009-94. The hazards of primary concern associated with B518 and B518A are chemical in nature. The hazard classification is determined by comparing facility inventories of chemicals with threshold values for the various hazard classification levels. In this way, the hazard level of the facility can be ascertained. The most significant hazards that could affect people in the local area of B518 and B518A, elsewhere on the LLNL site, and off site are associated with hazardous and toxic materials. These hazards are the focus of this report and are the basis for the facility hazard classification.

  17. Applying Knowledge of Quantitative Design and Analysis

    ERIC Educational Resources Information Center

    Baskas, Richard S.

    2011-01-01

    This study compared and contrasted two quantitative scholarly articles in relation to their research designs. Their designs were analyzed by the comparison of research references and research specific vocabulary to describe how various research methods were used. When researching and analyzing quantitative scholarly articles, it is imperative to…

  18. Error Propagation Analysis for Quantitative Intracellular Metabolomics

    PubMed Central

    Tillack, Jana; Paczia, Nicole; Nöh, Katharina; Wiechert, Wolfgang; Noack, Stephan

    2012-01-01

    Model-based analyses have become an integral part of modern metabolic engineering and systems biology in order to gain knowledge about complex and not directly observable cellular processes. For quantitative analyses, not only experimental data, but also measurement errors, play a crucial role. The total measurement error of any analytical protocol is the result of an accumulation of single errors introduced by several processing steps. Here, we present a framework for the quantification of intracellular metabolites, including error propagation during metabolome sample processing. Focusing on one specific protocol, we comprehensively investigate all currently known and accessible factors that ultimately impact the accuracy of intracellular metabolite concentration data. All intermediate steps are modeled, and their uncertainty with respect to the final concentration data is rigorously quantified. Finally, on the basis of a comprehensive metabolome dataset of Corynebacterium glutamicum, an integrated error propagation analysis for all parts of the model is conducted, and the most critical steps for intracellular metabolite quantification are detected. PMID:24957773
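
    The accumulation of single-step errors described here is, to first order, Gaussian error propagation: for f(x_1, ..., x_n) with independent errors, sigma_f^2 = sum_i (df/dx_i)^2 * sigma_i^2. A generic numerical sketch (the processing-chain function and all numbers are invented, not the protocol of the study):

        import numpy as np

        def propagate(f, x, sigma, h=1e-6):
            """First-order error propagation with numerical derivatives."""
            x = np.asarray(x, dtype=float)
            grads = np.empty_like(x)
            for i in range(x.size):
                dx = np.zeros_like(x)
                dx[i] = h * max(1.0, abs(x[i]))
                grads[i] = (f(x + dx) - f(x - dx)) / (2 * dx[i])
            return np.sqrt(np.sum((grads * np.asarray(sigma)) ** 2))

        # Hypothetical work-up: concentration = area * dilution / (slope * biomass)
        conc = lambda p: p[0] * p[1] / (p[2] * p[3])
        x = [1.8e5, 10.0, 2.4e4, 0.012]   # measured values
        sigma = [5e3, 0.1, 8e2, 0.001]    # single-step standard errors
        print(f"{conc(np.asarray(x)):.0f} +/- {propagate(conc, x, sigma):.0f}")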

  19. Quantitative gold nanoparticle analysis methods: A review.

    PubMed

    Yu, Lei; Andriola, Angelo

    2010-08-15

    Research and development in the area of gold nanoparticle (AuNP) preparation, characterization, and applications have burgeoned in recent years. Many of the techniques and protocols are very mature, but two major concerns accompany the mass production and consumption of AuNP-based products. First, how many AuNPs exist in a dispersion? Second, where are the AuNPs after digestion by the environment, and how many are there? To answer these two questions, reliable and reproducible methods are needed to analyze the existence and the population of AuNPs in samples. This review summarizes the most recent chemical and particle quantitative analysis methods that have been used to characterize the concentration (in number of moles of gold per liter) or population (in number of particles per mL) of AuNPs. The methods summarized in this review include mass spectrometry, electroanalytical methods, spectroscopic methods, and particle counting methods. These methods may count the number of AuNPs directly or analyze the total concentration of elemental gold in an AuNP dispersion.

  20. Seismic Hazard analysis of Adjaria Region in Georgia

    NASA Astrophysics Data System (ADS)

    Jorjiashvili, Nato; Elashvili, Mikheil

    2014-05-01

    The most commonly used approach to determining seismic-design loads for engineering projects is probabilistic seismic-hazard analysis (PSHA). The primary output from a PSHA is a hazard curve showing the variation of a selected ground-motion parameter, such as peak ground acceleration (PGA) or spectral acceleration (SA), against the annual frequency of exceedance (or its reciprocal, the return period). The design value is the ground-motion level that corresponds to a preselected design return period. For many engineering projects, such as standard buildings and typical bridges, the seismic loading is taken from the appropriate seismic-design code, the basis of which is usually a PSHA. For more important engineering projects - where the consequences of failure are more serious, such as dams and chemical plants - it is more usual to obtain the seismic-design loads from a site-specific PSHA, in general using much longer return periods than those governing code-based design. The calculation of probabilistic seismic hazard was performed using the software CRISIS2007 by Ordaz, M., Aguilar, A., and Arboleda, J., Instituto de Ingeniería, UNAM, Mexico. CRISIS implements a classical probabilistic seismic hazard methodology where seismic sources can be modelled as points, lines, and areas. In the case of area sources, the software offers an integration procedure that takes advantage of a triangulation algorithm used for seismic source discretization. This solution improves calculation efficiency while maintaining a reliable description of source geometry and seismicity. Additionally, supplementary filters (e.g. fixing a site-source distance beyond which sources are excluded from the calculation) allow the program to balance precision and efficiency during hazard calculation. Earthquake temporal occurrence is assumed to follow a Poisson process, and the code facilitates two types of MFDs: a truncated exponential Gutenberg-Richter [1944] magnitude distribution and a characteristic magnitude
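
    The truncated exponential Gutenberg-Richter distribution mentioned above fixes, together with the a and b values, the annual rate of events exceeding any magnitude between the bounds. A small sketch with invented seismicity parameters (not values from the Adjaria study):

        import numpy as np

        a, b = 4.2, 1.0            # hypothetical a and b values
        m_min, m_max = 4.0, 7.5    # hypothetical magnitude bounds
        beta = b * np.log(10)

        def annual_rate_ge(m):
            """Annual rate of events with magnitude >= m (m_min <= m <= m_max)."""
            n_min = 10 ** (a - b * m_min)  # rate of events >= m_min
            num = np.exp(-beta * (m - m_min)) - np.exp(-beta * (m_max - m_min))
            den = 1.0 - np.exp(-beta * (m_max - m_min))
            return n_min * num / den

        for m in (5.0, 6.0, 7.0):
            print(f"M>={m}: {annual_rate_ge(m):.4f} events/yr")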

  1. Seismic hazards in Thailand: a compilation and updated probabilistic analysis

    NASA Astrophysics Data System (ADS)

    Pailoplee, Santi; Charusiri, Punya

    2016-06-01

    A probabilistic seismic hazard analysis (PSHA) for Thailand was performed and compared to those of previous works. This PSHA was based upon (1) the most up-to-date paleoseismological data (slip rates), (2) the seismic source zones, (3) the seismicity parameters (a and b values), and (4) the strong ground-motion attenuation models suggested as being suitable for Thailand. For the PSHA mapping, both the ground shaking and the probability of exceedance (POE) were analyzed and mapped using various methods of presentation. In addition, site-specific PSHAs were demonstrated for ten major provinces within Thailand. For instance, ground shaking of 0.1-0.4 g and 0.1-0.2 g was found for western Thailand at 2% and 10% POE in the next 50 years, respectively, defining this area as the most earthquake-prone region evaluated in Thailand. In a comparison between the ten selected provinces, the Kanchanaburi and Tak provinces had comparatively high seismic hazards, and therefore effective mitigation plans for these areas should be made. Although Bangkok was defined as being within a low seismic hazard in this PSHA, a further study of seismic wave amplification due to the soft soil beneath Bangkok is required.
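
    Under the Poisson assumption used in PSHA, the probability of exceedance in t years and the average return period T are linked by POE = 1 - exp(-t/T), which is how the 2% and 10%-in-50-years levels quoted above map onto return periods. A worked sketch:

        import math

        def return_period(poe, t_years):
            """Return period T implied by a POE over t_years (Poisson model)."""
            return -t_years / math.log(1.0 - poe)

        print(f"{return_period(0.10, 50):.0f} yr")  # ~475 yr for 10% in 50 yr
        print(f"{return_period(0.02, 50):.0f} yr")  # ~2475 yr for 2% in 50 yr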

  2. Quantitative analysis of saccadic search strategy

    NASA Astrophysics Data System (ADS)

    Over, E. A. B.

    2007-06-01

    This thesis deals with the quantitative analysis of saccadic search strategy. The goal of the research presented was twofold: 1) to quantify overall characteristics of fixation location and saccade direction, and 2) to identify search strategies with the use of a quantitative description of eye movement parameters. Chapter 2 provides a method to quantify a general property of fixation locations. We proposed a quantitative measure based on Voronoi diagrams for the characterization of the uniformity of fixation density. This measure may be thought of as indicating the clustering of fixations. We showed that during a visual search task, a structured (natural) background leads to higher clustering of fixations compared to a homogeneous background. In addition, in natural stimuli, a search task leads to higher clustering of fixations than the instruction to freely view the stimuli. Chapter 3 provides a method to identify the overall field of saccade directions in the viewing area. We extended the Voronoi method of chapter 2 so that it became possible to create vector maps. These maps indicate the preferred saccade direction for each position in the viewing area. Several measures of these vector maps were used to quantify the influence of observer-dependent and stimulus-dependent factors on saccade direction in a search task with natural scenes. The results showed that the influence of stimulus-dependent factors was larger than the influence of observer-dependent factors. In chapter 4 we showed that the border of the search area played a role in the search strategy. In a search experiment with differently shaped areas we found that search performance was poorer near the luminance edges of the search area. Fixation density, however, was higher in the edge region, and saccade direction was mainly along the edges of the search areas. In a target visibility experiment we established that the visibility of targets near a luminance edge is less than the visibility of
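
    The Voronoi-based uniformity measure of chapter 2 can be illustrated with a small sketch: uniformly spread fixations produce Voronoi cells of similar area, while clustered fixations produce highly variable cell areas. The statistic below (the coefficient of variation of bounded cell areas) is our illustrative stand-in, not necessarily the thesis's exact definition.

      import numpy as np
      from scipy.spatial import ConvexHull, Voronoi

      def voronoi_clustering(points: np.ndarray) -> float:
          """Coefficient of variation of bounded Voronoi-cell areas."""
          vor = Voronoi(points)
          areas = []
          for region_idx in vor.point_region:
              region = vor.regions[region_idx]
              if -1 in region or len(region) < 3:
                  continue  # skip unbounded cells on the border
              areas.append(ConvexHull(vor.vertices[region]).volume)  # 2-D: area
          areas = np.asarray(areas)
          return areas.std() / areas.mean()

      rng = np.random.default_rng(0)
      uniform = rng.uniform(0.0, 1.0, size=(200, 2))
      clustered = rng.normal(0.5, 0.08, size=(200, 2))
      print(voronoi_clustering(uniform), voronoi_clustering(clustered))  # low vs. high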

  3. Quantitative hazard assessment of phreatomagmatic eruptions at Vulcano (Aeolian Islands, Southern Italy) as obtained by combining stratigraphy, event statistics and physical modelling

    NASA Astrophysics Data System (ADS)

    Dellino, Pierfrancesco; De Astis, Gianfilippo; La Volpe, Luigi; Mele, Daniela; Sulpizio, Roberto

    2011-04-01

    The detailed analysis of stratigraphy allowed the reconstruction of the complex volcanic history of La Fossa di Vulcano. An eruptive activity mainly driven by superficial phreatomagmatic explosions emerged. A statistical analysis of the pyroclastic successions led to the identification of dilute pyroclastic density currents (base surges) as the most recurrent events, followed by fallout of dense ballistic blocks. The scale of events is related to the amount of magma involved in each explosion. Events involving about one million cubic meters of magma occurred during recent eruptions. They led to the formation of hundreds of meters thick dilute pyroclastic density currents, moving down the volcano slope at velocities exceeding 50 m/s. The dispersion of the density currents affected the whole Vulcano Porto area as well as the Vulcanello area. They also overrode the Fossa Caldera's rim, spreading over the Piano area. For the aim of hazard assessment, deposits from La Fossa Cone and La Fossa Caldera were studied in detail, to depict the short-term and long-term eruptive scenarios. By means of physical models that make use of deposit particle features, the impact parameters were calculated: the dynamic pressure and particle volumetric concentration of the density currents, and the impact energy of ballistic blocks. A quantitative hazard map, based on these impact parameters, is presented. It could be useful for territory planning and for the calculation of the expected damage.
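
    The dynamic pressure named among the impact parameters follows from the standard relation for a dilute particle-gas mixture, with the bulk density taken as the concentration-weighted average of the particle and gas densities. The numbers below are assumptions for illustration, not values from the Vulcano study.

      rho_particle = 2500.0   # particle density, kg/m^3 (assumed)
      rho_gas = 0.6           # hot-gas density, kg/m^3 (assumed)
      C = 0.001               # particle volumetric concentration (assumed)
      v = 50.0                # current velocity, m/s (order quoted above)

      rho_mix = C * rho_particle + (1.0 - C) * rho_gas   # bulk mixture density
      p_dyn = 0.5 * rho_mix * v ** 2                     # dynamic pressure, Pa

      print(f"bulk density = {rho_mix:.2f} kg/m^3, dynamic pressure = {p_dyn / 1000.0:.1f} kPa")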

  4. Hazard analysis of a computer based medical diagnostic system.

    PubMed

    Chudleigh, M F

    1994-07-01

    Medical screening of sectors of the population is now a routine and vital part of health care: an example is cervical smear testing. There is currently significant interest in the possible introduction of semi-automated microscopy systems for cervical cytology and one such experimental system is now undergoing laboratory trials. A collaborative project has been set up to demonstrate the benefits and constraints that arise from applying safety-critical methods developed in other domains to such a diagnostic system. We have carried out a system hazard analysis, successfully using the HAZOP technique adapted from the petrochemical industry. PMID:7988111

  5. Probabilistic Seismic Hazard Analysis: Adaptation for CO2 Sequestration Sites

    NASA Astrophysics Data System (ADS)

    Vasudevan, K.; Eaton, D. W.

    2011-12-01

    Large-scale sequestration of CO2 in depleted oil and gas fields in sedimentary basins such as the Western Canada Sedimentary Basin (WCSB), and in particular central Alberta, should consider, among other safety and risk issues, a seismic hazard analysis that includes potential ground motions induced by earthquakes. The region is adjacent to major tectonically active seismogenic zones such as the Cascadia Subduction Zone, the Queen Charlotte Fault Zone, and the northern Cordillera region. Hazards to large-scale storage from strong ground motions caused by large-magnitude earthquakes along the west coast of Canada, and/or by medium-to-large-magnitude earthquakes triggered by such events in the neighbourhood of the storage site, must be clearly understood. To this end, stochastic modeling of the accelerograms recorded during large-magnitude earthquakes in western Canada has been undertaken. A lack of recorded accelerograms and the absence of a catalogue of ground-motion prediction equations similar to the Next Generation Attenuation (NGA) database, however, hamper such analysis for the WCSB. In order to generate our own database of ground motions for probabilistic seismic hazard analysis, we employ a site-based stochastic simulation approach. We use it to simulate three-component ground-motion accelerograms recorded during the November 3, 2002 Denali earthquake to mimic Queen Charlotte Fault earthquakes. To represent a Cascadia megathrust earthquake, we consider three-component strong-motion accelerograms recorded during the March 11, 2011 Tohoku earthquake in Japan. Finally, to simulate an event comparable to the thrust-style Kinbasket Lake earthquake of 1908, we use three-component ground-motion accelerograms recorded during the 1985 Nahanni earthquake and the 2004 Chuetsu earthquake. Here, we develop predictive equations for the stochastic model parameters that describe ground motions in terms of earthquake and site characteristics such as

  6. Fire hazards analysis for W030 tank farm ventilation upgrade

    SciTech Connect

    Huckfeldt, R.A.

    1996-07-17

    This Fire Hazard Analysis (FHA) was prepared according to the requirements of U.S. Department of Energy (DOE) Order 5480.7A, ''Fire Protection'' (2-17-93). The purpose of this FHA is to ascertain whether the objectives of DOE 5480.7A are being met. This purpose is accomplished through a conservative, comprehensive assessment of the risk from fire and other perils within individual fire areas of a DOE facility in relation to proposed fire protection. This FHA is based on conditions set forth within this document and is valid only under these conditions.

  7. A LiDAR based analysis of hydraulic hazard mapping

    NASA Astrophysics Data System (ADS)

    Cazorzi, F.; De Luca, A.; Checchinato, A.; Segna, F.; Dalla Fontana, G.

    2012-04-01

    Mapping hydraulic hazard is a delicate procedure, as it involves both technical and socio-economic aspects. On the one hand, no dangerous areas should be excluded; on the other hand, it is important not to extend the surface subject to use limitations beyond what is necessary. The availability of a high-resolution topographic survey now makes it possible to approach this task with innovative procedures, both in the planning (mapping) phase and in the map validation phase. The latter is the object of the present work. It should be stressed that the described procedure is proposed purely as a preliminary analysis based on topography only, and therefore is not intended in any way to replace more sophisticated methods based on hydraulic modelling. The reference elevation model is a combination of the digital terrain model and the digital building model (DTM+DBM). The option of using the standard surface model (DSM) is not viable, as the DSM represents the vegetation canopy as a solid volume. This would have the consequence of unrealistically treating the vegetation as a geometric obstacle to water flow. In some cases the construction of the topographic model requires the identification and digitization of the principal breaklines, such as river banks, ditches and similar natural or artificial structures. The geometrical and topological procedure for the validation of the hydraulic hazard maps consists of two steps. In the first step the whole area is subdivided into fluvial segments, with length chosen as a reasonable trade-off between the need to keep the hydrographical unit as complete as possible and the need to separate sections of the river bed with significantly different morphology. Each of these segments is made of a single elongated polygon, whose shape can be quite complex, especially for meandering river sections, where the flow direction (i.e. the potential energy gradient associated with the thalweg) is often inverted. In the second step the segments are analysed
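
    A minimal sketch of building the DTM+DBM reference model described above: where a building footprint provides an elevation, take it; elsewhere keep the bare-earth terrain. The array layout and NaN convention are assumptions for illustration.

      import numpy as np

      dtm = np.array([[100.0, 100.5], [101.0, 101.2]])      # bare-earth elevations, m
      dbm = np.array([[np.nan, 104.0], [np.nan, np.nan]])   # building elevations, m

      reference = np.where(np.isnan(dbm), dtm, dbm)         # DTM+DBM combination
      print(reference)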

  8. Estimating and Applying Uncertainties in Probabilistic Tsunami Hazard Analysis (Invited)

    NASA Astrophysics Data System (ADS)

    Thio, H. K.

    2013-12-01

    An integral part of a probabilistic analysis is the formal inclusion of uncertainties, both due to a limited understanding of the physical processes (epistemic) and due to their natural variability (aleatory). Because of the strong non-linearity of the tsunami inundation process, it is important not only to understand the extent of the uncertainties, but also how and where to apply them. The uncertainties can be divided among several stages: the source, ocean propagation, and nearshore propagation/inundation. On the source side, many of the uncertainties are identical to those used in probabilistic seismic hazard analysis (PSHA). However, the details of slip distributions are very significant in tsunami excitation, especially for near-field tsunamis. We will show several ways of including slip variability, both stochastic and non-stochastic, by developing a probabilistic set of source scenarios. The uncertainties in ocean propagation are less significant, since modern algorithms are very successful in modeling open-ocean tsunami propagation. However, in the near-shore regime and in the inundation, the situation is much more complex. Here, inaccuracies in local elevation models, variability in bottom friction, and the omission of the built environment can lead to significant errors. Details of the implementation of the tsunami algorithms can also yield different results. We will discuss the most significant sources of uncertainty and alternative ways to implement them, using examples from the probabilistic tsunami hazard mapping that we are currently carrying out for the state of California and other regions.
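
    A common way to carry the epistemic part of these uncertainties through the calculation is a weighted logic tree: each branch is a complete hazard curve computed under one modeling choice, and branch weights express degree of belief. This is a generic sketch, not the authors' implementation, and all numbers are made up.

      import numpy as np

      heights = np.array([0.5, 1.0, 2.0, 4.0])   # inundation-height thresholds, m

      # Annual exceedance rates from three alternative source models (branches)
      branch_rates = np.array([
          [1e-2, 3e-3, 5e-4, 4e-5],
          [2e-2, 5e-3, 1e-3, 1e-4],
          [8e-3, 2e-3, 3e-4, 2e-5],
      ])
      weights = np.array([0.5, 0.3, 0.2])        # epistemic weights, sum to 1

      mean_hazard = weights @ branch_rates       # weighted-mean hazard curve
      print(dict(zip(heights.tolist(), mean_hazard)))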

  9. Fire hazard analysis for Plutonium Finishing Plant complex

    SciTech Connect

    MCKINNIS, D.L.

    1999-02-23

    A fire hazards analysis (FHA) was performed for the Plutonium Finishing Plant (PFP) Complex at the Department of Energy (DOE) Hanford site. The scope of the FHA focuses on the nuclear facilities/structures in the Complex. The analysis was conducted in accordance with RLID 5480.7 [DOE Directive RLID 5480.7, 1/17/94] and DOE Order 5480.7A, ''Fire Protection'' [DOE Order 5480.7A, 2/17/93], and addresses each of the sixteen principal elements outlined in paragraph 9.a(3) of the Order. The elements are addressed in terms of the fire protection objectives stated in paragraph 4 of DOE 5480.7A. In addition, the FHA also complies with WHC-CM-4-41, Fire Protection Program Manual, Section 3.4 [1994] and WHC-SD-GN-FHA-30001, Rev. 0 [WHC, 1994]. Objectives of the FHA are to determine: (1) the fire hazards that expose the PFP facilities, or that are inherent in the building operations, (2) the adequacy of the fire safety features currently located in the PFP Complex, and (3) the degree of compliance of the facility with specific fire safety provisions in DOE orders, related engineering codes, and standards.

  10. Application of disease burden to quantitative assessment of health hazards for a decentralized water reuse system.

    PubMed

    Gao, Tingting; Chen, Rong; Wang, Xiaochang; Ngo, Huu Hao; Li, Yu-You; Zhou, Jinhong; Zhang, Lu

    2016-05-01

    The aim of this article is to introduce the methodology of disease burden (DB) to quantify the health impact of microbial regrowth during wastewater reuse, using the case study of a decentralized water reuse system at Xi'an Si-yuan University, located in Xi'an, China. Based on field investigation findings, Escherichia coli (E. coli), Salmonella and rotavirus were selected as typical regrowth pathogens causing potential health hazards during the reuse of reclaimed water. Subsequently, the major exposure routes, including sprinkler irrigation, landscape fountains and toilet flushing, were identified. Mathematical models were established to relate exposure dose to disease burden by calculating the disability-adjusted life year (DALY). Results of the disease burden for this case study show that DALYs attributed to E. coli were significantly greater than those caused by the other pathogens, and DALYs associated with sprinkler irrigation were higher than those originating from the other routes. A correlation between exposure dose and disease burden was obtained by introducing a modified calculation of morbidity, which extends the assessment endpoint of health risk from the conventional infection rate to disease burden. PMID:26874764
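
    The usual quantitative chain from exposure dose to disease burden can be sketched as follows: a dose-response model gives the probability of infection per exposure event, which is scaled to illness and expressed in DALYs. The exponential dose-response form is standard in quantitative microbial risk assessment, but every parameter value below is a placeholder rather than a result of this study.

      import math

      def annual_disease_burden(dose_per_event: float, events_per_year: int,
                                r: float, p_ill_given_inf: float,
                                daly_per_case: float, population: int) -> float:
          """Annual DALYs for an exposed population, exponential dose-response."""
          p_inf_event = 1.0 - math.exp(-r * dose_per_event)
          p_inf_year = 1.0 - (1.0 - p_inf_event) ** events_per_year
          cases = p_inf_year * p_ill_given_inf * population
          return cases * daly_per_case

      # e.g. the sprinkler-irrigation route, with hypothetical numbers
      print(annual_disease_burden(dose_per_event=10.0, events_per_year=50,
                                  r=1e-4, p_ill_given_inf=0.3,
                                  daly_per_case=0.005, population=5000))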

  11. Resources Available for Hazards Analysis of Aerospace Fluids

    NASA Technical Reports Server (NTRS)

    Woods, S. S.; Stewart, W. F.; Baker, D. L.

    2001-01-01

    In recent years, the legislative and executive branches of the federal government have pushed to make government more efficient and responsive to the needs of the marketplace. One of these initiatives, Public Law 104-113, also known as the National Technology Transfer and Advancement Act of 1995 (NTTAA), is designed to accelerate technology transfer to industry and promote government-industry partnership. Summarized, NTTAA states that '... all Federal agencies and departments shall use technical standards that are developed or adopted by voluntary consensus standards bodies, using such technical standards as a means to carry out policy objectives or activities determined by the agencies and departments.' Government agencies must now determine if their in-house requirement-setting activities are sufficiently unique that no public interest is served by having them adopted by a voluntary consensus organization (VCO), or if not, to use or develop voluntary consensus standards. The Office of Management and Budget (OMB) is chartered by the law to monitor federal agency progress and report the results to Congress. In response to NTTAA, agency-wide oxygen and hydrogen safety standards sponsored by the NASA Headquarters (HQ) Office of Safety and Mission Assurance (OSMA) were obvious choices for early adoption by VCOs. In 1996, HQ sought assistance from the Johnson Space Center (JSC) White Sands Test Facility (WSTF), the technical lead for development of these safety standards, to evaluate their adoption by VCOs. At that time, WSTF-developed propellant hazards manuals were likewise identified for possible VCO adoption. Subsequently, WSTF was asked to represent NASA in the development of an international ISO safety standard for hydrogen use. Concurrent with these WSTF standards activities are related efforts to develop and publish propellant hazards analysis protocols and safety courses for the industrial and propellant use of oxygen, hydrogen, and hypergols. This paper reports on

  12. Quantitative Analysis of Radar Returns from Insects

    NASA Technical Reports Server (NTRS)

    Riley, J. R.

    1979-01-01

    When the number of flying insects is low enough to permit their resolution as individual radar targets, quantitative estimates of their aerial density can be developed. Accurate measurements of heading distribution, using a rotating-polarization radar to enhance the wingbeat-frequency method of identification, are also presented.

  13. Some Epistemological Considerations Concerning Quantitative Analysis

    ERIC Educational Resources Information Center

    Dobrescu, Emilian

    2008-01-01

    This article presents the author's address at the 2007 "Journal of Applied Quantitative Methods" ("JAQM") prize awarding festivity. The festivity was included in the opening of the 4th International Conference on Applied Statistics, November 22, 2008, Bucharest, Romania. In the address, the author reflects on three theses that question the…

  14. Quantitative analysis of planetary reflectance spectra with principal components analysis

    NASA Technical Reports Server (NTRS)

    Johnson, P. E.; Smith, M. O.; Adams, J. B.

    1985-01-01

    A technique is presented for quantitative analysis of planetary reflectance spectra as mixtures of particles on microscopic and macroscopic scales using principal components analysis. This technique allows for determination of the endmembers being mixed, their abundance, and the scale of mixing, as well as other physical parameters. Eighteen lunar telescopic reflectance spectra of the Copernicus crater region, from 600 nm to 1800 nm in wavelength, are modeled in terms of five likely endmembers: mare basalt, mature mare soil, anorthosite, mature highland soil, and clinopyroxene. These endmembers were chosen from a similar analysis of 92 lunar soil and rock samples. The models fit the data to within 2 percent rms. It is found that the goodness of fit is marginally better for intimate mixing over macroscopic mixing.
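
    In the macroscopic (linear) limit of the mixing model, each observed spectrum is a weighted sum of endmember spectra, so abundance estimation reduces to constrained least squares. The sketch below applies nonnegative least squares to fabricated spectra; the paper's PCA-based endmember selection and the intimate-mixing case are omitted.

      import numpy as np
      from scipy.optimize import nnls

      wavelengths = np.linspace(600, 1800, 120)              # nm
      basalt = 0.05 + 1e-4 * (wavelengths - 600)             # toy endmember spectrum
      soil = 0.20 - 5e-5 * (wavelengths - 600)               # toy endmember spectrum
      E = np.column_stack([basalt, soil])                    # endmember matrix

      true_abundance = np.array([0.3, 0.7])
      observed = E @ true_abundance + np.random.default_rng(1).normal(0, 1e-3, 120)

      abundance, _ = nnls(E, observed)                       # nonnegativity constraint
      abundance /= abundance.sum()                           # impose sum-to-one
      print(abundance)                                       # ~[0.3, 0.7]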

  15. Quantitative analysis of planetary reflectance spectra with principal components analysis

    NASA Astrophysics Data System (ADS)

    Johnson, P. E.; Smith, M. O.; Adams, J. B.

    1985-02-01

    A technique is presented for quantitative analysis of planetary reflectance spectra as mixtures of particles on microscopic and macroscopic scales using principal components analysis. This technique allows for determination of the endmembers being mixed, their abundance, and the scale of mixing, as well as other physical parameters. Eighteen lunar telescopic reflectance spectra of the Copernicus crater region, from 600 nm to 1800 nm in wavelength, are modeled in terms of five likely endmembers: mare basalt, mature mare soil, anorthosite, mature highland soil, and clinopyroxene. These endmembers were chosen from a similar analysis of 92 lunar soil and rock samples. The models fit the data to within 2 percent rms. It is found that the goodness of fit is marginally better for intimate mixing over macroscopic mixing.

  16. Probabilistic seismic hazard analysis for the city of Quetta, Pakistan

    NASA Astrophysics Data System (ADS)

    Rehman, Shafiq; Lindholm, Conrad; Ahmed, Najeeb; Rafi, Zahid

    2014-08-01

    Seismic hazard assessment for Quetta is carried out using the probabilistic seismic hazard analysis technique, based on area sources and augmented by line sources, used here for the first time in Pakistan. Seismic data were collected and analyzed in the spatial and temporal domains. Five seismic zones were modeled in line with the tectonics of the region, with a b-value of 1.14 obtained by regression. The b-value is slightly high, which is attributed to the fact that aftershocks were not removed, since removing them distorted the dataset. Five fault sources are modeled, three as reverse and two as strike-slip, with a maximum magnitude of 7.8. The Mach structure is included in the tectonic model for the first time. The attenuation relation used in the present study is one recommended by various researchers. The expected peak ground acceleration for a 500-year return period is 4.79 m/s² for rock outcrop, characterized as very high. Furthermore, variation in spectral acceleration within Quetta city is observed, for which spectral curves are developed for four different places.
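
    The b-value quoted above was obtained by regression on the Gutenberg-Richter relation log10 N = a - b*M. A common cross-check is Aki's (1965) maximum-likelihood estimator, sketched here on a synthetic catalog; the completeness magnitude and catalog are assumptions, and binned catalogs would additionally need Utsu's half-bin correction.

      import numpy as np

      rng = np.random.default_rng(42)
      Mc = 4.0                        # magnitude of completeness (assumed)
      b_true = 1.14
      beta = b_true * np.log(10.0)
      mags = Mc + rng.exponential(1.0 / beta, size=2000)   # G-R distributed catalog

      # Aki (1965) maximum-likelihood estimate for continuous magnitudes
      b_hat = np.log10(np.e) / (mags.mean() - Mc)
      print(f"b ~ {b_hat:.2f}")       # close to 1.14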

  17. Analyzing Distributed Functions in an Integrated Hazard Analysis

    NASA Technical Reports Server (NTRS)

    Morris, A. Terry; Massie, Michael J.

    2010-01-01

    Large scale integration of today's aerospace systems is achievable through the use of distributed systems. Validating the safety of distributed systems is significantly more difficult than for centralized systems because of the complexity of the interactions between simultaneously active components. Integrated hazard analysis (IHA), a process used to identify unacceptable risks and to provide a means of controlling them, can be applied to either centralized or distributed systems. IHA, though, must be tailored to fit the particular system being analyzed. Distributed systems, for instance, must be analyzed for hazards in terms of the functions that rely on them. This paper describes systems-oriented IHA techniques (as opposed to traditional failure-event or reliability techniques) that should be employed for distributed systems in aerospace environments. Special considerations are addressed when dealing with specific distributed systems such as active thermal control, electrical power, command and data handling, and software systems (including the interaction with fault management systems). Because of the significance of second-order effects in large scale distributed systems, the paper also describes how to analyze secondary functions to secondary functions through the use of channelization.

  18. Using quantitative acid-base analysis in the ICU.

    PubMed

    Lloyd, P; Freebairn, R

    2006-03-01

    The quantitative acid-base 'Strong Ion' calculator is a practical application of quantitative acid-base chemistry, as developed by Peter Stewart and Peter Constable. It quantifies the three independent factors that control acidity, calculates the concentration and charge of unmeasured ions, produces a report based on these calculations and displays a Gamblegram depicting measured ionic species. Used together with the medical history, quantitative acid-base analysis has advantages over traditional approaches.
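
    As a rough illustration of what such a calculator computes, the apparent strong ion difference follows directly from the measured strong ions. The expression is the standard one in the Stewart approach; the example concentrations and the reference value are illustrative and not taken from the article.

      def sid_apparent(na, k, ca, mg, cl, lactate):
          """Apparent strong ion difference; all concentrations in mEq/L."""
          return (na + k + ca + mg) - (cl + lactate)

      # A typical plasma sample (hypothetical values)
      sida = sid_apparent(na=140, k=4.0, ca=2.5, mg=1.0, cl=105, lactate=1.0)
      print(f"SIDa = {sida:.1f} mEq/L")   # roughly 40 mEq/L in health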

  19. Systematic analysis of natural hazards along infrastructure networks using a GIS-tool for risk assessment

    NASA Astrophysics Data System (ADS)

    Baruffini, Mirko

    2010-05-01

    system which integrates the procedures for a complete risk analysis in a Geographic Information System (GIS) toolbox, to be applied to our testbed, the Alps-crossing corridor of St. Gotthard. The simulation environment is developed within ArcObjects, the development platform for ArcGIS. The topic of ArcObjects usually emerges when users realize that programming ArcObjects can reduce the amount of repetitive work, streamline the workflow, and even produce functionality that is not easily available in ArcGIS. We adopted Visual Basic for Applications (VBA) for programming ArcObjects; because VBA is already embedded within ArcMap and ArcCatalog, it is convenient for ArcGIS users. Our tool visualises the data obtained from an analysis of historical records (aerial photo imagery, field surveys, documentation of past events) or from environmental modelling (estimates of the area affected by a given event), together with event attributes such as route number and route position, and thematic maps. As a result of this step the record appears in the WebGIS. The user can select a specific area for an overview of previous hazards in the region. After performing the analysis, a double click on the visualised infrastructure opens the corresponding results. The constantly updated risk maps show all sites that require more protection against natural hazards. The final goal of our work is to offer a versatile tool for risk analysis which can be applied to different situations. Today our GIS application mainly centralises the documentation of natural hazards. Additionally, the system offers information about natural hazards along the Gotthard line. It is very flexible and can be used as a simple program to model the expansion of natural hazards, as a program to quantitatively estimate risks, or as a detailed analysis at the municipality level. The tool is extensible and can be expanded with additional modules. The initial results of the experimental case study show how useful a

  20. Regional Analysis of the Hazard Level of Glacial Lakes in the Cordillera Blanca, Peru

    NASA Astrophysics Data System (ADS)

    Chisolm, Rachel E.; Jhon Sanchez Leon, Walter; McKinney, Daene C.; Cochachin Rapre, Alejo

    2016-04-01

    The Cordillera Blanca mountain range is the highest in Peru and contains many of the world's tropical glaciers. This region is severely impacted by climate change, which is causing accelerated glacier retreat. Secondary impacts of climate change on glacier retreat include stress on water resources and the risk of glacial lake outburst floods (GLOFs) from the many lakes that are forming and growing at the base of glaciers. A number of GLOFs originating from lakes in the Cordillera Blanca have occurred over the last century, several of which have had catastrophic impacts on cities and communities downstream. Glaciologists and engineers in Peru have been studying the lakes of the Cordillera Blanca for many years and have identified several lakes that are considered dangerous. However, a systematic analysis of all the lakes in the Cordillera Blanca has never before been attempted. Some methodologies for this type of systematic analysis have been proposed (e.g. Emmer and Vilimek 2014; Wang et al. 2011), but as yet they have only been applied to a few select lakes in the Cordillera Blanca. This study uses remotely sensed data to study all of the lakes of the Glacial Lake Inventory published by the Glaciology and Water Resources Unit of Peru's National Water Authority (UGRH 2011). The objective of this study is to assign a level of potential hazard to each glacial lake in the Cordillera Blanca and to ascertain whether any of the lakes beyond those that have already been studied might pose a danger to nearby populations. A number of parameters of analysis, both quantitative and qualitative, have been selected to assess the hazard level of each glacial lake in the Cordillera Blanca using digital elevation models, satellite imagery, and glacier outlines. These parameters are then combined to come up with a preliminary assessment of the hazard level of each lake; the equation weighting each parameter draws on previously published methodologies but is tailored to the regional characteristics

  1. A DOE-STD-3009 hazard and accident analysis methodology for non-reactor nuclear facilities

    SciTech Connect

    MAHN,JEFFREY A.; WALKER,SHARON ANN

    2000-03-23

    This paper demonstrates the use of appropriate consequence evaluation criteria in conjunction with generic likelihood-of-occurrence data to produce consistent hazard analysis results for nonreactor nuclear facility Safety Analysis Reports (SARs). An additional objective is to demonstrate the use of generic likelihood-of-occurrence data as a means for deriving defendable accident sequence frequencies, thereby enabling the screening of potentially incredible events (<10⁻⁶ per year) from the design basis accident envelope. Generic likelihood-of-occurrence data has been used successfully in performing SAR hazard and accident analyses for two nonreactor nuclear facilities at Sandia National Laboratories. DOE-STD-3009-94 addresses and even encourages use of a qualitative binning technique for deriving and ranking nonreactor nuclear facility risks. However, qualitative techniques invariably lead to reviewer requests for more details associated with consequence or likelihood-of-occurrence bin assignments in the text of the SAR. Hazard analysis data displayed in simple worksheet format generally elicit questions about not only the assumptions behind the data, but also the quantitative bases for the assumptions themselves (engineering judgment may not be considered sufficient by some reviewers). This is especially true where the criteria for qualitative binning of likelihood of occurrence involve numerical ranges. Oftentimes reviewers want to see calculations, or at least a discussion of event frequencies or failure probabilities, to support likelihood-of-occurrence bin assignments. This may become a significant point of contention for events that have been binned as incredible. This paper will show how the use of readily available generic data can avoid many of the reviewer questions that will inevitably arise from strictly qualitative analyses, while not significantly increasing the overall burden on the analyst.
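
    As a hedged illustration of the screening described here, an accident-sequence frequency is the product of a generic initiator frequency and the failure-on-demand probabilities of the barriers in the sequence, compared against the 10⁻⁶ per year credibility threshold. All rates below are placeholders, not values from any DOE database.

      INCREDIBLE = 1e-6                # screening threshold, per year

      initiator_per_year = 1e-2        # e.g. generic fire-initiation frequency
      barrier_failures = [1e-2, 5e-2]  # generic failure-on-demand probabilities

      sequence_frequency = initiator_per_year
      for p in barrier_failures:
          sequence_frequency *= p

      print(f"{sequence_frequency:.1e}/yr",
            "screened out" if sequence_frequency < INCREDIBLE else "retained")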

  2. UPDATE TO THE PROBABILISTIC VOLCANIC HAZARD ANALYSIS, YUCCA MOUNTAIN, NEVADA

    SciTech Connect

    K.J. Coppersmith

    2005-09-14

    A probabilistic volcanic hazard analysis (PVHA) was conducted in 1996 for the proposed repository at Yucca Mountain, Nevada. Based on data gathered by the Yucca Mountain Project over the course of about 15 years, the analysis integrated the judgments of a panel of ten volcanic experts using methods of formal expert elicitation. PVHA resulted in a probability distribution of the annual frequency of a dike intersecting the repository, which ranges from 10⁻⁷ to 10⁻¹⁰ (mean 1.6 × 10⁻⁸). The analysis incorporates assessments of the future locations, rates, and types of volcanic dikes that could intersect the repository, which lies about 300 m below the surface. A particular focus of the analysis is the quantification of uncertainties. Since the 1996 PVHA, additional aeromagnetic data have been collected in the Yucca Mountain region, including a high-resolution low-altitude survey. A number of anomalies have been identified within alluvial areas and modeling suggests that some of these may represent buried eruptive centers (basaltic cinder cones). A program is currently underway to drill several of the anomalies to gain information on their origin and, if basalt, their age and composition. To update the PVHA in light of the new aeromagnetic and drilling data as well as other advancements in volcanic hazard modeling over the past decade, the expert panel has been reconvened and the expert elicitation process has been fully restarted. The analysis requires assessments of the spatial distribution of igneous events, temporal distributions, and geometries and characteristics of future events (both intrusive and extrusive). The assessments are for future time periods of 10,000 years and 1,000,000 years. Uncertainties are being quantified in both the conceptual models that define these elements as well as in the parameters for the models. The expert elicitation process is centered around a series of workshops that focus on the available data; alternative approaches to

  3. Energy Dispersive Spectrometry and Quantitative Analysis Short Course. Introduction to X-ray Energy Dispersive Spectrometry and Quantitative Analysis

    NASA Technical Reports Server (NTRS)

    Carpenter, Paul; Curreri, Peter A. (Technical Monitor)

    2002-01-01

    This course will cover practical applications of the energy-dispersive spectrometer (EDS) to x-ray microanalysis. Topics covered will include detector technology, advances in pulse processing, resolution and performance monitoring, detector modeling, peak deconvolution and fitting, qualitative and quantitative analysis, compositional mapping, and standards. An emphasis will be placed on use of the EDS for quantitative analysis, with discussion of typical problems encountered in the analysis of a wide range of materials and sample geometries.

  4. Structural and quantitative analysis of Equisetum alkaloids.

    PubMed

    Cramer, Luise; Ernst, Ludger; Lubienski, Marcus; Papke, Uli; Schiebel, Hans-Martin; Jerz, Gerold; Beuerle, Till

    2015-08-01

    Equisetum palustre L. is known for its toxicity to livestock. Several studies in the past addressed the isolation and identification of the responsible alkaloids. So far, palustrine (1) and N(5)-formylpalustrine (2) are the known alkaloids of E. palustre. An HPLC-ESI-MS/MS method in combination with a simple sample work-up was developed to identify and quantitate Equisetum alkaloids. Besides the two known alkaloids, six related alkaloids were detected in different Equisetum samples. The structure of the alkaloid palustridiene (3) was derived by comprehensive 1D and 2D NMR experiments. N(5)-Acetylpalustrine (4) was also thoroughly characterized by NMR for the first time. The structure of N(5)-formylpalustridiene (5) is proposed based on mass spectrometry results. Twenty-two E. palustre samples were screened with this method, and in most cases the set of all eight alkaloids was detected in all parts of the plant. A high variability of the alkaloid content and distribution was found depending on plant organ, plant origin and season, ranging from 88 to 597 mg/kg dry weight. However, palustrine (1) and palustridiene (3) always represented the main alkaloids. For the first time, a comprehensive identification, quantitation and distribution analysis of Equisetum alkaloids was achieved.

  5. Analysis of hazardous material releases due to natural hazards in the United States.

    PubMed

    Sengul, Hatice; Santella, Nicholas; Steinberg, Laura J; Cruz, Ana Maria

    2012-10-01

    Natural hazards were the cause of approximately 16,600 hazardous material (hazmat) releases reported to the National Response Center (NRC) between 1990 and 2008, some three per cent of all reported hazmat releases. Rain-induced releases were most numerous (26 per cent of the total), followed by those associated with hurricanes (20 per cent), many of which resulted from major episodes in 2005 and 2008. Winds, storms or other weather-related phenomena were responsible for another 25 per cent of hazmat releases. Large releases were most frequently due to major natural disasters. For instance, hurricane-induced releases of petroleum from storage tanks account for a large fraction of the total volume of petroleum released during 'natechs' (understood here as a natural hazard and the hazardous materials release that results). Among the most commonly released chemicals were nitrogen oxides, benzene, and polychlorinated biphenyls. Three deaths, 52 injuries, and the evacuation of at least 5,000 persons were recorded as a consequence of natech events. Overall, results suggest that the number of natechs increased over the study period (1990-2008) with potential for serious human and environmental impacts.

  6. Analysis of hazardous material releases due to natural hazards in the United States.

    PubMed

    Sengul, Hatice; Santella, Nicholas; Steinberg, Laura J; Cruz, Ana Maria

    2012-10-01

    Natural hazards were the cause of approximately 16,600 hazardous material (hazmat) releases reported to the National Response Center (NRC) between 1990 and 2008, some three per cent of all reported hazmat releases. Rain-induced releases were most numerous (26 per cent of the total), followed by those associated with hurricanes (20 per cent), many of which resulted from major episodes in 2005 and 2008. Winds, storms or other weather-related phenomena were responsible for another 25 per cent of hazmat releases. Large releases were most frequently due to major natural disasters. For instance, hurricane-induced releases of petroleum from storage tanks account for a large fraction of the total volume of petroleum released during 'natechs' (understood here as a natural hazard and the hazardous materials release that results). Among the most commonly released chemicals were nitrogen oxides, benzene, and polychlorinated biphenyls. Three deaths, 52 injuries, and the evacuation of at least 5,000 persons were recorded as a consequence of natech events. Overall, results suggest that the number of natechs increased over the study period (1990-2008) with potential for serious human and environmental impacts. PMID:22329456

  7. NFPA 70E: Performing the electrical flash hazard analysis.

    PubMed

    Wallace, W Jon

    2005-08-01

    Arc flash hazards pose a serious risk to employee safety; proper safe work practices must be utilized. Electrical equipment operating at 50 volts or more must be deenergized and locked out/tagged out prior to servicing and maintenance unless doing so would increase hazards or is infeasible. Remember, convenience is not an acceptable reason for keeping equipment energized during servicing and maintenance. If electrical equipment must remain energized during servicing and maintenance, NFPA 70E should be consulted to determine flash hazard boundaries as well as required PPE. Finally, circuit breakers and electrical disconnects must be marked to warn qualified employees of potential arc flash hazards. PMID:16212025

  8. Quantitative data analysis of ESAR data

    NASA Astrophysics Data System (ADS)

    Phruksahiran, N.; Chandra, M.

    2013-07-01

    Synthetic aperture radar (SAR) data processing uses the backscattered electromagnetic wave to map the radar reflectivity of the ground surface. The polarization property in radar remote sensing has been used successfully in many applications, especially in target decomposition. This paper presents a case study of experiments performed on ESAR L-band fully polarimetric data sets from the German Aerospace Center (DLR) to demonstrate the potential of coherent target decomposition and the possibility of using weather-radar measurement parameters, such as the differential reflectivity and the linear depolarization ratio, to obtain quantitative information about the ground surface. The raw ESAR data were processed by a SAR simulator developed in MATLAB program code with the Range-Doppler algorithm.
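
    The two weather-radar parameters named above are simple functions of the per-pixel complex scattering matrix [[S_hh, S_hv], [S_vh, S_vv]]. The definitions below are the standard dB forms, shown on a synthetic pixel; they are not necessarily the paper's exact processing chain.

      import numpy as np

      S_hh, S_hv, S_vv = 1.0 + 0.2j, 0.05 + 0.01j, 0.8 - 0.1j   # synthetic pixel

      zdr = 10.0 * np.log10(np.abs(S_hh) ** 2 / np.abs(S_vv) ** 2)  # differential reflectivity, dB
      ldr = 10.0 * np.log10(np.abs(S_hv) ** 2 / np.abs(S_hh) ** 2)  # linear depolarization ratio, dB

      print(f"ZDR = {zdr:.2f} dB, LDR = {ldr:.2f} dB")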

  9. Qualitative and quantitative analysis of endocytic recycling.

    PubMed

    Reineke, James B; Xie, Shuwei; Naslavsky, Naava; Caplan, Steve

    2015-01-01

    Endocytosis, which encompasses the internalization and sorting of plasma membrane (PM) lipids and proteins to distinct membrane-bound intracellular compartments, is a highly regulated and fundamental cellular process by which eukaryotic cells dynamically regulate their PM composition. Indeed, endocytosis is implicated in crucial cellular processes that include proliferation, migration, and cell division as well as maintenance of tissue homeostasis such as apical-basal polarity. Once PM constituents have been taken up into the cell, either via clathrin-dependent endocytosis (CDE) or clathrin-independent endocytosis (CIE), they typically have two fates: degradation through the late-endosomal/lysosomal pathway or returning to the PM via endocytic recycling pathways. In this review, we will detail experimental procedures that allow for both qualitative and quantitative assessment of endocytic recycling of transmembrane proteins internalized by CDE and CIE, using the HeLa cervical cancer cell line as a model system. PMID:26360033

  10. Quantitative Risk Analysis of Obstacle Limitation Standards

    NASA Astrophysics Data System (ADS)

    Sandaradura, Amila Silva

    Obstacle limitation surfaces (OLS) are the main safeguard against objects that can pose a hazard to aircraft operations at and around airports. The standard dimensions of most of these surfaces were estimated using pilots' experience at the time they were included in the standards documents. As a result, some of these standards may have been overestimated while others may not provide an adequate level of safety. With airports moving to the Safety Management System (SMS) approach to design and operations safety, proper evaluation of the level of safety provided by OLS at specific sites becomes of great importance to airport operators. There is, however, no published evidence for the estimation of the safety level provided by the existing OLS standards. Moreover, the rationale used by ICAO to establish the existing OLS standards is not readily available in the standards documents. This study therefore collects actual flight path data using information provided by air traffic control radars and constructs a methodology to assess the probability of aircraft deviating from their intended/protected path. An extension of the developed methodology can be used to estimate OLS dimensions that provide an acceptable safety level for aircraft operations. This will be helpful for estimating safe and efficient standard dimensions of the OLS and for assessing the risk level that objects pose to aircraft operations around airports. In order to assess the existing standards and show the applications of the methodology, three case studies were conducted using aircraft data collected from the Ottawa (CYOW), Calgary (CYYC) and Edmonton (CYEG) International Airports.
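
    The core probabilistic step can be sketched as follows: given radar-derived lateral deviations from the intended path, estimate the probability that an aircraft strays beyond a candidate surface offset. Fitting a normal distribution is an illustrative assumption only; real deviation data are heavy-tailed, and the study's methodology would need a more careful model.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(7)
      deviations_m = rng.normal(0.0, 40.0, size=5000)    # synthetic radar tracks

      mu, sigma = stats.norm.fit(deviations_m)
      offset = 150.0                                     # candidate OLS offset, m
      p_exceed = stats.norm.sf(offset, mu, sigma) + stats.norm.cdf(-offset, mu, sigma)
      print(f"P(|deviation| > {offset:.0f} m) = {p_exceed:.2e}")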

  11. Analysis of the Proportional Hazards Model with Sparse Longitudinal Covariates

    PubMed Central

    Cao, Hongyuan; Churpek, Mathew M.; Zeng, Donglin; Fine, Jason P.

    2014-01-01

    Regression analysis of censored failure observations via the proportional hazards model permits time-varying covariates which are observed at death times. In practice, such longitudinal covariates are typically sparse and only measured at infrequent and irregularly spaced follow-up times. Full likelihood analyses of joint models for longitudinal and survival data impose stringent modelling assumptions which are difficult to verify in practice and which are complicated both inferentially and computationally. In this article, a simple kernel-weighted score function is proposed with minimal assumptions. Two scenarios are considered: half-kernel estimation, in which observation ceases at the time of the event, and full-kernel estimation for data where observation may continue after the event, as with recurrent events data. It is established that these estimators are consistent and asymptotically normal. However, they converge at rates which are slower than the parametric rates which may be achieved with fully observed covariates, with the full-kernel method achieving an optimal convergence rate which is superior to that of the half-kernel method. Simulation results demonstrate that the large sample approximations are adequate for practical use and may yield improved performance relative to the last-value-carried-forward approach and the joint modelling method. The analysis of data from a cardiac arrest study demonstrates the utility of the proposed methods. PMID:26576066

  12. Hazard function analysis for flood planning under nonstationarity

    NASA Astrophysics Data System (ADS)

    Read, Laura K.; Vogel, Richard M.

    2016-05-01

    The field of hazard function analysis (HFA) involves a probabilistic assessment of the "time to failure" or "return period," T, of an event of interest. HFA is used in epidemiology, manufacturing, medicine, actuarial statistics, reliability engineering, economics, and elsewhere. Although for a stationary process the probability density function (pdf) of the return period always follows an exponential distribution, the same is not true for nonstationary processes. When the process of interest, X, exhibits nonstationary behavior, HFA can provide a complementary approach to risk analysis with analytical tools particularly useful for hydrological applications. After a general introduction to HFA, we describe a new mathematical linkage between the magnitude of the flood event, X, and its return period, T, for nonstationary processes. We derive the probabilistic properties of T for a nonstationary one-parameter exponential model of X, and then use both Monte-Carlo simulation and HFA to generalize the behavior of T when X arises from a nonstationary two-parameter lognormal distribution. For this case, our findings suggest that a two-parameter Weibull distribution provides a reasonable approximation for the pdf of T. We document how HFA can provide an alternative approach to characterize the probabilistic properties of both nonstationary flood series and the resulting pdf of T.
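
    The stationary result quoted above is easy to verify by simulation: with a constant annual exceedance probability p, the waiting time to the next exceedance is geometric with mean 1/p, the discrete counterpart of the exponential distribution. Numbers are illustrative.

      import numpy as np

      p = 0.01                                  # stationary annual exceedance probability
      rng = np.random.default_rng(3)
      waits = rng.geometric(p, size=100_000)    # years until the next exceedance

      print(waits.mean())                       # ~1/p = 100 years
      print(np.quantile(waits, [0.5, 0.9]))     # median ~69 yr, 90th percentile ~230 yr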

  13. Recent Advances in Launch Vehicle Toxic Hazard and Risk Analysis

    NASA Astrophysics Data System (ADS)

    Nyman, R. L.

    2012-01-01

    A number of widely used rocket propellants produce toxic combustion byproducts or are themselves toxic in their un-reacted state. In this paper we focus on the methodology used to evaluate early-flight catastrophic failures and nominal launch emissions that release large amounts of propellant or combustion products into the planetary boundary layer and pose a potential risk to launch area personnel, spectators, or the general public. The United States has traditionally used the Rocket Exhaust Effluent Diffusion Model (REEDM) [1] to assess the hazard zones associated with such releases. REEDM is a 1970s-vintage Gaussian atmospheric dispersion model that is limited in its ability to accurately simulate certain aspects of the initial source geometry and the dynamics of a vehicle breakup and propellant fragment dispersion. The Launch Area Toxic Risk Analysis 3-Dimensional (LATRA3D) [2] computer program has been developed to address many of REEDM's deficiencies. LATRA3D is a probabilistic risk analysis tool that simulates both nominal vehicle flight and in-flight failure emissions.
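
    REEDM belongs to the family of models whose simplest member is the steady-state Gaussian plume; the sketch below shows that textbook formula only to indicate the class of model involved. It is not the REEDM or LATRA3D implementation, and all inputs are placeholders.

      import numpy as np

      def plume_concentration(y_m, z_m, Q_g_s, u_m_s, H_m, sigma_y, sigma_z):
          """Steady-state Gaussian plume with ground reflection, g/m^3.
          sigma_y and sigma_z normally grow with downwind distance and
          atmospheric stability; here they are simply passed in."""
          lateral = np.exp(-y_m ** 2 / (2 * sigma_y ** 2))
          vertical = (np.exp(-(z_m - H_m) ** 2 / (2 * sigma_z ** 2))
                      + np.exp(-(z_m + H_m) ** 2 / (2 * sigma_z ** 2)))  # image source
          return Q_g_s / (2 * np.pi * u_m_s * sigma_y * sigma_z) * lateral * vertical

      # Centerline, ground-level concentration with hypothetical inputs
      print(plume_concentration(y_m=0.0, z_m=0.0, Q_g_s=500.0, u_m_s=5.0,
                                H_m=100.0, sigma_y=80.0, sigma_z=40.0))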

  14. 9 CFR 417.2 - Hazard Analysis and HACCP Plan.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... Section 417.2 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE... determine the food safety hazards reasonably likely to occur in the production process and identify the... food safety hazards that can occur before, during, and after entry into the establishment. A...

  15. Quantitative infrared analysis of hydrogen fluoride

    SciTech Connect

    Manuta, D.M.

    1997-04-01

    This work was performed at the Portsmouth Gaseous Diffusion Plant, where hydrogen fluoride is produced upon the hydrolysis of UF₆. This poses a problem in this setting, and a method for determining the mole percent concentration was desired. HF has been considered a non-ideal gas for many years; D. F. Smith utilized complex equations in his HF studies in the 1950s. We have evaluated HF behavior as a function of pressure from three different perspectives: (1) absorbance at 3877 cm⁻¹ as a function of pressure for 100% HF; (2) absorbance at 3877 cm⁻¹ as a function of increasing partial pressure of HF, with the total pressure maintained at 300 mm HgA with nitrogen; and (3) absorbance at 3877 cm⁻¹ for constant partial pressure of HF, with the total pressure increased to greater than 800 mm HgA with nitrogen. These experiments have shown that at partial pressures up to 35 mm HgA, HF follows the ideal gas law. The absorbance at 3877 cm⁻¹ can be quantitatively analyzed via infrared methods.
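
    In the ideal-gas regime reported above, absorbance is linear in HF partial pressure (Beer-Lambert behavior), so a linear calibration converts a measured absorbance into mole percent. The calibration points below are invented for illustration.

      import numpy as np

      # Calibration: HF partial pressure (mm HgA) vs. absorbance at 3877 cm^-1
      p_cal = np.array([5.0, 10.0, 20.0, 30.0, 35.0])
      a_cal = np.array([0.071, 0.140, 0.285, 0.422, 0.495])

      slope, intercept = np.polyfit(p_cal, a_cal, 1)   # linear fit

      a_meas, p_total = 0.250, 300.0                   # measured sample
      p_hf = (a_meas - intercept) / slope              # inferred partial pressure
      print(f"HF = {100.0 * p_hf / p_total:.1f} mole %")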

  16. Quantitative multi-modal NDT data analysis

    SciTech Connect

    Heideklang, René; Shokouhi, Parisa

    2014-02-18

    A single NDT technique is often not adequate to provide assessments about the integrity of test objects with the required coverage or accuracy. In such situations, it is often resorted to multi-modal testing, where complementary and overlapping information from different NDT techniques are combined for a more comprehensive evaluation. Multi-modal material and defect characterization is an interesting task which involves several diverse fields of research, including signal and image processing, statistics and data mining. The fusion of different modalities may improve quantitative nondestructive evaluation by effectively exploiting the augmented set of multi-sensor information about the material. It is the redundant information in particular, whose quantification is expected to lead to increased reliability and robustness of the inspection results. There are different systematic approaches to data fusion, each with its specific advantages and drawbacks. In our contribution, these will be discussed in the context of nondestructive materials testing. A practical study adopting a high-level scheme for the fusion of Eddy Current, GMR and Thermography measurements on a reference metallic specimen with built-in grooves will be presented. Results show that fusion is able to outperform the best single sensor regarding detection specificity, while retaining the same level of sensitivity.

  17. Using fire tests for quantitative risk analysis

    SciTech Connect

    Ling, W.C.T.; Williamson, R.B.

    1980-03-01

    Fires can be considered a causal chain-of-events in which the growth and spread of fire may cause damage and injury if it is rapid enough to overcome the barriers placed in its way. Fire tests for the fire resistance of the barriers can be used in a quantitative risk assessment. The fire growth and spread are modelled in a State Transition Model (STM). The fire barriers are presented as part of the Fire Protection Model (FPM), which is based on a portion of the NFPA Decision Tree. An Emergency Equivalent Network is introduced to couple the Fire Growth Model (FGM) and the FPM so that the spread of fire beyond the room of origin can be computed. An example is presented in which a specific building floor plan is analyzed to obtain the shortest expected time for fire to spread between two points. To obtain the probability and time for each link in the network, data from the results of fire tests were used. These results were found to be lacking, and new standards giving better data are advocated.
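
    The shortest expected time for fire to spread between two points is a classic shortest-path computation: nodes are rooms or spaces, and each link carries an expected barrier-penetration time taken from fire-test data. The sketch below runs Dijkstra's algorithm on a made-up floor plan; room names and times are invented.

      import heapq

      def shortest_expected_time(graph, start, goal):
          """Dijkstra over expected fire-penetration times (minutes)."""
          queue, seen = [(0.0, start)], set()
          while queue:
              t, node = heapq.heappop(queue)
              if node == goal:
                  return t
              if node in seen:
                  continue
              seen.add(node)
              for nxt, dt in graph.get(node, []):
                  if nxt not in seen:
                      heapq.heappush(queue, (t + dt, nxt))
          return float("inf")

      floor_plan = {
          "room_of_origin": [("corridor", 20.0), ("room_B", 45.0)],
          "corridor": [("room_B", 15.0), ("stairwell", 30.0)],
          "room_B": [("stairwell", 25.0)],
      }
      print(shortest_expected_time(floor_plan, "room_of_origin", "stairwell"))  # 50.0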

  18. A new approach to hazardous materials transportation risk analysis: decision modeling to identify critical variables.

    PubMed

    Clark, Renee M; Besterfield-Sacre, Mary E

    2009-03-01

    We take a novel approach to analyzing hazardous materials transportation risk in this research. Previous studies analyzed this risk from an operations research (OR) or quantitative risk assessment (QRA) perspective by minimizing or calculating risk along a transport route. Further, even though the majority of incidents occur when containers are unloaded, the research has not focused on transportation-related activities, including container loading and unloading. In this work, we developed a decision model of a hazardous materials release during unloading using actual data and an exploratory data modeling approach. Previous studies have had a theoretical perspective in terms of identifying and advancing the key variables related to this risk, and there has not been a focus on probability and statistics-based approaches for doing this. Our decision model empirically identifies the critical variables using an exploratory methodology for a large, highly categorical database involving latent class analysis (LCA), loglinear modeling, and Bayesian networking. Our model identified the most influential variables and countermeasures for two consequences of a hazmat incident, dollar loss and release quantity, and is one of the first models to do this. The most influential variables were found to be related to the failure of the container. In addition to analyzing hazmat risk, our methodology can be used to develop data-driven models for strategic decision making in other domains involving risk.

  19. A new approach to hazardous materials transportation risk analysis: decision modeling to identify critical variables.

    PubMed

    Clark, Renee M; Besterfield-Sacre, Mary E

    2009-03-01

    We take a novel approach to analyzing hazardous materials transportation risk in this research. Previous studies analyzed this risk from an operations research (OR) or quantitative risk assessment (QRA) perspective by minimizing or calculating risk along a transport route. Further, even though the majority of incidents occur when containers are unloaded, the research has not focused on transportation-related activities, including container loading and unloading. In this work, we developed a decision model of a hazardous materials release during unloading using actual data and an exploratory data modeling approach. Previous studies have had a theoretical perspective in terms of identifying and advancing the key variables related to this risk, and there has not been a focus on probability and statistics-based approaches for doing this. Our decision model empirically identifies the critical variables using an exploratory methodology for a large, highly categorical database involving latent class analysis (LCA), loglinear modeling, and Bayesian networking. Our model identified the most influential variables and countermeasures for two consequences of a hazmat incident, dollar loss and release quantity, and is one of the first models to do this. The most influential variables were found to be related to the failure of the container. In addition to analyzing hazmat risk, our methodology can be used to develop data-driven models for strategic decision making in other domains involving risk. PMID:19087232

  20. Extending and automating a Systems-Theoretic hazard analysis for requirements generation and analysis.

    SciTech Connect

    Thomas, John

    2012-05-01

    Systems Theoretic Process Analysis (STPA) is a powerful new hazard analysis method designed to go beyond traditional safety techniques, such as Fault Tree Analysis (FTA), that overlook important causes of accidents like flawed requirements, dysfunctional component interactions, and software errors. While proving to be very effective on real systems, no formal structure has been defined for STPA, and its application has been ad hoc, with no rigorous procedures or model-based design tools. This report defines a formal mathematical structure underlying STPA and describes a procedure for systematically performing an STPA analysis based on that structure. A method for using the results of the hazard analysis to generate formal safety-critical, model-based system and software requirements is also presented. Techniques to automate both the analysis and the requirements generation are introduced, as well as a method to detect conflicts between the safety requirements and other functional model-based requirements during early development of the system.

  1. 21 CFR 120.8 - Hazard Analysis and Critical Control Point (HACCP) plan.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ...) plan. 120.8 Section 120.8 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN... General Provisions § 120.8 Hazard Analysis and Critical Control Point (HACCP) plan. (a) HACCP plan. Each processor shall have and implement a written HACCP plan whenever a hazard analysis reveals one or more...

  2. Quantitative analysis of myocardial tissue with digital autofluorescence microscopy

    PubMed Central

    Jensen, Thomas; Holten-Rossing, Henrik; Svendsen, Ida M H; Jacobsen, Christina; Vainer, Ben

    2016-01-01

    Background: The opportunity offered by whole slide scanners of automated histological analysis implies an ever-increasing importance of digital pathology. To move beyond conventional pathology, however, digital pathology may need a basic histological starting point similar to that of hematoxylin and eosin staining in conventional pathology. This study presents an automated fluorescence-based microscopy approach providing highly detailed morphological data from unstained microsections. These data may provide a basic histological starting point from which further digital analysis, including staining, may benefit. Methods: This study explores the inherent tissue fluorescence, also known as autofluorescence, as a means to quantitate cardiac tissue components in histological microsections. Data acquisition using a commercially available whole slide scanner and an image-based quantitation algorithm are presented. Results: It is shown that the autofluorescence intensity of unstained microsections at two different wavelengths is a suitable starting point for automated digital analysis of myocytes, fibrous tissue, lipofuscin, and the extracellular compartment. The output of the method is absolute quantitation along with accurate outlines of the above-mentioned components. The digital quantitations are verified by comparison to point grid quantitations performed on the microsections after Van Gieson staining. Conclusion: The presented method is amply described as a prestain multicomponent quantitation and outlining tool for histological sections of cardiac tissue. The main perspective is the opportunity for combination with digital analysis of stained microsections, for which the method may provide an accurate digital framework. PMID:27141321

  3. Risk-based consequences of extreme natural hazard processes in mountain regions - Multi-hazard analysis in Tyrol (Austria)

    NASA Astrophysics Data System (ADS)

    Huttenlau, Matthias; Stötter, Johann

    2010-05-01

    weighting within the risk concept, this has significant implications for the results of risk analyses. Thus, an equal and scale-appropriate balance of those risk components is a fundamental key factor for effective natural hazard risk analyses. The results of such analyses especially inform decision makers in the insurance industry, the administration, and politicians on potential consequences, and are the basis for appropriate risk management strategies. Thereby, results (i) based on an annual or probabilistic risk comprehension have to be distinguished from (ii) scenario-based analyses. The first are based on statistics of periodically or episodically occurring events, whereas the latter approach is especially applied for extreme, non-linear, stochastic events. Focusing especially on the needs of insurance companies, the first approach is appropriate for premium pricing and reinsurance strategies with an annual perspective, whereas the latter focuses on events with extreme loss burdens under worst-case criteria to guarantee adequate reinsurance coverage. Moreover, the demand for adequate loss model approaches and methods is strengthened by the risk-based requirements of the upcoming capital requirement directive Solvency II. The present study estimates the potential elements at risk, their corresponding damage potentials and the Probable Maximum Losses (PMLs) of extreme natural hazard events in Tyrol (Austria) and adequately considers the scale dependency and balanced application of the introduced risk components. Besides the introduced analysis, an additional portfolio analysis of a regional insurance company was executed. The geocoded insurance contracts of this portfolio analysis were the basis for estimating spatially, socio-economically and functionally differentiated mean insurance values for the different risk categories of (i) buildings, (ii) contents or inventory, (iii) vehicles, and (iv) persons in the study area. The estimated mean insurance values were

  4. The quantitative failure of human reliability analysis

    SciTech Connect

    Bennett, C.T.

    1995-07-01

    This philosophical treatise examines the merits of Human Reliability Analysis (HRA) in the context of the nuclear power industry. The author argues that historic and current HRA has failed to inform policy makers who must make decisions based on the risk that humans contribute to systems performance. He argues for an HRA based on Bayesian (fact-based) inferential statistics, which advocates a systems analysis process that employs cogent heuristics when using opinion, and tempers itself with a rational debate over the weight given to subjective and empirical probabilities.
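
    A minimal sketch of the Bayesian stance the author advocates (Python with scipy): an expert prior on a human error probability (HEP) is tempered by observed task data via a beta-binomial update. The prior parameters and error counts are invented for illustration.

        # Beta-binomial update of a human error probability (toy numbers).
        from scipy import stats

        prior = stats.beta(a=2, b=198)     # expert opinion: HEP around 0.01
        errors, trials = 3, 500            # hypothetical performance records
        posterior = stats.beta(a=2 + errors, b=198 + trials - errors)

        print(f"prior mean HEP:     {prior.mean():.4f}")
        print(f"posterior mean HEP: {posterior.mean():.4f}")
        print("90% credible interval:", posterior.ppf([0.05, 0.95]))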

  5. Comparative risk analysis of technological hazards (a review).

    PubMed Central

    Kates, R W; Kasperson, J X

    1983-01-01

    Hazards are threats to people and what they value and risks are measures of hazards. Comparative analyses of the risks and hazards of technology can be dated to Starr's 1969 paper [Starr, C. (1969) Science 165, 1232-1238] but are rooted in recent trends in the evolution of technology, the identification of hazard, the perception of risk, and the activities of society. These trends have spawned an interdisciplinary quasi profession with new terminology, methodology, and literature. A review of 54 English-language monographs and book-length collections, published between 1970 and 1983, identified seven recurring themes: (i) overviews of the field of risk assessment, (ii) efforts to estimate and quantify risk, (iii) discussions of risk acceptability, (iv) perception, (v) analyses of regulation, (vi) case studies of specific technological hazards, and (vii) agenda for research. Within this field, science occupies a unique niche, for many technological hazards transcend the realm of ordinary experience and require expert study. Scientists can make unique contributions to each area of hazard management but their primary contribution is the practice of basic science. Beyond that, science needs to further risk assessment by understanding the more subtle processes of hazard creation and by establishing conventions for estimating risk and for presenting and handling uncertainty. Scientists can enlighten the discussion of tolerable risk by setting risks into comparative contexts, by studying the process of evaluation, and by participating as knowledgeable individuals, but they cannot decide the issue. Science can inform the hazard management process by broadening the range of alternative control actions and modes of implementation and by devising methods to evaluate their effectiveness. PMID:6580625

  6. A Quantitative Analysis of Countries' Research Strengths

    ERIC Educational Resources Information Center

    Saxena, Anurag; Brazer, S. David; Gupta, B. M.

    2009-01-01

    This study employed a multidimensional analysis to evaluate transnational patterns of scientific research to determine relative research strengths among widely varying nations. Findings from this study may inform national policy with regard to the most efficient use of scarce national research resources, including government and private funding.…

  7. Analysis of hazardous biological material by MALDI mass spectrometry

    SciTech Connect

    KL Wahl; KH Jarman; NB Valentine; MT Kingsley; CE Petersen; ST Cebula; AJ Saenz

    2000-03-21

    Matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-MS) has become a valuable tool for analyzing microorganisms. The speed with which data can be obtained from MALDI-MS makes this a potentially important tool for biological health hazard monitoring and forensic applications. The excitement in the mass spectrometry community in this potential field of application is evident from the expanding list of research laboratories pursuing development of MALDI-MS for bacterial identification. Numerous research groups have demonstrated the ability to obtain unique MALDI-MS spectra from intact bacterial cells and bacterial cell extracts. The ability to differentiate strains of the same species has been investigated. Reproducibility of MALDI-MS spectra from bacterial species under carefully controlled experimental conditions has also been demonstrated. Wang et al. have reported on interlaboratory reproducibility of the MALDI-MS analysis of several bacterial species. However, there are still issues that need to be addressed, including the careful control of experimental parameters for reproducible spectra and selection of optimal experimental parameters such as solvent and matrix.

  8. Hazardous Gas Leak Analysis in the Space Shuttle

    NASA Technical Reports Server (NTRS)

    Barile, Ronald G.

    1991-01-01

    Helium tests of the main propulsion system in the Space Shuttle, aimed at detecting hydrogen leaks, are examined. The hazardous gas detection system (HGDS) in the mobile launch pad uses mass spectrometers (MS) to monitor the shuttle environment for leaks. The mass spectrometers are fed by long tubes to sample gas from the payload bay, mid-body, aft engine compartment, and external tank. The purpose is to improve the HGDS, especially in its potential for locating cryogen leaks. Pre-existing leak data were analyzed for transient information to determine whether the leak location could be pinpointed from test data. A rapid-response leak detection experiment was designed, built, and tested. Large eddies and vortices were seen visually with Schlieren imaging, and they were detected in the time plots of the various instruments. The response time of the MS was found to be in the range of 0.05 to 0.1 sec. Pulsed concentration waves were clearly detected at 25 cycles per sec by spectral analysis of MS data. One conclusion is that the backup HGDS sampling frequency should be increased above the present rate of 1 sample per second.

  9. Preparation of waste oil for analysis to determine hazardous metals

    SciTech Connect

    Essling, A.M.; Huff, D.R.; Huff, E.A.; Fox, I.M.; Graczyk, D.G.

    1995-07-01

    Two methods for preparing waste-oil samples to permit measurement of their metals content were evaluated. For this evaluation, metals-in-oil standard reference materials were prepared by each method and the resulting solutions were analyzed for 20 metals, including those (As, Ba, Cd, Cr, Pb, Hg, Se, and Ag) regulated as hazardous under the Resource Conservation and Recovery Act. One preparation method involved combustion of the waste oil under oxygen at 25 atm pressure, as described in the American Society for Testing and Materials test method E926-88. As we applied it, this method gave recoveries well under 90% for most of the metals that we examined and, hence, proved unsatisfactory for routine application to waste-oil analysis. With the other method, nitric acid decomposition in a sealed vessel heated with microwave energy (analogous to US Environmental Protection Agency Method 3051), recoveries of all 20 metal contaminants were within 90 to 110% of the certified values. This microwave digestion procedure was also more efficient since it allowed six samples to be prepared together, whereas the oxygen combustion approach allowed processing of only one sample at a time.

  10. Scout: orbit analysis and hazard assessment for NEOCP objects

    NASA Astrophysics Data System (ADS)

    Farnocchia, Davide; Chesley, Steven R.; Chamberlin, Alan B.

    2016-10-01

    It typically takes a few days for a newly discovered asteroid to be officially recognized as a real object. During this time, the tentative discovery is published on the Minor Planet Center's Near-Earth Object Confirmation Page (NEOCP) until additional observations confirm that the object is a real asteroid rather than an observational artifact or an artificial object. Also, NEOCP objects could have a limited observability window and yet be scientifically interesting, e.g., radar and lightcurve targets, mini-moons (temporary Earth captures), mission accessible targets, close approachers or even impactors. For instance, the only two asteroids discovered before an impact, 2008 TC3 and 2014 AA, both reached the Earth less than a day after discovery. For these reasons we developed Scout, an automated system that provides an orbital and hazard assessment for NEOCP objects within minutes after the observations are available. Scout's rapid analysis increases the chances of securing the trajectory of interesting NEOCP objects before the ephemeris uncertainty grows too large or the observing geometry becomes unfavorable. The generally short observation arcs, perhaps only a few hours or even less, lead to severe degeneracies in the orbit estimation process. To overcome these degeneracies Scout relies on systematic ranging, a technique that derives possible orbits by scanning a grid in the poorly constrained space of topocentric range and range rate, while the plane-of-sky position and motion are directly tied to the recorded observations. This scan allows us to derive a distribution of the possible orbits and in turn identify the NEOCP objects of most interest to prioritize follow-up efforts. In particular, Scout ranks objects according to the likelihood of an impact, estimates the close approach distance, the Earth-relative minimum orbit intersection distance and v-infinity, and computes scores to identify objects more likely to be an NEO, a km-sized NEO, a Potentially
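
    The grid scan at the heart of systematic ranging can be sketched as follows (Python with numpy). The real system fits an orbit to the astrometry at each grid point; here toy_rms is an invented stand-in for that fit, and the grid bounds are assumptions.

        import numpy as np

        ranges = np.geomspace(1e-3, 1.0, 60)         # topocentric range (AU)
        range_rates = np.linspace(-30.0, 30.0, 40)   # range rate (km/s)

        def toy_rms(rho, rho_dot):
            # Stand-in for a real orbit fit: pretend the astrometry is best
            # matched near rho = 0.05 AU, rho_dot = 5 km/s.
            return np.hypot((np.log10(rho) + 1.3) / 0.3, (rho_dot - 5.0) / 8.0)

        weights = np.array([[np.exp(-0.5 * toy_rms(r, v) ** 2)
                             for v in range_rates] for r in ranges])
        weights /= weights.sum()                     # distribution over the grid

        i, j = np.unravel_index(weights.argmax(), weights.shape)
        print(f"most likely grid point: range ≈ {ranges[i]:.3f} AU, "
              f"range rate ≈ {range_rates[j]:+.1f} km/s")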

  11. Quantitative analysis of cascade impactor samples - revisited

    NASA Astrophysics Data System (ADS)

    Orlić, I.; Chiam, S. Y.; Sanchez, J. L.; Tang, S. M.

    1999-04-01

    Concentrations of aerosols collected in Singapore during the three-month-long haze period that affected the whole South-East Asian region in 1997 are reported. Aerosol samples were continuously collected using a fine aerosol sampler (PM2.5) and occasionally with a single orifice cascade impactor (CI) sampler. Our results show that in the fine fraction (<2.5 μm) the concentrations of two well-known biomass burning products, i.e. K and S, were generally increased by a factor of 2-3 compared to the non-hazy periods. However, a discrepancy was noticed, at least for elements with lower atomic number (Ti and below), between the results obtained by the fine aerosol sampler and the cascade impactor. Careful analysis by means of nuclear microscopy, in particular by the Scanning Transmission Ion Microscopy (STIM) technique, revealed that the thicknesses of the lower CI stages exceeded thick-target limits for 2 MeV protons. Detailed depth profiles of all CI stages were therefore measured using the STIM technique and concentrations corrected for absorption and proton energy loss. After correcting the results for the actual sample thickness, the concentrations of all major elements (S, Cl, K, Ca) agreed much better with the PM2.5 results. The importance of implementing thick-target corrections in the analysis of CI samples, especially those collected in urban environments, is emphasized. The broad-beam PIXE analysis approach is certainly not adequate in these cases.

  12. Neo-Deterministic and Probabilistic Seismic Hazard Assessments: a Comparative Analysis

    NASA Astrophysics Data System (ADS)

    Peresan, Antonella; Magrin, Andrea; Nekrasova, Anastasia; Kossobokov, Vladimir; Panza, Giuliano F.

    2016-04-01

    Objective testing is the key issue towards any reliable seismic hazard assessment (SHA). Different earthquake hazard maps must demonstrate their capability in anticipating ground shaking from future strong earthquakes before an appropriate use for different purposes, such as engineering design, insurance, and emergency management. Quantitative assessment of map performance is also an essential step in the scientific process of their revision and possible improvement. Cross-checking of probabilistic models against available observations and independent physics-based models is recognized as a major validation procedure. The existing maps from the classical probabilistic seismic hazard analysis (PSHA), as well as those from the neo-deterministic analysis (NDSHA), which have already been developed for several regions worldwide (including Italy, India and North Africa), are considered to exemplify the possibilities of cross-comparative analysis in spotting the limits and advantages of the different methods. Where the data permit, a comparative analysis against the documented seismic activity observed in reality is carried out, showing how available observations about past earthquakes can contribute to assessing the performance of the different methods. Neo-deterministic refers to a scenario-based approach, which allows for consideration of a wide range of possible earthquake sources as the starting point for scenarios constructed via full waveform modeling. The method does not make use of empirical attenuation models (i.e. Ground Motion Prediction Equations, GMPE) and naturally supplies realistic time series of ground shaking (i.e. complete synthetic seismograms), readily applicable to complete engineering analysis and other mitigation actions. The standard NDSHA maps provide reliable envelope estimates of maximum seismic ground motion from a wide set of possible scenario earthquakes, including the largest deterministically or historically defined credible earthquake. In addition

  13. 9 CFR 417.2 - Hazard Analysis and HACCP Plan.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... REGULATORY REQUIREMENTS UNDER THE FEDERAL MEAT INSPECTION ACT AND THE POULTRY PRODUCTS INSPECTION ACT HAZARD... the intended use or consumers of the finished product shall be identified. (3) Food safety...

  14. 9 CFR 417.2 - Hazard Analysis and HACCP Plan.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... REGULATORY REQUIREMENTS UNDER THE FEDERAL MEAT INSPECTION ACT AND THE POULTRY PRODUCTS INSPECTION ACT HAZARD... the intended use or consumers of the finished product shall be identified. (3) Food safety...

  15. A Comparative Assessment of Greek Universities' Efficiency Using Quantitative Analysis

    ERIC Educational Resources Information Center

    Katharaki, Maria; Katharakis, George

    2010-01-01

    In part due to the increased demand for higher education, typical evaluation frameworks for universities often address the key issue of available resource utilisation. This study seeks to estimate the efficiency of 20 public universities in Greece through quantitative analysis (including performance indicators, data envelopment analysis (DEA) and…

  16. SLUDGE TREATMENT PROJECT ENGINEERED CONTAINER RETRIEVAL AND TRANSFER SYSTEM PRELIMINARY DESIGN HAZARD ANALYSIS SUPPLEMENT 1

    SciTech Connect

    FRANZ GR; MEICHLE RH

    2011-07-18

    This 'What/If' Hazards Analysis addresses hazards affecting the Sludge Treatment Project Engineered Container Retrieval and Transfer System (ECRTS), including natural phenomena hazards (NPH) and external events, at the preliminary design stage. In addition, it addresses the hazards of the operational sequence steps for the mechanical handling operations: preparing the Sludge Transport and Storage Container (STSC), disconnecting the STSC, and preparing the STSC and Sludge Transport System (STS) for shipping.

  17. Quantitative analysis of heart rate variability

    NASA Astrophysics Data System (ADS)

    Kurths, J.; Voss, A.; Saparin, P.; Witt, A.; Kleiner, H. J.; Wessel, N.

    1995-03-01

    In the modern industrialized countries, several hundred thousand people die of sudden cardiac death every year. The individual risk of sudden cardiac death cannot be defined precisely by commonly available, noninvasive diagnostic tools like Holter monitoring, highly amplified ECG and traditional linear analysis of heart rate variability (HRV). Therefore, we apply some rather unconventional methods of nonlinear dynamics to analyze the HRV. In particular, some complexity measures that are based on symbolic dynamics, as well as a new measure, the renormalized entropy, detect abnormalities in the HRV of several patients who had been classified in the low-risk group by traditional methods. A combination of these complexity measures with the parameters in the frequency domain seems to be a promising way to obtain a more precise definition of the individual risk. These findings have to be validated on a representative number of patients.
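
    One of the symbolic-dynamics complexity measures mentioned above can be sketched as follows (Python with numpy). The 4-symbol alphabet around the mean RR interval and the word length are illustrative choices, and the RR series is synthetic.

        import numpy as np
        from collections import Counter

        def symbolic_word_entropy(rr, word_len=3):
            mu = rr.mean()
            sym = np.digitize(rr, [0.95 * mu, mu, 1.05 * mu])  # symbols 0..3
            words = [tuple(sym[i:i + word_len])
                     for i in range(len(sym) - word_len + 1)]
            counts = np.array(list(Counter(words).values()), dtype=float)
            p = counts / counts.sum()
            return -(p * np.log2(p)).sum()            # Shannon entropy (bits)

        rng = np.random.default_rng(7)
        rr = rng.normal(0.8, 0.05, 1000)              # synthetic RR intervals (s)
        print(f"symbolic word entropy: {symbolic_word_entropy(rr):.2f} bits")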

  18. Hazard Analysis for the Mark III Space Suit Assembly (SSA) Used in One-g Operations

    NASA Technical Reports Server (NTRS)

    Mitchell, Kate; Ross, Amy; Blanco, Raul; Wood, Art

    2012-01-01

    This Hazard Analysis document encompasses the Mark III Space Suit Assembly (SSA) and associated ancillary equipment. It has been prepared using JSC17773, "Preparing Hazard Analyses for JSC Ground Operation", as a guide. The purpose of this document is to present the potential hazards involved in ground (23% maximum O2, One-g) operations of the Mark III and associated ancillary support equipment system. The hazards listed in this document are specific to suit operations only; each supporting facility (Bldg. 9, etc.) is responsible for test-specific Hazard Analyses. A "hazard" is defined as any condition that has the potential for harming personnel or equipment. This analysis was performed to document the safety aspects associated with manned use of the Mark III for pressurized and unpressurized ambient, ground-based, One-g human testing. The hazards identified herein represent generic hazards inherent to all standard JSC test venues for nominal ground test configurations. Non-standard test venues or test-specific configurations may warrant consideration of additional hazard analysis prior to test. The cognizant suit engineer is responsible for the safety of the astronaut/test subject, space suit, and suit support personnel. The test requester, for the test supported by the suit test engineer and suited subject, is responsible for overall safety and any necessary Test Readiness Reviews (TRR).

  19. Quantitative surface spectroscopic analysis of multicomponent polymers

    NASA Astrophysics Data System (ADS)

    Zhuang, Hengzhong

    Angle-dependent electron spectroscopy for chemical analysis (ESCA) has been successfully used to examine the surface compositional gradient of a multicomponent polymer. However, the photoelectron intensities detected at each take-off angle of ESCA measurements are convoluted signals. The convoluted nature of the signal distorts depth profiles for samples having compositional gradients. To recover the true concentration profiles for the samples, a deconvolution program is described in Chapter 2. The compositional profiles of two classes of important multicomponent polymers, i.e., poly(dimethylsiloxane urethane) (PU-DMS) segmented copolymers and fluorinated poly(amide urethane) block copolymers, are obtained using this program. The effects of the polymer molecular structure and of processing variation on the surface compositional profile have been studied. Besides surface composition, it is desirable to know whether the distribution of segment or block lengths at the surface differs from that in the bulk, because this aspect of surface structure may lead to properties different from those predicted simply by knowledge of the surface composition and the bulk structure. In Chapter 3, we pioneered the direct determination of the distribution of polydimethylsiloxane (PDMS) segment lengths at the surface of PU-DMS using time-of-flight secondary ion mass spectrometry (ToF-SIMS). Exciting preliminary results are provided: for the thick film of PU-DMS with nominal MW of PDMS = 1000, the distribution of the PDMS segment lengths at the surface is nearly identical to that in the bulk, whereas in the case of the thick films of PU-DMS with nominal MW of PDMS = 2400, only those PDMS segments with MW of ca. 1000 preferentially segregated at the surface. As a potential minimal-fouling coating or biocompatible cardiovascular material, PU-DMS copolymers eventually come into contact with water once in use. Could such an environmental change (from air to aqueous) induce any undesirable

  20. Uncertainty of quantitative microbiological methods of pharmaceutical analysis.

    PubMed

    Gunar, O V; Sakhno, N G

    2015-12-30

    The total uncertainty of quantitative microbiological methods, used in pharmaceutical analysis, consists of several components. The analysis of the most important sources of the quantitative microbiological methods' variability demonstrated no effect of culture media and plate-count techniques on the estimation of microbial count, while a highly significant effect of other factors (type of microorganism, pharmaceutical product and individual reading and interpreting errors) was established. The most appropriate method of statistical analysis of such data was ANOVA, which enabled not only the effect of individual factors to be estimated but also their interactions. Considering all the elements of uncertainty and combining them mathematically, the combined relative uncertainty of the test results was estimated both for the method of quantitative examination of non-sterile pharmaceuticals and for the microbial count technique without any product. This value did not exceed 35%, which is appropriate for traditional plate count methods. PMID:26456251
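
    The combination step can be illustrated with a toy uncertainty budget (Python): relative uncertainty components from an ANOVA-style budget are combined in quadrature. The component names and values are invented, not the paper's.

        import math

        components = {                      # relative standard uncertainties (assumed)
            "microorganism type":    0.18,
            "product effect":        0.15,
            "reading/interpreting":  0.20,
            "repeatability":         0.10,
        }

        combined = math.sqrt(sum(u ** 2 for u in components.values()))
        print(f"combined relative uncertainty ≈ {combined:.1%}")   # ≈ 32%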

  2. Quantitative signal analysis in pulsed resonant photoacoustics

    NASA Astrophysics Data System (ADS)

    Schäfer, Stefan; Miklós, András; Hess, Peter

    1997-05-01

    The pulsed excitation of acoustic resonances was studied by means of a high-Q photoacoustic resonator with different types of microphone. The signal strength of the first radial mode was calculated by the basic theory as well as by a modeling program, which takes into account the acoustic impedances of the resonator, the acoustic filter system, and the influence of the microphone coupling on the photoacoustic cavity. When the calculated signal strength is used, the high-Q system can be calibrated for trace-gas analysis without a certified gas mixture. The theoretical results were compared with measurements and show good agreement for different microphone configurations. From the measured pressure signal (in pascals per joule), the absorption coefficient of ethylene was calculated; it agreed within 10% with literature values. In addition, a Helmholtz configuration with a highly sensitive 1-in. (2.54-cm) microphone was realized. Although the Q factor was reduced, the sensitivity could be increased by the Helmholtz resonator in the case of pulsed experiments. A maximum sensitivity of the coupled system of 341 mV/Pa was achieved.
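
    The calibration-free quantitation step reduces to dividing the measured pressure-per-pulse-energy signal by the modeled cell constant. A back-of-envelope sketch (Python; both numbers are invented placeholders, not the paper's values):

        signal = 12.0           # measured signal, Pa per J of pulse energy (assumed)
        cell_constant = 4.0e3   # modeled cell constant, Pa·cm/J (assumed)

        alpha = signal / cell_constant      # absorption coefficient, cm^-1
        print(f"absorption coefficient ≈ {alpha:.1e} cm^-1")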

  3. Quantitative analysis of in vivo cell proliferation.

    PubMed

    Cameron, Heather A

    2006-11-01

    Injection and immunohistochemical detection of 5-bromo-2'-deoxyuridine (BrdU) has become the standard method for studying the birth and survival of neurons, glia, and other cell types in the nervous system. BrdU, a thymidine analog, becomes stably incorporated into DNA during the S-phase of mitosis. Because DNA containing BrdU can be specifically recognized by antibodies, this method allows dividing cells to be marked at any given time and then identified at time points from a few minutes to several years later. BrdU immunohistochemistry is suitable for cell counting to examine the regulation of cell proliferation and cell fate. It can be combined with labeling by other antibodies, allowing confocal analysis of cell phenotype or expression of other proteins. The potential for nonspecific labeling and toxicity are discussed. Although BrdU immunohistochemistry has almost completely replaced tritiated thymidine autoradiography for labeling dividing cells, this method and situations in which it is still useful are also described. PMID:18428635

  4. A critical analysis of hazard resilience measures within sustainability assessment frameworks

    SciTech Connect

    Matthews, Elizabeth C.; Sattler, Meredith; Friedland, Carol J.

    2014-11-15

    Today, numerous sustainability assessment frameworks (SAFs) exist to guide designers in achieving sustainable performance in the design of structures and communities. SAFs are beneficial in educating users and are useful tools for incorporating sustainability strategies into planning, design, and construction; however, there is currently a substantial gap in the ability of existing SAFs to incorporate hazard resistance and hazard mitigation in the broader context of sustainable design. This paper analyzes the incorporation of hazard resistant design and hazard mitigation strategies within SAFs via a multi-level analysis of eleven SAFs. The SAFs analyzed range in scale of application (i.e. building, site, community). Three levels of analysis are presented: (1) macro-level analysis comparing the number of measures strictly addressing resilience versus sustainability, (2) meso-level analysis of the coverage of types of hazards within SAFs (e.g. flood, fire), and (3) micro-level analysis of SAF measures connected to flood-related hazard resilience. The results demonstrate that hazard resistance and hazard mitigation do not figure prominently in the intent of SAFs and that weaknesses in resilience coverage exist that have the potential to lead to the design of structures and communities that are still highly vulnerable to the impacts of extreme events.
    Highlights:
    • Sustainability assessment frameworks (SAFs) were analyzed for resilience coverage
    • Hazard resistance and mitigation do not figure prominently in the intent of SAFs
    • Approximately 75% of SAFs analyzed address three or fewer hazards
    • Lack of economic measures within SAFs could impact resilience and sustainability
    • Resilience measures for flood hazards are not consistently included in SAFs

  5. The importance of source area mapping for rockfall hazard analysis

    NASA Astrophysics Data System (ADS)

    Valagussa, Andrea; Frattini, Paolo; Crosta, Giovanni B.

    2013-04-01

    A problem in the characterization of the area affected by rockfall is the correct definition of the source areas. Different positions or different sizes of the source areas along a cliff result in different possibilities of propagation and diverse interaction with the passive countermeasures present in the area. Through the use of Hy-Stone (Crosta et al., 2004), a code able to perform 3D numerical modeling of rockfall processes, different types of source areas were tested on a case study slope along the western flank of Mt. de La Saxe (Courmayeur, AO), extending between 1200 and 2055 m a.s.l. The first set of source areas consists of unstable rock masses identified on the basis of field survey and Terrestrial Laser Scanning (IMAGEO, 2011). A second set of source areas was identified by using different thresholds of slope gradient; we tested slope thresholds between 50° and 75° at 5° intervals. The third source area dataset was generated by performing a kinematic stability analysis. For this analysis, we mapped the joint sets along the rocky cliff by means of the software COLTOP 3D (Jaboyedoff, 2004), and then identified the portions of the rocky cliff where planar/wedge failures and toppling are possible, assuming an average friction angle of 35°. From the outputs of the Hy-Stone models we extracted and analyzed the kinetic energy, flight height and velocity of the blocks falling along the rocky cliff in order to compare the controls exerted by the different source areas. We observed strong variations of kinetic energy and flight height among the different models, especially when using unstable masses identified through Terrestrial Laser Scanning. This is mainly related to the size of the blocks identified as susceptible to failure. On the contrary, the slope gradient thresholds do not have a strong impact on rockfall propagation. This contribution highlights the importance of a careful and appropriate mapping of rockfall source areas for rockfall hazard analysis and the

  6. Control of separation and quantitative analysis by GC-FTIR

    NASA Astrophysics Data System (ADS)

    Semmoud, A.; Huvenne, Jean P.; Legrand, P.

    1992-03-01

    Software for 3-D representations of the 'Absorbance-Wavenumber-Retention time' is used to control the quality of the GC separation. Spectral information given by the FTIR detection allows the user to be sure that a chromatographic peak is 'pure.' The analysis of peppermint essential oil is presented as an example. This assurance is absolutely required for quantitative applications. In these conditions, we have worked out a quantitative analysis of caffeine. Correlation coefficients between integrated absorbance measurements and concentration of caffeine are discussed at two steps of the data treatment.
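
    The quantitative step described above amounts to a linear calibration of integrated absorbance against concentration. A minimal sketch (Python with numpy; the calibration points are fabricated for illustration only):

        import numpy as np

        conc = np.array([0.5, 1.0, 2.0, 4.0, 8.0])        # mg/mL (assumed)
        area = np.array([0.11, 0.23, 0.44, 0.91, 1.78])   # integrated absorbance

        slope, intercept = np.polyfit(conc, area, 1)
        r = np.corrcoef(conc, area)[0, 1]
        print(f"area = {slope:.3f}*conc + {intercept:.3f}, r = {r:.4f}")

        unknown = 0.60                                     # area of an unknown
        print(f"estimated concentration: {(unknown - intercept) / slope:.2f} mg/mL")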

  7. Hazard Detection Analysis for a Forward-Looking Interferometer

    NASA Technical Reports Server (NTRS)

    West, Leanne; Gimmestad, Gary; Herkert, Ralph; Smith, William L.; Kireev, Stanislav; Schaffner, Philip R.; Daniels, Taumi S.; Cornman, Larry B.; Sharman, Robert; Weekley, Andrew; Perram, Glen; Gross, Kevin; Smith, Greg; Feltz, Wayne; Taylor, Joe; Olson, Erik

    2010-01-01

    The Forward-Looking Interferometer (FLI) is a new instrument concept for obtaining the measurements required to alert flight crews to potential weather hazards to safe flight. To meet the needs of the commercial fleet, such a sensor should address multiple hazards to warrant the costs of development, certification, installation, training, and maintenance. The FLI concept is based on high-resolution Infrared Fourier Transform Spectrometry (FTS) technologies that have been developed for satellite remote sensing. These technologies have also been applied to the detection of aerosols and gases for other purposes. The FLI concept is being evaluated for its potential to address multiple hazards including clear air turbulence (CAT), volcanic ash, wake vortices, low slant range visibility, dry wind shear, and icing during all phases of flight (takeoff, cruise, and landing). The research accomplished in this second phase of the FLI project was in three major areas: further sensitivity studies to better understand the potential capabilities and requirements for an airborne FLI instrument, field measurements conducted in an effort to provide empirical demonstrations of radiometric hazard detection, and theoretical work to support the development of algorithms to determine the severity of detected hazards.

  8. Job load and hazard analysis: a method for the analysis of workplace conditions for occupational health care.

    PubMed Central

    Mattila, M K

    1985-01-01

    One requirement for successful occupational health care is reliable information on occupational hazards. The aim of this study was to develop a simple, standardised method for workplace investigations for use in occupational health care. The theoretical framework of the method comprises the stress-strain model, the hazard-danger model, and risk behaviour theory. The new method, termed job load and hazard analysis, includes four stages: identification of hazards, their evaluation, conclusions and proposals, and follow up. Different methods are available for hazard identification. The identification starts with a rough analysis of five factors, chemical hazards, physical hazards, physical load, mental stress, and accident risk. Hazards and stress factors are assessed with an ordinal scale. Specialised methods are used if all hazards cannot otherwise be identified. The analytical procedure comprises: detection of hazards through observations and interviews at the workplace and with a questionnaire; assessment of findings as teamwork; and evaluation of the results of these assessments to yield conclusions and proposals made by occupational health care personnel. A data processing system has been developed for data storage and future use. The method has functioned in practice, improving the contents of the occupational health care programme and generating preventive measures. The method offers many new possibilities for controlling occupational hazards and studying relations between working conditions and workers' health. PMID:4041383

  9. Preliminary fire hazard analysis for the PUTDR and TRU trenches in the Solid Waste Burial Ground

    SciTech Connect

    Gaschott, L.J.

    1995-06-16

    This document represents the Preliminary Fire Hazards Analysis for the Pilot Unvented TRU Drum Retrieval effort and for the transuranic drum trenches in the low-level burial grounds. The FHA was developed in accordance with DOE Order 5480.7A to address the major hazards inherent in the facility.

  10. 14 CFR Appendix I to Part 417 - Methodologies for Toxic Release Hazard Analysis and Operational Procedures

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... hazards and each hazard control involved in the process. An analysis that complies with 29 CFR 1910.119(e... physics on the transport and diffusion of each toxicant. (5) Meteorological conditions at the time of..., or for use in any real-time physics models used to ensure compliance with the toxic flight...

  11. 21 CFR 120.8 - Hazard Analysis and Critical Control Point (HACCP) plan.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ...: (i) Critical control points designed to control food hazards that are reasonably likely to occur and... 21 Food and Drugs 2 2011-04-01 2011-04-01 false Hazard Analysis and Critical Control Point (HACCP) plan. 120.8 Section 120.8 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND...

  12. In silico analysis of nanomaterials hazard and risk.

    PubMed

    Cohen, Yoram; Rallo, Robert; Liu, Rong; Liu, Haoyang Haven

    2013-03-19

    false positive relative to false negative predictions and the reliability of toxicity data. To establish the environmental impact of ENMs identified as toxic, researchers will need to estimate the potential level of environmental exposure concentration of ENMs in the various media such as air, water, soil, and vegetation. When environmental monitoring data are not available, models of ENMs fate and transport (at various levels of complexity) serve as alternative approaches for estimating exposure concentrations. Risk management decisions regarding the manufacturing, use, and environmental regulations of ENMs would clearly benefit from both the assessment of potential ENMs exposure concentrations and suitable toxicity metrics. The decision process should consider the totality of available information: quantitative and qualitative data and the analysis of nanomaterials toxicity, and fate and transport behavior in the environment. Effective decision-making to address the potential impacts of nanomaterials will require considerations of the relevant environmental, ecological, technological, economic, and sociopolitical factors affecting the complete lifecycle of nanomaterials, while accounting for data and modeling uncertainties. Accordingly, researchers will need to establish standardized data management and analysis tools through nanoinformatics as a basis for the development of rational decision tools.

  13. Comparative hazard analysis and toxicological modeling of diverse nanomaterials using the embryonic zebrafish (EZ) metric of toxicity

    DOE PAGES Beta

    Harper, Bryan; Thomas, Dennis G.; Chikkagoudar, Satish; Baker, Nathan A.; Tang, Kaizhi; Heredia-Langner, Alejandro; Lins, Roberto D.; Harper, Stacey

    2015-06-04

    The integration of rapid assays, large data sets, informatics and modeling can overcome current barriers in understanding nanomaterial structure-toxicity relationships by providing a weight-of-the-evidence mechanism to generate hazard rankings for nanomaterials. Here we present the use of a rapid, low-cost assay to perform screening-level toxicity evaluations of nanomaterials in vivo. Calculated EZ Metric scores, a combined measure of morbidity and mortality, were established at realistic exposure levels and used to develop a predictive model of nanomaterial toxicity. Hazard ranking and clustering analysis of 68 diverse nanomaterials revealed distinct patterns of toxicity related to both core composition and outermost surface chemistry of nanomaterials. The resulting clusters guided the development of a predictive model of gold nanoparticle toxicity to embryonic zebrafish. In addition, our findings suggest that risk assessments based on the size and core composition of nanomaterials alone may be wholly inappropriate, especially when considering complex engineered nanomaterials. These findings reveal the need to expeditiously increase the availability of quantitative measures of nanomaterial hazard and broaden the sharing of that data and knowledge to support predictive modeling. In addition, research should continue to focus on methodologies for developing predictive models of nanomaterial hazard based on sub-lethal responses to low dose exposures.
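
    The hazard ranking and clustering step can be sketched with standard tools (Python with scipy). The five-material feature matrix below is fabricated, and the chosen features are assumptions, not the study's descriptors.

        import numpy as np
        from scipy.cluster.hierarchy import linkage, fcluster

        # Rows: nanomaterials; columns: hypothetical (EZ Metric score,
        # normalized size, surface-charge encoding).
        features = np.array([
            [0.9, 0.1, 0.8],
            [0.8, 0.2, 0.9],
            [0.1, 0.7, 0.2],
            [0.2, 0.8, 0.1],
            [0.5, 0.5, 0.5],
        ])

        tree = linkage(features, method="average", metric="euclidean")
        print(fcluster(tree, t=2, criterion="maxclust"))  # two hazard clusters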

  14. Topographic and Remote Sensing Analysis of Explosion Craters in the Nevada Test Site: Application to Mars Hazards

    NASA Astrophysics Data System (ADS)

    Hurtado, J. M.; Chee, Y.; Farr, T. G.

    2004-12-01

    High-resolution LIDAR (LIght Detection and Ranging) data were recently acquired for several explosion craters at the Nevada Test Site (NTS) for the purpose of generating high-resolution (1- to 2-m horizontal posting; 50-cm vertical accuracy) digital elevation models (DEMs). The LIDAR DEMs and ancillary datasets, including ASTER visible-IR images and SIR-C radar images, are to be used in the study of cratering processes. Among the questions we seek to answer are whether these explosion craters can be used as analogs for impact craters on Mars. To this end, we present an error analysis of the LIDAR DEMs and an assessment of their utility in hazard avoidance applications for landers and rovers. We show preliminary data integration results in which we geologically characterize the multisensor signatures of the craters and their ejecta. These results are interpreted in terms of hazard zones defined by different types of ejecta, rock populations, and topographic characteristics (e.g. slope, aspect, relief). From these, we aim to produce a quantitative hazards map for craters in the NTS that can serve as a model for the type of information LIDAR and imagery can provide to a hazard avoidance system. All data and derived products are being manipulated and cataloged in a geographic information system (GIS), which will facilitate the packaging and distribution of materials via the Planetary Data System or other community data repository. Work performed under contract to NASA.
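
    Two of the topographic ingredients named above (slope and local relief) follow directly from a gridded DEM. A minimal sketch (Python with numpy; the DEM here is a synthetic bump standing in for the 1- to 2-m LIDAR product):

        import numpy as np

        cell = 2.0                                   # grid spacing (m), assumed
        yy, xx = np.mgrid[0:200, 0:200]
        dem = 5.0 * np.exp(-((xx - 100) ** 2 + (yy - 100) ** 2) / 800.0)

        dzdy, dzdx = np.gradient(dem, cell)
        slope_deg = np.degrees(np.arctan(np.hypot(dzdx, dzdy)))

        w = 10                                       # relief window (cells)
        blocks = dem.reshape(200 // w, w, 200 // w, w)
        relief = blocks.max(axis=(1, 3)) - blocks.min(axis=(1, 3))

        print(f"max slope {slope_deg.max():.1f} deg, "
              f"max local relief {relief.max():.2f} m")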

  16. The value of integrating information from multiple hazards for flood risk analysis and management

    NASA Astrophysics Data System (ADS)

    Castillo-Rodríguez, J. T.; Escuder-Bueno, I.; Altarejos-García, L.; Serrano-Lombillo, A.

    2014-02-01

    This article presents a methodology for estimating flood risk in urban areas by integrating pluvial flooding, river flooding and the failure of both small and large dams. The first part includes a review of basic concepts on flood risk analysis, evaluation and management. Flood risk analyses may be developed at local, regional and national level; however, a general methodology to perform a quantitative flood risk analysis that includes different flood hazards is still required. The second part describes the proposed methodology, which presents an integrated approach combining pluvial flooding, river flooding and flooding from dam failure, as applied to a case study: an urban area located downstream of a dam under construction. The methodology enhances the approach developed within the SUFRI project ("Sustainable Strategies of Urban Flood Risk Management to cope with the residual risk", 2009-2011). This article also shows how outcomes from flood risk analysis provide better and more complete information to inform authorities, local entities and the stakeholders involved in decision-making with regard to flood risk management.
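
    At its simplest, the integration amounts to a probability-weighted sum of consequences over scenarios from each flood source. A toy sketch (Python; all scenario numbers are invented for illustration):

        scenarios = [
            # (flood source, annual probability, damage in M EUR) -- assumed
            ("pluvial",    1 / 10,     2.0),
            ("river",      1 / 100,   25.0),
            ("small dam",  1 / 1000,  60.0),
            ("large dam",  1 / 10000, 400.0),
        ]

        total = sum(p * d for _, p, d in scenarios)
        print(f"expected annual damage ≈ {total:.2f} M EUR")
        for name, p, d in scenarios:
            print(f"  {name:10s} contributes {p * d:.3f} M EUR/yr")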

  17. A quantitative analysis of the F18 flight control system

    NASA Technical Reports Server (NTRS)

    Doyle, Stacy A.; Dugan, Joanne B.; Patterson-Hine, Ann

    1993-01-01

    This paper presents an informal quantitative analysis of the F18 flight control system (FCS). The analysis technique combines a coverage model with a fault tree model. To demonstrate the method's extensive capabilities, we replace the fault tree with a digraph model of the F18 FCS, the only model available to us. The substitution shows that while digraphs have primarily been used for qualitative analysis, they can also be used for quantitative analysis. Based on our assumptions and the particular failure rates assigned to the F18 FCS components, we show that coverage does have a significant effect on the system's reliability and thus it is important to include coverage in the reliability analysis.
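
    The effect of coverage can be illustrated with a toy duplex model (Python; rates and mission time are invented, not F18 data): the system fails if the first fault is not covered, or if both units fail.

        import math

        def duplex_unreliability(rate, coverage, t):
            p = 1.0 - math.exp(-rate * t)    # single-unit failure probability
            return (1.0 - coverage) * p + coverage * p * p

        for c in (1.0, 0.99, 0.9):
            print(f"coverage={c:.2f}: P(fail) ≈ "
                  f"{duplex_unreliability(1e-4, c, 10.0):.2e}")

    Even 1% imperfect coverage dominates the failure probability in this toy model, which is the qualitative point the paper examines quantitatively.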

  18. Occupational hazards control of hazardous substances in clean room of semiconductor manufacturing plant using CFD analysis.

    PubMed

    Li, Jianfeng; Zhou, Ya-Fei

    2015-02-01

    The manufacturing processes in chip industries are complex, and many kinds of raw materials and solvents of different natures are used, most of which are highly toxic and dangerous. During machine preventive maintenance periods, these toxic and harmful substances escape from the sealed reaction chamber into the clean workshop environment and endanger the health of on-site workers, resulting in occupational diseases. From the perspective of prevention, the spread and prediction of hydrochloric acid (HCl) escaping from the metal-etching chamber during maintenance were studied in this article. Computational fluid dynamics (CFD) technology was used for a three-dimensional numerical simulation of the indoor air velocity field and the HCl concentration field, and the simulation results were then compared with on-site monitoring data to verify their correctness and feasibility. The occupational hazards and control measures were analyzed based on the numerical simulation, and the optimal control measure was obtained. Using ambient air methods to analyze occupational exposure can provide a new approach for occupational health research in the integrated circuit industry and has both theoretical and practical significance.

  19. Hazard analysis for 300 Area N Reactor Fuel Fabrication and Storage Facility

    SciTech Connect

    Johnson, D.J.; Brehm, J.R.

    1994-01-25

    This hazard analysis (HA) has been prepared for the 300 Area N Reactor Fuel Fabrication and Storage Facility (Facility), in compliance with the requirements of Westinghouse Hanford Company (Westinghouse Hanford) controlled manual WHC-CM-4-46, Nonreactor Facility Safety Analysis Manual, and to the direction of WHC-IP-0690, Safety Analysis and Regulation Desk Instructions, (WHC 1992). An HA identifies potentially hazardous conditions in a facility and the associated potential accident scenarios. Unlike the Facility hazard classification documented in WHC-SD-NR-HC-004, Hazard Classification for 300 Area N Reactor Fuel Fabrication and Storage Facility, (Huang 1993), which is based on unmitigated consequences, credit is taken in an HA for administrative controls or engineered safety features planned or in place. The HA is the foundation for the accident analysis. The significant event scenarios identified by this HA will be further evaluated in a subsequent accident analysis.

  20. Princeton Plasma Physics Laboratory (PPPL) seismic hazard analysis

    SciTech Connect

    Savy, J.

    1989-10-01

    New design and evaluation guidelines for Department of Energy facilities subjected to natural phenomena hazards are being finalized. Although still in draft form at this time, the document describing those guidelines should be considered an update of previously available guidelines. The recommendations in the guidelines document mentioned above, simply referred to as "the guidelines" hereafter, are based on the best information available at the time of its development. In particular, the seismic hazard model for the Princeton site was based on a study performed in 1981 for Lawrence Livermore National Laboratory (LLNL), which relied heavily on the results of the NRC's Systematic Evaluation Program and was based on a methodology and data sets developed in 1977 and 1978. Considerable advances have been made in the last ten years in the domain of seismic hazard modeling. Thus, it is recommended to update the estimate of the seismic hazard at the DOE sites whenever possible. The major differences between previous estimates and the ones proposed in this study for the PPPL are in the modeling of the strong ground motion at the site, and in the treatment of the total uncertainty in the estimates to include knowledge uncertainty, random uncertainty, and expert opinion diversity as well. 28 refs.

  1. Reliability analysis of common hazardous waste treatment processes

    SciTech Connect

    Waters, R.D.

    1993-05-01

    Five hazardous waste treatment processes are analyzed probabilistically using Monte Carlo simulation to elucidate the relationships between process safety factors and reliability levels. The treatment processes evaluated are packed tower aeration, reverse osmosis, activated sludge, upflow anaerobic sludge blanket, and activated carbon adsorption.
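
    The relationship between a design safety factor and reliability can be sketched with a small Monte Carlo experiment (Python with numpy). The lognormal performance variability and its coefficient of variation are illustrative assumptions, not the paper's fitted values.

        import numpy as np

        rng = np.random.default_rng(42)

        def reliability(safety_factor, cv=0.5, n=100_000):
            # Mean effluent sits below the (normalized) limit of 1.0 by the
            # safety factor; day-to-day performance varies lognormally.
            sigma = np.sqrt(np.log(1.0 + cv ** 2))
            mu = np.log(1.0 / safety_factor) - sigma ** 2 / 2.0
            return (rng.lognormal(mu, sigma, n) <= 1.0).mean()

        for sf in (1.0, 1.5, 2.0, 3.0):
            print(f"safety factor {sf:.1f}: reliability ≈ {reliability(sf):.3f}")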

  2. Quantitative methods for the analysis of zoosporic fungi.

    PubMed

    Marano, Agostina V; Gleason, Frank H; Bärlocher, Felix; Pires-Zottarelli, Carmen L A; Lilje, Osu; Schmidt, Steve K; Rasconi, Serena; Kagami, Maiko; Barrera, Marcelo D; Sime-Ngando, Télesphore; Boussiba, Sammy; de Souza, José I; Edwards, Joan E

    2012-04-01

    Quantitative estimations of zoosporic fungi in the environment have historically received little attention, primarily due to methodological challenges and their complex life cycles. Conventional methods for quantitative analysis of zoosporic fungi have to date mainly relied on direct observation and baiting techniques, with subsequent fungal identification in the laboratory using morphological characteristics. Although these methods are still fundamentally useful, there has been an increasing preference for quantitative microscopic methods based on staining with fluorescent dyes, as well as the use of hybridization probes. More recently, however, PCR-based methods for profiling and quantification (semi-quantitative and absolute) have proven to be rapid and accurate diagnostic tools for assessing zoosporic fungal assemblages in environmental samples. Further application of next-generation sequencing technologies will not only advance our quantitative understanding of zoosporic fungal ecology, but also of their function, through the analysis of their genomes and gene expression as resources and databases expand in the future. Nevertheless, it is still necessary to complement these molecular-based approaches with cultivation-based methods in order to gain a fuller quantitative understanding of the ecological and physiological roles of zoosporic fungi.
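
    The absolute qPCR quantification mentioned above rests on a standard curve of Ct against log10 template copies. A minimal sketch (Python with numpy; the standard-curve values are fabricated):

        import numpy as np

        log_copies = np.array([3.0, 4.0, 5.0, 6.0, 7.0])
        ct = np.array([31.2, 27.9, 24.5, 21.1, 17.8])   # assumed standards

        slope, intercept = np.polyfit(log_copies, ct, 1)
        efficiency = 10.0 ** (-1.0 / slope) - 1.0
        print(f"slope {slope:.2f}, amplification efficiency ≈ {efficiency:.1%}")

        unknown_ct = 25.3
        print(f"estimated copies: {10 ** ((unknown_ct - intercept) / slope):.2e}")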

  3. CUMULATIVE RISK ASSESSMENT: GETTING FROM TOXICOLOGY TO QUANTITATIVE ANALYSIS

    EPA Science Inventory

    INTRODUCTION: GETTING FROM TOXICOLOGY TO QUANTITATIVE ANALYSIS FOR CUMULATIVE RISK

    Hugh A. Barton (1) and Carey N. Pope (2)
    (1) US EPA, Office of Research and Development, National Health and Environmental Effects Research Laboratory, Research Triangle Park, NC
    (2) Department of...

  4. Early Child Grammars: Qualitative and Quantitative Analysis of Morphosyntactic Production

    ERIC Educational Resources Information Center

    Legendre, Geraldine

    2006-01-01

    This article reports on a series of 5 analyses of spontaneous production of verbal inflection (tense and person-number agreement) by 2-year-olds acquiring French as a native language. A formal analysis of the qualitative and quantitative results is developed using the unique resources of Optimality Theory (OT; Prince & Smolensky, 2004). It is…

  5. Quantitative transverse flow measurement using OCT speckle decorrelation analysis

    PubMed Central

    Liu, Xuan; Huang, Yong; Ramella-Roman, Jessica C.; Mathews, Scott A.; Kang, Jin U.

    2014-01-01

    We propose an inter-Ascan speckle decorrelation based method that can quantitatively assess blood flow normal to the direction of the OCT imaging beam. To validate this method, we performed a systematic study using both phantom and in vivo animal models. Results show that our speckle analysis method can accurately extract transverse flow speed with high spatial and temporal resolution. PMID:23455305
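
    The core measurement reduces to the normalized cross-correlation between successive A-scans; its complement rises with transverse speed and is mapped to flow via calibration. A toy sketch (Python with numpy; the synthetic signals stand in for real A-scan pairs):

        import numpy as np

        rng = np.random.default_rng(0)

        def decorrelation(a, b):
            a = a - a.mean()
            b = b - b.mean()
            rho = (a * b).sum() / np.sqrt((a ** 2).sum() * (b ** 2).sum())
            return 1.0 - rho

        static = rng.normal(size=512)
        noisy = static + 0.01 * rng.normal(size=512)        # no flow
        moving = 0.7 * static + 0.3 * rng.normal(size=512)  # partial decorrelation

        print(f"static pair:  {decorrelation(static, noisy):.3f}")
        print(f"flowing pair: {decorrelation(static, moving):.3f}")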

  7. Large-scale experiments for the vulnerability analysis of buildings impacted and intruded by fluviatile torrential hazard processes

    NASA Astrophysics Data System (ADS)

    Sturm, Michael; Gems, Bernhard; Fuchs, Sven; Mazzorana, Bruno; Papathoma-Köhle, Maria; Aufleger, Markus

    2016-04-01

    In European mountain regions, losses due to torrential hazards are still considerably high, despite the ongoing debate on an overall increasing or decreasing trend. Recent events in Austria clearly revealed that, for technical and economic reasons, an overall protection of settlements in the alpine environment against torrential hazards is not feasible. On the hazard process side, events with unpredictable intensities may represent overload scenarios for existing protection structures in the torrent catchments. They bear a particular risk of significant losses in the living space. Although the importance of vulnerability is widely recognised, there is still a research gap concerning its assessment. Currently, potential losses at buildings due to torrential hazards, and their comparison with reinstatement costs, are determined by the use of empirical functions, i.e. relations between process intensities and the extent of losses gathered from the analysis of historic hazard events and object-specific restoration values. This approach is not physics-based and integral, since relevant and often crucial processes, such as the intrusion of the fluid-sediment mixture into elements at risk, are not considered. Based on these findings, our work is targeted at extending the findings and models of present risk research in the context of an integral, more physics-based vulnerability analysis concept. Fluviatile torrential hazard processes and their impacts on the building envelope are experimentally modelled. Material intrusion processes are thereby explicitly considered. Dynamic impacts are gathered quantitatively, with spatial distribution, by the use of a large set of force transducers. The experimental tests are accomplished with artificial, vertical and skewed plates, including openings for material intrusion. Further, the impacts on specific buildings within the test site of the work, the fan apex of the Schnannerbach

  8. Quantitating the subtleties of microglial morphology with fractal analysis

    PubMed Central

    Karperien, Audrey; Ahammer, Helmut; Jelinek, Herbert F.

    2013-01-01

    It is well established that microglial form and function are inextricably linked. In recent years, the traditional view that microglial form ranges between “ramified resting” and “activated amoeboid” has been emphasized through advancing imaging techniques that point to microglial form being highly dynamic even within the currently accepted morphological categories. Moreover, microglia adopt meaningful intermediate forms between categories, with considerable crossover in function and varying morphologies as they cycle, migrate, wave, phagocytose, and extend and retract fine and gross processes. From a quantitative perspective, it is problematic to measure such variability using traditional methods, but one way of quantitating such detail is through fractal analysis. The techniques of fractal analysis have been used for quantitating microglial morphology, to categorize gross differences but also to differentiate subtle differences (e.g., amongst ramified cells). Multifractal analysis in particular is one technique of fractal analysis that may be useful for identifying intermediate forms. Here we review current trends and methods of fractal analysis, focusing on box counting analysis, including lacunarity and multifractal analysis, as applied to microglial morphology. PMID:23386810
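    Box counting, the method highlighted in the review, estimates a fractal dimension as the slope of log N(s) against log(1/s), where N(s) is the number of boxes of side s containing part of the cell silhouette. A minimal sketch on a synthetic binary image (real analyses would use segmented microglia):

```python
import numpy as np

def box_count_dimension(img: np.ndarray) -> float:
    """Estimate the box-counting dimension of a binary image as the
    slope of log N(s) versus log(1/s)."""
    sizes = [2, 4, 8, 16, 32]
    counts = []
    for s in sizes:
        # Trim so the image tiles exactly, then count boxes that
        # contain at least one foreground pixel.
        h, w = (img.shape[0] // s) * s, (img.shape[1] // s) * s
        tiles = img[:h, :w].reshape(h // s, s, w // s, s)
        counts.append(np.count_nonzero(tiles.any(axis=(1, 3))))
    slope, _ = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)
    return slope

# Toy example: a filled square should come out with dimension ~2.
cell = np.zeros((128, 128), dtype=bool)
cell[32:96, 32:96] = True
print(f"Estimated fractal dimension: {box_count_dimension(cell):.2f}")
```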

  9. Quantitative analysis of regional myocardial performance in coronary artery disease

    NASA Technical Reports Server (NTRS)

    Stewart, D. K.; Dodge, H. T.; Frimer, M.

    1975-01-01

    Findings from a group of subjects with significant coronary artery stenosis and from a group of controls, determined by use of a quantitative method for the study of regional myocardial performance based on the frame-by-frame analysis of biplane left ventricular angiograms, are presented. Particular emphasis was placed upon the analysis of wall motion in terms of normalized segment dimensions, timing, and velocity of contraction. The results were compared with the method of subjective assessment used clinically.

  10. Improved method and apparatus for chromatographic quantitative analysis

    DOEpatents

    Fritz, J.S.; Gjerde, D.T.; Schmuckler, G.

    An improved apparatus and method are described for the quantitative analysis of a solution containing a plurality of anion species by ion exchange chromatography, which utilizes a single eluent and a single ion exchange bed that does not require periodic regeneration. The solution containing the anions is added to an anion exchange resin bed which is a low capacity macroreticular polystyrene-divinylbenzene resin containing quaternary ammonium functional groups, and is eluted therefrom with a dilute solution of a low electrical conductance organic acid salt. As each anion species is eluted from the bed, it is quantitatively sensed by conventional detection means such as a conductivity cell.

  11. Quantitative analysis of culture using millions of digitized books

    PubMed Central

    Michel, Jean-Baptiste; Shen, Yuan Kui; Aiden, Aviva P.; Veres, Adrian; Gray, Matthew K.; Pickett, Joseph P.; Hoiberg, Dale; Clancy, Dan; Norvig, Peter; Orwant, Jon; Pinker, Steven; Nowak, Martin A.; Aiden, Erez Lieberman

    2011-01-01

    We constructed a corpus of digitized texts containing about 4% of all books ever printed. Analysis of this corpus enables us to investigate cultural trends quantitatively. We survey the vast terrain of ‘culturomics’, focusing on linguistic and cultural phenomena that were reflected in the English language between 1800 and 2000. We show how this approach can provide insights about fields as diverse as lexicography, the evolution of grammar, collective memory, the adoption of technology, the pursuit of fame, censorship, and historical epidemiology. ‘Culturomics’ extends the boundaries of rigorous quantitative inquiry to a wide array of new phenomena spanning the social sciences and the humanities. PMID:21163965
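    The core operation behind culturomics-style trend plots is simple: count a token's occurrences per year-corpus and normalize by corpus size. A toy sketch (the two "year corpora" below are invented stand-ins for n-gram counts from millions of books):

```python
from collections import Counter

# Invented miniature corpora, one token list per year.
corpora = {
    1900: "the war of the worlds and the great war".split(),
    2000: "the web of data and the great network".split(),
}

def relative_frequency(word: str, tokens: list) -> float:
    """Occurrences of `word` per token in the year's corpus."""
    return Counter(tokens)[word] / len(tokens)

for year, tokens in sorted(corpora.items()):
    print(year, f"freq('war') = {relative_frequency('war', tokens):.3f}")
```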

  14. Chemical hazards analysis of resilient flooring for healthcare.

    PubMed

    Lent, Tom; Silas, Julie; Vallette, Jim

    2010-01-01

    This article addresses resilient flooring, evaluating the potential health effects of vinyl flooring and the leading alternatives-synthetic rubber, polyolefin, and linoleum-currently used in the healthcare marketplace. The study inventories chemicals incorporated as components of each of the four material types or involved in their life cycle as feedstocks, intermediary chemicals, or emissions. It then characterizes those chemicals using a chemical hazard-based framework that addresses persistence and bioaccumulation, human toxicity, and human exposures. PMID:21165873

  15. Quantitative mass spectrometric analysis of glycoproteins combined with enrichment methods.

    PubMed

    Ahn, Yeong Hee; Kim, Jin Young; Yoo, Jong Shin

    2015-01-01

    Mass spectrometry (MS) has been a core technology for highly sensitive and high-throughput analysis of the enriched glycoproteome, in terms of quantitative assays as well as qualitative profiling of glycoproteins. Because it is widely recognized that aberrant glycosylation of a glycoprotein may be involved in the progression of a certain disease, the development of efficient analysis tools for aberrant glycoproteins is very important for a deep understanding of the pathological function of the glycoprotein and for new biomarker development. This review first describes the protein glycosylation-targeting enrichment technologies, mainly employing solid-phase extraction methods such as hydrazide capturing, lectin-specific capturing, and affinity separation techniques based on porous graphitized carbon, hydrophilic interaction chromatography, or immobilized boronic acid. Second, MS-based quantitative analysis strategies coupled with these enrichment technologies, using label-free MS, stable isotope labeling, or targeted multiple reaction monitoring (MRM) MS, are summarized with recent published studies. PMID:24889823

  17. RiskChanges Spatial Decision Support system for the analysis of changing multi-hazard risk

    NASA Astrophysics Data System (ADS)

    van Westen, Cees; Zhang, Kaixi; Bakker, Wim; Andrejchenko, Vera; Berlin, Julian; Olyazadeh, Roya; Cristal, Irina

    2015-04-01

    quantitative analysis (using different hazard types, temporal scenarios, and vulnerability curves) resulting in risk curves. The platform does not include a component to calculate hazard maps; existing hazard maps are used as input data for the risk component. The second component of the SDSS is a risk reduction planning component, which forms the core of the platform. This component includes the definition of risk reduction alternatives (related to disaster response planning, risk reduction measures, and spatial planning), links back to the risk assessment module to calculate the new level of risk if a measure is implemented, and provides a cost-benefit (or cost-effectiveness / Spatial Multi-Criteria Evaluation) component to compare the alternatives and decide on the optimal one. The third component of the SDSS is a temporal scenario component, which allows the user to define future scenarios in terms of climate change, land use change, and population change, and the time periods for which these scenarios will be made. The component does not generate these scenarios itself but uses input maps for the effect of the scenarios on the hazard and assets maps. The last component is a communication and visualization component, which can compare scenarios and alternatives, not only in the form of maps, but also in other forms (risk curves, tables, graphs).
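    The cost-benefit step can be sketched as follows: annualize the risk curves computed with and without a risk reduction alternative, then compare the annual benefit with the annualized cost of the measure. All curve points and costs below are illustrative assumptions:

```python
import numpy as np

def expected_annual_loss(prob: np.ndarray, loss: np.ndarray) -> float:
    """Area under the loss vs. annual-exceedance-probability curve
    (trapezoidal rule; prob must be ascending)."""
    return float(np.sum(0.5 * (loss[1:] + loss[:-1]) * np.diff(prob)))

p = np.array([0.002, 0.01, 0.02, 0.10])          # annual exceedance prob.
loss_now = np.array([40e6, 12e6, 5e6, 1e6])      # current losses [EUR]
loss_after = np.array([25e6, 6e6, 2e6, 0.2e6])   # with the measure in place

benefit = expected_annual_loss(p, loss_now) - expected_annual_loss(p, loss_after)
annual_cost = 150_000                            # assumed annualized cost of measure
print(f"benefit/cost = {benefit / annual_cost:.2f}")
```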

  18. Assessing the long-term probabilistic volcanic hazard for tephra fallout in Reykjavik, Iceland: a preliminary multi-source analysis

    NASA Astrophysics Data System (ADS)

    Tonini, Roberto; Barsotti, Sara; Sandri, Laura; Tumi Guðmundsson, Magnús

    2015-04-01

    Icelandic volcanism is largely dominated by basaltic magma. Nevertheless, the presence of glaciers over many Icelandic volcanic systems results in frequent phreatomagmatic eruptions and associated tephra production, making explosive eruptions the most common type of volcanic activity. Jökulhlaups are commonly considered the major volcanic hazard in Iceland for their high frequency and potentially very devastating local impact. Tephra fallout is also frequent and can impact larger areas. It is driven by the wind direction, which can change with both altitude and season, making it impossible to predict a priori where the tephra will be deposited during the next eruptions. Most of the volcanic activity in Iceland occurs in the central eastern part, over 100 km to the east of the main population centre around the capital Reykjavík. Therefore, the hazard from tephra fallout in Reykjavík is expected to be smaller than for communities settled near the main volcanic systems. However, within the framework of quantitative hazard and risk analyses, less frequent and/or less intense phenomena should not be neglected, since their risk evaluation depends on the effects suffered by the selected target. This is particularly true if the target is highly vulnerable, such as large urban areas or important infrastructures. In this work we present a preliminary analysis aiming to perform a Probabilistic Volcanic Hazard Assessment (PVHA) for tephra fallout, focused on a target area which includes the municipality of Reykjavík and the Keflavík international airport. This approach inverts the more common perspective where the hazard analysis is focused on the source (the volcanic system) and follows a multi-source approach: indeed, the idea is to quantify, homogeneously, the hazard due to the main hazardous volcanoes that could pose a tephra fallout threat to the municipality of Reykjavík and the Keflavík airport. PVHA for each volcanic system is calculated independently and the results

  19. Sensitivity analysis for Probabilistic Tsunami Hazard Assessment (PTHA)

    NASA Astrophysics Data System (ADS)

    Spada, M.; Basili, R.; Selva, J.; Lorito, S.; Sorensen, M. B.; Zonker, J.; Babeyko, A. Y.; Romano, F.; Piatanesi, A.; Tiberti, M.

    2012-12-01

    In modern societies, probabilistic hazard assessment of natural disasters is commonly used by decision makers for designing regulatory standards and, more generally, for prioritizing risk mitigation efforts. Systematic formalization of Probabilistic Tsunami Hazard Assessment (PTHA) has started only in recent years, mainly following the giant tsunami disaster of Sumatra in 2004. Typically, PTHA for earthquake sources exploits the long-standing practices developed in probabilistic seismic hazard assessment (PSHA), even though important differences are evident. In PTHA, for example, it is known that far-field sources are more important and that physical models for tsunami propagation are needed for the highly non-isotropic propagation of tsunami waves. However, considering the high impact that PTHA may have on societies, an important effort to quantify the effect of specific assumptions should be performed. Indeed, specific standard hypotheses made in PSHA may prove inappropriate for PTHA, since tsunami waves are sensitive to different aspects of sources (e.g. fault geometry, scaling laws, slip distribution) and propagate differently. In addition, the necessity of running an explicit calculation of wave propagation for every possible event (tsunami scenario) forces analysts to find strategies for diminishing the computational burden. In this work, we test the sensitivity of hazard results with respect to several assumptions that are peculiar to PTHA and others that are commonly accepted in PSHA. Our case study is located in the central Mediterranean Sea and considers the Western Hellenic Arc as the earthquake source, with Crete and Eastern Sicily as near-field and far-field target coasts, respectively. Our suite of sensitivity tests includes: a) comparison of random seismicity distribution within area sources as opposed to systematically distributed ruptures on fault sources; b) effects of statistical and physical parameters (a- and b-value, Mc, Mmax, scaling laws
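    One of the listed sensitivity tests, miniaturized: vary the Gutenberg-Richter b-value in a truncated magnitude-frequency relation and watch the annual rate of large tsunamigenic earthquakes respond. The a, b, and Mmax values below are illustrative assumptions, not the study's:

```python
def annual_rate_above(m, a=4.5, b=1.0, m_min=5.0, m_max=8.5):
    """Annual rate of events >= m from a truncated Gutenberg-Richter law."""
    if m >= m_max:
        return 0.0
    n_min = 10 ** (a - b * m_min)                  # rate of events >= m_min
    frac = ((10 ** (-b * m) - 10 ** (-b * m_max))
            / (10 ** (-b * m_min) - 10 ** (-b * m_max)))
    return n_min * frac

for b in (0.8, 1.0, 1.2):
    print(f"b = {b}: N(M >= 7.5) = {annual_rate_above(7.5, b=b):.4f} per year")
```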

  20. Earthquake Hazard Mitigation Using a Systems Analysis Approach to Risk Assessment

    NASA Astrophysics Data System (ADS)

    Legg, M.; Eguchi, R. T.

    2015-12-01

    The earthquake hazard mitigation goal is to reduce losses due to severe natural events. The first step is to conduct a Seismic Risk Assessment consisting of: 1) hazard estimation, 2) vulnerability analysis, and 3) exposure compilation. Seismic hazards include ground deformation, shaking, and inundation. The hazard estimation may be probabilistic or deterministic. Probabilistic Seismic Hazard Assessment (PSHA) is generally applied to site-specific risk assessments, but may involve large areas, as in a National Seismic Hazard Mapping program. Deterministic hazard assessments are needed for geographically distributed exposure such as lifelines (infrastructure), but may also be important for large communities. Vulnerability evaluation includes quantification of fragility for construction or components, including personnel. Exposure represents the existing or planned construction, facilities, infrastructure, and population in the affected area. Risk (expected loss) is the product of the quantified hazard, vulnerability (damage algorithm), and exposure, and may be used to prepare emergency response plans, retrofit existing construction, or guide community planning to avoid hazards. The risk estimate provides data needed to acquire earthquake insurance to assist with effective recovery following a severe event. Earthquake scenarios used in deterministic risk assessments provide detailed information on where hazards may be most severe, which system components are most susceptible to failure, and on the combined effects of a severe earthquake on the whole system or community. Casualties (injuries and deaths) have been the primary factor in defining building codes for seismic-resistant construction. Economic losses may be equally significant factors that can influence proactive hazard mitigation. Large urban earthquakes may produce catastrophic losses due to a cascading of effects often missed in PSHA. Economic collapse may ensue if damaged workplaces, disruption of utilities, and
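    The risk equation stated in the abstract (risk = hazard x vulnerability x exposure) is directly executable; a sketch with invented scenario probabilities, damage ratios, and exposure values:

```python
# Expected annual loss as the sum over scenarios and asset classes of
# hazard probability x damage ratio x exposed value. All inputs invented.
scenarios = [
    (0.01, "strong"),    # (annual probability, shaking intensity label)
    (0.002, "severe"),
]
damage_ratio = {  # vulnerability: fraction of value lost per intensity
    ("building", "strong"): 0.05, ("building", "severe"): 0.30,
    ("lifeline", "strong"): 0.10, ("lifeline", "severe"): 0.50,
}
exposure = {"building": 800e6, "lifeline": 200e6}  # replacement value [$]

expected_annual_loss = sum(
    p * damage_ratio[(asset, label)] * exposure[asset]
    for (p, label) in scenarios
    for asset in exposure
)
print(f"Expected annual loss: ${expected_annual_loss / 1e6:.2f}M")
```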

  1. Some selected quantitative methods of thermal image analysis in Matlab.

    PubMed

    Koprowski, Robert

    2016-05-01

    The paper presents a new algorithm based on selected automatic quantitative methods for analysing thermal images. It shows the practical implementation of these image analysis methods in Matlab, enabling fully automated and reproducible measurements of selected parameters in thermal images. The paper also shows two examples of the use of the proposed image analysis methods, for the area of the skin of a human foot and of a face. The full source code of the developed application is also provided as an attachment. (Graphical abstract: the main window of the program during dynamic analysis of the foot thermal image.) PMID:26556680

  3. Data from quantitative label free proteomics analysis of rat spleen.

    PubMed

    Dudekula, Khadar; Le Bihan, Thierry

    2016-09-01

    The dataset presented in this work has been obtained using a label-free quantitative proteomic analysis of rat spleen. A robust method for the extraction of proteins from rat spleen tissue and LC-MS-MS analysis was developed using a urea- and SDS-based buffer. Different fractionation methods were compared. A total of 3484 different proteins were identified from the pool of all experiments run in this study (a total of 2460 proteins with at least two peptides). A total of 1822 proteins were identified from nine non-fractionated pulse gels, while 2288 and 2864 proteins were identified by SDS-PAGE fractionation into three and five fractions, respectively. The proteomics data are deposited in the ProteomeXchange Consortium via PRIDE (PXD003520); Progenesis and MaxQuant output are presented in the supporting information. The list of proteins generated under the different fractionation regimes allows assessing the nature of the identified proteins and the variability in the quantitative analysis associated with the different sampling strategies, and allows defining a proper number of replicates for future quantitative analyses. PMID:27358910

  4. Hazard analysis and critical control point systems in the United States Department of Agriculture regulatory policy.

    PubMed

    Billy, T J; Wachsmuth, I K

    1997-08-01

    Recent outbreaks of foodborne illness and studies by expert groups have established the need for fundamental change in the United States meat and poultry inspection programme to reduce the risk of foodborne illness. The Food Safety and Inspection Service (FSIS) of the United States Department of Agriculture (USDA) has embarked on a broad effort to bring about such change, with particular emphasis on the reduction of pathogenic micro-organisms in raw meat and poultry products. The publication on 25 July 1996 of the Final Rule on pathogen reduction and hazard analysis and critical control point (HACCP) systems was a major milestone in the FSIS strategy for change. The Final Rule provides a framework for change and clarifies the respective roles of industry and government in ensuring the safety of meat and poultry products. With the implementation of this Final Rule underway, the FSIS has been exploring ways in which slaughter inspection carried out under an HACCP-based system can be changed so that food safety risks are addressed more adequately and the allocation of inspection resources is improved further. In addition, the FSIS is broadening the focus of food safety activities to extend beyond slaughter and processing plants by working with industry, academia and other government agencies. Such co-operation should lead to the development of measures to improve food safety before animals reach the slaughter plant and after products leave the inspected establishment for distribution to the retail level. For the future, the FSIS believes that quantitative risk assessments will be at the core of food safety activities. Risk assessments provide the most effective means of identifying how specific pathogens and other hazards may be encountered throughout the farm-to-table chain and of measuring the potential impact of various interventions. In addition, these assessments will be used in the development and evaluation of HACCP systems. The FSIS is currently conducting a

  5. Single-Molecule Sensors: Challenges and Opportunities for Quantitative Analysis.

    PubMed

    Gooding, J Justin; Gaus, Katharina

    2016-09-12

    Measurement science has been converging to smaller and smaller samples, such that it is now possible to detect single molecules. This Review focuses on the next generation of analytical tools that combine single-molecule detection with the ability to measure many single molecules simultaneously and/or process larger and more complex samples. Such single-molecule sensors constitute a new type of quantitative analytical tool, as they perform analysis by molecular counting and thus potentially capture the heterogeneity of the sample. This Review outlines the advantages and potential of these new, quantitative single-molecule sensors, the measurement challenges in making single-molecule devices suitable for analysis, the inspiration biology provides for overcoming these challenges, and some of the solutions currently being explored.

  7. Evaluating the spatial distribution of quantitative risk and hazard level of arsenic exposure in groundwater, case study of Qorveh County, Kurdistan Iran.

    PubMed

    Nasrabadi, Touraj; Bidabadi, Niloufar Shirani

    2013-01-01

    The regional distribution of quantitative risk and hazard levels due to arsenic poisoning in some parts of Iran's Kurdistan province is considered. To investigate the potential risk and hazard level of arsenic-contaminated drinking water, and further carcinogenic and non-carcinogenic effects on villagers, thirteen wells in rural areas of Qorveh County were considered for evaluation of arsenic concentration in water. The sampling campaign was performed in August 2010, and arsenic concentration was measured via the silver diethyldithiocarbamate method. The highest and lowest arsenic concentrations are reported in Guilaklu and Qezeljakand villages, with 420 and 67 μg/L, respectively. None of the thirteen water samples met the maximum contaminant level issued by the USEPA and the Institute of Standards and Industrial Research of Iran (10 ppb). The highest arsenic concentrations, and consequently risk and hazard levels, belong to villages situated alongside the eastern frontiers of the county. The existence of volcanic activity within the upper Miocene and Pleistocene in this part of the study area may be identified as the main geogenic source of arsenic pollution. Quantitative risk values vary from 1.49E-03 in Qezeljakand to 8.92E-03 in Guilaklu and may be interpreted as very high when compared with similar studies in Iran. Regarding non-carcinogenic effects, all thirteen water samples are considered hazardous, as all calculated chronic daily intakes are greater than the arsenic reference dose. Such a drinking water source has the potential to impose adverse carcinogenic and non-carcinogenic effects on villagers. Accordingly, an urgent decision must be made to substitute the current drinking water source with a safer one.
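    The reported figures can be reproduced in outline with the standard USEPA exposure arithmetic, CDI = C·IR·EF·ED/(BW·AT), cancer risk = CDI·SF, hazard quotient = CDI/RfD. A sketch with textbook-style default parameters, which are assumptions rather than the study's exact inputs:

```python
# Chronic daily intake, cancer risk, and hazard quotient for drinking
# water arsenic. Exposure parameters are assumed defaults, not the
# study's; only C comes from the abstract (highest reported value).
C = 0.420    # arsenic concentration [mg/L], Guilaklu
IR = 2.0     # water ingestion rate [L/day], assumed
EF = 350     # exposure frequency [days/year], assumed
ED = 30      # exposure duration [years], assumed
BW = 70      # body weight [kg], assumed
AT_cancer = 70 * 365      # averaging time for carcinogens [days]
AT_noncancer = ED * 365   # averaging time for non-carcinogens [days]
SF = 1.5     # oral slope factor for arsenic [per mg/kg-day]
RfD = 3.0e-4  # oral reference dose for arsenic [mg/kg-day]

cdi_cancer = C * IR * EF * ED / (BW * AT_cancer)
cdi_noncancer = C * IR * EF * ED / (BW * AT_noncancer)
print(f"cancer risk     = {cdi_cancer * SF:.2e}")    # same order as reported
print(f"hazard quotient = {cdi_noncancer / RfD:.1f}")  # >1, i.e. hazardous
```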

  10. Solar Glare Hazard Analysis Tool v. 3.0

    SciTech Connect

    Ho, Clifford K.; Sims, Cianan A.

    2015-08-25

    SGHAT predicts the occurrence and intensity of glare caused by a user-specified solar panel array when viewed from one or more observation points. An interactive mapping interface is used to determine the latitude, longitude and elevation of the array and observation points. The presence and intensity of glare is then calculated along a given time interval throughout the year, based on the position of the sun. The potential ocular hazard is also reported. The maximum energy production of the solar array is also estimated so that alternative designs can be compared to determine the design that yields the most energy production while mitigating glare.

  11. Quantitative NMR Analysis of Partially Substituted Biodiesel Glycerols

    SciTech Connect

    Nagy, M.; Alleman, T. L.; Dyer, T.; Ragauskas, A. J.

    2009-01-01

    Phosphitylation of hydroxyl groups in biodiesel samples with 2-chloro-4,4,5,5-tetramethyl-1,3,2-dioxaphospholane followed by 31P-NMR analysis provides a rapid quantitative analytical technique for the determination of substitution patterns on partially esterified glycerols. The unique 31P-NMR chemical shift data were established with a series of mono- and di-substituted fatty acid esters of glycerol and then utilized to characterize an industrial sample of partially processed biodiesel.
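    The quantitation step in such 31P-NMR work reduces to integral ratios against an internal standard of known amount, since after phosphitylation each hydroxyl contributes one phosphorus. A generic sketch (integrals and standard amount are invented, not values from the paper):

```python
# Internal-standard quantitation from 31P NMR integrals (sketch).
mol_std = 0.050e-3        # internal standard added [mol], assumed
integral_std = 1.00       # normalized 31P integral of the standard
integrals = {"1-monoglyceride": 2.34, "1,3-diglyceride": 0.87}  # invented

for species, integral in integrals.items():
    mol = mol_std * integral / integral_std   # moles of OH groups observed
    print(f"{species}: {mol * 1e3:.3f} mmol of OH groups")
```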

  12. A quantitative analysis of coupled oscillations using mobile accelerometer sensors

    NASA Astrophysics Data System (ADS)

    Castro-Palacio, Juan Carlos; Velázquez-Abad, Luisberis; Giménez, Fernando; Monsoriu, Juan A.

    2013-05-01

    In this paper, smartphone acceleration sensors were used to perform a quantitative analysis of mechanical coupled oscillations. Symmetric and asymmetric normal modes were studied separately in the first two experiments. In the third, a coupled oscillation was studied as a combination of the normal modes. Results indicate that acceleration sensors of smartphones, which are very familiar to students, represent valuable measurement instruments for introductory and first-year physics courses.
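    The quantities the phone's sensor records follow from the textbook normal-mode analysis: two identical oscillators coupled by a spring have mode frequencies w1 = sqrt(k/m) (symmetric) and w2 = sqrt((k + 2*kc)/m) (asymmetric), and releasing one mass with the other at rest excites both modes equally, producing beats in the acceleration. A sketch with illustrative parameters (not the paper's):

```python
import numpy as np

m, k, kc = 0.2, 30.0, 3.0        # mass [kg], spring and coupling stiffness [N/m]
w1 = np.sqrt(k / m)              # symmetric mode (masses in phase)
w2 = np.sqrt((k + 2 * kc) / m)   # asymmetric mode (masses out of phase)

A = 0.05                         # release amplitude [m], assumed
t = np.linspace(0.0, 10.0, 2000)
# Acceleration of the released oscillator: equal mix of both modes.
accel = -0.5 * A * (w1**2 * np.cos(w1 * t) + w2**2 * np.cos(w2 * t))

print(f"f1 = {w1 / (2*np.pi):.2f} Hz, f2 = {w2 / (2*np.pi):.2f} Hz, "
      f"beat = {(w2 - w1) / (2*np.pi):.2f} Hz, "
      f"peak |a| = {np.abs(accel).max():.2f} m/s^2")
```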

  13. Defining geologic Hazards for natural resources management using tree-ring analysis

    USGS Publications Warehouse

    DeGraff, J.V.; Agard, S.S.

    1984-01-01

    Landslides, avalanches, floods, and other geologic hazards impair natural resources management by jeopardizing public safety, damaging or restricting resource utilization, and necessitating expenditures for corrective measures. The negative impact of geologic hazard events can be reduced by tailoring resources management to the hazard potential of an area. This requires assessment of where and how frequently the events occur. National forests and other managed wildlands often lack monitoring or historical records to compute the frequency of hazard occurrence. Tree-ring analysis, based on internal growth response to external events such as tilting and abrasion, can provide frequency data. Two examples of the use of tree-ring analysis to date landslide activity illustrate advantages and limitations of the technique. An example from the Fishlake National Forest in central Utah illustrates assessment for planning purposes. An example from the Sierra National Forest in east-central California shows assessment applied to project design. Many geologic hazards in addition to landslides are suited to tree-ring analysis to establish frequency of occurrence. Hazard reduction efforts in natural resources management could be enhanced by careful application of tree-ring analysis. © 1984 Springer-Verlag New York Inc.

  14. Fire hazard analysis for Project W-320 Tank 241-C-106 waste retrieval

    SciTech Connect

    Conner, J.C.

    1995-09-12

    This Fire Hazards Analysis (FHA) for Project W-320, `Tank 241-C-106 Waste Retrieval`, addresses fire hazards and fire-related concerns, in accordance with DOE 5480.7A (DOE 1998), resulting from or related to the processes and equipment to be installed or modified under Project W-320, to ensure that: there are no undue fire hazards to site personnel and the public; the potential for the occurrence of a fire is minimized; process control and safety systems are not damaged by fire or related perils; and property damage from fire and related perils does not exceed an acceptable level.

  15. Comprehensive Quantitative Analysis of SQ Injection Using Multiple Chromatographic Technologies.

    PubMed

    Chau, Siu-Leung; Huang, Zhi-Bing; Song, Yan-Gang; Yue, Rui-Qi; Ho, Alan; Lin, Chao-Zhan; Huang, Wen-Hua; Han, Quan-Bin

    2016-01-01

    Quality control of Chinese medicine injections remains a challenge due to our poor knowledge of their complex chemical profiles. This study aims to investigate the chemical composition of one of the best-selling injections, Shenqi Fuzheng (SQ) injection (SQI), via a full component quantitative analysis. A total of 15 representative small molecular components of SQI were simultaneously determined using ultra-high performance liquid chromatography (UHPLC) coupled with quadrupole tandem time-of-flight mass spectrometry (Q-TOF-MS); the saccharide composition of SQI was also quantitatively determined by high performance liquid chromatography (HPLC) with an evaporative light scattering detector (ELSD) on an amino column, before and after acid hydrolysis. The existence of polysaccharides was also examined on a gel permeation chromatography column. The method was well validated in terms of linearity, sensitivity, precision, accuracy and stability, and was successfully applied to analyze 13 SQI samples. The results demonstrate that up to 94.69% (w/w) of this injection product is quantitatively determined, in which small molecules and monosaccharides/sucrose account for 0.18%-0.21% and 53.49%-58.2%, respectively. The quantitative information contributes to accumulating scientific evidence to better understand the efficacy and safety of complex Chinese medicine injections. PMID:27548134

  16. Quantitation of glycerophosphorylcholine by flow injection analysis using immobilized enzymes.

    PubMed

    Mancini, A; Del Rosso, F; Roberti, R; Caligiana, P; Vecchini, A; Binaglia, L

    1996-09-20

    A method for quantitating glycerophosphorylcholine by flow injection analysis is reported in the present paper. Glycerophosphorylcholine phosphodiesterase and choline oxidase, immobilized on controlled porosity glass beads, are packed in a small reactor inserted in a flow injection manifold. When samples containing glycerophosphorylcholine are injected, glycerophosphorylcholine is hydrolyzed into choline and sn-glycerol-3-phosphate. The free choline produced in this reaction is oxidized to betaine and hydrogen peroxide. Hydrogen peroxide is detected amperometrically. Quantitation of glycerophosphorylcholine in samples containing choline and phosphorylcholine is obtained by inserting, ahead of the reactor, a small column packed with a mixed-bed ion exchange resin. The time needed for each determination does not exceed one minute. The present method, applied to quantitate glycerophosphorylcholine in samples of seminal plasma, gave results comparable with those obtained using the standard enzymatic-spectrophotometric procedure. An alternative procedure, making use of co-immobilized glycerophosphorylcholine phosphodiesterase and glycerol-3-phosphate oxidase, for quantitating glycerophosphorylcholine, glycerophosphorylethanolamine and glycerophosphorylserine is also described. PMID:8905629

  17. Quantitative Proteomic Approaches for Analysis of Protein S-Nitrosylation.

    PubMed

    Qu, Zhe; Greenlief, C Michael; Gu, Zezong

    2016-01-01

    S-Nitrosylation is a redox-based post-translational modification of a protein in response to nitric oxide (NO) signaling, and it participates in a variety of processes in diverse biological systems. The significance of this type of protein modification in health and diseases is increasingly recognized. In the central nervous system, aberrant S-nitrosylation, due to excessive NO production, is known to cause protein misfolding, mitochondrial dysfunction, transcriptional dysregulation, and neuronal death. This leads to an altered physiological state and consequently contributes to pathogenesis of neurodegenerative disorders. To date, much effort has been made to understand the mechanisms underlying protein S-nitrosylation, and several approaches have been developed to unveil S-nitrosylated proteins from different organisms. Interest in determining the dynamic changes of protein S-nitrosylation under different physiological and pathophysiological conditions has underscored the need for the development of quantitative proteomic approaches. Currently, both gel-based and gel-free mass spectrometry-based quantitative methods are widely used, and they each have advantages and disadvantages but may also be used together to produce complementary data. This review evaluates current available quantitative proteomic techniques for the analysis of protein S-nitrosylation and highlights recent advances, with emphasis on applications in neurodegenerative diseases. An important goal is to provide a comprehensive guide of feasible quantitative proteomic methodologies for examining protein S-nitrosylation in research to yield insights into disease mechanisms, diagnostic biomarkers, and drug discovery.

  19. A Quantitative Appraisal of Earthquake Hazard Parameters Evaluated from Bayesian Approach for Different Regions in Iranian Plateau

    NASA Astrophysics Data System (ADS)

    Mohammadi, Hiwa; Türker, Tügba; Bayrak, Yusuf

    2016-06-01

    In this study, we used the program for seismic hazard Bayesian estimation elaborated by Alexey Lyubushin. Our study is the next in the sequence of applications of this software to seismic hazard assessment in different regions of the world. Earthquake hazard parameters of maximum regional magnitude (Mmax), β value, and seismic activity rate or intensity (λ), with their uncertainties, have been evaluated for 15 different source regions in the Iranian Plateau with the help of a complete and homogeneous earthquake catalogue covering the period 1900-2014 with Mw ≥ 4.0. The estimated Mmax values vary between 6.25 and 8.37. The lowest value is observed in the Zagros foredeep, whereas the highest value is observed in the Makran. It is also observed that there is a strong relationship between the maximum earthquake magnitudes estimated by the Bayesian approach and the maximum observed magnitudes. Moreover, in this study, quantiles of the distributions of true and apparent magnitudes for future time intervals of 5, 10, 20, 50 and 100 years are calculated with confidence limits for probability levels of 50, 70 and 90 % in the 15 source regions. The computed earthquake hazard parameters identify the most seismically active regions of the Iranian Plateau. The Makran and East Iran show earthquake magnitudes greater than 8.0 in the next 100 years with a 90 % probability level, as compared to other regions, which indicates that these regions are more susceptible to the occurrence of large earthquakes. The outcomes obtained in this study may have useful implications for probabilistic seismic hazard studies of the Iranian Plateau.
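    The headline forecast quantity, the probability of exceeding a magnitude within a future interval, can be approximated outside the full Bayesian machinery by a Poisson model on a Gutenberg-Richter rate. A back-of-envelope sketch with invented regional parameters (not outputs of Lyubushin's program):

```python
import math

# Assumed regional values: lam = events/yr above m_min, beta = GR decay.
lam, beta, m_min = 30.0, 1.8, 4.0

def prob_exceed(m: float, T: float) -> float:
    """Poisson probability of at least one M >= m event within T years."""
    rate_m = lam * math.exp(-beta * (m - m_min))
    return 1.0 - math.exp(-rate_m * T)

for T in (5, 10, 20, 50, 100):
    print(f"P(M >= 8.0 within {T:>3} yr) = {prob_exceed(8.0, T):.2f}")
```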

  20. Sensitivity analysis of seismic hazard for Western Liguria (North Western Italy): A first attempt towards the understanding and quantification of hazard uncertainty

    NASA Astrophysics Data System (ADS)

    Barani, Simone; Spallarossa, Daniele; Bazzurro, Paolo; Eva, Claudio

    2007-05-01

    The use of logic trees in probabilistic seismic hazard analyses often involves a large number of branches that reflect the uncertainty in the selection of different models and in the selection of the parameter values of each model. The sensitivity analysis, as proposed by Rabinowitz and Steinberg [Rabinowitz, N., Steinberg, D.M., 1991. Seismic hazard sensitivity analysis: a multi-parameter approach. Bull. Seismol. Soc. Am. 81, 796-817], is an efficient tool that allows the construction of logic trees focusing attention on the parameters that have greater impact on the hazard. In this paper the sensitivity analysis is performed in order to identify the parameters that have the largest influence on the Western Liguria (North Western Italy) seismic hazard. The analysis is conducted for six strategic sites following the multi-parameter approach developed by Rabinowitz and Steinberg [Rabinowitz, N., Steinberg, D.M., 1991. Seismic hazard sensitivity analysis: a multi-parameter approach. Bull. Seismol. Soc. Am. 81, 796-817] and accounts for both mean hazard values and hazard values corresponding to different percentiles (e.g., 16%-ile and 84%-ile). The results are assessed in terms of the expected PGA with a 10% probability of exceedance in 50 years for rock conditions and account for both the contribution from specific source zones using the Cornell approach [Cornell, C.A., 1968. Engineering seismic risk analysis. Bull. Seismol. Soc. Am. 58, 1583-1606] and the spatially smoothed seismicity [Frankel, A., 1995. Mapping seismic hazard in the Central and Eastern United States. Seismol. Res. Lett. 66, 8-21]. The influence of different procedures for calculating seismic hazard, seismic catalogues (epicentral parameters), source zone models, frequency-magnitude parameters, maximum earthquake magnitude values and attenuation relationships is considered. As a result, the sensitivity analysis allows us to identify the parameters with higher influence on the hazard. Only these

  1. North Portal Fuel Storage System Fire Hazard Analysis-ESF Surface Design Package ID

    SciTech Connect

    N.M. Ruonavaara

    1995-01-18

    The purpose of the fire hazard analysis is to comprehensively assess the risk from fire within the individual fire areas. This document assesses the fire hazard only within the Exploratory Studies Facility (ESF) Design Package ID, which includes the fuel storage system area of the North Portal facility, and evaluates whether the following objectives are met: 1.1.1 This analysis, performed in accordance with the requirements of this document, will satisfy the requirements for a fire hazard analysis in accordance with U.S. Department of Energy (DOE) Order 5480.7A. 1.1.2 Ensure that property damage from fire and related perils does not exceed an acceptable level. 1.1.3 Provide input to the ESF Basis For Design (BFD) Document. 1.1.4 Provide input to the facility Safety Analysis Report (SAR) (Paragraph 3.8).

  2. Risk analysis of hematopoietic stem cell transplant process: failure mode, effect, and criticality analysis and hazard analysis critical control point methods integration based on guidelines to good manufacturing practice for medicinal product ANNEX 20 (February 2008).

    PubMed

    Gianassi, S; Bisin, S; Bindi, B; Spitaleri, I; Bambi, F

    2010-01-01

    The collection and handling of hematopoietic stem cells (HSCs) must meet high quality requirements. An integrated Quality Risk Management approach can help to identify and contain potential risks related to HSC production. Risk analysis techniques allow one to "weigh" identified hazards, considering the seriousness of their effects, their frequency, and their detectability, seeking to prevent the most harmful hazards. The Hazard Analysis Critical Control Point method, recognized as the most appropriate technique to identify risks associated with physical, chemical, and biological hazards for cellular products, consists of classifying finished product specifications and limits of acceptability, identifying all off-specifications, defining the activities that can cause them, and finally establishing both a monitoring system for each Critical Control Point and corrective actions for deviations. The severity of possible effects on patients, as well as the occurrence and detectability of critical parameters, is measured on quantitative scales (Risk Priority Number [RPN]). Risk analysis was performed with this technique on the HPC manipulation process performed at our blood center. The data analysis showed that the hazards with the highest RPN values, and thus the greatest impact on the process, are loss of dose and loss of tracking; the technical skills of operators and the manual transcription of data were the most critical parameters. Problems related to operator skills are handled by defining targeted training programs, while other critical parameters can be mitigated with the use of continuous control systems. The blood center management software was completed by a labeling system, with forms designed to be in compliance with the standards in force, and by starting the implementation of a cryopreservation management module.
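    The RPN bookkeeping the record describes is a three-factor product used to rank hazards; a sketch with invented scores on 1-10 scales:

```python
# RPN = severity (S) x occurrence (O) x detectability (D); higher means
# higher priority. Hazard names echo the abstract; all scores invented.
hazards = {
    "loss of dose":              {"S": 9, "O": 3, "D": 5},
    "loss of tracking":          {"S": 8, "O": 3, "D": 6},
    "manual data transcription": {"S": 6, "O": 5, "D": 5},
    "labelling error":           {"S": 7, "O": 2, "D": 4},
}

ranked = sorted(hazards.items(),
                key=lambda kv: kv[1]["S"] * kv[1]["O"] * kv[1]["D"],
                reverse=True)
for name, s in ranked:
    print(f"{name:<26} RPN = {s['S'] * s['O'] * s['D']}")
```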

  3. Mini-Column Ion-Exchange Separation and Atomic Absorption Quantitation of Nickel, Cobalt, and Iron: An Undergraduate Quantitative Analysis Experiment.

    ERIC Educational Resources Information Center

    Anderson, James L.; And Others

    1980-01-01

    Presents an undergraduate quantitative analysis experiment, describing an atomic absorption quantitation scheme that is fast, sensitive and comparatively simple relative to other titration experiments. (CS)

  4. Quantitative analysis of surface electromyography: Biomarkers for convulsive seizures.

    PubMed

    Beniczky, Sándor; Conradsen, Isa; Pressler, Ronit; Wolf, Peter

    2016-08-01

    In electroencephalographic (EEG) praxis, muscle activity during seizures is often considered an irritating artefact. This article discusses ways in which surface electromyography (EMG) can turn it into a valuable tool of epileptology. Muscles are in direct synaptic contact with motor neurons; therefore, EMG signals provide direct information about the electric activity in the motor cortex. Qualitative analysis of EMG has traditionally been a part of long-term video-EEG recordings. Recent developments in the quantitative analysis of EMG signals have yielded valuable information on the pathomechanisms of convulsive seizures, demonstrating that they are different from maximal voluntary contraction and from convulsive psychogenic non-epileptic seizures. Furthermore, the tonic phase of generalised tonic-clonic seizures (GTCS) proved to have different quantitative features from tonic seizures. The high temporal resolution of EMG allowed detailed characterisation of the temporal dynamics of the GTCS, suggesting that the same inhibitory mechanisms that try to prevent the build-up of the seizure activity contribute to ending the seizure. These findings have clinical implications: the quantitative EMG features provided the pathophysiologic substrate for developing neurophysiologic biomarkers that accurately identify GTCS. This proved to be efficient both for seizure detection and for objective, automated distinction between convulsive and non-convulsive epileptic seizures.

  5. Label-Free Technologies for Quantitative Multiparameter Biological Analysis

    PubMed Central

    Qavi, Abraham J.; Washburn, Adam L.; Byeon, Ji-Yeon; Bailey, Ryan C.

    2009-01-01

    In the post-genomic era, information is king and information-rich technologies are critically important drivers in both fundamental biology and medicine. It is now known that single-parameter measurements provide only limited detail and that quantitation of multiple biomolecular signatures can more fully illuminate complex biological function. Label-free technologies have recently attracted significant interest for sensitive and quantitative multiparameter analysis of biological systems. There are several different classes of label-free sensors that are currently being developed both in academia and in industry. In this critical review, we highlight, compare, and contrast some of the more promising approaches. We will describe the fundamental principles of these different methodologies and discuss advantages and disadvantages that might potentially help one in selecting the appropriate technology for a given bioanalytical application. PMID:19221722

  6. Quantitative analysis of microtubule orientation in interdigitated leaf pavement cells.

    PubMed

    Akita, Kae; Higaki, Takumi; Kutsuna, Natsumaro; Hasezawa, Seiichiro

    2015-01-01

    Leaf pavement cells are shaped like a jigsaw puzzle in most dicotyledon species. Molecular genetic studies have identified several genes required for pavement cell morphogenesis and proposed that microtubules play crucial roles in the interdigitation of pavement cells. In this study, we performed a quantitative analysis of cortical microtubule orientation in leaf pavement cells of Arabidopsis thaliana. We captured confocal images of cortical microtubules in cotyledon leaf epidermis expressing GFP-tubulinβ and quantitatively evaluated the microtubule orientations relative to the pavement cell growth axis using original image processing techniques. Our results showed that microtubules kept parallel orientations to the growth axis during pavement cell growth. In addition, we showed that immersion treatment of seed cotyledons in solutions containing tubulin polymerization and depolymerization inhibitors decreased pavement cell complexity. Treatment with oryzalin and colchicine inhibited the symmetric division of guard mother cells. PMID:26039484

  8. Biomechanical cell analysis using quantitative phase imaging (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Wax, Adam; Park, Han Sang; Eldridge, William J.

    2016-03-01

    Quantitative phase imaging provides nanometer scale sensitivity and has been previously used to study spectral and temporal characteristics of individual cells in vitro, especially red blood cells. Here we extend this work to study the mechanical responses of individual cells due to the influence of external stimuli. Cell stiffness may be characterized by analyzing the inherent thermal fluctuations of cells but by applying external stimuli, additional information can be obtained. The time dependent response of cells due to external shear stress is examined with high speed quantitative phase imaging and found to exhibit characteristics that relate to their stiffness. However, analysis beyond the cellular scale also reveals internal organization of the cell and its modulation due to pathologic processes such as carcinogenesis. Further studies with microfluidic platforms point the way for using this approach in high throughput assays.

  9. From "weight of evidence" to quantitative data integration using multicriteria decision analysis and Bayesian methods.

    PubMed

    Linkov, Igor; Massey, Olivia; Keisler, Jeff; Rusyn, Ivan; Hartung, Thomas

    2015-01-01

    "Weighing" available evidence in the process of decision-making is unavoidable, yet it is one step that routinely raises suspicions: what evidence should be used, how much does it weigh, and whose thumb may be tipping the scales? This commentary aims to evaluate the current state and future roles of various types of evidence for hazard assessment as it applies to environmental health. In its recent evaluation of the US Environmental Protection Agency's Integrated Risk Information System assessment process, the National Research Council committee singled out the term "weight of evidence" (WoE) for critique, deeming the process too vague and detractive to the practice of evaluating human health risks of chemicals. Moving the methodology away from qualitative, vague and controversial methods towards generalizable, quantitative and transparent methods for appropriately managing diverse lines of evidence is paramount for both regulatory and public acceptance of the hazard assessments. The choice of terminology notwithstanding, a number of recent Bayesian WoE-based methods, the emergence of multicriteria decision analysis for WoE applications, as well as the general principles behind the foundational concepts of WoE, show promise in how to move forward and regain trust in the data integration step of the assessments. We offer our thoughts on the current state of WoE as a whole and while we acknowledge that many WoE applications have been largely qualitative and subjective in nature, we see this as an opportunity to turn WoE towards a quantitative direction that includes Bayesian and multicriteria decision analysis. PMID:25592482

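    As a toy illustration of the quantitative, Bayesian direction advocated above (the prior and likelihood ratios below are invented for demonstration, not taken from the commentary), independent lines of evidence summarized as likelihood ratios can be combined through Bayes' rule in log-odds form:

    ```python
    # Minimal sketch: each line of evidence is a likelihood ratio for
    # "hazardous" vs "not hazardous"; Bayes' rule combines them with a prior.
    import math

    def posterior_probability(prior, likelihood_ratios):
        log_odds = math.log(prior / (1 - prior))
        log_odds += sum(math.log(lr) for lr in likelihood_ratios)
        return 1 / (1 + math.exp(-log_odds))

    # Example: weak prior, three lines of evidence of differing strength.
    print(posterior_probability(0.10, [4.0, 1.5, 0.8]))  # ~0.35
    ```
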
  11. Low hazard refractive index and density-matched fluid for quantitative imaging of concentrated suspensions of particles

    NASA Astrophysics Data System (ADS)

    Zhu, W.; Knapp, Y.; Deplano, V.

    2016-05-01

    A novel refractive index- and density-matched liquid-solid suspension system, taking into account chemical hazard and health concerns, was developed and characterized. The solid phase is made of PMMA spheres, the refractive index of which is adjusted with a mixture of 2,2'-thiodiethanol and phosphate-buffered saline (PBS), while the density is adjusted with a mixture of PBS and glycerol. The proposed chemicals present low hazard characteristics in comparison with former solutions. Data collected from density and refractive index measurements of the solid phase and of the different fluid constituents are used to define a specific ternary mixture adapted to commercial-grade micron-size particles. The defined mixture is validated in a micron-sized granular flow experiment. The described method can be applied to other low-density solids.

  12. An Independent Evaluation of the FMEA/CIL Hazard Analysis Alternative Study

    NASA Technical Reports Server (NTRS)

    Ray, Paul S.

    1996-01-01

    The present instruments of safety and reliability risk control for a majority of the National Aeronautics and Space Administration (NASA) programs/projects consist of Failure Mode and Effects Analysis (FMEA), Hazard Analysis (HA), Critical Items List (CIL), and Hazard Report (HR). This extensive analytical approach was introduced in the early 1970s and was implemented for the Space Shuttle Program by NHB 5300.4 (1D-2). Since the Challenger accident in 1986, the process has been expanded considerably and has resulted in the introduction of similar and/or duplicated activities in safety/reliability risk analysis. A study initiated in 1995 to search for an alternative to the current FMEA/CIL hazard analysis methodology generated a proposed method on April 30, 1996. The objective of this Summer Faculty Study was to participate in and conduct an independent evaluation of the proposed alternative to simplify the present safety and reliability risk control procedure.

  13. Looking Before We Leap: Recent Results From An Ongoing Quantitative Investigation Of Asteroid And Comet Impact Hazard Mitigation.

    NASA Astrophysics Data System (ADS)

    Plesko, Catherine; Weaver, R. P.; Korycansky, D. G.; Huebner, W. F.

    2010-10-01

    The asteroid and comet impact hazard is now part of public consciousness, as demonstrated by movies, Super Bowl commercials, and popular news stories. However, there is a popular misconception that hazard mitigation is a solved problem. Many people think, "we'll just nuke it." There are, however, significant scientific questions remaining in the hazard mitigation problem. Before we can say with certainty that an explosive yield Y at height of burst h will produce a momentum change in or dispersion of a potentially hazardous object (PHO), we need to quantify how and where energy is deposited into the rubble pile or conglomerate that may make up the PHO. We then need to understand how shock waves propagate through the system, what causes the object to disrupt, and how long gravitationally bound fragments take to recombine. Here we present numerical models of energy deposition from an energy source into various materials that are known PHO constituents, and rigid-body dynamics models of the recombination of disrupted objects. In the energy deposition models, we explore the effects of porosity and standoff distance as well as that of composition. In the dynamical models, we explore the effects of fragment size and velocity distributions on the time it takes for gravitationally bound fragments to recombine. Initial models indicate that this recombination time is relatively short, as little as 24 hours for a 1 km PHO composed of 1000 meter-scale self-gravitating fragments with an initial velocity field of v/r = 0.001 1/s.
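
    The reported ~24-hour recombination time can be sanity-checked with a back-of-the-envelope gravitational free-fall estimate (this is not the paper's rigid-body model, and the bulk density below is an assumed rubble-pile value):

    ```python
    # Free-fall time of a self-gravitating cloud of bulk density rho:
    # t_ff = sqrt(3*pi / (32*G*rho)). Density is an assumed typical value.
    import math

    G = 6.674e-11            # m^3 kg^-1 s^-2
    rho = 2000.0             # kg/m^3, assumed rubble-pile bulk density
    t_ff = math.sqrt(3 * math.pi / (32 * G * rho))
    print(f"{t_ff / 3600:.2f} hours")  # ~0.4 h for a cloud starting at rest;
                                       # an initial outward velocity field
                                       # stretches this toward the ~24 h above
    ```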

  14. Quantitative analysis of in vivo confocal microscopy images: a review.

    PubMed

    Patel, Dipika V; McGhee, Charles N

    2013-01-01

    In vivo confocal microscopy (IVCM) is a non-invasive method of examining the living human cornea. The recent trend towards quantitative studies using IVCM has led to the development of a variety of methods for quantifying image parameters. When selecting IVCM images for quantitative analysis, it is important to be consistent regarding the location, depth, and quality of images. All images should be de-identified, randomized, and calibrated prior to analysis. Numerous image analysis software packages are available, each with its own advantages and disadvantages. Criteria for analyzing the corneal epithelium, sub-basal nerves, keratocytes, endothelium, and immune/inflammatory cells have been developed, although there is inconsistency among research groups regarding parameter definitions. The quantification of stromal nerve parameters, however, remains a challenge. Most studies report lower inter-observer repeatability compared with intra-observer repeatability, and observer experience is known to be an important factor. Standardization of IVCM image analysis through the use of a reading center would be crucial for any future large, multi-centre clinical trials using IVCM.

  15. Using Probabilistic Seismic Hazard Analysis in Assessing Seismic Risk for Taipei City and New Taipei City

    NASA Astrophysics Data System (ADS)

    Hsu, Ming-Kai; Wang, Yu-Ju; Cheng, Chin-Tung; Ma, Kuo-Fong; Ke, Siao-Syun

    2016-04-01

    In this study, we evaluate the seismic hazard and risk for Taipei City and New Taipei City, which are important municipalities and the most populous cities in Taiwan. The evaluation of seismic risk combines three main components: a probabilistic seismic hazard model, an exposure model defining the spatial distribution of elements exposed to the hazard, and vulnerability functions capable of describing the distribution of percentage of loss for a set of intensity measure levels. Seismic hazard for Taipei City and New Taipei City is presented as hazard maps in terms of ground motion values expected to be exceeded at a 10% probability level in 50 years (return period 475 years) and at a 2% probability level in 50 years (return period 2475 years), according to the Taiwan Earthquake Model (TEM), which assesses two seismic hazard models for Taiwan. The first model adopted the source parameters of 38 seismogenic structures identified by the TEM geologists. The other model considered 33 active faults and was published by the Central Geological Survey (CGS), Taiwan, in 2010. Grid-based building data at 500 m by 500 m resolution were selected for the evaluation, as they provide detailed information about the location, value and vulnerability classification of the exposed elements. The results from this study were evaluated with the OpenQuake engine, the open-source software for seismic hazard and risk assessment developed within the Global Earthquake Model (GEM) initiative. Our intention is to make a first attempt at modeling the seismic risk from hazard for Taiwan on an open platform. An analysis through disaggregation of hazard components will also be made to prioritize the risk for further policy making.
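
    The two return periods quoted above follow directly from the exceedance probabilities under the standard Poisson assumption, via T = -t / ln(1 - p); a quick worked check:

    ```python
    # Return period T for exceedance probability p over exposure time t years.
    import math

    def return_period(p, t=50.0):
        return -t / math.log(1.0 - p)

    print(round(return_period(0.10)))  # 475 years
    print(round(return_period(0.02)))  # 2475 years
    ```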

  16. Site specific probabilistic seismic hazard analysis at Dubai Creek on the west coast of UAE

    NASA Astrophysics Data System (ADS)

    Shama, Ayman A.

    2011-03-01

    A probabilistic seismic hazard analysis (PSHA) was conducted to establish the hazard spectra for a site located at Dubai Creek on the west coast of the United Arab Emirates (UAE). The PSHA considered all the seismogenic sources that affect the site, including plate boundaries such as the Makran subduction zone, the Zagros fold-thrust region and the transition fault system between them, as well as local crustal faults in the UAE. The PSHA indicated that local faults dominate the hazard. The peak ground acceleration (PGA) is 0.17 g for the 475-year return period spectrum and 0.33 g for the 2,475-year return period spectrum. The hazard spectra are then employed to establish rock ground motions using the spectral matching technique.

  17. [Hazard evaluation modeling of particulate matters emitted by coal-fired boilers and case analysis].

    PubMed

    Shi, Yan-Ting; Du, Qian; Gao, Jian-Min; Bian, Xin; Wang, Zhi-Pu; Dong, He-Ming; Han, Qiang; Cao, Yang

    2014-02-01

    In order to evaluate the hazard of PM2.5 emitted by various boilers, in this paper, segmentation of particulate matter with sizes below 2.5 μm was performed based on formation mechanisms and hazard levels to human health and the environment. Meanwhile, taking into account the mass concentration, number concentration, enrichment factor of Hg, and content of Hg element in different coal ashes, a comprehensive model aimed at evaluating the hazard of PM2.5 emitted by coal-fired boilers was established. Finally, using field experimental data from previous literature, a case analysis of the evaluation model was conducted, and the concept of a hazard reduction coefficient was proposed, which can be used to evaluate the performance of dust removers.
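
    The abstract names four inputs to the evaluation model (mass concentration, number concentration, Hg enrichment factor, Hg content) without giving its functional form. Purely as a hypothetical sketch (the index form and weights below are invented, not the paper's model), a weighted index and the hazard reduction coefficient of a dust remover could look like:

    ```python
    # Invented illustration only; not the published evaluation model.
    def hazard_index(mass_conc, number_conc, hg_enrichment, hg_content,
                     weights=(0.4, 0.3, 0.2, 0.1)):
        """Each argument is pre-normalized to [0, 1] against a reference case."""
        terms = (mass_conc, number_conc, hg_enrichment, hg_content)
        return sum(w * x for w, x in zip(weights, terms))

    # Hazard reduction coefficient of a dust remover: relative drop in index.
    def hazard_reduction(index_in, index_out):
        return 1.0 - index_out / index_in
    ```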

  18. Preliminary fire hazards analysis for W-211, Initial Tank Retrieval Systems

    SciTech Connect

    Huckfeldt, R.A.

    1995-03-16

    A fire hazards analysis (FHA) was performed for Project W-211, Initial Tank Retrieval System (ITRS), at the Department of Energy (DOE) Hanford site. The objectives of this FHA were to determine (1) the fire hazards that expose the Initial Tank Retrieval System or are inherent in the process, (2) the adequacy of the fire-safety features planned, and (3) the degree of compliance of the project with specific fire safety provisions in DOE orders and related engineering codes and standards. The scope included the construction, the process hazards, building fire protection, and site-wide fire protection. The results are presented in terms of the fire hazards present, the potential extent of fire damage, and the impact on employee and public safety. This study evaluated the ITRS with respect to its use at Tank 241-SY-101 only.

  19. A seismic hazard uncertainty analysis for the New Madrid seismic zone

    USGS Publications Warehouse

    Cramer, C.H.

    2001-01-01

    A review of the scientific issues relevant to characterizing earthquake sources in the New Madrid seismic zone has led to the development of a logic tree of possible alternative parameters. A variability analysis, using Monte Carlo sampling of this consensus logic tree, is presented and discussed. The analysis shows that for 2%-exceedance-in-50-year hazard, the best-estimate seismic hazard map is similar to previously published seismic hazard maps for the area. For peak ground acceleration (PGA) and spectral acceleration at 0.2 and 1.0 s (0.2 and 1.0 s Sa), the coefficient of variation (COV) representing the knowledge-based uncertainty in seismic hazard can exceed 0.6 over the New Madrid seismic zone and diminishes to about 0.1 away from areas of seismic activity. Sensitivity analyses show that the largest contributor to variability in PGA, 0.2 s Sa, and 1.0 s Sa seismic hazard is the uncertainty in the location of future 1811-1812 New Madrid-sized earthquakes. This is followed by the variability due to the choice of ground motion attenuation relation, the magnitude of the 1811-1812 New Madrid earthquakes, and the recurrence interval for M>6.5 events. Seismic hazard is not very sensitive to the variability in seismogenic width and length.
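
    A conceptual sketch of the Monte Carlo variability analysis described above (the branch values, weights, and the toy hazard relation are placeholders, not the study's logic tree):

    ```python
    # Sample one branch per logic-tree node, evaluate a toy hazard value,
    # and report the knowledge-based coefficient of variation (COV).
    import random, statistics

    TREE = {
        "location_shift_km": ([0.0, 25.0, 50.0],     [0.5, 0.3, 0.2]),
        "magnitude":         ([7.3, 7.7, 8.1],       [0.3, 0.4, 0.3]),
        "recurrence_yr":     ([350.0, 500.0, 750.0], [0.25, 0.5, 0.25]),
    }

    def toy_pga(location_shift_km, magnitude, recurrence_yr):
        # Placeholder attenuation-like relation, for demonstration only.
        return (0.1 * 10 ** (0.3 * (magnitude - 7.7))
                / (1.0 + location_shift_km / 50.0) / (recurrence_yr / 500.0))

    def cov(n=10000):
        samples = []
        for _ in range(n):
            branches = {k: random.choices(v, weights=w)[0]
                        for k, (v, w) in TREE.items()}
            samples.append(toy_pga(**branches))
        return statistics.stdev(samples) / statistics.mean(samples)

    print(round(cov(), 2))  # COV of the toy hazard value
    ```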

  20. Crossing thresholds: Analysis of hazardous tipping points in alpine catchments

    NASA Astrophysics Data System (ADS)

    Lutzmann, Silke; Sass, Oliver

    2016-04-01

    Steep mountain channels or torrents in small alpine catchments are characterized by high geomorphic activity, with sediment dynamics that are inherently nonlinear and threshold-mediated. Localized, high-intensity rainstorms can drive torrential systems past a tipping point, resulting in a sudden onset of hazardous events like (flash-) flooding, heavy bedload transport or debris flows. Such responses exhibit an abrupt switch in the fluvial system's mode (e.g. transport / supply limited). Changes in functional connectivity may persist beyond the tipping point. Torrential hazards cause costly damage in the densely populated Alpine Region. Thus, there is rising interest in potential effects of climate change on torrential sediment dynamics. Understanding critical conditions close to tipping points is important to reduce uncertainty in predicting sediment fluxes. In this study we aim at (i) establishing threshold precipitation characteristics for the Eastern Alps of Austria, precipitation being hypothesized to be the main forcing factor of torrential events; (ii) determining how thresholds vary in space and time; (iii) identifying which internal catchment conditions are critical for susceptibility, since the effect of external triggers is strongly mediated by the internal disposition of catchments to respond; and (iv) assessing whether magnitude or frequency has changed in the recent past and what can be expected for the future. The 71 km2 catchment of the river Schöttlbach in the East Alpine Region of Styria (Austria) has been monitored since a heavy precipitation event resulted in a catastrophic flood in July 2011. Sediment mobilization from slopes as well as within-channel storage and bedload transport are regularly measured using photogrammetric methods and sediment impact sensors. Thus, detailed knowledge exists on the magnitude and spatial propagation of sediment waves through the catchment. The associated hydro-meteorological (pre-)conditions can be inferred from a dense station network. Changing bedload transport rates and

  1. Techniques for hazard analysis and their use at CERN.

    PubMed

    Nuttall, C; Schönbacher, H

    2001-01-01

    CERN, the European Organisation for Nuclear Research, is situated near Geneva with its accelerators and experimental facilities astride the Swiss and French frontiers, attracting physicists from all over the world to this unique laboratory. The main accelerator is situated in a 27 km underground ring, and the experiments take place in huge underground caverns in order to detect the fragments resulting from the collision of subatomic particles at speeds approaching that of light. These detectors contain many hundreds of tons of flammable materials, mainly plastics in cables and structural components, flammable gases in the detectors themselves, and cryogenic fluids such as helium and argon. The experiments consume large amounts of electrical power, thus the dangers involved have necessitated the use of analytical techniques to identify the hazards and quantify the risks to personnel and the infrastructure. The techniques described in the paper were developed in the process industries, where they have been shown to be of great value. They have been successfully applied to CERN industrial and experimental installations and, in some cases, have been instrumental in changing the philosophy of the experimentalists and their detectors. PMID:11840707

  3. Fluorescent foci quantitation for high-throughput analysis

    PubMed Central

    Ledesma-Fernández, Elena; Thorpe, Peter H.

    2015-01-01

    A number of cellular proteins localize to discrete foci within cells, for example DNA repair proteins, microtubule organizing centers, P bodies or kinetochores. It is often possible to measure the fluorescence emission from tagged proteins within these foci as a surrogate for the concentration of that specific protein. We wished to develop tools that would allow quantitation of fluorescence foci intensities in high-throughput studies. As proof of principle we have examined the kinetochore, a large multi-subunit complex that is critical for the accurate segregation of chromosomes during cell division. Kinetochore perturbations lead to aneuploidy, which is a hallmark of cancer cells. Hence, understanding kinetochore homeostasis and regulation are important for a global understanding of cell division and genome integrity. The 16 budding yeast kinetochores colocalize within the nucleus to form a single focus. Here we have created a set of freely-available tools to allow high-throughput quantitation of kinetochore foci fluorescence. We use this ‘FociQuant’ tool to compare methods of kinetochore quantitation and we show proof of principle that FociQuant can be used to identify changes in kinetochore protein levels in a mutant that affects kinetochore function. This analysis can be applied to any protein that forms discrete foci in cells. PMID:26290880
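
    A generic sketch of foci-intensity measurement in the spirit of the tool described above (this is not FociQuant itself; window sizes are arbitrary and the window is assumed to lie inside the image): integrate background-subtracted fluorescence in a small window around the focus.

    ```python
    # `img` is a 2-D numpy array; (row, col) marks the focus center.
    import numpy as np

    def focus_intensity(img, row, col, radius=5, bg_width=3):
        r0, r1 = row - radius, row + radius + 1
        c0, c1 = col - radius, col + radius + 1
        core = img[r0:r1, c0:c1].astype(float)
        ring = img[r0 - bg_width:r1 + bg_width,
                   c0 - bg_width:c1 + bg_width].astype(float)
        bg = (ring.sum() - core.sum()) / (ring.size - core.size)  # local background
        return core.sum() - bg * core.size  # integrated intensity above background
    ```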

  4. Binary Imaging Analysis for Comprehensive Quantitative Assessment of Peripheral Nerve

    PubMed Central

    Hunter, Daniel A.; Moradzadeh, Arash; Whitlock, Elizabeth L.; Brenner, Michael J.; Myckatyn, Terence M.; Wei, Cindy H.; Tung, Thomas H.H.; Mackinnon, Susan E.

    2007-01-01

    Quantitative histomorphometry is the current gold standard for objective measurement of nerve architecture and its components. Many methods still in use rely heavily upon manual techniques that are prohibitively time consuming, predisposing to operator fatigue, sampling error, and overall limited reproducibility. More recently, investigators have attempted to combine the speed of automated morphometry with the accuracy of manual and semi-automated methods. Systematic refinements in binary imaging analysis techniques combined with an algorithmic approach allow for more exhaustive characterization of nerve parameters in the surgically relevant injury paradigms of regeneration following crush, transection, and nerve gap injuries. The binary imaging method introduced here uses multiple bitplanes to achieve reproducible, high throughput quantitative assessment of peripheral nerve. Number of myelinated axons, myelinated fiber diameter, myelin thickness, fiber distributions, myelinated fiber density, and neural debris can be quantitatively evaluated with stratification of raw data by nerve component. Results of this semi-automated method are validated by comparing values against those obtained with manual techniques. The use of this approach results in more rapid, accurate, and complete assessment of myelinated axons than manual techniques. PMID:17675163

  5. Quantitative option analysis for implementation and management of landfills.

    PubMed

    Kerestecioğlu, Merih

    2016-09-01

    The selection of the most feasible strategy for implementation of landfills is a challenging step. Potential implementation options for landfills cover a wide range, from conventional construction contracts to concessions. Montenegro, seeking to improve the efficiency of public services while maintaining affordability, was considering privatisation as a way to reduce public spending on service provision. In this study, to determine the most feasible model for construction and operation of a regional landfill, a quantitative risk analysis was implemented in five steps: (i) development of a global risk matrix; (ii) assignment of qualitative probabilities of occurrence and magnitudes of impact; (iii) determination of the risks to be mitigated, monitored, controlled or ignored; (iv) reduction of the main risk elements; and (v) incorporation of quantitative estimates of probability of occurrence and expected impact for each risk element in the reduced risk matrix. The evaluated scenarios were: (i) construction and operation of the regional landfill by the public sector; (ii) construction and operation of the landfill by the private sector and transfer of ownership to the public sector after a pre-defined period; and (iii) operation of the landfill by the private sector, without ownership. The quantitative risk assessment concluded that introduction of a public-private partnership is not the most feasible option, contrary to common belief in several public institutions in developing countries. A management contract for the first years of operation was advised, after which a long-term operating contract may follow. PMID:27354014
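
    Step (v) above can be illustrated with a minimal expected-loss calculation over a reduced risk matrix; the risk elements and numbers below are invented for demonstration:

    ```python
    # Attach probability-of-occurrence and expected-impact estimates to each
    # remaining risk element and sum the expected losses (placeholder values).
    reduced_matrix = {
        "construction delay":          (0.30, 2.0),  # (probability, impact, M EUR)
        "tariff collection shortfall": (0.20, 5.0),
        "operating cost overrun":      (0.40, 1.5),
    }

    expected_loss = sum(p * c for p, c in reduced_matrix.values())
    print(f"Expected loss: {expected_loss:.2f} M EUR")  # 2.20 M EUR
    ```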

  6. Simulating realistic predator signatures in quantitative fatty acid signature analysis

    USGS Publications Warehouse

    Bromaghin, Jeffrey F.

    2015-01-01

    Diet estimation is an important field within quantitative ecology, providing critical insights into many aspects of ecology and community dynamics. Quantitative fatty acid signature analysis (QFASA) is a prominent method of diet estimation, particularly for marine mammal and bird species. Investigators using QFASA commonly use computer simulation to evaluate statistical characteristics of diet estimators for the populations they study. Similar computer simulations have been used to explore and compare the performance of different variations of the original QFASA diet estimator. In both cases, computer simulations involve bootstrap sampling prey signature data to construct pseudo-predator signatures with known properties. However, bootstrap sample sizes have been selected arbitrarily and pseudo-predator signatures therefore may not have realistic properties. I develop an algorithm to objectively establish bootstrap sample sizes that generates pseudo-predator signatures with realistic properties, thereby enhancing the utility of computer simulation for assessing QFASA estimator performance. The algorithm also appears to be computationally efficient, resulting in bootstrap sample sizes that are smaller than those commonly used. I illustrate the algorithm with an example using data from Chukchi Sea polar bears (Ursus maritimus) and their marine mammal prey. The concepts underlying the approach may have value in other areas of quantitative ecology in which bootstrap samples are post-processed prior to their use.
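
    A sketch of pseudo-predator construction by bootstrap sampling of prey signatures (illustrative only; the fixed bootstrap sample size used here is exactly the arbitrariness the paper's algorithm is designed to remove):

    ```python
    # `prey[s]` is an (n_s, k) array of fatty acid signatures for prey
    # species s; `diet` maps species to true diet proportions summing to 1.
    import numpy as np

    rng = np.random.default_rng(1)

    def pseudo_predator(prey, diet, n_boot=30):
        sig = np.zeros(next(iter(prey.values())).shape[1])
        for species, proportion in diet.items():
            rows = rng.integers(0, len(prey[species]), size=n_boot)
            sig += proportion * prey[species][rows].mean(axis=0)
        return sig / sig.sum()  # renormalize to a valid signature

    # Toy usage with random "signatures":
    prey = {"cod": rng.random((50, 6)), "herring": rng.random((80, 6))}
    print(pseudo_predator(prey, {"cod": 0.7, "herring": 0.3}))
    ```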

  7. A quantitative analysis of IRAS maps of molecular clouds

    NASA Technical Reports Server (NTRS)

    Wiseman, Jennifer J.; Adams, Fred C.

    1994-01-01

    We present an analysis of IRAS maps of five molecular clouds: Orion, Ophiuchus, Perseus, Taurus, and Lupus. For the classification and description of these astrophysical maps, we use a newly developed technique which considers all maps of a given type to be elements of a pseudometric space. For each physical characteristic of interest, this formal system assigns a distance function (a pseudometric) to the space of all maps: this procedure allows us to measure quantitatively the difference between any two maps and to order the space of all maps. We thus obtain a quantitative classification scheme for molecular clouds. In the present study we use the IRAS continuum maps at 100 and 60 μm to produce column density (or optical depth) maps for the five molecular cloud regions given above. For this sample of clouds, we compute the 'output' functions which measure the distribution of density, the distribution of topological components, the self-gravity, and the filamentary nature of the clouds. The results of this work provide a quantitative description of the structure in these molecular cloud regions. We then order the clouds according to the overall environmental 'complexity' of these star-forming regions. Finally, we compare our results with the observed populations of young stellar objects in these clouds and discuss the possible environmental effects on the star-formation process. Our results are consistent with the recently stated conjecture that more massive stars tend to form in more 'complex' environments.
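
    A toy version of the pseudometric idea (the paper's actual output functions are not reproduced): reduce each map to an "output" function, here simply the histogram of column densities, and measure the distance between those functions. Two distinct maps with identical output functions get distance zero, which is what makes this a pseudometric rather than a metric.

    ```python
    # Maps are assumed normalized to [0, 1] for the shared histogram range.
    import numpy as np

    def output_function(column_density_map, bins=32, value_range=(0.0, 1.0)):
        h, _ = np.histogram(column_density_map, bins=bins,
                            range=value_range, density=True)
        return h

    def map_distance(map_a, map_b):
        return float(np.abs(output_function(map_a) - output_function(map_b)).sum())
    ```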

  8. Facegram - Objective quantitative analysis in facial reconstructive surgery.

    PubMed

    Gerós, Ana; Horta, Ricardo; Aguiar, Paulo

    2016-06-01

    Evaluation of effectiveness in reconstructive plastic surgery has become an increasingly important asset in comparing and choosing the most suitable medical procedure to handle facial disfigurement. Unfortunately, traditional methods to assess the results of surgical interventions are mostly qualitative and lack information about movement dynamics. In addition, the few existing methodologies tailored to objectively quantify surgery results are not practical in the medical field due to constraints of cost, complexity and poor suitability to the clinical environment. These limitations create an urgent need for a new system to quantify facial movement and allow easy interpretation by medical experts. With this in mind, we present here a novel method capable of quantitatively and objectively assessing complex facial movements, using a set of morphological, static and dynamic measurements. For this purpose, RGB-D cameras are used to acquire both color and depth images, and a modified block matching algorithm, combining depth and color information, was developed to track the position of anatomical landmarks of interest. The algorithms are integrated into a user-friendly graphical interface and the analysis outcomes are organized into an innovative medical tool, named facegram. This system was developed in close collaboration with plastic surgeons and the methods were validated using control subjects and patients with facial paralysis. The system was shown to provide useful and detailed quantitative information (static and dynamic), making it an appropriate solution for objective quantitative characterization of facial movement in a clinical environment. PMID:26994664

  9. Quantitative Northern Blot Analysis of Mammalian rRNA Processing.

    PubMed

    Wang, Minshi; Pestov, Dimitri G

    2016-01-01

    Assembly of eukaryotic ribosomes is an elaborate biosynthetic process that begins in the nucleolus and requires hundreds of cellular factors. Analysis of rRNA processing has been instrumental for studying the mechanisms of ribosome biogenesis and effects of stress conditions on the molecular milieu of the nucleolus. Here, we describe the quantitative analysis of the steady-state levels of rRNA precursors, applicable to studies in mammalian cells and other organisms. We include protocols for gel electrophoresis and northern blotting of rRNA precursors using procedures optimized for the large size of these RNAs. We also describe the ratio analysis of multiple precursors, a technique that facilitates the accurate assessment of changes in the efficiency of individual pre-rRNA processing steps. PMID:27576717

  10. A fluvial and pluvial probabilistic flood hazard analysis for Can Tho city, Vietnam

    NASA Astrophysics Data System (ADS)

    Apel, Heiko; Martinez, Oriol; Thi Chinh, Do; Viet Dung, Nguyen

    2014-05-01

    Can Tho city is the largest city and the economic heart of the Mekong Delta, Vietnam. Due to its economic importance and envisaged development goals, the city grew rapidly in population size and extent over the last two decades. Large parts of the city are located in flood-prone areas, and the central parts of the city have recently experienced an increasing number of flood events, both fluvial and pluvial in nature. As economic power and asset values are constantly increasing, this poses a considerable risk for the city. The aim of this study is to perform a flood hazard analysis considering both fluvial and pluvial floods and to derive probabilistic flood hazard maps. The first step requires an understanding of the typical flood mechanisms. Fluvial floods are triggered by the coincidence of high water levels during the annual flood period in the Mekong Delta with high tidal levels, which in combination cause short-term inundations in Can Tho. Pluvial floods are triggered by typical tropical convective rainstorms during the monsoon season. These two flood pathways are essentially independent in their sources and can thus be treated accordingly in the hazard analysis. For the fluvial hazard analysis we propose a bivariate frequency analysis of the Mekong flood characteristics, the annual maximum flood discharge Q and the annual flood volume V, at the upper boundary of the Mekong Delta, the gauging station Kratie. This defines probabilities of exceedance of different Q-V pairs, which are transferred into synthetic flood hydrographs. The synthetic hydrographs are routed through a quasi-2D hydrodynamic model of the entire Mekong Delta in order to provide boundary conditions for a detailed hazard mapping of Can Tho. This downscaling step is necessary because the huge complexity of the river and channel network does not allow for a proper definition of boundary conditions for Can Tho city by gauge data alone. In addition the available gauge data around Can Tho
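
    The bivariate frequency analysis assigns exceedance probabilities to Q-V pairs; as a crude empirical stand-in for the fitted bivariate model (a plotting-position estimate, not the study's method):

    ```python
    # Empirical joint exceedance probability from annual series of peak
    # discharge Q and flood volume V.
    import numpy as np

    def joint_exceedance(q_series, v_series, q0, v0):
        q = np.asarray(q_series, float)
        v = np.asarray(v_series, float)
        return float(((q > q0) & (v > v0)).mean())  # P(Q > q0 and V > v0)
    ```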

  11. Information System Hazard Analysis: A Method for Identifying Technology-induced Latent Errors for Safety.

    PubMed

    Weber, Jens H; Mason-Blakley, Fieran; Price, Morgan

    2015-01-01

    Many health information and communication technologies (ICT) are safety-critical; moreover, reports of technology-induced adverse events related to them are plentiful in the literature. Despite repeated criticism and calls to action, recent data collected by the Institute of Medicine (IOM) and other organizations do not indicate significant improvements with respect to the safety of health ICT systems. A large part of the industry still operates on a reactive "break & patch" model; the application of proactive, systematic hazard analysis methods for engineering ICT that produce "safe by design" products is sparse. This paper applies one such method: Information System Hazard Analysis (ISHA). ISHA adapts and combines hazard analysis techniques from other safety-critical domains and customizes them for ICT. We provide an overview of the steps involved in ISHA. PMID:25676999

  12. QUANTITATIVE MASS SPECTROMETRIC ANALYSIS OF GLYCOPROTEINS COMBINED WITH ENRICHMENT METHODS

    PubMed Central

    Ahn, Yeong Hee; Kim, Jin Young; Yoo, Jong Shin

    2015-01-01

    Mass spectrometry (MS) has been a core technology for highly sensitive, high-throughput analysis of the enriched glycoproteome, in terms of quantitative assays as well as qualitative profiling of glycoproteins. Because it is widely recognized that aberrant glycosylation of a glycoprotein may be involved in the progression of certain diseases, the development of efficient analysis tools for aberrant glycoproteins is very important for a deep understanding of the pathological functions of glycoproteins and for new biomarker development. This review first describes the protein glycosylation-targeting enrichment technologies, mainly employing solid-phase extraction methods such as hydrazide capturing, lectin-specific capturing, and affinity separation techniques based on porous graphitized carbon, hydrophilic interaction chromatography, or immobilized boronic acid. Second, MS-based quantitative analysis strategies coupled with these enrichment technologies, using label-free MS, stable isotope labeling, or targeted multiple reaction monitoring (MRM) MS, are summarized with recently published studies. PMID:24889823

  13. Quantitative analysis of motion control in long term microgravity.

    PubMed

    Baroni, G; Ferrigno, G; Anolli, A; Andreoni, G; Pedotti, A

    1998-01-01

    In the frame of the 179-day EUROMIR '95 space mission, two in-flight experiments involved quantitative three-dimensional human movement analysis in microgravity. For this purpose, a space-qualified opto-electronic motion analyser based on passive markers was installed onboard the Russian Space Station MIR, and eight in-flight sessions were performed. The technology and method for the collection of kinematic data are described, and the accuracy of three-dimensional marker localisation is evaluated. The results confirm the suitability of opto-electronic technology for quantitative human motion analysis on orbital modules and raise a set of "lessons learned" leading to improvements in motion analyser performance together with swifter on-board operations. From the T4 experimental program, results of three voluntary posture perturbation protocols are described. The analysis suggests that a short-term reinterpretation of proprioceptive information and re-calibration of sensorimotor mechanisms seem to end within the first weeks of flight, while a continuous long-term adaptation process allows the refinement of motor performance, within never-abandoned terrestrial strategies.

  14. Computer compensation for NMR quantitative analysis of trace components

    SciTech Connect

    Nakayama, T.; Fujiwara, Y.

    1981-07-22

    A computer program has been written that determines trace components and separates overlapping components in multicomponent NMR spectra. The program uses the Lorentzian curve as the theoretical lineshape of NMR spectra. The coefficients of the Lorentzian are determined by the method of least squares. Systematic errors such as baseline/phase distortion are compensated, and random errors are smoothed by taking moving averages; these processes contribute substantially to decreasing the accumulation time of spectral data. The accuracy of quantitative analysis of trace components has been improved by two significant figures. The program was applied to determining the abundance of 13C and the degree of saponification of PVA.
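
    A minimal sketch of the least-squares Lorentzian fitting described above, for a single line (the program fits multiple overlapping components and compensates baseline/phase distortion; this reduced version only illustrates the core idea):

    ```python
    # Fit one Lorentzian line to a noisy synthetic spectrum by least squares.
    import numpy as np
    from scipy.optimize import curve_fit

    def lorentzian(x, amplitude, center, hwhm):
        return amplitude * hwhm**2 / ((x - center)**2 + hwhm**2)

    x = np.linspace(-10, 10, 400)
    y = lorentzian(x, 1.0, 0.5, 1.2) \
        + np.random.default_rng(0).normal(0, 0.02, x.size)
    params, _ = curve_fit(lorentzian, x, y, p0=(0.8, 0.0, 1.0))
    print(params)  # recovered (amplitude, center, hwhm)
    ```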

  15. Flow quantitation by radio frequency analysis of contrast echocardiography.

    PubMed

    Rovai, D; Lombardi, M; Mazzarisi, A; Landini, L; Taddei, L; Distante, A; Benassi, A; L'Abbate, A

    1993-03-01

    Contrast echocardiography has the potential for measuring cardiac output and regional blood flow. However, accurate quantitation is limited both by the use of non-standard contrast agents and by the electronic signal distortion inherent to echocardiographic instruments. Thus, the aim of this study is to quantify flow by combining a stable contrast agent with modified echo equipment able to sample the radio frequency (RF) signal from a region of interest (ROI) in the echo image. The contrast agent SHU-454 (0.8 ml) was bolus-injected into an in vitro calf vein at 23 flow rates (ranging from 376 to 3620 ml/min) but constant volume and pressure. The ROI was placed in the centre of the vein, and the RF signal was processed in real time and transferred to a personal computer to generate time-intensity curves. In the absence of recirculation, contrast washout slope and mean transit time (MTT) of the curves (1.11-8.52 seconds) yielded excellent correlations with flow: r = 0.93 and 0.95, respectively. To compare the accuracy of RF analysis with that of conventional image processing for flow quantitation, conventional images were collected in the same flow model by two different scanners: (a) the mechanical sector scanner used for RF analysis, and (b) a conventional electronic sector scanner. These images were digitized off-line, mean videodensity inside an identical ROI was measured, and time-intensity curves were built. MTT by RF was shorter than by videodensitometric analysis of the images generated by the same scanner (p < 0.001). In contrast, MTT by RF was longer than by the conventional scanner (p < 0.001). Significant differences in MTT were also found with changes in the gain settings of the conventional scanner. To study the stability of the contrast effect, 6 contrast injections (20 ml) were performed at a constant flow rate during recirculation: the spontaneous decay in RF signal intensity (t1/2 = 64 +/- 8 seconds) was too long to affect MTT significantly.
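
    MTT can be computed as the first moment of a time-intensity curve; a sketch with a synthetic bolus (the study derives its curves from the RF signal, which is not reproduced here):

    ```python
    # First-moment mean transit time from a sampled time-intensity curve.
    import numpy as np

    def mean_transit_time(t, intensity, baseline=0.0):
        c = np.clip(np.asarray(intensity, float) - baseline, 0, None)
        return float(np.trapz(t * c, t) / np.trapz(c, t))  # seconds

    t = np.linspace(0, 20, 200)
    bolus = np.exp(-(t - 5.0) ** 2 / 2.0)         # synthetic bolus passage
    print(round(mean_transit_time(t, bolus), 2))  # ~5.0 s
    ```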

  16. Quantitative proteomic analysis of drug-induced changes in mycobacteria.

    PubMed

    Hughes, Minerva A; Silva, Jeffrey C; Geromanos, Scott J; Townsend, Craig A

    2006-01-01

    A new approach for qualitative and quantitative proteomic analysis using capillary liquid chromatography and mass spectrometry to study the protein expression response in mycobacteria following isoniazid treatment is discussed. In keeping with known effects on the fatty acid synthase II pathway, proteins encoded by the kas operon (AcpM, KasA, KasB, Accd6) were significantly overexpressed, as were those involved in iron metabolism and cell division suggesting a complex interplay of metabolic events leading to cell death. PMID:16396495

  17. [Quantitative analysis for mast cells in obstructive sialadenitis].

    PubMed

    Diao, G X

    1993-03-01

    Quantitative analysis of mast cells in 27 cases of obstructive sialadenitis, 12 cases of approximately normal salivary gland tissue, and 5 cases of lymphoepithelial lesion of the salivary glands shows that the number of mast cells increases slightly with the severity grade of obstructive sialadenitis, and that this is closely related to fibrosis of the salivary glands and to the degree of inflammatory cell infiltration (dominated by lymphocytes), but not to patient age. In cases of benign lymphoepithelial lesion of the salivary glands with malignant change, whether to malignant lymphoma or to squamous cell carcinoma, the number of mast cells is markedly decreased.

  18. Lipid biomarker analysis for the quantitative analysis of airborne microorganisms

    SciTech Connect

    Macnaughton, S.J.; Jenkins, T.L.; Cormier, M.R.

    1997-08-01

    There is ever-increasing concern regarding the presence of airborne microbial contaminants within indoor air environments. Exposure to such biocontaminants can give rise to a large number of different health effects, including infectious diseases, allergenic responses and respiratory problems. Biocontaminants typically found in indoor air environments include bacteria, fungi, algae, protozoa and dust mites. Mycotoxins, endotoxins, pollens and residues of organisms are also known to cause adverse health effects. A quantitative detection/identification technique independent of culturability, assaying both culturable and non-culturable biomass including endotoxin, is critical in defining risks from indoor air biocontamination. Traditionally, methods employed for monitoring microorganism numbers in indoor air environments involve classical culture-based techniques and/or direct microscopic counting. It has been repeatedly documented that viable microorganism counts account for only 0.1-10% of the total community detectable by direct counting. The classic viable microbiological approach does not provide accurate estimates of microbial fragments or other indoor air components that can act as antigens and induce or potentiate allergic responses. Although bioaerosol samplers are designed to damage the microbes as little as possible, microbial stress has been shown to result from air sampling, aerosolization and microbial collection: higher collection efficiency results in greater cell damage, while less cell damage often results in lower collection efficiency. Filtration can collect particulates at almost 100% efficiency, but captured microorganisms may become dehydrated and damaged, resulting in non-culturability; the lipid biomarker assays described herein, however, do not rely on cell culture. Lipids are components that are universally distributed throughout cells, providing a means of assessment independent of culturability.

  19. Process hazards analysis (PrHA) program, bridging accident analyses and operational safety

    SciTech Connect

    Richardson, J. A.; McKernan, S. A.; Vigil, M. J.

    2003-01-01

    Recently the Final Safety Analysis Report (FSAR) for the Plutonium Facility at Los Alamos National Laboratory, Technical Area 55 (TA-55), was revised and submitted to the U.S. Department of Energy (DOE). As a part of this effort, over seventy Process Hazards Analyses (PrHAs) were written and/or revised over the six years prior to the FSAR revision. TA-55 is a research, development, and production nuclear facility that primarily supports U.S. defense and space programs. Nuclear fuels and material research; material recovery, refining and analyses; and the casting, machining and fabrication of plutonium components are some of the activities conducted at TA-55. These operations involve a wide variety of industrial, chemical and nuclear hazards. Operational personnel work as a team with safety analysts to prepare the PrHA. PrHAs describe the process; identify the hazards; and analyze hazards, including determining hazard scenarios, their likelihood, and consequences. In addition, the interaction of the process with facility systems, structures and operation-specific protective features is part of the PrHA. This information is rolled up to determine bounding accidents and mitigating systems and structures. Further detailed accident analysis is performed for the bounding accidents and included in the FSAR. The FSAR is part of the Documented Safety Analysis (DSA) that defines the safety envelope for all facility operations in order to protect the worker, the public, and the environment. The DSA complies with the U.S. Code of Federal Regulations, 10 CFR 830, Nuclear Safety Management, and is approved by DOE. The DSA sets forth the bounding conditions necessary for the safe operation of the facility and is essentially a 'license to operate.' Safety of day-to-day operations is based on Hazard Control Plans (HCPs). Hazards are initially identified in the PrHA for the specific operation and act as input to the HCP. Specific protective features important to worker

  20. Functional Regression Models for Epistasis Analysis of Multiple Quantitative Traits.

    PubMed

    Zhang, Futao; Xie, Dan; Liang, Meimei; Xiong, Momiao

    2016-04-01

    To date, most genetic analyses of phenotypes have focused on analyzing single traits or analyzing each phenotype independently. However, joint epistasis analysis of multiple complementary traits will increase statistical power and improve our understanding of the complicated genetic structure of complex diseases. Despite their importance in uncovering the genetic structure of complex traits, the statistical methods for identifying epistasis in multiple phenotypes remain fundamentally unexplored. To fill this gap, we formulate a test for interaction between two genes in multiple quantitative trait analysis as a multiple functional regression (MFRG), in which the genotype functions (genetic variant profiles) are defined as a function of the genomic position of the genetic variants. We use large-scale simulations to calculate Type I error rates for testing interaction between two genes with multiple phenotypes and to compare the power with multivariate pairwise interaction analysis and single-trait interaction analysis by a single-variate functional regression model. To further evaluate performance, the MFRG for epistasis analysis is applied to five phenotypes of exome sequence data from the NHLBI's Exome Sequencing Project (ESP) to detect pleiotropic epistasis. A total of 267 pairs of genes that formed a genetic interaction network showed significant evidence of epistasis influencing five traits. The results demonstrate that the joint interaction analysis of multiple phenotypes has much higher power to detect interaction than the interaction analysis of a single trait and may open a new direction to fully uncovering the genetic structure of multiple phenotypes. PMID:27104857

  2. FIRE HAZARDS ANALYSIS FOR THE FUEL SUPPLY SYSTEM - ESF PACKAGE 1E

    SciTech Connect

    N.M. Ruonavaara

    1995-04-12

    The purpose of the fire hazards analysis is to comprehensively assess the risk from fire within individual fire areas in accordance with U.S. Department of Energy (DOE) Order 5480.7A (Reference 4.4.7.4). This document assesses the fire hazard risk within the Exploratory Studies Facility (ESF) fuel supply system, Package 1E, and evaluates whether the following objectives are met: (1) ensure that property damage from fire and related perils does not exceed an acceptable level; and (2) provide input to the facility Safety Analysis Report (SAR).

  3. Quantitative analysis of Caenorhabditis elegans chemotaxis using a microfluidic device.

    PubMed

    Hu, Liang; Ye, Jinjuan; Tan, Haowei; Ge, Anle; Tang, Lichun; Feng, Xiaojun; Du, Wei; Liu, Bi-Feng

    2015-08-01

    Caenorhabditis elegans, one of the most widely studied model organisms, senses external chemical cues and performs corresponding chemotaxis behaviors through its simple chemosensory neuronal system. To study the mechanisms underlying chemosensory behavior, a rapid and reliable method for quantitatively analyzing the worms' behaviors is essential. In this work, we demonstrated a microfluidic approach for investigating chemotaxis responses of worms to chemical gradients. The flow-based microfluidic chip consisted of circular tree-like microchannels, which were able to generate eight flow streams containing stepwise chemical concentrations without differences in flow velocity. Worms' upstream swimming into microchannels with various concentrations was monitored for quantitative analysis of the chemotaxis behavior. By using this microfluidic chip, the attractive and repellent responses of C. elegans to NaCl were successfully quantified within several minutes. The results demonstrated wild type-like repellent responses and severely impaired attractive responses in grk-2 mutant animals with defects in calcium influx. In addition, the chemotaxis analysis of third-stage larvae revealed that their gustatory response was different from that in the adult stage. Thus, our microfluidic method provided a useful platform for studying the chemosensory behaviors of C. elegans and screening of chemosensation-related chemical drugs. PMID:26320797
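
    A common readout for this kind of assay is the chemotaxis index; a minimal sketch with invented counts (the paper's exact scoring scheme is not reproduced):

    ```python
    # Standard chemotaxis-index definition from worm counts.
    def chemotaxis_index(n_toward, n_away):
        """CI in [-1, 1]: +1 = fully attracted, -1 = fully repelled."""
        return (n_toward - n_away) / (n_toward + n_away)

    print(chemotaxis_index(42, 18))  # 0.4 -> net attraction
    ```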

  5. Quantitative analysis of echogenicity for patients with thyroid nodules

    PubMed Central

    Wu, Ming-Hsun; Chen, Chiung-Nien; Chen, Kuen-Yuan; Ho, Ming-Chih; Tai, Hao-Chih; Wang, Yu-Hsin; Chen, Argon; Chang, King-Jen

    2016-01-01

    Hypoechogenicity has been described qualitatively and is potentially subject to intra- and inter-observer variability. The aim of this study was to clarify whether quantitative echoic indexes (EIs) are useful for the detection of malignant thyroid nodules. Overall, 333 participants with 411 nodules were included in the final analysis. Quantification of echogenicity was performed using commercial software (AmCAD-UT; AmCad BioMed, Taiwan). The coordinates of three defined regions, the nodule, thyroid parenchyma, and strap muscle regions, were recorded in the database separately for subsequent analysis. The results showed that hypoechogenicity, as assessed qualitatively by clinicians from ultrasound echogenicity (US-E), was an independent factor for malignancy. The EI, adjusted EIs (EI(N-T) and EI(N-M)), and automatic EI((N-R)/R) values all differed significantly between benign and malignant nodules, with lower values for malignant nodules. All of the EIs showed similar sensitivities and specificities and had better accuracies than US-E. In conclusion, the proposed quantitative EIs appear to offer an important advancement over the conventional qualitative US-E, allowing a more reliable distinction between benign and malignant thyroid nodules. PMID:27762299

  6. Quantitative analysis on electrooculography (EOG) for neurodegenerative disease

    NASA Astrophysics Data System (ADS)

    Liu, Chang-Chia; Chaovalitwongse, W. Art; Pardalos, Panos M.; Seref, Onur; Xanthopoulos, Petros; Sackellares, J. C.; Skidmore, Frank M.

    2007-11-01

    Many studies have documented abnormal horizontal and vertical eye movements in human neurodegenerative disease, as well as during altered states of consciousness (including drowsiness and intoxication) in healthy adults. Eye movement measurement may play an important role in measuring the progress of neurodegenerative diseases and the state of alertness in healthy individuals. There are several techniques for measuring eye movement, including the infrared detection technique (IR), video-oculography (VOG), the scleral eye coil, and EOG. Among the available recording techniques, EOG is a major source for monitoring abnormal eye movement. In this real-time quantitative analysis study, methods that capture the characteristics of eye movement were proposed to accurately categorize the state of neurodegenerative subjects. The EOG recordings were taken while five subjects watched a short (>120 s) animation clip. In response to the animated clip, the participants executed a number of eye movements, including vertical smooth pursuit (SVP), horizontal smooth pursuit (HVP) and random saccades (RS). Detection of abnormalities in ocular movement may improve our diagnosis and understanding of neurodegenerative disease and altered states of consciousness. A standard real-time quantitative analysis will improve detection and provide a better understanding of pathology in these disorders.
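
    One basic EOG feature extractor in this spirit is velocity-threshold saccade detection; the sampling rate and threshold below are assumptions for illustration, not values from the study:

    ```python
    # Detect saccade onsets as threshold crossings of eye velocity.
    import numpy as np

    def detect_saccades(eog_deg, fs=250.0, vel_threshold=30.0):
        velocity = np.gradient(eog_deg) * fs          # deg/s
        fast = np.abs(velocity) > vel_threshold
        onsets = np.flatnonzero(fast[1:] & ~fast[:-1]) + 1
        return onsets / fs                            # saccade onset times, s
    ```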

  7. Quantitative analysis of intermolecular interactions in orthorhombic rubrene

    SciTech Connect

    Hathwar, Venkatesha R.; Sist, Mattia; Jørgensen, Mads R. V.; Mamakhel, Aref H.; Wang, Xiaoping; Hoffmann, Christina M.; Sugimoto, Kunihisa; Overgaard, Jacob; Iversen, Bo Brummerstedt

    2015-08-14

    Rubrene is one of the most studied organic semiconductors to date due to its high charge carrier mobility which makes it a potentially applicable compound in modern electronic devices. Previous electronic device characterizations and first principles theoretical calculations assigned the semiconducting properties of rubrene to the presence of a large overlap of the extended π-conjugated core between molecules. We present here the electron density distribution in rubrene at 20 K and at 100 K obtained using a combination of high-resolution X-ray and neutron diffraction data. The topology of the electron density and energies of intermolecular interactions are studied quantitatively. Specifically, the presence of Cπ⋯Cπ interactions between neighbouring tetracene backbones of the rubrene molecules is experimentally confirmed from a topological analysis of the electron density, Non-Covalent Interaction (NCI) analysis and the calculated interaction energy of molecular dimers. A significant contribution to the lattice energy of the crystal is provided by H—H interactions. The electron density features of H—H bonding, and the interaction energy of molecular dimers connected by H—H interaction clearly demonstrate the importance of these weak interactions in the stabilization of the crystal structure. Finally, the quantitative nature of the intermolecular interactions is virtually unchanged between 20 K and 100 K suggesting that any changes in carrier transport at these low temperatures would have a different origin. The obtained experimental results are further supported by theoretical calculations.

  8. Multivariate calibration applied to the quantitative analysis of infrared spectra

    SciTech Connect

    Haaland, D.M.

    1991-01-01

    Multivariate calibration methods are very useful for improving the precision, accuracy, and reliability of quantitative spectral analyses. Spectroscopists can more effectively use these sophisticated statistical tools if they have a qualitative understanding of the techniques involved. A qualitative picture of the factor analysis multivariate calibration methods of partial least squares (PLS) and principal component regression (PCR) is presented using infrared calibrations based upon spectra of phosphosilicate glass thin films on silicon wafers. Comparisons of the relative prediction abilities of four different multivariate calibration methods are given based on Monte Carlo simulations of spectral calibration and prediction data. The success of multivariate spectral calibrations is demonstrated for several quantitative infrared studies. The infrared absorption and emission spectra of thin-film dielectrics used in the manufacture of microelectronic devices demonstrate rapid, nondestructive at-line and in-situ analyses using PLS calibrations. Finally, the application of multivariate spectral calibrations to reagentless analysis of blood is presented. We have found that the determination of glucose in whole blood taken from diabetics can be precisely monitored from the PLS calibration of either mid- or near-infrared spectra of the blood. Progress toward the non-invasive determination of glucose levels in diabetics is an ultimate goal of this research. 13 refs., 4 figs.
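
    A minimal PLS-calibration sketch in the spirit of this record, substituting synthetic two-band "spectra" for the glass-film and blood data; the component count, band positions, and noise level are illustrative assumptions.

    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)

    # Synthetic spectra: an analyte band and an unmodeled interferent band.
    wavenumbers = np.linspace(0, 1, 200)
    band = lambda c: np.exp(-((wavenumbers - c) ** 2) / 0.002)
    conc = rng.uniform(0, 10, 80)            # reference concentrations
    interf = rng.uniform(0, 5, 80)           # interfering species
    X = (np.outer(conc, band(0.3)) + np.outer(interf, band(0.7))
         + rng.normal(0, 0.02, (80, 200)))

    X_tr, X_te, y_tr, y_te = train_test_split(X, conc, random_state=0)
    pls = PLSRegression(n_components=2).fit(X_tr, y_tr)
    rmsep = np.sqrt(np.mean((pls.predict(X_te).ravel() - y_te) ** 2))
    print(f"RMSEP: {rmsep:.3f} concentration units")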

  9. Quantitative analysis of intermolecular interactions in orthorhombic rubrene

    PubMed Central

    Hathwar, Venkatesha R.; Sist, Mattia; Jørgensen, Mads R. V.; Mamakhel, Aref H.; Wang, Xiaoping; Hoffmann, Christina M.; Sugimoto, Kunihisa; Overgaard, Jacob; Iversen, Bo Brummerstedt

    2015-01-01

    Rubrene is one of the most studied organic semiconductors to date due to its high charge carrier mobility, which makes it a potentially applicable compound in modern electronic devices. Previous electronic device characterizations and first principles theoretical calculations assigned the semiconducting properties of rubrene to the presence of a large overlap of the extended π-conjugated core between molecules. We present here the electron density distribution in rubrene at 20 K and at 100 K obtained using a combination of high-resolution X-ray and neutron diffraction data. The topology of the electron density and energies of intermolecular interactions are studied quantitatively. Specifically, the presence of Cπ⋯Cπ interactions between neighbouring tetracene backbones of the rubrene molecules is experimentally confirmed from a topological analysis of the electron density, Non-Covalent Interaction (NCI) analysis and the calculated interaction energy of molecular dimers. A significant contribution to the lattice energy of the crystal is provided by H—H interactions. The electron density features of H—H bonding, and the interaction energy of molecular dimers connected by H—H interaction, clearly demonstrate the importance of these weak interactions in the stabilization of the crystal structure. The quantitative nature of the intermolecular interactions is virtually unchanged between 20 K and 100 K, suggesting that any changes in carrier transport at these low temperatures would have a different origin. The obtained experimental results are further supported by theoretical calculations. PMID:26306198

  10. Quantitative analysis of intermolecular interactions in orthorhombic rubrene

    DOE PAGESBeta

    Hathwar, Venkatesha R.; Sist, Mattia; Jørgensen, Mads R. V.; Mamakhel, Aref H.; Wang, Xiaoping; Hoffmann, Christina M.; Sugimoto, Kunihisa; Overgaard, Jacob; Iversen, Bo Brummerstedt

    2015-08-14

    Rubrene is one of the most studied organic semiconductors to date due to its high charge carrier mobility, which makes it a potentially applicable compound in modern electronic devices. Previous electronic device characterizations and first principles theoretical calculations assigned the semiconducting properties of rubrene to the presence of a large overlap of the extended π-conjugated core between molecules. We present here the electron density distribution in rubrene at 20 K and at 100 K obtained using a combination of high-resolution X-ray and neutron diffraction data. The topology of the electron density and energies of intermolecular interactions are studied quantitatively. Specifically, the presence of Cπ⋯Cπ interactions between neighbouring tetracene backbones of the rubrene molecules is experimentally confirmed from a topological analysis of the electron density, Non-Covalent Interaction (NCI) analysis and the calculated interaction energy of molecular dimers. A significant contribution to the lattice energy of the crystal is provided by H—H interactions. The electron density features of H—H bonding, and the interaction energy of molecular dimers connected by H—H interaction, clearly demonstrate the importance of these weak interactions in the stabilization of the crystal structure. Finally, the quantitative nature of the intermolecular interactions is virtually unchanged between 20 K and 100 K, suggesting that any changes in carrier transport at these low temperatures would have a different origin. The obtained experimental results are further supported by theoretical calculations.

  11. Quantitative analysis of intermolecular interactions in orthorhombic rubrene.

    PubMed

    Hathwar, Venkatesha R; Sist, Mattia; Jørgensen, Mads R V; Mamakhel, Aref H; Wang, Xiaoping; Hoffmann, Christina M; Sugimoto, Kunihisa; Overgaard, Jacob; Iversen, Bo Brummerstedt

    2015-09-01

    Rubrene is one of the most studied organic semiconductors to date due to its high charge carrier mobility, which makes it a potentially applicable compound in modern electronic devices. Previous electronic device characterizations and first principles theoretical calculations assigned the semiconducting properties of rubrene to the presence of a large overlap of the extended π-conjugated core between molecules. We present here the electron density distribution in rubrene at 20 K and at 100 K obtained using a combination of high-resolution X-ray and neutron diffraction data. The topology of the electron density and energies of intermolecular interactions are studied quantitatively. Specifically, the presence of Cπ⋯Cπ interactions between neighbouring tetracene backbones of the rubrene molecules is experimentally confirmed from a topological analysis of the electron density, Non-Covalent Interaction (NCI) analysis and the calculated interaction energy of molecular dimers. A significant contribution to the lattice energy of the crystal is provided by H-H interactions. The electron density features of H-H bonding, and the interaction energy of molecular dimers connected by H-H interaction, clearly demonstrate the importance of these weak interactions in the stabilization of the crystal structure. The quantitative nature of the intermolecular interactions is virtually unchanged between 20 K and 100 K, suggesting that any changes in carrier transport at these low temperatures would have a different origin. The obtained experimental results are further supported by theoretical calculations. PMID:26306198

  12. Segmentation and quantitative analysis of individual cells in developmental tissues.

    PubMed

    Nandy, Kaustav; Kim, Jusub; McCullough, Dean P; McAuliffe, Matthew; Meaburn, Karen J; Yamaguchi, Terry P; Gudla, Prabhakar R; Lockett, Stephen J

    2014-01-01

    Image analysis is vital for extracting quantitative information from biological images and is used extensively, including investigations in developmental biology. The technique commences with the segmentation (delineation) of objects of interest from 2D images or 3D image stacks and is usually followed by the measurement and classification of the segmented objects. This chapter focuses on the segmentation task and here we explain the use of ImageJ, MIPAV (Medical Image Processing, Analysis, and Visualization), and VisSeg, three freely available software packages for this purpose. ImageJ and MIPAV are extremely versatile and can be used in diverse applications. VisSeg is a specialized tool for performing highly accurate and reliable 2D and 3D segmentation of objects such as cells and cell nuclei in images and stacks.
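
    A scripted counterpart to the threshold-and-label workflow the chapter performs interactively; this sketch uses scikit-image (not one of the chapter's three packages) and a bundled fluorescence-like test image, so all specifics are assumptions.

    import numpy as np
    from skimage import data, filters, measure, morphology

    # Fluorescence-like test image of nuclei shipped with scikit-image.
    img = data.human_mitosis()

    # Segmentation: global Otsu threshold, small-object cleanup,
    # then connected-component labelling.
    mask = img > filters.threshold_otsu(img)
    mask = morphology.remove_small_objects(mask, min_size=20)
    labels = measure.label(mask)

    # The "measure and classify" step: per-object measurements.
    props = measure.regionprops_table(labels, properties=("area",))
    print(f"objects found: {labels.max()}")
    print(f"mean object area: {props['area'].mean():.1f} px")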

  13. Quantitatively understanding cellular uptake of gold nanoparticles via radioactivity analysis

    PubMed Central

    Shao, Xia; Schnau, Paul; Qian, Wei; Wang, Xueding

    2015-01-01

    The development of multifunctional gold nanoparticles (AuNPs) underwent an explosion in the last two decades. However, many questions regarding detailed surface chemistry and how it affects the behavior of AuNPs in vivo and in vitro still need to be addressed before AuNPs can be widely adopted in clinical settings. In this work, radioactivity analysis was employed for quantitative evaluation of the uptake of I-125 radiolabeled AuNPs by cancer cells. Facilitated by this new method, we have conducted an initial bioevaluation of surfactant-free AuNPs produced by femtosecond laser ablation. Cellular uptake of AuNPs as a function of the RGD density on the AuNP surface, as well as a function of time, has been quantified. The radioactivity analysis may shed light on the dynamic interactions of AuNPs with cancer cells, and help achieve optimized designs of AuNPs for future clinical applications. PMID:26505012

  14. [Quantitative analysis of butachlor, oxadiazon and simetryn by gas chromatography].

    PubMed

    Liu, F; Mu, W; Wang, J

    1999-03-01

    The quantitative analysis of the ingredients in a 26% B-O-S (butachlor, oxadiazon and simetryn) emulsion by gas chromatography was carried out on a glass column (2 m x 3 mm i.d.) packed with 5% SE-30 on Chromosorb AW DMCS, at a column temperature of 210 degrees C and a detector temperature of 230 degrees C. The internal standard was di-n-butyl sebacate. The retention times of simetryn, the internal standard, butachlor and oxadiazon were 6.5, 8.3, 9.9 and 11.9 min, respectively. The method has a recovery of 98.62%-100.77%, and the coefficients of variation for butachlor, oxadiazon and simetryn were 0.46%, 0.32% and 0.57%, respectively. All linear correlation coefficients were higher than 0.999.
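
    The quantitation here rests on the internal-standard method; a sketch of that calculation with hypothetical peak areas (the paper's raw areas are not given).

    def internal_standard_conc(area_analyte, area_is, conc_is, rf):
        """Concentration from the internal-standard method.

        rf is the response factor from a calibration run:
            rf = (A_analyte / A_is) / (C_analyte / C_is)
        so  C_analyte = (A_analyte / A_is) * C_is / rf
        """
        return (area_analyte / area_is) * conc_is / rf

    # Calibration run with equal known concentrations (hypothetical areas).
    rf = (5200 / 4800) / (1.00 / 1.00)

    # Unknown sample spiked with the same di-n-butyl sebacate level.
    c = internal_standard_conc(area_analyte=4650, area_is=4700,
                               conc_is=1.00, rf=rf)
    print(f"butachlor: {c:.3f} mg/mL")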

  15. Fusing Quantitative Requirements Analysis with Model-based Systems Engineering

    NASA Technical Reports Server (NTRS)

    Cornford, Steven L.; Feather, Martin S.; Heron, Vance A.; Jenkins, J. Steven

    2006-01-01

    A vision is presented for fusing quantitative requirements analysis with model-based systems engineering. This vision draws upon and combines emergent themes in the engineering milieu. "Requirements engineering" provides means to explicitly represent requirements (both functional and non-functional) as constraints and preferences on acceptable solutions, and emphasizes early-lifecycle review, analysis and verification of design and development plans. "Design by shopping" emphasizes revealing the space of options available from which to choose (without presuming that all selection criteria have previously been elicited), and provides means to make understandable the range of choices and their ramifications. "Model-based engineering" emphasizes the goal of utilizing a formal representation of all aspects of system design, from development through operations, and provides powerful tool suites that support the practical application of these principles. A first step prototype towards this vision is described, embodying the key capabilities. Illustrations, implications, further challenges and opportunities are outlined.

  16. The effect of pedigree complexity on quantitative trait linkage analysis.

    PubMed

    Dyer, T D; Blangero, J; Williams, J T; Göring, H H; Mahaney, M C

    2001-01-01

    Due to the computational difficulties of performing linkage analysis on large complex pedigrees, most investigators resort to simplifying such pedigrees by some ad hoc strategy. In this paper, we suggest an analytical method to compare the power of various pedigree simplification schemes by using the asymptotic distribution of the likelihood-ratio statistic. We applied the method to the large Hutterite pedigree. Our results indicate that the breaking and reduction of inbreeding loops can greatly diminish the power to localize quantitative trait loci. We also present an efficient Monte Carlo method for estimating identity-by-descent allele sharing in large complex pedigrees. This method is used to facilitate a linkage analysis of serum IgE levels in the Hutterites without simplifying the pedigree.

  17. Quantitative Phase Analysis by the Rietveld Method for Forensic Science.

    PubMed

    Deng, Fei; Lin, Xiaodong; He, Yonghong; Li, Shu; Zi, Run; Lai, Shijun

    2015-07-01

    Quantitative phase analysis (QPA) is helpful for determining the type attribute of an object because it presents the content of its constituents. QPA by the Rietveld method requires neither measurement of calibration data nor the use of an internal standard; however, the approximate crystal structure of each phase in a mixture is necessary. In this study, 8 synthetic mixtures composed of potassium nitrate and sulfur were analyzed by the Rietveld QPA method. The Rietveld refinement was accomplished with the Material Analysis Using Diffraction (MAUD) program and evaluated by three agreement indices. Results showed that Rietveld QPA yielded precise results, with errors generally less than 2.0% absolute. In addition, a criminal case that was solved successfully with the help of the Rietveld QPA method is also introduced. This method will allow forensic investigators to acquire detailed information about material evidence, which can point the way for case detection and court proceedings.
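
    Rietveld QPA commonly converts refined scale factors into weight fractions through the Hill-Howard relation, W_i = S_i(ZMV)_i / sum_j S_j(ZMV)_j. A sketch with hypothetical refined values for a potassium nitrate/sulfur mixture; the scale factors and crystallographic constants below are illustrative, not the paper's refinement results.

    import numpy as np

    def rietveld_weight_fractions(scales, Z, M, V):
        """Hill-Howard relation: W_i = S_i (Z M V)_i / sum_j S_j (Z M V)_j."""
        zmv = np.asarray(scales) * np.asarray(Z) * np.asarray(M) * np.asarray(V)
        return zmv / zmv.sum()

    # Hypothetical inputs: refined scale factors S, formula units per
    # cell Z, formula mass M (g/mol), and unit-cell volume V (A^3).
    w = rietveld_weight_fractions(scales=[1.8e-4, 9.5e-4],
                                  Z=[4, 128], M=[101.10, 32.07],
                                  V=[364.0, 3299.0])
    print(f"KNO3: {w[0]*100:.1f} wt%, S: {w[1]*100:.1f} wt%")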

  18. Quantitative Analysis of the Interdisciplinarity of Applied Mathematics

    PubMed Central

    Zhang, Pengyuan

    2015-01-01

    The increasing use of mathematical techniques in scientific research leads to the interdisciplinarity of applied mathematics. This viewpoint is validated quantitatively here by statistical and network analysis on the corpus PNAS 1999–2013. A network describing the interdisciplinary relationships between disciplines in a panoramic view is built based on the corpus. Specific network indicators show the hub role of applied mathematics in interdisciplinary research. The statistical analysis on the corpus content finds that algorithms, a primary topic of applied mathematics, positively correlates, increasingly co-occurs, and has an equilibrium relationship in the long-run with certain typical research paradigms and methodologies. The finding can be understood as an intrinsic cause of the interdisciplinarity of applied mathematics. PMID:26352604

  19. Probabilistic seismic hazard analysis for offshore structures in the Santa Barbara Channel phase 2 report

    SciTech Connect

    Foxall, W; Savy, J

    1999-08-06

    This report summarizes progress through Phase 2 of the probabilistic seismic hazard analysis (PSHA) for the Santa Barbara Channel being carried out by the Lawrence Livermore National Laboratory (LLNL) for the Minerals Management Service (MMS) of the US Department of the Interior. The purpose of the PSHA is to provide a basis for development by MMS of regulations governing evaluation of applications to re-license existing oil platforms in federal waters within the Channel with respect to seismic loading. The final product of the analysis will be hazard maps of ground motion parameters at specified probability levels of exceedance. This report summarizes the characterization of local earthquake sources within the Channel and onshore areas of the Western Transverse Ranges, describes the development of a ground motion attenuation model for the region, and presents preliminary hazard results at three selected sites.
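
    The core PSHA computation integrates ground-motion exceedance probabilities over each source's magnitude-recurrence model. A toy single-source sketch follows, with a truncated Gutenberg-Richter magnitude distribution and an invented lognormal attenuation relation; none of the coefficients come from the LLNL Channel model.

    import numpy as np
    from scipy import stats
    from scipy.integrate import trapezoid

    # Truncated Gutenberg-Richter magnitude pdf on [mmin, mmax].
    b, mmin, mmax = 1.0, 5.0, 7.5
    beta = b * np.log(10)
    m = np.linspace(mmin, mmax, 200)
    fm = beta * np.exp(-beta * (m - mmin)) / (1 - np.exp(-beta * (mmax - mmin)))

    rate = 0.05      # events/yr above mmin for this source (assumed)
    r_km = 20.0      # fixed source-site distance, for simplicity
    sigma = 0.6      # aleatory variability of ln(PGA)

    def ln_median_pga(m, r):
        # Toy attenuation relation; coefficients are illustrative only.
        return -3.5 + 1.0 * m - 1.2 * np.log(r + 10.0)

    pga_grid = np.logspace(-2, 0, 50)   # PGA levels in g
    hazard = []
    for x in pga_grid:
        # P(PGA > x | m, r) under lognormal ground-motion variability.
        p_exc = 1 - stats.norm.cdf(np.log(x), ln_median_pga(m, r_km), sigma)
        hazard.append(rate * trapezoid(p_exc * fm, m))

    for x, h in zip(pga_grid[::10], hazard[::10]):
        print(f"PGA > {x:.3f} g : {h:.2e} /yr")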

  20. An Analysis of the Vulnerability of Global Drinking Water Access to Climate-related Hazards

    NASA Astrophysics Data System (ADS)

    Elliott, M.; Banerjee, O.; Christenson, E.; Holcomb, D.; Hamrick, L.; Bartram, J.

    2014-12-01

    Global drinking water access targets are formulated around "sustainable access." Global climate change (GCC) and associated hazards threaten the sustainability of drinking water supply. Extensive literature exists on the impacts of GCC on precipitation and water resources. However, the literature lacks a credible analysis of the vulnerability of global drinking water access. This research reports on an analysis of the current vulnerability of drinking water access due to three climate-related hazardous events: cyclone, drought and flood. An ArcGIS database was built incorporating the following: population density, hazardous event frequency, drinking water technologies in use and adaptive capacity. Two global grids were incorporated first: (1) LandScanTM global population distribution; and (2) frequency of cyclone, drought and flood from ~1980-2000 from the Columbia University Center for Hazards Risk Research (CHRR). Population density was used to characterize cells as urban or rural, and country-level urban/rural drinking water technologies in use were added based on the WHO/UNICEF Joint Monitoring Programme data. Expert assessments of the resilience of each technology to each hazardous event, based on WHO/DFID Vision 2030, were quantified and added to the database. Finally, country-level adaptive capacity was drawn from the "readiness" parameter of the Global Adaptation Index (GaIn). ArcGIS Model Builder and Python were used to automate the addition of datasets. This presentation reports on the results of this analysis, the first credible attempt to assess the vulnerability of global drinking water access to climate-related hazardous events. The analysis has yielded country-level scores and maps displaying the ranking of exposure score (for flood, drought, cyclone, and all three in aggregate) and the corresponding country-level vulnerability scores and rankings incorporating the impact of drinking water technologies and adaptive capacity (Figure 1).

  1. Hazard Analysis for Pneumatic Flipper Suitport/Z-1 Manned Evaluation, Chamber B, Building 32. Revision: Basic

    NASA Technical Reports Server (NTRS)

    2012-01-01

    One of the characteristics of an effective safety program is the recognition and control of hazards before mishaps or failures occur. Conducting potentially hazardous tests necessitates a thorough hazard analysis in order to protect our personnel from injury and our equipment from damage. The purpose of this hazard analysis is to define and address the potential hazards and controls associated with the Z1 Suit Port Test in Chamber B located in building 32, and to provide the applicable team of personnel with the documented results. It is imperative that each member of the team be familiar with the hazards and controls associated with his/her particular tasks, assignments, and activities while interfacing with facility test systems, equipment, and hardware. The goal of this hazard analysis is to identify all hazards that have the potential to harm personnel and/or damage facility equipment, flight hardware, property, or harm the environment. This analysis may also assess the significance and risk, when applicable, of lost test objectives when substantial monetary value is involved. The hazards, causes, controls, verifications, and risk assessment codes have been documented on the hazard analysis work sheets in appendix A of this document. The preparation and development of this report is in accordance with JPR 1700.1, JSC Safety and Health Handbook.

  2. Damage functions for climate-related hazards: unification and uncertainty analysis

    NASA Astrophysics Data System (ADS)

    Prahl, Boris F.; Rybski, Diego; Boettle, Markus; Kropp, Jürgen P.

    2016-05-01

    Most climate change impacts manifest in the form of natural hazards. Damage assessment typically relies on damage functions that translate the magnitude of extreme events to a quantifiable damage. In practice, the availability of damage functions is limited due to a lack of data sources and a lack of understanding of damage processes. The study of the characteristics of damage functions for different hazards could strengthen the theoretical foundation of damage functions and support their development and validation. Accordingly, we investigate analogies of damage functions for coastal flooding and for wind storms and identify a unified approach. This approach has general applicability for granular portfolios and may also be applied, for example, to heat-related mortality. Moreover, the unification enables the transfer of methodology between hazards and a consistent treatment of uncertainty. This is demonstrated by a sensitivity analysis on the basis of two simple case studies (for coastal flood and storm damage). The analysis reveals the relevance of the various uncertainty sources at varying hazard magnitude and on both the microscale and the macroscale level. Main findings are the dominance of uncertainty from the hazard magnitude and the persistent behaviour of intrinsic uncertainties on both scale levels. Our results shed light on the general role of uncertainties and provide useful insight for the application of the unified approach.
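
    The paper's unified functional form is not reproduced here; the sketch below uses a generic sigmoid damage function and Monte Carlo sampling of the hazard magnitude to illustrate the kind of uncertainty propagation described. All parameter values are assumptions.

    import numpy as np

    rng = np.random.default_rng(1)

    def damage_fraction(depth_m, scale=1.5, shape=2.0):
        """Illustrative sigmoid-type damage function (not the paper's fit):
        relative damage = 1 - exp(-(depth/scale)^shape), clipped to [0, 1]."""
        d = 1 - np.exp(-(np.maximum(depth_m, 0) / scale) ** shape)
        return np.clip(d, 0.0, 1.0)

    # Uncertain hazard magnitude: flood depth with an assumed spread.
    depth = rng.normal(loc=1.0, scale=0.3, size=100_000)
    dmg = damage_fraction(depth)

    print(f"mean damage fraction : {dmg.mean():.3f}")
    print(f"90% interval         : [{np.quantile(dmg, 0.05):.3f}, "
          f"{np.quantile(dmg, 0.95):.3f}]")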

  3. Quantitative image analysis in sonograms of the thyroid gland

    NASA Astrophysics Data System (ADS)

    Catherine, Skouroliakou; Maria, Lyra; Aristides, Antoniou; Lambros, Vlahos

    2006-12-01

    High-resolution, real-time ultrasound is a routine examination for assessing disorders of the thyroid gland. However, current diagnostic practice is based mainly on qualitative evaluation of the resulting sonograms and therefore depends on the physician's experience. Computerized texture analysis is widely employed on sonographic images of various organs (liver, breast), and it has been proven to increase the sensitivity of diagnosis by providing a better tissue characterization. The present study attempts to characterize thyroid tissue by automatic texture analysis. The texture features calculated are based on co-occurrence matrices as proposed by Haralick. The sample consists of 40 patients. For each patient two sonographic images (one for each lobe) are recorded in DICOM format. The lobe is manually delineated in each sonogram, and the co-occurrence matrices for 52 separation vectors are calculated. The texture features extracted from each of these matrices are contrast, correlation, energy and homogeneity. Principal component analysis is used to select the optimal set of features. The statistical analysis resulted in the extraction of 21 optimal descriptors. The optimal descriptors are all co-occurrence parameters, as the first-order statistics did not prove representative of the image characteristics. The larger number of components is driven mainly by correlation at very short and very long separation distances. The results indicate that quantitative analysis of thyroid sonograms can provide an objective characterization of thyroid tissue.
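
    A condensed sketch of the Haralick co-occurrence features named in the record, computed for a few separation vectors (instead of the study's 52) on a bundled test image standing in for a delineated lobe ROI.

    import numpy as np
    from skimage import data
    from skimage.feature import graycomatrix, graycoprops
    from skimage.util import img_as_ubyte

    # Stand-in for a manually delineated lobe region (8-bit grayscale).
    roi = img_as_ubyte(data.camera()[100:200, 100:200])

    # Co-occurrence matrices for a handful of (distance, angle) offsets.
    glcm = graycomatrix(roi, distances=[1, 3, 5],
                        angles=[0, np.pi / 4, np.pi / 2, 3 * np.pi / 4],
                        levels=256, symmetric=True, normed=True)

    for feat in ("contrast", "correlation", "energy", "homogeneity"):
        vals = graycoprops(glcm, feat)      # shape: (n_distances, n_angles)
        print(f"{feat:12s} mean over offsets: {vals.mean():.4f}")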

  4. Liquefaction hazard analysis for infrastructure development in gulf of Jakarta

    NASA Astrophysics Data System (ADS)

    Dinata, Indra A.; Darlan, Yudi; Sadisun, Imam A.; Pindratno, Haris; Saryanto, Agus

    2016-05-01

    The Gulf of Jakarta is an area of active sedimentation, with a wide sediment deposition zone on the north coast of Jakarta. These sediments are generally unconsolidated, so their condition is an important factor in determining liquefaction potential in the area. Liquefaction may occur during earthquakes that cause loss of strength and stiffness in soils. The analysis of liquefaction potential, based on SPT data taken in the Gulf of Jakarta, covers the susceptibility rating and the triggering factors. Liquefaction analysis methods were compared with each other to obtain the factor of safety against liquefaction according to the characteristics of the soil; a standard triggering calculation is sketched below. Liquefaction analysis at the surface uses a susceptibility rating factor (SRF). The SRF method is controlled by four factors: history, geology, composition, and groundwater. Each factor has parameters that determine the value of SRF. From the analysis, the Gulf of Jakarta has SRF values of 12-35, showing that the area is dominated by medium to high susceptibility ratings, with high susceptibility concentrated along the coast.
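
    The record does not give the SRF weights, so no attempt is made to reproduce them. On the triggering side, the widely used simplified Seed-Idriss procedure computes a cyclic stress ratio and a factor of safety; the sketch below uses illustrative layer values, and the CRR is an assumed chart value rather than a computed one.

    def cyclic_stress_ratio(a_max_g, sigma_v, sigma_v_eff, depth_m):
        """Simplified Seed-Idriss: CSR = 0.65 (a_max/g)(sigma_v/sigma_v') rd."""
        # Liao-Whitman depth reduction factor (valid to ~23 m).
        rd = (1.0 - 0.00765 * depth_m if depth_m <= 9.15
              else 1.174 - 0.0267 * depth_m)
        return 0.65 * a_max_g * (sigma_v / sigma_v_eff) * rd

    # Illustrative layer: 5 m depth, stresses in kPa, design PGA 0.3 g,
    # CRR taken from an SPT-based chart (assumed value).
    csr = cyclic_stress_ratio(a_max_g=0.3, sigma_v=90.0,
                              sigma_v_eff=55.0, depth_m=5.0)
    crr = 0.22
    print(f"CSR = {csr:.3f}, FS = {crr / csr:.2f}  (FS < 1 -> liquefiable)")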

  5. Epistasis analysis for quantitative traits by functional regression model.

    PubMed

    Zhang, Futao; Boerwinkle, Eric; Xiong, Momiao

    2014-06-01

    The critical barrier in interaction analysis for rare variants is that most traditional statistical methods for testing interactions were originally designed for testing the interaction between common variants and are difficult to apply to rare variants because of their prohibitive computational time and poor ability. The great challenges for successful detection of interactions with next-generation sequencing (NGS) data are (1) lack of methods for interaction analysis with rare variants, (2) severe multiple testing, and (3) time-consuming computations. To meet these challenges, we shift the paradigm of interaction analysis between two loci to interaction analysis between two sets of loci or genomic regions and collectively test interactions between all possible pairs of SNPs within two genomic regions. In other words, we take a genome region as a basic unit of interaction analysis and use high-dimensional data reduction and functional data analysis techniques to develop a novel functional regression model to collectively test interactions between all possible pairs of single nucleotide polymorphisms (SNPs) within two genome regions. By intensive simulations, we demonstrate that the functional regression models for interaction analysis of the quantitative trait have the correct type 1 error rates and a much better ability to detect interactions than the current pairwise interaction analysis. The proposed method was applied to exome sequence data from the NHLBI's Exome Sequencing Project (ESP) and CHARGE-S study. We discovered 27 pairs of genes showing significant interactions after applying the Bonferroni correction (P-values < 4.58 × 10(-10)) in the ESP, and 11 were replicated in the CHARGE-S study.

  6. Functional Linear Models for Association Analysis of Quantitative Traits

    PubMed Central

    Fan, Ruzong; Wang, Yifan; Mills, James L.; Wilson, Alexander F.; Bailey-Wilson, Joan E.; Xiong, Momiao

    2014-01-01

    Functional linear models are developed in this paper for testing associations between quantitative traits and genetic variants, which can be rare variants or common variants or the combination of the two. By treating multiple genetic variants of an individual in a human population as a realization of a stochastic process, the genome of an individual in a chromosome region is a continuum of sequence data rather than discrete observations. The genome of an individual is viewed as a stochastic function that contains both linkage and linkage disequilibrium (LD) information of the genetic markers. By using techniques of functional data analysis, both fixed and mixed effect functional linear models are built to test the association between quantitative traits and genetic variants adjusting for covariates. After extensive simulation analysis, it is shown that the F-distributed tests of the proposed fixed effect functional linear models have higher power than that of sequence kernel association test (SKAT) and its optimal unified test (SKAT-O) for three scenarios in most cases: (1) the causal variants are all rare, (2) the causal variants are both rare and common, and (3) the causal variants are common. The superior performance of the fixed effect functional linear models is most likely due to its optimal utilization of both genetic linkage and LD information of multiple genetic variants in a genome and similarity among different individuals, while SKAT and SKAT-O only model the similarities and pairwise LD but do not model linkage and higher order LD information sufficiently. In addition, the proposed fixed effect models generate accurate type I error rates in simulation studies. We also show that the functional kernel score tests of the proposed mixed effect functional linear models are preferable in candidate gene analysis and small sample problems. The methods are applied to analyze three biochemical traits in data from the Trinity Students Study. PMID:24130119
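
    A bare-bones sketch of the fixed-effect idea: project the genotypes in a region onto a smooth positional basis, regress the trait on the resulting scores, and F-test the basis coefficients jointly. The truncated-power spline basis and simulated data are simplifications for illustration, not the paper's implementation.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(2)
    n, p = 500, 60                                   # subjects, variants

    pos = np.sort(rng.uniform(0, 1, p))              # variant positions
    G = rng.binomial(2, 0.2, (n, p)).astype(float)   # genotypes (0/1/2)
    y = G[:, 10] * 0.4 + rng.normal(0, 1, n)         # one causal variant

    # Smooth basis over position: cubic polynomial + truncated-power knots.
    knots = np.quantile(pos, [0.25, 0.5, 0.75])
    B = np.column_stack([pos**d for d in range(4)] +
                        [np.maximum(pos - t, 0) ** 3 for t in knots])

    X = np.column_stack([np.ones(n), G @ B])         # intercept + scores

    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    rss1 = np.sum((y - X @ beta) ** 2)
    rss0 = np.sum((y - y.mean()) ** 2)
    df1, df2 = X.shape[1] - 1, n - X.shape[1]
    F = ((rss0 - rss1) / df1) / (rss1 / df2)
    print(f"F = {F:.2f}, p = {stats.f.sf(F, df1, df2):.2e}")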

  7. Fractal Spectrum Technique for Quantitative Analysis of Volcanic Particle Shapes

    NASA Astrophysics Data System (ADS)

    Maria, A. H.; Carey, S. N.

    2001-12-01

    The shapes of volcanic particles reflect numerous eruptive parameters (e.g. magma viscosity, volatile content, degree of interaction with water) and are useful for understanding fragmentation and transport processes associated with volcanic eruptions. However, quantitative analysis of volcanic particle shapes has proven difficult due to their morphological complexity and variability. Shape analysis based on fractal geometry has been successfully applied to a wide variety of particles and appears to be well suited for describing complex features. The technique developed and applied to volcanic particles in this study uses fractal data produced by dilation of the 2-D particle boundary to produce a full spectrum of fractal dimensions over a range of scales for each particle. Multiple fractal dimensions, which can be described as a fractal spectrum curve, are calculated by taking the first derivative of data points on a standard Richardson plot. Quantitative comparisons are carried out using multivariate statistical techniques such as cluster and principal components analysis. Compared with previous fractal methods that express shape in terms of only one or two fractal dimensions, use of multiple fractal dimensions results in more effective discrimination between samples. In addition, the technique eliminates the subjectivity associated with selecting linear segments on Richardson plots for fractal dimension calculation, and allows direct comparison of particles as long as instantaneous dimensions used as input to multivariate analyses are selected at the same scales for each particle. Applications to samples from well documented eruptions (e.g. Mt. St. Helens, Tambora, Surtsey) indicate that the fractal spectrum technique provides a useful means of characterizing volcanic particles and can be helpful for identifying the products of specific fragmentation processes (volatile exsolution, phreatomagmatic, quench granulation) and modes of volcanic deposition (tephra fall
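
    A sketch of the dilation-based Richardson analysis the abstract describes: dilate the particle boundary at increasing radii, estimate boundary length from the dilated area (the Minkowski-sausage estimate), and differentiate the log-log curve to obtain a scale-dependent dimension spectrum. The wavy test shape is illustrative, not a volcanic particle.

    import numpy as np
    from scipy import ndimage

    # Test "particle": a blob with a wavy outline on a binary grid.
    y, x = np.mgrid[-128:128, -128:128]
    theta = np.arctan2(y, x)
    radius = 80 + 10 * np.sin(7 * theta)
    particle = x**2 + y**2 < radius**2

    # One-pixel boundary of the particle.
    boundary = particle ^ ndimage.binary_erosion(particle)

    radii = np.arange(1, 20)
    lengths = []
    for r in radii:
        disk = (np.add.outer(np.arange(-r, r + 1)**2,
                             np.arange(-r, r + 1)**2) <= r * r)
        dilated = ndimage.binary_dilation(boundary, structure=disk)
        # Minkowski sausage estimate: L(r) ~ area(dilated) / (2r).
        lengths.append(dilated.sum() / (2.0 * r))

    # Since L(r) ~ r^(1-D), the local dimension is D(r) = 1 - dlogL/dlogr.
    logr, logl = np.log(radii), np.log(lengths)
    D = 1 - np.gradient(logl, logr)
    for r, d in zip(radii[::4], D[::4]):
        print(f"scale {r:2d} px : D = {d:.3f}")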

  8. Functional linear models for association analysis of quantitative traits.

    PubMed

    Fan, Ruzong; Wang, Yifan; Mills, James L; Wilson, Alexander F; Bailey-Wilson, Joan E; Xiong, Momiao

    2013-11-01

    Functional linear models are developed in this paper for testing associations between quantitative traits and genetic variants, which can be rare variants or common variants or the combination of the two. By treating multiple genetic variants of an individual in a human population as a realization of a stochastic process, the genome of an individual in a chromosome region is a continuum of sequence data rather than discrete observations. The genome of an individual is viewed as a stochastic function that contains both linkage and linkage disequilibrium (LD) information of the genetic markers. By using techniques of functional data analysis, both fixed and mixed effect functional linear models are built to test the association between quantitative traits and genetic variants adjusting for covariates. After extensive simulation analysis, it is shown that the F-distributed tests of the proposed fixed effect functional linear models have higher power than that of sequence kernel association test (SKAT) and its optimal unified test (SKAT-O) for three scenarios in most cases: (1) the causal variants are all rare, (2) the causal variants are both rare and common, and (3) the causal variants are common. The superior performance of the fixed effect functional linear models is most likely due to its optimal utilization of both genetic linkage and LD information of multiple genetic variants in a genome and similarity among different individuals, while SKAT and SKAT-O only model the similarities and pairwise LD but do not model linkage and higher order LD information sufficiently. In addition, the proposed fixed effect models generate accurate type I error rates in simulation studies. We also show that the functional kernel score tests of the proposed mixed effect functional linear models are preferable in candidate gene analysis and small sample problems. The methods are applied to analyze three biochemical traits in data from the Trinity Students Study.

  9. The Quantitative Analysis of Chennai Automotive Industry Cluster

    NASA Astrophysics Data System (ADS)

    Bhaskaran, Ethirajan

    2016-07-01

    Chennai, often called the Detroit of India, hosts an automotive industry producing over 40% of India's vehicles and components. During 2001-2002, the Automotive Component Industries (ACI) in the Ambattur, Thirumalizai and Thirumudivakkam Industrial Estates, Chennai faced problems in infrastructure, technology, procurement, production and marketing. The objective is to study the quantitative performance of the Chennai automotive industry cluster before (2001-2002) and after (2008-2009) the Cluster Development Approach (CDA). The methodology adopted is the collection of primary data from 100 ACI using a quantitative questionnaire, analyzed using Correlation Analysis (CA), Regression Analysis (RA), the Friedman Test (FMT), and the Kruskal-Wallis Test (KWT). The CA computed for the different sets of variables reveals a high degree of relationship between the variables studied. The RA models constructed establish a strong relationship between the dependent variable and a host of independent variables; the models proposed here reveal the approximate relationship in a closer form. The KWT shows no significant difference between the three location clusters with respect to net profit, production cost, marketing costs, procurement costs and gross output, supporting the view that each location has contributed uniformly to the development of the automobile component cluster. The FMT shows no significant difference between industrial units in respect of costs such as production, infrastructure, technology, marketing and net profit. To conclude, the automotive industries have fully utilized the physical infrastructure and centralized facilities by adopting the CDA and now export their products to North America, South America, Europe, Australia, Africa and Asia. Value chain analysis models have been implemented in all the cluster units. This Cluster Development Approach (CDA) model can be implemented in industries of underdeveloped and developing countries for cost reduction and productivity.

  10. 30 CFR 250.1911 - What hazards analysis criteria must my SEMS program meet?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... criminal penalty; (iii) Control technology applicable to the operation your hazards analysis is evaluating; and (iv) A qualitative evaluation of the possible safety and health effects on employees, and potential impacts to the human and marine environments, which may result if the control technology fails....

  11. 30 CFR 250.1911 - What hazards analysis criteria must my SEMS program meet?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... criminal penalty; (iii) Control technology applicable to the operation your hazards analysis is evaluating; and (iv) A qualitative evaluation of the possible safety and health effects on employees, and potential impacts to the human and marine environments, which may result if the control technology fails....

  12. 21 CFR 120.8 - Hazard Analysis and Critical Control Point (HACCP) plan.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... processor shall have and implement a written HACCP plan whenever a hazard analysis reveals one or more food...) Each location where juice is processed by that processor; and (2) Each type of juice processed by the processor. The plan may group types of juice products together, or group types of production...

  13. Example process hazard analysis of a Department of Energy water chlorination process

    SciTech Connect

    Not Available

    1993-09-01

    On February 24, 1992, the Occupational Safety and Health Administration (OSHA) released a revised version of 29 CFR Part 1910 that added Section 1910.119, entitled "Process Safety Management of Highly Hazardous Chemicals" (the PSM Rule). Because US Department of Energy (DOE) Orders 5480.4 and 5483.1A prescribe OSHA 29 CFR 1910 as a standard in DOE, the PSM Rule is mandatory in the DOE complex. A major element in the PSM Rule is the process hazard analysis (PrHA), which is required for all chemical processes covered by the PSM Rule. The PrHA element of the PSM Rule requires the selection and application of appropriate hazard analysis methods to systematically identify hazards and potential accident scenarios associated with processes involving highly hazardous chemicals (HHCs). The analysis in this report is an example PrHA performed to meet the requirements of the PSM Rule. The PrHA method used in this example is the hazard and operability (HAZOP) study, and the process studied is the new Hanford 300-Area Water Treatment Facility chlorination process, which is currently in the design stage. The HAZOP study was conducted on May 18-21, 1993, by a team from the Westinghouse Hanford Company (WHC), Battelle-Columbus, the DOE, and Pacific Northwest Laboratory (PNL). The chlorination process was chosen as the example process because it is common to many DOE sites, and because quantities of chlorine at those sites generally exceed the OSHA threshold quantities (TQs).

  14. Bayesian robust analysis for genetic architecture of quantitative traits

    PubMed Central

    Yang, Runqing; Wang, Xin; Li, Jian; Deng, Hongwen

    2009-01-01

    Motivation: In most quantitative trait locus (QTL) mapping studies, phenotypes are assumed to follow normal distributions. Deviations from this assumption may affect the accuracy of QTL detection and lead to detection of spurious QTLs. To improve the robustness of QTL mapping methods, we replaced the normal distribution for residuals in multiple interacting QTL models with the normal/independent distributions that are a class of symmetric and long-tailed distributions and are able to accommodate residual outliers. Subsequently, we developed a Bayesian robust analysis strategy for dissecting genetic architecture of quantitative traits and for mapping genome-wide interacting QTLs in line crosses. Results: Through computer simulations, we showed that our strategy had a similar power for QTL detection compared with traditional methods assuming normal-distributed traits, but had a substantially increased power for non-normal phenotypes. When this strategy was applied to a group of traits associated with physical/chemical characteristics and quality in rice, more main and epistatic QTLs were detected than traditional Bayesian model analyses under the normal assumption. Contact: runqingyang@sjtu.edu.cn; dengh@umkc.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:18974168

  15. Quantitative analysis of the polarization characteristics of atherosclerotic plaques

    NASA Astrophysics Data System (ADS)

    Gubarkova, Ekaterina V.; Kirillin, Michail Y.; Dudenkova, Varvara V.; Kiseleva, Elena B.; Moiseev, Alexander A.; Gelikonov, Grigory V.; Timofeeva, Lidia B.; Fiks, Ilya I.; Feldchtein, Felix I.; Gladkova, Natalia D.

    2016-04-01

    In this study we demonstrate the capability of cross-polarization optical coherence tomography (CP OCT) to assess the condition of collagen and elastin fibers in atherosclerotic plaques based on the ratio of the OCT signal levels in the cross- and co-polarizations. We consider the depolarization factor (DF) and the effective birefringence (Δn) as quantitative characteristics of CP OCT images. We found that calculating both DF and Δn in the region of interest (fibrous cap) yields a statistically significant difference between stable and unstable plaques (0.46±0.21 vs 0.09±0.04 for DF; (4.7±1.0)×10^-4 vs (2.5±0.7)×10^-4 for Δn; p<0.05). In parallel with CP OCT we used nonlinear microscopy for analysis of thin cross-sections of atherosclerotic plaques, revealing different average isotropy indices of collagen and elastin fibers for stable and unstable plaques (0.30±0.10 vs 0.70±0.08; p<0.001). The proposed approach for quantitative assessment of CP OCT images allows cross-scattering and birefringence characterization of stable and unstable atherosclerotic plaques.

  16. [Development of rapid methods for quantitative analysis of proteolytic reactions].

    PubMed

    Beloivan, O A; Tsvetkova, M N; Bubriak, O A

    2002-01-01

    Approaches to the development of rapid methods for the quantitative control of proteolytic reactions are discussed. Recently, these reactions have taken on special significance for many important problems of theoretical and practical medicine and biology, as well as for technological, pharmacological and ecological monitoring. Traditional methods can be improved both by the use of immobilized enzymes and substrates and by combining various classic biochemical and immunological approaches. The synthesis of substrates with specified properties allows new methods to be realized for studying proteinase activity and the kinetic characteristics of the corresponding reactions, both in vitro and in vivo. The application of biosensor technology is a promising trend, since it saves analysis time and cost, allows the direct interaction between enzymes and their inhibitors and activators to be studied in real time, and permits quantitative measurements both in liquids and in air. Besides, biosensor techniques are well suited to computer data processing. PMID:12924013

  17. Quantitative analysis of multiple sclerosis: a feasibility study

    NASA Astrophysics Data System (ADS)

    Li, Lihong; Li, Xiang; Wei, Xinzhou; Sturm, Deborah; Lu, Hongbing; Liang, Zhengrong

    2006-03-01

    Multiple Sclerosis (MS) is an inflammatory and demyelinating disorder of the central nervous system with a presumed immune-mediated etiology. For treatment of MS, the measurements of white matter (WM), gray matter (GM), and cerebral spinal fluid (CSF) are often used in conjunction with clinical evaluation to provide a more objective measure of MS burden. In this paper, we apply a new unifying automatic mixture-based algorithm for segmentation of brain tissues to quantitatively analyze MS. The method takes into account the following effects that commonly appear in MR imaging: 1) The MR data is modeled as a stochastic process with an inherent inhomogeneity effect of smoothly varying intensity; 2) A new partial volume (PV) model is built in establishing the maximum a posterior (MAP) segmentation scheme; 3) Noise artifacts are minimized by a priori Markov random field (MRF) penalty indicating neighborhood correlation from tissue mixture. The volumes of brain tissues (WM, GM) and CSF are extracted from the mixture-based segmentation. Experimental results of feasibility studies on quantitative analysis of MS are presented.

  18. Quantitative analysis of incipient mineral loss in hard tissues

    NASA Astrophysics Data System (ADS)

    Matvienko, Anna; Mandelis, Andreas; Hellen, Adam; Jeon, Raymond; Abrams, Stephen; Amaechi, Bennett

    2009-02-01

    A coupled diffuse-photon-density-wave and thermal-wave theoretical model was developed to describe biothermophotonic phenomena in multi-layered hard tissue structures. Photothermal radiometry was applied as a safe, non-destructive, and highly sensitive tool for the detection of early tooth enamel demineralization to test the theory. An extracted human tooth was treated sequentially with an artificial demineralization gel to simulate controlled mineral loss in the enamel. The experimental setup included a semiconductor laser (659 nm, 120 mW) as the source of the photothermal signal. Modulated laser light generated infrared blackbody radiation from the teeth upon absorption and nonradiative energy conversion. The infrared flux emitted by the treated region of the tooth surface and sub-surface was monitored with an infrared detector, both before and after treatment. Frequency scans with a laser beam size of 3 mm were performed in order to guarantee one-dimensionality of the photothermal field. TMR images showed clear differences between sound and demineralized enamel; however, this technique is destructive. Dental radiographs did not indicate any changes. The photothermal signal showed a clear change even after 1 min of gel treatment. From the fittings, the thermal and optical properties of sound and demineralized enamel were obtained, which allowed for quantitative differentiation of healthy and non-healthy regions. In conclusion, the developed model was shown to be a promising tool for non-invasive quantitative analysis of early demineralization of hard tissues.

  19. Application of Synchrotron-XRF to Quantitative Elemental Aerosol Analysis

    NASA Astrophysics Data System (ADS)

    Cliff, S. S.; Perry, K. D.; Jimenez-Cruz, M. P.; Cahill, T. A.

    2001-12-01

    Recent advances in synchrotron x-ray fluorescence (s-XRF) analysis of atmospheric particulate matter have improved elemental sensitivity, quantification and time resolution. Analysis of both filter- and impactor-based aerosol samples has yielded quantitative data for elements Na-U, if present, in ambient aerosols. The increased sensitivity allows higher time resolution through either smaller spatial analysis of time-resolved impactor samples or shorter sample time-integration using filter-based samplers. Of particular interest is the application of s-XRF to aerodynamically sized rotating-substrate impactor samples. These samplers, 8- and 3-stage DRUMs, have the ability to aerodynamically size-classify particles into 8 or 3 categories, respectively. In addition, the rotating substrate allows time-resolved analysis of samples with little or no loss in elemental sensitivity. The s-XRF analyses are performed on Beamline 10.3.1 at the Advanced Light Source-Lawrence Berkeley Laboratory (ALS-LBL). Beamline 10.3.1, originally designed for materials analysis, has been supplemented with aerosol analysis capability for several substrate options. Typical analysis involves Teflon filters or Mylar impaction substrates. The newly formed Participating Research Team (PRT) for beamline 10.3.1 encompasses both global climate and materials science research. The s-XRF capabilities of beamline 10.3.1 are now available to PRT researchers and independent investigators through a proposal process to the ALS. The technology, its application to aerosol research and monitoring, and the availability of the facility to the aerosol research community will be presented.

  20. Quantitative analysis of the reconstruction performance of interpolants

    NASA Technical Reports Server (NTRS)

    Lansing, Donald L.; Park, Stephen K.

    1987-01-01

    The analysis presented provides a quantitative measure of the reconstruction or interpolation performance of linear, shift-invariant interpolants. The performance criterion is the mean square error of the difference between the sampled and reconstructed functions. The analysis is applicable to reconstruction algorithms used in image processing and to many types of splines used in numerical analysis and computer graphics. When formulated in the frequency domain, the mean square error clearly separates the contribution of the interpolation method from the contribution of the sampled data. The equations provide a rational basis for selecting an optimal interpolant; that is, one which minimizes the mean square error. The analysis has been applied to a selection of frequently used data splines and reconstruction algorithms: parametric cubic and quintic Hermite splines, exponential and nu splines (including the special case of the cubic spline), parametric cubic convolution, Keys' fourth-order cubic, and a cubic with a discontinuous first derivative. The emphasis in this paper is on the image-dependent case in which no a priori knowledge of the frequency spectrum of the sampled function is assumed.
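
    A numerical companion to the criterion: sample a test function, reconstruct it with two common interpolants, and compare mean square errors on a fine grid. The test signal and interpolant choices are illustrative; the paper's frequency-domain formulation is not reproduced here.

    import numpy as np
    from scipy.interpolate import interp1d

    # Test signal sampled at unit spacing (frequencies below Nyquist).
    xs = np.arange(0, 20, 1.0)                 # sample grid
    xf = np.linspace(0.0, 19.0, 4001)          # fine evaluation grid
    f = lambda x: np.sin(1.7 * x) + 0.5 * np.cos(0.6 * x)

    truth = f(xf)
    for kind in ("linear", "cubic"):
        recon = interp1d(xs, f(xs), kind=kind)(xf)
        mse = np.mean((recon - truth) ** 2)
        print(f"{kind:6s} interpolant  MSE = {mse:.3e}")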

  1. Integration of environmental and human health risk assessment for industries using hazardous materials: a quantitative multi criteria approach for environmental decision makers.

    PubMed

    Topuz, E; Talinli, I; Aydin, E

    2011-02-01

    Environmental management, for which environmental and human health risk assessment is the first stage, is a requirement for industries both before construction and during operation in order to sustain improved quality of life in the ecosystem. Therefore, the aim of this study is to propose an approach that integrates environmental and human health risk assessment for industries using hazardous materials, in order to support environmental decision makers with quantitative and directive results. The analytic hierarchy process and fuzzy logic are used as tools to handle problems caused by the complexity of the environment and uncertain data. When the proposed approach was applied to a scenario, it was concluded that it is possible to define risk sources together with their risk classes and their membership degrees in those classes, which enables the decision maker to determine which risk source has priority. In addition, the decision maker can easily identify and rank the factors contributing to those risk sources by means of their priority weights. As a result, environmental decision makers can use this approach while developing management alternatives for planned and operating industrial plants using hazardous materials. PMID:21111481
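
    A sketch of the analytic hierarchy process step: priority weights from the principal eigenvector of a pairwise comparison matrix, with Saaty's consistency ratio as a sanity check. The criteria and judgments below are hypothetical, not the paper's.

    import numpy as np

    def ahp_weights(A):
        """Priority vector = principal eigenvector of pairwise matrix A;
        consistency ratio CR = ((lambda_max - n) / (n - 1)) / RI."""
        w, v = np.linalg.eig(A)
        i = np.argmax(w.real)
        weights = np.abs(v[:, i].real)
        weights /= weights.sum()
        n = A.shape[0]
        ri = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12}[n]  # Saaty's RI
        ci = (w[i].real - n) / (n - 1)
        return weights, (ci / ri if ri else 0.0)

    # Hypothetical pairwise comparisons of three risk criteria
    # (toxicity, flammability, exposure potential) on Saaty's 1-9 scale.
    A = np.array([[1.0,  3.0, 5.0],
                  [1/3., 1.0, 2.0],
                  [1/5., 1/2., 1.0]])
    w, cr = ahp_weights(A)
    print(f"weights: {np.round(w, 3)}, consistency ratio: {cr:.3f} (<0.10 ok)")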

  2. Hydrology Analysis and Modelling for Klang River Basin Flood Hazard Map

    NASA Astrophysics Data System (ADS)

    Sidek, L. M.; Rostam, N. E.; Hidayah, B.; Roseli, ZA; Majid, W. H. A. W. A.; Zahari, N. Z.; Salleh, S. H. M.; Ahmad, R. D. R.; Ahmad, M. N.

    2016-03-01

    Flooding, a common environmental hazard worldwide, has increased in recent times as a result of climate change and urbanization, with the effects felt more in developing countries. The exposure of Tenaga Nasional Berhad (TNB) substations to flooding has grown rapidly because many existing substations are located in flood-prone areas. Understanding the impact of floods on its substations, TNB has adopted non-structural mitigation by integrating the Flood Hazard Map with its substation planning. Hydrological analysis is the key part of this work, providing the runoff input for the hydraulic modelling.

  3. Quantitative genetic analysis of injury liability in infants and toddlers

    SciTech Connect

    Phillips, K.; Matheny, A.P. Jr.

    1995-02-27

    A threshold model of latent liability was applied to infant and toddler twin data on total count of injuries sustained during the interval from birth to 36 months of age. A quantitative genetic analysis of estimated twin correlations in injury liability indicated strong genetic dominance effects, but no additive genetic variance was detected. Because interpretations involving overdominance have little research support, the results may be due to low order epistasis or other interaction effects. Boys had more injuries than girls, but this effect was found only for groups whose parents were prompted and questioned in detail about their children's injuries. Activity and impulsivity are two behavioral predictors of childhood injury, and the results are discussed in relation to animal research on infant and adult activity levels, and impulsivity in adult humans. Genetic epidemiological approaches to childhood injury should aid in targeting higher risk children for preventive intervention. 30 refs., 4 figs., 3 tabs.

  4. Quantitative image analysis of WE43-T6 cracking behavior

    NASA Astrophysics Data System (ADS)

    Ahmad, A.; Yahya, Z.

    2013-06-01

    Environment-assisted cracking of WE43 cast magnesium (4.2 wt.% Y, 2.3 wt.% Nd, 0.7% Zr, 0.8% HRE) in the T6 peak-aged condition was induced in ambient air in notched specimens. The mechanism of fracture was studied using electron backscatter diffraction, serial sectioning and in situ observations of crack propagation. The intermetallic material (rare-earth-enriched divorced intermetallic retained at grain boundaries and predominantly at triple points) was found to play a significant role in initiating the cracks that lead to failure of this material. Quantitative measurements were required for this project. The populations of intermetallic particles and clusters of intermetallic particles were analyzed using image analysis of metallographic images. This is part of the work to generate a theoretical model of the effect of notch geometry on the static fatigue strength of this material.

  5. Preparation of Buffers. An Experiment for Quantitative Analysis Laboratory

    NASA Astrophysics Data System (ADS)

    Buckley, P. T.

    2001-10-01

    In our experience, students who have a solid grounding in the theoretical aspects of buffers, buffer preparation, and buffering capacity are often at a loss when required to actually prepare a buffer in a research setting. However, there are very few published laboratory experiments pertaining to buffers. This laboratory experiment for the undergraduate quantitative analysis lab gives students hands-on experience in the preparation of buffers. By preparing a buffer to a randomly chosen pH value and comparing the theoretical pH to the actual pH, students apply their theoretical understanding of the Henderson-Hasselbalch equation, activity coefficients, and the effect of adding acid or base to a buffer. This experiment gives students experience in buffer preparation for research situations and helps them in advanced courses such as biochemistry where a fundamental knowledge of buffer systems is essential.
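
    The target calculation follows directly from the Henderson-Hasselbalch equation, pH = pKa + log([A-]/[HA]). A sketch for an acetate buffer that, like the students' first pass, ignores the activity corrections the experiment goes on to probe; the target pH and total concentration are illustrative.

    def buffer_ratio(ph_target, pka):
        """[A-]/[HA] required for a target pH (Henderson-Hasselbalch)."""
        return 10 ** (ph_target - pka)

    # Acetate buffer (pKa ~ 4.76), target pH 5.00, 0.100 M total.
    pka, ph, c_total = 4.76, 5.00, 0.100
    r = buffer_ratio(ph, pka)
    c_base = c_total * r / (1 + r)      # conjugate base (acetate)
    c_acid = c_total - c_base           # weak acid (acetic acid)
    print(f"[A-]/[HA] = {r:.2f}")
    print(f"acetate: {c_base:.4f} M, acetic acid: {c_acid:.4f} M")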

  6. Quantitative Image Analysis of HIV-1 Infection in Lymphoid Tissue

    NASA Astrophysics Data System (ADS)

    Haase, Ashley T.; Henry, Keith; Zupancic, Mary; Sedgewick, Gerald; Faust, Russell A.; Melroe, Holly; Cavert, Winston; Gebhard, Kristin; Staskus, Katherine; Zhang, Zhi-Qiang; Dailey, Peter J.; Balfour, Henry H., Jr.; Erice, Alejo; Perelson, Alan S.

    1996-11-01

    Tracking human immunodeficiency virus-type 1 (HIV-1) infection at the cellular level in tissue reservoirs provides opportunities to better understand the pathogenesis of infection and to rationally design and monitor therapy. A quantitative technique was developed to determine viral burden in two important cellular compartments in lymphoid tissues. Image analysis and in situ hybridization were combined to show that in the presymptomatic stages of infection there is a large, relatively stable pool of virions on the surfaces of follicular dendritic cells and a smaller pool of productively infected cells. Despite evidence of constraints on HIV-1 replication in the infected cell population in lymphoid tissues, estimates of the numbers of these cells and the virus they could produce are consistent with the quantities of virus that have been detected in the bloodstream. The cellular sources of virus production and storage in lymphoid tissues can now be studied with this approach over the course of infection and treatment.

  7. Quantitative microstructure analysis of polymer-modified mortars.

    PubMed

    Jenni, A; Herwegh, M; Zurbriggen, R; Aberle, T; Holzer, L

    2003-11-01

    Digital light, fluorescence and electron microscopy in combination with wavelength-dispersive spectroscopy were used to visualize individual polymers, air voids, cement phases and filler minerals in a polymer-modified cementitious tile adhesive. In order to investigate the evolution and processes involved in formation of the mortar microstructure, quantifications of the phase distribution in the mortar were performed including phase-specific imaging and digital image analysis. The required sample preparation techniques and imaging related topics are discussed. As a form of case study, the different techniques were applied to obtain a quantitative characterization of a specific mortar mixture. The results indicate that the mortar fractionates during different stages ranging from the early fresh mortar until the final hardened mortar stage. This induces process-dependent enrichments of the phases at specific locations in the mortar. The approach presented provides important information for a comprehensive understanding of the functionality of polymer-modified mortars.

  8. Quantitative analysis of forest island pattern in selected Ohio landscapes

    SciTech Connect

    Bowen, G.W.; Burgess, R.L.

    1981-07-01

    The purpose of this study was to quantitatively describe the various aspects of regional distribution patterns of forest islands and relate those patterns to other landscape features. Several maps showing the forest cover of various counties in Ohio were selected as representative examples of forest patterns to be quantified. Ten-thousand-hectare study areas (landscapes) were delineated on each map. A total of 15 landscapes representing a wide variety of forest island patterns were chosen. Data were converted into a series of continuous variables which contained information pertinent to the sizes, shapes, numbers, and spacing of woodlots within a landscape. The continuous variables were used in a factor analysis to describe the variation among landscapes in terms of forest island pattern. The results showed that forest island patterns are related to topography and other environmental features correlated with topography.

  9. Quantitative analysis of gallstones using laser-induced breakdown spectroscopy.

    PubMed

    Singh, Vivek K; Singh, Vinita; Rai, Awadhesh K; Thakur, Surya N; Rai, Pradeep K; Singh, Jagdish P

    2008-11-01

    The utility of laser-induced breakdown spectroscopy (LIBS) for categorizing different types of gallbladder stone has been demonstrated by analyzing their major and minor constituents. LIBS spectra of three types of gallstone have been recorded in the 200-900 nm spectral region. Calcium is found to be the major element in all types of gallbladder stone. The spectrophotometric method has been used to classify the stones. A calibration-free LIBS method has been used for the quantitative analysis of metal elements, and the results have been compared with those obtained from inductively coupled plasma atomic emission spectroscopy (ICP-AES) measurements. The single-shot LIBS spectra from different points on the cross section (in steps of 0.5 mm from one end to the other) of gallstones have also been recorded to study the variation of constituents from the center to the surface. The presence of different metal elements and their possible role in gallstone formation is discussed.

  10. Large-Scale Quantitative Analysis of Painting Arts

    NASA Astrophysics Data System (ADS)

    Kim, Daniel; Son, Seung-Woo; Jeong, Hawoong

    2014-12-01

    Scientists have made efforts to understand the beauty of painting art in their own languages. As digital image acquisition of painting art has made rapid progress, researchers have reached a point where it is possible to perform statistical analysis of a large-scale database of artistic paintings to build a bridge between art and science. Using digital image processing techniques, we investigate three quantitative measures of images - the usage of individual colors, the variety of colors, and the roughness of the brightness. We found a difference in color usage between classical paintings and photographs, and a significantly low color variety in the medieval period. Interestingly, moreover, the increase of the roughness exponent as painting techniques such as chiaroscuro and sfumato advanced is consistent with historical circumstances.
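
    As a rough illustration of such image measures, the sketch below computes a color-variety score (Shannon entropy of a quantized color histogram) and a crude roughness-type exponent from brightness fluctuations versus scale. Both are simplified stand-ins for the paper's exact definitions, run here on a random test image.

```python
import numpy as np

def color_variety(image_rgb, bins=16):
    """Shannon entropy of the quantized RGB histogram (proxy for color variety)."""
    q = (image_rgb // (256 // bins)).reshape(-1, 3)
    codes = q[:, 0] * bins * bins + q[:, 1] * bins + q[:, 2]
    p = np.bincount(codes, minlength=bins**3).astype(float)
    p = p[p > 0] / p.sum()
    return -(p * np.log2(p)).sum()

def roughness_exponent(gray):
    """Crude Hurst-type exponent: log-log slope of brightness fluctuation vs. scale."""
    scales = np.array([2, 4, 8, 16, 32])
    w = [np.mean(np.abs(gray[:, s:] - gray[:, :-s])) for s in scales]
    slope, _ = np.polyfit(np.log(scales), np.log(w), 1)
    return slope

rng = np.random.default_rng(0)
img = rng.integers(0, 256, (256, 256, 3))        # stand-in for a digitized painting
print(color_variety(img), roughness_exponent(img[:, :, 0].astype(float)))
```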

  11. Large-Scale Quantitative Analysis of Painting Arts

    PubMed Central

    Kim, Daniel; Son, Seung-Woo; Jeong, Hawoong

    2014-01-01

    Scientists have made efforts to understand the beauty of painting art in their own languages. As digital image acquisition of painting arts has made rapid progress, researchers have come to a point where it is possible to perform statistical analysis of a large-scale database of artistic paints to make a bridge between art and science. Using digital image processing techniques, we investigate three quantitative measures of images – the usage of individual colors, the variety of colors, and the roughness of the brightness. We found a difference in color usage between classical paintings and photographs, and a significantly low color variety of the medieval period. Interestingly, moreover, the increment of roughness exponent as painting techniques such as chiaroscuro and sfumato have advanced is consistent with historical circumstances. PMID:25501877

  12. Quantitative analysis of gallstones using laser-induced breakdown spectroscopy

    SciTech Connect

    Singh, Vivek K.; Singh, Vinita; Rai, Awadhesh K.; Thakur, Surya N.; Rai, Pradeep K.; Singh, Jagdish P

    2008-11-01

    The utility of laser-induced breakdown spectroscopy (LIBS) for categorizing different types of gallbladder stone has been demonstrated by analyzing their major and minor constituents. LIBS spectra of three types of gallstone have been recorded in the 200-900 nm spectral region. Calcium is found to be the major element in all types of gallbladder stone. The spectrophotometric method has been used to classify the stones. A calibration-free LIBS method has been used for the quantitative analysis of metal elements, and the results have been compared with those obtained from inductively coupled plasma atomic emission spectroscopy (ICP-AES) measurements. The single-shot LIBS spectra from different points on the cross section (in steps of 0.5 mm from one end to the other) of gallstones have also been recorded to study the variation of constituents from the center to the surface. The presence of different metal elements and their possible role in gallstone formation is discussed.

  13. Quantitative multielement analysis using high energy particle bombardment

    NASA Technical Reports Server (NTRS)

    Clark, P. J.; Neal, G. F.; Allen, R. O.

    1974-01-01

    Charged particles ranging in energy from 0.8 to 4.0 MeV are used to induce resonant nuclear reactions, Coulomb excitation (gamma rays), and X-ray emission in both thick and thin targets. Quantitative analysis is possible for elements from Li to Pb in complex environmental samples, although the matrix can severely reduce the sensitivity. It is necessary to use a comparator technique for the gamma-rays, while for X-rays an internal standard can be used. A USGS standard rock is analyzed for a total of 28 elements. Water samples can be analyzed either by nebulizing the sample doped with Cs or Y onto a thin formvar film or by extracting the sample (with or without an internal standard) onto ion exchange resin which is pressed into a pellet.

  14. Risk analysis of environmental hazards at the High Flux Beam Reactor

    SciTech Connect

    Boccio, J.L.; Ho, V.S.; Johnson, D.H.

    1994-01-01

    In the late 1980s, a Level 1 internal-event probabilistic risk assessment (PRA) was performed for the High-Flux Beam Reactor (HFBR), a US Department of Energy research reactor located at Brookhaven National Laboratory. Prior to the completion of that study, a Level 1 PRA for external events was initiated, including environmental hazards such as fire and internal flooding. Although this paper provides a brief summary of the risks from environmental hazards, emphasis is placed on the methodology employed in utilizing industrial event databases for event frequency determination for the HFBR complex. Since the equipment in the HFBR differs from that of, say, a commercial nuclear power plant, the current approach is to categorize the industrial events according to the hazard initiators instead of by initiator location. A general overview of the analysis is presented first.

  15. Considering both aleatory variability and epistemic variability in probabilistic seismic hazard analysis

    NASA Astrophysics Data System (ADS)

    Sung, Chih-Hsuan; Gao, Jia-Cian; Lee, Chyi-Tyi

    2015-04-01

    In modern probabilistic seismic hazard analysis (PSHA), the standard deviation (sigma) of the total variability is included in the integration for the seismic exceedance rate, and this leads to increased hazard estimates. Epistemic uncertainty results from incomplete knowledge of the earthquake process and is related to neither the temporal nor the spatial variation of ground motions; it should not be included in the integration, but can instead be treated in the logic trees. This study uses data from Taiwan as an example, testing a case in Taipei. Results reveal that if only the aleatory variability is considered in the integration, the hazard level is reduced by about 33% at the 475-year return period, and by about 36% and 50% at the 10,000-year and 100,000-year return periods, respectively. However, if epistemic variability is considered in the logic trees in addition to the aleatory variability in the integration, the hazard level is similar to that obtained using the total variability, being only slightly smaller at long return periods. Much work remains to be done on reducing the hazard level to a reasonable value.
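
    The role sigma plays in the integration can be sketched as follows: the aleatory standard deviation enters the exceedance probability inside the hazard integral, so reducing sigma from the total to the aleatory-only value directly lowers the computed hazard. The toy ground-motion model and recurrence rates below are illustrative assumptions, not the study's models.

```python
import numpy as np
from scipy.stats import norm

def ln_median_pga(mag, dist_km, a=-3.5, b=0.9, c=-1.2):
    """Toy ground-motion model (hypothetical coefficients): ln PGA = a + b*M + c*ln R."""
    return a + b * mag + c * np.log(dist_km)

def annual_exceedance_rate(pga_target, mags, rates, dist_km=20.0, sigma=0.6):
    """Sum over magnitude bins of rate * P(ln PGA > ln target | M, R); sigma is
    the aleatory (or total) standard deviation of the ground-motion model."""
    eps = (np.log(pga_target) - ln_median_pga(mags, dist_km)) / sigma
    return np.sum(rates * norm.sf(eps))

mags = np.arange(5.0, 7.6, 0.1)
rates = 0.05 * 10 ** (-1.0 * (mags - 5.0)) * 0.1      # toy Gutenberg-Richter, per bin
lam_total = annual_exceedance_rate(0.2, mags, rates, sigma=0.6)   # total sigma
lam_aleat = annual_exceedance_rate(0.2, mags, rates, sigma=0.45)  # aleatory-only sigma
print(lam_total, lam_aleat)   # the smaller sigma yields the lower hazard estimate
```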

  16. Considering both aleatory variability and epistemic variability in probabilistic seismic hazard analysis

    NASA Astrophysics Data System (ADS)

    Gao, J. C.; Lee, C. T.

    2014-12-01

    In modern probabilistic seismic hazard analysis (PSHA), the standard deviation (sigma) of the total variability is included in the integration for the seismic exceedance rate, and this leads to increased hazard estimates. Epistemic uncertainty results from incomplete knowledge of the earthquake process and is related to neither the temporal nor the spatial variation of ground motions; it should not be included in the integration, but can instead be treated in the logic trees. This study uses data from Taiwan as an example, testing a case in Taipei. Results reveal that if only the aleatory variability is considered in the integration, the hazard level is reduced by about 33% at the 475-year return period, and by about 36% and 50% at the 10,000-year and 100,000-year return periods, respectively. However, if epistemic variability is considered in the logic trees in addition to the aleatory variability in the integration, the hazard level is similar to that obtained using the total variability, being only slightly smaller at long return periods. Much work remains to be done on reducing the hazard level to a reasonable value.

  17. Ocean thermal conversion (OTEC) project bottom cable protection study: environmental characteristics and hazards analysis

    SciTech Connect

    Chern, C.; Tudor, W.

    1981-10-01

    Seafloor cable-protection criteria and technology as applied to the four proposed OTEC plant sites and cable routes at Hawaii, Puerto Rico, Guam and Florida were examined. Study of environmental characteristics for each site covered: (A) natural factors of location, tide and currents, wind and wave, bottom soil type and seafloor movement; and (B) man-made factors such as ship traffic, fishing activities, ocean mining, government regulations. These characteristics were studied to determine the hazards which are potential sources of damage to a cable system. Hazards include: chafe and corrosion, hydrodynamic forces due to wave and current action, mudslides, earthquakes, trawler and/or dredge action and ship anchors. An analysis of the history of submarine-cable failures was conducted. Included are the probabilities of damage related to water depth. Probabilities become minimal for all hazards in water depths of 1500 feet and more. Chafe and corrosion had the highest probability of causing damage to a seafloor cable compared to the other hazards. Because of the hazards present at all sites, cable burial is recommended as the best means of protection.

  18. Mass spectrometry-based quantitative analysis and biomarker discovery.

    PubMed

    Suzuki, Naoto

    2011-01-01

    Mass spectrometry-based quantitative analysis and biomarker discovery using a metabolomics approach represent one of the major platforms in clinical fields, including prognosis and diagnosis, assessment of severity and response to therapy in a number of clinical disease states, as well as therapeutic drug monitoring (TDM). This review first summarizes our mass spectrometry-based research strategy and some results on the relationship between cysteinyl leukotriene (cysLT), thromboxane (TX), 12-hydroxyeicosatetraenoic acid (12-HETE), and other metabolites of arachidonic acid and diseases such as atopic dermatitis, rheumatoid arthritis, and diabetes mellitus. To evaluate the role of these arachidonic acid metabolites in disease status, we developed sensitive determination methods using simple solid-phase extraction and applied them in clinical settings. In addition to these endogenous compounds, we have used mass spectrometry to develop practically applicable quantitative methods for TDM. A representative example is a TDM method for sirolimus, an immunosuppressant given to organ-transplant recipients that requires rigorous monitoring of blood levels. Having recognized the great potential of mass spectrometry during this research, we became interested in metabolomics as the non-targeted analysis of metabolites. Our established strategy for metabolomics investigation now applies to samples from cells, animals, and humans to separate groups based on altered patterns of metabolites in biological fluids and to identify metabolites as potential biomarkers discriminating between groups. We would be honored if our research using mass spectrometry contributes useful information to the field of medical pharmacy. PMID:21881303

  19. A Novel Quantitative Approach to Concept Analysis: The Internomological Network

    PubMed Central

    Cook, Paul F.; Larsen, Kai R.; Sakraida, Teresa J.; Pedro, Leli

    2012-01-01

    Background When a construct such as patients’ transition to self-management of chronic illness is studied by researchers across multiple disciplines, the meaning of key terms can become confused. This results from inherent problems in language where a term can have multiple meanings (polysemy) and different words can mean the same thing (synonymy). Objectives To test a novel quantitative method for clarifying the meaning of constructs by examining the similarity of published contexts in which they are used. Method Published terms related to the concept transition to self-management of chronic illness were analyzed using the internomological network (INN), a type of latent semantic analysis to calculate the mathematical relationships between constructs based on the contexts in which researchers use each term. This novel approach was tested by comparing results to those from concept analysis, a best-practice qualitative approach to clarifying meanings of terms. By comparing results of the two methods, the best synonyms of transition to self-management, as well as key antecedent, attribute, and consequence terms, were identified. Results Results from INN analysis were consistent with those from concept analysis. The potential synonyms self-management, transition, and adaptation had the greatest utility. Adaptation was the clearest overall synonym, but had lower cross-disciplinary use. The terms coping and readiness had more circumscribed meanings. The INN analysis confirmed key features of transition to self-management, and suggested related concepts not found by the previous review. Discussion The INN analysis is a promising novel methodology that allows researchers to quantify the semantic relationships between constructs. The method works across disciplinary boundaries, and may help to integrate the diverse literature on self-management of chronic illness. PMID:22592387
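
    A minimal sketch of the latent-semantic-analysis machinery underlying an INN-style calculation, assuming a hypothetical mini-corpus: terms are embedded from the documents in which they occur, and the similarity of two constructs is read off as the cosine between their term vectors.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD
from sklearn.metrics.pairwise import cosine_similarity

# Hypothetical mini-corpus of abstracts mentioning the candidate constructs.
docs = [
    "patients achieve self-management of chronic illness after transition",
    "transition to self-management requires readiness and adaptation",
    "coping and adaptation during chronic illness self-management",
    "readiness for transition predicts adaptation in chronic disease",
]
tfidf = TfidfVectorizer()
X = tfidf.fit_transform(docs)                  # documents x terms
svd = TruncatedSVD(n_components=2, random_state=0)
svd.fit(X)
term_vecs = svd.components_.T                  # term vectors in the latent space
terms = list(tfidf.get_feature_names_out())
i, j = terms.index("transition"), terms.index("adaptation")
print(cosine_similarity(term_vecs[[i]], term_vecs[[j]]))  # construct similarity
```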

  20. Quantitative analysis of protein-ligand interactions by NMR.

    PubMed

    Furukawa, Ayako; Konuma, Tsuyoshi; Yanaka, Saeko; Sugase, Kenji

    2016-08-01

    Protein-ligand interactions have been commonly studied through static structures of the protein-ligand complex. Recently, however, there has been increasing interest in investigating the dynamics of protein-ligand interactions both for fundamental understanding of the underlying mechanisms and for drug development. NMR is a versatile and powerful tool, especially because it provides site-specific quantitative information. NMR has widely been used to determine the dissociation constant (KD), in particular, for relatively weak interactions. The simplest NMR method is a chemical-shift titration experiment, in which the chemical-shift changes of a protein in response to ligand titration are measured. There are other quantitative NMR methods, but they mostly apply only to interactions in the fast-exchange regime. These methods derive the dissociation constant from population-averaged NMR quantities of the free and bound states of a protein or ligand. In contrast, the recent advent of new relaxation-based experiments, including R2 relaxation dispersion and ZZ-exchange, has enabled us to obtain kinetic information on protein-ligand interactions in the intermediate- and slow-exchange regimes. Based on R2 dispersion or ZZ-exchange, methods that can determine the association rate, kon, dissociation rate, koff, and KD have been developed. In these approaches, R2 dispersion or ZZ-exchange curves are measured for multiple samples with different protein and/or ligand concentration ratios, and the relaxation data are fitted to theoretical kinetic models. It is critical to choose an appropriate kinetic model, such as the two- or three-state exchange model, to derive the correct kinetic information. The R2 dispersion and ZZ-exchange methods are suitable for the analysis of protein-ligand interactions with a micromolar or sub-micromolar dissociation constant but not for very weak interactions, which are typical in very fast exchange. This contrasts with the NMR methods that are used
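
    A minimal sketch of the simplest method mentioned above, the chemical-shift titration in the fast-exchange regime: the observed shift change follows a single-site binding isotherm, and KD is obtained by nonlinear least-squares fitting. All concentrations and parameters are illustrative.

```python
import numpy as np
from scipy.optimize import curve_fit

P = 50e-6  # protein concentration (M), held fixed during the titration

def dshift(L, kd, dmax):
    """Fast-exchange binding isotherm for the observed chemical-shift change."""
    b = P + L + kd
    frac_bound = (b - np.sqrt(b**2 - 4 * P * L)) / (2 * P)
    return dmax * frac_bound

# Synthetic titration data (illustrative KD = 120 uM, dmax = 0.30 ppm)
L = np.array([0, 25, 50, 100, 200, 400, 800]) * 1e-6
obs = dshift(L, 120e-6, 0.30) + 0.003 * np.random.default_rng(1).standard_normal(L.size)

(kd_fit, dmax_fit), _ = curve_fit(dshift, L, obs, p0=(1e-4, 0.2))
print(f"KD = {kd_fit * 1e6:.0f} uM, dmax = {dmax_fit:.2f} ppm")
```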

  1. Enhancing local action planning through quantitative flood risk analysis: a case study in Spain

    NASA Astrophysics Data System (ADS)

    Castillo-Rodríguez, Jesica Tamara; Escuder-Bueno, Ignacio; Perales-Momparler, Sara; Ramón Porta-Sancho, Juan

    2016-07-01

    This article presents a method to incorporate and promote quantitative risk analysis to support local action planning against flooding. The proposed approach aims to provide a framework for local flood risk analysis, combining hazard mapping with vulnerability data to quantify risk in terms of expected annual affected population, potential injuries, number of fatalities, and economic damages. Flood risk is estimated by combining GIS data on loads, system response, and consequences, using event tree modelling for risk calculation. The study area is the city of Oliva, located on the eastern coast of Spain. Results from risk modelling have been used to inform local action planning and to assess the benefits of structural and non-structural risk reduction measures. Results show the potential impact on risk reduction of flood defences and improved warning communication schemes through local action planning: societal flood risk (in terms of annual expected affected population) would be reduced by up to 51 % by combining structural and non-structural measures. In addition, the effect of seasonal population variability is analysed (annual expected affected population ranges from 82 to 107 % of that in the current situation, depending on occupancy rates in hotels and campsites). The results highlight the need for robust and standardized methods so that urban flood risk analysis can be replicated at regional and national scales.
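
    A toy event-tree calculation of the kind described, under entirely hypothetical flood loads and branch probabilities, showing how an expected annual affected population is assembled and how structural and non-structural measures reduce it:

```python
# Hypothetical event-tree sketch: expected annual affected population as a
# probability-weighted sum over flood-load bands and a warning-failure branch.
loads = [  # (annual probability of the load band, population affected if flooded)
    (1 / 10 - 1 / 50, 500),     # 10- to 50-year events
    (1 / 50 - 1 / 100, 2000),   # 50- to 100-year events
    (1 / 100, 6000),            # events rarer than 100 years
]

def expected_affected(loads, p_warning_fails, mitigation=1.0):
    """mitigation < 1 models structural measures reducing consequences."""
    return sum(p * pop * p_warning_fails * mitigation for p, pop in loads)

base = expected_affected(loads, p_warning_fails=0.3)
improved = expected_affected(loads, p_warning_fails=0.1, mitigation=0.7)
print(base, improved, 1 - improved / base)   # fractional societal risk reduction
```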

  2. Description of the Northwest hazardous waste site data base and preliminary analysis of site characteristics

    SciTech Connect

    Woodruff, D.L.; Hartz, K.E.; Triplett, M.B.

    1988-08-01

    The Northwest Hazardous Waste RD and D Center (the Center) conducts research, development, and demonstration (RD and D) activities for hazardous and radioactive mixed-waste technologies applicable to remediating sites in the states of Idaho, Montana, Oregon, and Washington. To properly set priorities for these RD and D activities and to target development efforts it is necessary to understand the nature of the sites requiring remediation. A data base of hazardous waste site characteristics has been constructed to facilitate this analysis. The data base used data from EPA's Region X Comprehensive Environmental Response, Compensation, and Liability Information System (CERCLIS) and from Preliminary Assessment/Site Investigation (PA/SI) forms for sites in Montana. The Center's data base focuses on two sets of sites--those on the National Priorities List (NPL) and other sites that are denoted as ''active'' CERCLIS sites. Active CERCLIS sites are those sites that are undergoing active investigation and analysis. The data base contains information for each site covering site identification and location, type of industry associated with the site, waste categories present (e.g., heavy metals, pesticides, etc.), methods of disposal (e.g., tanks, drums, land, etc.), waste forms (e.g., liquid, solid, etc.), and hazard targets (e.g., surface water, groundwater, etc.). As part of this analysis, the Northwest region was divided into three geographic subregions to identify differences in disposal site characteristics within the Northwest. 2 refs., 18 figs., 5 tabs.

  3. Quantitative Medical Image Analysis for Clinical Development of Therapeutics

    NASA Astrophysics Data System (ADS)

    Analoui, Mostafa

    There has been significant progress in the development of therapeutics for the prevention and management of several disease areas in recent years, leading to increased average life expectancy, as well as quality of life, globally. However, due to the complexity of addressing a number of medical needs and the financial burden of developing new classes of therapeutics, there is a need for better tools for decision making and for validation of the efficacy and safety of new compounds. Numerous biological markers (biomarkers) have been proposed either as adjuncts to current clinical endpoints or as surrogates. Imaging biomarkers are among the most rapidly growing biomarkers being examined to expedite effective and rational drug development. Clinical imaging often involves a complex set of multi-modality data sets that require rapid and objective analysis, independent of the reviewer's bias and training. In this chapter, an overview of imaging biomarkers for drug development is offered, along with the challenges that necessitate quantitative and objective image analysis. Examples of automated and semi-automated analysis approaches are provided, along with a technical review of such methods. These examples include the use of 3D MRI for osteoarthritis, ultrasound vascular imaging, and dynamic contrast-enhanced MRI for oncology. Additionally, a brief overview of regulatory requirements is discussed. In conclusion, this chapter highlights key challenges and future directions in this area.

  4. Automatic quantitative analysis of cardiac MR perfusion images

    NASA Astrophysics Data System (ADS)

    Breeuwer, Marcel M.; Spreeuwers, Luuk J.; Quist, Marcel J.

    2001-07-01

    Magnetic Resonance Imaging (MRI) is a powerful technique for imaging cardiovascular diseases. The introduction of cardiovascular MRI into clinical practice is however hampered by the lack of efficient and accurate image analysis methods. This paper focuses on the evaluation of blood perfusion in the myocardium (the heart muscle) from MR images, using contrast-enhanced ECG-triggered MRI. We have developed an automatic quantitative analysis method, which works as follows. First, image registration is used to compensate for translation and rotation of the myocardium over time. Next, the boundaries of the myocardium are detected and for each position within the myocardium a time-intensity profile is constructed. The time interval during which the contrast agent passes for the first time through the left ventricle and the myocardium is detected and various parameters are measured from the time-intensity profiles in this interval. The measured parameters are visualized as color overlays on the original images. Analysis results are stored, so that they can later on be compared for different stress levels of the heart. The method is described in detail in this paper and preliminary validation results are presented.
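
    A minimal sketch of the semi-quantitative parameters typically extracted from such time-intensity profiles (baseline-corrected peak, time to peak, maximum upslope), computed here on a synthetic first-pass curve rather than real MR data:

```python
import numpy as np

def perfusion_params(t, intensity):
    """Simple semi-quantitative parameters from one time-intensity profile."""
    baseline = intensity[:3].mean()              # pre-contrast signal level
    s = intensity - baseline
    peak_idx = int(np.argmax(s))
    upslope = np.max(np.gradient(s[: peak_idx + 1], t[: peak_idx + 1]))
    return {"peak": s[peak_idx], "time_to_peak": t[peak_idx], "max_upslope": upslope}

t = np.arange(0, 30, 1.0)             # s, roughly one image per heartbeat
curve = (t**3) * np.exp(-t / 3.0)     # synthetic gamma-variate first pass
print(perfusion_params(t, curve))
```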

  5. Multiple Trait Analysis of Genetic Mapping for Quantitative Trait Loci

    PubMed Central

    Jiang, C.; Zeng, Z. B.

    1995-01-01

    We present in this paper models and statistical methods for performing multiple trait analysis on mapping quantitative trait loci (QTL) based on the composite interval mapping method. By taking into account the correlated structure of multiple traits, this joint analysis has several advantages, compared with separate analyses, for mapping QTL, including the expected improvement on the statistical power of the test for QTL and on the precision of parameter estimation. Also this joint analysis provides formal procedures to test a number of biologically interesting hypotheses concerning the nature of genetic correlations between different traits. Among the testing procedures considered are those for joint mapping, pleiotropy, QTL by environment interaction, and pleiotropy vs. close linkage. The test of pleiotropy (one pleiotropic QTL at a genome position) vs. close linkage (multiple nearby nonpleiotropic QTL) can have important implications for our understanding of the nature of genetic correlations between different traits in certain regions of a genome and also for practical applications in animal and plant breeding because one of the major goals in breeding is to break unfavorable linkage. Results of extensive simulation studies are presented to illustrate various properties of the analyses. PMID:7672582

  6. An approach for quantitative image quality analysis for CT

    NASA Astrophysics Data System (ADS)

    Rahimi, Amir; Cochran, Joe; Mooney, Doug; Regensburger, Joe

    2016-03-01

    An objective and standardized approach to assess image quality of Computed Tomography (CT) systems is required in a wide variety of imaging processes to identify CT systems appropriate for a given application. We present an overview of the framework we have developed to help standardize and to objectively assess CT image quality for different models of CT scanners used for security applications. Within this framework, we have developed methods to quantitatively measure metrics that should correlate with feature identification, detection accuracy and precision, and image registration capabilities of CT machines, and to identify strengths and weaknesses in different CT imaging technologies in transportation security. To that end we have designed, developed, and constructed phantoms that allow for systematic and repeatable measurements of roughly 88 image quality metrics, representing modulation transfer function, noise equivalent quanta, noise power spectra, slice sensitivity profiles, streak artifacts, CT number uniformity, CT number consistency, object length accuracy, CT number path length consistency, and object registration. Furthermore, we have developed a sophisticated MATLAB-based image analysis tool kit to analyze CT-generated images of phantoms and report these metrics in a format that is standardized across the considered models of CT scanners, allowing for comparative image quality analysis within a CT model or between different CT models. In addition, we have developed a modified sparse principal component analysis (SPCA) method that generates a modified set of components with sparse loadings, as compared with standard principal component analysis (PCA), used in conjunction with the Hotelling T2 statistical analysis method to compare, qualify, and detect faults in the tested systems.
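
    As one concrete example of the listed metrics, the modulation transfer function can be estimated from an edge response: differentiate the edge-spread function to obtain the line-spread function and take the magnitude of its Fourier transform. The sketch below uses a synthetic blurred edge and an assumed sampling pitch, not data from the phantoms described.

```python
import numpy as np

def mtf_from_edge(esf, pixel_mm=0.1):
    """Modulation transfer function from an oversampled edge-spread function."""
    lsf = np.gradient(esf)                 # line-spread function
    lsf = lsf / lsf.sum()                  # unit area so that MTF(0) = 1
    mtf = np.abs(np.fft.rfft(lsf))
    freq = np.fft.rfftfreq(lsf.size, d=pixel_mm)   # cycles per mm
    return freq, mtf

x = np.linspace(-10, 10, 201)
esf = 0.5 * (1 + np.tanh(x / 1.5))         # synthetic blurred edge profile
freq, mtf = mtf_from_edge(esf, pixel_mm=0.1)
print(freq[mtf < 0.1][0])                  # crude limiting spatial resolution
```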

  7. Investigation of techniques for the development of seismic design basis using the probabilistic seismic hazard analysis

    SciTech Connect

    Bernreuter, D.L.; Boissonnade, A.C.; Short, C.M.

    1998-04-01

    The Nuclear Regulatory Commission asked Lawrence Livermore National Laboratory to form a group of experts to assist them in revising the seismic and geologic siting criteria for nuclear power plants, Appendix A to 10 CFR Part 100. This document describes a deterministic approach for determining a Safe Shutdown Earthquake (SSE) Ground Motion for a nuclear power plant site. One disadvantage of this approach is the difficulty of integrating differences of opinions and differing interpretations into seismic hazard characterization. In answer to this, probabilistic seismic hazard assessment methodologies incorporate differences of opinion and interpretations among earth science experts. For this reason, probabilistic hazard methods were selected for determining SSEs for the revised regulation, 10 CFR Part 100.23. However, because these methodologies provide a composite analysis of all possible earthquakes that may occur, they do not provide the familiar link between seismic design loading requirements and engineering design practice. Therefore, approaches used to characterize seismic events (magnitude and distance) which best represent the ground motion level determined with the probabilistic hazard analysis were investigated. This report summarizes investigations conducted at 69 nuclear reactor sites in the central and eastern U.S. for determining SSEs using probabilistic analyses. Alternative techniques are presented along with justification for key choices. 16 refs., 32 figs., 60 tabs.

  8. Uncertain Characterization of Flood Hazard Using Bivariate Analysis Based on Copulas

    NASA Astrophysics Data System (ADS)

    Candela, Angela; Tito Aronica, Giuseppe

    2015-04-01

    This study presents a methodology to derive probabilistic flood hazard maps in flood-prone areas, taking into account uncertainties in the definition of design hydrographs. In particular, we present an innovative approach to obtain probabilistic inundation and flood hazard maps in which the hydrological input (synthetic flood design event) to a 2D hydraulic model is defined by generating flood peak discharges and volumes from a bivariate statistical analysis, through the use of copulas. This study also aims to quantify the contribution of boundary condition uncertainty in order to explore the impact of this uncertainty on probabilistic flood hazard mapping. The uncertainty of extreme flood events is considered in terms of different possible combinations of peak discharge and flood volume given by the copula. Further, we analyze the role of a multivariate probabilistic hydrological analysis on inundation and flood hazard maps, highlighting the differences between deterministic and probabilistic approaches. The methodology has been applied to a study area located in Sicily that was subject to several flooding events in the past.
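
    A minimal sketch of generating correlated peak-discharge/volume design pairs through a copula. A Gaussian copula with illustrative marginal distributions stands in here for whatever copula family and fitted marginals the study actually used:

```python
import numpy as np
from scipy import stats

rho = 0.75                                   # assumed dependence between Q and V
n = 10000
rng = np.random.default_rng(42)
z = rng.multivariate_normal([0, 0], [[1, rho], [rho, 1]], size=n)
u = stats.norm.cdf(z)                        # correlated uniform marginals

# Illustrative marginals: Gumbel for peak discharge, lognormal for volume
q_peak = stats.gumbel_r.ppf(u[:, 0], loc=150, scale=60)   # m3/s
volume = stats.lognorm.ppf(u[:, 1], s=0.5, scale=8.0)     # hm3

pairs = np.column_stack([q_peak, volume])    # candidate design events for the 2D model
print(np.corrcoef(q_peak, volume)[0, 1])     # check the induced correlation
```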

  9. Multi-hazard risk analysis using the FP7 RASOR Platform

    NASA Astrophysics Data System (ADS)

    Koudogbo, Fifamè N.; Duro, Javier; Rossi, Lauro; Rudari, Roberto; Eddy, Andrew

    2014-10-01

    Climate change challenges our understanding of risk by modifying hazards and their interactions. Sudden increases in population and rapid urbanization are changing exposure to risk around the globe, making impacts harder to predict. Despite the availability of operational mapping products, there is no single tool to integrate diverse data and products across hazards, update exposure data quickly and make scenario-based predictions to support both short and long-term risk-related decisions. RASOR (Rapid Analysis and Spatialization Of Risk) will develop a platform to perform multi-hazard risk analysis for the full cycle of disaster management, including targeted support to critical infrastructure monitoring and climate change impact assessment. A scenario-driven query system simulates future scenarios based on existing or assumed conditions and compares them with historical scenarios. RASOR will thus offer a single work environment that generates new risk information across hazards, across data types (satellite EO, in-situ), across user communities (global, local, climate, civil protection, insurance, etc.) and across the world. Five case study areas are considered within the project, located in Haiti, Indonesia, Netherlands, Italy and Greece. Initially available over those demonstration areas, RASOR will ultimately offer global services to support in-depth risk assessment and full-cycle risk management.

  10. Quantitative DNA Methylation Analysis of Candidate Genes in Cervical Cancer

    PubMed Central

    Siegel, Erin M.; Riggs, Bridget M.; Delmas, Amber L.; Koch, Abby; Hakam, Ardeshir; Brown, Kevin D.

    2015-01-01

    Aberrant DNA methylation has been observed in cervical cancer; however, most studies have used non-quantitative approaches to measure DNA methylation. The objective of this study was to quantify methylation within a select panel of genes previously identified as targets for epigenetic silencing in cervical cancer and to identify genes with elevated methylation that can distinguish cancer from normal cervical tissues. We identified 49 women with invasive squamous cell cancer of the cervix and 22 women with normal cytology specimens. Bisulfite-modified genomic DNA was amplified and quantitative pyrosequencing completed for 10 genes (APC, CCNA, CDH1, CDH13, WIF1, TIMP3, DAPK1, RARB, FHIT, and SLIT2). A Methylation Index was calculated as the mean percent methylation across all CpG sites analyzed per gene (~4-9 CpG sites) per sequence. A binary cut-point was defined at >15% methylation. Sensitivity, specificity and area under ROC curve (AUC) of methylation in individual genes or a panel was examined. The median methylation index was significantly higher in cases compared to controls in 8 genes, whereas there was no difference in median methylation for 2 genes. Compared to HPV and age, the combination of DNA methylation level of DAPK1, SLIT2, WIF1 and RARB with HPV and age significantly improved the AUC from 0.79 to 0.99 (95% CI: 0.97–1.00, p-value = 0.003). Pyrosequencing analysis confirmed that several genes are common targets for aberrant methylation in cervical cancer and DNA methylation level of four genes appears to increase specificity to identify cancer compared to HPV detection alone. Alterations in DNA methylation of specific genes in cervical cancers, such as DAPK1, RARB, WIF1, and SLIT2, may also occur early in cervical carcinogenesis and should be evaluated. PMID:25826459
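
    A small sketch of the quantitative workflow described, on simulated numbers: a per-gene Methylation Index as the mean percent methylation across CpG sites, the >15% cut-point, and an AUC for case/control discrimination.

```python
import numpy as np
from sklearn.metrics import roc_auc_score

# Illustrative data: per-specimen percent methylation across ~6 CpG sites for one gene.
rng = np.random.default_rng(7)
cases = rng.uniform(10, 60, size=(49, 6))     # invasive cancer specimens
controls = rng.uniform(0, 12, size=(22, 6))   # normal cytology specimens

meth_index = np.r_[cases.mean(axis=1), controls.mean(axis=1)]  # Methylation Index
labels = np.r_[np.ones(49), np.zeros(22)]

print("AUC:", roc_auc_score(labels, meth_index))
print("fraction above 15% cut-point:", (meth_index > 15).mean())
```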

  11. Quantitative DNA methylation analysis of candidate genes in cervical cancer.

    PubMed

    Siegel, Erin M; Riggs, Bridget M; Delmas, Amber L; Koch, Abby; Hakam, Ardeshir; Brown, Kevin D

    2015-01-01

    Aberrant DNA methylation has been observed in cervical cancer; however, most studies have used non-quantitative approaches to measure DNA methylation. The objective of this study was to quantify methylation within a select panel of genes previously identified as targets for epigenetic silencing in cervical cancer and to identify genes with elevated methylation that can distinguish cancer from normal cervical tissues. We identified 49 women with invasive squamous cell cancer of the cervix and 22 women with normal cytology specimens. Bisulfite-modified genomic DNA was amplified and quantitative pyrosequencing completed for 10 genes (APC, CCNA, CDH1, CDH13, WIF1, TIMP3, DAPK1, RARB, FHIT, and SLIT2). A Methylation Index was calculated as the mean percent methylation across all CpG sites analyzed per gene (~4-9 CpG sites) per sequence. A binary cut-point was defined at >15% methylation. Sensitivity, specificity and area under ROC curve (AUC) of methylation in individual genes or a panel was examined. The median methylation index was significantly higher in cases compared to controls in 8 genes, whereas there was no difference in median methylation for 2 genes. Compared to HPV and age, the combination of DNA methylation level of DAPK1, SLIT2, WIF1 and RARB with HPV and age significantly improved the AUC from 0.79 to 0.99 (95% CI: 0.97-1.00, p-value = 0.003). Pyrosequencing analysis confirmed that several genes are common targets for aberrant methylation in cervical cancer and DNA methylation level of four genes appears to increase specificity to identify cancer compared to HPV detection alone. Alterations in DNA methylation of specific genes in cervical cancers, such as DAPK1, RARB, WIF1, and SLIT2, may also occur early in cervical carcinogenesis and should be evaluated.

  12. Quantitative Analysis Of Acoustic Emission From Rock Fracture Experiments

    NASA Astrophysics Data System (ADS)

    Goodfellow, Sebastian David

    This thesis aims to advance the methods of quantitative acoustic emission (AE) analysis by calibrating sensors, characterizing sources, and applying the results to solve engineering problems. In the first part of this thesis, we built a calibration apparatus and successfully calibrated two commercial AE sensors. The ErgoTech sensor was found to have broadband velocity sensitivity and the Panametrics V103 was sensitive to surface normal displacement. These calibration results were applied to two AE data sets from rock fracture experiments in order to characterize the sources of AE events. The first data set was from an in situ rock fracture experiment conducted at the Underground Research Laboratory (URL). The Mine-By experiment was a large-scale excavation response test where both AE (10 kHz - 1 MHz) and microseismicity (MS) (1 Hz - 10 kHz) were monitored. Using the calibration information, magnitude, stress drop, dimension, and energy were successfully estimated for 21 AE events recorded in the tensile region of the tunnel wall. Magnitudes were in the range -7.5 < Mw < -6.8, which is consistent with other laboratory AE results, and stress drops were within the range commonly observed for induced seismicity in the field (0.1 - 10 MPa). The second data set was AE collected during a true-triaxial deformation experiment, where the objectives were to characterize laboratory AE sources and identify issues related to moving the analysis from ideal in situ conditions to more complex laboratory conditions in terms of the ability to conduct quantitative AE analysis. We found AE magnitudes in the range -7.8 < Mw < -6.7 and, as with the in situ data, stress release was within the expected range of 0.1 - 10 MPa. We identified four major challenges to quantitative analysis in the laboratory, which inhibited our ability to study parameter scaling (M0 ∝ fc^-3 scaling). These challenges were (1) limited knowledge of attenuation, which we proved was continuously evolving, (2
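
    The source parameters quoted above follow from standard relations; a minimal sketch using the Hanks-Kanamori moment magnitude and a Brune-model stress drop, with illustrative laboratory-scale values and an assumed shear-wave velocity:

```python
import math

def moment_magnitude(m0):
    """Mw from seismic moment M0 in N*m (Hanks-Kanamori relation)."""
    return (2.0 / 3.0) * math.log10(m0) - 6.07

def brune_stress_drop(m0, fc, beta=3500.0):
    """Brune-model stress drop (Pa) from moment and corner frequency fc (Hz);
    beta is an assumed shear-wave velocity in m/s, k = 0.37."""
    radius = 0.37 * beta / fc            # source radius (m)
    return 7.0 * m0 / (16.0 * radius**3)

# Illustrative laboratory-scale values (not from the thesis data):
m0 = 0.04     # N*m, corresponding to Mw ~ -7, within the reported range
fc = 500e3    # Hz, within the 10 kHz - 1 MHz AE band
print(moment_magnitude(m0))                     # ~ -7.0
print(brune_stress_drop(m0, fc) / 1e6, "MPa")   # ~ 1 MPa, within 0.1 - 10 MPa
```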

  13. Quantitative analysis of fault slip evolution in analogue transpression models

    NASA Astrophysics Data System (ADS)

    Leever, Karen; Gabrielsen, Roy H.; Schmid, Dani; Braathen, Alvar

    2010-05-01

    A quantitative analysis of fault slip evolution in crustal scale brittle and brittle-ductile analogue models of doubly vergent transpressional wedges was performed by means of Particle Image Velocimetry (PIV). The kinematic analyses allow detailed comparison between model results and field kinematic data. This novel approach leads to better understanding of the evolution of transpressional orogens such as the Tertiary West Spitsbergen fold and thrust belt in particular and will advance the understanding of transpressional wedge mechanics in general. We ran a series of basal-driven models with convergence angles of 4, 7.5, 15 and 30 degrees. In these crustal scale models, brittle rheology was represented by quartz sand; in one model a viscous PDMS layer was included at shallow depth. Total sand pack thickness was 6cm, its extent 120x60cm. The PIV method was used to calculate a vector field from pairs of images that were recorded from the top of the experiments at a 2mm displacement increment. The slip azimuth on discrete faults was calculated and visualized by means of a directional derivative of this vector field. From this data set, several stages in the evolution of the models could be identified. The stages were defined by changes in the degree of displacement partitioning, i.e. slip along-strike and orthogonal to the plate boundary. A first stage of distributed strain (with no visible faults at the model surface) was followed by a shear lens stage with oblique displacement on pro- and retro-shear. The oblique displacement became locally partitioned during progressive displacement. During the final stage, strain was more fully partitioned between a newly formed central strike slip zone and reverse faults at the sides. Strain partitioning was best developed in the 15 degrees model, which shows near-reverse faults along both sides of the wedge in addition to strike slip displacement in the center. In further analysis we extracted average slip vectors for

  14. Quantitative Phosphoproteomics Analysis of ERBB3/ERBB4 Signaling

    PubMed Central

    Jacobs, Kris; Klammer, Martin; Jordan, Nicole; Elschenbroich, Sarah; Parade, Marc; Jacoby, Edgar; Linders, Joannes T. M.; Brehmer, Dirk; Cools, Jan; Daub, Henrik

    2016-01-01

    The four members of the epidermal growth factor receptor (EGFR/ERBB) family form homo- and heterodimers which mediate ligand-specific regulation of many key cellular processes in normal and cancer tissues. While signaling through the EGFR has been extensively studied on the molecular level, signal transduction through ERBB3/ERBB4 heterodimers is less well understood. Here, we generated isogenic mouse Ba/F3 cells that express full-length and functional membrane-integrated ERBB3 and ERBB4 or ERBB4 alone, to serve as a defined cellular model for biological and phosphoproteomics analysis of ERBB3/ERBB4 signaling. ERBB3 co-expression significantly enhanced Ba/F3 cell proliferation upon neuregulin-1 (NRG1) treatment. For comprehensive signaling studies we performed quantitative mass spectrometry (MS) experiments to compare the basal ERBB3/ERBB4 cell phosphoproteome to NRG1 treatment of ERBB3/ERBB4 and ERBB4 cells. We employed a workflow comprising differential isotope labeling with mTRAQ reagents followed by chromatographic peptide separation and final phosphopeptide enrichment prior to MS analysis. Overall, we identified 9686 phosphorylation sites which could be confidently localized to specific residues. Statistical analysis of three replicate experiments revealed 492 phosphorylation sites which were significantly changed in NRG1-treated ERBB3/ERBB4 cells. Bioinformatics data analysis recapitulated regulation of mitogen-activated protein kinase and Akt pathways, but also indicated signaling links to cytoskeletal functions and nuclear biology. Comparative assessment of NRG1-stimulated ERBB4 Ba/F3 cells revealed that ERBB3 did not trigger defined signaling pathways but more broadly enhanced phosphoproteome regulation in cells expressing both receptors. In conclusion, our data provide the first global picture of ERBB3/ERBB4 signaling and provide numerous potential starting points for further mechanistic studies. PMID:26745281

  15. Semiautomatic Software For Quantitative Analysis Of Cardiac Positron Tomography Studies

    NASA Astrophysics Data System (ADS)

    Ratib, Osman; Bidaut, Luc; Nienaber, Christoph; Krivokapich, Janine; Schelbert, Heinrich R.; Phelps, Michael E.

    1988-06-01

    In order to derive accurate values for true tissue radiotracer concentrations from gated positron emission tomography (PET) images of the heart, which are critical for noninvasively quantifying regional myocardial blood flow and metabolism, appropriate corrections for the partial volume effect (PVE) and contamination from adjacent anatomical structures are required. We therefore developed an integrated software package for quantitative analysis of tomographic images which provides such corrections. A semiautomatic edge detection technique outlines and partitions the myocardium into sectors. Myocardial wall thickness is measured on the images perpendicularly to the detected edges and used to correct for PVE. The programs automatically correct for radioactive decay, activity calibration, and cross contamination for both static and dynamic studies. Parameters derived with these programs include tracer concentrations and their changes over time. They are used for calculating regional metabolic rates and can be further displayed as color-coded parametric images. The approach was validated for PET imaging in 11 dog experiments. 2D echocardiograms (Echo) were recorded simultaneously to validate the edge detection and wall thickness measurement techniques. After correction for PVE using automatic wall-thickness measurement, regional tissue tracer concentrations derived from PET images correlated well with true tissue concentrations as determined by well counting (r = 0.98). These preliminary studies indicate that the developed automatic image analysis technique allows accurate and convenient evaluation of cardiac PET images for the measurement of both regional tracer tissue concentrations and regional myocardial function.

  16. Inside Single Cells: Quantitative Analysis with Advanced Optics and Nanomaterials

    PubMed Central

    Cui, Yi; Irudayaraj, Joseph

    2014-01-01

    Single cell explorations offer a unique window to inspect molecules and events relevant to mechanisms and heterogeneity constituting the central dogma of biology. A large number of nucleic acids, proteins, metabolites and small molecules are involved in determining and fine-tuning the state and function of a single cell at a given time point. Advanced optical platforms and nanotools provide tremendous opportunities to probe intracellular components with single-molecule accuracy, as well as promising tools to adjust single cell activity. In order to obtain quantitative information (e.g. molecular quantity, kinetics and stoichiometry) within an intact cell, achieving the observation with comparable spatiotemporal resolution is a challenge. For single cell studies both the method of detection and the biocompatibility are critical factors as they determine the feasibility, especially when considering live cell analysis. Although a considerable proportion of single cell methodologies depend on specialized expertise and expensive instruments, it is our expectation that the information content and implication will outweigh the costs given the impact on life science enabled by single cell analysis. PMID:25430077

  17. [Quantitative Analysis of Mn in Soil Samples Using LIBS].

    PubMed

    Zhang, Bao-hua; Jiang, Yong-cheng; Zhang, Xian-yan; Cui, Zhi-feng

    2015-06-01

    The trace element manganese in soil from an agricultural farm (Anhui Huaiyuan Nongkang) was quantitatively analyzed by laser-induced breakdown spectroscopy. The line at 403.1 nm was selected as the analysis line of Mn. The matrix element Fe in the soil was chosen as the internal calibration element, with 407.2 nm as its analysis line. Ten soil samples were used to construct calibration curves with the traditional method and the internal standard method, and four soil samples were selected as test samples. The experimental results showed that the fitting correlation coefficient (r) is 0.954 when using the traditional method, the maximum relative error of the measured samples is 5.72%, and the detection limit of Mn in soil is 93 mg x kg(-1). When using the internal standard method to construct the calibration curve, the fitting correlation coefficient (r) is 0.983, the relative error of the measured samples is reduced to 4.1%, and the detection limit of Mn in soil is 71 mg x kg(-1). The results indicate that the LIBS technique can be used to detect the trace element Mn in soil and that, to a certain extent, the internal standard method can improve the accuracy of measurement.
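
    A minimal sketch of the internal standard method described: the Mn 403.1 nm line intensity is ratioed to the Fe 407.2 nm internal-standard line, a linear calibration is fitted, and the detection limit is taken as three times the background standard deviation divided by the slope. The calibration data below are illustrative, not the paper's measurements.

```python
import numpy as np

# Illustrative calibration data: Mn concentration (mg/kg) and the measured
# intensity ratio I(Mn 403.1 nm) / I(Fe 407.2 nm) from the LIBS spectra.
conc = np.array([100, 200, 400, 600, 800, 1000, 1200, 1500, 1800, 2000.0])
ratio = 2.1e-4 * conc + 0.010 + np.random.default_rng(3).normal(0, 0.01, 10)

slope, intercept = np.polyfit(conc, ratio, 1)     # linear calibration curve
r = np.corrcoef(conc, ratio)[0, 1]                # fitting correlation coefficient

sigma_blank = 0.005                    # assumed std. dev. of the background ratio
lod = 3 * sigma_blank / slope          # 3-sigma detection limit (mg/kg)
print(f"r = {r:.3f}, LOD = {lod:.0f} mg/kg")
```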

  18. Quantitative analysis of polyethylene blends by Fourier transform infrared spectroscopy.

    PubMed

    Cran, Marlene J; Bigger, Stephen W

    2003-08-01

    The quantitative analysis of binary polyethylene (PE) blends by Fourier transform infrared (FT-IR) spectroscopy has been achieved based on the ratio of two absorbance peaks in an FT-IR spectrum. The frequencies for the absorbance ratio are selected based on structural entities of the PE components in the blend. A linear relationship between the absorbance ratio and the blend composition was found to exist if one of the absorbance peaks is distinct to one of the components and the other peak is common to both components. It was also found that any peak resulting from short-chain branching in copolymers (such as linear low-density polyethylene (LLDPE) or metallocene-catalyzed LLDPE (mLLDPE)), is suitable for use as the peak that is designated as being distinct to that component. In order to optimize the linearity of the equation, however, the selection of the second common peak is the most important and depends on the blend system studied. Indeed, under certain circumstances peaks that are not spectrally distinct can be used successfully to apply the method. The method exhibits potential for the routine analysis of PE blends that have been calibrated prior to its application.

  19. Quantitative evaluation of midpalatal suture maturation via fractal analysis

    PubMed Central

    Kwak, Kyoung Ho; Kim, Yong-Il; Kim, Yong-Deok

    2016-01-01

    Objective The purpose of this study was to determine whether the results of fractal analysis can be used as criteria for evaluating midpalatal suture maturation. Methods The study included 131 subjects over 18 years of age (range 18.1–53.4 years) who underwent cone-beam computed tomography. Skeletonized images of the midpalatal suture were obtained via image processing software and used to calculate fractal dimensions. Correlations between maturation stage and fractal dimension were calculated using Spearman's correlation coefficient. Optimal fractal dimension cut-off values were determined using a receiver operating characteristic curve. Results The distribution of maturation stages of the midpalatal suture according to the cervical vertebrae maturation index was highly variable, and there was a strong negative correlation between maturation stage and fractal dimension (−0.623, p < 0.001). Fractal dimension was a statistically significant indicator of dichotomous results with regard to maturation stage (area under curve = 0.794, p < 0.001). A test in which fractal dimension was used to split maturation stages into A–C versus D or E yielded an optimal fractal dimension cut-off value of 1.0235. Conclusions There was a strong negative correlation between fractal dimension and midpalatal suture maturation. Fractal analysis is an objective quantitative method, and therefore we suggest that it may be useful for the evaluation of midpalatal suture maturation. PMID:27668195
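
    Fractal dimension of a skeletonized suture image is commonly estimated by box counting; the sketch below applies it to a synthetic jagged line (the study's exact algorithm and software may differ):

```python
import numpy as np

def box_count_dimension(skeleton):
    """Box-counting fractal dimension of a binary (skeletonized) image."""
    sizes = [2, 4, 8, 16, 32, 64]
    counts = []
    for s in sizes:
        h = skeleton.shape[0] // s * s
        w = skeleton.shape[1] // s * s
        blocks = skeleton[:h, :w].reshape(h // s, s, w // s, s)
        counts.append(np.count_nonzero(blocks.any(axis=(1, 3))))  # occupied boxes
    slope, _ = np.polyfit(np.log(sizes), np.log(counts), 1)
    return -slope

# Synthetic jagged "suture" line for illustration
img = np.zeros((256, 256), dtype=bool)
cols = np.arange(256)
rows = (128 + np.cumsum(np.random.default_rng(5).integers(-2, 3, 256))).clip(0, 255)
img[rows, cols] = True
print(box_count_dimension(img))   # ~1.0-1.3 for a near-linear jagged curve
```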

  1. Optimal display conditions for quantitative analysis of stereoscopic cerebral angiograms

    SciTech Connect

    Charland, P.; Peters, T.

    1996-10-01

    For several years the authors have been using a stereoscopic display as a tool in the planning of stereotactic neurosurgical techniques. This PC-based workstation allows the surgeon to interact with and view vascular images in three dimensions, as well as to perform quantitative analysis of the three-dimensional (3-D) space. Some of the perceptual issues relevant to the presentation of medical images on this stereoscopic display were addressed in five experiments. The authors show that a number of parameters--namely the shape, color, and depth cue associated with a cursor--as well as the image filtering and observer position, have a role in improving the observer's perception of a 3-D image and his ability to localize points within the stereoscopically presented 3-D image. However, an analysis of the results indicates that while varying these parameters can lead to an effect on the performance of individual observers, the effects are not consistent across observers, and the mean accuracy remains relatively constant under the different experimental conditions.

  2. Comprehensive Quantitative Analysis of Ovarian and Breast Cancer Tumor Peptidomes

    SciTech Connect

    Xu, Zhe; Wu, Chaochao; Xie, Fang; Slysz, Gordon W.; Tolic, Nikola; Monroe, Matthew E.; Petyuk, Vladislav A.; Payne, Samuel H.; Fujimoto, Grant M.; Moore, Ronald J.; Fillmore, Thomas L.; Schepmoes, Athena A.; Levine, Douglas; Townsend, Reid; Davies, Sherri; Li, Shunqiang; Ellis, Matthew; Boja, Emily; Rivers, Robert; Rodriguez, Henry; Rodland, Karin D.; Liu, Tao; Smith, Richard D.

    2015-01-02

    Aberrant degradation of proteins is associated with many pathological states, including cancers. Mass spectrometric analysis of tumor peptidomes, the intracellular and intercellular products of protein degradation, has the potential to provide biological insights on proteolytic processing in cancer. However, attempts to use the information on these smaller protein degradation products from tumors for biomarker discovery and cancer biology studies have been fairly limited to date, largely due to the lack of effective approaches for robust peptidomics identification and quantification, and the prevalence of confounding factors and biases associated with sample handling and processing. Herein, we have developed an effective and robust analytical platform for comprehensive analyses of tissue peptidomes, which is suitable for high throughput quantitative studies. The reproducibility and coverage of the platform, as well as the suitability of clinical ovarian tumor and patient-derived breast tumor xenograft samples with post-excision delay of up to 60 min before freezing for peptidomics analysis, have been demonstrated. Moreover, our data also show that the peptidomics profiles can effectively separate breast cancer subtypes, reflecting tumor-associated protease activities. Peptidomics complements results obtainable from conventional bottom-up proteomics, and provides insights not readily obtainable from such approaches.

  3. Nanotechnology patents in the automotive industry (a quantitative & qualitative analysis).

    PubMed

    Prasad, Raghavendra; Bandyopadhyay, Tapas K

    2014-01-01

    The aim of this article is to present trends in patent filings for applications of nanotechnology in the automobile sector across the world, using a keyword-based patent search. An overview of patents related to nanotechnology in the automobile industry is provided. The work began with a worldwide patent search to find patents on nanotechnology in the automobile industry and to classify them according to the automobile parts to which they relate and the solutions they provide. Various graphs were then produced to give insight into trends, and the patents were analyzed by classification. The trends shown in the graphs provide the quantitative analysis, whereas the qualitative analysis is presented in a separate section. Patents were classified by the solution they provide through reading the claims, titles, abstracts, and full texts. The patentability of nanotechnology inventions is discussed with a view to outlining the requirements and statutory bars to patentability. A further objective of the current work is to suggest an appropriate framework for companies regarding the use of nanotechnology in the automobile industry and a strategy for patenting related inventions. For example, US patent US2008-019426A1 discusses an invention related to a lubricant composition. This patent was studied and classified under automobile parts; it addresses the problem of friction in the engine. One classification is based on the automobile part, the other on the problem being solved; hence two classifications, namely reduction in friction and engine, were created. Similarly, after studying all the patents, a similar matrix was created

  4. The Measles Vaccination Narrative in Twitter: A Quantitative Analysis

    PubMed Central

    Radzikowski, Jacek; Jacobsen, Kathryn H; Croitoru, Arie; Crooks, Andrew; Delamater, Paul L

    2016-01-01

    Background The emergence of social media is providing an alternative avenue for information exchange and opinion formation on health-related issues. Collective discourse in such media leads to the formation of a complex narrative, conveying public views and perceptions. Objective This paper presents a study of the Twitter narrative regarding vaccination in the aftermath of the 2015 measles outbreak, in terms of both its cyber and physical characteristics. We aimed to contribute to the analysis of the data, as well as to present a quantitative interdisciplinary approach to analyzing such open-source data in the context of health narratives. Methods We collected 669,136 tweets referring to vaccination from February 1 to March 9, 2015. These tweets were analyzed to identify key terms, connections among such terms, retweet patterns, the structure of the narrative, and connections to the geographical space. Results The data analysis captures the anatomy of the themes and relations that make up the discussion about vaccination in Twitter. The results highlight the higher impact of stories contributed by news organizations compared to direct tweets by health organizations in communicating health-related information. They also capture the structure of the antivaccination narrative and its terms of reference. Analysis also revealed the relationship between community engagement in Twitter and state policies regarding child vaccination. Residents of Vermont and Oregon, the two states with the highest rates of non-medical exemption from school-entry vaccines nationwide, are leading the social media discussion in terms of participation. Conclusions The interdisciplinary study of health-related debates in social media across the cyber-physical debate nexus leads to a greater understanding of public concerns, views, and responses to health-related issues. Further coalescing such capabilities shows promise towards advancing health communication, thus supporting the design of more

  5. Laser Safety and Hazard Analysis for the Trailer (B70) Based AURA Laser System

    SciTech Connect

    AUGUSTONI, ARNOLD L.

    2003-01-01

    A laser safety and hazard analysis was performed for the AURA laser system based on the 2000 version of the American National Standards Institute's (ANSI) Standard Z136.1, "Safe Use of Lasers," and the 2000 version of ANSI Standard Z136.6, "Safe Use of Lasers Outdoors." The trailer-based AURA laser system is a mobile platform used to perform laser interaction experiments and tests at various national test sites. The trailer (B70) based AURA laser system is generally operated at the United States Air Force Starfire Optical Range (SOR) at Kirtland Air Force Base (KAFB), New Mexico. The laser is used to perform laser interaction testing inside the laser trailer as well as outside the trailer at target sites located at various distances from the exit telescope. In order to protect personnel who work inside the Nominal Hazard Zone (NHZ) from hazardous laser emission exposures, it was necessary to determine the Maximum Permissible Exposure (MPE) for each laser wavelength (wavelength band), calculate the appropriate minimum Optical Density (OD_min) of the laser safety eyewear used by authorized personnel, and determine the Nominal Ocular Hazard Distance (NOHD) to protect unauthorized personnel who may have violated the boundaries of the control area and entered the laser's NHZ.
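
    As a rough illustration of the quantities named above, the sketch below evaluates OD_min and the NOHD for a circular continuous-wave beam using the standard ANSI Z136.1-style formulas; the power, MPE, divergence, and aperture values are invented for illustration and are not the AURA system's parameters.

    ```python
    import math

    def od_min(exposure_w_cm2, mpe_w_cm2):
        """Minimum optical density so that eyewear attenuates the worst-case
        exposure down to the MPE: OD = log10(H0 / MPE)."""
        return math.log10(exposure_w_cm2 / mpe_w_cm2)

    def nohd_cm(power_w, mpe_w_cm2, divergence_rad, aperture_cm):
        """Nominal Ocular Hazard Distance for a circular CW beam,
        NOHD = (sqrt(4*P / (pi*MPE)) - a) / phi  (ANSI Z136.1 form)."""
        return (math.sqrt(4.0 * power_w / (math.pi * mpe_w_cm2)) - aperture_cm) / divergence_rad

    # Illustrative values only -- not the AURA system's actual parameters:
    power = 10.0          # W, average laser power
    mpe = 1.0e-6          # W/cm^2, wavelength/duration-dependent MPE from the standard
    divergence = 1.0e-3   # rad, full-angle beam divergence
    aperture = 1.0        # cm, beam diameter at the exit aperture

    worst_case = power / (math.pi * (aperture / 2.0) ** 2)   # W/cm^2 at the aperture
    print(f"OD_min = {od_min(worst_case, mpe):.1f}")
    print(f"NOHD   = {nohd_cm(power, mpe, divergence, aperture) / 100.0:.0f} m")
    ```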

  6. A sensitivity analysis of hazardous waste disposal site climatic and soil design parameters using HELP3

    SciTech Connect

    Adelman, D.D.; Stansbury, J.

    1997-12-31

    The Resource Conservation and Recovery Act (RCRA) Subtitle C, the Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA), and subsequent amendments have formed a comprehensive framework to deal with hazardous wastes on the national level. Key to this waste management is guidance on the design (e.g., cover and bottom leachate control systems) of hazardous waste landfills. The objective of this research was to investigate the sensitivity of leachate volume at hazardous waste disposal sites to climatic, soil cover, and vegetative cover (Leaf Area Index) conditions. The computer model HELP3, which can simulate the double bottom liner systems called for at hazardous waste disposal sites, was used in the analysis. HELP3 was used to model 54 combinations of climatic conditions, disposal site soil surface curve numbers, and leaf area index values to investigate how sensitive disposal site leachate volume was to these three variables. Results showed that leachate volume from the bottom double liner system was not sensitive to these parameters. However, the cover liner system leachate volume was quite sensitive to climatic conditions and less sensitive to Leaf Area Index and curve number values. Since humid locations had considerably more cover liner system leachate volume than arid locations, different design standards may be appropriate for humid conditions than for arid conditions.

  7. Seismic Hazard Analysis of Aizawl, India with a Focus on Water System Fragilities

    NASA Astrophysics Data System (ADS)

    Belair, G. M.; Tran, A. J.; Dreger, D. S.; Rodgers, J. E.

    2015-12-01

    GeoHazards International (GHI) has partnered with the University of California, Berkeley in a joint Civil Engineering and Earth Science summer internship program to investigate geologic hazards. This year the focus was on Aizawl, the capital of India's Mizoram state, situated on a ridge in the Burma Ranges. Nearby sources have the potential for large (M > 7) earthquakes that would be devastating to the approximately 300,000 people living in the city. Earthquake induced landslides also threaten the population as well as the city's lifelines. Fieldwork conducted in June 2015 identified hazards to vital water system components. The focus of this abstract is a review of the seismic hazards that affect Aizawl, with special attention paid to water system locations. To motivate action to reduce risk, GHI created an earthquake scenario describing effects of a M7 right-lateral strike-slip intraplate earthquake occurring 30 km below the city. We extended this analysis by exploring additional mapped faults as well as hypothetical blind reverse faults in terms of PGA, PGV, and PSA. Ground motions with hanging wall and directivity effects were also examined. Several attenuation relationships were used in order to assess the uncertainty in the ground motion parameters. Results were used to determine the likely seismic performance of water system components, and will be applied in future PSHA studies.
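
    As an illustration of the ground-motion step, the sketch below evaluates a generic attenuation relationship of the common functional form ln(PGA) = a + b·M − c·ln(R); the coefficients and sigma are invented placeholders, not those of the relationships actually used in the study.

    ```python
    import numpy as np

    def generic_gmpe(mag, dist_km, a=-3.5, b=0.9, c=1.2, h=10.0, sigma=0.6):
        """Median PGA (g) from a generic attenuation relationship of the common
        form ln(PGA) = a + b*M - c*ln(sqrt(R^2 + h^2)). Coefficients are
        illustrative placeholders, not those of any published GMPE."""
        r = np.sqrt(dist_km ** 2 + h ** 2)
        ln_pga = a + b * mag - c * np.log(r)
        # Return median and +/-1-sigma values to express aleatory uncertainty
        return np.exp(ln_pga), np.exp(ln_pga - sigma), np.exp(ln_pga + sigma)

    # M7 event at 30 km hypocentral distance, as in the scenario described above
    median, p16, p84 = generic_gmpe(mag=7.0, dist_km=30.0, h=0.0)
    print(f"PGA ~ {median:.2f} g (16th-84th percentile: {p16:.2f}-{p84:.2f} g)")
    ```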

  8. Patient-specific meta-analysis for risk assessment using multivariate proportional hazards regression

    PubMed Central

    Crager, Michael R.; Tang, Gong

    2015-01-01

    We propose a method for assessing an individual patient’s risk of a future clinical event using clinical trial or cohort data and Cox proportional hazards regression, combining the information from several studies using meta-analysis techniques. The method combines patient-specific estimates of the log cumulative hazard across studies, weighting by the relative precision of the estimates, using either fixed- or random-effects meta-analysis calculations. Risk assessment can be done for any future patient using a few key summary statistics determined once and for all from each study. Generalizations of the method to logistic regression and linear models are immediate. We evaluate the methods using simulation studies and illustrate their application using real data. PMID:26664111
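
    The combining step is a standard inverse-variance (fixed-effect) calculation; a minimal sketch with hypothetical per-study estimates follows. The conversion from combined log cumulative hazard to event risk assumes the usual survival relation S(t) = exp(-H(t)).

    ```python
    import numpy as np

    def fixed_effect_combine(estimates, std_errors):
        """Inverse-variance weighted (fixed-effect) combination of per-study
        patient-specific log cumulative hazard estimates."""
        estimates = np.asarray(estimates, dtype=float)
        weights = 1.0 / np.asarray(std_errors, dtype=float) ** 2
        combined = np.sum(weights * estimates) / np.sum(weights)
        combined_se = np.sqrt(1.0 / np.sum(weights))
        return combined, combined_se

    # Hypothetical log cumulative hazard estimates from three studies for one patient
    log_H, se = fixed_effect_combine([-1.10, -0.85, -1.30], [0.20, 0.25, 0.30])
    risk = 1.0 - np.exp(-np.exp(log_H))   # event probability via S(t) = exp(-H(t))
    print(f"combined log cumulative hazard {log_H:.2f} +/- {se:.2f}; risk {risk:.1%}")
    ```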

  9. Hospital food hygiene: the application of Hazard Analysis Critical Control Points to conventional hospital catering.

    PubMed

    Richards, J; Parr, E; Riseborough, P

    1993-08-01

    The Hazard Analysis Critical Control Point (HACCP) concept is a preventive approach to quality control. It is based on a logical, structured exploration of potential hazard points in a food operation and the introduction of control and monitoring measures. HACCP studies have been extensively applied to manufacturing systems. It has been difficult to extend this detailed systematic analysis to conventional catering, partly due to the wide range of foods being processed. We describe here the application of HACCP methods to a hospital department that uses conventional catering methods. Basic principles, based on the work flow and the range of products are established. The guidelines produced can be applied to any kitchen using similar catering methods. Examples of log charts used for monitoring are provided.

  10. Laser safety and hazard analysis for the temperature stabilized BSLT ARES laser system.

    SciTech Connect

    Augustoni, Arnold L.

    2003-08-01

    A laser safety and hazard analysis was performed for the temperature stabilized Big Sky Laser Technology (BSLT) laser central to the ARES system, based on the 2000 version of the American National Standards Institute's (ANSI) Standard Z136.1, Safe Use of Lasers, and the 2000 version of ANSI Standard Z136.6, Safe Use of Lasers Outdoors. As a result of temperature stabilization of the BSLT laser, the operating parameters of the laser changed, requiring a hazard analysis based on the new operating conditions. The ARES laser system is a van/truck-based mobile platform used to perform laser interaction experiments and tests at various national test sites.

  11. Teaching Quantitative Literacy through a Regression Analysis of Exam Performance

    ERIC Educational Resources Information Center

    Lindner, Andrew M.

    2012-01-01

    Quantitative literacy is increasingly essential for both informed citizenship and a variety of careers. Though regression is one of the most common methods in quantitative sociology, it is rarely taught until late in students' college careers. In this article, the author describes a classroom-based activity introducing students to regression…

  12. Quantitative Analysis of Cellular Metabolic Dissipative, Self-Organized Structures

    PubMed Central

    de la Fuente, Ildefonso Martínez

    2010-01-01

    One of the most important goals of the postgenomic era is understanding the metabolic dynamic processes and the functional structures generated by them. Extensive studies during the last three decades have shown that the dissipative self-organization of the functional enzymatic associations, the catalytic reactions produced during the metabolite channeling, the microcompartmentalization of these metabolic processes and the emergence of dissipative networks are the fundamental elements of the dynamical organization of cell metabolism. Here we present an overview of how mathematical models can be used to address the properties of dissipative metabolic structures at different organizational levels, both for individual enzymatic associations and for enzymatic networks. Recent analyses performed with dissipative metabolic networks have shown that unicellular organisms display a singular global enzymatic structure common to all living cellular organisms, which seems to be an intrinsic property of the functional metabolism as a whole. Mathematical models firmly based on experiments and their corresponding computational approaches are needed to fully grasp the molecular mechanisms of metabolic dynamical processes. They are necessary to enable the quantitative and qualitative analysis of the cellular catalytic reactions and also to help comprehend the conditions under which the structural dynamical phenomena and biological rhythms arise. Understanding the molecular mechanisms responsible for the metabolic dissipative structures is crucial for unraveling the dynamics of cellular life. PMID:20957111

  13. Comparison of multivariate calibration methods for quantitative spectral analysis

    SciTech Connect

    Thomas, E.V.; Haaland, D.M. )

    1990-05-15

    The quantitative prediction abilities of four multivariate calibration methods for spectral analyses are compared by using extensive Monte Carlo simulations. The calibration methods compared include inverse least-squares (ILS), classical least-squares (CLS), partial least-squares (PLS), and principal component regression (PCR) methods. ILS is a frequency-limited method while the latter three are capable of full-spectrum calibration. The simulations were performed assuming Beer's law holds and that spectral measurement errors and concentration errors associated with the reference method are normally distributed. Eight different factors that could affect the relative performance of the calibration methods were varied in a two-level, eight-factor experimental design in order to evaluate their effect on the prediction abilities of the four methods. It is found that each of the three full-spectrum methods has its range of superior performance. The frequency-limited ILS method was never the best method, although in the presence of relatively large concentration errors it sometimes yields comparable analysis precision to the full-spectrum methods for the major spectral component. The importance of each factor in the absolute and relative performances of the four methods is compared.
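
    A toy version of such a comparison can be run with scikit-learn; the sketch below simulates two-component Beer's-law mixtures with Gaussian noise and compares PLS against PCR (ILS and CLS are omitted for brevity). All data and settings are arbitrary stand-ins for the paper's Monte Carlo design.

    ```python
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.decomposition import PCA
    from sklearn.linear_model import LinearRegression
    from sklearn.pipeline import make_pipeline

    rng = np.random.default_rng(0)

    # Simulate Beer's-law mixtures: spectra = concentrations @ pure spectra + noise
    n_samples, n_channels = 60, 200
    pure = np.abs(rng.normal(size=(2, n_channels)))        # two pure-component spectra
    conc = rng.uniform(0, 1, size=(n_samples, 2))          # reference concentrations
    spectra = conc @ pure + rng.normal(scale=0.01, size=(n_samples, n_channels))

    train, test = slice(0, 40), slice(40, None)
    y = conc[:, 0]                                         # predict component 1

    pls = PLSRegression(n_components=2).fit(spectra[train], y[train])
    pcr = make_pipeline(PCA(n_components=2), LinearRegression()).fit(spectra[train], y[train])

    for name, model in [("PLS", pls), ("PCR", pcr)]:
        pred = np.ravel(model.predict(spectra[test]))
        rmse = np.sqrt(np.mean((pred - y[test]) ** 2))
        print(f"{name} prediction RMSE: {rmse:.4f}")
    ```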

  14. Quantitative Financial Analysis of Alternative Energy Efficiency Shareholder Incentive Mechanisms

    SciTech Connect

    Cappers, Peter; Goldman, Charles; Chait, Michele; Edgar, George; Schlegel, Jeff; Shirley, Wayne

    2008-08-03

    Rising energy prices and climate change are central issues in the debate about our nation's energy policy. Many are demanding increased energy efficiency as a way to help reduce greenhouse gas emissions and lower the total cost of electricity and energy services for consumers and businesses. Yet, as the National Action Plan on Energy Efficiency (NAPEE) pointed out, many utilities continue to shy away from seriously expanding their energy efficiency program offerings because they claim there is insufficient profit motivation, or even a financial disincentive, when compared to supply-side investments. With the recent introduction of Duke Energy's Save-a-Watt incentive mechanism and ongoing discussions about decoupling, regulators and policymakers are now faced with an expanded and diverse landscape of financial incentive mechanisms. Determining the 'right' way forward to promote deep and sustainable demand-side resource programs is challenging. Due to the renaissance that energy efficiency is currently experiencing, many want to better understand the tradeoffs in stakeholder benefits between these alternative incentive structures before aggressively embarking on a path for which course corrections can be time-consuming and costly. Using a prototypical Southwest utility and a publicly available financial model, we show how various stakeholders (e.g., shareholders, ratepayers, etc.) are affected by these different types of shareholder incentive mechanisms under varying assumptions about program portfolios. This quantitative analysis compares the financial consequences associated with a wide range of alternative incentive structures. The results will help regulators and policymakers better understand the financial implications of DSR program incentive regulation.

  15. Quantitative image analysis of HIV-1 infection in lymphoid tissue

    SciTech Connect

    Haase, A.T.; Zupancic, M.; Cavert, W.

    1996-11-08

    Tracking human immunodeficiency virus-type 1 (HIV-1) infection at the cellular level in tissue reservoirs provides opportunities to better understand the pathogenesis of infection and to rationally design and monitor therapy. A quantitative technique was developed to determine viral burden in two important cellular compartments in lymphoid tissues. Image analysis and in situ hybridization were combined to show that in the presymptomatic stages of infection there is a large, relatively stable pool of virions on the surfaces of follicular dendritic cells and a smaller pool of productively infected cells. Despite evidence of constraints on HIV-1 replication in the infected cell population in lymphoid tissues, estimates of the numbers of these cells and the virus they could produce are consistent with the quantities of virus that have been detected in the bloodstream. The cellular sources of virus production and storage in lymphoid tissues can now be studied with this approach over the course of infection and treatment. 22 refs., 2 figs., 2 tabs.

  16. Quantitative analysis of plasma interleukin-6 by immunoassay on microchip

    NASA Astrophysics Data System (ADS)

    Abe, K.; Hashimoto, Y.; Yatsushiro, S.; Yamamura, S.; Tanaka, M.; Ooie, T.; Baba, Y.; Kataoka, M.

    2012-03-01

    Sandwich enzyme-linked immunosorbent assay (ELISA) is one of the most frequently employed assays for clinical diagnosis, since it enables the investigator to identify specific protein biomarkers. However, the conventional assay using a 96-well microtitration plate is time- and sample-consuming, and therefore is not suitable for rapid diagnosis. To overcome these drawbacks, we performed a sandwich ELISA on a microchip. We employed piezoelectric inkjet printing for deposition and fixation of the first antibody on the microchannel surface (300 μm width and 100 μm depth). The model analyte was interleukin-6 (IL-6), an inflammatory cytokine. After blocking the microchannel, the antigen, biotin-labeled second antibody, and avidin-labeled peroxidase were infused into the microchannel and incubated for 20 min, 10 min, and 5 min, respectively. This assay could detect 2 pg/ml and quantitatively measure the range of 0-32 pg/ml. Linear regression analysis of plasma IL-6 concentrations obtained by the microchip and conventional methods exhibited a significant relationship (R2 = 0.9964). This assay reduced the time for the antigen-antibody reaction to 1/6, and the consumption of samples and reagents to 1/50, compared with the conventional method. The assay enables plasma IL-6 to be determined accurately, with high sensitivity, in less time, and with low consumption of sample and reagents, and thus will be applicable to clinical diagnosis.
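
    As a sketch of the quantification step, the snippet below fits a linear standard curve over the assay's reported 0-32 pg/ml range and inverse-predicts an unknown; the signal values are hypothetical, not data from the paper.

    ```python
    import numpy as np

    # Hypothetical IL-6 standard curve: known concentrations (pg/ml) vs signal
    std_conc = np.array([0, 2, 4, 8, 16, 32], dtype=float)
    std_signal = np.array([0.02, 0.11, 0.20, 0.41, 0.79, 1.58])  # e.g. absorbance

    # Within the assay's linear range a first-order fit suffices
    slope, intercept = np.polyfit(std_conc, std_signal, 1)

    def signal_to_conc(signal):
        """Inverse prediction: map a measured signal back to concentration."""
        return (signal - intercept) / slope

    unknown = 0.55
    print(f"estimated IL-6: {signal_to_conc(unknown):.1f} pg/ml")
    ```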

  17. Quantitative produced water analysis using mobile 1H NMR

    NASA Astrophysics Data System (ADS)

    Wagner, Lisabeth; Kalli, Chris; Fridjonsson, Einar O.; May, Eric F.; Stanwix, Paul L.; Graham, Brendan F.; Carroll, Matthew R. J.; Johns, Michael L.

    2016-10-01

    Measurement of oil contamination of produced water is required in the oil and gas industry down to the parts-per-million (ppm) level prior to discharge, in order to meet typical environmental legislative requirements. Here we present the use of compact, mobile 1H nuclear magnetic resonance (NMR) spectroscopy, in combination with solid phase extraction (SPE), to meet this metrology need. The NMR hardware employed featured a sufficiently homogeneous magnetic field that chemical shift differences could be used to unambiguously differentiate, and hence quantitatively detect, the required oil and solvent NMR signals. A solvent system consisting of 1% v/v chloroform in tetrachloroethylene was deployed; this provided a comparable 1H NMR signal intensity for the oil and the solvent (chloroform), and hence an internal reference 1H signal from the chloroform, making the measurement effectively self-calibrating. The measurement process was applied to water contaminated with hexane or crude oil over the range 1-30 ppm. The results were validated against known solubility limits as well as infrared analysis and gas chromatography.

  18. Quantitative analysis of biomedical samples using synchrotron radiation microbeams

    NASA Astrophysics Data System (ADS)

    Ektessabi, Ali; Shikine, Shunsuke; Yoshida, Sohei

    2001-07-01

    X-ray fluorescence (XRF) using a synchrotron radiation (SR) microbeam was applied to investigate distributions and concentrations of elements in single neurons of patients with neurodegenerative diseases. In this paper we introduce a computer code that has been developed to quantify the trace elements and matrix elements at the single cell level. This computer code has been used in studies of several important neurodegenerative diseases such as Alzheimer's disease (AD), Parkinson's disease (PD) and parkinsonism-dementia complex (PDC), as well as in basic biological experiments to determine the elemental changes in cells due to incorporation of foreign metal elements. Substantia nigra (SN) tissue obtained from autopsy specimens of patients with Guamanian parkinsonism-dementia complex (PDC) and control cases was examined. Quantitative XRF analysis showed that neuromelanin granules of parkinsonian SN contained higher levels of Fe than those of the control, with concentrations in the ranges of 2300-3100 ppm and 2000-2400 ppm, respectively. In contrast, Zn and Ni in neuromelanin granules of SN tissue from the PDC case were lower than those of the control; in particular, Zn was less than 40 ppm in SN tissue from the PDC case, compared with 560-810 ppm in the control. These changes are considered to be closely related to the neurodegeneration and cell death.

  19. Hyperspectral imaging and quantitative analysis for prostate cancer detection

    PubMed Central

    Akbari, Hamed; Halig, Luma V.; Schuster, David M.; Osunkoya, Adeboye; Master, Viraj; Nieh, Peter T.; Chen, Georgia Z.

    2012-01-01

    Abstract. Hyperspectral imaging (HSI) is an emerging modality for various medical applications. Its spectroscopic data might be able to be used to noninvasively detect cancer. Quantitative analysis is often necessary in order to differentiate healthy from diseased tissue. We propose the use of an advanced image processing and classification method in order to analyze hyperspectral image data for prostate cancer detection. The spectral signatures were extracted and evaluated in both cancerous and normal tissue. Least squares support vector machines were developed and evaluated for classifying hyperspectral data in order to enhance the detection of cancer tissue. This method was used to detect prostate cancer in tumor-bearing mice and on pathology slides. Spatially resolved images were created to highlight the differences of the reflectance properties of cancer versus those of normal tissue. Preliminary results with 11 mice showed that the sensitivity and specificity of the hyperspectral image classification method are 92.8% ± 2.0% and 96.9% ± 1.3%, respectively. Therefore, this imaging method may be able to help physicians to dissect malignant regions with a safe margin and to evaluate the tumor bed after resection. This pilot study may lead to advances in the optical diagnosis of prostate cancer using HSI technology. PMID:22894488

  20. Quantitative SERS sensors for environmental analysis of naphthalene.

    PubMed

    Péron, O; Rinnert, E; Toury, T; Lamy de la Chapelle, M; Compère, C

    2011-03-01

    In the investigation of chemical pollutants, such as polycyclic aromatic hydrocarbons (PAHs), at low concentration in aqueous media, Surface-Enhanced Raman Scattering (SERS) offers an alternative to the inherently low cross-section of normal Raman scattering. SERS is a very sensitive spectroscopic technique owing to the excitation of surface plasmon modes of a nanostructured metallic film. The surface of quartz substrates was coated with a hydrophobic film obtained by silanization and subsequently reacted with polystyrene (PS) beads coated with gold nanoparticles. The hydrophobic surface of the SERS substrates pre-concentrates non-polar molecules such as naphthalene. Under laser excitation, the SERS-active substrates allow the detection and identification of the target molecules localized close to the gold nanoparticles. The morphology of the SERS substrates based on polystyrene beads surrounded by gold nanoparticles was characterized by scanning electron microscopy (SEM). Furthermore, the Raman fingerprint of the polystyrene serves as an internal spectral reference. On this basis, an innovative method to detect and quantify organic molecules, such as naphthalene in the range of 1 to 20 ppm, in aqueous media was carried out. Such SERS-active substrates show promise for application as quantitative SERS sensors for the environmental analysis of naphthalene. PMID:21165476

  1. Active contour approach for accurate quantitative airway analysis

    NASA Astrophysics Data System (ADS)

    Odry, Benjamin L.; Kiraly, Atilla P.; Slabaugh, Greg G.; Novak, Carol L.; Naidich, David P.; Lerallut, Jean-Francois

    2008-03-01

    Chronic airway disease causes structural changes in the lungs, including peribronchial thickening and airway dilatation. Multi-detector computed tomography (CT) yields detailed near-isotropic images of the lungs, and thus the potential to obtain quantitative measurements of lumen diameter and airway wall thickness. Such measurements would allow standardized assessment and enable physicians to diagnose and locate airway abnormalities, adapt treatment, and monitor progress over time. However, due to the sheer number of airways per patient, systematic analysis is infeasible in routine clinical practice without automation. We have developed an automated and real-time method based on active contours to estimate both airway lumen and wall dimensions; the method does not require manual contour initialization, only a starting point on the targeted airway. While the lumen contour segmentation is purely region-based, the estimation of the outer diameter considers the inner wall segmentation as well as local intensity variation, in order to anticipate the presence of nearby arteries and exclude them. These properties make the method more robust than the Full-Width Half-Maximum (FWHM) approach. Results are demonstrated on a phantom dataset with known dimensions and on a human dataset where the automated measurements are compared against two human operators. The average error on the phantom measurements was 0.10 mm and 0.14 mm for inner and outer diameters, showing sub-voxel accuracy. Similarly, the mean variation from the average manual measurement was 0.14 mm and 0.18 mm for inner and outer diameters, respectively.
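
    The paper's region-based formulation is not public, but scikit-image's generic active_contour conveys the idea: initialize a closed snake near a seed point and let it settle on the lumen boundary. Everything below (synthetic image, seed, snake parameters) is an illustrative assumption rather than the authors' method.

    ```python
    import numpy as np
    from skimage.filters import gaussian
    from skimage.segmentation import active_contour

    def segment_lumen(ct_slice, seed_rc, radius=8.0, n_points=100):
        """Evolve a circular snake around a seed point toward the lumen wall.
        Generic illustration only -- not the paper's region-based method."""
        theta = np.linspace(0, 2 * np.pi, n_points)
        init = np.column_stack([seed_rc[0] + radius * np.sin(theta),
                                seed_rc[1] + radius * np.cos(theta)])
        smoothed = gaussian(ct_slice, sigma=1.0, preserve_range=True)
        return active_contour(smoothed, init, alpha=0.015, beta=10.0, gamma=0.001)

    # Synthetic example: dark circular "lumen" inside a brighter wall
    img = np.full((64, 64), 200.0)
    rr, cc = np.ogrid[:64, :64]
    img[(rr - 32) ** 2 + (cc - 32) ** 2 < 36] = 20.0   # lumen of radius 6 px

    snake = segment_lumen(img, seed_rc=(32, 32), radius=12.0)
    diam = snake[:, 0].max() - snake[:, 0].min()
    print(f"estimated lumen diameter: {diam:.1f} px")
    ```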

  2. Quantitative analysis of flagellar proteins in Drosophila sperm tails.

    PubMed

    Mendes Maia, Teresa; Paul-Gilloteaux, Perrine; Basto, Renata

    2015-01-01

    The cilium has a well-defined structure, which can still accommodate some diversity in morphology and molecular composition to suit the functional requirements of different cell types. The sperm flagellum of the fruit fly Drosophila melanogaster is a good model for studying the genetic regulation of axoneme assembly and motility, due to the wealth of genetic tools publicly available for this organism. In addition, the fruit fly's sperm flagellum displays quite a long axoneme (∼1.8 mm), which may facilitate both histological and biochemical analyses. Here, we present a protocol for imaging and quantitative analysis of proteins that associate with differentiating and mature fly sperm flagella. As an example, we use the quantification of tubulin polyglycylation in wild-type testes and in Bug22 mutant testes, which present defects in the deposition of this posttranslational modification. During sperm biogenesis, flagella appear tightly bundled, which makes it challenging to obtain accurate measurements of protein levels from immunostained specimens. The method we present is based on a novel semiautomated macro installed in the image processing software ImageJ. It allows fluorescence levels in closely associated sperm tails to be measured through an exact distinction between positive and background signals, and provides background-corrected pixel intensity values that can be used directly for data analysis. PMID:25837396

  3. Copulation patterns in captive hamadryas baboons: a quantitative analysis.

    PubMed

    Nitsch, Florian; Stueckle, Sabine; Stahl, Daniel; Zinner, Dietmar

    2011-10-01

    For primates, as for many other vertebrates, copulation which results in ejaculation is a prerequisite for reproduction. The probability of ejaculation is affected by various physiological and social factors, for example reproductive state of male and female and operational sex-ratio. In this paper, we present quantitative and qualitative data on patterns of sexual behaviour in a captive group of hamadryas baboons (Papio hamadryas), a species with a polygynous-monandric mating system. We observed more than 700 copulations and analysed factors that can affect the probability of ejaculation. Multilevel logistic regression analysis and Akaike's information criterion (AIC) model selection procedures revealed that the probability of successful copulation increased as the size of female sexual swellings increased, indicating increased probability of ovulation, and as the number of females per one-male unit (OMU) decreased. In contrast, occurrence of female copulation calls, sex of the copulation initiator, and previous male aggression toward females did not affect the probability of ejaculation. Synchrony of oestrus cycles also had no effect (most likely because the sample size was too small). We also observed 29 extra-group copulations by two non-adult males. Our results indicate that male hamadryas baboons copulated more successfully around the time of ovulation and that males in large OMUs with many females may be confronted by time or energy-allocation problems.

  4. Quantitative image analysis of cell colocalization in murine bone marrow.

    PubMed

    Mokhtari, Zeinab; Mech, Franziska; Zehentmeier, Sandra; Hauser, Anja E; Figge, Marc Thilo

    2015-06-01

    Long-term antibody production is a key property of humoral immunity and is accomplished by long-lived plasma cells. They mainly reside in the bone marrow, whose importance as an organ hosting immunological memory is becoming increasingly evident. Signals provided by stromal cells and eosinophils may play an important role for plasma cell maintenance, constituting a survival microenvironment. In this joint study of experiment and theory, we investigated the spatial colocalization of plasma cells, eosinophils and B cells by applying an image-based systems biology approach. To this end, we generated confocal fluorescence microscopy images of histological sections from murine bone marrow that were subsequently analyzed in an automated fashion. This quantitative analysis was combined with computer simulations of the experimental system for hypothesis testing. In particular, we tested the observed spatial colocalization of cells in the bone marrow against the hypothesis that cells are found within available areas at positions that were drawn from a uniform random number distribution. We find that B cells and plasma cells highly colocalize with stromal cells, to an extent larger than in the simulated random situation. While B cells are preferentially in contact with each other, i.e., form clusters among themselves, plasma cells seem to be solitary or organized in aggregates, i.e., loosely defined groups of cells that are not necessarily in direct contact. Our data suggest that the plasma cell bone marrow survival niche facilitates colocalization of plasma cells with stromal cells and eosinophils, respectively, promoting plasma cell longevity.
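
    A minimal sketch of the hypothesis test described, assuming hypothetical 2D cell coordinates: the observed mean nearest-neighbor distance from plasma cells to stromal cells is compared against repeated uniform-random placements (the paper's null model also restricts placement to available areas, which is omitted here).

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def mean_nn_distance(cells, targets):
        """Mean distance from each cell to its nearest target cell."""
        d = np.linalg.norm(cells[:, None, :] - targets[None, :, :], axis=-1)
        return d.min(axis=1).mean()

    # Hypothetical 2D positions (um) as if extracted from a segmented section
    stroma = rng.uniform(0, 500, size=(80, 2))
    plasma = stroma[:30] + rng.normal(scale=15.0, size=(30, 2))  # near stroma

    observed = mean_nn_distance(plasma, stroma)

    # Null model: same number of plasma cells placed uniformly at random
    null = np.array([mean_nn_distance(rng.uniform(0, 500, size=(30, 2)), stroma)
                     for _ in range(2000)])
    p_value = (np.sum(null <= observed) + 1) / (null.size + 1)
    print(f"observed {observed:.1f} um vs null {null.mean():.1f} um; p = {p_value:.4f}")
    ```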

  5. Quantitative genetic analysis of flowering time in tomato.

    PubMed

    Jiménez-Gómez, José M; Alonso-Blanco, Carlos; Borja, Alicia; Anastasio, Germán; Angosto, Trinidad; Lozano, Rafael; Martínez-Zapater, José M

    2007-03-01

    Artificial selection of cultivated tomato (Solanum lycopersicum L.) has resulted in the generation of early-flowering, day-length-insensitive cultivars, despite its close relationship to other Solanum species that need more time and specific photoperiods to flower. To investigate the genetic mechanisms controlling flowering time in tomato and related species, we performed a quantitative trait locus (QTL) analysis for flowering time in an F2 mapping population derived from S. lycopersicum and its late-flowering wild relative S. chmielewskii. Flowering time was scored as the number of days from sowing to the opening of the first flower (days to flowering), and as the number of leaves under the first inflorescence (leaf number). QTL analyses detected 2 QTLs affecting days to flowering, which explained 55.3% of the total phenotypic variance, and 6 QTLs for leaf number, accounting for 66.7% of the corresponding phenotypic variance. Four of the leaf number QTLs had not previously been detected for this trait in tomato. Colocation of some QTLs with flowering-time genes included in the genetic map suggests PHYB2, FALSIFLORA, and a tomato FLC-like sequence as candidate genes that might have been targets of selection during the domestication of tomato.

  6. Early child grammars: qualitative and quantitative analysis of morphosyntactic production.

    PubMed

    Legendre, Géraldine

    2006-09-10

    This article reports on a series of 5 analyses of spontaneous production of verbal inflection (tense and person-number agreement) by 2-year-olds acquiring French as a native language. A formal analysis of the qualitative and quantitative results is developed using the unique resources of Optimality Theory (OT; Prince & Smolensky, 2004). It is argued that acquisition of morphosyntax proceeds via overlapping grammars (rather than through abrupt changes), which OT formalizes in terms of partial rather than total constraint rankings. Initially, economy of structure constraints take priority over faithfulness constraints that demand faithful expression of a speaker's intent, resulting in child production of tense that is comparable in level to that of child-directed speech. Using the independent Predominant Length of Utterance measure of syntactic development proposed in Vainikka, Legendre, and Todorova (1999), production of agreement is shown first to lag behind tense then to compete with tense at an intermediate stage of development. As the child's development progresses, faithfulness constraints become more dominant, and the overall production of tense and agreement becomes adult-like.

  7. Limits of normality of quantitative thoracic CT analysis

    PubMed Central

    2013-01-01

    Introduction Although computed tomography (CT) is widely used to investigate different pathologies, quantitative data from normal populations are scarce. Reference values may be useful to estimate the anatomical or physiological changes induced by various diseases. Methods We analyzed 100 helical CT scans taken for clinical purposes and reported as nonpathological by the radiologist. Profiles were manually outlined on each CT scan slice and each voxel was classified according to its gas/tissue ratio. For regional analysis, the lungs were divided into 10 sterno-vertebral levels. Results We studied 53 males and 47 females (age 64 ± 13 years); males had a greater total lung volume, lung gas volume and lung tissue. Noninflated tissue averaged 7 ± 4% of the total lung weight, poorly inflated tissue averaged 18 ± 3%, normally inflated tissue averaged 65 ± 8% and overinflated tissue averaged 11 ± 7%. We found a significant correlation between lung weight and subject's height (P <0.0001, r2 = 0.49); the total lung capacity in a supine position was 4,066 ± 1,190 ml, ~1,800 ml less than the predicted total lung capacity in a sitting position. Superimposed pressure averaged 2.6 ± 0.5 cmH2O. Conclusion Subjects without lung disease present significant amounts of poorly inflated and overinflated tissue. Normal lung weight can be predicted from patient's height with reasonable confidence. PMID:23706034
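
    A sketch of the voxel-classification step, assuming the conventional Hounsfield-unit bins for noninflated, poorly inflated, normally inflated, and overinflated tissue and the usual linear gas/tissue mixing rule; the thresholds are standard choices assumed here, not values quoted from the paper.

    ```python
    import numpy as np

    def classify_lung_voxels(hu):
        """Fraction of lung voxels per inflation class, using conventional HU bins."""
        bins = {
            "noninflated":       (hu > -100),
            "poorly inflated":   (hu <= -100) & (hu > -500),
            "normally inflated": (hu <= -500) & (hu > -900),
            "overinflated":      (hu <= -900),
        }
        n = hu.size
        return {k: v.sum() / n for k, v in bins.items()}

    def gas_fraction(hu):
        """Per-voxel gas fraction assuming -1000 HU = pure gas, 0 HU = pure tissue."""
        return np.clip(-hu / 1000.0, 0.0, 1.0)

    hu = np.random.default_rng(2).uniform(-1000, 100, size=100000)  # stand-in voxels
    print(classify_lung_voxels(hu))
    print(f"mean gas fraction: {gas_fraction(hu).mean():.2f}")
    ```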

  8. Analysis of the selected optical parameters of filters protecting against hazardous infrared radiation

    PubMed Central

    Gralewicz, Grzegorz; Owczarek, Grzegorz

    2016-01-01

    The paper analyses selected optical parameters of protective optical filters used to protect the eyes against hazardous radiation within the visible (VIS) and near-infrared (NIR) spectral range. The indices characterizing transmission and reflection of optical radiation incident on the filter are compared. The analysis shows that the newly developed interference filters block infrared radiation more effectively than the protective filters currently in use. PMID:26327153

  9. Analysis of the selected optical parameters of filters protecting against hazardous infrared radiation.

    PubMed

    Gralewicz, Grzegorz; Owczarek, Grzegorz

    2016-09-01

    The paper analyses selected optical parameters of protective optical filters used to protect the eyes against hazardous radiation within the visible (VIS) and near-infrared (NIR) spectral range. The indices characterizing transmission and reflection of optical radiation incident on the filter are compared. The analysis shows that the newly developed interference filters block infrared radiation more effectively than the protective filters currently in use.

  10. Fire hazards analysis of the Radioactive Waste Management Complex Air Support Buildings

    SciTech Connect

    Davis, M.L.; Satterwhite, D.G.

    1989-09-01

    This report describes the methods, analyses, results, and conclusions of a fire hazards risk analysis performed for the RWMC Air Support Buildings. An evaluation of the impact of adding a sprinkler system is also presented. Event and fault trees were used to model and analyze the waste storage process. Tables are presented indicating the fire initiators providing the highest potential for release of radioactive materials into the environment. Engineering insights drawn from the data are also provided.

  11. Communication about vaccinations in Italian websites: a quantitative analysis.

    PubMed

    Tafuri, Silvio; Gallone, Maria S; Gallone, Maria F; Zorico, Ivan; Aiello, Valeria; Germinario, Cinzia

    2014-01-01

    Babies' parents and people looking for information about vaccination often visit anti-vaccine movement websites, or blogs by naturopathic physicians or natural and alternative medicine practitioners. The aim of this work is to provide a quantitative analysis of the type of information available to Italian people regarding vaccination and a quality analysis of the websites retrieved through our searches. A quality score was created to evaluate the technical level of the websites. A search was performed through Yahoo, Google, and MSN using the keywords "vaccine" and "vaccination," combined with the function "OR," in order to identify the most frequently used websites. The two keywords were input in Italian, and the first 15 pages retrieved by each search engine were analyzed. In total, 149 websites were selected through this methodology. Fifty-three percent of the websites belonged to associations, groups, or scientific companies; 32.2% (n = 48) consisted of personal blogs; and 14.8% (n = 22) belonged to National Health System offices. Among all analyzed websites, 15.4% (n = 23) came from anti-vaccine movement groups. Of the websites, 37.6% reported the webmaster's name, 67.8% the webmaster's e-mail, 28.6% the date of the last update, and 46.6% the author's name. The quality score for government sites was higher on average than for anti-vaccine websites, although government sites do not use Web 2.0 functions such as forums. National Health System institutions that promote vaccination cannot avoid investing in web communication, which cannot be managed by private efforts alone but must be the result of synergy among Public Health bodies, private and scientific associations, and social movements.

  12. Quantitative Analysis of Human Cancer Cell Extravasation Using Intravital Imaging.

    PubMed

    Willetts, Lian; Bond, David; Stoletov, Konstantin; Lewis, John D

    2016-01-01

    Metastasis, or the spread of cancer cells from a primary tumor to distant sites, is the leading cause of cancer-associated death. Metastasis is a complex multi-step process comprising invasion, intravasation, survival in circulation, extravasation, and formation of metastatic colonies. Currently, in vitro assays are limited in their ability to investigate these intricate processes and do not faithfully reflect metastasis as it occurs in vivo. Traditional in vivo models of metastasis are limited in their ability to visualize the seemingly sporadic behavior of where and when cancer cells spread (Reymond et al., Nat Rev Cancer 13:858-870, 2013). The avian embryo model of metastasis is a powerful platform to study many of the critical steps in the metastatic cascade including the migration, extravasation, and invasion of human cancer cells in vivo (Sung et al., Nat Commun 6:7164, 2015; Leong et al., Cell Rep 8, 1558-1570, 2014; Kain et al., Dev Dyn 243:216-28, 2014; Leong et al., Nat Protoc 5:1406-17, 2010; Zijlstra et al., Cancer Cell 13:221-234, 2008; Palmer et al., J Vis Exp 51:2815, 2011). The chicken chorioallantoic membrane (CAM) is a readily accessible and well-vascularized tissue that surrounds the developing embryo. When the chicken embryo is grown in a shell-less, ex ovo environment, the nearly transparent CAM provides an ideal environment for high-resolution fluorescence microscopy approaches. In this model, the embryonic chicken vasculature and labeled cancer cells can be visualized simultaneously to investigate specific steps in the metastatic cascade including extravasation. When combined with the proper image analysis tools, the ex ovo chicken embryo model offers a cost-effective and high-throughput platform for the quantitative analysis of tumor cell metastasis in a physiologically relevant in vivo setting. Here we discuss detailed procedures to quantify cancer cell extravasation in the shell-less chicken embryo model with advanced fluorescence

  13. Quantitative petrographic analysis of Cretaceous sandstones from southwest Montana

    SciTech Connect

    Dyman, T.S.; Krystinik, K.B.; Takahashi, K.I.

    1986-05-01

    The Albian Blackleaf Formation and the Cenomanian lower Frontier Formation in southwest Montana lie within or east of the fold and thrust belt in the Cretaceous foreland basin complex. The petrography of these strata records a complex interaction between source-area tectonism, basin subsidence, and sedimentation patterns associated with a cyclic sequence of transgressions and regressions. Because the petrographic data set was large (127 thin sections) and difficult to interpret subjectively, statistical techniques were used to establish sample and variable relationships. Theta-mode cluster and correspondence analysis were used to determine the contributing effect (total variance) of key framework grains. Monocrystalline quartz, plagioclase, potassium feldspar, and sandstone-, limestone-, and volcanic-lithic grain content contribute most to the variation in the framework-grain population. Theta-mode cluster and correspondence analysis were also used to identify six petrofacies. Lower Blackleaf petrofacies (I-III) contain abundant monocrystalline quartz (55-90%) and sedimentary lithic grains (10-50%), which are distributed throughout the study area. Petrofacies I-III are differentiated by variable monocrystalline quartz and sedimentary lithic grain content. Upper Blackleaf and lower Frontier petrofacies (IV-VI) exhibit highly variable sedimentary and volcanic lithic ratios, and contain less monocrystalline quartz (20-50%) than the lower Blackleaf petrofacies. Information from the quantitative analyses, combined with available paleocurrent data, indicates that Blackleaf and lower Frontier detritus was derived from variable source areas through time. Lower Blackleaf detritus was derived from Precambrian through Paleozoic sedimentary terranes to the west, north, and east, whereas upper Blackleaf and lower Frontier detritus was derived from both sedimentary and volcanic terranes to the south.

  14. Annotated bibliography, seismicity of and near the island of Hawaii and seismic hazard analysis of the East Rift of Kilauea

    SciTech Connect

    Klein, F.W.

    1994-03-28

    This bibliography is divided into the following four sections: Seismicity of Hawaii and Kilauea Volcano; Occurrence, locations and accelerations from large historical Hawaiian earthquakes; Seismic hazards of Hawaii; and Methods of seismic hazard analysis. It contains 62 references, most of which are accompanied by short abstracts.

  15. WHC-SD-W252-FHA-001, Rev. 0: Preliminary fire hazard analysis for Phase II Liquid Effluent Treatment and Disposal Facility, Project W-252

    SciTech Connect

    Barilo, N.F.

    1995-05-11

    A Fire Hazards Analysis was performed to assess the risk from fire and other related perils and the capability of the facility to withstand these hazards. This analysis will be used to support design of the facility.

  16. Hazard screening and proposals for prevention by occupational health service: an experiment with job load and hazard analysis at a Finnish construction company.

    PubMed

    Mattila, M; Kivi, P

    1991-01-01

    In this study a systematic method for workplace investigation was developed and then tested as part of the regular occupational health care procedures in the building trade. Workplace investigation is a concept which entails the analysis of hazards inherent in the work as well as assessment of their effects on workers' well-being. The aim of this paper is to evaluate the effectiveness of the workplace investigation method. The newly developed method, called Job Load and Hazard Analysis, has the following characteristics: a job analytic approach; the application of group problem-solving; and cooperation between occupational health professionals, occupational safety personnel, and line management. The method comprises the identification of health hazards, their assessment, and conclusions and proposals as to their prevention and follow-up. The method was tested as part of one construction company's actual occupational health care programme, over a 2.5-year period. The method worked well as a central component of preventive occupational health care. It yielded concrete data that could be applied to make the occupational health care programme better suited to preventing the hazards inherent in the building trade. The contents of the occupational health care programme were clearly enhanced, the number of preventive measures increased, and the organizational climate improved; the workers praised the increased emphasis on safety. More research is needed, e.g., in other production settings and to determine the most effective utilization of the data gathered by the method. PMID:2011001

  17. Supplemental Analysis to Support Postulated Events in Process Hazards Analysis for the HEAF

    SciTech Connect

    Lambert, H; Johnson, G

    2001-07-20

    The purpose of this report is to conduct a limited-scope risk assessment by generating event trees for the accident scenarios described in table 4-2 of the HEAF SAR, ref 1. Table 4-2 lists the postulated event/scenario descriptions for non-industrial hazards for HEAF. The event tree analysis decomposes accident scenarios into basic causes that appear as branches on the event tree; bold downward branches indicate paths leading to the accident. The basic causes include conditions, failures of administrative controls (procedural or human error events), or failures of engineered controls (hardware, software, or equipment failures) that singly or in combination can cause an accident to occur. Event tree analysis is useful since it can display the minimum number of events needed to cause an accident. Event trees can address statistical dependency of events, such as a sequence of human error events committed by the same operator; in this case, dependent probabilities are used. Probabilities/frequencies are assigned to each branch. Another example of dependency is when the same software is used to conduct separate actions, such as activating a hard and a soft crowbar for grounding detonator circuits. Generally, the first event considered in the event tree describes the annual frequency at which a specific operation is conducted, and probabilities are assigned to the remaining branches. An exception may be when the first event represents a condition; then a probability is used to indicate the percentage of time the condition exists. The annual probability (frequency) of the end state leading to the accident scenario in the event tree is obtained by multiplying the branch probabilities together.
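
    The last sentence describes a simple calculation that can be made concrete. The sketch below multiplies an initiating-event frequency by illustrative branch probabilities to obtain an accident-sequence frequency; all values are invented and are not taken from the HEAF SAR.

    ```python
    from functools import reduce

    # One accident sequence: initiating-event frequency times branch probabilities.
    # Values are illustrative, not taken from the HEAF SAR.
    initiator_per_year = 50.0          # operations conducted per year
    branches = {
        "procedure step omitted":    1e-2,   # human error (administrative control)
        "interlock fails on demand": 1e-3,   # engineered control failure
        "operator fails to recover": 1e-1,   # could be made dependent on the first error
    }

    sequence_frequency = initiator_per_year * reduce(
        lambda acc, p: acc * p, branches.values(), 1.0)
    print(f"accident sequence frequency: {sequence_frequency:.2e} per year")
    ```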

  18. Study on the Application of Probabilistic Tsunami Hazard Analysis for the Nuclear Power Plant Site in Korean Peninsula

    NASA Astrophysics Data System (ADS)

    Rhee, H. M.; Kim, M.; Sheen, D. H.; Choi, I. K.

    2014-12-01

    The need for tsunami hazard assessment at Nuclear Power Plant (NPP) sites has been recognized since the Fukushima event of 2011, and it is especially emphasized because all of the NPPs on the Korean Peninsula are located in coastal regions. The tsunami hazard is expressed as the annual exceedance probability of wave heights. The methodology for tsunami hazard analysis is based on seismic hazard analysis, which has been performed by both deterministic and probabilistic methods. Recently, the probabilistic method has received more attention than the deterministic one because the uncertainties of the hazard analysis can be considered through a logic tree approach. In this study, a probabilistic tsunami hazard analysis for the Uljin NPP site was performed using the fault source information published by the Atomic Energy Society of Japan (AESJ). The wave parameter is the parameter that most distinguishes tsunami hazard from seismic hazard; it is estimated from the results of tsunami propagation analysis. TSUNAMI_ver1.0, developed by the Japan Nuclear Energy Safety Organization (JNES), was used for the tsunami simulations. Eighty tsunami simulation cases were performed, and the wave parameters were then estimated. To reduce the sensitivity introduced by the location of individual sampling points, the wave parameters were estimated from groups of sampling points. The probability density function of tsunami height was computed using the recurrence intervals and the wave parameters, and the exceedance probability distribution was then calculated from the probability density function. The tsunami hazards for the sampling groups were calculated, and fractile curves showing the uncertainties of the input parameters were estimated from the hazards using a round-robin algorithm. In general, tsunami hazard analysis focuses on the maximum wave heights, but the minimum wave height should also be considered.
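
    A minimal sketch of the exceedance-probability step, assuming a handful of hypothetical scenario sources with annual recurrence rates and simulated site wave heights (stand-ins for the 80 simulation cases), and assuming Poisson occurrence:

    ```python
    import numpy as np

    # Hypothetical fault sources: annual recurrence rate and simulated maximum
    # wave height (m) at the site for each scenario (illustrative values only).
    rates   = np.array([1/200, 1/500, 1/1000, 1/3000])
    heights = np.array([1.2,   2.8,   4.5,    7.9])

    def annual_exceedance(h, rates, heights):
        """Annual probability of exceeding height h, summing scenario rates
        and converting rate to probability under a Poisson occurrence model."""
        rate = rates[heights > h].sum()
        return 1.0 - np.exp(-rate)

    for h in (1.0, 3.0, 5.0):
        print(f"P(wave height > {h} m in one year) = "
              f"{annual_exceedance(h, rates, heights):.2e}")
    ```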

  19. Assessing the validity of prospective hazard analysis methods: a comparison of two techniques

    PubMed Central

    2014-01-01

    Background Prospective Hazard Analysis techniques such as Healthcare Failure Modes and Effects Analysis (HFMEA) and Structured What If Technique (SWIFT) have the potential to increase safety by identifying risks before an adverse event occurs. Published accounts of their application in healthcare have identified benefits, but the reliability of some methods has been found to be low. The aim of this study was to examine the validity of SWIFT and HFMEA by comparing their outputs in the process of risk assessment, and comparing the results with risks identified by retrospective methods. Methods The setting was a community-based anticoagulation clinic, in which risk assessment activities had been previously performed and were available. A SWIFT and an HFMEA workshop were conducted consecutively on the same day by experienced experts. Participants were a mixture of pharmacists, administrative staff and software developers. Both methods produced lists of risks scored according to the method’s procedure. Participants’ views about the value of the workshops were elicited with a questionnaire. Results SWIFT identified 61 risks and HFMEA identified 72 risks. For both methods less than half the hazards were identified by the other method. There was also little overlap between the results of the workshops and risks identified by prior root cause analysis, staff interviews or clinical governance board discussions. Participants’ feedback indicated that the workshops were viewed as useful. Conclusions Although there was limited overlap, both methods raised important hazards. Scoping the problem area had a considerable influence on the outputs. The opportunity for teams to discuss their work from a risk perspective is valuable, but these methods cannot be relied upon in isolation to provide a comprehensive description. Multiple methods for identifying hazards should be used and data from different sources should be integrated to give a comprehensive view of risk in a system

  20. Quantitative analysis of LISA pathfinder test-mass noise

    NASA Astrophysics Data System (ADS)

    Ferraioli, Luigi; Congedo, Giuseppe; Hueller, Mauro; Vitale, Stefano; Hewitson, Martin; Nofrarias, Miquel; Armano, Michele

    2011-12-01

    LISA Pathfinder (LPF) is a mission aiming to test the critical technology for the forthcoming space-based gravitational-wave detectors. The main scientific objective of the LPF mission is to demonstrate test masses free falling with residual accelerations below 3×10^-14 m s^-2 Hz^-1/2 at 1 mHz. Reaching such an ambitious target will require a significant amount of system optimization and characterization, which will in turn require accurate and quantitative noise analysis procedures. In this paper, we discuss two main problems associated with the analysis of the data from LPF: i) excess noise detection and ii) noise parameter identification. The mission is focused on the low-frequency region ([0.1, 10] mHz) of the available signal spectrum. In such a region, the signal is dominated by the force noise acting on test masses. At the same time, the mission duration is limited to 90 days and typical data segments will be 24 hours in length. Considering those constraints, noise analysis is expected to deal with a limited amount of non-Gaussian data, since the spectrum statistics will be far from Gaussian and the lowest available frequency is limited by the data length. In this paper, we analyze the details of the expected statistics for spectral data and develop two suitable excess noise estimators. One is based on the statistical properties of the integrated spectrum, the other is based on the Kolmogorov-Smirnov test. The sensitivity of the estimators is discussed theoretically for independent data, then the algorithms are tested on LPF synthetic data. The test on realistic LPF data allows the effect of spectral data correlations on the efficiency of the different noise excess estimators to be highlighted. It also reveals the versatility of the Kolmogorov-Smirnov approach, which can be adapted to provide reasonable results on correlated data from a modified version of the standard equations for the inversion of the test statistic. Closely related to excess noise detection, the
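
    Of the two excess-noise estimators discussed, the Kolmogorov-Smirnov approach is easy to sketch. For white Gaussian noise, periodogram bins (excluding DC and Nyquist) are exponentially distributed, so a KS test against an exponential law can flag excess noise; the snippet below is an illustration on synthetic data, and normalizing by the sample mean makes the test only approximate.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(3)

    # Periodogram bins of white Gaussian noise follow an exponential
    # distribution (chi^2 with 2 dof, scaled by the PSD level).
    n = 4096
    x = rng.normal(size=n)
    pxx = np.abs(np.fft.rfft(x)[1:-1]) ** 2 / n   # drop DC and Nyquist bins

    # Normalize by the mean so the null is a unit-scale exponential law
    stat, p = stats.kstest(pxx / pxx.mean(), "expon")
    print(f"KS statistic {stat:.3f}, p = {p:.3f}  (low p would flag excess noise)")
    ```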

  1. Column precipitation chromatography: an approach to quantitative analysis of eigencolloids.

    PubMed

    Breynaert, E; Maes, A

    2005-08-01

    A new column precipitation chromatography (CPC) technique, capable of quantitatively measuring technetium eigencolloids in aqueous solutions, is presented. The CPC technique is based on the destabilization and precipitation of eigencolloids by polycations in a confined matrix. Tc(IV) colloids can be quantitatively determined from their precipitation onto the CPC column (separation step) and their subsequent elution upon oxidation to pertechnetate by peroxide (elution step). A clean-bed particle removal model was used to explain the experimental results. PMID:16053321

  2. Hazard analysis of EUCLIDIAN: an image-guided robotic brachytherapy system.

    PubMed

    Hu, Yida; Podder, Tarun; Buzurovic, Ivan; Yan, Kaiguo; Ng, Wan Sing; Yu, Yan

    2007-01-01

    Robotic assistance can help clinicians improve the flexibility of needle insertion and the accuracy of seed deposition. However, the robotic platform is a safety-critical system because of its automated operational mode. Thus, it is important to perform Hazard Identification & Safety Insurance Control (HISIC) to secure the safety of a medical robotic system. In this paper, we have performed HISIC for our robotic platform, called Endo-Uro Computer Lattice for Intratumoral Delivery, Implementation, and Ablation with Nanosensing (EUCLIDIAN). The definition and requirements of the system are described with the Unified Modeling Language (UML). Failure Mode and Effect Analysis (FMEA) is executed following the principles of HISIC, namely hazard identification, safety insurance control, safety critical limits, and monitoring and control. FMEA combined with UML can also be implemented to ensure the reliability of human operation. On the basis of a safety control index and fuzzy mathematics, a safety effectiveness value is outlined to assess the validity of safety insurance control for the robotic system. The above principles and methods proved feasible and effective for hazard analysis during the development of the robotic system.
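
    The FMEA step lends itself to a small worked example. The sketch below ranks hypothetical failure modes by the conventional Risk Priority Number (severity × occurrence × detection); the failure modes and scores are invented for illustration and are not from the paper, which additionally folds in UML modeling and fuzzy safety indices.

    ```python
    from dataclasses import dataclass

    @dataclass
    class FailureMode:
        name: str
        severity: int     # 1-10
        occurrence: int   # 1-10
        detection: int    # 1-10 (10 = hardest to detect)

        @property
        def rpn(self) -> int:
            """Risk Priority Number = severity * occurrence * detection."""
            return self.severity * self.occurrence * self.detection

    # Hypothetical failure modes for a needle-insertion robot (illustrative only)
    modes = [
        FailureMode("needle deviates from planned track", 8, 4, 5),
        FailureMode("unexpected motion in automated mode", 9, 2, 3),
        FailureMode("seed deposited at wrong depth", 7, 3, 6),
    ]

    for m in sorted(modes, key=lambda m: m.rpn, reverse=True):
        print(f"RPN {m.rpn:4d}  {m.name}")
    ```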

  3. Hydrocarbons on Phoebe, Iapetus, and Hyperion: Quantitative Analysis

    NASA Technical Reports Server (NTRS)

    Cruikshank, Dale P.; Dalle Ore, Cristina M.; Pendleton, Yvonne J.; Clark, Roger Nelson

    2012-01-01

    We present a quantitative analysis of the hydrocarbon spectral bands measured on three of Saturn's satellites, Phoebe, Iapetus, and Hyperion. These bands, measured with the Cassini Visible-Infrared Mapping Spectrometer on close fly-bys of these satellites, are the C-H stretching modes of aromatic hydrocarbons at approximately 3.28 micrometers (approximately 3050 per centimeter), and the four blended bands of aliphatic -CH2- and -CH3 in the range approximately 3.36-3.52 micrometers (approximately 2980-2840 per centimeter). In these data, the aromatic band, probably indicating the presence of polycyclic aromatic hydrocarbons (PAH), is unusually strong in comparison to the aliphatic bands, resulting in a unique signature among Solar System bodies measured so far, and as such offers a means of comparison among the three satellites. The ratio of the C-H bands in aromatic molecules to those in aliphatic molecules in the surface materials of Phoebe is NAro:NAliph approximately 24; for Hyperion the value is approximately 12, while Iapetus shows an intermediate value. In view of the trend of the evolution (dehydrogenation by heat and radiation) of aliphatic complexes toward more compact molecules and eventually to aromatics, the relative abundance of aliphatic -CH2- and -CH3- is an indication of the lengths of the molecular chain structures, hence the degree of modification of the original material. We derive CH2:CH3 approximately 2.2 in the spectrum of low-albedo material on Iapetus; this value is the same within measurement errors as the ratio in the diffuse interstellar medium. The similarity in the spectral signatures of the three satellites, plus the apparent weak trend of aromatic/aliphatic abundance from Phoebe to Hyperion, is consistent with, and effectively confirms, that the source of the hydrocarbon-bearing material is Phoebe, and that the appearance of that material on the other two satellites arises from the deposition of the inward-spiraling dust that populates the Phoebe ring.

  4. Descriptive Quantitative Analysis of Rearfoot Alignment Radiographic Parameters.

    PubMed

    Meyr, Andrew J; Wagoner, Matthew R

    2015-01-01

    Although the radiographic parameters of the transverse talocalcaneal angle (tTCA), calcaneocuboid angle (CCA), talar head uncovering (THU), calcaneal inclination angle (CIA), talar declination angle (TDA), lateral talar-first metatarsal angle (lTFA), and lateral talocalcaneal angle (lTCA) form the basis of the preoperative evaluation and procedure selection for pes planovalgus deformity, the so-called normal values of these measurements are not well established. The objectives of the present study were, first, to retrospectively evaluate the descriptive statistics of these radiographic parameters (tTCA, CCA, THU, CIA, TDA, lTFA, and lTCA) in a large population and, second, to determine an objective basis for defining "normal" versus "abnormal" measurements. As a secondary outcome, the relationship of these variables to the body mass index was assessed. Anteroposterior and lateral foot radiographs from 250 consecutive patients without a history of previous foot and ankle surgery and/or trauma were evaluated. The results revealed mean measurements of 24.12°, 13.20°, 74.32%, 16.41°, 26.64°, 8.37°, and 43.41° for the tTCA, CCA, THU, CIA, TDA, lTFA, and lTCA, respectively. These were generally in line with the reported historical normal values. Descriptive statistical analysis demonstrated that the tTCA, THU, and TDA met the standards to be considered normally distributed, but the CCA, CIA, lTFA, and lTCA demonstrated data characteristics of both parametric and nonparametric distributions. Furthermore, only the CIA (R = -0.2428) and lTCA (R = -0.2449) demonstrated substantial correlation with the body mass index. No differentiations in deformity progression that would lead to a quantitative basis for defining "normal" versus "abnormal" measurements were observed when the radiographic parameters were plotted against each other. PMID:26002682
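
    A brief sketch of the two statistical steps the study describes, normality screening of a radiographic parameter and its correlation with body mass index, run here on synthetic data (all values invented):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n = 250
bmi = rng.normal(29.0, 6.0, n)                            # synthetic BMI values
cia = 20.0 - 0.2 * (bmi - 29.0) + rng.normal(0, 4.0, n)   # synthetic CIA (degrees)

# Normality screening (parametric vs nonparametric distribution check).
w_stat, p_norm = stats.shapiro(cia)
print(f"Shapiro-Wilk: W = {w_stat:.3f}, p = {p_norm:.3f}")

# Correlation of the radiographic parameter with BMI.
r, p_corr = stats.pearsonr(bmi, cia)
print(f"Pearson r = {r:.3f} (p = {p_corr:.3f})")
```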

  5. Hydrocarbons on Phoebe, Iapetus, and Hyperion: Quantitative Analysis

    NASA Astrophysics Data System (ADS)

    Cruikshank, D. P.; Dalle Ore, C. M.; Pendleton, Y. J.; Clark, R. N.

    2012-12-01

    We present a quantitative analysis of the hydrocarbon spectral bands measured on three of Saturn's satellites, Phoebe, Iapetus, and Hyperion. These bands, measured with the Cassini Visible-Infrared Mapping Spectrometer on close fly-bys of these satellites, are the C-H stretching modes of aromatic hydrocarbons at ~3.28 μm (~3050 cm-1), and the four blended bands of aliphatic -CH2- and -CH3 in the range ~3.36-3.52 μm (~2980-2840 cm-1). In these data, the aromatic band, probably indicating the presence of polycyclic aromatic hydrocarbons (PAH), is unusually strong in comparison to the aliphatic bands, resulting in a unique signature among Solar System bodies measured so far, and as such offers a means of comparison among the three satellites. The ratio of the C-H bands in aromatic molecules to those in aliphatic molecules in the surface materials of Phoebe is NAro:NAliph ~24; for Hyperion the value is ~12, while Iapetus shows an intermediate value. In view of the trend of the evolution (dehydrogenation by heat and radiation) of aliphatic complexes toward more compact molecules and eventually to aromatics, the relative abundance of aliphatic -CH2- and -CH3- is an indication of the lengths of the molecular chain structures, hence the degree of modification of the original material. We derive CH2:CH3 ~2.2 in the spectrum of low-albedo material on Iapetus; this value is the same within measurement errors as the ratio in the diffuse interstellar medium. The similarity in the spectral signatures of the three satellites, plus the apparent weak trend of aromatic/aliphatic abundance from Phoebe to Hyperion, is consistent with, and effectively confirms, that the source of the hydrocarbon-bearing material is Phoebe, and that the appearance of that material on the other two satellites arises from the deposition of the inward-spiraling dust that populates the Phoebe ring.
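
    The aromatic/aliphatic comparison above reduces to integrating continuum-removed band areas and scaling them by intrinsic band strengths. The sketch below shows that arithmetic on a synthetic spectrum; the Gaussian band shapes and the band-strength constants are placeholders, not VIMS values.

```python
import numpy as np
from scipy.integrate import trapezoid

def band_area(wavelength_um, absorbance, lo_um, hi_um):
    """Integrate continuum-removed absorbance over [lo_um, hi_um]."""
    m = (wavelength_um >= lo_um) & (wavelength_um <= hi_um)
    return trapezoid(absorbance[m], wavelength_um[m])

# Synthetic stand-in spectrum: one aromatic band plus an aliphatic blend.
wl = np.linspace(3.1, 3.7, 600)
spec = (0.020 * np.exp(-0.5 * ((wl - 3.28) / 0.015) ** 2)     # aromatic C-H
        + 0.010 * np.exp(-0.5 * ((wl - 3.42) / 0.030) ** 2))  # aliphatic C-H

aromatic_area = band_area(wl, spec, 3.25, 3.31)
aliphatic_area = band_area(wl, spec, 3.36, 3.52)

# Placeholder intrinsic band strengths; real abundance ratios need lab values.
A_AROMATIC, A_ALIPHATIC = 1.0, 2.0
n_ratio = (aromatic_area / A_AROMATIC) / (aliphatic_area / A_ALIPHATIC)
print(f"N_aromatic : N_aliphatic ~ {n_ratio:.1f}")
```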

  6. Quantitative analysis of harmonic convergence in mosquito auditory interactions.

    PubMed

    Aldersley, Andrew; Champneys, Alan; Homer, Martin; Robert, Daniel

    2016-04-01

    This article analyses the hearing and behaviour of mosquitoes in the context of inter-individual acoustic interactions. The acoustic interactions of tethered live pairs of Aedes aegypti mosquitoes, from same and opposite sex mosquitoes of the species, are recorded on independent and unique audio channels, together with the response of tethered individual mosquitoes to playbacks of pre-recorded flight tones of lone or paired individuals. A time-dependent representation of each mosquito's non-stationary wing beat frequency signature is constructed, based on Hilbert spectral analysis. A range of algorithmic tools is developed to automatically analyse these data, and used to perform a robust quantitative identification of the 'harmonic convergence' phenomenon. The results suggest that harmonic convergence is an active phenomenon, which does not occur by chance. It occurs for live pairs, as well as for lone individuals responding to playback recordings, whether from the same or opposite sex. Male-female behaviour is dominated by frequency convergence at a wider range of harmonic combinations than previously reported, and requires participation from both partners in the duet. New evidence is found to show that male-male interactions are more varied than strict frequency avoidance. Rather, they can be divided into two groups: convergent pairs, typified by tightly bound wing beat frequencies, and divergent pairs, that remain widely spaced in the frequency domain. Overall, the results reveal that mosquito acoustic interaction is a delicate and intricate time-dependent active process that involves both individuals, takes place at many different frequencies, and which merits further enquiry. PMID:27053654
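
    The time-dependent wing-beat frequency signature mentioned above comes from Hilbert spectral analysis: take the analytic signal, unwrap its phase, and differentiate to get instantaneous frequency. A minimal sketch on a synthetic tone (sampling rate and sweep are assumptions):

```python
import numpy as np
from scipy.signal import hilbert

fs = 10_000.0                          # audio sampling rate in Hz (assumed)
t = np.arange(0, 2.0, 1.0 / fs)
f_true = 450.0 + 25.0 * t              # wing-beat tone sweeping 450 -> 500 Hz
tone = np.sin(2 * np.pi * np.cumsum(f_true) / fs)

# Analytic signal -> instantaneous phase -> instantaneous frequency.
analytic = hilbert(tone)
phase = np.unwrap(np.angle(analytic))
inst_freq = np.diff(phase) * fs / (2.0 * np.pi)

print(f"mean instantaneous frequency: {inst_freq.mean():.1f} Hz")
```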

  7. Quantitative Analysis of the Effective Functional Structure in Yeast Glycolysis

    PubMed Central

    De la Fuente, Ildefonso M.; Cortes, Jesus M.

    2012-01-01

    The understanding of the effective functionality that governs the enzymatic self-organized processes in cellular conditions is a crucial topic in the post-genomic era. In recent studies, Transfer Entropy has been proposed as a rigorous, robust and self-consistent method for the causal quantification of the functional information flow among nonlinear processes. Here, in order to quantify the functional connectivity for the glycolytic enzymes in dissipative conditions we have analyzed different catalytic patterns using the technique of Transfer Entropy. The data were obtained by means of a yeast glycolytic model formed by three delay differential equations where the enzymatic rate equations of the irreversible stages have been explicitly considered. These enzymatic activity functions were previously modeled and tested experimentally by other different groups. The results show the emergence of a new kind of dynamical functional structure, characterized by changing connectivity flows and a metabolic invariant that constrains the activity of the irreversible enzymes. In addition to the classical topological structure characterized by the specific location of enzymes, substrates, products and feedback-regulatory metabolites, an effective functional structure emerges in the modeled glycolytic system, which is dynamical and characterized by notable variations of the functional interactions. The dynamical structure also exhibits a metabolic invariant which constrains the functional attributes of the enzymes. Finally, in accordance with the classical biochemical studies, our numerical analysis reveals in a quantitative manner that the enzyme phosphofructokinase is the key-core of the metabolic system, behaving for all conditions as the main source of the effective causal flows in yeast glycolysis. PMID:22393350
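
    For orientation, here is a bare-bones plug-in estimator of transfer entropy, TE(X -> Y) = sum p(y+, y, x) log[ p(y+ | y, x) / p(y+ | y) ], on discretized series; the equal-width binning, lag-1 embedding, and coupled test signals are simplifying assumptions relative to the delay-differential model analyzed in the paper.

```python
import numpy as np
from collections import Counter

def transfer_entropy(x, y, bins=8):
    """Plug-in transfer entropy TE(X -> Y) in bits, lag 1, equal-width bins."""
    xd = np.digitize(x, np.linspace(x.min(), x.max(), bins + 1)[1:-1])
    yd = np.digitize(y, np.linspace(y.min(), y.max(), bins + 1)[1:-1])
    triples = list(zip(yd[1:], yd[:-1], xd[:-1]))   # (y_next, y_now, x_now)

    n = len(triples)
    p3 = Counter(triples)                                 # counts (y+, y, x)
    p_yx = Counter((yc, xc) for _, yc, xc in triples)     # counts (y, x)
    p_yy = Counter((yn, yc) for yn, yc, _ in triples)     # counts (y+, y)
    p_y = Counter(yc for _, yc, _ in triples)             # counts (y)

    te = 0.0
    for (yn, yc, xc), c in p3.items():
        ratio = (c / p_yx[(yc, xc)]) / (p_yy[(yn, yc)] / p_y[yc])
        te += (c / n) * np.log2(ratio)
    return te

rng = np.random.default_rng(2)
x = rng.normal(size=5000)
y = np.roll(x, 1) + 0.5 * rng.normal(size=5000)   # y driven by the past of x
print(f"TE(x -> y) = {transfer_entropy(x, y):.3f} bits")   # clearly positive
print(f"TE(y -> x) = {transfer_entropy(y, x):.3f} bits")   # near zero
```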

  8. Quantitative analysis of mycoflora on commercial domestic fruits in Japan.

    PubMed

    Watanabe, Maiko; Tsutsumi, Fumiyuki; Konuma, Rumi; Lee, Ken-Ichi; Kawarada, Kensuke; Sugita-Konishi, Yoshiko; Kumagai, Susumu; Takatori, Kosuke; Konuma, Hirotaka; Hara-Kudo, Yukiko

    2011-09-01

    A comprehensive and quantitative analysis of the mycoflora on the surface of commercial fruit was performed. Nine kinds of fruits grown in Japan were tested. Overall fungal counts on the fruits ranged from 3.1 to 6.5 log CFU/g. The mean percentages of the total yeast counts were higher than those of molds in samples of apples, Japanese pears, and strawberries, ranging from 58.5 to 67.0%, and were lower than those of molds in samples of the other six fruits, ranging from 9.8 to 48.3%. Cladosporium was the most frequent fungus and was found in samples of all nine types of fruits, followed by Penicillium found in eight types of fruits. The fungi with the highest total counts in samples of the various fruits were Acremonium in cantaloupe melons (47.6% of the total fungal count), Aspergillus in grapes (32.2%), Aureobasidium in apples (21.3%), blueberries (63.6%), and peaches (33.6%), Cladosporium in strawberries (38.4%), Cryptococcus in Japanese pears (37.6%), Penicillium in mandarins (22.3%), and Sporobolomyces in lemons (26.9%). These results demonstrated that the mycoflora on the surfaces of these fruits mainly consists of common pre- and postharvest inhabitants of the plants or in the environment; fungi that produce mycotoxins or cause market diseases were not prominent in the mycoflora of healthy fruits. These findings suggest fruits should be handled carefully with consideration given to fungal contaminants, including nonpathogenic fungi, to control the quality of fruits and processed fruit products. PMID:21902918

  9. Quantitative analysis of harmonic convergence in mosquito auditory interactions

    PubMed Central

    Aldersley, Andrew; Champneys, Alan; Robert, Daniel

    2016-01-01

    This article analyses the hearing and behaviour of mosquitoes in the context of inter-individual acoustic interactions. The acoustic interactions of tethered live pairs of Aedes aegypti mosquitoes, from same and opposite sex mosquitoes of the species, are recorded on independent and unique audio channels, together with the response of tethered individual mosquitoes to playbacks of pre-recorded flight tones of lone or paired individuals. A time-dependent representation of each mosquito's non-stationary wing beat frequency signature is constructed, based on Hilbert spectral analysis. A range of algorithmic tools is developed to automatically analyse these data, and used to perform a robust quantitative identification of the ‘harmonic convergence’ phenomenon. The results suggest that harmonic convergence is an active phenomenon, which does not occur by chance. It occurs for live pairs, as well as for lone individuals responding to playback recordings, whether from the same or opposite sex. Male–female behaviour is dominated by frequency convergence at a wider range of harmonic combinations than previously reported, and requires participation from both partners in the duet. New evidence is found to show that male–male interactions are more varied than strict frequency avoidance. Rather, they can be divided into two groups: convergent pairs, typified by tightly bound wing beat frequencies, and divergent pairs, that remain widely spaced in the frequency domain. Overall, the results reveal that mosquito acoustic interaction is a delicate and intricate time-dependent active process that involves both individuals, takes place at many different frequencies, and which merits further enquiry. PMID:27053654

  10. Comparing proportional hazards and accelerated failure time models for survival analysis.

    PubMed

    Orbe, Jesus; Ferreira, Eva; Núñez-Antón, Vicente

    2002-11-30

    This paper describes a method proposed for a censored linear regression model that can be used in the context of survival analysis. The method has the important characteristic of allowing estimation and inference without knowing the distribution of the duration variable. Moreover, it does not need the assumption of proportional hazards. Therefore, it can be an interesting alternative to the Cox proportional hazards model when this assumption does not hold. In addition, implementation and interpretation of the results are simple. In order to analyse the performance of this methodology, we apply it to two real examples and carry out a simulation study. We present its results together with those obtained with the traditional Cox model and AFT parametric models. The new proposal seems to lead to more precise results.
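
    The two baselines the paper compares against are easy to reproduce with the lifelines library; the sketch below fits both to lifelines' bundled recidivism data. This is not the authors' censored linear regression estimator, just the standard Cox and Weibull AFT fits.

```python
from lifelines import CoxPHFitter, WeibullAFTFitter
from lifelines.datasets import load_rossi

df = load_rossi()   # columns: week (duration), arrest (event), covariates

cph = CoxPHFitter().fit(df, duration_col="week", event_col="arrest")
aft = WeibullAFTFitter().fit(df, duration_col="week", event_col="arrest")

cph.print_summary()   # effects as hazard ratios
aft.print_summary()   # effects as acceleration factors on survival time
```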

  11. Rockfall Hazard Analysis From Discrete Fracture Network Modelling with Finite Persistence Discontinuities

    NASA Astrophysics Data System (ADS)

    Lambert, Cédric; Thoeni, Klaus; Giacomini, Anna; Casagrande, Davide; Sloan, Scott

    2012-09-01

    Developing an accurate representation of the rock mass fabric is a key element in rock fall hazard analysis. The orientation, persistence and density of fractures control the volume and shape of unstable blocks or compartments. In this study, the discrete fracture modelling technique and digital photogrammetry were used to accurately depict the fabric. A volume distribution of unstable blocks was derived by combining polyhedral modelling and kinematic analyses. For each block size, probabilities of failure and probabilities of propagation were calculated. A complete energy distribution was obtained by considering, for each block size, its occurrence in the rock mass, its probability of falling, its probability of reaching a given location, and the resulting distribution of energies at each location. This distribution was then used with an energy-frequency diagram to assess the hazard.

  12. Analysis of Hazards Associated with a Process Involving Uranium Metal and Uranium Hydride Powders

    SciTech Connect

    Bullock, J.S.

    2000-05-01

    An analysis of the reaction chemistry and operational factors associated with processing uranium and uranium hydride powders is presented, focusing on a specific operation in the Development Division which was subjected to the Job Hazard Analysis (JHA) process. Primary emphasis is on the thermodynamic factors leading to pyrophoricity in common atmospheres. The discussion covers feed powders, cold-pressed and hot-pressed materials, and stray material resulting from the operations. The sensitivity of the various forms of material to pyrophoricity in common atmospheres is discussed. Operational recommendations for performing the work described are given.

  13. Determination of controlling earthquakes from probabilistic seismic hazard analysis for nuclear reactor sites

    SciTech Connect

    Boissonnade, A.; Bernreuter, D.; Chokshi, N.; Murphy, A.

    1995-04-04

    Recently, the US Nuclear Regulatory Commission published, for public comments, a revision to 10 CFR Part 100. The proposed regulation acknowledges that uncertainties are inherent in estimates of the Safe Shutdown Earthquake Ground Motion (SSE) and requires that these uncertainties be addressed through an appropriate analysis. One element of this evaluation is the assessment of the controlling earthquake through the probabilistic seismic hazard analysis (PSHA) and its use in determining the SSE. This paper reviews the basis for the various key choices in characterizing the controlling earthquake.

  14. Quantitative PCR analysis of salivary pathogen burden in periodontitis.

    PubMed

    Salminen, Aino; Kopra, K A Elisa; Hyvärinen, Kati; Paju, Susanna; Mäntylä, Päivi; Buhlin, Kåre; Nieminen, Markku S; Sinisalo, Juha; Pussinen, Pirkko J

    2015-01-01

    Our aim was to investigate the value of salivary concentrations of four major periodontal pathogens and their combination in diagnostics of periodontitis. The Parogene study included 462 dentate subjects (mean age 62.9 ± 9.2 years) with coronary artery disease (CAD) diagnosis who underwent an extensive clinical and radiographic oral examination. Salivary levels of four major periodontal bacteria were measured by quantitative real-time PCR (qPCR). Median salivary concentrations of Porphyromonas gingivalis, Tannerella forsythia, and Prevotella intermedia, as well as the sum of the concentrations of the four bacteria, were higher in subjects with moderate to severe periodontitis compared to subjects with no to mild periodontitis. Median salivary Aggregatibacter actinomycetemcomitans concentrations did not differ significantly between the subjects with no to mild periodontitis and subjects with moderate to severe periodontitis. In logistic regression analysis adjusted for age, gender, diabetes, and the number of teeth and implants, high salivary concentrations of P. gingivalis, T. forsythia, and P. intermedia were significantly associated with moderate to severe periodontitis. When looking at different clinical and radiographic parameters of periodontitis, high concentrations of P. gingivalis and T. forsythia were significantly associated with the number of 4-5 mm periodontal pockets, ≥6 mm pockets, and alveolar bone loss (ABL). High level of T. forsythia was associated also with bleeding on probing (BOP). The combination of the four bacteria, i.e., the bacterial burden index, was associated with moderate to severe periodontitis with an odds ratio (OR) of 2.40 (95% CI 1.39-4.13). When A. actinomycetemcomitans was excluded from the combination of the bacteria, the OR was improved to 2.61 (95% CI 1.51-4.52). The highest OR 3.59 (95% CI 1.94-6.63) was achieved when P. intermedia was further excluded from the combination and only the levels of P. gingivalis and T
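
    The adjusted odds ratios reported above come from ordinary logistic regression: exponentiated coefficients with exponentiated confidence limits. A self-contained sketch on synthetic data (variable names and effect sizes are invented, not the Parogene data):

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 462
df = pd.DataFrame({
    "age": rng.normal(62.9, 9.2, n),
    "high_burden": rng.integers(0, 2, n),   # high salivary bacterial burden (0/1)
})
logit_p = -2.0 + 0.9 * df["high_burden"] + 0.01 * (df["age"] - 62.9)
df["periodontitis"] = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit_p))).astype(int)

X = sm.add_constant(df[["high_burden", "age"]])
fit = sm.Logit(df["periodontitis"], X).fit(disp=False)

print(np.exp(fit.params).round(2))      # odds ratios
print(np.exp(fit.conf_int()).round(2))  # 95% confidence intervals
```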

  15. Tsunami Hazard Analysis for the Eastern Mediterranean and its Connected Seas

    NASA Astrophysics Data System (ADS)

    Necmioglu, Ocal; Meral Ozel, Nurcan

    2015-04-01

    Accurate earthquake source parameters are essential for any tsunami hazard assessment and mitigation, including early warning systems. The complex tectonic setting makes a priori accurate assumptions of earthquake source parameters difficult, and characterization of the faulting type is a challenge. Information on tsunamigenic sources is of crucial importance in the Eastern Mediterranean and its Connected Seas, especially considering the short arrival times and lack of offshore sea-level measurements. In addition, the scientific community has had to abandon the paradigm of a "maximum earthquake" predictable from simple tectonic parameters (Ruff and Kanamori, 1980) in the wake of the 2004 Sumatra event (Okal, 2010), and one of the lessons learnt from the 2011 Tohoku event was that tsunami hazard maps may need to be prepared for infrequent gigantic earthquakes as well as for more frequent smaller-sized earthquakes (Satake, 2011). We have initiated an extensive modeling study to perform a deterministic tsunami hazard analysis for the Eastern Mediterranean and its Connected Seas. Characteristic earthquake source parameters (strike, dip, rake, depth, Mwmax) at each 0.5° x 0.5° bin for 0-40 km depth (total of 310 bins) and for 40-100 km depth (total of 92 bins) in the Eastern Mediterranean, Aegean and Black Sea region (30°N-48°N and 22°E-44°E) have been assigned from the harmonization of the available databases and previous studies. These parameters have been used as input for the deterministic tsunami hazard modeling. Nested tsunami simulations of 6 h duration with a coarse (2 arc-min) grid resolution have been performed at EC-JRC premises for the Black Sea and the Eastern and Central Mediterranean (30°N-41.5°N and 8°E-37°E) for each source defined, using the shallow-water finite-difference SWAN code (Mader, 2004) for the magnitude range 6.5 - Mwmax defined for that bin, with a Mw increment of 0.1. Results show that not only the earthquakes resembling the

  16. Quantitative Analysis by Isotopic Dilution Using Mass Spectroscopy: The Determination of Caffeine by GC-MS.

    ERIC Educational Resources Information Center

    Hill, Devon W.; And Others

    1988-01-01

    Describes a laboratory technique for quantitative analysis of caffeine by an isotopic dilution method for coupled gas chromatography-mass spectroscopy. Discusses caffeine analysis and experimental methodology. Lists sample caffeine concentrations found in common products. (MVL)
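
    The arithmetic behind isotope dilution quantitation is compact: spike a known mass of isotopically labeled caffeine, measure the unlabeled/labeled ion-abundance ratio, and scale. The single-point version below assumes a relative response factor of 1.0 and uses invented peak areas.

```python
# Single-point isotope dilution calculation for caffeine by GC-MS.
spike_mass_ug = 50.0       # mass of labeled caffeine spiked into the sample
area_analyte = 182_000.0   # integrated ion abundance, unlabeled caffeine
area_label = 121_000.0     # integrated ion abundance, labeled caffeine
response_factor = 1.0      # relative response, assumed unity here

analyte_mass_ug = spike_mass_ug * (area_analyte / area_label) / response_factor
print(f"caffeine in sample: {analyte_mass_ug:.1f} ug")
```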

  17. Near-Field Probabilistic Seismic Hazard Analysis of Metropolitan Tehran Using Region-Specific Directivity Models

    NASA Astrophysics Data System (ADS)

    Yazdani, Azad; Nicknam, Ahmad; Dadras, Ehsan Yousefi; Eftekhari, Seyed Nasrollah

    2016-09-01

    Ground motions are affected by directivity effects in near-fault regions, which result in low-frequency cycle pulses at the beginning of the velocity time history. The directivity features of near-fault ground motions can lead to a significant increase in the risk of earthquake-induced damage to engineering structures. Ordinary probabilistic seismic hazard analysis (PSHA) does not take such effects into account; recent studies have thus proposed new frameworks to incorporate directivity effects in PSHA. The objective of this study is to develop seismic hazard maps of Tehran City according to the near-fault PSHA procedure for different return periods. To this end, the directivity models required in the modified PSHA were developed based on a database of simulated ground motions. The simulated database was used in this study because there are no recorded near-fault data in the region from which to derive purely empirically based pulse prediction models. The results show that directivity effects can significantly affect the estimate of regional seismic hazard.

  18. Laser hazard analysis for airborne AURA (Big Sky variant) Proteus platform.

    SciTech Connect

    Augustoni, Arnold L.

    2004-02-01

    A laser safety and hazard analysis was performed for the airborne AURA (Big Sky Laser Technology) lidar system based on the 2000 version of the American National Standards Institute's (ANSI) Standard Z136.1, for the Safe Use of Lasers, and the 2000 version of the ANSI Standard Z136.6, for the Safe Use of Lasers Outdoors. The AURA lidar system is installed in the instrument pod of a Proteus airframe and is used to perform laser interaction experiments and tests at various national test sites. The targets are located at various distances or ranges from the airborne platform. In order to protect personnel who may be in the target area and may be subjected to exposures, it was necessary to determine the Maximum Permissible Exposure (MPE) for each laser wavelength, calculate the Nominal Ocular Hazard Distance (NOHD), and determine the maximum 'eye-safe' dwell times for various operational altitudes and conditions. It was also necessary to calculate the appropriate minimum Optical Density (ODmin) of the laser safety eyewear used by authorized personnel who may receive hazardous exposures during ground-based operations of the airborne AURA laser system (system alignment and calibration).
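
    The two quantities at the heart of such an analysis have closed forms in ANSI Z136.1 for a small circular beam: the NOHD from output, divergence, aperture, and MPE, and the minimum eyewear optical density from the worst-case exposure over the MPE. The numbers below are hypothetical, not the AURA system's parameters.

```python
import math

# Hypothetical pulsed-laser parameters (single-pulse basis).
q_pulse_j = 0.05              # energy per pulse
mpe_j_per_cm2 = 5e-7          # MPE for this wavelength/pulse duration (assumed)
divergence_rad = 1e-3         # full beam divergence
aperture_cm = 1.0             # emergent beam diameter

# NOHD = (sqrt(4 Q / (pi * MPE)) - a) / phi   (small-source form)
nohd_cm = (math.sqrt(4.0 * q_pulse_j / (math.pi * mpe_j_per_cm2))
           - aperture_cm) / divergence_rad
print(f"NOHD ~ {nohd_cm / 100.0:.0f} m")

# Minimum optical density: OD_min = log10(H0 / MPE), H0 = worst-case exposure.
h0_j_per_cm2 = 4.0 * q_pulse_j / (math.pi * aperture_cm ** 2)
print(f"OD_min ~ {math.log10(h0_j_per_cm2 / mpe_j_per_cm2):.1f}")
```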

  19. Hazardous waste crime: a contextual analysis of the offense and the offender

    SciTech Connect

    Rebovich, D.J.

    1986-01-01

    The goal of this study is to analyze hazardous waste offense and offender characteristics. Criminal case data were collected from four sample states (Maine, Maryland, New Jersey, and Pennsylvania). Files of disposed criminal cases charged between 1977 and 1984 were content-analyzed, and interviews were conducted with prominent hazardous waste crime enforcement personnel from the sample states. Areas of analysis include methods of crime commission, skills required for crime commission, patterns of criminal network relationships, and degree of syndicate crime influence. While there has been some previous speculation that hazardous waste criminal behavior is directed through centralized racketeering, the present study of known offenders found little evidence of syndicate crime family infiltration. Crimes occurred within small, informal networks of waste generators, waste transporters, the employees of treatment/storage/disposal (TSD) facilities and certain non-industry peripheral actors. The study concludes that, while attempts have been made by syndicate crime operatives to infiltrate, these attempts have failed largely due to features of criminal commission methods and to the inherent fragmentation of hauling and TSD firm interests.

  20. Additive interaction in survival analysis: use of the additive hazards model.

    PubMed

    Rod, Naja Hulvej; Lange, Theis; Andersen, Ingelise; Marott, Jacob Louis; Diderichsen, Finn

    2012-09-01

    It is a widely held belief in public health and clinical decision-making that interventions or preventive strategies should be aimed at patients or population subgroups where most cases could potentially be prevented. To identify such subgroups, deviation from additivity of absolute effects is the relevant measure of interest. Multiplicative survival models, such as the Cox proportional hazards model, are often used to estimate the association between exposure and risk of disease in prospective studies. In Cox models, deviations from additivity have usually been assessed by surrogate measures of additive interaction derived from multiplicative models-an approach that is both counter-intuitive and sometimes invalid. This paper presents a straightforward and intuitive way of assessing deviation from additivity of effects in survival analysis by use of the additive hazards model. The model directly estimates the absolute size of the deviation from additivity and provides confidence intervals. In addition, the model can accommodate both continuous and categorical exposures and models both exposures and potential confounders on the same underlying scale. To illustrate the approach, we present an empirical example of interaction between education and smoking on risk of lung cancer. We argue that deviations from additivity of effects are important for public health interventions and clinical decision-making, and such estimations should be encouraged in prospective studies on health. A detailed implementation guide of the additive hazards model is provided in the appendix.
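
    Aalen's additive model, a close relative of the additive hazards model advocated above, is available in the lifelines library; covariate effects enter the hazard on an absolute scale, so departures from additivity can be read directly from the fitted coefficients. A sketch on lifelines' bundled data (not the lung-cancer example from the paper):

```python
from lifelines import AalenAdditiveFitter
from lifelines.datasets import load_rossi

df = load_rossi()
aaf = AalenAdditiveFitter(coef_penalizer=1.0)
aaf.fit(df, duration_col="week", event_col="arrest")

# Cumulative regression functions: slopes are time-varying additive
# contributions of each covariate to the hazard.
print(aaf.cumulative_hazards_.head())
```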

  1. Landscape analysis for multi-hazard prevention in Orco and Soana valleys, North-Western Italy

    NASA Astrophysics Data System (ADS)

    Turconi, L.; Tropeano, D.; Savio, G.; De, S. Kumar; Mason, P. J.

    2015-04-01

    A Civil Protection Plan has been drafted for a 600 km2 mountainous region in NW Italy consisting of the Orco and Soana valleys. The area is part of the oldest natural park in Italy and attracts several thousand tourists every year. The work is concerned with the analysis of the relevant physiographic characteristics of this Alpine landscape, which has extremely variable geomorphology and a long history of instability. Thousands of records as well as digital maps (involving the overlay and comparison of up to 90 GIS layers) have been analyzed and cross-correlated to reconstruct the details of past events. The study area has experienced different types of natural hazards, typical of the whole Alpine environment, and was therefore selected for multi-hazard research in which several natural processes and their damaging effects over the land have been investigated. At least 250 deaths have been recorded in the area since the 18th century, in the course of 36 severe hazardous events.

  2. Quantitative analysis of localized surface plasmons based on molecular probing.

    PubMed

    Deeb, Claire; Bachelot, Renaud; Plain, Jérôme; Baudrion, Anne-Laure; Jradi, Safi; Bouhelier, Alexandre; Soppera, Olivier; Jain, Prashant K; Huang, Libai; Ecoffet, Carole; Balan, Lavinia; Royer, Pascal

    2010-08-24

    We report on the quantitative characterization of the plasmonic optical near-field of a single silver nanoparticle. Our approach relies on nanoscale molecular molding of the confined electromagnetic field by photoactivated molecules. We were able to directly image the dipolar profile of the near-field distribution with a resolution better than 10 nm and to quantify the near-field depth and its enhancement factor. A single nanoparticle spectral signature was also assessed. This quantitative characterization constitutes a prerequisite for developing nanophotonic applications.

  3. Quantitative Analysis of Autophagy using Advanced 3D Fluorescence Microscopy

    PubMed Central

    Changou, Chun A.; Wolfson, Deanna L.; Ahluwalia, Balpreet Singh; Bold, Richard J.; Kung, Hsing-Jien; Chuang, Frank Y.S.

    2013-01-01

    Prostate cancer is the leading form of malignancies among men in the U.S. While surgery carries a significant risk of impotence and incontinence, traditional chemotherapeutic approaches have been largely unsuccessful. Hormone therapy is effective at early stage, but often fails with the eventual development of hormone-refractory tumors. We have been interested in developing therapeutics targeting specific metabolic deficiency of tumor cells. We recently showed that prostate tumor cells specifically lack an enzyme (argininosuccinate synthase, or ASS) involved in the synthesis of the amino acid arginine1. This condition causes the tumor cells to become dependent on exogenous arginine, and they undergo metabolic stress when free arginine is depleted by arginine deiminase (ADI)1,10. Indeed, we have shown that human prostate cancer cells CWR22Rv1 are effectively killed by ADI with caspase-independent apoptosis and aggressive autophagy (or macroautophagy)1,2,3. Autophagy is an evolutionarily-conserved process that allows cells to metabolize unwanted proteins by lysosomal breakdown during nutritional starvation4,5. Although the essential components of this pathway are well-characterized6,7,8,9, many aspects of the molecular mechanism are still unclear - in particular, what is the role of autophagy in the death-response of prostate cancer cells after ADI treatment? In order to address this question, we required an experimental method to measure the level and extent of autophagic response in cells - and since there are no known molecular markers that can accurately track this process, we chose to develop an imaging-based approach, using quantitative 3D fluorescence microscopy11,12. Using CWR22Rv1 cells specifically-labeled with fluorescent probes for autophagosomes and lysosomes, we show that 3D image stacks acquired with either widefield deconvolution microscopy (and later, with super-resolution, structured-illumination microscopy) can clearly capture the early stages of

  4. Quantitative analysis of autophagy using advanced 3D fluorescence microscopy.

    PubMed

    Changou, Chun A; Wolfson, Deanna L; Ahluwalia, Balpreet Singh; Bold, Richard J; Kung, Hsing-Jien; Chuang, Frank Y S

    2013-01-01

    Prostate cancer is the leading form of malignancies among men in the U.S. While surgery carries a significant risk of impotence and incontinence, traditional chemotherapeutic approaches have been largely unsuccessful. Hormone therapy is effective at early stage, but often fails with the eventual development of hormone-refractory tumors. We have been interested in developing therapeutics targeting specific metabolic deficiency of tumor cells. We recently showed that prostate tumor cells specifically lack an enzyme (argininosuccinate synthase, or ASS) involved in the synthesis of the amino acid arginine(1). This condition causes the tumor cells to become dependent on exogenous arginine, and they undergo metabolic stress when free arginine is depleted by arginine deiminase (ADI)(1,10). Indeed, we have shown that human prostate cancer cells CWR22Rv1 are effectively killed by ADI with caspase-independent apoptosis and aggressive autophagy (or macroautophagy)(1,2,3). Autophagy is an evolutionarily-conserved process that allows cells to metabolize unwanted proteins by lysosomal breakdown during nutritional starvation(4,5). Although the essential components of this pathway are well-characterized(6,7,8,9), many aspects of the molecular mechanism are still unclear - in particular, what is the role of autophagy in the death-response of prostate cancer cells after ADI treatment? In order to address this question, we required an experimental method to measure the level and extent of autophagic response in cells - and since there are no known molecular markers that can accurately track this process, we chose to develop an imaging-based approach, using quantitative 3D fluorescence microscopy(11,12). Using CWR22Rv1 cells specifically-labeled with fluorescent probes for autophagosomes and lysosomes, we show that 3D image stacks acquired with either widefield deconvolution microscopy (and later, with super-resolution, structured-illumination microscopy) can clearly capture the early

  5. A diagnostic programme for quantitative analysis of proteinuria.

    PubMed

    Hofmann, W; Guder, W G

    1989-09-01

    A spectrum of quantitative methods was adapted to the Kone Specific Analyser for the purpose of recognizing, quantifying and differentiating various forms of proteinuria. Total protein, IgG, albumin and alpha 1-microglobulin (measured by turbidimetry), N-acetyl-beta-D-glucosaminidase activity and creatinine (measured photometrically) were measured in undiluted urine; in addition, alpha 1-microglobulin was measured in serum. Within-run and between-run precision, accuracy and linearity of the turbidimetric methods were in good agreement with nephelometric procedures. All turbidimetric methods exhibited a correlation coefficient r greater than 0.98 when compared with the radial immunodiffusion procedure as reference method. Total protein measured turbidimetrically with the Kone Specific Analyser was in good agreement with the manual biuret procedure. The low detection limits and linearities allowed quantification of urine analytes from the lower range of normals up to ten times the upper limit of normals. The measured analytes were stable in urine at pH 4-8 over at least seven days at 4-6 degrees C and at -20 degrees C. Only IgG showed a significant loss (up to 30 percent) when measured after storage at -20 degrees C. Quantities per mol creatinine showed significantly lower intra-individual and inter-individual variability than quantities per liter. In 31 normal persons, the intra-individual variation was lowest for N-acetyl-beta-D-glucosaminidase activity (13%) and highest for total protein (33%), when measured in the second morning urine on 5 consecutive days. When related to creatinine, results obtained in the second morning urine showed no significant differences from those in 24 h urine, except for alpha 1-microglobulin, which gave lower values in 24 h urines. The upper normal limits, calculated as the 95% ranges, were determined from 154 urines of 31 individuals. Nearly all analytes showed an asymmetric distribution. Because of a wide tailing of the upper limit

  6. Quantitative analysis of autophagy using advanced 3D fluorescence microscopy.

    PubMed

    Changou, Chun A; Wolfson, Deanna L; Ahluwalia, Balpreet Singh; Bold, Richard J; Kung, Hsing-Jien; Chuang, Frank Y S

    2013-05-03

    Prostate cancer is the leading form of malignancies among men in the U.S. While surgery carries a significant risk of impotence and incontinence, traditional chemotherapeutic approaches have been largely unsuccessful. Hormone therapy is effective at early stage, but often fails with the eventual development of hormone-refractory tumors. We have been interested in developing therapeutics targeting specific metabolic deficiency of tumor cells. We recently showed that prostate tumor cells specifically lack an enzyme (argininosuccinate synthase, or ASS) involved in the synthesis of the amino acid arginine(1). This condition causes the tumor cells to become dependent on exogenous arginine, and they undergo metabolic stress when free arginine is depleted by arginine deiminase (ADI)(1,10). Indeed, we have shown that human prostate cancer cells CWR22Rv1 are effectively killed by ADI with caspase-independent apoptosis and aggressive autophagy (or macroautophagy)(1,2,3). Autophagy is an evolutionarily-conserved process that allows cells to metabolize unwanted proteins by lysosomal breakdown during nutritional starvation(4,5). Although the essential components of this pathway are well-characterized(6,7,8,9), many aspects of the molecular mechanism are still unclear - in particular, what is the role of autophagy in the death-response of prostate cancer cells after ADI treatment? In order to address this question, we required an experimental method to measure the level and extent of autophagic response in cells - and since there are no known molecular markers that can accurately track this process, we chose to develop an imaging-based approach, using quantitative 3D fluorescence microscopy(11,12). Using CWR22Rv1 cells specifically-labeled with fluorescent probes for autophagosomes and lysosomes, we show that 3D image stacks acquired with either widefield deconvolution microscopy (and later, with super-resolution, structured-illumination microscopy) can clearly capture the early

  7. The hazard analysis and critical control point system in food safety.

    PubMed

    Herrera, Anavella Gaitan

    2004-01-01

    The Hazard Analysis and Critical Control Point (HACCP) system is a preventive method of ensuring food safety. Its objectives are the identification of consumer safety hazards that can occur in the production line and the establishment of a control process to guarantee a safer product for the consumer; it is based on the identification of potential hazards to food safety and on measures aimed at preventing these hazards. HACCP is the system of choice in the management of food safety. The principles of HACCP are applicable to all phases of food production, including basic husbandry practices, food preparation and handling, food processing, food service, distribution systems, and consumer handling and use. The HACCP system is involved in every aspect of food safety production (according to the UN Food and Agriculture Organization [FAO] and the International Commission on Microbiological Specifications for Foods [ICMSF]). The most basic concept underlying the HACCP system is that of prevention rather than inspection. The control of processes and conditions comprises the critical control point (CCP) element. HACCP is simply a methodical, flexible, and systematic application of the appropriate science and technology for planning, controlling, and documenting the safe production of foods. The successful application of HACCP requires the full commitment and involvement of management and the workforce, using a multidisciplinary approach that should include, as appropriate, expertise in agronomy, veterinary health, microbiology, public health, food technology, environmental health, chemistry, engineering, and so on according to the particular situation. Application of the HACCP system is compatible with the implementation of total quality management (TQM) systems such as the ISO 9000 series. PMID:15156035

  8. A Framework for the Validation of Probabilistic Seismic Hazard Analysis Maps Using Strong Ground Motion Data

    NASA Astrophysics Data System (ADS)

    Bydlon, S. A.; Beroza, G. C.

    2015-12-01

    Recent debate on the efficacy of Probabilistic Seismic Hazard Analysis (PSHA), and the utility of hazard maps (i.e. Stein et al., 2011; Hanks et al., 2012), has prompted a need for validation of such maps using recorded strong ground motion data. Unfortunately, strong motion records are limited spatially and temporally relative to the area and time windows hazard maps encompass. We develop a framework to test the predictive powers of PSHA maps that is flexible with respect to a map's specified probability of exceedance and time window, and the strong motion receiver coverage. Using a combination of recorded and interpolated strong motion records produced through the ShakeMap environment, we compile a record of ground motion intensity measures for California from 2002-present. We use this information to perform an area-based test of California PSHA maps inspired by the work of Ward (1995). Though this framework is flexible in that it can be applied to seismically active areas where ShakeMap-like ground shaking interpolations have or can be produced, this testing procedure is limited by the relatively short lifetime of strong motion recordings and by the desire to only test with data collected after the development of the PSHA map under scrutiny. To account for this, we use the assumption that PSHA maps are time independent to adapt the testing procedure for periods of recorded data shorter than the lifetime of a map. We note that accuracy of this testing procedure will only improve as more data is collected, or as the time-horizon of interest is reduced, as has been proposed for maps of areas experiencing induced seismicity. We believe that this procedure can be used to determine whether PSHA maps are accurately portraying seismic hazard and whether discrepancies are localized or systemic.
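
    In the spirit of the area-based test sketched above: compare the fraction of map cells where observed shaking exceeded the mapped threshold with the exceedance probability the map implies for the observation window, rescaled from its design horizon under a Poisson assumption. All inputs below are synthetic.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
n_cells = 2000

# Map thresholds (10% in 50 yr), rescaled to a 13-yr observation window.
threshold_g = rng.uniform(0.1, 0.6, n_cells)
p_map = 1.0 - (1.0 - 0.10) ** (13.0 / 50.0)

# Synthetic stand-in for 13 yr of ShakeMap-interpolated peak shaking.
observed_max_g = threshold_g * rng.lognormal(-0.35, 0.45, n_cells)

k = int((observed_max_g > threshold_g).sum())
print(f"observed fraction: {k / n_cells:.3f}, map-implied: {p_map:.3f}")

# Binomial consistency check (ignores spatial correlation between cells).
print(f"binomial test p = {stats.binomtest(k, n_cells, p_map).pvalue:.3g}")
```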

  9. Google Earth Views of Probabilistic Tsunami Hazard Analysis Pilot Study, Seaside, Oregon

    NASA Astrophysics Data System (ADS)

    Wong, F. L.; Venturato, A. J.; Geist, E. L.

    2006-12-01

    Virtual globes such as Google Earth provide immediate geographic context for research data for coastal hazard planning. We present Google Earth views of data from a Tsunami Pilot Study conducted within and near Seaside and Gearhart, Oregon, as part of FEMA's Flood Insurance Rate Map Modernization Program (Tsunami Pilot Study Working Group, 2006). Two goals of the pilot study were to develop probabilistic 100- year and 500-year tsunami inundation maps using Probabilistic Tsunami Hazard Analysis (PTHA) and to provide recommendations for improved tsunami hazard assessment guidelines. The Seaside area was chosen because it is typical of many coastal communities along the Cascadia subduction zone that extends from Cape Mendocino, California, to the Strait of Juan de Fuca, Washington. State and local stakeholders also expressed considerable interest in mapping the tsunami threat to this area. The study was an interagency effort by the National Oceanic and Atmospheric Administration, U.S. Geological Survey, and FEMA, in collaboration with the University of Southern California, Middle East Technical University, Portland State University, Horning Geoscience, Northwest Hydraulics Consultants, and the Oregon Department of Geological and Mineral Industries. The pilot study report will be augmented by a separate geographic information systems (GIS) data publication that provides model data and results. In addition to traditional GIS data formats, Google Earth kmz files are available to provide rapid visualization of the data against the rich base map provided by the interface. The data include verbal and geologic observations of historic tsunami events, newly constructed DEMs, historic shorelines, earthquake sources, models of tsunami wave heights, and maps of the estimated 100- and 500-year probabilistic floods. Tsunami Pilot Study Working Group, 2006, Seaside, Oregon Tsunami Pilot Study - Modernization of FEMA Flood Hazard Maps: U.S. Geological Survey Open-file Report 2006

  10. Critical load analysis in hazard assessment of metals using a Unit World Model.

    PubMed

    Gandhi, Nilima; Bhavsar, Satyendra P; Diamond, Miriam L

    2011-09-01

    A Unit World approach has been used extensively to rank chemicals for their hazards and to understand differences in chemical behavior. Whereas the fate and effects of an organic chemical in a Unit World Model (UWM) analysis vary systematically according to one variable (fraction of organic carbon), and the chemicals have a singular ranking regardless of environmental characteristics, metals can change their hazard ranking according to freshwater chemistry, notably pH and dissolved organic carbon (DOC). Consequently, developing a UWM approach for metals requires selecting a series of representative freshwater chemistries, based on an understanding of the sensitivity of model results to this chemistry. Here we analyze results from a UWM for metals with the goal of informing the selection of appropriate freshwater chemistries for a UWM. The UWM loosely couples the biotic ligand model (BLM) to a geochemical speciation model (Windermere Humic Adsorption Model [WHAM]) and then to the multi-species fate transport-speciation (Transpec) model. The UWM is applied to estimate the critical load (CL) of cationic metals Cd, Cu, Ni, Pb, and Zn, using three lake chemistries that vary in trophic status, pH, and other parameters. The model results indicated a difference of four orders of magnitude in particle-to-total dissolved partitioning (K(d)) that translated into minimal differences in fate because of the short water residence time used. However, a maximum 300-fold difference was calculated in Cu toxicity among the three chemistries and three aquatic organisms. Critical loads were lowest (greatest hazard) in the oligotrophic water chemistry and highest (least hazard) in the eutrophic water chemistry, despite the highest fraction of free metal ion as a function of total metal occurring in the mesotrophic system, where toxicity was ameliorated by competing cations. Water hardness, DOC, and pH had the greatest influence on CL, because of the influence of these factors on aquatic

  11. A deterministic analysis of tsunami hazard and risk for the southwest coast of Sri Lanka

    NASA Astrophysics Data System (ADS)

    Wijetunge, J. J.

    2014-05-01

    This paper describes a multi-scenario, deterministic analysis carried out as a pilot study to evaluate the tsunami hazard and risk distribution on the southwest coast of Sri Lanka. The hazard and risk assessment procedure adopted was also assessed against available field records of the impact of the Indian Ocean tsunami in 2004. An evaluation of numerically simulated nearshore tsunami amplitudes corresponding to 'maximum-credible' scenarios from different subduction segments in the Indian Ocean surrounding Sri Lanka suggests that a seismic event similar to the one that generated the tsunami in 2004 can still be considered the 'worst-case' scenario for the southwest coast. Furthermore, it appears that the formation of edge waves, trapped by the primary waves diffracting around the southwest, significantly influences the nearshore tsunami wave field and is largely responsible for relatively higher tsunami amplitudes in certain stretches of the coastline under study. The extent of inundation from numerical simulations corresponding to the worst-case scenario shows good overall agreement with the points of maximum penetration of inundation from field measurements in the aftermath of the 2004 tsunami. It can also be seen that the inundation distribution is strongly influenced by onshore topography. The present study indicates that the mean depth of inundation can be utilised as a primary parameter to quantify the spatial distribution of the tsunami hazard. The spatial distribution of the risk of the tsunami hazard to the population and residential buildings, computed by employing the standard risk formula, shows satisfactory correlation with published statistics of the affected population and the damage to residential property during the tsunami in 2004.

  12. The hazard analysis and critical control point system in food safety.

    PubMed

    Herrera, Anavella Gaitan

    2004-01-01

    The Hazard Analysis and Critical Control Point (HACCP) system is a preventive method of ensuring food safety. Its objectives are the identification of consumer safety hazards that can occur in the production line and the establishment of a control process to guarantee a safer product for the consumer; it is based on the identification of potential hazards to food safety and on measures aimed at preventing these hazards. HACCP is the system of choice in the management of food safety. The principles of HACCP are applicable to all phases of food production, including basic husbandry practices, food preparation and handling, food processing, food service, distribution systems, and consumer handling and use. The HACCP system is involved in every aspect of food safety production (according to the UN Food and Agriculture Organization [FAO] and the International Commission on Microbiological Specifications for Foods [ICMSF]). The most basic concept underlying the HACCP system is that of prevention rather than inspection. The control of processes and conditions comprises the critical control point (CCP) element. HACCP is simply a methodical, flexible, and systematic application of the appropriate science and technology for planning, controlling, and documenting the safe production of foods. The successful application of HACCP requires the full commitment and involvement of management and the workforce, using a multidisciplinary approach that should include, as appropriate, expertise in agronomy, veterinary health, microbiology, public health, food technology, environmental health, chemistry, engineering, and so on according to the particular situation. Application of the HACCP system is compatible with the implementation of total quality management (TQM) systems such as the ISO 9000 series.

  13. Probabilistic floodplain hazard mapping: managing uncertainty by using a bivariate approach for flood frequency analysis

    NASA Astrophysics Data System (ADS)

    Candela, Angela; Tito Aronica, Giuseppe

    2014-05-01

    Floods are a global problem and are considered the most frequent natural disaster world-wide. Many studies show that the severity and frequency of floods have increased in recent years and underline the difficulty of separating the effects of natural climatic changes from human influences such as land management practices, urbanization, etc. Flood risk analysis and assessment are required to provide information on current or future flood hazard and risks in order to accomplish flood risk mitigation and to propose, evaluate and select measures to reduce it. Both components of risk can be mapped individually, and each, as well as the joint estimate of flood risk, is affected by multiple uncertainties. Major sources of uncertainty include the statistical analysis of extreme events, the definition of the hydrological input, the representation of channel and floodplain topography, and the choice of effective hydraulic roughness coefficients. The classical procedure for estimating flood discharge for a chosen probability of exceedance is to use a rainfall-runoff model, assigning to the flood the same return period as the original rainfall in accordance with the iso-frequency criterion. Alternatively, a flood frequency analysis is applied to a given record of discharge data, but again a single probability is associated with the flood discharges and the respective risk. Moreover, since flood peaks and the corresponding flood volumes are variables of the same phenomenon, they are directly correlated and, consequently, multivariate statistical analyses must be applied. This study presents an innovative approach to obtain flood hazard maps in which the hydrological input (synthetic flood design event) to a 2D hydraulic model has been defined by generating flood peak discharges and volumes from: a) a classical univariate approach, and b) a bivariate statistical analysis, through the use of copulas. The univariate approach considers flood hydrograph generation by an indirect approach (rainfall-runoff transformation using input rainfall
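
    The bivariate step can be sketched with a Gaussian copula: draw correlated normals, push them through the normal CDF to get dependent uniforms, then invert assumed marginal distributions for flood peak and volume. Marginals, parameters, and the correlation below are illustrative, not the study's fitted copula.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)

# Gaussian copula: correlated uniforms from correlated standard normals.
rho = 0.7
z = rng.multivariate_normal([0.0, 0.0], [[1.0, rho], [rho, 1.0]], size=10_000)
u = stats.norm.cdf(z)

# Assumed Gumbel marginals for peak discharge (m^3/s) and volume (hm^3).
peak = stats.gumbel_r.ppf(u[:, 0], loc=150.0, scale=60.0)
volume = stats.gumbel_r.ppf(u[:, 1], loc=12.0, scale=5.0)

print(f"sample rank correlation: {stats.spearmanr(peak, volume)[0]:.2f}")
```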

  14. MOLD SPECIFIC QUANTITATIVE PCR: THE EMERGING STANDARD IN MOLD ANALYSIS

    EPA Science Inventory

    Today I will talk about the use of quantitative, or real-time, PCR for the standardized identification and quantification of molds. There are probably at least 100,000 species of molds or fungi, but only about 100 are typically found indoors. Some pose a threat to human...

  15. Teaching Quantitative Research Methods: A Quasi-Experimental Analysis.

    ERIC Educational Resources Information Center

    Bridges, George S.; Gillmore, Gerald M.; Pershing, Jana L.; Bates, Kristin A.

    1998-01-01

    Describes an experiment designed to introduce aspects of quantitative reasoning to a large, substantively-focused class in the social sciences. Reveals that participating students' abilities to interpret and manipulate empirical data increased significantly, independent of baseline SAT verbal and mathematics scores. Discusses implications for…

  16. Quantitative and Qualitative Analysis of Biomarkers in Fusarium verticillioides

    Technology Transfer Automated Retrieval System (TEKTRAN)

    In this study, a combination HPLC-DART-TOF-MS system was utilized to identify and quantitatively analyze carbohydrates in wild type and mutant strains of Fusarium verticillioides. Carbohydrate fractions were isolated from F. verticillioides cellular extracts by HPLC using a cation-exchange size-excl...

  17. Teaching Quantitative Reasoning for Nonscience Majors through Carbon Footprint Analysis

    ERIC Educational Resources Information Center

    Boose, David L.

    2014-01-01

    Quantitative reasoning is a key intellectual skill, applicable across disciplines and best taught in the context of authentic, relevant problems. Here, I describe and assess a laboratory exercise that has students calculate their "carbon footprint" and evaluate the impacts of various behavior choices on that footprint. Students gather…

  18. Fundamental quantitative analysis of microbial activity in aquifer bioreclamation

    SciTech Connect

    Rittman, B.E.; Valocchi, A.J.; Baveye, P.

    1993-08-01

    In situ bioremediation of hazardous organic chemicals that contaminate aquifer solids and ground water is a highly promising technique for many sites at DOE facilities. Its potential stems from having agents for destruction of the contaminants (bacteria) close to the separate-phase liquid or sorbed contaminants. This project was designed to advance knowledge in several of the microbiological fundamentals most important to in situ bioremediation: biodegradation of poorly soluble organic contaminants; dual-limitation kinetics of electron donors and acceptors; kinetics of sequential degradation involving oxygenase reactions; biologically induced clogging of porous media; and two-dimensional modeling of biofilm reactions in nonhomogeneous porous media.
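
    The dual-limitation kinetics named above are usually written as a product of two Monod terms, one for the electron donor and one for the acceptor, so the rate collapses when either substrate runs out. A sketch with illustrative constants (not fitted values from the project):

```python
import numpy as np

def dual_monod_rate(s_donor, s_acceptor, mu_max=4.0, k_donor=1.0, k_acceptor=0.2):
    """Specific utilization rate limited by both electron donor and acceptor."""
    return (mu_max
            * (s_donor / (k_donor + s_donor))
            * (s_acceptor / (k_acceptor + s_acceptor)))

donor = np.array([0.1, 1.0, 10.0])        # contaminant, mg/L
for o2 in (0.05, 0.5, 5.0):               # dissolved oxygen, mg/L
    print(f"O2 = {o2:>4} mg/L -> rates {np.round(dual_monod_rate(donor, o2), 2)} 1/d")
```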

  19. ANALYSIS OF MULTIVARIATE FAILURE TIME DATA USING MARGINAL PROPORTIONAL HAZARDS MODEL.

    PubMed

    Chen, Ying; Chen, Kani; Ying, Zhiliang

    2010-01-01

    The marginal proportional hazards model is an important tool in the analysis of multivariate failure time data in the presence of censoring. We propose a method of estimation via the linear combinations of martingale residuals. The estimation and inference procedures are easy to implement numerically. The estimation is generally more accurate than the existing pseudo-likelihood approach: the size of efficiency gain can be considerable in some cases, and the maximum relative efficiency in theory is infinite. Consistency and asymptotic normality are established. Empirical evidence in support of the theoretical claims is shown in simulation studies. PMID:24307815

  20. [Incorporation of the Hazard Analysis and Critical Control Point system (HACCP) in food legislation].

    PubMed

    Castellanos Rey, Liliana C; Villamil Jiménez, Luis C; Romero Prada, Jaime R

    2004-01-01

    The Hazard Analysis and Critical Control Point system (HACCP), recommended by different international organizations such as the Codex Alimentarius Commission, the World Trade Organization (WTO), the International Office of Epizootics (OIE) and the International Plant Protection Convention (IPPC), among others, contributes to ensuring the safety of food along the agro-alimentary chain and requires Good Manufacturing Practices (GMP) for its implementation, practices which are legislated in most countries. Since 1997, Colombia has set rules and legislation for the application of the HACCP system in agreement with international standards. This paper discusses the potential and difficulties of enforcing this legislation and suggests some policy implications for food safety.