Science.gov

Sample records for quantitative risk evaluation

  1. Genetic toxicology at the crossroads-from qualitative hazard evaluation to quantitative risk assessment.

    PubMed

    White, Paul A; Johnson, George E

    2016-05-01

    Applied genetic toxicology is undergoing a transition from qualitative hazard identification to quantitative dose-response analysis and risk assessment. To facilitate this change, the Health and Environmental Sciences Institute (HESI) Genetic Toxicology Technical Committee (GTTC) sponsored a workshop held in Lancaster, UK on July 10-11, 2014. The event included invited speakers from several institutions, and the contents were divided into three themes: (1) Point-of-departure Metrics for Quantitative Dose-Response Analysis in Genetic Toxicology; (2) Measurement and Estimation of Exposures for Better Extrapolation to Humans; and (3) The Use of Quantitative Approaches in Genetic Toxicology for Human Health Risk Assessment (HHRA). A host of pertinent issues was discussed relating to the use of in vitro and in vivo dose-response data, the development of methods for in vitro to in vivo extrapolation, and approaches that use in vivo dose-response data to determine human exposure limits for regulatory evaluations and decision-making. This Special Issue, which was inspired by the workshop, contains a series of papers that collectively address topics related to the aforementioned themes. The Issue includes contributions that evaluate, describe and discuss in silico, in vitro, in vivo and statistical approaches that are facilitating the shift from qualitative hazard evaluation to quantitative risk assessment. The use and application of the benchmark dose approach was a central theme in many of the workshop presentations and discussions, and the Special Issue includes several contributions that outline novel applications for the analysis and interpretation of genetic toxicity data. Although the contents of the Special Issue constitute an important step towards the adoption of quantitative methods for regulatory assessment of genetic toxicity, formal acceptance of quantitative methods for HHRA and regulatory decision-making will require consensus regarding the

  2. Quantitative environmental risk analysis

    SciTech Connect

    Klovning, J.; Nilsen, E.F.

    1995-12-31

    According to regulations relating to the implementation and use of risk analysis in petroleum activities issued by the Norwegian Petroleum Directorate, it is mandatory for an operator on the Norwegian Continental Shelf to establish acceptance criteria for environmental risk in its activities and to carry out environmental risk analysis. This paper presents a "new" method for environmental risk analysis developed by the company. The objective has been to assist the company in meeting rules and regulations and in assessing and describing environmental risk in a systematic manner. In the environmental risk analysis, the most sensitive biological resource in the affected area is used to assess the environmental damage. The analytical method is based on the methodology for quantitative risk analysis related to loss of life; in addition, it incorporates the effect of seasonal fluctuations in the environmental risk evaluations. The paper describes the function of the main analytical sequences, exemplified through an analysis of environmental risk related to exploration drilling in an environmentally sensitive area on the Norwegian Continental Shelf.

  3. Safety evaluation of disposable baby diapers using principles of quantitative risk assessment.

    PubMed

    Rai, Prashant; Lee, Byung-Mu; Liu, Tsung-Yun; Yuhui, Qin; Krause, Edburga; Marsman, Daniel S; Felter, Susan

    2009-01-01

    Baby diapers are complex products consisting of multiple layers of materials, most of which are not in direct contact with the skin. The safety profile of a diaper is determined by the biological properties of individual components and the extent to which the baby is exposed to each component during use. Rigorous evaluation of the toxicological profile and realistic exposure conditions of each material is important to ensure the overall safety of the diaper under normal and foreseeable use conditions. Quantitative risk assessment (QRA) principles may be applied to the safety assessment of diapers and similar products. Exposure to component materials is determined by considering (1) the conditions of product use, (2) the degree to which individual layers of the product are in contact with the skin during use, and (3) the extent to which some components may be extracted by urine and delivered to the skin. This assessment of potential exposure is then combined with data from standard safety assessments of components to determine the margin of safety (MOS). This study examined the application of QRA to the safety evaluation of baby diapers, including risk assessments for some diaper ingredient chemicals for which the establishment of acceptable and safe exposure levels was demonstrated.
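
    The exposure-based margin-of-safety calculation described above can be sketched in a few lines. All numbers below are hypothetical placeholders, not values from the study:

```python
# Hedged sketch of a margin-of-safety (MOS) calculation as used in
# quantitative risk assessment (QRA). All inputs are hypothetical.

def exposure_dose(conc_mg_per_g, skin_contact_g, transfer_fraction, body_weight_kg):
    """Estimated systemic exposure dose (mg/kg bw/day) from one component."""
    return conc_mg_per_g * skin_contact_g * transfer_fraction / body_weight_kg

def margin_of_safety(noael_mg_per_kg, dose_mg_per_kg):
    """MOS = NOAEL / estimated exposure dose; larger values mean more margin."""
    return noael_mg_per_kg / dose_mg_per_kg

# Hypothetical ingredient: 0.05 mg/g in a layer, 2 g of material contacting
# skin daily, 10% extracted/transferred, 8 kg infant, NOAEL 50 mg/kg bw/day.
dose = exposure_dose(0.05, 2.0, 0.10, 8.0)
mos = margin_of_safety(50.0, dose)
print(round(dose, 5), round(mos, 1))
```

    A risk manager would then compare the MOS against a predefined acceptability threshold (e.g., a composite uncertainty factor).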

  4. 77 FR 41985 - Use of Influenza Disease Models To Quantitatively Evaluate the Benefits and Risks of Vaccines: A...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-17

    ... model used by the Center for Biologics Evaluation and Research (CBER) and suggestions for further...: Richard Forshee, Center for Biologics Evaluation and Research (HFM-210), Food and Drug Administration... disease computer simulation models to generate quantitative estimates of the benefits and risks...

  5. Microbiological Quantitative Risk Assessment

    NASA Astrophysics Data System (ADS)

    Dominguez, Silvia; Schaffner, Donald W.

    The meat and poultry industry faces ongoing challenges due to the natural association of pathogens of concern (e.g., Salmonella, Campylobacter jejuni, Escherichia coli O157:H7) with a variety of domesticated food animals. In addition, pathogens such as Listeria monocytogenes pose a significant cross-contamination risk during further meat and poultry processing, distribution, and storage. Furthermore, the meat and poultry industries are constantly changing with the addition of new products, use of new raw materials, and targeting of new consumer populations, each of which may give rise to potential new risks. National and international regulations are increasingly using a “risk-based” approach to food safety (where the regulatory focus is driven by the magnitude of the risk), so risk assessment is becoming a valuable tool to systematically organize and evaluate the potential public health risk posed by food processing operations.

  6. A quantitative approach for integrating multiple lines of evidence for the evaluation of environmental health risks

    PubMed Central

    Schleier III, Jerome J.; Marshall, Lucy A.; Davis, Ryan S.

    2015-01-01

    Decision analysis often considers multiple lines of evidence during the decision-making process. Researchers and government agencies have advocated for quantitative weight-of-evidence approaches in which multiple lines of evidence can be considered when estimating risk. Therefore, we utilized Bayesian Markov chain Monte Carlo to integrate several human-health risk assessment, biomonitoring, and epidemiology studies that have been conducted for two common insecticides (malathion and permethrin) used for adult mosquito management, to generate an overall estimate of the risk quotient (RQ). The utility of Bayesian inference for risk management is that the estimated risk represents a probability distribution from which the probability of exceeding a threshold can be estimated. After all studies were incorporated, the mean RQ was 0.4386 (variance 0.0163) for malathion and 0.3281 (variance 0.0083) for permethrin. After taking into account all of the evidence available on the risks of ULV insecticides, the probability that malathion or permethrin would exceed a level of concern was less than 0.0001. Bayesian estimates can substantially improve decisions by allowing decision makers to estimate the probability that a risk will exceed a level of concern by considering seemingly disparate lines of evidence. PMID:25648367
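
    The exceedance-probability idea behind this approach can be illustrated with a simple Monte Carlo sketch. Treating the RQ as normally distributed with the reported posterior mean and variance is an assumption made here purely for illustration; the study itself used Bayesian MCMC over multiple evidence sources:

```python
# Estimate P(RQ > threshold) by Monte Carlo, given a posterior mean and
# variance for the risk quotient. The normal approximation is an assumption
# for illustration, not the study's actual posterior.
import random

def prob_exceeds(mean, variance, threshold, n=200_000, seed=42):
    rng = random.Random(seed)
    sd = variance ** 0.5
    hits = sum(1 for _ in range(n) if rng.gauss(mean, sd) > threshold)
    return hits / n

# Posterior summaries reported in the abstract; RQ = 1 as the level of concern.
p_malathion = prob_exceeds(0.4386, 0.0163, 1.0)
p_permethrin = prob_exceeds(0.3281, 0.0083, 1.0)
print(p_malathion, p_permethrin)
```

    Under these summaries, the simulated exceedance probabilities come out tiny, consistent with the abstract's "less than 0.0001" conclusion.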

  7. Quantitative risk assessment system (QRAS)

    NASA Technical Reports Server (NTRS)

    Weinstock, Robert M (Inventor); Smidts, Carol S (Inventor); Mosleh, Ali (Inventor); Chang, Yung-Hsien (Inventor); Swaminathan, Sankaran (Inventor); Groen, Francisco J (Inventor); Tan, Zhibin (Inventor)

    2001-01-01

    A quantitative risk assessment system (QRAS) builds a risk model of a system for which risk of failure is being assessed, then analyzes the risk of the system corresponding to the risk model. The QRAS performs sensitivity analysis of the risk model by altering fundamental components and quantifications built into the risk model, then re-analyzes the risk of the system using the modifications. More particularly, the risk model is built by building a hierarchy, creating a mission timeline, quantifying failure modes, and building/editing event sequence diagrams. Multiplicities, dependencies, and redundancies of the system are included in the risk model. For analysis runs, a fixed baseline is first constructed and stored. This baseline contains the lowest level scenarios, preserved in event tree structure. The analysis runs, at any level of the hierarchy and below, access this baseline for risk quantitative computation as well as ranking of particular risks. A standalone Tool Box capability exists, allowing the user to store application programs within QRAS.

  8. Development of a software for quantitative evaluation of radiotherapy target and organ-at-risk segmentation comparison.

    PubMed

    Kalpathy-Cramer, Jayashree; Awan, Musaddiq; Bedrick, Steven; Rasch, Coen R N; Rosenthal, David I; Fuller, Clifton D

    2014-02-01

    Modern radiotherapy requires accurate region of interest (ROI) inputs for plan optimization and delivery. Target delineation, however, remains operator-dependent and potentially serves as a major source of treatment delivery error. In order to optimize this critical, yet observer-driven process, a flexible web-based platform for individual and cooperative target delineation analysis and instruction was developed in order to meet the following unmet needs: (1) an open-source/open-access platform for automated/semiautomated quantitative interobserver and intraobserver ROI analysis and comparison, (2) a real-time interface for radiation oncology trainee online self-education in ROI definition, and (3) a source for pilot data to develop and validate quality metrics for institutional and cooperative group quality assurance efforts. The resultant software, Target Contour Testing/Instructional Computer Software (TaCTICS), developed using Ruby on Rails, has since been implemented and proven flexible, feasible, and useful in several distinct analytical and research applications.

  10. Evaluation of New Zealand's high-seas bottom trawl closures using predictive habitat models and quantitative risk assessment.

    PubMed

    Penney, Andrew J; Guinotte, John M

    2013-01-01

    United Nations General Assembly Resolution 61/105 on sustainable fisheries (UNGA 2007) establishes three difficult questions for participants in high-seas bottom fisheries to answer: 1) Where are vulnerable marine ecosystems (VMEs) likely to occur?; 2) What is the likelihood of fisheries interaction with these VMEs?; and 3) What might qualify as adequate conservation and management measures to prevent significant adverse impacts? This paper develops an approach to answering these questions for bottom trawling activities in the Convention Area of the South Pacific Regional Fisheries Management Organisation (SPRFMO) within a quantitative risk assessment and cost : benefit analysis framework. The predicted distribution of deep-sea corals from habitat suitability models is used to answer the first question. Distribution of historical bottom trawl effort is used to answer the second, with estimates of seabed areas swept by bottom trawlers being used to develop discounting factors for reduced biodiversity in previously fished areas. These are used in a quantitative ecological risk assessment approach to guide spatial protection planning to address the third question. The coral VME likelihood (average, discounted, predicted coral habitat suitability) of existing spatial closures implemented by New Zealand within the SPRFMO area is evaluated. Historical catch is used as a measure of cost to industry in a cost : benefit analysis of alternative spatial closure scenarios. Results indicate that current closures within the New Zealand SPRFMO area bottom trawl footprint are suboptimal for protection of VMEs. Examples of alternative trawl closure scenarios are provided to illustrate how the approach could be used to optimise protection of VMEs under chosen management objectives, balancing protection of VMEs against economic loss to commercial fishers from closure of historically fished areas. PMID:24358162

  11. Evaluation of New Zealand’s High-Seas Bottom Trawl Closures Using Predictive Habitat Models and Quantitative Risk Assessment

    PubMed Central

    Penney, Andrew J.; Guinotte, John M.

    2013-01-01

    United Nations General Assembly Resolution 61/105 on sustainable fisheries (UNGA 2007) establishes three difficult questions for participants in high-seas bottom fisheries to answer: 1) Where are vulnerable marine ecosystems (VMEs) likely to occur?; 2) What is the likelihood of fisheries interaction with these VMEs?; and 3) What might qualify as adequate conservation and management measures to prevent significant adverse impacts? This paper develops an approach to answering these questions for bottom trawling activities in the Convention Area of the South Pacific Regional Fisheries Management Organisation (SPRFMO) within a quantitative risk assessment and cost : benefit analysis framework. The predicted distribution of deep-sea corals from habitat suitability models is used to answer the first question. Distribution of historical bottom trawl effort is used to answer the second, with estimates of seabed areas swept by bottom trawlers being used to develop discounting factors for reduced biodiversity in previously fished areas. These are used in a quantitative ecological risk assessment approach to guide spatial protection planning to address the third question. The coral VME likelihood (average, discounted, predicted coral habitat suitability) of existing spatial closures implemented by New Zealand within the SPRFMO area is evaluated. Historical catch is used as a measure of cost to industry in a cost : benefit analysis of alternative spatial closure scenarios. Results indicate that current closures within the New Zealand SPRFMO area bottom trawl footprint are suboptimal for protection of VMEs. Examples of alternative trawl closure scenarios are provided to illustrate how the approach could be used to optimise protection of VMEs under chosen management objectives, balancing protection of VMEs against economic loss to commercial fishers from closure of historically fished areas. PMID:24358162

  12. A quantitative evaluation method of flood risks in low-lying areas associated with increase of heavy rainfall in Japan

    NASA Astrophysics Data System (ADS)

    Minakawa, H.; Masumoto, T.

    2012-12-01

    An increase in flood risk, especially in low-lying areas, is predicted as a consequence of global climate change or other causes. Immediate measures such as strengthening of drainage capacity are needed to minimize the damage caused by more-frequent flooding. Typically, drainage pump capacities in paddy areas are planned using the results of drainage analysis with a design rainfall (e.g. the 3-day rainfall amount with a 10-year return period). However, the result depends on the hyetograph of the input rainfall even when the total rainfall amount is equal, so the flood risk may differ among rainfall patterns. It is therefore important to assume various patterns of heavy rainfall for flood risk assessment. On the other hand, rainfall synthesis simulation is useful for generating many patterns of rainfall data for flood studies. We previously proposed a rainfall simulation method, called the diurnal rainfall pattern generator, which can generate short-time-step rainfall and its internal patterns. This study discusses a quantitative evaluation method for detecting the relationship between flood damage risk and heavy rainfall scale using the diurnal rainfall pattern generator. In addition, we estimated flood damage with a focus on rice yield. Our study area was the Kaga three-lagoon basin in Ishikawa Prefecture, Japan. There are two lagoons in the study area, and the low-lying paddy areas extend over about 4,000 ha in the lower reaches of the basin. First, we developed a drainage analysis model that incorporates kinematic and diffusive runoff models for calculating water levels in channels and paddies. Next, the heavy rainfall data for drainage analysis were generated. Here, 3-day rainfall amounts for nine different return periods (2-, 3-, 5-, 8-, 10-, 15-, 50-, 100-, and 200-year) were derived, and three hundred hyetograph patterns were generated for each rainfall amount using the diurnal rainfall pattern generator.
Finally, all data
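
    As one plausible sketch of the first step (deriving design rainfall amounts for given return periods), annual-maximum 3-day rainfall can be modeled with a Gumbel distribution, a common choice in frequency analysis in hydrology. The abstract does not state which method was used, and the fitted parameters below are invented:

```python
# Design rainfall for a given return period under an assumed Gumbel
# distribution of annual-maximum 3-day rainfall. Parameters are hypothetical.
import math

def gumbel_quantile(return_period_yr, loc, scale):
    """Rainfall depth (mm) with the given return period under Gumbel(loc, scale)."""
    p_non_exceed = 1.0 - 1.0 / return_period_yr
    return loc - scale * math.log(-math.log(p_non_exceed))

# Hypothetical fitted parameters (mm): location 120, scale 35.
for T in (2, 10, 100, 200):
    print(T, round(gumbel_quantile(T, 120.0, 35.0), 1))
```

    Longer return periods map to larger design depths, which is why the study's nine return periods span a wide range of flood severities.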

  13. A toolbox for rockfall Quantitative Risk Assessment

    NASA Astrophysics Data System (ADS)

    Agliardi, F.; Mavrouli, O.; Schubert, M.; Corominas, J.; Crosta, G. B.; Faber, M. H.; Frattini, P.; Narasimhan, H.

    2012-04-01

    Rockfall Quantitative Risk Analysis for mitigation design and implementation requires evaluating the probability of rockfall events, the probability and intensity of impacts on structures (elements at risk and countermeasures), their vulnerability, and the related expected costs for different scenarios. A sound theoretical framework has been developed in recent years for both spatially distributed and local (i.e. single element at risk) analyses. Nevertheless, the practical application of existing methodologies remains challenging, due to difficulties in collecting the required data and to the lack of simple, dedicated analysis tools. In order to fill this gap, specific tools have been developed in the form of Excel spreadsheets in the framework of the SafeLand EU project. These tools can be used by stakeholders, practitioners and other interested parties for the quantitative calculation of rockfall risk through its key components (probabilities, vulnerability, loss), using combinations of deterministic and probabilistic approaches. Three tools have been developed, namely QuRAR (by UNIMIB), VulBlock (by UPC), and RiskNow-Falling Rocks (by ETH Zurich). QuRAR implements a spatially distributed, quantitative assessment methodology of rockfall risk for individual buildings or structures in a multi-building context (urban area). Risk is calculated in terms of expected annual cost, through the evaluation of rockfall event probability, propagation and impact probability (by 3D numerical modelling of rockfall trajectories), and empirical vulnerability for different risk protection scenarios. VulBlock allows a detailed, analytical calculation of the vulnerability of reinforced concrete frame buildings to rockfalls and the related fragility curves, both as functions of block velocity and size.
The calculated vulnerability can be integrated in other methodologies/procedures based on the risk equation, by incorporating the uncertainty of the impact location of the rock
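
    The risk equation these tools build on multiplies event probability, impact probability, vulnerability, and the value of the element at risk to give an expected annual cost. A minimal sketch, with all inputs hypothetical:

```python
# Hedged sketch of the rockfall risk equation: expected annual cost as the
# product of annual event probability, conditional impact probability,
# vulnerability (expected damage fraction), and element value. Inputs invented.

def expected_annual_cost(p_event, p_impact, vulnerability, value):
    """Expected annual loss for one element at risk (currency units/year)."""
    return p_event * p_impact * vulnerability * value

# Hypothetical building: 0.1/yr rockfall rate on the slope, 30% chance a
# block reaches the building, 50% expected damage, 1,000,000 asset value.
print(expected_annual_cost(0.1, 0.3, 0.5, 1_000_000))
```

    In the spreadsheet tools, each factor is itself the output of a sub-model (trajectory simulation for impact probability, fragility curves for vulnerability).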

  14. Development of quantitative risk acceptance criteria

    SciTech Connect

    Griesmeyer, J. M.; Okrent, D.

    1981-01-01

    Some of the major considerations for effective management of risk are discussed, with particular emphasis on risks due to nuclear power plant operations. Although there are impacts associated with the rest of the fuel cycle, they are not addressed here. Several previously published proposals for quantitative risk criteria are reviewed. They range from a simple acceptance criterion on individual risk of death to a quantitative risk management framework. The final section discusses some of the problems in the establishment of a framework for the quantitative management of risk.

  15. Understanding Pre-Quantitative Risk in Projects

    NASA Technical Reports Server (NTRS)

    Cooper, Lynne P.

    2011-01-01

    Standard approaches to risk management in projects depend on the ability of teams to identify risks and quantify the probabilities and consequences of these risks (e.g., the 5 x 5 risk matrix). However, long before quantification does - or even can - occur, and long after, teams make decisions based on their pre-quantitative understanding of risk. These decisions can have long-lasting impacts on the project. While significant research has looked at the process of how to quantify risk, our understanding of how teams conceive of and manage pre-quantitative risk is lacking. This paper introduces the concept of pre-quantitative risk and discusses the implications of addressing pre-quantitative risk in projects.
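
    The 5 x 5 risk matrix mentioned above scores each risk on 1-5 likelihood and consequence scales and bins the combined score into qualitative levels. The product-based scoring and thresholds below are one common convention, not a universal standard:

```python
# Sketch of a 5 x 5 risk matrix: likelihood and consequence each scored 1-5,
# with the product binned into low/medium/high. Thresholds are a common
# convention chosen here for illustration.

def risk_matrix(likelihood, consequence):
    if not (1 <= likelihood <= 5 and 1 <= consequence <= 5):
        raise ValueError("scores must be integers from 1 to 5")
    score = likelihood * consequence
    if score >= 15:
        return score, "high"
    if score >= 6:
        return score, "medium"
    return score, "low"

print(risk_matrix(4, 5))  # likely and severe
print(risk_matrix(2, 2))  # unlikely and minor
```

    The paper's point is that such quantification comes late: teams act on informal, pre-quantitative judgments long before scores like these exist.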

  16. Evaluating the effectiveness of pasteurization for reducing human illnesses from Salmonella spp. in egg products: results of a quantitative risk assessment.

    PubMed

    Latimer, Heejeong K; Marks, Harry M; Coleman, Margaret E; Schlosser, Wayne D; Golden, Neal J; Ebel, Eric D; Kause, Janell; Schroeder, Carl M

    2008-02-01

    As part of the process for developing risk-based performance standards for egg product processing, the United States Department of Agriculture (USDA) Food Safety and Inspection Service (FSIS) undertook a quantitative microbial risk assessment for Salmonella spp. in pasteurized egg products. The assessment was designed to assist risk managers in evaluating egg handling and pasteurization performance standards for reducing the likelihood of Salmonella in pasteurized egg products and the subsequent risk to human health. The following seven pasteurized liquid egg product formulations were included in the risk assessment model, with the value in parentheses indicating the estimated annual number of human illnesses from Salmonella from each: egg white (2636), whole egg (1763), egg yolk (708), whole egg with 10% salt (407), whole egg with 10% sugar (0), egg yolk with 10% salt (11), and egg yolk with 10% sugar (0). Increased levels of pasteurization were predicted to be highly effective mitigations for reducing the number of illnesses. For example, if all egg white products were pasteurized for a 6-log10 reduction of Salmonella, the estimated annual number of illnesses from these products would be reduced from 2636 to 270. The risk assessment identified several data gaps and research needs, including a quantitative study of cross-contamination during egg product processing and characterization of egg storage times and temperatures (i) on farms and in homes, (ii) for eggs produced off-line, and (iii) for egg products at retail. Pasteurized egg products are a relatively safe food; however, findings from this study suggest increased pasteurization can make them safer.
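
    The log10-reduction arithmetic behind these pasteurization standards is straightforward: a k-log10 process leaves a fraction 10^-k of the initial organisms. The counts below are hypothetical; the abstract's illness estimates come from the full exposure model, not from this raw arithmetic alone:

```python
# Sketch of log10-reduction arithmetic used in pasteurization performance
# standards. Initial counts are hypothetical placeholders.

def survivors(initial_cfu, log10_reduction):
    """Expected surviving count after a log10-reduction treatment."""
    return initial_cfu * 10.0 ** (-log10_reduction)

print(survivors(1e7, 6))                       # 6-log10 reduction of 1e7 CFU
print(survivors(1e7, 5) / survivors(1e7, 6))   # each extra log cuts survivors 10x
```

    This is why moving from a 5-log10 to a 6-log10 standard can shrink the predicted illness burden so sharply.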

  17. Integrated Environmental Modeling: Quantitative Microbial Risk Assessment

    EPA Science Inventory

    The presentation discusses the need for microbial assessments and presents a road map associated with quantitative microbial risk assessments, through an integrated environmental modeling approach. A brief introduction and the strengths of the current knowledge are illustrated. W...

  18. Evaluating the spatial distribution of quantitative risk and hazard level of arsenic exposure in groundwater, case study of Qorveh County, Kurdistan Iran.

    PubMed

    Nasrabadi, Touraj; Bidabadi, Niloufar Shirani

    2013-01-01

    Regional distribution of quantitative risk and hazard levels due to arsenic poisoning in some parts of Iran's Kurdistan province is considered. To investigate the potential risk and hazard level regarding arsenic-contaminated drinking water and further carcinogenic and non-carcinogenic effects on villagers, thirteen wells in rural areas of Qorveh County were considered for evaluation of arsenic concentration in water. The sampling campaign was performed in August 2010, and arsenic concentration was measured via the silver diethyldithiocarbamate method. The highest and lowest arsenic concentrations are reported in Guilaklu and Qezeljakand villages, at 420 and 67 μg/L, respectively. None of the thirteen water samples met the maximum contaminant level issued by the USEPA and the Institute of Standards and Industrial Research of Iran (10 ppb). The highest arsenic concentrations, and consequently risk and hazard levels, belong to villages situated along the eastern frontiers of the county. Volcanic activity within the upper Miocene and Pleistocene in this part of the study area may be the main geogenic source of arsenic pollution. Quantitative risk values vary from 1.49E-03 in Qezeljakand to 8.92E-03 in Guilaklu and may be interpreted as very high when compared with similar studies in Iran. Regarding non-carcinogenic effects, all thirteen water samples are considered hazardous, since all calculated chronic daily intakes are greater than the arsenic reference dose. This drinking water source has the potential to impose adverse carcinogenic and non-carcinogenic effects on villagers. Accordingly, an urgent decision must be made to substitute the current drinking water source with a safer one.
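
    The reported risk and hazard values follow standard USEPA-style intake calculations: a chronic daily intake (CDI) from water concentration, intake rate, and body weight; a cancer risk as CDI times an oral slope factor; and a hazard quotient as CDI divided by a reference dose. The sketch below uses common default intake parameters and widely cited arsenic toxicity values, so its outputs differ from the study's site-specific results:

```python
# Hedged sketch of USEPA-style drinking-water risk calculations. Intake
# defaults (2 L/day, 70 kg) and arsenic toxicity values (oral slope factor
# 1.5 per mg/kg-day, RfD 3e-4 mg/kg-day) are commonly cited figures used
# here for illustration only.

def chronic_daily_intake(conc_mg_L, intake_L=2.0, body_weight_kg=70.0):
    """CDI (mg/kg-day) for chronic drinking-water exposure."""
    return conc_mg_L * intake_L / body_weight_kg

def cancer_risk(cdi, slope_factor=1.5):
    """Incremental lifetime cancer risk = CDI x oral slope factor."""
    return cdi * slope_factor

def hazard_quotient(cdi, rfd=3e-4):
    """Non-carcinogenic hazard quotient = CDI / reference dose; HQ > 1 is hazardous."""
    return cdi / rfd

cdi = chronic_daily_intake(0.420)   # Guilaklu well: 420 ug/L = 0.420 mg/L
print(f"{cancer_risk(cdi):.2e}", round(hazard_quotient(cdi), 1))
```

    With these generic defaults the HQ is far above 1, matching the abstract's conclusion that all sampled wells exceed the arsenic reference dose.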

  19. Evaluating the spatial distribution of quantitative risk and hazard level of arsenic exposure in groundwater, case study of Qorveh County, Kurdistan Iran

    PubMed Central

    2013-01-01

    Regional distribution of quantitative risk and hazard levels due to arsenic poisoning in some parts of Iran’s Kurdistan province is considered. To investigate the potential risk and hazard level regarding arsenic-contaminated drinking water and further carcinogenic and non-carcinogenic effects on villagers, thirteen wells in rural areas of Qorveh County were considered for evaluation of arsenic concentration in water. The sampling campaign was performed in August 2010, and arsenic concentration was measured via the silver diethyldithiocarbamate method. The highest and lowest arsenic concentrations are reported in Guilaklu and Qezeljakand villages, at 420 and 67 μg/L, respectively. None of the thirteen water samples met the maximum contaminant level issued by the USEPA and the Institute of Standards and Industrial Research of Iran (10 ppb). The highest arsenic concentrations, and consequently risk and hazard levels, belong to villages situated along the eastern frontiers of the county. Volcanic activity within the upper Miocene and Pleistocene in this part of the study area may be the main geogenic source of arsenic pollution. Quantitative risk values vary from 1.49E-03 in Qezeljakand to 8.92E-03 in Guilaklu and may be interpreted as very high when compared with similar studies in Iran. Regarding non-carcinogenic effects, all thirteen water samples are considered hazardous, since all calculated chronic daily intakes are greater than the arsenic reference dose. This drinking water source has the potential to impose adverse carcinogenic and non-carcinogenic effects on villagers. Accordingly, an urgent decision must be made to substitute the current drinking water source with a safer one. PMID:23574885

  20. Quantitative risk modeling in aseptic manufacture.

    PubMed

    Tidswell, Edward C; McGarvey, Bernard

    2006-01-01

    Expedient risk assessment of aseptic manufacturing processes offers unique opportunities for improved and sustained assurance of product quality. Contemporary risk assessments applied to aseptic manufacturing processes, however, are commonly handicapped by assumptions and subjectivity, leading to inexactitude. Quantitative risk modeling augmented with Monte Carlo simulations represents a novel, innovative, and more efficient means of risk assessment. This technique relies upon fewer assumptions and removes subjectivity to more swiftly generate an improved, more realistic, quantitative estimate of risk. The fundamental steps and requirements for an assessment of the risk of bioburden ingress into aseptically manufactured products are described. A case study exemplifies how quantitative risk modeling and Monte Carlo simulations achieve a more rapid and improved determination of the risk of bioburden ingress during the aseptic filling of a parenteral product. Although application of quantitative risk modeling is described here purely for the purpose of process improvement, the technique has far wider relevance in the assisted disposition of batches, cleanroom management, and the utilization of real-time data from rapid microbial monitoring technologies. PMID:17089696
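
    The general shape of quantitative risk modeling with Monte Carlo simulation can be sketched as follows: uncertain inputs are represented as distributions rather than point estimates, and many simulated batches yield a distribution of contamination outcomes. The model structure and all parameters below are invented for illustration and are not taken from the cited assessment:

```python
# Minimal Monte Carlo sketch of quantitative risk modeling for aseptic
# filling: simulate many batches, drawing uncertain inputs (number of
# operator interventions, per-intervention ingress probability) from
# distributions, and summarize the contamination rate. All parameters
# are hypothetical.
import random

def simulate_batches(n_batches=100_000, seed=7):
    rng = random.Random(seed)
    contaminated = 0
    for _ in range(n_batches):
        interventions = rng.randint(5, 20)          # uncertain per-batch count
        p_ingress = rng.uniform(1e-5, 1e-4)         # uncertain per-intervention risk
        p_clean = (1.0 - p_ingress) ** interventions
        if rng.random() > p_clean:
            contaminated += 1
    return contaminated / n_batches

print(simulate_batches())
```

    The output is not a single point estimate but a rate that can be re-simulated under alternative assumptions, which is what makes the approach useful for batch disposition and cleanroom management decisions.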

  1. A Risk Assessment Model for Reduced Aircraft Separation: A Quantitative Method to Evaluate the Safety of Free Flight

    NASA Technical Reports Server (NTRS)

    Cassell, Rick; Smith, Alex; Connors, Mary; Wojciech, Jack; Rosekind, Mark R. (Technical Monitor)

    1996-01-01

    As new technologies and procedures are introduced into the National Airspace System, whether they are intended to improve efficiency, capacity, or safety level, the quantification of potential changes in safety levels is of vital concern. Applications of technology can improve safety levels and allow the reduction of separation standards. An excellent example is the Precision Runway Monitor (PRM). By taking advantage of the surveillance and display advances of PRM, airports can run instrument parallel approaches to runways separated by 3400 feet with the same level of safety as parallel approaches to runways separated by 4300 feet using the standard technology. Despite a wealth of information from flight operations and testing programs, there is no readily quantifiable relationship between numerical safety levels and the separation standards that apply to aircraft on final approach. This paper presents a modeling approach to quantify the risk associated with reducing separation on final approach. Reducing aircraft separation, both laterally and longitudinally, has been the goal of several aviation R&D programs over the past several years. Many of these programs have focused on technological solutions to improve navigation accuracy, surveillance accuracy, aircraft situational awareness, controller situational awareness, and other technical and operational factors that are vital to maintaining flight safety. The risk assessment model relates different types of potential aircraft accidents and incidents and their contribution to overall accident risk. The framework links accident risks to a hierarchy of failsafe mechanisms characterized by procedures and interventions. The model will be used to assess the overall level of safety associated with reducing separation standards and the introduction of new technology and procedures, as envisaged under the Free Flight concept. The model framework can be applied to various aircraft scenarios, including parallel and in

  2. Biologically based, quantitative risk assessment of neurotoxicants.

    PubMed

    Slikker, W; Crump, K S; Andersen, M E; Bellinger, D

    1996-01-01

    The need for biologically based, quantitative risk assessment procedures for noncancer endpoints such as neurotoxicity has been discussed in reports by the United States Congress (Office of Technology Assessment, OTA), the National Research Council (NRC), and a federal coordinating council. According to OTA, current attention and resources allocated to health risk assessment research are inadequate and not commensurate with its impact on public health and the economy. Methods to include continuous rather than dichotomous data for neurotoxicity endpoints, biomarkers of exposure and effects, and pharmacokinetic and mechanistic data have been proposed for neurotoxicity risk assessment but require further review and validation before acceptance. The purpose of this symposium was to examine procedures to enhance the risk assessment process for neurotoxicants and to discuss techniques to make the process more quantitative. Accordingly, a review of the currently used safety-factor risk assessment approach for neurotoxicants is provided, along with specific examples of how this process may be enhanced with the use of the benchmark dose approach. The importance of including physiologically based pharmacokinetic data in the risk assessment process is discussed, and specific examples of this approach are presented for neurotoxicants. The role of biomarkers of exposure and effect and of mechanistic information in the risk assessment process is also addressed. Finally, quantitative approaches using continuous neurotoxicity data are demonstrated and the outcomes compared to those generated by currently used risk assessment procedures. PMID:8838636

  3. QUANTITATIVE PROCEDURES FOR NEUROTOXICOLOGY RISK ASSESSMENT

    EPA Science Inventory

    In this project, previously published information on a biologically based dose-response model for brain development was used to quantitatively evaluate critical neurodevelopmental processes, and to assess potential chemical impacts on early brain development. This model has been ex...

  4. Quantitative evaluation of Alzheimer's disease

    NASA Astrophysics Data System (ADS)

    Duchesne, S.; Frisoni, G. B.

    2009-02-01

    We propose a single, quantitative metric called the disease evaluation factor (DEF) and assess its efficiency at estimating disease burden in normal control subjects (CTRL) and probable Alzheimer's disease (AD) patients. The study group consisted of 75 patients with a diagnosis of probable AD and 75 age-matched normal CTRL subjects without neurological or neuropsychological deficit. We calculated a reference eigenspace of MRI appearance from reference data, into which our CTRL and probable AD subjects were projected. We then calculated the multi-dimensional hyperplane separating the CTRL and probable AD groups. The DEF was estimated via a multidimensional weighted distance of eigencoordinates for a given subject and the CTRL group mean, along salient principal components forming the separating hyperplane. We used quantile plots, Kolmogorov-Smirnov and χ2 tests to compare the DEF values and test that their distribution was normal. We used a linear discriminant test to separate CTRL from probable AD based on the DEF factor, and reached an accuracy of 87%. A quantitative biomarker in AD would act as an important surrogate marker of disease status and progression.
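The DEF described above is, in essence, a weighted distance in eigenspace. A minimal sketch, assuming a simple weighted Euclidean form (the paper's exact weighting along the separating hyperplane may differ, and the function name is illustrative):

```python
import numpy as np

def disease_evaluation_factor(subject_coords, ctrl_mean, weights):
    """Weighted distance of a subject's eigencoordinates from the CTRL
    group mean along salient principal components. Hypothetical
    reformulation; the published DEF may weight components differently."""
    d = np.asarray(subject_coords, dtype=float) - np.asarray(ctrl_mean, dtype=float)
    return float(np.sqrt(np.sum(np.asarray(weights, dtype=float) * d ** 2)))
```

With unit weights this reduces to the ordinary Euclidean distance from the CTRL mean.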

  5. Quantitative evaluation of dermatological antiseptics.

    PubMed

    Leitch, C S; Leitch, A E; Tidman, M J

    2015-12-01

    Topical antiseptics are frequently used in dermatological management, yet evidence for the efficacy of traditional generic formulations is often largely anecdotal. We tested the in vitro bactericidal activity of four commonly used topical antiseptics against Staphylococcus aureus, using a modified version of the European Standard EN 1276, a quantitative suspension test for evaluation of the bactericidal activity of chemical disinfectants and antiseptics. To meet the standard for antiseptic effectiveness of EN 1276, at least a 5 log10 reduction in bacterial count within 5 minutes of exposure is required. While 1% benzalkonium chloride and 6% hydrogen peroxide both achieved a 5 log10 reduction in S. aureus count, neither 2% aqueous eosin nor 1 : 10 000 potassium permanganate showed significant bactericidal activity compared with control at exposure periods of up to 1 h. Aqueous eosin and potassium permanganate may have desirable astringent properties, but these results suggest they lack effective antiseptic activity, at least against S. aureus.
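The EN 1276 pass criterion quoted above reduces to a simple log10 calculation on viable counts. A minimal sketch, with hypothetical function names (time handling within the 5-minute window is omitted):

```python
import math

def log10_reduction(initial_cfu, surviving_cfu):
    """Log10 reduction in viable bacterial count after antiseptic exposure."""
    return math.log10(initial_cfu / surviving_cfu)

def meets_en1276(initial_cfu, surviving_cfu):
    """EN 1276 effectiveness standard: at least a 5 log10 reduction
    in bacterial count within 5 minutes of exposure."""
    return log10_reduction(initial_cfu, surviving_cfu) >= 5.0
```

For example, a suspension falling from 10^8 to 10^2 CFU/ml is a 6 log10 reduction and passes, while falling only to 10^4 CFU/ml is a 4 log10 reduction and fails.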

  6. Quantitative Risk Assessment for Enhanced Geothermal Systems

    NASA Astrophysics Data System (ADS)

    Lowry, T. S.; McKenna, S. A.; Hadgu, T.; Kalinina, E.

    2011-12-01

    This study uses a quantitative risk-assessment approach to place the uncertainty associated with enhanced geothermal systems (EGS) development into meaningful context and to identify points of attack that can reduce risk the most. Using the integrated geothermal assessment tool, GT-Mod, we calculate the complementary cumulative distribution function of the levelized cost of electricity (LCOE) that results from uncertainty in a variety of geologic and economic input parameter values. EGS is a developing technology that taps deep (2-10 km) geologic heat sources for energy production by "enhancing" non-permeable hot rock through hydraulic stimulation. Despite the promise of EGS, uncertainties in predicting the physical and economic performance of a site have hindered its development. To address this, we apply a quantitative risk-assessment approach that calculates risk as the sum of the consequence, C, multiplied by the range of the probability, ΔP, over all estimations of a given exceedance probability, n, over time, t. The consequence here is defined as the deviation from the best-estimate LCOE, which is calculated using the 'best-guess' input parameter values. The analysis assumes a realistic but fictitious EGS site with uncertainties in the exploration success rate, the sub-surface thermal gradient, the reservoir fracture pattern, and the power plant performance. Uncertainty in the exploration, construction, O&M, and drilling costs is also included. The depth to the resource is calculated from the thermal gradient and a target resource temperature of 225 °C. Thermal performance is simulated using the Gringarten analytical solution. The mass flow rate is set to produce 30 MWe of power for the given conditions and is adjusted over time to maintain that rate over the plant lifetime of 30 years. Simulations are conducted using GT-Mod, which dynamically links the physical systems of a geothermal site to simulate, as an integrated, multi-system component, the
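The risk formula in the abstract, a sum of consequence C times probability increment ΔP over the exceedance-probability estimates, can be discretised as follows. This is a sketch under assumed inputs, not GT-Mod itself:

```python
def ccdf_risk(exceedance_probs, lcoe_values, best_estimate_lcoe):
    """Risk as the sum of consequence (deviation from the best-estimate
    LCOE) times the probability increment ΔP between successive
    exceedance-probability estimates. Hypothetical discretisation of
    the abstract's formula."""
    risk = 0.0
    n = len(lcoe_values)
    for i in range(n):
        consequence = abs(lcoe_values[i] - best_estimate_lcoe)
        next_p = exceedance_probs[i + 1] if i + 1 < n else 0.0
        delta_p = exceedance_probs[i] - next_p  # probability mass of this slice
        risk += consequence * delta_p
    return risk
```

Deviations in both directions from the best estimate contribute, so a wider LCOE distribution yields a larger risk value.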

  7. Bayes' theorem and quantitative risk assessment

    SciTech Connect

    Kaplan, S.

    1994-12-31

    This paper argues that for a quantitative risk analysis (QRA) to be useful for public and private decision making, and for rallying the support necessary to implement those decisions, it is necessary that the QRA results be "trustable." Trustable means that the results are based solidly and logically on all the relevant evidence available. This, in turn, means that the quantitative results must be derived from the evidence using Bayes' theorem. Thus, the paper argues that analysts should strive to make their QRAs more clearly and explicitly Bayesian, and in this way make them more "evidence dependent" than "personality dependent."
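A minimal illustration of the Bayesian updating the paper advocates, over a discrete set of hypotheses (function and variable names are illustrative):

```python
def bayes_update(priors, likelihoods):
    """Bayes' theorem over discrete hypotheses:
    posterior_i = prior_i * likelihood_i / sum_j(prior_j * likelihood_j)."""
    unnormalized = [p * l for p, l in zip(priors, likelihoods)]
    total = sum(unnormalized)
    return [u / total for u in unnormalized]
```

For instance, with two equally likely hypotheses about a component's failure frequency and observed evidence four times as likely under the first, the posterior shifts to 0.8 versus 0.2; the result depends on the evidence, not on the analyst.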

  8. Trends in quantitative cancer risk assessment.

    PubMed Central

    Morris, S C

    1991-01-01

    Quantitative cancer risk assessment is a dynamic field, more closely coupled to rapidly advancing biomedical research than ever before. Six areas of change and growth are identified: expansion from models of cancer initiation to a more complete picture of the total carcinogenic process; trend from curve-fitting to biologically based models; movement from upperbound estimates to best estimates, with a more complete treatment of uncertainty; increased consideration of the role of susceptibility; growing development of expert systems and decision support systems; and emerging importance of risk communication. PMID:2050076

  9. Asbestos exposure--quantitative assessment of risk

    SciTech Connect

    Hughes, J.M.; Weill, H.

    1986-01-01

    Methods for deriving quantitative estimates of asbestos-associated health risks are reviewed and their numerous assumptions and uncertainties described. These methods involve extrapolation of risks observed at past relatively high asbestos concentration levels down to usually much lower concentration levels of interest today--in some cases, orders of magnitude lower. These models are used to calculate estimates of the potential risk to workers manufacturing asbestos products and to students enrolled in schools containing asbestos products. The potential risk to workers exposed for 40 yr to 0.5 fibers per milliliter (f/ml) of mixed asbestos fiber type (a permissible workplace exposure limit under consideration by the Occupational Safety and Health Administration (OSHA)) is estimated as 82 lifetime excess cancers per 10,000 exposed. The risk to students exposed to an average asbestos concentration of 0.001 f/ml of mixed asbestos fiber types for an average enrollment period of 6 school years is estimated as 5 lifetime excess cancers per one million exposed. If the school exposure is to chrysotile asbestos only, then the estimated risk is 1.5 lifetime excess cancers per million. Risks from other causes are presented for comparison; e.g., annual rates (per million) of 10 deaths from high school football, 14 from bicycling (10-14 yr of age), 5 to 20 for whooping cough vaccination. Decisions concerning asbestos products require participation of all parties involved and should only be made after a scientifically defensible estimate of the associated risk has been obtained. In many cases to date, such decisions have been made without adequate consideration of the level of risk or the cost-effectiveness of attempts to lower the potential risk. 73 references.

  10. Quantitative risk modelling for new pharmaceutical compounds.

    PubMed

    Tang, Zhengru; Taylor, Mark J; Lisboa, Paulo; Dyas, Mark

    2005-11-15

    The process of discovering and developing new drugs is long, costly and risk-laden. Faced with a wealth of newly discovered compounds, industrial scientists need to target resources carefully to discern the key attributes of a drug candidate and to make informed decisions. Here, we describe a quantitative approach to modelling the risk associated with drug development as a tool for scenario analysis concerning the probability of success of a compound as a potential pharmaceutical agent. We bring together the three strands of manufacture, clinical effectiveness and financial returns. This approach involves the application of a Bayesian Network. A simulation model is demonstrated with an implementation in MS Excel using the modelling engine Crystal Ball. PMID:16257374
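As a simplification of the three strands mentioned above, the overall probability of success can be sketched as a product of per-strand probabilities, assuming independence (the paper's Bayesian network would capture the dependencies this ignores):

```python
def compound_success_probability(p_manufacture, p_clinical, p_financial):
    """Naive scenario model: probability that a candidate compound
    succeeds on all three strands, assuming independence. Hypothetical
    simplification of the paper's Bayesian-network approach."""
    return p_manufacture * p_clinical * p_financial
```

Even this crude model shows why attrition is severe: three individually plausible strand probabilities multiply down to a small overall chance of success.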

  11. Quantitative risk analysis of oil storage facilities in seismic areas.

    PubMed

    Fabbrocino, Giovanni; Iervolino, Iunio; Orlando, Francesca; Salzano, Ernesto

    2005-08-31

    Quantitative risk analysis (QRA) of industrial facilities has to take into account multiple hazards threatening critical equipment. Nevertheless, engineering procedures able to evaluate quantitatively the effect of seismic action are not well established. Indeed, relevant industrial accidents may be triggered by loss of containment following ground shaking or other relevant natural hazards, either directly or through cascade effects ('domino effects'). The issue of integrating structural seismic risk into quantitative probabilistic seismic risk analysis (QpsRA) is addressed in this paper by a representative study case regarding an oil storage plant with a number of atmospheric steel tanks containing flammable substances. Empirical seismic fragility curves and probit functions, properly defined both for building-like and non building-like industrial components, have been crossed with outcomes of probabilistic seismic hazard analysis (PSHA) for a test site located in south Italy. Once the seismic failure probabilities have been quantified, consequence analysis has been performed for those events which may be triggered by the loss of containment following seismic action. Results are combined by means of a specific developed code in terms of local risk contour plots, i.e. the contour line for the probability of fatal injures at any point (x, y) in the analysed area. Finally, a comparison with QRA obtained by considering only process-related top events is reported for reference. PMID:15908107

  12. Quantitative Security Risk Assessment and Management for Railway Transportation Infrastructures

    NASA Astrophysics Data System (ADS)

    Flammini, Francesco; Gaglione, Andrea; Mazzocca, Nicola; Pragliola, Concetta

    Scientists have long been investigating procedures, models and tools for risk analysis in several domains, from economics to computer networks. This paper presents a quantitative method and a tool for security risk assessment and management specifically tailored to the context of railway transportation systems, which are exposed to threats ranging from vandalism to terrorism. The method is based on a reference mathematical model and it is supported by a specifically developed tool. The tool allows for the management of data, including attributes of attack scenarios and effectiveness of protection mechanisms, and the computation of results, including risk and cost/benefit indices. The main focus is on the design of physical protection systems, but the analysis can be extended to logical threats as well. The cost/benefit analysis allows for the evaluation of the return on investment, which is nowadays an important issue for risk analysts to address.

  13. Hydrogen quantitative risk assessment workshop proceedings.

    SciTech Connect

    Groth, Katrina M.; Harris, Aaron P.

    2013-09-01

    The Quantitative Risk Assessment (QRA) Toolkit Introduction Workshop was held at Energetics on June 11-12. The workshop was co-hosted by Sandia National Laboratories (Sandia) and HySafe, the International Association for Hydrogen Safety. The objective of the workshop was twofold: (1) Present a hydrogen-specific methodology and toolkit (currently under development) for conducting QRA to support the development of codes and standards and safety assessments of hydrogen-fueled vehicles and fueling stations, and (2) Obtain feedback on the needs of early-stage users (hydrogen as well as potential leveraging for Compressed Natural Gas [CNG], and Liquefied Natural Gas [LNG]) and set priorities for "Version 1" of the toolkit in the context of the commercial evolution of hydrogen fuel cell electric vehicles (FCEV). The workshop consisted of an introduction and three technical sessions: Risk Informed Development and Approach; CNG/LNG Applications; and Introduction of a Hydrogen Specific QRA Toolkit.

  14. Quantitative risk assessment of Cryptosporidium species infection in dairy calves.

    PubMed

    Nydam, D V; Mohammed, H O

    2005-11-01

    Cryptosporidium parvum is a zoonotic protozoan that infects many different mammals including cattle and humans. Cryptosporidiosis has become a concern for dairy producers because of the direct losses due to calves not performing well and the potential for environmental contamination with C. parvum. Identifying modifiable control points in the dynamics of infection in dairy herds will help identify management strategies that mitigate its risk. The quantitative risk assessment approach provides estimates of the risk associated with these factors so that cost-effective strategies can be implemented. Using published data from epidemiologic studies and a stochastic approach, we modeled the risk that C. parvum presents to dairy calves in 2 geographic areas: 1) the New York City Watershed (NYCW) in southeastern New York, and 2) the entire United States. The approach focused on 2 possible areas of exposure--the rearing environment and the maternity environment. In addition, we evaluated the contribution of many risk factors (e.g., age, housing, flies) to the end-state (i.e., total) risk to identify areas of intervention to decrease the risk to dairy calves. Expected risks from C. parvum in US dairy herds in rearing and maternity environments were 41.7 and 33.9%, respectively. In the NYCW, the expected risks from C. parvum in the rearing and maternity environments were 0.36 and 0.33%, respectively. In the US scenarios, the immediate environment contributed most of the risk to calves, whereas in the NYCW scenario, it was new calf infection. Therefore, within the NYCW, risk management activities may be focused on preventing new calf infections, whereas in the general US population, cleaning of calf housing would be a good choice for resource allocation. Despite the many assumptions inherent with modeling techniques, its usefulness to quantify the likelihood of risk and identify risk management areas is illustrated.

  15. Risk analysis of heat recovery steam generator with semi quantitative risk based inspection API 581

    NASA Astrophysics Data System (ADS)

    Prayogo, Galang Sandy; Haryadi, Gunawan Dwi; Ismail, Rifky; Kim, Seon Jin

    2016-04-01

    Corrosion is a major problem that most often occurs in power plants. The heat recovery steam generator (HRSG) is a piece of equipment that poses a high risk to the power plant. Corrosion damage can cause the HRSG power plant to stop operating; furthermore, it can threaten the safety of employees. The Risk Based Inspection (RBI) guidelines of the American Petroleum Institute (API) 581 were used for risk analysis of the HRSG 1. Using this methodology, the risk caused by unexpected failure can be estimated as a function of the probability and consequence of failure. This paper presents a case study of risk analysis in the HRSG, starting with a summary of the basic principles and procedures of risk assessment and applying corrosion RBI for process industries. The risk level of each piece of HRSG equipment was analyzed: the HP superheater has a medium-high risk (4C), the HP evaporator has a medium-high risk (4C), and the HP economizer has a medium risk (3C). The risk assessment using the semi-quantitative method of the API 581 standard places the existing equipment at medium risk; in fact, there is no critical problem in the equipment components. The prominent damage mechanism throughout the equipment is thinning. The evaluation of the risk approach was done with the aim of reducing risk by optimizing the risk assessment activities.
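Risk categories such as "4C" combine a probability category (1-5) and a consequence category (A-E) into a cell of a risk matrix. A sketch with a hypothetical cell-to-level mapping, chosen only so that it reproduces the levels quoted above; the actual API 581 matrix defines its own category boundaries:

```python
# Hypothetical 5x5 risk-matrix mapping in the spirit of a semi-quantitative
# RBI assessment; the real API 581 matrix defines its own boundaries.
RISK_LEVELS = {
    "low": {"1A", "1B", "1C", "2A", "2B", "3A"},
    "medium": {"1D", "2C", "2D", "3B", "3C", "4A"},
    "medium-high": {"1E", "2E", "3D", "4B", "4C", "5A", "5B"},
    "high": {"3E", "4D", "4E", "5C", "5D", "5E"},
}

def risk_level(probability_category, consequence_category):
    """Look up the qualitative risk level for a matrix cell like '4C'."""
    cell = f"{probability_category}{consequence_category}"
    for level, cells in RISK_LEVELS.items():
        if cell in cells:
            return level
    raise ValueError(f"unknown matrix cell: {cell}")
```

Under this illustrative mapping, the HP superheater's 4C cell lands in "medium-high" and the HP economizer's 3C cell in "medium", matching the abstract.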

  16. Remotely Sensed Quantitative Drought Risk Assessment in Vulnerable Agroecosystems

    NASA Astrophysics Data System (ADS)

    Dalezios, N. R.; Blanta, A.; Spyropoulos, N. V.

    2012-04-01

    Hazard may be defined as a potential threat to humans and their welfare, and risk (or consequence) as the probability of a hazard occurring and creating loss. Drought is considered one of the major natural hazards, with significant impact on agriculture, environment, economy and society. This paper deals with drought risk assessment, the step designed to find out what the problems are; it comprises three distinct components, namely risk identification, risk estimation and risk evaluation. Risk management is not covered in this paper, and there should be a fourth step to address the need for feedback and to take post-audits of all risk assessment exercises. In particular, quantitative drought risk assessment is attempted by using statistical methods. For the quantification of drought, the Reconnaissance Drought Index (RDI) is employed, which is a new index based on hydrometeorological parameters, such as precipitation and potential evapotranspiration. The remotely sensed estimation of RDI is based on NOAA-AVHRR satellite data for a period of 20 years (1981-2001). The study area is Thessaly, central Greece, which is a drought-prone agricultural region characterized by vulnerable agriculture. Specifically, the undertaken drought risk assessment processes are as follows: 1. Risk identification: This step involves drought quantification and monitoring based on remotely sensed RDI and extraction of several features, such as severity, duration, areal extent, onset and end time. Moreover, it involves a drought early warning system based on the above parameters. 2. Risk estimation: This step includes an analysis of drought severity, frequency and their relationships. 3. Risk evaluation: This step covers drought evaluation based on analysis of RDI images before and after each drought episode, which usually lasts one hydrological year (12 months). The results of these three-step drought assessment processes are considered quite satisfactory in a drought-prone region such as Thessaly in central
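The initial (alpha) form of the RDI mentioned above is the ratio of accumulated precipitation to accumulated potential evapotranspiration over the reference period; a minimal sketch (standardised forms of the RDI build on this ratio):

```python
def rdi_alpha(precipitation, potential_et):
    """Initial (alpha) form of the Reconnaissance Drought Index: total
    precipitation divided by total potential evapotranspiration over
    the reference period (e.g. a hydrological year). Values well below
    the local long-term mean indicate drought conditions."""
    return sum(precipitation) / sum(potential_et)
```

For example, a season with 150 mm of precipitation against 300 mm of potential evapotranspiration gives an alpha value of 0.5.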

  17. Using fire tests for quantitative risk analysis

    SciTech Connect

    Ling, W.C.T.; Williamson, R.B.

    1980-03-01

    Fires can be considered a causal chain-of-events in which the growth and spread of fire may cause damage and injury if it is rapid enough to overcome the barriers placed in its way. Fire tests for fire resistance of the barriers can be used in a quantitative risk assessment. The fire growth and spread is modelled in a State Transition Model (STM). The fire barriers are presented as part of the Fire Protection Model (FPM) which is based on a portion of the NFPA Decision Tree. An Emergency Equivalent Network is introduced to couple the Fire Growth Model (FGM) and the FPM so that the spread of fire beyond the room-of-origin can be computed. An example is presented in which a specific building floor plan is analyzed to obtain the shortest expected time for fire to spread between two points. To obtain the probability and time for each link in the network, data from the results of fire tests were used. These results were found to be lacking and new standards giving better data are advocated.

  18. [QUANTITATIVE DNA EVALUATION OF THE HIGH CARCINOGENIC RISK OF HUMAN PAPILLOMA VIRUSES AND HUMAN HERPES VIRUSES IN MALES WITH FERTILITY DISORDERS].

    PubMed

    Evdokimov, V V; Naumenko, V A; Tulenev, Yu A; Kurilo, L F; Kovalyk, V P; Sorokina, T M; Lebedeva, A L; Gomberg, M A; Kushch, A A

    2016-01-01

    Infertility is a pressing medical and social problem. In 50% of couples it is associated with the male factor, and in more than 50% of cases the etiology of the infertility remains insufficiently understood. The goal of this work was to study the prevalence and to perform quantitative analysis of the human herpes viruses (HHV) and high carcinogenic risk papilloma viruses (HR HPV) in males with infertility, as well as to assess the impact of these infections on sperm parameters. Ejaculate samples obtained from 196 males fell into 3 groups. Group 1 included men with infertility of unknown etiology (n = 112); group 2, patients whose female partners had a history of spontaneous abortion (n = 63); group 3 (control), healthy men (n = 21). HHV and HR HPV DNA in the ejaculates was detected in a total of 42/196 (21.4%) males: in 31 and 11 patients in groups 1 and 2, respectively (p > 0.05), and in none of the healthy males. HHV were detected in 24/42 and HR HPV in 18/42 males (p > 0.05), without significant difference between the groups. Among the HR HPV genotypes, those of clade A9 were more frequent in the ejaculate (14/18, p = 0.04). Comparative analysis of the sperm parameters showed that in the ejaculates of the infected patients sperm motility as well as the number of morphologically normal cells were significantly reduced compared with the healthy men. The quantification of the viral DNA revealed that in 31% of the male ejaculates the viral load was high: > 3 log10/100,000 cells. Conclusion: the detection of HHV and HR HPV in the ejaculate is associated with male infertility. Quantification of the viral DNA in the ejaculate is a useful indicator for monitoring viral infections in infertility and for the decision to start therapy. PMID:27451497

  20. A comparison of risk assessment techniques from qualitative to quantitative

    SciTech Connect

    Altenbach, T.J.

    1995-02-13

    Risk assessment techniques vary from purely qualitative approaches, through a regime of semi-qualitative methods, to the more traditional quantitative ones. Constraints such as time, money, manpower, skills, management perceptions, communication of risk results to the public, and political pressures all affect the manner in which risk assessments are carried out. This paper surveys some risk matrix techniques, examining the uses and applicability of each. Limitations and problems of each technique are presented and compared to the others. Risk matrix approaches vary from purely qualitative axis descriptions of accident frequency vs consequences, to fully quantitative axis definitions using multi-attribute utility theory to equate different types of risk from the same operation.

  1. A Quantitative Software Risk Assessment Model

    NASA Technical Reports Server (NTRS)

    Lee, Alice

    2002-01-01

    This slide presentation reviews a risk assessment model as applied to software development. The presentation uses graphs to demonstrate basic concepts of software reliability. It also discusses the application of the risk model to the software development life cycle.

  2. Quantitative Risk reduction estimation Tool For Control Systems, Suggested Approach and Research Needs

    SciTech Connect

    Miles McQueen; Wayne Boyer; Mark Flynn; Sam Alessi

    2006-03-01

    For the past year we have applied a variety of risk assessment technologies to evaluate the risk to critical infrastructure from cyber attacks on control systems. More recently, we identified the need for a stand-alone control system risk reduction estimation tool to provide owners and operators of control systems with a more useable, reliable, and credible method for managing the risks from cyber attack. Risk is defined as the probability of a successful attack times the value of the resulting loss, typically measured in lives and dollars. Qualitative and ad hoc techniques for measuring risk do not provide sufficient support for cost-benefit analyses associated with cyber security mitigation actions. To address the need for better quantitative risk reduction models, we surveyed previous quantitative risk assessment research; evaluated currently available tools; developed new quantitative techniques [17] [18]; implemented a prototype analysis tool to demonstrate how such a tool might be used; used the prototype to test a variety of underlying risk calculational engines (e.g. attack tree, attack graph); and identified technical and research needs. We concluded that significant gaps still exist and difficult research problems remain for quantitatively assessing the risk to control system components and networks, but that a useable quantitative risk reduction estimation tool is not beyond reach.
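The risk definition used above, probability of a successful attack times the value of the resulting loss, and the corresponding risk reduction from a mitigation can be sketched as follows (helper names are hypothetical, not from the cited tool):

```python
def cyber_risk(p_successful_attack, loss_value):
    """Risk = probability of a successful attack * value of the loss
    (e.g. in dollars), per the abstract's definition."""
    return p_successful_attack * loss_value

def risk_reduction(p_before, p_after, loss_value):
    """Expected risk reduction from a mitigation that lowers the
    probability of a successful attack; comparing this against the
    mitigation's cost supports the cost-benefit analysis mentioned."""
    return cyber_risk(p_before, loss_value) - cyber_risk(p_after, loss_value)
```

A mitigation halving the attack success probability against a $1M loss is worth buying only if it costs less than the resulting risk reduction.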

  3. Quantitative Risk Analysis of Obstacle Limitation Standards

    NASA Astrophysics Data System (ADS)

    Sandaradura, Amila Silva

    Obstacle limitation surfaces (OLS) are the main safeguard against objects that can pose a hazard to aircraft operations at and around airports. The standard dimensions of most of these surfaces were estimated using pilots' experience at the time when they were included in the standards documents. As a result, some of these standards may have been overestimated, while others may not provide an adequate level of safety. With airports moving to the Safety Management System (SMS) approach to design and operations safety, proper evaluation of the level of safety provided by OLS at specific sites becomes of great importance to airport operators. There is no published evidence, however, for the estimation of the safety level provided by the existing OLS standards. Moreover, the rationale used by ICAO to establish the existing OLS standards is not readily available in the standards documents. Therefore, this study attempts to collect actual flight path data using information provided by air traffic control radars and to construct a methodology to assess the probability of aircraft deviating from their intended/protected path. An extension of the developed methodology can be used to estimate the OLS dimensions that provide an acceptable safety level for aircraft operations. This will be helpful in estimating safe and efficient standard dimensions of the OLS and in assessing the risk level posed by objects to aircraft operations around airports. In order to assess the existing standards and demonstrate applications of the methodology, three case studies were conducted using aircraft data collected from Ottawa (CYOW), Calgary (CYYC) and Edmonton (CYEG) International Airports.

  4. Genetic determinants of quantitative traits associated with cardiovascular disease risk.

    PubMed

    Smolková, Božena; Bonassi, Stefano; Buociková, Verona; Dušinská, Mária; Horská, Alexandra; Kuba, Daniel; Džupinková, Zuzana; Rašlová, Katarína; Gašparovič, Juraj; Slíž, Ivan; Ceppi, Marcello; Vohnout, Branislav; Wsólová, Ladislava; Volkovová, Katarína

    2015-08-01

    Established risk factors for cardiovascular diseases (CVD) may be moderated by genetic variants. In 2403 unrelated individuals from general practice (mean age 40.5 years), we evaluated the influence of 15 variants in 12 candidate genes on quantitative traits (QT) associated with CVD (body mass index, abdominal obesity, glucose, serum lipids, and blood pressure). Prior to multiple testing correction, univariate analysis associated APOE rs429358, rs7412 and ATG16L1 rs2241880 variants with serum lipid levels, while LEPR rs1137100 and ATG16L1 rs2241880 variants were linked to obesity related QTs. After taking into account confounding factors and correcting for multiple comparisons only APOE rs429358 and rs7412 variants remained significantly associated with risk of dyslipidemia. APOE rs429358 variant almost tripled the risk in homozygous subjects (OR = 2.97; 95% CI 1.09-8.10, p < 0.03) and had a lesser but still highly significant association also in heterozygous individuals (OR = 1.67; 95% CI 1.24-2.10; p < 0.001). Associations with hypertension, diabetes mellitus, and metabolic syndrome were not significant after Bonferroni correction. The influence of genetic variation is more evident in dyslipidemia than in other analyzed QTs. These results may contribute to strategic research aimed at including genetic variation in the set of data required to identify subjects at high risk of CVD. PMID:26043189

  5. Quantitative Measures of Mineral Supply Risk

    NASA Astrophysics Data System (ADS)

    Long, K. R.

    2009-12-01

    Almost all metals and many non-metallic minerals are traded internationally. An advantage of global mineral markets is that minerals can be obtained from the globally lowest-cost source. For example, one rare-earth element (REE) mine in China, Bayan Obo, is able to supply most of world demand for rare earth elements at a cost significantly less than its main competitors. Concentration of global supplies at a single mine raises significant political risks, illustrated by China’s recent decision to prohibit the export of some REEs and severely limit the export of others. The expected loss of REE supplies will have a significant impact on the cost and production of important national defense technologies and on alternative energy programs. Hybrid vehicles and wind-turbine generators, for example, require REEs for magnets and batteries. Compact fluorescent light bulbs use REE-based phosphors. These recent events raise the general issue of how to measure the degree of supply risk for internationally sourced minerals. Two factors, concentration of supply and political risk, must first be addressed. Concentration of supply can be measured with standard economic tools for measuring industry concentration, using countries rather than firms as the unit of analysis. There are many measures of political risk available. That of the OECD is a measure of a country’s commitment to rule-of-law and enforcement of contracts, as well as political stability. Combining these measures provides a comparative view of mineral supply risk across commodities and identifies several minerals other than REEs that could suddenly become less available. Combined with an assessment of the impact of a reduction in supply, decision makers can use these measures to prioritize risk reduction efforts.
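
    The "standard economic tools for measuring industry concentration" mentioned above typically mean the Herfindahl-Hirschman Index (HHI) over country production shares; combining it with a per-country political-risk score gives a simple composite. The weighting scheme and all numbers below are illustrative assumptions, not the author's measure:

```python
def hhi(shares):
    """Herfindahl-Hirschman Index: sum of squared supply shares.
    Shares are fractions of world production, summing to ~1."""
    return sum(s * s for s in shares)

def supply_risk(shares_by_country, political_risk):
    """Toy composite: each country's squared share weighted by its
    political-risk score (0 = stable, 1 = unstable)."""
    return sum(s * s * political_risk[c] for c, s in shares_by_country.items())

# Hypothetical REE-like supply dominated by one producer
shares = {"A": 0.90, "B": 0.07, "C": 0.03}
risk = supply_risk(shares, {"A": 0.6, "B": 0.2, "C": 0.1})
```

    The single dominant producer drives both the concentration and the composite, which is the qualitative point of the abstract: concentrated supply in a higher-risk jurisdiction dominates the overall score.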

  6. CUMULATIVE RISK ASSESSMENT FOR QUANTITATIVE RESPONSE DATA

    EPA Science Inventory

    The Relative Potency Factor approach (RPF) is used to normalize and combine different toxic potencies among a group of chemicals selected for cumulative risk assessment. The RPF method assumes that the slopes of the dose-response functions are all equal; but this method depends o...

  7. QUANTITATIVE RISK ASSESSMENT FOR MICROBIAL AGENTS

    EPA Science Inventory

    Compared to chemical risk assessment, the process for microbial agents and infectious disease is more complex because of host factors and the variety of settings in which disease transmission can occur. While the National Academy of Science has established a paradigm for performi...

  8. Quantitative risk stratification of oral leukoplakia with exfoliative cytology.

    PubMed

    Liu, Yao; Li, Jianying; Liu, Xiaoyong; Liu, Xudong; Khawar, Waqaar; Zhang, Xinyan; Wang, Fan; Chen, Xiaoxin; Sun, Zheng

    2015-01-01

    Exfoliative cytology has been widely used for early diagnosis of oral squamous cell carcinoma (OSCC). Test outcomes are reported as "negative", "atypical" (defined as abnormal epithelial changes of uncertain diagnostic significance), and "positive" (defined as definitive cellular evidence of epithelial dysplasia or carcinoma). The major challenge is how to properly manage "atypical" patients in order to diagnose OSCC early and prevent its development. In this study, we collected exfoliative cytology, histopathology, and clinical data from normal subjects (n=102), oral leukoplakia (OLK) patients (n=82), and OSCC patients (n=93), and developed a data analysis procedure for quantitative risk stratification of OLK patients. This procedure involves a step called expert-guided data transformation and reconstruction (EdTAR), which allows automatic data processing and reconstruction and reveals informative signals for subsequent risk stratification. Modern machine learning techniques were utilized to build statistical prediction models on the reconstructed data. Among the several models tested using resampling methods for parameter pruning and performance evaluation, the Support Vector Machine (SVM) was found to be optimal, with a high sensitivity (median>0.98) and specificity (median>0.99). With the SVM model, we constructed an oral cancer risk index (OCRI) which may potentially guide clinical follow-up of OLK patients. One OLK patient with an initial OCRI of 0.88 developed OSCC after 40 months of follow-up. In conclusion, we have developed a statistical method for quantitative risk stratification of OLK patients. This method may potentially improve the cost-effectiveness of clinical follow-up of OLK patients and help design clinical chemoprevention trials for high-risk populations.
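
    The SVM model and EdTAR pipeline cannot be reproduced from the abstract alone, but the reported resampling-based performance metrics can be illustrated with a stdlib-only sketch: compute sensitivity and specificity, then take medians over bootstrap resamples. The labels and predictions below are entirely hypothetical:

```python
import random
from statistics import median

def sens_spec(y_true, y_pred):
    """Sensitivity and specificity from paired binary labels/predictions."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    return tp / (tp + fn), tn / (tn + fp)

def bootstrap_medians(y_true, y_pred, n_boot=200, seed=7):
    """Median sensitivity/specificity over bootstrap resamples."""
    rng = random.Random(seed)
    n = len(y_true)
    sens, spec = [], []
    for _ in range(n_boot):
        sample = [rng.randrange(n) for _ in range(n)]
        if len({y_true[i] for i in sample}) < 2:
            continue  # skip degenerate resamples missing a class
        se, sp = sens_spec([y_true[i] for i in sample],
                           [y_pred[i] for i in sample])
        sens.append(se)
        spec.append(sp)
    return median(sens), median(spec)

# Hypothetical labels (1 = dysplasia/carcinoma) and classifier outputs
y_true = [1] * 20 + [0] * 30
y_pred = [1] * 19 + [0] + [0] * 29 + [1]
se_med, sp_med = bootstrap_medians(y_true, y_pred)
```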

  9. Status and future of Quantitative Microbiological Risk Assessment in China

    PubMed Central

    Dong, Q.L.; Barker, G.C.; Gorris, L.G.M.; Tian, M.S.; Song, X.Y.; Malakar, P.K.

    2015-01-01

    Since the implementation of the Food Safety Law of the People's Republic of China in 2009 use of Quantitative Microbiological Risk Assessment (QMRA) has increased. QMRA is used to assess the risk posed to consumers by pathogenic bacteria which cause the majority of foodborne outbreaks in China. This review analyses the progress of QMRA research in China from 2000 to 2013 and discusses 3 possible improvements for the future. These improvements include planning and scoping to initiate QMRA, effectiveness of microbial risk assessment utility for risk management decision making, and application of QMRA to establish appropriate Food Safety Objectives. PMID:26089594

  10. The application of quantitative risk assessment to microbial food safety risks.

    PubMed

    Jaykus, L A

    1996-01-01

    Regulatory programs and guidelines for the control of foodborne microbial agents have existed in the U.S. for nearly 100 years. However, increased awareness of the scope and magnitude of foodborne disease, as well as the emergence of previously unrecognized human pathogens transmitted via the foodborne route, have prompted regulatory officials to consider new and improved strategies to reduce the health risks associated with pathogenic microorganisms in foods. Implementation of these proposed strategies will involve definitive costs for a finite level of risk reduction. While regulatory decisions regarding the management of foodborne disease risk have traditionally been made with the aid of the scientific community, a formal conceptual framework for the evaluation of health risks from pathogenic microorganisms in foods is warranted. Quantitative risk assessment (QRA), formally defined as the technical assessment of the nature and magnitude of a risk caused by a hazard, provides such a framework. Reproducing microorganisms in foods present a particular challenge to QRA because both their introduction and their numbers may be affected by numerous factors within the food chain, all of which represent significant stages in food production, handling, and consumption in a farm-to-table approach. The process of QRA entails four designated phases: (1) hazard identification, (2) exposure assessment, (3) dose-response assessment, and (4) risk characterization. Specific analytical tools are available to accomplish the analyses required for each phase of the QRA. The purpose of this paper is to provide a description of the conceptual framework for quantitative microbial risk assessment within the standard description provided by the National Academy of Sciences (NAS) paradigm. Each of the sequential steps in QRA is discussed in detail, providing information on current applications, tools for conducting the analyses, and methodological and/or data
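
    The four-phase QRA process described above can be caricatured in a few lines of Monte Carlo: draw a dose per serving (exposure assessment), apply an exponential dose-response model P = 1 - exp(-r * dose), and characterize annual risk over repeated servings. All parameter values below are invented for illustration and do not represent any real pathogen:

```python
import math
import random

def annual_risk(servings, mean_log10_dose, sd_log10_dose, r,
                n_sim=10000, seed=1):
    """Toy farm-to-table QRA: lognormal exposure per serving,
    exponential dose-response, annual risk over `servings`."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_sim):
        dose = 10 ** rng.gauss(mean_log10_dose, sd_log10_dose)  # CFU ingested
        total += 1.0 - math.exp(-r * dose)                      # dose-response
    p_serving = total / n_sim                                   # mean per-serving risk
    return 1.0 - (1.0 - p_serving) ** servings                  # risk characterization

risk = annual_risk(servings=50, mean_log10_dose=1.0, sd_log10_dose=0.8, r=1e-3)
```

    Even this toy shows the farm-to-table point of the abstract: the heavy upper tail of the exposure distribution, not the median dose, dominates the characterized risk.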

  11. Quantitative evaluation of gait ataxia by accelerometers.

    PubMed

    Shirai, Shinichi; Yabe, Ichiro; Matsushima, Masaaki; Ito, Yoichi M; Yoneyama, Mitsuru; Sasaki, Hidenao

    2015-11-15

    An appropriate biomarker for spinocerebellar degeneration (SCD) has not been identified. Here, we performed gait analysis on patients with pure cerebellar type SCD and assessed whether the obtained data could be used as a neurophysiological biomarker for cerebellar ataxia. We analyzed 25 SCD patients, 25 patients with Parkinson's disease as a disease control, and 25 healthy control individuals. Acceleration signals during 6 min of walking and 1 min of standing were measured by two sets of triaxial accelerometers secured with a fixation vest to the middle of the lower and upper back of each subject. We extracted two gait parameters, the average and the coefficient of variation of motion trajectory amplitude, from each acceleration component. Each component was then analyzed for correlation with the Scale for the Assessment and Rating of Ataxia (SARA) and the Berg Balance Scale (BBS). Based on comparison with the gait of healthy controls, and on correlation with severity and disease specificity, our results suggest that the average medial-lateral amplitude (upper back) during straight gait is a physiological biomarker for cerebellar ataxia. Our results suggest that gait analysis is a quantitative and concise evaluation scale for the severity of cerebellar ataxia.
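
    The two gait parameters extracted per acceleration component reduce to simple descriptive statistics; a stdlib sketch with hypothetical amplitude values:

```python
from statistics import mean, stdev

def amplitude_features(trajectory_amplitudes):
    """Average and coefficient of variation (CV) of motion-trajectory
    amplitude -- the two gait parameters extracted per acceleration axis."""
    avg = mean(trajectory_amplitudes)
    cv = stdev(trajectory_amplitudes) / avg
    return avg, cv

# Hypothetical medial-lateral amplitudes (cm) across successive gait cycles
amps = [2.1, 2.4, 1.9, 2.6, 2.2, 2.0, 2.5, 2.3]
avg, cv = amplitude_features(amps)
```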

  12. A Team Mental Model Perspective of Pre-Quantitative Risk

    NASA Technical Reports Server (NTRS)

    Cooper, Lynne P.

    2011-01-01

    This study was conducted to better understand how teams conceptualize risk before it can be quantified, and the processes by which a team forms a shared mental model of this pre-quantitative risk. Using an extreme case, this study analyzes seven months of team meeting transcripts, covering the entire lifetime of the team. Through an analysis of team discussions, a rich and varied structural model of risk emerges that goes significantly beyond classical representations of risk as the product of a negative consequence and a probability. In addition to those two fundamental components, the team conceptualization includes the ability to influence outcomes and probabilities, networks of goals, interaction effects, and qualitative judgments about the acceptability of risk, all affected by associated uncertainties. In moving from individual to team mental models, team members employ a number of strategies to gain group recognition of risks and to resolve or accept differences.

  13. CUMULATIVE RISK ASSESSMENT: GETTING FROM TOXICOLOGY TO QUANTITATIVE ANALYSIS

    EPA Science Inventory

    INTRODUCTION: GETTING FROM TOXICOLOGY TO QUANTITATIVE ANALYSIS FOR CUMULATIVE RISK

    Hugh A. Barton1 and Carey N. Pope2
    1US EPA, Office of Research and Development, National Health and Environmental Effects Research Laboratory, Research Triangle Park, NC
    2Department of...

  14. Quantitative risk assessment: an emerging tool for emerging foodborne pathogens.

    PubMed Central

    Lammerding, A. M.; Paoli, G. M.

    1997-01-01

    New challenges to the safety of the food supply require new strategies for evaluating and managing food safety risks. Changes in pathogens, food preparation, distribution, and consumption, and population immunity have the potential to adversely affect human health. Risk assessment offers a framework for predicting the impact of changes and trends on the provision of safe food. Risk assessment models facilitate the evaluation of active or passive changes in how foods are produced, processed, distributed, and consumed. PMID:9366601

  15. Quantitative risk assessment—a place in laser safety?

    NASA Astrophysics Data System (ADS)

    Gardner, R.; Smith, P. A.

    1995-02-01

    Since 1976 the United Kingdom Ministry of Defence (MOD) has used quantitative risk assessment (QRA) as a tool to manage the risks involved with airborne laser rangefinders and target designators. It has done this against the background of the Health and Safety at Work Act (1974) and the regulations made under this act. These apply equally to MOD and civilian employers. More recently there has been legislation requiring that all risks should be assessed. The MOD's use of QRA in laser safety is cited as an example of a useful tool, the use of which should not be precluded by further legislation.

  16. Quantitative risk assessment methods for cancer and noncancer effects.

    PubMed

    Baynes, Ronald E

    2012-01-01

    Human health risk assessments have evolved from the more qualitative approaches to more quantitative approaches in the past decade. This has been facilitated by the improvement in computer hardware and software capability and novel computational approaches being slowly recognized by regulatory agencies. These events have helped reduce the reliance on experimental animals as well as better utilization of published animal toxicology data in deriving quantitative toxicity indices that may be useful for risk management purposes. This chapter briefly describes some of the approaches as described in the guidance documents from several of the regulatory agencies as it pertains to hazard identification and dose-response assessment of a chemical. These approaches are contrasted with more novel computational approaches that provide a better grasp of the uncertainty often associated with chemical risk assessments.

  17. The role of quantitative safety evaluation in regulatory decision making of drugs.

    PubMed

    Chakravarty, Aloka G; Izem, Rima; Keeton, Stephine; Kim, Clara Y; Levenson, Mark S; Soukup, Mat

    2016-01-01

    Evaluation of safety is a critical component of drug review at the US Food and Drug Administration (FDA). Statisticians are playing an increasingly visible role in quantitative safety evaluation and regulatory decision-making. This article reviews the history and the recent events relating to quantitative drug safety evaluation at the FDA. The article then focuses on five active areas of quantitative drug safety evaluation and the role Division of Biometrics VII (DBVII) plays in these areas, namely meta-analysis for safety evaluation, large safety outcome trials, post-marketing requirements (PMRs), the Sentinel Initiative, and the evaluation of risk from extended/long-acting opioids. This article will focus chiefly on developments related to quantitative drug safety evaluation and not on the many additional developments in drug safety in general.

  18. Comparison study on qualitative and quantitative risk assessment methods for urban natural gas pipeline network.

    PubMed

    Han, Z Y; Weng, W G

    2011-05-15

    In this paper, qualitative and quantitative risk assessment methods for urban natural gas pipeline networks are proposed. The qualitative method comprises an index system that includes a causation index, an inherent risk index, a consequence index, and their corresponding weights. The quantitative method consists of a probability assessment, a consequence analysis, and a risk evaluation. The outcome of the qualitative method is a qualitative risk value, while the outcomes of the quantitative method are individual risk and social risk. In comparison with previous research, the qualitative method proposed in this paper is particularly suitable for urban natural gas pipeline networks, and the quantitative method takes different accident consequences into consideration, such as toxic gas diffusion, jet flame, fireball combustion and unconfined vapour cloud explosion (UVCE). Two sample urban natural gas pipeline networks are used to demonstrate the two methods. Both methods can be applied in practice; the choice between them depends on the basic data actually available for the gas pipelines and the precision requirements of the risk assessment.
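
    A qualitative risk value of the kind described is typically a weighted sum of the component indices. The paper's index definitions and weights are not given in the abstract, so the scores and weights below are invented placeholders:

```python
def qualitative_risk_value(indices, weights):
    """Weighted combination of causation, inherent-risk, and consequence
    indices (each scored here on a 0-10 scale), normalised by total weight."""
    total_w = sum(weights.values())
    return sum(indices[k] * weights[k] for k in indices) / total_w

indices = {"causation": 6.0, "inherent": 4.5, "consequence": 7.0}  # hypothetical
weights = {"causation": 0.3, "inherent": 0.3, "consequence": 0.4}  # hypothetical
qrv = qualitative_risk_value(indices, weights)
```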

  19. Molecular sensitivity threshold of wet mount and an immunochromatographic assay evaluated by quantitative real-time PCR for diagnosis of Trichomonas vaginalis infection in a low-risk population of childbearing women.

    PubMed

    Leli, Christian; Castronari, Roberto; Levorato, Lucia; Luciano, Eugenio; Pistoni, Eleonora; Perito, Stefano; Bozza, Silvia; Mencacci, Antonella

    2016-06-01

    Vaginal trichomoniasis is a sexually transmitted infection caused by Trichomonas vaginalis, a flagellated protozoan. Diagnosis of T. vaginalis infection is mainly performed by wet mount microscopy, with a sensitivity ranging from 38% to 82% compared to culture, which is still considered the gold standard. Commercial immunochromatographic tests for monoclonal-antibody-based detection have been introduced as alternative methods for diagnosis of T. vaginalis infection and have been reported in some studies to be more sensitive than wet mount. Real-time PCR methods have recently been developed, with optimal sensitivity and specificity. The aim of this study was to evaluate whether there is a molecular sensitivity threshold for both wet mount and immunochromatographic assays. To this aim, a total of 1487 low-risk childbearing women (median age 32 years, interquartile range 27-37) were included in the study and underwent vaginal swab for T. vaginalis detection by means of a quantitative real-time PCR assay, wet mount and an immunochromatographic test. Upon comparing the results, the prevalence values observed were 1.3% for real-time PCR, 0.5% for microscopic examination, and 0.8% for the immunochromatographic test. Compared to real-time PCR, wet mount sensitivity was 40% (95% confidence interval 19.1% to 63.9%) and specificity was 100% (95% CI 99.7% to 100%). The sensitivity and specificity of the immunochromatographic assay were 57.9% (95% CI 33.5% to 79.8%) and 99.9% (95% CI 99.6% to 100%), respectively. Evaluation of the wet mount and immunochromatographic assay results in relation to the number of T. vaginalis DNA copies detected in vaginal samples showed that the lower identification threshold for both wet mount (chi-square 6.1; P = 0.016) and the immunochromatographic assay (chi-square 10.7; P = 0.002) was ≥100 copies of T. vaginalis DNA/5 mcl of eluted DNA. PMID:27367320
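
    The sensitivity figures quoted above (e.g. 40%, 95% CI 19.1% to 63.9%) have the form of a binomial proportion with its confidence interval. A stdlib sketch using the simple Wald interval is shown below; the paper almost certainly used an exact or Wilson method, which yields slightly wider bounds, so the intervals will not match exactly:

```python
import math

def proportion_ci(k, n, z=1.96):
    """Point estimate and Wald 95% CI for a binomial proportion
    (e.g. sensitivity = detected positives / reference positives)."""
    p = k / n
    half = z * math.sqrt(p * (1 - p) / n)
    return p, max(0.0, p - half), min(1.0, p + half)

# 8 of 20 PCR-positives detected by wet mount is consistent with the
# reported 40% point estimate (counts inferred, not taken from the paper)
sens, lo, hi = proportion_ci(8, 20)
```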

  20. [Quantitative method of representative contaminants in groundwater pollution risk assessment].

    PubMed

    Wang, Jun-Jie; He, Jiang-Tao; Lu, Yan; Liu, Li-Ya; Zhang, Xiao-Liang

    2012-03-01

    To address the lack of an effective quantitative system for stress vulnerability assessment in groundwater pollution risk assessment, a new system was proposed based on representative contaminants and their corresponding emission quantities, derived from an analysis of groundwater pollution sources. A quantitative method for the representative contaminants in this system was established by analyzing the three properties of the representative contaminants and determining the research emphasis using the analytic hierarchy process. The method was applied to the assessment of groundwater pollution risk in Beijing. The results demonstrated that the hazards of the representative contaminants depended strongly on the research emphasis chosen, and that the ranking of the three representative contaminants' hazards differed from the rankings of their corresponding properties. This suggests that the subjective choice of research emphasis has a decisive impact on the calculated results. In addition, normalizing the three properties by rank order and unifying the quantified property results can amplify or attenuate the relative characteristics of the different representative contaminants.

  1. Quantitative, Qualitative and Geospatial Methods to Characterize HIV Risk Environments

    PubMed Central

    Conners, Erin E.; West, Brooke S.; Roth, Alexis M.; Meckel-Parker, Kristen G.; Kwan, Mei-Po; Magis-Rodriguez, Carlos; Staines-Orozco, Hugo; Clapp, John D.; Brouwer, Kimberly C.

    2016-01-01

    Increasingly, ‘place’, including physical and geographical characteristics as well as social meanings, is recognized as an important factor driving individual and community health risks. This is especially true among marginalized populations in low and middle income countries (LMIC), whose environments may also be more difficult to study using traditional methods. In the NIH-funded longitudinal study Mapa de Salud, we employed a novel approach to exploring the risk environment of female sex workers (FSWs) in two Mexico/U.S. border cities, Tijuana and Ciudad Juárez. In this paper we describe the development, implementation, and feasibility of a mix of quantitative and qualitative tools used to capture the HIV risk environments of FSWs in an LMIC setting. The methods were: 1) Participatory mapping; 2) Quantitative interviews; 3) Sex work venue field observation; 4) Time-location-activity diaries; 5) In-depth interviews about daily activity spaces. We found that the mixed-methodology outlined was both feasible to implement and acceptable to participants. These methods can generate geospatial data to assess the role of the environment on drug and sexual risk behaviors among high risk populations. Additionally, the adaptation of existing methods for marginalized populations in resource constrained contexts provides new opportunities for informing public health interventions. PMID:27191846

  3. Extending the quantitative assessment of industrial risks to earthquake effects.

    PubMed

    Campedel, Michela; Cozzani, Valerio; Garcia-Agreda, Anita; Salzano, Ernesto

    2008-10-01

    In the general framework of quantitative methods for natural-technological (NaTech) risk analysis, a specific methodology was developed for assessing risks caused by hazardous substances released due to earthquakes. The contribution of accidental scenarios initiated by seismic events to the overall industrial risk was assessed in three case studies derived from the actual plant layout of existing oil refineries. Several specific vulnerability models for different equipment classes were compared and assessed. The effect of differing structural resistances for process equipment on the final risk results was also investigated. The main factors influencing the final risk values were the models for equipment vulnerability and the assumptions for the reference damage states of the process equipment. The analysis of the case studies showed that in seismic zones the additional risk deriving from damage caused by earthquakes may be more than one order of magnitude higher than that associated with internal failure causes. Critical equipment was determined to be mainly pressurized tanks, even though atmospheric tanks were more vulnerable to loss of containment. Failure of minor process equipment having a limited hold-up of hazardous substances (such as pumps) was shown to have limited influence on the final values of the risk increase caused by earthquakes.

  4. Evaluating quantitative proton-density-mapping methods.

    PubMed

    Mezer, Aviv; Rokem, Ariel; Berman, Shai; Hastie, Trevor; Wandell, Brian A

    2016-10-01

    Quantitative magnetic resonance imaging (qMRI) aims to quantify tissue parameters by eliminating instrumental bias. We describe qMRI theory, simulations, and software designed to estimate proton density (PD), the apparent local concentration of water protons in the living human brain. First, we show that, in the absence of noise, multichannel coil data contain enough information to separate PD and coil sensitivity, a limiting instrumental bias. Second, we show that, in the presence of noise, regularization by a constraint on the relationship between T1 and PD produces accurate coil sensitivity and PD maps. The ability to measure PD quantitatively has applications in the analysis of in-vivo human brain tissue and enables multisite comparisons between individuals and across instruments. Hum Brain Mapp 37:3623-3635, 2016. © 2016 Wiley Periodicals, Inc.

  6. Quality-by-Design II: Application of Quantitative Risk Analysis to the Formulation of Ciprofloxacin Tablets.

    PubMed

    Claycamp, H Gregg; Kona, Ravikanth; Fahmy, Raafat; Hoag, Stephen W

    2016-04-01

    Qualitative risk assessment methods are often used as the first step to determining design space boundaries; however, quantitative assessments of risk with respect to the design space, i.e., calculating the probability of failure for a given severity, are needed to fully characterize design space boundaries. Quantitative risk assessment methods in design and operational spaces are a significant aid to evaluating proposed design space boundaries. The goal of this paper is to demonstrate a relatively simple strategy for design space definition using a simplified Bayesian Monte Carlo simulation. This paper builds on a previous paper that used failure mode and effects analysis (FMEA) qualitative risk assessment and Plackett-Burman design of experiments to identify the critical quality attributes. The results show that the sequential use of qualitative and quantitative risk assessments can focus the design of experiments on a reduced set of critical material and process parameters that determine a robust design space under conditions of limited laboratory experimentation. This approach provides a strategy by which the degree of risk associated with each known parameter can be calculated and allocates resources in a manner that manages risk to an acceptable level.
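
    The "probability of failure for a given severity" can be estimated with exactly the kind of simplified Monte Carlo the authors describe: sample process parameters from assumed distributions, push them through a response model, and count specification failures. The response surface, parameter distributions, and specification below are entirely hypothetical, not the paper's fitted model:

```python
import random

def probability_of_failure(n_sim=20000, seed=42):
    """Monte Carlo estimate of P(fail) for a toy tablet process: sample two
    hypothetical parameters and count how often a simulated dissolution
    response misses its specification."""
    rng = random.Random(seed)
    failures = 0
    for _ in range(n_sim):
        compression = rng.gauss(12.0, 1.0)   # kN, hypothetical distribution
        lubricant = rng.gauss(0.5, 0.05)     # % w/w, hypothetical distribution
        # Toy linear response surface: higher force/lubricant slow dissolution
        dissolution = 95.0 - 1.5 * (compression - 12.0) - 20.0 * (lubricant - 0.5)
        if dissolution < 90.0:               # spec: >= 90% released at 30 min
            failures += 1
    return failures / n_sim

p_fail = probability_of_failure()
```

    Repeating this calculation over a grid of candidate operating set-points, rather than a single one, is what turns the estimate into a probabilistic design space map.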

  7. Production Risk Evaluation Program (PREP) - summary

    SciTech Connect

    Kjeldgaard, E.A.; Saloio, J.H.; Vannoni, M.G.

    1997-03-01

    Nuclear weapons have been produced in the US since the early 1950s by a network of contractor-operated Department of Energy (DOE) facilities collectively known as the Nuclear Weapon Complex (NWC). Recognizing that the failure of an essential process might stop weapon production for a substantial period of time, the DOE Albuquerque Operations office initiated the Production Risk Evaluation Program (PREP) at Sandia National Laboratories (SNL) to assess quantitatively the potential for serious disruptions in the NWC weapon production process. PREP was conducted from 1984-89. This document is an unclassified summary of the effort.

  8. Quantitative damage evaluation of localized deep pitting

    SciTech Connect

    Al Beed, A.A.; Al Garni, M.A.

    2000-04-01

    Localized deep pitting is considered difficult to precisely measure and evaluate using simple techniques and daily-use analysis approaches. A case study was made of carbon steel heat exchangers in a typical fresh cooling water environment that experienced severe pitting. To effectively and precisely evaluate the encountered pitting damage, a simple measurement and analyses approach was devised. In this article, the pitting measurement technique and the damage evaluation approach are presented and discussed in detail.

  9. Foucault test: a quantitative evaluation method.

    PubMed

    Rodríguez, Gustavo; Villa, Jesús; Ivanov, Rumen; González, Efrén; Martínez, Geminiano

    2016-08-01

    Reliable and accurate testing methods are essential to guiding the polishing process during the figuring of optical telescope mirrors. With the natural advancement of technology, the procedures and instruments used to carry out this delicate task have consistently increased in sensitivity, but also in complexity and cost. Fortunately, throughout history, the Foucault knife-edge test has shown the potential to measure transverse aberrations on the order of the wavelength, mainly when described in terms of physical theory, which allows a quantitative interpretation of its characteristic shadowmaps. Our previous publication on this topic derived a closed mathematical formulation that directly relates the knife-edge position with the observed irradiance pattern. The present work addresses the largely unexplored problem of estimating the wavefront's gradient from experimental captures of the test, which is achieved by means of an optimization algorithm featuring a proposed ad hoc cost function. The partial derivatives thereby calculated are then integrated by means of a Fourier-based algorithm to retrieve the mirror's actual surface profile. To date and to the best of our knowledge, this is the first time that a complete mathematically grounded treatment of this optical phenomenon has been presented, complemented by an image-processing algorithm that allows a quantitative calculation of the corresponding slope at any given point of the mirror's surface, so that it becomes possible to accurately estimate the aberrations present in the analyzed concave device just through its associated foucaultgrams. PMID:27505659

  11. QUANTITATIVE EVALUATION OF FIRE SEPARATION AND BARRIERS

    SciTech Connect

    Coutts, D

    2007-04-17

    Fire barriers and physical separation are key components in managing the fire risk in Nuclear Facilities. The expected performance of these features has often been predicted using rules of thumb or expert judgment. These approaches often lack the convincing technical bases that exist when addressing other Nuclear Facility accident events. This paper presents science-based approaches to demonstrate the effectiveness of fire separation methods.

  12. A Program to Evaluate Quantitative Analysis Unknowns

    ERIC Educational Resources Information Center

    Potter, Larry; Brown, Bruce

    1978-01-01

    Reports on a computer batch program that will not only perform routine grading using several grading algorithms, but will also calculate various statistical measures by which the class performance can be evaluated and cumulative data collected. (Author/CP)

  13. Quantitative risk assessment modeling for nonhomogeneous urban road tunnels.

    PubMed

    Meng, Qiang; Qu, Xiaobo; Wang, Xinchang; Yuanita, Vivi; Wong, Siew Chee

    2011-03-01

    Urban road tunnels provide an increasingly cost-effective engineering solution, especially in compact cities like Singapore. For some urban road tunnels, characteristics such as tunnel configuration, geometry, provision of electrical and mechanical systems, and traffic volumes may vary from one section to another. Urban road tunnels with such nonuniform characteristics are referred to as nonhomogeneous urban road tunnels. In this study, a novel quantitative risk assessment (QRA) model is proposed for nonhomogeneous urban road tunnels, because existing QRA models for road tunnels are inapplicable to them. The model uses a tunnel segmentation principle whereby a nonhomogeneous urban road tunnel is divided into homogeneous sections. Individual risks for road tunnel sections, as well as integrated risk indices for the entire road tunnel, are defined. The article then proceeds to develop a new QRA model for each homogeneous section. Compared with existing QRA models for road tunnels, this section-based model incorporates one additional top event (toxic gases due to traffic congestion) and employs the Poisson regression method to estimate the vehicle accident frequencies of tunnel sections. The article further illustrates an aggregated QRA model for nonhomogeneous urban tunnels by integrating the section-based QRA models. Finally, a case study in Singapore is carried out.
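
A minimal sketch of the segmentation idea, not the authors' model: each homogeneous section contributes an expected accident frequency from a per-vehicle-kilometre rate (which the paper estimates by Poisson regression), and section values are summed into a tunnel-level index. All rates, traffic volumes, and lengths below are invented placeholders.

```python
import math

def section_frequency(rate_per_veh_km, aadt, length_km):
    """Expected annual accident count for one homogeneous section
    (accident rate per vehicle-km times annual vehicle-km)."""
    return rate_per_veh_km * aadt * 365 * length_km

# Illustrative sections: (accident rate per veh-km, AADT, length in km)
sections = [
    (1.0e-7, 60000, 0.8),   # straight, well-ventilated section
    (2.5e-7, 60000, 0.5),   # curved section with a gradient
]

# Tunnel-level expected frequency is the sum over homogeneous sections
total = sum(section_frequency(r, v, l) for r, v, l in sections)

# Under a Poisson model, probability of at least one accident in a year
p_at_least_one = 1 - math.exp(-total)
print(f"expected accidents/yr: {total:.3f}")
print(f"P(>=1 accident/yr): {p_at_least_one:.3f}")
```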

  14. [Reconstituting evaluation methods based on both qualitative and quantitative paradigms].

    PubMed

    Miyata, Hiroaki; Okubo, Suguru; Yoshie, Satoru; Kai, Ichiro

    2011-01-01

    Debate about the relationship between quantitative and qualitative paradigms is often muddled and confusing, and the clutter of terms and arguments has rendered the concepts obscure and unrecognizable. In this study we conducted a content analysis of evaluation methods for qualitative healthcare research. We extracted descriptions of four types of evaluation paradigm (validity/credibility, reliability/dependability, objectivity/confirmability, and generalizability/transferability) and classified them into subcategories. In quantitative research there have been many evaluation methods based on qualitative paradigms, and vice versa. Thus, it may not be useful to treat evaluation methods of the qualitative paradigm as isolated from those of quantitative methods. Choosing practical evaluation methods suited to the situation and prior conditions of each study is an important approach for researchers.

  15. Is there a place for quantitative risk assessment?

    PubMed

    Hall, Eric J

    2009-06-01

    The use of ionising radiations is so well established, especially in the practice of medicine, that it is impossible to imagine contemporary life without them. At the same time, ionising radiations are a known and proven human carcinogen. Exposure to radiation in some contexts elicits fear and alarm (nuclear power for example) while in other situations, until recently at least, it was accepted with alacrity (diagnostic x-rays for example). This non-uniform reaction to the potential hazards of radiation highlights the importance of quantitative risk estimates, which are necessary to help put things into perspective. Three areas will be discussed where quantitative risk estimates are needed and where uncertainties and limitations are a problem. First, the question of diagnostic x-rays. CT usage over the past quarter of a century has increased about 12 fold in the UK and more than 20 fold in the US. In both countries, more than 90% of the collective population dose from diagnostic x-rays comes from the few high dose procedures, such as interventional radiology, CT scans, lumbar spine x-rays and barium enemas. These all involve doses close to the lower limit at which there are credible epidemiological data for an excess cancer incidence. This is a critical question; what is the lowest dose at which there is good evidence of an elevated cancer incidence? Without low dose risk estimates the risk-benefit ratio of diagnostic procedures cannot be assessed. Second, the use of new techniques in radiation oncology. IMRT is widely used to obtain a more conformal dose distribution, particularly in children. It results in a larger total body dose, due to an increased number of monitor units and to the application of more radiation fields. The Linacs used today were not designed for IMRT and are based on leakage standards that were decided decades ago. 
It will be difficult and costly to reduce leakage from treatment machines, and a necessary first step is to refine the available

  17. Evaluation (not validation) of quantitative models.

    PubMed

    Oreskes, N

    1998-12-01

    The present regulatory climate has led to increasing demands for scientists to attest to the predictive reliability of numerical simulation models used to help set public policy, a process frequently referred to as model validation. But while model validation may reveal useful information, this paper argues that it is not possible to demonstrate the predictive reliability of any model of a complex natural system in advance of its actual use. All models embed uncertainties, and these uncertainties can and frequently do undermine predictive reliability. In the case of lead in the environment, we may categorize model uncertainties as theoretical, empirical, parametrical, and temporal. Theoretical uncertainties are aspects of the system that are not fully understood, such as the biokinetic pathways of lead metabolism. Empirical uncertainties are aspects of the system that are difficult (or impossible) to measure, such as actual lead ingestion by an individual child. Parametrical uncertainties arise when complexities in the system are simplified to provide manageable model input, such as representing longitudinal lead exposure by cross-sectional measurements. Temporal uncertainties arise from the assumption that systems are stable in time. A model may also be conceptually flawed. The Ptolemaic system of astronomy is a historical example of a model that was empirically adequate but based on a wrong conceptualization. Yet had it been computerized--and had the word then existed--its users would have had every right to call it validated. Thus, rather than talking about strategies for validation, we should be talking about means of evaluation. That is not to say that language alone will solve our problems or that the problems of model evaluation are primarily linguistic. The uncertainties inherent in large, complex models will not go away simply because we change the way we talk about them. But this is precisely the point: calling a model validated does not make it valid

  18. Evaluation (not validation) of quantitative models.

    PubMed Central

    Oreskes, N

    1998-01-01

    The present regulatory climate has led to increasing demands for scientists to attest to the predictive reliability of numerical simulation models used to help set public policy, a process frequently referred to as model validation. But while model validation may reveal useful information, this paper argues that it is not possible to demonstrate the predictive reliability of any model of a complex natural system in advance of its actual use. All models embed uncertainties, and these uncertainties can and frequently do undermine predictive reliability. In the case of lead in the environment, we may categorize model uncertainties as theoretical, empirical, parametrical, and temporal. Theoretical uncertainties are aspects of the system that are not fully understood, such as the biokinetic pathways of lead metabolism. Empirical uncertainties are aspects of the system that are difficult (or impossible) to measure, such as actual lead ingestion by an individual child. Parametrical uncertainties arise when complexities in the system are simplified to provide manageable model input, such as representing longitudinal lead exposure by cross-sectional measurements. Temporal uncertainties arise from the assumption that systems are stable in time. A model may also be conceptually flawed. The Ptolemaic system of astronomy is a historical example of a model that was empirically adequate but based on a wrong conceptualization. Yet had it been computerized--and had the word then existed--its users would have had every right to call it validated. Thus, rather than talking about strategies for validation, we should be talking about means of evaluation. That is not to say that language alone will solve our problems or that the problems of model evaluation are primarily linguistic. The uncertainties inherent in large, complex models will not go away simply because we change the way we talk about them. But this is precisely the point: calling a model validated does not make it valid

  20. Is there a place for quantitative risk assessment?

    PubMed Central

    Hall, Eric J

    2013-01-01

    The use of ionising radiations is so well established, especially in the practice of medicine, that it is impossible to imagine contemporary life without them. At the same time, ionising radiations are a known and proven human carcinogen. Exposure to radiation in some contexts elicits fear and alarm (nuclear power for example) while in other situations, until recently at least, it was accepted with alacrity (diagnostic x-rays for example). This non-uniform reaction to the potential hazards of radiation highlights the importance of quantitative risk estimates, which are necessary to help put things into perspective. Three areas will be discussed where quantitative risk estimates are needed and where uncertainties and limitations are a problem. First, the question of diagnostic x-rays. CT usage over the past quarter of a century has increased about 12 fold in the UK and more than 20 fold in the US. In both countries, more than 90% of the collective population dose from diagnostic x-rays comes from the few high dose procedures, such as interventional radiology, CT scans, lumbar spine x-rays and barium enemas. These all involve doses close to the lower limit at which there are credible epidemiological data for an excess cancer incidence. This is a critical question; what is the lowest dose at which there is good evidence of an elevated cancer incidence? Without low dose risk estimates the risk–benefit ratio of diagnostic procedures cannot be assessed. Second, the use of new techniques in radiation oncology. IMRT is widely used to obtain a more conformal dose distribution, particularly in children. It results in a larger total body dose, due to an increased number of monitor units and to the application of more radiation fields. The Linacs used today were not designed for IMRT and are based on leakage standards that were decided decades ago. 
It will be difficult and costly to reduce leakage from treatment machines, and a necessary first step is to refine the available

  1. A Quantitative Evaluation of Dissolved Oxygen Instrumentation

    NASA Technical Reports Server (NTRS)

    Pijanowski, Barbara S.

    1971-01-01

    The implications of the presence of dissolved oxygen in water are discussed in terms of its deleterious or beneficial effects, depending on the functional consequences to those affected, e.g., the industrialist, the oceanographer, and the ecologist. The paper is devoted primarily to an examination of the performance of five commercially available dissolved oxygen meters. The design of each is briefly reviewed and ease or difficulty of use in the field described. Specifically, the evaluation program treated a number of parameters and user considerations including an initial check and trial calibration for each instrument and a discussion of the measurement methodology employed. Detailed test results are given relating to the effects of primary power variation, water-flow sensitivity, response time, relative accuracy of dissolved-oxygen readout, temperature accuracy (for those instruments which included this feature), error and repeatability, stability, pressure and other environmental effects, and test results obtained in the field. Overall instrument performance is summarized comparatively by chart.

  2. Quantitative risk assessment of foods containing peanut advisory labeling.

    PubMed

    Remington, Benjamin C; Baumert, Joseph L; Marx, David B; Taylor, Steve L

    2013-12-01

    Foods with advisory labeling (i.e. "may contain") continue to be prevalent and the warning may be increasingly ignored by allergic consumers. We sought to determine the residual levels of peanut in various packaged foods bearing advisory labeling, compare similar data from 2005 and 2009, and determine any potential risk for peanut-allergic consumers. Of food products bearing advisory statements regarding peanut or products that had peanut listed as a minor ingredient, 8.6% and 37.5% contained detectable levels of peanut (>2.5 ppm whole peanut), respectively. Peanut-allergic individuals should be advised to avoid such products regardless of the wording of the advisory statement. Peanut was detected at similar rates and levels in products tested in both 2005 and 2009. Advisory labeled nutrition bars contained the highest levels of peanut and an additional market survey of 399 products was conducted. Probabilistic risk assessment showed the risk of a reaction to peanut-allergic consumers from advisory labeled nutrition bars was significant but brand-dependent. Peanut advisory labeling may be overused on some nutrition bars but prudently used on others. The probabilistic approach could provide the food industry with a quantitative method to assist with determining when advisory labeling is most appropriate.

  4. Quantitative evaluation of ocean thermal energy conversion (OTEC): executive briefing

    SciTech Connect

    Gritton, E.C.; Pei, R.Y.; Hess, R.W.

    1980-08-01

    Documentation is provided of a briefing summarizing the results of an independent quantitative evaluation of Ocean Thermal Energy Conversion (OTEC) for central station applications. The study concentrated on a central station power plant located in the Gulf of Mexico and delivering power to the mainland United States. The evaluation of OTEC is based on three important issues: resource availability, technical feasibility, and cost.

  5. Quantitative evaluation of cerebrospinal fluid shunt flow

    SciTech Connect

    Chervu, S.; Chervu, L.R.; Vallabhajosyula, B.; Milstein, D.M.; Shapiro, K.M.; Shulman, K.; Blaufox, M.D.

    1984-01-01

    The authors describe a rigorous method for measuring the flow of cerebrospinal fluid (CSF) in shunt circuits implanted for the relief of obstructive hydrocephalus. Clearance of radioactivity for several calibrated flow rates was determined with a Harvard infusion pump by injecting the Rickham reservoir of a Rickham-Holter valve system with 100 µCi of Tc-99m as pertechnetate. The elliptical and cylindrical Holter valves used as adjunct valves with the Rickham reservoir yielded two different regression lines when the clearances were plotted against flow rates. The experimental regression lines were used to determine the in vivo flow rates from clearances calculated after injecting the Rickham reservoirs of the patients. The unique clearance characteristics of the individual shunt systems available require that calibration curves be derived for an entire system identical to the one implanted in the patient being evaluated, rather than just for the injected chamber. Excellent correlation between flow rates and the clinical findings supports the reliability of this method of quantifying CSF shunt flow, and the results are fully accepted by neurosurgeons.
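
The calibration step described above amounts to fitting a regression line of clearance against known pump flow rates and then inverting it for a patient's measured clearance. A hedged ordinary-least-squares sketch; all numbers are invented for illustration, not the study's data.

```python
def fit_line(xs, ys):
    """Ordinary least-squares fit of y = a + b*x; returns (a, b)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return my - b * mx, b

# Invented calibration data: pump flow rates (mL/h) vs. measured clearance
flows = [5.0, 10.0, 20.0, 40.0]
clearances = [0.8, 1.5, 2.9, 5.7]

a, b = fit_line(flows, clearances)   # clearance = a + b * flow
patient_clearance = 2.2              # hypothetical in vivo measurement
est_flow = (patient_clearance - a) / b
print(f"estimated in vivo flow: {est_flow:.1f} mL/h")
```

As the abstract stresses, a separate calibration line would be needed for each valve configuration.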

  6. A quantitative method for silica flux evaluation

    NASA Astrophysics Data System (ADS)

    Schonewille, R. H.; O'Connell, G. J.; Toguri, J. M.

    1993-02-01

    In the smelting of copper and copper/nickel concentrates, the role of silica flux is to aid in the removal of iron by forming a slag phase. Alternatively, the role of flux may be regarded as a means of controlling the formation of magnetite, which can severely hinder the operation of a furnace. To adequately control the magnetite level, the flux must react rapidly with all of the FeO within the bath. In the present study, a rapid method for silica flux evaluation that can be used directly in the smelter has been developed. Samples of flux are mixed with iron sulfide and magnetite and then smelted at a temperature of 1250 °C. Argon was swept over the reaction mixture and analyzed continuously for sulfur dioxide. The sulfur dioxide concentration over time showed two peaks, the first being independent of the flux content of the sample. A flux quality parameter has been defined as the height-to-time ratio of the second peak. The value of this parameter for pure silica is 5100 ppm/min. The effects of silica content, silica particle size, and silicate mineralogy were investigated. It was found that a limiting flux quality is achieved for particle sizes less than 0.1 mm in diameter and that fluxes containing feldspar are generally of poorer quality. The relative importance of free silica and melting point was also studied using synthetic flux mixtures, with free silica displaying the strongest effect.

  7. Quantitative analysis of the fall-risk assessment test with wearable inertia sensors.

    PubMed

    Tamura, Toshiyo; Zakaria, Nor Aini; Kuwae, Yutaka; Sekine, Masaki; Minato, Kotaro; Yoshida, Masaki

    2013-01-01

    We performed a quantitative analysis of fall-risk assessment tests using a wearable inertia sensor, focusing on two tests: the timed up and go (TUG) test and the four square step test (FSST). These tests consist of various daily activities, such as sitting, standing, walking, stepping, and turning. The TUG test was performed by subjects at low and high fall risk, while the FSST was performed by healthy elderly subjects and by hemiplegic patients at high fall risk. In general, only the total performance time of the activities is evaluated; clinically, however, it is important to evaluate each activity for further training and management. The wearable sensor consisted of an accelerometer and an angular velocity sensor. The angular velocity and angle in the pitch direction were used for the TUG evaluation, and those in the pitch and yaw directions at the thigh were used for the FSST. Using a threshold on the angular velocity signal, we classified the phase corresponding to each activity. We then observed the characteristics of each activity and recommended suitable training and management. The wearable sensor can thus be used for more detailed evaluation in fall-risk management.
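
The threshold-based phase classification described above can be sketched as follows; the signal, threshold value, and phase labels are invented for illustration and do not reproduce the authors' processing.

```python
def segment_phases(signal, threshold):
    """Label each sample 'active' when |angular velocity| exceeds the
    threshold, 'still' otherwise, then merge consecutive runs into
    (label, start_index, end_index) phases."""
    labels = ['active' if abs(v) > threshold else 'still' for v in signal]
    phases, start = [], 0
    for i in range(1, len(labels) + 1):
        if i == len(labels) or labels[i] != labels[start]:
            phases.append((labels[start], start, i - 1))
            start = i
    return phases

# Synthetic angular-velocity trace (rad/s): still -> movement -> still
trace = [0.02, 0.03, 1.4, 1.6, 1.1, 0.9, 0.05, 0.04]
print(segment_phases(trace, threshold=0.5))
```

The duration of each phase (e.g. sit-to-stand within the TUG) then falls out of the start/end indices and the sampling rate.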

  8. 76 FR 77543 - Quantitative Summary of the Benefits and Risks of Prescription Drugs: A Literature Review

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-12-13

    ... HUMAN SERVICES Food and Drug Administration Quantitative Summary of the Benefits and Risks of... ``Quantitative Summary of the Benefits and Risks of Prescription Drugs: A Literature Review'' (literature review... FDA is announcing the availability of a draft report entitled ``Quantitative Summary of the...

  9. A Scalable Distribution Network Risk Evaluation Framework via Symbolic Dynamics

    PubMed Central

    Yuan, Kai; Liu, Jian; Liu, Kaipei; Tan, Tianyuan

    2015-01-01

    Background Evaluations of electric power distribution network risks must address the problems of incomplete information and changing dynamics. A risk evaluation framework should be adaptable to a specific situation and an evolving understanding of risk. Methods This study investigates the use of symbolic dynamics to abstract raw data. After introducing symbolic dynamics operators, Kolmogorov-Sinai entropy and Kullback-Leibler relative entropy are used to quantitatively evaluate relationships between risk sub-factors and main factors. For layered risk indicators, where the factors are categorized into four main factors – device, structure, load and special operation – a merging algorithm using operators to calculate the risk factors is discussed. Finally, an example from the Sanya Power Company is given to demonstrate the feasibility of the proposed method. Conclusion Distribution networks are exposed and can be affected by many things. The topology and the operating mode of a distribution network are dynamic, so the faults and their consequences are probabilistic. PMID:25789859
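
The Kullback-Leibler relative entropy used above to relate risk sub-factors to main factors can be illustrated with a minimal sketch; the symbol distributions below are invented placeholders, not the Sanya Power Company data.

```python
import math

def kl_divergence(p, q):
    """Kullback-Leibler relative entropy D(P||Q) in bits.
    Assumes q[s] > 0 wherever p[s] > 0."""
    return sum(p[s] * math.log2(p[s] / q[s]) for s in p if p[s] > 0)

# Invented symbol distributions for a risk sub-factor vs. a main factor
p = {'low': 0.7, 'mid': 0.2, 'high': 0.1}
q = {'low': 0.5, 'mid': 0.3, 'high': 0.2}

d = kl_divergence(p, q)
print(f"D(P||Q) = {d:.4f} bits")
```

A small divergence suggests the sub-factor's symbolized behavior tracks the main factor closely; zero means the distributions coincide.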

  10. Quantitative risk assessment for human salmonellosis through the consumption of pork sausage in Porto Alegre, Brazil.

    PubMed

    Mürmann, Lisandra; Corbellini, Luis Gustavo; Collor, Alexandre Ávila; Cardoso, Marisa

    2011-04-01

    A quantitative microbiology risk assessment was conducted to evaluate the risk of Salmonella infection to consumers of fresh pork sausages prepared at barbecues in Porto Alegre, Brazil. For the analysis, a prevalence of 24.4% positive pork sausages with a level of contamination between 0.03 and 460 CFU g(-1) was assumed. Data related to frequency and habits of consumption were obtained by a questionnaire survey given to 424 people. A second-order Monte Carlo simulation separating the uncertain parameter of cooking time from the variable parameters was run. Of the people interviewed, 87.5% consumed pork sausage, and 85.4% ate it at barbecues. The average risk of salmonellosis per barbecue at a minimum cooking time of 15.6 min (worst-case scenario) was 6.24 × 10(-4), and the risk assessed per month was 1.61 × 10(-3). Cooking for 19 min would fully inactivate Salmonella in 99.9% of the cases. At this cooking time, the sausage reached a mean internal temperature of 75.7°C. The results of the quantitative microbiology risk assessment revealed that the consumption of fresh pork sausage is safe when cooking time is approximately 19 min, whereas undercooked pork sausage may represent a nonnegligible health risk for consumers.
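
The second-order Monte Carlo structure described above separates uncertainty (outer loop, cooking time) from variability (inner loop, contamination per serving). The sketch below follows only that structure; the exponential dose-response model, D-value, and all distribution parameters are invented placeholders, not the study's fitted values.

```python
import math
import random
import statistics

random.seed(42)

def p_illness(dose_cfu, r=0.001):
    """Toy exponential dose-response model; r is a placeholder parameter."""
    return 1 - math.exp(-r * dose_cfu)

def surviving_dose(initial_cfu, cook_min, d_value=2.0):
    """Log-linear thermal inactivation: every d_value minutes of cooking
    reduces the dose tenfold (illustrative D-value)."""
    return initial_cfu * 10 ** (-cook_min / d_value)

outer_means = []
for _ in range(200):                       # outer loop: uncertainty
    cook_min = random.uniform(10, 19)      # uncertain cooking time (min)
    risks = []
    for _ in range(500):                   # inner loop: variability
        initial = random.lognormvariate(2.0, 1.0)  # CFU/serving, invented
        risks.append(p_illness(surviving_dose(initial, cook_min)))
    outer_means.append(statistics.mean(risks))

# Spread of the mean risk across the uncertainty dimension
print(f"median risk: {statistics.median(outer_means):.2e}")
print(f"95th pct risk: {sorted(outer_means)[189]:.2e}")
```

Reporting the spread of `outer_means`, rather than a single pooled estimate, is what distinguishes a second-order simulation from a one-dimensional one.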

  11. Software design for professional risk evaluation

    NASA Astrophysics Data System (ADS)

    Ionescu, V.; Calea, G.; Amza, G.; Iacobescu, G.; Nitoi, D.; Dimitrescu, A.

    2016-08-01

    Professional risk evaluation is a complex activity involving each economic operator, with important repercussions for health and safety at work. This article presents an innovative study method for professional risk analysis in which cumulative working posts are evaluated. The work presents new software that gathers all the working positions of a complex organizational system and analyzes them in order to evaluate the possible risks. Using this software, multiple analyses can be performed: risk estimation, risk evaluation, estimation of residual risks, and finally the search for risk-reduction measures.

  12. A Comprehensive Quantitative Assessment of Bird Extinction Risk in Brazil

    PubMed Central

    Machado, Nathália; Loyola, Rafael Dias

    2013-01-01

    In an effort to avoid species loss, scientists have focused on the mechanisms that make some species more prone to extinction than others. However, species show different responses to threats given their evolutionary history, behavior, and intrinsic biological features. We used bird biological features and external threats to (1) understand the multiple pathways driving Brazilian bird species to extinction, (2) investigate whether and how extinction risk is geographically structured, and (3) quantify how much diversity is currently represented inside protected areas. We modeled the extinction risk of 1557 birds using classification trees and evaluated the relative contribution of each biological feature and external threat in predicting extinction risk. We also quantified the proportion of species, and of their geographic ranges, currently protected by the network of Brazilian protected areas. The optimal classification tree showed different pathways to bird extinction. Habitat conversion was the most important predictor of extinction risk, though other variables, such as geographic range size, type of habitat, hunting or trapping, and trophic guild, were also relevant in our models. Species at higher extinction risk were concentrated mainly in the Cerrado Biodiversity Hotspot and were poorly represented inside protected areas, in terms of both richness and range. Predictive models could assist conservation actions, and this study contributes by highlighting the importance of natural history and ecology in these actions. PMID:23951302

  13. Quantitative risk assessment of thermophilic Campylobacter spp. and cross-contamination during handling of raw broiler chickens evaluating strategies at the producer level to reduce human campylobacteriosis in Sweden.

    PubMed

    Lindqvist, Roland; Lindblad, Mats

    2008-01-15

    Campylobacter is a major bacterial cause of infectious diarrheal illness in Sweden and in many other countries. Handling and consumption of chicken has been identified as important risk factors. The purpose of the present study was to use data from a national baseline study of thermophilic Campylobacter spp. in raw Swedish broiler chickens in order to evaluate some risk management strategies and the frequency of consumer mishandling, i.e., handling leading to possible cross-contamination. A probabilistic model describing variability but not uncertainty was developed in Excel and @Risk. The output of the model was the probability of illness per handling if the chicken was mishandled. Uncertainty was evaluated by performing repeated simulations and substituting model parameters, distributions and software (Analytica). The effect of uncertainty was within a factor of 3.2 compared to the baseline scenario. For Campylobacter spp. prevalence but not concentration, there was a one-to-one relation with risk. The effect of a 100-fold reduction in the levels of Campylobacter spp. on raw chicken reduced the risk by a factor of 12 (fresh chicken) to 30 (frozen chicken). Highly-contaminated carcasses contributed most to risk and it was estimated that by limiting the contamination to less than 4 log CFU per carcass, the risk would be reduced to less than 17% of the baseline scenario. Diverting all positive flocks to freezing was estimated to result in 43% as many cases as the baseline. The second best diversion option (54% of baseline cases) was to direct all chickens from the two worst groups of producers, in terms of percentages of positive flocks delivered, to freezing. The improvement of using diverting was estimated to correspond to between 5 to 767 fewer reported cases for the different strategies depending on the assumptions of the proportion of reported cases (1 to 50%) caused by Campylobacter spp. from Swedish chicken. 
The estimated proportion of consumer mishandlings
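
    The probabilistic structure described above (a prevalence of contaminated chickens, variable contamination levels, and a dose-response step) can be sketched as a small Monte Carlo simulation. Everything here is illustrative: the beta-Poisson parameters are values commonly cited for Campylobacter, and the prevalence, contamination, and transfer numbers are assumptions, not the study's inputs. The sketch also reproduces the abstract's observation that prevalence has a one-to-one relation with risk.

```python
import random

def p_infection(dose, alpha=0.145, beta=7.59):
    # Approximate beta-Poisson dose-response; parameters commonly cited
    # for Campylobacter jejuni (an assumption, not from this study).
    return 1.0 - (1.0 + dose / beta) ** (-alpha)

def risk_per_handling(prevalence, mean_log_cfu, transfer=1e-4,
                      n=100_000, seed=1):
    random.seed(seed)
    total = 0.0
    for _ in range(n):
        if random.random() >= prevalence:
            continue  # this chicken is not contaminated
        log_cfu = random.gauss(mean_log_cfu, 1.0)  # variability in level
        dose = 10 ** log_cfu * transfer            # CFU ingested via cross-contamination
        total += p_infection(dose)
    return total / n

base = risk_per_handling(prevalence=0.15, mean_log_cfu=4.0)
half = risk_per_handling(prevalence=0.075, mean_log_cfu=4.0)
# Halving prevalence roughly halves the risk (the one-to-one relation).
```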

  14. A methodology to quantitatively evaluate the safety of a glazing robot.

    PubMed

    Lee, Seungyeol; Yu, Seungnam; Choi, Junho; Han, Changsoo

    2011-03-01

    A new construction method using robots is spreading widely among construction sites in order to overcome labour shortages and frequent construction accidents. Along with economic efficiency, safety is a very important factor for evaluating the use of construction robots on construction sites. However, the quantitative evaluation of safety is more difficult than that of economic efficiency. In this study, we proposed a safety evaluation methodology based on two risk factors: the 'worker', characterized by posture load, and the 'work conditions', characterized by the work environment and the risk exposure time. The posture load evaluation reflects the risk of musculoskeletal disorders caused by work posture and the risk of accidents caused by reduced concentration. We evaluated the risk factors that may cause accidents such as falling, colliding, capsizing, and squeezing in work environments, and evaluated the operational risk by considering worker exposure time to risky work environments. From the evaluations of each factor, we calculated the overall operational risk and derived the improvement ratio in operational safety achieved by introducing a construction robot. To verify these results, we compared the safety of the existing manual construction method and the proposed robotic method for manipulating large glass panels.

  15. A probabilistic quantitative risk assessment model for the long-term work zone crashes.

    PubMed

    Meng, Qiang; Weng, Jinxian; Qu, Xiaobo

    2010-11-01

    Work zones, especially long-term work zones, increase traffic conflicts and cause safety problems. Proper casualty risk assessment for a work zone is important for both traffic safety engineers and travelers. This paper develops a novel probabilistic quantitative risk assessment (QRA) model to evaluate the casualty risk, combining the frequency and consequence of all accident scenarios triggered by long-term work zone crashes. The casualty risk is measured by individual risk and societal risk. The individual risk can be interpreted as the frequency of a driver/passenger being killed or injured, while the societal risk describes the relation between frequency and the number of casualties. The proposed probabilistic QRA model consists of an estimate of work zone crash frequency, an event tree and consequence estimation models. The event tree contains seven intermediate events: age (A), crash unit (CU), vehicle type (VT), alcohol (AL), light condition (LC), crash type (CT) and severity (S). Since the estimated probability of an intermediate event may carry large uncertainty, that uncertainty is characterized by a random variable. The consequence estimation model takes into account the combined effects of speed and emergency medical service response time (ERT) on the consequence of a work zone crash. Finally, a numerical example based on the Southeast Michigan work zone crash data is carried out. The numerical results show a 62% decrease in individual fatality risk and a 44% reduction in individual injury risk if the mean travel speed is slowed by 20%. In addition, there is a 5% reduction in individual fatality risk and a 0.05% reduction in individual injury risk if ERT is reduced by 20%. In other words, slowing down speed is more effective than reducing ERT in mitigating casualty risk.
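
    The event-tree part of such a model can be illustrated by propagating branch probabilities through a reduced tree. The branch values and crash frequency below are assumptions for illustration (the paper uses seven intermediate events and Michigan crash data); the point is only the mechanics: each scenario frequency is the crash frequency times the product of its branch probabilities.

```python
from itertools import product

# Reduced event tree with three of the seven intermediate events;
# branch probabilities and crash frequency are illustrative assumptions.
branches = {
    "crash_type": {"rear_end": 0.6, "other": 0.4},
    "light": {"day": 0.7, "night": 0.3},
    "severity": {"fatal": 0.01, "injury": 0.29, "pdo": 0.70},
}
crash_frequency = 12.0  # assumed crashes per year in the work zone

def scenario_frequencies():
    freqs = {}
    for combo in product(*(b.items() for b in branches.values())):
        names = tuple(name for name, _ in combo)
        p = 1.0
        for _, prob in combo:
            p *= prob  # scenario probability = product of branch probabilities
        freqs[names] = crash_frequency * p
    return freqs

freqs = scenario_frequencies()
fatality_freq = sum(f for names, f in freqs.items() if "fatal" in names)
```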

  16. Quantitative microbial risk assessment for Staphylococcus aureus in natural and processed cheese in Korea.

    PubMed

    Lee, Heeyoung; Kim, Kyunga; Choi, Kyoung-Hee; Yoon, Yohan

    2015-09-01

    This study quantitatively assessed the microbial risk of Staphylococcus aureus in cheese in Korea. The quantitative microbial risk assessment was carried out for natural and processed cheese from factory to consumption. Hazards for S. aureus in cheese were identified through the literature. For exposure assessment, the levels of S. aureus contamination in cheeses were evaluated, and the growth of S. aureus was predicted by predictive models at the surveyed temperatures and at the times of cheese processing and distribution. For hazard characterization, a dose-response model for S. aureus was identified and used to estimate the risk of illness. With these data, simulation models were prepared with @RISK (Palisade Corp., Ithaca, NY) to estimate the risk of illness per person per day in risk characterization. Staphylococcus aureus cell counts on cheese samples from factories and markets were below detection limits (0.30-0.45 log CFU/g), and a PERT distribution showed that the mean temperature at markets was 6.63°C. An exponential model [P = 1 - exp(-7.64 × 10^-8 × N), where N = dose] was deemed appropriate for hazard characterization. The mean temperature of home storage was 4.02°C (log-logistic distribution). The results of risk characterization for S. aureus in natural and processed cheese showed that the mean probability of illness per person per day was higher for processed cheese (mean: 2.24 × 10^-9; maximum: 7.97 × 10^-6) than for natural cheese (mean: 7.84 × 10^-10; maximum: 2.32 × 10^-6). These results indicate that the risk of S. aureus-related foodborne illness due to cheese consumption can be considered low under the present conditions in Korea. In addition, the stochastic risk assessment model developed in this study can be useful in establishing microbial criteria for S. aureus in cheese.
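
    The exponential dose-response model quoted in the abstract (P = 1 - exp(-r × N) with r = 7.64 × 10^-8) is straightforward to apply; the storage-growth and serving-size numbers below are illustrative assumptions, not the study's distributions.

```python
import math

def p_illness(dose, r=7.64e-8):
    # Exponential dose-response from the abstract: P = 1 - exp(-r * N)
    return 1.0 - math.exp(-r * dose)

# Illustrative serving-level dose (assumed values, not the study's):
initial_cfu_per_g = 0.3   # near the reported detection limit
growth_log10 = 1.0        # assumed 1-log growth during distribution/storage
serving_g = 30.0
dose = initial_cfu_per_g * 10 ** growth_log10 * serving_g
risk = p_illness(dose)    # risk per serving at this dose
```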

  17. Application of quantitative uncertainty analysis for human health risk assessment at Rocky Flats

    SciTech Connect

    Duncan, F.L.W.; Gordon, J.W.; Smith, D.; Singh, S.P.

    1993-01-01

    The characterization of uncertainty is an important component of the risk assessment process. According to the U.S. Environmental Protection Agency's (EPA's) "Guidance on Risk Characterization for Risk Managers and Risk Assessors," point estimates of risk "do not fully convey the range of information considered and used in developing the assessment." Furthermore, the guidance states that Monte Carlo simulation may be used to estimate descriptive risk percentiles. To provide information about the uncertainties associated with the reasonable maximum exposure (RME) estimate and the relation of the RME to other percentiles of the risk distribution for Operable Unit 1 (OU-1) at Rocky Flats, uncertainties were identified and quantitatively evaluated. Monte Carlo simulation is a technique that can be used to produce a probability function of estimated risk using random values of exposure factors and toxicity values in an exposure scenario. It involves assigning a joint probability distribution to the input variables (i.e., exposure factors) of an exposure scenario. Next, a large number of independent samples from the assigned joint distribution are taken and the corresponding outputs calculated. Methods of statistical inference are used to estimate, from the output sample, parameters of the output distribution such as percentiles and the expected value.
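
    The procedure described (sample exposure factors from a joint distribution, compute risk for each sample, read percentiles off the output sample) can be sketched as follows; the distributions and slope factor are placeholders, not OU-1 values.

```python
import random

random.seed(42)

def sample_risk():
    # Illustrative exposure-factor distributions (placeholders, not OU-1 data)
    intake = random.lognormvariate(0.0, 0.5)         # ingestion rate, relative units
    duration = random.uniform(1.0, 30.0)             # exposure duration, years
    concentration = random.lognormvariate(1.0, 1.0)  # contaminant concentration
    slope_factor = 1e-6                              # assumed toxicity value
    return intake * duration * concentration * slope_factor

# Draw the output sample and read off descriptive percentiles.
risks = sorted(sample_risk() for _ in range(50_000))

def percentile(sorted_data, q):
    return sorted_data[int(q / 100 * (len(sorted_data) - 1))]

median = percentile(risks, 50)
p95 = percentile(risks, 95)  # a percentile the RME estimate can be compared against
```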

  18. Health risks in wastewater irrigation: comparing estimates from quantitative microbial risk analyses and epidemiological studies.

    PubMed

    Mara, D D; Sleigh, P A; Blumenthal, U J; Carr, R M

    2007-03-01

    The combination of standard quantitative microbial risk analysis (QMRA) techniques and 10,000-trial Monte Carlo risk simulations was used to estimate the human health risks associated with the use of wastewater for unrestricted and restricted crop irrigation. A risk of rotavirus infection of 10^-2 per person per year (pppy) was used as the reference level of acceptable risk. Using the model scenario of involuntary soil ingestion for restricted irrigation, the risk of rotavirus infection is approximately 10^-2 pppy when the wastewater contains ≤10^6 Escherichia coli per 100 ml and when local agricultural practices are highly mechanised. For labour-intensive agriculture the risk of rotavirus infection is approximately 10^-2 pppy when the wastewater contains ≤10^5 E. coli per 100 ml; however, the wastewater quality should be ≤10^4 E. coli per 100 ml when children under 15 are exposed. With the model scenario of lettuce consumption for unrestricted irrigation, the use of wastewaters containing ≤10^4 E. coli per 100 ml results in a rotavirus infection risk of approximately 10^-2 pppy; however, based on epidemiological evidence from Mexico, the current WHO guideline level of ≤1,000 E. coli per 100 ml should be retained for root crops eaten raw. PMID:17402278

  19. Quantitative Measures for Evaluation of Ultrasound Therapies of the Prostate

    NASA Astrophysics Data System (ADS)

    Kobelevskiy, Ilya; Burtnyk, Mathieu; Bronskill, Michael; Chopra, Rajiv

    2010-03-01

    Development of non-invasive techniques for prostate cancer treatment requires quantitative measures for evaluating treatment results. In this paper, we introduce measures that estimate spatial targeting accuracy and potential thermal damage to the structures surrounding the prostate. The measures were developed for the technique of treating prostate cancer with transurethral ultrasound heating applicators guided by active MR temperature feedback. Variations of ultrasound element length and related MR imaging parameters, such as MR slice thickness and update time, were investigated by performing numerical simulations of the treatment on a database of ten patient prostate geometries segmented from clinical MR images. The susceptibility of each parameter configuration to uncertainty in MR temperature measurements was studied by adding noise to the temperature measurements. Gaussian noise with zero mean and standard deviation of 0, 1, 3 and 5°C was used to model different levels of uncertainty in MR temperature measurements. Results of simulations for each parameter configuration were averaged over the database of the ten prostate patient geometries studied. The results show that, for an update time of 5 seconds, both 3- and 5-mm elements achieve appropriate performance for temperature uncertainty up to 3°C, while a temperature uncertainty of 5°C leads to a noticeable reduction in spatial accuracy and an increased risk of damaging the rectal wall. Ten-mm elements lacked spatial accuracy and had a higher risk of damaging the rectal wall compared to 3- and 5-mm elements, but were less sensitive to the level of temperature uncertainty. The effect of changing update time was studied for 5-mm elements. Simulations showed that update time had minor effects on all aspects of treatment for temperature uncertainties of 0°C and 1°C, while temperature uncertainties of 3°C and 5°C led to reduced spatial accuracy, increased potential damage to the rectal wall, and

  20. Quantitative breast MRI radiomics for cancer risk assessment and the monitoring of high-risk populations

    NASA Astrophysics Data System (ADS)

    Mendel, Kayla R.; Li, Hui; Giger, Maryellen L.

    2016-03-01

    Breast density is routinely assessed qualitatively in screening mammography. However, it is challenging to quantitatively determine a 3D density from a 2D image such as a mammogram. Furthermore, dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI) is used increasingly in the screening of high-risk populations. The purpose of our study is to segment parenchyma and quantitatively determine volumetric breast density on pre-contrast (i.e., non-contrast) axial DCE-MRI images using a semi-automated quantitative approach. In this study, we retrospectively examined 3D DCE-MRI images taken for breast cancer screening of a high-risk population. We analyzed 66 cases with ages between 28 and 76 (mean 48.8, standard deviation 10.8). DCE-MRIs were obtained on a Philips 3.0 T scanner. Our semi-automated DCE-MRI algorithm includes: (a) segmentation of breast tissue from non-breast tissue using fuzzy c-means clustering, (b) separation of dense and fatty tissues using Otsu's method, and (c) calculation of volumetric density as the ratio of dense voxels to total breast voxels. We examined the relationship between pre-contrast DCE-MRI density and clinical BI-RADS density obtained from radiology reports, and obtained a statistically significant correlation [Spearman ρ of 0.66 (p < 0.0001)]. Within precision medicine, our method may be useful for monitoring high-risk populations.
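
    Steps (b) and (c) of the pipeline (Otsu thresholding, then the dense-to-total voxel ratio) can be sketched on a toy intensity list; a real implementation would operate on the segmented 3D voxel array, and this plain-Python Otsu search is only a stand-in for a library routine.

```python
def otsu_threshold(values, bins=64):
    # Search for the threshold that maximizes between-class variance.
    lo, hi = min(values), max(values)
    best_t, best_var = lo, -1.0
    for i in range(1, bins):
        t = lo + (hi - lo) * i / bins
        fg = [v for v in values if v >= t]  # "dense" class
        bg = [v for v in values if v < t]   # "fatty" class
        if not fg or not bg:
            continue
        w_fg, w_bg = len(fg) / len(values), len(bg) / len(values)
        mu_fg, mu_bg = sum(fg) / len(fg), sum(bg) / len(bg)
        var_between = w_fg * w_bg * (mu_fg - mu_bg) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t

# Toy voxel intensities: fatty tissue near 0.2, dense tissue near 0.8.
voxels = [0.18, 0.22, 0.20, 0.21, 0.19, 0.79, 0.81, 0.80, 0.78]
t = otsu_threshold(voxels)
density = sum(v >= t for v in voxels) / len(voxels)  # dense / total voxels
```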

  1. Quantitative vs. Qualitative Approaches to Quality Special Education Program Evaluation.

    ERIC Educational Resources Information Center

    Council of Administrators of Special Education, Inc.

    One in a series of issue papers commissioned by the Council of Administrators of Special Education (CASE), this document presents a comparison of contemporary evaluation approaches for special education programs. The first section describes the two approaches to be compared: (1) traditional scientific inquiry which emphasizes quantitative methods;…

  2. Dermal sensitization quantitative risk assessment (QRA) for fragrance ingredients.

    PubMed

    Api, Anne Marie; Basketter, David A; Cadby, Peter A; Cano, Marie-France; Ellis, Graham; Gerberick, G Frank; Griem, Peter; McNamee, Pauline M; Ryan, Cindy A; Safford, Robert

    2008-10-01

    Based on chemical, cellular, and molecular understanding of dermal sensitization, an exposure-based quantitative risk assessment (QRA) can be conducted to determine safe use levels of fragrance ingredients in different consumer product types. The key steps are: (1) determination of benchmarks (no expected sensitization induction level (NESIL)); (2) application of sensitization assessment factors (SAF); and (3) calculation of the consumer exposure level (CEL) through product use. Using these parameters, an acceptable exposure level (AEL) can be calculated and compared with the CEL. The ratio of AEL to CEL must be favorable to support safe use of the potential skin sensitizer, and must be calculated for the fragrance ingredient in each product type. Based on the Research Institute for Fragrance Materials, Inc. (RIFM) Expert Panel's recommendation, RIFM and the International Fragrance Association (IFRA) have adopted the dermal sensitization QRA approach described in this review for fragrance ingredients identified as potential dermal sensitizers. This now forms the fragrance industry's core strategy for primary prevention of dermal sensitization to these materials in consumer products. The methodology is used to determine global fragrance industry product management practices (IFRA Standards) for fragrance ingredients that are potential dermal sensitizers. This paper describes the principles of the recommended approach, provides a detailed review of all the information used in the dermal sensitization QRA approach for fragrance ingredients, and presents key conclusions for its use now and refinement in the future.
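
    The core arithmetic of the approach (AEL = NESIL / SAF, then comparison with the CEL) is simple to express; the numbers below are hypothetical, not RIFM/IFRA values.

```python
def acceptable_exposure_level(nesil, saf):
    # AEL = NESIL / SAF
    return nesil / saf

# Hypothetical values (not RIFM/IFRA data):
nesil = 250.0  # no-expected-sensitization-induction level, µg/cm²
saf = 100.0    # combined sensitization assessment factors
cel = 1.5      # consumer exposure level for this product type, µg/cm²

ael = acceptable_exposure_level(nesil, saf)
safe_use_supported = ael >= cel  # favorable AEL-to-CEL ratio
```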

  3. Quantitative cancer risk assessment for dioxins using an occupational cohort.

    PubMed Central

    Becher, H; Steindorf, K; Flesch-Janys, D

    1998-01-01

    We consider a cohort of 1189 male German factory workers (production period 1952-1984) who produced phenoxy herbicides and were exposed to dioxins. Follow-up until the end of 1992 yielded a significantly increased standardized mortality ratio (SMR) for total cancer (SMR 141; 95% confidence interval 117-168). 2,3,7,8-Tetrachlorodibenzo-p-dioxin (TCDD) concentrations up to 2252 ng/kg body fat were measured in 275 cohort members. Other higher chlorinated dioxins and furans also occurred in high concentrations. For quantitative analysis, the integrated TCDD concentration over time was used as an exposure variable, which was calculated using results from half-life estimation for TCDD and workplace history data. The other congeners were expressed as toxic equivalency (TEQ) and compared to TCDD using international toxic equivalency factors. Poisson and Cox regressions were used to investigate dose-response relationships. Various covariables (e.g., exposure to beta-hexachlorocyclohexane, employment characteristics) were considered. In all analyses, TCDD and TEQ exposures were related to total cancer mortality. The power model yielded a relative risk (RR) function RR(x) = (1 + 0.17x)^0.326 for TCDD (in µg/kg blood fat × years)--only a slightly better fit than a linear RR function--and RR(x) = (1 + 0.023x)^0.795 for TEQ. Investigations on latency did not show strong effects. Different methods were applied to investigate the robustness of the results and yielded almost identical results. The results were used for unit risk estimation. Taking into account different sources of variation, an interval of 10^-3 to 10^-2 for the additional lifetime cancer risk under a daily intake of 1 pg TCDD/kg body weight/day was estimated from the dose-response models considered. Uncertainties regarding the dose-response function remain. These data did not indicate the existence of a threshold value; however, such a value cannot be excluded with any certainty. PMID:9599714
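
    The fitted power models can be evaluated directly. The functions below use the coefficients quoted in the abstract, with x the integrated exposure in µg/kg blood fat × years; the exposure value of 10 is an arbitrary illustration.

```python
def rr_tcdd(x):
    # RR(x) = (1 + 0.17x)^0.326, x in µg/kg blood fat × years
    return (1.0 + 0.17 * x) ** 0.326

def rr_teq(x):
    # RR(x) = (1 + 0.023x)^0.795 for total TEQ
    return (1.0 + 0.023 * x) ** 0.795

rr_at_10 = rr_tcdd(10.0)  # relative risk at an integrated exposure of 10
```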

  4. A quantitative method for evaluating alternatives. [aid to decision making

    NASA Technical Reports Server (NTRS)

    Forthofer, M. J.

    1981-01-01

    When faced with choosing between alternatives, people tend to use a number of criteria (often subjective, rather than objective) to decide which is the best alternative for them given their unique situation. The subjectivity inherent in the decision-making process can be reduced by the definition and use of a quantitative method for evaluating alternatives. This type of method can help decision makers achieve a degree of uniformity and completeness in the evaluation process, as well as an increased sensitivity to the factors involved. Additional benefits are better documentation and visibility of the rationale behind the resulting decisions. General guidelines for defining a quantitative method are presented, and a particular method (called 'hierarchical weighted average') is defined and applied to the evaluation of design alternatives for a hypothetical computer system capability.
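
    A hierarchical weighted average can be implemented as a small recursion over a criteria tree; the hierarchy, weights, and scores below are hypothetical, chosen only to show the mechanics.

```python
def weighted_score(node, leaf_scores):
    """Hierarchical weighted average over a criteria tree.

    node is either a leaf criterion name (str) or a list of
    (weight, subnode) pairs whose weights sum to 1.
    """
    if isinstance(node, str):
        return leaf_scores[node]
    return sum(w * weighted_score(sub, leaf_scores) for w, sub in node)

# Hypothetical two-level hierarchy for comparing design alternatives.
hierarchy = [
    (0.6, [(0.5, "performance"), (0.5, "reliability")]),
    (0.4, "cost"),
]

alt_a = weighted_score(hierarchy, {"performance": 8, "reliability": 6, "cost": 9})
alt_b = weighted_score(hierarchy, {"performance": 9, "reliability": 7, "cost": 5})
```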

  5. 78 FR 9701 - Draft Joint Food and Drug Administration/Health Canada Quantitative Assessment of the Risk of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-02-11

    ... purpose of the draft QRA is to evaluate the effect of factors such as the microbiological status of milk... milk. II. Quantitative Risk Assessment The draft QRA (Refs. 3 to 6) provides a science-based analytical... made from raw milk, in its reevaluation of the existing 60-day aging requirements for cheeses made...

  6. Quantitative microbial risk assessment combined with hydrodynamic modelling to estimate the public health risk associated with bathing after rainfall events.

    PubMed

    Eregno, Fasil Ejigu; Tryland, Ingun; Tjomsland, Torulv; Myrmel, Mette; Robertson, Lucy; Heistad, Arve

    2016-04-01

    This study investigated the public health risk from exposure to infectious microorganisms at Sandvika recreational beaches, Norway, by combining hydrodynamic modelling with quantitative microbial risk assessment (QMRA) based on dose-response relationships. Meteorological and hydrological data were collected to produce a calibrated hydrodynamic model using Escherichia coli as an indicator of faecal contamination. Based on average concentrations of reference pathogens (norovirus, Campylobacter, Salmonella, Giardia and Cryptosporidium) relative to E. coli in Norwegian sewage from previous studies, the hydrodynamic model was used to simulate the concentrations of pathogens at the local beaches during and after a heavy rainfall event, using three different decay rates. The simulated concentrations were used as input for QMRA, and the public health risk was estimated as the probability of infection from a single exposure of bathers during the three consecutive days after the rainfall event. The level of risk on the first day after the rainfall event was acceptable for the bacterial and parasitic reference pathogens, but high for the viral reference pathogen at all beaches, and severe at the Kalvøya-small and Kalvøya-big beaches, supporting the advice to avoid swimming in the day(s) after heavy rainfall. The study demonstrates the potential of combining discharge-based hydrodynamic modelling with QMRA in the context of bathing water as a tool to evaluate public health risk and support beach management decisions. PMID:26802355

  7. Quantitative ultrasound criteria for risk stratification in clinical practice: a comparative assessment.

    PubMed

    Noale, Marianna; Maggi, Stefania; Gonnelli, Stefano; Limongi, Federica; Zanoni, Silvia; Zambon, Sabina; Rozzini, Renzo; Crepaldi, Gaetano

    2012-07-01

    This study aimed to compare two different classifications of the risk of fracture/osteoporosis (OP) based on quantitative ultrasound (QUS). Analyses were based on data from the Epidemiological Study on the Prevalence of Osteoporosis, a cross-sectional study conducted in 2000 to assess the risk of OP in a representative sample of the Italian population. Subjects were classified into 5 groups according to the cross-classification found in previous studies; logistic regression models were fitted separately for women and men to study the fracture risk attributable to the groups defined by the cross-classification, adjusting for traditional risk factors. A total of 8681 subjects were considered in the analyses. The logistic regression models revealed that the two classifications identify a common core of individuals at low and at high risk of fractures, and highlighted the importance of a multidimensional assessment in older patients that evaluates clinical risk factors together with a simple, inexpensive, radiation-free device such as QUS.

  8. Quantitatively evaluating the CBM reservoir using logging data

    NASA Astrophysics Data System (ADS)

    Liu, Zhidi; Zhao, Jingzhou

    2016-02-01

    In order to evaluate coal bed methane (CBM) reservoirs, this paper selects five parameters: porosity, permeability, CBM content, the coal structure index and the effective thickness of the coal seam. Making full use of logging data and laboratory analyses of coal cores, the logging evaluation methods for the five parameters are discussed in detail, and a comprehensive evaluation model of the CBM reservoir is established. The #5 coal seam of the Hancheng mine on the eastern edge of the Ordos Basin in China was quantitatively evaluated using this method. The results show that the CBM reservoir in the study area is better than that in the central and northern regions. The actual development of CBM shows that the region with a good reservoir has high gas production, indicating that the method introduced in this paper can evaluate the CBM reservoir effectively.

  9. A methodology for the quantitative risk assessment of major accidents triggered by seismic events.

    PubMed

    Antonioni, Giacomo; Spadoni, Gigliola; Cozzani, Valerio

    2007-08-17

    A procedure for the quantitative risk assessment of accidents triggered by seismic events in industrial facilities was developed. The starting point of the procedure was the use of available historical data to assess the expected frequencies and the severity of seismic events. Available equipment-dependent failure probability models (vulnerability or fragility curves) were used to assess the damage probability of equipment items due to a seismic event. An analytic procedure was subsequently developed to identify, evaluate the credibility of, and finally assess the expected consequences of all the possible scenarios that may follow seismic events. The procedure was implemented in a GIS-based software tool in order to manage the high number of event sequences that are likely to be generated in large industrial facilities. The developed methodology requires a limited amount of additional data with respect to those used in a conventional QRA, and yields with limited effort a preliminary quantitative assessment of the contribution of the scenarios triggered by earthquakes to the individual and societal risk indexes. The application of the methodology to several case studies showed that scenarios initiated by seismic events may have a relevant influence on industrial risk, both raising the overall expected frequency of single scenarios and causing specific severe scenarios simultaneously involving several plant units.
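
    Fragility curves of the kind referenced are conventionally modeled as a lognormal CDF of ground-motion intensity; combining one with a seismic event frequency gives the expected frequency of a seismically triggered scenario. The median capacity, dispersion, and seismic frequency below are illustrative assumptions, not values from the paper.

```python
import math

def fragility(pga, median=0.5, beta=0.4):
    # Lognormal fragility curve: P(damage | peak ground acceleration).
    # Median capacity and dispersion beta are illustrative assumptions.
    z = math.log(pga / median) / beta
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

seismic_freq = 1e-3        # assumed events/year at the reference intensity
p_damage = fragility(0.5)  # at a PGA equal to the median capacity
release_freq = seismic_freq * p_damage  # expected frequency of the scenario
```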

  10. Provisional guidance for quantitative risk assessment of polycyclic aromatic hydrocarbons. Final report

    SciTech Connect

    Schoeny, R.; Poirier, K.

    1993-07-01

    PAHs are products of incomplete combustion of organic materials; sources are thus widespread, including cigarette smoke, municipal waste incineration, wood stove emissions, coal conversion, energy production from fossil fuels, and automobile and diesel exhaust. As PAHs are common environmental contaminants, it is important that EPA have a scientifically justified, consistent approach to the evaluation of human health risk from exposure to these compounds. For the majority of PAHs classified as B2, probable human carcinogen, data are insufficient for calculation of an inhalation or drinking water unit risk. Benzo(a)pyrene (BaP) is the most completely studied of the PAHs, and its data, while problematic, are sufficient for calculation of quantitative estimates of carcinogenic potency. Toxicity Equivalency Factors (TEFs) have been used by the U.S. EPA on an interim basis for risk assessment of chlorinated dibenzodioxins and dibenzofurans. Data for PAHs do not meet all the criteria for use of TEFs. The document therefore presents a somewhat different approach to quantitative estimation for PAHs, using weighted potential potencies.
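
    The weighted-potency idea can be illustrated with a benzo[a]pyrene-equivalent sum: each component's concentration is scaled by its potency relative to BaP and the results added. The relative potency factors below follow commonly cited provisional orders of magnitude, and the mixture concentrations are invented.

```python
# Relative potency factors (benzo[a]pyrene = 1.0); the values follow
# commonly cited provisional orders of magnitude, not the report itself.
RPF = {"benzo[a]pyrene": 1.0, "benz[a]anthracene": 0.1, "chrysene": 0.001}

def bap_equivalents(concentrations):
    # BaP-equivalent concentration = sum of concentration × relative potency
    return sum(RPF[name] * c for name, c in concentrations.items())

mixture = {"benzo[a]pyrene": 0.2, "benz[a]anthracene": 1.0, "chrysene": 5.0}
beq = bap_equivalents(mixture)  # mg/kg BaP equivalents (invented data)
```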

  11. Qualitative and quantitative procedures for health risk assessment.

    PubMed

    Lohman, P H

    1999-07-16

    Numerous reactive mutagenic electrophiles are present in the environment or are formed in the human body through metabolic processes. These electrophiles can react directly with DNA and are considered to be ultimate carcinogens. In the past decades more than 200 in vitro and in vivo genotoxicity tests have been described to identify, monitor and characterize the exposure of humans to such agents. When the responses of such genotoxicity tests are quantified by a weight-of-evidence analysis, the intrinsic mutagenic potency of electrophiles is found not to differ much across the majority of the agents studied. Considering that under normal environmental circumstances humans are exposed to low concentrations of about a million electrophiles, the relation between exposure to such agents and adverse health effects (e.g., cancer) becomes a 'Pandora's box'. For quantitative risk assessment it is necessary not only to detect whether an agent is genotoxic, but also to take into account the mechanism of interaction of the agent with the DNA in target cells. Examples are given for a limited group of important environmental and carcinogenic agents for which such an approach is feasible. The groups identified are agents that form cross-links with DNA and mono-alkylating agents that react with base moieties in the DNA strands. Quantitative hazard ranking of the mutagenic potency of these groups of chemicals can be performed, and there is ample evidence that such a ranking corresponds with the individual carcinogenic potency of those agents in rodents. Still, in practice, with the exception of certain occupational or accidental exposure situations, these approaches have not been successful in preventing cancer deaths in the human population. However, this is not only due to the described 'Pandora's box' situation. At least three other factors are described. Firstly, in the industrial world the medical treatment of cancer in patients

  12. Modeling number of bacteria per food unit in comparison to bacterial concentration in quantitative risk assessment: impact on risk estimates.

    PubMed

    Pouillot, Régis; Chen, Yuhuan; Hoelzer, Karin

    2015-02-01

    When developing quantitative risk assessment models, a fundamental consideration for risk assessors is whether to evaluate changes in bacterial levels in terms of concentrations or in terms of bacterial numbers. Although modeling bacteria in terms of integer numbers may be regarded as the more intuitive and rigorous choice, modeling bacterial concentrations is more popular, as it is generally less mathematically complex. We tested three different modeling approaches in a simulation study. The first approach considered bacterial concentrations; the second considered the number of bacteria in contaminated units; and the third considered the expected number of bacteria in contaminated units. Simulation results indicate that modeling concentrations tends to overestimate risk compared to modeling the number of bacteria. A sensitivity analysis using a regression tree suggests that processes which include drastic scenarios, consisting of combinations of large bacterial inactivation followed by large bacterial growth, frequently lead to a >10-fold overestimation of the average risk when modeling concentrations as opposed to bacterial numbers. Alternatively, the approach of modeling the expected number of bacteria in positive units generates results similar to the second method and is easier to use, thus potentially representing a promising compromise.
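
    The concentration-versus-number contrast can be reproduced in a few lines: a drastic scenario (large inactivation followed by large growth) is modeled once with deterministic concentration arithmetic and once with integer bacteria and per-cell binomial survival. All parameter values are illustrative, not the paper's.

```python
import math
import random

random.seed(7)
R = 1e-4  # exponential dose-response parameter (illustrative)

def risk_concentration(c0, log_red, log_growth, volume):
    # Deterministic concentration arithmetic: no integer constraint.
    c = c0 * 10 ** (-log_red) * 10 ** log_growth
    return 1.0 - math.exp(-R * c * volume)

def risk_numbers(c0, log_red, log_growth, volume, trials=20_000):
    # Integer bacteria: inactivation as per-cell binomial survival.
    n0 = round(c0 * volume)
    p_survive = 10 ** (-log_red)
    total = 0.0
    for _ in range(trials):
        survivors = sum(random.random() < p_survive for _ in range(n0))
        dose = survivors * 10 ** log_growth
        total += 1.0 - math.exp(-R * dose)
    return total / trials

# Drastic scenario: 4-log inactivation followed by 5-log growth.
rc = risk_concentration(100, log_red=4, log_growth=5, volume=1)
rn = risk_numbers(100, log_red=4, log_growth=5, volume=1)
# rc substantially overestimates rn, echoing the paper's finding.
```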

  13. Quantitative microbial risk assessment of human illness from exposure to marine beach sand.

    PubMed

    Shibata, Tomoyuki; Solo-Gabriele, Helena M

    2012-03-01

    Currently no U.S. federal guideline is available for assessing risk of illness from sand at recreational sites. The objectives of this study were to compute a reference level guideline for pathogens in beach sand and to compare these reference levels with measurements from a beach impacted by nonpoint sources of contamination. Reference levels were computed using quantitative microbial risk assessment (QMRA) coupled with Monte Carlo simulations. In order to reach an equivalent level of risk of illness as set by the U.S. EPA for marine water exposure (1.9 × 10^-2), levels would need to be at least about 10 oocysts/g (about 1 oocyst/g for a pica child) for Cryptosporidium, about 5 MPN/g (about 1 MPN/g for pica) for enterovirus, and less than 10^6 CFU/g for S. aureus. Pathogen levels measured in sand at a nonpoint source recreational beach were lower than the reference levels. More research is needed in evaluating risk from yeast and helminth exposures, as well as in identifying acceptable levels of risk for skin infections associated with sand exposures.
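
    The reference-level computation can be sketched by inverting an exponential dose-response model at the acceptable risk. The dose-response parameter is a commonly cited value for Cryptosporidium and the sand-ingestion mass is an assumption, so the result only roughly echoes the ~10 oocysts/g figure in the abstract.

```python
import math

def reference_level(acceptable_risk, r, sand_g):
    # Invert the exponential dose-response P = 1 - exp(-r * N * m)
    # for the pathogen level N (organisms per gram of sand ingested).
    return -math.log(1.0 - acceptable_risk) / (r * sand_g)

# r is a commonly cited exponential parameter for Cryptosporidium;
# the ingested sand mass of 0.5 g is an assumption.
level = reference_level(1.9e-2, r=0.0042, sand_g=0.5)  # oocysts per gram
```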

  14. Evaluating quantitative formulas for dose-response assessment of chemical mixtures.

    PubMed Central

    Hertzberg, Richard C; Teuschler, Linda K

    2002-01-01

    Risk assessment formulas are often distinguished from dose-response models by being rough but necessary. The evaluation of such rough formulas is described here, using the example of mixture risk assessment. Two conditions make the dose-response part of mixture risk assessment difficult: lack of data on mixture dose-response relationships, and the need to address risk from combinations of chemicals because of public demands and statutory requirements. Consequently, the U.S. Environmental Protection Agency has developed methods for carrying out quantitative dose-response assessment for chemical mixtures that require information only on the toxicity of single chemicals and of chemical pair interactions. These formulas are based on plausible ideas and default parameters but minimal supporting data on whole mixtures. Because of this lack of mixture data, the usual evaluation of accuracy (predicted vs. observed) cannot be performed. Two approaches to the evaluation of such formulas are to consider the fundamental biological concepts that support them (e.g., toxicologic similarity) and to determine how well the proposed method performs under simplifying constraints (e.g., as the toxicologic interactions disappear). These ideas are illustrated using dose addition and two weight-of-evidence formulas for incorporating toxicologic interactions. PMID:12634126
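
    Dose addition is commonly operationalized as a hazard index (the sum of dose-to-reference-dose ratios). This is a generic sketch with hypothetical numbers, not the paper's weight-of-evidence formulas, which additionally modify the index for pairwise interactions.

```python
def hazard_index(doses, reference_doses):
    # Dose addition: sum of dose / reference-dose ratios over components.
    return sum(doses[c] / reference_doses[c] for c in doses)

doses = {"chem_a": 0.02, "chem_b": 0.01}  # mg/kg-day, hypothetical
rfds = {"chem_a": 0.10, "chem_b": 0.05}   # reference doses, hypothetical

hi = hazard_index(doses, rfds)
concern = hi > 1.0  # combined exposure exceeds the additive benchmark
```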

  15. A study on the quantitative evaluation of skin barrier function

    NASA Astrophysics Data System (ADS)

    Maruyama, Tomomi; Kabetani, Yasuhiro; Kido, Michiko; Yamada, Kenji; Oikaze, Hirotoshi; Takechi, Yohei; Furuta, Tomotaka; Ishii, Shoichi; Katayama, Haruna; Jeong, Hieyong; Ohno, Yuko

    2015-03-01

    We propose a quantitative evaluation method of skin barrier function using Optical Coherence Microscopy system (OCM system) with coherency of near-infrared light. There are a lot of skin problems such as itching, irritation and so on. It has been recognized skin problems are caused by impairment of skin barrier function, which prevents damage from various external stimuli and loss of water. To evaluate skin barrier function, it is a common strategy that they observe skin surface and ask patients about their skin condition. The methods are subjective judgements and they are influenced by difference of experience of persons. Furthermore, microscopy has been used to observe inner structure of the skin in detail, and in vitro measurements like microscopy requires tissue sampling. On the other hand, it is necessary to assess objectively skin barrier function by quantitative evaluation method. In addition, non-invasive and nondestructive measuring method and examination changes over time are needed. Therefore, in vivo measurements are crucial for evaluating skin barrier function. In this study, we evaluate changes of stratum corneum structure which is important for evaluating skin barrier function by comparing water-penetrated skin with normal skin using a system with coherency of near-infrared light. Proposed method can obtain in vivo 3D images of inner structure of body tissue, which is non-invasive and non-destructive measuring method. We formulate changes of skin ultrastructure after water penetration. Finally, we evaluate the limit of performance of the OCM system in this work in order to discuss how to improve the OCM system.

  16. Approach for evaluating inundation risks in urban drainage systems.

    PubMed

    Zhu, Zhihua; Chen, Zhihe; Chen, Xiaohong; He, Peiying

    2016-05-15

    Urban inundation is a serious challenge that increasingly confronts the residents of many cities, as well as policymakers. Hence, inundation evaluation is becoming increasingly important around the world. This comprehensive assessment involves numerous indices in urban catchments, but the high-dimensional and non-linear relationship between the indices and the risk presents an enormous challenge for accurate evaluation. Therefore, an approach is hereby proposed to qualitatively and quantitatively evaluate inundation risks in urban drainage systems based on a storm water management model, the projection pursuit method, the ordinary kriging method and the K-means clustering method. This approach is tested using a residential district in Guangzhou, China. Seven evaluation indices were selected and twenty rainfall-runoff events were used to calibrate and validate the parameters of the rainfall-runoff model. The inundation risks in the study area drainage system were evaluated under different rainfall scenarios. The following conclusions are reached. (1) The proposed approach, without subjective factors, can identify the main driving factors, i.e., inundation duration, largest water flow and total flood amount in this study area. (2) The inundation risk of each manhole can be qualitatively analyzed and quantitatively calculated. There are 1, 8, 11, 14, 21, and 21 manholes at risk under the return periods of 1-year, 5-years, 10-years, 20-years, 50-years and 100-years, respectively. (3) The areas of levels III, IV and V increase with increasing rainfall return period based on analyzing the inundation risks for a variety of characteristics. (4) The relationships between rainfall intensity and inundation-affected areas are revealed by a logarithmic model. This study proposes a novel and successful approach to assessing risk in urban drainage systems and provides guidance for improving urban drainage systems and inundation preparedness. PMID:26897578
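The grading step described above, in which composite manhole risk scores are clustered into discrete levels (I-V), can be sketched with a plain one-dimensional k-means. The 21 scores below are synthetic placeholders, not the study's data:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical composite risk scores for 21 manholes (e.g. projected from
# indices such as inundation duration, peak flow, and total flood volume).
scores = np.sort(rng.uniform(0, 1, 21)).reshape(-1, 1)

def kmeans_1d(x, k=5, iters=50):
    """Plain k-means on 1-D data: assign to nearest centroid, re-average."""
    centroids = np.linspace(x.min(), x.max(), k)
    for _ in range(iters):
        labels = np.argmin(np.abs(x - centroids), axis=1)
        for j in range(k):
            if np.any(labels == j):           # skip empty clusters
                centroids[j] = x[labels == j].mean()
    return labels, centroids

labels, centroids = kmeans_1d(scores, k=5)    # levels I..V by centroid order
```

In the paper's pipeline the scores themselves come from projection pursuit over the seven indices, with ordinary kriging used to interpolate risk spatially between manholes.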

  18. Compressed natural gas bus safety: a quantitative risk assessment.

    PubMed

    Chamberlain, Samuel; Modarres, Mohammad

    2005-04-01

    This study assesses the fire safety risks associated with compressed natural gas (CNG) vehicle systems, comprising primarily a typical school bus and supporting fuel infrastructure. The study determines the sensitivity of the results to variations in component failure rates and consequences of fire events. The components and subsystems that contribute most to fire safety risk are determined. Finally, the results are compared to fire risks of the present generation of diesel-fueled school buses. Direct computation of the safety risks associated with diesel-powered vehicles is possible because these are mature technologies for which historical performance data are available. Because of limited experience, fatal accident data for CNG bus fleets are minimal. Therefore, this study uses the probabilistic risk assessment (PRA) approach to model and predict fire safety risk of CNG buses. Generic failure data, engineering judgments, and assumptions are used in this study. This study predicts the mean fire fatality risk for typical CNG buses as approximately 0.23 fatalities per 100-million miles for all people involved, including bus passengers. The study estimates mean values of 0.16 fatalities per 100-million miles for bus passengers only. Based on historical data, diesel school bus mean fire fatality risk is 0.091 and 0.0007 per 100-million miles for all people and bus passengers, respectively. One can therefore conclude that CNG buses are approximately 2.5 times more prone to fire fatality risk than diesel buses, with bus passengers at over two orders of magnitude greater risk. The study estimates a mean fire risk frequency of 2.2 x 10(-5) fatalities/bus per year. The 5% and 95% uncertainty bounds are 9.1 x 10(-6) and 4.0 x 10(-5), respectively. The risk result was found to be affected most by failure rates of pressure relief valves, CNG cylinders, and fuel piping.
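The headline comparisons above follow directly from the reported per-mile rates; a quick check of the arithmetic (values taken from the abstract):

```python
# Fatalities per 100-million miles, as reported in the abstract.
cng_all, diesel_all = 0.23, 0.091    # all people involved
cng_pax, diesel_pax = 0.16, 0.0007   # bus passengers only

ratio_all = cng_all / diesel_all     # overall CNG-to-diesel risk ratio
ratio_pax = cng_pax / diesel_pax     # passenger risk ratio

print(round(ratio_all, 1))           # ~2.5x overall
print(round(ratio_pax))              # ~229x, i.e. over two orders of magnitude
```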

  19. An Overview of Quantitative Risk Assessment of Space Shuttle Propulsion Elements

    NASA Technical Reports Server (NTRS)

    Safie, Fayssal M.

    1998-01-01

    Since the Space Shuttle Challenger accident in 1986, NASA has been working to incorporate quantitative risk assessment (QRA) in decisions concerning the Space Shuttle and other NASA projects. One current major NASA QRA study is the creation of a risk model for the overall Space Shuttle system. The model is intended to provide a tool to estimate Space Shuttle risk and to perform sensitivity analyses/trade studies, including the evaluation of upgrades. Marshall Space Flight Center (MSFC) is a part of the NASA team conducting the QRA study; MSFC responsibility involves modeling the propulsion elements of the Space Shuttle, namely: the External Tank (ET), the Solid Rocket Booster (SRB), the Reusable Solid Rocket Motor (RSRM), and the Space Shuttle Main Engine (SSME). This paper discusses the approach that MSFC has used to model its Space Shuttle elements, including insights obtained from this experience in modeling large-scale, highly complex systems with a varying availability of success/failure data. Insights, which are applicable to any QRA study, pertain to organizing the modeling effort, obtaining customer buy-in, preparing documentation, and using varied modeling methods and data sources. Also provided is an overall evaluation of the study results, including the strengths and the limitations of the MSFC QRA approach and of QRA technology in general.

  20. Simulation evaluation of quantitative myocardial perfusion assessment from cardiac CT

    NASA Astrophysics Data System (ADS)

    Bindschadler, Michael; Modgil, Dimple; Branch, Kelley R.; La Riviere, Patrick J.; Alessio, Adam M.

    2014-03-01

    Contrast enhancement on cardiac CT provides valuable information about myocardial perfusion and methods have been proposed to assess perfusion with static and dynamic acquisitions. There is a lack of knowledge and consensus on the appropriate approach to ensure 1) sufficient diagnostic accuracy for clinical decisions and 2) low radiation doses for patient safety. This work developed a thorough dynamic CT simulation and several accepted blood flow estimation techniques to evaluate the performance of perfusion assessment across a range of acquisition and estimation scenarios. Cardiac CT acquisitions were simulated for a range of flow states (Flow = 0.5, 1, 2, 3 ml/g/min, cardiac output = 3, 5, 8 L/min). CT acquisitions were simulated with a validated CT simulator incorporating polyenergetic data acquisition and realistic x-ray flux levels for dynamic acquisitions with a range of scenarios including 1, 2, 3 sec sampling for 30 sec with 25, 70, 140 mAs. Images were generated using conventional image reconstruction with additional image-based beam hardening correction to account for iodine content. Time attenuation curves were extracted for multiple regions around the myocardium and used to estimate flow. In total, 2,700 independent realizations of dynamic sequences were generated and multiple MBF estimation methods were applied to each of these. Evaluation of quantitative kinetic modeling yielded blood flow estimates with a root mean square error (RMSE) of ~0.6 ml/g/min averaged across multiple scenarios. Semi-quantitative modeling and qualitative static imaging resulted in significantly more error (RMSE of ~1.2 ml/min/g for each). For quantitative methods, dose reduction through reduced temporal sampling or reduced tube current had comparable impact on the MBF estimate fidelity. On average, half dose acquisitions increased the RMSE of estimates by only 18% suggesting that substantial dose reductions can be employed in the context of quantitative myocardial

  1. A Computational Framework for Quantitative Evaluation of Movement during Rehabilitation

    NASA Astrophysics Data System (ADS)

    Chen, Yinpeng; Duff, Margaret; Lehrer, Nicole; Sundaram, Hari; He, Jiping; Wolf, Steven L.; Rikakis, Thanassis

    2011-06-01

    This paper presents a novel generalized computational framework for quantitative kinematic evaluation of movement in a rehabilitation clinic setting. The framework integrates clinical knowledge and computational data-driven analysis in a systematic manner. The framework provides three key benefits to rehabilitation: (a) the resulting continuous normalized measure allows the clinician to monitor movement quality on a fine scale and easily compare impairments across participants, (b) the framework reveals the effect of individual movement components on the composite movement performance, helping the clinician decide the training foci, and (c) the evaluation runs in real-time, which allows the clinician to constantly track a patient's progress and make appropriate adaptations to the therapy protocol. The creation of such an evaluation is difficult because of the sparsity of recorded clinical observations, the high dimensionality of movement, and the high variation in subjects' performance. We address these issues by modeling the evaluation function as a linear combination of multiple normalized kinematic attributes y = Σwiφi(xi) and estimating the attribute normalization functions φi(·) by integrating distributions of idealized movement and deviated movement. The weights wi are derived from a therapist's pair-wise comparisons using a modified RankSVM algorithm. We have applied this framework to evaluate upper limb movement for stroke survivors with excellent results—the evaluation results are highly correlated to the therapist's observations.
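The evaluation function y = Σwiφi(xi) above can be sketched directly: each kinematic attribute is mapped to [0, 1] by a normalizer shaped by the idealized and deviated distributions, then combined with therapist-derived weights. The sigmoid normalizers, attribute choices, and weights below are assumptions for illustration, not the paper's fitted values:

```python
import numpy as np

def make_phi(ideal, deviated):
    """Normalizer that is ~1 near the idealized value and ~0 near the
    deviated value (a sigmoid centered between the two distributions)."""
    mid = 0.5 * (ideal + deviated)
    scale = abs(deviated - ideal) / 6 or 1.0
    sign = 1.0 if deviated > ideal else -1.0
    return lambda x: 1.0 / (1.0 + np.exp(sign * (x - mid) / scale))

# Three hypothetical attributes: trajectory error (m), jerkiness, time (s).
phis = [make_phi(0.02, 0.15), make_phi(0.1, 0.9), make_phi(1.5, 5.0)]
w = np.array([0.5, 0.3, 0.2])   # weights, e.g. from pairwise-ranking SVM

def evaluate(x):
    """Composite movement quality y = sum_i w_i * phi_i(x_i), in [0, 1]."""
    return float(sum(wi * phi(xi) for wi, phi, xi in zip(w, phis, x)))

good = evaluate([0.03, 0.15, 1.8])   # near-ideal movement -> close to 1
poor = evaluate([0.14, 0.80, 4.5])   # deviated movement -> close to 0
```

In the paper the weights come from a modified RankSVM fit to the therapist's pairwise comparisons, so the composite score reproduces the clinician's ordering of movements.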

  2. Quantitative risk assessment for skin sensitisation: consideration of a simplified approach for hair dye ingredients.

    PubMed

    Goebel, Carsten; Diepgen, Thomas L; Krasteva, Maya; Schlatter, Harald; Nicolas, Jean-Francois; Blömeke, Brunhilde; Coenraads, Pieter Jan; Schnuch, Axel; Taylor, James S; Pungier, Jacquemine; Fautz, Rolf; Fuchs, Anne; Schuh, Werner; Gerberick, G Frank; Kimber, Ian

    2012-12-01

    With the availability of the local lymph node assay, and the ability to evaluate effectively the relative skin sensitizing potency of contact allergens, a model for quantitative-risk-assessment (QRA) has been developed. This QRA process comprises: (a) determination of a no-expected-sensitisation-induction-level (NESIL), (b) incorporation of sensitization-assessment-factors (SAFs) reflecting variations between subjects, product use patterns and matrices, and (c) estimation of consumer-exposure-level (CEL). Based on these elements an acceptable-exposure-level (AEL) can be calculated by dividing the NESIL of the product by individual SAFs. Finally, the AEL is compared with the CEL to judge the risk to human health. We propose a simplified approach to risk assessment of hair dye ingredients by making use of precise experimental product exposure data. This data set provides firmly established dose/unit area concentrations under relevant consumer use conditions referred to as the measured-exposure-level (MEL). For that reason a direct comparison is possible between the NESIL and the MEL as a proof-of-concept quantification of the risk of skin sensitization. This is illustrated here by reference to two specific hair dye ingredients, p-phenylenediamine and resorcinol. Comparison of these robust and toxicologically relevant values is therefore considered an improvement versus a hazard-based classification of hair dye ingredients. PMID:23069142
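The QRA arithmetic described above is a simple division-and-comparison; the sketch below uses hypothetical NESIL, SAF, and exposure values (placeholders, not published figures for any ingredient):

```python
# Skin-sensitisation QRA: AEL = NESIL / (product of SAFs), then compare
# the AEL with the exposure level (CEL, or MEL in the simplified approach).
# All numeric values here are hypothetical placeholders.

nesil = 250.0    # ug/cm^2, no-expected-sensitisation-induction level
safs = {"inter_individual": 10, "matrix": 3, "use_pattern": 3}

saf_product = 1
for factor in safs.values():
    saf_product *= factor

ael = nesil / saf_product   # acceptable exposure level, ug/cm^2

mel = 1.5                   # ug/cm^2, measured exposure level under use
acceptable = mel <= ael     # risk judged acceptable when exposure <= AEL
print(f"AEL = {ael:.2f} ug/cm^2, MEL = {mel} ug/cm^2, acceptable: {acceptable}")
```

The simplification proposed in the paper replaces the estimated CEL with a directly measured MEL, so the comparison rests on experimental exposure data rather than modeled use patterns.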

  3. Quantitative microbial risk assessment of distributed drinking water using faecal indicator incidence and concentrations.

    PubMed

    van Lieverloo, J Hein M; Blokker, E J Mirjam; Medema, Gertjan

    2007-01-01

    Quantitative Microbial Risk Assessments (QMRA) have focused on drinking water system components upstream of distribution to customers, for nominal and event conditions. Yet some 15-33% of waterborne outbreaks are reported to be caused by contamination events in distribution systems. In the majority of these cases and probably in all non-outbreak contamination events, no pathogen concentration data was available. Faecal contamination events are usually detected or confirmed by the presence of E. coli or other faecal indicators, although the absence of this indicator is no guarantee of the absence of faecal pathogens. In this paper, the incidence and concentrations of various coliforms and sources of faecal contamination were used to estimate the possible concentrations of faecal pathogens and consequently the infection risks to consumers in event-affected areas. The results indicate that the infection risks may be very high, especially from Campylobacter and enteroviruses, but also that the uncertainties are very high. The high variability of pathogen to thermotolerant coliform ratios estimated in environmental samples severely limits the applicability of the approach described. Importantly, the highest ratios of enteroviruses to thermotolerant coliform were suggested from soil and shallow groundwaters, the most likely sources of faecal contamination that are detected in distribution systems. Epidemiological evaluations of non-outbreak faecal contamination of drinking water distribution systems and thorough tracking and characterisation of the contamination sources are necessary to assess the actual risks of these events.

  4. Semi-quantitative exposure assessment of occupational exposure to wood dust and nasopharyngeal cancer risk.

    PubMed

    Ekpanyaskul, Chatchai; Sangrajrang, Suleeporn; Ekburanawat, Wiwat; Brennan, Paul; Mannetje, Andrea; Thetkathuek, Anamai; Saejiw, Nutjaree; Ruangsuwan, Tassanu; Boffetta, Paolo

    2015-01-01

    Occupational exposure to wood dust is one cause of nasopharyngeal cancer (NPC); however, assessing this exposure remains problematic. Therefore, the objective of this study was to develop a semi-quantitative exposure assessment method and then utilize it to evaluate the association between occupational exposure to wood dust and the development of NPC. In addition, variations in risk by histology were examined. A case-control study was conducted with 327 newly diagnosed cases of NPC at the National Cancer Institute and regional cancer centers in Thailand, with 1:1 controls matched for age, gender and geographical residence. Occupational information was obtained through personal interviews. The potential probability, frequency and intensity of exposure to wood dust were assessed on a job-by-job basis by experienced experts. Analysis was performed by conditional logistic regression, with results presented as odds ratio (OR) estimates and 95% confidence intervals (CIs). Overall, a non-significant relationship between occupational wood dust exposure and NPC risk was observed for all subjects (OR=1.61, 95% CI 0.99-2.59); however, the risk became significant when analyses focused on types 2 and 3 NPC (OR=1.62, 95% CI 1.03-2.74). The significant association was stronger for those exposed to wood dust for >10 years (OR=2.26, 95% CI 1.10-4.63), for those with first-time exposure at age >25 years (OR=2.07, 95% CI 1.08-3.94), and for those with a high cumulative exposure (OR=2.17, 95% CI 1.03-4.58), compared with those considered unexposed. In conclusion, wood dust is likely to be associated with an increased risk of type 2 or 3 NPC in the Thai population. The results of this study show that semi-quantitative exposure assessment is suitable for occupational exposure assessment in a case-control study and complements the information from self-reporting.

  5. Study on the performance evaluation of quantitative precipitation estimation and quantitative precipitation forecast

    NASA Astrophysics Data System (ADS)

    Yang, H.; Chang, K.; Suk, M.; cha, J.; Choi, Y.

    2011-12-01

    Rainfall estimation and short-term (several-hour) quantitative prediction of precipitation based on meteorological radar data is one of the most intensely studied topics. The Korean Peninsula has a horizontally narrow land area and complex, mountainous topography, so its rainfall systems frequently change. Quantitative precipitation estimation (QPE) and quantitative precipitation forecasts (QPF) are crucial information for severe weather and water management. We have conducted a performance evaluation of the QPE/QPF of the Korea Meteorological Administration (KMA), which is the first step toward optimizing the QPE/QPF system in South Korea. The real-time adjusted RAR (Radar-AWS-Rainrate) system gives better agreement with the observed rain rate than the fixed Z-R relation, and the additional bias correction of RAR yields slightly better results. A correlation coefficient of R2 = 0.84 is obtained between the daily accumulated observed and RAR-estimated rainfall. The RAR will be available for hydrological applications such as the water budget. The VSRF (Very Short Range Forecast) shows better performance than MAPLE (McGill Algorithm for Precipitation Nowcasting by Lagrangian Extrapolation) within 40 minutes, but MAPLE performs better after 40 minutes. For hourly forecasts, MAPLE shows better performance than the VSRF. QPE and QPF are most meaningful for nowcasting (1-2 hours), whereas longer-term forecasts (beyond 3 hours) by a meteorological model are especially meaningful for applications such as water management.
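The skill metric quoted above (R2 between daily accumulated gauge observations and radar estimates) is the standard coefficient of determination; a minimal sketch with synthetic placeholder data:

```python
import numpy as np

# Synthetic daily accumulated rainfall: gauge observations vs. radar
# (RAR-style) estimates. These arrays are illustrative placeholders.
obs = np.array([2.0, 5.5, 11.0, 0.5, 23.0, 8.0, 15.5])   # mm/day, gauges
est = np.array([2.4, 5.0, 12.1, 0.8, 20.5, 7.2, 16.3])   # mm/day, radar

ss_res = np.sum((obs - est) ** 2)              # residual sum of squares
ss_tot = np.sum((obs - obs.mean()) ** 2)       # total sum of squares
r2 = 1 - ss_res / ss_tot                       # coefficient of determination
print(round(r2, 3))
```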

  6. Risk assessment technique for evaluating research laboratories

    SciTech Connect

    Bolander, T.W.; Meale, B.M.; Eide, S.A.

    1992-01-01

    A technique has been developed to evaluate research laboratories according to risk, where risk is defined as the product of frequency and consequence. This technique was used to evaluate several laboratories at the Idaho National Engineering Laboratory under the direction of the Department of Energy, Idaho Field Office to assist in the risk management of the Science and Technology Department laboratories. With this technique, laboratories can be compared according to risk, and management can use the results to make cost effective decisions associated with the operation of the facility.

  8. Evaluation of protective action risks

    SciTech Connect

    Witzig, W.F.; Shillenn, J.K.

    1987-06-01

    The purpose of this study is to determine how the risks of the protective action of evacuation compare with the radiological risks from a radiation release if no protective actions are taken. Evacuation risks of death and injury have been determined by identifying from newspapers and other sources 902 possible evacuation events which occurred in the US during the period January 1, 1973 through April 30, 1986. A survey form was developed to determine evacuation risks and other information relating to the evacuation events and sent to local emergency management personnel located in the vicinity of 783 events. There were 310 completed surveys received and the data summarized. This study found that the key factors for a successful evacuation included an emergency plan, good communications and coordination, practice drills, and defined authority. Few successful evacuations used the emergency broadcasting system or warning sirens to communicate the need to evacuate. Reports of panic and traffic jams during an evacuation were very few. Traffic jams occurring during reentry were more likely than during the evacuation exodus. A summary of potential societal consequences of evacuation is included in this study. 5 refs., 9 figs., 20 tabs.

  9. Optimizing Digital Health Informatics Interventions Through Unobtrusive Quantitative Process Evaluations.

    PubMed

    Gude, Wouter T; van der Veer, Sabine N; de Keizer, Nicolette F; Coiera, Enrico; Peek, Niels

    2016-01-01

    Health informatics interventions such as clinical decision support (CDS) and audit and feedback (A&F) are variably effective at improving care because the underlying mechanisms through which these interventions bring about change are poorly understood. This limits our ability to design better interventions. Process evaluations can be used to improve this understanding by assessing fidelity and quality of implementation, clarifying causal mechanisms, and identifying contextual factors associated with variation in outcomes. Coiera describes the intervention process as a series of stages extending from interactions to outcomes: the "information value chain". However, past process evaluations often did not assess the relationships between those stages. In this paper we argue that the chain can be measured quantitatively and unobtrusively in digital interventions thanks to the availability of electronic data that are a by-product of their use. This provides novel possibilities to study the mechanisms of informatics interventions in detail and inform essential design choices to optimize their efficacy. PMID:27577453

  10. The Nuclear Renaissance - Implications on Quantitative Nondestructive Evaluations

    SciTech Connect

    Matzie, Regis A.

    2007-03-21

    The world demand for energy is growing rapidly, particularly in developing countries that are trying to raise the standard of living for billions of people, many of whom do not even have access to electricity. With this increased energy demand and the high and volatile price of fossil fuels, nuclear energy is experiencing a resurgence. This so-called nuclear renaissance is broad based, reaching across Asia, the United States and Europe, as well as selected countries in Africa and South America. Some countries, such as Italy, that had turned away from nuclear energy are reconsidering the advisability of that decision. This renaissance provides the opportunity to deploy reactor designs more advanced than those operating today, with improved safety, economy, and operations. In this keynote address, I will briefly present three such advanced reactor designs in whose development Westinghouse is participating. These designs include the advanced passive PWR, AP1000, which recently received design certification from the US Nuclear Regulatory Commission; the Pebble Bed Modular Reactor (PBMR), which is being demonstrated in South Africa; and the International Reactor Innovative and Secure (IRIS), which was showcased in the US Department of Energy's recently announced Global Nuclear Energy Partnership (GNEP) program. The salient features of these designs that impact future requirements on quantitative nondestructive evaluations will be discussed. Features such as reactor vessel materials, operating temperature regimes, and new geometric configurations will be described, and mention will be made of the impact on quantitative nondestructive evaluation (NDE) approaches.

  11. Using integrated environmental modeling to automate a process-based Quantitative Microbial Risk Assessment

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Integrated Environmental Modeling (IEM) organizes multidisciplinary knowledge that explains and predicts environmental-system response to stressors. A Quantitative Microbial Risk Assessment (QMRA) is an approach integrating a range of disparate data (fate/transport, exposure, and human health effect...

  12. Use of Quantitative Microbial Risk Assessment to Improve Interpretation of a Recreational Water Epidemiological Study

    EPA Science Inventory

    We conducted a supplemental water quality monitoring study and quantitative microbial risk assessment (QMRA) to complement the United States Environmental Protection Agency’s (U.S. EPA) National Epidemiological and Environmental Assessment of Recreational Water study at Boq...

  13. A quantitative risk-based model for reasoning over critical system properties

    NASA Technical Reports Server (NTRS)

    Feather, M. S.

    2002-01-01

    This position paper suggests the use of a quantitative risk-based model to help support reasoning and decision making that spans many critical properties, such as security, safety, survivability, fault tolerance, and real-time performance.

  14. Quantitative Microbial Risk Assessment Tutorial: Installation of Software for Watershed Modeling in Support of QMRA

    EPA Science Inventory

    This tutorial provides instructions for accessing, retrieving, and downloading the following software to install on a host computer in support of Quantitative Microbial Risk Assessment (QMRA) modeling: • SDMProjectBuilder (which includes the Microbial Source Module as part...

  15. Using integrated environmental modeling to automate a process-based Quantitative Microbial Risk Assessment

    EPA Science Inventory

    Integrated Environmental Modeling (IEM) organizes multidisciplinary knowledge that explains and predicts environmental-system response to stressors. A Quantitative Microbial Risk Assessment (QMRA) is an approach integrating a range of disparate data (fate/transport, exposure, an...

  16. Using Integrated Environmental Modeling to Automate a Process-Based Quantitative Microbial Risk Assessment (presentation)

    EPA Science Inventory

    Integrated Environmental Modeling (IEM) organizes multidisciplinary knowledge that explains and predicts environmental-system response to stressors. A Quantitative Microbial Risk Assessment (QMRA) is an approach integrating a range of disparate data (fate/transport, exposure, and...

  17. Factors Distinguishing between Achievers and At Risk Students: A Qualitative and Quantitative Synthesis

    ERIC Educational Resources Information Center

    Eiselen, R.; Geyser, H.

    2003-01-01

    The purpose of this article is to identify factors that distinguish between Achievers and At Risk Students in Accounting 1A, and to explore how qualitative and quantitative research methods complement each other. Differences between the two groups were explored from both a quantitative and a qualitative perspective, focusing on study habits,…

  18. Potential Use of Quantitative Tissue Phenotype to Predict Malignant Risk for Oral Premalignant Lesions

    PubMed Central

    Guillaud, Martial; Zhang, Lewei; Poh, Catherine; Rosin, Miriam P.; MacAulay, Calum

    2009-01-01

    The importance of early diagnosis in improving mortality and morbidity rates of oral squamous cell carcinoma (SCC) has long been recognized. However, a major challenge for early diagnosis is our limited ability to differentiate oral premalignant lesions (OPLs) at high risk of progressing into invasive SCC from those at low risk. We investigated the potential of Quantitative Tissue Phenotype (QTP), measured by high-resolution image analysis, to recognize severe dysplasia/carcinoma in situ (CIS) (known to have an increased risk of progression) and to predict progression within hyperplasia or mild/moderate dysplasia (termed HMD). We generated a Nuclear Phenotypic Score (NPS), a combination of 5 nuclear morphometric features that best discriminate 4,027 “normal” nuclei (selected from 29 normal oral biopsies) from 4,298 “abnormal” nuclei (selected from 30 SCC biopsies). This NPS was then determined for a set of 69 OPLs. Severe dysplasia/CIS showed a significant increase in NPS compared to HMD. However, within the latter group, elevated NPS was strongly associated with the presence of high-risk LOH patterns. There was a statistical difference between the NPS of HMD lesions that progressed to cancer and those that did not. Individuals with a high NPS had a 10-fold increase in relative risk of progression. In the multivariate Cox model, LOH and NPS together were the strongest predictors for cancer development. These data suggest that QTP could be used to identify lesions that require molecular evaluation and should be integrated with such approaches to facilitate the identification of HMD OPLs at high risk of progression. PMID:18451134

  19. Risk effectiveness evaluation of surveillance testing

    SciTech Connect

    Martorell, S.; Kim, I.S.; Samanta, P.K.; Vesely, W.E.

    1992-07-20

    In nuclear power plants surveillance tests are required to detect failures in standby safety system components as a means of assuring their availability in case of an accident. However, the performance of surveillance tests at power may have adverse impact on safety as evidenced by the operating experience of the plants. The risk associated with a test includes two different aspects: (1) a positive aspect, i.e., risk contribution detected by the test, that results from the detection of failures which occur between tests and are detected by the test, and (2) a negative aspect, i.e., risk contribution caused by the test, that includes failures and degradations which are caused by the test or are related to the performance of the test. In terms of the two different risk contributions, the risk effectiveness of a test can be simply defined as follows: a test is risk effective if the risk contribution detected by the test is greater than the risk contribution caused by the test; otherwise it is risk ineffective. The methodology presentation will focus on two important kinds of negative test risk impacts, that is, the risk impacts of test-caused transients and equipment wear-out. The evaluation results of the risk effectiveness of the test will be presented in the full paper along with the risk assessment methodology and the insights from the sensitivity analysis. These constitute the core of the NUREG/CR-5775.
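
    The paper's decision rule, that a test is risk effective when the risk contribution it detects exceeds the risk contribution it causes, can be sketched as follows. The per-test formulas and all numbers are illustrative assumptions, not the NUREG/CR-5775 models.

```python
def detected_risk_contribution(failure_rate, test_interval, delta_risk_down):
    """Risk contribution removed by detecting latent failures.

    Approximates the mean fractional downtime of a standby component tested
    every `test_interval` hours as failure_rate * test_interval / 2, weighted
    by the risk increase `delta_risk_down` while the component is unavailable.
    """
    return failure_rate * test_interval / 2.0 * delta_risk_down

def caused_risk_contribution(p_transient, risk_per_transient, wear_risk):
    """Risk contribution of the test itself: test-caused plant transients
    plus test-caused equipment wear-out."""
    return p_transient * risk_per_transient + wear_risk

def is_risk_effective(detected, caused):
    """A test is risk effective when it detects more risk than it causes."""
    return detected > caused

# Illustrative numbers only (per-test contributions in arbitrary risk units).
detected = detected_risk_contribution(1e-5, 720.0, 1e-3)  # monthly test
caused = caused_risk_contribution(1e-4, 1e-2, 1e-6)
```

    With these made-up inputs the detected contribution (3.6e-6) exceeds the caused contribution (2e-6), so the test would be judged risk effective; shortening or lengthening the test interval shifts the balance.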

  20. Quantitative microbial risk assessment of antibacterial hand hygiene products on risk of shigellosis.

    PubMed

    Schaffner, Donald W; Bowman, James P; English, Donald J; Fischler, George E; Fuls, Janice L; Krowka, John F; Kruszewski, Francis H

    2014-04-01

    There are conflicting reports on whether antibacterial hand hygiene products are more effective than nonantibacterial products in reducing bacteria on hands and preventing disease. This research used new laboratory data, together with simulation techniques, to compare the ability of nonantibacterial and antibacterial products to reduce shigellosis risk. One hundred sixty-three subjects were used to compare five different hand treatments: two nonantibacterial products and three antibacterial products, i.e., 0.46% triclosan, 4% chlorhexidine gluconate, or 62% ethyl alcohol. Hands were inoculated with 5.5 to 6 log CFU Shigella; the simulated food handlers then washed their hands with one of the five products before handling melon balls. Each simulation scenario represented an event in which 100 people would be exposed to Shigella from melon balls that had been handled by food workers with Shigella on their hands. Analysis of experimental data showed that the two nonantibacterial treatments produced about a 2-log reduction on hands. The three antibacterial treatments showed log reductions greater than 3 but less than 4 on hands. All three antibacterial treatments resulted in statistically significantly lower concentrations on the melon balls relative to the nonantibacterial treatments. A simulation that assumed 1 million Shigella bacteria on the hands and the use of a nonantibacterial treatment predicted that 50 to 60 cases of shigellosis would result (of 100 exposed). Each of the antibacterial treatments was predicted to result in an appreciable number of simulations for which the number of illness cases would be 0, with the most common number of illness cases being 5 (of 100 exposed). These effects maintained statistical significance from 10(6) Shigella per hand down to as low as 100 Shigella per hand, with some evidence to support lower levels. This quantitative microbial risk assessment shows that antibacterial hand treatments can significantly reduce Shigella risk.
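
    The structure of such a simulation can be sketched with a beta-Poisson dose-response model. The hand-to-melon transfer fraction and the Shigella dose-response parameters below are assumptions for illustration, so the absolute case counts differ from the study's results; only the shape of the calculation is shown.

```python
def beta_poisson(dose, alpha=0.265, n50=1480.0):
    """Approximate beta-Poisson dose-response probability of illness.

    alpha and N50 are commonly cited Shigella values, used here only
    for illustration.
    """
    return 1.0 - (1.0 + (dose / n50) * (2.0 ** (1.0 / alpha) - 1.0)) ** (-alpha)

def expected_cases(shigella_on_hands, log_reduction,
                   transfer_fraction=0.01, n_exposed=100):
    """Expected illness cases among n_exposed people eating melon balls
    handled after a hand treatment achieving `log_reduction` log10.

    transfer_fraction (hands -> food) is an assumed placeholder.
    """
    dose = shigella_on_hands * 10.0 ** (-log_reduction) * transfer_fraction
    return n_exposed * beta_poisson(dose)

cases_plain = expected_cases(1e6, 2.0)  # nonantibacterial soap, ~2-log reduction
cases_anti = expected_cases(1e6, 3.5)   # antibacterial, between 3 and 4 log
```

    The extra 1.5 log of hand reduction propagates directly into the ingested dose, which is why the antibacterial arm yields far fewer predicted cases per 100 exposed.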

  1. Quantitative evaluation of heavy metals' pollution hazards in liquefaction residues of sewage sludge.

    PubMed

    Huang, Huajun; Yuan, Xingzhong; Zeng, Guangming; Zhu, Huina; Li, Hui; Liu, Zhifeng; Jiang, Hongwei; Leng, Lijian; Bi, Wenkai

    2011-11-01

    Liquefaction residues (LR) are the main by-products of sewage sludge (SS) liquefaction. This study quantitatively evaluates the potential ecological risk and pollution degrees of heavy metals (Pb, Zn, Cu, Cd, Cr and Ni) in LR versus SS. The leaching rates (R1) of heavy metals in LR were much lower than those in SS, revealing that the mobility/leachability of heavy metals was well suppressed after liquefaction. The geo-accumulation index (Igeo) indicated that the liquefaction process significantly weakened the contamination degrees of heavy metals. The potential ecological risk index (RI) demonstrated that the overall risks caused by heavy metals were substantially lowered from 1093.56 (very high risk) in SS to 4.72 and 1.51 (low risk) in LR1 and LR2, respectively. According to the risk assessment code (RAC), each tested heavy metal posed no or low risk to the environment after liquefaction. In short, the pollution hazards of heavy metals in LR were markedly mitigated.
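
    The indices used in this study have standard closed forms. A minimal sketch, assuming Hakanson-style toxic-response factors; the concentration and background values below are made-up placeholders, not data from the paper.

```python
import math

# Hakanson toxic-response factors commonly used for the potential
# ecological risk index (RI).
TOXIC_FACTOR = {"Cd": 30, "Pb": 5, "Cu": 5, "Ni": 5, "Cr": 2, "Zn": 1}

def igeo(conc, background):
    """Geo-accumulation index: Igeo = log2(Cn / (1.5 * Bn))."""
    return math.log2(conc / (1.5 * background))

def risk_index(conc, background):
    """RI = sum over metals of Tr_i * (C_i / B_i)."""
    return sum(TOXIC_FACTOR[m] * conc[m] / background[m] for m in conc)

# Placeholder concentrations (mg/kg): sludge before and residue after liquefaction.
background = {"Cd": 0.2, "Pb": 20.0, "Zn": 80.0}
sludge = {"Cd": 2.0, "Pb": 100.0, "Zn": 800.0}
residue = {"Cd": 0.1, "Pb": 10.0, "Zn": 60.0}
```

    With these placeholders the RI of the raw sludge far exceeds that of the residue, mirroring the direction of the reported drop from 1093.56 to below 5; Igeo is zero exactly when the concentration equals 1.5 times background.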

  2. Quantitative evaluation of mefenamic acid polymorphs by terahertz-chemometrics.

    PubMed

    Otsuka, Makoto; Nishizawa, Jun-ichi; Shibata, Jiro; Ito, Masahiko

    2010-09-01

    The purpose of the present study was to measure polymorphic content in a bulk powder, using mefenamic acid polymorphs as a model drug, with a THz spectrometer based on frequency-tunable THz wave generation by difference-frequency mixing in gallium phosphide crystals. Mefenamic acid polymorphic forms I and II were obtained by recrystallisation. Eleven standard samples with varying polymorphic form I content (0-100%) were prepared by physical mixing. After smoothing and area normalisation, the THz spectra of all standard samples showed an isosbestic point at 3.70 THz. The THz spectral data sets were arranged into five frequency ranges and pretreated using various functions, and calibration models were calculated by the partial least squares regression method. The effect of spectral data management on the chemometric parameters of the calibration models was investigated. The relationship between predicted and actual form I content was linear. On the regression vector (RV) corresponding to the absorption THz spectra, the peak at 1.45 THz had the highest value and the peak at 2.25 THz the lowest. THz spectroscopy with chemometrics would be useful for the quantitative evaluation of mefenamic acid polymorphs in the pharmaceutical industry. This method is expected to provide rapid and nondestructive quantitative analysis of polymorphs. PMID:20665848
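
    As a simplified analogue of the calibration step (the study used partial least squares regression over full spectra, not a single frequency), a univariate linear calibration on the 1.45 THz marker peak can be sketched with synthetic data.

```python
import numpy as np

# Synthetic absorbances at the 1.45 THz form-I peak for eleven standard
# mixtures (0-100 % form I); slope, intercept and noise level are invented.
form_I_content = np.linspace(0.0, 100.0, 11)                 # % form I
absorbance = 0.004 * form_I_content + 0.05                   # idealised Beer-Lambert
absorbance += np.random.default_rng(1).normal(0, 1e-3, 11)   # small measurement noise

# Inverse calibration: regress content on absorbance.
slope, intercept = np.polyfit(absorbance, form_I_content, 1)

def predict_content(a):
    """Predict % form I from absorbance at the marker frequency."""
    return slope * a + intercept

predicted = predict_content(absorbance)
```

    PLS generalises this idea by building the regression on latent variables spanning the whole spectral range, which is what makes the 1.45 THz and 2.25 THz peaks show up as extremes of the regression vector.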

  3. Quantitative Microbial Risk Assessment in Occupational Settings Applied to the Airborne Human Adenovirus Infection

    PubMed Central

    Carducci, Annalaura; Donzelli, Gabriele; Cioni, Lorenzo; Verani, Marco

    2016-01-01

    Quantitative Microbial Risk Assessment (QMRA) methodology, which has already been applied to drinking water and food safety, may also be applied to risk assessment and management at the workplace. The present study developed a preliminary QMRA model to assess microbial risk that is associated with inhaling bioaerosols that are contaminated with human adenovirus (HAdV). This model has been applied to air contamination data from different occupational settings, including wastewater systems, solid waste landfills, and toilets in healthcare settings and offices, with different exposure times. Virological monitoring showed the presence of HAdVs in all the evaluated settings, thus confirming that HAdV is widespread, but with different average concentrations of the virus. The QMRA results, based on these concentrations, showed that toilets had the highest probability of viral infection, followed by wastewater treatment plants and municipal solid waste landfills. Our QMRA approach in occupational settings is novel, and certain caveats should be considered. Nonetheless, we believe it is worthy of further discussions and investigations. PMID:27447658

  4. Quantitative Microbial Risk Assessment in Occupational Settings Applied to the Airborne Human Adenovirus Infection.

    PubMed

    Carducci, Annalaura; Donzelli, Gabriele; Cioni, Lorenzo; Verani, Marco

    2016-01-01

    Quantitative Microbial Risk Assessment (QMRA) methodology, which has already been applied to drinking water and food safety, may also be applied to risk assessment and management at the workplace. The present study developed a preliminary QMRA model to assess microbial risk that is associated with inhaling bioaerosols that are contaminated with human adenovirus (HAdV). This model has been applied to air contamination data from different occupational settings, including wastewater systems, solid waste landfills, and toilets in healthcare settings and offices, with different exposure times. Virological monitoring showed the presence of HAdVs in all the evaluated settings, thus confirming that HAdV is widespread, but with different average concentrations of the virus. The QMRA results, based on these concentrations, showed that toilets had the highest probability of viral infection, followed by wastewater treatment plants and municipal solid waste landfills. Our QMRA approach in occupational settings is novel, and certain caveats should be considered. Nonetheless, we believe it is worthy of further discussions and investigations. PMID:27447658

  5. Quantitative risk assessment for the induction of allergic contact dermatitis: uncertainty factors for mucosal exposures.

    PubMed

    Farage, Miranda A; Bjerke, Donald L; Mahony, Catherine; Blackburn, Karen L; Gerberick, G Frank

    2003-09-01

    The quantitative risk assessment (QRA) paradigm has been extended to evaluating the risk of induction of allergic contact dermatitis from consumer products. Sensitization QRA compares product-related, topical exposures to a safe benchmark, the sensitization reference dose. The latter is based on an experimentally or clinically determined 'no observable adverse effect level' (NOAEL) and further refined by incorporating 'sensitization uncertainty factors' (SUFs) that address variables not adequately reflected in the data from which the threshold NOAEL was derived. A critical area of uncertainty for the risk assessment of oral care or feminine hygiene products is the extrapolation from skin to mucosal exposures. Most sensitization data are derived from skin contact, but the permeability of vulvovaginal and oral mucosae is greater than that of keratinized skin. Consequently, the QRA for some personal products that are exposed to mucosal tissue may require the use of more conservative SUFs. This article reviews the scientific basis for SUFs applied to topical exposure to vulvovaginal and oral mucosae. We propose a 20-fold range in the default uncertainty factor used in the contact sensitization QRA when extrapolating from data derived from the skin to situations involving exposure to non-keratinized mucosal tissue.

  6. Fully automated quantitative analysis of breast cancer risk in DCE-MR images

    NASA Astrophysics Data System (ADS)

    Jiang, Luan; Hu, Xiaoxin; Gu, Yajia; Li, Qiang

    2015-03-01

    Amount of fibroglandular tissue (FGT) and background parenchymal enhancement (BPE) in dynamic contrast-enhanced magnetic resonance (DCE-MR) images are two important indices for breast cancer risk assessment in clinical practice. The purpose of this study was to develop and evaluate a fully automated scheme for quantitative analysis of FGT and BPE in DCE-MR images. Our fully automated method consists of three steps, i.e., segmentation of the whole breast, fibroglandular tissues, and enhanced fibroglandular tissues. Based on the volume of interest extracted automatically, a dynamic programming method was applied in each 2-D slice of a 3-D MR scan to delineate the chest wall and breast skin line for segmenting the whole breast. This step took advantage of the continuity of the chest wall and breast skin line across adjacent slices. We then used a fuzzy c-means clustering method with automatic selection of the cluster number to segment the fibroglandular tissues within the segmented whole breast area. Finally, a statistical method was used to set a threshold based on the estimated noise level for segmenting the enhanced fibroglandular tissues in the subtraction images of pre- and post-contrast MR scans. Based on the segmented whole breast, fibroglandular tissues, and enhanced fibroglandular tissues, FGT and BPE were computed automatically. Preliminary results of technical evaluation and clinical validation showed that our fully automated scheme could obtain good segmentation of the whole breast, fibroglandular tissues, and enhanced fibroglandular tissues, achieving accurate assessment of FGT and BPE for quantitative analysis of breast cancer risk.
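
    The clustering step can be sketched with a minimal fuzzy c-means implementation. The toy one-dimensional intensities below are invented, and the study's automatic selection of the cluster number is omitted; this shows only the core update rules.

```python
import numpy as np

def fuzzy_cmeans(x, n_clusters, m=2.0, n_iter=100, seed=0):
    """Minimal fuzzy c-means: returns cluster centers and soft memberships.

    x is an (n_samples, n_features) array; m > 1 is the fuzzifier.
    """
    rng = np.random.default_rng(seed)
    u = rng.random((len(x), n_clusters))
    u /= u.sum(axis=1, keepdims=True)              # memberships sum to 1 per sample
    for _ in range(n_iter):
        um = u ** m
        centers = (um.T @ x) / um.sum(axis=0)[:, None]   # weighted means
        d = np.linalg.norm(x[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        inv = d ** (-2.0 / (m - 1.0))
        u = inv / inv.sum(axis=1, keepdims=True)         # standard FCM update
    return centers, u

# Toy 1-D "intensity" data: fat-like voxels near 1.0, fibroglandular-like near 9.0.
rng = np.random.default_rng(3)
intens = np.concatenate([rng.normal(1.0, 0.3, 200), rng.normal(9.0, 0.3, 200)])
centers, u = fuzzy_cmeans(intens[:, None], 2)
```

    The soft memberships, rather than hard labels, are what make fuzzy c-means tolerant of the partial-volume voxels at tissue boundaries that are common in breast MR.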

  7. INCORPORATION OF MOLECULAR ENDPOINTS INTO QUANTITATIVE RISK ASSESSMENT

    EPA Science Inventory

    The U.S. Environmental Protection Agency has recently released its Guidelines for Carcinogen Risk Assessment. These new guidelines benefit from the significant progress that has been made in understanding the cancer process and also from the more than 20 years experience that EPA...

  8. Evaluation of Slit Sampler in Quantitative Studies of Bacterial Aerosols

    PubMed Central

    Ehrlich, Richard; Miller, Sol; Idoine, L. S.

    1966-01-01

    Quantitative studies were conducted to evaluate the efficiency of the slit sampler in collecting airborne Serratia marcescens and Bacillus subtilis var. niger, and to compare it with the collecting efficiency of the all-glass impinger AGI-30. The slit sampler was approximately 50% less efficient than the AGI-30. This ratio remained the same whether liquid or dry cultures were disseminated when the sample was taken at 2 min of aerosol cloud life. At 30 min of aerosol cloud life, this ratio was approximately 30% for B. subtilis var. niger. S. marcescens recoveries by the slit sampler were, however, only 17% lower than the AGI-30 at 30 min of cloud age, indicating a possible interaction involving the more labile vegetative cells, aerosol age, and method of collection. PMID:4961550

  9. Quantitative surface evaluation by matching experimental and simulated ronchigram images

    NASA Astrophysics Data System (ADS)

    Kantún Montiel, Juana Rosaura; Cordero Dávila, Alberto; González García, Jorge

    2011-09-01

    To estimate surface errors qualitatively with the Ronchi test, experimental and simulated ronchigrams are compared. Recently, surface errors have been obtained quantitatively by matching the intersection coordinates of ronchigram fringes with the x-axis. In that case, a Gaussian fit must be done for each fringe, and interference orders are used in the Malacara algorithm for the simulations. In order to evaluate surface errors, we added an error function, described with cubic splines, to the sagitta function of the ideal surface in the simulations. We used the vectorial transversal aberration formula and a ruling with cosinusoidal transmittance, because such rulings better reproduce experimental ronchigram fringe profiles. Several error functions are tried until the whole experimental ronchigram image is reproduced. The optimization process was done using genetic algorithms.

  10. Quantitative Evaluation of the Environmental Impact Quotient (EIQ) for Comparing Herbicides.

    PubMed

    Kniss, Andrew R; Coburn, Carl W

    2015-01-01

    Various indicators of pesticide environmental risk have been proposed, and one of the most widely known and used is the environmental impact quotient (EIQ). The EIQ has been criticized in the past, but it continues to be used regularly in the weed science literature. The EIQ is typically considered an improvement over simply comparing the amount of herbicides applied by weight. Herbicides are treated differently from other pesticide groups when calculating the EIQ, and therefore, it is important to understand how different risk factors affect the EIQ for herbicides. The purpose of this work was to evaluate the suitability of the EIQ as an environmental indicator for herbicides. Simulation analysis was conducted to quantify the relative sensitivity of the EIQ to changes in risk factors, and actual herbicide EIQ values were used to quantify the impact of herbicide application rate on the EIQ Field Use Rating. Herbicide use rate was highly correlated with the EIQ Field Use Rating (Spearman's rho >0.96, P-value <0.001) for two herbicide datasets. Two important risk factors for herbicides, leaching and surface runoff potential, are included in the EIQ calculation but explain less than 1% of total variation in the EIQ. Plant surface half-life was the risk factor with the greatest relative influence on herbicide EIQ, explaining 26 to 28% of the total variation in EIQ for actual and simulated EIQ values, respectively. For herbicides, the plant surface half-life risk factor is assigned values without any supporting quantitative data, and can result in EIQ estimates that are contrary to quantitative risk estimates for some herbicides. In its current form, the EIQ is a poor measure of herbicide environmental impact.
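
    The EIQ Field Use Rating discussed above is the product of the EIQ, the fraction of active ingredient, and the application rate, which is why use rate dominates it. A small sketch with hypothetical herbicides (all values invented):

```python
def field_use_rating(eiq, fraction_ai, rate_lb_per_acre):
    """EIQ Field Use Rating = EIQ x fraction active ingredient x use rate."""
    return eiq * fraction_ai * rate_lb_per_acre

# Hypothetical herbicides: (EIQ, fraction a.i. in product, lb product/acre).
herbicides = {
    "low_rate":  (25.0, 0.5, 0.05),  # potent low-rate product
    "high_rate": (20.0, 0.5, 2.0),   # older high-rate product
}
ratings = {name: field_use_rating(*v) for name, v in herbicides.items()}
```

    Although the two products have similar EIQs, their Field Use Ratings differ by more than an order of magnitude purely because of use rate, illustrating the correlation the paper reports.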

  11. Quantitative Evaluation of the Environmental Impact Quotient (EIQ) for Comparing Herbicides

    PubMed Central

    Kniss, Andrew R.; Coburn, Carl W.

    2015-01-01

    Various indicators of pesticide environmental risk have been proposed, and one of the most widely known and used is the environmental impact quotient (EIQ). The EIQ has been criticized by others in the past, but it continues to be used regularly in the weed science literature. The EIQ is typically considered an improvement over simply comparing the amount of herbicides applied by weight. Herbicides are treated differently compared to other pesticide groups when calculating the EIQ, and therefore, it is important to understand how different risk factors affect the EIQ for herbicides. The purpose of this work was to evaluate the suitability of the EIQ as an environmental indicator for herbicides. Simulation analysis was conducted to quantify relative sensitivity of the EIQ to changes in risk factors, and actual herbicide EIQ values were used to quantify the impact of herbicide application rate on the EIQ Field Use Rating. Herbicide use rate was highly correlated with the EIQ Field Use Rating (Spearman’s rho >0.96, P-value <0.001) for two herbicide datasets. Two important risk factors for herbicides, leaching and surface runoff potential, are included in the EIQ calculation but explain less than 1% of total variation in the EIQ. Plant surface half-life was the risk factor with the greatest relative influence on herbicide EIQ, explaining 26 to 28% of the total variation in EIQ for actual and simulated EIQ values, respectively. For herbicides, the plant surface half-life risk factor is assigned values without any supporting quantitative data, and can result in EIQ estimates that are contrary to quantitative risk estimates for some herbicides. In its current form, the EIQ is a poor measure of herbicide environmental impact. PMID:26121252

  12. A quantitative evaluation of the public response to climate engineering

    NASA Astrophysics Data System (ADS)

    Wright, Malcolm J.; Teagle, Damon A. H.; Feetham, Pamela M.

    2014-02-01

    Atmospheric greenhouse gas concentrations continue to increase, with CO2 passing 400 parts per million in May 2013. To avoid severe climate change and the attendant economic and social dislocation, existing energy efficiency and emissions control initiatives may need support from some form of climate engineering. As climate engineering will be controversial, there is a pressing need to inform the public and understand their concerns before policy decisions are taken. So far, engagement has been exploratory, small-scale or technique-specific. We depart from past research to draw on the associative methods used by corporations to evaluate brands. A systematic, quantitative and comparative approach for evaluating public reaction to climate engineering is developed. Its application reveals that the overall public evaluation of climate engineering is negative. Where there are positive associations they favour carbon dioxide removal (CDR) over solar radiation management (SRM) techniques. Therefore, as SRM techniques become more widely known they are more likely to elicit negative reactions. Two climate engineering techniques, enhanced weathering and cloud brightening, have indistinct concept images and so are less likely to draw public attention than other CDR or SRM techniques.

  13. Quantitative Evaluation and Selection of Reference Genes for Quantitative RT-PCR in Mouse Acute Pancreatitis

    PubMed Central

    Yan, Zhaoping; Gao, Jinhang; Lv, Xiuhe; Yang, Wenjuan; Wen, Shilei; Tong, Huan; Tang, Chengwei

    2016-01-01

    The analysis of differences in gene expression is dependent on normalization using reference genes. However, the expression of many of these reference genes, as evaluated by quantitative RT-PCR, is upregulated in acute pancreatitis, so they cannot be used as the standard for gene expression in this condition. For this reason, we sought to identify a stable reference gene, or a suitable combination, for expression analysis in acute pancreatitis. The expression stability of 10 reference genes (ACTB, GAPDH, 18sRNA, TUBB, B2M, HPRT1, UBC, YWHAZ, EF-1α, and RPL-13A) was analyzed using geNorm, NormFinder, and BestKeeper software and evaluated according to variations in the raw Ct values. These reference genes were evaluated using a comprehensive method, which ranked the expression stability of these genes as follows (from most stable to least stable): RPL-13A, YWHAZ > HPRT1 > GAPDH > UBC > EF-1α > 18sRNA > B2M > TUBB > ACTB. RPL-13A was the most suitable reference gene, and the combination of RPL-13A and YWHAZ was the most stable group of reference genes in our experiments. The expression levels of ACTB, TUBB, and B2M were found to be significantly upregulated during acute pancreatitis, whereas the expression level of 18sRNA was downregulated. Thus, we recommend the use of RPL-13A or a combination of RPL-13A and YWHAZ for normalization in qRT-PCR analyses of gene expression in mouse models of acute pancreatitis. PMID:27069927
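
    The simplest of the stability criteria mentioned above, variation in raw Ct values, can be sketched as follows. The gene names come from the study but the Ct values are invented; geNorm, NormFinder, and BestKeeper use more elaborate pairwise or model-based stability measures.

```python
import statistics

# Hypothetical raw Ct values across treatment/control samples; a stable
# reference gene varies little between samples.
ct = {
    "RPL-13A": [18.1, 18.2, 18.0, 18.15],
    "GAPDH":   [16.0, 17.5, 16.8, 18.0],
    "ACTB":    [15.0, 19.0, 17.5, 21.0],
}

def rank_by_stability(ct_values):
    """Rank genes from most to least stable by SD of raw Ct values."""
    return sorted(ct_values, key=lambda g: statistics.stdev(ct_values[g]))

ranking = rank_by_stability(ct)
```

    Under this criterion the invented ACTB values, like the upregulated ACTB in the study, would fall to the bottom of the ranking and be rejected as a normalizer.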

  14. Quantitative genetic activity graphical profiles for use in chemical evaluation

    SciTech Connect

    Waters, M.D.; Stack, H.F.; Garrett, N.E.; Jackson, M.A.

    1990-12-31

    A graphic approach, termed a Genetic Activity Profile (GAP), was developed to display a matrix of data on the genetic and related effects of selected chemical agents. The profiles provide a visual overview of the quantitative (doses) and qualitative (test results) data for each chemical. Either the lowest effective dose or highest ineffective dose is recorded for each agent and bioassay. Up to 200 different test systems are represented across the GAP. Bioassay systems are organized according to the phylogeny of the test organisms and the end points of genetic activity. The methodology for producing and evaluating genetic activity profiles was developed in collaboration with the International Agency for Research on Cancer (IARC). Data on individual chemicals were compiled by IARC and by the US Environmental Protection Agency (EPA). Data are available on 343 compounds selected from volumes 1-53 of the IARC Monographs and on 115 compounds identified as Superfund Priority Substances. Software to display the GAPs on an IBM-compatible personal computer is available from the authors. Structurally similar compounds frequently display qualitatively and quantitatively similar profiles of genetic activity. Through examination of the patterns of GAPs of pairs and groups of chemicals, it is possible to make more informed decisions regarding the selection of test batteries to be used in evaluation of chemical analogs. GAPs provide useful data for development of weight-of-evidence hazard ranking schemes. Also, some knowledge of the potential genetic activity of complex environmental mixtures may be gained from an assessment of the genetic activity profiles of component chemicals. The fundamental techniques and computer programs devised for the GAP database may be used to develop similar databases in other disciplines. 36 refs., 2 figs.

  15. [Clinical evaluation of a novel HBsAg quantitative assay].

    PubMed

    Takagi, Kazumi; Tanaka, Yasuhito; Naganuma, Hatsue; Hiramatsu, Kumiko; Iida, Takayasu; Takasaka, Yoshimitsu; Mizokami, Masashi

    2007-07-01

    The clinical implication of the hepatitis B surface antigen (HBsAg) concentrations in HBV-infected individuals remains unclear. The aim of this study was to evaluate a novel fully automated Chemiluminescence Enzyme Immunoassay (Sysmex HBsAg quantitative assay) by comparative measurements of the reference serum samples versus two independent commercial assays (Lumipulse f or Architect HBsAg QT). Furthermore, clinical usefulness was assessed for monitoring of the serum HBsAg levels during antiviral therapy. A dilution test using 5 reference-serum samples showed a linear correlation curve in the range from 0.03 to 2,360 IU/ml. The HBsAg was measured in a total of 400 serum samples and 99.8% had consistent results between Sysmex and Lumipulse f. Additionally, a positive linear correlation was observed between Sysmex and Architect. To compare the Architect and Sysmex, both methods were applied to quantify the HBsAg in serum samples with different HBV genotypes/subgenotypes, as well as in sera containing HBV vaccine escape mutants (126S, 145R). Correlation between the methods was observed in results for escape mutants and common genotypes (A, B, C) in Japan. During lamivudine therapy, an increase in HBsAg and HBV DNA concentrations preceded the alanine aminotransferase (ALT) elevation associated with drug-resistant HBV variant emergence (breakthrough hepatitis). In conclusion, reliability of the Sysmex HBsAg quantitative assay was confirmed for all HBV genetic variants common in Japan. Monitoring of serum HBsAg concentrations, in addition to HBV DNA quantification, is helpful in evaluation of the response to lamivudine treatment and diagnosis of breakthrough hepatitis.

  16. An integrated environmental modeling framework for performing quantitative microbial risk assessments

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Standardized methods are often used to assess the likelihood of a human-health effect from exposure to a specified hazard, and inform opinions and decisions about risk management and communication. A Quantitative Microbial Risk Assessment (QMRA) is specifically adapted to detail potential human-heal...

  17. An integrated environmental modeling framework for performing Quantitative Microbial Risk Assessments

    EPA Science Inventory

    Standardized methods are often used to assess the likelihood of a human-health effect from exposure to a specified hazard, and inform opinions and decisions about risk management and communication. A Quantitative Microbial Risk Assessment (QMRA) is specifically adapted to detail ...

  18. A quantitative microbial risk assessment for center pivot irrigation of dairy wastewaters

    Technology Transfer Automated Retrieval System (TEKTRAN)

    In the western United States where livestock wastewaters are commonly land applied, there are concerns over individuals being exposed to airborne pathogens. In response, a quantitative microbial risk assessment (QMRA) was performed to estimate infectious risks from inhaling pathogens aerosolized dur...

  19. Evaluation of a virucidal quantitative carrier test for surface disinfectants.

    PubMed

    Rabenau, Holger F; Steinmann, Jochen; Rapp, Ingrid; Schwebke, Ingeborg; Eggers, Maren

    2014-01-01

    Surface disinfectants are part of broader preventive strategies preventing the transmission of bacteria, fungi and viruses in medical institutions. To evaluate their virucidal efficacy, these products must be tested with appropriate model viruses with different physico-chemical properties under conditions representing practical application in hospitals. The aim of this study was to evaluate a quantitative carrier assay. Furthermore, different putative model viruses like adenovirus type 5 (AdV-5) and different animal parvoviruses were evaluated with respect to their tenacity and practicability in laboratory handling. To evaluate the robustness of the method, some of the viruses were tested in parallel in different laboratories in a multi-center study. Different biocides, which are common active ingredients of surface disinfectants, were used in the test. After drying on stainless steel discs as the carrier, model viruses were exposed to different concentrations of three alcohols, peracetic acid (PAA) or glutaraldehyde (GDA), with a fixed exposure time of 5 minutes. Residual virus was determined after treatment by endpoint titration. All parvoviruses exhibited a similar stability with respect to GDA, while AdV-5 was more susceptible. For PAA, the porcine parvovirus was more sensitive than the other parvoviruses, and again, AdV-5 presented a higher susceptibility than the parvoviruses. All parvoviruses were resistant to alcohols, while AdV-5 was only stable when treated with 2-propanol. The analysis of the results of the multi-center study showed a high reproducibility of this test system. In conclusion, two viruses with different physico-chemical properties can be recommended as appropriate model viruses for the evaluation of the virucidal efficacy of surface disinfectants: AdV-5, which has a high clinical impact, and murine parvovirus (MVM) with the highest practicability among the parvoviruses tested.
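
    The endpoint-titration readout of such a carrier test reduces to a log10 reduction factor. A minimal sketch; the 4-log pass threshold is an assumption drawn from common European virucidal test guidance, not stated in this abstract, and the titres are invented.

```python
import math

def log10_reduction(control_titer, treated_titer):
    """Reduction factor from endpoint titres (e.g. TCID50/mL) of dried
    carrier virus with and without disinfectant exposure."""
    return math.log10(control_titer) - math.log10(treated_titer)

def sufficient_virucidal_activity(reduction_factor, threshold=4.0):
    """Assumed pass criterion: at least a 4 log10 titre reduction."""
    return reduction_factor >= threshold

# Invented example: 7-log control titre falls to 2 logs after 5 min exposure.
rf = log10_reduction(1e7, 1e2)
```

    Comparing the reduction factors obtained for AdV-5 and the parvoviruses at each biocide concentration is what yields the susceptibility ranking reported above.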

  20. Evaluation of a virucidal quantitative carrier test for surface disinfectants.

    PubMed

    Rabenau, Holger F; Steinmann, Jochen; Rapp, Ingrid; Schwebke, Ingeborg; Eggers, Maren

    2014-01-01

    Surface disinfectants are part of broader preventive strategies preventing the transmission of bacteria, fungi and viruses in medical institutions. To evaluate their virucidal efficacy, these products must be tested with appropriate model viruses with different physico-chemical properties under conditions representing practical application in hospitals. The aim of this study was to evaluate a quantitative carrier assay. Furthermore, different putative model viruses like adenovirus type 5 (AdV-5) and different animal parvoviruses were evaluated with respect to their tenacity and practicability in laboratory handling. To evaluate the robustness of the method, some of the viruses were tested in parallel in different laboratories in a multi-center study. Different biocides, which are common active ingredients of surface disinfectants, were used in the test. After drying on stainless steel discs as the carrier, model viruses were exposed to different concentrations of three alcohols, peracetic acid (PAA) or glutaraldehyde (GDA), with a fixed exposure time of 5 minutes. Residual virus was determined after treatment by endpoint titration. All parvoviruses exhibited a similar stability with respect to GDA, while AdV-5 was more susceptible. For PAA, the porcine parvovirus was more sensitive than the other parvoviruses, and again, AdV-5 presented a higher susceptibility than the parvoviruses. All parvoviruses were resistant to alcohols, while AdV-5 was only stable when treated with 2-propanol. The analysis of the results of the multi-center study showed a high reproducibility of this test system. In conclusion, two viruses with different physico-chemical properties can be recommended as appropriate model viruses for the evaluation of the virucidal efficacy of surface disinfectants: AdV-5, which has a high clinical impact, and murine parvovirus (MVM) with the highest practicability among the parvoviruses tested. PMID:24475079

  1. [Cardiovascular risk and cardiometabolic risk: an epidemiological evaluation].

    PubMed

    Vanuzzo, Diego; Pilotto, Lorenza; Mirolo, Renata; Pirelli, Salvatore

    2008-04-01

    Cardiometabolic risk factors were considered those "closely related to diabetes and cardiovascular disease: fasting/postprandial hyperglycemia, overweight/obesity, elevated systolic and diastolic blood pressure, and dyslipidemia". The association among the cardiometabolic risk factors has been known for a long time, and much of their etiology has been ascribed to insulin resistance. The fact that these "metabolic" abnormalities cluster in many individuals also gave rise to the term "metabolic syndrome", a construct embraced by many organizations but questioned by other authors. From an epidemiological point of view, the metabolic syndrome seems to increase cardiovascular risk only modestly, whereas in non-diabetic individuals it predicts diabetes much more efficiently. Many studies have compared the performance of the classical cardiovascular evaluation tools (the Framingham risk score, the SCORE charts, the Progetto CUORE score) and the metabolic syndrome in cardiovascular disease prediction. Usually, in people at high risk the presence of the metabolic syndrome does not improve the risk estimate, whereas in people at lower risk its presence significantly increases the chances of cardiovascular disease. Many studies have shown that positive lifestyle interventions markedly reduce the rate of progression to type 2 diabetes. Some drugs were also tested for diabetes prevention, usually in people with impaired glucose tolerance. Oral diabetes drugs considered together (acarbose, metformin, flumamine, glipizide, phenformin) were less effective than lifestyle interventions, with different results among the drugs; the antiobesity drug orlistat gave results similar to lifestyle interventions.
In Italy an appropriate approach to cardiovascular disease and diabetes prevention may be that of first evaluating the global cardiovascular risk using the charts or the score software of the Progetto CUORE, because high-risk subjects (≥20%) must be treated aggressively independently of the presence of

  2. Quantitative Risk Modeling of Fire on the International Space Station

    NASA Technical Reports Server (NTRS)

    Castillo, Theresa; Haught, Megan

    2014-01-01

    The International Space Station (ISS) Program has worked to prevent fire events and to mitigate their impacts should they occur. Hardware is designed to reduce sources of ignition, oxygen systems are designed to control leaking, flammable materials are prevented from flying to ISS whenever possible, the crew is trained in fire response, and fire response equipment improvements are sought out and funded. Fire prevention and mitigation are a top ISS Program priority - however, programmatic resources are limited; thus, risk trades are made to ensure an adequate level of safety is maintained onboard the ISS. In support of these risk trades, the ISS Probabilistic Risk Assessment (PRA) team has modeled the likelihood of fire occurring in the ISS pressurized cabin, a phenomenological event that has never before been probabilistically modeled in a microgravity environment. This paper will discuss the genesis of the ISS PRA fire model, its enhancement in collaboration with fire experts, and the results which have informed ISS programmatic decisions and will continue to be used throughout the life of the program.

  3. THE NEGLECTED SIDE OF THE COIN: QUANTITATIVE BENEFIT-RISK ANALYSES IN MEDICAL IMAGING

    PubMed Central

    Zanzonico, Pat B.

    2016-01-01

    While it is implicitly recognized that the benefits of diagnostic imaging far outweigh any theoretical radiogenic risks, quantitative estimates of the benefits are rarely, if ever, juxtaposed with quantitative estimates of risk. This alone - expression of benefit in purely qualitative terms versus expression of risk in quantitative, and therefore seemingly more certain, terms - may well contribute to a skewed sense of the relative benefits and risks of diagnostic imaging among healthcare providers as well as patients. The current paper, therefore, briefly compares the benefits of diagnostic imaging in several cases, based on actual mortality or morbidity data if ionizing radiation were not employed, with theoretical estimates of radiogenic cancer mortality based on the “linear no-threshold” (LNT) dose-response model. PMID:26808890
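
    The kind of juxtaposition the paper argues for reduces to simple arithmetic. A minimal sketch, assuming a nominal ICRP-style mortality coefficient of about 5% per sievert and an illustrative, not source-derived, benefit figure:

```python
# Hedged illustration: LNT-projected risk vs. a hypothetical benefit.
# LNT_MORTALITY_RISK_PER_SV is a nominal ICRP-style value; the benefit
# figure below is an illustrative placeholder, not a number from the paper.

LNT_MORTALITY_RISK_PER_SV = 0.05  # ~5% lifetime cancer mortality per sievert

def radiogenic_mortality_risk(effective_dose_msv: float) -> float:
    """Theoretical lifetime cancer mortality risk under the LNT model."""
    return (effective_dose_msv / 1000.0) * LNT_MORTALITY_RISK_PER_SV

def benefit_risk_ratio(mortality_benefit_per_exam: float,
                       effective_dose_msv: float) -> float:
    """Ratio of mortality benefit to LNT-projected radiogenic risk."""
    return mortality_benefit_per_exam / radiogenic_mortality_risk(effective_dose_msv)

# A 10 mSv CT exam against an assumed 1-in-1000 mortality benefit.
risk = radiogenic_mortality_risk(10.0)
ratio = benefit_risk_ratio(1e-3, 10.0)
```

    Under these assumptions a 10 mSv examination carries a theoretical lifetime mortality risk of 5e-4, so any per-exam mortality benefit above that figure dominates the risk.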

  5. D & D screening risk evaluation guidance

    SciTech Connect

    Robers, S.K.; Golden, K.M.; Wollert, D.A.

    1995-09-01

    The Screening Risk Evaluation (SRE) guidance document is a set of guidelines provided for the uniform implementation of SREs performed on decontamination and decommissioning (D&D) facilities. Although this method has been developed for D&D facilities, it can be used for transition (EM-60) facilities as well. The SRE guidance produces screening risk scores reflecting levels of risk through the use of risk ranking indices. Five types of possible risk are calculated from the SRE: current releases, worker exposures, future releases, physical hazards, and criticality. The Current Release Index (CRI) calculates the current risk to human health and the environment, exterior to the building, from ongoing or probable releases within a one-year time period. The Worker Exposure Index (WEI) calculates the current risk to workers, occupants and visitors inside contaminated D&D facilities due to contaminant exposure. The Future Release Index (FRI) calculates the hypothetical risk of future releases of contaminants, after one year, to human health and the environment. The Physical Hazards Index (PHI) calculates the risks to human health due to factors other than contaminants. Criticality is treated as a modifying factor to the entire SRE, because criticality issues are strictly regulated under DOE. Screening risk results are tabulated in matrix form, and Total Risk is calculated via a weighted equation to produce a score on which to base early action recommendations. Other recommendations from the screening risk scores are made based either on individual index scores or on reweighted Total Risk calculations. All recommendations based on the SRE are made from a combination of screening risk scores, decision drivers, and other considerations, as determined on a project-by-project basis.
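
    The weighted aggregation step described above can be sketched as follows; the index weights and the criticality multiplier are hypothetical placeholders, since the guidance's actual weighted equation is not reproduced here.

```python
# Sketch of an SRE-style Total Risk aggregation. The weights and the
# criticality multiplier are hypothetical, not taken from the guidance.

WEIGHTS = {"CRI": 0.30, "WEI": 0.30, "FRI": 0.20, "PHI": 0.20}

def total_risk(index_scores: dict, criticality_factor: float = 1.0) -> float:
    """Weighted Total Risk; criticality modifies the whole score."""
    base = sum(WEIGHTS[k] * index_scores[k] for k in WEIGHTS)
    return base * criticality_factor

scores = {"CRI": 4.0, "WEI": 2.0, "FRI": 3.0, "PHI": 1.0}
baseline = total_risk(scores)
with_criticality = total_risk(scores, criticality_factor=1.5)
```

    A Total Risk score above some project-defined threshold would then feed the early-action recommendation.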

  6. Nuclear medicine and imaging research (instrumentation and quantitative methods of evaluation)

    SciTech Connect

    Beck, R.N.; Cooper, M.; Chen, C.T.

    1992-07-01

    This document is the annual progress report for the project entitled 'Instrumentation and Quantitative Methods of Evaluation.' Progress is reported in separate sections, individually abstracted and indexed for the database. Subject areas include theoretical studies of imaging systems and methods, hardware developments, quantitative methods of evaluation, and knowledge transfer: education in quantitative nuclear medicine imaging.

  7. Quantitative methods for somatosensory evaluation in atypical odontalgia.

    PubMed

    Porporatti, André Luís; Costa, Yuri Martins; Stuginski-Barbosa, Juliana; Bonjardim, Leonardo Rigoldi; Conti, Paulo César Rodrigues; Svensson, Peter

    2015-01-01

    A systematic review was conducted to identify reliable somatosensory evaluation methods for atypical odontalgia (AO) patients. The computerized search included the main databases (MEDLINE, EMBASE, and Cochrane Library). The studies included used the following quantitative sensory testing (QST) methods: mechanical detection threshold (MDT), mechanical pain threshold (MPT) (pinprick), pressure pain threshold (PPT), dynamic mechanical allodynia with a cotton swab (DMA1) or a brush (DMA2), warm detection threshold (WDT), cold detection threshold (CDT), heat pain threshold (HPT), cold pain threshold (CPT), and/or wind-up ratio (WUR). The publications meeting the inclusion criteria revealed that, in AO patients, only mechanical allodynia tests (DMA1, DMA2, and WUR) yielded significantly higher values, and the heat pain threshold (HPT) significantly lower values, on the affected side compared with the contralateral side; for MDT, MPT, PPT, CDT, and WDT, the differences were not significant. These data support the presence of central sensitization features, such as allodynia and temporal summation. In contrast, considerable inconsistencies between studies were found when AO patients were compared with healthy subjects. In clinical settings, the most reliable evaluation method for AO in patients with persistent idiopathic facial pain would be intraindividual assessment using HPT or mechanical allodynia tests. PMID:25627886

  8. Quantitative evaluation of midpalatal suture maturation via fractal analysis

    PubMed Central

    Kwak, Kyoung Ho; Kim, Yong-Il; Kim, Yong-Deok

    2016-01-01

    Objective The purpose of this study was to determine whether the results of fractal analysis can be used as criteria for evaluating midpalatal suture maturation. Methods The study included 131 subjects over 18 years of age (range, 18.1–53.4 years) who underwent cone-beam computed tomography. Skeletonized images of the midpalatal suture were obtained via image processing software and used to calculate fractal dimensions. Correlations between maturation stage and fractal dimension were calculated using Spearman's correlation coefficient. Optimal fractal dimension cut-off values were determined using a receiver operating characteristic curve. Results The distribution of maturation stages of the midpalatal suture according to the cervical vertebrae maturation index was highly variable, and there was a strong negative correlation between maturation stage and fractal dimension (−0.623, p < 0.001). Fractal dimension was a statistically significant indicator of dichotomous results with regard to maturation stage (area under the curve = 0.794, p < 0.001). A test in which fractal dimension was used to predict the variable that splits maturation stages into A–C versus D or E yielded an optimal fractal dimension cut-off value of 1.0235. Conclusions There was a strong negative correlation between fractal dimension and midpalatal suture maturation. Fractal analysis is an objective quantitative method, and therefore we suggest that it may be useful for the evaluation of midpalatal suture maturation. PMID:27668195
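
    The fractal dimension itself is typically estimated by box counting on the skeletonized image. A minimal sketch of the general technique, not the study's software pipeline, using a pixel-set representation:

```python
# Box-counting estimate of fractal dimension for a skeletonized binary
# image, represented as a set of (x, y) foreground pixels. A sketch of
# the general method, not the software used in the study.
import math

def box_count(pixels, box):
    """Number of box-by-box cells containing at least one foreground pixel."""
    return len({(x // box, y // box) for (x, y) in pixels})

def fractal_dimension(pixels, size):
    """Least-squares slope of log(count) versus log(1/box_size)."""
    xs, ys = [], []
    box = size
    while box >= 1:
        xs.append(math.log(1.0 / box))
        ys.append(math.log(box_count(pixels, box)))
        box //= 2
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    return (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))

# Sanity check: a straight-line "suture" should have dimension close to 1.
line_pixels = [(i, 0) for i in range(64)]
dim = fractal_dimension(line_pixels, 64)
```

    A more tortuous, less mature suture fills more boxes at small scales and so yields a higher dimension, consistent with the negative correlation reported above.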

  10. Quantitative relations between risk, return and firm size

    NASA Astrophysics Data System (ADS)

    Podobnik, B.; Horvatic, D.; Petersen, A. M.; Stanley, H. E.

    2009-03-01

    We analyze, for a large set of stocks comprising four financial indices, the annual logarithmic growth rate R and the firm size, quantified by the market capitalization MC. For the Nasdaq Composite and the New York Stock Exchange Composite we find that the probability density functions of growth rates are Laplace ones in the broad central region, where the standard deviation σ(R), as a measure of risk, decreases with MC as a power law, σ(R) ~ (MC)^(-β). For both the Nasdaq Composite and the S&P 500, we find that the average growth rate ⟨R⟩ decreases faster than σ(R) with MC, implying that the return-to-risk ratio ⟨R⟩/σ(R) also decreases with MC. For the S&P 500, ⟨R⟩ and ⟨R⟩/σ(R) also follow power laws. For a 20-year time horizon, for the Nasdaq Composite we find that σ(R) vs. MC exhibits a functional form called a volatility smile, while for the NYSE Composite we find power-law stability between σ(R) and MC.
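
    The power-law exponent β in σ(R) ~ (MC)^(-β) is conventionally estimated as the slope of a log-log regression. A sketch on synthetic data, not the index constituents used in the study:

```python
# Estimate beta in sigma(R) ~ MC^(-beta) via least squares on log-log
# coordinates. The data below are synthetic, constructed to follow the
# power law exactly; real estimates would use binned firm data.
import math

def estimate_beta(market_caps, sigmas):
    """Negated least-squares slope of log(sigma) versus log(MC)."""
    xs = [math.log(mc) for mc in market_caps]
    ys = [math.log(s) for s in sigmas]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return -slope

caps = [10.0 ** k for k in range(6, 12)]   # market caps, 1e6 .. 1e11
sigmas = [mc ** -0.2 for mc in caps]       # exact power law with beta = 0.2
beta = estimate_beta(caps, sigmas)
```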

  11. Databases applicable to quantitative hazard/risk assessment-Towards a predictive systems toxicology

    SciTech Connect

    Waters, Michael; Jackson, Marcus

    2008-11-15

    The Workshop on The Power of Aggregated Toxicity Data addressed the requirement for distributed databases to support quantitative hazard and risk assessment. The authors have conceived and constructed with federal support several databases that have been used in hazard identification and risk assessment. The first of these databases, the EPA Gene-Tox Database was developed for the EPA Office of Toxic Substances by the Oak Ridge National Laboratory, and is currently hosted by the National Library of Medicine. This public resource is based on the collaborative evaluation, by government, academia, and industry, of short-term tests for the detection of mutagens and presumptive carcinogens. The two-phased evaluation process resulted in more than 50 peer-reviewed publications on test system performance and a qualitative database on thousands of chemicals. Subsequently, the graphic and quantitative EPA/IARC Genetic Activity Profile (GAP) Database was developed in collaboration with the International Agency for Research on Cancer (IARC). A chemical database driven by consideration of the lowest effective dose, GAP has served IARC for many years in support of hazard classification of potential human carcinogens. The Toxicological Activity Profile (TAP) prototype database was patterned after GAP and utilized acute, subchronic, and chronic data from the Office of Air Quality Planning and Standards. TAP demonstrated the flexibility of the GAP format for air toxics, water pollutants and other environmental agents. The GAP format was also applied to developmental toxicants and was modified to represent quantitative results from the rodent carcinogen bioassay. 
More recently, the authors have constructed: 1) the NIEHS Genetic Alterations in Cancer (GAC) Database which quantifies specific mutations found in cancers induced by environmental agents, and 2) the NIEHS Chemical Effects in Biological Systems (CEBS) Knowledgebase that integrates genomic and other biological data including

  12. Qualitative and quantitative evaluation of solvent systems for countercurrent separation.

    PubMed

    Friesen, J Brent; Ahmed, Sana; Pauli, Guido F

    2015-01-16

    Rational solvent system selection for countercurrent chromatography and centrifugal partition chromatography technology (collectively known as countercurrent separation) studies continues to be a scientific challenge, as the fundamental questions of comparing polarity range and selectivity within a solvent system family and between putative orthogonal solvent systems remain unanswered. The current emphasis on metabolomic investigations and analysis of complex mixtures necessitates the use of successive orthogonal countercurrent separation (CS) steps as part of complex fractionation protocols. Addressing the broad range of metabolite polarities demands development of new CS solvent systems with appropriate composition, polarity (π), selectivity (σ), and suitability. In this study, a mixture of twenty commercially available natural products, called the GUESSmix, was utilized to evaluate both solvent system polarity and selectivity characteristics. Comparisons of GUESSmix analyte partition coefficient (K) values give rise to a measure of solvent system polarity range called the GUESSmix polarity index (GUPI). Solvatochromic dye and electrical permittivity measurements were also evaluated in quantitatively assessing solvent system polarity. The relative selectivity of solvent systems was evaluated with the GUESSmix by calculating the pairwise resolution (αip), the number of analytes found in the sweet spot (Nsw), and the pairwise resolution of those sweet spot analytes (αsw). The combination of these parameters allowed for both intra- and inter-family comparison of solvent system selectivity. Finally, 2-dimensional reciprocal shifted symmetry plots (ReSS(2)) were created to visually compare both the polarities and selectivities of solvent system pairs. This study helps to pave the way to the development of new solvent systems that are amenable to successive orthogonal CS protocols employed in metabolomic studies. PMID:25542704
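
    The Nsw statistic reduces to counting analytes whose partition coefficient falls inside the sweet spot. A sketch, assuming the commonly cited 0.4 ≤ K ≤ 2.5 window; the paper's exact bounds are not reproduced here:

```python
# Count GUESSmix analytes in the sweet spot for one solvent system.
# The 0.4 <= K <= 2.5 window is an assumed range from the countercurrent
# separation literature, not necessarily the bounds used in the paper.

def sweet_spot_count(k_values, lo=0.4, hi=2.5):
    """Nsw: number of analytes whose partition coefficient K is in [lo, hi]."""
    return sum(1 for k in k_values if lo <= k <= hi)

k_values = [0.05, 0.5, 1.0, 2.0, 3.2, 8.0]   # illustrative K values
n_sw = sweet_spot_count(k_values)
```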

  13. [Using the evaluation of carcinogenic risk in the mining and metallurgical enterprises of the Arctic].

    PubMed

    Serebriakov, P V

    2012-01-01

    The aim of this study was the hygienic assessment of the contribution of working-environment factors to the formation of carcinogenic risk at the mining and metallurgical enterprises of the Far North, the establishment of the structural features of cancer pathology among workers of these enterprises, and the quantitative evaluation of individual occupational cancer risk for different nosological forms and morphological variants of malignant neoplasms.

  14. Comparing models for quantitative risk assessment: an application to the European Registry of foreign body injuries in children.

    PubMed

    Berchialla, Paola; Scarinzi, Cecilia; Snidero, Silvia; Gregori, Dario

    2016-08-01

    Risk assessment is the systematic study of decisions subject to uncertain consequences. Increasing interest has focused on modeling techniques such as Bayesian Networks because of their capability of (1) combining, in a probabilistic framework, different types of evidence, including both expert judgments and objective data; (2) overturning previous beliefs in the light of new information being received; and (3) making predictions even with incomplete data. In this work, we propose a comparison among Bayesian Networks and other classical Quantitative Risk Assessment techniques such as Neural Networks, Classification Trees, Random Forests and Logistic Regression models. Hybrid approaches, combining both Classification Trees and Bayesian Networks, were also considered. Among Bayesian Networks, a clear distinction is made between the purely data-driven approach and the combination of expert knowledge with objective data. The aim of this paper is to evaluate which of these models can best be applied, in the framework of Quantitative Risk Assessment, to assess the safety of children who are exposed to the risk of inhalation/insertion/aspiration of consumer products. The issue of preventing injuries in children is of paramount importance, in particular where product design is involved: quantifying the risk associated with product characteristics can be of great usefulness in addressing product safety design regulation. Data from the European Registry of Foreign Body Injuries formed the starting evidence for risk assessment. Results showed that Bayesian Networks offered both ease of interpretability and accuracy in making predictions, even if simpler models like logistic regression still performed well.
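
    Whatever the model family, a comparison of this kind ultimately rests on a common discrimination metric. A sketch of a rank-based ROC AUC applied to illustrative scores; the model fitting itself is omitted:

```python
# Rank-based ROC AUC (Mann-Whitney formulation): the probability that a
# randomly chosen positive case outscores a randomly chosen negative one,
# with half credit for ties. Scores below are illustrative, not fitted
# Bayesian-network or logistic-regression outputs.

def roc_auc(labels, scores):
    """AUC = P(score_pos > score_neg), ties counted as 0.5."""
    pos = [s for l, s in zip(labels, scores) if l == 1]
    neg = [s for l, s in zip(labels, scores) if l == 0]
    wins = 0.0
    for p in pos:
        for q in neg:
            if p > q:
                wins += 1.0
            elif p == q:
                wins += 0.5
    return wins / (len(pos) * len(neg))

labels = [1, 1, 1, 0, 0, 0]
model_a = [0.9, 0.8, 0.4, 0.5, 0.3, 0.1]   # one ranking inversion
model_b = [0.9, 0.8, 0.7, 0.5, 0.3, 0.1]   # perfect ranking
auc_a = roc_auc(labels, model_a)
auc_b = roc_auc(labels, model_b)
```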

  15. Quantitative assessment of direct and indirect landslide risk along transportation lines in southern India

    NASA Astrophysics Data System (ADS)

    Jaiswal, P.; van Westen, C. J.; Jetten, V.

    2010-06-01

    A quantitative approach for landslide risk assessment along transportation lines is presented and applied to a road and a railway alignment in the Nilgiri hills in southern India. The method allows estimating direct risk affecting the alignments, vehicles and people, and indirect risk resulting from the disruption of economic activities. The data required for the risk estimation were obtained from historical records. A total of 901 landslides were catalogued, initiating from cut slopes along the railway and road alignment. The landslides were grouped into three magnitude classes based on the landslide type, volume, scar depth, run-out distance, etc., and their probability of occurrence was obtained using a frequency-volume distribution. Hazard, for a given return period, expressed as the number of landslides of a given magnitude class per kilometre of cut slopes, was obtained using a Gumbel distribution and the probability of landslide magnitude. In total, 18 specific hazard scenarios were generated using the three magnitude classes and six return periods (1, 3, 5, 15, 25, and 50 years). The assessment of the vulnerability of the road and railway line was based on damage records, whereas the vulnerability of different types of vehicles and people was subjectively assessed based on limited historic incidents. Direct specific loss for the alignments (railway line and road) and vehicles (train, bus, lorry, car and motorbike) was expressed in monetary value (US$), and direct specific loss of life of commuters was expressed in annual probability of death. Indirect specific loss (US$) derived from traffic interruption was evaluated considering alternative driving routes, and includes losses resulting from additional fuel consumption, additional travel cost, loss of income to local business, and loss of revenue to the railway department.
    The results indicate that the total loss, including both direct and indirect loss, from the 1- to 50-year return periods, varies from US$ 90 840 to US$
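
    The specific-risk bookkeeping described above multiplies, for each scenario, a hazard rate by vulnerability and exposed value. A sketch with illustrative placeholder numbers, not values from the Nilgiri inventory:

```python
# Per-scenario direct risk = hazard x vulnerability x exposed value,
# annualized via the return period. All numbers are illustrative
# placeholders, not data from the study.

def annual_exceedance_probability(return_period_years: float) -> float:
    """Approximate annual probability for a scenario of a given return period."""
    return 1.0 / return_period_years

def specific_risk(n_landslides_per_km, vulnerability, exposed_value, km):
    """Direct loss for one scenario along an alignment of given length."""
    return n_landslides_per_km * km * vulnerability * exposed_value

# Illustrative: 0.5 landslides/km for a 5-year scenario, 10 km of road,
# 20% damage to an exposed value of US$ 50,000 per event.
loss_5yr = specific_risk(0.5, 0.2, 50_000.0, 10.0)
annualized = loss_5yr * annual_exceedance_probability(5)
```

    Summing such terms over all 18 scenarios, plus the indirect-loss terms, gives the total-loss figures quoted above.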

  16. Evaluating Alternative Risk Adjusters for Medicare.

    PubMed

    Pope, Gregory C; Adamache, Killard W; Walsh, Edith G; Khandker, Rezaul K

    1998-01-01

    In this study the authors use 3 years of the Medicare Current Beneficiary Survey (MCBS) to evaluate alternative demographic, survey, and claims-based risk adjusters for Medicare capitation payment. The survey health-status models have three to four times the predictive power of the demographic models. The risk-adjustment model derived from claims diagnoses has 75-percent greater predictive power than a comprehensive survey model. No single model predicts average expenditures well for all beneficiary subgroups of interest, suggesting a combined model may be appropriate. More data are needed to obtain stable estimates of model parameters. Advantages and disadvantages of alternative risk adjusters are discussed.

  17. Quantitative risk-based approach for improving water quality management in mining.

    PubMed

    Liu, Wenying; Moran, Chris J; Vink, Sue

    2011-09-01

    The potential environmental threats posed by freshwater withdrawal and mine water discharge are some of the main drivers for the mining industry to improve water management. The use of multiple sources of water supply and the introduction of water reuse into the mine site water system have been part of the operating philosophies employed by the mining industry to realize these improvements. However, a barrier to implementation of such good water management practices is the concomitant water quality variation and the resulting impacts on the efficiency of mineral separation processes, as well as an increased environmental consequence of noncompliant discharge events. There is an increasing appreciation that conservative water management practices, production efficiency, and environmental consequences are intimately linked through the site water system. It is therefore essential to consider water management decisions and their impacts as an integrated system, as opposed to dealing with each impact separately. This paper proposes an approach that could assist mine sites in managing water quality issues in a systematic manner at the system level. This approach can quantitatively forecast the risk related to water quality and evaluate the effectiveness of management strategies in mitigating that risk by quantifying the implications for production and hence economic viability. PMID:21797262

  19. Quantitative Integration of Ndt with Probabilistic Fracture Mechanics for the Assessment of Fracture Risk in Pipelines

    NASA Astrophysics Data System (ADS)

    Kurz, J. H.; Cioclov, D.; Dobmann, G.; Boller, C.

    2010-02-01

    In the context of the probabilistic paradigm of fracture risk assessment in structural components, a computer simulation rationale is presented which is based on the integration of Quantitative Non-destructive Inspection and Probabilistic Fracture Mechanics. In this study, failure under static loading is assessed in the format known as the Failure Assessment Diagram (FAD). The fracture risk is evaluated in probabilistic terms. The probabilistic pattern superposed over the deterministic one is implemented via Monte Carlo sampling. The probabilistic fracture simulation yields a more informative analysis in terms of probability of failure. The ability to simulate the influence of the quality and reliability of non-destructive inspection (NDI) is an important feature of this approach. It is achieved by algorithmically integrating probabilistic FAD analysis and the Probability of Detection (POD). The POD information can only be applied in a probabilistic analysis and leads to a refinement of the assessment. By this means, the decrease in probability of failure when POD-characterized NDI is applied can be ascertained. Therefore, this procedure can be used as a tool for inspection-based lifetime concepts. In this paper, results of sensitivity analyses are presented with the aim of outlining, in terms of non-failure probabilities, the benefits of applying NDI of various qualities in comparison with the situation when NDI is lacking. This enables a better substantiation of both component reliability management and the cost-effectiveness of NDI timing.
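
    The core of the approach, crediting inspection inside a Monte Carlo failure simulation, can be sketched as follows. The flaw-size distribution, critical size, and log-logistic POD curve are assumed forms for illustration, not the paper's FAD model:

```python
# Monte Carlo sketch of a POD-refined failure probability: sample a flaw
# size, treat flaws above a critical size as failures, and credit NDI by
# removing flaws it detects. All distributions and parameters are assumed
# illustrative values, not the paper's FAD formulation.
import random

def pod(a, a50=2.0, slope=3.0):
    """Assumed log-logistic probability of detection for flaw size a (mm)."""
    return 1.0 / (1.0 + (a50 / a) ** slope)

def failure_probability(n=100_000, a_crit=3.0, inspected=True, seed=1):
    random.seed(seed)
    failures = 0
    for _ in range(n):
        a = random.lognormvariate(0.5, 0.5)   # sampled flaw size, mm
        if inspected and random.random() < pod(a):
            continue                           # detected flaw -> repaired
        if a > a_crit:                         # exceeds critical size -> failure
            failures += 1
    return failures / n

p_no_ndi = failure_probability(inspected=False)
p_ndi = failure_probability(inspected=True)
```

    Comparing the two estimates quantifies, as in the paper's sensitivity analyses, the benefit of inspection of a given quality.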

  20. Utilization of quantitative structure-activity relationships (QSARs) in risk assessment: Alkylphenols

    SciTech Connect

    Beck, B.D.; Toole, A.P.; Callahan, B.G.; Siddhanti, S.K. )

    1991-12-01

    Alkylphenols are a class of environmentally pervasive compounds, found both in natural (e.g., crude oils) and in anthropogenic (e.g., wood tar, coal gasification waste) materials. Despite the frequent environmental occurrence of these chemicals, there is a limited toxicity database on alkylphenols. The authors have therefore developed a 'toxicity equivalence approach' for alkylphenols which is based on their ability to inhibit, in a specific manner, the enzyme cyclooxygenase. Enzyme-inhibiting ability for individual alkylphenols can be estimated based on the quantitative structure-activity relationship developed by Dewhirst (1980) and is a function of the free hydroxyl group, electron-donating ring substituents, and hydrophobic aromatic ring substituents. The authors evaluated the toxicological significance of cyclooxygenase inhibition by comparing the inhibitory capacity of alkylphenols with that of acetylsalicylic acid, or aspirin, a compound whose low-level effects are due to cyclooxygenase inhibition. Since nearly complete absorption of both alkylphenols and aspirin is predicted, based on estimates of hydrophobicity and the fraction of charged molecules at gastrointestinal pHs, risks from alkylphenols can be expressed directly in terms of 'milligram aspirin equivalence,' without correction for absorption differences. They recommend this method for assessing risks of mixtures of alkylphenols, especially for those compounds with no chronic toxicity data. 38 references.

  1. Usefulness of quantitative versus qualitative ST-segment depression for risk stratification of non-ST elevation acute coronary syndromes in contemporary clinical practice.

    PubMed

    Yan, Raymond T; Yan, Andrew T; Granger, Christopher B; Lopez-Sendon, Jose; Brieger, David; Kennelly, Brian; Budaj, Andrzej; Steg, Ph Gabriel; Georgescu, Alina A; Hassan, Quamrul; Goodman, Shaun G

    2008-04-01

    The aim of this study was to assess the clinical utility of quantitative ST-segment depression (STD) for refining the risk stratification of non-ST elevation acute coronary syndromes in the prospective, multinational Global Registry of Acute Coronary Events (GRACE). Quantitative measurements of STD on admission electrocardiograms were evaluated independently by a core laboratory, and their predictive value for in-hospital and cumulative 6-month mortality was examined. Although more severe STD is a marker of increased short- and long-term mortality, it is also associated with higher-risk clinical features and biomarkers. Thus, after adjustment for these clinically important predictors, quantitative STD does not provide incremental prognostic value beyond simple dichotomous evaluation for the presence of STD. Furthermore, adopting quantitative instead of the prognostically proven qualitative evaluation of STD does not improve the risk discrimination afforded by the validated GRACE risk models. In conclusion, the findings do not support the quantification of STD in routine clinical practice beyond simple evaluation for the presence of STD as an integral part of comprehensive risk stratification using the GRACE risk score.

  2. Towards quantitative ecological risk assessment of elevated carbon dioxide levels in the marine environment.

    PubMed

    de Vries, Pepijn; Tamis, Jacqueline E; Foekema, Edwin M; Klok, Chris; Murk, Albertinka J

    2013-08-30

    The environmental impact of elevated carbon dioxide (CO2) levels has attracted increasing interest in recent years, in relation to globally rising CO2 levels and the related consideration of geological CO2 storage as a mitigating measure. In the present study, effect data from the literature were collected in order to conduct a marine ecological risk assessment of elevated CO2 levels, using a Species Sensitivity Distribution (SSD). It became evident that the information currently available from the literature is mostly insufficient for such a quantitative approach. Most studies focus on effects of expected future CO2 levels, testing only one or two elevated concentrations. A full dose-response relationship, a uniform measure of exposure, and standardized test protocols are essential for conducting a proper quantitative risk assessment of elevated CO2 levels. Improvements are proposed to make future tests more valuable and usable for quantitative risk assessment.
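The quantitative step the authors are working toward can be sketched in a few lines: fit a log-normal SSD to per-species effect concentrations and read off the HC5, the concentration expected to affect 5% of species. The effect concentrations below are invented for illustration; stdlib only (Python 3.8+ for `statistics.NormalDist`).

```python
from math import log10
from statistics import NormalDist, mean, stdev

def hc5(effect_concs):
    """Fit a log-normal species sensitivity distribution (SSD) to per-species
    effect concentrations and return the HC5: the concentration expected to
    affect the most sensitive 5% of species."""
    logs = [log10(c) for c in effect_concs]
    ssd = NormalDist(mu=mean(logs), sigma=stdev(logs))  # sample stdev
    return 10 ** ssd.inv_cdf(0.05)

# Hypothetical per-species effect concentrations (e.g., EC50s)
concs = [1.2, 3.4, 5.6, 10.0, 22.0, 45.0]
print(hc5(concs))  # falls below the most sensitive species in this example
```

A real assessment would need the full dose-response data and uniform exposure measure the abstract calls for; this only illustrates the distributional mechanics.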

  3. A risk methodology to evaluate sensitivity of plant risk to human errors

    SciTech Connect

    Samanta, P.; Wong, S.; Higgins, J.; Haber, S.; Luckas, W.

    1988-01-01

    This paper presents an evaluation of the sensitivity of plant risk parameters, namely the core melt frequency and the accident sequence frequencies, to the human errors involved in various aspects of nuclear power plant operations. Results are provided using the Oconee-3 Probabilistic Risk Assessment model as an example application of the risk methodology described herein. Sensitivity analyses in probabilistic risk assessment (PRA) involve three areas: (1) determination of the set of input parameters, in this case various categories of human errors representing aspects of plant operation, (2) the range over which the input parameters vary, and (3) assessment of the sensitivity of the plant risk parameters to the input parameters, which here consist of all postulated human errors, or categories of human errors. The methodology provides a categorization scheme in which human errors are grouped by type of activity, location, personnel involved, etc., to relate the sensitivity of risk parameters to specific aspects of human performance in the nuclear plant. Ranges of variability for human errors have been developed considering the various known causes of uncertainty in human error probability estimates in PRAs. The sensitivity of the risk parameters is assessed using the event/fault tree methodology of the PRA. The results of the risk-based sensitivity evaluation using the Oconee-3 PRA as an example show the quantitative impact on the plant risk level due to variations in human error probabilities. The relative effects of the various human error categories, and of human error types within the categories, are also presented to identify and characterize significant human errors for effective risk management in nuclear power plant operational activities. 8 refs., 10 figs., 4 tabs.
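The kind of sensitivity evaluation described here can be illustrated with a toy minimal-cut-set model under the rare-event approximation; the cut sets and probabilities below are invented, not from the Oconee-3 PRA.

```python
from math import prod

def top_event_frequency(cutsets, probs):
    """Rare-event approximation: top-event (e.g., core melt) frequency is the
    sum over minimal cut sets of the product of basic-event probabilities."""
    return sum(prod(probs[e] for e in cs) for cs in cutsets)

# Toy model: two cut sets, each containing a human-error (HE*) basic event
base = {"HE1": 1e-3, "HE2": 5e-3, "HW": 1e-4}
cutsets = [("HE1", "HW"), ("HE2",)]

f_base = top_event_frequency(cutsets, base)
# Sensitivity case: scale every human-error probability by a factor of 10
scaled = {k: v * 10 if k.startswith("HE") else v for k, v in base.items()}
f_high = top_event_frequency(cutsets, scaled)
print(f_high / f_base)  # each cut set here is linear in one HE term, so ~10
```

Real PRA sensitivity studies vary each human-error category separately over its uncertainty range; this toy only shows the direction of the bookkeeping.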

  4. Quantitative evaluation of photoplethysmographic artifact reduction for pulse oximetry

    NASA Astrophysics Data System (ADS)

    Hayes, Matthew J.; Smith, Peter R.

    1999-01-01

    Motion artefact corruption of pulse oximeter output, causing both measurement inaccuracies and false alarm conditions, is a primary restriction in the current clinical practice and future applications of this useful technique. Artefact reduction in photoplethysmography (PPG), and therefore by application in pulse oximetry, is demonstrated using a novel non-linear methodology recently proposed by the authors. The significance of these processed PPG signals for pulse oximetry measurement is discussed, with particular attention to the normalization inherent in the artefact reduction process. Quantitative experimental investigation of the performance of PPG artefact reduction is then utilized to evaluate this technology for application to pulse oximetry. While the successfully demonstrated reduction of severe artefacts may widen the applicability of all PPG technologies and decrease the occurrence of pulse oximeter false alarms, the observed reduction of slight artefacts suggests that many such effects may go unnoticed in clinical practice. The signal processing and output averaging used in most commercial oximeters can incorporate these artefact errors into the output, while masking the true PPG signal corruption. It is therefore suggested that PPG artefact reduction should be incorporated into conventional pulse oximetry measurement, even in the absence of end-user artefact problems.

  5. Quantitative image quality evaluation for cardiac CT reconstructions

    NASA Astrophysics Data System (ADS)

    Tseng, Hsin-Wu; Fan, Jiahua; Kupinski, Matthew A.; Balhorn, William; Okerlund, Darin R.

    2016-03-01

    Maintaining image quality in the presence of motion is always desirable and challenging in clinical cardiac CT imaging. Different image-reconstruction algorithms are available on current commercial CT systems that attempt to achieve this goal. It is widely accepted that image-quality assessment should be task-based and involve specific tasks, observers, and associated figures of merit. In this work, we developed an observer model that performed the task of estimating the percentage of plaque in a vessel from CT images. We compared task performance of cardiac CT image data reconstructed using a conventional FBP reconstruction algorithm and the SnapShot Freeze (SSF) algorithm, each at the default and optimal reconstruction cardiac phases. The purpose of this work is to design an approach for quantitative image-quality evaluation of temporal resolution for cardiac CT systems. To simulate heart motion, a moving coronary-type phantom synchronized with an ECG signal was used. Three different plaque percentages embedded in a 3 mm vessel phantom were imaged multiple times under motion-free, 60 bpm, and 80 bpm heart rates. Static (motion-free) images of this phantom were taken as reference images for image template generation. Independent ROIs from the 60 bpm and 80 bpm images were generated by vessel tracking. The observer performed estimation tasks using these ROIs. Ensemble mean square error (EMSE) was used as the figure of merit. Results suggest that the quality of SSF images is superior to the quality of FBP images in higher heart-rate scans.
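The figure of merit named here, ensemble mean square error over repeated estimates, is a one-liner; the plaque-percentage estimates below are made up for illustration.

```python
def emse(estimates, true_value):
    """Ensemble mean square error of an observer's repeated estimates
    (here: plaque percentage) against the known truth; lower is better."""
    return sum((e - true_value) ** 2 for e in estimates) / len(estimates)

# Hypothetical repeated estimates of a 50% plaque from two reconstructions
fbp_estimates = [38.0, 55.0, 61.0, 47.0]
ssf_estimates = [48.0, 52.0, 51.0, 49.0]
print(emse(fbp_estimates, 50.0))  # 74.75
print(emse(ssf_estimates, 50.0))  # 2.5
```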

  6. Quantitative evaluation of phase processing approaches in susceptibility weighted imaging

    NASA Astrophysics Data System (ADS)

    Li, Ningzhi; Wang, Wen-Tung; Sati, Pascal; Pham, Dzung L.; Butman, John A.

    2012-03-01

    Susceptibility weighted imaging (SWI) takes advantage of the local variation in susceptibility between different tissues to enable highly detailed visualization of the cerebral venous system and sensitive detection of intracranial hemorrhages. Thus, it has been increasingly used in magnetic resonance imaging studies of traumatic brain injury as well as other intracranial pathologies. In SWI, magnitude information is combined with phase information to enhance the susceptibility-induced image contrast. Because of global susceptibility variations, the rate of phase accumulation varies widely across the image, resulting in phase-wrapping artifacts that interfere with the local assessment of phase variation. Homodyne filtering is a common approach to eliminate this global phase variation. However, the filter size requires careful selection in order to preserve image contrast and avoid errors resulting from residual phase wraps. An alternative approach is to apply phase unwrapping prior to high-pass filtering. A suitable phase unwrapping algorithm guarantees no residual phase wraps, but requires additional computational steps. In this work, we quantitatively evaluate these two phase processing approaches on both simulated and real data using different filters and cutoff frequencies. Our analysis leads to an improved understanding of the relationship between phase wraps, susceptibility effects, and acquisition parameters. Although homodyne filtering approaches are faster and more straightforward, phase unwrapping approaches perform more accurately in a wider variety of acquisition scenarios.
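As a minimal illustration of the unwrapping step discussed here, a 1D phase unwrapper is sketched below. Real SWI processing unwraps 2D/3D phase maps with more robust algorithms; this only shows the principle of removing 2*pi jumps from sampled phase.

```python
from math import pi

def unwrap_1d(phases):
    """Remove 2*pi jumps from a 1D phase signal: each successive difference
    is re-wrapped into [-pi, pi) and then accumulated."""
    out = [phases[0]]
    for p in phases[1:]:
        d = (p - out[-1] + pi) % (2 * pi) - pi  # wrapped increment
        out.append(out[-1] + d)
    return out

# A linear phase ramp, wrapped into [-pi, pi), is recovered as long as the
# true per-sample increment stays below pi
true_phase = [0.5 * i for i in range(12)]
wrapped = [(p + pi) % (2 * pi) - pi for p in true_phase]
recovered = unwrap_1d(wrapped)
```

The failure mode mirrors the abstract's concern: if the true phase changes by more than pi between samples, the residual-wrap ambiguity cannot be resolved from the data alone.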

  7. Closure plan evaluation for risk of acid rock drainage

    SciTech Connect

    Dwire, D.L.; Krause, A.J.; Russell, L.J.

    1999-07-01

    Control of acid rock drainage (ARD) is a long-term issue for many mine sites and is often a primary objective of remediation efforts. Some sites continue to require monitoring and management of ARD long after mine operation has ceased and closure is complete. In New Zealand, an innovative and quantitative approach was applied to evaluate the expected risk of ARD after implementation of the closure plan for the Golden Cross Mine. In addition, this future risk was compared to current operating conditions to provide an estimate of the reduction in risk provided by the remediation activities. This approach was useful to both the mine proponent and the regulatory agencies in assessing the effectiveness of the existing closure plan and providing focus on the components of greatest risk. Mine components remaining on site after closure that could potentially generate ARD under various failure scenarios were identified and evaluated. These components included the tailings decant pond, waste rock stockpiles, open pit mine, and water treatment systems. For each component, a series of initiating events and failure scenarios were identified, and a decision tree methodology was utilized to estimate the probability of ARD generation for both current and closure conditions. Due to the implementation of closure plans designed to minimize or eliminate ARD through regrading, construction of engineered covers, and water management designs, the risk of ARD generation will be significantly reduced over time.

  8. Risk evaluation mitigation strategies: the evolution of risk management policy.

    PubMed

    Hollingsworth, Kristen; Toscani, Michael

    2013-04-01

    The United States Food and Drug Administration (FDA) has the primary regulatory responsibility to ensure that medications are safe and effective both prior to drug approval and while the medication is being actively marketed by manufacturers. The responsibility for safe medications prior to marketing was signed into law in 1938 under the Federal Food, Drug, and Cosmetic Act; however, a significant risk management evolution has taken place since 1938. Additional federal rules, entitled the Food and Drug Administration Amendments Act, were established in 2007 and extended the government's oversight through the addition of a Risk Evaluation and Mitigation Strategy (REMS) for certain drugs. REMS is a mandated strategy to manage a known or potentially serious risk associated with a medication or biological product. Reasons for this extension of oversight were driven primarily by the FDA's movement to ensure that patients and providers are better informed of drug therapies and their specific benefits and risks prior to initiation. This article provides a historical perspective on the evolution of medication risk management policy, reviews REMS programs, assesses the positive and negative aspects of REMS, and offers suggestions for planning and measuring outcomes. In particular, this publication presents an overview of the evolution of the REMS program and its implications.

  9. Towards a Confluence of Quantitative and Qualitative Approaches to Curriculum Evaluation.

    ERIC Educational Resources Information Center

    Smith, D. L.; Fraser, B. J.

    1980-01-01

    Discusses a project in which quantitative and qualitative methodologies were combined in an evaluation of the High School Education Law Project (HELP) in Australia. Qualitative and quantitative evaluation were combined in several aspects of the study including field testing of preliminary versions of HELP materials, further evaluation work on…

  10. Evaluation of health risks for contaminated aquifers.

    PubMed Central

    Piver, W T; Jacobs, T L; Medina, M A

    1997-01-01

    This review focuses on progress in the development of transport models for heterogeneous contaminated aquifers, the use of predicted contaminant concentrations in groundwater for risk assessment for heterogeneous human populations, and the evaluation of aquifer remediation technologies. Major limitations and areas for continuing research for all methods presented in this review are identified. PMID:9114282

  11. Evaluation of health risks for contaminated aquifers.

    PubMed

    Piver, W T; Jacobs, T L; Medina, M A

    1997-02-01

    This review focuses on progress in the development of transport models for heterogeneous contaminated aquifers, the use of predicted contaminant concentrations in groundwater for risk assessment for heterogeneous human populations, and the evaluation of aquifer remediation technologies. Major limitations and areas for continuing research for all methods presented in this review are identified.

  12. Evaluating Potential Health Risks in Relocatable Classrooms.

    ERIC Educational Resources Information Center

    Katchen, Mark; LaPierre, Adrienne; Charlin, Cary; Brucker, Barry; Ferguson, Paul

    2001-01-01

    Only limited data exist describing potential exposures to chemical and biological agents when using portable classrooms or outlining how to assess and reduce associated health risks. Evaluating indoor air quality involves examining ventilation rates, volatile organic compounds, and microbiologicals. Open communication among key stakeholders is…

  13. A novel quantitative approach for evaluating contact mechanics of meniscal replacements.

    PubMed

    Linder-Ganz, E; Elsner, J J; Danino, A; Guilak, F; Shterling, A

    2010-02-01

    One of the functions of the meniscus is to distribute contact forces over the articular surfaces by increasing the joint contact areas. It is widely accepted that total/partial loss of the meniscus increases the risk of joint degeneration. A short-term method for evaluating whether degenerative arthritis can be prevented or not would be to determine if the peak pressure and contact area coverage of the tibial plateau (TP) in the knee are restored at the time of implantation. Although several published studies have utilized TP contact pressure measurements as an indicator of the biomechanical performance of allograft menisci, there has been no quantitative method for evaluating these parameters in situ with a single effective parameter. In the present study, we developed such a method and used it to assess the load distribution ability of various meniscal implant configurations in human cadaveric knees (n=3). Contact pressures under the intact meniscus were measured under compression (1200 N, 0 deg flexion). Next, total meniscectomy was performed and the protocol was repeated with meniscal implants. Resultant pressure maps were evaluated for the peak pressure value, total contact area, and its distribution pattern, all with respect to the natural meniscus output. Two other measures, implant dislocation and implant impingement on the ligaments, were also considered. If either of these occurred, the score was zeroed. The total implant score was based on an adjusted calculation of the aforementioned measures, where the natural meniscus score was always 100. Laboratory experiments demonstrated a good correlation between qualitative and quantitative evaluations of the same pressure map outputs, especially in cases where there were contradicting indications between different parameters. Overall, the proposed approach provides a novel, validated method for quantitative assessment of the biomechanical performance of meniscal implants, which can be used in various

  14. Dating Violence among High-Risk Young Women: A Systematic Review Using Quantitative and Qualitative Methods

    PubMed Central

    Joly, Lauren E.; Connolly, Jennifer

    2016-01-01

    Our systematic review identified 21 quantitative articles and eight qualitative articles addressing dating violence among high risk young women. The groups of high-risk young women in this review include street-involved, justice-involved, pregnant or parenting, involved with Child Protective Services, and youth diagnosed with a mental health issue. Our meta-analysis of the quantitative articles indicated that 34% (CI = 0.24–0.45) of high-risk young women report that they have been victims of physical dating violence and 45% (CI = 0.31–0.61) of these young women report perpetrating physical dating violence. Significant moderator variables included questionnaire and timeframe. Meta-synthesis of the qualitative studies revealed that high-risk young women report perpetrating dating violence to gain power and respect, whereas women report becoming victims of dating violence due to increased vulnerability. PMID:26840336
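The proportion-with-confidence-interval figures reported here (e.g., 34%, CI = 0.24-0.45) can be approximated for a single study with the logit transform commonly used in prevalence meta-analyses. The event counts below are invented for illustration, and a real meta-analysis would additionally weight and pool estimates across studies.

```python
from math import exp, log, sqrt

def proportion_ci(events, n, z=1.96):
    """Approximate 95% CI for a single-study proportion via the logit
    transform (assumes 0 < events < n)."""
    p = events / n
    logit = log(p / (1 - p))
    se = sqrt(1 / events + 1 / (n - events))  # standard error of the logit

    def back(x):  # inverse logit
        return exp(x) / (1 + exp(x))

    return p, back(logit - z * se), back(logit + z * se)

# Hypothetical single study: 170 of 500 young women report victimization
p, lo, hi = proportion_ci(events=170, n=500)
print(round(p, 2), round(lo, 2), round(hi, 2))  # 0.34 with CI around it
```

The logit transform keeps the interval inside (0, 1), which a plain normal approximation on p does not guarantee.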

  15. Risk Evaluation of Bogie System Based on Extension Theory and Entropy Weight Method

    PubMed Central

    Du, Yanping; Zhang, Yuan; Zhao, Xiaogang; Wang, Xiaohui

    2014-01-01

    A bogie system is the key equipment of railway vehicles, yet rigorous practical evaluation of bogies remains a challenge, and current practice relies heavily on part-specific experiments. In the present work, a risk evaluation index system for a bogie system has been established based on inspection data and experts' evaluations. Then, considering both quantitative and qualitative aspects, the risk state of a bogie system has been evaluated using extension theory and the entropy weight method. Finally, the method has been used to assess the bogie systems of four different samples. Results show that this method can assess the risk state of a bogie system accurately. PMID:25574159
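The entropy weight step used in this kind of evaluation is a standard construction: criteria whose scores vary more across alternatives carry more information and receive larger weights. A stdlib sketch with an illustrative decision matrix:

```python
from math import log

def entropy_weights(matrix):
    """Entropy weight method for an m x n decision matrix of positive scores
    (m alternatives, n criteria). Returns one weight per criterion."""
    m = len(matrix)
    k = 1.0 / log(m)
    divergences = []
    for j in range(len(matrix[0])):
        col = [row[j] for row in matrix]
        total = sum(col)
        p = [v / total for v in col]                     # normalize column
        entropy = -k * sum(v * log(v) for v in p if v > 0)
        divergences.append(1.0 - entropy)                # 0 if no variation
    s = sum(divergences)
    return [d / s for d in divergences]

# Two alternatives, two criteria: criterion 0 does not discriminate at all,
# so the entire weight concentrates on criterion 1
weights = entropy_weights([[1.0, 2.0], [1.0, 8.0]])
print(weights)
```

In the paper's setting these weights would then feed the extension-theory correlation functions; that step is specific to the index system and is not sketched here.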

  16. Use of global sensitivity analysis in quantitative microbial risk assessment: application to the evaluation of a biological time temperature integrator as a quality and safety indicator for cold smoked salmon.

    PubMed

    Ellouze, M; Gauchi, J-P; Augustin, J-C

    2011-06-01

    The aim of this study was to apply a global sensitivity analysis (SA) method to model simplification and to evaluate (eO)®, a biological Time Temperature Integrator (TTI), as a quality and safety indicator for cold smoked salmon (CSS). Models were thus developed to predict the evolution of Listeria monocytogenes and the indigenous food flora in CSS and to predict the TTI endpoint. A global SA was then applied to the three models to identify the least important factors and simplify the models accordingly. Results showed that the subset of the most important factors of the three models was mainly composed of the durations and temperatures of two chill chain links outside the control of the manufacturers: the domestic refrigerator and the retail/cabinet links. Then, the simplified versions of the three models were run with 10^4 time-temperature profiles representing the variability associated with the microbial behavior, the TTI evolution, and the French chill chain characteristics. The results were used to assess the distributions of the microbial contaminations obtained at the TTI endpoint and at the end of the simulated profiles, and proved that, in the case of poor storage conditions, TTI use could reduce the number of unacceptable foods by 50%. PMID:21511136

  17. Gasbuggy Site Assessment and Risk Evaluation

    SciTech Connect

    2011-03-01

    This report describes the geologic and hydrologic conditions and evaluates potential health risks to workers in the natural gas industry in the vicinity of the Gasbuggy, New Mexico, site, where the U.S. Atomic Energy Commission detonated an underground nuclear device in 1967. The 29-kiloton detonation took place 4,240 feet below ground surface and was designed to evaluate the use of a nuclear detonation to enhance natural gas production from the Pictured Cliffs Formation in the San Juan Basin, Rio Arriba County, New Mexico, on land administered by Carson National Forest. A site-specific conceptual model was developed based on current understanding of the hydrologic and geologic environment. This conceptual model was used for establishing plausible contaminant exposure scenarios, which were then evaluated for human health risk potential. The most mobile and, therefore, the most probable contaminant that could result in human exposure is tritium. Natural gas production wells were identified as having the greatest potential for bringing detonation-derived contaminants (tritium) to the ground surface in the form of tritiated produced water. Three exposure scenarios addressing potential contamination from gas wells were considered in the risk evaluation: a gas well worker during gas-well-drilling operations, a gas well worker performing routine maintenance, and a residential exposure. The residential exposure scenario was evaluated only for comparison; permanent residences on national forest lands at the Gasbuggy site are prohibited.

  18. Quantitative comparison between crowd models for evacuation planning and evaluation

    NASA Astrophysics Data System (ADS)

    Viswanathan, Vaisagh; Lee, Chong Eu; Lees, Michael Harold; Cheong, Siew Ann; Sloot, Peter M. A.

    2014-02-01

    Crowd simulation is rapidly becoming a standard tool for evacuation planning and evaluation. However, the many crowd models in the literature are structurally different, and few have been rigorously calibrated against real-world egress data, especially in emergency situations. In this paper we describe a procedure to quantitatively compare different crowd models or between models and real-world data. We simulated three models: (1) the lattice gas model, (2) the social force model, and (3) the RVO2 model, and obtained the distributions of six observables: (1) evacuation time, (2) zoned evacuation time, (3) passage density, (4) total distance traveled, (5) inconvenience, and (6) flow rate. We then used the DISTATIS procedure to compute the compromise matrix of statistical distances between the three models. Projecting the three models onto the first two principal components of the compromise matrix, we find the lattice gas and RVO2 models are similar in terms of the evacuation time, passage density, and flow rates, whereas the social force and RVO2 models are similar in terms of the total distance traveled. Most importantly, we find the zoned evacuation times of the three models to be very different from each other. Thus we propose to use this variable, if it can be measured, as the key test between different models, and also between models and the real world. Finally, we compared the model flow rates against the flow rate of an emergency evacuation during the May 2008 Sichuan earthquake, and found that the social force model agrees best with this real data.
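A simple stand-in for the statistical-distance step can illustrate the idea. The paper builds a DISTATIS compromise over full distance matrices; the sketch below hedges down to a two-sample Kolmogorov-Smirnov distance between one observable's samples from two models, with invented evacuation times.

```python
def ks_distance(a, b):
    """Two-sample Kolmogorov-Smirnov statistic: maximum gap between the
    empirical CDFs of samples a and b (0 = identical, 1 = fully disjoint)."""
    def ecdf(sample, x):
        return sum(v <= x for v in sample) / len(sample)
    return max(abs(ecdf(a, x) - ecdf(b, x)) for x in list(a) + list(b))

# Hypothetical evacuation times (seconds) sampled from two crowd models
model_1 = [61.0, 64.0, 70.0, 72.0, 75.0]
model_2 = [80.0, 83.0, 88.0, 90.0, 95.0]
print(ks_distance(model_1, model_2))  # 1.0 here: the samples do not overlap
```

Computing such a distance per observable, per model pair, yields exactly the kind of distance matrices that DISTATIS then summarizes into a compromise.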

  19. Dual-band infrared thermography for quantitative nondestructive evaluation

    SciTech Connect

    Durbin, P.F.; Del Grande, N.K.; Dolan, K.W.; Perkins, D.E.; Shapiro, A.B.

    1993-04-01

    The authors have developed dual-band infrared (DBIR) thermography that is being applied to quantitative nondestructive evaluation (NDE) of aging aircraft. The DBIR technique resolves 0.2 degrees C surface temperature differences for inspecting interior flaws in heated aircraft structures. It locates cracks, corrosion sites, disbonds or delaminations in metallic laps and composite patches. By removing clutter from surface roughness effects, the authors clarify interpretation of subsurface flaws. To accomplish this, the authors ratio images recorded at two infrared bands, centered near 5 microns and 10 microns. These image ratios are used to decouple temperature patterns associated with interior flaw sites from spatially varying surface emissivity noise. They also discuss three-dimensional (3D) dynamic thermal imaging of structural flaws using dual-band infrared (DBIR) computed tomography. Conventional thermography provides single-band infrared images which are difficult to interpret. Standard procedures yield imprecise (or qualitative) information about subsurface flaw sites which are typically masked by surface clutter. They use a DBIR imaging technique pioneered at LLNL to capture the time history of surface temperature difference patterns for flash-heated targets. They relate these patterns to the location, size, shape and depth of subsurface flaws. They have demonstrated temperature accuracies of 0.2 degrees C, timing synchronization of 3 ms (after onset of heat flash) and intervals of 42 ms between images, during an 8 s cooling (and heating) interval characterizing the front (and back) surface temperature-time history of an epoxy-glue disbond site in a flash-heated aluminum lap joint.
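The emissivity-decoupling idea is just a pixelwise ratio: multiplicative surface clutter common to both bands cancels. A toy sketch with invented 2x2 "images":

```python
def band_ratio(band_a, band_b):
    """Pixelwise ratio of two co-registered IR band images. A multiplicative
    emissivity factor present in both bands cancels in the ratio."""
    return [[a / b for a, b in zip(row_a, row_b)]
            for row_a, row_b in zip(band_a, band_b)]

# Toy 2x2 scene: two band signals s5, s10 modulated by the same emissivity map
emissivity = [[0.5, 0.9], [0.7, 1.0]]
s5 = [[2.0, 2.0], [4.0, 4.0]]
s10 = [[1.0, 1.0], [2.0, 2.0]]
band5 = [[e * s for e, s in zip(re, rs)] for re, rs in zip(emissivity, s5)]
band10 = [[e * s for e, s in zip(re, rs)] for re, rs in zip(emissivity, s10)]
print(band_ratio(band5, band10))  # [[2.0, 2.0], [2.0, 2.0]]: emissivity gone
```

In the real technique the residual ratio carries the temperature information, since the Planck radiance at 5 and 10 microns depends differently on temperature; the toy only demonstrates the clutter cancellation.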

  20. A method of quantitative risk assessment for transmission pipeline carrying natural gas.

    PubMed

    Jo, Young-Do; Ahn, Bum Jong

    2005-08-31

    Regulatory authorities in many countries are moving away from prescriptive approaches for keeping natural gas pipelines safe. As an alternative, risk management based on a quantitative assessment is being considered to improve the level of safety. This paper focuses on the development of a simplified method for the quantitative risk assessment of natural gas pipelines and introduces the parameters of fatal length and cumulative fatal length. The fatal length is defined as the integrated fatality along the pipeline associated with hypothetical accidents. The cumulative fatal length is defined as the section of pipeline in which an accident leads to N or more fatalities. These parameters can be estimated easily using pipeline geometry and population density information from a Geographic Information System (GIS). To demonstrate the proposed method, individual and societal risks for a sample pipeline have been estimated from the historical data of the European Gas Pipeline Incident Data Group and BG Transco. With currently acceptable criteria taken into account for individual risk, the minimum proximity of the pipeline to occupied buildings is approximately proportional to the square root of the operating pressure of the pipeline. The proposed method of quantitative risk assessment may be useful for risk management during the planning and building stages of a new pipeline, and for modification of a buried pipeline.
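The fatal-length bookkeeping can be sketched as a discretized integral along the pipeline; the failure rate and fatality profile below are invented numbers, not values from the paper.

```python
def fatal_length_km(fatality_prob, dx_km):
    """Discretized 'fatal length': integral along the pipeline of the
    probability that an accident at each point is fatal to a given person."""
    return sum(fatality_prob) * dx_km

def individual_risk(failure_rate_per_km_yr, fatality_prob, dx_km):
    """Individual risk (fatalities/yr) = failure rate x fatal length."""
    return failure_rate_per_km_yr * fatal_length_km(fatality_prob, dx_km)

# Hypothetical profile sampled every 0.1 km: certain fatality over the 0.5 km
# nearest the person, decaying to zero over the next 0.5 km
profile = [1.0] * 5 + [0.8, 0.6, 0.4, 0.2, 0.0]
risk = individual_risk(failure_rate_per_km_yr=1e-4,
                       fatality_prob=profile, dx_km=0.1)
print(risk)  # about 7e-5 fatalities per year for this toy profile
```

The societal-risk analogue accumulates, per pipeline section, the length over which an accident causes N or more fatalities; the structure of the sum is the same.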

  1. Evaluation of errors in quantitative determination of asbestos in rock

    NASA Astrophysics Data System (ADS)

    Baietto, Oliviero; Marini, Paola; Vitaliti, Martina

    2016-04-01

    The quantitative determination of the content of asbestos in rock matrices is a complex operation which is susceptible to important errors. The principal methodologies for the analysis are Scanning Electron Microscopy (SEM) and Phase Contrast Optical Microscopy (PCOM). Although the resolution of PCOM is inferior to that of SEM, PCOM analysis has several advantages, including greater representativeness of the analyzed sample, more effective recognition of chrysotile, and a lower cost. The DIATI LAA internal methodology for the analysis in PCOM is based on a mild grinding of a rock sample, its subdivision into 5-6 grain size classes smaller than 2 mm, and a subsequent microscopic analysis of a portion of each class. PCOM is based on the optical properties of asbestos and of the liquids with known refractive index in which the particles under analysis are immersed. The error evaluation in the analysis of rock samples, contrary to the analysis of airborne filters, cannot be based on a statistical distribution. For airborne filters, a Poisson distribution theoretically defines the variation in the count of fibers resulting from the observation of analysis fields chosen randomly on the filter. The analysis of rock matrices cannot rely on any statistical distribution, because the most important object of the analysis is the size of the asbestiform fibers and bundles of fibers observed, and the resulting weight ratio of the fibrous component to the granular one. The error estimates generally provided by public and private institutions vary between 50 and 150 percent, but there are no specific studies that discuss the origin of the error or that link it to the asbestos content. Our work aims to provide a reliable estimation of the error in relation to the applied methodologies and to the total content of asbestos, especially for values close to the legal limits.
    The error assessments must

  2. Studying Biology to Understand Risk: Dosimetry Models and Quantitative Adverse Outcome Pathways

    EPA Science Inventory

    Confidence in the quantitative prediction of risk is increased when the prediction is based to as great an extent as possible on the relevant biological factors that constitute the pathway from exposure to adverse outcome. With the first examples now over 40 years old, physiologi...

  3. DOSIMETRY MODELING OF INHALED FORMALDEHYDE: BINNING NASAL FLUX PREDICTIONS FOR QUANTITATIVE RISK ASSESSMENT

    EPA Science Inventory

    Dosimetry Modeling of Inhaled Formaldehyde: Binning Nasal Flux Predictions for Quantitative Risk Assessment. Kimbell, J.S., Overton, J.H., Subramaniam, R.P., Schlosser, P.M., Morgan, K.T., Conolly, R.B., and Miller, F.J. (2001). Toxicol. Sci. 000, 000:000.

    Interspecies e...

  4. Application of a quantitative carrier test to evaluate microbicides against mycobacteria.

    PubMed

    Springthorpe, V Susan; Sattar, Syed A

    2007-01-01

    Microbicides for reprocessing heat-sensitive medical devices, such as flexible endoscopes, must be mycobactericidal to reduce the risk of nosocomial infections. Suspension test methods currently used for efficacy evaluation lack the stringency required for assessing inactivation of mycobacteria on surfaces. The quantitative carrier test method reported here is based on mycobacteria-contaminated reference carrier disks of brushed stainless steel. Each disk was contaminated with 10 microL of a suspension of Mycobacterium terrae containing a soil load. Each disk with a dried inoculum was placed in a glass or Teflon vial, and then overlaid with 50 microL of the test formulation or 50 microL saline for the control carriers. Five test and 3 control disks were used in each run. At the end of the contact time, each vial received 9.95 mL neutralizer solution with 0.1% Tween-80 to stop the reaction and perform the initial microbicide dilution. The inoculum was eluted by mixing on a Vortex mixer for 60 s, and the eluates and the saline used to subsequently wash the vials and the funnels were membrane-filtered. Filters were placed on plates of Middlebrook 7H11 agar and incubated at 37 degrees C for at least 30 days before colonies were counted and log10 reductions in colony-forming units were calculated. Tests with a range of commercially available products, having claims against mycobacteria or believed to be broad-spectrum microbicides, showed that the method gave reproducible results. Products used included oxidizing agents (sodium hypochlorite and an iodophore), a phenolic, a quaternary ammonium compound, and ortho-phthalaldehyde. This method represents a much more realistic evaluation than the quantitative suspension test method currently used to evaluate mycobactericidal formulations for registration and, when performed at different product concentrations, allows an assessment of any safety margin or risk in using the test formulation in the field.
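The endpoint computed in such carrier tests, a log10 reduction in colony-forming units, is simply the difference of log-counts; the CFU values below are hypothetical.

```python
from math import log10

def log10_reduction(control_cfu, test_cfu):
    """Microbicide efficacy as log10 reduction: log10 of the CFU recovered
    from control carriers minus log10 of the CFU surviving the test
    formulation (both assumed > 0)."""
    return log10(control_cfu) - log10(test_cfu)

# Hypothetical counts: 1.0e6 CFU recovered from controls, 1.0e2 CFU after
# the contact time with the test formulation
print(log10_reduction(1.0e6, 1.0e2))  # 4.0, i.e. a 4-log kill
```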

  5. Pharmacology-based toxicity assessment: towards quantitative risk prediction in humans.

    PubMed

    Sahota, Tarjinder; Danhof, Meindert; Della Pasqua, Oscar

    2016-05-01

    Despite ongoing efforts to better understand the mechanisms underlying safety and toxicity, ~30% of the attrition in drug discovery and development is still due to safety concerns. Changes in current practice regarding the assessment of safety and toxicity are required to reduce late stage attrition and enable effective development of novel medicines. This review focuses on the implications of empirical evidence generation for the evaluation of safety and toxicity during drug development. A shift in paradigm is needed to (i) ensure that pharmacological concepts are incorporated into the evaluation of safety and toxicity; (ii) facilitate the integration of historical evidence and thereby the translation of findings across species as well as between in vitro and in vivo experiments and (iii) promote the use of experimental protocols tailored to address specific safety and toxicity questions. Based on historical examples, we highlight the challenges for the early characterisation of the safety profile of a new molecule and discuss how model-based methodologies can be applied for the design and analysis of experimental protocols. Issues relative to the scientific rationale are categorised and presented as a hierarchical tree describing the decision-making process. Focus is given to four different areas, namely, optimisation, translation, analytical construct and decision criteria. From a methodological perspective, the relevance of quantitative methods for estimation and extrapolation of risk from toxicology and safety pharmacology experimental protocols, such as points of departure and potency, is discussed in light of advancements in population and Bayesian modelling techniques (e.g. non-linear mixed effects modelling). Their use in the evaluation of pharmacokinetics (PK) and pharmacokinetic-pharmacodynamic relationships (PKPD) has enabled great insight into the dose rationale for medicines in humans, both in terms of efficacy and adverse events. 
Comparable benefits

  6. Risk evaluation: A cost-oriented approach

    SciTech Connect

    Rogers, B.H.

    1998-02-03

    This method provides a structured and cost-oriented way to determine risks associated with loss and destruction of industrial security interests consisting of material assets and human resources. Loss and destruction are assumed to be adversary-perpetrated, high-impact events in which the health and safety of people or high-value property is at risk. This concept provides a process for: (1) assessing the effectiveness of an integrated protection system, which includes facility operations, safety, emergency and security systems, and (2) a qualitative prioritization scheme to determine the level of consequence relative to cost and subsequent risk. The method allows managers the flexibility to establish asset protection appropriate to programmatic requirements and priorities and to decide if funding is appropriate. The evaluation objectives are to: (1) provide a systematic, qualitative tabletop process to estimate the potential for an undesirable event and its impact; and (2) identify ineffective protection and cost-effective solutions.

  7. Credit risk evaluation based on social media.

    PubMed

    Yang, Yang; Gu, Jing; Zhou, Zongfang

    2016-07-01

    Social media has been playing an increasingly important role in the sharing of individuals' opinions on many financial issues, including credit risk in investment decisions. This paper analyzes whether these opinions, which are transmitted through social media, can accurately predict enterprises' future credit risk. We consider financial-statement-based evaluation results derived from logit and probit approaches as the benchmarks. We then conduct textual analysis to retrieve both posts and their corresponding commentaries published on two of the most popular social media platforms for financial investors in China. Professional advice from financial analysts is also investigated in this paper. Surprisingly, we find that the opinions extracted from both posts and commentaries surpass those of analysts in terms of credit risk prediction. PMID:26739372
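    The logit and probit benchmarks referenced above score credit risk from financial-statement variables; a minimal sketch (the two ratios and all coefficients are hypothetical, not taken from the paper):

```python
import math

def logit_probability(features, coef, intercept):
    """Logit benchmark: P(distress) = 1 / (1 + exp(-(b0 + b.x)))."""
    z = intercept + sum(b * x for b, x in zip(coef, features))
    return 1.0 / (1.0 + math.exp(-z))

def probit_probability(features, coef, intercept):
    """Probit benchmark: P(distress) = Phi(b0 + b.x), Phi the standard normal CDF."""
    z = intercept + sum(b * x for b, x in zip(coef, features))
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# Hypothetical coefficients on two ratios (leverage, return on assets):
coef, intercept = [2.0, -4.0], -1.5
p_logit = logit_probability([0.8, 0.05], coef, intercept)
p_probit = probit_probability([0.8, 0.05], coef, intercept)
```

    Opinions mined from posts and commentaries can then be compared against such benchmark scores, e.g. via classification accuracy on observed distress events.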

  9. On the quantitative analysis and evaluation of magnetic hysteresis data

    NASA Astrophysics Data System (ADS)

    Jackson, Mike; Solheid, Peter

    2010-04-01

    Magnetic hysteresis data are centrally important in pure and applied rock magnetism, but to date, no objective quantitative methods have been developed for assessment of data quality and of the uncertainty in parameters calculated from imperfect data. We propose several initial steps toward such assessment, using loop symmetry as an important key. With a few notable exceptions (e.g., related to field cooling and exchange bias), magnetic hysteresis loops possess a high degree of inversion symmetry (M(H) = -M(-H)). This property enables us to treat the upper and lower half-loops as replicate measurements for quantification of random noise, drift, and offsets. This, in turn, makes it possible to evaluate the statistical significance of nonlinearity, either in the high-field region (due to nonsaturation of the ferromagnetic moment) or over the complete range of applied fields (due to nonnegligible contribution of ferromagnetic phases to the total magnetic signal). It also allows us to quantify the significance of fitting errors for model loops constructed from analytical basis functions. When a statistically significant high-field nonlinearity is found, magnetic parameters must be calculated by approach-to-saturation fitting, e.g., by a model of the form M(H) = Ms + χHF·H + αH^β. This nonlinear high-field inverse modeling problem is strongly ill-conditioned, resulting in large and strongly covariant uncertainties in the fitted parameters, which we characterize through bootstrap analyses. For a variety of materials, including ferrihydrite and mid-ocean ridge basalts, measured in applied fields up to about 1.5 T, we find that the calculated value of the exponent β is extremely sensitive to small differences in the data or in the method of processing and that the overall uncertainty exceeds the range of physically reasonable values. The "unknowability" of β is accompanied by relatively large uncertainties in the other parameters, which can be characterized, if not
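    The half-loop replicate idea can be sketched directly: under the inversion symmetry M(H) = -M(-H), the symmetric part of paired measurements should vanish, so its scatter estimates noise and offsets (the four measurement pairs below are invented for illustration).

```python
import statistics

def symmetry_decompose(upper, lower_reversed):
    """upper[i] is M measured at +H_i on the upper half-loop;
    lower_reversed[i] is M measured at -H_i on the lower half-loop.
    For an ideal loop M(H) = -M(-H), so the 'error' part should be ~0."""
    signal = [(u - l) / 2.0 for u, l in zip(upper, lower_reversed)]
    error = [(u + l) / 2.0 for u, l in zip(upper, lower_reversed)]
    return signal, error

upper = [1.00, 0.80, 0.30, -0.20]
lower_reversed = [-0.98, -0.82, -0.28, 0.22]
signal, error = symmetry_decompose(upper, lower_reversed)
noise = statistics.pstdev(error)   # random-noise estimate from the replicates
offset = statistics.mean(error)    # a constant offset shows up as a nonzero mean
```

    The same residuals can feed significance tests for high-field nonlinearity before attempting the ill-conditioned approach-to-saturation fit.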

  10. The RiskScape System - a tool for quantitative multi-risk analysis for natural hazards.

    NASA Astrophysics Data System (ADS)

    Schmidt, J.; Reese, S.; Matcham, I.; King, A.; Bell, R.

    2009-04-01

    This paper introduces a generic framework for multi-risk modelling developed in the project 'Regional RiskScape' at the Research Organization GNS Science and the National Institute of Water and Atmospheric Research Ltd. (NIWA) in New Zealand. Our goal was to develop a generic technology for modelling risks from multiple natural hazards and for multiple risk elements. The framework is independent of the specific nature of the individual hazard and individual risk element. A software prototype has been developed which is capable of 'plugging in' various natural hazards and risk elements without reconfiguring or adapting the generic software framework. To achieve that goal we developed a set of standards for treating the fundamental components of a risk model: hazards, assets (risk elements), and vulnerability models (or fragility functions). Thus, the developed prototype system is able to understand any hazard, asset, or fragility model which is provided to the system according to that standard. We tested the software prototype by modelling earthquake, volcanic, flood, wind, and tsunami risks for urban centres in New Zealand.
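    The plug-in standard described above can be sketched as a set of minimal interfaces; all class and method names here are hypothetical stand-ins for RiskScape's actual standards.

```python
from abc import ABC, abstractmethod

class Hazard(ABC):
    """Any hazard plug-in must report an intensity at a location."""
    @abstractmethod
    def intensity(self, location):
        ...

class Fragility(ABC):
    """Any vulnerability plug-in maps intensity to a damage ratio in [0, 1]."""
    @abstractmethod
    def damage_ratio(self, intensity):
        ...

class Asset:
    """A risk element: something of value at a location, with a fragility model."""
    def __init__(self, location, value, fragility):
        self.location, self.value, self.fragility = location, value, fragility

def total_loss(hazard, assets):
    """The generic engine sees only the interfaces, never the hazard type."""
    return sum(a.value * a.fragility.damage_ratio(hazard.intensity(a.location))
               for a in assets)

class FloodDepth(Hazard):
    """One concrete plug-in: flood depth looked up from a grid (here, a dict)."""
    def __init__(self, depth_by_location):
        self.depth_by_location = depth_by_location
    def intensity(self, location):
        return self.depth_by_location.get(location, 0.0)

class LinearFragility(Fragility):
    def damage_ratio(self, depth):
        return min(1.0, 0.5 * depth)

loss = total_loss(FloodDepth({"parcel-1": 1.0}),
                  [Asset("parcel-1", 200000.0, LinearFragility())])
```

    Swapping FloodDepth for, say, an earthquake-shaking plug-in leaves total_loss untouched, which is the point of such a standard.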

  11. Characterizing trabecular bone structure for assessing vertebral fracture risk on volumetric quantitative computed tomography

    NASA Astrophysics Data System (ADS)

    Nagarajan, Mahesh B.; Checefsky, Walter A.; Abidin, Anas Z.; Tsai, Halley; Wang, Xixi; Hobbs, Susan K.; Bauer, Jan S.; Baum, Thomas; Wismüller, Axel

    2015-03-01

    While the proximal femur is preferred for measuring bone mineral density (BMD) in fracture risk estimation, the introduction of volumetric quantitative computed tomography has revealed stronger associations between BMD and spinal fracture status. In this study, we propose to capture properties of trabecular bone structure in spinal vertebrae with advanced second-order statistical features for purposes of fracture risk assessment. For this purpose, axial multi-detector CT (MDCT) images were acquired from 28 spinal vertebrae specimens using a whole-body 256-row CT scanner with a dedicated calibration phantom. A semi-automated method was used to annotate the trabecular compartment in the central vertebral slice with a circular region of interest (ROI) to exclude cortical bone; pixels within were converted to values indicative of BMD. Six second-order statistical features derived from gray-level co-occurrence matrices (GLCM) and the mean BMD within the ROI were then extracted and used in conjunction with a generalized radial basis functions (GRBF) neural network to predict the failure load of the specimens; true failure load was measured through biomechanical testing. Prediction performance was evaluated with a root-mean-square error (RMSE) metric. The best prediction performance was observed with the GLCM feature 'correlation' (RMSE = 1.02 ± 0.18), which significantly outperformed all other GLCM features (p < 0.01). GLCM feature correlation also significantly outperformed MDCT-measured mean BMD (RMSE = 1.11 ± 0.17) (p < 10^-4). These results suggest that biomechanical strength prediction in spinal vertebrae can be significantly improved through characterization of trabecular bone structure with GLCM-derived texture features.
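    A gray-level co-occurrence matrix and its 'correlation' feature can be computed as below; the tiny quantized 'BMD map' is invented, and real use would quantize the ROI pixels into many more gray levels.

```python
def glcm(image, levels, dx=1, dy=0):
    """Normalized co-occurrence matrix for pixel pairs at offset (dx, dy)."""
    rows, cols = len(image), len(image[0])
    counts = [[0.0] * levels for _ in range(levels)]
    total = 0
    for r in range(rows):
        for c in range(cols):
            r2, c2 = r + dy, c + dx
            if 0 <= r2 < rows and 0 <= c2 < cols:
                counts[image[r][c]][image[r2][c2]] += 1.0
                total += 1
    return [[v / total for v in row] for row in counts]

def glcm_correlation(p):
    """GLCM 'correlation': covariance of the pair distribution over sigma_i * sigma_j."""
    n = len(p)
    mu_i = sum(i * p[i][j] for i in range(n) for j in range(n))
    mu_j = sum(j * p[i][j] for i in range(n) for j in range(n))
    var_i = sum((i - mu_i) ** 2 * p[i][j] for i in range(n) for j in range(n))
    var_j = sum((j - mu_j) ** 2 * p[i][j] for i in range(n) for j in range(n))
    cov = sum((i - mu_i) * (j - mu_j) * p[i][j] for i in range(n) for j in range(n))
    return cov / (var_i ** 0.5 * var_j ** 0.5)

bmd_map = [[0, 0, 1],
           [0, 1, 1],
           [1, 1, 2]]  # hypothetical quantized trabecular ROI
feature = glcm_correlation(glcm(bmd_map, levels=3))
```

    Features like this one, computed over the ROI, are what feed the regression model in place of (or alongside) mean BMD.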

  12. Enhancing local action planning through quantitative flood risk analysis: a case study in Spain

    NASA Astrophysics Data System (ADS)

    Castillo-Rodríguez, Jesica Tamara; Escuder-Bueno, Ignacio; Perales-Momparler, Sara; Ramón Porta-Sancho, Juan

    2016-07-01

    This article presents a method to incorporate and promote quantitative risk analysis to support local action planning against flooding. The proposed approach aims to provide a framework for local flood risk analysis, combining hazard mapping with vulnerability data to quantify risk in terms of expected annual affected population, potential injuries, number of fatalities, and economic damages. Flood risk is estimated combining GIS data of loads, system response, and consequences and using event tree modelling for risk calculation. The study area is the city of Oliva, located on the eastern coast of Spain. Results from risk modelling have been used to inform local action planning and to assess the benefits of structural and non-structural risk reduction measures. Results show the potential impact on risk reduction of flood defences and improved warning communication schemes through local action planning: societal flood risk (in terms of annual expected affected population) would be reduced up to 51 % by combining both structural and non-structural measures. In addition, the effect of seasonal population variability is analysed (annual expected affected population ranges from 82 to 107 %, compared with the current situation, depending on occupancy rates in hotels and campsites). Results highlight the need for robust and standardized methods for urban flood risk analysis replicability at regional and national scale.
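    The event-tree risk calculation reduces, in its simplest form, to probability-weighting consequences over mutually exclusive flood branches; the return periods and affected-population figures below are illustrative, not Oliva's.

```python
def branch_probabilities(return_periods):
    """Annual probabilities of mutually exclusive event classes from sorted
    return periods: p_i = 1/T_i - 1/T_{i+1}, the rarest class keeping 1/T_n."""
    exceedance = [1.0 / t for t in return_periods]
    return [exceedance[i] - (exceedance[i + 1] if i + 1 < len(exceedance) else 0.0)
            for i in range(len(exceedance))]

def expected_annual_affected(return_periods, affected):
    """Expected annual affected population: sum of p_i * N_i over branches."""
    return sum(p * n for p, n in zip(branch_probabilities(return_periods), affected))

return_periods = [10, 100, 500]               # years
baseline = expected_annual_affected(return_periods, [200, 2000, 8000])
with_measures = expected_annual_affected(return_periods, [50, 900, 5000])
risk_reduction = 1.0 - with_measures / baseline
```

    The same weighting, applied to injuries, fatalities or damages instead of affected population, yields the other risk metrics in the study.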

  13. Evaluation of cardiovascular risk in school children.

    PubMed

    Sporisević, Lutvo; Krzelj, Vjekoslav; Bajraktarević, Adnan; Jahić, Elmedina

    2009-08-01

    Atherosclerosis is a pathological condition that begins in early childhood, but clinically the disease manifests at an older age. The aim of this work was to determine the frequency of atherosclerosis risk factors in healthy school children. This cross-sectional study included 214 children with a mean age of 10.99 ± 2.52 years (range 7 to 15 years). The children's body mass index, blood pressure, lipid status, dietary habits, physical activity and sedentary habits were evaluated. Cardiovascular risk factors were significantly present in the children (P < 0.05): one cardiovascular risk factor was present in 47/214 (21.96%) children, two risk factors in 25/214 (11.68%) children, while 17/214 (7.94%) children had three or more cardiovascular risk factors. Obesity was present in 20/214 (9.34%) children, while overweight was present in 23/214 (10.74%) children. Hypertension was present in 10/214 (4.67%) children, and it was significantly more frequent (P < 0.05) in obese and overweight children. Total cholesterol was increased in 17/214 (7.94%) children, LDL-cholesterol was increased in 11/214 (5.14%) [corrected], triglycerides were increased in 4/214 (1.86%) children, while decreased HDL-cholesterol was found in 3/214 (1.40%) children. Unhealthy dietary habits were present in 45/214 (21.02%) children, 42/214 (19.62%) children were physically inactive, while sedentary habits were observed in 39/214 (18.22%) children. The research shows that a large number of children within the study group have one or more cardiovascular risk factors that can lead to premature atherosclerosis. Through mass screening of cardiovascular risk factors, together with adequate physical activity, healthy dietary habits, reduced sedentary habits, and education of doctors, teachers, parents and children, premature clinical sequelae of the atherosclerotic process can be reduced.

  14. Evaluating methods for estimating existential risks.

    PubMed

    Tonn, Bruce; Stiefel, Dorian

    2013-10-01

    Researchers and commissions contend that the risk of human extinction is high, but none of these estimates have been based upon a rigorous methodology suitable for estimating existential risks. This article evaluates several methods that could be used to estimate the probability of human extinction. Traditional methods evaluated include: simple elicitation; whole evidence Bayesian; evidential reasoning using imprecise probabilities; and Bayesian networks. Three innovative methods are also considered: influence modeling based on environmental scans; simple elicitation using extinction scenarios as anchors; and computationally intensive possible-worlds modeling. Evaluation criteria include: level of effort required by the probability assessors; level of effort needed to implement the method; ability of each method to model the human extinction event; ability to incorporate scientific estimates of contributory events; transparency of the inputs and outputs; acceptability to the academic community (e.g., with respect to intellectual soundness, familiarity, verisimilitude); credibility and utility of the outputs of the method to the policy community; difficulty of communicating the method's processes and outputs to nonexperts; and accuracy in other contexts. The article concludes by recommending that researchers assess the risks of human extinction by combining these methods. PMID:23551083

  15. Quantitative, Notional, and Comprehensive Evaluations of Spontaneous Engaged Speech

    ERIC Educational Resources Information Center

    Molholt, Garry; Cabrera, Maria Jose; Kumar, V. K.; Thompsen, Philip

    2011-01-01

    This study provides specific evidence regarding the extent to which quantitative measures, common sense notional measures, and comprehensive measures adequately characterize spontaneous, although engaged, speech. As such, the study contributes to the growing body of literature describing the current limits of automatic systems for evaluating…

  16. Quantitative microbial risk assessment for Staphylococcus aureus and Staphylococcus enterotoxin A in raw milk.

    PubMed

    Heidinger, Joelle C; Winter, Carl K; Cullor, James S

    2009-08-01

    A quantitative microbial risk assessment was constructed to determine consumer risk from Staphylococcus aureus and staphylococcal enterotoxin in raw milk. A Monte Carlo simulation model was developed to assess the risk from raw milk consumption using data on levels of S. aureus in milk collected by the University of California-Davis Dairy Food Safety Laboratory from 2,336 California dairies from 2005 to 2008 and using U.S. milk consumption data from the National Health and Nutrition Examination Survey of 2003 and 2004. Four modules were constructed to simulate pathogen growth and staphylococcal enterotoxin A production scenarios to quantify consumer risk levels under various time and temperature storage conditions. The three growth modules predicted that S. aureus levels could surpass the 10^5 CFU/ml level of concern at the 99.9th or 99.99th percentile of servings and therefore may represent a potential consumer risk. Results obtained from the staphylococcal enterotoxin A production module predicted that exposure at the 99.99th percentile could represent a dose capable of eliciting staphylococcal enterotoxin intoxication in all consumer age groups. This study illustrates the utility of quantitative microbial risk assessments for identifying potential food safety issues. PMID:19722395
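    A Monte Carlo growth module of this kind samples initial contamination, storage time and temperature per serving and counts how often the level of concern is exceeded; every distribution and rate constant below is an illustrative assumption, not a parameter from the published model.

```python
import random

def fraction_over_limit(n=20000, limit_log=5.0, seed=1):
    """Fraction of simulated raw-milk servings whose S. aureus level exceeds
    10**limit_log CFU/ml after consumer storage (toy parameters throughout)."""
    rng = random.Random(seed)
    over = 0
    for _ in range(n):
        log_count = rng.gauss(1.0, 1.0)        # log10 CFU/ml at purchase
        hours = rng.uniform(0.0, 72.0)         # home storage duration
        temp_c = rng.uniform(4.0, 25.0)        # storage temperature, deg C
        if temp_c > 7.0:                       # assume growth only above ~7 deg C
            rate = 0.01 * (temp_c - 7.0)       # assumed log10 CFU/ml per hour
            log_count = min(log_count + rate * hours, 8.0)  # cap near stationary phase
        if log_count > limit_log:
            over += 1
    return over / n

fraction = fraction_over_limit()
```

    In a full QMRA the tail percentiles (99.9th, 99.99th) of the per-serving distribution, not just this fraction, drive the risk characterization.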

  17. Summary of the workshop on issues in risk assessment: quantitative methods for developmental toxicology.

    PubMed

    Mattison, D R; Sandler, J D

    1994-08-01

    This report summarizes the proceedings of a conference on quantitative methods for assessing the risks of developmental toxicants. The conference was planned by a subcommittee of the National Research Council's Committee on Risk Assessment Methodology in conjunction with staff from several federal agencies, including the U.S. Environmental Protection Agency, U.S. Food and Drug Administration, U.S. Consumer Products Safety Commission, and Health and Welfare Canada. Issues discussed at the workshop included computerized techniques for hazard identification, use of human and animal data for defining risks in a clinical setting, relationships between end points in developmental toxicity testing, reference dose calculations for developmental toxicology, analysis of quantitative dose-response data, mechanisms of developmental toxicity, physiologically based pharmacokinetic models, and structure-activity relationships. Although a formal consensus was not sought, many participants favored the evolution of quantitative techniques for developmental toxicology risk assessment, including the replacement of lowest observed adverse effect levels (LOAELs) and no observed adverse effect levels (NOAELs) with the benchmark dose methodology.
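    The benchmark dose methodology many participants favored replaces LOAELs and NOAELs with the dose producing a specified extra risk over background; a minimal sketch with an assumed (not fitted) logistic dose-response:

```python
import math

def p_response(dose, b0=-3.0, b1=0.5):
    """Assumed logistic dose-response model (illustrative coefficients)."""
    return 1.0 / (1.0 + math.exp(-(b0 + b1 * dose)))

def extra_risk(dose):
    """Extra risk over background: (P(d) - P(0)) / (1 - P(0))."""
    p0 = p_response(0.0)
    return (p_response(dose) - p0) / (1.0 - p0)

def benchmark_dose(target=0.10, lo=0.0, hi=100.0):
    """Bisection for the dose giving the target extra risk (BMD10 by default)."""
    for _ in range(60):
        mid = (lo + hi) / 2.0
        if extra_risk(mid) < target:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2.0

bmd10 = benchmark_dose()
```

    In practice the model is fitted to dose-response data and a lower confidence bound on this dose (the BMDL) serves as the point of departure.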

  18. Quantitative assessment of changes in landslide risk using a regional scale run-out model

    NASA Astrophysics Data System (ADS)

    Hussin, Haydar; Chen, Lixia; Ciurean, Roxana; van Westen, Cees; Reichenbach, Paola; Sterlacchini, Simone

    2015-04-01

    The risk of landslide hazard continuously changes in time and space and is rarely a static or constant phenomenon in an affected area. However one of the main challenges of quantitatively assessing changes in landslide risk is the availability of multi-temporal data for the different components of risk. Furthermore, a truly "quantitative" landslide risk analysis requires the modeling of the landslide intensity (e.g. flow depth, velocities or impact pressures) affecting the elements at risk. Such a quantitative approach is often lacking in medium to regional scale studies in the scientific literature or is left out altogether. In this research we modelled the temporal and spatial changes of debris flow risk in a narrow alpine valley in the North Eastern Italian Alps. The debris flow inventory from 1996 to 2011 and multi-temporal digital elevation models (DEMs) were used to assess the susceptibility of debris flow triggering areas and to simulate debris flow run-out using the Flow-R regional scale model. In order to determine debris flow intensities, we used a linear relationship that was found between back calibrated physically based Flo-2D simulations (local scale models of five debris flows from 2003) and the probability values of the Flow-R software. This gave us the possibility to assign flow depth to a total of 10 separate classes on a regional scale. Debris flow vulnerability curves from the literature and one curve specifically for our case study area were used to determine the damage for different material and building types associated with the elements at risk. The building values were obtained from the Italian Revenue Agency (Agenzia delle Entrate) and were classified per cadastral zone according to the Real Estate Observatory data (Osservatorio del Mercato Immobiliare, Agenzia Entrate - OMI). The minimum and maximum market value for each building was obtained by multiplying the corresponding land-use value (€/m²) with building area and number of floors

  19. A new approach to risk assessment integrating scientific evaluation and economic assessment of costs and benefits.

    PubMed

    Barnard, R C

    1996-10-01

    Traditional quantitative risk assessment based on conservative generic assumptions led to an upper-bound risk value with minimum or no consideration of costs and benefits. There is a growing consensus for a new approach to risk assessment based on a combination of scientific risk assessment and economic cost-benefit analysis. Scientific evaluation would be improved to support the economic cost-benefit analysis. The objective is to demonstrate whether the benefits justify the costs. The move in the new direction is shown by Executive Order 12866 and the Office of Management and Budget implementing document, the proposed regulatory reform legislation in Congress, the draft report of the Risk Assessment and Risk Management Commission, and the Safe Drinking Water Act Amendments of 1996 that enacted the new approach combining scientific and economic assessment of risk. This Commentary discusses these developments with particular reference to contemplated changes in scientific risk assessment to support a parallel economic risk-benefit analysis. PMID:8933625

  20. Veterinary drugs: disposition, biotransformation and risk evaluation.

    PubMed

    Fink-Gremmels, J; van Miert, A S

    1994-12-01

    Veterinary drugs may only be produced, distributed and administered after being licensed. This implies that, prior to marketing, a critical evaluation of the pharmaceutical quality, the clinical efficacy and the over-all pharmacological and toxicological properties of the active substances will be performed by national and/or supranational authorities. However, despite a sophisticated legal (harmonized) framework, a number of factors involved in residue formation and safety assessment remain unpredictable or dependent on the current 'state of the art' in the understanding of molecular pharmacology and toxicology. For example, drug disposition and residue formation in the target animal species may be influenced by a broad variety of physiological parameters including age, sex and diet, as well as by pathological conditions, especially the acute phase response to infection. These factors affect both drug disposition and metabolite formation. Furthermore, current thinking in toxicological risk assessment is influenced by recent developments in molecular toxicology and thus by an increased but still incomplete understanding of the interaction of a toxic compound with the living organism. Generally recognized principles in the evaluation of potential toxicants are applied in the recommendation of withdrawal times and the establishment of maximum residue limits (MRL values). Apart from toxicology-based assessment, increasing attention is being directed to responses other than toxicological ones, especially the potential risk of effects of antimicrobial residues on the human gastrointestinal microflora. Thus, the methodology of risk assessment is discussed in the context of the recently established legal framework within the European Union.

  1. [Thermal comfort in perioperative risk evaluation].

    PubMed

    Masia, M D; Dettori, M; Liperi, G; Deriu, G M; Posadino, S; Maida, G; Mura, I

    2009-01-01

    Studies conducted to date on operating-room microclimate have focused mainly on operators' thermal comfort, considering that conditions of discomfort may compromise their working performance. In recent years, however, the anaesthesiological community has drawn attention to the risks to patients posed by perioperative deviations from normothermia, underlining the need to orientate studies towards identifying microclimate characteristics able to guarantee the thermal comfort of the patient as well. In light of these considerations, a study was conducted in the operating rooms of the University Hospital and the USL n.1 of Sassari, aimed, on the one hand, at determining the microclimate characteristics of the operating blocks and evaluating operators' and patients' thermal comfort and, on the other, at identifying, through a software simulation, microclimate conditions that ensure thermal comfort for both categories simultaneously. Results confirm the existence of a thermal "gap" between operators and patients, the latter constantly subjected to cold stress, at times very pronounced. We therefore underline the importance of the microclimate in operating rooms, since particular situations can condition perioperative risks. Moreover, it may be useful to integrate the risk classes of the American Society of Anesthesiologists (ASA) with a score attributed to the PMV/PPD variation, yielding more realistic operative risk indicators. PMID:19798902

  2. Quantitative autoradiographic microimaging in the development and evaluation of radiopharmaceuticals

    SciTech Connect

    Som, P.; Oster, Z.H.

    1994-04-01

    Autoradiographic (ARG) microimaging is the method for depicting biodistribution of radiocompounds with highest spatial resolution. ARG is applicable to gamma, positron and negatron emitting radiotracers. Dual or multiple-isotope studies can be performed using half-lives and energies for discrimination of isotopes. Quantitation can be performed by digital videodensitometry and by newer filmless technologies. ARGs obtained at different time intervals provide the time dimension for determination of kinetics.

  3. Review of progress in quantitative NDE. [Nondestructive Evaluation (NDE)

    SciTech Connect

    Not Available

    1991-01-01

    This booklet is composed of abstracts from papers submitted at a meeting on quantitative NDE. A multitude of topics are discussed including analysis of composite materials, NMR uses, x-ray instruments and techniques, manufacturing uses, neural networks, eddy currents, stress measurements, magnetic materials, adhesive bonds, signal processing, NDE of mechanical structures, tomography, defect sizing, NDE of plastics and ceramics, new techniques, optical and electromagnetic techniques, and nonlinear techniques. (GHH)

  4. Food Consumption and Handling Survey for Quantitative Microbiological Consumer Phase Risk Assessments.

    PubMed

    Chardon, Jurgen; Swart, Arno

    2016-07-01

    In the consumer phase of a typical quantitative microbiological risk assessment (QMRA), mathematical equations identify data gaps. To acquire useful data we designed a food consumption and food handling survey (2,226 respondents) for QMRA applications that is especially aimed at obtaining quantitative data. For a broad spectrum of food products, the survey covered the following topics: processing status at retail, consumer storage, preparation, and consumption. Questions were designed to facilitate distribution fitting. In the statistical analysis, special attention was given to the selection of the most adequate distribution to describe the data. Bootstrap procedures were used to describe uncertainty. The final result was a coherent quantitative consumer phase food survey and parameter estimates for food handling and consumption practices in The Netherlands, including variation over individuals and uncertainty estimates.
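    Distribution fitting with bootstrap uncertainty, as applied to the survey data, can be sketched as follows; the storage-time sample and the exponential model choice are illustrative assumptions, not values from the survey.

```python
import random
import statistics

def bootstrap_ci(data, estimator, n_boot=2000, alpha=0.05, seed=42):
    """Percentile bootstrap confidence interval for a fitted parameter."""
    rng = random.Random(seed)
    stats = sorted(estimator(rng.choices(data, k=len(data))) for _ in range(n_boot))
    return (stats[int(alpha / 2 * n_boot)],
            stats[int((1 - alpha / 2) * n_boot) - 1])

# Hypothetical home-storage durations (days) for one product:
storage_days = [1, 1, 2, 2, 2, 3, 3, 4, 5, 5, 6, 7, 8, 10, 14]

def exp_rate(sample):
    """Maximum-likelihood rate of an exponential distribution: 1 / mean."""
    return 1.0 / statistics.mean(sample)

low, high = bootstrap_ci(storage_days, exp_rate)
```

    Candidate distributions (exponential, gamma, lognormal, ...) would be compared on goodness of fit before one is carried into the QMRA.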

  5. Quantitative assessment of the microbial risk of leafy greens from farm to consumption: preliminary framework, data, and risk estimates.

    PubMed

    Danyluk, Michelle D; Schaffner, Donald W

    2011-05-01

    This project was undertaken to relate what is known about the behavior of Escherichia coli O157:H7 under laboratory conditions and integrate this information to what is known regarding the 2006 E. coli O157:H7 spinach outbreak in the context of a quantitative microbial risk assessment. The risk model explicitly assumes that all contamination arises from exposure in the field. Extracted data, models, and user inputs were entered into an Excel spreadsheet, and the modeling software @RISK was used to perform Monte Carlo simulations. The model predicts that cut leafy greens that are temperature abused will support the growth of E. coli O157:H7, and populations of the organism may increase by as much as 1 log CFU/day under optimal temperature conditions. When the risk model used a starting level of -1 log CFU/g, with 0.1% of incoming servings contaminated, the predicted numbers of cells per serving were within the range of best available estimates of pathogen levels during the outbreak. The model predicts that levels in the field of -1 log CFU/g and 0.1% prevalence could have resulted in an outbreak approximately the size of the 2006 E. coli O157:H7 outbreak. This quantitative microbial risk assessment model represents a preliminary framework that identifies available data and provides initial risk estimates for pathogenic E. coli in leafy greens. Data gaps include retail storage times, correlations between storage time and temperature, determining the importance of E. coli O157:H7 in leafy greens lag time models, and validation of the importance of cross-contamination during the washing process.

  6. Quantitative analysis of the benefits and risks of consuming farmed and wild salmon.

    PubMed

    Foran, Jeffery A; Good, David H; Carpenter, David O; Hamilton, M Coreen; Knuth, Barbara A; Schwager, Steven J

    2005-11-01

    Contaminants in farmed Atlantic and wild Pacific salmon raise important questions about the competing health benefits and risks of fish consumption. A benefit-risk analysis was conducted to compare quantitatively the cancer and noncancer risks of exposure to organic contaminants in salmon with the (n-3) fatty acid-associated health benefits of salmon consumption. Recommended levels of (n-3) fatty acid intake, as eicosapentaenoic acid (EPA) and docosahexaenoic acid (DHA), may be achieved by consuming farmed or wild salmon while maintaining an acceptable level of noncarcinogenic risk. However, the recommended level of EPA+DHA intake cannot be achieved solely from farmed or wild salmon while maintaining an acceptable level of carcinogenic risk. Although the benefit-risk ratio for carcinogens and noncarcinogens is significantly greater for wild Pacific salmon than for farmed Atlantic salmon as a group, the ratio for some subgroups of farmed salmon is on par with the ratio for wild salmon. This analysis suggests that risk of exposure to contaminants in farmed and wild salmon is partially offset by the fatty acid-associated health benefits. However, young children, women of child-bearing age, pregnant women, and nursing mothers not at significant risk for sudden cardiac death associated with CHD but concerned with health impairments such as reduction in IQ and other cognitive and behavioral effects, can minimize contaminant exposure by choosing the least contaminated wild salmon or by selecting other sources of (n-3) fatty acids.

  7. Comprehensive, Quantitative Risk Assessment of CO2 Geologic Sequestration

    SciTech Connect

    Lepinski, James

    2013-09-30

    A Quantitative Failure Modes and Effects Analysis (QFMEA) was developed to conduct comprehensive, quantitative risk assessments on CO2 capture, transportation, and sequestration or use in deep saline aquifers, enhanced oil recovery operations, or enhanced coal bed methane operations. The model identifies and characterizes potential risks; identifies the likely failure modes, causes, effects and methods of detection; lists possible risk prevention and risk mitigation steps; estimates potential damage recovery costs, mitigation costs and cost savings resulting from mitigation; and ranks (prioritizes) risks according to the probability of failure, the severity of failure, the difficulty of early failure detection and the potential for fatalities. The QFMEA model generates the information needed for effective project risk management. Diverse project information can be integrated into a concise, common format that allows comprehensive, quantitative analysis, by a cross-functional team of experts, to determine: What can possibly go wrong? How much will damage recovery cost? How can it be prevented or mitigated? What is the cost savings or benefit of prevention or mitigation? Which risks should be given highest priority for resolution? The QFMEA model can be tailored to specific projects and is applicable to new projects as well as mature projects. The model can be revised and updated as new information becomes available. It accepts input from multiple sources, such as literature searches, site characterization, field data, computer simulations, analogues, process influence diagrams, probability density functions, financial analysis models, cost factors, and heuristic best practices manuals, and converts the information into a standardized format in an Excel spreadsheet. Process influence diagrams, geologic models, financial models, cost factors and an insurance schedule were developed to support the QFMEA model.
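
    The ranking step the abstract describes (probability, severity, difficulty of early detection) can be illustrated with a classic FMEA risk priority number. The failure modes and 1-10 scores below are invented for illustration and are not taken from the QFMEA model itself.

```python
# Sketch of an FMEA-style risk ranking: each failure mode gets 1-10 scores
# for probability, severity, and difficulty of early detection; their product
# (a risk priority number, RPN) is used to sort risks for resolution.
failure_modes = [
    # (name, probability, severity, detection difficulty) -- illustrative scores
    ("Wellbore casing leak",         4,  9, 6),
    ("Caprock fracture migration",   2, 10, 8),
    ("Pipeline rupture (transport)", 3,  7, 2),
    ("Induced seismicity",           2,  6, 5),
]

ranked = sorted(failure_modes, key=lambda fm: fm[1] * fm[2] * fm[3], reverse=True)
for name, p, s, d in ranked:
    print(f"{name:32s} RPN = {p * s * d}")
```

    Sorting by the product puts hard-to-detect, severe failure modes at the top even when their probability is low, which is the behavior the QFMEA prioritization aims for.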

  8. Sugar concentration in nectar: a quantitative metric of crop attractiveness for refined pollinator risk assessments.

    PubMed

    Knopper, Loren D; Dan, Tereza; Reisig, Dominic D; Johnson, Josephine D; Bowers, Lisa M

    2016-10-01

    Those involved with pollinator risk assessment know that agricultural crops vary in attractiveness to bees. Intuitively, this means that exposure to agricultural pesticides is likely greatest for attractive plants and lowest for unattractive plants. While crop attractiveness in the risk assessment process has been qualitatively remarked on by some authorities, absent is direction on how to refine the process with quantitative metrics of attractiveness. At a high level, attractiveness of crops to bees appears to depend on several key variables, including but not limited to: floral, olfactory, visual and tactile cues; seasonal availability; physical and behavioral characteristics of the bee; plant and nectar rewards. Notwithstanding the complexities and interactions among these variables, sugar content in nectar stands out as a suitable quantitative metric by which to refine pollinator risk assessments for attractiveness. Provided herein is a proposed way to use sugar nectar concentration to adjust the exposure parameter (with what is called a crop attractiveness factor) in the calculation of risk quotients in order to derive crop-specific tier I assessments. This Perspective is meant to invite discussion on incorporating such changes in the risk assessment process. © 2016 The Authors. Pest Management Science published by John Wiley & Sons Ltd on behalf of Society of Chemical Industry. PMID:27197566
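
    The proposed adjustment can be written as a one-line change to a tier I risk quotient: the crop attractiveness factor (CAF) scales the exposure term. In the sketch below the per-bee dose conversion, application rate, toxicity value, and CAF values are all hypothetical; only the idea of scaling exposure by a CAF comes from the Perspective above.

```python
def tier1_risk_quotient(app_rate_lb_per_acre, contact_ld50_ug_per_bee,
                        attractiveness_factor, dose_per_rate=2.4):
    """Illustrative tier I contact risk quotient with a crop
    attractiveness factor (CAF) scaling the exposure term.

    dose_per_rate (ug/bee per lb/acre) is an assumed conversion from
    application rate to contact dose; all numeric inputs are hypothetical.
    """
    exposure_ug_per_bee = dose_per_rate * app_rate_lb_per_acre * attractiveness_factor
    return exposure_ug_per_bee / contact_ld50_ug_per_bee  # RQ = exposure / LD50

rq_attractive = tier1_risk_quotient(0.5, 0.05, 1.0)    # sugar-rich, attractive crop
rq_unattractive = tier1_risk_quotient(0.5, 0.05, 0.2)  # dilute nectar, low CAF
print(rq_attractive, rq_unattractive)
```

    With a CAF of 0.2 derived from nectar sugar concentration, the unattractive crop's RQ is five times lower for the same application rate, which is exactly the kind of crop-specific refinement the Perspective proposes.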

  10. Quantitative assessment of cancer risk from exposure to diesel engine emissions

    SciTech Connect

    Pepelko, W.E.; Chen, C.

    1993-01-01

    Quantitative estimates of lung cancer risk from exposure to diesel engine emissions were developed using data from three chronic bioassays with Fischer 344 rats. Human target organ dose was estimated with the aid of a comprehensive dosimetry model. The epithelial tissue lining the alveoli and lower airways is the primary target site for induction of lung tumors. Dose was therefore based upon the concentration of carbon particulate matter per unit lung surface area.

  11. Quantitative risk assessment of Listeriosis due to consumption of raw milk

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The objectives of this study were to estimate the risk of illnesses for raw milk consumers due to L. monocytogenes contamination in raw milk sold by permitted raw milk dealers, and the risk of listeriosis for people on farms who consume raw milk. Three scenarios were evaluated for raw milk sold by ...

  12. Quantitative evaluation of multi-walled carbon nanotube uptake in wheat and rapeseed.

    PubMed

    Larue, Camille; Pinault, Mathieu; Czarny, Bertrand; Georgin, Dominique; Jaillard, Danielle; Bendiab, Nedjma; Mayne-L'Hermite, Martine; Taran, Frédéric; Dive, Vincent; Carrière, Marie

    2012-08-15

    Environmental contamination with carbon nanotubes would lead to plant exposure and particularly exposure of agricultural crops. The only quantitative exposure data available to date which can be used for risk assessment comes from computer modeling. The aim of this study was to provide quantitative data relative to multi-walled carbon nanotube (MWCNT) uptake and distribution in agricultural crops, and to correlate accumulation data with impact on plant development and physiology. Roots of wheat and rapeseed were exposed in hydroponics to uniformly (14)C-radiolabeled MWCNTs. Radioimaging, transmission electron microscopy and Raman spectroscopy were used to identify CNT distribution. Radioactivity counting enabled absolute quantification of CNT accumulation in plant leaves. Impact of CNTs on seed germination, root elongation, plant biomass, evapotranspiration, chlorophyll, thiobarbituric acid reactive species and H(2)O(2) contents was evaluated. We demonstrate that less than 0.005‰ of the applied MWCNT dose is taken up by plant roots and translocated to the leaves. This accumulation does not impact plant development and physiology. In addition, it does not induce any modifications in photosynthetic activity or cause oxidative stress in plant leaves. Our results suggest that if environmental contamination occurs and MWCNTs are in the same physico-chemical state as those used in the present study, MWCNT transfer to the food chain via food crops would be very low.

  13. Gasbuggy Site Assessment and Risk Evaluation

    SciTech Connect

    2011-03-01

    The Gasbuggy site is in northern New Mexico in the San Juan Basin, Rio Arriba County (Figure 1-1). The Gasbuggy experiment was designed to evaluate the use of a nuclear detonation to enhance natural gas production from the Pictured Cliffs Formation, a tight, gas-bearing sandstone formation. The 29-kiloton-yield nuclear device was placed in a 17.5-inch wellbore at 4,240 feet (ft) below ground surface (bgs), approximately 40 ft below the Pictured Cliffs/Lewis shale contact, in an attempt to force the cavity/chimney formed by the detonation up into the Pictured Cliffs Sandstone. The test was conducted below the southwest quarter of Section 36, Township 29 North, Range 4 West, New Mexico Principal Meridian. The device was detonated on December 10, 1967, creating a 335-ft-high chimney above the detonation point and a cavity 160 ft in diameter. The gas produced from GB-ER (the emplacement and reentry well) during the post-detonation production tests was radioactive and diluted, primarily by carbon dioxide. After 2 years, the energy content of the gas had recovered to 80 percent of the value of gas in conventionally developed wells in the area. There is currently no technology capable of remediating deep underground nuclear detonation cavities and chimneys. Consequently, the U.S. Department of Energy (DOE) must continue to manage the Gasbuggy site to ensure that no inadvertent intrusion into the residual contamination occurs. DOE has complete control over the 1/4 section (160 acres) containing the shot cavity, and no drilling is permitted on that property. However, oil and gas leases are on the surrounding land. Therefore, the most likely route of intrusion and potential exposure would be through contaminated natural gas or contaminated water migrating into a producing natural gas well outside the immediate vicinity of ground zero. The purpose of this report is to describe the current site conditions and evaluate the potential health risks posed by the most plausible

  14. A quantitative assessment of risks of heavy metal residues in laundered shop towels and their use by workers.

    PubMed

    Connor, Kevin; Magee, Brian

    2014-10-01

    This paper presents a risk assessment of exposure to metal residues in laundered shop towels by workers. The concentrations of 27 metals measured in a synthetic sweat leachate were used to estimate the releasable quantity of metals which could be transferred to workers' skin. Worker exposure was evaluated quantitatively with an exposure model that focused on towel-to-hand transfer and subsequent hand-to-food or -mouth transfers. The exposure model was based on conservative, but reasonable assumptions regarding towel use and default exposure factor values from the published literature or regulatory guidance. Transfer coefficients were derived from studies representative of the exposures to towel users. Contact frequencies were based on assumed high-end use of shop towels, but constrained by a theoretical maximum dermal loading. The risk estimates for workers developed for all metals were below applicable regulatory risk benchmarks. The risk assessment for lead utilized the Adult Lead Model and concluded that predicted lead intakes do not constitute a significant health hazard based on potential worker exposures. Uncertainties are discussed in relation to the overall confidence in the exposure estimates developed for each exposure pathway and the likelihood that the exposure model is under- or overestimating worker exposures and risk. PMID:24973502
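
    The exposure model's towel-to-hand and hand-to-mouth chain can be sketched as follows; all transfer coefficients, contact frequencies, and the loading cap below are invented placeholders, not the paper's literature-derived values.

```python
def daily_metal_intake_ug(releasable_ug_per_towel, towels_per_day=12,
                          towel_to_hand=0.1, hand_to_mouth=0.13,
                          mouthing_events_per_day=20, max_hand_loading_ug=None):
    """Hypothetical towel-to-hand / hand-to-mouth intake estimate (ug/day)."""
    hand_load = releasable_ug_per_towel * towel_to_hand * towels_per_day
    if max_hand_loading_ug is not None:
        # constrain by a theoretical maximum dermal loading, as in the abstract
        hand_load = min(hand_load, max_hand_loading_ug)
    # each mouthing event transfers a fraction of the remaining hand loading,
    # so repeated events deplete rather than multiply the loading
    return hand_load * (1 - (1 - hand_to_mouth) ** mouthing_events_per_day)

print(daily_metal_intake_ug(0.5))
print(daily_metal_intake_ug(0.5, max_hand_loading_ug=0.3))
```

    The depletion form keeps the daily intake from exceeding what is actually on the hands, which mirrors the abstract's use of a theoretical maximum dermal loading to constrain high-end contact frequencies.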

  15. Evaluation of the "Respect Not Risk" Firearm Safety Lesson for 3rd-Graders

    ERIC Educational Resources Information Center

    Liller, Karen D.; Perrin, Karen; Nearns, Jodi; Pesce, Karen; Crane, Nancy B.; Gonzalez, Robin R.

    2003-01-01

    The purpose of this study was to evaluate the MORE HEALTH "Respect Not Risk" Firearm Safety Lesson for 3rd-graders in Pinellas County, Florida. Six schools representative of various socioeconomic levels were selected as the test sites. Qualitative and quantitative data were collected. A total of 433 matched pretests/posttests were used to…

  16. Biostatistical considerations in pharmacovigilance and pharmacoepidemiology: linking quantitative risk assessment in pre-market licensure application safety data, post-market alert reports and formal epidemiological studies.

    PubMed

    O'Neill, R T

    This paper deals with a conceptual discussion of a variety of statistical concepts, methods and strategies that are relevant to the quantitative assessment of risk derived from safety data collected during the pre- and post-marketing phase of a new drug's life cycle. A call is made for the use of more standard approaches to the analysis of safety data that are statistically and epidemiologically rigorous and for attempts to link the strategies for pre-market safety assessment with strategies for post-market safety evaluation. This link may be facilitated by recognizing the limitations and complementary roles played by pre- and post-market safety data collection schemes and by linking the quantitative analyses utilized for either exploratory or confirmatory purposes of risk assessment in each phase of safety data collection. Examples are provided of studies specifically designed to evaluate risk in a post approval setting and several available guidelines intended to improve the quality of these studies are discussed.

  17. Designs for Risk Evaluation and Management

    SciTech Connect

    2015-12-01

    The Designs for Risk Evaluation and Management (DREAM) tool was developed as part of the effort to quantify the risk of geologic storage of carbon dioxide (CO2) under the U.S. Department of Energy’s National Risk Assessment Partnership (NRAP). DREAM is an optimization tool created to identify optimal monitoring schemes that minimize the time to first detection of CO2 leakage from a subsurface storage formation. DREAM acts as a post-processor on user-provided output from subsurface leakage simulations. While DREAM was developed for CO2 leakage scenarios, it is applicable to any subsurface leakage simulation of the same output format. The DREAM tool comprises three main components: (1) a Java wizard used to configure and execute the simulations, (2) a visualization tool to view the domain space and optimization results, and (3) a plotting tool used to analyze the results. A secondary Java application is provided to aid users in converting common American Standard Code for Information Interchange (ASCII) output data to the standard DREAM hierarchical data format (HDF5). DREAM employs a simulated annealing approach that searches the solution space by iteratively mutating potential monitoring schemes built of various configurations of monitoring locations and leak detection parameters. This approach has proven to be orders of magnitude faster than an exhaustive search of the entire solution space. The user’s manual illustrates the program graphical user interface (GUI), describes the tool inputs, and includes an example application.
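
    DREAM's core loop (mutate a monitoring scheme, keep improvements, occasionally accept worse schemes while the temperature is high) can be sketched on synthetic data. Everything below (grid size, detection times, cooling schedule) is invented; DREAM itself post-processes real leakage-simulation output in HDF5.

```python
import math
import random

random.seed(0)

# Synthetic stand-in for subsurface leakage output: detection_time[s][n] is
# the time at which monitoring node n would first detect leak scenario s.
n_nodes, n_scenarios, k = 30, 20, 4
detection_time = [[random.uniform(1, 100) for _ in range(n_nodes)]
                  for _ in range(n_scenarios)]

def mean_time_to_first_detection(scheme):
    """Average over scenarios of the earliest detection by any chosen node."""
    return sum(min(row[n] for n in scheme) for row in detection_time) / n_scenarios

def anneal(steps=5000, t0=10.0):
    scheme = set(random.sample(range(n_nodes), k))
    cost = mean_time_to_first_detection(scheme)
    best, best_cost = set(scheme), cost
    for i in range(steps):
        temp = t0 * (1 - i / steps) + 1e-9          # linear cooling schedule
        cand = set(scheme)                           # mutate: swap one node
        cand.remove(random.choice(list(cand)))
        cand.add(random.choice([n for n in range(n_nodes) if n not in cand]))
        c = mean_time_to_first_detection(cand)
        # Metropolis acceptance: always take improvements, sometimes take worse
        if c < cost or random.random() < math.exp((cost - c) / temp):
            scheme, cost = cand, c
            if c < best_cost:
                best, best_cost = set(cand), c
    return best, best_cost

scheme, cost = anneal()
print(sorted(scheme), round(cost, 2))
```

    Evaluating only one mutated scheme per step is what makes this approach so much cheaper than enumerating all C(30, 4) candidate schemes.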

  18. Evaluation of residue drum storage safety risks

    SciTech Connect

    Conner, W.V.

    1994-06-17

    A study was conducted to determine if any potential safety problems exist in the residue drum backlog at the Rocky Flats Plant. Plutonium residues stored in 55-gallon drums were packaged for short-term storage until the residues could be processed for plutonium recovery. These residues have now been determined by the Department of Energy to be waste materials, and the residues will remain in storage until plans for disposal of the material can be developed. The packaging configurations which were safe for short-term storage may not be safe for long-term storage. Interviews with Rocky Flats personnel involved with packaging the residues reveal that more than one packaging configuration was used for some of the residues. A tabulation of packaging configurations was developed based on the information obtained from the interviews. A number of potential safety problems were identified during this study, including hydrogen generation from some residues and residue packaging materials, contamination containment loss, metal residue packaging container corrosion, and pyrophoric plutonium compound formation. Risk factors were developed for evaluating the risk potential of the various residue categories, and the residues in storage at Rocky Flats were ranked by risk potential. Preliminary drum head space gas sampling studies have demonstrated the potential for formation of flammable hydrogen-oxygen mixtures in some residue drums.

  19. Quantitative risk assessment of the New York State operated West Valley Radioactive Waste Disposal Area.

    PubMed

    Garrick, B John; Stetkar, John W; Bembia, Paul J

    2010-08-01

    This article is based on a quantitative risk assessment (QRA) that was performed on a radioactive waste disposal area within the Western New York Nuclear Service Center in western New York State. The QRA results were instrumental in the decision by the New York State Energy Research and Development Authority to support a strategy of in-place management of the disposal area for another decade. The QRA methodology adopted for this first-of-a-kind application was a scenario-based approach in the framework of the triplet definition of risk (scenarios, likelihoods, consequences). The measure of risk is the frequency of occurrence of different levels of radiation dose to humans at prescribed locations. The risk from each scenario is determined by (1) the frequency of disruptive events or natural processes that cause a release of radioactive materials from the disposal area; (2) the physical form, quantity, and radionuclide content of the material that is released during each scenario; (3) distribution, dilution, and deposition of the released materials throughout the environment surrounding the disposal area; and (4) public exposure to the distributed material and the accumulated radiation dose from that exposure. The risks of the individual scenarios are assembled into a representation of the risk from the disposal area. In addition to quantifying the total risk to the public, the analysis ranks the importance of each contributing scenario, which facilitates taking corrective actions and implementing effective risk management. Perhaps most importantly, quantification of the uncertainties is an intrinsic part of the risk results. This approach to safety analysis has demonstrated many advantages of applying QRA principles to assessing the risk of facilities involving hazardous materials.
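
    The assembly step under the triplet definition can be shown concretely: each scenario contributes its frequency to an exceedance curve over dose. The scenario names, frequencies, and doses below are invented placeholders, not values from the West Valley QRA.

```python
# Assembling scenario-level results (scenario, likelihood, consequence) into
# a frequency-of-exceedance representation of risk. All values are invented.
scenarios = [
    # (name, annual frequency, receptor dose in mrem)
    ("erosion breach",        1e-3,  50.0),
    ("inadvertent intrusion", 1e-4, 500.0),
    ("extreme flood",         5e-4, 120.0),
]

def exceedance_frequency(dose_level):
    """Annual frequency that the dose at the receptor exceeds dose_level."""
    return sum(freq for _, freq, dose in scenarios if dose > dose_level)

for level in (10.0, 100.0, 400.0):
    print(f"F(dose > {level} mrem) = {exceedance_frequency(level):.2e} /yr")
```

    Ranking scenarios by their contribution at a dose level of concern is what lets the analysis point corrective actions at the dominant contributors.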

  20. Quantitative evaluation of digital dental radiograph imaging systems.

    PubMed

    Hildebolt, C F; Vannier, M W; Pilgram, T K; Shrout, M K

    1990-11-01

    Two digital imaging systems, a video camera and analog-to-digital converter, and a charge-coupled device linear photodiode array slide scanner, were tested for their suitability in quantitative studies of periodontal disease. The information content in the original films was estimated, and digital systems were assessed according to these requirements. Radiometric and geometric performance criteria for the digital systems were estimated from measurements and observations. The scanner-based image acquisition (digitization) system had no detectable noise and had a modulation transfer function curve superior to that of the video-based system. The scanner-based system was equivalent to the video-based system in recording radiographic film densities and had more geometric distortion than the video-based system. The comparison demonstrated the superiority of the charge-coupled device linear array system for the quantification of periodontal disease extent and activity. PMID:2234888

  1. An evaluation of recent quantitative magnetospheric magnetic field models

    NASA Technical Reports Server (NTRS)

    Walker, R. J.

    1976-01-01

    Magnetospheric field models involving dipole tilt effects are discussed, with particular reference to defined magnetopause models and boundary surface models. The models are compared with observations and with each other whenever possible. It is shown that models containing only contributions from magnetopause and tail current systems are capable of reproducing the observed quiet time field only in a qualitative way. The best quantitative agreement between models and observations occurs when currents distributed in the inner magnetosphere are added to the magnetopause and tail current systems. One region in which all the models fall short is the region around the polar cusp. Obtaining physically reasonable gradients should have high priority in the development of future models.

  2. Using quantitative interference phase microscopy for sperm acrosome evaluation (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Balberg, Michal; Kalinowski, Ksawery; Levi, Mattan; Shaked, Natan T.

    2016-03-01

    We demonstrate quantitative assessment of sperm cell morphology, primarily acrosomal volume, using quantitative interference phase microscopy (IPM). Normally, the area of the acrosome is assessed using dyes that stain the acrosomal part of the cell. We imaged fixed individual sperm cells using IPM. The sample was then stained, and the same cells were imaged using bright field microscopy (BFM). We identified the acrosome using the stained BFM image, used it to define the corresponding area in the IPM image, and determined a quantitative threshold for evaluating the volume of the acrosome.
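
    Converting an IPM phase map into a volume for a thresholded region works as in the sketch below; the wavelength, refractive-index difference, pixel size, synthetic phase map, and threshold are all assumed values, not the study's parameters.

```python
import math

# Assumed optical parameters (illustrative only).
wavelength_um = 0.633        # illumination wavelength
delta_n = 0.04               # cell vs. medium refractive-index difference
pixel_area_um2 = 0.01        # 0.1 um x 0.1 um pixels

# Synthetic 64x64 phase map (radians) with a square "acrosome" bump.
phase = [[2.0 if 20 <= r < 40 and 20 <= c < 40 else 0.0
          for c in range(64)] for r in range(64)]

# Optical path difference per pixel: OPD = phi * lambda / (2*pi).
# Volume of the thresholded region: sum of OPD * pixel area / delta_n.
opd_threshold_um = 0.1       # region cut-off, here standing in for the BFM mask
volume_um3 = sum(
    phi * wavelength_um / (2 * math.pi) * pixel_area_um2 / delta_n
    for row in phase for phi in row
    if phi * wavelength_um / (2 * math.pi) > opd_threshold_um
)
print(round(volume_um3, 2))
```

    In the actual workflow the stained BFM image, registered to the IPM field of view, would supply the mask instead of a bare OPD threshold.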

  3. Benchmark dose profiles for joint-action continuous data in quantitative risk assessment.

    PubMed

    Deutsch, Roland C; Piegorsch, Walter W

    2013-09-01

    Benchmark analysis is a widely used tool in biomedical and environmental risk assessment. Therein, estimation of minimum exposure levels, called benchmark doses (BMDs), that induce a prespecified benchmark response (BMR) is well understood for the case of an adverse response to a single stimulus. For cases where two agents are studied in tandem, however, the benchmark approach is far less developed. This paper demonstrates how the benchmark modeling paradigm can be expanded from the single-agent setting to joint-action, two-agent studies. Focus is on continuous response outcomes. Extending the single-exposure setting, representations of risk are based on a joint-action dose-response model involving both agents. Based on such a model, the concept of a benchmark profile-a two-dimensional analog of the single-dose BMD at which both agents achieve the specified BMR-is defined for use in quantitative risk characterization and assessment.
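
    The benchmark-profile idea can be made concrete with a hypothetical joint-action model (a linear model with interaction, not the authors' fitted model): fixing the BMR as a relative change from background, each dose d1 of agent 1 determines the companion dose d2 on the profile.

```python
# Hypothetical joint-action continuous-response model (all coefficients invented):
#   mu(d1, d2) = b0 + b1*d1 + b2*d2 + b12*d1*d2
# The benchmark profile is the contour where the mean response rises by
# BMR relative to background: mu(d1, d2) - b0 = BMR * b0.
b0, b1, b2, b12 = 10.0, 0.8, 1.5, 0.05
bmr = 0.10  # 10% relative change from background

def d2_on_profile(d1):
    """Dose of agent 2 that, jointly with d1, hits the benchmark response."""
    target = bmr * b0
    return (target - b1 * d1) / (b2 + b12 * d1)

for d1 in (0.0, 0.5, 1.0):
    print(f"d1 = {d1:.1f} -> d2 = {d2_on_profile(d1):.3f}")
```

    At d1 = 0 the profile reduces to the single-agent BMD for agent 2, and where d2 reaches 0 it recovers the single-agent BMD for agent 1, so the curve is the two-dimensional analog the abstract describes.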

  4. Attribution of human VTEC O157 infection from meat products: a quantitative risk assessment approach.

    PubMed

    Kosmider, Rowena D; Nally, Pádraig; Simons, Robin R L; Brouwer, Adam; Cheung, Susan; Snary, Emma L; Wooldridge, Marion

    2010-05-01

    To address the risk posed to human health by the consumption of VTEC O157 within contaminated pork, lamb, and beef products within Great Britain, a quantitative risk assessment model has been developed. This model aims to simulate the prevalence and amount of VTEC O157 in different meat products at consumption within a single model framework by adapting previously developed models. The model is stochastic in nature, enabling both variability (natural variation between animals, carcasses, products) and uncertainty (lack of knowledge) about the input parameters to be modeled. Based on the model assumptions and data, it is concluded that the prevalence of VTEC O157 in meat products (joints and mince) at consumption is low (i.e., <0.04%). Beef products, particularly beef burgers, present the highest estimated risk with an estimated eight out of 100,000 servings on average resulting in human infection with VTEC O157.

  5. Quantitative versus Qualitative Evaluation: A Tool to Decide Which to Use

    ERIC Educational Resources Information Center

    Dobrovolny, Jackie L.; Fuentes, Stephanie Christine G.

    2008-01-01

    Evaluation is often avoided in human performance technology (HPT), but it is an essential and frequently catalytic activity that adds significant value to projects. Knowing how to approach an evaluation and whether to use qualitative, quantitative, or both methods makes evaluation much easier. In this article, we provide tools to help determine…

  6. Online versus Paper Evaluations: Differences in Both Quantitative and Qualitative Data

    ERIC Educational Resources Information Center

    Burton, William B.; Civitano, Adele; Steiner-Grossman, Penny

    2012-01-01

    This study sought to determine if differences exist in the quantitative and qualitative data collected with paper and online versions of a medical school clerkship evaluation form. Data from six-and-a-half years of clerkship evaluations were used, some collected before and some after the conversion from a paper to an online evaluation system. The…

  7. A quantitative methodology to assess the risks to human health from CO2 leakage into groundwater

    NASA Astrophysics Data System (ADS)

    Siirila, Erica R.; Navarre-Sitchler, Alexis K.; Maxwell, Reed M.; McCray, John E.

    2012-02-01

    Leakage of CO2 and associated gases into overlying aquifers as a result of geologic carbon capture and sequestration may have adverse impacts on aquifer drinking-water quality. Gas or aqueous-phase leakage may occur due to transport via faults and fractures, through faulty well bores, or through leaky confining materials. Contaminants of concern include aqueous salts and dissolved solids, gaseous or aqueous-phase organic contaminants, and acidic gas or aqueous-phase fluids that can liberate metals from aquifer minerals. Here we present a quantitative risk assessment framework to predict potential human health risk from CO2 leakage into drinking water aquifers. This framework incorporates the potential release of CO2 into the drinking water aquifer; mobilization of metals due to a decrease in pH; transport of these metals down gradient to municipal receptors; distributions of contaminated groundwater to multiple households; and exposure and health risk to individuals using this water for household purposes. Additionally, this framework is stochastic, incorporates detailed variations in geological and geostatistical parameters and discriminates between uncertain and variable parameters using a two-stage, or nested, Monte Carlo approach. This approach is demonstrated using example simulations with hypothetical, yet realistic, aquifer characteristics and leakage scenarios. These example simulations show a greater risk for arsenic than for lead for both cancer and non-cancer endpoints, an unexpected finding. Higher background groundwater gradients also yield higher risk. The overall risk and the associated uncertainty are sensitive to the extent of aquifer stratification and the degree of local-scale dispersion. These results all highlight the importance of hydrologic modeling in risk assessment. A linear relationship between carcinogenic and noncarcinogenic risk was found for arsenic and suggests action levels for carcinogenic risk will be exceeded in exposure
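
    The two-stage (nested) Monte Carlo structure is the key design point: uncertain parameters are sampled in an outer loop and variable ones in an inner loop, so that uncertainty in a chosen risk percentile can be reported separately from person-to-person variability. The sketch below uses invented distributions and an assumed slope factor, not the study's calibrated inputs.

```python
import math
import random
import statistics

random.seed(2)

def nested_risk(n_outer=200, n_inner=500):
    """Distribution (over uncertainty) of the 95th-percentile individual risk.

    All distributions and the slope factor are illustrative placeholders.
    """
    p95_risks = []
    for _ in range(n_outer):                      # outer loop: UNCERTAIN parameters
        slope_factor = random.lognormvariate(math.log(1.5), 0.3)  # (mg/kg-day)^-1
        mean_conc = random.uniform(0.5, 2.0)      # uncertain mean As conc (ug/L)
        inner = []
        for _ in range(n_inner):                  # inner loop: VARIABLE parameters
            conc = random.lognormvariate(math.log(mean_conc), 0.5)
            intake_l = random.uniform(1.0, 3.0)   # drinking water, L/day
            dose = conc * intake_l / 1000 / 70.0  # mg/kg-day for a 70 kg adult
            inner.append(dose * slope_factor)     # incremental cancer risk
        p95_risks.append(statistics.quantiles(inner, n=100)[94])  # 95th percentile
    return p95_risks

p95_risks = nested_risk()
print(statistics.median(p95_risks))
```

    Reporting the spread of `p95_risks` across outer iterations is what lets this design say "the 95th-percentile risk is between X and Y" rather than blending uncertainty and variability into one number.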

  8. Quantitative microbial risk assessment applied to irrigation of salad crops with waste stabilization pond effluents.

    PubMed

    Pavione, D M S; Bastos, R K X; Bevilacqua, P D

    2013-01-01

    A quantitative microbial risk assessment model for estimating infection risks arising from consuming crops eaten raw that have been irrigated with effluents from stabilization ponds was constructed. A log-normal probability distribution function was fitted to a large database from a comprehensive monitoring of an experimental pond system to account for variability in Escherichia coli concentration in irrigation water. Crop contamination levels were estimated using predictive models derived from field experiments involving the irrigation of several crops with different effluent qualities. Data on daily intake of salad crops were obtained from a national survey in Brazil. Ten thousand-trial Monte Carlo simulations were used to estimate human health risks associated with the use of wastewater for irrigating low- and high-growing crops. The use of effluents containing 10^3-10^4 E. coli per 100 ml resulted in median rotavirus infection risks of approximately 10^-3 and 10^-4 per person per year when irrigating low- and high-growing crops, respectively; the corresponding 95th percentile risk estimates were around 10^-2 in both scenarios. Sensitivity analyses revealed that variations in effluent quality, in the assumed ratios of pathogens to E. coli, and in the reduction of pathogens between harvest and consumption had great impact upon risk estimates.
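
    The model chain (log-normal E. coli in effluent, crop contamination, pathogen-to-indicator ratio, ingested dose, rotavirus dose-response) can be sketched as follows. The approximate beta-Poisson parameters (alpha = 0.253, N50 = 6.17) are commonly cited rotavirus values; every transfer factor and ratio here is an invented placeholder, not the paper's fitted model.

```python
import random

random.seed(3)

ALPHA, N50 = 0.253, 6.17                 # commonly cited rotavirus parameters
BETA = N50 / (2 ** (1 / ALPHA) - 1)

def p_infection(dose):
    """Approximate beta-Poisson dose-response."""
    return 1 - (1 + dose / BETA) ** -ALPHA

def annual_risks(n=10_000, exposures_per_year=70):
    risks = []
    for _ in range(n):
        ecoli_per_100ml = 10 ** random.gauss(3.5, 0.5)  # ~10^3-10^4 E. coli/100 ml
        ecoli_per_g = ecoli_per_100ml * 0.001           # placeholder crop transfer
        ratio = 10 ** random.uniform(-7, -6)            # placeholder rotavirus:E. coli
        dose = ecoli_per_g * ratio * 30.0               # placeholder 30 g serving
        per_serving = p_infection(dose)
        risks.append(1 - (1 - per_serving) ** exposures_per_year)
    risks.sort()
    return risks

risks = annual_risks()
median, p95 = risks[len(risks) // 2], risks[int(0.95 * len(risks))]
print(median, p95)
```

    The sensitivity finding in the abstract corresponds to how strongly `median` moves when the effluent distribution, the pathogen ratio, or a harvest-to-consumption decay factor is changed in this chain.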

  9. Bioaerosol Deposition to Food Crops near Manure Application: Quantitative Microbial Risk Assessment.

    PubMed

    Jahne, Michael A; Rogers, Shane W; Holsen, Thomas M; Grimberg, Stefan J; Ramler, Ivan P; Kim, Seungo

    2016-03-01

    Production of both livestock and food crops are central priorities of agriculture; however, food safety concerns arise where these practices intersect. In this study, we investigated the public health risks associated with potential bioaerosol deposition to crops grown in the vicinity of manure application sites. A field sampling campaign at dairy manure application sites supported the emission, transport, and deposition modeling of bioaerosols emitted from these lands following application activities. Results were coupled with a quantitative microbial risk assessment model to estimate the infection risk due to consumption of leafy green vegetable crops grown at various distances downwind from the application area. Inactivation of pathogens (including E. coli O157:H7) on both the manure-amended field and on crops was considered to determine the maximum loading of pathogens to plants with time following application. Overall median one-time infection risks at the time of maximum loading decreased from 1:1300 at 0 m directly downwind from the field to 1:6700 at 100 m and 1:92,000 at 1000 m; peak risks (95th percentiles) were considerably greater (1:18, 1:89, and 1:1200, respectively). Median risk was below 1:10,000 at >160 m downwind. As such, it is recommended that a 160-m setback distance is provided between manure application and nearby leafy green crop production. Additional distance or delay before harvest will provide further protection of public health.
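
    The setback recommendation amounts to finding where the median risk curve crosses 1 in 10,000. A coarse log-linear interpolation over just the three medians quoted above illustrates the idea; the full model output, evaluated at many more distances, is what supports the 160 m figure, so this three-point sketch only lands in the same neighborhood.

```python
import math

# (distance in m, median one-time infection risk) pairs quoted in the abstract.
points = [(0, 1 / 1300), (100, 1 / 6700), (1000, 1 / 92000)]
threshold = 1e-4  # 1 in 10,000

def setback_distance():
    """Distance where log(risk) interpolated linearly in distance hits the threshold."""
    for (x0, r0), (x1, r1) in zip(points, points[1:]):
        if r0 >= threshold >= r1:  # threshold bracketed by this segment
            f = (math.log(threshold) - math.log(r0)) / (math.log(r1) - math.log(r0))
            return x0 + f * (x1 - x0)
    raise ValueError("threshold outside tabulated range")

print(round(setback_distance()))
```
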

  10. Medication Exposure in Pregnancy Risk Evaluation Program

    PubMed Central

    Andrade, Susan E.; Davis, Robert L.; Cheetham, T. Craig; Cooper, William O.; Li, De-Kun; Amini, Thushi; Beaton, Sarah J.; Dublin, Sascha; Hammad, Tarek A.; Pawloski, Pamala A.; Raebel, Marsha A.; Smith, David H.; Staffa, Judy A.; Toh, Sengwee; Dashevsky, Inna; Haffenreffer, Katherine; Lane, Kimberly; Platt, Richard; Scott, Pamela E.

    2011-01-01

    To describe a program to study medication safety in pregnancy, the Medication Exposure in Pregnancy Risk Evaluation Program (MEPREP). MEPREP is a multi-site collaborative research program developed to enable the conduct of studies of medication use and outcomes in pregnancy. Collaborators include the U.S. Food and Drug Administration and researchers at the HMO Research Network, Kaiser Permanente Northern and Southern California, and Vanderbilt University. Datasets have been created at each site linking healthcare data for women delivering an infant between January 1, 2001 and December 31, 2008 and infants born to these women. Standardized data files include maternal and infant characteristics, medication use, and medical care at 11 health plans within 9 states; birth certificate data were obtained from the state departments of public health. MEPREP currently involves more than 20 medication safety researchers and includes data for 1,221,156 children delivered to 933,917 mothers. Current studies include evaluations of the prevalence and patterns of use of specific medications and a validation study of data elements in the administrative and birth certificate data files. MEPREP can support multiple studies by providing information on a large, ethnically and geographically diverse population. This partnership combines clinical and research expertise and data resources to enable the evaluation of outcomes associated with medication use during pregnancy. PMID:22002179

  11. The Children, Youth, and Families at Risk (CYFAR) Evaluation Collaboration.

    ERIC Educational Resources Information Center

    Marek, Lydia I.; Byrne, Richard A. W.; Marczak, Mary S.; Betts, Sherry C.; Mancini, Jay A.

    1999-01-01

    The Cooperative Extension Service's Children, Youth, and Families at Risk initiative is being assessed by the Evaluation Collaboration's three projects: state-strengthening evaluation project (resources to help states evaluate community programs); NetCon (evaluation of electronic and other networks); and National Youth at Risk Sustainability Study…

  12. Digital holographic microscopy for quantitative cell dynamic evaluation during laser microsurgery

    PubMed Central

    Yu, Lingfeng; Mohanty, Samarendra; Zhang, Jun; Genc, Suzanne; Kim, Myung K.; Berns, Michael W.; Chen, Zhongping

    2010-01-01

    Digital holographic microscopy allows determination of dynamic changes in the optical thickness profile of a transparent object with subwavelength accuracy. Here, we report a quantitative phase laser microsurgery system for evaluation of cellular/ sub-cellular dynamic changes during laser micro-dissection. The proposed method takes advantage of the precise optical manipulation by the laser microbeam and quantitative phase imaging by digital holographic microscopy with high spatial and temporal resolution. This system will permit quantitative evaluation of the damage and/or the repair of the cell or cell organelles in real time. PMID:19582118

  14. Quantitative analysis and health risk assessment of polycyclic aromatic hydrocarbons in edible vegetable oils marketed in Shandong of China.

    PubMed

    Jiang, Dafeng; Xin, Chenglong; Li, Wei; Chen, Jindong; Li, Fenghua; Chu, Zunhua; Xiao, Peirui; Shao, Lijun

    2015-09-01

    This work reports a quantitative analysis and health risk assessment of polycyclic aromatic hydrocarbons (PAHs) in edible vegetable oils marketed in Shandong, China. The concentrations of 15 PAHs in 242 samples were determined by high performance liquid chromatography coupled with fluorescence detection. The results indicated that the mean concentration of the 15 PAHs in oil samples was 54.37 μg kg(-1). Low molecular weight PAH compounds were the predominant contaminants. Notably, the carcinogenic benzo(a)pyrene (BaP) was detected at a mean concentration of 1.28 μg kg(-1), below the limits set by the European Union and China. A preliminary human health risk assessment for PAHs was performed using BaP toxic equivalency factors and the incremental lifetime cancer risk (ILCR). The ILCR values for children, adolescents, adults, and seniors were all larger than 1 × 10(-6), indicating a potential carcinogenic risk to the dietarily exposed populations. PMID:26072099
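
    The BaP toxic-equivalency plus ILCR calculation referenced above follows a standard pattern: convert each PAH to a BaP-equivalent concentration via its toxic equivalency factor (TEF), then apply a lifetime-average dose and slope factor. The sketch below assumes a small Nisbet & LaGoy-style TEF subset and illustrative exposure parameters (intake, body weight, durations); apart from the BaP mean, none of the numbers are the paper's data:

```python
# Sketch of the BaP-equivalency + ILCR calculation. The TEF subset and all
# exposure parameters below are illustrative assumptions, not study data.
TEF = {"BaP": 1.0, "BaA": 0.1, "Chr": 0.01}  # Nisbet & LaGoy-style factors

def bap_equivalent(conc_ug_kg):
    """BaP-equivalent concentration: sum of concentration x TEF (ug/kg oil)."""
    return sum(c * TEF[p] for p, c in conc_ug_kg.items())

def ilcr(bapeq_ug_kg, intake_kg_day, ef_days_yr, ed_years, bw_kg, at_days, csf=7.3):
    """Incremental lifetime cancer risk; csf = oral slope factor, (mg/kg-day)^-1."""
    daily_dose_mg = bapeq_ug_kg * 1e-3 * intake_kg_day   # ug -> mg
    return daily_dose_mg * ef_days_yr * ed_years * csf / (bw_kg * at_days)

conc = {"BaP": 1.28, "BaA": 3.0, "Chr": 5.0}  # ug/kg; BaP mean from the abstract
risk = ilcr(bap_equivalent(conc), intake_kg_day=0.025,
            ef_days_yr=365, ed_years=30, bw_kg=60, at_days=365 * 70)
print(f"ILCR = {risk:.2e}")  # values above 1e-6 flag potential concern
```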

  15. Predictive value of quantitative dipyridamole-thallium scintigraphy in assessing cardiovascular risk after vascular surgery in diabetes mellitus

    SciTech Connect

    Lane, S.E.; Lewis, S.M.; Pippin, J.J.; Kosinski, E.J.; Campbell, D.; Nesto, R.W.; Hill, T.

    1989-12-01

    Cardiac complications represent a major risk to patients undergoing vascular surgery. Diabetic patients may be particularly prone to such complications due to the high incidence of concomitant coronary artery disease, the severity of which may be clinically unrecognized. Attempts to stratify groups by clinical criteria have been useful but lack the predictive value of currently used noninvasive techniques such as dipyridamole-thallium scintigraphy. One hundred one diabetic patients were evaluated with dipyridamole-thallium scintigraphy before undergoing vascular surgery. The incidence of thallium abnormalities was high (80%) and did not correlate with clinical markers of coronary disease. Even in a subgroup of patients with no overt clinical evidence of underlying heart disease, thallium abnormalities were present in 59%. Cardiovascular complications, however, occurred in only 11% of all patients. Statistically significant prediction of risk was not achieved with simple assessment of thallium results as normal or abnormal. Quantification of total number of reversible defects, as well as assessment of ischemia in the distribution of the left anterior descending coronary artery was required for optimum predictive accuracy. The prevalence of dipyridamole-thallium abnormalities in a diabetic population is much higher than that reported in nondiabetic patients and cannot be predicted by usual clinical indicators of heart disease. In addition, cardiovascular risk of vascular surgery can be optimally assessed by quantitative analysis of dipyridamole-thallium scintigraphy and identification of high- and low-risk subgroups.

  16. A lighting metric for quantitative evaluation of accent lighting systems

    NASA Astrophysics Data System (ADS)

    Acholo, Cyril O.; Connor, Kenneth A.; Radke, Richard J.

    2014-09-01

    Accent lighting is critical for artwork and sculpture lighting in museums, and subject lighting for stage, film, and television. The research problem of designing effective lighting in such settings has been revived recently with the rise of light-emitting-diode-based solid state lighting. In this work, we propose an easy-to-apply quantitative measure of the scene's visual quality as perceived by human viewers. We consider a well-accent-lit scene as one which maximizes the information about the scene (in an information-theoretic sense) available to the user. We propose a metric based on the entropy of the distribution of colors, which are extracted from an image of the scene from the viewer's perspective. We demonstrate that optimizing the metric as a function of illumination configuration (i.e., position, orientation, and spectral composition) results in natural, pleasing accent lighting. We use a photorealistic simulation tool to validate the functionality of our proposed approach, showing its successful application to two- and three-dimensional scenes.
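
    The proposed metric is the entropy of the color distribution extracted from a viewer-perspective image. A minimal sketch of such a measure, assuming simple per-channel quantization of RGB pixels (the exact color space and binning used by the authors are not specified here):

```python
import math
from collections import Counter

def color_entropy(pixels, bins=8):
    """Shannon entropy (bits) of the quantized color distribution of an
    iterable of (r, g, b) pixels with channel values in 0-255."""
    step = 256 // bins
    hist = Counter((r // step, g // step, b // step) for r, g, b in pixels)
    n = sum(hist.values())
    return -sum((c / n) * math.log2(c / n) for c in hist.values())

# A flat wash of one color carries less information than a varied scene.
flat = [(200, 180, 40)] * 100
varied = [(i * 37 % 256, i * 91 % 256, i * 53 % 256) for i in range(100)]
print(color_entropy(flat), color_entropy(varied))
```

    Optimizing the lighting configuration would then amount to searching over lamp position, orientation, and spectrum for the configuration whose rendered image maximizes this entropy.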

  17. Designs for Risk Evaluation and Management

    2015-12-01

    The Designs for Risk Evaluation and Management (DREAM) tool was developed as part of the effort to quantify the risk of geologic storage of carbon dioxide (CO2) under the U.S. Department of Energy’s National Risk Assessment Partnership (NRAP). DREAM is an optimization tool created to identify optimal monitoring schemes that minimize the time to first detection of CO2 leakage from a subsurface storage formation. DREAM acts as a post-processor on user-provided output from subsurface leakage simulations. While DREAM was developed for CO2 leakage scenarios, it is applicable to any subsurface leakage simulation of the same output format. The DREAM tool comprises three main components: (1) a Java wizard used to configure and execute the simulations, (2) a visualization tool to view the domain space and optimization results, and (3) a plotting tool used to analyze the results. A secondary Java application is provided to aid users in converting common American Standard Code for Information Interchange (ASCII) output data to the standard DREAM hierarchical data format (HDF5). DREAM employs a simulated annealing approach that searches the solution space by iteratively mutating potential monitoring schemes built of various configurations of monitoring locations and leak detection parameters. This approach has proven to be orders of magnitude faster than an exhaustive search of the entire solution space. The user’s manual illustrates the program graphical user interface (GUI), describes the tool inputs, and includes an example application.
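
    The simulated-annealing search DREAM is described as using can be sketched as follows. The objective below is a toy stand-in for time-to-first-detection, and the candidate wells, scheme size, and cooling schedule are all illustrative assumptions:

```python
import math
import random

# Toy sketch of a simulated-annealing search over monitoring schemes:
# iteratively mutate a scheme and accept worse schemes with a
# temperature-dependent probability. cost() is a stand-in objective,
# NOT DREAM's actual time-to-first-detection calculation.
random.seed(0)
CANDIDATE_WELLS = list(range(20))  # possible monitoring locations

def cost(scheme):
    # Pretend detection is best for wells near location 12.3.
    return sum(abs(w - 12.3) for w in scheme)

def mutate(scheme):
    # Swap one selected well for one unselected well (fixed scheme size).
    out = random.choice(sorted(scheme))
    new = random.choice([w for w in CANDIDATE_WELLS if w not in scheme])
    return frozenset(scheme - {out} | {new})

scheme = frozenset(random.sample(CANDIDATE_WELLS, 4))
best, temp = scheme, 10.0
for _ in range(2000):
    cand = mutate(scheme)
    delta = cost(cand) - cost(scheme)
    if delta < 0 or random.random() < math.exp(-delta / temp):
        scheme = cand
        if cost(scheme) < cost(best):
            best = scheme
    temp *= 0.995  # geometric cooling
print(sorted(best), cost(best))
```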

  18. A Quantitative Investigation of Stakeholder Variation in Training Program Evaluation.

    ERIC Educational Resources Information Center

    Michalski, Greg V.

    A survey was conducted to investigate variation in stakeholder perceptions of training results and evaluation within the context of a high-technology product development firm (the case organization). A scannable questionnaire survey booklet was developed and scanned data were exported and analyzed. Based on an achieved sample of 280 (70% response…

  19. Evaluation of Seismic Risk of Siberia Territory

    NASA Astrophysics Data System (ADS)

    Seleznev, V. S.; Soloviev, V. M.; Emanov, A. F.

    The outcomes of modern geophysical research by the Geophysical Survey SB RAS, directed at studying the geodynamic situation in large industrial and civil centers on the territory of Siberia with the purpose of evaluating the seismic risk of territories and predicting the origin of extreme situations of natural and man-caused character, are presented in the paper. First of all it concerns the testing and updating of a geoinformation system developed by the Russian Emergency Ministry designed for calculations regarding the seismic hazard and response to destructive earthquakes. The GIS database contains the catalogues of earthquakes and faults, seismic zonation maps, vectorized city maps, information on industrial and housing fund, data on character of building and population in inhabited places, etc. The geoinformation system allows solving, on a basis of probabilistic approaches, the following problems: estimating the earthquake impact and the forces, facilities, and supplies required for life-support of the injured population; determining the consequences of failures at chemical and explosion-dangerous objects; and optimization problems on assurance technology of conduct of salvage operations. Using this computer program, maps of earthquake risk have been constructed for several seismically dangerous regions of Siberia. These maps display the data on the probable amount of injured people and relative economic damage from an earthquake, which can occur in various sites of the territory according to the map of seismic zonation. The obtained maps have allowed determining places where detailed seismological observations should be arranged. Along with it, on the territory of Siberia the wide-ranging investigations with use of new methods of evaluation of the physical state of industrial and civil establishments (buildings and structures, hydroelectric power stations, bridges, dams, etc.), high-performance detailed electromagnetic researches of ground conditions of city

  20. Quantitative vertebral compression fracture evaluation using a height compass

    NASA Astrophysics Data System (ADS)

    Yao, Jianhua; Burns, Joseph E.; Wiese, Tatjana; Summers, Ronald M.

    2012-03-01

    Vertebral compression fractures can be caused by even minor trauma in patients with pathological conditions such as osteoporosis, varying greatly in vertebral body location and compression geometry. The location and morphology of the compression injury can guide decision making for treatment modality (vertebroplasty versus surgical fixation), and can be important for pre-surgical planning. We propose a height compass to evaluate the axial plane spatial distribution of compression injury (anterior, posterior, lateral, and central), and distinguish it from physiologic height variations of normal vertebrae. The method includes four steps: spine segmentation and partition, endplate detection, height compass computation and compression fracture evaluation. A height compass is computed for each vertebra, where the vertebral body is partitioned in the axial plane into 17 cells oriented about concentric rings. In the compass structure, a crown-like geometry is produced by three concentric rings which are divided into 8 equal length arcs by rays which are subtended by 8 common central angles. The radius of each ring increases multiplicatively, with resultant structure of a central node and two concentric surrounding bands of cells, each divided into octants. The height value for each octant is calculated and plotted against octants in neighboring vertebrae. The height compass shows intuitive display of the height distribution and can be used to easily identify the fracture regions. Our technique was evaluated on 8 thoraco-abdominal CT scans of patients with reported compression fractures and showed statistically significant differences in height value at the sites of the fractures.
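
    The 17-cell partition described above (a central node plus two concentric octant bands, with multiplicatively increasing ring radii) can be sketched as a simple polar lookup. The radii below are illustrative, not the paper's values:

```python
import math

def compass_cell(x, y, r1=5.0, growth=2.0):
    """Map a point in the axial plane to one of 17 height-compass cells:
    0 = central node; 1-8 = inner band octants; 9-16 = outer band octants;
    None = outside the outer ring. Radii grow multiplicatively."""
    r2, r3 = r1 * growth, r1 * growth * growth
    r = math.hypot(x, y)
    if r <= r1:
        return 0
    if r > r3:
        return None
    angle = math.atan2(y, x) % (2 * math.pi)
    octant = min(int(angle / (math.pi / 4)), 7)  # guard against rounding
    band = 0 if r <= r2 else 1
    return 1 + band * 8 + octant

print(compass_cell(0, 0), compass_cell(7, 0), compass_cell(0, 15))
```

    Averaging vertebral height within each cell and comparing the per-cell values against neighboring vertebrae would then localize the compression (anterior, posterior, lateral, or central).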

  2. A semi-quantitative approach to GMO risk-benefit analysis.

    PubMed

    Morris, E Jane

    2011-10-01

    In many countries there are increasing calls for the benefits of genetically modified organisms (GMOs) to be considered as well as the risks, and for a risk-benefit analysis to form an integral part of GMO regulatory frameworks. This trend represents a shift away from the strict emphasis on risks, which is encapsulated in the Precautionary Principle that forms the basis for the Cartagena Protocol on Biosafety, and which is reflected in the national legislation of many countries. The introduction of risk-benefit analysis of GMOs would be facilitated if clear methodologies were available to support the analysis. Up to now, methodologies for risk-benefit analysis that would be applicable to the introduction of GMOs have not been well defined. This paper describes a relatively simple semi-quantitative methodology that could be easily applied as a decision support tool, giving particular consideration to the needs of regulators in developing countries where there are limited resources and experience. The application of the methodology is demonstrated using the release of an insect resistant maize variety in South Africa as a case study. The applicability of the method in the South African regulatory system is also discussed, as an example of what might be involved in introducing changes into an existing regulatory process. PMID:21197601
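
    The paper's actual scoring scheme is not reproduced here, but a semi-quantitative risk-benefit matrix of the kind it describes typically scores each item as magnitude times likelihood on ordinal scales. A hypothetical sketch, loosely themed on the insect-resistant maize case study (all categories and entries are invented for illustration):

```python
# Hypothetical semi-quantitative scoring matrix (magnitude x likelihood on
# 1-4 ordinal scales). Categories and example entries are illustrative and
# are not the paper's actual scheme.
MAGNITUDE = {"negligible": 1, "minor": 2, "moderate": 3, "major": 4}
LIKELIHOOD = {"unlikely": 1, "possible": 2, "likely": 3, "certain": 4}

def score(items):
    return sum(MAGNITUDE[m] * LIKELIHOOD[l] for m, l in items)

risks = [("minor", "possible"),       # e.g. gene flow to non-GM maize
         ("moderate", "unlikely")]    # e.g. secondary pest outbreaks
benefits = [("major", "likely"),      # e.g. reduced stem-borer damage
            ("moderate", "likely")]   # e.g. lower insecticide use

r, b = score(risks), score(benefits)
print(f"risk score = {r}, benefit score = {b}")
```

    Such ordinal scores are decision-support inputs, not probabilities; their value for resource-limited regulators lies in making the trade-off explicit and auditable.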

  3. Quantitative evaluation of heavy metals in solid residues from sub- and super-critical water gasification of sewage sludge.

    PubMed

    Li, Lei; Xu, Z R; Zhang, Chunlei; Bao, Jianping; Dai, Xiaoxuan

    2012-10-01

    Solid residues (SRs) are important byproducts of sub- and super-critical water gasification of sewage sludge (SS). In this study, heavy metals (HMs) in SRs were quantitatively evaluated, in comparison with SS, in terms of potential ecological risks, pollution levels, and both bioavailability and eco-toxicity. The results show that the bioavailability and eco-toxicity of HMs in SRs decreased even though the total concentration of HMs increased; in particular, the bioavailable fraction of Cu decreased nearly 97%. The geo-accumulation and potential ecological risk indices indicated that the gasification process increased contamination by two levels (at the maximum), while the overall risk was in keeping with SS. However, based on the risk assessment code, each tested HM exhibited lower environmental risk after gasification, especially Cd, which drastically dropped from 66.67 (very high risk) in SS to 0.71 (no risk) in SRs, at a reaction temperature of 375°C for 60 min.
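
    The risk assessment code (RAC) mentioned above classes a metal by the percentage of its total content found in the most bioavailable fractions (exchangeable plus carbonate-bound in a sequential extraction). A sketch using the commonly cited RAC thresholds, with the Cd percentages taken from the abstract:

```python
# RAC = percent of a metal's total content in the exchangeable + carbonate
# fractions, classed on the commonly cited threshold bands. The Cd values
# are from the abstract; the bands are the standard RAC scale.
def rac_class(percent):
    if percent < 1:
        return "no risk"
    if percent <= 10:
        return "low risk"
    if percent <= 30:
        return "medium risk"
    if percent <= 50:
        return "high risk"
    return "very high risk"

for label, pct in [("Cd in SS", 66.67), ("Cd in SRs", 0.71)]:
    print(f"{label}: RAC = {pct}% -> {rac_class(pct)}")
```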

  4. A quantitative method for visual phantom image quality evaluation

    NASA Astrophysics Data System (ADS)

    Chakraborty, Dev P.; Liu, Xiong; O'Shea, Michael; Toto, Lawrence C.

    2000-04-01

    This work presents an image quality evaluation technique for uniform-background target-object phantom images. The Degradation-Comparison-Threshold (DCT) method involves degrading the image quality of a target-containing region with a blocking process and comparing the resulting image to a similarly degraded target-free region. The threshold degradation needed for 92% correct detection of the target region is the image quality measure of the target. Images of the American College of Radiology (ACR) mammography accreditation program phantom were acquired under varying x-ray conditions on a digital mammography machine. Five observers performed ACR and DCT evaluations of the images. A figure-of-merit (FOM) of an evaluation method was defined which takes into account measurement noise and the change of the measure as a function of x-ray exposure to the phantom. The FOM of the DCT method was 4.1 times that of the ACR method for the specks, 2.7 times better for the fibers and 1.4 times better for the masses. For the specks, inter-reader correlations on the same image set increased significantly from 87% for the ACR method to 97% for the DCT method. The viewing time per target for the DCT method was 3 - 5 minutes. The observed greater sensitivity of the DCT method could lead to more precise Quality Control (QC) testing of digital images, which should improve the sensitivity of the QC process to genuine image quality variations. Another benefit of the method is that it can measure the image quality of high detectability target objects, which is impractical by existing methods.

  5. A quantitative evaluation of two methods for preserving hair samples

    USGS Publications Warehouse

    Roon, David A.; Waits, L.P.; Kendall, K.C.

    2003-01-01

    Hair samples are an increasingly important DNA source for wildlife studies, yet optimal storage methods and DNA degradation rates have not been rigorously evaluated. We tested amplification success rates over a one-year storage period for DNA extracted from brown bear (Ursus arctos) hair samples preserved using silica desiccation and -20C freezing. For three nuclear DNA microsatellites, success rates decreased significantly after a six-month time point, regardless of storage method. For a 1000 bp mitochondrial fragment, a similar decrease occurred after a two-week time point. Minimizing delays between collection and DNA extraction will maximize success rates for hair-based noninvasive genetic sampling projects.

  6. Quantitative evaluation of erythropoietic activity in dysmyelopoietic syndromes.

    PubMed

    Cazzola, M; Barosi, G; Berzuini, C; Dacco, M; Orlandi, E; Stefanelli, M; Ascari, E

    1982-01-01

    Based on the morphological appearances of the bone marrow and peripheral blood, 43 patients with dysmyelopoietic syndromes were categorized into four types: refractory anaemia with excess of blasts, chronic myelomonocytic leukaemia, primary acquired sideroblastic anaemia and refractory anaemia with cellular marrow, without excess of blasts and/or ring sideroblasts. Ferrokinetics allowed three distinct groups of patients to be defined. All cases of refractory anaemia with excess of blasts and chronic myelomonocytic leukaemia were classified in the same group. They were characterized by relative marrow failure and had a high likelihood of developing acute leukaemia. At the other end of the spectrum, individuals with primary acquired sideroblastic anaemia had high erythropoietic activity which was largely ineffective. They had a benign clinical course without evidence of leukaemic transformation. In the middle group, in terms of erythropoietic activity, lay patients with refractory anaemia with cellular marrow and a few individuals with primary acquired sideroblastic anaemia. Their clinical course and risk of developing acute leukaemia were intermediate between the other two groups. These findings indicate that separate entities may exist within the spectrum of dysmyelopoietic syndromes. In clinical practice, they may be recognized by morphological studies and other simple laboratory means.

  7. Object-oriented fault tree evaluation program for quantitative analyses

    NASA Technical Reports Server (NTRS)

    Patterson-Hine, F. A.; Koen, B. V.

    1988-01-01

    Object-oriented programming can be combined with fault tree techniques to give a significantly improved environment for evaluating the safety and reliability of large complex systems for space missions. Deep knowledge about system components and interactions, available from reliability studies and other sources, can be described using objects that make up a knowledge base. This knowledge base can be interrogated throughout the design process, during system testing, and during operation, and can be easily modified to reflect design changes in order to maintain a consistent information source. An object-oriented environment for reliability assessment has been developed on a Texas Instruments (TI) Explorer LISP workstation. The program, which directly evaluates system fault trees, utilizes the object-oriented extension to LISP called Flavors that is available on the Explorer. The object representation of a fault tree facilitates the storage and retrieval of information associated with each event in the tree, including tree structural information and intermediate results obtained during the tree reduction process. Reliability data associated with each basic event are stored in the fault tree objects. The object-oriented environment on the Explorer also includes a graphical tree editor which was modified to display and edit the fault trees.
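
    A minimal object-style fault-tree evaluator in the spirit of the system described (objects for basic events and gates, direct evaluation of the tree) might look as follows. Python stands in for the original Flavors/LISP, the example tree is invented, and statistical independence of basic events is assumed:

```python
# Minimal object-oriented fault-tree evaluation: each node is an object
# that knows how to compute its own failure probability. Basic events are
# assumed independent; the example system is purely illustrative.
class BasicEvent:
    def __init__(self, name, p):
        self.name, self.p = name, p
    def probability(self):
        return self.p

class Gate:
    def __init__(self, name, kind, children):
        self.name, self.kind, self.children = name, kind, children
    def probability(self):
        ps = [c.probability() for c in self.children]
        out = 1.0
        if self.kind == "AND":          # all children must fail
            for p in ps:
                out *= p
            return out
        for p in ps:                     # OR: at least one child fails
            out *= (1.0 - p)
        return 1.0 - out

pump = BasicEvent("pump", 0.01)
valve = BasicEvent("valve", 0.02)
backup = BasicEvent("backup", 0.05)
top = Gate("loss_of_flow", "AND",
           [Gate("primary", "OR", [pump, valve]), backup])
print(f"P(top) = {top.probability():.6f}")
```

    Because each node is an object, per-event reliability data and intermediate results can be attached to the tree and retrieved later, which is the design point the abstract emphasizes.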

  8. Computerized quantitative evaluation of mammographic accreditation phantom images

    SciTech Connect

    Lee, Yongbum; Tsai, Du-Yih; Shinohara, Norimitsu

    2010-12-15

    Purpose: The objective was to develop and investigate an automated scoring scheme of the American College of Radiology (ACR) mammographic accreditation phantom (RMI 156, Middleton, WI) images. Methods: The developed method consisted of background subtraction, determination of region of interest, classification of fiber and mass objects by Mahalanobis distance, detection of specks by template matching, and rule-based scoring. Fifty-one phantom images were collected from 51 facilities for this study (one facility provided one image). A medical physicist and two radiologic technologists also scored the images. The human and computerized scores were compared. Results: In terms of meeting the ACR's criteria, the accuracies of the developed method for computerized evaluation of fiber, mass, and speck were 90%, 80%, and 98%, respectively. Contingency table analysis revealed significant association between observer and computer scores for microcalcifications (p<5%) but not for masses and fibers. Conclusions: The developed method may provide a stable assessment of whether test-object visibility in a mammographic accreditation phantom image meets the ACR's criteria, although there is room left for improvement in the approach for fiber and mass objects.

  9. NASA's New Approach for Evaluating Risk Reduction Due to Space Shuttle Upgrades

    NASA Technical Reports Server (NTRS)

    Safie, Fayssal M.; Belyeu, Rebecca L.

    2000-01-01

    As part of NASA's intensive effort to incorporate quantitative risk assessment (QRA) tools in the Agency's decision-making process concerning Space Shuttle risk, NASA has developed a powerful risk assessment tool called the Quantitative Risk Assessment System (QRAS). The QRAS is a tool designed to estimate Space Shuttle risk and evaluate Space Shuttle upgrades. This paper presents an overview of the QRAS with focus on its application for evaluating the risk reduction due to proposed Space Shuttle upgrades. The application includes a case study from the Space Shuttle main engine (SSME). The QRAS overview section of the paper includes the QRAS development process, the technical approach to model development, the QRA quantification methods and techniques, and observations concerning the complex modeling involved in QRAS. The application section of the paper describes a practical case study using QRAS models for evaluating critical Space Shuttle Program upgrades, specifically a proposed SSME nozzle upgrade. This paper presents the method for evaluating the proposed upgrade by comparing the current nozzle (old design with well-established probabilistic models) to the channel wall nozzle (new design at the preliminary design level).

  10. Quantitative Cyber Risk Reduction Estimation Methodology for a Small Scada Control System

    SciTech Connect

    Miles A. McQueen; Wayne F. Boyer; Mark A. Flynn; George A. Beitel

    2006-01-01

    We propose a new methodology for obtaining a quick quantitative measurement of the risk reduction achieved when a control system is modified with the intent to improve cyber security defense against external attackers. The proposed methodology employs a directed graph called a compromise graph, where the nodes represent stages of a potential attack and the edges represent the expected time-to-compromise for differing attacker skill levels. Time-to-compromise is modeled as a function of known vulnerabilities and attacker skill level. The methodology was used to calculate risk reduction estimates for a specific SCADA system and for a specific set of control system security remedial actions. Despite an 86% reduction in the total number of vulnerabilities, the estimated time-to-compromise was increased only by about 3 to 30% depending on target and attacker skill level.
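
    The compromise-graph idea can be sketched as a shortest-path computation: nodes are attack stages, edge weights are expected times-to-compromise for a given attacker skill level, and the minimum-total-time path is the fastest expected route to the target. The stages and weights below are invented for illustration; they are not the paper's SCADA system:

```python
import heapq

def fastest_compromise(graph, start, target):
    """Dijkstra over expected times-to-compromise (smaller = easier path)."""
    dist, heap = {start: 0.0}, [(0.0, start)]
    while heap:
        d, node = heapq.heappop(heap)
        if node == target:
            return d
        if d > dist.get(node, float("inf")):
            continue  # stale heap entry
        for nxt, t in graph.get(node, []):
            nd = d + t
            if nd < dist.get(nxt, float("inf")):
                dist[nxt] = nd
                heapq.heappush(heap, (nd, nxt))
    return float("inf")

# stage -> [(next stage, expected days to compromise, hypothetical values)]
graph = {
    "internet": [("dmz_web", 2.0), ("vpn", 9.0)],
    "dmz_web": [("control_lan", 5.0)],
    "vpn": [("control_lan", 1.0)],
    "control_lan": [("plc", 3.0)],
}
print(fastest_compromise(graph, "internet", "plc"))
```

    A remedial action is then evaluated by re-estimating the edge weights and comparing the new fastest expected route against the old one.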

  11. Quantitative Evaluation of Strain Near Tooth Fillet by Image Processing

    NASA Astrophysics Data System (ADS)

    Masuyama, Tomoya; Yoshiizumi, Satoshi; Inoue, Katsumi

    The accurate measurement of strain and stress in a tooth is important for the reliable evaluation of the strength or life of gears. In this research, a strain measurement method based on image processing is applied to the analysis of strain near the tooth fillet. The loaded tooth is photographed using a CCD camera and stored as a digital image. The displacement of each point on the tooth flank is tracked by the cross-correlation method, and the strain is then calculated. The interrogation window size of the correlation method and the overlap amount affect the accuracy and resolution. For measurements at structures with complicated profiles such as fillets, the interrogation window should be kept large and the overlap amount should be large. The surface condition also affects the accuracy; a white-painted surface with small black particles is suitable for measurement.
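
    The displacement-tracking step (locating an interrogation window by cross-correlation) can be sketched with zero-normalized cross-correlation on tiny integer arrays. Real systems run this on CCD images, typically via FFT-based correlation; this pure-Python toy only illustrates the matching principle:

```python
def ncc(a, b):
    """Zero-normalized cross-correlation of two equal-size 2D patches."""
    h, w = len(a), len(a[0])
    n = h * w
    ma = sum(map(sum, a)) / n
    mb = sum(map(sum, b)) / n
    num = sum((a[i][j] - ma) * (b[i][j] - mb) for i in range(h) for j in range(w))
    da = sum((a[i][j] - ma) ** 2 for i in range(h) for j in range(w))
    db = sum((b[i][j] - mb) ** 2 for i in range(h) for j in range(w))
    return num / (da * db) ** 0.5 if da and db else 0.0

def track(template, image):
    """Return (row, col) where the template best matches inside the image."""
    th, tw = len(template), len(template[0])
    positions = [(y, x) for y in range(len(image) - th + 1)
                        for x in range(len(image[0]) - tw + 1)]
    return max(positions, key=lambda p: ncc(
        template, [row[p[1]:p[1] + tw] for row in image[p[0]:p[0] + th]]))

image = [[0] * 6 for _ in range(6)]
image[3][2], image[3][3], image[4][2] = 9, 7, 5  # a distinctive speckle
template = [[9, 7], [5, 0]]                      # pattern to relocate
print(track(template, image))
```

    Tracking the same speckle pattern before and after loading yields a displacement field, from which strain is obtained by differentiation; this is why a speckled (white paint with black particles) surface helps, as it gives each window a distinctive pattern.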

  13. Towards the quantitative evaluation of visual attention models.

    PubMed

    Bylinskii, Z; DeGennaro, E M; Rajalingham, R; Ruda, H; Zhang, J; Tsotsos, J K

    2015-11-01

    Scores of visual attention models have been developed over the past several decades of research. Differences in implementation, assumptions, and evaluations have made comparison of these models very difficult. Taxonomies have been constructed in an attempt to organize and classify models, but they are not sufficient for quantifying which classes of models are most capable of explaining available data. At the same time, a multitude of physiological and behavioral findings have been published, measuring various aspects of human and non-human primate visual attention. All of these elements highlight the need to integrate the computational models with the data by (1) operationalizing the definitions of visual attention tasks and (2) designing benchmark datasets to measure success on specific tasks, under these definitions. In this paper, we provide some examples of operationalizing and benchmarking different visual attention tasks, along with the relevant design considerations. PMID:25951756

  14. Quantitative Ultrasonic Evaluation of Mechanical Properties of Engineering Materials

    NASA Technical Reports Server (NTRS)

    Vary, A.

    1978-01-01

    Progress in the application of ultrasonic techniques to nondestructive measurement of mechanical strength of engineering materials is reviewed. A dormant concept in nondestructive evaluation (NDE) is invoked. The availability of ultrasonic methods that can be applied to actual parts to assess their potential susceptibility to failure under design conditions is discussed. It was shown that ultrasonic methods yield measurements of elastic moduli, microstructure, hardness, fracture toughness, tensile strength, yield strength, and shear strength for a wide range of materials (including many types of metals, ceramics, and fiber composites). It was also indicated that although most of these methods were shown feasible in laboratory studies, more work is needed before they can be used on actual parts in processing, assembly, inspection, and maintenance lines.

  15. A Quantitative Evaluation of Medication Histories and Reconciliation by Discipline

    PubMed Central

    Stewart, Michael R.; Fogg, Sarah M.; Schminke, Brandon C.; Zackula, Rosalee E.; Nester, Tina M.; Eidem, Leslie A.; Rosendale, James C.; Ragan, Robert H.; Bond, Jack A.; Goertzen, Kreg W.

    2014-01-01

    Abstract Background/Objective: Medication reconciliation at transitions of care decreases medication errors, hospitalizations, and adverse drug events. We compared inpatient medication histories and reconciliation across disciplines and evaluated the nature of discrepancies. Methods: We conducted a prospective cohort study of patients admitted from the emergency department at our 760-bed hospital. Eligible patients had their medication histories conducted and reconciled in order by the admitting nurse (RN), certified pharmacy technician (CPhT), and pharmacist (RPh). Discharge medication reconciliation was not altered. Admission and discharge discrepancies were categorized by discipline, error type, and drug class and were assigned a criticality index score. A discrepancy rating system systematically measured discrepancies. Results: Of 175 consented patients, 153 were evaluated. Total admission and discharge discrepancies were 1,461 and 369, respectively. The average number of medications per participant at admission was 8.59 (1,314) with 9.41 (1,374) at discharge. Most discrepancies were committed by RNs: 53.2% (777) at admission and 56.1% (207) at discharge. The majority were omitted or incorrect. RNs had significantly higher admission discrepancy rates per medication (0.59) compared with CPhTs (0.36) and RPhs (0.16) (P < .001). RPhs corrected significantly more discrepancies per participant than RNs (6.39 vs 0.48; P < .001); average criticality index reduction was 79.0%. Estimated prevented adverse drug events (pADEs) cost savings were $589,744. Conclusions: RPhs committed the fewest discrepancies compared with RNs and CPhTs, resulting in more accurate medication histories and reconciliation. RPh involvement also prevented the greatest number of medication errors, contributing to considerable pADE-related cost savings. PMID:25477614

  16. Effect of Surface Sampling and Recovery of Viruses and Non-Spore-Forming Bacteria on a Quantitative Microbial Risk Assessment Model for Fomites.

    PubMed

    Weir, Mark H; Shibata, Tomoyuki; Masago, Yoshifumi; Cologgi, Dena L; Rose, Joan B

    2016-06-01

    Quantitative microbial risk assessment (QMRA) is a powerful decision analytics tool, yet it faces challenges when modeling health risks for the indoor environment. One limitation is uncertainty in fomite recovery when evaluating the efficiency of decontamination. Addressing this data gap has become more important for response to, and recovery from, a potential malicious pathogen release. To develop more accurate QMRA models, recovery efficiency from non-porous fomites (aluminum, ceramic, glass, plastic, steel, and wood laminate) was investigated. Fomite material, surface area (10, 100, and 900 cm²), recovery tool (swabs and wipes), initial concentration on the fomites, and eluent (polysorbate 80, trypticase soy broth, and beef extract) were evaluated in this research. Recovery was shown to be optimized using polysorbate 80, sampling with wipes, and sampling a surface area of 10-100 cm². The QMRA model demonstrated, through a relative risk comparison, the need for recovery efficiency to be used in these models to prevent underestimation of risks.
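
    The correction that recovery efficiency contributes to a fomite QMRA can be sketched in a few lines. This is an illustrative sketch, not the authors' model; the measured concentration and the 40% recovery value below are hypothetical.

```python
# Why recovery efficiency matters in a fomite QMRA: dividing the measured
# surface concentration by the sampling recovery efficiency corrects the
# exposure estimate; ignoring the correction underestimates exposure and
# hence risk. All numbers are illustrative.

def corrected_concentration(measured_per_cm2, recovery_efficiency):
    """Estimate the true surface concentration from a wipe/swab sample."""
    if not 0.0 < recovery_efficiency <= 1.0:
        raise ValueError("recovery efficiency must be in (0, 1]")
    return measured_per_cm2 / recovery_efficiency

measured = 0.5    # organisms/cm^2 recovered by a wipe (hypothetical)
recovery = 0.4    # 40% recovery efficiency (hypothetical)
print(corrected_concentration(measured, recovery))  # → 1.25
```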

  17. RISK MANAGEMENT EVALUATION FOR CONCENTRATED ANIMAL FEEDING OPERATIONS

    EPA Science Inventory

    The National Risk Management Research Laboratory (NRMRL) developed a Risk Management Evaluation (RME) to provide information needed to help plan future research in the Laboratory dealing with the environmental impact of concentrated animal feeding operations (CAFOs). Agriculture...

  18. Evaluation of Cardiovascular Risk Scores Applied to NASA's Astronaut Corps

    NASA Technical Reports Server (NTRS)

    Jain, I.; Charvat, J. M.; VanBaalen, M.; Lee, L.; Wear, M. L.

    2014-01-01

    In an effort to improve cardiovascular disease (CVD) risk prediction, this analysis evaluates and compares the applicability of multiple CVD risk scores to the NASA Astronaut Corps, which is extremely healthy at selection.

  19. The Application of a Residual Risk Evaluation Technique Used for Expendable Launch Vehicles

    NASA Technical Reports Server (NTRS)

    Latimer, John A.

    2009-01-01

    This presentation provides a Residual Risk Evaluation Technique (RRET) developed by the Kennedy Space Center (KSC) Safety and Mission Assurance (S&MA) Launch Services Division. This technique is one of many procedures used by S&MA at KSC to evaluate residual risks for each Expendable Launch Vehicle (ELV) mission. RRET is a straightforward technique that incorporates the proven methodologies of risk management, fault tree analysis, and reliability prediction. RRET derives a system reliability impact indicator from the system baseline reliability and the system residual risk reliability values. The system reliability impact indicator provides a quantitative measure of the reduction in the system baseline reliability due to the identified residual risks associated with the designated ELV mission. An example is discussed to provide insight into the application of RRET.

  20. Documentation Protocols to Generate Risk Indicators Regarding Degradation Processes for Cultural Heritage Risk Evaluation

    NASA Astrophysics Data System (ADS)

    Kioussi, A.; Karoglou, M.; Bakolas, A.; Labropoulos, K.; Moropoulou, A.

    2013-07-01

    Sustainable maintenance and preservation of cultural heritage assets depend highly on their resilience to external or internal alterations and to various hazards. Risk assessment of a heritage asset can be defined as the identification of all potential hazards affecting it and the evaluation of the asset's vulnerability (the conservation state of its building materials and structure). Potential hazards for cultural heritage are complex and varied. The risk of decay and damage associated with monuments is not limited to long-term natural processes, sudden events, and human impact (the macroscale of the heritage asset) but is also a function of degradation processes within materials and structural elements due to physical and chemical procedures. Clearly, these factors cover different scales of the problem. Deterioration processes in materials may be triggered by external influences or caused by internal chemical and/or physical changes in material properties and characteristics. Risk evaluation should therefore be directed toward revealing the specific active decay and damage mechanism at both the mesoscale [type of decay and damage] and microscale [mechanism of the decay phenomenon] levels. A prerequisite for identifying and developing risk indicators is an organised source of comparable and interoperable data about the heritage assets under observation. This unified source of information offers a knowledge-based background on the asset's vulnerability through diagnosis of the conservation state of building materials and structure, through identification of all potential hazards affecting them, and through mapping of possible alterations over the asset's entire lifetime. In this framework, the identification and analysis of risks regarding degradation processes for the development of qualitative and quantitative indicators can be supported by documentation protocols. The data investigated by such protocols help

  1. Risk Perception as the Quantitative Parameter of Ethics and Responsibility in Disaster Study

    NASA Astrophysics Data System (ADS)

    Kostyuchenko, Yuriy; Movchan, Dmytro

    2014-05-01

    The intensity of natural disaster impacts is increasing as climate and ecological changes spread. The frequency of disasters is increasing, and the recurrence of catastrophes is characterized by essential spatial heterogeneity. The distribution of losses is fundamentally non-linear and reflects the complex interrelation of natural, social, and environmental factors in the changing world across multiple scales. We are faced with new types of risks, which require a comprehensive security concept. A modern understanding of complex security and complex risk management requires analysis of all natural and social phenomena, involvement of all available data, construction of advanced analytical tools, and transformation of our perception of risk and security issues. Traditional deterministic models used for risk analysis are difficult to apply to social issues, as well as to the quantification of multi-scale, multi-physics phenomena. Parametric methods are also not fully effective, because the system analyzed is essentially non-ergodic. Stochastic models of risk analysis are applicable to the quantitative analysis of human behavior and risk perception. Risk perception issues are described within the framework of risk analysis models. Risk is presented as the superposition of a distribution function f(x,y) and a damage function p(x,y): P → δ Σ_{x,y} f(x,y) p(x,y). As shown, risk perception essentially influences the damage function. Based on prospect theory and decision-making under uncertainty (cognitive bias and the handling of risk), a modified damage function is proposed: p(x,y|α(t)). The modified damage function includes an awareness function α(t), composed of a risk perception function (rp) and a function of education and long-term experience (c), as α(t) → (c − rp). The education function c(t) describes the trend of education and experience. The risk perception function rp reflects the security concept of human behavior and is the basis for prediction of socio-economic and
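
    The superposition of a distribution function and a damage function, and the awareness-modified damage function, can be illustrated numerically. The grid, damage field, and awareness values below are toy assumptions, not the authors' calibrated model.

```python
import numpy as np

# Toy illustration of risk as a superposition of a hazard distribution
# f(x, y) and a damage function p(x, y): P = delta * sum f(x,y) * p(x,y).
# The awareness function alpha(t) = c - rp scales down effective damage.
# All grids and parameter values are hypothetical.

def total_risk(f, p, delta=1.0):
    """Aggregate risk over a spatial grid."""
    return delta * np.sum(f * p)

def awareness(c, rp):
    """alpha(t): education/experience (c) minus risk perception (rp)."""
    return c - rp

f = np.full((10, 10), 1.0 / 100.0)              # uniform hazard, sums to 1
p = np.linspace(0.1, 1.0, 100).reshape(10, 10)  # damage field p(x, y)

baseline = total_risk(f, p)
alpha = awareness(c=0.8, rp=0.3)                # toy awareness level
adjusted = total_risk(f, p * (1.0 - alpha))     # p(x,y | alpha(t))
print(baseline, adjusted)
```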

  2. At-Risk Youth Appearance and Job Performance Evaluation

    ERIC Educational Resources Information Center

    Freeburg, Beth Winfrey; Workman, Jane E.

    2008-01-01

    The goal of this study was to identify the relationship of at-risk youth workplace appearance to other job performance criteria. Employers (n = 30; each employing from 1 to 17 youths) evaluated 178 at-risk high school youths who completed a paid summer employment experience. Appearance evaluations were significantly correlated with evaluations of…

  3. Evaluation of the National Science Foundation's Local Course Improvement Program, Volume II: Quantitative Analyses.

    ERIC Educational Resources Information Center

    Kulik, James A.; And Others

    This report is the second of three volumes describing the results of the evaluation of the National Science Foundation (NSF) Local Course Improvement (LOCI) program. This volume describes the quantitative results of the program. Evaluation of the LOCI program involved answering questions in the areas of the need for science course improvement as…

  4. Benchmark dose profiles for joint-action quantal data in quantitative risk assessment.

    PubMed

    Deutsch, Roland C; Piegorsch, Walter W

    2012-12-01

    Benchmark analysis is a widely used tool in public health risk analysis. Therein, estimation of minimum exposure levels, called Benchmark Doses (BMDs), that induce a prespecified Benchmark Response (BMR) is well understood for the case of an adverse response to a single stimulus. For cases where two agents are studied in tandem, however, the benchmark approach is far less developed. This article demonstrates how the benchmark modeling paradigm can be expanded from the single-dose setting to joint-action, two-agent studies. Focus is on response outcomes expressed as proportions. Extending the single-exposure setting, representations of risk are based on a joint-action dose-response model involving both agents. Based on such a model, the concept of a benchmark profile (BMP) - a two-dimensional analog of the single-dose BMD at which both agents achieve the specified BMR - is defined for use in quantitative risk characterization and assessment. The resulting, joint, low-dose guidelines can improve public health planning and risk regulation when dealing with low-level exposures to combinations of hazardous agents.
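
    For the single-agent case that the benchmark profile generalizes, BMD estimation reduces to inverting a fitted dose-response model at the chosen BMR. The log-logistic model and every parameter value below are illustrative assumptions, not taken from the article.

```python
# Single-agent benchmark dose (BMD) sketch: find the dose at which extra
# risk over background equals the benchmark response (BMR), here by
# bisection on an assumed log-logistic quantal model. Parameters are
# hypothetical placeholders.

def response(d, background=0.05, beta=0.8, ed50=10.0):
    """Probability of an adverse response at dose d (log-logistic)."""
    if d <= 0.0:
        return background
    return background + (1.0 - background) / (1.0 + (ed50 / d) ** beta)

def extra_risk(d):
    """Extra risk over background: (P(d) - P(0)) / (1 - P(0))."""
    p0 = response(0.0)
    return (response(d) - p0) / (1.0 - p0)

def bmd(bmr=0.10, lo=1e-9, hi=1e6):
    """Bisection for the dose where extra risk equals the BMR."""
    for _ in range(200):
        mid = (lo + hi) / 2.0
        if extra_risk(mid) < bmr:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2.0

print(bmd(0.10))  # dose inducing 10% extra risk under these parameters
```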

  5. [Quantitative risk model for verocytotoxigenic Escherichia coli cross-contamination during homemade hamburger preparation].

    PubMed

    Signorini, M L; Frizzo, L S

    2009-01-01

    The objective of this study was to develop a quantitative risk model for verocytotoxigenic Escherichia coli (VTEC) cross-contamination during hamburger preparation at home. Published scientific information about the disease was considered in the elaboration of the model, which included a number of routines performed during food preparation in kitchens. The probabilities of bacterial transfer between food items and kitchen utensils that best described each stage of the process were incorporated into the model using @Risk software. Handling raw meat before preparing ready-to-eat foods (odds ratio [OR] = 6.57), as well as hand (OR = 12.02) and cutting-board (OR = 5.02) washing habits, were the major risk factors for VTEC cross-contamination from meat to vegetables. The information provided by this model should be considered when designing public information campaigns on hemolytic uremic syndrome risk directed to food handlers, in order to stress the importance of the above-mentioned factors in disease transmission.
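
    The kind of transfer-probability bookkeeping such a model performs can be sketched with a toy Monte Carlo. The transfer rates and washing probabilities below are invented for illustration; they are not the published model's fitted parameters.

```python
import random

# Toy Monte Carlo of cross-contamination during hamburger preparation:
# bacteria can reach vegetables via hands or via the cutting board, and
# washing interrupts each pathway. All probabilities are hypothetical.

random.seed(42)

def contaminated_servings(n=10000, p_hand_wash=0.5, p_board_wash=0.6,
                          t_meat_to_hand=0.1, t_hand_to_veg=0.05,
                          t_meat_to_board=0.2, t_board_to_veg=0.1):
    """Fraction of simulated preparations yielding contaminated vegetables."""
    count = 0
    for _ in range(n):
        via_hands = t_meat_to_hand * t_hand_to_veg
        via_board = t_meat_to_board * t_board_to_veg
        if random.random() < p_hand_wash:   # hands washed: pathway cut
            via_hands = 0.0
        if random.random() < p_board_wash:  # board washed: pathway cut
            via_board = 0.0
        if random.random() < via_hands + via_board:
            count += 1
    return count / n

print(contaminated_servings())
```

Varying `p_hand_wash` and `p_board_wash` in such a sketch shows the same qualitative message as the odds ratios above: hygiene habits dominate the cross-contamination risk.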

  6. Quantitative Assessment of Current Risks to Harlequin Ducks in Prince William Sound, Alaska, from the Exxon Valdez Oil Spill

    PubMed Central

    Harwell, Mark A.; Gentile, John H.; Parker, Keith R.; Murphy, Stephen M.; Day, Robert H.; Bence, A. Edward; Neff, Jerry M.; Wiens, John A.

    2012-01-01

    Harlequin Ducks (Histrionicus histrionicus) were adversely affected by the Exxon Valdez oil spill (EVOS) in Prince William Sound (PWS), Alaska, and some have suggested effects continue two decades later. We present an ecological risk assessment evaluating quantitatively whether PWS seaducks continue to be at-risk from polycyclic aromatic hydrocarbons (PAHs) in residual Exxon Valdez oil. Potential pathways for PAH exposures are identified for initially oiled and never-oiled reference sites. Some potential pathways are implausible (e.g., a seaduck excavating subsurface oil residues), whereas other pathways warrant quantification. We used data on PAH concentrations in PWS prey species, sediments, and seawater collected during 2001–2008 to develop a stochastic individual-based model projecting assimilated doses to seaducks. We simulated exposures to 500,000 individuals in each of eight age/gender classes, capturing the variability within a population of seaducks living in PWS. Doses to the maximum-exposed individuals are ∼400–4,000 times lower than chronic toxicity reference values established using USEPA protocols for seaducks. These exposures are so low that no individual-level effects are plausible, even within a simulated population that is orders-of-magnitude larger than exists in PWS. We conclude that toxicological risks to PWS seaducks from residual Exxon Valdez oil two decades later are essentially non-existent. PMID:23723680

  7. Quantitative Assessment of Current Risks to Harlequin Ducks in Prince William Sound, Alaska, from the Exxon Valdez Oil Spill.

    PubMed

    Harwell, Mark A; Gentile, John H; Parker, Keith R; Murphy, Stephen M; Day, Robert H; Bence, A Edward; Neff, Jerry M; Wiens, John A

    2012-03-01

    Harlequin Ducks (Histrionicus histrionicus) were adversely affected by the Exxon Valdez oil spill (EVOS) in Prince William Sound (PWS), Alaska, and some have suggested effects continue two decades later. We present an ecological risk assessment evaluating quantitatively whether PWS seaducks continue to be at-risk from polycyclic aromatic hydrocarbons (PAHs) in residual Exxon Valdez oil. Potential pathways for PAH exposures are identified for initially oiled and never-oiled reference sites. Some potential pathways are implausible (e.g., a seaduck excavating subsurface oil residues), whereas other pathways warrant quantification. We used data on PAH concentrations in PWS prey species, sediments, and seawater collected during 2001-2008 to develop a stochastic individual-based model projecting assimilated doses to seaducks. We simulated exposures to 500,000 individuals in each of eight age/gender classes, capturing the variability within a population of seaducks living in PWS. Doses to the maximum-exposed individuals are ∼400-4,000 times lower than chronic toxicity reference values established using USEPA protocols for seaducks. These exposures are so low that no individual-level effects are plausible, even within a simulated population that is orders-of-magnitude larger than exists in PWS. We conclude that toxicological risks to PWS seaducks from residual Exxon Valdez oil two decades later are essentially non-existent. PMID:23723680

  9. Quantitative Evaluation of the Stability of Engineered Water Soluble Nanoparticles

    NASA Astrophysics Data System (ADS)

    Mulvihill, M. J.; Habas, S.; Mokari, T.; Wan, J.

    2009-12-01

    Stability of nanoparticle solutions is a key factor dictating the bioavailability and transport characteristics of nanoparticles (NPs) in the environment. The synthesis of materials with dimensions less than 100 nm relies on the ability to stabilize surfaces. If the stabilization of the material is disrupted by aggregation, precipitation, or dissolution, the chemical and physical properties often revert to those of the bulk material or molecular constituents. We synthesized CdSe and gold NPs and studied their aggregation rate and critical coagulation concentration (CCC) using Dynamic Light Scattering (DLS). The chemical and physical properties of our NPs have been characterized by Transmission Electron Microscopy (TEM), UV-VIS spectroscopy, IR spectroscopy, zeta potential measurements, and Nuclear Magnetic Resonance (NMR) measurements. This comprehensive approach to synthesis and characterization enables the isolation of design parameters with greater precision than can be obtained using commercially available NPs. This research evaluates NP design parameters including composition, size, and surface coating, as a function of concentration, pH, and ionic strength, to determine which factors most affect NP stability. The aggregation characteristics of both gold and cadmium selenide NPs, 2-12 nm in diameter and capped with various ligands, have been studied. While previous work demonstrates that these variables influence stability, it does not systematically compare their relative significance. Our results indicate that changing the ligand shell radically affects the stability of NPs as a function of both pH and ionic strength, while changing the material from CdSe to gold has only a moderate influence on the stability and aggregation characteristics of our particles. Additionally, the ligand charge, length, and binding affinity all significantly affect NP stability. Funding was provided by the U.S. Department of Energy

  10. Applying Quantitative Approaches to the Formative Evaluation of Antismoking Campaign Messages

    PubMed Central

    Parvanta, Sarah; Gibson, Laura; Forquer, Heather; Shapiro-Luft, Dina; Dean, Lorraine; Freres, Derek; Lerman, Caryn; Mallya, Giridhar; Moldovan-Johnson, Mihaela; Tan, Andy; Cappella, Joseph; Hornik, Robert

    2014-01-01

    This article shares an in-depth summary of a formative evaluation that used quantitative data to inform the development and selection of promotional ads for the antismoking communication component of a social marketing campaign. A foundational survey provided cross-sectional data to identify beliefs about quitting smoking that campaign messages should target, as well as beliefs to avoid. Pretesting draft ads against quantitative indicators of message effectiveness further facilitated the selection and rejection of final campaign ads. Finally, we consider lessons learned from the process of balancing quantitative methods and judgment to make formative decisions about more and less promising persuasive messages for campaigns. PMID:24817829

  11. Evaluating emergency risk communications: a dialogue with the experts.

    PubMed

    Thomas, Craig W; Vanderford, Marsha L; Crouse Quinn, Sandra

    2008-10-01

    Evaluating emergency risk communications is fraught with challenges since communication can be approached from both a systemic and programmatic level. Therefore, one must consider stakeholders' perspectives, effectiveness issues, standards of evidence and utility, and channels of influence (e.g., mass media and law enforcement). Evaluation issues related to timing, evaluation questions, methods, measures, and accountability are raised in this dialogue with emergency risk communication specialists. Besides the usual evaluation competencies, evaluators in this area need to understand and work collaboratively with stakeholders and be attuned to the dynamic contextual nature of emergency risk communications. Sample resources and measures are provided here to aid in this emerging and exciting field of evaluation.

  12. Risk Evaluation of Endocrine-Disrupting Chemicals

    PubMed Central

    Gioiosa, Laura; Palanza, Paola; vom Saal, Frederick S.

    2015-01-01

    We review here our studies on early exposure to low doses of the estrogenic endocrine-disrupting chemical bisphenol A (BPA) on behavior and metabolism in CD-1 mice. Mice were exposed in utero from gestation day (GD) 11 to delivery (prenatal exposure) or via maternal milk from birth to postnatal day 7 (postnatal exposure) to 10 µg/kg body weight/d of BPA or no BPA (controls). Bisphenol A exposure resulted in long-term disruption of sexually dimorphic behaviors. Females exposed to BPA pre- and postnatally showed increased anxiety and behavioral profiles similar to control males. We also evaluated metabolic effects in prenatally exposed adult male offspring of dams fed (from GD 9 to 18) with BPA at doses ranging from 5 to 50 000 µg/kg/d. The males showed an age-related significant change in a number of metabolic indexes ranging from food intake to glucose regulation at BPA doses below the no observed adverse effect level (5000 µg/kg/d). Consistent with prior findings, low but not high BPA doses produced significant effects for many outcomes. These findings provide further evidence of the potential risks that developmental exposure to low doses of the endocrine disrupter BPA may pose to human health, with fetuses and infants being highly vulnerable. PMID:26740806

  13. EVALUATING TOOLS AND MODELS USED FOR QUANTITATIVE EXTRAPOLATION OF IN VITRO TO IN VIVO DATA FOR NEUROTOXICANTS*

    EPA Science Inventory

    There are a number of risk management decisions, which range from prioritization for testing to quantitative risk assessments. The utility of in vitro studies in these decisions depends on how well the results of such data can be qualitatively and quantitatively extrapolated to i...

  14. A quantitative methodology to assess the risks to human health from CO2 leakage into groundwater

    NASA Astrophysics Data System (ADS)

    Siirila, E.; Sitchler, A.; Maxwell, R. M.; McCray, J. E.

    2010-12-01

    Leakage of CO2 and associated gases into overlying aquifers as a result of geologic carbon capture and sequestration may have adverse impacts on aquifer drinking-water quality. Gas or aqueous-phase leakage may occur due to transport via faults and fractures, through faulty well bores, or through leaky confining materials. Contaminants of concern include aqueous salts and dissolved solids, gaseous or aqueous-phase organic contaminants, and acidic gas or aqueous-phase fluids that can liberate metals from aquifer minerals. Here we present a quantitative risk assessment framework to predict potential human health risk from CO2 leakage into drinking water aquifers. This framework incorporates the potential release of CO2 into the drinking water aquifer; mobilization of metals due to a decrease in pH; transport of these metals down gradient to municipal receptors; distribution of contaminated groundwater to multiple households; and exposure and health risk to individuals using this water for household purposes. Additionally, the framework is stochastic, incorporates detailed variations in geological and geostatistical parameters, and discriminates between uncertain and variable parameters using a two-stage, or nested, Monte Carlo approach. The approach is demonstrated using example simulations with hypothetical, yet realistic, aquifer characteristics and leakage scenarios. These example simulations show a greater risk for arsenic than for lead for both cancer and non-cancer endpoints, an unexpected finding given the greater toxicity of lead at lower doses than arsenic. Higher background groundwater gradients were also found to yield higher risk. The overall risk and its associated uncertainty are sensitive to the extent of aquifer stratification and the degree of local-scale dispersion. These results highlight the importance of hydrologic modeling in risk assessment. A linear relationship between carcinogenic and noncarcinogenic risk was found for arsenic and
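
    The two-stage (nested) Monte Carlo idea, sampling uncertain parameters in an outer loop and variable ones in an inner loop, can be sketched as follows. The distributions, units, and dose-response slope are hypothetical placeholders, not the study's calibrated inputs.

```python
import random

# Nested Monte Carlo sketch: the outer loop samples uncertain parameters
# (e.g., mean contaminant concentration, risk slope); the inner loop
# samples variability across individuals (e.g., water intake, body
# weight). Every distribution and value here is a hypothetical placeholder.

random.seed(1)

def nested_mc(n_outer=100, n_inner=1000):
    """Return an uncertainty band (5th-95th pct across outer draws)
    around the 95th-percentile variability risk."""
    outer_results = []
    for _ in range(n_outer):                          # uncertainty loop
        mean_conc = random.lognormvariate(0.0, 0.5)   # mg/L, uncertain
        slope = random.uniform(0.8e-3, 1.2e-3)        # risk per mg/kg-day
        inner = []
        for _ in range(n_inner):                      # variability loop
            intake = random.lognormvariate(0.7, 0.3)  # L/day, variable
            bodyweight = max(random.gauss(70.0, 10.0), 1.0)  # kg
            dose = mean_conc * intake / bodyweight    # mg/kg-day
            inner.append(slope * dose)                # incremental risk
        inner.sort()
        outer_results.append(inner[int(0.95 * n_inner)])  # 95th percentile
    outer_results.sort()
    return (outer_results[int(0.05 * n_outer)],
            outer_results[int(0.95 * n_outer)])

low, high = nested_mc()
print(low, high)
```

Separating the two loops is what lets the framework report uncertainty about a variability percentile, rather than mixing the two sources into one distribution.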

  15. Nuclear medicine and imaging research (instrumentation and quantitative methods of evaluation). Progress report, January 15, 1992--January 14, 1993

    SciTech Connect

    Beck, R.N.; Cooper, M.; Chen, C.T.

    1992-07-01

    This document is the annual progress report for the project entitled "Instrumentation and Quantitative Methods of Evaluation." Progress is reported in separate sections, individually abstracted and indexed for the database. Subject areas reported include theoretical studies of imaging systems and methods, hardware developments, quantitative methods of evaluation, and knowledge transfer: education in quantitative nuclear medicine imaging.

  16. Roadmap to risk evaluation and mitigation strategies (REMS) success

    PubMed Central

    Balian, John D.; Malhotra, Rachpal; Perentesis, Valerie

    2010-01-01

    Medical safety-related risk management is a rapidly evolving and increasingly important aspect of drug approval and market longevity. To effectively meet the challenges of this new era, we describe a risk management roadmap that proactively yet practically anticipates risk-management requirements, provides the foundation for enduring yet appropriately flexible risk-management practices, and leverages these techniques to efficiently and effectively utilize risk evaluation and mitigation strategies (REMS)/risk minimization programs as market access enablers. This fully integrated risk-management paradigm creates exciting opportunities for newer tools, techniques, and approaches to more successfully optimize product development, approval, and commercialization, with patients as the ultimate beneficiaries. PMID:25083193

  17. An educationally inspired illustration of two-dimensional Quantitative Microbiological Risk Assessment (QMRA) and sensitivity analysis.

    PubMed

    Vásquez, G A; Busschaert, P; Haberbeck, L U; Uyttendaele, M; Geeraerd, A H

    2014-11-01

    Quantitative Microbiological Risk Assessment (QMRA) is a structured methodology used to assess the risk involved in ingestion of a pathogen. It applies mathematical models combined with an accurate exploitation of data sets, represented by distributions and - in the case of two-dimensional Monte Carlo simulations - their hyperparameters. This research aims to highlight the background information, assumptions and truncations of a two-dimensional QMRA and advanced sensitivity analysis. We believe that such a detailed listing is not always clearly presented in actual risk assessment studies, although it is essential to ensure reliable and realistic simulations and interpretations. As a case study, we consider the occurrence of listeriosis from smoked fish products in Belgium during the period 2008-2009, using two-dimensional Monte Carlo simulation and two sensitivity analysis methods (Spearman correlation and Sobol sensitivity indices) to identify the factors most relevant to the final risk estimate. A risk estimate of 0.018% per consumption of contaminated smoked fish by an immunocompromised person was obtained. The final estimate of listeriosis cases (23) is within the actual reported result for the same period and population. Variability in the final risk estimate is determined by variability in (i) consumer refrigerator temperatures, (ii) the reference growth rate of L. monocytogenes, (iii) the minimum growth temperature of L. monocytogenes and (iv) consumer portion size. Variability in the initial contamination level of L. monocytogenes tends to appear as a determinant of risk variability only when the minimum growth temperature is not included in the sensitivity analysis; when it is included, the impact of variability in the initial contamination level of L. monocytogenes disappears. Uncertainty determinants of the final risk indicated the need to gather more information on the reference growth rate and the minimum
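
    Spearman rank correlation, one of the two sensitivity methods named above, can be illustrated with a toy risk model. The inputs and the exponential temperature effect are invented for the sketch; they are not the listeriosis model's fitted inputs.

```python
import random

# Sensitivity analysis via Spearman rank correlation: rank-correlate each
# sampled input with the Monte Carlo risk output to see which input
# drives output variability. The toy model below makes refrigerator
# temperature (exponential growth effect) matter more than portion size.
# All distributions and coefficients are hypothetical.

random.seed(7)

def ranks(xs):
    """Rank positions (0..n-1) of each element; assumes no ties."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0] * len(xs)
    for rank, i in enumerate(order):
        r[i] = rank
    return r

def spearman(xs, ys):
    """Spearman rank correlation, 1 - 6*sum(d^2) / (n*(n^2-1))."""
    rx, ry = ranks(xs), ranks(ys)
    n = len(xs)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1.0 - 6.0 * d2 / (n * (n * n - 1))

temps = [random.gauss(7.0, 2.0) for _ in range(2000)]      # deg C
portions = [random.gauss(50.0, 10.0) for _ in range(2000)] # g
risk = [2 ** (t - 4) * p * 1e-9 for t, p in zip(temps, portions)]

print(spearman(temps, risk), spearman(portions, risk))
```

The first coefficient is much larger than the second, reproducing the kind of ranking of variability determinants that the abstract reports.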

  19. Quantitative Microbial Risk Assessment for Campylobacter spp. on Ham in Korea.

    PubMed

    Lee, Jeeyeon; Ha, Jimyeong; Kim, Sejeong; Lee, Heeyoung; Lee, Soomin; Yoon, Yohan

    2015-01-01

    The objective of this study was to evaluate the risk of illness from Campylobacter spp. on ham. To identify the hazards of Campylobacter spp. on ham, the general characteristics and microbial criteria for Campylobacter spp. and campylobacteriosis outbreaks were investigated. In the exposure assessment, the prevalence of Campylobacter spp. on ham was evaluated, and probabilistic distributions for the temperature of ham surfaces in retail markets and home refrigerators were prepared. In addition, raw data from the Korea National Health and Nutrition Examination Survey (KNHNES) 2012 were used to estimate the amount and frequency of ham consumption. In the hazard characterization, the Beta-Poisson model for Campylobacter spp. infection was used. For risk characterization, a simulation model was developed using the collected data, and the risk of Campylobacter spp. on ham was estimated with @RISK. The Campylobacter spp. cell counts on ham samples were below the detection limit (<0.70 Log CFU/g). The daily consumption of ham was 23.93 g per person, and the consumption frequency was 11.57%. The simulated mean initial contamination level of Campylobacter spp. on ham was -3.95 Log CFU/g, and the mean probable risk per person per day was 2.20×10(-12). The risk of foodborne illness from Campylobacter spp. on ham is therefore considered low. Furthermore, these results indicate that this microbial risk assessment should be useful in providing scientific evidence to establish criteria for Campylobacter spp. PMID:26761897
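
    The Beta-Poisson dose-response model used in the hazard characterization has a standard closed form. A minimal sketch follows; the alpha and beta values are ones commonly cited for Campylobacter in the dose-response literature, not necessarily those fitted in this study.

```python
import numpy as np

def beta_poisson(dose, alpha, beta):
    """Approximate Beta-Poisson dose-response: P(infection) for an ingested dose."""
    return 1.0 - (1.0 + dose / beta) ** (-alpha)

# Commonly cited Campylobacter parameters (assumed here for illustration):
alpha, beta = 0.145, 7.59
doses = np.array([1.0, 10.0, 100.0])  # CFU ingested
print(beta_poisson(doses, alpha, beta))
```

    The model is concave in dose, so even large increases in ingested cells raise the infection probability sub-linearly.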

  20. Assessment of Semi-Quantitative Health Risks of Exposure to Harmful Chemical Agents in the Context of Carcinogenesis in the Latex Glove Manufacturing Industry.

    PubMed

    Yari, Saeed; Fallah Asadi, Ayda; Varmazyar, Sakineh

    2016-01-01

    Excessive exposure to chemicals in the workplace can cause poisoning and various diseases. Thus, to protect workers, it is necessary to examine people's exposure to chemicals and the risks these materials pose. The purpose of this study was to evaluate the semi-quantitative health risks of exposure to harmful chemical agents, in the context of carcinogenesis, in a latex glove manufacturing industry. In this cross-sectional study, the semi-quantitative risk assessment method of the Department of Occupational Health of Singapore was used; indices of LD50, carcinogenicity (ACGIH and IARC) and corrosive capacity were applied to calculate the hazard rate, and the largest index was taken as the basis of risk. To calculate the exposure rate, two exposure index methods and the actual level of exposure were employed. After risks were identified, groups H (high) and E (very high) were classified as high-risk. Of the total of 271 risks, only 39 (15%) were at a high level and 3% were very high (E). These risks involved only 7 materials: sulfuric acid alone was placed in group E, and 6 other materials in group H, namely nitric acid (48.3%), chromic acid (6.9%), hydrochloric acid (10.3%), ammonia (3.4%), potassium hydroxide (20.7%) and chlorine (10.3%). Overall, the average hazard rate was estimated at 4 and the average exposure rate at 3.5. The health risks identified in this study show that the latex glove manufacturing industry carries a high level of risk because of carcinogens, acids, strong alkalis and dangerous drugs. Given the average risk level, a safety-oriented design strategy for the latex glove production industry should be placed on the agenda. PMID:27165227

  2. Evaluation of four genes in rice for their suitability as endogenous reference standards in quantitative PCR.

    PubMed

    Wang, Chong; Jiang, Lingxi; Rao, Jun; Liu, Yinan; Yang, Litao; Zhang, Dabing

    2010-11-24

    Quantification of genetically modified (GM) food/feed depends on reliable detection systems for endogenous reference genes. Currently, four endogenous rice reference genes, sucrose phosphate synthase (SPS), GOS9, phospholipase D (PLD), and ppi phosphofructokinase (ppi-PPF), have been used in GM rice detection. To compare the applicability of these four reference genes in quantitative PCR systems, we analyzed target nucleotide sequence variation in 58 conventional rice varieties of various geographic and phylogenetic origins, and evaluated their quantification performance using quantitative real-time PCR and GeNorm analysis, a statistical procedure that yields an "M value" negatively correlated with gene stability. Sequencing showed that the reported GOS9 and PLD TaqMan probe regions contained detectable single nucleotide polymorphisms (SNPs) among the tested rice cultivars, whereas no SNPs were observed in the SPS and ppi-PPF amplicons. Poor quantitative performance was also detected in the cultivars with SNPs when the GOS9 and PLD quantitative PCR systems were used. Although the PCR efficiency of the ppi-PPF system was slightly lower, the comprehensive quantitative PCR comparison and GeNorm analysis showed the SPS and ppi-PPF systems to be applicable for rice endogenous reference assays, with less variation among C(t) values, good reproducibility in quantitative assays, and low M values.
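
    The GeNorm M value referred to above is computed from the pairwise variation of log expression ratios between candidate reference genes: a gene whose ratio to every other candidate is constant across samples gets a low M and is considered stable. A sketch with simulated expression data; the gene groupings and noise levels are hypothetical.

```python
import numpy as np

def genorm_m(expr):
    """GeNorm stability measure M for each candidate reference gene.

    expr: (n_samples, n_genes) array of relative expression quantities.
    M[j] is the mean standard deviation of log2 expression ratios of gene j
    against every other candidate; lower M means a more stable gene.
    """
    log_expr = np.log2(expr)
    n_genes = expr.shape[1]
    m = np.empty(n_genes)
    for j in range(n_genes):
        vs = [np.std(log_expr[:, j] - log_expr[:, k], ddof=1)
              for k in range(n_genes) if k != j]
        m[j] = np.mean(vs)
    return m

# Hypothetical data for four candidates across 20 cultivars: two genes with
# tight expression (low noise) and two with unstable expression (high noise).
rng = np.random.default_rng(1)
stable = rng.lognormal(0.0, 0.05, (20, 2))
noisy = rng.lognormal(0.0, 0.5, (20, 2))
expr = np.hstack([stable, noisy])
print(genorm_m(expr))  # the stable pair should receive the lower M values
```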

  3. Risk in Enterprise Cloud Computing: Re-Evaluated

    ERIC Educational Resources Information Center

    Funmilayo, Bolonduro, R.

    2016-01-01

    A quantitative study was conducted to get the perspectives of IT experts about risks in enterprise cloud computing. In businesses, these IT experts are often not in positions to prioritize business needs. The business experts commonly known as business managers mostly determine an organization's business needs. Even if an IT expert classified a…

  4. Hydraulic fracturing in unconventional reservoirs - Identification of hazards and strategies for a quantitative risk assessment

    NASA Astrophysics Data System (ADS)

    Helmig, R.; Kissinger, A.; Class, H.; Ebigbo, A.

    2012-12-01

    fractured reservoir, fracture propagation, fault zones and their role with regard to fluid migration into shallow aquifers). A quantitative risk assessment, which should be the main aim of future work in this field, makes much higher demands, especially on site-specific data, as the estimation of statistical parameter uncertainty requires site-specific parameter distributions. There is already ongoing research on risk assessment in related fields such as CO2 sequestration. We therefore propose that these methodologies be transferred to risk estimation for hydraulic fracturing, be it for unconventional gas or enhanced geothermal energy production. The overall aim should be to set common and transparent standards for different uses of the subsurface and their associated risks, and to communicate those to policy makers and stakeholders.

  5. Quantitative risk assessment of entry of contagious bovine pleuropneumonia through live cattle imported from northwestern Ethiopia.

    PubMed

    Woube, Yilkal Asfaw; Dibaba, Asseged Bogale; Tameru, Berhanu; Fite, Richard; Nganwa, David; Robnett, Vinaida; Demisse, Amsalu; Habtemariam, Tsegaye

    2015-11-01

    Contagious bovine pleuropneumonia (CBPP) is a highly contagious bacterial disease of cattle caused by Mycoplasma mycoides subspecies mycoides small colony (SC) bovine biotype (MmmSC). It has been eradicated from many countries; however, the disease persists in many parts of Africa and Asia. CBPP is one of the major trade-restricting diseases of cattle in Ethiopia. In this quantitative risk assessment the OIE concept of zoning was adopted to assess the entry of CBPP into an importing country when up to 280,000 live cattle are exported every year from the northwestern proposed disease free zone (DFZ) of Ethiopia. To estimate the level of risk, a six-tiered risk pathway (scenario tree) was developed, evidences collected and equations generated. The probability of occurrence of the hazard at each node was modelled as a probability distribution using Monte Carlo simulation (@RISK software) at 10,000 iterations to account for uncertainty and variability. The uncertainty and variability of data points surrounding the risk estimate were further quantified by sensitivity analysis. In this study a single animal destined for export from the northwestern DFZ of Ethiopia has a CBPP infection probability of 4.76×10(-6) (95% CI=7.25×10(-8) to 1.92×10(-5)). The probability that at least one infected animal enters an importing country in one year is 0.53 (90% CI=0.042-0.97). The expected number of CBPP infected animals exported any given year is 1.28 (95% CI=0.021-5.42). According to the risk estimate, an average of 2.73×10(6) animals (90% CI=10,674-5.9×10(6)) must be exported to get the first infected case. By this account it would, on average, take 10.15 years (90% CI=0.24-23.18) for the first infected animal to be included in the consignment. Sensitivity analysis revealed that prevalence and vaccination had the highest impact on the uncertainty and variability of the overall risk.
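
    The headline numbers combine a per-animal infection probability with the annual export volume. A point-estimate version of that aggregation step is sketched below; the study's full stochastic scenario tree samples these inputs from distributions over 10,000 iterations, so its reported means (e.g. 1.28 infected animals per year, P = 0.53) differ somewhat from this single-point calculation.

```python
p_animal = 4.76e-6     # study's mean per-animal probability of CBPP infection
n_exported = 280_000   # annual export volume (upper bound stated in the abstract)

# P(at least one infected animal enters in a year), assuming independence:
p_year = 1.0 - (1.0 - p_animal) ** n_exported

# Expected number of infected animals exported per year:
expected_infected = p_animal * n_exported

print(f"P(>=1 infected/yr) = {p_year:.3f}, expected infected/yr = {expected_infected:.2f}")
```

    The point estimate gives about 1.33 expected infected animals per year versus the study's distribution mean of 1.28; such gaps are expected because the mean of a nonlinear function of random inputs is not the function of the means.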

  6. Objective evaluation of reconstruction methods for quantitative SPECT imaging in the absence of ground truth

    NASA Astrophysics Data System (ADS)

    Jha, Abhinav K.; Song, Na; Caffo, Brian; Frey, Eric C.

    2015-03-01

    Quantitative single-photon emission computed tomography (SPECT) imaging is emerging as an important tool in clinical studies and biomedical research. There is thus a need for optimization and evaluation of systems and algorithms that are being developed for quantitative SPECT imaging. An appropriate objective method to evaluate these systems is by comparing their performance in the end task that is required in quantitative SPECT imaging, such as estimating the mean activity concentration in a volume of interest (VOI) in a patient image. This objective evaluation can be performed if the true value of the estimated parameter is known, i.e. we have a gold standard. However, very rarely is this gold standard known in human studies. Thus, no-gold-standard techniques to optimize and evaluate systems and algorithms in the absence of gold standard are required. In this work, we developed a no-gold-standard technique to objectively evaluate reconstruction methods used in quantitative SPECT when the parameter to be estimated is the mean activity concentration in a VOI. We studied the performance of the technique with realistic simulated image data generated from an object database consisting of five phantom anatomies with all possible combinations of five sets of organ uptakes, where each anatomy consisted of eight different organ VOIs. Results indicate that the method provided accurate ranking of the reconstruction methods. We also demonstrated the application of consistency checks to test the no-gold-standard output.

  7. Farm to Fork Quantitative Risk Assessment of Listeria monocytogenes Contamination in Raw and Pasteurized Milk Cheese in Ireland.

    PubMed

    Tiwari, Uma; Cummins, Enda; Valero, Antonio; Walsh, Des; Dalmasso, Marion; Jordan, Kieran; Duffy, Geraldine

    2015-06-01

    The objective of this study was to model and quantify the level of Listeria monocytogenes in raw milk cheese (RMc) and pasteurized milk cheese (PMc) from farm to fork using a Bayesian inference approach combined with a quantitative risk assessment. The modeling approach included a prediction of contamination arising from the farm environment as well as from cross-contamination within the cheese-processing facility through storage and subsequent human exposure. The model predicted a high concentration of L. monocytogenes in contaminated RMc (mean 2.19 log10 CFU/g) compared to PMc (mean -1.73 log10 CFU/g). The mean probabilities of illness for low-risk (P1, LR) and high-risk (P2, HR, e.g., immunocompromised) adult Irish consumers following exposure to contaminated cheese were 7 × 10(-8) (P1) and 9 × 10(-4) (P2) for RMc, and 7 × 10(-10) (P1) and 8 × 10(-6) (P2) for PMc, respectively. In addition, the model was used to evaluate performance objectives at various stages, namely the cheese making and ripening stages, and to set a food safety objective at the time of consumption. A scenario analysis predicted various probabilities of L. monocytogenes contamination along the cheese-processing chain for both RMc and PMc. The sensitivity analysis showed that the critical factors for both cheeses were the serving size of the cheese, the storage time, and the temperature at the distribution stage. The developed model will allow food processors and policymakers to identify possible routes of contamination along the cheese-processing chain and to reduce the risk posed to human health.

  8. Quantitative risk assessment & leak detection criteria for a subsea oil export pipeline

    NASA Astrophysics Data System (ADS)

    Zhang, Fang-Yuan; Bai, Yong; Badaruddin, Mohd Fauzi; Tuty, Suhartodjo

    2009-06-01

    A quantitative risk assessment (QRA) based on leak detection criteria (LDC) for the design of a proposed subsea oil export pipeline is presented in this paper. The objective of this QRA/LDC study was to determine if current leak detection methodologies were sufficient, based on QRA results, while excluding the use of statistical leak detection; if not, an appropriate LDC for the leak detection system would need to be established. The famous UK PARLOC database was used for the calculation of pipeline failure rates, and the software POSVCM from MMS was used for oil spill simulations. QRA results revealed that the installation of a statistically based leak detection system (LDS) can significantly reduce time to leak detection, thereby mitigating the consequences of leakage. A sound LDC has been defined based on QRA study results and comments from various LDS vendors to assist the emergency response team (ERT) to quickly identify and locate leakage and employ the most effective measures to contain damage.

  9. Application of quantitative microbial risk assessments for estimation of risk management metrics: Clostridium perfringens in ready-to-eat and partially cooked meat and poultry products as an example.

    PubMed

    Crouch, Edmund A; Labarre, David; Golden, Neal J; Kause, Janell R; Dearfield, Kerry L

    2009-10-01

    The U.S. Department of Agriculture, Food Safety and Inspection Service is exploring quantitative risk assessment methodologies to incorporate the use of the Codex Alimentarius' newly adopted risk management metrics (e.g., food safety objectives and performance objectives). It is suggested that use of these metrics would more closely tie the results of quantitative microbial risk assessments (QMRAs) to public health outcomes. By estimating the food safety objective (the maximum frequency and/or concentration of a hazard in a food at the time of consumption) and the performance objective (the maximum frequency and/or concentration of a hazard in a food at a specified step in the food chain before the time of consumption), risk managers will have a better understanding of the appropriate level of protection (ALOP) from microbial hazards for public health protection. We here demonstrate a general methodology that allows identification of an ALOP and evaluation of corresponding metrics at appropriate points in the food chain. It requires a two-dimensional probabilistic risk assessment, the example used being the Monte Carlo QMRA for Clostridium perfringens in ready-to-eat and partially cooked meat and poultry products, with minor modifications to evaluate and abstract required measures. For demonstration purposes, the QMRA model was applied specifically to hot dogs produced and consumed in the United States. Evaluation of the cumulative uncertainty distribution for illness rate allows a specification of an ALOP that, with defined confidence, corresponds to current industry practices.

  10. Risk assessment of supply chain for pharmaceutical excipients with AHP-fuzzy comprehensive evaluation.

    PubMed

    Li, Maozhong; Du, Yunai; Wang, Qiyue; Sun, Chunmeng; Ling, Xiang; Yu, Boyang; Tu, Jiasheng; Xiong, Yerong

    2016-01-01

    As essential components of formulations, pharmaceutical excipients directly affect the safety, efficacy, and stability of drugs. Recently, safety incidents involving pharmaceutical excipients that posed serious threats to patients have highlighted the necessity of controlling the potential risks. Hence, it is indispensable for the industry to establish an effective risk assessment system for the supply chain. In this study, an AHP-fuzzy comprehensive evaluation model was developed, based on the analytic hierarchy process and fuzzy mathematical theory, to quantitatively assess the risks of the supply chain. Taking polysorbate 80 as the example for model analysis, it was concluded that polysorbate 80 for injection is a higher-risk ingredient in the supply chain than polysorbate 80 for oral use; to achieve safe clinical application, measures should be taken to control and minimize those risks. PMID:26288999
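
    An AHP-fuzzy comprehensive evaluation combines criterion weights derived from a pairwise comparison matrix (analytic hierarchy process) with a fuzzy membership matrix over risk grades. A minimal sketch follows; the criteria, comparison judgments and membership values are hypothetical, not the study's elicited data.

```python
import numpy as np

# AHP step: derive criterion weights from a pairwise comparison matrix
# (three hypothetical supply-chain criteria, e.g. supplier quality,
# transport conditions, storage conditions).
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])
eigvals, eigvecs = np.linalg.eig(A)
w = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
w = w / w.sum()  # normalized AHP weights (principal eigenvector method)

# Fuzzy step: each row of R gives one criterion's membership degrees
# over the risk grades (low, medium, high).
R = np.array([[0.1, 0.3, 0.6],
              [0.5, 0.3, 0.2],
              [0.6, 0.3, 0.1]])
B = w @ R  # weighted membership over the grades
print(B, "-> grade:", ["low", "medium", "high"][int(np.argmax(B))])
```

    The grade with the largest aggregated membership is taken as the overall risk level, which is how an ingredient such as injectable polysorbate 80 could be flagged as high-risk.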

  11. Retrospective analysis of a listeria monocytogenes contamination episode in raw milk goat cheese using quantitative microbial risk assessment tools.

    PubMed

    Delhalle, L; Ellouze, M; Yde, M; Clinquart, A; Daube, G; Korsak, N

    2012-12-01

    In 2005, the Belgian authorities reported a Listeria monocytogenes contamination episode in cheese made from raw goat's milk. The presence of an asymptomatic shedder goat in the herd caused this contamination. On the basis of data collected at the time of the episode, a retrospective study was performed using an exposure assessment model covering the production chain from the milking of goats up to delivery of cheese to the market. Predictive microbiology models were used to simulate the growth of L. monocytogenes during the cheese process in relation to temperature, pH, and water activity. The model showed significant growth of L. monocytogenes during chilling and storage of the milk collected the day before the cheese production (median increase of 2.2 log CFU/ml) and during the addition of starter and rennet to milk (median increase of 1.2 log CFU/ml). The L. monocytogenes concentration in the fresh unripened cheese was estimated to be 3.8 log CFU/g (median). This result is consistent with the number of L. monocytogenes in the fresh cheese (3.6 log CFU/g) reported during the cheese contamination episode. A variance-based sensitivity analysis identified the most important factors impacting the cheese contamination, and a scenario analysis then evaluated several options for risk mitigation. Thus, by using quantitative microbial risk assessment tools, this study provides reliable information to identify and control critical steps in a local production chain of cheese made from raw goat's milk.

  12. Mechanical Model Analysis for Quantitative Evaluation of Liver Fibrosis Based on Ultrasound Tissue Elasticity Imaging

    NASA Astrophysics Data System (ADS)

    Shiina, Tsuyoshi; Maki, Tomonori; Yamakawa, Makoto; Mitake, Tsuyoshi; Kudo, Masatoshi; Fujimoto, Kenji

    2012-07-01

    Precise evaluation of the stage of chronic hepatitis C with respect to fibrosis has become an important issue in preventing the occurrence of cirrhosis and initiating appropriate therapeutic intervention such as viral eradication using interferon. Ultrasound tissue elasticity imaging, i.e., elastography, can visualize tissue hardness/softness, and its clinical usefulness has been studied for detecting and evaluating tumors. We have recently reported that the texture of the elasticity image changes as fibrosis progresses. To evaluate fibrosis progression quantitatively on the basis of ultrasound tissue elasticity imaging, we introduced a mechanical model of fibrosis progression, simulated the process by which hepatic fibrosis affects elasticity images, and compared the results with those of clinical data analysis. As a result, it was confirmed that even in diffuse diseases like chronic hepatitis, the patterns of elasticity images are related to fibrous structural changes caused by hepatic disease and can be used to derive features for quantitative evaluation of fibrosis stage.

  13. A Method for Quantitative Evaluation of the Results of Postural Tests.

    PubMed

    Alifirova, V M; Brazovskii, K S; Zhukova, I A; Pekker, Ya S; Tolmachev, I V; Fokin, V A

    2016-07-01

    A method for quantitative evaluation of the results of postural tests is proposed. The method is based on contact-free measurements of 3D coordinates of body point movements. The result can serve as an integral test based on the Mahalanobis distance. PMID:27492397
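
    An integral test based on the Mahalanobis distance scores how far a subject's posture features lie from a normative group, accounting for the correlations between features. A sketch follows; the 3D sway features and normative data are hypothetical, since the abstract does not specify the marker set.

```python
import numpy as np

def mahalanobis_score(sample, reference):
    """Distance of one posture feature vector from a reference (normative) group.

    reference: (n, d) feature matrix from control subjects;
    sample: length-d feature vector from the tested subject.
    """
    mu = reference.mean(axis=0)
    cov_inv = np.linalg.inv(np.cov(reference, rowvar=False))
    diff = sample - mu
    return float(np.sqrt(diff @ cov_inv @ diff))

# Hypothetical 3D sway features (e.g. mean displacement along x, y, z)
# measured contact-free for 200 control subjects:
rng = np.random.default_rng(2)
controls = rng.normal(0.0, 1.0, (200, 3))
typical = np.zeros(3)                  # near the normative centre
atypical = np.array([4.0, 4.0, 4.0])   # far from the normative centre
print(mahalanobis_score(typical, controls), mahalanobis_score(atypical, controls))
```

    A larger score indicates a more atypical postural response, giving a single quantitative result per test as the abstract describes.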

  14. Poem Generator: A Comparative Quantitative Evaluation of a Microworlds-Based Learning Approach for Teaching English

    ERIC Educational Resources Information Center

    Jenkins, Craig

    2015-01-01

    This paper is a comparative quantitative evaluation of an approach to teaching poetry in the subject domain of English that employs a "guided discovery" pedagogy using computer-based microworlds. It uses a quasi-experimental design in order to measure performance gains in computational thinking and poetic thinking following a…

  15. Quantitative Evaluation of a First Year Seminar Program: Relationships to Persistence and Academic Success

    ERIC Educational Resources Information Center

    Jenkins-Guarnieri, Michael A.; Horne, Melissa M.; Wallis, Aaron L.; Rings, Jeffrey A.; Vaughan, Angela L.

    2015-01-01

    In the present study, we conducted a quantitative evaluation of a novel First Year Seminar (FYS) program with a coordinated curriculum implemented at a public, four-year university to assess its potential role in undergraduate student persistence decisions and academic success. Participants were 2,188 first-year students, 342 of whom completed the…

  16. An Elephant in the Room: Bias in Evaluating a Required Quantitative Methods Course

    ERIC Educational Resources Information Center

    Fletcher, Joseph F.; Painter-Main, Michael A.

    2014-01-01

    Undergraduate Political Science programs often require students to take a quantitative research methods course. Such courses are typically among the most poorly rated. This can be due, in part, to the way in which courses are evaluated. Students are generally asked to provide an overall rating, which, in turn, is widely used by students, faculty,…

  17. Are Teacher Course Evaluations Biased against Faculty That Teach Quantitative Methods Courses?

    ERIC Educational Resources Information Center

    Royal, Kenneth D.; Stockdale, Myrah R.

    2015-01-01

    The present study investigated graduate students' responses to teacher/course evaluations (TCE) to determine if students' responses were inherently biased against faculty who teach quantitative methods courses. Item response theory (IRT) and Differential Item Functioning (DIF) techniques were utilized for data analysis. Results indicate students…

  18. Evaluation of Lung Metastasis in Mouse Mammary Tumor Models by Quantitative Real-time PCR

    PubMed Central

    Abt, Melissa A.; Grek, Christina L.; Ghatnekar, Gautam S.; Yeh, Elizabeth S.

    2016-01-01

    Metastatic disease is the spread of malignant tumor cells from the primary cancer site to a distant organ and is the primary cause of cancer associated death 1. Common sites of metastatic spread include lung, lymph node, brain, and bone 2. Mechanisms that drive metastasis are intense areas of cancer research. Consequently, effective assays to measure metastatic burden in distant sites of metastasis are instrumental for cancer research. Evaluation of lung metastases in mammary tumor models is generally performed by gross qualitative observation of lung tissue following dissection. Quantitative methods of evaluating metastasis are currently limited to ex vivo and in vivo imaging based techniques that require user defined parameters. Many of these techniques are at the whole organism level rather than the cellular level 3–6. Although newer imaging methods utilizing multi-photon microscopy are able to evaluate metastasis at the cellular level 7, these highly elegant procedures are more suited to evaluating mechanisms of dissemination rather than quantitative assessment of metastatic burden. Here, a simple in vitro method to quantitatively assess metastasis is presented. Using quantitative Real-time PCR (QRT-PCR), tumor cell specific mRNA can be detected within the mouse lung tissue. PMID:26862835
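
    The abstract quantifies tumor-cell-specific mRNA in lung tissue by QRT-PCR. One common analysis route for such data is the comparative 2^-ddCt method, sketched here with hypothetical Ct values; the authors may instead use a standard-curve approach, so this is an assumed analysis, not the paper's protocol.

```python
def relative_burden(ct_target, ct_housekeeping, ct_target_ref, ct_hk_ref):
    """Relative metastatic burden via the comparative 2^-ddCt method.

    ct_target / ct_housekeeping: Ct values for the tumor-specific and
    housekeeping transcripts in the test lung; *_ref: the same values
    in a reference (e.g. control) sample.
    """
    d_ct = ct_target - ct_housekeeping          # normalize to housekeeping gene
    d_ct_ref = ct_target_ref - ct_hk_ref
    return 2.0 ** -(d_ct - d_ct_ref)            # fold difference vs reference

# Hypothetical Cts: tumor transcript detected 6 cycles above housekeeping in
# the metastatic lung, but 12 cycles above it in the control lung.
print(relative_burden(24.0, 18.0, 30.0, 18.0))  # -> 64.0
```

    Each halving of ddCt corresponds to a doubling of relative transcript abundance, which is what makes the assay quantitative at the cellular level.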

  19. Raman spectral imaging for quantitative contaminant evaluation in skim milk powder

    Technology Transfer Automated Retrieval System (TEKTRAN)

    This study uses a point-scan Raman spectral imaging system for quantitative detection of melamine in milk powder. A sample depth of 2 mm and corresponding laser intensity of 200 mW were selected after evaluating the penetration of a 785 nm laser through milk powder. Horizontal and vertical spatial r...

  20. Approaches for assessing risks to sensitive populations: Lessons learned from evaluating risks in the pediatric populations*

    EPA Science Inventory

    Assessing the risk profiles of potentially sensitive populations requires a 'tool chest' of methodological approaches to adequately characterize and evaluate these populations. At present, there is an extensive body of literature on methodologies that apply to the evaluation of...

  2. Quantitative risk assessment of rabies entering Great Britain from North America via cats and dogs.

    PubMed

    Jones, Rowena D; Kelly, Louise; Fooks, Anthony R; Wooldridge, Marion

    2005-06-01

    Great Britain has been rabies-free since 1922, which is often considered to be in part due to the strict laws requiring that imported cats and dogs be vaccinated and quarantined for 6 months immediately on entry into the country. Except for two isolated incidents, this quarantine policy has contributed to ensuring that Great Britain has remained free of rabies. In 2000, amendments to the UK quarantine laws were made and the Pet Travel Scheme (PETS) was launched for companion animals traveling from European Union countries and rabies-free islands. Since its introduction, it has been proposed that other countries including North America should be included within the UK scheme. A quantitative risk assessment was developed to assist in the policy decision to amend the long-standing quarantine laws for dogs and cats from North America. It was determined that the risk of rabies entry is very low and is dependent on the level of compliance (i.e., legally conforming to all of the required regulations) with PETS and the number of pets imported. Assuming 100% compliance with PETS and the current level of importation of cats and dogs from North America, the annual probability of importing rabies is lower for animals traveling via PETS (7.22 x 10(-6), 95th percentile) than quarantine (1.01 x 10(-5), 95th percentile). These results, and other scientific evidence, directly informed the decision to expand the PETS scheme to North America as of December 2002.

  3. Estimators of annual probability of infection for quantitative microbial risk assessment.

    PubMed

    Karavarsamis, N; Hamilton, A J

    2010-06-01

    Four estimators of annual infection probability were compared pertinent to Quantitative Microbial Risk Analysis (QMRA). A stochastic model, the Gold Standard, was used as the benchmark. It is a product of independent daily infection probabilities which in turn are based on daily doses. An alternative and commonly-used estimator, here referred to as the Naïve, assumes a single daily infection probability from a single value of daily dose. The typical use of this estimator in stochastic QMRA involves the generation of a distribution of annual infection probabilities, but since each of these is based on a single realisation of the dose distribution, the resultant annual infection probability distribution simply represents a set of inaccurate estimates. While the medians of both distributions were within an order of magnitude for our test scenario, the 95th percentiles, which are sometimes used in QMRA as conservative estimates of risk, differed by around one order of magnitude. The other two estimators examined, the Geometric and Arithmetic, were closely related to the Naïve and use the same equation, and both proved to be poor estimators. Lastly, this paper proposes a simple adjustment to the Gold Standard equation accommodating periodic infection probabilities when the daily infection probabilities are unknown.
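
    The contrast between the "Gold Standard" and "Naive" estimators can be made concrete with a single-hit dose-response and a lognormal daily dose. Both the dose-response parameter and the dose distribution below are illustrative, not the paper's test scenario.

```python
import numpy as np

rng = np.random.default_rng(3)

def p_daily(dose, r=0.05):
    """Single-hit exponential dose-response (illustrative)."""
    return 1.0 - np.exp(-r * dose)

def p_annual_gold(dose_draws):
    """'Gold Standard': product of 365 independent daily infection
    probabilities, each from its own realisation of the daily dose."""
    return 1.0 - np.prod(1.0 - p_daily(dose_draws))

def p_annual_naive(one_dose):
    """'Naive': a single sampled daily dose stands in for the whole year."""
    return 1.0 - (1.0 - p_daily(one_dose)) ** 365

doses = rng.lognormal(-3.0, 1.5, 365)  # hypothetical daily dose distribution
gold = p_annual_gold(doses)
naive_draws = [p_annual_naive(rng.lognormal(-3.0, 1.5)) for _ in range(1000)]
print(f"gold {gold:.4f}  naive median {np.median(naive_draws):.4f}  "
      f"naive 95th pct {np.percentile(naive_draws, 95):.4f}")
```

    Because each Naive draw extrapolates one dose realisation over 365 days, the upper tail of its distribution can sit far above the Gold Standard value, which is the order-of-magnitude gap at the 95th percentile that the abstract reports.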

  4. Integrating expert opinion with modelling for quantitative multi-hazard risk assessment in the Eastern Italian Alps

    NASA Astrophysics Data System (ADS)

    Chen, Lixia; van Westen, Cees J.; Hussin, Haydar; Ciurean, Roxana L.; Turkington, Thea; Chavarro-Rincon, Diana; Shrestha, Dhruba P.

    2016-11-01

    Extreme rainfall events are the main triggering causes for hydro-meteorological hazards in mountainous areas, where development is often constrained by the limited space suitable for construction. In these areas, hazard and risk assessments are fundamental for risk mitigation, especially for preventive planning, risk communication and emergency preparedness. Multi-hazard risk assessment in mountainous areas at local and regional scales remains a major challenge because of a lack of data on past events and causal factors, and because of the interactions between different types of hazards. The lack of data leads to a high level of uncertainty in the application of quantitative methods for hazard and risk assessment. Therefore, a systematic approach is required to combine these quantitative methods with expert-based assumptions and decisions. In this study, a quantitative multi-hazard risk assessment was carried out in the Fella River valley, prone to debris flows and floods, in the north-eastern Italian Alps. The main steps include data collection and development of inventory maps, definition of hazard scenarios, hazard assessment in terms of temporal and spatial probability calculation and intensity modelling, elements-at-risk mapping, estimation of asset values and the number of people, physical vulnerability assessment, the generation of risk curves and annual risk calculation. To compare the risk for each type of hazard, risk curves were generated for debris flows, river floods and flash floods. Uncertainties were expressed as minimum, average and maximum values of temporal and spatial probability, replacement costs of assets, population numbers, and physical vulnerability. These result in minimum, average and maximum risk curves. To validate this approach, a back analysis was conducted using the extreme hydro-meteorological event that occurred in August 2003 in the Fella River valley. The results show a good performance when compared to the historical damage reports.

  5. Hexavalent chromium and lung cancer in the chromate industry: a quantitative risk assessment.

    PubMed

    Park, Robert M; Bena, James F; Stayner, Leslie T; Smith, Randall J; Gibb, Herman J; Lees, Peter S J

    2004-10-01

    The purpose of this investigation was to estimate excess lifetime risk of lung cancer death resulting from occupational exposure to hexavalent-chromium-containing dusts and mists. The mortality experience in a previously studied cohort of 2,357 chromate chemical production workers with 122 lung cancer deaths was analyzed with Poisson regression methods. Extensive records of air samples evaluated for water-soluble total hexavalent chromium were available for the entire employment history of this cohort. Six different models of exposure-response for hexavalent chromium were evaluated by comparing deviances and inspection of cubic splines. Smoking (pack-years) imputed from cigarette use at hire was included in the model. Lifetime risks of lung cancer death from exposure to hexavalent chromium (assuming up to 45 years of exposure) were estimated using an actuarial calculation that accounts for competing causes of death. A linear relative rate model gave a good and readily interpretable fit to the data. The estimated rate ratio for 1 mg/m3-yr of cumulative exposure to hexavalent chromium (as CrO3), with a lag of five years, was RR=2.44 (95% CI=1.54-3.83). The excess lifetime risk of lung cancer death from exposure to hexavalent chromium at the current OSHA permissible exposure limit (PEL) (0.10 mg/m3) was estimated to be 255 per 1,000 (95% CI: 109-416). This estimate is comparable to previous estimates by U.S. EPA, California EPA, and OSHA using different occupational data. Our analysis predicts that current occupational standards for hexavalent chromium permit a lifetime excess risk of dying of lung cancer that exceeds 1 in 10, which is consistent with previous risk assessments. PMID:15563281
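    The actuarial calculation described above (a linear relative rate applied to a baseline hazard, accumulated over a lifetime while competing causes deplete the surviving population) can be sketched as follows. The baseline lung-cancer and all-cause hazards below are rough hypothetical values, not the study's rate tables, so the output is illustrative only; the slope is chosen so that RR = 2.44 at 1 mg/m3-yr, matching the published estimate.

```python
# Life-table sketch of excess lifetime risk under a linear relative rate
# model, RR = 1 + beta * cumulative_exposure with a 5-year lag. Baseline
# hazards are hypothetical placeholders, not the study's mortality data.

def lifetime_risk(exposure_level, beta=1.44, lag=5, work_years=45):
    """Lifetime probability of lung cancer death at a given exposure (mg/m3)."""
    alive = 1.0
    lung_deaths = 0.0
    for age in range(20, 85):
        base_lung = 5e-6 * 1.12 ** (age - 20)    # hypothetical baseline hazard
        base_other = 1e-3 * 1.09 ** (age - 20)   # competing causes (hypothetical)
        exposed_years = min(max(age - 20 - lag, 0), work_years)
        rr = 1.0 + beta * exposure_level * exposed_years
        h_lung = base_lung * rr
        lung_deaths += alive * h_lung            # deaths among current survivors
        alive *= (1.0 - h_lung - base_other)     # deplete by all causes
    return lung_deaths

# Excess risk at a 0.10 mg/m3 exposure, relative to no exposure.
excess = lifetime_risk(0.10) - lifetime_risk(0.0)
print(f"excess lifetime risk per 1,000: {1000 * excess:.0f}")
```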

  6. Quantitative evaluation of strategies for erosion control on a railway embankment batter

    NASA Astrophysics Data System (ADS)

    Gyasi-Agyei, Y.; Sibley, J.; Ashwath, N.

    2001-12-01

    Strategies for erosion control on a railway embankment batter (side slope) are quantitatively evaluated in this paper. The strategies comprised a control (do-nothing treatment), grass seeding, gypsum application, jute mat (an erosion control blanket) placement and planting hedgerows of Monto vetiver grass. Rainfall and runoff were monitored at 1 min intervals on 10 m wide embankment batter plots during 1998 and 1999. Total bedload and suspended sediment eroded from the plots were also measured, but only for a group of storm events within sampling intervals. It was demonstrated that vetiver grass is not cost-effective in controlling erosion on railway batters within the Central Queensland region. Seeding alone achieved a 60% reduction in the erosion rate compared with the control treatment. Applying gypsum to the calcium-deficient soil before seeding yielded an additional 25% reduction in the erosion rate, primarily the result of establishing 100% grass cover within seven months of sowing. For railway embankment batter erosion control, the emphasis therefore needs to be on rapid establishment of 100% grass cover. For rapid establishment of grass cover, irrigation is necessary during the initial stages of growth, as rainfall is unpredictable and potential evaporation exceeds rainfall in the study region. The risk of seeds and fertilizers being washed out by short-duration, high-intensity rainfall events during the establishment phase may be reduced by the use of erosion control blankets on sections of the batters. Accidental burning of grasses on some plots caused serious erosion problems and very slow recovery of grass growth; controlled burning of grasses on railway batters should therefore be avoided to protect batters from severe erosion.

  7. Quantitative evaluation of oligonucleotide surface concentrations using polymerization-based amplification

    PubMed Central

    Hansen, Ryan R.; Avens, Heather J.; Shenoy, Raveesh

    2008-01-01

    Quantitative evaluation of minimal polynucleotide concentrations has become a critical analysis among a myriad of applications found in molecular diagnostic technology. Development of high-throughput, nonenzymatic assays that are sensitive, quantitative and yet feasible for point-of-care testing is thus beneficial for routine implementation. Here, we develop a nonenzymatic method for quantifying surface concentrations of labeled DNA targets by coupling regulated amounts of polymer growth to complementary biomolecular binding on array-based biochips. Polymer film thickness measurements in the 20–220 nm range vary logarithmically with labeled DNA surface concentrations over two orders of magnitude, with a lower limit of quantitation at 60 molecules/μm2 (∼10^6 target molecules). In an effort to develop this amplification method towards compatibility with fluorescence-based methods of characterization, incorporation of fluorescent nanoparticles into the polymer films is also evaluated. The resulting gains in fluorescent signal enable quantification using detection instrumentation amenable to point-of-care settings. (Figure: polymerization-based amplification for quantitative evaluation of 3′-biotinylated oligonucleotide surface concentrations.) PMID:18661123
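    The logarithmic relationship reported above (film thickness linear in log10 of surface concentration over two orders of magnitude) implies a simple calibration-and-inversion workflow, sketched below. The calibration points are invented for illustration; in practice the curve must be measured for each assay.

```python
import math

# Sketch of a logarithmic calibration: thickness = a + b * log10(concentration),
# fitted by least squares, then inverted to quantify an unknown sample.
# Calibration data are hypothetical, not the published measurements.

cal = [(60, 20.0), (600, 120.0), (6000, 220.0)]  # (molecules/um^2, film nm)

xs = [math.log10(c) for c, _ in cal]
ys = [t for _, t in cal]
n = len(cal)
xbar, ybar = sum(xs) / n, sum(ys) / n
b = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
     / sum((x - xbar) ** 2 for x in xs))
a = ybar - b * xbar

def concentration_from_thickness(t_nm):
    """Invert the calibration to estimate molecules/um^2 from thickness."""
    return 10 ** ((t_nm - a) / b)

print(round(concentration_from_thickness(120.0)))  # recovers the 600 point
```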

  8. Evaluation of a quantitative clinical method for assessment of sensory skin irritation.

    PubMed

    Robinson, M K; Perkins, M A

    2001-10-01

    Sensory skin irritation refers to the myriad of symptomatic complaints (e.g., sting and burn) frequently associated with inflammatory skin conditions or skin intolerance to various chemicals or finished products. Sensory irritation is an important factor in consumer acceptance of the products that they buy and use; however, from a safety testing and risk assessment standpoint, it has been difficult to evaluate. Recently, methods have been developed to more quantitatively assess sensory irritation using a semantically-labeled scale of sensation intensity, the labeled magnitude (LM) scale. Using this device, studies were conducted to determine if test subjects' perceptions of recalled or imagined sensory responses (from a series of survey questions) were related to their actual sensory reactivity to chemical challenge. Subjects were presented with 15 skin sensation scenarios of varying intensities and asked to record their self-perceived recalled or imagined responses using the LM scale. Individual and mean responses to each of the 15 survey questions were compared within and across studies. Considerable variation was seen between subjects' responses to the questions, particularly for questions pertaining to stronger stimuli (e.g., scalding water or skin lacerations). There was also little consistency seen in the pattern of individual responses across the questions. However, among 4 different study populations, the group mean scores for each of the 15 survey questions showed a high degree of consistency. Also, in spite of the variability in perceived responses to the recalled/imagined skin sensations, statistically significant dose-response and time-response patterns were observed in chemical (lactic acid and capsaicin) challenge studies. In one capsaicin study, a direct relationship was observed, among 83% of the study subjects, between the mean recall intensity scores and actual responses to subsequent capsaicin challenge. This pattern was not seen in a lactic acid

  9. PURE: a web-based decision support system to evaluate pesticide environmental risk for sustainable pest management practices in California.

    PubMed

    Zhan, Yu; Zhang, Minghua

    2012-08-01

    Farmers, policy makers, and other stakeholders seek tools to quantitatively assess pesticide risks for mitigating pesticide impacts on ecosystem and human health. This paper presents the Pesticide Use Risk Evaluation (PURE) decision support system (DSS) for evaluating site-specific pesticide risks to surface water, groundwater, soil, and air across pesticide active ingredient (AI), pesticide product, and field levels. The risk score is determined by the ratio of the predicted environmental concentration (PEC) to the toxicity value for selected endpoint organism(s), except that the risk score for air is calculated using the emission potential (EP), a pesticide product property defined by the California Environmental Protection Agency (CEPA) for estimating potential volatile organic compound (VOC) emissions. Risk scores range from 0 to 100, where 0 represents negligible risk and 100 the highest risk. The procedure for calculating PEC in surface water was evaluated against monitoring data for 41 pesticide AIs, with a statistically significant correlation coefficient of r=0.82 (p<0.001). In addition, two almond fields in the Central Valley, California were evaluated for pesticide risks as a case study, where commonly acknowledged high-risk pesticides received high risk scores. Simazine, one of the most frequently detected pesticides in groundwater, was scored as 74 (the moderate-high risk class) for groundwater; and chlorpyrifos, one of the frequently detected pollutants in surface water, was scored as 100 (the high risk class) for surface water. To support quantitative pesticide risk assessment and the selection of reduced-risk pesticides, the PURE-DSS can assist growers, pest control advisors, and environmental protection organizations in mitigating the impacts of pesticide use on the environment.
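    The scoring idea described above (a PEC-to-toxicity ratio mapped onto a 0-100 scale) can be sketched as follows. The scaling function and the saturation threshold `ratio_at_100` are assumptions for illustration, not the published PURE algorithm.

```python
# Illustrative sketch of a PEC/toxicity risk score scaled to 0-100, in the
# spirit of the PURE-DSS scoring described above. The linear scaling and the
# example numbers are assumptions, not the published implementation.

def risk_score(pec, toxicity_value, ratio_at_100=10.0):
    """Map the ratio PEC / toxicity onto [0, 100].

    A ratio of 0 scores 0 (negligible risk); a ratio >= `ratio_at_100`
    saturates at 100 (highest risk).
    """
    ratio = pec / toxicity_value
    return min(100.0, 100.0 * ratio / ratio_at_100)

# Hypothetical: a predicted concentration of 5 ug/L against an endpoint
# toxicity value of 1 ug/L gives a mid-scale score.
print(risk_score(pec=5.0, toxicity_value=1.0))   # -> 50.0
print(risk_score(pec=20.0, toxicity_value=1.0))  # -> 100.0 (saturated)
```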

  10. Quantitative Evaluation of Liver Fibrosis Using Multi-Rayleigh Model with Hypoechoic Component

    NASA Astrophysics Data System (ADS)

    Higuchi, Tatsuya; Hirata, Shinnosuke; Yamaguchi, Tadashi; Hachiya, Hiroyuki

    2013-07-01

    To realize a quantitative diagnosis method for liver fibrosis, we have been developing a modeling method for the probability density function of the echo amplitude. In our previous model, the approximation accuracy was insufficient in regions containing hypoechoic tissue such as a nodule or a blood vessel. In this study, we examined a multi-Rayleigh model with three Rayleigh distributions, corresponding to the distributions of the echo amplitude from hypoechoic, normal, and fibrous tissue. We showed quantitatively that the proposed model can adequately model the amplitude distribution of liver fibrosis echo data containing hypoechoic tissue, using the Kullback-Leibler (KL) divergence, an index of the difference between two probability distributions. We also found that fibrosis indices can be estimated stably using the proposed model even if hypoechoic tissue is included in the region of interest. We conclude that the multi-Rayleigh model with three components can be used to evaluate the progression of liver fibrosis quantitatively.
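    The two ingredients named above, a three-component Rayleigh mixture for the amplitude distribution and the KL divergence as a goodness-of-fit index, can be sketched as follows. The mixture weights and scale parameters are illustrative, not values fitted to liver echo data.

```python
import math

# Sketch of a three-component multi-Rayleigh amplitude model and the
# Kullback-Leibler divergence used to score how well a simpler model
# approximates it. Parameters are illustrative, not fitted liver data.

def rayleigh_pdf(x, sigma):
    return (x / sigma**2) * math.exp(-x**2 / (2 * sigma**2))

def multi_rayleigh_pdf(x, params):
    """params: list of (weight, sigma), e.g. hypoechoic/normal/fibrous tissue."""
    return sum(w * rayleigh_pdf(x, s) for w, s in params)

def kl_divergence(p, q):
    """KL(p || q) for two discretized distributions on the same grid."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

params = [(0.2, 0.5), (0.5, 1.0), (0.3, 2.0)]   # (weight, sigma) per component
grid = [0.05 * i for i in range(1, 200)]
mixture = [multi_rayleigh_pdf(x, params) for x in grid]
single = [rayleigh_pdf(x, 1.2) for x in grid]   # single-Rayleigh approximation

# Normalize the discretized bins before comparing the two models.
mix_total, single_total = sum(mixture), sum(single)
mixture = [m / mix_total for m in mixture]
single = [s / single_total for s in single]
print(f"KL(mixture || single Rayleigh) = {kl_divergence(mixture, single):.4f}")
```

    A lower KL divergence against the measured amplitude histogram indicates a better model; the abstract's claim is that the three-component mixture achieves this even when hypoechoic tissue is present.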

  11. Combining qualitative and quantitative imaging evaluation for the assessment of genomic DNA integrity: The SPIDIA experience.

    PubMed

    Ciniselli, Chiara Maura; Pizzamiglio, Sara; Malentacchi, Francesca; Gelmini, Stefania; Pazzagli, Mario; Hartmann, Christina C; Ibrahim-Gawel, Hady; Verderio, Paolo

    2015-06-15

    In this note, we present an ad hoc procedure that combines qualitative (visual evaluation) and quantitative (ImageJ software) evaluations of Pulsed-Field Gel Electrophoresis (PFGE) images to assess the genomic DNA (gDNA) integrity of analyzed samples. This procedure could be suitable for the analysis of a large number of images by taking into consideration both the expertise of researchers and the objectiveness of the software. We applied this procedure on the first SPIDIA DNA External Quality Assessment (EQA) samples. Results show that the classification obtained by this ad hoc procedure allows a more accurate evaluation of gDNA integrity with respect to a single approach.

  12. Comprehensive evaluation of direct injection mass spectrometry for the quantitative profiling of volatiles in food samples.

    PubMed

    Lebrón-Aguilar, R; Soria, A C; Quintanilla-López, J E

    2016-10-28

    Although qualitative strategies based on direct injection mass spectrometry (DIMS) have recently emerged as an alternative for the rapid classification of food samples, the potential of these approaches in quantitative tasks has scarcely been addressed to date. In this paper, the applicability of different multivariate regression procedures to data collected by DIMS from simulated mixtures has been evaluated. The most relevant factors affecting quantitation, such as random noise, the number of calibration samples, type of validation, mixture complexity and similarity of mass spectra, were also considered and comprehensively discussed. Based on the conclusions drawn from simulated data, and as an example of application, experimental mass spectral fingerprints collected by direct thermal desorption coupled to mass spectrometry were used for the quantitation of major volatiles in Thymus zygis subsp. zygis chemotypes. The results obtained, validated with the direct thermal desorption coupled to gas chromatography-mass spectrometry method here used as a reference, show the potential of DIMS approaches for the fast and precise quantitative profiling of volatiles in foods. This article is part of the themed issue 'Quantitative mass spectrometry'. PMID:27644978
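    The basic premise of multivariate quantitation from a spectral fingerprint, modelling a mixture spectrum as a concentration-weighted sum of component spectra and solving for the concentrations by least squares, can be sketched as follows. The two "pure spectra" and channel intensities are invented for illustration.

```python
# Sketch of multivariate calibration on a mass spectral fingerprint: a mixture
# spectrum is modelled as a concentration-weighted sum of pure-component
# spectra, and concentrations are recovered by ordinary least squares.
# All intensities are hypothetical, not data from the paper.

pure_a = [10.0, 5.0, 0.0, 1.0]   # intensities at 4 m/z channels (hypothetical)
pure_b = [0.0, 2.0, 8.0, 4.0]

def quantify(mixture):
    """Solve the 2x2 normal equations for concentrations (c_a, c_b)."""
    saa = sum(a * a for a in pure_a)
    sbb = sum(b * b for b in pure_b)
    sab = sum(a * b for a, b in zip(pure_a, pure_b))
    sam = sum(a * m for a, m in zip(pure_a, mixture))
    sbm = sum(b * m for b, m in zip(pure_b, mixture))
    det = saa * sbb - sab * sab
    return ((sam * sbb - sbm * sab) / det,
            (saa * sbm - sab * sam) / det)

# A noiseless 30:70 mixture is recovered exactly; with noise, the same
# least-squares machinery gives the best-fit concentrations.
mixture = [0.3 * a + 0.7 * b for a, b in zip(pure_a, pure_b)]
c_a, c_b = quantify(mixture)
print(round(c_a, 3), round(c_b, 3))  # -> 0.3 0.7
```

    Real DIMS applications use many more channels and components (and often partial least squares rather than ordinary least squares), but the structure of the problem is the same.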

  14. Protein quantitative trait loci analysis identifies genetic variation in the innate immune regulator TOLLIP in post lung transplant primary graft dysfunction risk

    PubMed Central

    Cantu, Edward; Suzuki, Yoshikazu; Diamond, Joshua M.; Ellis, John; Tiwari, Jaya; Beduhn, Ben; Nellen, James R.; Shah, Rupal; Meyer, Nuala J.; Lederer, David J.; Kawut, Steven M.; Palmer, Scott M.; Snyder, Laurie D.; Hartwig, Matthew G.; Lama, Vibha N.; Bhorade, Sangeeta; Crespo, Maria; Demissie, Ejigayehu; Wille, Keith; Orens, Jonathan; Shah, Pali D.; Weinacker, Ann; Weill, David; Wilkes, David; Roe, David; Ware, Lorraine B.; Wang, Fan; Feng, Rui; Christie, Jason D.

    2016-01-01

    We previously identified plasma plasminogen activator inhibitor-1 (PAI-1) level as a quantitative lung injury biomarker in primary graft dysfunction (PGD). We hypothesized that plasma levels of PAI-1 used as a quantitative trait could facilitate discovery of genetic loci important in PGD pathogenesis. A two-stage cohort study was performed. In Stage 1, we tested associations of loci with PAI-1 plasma level using linear modeling. Genotyping was performed using the Illumina CVD Bead Chip v2. Loci meeting a p<5×10−4 cutoff were carried forward and tested in Stage 2 for association with PGD. In Stage 1, 297 enrollees were evaluated; six loci associated with PAI-1 were carried forward to Stage 2 and evaluated in 728 patients. rs3168046 (Toll interacting protein, TOLLIP) was significantly associated with PGD (p=0.006). The increased risk of PGD for carrying at least one copy of this variant was 11.7% [95% CI: 4.9%, 18.5%]. The false positive rate for individuals with this genotype who did not have PGD was 6.1%. Variants in the TOLLIP gene are associated with higher circulating PAI-1 plasma levels and validate for association with clinical PGD. A protein quantitative trait analysis for PGD risk prioritizes genetic variations in TOLLIP and supports a role for toll-like receptors in PGD pathogenesis. PMID:26663441

  15. Evaluation of volcanic risk management in Merapi and Bromo Volcanoes

    NASA Astrophysics Data System (ADS)

    Bachri, S.; Stöetter, J.; Sartohadi, J.; Setiawan, M. A.

    2012-04-01

    Merapi (Central Java Province) and Bromo (East Java Province) volcanoes have human-environmental systems with unique characteristics, with specific consequences for their risk management. Various efforts have been carried out by many parties (government institutions, scientists and non-governmental organizations) to reduce the risk in these areas. However, most of these actions have been temporary and partial, leading to overlapping work and, ultimately, to a non-integrated scheme of volcanic risk management. This study therefore aims to identify and evaluate risk and disaster reduction actions at the Merapi and Bromo volcanoes. To achieve this aim, a thorough literature review was carried out to identify earlier studies in both areas. Afterwards, the basic concept of the risk management cycle, consisting of risk assessment, risk reduction, event management and regeneration, was used to map those earlier studies and the risk management actions already implemented at Merapi and Bromo. The results show that risk studies at Merapi have been developed predominantly around the physical aspects of volcanic eruptions, i.e. models of lahar flows, hazard maps and other geophysical modelling. After the 2006 eruption of Merapi, research on the social side of risk management, such as risk communication, social vulnerability and cultural vulnerability, also appeared. Disaster risk management activities in the Bromo area, by contrast, have emphasized physical processes and historical-religious aspects. This overview of both study areas provides information on how risk studies have been used for managing volcanic disasters and confirms that most earlier studies emphasize risk assessment, with only a few considering the risk reduction phase. Further field work in the near future will complete these findings and contribute to formulating integrated volcanic risk management cycles for both

  16. EVALUATING RISK IN OLDER ADULTS USING PHYSIOLOGICALLY BASED PHARMACOKINETIC MODELS

    EPA Science Inventory

    The rapid growth in the number of older Americans has many implications for public health, including the need to better understand the risks posed by environmental exposures to older adults. An important element for evaluating risk is the understanding of the doses of environment...

  17. Trust-level risk evaluation and risk control guidance in the NHS East of England.

    PubMed

    Card, Alan J; Ward, James R; Clarkson, P John

    2014-08-01

    In recent years, the healthcare sector has adopted the use of operational risk assessment tools to help understand the systems issues that lead to patient safety incidents. But although these problem-focused tools have improved the ability of healthcare organizations to identify hazards, they have not translated into measurable improvements in patient safety. One possible reason for this is a lack of support for the solution-focused process of risk control. This article describes a content analysis of the risk management strategies, policies, and procedures at all acute (i.e., hospital), mental health, and ambulance trusts (health service organizations) in the East of England area of the British National Health Service. The primary goal was to determine what organizational-level guidance exists to support risk control practice. A secondary goal was to examine the risk evaluation guidance provided by these trusts. With regard to risk control, we found an almost complete lack of useful guidance to promote good practice. With regard to risk evaluation, the trusts relied exclusively on risk matrices. A number of weaknesses were found in the use of this tool, especially related to the guidance for scoring an event's likelihood. We make a number of recommendations to address these concerns. The guidance assessed provides insufficient support for risk control and risk evaluation. This may present a significant barrier to the success of risk management approaches in improving patient safety.
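    The risk matrices the article found trusts relying on are typically 5×5 likelihood × consequence grids banded into escalation categories. A minimal sketch of that scoring scheme follows; the band thresholds are a common convention and are illustrative assumptions, not any specific trust's guidance (indeed, the article's point is that guidance for choosing the likelihood score is often weak).

```python
# Sketch of a typical 5x5 likelihood x consequence risk matrix of the kind
# used by NHS trusts. Band thresholds are illustrative assumptions.

def risk_rating(likelihood, consequence):
    """Score = likelihood (1-5) x consequence (1-5), banded for escalation."""
    if not (1 <= likelihood <= 5 and 1 <= consequence <= 5):
        raise ValueError("likelihood and consequence must each be 1-5")
    score = likelihood * consequence
    if score >= 15:
        return score, "extreme"
    if score >= 8:
        return score, "high"
    if score >= 4:
        return score, "moderate"
    return score, "low"

print(risk_rating(4, 5))  # -> (20, 'extreme')
print(risk_rating(1, 3))  # -> (3, 'low')
```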

  18. Assessing vertebral fracture risk on volumetric quantitative computed tomography by geometric characterization of trabecular bone structure

    NASA Astrophysics Data System (ADS)

    Checefsky, Walter A.; Abidin, Anas Z.; Nagarajan, Mahesh B.; Bauer, Jan S.; Baum, Thomas; Wismüller, Axel

    2016-03-01

    The current clinical standard for measuring Bone Mineral Density (BMD) is dual X-ray absorptiometry; however, BMD derived from volumetric quantitative computed tomography has more recently been shown to demonstrate a high association with spinal fracture susceptibility. In this study, we propose a method of fracture risk assessment using structural properties of trabecular bone in spinal vertebrae. Experimental data were acquired via axial multi-detector CT (MDCT) from 12 spinal vertebrae specimens using a whole-body 256-row CT scanner with a dedicated calibration phantom. Common image processing methods were used to annotate the trabecular compartment in the vertebral slices, creating a circular region of interest (ROI) that excluded cortical bone for each slice. The pixels inside the ROI were converted to values indicative of BMD. High-dimensional geometrical features were derived using the scaling index method (SIM) at different radii and scaling factors (SF). The mean BMD values within the ROI were then extracted and used in conjunction with a support vector machine to predict the failure load of the specimens. Prediction performance was measured using the root-mean-square error (RMSE) metric, which showed that SIM combined with mean BMD features (RMSE = 0.82 +/- 0.37) outperformed MDCT-measured mean BMD alone (RMSE = 1.11 +/- 0.33) (p < 10^-4). These results demonstrate that biomechanical strength prediction in vertebrae can be significantly improved through the use of SIM-derived texture features from trabecular bone.
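    The comparison metric above, RMSE between predicted and measured failure loads, is straightforward to compute; a minimal sketch follows. The predicted and measured loads are invented numbers, not the specimen data.

```python
import math

# Sketch of the RMSE metric used to compare failure-load predictors.
# All loads below are hypothetical, not the study's specimen data.

def rmse(predicted, measured):
    return math.sqrt(sum((p - m) ** 2 for p, m in zip(predicted, measured))
                     / len(measured))

measured      = [2.0, 3.5, 1.8, 4.2, 2.9]   # failure loads (hypothetical)
model_texture = [2.2, 3.3, 1.9, 4.0, 3.0]   # e.g. texture + BMD predictor
model_bmd     = [2.6, 2.9, 2.5, 3.4, 3.3]   # e.g. mean BMD alone

print(round(rmse(model_texture, measured), 3))
print(round(rmse(model_bmd, measured), 3))
```

    A lower RMSE means predictions closer to the measured loads, which is the sense in which the SIM-plus-BMD model outperformed BMD alone.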

  19. Quantitative structure-activity relationships and ecological risk assessment: an overview of predictive aquatic toxicology research.

    PubMed

    Bradbury, S P

    1995-09-01

    In the field of aquatic toxicology, quantitative structure-activity relationships (QSARs) have developed as scientifically credible tools for predicting the toxicity of chemicals when little or no empirical data are available. A fundamental understanding of toxicological principles has been considered an important component to the acceptance and application of QSAR approaches as biologically relevant in ecological risk assessments. As a consequence, there has been an evolution of QSAR development and application from that of a chemical-class perspective to one that is more consistent with assumptions regarding modes of toxic action. In this review, techniques to assess modes of toxic action from chemical structure are discussed, with consideration that toxicodynamic knowledge bases must be clearly defined with regard to exposure regimes, biological models/endpoints and compounds that adequately span the diversity of chemicals anticipated for future applications. With such knowledge bases, classification systems, including rule-based expert systems, have been established for use in predictive aquatic toxicology applications. The establishment of QSAR techniques that are based on an understanding of toxic mechanisms is needed to provide a link to physiologically based toxicokinetic and toxicodynamic models, which can provide the means to extrapolate adverse effects across species and exposure regimes. PMID:7570660
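    A canonical example of the kind of mode-of-action QSAR discussed above is the baseline (nonpolar narcosis) model, in which acute toxic potency scales linearly with hydrophobicity: log(1/LC50) = a·logKow + b. The coefficients below are illustrative placeholders, not a published regression; real models fit them to measured acute-toxicity data for a defined species and mode of action.

```python
# Sketch of a classic baseline-narcosis QSAR: potency increases linearly
# with hydrophobicity, log(1/LC50) = A * logKow + B. Coefficients are
# hypothetical placeholders, not a published model.

A, B = 0.9, -1.3

def predicted_lc50_mmol_per_l(log_kow):
    """Invert log(1/LC50) = A*logKow + B to get LC50 in mmol/L."""
    return 10 ** (-(A * log_kow + B))

# More hydrophobic chemicals (higher logKow) are predicted to be more potent,
# i.e. to have lower LC50 values.
for log_kow in (1.0, 3.0, 5.0):
    print(log_kow, f"{predicted_lc50_mmol_per_l(log_kow):.3g}")
```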

  20. Establishment and evaluation of event-specific quantitative PCR method for genetically modified soybean MON89788.

    PubMed

    Takabatake, Reona; Onishi, Mari; Koiwa, Tomohiro; Futo, Satoshi; Minegishi, Yasutaka; Akiyama, Hiroshi; Teshima, Reiko; Furui, Satoshi; Kitta, Kazumi

    2010-01-01

    A novel real-time PCR-based analytical method was established for the event-specific quantification of the GM soybean event MON89788. The conversion factor (Cf), which is required to calculate the GMO amount, was experimentally determined. The quantitative method was evaluated by a single-laboratory analysis and a blind test in a multi-laboratory trial. The limit of quantitation for the method was estimated to be 0.1% or lower. Trueness and precision were evaluated as the bias and the reproducibility relative standard deviation (RSDR), and the determined bias and RSDR values for the method were both less than 20%. These results suggest that the established method is suitable for practical detection and quantification of MON89788. PMID:21071908
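    The role of the conversion factor in this kind of assay is to normalize the event-to-endogenous-gene copy ratio measured in a sample against the same ratio in 100% GM reference material, which can be sketched as follows. All copy numbers and the Cf value below are invented for illustration, not the determined MON89788 values.

```python
# Sketch of conversion-factor-based GMO quantitation in real-time PCR:
# Cf is the event/endogenous copy-number ratio in 100% GM material, and a
# sample's GMO% is its own ratio scaled by Cf. Numbers are hypothetical.

def gmo_percent(event_copies, endogenous_copies, cf):
    """GMO amount (%) from measured copy numbers and the conversion factor."""
    return (event_copies / endogenous_copies) / cf * 100.0

cf = 0.45   # hypothetical conversion factor for the event
print(round(gmo_percent(90.0, 20000.0, cf), 2))  # -> 1.0 (a 1% GM sample)
```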
  2. Quantitative Microbial Risk Assessment for Clostridium perfringens in Natural and Processed Cheeses

    PubMed Central

    Lee, Heeyoung; Lee, Soomin; Kim, Sejeong; Lee, Jeeyeon; Ha, Jimyeong; Yoon, Yohan

    2016-01-01

    This study evaluated the risk of Clostridium perfringens (C. perfringens) foodborne illness from natural and processed cheeses. Microbial risk assessment in this study was conducted according to four steps: hazard identification, hazard characterization, exposure assessment, and risk characterization. The hazard identification of C. perfringens on cheese was identified through literature, and dose response models were utilized for hazard characterization of the pathogen. For exposure assessment, the prevalence of C. perfringens, storage temperatures, storage time, and annual amounts of cheese consumption were surveyed. Eventually, a simulation model was developed using the collected data and the simulation result was used to estimate the probability of C. perfringens foodborne illness by cheese consumption with @RISK. C. perfringens was determined to be low risk on cheese based on hazard identification, and the exponential model (r = 1.82×10−11) was deemed appropriate for hazard characterization. Annual amounts of natural and processed cheese consumption were 12.40±19.43 g and 19.46±14.39 g, respectively. Since the contamination levels of C. perfringens on natural (0.30 Log CFU/g) and processed cheeses (0.45 Log CFU/g) were below the detection limit, the initial contamination levels of natural and processed cheeses were estimated by beta distribution (α1 = 1, α2 = 91; α1 = 1, α2 = 309)×uniform distribution (a = 0, b = 2; a = 0, b = 2.8) to be −2.35 and −2.73 Log CFU/g, respectively. Moreover, no growth of C. perfringens was observed for exposure assessment to simulated conditions of distribution and storage. These data were used for risk characterization by a simulation model, and the mean values of the probability of C. perfringens foodborne illness by cheese consumption per person per day for natural and processed cheeses were 9.57×10−14 and 3.58×10−14, respectively. These results indicate that probability of C. perfringens foodborne illness
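    The exposure-assessment and risk-characterization steps above can be sketched as a Monte Carlo simulation: draw an initial contamination level from the beta × uniform model, convert it to a dose via the serving size, and apply the exponential dose-response P(ill) = 1 − exp(−r·dose). The r value and the beta/uniform parameters for natural cheese follow the abstract; the fixed serving size is a simplifying assumption, and the full published model also accounts for storage, growth, and consumption frequency, so this sketch will not reproduce the published 9.57×10−14 figure.

```python
import math
import random

# Monte Carlo sketch of a quantitative microbial risk assessment with an
# exponential dose-response model. r and the contamination model follow the
# abstract; the serving size is a simplifying assumption.

random.seed(1)
R_DR = 1.82e-11     # exponential dose-response parameter (per CFU)
SERVING_G = 12.4    # assumed fixed daily natural-cheese intake (g)

def simulate_daily_risk(n=100_000):
    """Mean probability of illness per person per day over n iterations."""
    total = 0.0
    for _ in range(n):
        cfu_per_g = random.betavariate(1, 91) * random.uniform(0.0, 2.0)
        dose = cfu_per_g * SERVING_G
        total += 1.0 - math.exp(-R_DR * dose)
    return total / n

print(f"mean daily probability of illness: {simulate_daily_risk():.2e}")
```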

  3. A Suite of Models to Support the Quantitative Assessment of Spread in Pest Risk Analysis

    PubMed Central

    Robinet, Christelle; Kehlenbeck, Hella; Kriticos, Darren J.; Baker, Richard H. A.; Battisti, Andrea; Brunel, Sarah; Dupin, Maxime; Eyre, Dominic; Faccoli, Massimo; Ilieva, Zhenya; Kenis, Marc; Knight, Jon; Reynaud, Philippe; Yart, Annie; van der Werf, Wopke

    2012-01-01

    Pest Risk Analyses (PRAs) are conducted worldwide to decide whether and how exotic plant pests should be regulated to prevent invasion. There is an increasing demand for science-based risk mapping in PRA. Spread plays a key role in determining the potential distribution of pests, but there is no suitable spread modelling tool available for pest risk analysts. Existing models are species specific, biologically and technically complex, and data hungry. Here we present a set of four simple and generic spread models that can be parameterised with limited data. Simulations with these models generate maps of the potential expansion of an invasive species at continental scale. The models have one to three biological parameters. They differ in whether they treat spatial processes implicitly or explicitly, and in whether they consider pest density or pest presence/absence only. The four models represent four complementary perspectives on the process of invasion and, because they have different initial conditions, they can be considered as alternative scenarios. All models take into account habitat distribution and climate. We present an application of each of the four models to the western corn rootworm, Diabrotica virgifera virgifera, using historic data on its spread in Europe. Further tests as proof of concept were conducted with a broad range of taxa (insects, nematodes, plants, and plant pathogens). Pest risk analysts, the intended model users, found the model outputs to be generally credible and useful. The estimation of parameters from data requires insights into population dynamics theory, and this requires guidance. If used appropriately, these generic spread models provide a transparent and objective tool for evaluating the potential spread of pests in PRAs. Further work is needed to validate models, build familiarity in the user community and create a database of species parameters to help realize their potential in PRA practice. PMID:23056174
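    The simplest member of the family of spread models described above treats the invaded range implicitly as a disc whose radius grows at a constant annual spread rate, capped by the extent of suitable habitat and climate. A minimal sketch follows; the parameter values are illustrative, not those fitted for Diabrotica virgifera virgifera.

```python
import math

# Sketch of a generic, data-light spread model: a radially expanding range
# with a constant spread rate, capped by the area of suitable habitat.
# Parameter values are hypothetical illustrations.

def invaded_area_km2(years, spread_rate_km_per_yr=20.0,
                     suitable_area_km2=500_000.0):
    """Area of a radially expanding range after `years`, capped by habitat."""
    radius = spread_rate_km_per_yr * years
    return min(math.pi * radius ** 2, suitable_area_km2)

for t in (5, 10, 20, 40):
    print(t, round(invaded_area_km2(t)))
```

    The other models in the suite refine this picture (explicit space, pest density rather than presence/absence), but all share this small number of biological parameters that can be estimated from limited historical data.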

  4. [Drifts and pernicious effects of the quantitative evaluation of research: the misuse of bibliometrics].

    PubMed

    Gingras, Yves

    2015-06-01

    The quantitative evaluation of scientific research relies increasingly on bibliometric indicators of publications and citations. We present the issues raised by the simplistic use of these methods and recall the dangers of using poorly built indicators and technically defective rankings that do not measure the dimensions they are supposed to measure, for example the quality of publications, laboratories or universities. We show that francophone journals are particularly likely to suffer from the misuse of overly simplistic bibliometric rankings of scientific journals.

  5. Using qualitative and quantitative methods to evaluate small-scale disease management pilot programs.

    PubMed

    Esposito, Dominick; Taylor, Erin Fries; Gold, Marsha

    2009-02-01

    Interest in disease management programs continues to grow as managed care plans, the federal and state governments, and other organizations consider such efforts as a means to improve health care quality and reduce costs. These efforts vary in size, scope, and target population. While large-scale programs provide the means to measure impacts, evaluation of smaller interventions remains valuable as they often represent the early planning stages of larger initiatives. This paper describes a multi-method approach for evaluating small interventions that sought to improve the quality of care for Medicaid beneficiaries with multiple chronic conditions. Our approach relied on quantitative and qualitative methods to develop a complete understanding of each intervention. Quantitative data in the form of both process measures, such as case manager contacts, and outcome measures, such as hospital use, were reported and analyzed. Qualitative information was collected through interviews and the development of logic models to document the flow of intervention activities and how they were intended to affect outcomes. The logic models helped us to understand the underlying reasons for the success or lack thereof of each intervention. The analysis provides useful information on several fronts. First, qualitative data provided valuable information about implementation. Second, process measures helped determine whether implementation occurred as anticipated. Third, outcome measures indicated the potential for favorable results later, possibly suggesting further study. Finally, the evaluation of qualitative and quantitative data in combination helped us assess the potential promise of each intervention and identify common themes and challenges across all interventions.

  6. Approaches for Assessing Risks to Sensitive Populations: Lessons Learned from Evaluating Risks in the Pediatric Population

    PubMed Central

    Hines, Ronald N.; Sargent, Dana; Autrup, Herman; Birnbaum, Linda S.; Brent, Robert L.; Doerrer, Nancy G.; Cohen Hubal, Elaine A.; Juberg, Daland R.; Laurent, Christian; Luebke, Robert; Olejniczak, Klaus; Portier, Christopher J.; Slikker, William

    2010-01-01

    Assessing the risk profiles of potentially sensitive populations requires a “tool chest” of methodological approaches to adequately characterize and evaluate these populations. At present, there is an extensive body of literature on methodologies that apply to the evaluation of the pediatric population. The Health and Environmental Sciences Institute Subcommittee on Risk Assessment of Sensitive Populations evaluated key references in the area of pediatric risk to identify a spectrum of methodological approaches. These approaches are considered in this article for their potential to be extrapolated for the identification and assessment of other sensitive populations. Recommendations as to future research needs and/or alternate methodological considerations are also made. PMID:19770482

  7. [Vascular risk of oral contraceptive agents: realities and mechanisms. I. Risk evaluation].

    PubMed

    Beaumont, V; Beaumont, J L

    1989-06-17

    The suspected risk of oral contraception was confirmed when large scale epidemiological studies became available. The present work recalls the methodology of a good evaluation, the pros and cons of retrospective and prospective studies, the different appreciations provided by measuring "relative risk" or "attributable risk". In terms of public health, the data obtained supported the necessity to include mortality related to oral contraception in an evaluation of reproductive mortality. This work compares the incidence of vascular complications evaluated through different studies, according to the criteria selected and the type of vascular disease. The advantage of lowering estrogen content is considered. PMID:2525761

  8. Reanalysis of the DEMS nested case-control study of lung cancer and diesel exhaust: suitability for quantitative risk assessment.

    PubMed

    Crump, Kenny S; Van Landingham, Cynthia; Moolgavkar, Suresh H; McClellan, Roger

    2015-04-01

    The International Agency for Research on Cancer (IARC) in 2012 upgraded its hazard characterization of diesel engine exhaust (DEE) to "carcinogenic to humans." The Diesel Exhaust in Miners Study (DEMS) cohort and nested case-control studies of lung cancer mortality in eight U.S. nonmetal mines were influential in IARC's determination. We conducted a reanalysis of the DEMS case-control data to evaluate its suitability for quantitative risk assessment (QRA). Our reanalysis used conditional logistic regression and adjusted for cigarette smoking in a manner similar to the original DEMS analysis. However, we included additional estimates of DEE exposure and adjustment for radon exposure. In addition to applying three DEE exposure estimates developed by DEMS, we applied six alternative estimates. Without adjusting for radon, our results were similar to those in the original DEMS analysis: all but one of the nine DEE exposure estimates showed evidence of an association between DEE exposure and lung cancer mortality, with trend slopes differing only by about a factor of two. When exposure to radon was adjusted, the evidence for a DEE effect was greatly diminished, but was still present in some analyses that utilized the three original DEMS DEE exposure estimates. A DEE effect was not observed when the six alternative DEE exposure estimates were utilized and radon was adjusted. No consistent evidence of a DEE effect was found among miners who worked only underground. This article highlights some issues that should be addressed in any use of the DEMS data in developing a QRA for DEE.

  9. Primer for evaluating ecological risk at petroleum release sites.

    PubMed

    Claff, R

    1999-02-01

    Increasingly, risk-based approaches are being used to guide decision making at sites such as service stations and petroleum product terminals, where petroleum products have been inadvertently released to the soil. For example, the API Decision Support System software, DSS, evaluates site human health risk along six different routes of exposure. The American Society for Testing and Materials' Risk-Based Corrective Action (RBCA) standard, ASTM 1739, establishes a tiered framework for evaluating petroleum release sites on the basis of human health risk. Though much of the risk assessment focus has been on human health risk, regulatory agencies recognize that protection of human health may not fully protect the environment; and EPA has developed guidance on identifying ecological resources to be protected through risk-based decision making. Not every service station or petroleum product terminal site warrants a detailed ecological risk assessment. In some cases, a simple preliminary assessment will provide sufficient information for decision making. Accordingly, the American Petroleum Institute (API) is developing a primer for site managers, to assist them in conducting this preliminary assessment, and in deciding whether more detailed ecological risk assessments are warranted. The primer assists the site manager in identifying relevant ecological receptors and habitats, in identifying chemicals and exposure pathways of concern, in developing a conceptual model of the site to guide subsequent actions, and in identifying conditions that may warrant immediate response. PMID:10189585

  11. Enhancing understanding and recall of quantitative information about medical risks: a cross-cultural comparison between Germany and Spain.

    PubMed

    Garcia-Retamero, Rocio; Galesic, Mirta; Gigerenzer, Gerd

    2011-05-01

    In two experiments, we analyzed cross-cultural differences in understanding and recalling information about medical risks in two countries--Germany and Spain--whose students differ substantially in their quantitative literacy according to the 2003 Programme for International Student Assessment (PISA; OECD, 2003, 2010). We further investigated whether risk understanding can be enhanced by using visual aids (Experiment 1), and whether different ways of describing risks affect recall (Experiment 2). Results showed that Spanish students are more vulnerable to misunderstanding and forgetting the risk information than their German counterparts. Spanish students, however, benefit more than German students from representing the risk information using ecologically rational formats--which exploit the way information is represented in the human mind. We concluded that our results can have important implications for clinical practice.

  12. Food and Drug Administration Evaluation and Cigarette Smoking Risk Perceptions

    ERIC Educational Resources Information Center

    Kaufman, Annette R.; Waters, Erika A.; Parascandola, Mark; Augustson, Erik M.; Bansal-Travers, Maansi; Hyland, Andrew; Cummings, K. Michael

    2011-01-01

    Objectives: To examine the relationship between a belief about Food and Drug Administration (FDA) safety evaluation of cigarettes and smoking risk perceptions. Methods: A nationally representative, random-digit-dialed telephone survey of 1046 adult current cigarette smokers. Results: Smokers reporting that the FDA does not evaluate cigarettes for…

  13. Computer-Aided Image Analysis and Fractal Synthesis in the Quantitative Evaluation of Tumor Aggressiveness in Prostate Carcinomas

    PubMed Central

    Waliszewski, Przemyslaw

    2016-01-01

    The subjective evaluation of tumor aggressiveness is a cornerstone of contemporary tumor pathology. Large intra- and interobserver variability is a known limiting factor of this approach. This fundamental weakness influences the statistical deterministic models of progression risk assessment. It is unlikely that the recent modification of tumor grading according to Gleason criteria for prostate carcinoma will cause a qualitative change and significantly improve accuracy. The Gleason system does not allow the identification of low-aggressive carcinomas by precise criteria. The ontological dichotomy implies the application of an objective, quantitative approach for the evaluation of tumor aggressiveness as an alternative. That novel approach must be developed and validated in a manner that is independent of the results of any subjective evaluation. For example, computer-aided image analysis can provide information about the geometry of the spatial distribution of cancer cell nuclei. A series of interrelated complexity measures characterizes the complex tumor images unequivocally. Using those measures, carcinomas can be classified into classes of equivalence and compared with each other. Furthermore, those measures define quantitative criteria for the identification of low- and high-aggressive prostate carcinomas, information that the subjective approach is not able to provide. The co-application of those complexity measures in cluster analysis leads to the conclusion that either the subjective or objective classification of tumor aggressiveness for prostate carcinomas should comprise at most three grades (or classes). Finally, this set of global fractal dimensions enables a look into the dynamics of the underlying cellular system of interacting cells and the reconstruction of the temporal-spatial attractor based on Takens' embedding theorem. Both computer-aided image analysis and the subsequent fractal synthesis could be performed

  15. Evaluate the Impact of your Education and Outreach Program Using the Quantitative Collaborative Impact Analysis Method

    NASA Astrophysics Data System (ADS)

    Scalice, D.; Davis, H. B.

    2015-12-01

    The AGU scientific community has a strong motivation to improve the STEM knowledge and skills of today's youth, and we are dedicating increasing amounts of our time and energy to education and outreach work. Scientists and educational project leads can benefit from a deeper connection to the value of evaluation, how to work with an evaluator, and how to effectively integrate evaluation into projects to increase their impact. This talk will introduce a method for evaluating educational activities, including public talks, professional development workshops for educators, youth engagement programs, and more. We will discuss the impetus for developing this method--the Quantitative Collaborative Impact Analysis Method--how it works, and the successes we've had with it in the NASA Astrobiology education community.

  16. Linking quantitative microbial risk assessment and epidemiological data: informing safe drinking water trials in developing countries.

    PubMed

    Enger, Kyle S; Nelson, Kara L; Clasen, Thomas; Rose, Joan B; Eisenberg, Joseph N S

    2012-05-01

    Intervention trials are used extensively to assess household water treatment (HWT) device efficacy against diarrheal disease in developing countries. Using these data for policy, however, requires addressing issues of generalizability (relevance of one trial in other contexts) and systematic bias associated with design and conduct of a study. To illustrate how quantitative microbial risk assessment (QMRA) can address water safety and health issues, we analyzed a published randomized controlled trial (RCT) of the LifeStraw Family Filter in the Congo. The model accounted for bias due to (1) incomplete compliance with filtration, (2) unexpected antimicrobial activity by the placebo device, and (3) incomplete recall of diarrheal disease. Effectiveness was measured using the longitudinal prevalence ratio (LPR) of reported diarrhea. The Congo RCT observed an LPR of 0.84 (95% CI: 0.61, 1.14). Our model predicted LPRs, assuming a perfect placebo, ranging from 0.50 (2.5-97.5 percentile: 0.33, 0.77) to 0.86 (2.5-97.5 percentile: 0.68, 1.09) for high (but not perfect) and low (but not zero) compliance, respectively. The calibration step provided estimates of the concentrations of three pathogen types (modeled as diarrheagenic E. coli, Giardia, and rotavirus) in drinking water, consistent with the longitudinal prevalence of reported diarrhea measured in the trial, and constrained by epidemiological data from the trial. Use of a QMRA model demonstrated the importance of compliance in HWT efficacy, the need for pathogen data from source waters, the effect of quantifying biases associated with epidemiological data, and the usefulness of generalizing the effectiveness of HWT trials to other contexts.
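
    The trial's effectiveness metric, the longitudinal prevalence ratio, is straightforward to compute. A sketch with invented household data (not the Congo trial's):

```python
def longitudinal_prevalence(observations):
    """Longitudinal prevalence: fraction of observed person-days with diarrhoea.

    observations: list of (days_with_diarrhoea, days_observed) per household.
    """
    sick = sum(d for d, n in observations)
    total = sum(n for d, n in observations)
    return sick / total

# Hypothetical illustration: (sick days, observed days) per household
intervention = [(2, 60), (0, 55), (5, 60), (1, 58)]
control      = [(4, 60), (3, 57), (6, 60), (2, 59)]
lpr = longitudinal_prevalence(intervention) / longitudinal_prevalence(control)
```

    An LPR below 1 indicates fewer reported diarrhoea days in the intervention arm; the QMRA model in the paper predicts how far below 1 the LPR can plausibly fall under different compliance assumptions.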

  17. Evaluation and Quantitative Prediction of Renal Transporter-Mediated Drug-Drug Interactions.

    PubMed

    Feng, Bo; Varma, Manthena V

    2016-07-01

    With numerous drugs cleared renally, inhibition of uptake transporters localized on the basolateral membrane of renal proximal tubule cells, eg, organic anion transporters (OATs) and organic cation transporters (OCTs), may lead to clinically meaningful drug-drug interactions (DDIs). Additionally, clinical evidence for the possible involvement of efflux transporters, such as P-glycoprotein (P-gp) and multidrug and toxin extrusion protein 1/2-K (MATE1/2-K), in the renal DDIs is emerging. Herein, we review recent progress regarding mechanistic understanding of transporter-mediated renal DDIs as well as the quantitative predictability of renal DDIs using static and physiologically based pharmacokinetic (PBPK) models. Generally, clinical DDI data suggest that the magnitude of plasma exposure changes attributable to renal DDIs is less than 2-fold, unlike the DDIs associated with inhibition of cytochrome P-450s and/or hepatic uptake transporters. It is concluded that although there is a need for risk assessment early in drug development, current available data imply that safety concerns related to the renal DDIs are generally low. Nevertheless, consideration must be given to the therapeutic index of the victim drug and potential risk in a specific patient population (eg, renal impairment). Finally, in vitro transporter data and clinical pharmacokinetic parameters obtained from the first-in-human studies have proven useful in support of quantitative prediction of DDIs associated with inhibition of renal secretory transporters, OATs or OCTs. PMID:27385169

  18. Quantitative morphological evaluation of laser ablation on calculus using full-field optical coherence microscopy

    NASA Astrophysics Data System (ADS)

    Xiao, Q.; Lü, T.; Li, Z.; Fu, L.

    2011-10-01

    The quantitative morphological evaluation at high resolution is of significance for the study of laser-tissue interaction. In this paper, a full-field optical coherence microscopy (OCM) system with high resolution of ~2 μm was developed to investigate the ablation on urinary calculus by a free-running Er:YAG laser. We studied the morphological variation quantitatively corresponding to change of energy setting of the Er:YAG laser. The experimental results show that the full-field OCM enables quantitative evaluation of the morphological shape of craters and material removal, and particularly the fine structure. We also built a heat conduction model to simulate the process of laser-calculus interaction by using finite element method. Through the simulation, the removal region of the calculus was calculated according to the temperature distribution. As a result, the depth, width, volume, and the cross-sectional profile of the crater in calculus measured by full-field OCM matched well with the theoretical results based on the heat conduction model. Both experimental and theoretical results confirm that the thermal interaction is the dominant effect in the ablation of calculus by Er:YAG laser, demonstrating the effectiveness of full-field OCM in studying laser-tissue interactions.
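
    The authors' finite-element model is not reproduced in the abstract. As a rough, hypothetical sketch of the same idea (laser energy deposited at the surface, conducted inward, with removal wherever temperature exceeds an ablation threshold), using an explicit 1-D finite-difference scheme and invented material constants:

```python
import numpy as np

# 1-D explicit finite-difference sketch of laser surface heating. The paper
# used a finite-element model; every constant below is illustrative, not
# taken from the calculus experiments.
L, N = 2e-3, 101                 # 2 mm deep domain, 101 nodes
dx = L / (N - 1)
alpha = 1.3e-7                   # thermal diffusivity (m^2/s), assumed
dt = 0.4 * dx**2 / alpha         # stable explicit step (r = 0.4 < 0.5)
T = np.full(N, 37.0)             # initial temperature (deg C)
flux_K_per_s = 5e3               # heating rate at the surface node, assumed

for _ in range(200):             # integrate over a short pulse
    lap = np.empty(N)
    lap[0] = 2.0 * (T[1] - T[0]) / dx**2   # mirror (insulated) boundary
    lap[-1] = 0.0                          # far boundary left passive
    lap[1:-1] = (T[2:] - 2.0 * T[1:-1] + T[:-2]) / dx**2
    T += dt * alpha * lap
    T[0] += dt * flux_K_per_s              # laser energy at the surface

ablation_threshold = 100.0       # removal criterion (deg C), assumed
removed_depth = dx * int((T > ablation_threshold).sum())
```

    Comparing a measured crater profile against the removal region predicted by such a simulation is what supports the paper's conclusion that thermal interaction dominates the ablation.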

  19. Evaluating 'good governance': The development of a quantitative tool in the Greater Serengeti Ecosystem.

    PubMed

    Kisingo, Alex; Rollins, Rick; Murray, Grant; Dearden, Phil; Clarke, Marlea

    2016-10-01

    Protected areas (PAs) can provide important benefits to conservation and to communities. A key factor in the effective delivery of these benefits is the role of governance. There has been a growth in research developing frameworks to evaluate 'good' PA governance, usually drawing on a set of principles that are associated with groups of indicators. In contrast to dominant qualitative approaches, this paper describes the development of a quantitative method for measuring effectiveness of protected area governance, as perceived by stakeholders in the Greater Serengeti Ecosystem in Tanzania. The research developed a quantitative method for developing effectiveness measures of PA governance, using a set of 65 statements related to governance principles developed from a literature review. The instrument was administered to 389 individuals from communities located near PAs in the Greater Serengeti Ecosystem. The results of a factor analysis suggest that statements load onto 10 factors that demonstrate high psychometric validity as measured by factor loadings, explained variance, and Cronbach's alpha reliability. The ten common factors that were extracted were: 1) legitimacy, 2) transparency and accountability, 3) responsiveness, 4) fairness, 5) participation, 6) ecosystem based management (EBM) and connectivity, 7) resilience, 8) achievements, 9) consensus orientation, and 10) power. The paper concludes that quantitative surveys can be used to evaluate governance of protected areas from a community-level perspective. PMID:27566933
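
    The reliability statistic cited, Cronbach's alpha, can be computed directly from an item-score matrix. A sketch with hypothetical Likert responses (the 65-statement instrument itself is not reproduced here):

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()   # sum of per-item variances
    total_var = items.sum(axis=1).var(ddof=1)    # variance of total scores
    return k / (k - 1) * (1.0 - item_var / total_var)

# Hypothetical 5-point Likert responses for one governance factor
scores = np.array([
    [4, 5, 4, 4],
    [2, 2, 3, 2],
    [5, 4, 5, 5],
    [3, 3, 3, 4],
    [1, 2, 2, 1],
])
alpha = cronbach_alpha(scores)
```

    Values near 1 indicate that the statements loading on a factor measure a single underlying construct; alphas of this kind are what support the ten-factor structure the paper reports.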

  1. Quantitative morphologic evaluation of magnetic resonance imaging during and after treatment of childhood leukemia

    PubMed Central

    Reddick, Wilburn E.; Laningham, Fred H.; Glass, John O.; Pui, Ching-Hon

    2008-01-01

    Introduction: Medical advances over the last several decades, including CNS prophylaxis, have greatly increased survival in children with leukemia. As survival rates have increased, clinicians and scientists have been afforded the opportunity to further develop treatments to improve the quality of life of survivors by minimizing the long-term adverse effects. When evaluating the effect of antileukemia therapy on the developing brain, magnetic resonance (MR) imaging has been the preferred modality because it quantifies morphologic changes objectively and noninvasively. Method and results: Computer-aided detection of changes on neuroimages enables us to objectively differentiate leukoencephalopathy from normal maturation of the developing brain. Quantitative tissue segmentation algorithms and relaxometry measures have been used to determine the prevalence, extent, and intensity of white matter changes that occur during therapy. More recently, diffusion tensor imaging has been used to quantify microstructural changes in the integrity of the white matter fiber tracts. MR perfusion imaging can be used to noninvasively monitor vascular changes during therapy. Changes in quantitative MR measures have been associated, to some degree, with changes in neurocognitive function during and after treatment. Conclusion: In this review, we present recent advances in quantitative evaluation of MR imaging and discuss how these methods hold the promise to further elucidate the pathophysiologic effects of treatment for childhood leukemia. PMID:17653705

  2. Quantitative evaluation of the cutting quality and abrasive resistance of scalers.

    PubMed

    Kaya, H; Fujimura, T; Kimura, S

    1995-01-01

    An automatic scaling apparatus that simulated the scaling process of hand instrumentation was developed to quantitatively analyze the cutting quality and abrasive resistance of scalers. We first tested 4 synthetic resins as the abraded material. Of the 4 synthetic resins tested, polycarbonate resin proved most similar to dentin. The effects of lateral scaling forces (700, 500, and 300 dyne) and scaler angles (70 degrees to 95 degrees) on the cutting quality and abrasive resistance of scalers were evaluated quantitatively by the amount of the abraded material worn away in 1,000 strokes. Comparison of the 3 scaling forces showed a greater amount of abrasion at higher force than that at lower force. This suggests that the decrease in the amount due to abrasion could be compensated by increasing the lateral scaling force. Regarding the scaler angle, results indicated that the amount of material removed increased with an increase of the scaler angle up to 70 degrees, but then rapidly decreased at an angle of 90 degrees or more. The most effective scaling angle was 87 degrees, and this was not affected by scaling force. These results suggest that a greater amount of removal could be obtained at a scaling angle of 87 degrees and a scaling force of 700 dyne. The present findings suggested the automatic scaling apparatus could be a useful tool for quantitatively evaluating the cutting quality and abrasive resistance of scalers.

  3. Risk assessment for transboundary rivers using fuzzy synthetic evaluation technique

    NASA Astrophysics Data System (ADS)

    Rai, Subash P.; Sharma, Nayan; Lohani, A. K.

    2014-11-01

    Large-scale urbanization has resulted in greater withdrawals of shared waters, and these withdrawals have depended largely on the hegemony of the riparians. The last few decades have seen the upward surge of many countries in terms of development as well as hegemony. Existing structures of established water-sharing frameworks typically evaluate only parameters related to historic water use, such as historic water demand and supply, contribution to flow, and hydrology. Water conflict and cooperation are affected by various issues related to development and hegemony. Characterization and quantification of development and hegemony parameters is a very complex process. This paper establishes a novel approach to predict river basins at risk; the approach addresses the issue of water conflict and cooperation within a methodologically more rigorous predictive framework. Fuzzy synthetic evaluation is used in this paper to undertake the risk assessment of international transboundary rivers. The fuzzy domain of risk consists of two fuzzy sets, hegemony and development, indices of which are developed with the help of fuzzy synthetic evaluation techniques. A compositional rule base is then framed to ascertain the fuzzy risk. This fuzzy risk can be used to prioritize international river basins, helping to identify potentially high-risk basins. Risk identification of international river basins is not only scientifically valuable but also practically useful: identifying basins that are likely to be particularly prone to conflict or cooperation is of high interest to policy makers.
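
    The abstract does not give the membership functions or rule base. As a hypothetical sketch of how fuzzy synthetic evaluation can turn hegemony and development indices into a basin risk score (all set boundaries and rule outputs below are invented):

```python
def tri(x, a, b, c):
    """Triangular membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def fuzzify(index):
    """Memberships of a normalised [0, 1] index in low/medium/high sets."""
    return {
        "low":    tri(index, -0.5, 0.0, 0.5),
        "medium": tri(index,  0.0, 0.5, 1.0),
        "high":   tri(index,  0.5, 1.0, 1.5),
    }

# Rule base: risk implied by each (hegemony, development) combination.
# These outputs are illustrative, not the paper's rule base.
RULES = {
    ("low", "low"): 0.2,    ("low", "medium"): 0.3,    ("low", "high"): 0.5,
    ("medium", "low"): 0.3, ("medium", "medium"): 0.5, ("medium", "high"): 0.7,
    ("high", "low"): 0.5,   ("high", "medium"): 0.7,   ("high", "high"): 0.9,
}

def fuzzy_risk(hegemony, development):
    """Weighted-average defuzzification over all fired rules (min t-norm)."""
    mh, md = fuzzify(hegemony), fuzzify(development)
    num = den = 0.0
    for (h, d), risk in RULES.items():
        w = min(mh[h], md[d])   # rule firing strength
        num += w * risk
        den += w
    return num / den if den else 0.0

basin_risk = fuzzy_risk(hegemony=0.8, development=0.6)
```

    Scoring every international basin this way yields the ranking of potentially high-risk basins the paper aims at; the fuzzy machinery absorbs the imprecision in characterizing hegemony and development.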

  4. An overview of BWR Mark-1 containment venting risk implications: An evaluation of potential Mark-1 containment improvements

    SciTech Connect

    Wagner, K.C.; Dallman, R.J.; Galyean, W.J.

    1989-06-01

    This report supplements containment venting risk evaluations performed for the Mark-I Containment Performance Improvement (CPI) Program. Quantitative evaluations using simplified containment event trees for station blackout sequences were performed to evaluate the potential risk reduction offered by containment venting, an improved automatic depressurization system with a dedicated power source, and an additional supply of water to either the containment sprays or the vessel with a dedicated power source. The risk calculations were based on the Draft NUREG-1150 results for Peach Bottom with selected enhancements. Several sensitivity studies were performed to investigate phenomenological, operational, and equipment performance uncertainties. Qualitative risk evaluations were provided for loss of long-term containment heat removal and anticipated transients without scram for the same set of improvements. A limited discussion is provided on the generic applicability of these results to other plants with Mark-I containments. 23 refs., 15 figs., 13 tabs.

  5. Risk-based evaluation of total petroleum hydrocarbons in vapor intrusion studies.

    PubMed

    Brewer, Roger; Nagashima, Josh; Kelley, Michael; Heskett, Marvin; Rigby, Mark

    2013-06-13

    This paper presents a quantitative method for the risk-based evaluation of Total Petroleum Hydrocarbons (TPH) in vapor intrusion investigations. Vapors from petroleum fuels are characterized by a complex mixture of aliphatic and, to a lesser extent, aromatic compounds. These compounds can be measured and described in terms of TPH carbon ranges. Toxicity factors published by USEPA and other parties allow the development of risk-based air and soil vapor screening levels for each carbon range, in the same manner as for individual compounds such as benzene. The relative carbon-range makeup of petroleum vapors can then be used to develop weighted, site-specific or generic screening levels for TPH. At some critical ratio of TPH to a targeted individual compound, the overwhelming proportion of TPH will drive vapor intrusion risk over the individual compound. This is particularly true for vapors associated with diesel and other middle-distillate fuels, but can also be the case for low-benzene gasolines, or even for high-benzene gasolines if an adequately conservative target risk is not applied to individually targeted chemicals. This necessitates a re-evaluation of the reliance on benzene and other individual compounds as a stand-alone tool for evaluating the vapor intrusion risk associated with petroleum.
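
One common way to weight per-carbon-range screening levels into a single TPH level is a fraction-weighted harmonic mean, so that the carbon-range risks sum to the target risk at the mixture level. The fractions and per-range values below are hypothetical examples, not regulatory numbers.

```python
# Weighted TPH screening level from carbon-range composition.
# Fractions and per-range screening levels (ug/m3) are hypothetical examples.

def weighted_screening_level(fractions, range_levels):
    """Harmonic-mean-style mixture level: the TPH concentration C at which
    sum_i (f_i * C) / SL_i == 1, i.e. the summed carbon-range risks hit the
    target risk."""
    return 1.0 / sum(f / sl for f, sl in zip(fractions, range_levels))

fractions = [0.45, 0.35, 0.20]         # vapor mass fractions by carbon range
range_levels = [600.0, 200.0, 100.0]   # per-range air screening levels
tph_level = weighted_screening_level(fractions, range_levels)
```

A vapor dominated by the more toxic ranges pulls the weighted level down, which is how TPH can control risk over benzene alone.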

  7. Quantitative Evaluation of Performance in Interventional Neuroradiology: An Integrated Curriculum Featuring Theoretical and Practical Challenges

    PubMed Central

    Ernst, Marielle; Kriston, Levente; Romero, Javier M.; Frölich, Andreas M.; Jansen, Olav; Fiehler, Jens; Buhk, Jan-Hendrik

    2016-01-01

    Purpose We sought to develop a standardized curriculum capable of assessing key competencies in Interventional Neuroradiology by the use of models and simulators in an objective, quantitative, and efficient way. In this evaluation we analyzed the associations between the practical experience, theoretical knowledge, and skills lab performance of interventionalists. Materials and Methods We evaluated the endovascular skills of 26 participants of the Advanced Course in Endovascular Interventional Neuroradiology of the European Society of Neuroradiology with a set of three tasks (aneurysm coiling and thrombectomy in a virtual simulator and placement of an intra-aneurysmal flow disruptor in a flow model). Practical experience was assessed by a survey. Participants completed a written and oral examination to evaluate theoretical knowledge. Bivariate and multivariate analyses were performed. Results In multivariate analysis, knowledge of materials and techniques in Interventional Neuroradiology was moderately associated with skills in aneurysm coiling and thrombectomy. Experience in mechanical thrombectomy was moderately associated with thrombectomy skills, while age was negatively associated with thrombectomy skills. We found no significant association between age, sex, or work experience and skills in aneurysm coiling. Conclusion Our study gives an example of how an integrated curriculum for the reasonable and cost-effective assessment of the key competences of an interventional neuroradiologist might look. In addition to traditional assessment of theoretical knowledge, practical skills are measured using endovascular simulators, yielding objective, quantitative, and constructive data for evaluating both the current performance status of participants and the evolution of their technical competency over time. PMID:26848840

  8. Importance of Purity Evaluation and the Potential of Quantitative 1H NMR as a Purity Assay

    PubMed Central

    2015-01-01

    In any biomedical and chemical context, a truthful description of chemical constitution requires coverage of both structure and purity. This qualification affects all drug molecules, regardless of development stage (early discovery to approved drug) and source (natural product or synthetic). Purity assessment is particularly critical in discovery programs and whenever chemistry is linked with biological and/or therapeutic outcome. Compared with chromatography and elemental analysis, quantitative NMR (qNMR) uses nearly universal detection and provides a versatile and orthogonal means of purity evaluation. Absolute qNMR with flexible calibration captures analytes that frequently escape detection (water, sorbents). Widely accepted structural NMR workflows require minimal or no adjustments to become practical 1H qNMR (qHNMR) procedures with simultaneous qualitative and (absolute) quantitative capability. This study reviews underlying concepts, provides a framework for standard qHNMR purity assays, and shows how adequate accuracy and precision are achieved for the intended use of the material. PMID:25295852
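
The absolute qHNMR purity calculation with an internal calibrant follows a standard relationship between integrals, proton counts, molar masses, and weighed masses. The numeric values below are a made-up worked example, not measured data.

```python
# Absolute qHNMR purity via an internal calibrant (standard relationship;
# the numbers are an invented worked example, not measured data).

def qhnmr_purity(I_a, I_cal, N_a, N_cal, M_a, M_cal, m_a, m_cal, P_cal):
    """Purity of analyte 'a' from signal integrals (I), protons per signal (N),
    molar masses (M, g/mol), weighed masses (m, mg), and calibrant purity:
    P_a = (I_a/I_cal) * (N_cal/N_a) * (M_a/M_cal) * (m_cal/m_a) * P_cal."""
    return (I_a / I_cal) * (N_cal / N_a) * (M_a / M_cal) * (m_cal / m_a) * P_cal

# Hypothetical analyte (M = 270.28) against a 2-proton calibrant (M = 116.07)
purity = qhnmr_purity(I_a=1.02, I_cal=1.00, N_a=1, N_cal=2,
                      M_a=270.28, M_cal=116.07, m_a=25.0, m_cal=5.0, P_cal=0.999)
```

Because the calibrant is certified, the result is an absolute purity; residual water and sorbents show up as the gap below 100%.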

  9. Quantitative evaluation of translational medicine based on scientometric analysis and information extraction.

    PubMed

    Zhang, Yin; Diao, Tianxi; Wang, Lei

    2014-12-01

    Designed to advance the two-way translational process between basic research and clinical practice, translational medicine has become one of the most important areas in biomedicine. The quantitative evaluation of translational medicine is valuable for decision making in global translational medical research and funding. Using scientometric analysis and information extraction techniques, this study quantitatively analyzed the scientific articles on translational medicine. The results showed that translational medicine had significant scientific output and impact, specific core fields and institutes, and outstanding academic status and benefit. Although not considered in this study, patent data are another important indicator that should be integrated into future research in this area. PMID:25079489

  10. Infrared radiometric technique for rapid quantitative evaluation of heat flux distribution over large areas

    NASA Astrophysics Data System (ADS)

    Glazer, Stuart; Siebes, Georg

    1989-03-01

    This paper describes a novel approach for rapid, quantitative measurement of spatially distributed heat flux incident on a plane. The technique utilizes the spatial temperature distribution on an opaque thin film at the location of interest, as measured by an imaging infrared radiometer. Knowledge of the film's radiative properties, plus quantitative estimates of convective cooling, permits the steady-state energy balance at any location on the film to be solved for the incident heat flux. Absolute accuracies on the order of 10-15 percent have been obtained in tests performed in air. The method is particularly useful for evaluating the spatial uniformity of heat flux from distributed heat sources over large areas. It has recently been used in several applications at the Jet Propulsion Laboratory, including flux uniformity measurements of large distributed quartz lamp arrays used during thermal vacuum testing of several spacecraft components, and flux mapping of a low-power Nd:YAG laser beam.
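
The steady-state energy balance the abstract describes can be inverted per pixel: absorbed incident flux equals radiative plus convective losses. The film properties and convection coefficient below are illustrative assumptions, not the paper's calibration.

```python
# Steady-state energy balance on a thin opaque film: absorbed incident flux
# balances radiative and convective losses. Property values are illustrative.

SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W/(m2 K4)

def incident_flux(T_film, T_amb, absorptance, emittance, h_conv):
    """Solve alpha*q = eps*sigma*(T^4 - T_amb^4) + h*(T - T_amb) for q (W/m2)."""
    radiative = emittance * SIGMA * (T_film**4 - T_amb**4)
    convective = h_conv * (T_film - T_amb)
    return (radiative + convective) / absorptance

# One radiometer pixel reading 380 K on a black-painted film in a 295 K room
q = incident_flux(T_film=380.0, T_amb=295.0,
                  absorptance=0.95, emittance=0.95, h_conv=8.0)
```

Applying this at every pixel of the infrared image yields the spatial flux map; uncertainty in h_conv is one driver of the quoted 10-15 percent accuracy.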

  11. Evaluation of absolute quantitation by nonlinear regression in probe-based real-time PCR

    PubMed Central

    Goll, Rasmus; Olsen, Trine; Cui, Guanglin; Florholmen, Jon

    2006-01-01

    Background In real-time PCR data analysis, the cycle threshold (CT) method is currently the gold standard. This method is based on an assumption of equal PCR efficiency in all reactions, and precision may suffer if this condition is not met. Nonlinear regression analysis (NLR), or curve fitting, has therefore been suggested as an alternative to the cycle threshold method for absolute quantitation. The advantages of NLR are that the individual sample efficiency is simulated by the model and that absolute quantitation is possible without a standard curve, releasing reaction wells for unknown samples. However, the calculation method has not been evaluated systematically and has not previously been applied to a TaqMan platform. Aim To develop and evaluate an automated NLR algorithm capable of batch production regression analysis. Results Total RNA samples extracted from human gastric mucosa were reverse transcribed and analysed for TNFA, IL18 and ACTB by TaqMan real-time PCR. Fluorescence data were analysed by the regular CT method with a standard curve, and by NLR with a positive control for conversion of fluorescence intensity to copy number; for this purpose an automated algorithm was written in SPSS syntax. Eleven separate regression models were tested, and the output data were subjected to Altman-Bland analysis. The Altman-Bland analysis showed that the best regression model yielded quantitative data with an intra-assay variation of 58% vs. 24% for the CT-derived copy numbers, and with a mean inter-method deviation of ×0.8. Conclusion NLR can be automated for batch production analysis, but the CT method is more precise for absolute quantitation in the present setting. The observed inter-method deviation indicates that assessment of the fluorescence conversion factor used in the regression method can be improved. However, the versatility depends on the level of precision required, and in some settings the increased cost-effectiveness of NLR may outweigh its reduced precision.
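
The reference CT method the study compares against fits a line of Ct versus log10 copy number on a standard dilution series and inverts it for unknowns. The dilution-series data below are synthetic, constructed for an ideal ~100% efficiency assay.

```python
# CT / standard-curve absolute quantitation: fit Ct = slope*log10(N0) + b
# on standards, invert for unknowns. The data points are synthetic.

def fit_standard_curve(log10_copies, cts):
    """Least-squares line Ct = slope * log10(N0) + intercept."""
    n = len(cts)
    mx = sum(log10_copies) / n
    my = sum(cts) / n
    sxx = sum((x - mx) ** 2 for x in log10_copies)
    sxy = sum((x - mx) * (y - my) for x, y in zip(log10_copies, cts))
    slope = sxy / sxx
    return slope, my - slope * mx

def copies_from_ct(ct, slope, intercept):
    """Invert the standard curve for an unknown well."""
    return 10 ** ((ct - intercept) / slope)

# Synthetic dilution series: ~3.32 cycles per 10-fold dilution (100% efficiency)
stds = [(8, 12.0), (6, 18.64), (4, 25.29), (2, 31.93)]  # (log10 copies, Ct)
slope, intercept = fit_standard_curve([x for x, _ in stds], [y for _, y in stds])
n0 = copies_from_ct(22.0, slope, intercept)
```

NLR replaces this shared curve with a per-well amplification model, which is what lets it drop the standard wells at the cost, in this study, of precision.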

  12. Risk evaluation and management to reaching a suggested FSO in a steam meal.

    PubMed

    Mejia, Z Sosa; Beumer, R R; Zwietering, M H

    2011-06-01

    Steam meals are ready-to-eat meals composed of raw and semi-cooked ingredients that are cooked during microwave heating. In this study, an Indian-style meal, Chicken Tandoori, was selected from two different producers. These meals were first evaluated with the Risk Ranger® to identify the main foodborne pathogen risks, which were Listeria monocytogenes, Salmonella Typhimurium and Bacillus cereus. Thereafter, quantitative microbiology was applied using different models and verified with growth and inactivation challenge tests. The gamma model and the ComBase program® gave very similar results, although in some cases the challenge tests differed from both. The information gathered was used to create different scenarios that indicate how to manage the risks by setting Performance Objectives during the different stages of the food chain of this product, thereby reaching a suggested Food Safety Objective.
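
The gamma model mentioned above multiplies an optimum growth rate by independent reduction factors for each environmental condition. The cardinal parameter values below are generic L. monocytogenes-like placeholders, not the study's fitted parameters.

```python
# Gamma-concept growth model: mu = mu_opt * gamma(T) * gamma(pH) * gamma(aw).
# Cardinal values are generic placeholders, not the study's parameters.

def gamma_T(T, Tmin=-2.0, Topt=37.0, Tmax=45.0):
    """Temperature reduction factor (simple squared form)."""
    if not (Tmin < T < Tmax):
        return 0.0
    return ((T - Tmin) / (Topt - Tmin)) ** 2

def gamma_pH(pH, pHmin=4.4, pHopt=7.0):
    """pH reduction factor (linear below optimum)."""
    return max(0.0, (pH - pHmin) / (pHopt - pHmin))

def gamma_aw(aw, aw_min=0.92):
    """Water activity reduction factor."""
    return max(0.0, (aw - aw_min) / (1.0 - aw_min))

def growth_rate(mu_opt, T, pH, aw):
    """Specific growth rate (per hour) as the product of gamma factors."""
    return mu_opt * gamma_T(T) * gamma_pH(pH) * gamma_aw(aw)

# Chilled storage of the meal: 7 C, pH 6.3, aw 0.985 (values from the abstract's
# reported ranges; mu_opt is assumed)
mu = growth_rate(mu_opt=1.2, T=7.0, pH=6.3, aw=0.985)
```

The multiplicative structure is what makes the gamma model easy to compare against database tools such as ComBase.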

  13. A two-stage method of quantitative flood risk analysis for reservoir real-time operation using ensemble-based hydrologic forecasts

    NASA Astrophysics Data System (ADS)

    Liu, P.

    2013-12-01

    Quantitative analysis of the risk for reservoir real-time operation is a hard task owing to the difficulty of accurately describing inflow uncertainties. Ensemble-based hydrologic forecasts depict the inflows directly, capturing not only the marginal distributions but also their persistence via scenarios. This motivates us to analyze the reservoir real-time operating risk with ensemble-based hydrologic forecasts as inputs. A method is developed that uses the forecast horizon point to divide the future time into two stages, the forecast lead time and the unpredicted time. The risk within the forecast lead time is computed by counting the number of failed forecast scenarios, and the risk in the unpredicted time is estimated using reservoir routing with the design floods and the reservoir water levels at the forecast horizon point. As a result, a two-stage risk analysis method is set up to quantify the entire flood risk, defined as the ratio of the number of scenarios that exceed the critical value to the total number of scenarios. China's Three Gorges Reservoir (TGR) is selected as a case study, where parameter and precipitation uncertainties are implemented to produce ensemble-based hydrologic forecasts. Bayesian inference, via Markov chain Monte Carlo, is used to account for the parameter uncertainty. Two reservoir operation schemes, the real operated and scenario optimization, are evaluated for flood risk and hydropower profit analysis. With the 2010 flood, it is found that improving hydrologic forecast accuracy does not necessarily decrease the reservoir real-time operation risk, and most of the risk comes from the forecast lead time. It is therefore valuable to decrease the variance of ensemble-based hydrologic forecasts while keeping their bias low for reservoir operational purposes.
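
The lead-time stage of the method reduces to a scenario-counting exceedance ratio. The toy ensemble below is synthetic, not TGR data.

```python
# Two-stage flood risk, stage one: within the forecast lead time, risk is the
# fraction of ensemble scenarios whose peak level exceeds the critical value.
# The trajectories are synthetic, not TGR data.

def lead_time_risk(scenarios, critical_level):
    """Fraction of ensemble members whose maximum water level exceeds critical."""
    failures = sum(1 for traj in scenarios if max(traj) > critical_level)
    return failures / len(scenarios)

# Eight synthetic reservoir-level trajectories (m) over the forecast lead time
ensemble = [
    [170.1, 171.0, 172.5], [170.3, 173.2, 175.6], [170.0, 170.8, 171.2],
    [170.2, 172.9, 176.1], [170.1, 171.5, 173.0], [170.4, 174.8, 175.2],
    [170.0, 171.1, 172.0], [170.2, 172.2, 174.4],
]
risk = lead_time_risk(ensemble, critical_level=175.0)
```

Stage two would then route the design floods from each trajectory's end level to cover the unpredicted time.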

  14. Quantitative Evaluation System of Soft Neurological Signs for Children with Attention Deficit Hyperactivity Disorder

    PubMed Central

    Kaneko, Miki; Yamashita, Yushiro; Iramina, Keiji

    2016-01-01

    Attention deficit hyperactivity disorder (ADHD) is a neurodevelopmental disorder characterized by symptoms of inattention, hyperactivity, and impulsivity. Soft neurological signs (SNS) are minor neurological abnormalities in motor performance, and are used as one evaluation method for neurodevelopmental delays in children with ADHD. Our aim is to establish a quantitative evaluation system for children with ADHD. We focused on the arm movement called pronation and supination, which is one such soft neurological sign. Thirty-three children with ADHD aged 7–11 years (27 males, six females) and twenty-five adult participants aged 21–29 years (19 males, six females) participated in our experiments. Our results suggest that the pronation and supination function in children with ADHD tends to lag behind that of typically developing children by several years. From these results, our system shows potential for objectively evaluating the neurodevelopmental delay of children with ADHD. PMID:26797613

  16. Genetic algorithm based image binarization approach and its quantitative evaluation via pooling

    NASA Astrophysics Data System (ADS)

    Hu, Huijun; Liu, Ya; Liu, Maofu

    2015-12-01

    The binarized image is critical to visual feature extraction, especially shape features, and image binarization approaches have attracted increasing attention in recent decades. In this paper, a genetic algorithm is applied to optimizing the binarization threshold of strip steel defect images. To evaluate our genetic algorithm-based image binarization approach quantitatively, we propose a novel pooling-based evaluation metric, motivated by the information retrieval community, to compensate for the lack of ground-truth binary images. Experimental results show that our genetic algorithm-based binarization approach is effective and efficient on strip steel defect images, and that our pooling-based quantitative evaluation metric is feasible and practical.
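
A genetic search over binarization thresholds can be sketched compactly. The fitness here is Otsu's between-class variance, a standard threshold criterion chosen for illustration (the paper's actual fitness function is not specified in the abstract), and the tiny histogram is synthetic.

```python
# Toy genetic algorithm searching a binarization threshold that maximizes
# Otsu's between-class variance (illustrative fitness; the histogram is
# synthetic, not a strip steel defect image).

import random

def between_class_variance(hist, t):
    """Otsu criterion for splitting the histogram at threshold t."""
    total = sum(hist)
    w0 = sum(hist[:t])
    w1 = total - w0
    if w0 == 0 or w1 == 0:
        return 0.0
    mu0 = sum(i * h for i, h in enumerate(hist[:t])) / w0
    mu1 = sum(i * h for i, h in enumerate(hist[t:], start=t)) / w1
    return (w0 / total) * (w1 / total) * (mu0 - mu1) ** 2

def ga_threshold(hist, pop_size=20, generations=30, seed=1):
    rng = random.Random(seed)
    levels = len(hist)
    pop = [rng.randrange(1, levels) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda t: between_class_variance(hist, t), reverse=True)
        elite = pop[: pop_size // 2]                 # elitist selection
        children = []
        for _ in range(pop_size - len(elite)):
            a, b = rng.sample(elite, 2)
            child = (a + b) // 2                     # crossover: midpoint
            if rng.random() < 0.3:                   # mutation: small jitter
                child = min(levels - 1, max(1, child + rng.randrange(-3, 4)))
            children.append(child)
        pop = elite + children
    return max(pop, key=lambda t: between_class_variance(hist, t))

# Bimodal 16-level histogram: dark defect pixels near level 3, background ~12
hist = [1, 4, 9, 12, 7, 2, 1, 0, 0, 1, 3, 9, 14, 10, 4, 1]
t = ga_threshold(hist)
```

For a real image, the histogram would come from the defect image and the GA would search the full 256-level range.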

  18. Quantitative analysis of topoisomerase IIalpha to rapidly evaluate cell proliferation in brain tumors.

    PubMed

    Oda, Masashi; Arakawa, Yoshiki; Kano, Hideyuki; Kawabata, Yasuhiro; Katsuki, Takahisa; Shirahata, Mitsuaki; Ono, Makoto; Yamana, Norikazu; Hashimoto, Nobuo; Takahashi, Jun A

    2005-06-17

    Immunohistochemical cell proliferation analyses have come into wide use for the evaluation of tumor malignancy. Topoisomerase IIalpha (topo IIalpha), an essential nuclear enzyme, is known to have cell-cycle-coupled expression. We here show the usefulness of quantitative analysis of topo IIalpha mRNA for rapidly evaluating cell proliferation in brain tumors. A protocol to quantify topo IIalpha mRNA was developed using real-time RT-PCR, requiring only 3 h from specimen to result. A total of 28 brain tumors were analyzed, and the level of topo IIalpha mRNA was significantly correlated with its immunostaining index (p<0.0001, r=0.9077). Furthermore, the assay sharply detected the decrease of topo IIalpha mRNA in growth-inhibited glioma cells. These results support topo IIalpha mRNA as a good and rapid indicator of cell proliferative potential in brain tumors.

  19. Quantitative risk assessment integrated with process simulator for a new technology of methanol production plant using recycled CO₂.

    PubMed

    Di Domenico, Julia; Vaz, Carlos André; de Souza, Maurício Bezerra

    2014-06-15

    The use of process simulators can contribute to quantitative risk assessment (QRA) by minimizing expert time and the large volume of data required, which is mandatory in the case of a future plant. This work illustrates the advantages of this association by integrating UNISIM DESIGN simulation and QRA to investigate the acceptability of a new technology for a methanol production plant in a region. The simulated process was based on the hydrogenation of chemically sequestered carbon dioxide, demanding stringent operational conditions (high pressures and temperatures) and involving the production of hazardous materials. The estimation of consequences was performed using the PHAST software, version 6.51. QRA results were expressed in terms of individual and societal risks. Compared to existing tolerance levels, the risks were considered tolerable at nominal operating conditions of the plant. The use of the simulator in association with the QRA also allowed testing the risk under new operating conditions in order to delimit safe regions for the plant.
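
Individual risk in a QRA is conventionally the sum, over accident scenarios, of scenario frequency times conditional probability of fatality at the location of interest. The scenario list and tolerance criterion below are illustrative placeholders, not the plant's results.

```python
# Individual risk as sum over scenarios of frequency x conditional fatality
# probability. Scenario values and the criterion are illustrative placeholders.

def individual_risk(scenarios):
    """scenarios: list of (frequency per year, conditional fatality probability)."""
    return sum(freq * p_fat for freq, p_fat in scenarios)

scenarios = [
    (1.0e-4, 0.01),   # small leak, ignited
    (5.0e-6, 0.30),   # large release, flash fire
    (1.0e-7, 0.90),   # vessel rupture / explosion
]
risk = individual_risk(scenarios)
tolerable = risk < 1.0e-5   # example tolerance level, e.g. 1e-5 per year
```

Re-running the process simulation at new operating conditions changes the release terms, hence the frequencies and fatality probabilities feeding this sum.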

  20. Evaluation of the predictability of real-time crash risk models.

    PubMed

    Xu, Chengcheng; Liu, Pan; Wang, Wei

    2016-09-01

    The primary objective of the present study was to investigate the predictability of crash risk models that were developed using high-resolution real-time traffic data. More specifically the present study sought answers to the following questions: (a) how to evaluate the predictability of a real-time crash risk model; and (b) how to improve the predictability of a real-time crash risk model. The predictability is defined as the crash probability given the crash precursor identified by the crash risk model. An equation was derived based on the Bayes' theorem for estimating approximately the predictability of crash risk models. The estimated predictability was then used to quantitatively evaluate the effects of the threshold of crash precursors, the matched and unmatched case-control design, and the control-to-case ratio on the predictability of crash risk models. It was found that: (a) the predictability of a crash risk model can be measured as the product of prior crash probability and the ratio between sensitivity and false alarm rate; (b) there is a trade-off between the predictability and sensitivity of a real-time crash risk model; (c) for a given level of sensitivity, the predictability of the crash risk model that is developed using the unmatched case-controlled sample is always better than that of the model developed using the matched case-controlled sample; and (d) when the control-to-case ratio is beyond 4:1, the increase in control-to-case ratio does not lead to clear improvements in predictability.

  1. Risk assessment and remedial policy evaluation using predictive modeling

    SciTech Connect

    Linkov, L.; Schell, W.R.

    1996-06-01

    As a result of nuclear industry operations and accidents, large areas of natural ecosystems have been contaminated by radionuclides and toxic metals. Extensive societal pressure has been exerted to decrease the radiation dose to the population and to the environment. Thus, in making abatement and remediation policy decisions, not only economic costs but also human and environmental risk assessments are desired. This paper introduces a general framework for risk assessment and remedial policy evaluation using predictive modeling. Ecological risk assessment requires evaluation of the radionuclide distribution in ecosystems. The FORESTPATH model is used for predicting the fate of radionuclides in forest compartments after deposition, as well as for evaluating the efficiency of remedial policies. The time of intervention and the radionuclide deposition profile were predicted to be crucial for remediation efficiency. A risk assessment conducted for a critical group of forest users in Belarus shows that consumption of forest products (berries and mushrooms) leads to about a 0.004% risk of fatal cancer annually. Cost-benefit analysis for forest cleanup suggests that complete removal of the organic layer is too expensive for application in Belarus, and a better methodology is required. In conclusion, the FORESTPATH modeling framework could have wide applications in the environmental remediation of radionuclides and toxic metals, as well as in dose reconstruction and risk assessment.

  2. Quantitative landslide hazard and risk assessment from long-term space-borne InSAR measurements

    NASA Astrophysics Data System (ADS)

    Lu, P.; Catani, F.; Casagli, N.; Li, R.

    2011-12-01

    Preparing reliable landslide hazard and risk maps is an essential part of landslide studies, and many approaches have now been developed for quantitative hazard and risk assessment. However, few of these hazard and risk maps are ever updated after their first generation. In this study, aiming at continuous updating of landslide hazard and risk maps, a novel approach for quantitative landslide hazard and risk assessment was developed, chiefly based on long-term satellite InSAR products - Persistent Scatterer Interferometry (PSI) point targets. The study was performed in the Arno river basin (central Italy), where most mass movements are slow-moving landslides whose velocities fall within the detection precision of PSI point targets. In the Arno river basin, the initial hazard and risk assessment was performed by Catani et al. (2005) using all datasets available before 2001, whereas in this study the previous hazard and risk maps were updated using PSI point targets processed from 4 years (2003-2006) of RADARSAT images. These PSI point targets were used to generate a landslide hotspot map through PSI Hotspot and Clustering Analysis (PSI-HCA). Landslide hazard and risk maps for five temporal predictions of 2, 5, 10, 20 and 30 years were produced based on this hotspot map, with the exposure of losses estimated in euro. In particular, the results indicate that a potential loss of approximately 3.22 billion euro is expected over the coming 30 years due to the slow-moving landslides detected by PSI point targets.

  3. Objective and quantitative evaluation of motor function in a monkey model of Parkinson's disease.

    PubMed

    Saiki, Hidemoto; Hayashi, Takuya; Takahashi, Ryosuke; Takahashi, Jun

    2010-07-15

    Monkeys treated with 1-methyl-4-phenyl-1,2,5,6-tetrahydropyridine (MPTP) are currently the best animal model for Parkinson's disease (PD) and have been widely used for physiological and pharmacological investigations. However, objective and quantitative assessments have not been established for grading their motor behaviors. In order to develop a method for an unbiased evaluation, we performed a video-based assessment, used qualitative rating scales, and carried out an in vivo investigation of dopamine (DA) transporter binding in systemically MPTP-treated monkeys. The video-based analysis of spontaneous movement clearly demonstrated a significant correlation with the qualitative rating score. The assessment of DA transporter (DAT) function by [(11)C]-CFT-PET showed that, when compared with normal animals, the MPTP-treated animals exhibited decreased CFT binding in the bilateral striatum, particularly in the dorsal part in the putamen and caudate. Among the MPTP-treated monkeys, an unbiased PET analysis revealed a significant correlation between CFT binding in the midbrain and qualitative rating scores or the amount of spontaneous movements. These results indicate that a video-based analysis can be a reliable tool for an objective and quantitative evaluation of motor dysfunction of MPTP-treated monkeys, and furthermore, that DAT function in the midbrain may also be important for the evaluation.

  4. Effect of Surface Sampling and Recovery of Viruses and Non-Spore-Forming Bacteria on a Quantitative Microbial Risk Assessment Model for Fomites.

    PubMed

    Weir, Mark H; Shibata, Tomoyuki; Masago, Yoshifumi; Cologgi, Dena L; Rose, Joan B

    2016-06-01

    Quantitative microbial risk assessment (QMRA) is a powerful decision analytics tool, yet it faces challenges when modeling health risks for the indoor environment. One limitation is uncertainty in fomite recovery for evaluating the efficiency of decontamination. Addressing this data gap has become more important for response to, and recovery from, a potential malicious pathogen release. To develop more accurate QMRA models, recovery efficiency from non-porous fomites (aluminum, ceramic, glass, plastic, steel, and wood laminate) was investigated. Fomite material, surface area (10, 100, and 900 cm(2)), recovery tool (swabs and wipes), initial concentration on the fomites, and eluent (polysorbate 80, trypticase soy broth, and beef extract) were evaluated in this research. Recovery was shown to be optimized using polysorbate 80, sampling with wipes, and sampling a surface area of 10-100 cm(2). The QMRA model demonstrated, through a relative risk comparison, the need for recovery efficiency to be used in these models to prevent underestimating risks. PMID:27154208
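
The underestimation the study warns about comes from feeding the raw measured surface concentration into the dose-response step without dividing by recovery efficiency. The exponential dose-response form and all parameter values below are generic placeholders, not the paper's fitted values.

```python
# Correcting a fomite measurement for recovery efficiency before dose-response.
# The exponential model and parameter values are generic placeholders.

import math

def true_surface_density(measured, recovery_efficiency):
    """Back-calculate organisms on the surface from what the swab/wipe recovered."""
    return measured / recovery_efficiency

def exponential_risk(dose, k=0.01):
    """Generic exponential dose-response: P(infection) = 1 - exp(-k * dose)."""
    return 1.0 - math.exp(-k * dose)

measured = 50.0            # organisms recovered from the sampled area
transfer_to_hand = 0.20    # assumed fomite -> hand -> mouth transfer fraction
naive = exponential_risk(measured * transfer_to_hand)
corrected = exponential_risk(
    true_surface_density(measured, recovery_efficiency=0.4) * transfer_to_hand)
```

With a 40% recovery efficiency, the naive estimate misses more than half the true dose, so the corrected risk is substantially higher.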

  5. Conceptual Model of Offshore Wind Environmental Risk Evaluation System

    SciTech Connect

    Anderson, Richard M.; Copping, Andrea E.; Van Cleve, Frances B.; Unwin, Stephen D.; Hamilton, Erin L.

    2010-06-01

    In this report we describe the development of the Environmental Risk Evaluation System (ERES), a risk-informed analytical process for estimating the environmental risks associated with the construction and operation of offshore wind energy generation projects. The development of ERES for offshore wind is closely allied to a concurrent process undertaken to examine environmental effects of marine and hydrokinetic (MHK) energy generation, although specific risk-relevant attributes will differ between the MHK and offshore wind domains. During FY10, a conceptual design of ERES for offshore wind will be developed. The offshore wind ERES mockup described in this report will provide a preview of the functionality of a fully developed risk evaluation system that will use risk assessment techniques to determine priority stressors on aquatic organisms and environments from specific technology aspects, identify key uncertainties underlying high-risk issues, compile a wide-range of data types in an innovative and flexible data organizing scheme, and inform planning and decision processes with a transparent and technically robust decision-support tool. A fully functional version of ERES for offshore wind will be developed in a subsequent phase of the project.

  6. A Quantitative Ecological Risk Assessment of the Toxicological Risks from Exxon Valdez Subsurface Oil Residues to Sea Otters at Northern Knight Island, Prince William Sound, Alaska

    PubMed Central

    Harwell, Mark A.; Gentile, John H.; Johnson, Charles B.; Garshelis, David L.; Parker, Keith R.

    2010-01-01

    A comprehensive, quantitative risk assessment is presented of the toxicological risks from buried Exxon Valdez subsurface oil residues (SSOR) to a subpopulation of sea otters (Enhydra lutris) at Northern Knight Island (NKI) in Prince William Sound, Alaska, as it has been asserted that this subpopulation of sea otters may be experiencing adverse effects from the SSOR. The central questions in this study are: could the risk to NKI sea otters from exposure to polycyclic aromatic hydrocarbons (PAHs) in SSOR, as characterized in 2001–2003, result in individual health effects, and, if so, could that exposure cause subpopulation-level effects? We follow the U.S. Environmental Protection Agency (USEPA) risk paradigm by: (a) identifying potential routes of exposure to PAHs from SSOR; (b) developing a quantitative simulation model of exposures using the best available scientific information; (c) developing scenarios based on calculated probabilities of sea otter exposures to SSOR; (d) simulating exposures for 500,000 modeled sea otters and extracting the 99.9% quantile most highly exposed individuals; and (e) comparing projected exposures to chronic toxicity reference values. Results indicate that, even under conservative assumptions in the model, maximum-exposed sea otters would not receive a dose of PAHs sufficient to cause any health effects; consequently, no plausible toxicological risk exists from SSOR to the sea otter subpopulation at NKI. PMID:20862194
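
The assessment's core loop, simulating many individuals' exposures, extracting the most highly exposed quantile, and comparing to a chronic toxicity reference value, can be sketched as a small Monte Carlo. The dose distributions, sample size, and TRV below are invented placeholders, not the study's inputs.

```python
# Monte Carlo exposure sketch in the spirit of the assessment: simulate many
# otters' doses, take the 99.9% quantile, compare to a chronic toxicity
# reference value (TRV). All distributions and the TRV are invented.

import random

def simulate_doses(n, rng):
    """Annual dose (mg/kg-day) = SSOR encounter rate * dose per encounter."""
    doses = []
    for _ in range(n):
        encounters = rng.gauss(2.0, 1.0)               # encounters per year
        per_encounter = rng.lognormvariate(-4.0, 0.5)  # mg/kg-day per encounter
        doses.append(max(0.0, encounters) * per_encounter)
    return doses

rng = random.Random(42)
doses = sorted(simulate_doses(50_000, rng))
q999 = doses[int(0.999 * len(doses)) - 1]  # 99.9% quantile: max-exposed otters
TRV = 1.0                                   # hypothetical chronic TRV, mg/kg-day
hazard_quotient = q999 / TRV
```

A hazard quotient below 1 for even the maximum-exposed quantile is the kind of result the study reports: no plausible individual-level toxicological risk.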

  7. Quantitative microbiological risk assessment as a tool to obtain useful information for risk managers--specific application to Listeria monocytogenes and ready-to-eat meat products.

    PubMed

    Mataragas, M; Zwietering, M H; Skandamis, P N; Drosinos, E H

    2010-07-31

    The presence of Listeria monocytogenes in a sliced cooked, cured ham-like meat product was quantitatively assessed. Sliced cooked, cured meat products are considered high-risk products. These ready-to-eat (RTE) products (no special preparation, e.g. thermal treatment, is required before eating) support the growth of pathogens (high initial pH=6.2-6.4 and water activity=0.98-0.99) and have a relatively long period of storage at chilled temperatures, with a shelf life of 60 days based on the manufacturer's instructions. Therefore, in the case of post-process contamination, even with a low number of cells, the microorganism is able to reach unacceptable levels at the time of consumption. The aim of this study was to conduct a Quantitative Microbiological Risk Assessment (QMRA) of the risk posed by the presence of L. monocytogenes in RTE meat products. This may help risk managers to make decisions and apply control measures with the ultimate objective of assuring food safety. Examples are given to illustrate the development of practical risk management strategies based on the results obtained from the QMRA model specifically developed for this pathogen/food product combination.

  9. Perception of risks from electromagnetic fields: A psychometric evaluation of a risk-communication approach

    SciTech Connect

    MacGregor, D.G.; Slovic, P. ); Morgan, M.G. )

    1994-10-01

    Potential health risks from exposure to power-frequency electromagnetic fields (EMF) have become an issue of significant public concern. This study evaluates a brochure designed to communicate EMF health risks from a scientific perspective. The study utilized a pretest-posttest design in which respondents judged various sources of EMF (and other) health and safety risks, both before reading the brochure and after. Respondents assessed risks on dimensions similar to those utilized in previous studies of risk perception. In addition, detailed ratings were made that probed respondents' beliefs about the possible causal effects of EMF exposure. The findings suggest that naive beliefs about the potential of EMF exposure to cause harm were highly influenced by specific content elements of the brochure. The implications for using risk-communication approaches based on communicating scientific uncertainty are discussed. 19 refs., 1 fig., 11 tabs.

  10. Quantitative ultrasonic evaluation of engineering properties in metals, composites and ceramics

    NASA Technical Reports Server (NTRS)

    Vary, A.

    1980-01-01

    Ultrasonic technology from the perspective of nondestructive evaluation approaches to material strength prediction and property verification is reviewed. Emergent advanced technology involving quantitative ultrasonic techniques for materials characterization is described. Ultrasonic methods are particularly useful in this area because they involve mechanical elastic waves that are strongly modulated by the same morphological factors that govern mechanical strength and dynamic failure processes. It is emphasized that the technology is in its infancy and that much effort is still required before all the available techniques can be transferred from laboratory to industrial environments.

  11. Quantitative polarization and flow evaluation of choroid and sclera by multifunctional Jones matrix optical coherence tomography

    NASA Astrophysics Data System (ADS)

    Sugiyama, S.; Hong, Y.-J.; Kasaragod, D.; Makita, S.; Miura, M.; Ikuno, Y.; Yasuno, Y.

    2016-03-01

    Quantitative evaluation of the optical properties of the choroid and sclera is performed by multifunctional optical coherence tomography. Five normal eyes, five glaucoma eyes, and one eye with choroidal atrophy are examined. Among normal eyes, refractive error was found to be correlated with choroidal birefringence, polarization uniformity, and flow, in addition to scleral birefringence. Significant differences were observed between the normal and glaucoma eyes in choroidal polarization uniformity, flow, and scleral birefringence. An automatic segmentation algorithm for the retinal pigment epithelium and chorioscleral interface based on multifunctional signals is also presented.

  12. Quantitative risk assessment from farm to fork and beyond: a global Bayesian approach concerning food-borne diseases.

    PubMed

    Albert, Isabelle; Grenier, Emmanuel; Denis, Jean-Baptiste; Rousseau, Judith

    2008-04-01

    A novel approach to the quantitative assessment of food-borne risks is proposed. The basic idea is to use Bayesian techniques in two distinct steps: first by constructing a stochastic core model via a Bayesian network based on expert knowledge, and second, using the data available to improve this knowledge. Unlike the Monte Carlo simulation approach commonly used in quantitative assessment of food-borne risks, where data sets are used independently in each module, our consistent procedure incorporates information conveyed by data throughout the chain. It allows "back-calculation" in the food chain model, together with the use of data obtained "downstream" in the food chain. Moreover, the expert knowledge is introduced more simply and consistently than with classical statistical methods. Other advantages of this approach include the clear framework of an iterative learning process, considerable flexibility enabling the use of heterogeneous data, and a justified method to explore the effects of variability and uncertainty. As an illustration, we present an estimation of the probability of contracting campylobacteriosis as a result of broiler contamination, from the standpoint of quantitative risk assessment. Although the model thus constructed is oversimplified, it clarifies the principles and properties of the method proposed, which demonstrates its ability to deal with quite complex situations and provides a useful basis for further discussions with different experts in the food chain.
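The core idea of refining expert knowledge with data can be illustrated, under strong simplifying assumptions, with a conjugate Beta-Binomial update for a single node of such a network. The prior shape and the survey counts below are hypothetical, chosen only to show the mechanics:

```python
# Conjugate Beta-Binomial update: an expert prior on broiler contamination
# prevalence is refined with (hypothetical) downstream survey data.
def beta_update(alpha, beta, positives, n):
    """Posterior Beta parameters after observing `positives` contaminated
    samples out of `n` (conjugacy: Beta prior + Binomial likelihood)."""
    return alpha + positives, beta + (n - positives)

def beta_mean(alpha, beta):
    """Mean of a Beta(alpha, beta) distribution."""
    return alpha / (alpha + beta)

# Expert belief: prevalence around 20% (Beta(2, 8)); survey: 35/100 positive.
a, b = beta_update(2.0, 8.0, positives=35, n=100)
print(round(beta_mean(a, b), 3))  # → 0.336
```

In the full approach of the paper, this kind of update propagates through a Bayesian network spanning the whole food chain, which is what permits "back-calculation" from downstream data.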

  13. Evaluation of ViroCyt® Virus Counter for rapid filovirus quantitation.

    PubMed

    Rossi, Cynthia A; Kearney, Brian J; Olschner, Scott P; Williams, Priscilla L; Robinson, Camenzind G; Heinrich, Megan L; Zovanyi, Ashley M; Ingram, Michael F; Norwood, David A; Schoepp, Randal J

    2015-03-01

    Development and evaluation of medical countermeasures for diagnostics, vaccines, and therapeutics requires production of standardized, reproducible, and well characterized virus preparations. For filoviruses this includes plaque assay for quantitation of infectious virus, transmission electron microscopy (TEM) for morphology and quantitation of virus particles, and real-time reverse transcription PCR for quantitation of viral RNA (qRT-PCR). The ViroCyt® Virus Counter (VC) 2100 (ViroCyt, Boulder, CO, USA) is a flow-based instrument capable of quantifying virus particles in solution. Using a proprietary combination of fluorescent dyes that stain both nucleic acid and protein in a single 30 min step, rapid, reproducible, and cost-effective quantification of filovirus particles was demonstrated. Using a seed stock of Ebola virus variant Kikwit, the linear range of the instrument was determined to be 2.8E+06 to 1.0E+09 virus particles per mL with coefficient of variation ranging from 9.4% to 31.5% for samples tested in triplicate. VC particle counts for various filovirus stocks were within one log of TEM particle counts. A linear relationship was established between the plaque assay, qRT-PCR, and the VC. VC results significantly correlated with both plaque assay and qRT-PCR. These results demonstrated that the VC is an easy, fast, and consistent method to quantify filoviruses in stock preparations. PMID:25710889

  14. Four-Point Bending as a Method for Quantitatively Evaluating Spinal Arthrodesis in a Rat Model

    PubMed Central

    Robinson, Samuel T; Svet, Mark T; Kanim, Linda A; Metzger, Melodie F

    2015-01-01

    The most common method of evaluating the success (or failure) of rat spinal fusion procedures is manual palpation testing. Whereas manual palpation provides only a subjective binary answer (fused or not fused) regarding the success of a fusion surgery, mechanical testing can provide more quantitative data by assessing variations in strength among treatment groups. We here describe a mechanical testing method to quantitatively assess single-level spinal fusion in a rat model, to improve on the binary and subjective nature of manual palpation as an end point for fusion-related studies. We tested explanted lumbar segments from Sprague–Dawley rat spines after single-level posterolateral fusion procedures at L4–L5. Segments were classified as ‘not fused,’ ‘restricted motion,’ or ‘fused’ by using manual palpation testing. After thorough dissection and potting of the spine, 4-point bending in flexion then was applied to the L4–L5 motion segment, and stiffness was measured as the slope of the moment–displacement curve. Results demonstrated statistically significant differences in stiffness among all groups, which were consistent with preliminary grading according to manual palpation. In addition, the 4-point bending results provided quantitative information regarding the quality of the bony union formed and therefore enabled the comparison of fused specimens. Our results demonstrate that 4-point bending is a simple, reliable, and effective way to describe and compare results among rat spines after fusion surgery. PMID:25730756
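The stiffness measure the authors describe, the slope of the moment-displacement curve, reduces to a least-squares line fit over the loading data. The displacement and moment values below are made up for illustration; units are nominal:

```python
def stiffness(displacements, moments):
    """Slope of the moment-displacement curve by ordinary least squares.
    Units are illustrative (e.g., N*mm of moment per mm of displacement)."""
    n = len(displacements)
    mx = sum(displacements) / n
    my = sum(moments) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(displacements, moments))
    sxx = sum((x - mx) ** 2 for x in displacements)
    return sxy / sxx

disp = [0.0, 0.1, 0.2, 0.3, 0.4]   # mm, hypothetical 4-point bending test
mom = [0.0, 2.1, 3.9, 6.2, 8.0]    # N*mm, roughly linear loading response
print(round(stiffness(disp, mom), 2))
```

A fused segment would show a markedly steeper slope than a non-fused one, which is what turns the binary palpation call into a continuous, comparable quantity.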

  15. Quantitative measurement of porphyrins in biological tissues and evaluation of tissue porphyrins during toxicant exposures.

    PubMed

    Woods, J S; Miller, H D

    1993-10-01

    Porphyrins are formed in most eukaryotic tissues as intermediates in the biosynthesis of heme. Assessment of changes in tissue porphyrin levels occurring in response to the actions of various drugs or toxicants is potentially useful in the evaluation of chemical exposures and effects. The present paper describes a rapid and sensitive method for the extraction and quantitation of porphyrins in biological tissues which overcomes difficulties encountered in previously described methods, particularly the loss of porphyrins during extraction and interference of porphyrin quantitation by coeluting fluorescent tissue constituents. In this procedure 8- through 2-carboxyl porphyrins are quantitatively extracted from tissue homogenates using HCl and methanol and are subsequently separated from potentially interfering contaminants by sequential methanol/phosphate elution on a C-18 preparatory column. Porphyrins are then separated and measured by reversed-phase high-performance liquid chromatography and spectrofluorometric techniques. Recovery of tissue porphyrins using this method is close to 100% with an intraassay variability of less than 10%. We have employed this procedure to measure liver and kidney porphyrin concentrations in male Fischer rats and to define the distinctive changes in tissue porphyrin patterns associated with treatment with the hepatic and renal porphyrinogenic chemicals, allylisopropylacetamide, and methyl mercury hydroxide, respectively. This method is applicable to the measurement of tissue porphyrin changes resulting from drug or toxicant exposures in clinical, experimental or environmental assessments.

  16. Experimental Evaluation of Quantitative Diagnosis Technique for Hepatic Fibrosis Using Ultrasonic Phantom

    NASA Astrophysics Data System (ADS)

    Koriyama, Atsushi; Yasuhara, Wataru; Hachiya, Hiroyuki

    2012-07-01

    Since clinical diagnosis using ultrasonic B-mode images depends on the skill of the doctor, a quantitative diagnosis method using the ultrasound echo signal is highly desirable. We have been investigating a quantitative diagnosis technique, mainly for hepatic disease. In this paper, we present basic experimental results evaluating the accuracy of the proposed quantitative diagnosis technique for hepatic fibrosis using a simple ultrasonic phantom. Using a region of interest that crosses the boundary between two scatterer areas of different densities in the phantom, we can simulate the change of the echo amplitude distribution from normal tissue to fibrotic tissue in liver disease. The probability density function is well approximated by our fibrosis distribution model, a mixture of normal and fibrotic tissue. The fibrosis parameters of the amplitude distribution model can be estimated relatively well at mixture rates from 0.2 to 0.6. In the inversion processing, the standard deviation of the estimated fibrosis results at mixture ratios below 0.2 and above 0.6 is relatively large. Although the probability density is not large at high amplitudes, the estimated variance ratio and mixture rate of the model are strongly affected by higher-amplitude data.

  17. A quantitative microbial risk assessment for meatborne Toxoplasma gondii infection in The Netherlands.

    PubMed

    Opsteegh, Marieke; Prickaerts, Saskia; Frankena, Klaas; Evers, Eric G

    2011-11-01

    Toxoplasma gondii is an important foodborne pathogen, and the cause of a high disease burden due to congenital toxoplasmosis in The Netherlands. The aim of this study was to quantify the relative contribution of sheep, beef and pork products to human T. gondii infections by Quantitative Microbial Risk Assessment (QMRA). Bradyzoite concentration and portion size data were used to estimate the bradyzoite number in infected unprocessed portions for human consumption. The reduction factors for salting, freezing and heating, as estimated from published experiments in mice, were subsequently used to estimate the bradyzoite number in processed portions. A dose-response relation for T. gondii infection in mice was used to estimate the human probability of infection due to consumption of these originally infected processed portions. By multiplying these probabilities by the prevalence of T. gondii per livestock species and the number of portions consumed per year, the number of infections per year was calculated for the susceptible Dutch population and the subpopulation of susceptible pregnant women. QMRA results predict high numbers of infections per year, with beef as the most important source. Although many uncertainties were present in the data, and the number of congenital infections predicted by the model was almost twenty times higher than the number estimated from the incidence in newborns, the usefulness of the advice to thoroughly heat meat is confirmed by our results. Forty percent of all predicted infections are due to the consumption of unheated meat products, and sensitivity analysis indicates that heating temperature has the strongest influence on the predicted number of infections. The results also demonstrate that, even with a low prevalence of infection in cattle, consumption of beef remains an important source of infection. Developing this QMRA model has helped identify important gaps of knowledge and resulted in the following recommendations for
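The dose-response and processing steps described above can be sketched with the exponential (single-hit) model often used in QMRA. The parameter r and the reduction factors below are placeholders, not the values fitted from the mouse experiments:

```python
import math

def p_infection(bradyzoites, r=0.005):
    """Exponential dose-response model: P(inf) = 1 - exp(-r * dose).
    r is an illustrative single-hit parameter, not the fitted mouse value."""
    return 1.0 - math.exp(-r * bradyzoites)

def processed_dose(raw_dose, reduction_factors):
    """Apply multiplicative reduction factors (e.g., salting, freezing,
    heating) to the bradyzoite number in an unprocessed portion."""
    dose = raw_dose
    for f in reduction_factors:
        dose *= f
    return dose

raw = 1000.0                              # bradyzoites in an infected portion
cooked = processed_dose(raw, [0.01])      # hypothetical 100-fold heat reduction
print(round(p_infection(raw), 3), round(p_infection(cooked), 3))
```

The multiplicative structure makes the sensitivity result intuitive: a processing step that removes two orders of magnitude of bradyzoites, such as thorough heating, dominates the predicted infection probability.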

  18. Quantitative Risk Assessment of CO2 Sequestration in a Commercial-Scale EOR Site

    NASA Astrophysics Data System (ADS)

    Pan, F.; McPherson, B. J. O. L.; Dai, Z.; Jia, W.; Lee, S. Y.; Ampomah, W.; Viswanathan, H. S.

    2015-12-01

    Enhanced Oil Recovery with CO2 (CO2-EOR) is perhaps the most feasible option for geologic CO2 sequestration (GCS), if only due to existing infrastructure and economic opportunities of associated oil production. Probably the most significant source of uncertainty of CO2 storage forecasts is heterogeneity of reservoir properties. Quantification of storage forecast uncertainty is critical for accurate assessment of risks associated with GCS in EOR fields. This study employs a response surface methodology (RSM) to quantify uncertainties of CO2 storage associated with oil production in an active CO2-EOR field. Specifically, the Morrow formation, a clastic reservoir within the Farnsworth EOR Unit (FWU) in Texas, was selected as a case study. Four uncertain parameters (i.e., independent variables) are reservoir permeability, anisotropy ratio of permeability, water-alternating-gas (WAG) time ratio, and initial oil saturation. Cumulative oil production and net CO2 injection are the output dependent variables. A 3-D FWU reservoir model, including a representative 5-spot well pattern, was constructed for CO2-oil-water multiphase flow analysis. A total of 25 permutations of 3-D reservoir simulations were executed using the Eclipse simulator. After performing stepwise regression analysis, a series of response surface models of the output variables at each step were constructed and verified using appropriate goodness-of-fit measures. The R2 values are larger than 0.9 and NRMSE values are less than 5% between the simulated and predicted oil production and net CO2 injection, suggesting that the response surface (or proxy) models are sufficient for predicting CO2-EOR system behavior for the FWU case. Given the range of uncertainties in the independent variables, the cumulative distribution functions (CDFs) of dependent variables were estimated using the proxy models. The predicted cumulative oil production and net CO2 injection at 95th percentile after 5 years are about 3.65 times, and 1
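The proxy-model workflow (fit a response surface, then check R2 and NRMSE goodness-of-fit) can be sketched in one dimension. The data and the quadratic form below are stand-ins for the multi-factor surfaces and stepwise regression used in the study:

```python
def fit_quadratic(xs, ys):
    """Least-squares fit y ~ a + b*x + c*x^2 via the normal equations,
    a one-variable stand-in for a multi-factor response surface."""
    rows = [[1.0, x, x * x] for x in xs]           # design matrix [1, x, x^2]
    xtx = [[sum(r[i] * r[j] for r in rows) for j in range(3)] for i in range(3)]
    xty = [sum(r[i] * y for r, y in zip(rows, ys)) for i in range(3)]
    m = [xtx[i] + [xty[i]] for i in range(3)]      # augmented matrix
    for col in range(3):                           # Gauss-Jordan elimination
        piv = max(range(col, 3), key=lambda r: abs(m[r][col]))
        m[col], m[piv] = m[piv], m[col]
        for r in range(3):
            if r != col:
                f = m[r][col] / m[col][col]
                m[r] = [a - f * b for a, b in zip(m[r], m[col])]
    return [m[i][3] / m[i][i] for i in range(3)]

def goodness_of_fit(xs, ys, coefs):
    """R2 and range-normalized RMSE of the fitted surface on the data."""
    a, b, c = coefs
    preds = [a + b * x + c * x * x for x in xs]
    mean_y = sum(ys) / len(ys)
    ss_res = sum((y - p) ** 2 for y, p in zip(ys, preds))
    ss_tot = sum((y - mean_y) ** 2 for y in ys)
    r2 = 1.0 - ss_res / ss_tot
    nrmse = (ss_res / len(ys)) ** 0.5 / (max(ys) - min(ys))
    return r2, nrmse

# Invented data: a response (e.g., oil production) rising then flattening.
xs = [1, 2, 3, 4, 5, 6]
ys = [2.1, 4.2, 6.1, 7.0, 7.2, 6.9]
r2, nrmse = goodness_of_fit(xs, ys, fit_quadratic(xs, ys))
print(r2 > 0.9, nrmse < 0.05)
```

Once a proxy like this passes the goodness-of-fit checks, it can be sampled cheaply in place of the full reservoir simulator to build the CDFs of the outputs.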

  19. Multiple-Strain Approach and Probabilistic Modeling of Consumer Habits in Quantitative Microbial Risk Assessment: A Quantitative Assessment of Exposure to Staphylococcal Enterotoxin A in Raw Milk.

    PubMed

    Crotta, Matteo; Rizzi, Rita; Varisco, Giorgio; Daminelli, Paolo; Cunico, Elena Cosciani; Luini, Mario; Graber, Hans Ulrich; Paterlini, Franco; Guitian, Javier

    2016-03-01

    Quantitative microbial risk assessment (QMRA) models are extensively applied to inform management of a broad range of food safety risks. Inevitably, QMRA modeling involves an element of simplification of the biological process of interest. Two features that are frequently simplified or disregarded are the pathogenicity of multiple strains of a single pathogen and consumer behavior at the household level. In this study, we developed a QMRA model with a multiple-strain approach and a consumer phase module (CPM) based on uncertainty distributions fitted from field data. We modeled exposure to staphylococcal enterotoxin A in raw milk in Lombardy; a specific enterotoxin production module was thus included. The model is adaptable and could be used to assess the risk related to other pathogens in raw milk as well as other staphylococcal enterotoxins. The multiple-strain approach, implemented as a multinomial process, allowed the inclusion of variability and uncertainty with regard to pathogenicity at the bacterial level. Data from 301 questionnaires submitted to raw milk consumers were used to obtain uncertainty distributions for the CPM. The distributions were modeled to be easily updatable with further data or evidence. The sources of uncertainty due to the multiple-strain approach and the CPM were identified, and their impact on the output was assessed by comparing specific scenarios to the baseline. When the distributions reflecting the uncertainty in consumer behavior were fixed to the 95th percentile, the risk of exposure increased up to 160 times. This reflects the importance of taking into consideration the diversity of consumers' habits at the household level and the impact that the lack of knowledge about variables in the CPM can have on the final QMRA estimates. The multiple-strain approach lends itself to use in other food matrices besides raw milk and allows the model to better capture the complexity of the real world and to be capable of geographical

  20. Validation of a Quantitative HIV Risk Prediction Tool Using a National HIV Testing Cohort

    PubMed Central

    Haukoos, Jason S.; Hopkins, Emily; Bucossi, Meggan M.; Lyons, Michael S.; Rothman, Richard E.; White, Douglas A.E.; Al-Tayyib, Alia A.; Bradley-Springer, Lucy; Campbell, Jonathon D.; Sabel, Allison L.; Thrun, Mark W.

    2015-01-01

    Routine screening is recommended for HIV detection. HIV risk estimation remains important. Our goal was to validate the Denver HIV Risk Score (DHRS) using a national cohort from the CDC. Patients ≥13 years of age were included, 4,830,941 HIV tests were performed, and newly diagnosed infections were identified in 0.6% of them. Of all visits, 9% were very low risk (HIV prevalence = 0.20%); 27% low risk (HIV prevalence = 0.17%); 41% moderate risk (HIV prevalence = 0.39%); 17% high risk (HIV prevalence = 1.19%); and 6% very high risk (HIV prevalence = 3.57%). The DHRS accurately categorized patients into different HIV risk groups. PMID:25585300
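The score-to-category step of an instrument like the DHRS amounts to binning a computed score. The cut points below are hypothetical, chosen for illustration; they are not the published DHRS thresholds:

```python
# Hypothetical score-to-category mapping in the spirit of the DHRS.
# Cut points are illustrative, not the published ones.
BANDS = [
    (20, "very low"),   # score below 20
    (30, "low"),
    (40, "moderate"),
    (50, "high"),
]

def risk_category(score):
    """Map a numeric risk score to the first band whose cutoff exceeds it."""
    for cutoff, label in BANDS:
        if score < cutoff:
            return label
    return "very high"

print(risk_category(15), risk_category(44), risk_category(61))
```

Validation then consists of checking that observed HIV prevalence rises monotonically across the bands, as the abstract's 0.20% through 3.57% gradient shows.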

  1. A quantitative exploratory evaluation of the circle of security-parenting program with mothers in residential substance-abuse treatment.

    PubMed

    Horton, Evette; Murray, Christine

    2015-01-01

    Maternal substance abuse is a risk factor for child maltreatment, child attachment insecurity, and maladaptive social information processing. The aim of this study was to conduct a quantitative exploratory evaluation of the effectiveness of an attachment-based parent program, Circle of Security-Parenting (COS-P; G. Cooper, K. Hoffman, & B. Powell, 2009), with a community sample of 15 mothers in residential treatment for substance abuse. Participants attended nine weekly group sessions and were given three measures at pretest and posttest: the Emotion Regulation Questionnaire (J.J. Gross & O.P. John, 2003), the Parent Attribution Test (D. Bugental), and the Parenting Scale (D.S. Arnold, S.G. O'Leary, L.S. Wolff, & M.M. Acker, 1993). The results indicate that mothers who attended the majority of group sessions showed greater improvements on all three variables. Participants who attended some of the sessions showed some improvements on the measures, but participants who did not attend the group sessions had no improvements, and on some measures, declined significantly. Further analyses of demographic data indicate that participants with more education, no personal history of child maltreatment, less time in the residential program, and lower social desirability scores demonstrated more positive outcomes. These findings suggest that the COS-P may positively impact parental risk factors associated with child maltreatment and maladaptive social information processing in the context of residential substance-abuse treatment.

  2. A new approach to quantitative NMR: fluoroquinolones analysis by evaluating the chemical shift displacements.

    PubMed

    Michaleas, S; Antoniadou-Vyza, E

    2006-10-11

    Quantitative NMR spectroscopy is an attractive goal, as identity and quantity can be determined simultaneously. Although significant advances have been achieved in this field, all reported quantitative NMR methods perform the analysis by utilizing the average integral intensities of selected signals. During calculation of the area under NMR peaks, several response problems can occur that must be treated carefully to avoid inaccuracies. In the method proposed in this work, the quantitative information is obtained by measuring the chemical-shift displacements of selected protons, which is a straightforward and highly reproducible process. The (1)H NMR spectra of multiple fluoroquinolone (FQ) solutions revealed that the chemical shifts of protons, especially the aromatic ones, were concentration dependent for all tested compounds, as a result of extensive self-association phenomena. In the present work, a novel methodology is described for the quantitation of several FQs based on this dependence. The proposed method was applied to ciprofloxacin solutions over a wide range of concentrations. Evaluation of the obtained data showed acceptable accuracy, precision, and robustness. The applicability limits of this method are set by current instrumentation, mainly the magnetic field frequency; e.g., the slope of the response function achieved with a 400 MHz instrument was twice that achieved at 200 MHz. The pH effect was negligible from pD 2.5 to 5.5. The phenomenon appears in a pattern that can be applied to a plethora of drug categories exhibiting self-association phenomena, over a range of concentrations determined by the magnet strength of the instrument.
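The quantitation idea, a concentration-dependent chemical-shift displacement inverted to recover concentration, can be sketched as a simple linear calibration. All shift and concentration values below are invented, and a linear response is an assumption; the real shift-concentration relation from self-association need not be linear:

```python
def fit_line(xs, ys):
    """Least-squares line y = m*x + b (calibration of shift displacement
    against concentration)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    m = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return m, my - m * mx

# Hypothetical calibration: aromatic-proton shift displacement (ppb) vs.
# ciprofloxacin concentration (mM); values are invented for illustration.
conc = [1.0, 2.0, 4.0, 8.0]
shift = [3.1, 6.0, 12.2, 23.9]
m, b = fit_line(conc, shift)

def concentration_from_shift(delta):
    """Invert the calibration: unknown concentration from a measured shift."""
    return (delta - b) / m

print(round(concentration_from_shift(9.0), 2))
```

The paper's observation that the calibration slope doubles from 200 MHz to 400 MHz corresponds here to m depending on the instrument, so a calibration is valid only for the field strength it was built on.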

  3. Evaluation of chemotherapy response in ovarian cancer treatment using quantitative CT image biomarkers: a preliminary study

    NASA Astrophysics Data System (ADS)

    Qiu, Yuchen; Tan, Maxine; McMeekin, Scott; Thai, Theresa; Moore, Kathleen; Ding, Kai; Liu, Hong; Zheng, Bin

    2015-03-01

    The purpose of this study is to identify and apply quantitative image biomarkers for early prediction of tumor response to chemotherapy among ovarian cancer patients participating in clinical trials of new drugs. In the experiment, we retrospectively selected 30 cases from patients who participated in Phase I clinical trials of new drugs or drug agents for ovarian cancer treatment. Each case comprises two sets of CT images acquired pre- and post-treatment (4-6 weeks after starting treatment). A computer-aided detection (CAD) scheme was developed to extract and analyze the quantitative image features of the metastatic tumors previously tracked by the radiologists using the standard Response Evaluation Criteria in Solid Tumors (RECIST) guideline. The CAD scheme first segmented 3-D tumor volumes from the background using a hybrid tumor segmentation scheme. Then, for each segmented tumor, CAD computed three quantitative image features: the change in tumor volume, tumor CT number (density), and density variance. The feature changes were calculated between the matched tumors tracked on the CT images acquired pre- and post-treatment. Finally, CAD predicted each patient's 6-month progression-free survival (PFS) using a decision-tree based classifier. The performance of the CAD scheme was compared with the RECIST category. The results show that the CAD scheme achieved a prediction accuracy of 76.7% (23/30 cases) with a Kappa coefficient of 0.493, significantly higher than RECIST prediction, which had a prediction accuracy of 60% (17/30) and a Kappa coefficient of 0.062. This study demonstrates the feasibility of analyzing quantitative image features to improve the accuracy of early prediction of tumor response to new drugs or therapeutic methods for ovarian cancer patients.

  4. Risk evaluation of liquefaction on the site of Damien (Haiti)

    NASA Astrophysics Data System (ADS)

    Jean, B. J.; Boisson, D.; Thimus, J.; Schroeder, C.

    2013-12-01

    Under the proposed relocation of all faculties to the campus of Damien, owned by Université d'Etat d'Haïti (UEH), the Unité de Recherche en Géotechnique (URGéo) of the Faculté des Sciences (FDS) of UEH conducted several operations whose objective was to evaluate the risk of liquefaction on this site. This abstract presents, in a comprehensive and coherent manner, the entire process of assessing the risk of liquefaction. The evaluation was conducted mainly from seismic techniques, laboratory tests, and the response of a one-dimensional soil column. We then summarize the results of this evaluation across the various techniques, through synthetic map interpretations of 1D MASW and H/V measurements, as well as on-site measurements of the response to seismic loading from SPT tests applied to the evaluation of liquefaction potential.

  5. Reanalysis of the DEMS Nested Case-Control Study of Lung Cancer and Diesel Exhaust: Suitability for Quantitative Risk Assessment

    PubMed Central

    Crump, Kenny S; Van Landingham, Cynthia; Moolgavkar, Suresh H; McClellan, Roger

    2015-01-01

    The International Agency for Research on Cancer (IARC) in 2012 upgraded its hazard characterization of diesel engine exhaust (DEE) to “carcinogenic to humans.” The Diesel Exhaust in Miners Study (DEMS) cohort and nested case-control studies of lung cancer mortality in eight U.S. nonmetal mines were influential in IARC’s determination. We conducted a reanalysis of the DEMS case-control data to evaluate its suitability for quantitative risk assessment (QRA). Our reanalysis used conditional logistic regression and adjusted for cigarette smoking in a manner similar to the original DEMS analysis. However, we included additional estimates of DEE exposure and adjustment for radon exposure. In addition to applying three DEE exposure estimates developed by DEMS, we applied six alternative estimates. Without adjusting for radon, our results were similar to those in the original DEMS analysis: all but one of the nine DEE exposure estimates showed evidence of an association between DEE exposure and lung cancer mortality, with trend slopes differing only by about a factor of two. When exposure to radon was adjusted, the evidence for a DEE effect was greatly diminished, but was still present in some analyses that utilized the three original DEMS DEE exposure estimates. A DEE effect was not observed when the six alternative DEE exposure estimates were utilized and radon was adjusted. No consistent evidence of a DEE effect was found among miners who worked only underground. This article highlights some issues that should be addressed in any use of the DEMS data in developing a QRA for DEE. PMID:25857246

  6. Quantitative evaluation study of four-dimensional gated cardiac SPECT reconstruction †

    PubMed Central

    Jin, Mingwu; Yang, Yongyi; Niu, Xiaofeng; Marin, Thibault; Brankov, Jovan G.; Feng, Bing; Pretorius, P. Hendrik; King, Michael A.; Wernick, Miles N.

    2013-01-01

    In practice gated cardiac SPECT images suffer from a number of degrading factors, including distance-dependent blur, attenuation, scatter, and increased noise due to gating. Recently we proposed a motion-compensated approach for four-dimensional (4D) reconstruction for gated cardiac SPECT, and demonstrated that use of motion-compensated temporal smoothing could be effective for suppressing the increased noise due to lowered counts in individual gates. In this work we further develop this motion-compensated 4D approach by also taking into account attenuation and scatter in the reconstruction process, which are two major degrading factors in SPECT data. In our experiments we conducted a thorough quantitative evaluation of the proposed 4D method using Monte Carlo simulated SPECT imaging based on the 4D NURBS-based cardiac-torso (NCAT) phantom. In particular we evaluated the accuracy of the reconstructed left ventricular myocardium using a number of quantitative measures including regional bias-variance analyses and wall intensity uniformity. The quantitative results demonstrate that use of motion-compensated 4D reconstruction can improve the accuracy of the reconstructed myocardium, which in turn can improve the detectability of perfusion defects. Moreover, our results reveal that while traditional spatial smoothing could be beneficial, its merit would become diminished with the use of motion-compensated temporal regularization. As a preliminary demonstration, we also tested our 4D approach on patient data. The reconstructed images from both simulated and patient data demonstrated that our 4D method can improve the definition of the LV wall. PMID:19724094

  7. Assessing the Expected Impact of Global Health Treaties: Evidence From 90 Quantitative Evaluations

    PubMed Central

    Røttingen, John-Arne

    2015-01-01

    We assessed what impact can be expected from global health treaties on the basis of 90 quantitative evaluations of existing treaties on trade, finance, human rights, conflict, and the environment. It appears treaties consistently succeed in shaping economic matters and consistently fail in achieving social progress. There are at least 3 differences between these domains that point to design characteristics that new global health treaties can incorporate to achieve positive impact: (1) incentives for those with power to act on them; (2) institutions designed to bring edicts into effect; and (3) interests advocating their negotiation, adoption, ratification, and domestic implementation. Experimental and quasiexperimental evaluations of treaties would provide more information about what can be expected from this type of global intervention. PMID:25393196

  8. Evaluation of green coffee beans quality using near infrared spectroscopy: a quantitative approach.

    PubMed

    Santos, João Rodrigo; Sarraguça, Mafalda C; Rangel, António O S S; Lopes, João A

    2012-12-01

    Characterisation of coffee quality based on bean quality assessment is associated with the relative amount of defective beans among non-defective beans. It is therefore important to develop a methodology capable of identifying the presence of defective beans that enables a fast assessment of coffee grade and that can become an analytical tool to standardise coffee quality. In this work, a methodology for quality assessment of green coffee based on near infrared spectroscopy (NIRS) is proposed. NIRS is a green chemistry, low cost, fast response technique without the need of sample processing. The applicability of NIRS was evaluated for Arabica and Robusta varieties from different geographical locations. Partial least squares regression was used to relate the NIR spectrum to the mass fraction of defective and non-defective beans. Relative errors around 5% show that NIRS can be a valuable analytical tool to be used by coffee roasters, enabling a simple and quantitative evaluation of green coffee quality in a fast way.

  9. Quantitative evaluation of an experimental inflammation induced with Freund's complete adjuvant in dogs.

    PubMed

    Botrel, M A; Haak, T; Legrand, C; Concordet, D; Chevalier, R; Toutain, P L

    1994-10-01

    A chronic inflammation model in dogs was induced by intraarticular injection of Freund's Complete Adjuvant in the stifle. After a primary, acute response during the first 24 hr, a secondary subacute response was observed after a delay of approximately 3 weeks and persisted for several weeks. To evaluate the time course of the inflammatory process quantitatively, we tested more than 100 different parameters. Finally, only four parameters were selected based on practicability and metrological properties, namely, the body temperature, difference in skin temperature, difference in stifle diameter and vertical force exerted by arthritic hind limb measured using a force plate. The main results of the experimentation were the demonstration that these four parameters were sufficiently repeatable, reproducible, and appropriate to be used for quantitative evaluation of the inflammatory process, and that training of both animals and investigators was required. Finally, it was illustrated that an adjuvant periarthritis in dogs can be used to carry out a pharmacokinetic/pharmacodynamic modelling of an antiinflammatory drug. PMID:7865864

  10. Segmentation and quantitative evaluation of brain MRI data with a multiphase 3D implicit deformable model

    NASA Astrophysics Data System (ADS)

    Angelini, Elsa D.; Song, Ting; Mensh, Brett D.; Laine, Andrew

    2004-05-01

    Segmentation of three-dimensional anatomical brain images into tissue classes has applications in both clinical and research settings. This paper presents the implementation and quantitative evaluation of a four-phase three-dimensional active contour implemented with a level set framework for automated segmentation of brain MRIs. The segmentation algorithm performs an optimal partitioning of three-dimensional data based on homogeneity measures that naturally evolves to the extraction of different tissue types in the brain. Random seed initialization was used to speed up numerical computation and avoid the need for a priori information. This random initialization ensures robustness of the method to variation of user expertise, biased a priori information and errors in input information that could be influenced by variations in image quality. Experimentation on three MRI brain data sets showed that an optimal partitioning successfully labeled regions that accurately identified white matter, gray matter and cerebrospinal fluid in the ventricles. Quantitative evaluation of the segmentation was performed with comparison to manually labeled data and computed false positive and false negative assignments of voxels for the three organs. We report high accuracy for the two comparison cases. These results demonstrate the efficiency and flexibility of this segmentation framework to perform the challenging task of automatically extracting brain tissue volume contours.
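
    The voxelwise comparison with manually labeled data described above amounts to counting false positive and false negative assignments per tissue class. A minimal sketch, with hypothetical binary label vectors as inputs:

```python
def fp_fn_rates(auto, manual):
    """auto, manual: equal-length 0/1 label sequences for one tissue class."""
    fp = sum(1 for a, m in zip(auto, manual) if a and not m)   # labeled by algorithm only
    fn = sum(1 for a, m in zip(auto, manual) if m and not a)   # missed by algorithm
    n_auto = sum(auto)
    n_manual = sum(manual)
    return (fp / n_auto if n_auto else 0.0,
            fn / n_manual if n_manual else 0.0)
```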

  11. Highly sensitive and quantitative evaluation of the EGFR T790M mutation by nanofluidic digital PCR.

    PubMed

    Iwama, Eiji; Takayama, Koichi; Harada, Taishi; Okamoto, Isamu; Ookubo, Fumihiko; Kishimoto, Junji; Baba, Eishi; Oda, Yoshinao; Nakanishi, Yoichi

    2015-08-21

    The mutation of T790M in EGFR is a major mechanism of resistance to treatment with EGFR-TKIs. Only qualitative detection (presence or absence) of T790M has been described to date, however. Digital PCR (dPCR) analysis has recently been applied to the quantitative detection of target molecules in cancer with high sensitivity. In the present study, 25 tumor samples (13 obtained before and 12 after EGFR-TKI treatment) from 18 NSCLC patients with activating EGFR mutations were evaluated for T790M with dPCR. The ratio of the number of T790M alleles to that of activating mutation alleles (T/A) was determined. dPCR detected T790M in all 25 samples. Although T790M was present in all pre-TKI samples from 13 patients, 10 of these patients had a low T/A ratio and manifested substantial tumor shrinkage during treatment with EGFR-TKIs. In six of seven patients for whom both pre- and post-TKI samples were available, the T/A ratio increased markedly during EGFR-TKI treatment. Highly sensitive dPCR thus detected T790M in all NSCLC patients harboring activating EGFR mutations whether or not they had received EGFR-TKI treatment. Not only highly sensitive but also quantitative detection of T790M is important for evaluation of the contribution of T790M to EGFR-TKI resistance.
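
    The abstract does not give the platform's quantitation formula, but digital PCR copy numbers are conventionally recovered from positive-partition counts by Poisson correction, and the T/A ratio then follows directly. A sketch under that standard assumption:

```python
import math

def dpcr_copies(positive, partitions):
    """Poisson-corrected template copies: -N * ln(1 - k/N) for k of N positive partitions."""
    return -partitions * math.log(1.0 - positive / partitions)

def t_over_a(t790m_positive, activating_positive, partitions):
    # Ratio of T790M alleles to activating-mutation alleles (the T/A ratio above).
    return dpcr_copies(t790m_positive, partitions) / dpcr_copies(activating_positive, partitions)
```

    At low occupancy (k much smaller than N) the corrected copy number is close to the raw positive count; the correction matters as partitions saturate.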

  12. Panoramic imaging is not suitable for quantitative evaluation, classification, and follow up in unilateral condylar hyperplasia.

    PubMed

    Nolte, J W; Karssemakers, L H E; Grootendorst, D C; Tuinzing, D B; Becking, A G

    2015-05-01

    Patients with suspected unilateral condylar hyperplasia are often screened radiologically with a panoramic radiograph, but this is not sufficient for routine diagnosis and follow up. We have therefore made a quantitative analysis and evaluation of panoramic radiographs in a large group of patients with the condition. During the period 1994-2011, 132 patients with 113 panoramic radiographs were analysed using a validated method. There was good reproducibility between observers, but the condylar neck and head were the regions reported with least reliability. Although in most patients asymmetry of the condylar head, neck, and ramus was confirmed, the kappa coefficient as an indicator of agreement between two observers was poor (-0.040 to 0.504). Hardly any difference between sides was measured at the gonion angle, and the body appeared to be higher on the affected side in 80% of patients. Panoramic radiographs might be suitable for screening, but are not suitable for the quantitative evaluation, classification, and follow up of patients with unilateral condylar hyperplasia. PMID:25798757

  13. Quantitative evaluation on internal seeing induced by heat-stop of solar telescope.

    PubMed

    Liu, Yangyi; Gu, Naiting; Rao, Changhui

    2015-07-27

    The heat-stop is one of the essential thermal-control devices of a solar telescope. The internal seeing induced by its temperature rise significantly degrades imaging quality. For quantitative evaluation of internal seeing, an integrated analysis method based on computational fluid dynamics (CFD) and geometric optics is proposed in this paper. First, the temperature field of the heat-affected zone induced by the heat-stop temperature rise is obtained by CFD calculation. Second, the temperature field is converted to a refractive-index field by the corresponding equations. Third, the wavefront aberration induced by internal seeing is calculated by geometric optics, based on optical integration through the refractive-index field. This integrated method is applied to the heat-stop of the Chinese Large Solar Telescope to quantitatively evaluate its internal seeing. The analysis shows that the maximum acceptable temperature rise of the heat-stop is 5 K above the ambient air at any telescope pointing direction, under the condition that the root-mean-square wavefront aberration induced by internal seeing is less than 25 nm. Furthermore, the magnitude of the wavefront aberration gradually increases with heat-stop temperature rise for a given pointing direction. Meanwhile, as the telescope pointing varies from horizontal to vertical, the wavefront aberration for the same heat-stop temperature rise first decreases and then increases.
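
    The geometric-optics step can be sketched as an optical-path-difference (OPD) integration through the refractive-index field. This simplified version assumes straight rays and a uniform ambient index, which are simplifying assumptions, not details from the paper:

```python
import math

def opd(n_samples, ds, n_ambient=1.0):
    """Optical path difference along one ray: sum of (n - n_ambient) * ds."""
    return sum((n - n_ambient) * ds for n in n_samples)

def rms_wavefront(opds):
    """Root-mean-square wavefront aberration across a bundle of rays."""
    mean = sum(opds) / len(opds)
    return math.sqrt(sum((o - mean) ** 2 for o in opds) / len(opds))
```

    In the paper's terms, the 25 nm criterion would be applied to the RMS value computed over rays spanning the pupil.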

  14. Weighted Fuzzy Risk Priority Number Evaluation of Turbine and Compressor Blades Considering Failure Mode Correlations

    NASA Astrophysics Data System (ADS)

    Gan, Luping; Li, Yan-Feng; Zhu, Shun-Peng; Yang, Yuan-Jian; Huang, Hong-Zhong

    2014-06-01

    Failure mode, effects and criticality analysis (FMECA) and Fault tree analysis (FTA) are powerful tools to evaluate reliability of systems. Although single failure mode issue can be efficiently addressed by traditional FMECA, multiple failure modes and component correlations in complex systems cannot be effectively evaluated. In addition, correlated variables and parameters are often assumed to be precisely known in quantitative analysis. In fact, due to the lack of information, epistemic uncertainty commonly exists in engineering design. To solve these problems, the advantages of FMECA, FTA, fuzzy theory, and Copula theory are integrated into a unified hybrid method called fuzzy probability weighted geometric mean (FPWGM) risk priority number (RPN) method. The epistemic uncertainty of risk variables and parameters are characterized by fuzzy number to obtain fuzzy weighted geometric mean (FWGM) RPN for single failure mode. Multiple failure modes are connected using minimum cut sets (MCS), and Boolean logic is used to combine fuzzy risk priority number (FRPN) of each MCS. Moreover, Copula theory is applied to analyze the correlation of multiple failure modes in order to derive the failure probabilities of each MCS. Compared to the case where dependency among multiple failure modes is not considered, the Copula modeling approach eliminates the error of reliability analysis. Furthermore, for purpose of quantitative analysis, probabilities importance weight from failure probabilities are assigned to FWGM RPN to reassess the risk priority, which generalize the definition of probability weight and FRPN, resulting in a more accurate estimation than that of the traditional models. Finally, a basic fatigue analysis case drawn from turbine and compressor blades in aeroengine is used to demonstrate the effectiveness and robustness of the presented method. The result provides some important insights on fatigue reliability analysis and risk priority assessment of structural
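
    As a crisp (non-fuzzy) illustration of the weighted geometric mean idea behind the FPWGM RPN: instead of the traditional RPN product S × O × D, the severity, occurrence, and detection ratings are combined as a weighted geometric mean. The weights and scales here are hypothetical, and the paper's method additionally fuzzifies these inputs:

```python
def wgm_rpn(s, o, d, ws=1.0, wo=1.0, wd=1.0):
    """Weighted geometric mean of severity, occurrence, and detection ratings."""
    total = ws + wo + wd
    return (s ** ws * o ** wo * d ** wd) ** (1.0 / total)
```

    With equal weights this reduces to the plain geometric mean; raising one weight makes that factor dominate the ranking.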

  15. Quantitative trait loci modifying cardiac atrial septal morphology and risk of patent foramen ovale in the mouse.

    PubMed

    Kirk, Edwin P; Hyun, Changbaig; Thomson, Peter C; Lai, Donna; Castro, M Leticia; Biben, Christine; Buckley, Michael F; Martin, Ian C A; Moran, Chris; Harvey, Richard P

    2006-03-17

    Atrial septal defect (ASD) is a common congenital heart disease (CHD) occurring in 5 to 7 per 10,000 live births. Mutations in 5 human genes (NKX2.5, TBX5, GATA4, MYHC, ACTC) are known to cause dominant ASD, but these account for a minority of cases. Human and mouse data suggest that ASD exists in an anatomical continuum with milder septal variants patent foramen ovale (PFO) and atrial septal aneurysm, strongly associated with ischemic stroke and migraine. We have previously shown in inbred mice that the incidence of PFO strongly correlates with length of the interatrial septum primum, defining a quantitative trait underlying PFO risk. To better understand genetic causation of atrial septal abnormalities, we mapped quantitative trait loci (QTL) influencing septal morphology using mouse strains (QSi5 and 129T2/SvEms) maximally informative for PFO incidence and 3 quantitative septal anatomical traits including septum primum length. [QSi5x129T2/SvEms]F2 intercross animals (n=1437) were phenotyped and a whole genome scan performed at an average 17-cM interval. Statistical methodology scoring PFO as a binary phenotype was developed as a confirmatory mapping technique. We mapped 7 significant and 6 suggestive QTL modifying quantitative phenotypes, with 4 supported by binary analysis. Quantitative traits, although strongly associated with PFO (P<0.001), correlated poorly with each other and in all but 1 case QTL for different traits were nonoverlapping. Thus, multiple anatomical processes under separate genetic control contribute to risk of PFO. Our findings demonstrate the feasibility of modeling the genetic basis of common CHD using animal genetic and genomic technologies.

  16. IWGT report on quantitative approaches to genotoxicity risk assessment II. Use of point-of-departure (PoD) metrics in defining acceptable exposure limits and assessing human risk.

    PubMed

    MacGregor, James T; Frötschl, Roland; White, Paul A; Crump, Kenny S; Eastmond, David A; Fukushima, Shoji; Guérard, Melanie; Hayashi, Makoto; Soeteman-Hernández, Lya G; Johnson, George E; Kasamatsu, Toshio; Levy, Dan D; Morita, Takeshi; Müller, Lutz; Schoeny, Rita; Schuler, Maik J; Thybaud, Véronique

    2015-05-01

    This is the second of two reports from the International Workshops on Genotoxicity Testing (IWGT) Working Group on Quantitative Approaches to Genetic Toxicology Risk Assessment (the QWG). The first report summarized the discussions and recommendations of the QWG related to the need for quantitative dose-response analysis of genetic toxicology data, the existence and appropriate evaluation of threshold responses, and methods to analyze exposure-response relationships and derive points of departure (PoDs) from which acceptable exposure levels could be determined. This report summarizes the QWG discussions and recommendations regarding appropriate approaches to evaluate exposure-related risks of genotoxic damage, including extrapolation below identified PoDs and across test systems and species. Recommendations include the selection of appropriate genetic endpoints and target tissues, uncertainty factors and extrapolation methods to be considered, the importance and use of information on mode of action, toxicokinetics, metabolism, and exposure biomarkers when using quantitative exposure-response data to determine acceptable exposure levels in human populations or to assess the risk associated with known or anticipated exposures. The empirical relationship between genetic damage (mutation and chromosomal aberration) and cancer in animal models was also examined. It was concluded that there is a general correlation between cancer induction and mutagenic and/or clastogenic damage for agents thought to act via a genotoxic mechanism, but that the correlation is limited due to an inadequate number of cases in which mutation and cancer can be compared at a sufficient number of doses in the same target tissues of the same species and strain exposed under directly comparable routes and experimental protocols.

  17. A neural network model for credit risk evaluation.

    PubMed

    Khashman, Adnan

    2009-08-01

    Credit scoring is one of the key analytical techniques in credit risk evaluation, which has been an active research area in financial risk management. This paper presents a credit risk evaluation system that uses a neural network model based on the back-propagation learning algorithm. We train and implement the neural network to decide whether to approve or reject a credit application, using seven learning schemes and real-world credit applications from the Australian credit approval dataset. A comparison of system performance under the different learning schemes is provided; furthermore, we compare the performance of two neural networks, with one and two hidden layers, under the ideal learning scheme. Experimental results suggest that neural networks can be effectively used in the automatic processing of credit applications.
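
    A minimal sketch of such a back-propagation credit classifier, on synthetic data rather than the Australian credit dataset; the architecture, learning rate, and features here are illustrative, not the paper's configuration:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 4))                        # 4 synthetic applicant features
y = (X[:, 0] + X[:, 1] > 0).astype(float)[:, None]   # toy approve/reject labels

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# One hidden layer (5 units), sigmoid output, trained by full-batch backprop.
W1 = rng.normal(scale=0.5, size=(4, 5)); b1 = np.zeros(5)
W2 = rng.normal(scale=0.5, size=(5, 1)); b2 = np.zeros(1)
lr = 0.5
for _ in range(2000):
    h = sigmoid(X @ W1 + b1)            # hidden-layer activations
    p = sigmoid(h @ W2 + b2)            # predicted approval probability
    g = (p - y) / len(X)                # cross-entropy gradient at the output
    gh = (g @ W2.T) * h * (1.0 - h)     # backpropagated hidden-layer gradient
    W2 -= lr * (h.T @ g); b2 -= lr * g.sum(axis=0)
    W1 -= lr * (X.T @ gh); b1 -= lr * gh.sum(axis=0)

accuracy = float(((p > 0.5) == (y > 0.5)).mean())
```

    On this linearly separable toy problem the network reaches high training accuracy; the paper's comparisons concern learning-scheme and depth choices on real applications.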

  18. [Initial evaluation of febrile neutropenic patients: risk quantification].

    PubMed

    Vázquez, Lourdes; García, José Elías

    2005-12-01

    Infection in immunocompromised hosts represents a serious clinical situation because of the high morbidity and mortality it produces, and it is one of the most frequent complications in patients with cancer. In patients treated with chemotherapy, the risk of infection depends mainly on the duration and intensity of neutropenia. It is essential to evaluate which pathogens are involved so that the most appropriate treatment can be selected a priori, and to determine the patient's general clinical status so that more or less aggressive treatment can be provided from the beginning, bearing in mind that "low-risk" patients can be managed at home. These questions can be resolved by evaluating the patient's clinical history, physical examination, laboratory investigations, and radiological tests. Prompt initiation of broad-spectrum antibiotic therapy adapted to the patient's risk is crucial.

  19. Development and evaluation of quantitative-competitive PCR for quantitation of coxsackievirus B3 RNA in experimentally infected murine tissues.

    PubMed

    Reetoo, K N; Osman, S A; Illavia, S J; Banatvala, J E; Muir, P

    1999-10-01

    A method is described for quantitation of enterovirus RNA in experimentally infected murine tissues. Viral RNA was extracted from tissue samples and amplified by reverse transcriptase PCR in the presence of an internal standard RNA. The ratio of PCR product derived from viral RNA to that derived from internal standard RNA was then determined using specific probes in a post-PCR electrochemiluminescent hybridization assay. This provided an estimate of the viral RNA copy number in the original sample, and detection of PCR product derived from internal standard RNA validated the sample processing and amplification procedures. RNA copy number correlated with viral infectivity of cell culture-derived virus, and one tissue culture infective dose was found to contain approximately 10(3) genome equivalents. The ratio of RNA copy number to infectivity in myocardial tissue taken from mice during the acute phase of coxsackievirus B3 myocarditis was more variable, ranging from 10(4) to 10(7), and was dependent on the stage of infection, reflecting differential rates of clearance of viral RNA and viral infectivity. The assay is rapid and could facilitate investigations that currently rely on enterovirus quantitation by titration in cell culture. This would be useful for experimental studies of viral pathogenesis, prophylaxis, and antiviral therapy.
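
    The quantitation step described above reduces to simple arithmetic: assuming target and competitor amplify with equal efficiency, the product-signal ratio approximates the starting-template ratio. A sketch with illustrative function and variable names:

```python
def viral_rna_copies(viral_signal, standard_signal, standard_copies):
    """Estimate viral RNA copies from the viral:standard PCR product signal ratio,
    given the known copy number of internal standard spiked into the reaction."""
    return (viral_signal / standard_signal) * standard_copies
```

    For example, equal viral and standard signals imply the sample contained roughly as many viral RNA copies as standard copies were spiked in.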

  20. Rape Prevention with College Men: Evaluating Risk Status

    ERIC Educational Resources Information Center

    Stephens, Kari A.; George, William H.

    2009-01-01

    This study evaluates the effectiveness of a theoretically based rape prevention intervention with college men who were at high or low risk to perpetrate sexually coercive behavior. Participants (N = 146) are randomly assigned to the intervention or control group. Outcomes include rape myth acceptance, victim empathy, attraction to sexual…

  1. Developing and Evaluating a Cardiovascular Risk Reduction Project.

    ERIC Educational Resources Information Center

    Brownson, Ross C.; Mayer, Jeffrey P.; Dusseault, Patricia; Dabney, Sue; Wright, Kathleen; Jackson-Thompson, Jeannette; Malone, Bernard; Goodman, Robert

    1997-01-01

    Describes the development and baseline evaluation data from the Ozark Heart Health Project, a community-based cardiovascular disease risk reduction program in rural Missouri that targeted smoking, physical inactivity, and poor diet. Several Ozark counties participated in either intervention or control groups, and researchers conducted surveillance…

  2. [Evaluating occupational health risk in titanium alloys production workers].

    PubMed

    Bazarova, E L

    2007-01-01

    The authors present data on the evaluation of personified and non-personified occupational health risk in titanium alloy production workers, based on hygienic, medical-biological, social, and psychological criteria. A single-figure assessment of the working conditions is suggested.

  3. Field Evaluation of an Avian Risk Assessment Model

    EPA Science Inventory

    We conducted two laboratory subacute dietary toxicity tests and one outdoor subacute dietary toxicity test to determine the effectiveness of the U.S. Environmental Protection Agency's deterministic risk assessment model for evaluating the potential of adverse effects to birds in ...

  4. Risk management in technovigilance: construction and validation of a medical-hospital product evaluation instrument.

    PubMed

    Kuwabara, Cleuza Catsue Takeda; Evora, Yolanda Dora Martinez; de Oliveira, Márcio Mattos Borges

    2010-01-01

    With the continuous incorporation of health technologies, hospital risk management should be implemented to systemize the monitoring of adverse effects, performing actions to control and eliminate their damage. As part of these actions, Technovigilance is active in the procedures of acquisition, use, and quality control of health products and equipment. This study aimed to construct and validate an instrument to evaluate medical-hospital products. This is a quantitative, exploratory, longitudinal, methodological-development study based on the Six Sigma quality management model, whose principal basis is the component stages of the DMAIC cycle. For data collection and content validation, the Delphi technique was used with professionals from the Brazilian Sentinel Hospital Network. It was concluded that the instrument developed permitted the evaluation of the product, differentiating between the results of the tested brands, in line with the initial study goal of qualifying the evaluations performed. PMID:21120414

  5. A quantitative release assessment for the noncommercial movement of companion animals: risk of rabies reintroduction to the United kingdom.

    PubMed

    Goddard, A D; Donaldson, N M; Horton, D L; Kosmider, R; Kelly, L A; Sayers, A R; Breed, A C; Freuling, C M; Müller, T; Shaw, S E; Hallgren, G; Fooks, A R; Snary, E L

    2012-10-01

    In 2004, the European Union (EU) implemented a pet movement policy (referred to here as the EUPMP) under EU regulation 998/2003. The United Kingdom (UK) was granted a temporary derogation from the policy until December 2011 and instead had in place its own pet movement policy (the Pet Travel Scheme (PETS)). A quantitative risk assessment (QRA) was developed to estimate the risk of rabies introduction to the UK under both schemes, to quantify any change in the risk of rabies introduction should the UK harmonize with the EU policy. Assuming 100% compliance with the regulations, moving to the EUPMP was predicted to increase the annual risk of rabies introduction to the UK approximately 60-fold, from 7.79 × 10(-5) (5.90 × 10(-5), 1.06 × 10(-4)) under the current scheme to 4.79 × 10(-3) (4.05 × 10(-3), 5.65 × 10(-3)) under the EUPMP. This corresponds to a decrease from 13,272 (9,408, 16,940) to 211 (177, 247) years between rabies introductions. The risks associated with both schemes were predicted to increase when less than 100% compliance was assumed, with the current scheme of PETS and quarantine being particularly sensitive to noncompliance. The results of this risk assessment, along with other evidence, formed a scientific evidence base to inform policy decisions with respect to companion animal movement.
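
    The reported "years between introductions" figures are, to a close approximation, the reciprocals of the annual risks: treating each year as an independent Bernoulli trial makes the waiting time geometric with mean 1/p. The point-estimate reciprocals below differ slightly from the published 13,272 and 211 years, which average over the full uncertainty distribution:

```python
def mean_years_between(annual_risk):
    """Mean geometric waiting time between introductions, given annual probability p."""
    return 1.0 / annual_risk

pets = mean_years_between(7.79e-5)    # current UK scheme, roughly 12,800 years
eupmp = mean_years_between(4.79e-3)   # harmonized EU scheme, roughly 209 years
```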

  6. Quantitative microbial risk assessment related to urban wastewater and lagoon water reuse in Abidjan, Côte d'Ivoire.

    PubMed

    Yapo, R I; Koné, B; Bonfoh, B; Cissé, G; Zinsstag, J; Nguyen-Viet, H

    2014-06-01

    We assessed the infection risks related to the use of wastewater in Abidjan, Côte d'Ivoire, by using quantitative microbial risk assessment (QMRA). Giardia lamblia and Escherichia coli were isolated and identified in wastewater samples from the canal and lagoon. The exposure assessment was conducted using a cross-sectional questionnaire survey of 150 individuals who were in contact with the wastewater during their daily activities of swimming, fishing, washing, and collecting materials for reuse. Risk was characterised using Monte Carlo simulation with 10,000 iterations. Results showed high contamination of the water by E. coli (12.8 to 2.97 × 10(4) CFU/100 mL) and G. lamblia (0 to 18.5 cysts/L). Estimates of yearly average infection risks for E. coli (90.07-99.90%, assuming that 8% of E. coli were E. coli O157:H7) and G. lamblia (9.4-34.78%) were much higher than the acceptable risk (10(-4)). These results suggest the need for wastewater treatment plants and for raising awareness among the population in contact with urban wastewater and lagoon water. Our study also showed that QMRA is appropriate for studying health risks in settings with limited data and budget resources. PMID:24937224
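
    The Monte Carlo risk characterisation described above can be sketched generically. The dose-response form (exponential), parameters, and ingested volumes below are illustrative placeholders, not the study's actual model:

```python
import math
import random

def annual_infection_risk(concentrations, ingested_ml, r, events_per_year,
                          n_iter=10_000, seed=1):
    """Mean annual infection risk over Monte Carlo draws of measured concentrations.

    concentrations: measured organism counts per 100 mL (empirical sample to draw from)
    ingested_ml: volume ingested per exposure event
    r: exponential dose-response parameter, P(inf | dose) = 1 - exp(-r * dose)
    """
    rng = random.Random(seed)
    annual_risks = []
    for _ in range(n_iter):
        c = rng.choice(concentrations)              # sample a measured concentration
        dose = c * ingested_ml / 100.0              # organisms ingested in one event
        p_event = 1.0 - math.exp(-r * dose)         # exponential dose-response
        p_year = 1.0 - (1.0 - p_event) ** events_per_year
        annual_risks.append(p_year)
    return sum(annual_risks) / len(annual_risks)
```

    Per-event risks are annualised as 1 - (1 - p_event)^n before averaging over iterations.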

  8. Impact of acquired immunity and dose-dependent probability of illness on quantitative microbial risk assessment.

    PubMed

    Havelaar, A H; Swart, A N

    2014-10-01

    Dose-response models in microbial risk assessment consider two steps in the process ultimately leading to illness: from exposure to (asymptomatic) infection, and from infection to (symptomatic) illness. Most data and theoretical approaches are available for the exposure-infection step; the infection-illness step has received less attention. Furthermore, current microbial risk assessment models do not account for acquired immunity. These limitations may lead to biased risk estimates. We consider effects of both dose dependency of the conditional probability of illness given infection, and acquired immunity to risk estimates, and demonstrate their effects in a case study on exposure to Campylobacter jejuni. To account for acquired immunity in risk estimates, an inflation factor is proposed. The inflation factor depends on the relative rates of loss of protection over exposure. The conditional probability of illness given infection is based on a previously published model, accounting for the within-host dynamics of illness. We find that at low (average) doses, the infection-illness model has the greatest impact on risk estimates, whereas at higher (average) doses and/or increased exposure frequencies, the acquired immunity model has the greatest impact. The proposed models are strongly nonlinear, and reducing exposure is not expected to lead to a proportional decrease in risk and, under certain conditions, may even lead to an increase in risk. The impact of different dose-response models on risk estimates is particularly pronounced when introducing heterogeneity in the population exposure distribution.
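
    The warning that risk need not fall proportionally with exposure can be illustrated with a plain exponential dose-response, which is a simpler model than the within-host and immunity models discussed above; r is an arbitrary illustrative parameter:

```python
import math

def p_ill(dose, r=0.05):
    """Exponential dose-response: probability of illness at a given dose."""
    return 1.0 - math.exp(-r * dose)

# Fraction of risk remaining after the dose is halved; well above 0.5 at high doses.
remaining = p_ill(50.0) / p_ill(100.0)
```

    Here halving the dose from 100 to 50 leaves over 90% of the risk, because the curve is already near saturation at the higher dose.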

  9. Blood color is influenced by inflammation and independently predicts survival in hemodialysis patients: quantitative evaluation of blood color.

    PubMed

    Shibata, Masanori; Nagai, Kojiro; Doi, Toshio; Tawada, Hideo; Taniguchi, Shinkichi

    2012-11-01

    The blood color of dialysis patients can be observed routinely. Darkened blood color is often seen in critically ill patients, generally because of decreased oxygen saturation, but little is known about other factors responsible for the color intensity. In addition, quantitative examination of blood color has not previously been performed, so the predictive power of blood color has never been evaluated. The aim of this study was to evaluate whether blood color darkness reflects medical problems and is associated with a survival disadvantage. This prospective cohort study enrolled 167 patients. Blood color was quantified using a reflected-light colorimeter. Demographic and clinical data were collected to identify factors related to blood color. Patients were followed for 2 years to analyze risk factors for survival. Regression analysis showed that C-reactive protein and white blood cell count were negatively correlated with blood color. In addition, blood color was positively correlated with mean corpuscular hemoglobin concentration and serum sodium concentration, as well as blood oxygen saturation. During follow-up, 34 (20.4%) patients died. Cox regression analysis revealed that darkened blood color was an independent risk factor for mortality in hemodialysis patients, as were low albumin and low Kt/V. These results suggest that inflammation independently affects blood color and that quantification of blood color is useful for estimating prognosis in patients undergoing hemodialysis. It is possible that early detection of blood color worsening could improve patients' survival.

  10. Reprint of "Quantitative evaluation of brain development using anatomical MRI and diffusion tensor imaging".

    PubMed

    Oishi, Kenichi; Faria, Andreia V; Yoshida, Shoko; Chang, Linda; Mori, Susumu

    2014-02-01

The development of the brain is structure-specific, and the growth rate of each structure differs depending on the age of the subject. Magnetic resonance imaging (MRI) is often used to evaluate brain development because of the high spatial resolution and contrast that enable the observation of structure-specific developmental status. Currently, most clinical MRIs are evaluated qualitatively to assist in the clinical decision-making and diagnosis. The clinical MRI report usually does not provide quantitative values that can be used to monitor developmental status. Recently, the importance of image quantification to detect and evaluate mild-to-moderate anatomical abnormalities has been emphasized because these alterations are possibly related to several psychiatric disorders and learning disabilities. In the research arena, structural MRI and diffusion tensor imaging (DTI) have been widely applied to quantify brain development of the pediatric population. To interpret the values from these MR modalities, a "growth percentile chart," which describes the mean and standard deviation of the normal developmental curve for each anatomical structure, is required. Although efforts have been made to create such a growth percentile chart based on MRI and DTI, one of the greatest challenges is to standardize the anatomical boundaries of the measured anatomical structures. To avoid inter- and intra-reader variability in anatomical boundary definition, and hence to increase the precision of quantitative measurements, an automated structure parcellation method, customized for the neonatal and pediatric population, has been developed. This method enables quantification of multiple MR modalities using a common analytic framework. In this paper, the attempt to create an MRI- and a DTI-based growth percentile chart, followed by an application to investigate developmental abnormalities related to cerebral palsy, Williams syndrome, and Rett syndrome, is introduced.

  11. Quantitative evaluation of brain development using anatomical MRI and diffusion tensor imaging.

    PubMed

    Oishi, Kenichi; Faria, Andreia V; Yoshida, Shoko; Chang, Linda; Mori, Susumu

    2013-11-01

The development of the brain is structure-specific, and the growth rate of each structure differs depending on the age of the subject. Magnetic resonance imaging (MRI) is often used to evaluate brain development because of the high spatial resolution and contrast that enable the observation of structure-specific developmental status. Currently, most clinical MRIs are evaluated qualitatively to assist in the clinical decision-making and diagnosis. The clinical MRI report usually does not provide quantitative values that can be used to monitor developmental status. Recently, the importance of image quantification to detect and evaluate mild-to-moderate anatomical abnormalities has been emphasized because these alterations are possibly related to several psychiatric disorders and learning disabilities. In the research arena, structural MRI and diffusion tensor imaging (DTI) have been widely applied to quantify brain development of the pediatric population. To interpret the values from these MR modalities, a "growth percentile chart," which describes the mean and standard deviation of the normal developmental curve for each anatomical structure, is required. Although efforts have been made to create such a growth percentile chart based on MRI and DTI, one of the greatest challenges is to standardize the anatomical boundaries of the measured anatomical structures. To avoid inter- and intra-reader variability in anatomical boundary definition, and hence to increase the precision of quantitative measurements, an automated structure parcellation method, customized for the neonatal and pediatric population, has been developed. This method enables quantification of multiple MR modalities using a common analytic framework. In this paper, the attempt to create an MRI- and a DTI-based growth percentile chart, followed by an application to investigate developmental abnormalities related to cerebral palsy, Williams syndrome, and Rett syndrome, is introduced.

  12. Quantitative evaluation of proteins in one- and two-dimensional polyacrylamide gels using a fluorescent stain.

    PubMed

    Nishihara, Julie C; Champion, Kathleen M

    2002-07-01

The characteristics of protein detection and quantitation with SYPRO Ruby protein gel stain in one- and two-dimensional polyacrylamide gels were evaluated. Sodium dodecyl sulfate-polyacrylamide gel electrophoresis (SDS-PAGE) analyses of three different purified recombinant proteins showed that the limits of detection were comparable to the limits of detection with ammoniacal silver staining and were protein-specific, ranging from 0.5 to 5 ng. The linearity of the relationship between protein level and SYPRO Ruby staining intensity also depended on the individual protein, with observed linear dynamic ranges of 200-, 500-, and 1000-fold for proteins analyzed by SDS-PAGE. SYPRO Ruby protein gel stain was also evaluated in two-dimensional electrophoretic (2-DE) analysis of Escherichia coli proteins. The experiment involved analysis of replicates of the same sample as well as dilution of the sample from 0.5 to 50 µg total protein across gels. In addition to validating the 2-DE system itself, the experiment was used to evaluate three different image analysis programs: Z3 (Compugen), Progenesis (Nonlinear Dynamics), and PDQuest (Bio-Rad). In each program, we analyzed the 2-DE images with respect to sensitivity and reproducibility of overall protein spot detection, as well as linearity of response for 20 representative proteins of different molecular weights and pI. Across all three programs, coefficients of variation (CV) in total number of spots detected among replicate gels ranged from 4 to 11%. For the 20 representative proteins, spot quantitation was also comparable, with CVs for gel-to-gel reproducibility ranging from 3 to 33%. Using Progenesis and PDQuest, a 1000-fold linear dynamic range of SYPRO Ruby was demonstrated with a single known protein. These two programs were more suitable than Z3 for examining individual protein spot quantity across a series of gels and gave comparable results.

  13. 35-mm film scanner as an intraoral dental radiograph digitizer. I: A quantitative evaluation.

    PubMed

    Shrout, M K; Potter, B J; Yurgalavage, H M; Hildebolt, C F; Vannier, M W

    1993-10-01

    A 35-mm slide scanner digital imaging system was tested for its suitability in digitizing intraoral dental radiographic film for quantitative studies. The system (Nikon model LS-3510AF Nikon Electronic Imaging, Nikon, Inc., Melville, N.Y.) uses a charge-coupled device linear photodiode array. The data content in the original film images was evaluated, and the system performance assessed objectively with the use of specially designed test films. Radiometric and geometric performances for the digitizing system were extracted from measurements and observations, and these were compared with published data for two other film digitizing systems (video camera DAGE MTI, Michigan City, Ind. and Barneyscan 35-mm film digitizer Barneyscan, Berkeley, Calif.). The techniques used to evaluate this system are easy and suitable for evaluation of any digitizing system. This scanner system (Nikon) was superior to previously evaluated systems in transforming and recording radiographic film densities across the range (0.3 to 2.0 optical density units) of clinically relevant optical densities. The scanner offers substantial advantage over the other digitizing systems for gray scale information from clinically important optical densities. PMID:8233432

  14. Evaluation of a Genetic Risk Score to Improve Risk Prediction for Alzheimer’s Disease

    PubMed Central

    Chouraki, Vincent; Reitz, Christiane; Maury, Fleur; Bis, Joshua C.; Bellenguez, Celine; Yu, Lei; Jakobsdottir, Johanna; Mukherjee, Shubhabrata; Adams, Hieab H.; Choi, Seung Hoan; Larson, Eric B.; Fitzpatrick, Annette; Uitterlinden, Andre G.; de Jager, Philip L.; Hofman, Albert; Gudnason, Vilmundur; Vardarajan, Badri; Ibrahim-Verbaas, Carla; van der Lee, Sven J.; Lopez, Oscar; Dartigues, Jean-François; Berr, Claudine; Amouyel, Philippe; Bennett, David A.; van Duijn, Cornelia; DeStefano, Anita L.; Launer, Lenore J.; Ikram, M. Arfan; Crane, Paul K.; Lambert, Jean-Charles; Mayeux, Richard; Seshadri, Sudha

    2016-01-01

Effective prevention of Alzheimer’s disease (AD) requires the development of risk prediction tools permitting preclinical intervention. We constructed a genetic risk score (GRS) comprising common genetic variants associated with AD, evaluated its association with incident AD and assessed its capacity to improve risk prediction over traditional models based on age, sex, education, and APOE ε4. In eight prospective cohorts included in the International Genomics of Alzheimer’s Project (IGAP), we derived a weighted sum of risk alleles from the 19 top SNPs reported by the IGAP GWAS in participants aged 65 and older without prevalent dementia. Hazard ratios (HR) of incident AD were estimated in Cox models. Improvement in risk prediction was measured by the difference in C-index (ΔC), the integrated discrimination improvement (IDI) and continuous net reclassification improvement (NRI>0). Overall, 19,687 participants at risk were included, of whom 2,782 developed AD. The GRS was associated with a 17% increase in AD risk (pooled HR = 1.17; 95%CI = [1.13–1.21] per standard deviation increase in GRS; p = 2.86 × 10⁻¹⁶). This association was stronger among persons with at least one APOE ε4 allele (HR for GRS = 1.24; 95%CI = [1.15–1.34]) than in others (HR for GRS = 1.13; 95%CI = [1.08–1.18]; p for interaction = 3.45 × 10⁻²). Risk prediction after seven years of follow-up showed a small improvement when adding the GRS to age, sex, APOE ε4, and education (ΔC = 0.0043 [0.0019–0.0067]). Similar patterns were observed for IDI and NRI>0. In conclusion, a risk score incorporating common genetic variation outside the APOE ε4 locus improved AD risk prediction and may facilitate risk stratification for prevention trials. PMID:27340842
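
    The GRS construction described above (a weighted sum of risk-allele dosages, expressed per standard deviation) can be sketched minimally as follows. The SNP names, weights, and dosages are invented placeholders, not the 19 IGAP SNPs or their published effect sizes.

```python
import math

# Hypothetical per-SNP weights (e.g., log odds ratios from a GWAS) and
# allele dosages (0, 1, or 2 copies of the risk allele) -- placeholders only.
weights = {"rs1": 0.10, "rs2": -0.05, "rs3": 0.18}

def genetic_risk_score(dosages, weights):
    """Weighted sum of risk-allele dosages across the scored SNPs."""
    return sum(weights[snp] * dosages.get(snp, 0) for snp in weights)

def standardize(scores):
    """Express scores per standard deviation, matching the reported HR scale."""
    n = len(scores)
    mean = sum(scores) / n
    sd = math.sqrt(sum((s - mean) ** 2 for s in scores) / (n - 1))
    return [(s - mean) / sd for s in scores]

cohort = [{"rs1": 2, "rs2": 1, "rs3": 0},
          {"rs1": 0, "rs2": 2, "rs3": 1},
          {"rs1": 1, "rs2": 0, "rs3": 2}]
raw = [genetic_risk_score(d, weights) for d in cohort]
z = standardize(raw)
```

    A hazard ratio "per standard deviation increase in GRS" then corresponds to a one-unit increase in the standardized score z.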

  15. Geneflow from GM plants--towards a more quantitative risk assessment.

    PubMed

    Poppy, Guy M

    2004-09-01

Assessing the risks associated with gene flow from GM crops to wild relatives is a significant scientific challenge. Most researchers have focused on assessing the frequency of gene flow, often on a localized scale, while ignoring the hazards that gene flow may cause. To quantify risk, multi-disciplinary research teams need to unite and scale up their studies.

  16. QUANTITATION OF MOLECULAR ENDPOINTS FOR THE DOSE-RESPONSE COMPONENT OF CANCER RISK ASSESSMENT

    EPA Science Inventory

Cancer risk assessment involves the steps of hazard identification, dose-response assessment, exposure assessment and risk characterization. The rapid advances in the use of molecular biology approaches have had an impact on all four components, but the greatest overall current...

  17. Multicenter Evaluation of the Elecsys Hepatitis B Surface Antigen Quantitative Assay ▿

    PubMed Central

    Zacher, B. J.; Moriconi, F.; Bowden, S.; Hammond, R.; Louisirirotchanakul, S.; Phisalprapa, P.; Tanwandee, T.; Wursthorn, K.; Brunetto, M. R.; Wedemeyer, H.; Bonino, F.

    2011-01-01

    The Elecsys hepatitis B surface antigen (HBsAg) II quantitative assay is a new quantitative electrochemiluminescence immunoassay which uses onboard dilution and a simple algorithm to determine HBsAg levels expressed in international units (IU)/ml (standardized against the World Health Organization [WHO] Second International Standard). This study evaluated its performance using routine serum samples from a wide range of HBsAg carriers and patients with chronic hepatitis B (CHB). HBsAg levels were measured in serum samples collected independently by five centers in Europe, Australia, and Asia. Serial dilution analyses were performed to assess the recommended dilution algorithm and determine the assay range free of hook effect. Assay precision was also established. Following assessment of serial dilutions (1:100 to 1:1,000,000) of the 611 samples analyzed, 70.0% and 85.6% of samples tested with analyzers incorporating 1:100 (Elecsys 2010 and cobas e 411) and 1:400 (Modular Analytics E170) onboard dilution, respectively, fell within the linear range of the assay, providing a final result on the first test. No high-dose hook effect was seen up to the maximum HBsAg serum level tested (870,000 IU/ml) using the dilution algorithm. HBsAg levels were reliably determined across all hepatitis B virus (HBV) genotypes, phases of HBV infection, and stages of disease tested. Precision was high across all analyzers (% coefficient of variation [CV], 1.4 to 9.6; HBsAg concentrations, 0.1 to 37,300 IU/ml). The Elecsys HBsAg II quantitative assay accurately and reliably quantifies HBsAg in routine clinical samples. Onboard dilution minimizes retesting and reduces the potential for error. PMID:21880853
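
    The retest-on-dilution logic described above can be sketched as follows: measure at the onboard dilution, and if the reading falls outside the assay's linear range, escalate the dilution and retest. The range limits and dilution ladder below are illustrative assumptions, not Elecsys specifications, and the instrument response is idealized.

```python
# Assumed on-instrument linear range (IU/ml) and dilution ladder -- both
# illustrative values, not manufacturer specifications.
LINEAR_LOW, LINEAR_HIGH = 0.05, 130.0
DILUTIONS = [100, 400, 4000, 40000, 1000000]

def quantify(true_iu_per_ml):
    """Return (reported IU/ml, number of tests) for a simulated sample,
    escalating the dilution until the reading is within the linear range."""
    tests = 0
    for factor in DILUTIONS:
        tests += 1
        reading = true_iu_per_ml / factor      # idealized instrument response
        if LINEAR_LOW <= reading <= LINEAR_HIGH:
            return reading * factor, tests
    return None, tests  # not quantifiable even at the maximum dilution
```

    A sample that falls within the linear range at the first (onboard) dilution is reported after a single test, which is the behavior the abstract highlights for 70.0% and 85.6% of samples.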

  18. Multicenter evaluation of the Elecsys hepatitis B surface antigen quantitative assay.

    PubMed

    Zacher, B J; Moriconi, F; Bowden, S; Hammond, R; Louisirirotchanakul, S; Phisalprapa, P; Tanwandee, T; Wursthorn, K; Brunetto, M R; Wedemeyer, H; Bonino, F

    2011-11-01

    The Elecsys hepatitis B surface antigen (HBsAg) II quantitative assay is a new quantitative electrochemiluminescence immunoassay which uses onboard dilution and a simple algorithm to determine HBsAg levels expressed in international units (IU)/ml (standardized against the World Health Organization [WHO] Second International Standard). This study evaluated its performance using routine serum samples from a wide range of HBsAg carriers and patients with chronic hepatitis B (CHB). HBsAg levels were measured in serum samples collected independently by five centers in Europe, Australia, and Asia. Serial dilution analyses were performed to assess the recommended dilution algorithm and determine the assay range free of hook effect. Assay precision was also established. Following assessment of serial dilutions (1:100 to 1:1,000,000) of the 611 samples analyzed, 70.0% and 85.6% of samples tested with analyzers incorporating 1:100 (Elecsys 2010 and cobas e 411) and 1:400 (Modular Analytics E170) onboard dilution, respectively, fell within the linear range of the assay, providing a final result on the first test. No high-dose hook effect was seen up to the maximum HBsAg serum level tested (870,000 IU/ml) using the dilution algorithm. HBsAg levels were reliably determined across all hepatitis B virus (HBV) genotypes, phases of HBV infection, and stages of disease tested. Precision was high across all analyzers (% coefficient of variation [CV], 1.4 to 9.6; HBsAg concentrations, 0.1 to 37,300 IU/ml). The Elecsys HBsAg II quantitative assay accurately and reliably quantifies HBsAg in routine clinical samples. Onboard dilution minimizes retesting and reduces the potential for error.

  19. Quantitative analysis of real-time tissue elastography for evaluation of liver fibrosis

    PubMed Central

    Shi, Ying; Wang, Xing-Hua; Zhang, Huan-Hu; Zhang, Hai-Qing; Tu, Ji-Zheng; Wei, Kun; Li, Juan; Liu, Xiao-Li

    2014-01-01

The present study aimed to investigate the feasibility of quantitative analysis of liver fibrosis using real-time tissue elastography (RTE) and its pathological and molecular biological basis. Methods: Fifty-four New Zealand rabbits were subcutaneously injected with thioacetamide (TAA) to induce liver fibrosis as the model group, and another eight New Zealand rabbits served as the normal control group. Four rabbits were randomly selected every two weeks for real-time tissue elastography (RTE) and quantitative analysis of tissue diffusion. The twelve characteristic quantities obtained were relative mean value (MEAN), standard deviation (SD), blue area % (% AREA), complexity (COMP), kurtosis (KURT), skewness (SKEW), contrast (CONT), entropy (ENT), inverse difference moment (IDM), angular second moment (ASM), correlation (CORR) and liver fibrosis index (LF Index). Rabbits were sacrificed and liver tissues were taken for pathological staging of liver fibrosis (grouped by pathological stage into S0, S1, S2, S3 and S4 groups). In addition, the collagen I (Col I) and collagen III (Col III) expression levels in liver tissue were detected by Western blot. Results: Except for KURT, there were significant differences among the other eleven characteristic quantities (P < 0.05). LF Index, Col I and Col III expression levels showed a rising trend with increased pathological staging of liver fibrosis, presenting a positive correlation with the pathological staging of liver fibrosis (r = 0.718, r = 0.693, r = 0.611, P < 0.05). Conclusion: RTE quantitative analysis shows promise for noninvasive evaluation of the pathological staging of liver fibrosis. PMID:24955175
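
    Several of the characteristic quantities listed above (CONT, ENT, ASM, IDM) are classic gray-level co-occurrence texture features. A minimal sketch of their computation, assuming the standard Haralick definitions rather than the RTE software's exact (unpublished) formulas:

```python
import numpy as np

def glcm_features(img, levels=4):
    """Haralick-style texture features (CONT, ASM, ENT, IDM) from a
    horizontal-neighbour co-occurrence matrix. img must contain integer
    gray levels in [0, levels). Illustrative only."""
    glcm = np.zeros((levels, levels))
    for row in img:
        for a, b in zip(row[:-1], row[1:]):
            glcm[a, b] += 1
    p = glcm / glcm.sum()
    i, j = np.indices(p.shape)
    cont = np.sum(p * (i - j) ** 2)              # contrast
    asm = np.sum(p ** 2)                          # angular second moment
    ent = -np.sum(p[p > 0] * np.log(p[p > 0]))    # entropy
    idm = np.sum(p / (1.0 + (i - j) ** 2))        # inverse difference moment
    return cont, asm, ent, idm

# A perfectly uniform image has zero contrast/entropy and maximal ASM/IDM.
uniform = np.zeros((8, 8), dtype=int)
cont_u, asm_u, ent_u, idm_u = glcm_features(uniform)
```

    In texture analysis of elastograms, rising heterogeneity of stiffness (as fibrosis progresses) typically raises contrast and entropy while lowering ASM and IDM.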

  20. Non-HDL Cholesterol and Evaluation of Cardiovascular Disease Risk

    PubMed Central

    2010-01-01

Cardiovascular disease (CVD), such as coronary heart disease (CHD), is the most frequent cause of death worldwide, especially in developed countries. The latest recommendations of European and American Cardiological Associations emphasize the role of non-HDL cholesterol (non-HDL-C) in evaluating the risk of CVD. Although this parameter has many advantages, it is rarely used by general practitioners in lipid profile assessment. The aim of this article is to present recent information on the use of non-HDL-C in the primary prevention of cardiovascular disease and to compare its diagnostic value with traditional and new CVD risk factors.

  1. Quantitative Assessment of Participant Knowledge and Evaluation of Participant Satisfaction in the CARES Training Program

    PubMed Central

    Goodman, Melody S.; Si, Xuemei; Stafford, Jewel D.; Obasohan, Adesuwa; Mchunguzi, Cheryl

    2016-01-01

Background The purpose of the Community Alliance for Research Empowering Social change (CARES) training program was to (1) train community members on evidence-based public health, (2) increase their scientific literacy, and (3) develop the infrastructure for community-based participatory research (CBPR). Objectives We assessed participant knowledge and evaluated participant satisfaction with the CARES training program to identify learning needs, obtain valuable feedback about the training, and ensure learning objectives were met through mutually beneficial CBPR approaches. Methods A baseline assessment was administered before the first training session, and a follow-up assessment and evaluation were administered after the final training session. At each training session, a pretest was administered before the session and a posttest and evaluation were administered at the end of the session. After training session six, a mid-training evaluation was administered. We analyzed results from quantitative questions on the assessments, pre- and post-tests, and evaluations. Results CARES fellows' knowledge increased at follow-up (75% of questions were answered correctly on average) compared with the baseline assessment (38% of questions were answered correctly on average); post-test scores were higher than pre-test scores in 9 out of 11 sessions. Fellows enjoyed the training and rated all sessions highly on the evaluations. Conclusions The CARES fellows training program was successful in participant satisfaction and in increasing community knowledge of public health, CBPR, and research methodology. Engaging and training community members in evidence-based public health research can develop an infrastructure for community–academic research partnerships. PMID:22982849

  2. Overcoming Learning Aversion in Evaluating and Managing Uncertain Risks.

    PubMed

    Cox, Louis Anthony Tony

    2015-10-01

    Decision biases can distort cost-benefit evaluations of uncertain risks, leading to risk management policy decisions with predictably high retrospective regret. We argue that well-documented decision biases encourage learning aversion, or predictably suboptimal learning and premature decision making in the face of high uncertainty about the costs, risks, and benefits of proposed changes. Biases such as narrow framing, overconfidence, confirmation bias, optimism bias, ambiguity aversion, and hyperbolic discounting of the immediate costs and delayed benefits of learning, contribute to deficient individual and group learning, avoidance of information seeking, underestimation of the value of further information, and hence needlessly inaccurate risk-cost-benefit estimates and suboptimal risk management decisions. In practice, such biases can create predictable regret in selection of potential risk-reducing regulations. Low-regret learning strategies based on computational reinforcement learning models can potentially overcome some of these suboptimal decision processes by replacing aversion to uncertain probabilities with actions calculated to balance exploration (deliberate experimentation and uncertainty reduction) and exploitation (taking actions to maximize the sum of expected immediate reward, expected discounted future reward, and value of information). We discuss the proposed framework for understanding and overcoming learning aversion and for implementing low-regret learning strategies using regulation of air pollutants with uncertain health effects as an example. PMID:26491992
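
    The exploration/exploitation balance discussed above is often illustrated with a multi-armed bandit. Below is a toy epsilon-greedy learner, a deliberately simple stand-in for the computational reinforcement learning models the abstract refers to; the arm means, epsilon, and noise level are arbitrary.

```python
import random

def epsilon_greedy(true_means, steps=5000, eps=0.1, seed=0):
    """Minimal bandit learner: with probability eps take a random action
    (exploration, i.e., deliberate experimentation and uncertainty
    reduction); otherwise take the empirically best action (exploitation)."""
    rng = random.Random(seed)
    n_arms = len(true_means)
    counts = [0] * n_arms
    est = [0.0] * n_arms          # running-mean reward estimates
    for _ in range(steps):
        if rng.random() < eps:
            arm = rng.randrange(n_arms)                      # explore
        else:
            arm = max(range(n_arms), key=est.__getitem__)    # exploit
        reward = true_means[arm] + rng.gauss(0, 0.1)         # noisy payoff
        counts[arm] += 1
        est[arm] += (reward - est[arm]) / counts[arm]        # update mean
    return est, counts

est, counts = epsilon_greedy([0.2, 0.5, 0.8])
```

    Even this crude strategy concentrates effort on the best option while still sampling the alternatives, which is the sense in which such learners avoid both premature commitment and paralysis under uncertainty.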

  4. Assessing the risk of impact of farming intensification on calcareous grasslands in Europe: a quantitative implementation of the MIRABEL framework.

    PubMed

    Petit, Sandrine; Elbersen, Berien

    2006-09-01

Intensification of farming practices is still a major driver of biodiversity loss in Europe, despite the implementation of policies that aim to reverse this trend. A conceptual framework called MIRABEL was previously developed that enabled a qualitative, expert-based assessment of the impact of agricultural intensification on ecologically valuable habitats. We present a quantitative update of the previous assessment that uses newly available pan-European spatially explicit data on pressures and habitats at risk. This quantitative assessment shows that the number of calcareous grasslands potentially at risk of eutrophication and overgrazing is rapidly increasing in Europe. Decreases in nitrogen surpluses and stocking densities that occurred between 1990 and 2000 have rarely led to values below the ecological thresholds. At the same time, a substantial proportion of calcareous grassland that has so far experienced low values for indicators of farming intensification faced increases between 1990 and 2000 and could well become at high risk from farming intensification in the near future. As such, this assessment is an early warning signal, especially for habitats located in areas that have traditionally been farmed extensively. When comparing the outcome of this assessment with the previous qualitative MIRABEL assessment, it appears that while pan-European data are useful for assessing the intensity of the pressures, more work is needed to identify regional variations in the response of biodiversity to such pressures. This is where a qualitative approach based on regional expertise should be used to complement data-driven assessments. PMID:17240762

  5. Mechanism of variable structural colour in the neon tetra: quantitative evaluation of the Venetian blind model

    PubMed Central

    Yoshioka, S.; Matsuhana, B.; Tanaka, S.; Inouye, Y.; Oshima, N.; Kinoshita, S.

    2011-01-01

    The structural colour of the neon tetra is distinguishable from those of, e.g., butterfly wings and bird feathers, because it can change in response to the light intensity of the surrounding environment. This fact clearly indicates the variability of the colour-producing microstructures. It has been known that an iridophore of the neon tetra contains a few stacks of periodically arranged light-reflecting platelets, which can cause multilayer optical interference phenomena. As a mechanism of the colour variability, the Venetian blind model has been proposed, in which the light-reflecting platelets are assumed to be tilted during colour change, resulting in a variation in the spacing between the platelets. In order to quantitatively evaluate the validity of this model, we have performed a detailed optical study of a single stack of platelets inside an iridophore. In particular, we have prepared a new optical system that can simultaneously measure both the spectrum and direction of the reflected light, which are expected to be closely related to each other in the Venetian blind model. The experimental results and detailed analysis are found to quantitatively verify the model. PMID:20554565
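
    The multilayer interference at the heart of the Venetian blind model can be sketched with the standard ideal-multilayer condition at normal incidence, lambda = 2 (n_a d_a + n_b d_b) / m. The refractive indices and thicknesses below are illustrative ballpark values for guanine platelets in cytoplasm, not measurements from the paper.

```python
def multilayer_peak_nm(n_plate, d_plate, n_cyto, d_cyto, order=1):
    """Constructive-interference peak wavelength (nm) of an ideal two-material
    multilayer at normal incidence; thicknesses in nm. Illustrative only."""
    return 2.0 * (n_plate * d_plate + n_cyto * d_cyto) / order

# Guanine platelets (n ~ 1.83) separated by cytoplasm (n ~ 1.33). In the
# Venetian blind picture, tilting the platelets effectively changes the
# cytoplasm spacing, shifting the reflection peak.
blue = multilayer_peak_nm(1.83, 70, 1.33, 80)    # narrower spacing
green = multilayer_peak_nm(1.83, 70, 1.33, 110)  # wider spacing
```

    Widening the effective spacing shifts the peak toward longer wavelengths, which is the qualitative colour change the model attributes to platelet tilt.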

  6. Introduction of a method for quantitative evaluation of spontaneous motor activity development with age in infants.

    PubMed

    Disselhorst-Klug, Catherine; Heinze, Franziska; Breitbach-Faller, Nico; Schmitz-Rode, Thomas; Rau, Günter

    2012-04-01

Coordination between perception and action is required to interact with the environment successfully. This is already practiced by very young infants, who perform spontaneous movements to learn how their body interacts with the environment. The strategies used by infants for this purpose change with age. Therefore, very early progress in action control can be investigated by monitoring the development of spontaneous motor activity. In this paper, an objective method is introduced that allows the quantitative evaluation of the development of spontaneous motor activity in newborns. The introduced methodology is based on the acquisition of spontaneous movement trajectories of the feet by 3D movement analysis and subsequent calculation of specific movement parameters from them. With these movement-based parameters, it was possible to provide an objective description of age-dependent developmental steps in healthy newborns younger than 6 months. Furthermore, it has been shown that pathologies like infantile cerebral palsy influence the development of motor activity significantly. Since the introduced methodology is objective and quantitative, it is suitable for monitoring how newborns train the cognitive processes that will enable them to cope with their environment by motor interaction.
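
    Movement-based parameters of the kind described, derived from 3D trajectories of the feet, can be as simple as path length and mean speed. A minimal sketch; the paper's actual parameter set is not reproduced here, and the sample trajectory is made up.

```python
import math

def movement_parameters(traj, dt):
    """Path length and mean speed of a sampled 3D trajectory.
    traj is a list of (x, y, z) positions; dt is the sampling interval."""
    length = 0.0
    for p0, p1 in zip(traj[:-1], traj[1:]):
        length += math.dist(p0, p1)       # Euclidean segment length
    duration = dt * (len(traj) - 1)
    return length, length / duration

# Toy foot trajectory sampled at dt = 0.5 s (units arbitrary).
traj = [(0, 0, 0), (3, 4, 0), (3, 4, 12)]
length, speed = movement_parameters(traj, dt=0.5)
```

    Richer descriptors (e.g., movement smoothness or frequency content) would build on the same trajectory data.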

  7. Evaluation of static and dynamic perfusion cardiac computed tomography for quantitation and classification tasks.

    PubMed

    Bindschadler, Michael; Modgil, Dimple; Branch, Kelley R; La Riviere, Patrick J; Alessio, Adam M

    2016-04-01

Cardiac computed tomography (CT) acquisitions for perfusion assessment can be performed in a dynamic or static mode. Either method may be used for a variety of clinical tasks, including (1) stratifying patients into categories of ischemia and (2) using a quantitative myocardial blood flow (MBF) estimate to evaluate disease severity. In this simulation study, we compare method performance on these classification and quantification tasks for matched radiation dose levels and for different flow states, patient sizes, and injected contrast levels. Under conditions simulated, the dynamic method has low bias in MBF estimates (0 to [Formula: see text]) compared to linearly interpreted static assessment (0.45 to [Formula: see text]), making it more suitable for quantitative estimation. At matched radiation dose levels, receiver operating characteristic analysis demonstrated that the static method, with its high bias but generally lower variance, had superior performance ([Formula: see text]) in stratifying patients, especially for larger patients and lower contrast doses (area under the curve, [Formula: see text] to 0.96 versus 0.86). We also demonstrate that static assessment with a correctly tuned exponential relationship between the apparent CT number and MBF has superior quantification performance to static assessment with a linear relationship and to dynamic assessment. However, tuning the exponential relationship to the patient and scan characteristics will likely prove challenging. This study demonstrates that the selection and optimization of static or dynamic acquisition modes should depend on the specific clinical task.
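
    The receiver operating characteristic analysis used to compare the two acquisition modes rests on the area under the curve, which equals the Mann-Whitney probability that a positive case outscores a negative one (ties counting half). A self-contained sketch with made-up classifier outputs:

```python
def auc(scores_pos, scores_neg):
    """Area under the ROC curve via the Mann-Whitney statistic: the
    probability that a positive case scores higher than a negative one."""
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))

# Hypothetical per-patient scores (e.g., an ischemia index) -- not study data.
ischemic = [0.9, 0.8, 0.75, 0.6]
healthy = [0.7, 0.4, 0.3, 0.2]
```

    An AUC of 1.0 means perfect separation and 0.5 means chance, so comparing AUCs at matched dose is a natural way to rank the two modes on the stratification task.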

  8. Evaluation of the quantitative performances of supercritical fluid chromatography: from method development to validation.

    PubMed

    Dispas, Amandine; Lebrun, Pierre; Ziemons, Eric; Marini, Roland; Rozet, Eric; Hubert, Philippe

    2014-08-01

Recently, the number of papers about SFC has increased drastically, but scientists have not truly focused on the quantitative performance of this technique. In order to demonstrate the potential of UHPSFC, the present work discusses the different steps of the analytical life cycle of a method: from development to validation and application. Moreover, the quantitative performance of UHPSFC was evaluated in comparison with UHPLC, which is the main technique used for quality control in the pharmaceutical industry and can therefore be considered a reference. The methods were developed using a Design Space strategy, leading to the optimization of a robust method. In this context, when the Design Space optimization guarantees quality, no further robustness study is required prior to validation. The methods were then geometrically transferred in order to reduce the analysis time. The UHPSFC and UHPLC methods were validated based on the total error approach using accuracy profiles. Even though UHPLC showed better precision and sensitivity, the UHPSFC method is able to give accurate results over a dosing range larger than the 80-120% range required by the European Medicines Agency. Consequently, UHPSFC results are valid and could be used for the control of the active substance in a finished pharmaceutical product. Finally, the validated UHPSFC method was used to analyze real samples and gave results similar to those of the reference method (UHPLC). PMID:24513349
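
    The total-error approach behind an accuracy profile can be sketched as checking, at each concentration level, that the interval bias ± k·SD stays within acceptance limits. The coverage factor k and the ±10% limits below are illustrative choices, not the validated method's actual settings, and the replicate measurements are invented.

```python
import statistics

def accuracy_profile(measured, nominal, k=2.0, limit_pct=10.0):
    """Total-error check at one concentration level: the interval
    (relative bias +/- k * relative SD) must lie within +/- limit_pct."""
    rel = [100.0 * (m - nominal) / nominal for m in measured]
    bias = statistics.mean(rel)        # relative bias, %
    sd = statistics.stdev(rel)         # relative intermediate precision, %
    low, high = bias - k * sd, bias + k * sd
    return bias, sd, (-limit_pct <= low and high <= limit_pct)

bias, sd, ok = accuracy_profile([98.0, 101.0, 99.5, 100.5, 101.0], 100.0)
bias2, sd2, ok2 = accuracy_profile([120.0, 118.0, 122.0], 100.0)
```

    Repeating this check across the dosing range traces out the accuracy profile; the validated range is where the interval stays inside the limits.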

  9. Evaluation of a Quantitative Serological Assay for Diagnosing Chronic Pulmonary Aspergillosis

    PubMed Central

    Fujita, Yuka; Suzuki, Hokuto; Doushita, Kazushi; Kuroda, Hikaru; Takahashi, Masaaki; Yamazaki, Yasuhiro; Tsuji, Tadakatsu; Fujikane, Toshiaki; Osanai, Shinobu; Sasaki, Takaaki; Ohsaki, Yoshinobu

    2016-01-01

    The purpose of this study was to evaluate the clinical utility of a quantitative Aspergillus IgG assay for diagnosing chronic pulmonary aspergillosis. We examined Aspergillus-specific IgG levels in patients who met the following criteria: (i) chronic (duration of >3 months) pulmonary or systemic symptoms, (ii) radiological evidence of a progressive (over months or years) pulmonary lesion with surrounding inflammation, and (iii) no major discernible immunocompromising factors. Anti-Aspergillus IgG serum levels were retrospectively analyzed according to defined classifications. Mean Aspergillus IgG levels were significantly higher in the proven group than in the possible and control groups (P < 0.01). Receiver operating characteristic curve analysis revealed that the Aspergillus IgG cutoff value for diagnosing proven cases was 50 mg of antigen-specific antibodies/liter (area under the curve, 0.94; sensitivity, 0.98; specificity, 0.84). The sensitivity and specificity for diagnosing proven cases using this cutoff were 0.77 and 0.78, respectively. The positive rates of Aspergillus IgG in the proven and possible groups were 97.9% and 39.2%, respectively, whereas that of the control group was 6.6%. The quantitative Aspergillus IgG assay offers reliable sensitivity and specificity for diagnosing chronic pulmonary aspergillosis and may be an alternative to the conventional precipitin test. PMID:27008878
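    The cutoff evaluation above follows the usual pattern: choose a threshold on the quantitative IgG value, then count how the known groups fall on either side. A toy sketch of that computation (the IgG levels below are invented for illustration, not the study's data):

```python
def sens_spec(proven, controls, cutoff):
    """Sensitivity/specificity of a 'positive if value >= cutoff' rule."""
    sens = sum(v >= cutoff for v in proven) / len(proven)     # true-positive rate
    spec = sum(v < cutoff for v in controls) / len(controls)  # true-negative rate
    return sens, spec

# Hypothetical Aspergillus-specific IgG levels (mg of antibodies/liter).
proven_cases = [60, 80, 45, 120, 55]
controls     = [10, 20, 55, 30, 40]
print(sens_spec(proven_cases, controls, cutoff=50))  # (0.8, 0.8)
```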

  10. Quantitative evaluation of hidden defects in cast iron components using ultrasound activated lock-in vibrothermography

    SciTech Connect

    Montanini, R.; Freni, F.; Rossi, G. L.

    2012-09-15

    This paper reports one of the first experimental results on the application of ultrasound activated lock-in vibrothermography for quantitative assessment of buried flaws in complex cast parts. The use of amplitude modulated ultrasonic heat generation allowed selective response of defective areas within the part, as the defect itself is turned into a local thermal wave emitter. Quantitative evaluation of hidden damages was accomplished by estimating independently both the area and the depth extension of the buried flaws, while x-ray 3D computed tomography was used as reference for sizing accuracy assessment. To retrieve flaw's area, a simple yet effective histogram-based phase image segmentation algorithm with automatic pixels classification has been developed. A clear correlation was found between the thermal (phase) signature measured by the infrared camera on the target surface and the actual mean cross-section area of the flaw. Due to the very fast cycle time (<30 s/part), the method could potentially be applied for 100% quality control of casting components.
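    The abstract describes the segmentation step only at a high level ("histogram-based phase image segmentation with automatic pixel classification"). A common automatic-threshold technique of this general kind is Otsu's method, sketched below on a flat list of phase values; this is a generic stand-in under that assumption, not the authors' specific algorithm:

```python
def otsu_threshold(values, nbins=32):
    """Pick the threshold that maximizes between-class variance of a histogram."""
    lo, hi = min(values), max(values)
    width = (hi - lo) / nbins or 1.0
    hist = [0] * nbins
    for v in values:
        hist[min(int((v - lo) / width), nbins - 1)] += 1
    total = len(values)
    sum_all = sum(i * h for i, h in enumerate(hist))
    w0 = sum0 = 0
    best_var, best_bin = -1.0, 0
    for t in range(nbins - 1):
        w0 += hist[t]
        sum0 += t * hist[t]
        w1 = total - w0
        if w0 == 0 or w1 == 0:
            continue
        m0, m1 = sum0 / w0, (sum_all - sum0) / w1
        var_between = w0 * w1 * (m0 - m1) ** 2
        if var_between > best_var:
            best_var, best_bin = var_between, t
    return lo + (best_bin + 1) * width  # upper edge of the chosen bin

# Bimodal "phase" data: background pixels near 0.1 rad, defect pixels near 0.9 rad.
phases = [0.10, 0.12, 0.11, 0.09, 0.13] * 4 + [0.88, 0.92, 0.90, 0.91]
thr = otsu_threshold(phases)
defect_pixels = [p for p in phases if p >= thr]  # flaw area ~ count * pixel size
```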

  11. Proteus mirabilis biofilm - Qualitative and quantitative colorimetric methods-based evaluation

    PubMed Central

    Kwiecinska-Piróg, Joanna; Bogiel, Tomasz; Skowron, Krzysztof; Wieckowska, Ewa; Gospodarek, Eugenia

    2014-01-01

    The ability of Proteus mirabilis strains to form biofilm is a current topic of research worldwide. In this study, the biofilm formation of P. mirabilis strains derived from the urine of catheterized and non-catheterized patients was investigated. A total of 39 P. mirabilis strains isolated from urine samples of patients of the Dr Antoni Jurasz University Hospital No. 1 clinics in Bydgoszcz between 2011 and 2012 were used. Biofilm formation was evaluated using two independent quantitative and qualitative methods based on TTC (2,3,5-triphenyl-tetrazolium chloride) and CV (crystal violet). The results confirmed biofilm formation by all the examined strains, except in the quantitative TTC method, in which 7.7% of the strains did not show this ability. P. mirabilis rods were shown to form biofilm on the surfaces of both biomaterials applied, polystyrene and polyvinyl chloride (Nelaton catheters). The differences in biofilm-forming ability observed between P. mirabilis strains derived from the urine of catheterized and non-catheterized patients were not statistically significant. PMID:25763050

  12. Mechanism of variable structural colour in the neon tetra: quantitative evaluation of the Venetian blind model.

    PubMed

    Yoshioka, S; Matsuhana, B; Tanaka, S; Inouye, Y; Oshima, N; Kinoshita, S

    2011-01-01

    The structural colour of the neon tetra is distinguishable from those of, e.g., butterfly wings and bird feathers, because it can change in response to the light intensity of the surrounding environment. This fact clearly indicates the variability of the colour-producing microstructures. It has been known that an iridophore of the neon tetra contains a few stacks of periodically arranged light-reflecting platelets, which can cause multilayer optical interference phenomena. As a mechanism of the colour variability, the Venetian blind model has been proposed, in which the light-reflecting platelets are assumed to be tilted during colour change, resulting in a variation in the spacing between the platelets. In order to quantitatively evaluate the validity of this model, we have performed a detailed optical study of a single stack of platelets inside an iridophore. In particular, we have prepared a new optical system that can simultaneously measure both the spectrum and direction of the reflected light, which are expected to be closely related to each other in the Venetian blind model. The experimental results and detailed analysis are found to quantitatively verify the model.
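    For an ideal periodic two-layer stack, the first-order constructive-interference peak at normal incidence sits at λ = 2(n₁d₁ + n₂d₂); in the Venetian blind model, tilting the platelets effectively changes the spacing term and shifts this peak. A sketch with illustrative values (guanine-like platelets in cytoplasm; the thicknesses and the simple "tilt shrinks the gap" step are assumptions for illustration, not measurements from the paper):

```python
def reflection_peak_nm(n_plate, d_plate_nm, n_gap, d_gap_nm):
    """First-order multilayer interference peak at normal incidence (nm)."""
    return 2.0 * (n_plate * d_plate_nm + n_gap * d_gap_nm)

# Illustrative values: guanine platelets (n ~ 1.83) in cytoplasm (n ~ 1.33).
peak_rest = reflection_peak_nm(1.83, 70.0, 1.33, 92.0)   # ~501 nm, blue-green
# Venetian-blind-style tilt: the effective gap shrinks, blue-shifting the peak.
peak_tilted = reflection_peak_nm(1.83, 70.0, 1.33, 70.0)
print(peak_rest, peak_tilted)
```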

  13. Quantitative and Qualitative Evaluation of Iranian Researchers’ Scientific Production in Dentistry Subfields

    PubMed Central

    Yaminfirooz, Mousa; Motallebnejad, Mina; Gholinia, Hemmat; Esbakian, Somayeh

    2015-01-01

    Background: As in other fields of medicine, scientific production in the field of dentistry has a significant place. This study aimed at quantitatively and qualitatively evaluating Iranian researchers' scientific output in the field of dentistry and determining their contribution in each of the dentistry subfields and branches. Methods: This research was a scientometric study that applied quantitative and qualitative indices of Web of Science (WoS). The research population consisted of 927 indexed documents published under the name of Iran in the time span of 1993-2012, extracted from WoS on 10 March 2013. The Mann-Whitney test and Pearson correlation coefficient were used for data analysis in SPSS 19. Results: 777 (83.73%) of the indexed items of all scientific output in WoS were scientific articles. The highest growth rate of scientific production, 90%, belonged to the endodontics subfield. The correlation coefficient test showed a significant positive relationship between the number of documents and their publication age (P < 0.0001). There was a significant difference between the mean number of published articles in the first ten-year period (1993-2003) and that of the second (2004-2013), in favor of the latter (P = 0.001). Conclusions: The distribution frequencies of scientific production in the various subfields of dentistry were very different. The infrastructure needs to be reinforced for more balanced scientific production in the field and its related subfields. PMID:26635439

  15. Potential impacts of radon, terrestrial gamma and cosmic rays on childhood leukemia in France: a quantitative risk assessment.

    PubMed

    Laurent, Olivier; Ancelet, Sophie; Richardson, David B; Hémon, Denis; Ielsch, Géraldine; Demoury, Claire; Clavel, Jacqueline; Laurier, Dominique

    2013-05-01

    Previous epidemiological studies and quantitative risk assessments (QRA) have suggested that natural background radiation may be a cause of childhood leukemia. The present work uses a QRA approach to predict the excess risk of childhood leukemia in France related to three components of natural radiation: radon, cosmic rays and terrestrial gamma rays, using excess relative and absolute risk models proposed by the United Nations Scientific Committee on the Effects of Atomic Radiation (UNSCEAR). Both models were developed from the Life Span Study (LSS) of Japanese A-bomb survivors. Previous risk assessments were extended by considering uncertainties in radiation-related leukemia risk model parameters as part of this process, within a Bayesian framework. Estimated red bone marrow doses cumulated during childhood by the average French child due to radon, terrestrial gamma and cosmic rays are 4.4, 7.5 and 4.3 mSv, respectively. The excess fractions of cases (expressed as percentages) associated with these sources of natural radiation are 20 % [95 % credible interval (CI) 0-68 %] and 4 % (95 % CI 0-11 %) under the excess relative and excess absolute risk models, respectively. The large CIs, as well as the different point estimates obtained under these two models, highlight the uncertainties in predictions of radiation-related childhood leukemia risks. These results are only valid provided that models developed from the LSS can be transferred to the population of French children and to chronic natural radiation exposures, and must be considered in view of the currently limited knowledge concerning other potential risk factors for childhood leukemia. Last, they emphasize the need for further epidemiological investigations of the effects of natural radiation on childhood leukemia to reduce uncertainties and help refine radiation protection standards.
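    Under an excess relative risk (ERR) model, the excess fraction of cases attributable to an exposure is EF = ERR/(1 + ERR). A minimal sketch of that calculation, using the study's cumulated childhood doses but a hypothetical linear ERR slope (not the UNSCEAR parameter):

```python
def excess_fraction(err):
    """Fraction of cases attributable to exposure under a relative-risk model."""
    return err / (1.0 + err)

# Cumulated red-bone-marrow doses for the average French child (from the study):
dose_sv = (4.4 + 7.5 + 4.3) / 1000.0    # radon + terrestrial gamma + cosmic, in Sv
beta_per_sv = 15.0                       # hypothetical linear ERR slope, for illustration
ef = excess_fraction(beta_per_sv * dose_sv)
print(ef)  # excess fraction implied by these assumed inputs
```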

  16. Evaluating biomarkers to model cancer risk post cosmic ray exposure

    NASA Astrophysics Data System (ADS)

    Sridharan, Deepa M.; Asaithamby, Aroumougame; Blattnig, Steve R.; Costes, Sylvain V.; Doetsch, Paul W.; Dynan, William S.; Hahnfeldt, Philip; Hlatky, Lynn; Kidane, Yared; Kronenberg, Amy; Naidu, Mamta D.; Peterson, Leif E.; Plante, Ianik; Ponomarev, Artem L.; Saha, Janapriya; Snijders, Antoine M.; Srinivasan, Kalayarasan; Tang, Jonathan; Werner, Erica; Pluth, Janice M.

    2016-06-01

    Robust predictive models are essential to manage the risk of radiation-induced carcinogenesis. Chronic exposure to cosmic rays in the context of the complex deep space environment may place astronauts at high cancer risk. To estimate this risk, it is critical to understand how radiation-induced cellular stress impacts cell fate decisions and how this in turn alters the risk of carcinogenesis. Exposure to the heavy ion component of cosmic rays triggers a multitude of cellular changes, depending on the rate of exposure, the type of damage incurred and individual susceptibility. Heterogeneity in dose, dose rate, radiation quality, energy and particle flux contribute to the complexity of risk assessment. To unravel the impact of each of these factors, it is critical to identify sensitive biomarkers that can serve as inputs for robust modeling of individual risk of cancer or other long-term health consequences of exposure. Limitations in sensitivity of biomarkers to dose and dose rate, and the complexity of longitudinal monitoring, are some of the factors that increase uncertainties in the output from risk prediction models. Here, we critically evaluate candidate early and late biomarkers of radiation exposure and discuss their usefulness in predicting cell fate decisions. Some of the biomarkers we have reviewed include complex clustered DNA damage, persistent DNA repair foci, reactive oxygen species, chromosome aberrations and inflammation. Other biomarkers discussed, often assayed for at longer points post exposure, include mutations, chromosome aberrations, reactive oxygen species and telomere length changes. We discuss the relationship of biomarkers to different potential cell fates, including proliferation, apoptosis, senescence, and loss of stemness, which can propagate genomic instability and alter tissue composition and the underlying mRNA signatures that contribute to cell fate decisions. Our goal is to highlight factors that are important in choosing

  19. Comparative evaluation of six quantitative lifting tools to estimate spine loads during static activities.

    PubMed

    Rajaee, Mohammad Ali; Arjmand, Navid; Shirazi-Adl, Aboulfazl; Plamondon, André; Schmidt, Hendrik

    2015-05-01

    Different lifting analysis tools are commonly used to assess spinal loads and the risk of injury. Distinct musculoskeletal models with various degrees of accuracy are employed in these tools, thus affecting their relative accuracy in practical applications. The present study aims to compare the predictions of six tools (HCBCF, LSBM, 3DSSPP, AnyBody, simple polynomial, and regression models) for the L4-L5 and L5-S1 compression and shear loads in twenty-six static activities with and without hand load. Significantly different spinal loads but relatively similar patterns for the compression (R² > 0.87) were computed. Regression models and AnyBody predicted intradiscal pressures in closer agreement with available in vivo measurements (RMSE ≈ 0.12 MPa). Due to the differences in predicted spinal loads, the estimated risk of injury differs depending on the tool used. Each tool is evaluated to identify its shortcomings and preferred application domains.

  20. Quantitative Risk Assessment for African Horse Sickness in Live Horses Exported from South Africa

    PubMed Central

    Sergeant, Evan S.

    2016-01-01

    African horse sickness (AHS) is a severe, often fatal, arbovirus infection of horses, transmitted by Culicoides spp. midges. AHS occurs in most of sub-Saharan Africa and is a significant impediment to the export of live horses from infected countries, such as South Africa. A stochastic risk model was developed to estimate the probability of exporting an undetected AHS-infected horse through a vector-protected pre-export quarantine facility, in accordance with OIE recommendations for trade from an infected country. The model also allows for additional risk management measures, including multiple PCR tests prior to and during pre-export quarantine and optionally during post-arrival quarantine, as well as for comparison of the risk associated with exports from a demonstrated low-risk area for AHS and from an area where AHS is endemic. If 1 million horses were exported from the low-risk area with no post-arrival quarantine, we estimate the median number of infected horses to be 5.4 (95% prediction interval 0.5 to 41). This equates to an annual probability of 0.0016 (95% PI: 0.00015 to 0.012), assuming 300 horses exported per year. An additional PCR test while in vector-protected post-arrival quarantine reduced these probabilities by approximately 12-fold. Probabilities for horses exported from an area where AHS is endemic were approximately 15 to 17 times higher than for horses exported from the low-risk area under comparable scenarios. The probability of undetected AHS infection in horses exported from an infected country can be minimised by appropriate risk management measures. The final choice of risk management measures depends on the level of risk acceptable to the importing country. PMID:26986002
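    The step in the abstract from a per-horse probability to an annual probability follows the standard independent-trials formula P_year = 1 − (1 − p)^n. Reproducing it with the study's own numbers (5.4 infected horses per million exported, 300 exports per year) recovers the reported 0.0016:

```python
def annual_probability(p_per_horse, horses_per_year=300):
    """Probability of at least one undetected infected horse exported in a year,
    treating each exported horse as an independent trial."""
    return 1.0 - (1.0 - p_per_horse) ** horses_per_year

p_horse = 5.4 / 1_000_000           # median from the model: 5.4 per million exports
print(annual_probability(p_horse))  # ~0.0016, matching the reported figure
```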

  3. QUANTITATIVE NON-DESTRUCTIVE EVALUATION (QNDE) OF THE ELASTIC MODULI OF POROUS TIAL ALLOYS

    SciTech Connect

    Yeheskel, O.

    2008-02-28

    The elastic moduli of γ-TiAl were studied in porous samples consolidated by various techniques, e.g. cold isostatic pressing (CIP), pressure-less sintering, or hot isostatic pressing (HIP). Porosity linearly affects the dynamic elastic moduli of the samples. The results indicate that the sound wave velocities and the elastic moduli are affected by the processing route and depend not only on the attained density but also on the consolidation temperature. In this paper we show that there is a linear correlation between the shear and the longitudinal sound velocities in porous TiAl. This opens the way to using a single sound velocity as a tool for quantitative non-destructive evaluation (QNDE) of porous TiAl alloys. Here we demonstrate the applicability of an equation derived from elastic theory and used previously for porous cubic metals.
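    The connection between sound velocities and elastic moduli that this kind of QNDE relies on comes from isotropic elasticity: the shear modulus is G = ρv_s², and Young's modulus and the Poisson ratio follow from both velocities. A sketch with illustrative (not measured) values for a dense TiAl-like alloy:

```python
def elastic_moduli(rho, v_l, v_s):
    """Isotropic elastic constants from density (kg/m^3) and the longitudinal
    and shear sound velocities (m/s); returns (E, G, nu) in SI units."""
    G = rho * v_s ** 2                                              # shear modulus
    E = rho * v_s ** 2 * (3 * v_l ** 2 - 4 * v_s ** 2) / (v_l ** 2 - v_s ** 2)
    nu = (v_l ** 2 - 2 * v_s ** 2) / (2 * (v_l ** 2 - v_s ** 2))    # Poisson ratio
    return E, G, nu

# Illustrative inputs for a dense TiAl-like alloy (hypothetical values).
E, G, nu = elastic_moduli(rho=3900.0, v_l=7000.0, v_s=4000.0)
print(E / 1e9, G / 1e9, nu)  # moduli in GPa
```

The three outputs are mutually consistent: E = 2G(1 + ν) holds identically for these formulas, which is a quick sanity check on any implementation.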

  4. An experimental method for quantitatively evaluating the elemental processes of indoor radioactive aerosol behavior.

    PubMed

    Yamazawa, H; Yamada, S; Xu, Y; Hirao, S; Moriizumi, J

    2015-11-01

    An experimental method for quantitatively evaluating the elemental processes governing the indoor behaviour of naturally occurring radioactive aerosols was proposed. This method utilises the transient response of aerosol concentrations to an artificial change in the aerosol removal rate caused by turning an air purifier on and off. It was shown that the indoor-outdoor exchange rate and the indoor deposition rate could be estimated from continuous measurements of outdoor and indoor aerosol number concentrations with the method proposed in this study. Although the scatter of the estimated parameters is relatively large, both methods gave consistent results. It was also found that the size distribution of the radioactive aerosol particles, and hence the activity median aerodynamic diameter, was not greatly affected by the operation of the air purifier, implying the predominance of the exchange and deposition processes over other processes that change the size distribution, such as size growth by coagulation and the size dependence of deposition.
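    The estimation idea can be illustrated with a single-zone box model, dC_in/dt = λ_ex·C_out − (λ_ex + λ_dep + λ_p)·C_in, where λ_p is the purifier's removal rate: comparing the steady-state indoor concentrations with the purifier off (C_off) and on (C_on) yields both unknown rates in closed form. This is a simplified sketch of the approach, not the authors' exact procedure, and the numbers below are synthetic, chosen so the recovered rates are known in advance:

```python
def estimate_rates(c_out, c_off, c_on, lam_purifier):
    """Recover exchange (lam_ex) and deposition (lam_dep) rates, in 1/h, from
    two steady states of dC/dt = lam_ex*c_out - (lam_ex + lam_dep [+ lam_p])*C."""
    lam_ex = lam_purifier * c_off * c_on / (c_out * (c_off - c_on))
    lam_dep = lam_ex * (c_out / c_off - 1.0)
    return lam_ex, lam_dep

# Synthetic check: steady states generated with lam_ex=0.5, lam_dep=0.3, lam_p=2.0.
c_out = 1000.0                             # outdoor number concentration
c_off = 0.5 * c_out / (0.5 + 0.3)          # purifier off: 625.0
c_on  = 0.5 * c_out / (0.5 + 0.3 + 2.0)    # purifier on:  ~178.6
print(estimate_rates(c_out, c_off, c_on, lam_purifier=2.0))  # ~(0.5, 0.3)
```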

  5. [Quantitative evaluation of film-screen combinations for x-ray diagnosis].

    PubMed

    Bronder, T; Heinze-Assmann, R

    1988-05-01

    The properties of screen/film combinations for radiographs set a lower limit on the x-ray exposure of the patient and an upper limit on the quality of the x-ray picture. Sensitivity, slope, and resolution of different screen/film combinations were determined using a measuring phantom developed at the PTB. For all screens used, the measurements show the same relation between screen sensitivity and resolution. This allows a quantitative evaluation of image quality. A classification scheme derived from these results facilitates the selection of screen/film combinations for practical use. In addition, for quality assurance, gross differences in material properties and film development conditions can be detected with the aid of the measuring phantom. PMID:3399512

  6. Methods for quantitative evaluation of dynamics of repair proteins within irradiated cells

    NASA Astrophysics Data System (ADS)

    Hable, V.; Dollinger, G.; Greubel, C.; Hauptner, A.; Krücken, R.; Dietzel, S.; Cremer, T.; Drexler, G. A.; Friedl, A. A.; Löwe, R.

    2006-04-01

    Living HeLa cells are irradiated in a precisely targeted manner with single 100 MeV oxygen ions by the superconducting ion microprobe SNAKE, the Superconducting Nanoscope for Applied Nuclear (=Kern-) Physics Experiments, at the Munich 14 MV tandem accelerator. Various proteins that are involved directly or indirectly in repair processes accumulate as clusters (so-called foci) at DNA double-strand breaks (DSBs) induced by the ions. The spatiotemporal dynamics of the foci formed by the phosphorylated histone γ-H2AX are studied. For this purpose, cells are irradiated in line patterns. The γ-H2AX is made visible under the fluorescence microscope using immunofluorescence techniques. Quantitative analysis methods are developed to evaluate the microscopic image data in order to analyze the movement of the foci and their changing size.

  7. Quantitative non-destructive evaluation of high-temperature superconducting materials

    SciTech Connect

    Achenbach, J.D.

    1990-09-15

    Even though the currently intensive research efforts on high-temperature superconducting materials have not yet converged on a well specified material, the strong indications are that such a material will be brittle, anisotropic, and may contain many flaws such as microcracks and voids at grain boundaries. Consequently, practical applications of high temperature superconducting materials will require a very careful strength analysis based on fracture mechanics considerations. Because of the high sensitivity of the strength of such materials to the presence of defects, methods of quantitative non-destructive evaluation may be expected to play an important role in strength determinations. This proposal is concerned with the use of ultrasonic methods to detect and characterize isolated cracks, clusters of microcracks and microcracks distributed throughout the material. Particular attention will be devoted to relating ultrasonic results directly to deterministic and statistical linear elastic fracture mechanics considerations.

  8. Thrombocytosis: Diagnostic Evaluation, Thrombotic Risk Stratification, and Risk-Based Management Strategies

    PubMed Central

    Bleeker, Jonathan S.; Hogan, William J.

    2011-01-01

    Thrombocytosis is a commonly encountered clinical scenario, with a large proportion of cases discovered incidentally. The differential diagnosis for thrombocytosis is broad and the diagnostic process can be challenging. Thrombocytosis can be spurious, attributable to a reactive process, or due to a clonal disorder. This distinction is important, as it carries implications for evaluation, prognosis, and treatment. Clonal thrombocytosis associated with the myeloproliferative neoplasms, especially essential thrombocythemia and polycythemia vera, carries a unique prognostic profile, with a markedly increased risk of thrombosis. This risk is the driving factor behind treatment strategies in these disorders. Clinical trials utilizing targeted therapies in thrombocytosis are ongoing, with new therapeutic targets waiting to be explored. This paper will outline the mechanisms underlying thrombocytosis, the diagnostic evaluation of thrombocytosis, and the complications of thrombocytosis, with a special focus on thrombotic risk, as well as treatment options for clonal processes leading to thrombocytosis, including essential thrombocythemia and polycythemia vera. PMID:22084665

  9. Quantitative microbial risk assessment of Cryptosporidium and Giardia in well water from a native community of Mexico.

    PubMed

    Balderrama-Carmona, Ana Paola; Gortáres-Moroyoqui, Pablo; Álvarez-Valencia, Luis Humberto; Castro-Espinoza, Luciano; Balderas-Cortés, José de Jesús; Mondaca-Fernández, Iram; Chaidez-Quiroz, Cristóbal; Meza-Montenegro, María Mercedes

    2015-01-01

    Cryptosporidium and Giardia are gastrointestinal disease-causing organisms that are transmitted by the fecal-oral route, are zoonotic, and are prevalent in all socioeconomic segments, with greater emphasis in rural communities. The goal of this study was to assess the risk of cryptosporidiosis and giardiasis among Potam dwellers consuming drinking water from a communal well. To achieve this goal, quantitative microbial risk assessment (QMRA) was carried out as follows: (a) identification of Cryptosporidium oocysts and Giardia cysts in well water samples by the information collection rule method, (b) assessment of exposure of healthy Potam residents, (c) dose-response modelling, and (d) risk characterization using an exponential model. All well water samples tested were positive for Cryptosporidium and Giardia. The QMRA results indicate mean annual risks of 99:100 (0.99) for cryptosporidiosis and 1:1 (1.0) for giardiasis. The outcome of the present study may drive decision-makers to establish educational and treatment programs to reduce the incidence of parasite-borne intestinal infection in the Potam community, and to conduct risk analysis programs in other similar rural communities in Mexico. PMID:25494486
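
    The workflow in steps (a)-(d) can be sketched with the exponential dose-response model named in step (d). This is a minimal illustration only: the infectivity parameter r and the daily ingested dose below are assumed values for demonstration, not figures from the study.

```python
import math

def daily_infection_risk(dose, r):
    """Exponential dose-response model: P = 1 - exp(-r * dose)."""
    return 1.0 - math.exp(-r * dose)

def annual_risk(daily_risk, days=365):
    """Annual risk from `days` independent daily exposures."""
    return 1.0 - (1.0 - daily_risk) ** days

# Illustrative (assumed) inputs: r is a pathogen-specific infectivity
# parameter; dose is (oo)cysts ingested per day via drinking water.
r = 0.0042
dose = 5.0
p_daily = daily_infection_risk(dose, r)
p_annual = annual_risk(p_daily)
print(p_daily, p_annual)
```

    Even a modest daily risk compounds over a year of exposures, which is why the study's annual risk estimates approach 1.0.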

  10. Quantitative risk model for polycyclic aromatic hydrocarbon photoinduced toxicity in Pacific herring following the Exxon Valdez oil spill.

    PubMed

    Sellin Jeffries, Marlo K; Claytor, Carrie; Stubblefield, William; Pearson, Walter H; Oris, James T

    2013-05-21

    Phototoxicity occurs when exposure to ultraviolet radiation increases the toxicity of certain contaminants, including polycyclic aromatic hydrocarbons (PAHs). This study aimed to (1) develop a quantitative model to predict the risk of PAH phototoxicity to fish, (2) assess the predictive value of the model, and (3) estimate the risk of PAH phototoxicity to larval and young-of-year Pacific herring (Clupea pallasi) following the Exxon Valdez oil spill (EVOS) in Prince William Sound, Alaska. The model, in which median lethal times (LT50 values) are estimated from whole-body phototoxic PAH concentrations and ultraviolet A (UVA) exposure, was constructed from previously reported PAH phototoxicity data. The predictive value of this model was confirmed by the overlap of model-predicted and experimentally derived LT50 values. The model, along with UVA characterization data, was used to generate estimates for depths of de minimis risk for PAH phototoxicity in young herring in 2003/2004 and immediately following the 1989 EVOS, assuming average and worst-case conditions. Depths of de minimis risk were estimated to be between 0 and 2 m when worst-case UVA and PAH conditions were considered. A post hoc assessment determined that <1% of the young herring population would have been present at depths associated with significant risk of PAH phototoxicity in 2003/2004 and 1989. PMID:23600964
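
    A model of this shape can be sketched with a reciprocity-style relationship, in which LT50 falls as the product of tissue PAH burden and UVA dose rate rises. The functional form, the constant k, and the units are illustrative assumptions for this sketch, not the fitted model from the paper.

```python
def predicted_lt50(tissue_pah, uva_intensity, k=1.0e4):
    """Hypothetical reciprocity-style model: median lethal time is
    inversely proportional to the product of whole-body phototoxic PAH
    concentration and UVA dose rate. k is an illustrative constant,
    not a value fitted in the study."""
    if tissue_pah <= 0 or uva_intensity <= 0:
        raise ValueError("exposure terms must be positive")
    return k / (tissue_pah * uva_intensity)

# Assumed units: tissue PAH in ug/g, UVA in uW/cm^2.
# Doubling either exposure term halves the predicted LT50.
print(predicted_lt50(10.0, 50.0), predicted_lt50(20.0, 50.0))
```

    Coupling such a relationship to depth-resolved UVA attenuation data is what lets the study translate LT50 predictions into depths of de minimis risk.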

  12. Technology Efficacy in Active Prosthetic Knees for Transfemoral Amputees: A Quantitative Evaluation

    PubMed Central

    El-Sayed, Amr M.; Abu Osman, Noor Azuan

    2014-01-01

    Several studies have presented technological ensembles of active knee systems for transfemoral prostheses. Other studies have examined amputees' gait performance while wearing a specific active prosthesis. This paper combined both insights, that is, a technical examination of the components used, with an evaluation of how these improved the gait of the respective users. This study aims to offer a quantitative understanding of the potential enhancement derived from strategic integration of core elements in developing an effective device. The study systematically discussed the current technology in active transfemoral prostheses with respect to functional walking performance amongst above-knee amputee users, to evaluate each system's efficacy in producing close-to-normal user performance. The performance of the actuator, sensory system, and control technique incorporated in each reported system was evaluated separately, and numerical comparisons were conducted based on the percentage of amputees' gait deviation from normal gait profile points. The results identified the particular components that contributed closest-to-normal gait parameters. However, the conclusion is of limited generalizability due to the small number of studies available. Thus, more clinical validation of active prosthetic knee technology is needed to better understand the extent to which each component contributes to functional performance. PMID:25110727

  13. Development of the local magnification method for quantitative evaluation of endoscope geometric distortion

    NASA Astrophysics Data System (ADS)

    Wang, Quanzeng; Cheng, Wei-Chung; Suresh, Nitin; Hua, Hong

    2016-05-01

    With improved diagnostic capabilities and complex optical designs, endoscopic technologies are advancing. As one of several important optical performance characteristics, geometric distortion can negatively affect size estimation and feature-identification-related diagnosis. Therefore, a quantitative and simple distortion evaluation method is imperative for both the endoscopic industry and medical device regulatory agencies. However, no such method is available yet. While image correction techniques are rather mature, they depend heavily on computational power to process multidimensional image data with complex mathematical models, which makes them difficult to understand. Some commonly used distortion evaluation methods, such as picture height distortion (DPH) or radial distortion (DRAD), are either too simple to accurately describe the distortion or subject to the error of deriving a reference image. We developed the basic local magnification (ML) method to evaluate endoscope distortion. Based on this method, we also developed ways to calculate DPH and DRAD. The method overcomes the aforementioned limitations, has clear physical meaning over the whole field of view, and can facilitate lesion-size estimation during diagnosis. Most importantly, the method can help bring endoscopic technology to market and could potentially be adopted in an international endoscope standard.
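
    The local-magnification idea can be sketched as follows: track how imaged radial positions of a grid target deviate from what a single reference magnification would predict, and take finite-difference slopes as the local magnification. The barrel-distortion sample model, function names, and reference magnification here are our illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def local_magnification(r_obj, r_img):
    """Local magnification: slope of imaged radius vs. object radius
    between neighboring grid points (finite differences)."""
    return np.diff(r_img) / np.diff(r_obj)

def radial_distortion_percent(r_obj, r_img, f_ref):
    """DRAD-style figure: percent deviation of each imaged radial
    position from the position an ideal, distortion-free
    magnification f_ref would predict."""
    r_ideal = f_ref * np.asarray(r_obj, dtype=float)
    return 100.0 * (np.asarray(r_img, dtype=float) - r_ideal) / r_ideal

# Assumed barrel-distorted grid target: r_img = f*r*(1 + k*r^2), k < 0.
f, k = 2.0, -0.002
r_obj = np.array([1.0, 2.0, 3.0, 4.0])
r_img = f * r_obj * (1 + k * r_obj**2)
print(local_magnification(r_obj, r_img))        # magnification falls off-axis
print(radial_distortion_percent(r_obj, r_img, f))
```

    For barrel distortion the local magnification decreases toward the edge of the field, which is exactly the spatially resolved behavior a single global distortion number cannot capture.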

  14. Quantitative evaluation of software packages for single-molecule localization microscopy.

    PubMed

    Sage, Daniel; Kirshner, Hagai; Pengo, Thomas; Stuurman, Nico; Min, Junhong; Manley, Suliana; Unser, Michael

    2015-08-01

    The quality of super-resolution images obtained by single-molecule localization microscopy (SMLM) depends largely on the software used to detect and accurately localize point sources. In this work, we focus on the computational aspects of super-resolution microscopy and present a comprehensive evaluation of localization software packages. Our philosophy is to evaluate each package as a whole, thus maintaining the integrity of the software. We prepared synthetic data that represent three-dimensional structures modeled after biological components, taking excitation parameters, noise sources, point-spread functions and pixelation into account. We then asked developers to run their software on our data; most responded favorably, allowing us to present a broad picture of the methods available. We evaluated their results using quantitative and user-interpretable criteria: detection rate, accuracy, quality of image reconstruction, resolution, software usability and computational resources. These metrics reflect the various tradeoffs of SMLM software packages and help users to choose the software that fits their needs.

  15. Quantitative MR evaluation of body composition in patients with Duchenne muscular dystrophy.

    PubMed

    Pichiecchio, Anna; Uggetti, Carla; Egitto, Maria Grazia; Berardinelli, Angela; Orcesi, Simona; Gorni, Ksenija Olga Tatiana; Zanardi, Cristina; Tagliabue, Anna

    2002-11-01

    The aim of this study was to propose a quantitative MR protocol with very short acquisition time and good reliability in volume construction, for the evaluation of body composition in patients affected by Duchenne muscular dystrophy (DMD). This MR protocol was compared with common anthropometric evaluations of the same patients. Nine boys affected by DMD, ranging in age from 6 to 12 years, were selected to undergo MR examination. Transversal T1-weighted spin-echo sequences (0.5T; TR 300 ms, TE 10 ms, slice thickness 10 mm, slice gap 1 mm) were used for all acquisitions, each consisting of 8 slices and lasting just 54 s. Whole-body examination needed an average of nine acquisitions. Afterwards, images were downloaded to an independent workstation and, through their electronic segmentation with a reference filter, total volume and adipose tissue volumes were calculated manually. This process took up to 2 h for each patient. The MR data were compared with anthropometric evaluations. Affected children have a marked increase in adipose tissue and a decrease in lean tissue compared with reference healthy controls. Mean fat mass calculated by MR is significantly higher than mean fat mass obtained using anthropometric measurements (p < 0.001). Our MR study proved to be accurate and easy to apply, although it was time-consuming. We recommend it in monitoring the progression of the disease and planning DMD patients' diet.

  16. Quantitative evaluation of MPTP-treated nonhuman parkinsonian primates in the HALLWAY task.

    PubMed

    Campos-Romo, Aurelio; Ojeda-Flores, Rafael; Moreno-Briseño, Pablo; Fernandez-Ruiz, Juan

    2009-03-15

    Parkinson's disease (PD) is a progressive neurodegenerative disorder. An experimental model of this disease is produced in nonhuman primates by the administration of the neurotoxin 1-methyl-4-phenyl-1,2,3,6-tetrahydropyridine (MPTP). In this work, we put forward a new quantitative evaluation method that uses video recordings to measure the displacement, gait, and gross and fine motor performance of freely moving subjects. Four Vervet monkeys (Cercopithecus aethiops) were trained in a behavioral observation hallway while being recorded with digital video cameras from four different angles. After MPTP intoxication the animals were tested without any drug and after 30 and 90 min of Levodopa/Carbidopa administration. Using a personal computer the following behaviors were measured and evaluated from the video recordings: displacement time across the hallway, reaching time towards rewards, ingestion time, number of attempts to obtain rewards, number of rewards obtained, and level of the highest shelf reached for rewards. Our results show that there was an overall behavioral deterioration after MPTP administration and an overall improvement after Levodopa/Carbidopa treatment. This demonstrates that the HALLWAY task is a sensitive and objective method that allows detailed behavioral evaluation of freely moving monkeys in the MPTP Parkinson's disease model.

  17. Noninvasive Quantitative Evaluation of the Dentin Layer during Dental Procedures Using Optical Coherence Tomography

    PubMed Central

    Sinescu, Cosmin; Negrutiu, Meda Lavinia; Bradu, Adrian; Duma, Virgil-Florin; Podoleanu, Adrian Gh.

    2015-01-01

    A routine cavity preparation of a tooth may lead to opening the pulp chamber. The present study evaluates quantitatively, in real time, for the first time to the best of our knowledge, the drilled cavities during dental procedures. An established noninvasive imaging technique, Optical Coherence Tomography (OCT), is used. The main scope is to prevent accidental openings of the dental pulp chamber. Six teeth with dental cavities have been used in this ex vivo study. The real-time assessment of the distances between the bottom of the drilled cavities and the top of the pulp chamber was performed using an in-house assembled OCT system. The evaluation of the remaining dentin thickness (RDT) allowed for the positioning of the drilling tools in the cavities in relation to the pulp horns. Estimations of the safe and of the critical RDT were made; for the latter, the opening of the pulp chamber becomes unavoidable. Also, by following the fractures that can occur when the extent of the decay is too large, the dentist can decide upon the appropriate therapy, endodontic treatment or conventional filling. The study demonstrates the usefulness of OCT imaging in guiding such evaluations during dental procedures. PMID:26078779

  19. Quantitative analysis of topoisomerase IIα to rapidly evaluate cell proliferation in brain tumors

    SciTech Connect

    Oda, Masashi; Arakawa, Yoshiki; Kano, Hideyuki; Kawabata, Yasuhiro; Katsuki, Takahisa; Shirahata, Mitsuaki; Ono, Makoto; Yamana, Norikazu; Hashimoto, Nobuo; Takahashi, Jun A. . E-mail: jat@kuhp.kyoto-u.ac.jp

    2005-06-17

    Immunohistochemical cell proliferation analyses have come into wide use for evaluation of tumor malignancy. Topoisomerase IIα (topo IIα), an essential nuclear enzyme, is known to have cell-cycle-coupled expression. We here show the usefulness of quantitative analysis of topo IIα mRNA to rapidly evaluate cell proliferation in brain tumors. A protocol to quantify topo IIα mRNA was developed with real-time RT-PCR; quantification from a specimen took only 3 h. A total of 28 brain tumors were analyzed, and the level of topo IIα mRNA was significantly correlated with its immunostaining index (p < 0.0001, r = 0.9077). Furthermore, the assay sharply detected the decrease of topo IIα mRNA in growth-inhibited glioma cells. These results support the idea that topo IIα mRNA may be a good and rapid indicator of cell proliferative potential in brain tumors.

  20. Research and Evaluations of the Health Aspects of Disasters, Part VIII: Risk, Risk Reduction, Risk Management, and Capacity Building.

    PubMed

    Birnbaum, Marvin L; Loretti, Alessandro; Daily, Elaine K; O'Rourke, Ann P

    2016-06-01

    There is a cascade of risks associated with a hazard evolving into a disaster that consists of the risk that: (1) a hazard will produce an event; (2) an event will cause structural damage; (3) structural damage will create functional damages and needs; (4) needs will create an emergency (require use of the local response capacity); and (5) the needs will overwhelm the local response capacity and result in a disaster (ie, the need for outside assistance). Each step along the continuum/cascade can be characterized by its probability of occurrence and the probability of possible consequences of its occurrence, and each risk is dependent upon the preceding occurrence in the progression from a hazard to a disaster. Risk-reduction measures are interventions (actions) that can be implemented to: (1) decrease the risk that a hazard will manifest as an event; (2) decrease the amounts of structural and functional damages that will result from the event; and/or (3) increase the ability to cope with the damage and respond to the needs that result from an event. Capacity building increases the level of resilience by augmenting the absorbing and/or buffering and/or response capacities of a community-at-risk. Risks for some hazards vary by the context in which they exist and by the Societal System(s) involved. Birnbaum ML , Loretti A , Daily EK , O'Rourke AP . Research and evaluations of the health aspects of disasters, part VIII: risk, risk reduction, risk management, and capacity building. Prehosp Disaster Med. 2016;31(3):300-308. PMID:27025980
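
    The five-step cascade described above lends itself to a simple conditional-probability reading: if each step is characterized by a probability conditional on the preceding one, the overall hazard-to-disaster risk is their product. The numbers below are purely illustrative assumptions; the paper itself gives no such formula.

```python
def cascade_risk(conditional_probs):
    """Overall risk along a hazard-to-disaster cascade, assuming each
    step's probability is conditional on the preceding step having
    occurred (a simplifying assumption of this sketch)."""
    risk = 1.0
    for p in conditional_probs:
        if not 0.0 <= p <= 1.0:
            raise ValueError("probabilities must lie in [0, 1]")
        risk *= p
    return risk

# Assumed step probabilities: event given hazard, structural damage
# given event, functional damage/needs given structural damage,
# emergency given needs, local capacity overwhelmed given emergency.
steps = [0.10, 0.50, 0.80, 0.60, 0.25]
print(cascade_risk(steps))
```

    The product form makes the paper's point concrete: a risk-reduction measure that lowers any single conditional probability lowers the overall disaster risk, and the later terms matter only if the earlier steps occur.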

  1. Quantitative Ultrasonic Evaluation of Radiation-Induced Late Tissue Toxicity: Pilot Study of Breast Cancer Radiotherapy

    SciTech Connect

    Liu Tian; Zhou Jun; Yoshida, Emi J.; Woodhouse, Shermian A.; Schiff, Peter B.; Wang, Tony J.C.; Lu Zhengfeng; Pile-Spellman, Eliza; Zhang Pengpeng; Kutcher, Gerald J.

    2010-11-01

    Purpose: To investigate the use of advanced ultrasonic imaging to quantitatively evaluate normal-tissue toxicity in breast-cancer radiation treatment. Methods and Materials: Eighteen breast cancer patients who received radiation treatment were enrolled in an institutional review board-approved clinical study. Radiotherapy involved a radiation dose of 50.0 to 50.4 Gy delivered to the entire breast, followed by an electron boost of 10.0 to 16.0 Gy delivered to the tumor bed. Patients underwent scanning with ultrasound during follow-up, which ranged from 6 to 94 months (median, 22 months) postradiotherapy. Conventional ultrasound images and radio-frequency (RF) echo signals were acquired from treated and untreated breasts. Three ultrasound parameters, namely, skin thickness, Pearson coefficient, and spectral midband fit, were computed from RF signals to measure radiation-induced changes in dermis, hypodermis, and subcutaneous tissue, respectively. Ultrasound parameter values of the treated breast were compared with those of the untreated breast. Ultrasound findings were compared with clinical assessment using Radiation Therapy Oncology Group (RTOG) late-toxicity scores. Results: Significant changes were observed in ultrasonic parameter values of the treated vs. untreated breasts. Average skin thickness increased by 27.3%, from 2.05 ± 0.22 mm to 2.61 ± 0.52 mm; Pearson coefficient decreased by 31.7%, from 0.41 ± 0.07 to 0.28 ± 0.05; and midband fit increased by 94.6%, from -0.92 ± 7.35 dB to 0.87 ± 6.70 dB. Ultrasound evaluations were consistent with RTOG scores. Conclusions: Quantitative ultrasound provides a noninvasive, objective means of assessing radiation-induced changes to the skin and subcutaneous tissue. This imaging tool will become increasingly valuable as we continue to improve radiation therapy techniques.
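
    The skin-thickness and Pearson-coefficient figures reported above follow from a simple relative-change computation on the quoted means; this sketch reproduces those two numbers (the helper name is ours, not the study's; the midband-fit figure crosses zero and is not reproduced here).

```python
def pct_change(before, after):
    """Relative change of a mean parameter value, in percent."""
    return 100.0 * (after - before) / before

# Mean values quoted in the abstract (untreated -> treated breast):
skin = pct_change(2.05, 2.61)      # skin thickness, mm
pearson = pct_change(0.41, 0.28)   # Pearson coefficient
print(round(skin, 1), round(pearson, 1))  # ~27.3 and ~-31.7
```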

  2. Risk assessment relationships for evaluating effluents from coal industries.

    PubMed

    Cuddihy, R G

    1983-06-01

    Public awareness of the risks associated with traditional coal combustion and newer coal gasification and liquefaction industries is increasing. Assessing the health risks for people exposed to effluents from these industries generally involves four major steps: (1) characterizing the pollutant sources, (2) projecting the release and dispersion of toxic substances in workplaces and in the environment, (3) estimating their uptake by people through inhalation and ingestion and their contact with skin, and (4) evaluating their potential for causing health effects. Pollutants of special concern include toxic gases, carcinogenic organic compounds and trace metals. Relationships between the levels of pollutants released to the environment and the magnitudes of human exposures and methods of formulating exposure-dose-effect relationships for use in human risk assessment are discussed.

  3. Quantitative evaluation of susceptibility effects caused by dental materials in head magnetic resonance imaging

    NASA Astrophysics Data System (ADS)

    Strocchi, S.; Ghielmi, M.; Basilico, F.; Macchi, A.; Novario, R.; Ferretti, R.; Binaghi, E.

    2016-03-01

    This work quantitatively evaluates the effects induced by the susceptibility characteristics of materials commonly used in dental practice on the quality of head MR images in a clinical 1.5T device. The proposed evaluation procedure measures the image artifacts induced by susceptibility in MR images by providing an index consistent with the global degradation as perceived by experts. Susceptibility artifacts were evaluated in a near-clinical setup, using a phantom with susceptibility and geometric characteristics similar to those of a human head. We tested different dental materials (PAL Keramit, Ti6Al4V-ELI, Keramit NP, ILOR F, and Zirconia) and different clinical MR acquisition sequences, such as "classical" SE and fast, gradient, and diffusion sequences. The evaluation is designed as a matching process between reference and artifact-affected images recording the same scene. The extent of the degradation induced by susceptibility is then measured in terms of similarity with the corresponding reference image. The matching process involves a multimodal registration task and the use of a psychophysically validated similarity index based on the correlation coefficient. The proposed analyses are integrated within a computer-supported procedure that interactively guides users through the different phases of the evaluation method. Two-dimensional and three-dimensional indices were computed for each material and each acquisition sequence. From these, we drew a ranking of the materials by averaging the results obtained. Zirconia and ILOR F appear to be the best choices from the susceptibility-artifact point of view, followed, in order, by PAL Keramit, Ti6Al4V-ELI, and Keramit NP.

  4. The quantitative risks of mesothelioma and lung cancer in relation to asbestos exposure.

    PubMed

    Hodgson, J T; Darnton, A

    2000-12-01

    Mortality reports on asbestos-exposed cohorts which gave information on exposure levels from which (as a minimum) a cohort average cumulative exposure could be estimated were reviewed. At exposure levels seen in occupational cohorts it is concluded that the exposure-specific risk of mesothelioma from the three principal commercial asbestos types is broadly in the ratio 1:100:500 for chrysotile, amosite, and crocidolite, respectively. For lung cancer the conclusions are less clear cut. Cohorts exposed only to crocidolite or amosite record similar exposure-specific risk levels (around 5% excess lung cancer per f/ml.yr); but chrysotile-exposed cohorts show a less consistent picture, with a clear discrepancy between the mortality experience of a cohort of chrysotile textile workers in Carolina and the Quebec miners cohort. Taking account of the excess risk recorded by cohorts with mixed fibre exposures (generally <1%), the Carolina experience looks atypically high. It is suggested that a best estimate lung cancer risk for chrysotile alone would be 0.1%, with a highest reasonable estimate of 0.5%. The